I’m sure we’ve been through this before, but I remain a bit uncertain as to the basis one is meant (or allowed) to use when determining voltage drop in socket circuits to confirm compliance with 525 of the regs.
For a start, 525 is badly written. 525.1 and 525.100 (BGB) both relate specifically to fixed equipment, yet the next item (525.101 in BGB) introduces the concept of sockets while talking about what is deemed to satisfy “the above requirements” (which relate only to fixed equipment).
That aside, the acceptable voltage drops (with which we are all familiar) are given in Table 4Ab in 6.4 of Appendix 4 (BGB). A footnote to that table reads “The voltage drop is determined from the demand of the current-using equipment, applying diversity factors where applicable, or from the value of the design current of the circuit”.
If one has fixed loads, that’s straightforward enough. If one uses the ‘design current of the circuit’ for socket circuits, it’s also pretty straightforward – although here I have to ask whether one has to calculate on the basis of the (theoretically impossible) scenario of the entire design current (e.g. 32A for a standard RFC or 4mm² radial circuit) being drawn at the furthest point in the circuit.
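To put a number on that “entire design current at the furthest point” scenario, here is a rough sketch of the arithmetic for a standard RFC. The mV/A/m figure and the 5% limit are the familiar Appendix 4 values; the 50m ring length is purely an assumed figure of mine for illustration.

```python
# Illustrative sketch (assumptions noted): voltage drop for a ring final
# circuit with the whole 32A design current taken at the furthest (mid)
# point of the ring. The 50 m total ring length is an assumption.

MV_PER_A_PER_M_2_5MM2 = 18.0   # approx. mV/A/m for 2.5 mm2 copper (Appendix 4)
RING_LENGTH_M = 50.0           # assumed total ring length
DESIGN_CURRENT_A = 32.0        # standard RFC OPD rating
LIMIT_V = 230 * 0.05           # 5% 'other uses' limit = 11.5 V

def ring_vd_at_midpoint(mv_per_a_per_m, length_m, current_a):
    """VD with the load at the ring's midpoint: each leg carries half the
    current over half the length, so the drop is a quarter of the
    equivalent radial figure."""
    return mv_per_a_per_m * (current_a / 2) * (length_m / 2) / 1000

vd = ring_vd_at_midpoint(MV_PER_A_PER_M_2_5MM2, RING_LENGTH_M, DESIGN_CURRENT_A)
print(f"VD = {vd:.1f} V against a limit of {LIMIT_V:.1f} V")
```

On those assumed figures a 50m ring comes out at 7.2V, comfortably inside 11.5V, even on the (theoretically impossible) worst-case basis.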
However, my main question is this. If one has a 32A (or 20A) socket circuit, does the designer have to calculate voltage drop on the basis of the maximum demand which that circuit could serve (32A or 20A), or is (s)he allowed to exercise judgment, based on the location of the sockets and the nature of their probable use, to calculate voltage drop on the basis of the maximum probable demand? Situations in which this could be relevant would include supplies to distant outhouses or ‘distant’ upper floor bedrooms where demand was anticipated to be very low.
I suppose the official answer might be that if the designer believed that the demand was going to be low, (s)he should perhaps protect the circuit with a 6A, 10A or 16A OPD, as appropriate, rather than 32A or 20A one – in which case I presume that it would be acceptable to regard the rating of that OPD as the ‘design current of the circuit’ and use that for calculation of VD. However, what do people feel about the (more likely) situation in which there is a 32A or 20A OPD?
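The practical difference between the two bases can be sketched for a long radial, the sort of run one might have to a distant outhouse. The run length (40m) and the judged “probable demand” (10A) are assumptions of mine, not figures from the regs; the mV/A/m value is the familiar Appendix 4 one for 4mm².

```python
# Illustrative sketch: VD on a 4 mm2 radial, comparing the 32A OPD rating
# ('design current of the circuit') against an assumed maximum probable
# demand of 10A. The 40 m run and 10A figure are assumptions.

MV_PER_A_PER_M_4MM2 = 11.0   # approx. mV/A/m for 4 mm2 copper (Appendix 4)
RUN_LENGTH_M = 40.0          # assumed one-way radial length
LIMIT_V = 230 * 0.05         # 5% 'other uses' limit = 11.5 V

def radial_vd(mv_per_a_per_m, length_m, current_a):
    """Voltage drop with the full current taken at the far end of a radial."""
    return mv_per_a_per_m * current_a * length_m / 1000

for label, amps in (("OPD rating (32A)", 32.0), ("probable demand (10A)", 10.0)):
    vd = radial_vd(MV_PER_A_PER_M_4MM2, RUN_LENGTH_M, amps)
    verdict = "within" if vd <= LIMIT_V else "exceeds"
    print(f"{label}: {vd:.2f} V - {verdict} the {LIMIT_V:.1f} V limit")
```

On those assumed figures the same circuit fails at the OPD rating (about 14.1V) yet passes easily at the judged probable demand (4.4V), which is exactly why the choice of basis matters.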
Of course, in the case of an 'outhouse', it's quite likely that a single supply will serve lighting as well as sockets, in which case the limiting factor will be the permissible VD for lighting (although perhaps one could cheat by having plug-in lighting!) - but it would still be the VD due to socket use which would be the primary determinant of the actual VD.
Kind Regards, John