What's your average single phase measured voltage? Is 251V at point of measurement common?

Sorry for delay, I've been very distracted by several calls on my time ;)

I never dreamed that I would ever find myself struggling to get my head around things which I had 'taken for granted' for 50+ years, but that seems to be what has happened since I 'made the mistake' of actually thinking about it!

I have come to the conclusion that much of what you (Simon) have been saying is correct, in as much as I cannot really argue with any of your maths. However, I had been relying more on 'physics intuition' than on the maths (and my simulation model was based on the same 'intuitive' assumptions), but I'm having grave difficulties in reconciling that (presumably incorrect!) intuition with the maths!

I will explain and discuss some of the 'intuitive anomalies' that I am trying to rationalise in a subsequent message, when I have a bit of time!

..... But, we were discussing losses in the network. Let's assume we have a loop impedance of 0.1Ω, and 100 such loads on the network.
At 200V, each such (100Ω) load will draw 2A, so 100 of them will draw 200A and the volt drop will be 20V. The losses in the distribution network will therefore be 20V * 200A = 4,000W while supplying a total load of 40kW - so network losses are around 10% of supplied energy.
If we were to double the voltage to 400V, then without any controls, each load would pull 4A and 1,600W. 100 such loads would draw 400A and 160kW. The losses in the network would now be 40V drop @ 400A, so 16kw, so still around 10% of supplied energy.
Indeed. With 'uncontrolled' loads (pretty rare for large loads in the real world), the supply voltage is irrelevant, the losses in the DNO network being the same percentage of the power supplied to the load, regardless of voltage. Similarly, with the figures you're using, that 10% figure is inevitable, again regardless of voltage. The same current (hence same I²) will obviously flow through both network cables and load(s), so the power dissipated in each will simply be proportional to the resistances of network and loads. With a network resistance of 0.1Ω and an effective load resistance of 1Ω (100Ω / 100), the former will always be 10% of the latter, hence the power dissipated in the network will always be 10% of that dissipated in the load(s).
Now consider if the loads are all thermostatically controlled. Each will now change its duty cycle to around 25% - 25% of 1,600W is 400W. At any point in time, "around" 25% of loads will be on, and so the total for 100 loads will be "around" 100A, give or take a bit - I suspect if you plot current versus frequency of occurrence it'll come out looking something like a bell curve.
.... But back to our discussion. We now have around 100A in our distribution network. That's going to give us around 10V drop. Regardless whether you do 10V * 100A, or 100A² * 0.1Ω, you get 1,000W. So for the same 40kW supplied, we now only lose 1kW in the network, only ¼ of the losses at 200V, and 2.5% of supplied energy.
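Those two scenarios can be sanity-checked with a short sketch (my own, not from the thread), using the figures quoted above: a 0.1Ω network loop, 100 identical 100Ω loads, and thermostatic control averaging 400W per load:

```python
# Sketch using the figures quoted above (0.1-ohm network, 100 x 100-ohm loads).
R_NET = 0.1      # ohms, network loop resistance
R_LOAD = 100.0   # ohms, each individual load
N = 100          # number of loads

def losses_uncontrolled(v):
    """All loads on continuously: current scales with voltage.
    (Power supplied is taken as V*I, ignoring the volt drop, as in the
    thread's rough figures.)"""
    i_total = N * v / R_LOAD                 # total amps drawn
    return i_total * v, i_total**2 * R_NET   # (power supplied, network loss)

def losses_controlled(v, p_per_load=400.0):
    """Thermostatic control: each load averages the same watts at any voltage."""
    i_total = N * p_per_load / v             # average total current
    return N * p_per_load, i_total**2 * R_NET

print(losses_uncontrolled(200))   # ~40 kW supplied, ~4 kW lost (10%)
print(losses_controlled(400))     # ~40 kW supplied, ~1 kW lost (2.5%)
```

The key difference is simply that the controlled loads hold the *power* constant, so doubling the voltage halves the average current and quarters the I²R loss.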
As above, I can't dispute that maths, per se, but I think it only works ("is valid") if the timing and magnitudes of the loads are such that the total load is essentially the same at every point in time.

To use an extreme example to illustrate how that approach apparently goes wrong when that assumption is not satisfied, consider the situation in which 100 identical 100Ω loads all start simultaneously and are all controlled so as to 'switch off' after 2 kWh of energy had been delivered to each load. All loads would therefore 'switch on' simultaneously and all would 'switch off' simultaneously 5 hours later.

With 200V, the current throughout that 5-hour period would be 200A, so the power dissipated in the (0.1Ω) DNO network would be (200² x 0.1) = 4,000 W. Over the full 5-hour period, the energy loss would therefore be 20 kWh.

Now move to 400V. Total current, when present, is now 400 A, so that power dissipated in DNO network (whilst current is flowing into loads) becomes (400² x 0.1) = 16,000 W (16 kW). However, that current now only flows for 1.25 hours, so that the total energy loss would be 20 kWh - the same as with 200V.

If, in that situation, one took an approach similar to yours, and looked at the 'average' current over 5 hours, with 200V that average current would still be 200 A, so the energy loss would still be 20 kWh. However, again looking over the 5-hour period, with 400V there would be a current of 400A for 1.25 hours, followed by zero current for 3.75 hours - an average current over the 5-hour period of 100A - and that would presumably lead you to conclude that the power loss was (100² x 0.1) = 1,000 W and hence the energy loss over the 5 hours only 5 kWh (one quarter of the true energy loss).
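That synchronised extreme case can be put into a few lines of (my own, illustrative) Python, which reproduces both the true 20 kWh loss at either voltage and the smaller figure the 'average current' shortcut would give:

```python
# Illustrative sketch of the extreme case above: 100 x 100-ohm loads all on
# together, each switching off after 2 kWh, fed through a 0.1-ohm network.
R_NET, R_LOAD, N, KWH_EACH = 0.1, 100.0, 100, 2.0

def extreme_case(v):
    i = N * v / R_LOAD                             # amps while the loads are on
    hours_on = KWH_EACH * 1000 / (v**2 / R_LOAD)   # time for 2 kWh per load
    kwh_lost = i**2 * R_NET * hours_on / 1000      # network energy loss
    return hours_on, kwh_lost

print(extreme_case(200))   # about (5, 20): on for 5 h, 20 kWh lost
print(extreme_case(400))   # about (1.25, 20): same 20 kWh lost

# The 'average current' shortcut at 400V: 100 A average over the 5 hours ...
print(100**2 * R_NET * 5 / 1000)   # ... suggests only about 5 kWh lost
```

The shortcut fails because averaging the current *before* squaring discards the variance - I²R losses depend on the mean of I², not the square of the mean.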

Of course, such an extreme situation, with loads oscillating between 'high' and zero, is not going to arise in practice. However, I strongly suspect (and will investigate if/when I have some time!) that the further one gets from the situation of the load being the same at every point in time, the less will be the reduction in energy loss (over a period, such as 'a day') resulting from increasing voltage.

More when I have a bit more time!
 
Yeah, me too. I'm a bit tied up yet, but I do intend to get my head around it, and it is an interesting observation.
I look forward to hearing your thoughts in due course.

One of the things that troubles my intuition most is that I think I have always 'assumed' (essentially subconsciously) that adding a particular amount of current to that flowing through a resistance would always result in the same increase in power dissipated in that resistance, regardless of anything (in particular, regardless of what current was flowing through the resistance before the additional current was added).

However, the maths indicates that this is not the case - the increase in power dissipated in the resistance is actually very much dependent upon the current that was flowing before current was added. Perhaps 'worse' (in terms of 'intuition'), the greater the pre-existing current (i.e. the 'less significant' the added current, in relation to the pre-existing current), the greater the increase in power dissipated in the resistance when the current is added. Consider this series of (fairly extreme) examples.

Assuming DNO network cables have a resistance of 0.1Ω, and regardless of supply voltage, then:

Starting with no load current, add a load drawing 2 A
Power dissipated in network increases from zero to 0.4 W, an increase of 0.4 W due to adding 2 A.
Starting with a load current of 2 A, add an additional load drawing 2 A
Power dissipated in network increases from 0.4 W to 1.6 W, an increase of 1.2 W due to adding 2 A.
Starting with a load current of 200 A, add an additional load drawing 2 A
Power dissipated in network increases from 4,000 W to 4,080.4 W, an increase of 80.4 W due to adding 2 A.
Starting with a load current of 2,000 A, add an additional load drawing 2 A
Power dissipated in network increases from 400,000 W to 400,800.4 W, an increase of 800.4 W due to adding 2 A.

Thinking of the physics, rather than the maths, I still find it difficult to get my head around that!

Indeed, thinking a bit more ..... an increase of total current of 2 A will always result in a 0.2 V increase in the voltage across the (0.1Ω) network resistance (regardless of the pre-existing current through the resistance). So my intuition might well (probably does!) expect that the increase in power dissipated in the network resistance would always be 0.2 V x 2 A = 0.4 W, regardless of the pre-existing current.

Can anyone help me here, before more of my hair gets pulled out?
 
I remember two theorems, one started with Th, but I can't remember their names - did them at university but have never used them since. It does get a little complex when using a supply ring with many outlets, and the problem is that the whole idea of the ring is so any part can be isolated to work on it.

This was the debate: the supply needs to have an impedance of 0.35 Ω to supply 100 amps, and this would be with the ring broken, so with the ring intact it would likely be well below that - say 0.25 Ω for example. So if in the property we have a B32 MCB, should the loop impedance limit be 1.38 Ω, or, if the supply is 0.25 Ω, should it be a max of 1.28 Ω to allow for when the supply is being worked on and the ring is open?

As the user we have no idea if the supply is in a ring, so do we measure and assume it will always be as measured, or do we measure and allow for the impedance to rise to 0.35 Ω, which is the max for the 100 amp supply? We simply don't know.
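For illustration only (using the figures quoted in the posts above, not values taken from any regulation table), the headroom left for the circuit conductors under each assumption about the supply looks like this:

```python
# Hypothetical sketch using the figures quoted above: if the maximum
# earth-fault loop impedance for the B32 is taken as 1.38 ohms, how much is
# left for the circuit conductors (R1+R2) under each assumption about the
# supply impedance Ze?
ZS_MAX_B32 = 1.38   # ohms, figure quoted in the post

for ze in (0.25, 0.35):   # ring intact (estimate) vs worst case, ring open
    print(f"Ze = {ze} ohm -> max R1+R2 = {ZS_MAX_B32 - ze:.2f} ohm")
```

So designing against the worst-case 0.35 Ω costs about 0.1 Ω of circuit headroom compared with the measured ring-intact figure.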

It was so much easier when I started and it was fuses: if the loop impedance were to rise, it just took a little longer to rupture. With MCBs we move from the magnetic trip time to the thermal trip time, so just a little too high an impedance makes a huge change in tripping time.

But we as electricians have little or no control over the supply voltage; if the solar inverter logs disconnection due to over/under voltage then we can complain to the DNO, but there is very little else we can do. At my last house a load of solar panels were installed local to me, and the supply voltage dropped - I will assume someone complained about high voltage.

It was a pain, as a 65 watt fluorescent rated at 240 volts had worked for years with a 58 watt tube, but when the voltage dropped it would fail to ignite, so I had to change to LED.

But to the question: at 4 am my supply is [screenshot: logged supply voltage] - not the same house as the fluorescent lamp problem. I have not seen the supply at 230 volts; it is always on the high side, but the solar has not tripped, so it has not exceeded 253 volts.
 
First of all, could I clarify a few myths?
In my living memory our basic UK mains voltage was 240V - always was, always will be, in my humble opinion.
To claim it is now 230V is a bit of a Red Haddock really.
Take a difference of about 4% in voltage, and correspondingly 4% in current, and that makes about 9% difference in power on a purely resistive circuit.
That`s why I always talk in terms of 240V UK because in reality that is what it is and unlikely to change.
Other countries in Europe are equally unlikely to change their actual voltages too for similar reasons.
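That ~9% figure follows from power scaling with the square of the voltage on a resistive load; a one-liner (mine) confirms it for 240V vs 230V:

```python
# Power difference on a purely resistive load: scales with voltage squared.
ratio = (240 / 230) ** 2
print(f"{(ratio - 1) * 100:.1f}% more power at 240V than at 230V")  # ~8.9%
```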

OK, Europe etc. got together to find a way to harmonise, in order to make appliances throughout the continent behave similarly throughout the region.
Good idea, it has merits.
So it was redeclared as 230V and the limits opened up, and now it just about fits all countries as intended, as most of them, if not all, had 220 to 240 or thereabouts as at least one of their useful voltages.

Notice I said "Europe" - I did not say "EU" or "EEC" - as it is not an EU thing as such, although admittedly the EU did have some representation at the thinking part of this process, quite reasonably I think.

It would be carnage if we tried to declare a worldwide voltage of 1000V plus or minus 100%, but if we did then suddenly everything would fit.
Of course I am joking but it helps to illustrate a particular point of thought.

Wouldn't it be nice if we got the rest of the world on 50Hz? Can't see that happening somehow, though. But we can live in hope.

I am pretty much OK with what we have now (except those damn 3 phase colours).

So I would class it pretty much as a job well done. So long as we realise the reality of the situation.

Anyway, that is my take on it, others might disagree.

Power loss thingy, I have not taken a real look at it yet, I am hoping it is not long before I do so.
 
Indeed, thinking a bit more ..... an increase of total current of 2 A will always result in a 0.2 V increase in voltage across the (0.1Ω) network resistance (regardless of the pre-existing current through the resistance).
Right.

so my intuition might well (probably does!) expect that the increase in power dissipated in the network resistance would always increase by 0.2 V x 2 A = 0.4 W, regardless of the pre-existing current.
Essentially you have three components to the increase in power dissipation.

extra voltage * extra current.
extra voltage * existing current.
existing voltage * extra current.

Your intuition seems to want to focus on the first of these, even though the latter two are normally more significant.
 
In my living memory our basic UK mains voltage was 240V, always was,

No, not always. It varied on what was available locally, through to the 1970s - maybe even later. Wholesalers, and lamp sellers, stocked a variety of lamp wattages and voltages.
 
I was told that a man came to the door a week before the change, counted the house's light bulbs, and handed over the same number of new ones in 240V.
 
Essentially you have three components to the increase in power dissipation.
extra voltage * extra current.
extra voltage * existing current.
existing voltage * extra current.
Your intuition seems to want to focus on the first of these, even though the latter two are normally more significant.
Thanks. Yes, that's clearly what is going on, and it is reflected by the maths. If the resistance is R, the existing current is Ie and the extra current is Ix, then (determining power as V*I):

Existing power dissipated = (Ie * R) * Ie
and
Total power with extra = [(Ie + Ix) * R] * (Ie + Ix)

hence:
Increased power dissipation = { [(Ie + Ix) * R] * (Ie + Ix) } - [(Ie * R) * Ie]
... which, with a bit of algebra, can be reduced to:
Increased power dissipation = [{ (2 * Ie * Ix) + Ix²} * R ]

... which with a bit more algebraic fiddling can be re-written as:
Increased power dissipation = [ (Ix * R) * Ix ] + [ (Ix * R) * Ie ] + [ (Ie * R) * Ix ]
.... those three terms in square brackets being the three components (of increased power dissipation) you mention.
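A quick numerical check (my own) that the reduced form and the three-component form agree, using the 0.1Ω figure from earlier:

```python
# Check that (2*Ie*Ix + Ix**2) * R equals the sum of the three components
# (Ix*R)*Ix + (Ix*R)*Ie + (Ie*R)*Ix, for a few example currents.
R = 0.1
for ie, ix in ((2.0, 2.0), (200.0, 2.0), (2000.0, 2.0)):
    reduced = (2 * ie * ix + ix**2) * R
    components = (ix * R) * ix + (ix * R) * ie + (ie * R) * ix
    assert abs(reduced - components) < 1e-9
    print(ie, ix, reduced)
```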

However, although I am (and, I suppose, always have been) 'convinced by the maths', that doesn't make it feel any more 'intuitive' to me - but I guess that is 'just me' ;)

I suppose that, as I mentioned before, the thing that feels (to me) most non-intuitive is that, for a given 'extra current', its impact on the power dissipated becomes progressively greater as it comes to be an increasingly small/'trivial' proportion of the total (essentially 'pre-existing') current. However, again, I 'understand' the reason (per the maths) - essentially due to the fact that the 'existing' current is also travelling through a resistance that has an increased voltage across it - but I'm afraid that that 'understanding' doesn't make it 'feel' much more intuitive to me :)
 
.... hence:
Increased power dissipation = { [(Ie + Ix) * R] * (Ie + Ix) } - [(Ie * R) * Ie]
... which, with a bit of algebra, can be reduced to:
Increased power dissipation = [{ (2 * Ie * Ix) + Ix²} * R ]
I could perhaps have added, since it illustrates the difference between the truth and my (flawed) 'intuition' ...

... that 'flawed intuition' would want the increased power dissipation to be Ix² * R - i.e. (Ix * Ix) * R. However, since the truth is ..
[{ (2 * Ie * Ix) + Ix²} * R ]
... the difference from my 'intuition' is that (2 * Ie * Ix) is added to the Ix².
 
To use an extreme example to illustrate how that approach apparently goes wrong when that assumption is not satisfied, consider the situation in which 100 identical 100Ω loads all start simultaneously and are all controlled so as to 'switch off' after 2 kWh of energy had been delivered to each load.
As you say, an extreme situation - and in practice won't happen. In part, that's why I suggested a large number of loads. Most large domestic loads are a) controlled and b) not in any way synchronised. So while the network load will not be constant, I expect it'll follow a fairly narrow spread centered around the average.
One notable exception would be night storage heating controlled by radio teleswitch - that would fit your description of many loads with synchronised switch-on.

BUT, I suspect things may start changing with the pressure to move to smart demand management - so there is potential for large numbers of loads all turning on when a cheap charging period starts.
 
In my living memory our basic UK mains voltage was 240V, always was, always will be in my humble opinion.
When I started work all the warning signs said 440 volts, or 254 volts single phase, but it was always 420 volts or 242 volts single phase, plus/minus 6%. When it went to 400 volts (230.9 volts single phase) the tolerance became plus 10% minus 6%, so the max volts went from 257 to 254 - hardly any change.

And for many years the supply stayed the same, but the solar panels and EV charging points changed that, with many finding the panels tripped on over-voltage, so we saw at long last the voltages drop to 230 volts - although not, it seems, with this house: today 242.4 volts as I write. The EU was 380 volts (220 volts single phase) and they went up to 400 volts (230 volts) in an attempt to harmonise, but in real terms nothing changed on either side of the channel.

The 16% variation is causing a problem with detecting loss of PEN; the way around the problem seems to be that the 50 volts to earth has been raised to 70 volts to earth for EV charging points, yet 25 volts can kill a cow, so even 50 volts is on the high side.

My system logs [screenshot: inverter event log] - a power cut is called "no AC connection", I assume? So a total of 4 times over a year it reports voltage out of range.

The recorded voltage is for a shorter time [screenshot: voltage log] - it shows 241.3 lowest and 248.2 highest, but only over the last hour and a quarter. The problem is that without the solar panels I would have no record of any event.
 
