Sorry for delay, I've been very distracted by several calls on my time
I never dreamed that I would ever find myself struggling to get my head around things which I had 'taken for granted' for 50+ years, but that seems to be what has happened since I 'made the mistake' of actually thinking about it!
I have come to the conclusion that much of what you (Simon) have been saying is correct, inasmuch as I cannot really argue with any of your maths. However, I had been relying more on 'physics intuition' than on the maths (and my simulation model was based on the same 'intuitive' assumptions), but I'm having grave difficulty in reconciling that (presumably incorrect!) intuition with the maths!
I will explain and discuss some of the 'intuitive anomalies' that I am trying to rationalise in a subsequent message, when I have a bit of time!
Indeed. With 'uncontrolled' loads (pretty rare for large loads in the real world), the supply voltage is irrelevant, the losses in the DNO network being the same percentage of the power supplied to the load, regardless of voltage. Similarly, with the figures you're using, that 10% figure is inevitable, again regardless of voltage. The same current (hence the same I²) will obviously flow through both network cables and load(s), so the power dissipated in each will simply be proportional to the resistances of network and loads. With a network resistance of 0.1Ω and an effective load resistance of 1Ω (100Ω / 100), the former will always be 10% of the latter, hence the power dissipated in the network is always 10% of that dissipated in the load(s)....... But, we were discussing losses in the network. Let's assume we have a loop impedance of 0.1Ω, and 100 such loads on the network.
At 200V, each 100Ω load will draw 2A, so 100 such loads will draw 200A and the volt drop will be 20V. The losses in the distribution network will therefore be 20V * 200A = 4,000W while supplying a total load of 40kW - so network losses are around 10% of supplied energy.
If we were to double the voltage to 400V, then without any controls, each load would pull 4A and 1,600W. 100 such loads would draw 400A and 160kW. The losses in the network would now be a 40V drop @ 400A, so 16kW - still around 10% of supplied energy.
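That uncontrolled-load arithmetic can be checked in a few lines of Python (my own sketch, using the 0.1Ω loop impedance and 100 × 100Ω loads assumed above):

```python
# Sketch: network I²R loss for 100 identical 100Ω resistive loads with
# no control, at 200V and 400V, against the 0.1Ω loop impedance assumed
# in the discussion (volt drop in the network is ignored, as above).
R_NETWORK = 0.1   # ohms, loop impedance
R_LOAD = 100.0    # ohms per load
N_LOADS = 100

results = {}
for v in (200.0, 400.0):
    i_per_load = v / R_LOAD               # A drawn by each load
    i_total = N_LOADS * i_per_load        # A flowing in the network
    p_network = i_total**2 * R_NETWORK    # W lost in the network
    p_supplied = v * i_total              # W drawn from the supply
    results[v] = (i_total, p_network, p_network / p_supplied)
    print(f"{v:.0f}V: {i_total:.0f}A, network loss {p_network:.0f}W "
          f"({100 * p_network / p_supplied:.1f}% of supply)")
```

At both voltages the network loss comes out at 10% of the supplied power, as the resistance-ratio argument says it must.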
As above, I can't dispute that maths, per se, but I think it only works ("is valid") if the timing and magnitudes of the loads are such that the total load is essentially the same at every point in time. Now consider if the loads are all thermostatically controlled. Each will now change its duty cycle to around 25% - 25% of 1,600W is 400W. At any point in time, "around" 25% of loads will be on, so the total current from 100 loads will be "around" 100A, give or take a bit - I suspect that if you plot current versus frequency it'll come out looking something like a bell curve.
.... But back to our discussion. We now have around 100A in our distribution network. That's going to give us around a 10V drop. Regardless of whether you use 10V * 100A or (100A)² * 0.1Ω, you get 1,000W. So for the same 40kW supplied, we now lose only 1kW in the network - only ¼ of the losses at 200V, and 2.5% of supplied energy.
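The "bell curve" suspicion is easy to test with a toy simulation (my own sketch, assuming the loads switch independently with a 25% duty cycle - a simplification, since real thermostats aren't independent of each other or of time):

```python
# Sketch: 100 thermostatic 400V loads, each 'on' independently with
# probability 0.25, drawing 4A each when on.  The total current is then
# binomially distributed, clustering around 100A as suggested above.
import random

random.seed(1)
R_NETWORK = 0.1     # ohms, loop impedance
I_PER_LOAD = 4.0    # A drawn by a 100Ω load when on at 400V
N_LOADS = 100
DUTY = 0.25

samples = []
for _ in range(10_000):
    n_on = sum(random.random() < DUTY for _ in range(N_LOADS))
    samples.append(n_on * I_PER_LOAD)

mean_i = sum(samples) / len(samples)
mean_loss = sum(i * i * R_NETWORK for i in samples) / len(samples)
print(f"mean current {mean_i:.1f}A, mean network loss {mean_loss:.0f}W")
```

The mean current does come out around 100A, and the mean loss around 1kW - slightly above 1,000W, in fact, because the loss depends on the mean of I² rather than the square of the mean, and the random spread about 100A adds a little.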
To use an extreme example to illustrate how that approach apparently goes wrong when that assumption is not satisfied, consider the situation in which 100 identical 100Ω loads all start simultaneously and are controlled so as to 'switch off' after 2 kWh of energy has been delivered to each load. All loads would therefore 'switch on' simultaneously and (at 200V) all would 'switch off' simultaneously 5 hours later.
With 200V, the current throughout that 5-hour period would be 200A, so the power dissipated in the (0.1Ω) DNO network would be (200² x 0.1) = 4,000 W. Over the full 5-hour period, the energy loss would therefore be 20 kWh.
Now move to 400V. Total current, when present, is now 400A, so the power dissipated in the DNO network (whilst current is flowing into the loads) becomes (400² x 0.1) = 16,000 W (16 kW). However, that current now flows for only 1.25 hours, so the total energy loss would again be 20 kWh - the same as with 200V.
If, in that situation, one took an approach similar to yours and looked at the 'average' current over the 5 hours, with 200V that average current would still be 200A, so the energy loss would still be 20 kWh. However, again looking over the 5-hour period, with 400V there would be a current of 400A for 1.25 hours, followed by zero current for 3.75 hours - an average current over the 5-hour period of 100A - and that would presumably lead you to conclude that the power loss was (100² x 0.1) = 1,000 W and hence that the energy loss over the 5 hours was only 5 kWh (one quarter of the true energy loss).
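To make that discrepancy concrete, here is the same arithmetic in a few lines (my own sketch of the 400V all-on/all-off case above): the true loss integrates I² while the current actually flows, whereas the 'average current' approach squares the 5-hour mean.

```python
# Sketch: network energy loss over the 5-hour window for the 400V case,
# computed correctly (I² while current flows) versus from the 5-hour
# average current.
R_NETWORK = 0.1          # ohms, loop impedance
I_ON = 400.0             # A while all loads are on
HOURS_ON = 1.25          # hours of current flow at 400V
WINDOW = 5.0             # hours in the comparison window

# Correct: 16kW for 1.25h
true_loss_kwh = I_ON**2 * R_NETWORK * HOURS_ON / 1000

# 'Average current' approach: 100A for the whole 5 hours
i_avg = I_ON * HOURS_ON / WINDOW
naive_loss_kwh = i_avg**2 * R_NETWORK * WINDOW / 1000

print(f"true loss {true_loss_kwh:.0f} kWh, "
      f"average-current estimate {naive_loss_kwh:.0f} kWh")
```

The true loss is 20 kWh; the average-current estimate is 5 kWh - underestimating by exactly the duty-cycle factor, because averaging the current before squaring it discards the on/off structure.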
Of course, such an extreme situation, with loads oscillating between 'high' and zero, is not going to arise in practice. However, I strongly suspect (and will investigate if/when I have some time!) that the further one gets from the situation of the load being the same at every point in time, the less will be the reduction in energy loss (over a period, such as 'a day') resulting from increasing voltage.
More when I have a bit more time!