I wasn't aware that a stable output voltage from a varying input voltage was a requirement of a switch mode PSU.
Most electronic 'transformers' used for ELV lighting are more akin to SMPSs, capable of delivering a stable output over a range of input voltages, so the simple model of a fixed resistance load goes out the window.
So, just because you quote a singular source of opinion on a DIY forum, that becomes the de facto definition of a switch mode power supply? It may well be the case with many SMPSUs, but it is not a requirement in order for a power supply to be classified as switch mode. As the name suggests, it's called Switch Mode because of the high-frequency switching method employed to regulate the output voltage.
Anyway, a modern dimmer doesn't vary the voltage to a load (be it a lamp or transformer),
Of course it does - if it didn't then a directly connected incandescent lamp would not dim.
I think you'll find it would, and it works because you're varying the amount of time the filament is powered ON for, vs the amount of time it's OFF for. It works because the switching happens many times a second (on 50 Hz mains, 100 times a second - once every half-cycle), such that it cannot be detected by the human eye. The fact that the filament does not have enough time to cool down between cycles also hides the way the dimmer is really working.
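The timing arithmetic above can be sketched in a few lines of Python (the 50 Hz supply and the 5 ms firing delay are illustrative assumptions, not taken from any particular dimmer):

```python
mains_hz = 50
half_cycle_ms = 1000 / (2 * mains_hz)   # 10 ms per half-cycle
switchings_per_second = 2 * mains_hz    # the lamp is re-fired 100 times a second

# A firing delay of 5 ms into each 10 ms half-cycle leaves the
# filament powered for 50% of the time.
firing_delay_ms = 5
on_fraction = (half_cycle_ms - firing_delay_ms) / half_cycle_ms

print(switchings_per_second)  # 100
print(on_fraction)            # 0.5
```

At 100 switchings a second the off periods are far shorter than both the eye's flicker threshold and the filament's thermal time constant, which is why the lamp simply looks dimmer rather than flickering.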
It could be, but is it?
Do mass-market consumer dimmers use PWM, or just a simple thyristor circuit which fires at a given point in the cycle?
A thyristor firing at a certain point in the cycle is a form of PWM. The timing of the firing vs the zero crossing point is varied to give different levels of dimming.
Also worth noting that PWM dimming will work on a DC supply too (although of course it will not have to be in sync with any zero crossing point). A sine wave of varying voltage is not a requirement of the dimming process, as the on/off ratio of the lamp is what's important, not a smoothly varying current in the load.
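A quick numeric check of the DC case - a minimal sketch (the 12 V supply and 25% duty cycle are arbitrary illustrative values) showing that the average voltage delivered by a PWM waveform is simply the duty cycle times the supply voltage:

```python
def pwm_wave(v_supply, duty, samples=10000):
    """One PWM period sampled as a list: ON for `duty` of the period, OFF otherwise."""
    on = int(samples * duty)
    return [v_supply] * on + [0.0] * (samples - on)

def mean(wave):
    return sum(wave) / len(wave)

wave = pwm_wave(12.0, 0.25)   # 12 V DC chopped at 25% duty cycle
print(mean(wave))             # 3.0 - a quarter of the supply, on average
```

No zero crossing is involved; the dimming level is set entirely by the on/off ratio, exactly as the post argues.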
In fact, thyristor/SCR/triac dimmers have to be in sync with zero crossing on an AC supply because the control circuit is only able to turn the thyristor ON; after that it is latched, and will only stop conducting once the current through it falls to zero at the next zero crossing point.
Some more modern designs employ IGBTs and the like, which can be turned off at any point in the cycle. These designs are preferred as they allow the load current to ramp up gradually as the supply voltage in each half-cycle rises, avoiding the RFI (and the requirement for large chokes) generally associated with switching large loads on mid-cycle.
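The switch-on behaviour of the two designs can be sketched numerically (230 V 50 Hz mains and a 90° firing angle are illustrative assumptions). A leading-edge thyristor firing mid-cycle slams the load from zero to the instantaneous sine voltage in one step, whereas a trailing-edge device turns on at the zero crossing, so the turn-on step is zero:

```python
import math

V_PK = 230 * math.sqrt(2)   # peak of a 230 V RMS sine

def leading_edge_turn_on_step(firing_angle):
    """Thyristor fires mid-cycle: voltage jumps from 0 to sin(angle) * Vpk."""
    return V_PK * math.sin(firing_angle)

def trailing_edge_turn_on_step():
    """IGBT turns on at the zero crossing: no step, current ramps with the sine."""
    return 0.0

step = leading_edge_turn_on_step(math.pi / 2)   # firing at 90 degrees
print(round(step, 1))   # 325.3 - volts applied to the load in one jump
```

That abrupt ~325 V edge is rich in high-frequency harmonics, which is exactly why leading-edge dimmers need chokes and trailing-edge designs are quieter.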
so an SMPSU would be able to look at the mark/space ratio of the incoming PWM-modulated AC mains, and approximate this to the required output on the ELV side for the lamp.
It could, but does it?
Do mass-market consumer ELV supplies do that?
Good question. While I have a fair amount of experience with dimmers, having built computer controlled dimming units in the past, I confess to not knowing huge amounts about the workings of mass-market dimmable SMPSUs as used to supply ELV lighting.
Equally, in contradiction to my last post, it could work merely on the basis that the incoming AC is rectified to DC inside the SMPSU before being chopped, which would produce a varying DC voltage based on the duty cycle of the incoming PWM-modulated 230 V from the dimmer.
If you have any info yourself on the exact workings of a dimmable switching transformer then I'd be interested to know more.
I suppose you could argue that if you approximate the output of a PWM dimmer to an RMS voltage, you could say that the output voltage is being varied. However, as the dimmer can only turn its output on and off at varying points and for varying durations, the peak voltage at the output will still remain the same, and the dimming is achieved by only passing current through the load for a set portion of each half-cycle.
In other words the voltage is reduced.
The only voltage we ever talk about is the RMS one. Its value comes from the integral of the squared waveform over a cycle - if parts of the waveform are chopped off, the area under that curve is reduced, i.e. the RMS voltage is reduced, even though the peak stays the same.
Agreed, but let's keep it in context - strictly speaking, the dimmer still isn't reducing the voltage in the same way as a variac. With a scope, you'd see that the variac reduces the overall amplitude of the sine wave, whereas the PWM dimmer cuts parts of the wave away.
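The scope comparison can be sketched numerically - a minimal Python check (assuming 230 V 50 Hz mains and an arbitrary 90° firing angle, chosen so both methods land on the same RMS) showing that the variac and the leading-edge dimmer both lower the RMS, but only the variac lowers the peak:

```python
import math

def rms(samples):
    return math.sqrt(sum(v * v for v in samples) / len(samples))

N = 100000
theta = [math.pi * i / N for i in range(N)]   # one half-cycle, 0 to pi
v_pk = 230 * math.sqrt(2)

full = [v_pk * math.sin(t) for t in theta]

# Variac at ~70.7%: the whole sine shrinks in amplitude.
variac = [0.7071 * v for v in full]

# Leading-edge dimmer firing at 90 degrees: output held at zero until the firing angle.
alpha = math.pi / 2
chopped = [v if t >= alpha else 0.0 for t, v in zip(theta, full)]

print(round(rms(full), 1))     # 230.0
print(round(rms(variac), 1))   # 162.6
print(round(rms(chopped), 1))  # 162.6 - same RMS as the variac...
print(round(max(variac), 1))   # 230.0 - variac peak reduced
print(round(max(chopped), 1))  # 325.3 - ...but the dimmer's peak is unchanged
```

Same RMS delivered to the lamp, completely different waveforms - which is the distinction being argued here.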