Most of what I've found so far simply makes assertions similar, often almost identical, to what endecotp said, and which I don't yet understand. For example, as usual, one of the first hits was Wikipedia, which says "A Bitstream or 1-bit DAC is a consumer electronics marketing term describing an oversampling digital-to-analog converter (DAC) with an actual 1-bit DAC (that is, a simple "on/off" switch) in a delta-sigma loop operating at multiples of the sampling frequency. The combination is equivalent to a DAC with a larger number of bits (usually 16-20)". That interwebby thing must be full of info on how they work?
In the context of this thread, consider a meter designed to be able to measure a max peak current of about 70A (i.e. a max of about 50A RMS). I would have thought that one would then set the 0/1 changeover point of the ADC at about the mid-point - i.e. about 35A. To my simple mind, that means that when the peak current consumption is <35A (RMS <~25A), the changeover point will never be reached, no matter how fast one samples, so the output of the ADC will be a constant '0', regardless of what actual current is flowing.
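To illustrate what I mean, here is the "fixed threshold, no feedback" picture I have in my head, as a toy Python sketch (the 35A threshold and the function name are purely my own, for illustration):

```python
# Toy sketch of the naive picture: a single comparator with a fixed threshold
# at ~35A and no feedback at all. Numbers and names are mine, for illustration.
def naive_1bit_sample(current_amps, threshold=35.0):
    """Return 1 if the instantaneous current exceeds the threshold, else 0."""
    return 1 if current_amps > threshold else 0

# For any waveform that never exceeds 35A peak, every sample comes out as 0,
# however fast we sample - no information about the actual current:
samples = [naive_1bit_sample(25.0) for _ in range(1000)]
print(sum(samples))  # prints 0
```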
All I can think of at the moment is that perhaps the "delta-sigma loop" actually alters the ADC's 0/1 changeover threshold each time it iterates through the loop. By doing so, I suppose it could effectively emulate an N-bit ADC - but I would then call that an "emulated N-bit ADC", not a "1-bit" one. In effect, one would be time-multiplexing a 1-bit ADC in order to give it the functionality of an N-bit one - but whether or not that would result in a net reduction in the number of logic elements, I wouldn't like to guess/say!
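For what it's worth, here is my rough guess at what such a loop might look like - again just a Python sketch with made-up names and scaling, and I'd welcome correction. It's the standard first-order delta-sigma arrangement as I understand it: the fed-back 1-bit output is subtracted from the input and the error is accumulated, so the quantity being compared "moves around" even though the comparator threshold itself stays fixed:

```python
# My guess (sketch only) at the delta-sigma loop: the comparator threshold is
# fixed, but an integrator accumulates the error between the input and the
# fed-back 1-bit output, so the integrator value drifts up and down across
# the threshold. Variable names and scaling are my own.
def delta_sigma_1bit(samples, full_scale=70.0):
    """First-order delta-sigma: the bit-stream's average tracks the input."""
    integrator = 0.0
    bits = []
    for x in samples:
        # The 1-bit DAC in the feedback path: +full_scale for a '1', -full_scale for a '0'
        feedback = full_scale if bits and bits[-1] == 1 else -full_scale
        integrator += x - feedback            # accumulate the error (the "delta" then "sigma")
        bits.append(1 if integrator > 0 else 0)  # fixed-threshold comparator
    return bits

# A steady 25A input, well below the 35A "mid-point", still toggles the output:
bits = delta_sigma_1bit([25.0] * 10000)
ones_fraction = sum(bits) / len(bits)
print(ones_fraction * 140.0 - 70.0)  # decodes back to roughly 25
```

If I've got that right, even a steady 25A input produces a stream of 0s and 1s whose long-run average recovers the 25A - which would answer my own question above - but I'd be grateful if someone could confirm whether that's really how it works.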
Kind Regards, John