"Getting back to the subject under discussion, why does a 100mA device not give discrimination over a 30mA one?"

As I explained before, if you have a 100mA RCD and a 30mA one in series and then gradually increase the (L-E) fault current (detected by the RCDs as an L-N imbalance), the 30mA one will obviously trip first. However, if a large L-E fault current arises suddenly (which is what happens with most real faults), the theoretical disconnection times of both devices will be extremely short - milliseconds or less (but obviously subject to a bit of variability between different devices) - so there is really no guarantee that the 30mA one will have broken the circuit before the 100mA one operates.
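To put some (purely illustrative) figures on that, the little sketch below uses a made-up time/current characteristic - not any real manufacturer's data - just to show how the gap between the two devices' operating times closes up as the fault current gets larger:

```python
# Purely illustrative - an arbitrary inverse-style time/current model,
# NOT real device data. The point is only the trend: the bigger the
# residual (L-E fault) current relative to a device's rated sensitivity,
# the faster it trips, and for large faults both devices end up down at
# a few tens of milliseconds or less.

def illustrative_trip_time_ms(fault_ma, rating_ma):
    """Very rough trip time (ms) for a residual current 'fault_ma' (mA)
    on an RCD rated at 'rating_ma' (mA). Arbitrary model, for illustration."""
    if fault_ma < rating_ma:
        return float("inf")              # below rated sensitivity: may not trip at all
    multiple = fault_ma / rating_ma
    return max(10.0, 300.0 / multiple)   # floors at ~10 ms for very large faults

for fault in (35, 60, 150, 500, 2000):   # residual current in mA
    t30 = illustrative_trip_time_ms(fault, 30)
    t100 = illustrative_trip_time_ms(fault, 100)
    print(f"{fault:5d} mA fault:  30mA RCD ~{t30:6.1f} ms   100mA RCD ~{t100:6.1f} ms")
```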
"And why would a 100mA device trip 9 times out of 10 before the 30mA one? Doesn't seem logical, unless there is a problem with the 100mA one...."

Indeed, that's not logical at all. It could be due to a problem with one of the RCDs or (as above) it may simply be due to individual variation, such that the (very short) disconnection time of the 100mA one happens to be slightly shorter than the (also very short, but not quite so short) disconnection time of the 30mA one in the face of a large L-N imbalance (i.e. a large L-E fault current).
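Just to illustrate what I mean by "individual variation" (again, the numbers are plucked out of the air, not measured from any real devices): if, for a big sudden fault, one particular 30mA RCD happens to operate in roughly 8-14 ms and one particular 100mA one in roughly 7-11 ms, the 100mA device will win the race most of the time, even though both are perfectly healthy:

```python
import random

# Purely illustrative - arbitrary assumed spreads, not real device data.
# Suppose, for a large sudden L-E fault, this particular 100mA RCD opens
# in roughly 7-11 ms and this particular 30mA one in roughly 8-14 ms
# (both well within spec, just built slightly differently).

random.seed(1)
trials = 10_000
faster_100 = sum(
    random.uniform(7, 11) < random.uniform(8, 14)   # does the 100mA open before the 30mA?
    for _ in range(trials)
)
print(f"100mA device opened first in {100 * faster_100 / trials:.0f}% of {trials} trials")
```

With those assumed spreads it comes out at around 8 times in 10 - much the sort of behaviour you describe, with nothing actually wrong with either device.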
Does that make sense?
Have you yet clarified what the device between the DNO's cutout and the meter actually is - an RCD or a voltage-operated ELCB?
Kind Regards, John.