In the past, my meters measured in hundredths of an Ω, so a reading of, say, 0.35 Ω could be converted to amps to give the PSCC. Some testers assume 230 volts, so 0.35 Ω works out as 230 V / 0.35 Ω ≈ 660 A.
But my cheap new meter only measures in tenths of an Ω, and a display of 0.3 Ω alongside 0.01 V does not seem to make sense. Is the 0.01 V the voltage difference between the on-load and off-load test? Set for a 30 mA RCD, 9 mA is the maximum non-trip leakage, and I know many testers are designed to draw 9 mA. But however I try to work it out, I can't see how 0.01 V equates to 0.30 Ω — 9 mA × 0.30 Ω would only be 0.0027 V.
RCD tester works well, and 0.30 Ω seems about right, but what do the volts refer to?