  #827  
19-02-22, 15:49
Chris Suslowicz
Junior Password Gnome
 
Join Date: May 2007
Location: England
Posts: 814

Bear in mind that resistors have a voltage rating too, and that high-value, low-wattage resistors may flash over rather than burn out if that rating is exceeded. For high-wattage resistors you can assume the voltage rating is adequate at the full power rating, so (from P = V²/R) the maximum voltage = square root of (resistance value x power rating).

For your 8.2k 10W resistors that's about 286 volts each, and with 4 in series about 1145 volts, which should be fine.
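If it helps, here's a quick back-of-the-envelope check of those figures (just a sketch; the 8.2k / 10W values are the ones quoted above):

[code]
import math

def max_voltage(resistance_ohms, power_watts):
    """Highest voltage a resistor can stand at its full power rating (P = V^2 / R)."""
    return math.sqrt(resistance_ohms * power_watts)

r, p = 8200, 10           # 8.2k ohm, 10 W resistor as discussed above
v_each = max_voltage(r, p)
v_chain = 4 * v_each      # four identical resistors in series share the voltage equally

print(f"Per resistor:   {v_each:.0f} V")   # ~286 V
print(f"Four in series: {v_chain:.0f} V")  # ~1145 V
[/code]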

Using them all in series as a test load, and with a suitably high-resistance multimeter (20k ohms per volt), you should have no problem measuring the output voltage on an approximately 40 watt load: connect the meter between ground and the 'hot' end of the lowest resistor in the chain, so it reads the voltage across that one resistor, i.e. a quarter of the total. (I'd suggest starting on the 1,000 volt range of the meter first, just in case.)

Then multiply the meter reading by 4 to get the actual voltage (approximate, because the resistors may only be within 20% of their marked value).
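To show the divider arithmetic, and that the meter loading really is negligible (a sketch only; the 1000 V supply figure is purely an assumed example, as is the 20k ohms-per-volt meter on its 1000 V range):

[code]
r_each = 8200.0                # nominal 8.2k ohm per resistor
supply = 1000.0                # assumed HT voltage, purely for illustration

# A 20k ohm-per-volt meter on its 1000 V range looks like a 20M ohm resistor.
meter_r = 20_000 * 1000

# The meter sits across the bottom resistor only, so it reads roughly supply / 4.
bottom = (r_each * meter_r) / (r_each + meter_r)   # bottom resistor in parallel with the meter
reading = supply * bottom / (3 * r_each + bottom)

print(f"Meter reading:    {reading:.1f} V")      # ~250 V, a quarter of the supply
print(f"Estimated supply: {4 * reading:.0f} V")  # reading x 4, before the 20% tolerance
[/code]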

Alternatively, you can put the meter in series with the whole resistor chain and measure the current through it. Put the meter between ground and the lowest resistor, and don't touch it while the power is on (if the meter goes open circuit, parts of it will be at full HT voltage, and that is definitely lethal). Then multiply the total resistance by the current drawn to get the applied voltage.
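The series-current version of the same sum (again a sketch; the 30 mA figure is an arbitrary example reading, not anything measured on the set in question, and the meter's own resistance is ignored):

[code]
r_total = 4 * 8200            # nominal resistance of the whole chain, ohms
measured_current = 0.030      # example meter reading of 30 mA (hypothetical figure)

applied_voltage = measured_current * r_total   # Ohm's law: V = I x R
print(f"Applied voltage: {applied_voltage:.0f} V")   # ~984 V for this example
[/code]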

Chris