How can I determine the input amperage requirements for a transformer? For example, if I have a 230 V AC input and want 100 V DC at 40 A after rectification. I have a basic idea, but I don't know about other factors such as transformer efficiency, etc.
In the ideal case, watts in = watts out.
Then there is the transformer's efficiency (extra watts drawn by the primary, i.e., watts lost as heat). If a transformer is 75% efficient, then 100 watts in will give you 75 watts out (25% power loss).
Watts = volts × amps.
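
Putting those together for the numbers in the question: 100 V × 40 A = 4000 W out; at an assumed 75% efficiency, that means about 5333 W in, or roughly 23 A from the 230 V mains. Here's a minimal Python sketch of that calculation. The 75% figure is just an assumption (it lumps transformer and rectifier losses together), and a real rectifier/filter circuit draws non-sinusoidal current, so the actual RMS primary current will run somewhat higher than this estimate:

```python
# Rough transformer input-current estimate.
# Assumed values: 75% overall efficiency, losses from
# transformer and rectifier lumped into one figure.

V_DC_OUT = 100.0    # desired DC output voltage (V)
I_DC_OUT = 40.0     # desired DC output current (A)
V_AC_IN = 230.0     # mains input voltage (V RMS)
EFFICIENCY = 0.75   # assumed overall efficiency

p_out = V_DC_OUT * I_DC_OUT   # output power: 100 * 40 = 4000 W
p_in = p_out / EFFICIENCY     # required input power: ~5333 W
i_in = p_in / V_AC_IN         # estimated primary current: ~23.2 A

print(f"Output power : {p_out:.0f} W")
print(f"Input power  : {p_in:.0f} W")
print(f"Input current: {i_in:.1f} A")
```

Swap in the measured or datasheet efficiency for your actual transformer to tighten up the estimate.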