I have found the Motor Generator thread fascinating and enlightening. But it has made many a reference to 400 Hz or other frequencies much higher than mains line frequency. Despite the comments about frequency, I'm still confused as to why the higher-than-mains frequency was used.

Were the higher frequencies used because they directly affected the amount of time, in (fractions of) seconds, between peaks of rectified (but not yet smoothed) power?

I ask because it seems to me like the percentage of time / duty cycle of raw (rectified but not yet smoothed) power would be the same at any and all frequencies. Is this assumption / understanding correct or completely off the mark?
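
To make my assumption concrete, here's a quick back-of-the-envelope Python sketch (my own illustration, not from the thread; the name peak_interval is mine) of the time between peaks of full-wave rectified AC. If I have it right, the duty cycle of the raw waveform is the same at any frequency, but the absolute gap shrinks in proportion to frequency:

    # Hypothetical sketch: seconds between peaks of full-wave rectified AC.
    # The raw waveform's duty cycle is the same at any frequency, but the
    # absolute gap shrinks in proportion to frequency.

    def peak_interval(line_hz, phases=1):
        """Seconds between peaks of full-wave rectified AC.

        Full-wave rectification doubles the ripple frequency; each
        additional phase multiplies it further.
        """
        return 1.0 / (2 * phases * line_hz)

    for hz in (50, 60, 400):
        print(f"{hz:>3} Hz single-phase: {peak_interval(hz) * 1000:.2f} ms between peaks")
    # 50 Hz -> 10.00 ms, 60 Hz -> 8.33 ms, 400 Hz -> 1.25 ms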

A few different people made references to the amount of capacitance needed at 400 Hz et al. vs. 50/60 Hz mains frequency. Someone even spoke about high power DC being produced by polyphase converters and the possibility of tweaking winding voltages in order to possibly do away with the need for capacitors.

Am I starting to understand the motivation behind the 400 Hz, or is there something else behind it? Is this really playing to the (dis)charge time of capacitors in between peaks of rectified (but not smoothed) sources?
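
Here's the capacitor-sizing arithmetic as I understand it, using the first-order rule C = I * dt / dV. This is my own sketch, and the load figures (1 A of load, 1 V of allowed ripple) are made up purely for illustration:

    # Hypothetical sketch of first-order reservoir-capacitor sizing,
    # C = I * dt / dV, where dt is the gap between rectified peaks.
    # The load figures (1 A, 1 V of allowed ripple) are made up.

    def reservoir_cap_farads(load_amps, ripple_volts, line_hz, phases=1):
        """Capacitance needed to carry the load between rectified peaks."""
        dt = 1.0 / (2 * phases * line_hz)  # full-wave ripple period
        return load_amps * dt / ripple_volts

    for hz in (60, 400):
        c = reservoir_cap_farads(1.0, 1.0, hz)
        print(f"{hz:>3} Hz: about {c * 1e6:.0f} uF")
    # 60 Hz -> ~8333 uF, 400 Hz -> ~1250 uF; more phases shrink dt further,
    # which might be how a polyphase converter could need little or no
    # capacitance at all.

If that arithmetic holds, the required capacitance scales down roughly in proportion to frequency, which would explain the 400 Hz comments. But please correct me if I've gotten the rule of thumb wrong.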

Aside: I started a new thread for this very specific minutia so as not to mire down the other Motor Generator thread.

Thank you for all the comments, and to those who respond to help me learn something new today. :-)



--
Grant. . . .
unix || die
