Do amplification circuits require a DC power supply to function properly?
Because an amplifier does not truly amplify the original signal. The signal at the amplifier's output is mostly new energy drawn from the power supply, and it has little to do with the signal being amplified. The so-called "amplification" is an illusion.
A transistor working in the switching state is equivalent to a small current opening a large switch; working in the amplifying state, the degree to which that switch opens varies continuously with the small current. The "appropriate" power supply the questioner mentions means a supply whose available voltage and current exceed the voltage and current of the amplified output signal. The essence of "amplification" is to introduce new energy and release it in step with the variations of the input signal, at several times the input's intensity, then deliver it to the user (the load) as if it were the original signal. Obviously, if the supply is inadequate (for example, a 3 W supply feeding a 5 W amplifier), the amplifier cannot function properly.
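This can be sketched numerically: the output is supply energy shaped by the input, so it can never exceed what the supply rails can provide. In the toy model below (a sketch, not a real amplifier model; the gain and rail voltages are assumed values), an adequate supply reproduces the input shape at a larger amplitude, while an undersized supply clips it.

```python
import math

def amplify(signal, gain, v_supply):
    """Model an amplifier as supply energy shaped by the input.

    The output is clipped to the supply rails: the amplifier can only
    release energy that the supply actually provides.
    """
    out = []
    for v in signal:
        v_out = gain * v
        # clip to +/- v_supply -- an undersized supply flattens the peaks
        v_out = max(-v_supply, min(v_supply, v_out))
        out.append(v_out)
    return out

# a small 0.1 V sine input and a gain of 100 (assumed values)
t = [i / 100 for i in range(100)]
signal = [0.1 * math.sin(2 * math.pi * x) for x in t]

ok = amplify(signal, gain=100, v_supply=12)   # 12 V rails: clean 10 V swing
bad = amplify(signal, gain=100, v_supply=5)   # 5 V rails: clipped output

print(max(ok))   # peaks near 10 V, a faithful copy of the input shape
print(max(bad))  # stuck at 5 V, flattened by the inadequate supply
```

The clipped case is the waveform-level picture of "a 3 W supply powering a 5 W amplifier": the input still wiggles the switch, but there is no more supply energy to release.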
These are the upper and lower bias resistors of the transistor; they provide the base voltage required for the transistor to operate normally. R3 is the collector load, and the amplified current produces a voltage drop across it. That continuously varying voltage drop is the so-called "amplified" signal.
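The bias arithmetic above can be worked through for a standard voltage-divider-biased common-emitter stage. All component values below are assumed for illustration (the original circuit diagram is not shown here), and an emitter resistor RE is added as a typical part of such a stage; only R3 is named in the text.

```python
# Assumed values for a voltage-divider-biased common-emitter stage.
VCC = 12.0        # supply voltage (V), assumed
R_UPPER = 47e3    # upper bias resistor (ohms), assumed
R_LOWER = 10e3    # lower bias resistor (ohms), assumed
R3 = 2.2e3        # collector load from the text (ohms), value assumed
RE = 1e3          # emitter resistor (ohms), assumed
VBE = 0.7         # typical silicon base-emitter drop (V)

# The divider fixes the base voltage (ignoring base current):
v_base = VCC * R_LOWER / (R_UPPER + R_LOWER)

# The emitter follows the base, one diode drop lower:
v_emitter = v_base - VBE
i_collector = v_emitter / RE          # IC ~ IE for reasonable beta

# Quiescent drop across R3.  The input signal wiggles i_collector,
# and this drop wiggles with it -- that varying drop is the output.
v_r3 = i_collector * R3
v_collector = VCC - v_r3

print(f"base voltage       {v_base:.2f} V")
print(f"collector current  {i_collector * 1000:.2f} mA")
print(f"drop across R3     {v_r3:.2f} V")
print(f"collector voltage  {v_collector:.2f} V")
```

With these assumed values the divider sets the base near 2.1 V, roughly 1.4 mA flows, and about 3.1 V is dropped across R3; a small change in base voltage shifts the collector current and hence this drop, which is where the output swing comes from.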
