10th Sep 2019, 12:36 am   #1
unitelex
Cathode follower line out - best practice

Working on a Ferrograph 631 which had a line output fitted not so long ago.
The installer did a neat job and added an ECC82 configured as a cathode follower. There is a 470K grid-to-ground resistor and a 2K2 cathode resistor, with a DC blocking capacitor on the output.
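
As a rough check on what is there now, this is the sort of mid-band output impedance I would expect from that arrangement. The ECC82 figures are assumed datasheet-style typicals (mu about 17, ra about 7.7k at 10 mA, so somewhat higher at the lower current here), not measurements:

[CODE]
# Rough mid-band output impedance of the follower as fitted.
# ECC82 figures are assumed typicals, not measured values.
mu = 17.0      # amplification factor (assumed)
ra = 10e3      # anode resistance in ohms at ~4 mA (assumed, higher than the 10 mA figure)
Rk = 2.2e3     # cathode resistor as fitted

def parallel(a, b):
    return a * b / (a + b)

Zk = ra / (mu + 1.0)       # impedance looking into the cathode, roughly 1/gm
Zout = parallel(Rk, Zk)    # the cathode resistor shunts it further

print(f"Zk   ~= {Zk:.0f} ohm")
print(f"Zout ~= {Zout:.0f} ohm")   # a few hundred ohms with these figures
[/CODE]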

Anode current is about 4 mA, taken from C20. That extra current pulls the HT for the rest of the equipment down by some 50 V, causing other problems, so it is not ideal.
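
Back of an envelope, that sag implies roughly 12.5k of effective resistance in the HT feed from C20, assuming the whole 50 V appears across that feed:

[CODE]
# Back of an envelope: effective resistance in the HT feed from C20,
# assuming the whole 50 V of sag appears across it.
sag = 50.0       # volts of HT drop observed
ia  = 4e-3       # follower anode current, amps

r_feed = sag / ia
print(f"implied feed resistance ~= {r_feed / 1e3:.1f} k")   # about 12.5 k

# If the follower current were halved, the sag across the same feed
# resistance would roughly halve as well.
print(f"sag at 2 mA ~= {2e-3 * r_feed:.0f} V")
[/CODE]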

To reduce the anode current I was considering increasing the cathode resistor, but I am mindful of the good practice of keeping the output impedance low when driving line-level inputs.
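
To get a feel for that trade-off I sketched a rough sweep of anode current and output impedance against the cathode resistor, using a simple 3/2-power triode model. The mu, perveance and supply voltage below are assumed typical figures rather than measured ones, so treat the numbers as indicative only (SPICE will do it properly):

[CODE]
# Rough sweep of anode current and output impedance against the cathode
# resistor for a self-biased ECC82 follower (grid resistor to ground,
# anode straight to the supply node). All device and supply figures are
# assumed typicals, not measurements.

MU = 17.0        # ECC82 amplification factor (typical)
K  = 0.68e-3     # perveance, A/V^1.5, fitted to a datasheet point (assumed)
HT = 250.0       # supply voltage at the follower's anode (assumed)

def anode_current(rk, ht=HT):
    """Solve Ia for a self-biased follower by bisection on the 3/2-power law."""
    lo, hi = 0.0, ht / rk
    ia = 0.0
    for _ in range(60):
        ia = 0.5 * (lo + hi)
        vk = ia * rk                    # cathode voltage rises with current
        drive = (ht - vk) / MU - vk     # Vak/mu + Vgk, with Vgk = -Vk
        if K * max(drive, 0.0) ** 1.5 > ia:
            lo = ia                     # model wants more current: raise the guess
        else:
            hi = ia
    return ia

def zout(rk, ia, ht=HT):
    """Mid-band output impedance, roughly Rk || 1/gm from the same model."""
    drive = (ht - ia * rk) / MU - ia * rk
    gm = 1.5 * K * max(drive, 1e-9) ** 0.5    # dIa/dVg of the 3/2-power law
    return rk * (1.0 / gm) / (rk + 1.0 / gm)

for rk in (2.2e3, 4.7e3, 6.8e3, 10e3):
    ia = anode_current(rk)
    print(f"Rk = {rk / 1e3:4.1f}k   Ia ~= {ia * 1e3:3.1f} mA   Zout ~= {zout(rk, ia):4.0f} ohm")
[/CODE]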

Apparently this additional line out is used to drive the line input of some modern transistorized/digital equipment over a short cable, so I am not convinced it needs a very low impedance, but I don't have that equipment here to try it. I am therefore aiming for a reasonable compromise.

What is a reasonable line output impedance?

I have considered a few options:
1. A series chain in the cathode with a higher overall resistance, taking the output from across the lower resistor to keep the impedance down at the cost of some attenuation (see the first sketch after this list). Driving the preceding stage harder would recover the signal level, but this simple cathode follower is not biased to handle a large input signal; a SPICE simulation showed it close to clipping.

2. I have seen better biasing arrangements with the grid resistor returned to a tap in the cathode chain (see the second sketch below). That way it handles a bigger signal without clipping, although I don't want to drive the preceding stage too hard either, to keep its distortion down.
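
For option 1, here is a quick sketch of what a tapped cathode chain does to the output impedance and level. The resistor values and the ~500 ohm intrinsic cathode impedance (ra/(mu+1)) are placeholders, not a proposal:

[CODE]
import math

# Option 1: split cathode chain, output taken from the tap between R1 and R2.
# R1 is cathode-to-tap, R2 is tap-to-ground. Placeholder values only.
Zk = 500.0      # follower's own cathode impedance at the operating point (assumed)
R1 = 4.7e3      # upper part of the chain (placeholder)
R2 = 1.0e3      # lower part, output taken across this (placeholder)

def parallel(a, b):
    return a * b / (a + b)

zout_tap = parallel(R2, R1 + Zk)     # what the line input sees at the tap
atten = R2 / (R1 + R2)               # level lost by tapping down the chain

print(f"Zout at the tap ~= {zout_tap:.0f} ohm")
print(f"tap attenuation ~= {20 * math.log10(atten):.1f} dB")
[/CODE]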
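
And for option 2, the arithmetic of returning the grid resistor to a tap. The quiescent values below are illustrative assumptions only, just to show where the bias and the extra headroom come from:

[CODE]
# Option 2: grid resistor returned to a tap in the cathode chain.
# DC bias is set only by the resistor between cathode and tap (R_bias),
# while the resistor below the tap (R_lower) lifts the whole cathode well
# above ground, which is where the extra headroom comes from.
# All values are illustrative assumptions, not a worked design.
Ia      = 1.5e-3     # guessed quiescent current
R_bias  = 6.8e3      # cathode to tap, sets Vgk
R_lower = 22e3       # tap to ground, sets the DC pedestal

Vgk = -Ia * R_bias                  # grid-to-cathode bias
Vk  = Ia * (R_bias + R_lower)       # quiescent cathode (output) voltage

print(f"bias Vgk ~= {Vgk:.1f} V")           # a few volts negative, as usual
print(f"cathode sits at ~= {Vk:.0f} V")     # tens of volts above ground

# Rough headroom: the cathode can swing down by about Ia * (R_bias + R_lower)
# before the valve cuts off (ignoring the external load).
print(f"downward swing before cutoff ~= {Vk:.0f} V peak")
[/CODE]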

What is best practice for a cathode follower to get a sufficiently low output impedance without drawing a high anode current?