I am much more familiar with Dynaco ss designs than with valve ones. In ss designs, Dynaco frequently used "select" semiconductors that were at the tail end of the curve of standard specs. This meant that if they failed, you could not replace them with "off the shelf" parts of the same type, even back when the parts were available. You either got the part from Dynaco (if you were lucky) or substituted a different part altogether, with all that entails. A good example is the power supply of the ST-120. I am rebuilding one now with Van Alstine's mod, which uses a MOSFET instead of the unobtainable transistor. Any ST-120 rebuilt with the "stock" transistor, unless it is NOS from Dynaco (very unlikely), will usually fail the first time the amp is pushed at all. All of which is 'off topic' in this forum!
PeterCapo wrote:
Dmtparker, I don't think the original Dynacos used "special" parts to achieve their goals such that the original designs are not viable with parts available today.
Bias voltage for ST70 with VTA driver board?
dmtparker- Posts : 14
Join date : 2014-09-25
Location : Bocas del Toro, Panama
GP49- Posts : 792
Join date : 2009-04-30
Location : East of the sun and west of the moon
dmtparker wrote:
I am much more familiar with Dynaco ss designs than with valve ones...Dynaco frequently used "select" semiconductors that were at the tail end of the curve of standard specs. This meant that if they failed, you could not replace them with "off the shelf" parts of the same type, even back when the parts were available. You either got the part from Dynaco (if you were lucky) or substituted a different part altogether, with all that entails. A good example is the power supply of the ST-120....
The outputs on the Stereo 120, too. Although the original manual said "2N3055," there were additional specifications (gain, breakdown) in the parts list which, if not observed, would result in catastrophic failure. The 2N3055 could handle all manner of current, but if a particular sample's breakdown voltage was too low (a sample could be within 'official' 2N3055 specifications yet still not meet Dynaco's requirements), all bets were off.
I often saw Dynaco Stereo 120s cross my bench with blown generic 2N3055 replacements that had failed as soon as the homebrew-repaired amplifier was turned on. Most of the time, if one went, the other would too, and most likely the drivers as well.
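As an illustration of the "select parts" problem described above: a transistor can meet the registered 2N3055 limits yet fail a tighter house screen. In the sketch below, the 60 V V_CEO floor and hFE minimum of 20 are the registered 2N3055 limits; the 80 V / hFE 40 "select" screen and the roughly 72 V rail are assumed stand-ins, since the thread doesn't give Dynaco's exact requirements.

```python
# Sketch: a sample can meet the registered 2N3055 spec yet fail a tighter
# "select" screen. The 60 V / hFE 20 floors are the registered 2N3055
# minimums; the 80 V / hFE 40 screen and the ~72 V rail are ASSUMED
# stand-ins for Dynaco's requirements, for illustration only.

GENERIC_MIN = {"vceo": 60, "hfe": 20}   # registered 2N3055 minimums
SELECT_MIN = {"vceo": 80, "hfe": 40}    # hypothetical Dynaco screen
RAIL_VOLTS = 72                         # assumed Stereo 120 supply rail

def meets(sample, floor):
    """True if a measured sample meets every minimum in the given spec."""
    return all(sample[key] >= floor[key] for key in floor)

samples = [
    {"name": "generic A", "vceo": 65, "hfe": 35},  # legal 2N3055, weak sample
    {"name": "select B", "vceo": 95, "hfe": 55},   # the kind Dynaco kept
]

for s in samples:
    print(f"{s['name']}: registered-spec {meets(s, GENERIC_MIN)}, "
          f"select-spec {meets(s, SELECT_MIN)}, "
          f"margin over {RAIL_VOLTS} V rail: {s['vceo'] - RAIL_VOLTS} V")
```

The weak sample shows negative margin against the rail, which is exactly the "blown generic replacement on power-up" failure described above.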
As Dmtparker mentions, the power supply transistor was even more picky, though it, too, was identified as a 2N3055.
It wasn't until a few years later that a revision specified new transistor numbers for replacement. It also included upgrading of the driver transistors from TO-39 types to TO-220 with substantially higher ratings, and other changes to improve reliability.
It was ironic that two of the most technician-disliked components of that era, the Garrard Lab 80 automatic turntable and the Dynaco Stereo 120 power amplifier, were ones I fixed by the hundreds, restoring them to reliable, years-long service, and in the process I made a lot of money. I had other shops and dealers sending them to me because they didn't want to try fixing them!
Last edited by GP49 on Thu Jan 22, 2015 7:04 pm; edited 2 times in total
GP49- Posts : 792
Join date : 2009-04-30
Location : East of the sun and west of the moon
Yes. I'd forgotten the "official" name Dynaco gave to the modifications. The sheets that came out describing them recommended that if a Stereo 120 needed service, the modification be done regardless, even to a good channel. I think TIP referred to the Texas Instruments transistors in TO-220 casings, but I found that they weren't all that critical; there were plenty of TO-220 types more than adequate as substitutes.
defec- Posts : 9
Join date : 2010-12-19
Hi Bob, how's it going? Anyway, I believe the extra biasing has more to do with the "limited" power supply of the ST70 than with the driver board; that may be why Mr. Gillespie noted that THD climbed when the output tubes weren't biased into Class A operation. Mike Samera insisted to me years ago that the VTA board also sounded better biased at 50 mA. The VTA board retains pretty much the same power supply section (rectifier/choke/output transformer/power tubes) as a stock ST70.
Of course, I could be off on this; just food for thought. It would be interesting to see the VTA board's distortion measured the same way as in Dave Gillespie's article, to see whether or not it benefits from the same biasing.
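To put rough numbers on the "limited power supply" point: below is a back-of-the-envelope sketch of the extra idle draw a quad of EL34s places on the supply at 50 mA versus 40 mA per tube. The 420 V B+ figure is an assumed round number for an ST-70-style supply, not a measurement from the thread.

```python
# Rough sketch: extra idle draw on the supply at 50 mA vs 40 mA per tube.
# B_PLUS is an assumed round figure for an ST-70-style supply; measure
# your own amp rather than trusting this number.

B_PLUS = 420.0   # volts, assumed B+ (illustrative only)
TUBES = 4        # quad of EL34 outputs

def idle_supply_watts(ma_per_tube, b_plus=B_PLUS, tubes=TUBES):
    """Approximate idle power the output stage pulls from the supply."""
    return b_plus * (ma_per_tube / 1000.0) * tubes

for ma in (40, 50):
    print(f"{ma} mA/tube -> about {idle_supply_watts(ma):.0f} W idle draw")

extra = idle_supply_watts(50) - idle_supply_watts(40)
print(f"biasing hotter costs roughly {extra:.0f} W of continuous headroom")
```

Under those assumptions the hotter bias costs the supply about 17 W of continuous headroom, which is the kind of margin a "limited" supply would feel.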
Bob Latino wrote:
dmtparker wrote:
I have seen in several places your dictum to set bias @ 40 mA/tube, but David Gillespie, in his article "Dynaco ST-70, Baseline Testing," says the following:
Bob Latino wrote:
Set each EL34 output tube's bias at .400 volts DC ..
Bob
"For example, with a Biaset voltage of 1.40 vdc (45 ma total current flow per tube), 1 kHz THD climbs nearly 55%. If you reduce it further to 1.25 vdc (40 ma per tube) it climbs another 170%, for a total of 225%! So, reducing the current draw of each tube by 6% of its rated cathode current, when it is already operating at just 30% of rated cathode current and 62% of rated plate dissipation, all to gain what, 225% more distortion? That is a very poor compromise to make, versus the very well thought out operating conditions that Hafler set up for the tubes to operate at."
That seems like an awful trade-off. What gives? Do you have different measurements of THD @ 40 mA vs. 50 mA? I'm sure my original ST-70 had hundreds and hundreds of hours on it @ 50 mA bias, as it was my main amp all through college, and I never replaced the tubes.
Yes - BUT - Mr. Gillespie is working with an ST-70 with a STOCK 7199 (or 6GH8A) driver board. All his facts and figures relate to the stock driver circuit. The VTA driver board is a different animal and has lower distortion than the stock driver board at all drive levels .. maybe biasing an amp with that 7199 driver circuit at higher levels (55 milliamps per EL34) does reduce distortion, but it will surely lead to shorter tube life than biasing at 40 milliamps.
Answer this one > If the original 7199 driver circuit was as "good" as Mr. Gillespie says, how come no other manufacturer besides Dynaco ever used this 7199 "one driver tube per channel" circuit? Dynaco used this 7199 driver circuit not because it was good, but because it saved one tube per amp. A saved 7199 driver tube @ $1.50 (1960s pricing) x 350,000 amps = over half a million 1960s dollars saved. When the whole ST-70 kit, with tube cage and all tubes, cost $99 in 1960, saving $1.50 per unit means something.
Bob
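For readers trying to reconcile "0.400 volts DC" with "40 mA per tube": the two are just Ohm's law across the bias-sense resistor. A minimal sketch follows, assuming the commonly cited 15.6 ohm shared Biaset resistor on the stock board (two tubes through one resistor) and a 10 ohm individual cathode resistor where 0.400 V is specified; verify the resistor values on your own board before trusting these numbers.

```python
# Sketch: bias-meter voltage to per-tube idle current via Ohm's law.
# Assumes the commonly cited 15.6 ohm Biaset resistor shared by a PAIR
# of EL34s on the stock board, and a 10 ohm individual cathode resistor
# where "0.400 V per tube" is specified. Check your own board's values.

def per_tube_ma(volts, resistor_ohms, tubes_sharing=1):
    """Idle current per tube in mA: I = V / R, split across sharing tubes."""
    return volts / resistor_ohms * 1000.0 / tubes_sharing

# Stock-board figures matching Gillespie's article (two tubes per resistor):
for v in (1.56, 1.40, 1.25):
    print(f"Biaset {v:.2f} V -> {per_tube_ma(v, 15.6, tubes_sharing=2):.0f} mA/tube")

# An individual 10 ohm resistor read at 0.400 V:
print(f"0.400 V across 10 ohms -> {per_tube_ma(0.400, 10):.0f} mA/tube")
```

This reproduces the figures quoted above: 1.56 V gives 50 mA/tube, 1.40 V about 45 mA, 1.25 V about 40 mA, and 0.400 V across 10 ohms is 40 mA.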
GP49- Posts : 792
Join date : 2009-04-30
Location : East of the sun and west of the moon
The "one tube" pentode voltage amp/triode phase splitter was a widely used design, not only by Dynaco. Most manufacturers may not have used the 7199, but a 6AN8 or 6U8 instead. But the circuitry was otherwise essentially the same and its operating principles identical. What is true is that few of the higher-end amplifiers used the one-tube driver (and that is a misnomer, actually two tubes...two independent sections in one envelope).
Distortion in an output stage will behave exactly as Gillespie states. His hypothesis applies whether the push-pull output stage is driven by a single-tube 7199/6AN8 or a multi-tube setup using triodes as voltage amps. In fact the THEORY is the same if one assumes an absolutely perfect driver stage, which of course does not exist. The mechanism of distortion reduction from ideal biasing into Class A is independent of driver circuit type. Overall performance may be better with the more complex circuit...or worse if the more complex circuit is badly designed. But the principle remains the same. It even remains the same if the output tubes are driven by a SOLID STATE driver, which has been seen in commercial Stereo 70 modifications.
When I was doing audio work professionally, I was privileged enough to be using the best test-bench distortion analyzer on the market, and I did the same thing Gillespie writes about, on different amp drivers, driving output stages where idle current is altered by adjusting that "fixed" bias voltage, the classic design used by Dynaco. His findings are correct. They are correct even in cathode-bias amplifiers, where the unit under test can have its bias, and therefore its idle current, "adjusted" by altering its cathode resistor. For a given tube type the results were always consistent, and they depended on the type and design of the output tube. Reducing the idle current further down into the Class AB range invariably increased measured distortion.
Why, then, run the EL34 at a lower idle current? Because of tube life. The current-production EL34 is not really an EL34 at all, NONE of them. There is a specific set of specifications that defines the EL34, and amplifier designers worked with those definitions when designing output stages. Variations in the current-production tubes that are called "EL34" result in lower durability and tube life. For instance, the maximum plate voltage in the EL34 specification is a whopping 800 volts. Put that to today's Russian, Chinese, or other "EL34"s and they will arc over and be destroyed. Similarly, the margin of safety in plate dissipation is lower. Tube life will be increased, and the likelihood of catastrophic failure reduced, if idle current is reduced. It is a TRADEOFF, not an optimization. Gillespie's figures and conclusion are not wrong, but his numbers for the distortion increase at lower idle current may sound worse than they are. If distortion in the output stage is 0.08% under ideal idle current, a not unreasonable figure, even a whopping threefold increase leaves it at only 0.24%, still not bad...and that increase could be swamped by the distortion in a non-ideal driver stage, so you wouldn't hear it. It could also be virtually inaudible if it is largely second harmonic...and likely it would be.
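That last point is easy to verify with a calculator; here it is as a short sketch. The 0.08% baseline is the illustrative figure from the paragraph above, not a measurement, and Gillespie's 55% and 225% increases are read as stated relative to the 50 mA baseline.

```python
# Sketch: the percentage increases in absolute terms. The 0.08% baseline
# is an illustrative figure from the discussion, not a measured value.

baseline = 0.08   # % THD at the ideal idle current

# Gillespie's stock-board increases, stated relative to the 50 mA baseline:
print(f"45 mA/tube: {baseline * 1.55:.3f}% THD (+55%)")
print(f"40 mA/tube: {baseline * 3.25:.3f}% THD (+225% total)")

# The round-number case above: even tripling the baseline stays small.
print(f"threefold:  {baseline * 3.0:.2f}% THD")
```

Even the worst case lands near a quarter of a percent, which supports the point that the relative increase sounds scarier than the absolute figure.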
I used the EL34 as an example because that is what is in my modified Mark II amplifiers. But the same applies no matter what the output tube. The behavior of the push-pull output circuitry remains the same. Unfortunately the difference between genuine old tubes by their original makers, and the copies, often reverse-engineered, from current makers remains the same, too.
defec- Posts : 9
Join date : 2010-12-19
GP49, thank you for the excellent, informative response. I learned a great deal from it. I have had two quads of EH 6CA7s biased to Dynaco's spec and they have held up for a few years so far. I'm sure, though, that they won't last as long as the good old stuff.
corndog71- Posts : 840
Join date : 2013-03-19
Location : It can get windy here
GP49 wrote:For instance, the maximum plate voltage in the EL34 specification is a whopping 800 volts. Put that to today's Russian, Chinese or other "EL34" and they will arc over and be destroyed.
The plates can handle 800 volts but the screens need to be kept at 350 volts. You need a custom transformer to accomplish this.