Did you ever wonder how we arrived at so many of the standards that we use every day?  I don’t mean standards like the volt and the ohm, which are covered in high school electricity courses.  I mean, for instance, why do we use 19” racks?  (Answer:  the U.S. Navy developed the standard.  EIA hopped on board later, giving it a sort of professional gloss.)  The Navy was also responsible for specifying that special grey colour that gets sprayed on most things Hammond.


Of course, many of our standards come from the phone company.  They developed the VU meter, for instance, and in typical telco fashion, specified it more or less to death.  Meters that do not fully comply with the VU standard are more properly called “VU-style” meters, or VIs (volume indicators)—and there are many of these.  Part of the standard covers the ballistics of the needle, which is a good thing … a proper VU meter will not have significant over- or undershoot when exposed to a 300 ms burst of 1 kHz tone at 0 VU.  But did you know that the only true dial colour for a VU meter is “buff”?  By the rules, a white-faced VU meter doesn’t fully comply with the standard, and so is not a true VU meter!  No doubt the telco types researched the matter and found that “buff” was particularly pleasing to the eye!
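For the curious, that ballistics behaviour is often approximated as a slightly underdamped second-order system.  Here is a minimal sketch of the idea; the damping ratio and natural frequency are my own illustrative picks, tuned to land near roughly 1% overshoot with the needle reaching 99% deflection inside 300 ms, not values taken from the standard itself:

```python
# Illustrative sketch only: model the VU needle as a damped second-order
# system driven by a unit step (a 0 VU tone burst).  zeta and wn below are
# assumed values chosen to mimic the spec's behaviour, not official figures.
def simulate_needle(zeta=0.82, wn=18.3, dt=1e-4, t_end=0.5):
    """Integrate x'' + 2*zeta*wn*x' + wn^2*x = wn^2 (unit step input).
    Returns (peak deflection, time in seconds to first reach 99% of final)."""
    x, v = 0.0, 0.0        # needle position and velocity
    peak, t99 = 0.0, None
    t = 0.0
    while t < t_end:
        a = wn * wn * (1.0 - x) - 2.0 * zeta * wn * v  # spring + damping
        v += a * dt
        x += v * dt
        t += dt
        peak = max(peak, x)
        if t99 is None and x >= 0.99:
            t99 = t
    return peak, t99

peak, t99 = simulate_needle()
print(f"overshoot: {(peak - 1.0) * 100:.1f}%, reaches 99% at {t99 * 1000:.0f} ms")
```

With these assumed constants the simulated needle overshoots by about 1% and hits 99% of full deflection a bit before 300 ms, in the spirit of the spec.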


In the old days, transmission standard level was +10 dBu, and studio equipment was typically set for 0 VU = +8 dBu.  Ever wonder why most modern equipment is set up for 0 VU = +4 dBu?  Well, it has to do with the output driving capabilities of early-generation integrated circuits.  The ±15 V supply rails that ICs ran on would allow an output level of +14 dBu before clipping, but not +18 dBu.  Since they couldn’t get the necessary 10 dB of headroom over the old +8 dBu operating level, they dropped the operating level to +4 dBu.  Later-generation ICs meant for pro audio gear could run their rails up to ±18 V, which solved the problem, but the die, as they say, had already been cast.
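The arithmetic is easy to check: dBu is dB relative to 0.7746 V RMS (the voltage that dissipates 1 mW in 600 Ω), so +4 dBu plus 10 dB of headroom means swinging +14 dBu, about 5.5 V peak, while the old +8 dBu level would have demanded +18 dBu, nearer 8.7 V peak.  A quick sketch (the helper names are mine):

```python
# dBu reference: 0.7746 V RMS = sqrt(0.001 W * 600 ohms), i.e. 1 mW in 600 ohms.
import math

DBU_REF = 0.7746  # volts RMS at 0 dBu

def dbu_to_vrms(dbu):
    """Convert a level in dBu to volts RMS."""
    return DBU_REF * 10 ** (dbu / 20)

def vrms_to_peak(vrms):
    """Peak voltage of a sine wave with the given RMS value."""
    return vrms * math.sqrt(2)

for level in (4, 14, 18):
    v = dbu_to_vrms(level)
    print(f"+{level} dBu = {v:.2f} V RMS ({vrms_to_peak(v):.2f} V peak)")
```

Roughly 5.5 V peak fits comfortably inside ±15 V rails even with some drive margin; 8.7 V peak was asking too much of the early parts, especially into real-world loads.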


At least one transmitter manufacturer in the 1970s (CCA) decided to make the input level of their transmitters 0 dBm instead of everybody else’s +10 dBm, reasoning that most station engineers of the day were using Heathkit test oscillators, which couldn’t make it to +10 dBm.


Our standards of “tip” and “ring,” the phone plug, and many of our multipair cable colour codes come from the telephone company (or sometimes from Belden).  The ubiquitous BNC is often glossed as “Bayonet Navy Connector,” and the U.S. Navy certainly spread it far and wide, but strictly speaking the letters honour its designers: Bayonet Neill–Concelman.  Its siblings are the threaded TNC (Threaded Neill–Concelman) and the N (Neill, all by himself) connector.


XLR connectors came originally from Cannon, and there’s a long and storied history there.  The XLR we know today is the “miniature” version of the original big beast, which frankly was more suitable for hooking power up to welding machines than for carrying microphone audio.  A grand battle over polarity (is pin 2 + or −?) was fought for many years between Ampex and Studer, with the Swiss finally triumphant: pin 2 is hot.  That one forced some of us to reverse the habits of many years (and look what it did to Ampex!).


The designation “B+” to refer to the main power supply lead of an amplifier goes back to the early days of radio.  “A” referred to the filament supply, “B” to the plate, “C” to the bias, and “D” to the screen grid supply.  This even carried over to the naming of the older battery types, with the “B” battery being rated at 67.5 V for the plate supplies of radios (and when we last checked, B batteries were still available from Eveready!).  It took the Canadian Rogers Majestic company to change the landscape with the Rogers Batteryless radio, for which CFRB is named.  But that’s another story, for another day…