The Learning Curve

How my friends and I progressed from CB myth to fact.

  

I've been involved in the CB hobby since 1970. When I started, I was barely 10 years old, and other than possessing an above-average interest in radio and wireless communications, I knew virtually nothing technical about it. The same was essentially true for most of my friends as they started on their journey in this interesting hobby, back in those horrifically primitive days before the internet and cell phones provided a forum for communicating with each other. Back then, two-way radio was "the" hobby if, like me, the idea of sending your voice through thin air, without wires, somehow fascinated you. Or if you simply wanted an open roundtable forum where you could keep in touch with your friends during those times when you could not meet up in person. This was a common problem for "pre-car" teens, who often had friends from school who lived beyond practical walking and bike-riding distance. We also had parents with strict rules about talking on the only phone in the house, in the days before such a thing as "call waiting". They were always afraid that they'd miss that "important" call (which usually meant that someone in the family had died) if one of the teenagers was yakking on the phone for hours, so we were usually limited to no more than 15 minutes or so of telephone time per call.

So it was that we sought out radio as our method of communications and our social networking medium. But like any sufficiently advanced technology (which WAS distinguishable from magic), there were certain physical laws which applied to it, certain truths if you will, which could not be altered. Many people learned these laws through formal education. Others stumbled around in the darkness of their ignorance and eventually learned these truths the hard way. Most of my friends and I fell into the latter group early on. We learned through trial and error and experimentation. Eventually, we also discovered that the library was a great resource for theory books, and that certain older, wiser radio ops were a wealth of knowledge and experience, more than eager to impart their wisdom so long as we paid them the respect and reverence they deserved. As the years wore on, I learned more and more. Experimentation, augmented by formal education, combined to shine a very bright light on what was once the darkness shrouding the mysteries of R.F. and radio propagation. Having finally and painstakingly reached this metaphorical pinnacle of the mountain of basic radio understanding (you never fully learn it all, because new technology is always being developed), I now find it interesting to look back through the years to those early days and marvel at some of the things we actually believed back then, and at how certain important aspects were brushed over while other relatively unimportant details were accentuated. So with this in mind, it's time once again to step into the way-back machine and take another trip down memory lane, to revisit those early days and have a chuckle or two at my own expense...

One of the earliest pieces of our misunderstood radio theory dealt with antennas. Anyone in the know will tell you that the antenna is probably the most important element of a radio operator's station. The antenna's job is to take the power from your transmitter and radiate it as efficiently as possible into the atmosphere, where hopefully it'll carry your voice to that station in the distance, whether that distance is 1 mile or 3000. Of course, the optimal antenna's physical dimensions depend on the frequency of operation, and how high you mount it can have a pronounced effect on how well, and in which manner, your signal propagates. But back in the early days, our first antennas were the small 2' and 3' telescopic rod types that usually came standard with a Part 15, 100 mW Walkie-Talkie. When we became disillusioned with our pitiful one or two block range and wanted to take some steps to increase the range of our signal-challenged low power radios, one of the easiest things to try was extending the length of the antenna. Back then, most FCC Part 15 W-Ts had little stickers on the back warning that the antenna length was not to exceed 5'. So, with the usual child-like defiance as a primary driving force, the first thing we did was to make an antenna longer than 5'. After all, we thought, why else would the FCC tell us not to exceed 5'? It must be to limit the range, right? It seemed a logical assumption at the time. As a confirmation of our assumption, we did see a small increase in range with the longer antennas, and that, as they say, was that. Pandora's box had been opened and the sky was now the limit. Knowing nothing at all about resonant length or impedance matching, we instead ran with the simplistic "more must be better" philosophy. If a 5' antenna made an improvement in range, what would 10' do? 15'? 20'?
We obviously couldn't use an antenna that long on a Walkie-Talkie itself, so we began to string wire antennas out of the backs of our houses and simply wrap the wire around the tip of the W-T's antenna. Here's another example of where a little knowledge can be dangerous. I had read somewhere that an ideal shortwave receiving antenna should be as long as practical, and suspended between, and insulated from, two supporting structures. So with that in mind, I strung a wire between my house and the top of a crabapple tree in my back yard. It ended up being a bit of a "sloper" style antenna with a total length of about 30 feet. Some of my friends made similar kludge antenna systems; most ended up in the horizontal plane. Of course, the shortwave bands are lower in frequency than CB, and 30' or 40' of antenna is ideal there, but not for the CB band. We didn't know that. Nonetheless, despite our gross lack of antenna theory, the wire antennas did seem to help our range, although not as much as we had hoped. Probably the biggest "gain" we realized came from simply getting the antenna outside of the house. I would eventually construct other homemade antennas which roughly resembled commercial CB antennas, even if they were physically smaller and not properly matched. Seemingly good performance (relatively speaking) when used with a higher powered radio only served to reinforce the notion that antenna length was not all that important. Later on, when I finally learned that different frequencies demanded specific antenna lengths, I built my own 1/4 wave whip. But even though it was shorter and lower in elevation than a neighbor's 1/2 wave ground plane, I assumed that it would be competitive because I had 1 watt more transmitter power. And that segues nicely into the next major misconception: transmitter power.
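In hindsight, the resonant length we were missing is a one-line calculation. Here's a minimal Python sketch using the common 234/f rule of thumb for a quarter-wave whip (the 27.185 MHz figure is an assumed mid-band CB frequency for illustration, not something from the story above):

```python
# Rule-of-thumb quarter-wave whip length: 234 / frequency(MHz) = length in feet.
# The 234 constant folds a typical end-effect shortening into the
# free-space figure of about 246, per the usual ham shorthand.
def quarter_wave_feet(freq_mhz):
    return 234.0 / freq_mhz

print(round(quarter_wave_feet(27.185), 1))  # about 8.6 feet
```

About 8.6 feet, in other words, while we were stringing up 30 feet of random wire that was much closer to resonance on the shortwave bands than on CB.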

Back in the early days, we were fixated on transmitter power as THE most important factor in "getting out". This was based on the difference we saw in range between a .1 watt (100 mW) Walkie-Talkie and a 1 watt W-T, which was significant. The jump in range between a 1 watt W-T and a 4 watt CB was also significant. But we placed most of the credit for the increase in range on the respective power outputs rather than on the difference in antennas. We also believed that just increasing power from 4 to 5 watts would result in a significant difference in signal. I can remember more than one disappointed individual who metered his brand new rig only to find that it was delivering "only" 3.5 watts instead of the advertised 4. They were mortified that their rigs were shy of the legal 4 watt limit, felt that this meant a much reduced signal potential in the distance, and couldn't wait to have someone break out the golden screwdriver and peak the radio up to its full potential. I can also remember looking through radio catalogs, some of which listed the radio's actual output power (this was back in the days when radios were all rated at 5 watts of input power, but the actual output power could vary). A radio was given the thumbs down if it did "only" 3 watts, compared to others which produced close to, or even slightly more than, 4 watts. This is probably why many greenhorn CB'ers are so captivated by radio "peak jobs". I suspect this overdependence on power started because, at the low power levels we dealt with initially, small power changes made big differences in signal. The difference between .1 watt and 1 watt translated to a 10 dB (nearly 2 "S" unit) potential signal increase, yet 1 watt is only 900 mW more in actual power. But increasing from 1 watt to 2 watts, a similar change in actual power, yields only a 3 dB (1/2 "S" unit) increase in signal.
Similarly, jumping from 2 watts to 3 watts gains less than 2 dB, and 3 watts to 4 watts less than 1.5 dB. As power levels continue to increase, each additional watt makes less and less difference. Signal gain does not track linearly with power output: gain in dB is logarithmic in power, so each additional dB of signal demands a proportionally larger jump in watts (doubling your power always buys the same 3 dB, whether that's going from 1 watt to 2 or from 100 watts to 200). But more importantly, whatever power you did produce would be wasted if it was not coupled to an efficient and optimally mounted antenna system. It took us a few years to finally realize that.
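The arithmetic above is easy to verify. A minimal Python sketch of the decibel relationship, run over the same wattage steps discussed in the text:

```python
import math

def db_gain(p1_watts, p2_watts):
    """Signal gain in dB when moving from power p1 to power p2."""
    return 10 * math.log10(p2_watts / p1_watts)

for p1, p2 in [(0.1, 1), (1, 2), (2, 3), (3, 4)]:
    print(f"{p1} W -> {p2} W: {db_gain(p1, p2):.2f} dB")
```

Running it shows 10 dB for the 0.1 to 1 watt jump, but only about 1.25 dB for 3 to 4 watts, which is why a "peak job" buys far less on the other end than the wattmeter suggests.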

Now we bounce back to the antenna arena to expose yet another myth: that SWR could be "tuned" by trimming the coax cable. I can remember one guy who chopped a 100' length of coaxial feedline down to 50', in small chunks, while trying to reduce his high SWR. Needless to say, it didn't change much. This myth had some apparent validity because you could see small differences in SWR when the feedline was trimmed in small increments, but you couldn't resolve a high SWR (over 3:1) this way. The truth is that when there is a significant SWR at the antenna, standing waves are produced in the coaxial feedline. These waves peak every electrical 1/2 wave (which, thanks to the cable's velocity factor, is about 12' of coax at CB frequencies). Depending on where your SWR meter sits with respect to these standing wave peaks, the SWR reading on the meter will change. But this only fools the meter; the actual feedpoint mismatch remains constant. Ideally, if you do have standing waves, you want your feedline cut to multiples of 1/2 wave, so that both ends of the cable terminate at the peaks of the standing waves for best power transfer. But if you manage to cure the SWR at the antenna (where it's supposed to be cured) and present close to a 50 Ohm load impedance, those standing waves drop to almost nil and feedline length becomes irrelevant (unless, of course, you're using some sort of phasing harness).
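That "about 12 feet" figure falls straight out of the cable's velocity factor. A small Python sketch, assuming solid-polyethylene coax (RG-58/RG-8 style) with a typical velocity factor of 0.66:

```python
# Electrical half-wavelength inside coax at CB frequencies.
# A signal travels slower in the cable than in free space, so the
# physical length of a half wave is scaled by the velocity factor.
C_FEET_PER_SEC = 983_571_056.0  # speed of light in feet per second

def half_wave_in_cable_ft(freq_mhz, velocity_factor=0.66):
    free_space_wavelength_ft = C_FEET_PER_SEC / (freq_mhz * 1e6)
    return (free_space_wavelength_ft / 2) * velocity_factor

print(round(half_wave_in_cable_ft(27.185), 1))  # about 11.9 feet
```

Roughly 12 feet per electrical half wave, so trimming a few inches at a time merely slid the meter along the standing wave pattern without touching the mismatch at the antenna.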

As the quest for higher power continued, invariably our group was introduced to "linear" power amplifiers. These units could (illegally) take your stock 4 watt signal and boost it, typically to anywhere between 50 and 400 watts depending on the size of the amp. That's an instant 11 to 20 dB signal gain, and it was like an addictive drug: once you sampled it, it was hard to go back to stock power alone. Back in the 1970's, almost all base amplifiers utilized vacuum tubes. Tube amps required that you tune them depending on frequency and load impedance. Usually this tuning was accomplished with two (or more) variable capacitors: a "Plate Tune" (which matches the tube's impedance to the tank) and an "Antenna Load" (which matches the antenna and feedline impedance to the tank). Adjusting these controls was simple: you keyed the transmitter and rocked each control back and forth until you saw the highest reading on the wattmeter. Some amps recommended that you "dip" the plate tune for minimum plate current, but not all amps had a current meter. Yet despite the seemingly simple tuning procedure, for all those years we were actually tuning those amps wrong! While it's true that you should tune for maximum power, you have to tune at the point of maximum drive (peak) power, because the tube's output impedance changes with the device's power dissipation. The first (but way too subtle for us to understand back then) clue that we were doing it wrong was that the amp would tune to a different max point when we switched to SSB. Peak SSB power (which we used a steady whistle to achieve) was much closer to maximum drive (typically 12 watts or more). When we would then return to AM and retune with only the 4 watts of carrier drive, the control settings changed. What we were doing, unknowingly, was reducing the peak power output potential.
While the dead key carrier power increased when tuned for the 4 watt dead key, full peak modulation power would be less than if the amp had been tuned at the peak SSB point. So the proper way to tune an amp is with maximum recommended drive power (either SSB or AM). Your AM dead key carrier will be a bit lower, but the peak power will be correct, and it will sound cleaner.

Since we're on the subject of "getting out" (which seemed to be the ultimate brass ring of achievement), while we concentrated a lot of effort in pursuit of this goal, our thinking on distance and signal potential was woefully two-dimensional back in the day. Common misconceptions included: if someone could talk 20 miles in one direction, then he should be able to do so equally in all directions, regardless of the terrain. Another was that if your mobile in your driveway put out a signal to the locals that was 2 "S" units weaker than your base, then it could never beat your base's distance potential (even if you parked it on top of a nearby 1000' hill). Another glaring example of two-dimensional thinking was the theory that if someone was 4 miles away from you, then, given the same power output, he should always give you a better signal than someone else 7 miles away, even if the 7 mile distant guy was on top of a more significant hill. We had the basic concept of the inverse square law down, but failed to recognize the influence of height, radiation angle, the curvature of the earth, and uneven terrain on signal potential. Yet another misconception: that someone with a 6 watt radio and a 5/8 wave antenna will put out a better signal in the distance than another guy with a 4 watt radio and a 1/2 wave antenna, even if the 4 watt station is at a 100' higher elevation. Another example of our naive belief that raw power and antenna gain were worth more than a location with more height above average terrain.

Some of our early assumptions would have been more plausible if we had all lived in Ohio, Florida, or some other state where the height above average terrain never varied by more than 50 feet. But our area was nothing like that. We had many ranges of rolling hills, most of which were not all that noticeable until you looked at a topographical map. So while we began to understand the truth about antennas and radio power, we failed to take into account the dramatic effect that varying elevation had on someone's signal potential, especially in the distance. While you might have a clear shot in one direction, you may be blocked by a range of hills in another, and that will dramatically alter your distance potential that way. In the same vein, while two stations at a similar distance, one of them 100' higher, might give you a similar signal a mile or two away, when you move 10 or 15 miles out the differences become much more dramatic. Nowhere is this more evident than in my own current situation. My present home is at close to 400' ASL. With a barefoot 4 watt radio, I can easily cover as much distance as I used to back in the heyday with my 40' tower, 3 element beam, and 250 watt amplifier. Of course, back then my ground-level elevation was only about 150' ASL, and I was surrounded by higher ground, which effectively trapped my signal in a valley. The same thing was true out in the mobile. While the mobile did not radiate as well as the base when parked on the same ground, I could drive the mobile to the top of the highest local hill, and that would more than make up for the lack of ERP potential. Many locals discovered the thrill of "hilltop DXing" in the later years, and it was especially appealing for those of us whose base stations were "altitude challenged".
Some true diehards would even take along easily collapsible beams or other base antennas which they could quickly set up and connect to the mobile's transmitter for even better gain while parked hilltopping.
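Height wins for a simple geometric reason: it pushes the radio horizon out. Here's a rough Python sketch using the common 4/3-earth approximation (distance in miles is about 1.415 times the square root of the height in feet); the 150' and 400' figures echo the elevations mentioned above, treated here as effective heights above terrain, which is a simplification:

```python
import math

# Approximate radio horizon under the common 4/3-earth refraction model:
# distance (miles) ~= 1.415 * sqrt(height above average terrain in feet).
def radio_horizon_miles(height_ft):
    return 1.415 * math.sqrt(height_ft)

for h_ft in (150, 400):
    print(f"{h_ft} ft -> {radio_horizon_miles(h_ft):.1f} mile horizon")
```

Two stations can work each other out to roughly the sum of their horizons, which is why a modest hilltop so often beats a bigger amplifier down in the valley.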

No story like this would be complete without taking some time to look at the truly silly myths. Most of us know that R.F. power can be converted to heat, just like any other form of A.C. power. It is also fairly well known that high power radio stations often have "hot" components in the transmitter and antenna system. But in the early days, some people actually believed that prolonged keying of a CB transmitter could produce enough R.F. heat to melt ice off of their base antennas, or to burn the feet of birds that might indiscriminately land on one of the elements. Well, maybe that might be true if they had 50,000 watts to key instead of just 4...

Then there was the myth about the ability to pop out the receiver front end of another's station simply by pulling up in front of their house and flipping on a fairly large amplifier (and "large" in those days was anything over 150 watts). I can remember a tale once told by one of our locals (Steve), who claimed to have blown out the monitor radio in an FCC van which had pulled up in front of his house, using some undisclosed amount of power. Obviously Steve was just taking advantage of our naive ignorance to elevate his status with this obvious untruth. But Steve wasn't the only one who liked to bend or outright break the truth; there were others who were good at telling tall tales. Back then, threats were often made on the channels to pull up in front of someone's house and pop their radios out. But as crazy as it sounds, this myth was actually rooted in some fact. In the very early days of solid state radios, the front end transistors were sensitive to R.F. overload, and there were actual cases of front ends popping on close encounters with other mobiles, where there could be less than 10 feet between antennas. They could also be blown by nearby lightning strikes. But generally speaking, it took a fair amount of power to do this. As solid state circuits evolved, provisions were put in place to protect against strong R.F. overload (ever wonder what those back-to-back diodes across the front end tank coil were for?). Indeed, I have been in close proximity to some fairly strong R.F. signals throughout the years (including an 8000 watt mobile parked in front of my house), and nothing has happened to any of my radios. In the early 80's, when I first had a 500 watt amp in the truck, I used to feed back 10 watts of reflected power to mobiles parked adjacent to mine, according to their in-line wattmeters. Their "S" meters moved, and audio could be heard from their speakers with the radios turned off.
But no front end damage occurred. And so another myth is busted!

These were some examples of the "CB science theories" that we used to believe back in the very earliest days of our CB hobby. There is no greater weapon against these myths than education. No, you don't need a graduate degree in engineering, but read a few books, seek out those older and wiser, and soon you will be on your way to better operating.

 
