Why is 50 ohms the most common standard impedance for cable and antenna work? The answer is quite simple: 50 ohms is the best compromise between low attenuation and high power-handling capacity in transmission lines and cables.

The 50-ohm standard has been around for a long time, nearing 100 years now.

The Problem

The original question was how to transmit large amounts of power as efficiently as possible over long distances.

Researchers at Bell Labs investigated the problem in the 1930s, and their conclusions were interesting.

For maximum power-handling capacity in an air-dielectric coaxial line, an impedance of around 30 ohms is ideal, while minimum attenuation occurs at around 77 ohms.
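The sketch below reproduces those two optima numerically from the textbook air-dielectric coax relations: the characteristic impedance grows with the shield-to-center diameter ratio D/d, skin-effect attenuation scales as (1 + D/d)/ln(D/d), and breakdown-limited peak power scales as ln(D/d)/(D/d)^2 for a fixed outer diameter. The proportionality constants are dropped, since only the locations of the optima matter here.

    import numpy as np

    # Characteristic impedance of coax with diameter ratio x = D/d:
    #   Z0 = (60 / sqrt(er)) * ln(x), with er = 1 for an air dielectric.
    def z0(x, er=1.0):
        return 60.0 / np.sqrt(er) * np.log(x)

    # Skin-effect (conductor) attenuation, up to a constant factor,
    # for a fixed outer diameter -- lower is better.
    def rel_loss(x):
        return (1.0 + x) / np.log(x)

    # Breakdown-limited peak power, up to a constant factor,
    # for a fixed outer diameter -- higher is better.
    def rel_power(x):
        return np.log(x) / x**2

    x = np.linspace(1.05, 10.0, 200_000)  # sweep of diameter ratios

    x_loss = x[np.argmin(rel_loss(x))]
    x_pwr = x[np.argmax(rel_power(x))]

    print(f"Minimum loss:  D/d = {x_loss:.3f}, Z0 = {z0(x_loss):.1f} ohms")  # ~76.7
    print(f"Maximum power: D/d = {x_pwr:.3f}, Z0 = {z0(x_pwr):.1f} ohms")    # ~30.0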

The Solution

Since the goal was to get the best of both worlds, 50 ohms was settled on as the compromise.

Note that 50 ohms is closer to the 30-ohm figure than to the 77-ohm one. However, both the impedance-versus-power and impedance-versus-loss curves are shallow and roughly parabolic near their optima, so operating somewhat off either optimum costs very little. It also helps that 50 ohms is close to the geometric mean of the two, sqrt(30 × 77) ≈ 48 ohms. In the end, 50 ohms is the best compromise.
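Putting numbers on that compromise, using the same normalized air-dielectric functions as in the sketch above: running at 50 ohms costs only about 10% extra attenuation relative to the 77-ohm optimum, while still handling roughly 86% of the peak power available at 30 ohms.

    import numpy as np

    def rel_loss(x):               # attenuation, up to a constant factor
        return (1.0 + x) / np.log(x)

    def rel_power(x):              # peak power, up to a constant factor
        return np.log(x) / x**2

    x50 = np.exp(50.0 / 60.0)      # diameter ratio giving Z0 = 50 ohms (air)
    x77 = np.exp(76.7 / 60.0)      # minimum-loss ratio (Z0 ~ 76.7 ohms)
    x30 = np.exp(30.0 / 60.0)      # maximum-power ratio (Z0 = 30 ohms)

    print(f"Loss at 50 ohms vs. minimum:  {rel_loss(x50) / rel_loss(x77):.3f}")   # ~1.10
    print(f"Power at 50 ohms vs. maximum: {rel_power(x50) / rel_power(x30):.3f}") # ~0.86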

The Antenna Application

So how does this relate to antennas?

Antennas terminate transmission lines, so an antenna's impedance should match the line impedance, which in turn should match the source impedance; any mismatch reflects part of the power back toward the source. Since 50 ohms is the standard transmission line and source impedance, most antennas have to present 50 ohms as well.
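As a concrete sketch, consider a thin resonant half-wave dipole in free space, which presents roughly 73 ohms, a textbook figure. Feeding it directly with 50-ohm coax gives only a mild mismatch; the snippet below computes the resulting reflection coefficient, VSWR, return loss, and mismatch loss from the standard definitions.

    import math

    def mismatch(z_load, z0=50.0):
        gamma = abs((z_load - z0) / (z_load + z0))         # reflection coefficient
        vswr = (1 + gamma) / (1 - gamma)                   # voltage standing-wave ratio
        return_loss_db = -20 * math.log10(gamma)           # how far down the reflection is
        mismatch_loss_db = -10 * math.log10(1 - gamma**2)  # power lost to reflection
        return gamma, vswr, return_loss_db, mismatch_loss_db

    gamma, vswr, rl, ml = mismatch(73.0)   # 73-ohm dipole on a 50-ohm line
    print(f"|Gamma| = {gamma:.3f}, VSWR = {vswr:.2f}")               # ~0.187, ~1.46
    print(f"Return loss = {rl:.1f} dB, mismatch loss = {ml:.2f} dB") # ~14.6, ~0.15

A mismatch loss of roughly 0.15 dB is negligible in most systems, which is why a bare dipole on 50-ohm coax is usually considered acceptable.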

Of course, the impedance of an antenna can be adjusted. Some antennas, such as the inverted-F, are easy to tune by construction, but other types will need some form of matching network or transformer to work at 50 ohms.
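To give a feel for what such a matching network involves, the snippet below sizes a simple low-pass L-network for a hypothetical purely resistive 100-ohm antenna at 100 MHz; both the load impedance and the frequency are illustrative values only, and a real antenna with a reactive impedance needs the full complex-impedance treatment.

    import math

    def l_match(r_high, r_low, freq_hz):
        # Low-pass L-network between two resistances (r_high > r_low):
        # series inductor on the low side, shunt capacitor across the high side.
        q = math.sqrt(r_high / r_low - 1)    # loaded Q is fixed by the ratio
        x_series = q * r_low                 # series inductive reactance (ohms)
        x_shunt = r_high / q                 # shunt capacitive reactance (ohms)
        l_henries = x_series / (2 * math.pi * freq_hz)
        c_farads = 1 / (2 * math.pi * freq_hz * x_shunt)
        return q, l_henries, c_farads

    q, l, c = l_match(r_high=100.0, r_low=50.0, freq_hz=100e6)
    print(f"Q = {q:.2f}, L = {l * 1e9:.1f} nH, C = {c * 1e12:.1f} pF")
    # Q = 1.00, L ~ 79.6 nH, C ~ 15.9 pF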

There is no rule that says 50 ohms must always be used. However, 50 ohms has been a standard for so long that most devices requiring an antenna do use it. That said, 75 ohms, close to the minimum-loss point, is the standard impedance in applications such as television.