The origin of 50 ohms in impedance matching

For a trace of a given width, three main factors affect the impedance of a PCB trace. First, the near-field EMI (electromagnetic interference) of the trace is proportional to its height above the reference plane: the lower the height, the smaller the radiation. Second, crosstalk varies strongly with trace height; halving the height cuts the crosstalk to roughly a quarter. Third, the lower the height, the lower the impedance, and the less susceptible the trace is to capacitive loading. All three factors push the designer to keep the trace as close to the reference plane as possible. What stops you from reducing the height to zero is that most chips cannot drive transmission lines with an impedance below 50 ohms. (Special cases exist: Rambus parts can drive 27 ohms, and National's BTL series can drive 17 ohms.)

Not every situation is best served by 50 ohms, either. For example, the 8080 processor, with its very old NMOS structure, works at 100 kHz and has no EMI, crosstalk, or capacitive-load problems, and it cannot drive 50 ohms anyway. For that processor, high impedance means low power consumption, so you should use thin, high-impedance traces wherever possible. Purely mechanical constraints must be considered as well. For example, in a dense multilayer board the spacing between layers is very small, and the narrow line width required for a 70-ohm impedance is difficult to fabricate. In that case you should use 50 ohms, whose wider line width is easier to manufacture; a rough sketch of how trace geometry sets impedance follows.
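As a minimal illustration of how geometry drives these trade-offs, the sketch below evaluates the widely quoted IPC-2141 approximation for surface microstrip. The stack-up dimensions, the dielectric constant, and the function name are illustrative assumptions, not values from this article.

```python
import math

def microstrip_z0(h_mm, w_mm, t_mm, er):
    """IPC-2141 approximation for surface-microstrip impedance.

    h_mm: trace height above the reference plane
    w_mm: trace width
    t_mm: copper thickness
    er:   relative permittivity of the board material

    Only trustworthy for roughly 0.1 < w/h < 2.0 and 1 < er < 15.
    """
    return 87.0 / math.sqrt(er + 1.41) * math.log(5.98 * h_mm / (0.8 * w_mm + t_mm))

# Hypothetical FR-4 stack-up: 0.20 mm wide trace, 35 um (1 oz) copper, er = 4.3.
for h in (0.10, 0.15, 0.20, 0.30):
    print(f"h = {h:.2f} mm -> Z0 ~= {microstrip_z0(h, 0.20, 0.035, 4.3):.1f} ohm")
```

In this model, lowering the trace from 0.30 mm to 0.10 mm above the plane drops the impedance from about 81 ohms to about 41 ohms, which is exactly the pull toward low, hard-to-drive impedances described above.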
What is the impedance of a coaxial cable? In the RF field the concerns are not the same as on a PCB, yet coaxial cables in the RF industry settled on a similar range of impedances. According to the IEC publication of 1967, 75 ohms is a common impedance standard for coaxial cables (note: with air as the insulator), because it matches some common antenna configurations. The same publication defines a 50-ohm cable based on solid polyethylene, because for a fixed outer-shield diameter and a dielectric constant fixed at 2.25 (that of solid polyethylene), 50 ohms gives the smallest skin-effect loss.

You can show from basic physics that 50 ohms is close to the optimum. The skin-effect loss L of the cable (in decibels per unit length) is proportional to the total skin-effect resistance R (per unit length) divided by the characteristic impedance Z0:

Equation 1: L ∝ R / Z0

The total skin-effect resistance R is the sum of the shield resistance and the center-conductor resistance. At high frequencies the skin-effect resistance of the shield is inversely proportional to its diameter d2, and that of the inner conductor is inversely proportional to its diameter d1, so the total series resistance is proportional to (1/d2 + 1/d1). Any basic book on electromagnetic fields or microwaves gives Z0 as a function of d2, d1, and ER (note: the relative permittivity of the insulating layer):

Equation 2: Z0 = (60/√ER) · ln(d2/d1)

Substituting Equation 2 into Equation 1 and multiplying numerator and denominator by d2 gives:

Equation 3: L ∝ (√ER/60) · (1/d2) · (1 + d2/d1) / ln(d2/d1)

For a given shield diameter d2 and insulation ER, the constant factor (√ER/60)·(1/d2) separates out, and the effective term (1 + d2/d1)/ln(d2/d1) alone determines where the minimum lies.

Look closely at that minimum: it is controlled only by d2/d1 and is independent of ER and of the fixed value of d2. Treating d2/d1 as the parameter and plotting L, the minimum falls at d2/d1 = 3.5911 (note: the root of a transcendental equation). Taking the dielectric constant of solid polyethylene as 2.25 and d2/d1 = 3.5911, the characteristic impedance works out to 51.1 ohms. Long ago, radio engineers rounded this value to 50 ohms for convenience as the optimum for coaxial cable. This shows that L is smallest around 50 ohms, but nothing prevents you from using other impedances: if you make a 75-ohm cable with the same shield diameter (note: d2) and the same insulator (note: ER), the skin-effect loss rises by about 12%. For other insulators, the optimum produced by the ideal d2/d1 ratio shifts slightly (note: for example, air insulation corresponds to about 77 ohms, which engineers rounded to 75 ohms for convenience). These figures are easy to verify numerically, as sketched below.
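The following check is my own, assuming nothing beyond Equations 1 through 3; it finds the minimum of the effective term by bisecting the stationarity condition ln(x) = 1 + 1/x and reproduces the 3.5911 ratio, the 51.1-ohm optimum, the roughly 77-ohm air value, and the roughly 12% penalty of a 75-ohm cable.

```python
import math

def loss_factor(ratio):
    """Effective term of Equation 3: (1 + d2/d1) / ln(d2/d1).

    For fixed shield diameter d2 and fixed ER, skin-effect loss is
    proportional to this quantity, so minimizing it minimizes loss.
    """
    return (1.0 + ratio) / math.log(ratio)

# Bisection on the derivative condition ln(x) = 1 + 1/x (x = d2/d1).
lo, hi = 2.0, 6.0
for _ in range(60):
    mid = (lo + hi) / 2.0
    if math.log(mid) < 1.0 + 1.0 / mid:
        lo = mid
    else:
        hi = mid
best = (lo + hi) / 2.0

er = 2.25  # solid polyethylene
print(f"optimal d2/d1  = {best:.4f}")                            # ~3.5911
print(f"optimal Z0 (PE) = {60.0 / math.sqrt(er) * math.log(best):.1f} ohm")  # ~51.1
print(f"optimal Z0 (air) = {60.0 * math.log(best):.1f} ohm")     # ~76.7

# Loss penalty of a 75-ohm cable with the same d2 and the same PE insulation.
ratio_75 = math.exp(75.0 * math.sqrt(er) / 60.0)
penalty = loss_factor(ratio_75) / loss_factor(best) - 1.0
print(f"75-ohm loss penalty = {penalty:.1%}")                    # ~12%
```

A further note: this derivation also explains why 75-ohm TV cable has a hollow, lotus-root-like cross-section while 50-ohm communication cable has a solid core. One practical reminder as well: whenever the budget permits, choose a cable with a large outer diameter (note: d2). Besides the extra mechanical strength, the main reason is that at the optimal ratio d2/d1 a larger outer diameter means a larger inner diameter, and the conductor's RF loss is correspondingly smaller.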
Why did 50 ohms become the impedance standard for RF transmission lines? Bird Electronics provides one of the most widely circulated versions of the story, from Harmon Banning's "Cable: There may be many stories about the origin of 50 ohms." In the early days of microwave work, during the Second World War, the choice of impedance depended entirely on the need at hand. For high-power handling, 30 ohms and 44 ohms were often used; on the other hand, the lowest-loss air-filled line has an impedance of 93 ohms. In those years there were no flexible cables for the rarely used higher frequencies, only rigid conduits filled with air dielectric. Semi-rigid cable was born in the early 1950s, and true flexible microwave cable appeared about ten years later. As the technology advanced, impedance standards had to be set to strike a balance between economy and convenience. In the United States, 50 ohms was the compromise; to settle such questions jointly, the Army and Navy formed an organization called JAN (later DESC), which developed the MIL specifications. Europe chose 60 ohms. In practice, the most common conduit in the United States was built from existing rods and water pipes, for which 51.5 ohms was very common, and it was strange to see and use adapters converting between 50 ohms and 51.5 ohms. In the end 50 ohms won out, and special conduit was manufactured (or perhaps the plumbers slightly changed the diameter of their pipes). Soon afterwards, under the influence of a dominant company in the industry like Hewlett-Packard, the Europeans were forced to change as well. 75 ohms became the standard for long-distance communication: since such lines are dielectric-filled, the lowest loss occurs at about 77 ohms. 93 ohms has been used for short connections, such as linking a computer to its monitor: its low capacitance reduces the load on the circuit and allows longer runs. Interested readers can refer to the MIT Rad Lab Series, Volume 9, which contains a more detailed description.
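The low-capacitance claim is easy to quantify. The sketch below is my own addition: for a lossless TEM line, capacitance per metre follows from C = √ER/(c·Z0), and the ER of roughly 1.5 for a mostly-air-spaced dielectric is an assumed, illustrative value.

```python
import math

C_LIGHT = 3.0e8  # speed of light, m/s

def capacitance_pf_per_m(z0_ohm, er):
    """Capacitance per metre of a lossless TEM line: C = sqrt(er) / (c * Z0)."""
    return math.sqrt(er) / (C_LIGHT * z0_ohm) * 1e12

print(f"50 ohm, solid PE (er = 2.25):  {capacitance_pf_per_m(50.0, 2.25):.0f} pF/m")
print(f"93 ohm, mostly air (er ~ 1.5): {capacitance_pf_per_m(93.0, 1.5):.0f} pF/m")
```

This gives about 100 pF/m for the 50-ohm cable versus about 44 pF/m for the 93-ohm cable, a better than 2:1 capacitance advantage for the high-impedance line used in those short computer-to-monitor links.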