1. Newcomers to impedance often ask why common single-ended traces on PCBs are typically controlled to 50 ohms rather than, say, 40 or 60 ohms.
2. This seemingly simple question is not easily answered. Before writing this article, we reviewed extensive material, including Dr. Howard Johnson's well-known response on the topic.
3. Why is it hard to answer? Signal integrity involves so many trade-offs that the industry's most famous answer is "It depends…"; there is no one-size-fits-all solution.
4. Today, Mr. Expressway will briefly summarize the issue by bringing together several perspectives, in the hope of encouraging readers to add further relevant factors from their own viewpoints.
5. First, the 50-ohm standard has historical roots related to standard cables. Much of modern electronics originated from military applications and gradually transitioned to civilian use.
6. During World War II, impedance choices were driven by practical needs. As technology advanced, standards were needed to balance cost and convenience. In the U.S., 51.5 ohms was a common cable impedance, yet adapters and converters between 50 ohms and 51.5 ohms were in widespread use.
7. The JAN organization, later known as DESC, was created to address these issues. After thorough evaluation, 50 ohms was chosen as the standard, leading to the production of specialized cables.
8. At that time, the European standard was 60 ohms. However, under the influence of industry leaders like Hewlett-Packard, Europe eventually adopted the 50-ohm standard, making it the industry norm.
1. It has thus become a convention: a PCB that connects to these various cables is ultimately required to present 50 ohm trace impedance for impedance matching.
2. Second, from the standpoint of practical PCB fabrication, 50 ohms is convenient to achieve. The impedance calculation formulas show that too low an impedance requires a wide trace and a thin dielectric (or a high dielectric constant), which is hard to accommodate on high-density boards, while too high an impedance requires a narrow trace and a thick dielectric (or a low dielectric constant), which makes EMI suppression and crosstalk control harder and hurts the process reliability and yield of multilayer boards. The trace width and dielectric thickness (4–6 mil) that produce 50 ohms with common materials meet typical design requirements and are easy to process, so 50 ohms gradually became the default choice.
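As a rough illustration of the geometry trade-off described above, here is a minimal sketch using the classic IPC-2141 closed-form approximation for surface microstrip impedance. The stack-up numbers (5 mil FR-4 dielectric, 1 oz copper, Er = 4.2) are assumed example values, not taken from the article, and the formula is only a coarse approximation valid for moderate width-to-height ratios.

```python
import math

def microstrip_z0(w_mil, h_mil, t_mil=1.4, er=4.2):
    """Approximate characteristic impedance (ohms) of a surface microstrip,
    per the IPC-2141 formula: Z0 = 87/sqrt(er+1.41) * ln(5.98h / (0.8w + t)).
    w = trace width, h = dielectric thickness, t = copper thickness, in mils."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h_mil / (0.8 * w_mil + t_mil))

# Assumed everyday FR-4 stack-up: 5 mil dielectric, 1 oz (~1.4 mil) copper.
# An ordinary ~8 mil trace lands close to 50 ohms:
print(f"8 mil trace: {microstrip_z0(w_mil=8.0, h_mil=5.0):.1f} ohms")

# Pushing impedance higher forces the trace narrower (harder to fabricate,
# worse for EMI/crosstalk), illustrating the trade-off in the text:
print(f"4 mil trace: {microstrip_z0(w_mil=4.0, h_mil=5.0):.1f} ohms")
```

With these assumed numbers, an easily manufactured trace width sits near 50 ohms, which is the practical point the paragraph makes.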
3. Third, from the perspective of loss, basic physics shows that an impedance near 50 ohms yields the smallest skin-effect loss (per Howard Johnson, PhD). In general, the skin-effect loss L of a cable (in decibels) is proportional to the total skin-effect resistance R (per unit length) divided by the characteristic impedance Z0:

L ∝ R / Z0 (formula 1)

The total skin-effect resistance R is the sum of the shield resistance and the center-conductor resistance. At high frequencies the shield's skin-effect resistance is inversely proportional to its diameter d2, and the center conductor's is inversely proportional to its diameter d1, so the total series resistance R is proportional to (1/d2 + 1/d1). Given d2 and the dielectric constant Er of the insulating material, these relations let us minimize the skin-effect loss.
4. Any basic text on electromagnetic fields and microwaves gives the characteristic impedance of a coaxial cable as

Z0 = (60/√Er) · ln(d2/d1) (formula 2)

Substituting formula 2 into formula 1 and multiplying numerator and denominator by d2 gives

L ∝ (√Er/60) · (1/d2) · (1 + d2/d1) / ln(d2/d1) (formula 3)

Separating the constant factor (√Er/60)·(1/d2) from the effective term (1 + d2/d1)/ln(d2/d1), note that the minimum of formula 3 is controlled solely by the ratio d2/d1 and does not depend on Er or on the fixed value of d2. Plotting L against d2/d1, the minimum occurs at d2/d1 = 3.5911. Taking the dielectric constant of solid polyethylene as 2.25 and d2/d1 = 3.5911, the characteristic impedance comes out to 51.1 ohms. Radio engineers historically rounded this value to 50 ohms as the optimum for coaxial cable, which is why L is minimized near 50 ohms.
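The minimum quoted above can be checked numerically. This short sketch scans the effective term (1 + x)/ln(x) over a range of diameter ratios x = d2/d1, finds its minimum, and evaluates the resulting characteristic impedance for solid polyethylene (Er = 2.25):

```python
import math

# Normalized skin-effect loss of a coax: L(x) ∝ (1 + x) / ln(x), x = d2/d1.
def norm_loss(x):
    return (1.0 + x) / math.log(x)

# Brute-force scan for the minimum over a plausible range of ratios.
xs = [1.5 + i * 0.0001 for i in range(100000)]   # x from 1.5 to 11.5
x_min = min(xs, key=norm_loss)

er = 2.25                                 # solid polyethylene
z0 = (60.0 / math.sqrt(er)) * math.log(x_min)

print(f"optimal d2/d1 ≈ {x_min:.4f}")     # ≈ 3.5911
print(f"Z0 at the minimum ≈ {z0:.1f} ohms")  # ≈ 51.1 ohms
```

The scan reproduces the article's numbers: the loss minimum sits at d2/d1 ≈ 3.5911, giving roughly 51.1 ohms, which engineers rounded to 50.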
5. Finally, from the perspective of electrical performance, 50 ohms is also a compromise reached after weighing everything. Considering PCB trace performance alone, lower impedance would be preferable: for a given trace width, moving the trace closer to its reference plane reduces EMI, crosstalk, and sensitivity to capacitive loads. Over the full signal path, however, the chip's drive capability is crucial. Early chips could not drive transmission lines below about 50 ohms, while much higher impedances were impractical to fabricate, so 50 ohm impedance became the standard compromise.
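Why lower impedance demands more drive capability can be seen from the resistive divider at the driver: the incident wave launched onto the line is Vs · Z0 / (Rs + Z0), where Rs is the driver's output impedance. A minimal sketch with an assumed (hypothetical) Rs of 25 ohms:

```python
# Fraction of the source voltage launched onto a transmission line:
#     V_launch / Vs = Z0 / (Rs + Z0)
# where Rs is the driver output impedance and Z0 the line impedance.
def launch_fraction(rs_ohm, z0_ohm):
    return z0_ohm / (rs_ohm + z0_ohm)

rs = 25.0   # hypothetical driver output impedance, ohms
for z0 in (35.0, 50.0, 65.0):
    print(f"Z0 = {z0:.0f} ohms -> launches {launch_fraction(rs, z0):.0%} of Vs")
```

The lower Z0 is, the smaller the fraction of the swing that reaches the line, so a weak driver simply cannot develop a full logic level into a low-impedance trace; stronger output stages are what later allowed some chips to use impedances well below 50 ohms.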
6. To sum up: 50 ohms as the PCB industry's default has real advantages and is a well-considered compromise, but it is not mandatory; the choice ultimately follows the matching requirement. For instance, 75 ohms remains the standard for long-distance communication, and certain cables and antennas use 75 ohms, in which case the PCB trace impedance must match. Likewise, some specialized chips improve EMI and crosstalk suppression by raising drive capability so that lower transmission-line impedances can be used; for example, many Intel chips call for impedance control at 37 ohms, 42 ohms, or even lower.