Linear fiber-optic links reconcile noise and distortion obstacles

Aug. 1, 1995

Comprising a laser transmitter, fiber-optic cable and receiver, a basic lightwave link confronts and overcomes an array of analog and digital signal degradation sources

Hank Blauvelt and Lawrence A. Stark

Ortel Corp.

Because of noise and distortion effects, linear fiber-optic links mandate careful adherence to the laws of physics, established design rules and proven engineering practices. Basically, a fiber-optic link consists of a transmitter, an optical-fiber cable and a receiver. In the transmitter, the input radio-frequency signal modulates the laser optical output. This modulated optical signal is then transmitted through a fiber-optic cable to the receiver, where it is demodulated to recover the original RF signal.

In high-performance analog systems, the light source in the transmitter is generally a distributed-feedback semiconductor laser diode operating at 1310 nanometers. Fabry-Perot lasers are not used because mode-partition noise severely reduces their performance over fiber transmission distances greater than a few kilometers. For lasers used in broadband links, the efficiency is typically 0.25 milliwatt/milliampere.

To optimize the performance of a laser diode in a broadband transmitter, both DC bias and RF circuits are used. The transmitter includes RF gain to amplify the input signal to the optimum level for modulating the laser. In cable-TV networks, the transmitter typically also contains predistortion linearization and automatic-gain-control circuitry. Although distributed-feedback laser diodes are largely linear devices, small nonlinearities remain because of processing imperfections.

Noise sources

A broadband fiber-optic link has four major noise sources: the RF amplifiers in the transmitter, the laser diode, the photodiode and the RF amplifiers in the receiver. Because fiber-optic links typically exhibit signal loss, the noise from the transmitter amplifiers is generally much less significant than that generated by the receiver amplifiers and is therefore neglected.

Laser noise arises from random fluctuations in the intensity of the optical signal. The two main contributors are intensity fluctuations generated within the laser diode itself and interferometric noise, which arises from multiple light reflections in the optical fiber. Laser noise is often specified as relative intensity noise (RIN), because the noise amplitude is referenced to the average amplitude of the optical signal. If there were no other noise sources, the signal-to-noise ratio of a detected optical signal would be set by the square of the modulation depth divided by the relative intensity noise integrated over the channel bandwidth.
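As a quick numerical illustration, the short sketch below evaluates the standard RIN-limited carrier-to-noise expression, CNR = m^2 / (2 x RIN x B). The RIN, modulation depth and channel bandwidth are assumed, purely illustrative values, not figures from this article.

```python
# Hedged sketch: carrier-to-noise ratio if laser relative intensity noise (RIN)
# were the only noise source. All values are illustrative assumptions.
import math

RIN_dB_per_Hz = -155.0   # assumed laser RIN, dB/Hz
m = 0.04                 # assumed per-channel modulation depth (4%)
B = 4e6                  # assumed noise bandwidth of one video channel, Hz

rin_linear = 10 ** (RIN_dB_per_Hz / 10)       # convert to 1/Hz
cnr = m ** 2 / (2 * rin_linear * B)           # CNR = m^2 / (2 * RIN * B)
print(f"RIN-limited CNR ~ {10 * math.log10(cnr):.1f} dB")
```

With these assumed numbers the RIN-limited carrier-to-noise ratio works out to roughly 58 dB, which is why laser RIN matters so much in cable-TV links that must deliver about 50 dB per channel.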

Photodiode noise, or shot noise, occurs because light is composed of photons, which are discrete packets of energy. Thus, the signal is conveyed, not as a smooth flow of energy, but as infinitesimal "quanta" of energy. The randomness of the arrival time of each photon generates a random noise in the output current of the photodiode. The laws of physics demonstrate that the magnitude of shot noise is proportional to the average value of the optical signal.

When laser, photodiode and receiver noise are summed at the receiver, each noise source obeys a different law with respect to the magnitude of the received optical signal. The effect of receiver amplifier noise is independent of optical power, shot noise is proportional to the received optical power, and laser noise is proportional to the square of the optical power. Consequently, as the received power increases, laser noise dominates. In fact, for most fiber-optic links, if the optical loss is low--which means high received optical power--link noise is determined almost entirely by the laser. Conversely, for small received optical power, link noise is determined almost entirely by the receiver amplifiers.
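These scaling laws can be checked numerically. In the sketch below, the RIN, photodiode responsivity, amplifier input noise, modulation depth and channel bandwidth are all illustrative assumptions rather than values from this article; summing the three noise terms at several received power levels shows the dominant contribution shifting from receiver amplifier noise at the lowest powers, to shot noise, to laser noise at the highest.

```python
# Hedged sketch of how the three noise terms scale with received optical power.
# All component values are illustrative assumptions, not measurements.
import math

RIN = 10 ** (-155 / 10)   # assumed laser RIN, 1/Hz
i_amp = 7e-12             # assumed receiver amplifier input noise, A/sqrt(Hz)
resp = 0.9                # assumed photodiode responsivity, A/W
m = 0.04                  # assumed per-channel modulation depth
B = 4e6                   # assumed channel noise bandwidth, Hz
q = 1.602e-19             # electron charge, C

for p_mw in (0.03, 0.1, 0.3, 1.0, 3.0):   # received optical power, mW
    i_dc = resp * p_mw * 1e-3             # average photocurrent, A
    n_amp = i_amp ** 2 * B                # independent of optical power
    n_shot = 2 * q * i_dc * B             # proportional to optical power
    n_rin = RIN * i_dc ** 2 * B           # proportional to power squared
    cnr = 10 * math.log10(0.5 * (m * i_dc) ** 2 / (n_amp + n_shot + n_rin))
    dominant = max(("receiver", n_amp), ("shot", n_shot), ("laser", n_rin),
                   key=lambda item: item[1])[0]
    print(f"{p_mw:4.2f} mW received: CNR ~ {cnr:5.1f} dB, dominant noise: {dominant}")
```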

The design objective of most broadband systems is to ensure that the shot noise is the dominant noise factor. This shot-noise limited design represents the theoretical "best" design of optical systems.

Linearization by predistortion

Laser diodes and photodiodes are both linear devices. Ideally, a laser diode produces one photon for every electron injected into the device, and a photodiode produces one electron for every photon incident upon it. In practice, though, the conversion efficiencies are less than one. And in the distributed-feedback laser in particular, small nonlinearities emerge from the electrical-to-optical conversion process.

Several distortion mechanisms mar the output of a distributed-feedback laser. In some lasers, a portion of the injected current bypasses, or leaks around, the active layer. This leakage often increases nonlinearly as the total current is increased, leading to nonlinearities in the laser output. Also, because of the characteristics of stimulated emission, "hole burning" occurs as the electrons inside the laser are converted into photons.

Axial hole burning, for example, refers to the effect caused when the optical power density along the length of the laser chip, from the front to the back facet, is nonuniform. As the laser's optical output power is increased, relatively more stimulated emission occurs in regions where the light intensity is high. This effect lowers the carrier density in those regions because more electrons are converted into photons there. This carrier suppression, or hole burning, alters the operating characteristics of the laser and leads to nonlinearities. In addition to leakage and hole-burning effects, at higher frequencies the fundamental interactions between the light and the carriers inside the laser can also produce nonlinearities.

With careful design, though, these deleterious effects can be minimized, but not eliminated. They mostly result in second-order distortion. None of the mechanisms dominates, and the magnitude of each contribution varies from device to device, so the total distortion of distributed-feedback lasers also varies from device to device. Some devices meet the requirements of broadband systems without linearization, especially if one distortion mechanism cancels another. The advantage of such low-distortion lasers is that the amount of linearization required is small and can be implemented with a relatively inexpensive predistorter design.

Predistortion of lasers

In a linearization technique known as predistortion, the RF signals are passed through a device that generates distortion that is equal in amplitude and opposite in sign to that produced by the primary device (the laser, in this case). The RF signals plus the distortion signals are then applied to the primary device. The distortion generated by the primary device will then cancel the injected distortion, resulting in a highly linear output signal.
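A toy numerical model illustrates the principle. In the sketch below, the laser is represented by an arbitrary second-order nonlinearity (the coefficient is purely illustrative and does not describe any real device); a matching predistorter with an equal-amplitude, opposite-sign term cancels most of the distortion of the cascade.

```python
# Hedged toy model of predistortion: a laser with a small second-order
# nonlinearity preceded by a predistorter injecting the opposite distortion.
# The nonlinearity coefficient and test tones are arbitrary illustrative values.
import numpy as np

c2 = 0.05                         # assumed second-order nonlinearity coefficient

def laser(x):
    return x + c2 * x ** 2        # idealized laser transfer characteristic

def predistorter(x):
    return x - c2 * x ** 2        # equal-amplitude, opposite-sign distortion

t = np.linspace(0, 1e-6, 4096, endpoint=False)
x = 0.3 * np.sin(2 * np.pi * 55e6 * t) + 0.3 * np.sin(2 * np.pi * 61e6 * t)

direct = laser(x)                       # without linearization
linearized = laser(predistorter(x))     # with predistortion

# Compare the residual distortion (everything except the linear term).
print("peak distortion, direct:      ", np.max(np.abs(direct - x)))
print("peak distortion, predistorted:", np.max(np.abs(linearized - x)))
```

In this simplified model the second-order products cancel exactly, leaving only a much smaller third-order residual; a real predistorter must also match the frequency response and delay of the laser distortion, which is what the filtering and delay described below accomplish.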

In a patented predistorter design, the RF signals are split off the main line to distortion arms. In these distortion arms, second- and third-order distortions are generated and then appropriately filtered and delayed to synthesize the required complex distortion characteristics. Next, the distortion signals are recombined with the main signals before being injected into the laser. This approach is effective in linearizing both second- and third-order distortions produced by distributed-feedback lasers.

Basic performance

Since distributed-feedback lasers were first used for amplitude-modulated video transmission, the output power of 1310-nm devices has increased from an original 2 to 3 milliwatts to more than 10 milliwatts. Output powers as high as 30 milliwatts have been demonstrated in the laboratory.

The output power of a distributed-feedback laser module is determined by the efficiency of the laser chip, the efficiency of coupling laser light to a singlemode fiber and the operating current of the laser.

Laser chip efficiency. Laser chip efficiency is expressed as milliwatts per milliampere, or mW/mA. The theoretical maximum efficiency for a semiconductor laser would occur if each electron injected into the laser produced one photon. For 1310-nm distributed-feedback lasers, the maximum possible efficiency corresponds to 0.94 mW/mA. In practice, the laser efficiency is lower because not all of the injected carriers result in the generation of light, and not all of the light generated is emitted from the front facet of the laser. Typical efficiencies have improved over time from approximately 0.3 mW/mA to 0.45 mW/mA. Although there may be breakthroughs in laser efficiency, the efficiency of production lasers is expected to saturate at a level of 0.5 mW/mA.
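The quantum-limited figure follows directly from the photon energy at 1310 nm divided by the electron charge; the short calculation below reproduces a value within rounding of the 0.94 mW/mA quoted above.

```python
# Hedged check of the quantum-limited slope efficiency: one photon emitted per
# injected electron, i.e. photon energy divided by the electron charge.
h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
q = 1.602e-19        # electron charge, C
wavelength = 1310e-9 # m

slope_w_per_a = (h * c / wavelength) / q   # W/A, numerically equal to mW/mA
print(f"Quantum-limited slope efficiency at 1310 nm ~ {slope_w_per_a:.2f} mW/mA")
```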

Fiber coupling efficiency. The light emitted from the laser chip must be coupled into the optical fiber. The beam leaving the chip is highly divergent and elliptical in shape. The main losses in coupling efficiency result from optical aberrations in focusing the highly divergent output beam from the laser into a singlemode fiber. Some losses also result from coupling the elliptical beam from the laser into a circular fiber. Also, there is some loss from the internal optical isolator. Typical coupling efficiency from production devices has improved from approximately 40% to 60%, primarily as a result of improvements in coupling optics. The expectation is that production efficiencies will continue to improve to a probable saturation at almost 70%.

Operating current. The impact of improving laser efficiency is to increase linear fiber-optic system optical-loss budgets with no other changes in operating characteristics, such as linearity or RF drive levels. Another way to increase optical output power is to operate the lasers at higher DC currents. In practice, the maximum operating current for linear distributed-feedback lasers is set by the requirement that the device remain linear. During the past several years, the typical maximum operating current above threshold for which amplitude-modulated composite-second-order performance is maintained has increased from 40 mA to 60 mA. Much higher operating currents are possible if the length of the laser chip is increased. A practical drawback to high operating currents is that the RF power required to modulate the laser also increases when the current is higher. This condition makes it difficult to design the input circuits to the transmitter without adding undue levels of distortion.
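Taken together, the three factors give a quick, back-of-the-envelope estimate of fiber-coupled module power. The sketch below simply multiplies the typical values cited in the preceding paragraphs; the result is consistent with the 10- to 25-milliwatt module powers discussed in this article.

```python
# Hedged back-of-the-envelope estimate of module output power from the three
# factors discussed above, using the typical values quoted in the text.
chip_efficiency = 0.45          # mW/mA, typical production chip efficiency
coupling_efficiency = 0.60      # fraction of chip output coupled into the fiber
current_above_threshold = 60.0  # mA, typical maximum for linear operation

fiber_power_mw = chip_efficiency * coupling_efficiency * current_above_threshold
print(f"Estimated fiber-coupled output power ~ {fiber_power_mw:.0f} mW")
```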

Although future advances are possible, market trends have resulted in a saturated demand for high-power lasers and a growing demand for low- to moderate-power devices. To date, laser power in the 20- to 25-milliwatt range represents the highest value supplied by manufacturers.

Channel loading

Most broadband fiber-optic systems require an amplitude-modulated transmission capacity of 80 channels in the 50- to 550-megahertz range. Added digital signals extend the range to 750 MHz, but at lower signal levels. Therefore, the AM channels represent the primary signal loading requirement on the laser.

The common operating condition for the transmitter is to modulate the laser so that the largest negative peaks in the modulating current drive the laser down to near zero power. This condition is referred to as the clipping limit. For 80 channels, this condition corresponds to a modulation depth per channel of a few percent, where the modulation depth is the ratio of the amplitude of the modulated light signal per channel to the DC optical power. If all the carriers were exactly in phase, the peak current would drive the laser well below zero power, and the laser would exhibit severe distortion. Fortunately, the probability of all the carriers aligning in phase is essentially zero.
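A common way to make the clipping argument semi-quantitative (an approximation, not a calculation from this article) is to treat the composite of many carriers with random phases as a Gaussian signal. The sketch below estimates how rarely that composite would drive the laser below zero power for a few illustrative per-channel modulation depths.

```python
# Hedged sketch: probability that the sum of many equal-amplitude carriers with
# random phases drives the laser below zero power, using a Gaussian
# approximation for the composite signal. Modulation depths are illustrative.
import math

N = 80                                 # number of AM channels
for m in (0.03, 0.04, 0.05):           # assumed per-channel modulation depth
    sigma = m * math.sqrt(N / 2)       # rms of composite modulation (fraction of DC)
    # One-sided Gaussian tail below -100% of the DC optical power.
    p_clip = 0.5 * math.erfc(1 / (sigma * math.sqrt(2)))
    print(f"m = {m:.2f}: rms depth {sigma:.2f}, clipping probability ~ {p_clip:.1e}")
```

The estimate shows how sharply the clipping probability rises with modulation depth, which is the tradeoff described next.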

The fundamental tradeoff related to modulation depth is between the carrier-to-noise ratio and distortion. A high modulation depth value improves the carrier-to-noise ratio, but the distortion increases. The goal for broadband transmitters is to meet AM linearity requirements for modulation depths up to the clipping limit.

The distortion in multichannel video systems results from a large number of distortion beats that fall at particular frequencies within the operating band. Second-order distortion mechanisms result in intermodulation distortion at frequencies that are either the sum or difference between two carriers. For National Television Standards Committee signals, the carrier frequencies are generally offset +1.25 MHz from a multiple of 6 MHz. Additive beats (f1+f2) are then 2.5 MHz above a multiple of 6 MHz, and subtractive beats (f1-f2) are at a multiple of 6 MHz. The largest number of subtractive beats occurs at the low-frequency channels, and the largest number of additive beats occurs at the high-frequency channels. Because there is generally no fixed phase relationship between the carriers, the composite distortion power is the sum of the power of each distortion beat. This power is commonly referred to as composite second order. The level of second-order distortion for distributed-feedback lasers--like other RF devices--increases 2 decibels for every 1-dB increase in the RF modulating signal up to the clipping limit.
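The beat distribution is easy to reproduce by brute force. The sketch below counts sum and difference beats for an assumed 80-channel plan with carriers starting at 55.25 MHz and spaced 6 MHz apart (an illustrative plan, not necessarily the exact one used in any given network); it shows difference beats piling up at the low end of the band and sum beats at the high end.

```python
# Hedged sketch: counting composite-second-order (CSO) beats for an assumed
# 80-channel plan with carriers at multiples of 6 MHz plus 1.25 MHz.
carriers = [55.25 + 6.0 * n for n in range(80)]   # MHz, illustrative plan

def count_cso(fc, carriers, tol=0.01):
    """Count f1+f2 beats near fc+1.25 MHz and |f1-f2| beats near fc-1.25 MHz."""
    sums = diffs = 0
    n = len(carriers)
    for i in range(n):
        for j in range(i + 1, n):
            if abs(carriers[i] + carriers[j] - (fc + 1.25)) < tol:
                sums += 1
            if abs(abs(carriers[i] - carriers[j]) - (fc - 1.25)) < tol:
                diffs += 1
    return sums, diffs

for ch in (0, 40, 79):                            # low, middle and high channel
    s, d = count_cso(carriers[ch], carriers)
    print(f"carrier {carriers[ch]:6.2f} MHz: {s} sum beats, {d} difference beats")
```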

Third-order distortion mechanisms result in distortion beats at frequencies that are sums or differences among three carriers. Most of these beats are in the form of f1+f2-f3. These third-order beats fall at the same frequencies as those of the RF carriers. The number of such beats is largest for the middle channels, which have more than 2000 beats for an 80-channel frequency plan. Again, the composite distortion power is the sum of the power of each beat and is referred to as the composite triple beat. The level of third-order distortion for distributed-feedback lasers--like other RF devices--increases 3 dB for every 1-dB increase in the RF modulating signal up to the clipping limit.
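The same brute-force counting applied to f1+f2-f3 products (same assumed 80-channel plan as above) confirms that the middle channels accumulate more than 2000 triple beats, in line with the figure quoted above.

```python
# Hedged sketch: counting composite-triple-beat (CTB) products of the form
# f1 + f2 - f3 that land on a given carrier, for an assumed 80-channel plan.
carriers = [55.25 + 6.0 * n for n in range(80)]   # MHz, illustrative plan

def count_ctb(fc, carriers, tol=0.01):
    count = 0
    n = len(carriers)
    for i in range(n):
        for j in range(i + 1, n):          # f1 and f2: unordered distinct pair
            for k in range(n):             # f3: any other carrier
                if k in (i, j):
                    continue
                if abs(carriers[i] + carriers[j] - carriers[k] - fc) < tol:
                    count += 1
    return count

for ch in (0, 40, 79):                     # low, middle and high channel
    print(f"carrier {carriers[ch]:6.2f} MHz: {count_ctb(carriers[ch], carriers)} triple beats")
```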

Receiver performance

Optical receivers for broadband links convert the optical signal to an electrical signal and must do so without adding significant noise or distortion. Optical receivers for baseband digital applications generally use transimpedance-type amplifiers and either avalanche photodiodes or positive-intrinsic-negative (PIN) photodiodes. Such receivers exhibit good bandwidth and noise properties, but generally do not have adequate linearity for broadband links. Both avalanche photodiodes and most transimpedance amplifier designs contribute distortion at unacceptably high levels. However, properly designed PIN photodiodes provide excellent linearity.

To meet the stringent linearity requirements of broadband links, the receiver must include a linear PIN photodiode and a low distortion RF amplifier. The amplifier should also have a low noise figure. A common choice for the amplifier is a cable-TV hybrid amplifier, because of the low-cost and low-distortion characteristics of these amplifiers. However, these amplifiers are not optimized for a low noise figure. If a PIN photodiode is directly connected to a cable-TV hybrid amplifier, then the noise from the amplifier will result in an unacceptably large degradation in link carrier-to-noise ratio unless the received optical power is high (greater than 1 milliwatt).

Improved noise performance would result in a better receiver, even at the expense of somewhat higher distortion. A cost-effective way to achieve this performance is through the use of a broadband current transformer and appropriate tuning circuits that passively boost the signal current delivered to the hybrid amplifier. Typical current gains for 750-MHz designs are 6 dB. This gain effectively reduces the impact of the amplifier noise by 6 dB, making receiver amplifier noise relatively unimportant for the normal range of received optical power. This signal boost does increase receiver distortion from both the photodiode and the amplifier; however, the receiver distortion level is still low compared to the transmitter distortion for the normal range of received optical power.
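The arithmetic behind the 6-dB improvement is straightforward. The sketch below compares the carrier-to-amplifier-noise ratio with and without a passive 6-dB current gain ahead of the amplifier; the responsivity, received power, modulation depth, bandwidth and amplifier noise are all assumed, illustrative values rather than specifications of any real receiver.

```python
# Hedged sketch of how a passive 6-dB current gain ahead of the hybrid amplifier
# reduces the relative impact of amplifier noise. All values are assumptions.
import math

resp = 0.9      # assumed photodiode responsivity, A/W
p_rx_mw = 0.5   # assumed received optical power, mW
m = 0.04        # assumed per-channel modulation depth
B = 4e6         # assumed channel noise bandwidth, Hz
i_amp = 10e-12  # assumed hybrid amplifier input noise, A/sqrt(Hz)

i_dc = resp * p_rx_mw * 1e-3            # average photocurrent, A
signal = 0.5 * (m * i_dc) ** 2          # per-channel signal term at the photodiode

for gain_db in (0, 6):
    g = 10 ** (gain_db / 20)            # passive current gain ahead of the amplifier
    cnr_amp = 10 * math.log10(signal * g ** 2 / (i_amp ** 2 * B))
    print(f"{gain_db} dB current gain: carrier-to-amplifier-noise ratio ~ {cnr_amp:.1f} dB")
```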

Despite the fact that receiver distortion is generally low, the receiver cannot be ignored in the overall link distortion design. To optimize noise performance, broadband receivers are typically designed to operate over a relatively narrow range of received optical power. If the optical power is below the design range, the designated link carrier-to-noise ratio will not be met. If it is above the design range, the receiver will degrade link linearity. A common problem in the installation of broadband links has been the tendency to overestimate the loss budgets, which has resulted in the need to install optical attenuators at the receiver. When designing broadband links, it is important to consider both the minimum and maximum loss budgets.

Hank Blauvelt is vice president of fiber-optic technologies and Lawrence A. Stark is vice president and business manager of broadband communications products at Ortel Corp., Alhambra, CA.
