By François Robitaille
Overview
The shift to more complex modulation formats for high-speed communications means a change in the way designers and network managers measure signal performance. Several options are available, but modulation analyzers based on optical sampling stand out as the best choice.
With bandwidth demand growing and fiber availability decreasing, carriers and equipment manufacturers have no choice but to turn to high-speed optical transmission systems to increase the capacity of the networks. However, traditional one-bit-per-symbol modulation schemes, such as on/off keying (OOK) or differential phase-shift keying (DPSK), do not offer the spectral efficiency necessary to significantly increase the overall data-carrying capacity of optical fibers. Moreover, if scaled up to significantly higher transmission speeds, these traditional modulation formats become highly sensitive to chromatic dispersion (CD) and polarization-mode dispersion (PMD), rendering them unusable on existing networks.
Last August, the influential Optical Internetworking Forum (OIF) recommended using a fully coherent 4-bit-per-symbol dual-polarization quadrature phase-shift keying (DP-QPSK) modulation format for 100-Gbps system design, since it is both spectrally efficient and highly resilient to CD and PMD (when coupled with suitable signal-processing algorithms). In addition, the industry is already seriously looking at longer-term evolution with modulation schemes such as 16-QAM and orthogonal frequency-division multiplexing.
These radical changes in modulation formats bring formidable challenges to equipment manufacturers and their carrier customers. Such modulation schemes cannot be characterized using traditional test instruments and methods, which are sensitive only to the time-varying intensity of the optical signal and not to its phase. Therefore, in addition to well-known eye diagram analysis, which provides intensity and time information for OOK signals, new measurements must be performed to retrieve the phase information. Constellation diagram analysis responds to this need.
Different instruments can be used to recover the intensity and phase information necessary to fully and properly characterize advanced modulation schemes such as QPSK, 16-QAM, or their dual-polarization versions: high-resolution optical spectrum analyzers, modulation analyzers based on real-time electrical sampling oscilloscopes, and modulation analyzers based on optical sampling. Each approach has its advantages and disadvantages; therefore, when characterizing high-speed transmitters, it is important to understand the key elements affecting constellation diagram recovery and the quality of the measurements that can be obtained from the diagram.
Measurement approaches
As just mentioned, one approach to recover the intensity and phase information of a signal is to employ a high-resolution optical spectrum analyzer. Such sophisticated instruments use nonlinear effects combined with a local oscillator to recover both the amplitude and the phase of the signal. The intensity and phase information, measured as a function of frequency, is then converted to the time domain using a fast Fourier transform (FFT). However, this frequency-domain approach imposes severe limitations, including the inability to capture long data sequences or to recover the information from framed OTU-4 data.
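To make the conversion step concrete, here is a minimal sketch (in Python/NumPy; the grid values and function name are illustrative assumptions, not taken from any particular instrument) of how a complex field spectrum recovered by such an analyzer maps to a time-domain waveform, and why the frequency resolution bounds the length of the captured record.

```python
import numpy as np

def spectrum_to_waveform(E_freq, df_hz):
    """Convert a complex field spectrum E(f), sampled on a uniform grid with
    spacing df_hz and centered on the carrier, to the time-domain field E(t)."""
    n = len(E_freq)
    # Shift DC to index 0, then inverse-FFT to obtain the time-domain field.
    E_time = np.fft.ifft(np.fft.ifftshift(E_freq))
    t = np.arange(n) / (n * df_hz)   # the record spans only 1/df_hz seconds
    return t, E_time

# With, say, 1-MHz resolution the recovered record spans 1/1e6 = 1 microsecond,
# which illustrates why long data sequences or framed OTU-4 traffic cannot be
# captured this way.
```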
The second (and probably most obvious) approach is to use a real-time electrical sampling oscilloscope—i.e., sampling light with electronics—to acquire data samples in much the same way a coherent receiver would, by sampling at the symbol rate. This approach yields very accurate information on symbol positions within the constellation and can provide bit-error-rate information on samples of the data stream. However, the limited effective bandwidth of such oscilloscopes and the impedance mismatches typically found when interfacing with high-speed electronics make it impossible to obtain accurate transition information and perform distortion-free waveform recovery.
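As an illustration of the symbol-rate decision step such an instrument shares with a coherent receiver, the following sketch (QPSK with a simple quadrant decision is assumed purely for illustration) decides bits from one I/Q sample per symbol; comparing the result against a known test pattern yields a BER estimate for the captured samples only.

```python
import numpy as np

def qpsk_decide(iq_samples):
    """Hard-decide QPSK bits from complex I/Q samples taken once per symbol."""
    iq = np.asarray(iq_samples, dtype=complex)
    bits_i = (iq.real > 0).astype(int)   # in-phase bit
    bits_q = (iq.imag > 0).astype(int)   # quadrature bit
    return np.column_stack([bits_i, bits_q])

def sample_ber(decided_bits, reference_bits):
    """BER over the captured samples only, not over the full data stream."""
    return np.mean(decided_bits != reference_bits)
```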
Finally, an alternative approach is to employ a modulation analyzer based on optical sampling—i.e., characterizing light with light. This approach uses short laser pulses as a stroboscope, briefly opening a sampling gate to generate samples whose energy is proportional to the instantaneous power of the input signal. These samples are then detected by lower-speed electronics.
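A minimal sketch of the equivalent-time idea behind this stroboscopic approach follows, assuming an ideal, instantaneous gate and purely illustrative rates: because the sampling-pulse rate is detuned slightly from a subharmonic of the symbol rate, successive low-rate samples fall at slowly advancing positions within the symbol period, and sorting them by that position rebuilds the high-speed waveform.

```python
import numpy as np

symbol_rate = 28e9        # 28-Gbaud signal under test (illustrative)
gate_rate = 100e6         # repetition rate of the optical sampling pulses
detuning_hz = 5e3         # small offset that sweeps the gate across the symbol

n_gates = 20_000
t_gate = np.arange(n_gates) / (gate_rate + detuning_hz)   # gate firing times
equiv_time = np.mod(t_gate, 1.0 / symbol_rate)            # position within a symbol

# Each gate opening yields a sample whose energy is proportional to the
# instantaneous optical power, so it can be detected with low-speed electronics;
# plotting samples against 'equiv_time' reconstructs the high-speed waveform.
```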
The main advantages of this stroboscopic optical sampling approach are its very large effective bandwidth (arising from its high temporal resolution; see Fig. 1) and the absence of impedance mismatch, allowing distortion-free waveform recovery at signal-under-test symbol rates that can reach 100 Gbaud (see Fig. 2). Optical sampling oscilloscopes and modulation analyzers typically have an effective bandwidth 5 to 10 times larger than electrical sampling solutions.
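As a rough illustration of the bandwidth claim (using the common Gaussian rise-time rule of thumb, bandwidth ≈ 0.35/τ, which is an assumption here rather than a figure from any specific instrument), picosecond-scale gates translate into effective bandwidths of hundreds of gigahertz:

```python
# Effective bandwidth implied by a short sampling gate, using the 0.35/tau
# rise-time rule of thumb (an assumption, not a vendor specification).
for gate_ps in (1.0, 2.0, 5.0):
    bw_ghz = 0.35 / (gate_ps * 1e-12) / 1e9
    print(f"{gate_ps:.0f} ps gate -> ~{bw_ghz:.0f} GHz effective bandwidth")
```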
Measurements that match the objectives
When designing transmitters or transmission systems, the objective is always the same: Transmit as much data as possible, as fast as possible, in the smallest channel bandwidth possible, and make sure it can be received error-free at the other end. Several parameters will affect the performance of transmitters and systems, which is why performing the right optimization and verification is critical.
For transmission based on more conventional modulation formats like non-return-to-zero or DPSK, sampling scopes that provide eye diagram analysis have been used to characterize and optimize transmitters. These instruments can also provide measurements such as eye opening, signal-to-noise ratio, skew, extinction ratio, rise/fall time, and jitter.
Advanced modulation formats with coherent transmission require similar analysis. But since the information is encoded not only in the intensity of the signal but also in its phase, a constellation analyzer is required to test transmitters and systems (see Fig. 3). This introduces new measurement parameters adapted to evaluate, for instance, the tuning of the modulator or pulse carver. Parameters may include I-Q imbalance, phase and intensity error, and error vector magnitude (EVM), the last referring to the deviation of the measured constellation from the ideal constellation. If dual-polarization transmission is used, variations between the signals carried on the two states of polarization also need to be identified and measured, since they may indicate problems in the transmitter design or balance.
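As a concrete example of the EVM figure just described, the following sketch computes the RMS error vector magnitude of measured constellation points against their ideal positions (the QPSK points and the RMS-power normalization are assumptions made for illustration; some conventions normalize to the peak symbol power instead):

```python
import numpy as np

def evm_percent(measured, ideal):
    """RMS error vector magnitude, as a percentage of the ideal RMS power."""
    measured = np.asarray(measured, dtype=complex)
    ideal = np.asarray(ideal, dtype=complex)
    error = measured - ideal
    return 100.0 * np.sqrt(np.mean(np.abs(error) ** 2) / np.mean(np.abs(ideal) ** 2))

# Example: 1,000 QPSK symbols with a little additive noise.
rng = np.random.default_rng(0)
ideal = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], 1000) / np.sqrt(2)
noise = 0.05 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
print(f"EVM = {evm_percent(ideal + noise, ideal):.1f}%")   # roughly 7%
```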
At the system level, one of the parameters generally used to qualify performance—and considered the ultimate indicator that the data sent through the network is received without errors—is the bit-error rate (BER). When transmission is good and BER is low, there is no need for additional troubleshooting. A high BER value, however, raises key questions: What if the BER increases to a point where the system cannot completely correct errors? What information can be gleaned from the BER apart from revealing that the system is not working?
It is under these circumstances that it is critical to rely on a test instrument able to provide distortion-free waveform recovery and precise transition information, enabling engineers to identify the causes of transmission problems such as polarization crosstalk, dispersion, or poor signal-to-noise ratio (see Fig. 4).
BER measurement using a modulation analyzer
As mentioned previously, BER is an important test to carry out on any transmission system. When this test is performed with a modulation analyzer, whether based on electrical real-time sampling or on optical sampling, a very important limitation must be taken into consideration.
Over sufficiently long periods (i.e., more than a few seconds), only a small percentage of the symbols are actually sampled, either in a short-duty-cycle real-time “burst mode,” as is generally the case with electrical real-time sampling, or by undersampling over a limited time interval, as with “equivalent-time” optical sampling (see Fig. 5). In practice, data transfer and processing take up most of each sampling cycle, typically between 95% and 99% of it, depending on the sampling method and the processing power of the test instrument.
The consequence of this short duty cycle is simple: The BER estimated using a modulation analyzer will be valid only if errors are distributed randomly over time rather than occurring in bursts. None of the sampling approaches allows reliable measurement of glitches that might generate such error bursts. To capture them, a true continuous real-time sampling device (such as a receiver) is required, but the data transfer rates necessary for full analysis exceed what contemporary technology can handle.
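Putting some illustrative numbers on this limitation (the symbol rate, record length, and cycle time below are assumptions, not figures for any particular instrument) shows both how small the observed fraction of traffic is and how many symbols must be analyzed before a given BER can even be estimated:

```python
symbol_rate = 28e9          # 28-Gbaud signal under test
symbols_per_cycle = 1e6     # symbols actually captured per acquisition cycle
cycle_time_s = 1.0          # cycle dominated by data transfer and processing

fraction_observed = symbols_per_cycle / (symbol_rate * cycle_time_s)
print(f"Fraction of transmitted symbols analyzed: {fraction_observed:.1e}")

# Roughly 10/BER symbols must be analyzed to observe ~10 errors, so the
# wall-clock time needed to estimate a given BER grows as the duty cycle shrinks.
for ber in (1e-3, 1e-6, 1e-9):
    symbols_needed = 10 / ber
    hours = symbols_needed / symbols_per_cycle * cycle_time_s / 3600
    print(f"BER {ber:.0e}: ~{symbols_needed:.0e} symbols, ~{hours:.2g} h of cycles")
```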
The other relevant question with regard to a BER value obtained with a modulation analyzer pertains to the hardware. Since the modulation analyzer generally employs a different receiver front end than the actual receiver used in the customer's system, it must be determined whether the BER measured with the analyzer is representative of the true BER that can be achieved with that receiver, a question whose answer depends on the quality of both the instrument and the receiver.
From this perspective, the BER estimated using a modulation analyzer can only be used as a general indication of the transmission quality and certainly not as an absolute confirmation that the system is optimized and will provide the best possible performance.
Advantages of optical equivalent-time sampling
Modulation analyzers based on optical sampling offer many key features for the characterization and optimization of 40/100-Gbps (and beyond) transmitters and systems:
- High temporal resolution in the range of a few picoseconds
- Extremely low intrinsic timing jitter and phase noise
- No impedance mismatches created by high-speed electronic components, as the sampling is performed at the optical level and requires only low-speed electronics
- Sufficient effective bandwidth to allow characterization of signals up to 100 Gbaud
Not only can such an optical-sampling-based instrument measure parameters similar to those of real-time electrical sampling instruments, it also permits measurement of very accurate constellation diagrams, including transition information and distortion-free waveform recovery for current and future modulation formats and transmission speeds. In essence, the waveform captured this way is the true optical waveform unimpaired by the testing instrument. Optical modulation analyzers can therefore be viewed as the “golden testing instrument” that can serve as a baseline and common standard in the industry.
François Robitaille is a senior product line manager at EXFO Inc.