Is DP-QPSK the endgame for 100 Gbits/sec?

Nov. 1, 2008

by Meghan Fuller Hanna

The Optical Internetworking Forum (www.oiforum.com) this summer agreed to begin work on a project framework for 100-Gbit/sec long-haul transmission, focusing primarily on an advanced modulation scheme known as dual-polarisation quadrature phase-shift keying (DP-QPSK), paired with coherent detection. While many in the industry agree that DP-QPSK shows great promise, some wonder whether we can conclusively say it holds the most promise at such an early stage in the development of 100G.

According to Joe Berthold, former president of the OIF and an author of the current 100G project, the OIF's decision to focus on DP-QPSK was coloured by the particular application it is aiming to address. When the organisation assessed the industry, it determined that the first market for 100-Gbit/sec transmission would be core backbone networks in the U.S. And any technology designed to address this application has to satisfy two key objectives: support distances of 1,000 to 1,500 km without requiring regeneration of the signal, and be spectrally efficient, supporting more capacity per fibre than is currently available.

Enter DP-QPSK. Dual polarisation enables the transmission of twice as much data on a given wavelength by sending a signal in each of two polarisations. One signal is sent in the horizontal polarisation and the other in the vertical; both occupy the same frequency, and because they are polarised 90° from each other, they do not interact.
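For readers who want to see that orthogonality claim concretely, here is a minimal numerical sketch. It uses an idealised Jones-vector picture; in a real fibre the polarisations rotate along the way, which is part of what the receiver's processing must untangle:

```python
import numpy as np

# Idealised Jones-vector picture: the two polarisations are orthogonal
# unit vectors, so each signal can be projected out without crosstalk.
H = np.array([1.0, 0.0])      # horizontally polarised carrier
V = np.array([0.0, 1.0])      # vertically polarised carrier
sig = 0.7 * H + (-0.3) * V    # two independent symbols on one wavelength
print(sig @ H, sig @ V)       # 0.7 -0.3 -- each symbol recovered independently
```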

Unlike traditional modulation, where the signal is either light or dark, on or off -- one bit per interval -- QPSK enables the modulation of two bits per interval, or symbol, because the signal can sit in one of four different phases. Each symbol therefore represents one of four bit pairs: 00, 11, 01, or 10.
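A minimal sketch of that idea follows, using a hypothetical Gray-coded mapping (real transponders differ in mapping, pulse shaping, and differential encoding):

```python
import numpy as np

# Hypothetical Gray-coded QPSK mapping: each 2-bit pair selects one of
# four phases, so every symbol interval carries two bits.
QPSK_PHASES = {
    (0, 0): np.pi / 4,
    (0, 1): 3 * np.pi / 4,
    (1, 1): 5 * np.pi / 4,
    (1, 0): 7 * np.pi / 4,
}

def qpsk_modulate(bits):
    """Group the bit stream into pairs and emit one complex symbol per pair."""
    pairs = zip(bits[0::2], bits[1::2])
    return np.array([np.exp(1j * QPSK_PHASES[p]) for p in pairs])

print(qpsk_modulate([0, 0, 1, 1, 0, 1, 1, 0]))  # 8 bits -> 4 symbols
```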

"Basically, dual polarisation gets you two times QPSK, which means you get four bits in the bit interval that the receiver is seeing," explains Helen Xenos, metro Ethernet marketing manager at Nortel. "And that means four times the bandwidth."

"It really is the combination of dual polarisation, which gives you twice as much information in a slot, and then QPSK, which gives you twice as many bits within a given bit rate," adds Berthold. "Effectively, you end up with electronics that are dealing with a signal that is a factor of four slower, and now you can use CMOS technology to do processing on that information."

And therein lies one of the key benefits inherent in the use of DP-QPSK: It allows the electronics to operate at a slower speed, which enables the use of inexpensive, easy-to-produce CMOS. The signal itself may be fairly complicated from an optical perspective, but the DSP device is only processing a 25-Gbit/sec signal -- making it easier than some of the early 40G implementations, notes Berthold. "This is actually easier to make, and cheaper to implement, and it will propagate better," he says.
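The arithmetic behind that factor of four is straightforward. A back-of-the-envelope sketch, ignoring the FEC and framing overhead that pushes the real symbol rate somewhat higher:

```python
line_rate_gbps = 100.0     # target payload rate, before FEC/framing overhead
bits_per_symbol = 2 * 2    # 2 polarisations x 2 bits per QPSK symbol
symbol_rate_gbaud = line_rate_gbps / bits_per_symbol
print(symbol_rate_gbaud)   # 25.0 -- the rate the electronics actually see
```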

In the OIF implementation -- and in Nortel's now commercialised 40G version -- DP-QPSK is paired with a coherent receiver, which some say may be the most challenging aspect of the system.

In the past, optical receivers detected signals by measuring the intensity of the light, which was simply turned on and off (a more basic modulation format known as on-off keying). The typical receiver operates much like an AM radio, says Xenos, which makes the coherent receiver more like an AM/FM radio, complete with local oscillator. The local oscillator is tuned to the frequency of the incoming signal and extracts only that information. In other words, the coherent receiver locks onto both the frequency and phase of the DP-QPSK signal and is able to accurately recover the incoming bits.
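Conceptually, the AM/FM analogy maps onto one operation: beating the incoming field against the local oscillator recovers the complex amplitude, i.e. phase as well as intensity. A toy baseband illustration follows; the function and parameter names are hypothetical, and a real receiver uses 90° optical hybrids, balanced photodiodes, and carrier-recovery DSP rather than a single multiply:

```python
import numpy as np

def coherent_detect(rx_field, lo_offset_hz, t):
    """Mix the received field with a local-oscillator tone.

    A photodiode alone measures only |E|^2 (intensity, AM-radio style);
    mixing with the LO shifts the signal to baseband while keeping its
    phase, so the four QPSK phases become distinguishable.
    """
    lo = np.exp(1j * 2 * np.pi * lo_offset_hz * t)
    return rx_field * np.conj(lo)

t = np.arange(8) * 1e-12                             # sample instants, 1 ps apart
tx = np.exp(1j * (2 * np.pi * 5e9 * t + np.pi / 4))  # symbol at 45 deg, 5-GHz offset
print(np.angle(coherent_detect(tx, 5e9, t)))         # ~0.785 rad: phase recovered
```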

Moreover, the coherent receiver is used to compensate for optical impairments like chromatic dispersion and polarisation-mode dispersion, eliminating the need for expensive, discrete optical compensators on the line. The coherent receiver therefore enables a better optical signal-to-noise ratio (OSNR), which, in turn, increases the distance the signal can travel without regeneration.
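One common way a coherent receiver's DSP can undo chromatic dispersion is with a frequency-domain all-pass filter whose phase response is the inverse of the fibre's. A sketch under standard assumptions (1550-nm band; sign conventions vary between references):

```python
import numpy as np

def cd_compensate(samples, sample_rate_hz, total_dispersion_ps_per_nm,
                  wavelength_nm=1550.0):
    """Undo accumulated chromatic dispersion with an FFT-based all-pass filter."""
    c = 299_792_458.0                                    # speed of light, m/s
    lam = wavelength_nm * 1e-9
    D_total = total_dispersion_ps_per_nm * 1e-12 / 1e-9  # accumulated D, s/m
    beta2_L = -D_total * lam**2 / (2 * np.pi * c)        # accumulated GVD, s^2
    w = 2 * np.pi * np.fft.fftfreq(len(samples), d=1.0 / sample_rate_hz)
    # The fibre applies exp(+j*beta2_L*w^2/2); multiply by the inverse phase.
    return np.fft.ifft(np.fft.fft(samples) * np.exp(-1j * beta2_L * w**2 / 2))

# e.g. cd_compensate(rx, 50e9, 17_000)  # ~1,000 km of standard fibre at 17 ps/nm/km
```

Because the filter only rotates phases, it can in principle absorb the dispersion of the whole 1,000-to-1,500-km route in silicon, which is what lets operators pull the discrete compensators off the line.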

Unlike other standards bodies like the ITU or IEEE, the OIF is not aiming for interoperability, nor is it aiming to develop a "be-all-and-end-all" standard, says Berthold. Instead, the OIF intends to create "a critical mass of vendors in the market to buy technology components of a particular type so we can create an ecosystem so the people who make various photonic components can have at least some reasonable belief that there is more than one company that might be a market for a particular product," he says. "And to make the people who invest in various silicon ASICs think, 'Hey, there might be more than one company that is a potential buyer for this ASIC.' "

In other words, the OIF is aiming to mitigate the risk inherent in component R&D, and drive down the cost of the resultant components by creating an ecosystem of potential buyers.

While Berthold insists that the OIF would have "backed another horse" had someone brought forth a viable alternative to DP-QPSK (no one did), some in the industry wonder whether it is still too soon for the OIF to be backing any horse today. For all its benefits, DP-QPSK is still a relatively immature technology.

One of the biggest challenges facing DP-QPSK is the need for super-fast digital signal processing. The signal itself may be slower, but the receiver still has to sample and process it at very high speeds. For real 100G transmission, many believe digital signal processing must be at least two times faster than it is for 40G -- and without it, the industry is back to square one.

"QPSK with dual polarisation and coherent detection -- everything is just fine," says Milorad Cvijetic, vice president and chief technology strategist at NEC America Corp. (www.necam.com). "But what is missing is heavy digital signal processing. Without digital signal processing, we still face some issues, and, I would say, it won't be quite feasible to apply these three great technologies."

While vendors have prototypes available, mass production of DSP devices with that kind of processing power is still likely at least a year away.

Another challenge is photonic integration, one the OIF is directly tackling with a proposed implementation agreement (IA). The IA will define integrated receive and transmit optical components for DP-QPSK with an eye towards a possible future multi-source agreement. The organisation is also developing an IA around forward error correction (FEC).

For his part, Cvijetic agrees with the OIF's decision but also wonders what the future will hold. "The OIF initiative came at the right time to attempt to concentrate efforts in the direction that many of us believe is right, at least for now," he says. "The question is, will DP-QPSK be the endgame? I don't think so. Probably after some time -- and I'm talking about three to five to seven years -- we will find that there is the possibility, with heavy digital signal processing, that we can even go beyond QPSK to maybe APSK or OFDM."

The folks at Hitachi Telecom (www.hitel.com) are looking at amplitude and phase shift keying (APSK), which is a generic term for any modulation scheme with multiple amplitudes and multiple phases. They have been using a single laser with amplitude and phase modulation rather than dual polarisation.
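As an illustration of the general idea -- this is a hypothetical 8-point layout, not Hitachi's actual design -- two amplitude rings of four phases each yield three bits per symbol from a single laser:

```python
import numpy as np

# Hypothetical 8-APSK constellation: 2 amplitude rings x 4 phases = 8 points,
# i.e. 3 bits per symbol on one polarisation. (The incoherent receiver the
# article describes implies differential, not absolute, phase detection.)
rings = [0.5, 1.0]
apsk8 = np.array([r * np.exp(1j * (np.pi / 4 + k * np.pi / 2))
                  for r in rings for k in range(4)])
print(np.round(apsk8, 3))
```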

"Our research guys have been looking at a single laser and an incoherent receiver, so you don't have to have that oscillator at the receiver, which gets the cost way down," reports Scott Wilkinson, vice president of product management and system engineering at Hitachi Telecom. "If OFDM is nine to ten times more [expensive than 40G] and DP–QPSK is six–and–a–half to seven times more [expensive], then APSK has the potential of only being about three times more expensive. Now, three times more expensive than 40 Gig is still a bad thing," he says, "but you're talking about it being a half or less the cost of some of the other technologies."

While Wilkinson freely admits that the performance of APSK is "not as good" as DP-QPSK, "the question is, is it good enough?" he asks.

There is a contingent of folks in the industry who wonder the same thing about optical orthogonal frequency division multiplexing (OFDM). A relative newcomer to the 100G field, optical OFDM was the subject of an invited paper from KDDI at the recent ECOC Conference in Brussels. Results from KDDI's study indicate that optical OFDM may offer performance comparable to DP-QPSK, but others believe it will likely be more expensive to implement because it requires complicated electronics at both the receiver and the transmitter.
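Those complicated electronics stem from OFDM's core operation: an inverse FFT at the transmitter spreads the data across many low-rate subcarriers, and the receiver must run a matching FFT at line rate. A minimal sketch with illustrative parameters:

```python
import numpy as np

def ofdm_symbol(subcarrier_values, cp_fraction=0.125):
    """Build one OFDM symbol from per-subcarrier (e.g. QPSK) values.

    The IFFT turns N parallel low-rate streams into one time-domain
    waveform; the cyclic prefix absorbs dispersion-induced spreading.
    This FFT/IFFT pair at both ends is what drives up transceiver cost.
    """
    td = np.fft.ifft(subcarrier_values)
    cp = td[-int(len(td) * cp_fraction):]
    return np.concatenate([cp, td])

data = np.exp(1j * np.pi / 4) * np.ones(64)  # 64 subcarriers, all at 45 degrees
print(len(ofdm_symbol(data)))                # 72 samples: 64 + 8 cyclic prefix
```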

Still others are exploring alternative modulation formats not because they don't believe in DP-QPSK but because they're targeting a different application space. ADVA Optical Networking (www.advaoptical.com), for example, recently announced its participation in a European research project dubbed 100 Gigabit Ethernet Transport, or 100GET, and says it will take a leadership role in the definition of a metro-oriented 100G interface.

"Our objective is to provide something that is lower cost for shorter–reach applications typical with access or metro networks," says Paul Morkel, senior director of business management at ADVA Optical Networking. "We understand that the polarisation–multiplexed QPSK has very good performance, and it may be the optimum solution for long distance where you have a lot of transmission impairments. But we want to use the lowest modulation format, and we want to use currently available components."

For the record, Berthold himself notes that the OIF's DP–QPSK work is "driven by a distance objective somewhere in the 1,000- to 1,500-km range. This might not necessarily be the right decision if we were trying to go 30 km across a metro."

Morkel says the 100GET initiative is examining APSK formats that wouldn't necessarily work in long-haul networks. Multi-level amplitude shift keying may enable the transmission of more bits per symbol, but it is also more sensitive to the high fibre non-linearity that frequently characterises long-haul networks. "But that generally doesn't matter when you're talking about short distances," says Morkel.
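The non-linearity sensitivity has a simple origin: packing more amplitude levels between zero and full power shrinks the spacing between them, so intensity distortions that a two-level signal shrugs off can push one level into its neighbour. A sketch with a hypothetical 4-level mapping:

```python
import numpy as np

def ask4_modulate(bits):
    """4-level ASK: each 2-bit pair selects one of four equally spaced
    amplitudes, doubling bits per symbol but leaving only one third of
    on-off keying's level spacing to absorb distortion."""
    levels = np.array([0.0, 1/3, 2/3, 1.0])
    pairs = np.asarray(bits).reshape(-1, 2)
    return levels[pairs[:, 0] * 2 + pairs[:, 1]]

print(ask4_modulate([0, 0, 0, 1, 1, 0, 1, 1]))  # levels 0, 1/3, 2/3, 1
```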

For now, the efforts of organisations like the OIF and 100GET are attempting to jump-start research and investment in 100G despite difficult market conditions and technical challenges. Whether or not DP-QPSK wins the near-term race ultimately may be a moot point over the long run.

"When you look to optical communications five years, maybe 10 years [or even] 15 years [from now -- we will not recognise what is going on," says Cvijetic.

Meghan Fuller Hanna is senior editor at Lightwave.
