FTTH not free from urban legends

July 1, 2006

Our world is full of myths, half-truths, and shams, many of them promulgated via the Internet. With access to the far reaches of the globe at billions of fingertips, people can conspire at their computers and spread rumors that run rampant in the minds of the unsuspecting. And while e-mail jokesters and scammers sit at their desktops, the traffic they create may be feeding one of today’s greatest networking myths: that PON is always the best alternative for FTTH networking.

It’s true: even the telecom industry and outside plant (OSP) practitioners can get caught in the rumor mill. But now it’s time to debunk the legend that PON is the hands-down best approach for delivering broadband to the home.

PON began with the goal of reducing OSP complications, especially the need to supply power to OSP equipment. Imagine an OSP world where power had to be engineered only at the central office (CO) and the residence. The days of paying remote cabinet power bills, replacing batteries, and negotiating easements and rights-of-way would be over.

PON would appear to make this dream a reality. The architecture uses an optical platform in the CO, known as an optical line terminal (OLT), to transmit traffic to 16 to 64 residential end units, or optical network terminals (ONTs). Between the OLT and ONT are passive optical devices called splitters that can divide a single downstream transmission into multiple endpoint streams, as well as aggregate upstream traffic from multiple ONTs into a common stream going back to the CO. But, like most dreams, PON must face a wake-up call.
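
How that sharing works can be sketched in a few lines of Python. The model below is purely illustrative and not from the article: the ONT count, frame fields, and slot grants are invented, but they capture the two jobs a splitter enables, broadcast-and-select downstream and OLT-scheduled TDMA upstream.

```python
# Toy model of a PON splitter's role, purely illustrative: downstream
# frames are broadcast to every ONT (each filters on its own address),
# while upstream bursts stay collision-free via OLT-granted time slots.

ONTS = [f"ont-{i}" for i in range(4)]  # small split for readability

def downstream(frame: dict) -> list:
    """The splitter copies every downstream frame to all ONTs; each ONT
    keeps only frames addressed to it, modeled here by returning the
    ONTs that would accept the frame."""
    return [ont for ont in ONTS if frame["dest"] == ont]

def upstream(bursts: list) -> list:
    """The splitter passively combines upstream bursts; the OLT avoids
    collisions by granting each ONT its own time slot, modeled here by
    ordering bursts by slot."""
    return sorted(bursts, key=lambda b: b["slot"])

print(downstream({"dest": "ont-2", "payload": "video"}))  # ['ont-2']
print(upstream([{"slot": 1, "src": "ont-3"},
                {"slot": 0, "src": "ont-0"}]))
```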

Four significant problems stand in the way of PON’s selection as the ultimate optical OSP architecture. Each could lead a carrier to consider active or point-to-point optical networks, usually using a form of Ethernet technology, as alternatives.

New construction, commonly called “greenfield,” is the largest area for fiber deployment today. FTTH costs nearly the same as new copper deployment, so it becomes an easy choice in these situations, especially given the expected future demand for higher-bandwidth services. And then the first reality of PON sets in: the reality of OSP design. If PON is to be cost-effective, its splitters must achieve a high degree of utilization; the price of one OLT port and laser must be shared across 16, 32, or more endpoints. Fill fewer ports and the cost per home grows substantially. So when a developer breaks ground on a community and model homes start springing up, the telco creates a blueprint for service delivery.
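
The utilization arithmetic is worth seeing in miniature. In the sketch below, the OLT port cost is a hypothetical figure chosen only for illustration; the point is how sharply per-home cost rises as fill drops.

```python
# Hypothetical fill-rate arithmetic; the port cost is an invented
# figure, used only to show how per-home cost scales with fill.

OLT_PORT_COST = 1600.0   # assumed dollars per PON port (illustrative)
DESIGN_SPLIT = 32        # homes the splitter tree was engineered for

for homes_connected in (32, 16, 8, 4):
    per_home = OLT_PORT_COST / homes_connected
    print(f"{homes_connected:2d} of {DESIGN_SPLIT} homes lit -> "
          f"${per_home:,.0f} of port cost per home")
```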

Figure 1. Bandwidth requirements are expected to outstrip the current capabilities of BPON and GPON architectures.

But the plan deviates from the goal far too often. Some lots fill up more quickly. Other sections are left vacant. It takes longer to build than previously thought. In the end, the telco has used a lot of expensive equipment that sits without revenue or return, in effect becoming the optical “highway to nowhere.”

In existing suburbs, this problem takes a different form: service take rates are not spread evenly. Even the best multisplitter design can’t foresee which users will subscribe to the optical network’s services. Again, the drive toward full network utilization grinds to a standstill.

Some providers try to counter both forms of this reality with a modified design concept where the splitters are set in the CO and home run fibers travel from the CO to the house. Splitter ports can be used sequentially, but due to the optical loss budget of the lasers, homes have to be much closer to the CO than originally expected. The feasibility of serving the entire development is now put into question.

The adage that human beings can destroy any perfect system rings true at the core of the second reality: the reality of bandwidth requirements. Today’s users can’t seem to satisfy their appetite for Internet bandwidth. Remember back in 2000? A 56-kbit/sec dial-up modem could satisfy most users and make them feel adequately connected. Today, most people have raised their threshold of acceptability to somewhere between 1 and 3 Mbits/sec; at the midpoint, that is roughly 35 times more bandwidth, all in 5 or 6 years. If trends like this continue (and the circumstantial evidence is that bandwidth needs are increasing, not holding steady), a subscriber will want between 35 and 105 Mbits/sec in 2011.
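
That projection is simple to reproduce. The sketch below just reruns the article’s own arithmetic, deriving a growth multiplier from the midpoint of today’s range and applying it forward:

```python
# Back-of-envelope reproduction of the article's extrapolation; the
# only assumption is reusing the same growth multiplier it implies.

DIALUP_MBPS = 0.056                 # 56-kbit/sec modem, circa 2000
TODAY_LOW, TODAY_HIGH = 1.0, 3.0    # Mbits/sec, circa 2006

midpoint = (TODAY_LOW + TODAY_HIGH) / 2
growth = midpoint / DIALUP_MBPS     # roughly 36x over 5 to 6 years
print(f"Growth 2000 -> 2006: roughly {growth:.0f}x")

# Applying the same multiplier to today's range gives a 2011 estimate
# in line with the article's 35 to 105 Mbits/sec.
print(f"2011 projection: {TODAY_LOW * growth:.0f} to "
      f"{TODAY_HIGH * growth:.0f} Mbits/sec")
```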

Added to that is the changing nature of applications. Instead of leisurely surfing the Net, users now pursue a more active online experience: transferring high-resolution digital photos; participating in interactive, graphics-intensive gaming; and watching video clips or full-length productions, all while adding a second or third PC to the home network. These factors push traffic toward a much more symmetrical pattern, with users wanting upload speeds that approach their download speeds.

Technology Futures has conducted surveys of users’ high-speed Internet requirements over the next 6 to 7 years (see Figure 1). The results indicate that broadband PON (BPON) will likely be too slow as early as 2008 and that by 2010 Gigabit PON (GPON) will not satisfy many users, either. These conclusions hold even before adding the requirement to carry multiple channels of standard- or high-definition IPTV. Meanwhile, on the competitive front, cable operators and their DOCSIS standards are developing a 100-Mbit/sec high-speed Internet delivery mechanism.

The table shows the upstream rates available to each user on a fully split PON. In most cases, these rates fall well below anticipated requirements.
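
The gist of those numbers can be recomputed from standard line rates. In the sketch below, the upstream rates and maximum splits are standard values for BPON (ITU-T G.983) and GPON (ITU-T G.984), not figures copied from the table, and real throughput would be lower still after framing overhead:

```python
# Per-user upstream share at full split, computed from standard line
# rates; these rates and splits are assumptions, not the article's
# table, and TDMA framing overhead would reduce real throughput.

PON_SYSTEMS = {
    # name: (upstream Mbits/sec, typical maximum split)
    "BPON": (155, 32),
    "GPON": (1244, 64),
}

for name, (up_mbps, split) in PON_SYSTEMS.items():
    print(f"{name}: {up_mbps} Mbits/sec / {split} users = "
          f"{up_mbps / split:.1f} Mbits/sec per user upstream")
```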

Figure 2. As this graph illustrates, PON architectures are not necessarily the most cost-effective approach for delivering high-bandwidth services.

Some deployments try to work around this bandwidth limitation by lowering the number of users on a single splitter. For example, rather than deploy the full 64 users a 2.4-Gbit/sec GPON system can support, carriers are choosing to deploy only 24, yielding 100 Mbits/sec of downstream bandwidth per home. However, the per-home cost of the OLT port then runs roughly 2.7 times higher than planned, and some splitter capacity may be stranded. Either way, the workaround delivers too little bandwidth or too much cost.
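
The trade-off restated as arithmetic, using the figures above:

```python
# The 24-way GPON example as arithmetic; note that 64/24 works out
# closer to 2.7x than the 2.5x sometimes cited.

GPON_DOWN_MBPS = 2400   # ~2.4 Gbits/sec downstream
FULL_SPLIT = 64
REDUCED_SPLIT = 24

per_home = GPON_DOWN_MBPS / REDUCED_SPLIT
cost_multiplier = FULL_SPLIT / REDUCED_SPLIT

print(f"Downstream per home at 1:{REDUCED_SPLIT}: {per_home:.0f} Mbits/sec")
print(f"OLT port cost per home vs. full split: {cost_multiplier:.1f}x")
```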

The cost of operating and maintaining a network usually far outweighs the price of initial deployment. That leads to the third reality: the complexity of troubleshooting.

At its foundation, PON is an intricate optical TDM architecture in which multiple users share portions of the fiber. Today’s more sophisticated optical diagnostic tools can verify fiber integrity, but it is next to impossible to troubleshoot a problem in the payload, particularly one affecting an individual endpoint or a group of endpoints.

PON thus demands new capital equipment for troubleshooting and a substantial investment in training for the carrier’s staff. Complexity costs.

Most of our networks are already in place, and they do not fall into the “greenfield” category. Services run over copper infrastructure, complete with its lovable cabinets, rectifiers, crossconnects, batteries, easements, etc. If the network went purely passive optical, all of this equipment would have to be abandoned and, in many communities, removed outright. Thus, the final reality: the reality of the installed base.

A 2003 study from Carnegie Mellon explored the overall costs of deploying three types of optical networks: passive, active home run, and active star. (An active star uses remote cabinets for the OLTs.) The research suggested that the most important determinant in choosing an optical technology should be subscriber density. A recent update to this research shows that combining subscriber density with the degree of reuse of OSP equipment, specifically established cabinets with pre-existing easements, often makes a new PON more expensive than an active star (see Figure 2). Given the higher degree of reuse possible with an active star, as well as its ability to use existing optical access rings, this result makes sense.
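
To see how such a crossover can arise, consider a toy cost model, entirely invented and not the study’s actual figures: each architecture carries a fixed cost per serving area plus a per-subscriber cost, with the active star’s fixed cost discounted to reflect cabinet and easement reuse.

```python
# A toy crossover model with invented figures, not the Carnegie Mellon
# study's numbers: fixed cost per serving area amortizes over the
# subscribers actually served, so density drives the comparison.

def cost_per_sub(fixed_per_area: float, per_sub: float, subs: int) -> float:
    """Amortized cost per subscriber for one serving area."""
    return fixed_per_area / subs + per_sub

for subs in (50, 200, 800):
    pon = cost_per_sub(40_000, 700, subs)   # hypothetical new-build PON
    star = cost_per_sub(25_000, 750, subs)  # hypothetical reuse-heavy star
    print(f"{subs:3d} subs/area: PON ${pon:,.0f} vs. active star ${star:,.0f}")
```

Where the crossover lands depends on the reuse discount and the density, which is the study’s point: no single technology wins everywhere.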

Urban legends take hold when people skip investigation on the way to a conclusion. If you are planning to deploy an optical network, examine the claims of all three optical approaches against your network, customers, and services. Weigh factors such as deployment design, subscriber trends, and operational costs before settling on an approach; only then can you make the most informed decision for your network.

Russ Sharer is vice president of sales and marketing at Occam Networks (www.occamnetworks.com). He can be reached at [email protected].

*Note: Copies of the Carnegie Mellon study can be found at www.occamnetworks.com.
