Though HTTP adaptive streaming was originally devised for over-the-top (OTT) applications, it could end up transforming video delivery as the cable operators now know it. Indeed, it could become the dominant mode of video delivery in the network, according to panelists on a Cable Show panel entitled "Managing QoE in a Changing Video Environment."

"It is difficult to say how long it will be until the world flips upside down, but in three to five years, a lot of video consumed will be delivered [via adaptive streaming], even over a managed network," Michael Adams, vice president of software strategy for Ericsson Television, told BTR prior to the session.

Adaptive bit rate (ABR) streaming is the transmission of the optimum number of bits based on network conditions and the nature of the end user device. It fits the goal the industry has been working toward: a single network infrastructure for all services, Adams said. The "connection-oriented bandwidth reservation system" needed for today's transport is not used in adaptive streaming, which allows for more flexibility and rapid deployment of services, he wrote in the paper he presented during the session.

Moving from a reservation-based system to an ABR approach resembles voice communication's transition to VoIP: In legacy phone networks, a set amount of bandwidth was reserved for each call. Now, bandwidth is used as needed. That saves capacity, but it is more complex and, if not done correctly, more prone to error. "[But] today with constant bit rate, MPEG-2, as long as the network is working, video will look fine," Adams explained. "With adaptive streaming, the network is working, but the video [could be] bouncing between two profiles, and people are seeing effects."

More Devices, More Content Consumption

The result can be quality of experience (QoE) challenges. Despite this, the format appears to be here to stay, simply because broadband speeds continue to ramp up and the number of devices (tablets, PCs, etc.) that support adaptive bit rate streaming continues to grow. More important, consumers are increasingly using them to watch video.

A study sponsored by Sandvine and presented during the panel discussion showed a 63% increase over 2010 in total visits to 2011 NCAA March Madness on Demand, carried on three websites. Additionally, 13.7 million hours of streaming video of the event were viewed via iPads and iPhones this year. Subscribers watched the games primarily on their PCs, but afterward went to handhelds for highlights or replays, according to the report. The effect made traffic heavier than it had been in previous peak times and extended the periods defined as peak.

"Three to four years ago, half of traffic was web browsing. Now traffic is [more] real-time video," said Matt Tooley, vice president of consulting solutions for Sandvine. "[Things] have shifted from best effort being good enough to [people] clicking and expecting [content] now and expecting it to work."

That's a tall order in the nascent world of adaptive streaming. In this environment, content is pulled from the network by the client. Video is broken into chunks of between two and 10 seconds, and the same file is encoded at several bit rates. The client judges the amount of available bandwidth at a given moment and switches on the fly to the right one.
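In rough terms, that client-side logic looks something like the sketch below. It is a minimal illustration, not any vendor's actual player: the bit rate ladder, the safety factor, and the throughput estimator are all assumptions made for this example.

```python
# Minimal sketch of client-side adaptive bit rate selection.
# Ladder values and the estimator are illustrative, not from a real player.

BITRATE_LADDER_KBPS = [400, 800, 1500, 3000, 6000]  # one encoding per rung

def estimate_bandwidth_kbps(chunk_bytes: int, download_seconds: float) -> float:
    """Estimate throughput from the last downloaded chunk."""
    return (chunk_bytes * 8 / 1000) / download_seconds

def pick_bitrate(estimated_kbps: float, safety_factor: float = 0.8) -> int:
    """Choose the highest rung that fits within a safety margin of the
    measured bandwidth; fall back to the lowest rung if nothing fits."""
    budget = estimated_kbps * safety_factor
    candidates = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return max(candidates) if candidates else BITRATE_LADDER_KBPS[0]

# Example: a 2 MB chunk that took 1.5 s suggests roughly 10.7 Mbps,
# so the client requests the next chunk at the 6,000 kbps profile.
kbps = estimate_bandwidth_kbps(2_000_000, 1.5)
print(pick_bitrate(kbps))
```

Real players layer buffering and smoothing on top of a raw throughput estimate, which is part of why video can end up "bouncing between two profiles," as Adams put it, when the estimate hovers near a rung boundary.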
"Instead of one session, you have to track thousands of really short sessions delivered through the network," noted Goran Appelquist, vice president of product management and pre-sales for Edgeware.

So, What's an MSO to Do?

Appelquist suggested introducing a session concept on top of the adaptive streaming format and aggregating the data at the video server level. "Aggregating data at something like a minute resolution would reduce the amount of data, but still give enough detail to provide quality," Appelquist said. "You can go down to the video session to see what sort of quality was actually delivered."

Averaging the bit rate is not enough. A quality index is necessary to take different devices and networks into account and to separate different use cases when determining a customer's experience. "A customer that receives the average bit rate through a [program] will be different than one [who] receives a high bit rate for half [the program] and a low bit rate for the other half," Appelquist said.
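A minimal sketch of that idea follows, assuming a chunk-level log as input. The per-minute rollup and the quality index formula are illustrative inventions, chosen only to show why a session that bounced between profiles should score lower than one that held the same average rate steadily.

```python
# Sketch of per-minute session aggregation plus an illustrative quality
# index. The index penalizes downward profile switches, so a bouncing
# session scores below a steady one with the same average bit rate.

from collections import defaultdict

def aggregate_per_minute(chunk_log):
    """chunk_log: iterable of (session_id, timestamp_s, bitrate_kbps).
    Returns one compact record per session-minute instead of per chunk."""
    minutes = defaultdict(list)
    for session_id, ts, kbps in chunk_log:
        minutes[(session_id, int(ts // 60))].append(kbps)
    return {key: sum(v) / len(v) for key, v in minutes.items()}

def quality_index(bitrates_kbps, top_rung_kbps=6000):
    """Illustrative score in [0, 1]: delivered rate relative to the top
    profile, minus a penalty for every downward profile switch."""
    avg = sum(bitrates_kbps) / len(bitrates_kbps)
    downshifts = sum(1 for a, b in zip(bitrates_kbps, bitrates_kbps[1:]) if b < a)
    return max(0.0, min(1.0, avg / top_rung_kbps - 0.05 * downshifts))

log = [("s1", 3.2, 3000), ("s1", 7.4, 3000), ("s1", 64.0, 1500)]
print(aggregate_per_minute(log))       # {('s1', 0): 3000.0, ('s1', 1): 1500.0}

# A steady session and a bouncing session with a similar average rate:
print(quality_index([3000] * 8))       # steady: 0.50
print(quality_index([6000, 400] * 4))  # bouncing: ~0.33, scores lower
```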
Multiple Monitoring Locations

Fellow panelist Jeremy Bennington, general manager and vice president of Cheetah Technologies, explained that monitoring should occur at multiple points: before the file is segmented, after it is turned into streaming content in the cloud, at the edge of the cloud where the content delivery network hands it off to the carrier, and close to the CMTS. That last point is where the downshift in quality will occur if the network is constrained.

Operators are looking to deploy QoE tools that emulate devices on the network, essentially pretending to be a customer, in order to determine the quality of the content being viewed. "Like a canary in a coal mine," Bennington observed.

The last step would be to put the monitoring technology in consumer devices. "It would only send information if there was poor quality," Bennington said. "If you have [the technology] on a million devices, you really only care about the thousand that are messed up. There would have to be additional intelligence so you are not flooded with information."
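A sketch of such an exception-only reporter appears below. The thresholds, the report fields, and the send() hook are assumptions for illustration, not Cheetah's product.

```python
# Sketch of an on-device monitor that stays silent while playback is
# healthy and phones home only when quality drops, so a million devices
# do not flood the collector. Thresholds and format are assumptions.

import json
import time

POOR_BITRATE_KBPS = 800    # assumed floor for acceptable quality
MAX_STALLS_PER_MIN = 1     # assumed rebuffering tolerance

def maybe_report(device_id, avg_bitrate_kbps, stalls_last_minute, send):
    """Call once a minute; invokes send() only when quality is poor."""
    if avg_bitrate_kbps >= POOR_BITRATE_KBPS and stalls_last_minute <= MAX_STALLS_PER_MIN:
        return  # healthy: send nothing
    send(json.dumps({
        "device": device_id,
        "ts": int(time.time()),
        "avg_kbps": avg_bitrate_kbps,
        "stalls": stalls_last_minute,
    }))

# Example: only the degraded device generates traffic.
maybe_report("stb-001", 3000, 0, print)  # silent
maybe_report("stb-002", 450, 3, print)   # reports
```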
Monta Monaco Hernon is a freelance writer. She can be reached at [email protected].