The Dream of Mobile Content Delivered via HQME
The fear that video will crush cell phone networks as people casually scan YouTube clips on the street or stream Netflix (NFLX) movies from their iPads is forcing mobile operators, entertainment companies, and electronics makers to rethink their networks. But the entertainment industry, SanDisk (SNDK), and mobile operators are also dreaming up a new standard that would put top-ranked content on the handset even before the user requests it.
Right now most people turn to Wi-Fi because it's cheaper and faster than cellular networks, although those surfing on Verizon's (VZ) LTE network may no longer worry about speed. There's still the cost, however: streaming an HD movie over a cellular network will eat through your wallet. Under current Verizon plans, five hours of HD streaming would consume the entire $50, 5GB plan, leaving users to pay $10 more for every gigabyte over. So for long-form content, Wi-Fi is where it's at. But even short-form content—such as hot YouTube videos—takes its toll on the network.
That's one reason SanDisk has teamed up with Softbank, Sony (SNE), and Orange (FTE) to create a standard to deliver content to handsets via Wi-Fi. The standard, known as HQME—or high-quality mobile entertainment—is being debated in the IEEE, an association of global tech professionals. Noam Kedem, a vice-president of marketing at SanDisk helping lead the HQME charge, explains that an executive at an international operator told him that if someone could deliver the top 100 YouTube videos via Wi-Fi, that would cut down on 80 percent of the problems on his network.
I wrote about the HQME standard a few weeks back and explained that it would automatically detect when a mobile device hits a Wi-Fi network (and is plugged in), then begin downloading content such as movies or e-books for later consumption, while seemingly keeping digital rights management and subscription information intact. I had some issues with this, namely that most people consume short-form, unplanned content over their mobile networks, so a predetermined download wouldn't help all that much. But I didn't have the full story.
Kedem explained that the technology ideally would work from a Netflix queue to download content a user is likely to want. For example, if I'm watching the second episode of Downton Abbey before I go to bed, it might preload episode three over my Wi-Fi connection so I can take it on the go the next morning. This assumes the content owner is willing to let me store content that was licensed for streaming—a big if. In another example, Kedem explains that an operator could cache the top 100 YouTube videos on a handset or tablet whenever the device encounters a Wi-Fi network using the HQME standard. Then if someone sent the handset's owner a viral video, the clip would likely already be cached on the device. Again, this is only an example; it would require help from YouTube and would likely irritate content owners whose copyrighted material tends to show up in popular YouTube videos.
This intelligent, predictive caching model also depends on phones and tablets having room to store a few gigabytes of video—something that isn't exactly feasible on most of today's handsets. Kedem says the amount of onboard storage might change (he notes the cache could use the phone or tablet's SD card, which might well contain SanDisk memory), or users could set limits on how much of their storage is dedicated to caching. But is that something most users would want, or even know how to do, in the "post-PC" era?
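To make the mechanics concrete, here is a minimal sketch of the kind of client-side policy described above—prefetch only on Wi-Fi while charging, and stay under a user-set storage quota. The names and structure are illustrative assumptions, not part of the actual HQME specification:

```python
from dataclasses import dataclass

@dataclass
class Device:
    """Hypothetical device state an HQME-style agent would inspect."""
    on_wifi: bool
    plugged_in: bool
    cache_quota_bytes: int   # user-configured cap on cache storage
    cache_used_bytes: int = 0

def eligible_to_prefetch(dev: Device) -> bool:
    # Per the article: only prefetch over Wi-Fi while the device is charging.
    return dev.on_wifi and dev.plugged_in

def prefetch(dev: Device, predicted: list[tuple[str, int]]) -> list[str]:
    """Cache predicted items (name, size in bytes) until the quota is hit.

    `predicted` stands in for whatever the service supplies—a Netflix
    queue, an operator's top-100 YouTube list, and so on.
    """
    fetched: list[str] = []
    if not eligible_to_prefetch(dev):
        return fetched
    for name, size in predicted:
        if dev.cache_used_bytes + size > dev.cache_quota_bytes:
            continue  # skip items that would exceed the user's cache limit
        dev.cache_used_bytes += size  # in reality: download and store the file
        fetched.append(name)
    return fetched
```

The interesting design question the article raises lives in `predicted` and `cache_quota_bytes`: who ranks the content, and whether ordinary users would ever configure a quota themselves.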
While the hurdles to adoption on both the consumer side and the content side seem large, the threatened deluge of mobile video is frightening enough that operators are trying new ways of building out their networks, selling new data plans, and even pushing open standards to keep those networks afloat. Even if the HQME standard doesn't come to fruition, predictive content caching is an idea worth watching.