Streaming HD video may be clogging up the last mile in homes, but in an enterprise setting it's not Vin Diesel flicks that are the problem—it's larger and more important data being stored in the cloud. Medical records containing radiographic scans or genomic data for cancer research are transferred from corporate offices and university connections over the long-haul network. These records can consist of terabytes of data that need to travel to cloud storage vendors. Each terabyte contains the equivalent of 100 HD movies at 10 GB each. This massive data migration could drive the deployment of faster broadband networks that will benefit everyone.
Enterprise last-mile networks generally rely on faster, dedicated connections than those in our homes. The common corporate link to the outside world, a T-1 line, offers speeds of 1.5 Mbps and maxes out at about 15 GB of data per day. According to Geoff Tudor, founder and senior vice-president for business development and product strategy at cloud storage company Nirvanix, if one assumes a corporate employee generates 3 MB to 5 MB of data per day, then once you get past 300 employees sending their files to off-site cloud storage for backup, the T-1 is tapped out. Even over one of the fastest telecommunications options, an OC-48 line running at about 2.5 Gbps, it still takes about an hour to send 1 TB of data.
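The figures above can be checked with simple back-of-the-envelope math. A minimal sketch, using the article's round line rates (not exact SONET payload rates) and decimal units (1 GB = 10^9 bytes):

```python
# Back-of-the-envelope link arithmetic for the article's figures.
# Line rates are the round numbers quoted in the text, not exact
# SONET payload rates; units are decimal (1 GB = 1e9 bytes).

def daily_capacity_gb(line_rate_mbps: float) -> float:
    """GB a link can move in 24 hours at full, sustained utilization."""
    bits_per_day = line_rate_mbps * 1e6 * 86_400
    return bits_per_day / 8 / 1e9

def transfer_time_minutes(size_tb: float, line_rate_gbps: float) -> float:
    """Minutes to push size_tb terabytes through a line_rate_gbps pipe."""
    bits = size_tb * 1e12 * 8
    return bits / (line_rate_gbps * 1e9) / 60

print(f"T-1 at 1.5 Mbps: ~{daily_capacity_gb(1.5):.0f} GB per day")
print(f"1 TB over OC-48 at 2.5 Gbps: ~{transfer_time_minutes(1, 2.5):.0f} minutes")
```

A fully saturated T-1 works out to roughly 16 GB a day (the article's "about 15 GB"), and 1 TB over a 2.5 Gbps pipe takes about 53 minutes, which matches the "about an hour" figure.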
After getting through the last mile, data travel over the long-haul networks crisscrossing the country, which are currently being upgraded from 10 Gbps to 40 Gbps. Even the slower long-haul links are 4 to 60 times as fast as last-mile connections, but they're not fast enough for the far more demanding data sets of scientific computing. Jay R. Boisseau, director of the Texas Advanced Computing Center (home to the Ranger supercomputer), worries that high-performance computing, which deals with petabytes of data, will be left in the slow lane as providers upgrade their long-haul networks with an eye toward the less demanding bandwidth needs of consumers and enterprises. When I asked about the move from 10 Gbps to 40 Gbps on long-haul networks, Boisseau scoffed, "Great, now it will take me one day instead of four to move my data sets."
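Boisseau's complaint is easy to see with the same kind of arithmetic. A sketch, using an illustrative 1 PB data set (the article does not give his actual data-set size) and ideal, fully saturated links:

```python
# HPC-scale transfer times at the two long-haul rates the article cites.
# The 1 PB data-set size is an illustrative assumption, not a figure
# from the article; links are assumed fully saturated.

def transfer_days(size_pb: float, line_rate_gbps: float) -> float:
    """Days to move size_pb petabytes over a line_rate_gbps link."""
    bits = size_pb * 1e15 * 8
    seconds = bits / (line_rate_gbps * 1e9)
    return seconds / 86_400

for rate_gbps in (10, 40):
    print(f"1 PB at {rate_gbps} Gbps: ~{transfer_days(1, rate_gbps):.1f} days")
```

A petabyte takes about nine days at 10 Gbps and a bit over two days at 40 Gbps. Whatever the data-set size, the upgrade only cuts the wait by a factor of four, which is exactly the "one day instead of four" Boisseau is grumbling about.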
But enterprise adoption of all things cloud may have a silver lining for Boisseau and the HPC set, as enterprises start sending their own terabytes of data to cloud storage providers. Recently, Nirvanix won a contract to store 240 TB of NASA moon imagery data, and Tudor thinks that's just the beginning of a trend toward terabyte and even petabyte data transfers. Nirvanix has a 1 Gbps connection from its data centers to the Web, which Tudor says is kept pretty full even before NASA's bits and bytes start coming in. The company is now in talks with carriers to provide cloud storage and build out bandwidth to meet the network demands that sending fat files over the Net creates.
Getting carriers involved does more than add bandwidth providers to the mix; companies such as AT&T, Level 3, and Verizon have the trust of corporate customers when it comes to storing data securely and reliably, and that will help enterprises trust clouds for data storage. Verizon is even trying to get a company to build superfast long-haul network equipment to boost bandwidth. For enterprises, the next issue is price. Tudor estimates that with corporate data growing at 30% to 60% a year, in-house storage will become too expensive for enterprises to maintain. If that happens, bandwidth providers will suddenly have both the demand and customers willing to pay for lightning-fast pipes, and science, storage clouds, and even your own Web-surfing experience could benefit.