Edge data centers deliver the compute and network resources demanded by increasingly video-centric consumers and by Internet of Things (IoT) businesses that need local compute.
How we got here: consumer demand
Internet traffic is booming, driven largely by video-centric consumers using smartphones. In 2021 there will be 127 times more global Internet traffic than there was in 2005, reports Cisco. So much of that traffic is video that “IP video traffic will be 82 percent of all consumer Internet traffic by 2021, up from 73 percent in 2016,” and up from just 12 percent in 2006.
It’s notable that the smartphone was introduced in 2007. Mobile devices have created an environment where consumers expect to have the information they want, anywhere, anytime. Increasingly, the information they want is video, with 60% of all video clips consumed on mobile devices, according to Ooyala.
The proliferation of video streaming – over-the-top (OTT) video – seems to be an unstoppable trend: Netflix alone accounts for 40% of peak Internet traffic in the U.S., reports Statista. With more than 50 million US subscribers, Netflix exceeds all the top cable companies’ subscriptions combined.
Social media plays a big part in video consumption today. Not so long ago, Facebook, Twitter, and Instagram didn’t exist. Now video-centric social media proliferates.
Download speeds and video buffering are largely solved
Insatiable video consumption once meant frequent lag and buffering, but changes to Internet infrastructure have largely fixed that. A big improvement came from content companies using content delivery networks (CDNs), which let them connect their content directly to local and regional ISPs, bypassing slow stretches of the Internet.
On top of that is $1.6 trillion of investment in U.S. telecom infrastructure over the last 20 years, according to USTelecom. Wireline, wireless, and cable companies have pushed fiber deep into the network, splitting nodes, moving them closer to homes, and laying fiber to nodes and homes. In the last several years we’ve seen many more local connections that bypass the big Internet pipes where latency builds up, which has dramatically improved the video experience for most consumers.
Edge data centers: robust, local connectivity
Download speeds have improved, but upload speeds need to catch up in an environment where consumers generate and upload their own videos, including live streams. Making YouTube videos and engaging with Twitter Live, Instagram Stories, and Facebook Live are interactive experiences that require faster upload speeds. Edge data centers give social platforms better network connections closer to their users.
As a country, the U.S. lags on Internet speeds, and we’re even further behind on upload speeds, which require fiber to the home. Unless you have fiber directly to your home (the trend, but not yet available everywhere), your upload speeds won’t match your download speeds. Faster upload speeds are needed for changing consumer usage patterns.
Edge data centers: IoT use case for edge computing
The Internet of Things is another trend leading businesses to seek edge data center resources. IoT is inherently local and distributed: IoT devices gather information and act on it locally. In many cases, it doesn’t make sense to send that data far away for compute and then send the results back, incurring latency along the way. In addition to rich fiber in the last mile, compute and networking power are needed at the Internet’s edge.
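To see why distance matters for IoT decisions, consider a rough back-of-the-envelope sketch. The distances and the fiber propagation speed below are illustrative assumptions, not measurements of any particular network, and real networks add queuing and routing delays on top of this best case:

```python
# Illustrative sketch: best-case round-trip propagation delay over fiber.
# Distances and speeds here are assumptions for illustration only.

SPEED_OF_LIGHT_KM_S = 299_792                    # km/s in a vacuum
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3   # light travels roughly 2/3 c in fiber

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay over fiber, in milliseconds."""
    return (2 * distance_km / FIBER_SPEED_KM_S) * 1000

# A device 1,500 km from a centralized data center vs. 50 km from an edge site:
print(f"central: {round_trip_ms(1500):.1f} ms")  # ~15 ms before any queuing or processing
print(f"edge:    {round_trip_ms(50):.2f} ms")    # ~0.5 ms
```

Even in this idealized model, moving compute from a distant data center to a nearby edge site cuts round-trip delay by an order of magnitude, and the gap only widens once real-world congestion is factored in.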
TierPoint data centers deliver edge computing and connectivity
TierPoint helps companies create good edge networking and edge compute paths to analyze data being generated at the Internet’s edge. For example, we help social media platforms move content closer to users and find good networking interconnections to reach the last mile.
We also help Internet of Things (IoT) businesses with compute needs where local data must be analyzed and a decision made locally, without incurring the latency of a round trip to a distant data center.
We’re building more data centers, sized to accommodate our current regional customer base and new customers with latency sensitivities and last-mile connectivity requirements. Our edge data centers have the robust network connections that content companies need to reach their end users. The size of these edge data centers also drives economies of scale in power and cooling that benefit all customers, including businesses that need local compute to cut latency. And, of course, these data centers are highly secure.
Conclusion: the move to the network edge
Gartner said recently, “Currently, around 10% of enterprise-generated data is created and processed outside a traditional centralized data center or cloud. By 2022, Gartner predicts this figure will reach 50%.” Today, most compute happens far from where the activity is taking place. In the future, that will change: much more compute will happen at the network’s edge, avoiding the delays of sending massive amounts of data to distant places.
Recently, TierPoint’s President and CFO, Mary Meduski, participated in a dynamic panel discussion at SXSW called How Connectivity Will Control Everything We Know. The panelists tackled big topics like the future of IT infrastructure and the edge of the network and cloud. Check out the recording of the full session to hear more about the edge.
Dominic Romeo is a Senior Product Manager and is responsible for all things network-related at TierPoint. In addition to helping create new products and answering detailed questions to tackle specific issues, he gathers feedback from customer interactions to guide product improvements and create new solutions to meet evolving business needs.