Imagine this. You’re planning to migrate your ERP applications to the cloud, and you’re considering two different data centers. One is located in Nashville, near the offices that house most of your administrative staff. The other is in Omaha.
You don’t have offices in Omaha, but you have several manufacturing facilities scattered throughout the Great Plains states, including Kansas, Oklahoma, and Nebraska.
The two data centers seem to offer comparable security, resiliency, and capacity. The Nashville data center has more customers, but it still has plenty of room to expand. The Omaha data center is smaller, but it is by no means a “fly-by-night” operation. One is considered an edge data center and the other isn’t.
Which one do you choose?
What is an edge data center?
An edge data center is a facility that handles data processing and computing physically closer to your end-users or devices. By moving processing closer to your users or devices, you reduce latency and improve application performance. An edge data center is an alternative to a micro data center; both are extensions of edge computing.
What are your IT infrastructure priorities?
Many organizations will gravitate toward Nashville. After all, it’s closer to the center of their IT operations, making it easier for those responsible for infrastructure strategy to keep an eye on things. That’s sound reasoning, but you also need to look at the decision from your end-users’ point of view.
In the simple scenario above, the administrative users in the Nashville office use productivity applications like Microsoft Office 365 and a variety of financial applications. The operational users in the manufacturing facilities rely on applications like ERP, configuration and quoting systems, warehouse management, and the like. If the organization’s supply chain is digitally integrated, end-users may also include customers and vendors.
Arguably, the operational users will be more affected by delays. If you’ve never looked closely at an application like MRP (material requirements planning) or supply chain operations, suffice it to say that their complex computations can be extremely data-intensive.
You might be thinking: so what if my production planners in Nebraska have to wait an extra 30 milliseconds for a screen to come up, or if an MRP run takes a little longer to finish? Irritating, yes, but not the end of the world. Yet.
Business is becoming much more real-time. No matter what industry you’re in, chances are good you’re seeing old processes and protocols transformed as computing power and the internet let you do things that would have been impossible just ten to fifteen years ago. When you’re competing with more digitally savvy organizations, those extra 30 milliseconds of wait time can make a difference, especially when it’s your customer doing the waiting.
The Internet of Things (IoT) is accelerating this transformation as well. Many manufacturers use IoT devices to automate a wide array of manufacturing tasks, including ones that are hazardous to humans.
Also read: Private Cloud in Edge Data Centers: Better for Customers
Many organizations are also exploring Artificial Intelligence (AI) to determine how it can improve the customer experience while decreasing costs. The biggest challenge may not be the capabilities of AI so much as the average person’s willingness to communicate with a computer program mimicking a human. If that program must wait for a response from a central data center before it can reply to something the customer said, the interaction is going to feel cumbersome.
Edge data centers reduce latency, satisfy the need for speed
The business of business is changing in many ways thanks to technology. However, each of these transformative innovations depends on low latency.
Put simply, latency is the time you spend waiting for a request to travel from your PC to the data center and for the response to come back. The more data-intensive your request (refer to any of the examples above), the more your end-users feel the impact of latency.
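One practical way to compare two candidate facilities is to measure round-trip time from where your users actually sit. Here’s a minimal Python sketch that times a TCP handshake against each site; the hostnames are hypothetical placeholders, and the approach assumes each facility exposes something reachable on port 443:

```python
import socket
import time

def tcp_round_trip_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Time a TCP handshake to a host several times; return the best sample in ms."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        # create_connection completes the TCP handshake before returning
        with socket.create_connection((host, port), timeout=5):
            pass
        timings.append((time.perf_counter() - start) * 1000)
    # The fastest sample is the closest approximation of pure network round-trip time
    return min(timings)

# Hypothetical hostnames for illustration -- substitute your own candidate facilities
for host in ("erp.omaha-dc.example.com", "erp.nashville-dc.example.com"):
    print(f"{host}: ~{tcp_round_trip_ms(host):.1f} ms")
```

Taking the best of several samples filters out noise from queuing and retransmissions, so it’s a reasonable stand-in for the network round-trip time your users will actually experience.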
This is where the edge data center comes in. There are many ways to decrease latency: faster processing, lower resource utilization, direct connections that bypass the public internet, and so on. But the distance your data has to travel is one of the primary factors.
Putting a data center close to the end-user reduces that distance, and with it the time your users spend waiting for the system to process their requests. That is what makes a data center an “edge” data center: it sits closer to your end-users.
As TierPoint Director of Product Management Dominic Romeo explains, “When you’re inside a 50-mile radius, latencies get really, really low. The time it takes for the end-user to send a command to the server and for the server to come back with a response is in the neighborhood of single-digit milliseconds versus double- or triple-digit milliseconds of round-trip time. That can have a tremendous impact on productivity and the customer experience.”
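A quick back-of-envelope calculation shows why. Light moves through optical fiber at roughly 200,000 km per second, so distance alone sets a floor under round-trip time. The sketch below is purely illustrative; the route factor is an assumption to account for fiber paths being longer than straight lines, and real-world latency adds routing, queuing, and processing on top of this floor:

```python
# Light travels through optical fiber at roughly 200,000 km/s (about 2/3 of c),
# i.e., roughly 200 km of fiber per millisecond, one way.
FIBER_KM_PER_MS = 200.0

def round_trip_floor_ms(distance_km: float, route_factor: float = 1.5) -> float:
    """Lower bound on round-trip time from propagation delay alone.

    route_factor (an assumption) pads straight-line distance because fiber
    routes are never straight; real latency adds routing, queuing, and processing.
    """
    one_way_ms = (distance_km * route_factor) / FIBER_KM_PER_MS
    return 2 * one_way_ms

for label, km in [("~50-mile edge radius", 80), ("regional hop", 800), ("cross-country", 2400)]:
    print(f"{label} ({km} km): >= {round_trip_floor_ms(km):.1f} ms round trip")
```

At roughly 50 miles (about 80 km), the propagation floor sits around a millisecond, which is why single-digit round trips are realistic at the edge, while a cross-country hop starts in the tens of milliseconds before a single byte is processed.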
Also read: Key Considerations for Edge Computing Deployments
Your users might benefit from being closer to the edge
It isn’t the location of your data center that puts it at “the edge.” It’s the location of your end-users. For operational users in the central Midwest, a data center in Omaha may offer the best user experience and better handle increased demand. You might even consider splitting your workloads, housing some in Omaha and some in a data center closer to your home office in Nashville.
One of TierPoint’s advantages is our network of data centers in tier 2 and tier 3 markets that sit farther from the major public cloud data centers. We can even help you leverage hyperscalers like AWS and Azure, and we offer colocation data center services, where you house your own equipment in our facilities.
You can schedule a tour of one of our data centers or reach out to us with additional questions. We’re always happy to help businesses find their edge.