How to Leverage True Edge Flexibility and Overcome Operational Challenges

Nov. 5, 2018
As more users connect to the cloud and request content-heavy data, utilizing the edge for fast delivery makes complete sense. Learn more about the potential and benefits of true edge flexibility and how to tackle the challenges involved.

This is the third entry in a four-part Data Center Frontier special report series that explores edge computing from a data center perspective. This post covers the potential and benefits of true edge flexibility and how to tackle the challenges involved. 

Download the full report.

Many organizations are now looking for better ways to deliver rich content to widely distributed users. We’re seeing even more companies push applications, desktops, and various services out to rural locations. A big challenge here revolves around performance and user experience. After all, just because an application can be delivered doesn’t mean it’s being delivered efficiently.

This isn’t limited to rural locations. Edge can live in urban environments where network resources are constrained or slow (i.e., where the number of peering hops is too high). For example, a content provider in Boston may need to peer into New York City to reach a major hop point.

Either way, this was the ultimate challenge when it came to cloud computing. As more data and services hit traditional cloud systems, real inefficiencies emerged in trying to stream and work with all of that data from a centralized cloud ecosystem. Organizations needed a better way to process this data.

As more users connect to the cloud and request content-heavy data, utilizing the edge for fast delivery makes complete sense. Gartner recently stated that emerging technologies require revolutionizing the enabling foundations that provide the volume of data needed, advanced compute power, and ubiquity-enabling ecosystems. The shift from compartmentalized technical infrastructure to ecosystem-enabling platforms is laying the foundation for entirely new business models that form the bridge between humans and technology.

“When we view these themes together, we can see how the human-centric enabling technologies within transparently immersive experiences — such as smart workspace, connected home, augmented reality, virtual reality and the growing brain-computer interface — are becoming the edge technologies that are pulling the other trends along the Hype Cycle,” said Mike J. Walker, research director at Gartner.

Edge flexibility, design, and overcoming challenges

The entire concept of the edge is to serve users and services based on proximity. Because of that, edge computing offers tremendous benefits in how you deploy solutions and manage data.

Today, you can deliver modular edge data center infrastructure solutions that provide standardized deployment options. This gives you the flexibility and capability to meet the demands of compute today and beyond.

From the customer’s perspective, edge computing can be any service or architecture that helps you simplify and localize the delivery of applications, data sets, and services. These services help you gain more control over your WAN, your bandwidth requirements, and how rich content is delivered. The future absolutely looks to be far more interconnected, with users more widely distributed. And, with the influx of new data, the edge will be even more important.

This means that edge design is flexible and specifically caters to high-performance and latency-sensitive applications. The really cool part here is that you can control how data flows throughout your entire edge ecosystem, secure the processing of that data, and still positively impact the user experience.

That said, deploying edge can have its challenges. Remember, edge solutions aren’t just ‘another data center site.’ They’re smaller, use-case-specific, and designed to be dense environments that help you process more services and user data. With that, there are three challenges to be aware of when working with edge design:

  1. Use-case definition: This is actually a major stopping point for edge projects. There may be a great idea or concept, but defining the use case hits a barrier. This usually happens when there’s misalignment between IT, business requirements, and management. In these situations, it’s important to take a step back and look at the long-term strategy of your own organization. Are you growing? Will you be supporting remote users? Are you trying to deliver new types of connected services? If you see that edge is a fit, take the next steps and write up a solid business plan and a technology strategy to support it. You don’t have to be an edge expert to clearly define your own use case. Furthermore, there are great providers who can help you on this journey. However, it’s important to align infrastructure and business to ensure that your strategy can take off. From there, it’s key to work with the right people who can bring that vision to life. Which brings us to the next point.
  2. Lack of expertise: If you’ve tried to deploy edge solutions in the past but found yourself on a support island, you’re not alone. Until fairly recently, there was a serious lack of expertise when it came to deploying edge solutions. Plus, these aren’t inexpensive projects where you can just ‘wing it.’ So, even if an organization was able to define a use case, it might get stuck finding good partners who could help implement the vision. Again, edge is not like a typical data center. There are different considerations around space, density, power, management, connectivity, redundancy, and much more. This is why working with the right people can make the entire process so much easier. The good news is that today there are great organizations, partners, and data center providers that are ready and able to help with edge solutions. Don’t let this be a stopping point; work with partners that can help you scale and build out your own edge solution.
  3. Concerns around data management: This is a big one, and it adds a key complication to deploying edge. Basically: ‘what happens to my data?’ You’re going to have to take some extra time to define your data requirements and management policies. Is the data transient, or will it be stored at the edge? What data is being processed? What is the connectivity control method around the data? Again, all of this will need to be defined and integrated into your own edge solution. That said, you can absolutely still build compliance and regulatory controls into an edge architecture. However, you’ll need to take extra precautions to ensure data security and control. Take into consideration the location of the edge, the storage systems at the edge, how the data will be processed, and who will have access (see the sketch after this list). The cool part is that software-defined solutions allow you to integrate with core data center systems and support powerful data locality policies. This can really impact industries like pharma, healthcare, and other regulated organizations.
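To make the data-management questions above concrete, here is a minimal Python sketch of the kind of data locality policy described in the third item. Every name, region, and rule in it is a hypothetical illustration, not something from the report or any particular product:

```python
from dataclasses import dataclass

# Hypothetical locality policy: which core regions may receive data
# produced at a given edge site. Region names are illustrative only.
ALLOWED_CORE_REGIONS = {"us-east": {"us-east", "us-central"}}

@dataclass
class Record:
    record_id: str
    classification: str   # "transient" or "persistent"
    origin_region: str    # edge site where the data was produced

def may_replicate_to_core(record: Record, core_region: str) -> bool:
    """Return True only if policy allows moving this record off the edge."""
    if record.classification == "transient":
        # Transient data is processed and discarded at the edge; never replicated.
        return False
    allowed = ALLOWED_CORE_REGIONS.get(record.origin_region, set())
    return core_region in allowed

# A persistent record from a us-east edge site may go to us-central, but not eu-west.
rec = Record("sensor-42", "persistent", "us-east")
print(may_replicate_to_core(rec, "us-central"))  # True
print(may_replicate_to_core(rec, "eu-west"))     # False
```

In a real deployment, a policy like this would live in your software-defined management layer rather than in application code, but the decision logic is the same: classify the data, then check locality rules before anything leaves the edge site.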

Understanding and managing the latency budget

To an end user, latency is the reason that downloading a movie “takes so long,” but to a content provider, the number of milliseconds it takes to complete a function can be measured in customer dissatisfaction and cost.

Furthermore, to a business, latency can also mean lost business or a lost competitive edge.

Even at the speed of light, the round trip to a central data center (a facility located in a Tier I market, for example) can mean an accumulation of transmission costs. A study conducted by ACG Research estimated that caching content locally in a metro population can save approximately $110 million over a five-year period. If we were to apply this same logic to a company running an IIoT parts-tracking application, the hard costs of transmission could be measured, but the associated cost of degraded application performance would be incalculable.
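To put rough numbers on that latency budget, here is a back-of-the-envelope sketch. The constants are common approximations, not figures from the ACG study: light in optical fiber covers roughly 200 km per millisecond (about two-thirds of c), and each network hop adds some processing delay.

```python
# Back-of-the-envelope round-trip delay estimate.
# Assumes ~200 km of fiber per millisecond (about 2/3 the speed of light in glass)
# and a fixed per-hop processing delay; both figures are rough assumptions.

FIBER_KM_PER_MS = 200.0   # one-way signal speed in fiber
PER_HOP_MS = 0.5          # assumed router/switch processing delay per hop

def round_trip_ms(distance_km: float, hops: int) -> float:
    propagation = 2 * distance_km / FIBER_KM_PER_MS  # out and back
    processing = 2 * hops * PER_HOP_MS               # hops traversed both ways
    return propagation + processing

# A hypothetical Boston-to-New York fiber route (~300 km) vs. a nearby edge site (~10 km).
print(f"Regional round trip: {round_trip_ms(300, 6):.1f} ms")   # ~9.0 ms
print(f"Local edge round trip: {round_trip_ms(10, 2):.1f} ms")  # ~2.1 ms
```

A few milliseconds may sound trivial, but multiplied across every request in a chatty application, it is exactly the kind of budget the edge is meant to reclaim.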

It’s important to have your data reside closer to your users, as well as to the applications or workloads being accessed.

Consider supply chains in physical systems, like Walmart or Amazon Prime Same-Day Delivery, as an example. In any supply chain, as traffic increases, transportation costs go up. As a result, distribution moves closer to end users to decrease transport costs and increase throughput. The same concept applies to edge and data delivery.

Edge computing will take data everywhere, including the floor of the ocean, as is the case for Microsoft’s Project Natick deployment in Scotland. (Photo by Scott Eklund/Red Box Pictures for Microsoft).

With the increase of traffic moving through the edge, there is greater demand for more bandwidth and less latency. As discussed earlier, it’s important to have your data reside closer to your users, as well as to the applications or workloads being accessed. Where data demands may not have fluctuated much in the past, they are far more dynamic today.

  • Bandwidth Bursts. Many providers now offer something known as bandwidth bursting specifically for edge solutions. This allows the administrator to temporarily increase the amount of bandwidth available to the environment based on immediate demand, which is useful for seasonal or highly cyclical industries. There will be periods of business operation when more bandwidth is required to deliver the data. In those cases, look for partners who can dynamically increase that amount and then de-provision those resources when they are no longer being used.
  • Network Testing. Always test your network and the network of the edge provider. Examine their internal speeds and see how your data will behave on that network (a minimal latency-probe sketch follows this list). This also means taking a look at the various ISP and connectivity providers offered by the colocation provider. Many times, a poor networking infrastructure won’t be able to handle a large organization’s ‘Big Data’ needs despite a fast Internet connection. Without good QoS and ISP segmentation, some edge data centers can actually become saturated. Look for partners with good, established connections providing guaranteed speeds.
  • Know Your Applications. One of the best ways to gauge edge data requirements is to know and understand the underlying application or workload. Deployment best practices dictate that there must be a clear understanding of how an application functions, the resources it requires and how well it operates on a given platform. By designing the needs around the application, there is less chance that improper resources are assigned to that workload.
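As a starting point for the network-testing step above, here is a minimal Python sketch that times TCP handshakes to a few candidate edge endpoints. The hostnames are placeholders, and a connect-time probe is only a first pass; a real evaluation would also measure throughput, jitter, and behavior under sustained load.

```python
import socket
import time

# Placeholder endpoints; substitute the edge provider's actual test hosts.
ENDPOINTS = [
    ("edge-site-1.example.com", 443),
    ("edge-site-2.example.com", 443),
]

def connect_latency_ms(host: str, port: int, timeout: float = 3.0) -> float:
    """Time a single TCP handshake as a rough proxy for network round-trip latency."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close immediately
    return (time.perf_counter() - start) * 1000

for host, port in ENDPOINTS:
    try:
        samples = [connect_latency_ms(host, port) for _ in range(5)]
        print(f"{host}: min {min(samples):.1f} ms, avg {sum(samples) / len(samples):.1f} ms")
    except OSError as err:
        print(f"{host}: unreachable ({err})")
```

Run a probe like this from the locations where your users actually sit; latency measured from headquarters says little about the experience at a rural branch site.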

There are a lot of benefits and use-cases around edge and connected systems. Take the time to think about your own strategies and whether your current infrastructure is capable of supporting these initiatives.

This Data Center Frontier series, focused on edge computing, will cover additional topics over the coming weeks.

Download the full Data Center Frontier Special Report on Edge Computing, courtesy of BASELAYER

Explore the evolving world of edge computing further through Data Center Frontier’s special report series and ongoing coverage.

About the Author

Bill Kleyman

Bill Kleyman is a veteran, enthusiastic technologist with experience in data center design, management and deployment. Bill is currently a freelance analyst, speaker, and author for some of our industry's leading publications.

