John Krzesicki

It’s often said that there’s nothing new under the sun, only new ways of doing old things. That may be especially true in how we handle data.

Ever since Power Systems Research began tracking global production of engines and powertrains in 1976, its analysts have been alert to new trends surrounding power and data movement.

Today, as three essential metrics in data handling change (cloud costs, volume and processing time), the structure of data networks is also changing. In many cases, it's not practical to send vast amounts of data to the cloud to be processed and then wait for the results. Now, it's often necessary to have smaller data centers located near the activity, at the edge of the action, if you will.

Consider automated vehicles. In a traffic situation with a high volume of automated vehicles transmitting data for processing, it's necessary to have instantaneous response times for each vehicle's data. Further, because that data is "use" data, consumed at the moment it's generated, there is no need for it to move into and out of the cloud, adding to the cloud bill. Many companies are repatriating their data from the cloud due to these costs alone.

Expanding on this high volume of automated vehicles transmitting data at the same time, consider hundreds of automated vehicles moving on a roadway simultaneously within the same small area: what happens if many of them want to switch lanes or change speeds at exactly the same time? Virtually simultaneous responses are needed for requests from all these vehicles. That's a problem that will severely test cloud computing; a pragmatic solution to this type of challenge requires many data centers near the roadways to handle the high volume of requests instantaneously.
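To make the latency argument concrete, here is a minimal sketch of how far a vehicle travels while a request is in flight. The speeds and round-trip times below are assumed, order-of-magnitude figures for illustration, not measured values from any specific network.

```python
def distance_traveled(speed_m_s: float, round_trip_s: float) -> float:
    """Distance (meters) a vehicle covers while waiting for a response."""
    return speed_m_s * round_trip_s

# Assumed values for illustration only:
speed = 31.3       # ~70 mph expressed in meters per second
cloud_rtt = 0.100  # assumed 100 ms round trip to a distant cloud region
edge_rtt = 0.010   # assumed 10 ms round trip to a nearby edge data center

print(f"Cloud round trip: vehicle travels {distance_traveled(speed, cloud_rtt):.2f} m")
print(f"Edge round trip:  vehicle travels {distance_traveled(speed, edge_rtt):.2f} m")
```

Under these assumptions, a vehicle moves roughly ten times farther while waiting on a distant cloud than on a nearby edge site, which is the gap that matters when many vehicles maneuver in the same small area at once.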

Edge data centers also feature the concept of many microgrids, not a small number of widespread grids where a failure could disrupt large areas and hundreds of thousands of individuals. Adding to the case for increased use of microgrids, sustainability and environmental stewardship are only growing in importance and are often easier to accomplish in smaller footprints.

There are growing advantages and applications for edge data centers.

These edge data centers have specific needs similar to those of large data centers but at a smaller scale. Think power. Think cooling. Think storage. Think transmission. Think security. Edge data centers also offer a resiliency boost, providing geographic diversity and lowering the risks related to local power outages.

The concept of edge computing is not new, but the need for this approach is growing every day. The idea of edge computing comes from the Content Delivery Networks (CDNs) developed in the 1990s to deliver heavy volumes of data for website and video applications from edge servers located close to users.

In the 2000s, applications were added to these data centers, thus creating essential commercial edge computing services.

As trends point towards a greater penetration and adoption of technologies requiring edge data centers, Power Systems Research analysts continue to maintain a close focus on the products and equipment that will be impacted by these trends. 

We realize the way technologies are deployed in the field will ultimately impact future demand within the powered equipment markets as OEMs and various supply chain companies navigate the path forward. There are bound to be changes and detours along the way, but it is certainly helpful to step back and observe the roadmap that appears to be in front of us and anticipate the changes ahead. PSR

John Krzesicki is Business Development Manager for Power Systems Research