The competitive ‘edge’

The unstoppable growth of IoT will lead to severe ‘traffic’ challenges.
The huge volume of data produced by networked devices worldwide is, almost literally, clogging the pipelines to the Cloud. Almost all data collected by the sensors, chips and readers in countless devices is currently sent straight to external data storage for processing. By 2020, an estimated 50 billion devices will be connected, all of them gathering information and exchanging it with the Cloud, causing delays and driving up costs.

Local versus central
With edge computing, instead of sending all data to the Cloud, processing it there and sending it back, devices process real-time data locally, whether on a laptop, an IoT sensor or any other internet-connected device. Only the data required for storage or analytics is sent to the Cloud. This avoids ‘data traffic jams’ and lets sensors share and receive the critical operating data that drives business operations more efficiently.
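
To make the pattern concrete, here is a minimal sketch in Python of an edge device that reacts to readings locally and uploads only a compact summary instead of the raw stream. The sensor read and the upload call are hypothetical placeholders, not any specific device API.

```python
import random
import statistics

def read_sensor() -> float:
    """Placeholder for a real sensor driver; simulated here."""
    return 20.0 + random.gauss(0, 2)

def send_to_cloud(payload: dict) -> None:
    """Placeholder for an upload call (e.g. an HTTPS POST or MQTT publish)."""
    print("uploading:", payload)

def edge_loop(window: int = 60, alert_threshold: float = 25.0) -> None:
    readings = [read_sensor() for _ in range(window)]
    # React locally and immediately when readings cross the threshold,
    # without waiting for a round trip to the Cloud.
    alerts = [r for r in readings if r > alert_threshold]
    if alerts:
        print(f"local alert: {len(alerts)} readings above {alert_threshold}")
    # Only a compact summary leaves the device, not all raw readings.
    send_to_cloud({
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": round(max(readings), 2),
        "alerts": len(alerts),
    })

if __name__ == "__main__":
    edge_loop()
```

The point of the sketch is the ratio: sixty raw readings are reduced to one small payload before anything crosses the network.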

Edge computing will significantly reduce costs for organizations while at the same time improving data speed, quality and accuracy. Real-time data is processed locally, giving users direct feedback or instructions on the spot (e.g. in-car or weather information). And because sensitive data stays close to its source, edge computing also strengthens cybersecurity; non-sensitive data can be transferred to the Cloud immediately for analysis and storage.
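
As an illustration of keeping sensitive data close to the source, the sketch below routes records either to local storage or to a cloud-bound batch based on a sensitivity flag. The record fields and the flag are invented for the example; a real system would apply a classification policy instead.

```python
from dataclasses import dataclass

@dataclass
class Record:
    field: str
    value: str
    sensitive: bool

def route(records):
    local_store, cloud_batch = [], []
    for rec in records:
        # Sensitive data stays near its source; everything else is
        # forwarded to the Cloud for central analysis and storage.
        (local_store if rec.sensitive else cloud_batch).append(rec)
    return local_store, cloud_batch

records = [
    Record("vehicle_speed", "87 km/h", sensitive=False),
    Record("driver_id", "NL-1093", sensitive=True),
    Record("outside_temp", "14 C", sensitive=False),
]
local, cloud = route(records)
print("kept locally:", [r.field for r in local])
print("sent to cloud:", [r.field for r in cloud])
```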

Big iron versus small iron
Michael Dell is convinced that Dell’s competitive advantage will be ‘on the edge’. He predicts that, given current developments, there will be enormous demand for hardware at the edge, demand that Dell will readily meet.

IBM, on the other hand, believes that even in a world powered by Cloud computing, the high-performing mainframe data center will continue to play a critical role in transforming businesses.

Our take
On the one hand, most online services today require real-time access to data, and that access must be fast to deliver an optimal customer experience. On the other hand, gathering and analyzing enormous amounts of data yields valuable business information that is instrumental to improving customer service and driving innovation. These are very different processes with different IT requirements. Old-school, big-iron thinking no longer gets the job done. To serve both worlds and ride the waves of innovation, a different, more agile approach is needed. In our opinion, legacy systems stand in the way of this much-needed transition.

Today’s x86 systems offer everything big iron does and more: scalability, security, performance and flexibility. In that light, expensive, high-maintenance mainframes become redundant and turn into millstones around IT organizations’ necks.

However, those mainframes still contain intellectual property, either in the form of (customized) business logic or in the form of customer data, that needs to be preserved. The legacy transformation route therefore starts with a 1:1 migration of the applications, the batch processing and the actual data in the mainframe databases; this paves the way to further innovate the migrated environment and leverage the intellectual property beyond its current form, into the new world.
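
As a purely illustrative sketch of the 1:1 data-migration step, the following Python copies a table from one database to another in batches. Here sqlite3 stands in for both the mainframe source and the modern target; real migrations (e.g. from DB2) also involve schema, encoding and data-type conversion, and would use dedicated tooling rather than this hand-rolled loop.

```python
import sqlite3

def migrate_table(src: sqlite3.Connection, dst: sqlite3.Connection,
                  table: str, batch_size: int = 500) -> int:
    """Copy one table 1:1 from src to dst, returning the row count."""
    # Discover the column names on the source side.
    cols = [row[1] for row in src.execute(f"PRAGMA table_info({table})")]
    placeholders = ", ".join("?" for _ in cols)
    dst.execute(f"CREATE TABLE IF NOT EXISTS {table} ({', '.join(cols)})")
    cursor = src.execute(f"SELECT {', '.join(cols)} FROM {table}")
    moved = 0
    while True:
        rows = cursor.fetchmany(batch_size)  # stream in batches
        if not rows:
            break
        dst.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
        moved += len(rows)
    dst.commit()
    return moved

# Tiny demonstration with invented account data.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
src.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
dst = sqlite3.connect(":memory:")
print(migrate_table(src, dst, "accounts"), "rows migrated")
```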