
Building an infrastructure to meet the needs of Software 2.0


It has often been said that we are in the midst of the Fourth Industrial Revolution, and given the pace of technological progress across the globe, the idea doesn’t seem far off. Owing to the rapid growth of big data and IoT, applications are becoming more intelligent and data-driven by the day. AI sits at the backend of most of our daily digital interactions, and for good reason. Innovations like edge AI, computer vision, decision intelligence, deep learning, and machine learning are having a transformational impact on businesses, so much so that AI has shifted from being a luxury to a necessity for business ROI.

AI is at the core of all intelligent applications today, but what is at the core of AI? Computing. At the backend of every smart application is accelerated computing, which delivers results in fractions of a second with minimal latency. Few innovations have done more to support AI and raise the performance of AI models than accelerated computing.

All intelligent solutions and applications are built on the foundations of AI, making it critical to create an infrastructure that supports AI’s efficient, speedy, and computation-heavy workloads. Traditionally, we have leveraged specialised hardware to dramatically speed up work, augment tasks, and offload demanding jobs that can bog down CPUs. But this hardware has almost always worked independently. Modern accelerated computing brings together the strengths of three essential pieces of hardware: the efficiency of the CPU, the parallel processing of the GPU, and the accelerated data movement of the DPU.

The world of Software 2.0

Accelerated computing and the growth of IoT have given birth to Software 2.0. With massive computational power, intelligent machines in the Software 2.0 era can augment the work of software engineers. Rather than hand-coding algorithms, engineers train neural networks, letting optimisation adjust the networks’ weights until they produce the desired output. Our AI journey has arrived at a point where software writes software.
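To make the contrast concrete, here is a minimal, illustrative sketch (not from the article; the spam-style features and toy data are assumptions): in Software 1.0 a programmer writes the decision rule by hand, while in Software 2.0 the programmer supplies examples and an objective, and the optimiser finds the weights that implement the rule.

```python
# Illustrative contrast: hand-coded rule (Software 1.0) vs learned rule (Software 2.0).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Software 1.0: the programmer writes the decision rule explicitly.
def is_spam_v1(num_links: int, num_caps: int) -> bool:
    return num_links > 5 and num_caps > 20

# Software 2.0: the programmer supplies labelled examples and an objective;
# the optimiser finds the weights that implement the rule.
X = np.array([[1, 3], [8, 40], [0, 2], [12, 55], [2, 5], [9, 33]])  # toy features
y = np.array([0, 1, 0, 1, 0, 1])                                    # toy labels
model = LogisticRegression().fit(X, y)                              # weights are learned, not coded
print(model.predict([[10, 50]]))                                    # the learned "program" decides
```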

Creating such repeatable AI requires large amounts of computing power brought together with our data. Modern computing handles scientific simulations, visualisation, data analytics, and machine learning workloads, but supporting these machines carries huge infrastructural requirements. These needs are pushing supercomputing centres, cloud providers, and enterprises to rethink their computing architecture. Today, the data centre itself is treated as a single compute node.
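The "data centre as one compute node" idea is easiest to see in distributed training, where many servers cooperate on a single model. Below is a minimal sketch, assuming PyTorch's DistributedDataParallel and a job launched with torchrun; the model, sizes, and hyperparameters are illustrative, not a prescription.

```python
# Minimal sketch of treating many GPUs/servers as one logical compute node.
# Launch with: torchrun --nproc_per_node=<gpus_per_node> train.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group(backend="nccl")          # one process per GPU across the cluster
    local_rank = int(os.environ["LOCAL_RANK"])       # set by torchrun
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(1024, 10).cuda()
    model = DDP(model, device_ids=[local_rank])      # gradients are synchronised across all nodes
    opt = torch.optim.SGD(model.parameters(), lr=0.01)

    for _ in range(100):
        x = torch.randn(64, 1024, device="cuda")
        y = torch.randint(0, 10, (64,), device="cuda")
        loss = torch.nn.functional.cross_entropy(model(x), y)
        opt.zero_grad()
        loss.backward()                              # all-reduce runs over the data-centre network
        opt.step()

if __name__ == "__main__":
    main()
```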

As a developing nation, India has the opportunity to be the AI garage of the world. India is leading the way with its vibrant start-up ecosystem, fundamental research, intellectual property, and AI products and services. The country is a pioneer in research, and given the large number of data scientists graduating in India, there is huge scope to develop the industry further into services, outsourcing, IP and product development channels. Investments in the industry soared to $38 billion in India during the pandemic, giving it a major boost for continued growth. Skilling and retraining its large student and developer base is the key to ensuring that growth is sustained.

As of now, India lacks the number of data centres needed to harness the power of Software 2.0. The country needs more initiatives from the public and private sectors. 

The state of India

In India, we are still struggling to create the right computing infrastructure to leverage AI and accelerated computing to solve the country’s core problems. To be truly successful, AI applications must be useful to everyone in society, especially those at the bottom of the pyramid. For India to become a pioneer, the growth of the technology needs to be well-rounded.

To put this into perspective, consider chatbots and virtual assistants. Anyone literate in English and holding a smartphone has easy access to bots such as Siri or Alexa that can tell them the time, look up questions on Google, and provide other information. But the technology is not truly productive until it can help a farmer who speaks a local language, telling him the best techniques to store his grain or the latest fertiliser practices. Virtual assistants are still far from speaking localised languages or translating them correctly. Until voice bots can converse in local languages, farmers will not be able to gain assistance from AI, and that gap spotlights a great opportunity for technology to help solve the challenge of feeding the billions of people on our planet.

To enable more advancements in services, we need to meet the computational parameters of Software 2.0. But how can we develop the right infrastructure? We need to build on the three key ingredients that come together to support Software 2.0 in India. Let’s discuss them briefly.

  1. Compute Power

To support AI applications, computers need to perform an enormous number of operations every second. One proxy for a country’s computing power is the number of data centres it has, and we foresee these data centres becoming increasingly fused with AI in the coming years. Massive data and compute workloads cannot be handled by just one or two servers. Instead, high compute density is required to support model performance and flexibility.
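A back-of-envelope calculation shows why single servers fall short. The sketch below uses the common rule of thumb that training compute is roughly 6 × parameters × training tokens; the model size, token count, and per-GPU throughput are assumed for illustration, not figures from the article.

```python
# Rough estimate of training compute (all numbers are illustrative assumptions).
params = 1e9                 # a 1-billion-parameter model
tokens = 100e9               # trained on 100 billion tokens
total_flops = 6 * params * tokens            # ~6e20 floating-point operations

gpu_flops_per_sec = 100e12   # ~100 TFLOP/s sustained per accelerator (assumed)
gpus = 64                    # a modest cluster
seconds = total_flops / (gpu_flops_per_sec * gpus)
print(f"~{seconds / 86400:.1f} days on {gpus} GPUs")  # why data-centre-scale density matters
```

On these assumptions the job already takes on the order of a day across 64 accelerators; on a single server it would stretch into months.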

Data centres step in here to allow a smooth flow of data across the network. They are the necessary infrastructure for setting up cloud computing and dealing with the massive quantities of high-quality datasets that are then used to build AI products and services. This makes a systematic organisation and flow of data important. Within data centres, the DPU is critical to accelerating the flow of data between compute nodes.

While India is late in entering the data centre industry, we see a collaborative effort between the industry and the government to achieve India’s supercomputing self-reliance goal. As a result, India is expected to add at least 28 hyperscale data centres over the next three years.

Additionally, India has a key competitive advantage in computing, and we need to leverage it. The three integral ingredients, namely electricity, workforce, and cooling, are all cheaper in India. We can also move beyond the established IT-hub cities and set up computing infrastructure in states such as Himachal Pradesh or Sikkim, which have cooler climates and access to cheap hydroelectric power.

  2. Datasets

Software 2.0 depends heavily on neural networks and software that learn from input data and go on to create more software and models. Data is at the core of this technology, and massive datasets are required to support it. A large dataset is needed to train the AI model, which is then applied to a separate held-out dataset to test it and make fresh predictions. The larger the training dataset, the better the model tends to be. Reliable machine learning models, often with millions of parameters, have to be trained on correspondingly large datasets before we can trust them to generate more applications.
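The train/hold-out split and the effect of dataset size can be sketched in a few lines. The snippet below uses synthetic data and an ordinary classifier purely for illustration; the sizes and model choice are assumptions, not anything from the article.

```python
# Minimal sketch: hold out a test set and watch held-out accuracy improve
# as the training set grows (synthetic data, illustrative only).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

for n in (100, 1000, 10000):                                  # growing training-set sizes
    model = LogisticRegression(max_iter=1000).fit(X_train[:n], y_train[:n])
    print(n, round(model.score(X_test, y_test), 3))           # larger training sets usually score better
```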

Dataset sizes are only set to increase exponentially. In fact, around 90% of the data available today has been generated in the last two years, and one can only imagine the volumes we will produce in the future. Typically, the data generated is fed into data centres to run algorithms and process computational tasks. But given the amount of data and the number of high-powered compute nodes involved, data movement within the data centre has become an infrastructural bottleneck. This warrants investment by data centres in DPUs, or data processing units. DPUs can increase network throughput by offloading and easing the movement of data between compute nodes. Building an accelerated networking stack that combines the networking power of the DPU with the efficiency of the CPU and the parallel-processing capability of the GPU is the critical path forward.
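The same bottleneck appears at a smaller scale inside a single node, which makes for a useful software-level analogy (this is an illustration I am adding, not the article's method): just as a DPU offloads data movement on the network, pinned memory and asynchronous copies keep a GPU fed instead of leaving it idle while data moves.

```python
# Software-level analogy for hiding data movement behind compute (requires a CUDA GPU).
import torch
from torch.utils.data import DataLoader, TensorDataset

data = TensorDataset(torch.randn(10000, 1024), torch.randint(0, 10, (10000,)))
loader = DataLoader(data, batch_size=256, num_workers=4, pin_memory=True)  # CPU workers stage batches

model = torch.nn.Linear(1024, 10).cuda()
for x, y in loader:
    x = x.cuda(non_blocking=True)   # host-to-device copy overlaps with compute on the previous batch
    y = y.cuda(non_blocking=True)
    loss = torch.nn.functional.cross_entropy(model(x), y)
    # backward/step omitted; the point is keeping the accelerator busy while data moves
```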

Accelerating the work inside data centres requires us to take a holistic approach to compute power, data movement and integration of hardware and software. Building data centres and efficient datasets is not enough; we need to find approaches to bridge the gap and allow for efficient data movement. 

  3. Algorithms

Software 2.0 demands software engineers who can create and train machine learning models rather than hand-write algorithms. The new process entails specifying a goal or intention and a skeleton of the neural network architecture, backed by large amounts of computational resources. The network is then trained on large datasets to arrive at its own predictive model. In effect, the algorithm writes itself.
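A minimal sketch of that workflow, with an assumed architecture, objective, and toy data (none of it from the article): the engineer supplies the skeleton and the goal, and gradient descent fills in the weights.

```python
# Sketch of the Software 2.0 loop: skeleton + objective in, learned "algorithm" out.
import torch

skeleton = torch.nn.Sequential(           # the architecture the engineer chooses
    torch.nn.Linear(20, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 2),
)
objective = torch.nn.CrossEntropyLoss()   # the stated goal or intention
opt = torch.optim.Adam(skeleton.parameters(), lr=1e-3)

X = torch.randn(5000, 20)                 # stand-in for a large labelled dataset
y = (X[:, 0] > 0).long()                  # a toy rule the network must discover

for _ in range(200):
    opt.zero_grad()
    loss = objective(skeleton(X), y)
    loss.backward()                       # the "algorithm" is refined automatically
    opt.step()
```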

The way forward

India has to expand its computational capacity and invest heavily in the infrastructure needed to support advanced AI and computational applications in the country. The foundations of the technology need to be strong enough to support future research and development in these integral areas. India is already one of the biggest hubs for IT, with huge scope to become self-sufficient in Software 2.0. But it is important to leverage our workforce strength and take up new initiatives to really make a difference and make India a computing superpower.

This article is written by a member of the AIM Leaders Council, an invitation-only forum of senior executives in the data science and analytics industry.
