
2020 trend: AI and the crucial questions of data and hardware

What we group today under the name “artificial intelligence” is an umbrella term covering various techniques for recognizing patterns, images, and sounds. Machine learning and its derivatives, such as deep learning, essentially boil down to pattern recognition. We witnessed several breakthroughs in the 2010s, but the seeds of most of these techniques and algorithms were planted decades ago.
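To make this concrete, here is a minimal sketch of what “pattern recognition” means in practice, assuming scikit-learn and its built-in handwritten-digit dataset (the article itself names no library or dataset):

```python
# Minimal sketch: a small neural network learning to recognize handwritten digits.
# Assumes scikit-learn is installed; the article names no specific library.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()  # 1,797 labelled 8x8 grayscale digit images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=0)

# The model learns statistical regularities in the pixels: patterns, nothing more.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

Despite the grand name, everything the model does is fit patterns in labelled examples.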

We have seen immense progress in the performance of AI systems in many areas, going from below human-level skill to matching or even exceeding it. How is this possible? The answer is twofold and can be summed up in two words: data and compute.

The digitization of almost all human activities has led to an explosion in the volume of data generated. Algorithms now have far more data to learn from, and are markedly more accurate as a result. Progress has also been made in parallel on image recognition: refinements to neural network architectures have steadily improved accuracy, as the ImageNet benchmark illustrates.
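For a sense of what this looks like in practice, here is a minimal sketch of classifying a photo with an ImageNet-trained model, assuming PyTorch and torchvision (the article names neither); the file name example.jpg is a placeholder:

```python
# Minimal sketch: run one image through a model pretrained on ImageNet.
# Assumes PyTorch/torchvision; "example.jpg" is a hypothetical local photo.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.resnet50(pretrained=True)  # a standard ImageNet model
model.eval()

# The standard ImageNet preprocessing pipeline: resize, crop, normalize.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("example.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)  # add a batch dimension

with torch.no_grad():
    logits = model(batch)
print(logits.argmax(dim=1).item())  # index of the predicted ImageNet class
```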

A gap between those in the know and those who are not

Artificial intelligence is advancing into new areas at a breakneck pace. In 2019 alone, we saw great progress in areas such as natural language processing, to name just one.

Improvements in quality of results and speed of execution were reported every month last year. The amount of resources devoted to the field is staggering, and research is progressing in record time. Should we be preparing for a new world of AI? Perhaps, but not so fast.

The problem with AI is that the gap continues to widen between those who have the knowledge and those who lack it, and not only because of the resources and expertise of the big players. The gap is self-sustaining: the simple fact of designing and shipping products built on data means that those products generate even more data as they are used.

The virtuous circle of data usage

Following this reasoning: if increased use of data yields better AI, then more reliable products get built, which generate even more data, and so on. Facebook is a prime example, but it is not the only one. When The Economist and others call for stronger data regulation, that should give us pause.

However, data is only part of the equation; the other part is hardware. Without the enormous progress made in hardware over the previous decade, today’s artificial intelligence would not exist. Access to the computing power needed to run machine learning over huge amounts of data was once a privilege reserved for a minority.

The kind of computing equipment that large technology companies have access to remains out of reach for ordinary users, but a parallel market with almost equivalent capabilities has opened up. The combination of the cloud, with its near-unlimited access to processing power, and ever more efficient hardware has enabled more companies than ever to access AI acceleration chips, provided they are willing to pay the price.

The rise of NVIDIA

In the 2010s, NVIDIA earned the innovation prize for hardware designed for AI. The manufacturer, best known for its graphics processing units (GPUs), generally associated with gamers, has reinvented itself as an AI champion. It turns out that the GPU’s massively parallel architecture is very well suited to AI workloads, which consist largely of matrix arithmetic.
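A minimal sketch of why, assuming PyTorch and a CUDA-capable NVIDIA GPU (the article specifies no framework): the matrix multiplications at the heart of neural networks spread naturally across a GPU’s thousands of cores.

```python
# Minimal sketch: the same matrix multiply on CPU and GPU.
# Assumes PyTorch and a CUDA-capable NVIDIA GPU; neither is named in the article.
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

c_cpu = a @ b  # runs on a handful of CPU cores

if torch.cuda.is_available():
    # The same operation, spread across thousands of GPU cores. Neural network
    # training and inference are dominated by exactly this kind of operation.
    c_gpu = a.to("cuda") @ b.to("cuda")
```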

With Intel dominating the market for traditional processors (CPUs) and other GPU manufacturers unable to keep up, NVIDIA rose to the rank of leader in AI hardware. None of this is set in stone, however: the hardware world moves through rapid cycles of innovation.

While NVIDIA has built up an entire software ecosystem dedicated to AI, disruptors are attacking the chip market head-on. At the end of 2019, Intel went on the offensive by buying Habana Labs, one of the start-ups active in the AI chip market, focused on new chip designs tailored from the ground up to the specific workloads of AI.

The chip race is just beginning

Even though many do not (yet) know Habana Labs, its chips are already used by cloud providers and manufacturers of autonomous vehicles. Graphcore, which became the first AI-chip unicorn in late 2018, recently announced that its chips now run in Microsoft’s Azure cloud. Far from over, this chip race is just beginning.

This article is part of our dossier on the five technologies of the decade to come.
