Data Analytics & AI

Increase compute power to enable AI

The evolution of artificial intelligence (AI) techniques and the growing volume of data being gathered demand an ever-increasing ability to perform computations. Without a corresponding increase in available compute power, AI would not be able to take advantage of either.

In its recently released report, AI in telecommunications, Cambridge Network specified two areas where this increase in computing power has enabled AI to progress. The first is the growing power of graphics processing units (GPUs), which servers use to undertake machine learning. The second is the rising specification of personal computing devices, which allows more information to be processed locally and AI models to be run without relying on a connection to a cloud server.

Cambridge Network explains that the increase in processing power is underpinned by continuing advances in semiconductor technology and the shrinking size of transistors within processors. As shown in the figure below, the size of transistors has been reducing year on year. Qualcomm announced at the beginning of 2017 that its Snapdragon 835 processor would contain 10 nm transistors, promising more processing power and lower energy consumption.

Source: Extremetech

Faster GPUs

A further increase in computing power has come from using GPUs for machine learning. GPUs were originally designed to carry out millions of calculations in parallel to meet the demands of fast-moving video-game graphics. That massively parallel architecture is also ideally suited to the calculations required by machine learning algorithms, so GPUs have been harnessed to run them quickly and efficiently. Nvidia, the company best known for its GPUs, has released increasingly powerful models, as shown in the figure below, expanding the power available for machine learning applications.
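To see why GPU parallelism matters for machine learning, consider a minimal sketch (in Python with NumPy, chosen here for illustration, not part of the report): a single neural-network layer is a large matrix-vector multiplication in which every output element is an independent dot product, so thousands of GPU cores can compute them simultaneously.

```python
import numpy as np

# Machine learning workloads reduce largely to dense matrix arithmetic.
# A single neural-network layer computes y = activation(W @ x + b):
# each of the outputs is an independent dot product, which is why GPUs,
# with thousands of cores working in parallel, accelerate it so well.

rng = np.random.default_rng(0)

W = rng.standard_normal((256, 512))   # layer weights (256 outputs, 512 inputs)
x = rng.standard_normal(512)          # input vector
b = rng.standard_normal(256)          # bias

# Each of the 256 outputs can be computed independently -- in parallel.
y = np.maximum(W @ x + b, 0.0)        # ReLU activation

print(y.shape)  # (256,)
```

On a CPU these dot products run a few at a time; a GPU dispatches them across its many cores at once, which is the advantage the article describes.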

Source: Nvidia

Higher specification personal devices

The increasing processing power available to personal computing devices at ever-lower cost, shown in the figure below, enables more local processing of information using AI algorithms. Voice recognition, for example, is becoming increasingly effective on personal devices. Personal computing devices will increasingly contain dedicated AI chips, such as Apple’s new iPhones with their “neural engine” and Huawei’s Mate 10 with its “neural processing unit”. This will pave the way for more AI apps and functions being performed locally on mobile devices.
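The point about local processing can be sketched in a few lines of Python. The model and weights below are invented purely for illustration: once a trained model's parameters are stored on the device, inference is just local arithmetic, with no cloud round-trip, which is exactly the workload a dedicated "neural engine" accelerates.

```python
import math

# Toy 2-input logistic classifier. The weights are hypothetical,
# standing in for a model downloaded once to the device.
WEIGHTS = [0.8, -1.2]
BIAS = 0.1

def predict_locally(features):
    """Run inference entirely on-device: a dot product plus a sigmoid.

    No network connection is needed -- only local computation.
    """
    z = sum(w * f for w, f in zip(WEIGHTS, features)) + BIAS
    return 1.0 / (1.0 + math.exp(-z))   # probability in (0, 1)

score = predict_locally([0.5, 0.2])
print(score)
```

Real on-device models (speech, vision) are vastly larger, but the principle is the same: the latency and privacy benefits come from keeping this computation on the handset.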

Source: Gizmodo


    About The Author


    Arti has been writing and editing for seven years in the fields of technology, business and finance. She is particularly interested in how firms are innovating to bring us into the next digital age.
