Artificial Intelligence (AI) was everywhere in 2018, and the hype shows no signs of dying down anytime soon.
AI points to a future where machines carry out not only physical work, as they have done since the Industrial Revolution, but also tasks that demand cognitive abilities such as decision-making, planning and strategizing. The debate continues over whether this will lead to a utopian world in which humans are free to pursue meaningful work rather than whatever economic conditions dictate, or to one of widespread unemployment and social unrest.
The recent budget speech by India’s Acting Finance Minister Piyush Goyal revealed the government’s plan to reap the benefits of AI. To this end, the government is set to establish a National Centre on Artificial Intelligence along with several other centres of excellence.
Such is the growth of AI that it continues to astound us every day. A study by Juniper Research predicts that global spending on AI will rise from $2 billion in 2018 to $7.3 billion per year by 2022. Another study, by International Data Corporation, forecasts a compound annual growth rate of 50.01% for global AI spending, reaching $57.6 billion by 2021.
Big players such as Apple, Google, Facebook, Microsoft and IBM are investing heavily in AI, so we are sure to see further advancements, maybe even something better than Sophia and Alexa!
Here are the top three AI trends you need to watch out for in 2019:
Emergence of Artificial Intelligence Enabled Chips
Artificial Intelligence relies on specialised processors that complement the CPU. Even the most advanced and fastest CPU on its own cannot adequately speed up the training of an AI model. During inferencing, the model also needs extra hardware to execute the heavy mathematical computations behind tasks like facial recognition and object detection.
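To see why, consider the workload involved: neural network inference is dominated by dense matrix multiplications, which is exactly the kind of computation these accelerator chips parallelise. A minimal sketch in plain Python (the layer sizes and values are purely illustrative):

```python
# Minimal sketch: the dense matrix multiply at the heart of neural network
# inference -- the workload that AI accelerator chips are built to speed up.

def matmul(a, b):
    """Naive matrix multiply: a is m x k, b is k x n."""
    m, k, n = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(n)]
            for i in range(m)]

# One dense layer of a network: output = input . weights
x = [[1.0, 2.0]]                 # 1 x 2 input activations
w = [[0.5, -1.0, 0.25],
     [1.5,  0.5, -0.5]]          # 2 x 3 weight matrix
print(matmul(x, w))              # [[3.5, 0.0, -0.75]]
```

A real model repeats this millions of times per prediction, which is why a chip with thousands of parallel multiply-accumulate units beats a general-purpose CPU at the job.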
This year, chip manufacturers like Qualcomm, Intel, ARM, AMD and NVIDIA will ship special kinds of chips that enhance the performance of AI-enabled apps. These specialised chips will be optimised for specific scenarios and use cases related to natural language processing (NLP), speech recognition and computer vision. Most next-generation apps, from the automobile to the healthcare industry, will use these chips to deliver intelligence to end users.
This year, you can also expect companies such as Microsoft, Google, Facebook and Amazon to increase their investments in custom chips based on application-specific integrated circuits (ASICs) and field-programmable gate arrays (FPGAs). Such chips will be heavily optimised to run modern workloads based on AI and high-performance computing (HPC). A few of them will also help next-generation databases speed up query processing and predictive analytics.
Coming Together of AI and IoT at the Edge
This year, artificial intelligence and the internet of things will converge at the edge computing layer. Most models trained in the public cloud will be deployed at the edge.
The industrial internet of things is one of the top AI use cases, enabling predictive maintenance of equipment, outlier detection and root cause analysis.
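Outlier detection on a stream of sensor readings can be as simple as flagging values that sit far from the mean. A minimal sketch, with made-up vibration readings and an illustrative threshold:

```python
import statistics

def outliers(readings, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    return [r for r in readings if abs(r - mean) > threshold * stdev]

# Simulated vibration-sensor readings from a machine; the spike at 9.8
# is the kind of anomaly that would trigger a predictive-maintenance alert.
readings = [1.0, 1.1, 0.9, 1.2, 1.0, 9.8, 1.1, 0.95]
print(outliers(readings, threshold=2.0))  # [9.8]
```

Production systems use far more sophisticated models, but the principle is the same: spot the reading that does not fit the pattern before the equipment fails.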
Advanced ML models based on deep neural networks will run at the edge once they have been optimised. These models can deal with time-series data, speech synthesis, video frames and other unstructured data generated by devices such as cameras, microphones and other sensors.
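"Optimised" here usually means techniques like quantisation, which shrinks 32-bit float weights to 8-bit integers so a model fits the memory and integer math units of an edge chip. A minimal sketch of symmetric post-training quantisation (the weight values are illustrative):

```python
def quantize_int8(weights):
    """Symmetric post-training quantisation of float weights to int8 codes."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]   # integers in [-127, 127]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights at inference time."""
    return [v * scale for v in q]

weights = [0.42, -1.27, 0.08, 0.9]
q, scale = quantize_int8(weights)
print(q)   # [42, -127, 8, 90]
approx = dequantize(q, scale)   # close to the original weights
```

The quantised model is roughly 4x smaller and runs on cheap integer hardware, at the cost of a small, usually tolerable, accuracy loss.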
The internet of things is thus poised to be a top driver of enterprise AI, with specialised artificial intelligence chips based on ASICs and FPGAs powering edge devices.
Neural Network Interoperability Will Become Important
Selecting the right framework is extremely important when developing neural networks. Developers and data scientists can choose from many frameworks, such as Apache MXNet, Microsoft Cognitive Toolkit, TensorFlow, Caffe2 and PyTorch. But once a model has been trained and evaluated in a particular framework, porting it to another is difficult.
One of the factors hampering wider adoption of AI is this lack of interoperability among neural network toolkits. To address the concern, companies such as Facebook, Microsoft and AWS have come together to develop the Open Neural Network Exchange (ONNX). With ONNX, it becomes possible to reuse trained neural network models across frameworks.
ONNX will rise to become a must-have technology for the industry in 2019. Key players, from researchers to edge device builders and many others, will start depending on ONNX as the standard runtime for inferencing.