For several years now, the Consumer Electronics Show (CES) has served to herald the application of new technology in the latest crop of consumer gadgets. The 2018 exhibition was attended by 106,288 visitors eager not only to do business and learn what products they can expect in the market during the year, but also to find out about cutting-edge technologies still in development.
Among the key themes of CES 2019 are machine learning (ML) and artificial intelligence (AI), with 39 conference sessions across various tracks expected to discuss these topics. The reason is simple: artificial intelligence touches a wide variety of market segments, including robotics, fintech and digital money, automotive, and even marketing.
ML, the capability by which machines self-correct and refine how they perform tasks, and the broader field of AI have been in development for decades and have now reached a point where they are about to initiate a revolutionary restructuring of business.
From monitoring and controlling the manufacturing floor to predicting trends for business decisions, smart applications will be made possible by AI's ability to turn the mountains of data we are already generating into actionable insights. AI will take automation beyond simple repetitive tasks and extend it to complex processes, including those with variability.
This growing sophistication is made possible by improving infrastructure, such as cloud-based tools. One example is Microsoft Azure Machine Learning, whose automated ML capability identifies suitable algorithms, tunes parameters, and deploys the resulting models seamlessly to the cloud and the edge.
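The core idea behind automated ML, trying several candidate algorithms on training data, scoring each on a held-out validation set, and keeping the best performer, can be sketched in a few lines of plain Python. This is a simplified illustration of the concept, not the Azure Machine Learning API; the candidate models and data here are invented for the example.

```python
# Simplified sketch of what an automated-ML service does internally:
# fit several candidate models, score each on held-out data, keep the
# best. (Illustrative only -- not the Azure Machine Learning API.)

def fit_constant(xs, ys):
    """Baseline candidate: always predict the training mean."""
    mean = sum(ys) / len(ys)
    return lambda x: mean

def fit_linear(xs, ys):
    """Candidate: closed-form least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    b = my - a * mx
    return lambda x: a * x + b

def mse(model, xs, ys):
    """Mean squared error of a model on a data set."""
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def auto_select(candidates, train, valid):
    """Fit every candidate on train; return (error, name, model) of the
    one with the lowest validation error."""
    scored = []
    for name, fit in candidates.items():
        model = fit(*train)
        scored.append((mse(model, *valid), name, model))
    scored.sort(key=lambda t: t[0])
    return scored[0]

# Noisy samples of y = 2x + 1, split into training and validation sets
train = ([0, 1, 2, 3, 4], [1.1, 2.9, 5.2, 6.8, 9.1])
valid = ([5, 6], [11.0, 13.1])
error, name, model = auto_select(
    {"constant": fit_constant, "linear": fit_linear}, train, valid
)
print(name)  # the linear model wins on this data
```

Production services search far larger spaces of algorithms and hyperparameters and parallelize the trials, but the select-by-validation-score loop is the same.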
Arrow Electronics has already brought this level of cloud-based intelligence to consumer Internet of Things (IoT) products. For instance, Arrow’s SmartEverything board easily connects to multiple cloud providers to enable data analytics for decision making.
AI at the edge
In the industrial environment, the case for AI at the edge is well established: large numbers of sensors for predictive maintenance and machine condition monitoring generate data at high volume and velocity, and real-time decision-making requirements make it prohibitively expensive to stream all of that data to the cloud. In consumer products, too, AI is reaching into cameras and smart speakers to avoid hauling large amounts of audio and video data into the cloud, reduce latency and address privacy issues. For instance, high-resolution cameras coupled with ML models that infer demographics such as age, gender and mood give retailers insight into the consumers in their stores, but raise privacy concerns if inferencing is not done at the edge.
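The privacy benefit of the retail example above can be sketched in plain Python: inference runs on-device against the raw frames, and only a small anonymous summary is ever transmitted upstream. The classifier and frame fields here are hypothetical stand-ins for a real on-device demographics model.

```python
# Sketch of privacy-preserving edge analytics: raw camera frames are
# analyzed on-device, and only anonymous aggregate counts leave the
# store. The "classifier" is a stand-in for a real ML model.
from collections import Counter

def classify_frame(frame):
    """Placeholder for on-device inference: maps a frame to an
    (age_band, mood) pair. A real deployment would run a neural
    network on the image pixels here."""
    return frame["age_band"], frame["mood"]

def aggregate_on_device(frames):
    """Run inference locally and keep only counts -- the raw video
    never needs to be uploaded to the cloud."""
    counts = Counter(classify_frame(f) for f in frames)
    # Only this small, anonymous summary is sent upstream.
    return {f"{age}/{mood}": n for (age, mood), n in counts.items()}

frames = [
    {"age_band": "18-34", "mood": "happy"},
    {"age_band": "18-34", "mood": "happy"},
    {"age_band": "35-54", "mood": "neutral"},
]
summary = aggregate_on_device(frames)
print(summary)  # {'18-34/happy': 2, '35-54/neutral': 1}
```

The design choice is that raw, personally identifiable data stays on the device; the cloud receives only the aggregate the retailer actually needs.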
AI at the edge is enabled by ML models that are trained in the cloud with large data sets but easily deployed to each device or gateway at the edge for inferencing. The computing resources that make this possible include specialized AI chips. Qualcomm, for instance, has several processors that can handle such workloads at the IoT node. Among the most powerful of such Qualcomm chips is the Snapdragon 845.
The system on chip (SoC) offers:
- Spectra 280 image signal processor, which enables capture of up to 16 MP at 60 images per second,
- Adreno 630 visual processing subsystem, which features room-scale 6DoF (degrees of freedom) positional tracking with simultaneous localization and mapping (SLAM),
- Adreno Foveation, which reduces the GPU’s workload by giving higher priority to image resolution within the user’s fixation point, and
- Hexagon 685 DSP, which was designed specifically for on-device AI and ML and efficiently handles image, voice and sensor data.
This year, eInfochips, an Arrow company, began offering a hardware development kit (HDK) based on the Snapdragon 845 mobile platform. The Eragon 845 HDK provides an open-frame solution that supports Android 8.0 and contains a processor card with Snapdragon 845, a mini-ITX carrier board, a 12V AC power adapter, battery and USB cable. It supports camera and display as optional accessories.
Another family of devices built for AI at the edge comes from NVIDIA. The company’s Jetson systems, including the Jetson TX1, TX2 and AGX Xavier, can all handle AI workloads. The Jetson TX2, which comes in three versions – TX2, TX2i and TX2 4GB – allows bigger, more complex deep neural networks than the TX1. The latest module, the AGX Xavier, consumes 30 W to deliver 32 TOPS of performance and finds use in particularly demanding AI applications, such as handheld real-time DNA sequencing and industrial robots.
Interested in learning more?
For more information on this topic, or to get in touch with an engineering specialist who can help answer any questions you might have, head to arrow.com