Introduction:
Several new ideas, concepts, and forecasts were presented at JEDEC’s Mobile & IoT Forum on March 26, 2018 in Santa Clara, CA. Those related to Artificial Intelligence/Machine Learning/Deep Learning and the new IT requirements for edge computing are summarized in this article. In particular, three presentations are detailed, and this author offers some thoughts on AI/ML for the intelligent IT edge.
- I. Making Sense of Artificial Intelligence – A Practical Guide
- II. Signs of Intelligent Life: AI Simplifies IoT
- III. A Distributed World – the New IT Requirements of Edge Computing
Review of 3 JEDEC Forum Presentations:
I. Making Sense of Artificial Intelligence – A Practical Guide:
This keynote presentation by Young Paik of Samsung was the clearest one I’ve ever heard on Artificial Intelligence (AI), one of the most hyped and fudged technologies today. Although AI has existed in many forms for decades (this author took a grad course in AI in 1969), recent advances in Deep Learning (DL) and neural net processors have finally made it commercially realizable and marketable. According to Young, there is real promise for AI and DL, but there are also real limitations. His talk provided an introductory overview of how AI and DL work today and some insights into different deployment scenarios.
DL has enabled AI to approach human-level accuracy, as per this illustration:
A high-level AI functional flow (but not an implementation) and the Circle of DL Life are shown in the two graphics below.
In the second illustration, note that DL models need to be constantly fed data. A home thermostat is used as an example of AI feedback:
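To make that feedback loop concrete, here is a minimal, hypothetical Python sketch (this author's illustration, not from the talk): the deployed model predicts a setpoint, the user's corrections are logged as new data, and the model is periodically retrained, closing the loop.

```python
# Hypothetical sketch of the thermostat feedback loop: prediction -> user
# correction -> new training data -> retrain. All values are invented.

from dataclasses import dataclass, field

@dataclass
class ThermostatModel:
    """Toy 'model': predicts a setpoint as the mean of logged corrections."""
    history: list = field(default_factory=list)
    setpoint: float = 20.0  # initial guess, degrees C

    def predict(self) -> float:
        return self.setpoint

    def retrain(self) -> None:
        # 'Training' here is just averaging the user's corrections.
        if self.history:
            self.setpoint = sum(self.history) / len(self.history)

model = ThermostatModel()
for user_correction in [21.0, 22.0, 21.5]:  # user overrides the prediction
    model.history.append(user_correction)   # feedback becomes new data
    model.retrain()                          # the Circle of DL Life closes
print(f"learned setpoint: {model.predict():.1f} C")
```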
Mr. Paik said there were three takeaways from his talk:
1. Data is King: the more data, the greater the accuracy.
2. Deep Learning is hard: best to leave it to the professionals.
3. You don’t have to use one AI: many smaller AIs are better than one big one.
The following illustration proposes functional blocks for implementing mobile speech recognition:
Two ways to improve DL are Transfer Learning (take a pre-trained DL model and retrain it with new data) and Model Compression (selectively remove weights and nodes which may not be important). Those “tricks” could permit you to remove several functional blocks in the previous illustration (see above).
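As an illustration only (this author's sketch, not Samsung's), the two techniques might look like this in PyTorch; the resnet18 backbone, the 10-class head, and the 50% pruning amount are all assumptions made for the example.

```python
# Sketch of transfer learning (retrain a pre-trained model on new data) and
# model compression via pruning (drop low-magnitude weights).

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision import models

# Transfer learning: start from a pre-trained backbone...
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False              # freeze the learned features
model.fc = nn.Linear(model.fc.in_features, 10)  # ...retrain only a new head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
# (a training loop over the new dataset would go here)

# Model compression: prune 50% of the smallest-magnitude weights in the
# new head, then make the pruning permanent.
prune.l1_unstructured(model.fc, name="weight", amount=0.5)
prune.remove(model.fc, "weight")
```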
Finding new ways of using old technology and making use of multiple types of AI are shown in the following two figures:
Four different use cases (i.e., applications) of AI are shown in this slide:
In conclusion, Young suggested the following:
• AI is still early in its development.
• Design of AI systems is evolving.
• You may find new uses for old ideas.
II. Signs of Intelligent Life: AI Simplifies IoT
In his opening keynote presentation, Stephen Lum of Samsung said that device volumes in some IoT industry verticals have seen an explosion of demand due to the introduction of Artificial Intelligence into their usage models.
Connecting and controlling those devices drives tremendous data traffic volumes into the cloud, where the AI/ML/DL actually takes place. For example, the Amazon Echo and Google Home have all voice recognition, language understanding, and AI/ML/DL done on cloud-resident data center compute servers owned and programmed by Amazon and Google, respectively.
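As a hedged illustration of that split (this author's sketch; the endpoint URL and response format are invented, not Amazon's or Google's actual APIs), the device-side code can stay trivially simple because all of the AI runs in the cloud:

```python
# Hypothetical device-to-cloud split: the edge device only captures audio
# and ships it upstream; recognition and understanding happen in the cloud.

import json
import urllib.request

def ask_cloud_assistant(audio_bytes: bytes) -> str:
    """Send raw audio upstream; all AI/ML/DL runs on the cloud server."""
    req = urllib.request.Request(
        "https://assistant.example.com/v1/recognize",  # hypothetical endpoint
        data=audio_bytes,
        headers={"Content-Type": "audio/wav"},
    )
    with urllib.request.urlopen(req) as resp:
        # Invented response shape: the cloud returns the recognized intent.
        return json.load(resp)["intent"]  # e.g. "turn_on_living_room_lights"
```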
Autonomous vehicles will also have AI/ML/DL done in the cloud, but likely at the network edge, to provide ultra-low latency.
Stephen stated that a simple thesis of deep learning is that the more data used to train neural networks, the smarter the AI becomes. Some of his other cogent data points were the following:
- New AI chips are being designed to efficiently process deep neural networks.
- Solid state memory needs to keep pace with processors to prevent bottlenecks. See bullet points below for UFS.
- Scalability becomes more critical as consumers adopt new services.
- Universal Flash Storage (UFS) is a high performance, scalable interface for storage at the edge of the network.
- UFS combines the key features of eMMC (embedded Multi-Media Controller) and SSDs (Solid State Drives).
- UFS Card brings benefits to a removable storage form factor.
The diverse needs of three popular IoT industry verticals were said to be as follows:
- Wearables (e.g., smartwatches, fitness trackers): Low power, Low density, Specialized form factors.
- Smart Home (AKA Connected Home): Low cost, Low to mid density, Low to high bandwidth (depending on the device to be analyzed and/or controlled), 2-5 years longevity.
- Automotive (more than just autonomous vehicles): High bandwidth, High density, Very high reliability, 7-10 years longevity.
Summary:
- Artificial Intelligence (AI) is enabling more innovative real-time services to be delivered to consumers.
- AI in the Cloud simplifies edge devices and encourages their adoption with a low cost of entry.
- Autonomous vehicles, which cannot be Cloud dependent, will become AI servers on wheels.
- JEDEC has enabled tremendous advances in memory while expediting quick adoption and provides a firm foundation for memory-related ecosystems.
III. A Distributed World – the New IT Requirements of Edge Computing:
The number of distributed, connected data sources throughout the world has been multiplying rapidly, and those sources are creating tremendous amounts of data. IoT has now given rise to a new trend of aggregating, computing, and leveraging data closer to where it is generated: at the IT “Edge,” between the Cloud and the IoT endpoint device.
This presentation by Jonathan Hinkle of Lenovo provided insights into the new IT requirements for edge computing systems and how they are used.
Jonathan asked: How do we leverage our IT resources to unlock the value of all the data now being generated? While no direct answer was given, several suggestions were made during his presentation.
Ideally, we should be able to gain many things from analyzing “big data,” including gaining business insights, optimizing services, recognizing behaviors, and identifying problems (when they occur).
IoT Architecture Components include the following:
- Software: Analytics, Applications, Platforms
- Security, Networking, Management
- IoT Endpoint devices (“things”)
- Edge IT (especially for low latency applications)
- Core network and cloud IT
The functions required from the IoT endpoints to the cloud are: Generate / Automate / Control / Pre-process / Aggregate / Analyze / Store / Share the data. Observations:
- It costs time, money, and power to move data.
- Best practice: move data only when useful or necessary.
- Reduce the data set required to be forwarded to each stage.
Keeping data local (instead of backhauling it to the cloud for processing/analysis) requires the following:
- Store data nearer to its sources (IoT endpoints) whenever possible. This is accomplished by filtering data at the edge so that less data is sent upstream to the cloud for analysis by powerful compute servers.
- Maintain fast action on time-sensitive data by doing computation immediately instead of moving the data first (see the sketch below).
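Here is a minimal, hypothetical Python sketch of those two points: raw readings stay at the edge, time-sensitive values trigger immediate local action, and only a reduced summary is forwarded upstream. The threshold, data, and function names are invented for illustration.

```python
# Hypothetical edge filtering: act locally on urgent readings, forward
# only an aggregate summary to the cloud instead of the raw data set.

from statistics import mean

ALERT_THRESHOLD = 80.0  # act locally, right away, above this value

def trigger_local_action(value: float) -> None:
    # Low latency: handled at the edge, no cloud round trip required.
    print(f"local alert: reading {value} exceeded {ALERT_THRESHOLD}")

def process_at_edge(readings: list[float]) -> dict:
    """Reduce a window of raw sensor data to the summary worth moving."""
    for value in readings:
        if value > ALERT_THRESHOLD:
            trigger_local_action(value)
    # Only the aggregate leaves the edge; raw data stays near its source.
    return {"count": len(readings),
            "mean": mean(readings),
            "max": max(readings)}

summary = process_at_edge([42.1, 55.0, 81.3, 60.2])
print("forward upstream:", summary)  # one small record, not the raw set
```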
In conclusion, Mr. Hinkle said that data growth will continue as the sources multiply, both computing devices (e.g., smartphones, tablets, other handheld gadgets) and IoT endpoints that produce digital data to represent our analog world. “Edge IT infrastructure will enable us to scale with that data growth and unlock its inherent value.”
Author’s Note: Mr. Hinkle did not touch on how much, if any, AI/DL would be implemented in the “Edge IT infrastructure.” Unfortunately, the moderator didn’t permit time for questions like that one to be addressed. In an email to this author, Jonathan wrote: “My personal thoughts on AI/ML/DL moving to the edge – absolutely, but just like many things it’s a matter of time.”
AI/ML/DL at the IT Edge or in IoT Devices:
Especially for IoT use cases, many experts believe that AI/ML/DL will move from cloud resident servers to the mobile edge of the network or a server/gateway on customer premises.
Qualcomm takes it one step further and talks about “on-device AI” in this article.
Similarly, Arm’s Project Trillium is a new AI processing platform specifically designed for machine learning and neural network capabilities. According to this article, it can scale from being embedded in IoT devices, to the IT edge, to powerful compute servers located in cloud-resident data centers.
Comments:
Excellent summary, Alan, and a lot of food for thought here; it reinforces things I heard at last week’s Nvidia GPU conference. The importance of data reminds me of what Nvidia’s Jensen Huang said: that “data is the new source code.”
Regarding the idea that machine learning is hard, Huang agreed and summarized it with an acronym, PLASTER:
https://blogs.nvidia.com/blog/2018/03/26/live-jensen-huang-keynote-2018-gtc/
- Programmability
- Latency
- Accuracy
- Size
- Throughput
- Energy efficiency
- Rate of learning
From a GPU perspective, Huang suggested that GPU performance is beating what Moore’s Law would have predicted by 2.5x when he stated, “that’s 25X growth in five years, whereas Moore’s Law would have suggested 10x growth.” (25x against the 10x that Moore’s Law would have delivered over five years works out to that 2.5x figure.)
This, according to Huang, is fueling AI, which he described as “growing in double exponential rate.”
It looks like they are about two generations away from chipsets that will go into commercial autonomous vehicles, which would put them at about 2020/2021. That is consistent with what Jensen said in an off-the-cuff press conference he held after his keynote: for the next 2 to 3 years most of their AV revenue would be from development kits, and after that it would be commercial. They have approximately 370 AV partners.
They also had a significant announcement for a product called CLARA, which uses AI to create 3D, photorealistic images/videos from CT-scan-type imagery. He didn’t say it, but it could potentially have a huge impact on the “right-to-life” debate, as the images/videos bring two-dimensional scans to life.
Here is my interview with Alain Kornhauser at that conference, a professor from Princeton, who focuses on autonomous vehicles and such:
http://viodi.com/2018/03/28/its-the-edge-cases-youve-got-to-get-right/
Kornhauser is of the mind that 5G is a nice-to-have, but not a must-have. This is consistent with others in the AV industry; Mercedes-Benz, for example, says they design with the idea that they can’t count on having connectivity and that, if they do, it won’t be reliable.
Regarding Project Trillium, Nvidia announced its support for Arm’s initiative by making its open-source Deep Learning Accelerator available for these low-cost edge processors.
https://nvidianews.nvidia.com/news/nvidia-and-arm-partner-to-bring-deep-learning-to-billions-of-iot-devices
Did I hear AI everywhere?
Thanks for your comment, Ken. My primary unresolved issue is what AI/ML will come to the intelligent IT edge, and when.
At the IDC Directions 2018 conference last month, the following forecasts were made for the “Intelligent Network Edge”:
- Intelligent Edge IT goes mainstream in 2022, replacing 88% of existing edge applications.
- The edge network must be as agile as the apps and services it supports.
- AI/ML at the intelligent edge will enable policy and intent to facilitate autonomous networking.
- Industry-specific clouds will complicate the networking functions to be done at the edge.