Introduction:
Several new ideas, concepts and forecasts were presented at JEDEC’s Mobile & IoT Forum on March 26, 2018 in Santa Clara, CA. Those related to Artificial Intelligence/Machine Learning/Deep Learning and to new IT requirements for edge computing are summarized in this article. In particular, three presentations are detailed, and this author adds some thoughts on AI/ML for the intelligent IT edge.
- I. Making Sense of Artificial Intelligence – A Practical Guide
- II. Signs of Intelligent Life: AI Simplifies IoT
- III. A Distributed World – the New IT Requirements of Edge Computing
Review of 3 JEDEC Forum Presentations:
I. Making Sense of Artificial Intelligence – A Practical Guide:
This keynote presentation by Young Paik of Samsung was the clearest one I’ve ever heard on Artificial Intelligence (AI) – one of the most hyped and fudged technologies today. Although AI has existed in many forms for decades (this author took a grad course in AI in 1969), recent advances in Deep Learning (DL) and neural net processors have finally made it commercially realizable and marketable. According to Young, there is real promise for AI and DL, but there are also real limitations. His talk provided an introductory overview of how AI and DL work today and some insights into different deployment scenarios.
DL has enabled AI to approach human-level accuracy, as shown in this illustration:

A high level AI functional flow (but not implementation) and the Circle of DL Life are shown in the two graphics below.
In the second illustration, note that DL models need to be constantly fed data. A home thermostat is used as an example of AI feedback:
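To make that feedback loop concrete, here is a toy sketch from this author (not from the presentation) of the deploy / collect / retrain cycle for a hypothetical smart thermostat; the simple averaging stands in for real model training, and all names and values are illustrative:

```python
# Toy illustration of the "Circle of DL Life": deploy a model, collect new data
# from its use, retrain, and redeploy. The thermostat details are hypothetical.
from collections import defaultdict

class ThermostatModel:
    """Predicts a preferred setpoint per hour of day from observed user overrides."""
    def __init__(self):
        self.samples = defaultdict(list)                # hour -> observed setpoints
        self.setpoints = {h: 20.0 for h in range(24)}   # the deployed "model"

    def collect(self, hour, observed_setpoint):
        # Feedback: every manual adjustment becomes new training data.
        self.samples[hour].append(observed_setpoint)

    def retrain(self):
        # "Training" here is just averaging; a real system would retrain a DL model.
        for hour, obs in self.samples.items():
            self.setpoints[hour] = sum(obs) / len(obs)

    def predict(self, hour):
        return self.setpoints[hour]

model = ThermostatModel()
for hour, temp in [(7, 21.5), (7, 22.0), (23, 18.0)]:   # user overrides during the day
    model.collect(hour, temp)
model.retrain()                                          # periodic retraining step
print(model.predict(7))                                  # ~21.75, the new 7 AM setpoint
```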


Mr. Paik said there were three takeaways from his talk:
1. Data is King: the more data, the greater the accuracy.
2. Deep Learning is hard. Best to leave it to the professionals.
3. You don’t have to use one AI: Many, smaller AIs are better than one big one.
The following illustration proposes functional blocks for implementing mobile speech recognition:

Two ways to improve DL are Transfer Learning (take a pre-trained DL model and retrain it with new data) and Model Compression (selectively remove weights and nodes that may not be important). Those “tricks” could permit you to remove several functional blocks from the previous illustration. A rough sketch of both techniques is shown below.
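As a hedged illustration (this author’s sketch, not from the talk), the snippet below shows how transfer learning and pruning-based model compression might look in PyTorch/torchvision (assuming torchvision 0.13 or later for the weights API); the 10-class head and the 30% pruning amount are arbitrary, hypothetical choices.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision import models

# --- Transfer learning: start from a pre-trained model, retrain only a new head ---
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # pre-trained on ImageNet
for param in model.parameters():
    param.requires_grad = False                       # freeze the pre-trained backbone
model.fc = nn.Linear(model.fc.in_features, 10)        # new head for a hypothetical 10-class task
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
# ...train only model.fc on the new, much smaller dataset as usual...

# --- Model compression: prune weights that contribute little ---
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.l1_unstructured(module, name="weight", amount=0.3)  # zero the 30% smallest weights
        prune.remove(module, "weight")                # make the pruning permanent
```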
Finding new ways of using old technology and making use of multiple types of AI are shown in the following two figures:


Four different use cases (i.e. applications) of AI are shown in this slide:
In conclusion Young suggested the following:
• AI is still early in its development.
• Design of AI systems is evolving.
• You may find new uses for old ideas.
II. Signs of Intelligent Life: AI Simplifies IoT
In his opening keynote presentation, Stephen Lum of Samsung said that some IoT device volumes (in certain industry verticals) have seen an explosion of demand due to the introduction of Artificial Intelligence into their usage model.
The connection and control of those devices is driving tremendous data traffic volumes into the cloud, where the AI/ML/DL actually takes place. For example, for Amazon Echo and Google Home connected-device control, all voice recognition, language understanding, and AI/ML/DL are done in cloud-resident data center compute servers owned and programmed by Amazon and Google, respectively.
Autonomous vehicles will also have AI/ML/DL done in the cloud, but likely at the network edge (rather than in distant, centralized data centers) to provide ultra-low latency.
Stephen stated that a simple thesis of deep learning is that the more data used to train neural networks, the smarter the AI becomes. Some of his other cogent data points were the following:
- New AI chips are being designed to efficiently process deep neural networks.
- Solid state memory needs to keep pace with processors to prevent bottlenecks. See bullet points below for UFS.
- Scalability becomes more critical as consumers adopt new services.
- Universal Flash Storage (UFS) is a high performance, scalable interface for storage at the edge of the network.
- UFS combines the key features of eMMC (embedded MultiMediaCard) and SSDs (Solid State Drives).
- UFS Card brings benefits to a removable storage form factor.
The diverse needs of three popular IoT industry verticals were said to be as follows:
- Wearables (e.g. smart watches, fitness trackers, etc): Low power, Low density, Specialized form factors.
- Smart Home (AKA Connected Home): Low cost, Low to mid density, Low to high bandwidth (depending on the device to be analyzed and/or controlled), 2-5 years longevity.
- Automotive (more than just autonomous vehicles): High bandwidth, High density, Very high reliability, 7-10 years longevity.
Summary:
- Artificial Intelligence (AI) is enabling more innovative real-time services to be delivered to consumers.
- AI in the Cloud simplifies edge devices and encourages their adoption with a low cost of entry.
- Autonomous vehicles, which cannot be Cloud dependent, will become AI servers on wheels.
- JEDEC has enabled tremendous advances in memory while expediting their adoption, and provides a firm foundation for memory-related ecosystems.
III. A Distributed World – the New IT Requirements of Edge Computing:
The number of distributed, connected data sources throughout the world has been multiplying rapidly, and those sources are creating tremendous amounts of data. IoT has now given rise to a new trend of aggregating, computing on, and leveraging data closer to where it is generated: at the IT “Edge,” between the Cloud and the IoT endpoint device.
This presentation by Jonathan Hinkle of Lenovo provided insights into the new IT requirements for edge computing systems and how they are used.
Jonathan asked: How do we leverage our IT resources to unlock the value of all the data now being generated? While no direct answer was given, several suggestions were made during his presentation.
Ideally, we should be able to gain many things from analyzing “big data,” including business insights, service optimization, behavior recognition, and problem identification (when problems occur).
IoT Architecture Components include the following:
- Software: Analytics, Applications, Platforms
- Security, Networking, Management
- IoT Endpoint devices (“things”)
- Edge IT (especially for low latency applications)
- Core network and cloud IT
The functions required from the IoT endpoints to the cloud are: Generate Data / Automate / Control / Pre-process / Aggregate / Analyze / Store / Share the data. Observations:
- It costs time, money, and power to move data.
- Best practice: move data when useful or necessary
- Reduce data set required to be forwarded to each stage
Keeping data local (instead of backhauling it to the cloud for processing/analysis) requires the following (see the sketch after this list):
- Storing data nearer to its sources (the IoT endpoints) whenever possible. This is accomplished by filtering data at the edge, so that less data is sent upstream to the cloud to be analyzed by powerful compute servers.
- Maintaining fast action on time-sensitive data by doing the computation immediately instead of moving the data first.
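As noted above, here is a minimal sketch (this author’s illustration, not Mr. Hinkle’s) of edge-side filtering and aggregation, in which only alerts and compact summaries are forwarded upstream; send_upstream(), the threshold, and the payload format are all hypothetical placeholders:

```python
# Filter and aggregate sensor data at the edge so that only a reduced data set
# leaves the premises; act locally and immediately on time-sensitive readings.
import statistics

THRESHOLD = 80.0          # e.g. an alarm limit for a temperature sensor, in degrees C

def send_upstream(payload):
    # Placeholder for the network call to the core/cloud (MQTT, HTTPS, etc.).
    print("forwarding to cloud:", payload)

def process_at_edge(readings):
    # Time-sensitive data: handle exceptions right away, forwarding only the alerts.
    for value in readings:
        if value > THRESHOLD:
            send_upstream({"type": "alert", "value": value})
    # Everything else: forward one compact aggregate instead of every raw sample.
    send_upstream({
        "type": "summary",
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
    })

process_at_edge([21.3, 22.1, 85.4, 23.0])   # one alert + one summary leave the edge
```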

In conclusion, Mr. Hinkle said that data growth will continue as the sources multiply – both from computing sources (e.g. smart phones, tablets, other handheld gadgets) and IoT endpoints which produce digital data to represent our analog world. “Edge IT infrastructure will enable us to scale with that data growth and unlock its inherent value.”
Author’s Note: Mr. Hinkle did not touch on how much, if any, AI/DL would be implemented in the “Edge IT infrastructure.” Unfortunately, the moderator didn’t permit time for questions like that one to be addressed. In an email to this author, Jonathan wrote: “My personal thoughts on AI/ML/DL moving to the edge – absolutely, but just like many things it’s a matter of time.”
AI/ML/DL at the IT Edge or in IoT Devices:
Especially for IoT use cases, many experts believe that AI/ML/DL will move from cloud resident servers to the mobile edge of the network or a server/gateway on customer premises.
Qualcomm takes it one step further and talks about “on-device AI” in this article.
Similarly, Arm’s Project Trillium is a new AI processing platform specifically designed for machine learning and neural network capabilities. According to this article, it can scale from being embedded in IoT devices, to the IT edge, to powerful compute servers located in cloud-resident data centers.