Sponsored Content

Processing has moved to the edge. Now it’s becoming “aware,” says NXP

If you’ve been following the evolution of IoT, you know that in the early days of this revolution, data generated by users’ devices wasn’t processed locally but at faraway data centers. But before long, it became obvious this wasn’t the most effective approach: the round trip between where the data was generated and the data center took too long, cost too much, and consumed communications resources.

“You wouldn’t notice the milliseconds of latency in your smart thermostat,” says Ron Martino, senior vice president and general manager of edge processing at NXP. “But for an industrial robot and many real-time systems, it can be the difference between a safety hazard and a productive assembly line, and for connected vehicles, it can be a matter of life and death.”

These issues can be resolved if machine learning algorithms process data locally, provided enough processing performance can be delivered at a reasonable cost and in a small enough footprint. But cost-effectiveness wasn’t a priority when complex edge processing was first conceived, so higher-end gateways and cloud computers were the solutions.

But not long after, microprocessor-based SoCs made a great leap forward in performance and capability while costs fell. Though typically considered commodities, microcontrollers began to include multiple processors dedicated to specific tasks and offering significant processing power, support for several wireless protocols, advanced power management, and other impressive features.

Today, they are powerful enough to make decisions based on data aggregated from multiple sensors. They perform analysis that was formerly the sole domain of the cloud, send commands to machines with virtually no latency, and transfer only a summary of the information (a much smaller amount) to the cloud. In addition to near-zero latency, this reduces communication and cloud-processing costs and energy consumption, and it keeps proprietary information safe at the user’s location. An intelligent door lock, for example, can unlock your doors when it recognizes your face because it “knows it’s you,” and it can store and process the image data locally for a speedier response and enhanced privacy.
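To make that pattern concrete, here is a minimal sketch of such a firmware loop in C. Everything in it is illustrative: the sensor reads, the shutdown actuator, and the cloud_upload() call are hypothetical stand-ins, not any particular vendor’s API.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical board-support stubs -- placeholders, not a real API. */
extern float read_temperature(void);                         /* degrees C */
extern float read_vibration(void);                           /* mm/s RMS  */
extern void  actuate_shutdown(void);                         /* local, near-zero latency */
extern void  cloud_upload(const void *buf, uint32_t len);    /* summary only */

/* Compact summary: kilobytes of raw samples reduce to a few fields. */
typedef struct {
    float    avg_temp;
    float    peak_vibration;
    uint32_t sample_count;
    bool     fault_detected;
} edge_summary_t;

void edge_loop(void)
{
    edge_summary_t s = {0};

    for (uint32_t i = 0; i < 1000; i++) {        /* aggregate locally */
        float t = read_temperature();
        float v = read_vibration();

        s.avg_temp += t;
        if (v > s.peak_vibration) s.peak_vibration = v;
        s.sample_count++;

        if (t > 95.0f || v > 12.0f) {            /* decide locally, act immediately */
            s.fault_detected = true;
            actuate_shutdown();
        }
    }
    s.avg_temp /= (float)s.sample_count;

    cloud_upload(&s, sizeof s);                  /* send the summary, not the raw stream */
}
```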


So, now that edge computing adoption is expanding, the next step is to add even more intelligence to achieve a level of “awareness.” If this seems fanciful, consider that after being trained, tiny microprocessor-based systems can now perform machine learning and make decisions without any external assistance, on minimal power, in a footprint the size of a postage stamp. Add sensors to one of these systems and they become data-generating, decision-making powerhouses.

“This aware edge is much more intelligent and capable of solving very complex problems because aware devices can collaborate with each other and ‘know’ much more about the environment and can act with even greater insights,” says Martino. “The difference between an intelligent device and one that is aware is that the former may process only voice, for example, and follow its instructions, while an aware device in a collaborative network could combine nuances such as the tone of the voice, facial expressions, and body gestures.”

With the aware edge, data from multiple smart home sensors can be combined to recognize danger signals, such as someone falling, and alert a family member or caregiver. And if some data must be sent to the cloud, an intelligent edge processor first anonymizes it, along with other sensitive information such as the audio profiles of your voice and the voices of other authenticated users.
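A rough sketch of what that anonymization step could look like, assuming a hypothetical one-way hash helper and upload call (neither tied to any real product):

```c
#include <stdint.h>
#include <string.h>

extern uint32_t fnv1a_hash(const char *s);      /* any one-way hash would do here */
extern void     cloud_upload(const void *buf, uint32_t len);

typedef struct {
    uint32_t speaker_token;   /* one-way hash of the local speaker ID, not the voiceprint */
    char     event[16];       /* e.g. "fall_detected" */
    uint32_t timestamp;
} anon_event_t;

/* Strip identity locally: the cloud sees a token and an event, never
 * the raw audio or the household member's name. */
void report_event(const char *speaker_id, const char *event, uint32_t now)
{
    anon_event_t e = { .speaker_token = fnv1a_hash(speaker_id),
                       .timestamp     = now };
    strncpy(e.event, event, sizeof e.event - 1);
    e.event[sizeof e.event - 1] = '\0';
    cloud_upload(&e, sizeof e);
}
```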

“We’ve gone from home security cameras that sense and record activity, to aware cameras that scan faces and recognize family members and even alert authorities of suspicious activity. They can also be occupancy-aware, recognizing when the house is empty and then automatically arming the security system,” says Martino.

“A typical security camera might capture video of car thieves in your driveway, but that’s not all that helpful if you’re watching it hours later,” he continues. “You may have a timestamp from 2:33 am to 2:45 am that shows the event, but all it did was record the theft instead of making any meaningful decisions. If you can add awareness and context, the system will spot a person it doesn’t recognize who likely should not be there at that hour and send out an alert to the owner as it happens.”
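The contextual rule Martino describes might reduce to something like the following sketch, where the face-match score, clock helper, and notifier are invented placeholders:

```c
#include <stdbool.h>

extern float match_known_faces(void);   /* 0.0-1.0 score from the on-device model */
extern int   local_hour(void);          /* 0-23 */
extern void  notify_owner(const char *msg);

/* Awareness means combining signals: an unfamiliar face alone is not an
 * alarm, and 2:30 a.m. alone is not an alarm, but both together are. */
void evaluate_visitor(void)
{
    bool unknown  = match_known_faces() < 0.6f;   /* below match threshold */
    int  hour     = local_hour();
    bool odd_hour = (hour >= 23 || hour < 5);

    if (unknown && odd_hour)
        notify_owner("Unrecognized person on the property -- alert sent as it happens");
}
```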

In an occupancy-aware building, data from cameras and other sensors can be processed to control temperature and lighting to optimize energy efficiency. 


For predictive maintenance, data from multiple sensors can be collected and analyzed for anomalies, so the system can alert the user before a failure occurs.
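One simple way such an anomaly check could work is a running mean and variance with a z-score threshold, sketched below using Welford’s online algorithm; the vibration sensor and alert hook are hypothetical.

```c
#include <math.h>

extern float read_vibration(void);
extern void  raise_maintenance_alert(float value, float zscore);

/* Welford's online algorithm: running mean/variance without storing samples,
 * a good fit for a memory-constrained microcontroller. */
static float mean = 0.0f, m2 = 0.0f;
static unsigned long n = 0;

void check_sample(void)
{
    float x = read_vibration();

    n++;
    float delta = x - mean;
    mean += delta / (float)n;
    m2   += delta * (x - mean);

    if (n > 30) {                               /* wait until a baseline exists */
        float std = sqrtf(m2 / (float)(n - 1));
        float z   = (std > 0.0f) ? (x - mean) / std : 0.0f;
        if (fabsf(z) > 4.0f)                    /* drifting far from normal */
            raise_maintenance_alert(x, z);
    }
}
```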

In fleet operations, a smart wearable monitor can sense fatigue and sound an alarm to alert the driver. If the driver is unresponsive, it can decide to notify someone at the company. Using machine learning, predictions can be made about where and when drivers are likely to become fatigued, which can help improve safety and productivity.
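That escalation logic is essentially a small state machine. Here is a minimal sketch, again with hypothetical I/O hooks standing in for the wearable’s real interfaces:

```c
#include <stdbool.h>

extern bool fatigue_detected(void);        /* from the wearable's model     */
extern bool driver_acknowledged(void);     /* button press, voice reply     */
extern void sound_cab_alarm(void);
extern void notify_dispatcher(void);

typedef enum { ALERT_NONE, ALERT_DRIVER, ALERT_COMPANY } alert_state_t;

/* Escalate only if the local alarm goes unanswered. */
alert_state_t step(alert_state_t state, unsigned seconds_since_alarm)
{
    switch (state) {
    case ALERT_NONE:
        if (fatigue_detected()) { sound_cab_alarm(); return ALERT_DRIVER; }
        return ALERT_NONE;
    case ALERT_DRIVER:
        if (driver_acknowledged()) return ALERT_NONE;
        if (seconds_since_alarm > 30) { notify_dispatcher(); return ALERT_COMPANY; }
        return ALERT_DRIVER;
    case ALERT_COMPANY:
    default:
        return ALERT_COMPANY;
    }
}
```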

In addition, with the aware edge, vision systems throughout an airport can not only detect abandoned luggage but also automatically trigger an airport-wide search for the last recorded owner of the bag.

In crowded urban areas, object detection systems can classify vehicles and coordinate traffic throughout the area to improve flow.

These are just a few of the functions that can be performed at the aware edge—all by a small, low-power device that is a fraction of the size of an edge computer. In the future, sensor fusion will continue to advance, which will increase accuracy in almost any application in which multiple sensors are used.

To be practical, solutions like these must operate in environments populated by sensors that run on battery power, where energy efficiency is crucial. That is, they must conform to their operating environment rather than the other way around.

To address this, NXP introduced its Energy Flex device architecture, which allows power to individual sections of devices such as NXP’s EdgeVerse processors to be switched on and off as they are needed. Only the active part of the device that needs power gets it, so most of the device can remain in a deep-sleep, low-power state while still staying alert to its environment. NXP has taken this well beyond simple power domains and traditional dynamic controls by leveraging investments in technologies such as fully depleted silicon-on-insulator (FD-SOI) to shut down sources of leakage, architectural enhancements that give finer control over active power through dynamic firmware-based control systems, and expanded heterogeneous compute options within a single processor.
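To picture how fine-grained power gating like this might look from firmware (this is not NXP’s actual Energy Flex or EdgeVerse API; the register address and domain names below are invented purely for illustration), it often comes down to setting and clearing per-domain enable bits:

```c
#include <stdint.h>

/* Invented register map for illustration only -- not real EdgeVerse hardware. */
#define PWR_DOMAIN_CTRL (*(volatile uint32_t *)0x40001000u)

enum {
    DOMAIN_CPU_MAIN  = 1u << 0,   /* big core: off while idle          */
    DOMAIN_CPU_SENSE = 1u << 1,   /* tiny always-on core: stays up     */
    DOMAIN_RADIO     = 1u << 2,
    DOMAIN_NPU       = 1u << 3,   /* ML accelerator: on only per job   */
};

static inline void domain_on(uint32_t d)  { PWR_DOMAIN_CTRL |=  d; }
static inline void domain_off(uint32_t d) { PWR_DOMAIN_CTRL &= ~d; }

/* Deep sleep, but still alert: only the sensing core keeps power. */
void enter_aware_sleep(void)
{
    domain_off(DOMAIN_CPU_MAIN | DOMAIN_RADIO | DOMAIN_NPU);
    domain_on(DOMAIN_CPU_SENSE);
}

/* A wake word or sensor event powers the rest back up on demand. */
void wake_for_inference(void)
{
    domain_on(DOMAIN_NPU | DOMAIN_CPU_MAIN);
}
```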

In the future …

The transition of processing and analysis from the data center to the edge is quite new, but in the blink of an eye, it has already made great leaps forward. Analysis that was once considered impossible in tiny devices connected to a sensor is now not just possible but being implemented in applications from face and voice recognition to object detection. In the coming years, the transition to the aware edge will become commonplace, even in low-cost home automation and surveillance systems. For this, we can thank the once-lowly microcontroller, which has evolved to transform the edge and expand its reach into smart buildings, smart cities, and beyond.