Much of what comes together in the Internet of Things depends on the first node in the network, which is most often a sensor. Transformational Industry 4.0 analytics and automation require even better input, which has kept vendors busy trying to bring smarter embedded sensors to market.

The role of smart sensors is drawing even closer scrutiny now, as an industry-wide architectural shift moves more processing to the edge, reducing reliance on “cloud-only” processing. As sensors proliferate, it becomes impractical to handle all sensor data in the distant cloud, so sensors and the edge nodes they reside on need to get smarter.

Smart Sensors Are Becoming Intelligent

A typical smart sensor has four basic sections: the sensor element itself, an analog-to-digital conversion function, a computational unit (usually a microcontroller), and a communication engine, which nowadays may be wired or wireless.

Many smart sensors, for example, integrate several sensor types, each tuned to a specific purpose. The role of the integrated microcontroller unit (MCU) varies as well. Some embedded MCUs are simply state machines that regulate the data conversion and communication processes, while others run entire sensor fusion algorithms.
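One common fusion task an embedded MCU might run is blending a gyroscope's fast-but-drifting rate signal with an accelerometer's noisy-but-stable tilt reading. The complementary filter below is an illustrative sketch; the function name, sample rate, and blend factor are assumptions, not drawn from any particular vendor's firmware.

```python
# Complementary filter: a minimal sensor-fusion sketch of the kind an
# embedded MCU could run to estimate tilt angle from two imperfect
# sensors. All names and constants here are illustrative assumptions.

def fuse_angle(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (trusted short-term) with the
    accelerometer-derived angle (trusted long-term)."""
    gyro_estimate = prev_angle + gyro_rate * dt  # integrate angular rate
    return alpha * gyro_estimate + (1 - alpha) * accel_angle

# Toy run: the device sits still at 10 degrees of tilt, the gyro reads
# zero rate, and the estimate converges toward the accelerometer angle.
angle = 0.0
for _ in range(500):
    angle = fuse_angle(angle, gyro_rate=0.0, accel_angle=10.0, dt=0.01)
print(round(angle, 2))  # converges close to 10.0
```

The blend factor trades drift rejection against noise rejection: a higher alpha trusts the gyro more between accelerometer corrections.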

Purposeful Power Conservation

A smart sensor is an electronic component that can measure and store physical quantities such as acceleration, light, flow, and humidity, and can also execute more complex operations on that data.

As sensors move closer to the edge of the Internet of Things, power conservation is becoming increasingly important. In some applications, battery-free sensors that capture energy from the environment are becoming more popular.
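Duty cycling is the usual lever for that conservation: the sensor wakes briefly to sample and transmit, then sleeps. The back-of-the-envelope estimate below sketches how the sleep current comes to dominate battery life; the coin-cell capacity and current figures are assumed example values, not measurements.

```python
# Battery-life estimate for a duty-cycled sensor node. The two-state
# (active/sleep) model and all the figures below are illustrative
# assumptions, not data from a specific device.

def battery_life_hours(capacity_mah, active_ma, sleep_ma, active_s, period_s):
    """Estimate runtime from a simple two-state duty cycle."""
    duty = active_s / period_s                        # fraction of time awake
    avg_ma = duty * active_ma + (1 - duty) * sleep_ma # time-weighted current
    return capacity_mah / avg_ma

# Assumed figures: 220 mAh coin cell, 10 mA awake for 0.1 s each minute,
# 2 microamps while asleep.
hours = battery_life_hours(220, active_ma=10.0, sleep_ma=0.002,
                           active_s=0.1, period_s=60.0)
print(f"{hours / (24 * 365):.1f} years")  # on the order of a year
```

Under these assumptions the node spends 99.8% of its time asleep, so shaving the sleep current does far more for battery life than trimming the brief active bursts.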

Always on the Edge

Power consumption issues in mission-critical edge applications extend beyond the sensor to the processor. The concerns are particularly acute for processors designed for machine learning.

Abundant electrical power is a given in cloud data centers. On the IoT edge, however, this is not the case.

Adapting the Fitbit Model

Sensor feature advancements and price reductions have been driven by consumer gadgets such as the iPhone, AirPods, and Fitbit. The gold-rush-style push for assisted and autonomous vehicles might achieve the same thing, if not more, by encouraging sensor fusion that combines several sensing approaches.

For obvious safety reasons, assisted and future autonomous driving systems require extensive and diverse sensor data. Smart sensor advancements in mission-critical industrial applications – which update on longer cycles than constantly changing consumer apps – may, however, take longer to spread.

Gauging Sensor Requirements

Amid the burst of new technologies, the basic trade-offs remain much the same. The challenge of moving to the edge and deploying machine learning does little to alter the fundamental system decisions that have always shaped sensor system design.

The role embedded sensors and AI play in emerging IoT sectors is an evolving one. Sensor advances in imaging, MEMS, lidar, Wi-Fi, UWB, radar and elsewhere clearly are plentiful – as are diverse machine learning cores that work to “make sense of the sensors.”

(This is a slightly modified version of an article originally published in Boss Magazine. The original article can be found at https://thebossmagazine.com/new-obstacles-for-robotics/)