Keynoter: Noise Analysis Beats Google Now

SAN JOSE, Calif. — Researchers are mining a largely untapped data source — the signals and noise generated by smart-device sensors — to enable technologies that solve the world’s hardest human-machine interface problems, Intel Fellow Lama Nachman, Director of the Anticipatory Computing Lab, told a keynote audience at SEMI’s MEMS & Sensors Executive Congress 2017 (San Jose, Calif.). The resultant applications will accurately detect emotions and anticipate needs, without requiring a Google-like dossier of user habits, she predicted.

“Technology needs to be more active at understanding the needs of the user,” Nachman said. “To do that, our job at the Anticipatory Computing Lab is to really understand what type of help you need in any situation.”

Reviewing earlier stabs at productivity-enhancing personalized assistants, Nachman praised Apple's Siri and Amazon's Alexa for keeping it simple, offering a helping hand only in response to specific user requests; Microsoft's initial efforts to make ad hoc suggestions, by contrast, wound up irritating users at best and breaking their train of thought at worst. She credited Google Now with making ad hoc suggestions that are actually useful, for the most part. The downside is the deep knowledge Google Now must mine from users' browsing, location, email, purchasing, and other behaviors: a collection of data amounting to a dossier on each user.

Intel Fellow Lama Nachman, director of its Anticipatory Computing Lab, was the keynote speaker at SEMI's MEMS & Sensors Executive Congress.
Credit: Intel

Instead, Intel’s Anticipatory Computing Lab aims to repurpose the signals and noise produced by the legions of sensors already deployed in smartphones, smart watches and wearables, smart automobiles, and the Internet of Things (IoT) to make ad hoc suggestions that entertain, increase productivity, and even save people’s lives — whether or not they are Intel users — all in real time and without a Google-like secret data bank on user habits.

“Intel is taking all the sensor feeds available now and reinventing the way they can help people with volunteered information that is always relevant to the person, what you are doing, and what goals you are trying to achieve,” Nachman said. “But to do so, there is a very large set of capabilities that we need to understand, such as emotions, facial expressions, nonverbal body language, personal health issues, and much, much more.”

Nachman worked with Stephen Hawking to perfect a 'cheek muscle' sensor that the renowned physicist uses to talk, write, and run his computer.
Credit: Intel

Many of these personal parameters can be gleaned from the normal usage of the sensors built into our smartphones, wearables, and IoT devices — for instance, facial expressions from a smartphone’s user-facing camera or the volume of a user’s voice. The sensor data can be fused with smart-watch data on pulse rate, activities, location, and more to anticipate a user’s actions and needs with unprecedented accuracy, according to Nachman.
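
Nachman did not present implementation details, but the basic mechanics of this kind of multi-device fusion can be sketched simply. The Python snippet below, a minimal illustration in which every name, stream, and value is hypothetical rather than Intel's actual design, joins asynchronous readings from a watch, a phone, and a camera onto a common clock to produce one feature vector per moment, the raw material for an anticipatory model:

```python
# Illustrative sketch (not Intel's code): fuse asynchronous sensor streams
# from different devices into one feature vector per time step. Each stream
# is a list of (timestamp_seconds, value) pairs; we sample each stream at a
# common clock by taking its most recent reading (last-value-carried-forward).
from bisect import bisect_right

def latest_before(stream, t):
    """Return the most recent value at or before time t, else None."""
    i = bisect_right([ts for ts, _ in stream], t)
    return stream[i - 1][1] if i else None

def fuse(streams, t):
    """Build one named feature vector at time t from all sensor streams."""
    return {name: latest_before(s, t) for name, s in streams.items()}

# Hypothetical readings from three devices, sampled at different rates.
streams = {
    "heart_rate_bpm": [(0.0, 72), (5.2, 75), (9.8, 91)],   # smart watch
    "steps_per_min":  [(0.0, 0), (6.0, 40)],               # phone accelerometer
    "face_valence":   [(1.1, 0.2), (8.7, -0.4)],           # user-facing camera
}
print(fuse(streams, 10.0))
# {'heart_rate_bpm': 91, 'steps_per_min': 40, 'face_valence': -0.4}
```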

“To understand emotions ‘in the wild,’ so to speak, it is essential to understand, for instance, when you are angry. Even if you are not cursing or yelling, your computer should understand when you are pissed off,” said Nachman. “Physical factors like breathing fast can be seen by a user-facing smartphone camera, fast heart rate can be measured by your smart watch, but we need to fuse that with facial expressions and a deeper understanding of how individuals behave.”
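
Nachman offered no algorithmic specifics, but the fusion she describes can be illustrated with a deliberately simple rule-based sketch. In the snippet below, all field names and thresholds are invented for illustration; the point is the structure: no single sensor labels the user angry, and only agreement across several modalities produces a confident call.

```python
# Hypothetical sketch of multi-modal emotion fusion: each sensor casts a
# vote for physiological arousal, and only broad agreement yields a label.
# All thresholds and names are illustrative, not Intel's implementation.
from dataclasses import dataclass

@dataclass
class FusedSample:
    facial_tension: float   # 0..1, from the user-facing camera
    voice_volume_db: float  # microphone level
    heart_rate_bpm: float   # smart watch
    breaths_per_min: float  # breathing rate estimated from camera frames
    is_moving: bool         # accelerometer activity flag

def infer_state(s: FusedSample) -> str:
    votes = 0
    if s.heart_rate_bpm > 100 and not s.is_moving:  # elevated while at rest
        votes += 1
    if s.breaths_per_min > 20:                      # breathing fast
        votes += 1
    if s.voice_volume_db > 70:                      # raised voice
        votes += 1
    if s.facial_tension > 0.6:                      # strained expression
        votes += 1
    if votes >= 3:
        return "agitated"
    return "calm" if votes == 0 else "uncertain"

print(infer_state(FusedSample(0.8, 74.0, 112.0, 24.0, False)))  # agitated
```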


