The 5 Senses of Sensors
How IoT replicates the five senses, and goes beyond them. Featuring interviews with VANTIQ, Sensome, Pensa Systems, Naked Labs, Genki Instruments, Tanvas, and Microshare.
Sensing technology has matured to the point that it is indispensable to modern life. Advanced sensor networks are used in many different fields and are under continuous development. Thanks to AI, sensor devices are becoming increasingly intelligent: they can communicate with one another, develop autonomous behavior, and in some cases even compute at the edge. By collecting data from homes, buildings, and vehicles, sensors build a digital version of the environment in which they operate and, in doing so, enable applications in virtually every industry.
Sensing technology spans all five core human senses, and goes beyond them. Let’s break down digital sensing technology and some of its applications.
See
Vision systems are among the most widespread digital sensing systems, as common in industrial use as they are in daily life. Surveillance cameras, infrared scanning systems, and LiDAR are all built on the capacity to model 2D and 3D information about an environment. Visual sensing technology lays the foundation for multidisciplinary development, from research engineering projects to large-scale industrial ventures.
Algorithms that can process visual sensor data enable countless applications. Security systems perform live motion detection, face tracking, and identity recognition. Similarly, self-driving cars assess trajectories in real time and estimate pedestrian intentions. Algorithms and visual sensing technology work hand in hand, creating opportunities for innovation and for increasingly mature software products.
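As a concrete illustration of the simplest end of this spectrum, the sketch below detects motion by differencing consecutive video frames. It is a minimal example, not any vendor’s method; it assumes OpenCV (cv2) and a local camera, and the threshold values are illustrative.

```python
# Minimal sketch: motion detection by differencing consecutive frames.
# Assumes a camera at index 0; thresholds are illustrative, not tuned.
import cv2

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while ok:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pixels that changed noticeably since the previous frame
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

    # Flag the frame if more than 1% of pixels changed
    if cv2.countNonZero(mask) > 0.01 * mask.size:
        print("motion detected")

    prev_gray = gray

cap.release()
```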
VANTIQ offers COVID-19 back-to-work safety and security solutions based on real-time video analysis. David Sprinzen, Director of Product Marketing at VANTIQ, explains, “intelligent security cameras can flag suspicious activity in real time; this removes the need for tedious & time-consuming assessment of video footage, mitigating the risk of exposure and contact with threatening persons.” Their pre-built components (symptom detection, physical distancing, contact tracing, access management, safety compliance, and asset monitoring) enable software developers to rapidly build real-time applications that span a wide range of use cases and industries. The modern software stack gives end users easy access to a live monitoring dashboard. Their security solution is an elegant interplay of visual sensing, hardware, and software capabilities.
Sensome has engineered a micro-sensor that can identify biological tissues instantly, helping doctors make real-time decisions when assessing and treating blood clots. Sensome CEO and co-founder Franz Bozsak explains that the company’s “micro-sensor technology integrated with a guidewire can accurately identify blood clot composition by tiny emitted electrical fields. Machine learning algorithms then analyze sensor collected data in real-time and provide doctors with a reliable measure for choosing their intervention tools.” Clinical trials are expected to begin in 2021. The technology has disruptive potential across multiple medical fields, such as ischemic stroke, interventional cardiology, and oncology. The accuracy and success rates of different retrieval approaches will be monitored, and machine learning will use that data to recommend the best retrieval method. In this way, connected medical devices give clinicians the visibility needed for better patient management.
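Sensome’s actual signal features and models are not public, so the snippet below is only a generic illustration of the underlying idea: a classifier trained on labeled electrical readings that predicts a tissue class for a new measurement. All feature values, class names, and the choice of scikit-learn are assumptions for the example, not clinical data.

```python
# Illustrative only: a generic classifier on labeled impedance-style features.
# All numbers and class labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row: magnitude/phase-style readings at a few excitation frequencies (made up)
X_train = np.array([
    [820, -12, 640, -9],   # hypothetical "fibrin-rich" reading
    [310, -28, 250, -22],  # hypothetical "cell-rich" reading
    [800, -11, 660, -10],
    [300, -30, 240, -24],
])
y_train = ["fibrin-rich", "cell-rich", "fibrin-rich", "cell-rich"]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# A new reading from the sensor would be classified in real time
new_reading = np.array([[790, -13, 655, -10]])
print(clf.predict(new_reading))  # -> ['fibrin-rich']
```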
Pensa Systems uses computer-vision drones to manage retail inventory autonomously, helping solve the $1T problem of stockouts. Their drones scan shelves and use machine learning to visually recognize individual products automatically, a huge upgrade from the clipboard method typically used in supermarkets. Mobile phone camera sensors can feed the same automated processing. As Richard Schwartz, CEO of Pensa Systems, puts it, “the goal is to minimize stockouts, optimize product planning, and ultimately increase revenue.” They report a 98% accuracy rate for detecting out-of-stock items. Schwartz continues, “even when tested in more challenging scenarios, like correctly identifying items among competing brands for the same type of product, like Heinz versus French’s Mustard or regular soy sauce versus low sodium.” The next phase, as Schwartz puts it, will be “roverizing, which achieves scale through substituting what would have been thousands of separate fixed-location sensors or things that had to be collected individually, into one autonomous IoT roving robot that just goes out, learns, and reports.”
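Once a vision model has recognized what is actually on the shelf, the out-of-stock step itself can be as simple as comparing detections against the planogram. The sketch below shows that comparison only; the product names are placeholders, and the detection list would in practice come from an object-detection model run on drone imagery, not from this code.

```python
# Sketch: flag out-of-stock facings by comparing detected products to the planogram.
# Product names are placeholders; "detected" would come from a vision model.
planogram = {
    "heinz_mustard",
    "frenchs_mustard",
    "soy_sauce_regular",
    "soy_sauce_low_sodium",
}
detected = {"heinz_mustard", "soy_sauce_regular"}  # output of the recognition step

out_of_stock = sorted(planogram - detected)
print(out_of_stock)  # -> ['frenchs_mustard', 'soy_sauce_low_sodium']
```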
Naked Labs combines data from multiple sensors in its home 3D body scanner, extending the human sense of vision to a new, inner perspective on the body’s composition. To achieve a 360-degree, head-to-toe body scan, the solution relies on “over 4 million data points stitched together, creating a 3D model of your body with accuracy up to five millimeters in 60 seconds,” explains Thomas Ward, UX Researcher and Head of Customer Experience at Naked Labs. He continues, “Measurements of stereo wave echoes between you and the mirror essentially create a topographical map of your body. Then circumferences from the 3D scan are used to calculate DXA-based body fat, providing a digital perspective like never before to discover and track progress.”
Hear
Audio sensing technology is as ubiquitous in day-to-day life as it is in large-scale industrial projects. By capturing sound vibrations, sensors give shape to the non-visible aspects and events of an environment.
The building blocks of hearing sensors are inspired by the mechanical properties of the human ear. However, they can be calibrated to cover frequencies well beyond those perceivable by humans (such as ultrasound). That flexibility has made them useful across many industries: they are used in industrial automation, as measurement devices in construction equipment, and as part of medical imaging techniques, among others.
Advanced audio processing capabilities enable the development of increasingly intelligent consumer electronics. Basic signal processing algorithms are responsible for filtering noise, detecting simple patterns, and building recognition models. On top of that, advanced ML enables voice recognition, speaker identification, and natural language processing. Such applications represent the core capabilities of AI-powered personal assistants such as Alexa and Siri.
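To make the “filtering noise” step concrete, the sketch below isolates the roughly 300–3400 Hz voice band from a noisy signal with a Butterworth band-pass filter. It is a minimal SciPy example on a synthetic signal; the sample rate, cutoffs, and filter order are illustrative choices, not taken from any particular product.

```python
# Minimal sketch: isolate the voice band from a noisy signal with a band-pass filter.
# The signal here is synthetic; sample rate and cutoffs are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 16000                      # samples per second
t = np.arange(0, 1.0, 1 / fs)

# Synthetic "microphone" signal: a 440 Hz tone, 50 Hz rumble, and white noise
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
signal += 0.2 * np.random.randn(t.size)

# 4th-order Butterworth band-pass over the typical voice band
b, a = butter(N=4, Wn=[300, 3400], btype="bandpass", fs=fs)
clean = filtfilt(b, a, signal)  # zero-phase filtering keeps the tone aligned
```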
These advanced models are not limited to linguistics; movement and dynamics are languages they can interpret as well. Genki Instruments is a music technology company and the maker of Wave, a smart ring MIDI controller that lets musicians add natural expression to their setup. Music creation today is a largely digital experience, and Wave allows musicians to use “natural movement to modulate strings, add vibrato, and control dynamics.” The sensors in Wave let musicians control their digital audio workstation using six natural gestures with the ring. This solution allows artists to record not only their music but also their style.
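Wave’s own gesture processing is proprietary, but the general pattern of turning a motion reading into MIDI is straightforward. The sketch below maps a hypothetical ring-tilt angle onto a MIDI control-change message using the mido library; the port, controller number, and scaling are all assumptions for illustration.

```python
# Generic sketch: map a motion reading (here, a tilt angle in degrees) to a
# MIDI control-change message. The mapping, CC number, and output port are
# assumptions; this is not Wave's actual implementation.
import mido

def tilt_to_cc(tilt_deg, cc_number=1):
    """Scale a -90..90 degree tilt into the 0..127 MIDI value range."""
    value = int(round((tilt_deg + 90) / 180 * 127))
    value = max(0, min(127, value))
    return mido.Message("control_change", control=cc_number, value=value)

# Send modulation (CC 1) messages as the hand tilts
with mido.open_output() as port:      # default MIDI output port
    for tilt in (-30, 0, 30, 60):      # placeholder sensor readings
        port.send(tilt_to_cc(tilt))
```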
Smell
Olfactory sensors offer researchers endless possibilities for innovation, inspired by the complexity of the human olfactory system. Simply put, they are built to capture volatile chemical compounds in the air and aggregate them into identifiable scent fingerprints. These so-called electronic noses have found their way into a multitude of industries: they are used in manufacturing and cosmetics, for environmental control, in clinical diagnosis, as tools for pharmaceutical investigations, and for food and beverage quality control.
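The “scent fingerprint” idea can be illustrated with a tiny nearest-neighbour classifier: each known scent is represented by the response pattern of a multi-channel gas-sensor array, and a new reading is matched to the closest pattern. The channel values and scent labels below are invented for the example.

```python
# Illustrative sketch: match a gas-sensor array reading to known scent
# "fingerprints" with a nearest-neighbour classifier. All values are made up.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Rows: readings from a hypothetical 4-channel metal-oxide sensor array
fingerprints = np.array([
    [0.82, 0.10, 0.35, 0.05],   # "citrus"
    [0.15, 0.70, 0.20, 0.40],   # "coffee"
    [0.80, 0.12, 0.30, 0.07],
    [0.18, 0.72, 0.25, 0.38],
])
labels = ["citrus", "coffee", "citrus", "coffee"]

model = KNeighborsClassifier(n_neighbors=1).fit(fingerprints, labels)
print(model.predict([[0.79, 0.11, 0.33, 0.06]]))  # -> ['citrus']
```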
Plume Labs employs digital olfactory sensors in a quest to better understand air quality around the world. One of their products is a wearable air pollution sensor that offers real-time air quality insights. Their monitoring application lets users keep an eye on live air conditions across cities, and their API gives businesses access to live and forecast air pollution data. By combining air quality readings with other data sources, such as population density and road network data, they produce accurate air quality models.
Inhalio offers automotive and hospitality businesses a chance to integrate digital scent solutions into their products and services. Their scent technology platform enables easy control of scent infusion and diffusion tools. A cloud-based dashboard makes it possible to control environment-specific settings in real time by aggregating and interpreting multiple data sources, a simple example of how data analytics can complement digital sensing technology.
Taste
Digital taste sensors may be less common in consumer applications, but they play a crucial role in the pharmaceutical and food industries. They mimic the taste mechanisms of the human palate and are sometimes referred to as “e-tongues” (electronic tongues). They are commonly used to discriminate among substance samples in research labs, where relying on human taste is understandably prohibited.
Engineering digital taste goes well beyond identifying substances and chemical compounds: it also requires advanced modeling of reaction cascades, similar to how human receptors respond to different tastes. Pattern recognition systems interpret the electrical output of e-tongues and integrate data from multiple sensors. Analytical software then classifies the sensor data into predefined taste fingerprints, producing a taste model that is “synchronized” with the human palate and its receptors.
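One simple way to picture the final classification step is fingerprint matching: an e-tongue reading is compared against predefined taste profiles and assigned to the closest one. The sketch below does this with cosine similarity; the channel values and taste categories are placeholders, not data from a real e-tongue.

```python
# Sketch: assign an e-tongue reading to the closest predefined taste fingerprint
# using cosine similarity. All channel values are synthetic placeholders.
import numpy as np

taste_fingerprints = {
    "sweet":  np.array([0.9, 0.1, 0.2, 0.1]),
    "bitter": np.array([0.1, 0.8, 0.3, 0.2]),
    "umami":  np.array([0.2, 0.2, 0.9, 0.4]),
}

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def closest_taste(reading):
    """Return the fingerprint whose direction best matches the reading."""
    return max(taste_fingerprints, key=lambda name: cosine(reading, taste_fingerprints[name]))

print(closest_taste(np.array([0.85, 0.15, 0.25, 0.1])))  # -> 'sweet'
```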
Touch
Research into touch sensor technology is nowadays driven largely by consumer applications, which has led to a previously unimaginable variety of touch sensors. Capacitive, resistive, piezoresistive, piezoelectric, infrared, and surface acoustic wave sensors can all detect tactile stimuli in an environment. Their fine-grained ability to model different types of physical interaction gives them a crucial role in industrial, automotive, medical, and consumer applications.
Tactile data is inherently complex, since it represents a wide range of physical interaction parameters such as temperature, shape, texture, and even directional forces. Sensors that collect such heterogeneous data are only fully useful when the data they generate can be interpreted. This is where advanced data and signal processing comes into play, shaping increasingly fine-grained capabilities.
Leap Motion (now part of Ultraleap) offers haptic interfaces that work without direct physical contact. Instead, they use ultrasound to generate a sensation of touch: ultrasound waves are emitted from different directions and timed to arrive at precise focal points in 3D space (usually on the surface of objects or human hands). Combined with the company’s hand-tracking software, this provides a fully interactive 3D model of human hands for virtual reality applications. Haptic interfaces of this kind play a crucial role in making virtual touch interactions feel real.
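The focusing itself is a phased-array timing problem: each transducer fires with a delay chosen so that all waves arrive at the focal point at the same instant. The worked sketch below computes those delays for an invented four-transducer layout; the geometry and the speed of sound in air are the only inputs.

```python
# Worked sketch: per-transducer firing delays so that ultrasound waves from a
# small array arrive at a single focal point simultaneously. Geometry is invented.
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second, in air at room temperature

# Four transducers along a line (x, y, z in metres) and a focal point 20 cm above
transducers = np.array([[-0.03, 0.0, 0.0],
                        [-0.01, 0.0, 0.0],
                        [ 0.01, 0.0, 0.0],
                        [ 0.03, 0.0, 0.0]])
focal_point = np.array([0.0, 0.0, 0.20])

distances = np.linalg.norm(transducers - focal_point, axis=1)

# The farthest transducer fires first; closer ones wait so all waves coincide
delays = (distances.max() - distances) / SPEED_OF_SOUND
print(delays * 1e6)  # delays in microseconds
```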
Another company revolutionizing touch is Tanvas, which makes a combined touch sensor and haptic actuator for multi-touch devices, including touchscreens, tablets, and laptops. The technology, TanvasTouch, enables programmable textures and haptic effects that can be felt with the swipe of a finger. Unlike traditional vibrotactile haptics, “TanvasTouch surface haptics have no moving parts. Instead, the finger’s movement is sensed by an integrated multi-touch sensor, and surface friction is altered using a physical phenomenon called electroadhesion. This effect uses electric fields to increase friction locally as fingers slide across a smooth plane. The technology brings featureless surfaces to life and can produce a wide range of haptic effects including textures of different roughness, pitch, and magnitude, edges of varying sharpness, and virtual bumps of different shapes.” – Phill LoPresti, CEO of Tanvas
And Beyond
Sensor-based applications are by no means restricted to the sensors described above. Digital sensing technology spans a wide variety of sensor types, each corresponding to increasingly specific use cases, and dozens of specialized sensors are designed, engineered, and improved every year. The development of specialized sensors is a trend that is likely to continue, given the major role they play in IoT equipment.
The use of sensors is also not limited to a single sensor at a time. On the contrary, AI-enabled devices and applications use multiple sensors simultaneously. By combining complementary data sources, product developers achieve higher accuracy, maintain a competitive advantage, and focus on innovation. This is only possible thanks to the predictive power of ML algorithms and the maturity of data engineering processes.
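A small example of why combining sensors pays off: when several sensors measure the same quantity, weighting each reading by the inverse of its noise gives a better estimate than any single sensor alone. The readings and noise figures below are invented for illustration.

```python
# Sketch: fuse readings of the same quantity from several sensors with
# inverse-variance weighting, so less noisy sensors count for more.
import numpy as np

readings = np.array([21.4, 21.9, 20.8])    # e.g., temperature from three sensors
variances = np.array([0.05, 0.20, 0.50])   # each sensor's estimated noise

weights = 1.0 / variances
fused = np.sum(weights * readings) / np.sum(weights)
print(round(fused, 2))  # -> 21.45, pulled toward the most reliable sensor
```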
Microshare takes multiple sensor inputs to another level with their smart-building solutions, which are used, among other things, to monitor hospital cleanliness. “We aim to provide every building with vital signs like a living, breathing organism,” they explain; in other words, “you need all the senses to do this properly.” They offer a suite of capabilities that together deliver the ROI people care about: cost efficiency, sustainability, and satisfied occupants in any type of facility.
“I think we are able to manufacture the elements that create that sixth sense of awareness and prediction that the best facilities managers have. This additional sense is obtained through a complete sensor-fed level of awareness, making it tangible and replicable.” – Michael Moran, Chief Risk and Sustainability Officer, Microshare
Summary: Sensing Technologies Need Data Science
All examples above illustrate a visible trend: sensing technology and data science work hand in hand to provide opportunities for multi-disciplinary applications. Since sensing devices collect large amounts of heterogeneous data, sensing platforms need to aggregate, transform, and interpret rich sensor data. This is only possible using modern data science tools and processing frameworks.
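As a minimal picture of what “aggregate and transform” means in practice, the sketch below turns irregular raw readings into clean five-minute averages with pandas; the timestamps and values are invented, and real pipelines would add validation, enrichment, and storage steps on top.

```python
# Minimal sketch: aggregate irregular raw sensor readings into five-minute
# averages with pandas. Timestamps and values are invented.
import pandas as pd

raw = pd.DataFrame(
    {"value": [12.1, 12.4, 13.0, 12.8]},
    index=pd.to_datetime([
        "2021-03-01 09:00:10",
        "2021-03-01 09:02:45",
        "2021-03-01 09:06:30",
        "2021-03-01 09:09:05",
    ]),
)

five_minute_avg = raw.resample("5min").mean()
print(five_minute_avg)
```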
A consistent thread across all of the senses is the need to unify and interpret data in order to unlock its core value. Blue Orange Digital, a data science and transformation company, helps extract and unify data and deploy predictive modeling to enable advanced insights and machine learning. They develop custom dashboards and data processing pipelines, allowing sensor network owners to focus on their sensing technology without worrying about data analysis infrastructure.
Data processing capabilities are more affordable than ever. Companies across industries are gaining access to analytics technology that was previously prohibitively expensive. Modern cloud infrastructure provides computational power on demand, storage keeps getting cheaper, and data science expertise is widely available. This creates an ideal ecosystem for integrating sensing technologies with data science.
Originally Published: https://www.theinternetofthings.eu/josh-miramant-5-senses-sensors