Publication

Multi-modal deformation and temperature sensing for context-sensitive machines

Abstract

Owing to the remarkable properties of the somatosensory system, human skin compactly perceives myriad forms of physical stimuli with high precision. Machines, conversely, are often equipped with sensory suites composed of dozens of unique sensors, each designed to detect a limited set of stimuli. Emerging high degree-of-freedom human-robot interfaces and soft robot applications are limited by the lack of simple, cohesive, and information-dense sensing technologies. Stepping toward biological levels of proprioception, we present a sensing technology capable of decoding omnidirectional bending, compression, stretch, binary changes in temperature, and combinations thereof. This multi-modal deformation and temperature sensor harnesses the chromaticity and intensity of light as it travels through patterned elastomer doped with functional dyes. Deformations and temperature shifts modulate the light's chromaticity and intensity, resulting in a one-to-one mapping between sequentially combined stimulus modes and the sensor output. We study the working principle of the sensor via a comprehensive opto-thermo-mechanical assay, and find that the information density provided by a single sensing element permits deciphering rich and diverse human-robot and robot-environment interactions.

Future robots require compact sensing architectures capable of discerning multiple stimuli. Here, Baines et al. present a multi-modal deformation and temperature sensor which exploits a light-to-state mapping to discern combined stimuli of bending, stretching, compression, and temperature.
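To make the sensing principle concrete, the following minimal Python sketch illustrates how a one-to-one mapping between combined stimuli and the sensor's light output could be decoded. It is not the authors' published pipeline: the calibration values, stimulus labels, and nearest-neighbour lookup below are illustrative assumptions only.

# Minimal sketch of decoding combined stimuli from light chromaticity and
# intensity readings. Hypothetical calibration data; not the method from the paper.
import numpy as np

# Assumed calibration table: each entry is a (hue, normalized intensity)
# reading measured at the sensor output under a known stimulus combination.
CALIBRATION = {
    "rest":         (0.55, 1.00),
    "bend":         (0.48, 0.82),
    "stretch":      (0.60, 0.74),
    "compress":     (0.52, 0.65),
    "bend+heat":    (0.35, 0.80),
    "stretch+heat": (0.42, 0.70),
}

def decode(hue: float, intensity: float) -> str:
    """Return the calibrated stimulus label nearest to the reading.

    Because the stimulus-to-output mapping is one-to-one, each combined
    stimulus occupies a distinct region of (chromaticity, intensity) space,
    so a nearest-neighbour lookup suffices for this toy example.
    """
    reading = np.array([hue, intensity])
    labels = list(CALIBRATION)
    table = np.array([CALIBRATION[k] for k in labels])
    distances = np.linalg.norm(table - reading, axis=1)
    return labels[int(np.argmin(distances))]

if __name__ == "__main__":
    print(decode(0.36, 0.79))  # -> "bend+heat" for this synthetic reading

In practice a calibration table like this would be replaced by a model fit to the opto-thermo-mechanical characterization data, but the lookup conveys why a single light-based sensing element can disambiguate multiple stimulus combinations.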
