In Person Conference Wrap Up
MRS Blogger Sebastián Suárez Schmidt

Symposium EQ11—Neuromorphic Computing and Biohybrid Systems—Materials and Devices for Brain-Inspired Computing, Adaptive Biointerfacing and Smart Sensing

Written by Corrisa Heyes

Raphael Ahlmann, TU Dortmund

Fabrication of Gas-Sensitive Memristive Devices

Raphael Ahlmann builds on previous work detailing the sensitivity of memristor structures to ambient gases, with the goal of developing a low-cost, ultra-low-power, CMOS-compatible memristive gas sensor. The current theoretical work demonstrates fast diffusion of gases into thin films at room temperature and a good response to concentration changes in simulation. Experimental data support the ability to control for degradation and drift phenomena through re-programming schemes. Hydrogen concentrations as low as 1% have been detected experimentally, and that sensitivity is expected to improve dramatically once an automated controller for the device is developed.
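To make the re-programming idea concrete, here is a minimal, hypothetical sketch of how a refresh scheme could hold a drifting sensor baseline in place; the drift rate, threshold, and signal values below are illustrative assumptions, not numbers from the presented work.

```python
import numpy as np

# Hypothetical illustration of a re-programming (refresh) scheme that
# compensates slow baseline drift in a memristive gas-sensor readout.
# All values are placeholders, not parameters of the presented device.

rng = np.random.default_rng(0)

baseline = 1.00              # nominal programmed conductance (arb. units)
drift_per_step = -0.002      # slow degradation of the programmed state
reprogram_threshold = 0.05   # allowed deviation before a refresh pulse

state = baseline
readings = []
for t in range(500):
    state += drift_per_step                      # device state slowly drifts
    gas_signal = 0.2 if 200 <= t < 300 else 0.0  # step change in gas concentration
    readings.append(state + gas_signal + rng.normal(0.0, 0.005))

    # Refresh: when the gas-free baseline has drifted too far, re-program it.
    # (For simplicity this toy loop assumes it knows when no gas is present.)
    if gas_signal == 0.0 and abs(state - baseline) > reprogram_threshold:
        state = baseline

print(f"baseline error kept below ~{reprogram_threshold:.2f} (arb. units)")
```

In this toy picture, an automated controller would play the role of the refresh step, deciding when to re-program so that drift never masks a real concentration change.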

 

Jun Tao, University of Southern California

Machine Vision with Programmable Floating-Gate Phototransistor for Color-Mixed Image Recognition

Jun Tao addresses the need for high-speed, low-power machine vision options. This work presents a CMOS-compatible, floating-gate photo-field-effect transistor (FG-PFET) that senses and processes image data simultaneously. Simulation data based on color-mixed handwriting recognition project that a trained FG-PFET device should perform at better than 94% accuracy, even off-line. Because the relationship between amplitude and responsivity is wavelength-dependent, the FG-PFET is also demonstrated to be capable of seeing in full color.

 

Andres Arrieta, Purdue University

Memory Formation and Mechanosensing in Neuromorphic Mechanical Metamaterials

Andres Arrieta demonstrates the groundwork for an “event camera” to facilitate more robust interfaces between the physical world and systems like soft robotics. Such a camera would ideally integrate memory formation, retention, and recall over long periods of time. Additionally, this camera should measure tactile inputs to allow systems to “feel” their environment and recognize and interact with the objects around them. This goal is partially addressed by the presented simple, low-cost, 3D-printed, bistable mechanical metamaterial that senses and records tactile inputs in response to physical stimuli. By associating a “neuron” with each unit, a Hopfield network can then be leveraged to convert the input into a binary array and store it long term as a vector (a minimal sketch of this storage-and-recall step follows below). This metamaterial approach has the benefit of being easily scalable over large areas without triggering a data-overload state, as well as the ability to train in situ, so robots can interact with their environments and accumulate memory along the way.
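The sketch below is a rough illustration of the Hopfield-network step described above, not the authors' code: it stores a few binary “tactile” patterns with the standard Hebbian rule and recalls one from a corrupted probe. The unit count and patterns are made-up assumptions.

```python
import numpy as np

# Minimal Hopfield-network sketch: store binary tactile patterns as vectors
# and recall one from a noisy input. Unit count and patterns are hypothetical.

rng = np.random.default_rng(1)
n_units = 64                                        # one "neuron" per metamaterial unit
patterns = rng.choice([-1, 1], size=(3, n_units))   # stored tactile states (+1/-1)

# Hebbian learning: outer-product weight matrix with zero self-connections
W = sum(np.outer(p, p) for p in patterns) / n_units
np.fill_diagonal(W, 0)

# Corrupt a stored pattern by flipping a few units, then recall it
probe = patterns[0].copy()
flip = rng.choice(n_units, size=8, replace=False)
probe[flip] *= -1

state = probe.copy()
for _ in range(10):                                 # update until the state is stable
    new_state = np.sign(W @ state)
    new_state[new_state == 0] = 1
    if np.array_equal(new_state, state):
        break
    state = new_state

print("recalled original pattern:", np.array_equal(state, patterns[0]))
```

In this picture, each stored vector plays the role of a long-term memory that the array can settle back into when only a partial or noisy tactile input is presented.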
