Bio-inspired eyes for the digital world
The smart pixels in event-based cameras enable image data to be captured only where and when something moves, offering a low-data, low-power way to give mobile devices spatial awareness—a vital capability for augmented reality. Start-up Insightness has developed a bio-inspired pixel whose ability to operate in both event and normal modes represents the next step in bringing the technology to market. CSEM designed a system-on-chip around the pixel, creating a best-in-class event-based image sensor.
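To make the idea of "capturing data only where and when something moves" concrete, here is a minimal sketch of an event-based pixel model. It is not Insightness's or CSEM's implementation; the function name, the contrast threshold, and the log-intensity change rule are illustrative assumptions based on how event (dynamic-vision) pixels are commonly described.

```python
# Minimal sketch of event-based sensing: each pixel fires an event only
# when its brightness changes by more than a contrast threshold, so a
# static scene produces no data at all. Illustrative only, not the
# SiliconEye design; threshold and naming are assumptions.
import numpy as np

def events_from_frames(frames, timestamps, threshold=0.2):
    """Convert a stack of intensity frames into (t, x, y, polarity) events.

    frames:     array of shape (T, H, W) with positive intensities
    timestamps: array of shape (T,) with the capture time of each frame
    threshold:  log-intensity change needed for a pixel to fire an event
    """
    log_ref = np.log(frames[0] + 1e-6)          # per-pixel reference level
    events = []
    for t, frame in zip(timestamps[1:], frames[1:]):
        log_i = np.log(frame + 1e-6)
        diff = log_i - log_ref
        # Only pixels whose brightness rose or fell past the threshold report
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((t, x, y, polarity))
            log_ref[y, x] = log_i[y, x]         # reset reference after firing
    return events

# Example: a bright dot moving across an otherwise static 4x4 scene.
frames = np.full((3, 4, 4), 10.0)
frames[1, 1, 1] = 50.0                          # dot appears at (1, 1)
frames[2, 2, 2] = 50.0                          # dot moves to (2, 2)
print(events_from_frames(frames, timestamps=np.array([0.0, 0.001, 0.002])))
# Only the pixels that changed produce events; the static background stays silent.
```

Because the unchanged background generates no output, the data rate and power consumption track scene activity rather than frame rate, which is the property that makes this approach attractive for mobile and augmented-reality devices.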
The resulting SiliconEye sensor compresses and processes visual information within each pixel at ultra-low latency: fast enough for autonomous devices to "see" and avoid moving objects, and to deliver realistic, lag-free position and movement updates in virtual- and augmented-reality environments. Other applications include automated manufacturing, inspection, and logistics. The chip caught the attention of Sony, which acquired Insightness with the aim of creating a new research center in Switzerland that combines Sony's own chip production processes with Insightness's event-based vision sensor technology.