OrCam Unveils Wearable Personal Assistant, New Technologies for People With Disabilities
With new natural language processing abilities and other AI technologies, OrCam’s devices help vision- and hearing-impaired people navigate day-to-day life with greater ease
Right now, the device’s new functions are only available in English, but the company intends to make it compatible with additional languages by the end of the year.

The most significant technological leap, according to Shashua, is the addition of natural language processing, complementing the device’s existing machine vision capabilities. The second is spatial orientation: OrCam’s most recent MyEye model beeps faster when it recognizes that the wearer is approaching a door, or emits rhythmic beeps that can guide the wearer toward an object, say, a glass of water on a table. Right now, the device has a library of 15 objects it can recognize, and Shashua said that number is expected to grow. “We took all the existing abilities of the device—facial recognition, barcode reading, object identification, and added the verbal interaction with the device,” Shashua said.

The device, which increases the independence of vision- and hearing-impaired individuals, does not require an internet connection or cloud computing, Shashua said. “It has to work in real time. Camera, computing, and a lot of AI algorithms make this a reality.”

But it is the new product OrCam unveiled at the event, OrCam Hear, that is the real star of the show. OrCam Hear aims to solve a conundrum that has perplexed scientists for decades: the cocktail party problem. This, Shashua explained, is the situation in which you are talking to someone in a noisy room where countless other conversations are taking place simultaneously. Our brain is very sophisticated, Shashua said: it allows us to focus on one conversation and filter out the surrounding noise. Hearing aids cannot do that; instead, they amplify all the voices indiscriminately, and the effect is sheer cacophony. According to him, the device represents a real technological breakthrough.
“The ability to combine video, the acoustic wave, see the lip movements—we've been working on it for two years now, and we've been able to deliver a product that will be released to the market by mid-2020.”

The small device hangs like a locket around the user’s neck. When the user looks at their conversation partner, the device communicates with an earpiece via Bluetooth, isolating that person’s voice based on their lip movements. The device is not yet complete: audio is still transmitted with a delay, but the hardware works and only final tuning remains, according to Shashua. Before it hits the market, the company intends to add two more levels of voice isolation, one based on sound-signature samples and another that simply reduces background noise.

The device was named a Best of Innovation honoree in the CES 2020 Innovation Awards. “The fact that it has been getting attention shows that there is a real need,” Shashua said. OrCam came to CES to establish partnerships, Shashua said. As a first stage, the company wants to connect with hearing aid companies that might offer its technology as an add-on.
The author was a guest of Intel subsidiary Mobileye at CES. OrCam was founded by Mobileye co-founders Ziv Aviram and Amnon Shashua.