UltraSense Systems Brings Neural Processing to Smart Surfaces with a New Generation of Multi-mode Sensing Solutions
UltraSense Systems announced the next generation of its multi-mode touch-sensing technology with the release of TouchPoint Edge, which cost-effectively replaces a cluster of mechanical buttons under any surface material (e.g., metal, glass, or plastic) and further advances the UI/UX paradigm shift from mechanical to digital interfaces for smart surfaces.
TouchPoint Edge is a fully integrated system-on-a-chip (SoC) that replicates the touch input of mechanical buttons by directly sensing up to eight standalone UltraSense ultrasound + force-sensing TouchPoint P transducers. TouchPoint Edge also uses an embedded, always-on Neural Touch Engine (NTE) to discern intended touches from unintended false touches, eliminating corner cases and providing the input accuracy of a mechanical button.
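As a rough sketch of the architecture described above, the Python snippet below shows one controller sampling up to eight transducer channels and gating button events through an always-on classifier so that only intended presses are reported. The function names and the channel read-out are illustrative assumptions, not the TouchPoint Edge API.

# A minimal sketch (illustrative assumptions, not UltraSense's implementation)
# of a controller that polls up to eight transducer channels and gates
# button events through an always-on classifier.

from typing import Callable, List, Optional

NUM_CHANNELS = 8  # TouchPoint Edge senses up to eight TouchPoint P transducers


def scan_buttons(
    read_channel: Callable[[int], float],       # hypothetical sensor read-out
    is_intended: Callable[[List[float]], bool]  # stands in for the NTE classifier
) -> Optional[int]:
    """Returns the index of the pressed virtual button, or None.

    All channels are sampled together so the classifier can evaluate the
    whole localized force pattern rather than a single per-button value.
    """
    frame = [read_channel(ch) for ch in range(NUM_CHANNELS)]
    if not is_intended(frame):
        return None  # e.g., a knock or palm brush rejected as a false touch
    return max(range(NUM_CHANNELS), key=frame.__getitem__)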
Smart surfaces will further change the way we interact with products in the coming years. A smart surface is a solid surface with underside illumination that shows the user where to touch. The UI/UX paradigm shift began with the smartphone over a decade ago, when the mechanical keyboard gave way to tapping a keyboard on a capacitive screen. Not all smart surfaces will be capacitive displays, though. UltraSense Systems' first-generation sensor, TouchPoint Z, continues the paradigm shift by cost-effectively removing mechanical buttons and improving the user experience (UX) in smartphones, electric toothbrushes, home appliances, and automotive interior overhead lights.
TouchPoint Edge takes the experience to the next level in applications that use many mechanical buttons. The automobile cockpit alone has many use cases: removing mechanical buttons from the steering wheel, from center and overhead console controls for HVAC and lighting, and from door panels for seating and window controls. The sensors can even be embedded into soft surfaces such as leather or foam seating to create user interfaces where mechanical buttons could not be implemented before. Other applications include appliance touch panels, smart locks, security access control panels, and elevator button panels, among many others.
“In just three years from first funding, we were able to develop, qualify and ship to OEMs and ODMs a fully integrated virtual button solution for smart surfaces,” said Mo Maghsoudnia, CEO of UltraSense Systems. “We are the only multi-mode sensor solution for smart surfaces, designed from the ground up to put neural touch processing into everything from battery-powered devices to consumer/industrial IoT products and now automotive in a big way.”
Human-machine interfaces are highly subjective, and replicating a mechanical button press under a solid surface is extremely complex. It takes more than an applied force exceeding a threshold to register a press. When a user presses a mechanical button, the applied force follows a time-varying curve, and the button responds with considerable non-linearity due to friction, hysteresis, air gaps, and spring properties, to name a few. As a result, a simple piezoresistive or MEMS force-touch strain sensor with basic algorithms and one or two trigger thresholds cannot accurately recreate the user experience of a mechanical button or eliminate false triggers.
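To see why a single trigger threshold falls short, the minimal Python sketch below contrasts a naive threshold detector with one that requires the sustained rise-and-hold profile of a deliberate press. The sample data and thresholds are illustrative only; this is not how TouchPoint Edge works internally.

# A minimal sketch contrasting a naive single-threshold trigger with a
# detector that considers the time-varying force curve of a real press.

from typing import List

THRESHOLD = 0.5  # normalized force units; illustrative only


def naive_threshold_trigger(samples: List[float]) -> bool:
    """Fires as soon as any sample exceeds the threshold.

    This misfires on brief spikes (a knock, a bump, a palm brush) that
    momentarily exceed the threshold but are not intentional presses.
    """
    return any(s > THRESHOLD for s in samples)


def curve_aware_trigger(samples: List[float], min_hold: int = 5) -> bool:
    """Requires the force to stay above the threshold for several
    consecutive samples, approximating the sustained rise-hold-release
    profile of a deliberate finger press."""
    run = 0
    for s in samples:
        run = run + 1 if s > THRESHOLD else 0
        if run >= min_hold:
            return True
    return False


# A sharp mechanical shock: one sample over threshold, then gone.
shock = [0.1, 0.9, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0]
# A deliberate press: gradual rise, sustained hold, gradual release.
press = [0.1, 0.3, 0.6, 0.7, 0.7, 0.7, 0.6, 0.2]

print(naive_threshold_trigger(shock))  # True  -> false positive
print(curve_aware_trigger(shock))      # False -> correctly rejected
print(curve_aware_trigger(press))      # True  -> correctly accepted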
TouchPoint Edge, with multi-mode sensing and an embedded Neural Touch Engine, runs machine learning and neural network algorithms on-chip so that user intention can be learned. As with TouchPoint Z, TouchPoint Edge captures the unique pattern of the user's press with respect to the surface material. That data set is then used to train the neural network to learn and discern the user's press pattern, unlike traditional algorithms that rely on a single force threshold. Once TouchPoint Edge is trained and optimized to a user's press pattern, the most natural button press can be recognized. Additionally, the unique sensor array design of the TouchPoint P transducer captures multi-channel data sets within a small, localized area, matching where a mechanical button would be located, which greatly improves the neural network's ability to replicate a button press. The Neural Touch Engine improves the user experience and is further enhanced by being tightly coupled with the proprietary sensor design of TouchPoint P for optimal performance. Finally, integrating the Neural Touch Engine into TouchPoint Edge is a game changer for system efficiency: neural processing runs 27X faster with 80% less power than offloading the same workload to an external ultra-low-power microcontroller.
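As a rough illustration of the approach this paragraph describes, learning a press pattern from multi-channel data rather than comparing one force value to a threshold, the Python sketch below trains a tiny classifier on synthetic sensor windows. The channel count, window length, synthetic data, and single-layer model are all illustrative assumptions; they are not the Neural Touch Engine.

# A minimal sketch of learning intended vs. unintended touches from
# multi-channel sensor windows. Data and model are illustrative only.

import numpy as np

rng = np.random.default_rng(0)
CHANNELS, WINDOW = 4, 16  # assumed transducer channels x time samples


def synth_press(n):
    """Deliberate presses: smooth rise-hold-release on all channels."""
    t = np.linspace(0, np.pi, WINDOW)
    base = np.sin(t)  # rise-hold-release envelope
    x = base[None, None, :] * (0.8 + 0.2 * rng.random((n, CHANNELS, 1)))
    return x + 0.05 * rng.standard_normal((n, CHANNELS, WINDOW))


def synth_noise(n):
    """Unintended events: brief shocks plus random jitter."""
    x = 0.1 * rng.standard_normal((n, CHANNELS, WINDOW))
    x[:, :, WINDOW // 2] += rng.random((n, CHANNELS))  # sharp spike
    return x


X = np.concatenate([synth_press(200), synth_noise(200)]).reshape(400, -1)
y = np.concatenate([np.ones(200), np.zeros(200)])

# Train a single-layer logistic classifier by gradient descent.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid
    grad = p - y
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
print("training accuracy:", (pred == y).mean())

Because the classifier sees the whole localized, multi-channel window at once, it can separate the sustained profile of a deliberate press from a shock or brush that a fixed threshold would misread.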
“The challenges of replacing traditional mechanical buttons with sensor-based solutions require technologies such as illumination of the solid surface, ultrasound or capacitive sensing, and force sensing,” said Nina Turner, research manager at IDC. “But those sensors alone can lead to false positives. Integrating machine learning with these touch sensors brings a new level of intelligence to the touch sensor market and would be beneficial in a wide array of devices and markets.”