Artificial Intelligence | News | Insights | AiThority

Motion Gestures Launches Breakthrough Gesture Recognition Platform

Revolutionary Software Enables Creation Of A Gesture Interface In Minutes, Without Programming Or Data Input; Closes $1.65 Million USD In Seed Financing

Motion Gestures, a Canadian machine learning startup specializing in gesture recognition software, launched a breakthrough platform for gesture recognition applications. The software is a significant advance over conventional approaches and dramatically reduces the time, cost, and effort of building a gesture interface for any system, device, or app. It supports gesture recognition with a wide variety of motion, touch, and vision sensors and can be deployed on cloud, gateway/hub, and embedded platforms.

The patent-pending technology advances the state of the art in gesture recognition in six fundamental ways:

  • No need to code custom gestures. Just draw the desired motion trajectory on a mobile or tablet screen and edit it.
  • No need to collect data for training gesture models. The system generates data automatically for model training.
  • No need to build gesture models for different sensors. Once a model is ready, utilize it with any supported sensor.
  • No need to build gesture models for different platforms. Once a model is ready, deploy it on any supported platform.
  • No limitation in gesture choices. Choose from a library comprising letters, numbers, shapes, symbols, and rotations.
  • No complicated testing procedures. Test any gesture using your smartphone and receive comprehensive feedback.
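The draw-a-trajectory workflow described above rests on a well-known idea: a drawn stroke can be resampled, normalized, and compared against a library of gesture templates. The sketch below is a generic, minimal illustration of that template-matching approach (in the spirit of the classic $1 Unistroke Recognizer), not Motion Gestures’ actual algorithm or API; the gesture names and sample points are hypothetical.

```python
import math

def resample(points, n=32):
    """Resample a drawn trajectory to n roughly equally spaced points."""
    dists = [math.dist(points[i - 1], points[i]) for i in range(1, len(points))]
    interval = sum(dists) / (n - 1)
    pts = list(points)
    resampled = [pts[0]]
    acc = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if acc + d >= interval and d > 0:
            # Interpolate a new point at the target spacing.
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(resampled) < n:       # pad against floating-point shortfall
        resampled.append(pts[-1])
    return resampled[:n]

def normalize(points):
    """Translate to the centroid and scale to unit size, so position
    and size of the drawing do not matter."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    shifted = [(x - cx, y - cy) for x, y in points]
    size = max(abs(c) for p in shifted for c in p) or 1.0
    return [(x / size, y / size) for x, y in shifted]

def recognize(stroke, templates):
    """Return the name of the template with the smallest average
    point-to-point distance to the drawn stroke."""
    probe = normalize(resample(stroke))
    def score(name):
        tmpl = templates[name]
        return sum(math.dist(p, q) for p, q in zip(probe, tmpl)) / len(probe)
    return min(templates, key=score)

# Hypothetical two-gesture library: an "L" shape and a horizontal line.
templates = {
    "L":    normalize(resample([(0, 0), (0, 2), (1, 2)])),
    "line": normalize(resample([(0, 0), (2, 0)])),
}
print(recognize([(0.1, 0.0), (0.1, 1.9), (1.1, 2.0)], templates))  # → L
```

A production system like the one announced here layers much more on top (rotation handling, 3D motion sensors, learned models trained on synthetically generated data), but the template idea shows why a user can define a gesture simply by drawing it once.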


Motion Gestures also announced today the closing of $1.65 million USD in seed investment. China Canada Angels Alliance (CCAA) led the round, which also included participation from Golden Triangle Angel Network (GTAN), Keiretsu Forum, and Propel(x). The financing will be used to further the development of the company’s gesture recognition platform and initiate international marketing.

Alan Yang, Vice President of CCAA, said: “Gesture recognition is the new frontier of human-machine interaction. We are very impressed with the ability of Motion Gestures’ platform to popularize and demystify gesture interaction. We see major applications of its gesture recognition software in key business verticals, such as wearables, automotive, mobile phones, consumer electronics, home automation, medical devices, virtual reality, toys, drones, and robotics, to name a few.”


“We believe gesture recognition as a human-machine interface is today where voice recognition was five years ago,” said Kashif Kahn, CEO and Co-Founder of Motion Gestures. “We want to dominate this emerging space by offering a powerful solution that eliminates the difficulties of building a gesture interface and helps bring gesture control into the consumer mainstream. We invite developers to experience the power of our breakthrough platform with a free developer license.”

Motion Gestures specializes in machine learning-based gesture recognition software and was founded in Waterloo, Canada in 2016. The company aims to enable rapid development and deployment of gesture-enabled interfaces for systems, devices, and apps. Its software supports advanced 2D and 3D gestures using motion, touch, and vision sensors and can be deployed on cloud, gateway/hub, and embedded platforms. Developers are offered free licenses to the cloud SDK and can quickly create and test any gesture using their smartphones.
