
Neurala Awarded Patent for Multi-Modal AI Sensing and Object Detection

New Patent Furthers Neurala’s Mission to Develop AI That Learns in the Same Way That Humans Do

Award-winning AI company Neurala announced that it has been granted a new patent for technology that will enable AI to make sense of its environment based on multiple sensory inputs, much as the human brain does.

AI systems use sensory input from cameras, microphones, infrared sensors, and the like to recognize objects and people in their environment. But current AI approaches are unable to make sense of this array of inputs, focusing only on siloed information from each individual sensor rather than taking a holistic view of how different modalities influence one another.
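For illustration only, here is a minimal sketch of the difference between siloed, per-sensor scoring and a late-fusion approach that lets modalities weigh in on a single decision. This is not the patented method; the function names, feature sizes, class count, and the random vectors standing in for real sensor embeddings are all hypothetical.

```python
import numpy as np

# Siloed approach: each sensor is scored on its own, so the modalities
# never inform one another.
def siloed_predictions(camera_feat, audio_feat, camera_head, audio_head):
    return camera_head @ camera_feat, audio_head @ audio_feat

# Fused approach: concatenate per-sensor features and score them jointly,
# so evidence from one modality can shift the interpretation of another.
def fused_prediction(camera_feat, audio_feat, joint_head):
    joint = np.concatenate([camera_feat, audio_feat])
    return int((joint_head @ joint).argmax())

# Toy usage with random vectors standing in for real sensor embeddings.
rng = np.random.default_rng(0)
cam, aud = rng.normal(size=64), rng.normal(size=32)
joint_head = rng.normal(size=(5, 96))  # 5 hypothetical object classes, 96-dim joint input
print(fused_prediction(cam, aud, joint_head))
```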

“If we want AI to effectively work in real-world environments, we need to bolster its ability to make sense of complex environments based on multiple senses, in the same way that the human brain would,” said Neurala co-founder and CTO Dr. Anatoli Gorchet, who has previously been awarded three other patents. “If you think about how we as people make sense of our surroundings, it’s through a mix of signals and senses, including what we see, hear, and perceive. That methodology has been missing in the AI ecosystem until now. The approach outlined in Neurala’s new patent allows for more robust analysis of sensory input, considering how different modalities converge to enable more human-like perception capabilities in AI.”


The new patent will further Neurala’s mission to provide the foundation for AI that can continuously learn at the edge and on-device. Neurala has already made strides in this area, having developed the Neurala Brain: proprietary, lifelong deep learning neural network (L-DNN) software that emulates the way biological brains see the world by enabling AI to continuously learn from its surroundings. With the Neurala Brain and any camera, organizations can deploy AI that can recognize objects and people in an image or video and track them as they move.
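As a rough illustration of the continuous-learning idea (not Neurala’s proprietary L-DNN), the sketch below keeps a running prototype embedding per object class and can absorb new classes on the fly without retraining from scratch. The class name, embedding size, and labels are all hypothetical.

```python
import numpy as np

class StreamingPrototypeClassifier:
    """Toy lifelong learner: maintains a running mean embedding per class
    and can add new classes as they appear in the data stream."""

    def __init__(self):
        self.prototypes = {}  # label -> (mean_vector, sample_count)

    def update(self, embedding, label):
        mean, n = self.prototypes.get(label, (np.zeros_like(embedding), 0))
        self.prototypes[label] = ((mean * n + embedding) / (n + 1), n + 1)

    def predict(self, embedding):
        # Nearest prototype by Euclidean distance.
        return min(self.prototypes,
                   key=lambda lbl: np.linalg.norm(embedding - self.prototypes[lbl][0]))

# Usage: feed embeddings from any camera's feature extractor as frames arrive.
clf = StreamingPrototypeClassifier()
rng = np.random.default_rng(1)
clf.update(rng.normal(size=16), "forklift")
clf.update(rng.normal(size=16), "person")
print(clf.predict(rng.normal(size=16)))
```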


“As machines incorporate more and more sensors, we need to build AI that can make sense of the wealth of information at its disposal, and respond as intelligently and accurately as possible,” Gorchet continued. “With the Neurala Brain, and the capabilities made possible by our latest patent, Neurala will be at the forefront of delivering AI that truly leverages this zoo of hardware options.”


U.S. Patent No. 10,083,523, "Methods and apparatus for autonomous robotic control," was issued on September 25, 2018. Neurala has more than 20 additional patent applications pending.

Neurala is helping to make AI solutions like these more accessible to organizations with Brain Builder: a complete SaaS platform to streamline creation, deployment, analysis and management of deep learning applications.

Interested companies and universities can apply to be part of the Brain Builder beta program. Users who sign up for the initial Brain Builder beta program will have first access to the full suite when it launches later this year.

