Neurotechnology Releases SentiBotics Robot Navigation Development Kit

The New SentiBotics Release Provides Tools for the Development of Autonomous Robot Navigation

Neurotechnology, a provider of deep learning-based solutions, robotics, and high-precision biometric identification technologies, announced the release of the SentiBotics Navigation Software Development Kit (SDK). Designed for researchers and engineers working on autonomous robot navigation, the SentiBotics Navigation SDK provides tools for developing pathway learning, including object recognition and obstacle detection, in robotics systems.

The system learns its navigational parameters from initial user-driven input, building up a body of data that then serves as the environment model for subsequent autonomous operation. Once the relevant environmental features have been identified, the software also supports autonomous recharging.

The SentiBotics Navigation SDK can be purchased either as a complete package, a ready-to-run robotics system that includes Neurotechnology's mobile reference platform prototype, or as a software-only option for integration into existing robotics hardware. A free 30-day trial of the SDK is available for use in the Gazebo robotics simulator.

“SentiBotics Navigation SDK provides deep neural network based robot navigation software ready for integration into customers’ robotic systems,” said Dr. Povilas Daniusis, Neurotechnology team lead for robotics. “The SDK provides robust functionality for robotics engineers as well as academic and educational institutions. The robotics algorithm implementations are designed for robots operating in real indoor environments. They can also be used to compose more complex autonomous behavior.”

The Navigation SDK improves on the previously released SentiBotics Development Kit 2.0 and offers several practical advantages over other available autonomous navigation systems.

It relies on a single webcam and two low-cost ultrasonic range finders for input and supports autonomous navigation over long distances (hundreds of meters or more). The new SDK also enables navigation system training, and the system can be further adapted to visual changes in the environment through additional user input, collecting extra data in changed or otherwise problematic areas.
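
The input pipeline this implies is modest. As an illustration only, and not the SDK's own API, a per-step observation of the kind described above could be assembled roughly as follows; the camera index and the read_sonar helper are assumptions:

    # Minimal sketch: one webcam frame plus two ultrasonic range readings per
    # control step. read_sonar() is a placeholder, not a SentiBotics call.
    import cv2

    def read_sonar(sensor_id):
        """Placeholder for reading one ultrasonic range finder, in metres."""
        raise NotImplementedError("replace with the robot's own sensor driver")

    def read_observation(camera):
        ok, frame = camera.read()                     # single webcam frame
        if not ok:
            raise RuntimeError("camera read failed")
        return frame, (read_sonar(0), read_sonar(1))  # two front range readings

    camera = cv2.VideoCapture(0)                      # the single navigation webcam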

Important features in the new SentiBotics Navigation SDK include:

  • Training and execution of a visuomotor trajectory controller. Using the control interface, the user first drives the robot several times along the desired closed trajectory. During this process, training-data pairs (images and control-pad commands) are captured, and a deep neural network, imitation-learning-based motion controller is trained off-line using the TensorFlow framework and the provided controller training infrastructure (a minimal training sketch follows this list). Once the controller is fully trained, the robot may be placed at any point along the learned trajectory and will function autonomously within that environment. The controller can be executed over a wireless link from a remote machine (e.g. a computer with a GPU) or onboard using a low-power Movidius Neural Compute Stick (NCS).
  • Object learning and recognition. The learned environment can be further enriched by enrolling objects of interest or concern into the object recognition engine, selecting them with a mouse or control pad.
  • Robot and environment simulation. A test environment, complete with an office layout and simulated SentiBotics robot, is integrated into the SentiBotics Navigation SDK.
  • Basic obstacle handling. Using its two front-facing sensors, the robot stops when it detects an obstacle, such as a person crossing the trajectory, and continues moving only when the pathway is clear again (a sketch of this guard follows the list).
  • Autonomous recharging. Because the SDK combines trajectory controller execution with object recognition, the robot can be trained to recharge autonomously. Once the charging station has been enrolled within the learned environment, the robot can operate along its trajectory until it detects the need to recharge, at which point it looks for the charging station and positions itself accordingly (see the recharging sketch after this list).
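
To make the imitation-learning step in the first list item concrete, the following is a minimal training sketch, assuming recorded (image, control-pad command) pairs and a small TensorFlow network. The network layout, input resolution, and four-command action set are illustrative assumptions, not the SDK's own controller training infrastructure.

    # Sketch: fit a CNN that maps camera frames to discrete control commands
    # from demonstration data, in the spirit of imitation learning.
    import tensorflow as tf

    NUM_COMMANDS = 4   # e.g. forward, left, right, stop (assumed discretisation)

    def build_controller(input_shape=(120, 160, 3)):
        return tf.keras.Sequential([
            tf.keras.layers.Rescaling(1.0 / 255, input_shape=input_shape),
            tf.keras.layers.Conv2D(16, 5, strides=2, activation="relu"),
            tf.keras.layers.Conv2D(32, 3, strides=2, activation="relu"),
            tf.keras.layers.Conv2D(64, 3, strides=2, activation="relu"),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(128, activation="relu"),
            tf.keras.layers.Dense(NUM_COMMANDS, activation="softmax"),
        ])

    def train_controller(images, commands, epochs=10):
        """images: N x H x W x 3 frames; commands: N integer control-pad labels."""
        model = build_controller(images.shape[1:])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        model.fit(images, commands, epochs=epochs, batch_size=32)
        return model

A controller trained along these lines could then be run either from a remote GPU machine or compiled for a low-power accelerator such as the Movidius NCS, as the announcement notes.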
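
The stop-and-resume obstacle behavior in the fourth list item reduces to a simple guard on the two front range readings. A sketch, assuming a clearance threshold and a hypothetical send_command helper:

    CLEARANCE_M = 0.5   # assumed minimum clearance, in metres

    def obstacle_ahead(left_m, right_m, clearance=CLEARANCE_M):
        return left_m < clearance or right_m < clearance

    def control_step(command, left_m, right_m, send_command):
        if obstacle_ahead(left_m, right_m):
            send_command("stop")      # hold until the pathway is clear again
        else:
            send_command(command)     # otherwise follow the learned trajectory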
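
Likewise, the autonomous-recharging behavior in the last item can be pictured as a loop that follows the learned trajectory until the battery runs low and then lets the object recognition engine guide docking. Everything below (the threshold, label, and injected callables) is a hypothetical sketch, not SentiBotics code:

    LOW_BATTERY = 0.2   # assumed fraction of charge at which to seek the charger

    def navigation_loop(get_observation, controller_step, battery_level, detect, dock):
        """Follow the learned trajectory; dock when the enrolled charger is needed and seen."""
        while True:
            frame, ranges = get_observation()
            if battery_level() < LOW_BATTERY and detect(frame) == "charging_station":
                dock()                           # line up on the recognised charging station
            else:
                controller_step(frame, ranges)   # normal trajectory following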
