Neurotechnology Releases SentiBotics Robot Navigation Development Kit

VILNIUS, Lithuania, Dec. 18, 2018 /PRNewswire/ -- Neurotechnology, a provider of deep learning-based solutions, robotics and high-precision biometric identification technologies, today announced the release of the SentiBotics Navigation Software Development Kit. Designed for researchers and engineers working on autonomous robot navigation, the SentiBotics Navigation SDK provides tools for developing pathway learning, object recognition and obstacle detection in robotics systems.

The system learns its navigation behavior from initial user-driven demonstration runs, building a dataset of the environment that then serves as the basis for subsequent autonomous operation. The software also supports autonomous recharging once the relevant environmental features, such as the charging station, have been learned.

SentiBotics Navigation SDK can be purchased either as a complete package, a ready-to-run robotics system that includes Neurotechnology's mobile reference platform prototype, or as a software-only option for integration into existing robotics hardware. A free 30-day trial of the SDK is available for use in the Gazebo robotics simulator.

"SentiBotics Navigation SDK provides deep neural network based robot navigation software ready for integration into customers' robotic systems," said Dr. Povilas Daniusis, Neurotechnology team lead for robotics. "The SDK provides robust functionality for robotics engineers as well as academic and educational institutions. The robotics algorithm implementations are designed for robots operating in real indoor environments. They can also be used to compose more complex autonomous behavior."

The Navigation SDK improves on the previously released SentiBotics Development Kit 2.0 and offers several practical advantages over other available autonomous navigation systems.

It relies on a single webcam and two low-cost ultrasonic range finders for input and supports autonomous navigation over long distances (hundreds of meters or more). The new SDK also enables navigation system training, and the system can be further adapted to visual changes in the environment through additional user-driven data collection in changed areas or problematic locations.

Important features in the new SentiBotics Navigation SDK include:

    --  Training and execution of visuomotor trajectory controls. Via the
        control interface, the user first drives the robot several times along
        the desired closed trajectory. During these runs, training-data pairs
        (images and control pad commands) are captured, and a motion controller
        based on a deep neural network and imitation learning is trained
        off-line using the TensorFlow framework and the provided controller
        training infrastructure (see the training sketch after this list). Once
        the controller is fully trained, the robot may be placed at any point
        along the learned trajectory and will function autonomously within that
        environment. The controller can be executed over a wireless link from a
        remote machine (e.g. a computer with a GPU) or onboard using a
        low-power Movidius Neural Compute Stick (NCS).
    --  Object learning and recognition. The controller-learned environment can
        be further enriched by enrolling objects of interest into the object
        recognition engine, selecting them with a mouse or pad.
    --  Robot and environment simulation. A test environment, complete with an
        office layout and simulated SentiBotics robot, is integrated into the
        SentiBotics Navigation SDK.
    --  Basic obstacle handling. Using two front-facing sensors, the robot
        stops when it detects an obstacle, such as a person crossing the
        trajectory, and continues its movement only when the pathway is clear
        again (see the behavior-loop sketch after this list).
    --  Autonomous recharging. Because the SDK combines trajectory controller
        execution with object recognition, it is possible to train for
        autonomous recharging. Once the charging station within the learned
        environment has been enrolled, the robot operates along its trajectory
        until it detects the need for recharging, at which point it looks for
        the charging station and positions itself accordingly (also illustrated
        in the behavior-loop sketch after this list).
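
To make the imitation-learning step concrete, the following is a minimal, illustrative sketch of how a trajectory controller of this kind could be trained with TensorFlow/Keras from recorded demonstration data. It is not the SDK's own training infrastructure: the image size, the command label set and all file and function names are assumptions made for the example.

    # Illustrative imitation-learning controller training (not SentiBotics code).
    # Assumed setup: camera frames and the matching control-pad commands were
    # logged while the user drove the robot along the desired trajectory.
    import numpy as np
    import tensorflow as tf

    COMMANDS = ["forward", "left", "right", "stop"]  # assumed command set

    def build_controller(input_shape=(120, 160, 3)):
        """Small CNN mapping a camera frame to a control-pad command."""
        return tf.keras.Sequential([
            tf.keras.layers.Input(shape=input_shape),
            tf.keras.layers.Rescaling(1.0 / 255),
            tf.keras.layers.Conv2D(16, 5, strides=2, activation="relu"),
            tf.keras.layers.Conv2D(32, 3, strides=2, activation="relu"),
            tf.keras.layers.Conv2D(64, 3, strides=2, activation="relu"),
            tf.keras.layers.GlobalAveragePooling2D(),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(len(COMMANDS), activation="softmax"),
        ])

    def train_controller(images, commands, epochs=20):
        """images: (N, H, W, 3) uint8 frames; commands: (N,) integer labels."""
        model = build_controller(images.shape[1:])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        model.fit(images, commands, batch_size=32, epochs=epochs,
                  validation_split=0.1)
        return model

    if __name__ == "__main__":
        # Random placeholder data standing in for the logged demonstration runs.
        demo_images = np.random.randint(0, 256, size=(256, 120, 160, 3),
                                        dtype=np.uint8)
        demo_commands = np.random.randint(0, len(COMMANDS), size=(256,))
        controller = train_controller(demo_images, demo_commands, epochs=2)
        controller.save("trajectory_controller.keras")

As described above, the controller is trained off-line and can then be executed either on a remote machine with a GPU or onboard via a Movidius NCS.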
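
The obstacle-handling and autonomous-recharging behaviors can likewise be pictured as a simple control loop layered on top of the trained controller. The loop below is only a hedged illustration of that idea: the robot interface (read_ultrasonic, read_camera, send_command, battery_low, detect_charging_station, dock) and the distance threshold are hypothetical stand-ins, not SentiBotics API calls.

    # Illustrative behavior loop (hypothetical robot interface, not the SDK's API).
    import time
    import numpy as np

    COMMANDS = ["forward", "left", "right", "stop"]  # assumed command set
    OBSTACLE_DISTANCE_M = 0.4                        # assumed stop threshold

    def navigation_loop(controller, robot, period_s=0.1):
        while True:
            # Basic obstacle handling: pause while either front-facing
            # ultrasonic range finder reports something close.
            if min(robot.read_ultrasonic()) < OBSTACLE_DISTANCE_M:
                robot.send_command("stop")
                time.sleep(period_s)
                continue

            frame = robot.read_camera()  # (H, W, 3) uint8 camera frame

            # Autonomous recharging: when the battery runs low, dock as soon
            # as the enrolled charging station is recognized in the frame.
            if robot.battery_low() and robot.detect_charging_station(frame):
                robot.dock()
                return

            # Otherwise follow the learned trajectory with the trained controller.
            probs = controller.predict(frame[np.newaxis], verbose=0)[0]
            robot.send_command(COMMANDS[int(np.argmax(probs))])
            time.sleep(period_s)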

SentiBotics Navigation SDK is available through Neurotechnology or from distributors worldwide. For more information and a trial version, go to: www.neurotechnology.com. As with all Neurotechnology products, the latest version is available as a free upgrade to existing SentiBotics customers.

For more information about SentiBotics, please visit our website.

About Neurotechnology
Neurotechnology is a developer of high-precision algorithms and software based on deep neural networks and other AI-related technologies. The company was launched in 1990 in Vilnius, Lithuania, with the key idea of using neural networks for various applications, such as biometric person identification, computer vision, robotics and artificial intelligence. Since the first release of its fingerprint identification system in 1991, the company has delivered more than 200 products and version upgrades. More than 3,000 system integrators, security companies and hardware providers in more than 140 countries integrate Neurotechnology's algorithms into their products. The company's algorithms have achieved top results in independent technology evaluations, including NIST MINEX and IREX.

Media Contact:
Jennifer Allen Newton
Bluehouse Consulting Group, Inc.
+1-503-805-7540
jennifer (at) bluehousecg (dot) com

View original content to download multimedia: http://www.prnewswire.com/news-releases/neurotechnology-releases-sentibotics-robot-navigation-development-kit-300767884.html

SOURCE Neurotechnology