Building on my exploration of NextMind, I took the chance to see how far we could push the tech and integrate it with our Stretch RE1 robot, which we lovingly named Rory.
There are many potential applications for the Stretch RE1. Its open-source approach and interchangeable tools make it a versatile and adaptable platform. With NextMind control specifically, the obvious application is as an assistive device: people with fine motor control challenges who may struggle with traditional, fiddly control surfaces like joysticks would benefit from a robot assistant they can direct by focus alone. There may also be hybrid applications that combine the NextMind with traditional control surfaces, for example using the NextMind to control larger movements whilst simultaneously using a joystick to finely manipulate the gripper or another tool head.
Stretch RE1 has some out-of-the-box functionality provided by Hello Robot, especially around vision and object identification. Some of these demos even focus on simple assistive scenarios, like mouth finding for assistive feeding. For Rory specifically, we have been working to integrate Azure Cognitive Services to enable more intelligence. Currently Rory can run a fully voice-interactive chatbot, reply with generated speech, and understand a wide range of commands, from simple questions like ‘what is your name’ to full voice-controlled manoeuvrability. Rory also has a great camera system that we have connected to Azure to recognise objects, people, and faces, and he can measure distances using the D435i depth sensor. Rory can use all of this visual information to describe what he can see and make decisions about objects and their locations.
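To give a flavour of the voice side, here is a minimal sketch of a listen-and-reply loop using the Azure Speech SDK for Python. It assumes the azure-cognitiveservices-speech package; the credentials and the handle_command function are illustrative placeholders, not Rory's actual chatbot logic.

```python
# Minimal voice-interaction loop with the Azure Speech SDK
# (pip install azure-cognitiveservices-speech).
# SPEECH_KEY / SPEECH_REGION and handle_command() are placeholders,
# not Rory's actual configuration or chatbot logic.
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="SPEECH_KEY", region="SPEECH_REGION")
recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config)
synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)

def handle_command(text: str) -> str:
    # Hypothetical stand-in for the chatbot / command parser.
    if "what is your name" in text.lower():
        return "My name is Rory."
    return "Sorry, I didn't understand that."

while True:
    # Listen for a single utterance from the default microphone.
    result = recognizer.recognize_once()
    if result.reason == speechsdk.ResultReason.RecognizedSpeech:
        reply = handle_command(result.text)
        # Speak the reply back with generated speech.
        synthesizer.speak_text_async(reply).get()
```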
On the NextMind side, we have a simple Unity interface with NeuroTag controls. Using the ‘On Triggered’ event we start a timer, and on ‘On Release’ we call a script that sends a command to the Python server via REST, where the amount to move is proportional to how long focus was maintained. A sketch of the server side follows below.
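The receiving end could look something like this minimal Flask sketch, which assumes Hello Robot's stretch_body Python library. The /move route, the JSON payload shape, and the scaling constant are assumptions for illustration, not our exact implementation.

```python
# Minimal REST endpoint for focus-timed movement commands
# (pip install flask; stretch_body ships with the Stretch RE1).
# The /move route, JSON payload shape, and SECONDS_TO_METRES ratio
# are illustrative assumptions, not our exact implementation.
from flask import Flask, jsonify, request
import stretch_body.robot

app = Flask(__name__)
robot = stretch_body.robot.Robot()
robot.startup()

# Ratio of focus time to distance travelled; lower it for finer control.
SECONDS_TO_METRES = 0.05

@app.route("/move", methods=["POST"])
def move():
    payload = request.get_json()
    focus_seconds = float(payload["focus_seconds"])  # time the NeuroTag was held
    direction = 1 if payload.get("direction", "forward") == "forward" else -1
    distance = direction * focus_seconds * SECONDS_TO_METRES

    # Queue and execute a base translation proportional to focus time.
    robot.base.translate_by(distance)
    robot.push_command()
    return jsonify({"moved_m": distance})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

The SECONDS_TO_METRES constant is where the time-to-movement ratio mentioned below would be tuned or, eventually, set from the interface.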
This gives us a more precise way to control the robot's movement. The ratio of focus time to distance moved can also be adjusted for finer control; eventually we'd like to expose that as an option in the interface.
Watch a video of the NextMind-controlled Stretch RE1 here: