What if you could have a personal robot helper that transforms into a self-balancing scooter? That's the appeal of the Segway Robot, an adorable device that debuted at CES. We got a close look at it this week during Intel's Developer Forum in San Francisco, and while it's clearly a prototype (it doesn't even have an official name yet), it still has plenty of potential. Above, check out our interview with Sarah Zhang, senior director of robotics business operations at Ninebot and Segway, who dives into what makes this little bot so special.
Using an Intel RealSense camera embedded above its "face," the Segway Robot can detect depth and traverse environments without bumping into things. At the moment, its capabilities are limited to following people around on command, but it's not hard to see how it could be used for teleconferencing or home security. Imagine a connected helper like Amazon's Echo that can actually follow you around your house, for example.
At IDF this week, Segway announced that it's opening up SDKs for the robot so developers can actually make it useful. That includes a robot SDK, which gives devs access to things like vision, speech, movement and interaction, and a mobility SDK, which lets them control the bot remotely.
The Segway Robot is built on the frame of the Ninebot Mini, so it’s already a capable self-balancing scooter. The company’s engineers were able to hop on and zoom about the show floor with ease. Riding the bot didn’t go so well for me, unfortunately, but that was mostly due to my inexperience with self-balancing devices.
While there's plenty of work left to be done on the Segway Robot, it's still one of the most appealing personal bot concepts we've seen. In comparison, the ASUS Zenbo seems like a silly toy, and Anki's Cozmo, while cute, won't help much when you're away from home. Segway plans to ship developer editions of the robot later this year, and a consumer version will arrive sometime in 2017.