
Introduction

NaoYarp is an open-source project that provides a convenient way to develop applications for Aldebaran's humanoid robot, Nao. It aims to supply a generic and consistent interface that allows source code to be written independently of the underlying programming platform, the NaoQi middleware, and to further support running the same algorithm implementations on different robotic platforms. To fulfill that purpose, it relies on Yet Another Robot Platform (YARP), which abstracts away the hardware and provides the necessary generality, while internally it employs NaoQi to access and communicate with the robot devices. NaoYarp's ultimate goal is to model all of Nao's sensors and actuators in a fast and efficient way, as well as to supply network wrappers for remote access to and communication with the robot devices.

YARP Devices

NaoYarp models the Nao robot by grouping its devices into categories. These devices follow the concept of generic devices, as defined in YARP, and implement the appropriate interfaces in each case.

To begin with, the first group of devices comprises the robot's motor actuators. They are divided into sub-categories according to the joint chain (or limb of the robot) to which they belong, and the actuators in each joint chain are modeled as one part of the device. Thus, the following actuator devices are formed, each controlling one or more robot servos (a usage sketch follows the list):

  • Head Chain
  • Left Arm
  • Right Arm
  • Left Leg
  • Right Leg

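Each of these parts is reached over the network in the standard YARP way, through a client-side PolyDriver. The following is a minimal sketch, not taken from the NaoYarp sources; the port names (/nao/left_arm, /client/left_arm) are assumptions made purely for illustration.

```cpp
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>

int main()
{
    yarp::os::Network network;                      // initialise the YARP network

    yarp::os::Property options;
    options.put("device", "remote_controlboard");   // standard YARP client device
    options.put("remote", "/nao/left_arm");         // assumed server-side port prefix
    options.put("local",  "/client/left_arm");      // local port prefix for this client

    yarp::dev::PolyDriver driver(options);
    if (!driver.isValid()) {
        return 1;                                   // the device could not be opened
    }

    // ... view the device as the interfaces listed in the next section ...

    driver.close();
    return 0;
}
```
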
Motor devices provide angle-space and Cartesian-space control, can get and set the impedance of each joint in the chain, and give access to the encoder data. Their semantics are defined by YARP interfaces, and each device implements the following:

  • IPositionControl
  • IControlLimits
  • IEncoders
  • IImpedanceControl
  • ICartesianControl

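Once such a device is open, the interfaces listed above are obtained through PolyDriver::view(), which is standard YARP usage rather than anything NaoYarp-specific. The sketch below assumes a head part published under /nao/head, reads the encoders and then commands a small position move; the port names and joint index are illustrative assumptions.

```cpp
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/ControlBoardInterfaces.h>
#include <vector>
#include <cstdio>

int main()
{
    yarp::os::Network network;

    yarp::os::Property options;
    options.put("device", "remote_controlboard");
    options.put("remote", "/nao/head");             // assumed port for the head chain
    options.put("local",  "/client/head");

    yarp::dev::PolyDriver driver(options);
    if (!driver.isValid()) return 1;

    yarp::dev::IPositionControl* pos = 0;
    yarp::dev::IEncoders*        enc = 0;
    if (!driver.view(pos) || !driver.view(enc)) return 1;

    int axes = 0;
    enc->getAxes(&axes);                            // number of joints in this chain

    std::vector<double> angles(axes, 0.0);
    enc->getEncoders(&angles[0]);                   // current joint angles (degrees)
    std::printf("joint 0 is at %.2f degrees\n", angles[0]);

    pos->positionMove(0, angles[0] + 10.0);         // small move of the first joint

    driver.close();
    return 0;
}
```
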
Since the robot's current API, exposed through NaoQi (the official middleware), does not map onto the full set of these interfaces, several methods were omitted and left unimplemented. This is especially the case for Cartesian-space control, where the available access methods provide only very limited functionality.

Continuing with the modeling, the camera is accessed and configured through the frame grabber and frame grabber controls interfaces, respectively. The current implementation supports only RGB image retrieval, although the camera's native image format is YUV422. The accelerometers, gyroscopes, force-sensitive resistors and ultrasonic devices are exposed through the generic sensor interface.

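As a hedged illustration of the camera access, the sketch below fetches one RGB frame through YARP's standard frame grabber client, matching the RGB-only retrieval mentioned above. The port name /nao/camera is an assumption and is not specified on this page.

```cpp
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/FrameGrabberInterfaces.h>
#include <yarp/sig/Image.h>
#include <cstdio>

int main()
{
    yarp::os::Network network;

    yarp::os::Property options;
    options.put("device", "remote_grabber");        // client side of a camera wrapper
    options.put("remote", "/nao/camera");           // assumed server-side port
    options.put("local",  "/client/camera");

    yarp::dev::PolyDriver driver(options);
    if (!driver.isValid()) return 1;

    yarp::dev::IFrameGrabberImage* grabber = 0;
    if (!driver.view(grabber)) return 1;

    yarp::sig::ImageOf<yarp::sig::PixelRgb> frame;
    if (grabber->getImage(frame)) {                 // RGB retrieval, as noted above
        std::printf("received a %dx%d RGB frame\n",
                    (int)frame.width(), (int)frame.height());
    }

    driver.close();
    return 0;
}
```
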
Future Work

Finally, in order to cope with the different editions of the robot, Academic and RoboCup, as well as with the different versions of the middleware, another set of interfaces was created. These interfaces closely mirror the actual API methods, but are common to all versions of NaoQi. The appropriate implementations of these interfaces are selected by the build system's configuration. For the time being, most of the testing has been done on RoboCup editions, with NaoQi v1.10.25 or later.
