Controlling the mobile robot Kobuki

Kobuki is a research mobile robot released by Yujin Robotics. It is about the same size as the Roomba vacuum-cleaning robot and can be controlled from a PC or similar device via a USB serial connection. It is equipped with digital I/O, serial input/output, power supply connectors, buttons, LEDs, and so on, making it well suited for use as an experimental robot.

A script that installs the software needed to run the Kobuki samples described below can be downloaded and run as follows.

 $ wget  http://svn.openrtm.org/Embedded/trunk/RaspberryPi/tools/rpi.sh
 $ chmod 755 rpi.sh
 $ sudo ./rpi.sh hostname --type kobuki

Running this script automatically performs the environment setup and the compilation of the Kobuki sample described in the following explanation.

(G)Connection between Raspberry Pi and Kobuki

The figure below shows the main panel of Kobuki.

kobuki_panel.png
Kobuki DC output connector

The 5V 1A DC output connector is used to supply power to the Raspberry Pi, and the USB connector is used for the connection with the Raspberry Pi.

(G)Power supply

Kobuki has a DC output connector capable of supplying 5 V at 1 A, and the Raspberry Pi can be powered from it.

The 5V 1A output uses a connector with the following part numbers.

Kobuki 5V 1A connector
Housing Molex PN: 43645-0200
Terminal Molex PN: 43030-0001

kobuki5v_connector.png
Connector for Kobuki DC 5V 1A

These connectors can also be purchased at the RT robot shop and elsewhere.

Power is supplied to the Raspberry Pi by making a DC connector to USB conversion cable as shown below.

kobuki_raspberry_dccable.png
DC cable for Raspberry Pi

In recent years, many mobile batteries with USB output terminals for smartphones have come on the market, and these can also be used as a power supply.

battery.png
Battery for smartphone

(G)USB

Connect Kobuki and the Raspberry Pi with the USB cable supplied with Kobuki. On the Raspberry Pi side, it appears as /dev/ttyUSB0.

 $ ls /dev/ttyUSB*
 /dev/ttyUSB0

(G)Connection

Mount the Raspberry Pi on Kobuki and connect the power supply and USB cable. If the Raspberry Pi is connected via wireless LAN, Kobuki can be controlled wirelessly.

kobuki_and_raspi.png
Kobuki with Raspberry Pi

Since the Raspberry Pi may fall off while Kobuki is moving, it is a good idea to fix it in place with Velcro tape or the like.

(G)Compiling the Kobuki AIST RT component

Compilation of the RT component was also covered in the previous section, but we will go over it again here. First, check out the Kobuki AIST RT component from the following repository and build it.

  $ svn co http://svn.openrtm.org/components/trunk/mobile_robots/kobuki
  $ cd kobuki
  $ mkdir build
  $ cd build
  $ cmake -DCMAKE_INSTALL_PREFIX=/usr ..
  $ make
  $ cd src
  $ sudo make install

With this, the Kobuki AIST RTC is built and the executable file
  • /usr/lib/openrtm-1.1/rtc/KobukiAISTComp
should be installed.

Try starting it. Since access to the device file /dev/ttyUSB0 requires root privileges, run it with sudo.

 $ rtm-naming
 $ sudo /usr/lib/openrtm-1.1/rtc/KobukiAISTComp

If you start RTSystemEditor and connect to the Raspberry Pi's host name or IP address, you should see a component called KobukiAIST0. Click it to display the Configuration dialog.

It is designed so that LED1, LED2, and so on can be operated; select RED, GREEN, etc. with the radio buttons, and the corresponding LED will light up.

(G)Automatic activation of Kobuki AIST component

Let's set up the Kobuki AIST component so that it starts automatically when the Raspberry Pi boots. That way, when you turn on Kobuki's power, the Raspberry Pi and the Kobuki AIST component start automatically, and you can operate Kobuki via the RTC without having to log in to the Raspberry Pi each time.

Create the following script as /etc/kobuki.sh.

 $ sudo vi /etc/kobuki.sh

The contents of kobuki.sh are as follows.

 #!/bin/sh
 #
 # KobukiAIST RTC launch script
 #
 #       Copyright Noriaki Ando <n-ando@openrtm.org>
 #       2011.03.27
 #
 # This script should be executed from rc script like a rc.local
 # as the following command line.
 #
 #
 ns=/usr/bin/rtm-naming
 kobukiRTC=/usr/lib/openrtm-1.1/rtc/KobukiAISTComp
 workdir=/tmp/kobuki
 
 $ns
 sleep 5
 
 if test -d $workdir ; then
         echo ""
 else
         mkdir $workdir
 fi
 
 cd $workdir
 
 while :
 do
     rm -f $workdir/*.log
     $kobukiRTC
     sleep 5
 done

Give execute privilege.

 $ sudo chmod 755 /etc/kobuki.sh

Furthermore, to start it automatically, insert the following line before the final exit 0 in /etc/rc.local.

 /etc/kobuki.sh 2>&1 | perl -p -e 's/\n/\r\n/g' 1>&2 &
 exit 0

Now, when the Raspberry Pi starts up, the Kobuki AIST component also starts automatically. Even if the Kobuki AIST component exits, it is restarted after 5 seconds. As long as Kobuki is powered on, the Kobuki AIST component stays resident.

(G)Operation of Kobuki component

(G)Operation by TkJoystick

TkJoyStick is a component included as a sample in OpenRTM-aist-Python. However, its outputs are only the joystick XY values and the wheel speeds for a differential-drive (opposed two-wheel) mobile robot; it has no output for a two-dimensional velocity vector (TimedVelocity2D).

Modify the TkJoyStick component so that it outputs a two-dimensional velocity vector (TimedVelocity2D), connect it to Kobuki, and operate the robot.

On Windows, the sample is also installed in one of the following directories (x.y is the version number):

  • C:\Program Files (x86)\OpenRTM-aist\x.y\examples\Python\TkJoyStick
  • C:\Program Files\OpenRTM-aist\x.y\examples\Python\TkJoyStick

(G)Hint

In TkJoystick.py, the left and right wheel speeds are calculated. From these, the translational speed v and the angular velocity ω can be derived by considering the kinematics of the mobile robot. (For reference, see the material by Professor Kumagai of Tohoku Gakuin University.)

TimedVelocity2D has the following data structure.

 struct Velocity2D
 {
    double vx; // translational speed (forward) [m/s]
    double vy; // translational speed (lateral) [m/s]; 0 for a differential-drive robot
    double va; // angular velocity [rad/s]
 };
 struct TimedVelocity2D
 {
   Time tm;
   Velocity2D data;
 };
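
As a hint only, the kinematic conversion and the packing into TimedVelocity2D could look like the following C++ sketch (TkJoyStick itself is a Python component, but the calculation is the same). It assumes the left and right wheel speeds are already linear velocities in m/s and that the tread (distance between the two wheels) is known; the function name wheelsToVelocity and its parameters are illustrative, not part of the sample.

 #include <rtm/idl/ExtendedDataTypesSkel.h>  // defines RTC::TimedVelocity2D (OpenRTM-aist 1.1)
 
 // Illustrative sketch: convert left/right wheel speeds [m/s] of a
 // differential-drive robot into a TimedVelocity2D velocity command.
 // "tread" is the distance between the two wheels [m].
 RTC::TimedVelocity2D wheelsToVelocity(double v_left, double v_right,
                                       double tread)
 {
   RTC::TimedVelocity2D vel;
   vel.data.vx = (v_right + v_left) / 2.0;    // translational speed v [m/s]
   vel.data.vy = 0.0;                         // always 0 for this robot type
   vel.data.va = (v_right - v_left) / tread;  // angular velocity omega [rad/s]
   // (the tm field should be set to the current time before writing to an OutPort)
   return vel;
 }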

(G)Move autonomously

Next, let's move Kobuki autonomously using its sensors. Kobuki is equipped with bumper sensors, proximity (far/near) sensors, and cliff sensors. Here, like a Roomba, we move the robot with a simple algorithm: keep going forward, and when a wall is detected, back up a little, rotate, and go forward again. (A Roomba actually moves a bit more cleverly...)

The sensor output of the Kobuki AIST RTC is as follows. The IR sensors are for receiving infrared signals from the docking station and cannot be used for obstacle detection. Therefore, only the bumpers and cliff sensors can be used to detect obstacles and cliffs.

No.  Enum name            Meaning
0    RIGHT_BUMPER         Right bumper
1    CENTER_BUMPER        Center bumper
2    LEFT_BUMPER          Left bumper
3    RIGHT_WHEEL_DROP     Right wheel drop
4    LEFT_WHEEL_DROP      Left wheel drop
5    RIGHT_CLIFF          Right cliff sensor
6    CENTER_CLIFF         Center cliff sensor
7    LEFT_CLIFF           Left cliff sensor
8    RIGHT_IRFAR_RIGHT    Right IR receiver / dock far right
9    RIGHT_IRFAR_CENTER   Right IR receiver / dock far center
10   RIGHT_IRFAR_LEFT     Right IR receiver / dock far left
11   RIGHT_IRNEAR_RIGHT   Right IR receiver / dock near right
12   RIGHT_IRNEAR_CENTER  Right IR receiver / dock near center
13   RIGHT_IRNEAR_LEFT    Right IR receiver / dock near left
14   CENTER_IRFAR_RIGHT   Center IR receiver / dock far right
15   CENTER_IRFAR_CENTER  Center IR receiver / dock far center
16   CENTER_IRFAR_LEFT    Center IR receiver / dock far left
17   CENTER_IRNEAR_RIGHT  Center IR receiver / dock near right
18   CENTER_IRNEAR_CENTER Center IR receiver / dock near center
19   CENTER_IRNEAR_LEFT   Center IR receiver / dock near left
20   LEFT_IRFAR_RIGHT     Left IR receiver / dock far right
21   LEFT_IRFAR_CENTER    Left IR receiver / dock far center
22   LEFT_IRFAR_LEFT      Left IR receiver / dock far left
23   LEFT_IRNEAR_RIGHT    Left IR receiver / dock near right
24   LEFT_IRNEAR_CENTER   Left IR receiver / dock near center
25   LEFT_IRNEAR_LEFT     Left IR receiver / dock near left
26   KOBUKI_DOCKED        Docking completed
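
For readability in the examples further below, the indices usable for obstacle and cliff detection can be given names, for instance as the following illustrative constants (the enum name KobukiSensorIndex is our own; the values simply mirror the table above):

 // Illustrative index constants mirroring Nos. 0-7 of the table above.
 // Only these entries are useful for obstacle/cliff detection.
 enum KobukiSensorIndex
 {
   RIGHT_BUMPER     = 0,
   CENTER_BUMPER    = 1,
   LEFT_BUMPER      = 2,
   RIGHT_WHEEL_DROP = 3,
   LEFT_WHEEL_DROP  = 4,
   RIGHT_CLIFF      = 5,
   CENTER_CLIFF     = 6,
   LEFT_CLIFF       = 7
 };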

To receive these outputs, one InPort of type RTC::TimedBooleanSeq is required. In addition, one OutPort of type RTC::TimedVelocity2D is required for outputting the velocity command to Kobuki.

Basic profile
  Component name: KobukiAutoMove
  Module overview: Kobuki auto-move component
  Version: 1.0.0
  Vendor name: AIST
Activity
  onInitialize, onFinalize, onActivated, onDeactivated, onExecute
Data ports
  [in] bumper
    Overview: sensor information; true: obstacle detected (bumper contact, wheel drop, cliff detected), false: no obstacle
    Data type: TimedBooleanSeq
    Details: data[0]: right bumper, data[1]: center bumper, ... data[7]: left cliff sensor (see the table above)
  [out] targetVelocity
    Overview: velocity vector of the mobile robot
    Data type: TimedVelocity2D
    Details: vx: translational speed, vy: 0.0, va: angular velocity
    Units: vx [m/s], va [rad/s]
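
For reference, these data ports might be declared in a C++ component header roughly as follows. This is a minimal sketch, not generated code: the class and member names (KobukiAutoMove, m_bumper, m_bumperIn, m_targetVelocity, m_targetVelocityOut) are illustrative, and the include paths correspond to OpenRTM-aist 1.1 and may differ in other versions. In practice, RTC Builder generates an equivalent skeleton from the profile.

 #include <rtm/Manager.h>
 #include <rtm/DataFlowComponentBase.h>
 #include <rtm/DataInPort.h>
 #include <rtm/DataOutPort.h>
 #include <rtm/idl/BasicDataTypeSkel.h>      // RTC::TimedBooleanSeq
 #include <rtm/idl/ExtendedDataTypesSkel.h>  // RTC::TimedVelocity2D
 
 class KobukiAutoMove : public RTC::DataFlowComponentBase
 {
 public:
   KobukiAutoMove(RTC::Manager* manager)
     : RTC::DataFlowComponentBase(manager),
       m_bumperIn("bumper", m_bumper),
       m_targetVelocityOut("targetVelocity", m_targetVelocity)
   {
   }
 
   virtual RTC::ReturnCode_t onInitialize()
   {
     addInPort("bumper", m_bumperIn);                    // register [in] bumper
     addOutPort("targetVelocity", m_targetVelocityOut);  // register [out] targetVelocity
     return RTC::RTC_OK;
   }
 
   virtual RTC::ReturnCode_t onExecute(RTC::UniqueId ec_id);  // main logic (see the hint below)
 
 protected:
   // [in] bumper: sensor flags from the KobukiAIST component (see the table above)
   RTC::TimedBooleanSeq m_bumper;
   RTC::InPort<RTC::TimedBooleanSeq> m_bumperIn;
 
   // [out] targetVelocity: velocity command sent to the KobukiAIST component
   RTC::TimedVelocity2D m_targetVelocity;
   RTC::OutPort<RTC::TimedVelocity2D> m_targetVelocityOut;
 };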

Using the above information as a clue, please create a simple component that will autonomously move Kobuki. If you are having trouble connecting components, please see Troubleshooting.

(G)Hint

Kobuki's velocity command is of the TimedVelocity2D type shown above, which has the following data structure.

 struct Velocity2D
 {
   double vx; // translational speed (forward) [m/s]
   double vy; // translational speed (lateral) [m/s]; 0 for a differential-drive robot
   double va; // angular velocity [rad/s]
 };
 struct TimedVelocity2D
 {
   Time tm;
   Velocity2D data;
 };

For a differential-drive (opposed two-wheel) mobile robot, vy is always 0.0. For example, to move forward:

 va = 0.0; vx = 0.2; vy = 0.0;

To move backwards:

 va = 0.0; vx = -0.2; vy = 0.0;

To turn on the spot:

 va = 1.0; vx = 0.0; vy = 0.0;

timed_velocity_2d.png
Coordinate system of the mobile robot and TimedVelocity2D

If there is data at the InPort, read it and extract the bumper information. The bumper flags are stored in the .data array member of the TimedBooleanSeq type, and as the table above shows they are elements 0, 1, and 2. If any of these is true, a bumper has detected a collision, so the robot backs up once, turns, and then moves forward again. If these motions are set in the TimedVelocity2D members and written to the OutPort, the velocity command data is transmitted to Kobuki. The flow chart of the algorithm is shown below.

kobuki_auto.png
Flowchart

Ideally, the distance the robot backs up and the angle it turns would be controlled precisely, but for simplicity a sleep function can also be used. On Linux you can use coil::sleep:

 coil::sleep(coil::TimeValue(0.01)); // wait 10 ms

On Windows, coil::sleep has poor accuracy, so it is better to use the Sleep function. Based on these hints, create a control component that makes Kobuki move autonomously.
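
For reference, a minimal sketch of an onExecute() implementing the flow chart above is shown below. It assumes the class and port members from the earlier header sketch and the illustrative index constants named after the sensor table; the helper function sendVelocity, the speeds, and the sleep durations are arbitrary choices for illustration, not values from the Kobuki documentation.

 #include <coil/Time.h>  // coil::sleep, coil::TimeValue
 
 // Illustrative helper: fill the velocity variable bound to the OutPort and send it.
 static void sendVelocity(RTC::TimedVelocity2D& vel,
                          RTC::OutPort<RTC::TimedVelocity2D>& port,
                          double vx, double va)
 {
   vel.data.vx = vx;
   vel.data.vy = 0.0;   // always 0 for a differential-drive robot
   vel.data.va = va;
   port.write();        // write the bound variable to the OutPort
 }
 
 RTC::ReturnCode_t KobukiAutoMove::onExecute(RTC::UniqueId /*ec_id*/)
 {
   if (m_bumperIn.isNew())
   {
     m_bumperIn.read();   // the latest sensor flags are copied into m_bumper
     bool hit = false;
     if ((int)m_bumper.data.length() > LEFT_CLIFF)
     {
       for (int i = RIGHT_BUMPER; i <= LEFT_BUMPER; ++i)
         if (m_bumper.data[i]) { hit = true; }   // a bumper is pressed
       for (int i = RIGHT_CLIFF; i <= LEFT_CLIFF; ++i)
         if (m_bumper.data[i]) { hit = true; }   // a cliff is detected
     }
     if (hit)
     {
       sendVelocity(m_targetVelocity, m_targetVelocityOut, -0.1, 0.0); // back up a little
       coil::sleep(coil::TimeValue(1.0));
       sendVelocity(m_targetVelocity, m_targetVelocityOut, 0.0, 1.0);  // turn on the spot
       coil::sleep(coil::TimeValue(1.0));
     }
   }
   sendVelocity(m_targetVelocity, m_targetVelocityOut, 0.2, 0.0);      // go forward again
   return RTC::RTC_OK;
 }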

(G)Answer

As a reference answer, the modified TkJoyStick component and the autonomous movement component described above are provided below.

Download
