Control software (compatible with all ToyCollect robots)
2020/02/06
Installation of TCserver
TCserver controls the robot, LEDs and cameras and receives commands via Bluetooth or Wifi. For V1.21 (R2X) we distinguish between the MASTER Raspberry Pi, to which the motor controller is connected, and the SLAVE. For robot models with only one RPi, we conveniently define that one as MASTER.
The following three steps must be executed on both MASTER and SLAVE for the V1.21 (R2X) robot; for all other models they are executed only on the single RPi.
- The RPi should already be accessible via SSH. Please log in. We assume that Raspbian Lite is installed and Wifi is configured. Internet access directly at the RPi is necessary for the installation.
- Execute the command
sudo raspi-config
and change the following entries under Interfacing Options: P1 enable the camera; P5 enable I2C; P6 disable the login shell over serial, but enable the serial port hardware. A reboot will be necessary. Log in again afterwards.
- To install the necessary libraries, execute the following commands:
sudo bash
apt-get update
apt-get install git wiringpi pigpio libncurses-dev netcat-openbsd joystick
The following steps need only be executed on MASTER.
- Log in and execute the following commands:
cd /home/pi
git clone https://git.seewald.at/TCserver
cd TCserver
- The configuration of the robot is performed by editing C++ include files. Copy an appropriate sample include file to config.h: for V1.21 (R2X) use config_PINK_RPI2X_192.168.42.90_91.h, for V1.3 (K3D) or V1.1 use config_KULA3D.h. Open the file with a text-mode editor such as nano or vi.
- The given example include files are designed for the application of pre-trained deep learning models. If you want to control the robot using other remote controls instead, you must modify the new include file as follows (a sketch follows after this item):
- for V1.21 (R2X): comment out #define DL_STREAM using //
- for V1.3 (K3D): comment out #define RPI_3BPLUS and add #define USE_WLAN_SOCKET as well as #define OUTPUT_3D. This also deactivates the override of the Bluetooth controller.
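As a rough sketch, assuming these settings appear as single #define lines in the copied config.h (the rest of the file is omitted here), the relevant lines would look like this after the change:
    // V1.21 (R2X): streaming to the deep learning model disabled
    // #define DL_STREAM
    // V1.3 (K3D): RPI_3BPLUS commented out, the two defines below added
    // #define RPI_3BPLUS
    #define USE_WLAN_SOCKET
    #define OUTPUT_3D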
- Settings for the motor controller (MC_SERIALPORT, MC_BAUDRATE, MC_BAUDRATE_AUTO) should be correct. If you set the motor controller to a fixed baud rate of 38,400 baud, comment out MC_BAUDRATE_AUTO using //.
- It can happen that the motors are connected differently than in the original robot. The settings MC_M0_* and MC_M1_* are there to fix this. Execute make clean && make tests/motor_controller_test and then run tests/motor_controller_test. With the keys a, y, k and l you can test the motors. M0 should be the left motor in driving direction (LEDs = front, Logo = back); positive values should move it forward and negative values backward. M1 should be the right motor and behave in the same way. Should this not be the case, correct the corresponding values in config.h and start the test anew from the first command. For example, if the motors are swapped between left and right, exchange M0 and M1 in the #define rows. If one motor reacts inversely, exchange the command bytes for forward and backward for this motor only (a hypothetical sketch follows below).
- The calibration settings, mc_motor_calibration, should be set to { 1.0f, 1.0f, 1.0f, 1.0f }. With this setting you can correct different speeds between the motors, which can lead to systematic leftward or rightward drift during straight forward driving. In this case you can brake the faster motor by entering a value smaller than 1. Since motors may behave differently between forward and backward movement, the correction can be specified separately for both directions of movement (see the example below).
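For example, if the robot drifts to the left during straight driving because the right motor runs slightly faster, a correction might look like the following; the declaration form and the mapping of the four entries to motor and direction are assumptions here and should be checked against the comments in the sample include file:
    // brake the assumed faster right motor by 5% in both directions (entry order assumed)
    float mc_motor_calibration[4] = { 1.0f, 0.95f, 1.0f, 0.95f };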
- If you want to use a Bluetooth controller for direct control of the robot - which then also works without Wifi - you can enter its Bluetooth ID under BT_CONTROLLER_ID and its type under BT_TYPE (a configuration sketch follows at the end of this item). Currently StratusXL and DualShock4 controllers are supported. Any Bluetooth gaming controller compatible with Linux should work; new types can be added in TCserver.c.
To configure the Bluetooth controller, it must be paired with the MASTER. This is done with the following commands. First, set the Bluetooth controller to a mode where it can be paired again!
bluetoothctl
(bluetoothctl command shell)
agent on
default-agent
trust [bluetooth id]
pair [bluetooth id]
connect [bluetooth id]
exit
(command shell)
modprobe xpad
After this, /dev/input/js0 should exist and be testable via jstest --event /dev/input/js0.
If you do not want to use a Bluetooth controller, just comment out the line #define BT_CONTROLLER.
NOTE: Currently, the Bluetooth controller is the only way to override the robot while it executes a pretrained deep learning model.
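A sketch of what the corresponding config.h entries might look like; BT_CONTROLLER, BT_CONTROLLER_ID and BT_TYPE are the names used above, but the value formats (placeholder MAC address, type constant) are assumptions and should be checked against the sample include files and TCserver.c:
    #define BT_CONTROLLER                          // comment out to disable the Bluetooth controller
    #define BT_CONTROLLER_ID "AA:BB:CC:DD:EE:FF"   // Bluetooth ID of the paired controller (format assumed)
    #define BT_TYPE DUALSHOCK4                     // controller type; constant name assumed, see TCserver.c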
- With make tests/led_test && tests/led_test you can test the attached LEDs. Using the keys + and -, the brightness can be increased or reduced. This test should also be performed on the SLAVE. The program led_test can be copied to the SLAVE via scp tests/led_test pi@[SLAVE IP address]:.
- With the V1.21 (R2X) robot you can additionally check the synchronisation between MASTER and SLAVE using make tests/sync_test. For this to work, the program must be started on both RPis in parallel; it is needed only for troubleshooting unstable synchronisation between MASTER and SLAVE. sync_test takes MASTER or SLAVE as its first parameter and must be started on one RPi with MASTER and on the other with SLAVE. The second parameter is a time delay in milliseconds which determines how often the synchronised test value changes. Meaningful values for the second parameter are 10, 100 and 1000.
- Once everything is configured, execute make TCserver and copy the executable file TCserver:
- to /usr/local/bin
- only for V1.21 (R2X): also to /usr/local/bin on the SLAVE. Copy it there via scp TCserver pi@[SLAVE IP address]:. , then log in to the SLAVE and execute sudo cp TCserver /usr/local/bin
Also add /usr/local/bin/TCserver >> /home/pi/LOG.TCserver 2>&1 & to /etc/rc.local just before the last line (which contains exit 0). For V1.21 (R2X) this must be done on both MASTER and SLAVE. After this change, TCserver will be started automatically at each boot and a logfile will be created.
- If you now restart the robot, the LED(s) should light up briefly after 1-2 minutes. This indicates that TCserver is now running and ready to receive commands. From this point on the robot can be controlled via Bluetooth controller (taking into account the override) or any other remote control software. This is also the prerequisite for the execution of pre-trained deep learning models with TCcontrol, which is described next.
Installation of TCcontrol
TCcontrol receives real-time stereo images either from the local robot or via Wifi and uses a pre-trained Tensorflow Lite model to calculate the corresponding control commands, which are then sent back to the robot via local socket or Wifi.
The pretrained deep learning models require video output in uncompressed format and low resolution for efficiency reasons. However, most other remote controls require video output in compressed format (H.264) and high resolution for quality reasons. Therefore, at present, the use of pretrained deep learning models can only be combined with the Bluetooth controller remote control.
- If you are assembling a V1.3 (K3D) robot, carry out the following steps directly on the RPi 3B+.
- If you assemble a V1.21 (R2X) robot, you need another RPi as a server (at least a 3B+, better a 4). We recommend connecting it directly to the Wifi router with an Ethernet cable so that the full Wifi bandwidth remains available for the transmission of the robot video.
- The RPi should already be accessible via SSH. Please log in. We assume that Raspbian Lite is installed and Wifi is configured. Internet access directly at the RPi is necessary for the installation.
- First we install Tensorflow Lite. Since the installation is relatively complex and requires a lot of time and disk space, we use an already compiled version.
sudo bash
cd /usr/local/src
wget https://seewald.at/files/tf1.7_rpi_lite.tar.gz
tar -xvzf tf1.7_rpi_lite.tar.gz
rm -f tf1.7_rpi_lite.tar.gz
If you are using a Tensorflow Lite version other than 1.7, please check whether float *probs in tc_eval.cc line #156 has to be changed. We believe this is a bug in Tensorflow Lite 1.7 which might also exist in newer versions.
- To install the necessary libraries, execute the following commands:
sudo bash
apt-get update
apt-get install git wiringpi pigpio libncurses-dev netcat-openbsd
apt-get install libopencv-core-dev libopencv-highgui-dev libopencv-imgcodecs-dev libopencv-imgproc-dev
exit
cd /home/pi
git clone https://git.seewald.at/TCcontrol
cd TCcontrol
The necessary programs to execute pretrained deep learning models - as well as the models themselves - are now in /home/pi/TCcontrol.
- For configuration, the #define entries in the source code must be set differently depending on the robot (a sketch of both variants follows after this list).
- For V1.21 (R2X): Open merge.cc and enter the corresponding static IP addresses TC_SERVER_IP_LEFT and TC_SERVER_IP_RIGHT of the left and right RPi Zero W on lines #48 and #49. Left and right are to be understood in forward driving direction. TEST_LOCAL must not be defined; if it is, please comment it out. Save the changes and then open tc_test.cc. Set TC_SERVER_IP in line #92 to the IP address of the MASTER (i.e. the RPi to which the motor controller is connected). TC_TEST_LOCAL must not be defined; if it is, please comment it out. Save the changes again.
- For V1.3 (K3D): Open tc_test.cc. TC_TEST_LOCAL and TC_TEST_LOCAL__KULA3D must be defined; TC_TEST_LOCAL__CM3 must not be defined. Please change the file accordingly and save the changes.
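A minimal sketch of the resulting settings, assuming they are plain #define lines at the referenced positions; the IP addresses below are examples taken from the sample include file name, and the assignment of MASTER to a particular address is an assumption, so replace them with your own static IPs:
    // merge.cc, V1.21 (R2X) only: left and right RPi Zero W (example addresses)
    #define TC_SERVER_IP_LEFT  "192.168.42.90"
    #define TC_SERVER_IP_RIGHT "192.168.42.91"
    // #define TEST_LOCAL                       // must remain commented out
    // tc_test.cc, V1.21 (R2X): IP address of the MASTER (example address)
    #define TC_SERVER_IP "192.168.42.90"
    // #define TC_TEST_LOCAL                    // must remain commented out for R2X
    // tc_test.cc, V1.3 (K3D)
    #define TC_TEST_LOCAL
    #define TC_TEST_LOCAL__KULA3D
    // #define TC_TEST_LOCAL__CM3               // must not be defined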
Afterwards compile TCcontrol:
cd /home/pi/TCcontrol
make
- The final installation is slightly different depending on the robot type.
- V1.3 (K3D): Automatic start shortly after TCserver. Please add the following lines to /etc/rc.local (before exit 0 and after /usr/local/bin/TCserver ...):
sleep 5
cd /home/pi/TCcontrol
./runTcTest.sh K3D.tflite &
- V1.21 (R2X): Manual start. After the LEDs blink on the robot, execute the following commands on the server:
cd /home/pi/TCcontrol
./runTcTest.sh R2X.tflite &
The robot should then start moving within about 30-60 s while status messages are displayed. Alternatively, you can also add ./runTcTest.sh R2X.tflite & to /etc/rc.local before exit 0 on the server, as above. However, this method has not been sufficiently tested.
Concerning the Bluetooth controller: By default, the deep learning model controls the robot. Commands from the controller are only transmitted while one of the override buttons is held down (defined in TCserver.c by BT_BUTTON_LED_MINUS and BT_BUTTON_LED_PLUS; for the two supported controllers these are the L1 and R1 buttons on the front of the controller). If no deep learning model is used, this override function is disabled and commands are always transmitted (this is set in TCserver). The left joystick is for throttle, the right one for direction. Using the buttons X or □ and B or ○, the robot can turn left or right while stationary. A hypothetical sketch of the override logic follows below.
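The following is only a hypothetical sketch of this override logic, not the actual code in TCserver.c; all helper functions and the constants' values are invented, and only the names BT_BUTTON_LED_MINUS and BT_BUTTON_LED_PLUS come from the source:
    // Hypothetical sketch: the deep learning model drives unless an override button is held.
    enum { BT_BUTTON_LED_MINUS = 4, BT_BUTTON_LED_PLUS = 5 };    // values invented
    bool button_held(int /*button*/)      { return false; }      // stub: true while the button is pressed
    int  read_joystick_command()          { return 0; }          // stub: current Bluetooth joystick command
    int  read_model_command()             { return 0; }          // stub: command from the deep learning model
    void forward_command(int /*command*/) {}                     // stub: send a command to the motors
    void control_step() {
        if (button_held(BT_BUTTON_LED_MINUS) || button_held(BT_BUTTON_LED_PLUS))
            forward_command(read_joystick_command());            // manual override while a button is held
        else
            forward_command(read_model_command());               // deep learning model stays in control
    }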
Enjoy! If something does not work as it should, contact us.