Robobot architecture
Software block overview
Figure 1. The main building blocks.
Software building blocks
Figure 2. The main software building blocks.
Mission app
The mission app block is the 'brain' of the robot.
The mission app is a Python app with a few essential capabilities.
It can subscribe to several messages from the Teensy interface and command the Teensy board to perform actions (like changing the motor voltage).
It attaches to the camera video stream and connects to the IO board (e.g., the start and stop buttons).
The mission app is in svn/robobot/mqtt_python; its main file is mqtt-client.py.
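A mission app of this kind can be sketched with the paho-mqtt Python package. This is a minimal illustration only: the topic names and the 'motv' command format are assumptions, not the actual Teensy interface protocol.

```python
# Sketch of a mission-app MQTT client. Topic names and the 'motv'
# command format are illustrative assumptions.

def make_motor_command(left_volt, right_volt):
    """Format a motor-voltage command as a clear-text line (format assumed)."""
    return f"motv {left_volt:.2f} {right_volt:.2f}"

def main():
    # paho-mqtt is imported here so the helper above works without it installed
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        # react to sensor messages relayed by the Teensy interface
        print(msg.topic, msg.payload.decode())

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("localhost", 1883)      # Mosquitto runs locally on the robot
    client.subscribe("robobot/#")          # hypothetical topic tree
    client.publish("robobot/cmd/motv", make_motor_command(3.0, 3.0))
    client.loop_forever()
```

The real mqtt-client.py will differ in topics and message layout; the point is the subscribe/publish pattern.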
Teensy interface
The Teensy interface bridges the Teensy USB (serial) connection and the MQTT protocol.
The interface can create log files for most data types. The configuration file robot.ini in svn/robobot/teensy_interface/build controls this.
The Teensy interface is started from the on_reboot.bash script in the home directory (/home/local).
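The core of such a bridge is mapping each clear-text line from the Teensy to an MQTT topic and payload. A hedged sketch (the real interface is a compiled program, and the topic layout and message keywords here are invented for illustration):

```python
# Sketch of the serial-to-MQTT bridging idea. The topic prefix and
# message keywords (e.g. 'enc') are assumptions, not the real protocol.

def line_to_topic(line, prefix="robobot/teensy"):
    """Map a clear-text line like 'enc 1234 1240' to (topic, payload)."""
    parts = line.strip().split(maxsplit=1)
    if not parts:
        return None
    keyword = parts[0]
    payload = parts[1] if len(parts) > 1 else ""
    return f"{prefix}/{keyword}", payload

def bridge(serial_port, mqtt_client):
    """Forward lines from the Teensy to MQTT (sketch only, not run here)."""
    for raw in serial_port:            # a pyserial port iterates over lines
        mapped = line_to_topic(raw.decode(errors="ignore"))
        if mapped:
            topic, payload = mapped
            mqtt_client.publish(topic, payload)
```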
Mosquitto MQTT server
Mosquitto is an open-source MQTT broker - see https://mosquitto.org/. It runs as a service on the robot.
Camera streamer
The camera streamer is a small Python app that takes data from the camera and streams the video to socket port 7123.
This means the stream is also available over Wi-Fi.
The camera is configured in the Python file svn/robobot/stream_server/stream_server.py.
The streamer is started from the on_reboot.bash script in the home directory (/home/local).
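A client can read the stream from the socket. Assuming the stream is a raw MJPEG byte stream (an assumption; check stream_server.py for the actual format), the main task is splitting the byte stream into complete JPEG frames:

```python
# Sketch of a stream client for port 7123, assuming raw MJPEG data.
import socket

SOI = b"\xff\xd8"   # JPEG start-of-image marker
EOI = b"\xff\xd9"   # JPEG end-of-image marker

def extract_frames(buffer):
    """Split complete JPEG frames out of a byte buffer.
    Returns (list of frames, unconsumed remainder)."""
    frames = []
    while True:
        start = buffer.find(SOI)
        if start < 0:
            return frames, b""
        end = buffer.find(EOI, start + 2)
        if end < 0:
            return frames, buffer[start:]   # incomplete frame, keep it
        frames.append(buffer[start:end + 2])
        buffer = buffer[end + 2:]

def stream_frames(host, port=7123):
    """Connect to the streamer and yield JPEG frames (untested sketch)."""
    with socket.create_connection((host, port)) as sock:
        buf = b""
        while True:
            data = sock.recv(4096)
            if not data:
                return
            buf += data
            frames, buf = extract_frames(buf)
            yield from frames
```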
IP-disp
IP_disp is a background app, started at reboot by on_reboot.bash, with two tasks:
- Detect the IP address of the Raspberry Pi and send it to the small display on the Teensy board.
- Detect if the "start" button is pressed, and start the mission app.
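The first task can be sketched in a few lines. The common trick for finding the outgoing IP address is a UDP "connect", which selects a route without sending any packet; the display message format below is purely hypothetical.

```python
# Sketch of IP_disp's first task: find the Pi's own IP address and
# format it for the Teensy display (message format is an assumption).
import socket

def local_ip():
    """Return the outgoing IP address; a UDP connect only picks a route,
    no packet is sent."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))   # any public address works here
        return s.getsockname()[0]
    finally:
        s.close()

def display_line(ip):
    """Format the IP as a clear-text line for the display (hypothetical)."""
    return f"disp ip {ip}"
```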
Teensy PCB
The Teensy board is the base board also used in the simpler 'Regbot' robot. This board holds most of the hardware interfaces and offers all sensor data for streaming through a publish-subscribe protocol. All communication is based on clear-text lines.
The firmware is in svn/robobot/teensy_firmware_8; it is compiled with ./compile and uploaded to the Teensy with ./upload. NB! An upload cuts the battery power, so keep the power switch pressed during the upload.
See details in Robobot circuits.
Software architecture
The software architecture is based on the classic NASREM architecture, which also structures the description on this page.
(The National Aeronautics and Space Administration (NASA) and the US National Institute of Standards and Technology (NIST) have developed a Standard Reference Model Telerobot Control System Architecture called NASREM. Albus, J. S. (1992), A reference model architecture for intelligent systems design.)
Figure 3. The NASREM model divides the control software into a two-dimensional structure. The columns are software functions: Sensor data processing, modelling, and behaviour control.
The rows describe abstraction levels:
- Level 1 handles the primary control of the wheels for forward velocity and turn rate. This level also maintains the robot pose (position, orientation, and velocity), based on wheel odometry.
- Level 2 is drive select, where the robot can be driven by odometry alone (forward velocity and turn rate) or follow a line using the line sensor. This level also includes other sensor detections, such as crossing lines and distance measurements.
- Level 3 is where the overall behaviour is decided and includes camera sensor object detections like navigation codes and other objects.
Level 1; Pose and drive control
Figure 4. The lowest level in the control software. The encoder ticks are received from the hardware (the Teensy board) by the sensor interface. The encoder values are then modeled into an odometry pose. The pose is used to control the wheel velocities with a PID controller. The desired velocity for each wheel is generated in the mixer from a desired linear and rotational velocity. The heading control translates the rotational velocity into a desired heading and implements it with a PID controller.
See Robobot level 1 for details of the individual blocks.
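Two of the level-1 blocks can be condensed into a few lines: the odometry pose update from encoder ticks and the mixer. This is a sketch with made-up wheel geometry values, not the actual implementation:

```python
# Sketch of level-1 blocks: odometry pose update and the velocity mixer.
# WHEEL_BASE and METERS_PER_TICK are example values, not the robot's.
import math

WHEEL_BASE = 0.22         # distance between the wheels [m]
METERS_PER_TICK = 0.0002  # wheel travel per encoder tick [m]

def odometry_step(pose, ticks_left, ticks_right):
    """Update (x, y, heading) from one pair of encoder tick counts."""
    x, y, h = pose
    dl = ticks_left * METERS_PER_TICK
    dr = ticks_right * METERS_PER_TICK
    dist = (dl + dr) / 2.0
    dh = (dr - dl) / WHEEL_BASE
    # advance along the average heading over the step
    x += dist * math.cos(h + dh / 2.0)
    y += dist * math.sin(h + dh / 2.0)
    return (x, y, h + dh)

def mixer(linear, rot_rate):
    """Convert a desired forward velocity [m/s] and turn rate [rad/s]
    into (left, right) wheel velocities."""
    left = linear - rot_rate * WHEEL_BASE / 2.0
    right = linear + rot_rate * WHEEL_BASE / 2.0
    return left, right
```

Equal tick counts drive the robot straight ahead; a tick difference rotates the pose, and a positive turn rate in the mixer speeds up the right wheel.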
Level 2; drive select
Figure 5. At level 2, further sensor data is received, modeled, and used as optional control sources.
See Robobot level 2 for details of the individual blocks.
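The drive-select idea can be sketched as choosing the source of the (velocity, turn rate) command sent to level 1. The gain, speeds, and the line-sensor convention (a line position in [-1, 1] across the sensor) are assumptions for illustration:

```python
# Sketch of level-2 drive select: odometry drive or line following.
# Gains, speeds, and the line-position convention are assumptions.

def line_follow_cmd(line_pos, forward=0.25, gain=2.0):
    """Steer toward the line: a positive line position (line toward the
    left, assumed) gives a positive turn rate."""
    return forward, gain * line_pos

def drive_select(mode, line_pos=0.0, odo_cmd=(0.25, 0.0)):
    """Pick the active control source for level 1:
    (forward velocity [m/s], turn rate [rad/s])."""
    if mode == "line":
        return line_follow_cmd(line_pos)
    return odo_cmd   # plain odometry drive
```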
Level 3; behaviour
Figure 6. At level 3, the drive types are used to implement more abstract behaviour, e.g. follow the tape line to the axe challenge, detect the situation where the axe is out of the way, and then continue the mission.
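The axe-challenge behaviour in the figure caption can be sketched as a small state machine that drives the lower-level modes. The state names, the clearance threshold, and the sensor inputs are illustrative assumptions:

```python
# Sketch of a level-3 behaviour for the axe challenge as a state machine.
# State names, threshold, and inputs are assumptions for illustration.

AXE_CLEAR_DIST = 0.5   # meters; assumed "axe is out of the way" threshold

def axe_step(state, at_axe, front_dist):
    """Advance the behaviour one step.
    Returns (new_state, drive mode for level 2)."""
    if state == "follow_line":
        # follow the tape line until the axe position is reached
        return ("wait_for_axe", "stop") if at_axe else ("follow_line", "line")
    if state == "wait_for_axe":
        # wait until the distance sensor says the axe has swung clear
        if front_dist > AXE_CLEAR_DIST:
            return "continue", "line"
        return "wait_for_axe", "stop"
    return state, "line"   # 'continue': keep following the line
```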