
Range Sensing: Sonar

Welcome to this tutorial series in robotics powered by the BOW SDK. This tutorial covers range sensing using sonar sensors.

Recommended Robots

This tutorial requires a robot with a forward-facing sonar/ultrasound sensor, therefore the recommended robots for this tutorial are:

  • DEEP Robotics - Lite 3
  • Softbank Robotics - Nao
  • Consequential Robotics - Miro-E

Prerequisites

The key libraries are:

  • OpenCV - a library of programming functions mainly for real-time computer vision

Before trying these tutorials make sure you have followed the instructions from the dependencies step to set up the development environment for your chosen programming language.

These tutorials also assume you have installed the BOW Hub, available for download from https://bow.software, and that you have subscribed with a Standard Subscription (or above) or are using the 30 day free trial, which is required to simulate robots.

This tutorial builds on the Step 1 - Vision tutorial. Although vision is not directly needed for this tutorial, it provides a structure for the code and another way of viewing the robot's motion.

Range Sensors - Sonar

In this tutorial we will tackle sonar or ultrasound sensors. These sensors are very common in robotics and provide a measurement of the distance between the sensor and the nearest object in its field of view. This is achieved by emitting a high frequency sound and measuring the time taken for the echo to be received.
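
As a quick illustration of the time-of-flight principle, the range can be recovered from the echo time by halving the round-trip distance. This is a hypothetical helper for illustration only, not part of the BOW SDK:

```python
# Approximate speed of sound in air at ~20 degrees C, in metres per second
SPEED_OF_SOUND = 343.0

def echo_time_to_range(echo_time_s: float) -> float:
    """Convert a round-trip echo time into a one-way range in metres.

    The pulse travels to the object and back, so the one-way
    distance is half of the total path covered by the sound.
    """
    return SPEED_OF_SOUND * echo_time_s / 2.0

# A 10 ms round trip corresponds to roughly 1.7 m
print(round(echo_time_to_range(0.010), 3))
```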

Sonar sensors help the robot understand the world around it and therefore sit under the exteroception modality. Within exteroception samples there is a "Range" field which contains data from all range-measuring sensors, including sonar sensors.

A common use for sonar sensors is to prevent mobile robots from colliding with obstacles, and this is the application we will investigate in this tutorial. We will create a very basic controller which moves the robot forward until it gets close to an object, at which point it will turn left until the object is no longer measured as an obstruction and continue forward again.

To achieve this we will have the following:

Sense

  • Connect to a robot by calling QuickConnect
  • Get all exteroception data from the robot by calling GetModality("exteroception")
  • Get the data from the forward facing sonar sensor and find the range to the nearest obstacle

Decide

  • Decide how to move based on the measured range
  • If there are no obstacles or distant obstacles detected, plan to move forward
  • If there are obstacles getting close to the sensor, plan to turn
  • If there are obstacles very close to the sensor, or invalid measurements from the sensor (under minimum range), plan to both move backwards and turn
  • Populate a motor message based on these decisions

Act

  • Send locomotion decision to robot by calling SetModality("motor")
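
The decide step above can be sketched as a pure function mapping a sonar reading to a velocity command. The thresholds and velocity values here are illustrative choices, not SDK-mandated behaviour:

```python
def decide(reading: float, sensor_min: float) -> tuple[float, float]:
    """Map a sonar reading (metres) to (forward velocity, turn velocity).

    reading == 0  means no echo was received (nothing in range),
    reading == -1 means the measurement was invalid (under minimum range).
    """
    if reading == 0:
        return (0.2, 0.0)    # clear: move forward
    if reading == -1 or reading < sensor_min + 0.5:
        return (-0.2, 0.5)   # invalid or very close: reverse and turn
    if reading < sensor_min + 1.5:
        return (0.0, 0.5)    # approaching: turn away
    return (0.2, 0.0)        # distant obstacle: keep moving forward

print(decide(0, 0.2))    # clear path
print(decide(0.5, 0.2))  # close obstacle
```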

Running the Tutorial

To run the tutorial, navigate to the Exteroception/Range-Sensors_Sonar/Python folder within the SDK Tutorials repository that you cloned earlier.

cd SDK-Tutorials/Exteroception/Range-Sensors_Sonar/Python

Execute the example program:

python main.py

Interacting with the Tutorial

Behaviour

Once the tutorial is running you will be able to see the robot's vision modality (as set up in Step 1). You will immediately see the robot begin to move forward. As it approaches an obstacle or wall it will begin to turn to the left. If the robot gets very close to an obstacle, it will reverse. The resulting behaviour should be that your robot moves around the environment without colliding with walls or obstacles.

Obstacle Avoidance

Please note this is a very primitive form of obstacle avoidance which should not be heavily relied upon. This example is only meant to act as a teaching tool for accessing sonar data using BOW.

Sonar

Sonar sensors are not infallible: some soft or cushioned objects may not reflect the sound and therefore go undetected. Sonars will also not detect any object outside their "cone" of emission. Objects approached at a sufficiently shallow angle may also go undetected, as the sound will not reflect back to the receiver. This behaviour is also approximated in the simulator, so the robot may not avoid walls or obstacles which it approaches at a shallow angle (22.5 degrees). A description of this behaviour can be found in the Webots documentation.

Stopping

To cancel the tutorial program's execution you can press Ctrl + C in the running terminal.

Code Breakdown

The following breakdown covers only the parts of the code that are relevant to range sensing, as the other elements, such as vision and the connection procedure, have been covered in earlier tutorials.

First, it is necessary to ensure that all the modalities we require are included in our connection to the robot, in this case that means adding "exteroception" and "motor" to the list.

myrobot, error = bow.quick_connect(pylog=log, modalities=["vision", "motor", "exteroception"])

In order to ensure that we are accessing the correct sensor, in this case the forward-most one, we first need to obtain a list of all the sensors the robot has. To obtain this list we call get_modality on the exteroception channel. Immediately after connecting this may return an empty or invalid response, so we loop until a valid sample is received.

ext_sample, err = myrobot.get_modality("exteroception", True)
while not err.Success:
    ext_sample, err = myrobot.get_modality("exteroception", True)
    time.sleep(0.1)

Next we pass the Range field of the received exteroception sample to the identify_front_sonar function. The Range field contains all of the range sensor data from the robot.

front_sonar = identify_front_sonar(ext_sample.Range)

This function first iterates through the range sensors and extracts all of the ultrasound (sonar) sensors by evaluating the "OperationType" field, which for convenience can be done with the OperationTypeEnum.

def identify_front_sonar(range_sensors):
    sonars = []
    for sensor in range_sensors:
        if sensor.OperationType == bow_utils.Range.OperationTypeEnum.Ultrasound:
            sonars.append(sensor)

It then iterates through these sonar sensors and evaluates their position in the X dimension. The coordinate system of all robots is defined such that the X axis is aligned with the robot's forward direction, so the higher the Position.X value, the further forward the sensor is placed.

front_sonar_x_pos = -100.0
front_sonar_name = None
for sonar in sonars:
    if sonar.Transform.Position.X > front_sonar_x_pos:
        front_sonar_x_pos = sonar.Transform.Position.X
        front_sonar_name = sonar.Source
 
# Return the name of the forward most sonar sensor
return front_sonar_name
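
The selection logic can be exercised in isolation with mock sensor objects. These are hypothetical stand-ins for the SDK's range-sensor messages, keeping only the fields the selection relies on:

```python
from dataclasses import dataclass

@dataclass
class MockSonar:
    """Hypothetical stand-in exposing only the fields used for selection."""
    Source: str
    x: float  # forward offset, analogous to Transform.Position.X

def pick_front(sonars):
    # The sensor with the largest forward (X) offset is the front-most one
    return max(sonars, key=lambda s: s.x).Source

sensors = [MockSonar("rear_sonar", -0.15), MockSonar("front_sonar", 0.25)]
print(pick_front(sensors))  # front_sonar
```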

Inside the main loop we simply need to get a new exteroception sample and locate our desired sonar sensor within it.

ext_sample, err = myrobot.get_modality("exteroception", True)
if not err.Success or ext_sample is None:
    continue
 
# Iterate through range sensors until front sensor
sonar = None
for range_sensor in ext_sample.Range:
    if range_sensor.Source == front_sonar:
        sonar = range_sensor
        break

We then create a motor_sample ready to hold our desired actions. Based on the sonar sensor reading we then populate this sample's locomotion field, specifically:

  • If the reading is 0 -> no obstruction has been found (i.e. no ultrasonic reflection) -> clear to move forwards -> positive X translational velocity
  • If the reading is -1 -> invalid sonar data (i.e. reading under the sonar's minimum range) -> assume obstruction and turn to avoid -> non-zero Z rotational velocity
  • If the reading is valid but still distant from the sensor -> clear to move forwards -> positive X translational velocity
  • If the reading is nearing the sensor minimum -> perform avoidance by turning -> non-zero Z rotational velocity
  • If the reading is too close to the sensor minimum -> perform avoidance by reversing -> negative X translational velocity

# Create a motor message to populate
motor_command = bow_utils.MotorSample()
 
# Base the velocity command on the sonar reading
if sonar.Data == -1:
    print("Invalid Sonar Data: ", sonar.Data, " meters")
    motor_command.Locomotion.RotationalVelocity.Z = 0.5
 
elif sonar.Data == 0:
    print("No obstruction in range: ", sonar.Data, " meters")
    motor_command.Locomotion.TranslationalVelocity.X = 0.2
 
elif sonar.Min + 0.5 < sonar.Data < sonar.Min + 1.5:
    print("Obstruction approaching sensor minimum: ", sonar.Data, " meters")
    motor_command.Locomotion.RotationalVelocity.Z = 0.5
 
elif sonar.Data < sonar.Min + 0.5:
    print("Obstruction too close to maneuver, reverse: ", sonar.Data, " meters")
    motor_command.Locomotion.TranslationalVelocity.X = -0.2
 
else:
    print("Obstruction detected at safe range", sonar.Data, " meters")
    motor_command.Locomotion.TranslationalVelocity.X = 0.2

Finally, we send this motor command to the robot to be executed:

myrobot.set_modality("motor", motor_command)
