Streaming Data
Connect to your robot and sample from its sensors.
By the end of this tutorial you will be able to:
- Connect to a wide range of robots and sample their sensor data
- Explain the importance of a universal robot description for developing hardware-agnostic applications
- Visualise live camera data in your first BOW-enabled robotics application
One of the most important services provided by the BOW SDK is the ability to communicate with a robot’s components – its sensors and actuators – without needing to know the details of the underlying hardware.
In this tutorial we’ll be using the QuickConnect, CloseConnect and <channel>.Get commands to stream images from a robot’s camera (<channel>.Set will be covered in the next tutorial).
Take a look at the following to see what we're aiming for:
About the Application
The application you will develop here lets users visualise live camera data from their robot. Its structure is as follows:
Before we get stuck in:
If you are just browsing to get a sense of what's possible, take a look at the code online.
Running the Application
Navigate to the `GettingStarted/StreamingData/Python` folder in the SDK Tutorials repository:
Execute the example program:
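For example, from a clone of the repository (the repository folder and script name below are placeholders; substitute the actual Python file you find in that folder):

```bash
# Placeholder folder and script names; use the actual file in the tutorial folder.
cd SDK-Tutorials/GettingStarted/StreamingData/Python
python stream_camera.py
```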
Investigation
Once the application is running you will see the robot's vision modality displayed on your screen in a CV2 window. This gives a live view of the (simulated) visual scene captured by your robot, which you can verify by interacting with the robot inside Webots and watching the output change.
Let's take a closer look at the key parts of the code:
The following connects to a robot (where one is available) and establishes the vision channel:
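A minimal sketch of this step, assuming the Python bindings expose QuickConnect through a module named `bow_api` and return a robot handle plus an error object with a `Success` flag; check the tutorial source for the exact import and signatures:

```python
import bow_api  # assumed module name for the BOW Python bindings

# QuickConnect finds an available robot, connects to it and opens the
# requested channels in a single call; we only need the vision channel here.
robot, error = bow_api.quick_connect(app_name="StreamingData",
                                     channels=["vision"])

# Assumed error structure: a Success flag plus a human-readable description.
if not error.Success:
    print("Could not connect to a robot:", error.Description)
    raise SystemExit(1)
```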
The following begins a sampling loop on the vision channel:
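Sketched below under the same assumptions; `robot.vision.get` stands in for `<channel>.Get`, and its blocking argument and `(samples, error)` return shape are guesses at the Python spelling:

```python
# Sampling loop: <channel>.Get returns the latest images captured by the
# robot's cameras (one sample per camera).
while True:
    image_samples, err = robot.vision.get(True)  # assumed blocking-get signature
    if not err.Success or not image_samples:
        continue  # no new frame yet, try again

    sample = image_samples[0]  # first camera on the robot
    # ...display step shown in the next snippet...
```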
The following uses the opencv module to display the image data:
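A sketch of the display step, written as a helper to call from inside the sampling loop above. The `cv2` and `numpy` calls are standard; the `.data` and `.shape` fields used to turn the sample into an image are assumptions about the sample type, and the tutorial source shows the real conversion:

```python
import cv2
import numpy as np

def show_frame(sample) -> bool:
    """Display one vision sample; return False once the user presses 'q'.

    Assumed sample fields: a flat pixel buffer (.data) and a
    (height, width, channels) shape (.shape).
    """
    frame = np.frombuffer(sample.data, dtype=np.uint8).reshape(sample.shape)
    cv2.imshow("Robot Vision", frame)           # live view of the camera feed
    return (cv2.waitKey(1) & 0xFF) != ord('q')  # keep the window responsive
```

Called from inside the sampling loop, a `False` return is the cue to break out, call `cv2.destroyAllWindows()`, and shut the connection down with CloseConnect (the SDK's counterpart to QuickConnect).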