BOW Data Message

Protocol Documentation

AudioSample contains a single packet of audio data from a single source.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Source | string | | Source indicates the origin of the audio data, such as a specific microphone. |
| Data | bytes | | Data contains the raw or compressed audio data. |
| Channels | uint32 | | Channels specifies the number of audio channels. |
| SampleRate | uint64 | | SampleRate indicates the number of samples per second. |
| NumSamples | uint64 | | NumSamples provides the total number of audio samples in the Data field. |
| Compression | AudioSample.CompressionFormatEnum | | Compression specifies the audio compression format used. |
| Transform | Transform | | Transform provides the spatial location of the audio source, if applicable. |
| Designation | StereoDesignationEnum | | Designation specifies whether the audio is intended for a specific channel in a stereo or multi-channel setup. |
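As a sketch of how the sampling fields relate, the playback duration of one packet follows from NumSamples and SampleRate. The helper name is illustrative, and it assumes NumSamples counts frames shared across all channels:

```python
def audio_packet_duration_ms(num_samples: int, sample_rate: int) -> float:
    """Duration in milliseconds of one AudioSample packet.

    Assumption: NumSamples is the per-channel frame count, so the
    duration is simply NumSamples / SampleRate.
    """
    if sample_rate == 0:
        raise ValueError("SampleRate must be non-zero")
    return 1000.0 * num_samples / sample_rate
```

For example, a 16 kHz packet carrying 16,000 samples spans one second of audio.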


AudioSamples contains multiple instances of audio data.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Samples | AudioSample | repeated | Samples is an array of AudioSample messages, each containing a sample of audio data from a different source. |


BlobSample provides a flexible data type that can be used to define your own messages in a custom communication channel.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| BytesArray | bytes | repeated | BytesArray holds an array of bytes. |
| IntArray | int64 | repeated | IntArray stores an array of integers. |
| FloatArray | float | repeated | FloatArray contains an array of floating-point numbers. |
| String | string | | String stores a text string. |
| Transform | Transform | | Transform specifies a spatial orientation and position in 3D space. |


EndEffector defines an update to be sent to an articulated assembly, such as an arm or a leg.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Name | string | | Optional: Name of the end effector that this update is for. When not specified, the position within the array of EndEffectors determines which command goes to which end effector. |
| Transform | Transform | | Optional: Transform specifies a target spatial orientation and position for the tip of the end effector relative to a provided reference frame. The joint angles required to reach the specified transform are calculated using built-in inverse kinematics. This field is only used when ControlMode is set to IK. |
| Enabled | bool | | When Enabled is set to false, the update is not applied to the end effector. |
| Angles | float | repeated | Angles contains the angular positions of each joint in the articulated assembly, ordered according to the robot's specification. This enables manual control of the robot's joints if needed, but should be avoided under normal circumstances because the resulting code will not be robot agnostic in behaviour. WARNING: NOT ROBOT AGNOSTIC |
| EndEffectorTorque | float | repeated | Optional: EndEffectorTorque lists the torque to be applied at each joint of the articulated assembly as part of the update. This enables manual control of the force applied by the robot, but should be avoided under normal circumstances because the resulting code will not be robot agnostic in behaviour. It is honoured when possible, but not every robot allows this level of control. WARNING: NOT ROBOT AGNOSTIC |
| AngleUnits | AngleTypeEnum | | Optional: AngleUnits specifies the unit used for the joint angles if they are provided manually. |
| Gripper | Gripper | | Optional: Gripper specifies a command to be sent to a gripping mechanism connected to the tip of the end effector, if present. |
| MovementDurationMs | float | | Optional: MovementDurationMs defines the time allowed, in milliseconds, for the robot to move to its target transform or target joint angles. Ignored when the robot does not support timing information on movements. |
| ControlMode | MotionGeneration | | ControlMode specifies how the robot will be controlled. IK uses the Transform field together with inverse kinematics to calculate the required joint angles for a movement, overriding any information in Angles and AngleUnits. CUSTOM uses the information in Angles and AngleUnits directly (WARNING: NOT ROBOT AGNOSTIC). Other control modes add functionality on top of inverse kinematics, such as self-collision avoidance (experimental). |


Float32Array is a container for an array of floating-point numbers.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Data | float | repeated | Data is an array of 32-bit floating-point numbers. |


GPS captures geographic positioning data.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Latitude | float | | Latitude in decimal degrees. |
| Longitude | float | | Longitude in decimal degrees. |
| Elevation | float | | Elevation in metres above sea level. |
| Time | string | | Timestamp of the GPS sample. |


Gripper outlines the configuration of a hand-like gripper in terms of each finger's position.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Thumb | float | | Thumb specifies the % closed for the thumb [0-1] or the distance of the thumb tip to the palm. |
| Index | float | | Index specifies the % closed for the index finger [0-1] or the distance of the index fingertip to the palm. |
| Middle | float | | Middle specifies the % closed for the middle finger [0-1] or the distance of the middle fingertip to the palm. |
| Ring | float | | Ring specifies the % closed for the ring finger [0-1] or the distance of the ring fingertip to the palm. |
| Pinky | float | | Pinky specifies the % closed for the pinky finger [0-1] or the distance of the pinky fingertip to the palm. |
| FingerAdduction | float | | FingerAdduction specifies the spread of the fingers as a % [0-1]. Not used in DISTANCE_IK mode. |
| ThumbOpposition | float | | ThumbOpposition specifies the opposition of the thumb as a % [0-1]. Not used in DISTANCE_IK mode. |
| CustomAngles | float | repeated | CustomAngles allows for user-defined joint positions. WARNING: NOT ROBOT AGNOSTIC |
| JointNames | string | repeated | JointNames names each joint corresponding to CustomAngles. WARNING: NOT ROBOT AGNOSTIC |
| GripperMode | GripperModeEnum | | GripperMode indicates whether custom angles should be applied. |
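Per the field descriptions above, percentage-form finger commands lie in [0, 1], with 0 fully open and 1 fully closed. A minimal sketch of sanitising such a command before filling in a Gripper message (the helper name is hypothetical):

```python
def clamp_gripper_percentages(**fingers: float) -> dict:
    """Clamp per-finger commands into the [0, 1] range used for
    percentage-style gripper control (0 = fully open, 1 = fully closed)."""
    return {name: min(1.0, max(0.0, value)) for name, value in fingers.items()}
```

For example, `clamp_gripper_percentages(Thumb=1.4, Index=-0.2)` yields `{"Thumb": 1.0, "Index": 0.0}`.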


IMU (Inertial Measurement Unit) captures the sensor data needed to determine an object's orientation, velocity, and gravitational forces.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Gyro | Vector3 | | Gyro represents angular velocity measured in radians per second along the X, Y, and Z axes. |
| Acc | Vector3 | | Acc signifies linear acceleration in metres per second squared along the X, Y, and Z axes. |
| Transform | Transform | | Transform provides a reference frame for the IMU data. |


ImageSample contains various information related to a single image or frame.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Source | string | | Source identifies the origin of the image, such as a camera name or sensor ID. |
| Data | bytes | | Data holds the image information in bytes. |
| DataShape | uint32 | repeated | DataShape provides the dimensions of the Data array, useful for reshaping the byte array back into an image. |
| Compression | ImageSample.CompressionFormatEnum | | Compression indicates the algorithm used to compress the image. |
| ImageType | ImageSample.ImageTypeEnum | | ImageType specifies the kind of image (RGB, Depth, etc.). |
| Transform | Transform | | Transform represents the spatial positioning of the camera or sensor at the time the image was captured. |
| FrameNumber | uint64 | | Optional: FrameNumber is a unique identifier for this frame. |
| Designation | StereoDesignationEnum | | Designation pertains to the role of the image in a stereo pair, if applicable. |
| HFOV | float | | Optional: HFOV is the horizontal field of view of the camera in degrees. |
| VFOV | float | | Optional: VFOV is the vertical field of view of the camera in degrees. |
| NewDataFlag | bool | | NewDataFlag indicates whether this sample contains new information since the last capture. |
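For an uncompressed image, DataShape makes it possible to recover the pixel layout from the flat Data buffer. A minimal sketch, assuming RAW compression, row-major ordering, and one byte per element (the ordering and element size are assumptions, not part of the message definition):

```python
import math

def reshape_image(data: bytes, shape: list) -> list:
    """Rebuild nested rows/columns from a flat byte buffer using DataShape.

    Assumes row-major ordering and one byte per element; the true element
    size depends on ImageType and Compression.
    """
    if math.prod(shape) != len(data):
        raise ValueError("DataShape does not match Data length")

    def build(flat, dims):
        if len(dims) == 1:
            return list(flat)  # innermost dimension: raw byte values
        step = len(flat) // dims[0]
        return [build(flat[i * step:(i + 1) * step], dims[1:]) for i in range(dims[0])]

    return build(data, shape)
```

For a DataShape of [height, width, channels], `reshape_image(data, [2, 2, 3])[1][1]` returns the three channel values of the bottom-right pixel.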


ImageSamples holds an array of ImageSample objects, each capturing a specific image or frame at a given point in time and with a specific transform.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Samples | ImageSample | repeated | Samples is an array of ImageSample objects. |


Int64Array is a container for an array of 64-bit integers.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Data | int64 | repeated | Data is an array of 64-bit integers. |


Locomotion encapsulates information regarding the robot's movement and positioning.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Position | Vector3 | | Position defines the robot's spatial coordinates in Cartesian form. |
| Rotation | Vector3 | | Rotation outlines the robot's orientation, described by a Vector3 of Euler angles. |
| LocomotionMode | LocomotionModeEnum | | LocomotionMode defines whether the robot's locomotion should be position-based or velocity-based. |
| RotationUnits | AngleTypeEnum | | RotationUnits defines the units of the rotation vector. |


MotorArray is a collection of MotorSamples, each accompanied by a timestamp, providing a time-ordered set of robot states. This is the data type for a recording.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Samples | MotorSample | repeated | Samples contains an array of MotorSamples, each capturing a unique state of the robot's motors. |
| Tstamp | int64 | repeated | Tstamp holds an array of timestamps in milliseconds, each corresponding to a MotorSample in the Samples array. |
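Because Samples and Tstamp are parallel arrays, looking up the recorded state nearest a given time is a binary search over Tstamp. A sketch, assuming timestamps are sorted ascending as a recording would produce (the helper name is illustrative):

```python
import bisect

def sample_at(tstamps: list, samples: list, t_ms: int):
    """Return the sample whose timestamp is closest to t_ms.

    Assumes tstamps is sorted ascending and parallel to samples.
    """
    if not tstamps:
        raise ValueError("empty recording")
    i = bisect.bisect_left(tstamps, t_ms)
    if i == 0:
        return samples[0]
    if i == len(tstamps):
        return samples[-1]
    # pick whichever neighbouring timestamp is nearer to t_ms
    before, after = tstamps[i - 1], tstamps[i]
    return samples[i] if after - t_ms < t_ms - before else samples[i - 1]
```

Querying `sample_at([0, 100, 200], ["a", "b", "c"], 140)` returns `"b"`, the sample recorded at 100 ms.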


MotorSample combines all the parts of a motor update into a single message.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Head | Vector3 | | Head defines the orientation of the robot's head, if applicable, in Euler angles. X is yaw (look right when positive), Y is pitch (look up when positive) and Z is roll (lean head right when positive). |
| Locomotion | Locomotion | | Locomotion provides an update for how the robot should move within its environment. |
| EndEffectors | EndEffector | repeated | EndEffectors holds an array of updates for end effectors, such as arms or legs. |
| OperatingHeight | float | | OperatingHeight defines the robot's height in metres from a specified base, useful for robots with adjustable height. In the future, operating height and head will be treated as end effectors to generalise further. |
| RecordFlag | bool | | RecordFlag indicates whether the current motor sample should be recorded. |
| PlaybackFlag | bool | | PlaybackFlag signals whether a previously recorded action should be replayed. |
| PlaybackActionName | string | | PlaybackActionName specifies the name of the action to be replayed, assuming PlaybackFlag is true. |


ProprioceptionSample contains all the information regarding the kinematic state of the robot and its relation to the robot's environment.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| JointNames | string | repeated | JointNames lists the names of all joints in the robot, ordered according to the robot's specification. |
| JointAngles | float | repeated | JointAngles holds the angular positions of each joint, ordered to correspond with JointNames. |
| AngleUnits | AngleTypeEnum | | AngleUnits specifies the unit of measurement for the joint angles. |
| JointVelocities | float | repeated | Optional: JointVelocities contains the angular velocities of each joint, ordered to align with JointNames. |
| JointTorques | float | repeated | Optional: JointTorques lists the torques applied at each joint, ordered to align with JointNames. |
| TorqueUnits | TorqueTypeEnum | | Optional: TorqueUnits indicates the unit of measurement for the joint torques. |
| SpatialOrientationEnabled | bool | | Optional: SpatialOrientationEnabled flags whether spatial orientation data is included or not. |
| SpatialUnits | AngleTypeEnum | | Optional: SpatialUnits indicates the unit of measurement for the spatial resolution. |
| SpatialResolution | float | | Optional: SpatialResolution specifies the granularity of the spatial orientation data. |
| SpatialOrientation | float | repeated | Optional: SpatialOrientation provides the distance to an obstacle within each angular segment of the size defined by SpatialResolution. |
| Imu | IMU | repeated | Optional: Imu provides the 3D acceleration, orientation, and gravitational forces experienced at different positions on the robot's body. |
| Gps | GPS | | Optional: Gps provides the global position of the robot, if available. |
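The SpatialOrientation array can be unpacked into (angle, distance) pairs using SpatialResolution. A sketch, assuming the first segment starts at angle 0 and segments are contiguous in the unit given by SpatialUnits (both assumptions; the message does not fix a starting angle):

```python
def unpack_spatial_orientation(distances: list, resolution: float) -> list:
    """Pair each obstacle distance with the centre angle of its segment.

    Assumes segment 0 starts at angle 0 and segments are contiguous.
    """
    if resolution <= 0:
        raise ValueError("SpatialResolution must be positive")
    return [((i + 0.5) * resolution, d) for i, d in enumerate(distances)]
```

With a resolution of 90 degrees, two readings map to segment centres at 45 and 135 degrees.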


Quaternion contains the components of a normalised quaternion that can be used to describe a rotation.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| QuatX | float | | X component of the normalised quaternion. |
| QuatY | float | | Y component of the normalised quaternion. |
| QuatZ | float | | Z component of the normalised quaternion. |
| QuatW | float | | W component of the normalised quaternion. |
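Since Transform accepts either EulerAngles or a Quaternion, a conversion between the two is often needed. A sketch, assuming radians and an intrinsic Z-Y-X rotation order (the message does not fix a convention, so the order here is an assumption):

```python
import math

def euler_to_quaternion(x: float, y: float, z: float):
    """Convert Euler angles in radians (applied in Z-Y-X order, an
    assumption) to a normalised quaternion (QuatX, QuatY, QuatZ, QuatW)."""
    cx, sx = math.cos(x / 2), math.sin(x / 2)
    cy, sy = math.cos(y / 2), math.sin(y / 2)
    cz, sz = math.cos(z / 2), math.sin(z / 2)
    qw = cx * cy * cz + sx * sy * sz
    qx = sx * cy * cz - cx * sy * sz
    qy = cx * sy * cz + sx * cy * sz
    qz = cx * cy * sz - sx * sy * cz
    return qx, qy, qz, qw
```

A zero rotation yields the identity quaternion (0, 0, 0, 1), and a 90-degree rotation about X yields approximately (0.707, 0, 0, 0.707).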


StringSample encapsulates a single string-based data sample.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Data | string | | Data contains the string value of the sample. |


TactileSample captures data about a single tactile interaction or sensor reading.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Source | string | | Source identifies the tactile sensor that provided the data. |
| LocationTag | RelativeToEnum | | LocationTag specifies the location on the robot where the sensor is mounted. |
| Data | bytes | | Data contains the raw tactile data. |
| DataShape | uint32 | repeated | DataShape provides the dimensions of the tactile data array. |
| TactileType | TactileSample.TactileTypeEnum | | TactileType specifies the type of tactile interaction being captured. |
| Compression | TactileSample.CompressionFormatEnum | | Compression indicates the tactile data compression format. |
| ClassifiedTexture | string | | ClassifiedTexture offers a qualitative description of a detected texture. |
| FloatData | float | repeated | FloatData can contain additional floating-point data related to the tactile sample. |
| Transform | Transform | | Transform gives the spatial orientation and position of the sensor. |
| NewDataFlag | bool | | NewDataFlag indicates whether the data sample is new or not. |


TactileSamples is a container for multiple instances of tactile data samples.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Samples | TactileSample | repeated | Samples is an array of TactileSample messages, each providing detailed information about different types of tactile experiences. |


TimeSync enables the synchronisation of clocks between devices.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| TOrigin_0 | int64 | | TOrigin_0 is the time on the originating device when the sync request is first made. |
| TReceiver | int64 | | TReceiver is the time on the receiving device when the sync request is received. |
| TOrigin_1 | int64 | | TOrigin_1 is the time on the originating device when the sync response is received. |
| ReturnTrip | bool | | ReturnTrip signifies whether the synchronisation request has made a round trip. |
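Once ReturnTrip is true, the three timestamps support a standard NTP-style estimate of round-trip time and clock offset. The estimator below is a sketch (the message only carries the timestamps; the symmetric-delay assumption is NTP's, not the protocol's):

```python
def clock_sync(t_origin_0: int, t_receiver: int, t_origin_1: int):
    """Estimate round-trip time and receiver clock offset in milliseconds.

    Assumes the one-way network delay is symmetric, as NTP does.
    """
    round_trip = t_origin_1 - t_origin_0
    # receiver clock minus originator clock, measured at the midpoint
    offset = t_receiver - (t_origin_0 + t_origin_1) / 2
    return round_trip, offset
```

For timestamps (0, 60, 100), the round trip is 100 ms and the receiver's clock leads by 10 ms.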


Transform captures a spatial point and orientation in Cartesian coordinates.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| Position | Vector3 | | Position utilises a Vector3 to define distances in metres from a designated reference point. X is positive to the right, Y is positive upwards, Z is positive forwards. |
| Quaternion | Quaternion | | Quaternion contains a normalised quaternion to define the object's rotation. |
| EulerAngles | Vector3 | | EulerAngles uses a Vector3 to define rotation through specific angles, in units designated by AngleUnits. |
| AngleUnits | AngleTypeEnum | | AngleUnits outlines the unit of measurement used in EulerAngles. |
| LocationTag | RelativeToEnum | | Optional: LocationTag identifies the origin point to which the transform is relative. |
| Name | string | | Name assigns a distinct identifier to the Transform. |


Vector3 contains the three components expected of a vector in 3D Cartesian space.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| X | float | | X component of the vector. |
| Y | float | | Y component of the vector. |
| Z | float | | Z component of the vector. |


AngleTypeEnum designates the measurement units for angles.

| Name | Number | Description |
| --- | --- | --- |
| UNKNOWN_ANGLE_UNITS | 0 | Default: Robot uses its default angle unit when not specified. |
| RADIANS | 1 | Angles are measured in radians. |
| DEGREES | 2 | Angles are measured in degrees. |


CompressionFormatEnum lists available audio compression formats.

| Name | Number | Description |
| --- | --- | --- |
| RAW | 0 | Default: No compression. |
| SPEEX | 1 | Compressed in Speex format. |


GripperModeEnum specifies the type of control to be used by the gripper.

| Name | Number | Description |
| --- | --- | --- |
| PERCENTAGE | 0 | Default: The information stored for each digit is a float in the range 0 (fully open) to 1 (fully closed). |
| DIRECT | 1 | A custom array of angles is applied to a matching array of joint names to control the movement directly. |
| DISTANCE_IK | 2 | The information stored for each digit is a distance from fingertip to palm. This information is used by IK to create a robot-agnostic solution that is independent of the gripper's joint configuration. |


CompressionFormatEnum defines the compression currently in use for this image.

| Name | Number | Description |
| --- | --- | --- |
| H264 | 0 | h264 compression |
| VP8 | 1 | Default: vp8 compression |
| VP9 | 2 | vp9 compression |
| JPG | 3 | jpeg compression |
| RAW | 4 | no compression |
| H265 | 5 | h265 compression |


ImageTypeEnum defines the type of image or sensory data.

| Name | Number | Description |
| --- | --- | --- |
| RGB | 0 | Default: Standard red-green-blue image in YUV422 format |
| DEPTH | 1 | Depth map |
| STEREO | 2 | Stereoscopic image |
| INFRARED | 3 | Infrared image |


LocomotionModeEnum categorises the robot's movement control methods.

| Name | Number | Description |
| --- | --- | --- |
| UNKNOWN_LOCOMOTION_MODE | 0 | Default: Robot uses its inherent default mode, either position-based or velocity-based. |
| POSITION_MODE | 1 | Commands define the robot's target distance, specified in metres for translation and degrees or radians for rotation. |
| VELOCITY_MODE | 2 | Commands define the robot's target speed, specified in metres per second for translation and degrees or radians per second for rotation. |


MotionGeneration specifies the control mode used to generate the robot's movement.

| Name | Number | Description |
| --- | --- | --- |
| IK | 0 | Default: Inverse kinematics is used to generate the robot's movement. |
| CUSTOM | 1 | Custom angles are used to set the robot's joint angles directly. |
| IK_WITH_SELF_COLLISION_AVOIDANCE | 2 | Experimental: Self-collision avoidance is applied when generating the robot's movement. |


RelativeToEnum marks specific points on the robot for spatial reference.

| Name | Number | Description |
| --- | --- | --- |
| BASE | 0 | Default: Base indicates the robot's contact point with the floor. |
| HEAD | 1 | Head marks the robot's primary vision point, often where the main camera is located. |
| END_EFFECTOR | 2 | End_Effector identifies the terminal point of an articulated assembly. |
| END_EFFECTOR_LEFT | 3 | Identical to End_Effector, but specifies the main articulated assembly on the left side of the robot. |
| END_EFFECTOR_RIGHT | 4 | Identical to End_Effector, but specifies the main articulated assembly on the right side of the robot. |
| TORSO | 5 | Torso marks the robot's centre of mass. |


StereoDesignationEnum identifies the camera's role as part of a stereo pair or a microphone's role as part of a binaural pair.

| Name | Number | Description |
| --- | --- | --- |
| NONE | 0 | Default: The device functions independently, not as part of a stereo pair. |
| LEFT | 1 | The device operates as the left component in a stereo pair. |
| RIGHT | 2 | The device operates as the right component in a stereo pair. |


CompressionFormatEnum lists available formats for tactile data compression.

| Name | Number | Description |
| --- | --- | --- |
| RAW | 0 | Default: No compression. |


TactileTypeEnum enumerates the types of tactile interactions that can be captured.

| Name | Number | Description |
| --- | --- | --- |
| UNDEFINED | 0 | Default: Undefined tactile type. |
| PRESSURE | 1 | PRESSURE is a 1D float data type of range [0,1] with 1 item per sensor. |
| VIBRATION | 2 | VIBRATION is a 1D float data type of range [0,1] with 1 item per sensor. |
| TEXTURE | 3 | TEXTURE represents a classified texture provided as a string. |
| FORCE_FEEDBACK | 4 | FORCE_FEEDBACK is a 1D float data type of range [0,1] with 1 item per sensor. |
| TRIAXIAL | 5 | TRIAXIAL is a 3D float data type of range [0,1] with 3 items per sensor. |


TorqueTypeEnum outlines the measurement units for torque.

| Name | Number | Description |
| --- | --- | --- |
| UNKNOWN_TORQUE_UNITS | 0 | Default: Robot uses its default torque unit when not specified. |
| CURRENT | 1 | Torque is quantified as electrical current in amperes. This is not a direct measurement of torque, but a good relative measure when one is not available. |
| NEWTON_METRES | 2 | Torque is quantified in newton metres. |

Scalar Value Types

| .proto Type | Notes | C++ | Java | Python | Go | C# | PHP | Ruby |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| double | | double | double | float | float64 | double | float | Float |
| float | | float | float | float | float32 | float | float | Float |
| int32 | Uses variable-length encoding. Inefficient for encoding negative numbers – if your field is likely to have negative values, use sint32 instead. | int32 | int | int | int32 | int | integer | Bignum or Fixnum (as required) |
| int64 | Uses variable-length encoding. Inefficient for encoding negative numbers – if your field is likely to have negative values, use sint64 instead. | int64 | long | int/long | int64 | long | integer/string | Bignum |
| uint32 | Uses variable-length encoding. | uint32 | int | int/long | uint32 | uint | integer | Bignum or Fixnum (as required) |
| uint64 | Uses variable-length encoding. | uint64 | long | int/long | uint64 | ulong | integer/string | Bignum or Fixnum (as required) |
| sint32 | Uses variable-length encoding. Signed int value. These more efficiently encode negative numbers than regular int32s. | int32 | int | int | int32 | int | integer | Bignum or Fixnum (as required) |
| sint64 | Uses variable-length encoding. Signed int value. These more efficiently encode negative numbers than regular int64s. | int64 | long | int/long | int64 | long | integer/string | Bignum |
| fixed32 | Always four bytes. More efficient than uint32 if values are often greater than 2^28. | uint32 | int | int | uint32 | uint | integer | Bignum or Fixnum (as required) |
| fixed64 | Always eight bytes. More efficient than uint64 if values are often greater than 2^56. | uint64 | long | int/long | uint64 | ulong | integer/string | Bignum |
| sfixed32 | Always four bytes. | int32 | int | int | int32 | int | integer | Bignum or Fixnum (as required) |
| sfixed64 | Always eight bytes. | int64 | long | int/long | int64 | long | integer/string | Bignum |
| bool | | bool | boolean | bool | bool | bool | boolean | TrueClass/FalseClass |
| string | A string must always contain UTF-8 encoded or 7-bit ASCII text. | string | String | str/unicode | string | string | string | String (UTF-8) |
| bytes | May contain any arbitrary sequence of bytes. | string | ByteString | str | []byte | ByteString | string | String (ASCII-8BIT) |
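The "variable-length encoding" in the notes above is protobuf's varint format: seven payload bits per byte, with the high bit flagging continuation. The sint types first apply ZigZag mapping so that small negative numbers stay small. A sketch of both:

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative integer as a protobuf varint."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # set the continuation bit
        else:
            out.append(byte)
            return bytes(out)

def zigzag32(value: int) -> int:
    """ZigZag-map a signed 32-bit int so -1 -> 1, 1 -> 2, -2 -> 3, ..."""
    return ((value << 1) ^ (value >> 31)) & 0xFFFFFFFF

# -1 encoded as a plain int32 varint costs 10 bytes, because protobuf
# sign-extends it to 64 bits; ZigZagged to 1, it costs a single byte.
```

For example, 300 encodes to the two bytes `0xAC 0x02`, and `zigzag32(-1)` returns 1, which is why sint32 is the better choice for fields that are often negative.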