John Keogh | September 14, 2013
I have a previous post about using web services from robots. This
post is focused on using a web service from the EyesBot Driver robot controller app. The audience for this blog post is
readers who:
- Are interested in computer vision
- Are interested in robotics
- Know what a web service is
The ability to call web services was added to EyesBot Driver v1.1, which should be available in the Apple App Store
around September 24, so that the end user:
- Can extend the functionality of the EyesBot Driver app by creating a custom web service
- Has an easy way to get started with computer vision (CV), or to extend existing CV expertise into the robotics realm
You can download a
zip file that contains an example REST style web service written
in PHP that accepts an image and returns a command, encoded in JSON, that tells the robot what direction to go based on the image. The
included code reacts to the orange color of a DFRobot
box under strong, uniform, white illumination. To see the way that a robot behaves using this web service, please see the
Robot Reacting to Visual Stimulus
post. In that case the processing was done on the iPod but the behavior is the same whether the processing is done
on the iPod or on a server called using a web service.
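Before digging into the pixel-processing code below, it may help to see the overall shape of such an endpoint. The following is a minimal sketch of how a PHP service might receive the posted image; the upload field name "image" and the JPEG encoding are assumptions here, so check the example service in the zip file for the actual details.

```php
<?php
// Sketch: receive the image that EyesBot Driver POSTs to the service.
// Assumptions: multipart upload, field name "image", JPEG encoding.
function loadUploadedImage($fieldName) {
    if (!isset($_FILES[$fieldName]) ||
        $_FILES[$fieldName]["error"] !== UPLOAD_ERR_OK) {
        return false;
    }
    // imagecreatefromjpeg() comes from PHP's GD extension.
    return imagecreatefromjpeg($_FILES[$fieldName]["tmp_name"]);
}

$im = loadUploadedImage("image");
if ($im !== false) {
    $width  = imagesx($im);
    $height = imagesy($im);
    // ... pixel analysis goes here ...
}
```

From here the service analyzes the image and echoes back a JSON command, as shown later in this post.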
I'm working with RobotShop to create a kit for the robot; I'll post the link here when it is
available.
The following activity diagram shows the entities that are involved when EyesBot Driver consumes a web service. If you
don't have your iPod in a robot body, the Arduino isn't involved, but the rest remains the same.
The user can use a dropdown on the EyesBot Driver web UI to choose among three modes: don't send images to a web service, send images
to a web service, or control the robot with a web service. If the user chooses to control the robot with a web service, then the
command that the web service returns, which is in this format:
{
    "actions":[
        {
            "duration": 1000,
            "delay": 1000,
            "left": 0,
            "right": 0,
            "light": 0,
            "stop": "false",
            "comment": ""
        }
    ]
}
will be used to tell the robot where to go. What each value means is explained in the PHP file in the
example web service.
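If you want to verify that your own service emits commands in this shape, you can decode a response and inspect the first action like this (the literal `$json` string below stands in for the HTTP response body):

```php
<?php
// Decode a command returned by the web service and read the
// first action. $json stands in for the HTTP response body.
$json = '{"actions":[{"duration":1000,"delay":1000,"left":0,'
      . '"right":0,"light":0,"stop":"false","comment":""}]}';

$command = json_decode($json, true);
$action  = $command["actions"][0];

printf("left=%d right=%d for %dms\n",
       $action["left"], $action["right"], $action["duration"]);
// prints: left=0 right=0 for 1000ms
```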
Below is a short excerpt of the code that makes up the example web service. This is where the
image that is passed in by EyesBot Driver is processed. You can do any processing you want,
including passing the image to OpenCV, but this shows a simple way to find all pixels that fall
within a specified color range.
//the robot can only go left, right or straight, so we only
//need to determine whether the object to follow is ahead,
//to the left or to the right, not whether it is up or down
for($column = 0; $column < $width; $column++){
    for($row = 0; $row < $height; $row++){
        $rgb = imagecolorat($im, $column, $row);
        $r = ($rgb >> 16) & 0xFF;
        $g = ($rgb >> 8) & 0xFF;
        $b = $rgb & 0xFF;
        if((($r > $RedLowerLimit) && ($r <= $RedUpperLimit)) &&
           (($g > $GreenLowerLimit) && ($g <= $GreenUpperLimit)) &&
           (($b > $BlueLowerLimit) && ($b <= $BlueUpperLimit))){
            if($column > $middleThirdBoundary){
                $rightThirdMatchSpots++;
            }
            else if($column > $leftThirdBoundary){
                $middleThirdMatchSpots++;
            }
            else{
                $leftThirdMatchSpots++;
            }
            $matchPoints++;
        }
        $totalPoints++;
    }
}
After the number of matching pixels is determined, the logic decides where the robot should go and returns
the JSON-encoded result:
echo "{\r\n";
echo "    \"actions\":[\r\n";
echo "        {\r\n";
echo "            \"duration\": $duration,\r\n";
echo "            \"delay\": $delay,\r\n";
echo "            \"left\": $leftvelocity,\r\n";
echo "            \"right\": $rightvelocity,\r\n";
echo "            \"light\": $light,\r\n";
echo "            \"stop\": \"$stop\",\r\n";
echo "            \"comment\": \"$errors\"\r\n";
echo "        }\r\n";
echo "    ]\r\n";
echo "}\r\n";
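The decision step that sets `$leftvelocity` and `$rightvelocity` isn't shown above. One plausible sketch of how the per-third match counts could map to a steering decision is below; the thresholds and velocity values are illustrative assumptions, not the values the example service actually uses.

```php
<?php
// Sketch: map per-third match counts to a steering decision.
// All thresholds and velocity magnitudes are illustrative.
function decide($leftMatches, $middleMatches, $rightMatches, $total) {
    // Ignore frames where almost no pixels matched the color range.
    $matched = $leftMatches + $middleMatches + $rightMatches;
    if ($total > 0 && ($matched / $total) < 0.01) {
        return array("left" => 0, "right" => 0);    // nothing seen: stop
    }
    if ($middleMatches >= $leftMatches && $middleMatches >= $rightMatches) {
        return array("left" => 50, "right" => 50);  // target ahead: go straight
    }
    if ($leftMatches > $rightMatches) {
        return array("left" => -50, "right" => 50); // target to the left: turn left
    }
    return array("left" => 50, "right" => -50);     // target to the right: turn right
}
```

The resulting left and right values would then be echoed into the JSON response shown above.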
You can write the web service in any language you prefer, but make sure you return well-formed JSON, using
the example service in the zip file as a guide. The
rate at which you process images is governed by the "delay" parameter; don't turn it down too low, since image
processing is CPU intensive and the computer you are doing the processing on may get overwhelmed.
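One easy way to guarantee well-formed JSON in any language is to build the response as a data structure and let a library serialize it, rather than assembling the string by hand as the echo block above does. In PHP that might look like this (a sketch equivalent to the echoed response, with placeholder values):

```php
<?php
// Build the response as an array and let json_encode() serialize
// it, which guarantees well-formed JSON (including escaping of
// any special characters in the comment string).
$response = array(
    "actions" => array(
        array(
            "duration" => 1000,
            "delay"    => 1000,
            "left"     => 0,
            "right"    => 0,
            "light"    => 0,
            "stop"     => "false",
            "comment"  => ""
        )
    )
);

header("Content-Type: application/json");
echo json_encode($response);
```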