Robotics: How to Use an LED to Determine Distances
John Keogh | May 1, 2013
I looked at a few options for measuring distance for a small robot I'm working on, but it occurred to
me that I should be able to have the robot simply shine an LED on something, and, based on the amount of
light returned and the scatter of the light, be able to determine how far an object is from the
robot. This post describes how the experiment worked and how to do it yourself. The language used is
Objective-C, but the algorithm should be straightforward to port if you are using another language.
This blog post has an accompanying video that briefly shows the robot mentioned above in action and demonstrates the technique this post covers.
The accuracy of the technique is reasonable; most of the error comes from the iDevice camera automatically adjusting to the changed ambient light level when the LED is switched on.
Approach
These are the things that need to be done:
Make sure that the LED headlights are off.
Pause so the iDevice camera can adjust to the ambient light level.
Save an image - this is the headlight-off image.
Turn the LED headlights on.
Do not pause for light level adjustment this time; the next image must be saved quickly.
Save an image - this is the headlight-on image.
Compare the pixel luminosity of the headlight-off and headlight-on images.
Take the inverse square root of the ratio of changed pixels to unchanged pixels and multiply it by an experimentally determined constant.
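The post's sample code is Objective-C; since Objective-C is a superset of C, the comparison step can be sketched in plain C. The luma weights below are the standard Rec. 601 coefficients; the function names and the change threshold are illustrative, not taken from the original project.

```c
#include <stddef.h>
#include <stdint.h>

/* Rec. 601 luma approximation for one RGB pixel (0-255 per channel). */
static double luma(uint8_t r, uint8_t g, uint8_t b) {
    return 0.299 * r + 0.587 * g + 0.114 * b;
}

/* Compare two same-sized packed-RGB frames and return the fraction of
 * pixels whose luma increased by more than `threshold`. `pixels` is the
 * pixel count; each frame buffer holds pixels * 3 bytes. */
double changed_pixel_ratio(const uint8_t *off, const uint8_t *on,
                           size_t pixels, double threshold) {
    size_t changed = 0;
    for (size_t i = 0; i < pixels; i++) {
        double l_off = luma(off[3 * i], off[3 * i + 1], off[3 * i + 2]);
        double l_on  = luma(on[3 * i],  on[3 * i + 1],  on[3 * i + 2]);
        if (l_on - l_off > threshold) /* the LED only brightens pixels */
            changed++;
    }
    return (double)changed / (double)pixels;
}
```

A nearby object lights up many pixels strongly (ratio near 1); a distant one lights up few (ratio near 0).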
The calculation assumes an inverse square relationship between light intensity and distance, but it is actually more
complicated than that since light can be focused to attenuate more slowly over distance. Nonetheless, the assumption of
an inverse square relationship turned out to be reasonable.
There is also some choreography needed to do things in the right order. The code references classes that I am not going to release, but the sequence itself may be useful to have. The timing of the frame captures is critical: the LED must be on for the second frame, but it must be captured before the iDevice camera's automatic light level adjustment kicks in, which would otherwise distort the count of changed pixels.