2d mapping using a webcam and a laser


Following on from my previous blog post using a webcam and a laser as a rangefinder, the rig is now mounted on a cheap 3 euro stepper motor and used to make a 2d map of the surroundings. As you can see from the video the results are not great, but I think with a bit more work there is room for improvement. A PIC18F14K50 is used to control the stepper motor and send the current step value over to the computer, where a Python script matches it to the current laser distance. Then, using basic trigonometry, x-y points can be found and plotted on a graph.

Webcam and laser rig

The laser and webcam were just glued to a piece of wood, then that piece of wood was stuck onto a stepper motor and the stepper motor is held in a vice. The webcam is a Sweex WC003V3, available for 10 euros delivered on eBay. The laser diode is from China and cost around 4 dollars delivered. The stepper motor is a 28BYJ-48, available for 3 euros delivered from China. All in all you're looking at around 30 quid for some basic 2d mapping when you include the PIC microcontroller, H-Bridge IC etc.

Theory of operation:

X-Y points

We can work out theta by knowing how many steps the stepper motor has taken, and D is calculated using the webcam and laser combination. From there, using basic trigonometry, the x-y points can be worked out.

xpoints = D*cos(theta)
ypoints = D*sin(theta)
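As a minimal illustration (not the actual script, which is linked further down, and with an illustrative function name), the conversion in Python looks like this; the one gotcha is that the trig functions in the math module expect radians:

```python
import math

def polar_to_xy(readings):
    """Convert (theta in degrees, distance) pairs to (x, y) points."""
    points = []
    for theta_deg, d in readings:
        theta = math.radians(theta_deg)  # math.cos/math.sin expect radians
        points.append((d * math.cos(theta), d * math.sin(theta)))
    return points

# Example: three readings at 0, 90 and 180 degrees, all 1 metre away
print(polar_to_xy([(0, 1.0), (90, 1.0), (180, 1.0)]))
```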

So the webcam-laser rig is rotated through 360 degrees and, as it rotates, values of D are matched with values of theta. After it has stopped rotating this information can be converted into x-y points and plotted on a graph. The datasheet for the stepper motor says there are 4096 steps per 360 degrees. I think that might be incorrect though, as I found that 2048 steps results in a 360 degree turn. From this information we can work out the minimum step angle possible.

2048 steps = 360°
1 step ≈ 0.176°
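In code that conversion is just a constant (using the 2048 steps per revolution I measured, not the datasheet figure):

```python
STEPS_PER_REV = 2048                    # one full 360 degree turn, as measured
DEG_PER_STEP = 360.0 / STEPS_PER_REV    # ~0.176 degrees

def step_to_angle(step):
    """Angle in degrees corresponding to a given step count."""
    return step * DEG_PER_STEP

print(step_to_angle(1))    # ~0.176
print(step_to_angle(512))  # 90.0
```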

This minimum angle is important for figuring out the smallest gap that can be “seen” at the maximum range of the device. In this case the webcam and laser rig is calibrated to a max range of around 2.5 metres. To find the smallest noticeable gap, simple trig can be used:

theoretical smallest gap

This means that at a range of 2.5 metres, the smallest gap that can be reliably detected is 0.763 cm wide. This is quite small: ideal for making a detailed map of a room. However, this is a theoretical limit; it assumes every step angle is matched with a distance reading from the webcam, which is not the case because of the fps of the webcam.
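For reference, here is that sum as a couple of lines of Python. I'm treating the smallest gap as the width swept out by one step at the maximum range, so the result lands in the same ballpark as the figure above rather than exactly on it:

```python
import math

MAX_RANGE_M = 2.5              # calibrated maximum range of the rangefinder
STEP_ANGLE_DEG = 360.0 / 2048  # ~0.176 degrees per step

gap_cm = MAX_RANGE_M * math.tan(math.radians(STEP_ANGLE_DEG)) * 100
print(f"{gap_cm:.2f} cm")      # ~0.77 cm, close to the 0.763 cm quoted above
```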

The main limitation is how fast you can store the information coming in from both the PIC microcontroller and the webcam, and the bottleneck in the process is definitely the webcam. At best I could get around 8 fps at 640×480 resolution. A higher fps could be achieved at a lower resolution, and this is something I'm going to try in an attempt to get a faster, more detailed scan. Because of the fairly low fps, the stepper needs to rotate fairly slowly to make up for the loss in scan resolution; it takes 25 seconds to complete a full revolution.
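Since the frames are grabbed with OpenCV, requesting a lower capture resolution is only a couple of lines. Whether the camera actually honours the request depends on the driver, and the property names below are the newer cv2.CAP_PROP_* ones:

```python
import cv2

cap = cv2.VideoCapture(0)  # first webcam on the system

# Request a lower resolution; a higher frame rate may then be possible.
# Not all webcams/drivers honour these requests, so read the values back.
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 320)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 240)
print(cap.get(cv2.CAP_PROP_FRAME_WIDTH), cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

ret, frame = cap.read()
if ret:
    print("Frame shape:", frame.shape)
cap.release()
```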

25 seconds = 360°
1 second = 14.4°

In one second the stepper rotates through 14.4 degrees. Since the webcam is running at 8 fps, that works out to a new distance reading every 14.4/8 = 1.8 degrees. So now the minimum gap visible at max range becomes:

smallest gap

This brings the resolution down to 7.85 centimetres at the maximum range. That's still usable, but it's something I might try to improve. The easiest way would be getting a better webcam; the PlayStation Eye camera works with OpenCV and can easily capture 60 fps, so that is definitely an option.
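Putting the timing numbers together (same caveat as before about how the gap is defined), and showing roughly what a 60 fps camera would buy:

```python
import math

REV_TIME_S = 25.0      # seconds per full revolution
FPS = 8.0              # webcam frames per second
MAX_RANGE_M = 2.5      # calibrated max range

deg_per_second = 360.0 / REV_TIME_S       # 14.4 degrees per second
deg_per_frame = deg_per_second / FPS      # 1.8 degrees between readings
gap_cm = MAX_RANGE_M * math.tan(math.radians(deg_per_frame)) * 100

print(f"{deg_per_frame:.1f} deg/frame, ~{gap_cm:.1f} cm gap at max range")
# Repeating the sum with FPS = 60 gives roughly 1 cm at 2.5 m.
```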

Source code:

Here is the code for the microcontroller to control the stepper and send the current step value over serial:

Here is the Python script to grab frames from the webcam, calculate distance, match that with a step number, calculate the x-y points and then plot those points:

Click here to download the C file for the PIC
Click here to download the python script
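If you just want the shape of the Python script without downloading it, the sketch below shows roughly how the pieces fit together. The serial port name, the laser-dot detection and the calibration constants are placeholders I've made up for illustration, not the values from the real script:

```python
import math

import cv2
import serial
import matplotlib.pyplot as plt

STEPS_PER_REV = 2048
PORT = "/dev/ttyUSB0"          # placeholder: whatever port the PIC shows up on

def distance_from_frame(frame):
    """Stand-in for the webcam-laser rangefinder maths from the previous
    post: find the laser dot and turn its pixel offset from the image
    centre into a distance. The constants here are made up."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, _, _, (col, row) = cv2.minMaxLoc(gray)   # brightest pixel = laser dot
    offset = abs(row - frame.shape[0] / 2)      # pixels from image centre
    return 0.05 / math.tan(offset * 0.0015 + 0.01)

ser = serial.Serial(PORT, 9600, timeout=1)
cap = cv2.VideoCapture(0)

xs, ys = [], []
while True:
    line = ser.readline().strip()               # current step number from the PIC
    ok, frame = cap.read()
    if not ok or not line:
        continue
    step = int(line)
    if step >= STEPS_PER_REV:                   # one full revolution finished
        break
    theta = math.radians(step * 360.0 / STEPS_PER_REV)
    d = distance_from_frame(frame)
    xs.append(d * math.cos(theta))
    ys.append(d * math.sin(theta))

cap.release()
ser.close()
plt.scatter(xs, ys, s=2)
plt.show()
```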

2D Plot:

As I said in the video, where it hits my door with all the clothes hanging on it, the map goes a bit mad.

2d map

Future improvements:
A higher FPS rate would allow for a faster scan time, which would then make a real-time 2d map possible. For something like that I'd probably look into using Processing to display the map on screen, because matplotlib doesn't seem to be all that fast at updating the display. From there it could be mounted on an autonomous robot of some kind. Removing the need for a computer would be the main goal, that and making everything smaller. I have tried something like this with a Raspberry Pi but I just couldn't get a decent FPS from the webcam with it. Using an FPGA might be an option.
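As an aside, matplotlib can be pushed a fair bit faster than replotting from scratch by updating a single scatter artist in place. A minimal sketch, with dummy data standing in for the real readings:

```python
import math
import random

import matplotlib.pyplot as plt

plt.ion()                                  # interactive mode: draw without blocking
fig, ax = plt.subplots()
scat = ax.scatter([], [], s=2)
ax.set_xlim(-3, 3)
ax.set_ylim(-3, 3)

xs, ys = [], []
for step in range(0, 2048, 8):             # dummy scan: one reading every 8 steps
    theta = math.radians(step * 360.0 / 2048)
    d = 2.0 + random.uniform(-0.1, 0.1)    # stand-in for a real distance reading
    xs.append(d * math.cos(theta))
    ys.append(d * math.sin(theta))
    scat.set_offsets(list(zip(xs, ys)))    # update the existing artist in place
    plt.pause(0.001)                       # let the GUI event loop redraw

plt.ioff()
plt.show()
```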

21 thoughts on “2d mapping using a webcam and a laser”

  1. Really impressive!! I wonder why some of the points scatter out beyond the wall, but they’re kind of at a consistent distance. This is a really interesting project.

      1. My guess would also be reflective surfaces. With a line laser you could get more readings of the same wall per frame and such upsets would become much less likely. Finding the line would be more complicated though.

  2. To get a bit more info on graphing effectively – check out this intro to SLAM by Claus Brenner. His intro pdf outlines how you get graphing easily done and his Youtube course is pretty darn useful too.
    Welcome to mapping on Python dude ! 🙂
    http://www.clausbrenner.de/slam.html

    Also useful but not right now:
    http://eclecti.cc/computergraphics/easy-interactive-camera-projector-homography-in-python

    If you want to go embedded CPU check out MicroPython – Python running on a microcontroller. The KS is over but boards will be available soon. http://micropython.org/

    1. It’s a cool idea alright. As far as I know you can only get around 4 fps using opencv with the raspi camera. This is because there is no access to the raspi gpu, leaving the cpu to do all of the image processing. Things might have changed since I last checked though so I’ll give it a look

    1. Thanks for the links neon. I’m busy with college stuff at the moment but I have been thinking a lot about this. First of all I’m going to try and get it working with a line. I was thinking maybe it would be possible to rotate the laser through 360 degrees to get a 3d scan of something? That would be pretty cool to try

      1. Altering the angle of the vertical line +/- a small amount will increase your S/N. This will be at the cost of knowing precisely how non-vertical it is. Rotating it 360 is probably redundant.
        If you want to go super cool then place three lasers with different colours at a slight angle to each other (so the lines cross somewhere). Then use either a filter, or initially just the RGB values, to detect each colour line. This will make your scanner resistant to the colour of an object absorbing all of that line’s colour and giving you poor results. Of course it will not help with translucent objects or furry fabrics…
        You can just start with red/green to see the effect; a violet (DVD) laser is ideal for the third colour. Or use an invisible infrared as well. But it’s easy to start with one, add green and then determine what to do about the rest. A servo will do if you want to try wobbling the vertical line.
        Good luck.
