Homemade Pulse Sensor

Originally posted on www.letsmakerobots.com

I’ve been working on re-making the Open Hardware Pulse Sensor so it’d be easy to send off to OSHPark and to make at home. I’m not sure, but I think I started this project in March, and I’ve just now finished it.

The bit of encouragement I needed was when hackaday.com put it up as their “Fail of the Week.” I thought I was going to be mature about it. But those four red letters started eating at me, so I gave it another go. Weirdly, I got it working.

I believe there were three problems:

  1. I had mixed up the op-amps again. In my defense, I’ve got 5 different ICs flying about in the same package as the op-amp.
  2. The Arduino I’d been plugging into was sitting on a surface that provided enough conductivity to create noise between the 3.3v pin on the underside and A0, which I was using as the op-amp input.
  3. Every time I touched the sensor the exposed vias were shorted through my own conductivity. Stupid mineral water.


I’ve already detailed how I went about making it, so I’ll try to stick to repeatability.

1. Order the parts.

  • Op-amp: $0.29 (Digi-Key)
  • Light Photo Sensor: $1.23 (Digi-Key)
  • LED: $0.79 (Digi-Key)
  • 0603 Schottky Diode: $0.50 (Digi-Key)
  • Passives: ~$2.50. Resistors: 1 x 470k, 1 x 12k, 2 x 100k, 1 x 10k, 1 x 3.3M. Capacitors: 3 x 4.7uF, 2 x 2.2uF
  • OSHPark Boards: **$0.67** (minimum 3 boards, costing $2.00; 2.00/3 ≈ $0.67)

Total (approximate): $5.98

  2. Make sure you have these tools.
  3. Solder the light-sensor.

The light sensor is the hardest bit, so take your time. I put a little bit of solder on each pad with my soldering iron, then cover the soldered pads in flux. Next, I align the light sensor with the pads as closely as possible. After that, I put the board backside-down on an overturned clothes iron and let the iron heat up until the solder reflows and the sensor is attached.

  4. Flip the sensor and lock it to your surface with tacky-putty to solder the LED, passives, and op-amp. I won’t detail this, since my video shows the entire process.

  5. Wrap it with tape, cutting a small hole for the LED and light-sensor. (I’ll come up with a better solution, and a way to clip it to your body, on the next iteration.)

  6. **Wire it up to the Arduino.**

  • Left: A0
  • Middle: 3.3v
  • Right: GND

  7. Run the Arduino and Processing sketches these amazing guys provided.

  8. Yell at me if you have problems.



UPDATE: 1/05/13

New angle. I finished my ATtiny Bitsy Spider (ABS) board and wanted to do something with it. While stringing it together I had thought of replacing the Arduino Pro Mini and the Servo Helper board with the ABS. Cost-wise, it will be slightly more expensive ($1.50 or so?) but much smaller and a lot less hassle.

I’ve read several people had mixed results getting an ATtiny to control servos. Of course, I’m no better. But I was able to get acceptable functionality out of them (i.e., controlling continuous rotation servo speed, direction, braking). Anyway, here’s kinda how I approached the servos on the ATtiny 85.

I found several blogs about getting servos to work on the ATtiny but ultimately I used the Servo8Bit library (note, for ease of use I’m linking the “Arduino version” below, not AVR).

It doesn’t seem real friendly, but in this hack’s opinion, it seems like great code that is incomplete. Hope someone corrects me if I’m off. The problem I had, and I believe others have had, was the library using Timer1 for servo timing. The Tiny cores (at least the ones I’m using) use Timer1 for basic functionality, creating a conflict. This presented itself as an inability to use the delay() function; it was simply as if it had no effect. That’s when I popped the hood on the library itself. In the header file there is an option for which timer to use, so I switched it from Timer1 to Timer0 and tried the code again. Great, delay() seemed to work now, but the ability to control the servos was gone. As soon as myServoA.attach(3) was called the servo would spin at full speed in one direction. Dammit.

I didn’t feel like digging through the rest of the library trying to debug something I only half understood. So, I began researching. After a bit, I came upon this thread. Seems this fellow WireJunky was trying to figure out how to do the same, control continuous rotation servos with an ATtiny. At the end Nick Gammon told him he should just create his own timer function.

Anyway, I hacked this code out after reading the thread and was surprised it did what I want. I’m a hack hacking with a hacksaw!

//Basic Jot movement using the ATtiny Bitsy Spider

#include "Servo8Bit.h"

void mydelay(uint16_t milliseconds);  //forward declaration of the delay replacement

Servo8Bit myServoA;  //create servo objects
Servo8Bit myServoB;

void setup() {
  myServoA.attach(3);  //attach servo A to pin PB3
  myServoB.attach(4);  //attach servo B to pin PB4 (not in the original snippet; added so B responds)
}

void loop() {
  myServoA.write(160);  //spin servo A one direction
  myServoB.write(50);   //spin servo B the other way
  mydelay(2000);        //wait 2 seconds

  myServoA.write(90);   //90 = stop on a continuous-rotation servo
  myServoB.write(90);
  mydelay(2000);        //wait 2 seconds

  myServoA.write(50);   //now reverse both
  myServoB.write(160);
  mydelay(5000);        //wait 5 seconds
}

//replacement for delay(), since the servo library now occupies Timer0
void mydelay(uint16_t milliseconds) {
  for (uint16_t i = 0; i < milliseconds; i++) {
    delayMicroseconds(1000);  //burn roughly 1 ms per pass
  }
}  //end mydelay

There are a few issues. It seems my B servo has some jitter in it. It doesn’t like to stop at myServoB.write(90). I tried calling myServoB.detach(), then myServoB.attach(3), in a hackish attempt to stop the servo. It’ll stop but won’t re-attach.

Anyway, even if troubleshooting doesn’t work out, I have some workarounds. For example, running the VCC for the servos through a P-channel MOSFET controlled by the ATtiny; it would take an extra pin but would allow me to accurately stop them. Though, I believe this lack of “centeredness” is due to either a cheap 0805 I used in the conversion or other noisy stuff I have on the PB4 pin.

Of course, to use the ABS as a replacement brain on the Jot, I’ll need to create a star-network with the ABSes, write a library to control the HMC5883L from the ATtiny, make sure there are no other timing issues, and fit it all in 8k of Flash. Ugh. Right now the code size is around 3k with the servo and serial libraries.

UPDATE: 12/24/13

Well, I don’t know what to say. I think I’m going to take a break from this build for a bit and focus on finishing the Overlord projects with the Dot Muncher.

I discovered what was causing my problems with the NRF24L01. It wasn’t the voltage-regulator. It was the 1uF 0805s filtering the regulator. I replaced the unknown capacitors (ones bought off of eBay) with some from Digi-Key that were rated 25v. This fixed the problem and I had the Jot communicating nicely as I had hoped.

Of course, that wasn’t the end of the problems. I discovered the HMC5883L board was shorting, I believe, every time I programmed the board. It’s pissing me off. I’ve burnt four compass boards and two Arduino Pros over the issue (smoked around $15 in parts). It has something to do with the HMC5883L’s I2C lines feeding backward through the board whenever the Arduino goes low. It causes the voltage regulator on the HMC5883L board to pop almost right away. Of course, it does slight damage to other parts of the connected boards. I didn’t know it at the time, but I believe this backward current was also what damaged the filtering capacitors.

Stuff Burnt on the Pyre of Stupidity:

That’s not the only issue. The code I’ve got together for the NRF24L01 doesn’t play nice with the HMC5883L library.

But I can’t tell how to rewrite the code in a way they’re both happy with while the f’in boards keep burning up. Sigh.

Nevertheless, I think my next step, when I’ve got my gusto back, will be to make a complete schematic of the Arduino Pro Mini, Little Helper Board, and the HMC5883L. I figure the first two I have schematics for already, and I have enough HMC5883L boards to pull the components off and reverse engineer the PCB.

Still, I’m a little depressed. I feel like I’d have been better off making the boards myself. At least then I would know exactly how they are strung together and could only blame myself for issues.

I also feel like Frits needs to put together a “Robotics Fail of the Week” so I can be its first highlight. Man, even looking at that picture now makes me feel like I suck at life. Oh well, I’m going to list the good stuff I’ve learned from this.

  1. Reverse current is a bitch–low-drop diodes are your friends.
  2. I have put together code that makes the NRF24L01 have closer to Bluetooth functionality. Though, it doesn’t like being in the same code as the Wire library.
  3. Cheap parts require you to be time-rich.
  4. The NRF24L01 isn’t really meant for streaming data. I knew this to begin with, but I didn’t understand how it really plays out in the code. The NRF takes a lot of code management, unlike devices that are hardware-managed or a full SoC. This makes the NRF highly sensitive to whatever else your code is doing; in my case, running servos, communicating over I2C, and doing floating-point math. As I progress in this build, I feel I’m taxing the NRF’s functionality beyond its ability.
  5. It is better to learn the circuits of all boards connected to yours. It might initially take more time, but in the end save time and money.
  6. If I fail at something, although looking ridiculous is not fun, documenting the failure makes me feel better. Like it meant something. Even if that something is, “Hey world, I’m an idiot.” :)

UPDATE: A Jot of Trouble

I didn’t want to float this post until I have something working to update, but I missed writing for a change. I’ve been working on the little Jot frequently over the last few months. Yet, I keep running into problems. The NRF24L01s are acting silly. One day they work, another they don’t. I guess we can’t make woman jokes now that Roxanna77 is here? (If you read this Roxanna, just wanted you to know I had to make sure my wife didn’t read this post, it’d been hell in my house).

I have reworked the servo board (v.9.5) to include a double 90-degree header. One set is to attach the servos, the other is to attach the compass (HMC5883L). This was meant to make the hardware more compact and modular, and to keep the compass level for a better reading. Oh yeah, and I burnt a few HMC5883Ls trying to connect them with crappy braided wires.

Also, I’ve added solder-jumpers to tie the 3.3v SOT-23-5 voltage regulator’s enable pin to either high or low, depending on which one I mistakenly buy.

On the top side I’ve included an SMD voltage divider running straight into analog pin A3. My intention is to allow the Jot to keep an eye on its battery voltage as a way of sensing how “hungry” it is.
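For what it’s worth, the divider math can be sanity-checked off-board. A minimal sketch, assuming an equal 10k/10k divider and a 10-bit ADC against a 3.3v reference (my numbers, not necessarily what’s on the board):

```python
# Estimate battery voltage from an ADC reading taken behind a two-resistor divider.
# Assumed values: R1 = 10k (top), R2 = 10k (bottom), 10-bit ADC, 3.3 V reference.

def battery_voltage(adc_reading, r1=10000.0, r2=10000.0, vref=3.3, adc_max=1023):
    v_at_pin = (adc_reading / adc_max) * vref  # voltage actually seen at A3
    return v_at_pin * (r1 + r2) / r2           # undo the divider to get battery volts

# A full-scale reading through an equal divider implies twice the reference:
print(round(battery_voltage(1023), 2))  # 6.6
```

With an equal divider the pin sees half the battery, so the usable range tops out at twice the ADC reference; pick R1/R2 to suit the pack voltage.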

I’ve added a 3.3v pin on the new double 90-header, in case I’ve a 3.3v sensor elsewhere on the bot. I dunno, I was trying to use all the extra pins I had.

Of course, since I’ve learned how to tent vias, I’ve also tented the vias on the board with hope I’ll save myself a fateful short or two.

I’ll eventually try to replace those bulky headers with what I’ve affectionately begun to refer to as “those short, round headers.” I like these little headers because of how utterly small they are. The catch is that the bulk of their body does not make it through thicker PCBs, because the pin flares out close to the plastic header. This flare prevents the short-rounds from sinking through the typical header hole on most boards.

But, I’ve got calipers, Eagle CAD, and OSHPark, so I made a little library of header holes that will allow these pins to slip neatly through the board and mate with short-rounds on the other side. I sent off to OSHPark for a sample, so I’ll report back when I’ve tested them for effect.

Now, for what has kept me from moving forward with this little damn bot: A cheap voltage regulator.

On my original version of the servo board (by the way, I refer to it as the Little Warmie Helper board, or LWH board) I had used a different voltage regulator that cost more. The only difference I found between the two was the output current, the first putting out 200mA and the second 150mA. I really didn’t think this made a difference, given what I could find in the datasheet. I know the passives affect the power consumption, but it’s the only info I could find (datasheet, pg. 8). The NRF24L01 was using around 11.3mA as a transmitter and 13.5mA as a receiver. Even though I didn’t know the power consumption of the passives, I believed I was well within range to use the cheap 150mA voltage regulator. But experience has proven otherwise.

This is where I ask the professionals to come in and tease me about missing something simple.

The only theory I could invent, given my limited understanding of electronics, is that the NRF24L01’s 11.3/13.5mA figures are averages, but its burst draw exceeds the constant 150mA the cheap regulator can deliver? I don’t know. I’m at a loss.
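A quick back-of-envelope at least makes the burst theory plausible: during a burst the filter cap has to make up whatever the regulator can’t supply, and the droop follows dV = I·dt/C. The numbers below are purely illustrative guesses, not measurements from the LWH board:

```python
# Voltage droop on a filter capacitor during a current burst: dV = I * dt / C.
# Illustrative numbers only: a 150 mA shortfall for 1 ms on a 4.7 uF cap.

def droop_volts(shortfall_amps, burst_seconds, cap_farads):
    return shortfall_amps * burst_seconds / cap_farads

# ~32 V of "droop" means the 3.3 V rail simply collapses: a small filter cap
# cannot cover any meaningful burst the regulator fails to supply.
print(round(droop_volts(0.150, 0.001, 4.7e-6), 1))  # 31.9
```

So if the radio’s transient draw ever tops the regulator’s limit, even briefly, the rail sags and the NRF browns out, which would match the flaky behavior.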

Of course, this is pure speculation. I’m currently out of higher-output voltage regulators (I should have some more by the end of the week). But I can leave the NRF24L01 in place on my LWH board, solder jumpers onto the 3.3v and GND pins, and get the NRF24L01 to function properly. This makes me believe the fault lies directly with the inadequacies of the voltage regulator and not my board design (though its inadequacies, I’m sure, are glaring).

Anyways, this is where I am with the little Jot.

A couple of notes. I have a backup design of the Jot that I’m working to get cheaper than $25, which uses BLE (yes, those HM-10s I’m in a love affair with). Also, I decided that if I was to get serious about the Overlord projects I’d probably do better turning it into a Python module, which I’ve been doing in silence and is around 90% done. I’ll try to have it up before the end of the year. I need to finish several functions.

UPDATE: Progress on NRF24L01 code for working between PC, Hub, and Robot.

So, here is my attempt at a swarmie build. Not much here yet, simply a personal build log until I get an iteration cheap enough, then, I’ll start incorporating them into the Overlord projects.

I have to bow to Bajdi; those little NRF24L01s take a lot more brainpower than simple ole’ Bluetooth. I tried for some time to write my own code that would send and receive bytes to or from the other node. After a little hair pulling I gave up and started reading others’ code. I came across Robvio on the Arduino Forums, who had some rather nifty code that I left nearly intact.

#include <SPI.h>
#include "nRF24L01.h"
#include "RF24.h"

RF24 radio(8,7);  //CE on pin 8, CSN on pin 7

// Radio pipe addresses for the 2 nodes to communicate.
const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

//for Serial input
String inputString = "";         // a string to hold incoming data
boolean stringComplete = false;  // whether the string is complete

//NRF packages
byte SendPackage[32];
byte ReceivePackage[32];

void setup(void){
  Serial.begin(9600);

  radio.begin();
  // optionally, increase the delay between retries & # of retries
  radio.setRetries(15,15);

  radio.openWritingPipe(pipes[0]);
  radio.openReadingPipe(1, pipes[1]);
  radio.startListening();
}

void loop(void){
  NRFreceive();     //check for NRF received
  Serialreceive();  //check for Serial received (or filled by NRF)
}

void serialEvent() {
  while (Serial.available()) {
    char inChar = (char)Serial.read();
    inputString += inChar;
    if (inChar == '\n') {
      stringComplete = true;
    }
  }
}

void NRFsend(String NRFPack){
  NRFPack.getBytes(SendPackage, 32);

  radio.stopListening();
  bool ok = radio.write(SendPackage, sizeof(SendPackage));
  if (!ok) Serial.println("NRFerror");
  radio.startListening();

  //wait for the receiver's acknowledgement byte, with a 200 ms timeout
  unsigned long started_waiting_at = millis();
  bool timeout = false;
  while ( !radio.available() && !timeout )
    if (millis() - started_waiting_at > 200)
      timeout = true;
  if (timeout) Serial.println("NRFtimeout");
}

void NRFreceive(){
  if ( radio.available() ){
    bool done = false;
    while (!done)
      done = radio.read( &ReceivePackage, sizeof(ReceivePackage) );

    inputString = ((char *)ReceivePackage);
    stringComplete = true;

    //send back a one-byte acknowledgement
    radio.stopListening();
    radio.write("1", 1);
    radio.startListening();
  }
}

void Serialreceive(){
  if (stringComplete) {
    if (inputString.startsWith("T:")) {
      NRFsend(inputString.substring(2));  //transmit the remainder over the NRF
    }
    if (inputString.startsWith("S:")) {
      Serial.print(inputString.substring(2));  //print the remainder locally
    }
    inputString = "";
    stringComplete = false;
  }
}
The way this code works is much like a Bluetooth module simulated in software over serial.

To send data it goes like this: you type something with a prefix code, T for transmit and S for serial print, ending with a newline character (\n).

For example, typing the following in the terminal on module A:

  • T:S: My message \n

Will send “My message” to module B, which then prints “My message” to the serial line on module B.

If you type,

  • T: My message \n

This will transmit “My message” from module A to module B, but it will not be printed to the serial line on module B.
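The framing rules above are easy to pin down as a couple of host-side helpers (the function names are mine, purely illustrative; the Arduino sketch does the same parsing with startsWith()):

```python
# Frame and parse lines for the T:/S: prefix protocol described above.
# These helper names are mine, not part of the original sketch.

def frame(message, transmit=True, remote_print=True):
    """Build a terminal line: T: transmits over the NRF, S: prints on the far side."""
    prefix = ("T:" if transmit else "") + ("S:" if remote_print else "")
    return prefix + message + "\n"

def parse(line):
    """Return (should_transmit, should_print, payload) for one received line."""
    line = line.rstrip("\n")
    transmit = line.startswith("T:")
    if transmit:
        line = line[2:]           # strip the T: prefix
    remote_print = line.startswith("S:")
    if remote_print:
        line = line[2:]           # strip the S: prefix
    return transmit, remote_print, line
```

So `frame(" My message ")` yields `"T:S: My message \n"`, exactly the first example above, and `parse` recovers the two flags plus the payload on the receiving side.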

I’ll let you guys look the code over and tell me if I can improve it for what I’m doing. Right now, I’ve tested it with some basic Python code to send a serial message to my hub (Arduino Uno and NRF24L01), which relays it to the robot (Arduino Pro Mini and NRF24L01).

Public Tinkercad Design

Cost to build

  1. Tower Micro 9g Servo x 2: $5.22
  2. Ball Caster 1/2” Metal x 1: $3.65
  3. Funduino (early Arduino Pro Mini): $4.89
  4. AAA x 4: $1.44
  5. NRF24L01 x 1: $1.31
  6. Compass (HMC5883L): $2.37
  7. 2-56 Threaded 2” Stud x 2: $1.00
  8. 2-56 1 1/2” Screw x 2: $.17
  9. 2-56 Hex Nut x 6: $.23
  10. AAA Battery Case w/ Switch x 1: $1.05
  11. Helper Board: $1.53
  12. SOT-23-5, 3.3v, .30mA LDO Voltage Regulator x 1: $.57
  13. 1uF 0805 ceramic capacitor x 2: $.20
  14. 0805 4.7k resistor x 2: $.03
  15. 0805 330-860 ohm resistor x 1: $.03
  16. 0603 LED (red, green, yellow) x 1: $.09
  17. Right Angle header x 8: $.05
  18. Straight Header x 26: $.08

Total (approximate): $23.95


Designing the build in Tinkercad:

Converting Tower Pro 9g Servo to Full Rotation for Motors:

Cutting Out the Build:

Putting the Pieces Together:

Making the Little Warmie Helper:



NOTE: Try as I might, guys, I can’t get the numbers to line up in my HTML version of my code. Instead, you might just load it into Geany or Notepad++ to follow along, since I indicate things by line number. I’m sorry, I’m out of patience for it.

These are redneck instructions on how to control a robot with a static webcam for under 50 USD.

I’m a robot builder and I got tired of seeing universities play robot soccer or something with computer vision guiding their players, and no matter how much I begged, darn Ivy Leagues wouldn’t share.

So, I wrote my own. And while I did it, I swore I’d try to help anyone trying something similar.

Overlord and Dot Muncher

So, here’s an overview of how the code works:

Red Hatted Robot

  1. Webcam sends images of its field-of-view.
  2. OpenCV looks for the largest red blob.
  3. It begins tracking the red blob’s X, Y.
  4. The PC averages these X, Y positions for around 150 camera frames.
  5. If the blob hasn’t moved much, the PC assumes the red blob is the robot.
  6. The PC gets frisky and gives our robot a random target within the webcam’s field-of-view.
  7. The PC calculates the angle between the bot and the target.
  8. Meanwhile, the robot’s microcontroller is taking readings from a magnetometer on the robot.
  9. The robot, with a one time human calibration, translates true North to “video-game north,” aka, top of PC’s screen.
  10. The microcontroller transmits this heading to the PC.
  11. The PC compares the angle from the bot to the target with the robot’s heading.
  12. The PC sends a code to the bot telling it to turn left, right, or move forward (closer to the target).
  13. When the robot has made it within an acceptable distance from the target he “Munches the Dot.”
  14. A new random dot appears. Rinse repeat. (For real though, don’t rinse the bot. Consider Asimov’s Third Law.)
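Steps 7 through 12 boil down to comparing two angles. Here is a minimal sketch of that decision in Python (the function name, thresholds, and coordinate conventions are my assumptions, not the project’s actual code):

```python
import math

def steer(bot_xy, target_xy, bot_heading_deg, close_px=20, tolerance_deg=15):
    """Decide 'left', 'right', 'forward', or 'stop' from positions and heading.

    Angles use "video-game north": 0 degrees points to the top of the screen,
    increasing clockwise. Screen y grows downward, as OpenCV pixel coords do.
    """
    dx = target_xy[0] - bot_xy[0]
    dy = target_xy[1] - bot_xy[1]
    if math.hypot(dx, dy) < close_px:
        return "stop"  # close enough: the dot gets munched

    # Angle from bot to target in screen coordinates (0 = up, clockwise).
    target_angle = math.degrees(math.atan2(dx, -dy)) % 360

    # Smallest signed difference between target angle and current heading.
    diff = (target_angle - bot_heading_deg + 180) % 360 - 180
    if abs(diff) <= tolerance_deg:
        return "forward"
    return "right" if diff > 0 else "left"
```

Anything within the tolerance drives forward; otherwise the sign of the wrapped difference picks the turn direction, which is all the PC ever has to transmit.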

About Me: (skip, it’s boring)

I’m a homeless outreach worker. The job’s amazing, but I’ll say, emotionally taxing. Skipping the politics and the sermon on harm-reduction, I decided at the start I needed something far from the job to allow my mind rest and prevent compassion fatigue. Something that consumed my brain-power so I’d not be stressing over the six-months-pregnant 17-year-old shooting up under a bridge on I-35. Something to protect my down-time so I’d be frosty for the next day.

Well, I saw that TED talk about the Sand Flea and I told Bek, “That’s awesome, think I could build one?” “Oh crap,” she said, “new obsession?”

Now, robots are my relief. My way to prevent white-matter from becoming dark-matter as I rake through sludge looking for those who want out.

I started reading a lot. I discovered, Arduino, Sparkfun, eBay, Raspberry Pi, ferric chloride, Python, hackaday, HC-SR04, Eagle, OSHPark, and the list goes on. But every time I Googled something about robots, I’d end up at the same place.


These guys are brilliant. They are a college education from collaboration, I swear.

Soon, I ended up with my first bot. A piece of sh…short-circuits. Although, I did learn a lot interfacing the bot with the Raspberry Pi. Also, while I was working with a Raspberry Pi, I played with OpenCV, and was considering adding a face tracker to my bot before I got distracted. But before I quit, I created a proof-of-concept.

So, all these experiences began to culminate.

Meanwhile, I was taking a graduate Research Methods class at UTA and my professor disappeared. The university was amazing; good professors filled in and made sure our education didn’t suffer. But we wondered for many months. Sadly, it was discovered he had killed himself.

It shook me. I deal with suicidality every other day, but it’s usually on the street. Why a successful research professor? My thoughts got dark for a bit, which meant I sunk into robots even more. Yet, now, a question sat at the front of my mind: Will robots one day kill themselves?

This may sound silly. But I believe the formula for self-termination can be expressed in Boolean logic, and therefore coded.

Pseudo-code would be:

if painOfExistence > senseOfPurpose then:
    selfTerminate()

Derived from work and life experience, I genuinely believe the root motive for suicide is existential anxiety, which seems to me entangled within both constructs.

Ok. Skipping the _Time_ bit.

Someday, I’d like to delve into swarm robotics. Or, at least, attempt to replicate organic group behavior within a robot group. And I thought it might be possible to control a group of robots with a setup similar to those universities or research groups keep showing off. (Jockish Ivy Leagues :P)

Well, I found these desires, information, and tools synergized into a passion. After two days, I was able to write a basic OpenCV Python script that could control a robot using a static webcam looking down on it. Let me clarify: I’m of average intelligence, simply obsessive, so when I mention “two days” I’m trying to convey the utter feasibility of this project, for anyone. Python, Arduino, and OpenCV make it so very easy; any idiot like me can hack it out.

Of course, my purpose for this platform is to control robot groups, the group being the second social collection (one to eight). Social interaction seems to be essential in developing a positronic brain, and a white-mattered brain is necessary for me to test the above-mentioned self-termination formula. So, maybe, I’ll learn if robots will commit suicide, or perhaps have a better understanding of why humans do.

Dark and depressing! I know, right? Who writes this crap!?

A robot

It doesn’t matter what sort of robot you use, it only needs:

  1. A microcontroller (e.g., Arduino, PicAxe, etc.)
  2. Built from material of a bold, solid color.
  3. The ability to change directions and move.
  4. A magnetometer. I used the HMC5883L. They’re like 2 USD on eBay.
  5. A wireless serial connection. Bluetooth, XBee, and nRF24L01 would be my recommendations, since all are well documented for creating a bridge between PC and microcontroller.

I personally built my own using red cutting-board I stole from Bek (shh). For my serial connection I used two $10 Bluetooth 4.0 modules, which I’ve written an instructable on setting up a Bluetooth 4.0 module to work with an Arduino and PC: Bluetooth 4.0 and Arduino.


A PC

Probably something less than 10 years old. It could be running Linux or Windows; though, I’ll be using Windows Vista (hey, I’m first-world poor and can’t afford Windows 7 :P).

  1. The PC will need to be running Python 2.7
  2. It’ll need OpenCV 2.4.4
  3. It will need a wireless serial connection that pairs with your bot. Again, I used my BT 4.0 modules.

A Webcam

It’s really up to you. I’m not going to lie, I went with the cheapest webcam I saw, which cost 6.87 USD. But I would _not_ recommend this webcam. It didn’t like my PC, so every time my Python script stopped I had to unplug the webcam and plug it back in. A real annoyance for debugging.

  1. I’d suggest a high-resolution webcam. Maybe even an IP cam, if you’re rich? If you are, would you buy me one too?
  2. Long male-to-female USB cable. Again, I got two 15’ USB cables on eBay for around 4.50 USD. If you get everything set up and you notice problems with the webcam at the end of the cable, you can put a powered hub at the end of the cable with an extension cord and it’ll take care of the issue. Though, I didn’t have this problem at 15’.
  3. A wife that’ll let you screw your webcam into the ceiling. Or…don’t ask…

So, I made my robot, Dot Muncher, using an Arduino Uno, Motor Shield, and a Bluetooth 4.0 module. The chassis was made from HDPE, a cutting board I stole from my wife. The motors and tires were from eBay.

Now, about any robot will work, like I’ve stated, so Google away and select a robot build you like.

Of course, everything you’d ever want to know can be found on this site :)

I’m just sayin’.

But the code, that’s the part we want to focus on. Really, our robot only has nerves and muscles; the brain will actually be in the PC. All the robot does is:

  1. Calculates the compass info.
  2. Sends the compass info to the PC.
  3. Reads the movement codes from the PC.
  4. Translates the movement code received into a motor activation.

That’s it. Pretty simple.

//I've been using Zombie_3_6_RC in Processing to interact.

// Reference the I2C Library
#include <Wire.h>
// Reference the HMC5883L Compass Library
#include <HMC5883L.h>

// Store our compass as a variable.
HMC5883L compass;

// Record any errors that may occur in the compass.
int error = 0;

//int pwm_a = 10; //PWM control for motor outputs 1 and 2 is on digital pin 10
int pwm_a = 3;  //PWM control for motor outputs 1 and 2 is on digital pin 3
int pwm_b = 11;  //PWM control for motor outputs 3 and 4 is on digital pin 11
int dir_a = 12;  //dir control for motor outputs 1 and 2 is on digital pin 12
int dir_b = 13;  //dir control for motor outputs 3 and 4 is on digital pin 13

int lowspeed = 120;
int highspeed = 140;

//Distance away
int distance;

//Sets the duration each keystroke captures the motors.
int keyDuration = 10;

int iComp;

void setup() {
  Wire.begin(); // Start the I2C interface.

  // initialize serial communication:
  Serial.begin(9600);

  Serial.println("Constructing new HMC5883L");
  compass = HMC5883L(); // Construct a new HMC5883L compass.

  Serial.println("Setting scale to +/- 1.3 Ga");
  error = compass.SetScale(1.3); // Set the scale of the compass.
  error = compass.SetMeasurementMode(Measurement_Continuous); // Set the measurement mode to Continuous.

  pinMode(pwm_a, OUTPUT);  //Set control pins to be outputs.
  pinMode(pwm_b, OUTPUT);
  pinMode(dir_a, OUTPUT);
  pinMode(dir_b, OUTPUT);

  analogWrite(pwm_a, 0);  //Start with both motors stopped.
  analogWrite(pwm_b, 0);

  pinMode(2, OUTPUT);    //attach pin 2 to vcc
  digitalWrite(2, HIGH);
  pinMode(5, OUTPUT);    //attach pin 5 to GND
  digitalWrite(5, LOW);
}

void loop() {
  // Retrieve the raw values from the compass (not scaled).
  MagnetometerRaw raw = compass.ReadRawAxis();

  // Retrieve the scaled values from the compass (scaled to the configured scale).
  MagnetometerScaled scaled = compass.ReadScaledAxis();

  // Values are accessed like so:
  int MilliGauss_OnThe_XAxis = scaled.XAxis; // (or YAxis, or ZAxis)

  // Calculate heading when the magnetometer is level, then correct for signs of axis.
  float heading = atan2(scaled.YAxis, scaled.XAxis);

  // Once you have your heading, you must then add your 'Declination Angle',
  // which is the 'Error' of the magnetic field in your location.
  // Find yours here: http://www.magnetic-declination.com/
  // Mine is 2°37' W, which is 2.617 degrees, or (which we need) 0.0456752665 radians; I will use 0.0457.
  // If you cannot find your declination, comment out these two lines; your compass will be slightly off.
  float declinationAngle = 0.0457;
  heading += declinationAngle;

  // Correct for when signs are reversed.
  if (heading < 0)
    heading += 2*PI;

  // Check for wrap due to addition of declination.
  if (heading > 2*PI)
    heading -= 2*PI;

  // Convert radians to degrees for readability.
  float headingDegrees = heading * 180/M_PI;

  // Normally we would delay the application by 66ms to allow the loop
  // to run at 15Hz (default bandwidth for the HMC5883L).
  // However since we have a long serial out (104ms at 9600) we will let
  // it run at its natural speed.
  // delay(66);

  //This throttles how much data is sent to the Python code:
  //the heading only goes out once every 30 passes through loop().
  if (iComp >= 30) {

    int adjHeading = 0;
    //floor() drops the fractional part of the heading.
    headingDegrees = floor(headingDegrees);

    //Remap true north so "video-game north" (top of the screen) reads as 0.
    if (headingDegrees >= 280) {
      adjHeading = map(headingDegrees, 280, 360, 0, 79);
    }
    else if (headingDegrees <= 279) {
      adjHeading = map(headingDegrees, 0, 279, 80, 360);
    }

    Serial.println(adjHeading);  //Send the robot's heading to the PC.
    iComp = 0;
  }
  iComp++;

  delay(10); //For serial stability.

  int val = Serial.read() - '0';

  //The val-to-movement pairing below is reconstructed; the original bodies were lost.
  if (val == 1)
    Forward();
  else if (val == 2)
    Back();
  else if (val == 3)
    Left();
  else if (val == 4)
    Right();
  else if (val == 5)
    Stop();
}

void Back(){
  //Straight back
  analogWrite(pwm_a, highspeed);
  analogWrite(pwm_b, highspeed);

  digitalWrite(dir_a, HIGH);  //Reverse motor direction, 1 high, 2 low
  digitalWrite(dir_b, LOW);   //Reverse motor direction, 3 low, 4 high
}

void Left(){
  analogWrite(pwm_a, lowspeed);
  analogWrite(pwm_b, lowspeed);

  digitalWrite(dir_a, HIGH);  //Run the motors opposite ways to pivot
  digitalWrite(dir_b, HIGH);
}

void Right(){
  analogWrite(pwm_a, lowspeed);
  analogWrite(pwm_b, lowspeed);

  digitalWrite(dir_a, LOW);  //Pivot the other way
  digitalWrite(dir_b, LOW);
}

void Forward(){
  //set both motors to run fast
  analogWrite(pwm_a, highspeed);
  analogWrite(pwm_b, highspeed);

  //Straight forward
  digitalWrite(dir_a, LOW);   //Set motor direction, 1 low, 2 high
  digitalWrite(dir_b, HIGH);  //Set motor direction, 3 high, 4 low
}

void Stop(){
  //cut PWM to both motors
  analogWrite(pwm_a, 0);
  analogWrite(pwm_b, 0);

  digitalWrite(dir_a, LOW);
  digitalWrite(dir_b, HIGH);
}

The first bit of robot code I’d like to focus on is the compass. Now, I’ve not detailed how to use the HMC5883L, since SparkFun has done this for me. I also won’t go into tilt-compensation, since I was more worried about proving the concept here than dead-on accuracy. But if you’re a smart cookie and would like to take that challenge, feel free. Just be sure to share the code with us all when you’re done :P

No. Instead, I want to focus on adjusting the compass heading from a value relative to true North to what we want it to think is north: in our case, whatever is at the top of our screen. This process takes a little involvement, since the numbers must be set manually and with a little guesstimation.

See code above.

So, I got my compass module lying as flat as possible and then bolted it to my robot. This helps ensure you’re getting a full 360º and will keep you from having to re-calibrate what we’d like to call north every time the compass module gets knocked out of place.

106-114: These modules and the Arduino library are both designed to have 0º be North, but we want to set our own north: video-game north. Which is exactly what lines 106-114 are about. I found 80º was the value my robot was reading when he was headed towards the top of the screen. I had to find a way to adjust this to give me a reading of 0º. I ended up with this simple code to spin the compass.

I had to divide the adjustment into two sections for the math to stay simple. Lines 109-111 map headings of 280-360º onto 0-79º; lines 112-114 do the same for headings of 0-279º, converting them to 80-360º.
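If you want to play with the remapping off the robot, here’s a minimal Python sketch of the same math. It mirrors my Arduino snippet above as-is (bug and all), with Arduino’s map() helper re-implemented:

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    # Integer re-implementation of Arduino's map() helper.
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

def adjust_heading(heading_degrees):
    # Rotate the compass so the real-world reading we call "video-game north"
    # comes out at the top of the screen; everything else shifts accordingly.
    if heading_degrees >= 280:
        return arduino_map(heading_degrees, 280, 360, 0, 79)
    return arduino_map(heading_degrees, 0, 279, 80, 360)
```

Feeding it a few headings shows the two ranges being swapped around, e.g. 280º comes out as 0º and 0º comes out as 80º.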

Honestly, spatial reasoning isn’t my strong suit, so I have a hard time thinking through this; I just know it works. So, if you have problems, I’ll answer emails and Skypes and we can work through it together. And, if you want to submit a better explanation, I’ll post it and be sure to give you credit.

Do know, my redneck solution was to change the orientation of the camera. Pfft. Too easy.

Moving on,

116: Sends the robot’s heading to the PC.

117: iComp is a variable allowing us to decide when to start sending data to the PC. We don’t want to send data before the PC is ready or before the robot is warmed up; otherwise we’d be dealing with poor readings.

118: This is a delay that makes sure we are not bogging down the serial line, since every time we call Serial.println(“whatever”) both the PC and the robot have to spend some processing power to deal with it. In short, it makes sure the robot is not talking the computer’s ear off.

See code above.

This bit is pretty easy. It reads the codes being sent from the PC and translates them into a function call. I write all my robot-PC interactions this way, since if I want a code to mean something completely different (for instance, if I wanted to swap the robot’s right and left movements) I’d just swap lines 134 and 144.
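The PC side of that scheme can be sketched like this. The single-character codes here are my assumption, except “3” for forward, which I confirm later in the Python walkthrough; remapping a movement is just editing the dictionary:

```python
# Hypothetical PC-side command table for the robot's one-character protocol.
COMMANDS = {"left": "1", "right": "2", "forward": "3", "back": "4", "stop": "5"}

def command_byte(name):
    # Look up a movement name and return the byte we'd write to the serial port.
    return COMMANDS[name].encode("ascii")
```

So ser.write(command_byte("forward")) would send the forward code, and swapping the "left" and "right" values swaps the robot’s turns without touching the Arduino.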


125: If I remember correctly, this line reads serial data being sent from the PC and ensures the val variable isn’t getting a bunch of zeros.

Easy one.

This is one of the functions called to make the motor move, or in the case of this function, stop.

188-189: This tells the pins on the Arduino, specified by the variables pwm_a and pwm_b, to decrease to 0. This effectively stops our robot.

192-193: This bit tells the motors which direction to turn. The pins (dir_a and dir_b) are set either HIGH or LOW, and this changes the direction the motor moves.

Tell you what, my good friend ChickenParmi explains it better here

See code above.

Now that we have our little robot set up, let’s set up our Python environment.

I’m going to use Python 2.7 (I’ve just found later versions piss me off).

Python 2.7 download

For Windows, use the MSI installer respective to your architecture, either x86 or x64. Of course, Linux and Mac versions are there as well. Go ahead and install Python 2.7, but I’m not a fan of their IDE. Instead, I use:


Though, this IDE is a little tricky to get running on Windows, since it’s meant for Linux. These posts over at Stack Overflow go through some popular Windows Python IDEs. Pick what you feel comfortable in. I suggest running ‘Hello World’ in each until you decide you like one.

Here we are, the hardest part of this whole project; if we’re not careful, we fall into dependency hell.

I’m going to try and help you setup all the modules needed to run the Python code. It’s been difficult for me to do this right, so I’ll try to be descriptive.

There are seven modules we will use.

  1. OpenCV (which we’ll call cv2).
  2. Numpy
  3. Serial
  4. Time
  5. Math
  6. Random
  7. Threading

Of these, we will need to install OpenCV, Numpy, and Serial, since the rest come built into Python 2.7.

The main trick with any module you install in Python is to make sure the exact path you install it to gets added to your PATH environment variable (this is true for both Windows and Linux).

To explain this I’m going to hand it over to Lovely Ada as she tells us how to install the Serial module:

pySerial installation

Note the bit about adding the environment variable, since none of the other modules will explain this, but each will need to be there.

Now, let’s try OpenCV and Numpy. My favorite installation guide (meaning it worked for me) was written by Abid Rahman:

OpenCV 2.4.4 installation

At this point, you might bring up Python and try some simple webcam capture test code (if you have problems copying and pasting, I’ve added web capture code as an attachment as well):

See code above.

If you see a live feed from your webcam, you’re almost good to go.

If there are any problems, like I said: you and me, buddy. Feel free to ask questions here or Skype me: thomas_ladvien

Okay. Here’s all the Python code in one go. Don’t be scared if this looks confusing. I feel the same way. In fact, some of it I still don’t understand. (Hey, honesty is a rare fault I seem to possess.) Again, don’t worry, we’re going to walk through it one section at a time, you and me, buddy. Until the end.

On the flip side, if you are a Python guru, or yanno, just a sassy-pants: Feel free to add corrections and comments on this page. I’d love to make this code grow through critique. Do know, I guarantee the following: Typos, grammar problems, illogical coding, artifacts from debugging, and the like. But don’t worry, I’m thick skinned and usually wear my big-boy panties.

I should state, the basic code for color tracking was written by Abid Rahman in a reply on Stack Overflow.

Also, I’ve included the code as an attachment, it’s at the bottom. Video-game south.

See code above.

Ok. The beginning.

So lines 3-10 pull in the modules we will need. My take on a module is the following, “Code some smart guy wrote and doesn’t want anymore, so he gave it to me to use.”

To be specific:

  • cv2 is the OpenCV module.
  • Numpy, which we’ll call “np” throughout the code, provides the higher math functions OpenCV needs to do her magic.
  • Serial is the module which will allow us to establish a serial connection between the PC and the robot, via whichever wireless device you’ve chosen.
  • Time allows us to basically idle the code. This is important in controlling many things, for instance, how far the robot moves. We tell the motors to turn on, wait 10 secs, then turn off. Because the sleep function actually puts the code into an idle state, we must have the threading module, since our code requires the PC to do several things at once.
  • Math. From the math module we get the code to help us simplify the trigonometry calculations, like the angle between the robot and target.
  • The random module is only used to give us a random target.
  • Threading. Important module. Basically, threading allows the computer to do several tasks at the same time. This becomes important when we are both trying to track the robot and receive his position. Throughout this code we will have three threads:
    1. The thread running the OpenCV stuff. This tracks the robot and is also the largest.
    2. A thread controlling the serial connection between the robot and PC.
    3. And a thread with the small job of telling the motors how long to be on, thereby controlling how far the robot will move.

See code above.

13: This is where we actually open a serial connection to the wireless device you are using. Note, we’ve named the serial connection we opened “ser”, so when we go to send information it will be something like ser.write(“What you want to send here”).

15-38: Here we declare a bunch of variables. The “global variable” statement lets the code know that this variable is going to jump between all threads. Next, the variable = 0 actually declares the variable. Do know, you’ll have to remind each thread a variable is global by stating “global variable” again.

One thing I should state: iFrame = 0 is an actual variable declaration, as well as setting it to 0. Of course, this is how one would declare an integer variable with an initial value of 0. On the flip side, rx = “ “ is also a variable declaration, but this time a string. You’ll know I switched information from an integer to a string if you see something like this:

headingDeg = str(intHeadingDeg)

That tells the code, “I want to convert the value in intHeadingDeg, which is an integer, into a string and call it headingDeg.”

The comments indicate what each variable is meant for. Not going to lie: I may have some declared variables I meant to use, didn’t, and forgot to remove.

One important variable is iFrame, since it tracks which frame we are on. This becomes key in all aspects of tracking our robot.

See code above.

42: Here we start the function that does most of the work, OpenCV():. It is one of the functions that will be threaded at lines 345-347.

44: We open up the webcam and give it the nickname cap. If I remember right, the “0” in the parentheses refers to whichever camera comes first on your USB bus, so if you have more than one camera you can specify by changing this number, e.g., cap = cv2.VideoCapture(3). Notice we called the OpenCV module cv2, so we are using the OpenCV module to access the webcam.

46-52: Just making the variables we declared work within this function. This might not be needed, but hey, I don’t read the whole Python manual.

55: This is just a string flag that is flipped to tell the PC to generate a new target for the robot. Note, we initially set it to “Yes”, meaning the first time we run through this function a target needs to be generated.

58: This is an integer variable to count how many dots the robot has “ate.”

Ok, before I get to the next bit I need to take a minute and explain how we approach actually getting the coordinates of our robot. As you know, OpenCV does the hard work for us, giving us the X and Y coordinates of the largest red blob on the screen. Though, the coordinates it gives us are the center of the mass. Now, this is all just a logical guess because I didn’t read the whole OpenCV manual, but I believe the point that refers to the center of this mass is called the centroid.

This might seem simple. That’s because it is; I’m not sure why we don’t just call it the damn center or something. Eh, oh well. Though, it will become important when we do collision detection between the robot and its target.

61-62: All that to say, the “c” in cyAvg and cxAvg stands for centroid. So, these are variables that will hold the running average of the X and Y coordinates of the red blob’s centroid.

65-66: These are back-up copies of cxAvg and cyAvg, and they will be important around lines 122-127, when we are trying to decide if the color we are tracking is actually the robot or some other piece of junk with enough red in it to fool OpenCV.

69: This simply clears the string variable holding data that came from the robot, like the robot’s heading, before another frame starts.

See code above.

71: Creates a loop within the OpenCV() function.

73-81: Ok, I need to be humble here and say I’m not sure what in Cthulhu’s kitchen I was doing. I know printRx = str(intRx) is taking the information received from the robot and converting it into a string. intRx is a global variable and it is loaded with robot data at line 326. headingDeg = printRx is moving the heading data from one variable to another; the idea here was that if I wanted more information to come from the robot besides the compass heading, it would come in through printRx, and then I could chop it up and load it into variables respective to their purpose.

For instance, printRx.split(“,”) should give a list of strings based on how many commas are currently held within printRx:

printRx = “2, 23, 88”
compass, sonar, battery_life = printRx.split(“,”)

This would leave compass = “2”, sonar = “ 23”, and battery_life = “ 88”.

But the part that confuses me is that I turn right back around and convert the string back to an integer? I’m not sure, guys. I might have been watching South Park while coding again.

At the end of that poor coding we end up with two variables to use: intHeadingDeg and headingDeg. We use the integer intHeadingDeg for any calculations that involve the robot’s heading. The other, headingDeg, is to print the robot’s heading to the screen, which is done at line 263.
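If I ever do send more than the heading, the chopping-up could look like this. This is just a sketch; the field names are my own illustration, not what the robot actually sends:

```python
def parse_telemetry(printRx):
    # Split a comma-separated telemetry string from the robot into integer
    # fields. int() tolerates the stray spaces split(",") leaves behind.
    compass, sonar, battery_life = (int(v) for v in printRx.split(","))
    return compass, sonar, battery_life
```

So parse_telemetry("2, 23, 88") hands back the three values as integers, ready for math, instead of strings.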

84-85: These are string variables that will hold “Target Locked X” or “Target Locked Y” if we are tracking the robot. These strings are needed so we can print them to the screen on lines 259-260.

See code above.

We’re in the meat now.

88: This increments our frame counter.

91: We read a single frame from the webcam we declared, cap, at line 44.

OPENCV! Sorry, I just love it so much.

So, by now you know I’ve not read the OpenCV manual. And please don’t tell me, “What! Go RTFM!” You go RTFM! I’ve got a wife, kid, and a job I love. I’m just going to tinker with crap and get it to work. But this attitude will begin to show as we go through the OpenCV calls, since I don’t know their inner workings. Instead, I’m going to offer my best guess, and as always, if someone wants to correct me or offer a better explanation, I’ll post it and give credit.

94: This blurs the image we got. You may say, “But I thought higher resolution was better?” It is. But jagged edges and color noise are not. A simple shape is much easier for OpenCV’s math to wrap around than a complex one. Therefore, we blur the image a little, giving us softer edges to deal with.

Also, blur melds colors, so if there are 2 blue pixels and 1 red pixel in a group, they become 3 blue-purplish pixels. This has the nifty benefit of speeding up the image processing a lot. How much? I don’t know; I didn’t RTFM.

97-100: Our image is converted to the HSV color space here and then thresholded. Having the image in this format allows us to use comparative statements with it. What we use it for is to get rid of all the colors except the one we are trying to find. This will give us a black and white image, the white being only the color we are looking to find. Line 98 is where your color is defined (it’s the two “np.array”s).

In the next step I’ll go through how to select your robot’s exact color.

103: Finds the contours of the white areas in the resulting image.

107-112: OpenCV then counts how many pixels are in each contour it finds in the webcam image. It assumes whichever has the most white area (aka, “mass”) is our object.

114-117: After we’ve decided which object we want to track, we need to come up with the centroid coordinates. That is what lines 115-116 do. I’ve not done the research on the math there, but I believe it uses the moments of the polygon to compute the centroid X and Y. Feel free to correct me or explain it better.

121-127: Here we lock onto the mass we believe is the robot. It begins by collecting 150 samples before it will state it is tracking the largest mass. But after it begins to track the largest mass, we try to stay locked onto it. This is lines 122-127. In essence, we allow the mass to move enough to be considered a movement by the robot, but not so much that noise (like a stray hand in the webcam image) will cause the tracking to switch off the robot.
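Stripped of the OpenCV plumbing, that lock-on test boils down to something like this. A sketch only; the pixel window here is my guess, not the value from my code:

```python
def still_locked(cx_avg, cy_avg, cx_prev, cy_prev, max_jump=35):
    # The blob may drift a little between frames (the robot driving), but a
    # huge jump means we probably latched onto noise, so we reject it and
    # keep the previous position instead.
    return abs(cx_avg - cx_prev) <= max_jump and abs(cy_avg - cy_prev) <= max_jump
```

Small moves pass, a jump across the screen fails, and the tracker stays glued to the robot instead of your hand.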

See code above.

This particular line defines what color you are looking for; specifically, the two sets of values: 130, 170, 110 and 190, 190, 200. These set the lower limit and the upper limit of the color you are looking to find. The reason we use upper and lower limits, which we’ll call color thresholds, is because our robot will move through different lighting. Different light sources have a tendency to change how the webcam reads the color.

The color format we are using is HSV, which stands for hue, saturation, value. Later, I’ll probably write code to select the robot within our actual program, but for now I use Gimp and the following method:

  1. Set up your webcam in the area you’ll be using, just like you’re ready to control the robot.
  2. Run the webcam program attached in step 10.
  3. While the webcam program is watching your robot, hit Ctrl + Print Screen.
  4. Open Gimp.
  5. Hit Ctrl + V to paste the screen capture into Gimp.
  6. Now, find the Color Selector tool.
  7. Select the main color of your robot.
  8. Now double click on the color square on the toolbar.
  9. A window should pop open with color information regarding the color you selected: your robot.
  10. Now, the three numbers listed should be close to what we need. Sadly, we have to convert from Gimp’s HSV number range to OpenCV’s HSV number range. You see, the HSV value ranges in Gimp are H = 0-360, S = 0-100, and V = 0-100. In OpenCV, H = 0-180, S = 0-255, and V = 0-255. So, some conversion needs to take place.
  11. From my selection I ended up with Gimp numbers of H: 355, S: 50, and V: 61. I could get all fancy and calculate the right numbers, but I figure 180 (OpenCV) is half of 360, so for my H I just divided by two: 177. The other two I kinda guessed at a little. I doubled and added 25: S: 125 and V: 147.
  12. In the end, this gave me middle numbers. But I wanted an upper and lower threshold, so I took each number and subtracted 20 to give me a lower, and added 20 to give me an upper.
  13. The result for my robot was:

See code above.
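If you’d rather compute the conversion exactly instead of my double-and-add-25 guessing, the scaling between the two ranges is just:

```python
def gimp_to_opencv_hsv(h, s, v):
    # Gimp uses H 0-360 and S/V 0-100; OpenCV uses H 0-180 and S/V 0-255.
    # Halve the hue, scale saturation and value by 255/100.
    return (h // 2, round(s * 255 / 100.0), round(v * 255 / 100.0))
```

Running my Gimp numbers (355, 50, 61) through it lands close to the values I guessed, which is reassuring.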

I’ll try to code a color selector into the program to make this whole damn thing a cinch.

If you’d like to read more, here are two good posts on Stack Overflow:

  1. Choosing HSV
  2. Finding HSV in image.

See code above.

132-136: Here we take the running average of the centroid’s X and Y. We load this into the variables cxAvg and cyAvg; again, this is to assure we are tracking the robot.
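One common way to do that smoothing is an exponential running average, sketched below. This is an illustration of the idea, not a copy of my lines 132-136, and the weight is made up:

```python
def running_average(old_avg, new_value, weight=0.04):
    # Each new centroid reading nudges the stored average a little, so a
    # single noisy frame can't yank the tracked position around.
    return old_avg * (1 - weight) + new_value * weight
```

Feed it the same value repeatedly and the average stays put; feed it a new value and it drifts toward it over many frames rather than jumping.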

142-145: Here the target, or “dot,” for the robot to run after is randomly generated. As you may notice, I restricted the generation area of the dots towards the center of my webcam’s field-of-view. That’s because I’m messy and dots were going where the little robot couldn’t get.

147-153: This is a rough collision detection function. Basically, if the robot gets close enough to the target (45px), it is considered to have “eaten” the dot. If it did, the dot variable is incremented, showing the total amount he’s done ate, and the newTarget string variable is flipped so a new target can be generated the next run through.
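The collision test itself is just a distance check between two points, which can be sketched like so (45px matching the threshold above):

```python
import math

def ate_dot(cx, cy, tx, ty, radius=45):
    # The dot counts as "eaten" once the robot's centroid comes within
    # `radius` pixels of the target's center.
    return math.hypot(cx - tx, cy - ty) <= radius
```

So a robot at (100, 100) eats a dot at (130, 130), about 42px away, but not one clear across the screen.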

See code above.

156-177: Here we are trying to find the angle between the robot and his target. We basically divide the entire screen up into four quadrants, always using the robot’s centroid as the point of origin. We then calculate the slope between the target’s X and Y (tX, tY) and the robot’s X and Y (cxAvg and cyAvg).

Something like this:

If the target were located in quadrant III, it would go something like this.

If you’d like to dig further into Trigonometric Functions in Python, have fun. Share if you find better math :)
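For what it’s worth, atan2 can collapse my four-quadrant code into one call. This is a simplification of what my script does, not a copy of it, and it assumes screen Y grows downward (so the Y difference gets flipped):

```python
import math

def angle_to_target(cx, cy, tx, ty):
    # Bearing from the robot's centroid (cx, cy) to the target (tx, ty),
    # returned in degrees 0-360. atan2 handles all four quadrants at once.
    return math.degrees(math.atan2(cy - ty, tx - cx)) % 360
```

A target straight to the robot’s right comes out as 0º, and one straight “up” the screen comes out as 90º.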

See code above.

181: When we find the angle between the robot and the target, then convert it into degrees, it ends up giving us a number which is a float. That’s more than we need, so here we convert the float (degs) to an integer (targetDegs) so we can compare it to the robot’s compass heading.

184: We declare an empty string called strTargetDegs.

187: Then we convert the float degs into a string so we can print the target angle onto the screen at line 264.

See code above.

This is where I need help, guys. My turning code has a bug, so if you find it and come up with a correction I’ll send you a prize. I dunno? A lint ball? It’d probably be one of my leftover circuit boards, or some piece of hardware I hacked together.

But for now, let’s take a look.

The idea is like:

The code is supposed to go as follows:

if target1 == True:
    ...
elif target2 == True:
    ...
elif target3 == True:
    ...

And for the most part that happens, but occasionally it is dumb and turns left when it should turn right. Not sure what I’m doing wrong. Hey, that “You and me buddy, until the end” is a two-way street. :P

Let’s step through it

195:We want to make sure we are deep into tracking the robot before we start moving it towards the target.

198: We compare intHeadingDeg, which is the robot’s heading angle, with targetDegs, which is the angle between the robot and the target. But we do this + or - 30º. This means the robot does not have to have its heading angle exactly the same as the angle to the target. It only needs to be pointing approximately in the right direction.
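That tolerance check can be written in a wrap-around-safe way, which may even be where my turning bug lives. A sketch, not my actual line 198:

```python
def roughly_on_course(heading_deg, target_deg, tolerance=30):
    # True when the heading is within +/- tolerance degrees of the bearing
    # to the target. The modular arithmetic handles the wrap at 0/360,
    # which a bare +/- 30 comparison would miss (e.g. 10 deg vs 350 deg).
    diff = abs((heading_deg - target_deg + 180) % 360 - 180)
    return diff <= tolerance
```

With a naive comparison, a heading of 10º and a target bearing of 350º look 340º apart; this version correctly calls them 20º apart.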

199: The movement code for the robot to go forward is 3, so here, given the robot is approximately headed in the right direction, we tell the robot to move forward. This happens by loading 3 into the variable tranx, which is transmitted to the robot at line 307. When this code gets transmitted to my robot, the Arduino code at line 137 tells the Forward(); function to fire.

202:If our robot isn’t headed in the right direction, then which way should he turn?

203-232: Still debugging here. I’m sorry, guys. I can tell you this code works “Ok.” But once I’m done with this tutorial, I’ll go back and focus on making it turn perfectly. Sorry, this code took me two days to write, but this tutorial has taken too many days.

Though, within each of the if statements we have two variable assignments: tranx = X and motorDuration = 10. The tranx tells the robot which direction to move, and the motorDuration tells it how long to move that way (this is not yet being utilized in my code).

See code above.

Here, we are drawing everything to the screen before we show the frame.

242: Red circle for the target.

247: White box to display black text on. Note, we are drawing things bottom-up. So, if you want something to have a particular Z level, you’ll need to put it towards the top of this section.

250: This is the green line between the target and our robot.

253-267: We display all our info here. Compass heading, target-lock, etc.

270: This actually shows the color window (the window we wrote everything on).

271: This shows the HSV copy of the captured frame. Notice the white area to be assessed as our target.

See code above.

276: An if-statement that waits for the ESC key to be pressed. If it gets pressed, we close stuff.

278: This releases our webcam.

279: This closes the windows where we were displaying the color and HSV frames.

281: We send the code to stop our robot. If we don’t do this and we hit ESC in the middle of a robot movement, that move will continue forever.

282: Here we close the serial connection.

283: We quit.

Towards the beginning of this article I stated my webcam had crappy drivers; well, while writing this I noticed I had placed the cv2.destroyAllWindows() before cap.release(). This is what was causing the problem. My interpretation of this was our camera being sucked into the void where the destroyed windows go. Anyway, I switched the order and it seems to have solved the problem.

See code above.

Finally, we are opening our second threaded function. This function is much smaller than the OpenCV function. Here all serial communication takes place.

289:This helps in translating ASCII.

292-296:Global variables for passing robot information to other threads.

See code above.

303: We read information into the variable rx. The information is coming from the serial line we opened at the code’s beginning.

307: This is a flag gate that makes it so our Python code can only send a motor command to the robot if the robot isn’t already in the middle of a movement.

308: We write whatever value is in tranx, which should be loaded with some sort of movement from lines 192-232.

313: I think I threw this in there so the serial line wouldn’t bog down my code.

316: We strip the number down to three digits only; remember, this is the compass heading in degrees, e.g., 000-360º.

319: When something is sent over serial it gets an end-of-line character. We don’t want that.

323: The robot collected this number from a compass, which gave a number with a decimal involved. This removes the decimal so we are only dealing with whole numbers.

326-329: I’m not sure what I was doing here; I think it had to do with the oddities of zero. Eh. I’ll try to remember.

See code above.

This is a short threaded function. It only really has one job: to control how long the motors on the robot stay on. It works like this: if we send the robot a message to move forward, it continues to do so until line 341. There, the command to stop is sent to the robot, and the motorBusy flag is set back to “No”, meaning the motor is ready to be used again.

340: This sets how long the motor will stay on. For instance, if it were changed to sleep(1), the robot’s motors would continue in the direction they were told for 1 second.

342: This makes the robot wait in between movements. In theory, this was meant to ensure OpenCV could keep up with the little guy. So, if you have a fast robot, you might set this higher.
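Boiled down, the motorTimer thread’s whole job looks like this. A sketch only: the stop code “5” is my assumption, and the durations stand in for the sleep() values above:

```python
import time

def motor_timer(send_command, motor_on=0.01, rest=0.01):
    # Let whatever movement was last sent run for motor_on seconds, send the
    # stop code, then rest so OpenCV can catch up before the next command.
    time.sleep(motor_on)   # motors keep doing whatever they were told
    send_command("5")      # hypothetical stop code
    time.sleep(rest)       # pause between movements
    return "No"            # the motorBusy flag goes back to "No"
```

Tuning motor_on changes how far each commanded movement carries the robot; tuning rest throttles how often it can move at all.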

See code above.

Ok. Code’s end.

This bit starts all three threads: OpenCV, rxtx, and motorTimer.

And here is my poor attempt to explain Python threading. Most Python code is run sequentially; the order it comes in is the order it is executed. One problem is timing. If we have to cause a delay in the code, then the whole program has to pause. Threading allows us to get around this. I see it like a juggler performing that trick where he keeps all the balls going in one hand, while he holds one ball still in his other. I dunno, just how I see it.
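Here’s a toy version of the pattern my script uses: several threads sharing one global variable, each re-declaring it with the global keyword. I’ve added a lock for safety, which my actual script skips:

```python
import threading

heading = 0                     # shared between threads, like intHeadingDeg
lock = threading.Lock()

def rxtx_stand_in():
    # Toy stand-in for the serial thread: any function that assigns to the
    # shared variable must re-declare it with `global`.
    global heading
    for _ in range(1000):
        with lock:              # the lock keeps concurrent updates from colliding
            heading += 1

threads = [threading.Thread(target=rxtx_stand_in) for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()                    # the main thread waits while the others run
```

After all three threads finish, heading holds exactly 3000; without the lock, the increments could occasionally stomp on each other.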

Well, like I said, “You and me, buddy, until the end.” And here we are. The end.

I hope this code has been helpful. But do know, you’re not alone.

Skype: thomas_ladvien

Skype or email me if you have any questions. Likewise, for all that crap I did a poor job explaining, coding, or writing, just shoot me an email and I’ll fix it.

I still want to develop this into a Swarmie platform, so you might keep an eye on www.letsmakerobots.com, since I’ll post my unfinished work there. Alright, I’m off to work on the 8th iteration of my Swarmie…ugh.

Dot Muncher

Originally posted on www.letsmakerobots.com

I threw this little guy together for my son Silas because he wanted to play with dad’s “Wobot.” There’s not a lot to say about him; he’s a hodgepodge of parts I had lying about:

  • HDPE Bought at the Dollar Store for $2 (I guess that’s the Two Dollar store.)
  • 3-6v 400 RPM Geared Mini Motors: $8
  • Two wheels from eBay: $2
  • 4-40 bolts, nuts, and washers (local): $4
  • Arduino Uno: $9.85
  • Ardumoto Shield: $11
  • Bluetooth 4.0 Module: $9
  • 4 x NiMH cells lying about: $0
  • 1 x Free Sunday morning

Total: $36.85

The first iteration took maybe an hour.

But, after I tossed the little guy together there were a few adjustments. I noticed right away I got this “Oh lord! Don’t drop it!” feeling every time Silas picked him up. Psychology being my profession, I sat on my couch and analyzed it :P

I want my son to spend time with me so I may teach him how to live. I know males often need co-operative tasks to feel secure in their bonding. Therefore, if I’m constantly upset my son is playing with the fruits of my interest, he will not share the interests with me. It’s a simple matter of reinforcement. Silas comes into my lab; Silas gets reprimanded; therefore, the behavior of coming into my lab is punished and thereby decreases. This means, for Silas to share my interest, thereby allowing us to bond, I’d need to find a solution to my cognitive dissonance regarding him picking up the robot.

Like most things, I narrowed it down to money. I would get tense because I knew the robot was fragile. It had a mixture of 5 and 3.3v components, and it was still using breadboards and jumpers. I was afraid he’d drop it, it’d break, and I’d lose money.

I couldn’t ask a three-year-old not to pick up a robot; tactile experience is primary for young males, and it was an expression of his interest, something I wanted. And I couldn’t make the parts cost less. This left me with only one option: robustness.

I vaguely remembered this was a key component of systems theory, but it was one I very often ignored. So, I did what someone who has never had a science would do, I added a lot of bolts.

Video of the “Process”:

Warning: My son is worse than Matthew McConaughey about wearing shirts. Hey, we try, boy’s just proud of his belly.

At the local hardware store I bought some 4-40 bolts and nuts, and started revamping the little bot.

In the end, I really didn’t do anything fancy, as is apparent. I drilled holes into the plastic battery case that aligned with holes in the robot base, and bolted it together. I, for the first time, used the mounting holes in the Arduino Uno, bolting it to the base. I then “designed” a hood (bonnet) for the little guy from matching HDPE, making sure to bolt it down as well. Lastly, I sealed the motor gears with electrical tape and put a few drops of oil in them. I’ve noticed this about geared mini-motors: they collect hair and will strip out the gears.

In the end, I did nothing a second grader would be proud of, but I did force myself to drop it from hip height five times to make sure I was over the “Oh Shiii-nobi Ninja!” feeling. In psychology we call that systematic desensitization. Or something equally important-sounding.

It collected so much hair the tires popped off.


I was careful not to wrap too much of the motor, since I had the thought it might decrease thermal exchange.


Originally posted on www.letsmakerobots.com

  • UPDATE: 7/6/14 – Silkscreen corrections.
  • UPDATE: 6/1/14 BOM Corrections.
  • UPDATE: 4/2/14 – Corrected information and linked the new breakout board, v.9.9

Also, this fellow is working on open-source firmware for the HM-10.

UPDATE (2/5/14): I split this post, since it was getting a little sluggish. I’ve updated the breakout board to version v.9.9, have instructions for updating the firmware, and added some research notes on a pseudo-star-network.

UPDATE (11/23/13): I’ve added research notes on networking the HM-10s and an ATtiny 85 serial filter (at bottom).


I know there are a few Bluetooth 4.0 and Arduino solutions coming out: Redbear Labs’ BLE Shield, the BLEDuino Kickstarter project, and the Bluegiga Shield. But I didn’t really like these, due primarily to the price:

  • Redbear’s Mini: $39.95 (Note: This is a uC and BLE combo).
  • Redbear’s Uno Shield: $29.95
  • BLEDuino: $19.95 (if part of Kickstarter)
  • Bluegiga Shield: $69.95

These are out of my price range for a single module. So, in the end, I created a breakout for a cheap module and got it interfaced with the Arduino for approximately $10.03 a module. Although, this price will be higher if you don’t buy components in bulk.

Here’s a Video Summary:

Now, I’ve not fully interfaced these with iOS or Android devices; they are simply a Bluetooth 4.0 solution for a wireless serial connection. That said, I’ve interfaced these devices in a limited way with iOS: I used the LightBlue app on my iPad Mini to open a rough serial interface. I’ll probably do this properly later with Jelly Bean 4.3’s Bluetooth 4.0 API. UPDATE: I’ve discovered jnhuamao provides sample iOS 7.0 interface code for the HM-10.

Proof of Concept Video

Now, if only I had the $99 to pay for an App store publisher license, I’d make us all a nice little robot interface :)

The modules I used were these HM-10’s. I won’t go into making the breakout board, since I did that already. I will state, though, that the last iteration of the breakout boards I made had mistakes. I was able to correct them for home use, and I’ve corrected them in the Eagle files I’ll put up; so the board files I’m posting are untested, though boards are on the way, and when I’ve confirmed they work I’ll post a confirmation. Also, the images I have of the boards I’m using are different, since I corrected the board files. UPDATE: It has come to my attention the traffic LEDs on the RX/TX lines are always on, due to the level converter pulling the lines high. The board still functions as intended if the LEDs are left unpopulated.

Ok. Let’s make a breakout…

1.  This is v0.9.9 of my breakout.  I do not swear it is bug free, but it seems stable.  It has working traffic LEDs and uses a linear voltage regulator:

OSHPark link: Breadboard Footprint (~$6.35 for three)

Github: HM-10 Breakout Eagle Files

2. Order the boards from OSHPark.

3. Order the SMD pieces you’ll need.  

The bill-of-materials (BOM):

  1. HM-10 x 1
  2. BS1138 x 1
  3. 0603 LEDs x 3 (2 must have a forward voltage of at least 3v; usually green or blue)
  4. 0805 Resistors 10k x 3
  5. 0805 Resistor 20k x 1
  6. 0805 Resistors 470 x  3
  7. 0805 1uF capacitor x 2
  8. (OPTIONAL) SOT-23 LDO voltage regulator (it doesn’t make much sense to use this, but I put the option on the board just in case; I’ll explain).

Again, I bought pieces in bulk, since I know I’ll use them on other projects; my price per module is $10.03.  Of course, you can buy all these components on DigiKey, but the price will be a bit more.

Ok.  Let me explain the 3.3v linear regulator.  I added this option to the board in case there is no pre-regulated 3.3v source, but it inherently contradicts the purpose of using a Bluetooth 4.0 module: extremely low power consumption.  I tried to measure the current the HM-10 pulls, but my multimeter only resolves tenths of a milliamp, and the module wouldn’t register at all, even during active use.  And as many (all?) of you probably already know, a linear regulator is extremely inefficient.  So, it’s much better to solder the jumper that bypasses the regulator and leave it unpopulated. UPDATE: I’ve found info on power consumption:

  • Sleep mode 400~800uA
  • Search Mode for Master: 19.6mA
  • Transmission (Slave or Master): 8.5mA.
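Given those figures, you can ballpark battery life from a duty cycle. A quick sketch (the battery capacity and duty-cycle values below are illustrative assumptions, not measurements):

```cpp
#include <cassert>

// Rough battery-life estimate (hours) from the consumption figures above.
// Assumes the module alternates between sleep and active transmission;
// capacity and duty cycle are illustrative, not measured.
double batteryLifeHours(double capacity_mAh, double sleep_mA,
                        double active_mA, double activeFraction) {
    double avg_mA = active_mA * activeFraction
                  + sleep_mA * (1.0 - activeFraction);
    return capacity_mAh / avg_mA;
}
```

For example, a hypothetical 240mAh battery with 0.6mA sleep, 8.5mA transmission, and a 10% active duty cycle averages about 1.4mA, or roughly a week of run time, which is why burning the difference in a linear regulator hurts.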

4.  Populate the breakout board.

A few notes on soldering the SMD pieces:

  • DON’T BE SCARED.  It’s really not that hard.
  • There are three musts to SMD, at least from my perspective: a small iron tip, sharp pointed tweezers, and thread-like solder (no thicker than .022” solder wire).
  • Other important soldering tools: a wet sponge and brass ball will keep your fine soldering tip fine.  Sponge the solder tip, then run it through the brass ball after each component to prevent build-up.
  • To speak blasphemy: flux is ok, but I find the tweezers often take the place of the flux.
  • Practice using both hands during soldering. Tweezers in one and solder-iron in the other.

5. Wire it up to serial port.

So, this is the board I screwed up on.  Basically, like a dumb-ass, I was trying to regulate 3.3v with a voltage divider.  Of course, I know better now.  Still, given the HM-10 pulls fewer than 10mA, I’ll probably go back and run the math to see if a voltage divider is, in fact, a feasible solution.
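Running that math is just a Thevenin model: the divider’s output sags as the load draws current, which is why a divider makes a poor regulator. A sketch (the resistor values in the usage example are illustrative, not from my board):

```cpp
#include <cassert>
#include <cmath>

// Thevenin model of a resistive divider: Vin feeds R1 (top) and R2
// (bottom), and the tap supplies a load drawing iLoad amps.
// The output sags below the unloaded divider voltage by iLoad * Rth.
double dividerOutput(double vin, double r1, double r2, double iLoad) {
    double vth = vin * r2 / (r1 + r2);   // unloaded divider voltage
    double rth = (r1 * r2) / (r1 + r2);  // Thevenin source resistance
    return vth - iLoad * rth;            // loaded output voltage
}
```

With, say, a 100Ω/194Ω divider from 5v, the unloaded tap sits near 3.3v, but a 10mA load drags it down to roughly 2.6v. So a divider only "works" if the load current is small and steady, which is exactly the problem.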

Anyway, the hookup is pretty simple.

  • BT-3.3v <—> 3.3v
  • BT-RX <—> FTDI-TX
  • BT-TX <—> FTDI-RX
  • BT-IO1 <–> LED <–> 220 Res. <–> GND
  • (For the 3.3v I used a regulator and tied my grounds).

A few notes: the RX and TX lines are translated between 3.3v and 5v by way of a voltage divider and the BS1138.  All other lines will die at >3.3v.

6. Set one module as the slave.  As I’ve stated, I’m connecting two modules together, so one of them must be configured as the slave.

I used RealTerm to access the HM-10’s firmware via AT commands (full list in the manual).

HM-10 Setup Instructions
  1. Under the “Port” tab
  2. Baud: 9600
  3. Parity: None
  4. Data Bits: 8
  5. Stop Bits: 1
  6. Hardware Flow Control: RTS/CTS
  7. Software Flow Control: Receive–Yes, Transmit–Yes
  8. Under the “Echo Port” tab
  9. Echo On: Yes
  10. Monitor: Yes

Then, under the “Send” tab type in AT commands and hit “Send ASCII”:

  • Send: AT
  • Response: OK

Now, set up one unit as the slave (they default to master).

  • Send: AT+ROLE1
  • Response: OK+Role:Slave

That should be all we need to do to set up the connection.  Now, whenever they power on, they will automatically try to mate.  You’ll know they are connected when the LED goes from blinking to solid.
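The responses above follow a simple pattern: a bare “OK”, or “OK+<key>:<value>” (e.g. AT+ROLE1 answers “OK+Role:Slave”). If you script the setup instead of typing into RealTerm, a tiny parser is handy. A sketch in plain C++ (the helper name is mine, not from any HM-10 library):

```cpp
#include <cassert>
#include <string>

// Pulls the value out of an HM-10 AT response. Returns "" for anything
// that isn't an OK response, "OK" for a bare acknowledgment, and the
// text after the colon for "OK+<key>:<value>" style replies.
std::string responseValue(const std::string& resp) {
    if (resp.rfind("OK", 0) != 0) return "";      // not an OK response
    std::size_t colon = resp.find(':');
    if (colon == std::string::npos) return "OK";  // bare "OK"
    return resp.substr(colon + 1);                // value after the colon
}
```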

7. Wire the modules to the target devices.

  • BT-3.3v <—> Arduino 3.3
  • BT-RX <—> Arduino TX
  • BT-TX <—> Arduino RX
  • BT-IO1 <–> LED <–> 220 Res. <–> GND (or if you’ve soldered on the 0603s you can skip that connection).

Notice the mistakes routing my board? :(

It was salvageable though.

8. Turn on the devices and make sure the LEDs go solid.

(8a. Yell at me if it doesn’t work.)

9.  If the LEDs go solid, then you have a serial connection between the devices.  Have fun :)

Some things I’ve discovered:

  1. They have much better range than I would have thought.  I’m getting around 30ft indoors.  I’ve not tried them outside.  For those of you who’ve read my developing post: Yes, having the copper planes underneath the antenna is what caused the range issue.  They’ve got super range now :) UPDATE: I found info on range: 60 feet indoors, 300 feet line-of-sight.
  2. They connect much faster than older Bluetooth devices.
  3. Actively sending or receiving draws fewer than 10mA :)
  4. I love these little guys over Xbees :)

Research Towards a Hub and Node network using the HM-10s:


The Theory:

So, I’ve been working on putting an ATtiny 85 at the end of the HM-10’s serial line to allow for remote control of AT commands.  It goes something like this:

I use SoftwareSerial to set up two serial lines.  Basically, the ATtiny 85 acts as a filter on the serial line.  If it sees a regular message, it passes it from TX1 to TX2.  But the code in the Tiny will be looking for serial data that begins with “AT+”, and if it sees that, it will instead write the command to RX1.
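The filter boils down to a prefix check on each message. A minimal sketch of that decision in plain C++ (the function name is mine, not from the actual ATtiny code):

```cpp
#include <cassert>
#include <string>

// The ATtiny filter's core decision: messages that begin with "AT+"
// are diverted to the HM-10's command line (RX1); everything else is
// forwarded as ordinary data (TX1 to TX2).
bool isAtCommand(const std::string& msg) {
    return msg.rfind("AT+", 0) == 0;  // true only if msg starts with "AT+"
}
```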

Now, stick with me a minute.

The Master has a mode called Remote, which is set up with the command AT+MODE2.  While in Remote mode, the HM-10 will transmit serial data but also accept AT commands.  Sadly, this seems to work only on the Master.  So, we must have a different setup for the slaves.

In the case of the slaves, we use the reset line.  Each slave will have the ATtiny filter, but when it gets an “AT+” command in the serial data, it will pull the reset line low.  This resets the HM-10.  We do this because the HM-10 has a command, AT+IMME1, that puts the HM-10 Slave into a mode where it won’t automatically seek to pair.  Instead, it will accept AT commands until given the command “AT+WORK”, which sends it into pairing/transmission mode.

Ok.  Going back to our Slave setup.  When we set up our HM-10/ATtiny combos as Slaves, we put them all in the mode where they don’t seek to pair until given the command AT+WORK.  Of course, we program the ATtiny to send the HM-10 into pairing mode whenever it is turned on.  Then, once it pairs with our Master, we can send a serial message through the Master to the Slave with the string “AT+RESET&AT+PIO11&AT+WORK”.  When the ATtiny gets this string, it will pull the reset line low, putting the Slave back in AT mode.  Then the ATtiny connected to the slave will send the command AT+PIO11, which pulls pin 1 on the HM-10 high.  After that, the ATtiny gives the command for the Slave to re-enter transmission mode. Voila.
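Before acting on that ‘&’-separated string, the ATtiny has to break it into individual AT commands and run them in order. A sketch of the split step in plain C++ (the ‘&’ delimiter comes from the string above; the helper name is mine):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Splits a command bundle like "AT+RESET&AT+PIO11&AT+WORK" into the
// individual AT commands, in the order they should be executed.
std::vector<std::string> splitCommands(const std::string& line) {
    std::vector<std::string> cmds;
    std::stringstream ss(line);
    std::string cmd;
    while (std::getline(ss, cmd, '&'))
        if (!cmd.empty()) cmds.push_back(cmd);
    return cmds;
}
```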

Alright, so far, I’ve got all that coded and the hardware worked out–most everything above I can confirm works.

But I’ve been skeptical as to whether the HM-10 will connect quickly enough for a Master to have seemingly seamless transmission between Slaves.  I derived this skepticism from watching the blinking connection LED every time I reset one of the HM-10s that was formerly paired.  Then it hit me: they weren’t immediately reconnecting because the Slave still thought it was connected, so the HM-10 firmware had not re-initialized its pairing protocol.  I tested it.  And sure enough, if a Master and Slave are paired and one loses power, the other will hang for 3 seconds before trying to pair again.  But if one loses power and the other is reset at the same time, when they both power back on (<100ms) they will almost immediately pair.


So, all we have to do is set up code where the Master connects to a node, tells it what it needs to, then tells it to reset itself.  Afterwards, the Master changes its own pairing PIN and resets itself; whenever the Master comes back up, it should almost immediately connect to the new node.

And there we go.  A viable Bluetooth 4.0 Star Network.  I hope to have this fully tested before the Holidays.


(Warning: Lots of vehement expression towards datasheet-writers)

Ok. So here is what I’ve learned.

Alright, I’m beginning this article by saying: I love the HM-10.  Excellent device.  However! I want to beat the ever-loving poo out of their datasheet writer.  To begin, I’ve ordered several HM-10s from www.fasttech.com over the course of several months.  And it never dawned on me they were upgrading the firmware quicker than I could buy them.  This wouldn’t be too bad, but it’s like the HM-10 monster took a poo and the datasheets are the result: actual commands for listed firmware versions don’t match the datasheets, the Chinese datasheets contain different information than the English ones, and some AT commands have been merged without it being stated.  It’s just fubar.

So, some of the issues I’ve had trying to network the little devices have come, I believe, from firmware versions not playing nice.

For example, the HM-10 V303 has a command, AT+IMME1 (0 to turn it off), for the Master only, that keeps it in AT mode until given the command AT+WORK.  I discovered that stupid-ass jnhuamao changed the firmware at some point (in the 4xx range), and this command merged with AT+START, which in my V303 datasheet is a command for something else.  F’in poor translation.

Now, I have 2 boards with firmware V303 and 1 board with V502.  I also have 2 modules that I bought later which more than likely have something greater than V502.  I’m praying they are V508 or greater; at V508 they added the feature to upgrade the firmware through the serial line.  ‘Bout damn time.

I can’t find the datasheets (in either language) for V502, but looking at the V508 I can see the AT+TYPE command now has three options.  The V303 lists only two options for AT+TYPE.  Yet, somehow, my V303 boards actually take this third option (AT+TYPE2). Bizarre.

Moving on from the firmware and datasheet mess: Using the ATtiny 85 does work, but to get the HM-10 to take the commands it requires:

  • TinySerial.write("AT+xxxxx");

So, in theory, to get a HM-10 Master setup to only enter transmission mode when given a command, it goes something like this:

  1. TinySerial.write("AT+RENEW"); // Reset to factory settings.
  2. TinySerial.write("AT+ROLE0"); // Be the Master.
  3. TinySerial.write("AT+IMME1"); // Don't enter transmission mode until told.
  4. TinySerial.write("AT+RESET"); // IMME takes effect after reset.
  5. TinySerial.write("AT+START"); // Ok, try to connect to something.

This resets it to factory settings, tells it not to connect until given the command, then it gives the command to start trying to connect.

Here’s example code I use on the ATtiny 85:

  /*
    This code has been modified for use on an ATtiny.
    Created by Matthew on June 11, 2013

    This example code is in the public domain.
  */

  #include <SoftwareSerial.h>

  SoftwareSerial TinySerial(3, 4);  // RX, TX -- to/from the HM-10.
  SoftwareSerial TinySerial2(1, 2); // RX, TX -- to a serial port for debugging.

  void setup()
  {
    // Open serial communications and let us know we are connected.
    TinySerial.begin(9600);  // Serial line for the ATtiny85 to read/write from/to the HM-10.
    TinySerial.println("Tiny Serial Connected via SoftwareSerial Library");
    TinySerial2.begin(9600); // Serial line for the ATtiny85 to print to a serial port.
    TinySerial2.println("Tiny Serial Connected via SoftwareSerial Library");

    TinySerial.write("AT+RENEW"); // Reset all settings.
    TinySerial.write("AT+ROLE0"); // Master mode ("AT+ROLE1" is slave and "AT+ROLE0" is master).
    //TinySerial.write("AT+PASS001111"); // "AT+PASS001111" sets the password.
    // The remote-control mode only works on the Master HM-10.
    TinySerial.write("AT+MODE2"); // "AT+MODE0" = Transmission Mode, "AT+MODE1" = Remote Control Mode, "AT+MODE2" = Modes 0 + 1.
    TinySerial.write("AT+IMME1"); // Don't enter transmission mode until told ("AT+IMME0" connects right away).
    TinySerial.write("AT+START"); // Ok, go ahead and connect. BULLSHIT! Apparently "AT+WORK" is not what we use; it's "AT+START".
  }

  void loop()
  {
  }

Ok.  I also learned a little more about the PIN command.  To begin, “AT+PASS000001” will set the PIN, not “AT+PIN000001”.  Of course, it must be a six-digit number, so fill the rest with zeros.  Now, depending on the firmware version, there are 3 different settings for PIN pairing, all set by AT+TYPEx:

  1. AT+TYPE0 – this is supposed to be “Connect without password mode”
  2. AT+TYPE1 – “Simple pairing” (no explanation).
  3. AT+TYPE2 – “Requires PIN for pairing”
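Since the PIN must be exactly six digits, zero-padded on the left, it’s easy to build the AT+PASS command programmatically rather than hand-pad it. A sketch (the helper name is mine):

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Builds an "AT+PASSxxxxxx" command from a numeric PIN: the HM-10 wants
// exactly six digits, so 1111 becomes "AT+PASS001111" (matching the
// example above). PINs larger than six digits are truncated modulo 10^6.
std::string passCommand(unsigned pin) {
    char buf[14];  // "AT+PASS" (7) + 6 digits + '\0'
    std::snprintf(buf, sizeof(buf), "AT+PASS%06u", pin % 1000000u);
    return std::string(buf);
}
```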

Alright.  So, this was the key to my switching between modules.  I thought I would set a unique PIN for each slave, and the ATtiny 85 connected to my Master would switch the Master’s PIN depending on which node I wanted to connect to.  Well, this feature is broken.  I played with it for several hours, and no matter how I set the PIN or TYPE settings, the modules would pair even without the correct PIN.  I could find no answer for this behavior, until I read through the Chinese version of the datasheet and came across this gem:

  • “IMPORTANT: V515 previous versions, the directive no practical effect, after setting causes not connect, please do not use.”

Of course, this is a Google translation.  But I’m pretty sure it reads, “This feature does not work on versions before V515.”

And that’s where I am at the moment. I wanted to make sure I wrote some of this stuff down in case others were running into problems.  My next project will be writing to jnhuamao to get some questions answered (e.g., “Is there any way to upgrade the firmware on versions less than V508, maybe through the SPI interface, so I’m not left with 5 unsecure HM-10s?”).