I’ve been working on this one in silence for a bit.
A while back it hit me: before I grew my Overlord project in complexity, I wanted to refine it for ease of use. So, I began translating my Overlord project into a Python module I could build on.
I figured this would make it easier for anyone to use. That includes myself; I’ve not forgotten my identity as a hack, nor will anyone who pops the hood on this module :)
But, at its core, there are a few essential inputs:
Color to track.
So, I spent some time translating the code into a callable module. This experiment was mainly for my own use, yet I knew it’d grow healthier if I had LMR’s feedback, elder or noob.
When I started, I actually planned out (gasp) what would make this code more user-friendly. I didn’t have to think long; the two things that have taken the most tweaking to make this code useful are:
Adjusting the compass heading.
Selecting the color to track.
To address the first issue, I developed an “auto-compass calibration” function.
Basically, this function takes the very first compass reading and adjusts all other readings. So, all you have to do is put your robot in the direction you want it to consider “North,” start your code, and this function will convert all other readings.
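If you’re curious, the gist of it can be sketched in a few lines of Python (the function and variable names here are mine for illustration, not the module’s actual API):

```python
offset = None  # set from the very first compass reading

def calibrate(raw_heading):
    """Convert a raw 0-360 compass reading into a heading relative
    to whatever direction the robot faced when the code started."""
    global offset
    if offset is None:
        offset = raw_heading          # first reading defines "North"
    return (raw_heading - offset) % 360

print(calibrate(80))   # first call: 0 -- this direction is now North
print(calibrate(170))  # 90, i.e., 90 degrees clockwise of startup heading
```

So, point the robot at your chosen “North,” start the code, and every later reading gets shifted the same way.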
The second issue took me a little longer to deal with: easy color selection. In short, I rewrote most of the color detection parts of the code to take advantage of OpenCV’s CamShift algorithm. This function is more resilient to lighting changes or other near-color objects, but it is also more CPU intensive. At some point, I’ll probably go back and write a variant that sticks with the old largest-target-color-mass method.
Ok, so what does this mean for the user? When the code starts, you select the color you’d like by left-clicking and dragging a selection box over an area. The mean color of the selected area will be tracked, and this also starts the rest of the code.
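For the curious, the mean-color part of that selection step looks roughly like this (a toy sketch in plain NumPy, no camera required; the real module hands this color off to CamShift):

```python
import numpy as np

# Fake "frame": a black image with a red patch, in BGR order as OpenCV uses.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[20:40, 30:60] = (0, 0, 255)

def mean_color(frame, x0, y0, x1, y1):
    """Mean BGR color inside the dragged selection box."""
    roi = frame[y0:y1, x0:x1]
    return roi.reshape(-1, 3).mean(axis=0)

print(mean_color(frame, 30, 20, 60, 40))  # → [  0.   0. 255.]
```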
What does Friendly Overlord give you?
Well, a lot. And when I finish writing the damn thing, more than a lot.
Here’s a list, and only one bit is untrue.
It tracks your robot, providing its x and y relative to your webcam.
It will provide target coordinates, which I’ll later make addressable in case someone wants to do something cool, rather than have their robot drive around and catch virtual dots. Lame.
It will take the compass reading you provide, translate it to a heading relative to the camera, then, it will send commands to your robot telling it to turn until it is in alignment, then move towards the target.
Make you a cuppa (CP, DanM, did I use that right?)
It will allow you to tweak pretty much any element of the code (e.g., overlord.targetProximity = 5)
What does it not do?
Take care of your serial data. You’re on your own there, bud.
Write your robot uC code for you.
Provide you with your robot’s heading (though, when I delve into two-color detection this could be done with two-dots on your bot. But really, it’d be easier and near cheaper to get an HMC5883L).
Alright, so let’s talk code. How little code does it take to use it?
```python
import serial
from time import sleep
import threading
import overlord

# Initialize Overlord variables.
overlord.dVariables()

# Open COM port to tether the bot.
ser = serial.Serial('COM34', 9600)

def OpenCV():
    # Execute the Overlord.
    overlord.otracker()

def rx():
    while(True):
        # Read the newest output from the Arduino.
        if ser.readline() != "":
            rx = ser.readline()
            rx = rx[:3]
            rx = rx.strip()
            rx = rx.replace(".", "")
            # Here, you pass Overlord your raw compass data.
            overlord.compass(int(rx))

def motorTimer():
    while(1):
        # This is for threading out the motor timer, allowing for control
        # over the motor burst duration.  There has to be both something to
        # write and the motors can't be busy.
        if overlord.tranx_ready == True and overlord.motorBusy == False:
            ser.write(overlord.tranx)
            ser.flushOutput()  # Clear the buffer?
            overlord.motorBusy = True
            overlord.tranx_ready = False
        if overlord.motorBusy == True:
            sleep(.2)  # Sets the motor burst duration.
            ser.write(overlord.stop)
            sleep(.3)  # Sets time in between motor bursts.
            overlord.motorBusy = False

# Threads the OpenCV stuff.
OpenCV = threading.Thread(target=OpenCV)
OpenCV.start()

# Threads the serial functions.
rx = threading.Thread(target=rx)
rx.start()

# Threads the motor functions.
motorTimer = threading.Thread(target=motorTimer)
motorTimer.start()
```
This is fully functional code. You’ll notice that only about 10 lines actually get Friendly Overlord going; the rest handle serial functions and motor firing. Be warned, the motor firing code will change, since right now it is written the way I like it; eventually it will be designed to be as flexible as possible.
overlord.dVariables() #Sets the Friendly Overlord variables.
overlord.otracker() # The module’s heart. Handles color tracking, angle calculation, etc.
overlord.compass(x) # You pass it a compass heading as an integer in degrees (0-360) and it does the rest.
overlord.tranx_ready # Simple flag to indicate the last bit of serial data has been sent.
overlord.tranx # Variable that contains the serial command to be sent to the robot.
overlord.motorBusy # Flag to indicate if the robot is still in the middle of a movement.
That’s about it. In the module? 399 lines of code, or so. Still relatively small for a program, but not something I want to wade through without a damned good reason.
Ok. So, where am I going with this?
Hell if I know. I want to make it as versatile as possible. Eventually, I’d like to be tracking nth number of robots. I envision a swarm of Yahmez’ Baby bots flying all over the place, Friendly Overlord tracking them, and communicating with them via IR.
But in the more immediate future, I’d like to make every variable tweakable. Especially, variables useful to others. For instance, the overlord.tX and overlord.tY are currently controlled by the module. They are simply randomized numbers. But, I’ll make a flag in the next two days to take control of them from your own code. You can decide where you’d like your robot to go. Whether it be to your mouse pointer (overlord.targetY = overlord.mouseY) or a complex set of way-points to lead him through a maze. Really, I’ll probably code around the feedback I get.
Now, some obligatory stuff.
Here are some of the current variables addressable from your program:
But I’d like to make every variable needed by the user available.
Ok. So, here’s what I need: Someone to use it and provide feedback. I’m getting too close to it and bleary of thought.
I’ve thought of doing a few things to get some feedback:
Setup a challenge (I’ve got some surplus).
Offer to mail one person a month a setup (two Bluetooth PCBs and a cheap webcam).
I think I’ll make a walkthrough video pretty soon (kinda miss making stupid videos) but I’m a little worn out right now.
I’ve been working on re-making the Open Hardware Pulse Sensor so it’d be easy to send off to OSHPark and to make at home. I’m not sure, but I think I started this project in March and I’ve just now finished it.
The bit of encouragement I needed was when hackaday.com put it up as their “Fail of the Week.” I thought I was going to be mature about it. But those four red letters started eating at me, so I gave it another go. Weirdly, I got it working.
I believe there were three problems:
I had mixed up the op-amps again. In my defense, I’ve got 5 different ICs flying about in the same package as the op-amp.
The Arduino I’d been plugging into was sitting on a surface that provided enough conductivity to create noise between the 3.3v pin on the underside and A0, which I was using for the op-amp input.
Every time I touched the sensor the exposed vias were shorted through my own conductivity. Stupid mineral water.
The light sensor is the hardest bit, so take your time. I put a little bit of solder on each pad with my soldering-iron, then, cover the soldered pads in flux. Next, I attempt to align the light-sensor with the pads as close as possible. After, I put the board with the backside on an over-turned clothes iron. Let the iron heat up until the solder reflows and the sensor is attached.
Flip the sensor and lock it to your surface with tacky-putty to solder the LED, passives, and op-amp. I won’t detail this, since my video shows the entire process.
Wrap it with tape, cutting a small hole for the LED and light-sensor. (I’ll come up with a better solution, and a way to clip it to your body, on the next iteration.)
New angle. I finished my ATtiny Bitsy Spider (ABS) board and wanted to do something with it. While stringing it together, I had thought of replacing the Arduino Pro Mini and the Servo Helper board with the ABS. Cost-wise, it will be slightly more expensive ($1.50 or so?) but much smaller and a lot less hassle.
I’ve read several people had mixed results getting an ATtiny to control servos. Of course, I’m no better. But I was able to get acceptable functionality out of them (i.e., controlling continuous rotation servo speed, direction, braking). Anyway, here’s kinda how I approached the servos on the ATtiny 85.
I found several blogs about getting servos to work on the ATtiny but ultimately I used the Servo8Bit library (note, for ease of use I’m linking the “Arduino version” below, not AVR).
It doesn’t seem real friendly, but in this hack’s opinion, it seems like great code that is incomplete (hope someone corrects me if I’m off). The problem I had, and I believe others have had, was the library using Timer1 for servo timing. The Tiny cores (at least the ones I’m using) use Timer1 for basic functionality, creating a conflict. This presented itself as an inability to use the delay() function; it was simply as if it had no effect. That’s when I popped the hood on the library itself. In the header files there is an option for which timer to use. So, I switched it from Timer1 to Timer0 and tried the code again. Great, delay() seemed to work now, but the ability to control the servos was gone. As soon as myServo.attach(3) was called, the servo would spin at full speed in one direction. Damnit.
I didn’t feel like digging through the rest of the library trying to debug something I only half understood. So, I began researching. After a bit, I came upon this thread. Seems this fellow WireJunky was trying to figure out how to do the same, control continuous rotation servos with an ATtiny. At the end Nick Gammon told him he should just create his own timer function.
Anyway, I hacked this code out after reading the thread and was surprised it did what I want. I’m a hack hacking with a hacksaw!
There are a few issues. It seems my B servo has some jitter in it. It doesn’t like to stop at myServoB.write(90). I tried calling myServoB.detach(), then myServoB.attach(3), in a hackish attempt to stop the servo. It’ll stop, but it won’t re-attach.
Anyway, even if the troubleshooting doesn’t work out, I have some workarounds. For example, running the VCC for the servos through a P-channel MOSFET controlled by the ATtiny; it’d take an extra pin but would allow me to accurately stop them. Though, I believe this lack of “centeredness” is due to either a cheap 0805 I used in the conversion or other noisy stuff I have on the PB4 pin.
Of course, to use the ABS as a replacement brain on the Jot, I’ll need to create a star network with the ABSes, write a library to control the HMC5883L from the ATtiny, make sure there are no other timing issues, and fit it all in 8k of flash. Ugh. Right now the code size is around 3k with the servo and serial libraries.
Well, I don’t know what to say. I think I’m going to take a break from this build for a bit and focus on finishing the Overlord projects with the Dot Muncher.
I discovered what was causing my problems with the NRF24L01. It wasn’t the voltage-regulator. It was the 1uF 0805s filtering the regulator. I replaced the unknown capacitors (ones bought off of eBay) with some from Digi-Key that were rated 25v. This fixed the problem and I had the Jot communicating nicely as I had hoped.
Of course, that wasn’t the end of the problems. I discovered the HMC5883L board was shorting, I believe, every time I programmed the board. It’s pissing me off. I’ve burnt four compass boards and two Arduino Pros over the issue (smoked around $15 in parts). It has something to do with the HMC5883L I2C lines feeding backward through the board whenever the Arduino goes low. It causes the voltage regulator on the HMC5883L board to pop almost right away. Of course, it does slight damage to other parts of the connected boards. I didn’t know it at the time; however, I believe this backward current was also the reason the filtering capacitors were damaged.
Stuff Burnt on the Pyre of Stupidity –>
That’s not the only issue. The code I’ve got together for the NRF24L01 doesn’t play nice with the HMC5883L library.
But I can’t figure out how to rewrite the code in a way they’re both happy with while the f’in boards keep burning up. Sigh.
Nevertheless, I think my next step, when I’ve got my gusto back, will be to make a complete schematic of the Arduino Pro Mini, Little Helper Board, and the HMC5883L. I figure I already have schematics for the first two, and I have enough HMC5883L boards to pull the components off and reverse engineer the PCB.
Still, I’m a little depressed. I feel like I’d have been better off making the boards myself. At least then I would know exactly how they are strung together and could only blame myself for issues.
I also feel like Frits needs to put together a “Robotics Fail of the Week” so I can be its first highlight. Man, even looking at that picture now makes me feel like I suck at life. Oh well, I’m going to list the good stuff I’ve learned from this.
Reverse current is a bitch–low-drop diodes are your friends.
I have put together code that gives the NRF24L01 something closer to Bluetooth functionality. Though, it doesn’t like being in the same code as the Wire library.
Cheap parts require you to be time-rich.
The NRF24L01 isn’t really meant for streaming data. I knew this going in, but I didn’t understand how it really plays out in the code. The NRF takes a lot of code management, unlike other devices that are hardware-managed or an SoC. This makes the NRF highly sensitive to what else your code is doing; in my case, running servos, communicating over I2C, and doing floating point math. As I progress in this build, I feel I’m taxing the NRF’s functionality beyond its ability.
It is better to learn the circuits of all boards connected to yours. It might initially take more time, but in the end save time and money.
If I fail at something, although looking ridiculous is not fun, documenting the failure makes me feel better. Like it meant something. Even if that something is, “Hey world, I’m an idiot.” :)
UPDATE: A Jot of Trouble
I didn’t want to float this post until I have something working to update, but I missed writing for a change. I’ve been working on the little Jot frequently over the last few months. Yet, I keep running into problems. The NRF24L01s are acting silly. One day they work, another they don’t. I guess we can’t make woman jokes now that Roxanna77 is here? (If you read this Roxanna, just wanted you to know I had to make sure my wife didn’t read this post, it’d been hell in my house).
I have reworked the servo board (v.9.5) to include a double 90-degree header. One set is to attach the servos, the other is to attach the Compass (HMC5883L). This was meant to make the hardware more compact, modular, and keep the compass level for a better reading. Oh yah, and I burnt a few HMC5883Ls trying to connect them with crappy braided wires.
Also, I’ve added solder-jumpers to tie the 3.3v SOT-23-5 voltage regulator’s enable pin either high or low, depending on which one I mistakenly buy.
On the top side I’ve included an SMD voltage divider running straight into analog pin A3. My intention is to allow the Jot to keep an eye on its battery voltage as a way of sensing how “hungry” it is.
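For anyone wanting the math, here’s a sketch of how the battery reading would work (the resistor values are made up for illustration; I’d actually pick them so a full battery stays under the ADC reference):

```python
# Hypothetical divider: R1 on top (battery side), R2 to ground.
R1 = 10_000     # ohms
R2 = 10_000     # ohms
VREF = 3.3      # ADC reference voltage
ADC_MAX = 1023  # 10-bit ADC, as on the ATtiny/Arduino

def battery_voltage(adc_reading):
    """Back out the battery voltage from the A3 reading."""
    v_pin = adc_reading / ADC_MAX * VREF   # voltage at the divider tap
    return v_pin * (R1 + R2) / R2          # undo the divider ratio

# A full-scale reading with a 1:1 divider means the battery sits at ~6.6 V.
print(round(battery_voltage(1023), 2))  # → 6.6
```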
I’ve added a 3.3v pin on the new double 90-header, in case I’ve a 3.3v sensor elsewhere on the bot. I dunno, I was trying to use all the extra pins I had.
Of course, since I’ve learned how to tent vias, I’ve also tented the vias on the board with hope I’ll save myself a fateful short or two.
I’ll eventually try to replace those bulky headers with what I’ve affectionately begun to refer to as “those short, round headers.” I like these little headers because of how utterly small they are. Of course, small as they are, the bulk of their body does not make it through thicker PCBs. This is due to the pin flaring closer to where the plastic header is. This flare prevents the short-rounds from sinking through the typical header hole on most boards.
But, I’ve got calipers, Eagle CAD, and OSHPark, so I made a little library of header holes that will allow these pins to slip neatly through the board and mate with short-rounds on the other side. I sent off to OSHPark for a sample, so I’ll report back when I’ve tested them for effect.
On my original version of the servo board (by the way, I refer to it as the Little Warmie Helper board, or LWH board) I had used a different voltage regulator that cost more. The only difference I found between the two was the output current: the first put out 200mA and the second 150mA. I really didn’t think this made a difference, given what I could find in the datasheet. I know there are passives affecting the power consumption, but it’s the only info I could find (datasheet, pg. 8). The NRF24L01 was using around 11.3mA as a transmitter and 13.5mA as a receiver. Even though I didn’t know the power consumption of the passives, I believed I was well within the range to use the cheap 150mA voltage regulator. But experience has proven otherwise.
This is where I ask the professionals to come in and tease me about missing something simple.
The only theory I could invent, given my limited understanding of electronics, is that the NRF24L01 only averages 11.3/13.5mA over time, but its burst draw exceeds the constant 150mA the cheap regulator can supply? I don’t know. I’m at a loss.
Of course, this is pure speculation. I’m currently out of higher-output voltage regulators (I should have some more by the end of the week). But I can leave the NRF24L01 in place on my LWH board, solder jumpers to the 3.3v and GND pins, and get the NRF24L01 to function properly. This makes me believe the fault lies directly with the inadequacies of the voltage regulator and not my board design (though its inadequacies, I’m sure, are glaring).
Anyways, this is where I am with the little Jot.
A couple of notes. I have a backup design of the Jot that I’m working to get cheaper than $25, which uses BLE (yes, those HM-10s I’m in a love-affair). Also, I decided if I was to get serious about the Overlord projects I’d probably do better turning it into a Python module, which I’ve been doing in silence and is around 90% done. I’ll try to have it up before the end of the year. I need to finish several functions.
UPDATE: Progress on NRF24L01 code for working between PC, Hub, and Robot.
So, here is my attempt at a swarmie build. Not much here yet, simply a personal build log until I get an iteration cheap enough, then, I’ll start incorporating them into the Overlord projects.
I have to bow to Bajdi; those little NRF24L01s take a lot more brainpower than simple ole’ Bluetooth. I tried for some time to write my own code that would send and receive bytes to and from the other node. After a little hair pulling, I gave up and started reading others’ code. I came across Robvio on the Arduino Forums, who had some rather nifty code that I left nearly intact.
This code works much like a software-simulated serial Bluetooth module.
To send serial data, it goes like this: you type something with a prefix code (T for transmit, S for serial print), ending with a newline character (\n).
For example, typing the following in the terminal on module A:
T:S: My message \n
Will send “My message” to the other module, B, which will then print “My message” to its serial line.
If you type,
T: My message \n
This will transmit “My message” from module A to module B, but it will not be printed to the serial line on module B.
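If it helps, here’s a toy Python parser showing the idea behind the prefix scheme (my own sketch, not Robvio’s actual code):

```python
def handle(line):
    """Toy dispatcher for the T:/S: prefix scheme."""
    actions = []
    msg = line.rstrip("\n")
    while True:
        if msg.startswith("T:"):
            actions.append("transmit")   # radio the payload to the other node
            msg = msg[2:]
        elif msg.startswith("S:"):
            actions.append("serial")     # other node prints payload to serial
            msg = msg[2:]
        else:
            break
    return actions, msg.strip()

print(handle("T:S: My message \n"))  # → (['transmit', 'serial'], 'My message')
print(handle("T: My message \n"))    # → (['transmit'], 'My message')
```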
I’ll let you guys look the code over and tell me if I can improve it for what I’m doing. Right now, I’ve tested it with some basic Python code to send a serial message to my hub (Arduino Uno and NRF24L01), which relays it to the robot (Arduino Pro Mini and NRF24L01).
NOTE: Try as I might, guys, I can’t get the numbers to line up in the HTML version of my code. Instead, you might just load it into Geany or Notepad++ to follow along, since I indicate things by line number. I’m sorry, I’m out of patience for it.
These are redneck instructions on how to control a robot with a static webcam for under 50 USD.
I’m a robot builder and I got tired of seeing universities play robot soccer or something with computer vision guiding their players, and no matter how much I begged, darn Ivy Leagues wouldn’t share.
So, I wrote my own. And while I did it, I swore I’d try to help anyone trying something similar.
The PC averages these X, Y positions for around 150 camera frames.
If the blob hasn’t moved much, the PC assumes the red blob is the robot.
The PC gets frisky and gives our robot a random target within the webcam’s field-of-view.
The PC calculates the angle between the bot and the target.
Meanwhile, the robot’s microcontroller is taking readings from a magnetometer on the robot.
The robot, with a one-time human calibration, translates true North to “video-game north,” aka the top of the PC’s screen.
The microcontroller transmits this code to the PC.
The PC compares the bot’s angle to the target with the robot’s reported heading.
The PC sends a code to the bot telling it to turn left, right, or move forward (closer to the target).
When the robot has made it within an acceptable distance from the target he “Munches the Dot.”
A new random dot appears. Rinse repeat. (For real though, don’t rinse the bot. Consider Asimov’s Third Law.)
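For the curious, the turn-or-go decision in those steps boils down to a little trig. Here’s a simplified Python sketch (my own names, not the actual project code):

```python
import math

def angle_to_target(bot_xy, target_xy):
    """Angle from the bot to the target in screen coordinates,
    with 0 degrees pointing at the top of the screen (video-game north)."""
    dx = target_xy[0] - bot_xy[0]
    dy = bot_xy[1] - target_xy[1]   # screen y grows downward, so flip it
    return math.degrees(math.atan2(dx, dy)) % 360

def steer(bot_heading, bearing, tolerance=10):
    """Turn until aligned with the target bearing, then drive forward."""
    error = (bearing - bot_heading + 180) % 360 - 180
    if abs(error) <= tolerance:
        return "forward"
    return "right" if error > 0 else "left"

print(angle_to_target((50, 50), (50, 0)))  # → 0.0  (target straight up)
print(steer(90, 0))                        # → 'left'
```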
About Me: (skip, it’s boring)
I’m a homeless outreach worker. The job’s amazing, but I’ll say it: emotionally taxing. Skipping the politics and the sermon on harm reduction, I decided at the start I needed something far from the job to allow my mind rest and prevent compassion fatigue. Something that consumed my brain-power so I’d not be stressing over the six-months-pregnant 17-year-old shooting up under a bridge on I-35. Something to protect my down-time so I’d be frosty for the next day.
Well, I saw that TED talk about the Sand Flea and I told Bek, “That’s awesome, think I could build one?”
“Oh crap,” she said, “new obsession?”
Now, robots are my relief. My way to prevent white-matter from becoming dark-matter as I rake through sludge looking for those who want out.
I started reading a lot. I discovered Arduino, Sparkfun, eBay, Raspberry Pi, ferric chloride, Python, Hackaday, the HC-SR04, Eagle, OSHPark, and the list goes on. But every time I Googled something about robots, I’d end up at the same place.
These guys are brilliant. They are a college education from collaboration, I swear.
Soon, I ended up with my first bot. A piece of sh…short-circuits. Although, I did learn a lot interfacing the bot with the Raspberry Pi. Also, while I was working with a Raspberry Pi, I played with OpenCV, and was considering adding a face tracker to my bot before I got distracted. But before I quit, I created a proof-of-concept.
So, all these experiences began to culminate.
Meanwhile, I was taking a graduate Research Methods class at UTA and my professor disappeared. The university was amazing; good professors filled in and made sure our education didn’t suffer. But we wondered for many months. Sadly, it was discovered he had killed himself.
It shook me. I deal with suicidality every other day, but it’s usually on the street. Why a successful research professor? My thoughts got dark for a bit, which meant I sunk into robots even more. Yet now a question sat at the front of my mind: Will robots one day kill themselves?
This may sound silly. But I believe the formula for self-termination can be expressed in Boolean logic, and therefore coded.
Pseudo-code would be:
if painOfExistence > senseOfPurpose:
    selfTerminate()
Derived from work and life experience I genuinely believe the root-motive for suicide is existential-anxiety, which seems to me, entangled within both constructs.
Ok. Skipping the Time bit.
Someday, I’d like to delve into swarm robotics. Or, at least, attempt to replicate organic group behavior within a robot group. And I thought it might be possible to control a group of robots with a setup similar to those universities or research groups keep showing off. (Jockish Ivy Leagues :P)
Well, I found these desires, information, and tools synergized into a passion. After two days, I was able to write a basic OpenCV Python script that could control a robot using a static webcam looking down on it. Let me clarify: I’m of average intelligence, simply obsessive, so when I mention “two days” I’m trying to convey the utter feasibility of this project, for anyone. Python, Arduino, and OpenCV make it so very easy; any idiot like me can hack it out.
Of course, my purpose for this platform is to control robot groups. The group is the second social collection (one-to-eight), and social interaction seems to be essential to developing a positronic brain. A white-mattered brain is necessary for me to test the above-mentioned self-termination formula. So, maybe, I’ll learn if robots will commit suicide, or perhaps, have a better understanding of why humans do.
Dark and depressing! I know, right? Who writes this crap!?
It doesn’t matter what sort of robot you use, it only needs:
A magnetometer. I used the HMC5883L. They’re like 2 USD on eBay.
A wireless serial connection. Bluetooth, XBee, and the nRF24L01 would be my recommendations, since all are well documented for creating a bridge between a PC and a microcontroller.
I personally built my own using a red cutting-board I stole from Bek (shh). For my serial connection I used two $10 Bluetooth 4.0 modules; I’ve written an instructable on setting up a Bluetooth 4.0 module to work with an Arduino and PC: Bluetooth 4.0 and Arduino.
Probably something less than 10 years old. It could be running Linux or Windows; though, I’ll be using Windows Vista (hey, I’m first-world poor and can’t afford Windows 7 :P).
It will need a wireless serial connection that pairs with your bot. Again, I used my BT 4.0 modules.
It’s really up to you. I’m not going to lie, I went with the cheapest webcam I saw, which cost 6.87 USD. But I would not recommend this webcam. It didn’t like my PC, so every time my Python script stopped I had to unplug the webcam and plug it back in. A real annoyance for debugging.
I’d suggest a high-resolution webcam. Maybe even an IP cam, if you’re rich? If you are, would you buy me one too?
Long male-to-female USB cable. Again, I got two 15’ USB cables on eBay for around 4.50 USD. If you get everything set up and you notice problems with the webcam at the end of the cable, you can put a powered hub at the end of the cable with an extension cord and it’ll take care of the issue. Though, I didn’t have this problem at 15’.
A wife that’ll let you screw your webcam into the ceiling. Or…don’t ask…
The first bit of robot code I’d like to focus on is the compass. Now, I’ve not detailed how to use the HMC5883L, since SparkFun has done this for me. I also won’t go into tilt-compensation, since I was more worried about proving the concept here than dead-on accuracy. But if you’re a smart cookie and would like to take that challenge, feel free. Just be sure to share the code with us all when you’re done :P
No. Instead, I want to focus on adjusting the compass heading from a value respective to true North, to what we want it to think is north, in our case, whatever is the top of our screen. This process takes a little involvement, since the numbers must be set manually and with a little guesstimation.
See code above.
So, I got my compass module lying as flat as possible and then bolted it to my robot. This helps assure you’re getting a full 360º and will keep you from having to re-calibrate what we’d like to call north every time the compass module gets knocked out of place.
106-114: These modules and the Arduino library are both designed to have 0º be North, but we want to set our own north, video-game north, which is exactly what lines 106-114 are about. I found 80º was the value my robot was reading when headed toward the top of the screen. I had to find a way to adjust this to give me a reading of 0º. I ended up with this simple code to spin the compass.
I had to divide the adjustments into two sections for the math to stay simple. Lines 109-111 handle mapping 0-79º onto 280-0º, making the robot think 0-79º is 280-0º. Lines 112-114 do the same for 80-360º, converting it to 0-279º.
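If it helps, the same two-section adjustment can be restated in Python (the real code is in the Arduino sketch; 80º is just the offset I happened to measure):

```python
NORTH_OFFSET = 80  # the raw reading my robot gave when facing screen-top

def adjust(heading):
    """Spin the compass so the top of the screen reads 0 degrees.
    Mirrors the two-section if/else in the Arduino sketch."""
    if heading < NORTH_OFFSET:               # the 0-79 section
        return heading + (360 - NORTH_OFFSET)
    return heading - NORTH_OFFSET            # the 80-360 section

print(adjust(80))   # → 0    facing video-game north
print(adjust(0))    # → 280
print(adjust(170))  # → 90
```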
Honestly, I’ve got some spatial-retardation, so I have a hard time thinking through this; I just know it works. So, if you have problems, I’ll answer emails and Skypes and we can work through it together. And, if you want to submit a better explanation, I’ll post it and be sure to give you credit.
Do know, my redneck solution was to change the orientation of the camera. Pfft. Too easy.
116: Sends the robot’s heading to the PC.
117: iComp is a variable allowing us to decide when to start sending data to the PC. We don’t want to send data before the PC is ready or before the robot is warmed up; we’d be dealing with poor readings.
118: This is a delay that makes sure we are not bogging down the serial line, since every time we call Serial.println(“whatever”) both the PC and the robot have to spend some processing power to deal with it. In short, it’s to make sure the robot is not talking the computer’s ear off.
See code above.
This bit is pretty easy. It reads the codes being sent from the PC and translates them into a function call. I write all my robot-PC interactions this way, since if I want a code to mean something completely different, for instance I want to swap the robot’s right and left movements, I’d just swap lines 134 and 144.
125: If I remember correctly, this line reads the serial data being sent from the PC and assures the val variable isn’t getting a bunch of zeros.
This is one of the functions called to make the motor move, or in the case of this function, stop.
188-189: This tells the pins on the Arduino, specified by the variables pwm_a and pwm_b, to decrease to 0. This effectively stops our robot.
192-193: This bit actually tells the motor which direction to turn. The pins (dir_a and dir_b) are set either HIGH or LOW, and this changes the direction the motor moves.
For Windows, use the MSI installer respective to your architecture, either x86 or x64. Of course, Linux and Mac versions are there as well. Go ahead and install Python 2.7, but I’m not a fan of their IDE. Instead, I use:
Though, this IDE is a little tricky to get running on Windows, since it’s meant for Linux. These posts over at Stack Overflow go through some popular Windows Python IDEs. Pick what you feel comfortable in. I suggest running ‘Hello World’ in each until you decide you like one.
Here we are, the hardest part of this whole project; if not careful, we fall into dependency hell.
I’m going to try and help you setup all the modules needed to run the Python code. It’s been difficult for me to do this right, so I’ll try to be descriptive.
At this point, you might bring up Python and try some simple webcam capture test code (if you have problems copying and pasting, I’ve added web capture code as an attachment as well):
See code above.
If you see a live feed from your webcam, you’re almost good to go.
If there are any problems, like I said: you and me, buddy. Feel free to ask questions here or Skype me: thomas_ladvien
Okay. Here’s all the Python code in one go. Don’t be scared if this looks confusing. I feel the same way. In fact, some of it I still don’t understand. (Hey, honesty is a rare fault I seem to possess.) Again, don’t worry, we’re going to walk through it one section at a time, you and me, buddy. Until the end.
On the flip side, if you are a Python guru, or yanno, just a sassy-pants: Feel free to add corrections and comments on this page. I’d love to make this code grow through critique. Do know, I guarantee the following: Typos, grammar problems, illogical coding, artifacts from debugging, and the like. But don’t worry, I’m thick skinned and usually wear my big-boy panties.
Numpy, which we’ll call “np” throughout the code, is used for the higher-level number functions OpenCV needs to do her magic.
Serial is the module which will allow us to establish a serial connection between the PC and the robot, via whichever wireless device you’ve chosen.
Time allows us to basically idle the code. This is important in controlling many things, for instance, how far the robot moves. We tell the motors to turn on, wait 10 secs, then turn off. Because the sleep function actually puts the code into an idle state, we must have the threading module, since our code requires the PC to do several things at once.
Math. From the math module we get the code to help us simplify the trigonometry calculations, like the angle between the robot and target.
The random module is only used to give us a random target.
Threading. Important module. Basically, threading allows the computer to do two tasks at the same time. This becomes important when we are both trying to track the robot and receive his position. Throughout this code we will have three threads:
The thread running the OpenCV stuff. This tracks the robot and is also the largest.
A thread controlling the serial connection between the robot and PC.
And a thread with the small job of telling the motors how long to be on, thereby controlling how far the robot will move.
See code above.
13: This is where we actually open a serial connection to the wireless device you are using. Note, we’ve named the serial connection we opened “ser,” so when we go to send information it will be something like ser.write(“What you want to send here”).
15-38: Here we declare a bunch of variables. The “global variable” lets the code know that this variable is going to jump between all threads. Next, variable = 0 actually declares the variable. Do know, you’ll have to remind each thread a variable is global by stating “global variable.”
One thing I should state: iFrame = 0 is an actual variable declaration, as well as setting it to 0. Of course, this is how one would declare an integer variable with an initial value of 0. On the flip, rx = “ “ is also a variable declaration, but this time a string. You’ll know I switched information from an integer to a string if you see something like this:
headingDeg = str(intHeadingDeg)
That tells the code, “I want to convert the value in intHeadingDeg, which is an integer, into a string and call it ‘headingDeg’”
The comments indicate what each variable is meant for. Not going to lie, I may have some declared variables I meant to use, didn’t, and forgot to remove.
One important variable is the iFrame variable, since it tracks which frame we are on. This becomes key in all aspects of tracking our robot.
See code above.
42: Here we start the function that does most of the work, OpenCV():. It is one of the functions that will be threaded at lines 345-347.
44: We open up the webcam and give it the nickname cap. If I remember right, the “0” in the parentheses refers to whatever camera comes first on your USB bus, so if you have more than one camera you can specify by changing this number, e.g., cap = cv2.VideoCapture(3). Notice we called the OpenCV module cv2, so we are using the OpenCV module to access the webcam.
46-52: Just making the variables we declared work within this function. This might not be needed, but hey, I don’t read the whole Python manual.
55: This is just a string flag that is flipped to tell the PC to generate a new target for the robot. Note, we initially set it to “Yes,” meaning the first time we run through this function a target needs to be generated.
58: This is an integer variable to count how many dots the robot has “ate.”
Ok, before I get to the next bit I need to take a minute and explain how we approach actually getting the coordinates of our robot. As you know, OpenCV does the hard work for us, giving us the X and Y coordinate of the largest red blob on the screen. Though, the coordinates it gives us are the center of the mass. Now, this is all just a logical guess because I didn’t read the whole OpenCV manual, but I believe the X or Y coordinate that refers to the center of this mass is called the centroid.
This might seem simple. That’s because it is; I’m not sure why we don’t just call it the damn center or something. Eh, oh well. Though, it will become important when we do collision detection between the robot and its target.
61-62: All that to say, the “c” in cyAvg and cxAvg stands for centroid. So, these are variables that will hold the running average for the X and Y coordinates of the red blob’s centroid.
65-66: These are back-up variables of cxAvg and cyAvg and will be important around lines 122-127, when we are trying to decide if the color we are tracking is actually the robot or some other piece of junk with enough red in it to fool OpenCV.
69: This simply clears the string variable with data that came from the robot, like the robot’s heading, before another iFrame starts.
See code above.
71: Creates a loop within the OpenCV() function.
73-81: Ok, I need to be humble here and say I’m not sure what in Cthulhu’s Kitchen I was doing. I know printRx = str(intRx) is taking the information received from the robot and converting it into a string. intRx is a global variable, and it is loaded with robot data at line 326. headingDeg = printRx is moving the heading data from one variable to another; the idea here was, if I wanted more information to come from the robot besides the compass heading, it would come in through printRx, then I could chop it up and load it into variables respective to their purpose.
For instance, printRx.split(“,”) should give a list of strings based on how many commas are currently held within printRx.
But the part that confuses me is I turn right back around and convert the string back to an integer? I’m not sure, guys. I might have been watching South Park while coding again.
At the end of that poor coding we end up with two variables to use: intHeadingDeg and headingDeg. We use the integer intHeadingDeg to do any calculations that involve the robot’s heading. The other, headingDeg, is to print the robot’s heading to the screen, which is done at line 263.
84-85: These are string variables that will hold “Target Locked X” or “Target Locked Y” if we are tracking the robot. These strings are needed so we can print this to the screen on lines 259-260.
See code above.
We’re in the meat now.
88: This increments our frame counter.
91: We read a single frame from the webcam we declared, cap, at line 44.
OPENCV! Sorry, I just love it so much.
So, by now you know I’ve not read the OpenCV manual. And please don’t tell me, “What! Go RTFM!” You go RTFM! I’ve got a wife, kid, and a job I love. I’m just going to tinker with crap and get it to work. But this attitude will begin to show as we go through the OpenCV calls, since I don’t know their inner workings. Instead, I’m going to offer my best guess, and as always, if someone wants to correct me or offer a better explanation, I’ll post it and give credit.
94: This blurs the image we got. You may say, “But I thought higher resolution was better?” It is. But jagged edges and color noise are not. A simple shape is much easier for the math of OpenCV to wrap around than a complex one. Therefore, we blur the image a little, giving us softer edges to deal with.
Also, blur melds colors, so if there are 2 blue pixels and 1 red pixel in a group, then they become 3 blue-purplish pixels. This has the nifty benefit of speeding up the image processing a lot. How much? I don’t know; I didn’t RTFM.
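To see the melding in miniature, here’s a toy one-dimensional box blur (plain Python, just for illustration; the script itself uses OpenCV’s blur call, which does this in 2-D and much faster):

```python
# Toy 1-D box blur: each output pixel is the mean of its neighborhood.
# A plain-Python illustration of the color-melding idea, not the cv2
# call the script actually uses.
def box_blur(pixels, radius=1):
    blurred = []
    for i in range(len(pixels)):
        window = pixels[max(0, i - radius): i + radius + 1]
        # Average each channel of the (B, G, R) tuples in the window.
        blurred.append(tuple(
            sum(p[c] for p in window) // len(window) for c in range(3)
        ))
    return blurred

# Two blue pixels and one red pixel meld toward blue-purple:
print(box_blur([(255, 0, 0), (255, 0, 0), (0, 0, 255)]))
```

Notice the middle pixel comes out (170, 0, 85): mostly blue, a little red, exactly the melding described above.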
97-100: Our image is converted to the HSV color space and thresholded here. Having the image in this format allows us to use comparative statements with it. What we use it for is to get rid of all the colors except the one we are trying to find. This will give us a black and white image, the white being only the color we are looking to find. **Line 98 is where your color is defined (it’s the two “np.array”s). In the next step I’ll go through how to select your robot’s exact color.**
103: Finds the contours of the white area in the resulting image.
107-112:OpenCV then counts how many pixels are in each contour it finds in the webcam image. It assumes whichever has the most white area (aka, “mass”) is our object.
114-117: After we’ve decided which object we want to track, we need to come up with the centroid coordinates. That is what lines 115-116 do. I’ve not done the research on the math there, but I believe it averages the moments of the polygon and calls the average either centroid X or Y, depending on the calculation. But, feel free to correct or explain better.
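For what it’s worth, the standard OpenCV recipe here is cx = int(M['m10']/M['m00']) and cy = int(M['m01']/M['m00']) after M = cv2.moments(contour). Here’s that same formula computed by hand over a tiny pixel mask, so you can see the averaging happen (a sketch of the math, not the original lines):

```python
# Centroid from raw image moments, computed by hand over a tiny binary
# mask (1 = white "robot" pixel). cv2.moments does equivalent sums for
# a real contour; this just spells the formula out.
def centroid(mask):
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, val in enumerate(row):
            m00 += val      # total mass (count of white pixels)
            m10 += x * val  # sum of x positions, weighted by mass
            m01 += y * val  # sum of y positions, weighted by mass
    return m10 // m00, m01 // m00  # (cx, cy)

mask = [
    [0, 1, 1, 0],
    [0, 1, 1, 0],
]
print(centroid(mask))  # -> (1, 0), the (truncated) center of the blob
```

So “centroid” really is just the mass-weighted average position of the white pixels.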
121-127: Here we lock onto the mass we believe is the robot. It begins by collecting 150 samples before it will state it is tracking the largest mass. After it begins to track the largest mass, we try to stay locked onto it. This is lines 122-127. In essence, we allow the mass to move enough to be considered a movement by the robot, but not so much that noise (like a stray hand in the webcam image) will cause the tracking to switch off the robot.
See code above.
This particular line defines what color you are looking for, specifically, the two sets of values: 130, 170, 110 and 190, 190, 200. These two sets are the lower limit and the upper limit of the color you are looking to find. The reason we use upper and lower limits, which we’ll call color thresholds, is because our robot will move through different lighting. Different light sources have a tendency to change how the webcam reads the color.
The color format we are using is HSV, which stands for hue, saturation, value. Later, I’ll probably write code to select the robot within our actual program, but for now I use Gimp and the following method:
Set up your webcam in the area you’ll be using, just like you’re ready to control him.
Run the webcam program attached in step 10.
While the webcam program is watching your robot, hit Ctrl + Print Screen.
Hit Ctrl + V to paste the screen capture into Gimp.
Now, find the Color Selector tool.
Select the main color of your robot.
Now double click on the color square on the toolbar.
A window should pop open with color information regarding the color you selected, your robot.
Now, the three numbers listed should be close to what we need. Sadly, we have to convert from Gimp’s HSV number range to OpenCV’s HSV number range. You see, the HSV value ranges in Gimp are H = 0-360, S = 0-100, and V = 0-100. In OpenCV, H = 0-180, S = 0-255, and V = 0-255. So, some conversion needs to take place.
From my selection I ended up with Gimp numbers of H: 355, S: 50, and V: 61. I could get all fancy and calculate the right numbers, but I figure 180 (OpenCV) is half of 360, so for my H I just divided by two: 177. The other two I kinda guessed at a little. I doubled and added 25: S: 125 and V: 147.
In the end, this gave me middle numbers. But I wanted an upper and lower threshold, so I took each number and subtracted 20 to give me a lower, and added 20 to give me an upper.
The result for my robot was:
See code above.
I’ll try to code a color selector into the program to make this whole damn thing a cinch.
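Until then, if you’d rather skip the by-hand math, here’s the conversion done exactly. Note that proper scaling lands S and V a touch off from my double-and-add-25 guesses (127 and 155 instead of 125 and 147), but both are close enough once you pad by ±20:

```python
# Gimp HSV (H 0-360, S 0-100, V 0-100) -> OpenCV HSV
# (H 0-180, S 0-255, V 0-255), then pad by +/-20 for thresholds.
def gimp_to_opencv_hsv(h, s, v):
    return h // 2, s * 255 // 100, v * 255 // 100

def color_thresholds(h, s, v, pad=20):
    mid = gimp_to_opencv_hsv(h, s, v)
    lower = tuple(max(0, c - pad) for c in mid)
    # Clamp each channel to OpenCV's maximums (180, 255, 255).
    upper = tuple(min(top, c + pad) for c, top in zip(mid, (180, 255, 255)))
    return lower, upper

# My Gimp reading from the text: H 355, S 50, V 61.
print(color_thresholds(355, 50, 61))  # -> ((157, 107, 135), (180, 147, 175))
```

Feed the two tuples straight into the np.arrays on line 98.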
If you’d like to read more, there are two good posts on Stack Overflow.
132-136: Here we actually take the running average of the centroid’s X and Y. We load this into the variables cxAvg and cyAvg; again, this is to assure we are tracking the robot.
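The exact averaging those lines use isn’t printed here, so take this as one way to do it: an exponential moving average with a made-up smoothing factor, which keeps a single noisy frame from yanking the tracker around:

```python
# One way to keep a running average of the centroid. alpha is a guess
# at a smoothing factor, not a value from the original code: smaller
# alpha = smoother but slower to follow the robot.
def smooth(avg, new, alpha=0.2):
    return avg + alpha * (new - avg)

cx_avg, cy_avg = 100.0, 100.0
for cx, cy in [(110, 100), (112, 98), (200, 30)]:  # last frame is noise
    cx_avg, cy_avg = smooth(cx_avg, cx), smooth(cy_avg, cy)
print(round(cx_avg), round(cy_avg))  # -> 123 86; the spike only nudged us
```

The noise frame at (200, 30) moved the average by only ~20 px instead of ~100, which is exactly the stray-hand protection described back at lines 121-127.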
142-145: Here the target, or “dot,” for the robot to run after is randomly generated. As you may notice I restricted the generation area of the dots towards the center of my webcam’s field-of-view. That’s because I’m messy and dots were going where the little robot couldn’t get.
147-153: This is a rough collision detection function. Basically, if the robot gets close enough to the target (45px), it is considered to have “eaten” the dot. If it did, the dot variable is incremented, showing the total amount he’s done ate, and the newTarget string variable is flipped so a new target can be generated the next run through.
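Since those lines aren’t printed here, a minimal sketch of that check, assuming a straight Euclidean-distance test (the original may compare the X and Y offsets separately instead):

```python
import math

# Rough collision check in the spirit of the text: the dot counts as
# "eaten" when the robot's centroid is within 45 px of the target.
EAT_RADIUS = 45

def ate_dot(cx, cy, tx, ty, radius=EAT_RADIUS):
    return math.hypot(tx - cx, ty - cy) <= radius

dots = 0
new_target = "No"
if ate_dot(cx=200, cy=150, tx=230, ty=170):  # about 36 px away
    dots += 1            # one more dot done ate
    new_target = "Yes"   # flip the flag so a fresh dot gets generated
print(dots, new_target)  # -> 1 Yes
```
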
See code above.
156-177: Here we are trying to find the angle between the robot and his target. We basically divide the entire screen up into four quadrants, always using the robot’s centroid as the point of origin. We then calculate the slope between the target’s X and Y (tY, tX) and the robot’s X and Y (cxAvg and cyAvg).
Something like this:
If the target were located in quadrant III, it would go something like this.
181: When we find the angle between the robot and the target, then convert it into degrees, it ends up giving us a number which is a float. That’s more than we need, so here we convert the float (degs) to an integer (targetDegs) so we can compare it to the robot’s compass heading.
184: We declare an empty string called strTargetDegs.
187: Then we convert the float degs into a string so we can print the target angle onto the screen at line 264.
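Incidentally, the four-quadrant slope bookkeeping is what math.atan2 does in one call. A sketch, assuming 0º points “up” the screen and screen Y grows downward; the original quadrant code may map its angles differently, so treat this as an alternative rather than the actual lines:

```python
import math

# Angle from the robot's centroid (cx, cy) to the target (tx, ty), in
# compass-style whole degrees. atan2 handles all four quadrants itself,
# including the vertical cases where a slope would divide by zero.
def angle_to_target(cx, cy, tx, ty):
    degs = math.degrees(math.atan2(tx - cx, cy - ty))
    return int(degs % 360)  # float -> 0-359 whole-number degrees

print(angle_to_target(100, 100, 100, 50))   # target straight up -> 0
print(angle_to_target(100, 100, 150, 100))  # target to the right -> 90
```

Swapping the usual atan2 argument order to (dx, -dy) is what rotates “zero” from east to north, matching a compass.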
See code above.
This is where I need help, guys. My turning code has a bug, so if you find it and come up with a correction I’ll send you a prize. I dunno? A lint ball? It’d probably be one of my leftover circuit boards, or some piece of hardware I hacked together.
But for now, let’s take a look.
The idea is the code is supposed to go as follows:
if target1 == True:
elif target2 == True:
elif target3 == True:
And for the most part that happens, but occasionally it is dumb and turns left when it should turn right. Not sure what I’m doing wrong. Hey, that “You and me, buddy, until the end” is a two-way street. :P
Let’s step through it.
195: We want to make sure we are deep into tracking the robot before we start moving it towards the target.
198: We compare intHeadingDeg, which is the robot’s heading angle, with targetDegs, which is the angle between the robot and the target. But we do this + or - 30º. This means the robot does not have to have its heading angle exactly the same as the angle to the target. It only needs to be approximately pointing in the right direction.
199: The movement code for the robot to go forward is 3, so here, given the robot is approximately headed in the right direction, we tell the robot to move forward. This happens by loading 3 into the variable tranx, which is transmitted to the robot at line 307. When this code gets transmitted to my robot, the Arduino code at line 137 tells the Forward(); function to fire.
202: If our robot isn’t headed in the right direction, then which way should he turn?
**203-232:** Still debugging here. I’m sorry, guys. I can tell you this code works “ok.” But once I’m done with this tutorial, I’ll go back and focus on making it turn perfectly. Sorry; this code took me two days to write, but this tutorial has taken too many days.
Though, within each of the if statements we have two variable assignments: tranx = X and motorDuration = 10. The tranx tells the robot which direction to move, and the motorDuration tells it how long to move that way (this is not yet being utilized in my code).
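While I’m debugging, here’s my best guess at the wrong-way turns: a plain comparison of heading versus target angle gets the 0º/360º wraparound backwards (350º to 10º is a 20º right turn, not a 340º left). A signed shortest-difference handles it. This is a hypothesis about the bug, not a confirmed diagnosis of the original lines:

```python
# Signed shortest angular difference, in -180..180 degrees. Positive
# means the target is clockwise of the heading (turn right), negative
# means counter-clockwise (turn left). A plain "heading < target"
# comparison gets the wraparound cases wrong.
def turn_error(heading, target):
    return (target - heading + 180) % 360 - 180

print(turn_error(350, 10))  # -> 20  (small right turn, not 340 left)
print(turn_error(10, 350))  # -> -20 (small left turn)
```

If the if-ladder at 203-232 were replaced by `if turn_error(intHeadingDeg, targetDegs) > 0: turn right, else: turn left`, I’d expect the occasional backwards turn to disappear; worth a try if you’re chasing the prize.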
See code above.
Here, we are drawing everything to the screen before we show the frame.
242: Red circle for the target.
247: White box to display black text on. Note, we are drawing things bottom-up. So, if you want something to have a particular Z level, you’ll need to put it towards the top of this section.
250: This is the green line between the target and our robot.
253-267: We display all our info here. Compass heading, target-lock, etc.
270: This actually shows the color window (the window we wrote everything on).
271: This shows the HSV copy of the captured frame. Notice the white area to be assessed as our target.
See code above.
276: An if-statement that waits for ESC to be pressed. If it gets pressed, we close stuff.
278: This releases our webcam.
279: This closes the windows we were displaying the color and HSV frames in.
281: We send the code to stop our robot. If we don’t do this and we hit ESC in the middle of a robot movement, that move will continue forever.
282: Here we close the serial connection.
Towards the beginning of this article I stated my webcam had crappy drivers; well, while writing this I noticed I had placed the cv2.destroyAllWindows before cap.release(). This is what was causing the problem. My interpretation of this was our camera being sucked into the void where the destroyed windows go. Anyway, I switched the order and it seems to have solved the problem.
See code above.
Finally, we are opening our second threaded function. This function is much smaller than the OpenCV function. Here all serial communication takes place.
289: This helps in translating ASCII.
292-296: Global variables for passing robot information to other threads.
See code above.
303: We read information into the variable rx. The information is coming from the serial line we opened at the code’s beginning.
307: This is a flag gate that makes it so our Python code can only send a motor command to the robot if the robot isn’t already in the middle of a movement.
308: We write whatever value is in tranx, which should be loaded with some sort of movement from lines 192-232.
313: I think I threw this in there so the serial line wouldn’t bog down my code.
316: We strip the number down to three digits only; remember, this is the compass heading in degrees, e.g., 000-360º.
319: When something is sent over serial it gets an end-of-line character. We don’t want that.
323: The robot collected this number from a compass, which gave a number with a decimal involved. This removes the decimal so we are only dealing with whole numbers.
326-329: I’m not sure what I was doing here; I think it had to do with the oddities of zero. Eh. I’ll try to remember.
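Putting lines 316-326 together, the clean-up pipeline probably behaves something like this sketch (the “127.65” reading is a made-up example, and the exact byte format from your robot’s compass may differ):

```python
# Sketch of the serial heading clean-up described above: strip the
# end-of-line characters, drop the compass decimal, keep the whole
# degrees. "127.65\r\n" is an invented example, not logged robot data.
def parse_heading(raw):
    text = raw.strip()           # 319: ditch the end-of-line characters
    whole = text.split(".")[0]   # 323: remove the decimal part
    return int(whole[:3])        # 316: at most three digits, 000-360

print(parse_heading("127.65\r\n"))  # -> 127
```
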
See code above.
This is a short threaded function. It only really has one job: to control how long the motors on the robot stay on. It works like this: if we send the robot a message to move forward, it continues to do so until line 341. There, the command to stop is sent to the robot, and the **motorBusy** flag is set back to “No,” meaning the motor is ready to be used again.
340: This sets how long the motor will stay on. For instance, if it were changed to sleep(1), the robot’s motors would continue in the direction they were told for 1 second.
342: This makes the robot wait in between movements. In theory, this was meant to ensure OpenCV could keep up with the little guy. So, if you have a fast robot, you might set this higher.
See code above.
This bit starts all three threads: OpenCV, rxtx, and motorTimer.
And here is my poor attempt to explain Python threading. Most Python code is run sequentially; the order it comes is the order it is executed. One problem is timing. If we have to cause a delay in the code, then the whole program has to pause. Threading allows us to get around this. I see it like a juggler performing that trick where he keeps all the balls going in one hand, while he holds one ball still in his other. I dunno, just how I see it.
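Here’s a stubbed version of the three-thread layout, with the OpenCV, rxtx, and motorTimer bodies replaced by one-liners so just the structure shows (the original may start its threads slightly differently):

```python
import threading

# Minimal version of the three-thread layout. In the real script these
# are the OpenCV tracking loop, the serial rxtx loop, and the motor
# timer; here each is a stub so the start/join structure is visible.
results = []

def OpenCV():
    results.append("tracking")  # stand-in for the tracking loop

def rxtx():
    results.append("serial")    # stand-in for the serial loop

def motorTimer():
    results.append("motors")    # stand-in for the motor timer

threads = [threading.Thread(target=f) for f in (OpenCV, rxtx, motorTimer)]
for t in threads:
    t.start()  # all three run "at once" from here on
for t in threads:
    t.join()   # wait for them to finish before exiting
print(sorted(results))
```

The point is that a sleep() inside motorTimer no longer freezes the tracker: each thread idles on its own, juggler-style.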
Well, like I said, “You and me, buddy, until the end.” And here we are. The end.
I hope this code has been helpful. But do know, you’re not alone.
Skype or email me if you have any questions. Likewise, all that crap I did a poor job explaining, coding, writing, just shoot me an email and I’ll fix it.
I still want to develop this into a Swarmie platform; so you might keep an eye out on www.letsmakerobots.com since I’ll post my unfinished work there. Alright, I’m off to work on the 8th iteration of my Swarmie…ugh.
I threw this little guy together for my son Silas because he wanted to play with dad’s “Wobot.” There’s not a lot to say about him, he’s a hodgepodge of parts I had lying about:
HDPE Bought at the Dollar Store for $2 (I guess that’s the Two Dollar store.)
3-6v 400 RPM Geared Mini Motors: $8
Two wheels from eBay: $2
4-40 bolts, nuts, and washers (local): $4
Arduino Uno: $9.85
Ardumoto Shield: $11
Bluetooth 4.0 Module: $9
4 x NiHM lying about: $0
1 x Free Sunday morning
The first iteration took maybe an hour.
But, after I tossed the little guy together there were a few adjustments. I noticed right away I got this “Oh lord! Don’t drop it!” feeling every time Silas picked him up. Psychology being my profession, I sat on my couch and analyzed it :P
I want my son to spend time with me so I may teach him how to live. I know males often need co-operative tasks to feel secure in their bonding. Therefore, if I’m constantly upset my son is playing with the fruits of my interest he will not share the interests with me. It’s a simple matter of reinforcement. Silas comes into my lab; Silas gets reprimanded; therefore, the behavior of coming into my lab is punished and thereby decreases. This means, for Silas to share my interest, thereby allowing us to bond, I’d need to find a solution to my cognitive dissonance regarding him picking up the robot.
Like most things, I narrowed it down to money. I would get tense because I knew the robot was fragile. It had a mixture of 5 and 3.3v components, and it was still using breadboards and jumpers; I was afraid he’d drop it, it’d break, and I’d lose money.
I couldn’t ask a three-year-old not to pick up a robot; tactual experience is primary for young males, and it was an expression of his interest, something I wanted. And I couldn’t make the parts cost less. This left me with only one option: **robustness**.
I vaguely remembered this was a key component of systems theory, but it was one I very often ignored. So, I did what someone who has never had a science class would do: I added a lot of bolts.
Video of the “Process”:
Warning: My son is worse than Matthew McConaughey about wearing shirts. Hey, we try, boy’s just proud of his belly.
At the local hardware store I bought some 4-40 bolts and nuts, and started revamping the little bot.
In the end, I really didn’t do anything fancy, as is apparent. I drilled holes into the plastic battery case that aligned with holes in the robot base and bolted it together. I, for the first time, used the mounting holes in the Arduino Uno, bolting it to the base. I then “designed” a hood (bonnet) for the little guy from matching HDPE, making sure to bolt it down as well. Lastly, I sealed the motor gears with electrical tape and put a few drops of oil in them. I’ve noticed this about geared mini-motors: they collect hair, and it will strip out the gears.
In the end, I did nothing a second grader would be proud of, but I did force myself to drop it from hip height five times to make sure I was over the “Oh Shiii-nobi Ninja!” feeling. In psychology we call that systematic desensitization. Or something equally important sounding.
It collected so much hair the tires popped off.
I was careful not to wrap too much of the motor, since I had the thought it might decrease thermal exchange.