The idea behind the Bitsy Spider board is a diminutive controller node. I wanted it to be cheap and versatile enough to use as a node, though the final price comes out to around $11 each.
1 x 3.3V LDO, 300mA (SOT-23-5) voltage regulator: $0.58
Total (approximate): $10.55
There are lots of solder-jumpers on this board, given it is meant to be versatile.
Here is the programming pinout to use an Arduino as ISP
The board is intended to harness the serial connection of the HM-10. In this version I kept it straightforward: leave the jumpers between the RX/TX lines of the HM-10 and the ATtiny unsoldered, and program the ATtiny as many times as you like. To test the serial connection between the ATtiny and the HM-10, simply breadboard the PCB and add jumpers like so:
PB0 <---> TX
PB1 <---> RX
This lets you test your code without having to solder and unsolder. Then, after your code is perfect-ish, solder the jumpers marked "PB1 & HM10 RX" and "PB0 & HM10 TX," then embed the Bitsy Spider.
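If you want something to run during that breadboard test, here's a minimal echo sketch. It assumes your ATtiny core provides SoftwareSerial, that the HM-10 is still at its default 9600 baud, and that the jumpers follow the wiring above (PB0 to the HM-10's TX, PB1 to its RX):

// Echo anything the HM-10 sends back over BLE -- a quick check of the
// PB0/PB1 jumpers before committing to solder. Assumes a tiny core
// that maps Arduino pins 0/1 to PB0/PB1 and ships SoftwareSerial.
#include <SoftwareSerial.h>

SoftwareSerial bleSerial(0, 1);  // RX = PB0 (HM-10 TX), TX = PB1 (HM-10 RX)

void setup() {
  bleSerial.begin(9600);  // HM-10 default baud rate
}

void loop() {
  if (bleSerial.available()) {
    char c = bleSerial.read();
    bleSerial.write(c);  // send it straight back over the BLE link
  }
}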
This is an option that'll probably carry through future versions of the board. I ran the GND connection of the ATtiny through an N-channel MOSFET and tied the gate of the FET to PIO1. The HM-10's PIO1 pin normally drives a connection-status LED, but one of the options you can set on the HM-10 keeps PIO1 low unless the HM-10 has a connection. This is set while the HM-10 is in AT mode by typing:
Type:
AT+PIO11
Response: OK+PIO11
When done, the ATtiny 85 will only power-up if the HM-10 has a connection. Of course, the solder-jumper is meant to bypass this feature.
The last solder jumper controls the HM-10's reset. If soldered, the ATtiny 85 can reset the HM-10 by pulling PB4 high for ~100 ms. I added this because I hope to create a star network with the ATtiny Bitsy Spider.
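For reference, the reset trick amounts to a single pulse. A rough sketch, with the pin number and active-high polarity assumed from the description above:

// Pulse PB4 high for ~100 ms to reset the HM-10 through the solder jumper.
// Pin mapping (Arduino pin 4 = PB4) and polarity are assumptions from the text.
const int HM10_RESET_PIN = 4;  // PB4

void resetHM10() {
  digitalWrite(HM10_RESET_PIN, HIGH);
  delay(100);                        // ~100 ms pulse
  digitalWrite(HM10_RESET_PIN, LOW);
}

void setup() {
  pinMode(HM10_RESET_PIN, OUTPUT);
  digitalWrite(HM10_RESET_PIN, LOW);
}

void loop() {
  // e.g., call resetHM10() after a lost connection to skip the ~10 s time-out.
}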
Here's a summary explanation: after the HM-10 loses its connection to another HM-10, a time-out prevents it from connecting to a different one for approximately 10 seconds. So far, there is no option to disable this "lost connection" time-out, but resetting the HM-10 (a <150 ms power-cycle) bypasses it. I'll post more on this setup when I've completely tested it. If there are questions, I've written a lot in the comments of my original HM-10 post, but also feel free to contact me.
One last thing I should mention.
I expect one major mistake and two minor ones on the first run of every board I send off. This board is no exception. I forgot the decoupling capacitors on the voltage regulator and the HM-10. I've added them on the v.02 board. This shouldn't be a major flaw, but without a capacitor on the voltage regulator it spits out 3.6V instead of 3.3V. Major problem. I saved this set of boards by soldering a 0402 1uF between the legs of the SOT-23-5 regulator. Not fancy, but it saved $5.
They might be small, but their extraordinary contrast and viewing angle more than make up for it. Plus, I mean, c'mon, they're $5. I will say I was a little annoyed that they operate at 3.3V, and I'm sure this means I'll be making a small level-converter board for them pretty soon. I estimate the converter board would be around $1.25, still a good price.

Some perks of OLEDs:

Wider viewing angle (i.e., you don't have to look straight down at it).
No backlight, making them flatter and less power hungry (not a lot less).
High refresh rate. The only time I saw a flicker was through my video camera. And I had delay(10); in my code :)
They are cheap(er?).
They're the future :)
The only downside that really jumped out at me was that the libraries take about 9k of flash once uploaded. The 64x64 LMR Bot image was around 1k.
When I got them I was worried I wouldn't be able to use them without digging into the datasheets. But come to find out, they were exactly the same unit as on Adafruit's boards. Sorry, I love you Ada...but...I can't afford $19.50. Now, maybe if Becky Stern came with them. Erm. Anyway, with Ada's excellent guides and software I had the LMR Bot moving around in about 10 minutes. So, I'll end up buying something from Ada to monetarily say, "Thank you, love."
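For anyone curious what "moving the LMR Bot around" boils down to, here's the minimal Adafruit-style pattern for pushing a bitmap to one of these SSD1306 OLEDs over I2C. The I2C address, reset pin, and placeholder bitmap are assumptions (following the older Adafruit library conventions), not the exact sketch used here:

// Minimal SSD1306-over-I2C bitmap example, Adafruit library style.
#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

#define OLED_RESET 4
Adafruit_SSD1306 display(OLED_RESET);

static const unsigned char PROGMEM lmrBot[] = {
  // 8x8 placeholder bitmap -- swap in real image data (e.g., a 64x64 LMR Bot).
  0x3C, 0x42, 0xA5, 0x81, 0xA5, 0x99, 0x42, 0x3C
};

void setup() {
  display.begin(SSD1306_SWITCHCAPVCC, 0x3C);  // 0x3C is a common address for these modules
  display.clearDisplay();
  display.drawBitmap(0, 0, lmrBot, 8, 8, WHITE);
  display.display();
}

void loop() {}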
UPDATE: Added info on making SPI programming jig (makes life a lot easier).
UPDATE: Added ATtiny 84 info (though, the post is meant for the ATtiny 85).
I've been non-traditional microcontroller curious for a bit. Then, I had to put a Digi-Key order together for some real cheap stupid stuff (some SOT-23 N-Channels for the LiPo charger circuit) and I thought, "What the hell, let's order some ATTiny 85s." Being cheap like I am, I ordered SMD:
This brought the price for one ATtiny 85 board to $1.53 each. This is great, since the ATtiny 85 has an internal oscillator up to 8MHz, allowing it to run without any passives.
I was pretty excited the day they came in. I soldered them together, put some headers on them, and tossed them into a breadboard. I switched over to Google and searched how to program these little guys. The first article I hit was the one I eventually used; I just warn you, dear reader, read carefully so you don't miss the bit about ignoring the error. Personally, like the dumb-arse I am, I programmed my first little ATtiny 85 a hundred times thinking it wasn't working before I caught the caveat in the instructable:
"It should give the following error twice: avrdude: please define PAGEL and BS2 signals in the configuration file for part ATtiny 85"
The TinyWire libraries were developed to be comparable to the Wire library for Arduino. To install them, just unzip them and place them in your Arduino libraries folder (e.g., C:\Program Files\Arduino\libraries).
Below is code meant to demonstrate the purpose of this project. It sets the ATtiny 85 up as an I2C slave. It receives data over the I2C line, parses it into an integer, then writes that value to pin 1 (PB1).
// the 7-bit address (remember to change this when adapting this example)
#define I2C_SLAVE_ADDRESS 0x4

// Get this from https://github.com/rambo/TinyWire
#include <TinyWireS.h>

// The default buffer size, can't recall the scope of defines right now
#ifndef TWI_RX_BUFFER_SIZE
#define TWI_RX_BUFFER_SIZE ( 16 )
#endif

// Character variable used to echo data back.
char chrSendData;

// Variables used in getting and parsing data.
char rxChrData;      // Receives the data.
char rxString[12];   // Variable for holding one string of data.
int rxIndex = 0;     // Used to index rxString.

// Integer for holding the pwm value received from master.
int pwmValA;

void requestEvent()
{
  TinyWireS.send(chrSendData);
}

// Handles receiving i2c data.
void receiveEvent(uint8_t howMany)
{
  if (TinyWireS.available()) {
    if (howMany < 1) {
      return;  // Sanity-check
    }
    if (howMany > TWI_RX_BUFFER_SIZE) {
      return;  // Also an insane number
    }

    howMany--;
    if (!howMany) {
      return;  // This write was only to set the buffer for the next read
    }

    while (howMany--) {
      // Gets i2c data.
      rxChrData = TinyWireS.receive();
      // Places the characters in an array one at a time.
      rxString[rxIndex] = char(rxChrData);
      // Increment the data array.
      rxIndex++;

      // If a stop character is read, parse the char array and convert it to a single integer.
      if (rxChrData == ':') {
        // Low-memory parse of the char array into an integer
        // (the - '0' converts each ASCII digit to its numeric value).
        pwmValA = 100 * (rxString[2] - '0') + 10 * (rxString[3] - '0') + (rxString[4] - '0');
        // Prints the parsed value.
        Serial.println(pwmValA);
        // Writes the parsed value to pin 1 (PB1).
        analogWrite(1, pwmValA);
        // Resets the char array index.
        rxIndex = 0;
      }
    }
  }
}

void setup()
{
  Serial.begin(9600);
  pinMode(1, OUTPUT);  // OC1A, the only HW-PWM pin supported by the tiny core's analogWrite

  TinyWireS.begin(I2C_SLAVE_ADDRESS);
  // Sets up the onReceive function (what we do if we get stuff).
  TinyWireS.onReceive(receiveEvent);
  // Sets up the onRequest function (what we do if asked to send something).
  TinyWireS.onRequest(requestEvent);
}

void loop()
{
  // Detects a stop sending command.
  TinyWireS_stop_check();

  // Puts the data we got into a variable to send back for error checking.
  chrSendData = char(rxChrData);
}
I've also included the code I used on my Arduino Mega, which was set up as the master. Your setup should not look like this :P
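That Mega sketch isn't reproduced in this section, but a master along these lines would exercise the slave above. The slave address and the ':'-terminated message format come from the ATtiny code; the padding in front of the digits is an assumption and may need tweaking depending on how the slave's receiveEvent counts its first byte:

// Hedged sketch of an I2C master feeding the ATtiny slave above.
#include <Wire.h>

#define I2C_SLAVE_ADDRESS 0x4  // must match the ATtiny sketch

void setup() {
  Wire.begin();  // join the bus as master
}

void loop() {
  // Two filler characters, a 3-digit PWM value, then the ':' stop character
  // the slave watches for.
  Wire.beginTransmission(I2C_SLAVE_ADDRESS);
  Wire.write("xx128:");
  Wire.endTransmission();
  delay(500);
}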
I've got several ideas I'd like to attempt with this setup. But, it is somewhat silly. The I2C reduces the ATtiny 85 to four pins. But one of those is the reset pin (PB5), so really, only 3 usable pins.
Before I started working with the Tiny I was lurking in the shoutbox and saw Protowrx chatting about making an ATtiny into a serially controlled motor driver. So, I set to it. I had chosen I2C because I wanted to make a setup like the DRV8830 (SparkFun has a breakout). Of course, like the numbskull I am, I didn't do the simple math before sinking hours into interfacing with the chip.
Most H-bridge ICs require three pins per motor: two digital pins for direction and one for PWM. Even cheaping out and using one PWM pin for both motors, that's still five. _And..._the ATtiny 85 has 8 pins. 1 x power, 1 x ground, 2 x I2C lines, which leaves us with....
4 pins.
Oh wait! One of those is the reset pin and cannot be used without losing the ability to program it without an AVR programmer (which I have, but what a pain in the ass). So! In short, there are 3 usable pins after interfacing with the ATtiny. I'd have done myself a favor if I had remembered an 80s classic.
Still, I've got it in my head to attempt something like this: 2 Pin HBridge control. Only, tying the PWM lines together. Not having much luck right now (only spent about 20 minutes; wanted to get this typed up before I forgot crap).
Another idea is to use a software serial connection to send one-way communication through the serial line. But it doesn't make much sense, since 4 pins aren't much better than 3, given my intentions.
Ok. In conclusion, I'm not sure why I did this. It really doesn't make much sense, other than the adventure. I'm sure it's one of those things I work out now and won't find a use for until much later. The real killer is thinking about how you can buy a full Arduino Pro Mini on eBay for $3.45. A little more than double the cost of an ATtiny 85, but triple the pins and utility. Eh!
Making an ATtiny Jig:
I hate breadboarding. Let me admit that. Mainly, it is having wires everywhere, my little dyslexic brain can't keep up. And when I first started working with the ATtiny uCs I found it to be a pain to have to move my little ATtiny's between my full circuit and the SPI programming circuit. So, I thought, "Why not make a SMD programming pad and jig interface?"
Well, here is the crude son-of-a-bitch:
It's nothing fancy, but it is a time saver.
I put both the interface pads and the jig in my Eagle library.
It was not too hard to put together; I set it up something like this. Then it's all about adding flux where the pins meet the PCB and soldering as you would usual header pins.
And here it is in action. It surprised the hell out of me, worked exactly like I wanted it.
I'm sure I'll eventually add some stabilizer bars between the two-PCBs and maybe a guide pin to prevent me from pressing the pins in the wrong holes :(
Still, it is MUCH easier than pulling it from the breadboard and moving it to a new circuit. Makes me happy.
I've been working on this one in silence for a bit.
Awhile back it hit me: before growing my Overlord project in complexity, I wanted to refine it for ease of use. Therefore, I began translating my Overlord project into a Python module I could build on.
I figure, this would make it easier for anyone to use. This includes myself, I've not forgotten my identity as a hack, nor will anyone who pops the hood on this module :)
But at its core, there are a few essential inputs:
Color to track.
Compass reading.
So, I spent some time translating the code into a callable module. This experiment was mainly for my own use, yet I knew it'd grow healthier if I had LMR's feedback, elder or noob.
When I started, I actually planned (gasp) out what would make this code more user friendly. I didn't have to think long; the two things that have taken the most tweaking to make this code useful are:

Adjusting the compass heading.
Selecting the color to track.
To address the first issue, I developed an "auto compass calibration" function.
def mapper(x, in_min, in_max, out_min, out_max):
    # This will map numbers onto others.
    return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min

def compass(headingDegrees):
    global compassInitFlag
    global initialRawHeading
    global intRx

    # This sets the first compass reading to our 0 degrees.
    if compassInitFlag == False:
        initialRawHeading = headingDegrees
        compassInitFlag = True
        print initialRawHeading
        exit

    # This is the part that actually offsets the compass reading.
    if headingDegrees >= initialRawHeading:
        adjHeading = mapper(headingDegrees, initialRawHeading, 360, 0, (360 - initialRawHeading))
    elif headingDegrees <= initialRawHeading:
        adjHeading = mapper(headingDegrees, 0, (initialRawHeading - 1), (360 - initialRawHeading), 360)

    # Here, our compass reading is loaded into intRx.
    intRx = adjHeading
Basically, this function takes the very first compass reading and adjusts all other readings relative to it. So, all you have to do is point your robot in the direction you want it to consider "north," start your code, and this function will convert all other readings.
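A quick sanity check of that remapping, if you paste the two functions above into a throwaway script (the numbers are only for illustration):

# The first reading becomes the new "north"; everything after is offset from it.
compassInitFlag = False

compass(90)    # first reading: 90 degrees becomes our 0
print intRx    # ~0

compass(100)   # 10 degrees past the initial heading...
print intRx    # ...reads as roughly 10

compass(45)    # 45 degrees shy of the initial heading...
print intRx    # ...reads as roughly 315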
The second issue took me a little longer to deal with: easy color selection. In short, I rewrote most of the color-detection parts of the code to take advantage of OpenCV's CamShift algorithm. This function is more resilient to lighting changes and other near-color objects, but it is also more CPU intensive. At some point, I'll probably go back and write a variant that sticks with the old largest-target-color-mass method.
Ok, what does this mean for the user? When the code starts, you select the color you'd like to track by left-clicking and dragging a selection box over an area. The mean color of the selected area will be tracked, and this also starts the rest of the code.
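For the curious, the CamShift pattern boils down to something like this (OpenCV 2.x style, matching the Python 2.7 used elsewhere in this write-up). The hard-coded ROI here stands in for the click-and-drag selection; this is an illustration, not Friendly Overlord's actual tracker:

import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ret, frame = cap.read()

# Pretend the user dragged this box over the robot's color patch.
x, y, w, h = 200, 150, 50, 50
track_window = (x, y, w, h)

# Build a hue histogram of the selected region.
roi = frame[y:y + h, x:x + w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

# Stop after 10 iterations or when the window moves less than 1 pt.
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    dst = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    ret, track_window = cv2.CamShift(dst, track_window, term_crit)
    # track_window now holds the tracked blob's x, y, width, height.
    x, y, w, h = track_window
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow('CamShift', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()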
What does Friendly Overlord give you?
Well, a lot. And when I finish writing the damn thing, more than a lot.
Here's a list, and only one bit is untrue.
It tracks your robot, providing its x and y relative to your webcam.
It will provide target coordinates, which I'll later make addressable in case someone wants to do something cool, rather than have their robot drive around and catch virtual dots. Lame.
It will take the compass reading you provide, translate it to a heading relative to the camera, then, it will send commands to your robot telling it to turn until it is in alignment, then move towards the target.
Make you a cuppa (CP, DanM, did I use that right?)
It will allow you to tweak pretty much any element of the code (e.g., overlord.targetProximity = 5)
What does it not do?
Take care of your serial data. You're on your own, bud.
Write your robot uC code for you.
Provide you with your robot's heading (though, when I delve into two-color detection this could be done with two-dots on your bot. But really, it'd be easier and near cheaper to get an HMC5883L).
Alright, so let's talk code. How little code does it take to use it?
import serial
from time import sleep
import threading
import overlord

# Initialize Overlord variables.
overlord.dVariables()

# Open COM port to tether the bot.
ser = serial.Serial('COM34', 9600)

def OpenCV():
    # Execute the Overlord.
    overlord.otracker()

def rx():
    while(True):
        # Read the newest output from the Arduino (read once, then reuse it).
        rx = ser.readline()
        if rx != "":
            rx = rx[:3]
            rx = rx.strip()
            rx = rx.replace(".", "")
            # Here, you pass Overlord your raw compass data.
            overlord.compass(int(rx))

def motorTimer():
    while(1):
        # This is for threading out the motor timer, allowing control over the
        # motor burst duration. There has to be something to write, and
        # the motors can't be busy.
        if overlord.tranx_ready == True and overlord.motorBusy == False:
            ser.write(overlord.tranx)
            ser.flushOutput()  # Clear the buffer.
            overlord.motorBusy = True
            overlord.tranx_ready = False
        if overlord.motorBusy == True:
            sleep(.2)  # Sets the motor burst duration.
            ser.write(overlord.stop)
            sleep(.3)  # Sets the time in between motor bursts.
            overlord.motorBusy = False

# Threads the OpenCV stuff.
OpenCV = threading.Thread(target=OpenCV)
OpenCV.start()

# Threads the serial functions.
rx = threading.Thread(target=rx)
rx.start()

# Threads the motor functions.
motorTimer = threading.Thread(target=motorTimer)
motorTimer.start()
This is fully functional code. You'll notice that really only about 10 lines get Friendly Overlord going; the rest handle the serial functions and motor firing. Be warned, the motor-firing code will change. It's written how I like it right now, but eventually it will be redesigned to be as flexible as possible.
Walkthrough:
overlord.dVariables() #Sets the Friendly Overlord variables.
overlord.otracker() # The module's heart. Handles color tracking, angle calculation, etc.
overlord.compass(x) # You pass it a compass heading as an integer in degrees (0-360) and it does the rest.
overlord.tranx_ready # Simple flag to indicate the last bit of serial data has been sent.
overlord.tranx # Variable that contains the serial command to be sent to the robot.
overlord.motorBusy # Flag to indicate if the robot is still in the middle of a movement.
That's about it. In the module? 399 lines of code, or so. Still relatively small for a program but not something I want to wade through without a damned good reason.
Ok. So, where am I going with this?
Hell if I know. I want to make it as versatile as possible. Eventually, I'd like to be tracking any number of robots. I envision a swarm of Yahmez' Baby bots flying all over the place, Friendly Overlord tracking them and communicating with them via IR.
But in the more immediate future, I'd like to make every variable tweakable. Especially, variables useful to others. For instance, the overlord.tX and overlord.tY are currently controlled by the module. They are simply randomized numbers. But, I'll make a flag in the next two days to take control of them from your own code. You can decide where you'd like your robot to go. Whether it be to your mouse pointer (overlord.targetY = overlord.mouseY) or a complex set of way-points to lead him through a maze. Really, I'll probably code around the feedback I get.
Now, some obligatory stuff.
Here are some of the current variables addressable from your program:
# How close does the robot need to be? Greater is less accurate.
# Defaults to 5.
overlord.targetProximity = 5

# GUI X, Y.
# Defaults to 0, 0.
overlord.guiX = 440
overlord.guiY = 320

# Random target constraint, so the target doesn't get placed too far from center.
# Defaults to 1, 640, 1, 480.
overlord.targetLeftLimit = 20
overlord.targetRightLimit = 400
overlord.targetBottomLimit = 320
overlord.targetTopLimit = 20
But I'd like to make every variable needed by the user available.
Ok. So, here's what I need:
Someone to use it and provide feedback.
I'm getting too close to it and bleary of thought.
I've thought of doing a few things to get some feedback:
Setup a challenge (I've got some surplus).
Offer to mail one person a month a setup (two Bluetooth PCBs and a cheap webcam).
Any suggestions?
I think I'll make a walkthrough video pretty soon (kinda miss making stupid videos) but I'm a little worn out right now.
I've been working on re-making the Open Hardware Pulse Sensor so it'd be easy to send off to OSHPark and to make at home. I'm not sure, but I think I started this project in March and I've just now finished it.
The bit of encouragement I needed came when hackaday.com put it up as their "Fail of the Week." I thought I was going to be mature about it. But those four red letters started eating at me, so I gave it another go. Weirdly, I got it working.
I believe there were three problems:
I had mixed up the op-amps again. In my defense, I've got 5 different ICs flying about in the same package as the op-amp.
The Arduino I'd been plugging into was sitting on a surface that provided enough conductivity to create noise between the 3.3V pin on the underside and A0, which I was using for the op-amp input.
Every time I touched the sensor the exposed vias were shorted through my own conductivity. Stupid mineral water.
The light sensor is the hardest bit, so take your time. I put a little bit of solder on each pad with my soldering-iron, then, cover the soldered pads in flux. Next, I attempt to align the light-sensor with the pads as close as possible. After, I put the board with the backside on an over-turned clothes iron. Let the iron heat up until the solder reflows and the sensor is attached.
Flip the sensor and lock it to your surface with tacky putty to solder the LED, passives, and op-amp. I won't detail this, since my video shows the entire process.
Wrap it with tape, cutting a small hole for the LED and light-sensor.
(I'll come up with a better solution, and a way to clip it to your body, on the next iteration.)
New angle. I finished my ATtiny Bitsy Spider (ABS) board and wanted to do something with it. While stringing it together I had the thought of replacing the Arduino Pro Mini and the Servo Helper board with the ABS. Cost-wise, it will be slightly more expensive ($1.50 or so?) but much smaller and a lot less hassle.
I've read several people had mixed results getting an ATtiny to control servos. Of course, I'm no better. But I was able to get acceptable functionality out of them (i.e., controlling continuous rotation servo speed, direction, braking). Anyway, here's kinda how I approached the servos on the ATtiny 85.
I found several blogs about getting servos to work on the ATtiny, but ultimately I used the Servo8Bit library (note, for ease of use I'm linking the "Arduino version" below, not the AVR one).
It doesn't seem real friendly, but in a hack's opinion, it seems like great code that is incomplete -- I hope someone corrects me if I'm off. The problem I had, and I believe others have had, was the library using Timer1 for servo timing. The Tiny cores (at least the ones I'm using) use Timer1 for basic functionality, creating a conflict. This presented itself as an inability to use the delay() function; it was simply as if it had no effect. That's when I popped the hood on the library itself. In the header files there is an option for which timer to use. So, I switched it from Timer1 to Timer0 and tried the code again. Great, delay() seemed to work now, but the ability to control the servos was gone. As soon as myServo.attach(3) was called, the servo would spin at full speed in one direction. Damnit.
I didn't feel like digging through the rest of the library trying to debug something I only half understood. So, I began researching. After a bit, I came upon this thread. Seems this fellow WireJunky was trying to figure out how to do the same thing: control continuous-rotation servos with an ATtiny. At the end, Nick Gammon told him he should just create his own timer function.
Anyway, I hacked this code out after reading the thread and was surprised it did what I want.
I'm a hack hacking with a hacksaw!
// Basic Jot movement using ATtiny Spider
#include "Servo8Bit.h"

void mydelay(uint16_t milliseconds);  // forward declaration to the delay function

Servo8Bit myServoA;  // create a servo object.
Servo8Bit myServoB;

void setup()
{
    myServoA.attach(3);  // attach the servo to pin PB3
    myServoB.attach(4);
    mydelay(1);
}

void loop()
{
    myServoA.write(160);  // tell servo A to go to position 160
    myServoB.write(50);   // tell servo B to go to position 50
    mydelay(2000);        // wait 2 s

    myServoA.write(90);   // 90 should stop a continuous-rotation servo
    myServoB.write(90);
    mydelay(2000);        // wait 2 s

    myServoA.write(50);
    myServoB.write(160);
    mydelay(5000);        // wait 5 s
}

void mydelay(uint16_t milliseconds)
{
    for (uint16_t i = 0; i < milliseconds; i++)
    {
        delayMicroseconds(1000);
    }
}  // end delay
There are a few issues. It seems my B servo has some jitter in it; it doesn't like to stop at myServoB.write(90). I tried calling myServoB.detach(), then myServoB.attach(3), in a hackish attempt to stop the servo. It'll stop but won't re-attach.
Anyway, even if troubleshooting doesn't work out, I have some workarounds. For example, running the VCC for the servos through a P-channel MOSFET controlled by the ATtiny; it'd take an extra pin but would let me reliably stop them. Though, I believe this lack of "centeredness" is due to either a cheap 0805 I used in the conversion or other noisy stuff I have on the PB4 pin.
Of course, to use the ABS as a replacement brain on the Jot, I'll need to create a star network with the ABSes, write a library to control the HMC5883L from the ATtiny, make sure there are no other timing issues, and fit it all in 8k of flash. Ugh. Right now the code size is around 3k with the servo and serial libraries.
UPDATE: 12/24/13
Well, I don't know what to say. I think I'm going to take a break from this build for a bit and focus on finishing the Overlord projects with the Dot Muncher.
I discovered what was causing my problems with the NRF24L01. It wasn't the voltage-regulator. It was the 1uF 0805s filtering the regulator. I replaced the unknown capacitors (ones bought off of eBay) with some from Digi-Key that were rated 25v. This fixed the problem and I had the Jot communicating nicely as I had hoped.
Of course, that wasn't the end of the problems. I discovered the HMC5883L board was shorting, I believe, every time I programmed the board. It's pissing me off. I've burnt four compass boards and two Arduino Pros over the issue (smoked around $15 in parts). It has something to do with the HMC5883L's I2C lines feeding backward through the board whenever the Arduino goes low. It causes the voltage regulator on the HMC5883L board to pop almost right away. Of course, it does slight damage to other parts of the connected boards. I didn't know it at the time, but I believe this backward current was also the reason the filtering capacitors were damaged.
Stuff Burnt on the Pyre of Stupidity -->
That's not the only issue. The code I've got together for the NRF24L01 doesn't play nice with the HMC5883L library. But I can't tell how to rewrite the code in a way they're both happy with while the f'in boards keep burning up. Sigh. Nevertheless, I think my next step, when I've got my gusto back, will be to make a complete schematic of the Arduino Pro Mini, Little Helper Board, and the HMC5883L. The first two I already have schematics for, and I have enough HMC5883L boards to pull the components off and reverse engineer the PCB.
Still, I'm a little depressed. I feel like I'd have been better off making the boards myself. At least then I would know exactly how they are strung together and could only blame myself for issues.
I also feel like Frits needs to put together a "Robotics Fail of the Week" so I can be its first highlight. Man, even looking at that picture now makes me feel like I suck at life. Oh well, I'm going to list the good stuff I've learned from this.
Reverse current is a bitch--low-drop diodes are your friends.
I have put together code that makes the NRF24L01 have closer to Bluetooth functionality. Though, it doesn't like being in the same code as the Wire library.
Cheap parts require you to be time rich.
The NRF24L01 isn't really meant for streaming data. I knew this going in, but I didn't understand how it really plays out in the code. The NRF takes a lot of code management, unlike other devices that are hardware managed or SoCs. This makes the NRF highly sensitive to whatever else your code is doing; in my case, running servos, communicating over I2C, and doing floating-point math. As I progress in this build, I feel I'm taxing the NRF's functionality beyond its ability.
It is better to learn the circuits of all boards connected to yours. It might initially take more time, but in the end it saves time and money.
If I fail at something, although looking ridiculous is not fun, documenting the failure makes me feel better. Like it meant something. Even if that something is, "Hey world, I'm an idiot." :)
UPDATE: A Jot of Trouble
I didn't want to float this post until I had something working to report, but I missed writing for a change. I've been working on the little Jot frequently over the last few months. Yet, I keep running into problems. The NRF24L01s are acting silly; one day they work, another they don't. I guess we can't make woman jokes now that Roxanna77 is here? (If you read this, Roxanna, just wanted you to know I had to make sure my wife didn't read this post; it'd be hell in my house.)
I have reworked the servo board (v.9.5) to include a double 90-degree header. One set is to attach the servos; the other is to attach the compass (HMC5883L). This was meant to make the hardware more compact and modular, and to keep the compass level for a better reading. Oh yah, and I burnt a few HMC5883Ls trying to connect them with crappy braided wires.
Also, I've added solder-jumpers to tie the 3.3 SOT-23-5 voltage regulator's enable pin to either high or low, depending which one I mistakenly buy.
On the top side I've included an SMD voltage divider running straight into analog pin A3. My intention is to let the Jot keep an eye on its battery voltage as a way of sensing how "hungry" it is.
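On the firmware side, reading that divider is just an analogRead() and some scaling. A sketch of the idea; the 2:1 divider ratio and 5V reference below are placeholders, so use whatever matches the actual resistors and supply:

// Read the battery-sense divider on A3 and scale back up to the pack voltage.
const float DIVIDER_RATIO = 2.0;   // Vbatt = Vpin * ratio (placeholder value)
const float VREF = 5.0;            // ADC reference (placeholder value)

float readBatteryVoltage() {
  int raw = analogRead(A3);                // 0-1023
  float vPin = (raw / 1023.0) * VREF;      // voltage at the pin
  return vPin * DIVIDER_RATIO;             // back-calculated pack voltage
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(readBatteryVoltage());    // how "hungry" is the Jot?
  delay(1000);
}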
I've added a 3.3v pin on the new double 90-header, in case I've a 3.3v sensor elsewhere on the bot. I dunno, I was trying to use all the extra pins I had.
Of course, since I've learned how to tent vias, I've also tented the vias on the board with hope I'll save myself a fateful short or two.
I'll eventually try to replace those bulky headers with what I've affectionately begun to refer to as "those short, round headers." I like these little headers because of how utterly small they are. Of course, they are small, but the bulk of their body does not make it through thicker PCBs. This is due to the pin flaring closer to where the plastic header is; the flare prevents the short-rounds from sinking through the typical header hole on most boards. But I've got calipers, Eagle CAD, and OSHPark, so I made a little library of header holes that will let these pins slip neatly through the board and mate with short-rounds on the other side. I sent off to OSHPark for a sample, so I'll report back when I've tested them.
On my original version of the servo board (by the way, I refer to it as the Little Warmie Helper board, or LWH board) I had used a different voltage regulator that cost more. The only difference I found between the two was the output: the first I used puts out 200mA and the second 150mA. I really didn't think this made a difference, given what I could find in the datasheet. I know there are passives affecting the power consumption, but it's the only info I could find (datasheet, pg. 8). The NRF24L01 was using around 11.3mA as a transmitter and 13.5mA as a receiver. Even though I didn't know the power consumption of the passives, I believed I was well within range to use the cheap 150mA voltage regulator. But experience has proven otherwise.
This is where I ask the professionals to come in and tease me about missing something simple.
The only theory I could invent, given my limited understanding of electronics, is that the NRF24L01 only averages 11.3/13.5mA, but its burst draw exceeds the constant 150mA the cheap regulator can supply? I don't know. I'm at a loss.
Of course, this is pure speculation. I'm currently out of higher-output voltage regulators (I should have some more by the end of the week). But I can leave the NRF24L01 in place on my LWH board, solder jumpers onto the 3.3V and GND pins, and get the NRF24L01 to function properly. This makes me believe the fault lies directly with the inadequacies of the voltage regulator and not my board design (though its inadequacies, I'm sure, are glaring).
Anyways, this is where I am with the little Jot.
A couple of notes. I have a backup design of the Jot that I'm working to get cheaper than $25, which uses BLE (yes, those HM-10s I'm in a love affair with). Also, I decided that if I was to get serious about the Overlord projects, I'd probably do better turning it into a Python module, which I've been doing in silence and which is around 90% done. I'll try to have it up before the end of the year. I need to finish several functions.
UPDATE: Progress on NRF24L01 code for working between PC, Hub, and Robot.
So, here is my attempt at a swarmie build. Not much here yet; it's simply a personal build log until I get an iteration cheap enough, then I'll start incorporating them into the Overlord projects.
I have to bow to Bajdi; those little NRF24L01s take a lot more brainpower than simple ole' Bluetooth. I tried for some time to write my own code that would send and receive bytes to or from the other node. After a little hair pulling I gave up and started reading others' code. I came across Robvio on the Arduino Forums, who had some rather nifty code that I left nearly intact.
#include <SPI.h>
#include "nRF24L01.h"
#include "RF24.h"

RF24 radio(8, 7);

// Radio pipe addresses for the 2 nodes to communicate.
const uint64_t pipes[2] = { 0xF0F0F0F0E1LL, 0xF0F0F0F0D2LL };

// For Serial input
String inputString = "";         // a string to hold incoming data
boolean stringComplete = false;  // whether the string is complete

// NRF packages
byte SendPackage[32];
byte ReceivePackage[32];
boolean sending = 0;

void setup(void)
{
  // Print preamble
  Serial.begin(9600);

  radio.begin();
  // optionally, increase the delay between retries & # of retries
  radio.setRetries(15, 15);
  radio.setPayloadSize(32);

  radio.openWritingPipe(pipes[1]);
  radio.openReadingPipe(1, pipes[0]);
  radio.startListening();
  //radio.printDetails();
}

void loop(void)
{
  // check for NRF received
  NRFreceive();
  // check for Serial received (or filled by NRF)
  Serialreceive();
}

void serialEvent()
{
  Serial.println("Event");
  while (Serial.available()) {
    char inChar = (char)Serial.read();
    inputString += inChar;
    if (inChar == '\n') {
      stringComplete = true;
    }
  }
}

byte NRFsend(String NRFPack = "")
{
  NRFPack.getBytes(SendPackage, 32);
  radio.stopListening();
  radio.openWritingPipe(pipes[0]);
  radio.openReadingPipe(1, pipes[1]);

  bool ok = radio.write(SendPackage, sizeof(SendPackage));
  if (!ok) Serial.println("NRFerror");

  radio.startListening();
  unsigned long started_waiting_at = millis();
  bool timeout = false;
  while (!radio.available() && !timeout)
    if (millis() - started_waiting_at > 200) timeout = true;
  if (timeout) {
    Serial.println("NRFerror");
  }

  radio.openWritingPipe(pipes[1]);
  radio.openReadingPipe(1, pipes[0]);
}

void NRFreceive()
{
  if (radio.available()) {
    //byte ReceivePackage[32];
    bool done = false;
    while (!done) {
      done = radio.read(&ReceivePackage, sizeof(ReceivePackage));
      delay(5);
    }
    radio.stopListening();
    inputString = ((char *)ReceivePackage);
    stringComplete = true;
    radio.write("1", 1);
    radio.startListening();
  }
}

void Serialreceive()
{
  if (stringComplete) {
    if (inputString.startsWith("T:")) {
      NRFsend(inputString.substring(2));
    }
    if (inputString.startsWith("S:")) {
      Serial.print(inputString.substring(2));
    }
    inputString = "";
    stringComplete = false;
  }
}
The way this code works is much like a software-and-serial-simulated Bluetooth module. To send serial data, you type something with a prefix code, T for transmit and S for serial print, ending with a newline character (\n). For example, typing the following in the terminal on module A:

T:S: My message \n

will send "My message" to module B, then print "My Message" to the serial line on module B.
If you type,

T: My message \n

this will transmit "My message" from module A to module B, but it will not be printed to the serial line on module B.
I'll let you guys look the code over and tell me if I can improve it for what I'm doing. Right now, I've tested it with some basic Python code to send a serial message to my hub (Arduino Uno and NRF24L01), which relays it to the robot (Arduino Pro Mini and NRF24L01).
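If you want to try the same kind of test, a minimal pyserial example using the T:/S: prefixes described above would look something like this (the port name and timeout are placeholders, not the values I actually used):

# Send a message through the hub so the far node prints it on its serial line.
import serial

hub = serial.Serial('COM3', 9600, timeout=1)   # hub's COM port (placeholder)
hub.write("T:S: My message \n")                # relay to module B and print it there
print hub.readline()                           # any "S:" traffic the hub relays back shows up here
hub.close()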
I threw this little guy together for my son Silas because he wanted to play with dad's "Wobot." There's not a lot to say about him, he's a hodgepodge of parts I had lying about:
HDPE Bought at the Dollar Store for $2 (I guess that's the Two Dollar store.)
3-6v 400 RPM Geared Mini Motors: $8
Two wheels from eBay: $2
4-40 bolts, nuts, and washers (local): $4
Arduino Uno: $9.85
Ardumoto Shield: $11
Bluetooth 4.0 Module: $9
4 x NiMH lying about: $0
1 x Free Sunday morning
Total: $36.85
The first iteration took maybe an hour.
But, after I tossed the little guy together there were a few adjustments. I noticed right away I got this "Oh lord! Don't drop it!" feeling every time Silas picked him up. Psychology being my profession, I sat on my couch and analyzed it :P
I want my son to spend time with me so I may teach him how to live. I know males often need co-operative tasks to feel secure in their bonding. Therefore, if I'm constantly upset that my son is playing with the fruits of my interest, he will not share those interests with me. It's a simple matter of reinforcement. Silas comes into my lab; Silas gets reprimanded; therefore, the behavior of coming into my lab is punished and thereby decreases. This means, for Silas to share my interest, thereby allowing us to bond, I'd need to find a solution to my cognitive dissonance regarding him picking up the robot.
Like most things, I narrowed it down to money. I would get tense because I knew the robot was fragile. It had a mixture of 5V and 3.3V components, and it was still using breadboards and jumpers. I was afraid he'd drop it, it'd break, and I'd lose money. I couldn't ask a three-year-old not to pick up a robot; tactual experience is primary for young males, it was an expression of his interest, and it was something I wanted. And I couldn't make the parts cost less. This left me with only one option: robustness.
I vaguely remembered robustness being a key component of systems theory, but it was one I very often ignored. So, I did what someone who has never had a science would do: I added a lot of bolts.
Video of the "Process":
Warning: My son is worse than Matthew McConaughey about wearing shirts. Hey, we try; the boy's just proud of his belly.
At the local hardware store I bought some 4-40 bolts and nuts, and started revamping the little bot.
In the end, I really didn't do anything fancy, as is apparent. I drilled holes into the battery's plastic case that aligned with holes in the robot base, and bolted it together. I, for the first time, used the mounting holes in the Arduino Uno, bolting it to the base. I then "designed" a hood (bonnet) for the little guy from matching HDPE, making sure to bolt it down as well. Lastly, I sealed the motor gears with electrical tape and put a few drops of oil in them. I've noticed this about geared mini-motors: they collect hair and will strip out the gears.
In the end, I did nothing a second grader would be proud of, but I did force myself to drop it from hip height five times to make sure I was over the "Oh Shiii-nobi Ninja!" feeling. In psychology we call that systematic desensitization. Or something equally important sounding.
It collected so much hair the tires popped off.
Bleh.
I was careful not to wrap too much of the motor, since I had the thought it might decrease thermal exchange.
NOTE: Try as I might, guys, I can't get the numbers to line up in my HTML version of my code. Instead, you might just load it into Geany or Notepad+ to follow along, since I indicated things by the line number. I'm sorry, I'm out of patience for it.
These are redneck instructions on how to control a robot with a static webcam for under 50 USD.
I'm a robot builder and I got tired of seeing universities play robot soccer or something with computer vision guiding their players, and no matter how much I begged, darn Ivy Leagues wouldn't share.
So, I wrote my own. And while I did it, I swore I'd try to help anyone trying something similar.
The PC averages these X, Y positions for around 150 camera frames.
If the blob hasn't moved much, the PC assumes the red blob is the robot.
The PC gets frisky and gives our robot a random target within the webcam's field-of-view.
The PC calculates the angle between the bot and the target.
Meanwhile, the robot's microcontroller is taking readings from a magnetometer on the robot.
The robot, with a one time human calibration, translates true North to "video-game north," aka, top of PC's screen.
The microcontroller transmits this code to the PC.
The PC compares the bearing from the bot to the target with the robot's heading (a rough sketch of this decision follows the list).
The PC sends a code to the bot telling it to turn left, turn right, or move forward (closer to the target).
When the robot has made it within an acceptable distance from the target he "Munches the Dot."
A new random dot appears. Rinse, repeat. (For real though, don't rinse the bot. Consider Asimov's Third Law.)
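Here's that rough sketch of the turn-or-go-forward decision mentioned in the list. It's an illustration, not the project's actual code; the angle conventions (especially which sign means "right") depend on your camera orientation and compass setup:

# Compare the bearing to the target with the robot's heading and pick a command.
import math

def steering_command(robot_x, robot_y, target_x, target_y, robot_heading_deg, tolerance=10):
    # Bearing from the robot to the target, in degrees, 0-360.
    bearing = math.degrees(math.atan2(target_y - robot_y, target_x - robot_x)) % 360
    # Smallest signed difference between where we point and where we should point.
    error = (bearing - robot_heading_deg + 540) % 360 - 180
    if abs(error) <= tolerance:
        return "forward"
    # Which side counts as "right" depends on your heading convention.
    return "right" if error > 0 else "left"

print steering_command(100, 100, 200, 100, 0)    # already pointed at the target -> "forward"
print steering_command(100, 100, 200, 100, 90)   # off by 90 degrees -> a turn command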
About Me: (skip, it's boring)
I'm a homeless outreach worker. The job's amazing, but I'll say, emotionally taxing. Skipping the politics and the sermon on harm-reduction, I decided at the start that I needed something far from the job to let my mind rest and prevent compassion fatigue. Something that consumed my brain-power so I'd not be stressing over the 6-months-pregnant 17-year-old shooting up under a bridge on I-35. Something to protect my down-time so I'd be frosty for the next day.
Well, I saw that TED talk about the Sand Flea and I told Bek, "That's awesome, think I could build one?"
"Oh crap," she said, "new obsession?"
Now, robots are my relief. My way to prevent white-matter from becoming dark-matter as I rake through sludge looking for those who want out.
I started reading a lot. I discovered Arduino, Sparkfun, eBay, Raspberry Pi, ferric chloride, Python, hackaday, HC-SR04, Eagle, OSHPark, and the list goes on. But every time I Googled something about robots, I'd end up at the same place.
These guys are brilliant. They are a college education from collaboration, I swear.
Soon, I ended up with my first bot. A piece of sh...short-circuits. Although, I did learn a lot interfacing the bot with the Raspberry Pi. Also, while I was working with the Raspberry Pi, I played with OpenCV and was considering adding a face tracker to my bot before I got distracted. But before I quit, I created a proof-of-concept.
So, all these experiences began to culminate.
Meanwhile, I was taking a graduate Research Methods class at UTA and my professor disappeared. The university was amazing; good professors filled in and made sure our education didn't suffer. But we wondered for many months. Sadly, it was discovered he had killed himself.
It shook me. I deal with suicidality every other day, but it's usually on the street. Why a successful research professor? My thoughts got dark for a bit, which meant I sunk into robots even more. Yet, now, a question sat at the front of my mind:
Will robots one day kill themselves?
This may sound silly. But I believe the formula for self-termination can be expressed in Boolean logic, and therefore coded.
Pseudo-code would be:
if painOfExistence > senseOfPurpose then:
self_terminate()
Derived from work and life experience, I genuinely believe the root motive for suicide is existential anxiety, which seems to me entangled within both constructs.
Ok. Skipping the _Time_ bit.
Someday, I'd like to delve into swarm robotics. Or, at least, attempt to replicate organic group behavior within a robot group. And I thought it might be possible to control a group of robots with a setup similar to those universities or research groups keep showing off. (Jockish Ivy Leagues :P)
Well, I found these desires, information, and tools synergized into a passion. After two days, I was able to write a basic OpenCV Python script that could control a robot using a static webcam looking down on it. Let me clarify: I'm of average intelligence, simply obsessive, so when I mention "two days" I'm trying to convey the utter feasibility of this project, for anyone. Python, Arduino, and OpenCV make it so very easy; any idiot like me can hack it out.
Of course, my purpose for this platform is to control robot groups, the group being the second social collection (one-to-eight), and social interaction seems to be essential in developing a positronic brain. The white-mattered brain being necessary for me to test the above-mentioned self-termination formula. So, maybe, I'll learn whether robots will commit suicide, or perhaps have a better understanding of why humans do.
Dark and depressing! I know, right? Who writes this crap!?
A robot
It doesn't matter what sort of robot you use, it only needs:
A magnetometer. I used the HMC5883L. They're like 2 USD on eBay.
A wireless serial connection. Bluetooth, XBee, or nRF24L01 would be my recommendation, since all are well documented for creating a bridge between a PC and a microcontroller.
I personally built my own using a red cutting board I stole from Bek (shh). For my serial connection I used two $10 Bluetooth 4.0 modules; I've written an instructable on setting up a Bluetooth 4.0 module to work with an Arduino and PC: Bluetooth 4.0 and Arduino.
A PC
Probably something less than 10 years old. It could be running Linux or Windows; though, I'll be using Windows Vista (hey, I'm first-world poor and can't afford Windows 7 :P).
It will need a wireless serial connection that pairs with your bot. Again, I used my BT 4.0 modules.
A Webcam
It's really up to you. I'm not going to lie, I went with the cheapest webcam I saw, which cost 6.87 USD. But I would not recommend this webcam. It didn't like my PC, so every time my Python script stopped I had to unplug the webcam and plug it back in. A real annoyance for debugging. I'd suggest a high-resolution webcam. Maybe even an IP cam, if you're rich? If you are, would you buy me one too?
A long male-to-female USB cable. Again, I got two 15' USB cables on eBay for around 4.50 USD. If you get everything set up and you notice problems with the webcam at the end of the cable, you can put a powered hub at the end of the cable with an extension cord and it'll take care of the issue. Though, I didn't have this problem at 15'.
A wife that'll let you screw your webcam into the ceiling. Or...don't ask...
Now, about any robot will work, like I've stated, so Google away and select a robot build you like.
Of course, everything you'd ever want to know can be found on this site :)
I'm just sayin'.
But the code, that's the part we want to focus on. Really, our robot only has nerves and muscles; the brain will actually be in the PC. All the robot does is:

Calculate the compass info.
Send the compass info to the PC.
Read the movement codes from the PC.
Translate the movement code received into a motor activation.
That's it. Pretty simple.
// I've been using Zombie_3_6_RC in Processing to interact.

// Reference the I2C Library
#include <Wire.h>
// Reference the HMC5883L Compass Library
#include <HMC5883L.h>

// Store our compass as a variable.
HMC5883L compass;
// Record any errors that may occur in the compass.
int error = 0;

//int pwm_a = 10; //PWM control for motor outputs 1 and 2 is on digital pin 10
int pwm_a = 3;    //PWM control for motor outputs 1 and 2 is on digital pin 3
int pwm_b = 11;   //PWM control for motor outputs 3 and 4 is on digital pin 11
int dir_a = 12;   //dir control for motor outputs 1 and 2 is on digital pin 12
int dir_b = 13;   //dir control for motor outputs 3 and 4 is on digital pin 13

int lowspeed = 120;
int highspeed = 140;

//Distance away
int distance;

//Sets the duration each keystroke captures the motors.
int keyDuration = 10;

int iComp;

void setup()
{
  Serial.begin(9600);
  Wire.begin();  // Start the I2C interface.

  Serial.println("Constructing new HMC5883L");
  compass = HMC5883L();  // Construct a new HMC5883 compass.

  Serial.println("Setting scale to +/- 1.3 Ga");
  error = compass.SetScale(1.3);  // Set the scale of the compass
  error = compass.SetMeasurementMode(Measurement_Continuous);  // Set the measurement mode to Continuous

  pinMode(pwm_a, OUTPUT);  //Set control pins to be outputs
  pinMode(pwm_b, OUTPUT);
  pinMode(dir_a, OUTPUT);
  pinMode(dir_b, OUTPUT);

  analogWrite(pwm_a, 0);   //set both motors to 0% duty cycle (stopped)
  analogWrite(pwm_b, 0);

  pinMode(2, OUTPUT);  //attach pin 2 to vcc
  pinMode(5, OUTPUT);  //attach pin 5 to GND

  // initialize serial communication:
  Serial.begin(9600);
}

void loop()
{
  // Retrieve the raw values from the compass (not scaled).
  MagnetometerRaw raw = compass.ReadRawAxis();
  // Retrieve the scaled values from the compass (scaled to the configured scale).
  MagnetometerScaled scaled = compass.ReadScaledAxis();

  // Values are accessed like so:
  int MilliGauss_OnThe_XAxis = scaled.XAxis;  // (or YAxis, or ZAxis)

  // Calculate heading when the magnetometer is level, then correct for signs of axis.
  float heading = atan2(scaled.YAxis, scaled.XAxis);

  // Once you have your heading, you must then add your 'Declination Angle',
  // which is the 'Error' of the magnetic field in your location.
  // Find yours here: http://www.magnetic-declination.com/
  // Mine is 2° 37' W, which is 2.617 degrees, or (which we need) 0.0456752665 radians. I will use 0.0457.
  // If you cannot find your declination, comment out these two lines; your compass will be slightly off.
  float declinationAngle = 0.0457;
  heading += declinationAngle;

  // Correct for when signs are reversed.
  if (heading < 0)
    heading += 2 * PI;

  // Check for wrap due to addition of declination.
  if (heading > 2 * PI)
    heading -= 2 * PI;

  // Convert radians to degrees for readability.
  float headingDegrees = heading * 180 / M_PI;

  // Normally we would delay the application by 66ms to allow the loop
  // to run at 15Hz (default bandwidth for the HMC5883L).
  // However since we have a long serial out (104ms at 9600) we will let
  // it run at its natural speed.
  // delay(66);

  // This throttles how much data is sent to the Python code
  // (the 10 ms delay below times the iComp threshold).
  if (iComp >= 30) {
    int adjHeading = 0;

    // The "floor" turns the float into an integer, rounding it down.
    headingDegrees = floor(headingDegrees);

    if (headingDegrees >= 280) {
      adjHeading = map(headingDegrees, 280, 360, 0, 79);
    }
    else if (headingDegrees <= 279) {
      adjHeading = map(headingDegrees, 0, 279, 80, 360);
    }

    Serial.println(adjHeading);
    iComp = 0;
  }
  iComp++;

  delay(10);  //For serial stability.

  int val = Serial.read() - '0';

  if (val == 1) {
    Back();
  }
  else if (val == 2) {
    Right();
  }
  else if (val == 3) {
    Forward();
  }
  else if (val == 4) {
    Left();
  }
  else if (val == 5) {
    Stop();
  }
}

void Back()
{
  //Straight back
  analogWrite(pwm_a, highspeed);
  analogWrite(pwm_b, highspeed);
  digitalWrite(dir_a, HIGH);  //Reverse motor direction, 1 high, 2 low
  digitalWrite(dir_b, LOW);   //Reverse motor direction, 3 low, 4 high
  delay(keyDuration);
}

void Left()
{
  //Left
  analogWrite(pwm_a, lowspeed);
  analogWrite(pwm_b, lowspeed);
  digitalWrite(dir_a, HIGH);
  digitalWrite(dir_b, HIGH);
  delay(keyDuration);
}

void Right()
{
  //Right
  analogWrite(pwm_a, lowspeed);
  analogWrite(pwm_b, lowspeed);
  digitalWrite(dir_a, LOW);
  digitalWrite(dir_b, LOW);
  delay(keyDuration);
}

void Forward()
{
  //Straight forward at the higher duty cycle
  analogWrite(pwm_a, highspeed);
  analogWrite(pwm_b, highspeed);
  digitalWrite(dir_a, LOW);   //Set motor direction, 1 low, 2 high
  digitalWrite(dir_b, HIGH);  //Set motor direction, 3 high, 4 low
  delay(keyDuration);
}

void Stop()
{
  //Set both motors to 0% duty cycle (stop); direction pins are left in the forward state.
  analogWrite(pwm_a, 0);
  analogWrite(pwm_b, 0);
  digitalWrite(dir_a, LOW);
  digitalWrite(dir_b, HIGH);
  delay(keyDuration);
}
The first bit of robot code I'd like to focus on is the compass. Now, I've not detailed how to use the HMC5883L, since SparkFun has done that for me. I also won't go into tilt compensation, since I was more worried about proving the concept than dead-on accuracy. But if you're a smart cookie and would like to take that challenge, feel free. Just be sure to share the code with us all when you're done :P
No. Instead, I want to focus on adjusting the compass heading from a value relative to true north to what we want it to think is north; in our case, whatever is the top of our screen. This process takes a little involvement, since the numbers must be set manually and with a little guesstimation.
See code above.
So, I got my compass module lying as flat as possible and then bolted it to my robot. This helps assure you're getting a full 360º and keeps you from having to re-calibrate what we'd like to call north every time the compass module gets knocked out of place.
106-114:
These modules and the Arduino library are both designed to have 0º be North, but we want to set our own north, video-game north. Which is exactly what lines 106-114 are about. I found 80º is what value my robot was reading when he was headed towards the top of the screen. I had to find a way to adjust this to give me the reading 0º. I ended with this simple code to spin the compass.
I had to divide the adjustment into two sections for the math to stay simple. Lines 109-111 handle mapping raw readings of 280-360º onto 0-79º, and lines 112-114 do the same for 0-279º, converting it to 80-360º. For example, a raw reading of 300º becomes map(300, 280, 360, 0, 79), roughly 19º.
Honestly, I've got some spatial-retardation, so I have a hard time thinking through this; I just know it works. So, if you have problems I'll answer emails and Skypes and we can work through it together. And if you want to submit a better explanation, I'll post it and be sure to give you credit. Do know, my redneck solution was to change the orientation of the camera. Pfft. Too easy.
Moving on.

116: Sends the robot's heading to the PC.

117: iComp is a variable allowing us to decide when to start sending data to the PC. We don't want to send data before the PC is ready or before the robot is warmed up, or we'd be dealing with poor readings.

118: This is a delay that makes sure we are not bogging down the serial line, since every time we call Serial.println("whatever") both the PC and the robot have to spend some processing power dealing with it. In short, it's to make sure the robot is not talking the computer's ear off.
See code above.
This bit is pretty easy. It reads the codes being sent from the PC and translates them into a function call. I write all my robot-PC interactions this way, since if I want a code to mean something completely different, for instance I want to swap the robot's right and left movements, I'd just swap lines 134 and 144.
Easy.
125: If I remember correctly, this line reads serial data being sent from the PC and assures the val variable isn't getting a bunch of zeros. Easy one.
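For anyone puzzled by the - '0' in that line: Serial.read() returns the ASCII code of the received character, so subtracting the ASCII value of '0' recovers the digit. A quick illustration, not project code:

// Serial.read() gives the ASCII code, e.g. 51 when the PC sends the character '3'.
int raw = Serial.read();   // 51
int val = raw - '0';       // 51 - 48 = 3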
This is one of the functions called to make the motor move, or in the case of this function, stop.
188-189: These lines tell the Arduino to drop the PWM on the pins specified by the variables pwm_a and pwm_b to 0, which effectively stops our robot.
192-193: This bit tells the motors which direction to turn. The pins (dir_a and dir_b) are set either HIGH or LOW, and this changes the direction the motor moves.
For Windows, use the MSI installer for your architecture, either x86 or x64. Of course, Linux and Mac versions are there as well. Go ahead and install Python 2.7, but I'm not a fan of their IDE. Instead, I use:
Though, this IDE is a little tricky to get running on Windows, since it's meant for Linux. These posts over at Stack Overflow go through some popular Windows Python IDEs. Pick what you feel comfortable in. I suggest running 'Hello World' in each until you decide you like one.
Here we are at the hardest part of this whole project; if we're not careful, we fall into dependency hell.
I'm going to try and help you setup all the modules needed to run the Python code. It's been difficult for me to do this right, so I'll try to be descriptive.
Of these, we will need to install OpenCV, Numpy, and Serial, since the rest come built into Python 2.7.
The main trick with any module you install in Python is to make sure the exact path you install it to gets added to the environment variables (this is true for both Windows and Linux).
To explain this I'm going to hand it over to Lovely Ada as she tells us how to install the Serial module:
At this point, you might bring up Python and try some simple webcam capture test code (if you have problems copying and pasting, I've added web capture code as an attachment as well):
See code above.
If you see a live feed from your webcam, you're almost good to go.
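In case the attachment isn't handy, a capture test of the sort described is only a few lines. This assumes OpenCV's Python bindings are installed and your webcam is the first device on the bus:

# Show a live feed from the first webcam; press q to quit.
import cv2

cap = cv2.VideoCapture(0)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    cv2.imshow('Webcam test', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()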
If there are any problems, like I said, you and me, buddy. Feel free to ask questions here or Skype me: thomas_ladvien
Okay. Here's all the Python code in one go. Don't be scared if this looks confusing. I feel the same way. In fact, some of it I still don't understand. (Hey, honesty is a rare fault I seem to possess.) Again, don't worry, we're going to walk through it one section at a time, you and me, buddy. Until the end.
On the flip side, if you are a Python guru, or yanno, just a sassy-pants: Feel free to add corrections and comments on this page. I'd love to make this code grow through critique. Do know, I guarantee the following: Typos, grammar problems, illogical coding, artifacts from debugging, and the like. But don't worry, I'm thick skinned and usually wear my big-boy panties.
Also, I've included the code as an attachment, it's at the bottom. Video-game south.
See code above.
Ok. The beginning.
So lines 3-10 pull in the
modules
we will need. My take on a module is the following, "Code some smart guy wrote and doesn't want anymore, so he gave it to me to use."
Numpy
, which we'll call "
np
" throughout the code, is used for higher number functions needed for OpenCV to do her magic.
Serial
is the module which will allow us to establish a serial connection between the PC and the robot, via whichever wireless device you've chosen.
Time
allows us to basically idle the code. This is important in controlling many things, for instance, how far the robot moves. We tell the motors to turn on, wait 10 secs, then turn off. Because the sleep function actually puts the code into an idle state, we must have the threading module, since our code requires the PC to do several things at once.
Math
. From the math module we get the code to help us simplify the trigonometry calculations, like the angle between the robot and target.
The
random
module is only used to give us a random target.
Threading
. Important module. Basically, threading allows the computer to do more than one task at the same time. This becomes important when we are both trying to track the robot and receive his position. Throughout this code we will have three threads:
The thread running the OpenCV stuff. This tracks the robot and is also the largest.
A thread controlling the serial connection between the robot and PC.
And a thread with the small job of telling the motors how long to be on, thereby controlling how far the robot will move.
See code above.
13: This is where we actually open a serial connection to the wireless device you are using. Note, we've named the serial connection we opened "
ser
" so when we go to send information it will be something like,
ser.write("What you want to send here")
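For reference, opening that connection looks roughly like this; the port name and baud rate here are assumptions, so match them to whatever your wireless device shows up as:

import serial

ser = serial.Serial('COM3', 9600, timeout=1)   # 'COM3' is a guess; use '/dev/ttyUSB0' or similar on Linux
ser.write("What you want to send here")
ser.close()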
15-38: Here we declare a bunch of
variables.
The "
global variable
" lets the code know that this variable is going to jump between all threads. Next, the
variable = 0
actually declares the variable. Do know, you'll have to remind each thread a variable is global by stating "global variable."
One thing I should state,
iFrame = 0
is an actual variable declaration, as well as setting it to 0. Of course, this is how one would declare an integer variable with an initial value of 0. On the flip,
rx = " "
is also a variable declaration, but this time a string. You'll know I switched information from an integer to a string if you see something like this:
headingDeg = str(intHeadingDeg)
That tells the code, "I want to convert the value in intHeadingDeg, which is an integer, into a string and call it 'headingDeg'"
The comments indicate what each variable is meant for. Not going to lie: I probably have some declared variables I meant to use, didn't, and forgot to remove.
One important variable is the
iFrame
variable, since it tracks which frame we are on. This becomes key in all aspects of tracking our robot.
See code above.
42
: Here we start this
function
that does most of the work,
OpenCV():
. It is one of the functions that
will be threaded at lines 345-347
.
44
: We open up the webcam and give it the nickname
cap
. If I remember right the "0" in the parenthesis refers to whatever camera comes first on your USB bus, so if you have more than one camera you can specify by changing this number, e.g.,
cap = cv2.VideoCapture(3)
. Notice we called the OpenCV module cv2, so we are using the OpenCV module to access the webcam.
46-52
: Just making the variables we declared work within this function. This might not be needed, but hey, I don't read the
whole
Python manual.
55:
This is just a string flag that is flipped to tell the PC to generate a new target for the robot. Note, we initially set it to "Yes" meaning the first time we run through this function a target needs to be generated.
58:
This is an integer variable to count how many dots the robot has "ate."
Ok, before I get to the next bit I need to take a minute and explain how we approach actually getting the coordinates of our robot. As you know, OpenCV does the hard work for us, giving us the X and Y coordinate of the largest red blob on the screen. Though, the coordinates it gives us are the center of the mass. Now, this is all just a logical guess because I didn't read the whole OpenCV manual, but I believe the X or Y coordinate that refers to the center of this mass is called the
centroid
.
This might seem simple. That's because it is; I'm not sure why we don't just call it the damn center or something. Eh, oh well. Though, it will become important when we do collision detection between the robot and its target.
61-62
: All that to say, the "c" in
cyAvg
and
cxAvg
stands for centroid. So, these are variables that will hold the running average for the X and Y coordinates of the red blob's centroid.
65-66:
These are back-up variables of the
cxAvg
and
cyAvg
and will be important around line
122-127
when we are trying to decide if the color we are tracking is actually the robot or some other piece of junk with enough red in it to fool OpenCV.
69:
This simply clears the string variable with data that came from the robot, like the robot's heading, before another iFrame starts.
See code above.
71
: Creates a loop within the OpenCV() function.
73-81:
Ok, I need to be humble here and say I'm not sure what the
Cthulhu's
Kitchen I was doing. I know
printRx = str(intRx)
is taking the information received from the robot and converting it into a string.
intRx
is a global variable and is loaded with robot data at line 326.
headingDeg = printRx
is moving the heading data from one variable to another; the idea here was if I wanted more information to come from the robot besides the compass heading it would come in through
printRx
, then I could chop it up and load it into variables respective to their purpose.
For instance, printRx.split(",") should give a list of strings based on how many commas are currently held within printRx.
But the part that confuses me is that I turn right back around and convert the string back to an integer? I'm not sure, guys. I might have had South Park on while coding again.
At the end of that poor coding we end up with two variables to use:
intHeadingDeg
and
headingDeg.
We use the integer
intHeadingDeg
to do any calculations that involve the robot's heading. The other,
headingDeg
, is to print the robot's heading to the screen, which is done at line 263.
84-85:
These are string variables that will hold "Target Locked X" or "Target Locked Y" if we are tracking the robot. These strings are needed so we can print them to the screen on lines 259-260.
See code above.
We're in the meat now.
88:
This increments our frame counter.
91:
We read a single frame from the webcam we declared, cap, at line 44.
OPENCV!
Sorry, I just love it so much.
So, by now you know I've not read the OpenCV manual. And please don't tell me, "What! Go
RTFM
!" You go RTFM! I've got a wife, kid, and a job I love. I'm just going to tinker with crap and get it to work. But this attitude will begin to show as we go through the OpenCV calls, since I don't know their inner working. Instead, I'm going to offer my best guess, and as always, if someone wants to correct me or offer better explanation, I'll post and give credit.
94:
This
blurs
the image we got. You may say, "But I thought higher resolution was better?" It is. But jagged edges and color noise are not. A simple shape is much easier for OpenCV's math to wrap around than a complex one. Therefore, we blur the image a little, giving us softer edges to deal with.
Also, blur melds colors, so if there are 2 blue pixels and 1 red pixel in a group, they become 3 blue-purplish pixels. This has the nifty benefit of speeding up the image processing
a lot
. How much? I don't know; I didn't RTFM.
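If you want to see the blur in isolation, it's a one-liner; a sketch (the 5x5 kernel size is my guess, tune it to taste):

import cv2
import numpy as np

frame = np.zeros((480, 640, 3), np.uint8)   # stand-in for a captured webcam frame
blurred = cv2.blur(frame, (5, 5))           # simple box blur with a 5x5 kernel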
97-100:
Our image is converted to a
histogram
here. Having the image in a histogram format allows us to use
comparative statements
with it. What we use it for is to get rid of all the colors except the one we are trying to find. This will give us a black and white image, the white being only the color we are looking to find. Line 98 is where your color is defined (it's the two "np.array"s). In the next step I'll go through how to select your robot's exact color.
103:
Finds the contours of the white area in the resulting image.
107-112:
OpenCV then counts how many pixels are in each contour it finds in the webcam image. It assumes whichever has the most white area (aka, "mass") is our object.
114-117:
After we decided which object we want to track, now we need to come up with the centroid coordinates. That is what lines 115-116 do. I've not done the research on the math there, but I believe it averages the
moments
of the polygon and calls the average either centroid X or Y, depending on the calculation. But, feel free to correct or explain better.
121-127:
Here we lock onto the mass we believe is the robot. It begins by collecting 150 samples before it will state it is tracking the largest mass. But after it begins to track the largest mass, we try to stay locked onto it. This is lines 122-127. In essence, we allow the mass to move enough to be considered a movement by the robot, but not so much that noise (like a stray hand in the webcam image) will cause the tracking to switch off the robot.
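Roughly, in function form, the gate works like this (the 150-sample warm-up is from the code; the max-jump value is my guess):

def keep_lock(cx, cy, cxBackup, cyBackup, samples, max_jump=100):
    # Don't trust the largest blob until we've seen 150 samples; after that,
    # treat any jump bigger than max_jump pixels as noise and keep the old position.
    if samples < 150:
        return cx, cy, False
    if abs(cx - cxBackup) < max_jump and abs(cy - cyBackup) < max_jump:
        return cx, cy, True
    return cxBackup, cyBackup, True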
See code above.
This particular line defines what color you are looking for, specifically, the two sets of values:
130, 170, 110 and 190, 190, 200.
These two values set the lower limit and the upper limit of the color you are looking to find. The reason we use upper and lower limits, which we'll call color thresholds, is because our robot will move through different lights. Different light sources have a tendency to change how the webcam reads the color.
The color format we are using is HSV, which stands for
hue, saturation, value
. Later, I'll probably write code to select the robot within our actual program, but for now I use
Gimp
and the following method:
Set up your webcam in the area you'll be using, just like you're ready to control him.
Run the webcam program attached in step 10.
While the webcam program is watching your robot, hit
Ctrl + Print Screen
Open Gimp.
Hit Ctrl + V to paste the screen capture into gimp.
Now, find the Color Selector tool.
Select the main color of your robot.
Now double click on the color square on the toolbar.
A window should pop open with color information regarding the color you selected, your robot.
Now, the three numbers listed should be close to what we need. Sadly, we have to convert from Gimp's HSV number range to OpenCV's HSV number range. You see, the HSV value range in Gimp is H = 0-360, S = 0-100, and V = 0-100. In OpenCV, it's H = 0-180, S = 0-255, V = 0-255. So, some conversion needs to take place.
From my selection I ended with Gimp numbers of, H: 355, S:50, and V:61. I could get all fancy and calculate the right numbers, but I figure 180 (OpenCV) is half of 360, so
for my H I just divided by two: 177.
The other two I kinda guessed at a little. I doubled and added 25,
S: 125 and V: 147.
In the end, this gave me middle numbers. But I wanted an upper and lower threshold, so I took each number and subtracted 20 to give me a lower, and added 20 to give me an upper.
The result for my robot was:
See code above.
I'll try to code a color selector into the program to make this whole damn thing a cinch.
If you'd like to read more, there are two good posts on Stack Overflow.
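In the meantime, here's a rough helper for that conversion; it does the exact math rather than my double-and-add-25 shortcut, and the plus/minus 20 window is just what worked for my lighting:

def gimp_to_opencv_hsv(h, s, v, window=20):
    # Gimp: H 0-360, S 0-100, V 0-100  ->  OpenCV: H 0-180, S 0-255, V 0-255
    h2 = int(h / 2.0)
    s2 = int(s * 255 / 100.0)
    v2 = int(v * 255 / 100.0)
    lower = [max(h2 - window, 0), max(s2 - window, 0), max(v2 - window, 0)]
    upper = [min(h2 + window, 180), min(s2 + window, 255), min(v2 + window, 255)]
    return lower, upper

print(gimp_to_opencv_hsv(355, 50, 61))   # my Gimp numbers from above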
132-136:
Here we actually take the running average of the centroids' X and Y. We load this into the variables
cxAvg
and
cyAvg
, again, this is to assure we are tracking the robot.
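The averaging itself is only a couple of lines; something like this (I don't remember the exact weight the code uses, so n here is illustrative):

def running_average(avg, new_value, n=8):
    # The old average counts (n - 1) times, the new reading once.
    return ((avg * (n - 1)) + new_value) / float(n)

cxAvg = running_average(320.0, 330)   # feed it each frame's centroid X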
142-145
: Here the target, or "dot," for the robot to run after is randomly generated. As you may notice I restricted the generation area of the dots towards the center of my webcam's field-of-view. That's because I'm messy and dots were going where the little robot couldn't get.
147-153:
This is a rough collision detection function. Basically, if the robot gets close enough to the target (45px), it is considered to have "eaten" the dot. If it did, then the
dot
variable is incremented showing the total amount he's done ate and the
newTarget
string variable is flipped so it can generate a new target the next run through.
See code above.
156-177:
Here we are trying to find the angle between the robot and his target. We basically divide the entire screen up into four quadrants but always using the robot's centroid as the point of origin. We then calculate the slope between the target's X and Y (
tY
,
tX
) and the robot's X and Y (
cxAvg
and
cyAvg
).
Something like this:
If the target were located in quadrant III, it would go something like this.
If you'd like to dig further into
Trigonometric Functions
in Python, have fun. Share if you find better math :)
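The trig boils down to atan2 on the difference between the two points, then a conversion to degrees; a sketch (how that maps onto your compass degrees depends on which way the camera and robot are mounted):

import math

def angle_to_target(cxAvg, cyAvg, tX, tY):
    # Angle from the robot's centroid to the target, 0-360 degrees.
    # Screen Y grows downward, hence the (cyAvg - tY).
    rads = math.atan2(cyAvg - tY, tX - cxAvg)
    return math.degrees(rads) % 360

print(angle_to_target(320, 240, 400, 100))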
See code above.
181:
When we find the angle between the robot and the target, then convert it into degrees, it ends up giving us a number which is a
float
. That's more than we need, so here we convert the float
(
degs)
to an integer
(targetDegs)
so we can compare it to the robot's compass heading.
184:
We declare an empty string called
strTargetDegs
.
187:
Then we convert the float
degs
into a string so we can print the target angle onto the screen at line 264.
See code above.
This is where I need help guys. My turning code has a bug, so if you find it and come up with a correction I'll send you a prize. I dunno? A lint ball? It'd probably be one of my left over circuit boards, or some piece of hardware I hacked together.
But for now, let's take a look.
The idea is like:
The code is supposed to go as follows:
if target1:
    MoveForward()
elif target2:
    TurnRight()
elif target3:
    TurnLeft()
And for the most part that happens, but occasionally it is dumb and turns left when it should turn right. Not sure what I'm doing wrong. Hey, that "You and me buddy, until the end" is a two-way street. :P
Let's step through it
195:
We want to make sure we are deep into tracking the robot before we start moving it towards the target.
198:
We compare
intHeadingDeg
, which is the robot's heading angle, with
targetDegs,
which is the angle between the robot and the target. But we do this + or - 30º. This means the robot does not have to have its heading angle
exactly
the same as the angle to the target. It only needs to be approximately pointing in the right direction.
199:
The movement code for the robot to go forward is
3
, so here, given the robot is approximately headed in the right direction, we tell the robot to move forward. This happens by loading
3
into the variable
tranx,
which is transmitted to the robot at line 307. When this code gets transmitted to my robot, the Arduino code at line 137 tells the
Forward();
function to fire.
202:
If our robot isn't headed in the right direction, then which way should he turn?
203-232:
Still debugging here. I'm sorry guys. I can tell you this code works "Ok." But once I'm done with this tutorial, I'll go back and focus on making it turn perfectly. Sorry, this code took me two days to write, but this tutorial has taken too many days.
Though, within each of the if statements we have two variable assignments:
tranx = X
and
motorDuration = 10
. The tranx tells the robot which direction to move and the motorDuration tells it how long to move that way (this is not yet being utilized in my code).
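For what it's worth, here's how I'd sketch the decision with the 0/360 wrap-around handled explicitly; if my turning bug lives anywhere, it's probably there. Only the fact that 3 means forward comes from my code; the function and everything else here is a hedged guess:

def pick_move(intHeadingDeg, targetDegs, window=30):
    # Signed smallest angle from our heading to the target, in -180..180.
    diff = (targetDegs - intHeadingDeg + 180) % 360 - 180
    if abs(diff) <= window:
        return 'forward'      # this is where tranx = 3 happens in the real code
    elif diff > 0:
        return 'right'        # which physical turn this is depends on how your compass is mounted
    else:
        return 'left'

print(pick_move(350, 20))     # -> 'forward'; only 30 degrees apart across the 0/360 seam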
See code above.
Here, we are drawing everything to the screen before we show the frame.
242:
Red circle for target.
247:
White box to display black text on. Note, we are drawing things bottom up. So, if you want something to have a particular Z level you'll need to put it towards the top of this section.
250:
This is the green line between the target and our robot.
253-267:
We display all our info here. Compass heading, target-lock, etc.
270:
This actually shows the color window (the window we wrote everything on).
271:
This shows the HSV copy of the captured frame. Notice the white area to be assessed as our target.
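The drawing calls in that block are the usual OpenCV primitives; roughly like this, with placeholder coordinates and BGR colors:

import cv2
import numpy as np

frame = np.zeros((480, 640, 3), np.uint8)                      # stand-in for the captured frame
cv2.circle(frame, (400, 100), 10, (0, 0, 255), -1)             # red target dot (colors are BGR)
cv2.rectangle(frame, (0, 0), (200, 90), (255, 255, 255), -1)   # white box to hold the text
cv2.line(frame, (320, 240), (400, 100), (0, 255, 0), 2)        # green line from robot to target
cv2.putText(frame, 'Heading: 177', (5, 20),
            cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 0), 1)       # black text on the white box
cv2.imshow('color', frame)
cv2.waitKey(0)
cv2.destroyAllWindows()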
See code above.
276:
An if-statement that waits for the ESC to be pressed. If it gets pressed, we close stuff.
278:
This releases our webcam.
279:
This closes the windows we were displaying the color and HSV frames.
281:
We send the code to stop our robot. If we don't do this and we hit the ESC in the middle of a robot movement, that move will continue forever.
282:
Here we closed the serial connection.
283:
We quit.
Towards the beginning of this article I stated my webcam had crappy drivers; well, while writing this I noticed I had placed the
cv2.destroyAllWindows
before
cap.release().
This is what was causing the problem. My interpretation of this was our camera being sucked into the void where the destroyed windows go. Anyway, I switched the order and it seems to have solved the problem.
See code above.
Finally, we are opening our second threaded function. This function is much smaller than the OpenCV function. Here all serial communication takes place.
289:
This helps in translating ASCII.
292-296:
Global variables for passing robot information to other threads.
See code above.
303:
We read information into the variable
rx
. The information is coming from the serial line we opened at the code's beginning.
307:
This is a flag gate that ensures our Python code can only send a motor command to the robot if the robot isn't already in the middle of a movement.
308:
We write whatever value is in
tranx
, which should be loaded with some sort of movement from lines 192-232.
313:
I think I threw this in there so my code wouldn't bog down the serial line.
316:
We strip the number down to three digits only; remember, this is the compass heading in degrees, e.g.,
000-360
º.
319:
When something is sent over serial it gets an end-of-line character. We don't want that.
323:
The robot collected this number from a compass, which gave a number with a decimal involved. This removes the decimal so we are only dealing with whole numbers.
326-329:
I'm not sure what I was doing here, I think it had to do with the oddities of zero. Eh. I'll try to remember.
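Boiled down, the serial thread is a read-clean-convert-write loop; this is a simplified sketch, not the exact code:

import time

intRx, tranx, motorBusy = 0, "", "No"     # shared with the other threads

def rxtx(ser):
    global intRx, tranx, motorBusy
    while True:
        rx = ser.readline().strip()[:3]   # drop the end-of-line, keep three digits
        if rx.isdigit():
            intRx = int(rx)               # whole-degree compass heading from the robot
        if motorBusy == "No" and tranx:
            ser.write(str(tranx))         # send the queued movement code
            motorBusy = "Yes"
        time.sleep(0.05)                  # don't hog the serial line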
See code above.
This is a short threaded function. It only really has one job, to control how long the motors on the robot stay on. It works like this, if we send the robot a message to move forward, it continues to do so until line
341.
There, the command to stop is sent to the robot and the
motorBusy
flag is set back to "No" meaning the motor is ready to be used again.
340:
This sets how long the motor will stay on. For instance, if it were changed to
sleep(1)
the robot's motors would continue in the direction they were told for 1 second.
342:
This makes the robot wait in between movements. In theory, this was meant to ensure OpenCV could keep up with the little guy. So, if you have a fast robot, you might set this higher.
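The whole thread is only a few lines; roughly this, where the sleep times are the knobs you'd tune for your robot's speed and the stop code is a placeholder:

import time

motorBusy = "No"                   # shared with the other threads

def motorTimer(ser):
    global motorBusy
    while True:
        if motorBusy == "Yes":
            time.sleep(0.1)        # how long the motors stay on
            ser.write("1")         # placeholder stop code; use your robot's
            time.sleep(0.3)        # breather so OpenCV can keep up
            motorBusy = "No"       # motors are free for the next command
        time.sleep(0.01)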
See code above.
Ok.
Code's End.
This bit starts all three threads:
OpenCV
,
rxtx
, and
motorTimer.
And here is my poor attempt to explain Python threading. Most Python code is run sequentially; the order it comes is the order it is executed. One problem is timing. If we have to cause a delay in code, then the
whole program
has to pause. Threading allows us to get around this. I see it like a juggler performing that trick where he keeps all the balls going in one hand, while he holds one ball still in his other. I dunno, just how I see it.
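For reference, kicking off those three threads looks roughly like this; the stub functions just stand in for the real OpenCV, rxtx, and motorTimer defined above, and daemon=True is my own addition so Ctrl+C still kills everything:

import threading

def OpenCV(): pass       # stand-ins for the real functions above
def rxtx(): pass
def motorTimer(): pass

for func in (OpenCV, rxtx, motorTimer):
    t = threading.Thread(target=func)
    t.daemon = True      # so Ctrl+C can still kill the whole program
    t.start()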
Well, like I said,
"You and me, buddy, until the end."
And here we are. The end.
I hope this code has been helpful. But do know, you're not alone.
cthomasbrittain@hotmail.com
Skype: thomas_ladvien
Skype or email me if you have any questions. Likewise, all that crap I did a poor job explaining, coding, writing, just shoot me an email and I'll fix it.
I still want to develop this into a Swarmie platform; so you might keep an eye out on
www.letsmakerobots.com
since I'll post my unfinished work there. Alright, I'm off to work on the 8th iteration of my Swarmie...ugh.
UPDATE (2/5/14): I split this post, since it's getting a little sluggish. I've updated the breakout board to version v.9.9, added instructions for updating the firmware, and added some research notes on a pseudo-Star-Network.
UPDATE (11/23/13):
I've added research notes on networking the HM-10s and an ATtiny 85 serial filter (at bottom).
OLD:
I know there are few Bluetooth 4.0 and Arduino solutions coming out. Redbear Labs'
BLE Shield
, the
BLEDuinoe
Kickstarter projects, and the
Bluegiga Shield
. But I didn't really like these due primarily to the price:
Redbear's Mini: $39.95 (Note: This is a uC and BLE combo).
Redbear's Uno Shield: $29.95
BLEDuino: $19.95 (if part of Kickstarter)
Bluegiga Shield: $69.95
These are out of my price range for a single module. So, in the end, I created a breakout for a cheap module and got it interfaced with the Arduino for approximately
$10.03 a module.
Although, this price will be higher if you don't buy components in bulk.
Here's a Video Summary
:
Now,
I've not interfaced these with iOS or Droid devices, they are simply a Bluetooth 4.0 solution for a wireless serial connection
. That said, I have interfaced these devices in a limited way with iOS: I used the LightBlue App on my iPad Mini to open a rough serial interface.
Though, I'll probably do this later with Jelly Bean 4.3's Bluetooth 4.0 API. UPDATE: I've discovered jnhuamao provides
sample iOS 7.0 interface code for the HM-10
.
Proof of Concept Video
Now, if only I had the $99 to pay for an App store publisher license, I'd make us all a nice little robot interface :)
The modules I used were these
HM-10's
. I won't go into making the breakout board, since I did
that already
. I will state, though, the last iteration of the breakout boards I made had mistakes that I was able to correct for home use, and I've corrected them in the Eagle files I'll put up, so
the board files I put up are untested
, though, they are on the way and when I've confirmed they work I'll post a confirmation. Also, the images I have of the boards I'm using are different, since I corrected the board files.
UPDATE:
It has come to my attention the traffic LEDs on the RX/TX lines are always on due to the level converter pulling the lines high.
The board still functions as intended if the LEDs are left unpopulated.
Ok. Let's make a breakout...
1. This is the v .9.9 of my breakout. I do not swear it is bug free, but it seems stable. Working traffic LEDs and it uses a linear voltage regulator
:
(OPTIONAL)
SOT-23 LDO Voltage Regulator
(it doesn't make sense to use this, but I put the option on the board just in case. I'll explain).
Again, I bought pieces in bulk, since I know I'll use them on other projects; my price per module is $10.03. Of course, you can buy all these components on DigiKey, but the price will be a bit more.
Ok. Let me explain the 3.3 linear regulator. I added this option to the board in case there is no pre-regulated 3.3v source, but it inherently contradicts the purpose of using a Bluetooth 4.0 module:
extremely low power consumption.
I tried to get a reading on the milliamps the HM-10 pulls, but my multi-meter only goes to the tenths (ma) and the module wouldn't show at all, even during active use. And as many (all?) probably already know, the linear regulator is
extremely
inefficient. So, it's much better to solder the jumper that bypasses the regulator and leave it un-populated. UPDATE: I've found info on
power consumption
:
Other important soldering tools: A
wet sponge
and
brass-ball
will keep your fine soldering tip fine. Sponge the solder tip, then run it through the brass-ball after each component to prevent build-up.
To speak blasphemy: Flux is ok, but I find the tweezers often take the place of the flux.
Practice using
both
hands during soldering. Tweezers in one and solder-iron in the other.
5. Wire it up to serial port.
So, this is the board I screwed up on. Basically, like a dumb-ass I was trying to regulate 3.3v with a voltage divider. Of course,
I know better now
. Still, given the HM-10 pulls fewer than 10mA, I'll probably go back and run the math to see if a voltage-divider is, in fact, a feasible solution.
Anyway, the hookup is pretty simple.
BT-3.3v <---> 3.3v
BT-RX <---> FTDI-TX
BT-TX <---> FTDI-RX
BT-IO1 <--> LED <--> 220 Res. <--> GND
BT-GND <---> FTDI GND
(For the 3.3v I used a regulator and tied my grounds).
A few notes: the RX and TX lines are translated from 3.3v to 5v by way of a voltage divider and the BSS138.
All other lines will die at >3.3v
.
Now, as I've stated, I'm connecting two modules together, so you have to set one module as the slave.
Then, under the "Send" tab type in AT commands and hit "Send ASCII":
Send: AT
Response: OK
Now, setup one unit as the slave (they default as master).
Send: AT+ROLE1
Response: OK+Role:Slave
That should be all we need to do to setup the connection. Now, whenever they power on they will automatically try to mate.
You'll know if they are connected if the LED goes from blinking to solid.
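If you'd rather script the setup than click around a terminal, a pyserial sketch like this should do it; the port name is an assumption, and note the HM-10 wants bare AT commands with no line ending:

import serial
import time

def send_at(ser, cmd):
    ser.write(cmd)                    # no CR/LF; the HM-10 wants the bare command
    time.sleep(0.3)
    return ser.read(ser.inWaiting())

hm10 = serial.Serial('COM4', 9600, timeout=1)   # port name is a guess
print(send_at(hm10, "AT"))            # expect "OK"
print(send_at(hm10, "AT+ROLE1"))      # expect "OK+Role:Slave"; run this on ONE module only
hm10.close()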
7. Wire the modules to the target devices.
BT-3.3v <---> Arduino 3.3
BT-RX <---> Arduino TX
BT-TX <---> Arduino RX
BT-IO1 <--> LED <--> 220 Res. <--> GND (or if you've soldered on the 0603s you can skip that connection).
Notice the mistakes routing my board? :(
It was salvageable though.
10. Turn on the devices and make sure the LEDs go solid.
(10a. Yell at me if it doesn't work.)
11. If the LEDs go solid, then you have a serial connection between the devices. Have fun :)
Some things I've discovered:
They have much better range than I would have thought. I'm getting around 30ft indoors. I've not tried them outside. For those of you who've read my developing post:
Yes, having the copper planes underneath the antenna is what caused the range issue. They've got super range now :)
UPDATE: I found info on
range
: 60 feet indoors, 300 feet line-of-sight.
They connect
much
faster than older Bluetooth devices.
Actively sending or receiving draws fewer than 10mAs :)
I love these little guys over Xbees :)
Research Towards a Hub and Node network using the HM-10s:
11/18/2013
The Theory:
So, I've been working on putting an ATtiny 85 at the end of the HM-10's serial line to allow for remote control of AT commands. It goes something like this:
Using SoftwareSerial to set up two serial lines. Basically, the ATtiny 85 acts as a filter on the serial line. If it is a regular message, it passes from TX1 to TX2. But the code in the Tiny will be looking for serial data that begins with "AT+", and if it sees that, it will instead write that command to RX1.
Now, stick with me a minute.
The Master has a mode called Remote, which is setup with the command AT+MODE2. While in Remote mode the HM-10 will transmit serial data but also accept AT commands. Sadly,
this seems to only work on the Master.
So, we must have a different setup for the slaves.
In the case of the slaves we use the reset line. Each slave will have the ATtiny filter, but when it gets an "AT+" command in the serial data it will pull the reset line low. This resets the HM-10. We do this because the HM-10 has a command AT+IMME1, and this will put the HM-10 Slave into a mode where it won't automatically seek to pair. Instead, it will accept AT commands until given the command "AT+WORK", which will send it into pairing/transmission mode.
Ok. Going back to our Slave setup. So, when we set up our HM-10/ATtiny combos as Slaves, we put them all in the mode where they don't seek to pair until given the command AT+WORK. Of course, we program the ATtiny to send the HM-10 into pairing mode whenever it is turned on. Then, once it pairs with our Master, we can send a serial message through the Master to the Slave with the string "AT+RESET&AT+PIO11&AT+WORK". When the ATtiny gets this code it will pull the reset line low, putting the Slave back in AT mode. Then, the ATtiny connected to the slave will send the command AT+PIO11, which puts pin 1 on the HM-10 high. After, the ATtiny gives the command to the Slave to re-enter transmission mode.
Voila
.
Alright, so far, I've got all that coded and the hardware worked out--most everything above I can confirm works.
But, I've been skeptical as to whether or not the HM-10 will connect quickly enough for a Master to have a seemingly seamless transmission between Slaves. I derived this skepticism from watching the blinking connection LED every time I reset one of the HM-10s that was formerly paired. Then it hit me. They weren't
immediately
reconnecting because the Slave still thought it was connected; therefore, the HM-10 firmware had not re-initialized its pairing protocol. I tested it. And sure enough, if a Master and Slave are paired and one loses power, the other will hang for 3 seconds before trying to pair again. But if one loses power and the other is reset at the same time, when they both power back on (<100ms) they will almost
immediately
pair.
Booyah!
So, all we have to do is set up code where a Master connects to a node, tells it what it needs to, then tells it to reset itself. Afterwards, the Master changes its own pairing PIN and resets itself; whenever the Master comes back up it should almost immediately connect to the new node.
And there we go. A viable Bluetooth 4.0 Star Network. I hope to have this fully tested before the Holidays.
11/23/13
(Warning: Lots of vehement expression towards datasheet-writers)
Ok. So here is what I've learned.
Alright, I'm beginning this article by saying: I love the HM-10. Excellent device. However! I want to beat the ever-loving poo out of their datasheet writer. To begin, I've ordered several HM-10s from www.fasttech.com over the course of several months. And it never dawned on me they were upgrading the firmware quicker than I could buy them. This wouldn't be too bad, but it's like the HM-10 monster took a poo and the datasheets are the result: actual commands for listed firmware versions don't match the datasheets, there is different information in the Chinese datasheets than in the English, and some AT commands have been merged without it being stated. It's just fubar.
So, some of the issues I've had trying to network the little devices have, I believe, come from firmware versions not playing nice.
For example, the HM-10 V303 has a command AT+IMME1 (0 to turn it off) for the Master only that keeps it in AT mode until given the command
AT+WORK.
I discovered that stupid-ass
jnhuamao
changed the firmware at some point (in the 4xx range) and this command merged with AT+START, which in my V303 datasheet is a command for something else. F'in poor translation.
Now, I have 2 boards with firmware V303 and 1 board with V502. I also have 2 modules that I bought later which more than likely have something greater than V502.
I'm praying they are V508 or greater; at V508 they added the feature to
upgrade the firmware
through the serial line.
'Bout damn time.
I can't find the datasheets (in either language) for V502, but looking at the V508 I can see the AT+TYPE command now has three options. The V303 lists only two options for AT+TYPE. Yet, somehow, my V303 boards actually take this third option (AT+TYPE2). Bizarre.
Moving on from the firmware and datasheet mess: Using the ATtiny 85 does work, but to get the HM-10 to take the commands it requires:
TinySerial.
write
("AT+xxxxx");
So, in theory, to get a HM-10 Master setup to only enter transmission mode when given a command, it goes something like this:
TinySerial.write("AT+RENEW"); //Reset to factory settings.
TinySerial.write("AT+ROLE0"); // Be the Master.
TinySerial.write("AT+IMME1"); // Don't enter transmission mode until told.
TinySerial.write("AT+RESET"); // IMME takes effect after reset.
TinySerial.write("AT+START"); // Ok, try to connect to something.
This resets it to factory settings, tells it not to connect until given the command, then it gives the command to start trying to connect.
Here's example code I use on the ATtiny 85:
/*
  This code has been modified for use on an ATtiny.
  Created by Matthew on June 11, 2013
  http://projectsfromtech.blogspot.com/
  This example code is in the public domain.
*/

#include <SoftwareSerial.h>

SoftwareSerial TinySerial(3, 4);   // RX, TX
SoftwareSerial TinySerial2(1, 2);  // RX, TX

String blah;
int incomingByte = 0;

void setup() {
  // Open serial communications and let us know we are connected
  TinySerial.begin(9600);   // Serial line for the ATtiny85 to read/write from/to the HM-10.
  TinySerial.println("Tiny Serial Connected via SoftwareSerial Library");

  TinySerial2.begin(9600);  // Serial line for the ATtiny85 to print to a serial port.
  TinySerial2.println("Tiny Serial Connected via SoftwareSerial Library");

  TinySerial.write("AT+RENEW");  // Reset all settings.
  delay(300);
  TinySerial.write("AT+ROLE0");  // Master mode ("AT+ROLE1" is slave and "AT+ROLE0" is master).
  delay(300);
  //TinySerial.write("AT+PASS001111");  // "AT+PASS001111" sets the password.
  //delay(300);

  // The work mode only works for the Master HM-10.
  TinySerial.write("AT+MODE2");  // "AT+MODE0" = Transmission Mode, "AT+MODE1" = Remote Control Mode, "AT+MODE2" = Modes 0 + 1.
  delay(300);
  TinySerial.write("AT+IMME1");  // Don't enter transmission mode until told ("AT+IMME0" is connect right away).
  delay(300);
  TinySerial.write("AT+START");  // Ok, go ahead and enter. BULLSHIT! Apparently "AT+WORK" is not what we use, it's "AT+START".
  delay(300);
}

void loop() {
}
Ok. I also learned a little more about PIN command. To begin, "AT+PASS000001" will set the PIN,
not
"AT+PIN000001". Of course, it must be a 6 digit number, so, fill the others with zeros. Now, depending on the firmware version there are 3 different settings for PIN pairing, all set by AT+TYPEx
AT+TYPE0 -- this is supposed to be "Connect without password mode"
AT+TYPE1 -- "Simple pairing" (no explanation).
AT+TYPE2 -- "Requires PIN for pairing"
Alright. So, this was the key to my switching between modules. I thought I would set a unique PIN for each slave and the ATtiny 85 connected to my Master would switch the PIN on my Master depending on which node I wanted to connect.
Well, this feature is broken.
I played with it for several hours and no matter how I set the PIN or TYPE settings, the modules
would pair even without the correct pin
. I could find no answer for this behavior.
Until
, I read through the
Chinese
version of the datasheet and came across this gem.
"IMPORTANT: V515 previous versions, the directive no practical effect, after setting causes not connect, please do not use."
Of course, this is a Google translation. But I'm pretty sure I read that, "
This feature on versions under V515 does not work.
"
And that's where I am at the moment. I wanted to make sure I wrote some of this stuff down in case others were running into problems. My next project will be writing to
jnhuamao
and get some questions answered (e.g., "Any way to upgrade the firmware on versions less than V508 so I'm not left with 5 unsecure HM-10s; maybe through the SPI interface?").
I'm posting this collection out of frustration and perhaps defeat. I've been working on several projects for the last two months, trying to finish something. I'd gotten addicted to that "It works!" moment I think anyone gets when they see an LED blink. Sadly, I feel I've failed at most of these projects.
The second reason I post is posterity.
I've grown to appreciate failure, given how much I learn from it. Of course, I'd much rather learn from
others'
failures. So, I figure I'd try to write up all my blunders for others.
The last reason is tactical humility.
I figure I might be able to finish some of these projects if someone tells me what the hell I'm doing wrong (though, it might take less time if someone tells me what's right).
Alright, enough self-loathing and belaboring.
Contents:
Arduino heart rate sensor.
Bluetooth 4.0
Heatsinking a high-power LED
XL4432 -- long-range RF
SMD version of Atmega Fuse Doctor
Arduino Thermostat
Raspberry Pi and Remote Compiling
Pulse Sensor Re-Creation -- A Story of Heartbreak:
Pulse Sensor Attempt 1:
For a while now I've been interested in biometrics. I figure if my wife ever finishes graduate school and becomes my sugar-momma, then I'll pursue my pin-headed dope in Experimental Psychology. Developing my own sensors, or at least having an intimate knowledge of how they work, would probably help me get into a program (damn education inflation). So, I've been watching out for open-hardware sensors for a bit, and it seems these
guys
guys' pulse-sensor was ever increasing in popularity.
As always, I still believe the best way for a dumb-guy like myself to learn smart stuff, is by taking apart stuff the real smart-people make. But being a non-conformist, I avoided trying to re-make their sensor. Still, after viewing other schematics I found (
1
,
2
,
3
,
4
,
5
), I decided I'd be such a non-conformist I'd conform and go with the popular sensor.
After a glance, it seemed great; a small lightweight heart-rate monitor that was Arduino compatible. Then, I noticed the
design files
were put together in
Design Spark
. "No problem, I thought, I'll just export them over to Eagle, then I can play with the PCB layout, maybe switch those 0603s to 0805s and send it off to OSHPark."
Come to find out there is no easy way to export Eagle files from Design Spark.
New plan, I'll just follow the Pulse-Sensor
schematic
and re-create the entire board in Eagle (all half inch of it). And that's what I did. I'd post those Eagle files, but, they suck and don't work. I had made several major mistakes.
To begin, I had to create several Eagle components for the board. The op-amp, LED, and light-sensor. Respectively,
MCP6001
,
AM2520ZGC09
, and
APDS-9008
. None were a real threat. I made each part according to the datasheets. Then, I strung together the schematic in Eagle, switched to the PCB layout, and threw my pieces down. But for some reason I thought, "I should replace the 0603 passives on this board with 0402s."
I figured, if I could shrink the board even more I'd be happier, right? I mean, smaller is better--so the women say. Well, it was the only thing on this board version I didn't regret.
In the end, the board was sent off to OSHPark for $2.00.
When the post came, as my friends across the lake say, I was excited about the itty-bitty board. Unfortunately, I started my string of disappointments after I populated the board.
Like I said, I didn't regret the 0402s at all. They all soldered on pretty easily. Though, I think my primary beef with 0402s over 0805s is when it comes to tweezerless soldering. See, when I go to solder 0805s I have this process of tapping one pad with a bit of solder and, taking a resistor for example, holding the 0805's body with tweezers in my left hand while my right hand keeps the solder warm. Then, I simply run one end of the resistor into the pool of solder, move the iron away, then, when the solder cools, let go with the tweezers. To get the other end, I'll put the tweezers down and simply tap the other end with solder.
Voila
.
I'd try to get a picture of this process, but, I don't have a third hand yet. Though,
these folk
are working on it.
This doesn't work with 0402s. One, holding the body of a 0402 with tweezers is like a giant trying to tight-rope-walk a piece of dental floss. But the problem really begins in the second step: when I set the tweezers down to tap the other end, the heat from my iron shoots through the little 0402 to the other end, loosening the hardened side. As soon as this happens, the entire 0402 stands on end, hugging my iron. Of course, this ends with me quickly pulling the little guy off with my fingers (not smart, but each 0402 is like $.20).
A few notes:
The LED fit through the hole in the PCB, but I was worried the drill wouldn't be wide enough (I guessed a bit).
OSHPark takes non-squarish shapes, though, they charge you as if it were square.
Open-source hardware is not the same as Open-Source (but lacking some key pieces of information) Hardware. I believe the Pulse Sensor is the latter.
The only piece that couldn't be soldered with a solder-iron was the light-sensor. All of its pads are tucked under the actual component. So, I used the good ole' overturned clothes-iron.
Anyways, once I had all the components in place, I threw on a few wires, attached it to the Arduino, uploaded the sketch, and turned it on.
And
...
smoke.
Waah-waaah.
I'd like to tell you this story has a good end. But, like the title suggests, it's an incomplete work.
I started troubleshooting the board: I checked the wiring, the schematic, the Eagle components I made. I tried 3.3v, different sketches, touching the sensor, not touching the sensor, praying to Cthulhu. Nothing. Just pretty smoke.
Finally, I gave up.
(Well, as much as I ever give-up on anything. Stupid obsessive nature.)
Pulse Sensor Attempt 2:
Well, weeks passed and I was working on other projects, but the little pulse-sensor board kept begging for another chance.
I decided to try again. I'd have no more of trying to ignorantly troubleshoot the thumbnail-sized board, so I went back to the designers' original files, downloaded DesignSpark, and pulled up the schematic and board. After briefly examining them I began to realize I was a little off on the board layout. Then it hit me: I could always copy the design manually; it shouldn't take long since I already had the schematic together and the components made.
Well, below is
a video of that process
:
After I caught the two errors (the wrong MCP6001 and the mis-oriented APDS-9008) I felt a little better sending it to OSHPark again. Two dollars later I wasn't as sure, but the boards were on their way regardless. While I waited, I jumped on Texas Instruments' website and ordered samples of (what I hoped was) the correct op-amp.
When the boards came in I did something frugal that made me kick myself later: I pulled the components off the old board to put on the new board. It sounded like a great financial strategy at the time, but it added more work when I realized my new little board still didn't work. Now I had to question whether it was the components or the board. I also ran into this
article
that scared the crap outta me. Although, it gave me a lot of respect for those guys. Re-soldering 2,000 SMD LEDs in a night is no small task. And perhaps it welled up slight guilt in me, since I was working hard to circumvent their
more
than deserved profit off their $25
sensor
.
That's pretty much it: So far. I've not had the time to give her another go, but I will. The next round I'll actually create breakout boards for the main components to make sure my soldering is not the problem. I'm really only concerned with the light sensor, since the pads are practically DFN (no exposed legs). But I hope to have this tied up in the next week to month.
If anyone knows me, they know I'm cheap. But I prefer to think of myself as "resource efficient." This has led me to do a bit of shopping at
FastTech
. Their stuff isn't as cheap as eBay, but pretty damn close.
Well, that's the prelude to this story. A while back, a co-worker's headphone jack on his phone broke. He said, "Don't you fiddle with that stuff? You should just make me a Bluetooth headset." Externally: Joke-deflected. Internally: Challenge-accepted.
I started looking through different Bluetooth ICs. I mean, why buy
Bluetooth ear-phones
for under $20 when you could make a set for twice that? Especially when it only takes around 100 hours of "fiddling" time? Americans are so lazy.
Well, the first IC I landed on was this guy:
LMX9838
. And it wasn't until I had finished the Eagle part, designed the schematic, and was working on the board that I looked at how much it
retailed
for. Well, I wasn't going to order samples every time I wanted to embed Bluetooth in my projects; besides, isn't Bluetooth 2.0 power hungry?
Well, back to looking through ICs.
And on the second browsing I ran across Texas Instrument's new
CC2500
series. Righteous.
I saw the possibility of making a Bluetooth 4.0 device with this little chip
CC2540
. It's a SoC (system-on-chip) with Bluetooth 4.0 capability. Again, righteous.
I ordered several samples of the chip from TI. While I waited on the post I began wading through all the terms: BLE, BT 4.0, Smart Energy, etc. I searched to see if anyone else had already created the schematic, PCB, and firmware that would be needed to drive these little guys. Here are a few discoveries I made.
Hardware:
Here's a
complete PCB
for the
CC2541
, which seems to be optimized for low-power consumption. I will say, the entire chip is pretty well documented on the TI forums, but overall, the hardware aspects are the least documented.
I downloaded the Eagle files and began ripping off all the unnecessary circuits. (I think there is a lipo charger circuit?) The goal was to bring the board size small enough it would be cheap to print.
But as I got to ripping out pieces close to the antenna, I noticed how easy it would be to screw the whole thing up. And since I hadn't built my
spectrum analyzer
yet, I would be stabbing in the dark each time I ordered a version of the board.
This realization on top of all the 0402s and the DFN package, well, I decided I wanted to play with the chip on a completed board, with already installed firmware before I sunk time into a personal development (fancy words for:
I got lazy
).
I won't cover the firmware or software, since I took another route and didn't thoroughly Google-search them. But do know, within the collective of Texas Instruments' CC2500 forums there is almost everything you'd want. Although, be prepared: if you desire to create your own firmware you'll need to brush up on your C and AVR programming.
This brings me back to Fasttech.
I noticed one day they had these
HM-10s
on sale.
Waahoo!
A pre-designed CC2540 board with firmware already created? Firmware that is AT commandable? I bought two.
I figure I could at least get a feel for the chip and see if it was something I really wanted to sink time into developing.
Well, a week later I got these little guys.
They aren't bad little boards. But, they are little. I tried soldering jumpers onto the ant-sized grooves; it didn't go well. I knew to really start playing with the board I needed a breakout.
When the breakout boards came in I was surprised they worked (I'm usually surprised when something I make turns out :).
They straddled my breadboard nicely.
And it allowed me to play with all the bits of the board I wanted: V, GND, Rx, Tx, and PIO1 (pin for connection-status LED).
Since the little HM-10 operated on 3.3v, I carefully put it in the breadboard and pulled out my
Sparkfun Lilypad FTDI
(first huge mistake) to interface with the board's serial.
Well, I couldn't figure out what was wrong; the board would not respond to AT commands (I was using
Realterm
). I mean, I had plugged the 3.3v into the HM-10's power, so I know it wasn't getting too much voltage. I even checked it with a multi-meter (about a hundred times).
Well, as those of you who are smarter than me (so all of you?) probably already know: Sparkfun's Lilypad FTDI is designed to provide the Lilypad with 3.3v power, but the ATmega 328P on the Lilypad is actually 5v tolerant, so why would Sparkfun drop the voltage on the Rx and Tx lines? In other words, the FTDI's Rx and Tx were still at 5v. Of course, this wasn't a conclusion I came to for many hours, really, not until I started randomly probing the boards with the multi-meter.
Damnit.
Well, there goes $13.98 (yes, I was slow enough to kill both boards).
When everything came in, I took the broken boards off by heating the bottom of the breakout with a heat-gun until the HM-10 was loose. I cleaned the top of the breakout boards with some solder wick. Then, soldered the new HM-10s on.
Video of Soldering the HM-10 to Breakout
I wired the whole mess up on a breadboard and was surprised when I actually got a response in Realterm.
AT
AT+OK
Victory!
(hey, I've learned to enjoy even small triumphs)
I had to use specific settings in Realterm to successfully communicate with the HM-10
Then, under the "Send" tab I typed in my AT commands and hit "Send ASCII":
This worked pretty well. Every time I typed "AT" it shot back, "AT+OK"
So, I started digging for the rest of the AT commands. And dig I did.
Apparently the HM-10 is developed by
www.jnhuamao.cn
. These folk are somewhere over in Jinan's
Hi-tech Development Zone
. Anyways, luckily we have Google Translate and I was able to get through several of their current documents. But not before I lost time trying to get the module to respond to AT commands no longer supported (like the connection command).
The manual answered a lot of my questions. It came with a complete pinout (even a schematic!). After playing with the commands I was re-naming the module, resetting it and running many other needed commands.
Now for a live test.
I got my work phone, an iPhone 4S, which is equipped with Bluetooth 4.0. I tried using the stock Bluetooth connection found under Settings and it couldn't find my little HM-10. I switched to
LightBlue
and was able to not only find my little module (named Bob), but it connected, and allowed me to send serial data to Realterm!
Success.
I thought I was on my way to slapping these little HM-10s on a robot, plugging a Bluetooth 4.0 dongle on my PC, sitting back and letting magic happen. That's not quite how it worked out. I ordered
this
Bluetooth dongle, and when it came in I quickly discovered that the drivers needed to provide it with magic powers were not available. I tried it all: TI's tool pack, random internet drivers, shady internet drivers. It simply wasn't going to happen with that dongle.
I figured that's what you get buying the cheapest dongle you can find. So, I switched over to Newegg and bought
this
dongle, making sure it came with supported drivers.
When I got it in, it still didn't work (I believe this is completely a software issue, so I'd expect a different outcome if I were to play with these dongles on a Linux machine).
I thought, "Well screw it, I could always make a microcontroller, RS232, and another HM-10 into my own dongle."
Um. But I couldn't figure out how to get two of the modules to connect. I set them both up on a breadboard, and they both had the little blinking LED (meaning not connected), but the little guys just wouldn't get it on.
So, on a whim I emailed Jnhuamao and asked:
Hello,
I'm currently working on interfacing two of your HM-10 modules. I'm having trouble because there seems to be no pairing command. I use "AT+VERS?" and it tells me I'm using version HMSoft V303. Is this old firmware? If it is, is there newer firmware available I could download and write to the cc2540? I've looked through your website and I cannot seem to find any firmware upgrades. But, I only read English, so I'm curious if I've missed it in translation.
I appreciate any help you may give,
--Thomas Brittain
To my surprise, they responded
Dear sir
Thanks you for choose our products.
Between two HM-10 modules, pair and connect process is automic.
You only need to make sure that one of the HM-10 module set to master mode, another is salve mode (use AT+ROLE command), and the TYPE setting of the two modules is the same (use AT+TYPE command) and the PIN number is also same (use AT+PASS command).
If the master module has connected to other modules, please execute AT+CLEAR command first.
Our website have module work flow chart, you can have a look.
:)
Best regards
HMSoft
guocg
But...now what? I mean, I could wire the guys up to an Arduino-bot but it would be one dongle per robot. What I had wanted was several Bluetooth bots per one dongle.
To be honest, I never expected to use the Bluetooth as a bot tether; I was just looking for an application other than my co-worker's ear-phones.
After reading the manual some more, and tinkering with the AT commands, I sent another email over to Guocg.
Good sir,
Thank you for your quick reply.
I felt stupid after your instructions. I had the HM-10 paired in less than a minute. A very simple process. Thank you.
But I do have a few others questions. Is there any way to have more control over the connection process? I'd really like to have a micro-controller (PIC, Atmega) in between give the HM-10 very specific commands, which would involve the master connect to several slaves depending on the need of the master. I can see how the PIN could be changed, but would it be fast enough for one master to manage several slaves in real time?
This is the process I'm going to attempt:
1. Setup 3 slaves with unique PINs
2. Setup 1 master connected to a microcontroller.
3. Set master to AT+IMME0 (work when given the command).
3. The micro-controller will pull the HM-10 reset line then give the following commands:
a. AT+CLEAR
b. AT+PINslave1
c. AT+WORK
4. The micro-controller will send a 'X' to slave1
5. Slave1 will have a micro-controller pulling the reset line every half a second or so, unless, it gets a connection with the 'X' string.
I'm not sure of the speed of this process, but I believe it would let me switch between modules remotely. I have read about some of the older releases of the firmware for the module HM-10. Is there still no chance in getting those? I understand now that pairing with the HM-10 modules is very easy, but it also seems very restricted.
Thanks for all the help,
--Thomas
This time, I think I got a blow-off response. That's fair. I'm a hack not an industrial developer.
Dear sir
You shoule make sure that HM-10 modules AT+TYPE vlaue is 1. PinCode set command is AT+PASS.
Best regards
HMSoft
Guocg
So, no chance on getting older firmware. I started preparing to implement my Atmega & HM-10 team. I strung up the HM-10 on the backpack breadboard of Silas' bot (my son's bot).
I was beginning to get really frustrated with the level conversion problem. I had tried the
CD4050
, but found it was uni-directional, meaning I still had to have a converter for the Rx bus (HM-10 and Arduino), or, unplug the HM-10 from the Rx bus every time I wanted to upload a sketch to the Arduino. In the end, I started doing that and used a voltage divider for the Tx line.
That's when I ran into another problem:
Range
.
More specifically, the lack of range. The little modules would lose connection if more than 7 inches away.
Ugh.
Back to trouble-shooting. I couldn't really pin-point the problem. I did find the modules had a power setting (AT+POWEx, X=0-4). But no matter what setting I put the modules on, they didn't have range greater than 7 inches. But I did notice when I was moving the slave around I could get a connection by aiming the master towards the slave. But if I rotated the module, it would lose connection. I didn't want to do it, but I started reading about telemetry.
I can't say I learned anything, though, I did develop a theory. The ground planes I put on my breakout board were screwing with the telemetry.
A second time I break out the heat-gun and pull the HM-10s off their breakout boards. I get back into Eagle and re-design the breakout to remove the ground planes underneath the antenna.
I thought as long as I was going to have the boards printed again, I'd go ahead and add some sort of level conversion. I reviewed a few different SMD chips (CD4050, 74LVC245, TXB0108, etc.) but I found the chip was either uni-directional or overpriced. In the end, I decided on the same
design
as
Ada's
and
Spark's
4-channel I2C converters.
This design was cheap, scalable, and required little layout room. It was fairly simple, voltage divider from high-to-low, and a tricky little N-Channel MOSFET on the low-to-high. The low-to-high circuit is actually bi-directional (up to a certain speed) but I'm simply using it to raise Tx voltage from the HM-10 to the Arduino, while protecting the HM-10 from uploading a sketch.
And, that's it so far...sorry.
I've already got my
BSS138s
and I should get the new boards Monday.
The LED and Heatsink
A bit ago I realized I needed to start documenting better and figured a picture was worth a thousand words, so at 32fps x 1,000, well, in short, video documentation should kick ass (I submit as further evidence
chickenparmi
's works :). Well, I got to trying to figure out how I was going to make these videos. That's when I came up with the idea of a piece of wood hanging from the ceiling--feel free to copy this design, it is open-design.
Well, I attached a board with a hole in it so my iPhone could lie there and take videos of whatever my hands were doing. But I noticed there simply wasn't enough light in the hacking hours (after the wife's asleep) to do a proper video. That's when I began looking into cheap high-powered LEDs. They aren't too difficult to work with; the items I ended up needing were:
This may seem high, but the heatsink paste was used for many other things; I made 3 other LED track lights and other LED lighting projects with it, and I've
maybe
used 5% of the tube. And the PSU has doubled as a work-bench power supply :)
Like many of my other projects, this one isn't done. The original plan was to add an array of light-sensors to adjust the light as I move my hands around, thereby auto-correcting for light imbalances in the videos. That part isn't done.
But I jump ahead. When I first started this little project I had no idea how to work with high-power LEDs. My little ignorant self assumed they were like 20 mA LEDs--right? Well, I quickly figured out that heat dissipation was the single most important issue. Which is why I ordered an 800-lumen LED 3 weeks before I ordered a heatsink and paste.
Then, it came to the problem of finding out if my heat-sinking was adequate for the massive LED. (I think a 60-watt tungsten bulb produces around 800 lumens as well? And I know those can cook little cakes--what? It was my sister's
Easy Bake
oven, not mine.) I digress. So, having
dyscalculia
, I was trying to figure out if my heat-sinking was adequate without delving into higher math. That's when I remembered I still had the
thermocouple
from the coffee roaster I built. I strung the thermocouple together with an Arduino and an I2C LCD, giving me a pretty accurate and robust temperature sensor.
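If you want to throw together the same sort of probe, here's a minimal sketch. It assumes a MAX6675 thermocouple amplifier board and one of the common LiquidCrystal_I2C libraries with the display at address 0x27; swap in whatever amplifier, pins, and LCD library you're actually using.

```
#include <Wire.h>
#include <LiquidCrystal_I2C.h>
#include <max6675.h>

// Assumed wiring: MAX6675 on SCK=6, CS=5, SO=4; 16x2 LCD at I2C address 0x27.
MAX6675 thermocouple(6, 5, 4);       // CLK, CS, DO
LiquidCrystal_I2C lcd(0x27, 16, 2);

void setup() {
  lcd.init();
  lcd.backlight();
}

void loop() {
  // The MAX6675 needs ~250 ms between conversions, so don't poll faster than this.
  float tempF = thermocouple.readFahrenheit();

  lcd.setCursor(0, 0);
  lcd.print("LED face temp:");
  lcd.setCursor(0, 1);
  lcd.print(tempF, 1);
  lcd.print(" F   ");  // trailing spaces wipe leftover digits

  delay(500);
}
```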
I looked for a bit, but I couldn't find any information on the thermal breakdown point of the 800 lm LED. Eventually, I thought I'd use masking tape to hold the thermocouple probe against the face of the LED and wire the LED up (with an
appropriate resistor
, of course).
The LED was on for 1.5 seconds before it shot up to around 180 °F. Uh, yeah, I broke out the heatsink. This time, I attached the LED to the heatsink with a bit of thermal paste, then attached the two screws, which further pressed the LED against the heatsink. Then, I put the thermocouple probe against the face of the LED, bit off the last of my fingernails, and flipped the LED back on. This time, the temperature went up
much slower
. And after a few minutes it stuck around 112 °F. Of course, I didn't know if this was beyond the point of thermal breakdown, but I figured that since my own body temperature wasn't far off, and I wasn't exploding, it would probably be okay. I also touched the face of the LED and was able to leave my finger on it for longer than 30 seconds. This I've found to be the most empirically reliable test. With that mounting evidence, I decided to cut another hole in my board-from-the-ceiling...thing and attach the light. And there we have it. I'll report back when I've added the light-sensor arrays.
XL4432 Breakout
--
Telemetry is Voodoo
While I was reading about telemetry I discovered
these
little boards for $3.98 and grew curious whether I could do anything with them other than bricking them. I was especially curious about the range claim: "1000 meters." Even with the BS de-modifier bringing that down to around 500 meters, it was still something I would love. So I ordered two and made breakout boards for them.
They came in, and right away I had a macabre feeling that reminded me of the HM-10 boards. Well, I've not played with them enough to know if I've killed them. I got pissed off when my logic-level converter slipped and fed the board 5 V (a cheap jumper wire came loose from the breadboard).
I stopped what I was doing and re-made the breakout board to include a
TXB0108
(
Digi-Key
: $2.30). This is the same little chip that's in Ada's
8-Channel Bi-directional
logic-level shifter.
That's really the whole story here, well, except I've been trying to find information on hooking these little guys up to Arduinos for when I do get them wired through a voltage translator. I've not found much thus far, but
this
seems promising. Sadly, I can't quite figure out how he's got the XL4432 wired up from his poorly drawn schematic(?). Anyone care to give an opinion as to whether those grounds are connected? Logic and CtC both state, "Always,
always
, connect all grounds." And if I remember my EE schematic lingo, isn't a dot on a connection most often used when there is an
extraordinary node
?
Oh well.
Atmega Fuse Doctor & Pogo Board
I'm not patient. At all. Which led me to brick a few Atmega boards (
1
,
2
). This upset me, especially since one of those chips was ~$12-17. I had bricked the chips trying to set their fuses in Atmel Studio. Given the price of these chips, I felt it was natural to begin looking for a solution. And apparently the
Fuse Doctor
is one such solution. In essence, it uses the high-voltage programming functions built into Atmel chips.
I thought, "Great. A way to save my chips!" But...I ran into a problem when I saw the board build: it was an etch-at-home design. And I still don't have a pair of pants free of ferric chloride stains. So, I set to re-designing the board into a version I could send off to OSHPark.
I found out the designer had an
SMD version
of the board ready to go. But after looking it over, I came to the conclusion it would simply be too expensive to print. So, I set out to shrink it.
In the end, I printed a board for around $12. But like the rest of the items here, it didn't work as expected. The problem had to do with an extra node I missed, which led to a short-circuit around the voltage regulator. So, I just sliced the trace and was at least able to get the board pulled up in Atmel Studio and the hex file written to the chip. So, in theory, if I can correct the short-circuit, supply it with 12 V, and connect it to the Atmel chips, I should be able to restore them.
You might see in this image where I'm providing the board with a pre-regulated 5 V through a via. This was to circumvent the short-circuit around the on-board regulator. Also, I had to attach my AVR wires on the other side of the 1k resistors to get the board's signature read in Atmel Studio.
Kariloy--who has a complete
Fuse Doctor
by the way--reminded me that even if I finished my board, I probably shouldn't hook it directly to the bricked board; rather, I'd need to attach to the chip directly. Of course, on the original Fuse Doctor this was done with a DIP socket. But I don't use DIPs... So, I began to think I should just give up on the idea.
Then, I remembered
Pogo Pins
. I jumped on eBay to see if I could find any pins with a head small enough to fit against a TQFP lead on the chips I use. These are the
pins
I ended up ordering.
When they came in, I was pleased to find they would probably be small enough to fit against a single lead without touching its neighbors. I then started looking for a way to create a jig. I settled on using some left-over HDPE (cutting board). I figured I could drill holes in it using the
bits
I had left over from the Hunter S. Thompson board debacle, using OddBot's old drill press.
<----- OddBot's drill press (thank you OddBot ;).
When I got ready to drill the holes, I got into Eagle and printed out the footprint of a 32-TQFP chip. I cut that out, then cut my HDPE to the same size, making two pieces with enough room for the drill guide in the center. I then drilled holes in two of the corners and put a couple of 4-40 screws and nuts through them to clinch the two pieces of HDPE together. The idea being, I could put spacers between them later, slip the pogo pins through the top piece, solder wires to their bottoms, and let them rest on the bottom piece of HDPE. Not sure why I spell all that out...I think the picture explains it.
After I had the whole thing screwed, tapped, and clinched, I ran out to OddBot's drill press, fitted it with the smallest bit I had, and drilled a hole over one of the pads. I then pulled out one of the small pins and was surprised to find it fit snugly.
And that's where I stopped; the main reason was not wanting to look up the pinout for the high-voltage programming interface on the ATmega328P.
Thermostat
I'm not going to write this up until I do something different than what the guy I stole it from has done.
My wife, Bek, asked me a while back to make her a smart thermostat, to which I replied, "I'm only at the dumb-thermostat skill level. Will that do?" She glared, so I Googled for some plans.
I found this guy's
Arduino thermostat
and I realized I had most of the parts already lying about.
-- Arduino Uno
-- I2C Real Time Clock
-- I2C LCD
-- Seeed Relay Shield
-- Dallas 1-Wire temperature sensor (
DS18B20)
The idea is pretty simple: the voltage line is tied together across the legs of the relay shield, and the Arduino switches it. The Arduino reads the temperature from the sensor, checks the time on the I2C RTC, and prints both to the I2C LCD. Also, if the time is right (after 4:00 PM), it will automatically kick on the AC. I've got it all together and working, just not tied into the AC lines yet. I was trying to find a way to power it, since all the AC lines are 24 V and there is no ground. So, I bought:
Which gives me 15 V at 6 A unregulated, or 4.5-40 V at 1.6 A.
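For anyone following along, the sketch boils down to something like the snippet below. The pin numbers, the 74 °F setpoint, and the DS1307-style RTC (via RTClib) are just example choices, so match them to your own relay shield and wiring; it uses the usual OneWire + DallasTemperature libraries for the DS18B20 and a LiquidCrystal_I2C display at 0x27.

```
#include <Wire.h>
#include <OneWire.h>
#include <DallasTemperature.h>
#include <RTClib.h>
#include <LiquidCrystal_I2C.h>

const int ONE_WIRE_PIN = 2;     // DS18B20 data pin (example)
const int RELAY_PIN    = 7;     // relay that switches the 24 V line (example)
const float SETPOINT_F = 74.0;  // example setpoint
const int START_HOUR   = 16;    // 4:00 PM, per the write-up

OneWire oneWire(ONE_WIRE_PIN);
DallasTemperature sensors(&oneWire);
RTC_DS1307 rtc;
LiquidCrystal_I2C lcd(0x27, 16, 2);

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);  // AC off at boot
  sensors.begin();
  rtc.begin();
  lcd.init();
  lcd.backlight();
}

void loop() {
  // Read the DS18B20 and the RTC.
  sensors.requestTemperatures();
  float tempF = sensors.getTempFByIndex(0);
  DateTime now = rtc.now();

  // Print time and temperature to the LCD.
  lcd.setCursor(0, 0);
  lcd.print(now.hour());
  lcd.print(':');
  if (now.minute() < 10) lcd.print('0');
  lcd.print(now.minute());
  lcd.print("  ");
  lcd.setCursor(0, 1);
  lcd.print(tempF, 1);
  lcd.print(" F  ");

  // After 4:00 PM, kick on the AC whenever it's warmer than the setpoint.
  bool callForCooling = (now.hour() >= START_HOUR) && (tempF > SETPOINT_F);
  digitalWrite(RELAY_PIN, callForCooling ? HIGH : LOW);

  delay(2000);
}
```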
Now, looking back,
this project is more expensive than buying a smart thermostat, and I don't recommend it.
The main reason I went with this build was because I owned everything but the temperature sensor and PSU.
Distcc and Gstreamer on Raspberry Pi:
When I began playing with OpenCV I fell in love. I just felt it has so much to offer us, but I had really kept my tinkering limited to what could be done on the Raspberry Pi. When I finished the little
face tracker
on the Pi, I knew it would work well as a sensor, but it would not approach the level of function I wanted. In short, the Pi simply could not give me the speed I needed. I pulled up OpenCV on my desktop (I won't go into the specifications, just know it's faster than the Pi) and realized pretty quickly I'd need to send the video data from the Pi to the desktop if I was to get the results I wanted.
So, I began playing with cheap ways to send video data to the desktop. I started by attempting to use
Gstreamer
to pipe the video data from the Pi, over my WiFi, into OpenCV on the desktop. Twenty hours later...I realized I was too dumb to make this happen. I began reading.
Put simply, there are a lot of compatibility issues with this process. I still think it is possible; I just got worn out trying to figure it out. As I understand it, not all of Gstreamer's development plugins work on the Pi. Then, it was a question of what was going to capture the video (Motion, FFMPEG, etc.), whether any of these programs liked to play with OpenCV, and whether they'd accept a video pipe coming from the Raspberry Pi. There's no need to say it, but I'm going to anyway: it was a mess.
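For the curious, the desktop/OpenCV end of what I was chasing boils down to something like this. It assumes OpenCV was built with GStreamer support and that the Pi is pushing an RTP/H.264 stream to UDP port 5000; getting those two assumptions to actually hold was exactly the mess described above.

```
// Receiving end on the desktop: pull the Pi's RTP/H.264 stream into OpenCV.
#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>

int main() {
  std::string pipeline =
      "udpsrc port=5000 caps=\"application/x-rtp, media=video, "
      "encoding-name=H264, payload=96\" ! rtph264depay ! avdec_h264 ! "
      "videoconvert ! appsink";

  cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);
  if (!cap.isOpened()) {
    std::cerr << "Couldn't open the GStreamer pipeline." << std::endl;
    return 1;
  }

  cv::Mat frame;
  while (cap.read(frame)) {
    // Any OpenCV processing (e.g., the face tracker) would go here.
    cv::imshow("Pi stream", frame);
    if (cv::waitKey(1) == 27) break;  // Esc quits
  }
  return 0;
}
```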
I've built Gstreamer and FFMPEG on the Pi more times than I can count (not many, I count to like 5). This got me to thinking, "This would probably go faster if I could compile these beefy programs on the desktop." After doing a bit of digging through cross-compiling literature, I decided on
Distcc
. It seems pretty nifty. It is a genuine remote compiler, meaning there's no SD card swapping or mounting and unmounting. Getting it to run on the Pi was the trick.
I won't go through all my wasted time and effort to sort out how to set up Distcc; and, like the other projects described here, I'm still not done. Though, this guy's
post
has helped me a lot. I've learned a lot about Bash scripting and have written a script to set up Distcc; mind you,
this script doesn't work yet,
it's like the rest of the post, incomplete:
I'm embarrassed to say how much time I spent fiddling with trying to get Distcc to work properly on the Pi. And so much of the wasted time was my fault. In the end, it was the
manual
that saved me.
Because I didn't really understand the symlink in this scenario, I was having a hard time figuring out how Distcc calls the correct compiler. From what I inferred, a symlink was created to replace gcc, c++, cpp, cc, and g++. Then, when any of these was called, the symlink would redirect the data to the remote compiler on the desktop. So, when my first go at installing Distcc didn't work because the $PATH variable was incorrect, I thought, "Well, if it's creating a symlink to another compiler, I should just delete the local compilers on the Pi--they're not needed anyway. That way I'm sure I'll get the remote." Due to this stinky logic I issued this command:
mv local.compilers localcompilers.old
Sadly, it wasn't until I read the manual (hey, it's a long damn manual) that I discovered a "local pre-compiler is used before the data is sent to the remote compiler." Meaning, every time I disabled the local compiler, I done-broke-it.
As I now understand it, the symlink to Distcc has to come up first in the $PATH so it catches the compiler call; Distcc then drops that entry from the $PATH and calls the next compiler it finds, which does the local pre-compiling before the source is shipped off to the remote machine. Since I had removed the local compilers, there was nothing left in the $PATH to do that pre-compiling, and I kept getting errors stating no compiler was found.
I discovered all this while waiting on one of my clients to finish a mental-health screening. It definitely confused him when I hit my head against the wall and said, "Son-of-a..."
I'd been digging around in the manual on my phone.
To sum up, I now know that both the real compiler and the symlink to distcc must be in the $PATH variable; they just have to be in the correct order.
But like all the tales of woe here, it is unfinished.