UPDATE: I discovered the image my link referred to (which is the true stock image) is unusable unless update and upgrade are run. Sadly, you can't do that with a 2GB image. Regardless, I've switched the link to the updated (as of this writing) Angstrom image. Please double-check and make sure you've got the latest image:
Replace the paths in steps 8 & 10 (but I'll try to keep it up to date).
Again, it is unfortunate, but you need a 4GB or greater microSD to use these instructions.
**MAIN:**
As I stated, I killed my stock Angstrom on the Beaglebone Black (B^3).
I pieced together how to restore it.
You'll need your B^3, microSD card, and Ethernet connection.
(WiFi dongle can replace the Ethernet, if you got it working before I did. And if you did, where's the walkthrough :P?)
9. Write the Angstrom Stock img to your Beaglebone Black eMMC.
This next step is going to write the image file to your Beagle's eMMC.
Two notes. First, it is going to take a while; if you are curious whether it is still installing, use the LED activity lights to guide you. When the PuTTY window returns you to the command prompt and the LEDs have slowed, you're good to go to the next step. Second, try not to power down the Beagle during this step.
11. Remove the microSD and power back on the Beagle. It should now boot like you bought it (unless, of course, I screwed up. Feel free to yell at me and I'll fix these instructions).
If you followed this process, you'll see these instructions can be used to write any image file to the eMMC. Let me know if you get a different distro working (consistently).
Now, I'm going to turn to getting Arch Linux going. Unless, someone else has it working...
JerZ?
:)
This fellow here has made some pretty nifty walkthroughs on the rtl8192 and the Chronodot (DS3231) RTC on Arch Linux. Though I've not attempted his instructions (I've been burnt out on this board), I believe they will get a reliable WiFi connection with the rtl8192, using Arch Linux, on the B^3.
Also, when I get the energy: the pinout at the bottom of this page has a mistake or two, as Zaius pointed out.
EDIT: Ok. Don't use the pinout until I research more. I'm getting conflicting information on what pins are what in Mode 7. The reference manual is stating one thing, but other sources are agreeing with me. I'm guessing Zaius is dead on; version issues. When I've got it sorted I'll update.
7/3/13:
Not much yet, still working on stable wifi. I thought I might as well share my work log; embarrassing as it may be.
If there are some Linux-wise in the crowd (especially those who know Arch Linux), would you mind taking a look at my work flow? I've got the wifi module working, though it's not as stable as I'd like.
Wow. Sorry all, been more than a month since I updated this post.
I've not given up on the BBB as a robot platform; I realized I didn't know Linux well enough to begin tweaking it on embedded devices (well, devices lacking community support, at least). I've spent the last month reading about Linux and trying to wrap my head around it (that, and fixing all of our bust-a-mucated cars).
I grew up Microsoft, and over this last month all household computers have switched to dual-booting Ubuntu 12.04 and Microsoft X. And the router will soon make the switch to OpenWRT.
Back to the BBB; the Realtek WiFi dongle that drove me mad has been solved by these guys. I've not had time to attempt their walkthroughs, but it is on the agenda.
It'll let you tunnel (SSH) into your Linux devices from either an iPhone or an iPad. I've enjoyed this for two reasons: I can keep an eye on how a program is compiling on the Raspberry Pi while watching a movie with the family, and I like the feeling of running Linux on a closed system. I understand it's a falsity, but it's still comforting.
I hope all are well.
5/20/13
Well, I think I could best describe this point in economic terms. It's the point where I've realized my productive efficiency is being restricted by the current inabilities of technology.
Figure 1
In essence, this graph shows that I cannot reach my desired productive efficiency (getting the B^3 to do the tricks I want it to). Really, I'd be happy at point C (even though point D is probably better for me and my family). The problem is that technology limitations are keeping me from reaching point C on the curve. And it's bugging the hell out of me. At first, I thought this was completely due to my own ineptitude (which is partially true), but there is another barrier, a seemingly hidden one.
The current Beaglebone driver technology is a hidden barrier to this productivity point.
I've read the warnings TinHead gave on treating embedded devices like computers. But if they don't carry some extraordinary functions, then what separates them from really, really fast microcontrollers? No. I'm pushing to have some basic PC functionality.
For instance,
WiFi capability.
Easy access to a graphical interface (note, I'm not stating GUI).
Ability to utilize higher-level programming languages (Python, C++, etc).
Really, that's it. A few features to allow rapid prototyping while harnessing the power of open software.
To me, if these three features are achieved, then I feel like the device is complete. Though, I should state, I realize these three features are no simple feat.
So, where's the Beaglebone Black?
Not there.
Here are some things not yet supported that will need to be for me to get to point C (Fig. 1):
Ability to plug in cheap, low-power WiFi dongles and get them going in under an hour. Let's be honest: cheap is what 90% of us will go with. It allows us to do more with our budgets. Therefore, if an embedded device can in any way utilize cheap peripherals, then let's focus on making it happen. [1]
Better power-management on the software side. Several distros will shut down the board during boot-up, as they peak above 500mA. The designers' suggestion? Don't plug anything in until the board is up. Sounds OK, right? Well, don't forget there is no hot-plugging of USB, microSD, or HDMI. The drivers haven't been written yet. I'm pretty sure this is a driver issue, since I've read through the BBB datasheet and the power supply hardware seems sound.
Ability to adjust the HDMI output. At one point, I was trying to get Arch Linux up and I couldn't get in via SSH. So I plugged the board into the only HDMI monitor I have and tried to adjust the ssh.config file. The problem? I couldn't see what was commented out due to the overscan. I dug through the Google group where the board designers rest; guess what? There is currently no way to adjust the video output. [2]
Therefore, my conclusion (though, my insanity is rising), is:
Figure 2
All this to say, my wife has taken away my Beaglebone Black until that green line moves farther out.
Yes, I am a little cat-whipped, but she has said, "You're doing this to relax, not to work a second job. I'd rather you work on some other projects for awhile." Hey, being married is the only thing keeping me sane. Well, her and you guys (and girls, if Max didn't run them off :P).
5/16/13:
I've finally got the Black Bone near where I've got my Pi. Here, the old Bone is running an updated Angstrom (4GB) build, using a WiFi dongle, and is connected to a 1A wall-wart (connected to microUSB, not the barrel-jack). When I'm off work today I'll try to complete a "Box to Wireless" walkthrough for good 'ole Angstrom.
(Question: anyone else feel like there's a Mason's conspiracy going on in the embedded world?)
I think I got near understanding TinHead's post: Don't treat an embedded device like a PC? I dunno.
5/15/13
I was able to get my WiFi dongle up by adding the realtek8192 kernel module. Not sure of all I did, but it works. So, as soon as I can get some repeatable steps, I'll post a walkthrough of setting up the Beaglebone Black with PuTTY, VNC, and a WiFi dongle.
5/14/13:b
Was able to get RealVNC to pick up Angstrom. Working on getting WiFi dongle up.
5/14/13:a
I added some links to Bonescript GPIO walkthroughs (PWM, Analog, Blinking).
5/12/13:b
I've created a visual guide to mode 7 pinout (added below).
5/12/13:a
I'm pretty frustrated. So, I'm going to back off working on the board until Tuesday, when my 8GB microSD comes in. At that point, I'll use this to reflash my eMMC boot partition and start working on two different projects: getting Arch Linux going well, and giving in and running update && upgrade on my Angstrom. Both I'll try to write up.
JerZ, or anyone else with a BBB: if you have any notes to add and don't mind shooting me an email, I'll update this post.
Hope everyone had an awesome Mother's Day.
5/11/13:c
May I encourage anyone who has yet to order their B^3:
Wait.
There are several intense issues being worked out on a hardware-software level. Until then, I feel you'll be as frustrated as me. Bdk6's pun says it all:
This board is being a bitch.
Some updates:
The package manager that came with Angstrom was actually broken for awhile, and no one bothered mentioning it to the community. Instead, there were many posts titled "why won't opkg work?" Now, I believe it will work if you run update && upgrade; of course, to do that you must have an SD card, since the result will be larger than 2GB.
I got Arch Linux up, briefly (it takes both eMMC and SD).
I lost the appropriate boot file for my eMMC (while attempting Arch Linux).
There doesn't seem to be an easy way to flash eMMC back to stock (I've got to wait for a bigger card).
The developers are pretty stressed out. I don't see solid support for a bit. And it already seems like an us-vs.-them situation between the developers and the open community.
I'm tired. Who's taking over?
5/11/13:b
So, I attempted getting my WiFi dongle set up (again) using Angstrom's package manager. I found that everything I tried installing with their package manager shot back an error. I read around, and I believe the following must be run to catch the stock Angstrom package manager up with the desired packages:
$ opkg update
$ opkg upgrade
I ran them, and guess what? The eMMC does not have enough space to hold the updates. Mother-of-a-Beagle!
Sadly, I'm using a microSD card from an old phone, which is only 2gb. My 8gb is on order.
This, in my opinion, puts the Beaglebone Black on the same level as the Raspberry Pi; that is, it must have an SD card (larger than 2GB) before you can use it. If someone else finds a way to install things on the B^3 without updating it, let me know and I'll correct this critique.
I screwed up the eMMC partition while trying to get Arch Linux on the Beagle.
5/9/13: Oh, dear lord. It's true. Lack of community support kills the Beaglebone.
It took me nearly 9 hours to set up an OS on a microSD.
I'll write it up tomorrow night, after some rest.
Ubuntu on Beaglebone Black:
5/6/13
I got my Beaglebone Black (BBB, B^3) in today. I thought I'd share my unboxing and general thoughts.
Please don't take this as Mr. Jones on a Sunday drive; rather, I want to provide the touch-n-feel information for every robot builder here. In short, I don't want everybody to waste $45 if the BBB is going to turn out to be Beagle sh..., well, you get it.
(Hey, Raspberry Pi puns could fill a library; I can't make one BBB pun. Sheesh.)
JerZ was trying to explain to me that there are several modes for the B^3 pins (he's run Hello World on it, and I'm sure by this moment he's run Blink). Regardless, I thought I'd try to whip up a visual for robot building on the B^3.
Keep in mind, these are the pin names -- you'll have to look at pages 69-73 of the reference manual to know how they might be used. Although almost every pin can be used regardless of its intended function: the functions are defined by software, each pin having 8 modes (0-7).
For example, if your bot is a quadrocopter: you'd find a real time kernel for Linux, and you'd probably set the pins to MODE 7, turning the non-essential pins into GPIO (i.e., sacrificing HDMI, eMMC, etc. lines for GPIO).
If anyone else is following this, please double-check me; I'll make corrections if needed.
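Once a pin is in mode 7 (GPIO), Linux typically exposes it through the sysfs GPIO interface, so you can drive it from a script. Here's a minimal, untested sketch; the GPIO number 60 is just an illustrative example (look your pin up in the reference manual), and the sysfs paths are the standard kernel ones:

```python
import os

GPIO_ROOT = "/sys/class/gpio"  # standard sysfs GPIO interface

def gpio_paths(n):
    """Build the sysfs paths for GPIO number n (60 is only an example)."""
    base = "%s/gpio%d" % (GPIO_ROOT, n)
    return {"export": GPIO_ROOT + "/export",
            "direction": base + "/direction",
            "value": base + "/value"}

def set_high(n):
    """Export GPIO n, set it as an output, and drive it high.
    This part only works on the board itself, run as root."""
    p = gpio_paths(n)
    if not os.path.isdir(os.path.dirname(p["direction"])):
        with open(p["export"], "w") as f:
            f.write(str(n))
    with open(p["direction"], "w") as f:
        f.write("out")
    with open(p["value"], "w") as f:
        f.write("1")

if __name__ == "__main__":
    print(gpio_paths(60)["value"])  # /sys/class/gpio/gpio60/value
```

On the board you'd just call `set_high(60)`; the path-building half runs anywhere, which makes it easy to double-check the numbers before touching hardware.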
Beaglebone Black and Raspberry Pi:
These are some of the differences I've noticed between the Beaglebone Black and the Raspberry Pi.
| Feature | BBB | RPi | Est. to Upgrade RPi or BBB | Est. Difficulty to Add |
|---|---|---|---|---|
| Real Time Clock | 1 | 0 | $2.30 | Medium |
| Processor Speed | 1GHz | 700MHz | $4.76 | Medium |
| Power Switch | 1 | 0 | $0.83 | Easy |
| Reset Switch | 1 | 0 | $0.83 | Easy |
| Boot Switch | 1 | 0 | $0.83 | Easy |
| GPIO | 65 | 26 | $8.85 | Medium |
| Flash Memory | 2GB | 0 | $5.66 | Easy |
| MicroSD (smaller footprint) | 1 | 0 | $4.35 | Easy |
| Serial Ports | 4 | 1 (that's accessible) | $1.50 | Hard |
| Barrel-jack and MicroUSB power | Yes | No (just microUSB) | $2.95 | Easy |
| Highest Screen Resolution | 1280 x 1024 | 1920 x 1200 | ~ | ~ |
| Peak Power Requirement | 460mA | 475mA | ~ | ~ |
| Supply Current to USB | 500mA | 700mA | $5.50 | Hard |
| USB Host (devices used by BBB or RPi) | 1 | 2 | $1.87 | Easy |
| USB Client (lets BBB or RPi be a device) | 1 | 0 | ~ | ~ |
| Plastic Headers for GPIO | 65 | 0 | $1.95 | Easy |
| USB Cable | 1 (Mini USB) | None | $1.07 | Easy |
The Hardware:
First impressions in a sentence: The hardware looks sound.
Several things make it stand out:
It uses a microSD card instead of an SD. This allows for a smaller overall size without using an Adafruit Adapter.
It has three tactile switches: (1) power, (2) reset, and (3) a mystery switch. I'm hoping the third is software accessible. The built-in power switch is a real winner. It means you can tell this guy to keep his £15 and his closed-source design.
It has one USB host port. This is my second greatest concern (after lesser community support): having to rely on USB hubs to connect devices. And, yes, I'm aware an IC, a breadboard, and access to hardware IO will allow custom USB devices. But sometimes don't you want to just plug-and-go? (Yes, I'm aware I'm lazy.)
It has a barrel-jack instead of a Micro USB for power. I don't know how you feel, but I'd rather have the Micro USB simply because I've got a lot of those lying about, whereas barrel-jacks, I'm not sure. Maybe under the decaying skull?
It's open hardware. The RPi claims to be for "educational purposes," although it seems the education is restricted to the software. Of course, this is an assumption based on not yet seeing a schematic for the Raspberry Pi silicon bits.
It's TI. They're nice to me. (I might have a stack of sampled ICs from them...maybe.)
If everyone is alright with me floating this post for a bit, I'm going to try to do a first-boot video tomorrow, then, continue until I've built this BBB into a bot.
No longer afeared of frying my Pi, I've moved on to trying to implement some of my bot goals. Like many, I want my bot to be able to interact with people, but I didn't realize that I'd stumble on this ability.
I've looked at many visual processing boards like the CMUcam v4, but I'm not paying $100 for any board. I looked into making one; it looks possible, but not much cheaper. So, I got curious as to what alternatives there are. I stumbled on Hack-a-Day's recommended article: OpenCV on Raspberry Pi.
Anyway, he provided instructions on setting up OpenCV (open source computer vision) on Raspberry Pi. Of course, it was about 20 minutes later I had the code working on my Pi.
I had been skeptical of the Pi's ability to run any computer vision software and, moreover, of its usefulness given the Pi's processing constraints. But once I had it up and running, I noticed it actually ran smoother than I had hoped. Don't get me wrong, I think it is less than 10FPS, but I could tell it would work for many robot applications. More than that, if the Raspberry Pi were used only for the computer vision, it would still be cheaper than many other hardware-driven CV boards.
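If you want to put an actual number on that frame rate, timing the detection loop is straightforward. A quick sketch (here `fake_detect` is just a stand-in for a real per-frame detection call):

```python
import time

def measure_fps(process_frame, n_frames=30):
    """Run a frame-processing function n_frames times and return frames per second."""
    start = time.time()
    for _ in range(n_frames):
        process_frame()
    elapsed = time.time() - start
    return n_frames / elapsed if elapsed > 0 else float("inf")

def fake_detect():
    # Stand-in for the real detection call; ~5 ms of fake "work".
    time.sleep(0.005)

if __name__ == "__main__":
    print("%.1f FPS" % measure_fps(fake_detect))
```

Swap `fake_detect` for the real `detect_and_draw(frame, cascade)` call and you get a rough FPS figure for your own Pi and camera.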
Basic Raspberry Pi and WiFi Dongle
WiFi Dongle: $6.17
Raspberry Pi: $35.00
SD Card (4GB): $2.50
Web cam: $8.00
Total for Basic RPi: $51.67
Therefore, I went to work on hacking his code.
Many hours later, I ended up with a _very crude_ Raspberry Pi, Ardy, camera, and servo orchestration to track my face. Mind you, this is a proof of concept, nothing more at this point. But I hope to eventually have my bot wandering around looking for faces.
Image of Pi VNC. The box outline is being written through i2c.
Pulling apart a little $8 eBay camera.
To Setup the Raspberry Pi:
If you're setting it up from scratch, start with these instructions.
But if you're already setup, I think all you need is OpenCV.
$ sudo apt-get install python-opencv
The Code:
The Arduino code reads bytes from the i2c, converts them to characters, then places the characters into an integer array. The Pi is sending 4 numbers (2 coordinate pairs): x1, y1, x2, y2.
The Python code is "facetracker.py" by Roman Stanchak and James Bowman; I've merely added lines 101-105, which load the coordinates of the box around your face into a string, then convert that to a string array. I also added the function txrx_i2c(). This function converts the string array into bytes and sends it to the i2c bus.
To change this setup from i2c to UART, focus on txrx_i2c() in the Python code and readData() in the Arduino code. I assure you, UART would be much easier.
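For the curious, the Pi side of a UART version might look something like the sketch below. It's untested: it assumes pyserial, assumes `/dev/ttyAMA0` (the Pi's usual UART device), and keeps the same comma-terminated "x1,y1,x2,y2," frame format the Arduino-side parser expects:

```python
def pack_coords(x1, y1, x2, y2):
    """Build the comma-terminated frame the Arduino-side parser expects."""
    return "%d,%d,%d,%d," % (x1, y1, x2, y2)

def send_coords_uart(x1, y1, x2, y2, port="/dev/ttyAMA0", baud=9600):
    """Send one coordinate frame over UART. Needs pyserial and real hardware."""
    import serial  # pip install pyserial
    with serial.Serial(port, baud, timeout=1) as ser:
        ser.write(pack_coords(x1, y1, x2, y2).encode("ascii"))

if __name__ == "__main__":
    print(pack_coords(12, 34, 56, 78))  # 12,34,56,78,
```

On the Arduino side you'd swap the Wire callbacks for `Serial.read()` in `loop()`, feeding the same character-by-character parser.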
If anyone has any questions, holler at me. Oh! And if someone can tell me ways I could optimize this code, I'm all ears.
```cpp
#include <Wire.h>
#include <Servo.h>
#define SLAVE_ADDRESS 0x2A

Servo CamServoX; // Attach the pan servo.
Servo CamServoY; // Attach the tilt servo.

int ServoTimer = 250; // Change to adjust how quickly the servos respond.
int SmallXJump = 3;   // Sets the movement amount for small pan jumps
int LargeXJump = 7;   // Sets the movement amount for large pan jumps
int SmallYJump = 1;   // Sets the movement amount for small tilt jumps
int LargeYJump = 2;   // Sets the movement amount for large tilt jumps

// How close your face is to the edge to trigger a jump.
int SmallYLimit = 40;
int LargeYLimit = 20;
int SmallXLimit = 40;
int LargeXLimit = 20;

// Set servos to initial position.
int posX = 90; // Servo position.
int posY = 90; // Servo position.

int x1; int y1; int x2; int y2; // Holders for frame dimensions.

// Indexes for getting i2c bytes, then, converting them to integers.
int i = 0;
int varI = 0;

// Sets flag to trigger ServoWrite() from the main loop.
// I tried to put this under the 'onRequest' call, but the Raspberry Pi kept
// giving me errors. This flagging was a work around.
int NoServoData = 0;

int dim[12]; // Int array for char[] ---> int conversion.
char d[8];   // Char holder array for byte-->char conversion.

void setup() {
  // initialize i2c as slave
  Wire.begin(SLAVE_ADDRESS);
  Wire.onRequest(sendData);
  Wire.onReceive(readData);
  Serial.begin(9600);

  // Attach servos
  CamServoX.attach(10); // Tilt (Y)
  CamServoY.attach(9);  // Pan (X)

  // Write initial servo position.
  CamServoX.write(posX);
  CamServoY.write(posY);
}

void loop() {
  // Again, this is the work around. The flag "NoServoData" is set under the i2c onReceive.
  if (NoServoData == 1) {
    ServoWrite();
  }
}

// This is just to show the RPi can be written to.
// Replace with stuff you want to write to the Pi.
char data[] = "Pasta";
int index = 0;

// callback for sending data
void sendData() {
  Wire.write(data[index]);
  ++index;
  if (index >= 5) { index = 0; }
}

// callback for receiving data.
void readData(int numbytes) {
  // Holds the chars
  int c;
  if (Wire.available() > 0) {
    while (Wire.available()) // slave may send less than requested
      c = Wire.read();
  }

  // Add each integer to a char array.
  // Skip commas ',' and keep adding the integers until char '\0' is received.
  // Then print out the complete string.
  if (c != ',') {
    if (c != '\0') {
      d[i] = d[i] + c; // Appends the characters to an array.
      i++;
    }
  } else {
    i = 0; // Reset the d char array index.
    if (varI < 7) { // We only want to get integers until we get all four numbers (x1, y1, x2, y2).
      dim[varI] = atoi(d); // Convert the d chars into an integer and store it in the dim array.
      d[0]=0; d[1]=0; d[2]=0; d[3]=0; d[4]=0; d[5]=0; // Clear the d array (i2c doesn't like for loops in this function).
      varI++; // Increase dim index.
    } else {
      // We now have all four numbers, load them into the variables.
      x1 = int(dim[4]);
      y1 = int(dim[1]);
      x2 = int(dim[2]);
      y2 = int(dim[3]);
      NoServoData = 1; // Set the ServoWrite() call flag.
      varI = 0; // Reset the dim index to prepare for next set of numbers.
    }
    i = 0; // Reset some
  }
}

void ServoWrite() {
  int x3 = 160 - x2; // Calculate the distance from the right edge of the screen.
  int y3 = 120 - y2; // Calculate the distance from the bottom edge.

  // For X Axis
  if (x1 < SmallXLimit) { // Only do small jumps, since not too far away from the edge.
    if (posX > 1) { // If the pan servo is at its edge, do nothing.
      for (int i = 0; i < LargeXJump; i++) {
        posX++;                // Set the new position.
        CamServoX.write(posX); // Make the adjustment.
        delay(ServoTimer);     // Delay between servo increments.
      }
    }
  }
  if (x3 < SmallXLimit) {
    if (posX < 180) {
      for (int i = 0; i < LargeXJump; i++) {
        posX--;
        CamServoX.write(posX);
        Serial.println(posX);
        delay(ServoTimer);
      }
    }
  }
  if (x1 < LargeXLimit) {
    if (posX > 1) {
      for (int i = 0; i < SmallXJump; i++) {
        posX++;
        CamServoX.write(posX);
        Serial.println(posX);
        delay(ServoTimer);
      }
    }
  }
  if (x3 < LargeXLimit) {
    if (posX < 180) {
      for (int i = 0; i < SmallXJump; i++) {
        posX--;
        CamServoX.write(posX);
        Serial.println(posX);
        delay(ServoTimer);
      }
    }
  }

  // For Y Axis
  if (y1 < SmallYLimit) {
    if (posY > 1) {
      for (int i = 0; i < SmallYJump; i++) {
        posY--;
        CamServoY.write(posY);
        Serial.println(posY);
        delay(ServoTimer);
      }
    }
  }
  if (y3 < SmallYLimit) {
    if (posY < 180) {
      for (int i = 0; i < SmallYJump; i++) {
        posY++;
        CamServoY.write(posY);
        Serial.println(posY);
        delay(ServoTimer);
      }
    }
  }
  if (y1 < LargeYLimit) {
    if (posY > 1) {
      for (int i = 0; i < LargeYJump; i++) {
        posY--;
        Serial.println(posY);
        CamServoY.write(posY);
        delay(ServoTimer);
      }
    }
  }
  if (y3 < LargeYLimit) {
    if (posY < 180) {
      for (int i = 0; i < LargeYJump; i++) {
        posY++;
        CamServoY.write(posY);
        Serial.println(posY);
        delay(ServoTimer);
      }
    }
  }

  // Reset servo write flag.
  NoServoData = 0;
}
```
Now for the Python Code:
```python
#!/usr/bin/python
"""
Have to execute using "sudo python facedetect.py --cascade=face.xml 0"
(Normal build sudo python "%f")
This program is demonstration for face and object detection using haar-like features.
The program finds faces in a camera image or video stream and displays a red box around them.
Original C implementation by: ?
Python implementation by: Roman Stanchak, James Bowman
"""
import sys
import cv2.cv as cv
from optparse import OptionParser
import time
import threading
import readline
import pygame
from pygame.locals import *
import smbus

# Parameters for haar detection
# From the API:
# The default parameters (scale_factor=2, min_neighbors=3, flags=0) are tuned
# for accurate yet slow object detection. For a faster operation on real video
# images the settings are:
# scale_factor=1.2, min_neighbors=2, flags=CV_HAAR_DO_CANNY_PRUNING,
# min_size=<minimum possible face size
min_size = (20, 20)
image_scale = 2
haar_scale = 1.2
min_neighbors = 2
haar_flags = 0

"""i2c Code"""
bus = smbus.SMBus(1)   # Open up an I2C bus.
address = 0x2a         # Setup Arduino address
sendstring = ""        # This will be my send variable (RPi-to-Arduino)
bytearraytowrite = []  # Actual array for holding bytes after conversion from string.

# This function actually does the writing to the I2C bus.
def toWrite(a):
    global sendstring
    global bytearraytowrite
    bytearraytowrite = map(ord, sendstring)  # This rewrites the string as bytes.
    for i in a:
        bus.write_byte(address, i)

def txrx_i2c():
    global sendstring
    sdata = ""
    rdata = ""
    for i in range(0, 5):
        rdata += chr(bus.read_byte(address))
    # print rdata
    # print bytearraytowrite
    # print "".join(map(chr, bytearraytowrite))  # Will convert bytearray to string.
    # Writes the key commands to the i2c bus.
    toWrite(bytearraytowrite)
    # time.sleep(.6)

def detect_and_draw(img, cascade):
    global sendstring
    # allocate temporary images
    gray = cv.CreateImage((img.width, img.height), 8, 1)
    small_img = cv.CreateImage((cv.Round(img.width / image_scale),
                                cv.Round(img.height / image_scale)), 8, 1)
    # convert color input image to grayscale
    cv.CvtColor(img, gray, cv.CV_BGR2GRAY)
    # scale input image for faster processing
    cv.Resize(gray, small_img, cv.CV_INTER_LINEAR)
    cv.EqualizeHist(small_img, small_img)
    if cascade:
        t = cv.GetTickCount()
        faces = cv.HaarDetectObjects(small_img, cascade, cv.CreateMemStorage(0),
                                     haar_scale, min_neighbors, haar_flags, min_size)
        t = cv.GetTickCount() - t
        print "detection time = %gms" % (t / (cv.GetTickFrequency() * 1000.))
        if faces:
            for ((x, y, w, h), n) in faces:
                # the input to cv.HaarDetectObjects was resized, so scale the
                # bounding box of each face and convert it to two CvPoints
                pt1 = (int(x * image_scale), int(y * image_scale))
                pt2 = (int((x + w) * image_scale), int((y + h) * image_scale))
                cv.Rectangle(img, pt1, pt2, cv.RGB(255, 0, 0), 3, 8, 0)
                x1 = int(x * image_scale)
                y1 = int(y * image_scale)
                x2 = int((x + w) * image_scale)
                y2 = int((y + h) * image_scale)
                sendstring = str(x1) + "," + str(y1) + "," + str(x2) + "," + str(y2) + ","
                sendstring = sendstring.translate(None, '() ')
                print sendstring
                txrx_i2c()
                sendstring = ""
    cv.ShowImage("result", img)

if __name__ == '__main__':
    parser = OptionParser(usage="usage: %prog [options] [filename|camera_index]")
    parser.add_option("-c", "--cascade", action="store", dest="cascade", type="str",
                      help="Haar cascade file, default %default",
                      default="../data/haarcascades/haarcascade_frontalface_alt.xml")
    (options, args) = parser.parse_args()

    cascade = cv.Load(options.cascade)

    if len(args) != 1:
        parser.print_help()
        sys.exit(1)

    input_name = args[0]
    if input_name.isdigit():
        # Where the image is actually captured from camera. "capture" is the variable holding image.
        capture = cv.CreateCameraCapture(int(input_name))
    else:
        capture = None

    cv.NamedWindow("result", 1)

    width = 160   # leave None for auto-detection
    height = 120  # leave None for auto-detection

    if width is None:
        width = int(cv.GetCaptureProperty(capture, cv.CV_CAP_PROP_FRAME_WIDTH))  # Gets the width of the image.
    else:
        cv.SetCaptureProperty(capture, cv.CV_CAP_PROP_FRAME_WIDTH, width)        # Sets the width of the image.

    if height is None:
        height = int(cv.GetCaptureProperty(capture, cv.CV_CAP_PROP_FRAME_HEIGHT))
    else:
        cv.SetCaptureProperty(capture, cv.CV_CAP_PROP_FRAME_HEIGHT, height)

    if capture:  # If "capture" actually got an image.
        frame_copy = None
        while True:
            frame = cv.QueryFrame(capture)
            if not frame:
                cv.WaitKey(0)
                break
            if not frame_copy:
                frame_copy = cv.CreateImage((frame.width, frame.height),
                                            cv.IPL_DEPTH_8U, frame.nChannels)
            if frame.origin == cv.IPL_ORIGIN_TL:
                cv.Copy(frame, frame_copy)
            else:
                cv.Flip(frame, frame_copy, 0)
            detect_and_draw(frame_copy, cascade)
            if cv.WaitKey(10) >= 0:
                break
    else:
        image = cv.LoadImage(input_name, 1)
        detect_and_draw(image, cascade)
        cv.WaitKey(0)

    cv.DestroyWindow("result")
```
As I prepare to start adding peripherals to my Pi Bot, I wanted to be sure to get around the 700mA power budget the Pi has. After searching for a cheap battery-powered USB hub and finding little, I decided to hack up a few cheap(ish) parts and make my own.
4. Make a small hole for wires and bring wires out.
5. Solder the respective leads to the DC-DC converter.
6. Smile, then sit through my way-too-long video on making it into the HUB.
Hope all are well. :)
NOTE: Regarding the error at the end of the video. Don't panic (that's what I did). I actually found out this had nothing to do with my hub, it had to do with plugging an iPhone into a Raspberry Pi.
NOTE2: I realize I used the wrong "hearty," my brain has problems typing homonyms and parahomonyms. :P
This article is specific:
How I personally would setup my Raspberry Pi to act as robot base.
But, I'll be clear, this is one of many possible setups. A chessboard has 64 squares, but those working the board allow for innumerable possibilities.
That aside, here we go:
1. Get BerryBoot. BerryBoot will allow you to download several Raspberry Pi images.
Now extract the zip files to a blank SD card.
Put the BerryBoot SD card in your Pi and boot it up.
3. Setup your WiFi dongle. I believe BerryBoot will now set up your WiFi dongle on initial boot, which it did for me (it even gave me the option to download the image via WiFi). But I had trouble getting my WiFi dongle pulled up after booting Raspbian Wheezy.
If you have difficulty with manual WiFi dongle setup, you might try this video.
Lastly, if you are looking for a WiFi dongle that's cheap, has good range, and draws very few mA (the Pi can only feed about 700mA through the USB ports), you might try this one, $6.17.
4. Setup PuTTY on your Desktop Computer.
Follow this video.
This will allow you to begin SSHing into the Pi. That way you don't have to look at a little RCA screen like me. For those who aren't familiar with SSH (like I was before this video), the video will explain it. At the risk of oversimplification, it allows you to access your Raspberry Pi's command line through your desktop.
You have to plug in your Pi's network address. You can find this by pulling up your wireless hub's configuration page; you should see what address your Pi is listed at. If, for some strange reason, it doesn't list the device name, just view the page while the Pi is up, then unplug your Pi and refresh the wireless hub configuration page. The device that disappeared is your Pi. I've never had to change the port number, but beware: you might need to, depending on your setup.
If you want to know whether you have the correct information, try logging in; if you get a screen like this, you're good.
Your username and password are by default:
pi, raspberry
Remember! In the case of a Raspberry Pi, always share your password, 'cause everyone has it anyway :)
Once you have PuTTY setup, you should be able to bring up your Pi command line, something like this:
5. Setup VNCServer on your Raspberry Pi.
Follow this video.
(Or this walkthrough.)
PuTTY will let you access your Pi's command line, but setting up a VNC server will actually allow you to access your Pi's desktop GUI from your PC, in the same manner as PuTTY.
6. Setup a VNC Client on your Desktop Computer.
Real VNC.
There are many different programs, I happened to end up using Real VNC.
Once you have VNC setup on both machines, PuTTY into your Pi and start the VNC server.
$sudo vncserver
Two notes here. If you did better with the video instructions than I did, your vncserver will start automatically on boot. Unfortunately, I have to type it each time (I'm too lazy to figure out the boot part of it). As a result, you'll have problems running certain Python scripts through VNC if you don't use $sudo vncserver.
You'll enter your Pi's address, but the port (display number) should be 1 (if I remember the video instructions correctly).
You should end up with a windowed version of your Raspberry Pi desktop. One more note: somewhere in the video it gets you to set up the "geometry" of the VNC desktop. The limitations you put there will be reflected in the quality of the desktop you see in the window. In essence, if you put in 640x480, that's the resolution this desktop will end up with. So, please, take advantage of the Pi's GPU :)
Use something like this, "-geometry 1024x728 -depth 24"
7. Resize your SD card to use all its space. (Note: this should already be done by BerryBoot, but other disk images will limit your SD card to 2GB, regardless of its actual size.)
8. Git manager will allow you to pull code from git hubs (again, this should already be installed, but just in case).
**Install the git manager:**
At Raspberry Pi prompt:
$sudo apt-get install git
The way to use it is like so,
At Raspberry Pi prompt:
$sudo git clone https://github.com/adafruit/Adafruit-Raspberry-Pi-Python-Code.git
9. Install SMBus.
This is specifically for my setup, since I'll be using the I2C bus to communicate between the Pi and the Arduino.
At Raspberry Pi prompt:
$sudo apt-get install python-smbus
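To sanity-check the install, here's roughly how smbus gets used elsewhere in this post: the Pi writes a string to the Arduino one byte at a time and reads bytes back. This is a sketch; it assumes the Arduino sits at 0x2A (as in the Arduino code in this post), and of course the bus calls only work on a Pi with the Arduino wired up:

```python
ARDUINO_ADDRESS = 0x2A  # must match SLAVE_ADDRESS in the Arduino sketch

def send_string(bus, text):
    """Write a string to the Arduino one byte at a time."""
    payload = [ord(c) for c in text]
    for b in payload:
        bus.write_byte(ARDUINO_ADDRESS, b)
    return payload  # returned so the string-to-byte conversion is easy to inspect

def read_string(bus, n):
    """Read n bytes back and join them into a string."""
    return "".join(chr(bus.read_byte(ARDUINO_ADDRESS)) for _ in range(n))

if __name__ == "__main__":
    import smbus          # from the python-smbus package
    bus = smbus.SMBus(1)  # I2C bus 1 (bus 0 on very early Pi boards)
    send_string(bus, "1,2,3,4,")
    print(read_string(bus, 5))
```

Because the helpers take the bus object as a parameter, you can test the byte conversion on your desktop with a fake bus before going near the hardware.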
10. Any other Python modules you might fancy.
Useful for keystroke, GUI, and other interfacing needs:
(I'll add other resources as fellow LMRs leave them in the comments).
11. (optional) Install the Arduino IDE on the Raspberry Pi. This will allow you to program the Arduino directly from your Pi -- and if you follow my design, you'll be able to do so without ever leaving your desktop computer. You can do this by opening the VNC server, opening the Arduino IDE on the remote desktop, selecting the sketch you want to upload, and, as long as your Arduino is connected by way of USB, uploading your sketch from where you sit. This allows for quick changes to Arduino code without switching wires around. Also, I think Kariloy is looking for a way to upload sketches by way of the GPIO pins. This would make a cleaner design.
12. Install WinSCP. This will allow you to transfer files between your desktop and the Pi.
I find this helps with programming management. I'm a messy filer. If I file at all.
Again, there are many commercial boards that will serve the same function. Also, you can do the same with a USB cable, serial pins to GPIO, or an RF connection--basically any way that lets the Arduino and Pi talk at a reasonable speed. The speed requirement will of course depend on your needs; I doubt many of these methods would be apt for running a responsive quadrocopter. But in my case, the Pi is the central nervous system and the Arduino is the autonomic nervous system. The Pi sends directives, but it's up to the Arduino to manifest them through responsive actuators. And I chose this optoisolator because I didn't want a voltage restriction on my actuators, or any fear of frying my Pi.
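To make the "directives" idea concrete, here is a minimal sketch of the Pi-side sender. The slave address (0x04) and the command-byte-plus-two-speed-bytes payload are my own hypothetical protocol, not anything from this article--swap in whatever scheme your Arduino sketch actually expects.

```python
# Sketch of a Pi-side "directive" sender over I2C.
# ASSUMPTIONS (mine, not the article's): the Arduino listens as an I2C
# slave at 0x04, and a directive is one command byte plus two speed
# bytes (left/right motor, 0-255).

ARDUINO_ADDR = 0x04   # hypothetical; match Wire.begin(addr) in your sketch
CMD_DRIVE = 0x01      # hypothetical command id

def pack_speeds(left, right):
    """Clamp each speed to one byte and return the payload list."""
    clamp = lambda v: max(0, min(255, int(v)))
    return [clamp(left), clamp(right)]

def send_directive(bus, left, right):
    """Write CMD_DRIVE plus the speed payload in one block transfer."""
    bus.write_i2c_block_data(ARDUINO_ADDR, CMD_DRIVE, pack_speeds(left, right))

if __name__ == "__main__":
    try:
        import smbus                                # python-smbus, from step 9
        send_directive(smbus.SMBus(1), 200, 200)    # bus 1 on rev 2 boards
    except (ImportError, IOError):
        print("No I2C hardware available; sketch only.")
```

On older rev 1 Pis the I2C bus is 0, not 1, so adjust `SMBus(1)` accordingly.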
Once you have the board setup, you can run:
$sudo i2cdetect -y -a 1
This should bring up a grid of active I2C addresses. You should find your Arduino at whatever address you set in your Arduino code.
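If you'd rather find that address programmatically (say, from Python via subprocess), the grid i2cdetect prints is easy to parse. This helper is my own sketch, assuming the standard fixed-width i2cdetect layout ("--" means no answer, "UU" means a kernel driver owns the address):

```python
def parse_i2cdetect(output):
    """Return the 7-bit addresses that answered in `i2cdetect` output."""
    found = []
    for line in output.strip().splitlines()[1:]:   # skip the column header
        row, _, cells = line.partition(":")
        base = int(row, 16)                        # row label: 0x00, 0x10, ...
        for col in range(16):
            # each cell is 3 characters wide: " --", " 48", or blank
            cell = cells[col * 3 + 1: col * 3 + 3].strip()
            if cell and cell not in ("--", "UU"):
                found.append(base + col)
    return found
```

Feeding it the decoded output of `i2cdetect -y -a 1` should give you a list containing your Arduino's address.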
Now, I've read this fellow's article on how the Raspberry Pi I2C pins are actually 5V tolerant. (Note, this is only for the I2C pins, due to their pull-up resistors.) So in theory, you can skip the optoisolator altogether. But that's up to you; I'll stick with my optoisolation.
Note, my code is really just the base for a robot. Right now it is nothing more than a very, very complex radio controller for an RC car. But someday, I'll make a real robot :)
**16. Tweak and gut the code as you see fit. **
17. Ask questions: pretty much everyone on this site is smarter than I am; they'll know the answer.
To other LMRians. Please feel free to tell me how to change, add, or retract from this article. As tired as I am right now, I plan to revise when I'm less muddled.
I've waited an ample bit to finish incorporating my Raspberry Pi into my bot. But since I know so little about electricity, I swore to myself I wouldn't add my Pi to my bot until I was absolutely sure I wouldn't fry it.
Well, I'm still not "absolutely" sure, but I feel this little optoisolator has brought me a lot closer. This builds on my post from a week or so ago about making Eagle parts.
I plan to actually list out what tweaks a Wheezy image needs to get this optoisolator build to work. It's actually pretty easy--but whatever you do, don't be lured in by quick2wire. Those buggers wasted most of my day :(
If anyone has questions let me know.
Oh, one note. When I populated the board I used 4.7k resistors on the Arduino side, but I pulled off everything on the Raspberry Pi side. It seems the Pi has built in pull-ups that do the job rather well.
I decided to try making an Arduino Pro Mini at home. Having done it, it's not worth it. You can buy one for a dollar more than it costs to make one, and it took awhile to populate. Although, it was "fun."
This project was also a chance for me to test the Spying-Stalactite I built.
I've enjoyed it. It allows me to reflect on my strategy while populating boards. It's simply a drop-down with some high-powered LEDs (~2500 lumens), a heatsink, and a coolant fan. It has a hole for my iPhone to do the recording. Cheap and simple. Although, I need to diffuse the light; as you might see in the video, it washes out the details of the projects. Also, I'll add a few more lights and do away with the tungsten lamp, since the iPhone is constantly in a white-balance battle as I move in front of the mixed light sources.
I populated this board and everything came out fine (although it was much more difficult trying not to block the camera with my head). I popped it into Atmel Studio and it read out the device voltage and signature. Of course, I bricked it, as I seem to do a lot.
My next project is a Fuse Doctor. :)
I had ordered the boards from OSHPark and had planned on making three. So, I populated another and took some time programming it. I've outlined my steps below:
1. Hook up the AVRISP MKII
2. Open Atmel Studio. Go to Tools -- Device Programming.
3. Setup:
Tool: AVRISP mkII
Device: ATmega328P
Interface: ISP
Click apply
4. Read Target voltage (it should be ~5V). Read Device Signature.
Open boards.txt, which comes with Arduino (\Desktop\arduino-1.0.3\hardware\arduino\boards.txt).
Scroll down to the area marked:
8. Pull the programming information for the board from this area.
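Since boards.txt is just key=value text, you can also pull that information with a few lines of script. The snippet below uses a sample entry I've reconstructed to match the Arduino 1.0.x format and this article's values--double-check the key names and bytes against your own copy (some Arduino versions write the extended fuse as 0x05, which sets the same brown-out level).

```python
# Pull one board's programming info out of boards.txt (key=value lines).

def board_entries(text, board):
    """Return {key: value} for one board's lines in boards.txt."""
    prefix = board + "."
    entries = {}
    for line in text.splitlines():
        line = line.strip()
        if line.startswith(prefix) and "=" in line:
            key, _, value = line.partition("=")
            entries[key[len(prefix):]] = value
    return entries

# Sample entry reconstructed from the Arduino 1.0.x boards.txt; verify
# against your own copy before programming anything.
sample = """\
pro5v328.name=Arduino Pro or Pro Mini (5V, 16 MHz) w/ ATmega328
pro5v328.bootloader.low_fuses=0xFF
pro5v328.bootloader.high_fuses=0xDA
pro5v328.bootloader.extended_fuses=0xFD
pro5v328.bootloader.file=ATmegaBOOT_168_atmega328.hex
"""

info = board_entries(sample, "pro5v328")
```

With the real file you'd read it in with `open(...).read()` instead of the sample string.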
Now, I've bricked a few boards, but I think I've figured this one out. When programming this board with the MKII and Atmel Studio, you should follow this order:
1. Set the fuses:
Extended: 0xFD
High: 0xDA
Low: 0xFF
(Double check the board file to make sure I didn't make typos)
Hit "Program"
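If you want to convince yourself those three bytes make sense before hitting "Program," here's a quick bit-level sanity check. The bit positions come from my reading of the ATmega328P datasheet fuse tables--verify against the datasheet rather than trusting me (and remember fuse bits read 0 when "programmed"):

```python
# Sanity-check the fuse bytes for a 5V/16MHz Pro Mini (ATmega328P).
# Bit positions per the ATmega328P datasheet fuse tables; 0 = programmed.

low, high, ext = 0xFF, 0xDA, 0xFD

assert low & 0x80, "CKDIV8 should be unprogrammed: run at the full 16 MHz"
assert (low & 0x0F) == 0x0F, "CKSEL=1111: low-power crystal oscillator"
assert not (high & 0x20), "SPIEN programmed: ISP programming stays enabled"
assert not (high & 0x01), "BOOTRST programmed: reset jumps to the bootloader"
assert (ext & 0x07) == 0x05, "BODLEVEL=101: brown-out detect at 2.7 V"
print("fuse bytes look right for a 5 V / 16 MHz Pro Mini")
```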
2. Upload Bootloader.
The bootloader for the 5V, 16MHz Arduino Pro Mini (which is what I built) is ATmegaBOOT_168_atmega328.hex (Desktop\arduino-1.0.3\hardware\arduino\bootloaders\atmega\ATmegaBOOT_168_atmega328.hex).
It's important to note that the 3.3v and 5v versions use different bootloaders.
Go to the Memories tab
Hit the browse ellipsis.
Select the "ATmegaBOOT_168_atmega328.hex"
(Double check the boards file to make sure I'm not screwing you up).
Hit program.
3. Set Lock Bits.
Go to the "Lock bits" tab.
Check the boards.txt file for Lockbit number
Lockbit: 0xCF
(Double check the boards.txt. I don't take blame for bricked boards :P).
Hit "Program"
9. Upload the Blink sketch; the LED by the reset button should blink.
10. Let me know how it went. If you brick a chip using these instructions, let me know so I can modify them quickly.
Now that I'm used to the camera and the stalactite, I plan to annotate my next board video with tips on working with 0402s.
Hope all are well.
P.S. Birdmun et al., sorry about the copyright issues. Not a professional at anything, especially video editing :)
Addendum:
Please don't watch my videos.
After Birdmun's comment I found Hack-a-Day has created better videos (shakes fist at Hack-a-Day) and I don't want to waste anyone's time. Although, mine has a better soundtrack and fewer mutton-chops :)
Original:
I was speaking with TeleFox and Birdmun about finding an optoisolator for use with my Raspberry Pi; I had gotten some samples of these ICs: the ADUM1250ARZ. Well, for awhile now I've wanted to share my dumb-luck methods for designing a board around a sampled IC.
Originally posted on [www.letsmakerobots.com](www.letsmakerobots.com).
1. Get over to Analog Devices and sign up for a sample account. They seem to be pretty nice and let you order several samples every month, I believe.
Try to learn Python while the mail peoples do their magics.
Flip-off your Python code and get the mail.
Take everything out. ADXL breakout board, ADXL345 chip, and caps.
Populate your board. At this point, a good iron will do you well. But as a general rule, start with the largest chip when soldering SMDs. In our case, it is the ADXL345. Paint some solder flux all over the exposed pads. Now, take a very fine solder, such as .022 and put some on the tip of your iron. Drag the droplet of solder across the exposed pads of where the ADXL will go. Now, after the beads have cooled, paint your solder flux over the hardened beads. The idea is to have the chip floating on this flux.
Place the ADXL345 on the invisible flux, hovering over the pads. Make sure the small white dot on the corner of the chip is in the same place as the red marking shown below.
Put the board on an overturned iron.
This is the most important part:
Watch the chip.
What you are hoping to see is the chip magically float into place as the solder flux flows out from under it, letting the solder beads bond with the exposed pads of the ADXL345.
Seriously, don't look away :).
If for some reason you don't feel the chip has bonded at each pad, then very lightly press down in the middle of the chip. I said lightly!
12. Cap that B. Erm, populate the capacitors.
13. Plug and pray.
14. Realize it doesn't work because you suck at life.
15. Pull the ADXL back off, clean the pads with solder wick, and try again.
16. Repeat step 11, but this time watch the chip. No, seriously.
17. Hook it up and check it out.
The chip is both SPI- and I2C-ready, but I prefer I2C. So, hook that sucker up to the Arduino and see if it works. This fellow provided code, and instructions on connecting it are in the comments at the top of the code.
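For reference, each ADXL345 axis comes back as two bytes, low byte first, in two's complement. The conversion math is the same in any language; here's a Python version assuming the default +/-2 g range (the datasheet's typical scale is about 3.9 mg/LSB, i.e. roughly 256 counts per g):

```python
# Convert one ADXL345 axis reading (two bytes, little-endian,
# two's complement) into g's. counts_per_g assumes the default
# +/-2 g range; check the datasheet for other range settings.

def axis_to_g(lo, hi, counts_per_g=256.0):
    raw = (hi << 8) | lo
    if raw & 0x8000:              # sign-extend 16-bit two's complement
        raw -= 1 << 16
    return raw / counts_per_g

# Example: bytes (0x00, 0x01) -> raw 256 -> 1.0 g
```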
18. Watch as your one dollar(ish) ADXL345 does witchery.
19. Ponder the ethics of sampling chips, borrowing board layouts from SparkFun, and buying underpriced capacitors from China, all of which saved you around $12-25--or have a beer.
20. Try not to abuse the sampling privilege.
If you have any questions, I'll do my ignorant best.
I finally got in my Mega Mini Motor (M3) shield that I designed. I was surprised, after populating the board: it actually worked. The board came about after making the Arduino Mega Mini. I noticed I wouldn't really be reducing the bulk of my bot because of the amount of wiring it would take to get logic to the Arduino Motor Driver shield I was using. Therefore, I set out to design a motor driver shield that would plug right into the MegaMini. I broke out Eagle and some datasheets on an assortment of ICs.
I started out working with the L298D chip, but quickly got frustrated with the way it sat on the MegaMini footprint. Plus, the flyback diodes were pissing me off. I remembered reading that the SN754410 has internal ESD diodes. I started playing with the chip layout and got a board design I was pretty happy with.
I'll attempt a full write-up later; I'm pretty mentally fatigued from learning HTML/CSS. (I know, easy. But as many know by now, cognitively, I'm as slow as a snail on salt.)