UPDATE: I discovered the image the link referred to (which is the true stock image) is unusable unless update and upgrade are run. Sadly, you can’t do that with a 2GB image. Regardless, I’ve switched the link to the updated (as of this writing) Angstrom image. Please double-check and make sure you’ve got the latest image:
9. Write the Angstrom Stock img to your Beaglebone Black eMMC.
This next step is going to write the image file to your Beagle’s eMMC.
Two notes. First, it is going to take a while; if you are curious whether it is still installing, use the LED activity lights as a guide. When the PuTTY window gives you back the command prompt and the LEDs have slowed, you’re good to go on to the next step. Second, try not to power down the Beagle during this step.
This fellow here has made some pretty nifty walkthroughs on the rtl8192 and the Chronodot (DS3231) RTC on Arch Linux. Though I’ve not attempted his instructions (I’ve been burnt out on this board), I believe they will get you a reliable WiFi connection with the rtl8192, using Arch Linux, on the B^3.
Also, when I get the energy, I’ll fix the pinout at the bottom of this page; it has a mistake or two, as Zaius pointed out.
EDIT: Ok. Don’t use the pinout until I research more. I’m getting conflicting information on what pins are what in Mode 7. The reference manual is stating one thing, but other sources are agreeing with me. I’m guessing Zaius is dead on; version issues. When I’ve got it sorted I’ll update.
Not much yet, still working on stable wifi. I thought I might as well share my work log; embarrassing as it may be.
If there are some Linux-wise folks in the crowd (especially those who know Arch Linux), would you mind taking a look at my work flow? I’ve got the WiFi module working, though it’s not as stable as I’d like.
Wow. Sorry all, been more than a month since I updated this post.
I’ve not given up on the BBB as a robot platform; I realized I didn’t know Linux well enough to begin tweaking it on embedded devices (well, devices lacking community support, at least). I’ve spent the last month reading about Linux and trying to wrap my head around it (that and fixing all of our bust-a-mucated cars).
I grew up Microsoft, and over this last month all household computers have switched to dual-booting Ubuntu 12.04 and Microsoft X. And the router will soon make the switch to OpenWRT.
Back to the BBB; the Realtek WiFi dongle that drove me mad has been solved by these guys. I’ve not had time to attempt their walkthroughs, but it is on the agenda.
It’ll let you tunnel (SSH) into your Linux devices from either an iPhone or iPad X. I’ve enjoyed this for two reasons: I can keep an eye on how a program is compiling on the Raspberry Pi while watching a movie with the family, and I like the feeling of running Linux on a closed system. I understand it’s a falsity, but it’s still comforting.
I hope all are well.
Well, I think I could best describe this point in economic terms. It’s the point where I’ve realized my productive efficiency is being restricted by the current inabilities of technology.
In essence, this graph shows that I cannot reach my desired productive efficiency (getting the B^3 to do the tricks I want it to do). Really, I’d be happy at point C (even though point D is probably better for me and my family). The problem is that technology limitations are keeping me from reaching point C on the curve. And it’s bugging the hell out of me. At first, I thought this was completely due to my ineptitude (which is partially true), but there is another barrier, a seemingly hidden one.
The current Beaglebone driver technology is a hidden barrier to this productivity point.
I’ve read the warnings TinHead gave about treating embedded devices like computers. But if they don’t carry some extraordinary functions, then what separates them from really, really fast microcontrollers? No. I’m pushing to have some basic PC functionality:
Easy access to a graphical interface (note, I’m not stating GUI).
Ability to utilize higher-level programming languages (Python, C++, etc).
Really, that’s it. A few features to allow rapid prototyping while harnessing the power of open software.
To me, if these features are achieved, then I feel like the device is complete. Though, I should state, I’ve realized these features are no simple feat.
So, where’s the Beaglebone Black? Not there.
Some things not supported that will need to be for me to get to point C (Fig. 1).
Ability to plug in cheap, low-power WiFi dongles and get them going in under an hour. Let’s be honest: cheap is what 90% of us will go with. It allows us to do more with our budgets. Therefore, if an embedded device can in any way utilize cheap peripherals, then let’s focus on making that happen.
Better power management on the software side. Several distros will shut down the board during boot-up, as they peak above 500mA. The designers’ suggestion? Don’t plug anything in until the board is up. Sounds OK, right? Well, don’t forget there is no hot-plugging on the USB, microSD, or HDMI; the drivers haven’t been written yet. I’m pretty sure this is a driver issue, since I’ve read through the BBB datasheet and the power-supply hardware seems sound.
Ability to adjust the HDMI output. At one point, I was trying to get Arch Linux up and couldn’t get in via SSH. So I plugged the board into the only HDMI monitor I have and tried to adjust the ssh.config file. The problem? I couldn’t see what was commented out, due to overscan. I dug through the Google group where the board designers reside; guess what? There is currently no way to adjust the video output.
Therefore, my conclusion (though my insanity is rising) is:
All this to say, my wife has taken away my Beaglebone Black until that green line moves farther out.
Yes, I am a little cat-whipped, but she has said, “You’re doing this to relax, not work a second job. I’d rather you work on some other projects for awhile.” Hey, being married is the only thing keeping me sane. Well, her and you guys (and girls, if Max didn’t run them off :P).
I’ve finally got the Black Bone near where I’ve got my Pi. Here, the old Bone is running an updated Angstrom (4GB) build, using a WiFi dongle, and is connected to a 1A wall-wart (via microUSB, not the barrel jack). When I’m off work today I’ll try to complete a “Box to Wireless” walkthrough for good ‘ole Angstrom.
(Question, anyone else feel like there’s a Mason’s conspiracy going on in the embedded world?)
I think I came close to understanding TinHead’s post: don’t treat an embedded device like a PC? I dunno.
I was able to get my WiFi dongle up by adding the realtek8192 kernel module. I’m not sure of everything I did, but it works. So, as soon as I can get some repeatable steps, I’ll post a walkthrough of setting the Beaglebone Black up with PuTTY, VNC, and a WiFi dongle.
Was able to get RealVNC to pick up Angstrom. Working on getting WiFi dongle up.
I added some links to Bonescript GPIO walkthroughs (PWM, Analog, Blinking).
I’ve created a visual guide to mode 7 pinout (added below).
I’m pretty frustrated. So, I’m going to back off working on the board until Tuesday, when my 8GB microSD comes in. At that point, I’ll use it to reflash my eMMC boot partition and start working on two different projects: getting Arch Linux going well, and giving in and running update && upgrade on my Angstrom. I’ll try to write up both.
Jerz, or anyone else with a BBB, if you have any notes to add, if you don’t mind shooting me an email I’ll update this post.
Hope everyone had an awesome mother’s day.
May I encourage anyone who has yet to order their B^3: Wait.
There are several intense issues being worked out on a hardware-software level. Until then, I feel you’ll be as frustrated as me. Bdk6’s pun says it all: This board is being a bitch.
The package manager that came with Angstrom was actually broken for a while, and no one bothered mentioning it to the community. Instead, there were many posts titled “why won’t opkg work?” Now, I believe it will work if you run update && upgrade; of course, to do that you must have an SD card, since the result will be larger than 2GB.
I got Arch Linux up, briefly (it takes both eMMC and SD).
I lost the appropriate boot file for my eMMC (while attempting Arch Linux).
There doesn’t seem to be an easy way to flash eMMC back to stock (I’ve got to wait for a bigger card).
The developers are pretty stressed out. I don’t see solid support coming for a bit. And it already seems like an us-vs.-them divide between the developers and the open community.
I’m tired. Who’s taking over?
So, I attempted getting my WiFi dongle set up (again) using Angstrom’s package manager. I found that everything I tried to install through it shot back an error. From what I read, I believe the following must be run to catch the stock Angstrom package manager up with the desired packages:
$ opkg update
$ opkg upgrade
I ran them, and guess what? The eMMC does not have enough space to hold the updates. Mother-of-a-Beagle!
Sadly, I’m using a microSD card from an old phone, which is only 2GB. My 8GB is on order.
This, in my opinion, puts the Beaglebone Black on the same level as the Raspberry Pi; that is, it must have an SD card (larger than 2GB) before you can use it. If someone else finds a way to install things on the B^3 without updating it, let me know and I’ll correct this critique.
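If you want to see how much room is actually left before attempting the upgrade, here’s a quick sketch in Python (assuming Python 3 is on the image; on older Python 2 installs you’d use os.statvfs instead). The 500 MiB threshold is just my guess at a comfortable margin, not an official number:

```python
import shutil

# Check free space on the root filesystem before running "opkg upgrade" --
# the stock 2GB eMMC fills up fast.
total, used, free = shutil.disk_usage("/")
print(f"total: {total / 2**20:.0f} MiB, free: {free / 2**20:.0f} MiB")

# Rough guard: complain if less than ~500 MiB is free (threshold is a guess).
enough_room = free > 500 * 2**20
if not enough_room:
    print("Probably not enough room for opkg upgrade on this flash.")
```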
I screwed up the eMMC partition while trying to get Arch Linux on the Beagle.
5/9/13: Oh, dear lord. It’s true. Lack of community support kills the Beaglebone.
It took me nearly 9 hours to set up an OS on a microSD.
I’ll write it up tomorrow night, after some rest.
Ubuntu on Beaglebone Black:
I got my Beaglebone Black (BBB, B^3) in today. I thought I’d share my unboxing and general thoughts.
Please don’t take this as Mr. Jones on a Sunday drive, rather, I want to provide the touch-n-feel information for every robot builder here. In short, I don’t want everybody to waste $45 if the BBB is going to turn out to be Beagle sh…, well, you get it.
(Hey, Raspberry Pi puns could fill a library, but I can’t make one BBB pun. Sheesh.)
JerZ was trying to explain to me that there are several modes for the B^3 pins (he’s run Hello World on it, and I’m sure by this moment he’s run Blink). Regardless, I thought I’d try to whip up a visual for robot building on the B^3.
Keep in mind, these are the pin names – you’ll have to look at pages 69-73 of the reference manual to know how they might be used. Almost every pin can be used regardless of its intended function, though: each pin’s function is defined in software, with 8 modes (0-7) to choose from.
For example, if your bot is a quadrocopter: you’d find a real-time kernel for Linux and probably set the pins to MODE 7, turning the non-essential pins into GPIO (i.e., sacrificing the HDMI, eMMC, etc. lines for GPIO).
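Once a pin is muxed to mode 7, the usual way to reach it from userspace is the kernel’s sysfs GPIO interface. Here’s a minimal Python sketch of that side of it; the pin number (60) and the base path are illustrative, and the real kernel GPIO number has to come from the reference manual’s mux tables:

```python
import os

def export_gpio(pin, base="/sys/class/gpio"):
    """Claim a GPIO pin via the kernel's sysfs interface.

    Writing the kernel GPIO number to <base>/export makes the kernel
    create a gpio<pin> directory with direction/value files. The pin
    number is the kernel GPIO number, not the header pin number.
    """
    with open(os.path.join(base, "export"), "w") as f:
        f.write(str(pin))
    return os.path.join(base, f"gpio{pin}")

def set_direction(gpio_dir, direction):
    """Set a previously exported pin to 'in' or 'out'."""
    with open(os.path.join(gpio_dir, "direction"), "w") as f:
        f.write(direction)
```

Usage on the board would be something like `d = export_gpio(60)` followed by `set_direction(d, "out")`, then writing "1" or "0" to the pin’s value file.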
First impressions in a sentence: The hardware looks sound.
Several things make it stand out:
It uses a microSD card instead of a full-size SD. This allows for a smaller overall size without using an Adafruit adapter.
It has three tactile switches: (1) power, (2) reset, and (3) a mystery switch. I’m hoping the third is software-accessible. The built-in power switch is a real winner. It means you can tell this guy to keep his £15 and his closed-source design.
It has one USB host port. My second greatest concern (after lesser community support) is having to rely on USB hubs to connect devices. And, yes, I’m aware an IC, a breadboard, and access to hardware IO will allow custom USB devices. But sometimes don’t you want to just plug-and-go? (Yes, I’m aware I’m lazy.)
It has a barrel-jack instead of a Micro USB for power. I don’t know how you feel, but I’d rather have the Micro USB simply because I’ve got a lot of those lying about, whereas barrel-jacks, I’m not sure. Maybe under the decaying skull?
It’s open hardware. The RPi claims to be for “educational purposes,” although, it seems the education is restricted to the software. Of course, this is an assumption based on not yet seeing a schematic for the Raspberry Pi silicon bits.
It’s TI. They’re nice to me. (I might have a stack of sampled ICs from them…maybe.)
If everyone is alright with me floating this post for a bit, I’m going to try to do a first-boot video tomorrow, then, continue until I’ve built this BBB into a bot.
No longer afeared of frying my Pi, I’ve moved on to trying to implement some of my bot goals. Like many, I want my bot to be able to interact with people, but I didn’t realize that I’d stumble on this ability.
I’ve looked at many visual processing boards like the CMUcam v4, but I’m not paying $100 for any board. I looked into making one, it looks possible, but not much cheaper. So, I got curious as to what alternatives there are. I stumbled on Hack-a-Day’s recommended article: OpenCV on Raspberry Pi.
Anyway, he provided instructions on setting up OpenCV (open source computer vision) on Raspberry Pi. Of course, it was about 20 minutes later I had the code working on my Pi.
I had been skeptical of the Pi’s ability to run any computer vision software, and moreover, of its usefulness given the Pi’s processing constraints. But once I had it up and running, I noticed it actually ran smoother than I had hoped. Don’t get me wrong, I think it is less than 10FPS, but I could tell it would work for many robot applications. More than that, if the Raspberry Pi were used only for the computer vision, it would still be cheaper than many other hardware-driven CV boards.
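Rather than eyeballing the frame rate, a little helper can measure it. This sketch is generic Python (nothing OpenCV-specific, since your capture loop may differ from mine); you’d call tick() once per frame inside the loop and print fps() every so often:

```python
import time

class FpsCounter:
    """Track frames-per-second over a sliding window of recent frames."""

    def __init__(self, window=30):
        self.window = window
        self.times = []

    def tick(self):
        """Call once per captured frame."""
        self.times.append(time.perf_counter())
        if len(self.times) > self.window:
            self.times.pop(0)

    def fps(self):
        """Frames per second over the current window (0.0 until warmed up)."""
        if len(self.times) < 2:
            return 0.0
        span = self.times[-1] - self.times[0]
        return (len(self.times) - 1) / span if span > 0 else 0.0
```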
Basic Raspberry Pi and WiFi Dongle
WiFi Dongle: $6.17
Raspberry Pi: $35.00
SD Card (4GB): $2.50
Web cam: $8.00
Total for Basic RPi: $51.67
Therefore, I went to work on hacking his code.
Many hours later, I ended up with a *very crude* Raspberry Pi, Ardy, camera, and servo orchestration to track my face. Mind you, this is a proof of concept, nothing more at this point. But I hope to eventually have my bot wandering around looking for faces.
Image of Pi VNC. The box outline is being written through i2c.
Pulling apart a little $8 eBay camera.
To Setup the Raspberry Pi:
If you’re setting it up from scratch, start with these instructions.
But if you’re already setup, I think all you need is OpenCV.
$ sudo apt-get install python-opencv
The Arduino code reads bytes from the I2C bus, converts them to characters, then places the characters into an integer array. The Pi sends 4 numbers (2 coordinates): x1, y1, x2, y2.
The Python code is “facetracker.py” by Roman Stanchak and James Bowman; I’ve merely added lines 101-105, which load the coordinates of the box around your face into a string and convert that into a string array. I also added the function txrx_i2c(), which converts the string array into bytes and sends them over the I2C bus.
To change this setup from I2C to UART, focus on txrx_i2c() in the Python code and onRead() in the Arduino code. I assure you, UART would be much easier.
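For anyone who wants the gist without digging through facetracker.py, the packing logic amounts to something like this sketch. The actual smbus write is left as a comment, since the bus number and the 0x04 slave address are just examples that depend on your wiring and your Arduino sketch:

```python
# Sketch of the Pi-side packing used to ship the face box over I2C.
# The conversion logic is the interesting part and runs anywhere;
# only the commented-out write needs real hardware.

def pack_coords(x1, y1, x2, y2):
    """Turn the four box coordinates into the byte list sent on the bus."""
    msg = f"{x1},{y1},{x2},{y2}"
    return [ord(ch) for ch in msg]

def unpack_coords(data):
    """Arduino-side view: bytes -> characters -> four integers."""
    msg = "".join(chr(b) for b in data)
    return [int(n) for n in msg.split(",")]

# On the Pi, the byte list would then go out with something like:
#   import smbus
#   bus = smbus.SMBus(1)                     # or 0 on early Pi revisions
#   bus.write_i2c_block_data(0x04, 0, data)  # 0x04: address set in the sketch
```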
If anyone has any questions, holler at me. Oh! And if someone can tell me ways I could optimize this code, I’m all ears.
As I prepare to start adding peripherals to my Pi Bot, I wanted to be sure to get around the 700mA power budget the Pi has. After searching for a cheap battery-powered USB hub and finding little, I decided to hack up a few cheap(ish) parts and make my own.
This article is specific: how I personally would set up my Raspberry Pi to act as a robot base. But I’ll be clear, this is one of many possible setups. A chessboard has 64 squares, but those working the board allow for innumerable possibilities.
That aside, here we go:
1. Get BerryBoot. BerryBoot will allow you to download several Raspberry Pi images.
2. Extract the zip files to a blank SD card.
Put the BerryBoot SD card in your Pi and boot it up.
3. Set up your WiFi dongle. I believe BerryBoot will now set up your WiFi dongle on initial boot, which it did for me (it even gave me the option to download the image via WiFi). But I had trouble getting my WiFi dongle pulled up after booting Raspbian Wheezy.
If you have difficulty with manual WiFi dongle setup, you might try this video.
Lastly, if you are looking for a WiFi dongle that’s cheap, has good range, and draws very little current (the Pi can only feed about 700mA through its USB ports), you might try this one, $6.17.
4. Set up PuTTY on your desktop computer. Follow this video. This will allow you to begin SSHing into the Pi, so you don’t have to look at a little RCA screen like me. For those who aren’t familiar with SSH (like I was before this video), the video will explain it. At risk of oversimplification, it allows you to access your Raspberry Pi’s command line through your desktop.
You have to plug in your Pi’s network address. You can find this by pulling up your wireless hub’s configuration page; you should see what address your Pi is listed at. If, for some strange reason, it doesn’t list the device name, just view the page while the Pi is up, then unplug your Pi and refresh the page. The device that disappeared is your Pi. I’ve never had to change the port number, but beware you might need to depending on your setup.
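A trick that saves the unplug-and-refresh dance: Raspberry Pi Foundation boards of this era all share the MAC prefix b8:27:eb, so you can often spot the Pi straight from the output of “arp -a” on another machine. Here’s a sketch that parses that output (it assumes the Linux/Mac “arp -a” format; Windows prints its table differently):

```python
import re

# The Raspberry Pi Foundation's MAC prefix (OUI).
PI_OUI = "b8:27:eb"

def find_pi_addresses(arp_output):
    """Return IP addresses from 'arp -a' output whose MAC starts with the Pi OUI.

    Expects lines like:
        ? (192.168.1.5) at b8:27:eb:12:34:56 [ether] on eth0
    """
    hits = []
    for line in arp_output.splitlines():
        m = re.search(r"\((\d+\.\d+\.\d+\.\d+)\)\s+at\s+([0-9a-fA-F:]{17})", line)
        if m and m.group(2).lower().startswith(PI_OUI):
            hits.append(m.group(1))
    return hits
```

You’d feed it the captured output of the arp command, e.g. `find_pi_addresses(subprocess.check_output(["arp", "-a"], text=True))`.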
If you want to know whether you have the correct information, try logging in; if you get a screen like this, you’re good.
Your username and password are, by default: pi, raspberry
Remember! In the case of a Raspberry Pi, always share your password, ‘cause everyone has it anyway :)
Once you have PuTTY setup, you should be able to bring up your Pi command line, something like this:
5. Set up VNCServer on your Raspberry Pi. Follow this video. (Or this walkthrough.) PuTTY will let you access your Pi’s command line, but setting up a VNC will actually allow you to access your Pi’s desktop GUI from your PC, in the same manner as PuTTY.
6. Set up a VNC client on your desktop computer. There are many different programs; I happened to end up using RealVNC.
Once you have VNC setup on both machines, PuTTY into your Pi and start the VNC server.
Two notes here. If you did better with the video instructions than I did, your vncserver will start automatically on boot; unfortunately, I have to type it each time (I’m too lazy to figure out the boot part of it). Also, you’ll have problems running certain Python scripts through VNC if you don’t use $ sudo vncserver
You’ll enter your Pi’s address, but the port should be 1 (if I remember the video instructions correctly).
You should end up with a windowed version of your Raspberry Pi desktop. One more note: somewhere in the video it gets you to set up the “geometry” of the VNC desktop. The limitations you put there will be reflected in the quality of the desktop you see in the window. In essence, if you put in 640x480, that’s the resolution this desktop will end up at. So, please, take advantage of the Pi’s GPU :)
Use something like this, “-geometry 1024x728 -depth 24”
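About starting vncserver automatically: I haven’t solved the on-boot part myself yet, but one commonly suggested approach (assuming cron is running, which it is on stock Raspbian) is an @reboot crontab entry. Run crontab -e as the pi user and add:

```shell
# Starts the VNC server on display :1 at each boot, matching the geometry above.
@reboot vncserver :1 -geometry 1024x728 -depth 24
```

Treat this as a sketch; if your image uses a different VNC server package, the command name may differ.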
7. Resize your SD card to use all its space. (Note: this should already be done by BerryBoot, but other disk images will limit your SD card to 2GB, regardless of its actual size.)
8. Git will allow you to pull code from GitHub (again, this should already be installed, but just in case).
Install git:
At the Raspberry Pi prompt: $ sudo apt-get install git
The way to use it is like so,
At the Raspberry Pi prompt: $ sudo git clone https://github.com/adafruit/Adafruit-Raspberry-Pi-Python-Code.git
9. Install SMBus. This is specifically for my setup, since I’ll be using the I2C bus to communicate between the Pi and the Arduino.
At the Raspberry Pi prompt: $ sudo apt-get install python-smbus
10. Any other Python modules you might fancy.
Useful for keystroke, GUI, and other interfacing needs:
(I’ll add other resources as fellow LMRs leave them in the comments).
11. (optional) Install the Arduino IDE on the Raspberry Pi. This will allow you to program the Arduino directly from your Pi, and if you follow my design, you’ll be able to do so without ever leaving your desktop computer. Open the VNC server, open the Arduino IDE on the remote desktop, select the sketch you want to upload, and, as long as your Arduino is connected by way of USB, upload your sketch from where you sit. This allows for quick changes to Arduino code without switching wires around. Also, I think Kariloy is looking for a way to upload sketches by way of the GPIO pins. That would make for a cleaner design.
12. Install WinSCP. This will allow you to transfer files between your desktop and the Pi. I find this helps with programming management. I’m a messy filer. If I file at all.
Again, there are many commercial boards that will serve the same function. Also, you can do the same with a USB cable, serial pins to GPIO, or an RF connection; basically, any way that lets the Arduino and Pi talk at a reasonable speed. The speed constraint will of course depend on your need; I doubt many methods will be apt for running a responsive quadrocopter. But in my case, the Pi is the central nervous system and the Arduino is the autonomic nervous system: the Pi sends directives, but it’s up to the Arduino to manifest them through responsive actuators. And I chose this optoisolator because I didn’t want a voltage constraint on my actuators, or the fear of frying my Pi.
Once you have the board setup, you can run:
$sudo i2cdetect -y -a 1
This should bring up a list of active I2C registers. You should find your Arduino at whatever address you set in your Arduino code.
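If you’d rather script that check than read the table by eye, the output is easy enough to parse. A sketch (it assumes the usual i2cdetect table layout, where empty slots print as -- and driver-claimed addresses as UU):

```python
def parse_i2cdetect(output):
    """Pull the active device addresses out of i2cdetect's table.

    Each data row looks like "10: -- -- ... 1b -- ...": a row label,
    a colon, then one cell per address. Active devices show up as a
    two-digit hex address; "--" means empty and "UU" means a kernel
    driver already owns that address.
    """
    found = []
    for line in output.splitlines():
        if ":" not in line:
            continue  # skip the header row of column labels
        _, _, cells = line.partition(":")
        for cell in cells.split():
            if cell not in ("--", "UU"):
                found.append(int(cell, 16))
    return found
```

You could feed it the captured output of `i2cdetect -y -a 1` and compare the result against the address set in your Arduino sketch.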
Now, I’ve read this fellow’s article on how the Raspberry Pi’s I2C pins are actually 5V tolerant. (Note: this applies only to the I2C pins, because of their onboard pull-up resistors; on I2C, devices only ever pull the bus low, so a well-behaved 5V device never actually drives 5V onto the line.)
So, in theory, you can skip the optoisolator altogether. But that’s you; I’ll stick with my optoisolation.