First Robot

Originally posted on www.letsmakerobots.com

First Bot Files

I will update with the “4.x” build, but I wanted to get a video up as a demonstration of the concept: SSH –> RPi –> I2C optoisolator –> Arduino Pro Mini –> self-designed motor shield –> tracks :)

This is my first robot. Of course, he is very modular. I’m alright with that–I’m a homeless outreach worker, and this entire project was simply meant to take my mind off the bad stuff for a bit.

I do love this little guy though, his general hodge-podge appearance reminds me of Tank Girl’s monster–good flick.

The Goals:

1. Stay mentally healthy through the Zen of Robotics.

2. Help my two-year-old son overcome his innate fear of robots.

3. Design a platform on which I could, eventually, use my understanding of psychology and statistics to create nifty behaviors.

The Build v1.x:

In this iteration I have an Inertia Labs base kit (http://www.robotmarketplace.com/products/IL-ANTKIT2.html). I wouldn’t buy the kit again–the tread hubs are too blinkin’ flimsy and the treads fall apart pretty quick (although the grip on those treads is amazing; it felt like the bugger could climb straight up a wall). The little motors, though, I’m in love with. When I build my next bot, I’ll design my own base, but I’m sure I’ll use these motors (http://www.robotmarketplace.com/products/IL-GMS100.html). Awesome power in such a little package.

The batteries are left over from an old Canon camera. I have them wired in parallel, feeding 8.4v at 2900mAh.

Next, I have an Arduino Uno.

Then, I have an Arduino Motor Shield. I was originally determined to build my own motor shield. I learned about iron transfers, home PCB etching, Eagle, and different sorts of H-bridge ICs, all of which I’ll write about later, but I came to the conclusion I was being delusional about my current level of ability. Therefore, I bought a motor shield and proceeded with the build.

http://cthomasbrittain.wordpress.com/2013/01/27/design-a-motor-driver-for-lsn-bot/

Next, I have a little 5v linear regulator. (I found out the hard way that if voltage is provided to the Arduino through VIN, a short circuit is apparently created when you draw current from the 5v pin, send it through a sensor, and feed it back into an analog pin. Pfft. And magic smoke from Arduino #1. It was pretty blue.)

On top of that, I have a block of wood, cut with a hacksaw in our garage. Don’t worry, the garage is still standing: um, so far. I put a MaxSonar sensor on it (https://www.sparkfun.com/products/639). I’ve enjoyed this robust sensor; we even survived the short-circuit VIN scenario together.

I also have a vibration sensor wired with a 1 meg resistor (https://www.sparkfun.com/products/9196) feeding into an analog pin.

And, of course, a feedback LED.

1.x Video

The Build v2.x:

I felt I had finally assembled the “kit.” But I wanted to own it. I wanted to get rid of the electrical tape holding the whole thing down. So I made this little mess. I took some plexi and cut a hole in the middle, then put a slit down some poly-flex tubing and hot-glued it around the edge as a bumper. I used a medium servo, threaded through the hole, for the MaxSonar. I found some flexible wire with enough give to keep it from breaking down. Last, hot glue and semi-firm putty held the other bits in place. I felt like I “owned” it a little more now–and the whole thing worked pretty great, except for how I chose to mount the servo: the MaxSonar sensor stuck out over the edge and would bump into things first.

Another thing I noticed: the platform I had created worked nicely as a handle for Silas, my son. This led him to play with the bot a little more–Silas even “saved” him a few times.

2.x Video (Be warned, baby in diaper.)

The Build 3.x

I wanted to own more of my build, so I proceeded to cut plates using Home Depot grade Plexiglass. Long ago I had bought a pair of Xbees to use in a project, and I decided that until I had the kinks worked out in my platform, I would make a simple Xbee controller using Processing 2.0. I removed all the electrical tape, pulled out LSN’s guts, and cut several Plexiglass plates to act as component holders.

One plate covers the base; to it I glued two 3-AA battery holders. I put these in parallel, each holding 3 LiFePO4 600 mAh batteries, and ended up with 9.6v at 1200 mAh. I had found the batteries at Wal-Mart on sale and bought them with the intention of learning how to make my own battery packs. Unfortunately, it has been more of a challenge than I expected: there is no easy charger setup, so I found some chips to charge them in series, thinking I could etch a board real easy like and make my own rechargeable pack. Well, I didn’t check what chip I was getting and ended up with a DFN that was approaching singularity in size (http://ww1.microchip.com/downloads/en/AppNotes/01276a.pdf). I haven’t given up on a charger board–but I knew it was going to take longer than I wanted (I’ve included my current board version; a little more work and I’ll send it off to OSHPark).

Next, I cut a plexiglass template for the Arduino and Motor Shield. Lastly, I cut another plexiglass template with cutouts where the pins of an Arduino Wireless Shield could seat down into the Motor Shield. I secured all this with some 3-inch aluminum standoffs and some plastic screw knobs (obtained from Alliedelec.com), screwing the standoffs into the enclosure mounting holes.

As I mentioned wanting to do earlier, I removed the servo and MaxSonar; in their place, I put a breadboard for testing sensors.

To power the sensors, I replaced the 5v linear voltage regulator with a neat little DC-DC regulator (LM2596). The little trim pot lets you set the output anywhere from 1.5v to 35v at up to 2A (input is 4.5v-40v). And it made me happy at $1.56 on eBay. I also notched out the plexi and added a little two-state switch for turning the bot on and off.

Of course, the sensor I actually worked into the code of this build was simply an HC-SR04 stuck precariously into the breadboard. (I didn’t want to pull the MaxSonar off the rather stable block of wood I had it screwed to.)




The Build 4.x


The Code v1.x:

The basic code tells the little guy to run around shooting his sonar everywhere. If he detects something closer than 12 inches, he stops, turns to the right, takes a reading, turns to the left, takes another, then compares the three readings–right, left, and center–and starts going in whichever direction is clearest. He does this ad infinitum–or ad batterius-lowus. If the little guy misreads, or the sonar’s beam wasn’t wide enough, he’ll run into something and activate the knock sensor. If this happens, he backs up, does a 180, and starts again.
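The decision logic above is simple enough to sketch. Here is a hedged Python mock-up of it (the actual bot runs an Arduino sketch, so every name and the reading convention here are my illustration, not the original code):

```python
# Hedged Python mock-up of the v1.x avoidance logic; illustrative only.

OBSTACLE_INCHES = 12  # the "closer than 12 inches" threshold from the text

def pick_direction(center, right, left):
    """Return the heading with the farthest (clearest) sonar reading."""
    readings = {"center": center, "right": right, "left": left}
    return max(readings, key=readings.get)

def step(center_reading, read_right, read_left):
    """One decision: keep going, or sweep right/left and pick the clearest."""
    if center_reading >= OBSTACLE_INCHES:
        return "forward"
    return pick_direction(center_reading, read_right(), read_left())
```

The knock-sensor recovery (back up, 180, resume) would simply wrap this loop.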

I did have the foresight to design the code around function calls (e.g., “Forward()”) so that I could eventually give him a big brain using some sort of RF connection (later, I think it will be a PC and Python).

The Code 2.x

This code is pretty close to the original; I added the servo functions and, I think, a few other minor tweaks (most of which I tried to annotate throughout my code).

The Code 3.x

As I mentioned in the build, I had bought a pair of Xbees to incorporate into a motion-driven bellydance skirt for my wife. Fortunately, I found a better way to build it, which left me with two Xbee Series 1s. After a little meditating, I realized I was finally reaching the point where I could begin to write code on some sort of processing unit and start doing nifty things with the little guy. So, I chose to start in Processing, since it was close to the code I was familiar with. After a bit, I had written a rather rough draft of a simple RC controller using a PC, Processing, and the Xbees (Zombie_Processing_3_6_RC.zip).

But that wasn’t good enough for me. I had bought a Raspberry Pi that I desperately wanted to add to the bot. And I had done the math: I figured it was almost as much to set up an RPi with a WiFi dongle as it was to buy a WiFly.

WiFly setup:

  1. WiFly Xbee from SparkFun: $42.95
  2. Arduino WiFi Shield: $6.98
  Total for WiFly: $49.93

Basic Raspberry Pi and WiFi dongle setup:

  1. WiFi Dongle: $6.17
  2. Raspberry Pi: $35.00
  3. SD Card (4g): $2.50
  Total for Basic RPi: $43.67

My delusional mind, obviously ahead of my ability, began to run through how this RPi setup would look. I began to see my little bot as having two separate components to its existence: the RPi would be the central nervous system and the Arduino the peripheral. The Arduino would be tasked with executing movement and perception functions, and the RPi with computation and analysis. So, with my inability to see my inability, towards the end of writing the Processing controller I realized it would be a good place to begin testing the peripheral nervous system (the Arduino code) of my little bot. Moreover, when I did finally reach the point where I was skilled enough to write intelligent code, that code would simply replace the commands sent from Processing.

Of course, a problem hit me upside the head: “By default, we’ll be supporting Python as the educational language.” F’ing RPi makers. Oh well–I’d been needing to learn another programming language. After several days of delving into a scripting language (it was friggin’ like trying to speak Koiné of the Late Empire) I finally pieced together working(?) Python code. I wanted the Python code to act as the equivalent of my Processing code, which, to my surprise, it did. So, I was able to leave my Arduino sketch the very same, i.e., my Python script and Processing code will both work with the same version of the Arduino sketch.
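The keystroke-to-command bridge that both controllers implement can be sketched like this (the key map and encode() helper are invented for illustration, not the original controller code; the real send goes out over the Xbee serial link):

```python
# Illustrative keystroke-to-command bridge; names are my invention.

KEY_COMMANDS = {"w": "forward", "s": "backward", "a": "left", "d": "right"}

def encode(key):
    """Turn a keystroke into the single ASCII byte sent to the Arduino."""
    if key not in KEY_COMMANDS:
        return None  # ignore keys the bot doesn't understand
    return key.encode("ascii")

# With pySerial, the send would look roughly like (port/baud are assumptions):
#   import serial
#   xbee = serial.Serial("/dev/ttyUSB0", 9600)
#   xbee.write(encode("w"))
```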

I did notice the potential of Python, though. I was able to incorporate threading and Tkinter’s GUI.

I would like to indulge in a caveat: these codes are in no way complete–I’ve simply put my mind towards some mechanical projects while I wait for the components needed to finish.

The Code 4.x

The code for this version might seem somewhat complicated, but once you’re under the hood (or bonnet, for my European friends), it’s actually much simpler than my Xbee build.

The information flow can be summed up like so:

SSH –> Raspberry Pi –> I2C Optoisolator –> Arduino Pro Mini –> M3 Motor Shield

In essence, it’s a Raspberry Pi WiFi Dongle.

I designed the flow this way so that I could program higher-level functions (e.g., sensor data computation) on the Raspberry Pi, thereby allowing me to make it autonomous by replacing my keystrokes with its own choices (I hope they’re good choices). Of course, this would still allow me to SSH into my Python code on RPi to tweak as I needed.

Therefore, the eventual information flow will be like so:

Raspberry Pi –> I2C Optoisolator –> Arduino Pro Mini –> M3 Motor Shield

I’ll do my best to outline my code and the resources needed to get it working.

First off, setting up the RPi.

Instead of including it in this build log, I made a separate walkthrough:

Blueberry Pi

The code is actually pretty simple right now. The Python comes up with something to do (right now generated from keystrokes) and tells the Arduino to do it. It does this by capturing keystrokes, converting them to bytes, and sending them over the I2C bus, where the Arduino captures them and translates them back from bytes into something human-readable. Each keystroke is associated with a routine. If the Python captures the “W” key code, it turns it into a byte and sends it over I2C; Ardy grabs it, turns it back into a “W,” then runs the Forward() function. Likewise, the Arduino can send information back through the bus to the Pi, which the Pi converts from bytes into characters, then compiles the characters into a string until it hits a predefined delimiter. Simple.
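The byte round-trip just described can be sketched apart from the hardware. Here the “|” delimiter, the helper names, and the 0x04 slave address are assumptions for illustration; on the Pi the actual writes go through the smbus module:

```python
# Sketch of the keystroke/byte round-trip described above; hardware omitted.

DELIM = "|"  # a predefined delimiter the string is compiled until

def to_byte(ch):
    return ord(ch)   # keystroke -> byte, as written onto the I2C bus

def from_byte(b):
    return chr(b)    # byte -> character, as read back off the bus

def read_until_delim(byte_stream):
    """Compile characters into a string until the delimiter arrives."""
    out = []
    for b in byte_stream:
        ch = from_byte(b)
        if ch == DELIM:
            break
        out.append(ch)
    return "".join(out)

# On the Pi, the actual write would be roughly:
#   import smbus
#   bus = smbus.SMBus(1)               # I2C bus 1 on rev-2 boards
#   bus.write_byte(0x04, to_byte("w")) # 0x04: assumed Arduino address
```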

The keystrokes will eventually be replaced with Raspberry Pi calculations–therefore, it’s the central nervous system. The Arduino will be responsible for carrying out functions; in this metaphor, it’s the autonomic nervous system. The idea is, even if the Pi goes down, the Ardy will hold on to some basic functionality. This allows the synergy of higher functions (e.g., speech synthesis) and system stability (the Arduino’s “survival code”). Of course, it’s not done yet. :)

Discussion v1.x:

So, I’m actually much further along with this guy now. But I felt I should go back and verbally process my build; keeping track of it for my sake, and perhaps for any other hack like me unfortunate enough to read this.

Therefore, I’m going to list the things I’ve learned (those bored are welcome to leave quietly).

  • The LMR collective is invaluable (although, they can be a little Ayn Randish with inepts: “You didn’t ‘Google’ that question before posting! Stick-piggie-piggie! Stick!” Especially that Cristal the Carpenter fellow).
  • SparkFun is EXPENSIVE.
  • Ebay is cheap.
  • Magic smoke smells good.
  • Designing a circuit is easy; designing a good circuit is esoteric. Although, I still believe I did the right thing in trying to design my own motor circuit. I learned so much from digging around in datasheets and online tutorials, although the frustration was high, the knowledge I gained was worth it.
  • Etching your own boards is pretty easy. Trying to etch a board with small traces is a mother-of-some-puppies.
  • When making homemade PCBs, get a good soldering iron. Cheap irons don’t seem to get hot enough. What ended up happening to me: when the tip got a little dirty, I’d go to solder a pin, and as I pulled my iron away the trace would come off the board. Funny enough, this is most likely to happen on the very last pin you solder. Not sure why.
  • Although I now feel comfortable building my own bot from raw components, I’m not ashamed I built this iteration from modular components. It has taught me a lot.
  • Electronics fills your mind. The zen of robots has been excellent for my mental health.
  • Being delusional is in a maker’s best interest. Dream it, start building, and when the details hit you in the face, hit back with a lot of creativity. And remember, you always have Grandma Google to explain anything you might need to know.
  • RC cars are for jocks; robots are for the awesome.

Discussion v2.x

Looking back, I’m not sure I would have gone with the servo again. I didn’t analyze the current it drew, but I’m pretty sure it wasn’t that much less than firing the motors and having the entire bot shift. The treads gripped well, so I don’t feel I would have sacrificed any precision; I would simply have the bot do a zero-point turn to the right or left instead of using the servo. Not sure why, but my gut was happier with fewer moving parts–before the servo the build had a more solid feel to it. On the plus side, I did learn to use servos through the process. I also learned the hard way the difference between an angular servo and a full-rotation servo.

Lessons learned:

  • Dremel tools are a necessity for robot building. Especially cutting bits.
  • A table vice is nice. So is a real workbench (even if it came out of a dumpster).
  • A full-rotation servo is not precise (I know angular servos are not either, but I’m not looking for reductio ad absurdum).
  • Children think servos make creepy noises.
  • Plexiglass is great for prototyping. Combined with a dremel, cutting bits, and a speed-square, rapid prototyping has never been so ghettotastic.

Discussion 3.x:

I’m very excited to be approaching the point where I can put some real brains into the little guy. Looking back, I feel a twinge of regret at not beginning the Xbee controller code in Python rather than writing it in Processing and then re-writing it in Python. Regardless, it was good practice. Also, I now know the kinks I will need to work out for a stable version of this Xbee controller. Right now, the Processing code has little error control–when receiving data, it simply draws it from the Arduino and displays it. It doesn’t worry about dropped bits or unusable data. The Python code has sophisticated error checking on par with an amoeba’s. Right now, it simply checks whether the readline has more than 8 or 9 digits to it. If it does, it prints the data; if not, it skips and waits for a more complete readline. These could be improved drastically. But it simply isn’t my intention to do so. Eventually, I plan to place the Raspberry Pi on top of the Arduino, linked with a USB cable, so I’m really counting on no data being lost to the aether. I wrote these codes simply to see if I could. Like a caveman grunting his prowess to a robot.
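That amoeba-grade check amounts to a few lines. A sketch, with the threshold taken from the text and the function names my own:

```python
# Keep a readline only if it looks complete; drop short/garbled ones.

MIN_CHARS = 8  # "more than 8 or 9 digits" per the text; 8 assumed here

def is_usable(line):
    """True if the stripped readline is long enough to print."""
    return len(line.strip()) >= MIN_CHARS

def filter_readings(lines):
    return [ln for ln in lines if is_usable(ln)]
```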

Discussion 4.x

I feel I’ve created probably one of the most complex RC cars on the internet. And I’m tired.

Lessons learned:

  • Raspberry Pis have pull-up resistors on their I2C lines.
  • “Typical application” schematics in datasheets are amazing.
  • Don’t rely on copying others work. Create your own, it is much more satisfying.
  • Delusions are good.
  • I’m not smart, I’m obsessive.
  • I2C is simpler than serial?
  • Those quick2wire guys are full-of-shinobi.
  • Designing OSHPark boards has the added benefit of encouraging you to be efficient about PCB design, since you are charged by the square inch. It pays to learn to design compact boards, with SMD parts.
  • It’s not Python that annoys me, rather, it is the lack of ways to manage Python version compatibility.

COGs Breakdown 1.x:

  • Inertia’s base (4 motors, treads, hubs, and CNC’ed aluminum): $109.00
  • Arduino Uno: $34
  • Arduino Motor Shield: $28
  • Knock Sensor: $2.95
  • MaxSonar: $25.95
  • Batteries: Already got’em. Est. $12.95
  • Medium Servo $10.95
  • Total: $223.80

COGs Breakdown 2.x:

  • Inertia’s base (4 motors, treads, hubs, and CNC’ed aluminum): $109.00
  • Arduino Uno: $34
  • Arduino Motor Shield: $28
  • Knock Sensor: $2.95
  • MaxSonar: $25.95
  • Batteries: Already got’em. Est. $12.95
  • Medium Servo $10.95
  • Wireless Shield: $22.95
  • Xbee X2: $45.90
  • Xbee Explorer: $10.95
  • Plexiglas: $12.98
  • Total: $316.58

Ignorance is Bliss

In June of 2015 I became a Homeless Management Information System (HMIS) Administrator. Going into the job, I had no idea what was to be done. I’d been working as a homeless street-outreach specialist for MHMR of Tarrant County for several years before. The reason I landed the job, I think, is that I was tech savvy, something rare in the social service world–but more on that later.

I’d become tech savvy because working as a street outreach specialist, one sees a lot of bad. A lot. It will leave scars in your psyche if you are not vigilant about guarding against those bad scenes replaying in your head. I found that if I filled my head with something complex, there was no room for the dissonance created by being helpless to aid the 17-year-old heroin addict and future mother under the unfinished bridge off of I-30. So, I took up robotics. It worked well.

When I started as an HMIS System Administrator I was clueless. Looking back, I wish there had been someone around to teach me. Most of the skills needed are esoteric, and few online resources exist.

Well, as I step away from the desk, I’m going to do my best to write down everything I learned–of course, it’ll be laden with opinion–but hopefully, it’ll provide the missing manual I sought.

Brand new HMIS System Administrator, this is for you, as you start your new job. The best job in the world.

Get In the Weeds

Hey, by now you’ve been to a few meetings and you know a few things about the data. A word of warning: don’t let others tell you, “Hey, that’s too in the weeds for this conversation.”

Bull.

Often, your boss or a peer will say it with good intent. They want to make sure your content is digestible, and often the critique is fair. But don’t let it become the only rule you live by. The weeds are necessary for several reasons. And if metaphors aren’t your thing, we are talking about the details of the data.

First, if you are never allowed to talk about the mechanics of the job, then you will not have the vocabulary, analogies, and metaphors ready to talk about critical system issues when it is necessary. Why would it be necessary? Let’s say you needed to vet a critical HMIS decision with the governing board.

Second, others will not be primed for a conversation about something they have never heard of until it matters. In short, there are more reasons to get into the weeds of how an HMIS works than there are to avoid them.

Regarding building your ability to communicate complex HMIS activities concisely, I’d encourage you to cold-call some HMIS system administrators in your state. Introduce yourself and ask if they would be willing to chat when you need to discuss HMIS stuff no one else will listen to. I’ve found this absolutely critical to explaining HMIS problems to a COC that doesn’t care or understand.

Also, this job is great. But you’ll have no friends (no one told you?). No one will understand you. And no one will want to talk about how HUD has changed the chronically homeless definition yet again–except other HMIS system admins. When I first made contact with another HMIS admin, after being in the job a year, it was like I’d discovered a neighboring isle next to the one I was stranded on–and that isle had other humans who understood my language!

Data Quality is Key

Three months into my job I realized we had to do something about data quality. We had two cities and a county complaining that the reports coming from HMIS were not reliable. One month they’d produce an “accurate” account of who was in a program; the next month they would be completely off. (Of course, the municipalities knew they were off because the agencies they funded were keeping a separate set of books–more on that later.)

We had to do something. To be honest–oh yah, and always be honest–we didn’t have a clue what our data quality was like. There was no data quality detection system in place to determine whether it was good or bad. Luckily, our software vendor had an HMIS data error report which would list out the HUD data errors of participants active in any program. Without a better solution, I pulled this report for every program and aggregated the data errors.
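The aggregation step was essentially this (agency names, program names, and counts below are invented for illustration; the vendor report’s real columns differ):

```python
# Roll per-program HUD data-error counts up to a continuum-wide total
# and a per-agency tally.

from collections import Counter

def aggregate_errors(rows):
    """rows: (agency, program, error_count) tuples, one per pulled report."""
    by_agency = Counter()
    for agency, program, errors in rows:
        by_agency[agency] += errors
    return sum(by_agency.values()), by_agency

rows = [("Agency A", "Shelter", 1200),
        ("Agency A", "RRH", 300),
        ("Agency B", "PSH", 4500)]
total, by_agency = aggregate_errors(rows)
```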

Well, crap. They had a right to complain.

Above is a graph of 2016’s data errors; in 2015 the count started at approximately 100,000 errors. Without the wisdom I have now, I used the raw numbers to present to our COC Governing Board to show we were addressing the municipalities’ concerns.

But this is only half the story. What if only one agency was causing all these data errors?

To address this problem, we used a tree graph.

Tree graphs are great for showing how certain agencies cause a disproportionate amount of data errors. We presented the graph much as shown here, without agency names, at first. But eventually the Board asked that we provide names along with the graph.
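Underneath a tree graph is nothing more than each agency’s share of the total errors. A sketch of that calculation (the names and numbers are invented):

```python
# Compute each agency's fraction of all data errors, largest share first;
# these fractions are what the treemap's rectangle areas represent.

def error_shares(by_agency):
    """Return {agency: fraction of all data errors}, sorted descending."""
    total = sum(by_agency.values())
    shares = {a: n / total for a, n in by_agency.items()}
    return dict(sorted(shares.items(), key=lambda kv: -kv[1]))

shares = error_shares({"Agency A": 1500, "Agency B": 4500})
```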

These graphs provided the political insurance needed to approach the partner agencies on behalf of the Board, which is much safer than enduring the resentment engendered otherwise.

When approaching a troubled agency, it helps to have a plan. In TX-601 we called these “Data Quality Action Plans”; they consisted of a list of all the errors needing repair and a SMART goal.

Focusing on the agency with the most data errors is like being on a ship with many holes, possibly sinking: before bailing water or patching small holes, you patch the biggest, as it’ll have the greatest impact on the entire ship.

Eminence vs. Data Based Decision Making

Always Be Honest

Understand How You’re Funded

HUD funds you–they are your boss, kinda’, but your other boss is whoever provides the match for those funds.

Power of the Purse

ESG Funders COC Funder

DTR

Define the relationship. This may have already been done for you, but if it hasn’t, please don’t underestimate how powerful an agency’s perception of your responsibilities is.

For example, within three months of starting I received a call from shelter intake staff. He was upset because he wasn’t able to scan in clients. He “didn’t have time to troubleshoot” and suggested I drive down, which I did. Shortly after I arrived, I realized the problem: his computer was shorting–the sparks were clear. I let him know he would need to contact his IT department to get it addressed, to which he stated, “I thought you were the IT department?” After explaining I wasn’t, he still insisted I fix it, since it kept him from recording HMIS data and “that was my job.”

Take time, and do it early, to explain what your role is. For me, that meant listing bullets of what my responsibilities were and were not:

HMIS System Administrator Responsibilities

  • Data Quality
  • Timely enabling / disabling user permissions
  • Assessing software defects
  • Providing technical assistance
  • Conducting trainings on how to use the software
  • Facilitating oversight and guidance committees
  • Communicating with end-users when system issues impact their work
  • Technical assistance on producing CAPER
  • Technical assistance on producing APR
  • Technical assistance on running standard reports
  • Federal system reports:
    • System Performance Measures
    • Annual Homelessness Assessment Report
    • Point-in-Time (PIT)
    • Housing Inventory Count (HIC)
    • HUD Data Quality Report (HDQ)
    • Coordinated Entry Reports
  • Supporting ESG Participating Jurisdictions
  • Supporting COC Lead

These responsibilities I would continually message. It is important they are understood on both sides. End-users need to know you are there to support their efforts, but it is also important they understand how you can and may help.

Alongside this list of how we could help, I had a list of how we could not help (at least, not with guaranteed assistance).

*NOT* HMIS System Administrator Responsibilities

  • Fixing equipment
  • Generating CAPER or APR on an agency’s behalf
  • Fixing data errors created by an end-user(s)
  • Generating standard reports for an agency
  • Meeting (all) customization requests
  • Generating data or reports for domestic violence providers (as they are prohibited from participating in the HMIS)
  • Fulfilling custom report requests in an unrealistic time frame (we would advertise a five business day notice).
  • Providing routine trainings for less than four end-users. Or, short notice ad hoc trainings.
  • Adjusting system reports to bolster performance (these requests are insidious)

Find Your Tools

R, SQL, Tableau

Automate Everything

Discourage Separate Sets of Books

Data Quality Goes Down

Try to be HMIS Software Independent

My wife is fond of saying, “I love you, but I don’t need you.” It took me a while to get over being butt-hurt by this statement. But at its root is a profound nuance of a good relationship. In good relationships, you aren’t needed–but you are liked.

This is how it should be with software. It’s hard. But where possible, don’t depend on your HMIS software solution to do your job. Instead, use the software because your continuum-of-care likes using it.

This feat is harder than it appears. It means you need to be able to create your own System Performance Report, CAPER, or APR. That sounds hard, and it is, but not impossible. And I’m not suggesting you need to write all these reports, but you need to be able to write these reports.

For example, say you’re attempting to pull the Annual Homelessness Assessment Report and, several days before you submit, you notice the shelter bed counts are extremely off. It will be beneficial to be able to write a report which can double-check the software vendor’s numbers–and, if needed, provide the correct ones. In this way, you are not dependent on the software company to provide a fix before submission.
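As a sketch of what “write your own report” can mean here, counting shelter bed-nights straight from enrollment records might look like this (the (entry, exit) record layout is an assumption for illustration, not the actual HMIS export format):

```python
# Independent double-check: total shelter bed-nights within a report window.

from datetime import date

def bed_nights(enrollments, start, end):
    """Sum nights occupied within [start, end); exit of None = still enrolled."""
    total = 0
    for entry, exit_date in enrollments:
        first = max(entry, start)
        last = min(exit_date or end, end)
        if last > first:
            total += (last - first).days
    return total

stays = [(date(2016, 1, 1), date(2016, 1, 11)),  # a ten-night stay
         (date(2016, 1, 5), None)]               # an open enrollment
count = bed_nights(stays, date(2016, 1, 1), date(2016, 1, 8))
```

If this number disagrees badly with the vendor report, you have both a correction and useful debugging detail to hand back.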

There are countless benefits to being in this kind of relationship. If you have the skill to dig into the HMIS data sets to find the problem, then you probably have some valuable debugging information to provide back to the software vendor. This information will allow them to roll out a fix even faster. It’s just good all the way around.

Also, and I’d argue most importantly, you aren’t trapped in an abusive relationship with your software, fearing your current software vendor will never extract the data from your HMIS in such a way it could be migrated to another vendor. Of course not! If necessary, you have the skill to build your own data warehouse and migrate the data, possibly without any degradation.

A relationship where you need the other person is not one built on love.

Departmental Checks and Balances

Don’t scrub data, ever, not even once – don’t do it… seriously

If you give a mouse a cookie

Advocate for everyone to Create Reports

Taking care of the data = cooking. Pulling reports for you = your cook chewing your food.

Get a Help Desk

If you are not already, you will become overcome with requests. Everyone will be emailing you their wants like you’re Santa Claus in November. This is dangerous. Your end-users only see their one request, and they don’t understand why it is taking you more than an hour to fulfill it.

Please tell me you have staff? HUD recommends having one FTE for every 75 end-users. For us, this meant we should have had 3.73 FTEs, but we operated with 3. However, your staff will do little good if everyone is sending their requests directly to you.
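HUD’s guideline is simple arithmetic, with the 3.73 FTE figure above implying roughly 280 end-users (my back-calculation, not a number from the text):

```python
# One FTE per 75 end-users, per the HUD recommendation cited above.

USERS_PER_FTE = 75

def recommended_ftes(end_users):
    return round(end_users / USERS_PER_FTE, 2)

needed = recommended_ftes(280)
```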

Sure, you’ll try forwarding the email to Joe, but then Joe gets sick and the request sits in his inbox for several days.

Get a Help Desk.

For us, we had next to no budget for a help desk, so I spun up a server and used Drupal’s ticket module to create one. This allowed us to implement a help desk for about $12.50. Not too shabby.

Getting the end-users to use the help desk was painful. Convincing them it was in their best interest was not easy. But eventually, messaging our ability to collaborate on ticket responses won through, and they started using the ticket system.

There were many other advantages to using a help desk. As a manager, I was able to review my staff’s responses to requests. If I were a better manager (or had more time; hard to tell which it was), I could have used this information to coach my staff on being customer-service focused, making sure we provided friendly and relevant responses to all requests.

Another advantage is having a log of all requests made to our department in a searchable fashion. This allowed us to rebut claims we were being unresponsive with ticket links containing timestamps. I’d like to tell you being prepared to defend your department is unnecessary, but sadly, that’s not true.

Get Used to Bus Treads

Software is organized crime, and those who support it, wild villains. At least, this is the perception of every one of your users. Any time something goes wrong, the software and support are to blame. This will often put you in a tough position with funders.

For example, when an agency fails to submit their CAPER in a timely manner and the funder is attempting to hold the agency accountable, one of the natural shirking strategies is to blame the software and support.

Unfortunately, you usually are not privy to this conversation between the funder and the funded. You have no opportunity to cry “bull.” It’s not until the funder contacts you, stating the agency’s CAPER is late because of problems with software and support, that you hear of it. At this point, no amount of empty excuses will defend you and the team. You’ve failed. I know this to be true, because I’ve failed many times.

I’ve found there are two key strategies to avoid this problem.

First, be proactive. When you know a CAPER, RHY, SSVF, PATH, or APR is due for an agency, send the program manager one simple email.

“Good afternoon Ms. Program Manager,

I know your Report is due pretty soon, I just wanted to check in and see how our team might assist you in being successful.

Sincerely,

–HMIS System Administrator”

Keep it short, you’ll be writing a lot of them.

This lets the program manager know you’re there to assist–and that you are aware the report is due.

Do not CC the funder. That’s silly. I’ve honestly found the aggressive nature of CC’ing everyone and their dog a huge detractor from good working relationships. There’s a better way–show them.

However, do archive your sent emails so you can easily access offers to assist.

Secondly, record everything. Find a way to record every interaction with your partners. If you’re following the advice about a Help Desk, then great. But make sure every email thread which is a request for help gets moved into the Help Desk.

Also, if you are meeting with partners about issues, make sure to recommend they write the request down using the Help Desk, so your staff can start working on it right away. (Or if you’re like me and your auditory memory is non-existent, you’ll actually get it done.)

Between these two things, when a funder comes asking, “Do you know why Shelter of Hope hasn’t been able to generate the CAPER?” you may quickly say, “I’m not sure. I sent an offer to assist on September 1st, and I’ve looked through our Help Desk; there are no requests for help from Shelter of Hope.” This usually helps show that software or support were not the determining reason the report wasn’t submitted in time.

One word of warning. It took me two years to figure out not to blame my end-users for this behavior. They are dealing with cramped work spaces, angry individuals, emergency responses, suicidal clients, bed-bugs, oh! And pay on par with a McDonald’s employee! Don’t be upset with the end-users–they are good souls, just not data souls.

Need Always; Wants, When Possible

##################################################
# Create Occupancy Trends for Emergency Shelters #
# Rapid Rehousing, and Permanent Housing programs#
# by ProjectName and OrganizationName.           #
##################################################
trendsOfOccupancyByProjectAndOrganization <- function(allDataPath, 
                                                      outputFolder, 
                                                      interval = "week", 
                                                      startDate = "2014-01-01"){
  
  library(plyr)
  library(sqldf)  # sqldf is used for the SQL-style joins and aggregations below
  
  client <- loadClient(allDataPath)
  
  primaryPersonalIDs <- getPrimaryPersonalID(client)
  primaryPersonalIDs <- sqldf("
                            SELECT
                              PrimaryPersonalID,
                              PersonalID 
                            FROM
                              primaryPersonalIDs")
  
  client <- sqldf("
                  SELECT
                    a.PrimaryPersonalID,
                    b.* 
                  FROM
                    primaryPersonalIDs a 
                  LEFT JOIN
                    client b 
                      ON a.PersonalID=b.PersonalID")
  
  client <- within(client, rm(PersonalID))
  colnames(client)[1] <- "PersonalID"
  client <- unique(client)
  
  enrollment <- loadEnrollment(allDataPath)
  enrollment <- sqldf("SELECT
                        a.PrimaryPersonalID,
                        b.* 
                      FROM
                        primaryPersonalIDs a 
                      LEFT JOIN
                        enrollment b 
                          ON a.PersonalID=b.PersonalID")
  
  enrollment <- within(enrollment, rm(PersonalID))
  colnames(enrollment)[1] <- "PersonalID"
  enrollment <- unique(enrollment)
  
  exit <- loadExit(allDataPath)
  exit <- sqldf("SELECT 
                  a.PrimaryPersonalID, 
                  b.* 
                FROM 
                  primaryPersonalIDs a 
                LEFT JOIN 
                  exit b 
                    ON a.PersonalID=b.PersonalID")
  
  exit <- within(exit, rm(PersonalID))
  colnames(exit)[1] <- "PersonalID"
  exit <- unique(exit)
  
  project <- loadProject(allDataPath)
  inventory <- loadInventory(allDataPath)

  organization <- loadOrganization(allDataPath)
  
  # Add all bed inventories into one (HH without children, HH with children, and HH of children only)
  inventory <- sqldf("
                      SELECT
                        ProjectID,
                        SUM(BedInventory) As 'BedInventory' 
                      FROM
                        inventory 
                      GROUP BY
                        ProjectID
                     ")
  
  allData <- sqldf("
                    SELECT  
                      DISTINCT a.PersonalID, 
                      a.EnrollmentID, 
                      c.ProjectName, 
                      a.EntryDate, 
                      b.ExitDate, 
                      c.ProjectType, 
                      d.BedInventory, 
                      e.OrganizationName
                    FROM 
                      enrollment a
                    LEFT JOIN 
                        exit b
                          ON a.EnrollmentID=b.EnrollmentID
                    LEFT JOIN 
                        project c
                          ON a.ProjectID=c.ProjectID
                    LEFT JOIN 
                        inventory d
                          ON c.ProjectID=d.ProjectID
                    LEFT JOIN 
                        organization e
                          ON c.OrganizationID=e.OrganizationID
                    WHERE RelationshipToHoH != 'NA'")
  
  remove(client, enrollment, exit, project, primaryPersonalIDs)
  
  # Get the minimum entry date (the maximum is queried below)
  bfr <- sqldf("
                SELECT 
                  MIN(EntryDate) As MinimumDate
                FROM 
                  allData
               ")
  
  min_date <- ""
  if(startDate == ""){
    min_date <- as.character(bfr[1,1])  
  } else {
    min_date <- startDate
  }
  bfr <- sqldf("
                SELECT 
                  MAX(EntryDate) As MaximumDate 
               FROM 
                  allData
               ")

  max_date <- as.character(bfr[1,1])
  
  # Days per interval; a quarter is approximated as 91 days
  intervalConstant <- switch(interval,
                             week = 7,
                             month = 30,
                             quarter = 91)
  
  numberOfIntervals <- switch(interval,
                              week = as.integer(getWeeksBetween(min_date, max_date)),
                              month = as.integer(getMonthsBetween(min_date, max_date)),
                              quarter = as.integer(getQuartersBetween(min_date, max_date)))
  
  allData$EntryDate <- as.Date(allData$EntryDate)
  allData$ExitDate <- as.Date(allData$ExitDate)

  projectTypeList <- unique(allData$ProjectType[!is.na(allData$ProjectType)])

  # Calculate occupancy by ProjectName
  for(projectType in projectTypeList) {
    
    projectTypeName <- as.character((makeProjectTypeReadableVector(list(projectType))))
    
    thisProjectTypeData <- allData[allData$ProjectType == projectType,]
    
    # Initialize dataframe with all ProjectNames
    projectEnrollmentsTrend <- unique(data.frame(thisProjectTypeData$ProjectName))
    colnames(projectEnrollmentsTrend)[1] <- "ProjectName"
    
    # Attach project bed data 
    thisProjectBedData <- unique(data.frame(allData$ProjectName, allData$BedInventory))
    colnames(thisProjectBedData)[1] <- "ProjectName"
    colnames(thisProjectBedData)[2] <- "BedInventory"
    
    for(i in 0:numberOfIntervals) {
      intervalStartDate <- as.Date(min_date) + i * intervalConstant
      intervalEndDate <- as.Date(min_date) + (i + 1) * intervalConstant
      
      activeEnrollment <- subset(thisProjectTypeData, 
                                 EntryDate <= as.Date(intervalStartDate) &
                                   (ExitDate >= as.Date(intervalEndDate) |
                                      is.na(ExitDate)))
      
      projectCount <- count(activeEnrollment, "ProjectName")  # plyr's count() takes column names as strings
      colnames(projectCount)[2] <- as.character(intervalStartDate)
    
      thisCountWithBeds <- merge(x = projectCount, y = thisProjectBedData, by = "ProjectName", all.x = TRUE)
      thisCountWithBeds$OccupancyPercentage <- round(thisCountWithBeds[,2] / thisCountWithBeds[,3], digits = 4)
      projectCount <- data.frame(thisCountWithBeds$ProjectName, thisCountWithBeds$OccupancyPercentage)
      colnames(projectCount)[1] <- "ProjectName"
      colnames(projectCount)[2] <- "Occupancy"
      
      averageDf <- sqldf("
                         SELECT 
                            'Average' As 'ProjectName',
                            AVG(Occupancy) As 'Occupancy'
                         FROM 
                            projectCount
                         ")
      
      projectCount <- rbind(projectCount, averageDf)
      colnames(projectCount)[2] <- as.character(intervalStartDate)

      projectEnrollmentsTrend <- merge(x = projectEnrollmentsTrend, y = projectCount, by = "ProjectName", all.x = TRUE)
    }

    tmpColMeans <- numcolwise(mean, na.rm = TRUE)(projectEnrollmentsTrend)
    tmpColMeans$ProjectName <- "Average"
    projectEnrollmentsTrend <- rbind(projectEnrollmentsTrend, tmpColMeans)

    projectEnrollmentsTrend <- t(projectEnrollmentsTrend)
    colnames(projectEnrollmentsTrend) <- projectEnrollmentsTrend[1,]
    projectEnrollmentsTrend <- projectEnrollmentsTrend[-1,]
    
    write.csv(projectEnrollmentsTrend, 
              paste(outputFolder, 
                    "/ProjectsEnrollmentsTrend_ProjectType_", 
                    projectTypeName, 
                    ".csv", 
                    sep=""), 
              na = "", row.names = TRUE) 
  }


  # Bed occupancy by OrganizationName 
  for(projectType in projectTypeList) {
    
    projectTypeName <- as.character((makeProjectTypeReadableVector(list(projectType))))
    
    thisOrganizationData <- allData[allData$ProjectType == projectType,]
    organizationEnrollmentsTrend <- unique(data.frame(thisOrganizationData$OrganizationName))
    colnames(organizationEnrollmentsTrend)[1] <- "OrganizationName"
    
    thisOrganizationBedData <- unique(data.frame(allData$OrganizationName, allData$BedInventory))
    colnames(thisOrganizationBedData)[1] <- "OrganizationName"
    colnames(thisOrganizationBedData)[2] <- "BedInventory"
    
    for(i in 0:numberOfIntervals) {
      intervalStartDate <- as.Date(min_date) + i * intervalConstant
      intervalEndDate <- as.Date(min_date) + (i + 1) * intervalConstant
      
      activeEnrollment <- subset(thisOrganizationData, 
                                 EntryDate <= as.Date(intervalStartDate) &
                                   (ExitDate >= as.Date(intervalEndDate) |
                                      is.na(ExitDate)))
      
      projectCount <- count(activeEnrollment, "OrganizationName")  # plyr's count() takes column names as strings
      colnames(projectCount)[2] <- as.character(intervalStartDate)
    
      thisCountWithBeds <- merge(x = projectCount, y = thisOrganizationBedData, by = "OrganizationName", all.x = TRUE)
      thisCountWithBeds$OccupancyPercentage <- round(thisCountWithBeds[,2] / thisCountWithBeds[,3], digits = 4)
      projectCount <- data.frame(thisCountWithBeds$OrganizationName, thisCountWithBeds$OccupancyPercentage)
      colnames(projectCount)[1] <- "OrganizationName"
      colnames(projectCount)[2] <- "Occupancy"

      projectCount <- sqldf("SELECT 
                              OrganizationName, 
                              AVG(Occupancy) As 'Occupancy'
                            FROM 
                              projectCount 
                            GROUP BY 
                              OrganizationName")
      
      colnames(projectCount)[2] <- as.character(intervalStartDate)

      organizationEnrollmentsTrend <- merge(x = organizationEnrollmentsTrend, y = projectCount, by = "OrganizationName", all.x = TRUE)
    }

    tmpColMeans <- numcolwise(mean, na.rm = TRUE)(organizationEnrollmentsTrend)
    tmpColMeans$OrganizationName <- "Average"
    organizationEnrollmentsTrend <- rbind(organizationEnrollmentsTrend, tmpColMeans)

    organizationEnrollmentsTrend <- t(organizationEnrollmentsTrend)
    colnames(organizationEnrollmentsTrend) <- organizationEnrollmentsTrend[1,]
    organizationEnrollmentsTrend <- organizationEnrollmentsTrend[-1,]
    
    write.csv(organizationEnrollmentsTrend, paste(outputFolder, "/OrganizationEnrollmentsTrend_ProjectType_", projectTypeName, ".csv", sep=""), na = "", row.names = TRUE) 
  }
}

trendsOfOccupancyByProjectAndOrganization(allDataPath, outputPublicPath)
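The function above leans on several helpers that aren't defined in this post (loadClient, getPrimaryPersonalID, getWeeksBetween, and friends). As a rough sketch of what the interval helpers might look like (these are my assumptions, not the originals), assuming dates arrive as "YYYY-MM-DD" strings:

```r
# Hypothetical sketches of the interval helpers called above; the originals
# are not shown in this post. Dates are assumed to be "YYYY-MM-DD" strings.
getWeeksBetween <- function(from, to) {
  as.numeric(difftime(as.Date(to), as.Date(from), units = "weeks"))
}

getMonthsBetween <- function(from, to) {
  # Approximates a month as 30 days, matching the intervalConstant used above
  as.numeric(difftime(as.Date(to), as.Date(from), units = "days")) / 30
}

getQuartersBetween <- function(from, to) {
  # Approximates a quarter as 91 days
  as.numeric(difftime(as.Date(to), as.Date(from), units = "days")) / 91
}
```

Each returns a (possibly fractional) count of intervals, which the main function then truncates with as.integer.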


God created the world perfect, we fucked it up

Why does bad shit happen in the world?

Take someone who makes the choice to try heroin. It doesn’t take more than a few times to make a habit. Then it becomes an addiction. Now, a person has less free will. But more than that, the world is a worse place because of the choice. There are drug dealers who supply it and cops who are disgruntled at dealing with it.

Even if the world was made perfect, we’ve fucked it up–because we have free will to do so.

The Hidden God

Why does God remain hidden? It is necessary for free will and reason to coexist. If a reasonable human saw God in full, there would no longer be free will. The only choices which make sense are those which align with his omnipotent will.

Doubting Thomas

This letter isn’t for you. If you’re an atheist, it’ll piss you off. If you are a Christian, it’ll piss you off. If you are a spiritualist, it’ll piss you off. If you are agnostic, well, I’m not sure–it depends how open-minded you are.

Can you doubt yourself?

Nor must we overlook the probability of the constant inculcation in a belief in God on the minds of children producing so strong and perhaps an inherited effect on their brains not yet fully developed, that it would be as difficult for them to throw off their belief in God, as for a monkey to throw off its instinctive fear and hatred of a snake.

Doubt is a skill

  • Delusional battles are a good source of practice.
  • A schizophrenic man being able to doubt his reality
  1. Vehement reaction to an idea is often an indicator of non-reason
  2. Dissonance bias and consonance bias
  3. Understanding others’ vehemence
    • Believe you will go to hell
    • Believe you will confine the world with abusive morality (aka, witch trials)
  4. Philosophy is a house of cards
  5. Staked facts have little use

C.S. Lewis on Time

http://www2.esm.vt.edu/~sdross/text/beyondtime.html

If you picture Time as a straight line along which we have to travel, then you must picture God as the whole page on which the line is drawn. We come to the parts of the line one by one: we have to leave A behind before we get to B, and cannot reach C until we leave B behind. God, from above or outside or all around, contains the whole line, and sees it all.

Sigmund Brouwer – Evolution

Darwin on God

With respect to the theological view of the question; this is always painful to me.— I am bewildered.– I had no intention to write atheistically. But I own that I cannot see, as plainly as others do, & as I [should] wish to do, evidence of design & beneficence on all sides of us. There seems to me too much misery in the world. I cannot persuade myself that a beneficent & omnipotent God would have designedly created the Ichneumonidæ with the express intention of their feeding within the living bodies of caterpillars, or that a cat should play with mice. Not believing this, I see no necessity in the belief that the eye was expressly designed. On the other hand I cannot anyhow be contented to view this wonderful universe & especially the nature of man, & to conclude that everything is the result of brute force. I am inclined to look at everything as resulting from designed laws, with the details, whether good or bad, left to the working out of what we may call chance. Not that this notion at all satisfies me. I feel most deeply that the whole subject is too profound for the human intellect. A dog might as well speculate on the mind of Newton.— Let each man hope & believe what he can.

Higgs-Boson

  1. Infinite root-cause analysis
  2. Incomplete physics theory without Higgs-Boson
  3. God is non-falsifiable
  4. Insert God into a system and test it

Robber Board v3

2017-12-24 – v3

This is an update on the Robber Board I’ve been slowly working on. It’s a small little bells-and-whistles board which is meant to be a test platform for my Lumi wireless AVR uploader.

I’ve almost finished testing the Robber board v3. A few changes:

ISP Key

I’ve added a special ISP header to the board. It works with the Tiny AVR-ISP pogo-pin programming adapter.

It’s a bit of a pain to solder, but it’s pretty darn sweet once it’s in place. Of course, the header is backwards. I’m going to be humble and blame myself for not checking the pinout: I ended up vertically switching the pins, which caused a few hours of frustration.

Besides that, the rest of the board works.

Footprint