I thought I could get away with using the onboard 6V from my cannibalized motion platform…aka the orange thing…to power both the DC motors and my IOIO. My Bluetooth connection was cutting out…a sign that I needed separate power sources for the board and for the motors.
My workaround didn’t pan out…time to rethink the build. Guess what I have on my desk:
Adafruit Motor Shield for the Arduino. Two good-looking H-bridges staring at me…that’ll do. I never thought I would find myself treating my Arduino gear as a scrapheap, but the day has come.
The H-bridge will allow me to cross over (think of a capital H) and provide bi-directional motion at the hardware level. 3.3V digital outputs…no more open drain needed (bonus).
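The crossing-over bit boils down to a truth table: which of the bridge's two inputs you drive high determines which way current flows through the motor. A quick pure-Python sketch of that logic (the pin states here are generic H-bridge convention, not the actual Adafruit Motor Shield pinout):

```python
# Illustrative H-bridge control logic (generic, hypothetical pin states --
# not the actual Adafruit Motor Shield API). Each direction maps to a
# pair of logic levels on the bridge's two inputs.

def hbridge_inputs(direction):
    """Return (in1, in2) logic levels for the given motor direction."""
    states = {
        "forward": (True, False),   # current flows one way through the motor
        "reverse": (False, True),   # current flows the opposite way
        "coast":   (False, False),  # both low: motor free-wheels
    }
    return states[direction]

print(hbridge_inputs("forward"))  # (True, False)
print(hbridge_inputs("reverse"))  # (False, True)
```

Driving both inputs high is the one combination to avoid on a bare bridge…that's a short through the transistors.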
Anyhow, I ended up putting together a little test board…socket, some male pins, and eventually some wires for a more secure connection. Sucking some serious soldering fumes…
It looks sloppy, but here is the hardware in its entirety:
Here is a quick video of my testing. I fired up my IOIOSeek program, which has two simple digital outputs triggered via buttons…
Early success? Yep. Except for the early part…this has been more work than I had assumed. More EE work…hoping the UI and hardware containment goes smoothly. Tune back in.
I hit another snag: my IOIO’s specs butting heads with the actual power yield. An early success:
…followed by being unable to replicate it under battery power, despite the higher voltage. I might be running into an issue similar to the one I saw when putting together the PowerSwitch Tail project. I’ll get there…I might have resistors coming out the ass, but I should be able to solve this with a creative open-drain setup as well. Tune in again…
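Before burning through that pile of resistors, it's worth doing the divider math on paper (or in a REPL). A quick sanity-check helper, with hypothetical values just to show the arithmetic:

```python
# Back-of-the-envelope resistor divider math (values are hypothetical --
# just a sanity check before soldering anything).

def divider_vout(vin, r1, r2):
    """Output of a two-resistor divider: vin across r1 + r2, tapped at r2."""
    return vin * r2 / (r1 + r2)

# e.g. knocking a 5V rail down toward 3.3V logic levels:
print(round(divider_vout(5.0, 1000, 2000), 2))  # 3.33
```

Keep in mind a plain divider only works for signals, not for powering a motor…the "divider" collapses as soon as the load draws real current.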
Last week found me standing tall upon my shell script soapbox, shouting command line praises to all who would listen.
Thou ought direct thine output aftways, to-wards thine USB port of thee. And that is well and righteous.
Well, that still is the case. My latest project has made it glaringly obvious that sometimes a little Python script will render a whole bunch of shell scripting moot. Namely, parsing HTML. Let’s see a picture…
Lunch hour project: parse the comments from swantron.com; feed said comments to an LCD screen.
I was horsing around with wget from a CLI a few days ago. I found myself trying to smash through the resultant file via pure regular expressions…which is incredibly clumsy. Well, as luck would have it, my go-to after my main go-to is Python, and this type of thing has been enough of an issue to warrant a library. BeautifulSoup. It parses the HTML into objects that can be smashed around as I see(med) fit.
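The gist of the BeautifulSoup side looks something like this. The markup below is a stand-in…the actual comment classes in swantron.com's HTML may differ:

```python
# Minimal BeautifulSoup sketch: pull comment text out of HTML.
# The <div class="comment"> structure here is a made-up stand-in for
# whatever the real page markup is.
from bs4 import BeautifulSoup

html = """
<div class="comment"><p>Nice robot!</p></div>
<div class="comment"><p>Where did you get the IOIO?</p></div>
"""

soup = BeautifulSoup(html, "html.parser")
comments = [div.get_text(strip=True)
            for div in soup.find_all("div", class_="comment")]
print(comments)  # ['Nice robot!', 'Where did you get the IOIO?']
```

Compare that to the regex equivalent…no contest.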
My setup was simple: py script to snag my comments and write serial, Arduino sketch to drive an LCD and read/write serial. And a source of shade. And a WiFi signal to snag.
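The PC half is mostly just chopping each comment down to something a 16x2 character LCD can show, then shoving it out the serial port. A sketch of that…the port name and the pyserial call are illustrative, not my exact script:

```python
# PC side, sketched: fit a comment onto a 16x2 LCD, then send it over
# serial. Port name and baud rate are placeholders.

def lcd_lines(text, width=16, rows=2):
    """Split text into at most `rows` lines of `width` chars for the LCD."""
    chunks = [text[i:i + width] for i in range(0, len(text), width)]
    return chunks[:rows]

def send_comment(comment, port_name="/dev/ttyUSB0"):
    """Send a comment to the Arduino one LCD line at a time."""
    import serial  # pyserial
    with serial.Serial(port_name, 9600, timeout=1) as port:
        for line in lcd_lines(comment):
            port.write((line + "\n").encode())

print(lcd_lines("Where did you get the IOIO?"))
```

The Arduino sketch on the other end just reads lines off serial and hands them to the LCD library.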
Drudge is reporting that the White House is preparing to release a photo of Osama dead. There is all sorts of speculation going on now…talks about releasing a video of his burial at sea, for instance.
Keep in mind that I avoid talking religion and politics on this site. There are all sorts of idiots blathering on about such things. This has nothing to do with motivations; it’s about a way to analyze the video, should it surface. Probably because it has been on my mind a lot lately, the first thing that comes to mind is the open source software I have been researching.
I have been messing around with an algorithm nicknamed “Predator.” Messing around, as in trying to get something developed on W7 / Matlab to work on Debian / Octave. Sort of a pain, but I digress. OpenTLD is the project…
How is this applicable? It works to track unknown objects in unbounded video streams. Here, it could be used to compare a reference image (known pic of Osama) to the example footage (military helmet cam) for identification purposes. This could also be used to dampen the movement of the video…take out the bounce.
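This is not OpenTLD itself (Predator learns and re-detects, which is the whole point), but the kernel of the matching idea is simple: slide a known reference patch over each frame and find where it fits best. A toy version with tiny grayscale "images" as nested lists, using minimum sum of squared differences:

```python
# Toy template matching: find where a known reference patch best fits
# in a frame (minimum sum of squared differences). This is only the
# core matching idea, not the OpenTLD algorithm.

def best_match(frame, patch):
    """Return (row, col) of the patch location with minimum SSD."""
    fh, fw = len(frame), len(frame[0])
    ph, pw = len(patch), len(patch[0])
    best_pos, best_ssd = None, float("inf")
    for r in range(fh - ph + 1):
        for c in range(fw - pw + 1):
            ssd = sum((frame[r + i][c + j] - patch[i][j]) ** 2
                      for i in range(ph) for j in range(pw))
            if ssd < best_ssd:
                best_pos, best_ssd = (r, c), ssd
    return best_pos

frame = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 9, 0],
         [0, 0, 0, 0]]
patch = [[9, 8],
         [7, 9]]
print(best_match(frame, patch))  # (1, 1)
```

Track that best-match position frame to frame and you have a crude stabilizer…subtract the drift and the bounce comes out.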
Spoiler1: This is awesome.
Spoiler2: I’ve never seen Minority Report.
I do know that there is some sort of hands free interface, and that is what I have put together.
Long story short, I have extended upon my PING))) project to include some sweet touchless home automation. I have the ultrasonic sensor interfacing with my garage door and a lamp, utilizing a servo and a PowerSwitch Tail, respectively.
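The trigger logic is about as simple as it sounds: convert the PING)))'s echo pulse width into a distance, and fire the servo / PowerSwitch Tail when a hand comes inside the trigger range. A sketch of that math in Python (threshold and pulse numbers are hypothetical):

```python
# The gist of the PING))) trigger logic: echo pulse width -> distance,
# then compare against a trigger threshold. Numbers are hypothetical.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def echo_to_cm(pulse_us):
    """Round-trip echo time (microseconds) -> one-way distance in cm."""
    return pulse_us * SPEED_OF_SOUND_CM_PER_US / 2

def hand_detected(pulse_us, threshold_cm=15):
    """True when something is closer than the trigger distance."""
    return echo_to_cm(pulse_us) < threshold_cm

print(round(echo_to_cm(580), 1))  # ~9.9 cm
print(hand_detected(580))         # True -> wave detected, fire the output
```

On the Arduino the same math runs in the sketch, with a debounce so a single wave doesn't toggle the garage door twice.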
Hit the bump for an awesome video of this thing in action, and for my snippet.