Showing posts with label Yagarto. Show all posts

Saturday, 27 October 2007

This week's roundup..

Another week of chipping away at stuff. My main attention has been on the CMUCam3 module. I've been able to flash various example firmwares to the CMUCam3 and it's all working lovely.

I now have two GNU ARM tool chains on the laptop: the Yagarto GNU ARM toolchain for MX21 (ARM9) work, and the CodeSourcery "G++ for ARM EABI" one for CMUCam3 (ARM7) work. As far as setup goes, the CodeSourcery toolchain takes priority (directory PATHs, etc.), with the Yagarto one being enabled via the custom makefile scripts used for MX21 work.

I've also placed an order with Technobots (UK). They were a later supplier find, stocking quite a few parts I had ordered from the US, including the Pololu parts. My original Pololu order had four round robot chassis bases, but I forgot the spacer rods on that order, so I was able to grab two sets of spacers from Technobots. The current intention is to put the four round bases together to form two platforms: drill through the front top one to make a place to securely mount the turret assembly, and use the top back one as somewhere to put the VC21 stack, hopefully with enough space on the bottom deck to place the motor controller board and power supplies. Then it's back to determining whether the tiny motors and gearboxes can handle that much weight. My current gut feeling is that they can't..

Technobots have also started supplying Dimension Engineering's accelerometer boards, so I've ordered from them the ±2g Buffered 2 Axis Accelerometer module DE-ACCM2G. I've been using a similar MEMS device in the Nintendo Wii development in my day job, so I'm familiar with the intricacies of analysing the noisy output from a MEMS chip (to check out the internals of a Wiimote, head over to this SparkFun tutorial). The 5g accelerometer module from Dimension Engineering uses the same Analog Devices chip used in the Wiimote. I'm hoping the smaller 2g device is just as sensitive, if not more so.
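On taming that MEMS noise: the simplest trick I keep coming back to is an exponential moving average over the raw samples. Here's a minimal integer-only sketch in C (the `ALPHA` value is just an illustrative pick, not tuned for the DE-ACCM2G):

```c
/* Exponential moving average to smooth noisy MEMS accelerometer samples.
   ALPHA is a fixed-point smoothing factor out of 256 (32/256 = 0.125),
   an illustrative value -- the right one depends on sensor noise and
   sample rate. Integer-only so it suits a small ARM7 target. */
#define ALPHA 32

static int ema_filter(int prev, int sample)
{
    /* prev + ALPHA * (sample - prev) / 256, all in integer math */
    return prev + (ALPHA * (sample - prev)) / 256;
}
```

Feed each new ADC reading through `ema_filter` with the previous filtered value; a larger `ALPHA` tracks faster but passes more noise.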
I can't wait to get this tiny module and hook it up to the ADC on the VC21RB1. The VC Robot COG has an LPC2136 (ARM7TDMI-S) core that contains two 8-channel 10-bit ADCs, perfect for connecting the X-Y outputs of the DE-ACCM2G accelerometer. With this hooked up, the robot will be able to work out its orientation vector with respect to gravity. I'm also thinking about using it in conjunction with the CMUCam3 analysis to determine if the motor controller has gone crazy and needs to be reset.

Another bit of software to add to this mix is RoboRealm. I picked up a very cheap USB web camera a while back. So cheap its default resolution is CIF (352x288), coincidentally the same as the OmniVision CMOS camera sensor module used on the CMUCam3. I should be able to prototype an image processing pipeline for the CMUCam3 using RoboRealm, before moving this across to CMUCam3 firmware. Initial results from experiments in RoboRealm look encouraging, but I know it's going to be a huge effort to move any image processing pipeline routines from RoboRealm across to the very limited CMUCam3.
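To give a flavour of what that port means: on the CMUCam3 side a pipeline stage boils down to tight per-pixel C loops. Here's a toy colour-threshold pass of the kind a tracking stage might start from; this is my own sketch, not the actual cc3 firmware API, and the "mostly red" thresholds are placeholder values:

```c
/* Toy per-pixel RGB threshold -- the sort of primitive a CMUCam3
   colour-tracking stage is built from. Thresholds are placeholders. */
typedef struct { unsigned char r, g, b; } pixel_t;

static int is_target(pixel_t p)
{
    return p.r > 200 && p.g < 80 && p.b < 80;  /* "mostly red" */
}

/* Scan one scanline and count matching pixels -- a stand-in for the
   centroid/bounding-box bookkeeping the real pipeline would do. */
static int count_matches(const pixel_t *row, int width)
{
    int n = 0;
    for (int i = 0; i < width; i++)
        if (is_target(row[i]))
            n++;
    return n;
}
```

The RoboRealm prototype can stay high-level and interactive; it's collapsing each filter stage into loops like this, within the CMUCam3's memory budget, that will be the slog.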

Finally, here's a cute photo of the CMUCam3 in its turret assembly -

Sunday, 14 October 2007

Source level debugging working

After the success earlier today getting GDB to work through OpenOCD via the Olimex ARM-USB-TINY JTAG connector, I thought I'd revisit Eclipse and check out the Insight debugger supplied with the Yagarto tool chain.

I closely followed Jim Lynch's Eclipse/Insight tutorial (PDF link) to get Eclipse setup with CDT, Insight, and OpenOCD. His tutorial is geared towards using an Olimex prototype board, but can be easily refactored to work with Yagarto and OpenOCD.

I set up the Virtual COGs 'touch' example within Eclipse, and also set up two external tools; namely OpenOCD (using the VC config script from the VC wiki, see previous blog post) and the Insight debugger (found in the Yagarto tool chain).

With the MX212+MM1 device connected via USB (as before) and the ARM-USB-TINY JTAG connected, it was a simple case of running the OpenOCD server from inside Eclipse, then calling Insight from inside Eclipse. Insight was able to run to 'main', and then source-step through the code :)

After some initial teething issues, everything now seems to be falling into place. Very encouraging. Irrlicht here I come... :)

Wednesday, 26 September 2007

Random babble

While waiting for the other orders to turn up (there have been a few more: a CMUcam3 module to go with the turret assembly, a big order of components/parts/tools, and a Pololu motor controller) I thought I'd jot down some plans and ideas.

The laptop is now charged with all sorts of dev software.
I did download Fedora 7, but had silly problems trying to get it installed on a new partition. So for the time being it's going to be WinXP and Cygwin on the laptop. Which is fine, it's what I'm used to with the Wii and other console/handheld development work over the last few years.

The Olimex JTAG and Pololu chassis are sitting lonely; hopefully the other parts will trickle in over the next week or two. Mind you, with the Egypt holiday next week, it may be a welcome break to step back and hone the direction a bit.

As for direction :p

The current plan is to get the VC21 modules working together, with precedence on getting the LCD touch screen working alongside the MX212, followed by a port of the Irrlicht engine to Linux on the MX212. So I'm first looking at how that can be done. I'll possibly need to look at a device driver for the LCD handling, and definitely a custom Irrlicht renderer.

Next up will be setting up the Pololu motor controller. Initially I'll breadboard the device to the chassis, and feed it through to the laptop to get it tested. Naturally with the appropriate HW to make sure I don't fry it with the RS232 levels. Need more thought on this, being a SW guy and all. Although that's the main reason I got this cheapo laptop, just in case I do muck up :) I won't!!

Then it's the CMUcam3 with turret assembly. I've not checked this out further, but it should be easier to test over RS232; the cam module has better handling for that than the Pololu part.

And while all that is going on, do further investigation of single and dual camera vision research.

Oh, and a trip to a hardware/DIY outlet to see what materials they have to make a bigger base. I could possibly buy something like a tank-track based chassis, but I want to check whether something can be constructed around the two Pololu chassis first. But then that's getting a bit too far ahead of myself. I first need to determine how sensitive and accurate their movement is before constructing a larger platform (think 2x CMUcam3 plus turrets and VC21 stack) for this robot :)

I'll have a good think sitting by the pool next week. But currently in my mind is a vision for this little robot moving around untethered, with the CMUcam3 (eventually x2) having its live feed pulled back into the MX212, and into the Irrlicht engine for display. Appropriate processing on the feed (with possible inclusion of ultrasonic scanning help) building up a 3D representation of the world, feeding navigation/AI. All output on the LCD :)

Then it's on to Neural Nets and Genetic Programming...