The laptop is now charged with all sorts of dev software. Off the top of my head:
- GNUARM toolchain (Virtual COGs variant),
- Yagarto (including OpenOCD, Eclipse & CDT),
- Cygwin,
- Visual Studio 2008 Express Edition,
- CMUcam3 SVN update,
- All the VC21 examples I could find (including the latest PC1 one),
- Irrlicht engine source code (compiles to around 37MB debug build currently with ARM toolchain),
- A variety of Linux eBooks (taken from links on Virtual COGs Wiki, thanks guys),
- Start of the datasheet library,
- OpenCV,
- MS PSDK,
- JRE,
- Silabs driver,
- Tortoise SVN & CVS.
The Olimex JTAG and Pololu chassis are sitting there lonely; hopefully the other parts will trickle in over the next week or two. Mind you, with the Egypt holiday next week, it may well be a welcome break to step back and hone the direction a bit.
As for direction :p
Current plan is to get the VC21 modules working together, with precedence on getting the LCD touch screen working alongside the MX212, along with a port of the Irrlicht engine to Linux on the MX212. So first up is looking at how that can be done. Possibly a device driver will be needed for the LCD handling, and definitely a custom Irrlicht renderer.
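A very rough first sketch of the "get pixels onto the panel" problem, assuming the LCD ends up exposed as a standard Linux framebuffer (/dev/fb0, 32-bit pixels) and using Irrlicht's stock software driver rather than a proper custom renderer. The device node, resolution and pixel format are all assumptions until I've actually poked at the MX212, and the exact API calls vary a little between Irrlicht versions:

```cpp
// Sketch: render a frame with Irrlicht's software driver and copy it to the
// LCD framebuffer. Assumes the LCD shows up as /dev/fb0 in 32-bit ARGB mode
// and matches the render size -- both assumptions, not verified on the MX212.
#include <irrlicht.h>
#include <fcntl.h>
#include <unistd.h>

using namespace irr;

int main()
{
    // Software renderer -- everything drawn on the CPU, which is the likely
    // situation on the MX212 anyway.
    IrrlichtDevice* device = createDevice(video::EDT_SOFTWARE,
                                          core::dimension2d<u32>(320, 240));
    if (!device) return 1;
    video::IVideoDriver* driver = device->getVideoDriver();

    int fb = open("/dev/fb0", O_WRONLY);   // assumed LCD device node
    if (fb < 0) return 1;

    while (device->run())
    {
        driver->beginScene(true, true, video::SColor(255, 0, 0, 0));
        // ... draw scene nodes here ...
        driver->endScene();

        // Pull the rendered frame back and push it to the panel.
        video::IImage* frame = driver->createScreenShot();
        if (frame)
        {
            void* pixels = frame->lock();
            lseek(fb, 0, SEEK_SET);
            write(fb, pixels, 320 * 240 * 4);
            frame->unlock();
            frame->drop();
        }
    }
    close(fb);
    device->drop();
    return 0;
}
```

A real custom renderer would implement IVideoDriver and skip the screenshot copy, but something like the above should be enough to prove the path from the engine to the screen.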
Next up will be setting up the Pololu motor controller. Initially I'll breadboard the device to the chassis and feed it through to the laptop to get it tested, naturally with the appropriate HW to make sure I don't fry it with the RS232 levels. Need more thought on this, being a SW guy and all. Although that's the main reason for getting this cheapo laptop, just in case I do muck up :) I won't!!
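Once the level shifting is sorted, talking to the controller from the laptop should just be a handful of bytes on a serial port. A minimal sketch below, assuming a USB-serial adapter on /dev/ttyUSB0 and the older 4-byte Pololu serial protocol (0x80 start byte, device type, motor+direction, speed). The exact byte values and baud rate need checking against the controller's documentation before anything gets wired up:

```cpp
// Sketch: set motor 0 forward at roughly half speed via a Pololu serial motor
// controller. Port name, baud and protocol bytes are assumptions to verify
// against the actual datasheet.
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdio>

int main()
{
    const char* port = "/dev/ttyUSB0";          // assumed USB-serial adapter
    int fd = open(port, O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B9600);                    // assumed controller baud
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);

    // Old-style Pololu packet: start byte, device type, motor+direction, speed.
    unsigned char go[4]   = { 0x80, 0x00, 0x01, 0x40 };   // motor 0 forward, ~half speed
    write(fd, go, sizeof(go));

    sleep(2);                                    // let it run for a couple of seconds

    unsigned char stop[4] = { 0x80, 0x00, 0x01, 0x00 };   // speed 0 = stop
    write(fd, stop, sizeof(stop));

    close(fd);
    return 0;
}
```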
Then it's the CMUcam3 with the turret assembly. I haven't checked this out further, but it should be easier to test over RS232; the cam module has better handling for that than the Pololu part.
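Assuming it's running the CMUcam2-emulation firmware, a quick sanity check over the same sort of serial setup as the Pololu sketch would be sending "GV" and reading back the version string (the default baud rate is something I still need to confirm):

```cpp
// Sketch: quick CMUcam sanity check -- send "GV" (get version) and print
// whatever comes back. Assumes fd is a serial port already opened and
// configured raw at the camera's baud rate, as in the Pololu sketch above.
#include <unistd.h>
#include <cstdio>

void cmucamGetVersion(int fd)
{
    write(fd, "GV\r", 3);                        // CMUcam2-emulation command

    char buf[64] = {0};
    ssize_t n = read(fd, buf, sizeof(buf) - 1);  // naive blocking read, fine for a test
    if (n > 0)
        printf("camera says: %s\n", buf);
}
```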
And while all that is going on, I'll do some further investigation into single- and dual-camera vision research.
Oh, and a trip to a hardware/DIY outlet to see what materials they have for making a bigger base. It's possible to buy something like a tank-track based chassis, but I want to check whether something can be constructed around the two Pololu chassis first. But then that's getting a bit too far ahead of myself. I need to determine how sensitive and accurate their movement is before constructing a larger platform (think 2x CMUcam3 plus turrets and the VC21 stack) for this robot :)
I'll have a good think sitting by the pool next week. But currently in my mind is a vision of this little robot moving around untethered, with the CMUcam3 (eventually x2) having its live feed pulled back into the MX212 and into the Irrlicht engine for display. Appropriate processing on the feed (with possible help from ultrasonic scanning) would build up a 3D representation of the world, feeding navigation/AI. All output on the LCD :)
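One slice of that pipeline can be prototyped before all the hardware arrives: pushing a raw camera frame into an Irrlicht texture so it can be drawn on the LCD (or later mapped onto geometry in the 3D world view). A sketch, with the 176x144 RGB frame size and the helper names being placeholders of mine rather than anything defined by the CMUcam3 side:

```cpp
// Sketch: copy a raw camera frame into an Irrlicht texture each frame so it
// can be drawn as a 2D image or used on a scene node. Frame size and format
// are placeholders until the real CMUcam3 feed is hooked up.
#include <irrlicht.h>
using namespace irr;

video::ITexture* createCamTexture(video::IVideoDriver* driver)
{
    return driver->addTexture(core::dimension2d<u32>(176, 144), "camframe",
                              video::ECF_A8R8G8B8);
}

void updateCamTexture(video::ITexture* tex, const unsigned char* rgbFrame)
{
    unsigned char* dst = static_cast<unsigned char*>(tex->lock());
    if (!dst) return;
    const u32 pitch = tex->getPitch();           // bytes per row, may exceed width*4
    for (int y = 0; y < 144; ++y)
    {
        u32* row = reinterpret_cast<u32*>(dst + y * pitch);
        for (int x = 0; x < 176; ++x)
        {
            const unsigned char* p = rgbFrame + (y * 176 + x) * 3;
            row[x] = 0xFF000000u | (p[0] << 16) | (p[1] << 8) | p[2];  // pack as ARGB
        }
    }
    tex->unlock();
}

// Per frame, after updateCamTexture():
//   driver->draw2DImage(camTex, core::position2d<s32>(0, 0));
```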
Then it's on to Neural Nets and Genetic Programming...