How exciting!
Today’s vision meeting totally made my day. Now that the batteries are fully charged, we can proceed: the camera works, it works beautifully, and it works simply. We got it to work with the software that FIRST gave us. We got it to work with the software developed at CMU. We got it to work with a Python script David and I wrote last week. Kick ass! We got it to move the servos, capture images, track different colors across the image, and move the servos to center the camera on a swath of color (there’s a rough sketch of that loop below). The next step, I feel, is to get the robot’s processor to interface with the camera and do all of this (we’re currently doing it from a desktop computer). The cool thing is that the hard part of this has already been solved by this awesome dude online; we just need to look through his code, figure out how he handles the TTL serial interface, and we should be good to go.
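(For the curious, here’s a rough sketch of the kind of loop I mean, written against pyserial. The command strings like “TC” and “SV” and the “M” packet layout are how I remember the CMUcam1 manual, not gospel, and the port name, servo index, and color thresholds are just placeholders, so check the docs before copying any of it.)

```python
# Rough sketch of driving a CMUcam-style tracker from Python with pyserial.
# The command strings ("TC", "SV") and the "M" packet layout are assumptions
# based on my reading of the CMUcam1 docs -- verify against the manual.
import serial

PORT = "/dev/ttyUSB0"    # hypothetical port name; adjust for your machine
BAUD = 115200            # CMUcam1 talks 115200 8N1 by default, as I recall
PAN_SERVO = 0            # hypothetical servo index on the camera board
CENTER_X = 40            # CMUcam1 frames are ~80 px wide, so 40 is roughly center


def send(cam, command):
    """Send one command, return the camera's reply (packets end in '\r')."""
    cam.write((command + "\r").encode("ascii"))
    return cam.read_until(b"\r").decode("ascii", errors="replace").strip()


def center_on_blob(cam, pan=128):
    """Read one 'M' tracking packet and nudge the pan servo toward the blob."""
    packet = cam.read_until(b"\r").decode("ascii", errors="replace").strip()
    if not packet.startswith("M"):
        return pan                       # ignore anything that isn't a tracking packet
    mx = int(packet.split()[1])          # blob centroid x, first field after 'M'
    error = mx - CENTER_X                # positive means the blob is right of center
    pan = max(0, min(255, pan - error))  # crude proportional step; sign depends on mounting
    send(cam, "SV %d %d" % (PAN_SERVO, pan))
    return pan


if __name__ == "__main__":
    cam = serial.Serial(PORT, BAUD, timeout=1.0)
    # Ask the camera to track a reddish range: Rmin Rmax Gmin Gmax Bmin Bmax
    send(cam, "TC 150 240 0 80 0 80")
    pan = 128                            # start the pan servo near mid-range
    while True:
        pan = center_on_blob(cam, pan)
```

The nice part is that the protocol is just plain text over a serial line, so once the robot’s processor is talking TTL-level serial to the camera, the same commands should work without the desktop in the middle.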
Apparently my little sister is playing the clarinet trio we played in high school at a music camp thing, and she didn’t even pick it; the teacher just said, “Here, you three all play clarinet, so here’s a piece by David Evan Thomas to play.” hahahaha.
Woah! That’s pretty cool.