carlfoxmarten: (chair)
With very few preparations, it was actually a pretty good event.

The software I was supposed to have ready for the event wasn't even complete enough to download code to the robot, or even control the robot remotely.

Despite all that, people were actually rather receptive and even fairly interested in learning about the robots we had on display.
(my concern was that since it was part of the Computing Science display, we were "supposed" to be showing programming-related things, not unprogrammed robots running default code)

The range of interests displayed by the visitors was rather intriguing.
They ranged from people who hadn't heard of the iRobot Create or the Roomba (the robotic vacuum cleaner the Create is based on), to those who had Roombas of their own, to those who were already knowledgeable about robotics.

At the end of the day, we even had one visitor who started a discussion among us about the current gaming consoles.

Despite how well-attended the event was otherwise, our room was fairly quiet for much of the time.
That allowed me to make some further progress on the project, but ultimately it still couldn't talk to the robot via a serial cable.
(I'm still not quite sure what the problem is)

Oh well, now that the event is over, I can slow down and fill in some of the parts I skipped when I was rushing the deadline...

In related news, I'll be very tired the next couple of days due to how early I was up.
carlfoxmarten: (chair)
Current status on the CreateSimulator project:
  • Editor: Fairly complete. (I just need to attach a bunch of menu items and buttons and add a bunch of icons)
  • Creating projects: Close. (I just need to add the widgets to handle it and check to see if that project name already exists)
  • Project loading: Complete.
  • Compiling projects: Not started.
  • Downloading projects to the Command Module: Not started.
  • Controlling robots from the computer: Not started.
  • Simulating a robot: Nowhere near started.

My target for the upcoming open house event is to be able to have the same code controlling robots in three different ways:
  1. Directly controlling an iRobot Create through the included Command Module.
  2. Using a laptop to control a robot through a serial cable.
  3. Simulating a virtual robot on a computer.
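For the second path, the Create speaks the iRobot Open Interface over its serial link, so driving it from a laptop mostly comes down to sending the right opcode bytes. Here's a minimal sketch of building a Drive command, with the opcodes taken from the published OI spec (actually sending the bytes, e.g. with pyserial at the Create's default 57600 baud, is left out):

```python
import struct

# iRobot Open Interface opcodes, per the published OI specification:
START = 128  # enter passive mode (must be sent first)
FULL = 132   # take full control of the robot
DRIVE = 137  # drive: velocity (mm/s) and turn radius (mm), big-endian signed 16-bit

# The OI spec uses the special radius value 0x8000 to mean "drive straight".
STRAIGHT = -32768

def drive_command(velocity_mm_s, radius_mm):
    """Build the 5-byte Drive packet: opcode, then velocity and radius."""
    return struct.pack(">Bhh", DRIVE, velocity_mm_s, radius_mm)

# Example: drive straight ahead at 200 mm/s.
packet = drive_command(200, STRAIGHT)
```

On the wire you'd send START, then FULL, then drive packets like the one above.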

The last part is going to be the hardest. At least I don't have to write my own compiler; I'm going to use variations on the GCC compiler.

Next week, I'm hoping to have two compilation paths completed (one for running on the local computer and the other for running on the Command Module's ARM processor) and the simulator at least started.
That should give me a week to finish the simulator, which ought to be enough time.

Let's hope I haven't overdone things again...
carlfoxmarten: (Default)
It's exactly three weeks until the campus open house event, and the software I'm writing isn't anywhere close to being done enough to demo.

At a bare minimum, it needs to be able to load projects, edit source files, compile the project (in two different ways, one for local execution and the other for embedded execution), download the embedded code to the robot, proxy the local code through the serial port, and run virtual simulations.

So far, it can list projects for you to choose from, but can't even load one yet.
I'm going to have to devote a whole lot more time to this... =>.<=
carlfoxmarten: (Default)
Ever since I first started as a university student, I've been helping out around campus on various volunteer opportunities, usually Open House and Orientation.

We have about one open house event a year (on rare occasions we have two), and of the three or four times I've helped out, nobody's managed to snap a picture with me in it.

It's kind of odd, really. They usually have two or three photographers (one or two volunteers from the volunteer office and one from the PR office), and the pictures get posted online somewhere, with a link sent out to the participants.

The last couple of years I've also helped out the computing science department, so there's an additional camera taking pictures.
(and another one that always seems to miss seeing me)

This last open house, the departmental camera almost got a picture of me (I'm behind someone, but you can see who I'm talking with), while one of the other cameras did manage to get me.

First time in what, four years?
(and it was still only ONE picture!)
carlfoxmarten: (Default)
So I missed the first assignment (and due to policy, no late submissions accepted), found out the second assignment isn't due until March 16th, completed most of the first midterm, didn't finish writing my code for the robots for the open house on Thursday, and completed all I could of the second midterm, so all in all, not a bad week.

I'm relieved that I have an extra two weeks to work on the paper, as I'd only started reading the book the weekend before.
(the book's kind of interesting, even if the course isn't)
In case you're wondering, I'm reading Cell Phones: Invisible Hazard in the Wireless Age by George Carlo and Martin Schram. It's about the first proper research into cell phone safety.
(and that's all I want to say about that right now. Anything more can wait until after I've written my paper...)

The midterms went as well as I could hope, with a missed question on the first one (due to time) and an answer that I kind of hacked together (due to not studying that particular section).

The open house I have mixed feelings about.
I didn't manage to finish the software I was trying to write for it in time, so all I did was run the robots around and explain what they were and what our intentions were for having them.
(a computing science department with robots? Isn't that supposed to be the engineering department's job?)

We (or me, at least) are hoping to use the iRobot Creates that we have to teach programming to first-year students.
Some of the advantages of teaching with these robots are:
  1. Better feedback than learning programming on a computer alone. (it's physical versus virtual, and making things move is more rewarding than making things print...)
  2. It's prebuilt. You don't have to care about how sturdily you made it, thus removing the physical engineering aspect. (that's for the Engineering department, after all)
  3. It's fairly inexpensive. (the educational model with all the accessories is ~$300 USD, while the vacuum models range from $100 to $600 USD. The educational model's advantages are the Command Module, a small plug-in brain, and extra inputs and outputs; the vacuum models can be cheaper, but don't have any extra inputs)
  4. (I forget the fourth reason...)

(there are disadvantages, by the way, I'm just too tired to list them right now)

Anyway, what needs to happen first is a conversation with both the Computing Science department and the Education department. Then I need to finish my undergraduate studies (only a little bit left, fortunately), and then we can have some serious discussions about why we want to use the robots and how.

Only after that can we actually start creating curriculum for it.
(that does not prevent writing the software, fortunately)

My week...

Feb. 28th, 2010 05:36 pm
carlfoxmarten: (Default)
This week is kind of busy for me, especially as I had the last two weeks off for the Olympics.

  1. On Monday, I have a networking homework assignment due.
  2. On Tuesday, I have a computerized society paper due.
  3. On Wednesday, I only have newspapers to deliver.
  4. On Thursday, I have a midterm in the morning, an open house to help set up in the afternoon, my own project to demo during the open house in the evening, and take-down afterwards.
  5. On Friday, I have another midterm late morning.

While I have a very busy day on Thursday, I'm most dreading Tuesday's assignment, as that's the course I'm really not liking.
(I'm much more comfortable writing in a programming language than I am writing papers in English)

While I'm on the topic of the open house, I may as well mention what I've been working on recently.

Last open house, I had most of the robots (about eight iRobot Creates, basically stripped-down versions of the iRobot Roomba) running one of the included default programs, and one or two running a program I wrote to "escape from a room", which basically meant "wander around until you find a virtual wall" (though the program was a bit more interesting than that).

This year, I'm going to use input from a webcam as another sensor, so I get to do things like finding the brightest or darkest part of the room, or chasing a laser pointer.
You know, stuff that's much more interactive and interesting.

I also get my own room this year. The campus has small four- to eight-person rooms for teams to have meetings in, and I'll be given one all my own this time, with some of the Computing Science department in front of my room, and the rest in another room behind the main CS area.

So, this last week has been spent trying to figure out how to get a picture out of the webcam and into my program, hopefully in Python's Image Library format so I can do all kinds of image processing to it.

My first attempt (and what was billed as the easiest method) was using something called OpenCV, a computer vision library (at least the setup I tried was Linux-only).
The major problem is an issue with picture stability.
(in other words, the top-left corner of the webcam image is NOT always coming out as the top-left corner of the requested image)
This is something I'm not prepared to deal with, let alone try to understand.

So I found another alternative, and this one seems to have the advantage of being cross-platform as well.

It's called GStreamer, and it's basically a patch panel for audio and video programs.
Each element in the "pipeline" has inputs and outputs, each with their own capabilities, that can be plugged together in a large variety of ways.

Programs for streaming video from a webcam to the screen are pretty easy to find, and are fairly easy to understand, especially when written in Python.

The trick in my case was figuring out how to get the image data out of the pipeline.
(they don't like that sort of thing, so it's not terribly easy to figure out how to do, but actually doing it is quite easy)

Anyway, after a week, I've written a 30-line Python library that lets me grab a frame from the webcam reliably, as fast as I want and in the format I want.
I've tested it on a few computers (Linux only at this point) and on two different webcams, and it seems to work just fine, so it's what I'll be using for now.
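For what it's worth, the core of the trick is putting an appsink element at the end of the pipeline and pulling samples from it. Here's a rough sketch using today's GStreamer 1.0 bindings through PyGObject (the 0.10-era gst-python API I was working with differed, so treat the exact calls and pipeline string as assumptions):

```python
# Grab one RGB frame from the default webcam via a GStreamer appsink.
# Requires PyGObject and GStreamer 1.0 installed on the system.
PIPELINE_DESC = (
    "v4l2src ! videoconvert ! video/x-raw,format=RGB "
    "! appsink name=sink max-buffers=1 drop=true"
)

def grab_frame():
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)
    pipeline = Gst.parse_launch(PIPELINE_DESC)
    sink = pipeline.get_by_name("sink")
    pipeline.set_state(Gst.State.PLAYING)
    try:
        sample = sink.emit("pull-sample")   # blocks until a frame arrives
        buf = sample.get_buffer()
        caps = sample.get_caps().get_structure(0)
        width = caps.get_value("width")
        height = caps.get_value("height")
        ok, mapinfo = buf.map(Gst.MapFlags.READ)
        try:
            data = bytes(mapinfo.data)      # raw RGB bytes, row-major
        finally:
            buf.unmap(mapinfo)
        return width, height, data
    finally:
        pipeline.set_state(Gst.State.NULL)
```

From there, something like PIL's `Image.frombytes("RGB", (width, height), data)` gets you into Python Imaging Library territory for the actual image processing.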

The actual programs I'm going to demo aren't written yet, but that's primarily due to my other assignments with higher priority...
(they shouldn't be very hard to write, as I've done stuff similar to this before)
carlfoxmarten: (Default)
Boy, I sure enjoy working for the Computing Science department; they're very supportive, encouraging, and very appreciative.
And I've just barely begun on the project!

It was suggested that we try connecting a webcam to one of the robots, but then we'd need to use a computer.
So, if we add a computer, we can't use the Command Module, but we can have the computer control the robot directly.

Problem: The computer I'm thinking of using doesn't have any serial ports.
Solution: Get a USB to serial adapter.

When I mentioned this to the C.S. program coordinator, she indicated that the department would cover the costs, and would even order them and have them shipped direct (to the department, of course).
To help out, I found an inexpensive place to buy them.
(Powersonic, at ~$12 CAD versus Future Shop at $60. An additional problem is that the battery on the laptop I'd like to use only lasts about an hour before quitting on me...)

Anyway, I've got two ideas for projects that use a webcam:
  • Laser-pointer dot follower (just need to figure out how to find the dot in an image)
  • Light or dark finder (almost trivial, just find the lightest or darkest part of the image and drive towards it)
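The light-finder logic really is close to trivial. A sketch of what I have in mind, over a grayscale frame represented as rows of 0-255 brightness values (the three-way steering rule is just a placeholder of mine, not anything final):

```python
def brightest_column(frame):
    """Return the column index with the highest total brightness.

    `frame` is a grayscale image as a list of rows of 0-255 values.
    Summing whole columns is more noise-tolerant than picking one pixel.
    """
    width = len(frame[0])
    column_sums = [sum(row[x] for row in frame) for x in range(width)]
    return column_sums.index(max(column_sums))

def steering(frame):
    """Map the brightest column to a turn: -1 left, 0 straight, +1 right."""
    x = brightest_column(frame)
    width = len(frame[0])
    if x < width // 3:
        return -1
    if x >= 2 * width // 3:
        return 1
    return 0

# Example: a tiny 3x6 frame with a bright spot on the right edge.
frame = [
    [10, 10, 10, 10, 90, 200],
    [10, 10, 10, 10, 80, 255],
    [10, 10, 10, 10, 70, 190],
]
# brightest_column(frame) → 5, so steering(frame) → 1 (turn right)
```

The dark finder is the same thing with `min` instead of `max`, and the laser-pointer follower would just need a colour threshold before this step.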

I'm working on other ideas, as these aren't really all that interesting on their own.
I might write a framework with a GUI to control which program is currently running, possibly even with options to dynamically control each program's parameters.

Actually, that framework alone would be really handy to have, let alone any applications written with it...
carlfoxmarten: (chair)
I've known since about October or November that the next Open House event my primary campus is having is going to be on March 4th.

A couple days ago, the program coordinator for the Computing Science department on that campus contacted me asking if I'd be interested in helping out again.

I think it's rather flattering that the program coordinator (a pretty important position) would be interested in contacting me, though I suppose I'd made an impression on her before she moved into that position.
(actually with most of the department, I think...)

So, I'm thinking about doing stuff with our iRobot Create robots again, but I'm having some problems with deciding what to have them do.

I should probably first point out what features the robot has:
  • Two bumpers on the front (to tell if you've bumped into anything)
  • Four cliff sensors (to tell if you're about to drive off a cliff, or stay away from stairs)
  • A wall sensor on the right side (to tell if you're near a wall. Usually used for wall-following algorithms)
  • An IR transmitter and receiver (for talking to other robots and listening for commands from a remote control or other device)
  • Two driving wheels (independent, used for mobility)
  • A bay in the back with a device connector (for a controller, such as the Command Module)
  • A serial port (for interactive control; programming via this port is not supported, so you need a controlling device for that)
  • One or more "virtual walls" (an IR emitter that the robot can detect)

This is a fairly impressive array of sensors; however, there is very little to actually control, and that is exactly my problem.

Last year my demonstration project was a fairly simple "escape the room" problem: navigate through a space trying to find the virtual wall indicating the exit. The program was straightforward, but didn't look like much in action.
(people seemed fairly interested in it, however, especially as a possible teaching aid)
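The control logic was roughly a loop like the following. This is a reconstruction for illustration, not the original code; the sensor dictionary and the wheel-speed convention are hypothetical names of mine:

```python
import random

def escape_step(sensors):
    """One step of the wander-until-virtual-wall loop.

    `sensors` is a dict with "virtual_wall", "bump_left", "bump_right".
    Returns a (left_speed, right_speed) wheel command in mm/s,
    or None once the exit's virtual wall has been detected.
    """
    if sensors["virtual_wall"]:
        return None            # found the exit marker: stop
    if sensors["bump_left"]:
        return (150, -150)     # spin right, away from the obstacle
    if sensors["bump_right"]:
        return (-150, 150)     # spin left
    if random.random() < 0.05:
        return (200, 100)      # occasional gentle arc to vary the path
    return (200, 200)          # otherwise drive straight
```

Each pass reads the sensors, picks a wheel command, and sends it to the robot; the randomness is what keeps it from retracing the same path forever.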

My ideas so far are:
  • A race of some sort
  • Last year's demo (maybe change the description to something else, but most of the logic could stay the same)
  • Instead, demo an Integrated Development Environment (IDE) for programming (and possibly simulating) the robots

The last one would take the most effort, but would probably pay off the most in the long run, as it might shorten development time for later projects.

I was even told that we could set things up the way I liked, which makes me even more inclined to get some kind of curriculum and software set up for students to learn with...
carlfoxmarten: (Default)
I must say I had a pretty happy birthday on Saturday.
I got stuff I wanted, as well as several gift cards, and I even got to see a pantomime, too!

The important stuff I got:
  • The first two Schlock Mercenary books (Tub of Happiness and The Teraport Wars)
  • The FlyTech BladeStar helicopter. (sure I have another RC copter, but that one only lets me control the height, while this allows me to control much more)
  • A book on Electronic Arts' first twenty-five years. (fascinating read, though the first ten or fifteen years are more interesting to me at the moment)
  • Cats: Commemorative Edition (there are interviews with some of the actors and the producers that describe a bit of how and why it was done)
  • Clue: The Movie (the movie based on the board game. Very entertaining)
  • The Funny Blokes of British Comedy (it's going to be very funny, as it includes some clips from Are You Being Served?, Fawlty Towers, Good Neighbours, and more!)
  • A tape of some of Victor Borge's musical comedy. (most of what I have so far is his jokes, so I don't have much of him horsing around on the piano)
  • A book called Rhyme and PUNishment: Adventures in wordplay (should be good!)
  • Also of some mention: More mint chocolate (I'm starting to get tired of it, despite the fact that it's one of my favourites) and more gift cards (for London Drugs, Tim Hortons, and A&W)

The pantomime I got to see was put on by the Royal Canadian Theatre Company at the Surrey Arts Center. If you've ever seen the Robin Hood pantomime as put on by Ross Petty Productions, you'll notice some similarities.
I really enjoyed the "band", a two-piece accompaniment made up of a set of drums and an electronic piano, both played by people old enough to know how to play them properly. They played before and after the show, as well as during the intermission, a selection of songs such as Christmas favourites, some Big Band stuff, and more.

One other thing they had while the show wasn't running was snowflakes projected onto the walls of the theatre; it gave a very nice effect that I think more theatres (as well as theaters) should use. It adds so much to the experience.

Some of the songs they sang were from various places, including one from Chitty-Chitty-Bang-Bang (You're My Little Teddy Bear), Anything You Can Do (they included a verse about singing anything faster...), and, and...
Hmm, must be bed time again...

My semester starts today (Monday), with one class on Mondays, Wednesdays and Fridays, another class on Tuesdays and Thursdays, and a third class Tuesday evenings, and one of them is on the campus closest to me.

The nearest campus is having an open-house event on March 4th, so I need to get my act together and figure something out to present for the Computing Science department...

