carlfoxmarten: (chair)
Despite having very little time to prepare, it was actually a pretty good event.

The software I was supposed to have ready for the event wasn't complete enough to download code to the robot, or even to control it remotely.

Despite all that, people were actually rather receptive and even fairly interested in learning about the robots we had on display.
(my concern was that since it was part of the Computing Science display, we were "supposed" to be showing programming-related things, not unprogrammed robots running default code)

The range of interests displayed by the visitors was rather intriguing.
It ranged from people who had never heard of the iRobot Create or the Roomba (the robotic vacuum cleaner the Create is based on), to those who had Roombas of their own, to those who were already knowledgeable about robotics.

At the end of the day, we even had one visitor who started a discussion among us about the current gaming consoles.

Despite how well-attended the event was otherwise, our room was fairly quiet for much of the time.
That allowed me to make some further progress on the project, but ultimately it still couldn't talk to the robot via a serial cable.
(I'm still not sure what the problem is)

Oh well, now that the event is over, I can slow down and fill in some of the parts I skipped while rushing to meet the deadline...

In related news, I'll be very tired the next couple of days due to how early I was up.
carlfoxmarten: (Default)
So I missed the first assignment (and, per policy, no late submissions are accepted), found out the second assignment isn't due until March 16th, completed most of the first midterm, didn't finish writing my code for the robots for the open house on Thursday, and completed all I could of the second midterm. So, all in all, not a bad week.

I'm relieved that I have an extra two weeks to work on the paper, as I'd only started reading the book the weekend before.
(the book's kind of interesting, even if the course isn't)
In case you're wondering, I'm reading Cell Phones: Invisible Hazard in the Wireless Age by George Carlo and Martin Schram. It's about the first proper research into cell phone safety.
(and that's all I want to say about that right now. Anything more can wait until after I've written my paper...)

The midterms went as well as I could hope, with a missed question on the first one (due to time) and an answer that I kind of hacked together (due to not studying that particular section).

I have mixed feelings about the open house.
I didn't manage to finish the software I was writing for it in time, so all I did was run the robots around and explain what they were and what our intentions were for having them.
(a computing science department with robots? Isn't that supposed to be the engineering department's job?)

We (or I, at least) are hoping to use the iRobot Creates that we have to teach programming to first-year students.
Some of the advantages of teaching with these robots are:
  1. Better feedback than learning programming on a computer alone. (it's physical versus virtual, and making things move is more rewarding than making things print...)
  2. It's prebuilt. You don't have to care about how sturdily you made it, thus removing the physical engineering aspect. (that's for the Engineering department, after all)
  3. It's fairly inexpensive. (the educational model with all the accessories is ~$300 USD, while the vacuum models range from $100 to $600 USD. The advantage of the educational model is the Command Module (a small plug-in brain) plus extra inputs and outputs; the vacuum models can be cheaper, but don't have those extra inputs)
  4. (I forget the fourth reason...)

(there are disadvantages, by the way, I'm just too tired to list them right now)

Anyway, a few things would need to happen first: a conversation with both the Computing Science and Education departments, finishing my undergraduate studies (only a little bit left, fortunately), and then some serious discussion of why we want to use them and how.

Only after that can we actually start creating curriculum for it.
(that does not prevent writing the software, fortunately)

My week...

Feb. 28th, 2010 05:36 pm
carlfoxmarten: (Default)
This week is kind of busy for me, especially as I had the last two weeks off for the Olympics.

  1. On Monday, I have a networking homework assignment due.
  2. On Tuesday, I have a computerized society paper due.
  3. On Wednesday, I only have newspapers to deliver.
  4. On Thursday, I have a midterm in the morning, an open house to help set up in the afternoon, my own project to demo during the open house in the evening, and take-down afterwards.
  5. On Friday, I have another midterm late morning.

While I have a very busy day on Thursday, I'm most dreading Tuesday's assignment, as that's the course I'm really not liking.
(I'm much more comfortable writing in a programming language than I am writing papers in English)

While I'm on the topic of the open house, I may as well mention what I've been working on recently.

Last open house, I had most of the robots (about eight iRobot Creates, basically stripped-down versions of the iRobot Roomba) running one of the included default programs, and one or two running a program I wrote to "escape from a room". That basically meant "wander around until you find a virtual wall", though the program I wrote was a bit more interesting than that.

This year, I'm going to use input from a webcam as another sensor, so now I get to do things like find the brightest or darkest part of the room, or chase a laser pointer.
You know, stuff that's much more interactive and interesting.

I also get my own room this year. The campus has small four- to eight-person rooms for teams to have meetings in, and I'll be given one all my own this time, with some of the Computing Science department in front of my room, and the rest in another room behind the main CS area.

So, this last week has been spent trying to figure out how to get a picture out of the webcam and into my program, hopefully in a Python Imaging Library (PIL) format so I can do all kinds of image processing on it.

My first attempt (and what was billed as the easiest method) was using something called OpenCV, a computer vision library.
The major problem is an issue with picture stability.
(in other words, the top-left corner of the webcam image is NOT always coming out as the top-left corner of the requested image)
This is something I'm not prepared to deal with, let alone try to understand.

So I found another alternative, and this one seems to have the advantage of being cross-platform as well.

It's called GStreamer, and it's basically a patch panel for audio and video programs.
Each element in the "pipeline" has inputs and outputs, each with their own capabilities, that can be plugged together in a large variety of ways.

Programs for streaming video from a webcam to the screen are pretty easy to find, and are fairly easy to understand, especially when written in Python.

The trick in my case was figuring out how to get the image data out of the pipeline.
(the framework doesn't really encourage that sort of thing, so figuring out how to do it isn't terribly easy, but actually doing it is)

Anyway, after a week, I've written a 30-line Python library that lets me reliably grab a frame from the webcam, as fast as I want and in the format I want.
I've tested it on a few computers (Linux only at this point) and on two different webcams, and it seems to work just fine, so it's what I'll be using for now.
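For the curious, the core of such a frame-grabber might look something like the sketch below. Note the hedges: this uses the GStreamer 1.0 Python bindings (PyGObject), which differ in their details from the 0.10-era pygst API, and the `v4l2src` element is Linux-specific, so treat it as an illustration of the approach rather than the actual library.

```python
# A pipeline ending in an appsink, so our program (rather than a
# display window) is the final destination for each frame.
PIPELINE = ("v4l2src ! videoconvert ! video/x-raw,format=RGB "
            "! appsink name=sink max-buffers=1 drop=true")

def grab_frame():
    """Grab one raw RGB frame from the webcam via GStreamer.
    Requires PyGObject and GStreamer 1.0 (a Linux-specific sketch)."""
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)
    pipeline = Gst.parse_launch(PIPELINE)
    sink = pipeline.get_by_name("sink")
    pipeline.set_state(Gst.State.PLAYING)

    # Blocks until the webcam delivers a frame.
    sample = sink.emit("pull-sample")
    caps = sample.get_caps().get_structure(0)
    width, height = caps.get_value("width"), caps.get_value("height")

    buf = sample.get_buffer()
    ok, info = buf.map(Gst.MapFlags.READ)
    data = bytes(info.data)
    buf.unmap(info)

    pipeline.set_state(Gst.State.NULL)
    return width, height, data
```

From there, `Image.frombytes("RGB", (width, height), data)` would turn the raw bytes into a PIL image ready for processing.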

The actual programs I'm going to demo aren't written yet, but that's primarily due to my other assignments with higher priority...
(they shouldn't be very hard to write, as I've done stuff similar to this before)
carlfoxmarten: (Default)
Boy, I sure enjoy working for the Computing Science department; they're very supportive, encouraging, and appreciative.
And I've just barely begun on the project!

It was suggested that we try connecting a webcam to one of the robots, but that would require a computer.
And if we add a computer, we can't use the Command Module, though the computer can control the robot directly instead.

Problem: The computer I'm thinking of using doesn't have any serial ports.
Solution: Get a USB to serial adapter.

When I mentioned this to the C.S. program coordinator, she indicated that the department would cover the costs, and would even order them and have them shipped direct (to the department, of course).
To help out, I found an inexpensive place to buy them.
(Powersonic, at ~$12 CAD, versus Future Shop at $60. An additional problem is that the battery on the laptop I'd like to use only lasts about an hour before quitting on me...)
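Once the adapter is in place, controlling the robot directly means speaking iRobot's Open Interface protocol over that serial link: single-byte opcodes followed by big-endian data bytes. Here's a minimal sketch of building a drive command (opcodes as given in the published Open Interface documentation; everything else, like the velocity check, is just illustrative):

```python
import struct

# Opcodes from the iRobot Create Open Interface documentation.
START, FULL, DRIVE = 128, 132, 137

# The spec's "drive straight" radius is the special value 0x8000,
# which a signed 16-bit field represents as -32768.
STRAIGHT = -0x8000

def drive_packet(velocity_mm_s, radius_mm=STRAIGHT):
    """Build the 5-byte Drive command: the opcode, then velocity
    (-500..500 mm/s) and turn radius as big-endian signed shorts."""
    if not -500 <= velocity_mm_s <= 500:
        raise ValueError("velocity must be within -500..500 mm/s")
    return struct.pack(">Bhh", DRIVE, velocity_mm_s, radius_mm)

print(drive_packet(200).hex())  # → 8900c88000
```

With pyserial, you'd open the adapter's port (typically something like /dev/ttyUSB0 on Linux), write START and FULL to wake the robot and take control, then write drive packets like the one above.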

Anyway, I've got two ideas for projects that use a webcam:
  • Laser-pointer dot follower (just need to figure out how to find the dot in an image)
  • Light or dark finder (almost trivial, just find the lightest or darkest part of the image and drive towards it)
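The second idea really is nearly trivial; here's a sketch in plain Python, assuming the webcam frame has already been converted (via PIL, say) to rows of 0-255 grayscale values. The laser-pointer case would look similar, but scoring pixels by redness rather than brightness, and the 5-pixel dead zone is an arbitrary choice to keep the robot from jittering:

```python
def brightest_pixel(frame):
    """Return (x, y) of the brightest pixel in a frame given as a
    list of rows of 0-255 grayscale values."""
    best_xy, best_val = (0, 0), -1
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value > best_val:
                best_val, best_xy = value, (x, y)
    return best_xy

def steer_towards_light(frame, dead_zone=5):
    """Pick a turn direction from where the bright spot sits
    relative to the frame's horizontal centre."""
    x, _ = brightest_pixel(frame)
    centre = len(frame[0]) // 2
    if x < centre - dead_zone:
        return "left"
    if x > centre + dead_zone:
        return "right"
    return "straight"

# A toy 3x20 "frame" with a bright spot near the right edge:
frame = [[10] * 20 for _ in range(3)]
frame[1][17] = 255
print(brightest_pixel(frame), steer_towards_light(frame))  # → (17, 1) right
```

The dark finder is the same thing with the comparison flipped (or the image inverted).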

I'm working on other ideas, as these aren't really all that interesting on their own.
I might write a framework with a GUI to control which program is currently running, possibly even with options to dynamically control each program's parameters.

Actually, that framework alone would be really handy to have, let alone any applications written with it...
carlfoxmarten: (Default)
Near the beginning of January I wandered around the CS department on my local campus, talking with the instructors and staff I knew and seeing what might be happening for this year's open house event.

Interesting what you can get into when you know people, and they know you.

Apparently, one year an instructor used the last twenty-five hundred dollars of his grant money to buy as many iRobot Create robots as he could, and ended up with eight of them, along with all the accessories.
As they weren't purchased for any particular use, they weren't being used for anything, and people wondered whether something could be done with them, and not just for the open house (which, in itself, is a pretty big affair on this campus).

As some of you may have guessed, I stepped up to the task. I was given access to the lab and the use of a laptop (so I wouldn't have to lug my 4-pounder around too much), and was tasked with providing an educational tool for programming them.
A demo of the current progress was also scheduled for the open house so we had something interesting to show.

So, over the last six to eight weeks I've written a nearly-complete Python API which, unfortunately, requires the robot to be plugged into the computer, limiting its capabilities.
Since the robots also came with a microcontroller that plugs into the back, I wrote a C API as well, which is more powerful, more useful, and more portable.

To test the API, I wrote a short demo program that attempted to escape from a room, using one of the Virtual Walls across the door to indicate when it escaped and the wall sensor to follow the wall once it found one.
(it was also dumb enough that it could get "stuck" on an obstacle in the middle of the room indefinitely, which was one of the things I pointed out during the open house demonstration, and asked for ideas for identifying when the robot got stuck)
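That escape behaviour can be sketched as a tiny state machine. The sensor names and the keep-the-wall-on-one-side policy below are my own illustration, not the original code, and note that (just like the real demo) it will happily circle an obstacle in the middle of the room forever:

```python
def escape_step(state, bumped, wall_seen, virtual_wall):
    """One control tick of an 'escape the room' behaviour.
    Returns (new_state, command) from the current sensor readings."""
    if virtual_wall:
        # A Virtual Wall across the doorway marks success.
        return "escaped", "stop"
    if state == "wander":
        if bumped or wall_seen:
            # Found something; turn to put the wall on our right side.
            return "follow", "turn_left"
        return "wander", "forward"
    if state == "follow":
        if wall_seen:
            return "follow", "forward"   # hug the wall
        return "follow", "turn_right"    # lost it; curve back towards it
    return state, "stop"

# Drive the controller with a canned sensor trace:
state = "wander"
for bumped, wall, vwall in [(False, False, False), (True, False, False),
                            (False, True, False), (False, False, True)]:
    state, command = escape_step(state, bumped, wall, vwall)

print(state, command)  # → escaped stop
```

Detecting the "stuck" case would need something extra, like noticing that the odometry hasn't changed much over many ticks.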

Anyway, the open house was on Thursday, and it was a huge success. The vast majority of people who saw the demo responded positively (and were also interested in first- and/or second-year CS courses being taught with these robots, which may become my Master's thesis if it and I get accepted into the graduate program).

I definitely had some fun building and testing the software, and then almost went hoarse during the open house from the number of people I talked to.
(I'm pretty sure that I spoke to one off-campus reporter, too, in addition to our public-relations person)

The software will be made available for free under some open-source license once I figure out how to use SVN...
(man, I didn't realize how much I liked Git until I tried to use other version-control systems...)

So, yet another example of stepping in to fill some semi-useful role...

Kinda feels good, you know?


Aug. 13th, 2008 10:19 pm
carlfoxmarten: (Default)
After last Christmas, my Dad gave me a copy of the Summer 2007 edition of Robot Magazine, a once-quarterly publication about all forms of robotics, though mostly aimed at hobbyists.

I had a lot of fun reading it, as I'm one of those people who likes that sort of thing, and there was also an article on the two original Mars rovers (apparently they were used as testbeds for the latest Mars rover, and so had their software updated to assist in navigation and in choosing which pictures to send back).

Now, my Dad found it in another location (and it turns out that they've upped it to six times a year), so I've been rather busy reading through this one...

At first I was a bit disappointed with the second one, until I started reading the articles for their content and not for what I thought was interesting.
(I have a habit of doing that sometimes, and so miss important and sometimes very interesting information)

One piece of interesting information is about a microcontroller that has eight processing cores on it (having used a LEGO Mindstorms RCX for a couple of years makes me really appreciate multi-tasking robot brains).
(it's by Parallax Inc., the makers of the BASIC Stamp etc., and it's called the "Propeller". The demo board you can get is good enough to emulate a terminal, right down to the PS/2 keyboard and mouse inputs and VGA output!)

So now there are two microcontrollers I'm interested in getting (the Arduino is the other one).

Now I just need the money for either of them and some way to adapt them to LEGO...
carlfoxmarten: (Default)
*a package appears on the doorstep of the fox/marten*

*the fox/marten peeks outside, sees the package and snatches it up, dashing back inside*

*after a few minutes, he crashes out of the front door*

"AH HA HA HA HA! Mortals! Fear the awesomeness of my creation!"

*a tiny, fragile Lego robot drives out the front door and attempts to drive down the stairs, but ends up crash-landing in a pile of small pieces at the bottom of the steps*

"Darn, I guess I should have checked the manual on this thing first..."

Well, since that didn't seem to work, I should say that I've enjoyed playing with Lego for many years now, and got my first Lego Mindstorms kit a couple of years ago at a secondhand toy fair.

I was so excited!
I could finally build and program my own Lego robots!

But the problem was that it only came with one working motor; the other arrived completely seized up and wouldn't even turn by hand, let alone on command from the RCX unit (the Lego robot's brain).

However, on my last birthday my parents gave me a "gift certificate" for $50 from Lego to use as I wished.
So I carefully browsed their website, trying to find a match for the motor that came with my Robotics Invention System, but since the kit is several years old, I couldn't find it there.

Fortunately, they still make some of them (or at least have some in stock), as I found out after not one, but two calls to Lego's Canadian phone number.
The replacement is a darker grey, but I don't care about the colour; it'll be compatible with what I have already.

Now I can finally build a mobile robot that can actually steer without having to do that nasty turn-in-reverse thing that I had to do previously.

Once my order comes in, I'll finally be able to build robots like they describe in the books I have!

Later, once I've played around with the system I have enough, I'll probably go out and buy their new update to the system, the Lego Mindstorms NXT.
Boy, what I could do with that!
(it can use many Bluetooth-compatible devices, including cellphone cameras!)


Carl Foxmarten

Page generated Sep. 21st, 2017 06:54 am
Powered by Dreamwidth Studios