

RAFS - Progress

CS 425

Here are some links from our prototype in CS 425. It consisted of getting the robot to move in a pre-defined square. We had some technical problems with the robot, which kept us from testing it as fully as we would have liked. The robot was shipped back about a week before our prototype was due, so we had to get the prototype done ahead of time. We spent much of one weekend getting the majority of the code written. A video of our prototype and the code behind it follows:


Video
6.28 MB

Code
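
As a rough illustration of what the prototype did, here is a minimal sketch of the square-driving logic. The motion helpers (driveDistance, turnInPlace) are hypothetical blocking wrappers, not the actual calls from our code:

    // Hypothetical blocking motion helpers; stand-ins for the real robot API.
    void driveDistance(double mm);      // drive straight ahead mm millimeters
    void turnInPlace(double degrees);   // rotate in place by the given angle

    // Drive the pre-defined square: four sides, turning 90 degrees after each.
    void driveSquare(double sideMm)
    {
        for (int side = 0; side < 4; ++side) {
            driveDistance(sideMm);
            turnInPlace(90.0);
        }
    }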


CS 499

September 15

Here is a little sample of our object (chair) recognition so far. It includes a crude video capture from our robot development environment, along with a video from a regular camera. The two videos were recorded at the same time, which gives you the opportunity to see what was happening from two different angles. In the video from the camera mounted on the robot, we have Saphira open, the environment in which our project is being developed. The black and blue dots mark where the sonar sees an object, such as the wall to the right of the robot. The robot's heading can be determined from the little red rectangle, which marks its front. The actual code for this module is in the black window. The camera view shows blue highlighted pixels, which represent "recognized" colors. The color we are using is pink because it is easily distinguished from the other colors in the room where we are working.

In this example, the chair is placed within the camera's view. The robot repositions the camera and begins to pursue the chair, turning toward it as it approaches and continuously adjusting to keep the chair directly in front. When it gets within a reasonable distance of the chair, it stops.

Code
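
For readers who do not want to dig through the code link, here is a rough sketch of the pursue-the-chair loop described above. The vision and motion calls (findPinkBlob, sonarFrontDistance, setVelocities) and the constants are hypothetical stand-ins, not our actual Saphira code:

    struct Blob { int centerX; int area; };   // pink region found in a frame

    bool   findPinkBlob(Blob &out);           // vision step (assumed interface)
    double sonarFrontDistance();              // mm to nearest object ahead (assumed)
    void   setVelocities(double transMmSec, double rotDegSec); // drive command (assumed)

    const int    FRAME_CENTER_X = 160;   // half of an assumed 320-pixel-wide frame
    const double STOP_DISTANCE  = 500.0; // stop roughly half a meter away (assumed)

    void pursueChair()
    {
        Blob chair;
        while (findPinkBlob(chair) && sonarFrontDistance() > STOP_DISTANCE) {
            // Rotate in proportion to how far the blob is off-center while
            // creeping forward; this keeps the chair directly in front.
            double error = chair.centerX - FRAME_CENTER_X;
            setVelocities(150.0, -0.2 * error);
        }
        setVelocities(0.0, 0.0);  // within a reasonable distance (or lost it): stop
    }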

September 24

Here is the final version of our Object Recognition module, which is now integrated with the wander module. The robot wanders around until it sees a chair, then approaches it. When the chair is within a certain distance of the robot, the gripper opens and closes again to let you know it found a chair. The chair is also logged, along with its coordinates. Localization plays a big part in how accurate the logging system is; we will make it more accurate in Release 2.


Video
12.3 MB

Code
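
The control flow described above could look something like the sketch below. The module interfaces are hypothetical; in particular, the pose query stands in for whatever localization currently provides:

    #include <cstdio>

    bool chairInView();                   // object recognition result (assumed)
    bool chairWithinGripDistance();       // distance check (assumed)
    void wanderStep();                    // one iteration of the wander behavior (assumed)
    void approachChairStep();             // one iteration of the pursuit behavior (assumed)
    void gripperOpen();
    void gripperClose();
    void robotPose(double &x, double &y); // localized position (assumed)

    void wanderAndFindChair()
    {
        for (;;) {
            if (!chairInView()) {
                wanderStep();                    // no chair yet: keep wandering
            } else if (!chairWithinGripDistance()) {
                approachChairStep();             // chair seen: drive toward it
            } else {
                gripperOpen();                   // open and close to signal "found"
                gripperClose();
                double x, y;
                robotPose(x, y);                 // only as accurate as localization
                std::printf("chair logged at (%.0f, %.0f)\n", x, y);
                return;
            }
        }
    }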

September 26

Here is a copy of our Chair Movement module. The robot starts at a point with the chair already grasped. It then moves forward until an object gets in front of the chair. Once this object is detected, it turns in a random direction to go around the object; at this point, we are assuming that the coast is clear on both sides of the object. It turns 90 degrees, then goes around the object. Once it is past the object, it continues until a certain distance is covered (passed as an argument to the function call). Once that distance is travelled, the robot turns 90 degrees back and moves forward until it is (roughly) along the same path, then turns to face the same direction it was facing when it started. Localization will make this module more accurate in stopping with respect to the same line it started on.


Video
6.54 MB

Code
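
In pseudocode terms, the maneuver amounts to a fixed detour pattern. The sketch below assumes hypothetical blocking motion helpers and, per the module's stated assumption, that both sides of the obstacle are clear:

    #include <cstdlib>

    void driveDistance(double mm);      // hypothetical blocking motion helpers
    void turnInPlace(double degrees);   // positive = one direction, negative = the other

    // Step off the path on a random side, travel parallelMm past the obstacle,
    // then step back onto (roughly) the original line and heading.
    void detourAroundObstacle(double sideStepMm, double parallelMm)
    {
        double sign = (std::rand() % 2 == 0) ? 1.0 : -1.0; // random side, both assumed clear
        turnInPlace(sign * 90.0);
        driveDistance(sideStepMm);       // move off the original path
        turnInPlace(-sign * 90.0);
        driveDistance(parallelMm);       // the distance passed as an argument
        turnInPlace(-sign * 90.0);
        driveDistance(sideStepMm);       // come back to (roughly) the original line
        turnInPlace(sign * 90.0);        // face the starting direction again
    }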

October 3

This module began with the Object Recognition module from above and took it one step further to actually grasp the chair. The video skips the part repeated from above and begins when the robot is already in front of the chair. Since this is the research phase of gripping a chair, a major assumption was made for this module: one of the legs of the chair must be directly in front of the chair from the robot's view. This eliminates the problem of realigning the robot to approach the chair from the correct angle. A few ideas for algorithms have been conjured up at this stage; one of them will be implemented in Release 3.


Video
2.19 MB

Code
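
Under the leg-in-front assumption, the grasp itself reduces to a short creep-and-close step. The sketch below uses hypothetical range and gripper calls, and the grip threshold is a made-up value:

    double frontRangeMm();               // range to the chair leg (assumed sensor call)
    void   setVelocities(double transMmSec, double rotDegSec);
    void   gripperOpen();
    void   gripperClose();

    const double GRIP_RANGE = 120.0;     // leg between the paddles (assumed value)

    void graspChairLeg()
    {
        gripperOpen();
        while (frontRangeMm() > GRIP_RANGE)
            setVelocities(80.0, 0.0);    // creep straight in; no realignment needed
        setVelocities(0.0, 0.0);
        gripperClose();                  // leg captured
    }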

October 14

We captured a small segment of this module running on video. We kept it in its native resolution and format to preserve quality; since that makes the file large, we kept the video short. It is important to be able to see the details of the output window to understand what is going on, and conversion to MPEG would reduce the quality to the point where the laser readings would be hard to decipher.
This is a sample run of the robot selecting random points in EB2029 to "wander" to. In the background, the object recognition module is looking for chairs. In this test run the robot does not see a chair until near the end of the video. When the object recognition module sees a chair, it interrupts the wander module and moves the robot in the direction of the chair. This example employs the new module interaction scheme we have devised.


Video
7.9 MB

Code
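
The scheme itself is in the code link, but the interaction described above behaves like simple priority arbitration: each cycle, the recognition module gets first claim on the robot, and the wander module runs only when recognition has nothing to do. A hypothetical sketch, with illustrative names only:

    struct Action { bool active; double transMmSec; double rotDegSec; };

    Action recognitionAction();  // active only when a chair is in view (assumed)
    Action wanderAction();       // heads toward a random point in EB2029 (assumed)
    void   setVelocities(double transMmSec, double rotDegSec);

    void controlLoop()
    {
        for (;;) {
            Action a = recognitionAction();           // higher priority: chair pursuit
            if (!a.active)
                a = wanderAction();                   // otherwise keep wandering
            setVelocities(a.transMmSec, a.rotDegSec); // the winner drives this cycle
        }
    }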

October 15

In this module, we need to put the chair, once gripped, into the correct place by the desk. The goal of this part is to fine-tune a point to use for a desk. We start with the chair gripped, then move toward the point, assuming that nothing is in the path. Once we get to the desk, we open the gripper to let you know the chair is in the correct place.


Video
1.7 MB

Code
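
Conceptually this module is just a go-to-point followed by a release. A minimal sketch, assuming a hypothetical blocking goToPoint helper and a hand-tuned desk point:

    void goToPoint(double x, double y);  // blocking move; path assumed clear (hypothetical)
    void gripperOpen();

    void deliverChairToDesk(double deskX, double deskY)
    {
        goToPoint(deskX, deskY);   // start with the chair already gripped
        gripperOpen();             // release to show the chair is in place by the desk
    }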

November 5

In this module, we start out with the chair in sight and within the distance at which the object recognition module leaves off. The robot then has two cases: the leg is lined up, or it is not. If it is lined up, the robot simply grabs the chair. If it is not, the robot turns 90 degrees to the right and circles around the chair until the leg is lined up with the center of the chair. It then turns back toward the chair and grabs it. Throughout the circle it continuously realigns itself with the center of the chair, so it does not miss the leg.


Video
1.95 MB

Code
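
One way to picture the circling behavior is as an orbit that ends only when the leg test passes. The sketch below is hypothetical: the vision predicates and gains are stand-ins, and it reuses the grasp step from October 3:

    bool   legLinedUpWithCenter();   // vision test described above (assumed)
    double chairCenterOffsetPx();    // chair center relative to frame center (assumed)
    void   setVelocities(double transMmSec, double rotDegSec);
    void   turnInPlace(double degrees);
    void   graspChairLeg();          // the grasp step from October 3

    void alignAndGrabChair()
    {
        if (!legLinedUpWithCenter()) {
            turnInPlace(90.0);                   // turn right to start circling
            while (!legLinedUpWithCenter()) {
                // Orbit the chair; the rotation term keeps realigning with the
                // chair's center so the robot never loses track of the leg.
                setVelocities(120.0, 0.1 * chairCenterOffsetPx());
            }
            turnInPlace(-90.0);                  // face the chair again
        }
        graspChairLeg();
    }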