
Requirements Analysis Document

 

 

R.O.M.P

(Robot Orientation & Mapping Project)

 

 

Requirements Analysis Document

Version 1.0

 

 

 

10/25/04

Mike Lazar, Peri Subrahmanya, Sean Hogan,

Joe Hackstadt, Sean Williams

 

CONTENTS

 

 

 

1 Introduction
  1.1 Purpose of the System
  1.2 Scope of the System
  1.3 Objectives and Success Criteria of the Project

2 Current System
  2.1 Current System
    2.1.1 The Applet
    2.1.2 The Middle Man
    2.1.3 The Remote Control Server
  2.2 ARIA
  2.3 Saphira
  2.4 Problem
  2.5 Limitations

3 Proposed System
  3.1 Part 1 - Camera Control
    3.1.1 Intuitive Camera Controls
  3.2 Part 2 - Mapping

4 Possible Project Obstacles
  4.1 Hardware Obstacles
  4.2 Software Obstacles

5 Glossary

6 References


1 INTRODUCTION

 

1.1 Purpose of the System:

The purpose of this project is to give users of the Computer Science department robot "Taz" a sense of orientation when controlling the robot via the internet. To do this, we will incorporate a mapping feature into the current system. Taz's onboard camera will also be given pan and tilt capabilities so that the user can "look around" without turning the entire robot.

 

1.2 Scope of the System:

The scope of our system is a Java application that will work in conjunction with the current movement controls to provide the user with orientation information displayed on a map. Camera tilt and pan controls will also be added to the current applet. We are not addressing anything other than orientation via the map and the camera display.

 

1.3 Objectives and Success Criteria of the Project:

The objective of this project is to allow users to orient the robot based on its position on the map, and to view the robot's surroundings via the camera display. The success of the project will depend on whether users can determine the robot's exact orientation at all times using these capabilities.

 

 

2 Current System

 

2.1 Current System:

The robot is a PeopleBot™, one of three owned by the Computer Science department. It has been programmed by three teams in the past. Currently, the system has a stationary camera display that lets users view what is directly in front of the robot while they control it over the internet. The user interacts with a graphical interface, which sends commands to the robot over wireless Ethernet. The robot uses its laser unit to navigate, along with sonar for collision avoidance. It is programmed with the ARIA C++ API and runs Linux.

The architecture of the current system consists of three parts: the Applet, the Middle Man, and the Remote Control Server. None of these subsystems will be replaced by our proposed system; rather, our system will work in conjunction with the present architecture.

2.1.1 The Applet

The Applet runs on a client machine. This is the part of the system with which the user interacts. It uses a GUI, programmed in Java, through which the user controls the movement of the robot. These movement controls are dynamically updated: the applet receives information from the Middle Man about which movements are currently permitted and disables the rest. Once the user selects a movement command, it is sent to the Middle Man via sockets. The applet also receives the video feed from the front-mounted camera, allowing the user to see what is directly in front of the robot.

 

2.1.2 The Middle Man

The Middle Man runs on the web server roboti. It receives commands from the Applet via sockets, translates them into RCP commands, and relays those to the Remote Control Server. It also sends user-list commands, current-mode information, error responses, and button enable/disable information to the applet. The Middle Man additionally addresses some security concerns by limiting the number of user connections.
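As a rough illustration of the translation step described above, the sketch below maps applet command names to RCP strings. Both the command names and the RCP strings here are hypothetical; this document does not specify the actual RCP grammar.

```java
// Hypothetical sketch of the Middle Man's translation step. The applet
// command names and RCP strings are invented for illustration; the real
// RCP wire format is not specified in this document.
import java.util.Map;

public class CommandTranslator {
    private static final Map<String, String> TO_RCP = Map.of(
            "forward",  "RCP MOVE +1",
            "backward", "RCP MOVE -1",
            "left",     "RCP TURN -15",
            "right",    "RCP TURN +15");

    /** Translate an applet command into an RCP command, or null if unknown. */
    public static String toRcp(String appletCommand) {
        return TO_RCP.get(appletCommand.toLowerCase());
    }

    public static void main(String[] args) {
        System.out.println(toRcp("forward")); // prints "RCP MOVE +1"
    }
}
```

A table-driven translation like this keeps the Middle Man's protocol knowledge in one place, so new movement commands only require a new table entry.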

 

2.1.3 The Remote Control Server

The Remote Control Server runs on the robot. It receives RCP commands from the Middle Man and communicates them to the robot. This part of the system also handles localization and navigation by collecting sensor information and using it to determine the precise position of the robot relative to the Grid. That information is then used to navigate the robot. The server also sends information to the Middle Man, which forwards it to the Applet to update the GUI as to which movements the robot can make. The Remote Control Server uses two APIs, ARIA and Saphira.

 

 

 

 

2.2 ARIA

ARIA (ActivMedia Robotics Interface for Applications) is an object-oriented C++ interface to ActivMedia robots. It is usable on Linux and Win32; we are using it on Linux. ARIA communicates via a client/server relationship, using a TCP/IP connection or a serial port.

 

2.3 Saphira

Saphira is a robot control system that is used for graphically interfacing with the robot. This program is currently being phased out of the system.

 

2.4 Problem

Our problem is that when users first gain control, they do not know where in the building the robot is, because there is no mapping feature to identify its position. Since the camera does not rotate, users must turn the entire robot to get an idea of the surroundings. Users need to know where the robot is in the building, where it can go, and which direction it is facing in relation to its surroundings.

 

2.5 Limitations

Few limitations were placed on our solution. We must take the current robot-control system into consideration and incorporate our system with it. We cannot add anything to the environment to assist the mapping function. We must implement our controls on the current website, keeping the side panel and the SIUE header unchanged, and we cannot make major changes to the physical robot.

 

3 Proposed System

 

3.1 Part 1 - Camera Control

3.1.1 Intuitive Camera Controls

We intend to develop controls for the front-mounted camera that allow panning, tilting, and zooming without losing track of the camera's bearing or the robot's orientation.

3.1.1.1 Pan Control

The panning control will give users the ability to pan the camera left or right during operation, helping them see where they are in the building. This is a change from the current system, in which the camera view is strictly straight ahead. The current design for the control is an arc-shaped dial with an arrow initially centered; the user will move the arrow within the arc to control which way the camera is pointing (see Fig 1).

Fig 1: Proposed Panning Control
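The arc-dial mapping described above could be sketched as follows. The coordinate conventions and the pan range passed in are assumptions for illustration, not measured limits of the robot's camera.

```java
// Sketch of mapping a mouse point on the arc dial to a camera pan angle.
// The dial center (cx, cy) and the pan range are illustrative assumptions.
public class PanControl {
    /**
     * Returns a pan angle in degrees: 0 means straight ahead (arrow pointing
     * up on the dial), negative means left, positive means right. The result
     * is clamped to [-maxPan, +maxPan].
     */
    public static double panAngle(int mx, int my, int cx, int cy, double maxPan) {
        // Angle of the mouse point around the dial center; screen y grows
        // downward, so (cy - my) measures how far "up" the point is.
        double theta = Math.toDegrees(Math.atan2(mx - cx, cy - my));
        return Math.max(-maxPan, Math.min(maxPan, theta));
    }

    public static void main(String[] args) {
        // Arrow straight up on a dial centered at (100, 100):
        System.out.println(panAngle(100, 60, 100, 100, 90.0)); // prints "0.0"
    }
}
```

Clamping in the control itself keeps out-of-range pan commands from ever reaching the Middle Man.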

 

3.1.1.2 Tilt Control

The tilt control will afford the user the ability to tilt the camera up, toward the ceiling, or down, toward the floor. The control will be a vertical slide bar, initially centered; the user will click and drag the bar up to tilt the camera up, or down to tilt it down (see Fig 2).

Fig 2: Proposed Tilt Control
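A minimal sketch of the slide-bar mapping described above, assuming screen y grows downward and a hypothetical tilt range passed in by the caller:

```java
// Sketch of mapping a vertical slider position to a camera tilt angle.
// The pixel coordinates and tilt range are illustrative assumptions.
public class TiltControl {
    /**
     * Returns a tilt angle in degrees: 0 when the slider is centered,
     * +maxTilt at the top of the track (camera toward the ceiling),
     * -maxTilt at the bottom (camera toward the floor).
     */
    public static double tiltAngle(int sliderY, int topY, int bottomY, double maxTilt) {
        double mid = (topY + bottomY) / 2.0;
        // Normalized position: +1 at the top of the track, -1 at the bottom.
        double t = (mid - sliderY) / ((bottomY - topY) / 2.0);
        return t * maxTilt;
    }

    public static void main(String[] args) {
        // Centered slider on a track from y=0 to y=100:
        System.out.println(tiltAngle(50, 0, 100, 30.0)); // prints "0.0"
    }
}
```

The same linear mapping would work for the zoom slider, with the normalized position driving a zoom factor instead of an angle.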

 

 

 

3.1.1.3 Zoom Control

The zoom control will give users the ability to zoom the camera view in and out using a control similar to the tilt control: sliding the bar up zooms the camera in, sliding it down zooms out, and the center position is the normal view.

3.2 Part 2 - Mapping

3.2.1 Overview

The purpose of adding the mapping feature is to provide enough detail for the user to maintain a sense of the robot's orientation while navigating it through the engineering building. This will allow better control of the robot, since users will be able to see the robot's position and orientation in the building.

3.2.2 Displaying the Map

The map will be displayed alongside the camera display (see Fig 3).

The users will be able to see the position of the robot on the map as they navigate through the engineering building. The robot will be represented by an arrow pointing in the direction that the robot is presently facing.

Fig 3: Map display
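One way the map display might convert the robot's building coordinates into map pixels and place its heading arrow is sketched below. The scale, origin, and heading convention (degrees counterclockwise from east) are assumptions for illustration, not values taken from the current system.

```java
// Sketch of placing the robot's position and heading arrow on the map.
// The pixels-per-meter scale, map origin, and heading convention are
// illustrative assumptions.
public class MapDisplay {
    /**
     * Converts building coordinates (meters, origin at the map's bottom-left
     * corner) to map pixels. Screen y grows downward, so y is flipped.
     */
    public static int[] worldToMap(double wx, double wy, double pxPerMeter, int mapHeightPx) {
        int px = (int) Math.round(wx * pxPerMeter);
        int py = mapHeightPx - (int) Math.round(wy * pxPerMeter);
        return new int[]{px, py};
    }

    /** Tip of the heading arrow; headingDeg is counterclockwise from east. */
    public static int[] arrowTip(int px, int py, double headingDeg, int lenPx) {
        double r = Math.toRadians(headingDeg);
        return new int[]{px + (int) Math.round(lenPx * Math.cos(r)),
                         py - (int) Math.round(lenPx * Math.sin(r))};
    }

    public static void main(String[] args) {
        int[] p = worldToMap(2.0, 1.0, 10.0, 100);
        System.out.println(java.util.Arrays.toString(p)); // prints "[20, 90]"
    }
}
```

With these two helpers, redrawing the map on each position update is a matter of painting the floor plan, then a short line from the robot's pixel position to the arrow tip.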

 

 

4 Possible Project Obstacles

 

4.1 Hardware Obstacles

 

4.2 Software Obstacles

Learning Java

Our application is going to be written in Java, and no one on our design team knows Java, so one of our obstacles will be learning Java as we develop the project. This should not pose a large problem, as we have acquired several good Java books to aid development.

 

5 Glossary

 

Dynamic - Characterized by frequent change

Ethernet - A local area network protocol

Navigate - To plan, record, and control the course and position of (a robot)

6 References

 

Bruegge, Bernd and Dutoit, Allen (2000), Object-Oriented Software Engineering: Conquering Complex and Changing Systems. Upper Saddle River, NJ: Prentice Hall.

Evans, Keith; Lomonica, Andrew; Ecker, Erin; and Bartholomew, Greg (2003), Requirements Analysis Document (RAD). Edwardsville, IL.

Deiters, Andrew; Sturtz, Chad; Williams, Sean; and Wilson, Joel (2003), Requirements Analysis Document (RAD). Edwardsville, IL.

McConnell, Steve (1996), Rapid Development. Redmond, WA: Microsoft Press.

Metzger, Philip and Boddie, John (1996), Managing a Programming Project. Upper Saddle River, NJ: Prentice Hall.

Definitions adapted from: http://www.dictionary.com, http://www.yourdictionary.com

 

 
