
Only released in EOL distros.

Package Summary

Wraps a human-robot interaction experiment conducted at the University of Texas at Austin to use the Robotics in Concert framework. The results from this experiment were reported in the following symposium paper: Piyush Khandelwal and Peter Stone. Multi-robot Human Guidance using Topological Graphs. In AAAI Spring 2014 Symposium on Qualitative Representations for Robots (AAAI-SSS), March 2014.

This package wraps a human-robot interaction experiment conducted at the University of Texas at Austin. The results from this experiment were reported in the following symposium paper: Piyush Khandelwal and Peter Stone. Multi-robot Human Guidance using Topological Graphs. In AAAI Spring 2014 Symposium on Qualitative Representations for Robots (AAAI-SSS), March 2014.

The bwi_guidance_concert package wraps this experiment to use the Robotics in Concert framework.

Overview

The goal of the multi-robot guidance project is to allow an automated system to efficiently guide a person unfamiliar with the environment to their goal using multiple robots. In the experiment available through this package, a single controller decides where to position robots in the environment and what direction should be displayed on each robot's screen to guide the person.

This package currently wraps preliminary work on this project, in which a central positioner node teleports each robot to a desired location using Gazebo's set_model_state service. Additionally, each robot subscribes to an image stream (e.g. /robot1/image) that is displayed on its simulated screen; these images are also published by the positioner.
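As a rough illustration of this mechanism (not code from the package itself; the model name robot1 and the /robot1/image topic are assumptions), a positioner-style node could teleport a robot and publish a screen image with rospy and Gazebo's standard /gazebo/set_model_state service along these lines:

#!/usr/bin/env python
# Hedged sketch of the teleport mechanism described above. The model name
# 'robot1' and the topic '/robot1/image' are illustrative assumptions.
import rospy
from gazebo_msgs.msg import ModelState
from gazebo_msgs.srv import SetModelState
from sensor_msgs.msg import Image

def teleport(model_name, x, y):
    """Move a Gazebo model to (x, y) via the /gazebo/set_model_state service."""
    rospy.wait_for_service('/gazebo/set_model_state')
    set_state = rospy.ServiceProxy('/gazebo/set_model_state', SetModelState)
    state = ModelState()
    state.model_name = model_name
    state.pose.position.x = x
    state.pose.position.y = y
    state.pose.orientation.w = 1.0   # identity orientation, kept simple here
    state.reference_frame = 'world'
    set_state(state)

if __name__ == '__main__':
    rospy.init_node('positioner_sketch')
    # Publisher for the image displayed on the simulated robot screen.
    screen_pub = rospy.Publisher('/robot1/image', Image, queue_size=1)
    teleport('robot1', 10.0, 5.0)
    rospy.spin()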

Apart from the positioner node, an experiment controller node decides which specific problem the user faces, and a server node keeps track of which user is interfacing with the system. A user controls a human avatar inside the simulator through a web-based GUI that talks to the rest of the system using rosbridge_suite.

The workflow of the system is as follows:

1. A user connects to the system through the web-based GUI, and the server node records which user is interfacing with the system.
2. The experiment controller node decides which problem instance that user will face.
3. The positioner node teleports the robots to the desired locations and publishes the direction images shown on their screens.
4. The user guides the human avatar through the simulated environment by following the robots' directions until the goal is reached.

Installation

Running the code

Command Line Execution

The following commands need to be run on a reasonably powerful machine (recommended: a quad-core i7 processor) that can also run Gazebo.

Non-Concert Version

roslaunch bwi_guidance_solver server.launch --screen

Concert Version

rocon_launch bwi_guidance_concert guidance.concert --screen

Running the User Interface

The user interface can be run in any browser that can connect to the machine on which the software is running. The connection is made using the rosbridge library over websockets.
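The packaged GUI is a browser-based client, but as a minimal sketch of what this websocket connection looks like, the snippet below uses the third-party roslibpy library (an assumption, not part of this package) to connect to a rosbridge server on its default port 9090 and subscribe to a topic, much like the web GUI does from JavaScript:

# Minimal rosbridge-over-websockets sketch using roslibpy (assumed installed
# via 'pip install roslibpy'); the topic name '/robot1/image' is illustrative.
import time
import roslibpy

client = roslibpy.Ros(host='localhost', port=9090)   # 9090 is the rosbridge default
client.run()

listener = roslibpy.Topic(client, '/robot1/image', 'sensor_msgs/Image')
listener.subscribe(lambda message: print('received a screen image'))

try:
    while client.is_connected:
        time.sleep(1)
except KeyboardInterrupt:
    listener.unsubscribe()
    client.terminate()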

Typically, you'll have to enter the following link into your web browser to launch the user-side GUI. This link will only work on the machine where you ran the command-line application.

http://localhost/exp1/index.html

If you want to run the interface on a different machine, you'll have to supply the hostname or IP address of the machine where the command-line application is running, so that your browser knows to connect to it:

http://robot-devil.csres.utexas.edu/exp1/index.html?host=robot-devil.csres.utexas.edu

Additionally, if you're running the concert version of the demo, you'll have to supply an additional parameter:

http://robot-devil.csres.utexas.edu/exp1/index.html?host=robot-devil.csres.utexas.edu&concert=true

A convenience webpage where you can enter this information and get redirected to the correct link is available at:

http://robot-devil.csres.utexas.edu/exp1/link.html

Once you get to the index.html page, follow the instructions to run through the experiment!

