TelePresence

Update Spring 2011

Using an array of multiple cameras, our goal is to make future teleconferencing more intuitive, illustrated in particular by the case of eye contact in a one-to-many conferencing system. We built a prototype of a speaker-sensitive, auto-adjusting camera system to demonstrate this kind of intuitive viewing: a participant can, for example, make eye contact with one of their counterparts, who sees them from the correct angle and thus realizes that the eye contact is directed at them. The prototype also opens up a number of future possibilities, such as higher-level teleconference user scenarios, including an intuitive 3D voting system, smart recording and playback, breakout-group video and audio stream manipulation, space sharing (e.g. viewing large architectural structures as a group) versus view sharing (e.g. viewing small objects from a common eye point as a group), and other applications that may be used in an immersive environment.

Setup

Two virtual rooms constructed back to back have an array of 6 cameras mounted along the canvases on which the image from the other side is shown during a conference. To simulate the view of the participants in the conference, an array of 6 cameras is set at each participant's position, and these views may be projected onto the screen in the participant's own room to check for differences in angles and viewing directions.

Currently, one room has at most one speaker at all times, and the other room has at most two. Whenever the speaker changes in either room, indicated by a click on the participant's avatar, the system selects the camera closest to the line connecting the two speakers through the back-to-back front walls of the rooms, and the image is projected from that camera. Thus, at each speaker change in one room, the participants in the other room see that speaker at an optimal angle based on where they sit. This way, when a participant makes eye contact with someone in the other room through the screens, the eye contact is made approximately along the line at whose end the counterpart sits, and is therefore perceived as being directed at them. Existing video conference systems are limited in this respect: cameras on the two sides of the conference are often placed so that when a speaker makes eye contact, the counterpart sees the side of the speaker's face and does not realize that the speaker is looking at them, which makes the conversation less natural.
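
The sketch below illustrates this selection rule: from a list of camera positions, pick the one closest to the line connecting the two current speakers. The coordinates, names, and camera layout are hypothetical placeholders for illustration only, not the actual TelePresence code or data.

  import math

  def point_to_line_distance(p, a, b):
      # Perpendicular distance from point p to the infinite line through a and b,
      # computed as |(p - a) x (b - a)| / |b - a| in 3D.
      abx, aby, abz = b[0] - a[0], b[1] - a[1], b[2] - a[2]
      apx, apy, apz = p[0] - a[0], p[1] - a[1], p[2] - a[2]
      cx = apy * abz - apz * aby
      cy = apz * abx - apx * abz
      cz = apx * aby - apy * abx
      return math.sqrt(cx * cx + cy * cy + cz * cz) / math.sqrt(abx * abx + aby * aby + abz * abz)

  def pick_camera(cameras, speaker_a, speaker_b):
      # Choose the mounted camera whose position is closest to the line
      # connecting the two current speakers.
      return min(cameras, key=lambda cam: point_to_line_distance(cam, speaker_a, speaker_b))

  # Hypothetical layout: six cameras spaced along the shared wall (y = 0),
  # one speaker on each side of the wall.
  cameras = [(x, 0.0, 1.5) for x in (-2.5, -1.5, -0.5, 0.5, 1.5, 2.5)]
  speaker_room1 = (-1.0, -3.0, 1.2)
  speaker_room2 = (1.2, 3.5, 1.2)
  print(pick_camera(cameras, speaker_room1, speaker_room2))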

Prior to Fall 2011

Running TelePresence

The Cisco TelePresence program requires a working firewire camera connection. The open-source program Coriander can read in the firewire image stream and output it to a video device through Video4Linux. Coriander must be capturing camera data and outputting it to a video device before TelePresence can be loaded with live camera support. If TelePresence is started and the camera is not set up correctly, a test color pattern is displayed where the live video image should be.

If you are working on a machine that has not already been set up for firewire, refer to the Firewire on CentOS guide.

Start Coriander by typing "coriander" into a terminal window. Click on the "Services" tab. Set the input resolution to 320x240. Start ISO Transmission at 400, then start the Receive service, then start the Display service. Once the image from the camera is displayed, start the V4L service.
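
Once the V4L service is running, you can optionally confirm that frames are actually reaching the loopback video device before loading TelePresence. The snippet below is only a sketch of such a check: it assumes OpenCV (cv2) is installed with Video4Linux support and that the loopback output appears as video device 0, both of which may differ on your machine.

  import cv2

  cap = cv2.VideoCapture(0)   # adjust the index/path for your loopback device
  ok, frame = cap.read()
  cap.release()

  if ok:
      print("Received a %dx%d frame from the video device" % (frame.shape[1], frame.shape[0]))
  else:
      print("No frame received -- check that Coriander's V4L service is running")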

You might run into a series of issues getting these steps to work. Here's how to troubleshoot the most common ones:

Coriander will not start up and says that there are no cameras connected:

  • Make sure the camera is plugged in and the amber light on the front is on.
  • Make sure that /dev/raw1394, /dev/video1394, and /dev/video1394/0 are owned by the current user.
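
A quick way to verify the ownership of these device nodes is sketched below. It uses only the Python standard library; the device paths are the ones listed above.

  import os, pwd

  for dev in ("/dev/raw1394", "/dev/video1394", "/dev/video1394/0"):
      try:
          st = os.stat(dev)
      except OSError as err:
          print("%s: not found (%s)" % (dev, err))
          continue
      owner = pwd.getpwuid(st.st_uid).pw_name
      if st.st_uid == os.getuid():
          print("%s: owned by you (%s)" % (dev, owner))
      else:
          print("%s: owned by %s -- chown it as root" % (dev, owner))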

The Receive service gives an error:

  • Make sure that /dev/raw1394, /dev/video1394, and /dev/video1394/0 are owned by the current user (see the ownership check above).

ISO Transmission will not start:

  • Keep clicking until it works.

The video is green or intermittently flashes green:

  • This is a known issue with the firewire driver.
  • Stop and restart the ISO transmission until the issue goes away.
  • Sometimes, toggling between different input resolutions while the camera is on helps alleviate the issue.

The V4L service gives an error:

  • The vloopback kernel module needs to be loaded.
  • Run
    /sbin/modprobe vloopback
    as root to load the loopback module.
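
If you are unsure whether the module actually loaded, a small check like the following can tell you. It is just a sketch that scans /proc/modules, equivalent to looking for vloopback in the lsmod output.

  def module_loaded(name):
      with open("/proc/modules") as f:
          return any(line.split()[0] == name for line in f)

  if module_loaded("vloopback"):
      print("vloopback is loaded")
  else:
      print("vloopback is not loaded; run /sbin/modprobe vloopback as root")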