Wednesday, June 06, 2007

TinyProjector

TinyProjector:


Stefan Marti, MIT Media Lab

October 2000 - May 2002

TinyProjector is a very small, portable character projector based on inexpensive laser diodes; it projects a single line of text onto nearby walls and tables. TinyProjector could be useful for projecting text from portable and wearable devices such as cellphones and PocketPCs, connected, for example, over a wireless serial link.
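To make the idea concrete, here is a minimal Python sketch (using the pyserial package) of how a host device might push a line of text to such a projector over a serial link. The port name, baud rate, and the one-ASCII-line-per-write convention are assumptions for illustration, not TinyProjector's actual protocol.

    # Hypothetical host-side sketch: send one line of text to a TinyProjector-like
    # device over a (possibly wireless, e.g. Bluetooth SPP) serial link.
    # Port name, baud rate, and the "one ASCII line per write" convention are
    # assumptions, not the real TinyProjector protocol.
    import serial  # pyserial

    PORT = "/dev/rfcomm0"   # assumed Bluetooth serial port (e.g. COM3 on Windows)
    BAUD = 9600             # assumed baud rate

    def project_line(text: str) -> None:
        """Send a single line of text for the device to sweep onto a wall."""
        with serial.Serial(PORT, BAUD, timeout=1) as link:
            link.write(text.encode("ascii", errors="replace") + b"\n")

    if __name__ == "__main__":
        project_line("Hello from the phone!")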







Labnotebook:

Complete Labnotebook [PDF 6,519kb, 129 pages]

Complete Labnotebook [HTML single page 13,852kb]

Poster [JPG 238kb, 1351x855]

Movie 1 [MOV 2,268kb, 8 secs]

Movie 2 [MOV 2,190kb, 8 secs]

Send me some comments! Stefan Marti

YouTube - WildLife by Karolina Sobecka

YouTube - WildLife by Karolina Sobecka

Tuesday, June 05, 2007

MeVisLab for Mac OS X

http://www.mevislab.de/macosx/

MeVisLab for Mac OS X

MeVisLab is a visual programming and rapid prototyping platform for image processing research and development with a focus on medical imaging. MeVisLab provides flexible and simple handling of visualization and image processing algorithms through modular visual programming. No programming knowledge is required to set up image processing and visualization pipelines. Complete applications, including user interfaces, can easily be built within a general framework. Besides general image processing and visualization tools, MeVisLab includes advanced medical imaging algorithms for segmentation, registration, and quantitative morphological and functional analysis. Best of all, MeVisLab is now also available on Mac OS X – free for non-commercial use.

New image processing algorithms and visualization tools can easily be integrated as new modules using a standardized software interface. Module wizards, as well as an interface for customized wizards, add further usability. Macro modules allow hierarchical encapsulation of networks, making it easy to reuse existing developments. Efficient design of graphical user interfaces is achieved with an abstract, hierarchical module definition language (MDL) that hides the complexity of the underlying module network from the end user. Dynamic functionality on both the network and the user interface level can be added using Python or JavaScript. MeVisLab can be fully integrated into the radiological workflow (PACS and DICOM integration).
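As a flavor of that scripting layer, here is a minimal Python sketch of the kind of field manipulation a macro module script can perform. It assumes the scripting context object ctx and its ctx.field() accessor from MeVisLab's Python interface; the module and field names (ImageLoad.filename, Threshold.threshold) are purely illustrative and would have to match the modules actually present in your network.

    # Minimal sketch of dynamic behavior inside a MeVisLab macro module script.
    # Assumes the scripting context object `ctx` that MeVisLab hands to macro
    # module scripts; the module and field names below are hypothetical and
    # must match whatever modules exist in the actual network.

    def load_and_threshold(filename, level):
        # Point a (hypothetical) ImageLoad module at a new file ...
        ctx.field("ImageLoad.filename").value = filename
        # ... and adjust a downstream (hypothetical) Threshold module.
        ctx.field("Threshold.threshold").value = level

    # Typically wired to a button or field listener declared in the MDL section,
    # e.g. calling load_and_threshold("/data/case01.dcm", 120).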



Monday, June 04, 2007

CaveUT2004

CaveUT2004

CaveUT 2004 v1.1

Freeware for Low-Cost Integrated Multi-Screen Displays
Using Unreal Tournament 2004


CaveUT 2004 is a set of modifications allowing the Unreal Tournament 2004 (UT2004) game engine to display in CAVE-like displays and panoramic digital theaters. It supports off-axis projection for correct perspective and multiple views from the observer's viewpoint. This page provides an introduction to CaveUT2004, including download information, installation instructions, guidelines for the program's use, explanations of how it works, and descriptions of improvements planned for the future.
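For readers new to off-axis projection: at its core it is an asymmetric view frustum computed from the physical screen corners and the observer's eye position. The following Python sketch shows that standard calculation in generic form; it illustrates the technique, not CaveUT's own code, and the screen corners, eye position, and near-plane value in the example are made up.

    # Generic sketch of off-axis (asymmetric-frustum) projection for a fixed
    # physical screen and a known eye position -- the technique CaveUT-style
    # displays rely on for correct perspective. Illustration only, not CaveUT
    # code; all coordinates must be in one consistent frame (e.g. meters).
    import numpy as np

    def off_axis_frustum(pa, pb, pc, pe, near):
        """pa, pb, pc: lower-left, lower-right, upper-left screen corners.
        pe: eye position. Returns (left, right, bottom, top) at the near
        plane, suitable for a glFrustum-style projection matrix."""
        pa, pb, pc, pe = map(np.asarray, (pa, pb, pc, pe))

        # Orthonormal screen basis: right, up, and normal toward the viewer.
        vr = pb - pa; vr = vr / np.linalg.norm(vr)
        vu = pc - pa; vu = vu / np.linalg.norm(vu)
        vn = np.cross(vr, vu); vn = vn / np.linalg.norm(vn)

        # Vectors from the eye to three screen corners.
        va, vb, vc = pa - pe, pb - pe, pc - pe

        # Perpendicular distance from the eye to the screen plane.
        d = -np.dot(va, vn)

        # Frustum extents on the near plane (asymmetric in general).
        left   = np.dot(vr, va) * near / d
        right  = np.dot(vr, vb) * near / d
        bottom = np.dot(vu, va) * near / d
        top    = np.dot(vu, vc) * near / d
        return left, right, bottom, top

    # Example: a 2 m x 1.5 m wall in the z = 0 plane, eye 1 m in front of it
    # and shifted toward the left edge.
    print(off_axis_frustum(pa=(-1.0, 0.0, 0.0), pb=(1.0, 0.0, 0.0),
                           pc=(-1.0, 1.5, 0.0), pe=(-0.5, 0.75, 1.0),
                           near=0.1))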

Coming Soon! We have been working on a new version of CaveUT for some time, and we hope to have it ready for public release in December. It has: (1) greatly improved screen synchronization; (2) logging capability; (3) a special cursor that can traverse the composite display and activate clickable objects; (4) support for digital dome-shaped displays; (5) the ability to drive navigation via UDP; and (6) a protocol for interfacing a treadmill (see UtVRPN). An early beta is available, but only to researchers who can actually help us get the public release done sooner. Stay tuned!
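The UDP navigation hook implies that any external process (a treadmill driver, a joystick bridge, a script) can inject movement updates over the network. Purely as a hypothetical illustration of that idea — the actual CaveUT packet format, host, and port are not documented here — a sender might look like this:

    # Hypothetical sketch of driving navigation over UDP. Host, port, and the
    # "MOVE forward strafe turn" packet format are invented for illustration;
    # the real CaveUT protocol defines its own message layout.
    import socket

    DISPLAY_HOST = "127.0.0.1"   # assumed address of the display machine
    NAV_PORT = 5000              # assumed navigation port

    def send_move(forward: float, strafe: float, turn: float) -> None:
        """Send one navigation update, e.g. from a treadmill or joystick driver."""
        msg = f"MOVE {forward:.3f} {strafe:.3f} {turn:.3f}".encode()
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(msg, (DISPLAY_HOST, NAV_PORT))

    send_move(forward=1.0, strafe=0.0, turn=0.05)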

High School Graduate Refines Gyromouse Interface For Virtual Reality; Pre-teens Play Crucial Role — siggraph.org

High School Graduate Refines Gyromouse Interface For Virtual Reality; Pre-teens Play Crucial Role — siggraph.org: "High School Graduate Refines Gyromouse Interface For Virtual Reality; Pre-teens Play Crucial Role

Matt Duncan was a graduating senior at Yorktown High School in Arlington, Virginia when he visited our university for a three-week for-credit internship. The goal was for him to learn about 3D graphic design and interaction design while helping us develop our Virtual Reality (VR) technology. The result was beneficial for everyone. He learned how to build virtual environments and to compare and refine VR navigation and selection techniques through collaboration with users. We got a virtual obstacle course for VR navigation training and a very intuitive interaction technique for VR navigation with the Gyromouse."

Hafner, U., Matthias, B., Magg, R. (2000). Wireless Interaction in Cost-Effective Display Environments. Immersive Projection Technology Workshop (IPT2000).

Herpers, R., Hetmann, F., Hau, A., Heiden, W. (2002). Immersion Square - A Mobile Platform for Immersive Visualizations. Proc. Virtual Environment on a PC Cluster Workshop, Protvino, Russia, 2002.
http://viswiz.gmd.de/VEonPC/2002/proceedings/03-1.pdf

Jacobson, J., Kelley, M., Ellis, S., Seethaler, L. (2005c). Immersive Displays for Education Using CaveUT. World Conference on Educational Multimedia, Hypermedia & Telecommunications, Montreal, Canada, June 27-July 2. http://planetjeff.net/IndexDownloads/Jacobson2005c.pdf

Jacobson, J., Lewis, L. (2005i). Game Engine Virtual Reality With CaveUT. IEEE Computer, 38, 79-82. http://planetjeff.net/IndexDownloads/Jacobson2005i.html


squidsoup.org home

squidsoup.org home:


Virtual Puppeteers (2002-7)
Using puppet theatre as a metaphor, children can create their own 3D interactive characters and sets from scratch, write their own scripts, and put on collaborative live performances, including live voice-over recording.

Latest beta version soft-launched May 2007

HPL Coverage of Telepresence World 2007 || DVE Tele-immersion Room (The Ultimate Telepresence) Debuts at Telepresence World

HPL Coverage of Telepresence World 2007 || DVE Tele-immersion Room (The Ultimate Telepresence) Debuts at Telepresence World:

DVE Tele-immersion Room (The Ultimate Telepresence) Debuts at Telepresence World

2007.06.04 by John Serrao

FOR IMMEDIATE RELEASE
Contact: Jeff Machtig
Phone: (949) 347-9166
DVEtelepresence.com
Telepresence World, San Diego, CA (June 4, 2007)



DVE announced today the launch of the ultimate telepresence experience - the groundbreaking DVE Tele-Immersion Room™ - which is available for world preview at the Telepresence World Conference being held in San Diego. Offering dramatic 3-D, holographic-appearing images of participants in perfect HD, the DVE Tele-Immersion Room™ shows people sitting and walking around meeting rooms as if they were just across the table, providing an exciting new level of realism to the teleconferencing experience.

Sunday, June 03, 2007

ProCams 2007

ProCams 2007

Program

Schedule

A printable version of the schedule is available.

Workshop date: June 18, 2007.

7:45 - 8:30 Breakfast
8:30 - 8:45 Welcome
8:45 - 9:30 Keynote Talk
Procams for Fast 3D Reflectance Capture and Display
Paul Debevec (University of Southern California Institute for Creative Technologies)
9:30 - 10:00 Poster/Demo Session

Inter-Reflection Compensation for Immersive Projection Display (Poster)
Hitoshi Habe, Nobuo Saeki, Takashi Matsuyama

Analysis of Light Transport based on the Separation of Direct and Indirect Components (Poster)
Osamu Nasu, Shinsaku Hiura, Kosuke Sato

Cordless portable multi-view fringe projection system for 3D reconstruction (Poster)
C. Munkelt, I. Schmidt, C. Bräuer-Burchardt, P. Kühmstedt and G. Notni

High-Speed Visual Tracking of the Nearest Point of an Object Using 1,000-fps Adaptive Pattern Projection (Poster)
Tomoyuki Inoue, Shingo Kagami, Joji Takei, Koichi Hashimoto, Kenkichi Yamamoto, Idaku Ishii

Projector Calibration using Arbitrary Planes and Calibrated Camera (Poster)
Makoto Kimura, Masaaki Mochimaru, Takeo Kanade

Real-Time Projector Tracking on Complex Geometry Using Ordinary Imagery (Demo)
Tyler Johnson, Henry Fuchs

Anywhere Pixel Compositor for Plug-and-Play Multi-Projector Displays (Demo)
Ruigang Yang, Daniel R. Rudolf, Vijai Raghunathan
10:00 - 10:30 Morning Break
10:30 - 11:45 Paper Session I: Calibration and Measurement

Geometric Modeling and Calibration of Planar Multi-Projector Displays Using Rational Bezier Patches
Ezekiel Bhasker, Aditi Majumder

High-Speed Measurement of BRDF using an Ellipsoidal Mirror and a Projector
Yasuhiro Mukaigawa, Kohei Sumino, Yasushi Yagi

Photometric Self-Calibration of a Projector-Camera System
Ray Juang, Aditi Majumder
11:45 - 1:00 Lunch Break
1:00 - 2:40 Paper Session II: Real-Time Applications

Real-Time Projector Tracking on Complex Geometry Using Ordinary Imagery
Tyler Johnson, Henry Fuchs

Shadow Removal in Front Projection Environments using Object Tracking
Samuel Audet, Jeremy Cooperstock
Video

DigiTable: An Interactive Multiusers Table for Collocated and Remote Collaboration Enabling Remote Gesture Visualization
François Coldefy, Stéphane Louis dit Picard

Displaying a Moving Image By Multiple Steerable Projectors
Ikuhisa Mitsugami, Norimichi Ukita, Masatsugu Kidode
Video
2:40 - 3:00 Poster/Demo Session

Projector-Camera Guided Fast Environment Restoration of a Biofeedback System for Rehabilitation (Poster)
Yufei Liu, Gang Qian

Embodied User Interface for Increasing Physical Activities in Games (Poster)
Si-Jung Kim, Woodrow W. Winchester, Yun-Bum Choi, Juck-Sik Lee

A Real-Time ProCam System for Interaction with Chinese Ink-and-Wash Cartoons (Poster)
Ming Jin, Hui Zhang, Xubo Yang, Shuangjiu Xiao

Virtual Recovery of the Deteriorated Art Object based on AR Technology (Poster)
Toshiyuki Amano, Ryo Suzuki

Automatic texture mapping on real 3D model (Poster)
Thierry Molinier, David Fofi, Patrick Gorria, Joaquim Salvi

Multi-Use Light Engine - MULE (Demo)
Mark Bolas
3:00 - 3:30 Afternoon Break
3:30 - 4:45 Paper Session III: Image Quality

Realizing Super-Resolution with Superimposed Projection
Niranjan Damera-Venkata, Nelson L. Chang

Improved Legibility of Text for Multiprojector Tiled Displays
Philip Tuddenham, Peter Robinson

Focal Pre-Correction of Projected Image for Deblurring Screen Image
Yuji Oyamada, Hideo Saito
4:45 - 5:30 Capstone Talk
Ultra-resolution Display and the Next Revolution in Computing
Christopher Jaynes (University of Kentucky)
5:30 - 5:45 Best-Paper Awards and Closing



Keynote Talk: Procams for Fast 3D Reflectance Capture and Display

Paul Debevec is a research associate professor at the University of Southern California and the executive producer of graphics research at the USC Institute for Creative Technologies. Debevec's Ph.D. thesis (UC Berkeley, 1996) presented Facade, an image-based modeling and rendering system for creating photoreal architectural models from photographs. Using Facade he led the creation of virtual cinematography of the Berkeley campus for his 1997 film "The Campanile Movie", whose techniques were used to create virtual backgrounds in the 1999 film "The Matrix". Subsequently, Debevec developed techniques for illuminating computer-generated scenes with real-world lighting captured through high dynamic range photography, demonstrating new image-based lighting techniques in his films "Rendering with Natural Light" (1998), "Fiat Lux" (1999), and "The Parthenon" (2004); he also led the design of HDR Shop, the first widely used high dynamic range image editing program. At USC ICT, Debevec has led the development of a series of Light Stage devices for capturing and simulating how objects and people reflect light, recently used to create realistic digital actors in films such as "Spider-Man 2" and "Superman Returns". He is the recipient of ACM SIGGRAPH's first Significant New Researcher Award and a co-author of the 2005 book "High Dynamic Range Imaging" from Morgan Kaufmann.




Capstone Talk: Ultra-resolution Display and the Next Revolution in Computing

Christopher Jaynes is an adjunct professor in the Department of Computer Science and founding research director of the Center for Visualization and Virtual Environments at the University of Kentucky. He received his B.S. degree from the University of Utah in 1994 and his doctoral degree from the University of Massachusetts, Amherst in 2000. He was awarded the NSF CAREER award in 2001 for work related to wide-area video surveillance and human-computer interaction technologies. He is the founder of Mersive Technologies, a company that is commercializing multi-projector display systems and actively conducting research on interactive media beyond standard resolutions.

Christopher's core research concerns visual information processing, its role in mixed reality and novel display technologies, object recognition and tracking, and intelligent environments. He is the author of over 70 scientific articles and the editor of the book Computer Vision for Interactive and Intelligent Environments (IEEE Press, 2003). He has been the keynote speaker at events ranging from the IEEE Conference on Virtual Reality and Cluster Computing to the architectural design conference ACADIA. His research on multi-projector display systems led to the formation of Mersive Technologies (www.mersive.com) in 2004, where he currently serves as Chief Technical Officer.


IEEE International Workshop on Projector-Camera Systems
Contact: general@procams2007.org


LightTwist - TWiki

LightTwist - TWiki: "Multi-projector system
Basic ideas

* Many projectors are used to project the virtual world.
* At least some information must be known about their parameters and relative geometry, such as:
      o Internal parameters.
      o Relative position and orientation.
      o Color calibration.
* In our case, we have developed an original approach (see the scientific reference below) based on:
      o No full calibration of the devices. We use a panoramic (catadioptric) camera to get correspondences between each projector pixel and the corresponding camera pixel. This camera represents the viewpoint of our future observers. Then, from what the observer should see, we can build the projector images from their respective mappings (see the sketch below this excerpt).
      o No 3D reconstruction of the screen.

* Tardif, J.-P., Roy, S., Trudeau, M. (2003). Multi-projectors for arbitrary surfaces without explicit calibration nor reconstruction. The 4th International Conference on 3-D Digital Imaging and Modeling (3DIM 2003), pp. 217-224.




"

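The correspondence-based approach boils down to a per-projector lookup table: for every projector pixel, the panoramic camera tells you which observer-image pixel it lights up, so each projector frame is just the desired observer image resampled through that table. A rough numpy sketch of that resampling step follows; the array names, shapes, and the negative-value convention for unmapped pixels are assumptions, not LightTwist's actual data format.

    # Sketch of the warping step implied by the projector/camera correspondences:
    # for each projector pixel we know which pixel of the desired observer image
    # it should show, so building a projector frame is a table lookup.
    # Array layouts are illustrative only, not LightTwist's real data structures.
    import numpy as np

    def build_projector_frame(observer_img, map_u, map_v):
        """observer_img: H x W x 3 image the observer should see.
        map_u, map_v:  h x w integer arrays; projector pixel (y, x) corresponds
                       to observer pixel (map_v[y, x], map_u[y, x]); negative
                       values mark projector pixels that hit no screen."""
        frame = np.zeros(map_u.shape + (3,), dtype=observer_img.dtype)
        valid = (map_u >= 0) & (map_v >= 0)
        frame[valid] = observer_img[map_v[valid], map_u[valid]]
        return frame

    # Example with dummy data: a 1024x768 projector and a 2048x1024 observer view.
    obs = np.random.randint(0, 256, (1024, 2048, 3), dtype=np.uint8)
    mu = np.random.randint(0, 2048, (768, 1024))
    mv = np.random.randint(0, 1024, (768, 1024))
    print(build_projector_frame(obs, mu, mv).shape)   # -> (768, 1024, 3)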
The All-Seeing Eye - March, 2007

The All-Seeing Eye - March, 2007:

Free OmniMap API beta now available
Beta testers needed for OpenGL dome-correction libraries

Want to dome-enable your real-time OpenGL application? The OmniMap™ Geometry Correction Library API provides developers with an easy-to-integrate solution to make real-time OpenGL applications compatible with OmniFocus™ projection systems. It takes advantage of the latest accelerated graphics hardware and OpenGL 2.0 extensions to increase performance over previous geometry correction software solutions.

We've been working hard to streamline and optimize the OmniMap API, and we're now releasing the public beta so you can kick the tires. If you're interested in participating in the beta program, please download OmniMap and let us know what you think. OmniMap is made available free of charge for application developers to dome-enable any OpenGL application.
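For context, dome correction of this kind generally comes down to a per-pixel warp: the scene is rendered normally (often into a texture or cube map) and then resampled into the fisheye image the dome projector needs. As a generic illustration of that geometry — not the OmniMap API, and with an equidistant-fisheye mapping assumed — this numpy sketch computes the 3D view direction for every output pixel, which is the table a warp pass would use to resample the rendered scene.

    # Generic illustration of the geometry behind dome correction (not the
    # OmniMap API): for every pixel of an equidistant-fisheye output frame,
    # compute the 3D view direction it should display. A real-time system
    # would bake this into a warp mesh or shader to resample the rendered scene.
    import numpy as np

    def fisheye_directions(width, height, fov_deg=180.0):
        """Return a (height, width, 3) array of unit view directions, with the
        dome axis along +Z; pixels outside the fisheye circle are set to NaN."""
        y, x = np.mgrid[0:height, 0:width]
        # Normalized coordinates in [-1, 1], origin at the image center.
        nx = (x + 0.5) / width * 2.0 - 1.0
        ny = (y + 0.5) / height * 2.0 - 1.0
        r = np.sqrt(nx**2 + ny**2)

        theta = r * np.radians(fov_deg) / 2.0    # angle from the dome axis
        phi = np.arctan2(ny, nx)                 # azimuth around the axis

        dirs = np.stack([np.sin(theta) * np.cos(phi),
                         np.sin(theta) * np.sin(phi),
                         np.cos(theta)], axis=-1)
        dirs[r > 1.0] = np.nan                   # outside the dome image
        return dirs

    print(fisheye_directions(1024, 1024).shape)  # -> (1024, 1024, 3)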


Saturday, June 02, 2007

Scripps Institution of Oceanography Visualization Center

Scripps Institution of Oceanography Visualization Center:
Highly Immersive Visualization Environment (HIVE)

The Highly Immersive Visualization Environment (HIVE) is a cylindrical wall display, 28 feet wide and 8 feet tall, installed in the Revelle Conference Room to provide immersive exploration capabilities to large groups of researchers (up to 50 people can be accommodated in the room). A 16-processor SGI Onyx 3400 drives three front-mounted projectors that display on the Panoram GVR120E screen for a fully immersive environment.

Besides the SGI Onyx, the HIVE can also use a Windows PC (with a 3-channel Matrox graphics card) to display on the Panoram screen. The facility also has a DVD player, a VCR, and the ability to connect any laptop via a VGA cable for presentations.

http://resumbrae.com/info/cheapvr/


How to construct a projection-based virtual reality display using commodity components, for use in university class or museum settings.

Dave Pape
Josephine Anstey, University at Buffalo








Workshop Schedule

3:15 - 4:00
Introduction, Overview of history and motivations for projected display systems
Case Study, low-cost VR system at Media Study, University at Buffalo
Other lower-cost VR systems/groups

4:00 - 4:50
Technical details - hardware
Software

4:50 - 5:15
Questions

Other lower-cost VR systems/groups

ARSBOX (2001)
Built at the Ars Electronica Center Futurelab. A CAVE-like system running on Linux PCs, used to further the center's mission of showcasing art and technology and the Futurelab's role as the production arm of the AEC. A one-wall offshoot system uses a PDA as its interface device.
VR Portal (2001)
Built by the Applied Interactives group, based on research done at the Electronic Visualization Laboratory. A one-wall system built for museum installations. UIC's undergraduate Electronic Visualization program also has a similar one-wall passive-stereo system.
The Geowall Consortium
The consortium members pool information to build stereo walls using Linux PCs.
"The GEOWALL project makes use of these projection systems to visualize structure and dynamics of the Earth in stereo to aid the understanding of spatial relationships."
The SAS Cube (2001)
A CAVE-like system running on a PC cluster. Developed in France by BARCO, CLARTE, IRISA and Z-A. Claims to cost half as much as existing CAVE systems.
VisBox
The VisBox is a one-wall, fully integrated VR system with head tracking and passive-stereo display. PC-based, with bright projectors. An 8' x 8' x 8' aluminum box contains all projection equipment and cuts down on light pollution. Commercially available, especially to CAVEQuakers.
VizTek
1-6 PC virtual walls, active stereo, tracking, multiple PCs, surround audio. Commercially available.