
Emerging Technologies

Interact with digital experiences that move beyond tradition, blur the boundaries between art and science, and transform social assumptions. See, learn, touch, and try the state of the art in human-computer interaction and robotics. Emerging Technologies presents work from many sub-disciplines of interactive techniques, with a special emphasis on projects that explore science, high-resolution digital-cinema technologies, and interactive art-science narratives.

Best of Show Award
New for 2016: Emerging Technologies celebrates the project that best exemplifies the future of art, science, and human-computer interaction, and the role of emerging technologies as a force for positive advancement.


Sponsored by NTT Communications Corporation, NTT Media Intelligence, SmartNews, Inc., and DWANGO


ACM Digital Library
Content from the SIGGRAPH 2016 conference is available in the ACM Digital Library free of charge via the SIGGRAPH organization web site.



AnyLight: An Integral Illumination Device

A novel programmable lighting device that can mimic the illumination effects of a wide range of lighting sources (spotlight, candle, natural light, etc.) using the principle of integral imaging.

Yuichiro Takeuchi
Sony Computer Science Laboratories, Inc.

Shunichi Suwa
Sony Computer Science Laboratories, Inc.

Kunihiko Nagamine
Sony Computer Science Laboratories, Inc.



Big Robot Mk.1A

This tall robot with two wheeled legs positions a human pilot five meters above the ground. The pilot controls the robot's movement with foot motions as it follows the path a five-meter-tall humanoid would take.

Tickets are required for this installation. They are available free of charge at the Experience Hall Ticket counter in Hall D.

Hiroo Iwata
University of Tsukuba

Yu-ta Kimura
University of Tsukuba

Hikaru Takatori
University of Tsukuba

Yu-uki Enzaki
University of Tsukuba



Computational Focus-Tunable Near-Eye Display

This focus-tunable near-eye display is capable of reproducing multiple emerging display modes for augmented and virtual reality.

Robert Konrad
Stanford University

Nitish Padmanaban
Stanford University

Emily Cooper
Dartmouth College

Gordon Wetzstein
Stanford University



Demo of Face2Face: Real-Time Face Capture and Reenactment of RGB Videos

A novel approach for real-time facial reenactment of a monocular target-video sequence, where the goal is to animate the facial expressions of the target video with a source actor and re-render the manipulated output video in a photo-realistic fashion.

Justus Thies
Friedrich-Alexander University Erlangen-Nürnberg

Michael Zollhöfer
Max-Planck-Institut für Informatik

Marc Stamminger
Friedrich-Alexander University Erlangen-Nürnberg

Christian Theobalt
Max-Planck-Institut für Informatik

Matthias Nießner
Stanford University



Enjoy 360° Vision With FlyVIZ

FlyVIZ is a novel, compact, lightweight display device that enables artificial omnidirectional vision, so users can see as if they had "eyes behind their back."

Guillermo Andrade-Barroso
Irisa/Inria Rennes Bretagne Atlantique

Jerome Ardouin

Florian Nouviale
Irisa/Inria Rennes Bretagne Atlantique

Anatole Lecuyer
Irisa/Inria Rennes Bretagne Atlantique

Maud Marchal
Irisa/Inria Rennes Bretagne Atlantique

Eric Marchand
Irisa/Inria Rennes Bretagne Atlantique



FOVE

FOVE is a virtual reality head-mounted display with integrated eye-tracking technology. Two small infrared cameras achieve stereo eye tracking with better than one-degree accuracy at 120 frames per second per eye.

Lochlainn Wilson
Fove, Inc.

Jim Preston
Fove, Inc.



FinGAR: Combination of Electrical and Mechanical Stimulation for High-Fidelity Tactile Presentation

FinGAR is a finger-glove tactile display that combines electrical and mechanical stimulation to reproduce high-fidelity tactile sensation on a finger-pad. Its combination of virtual scenes and a hand-motion capture system allows users to experience various textures in the virtual world.

Vibol Yem
The University of Electro-Communications

Ryuta Okazaki
The University of Electro-Communications

Hiroyuki Kajimoto
The University of Electro-Communications

Erika Oishi
The University of Electro-Communications

Junko Ikuta
The University of Electro-Communications



Graphical Manipulation of Human Walking Direction With Visual Illusion

This installation presents a novel method that uses a visual illusion to enable pedestrian guidance without referring to information provided by a navigation system.

Akira Ishii
University of Tsukuba

Ippei Suzuki
University of Tsukuba

Shinji Sakamoto
University of Tsukuba

Keita Kanai
University of Tsukuba

Kazuki Takazawa
University of Tsukuba

Hiraku Doi
University of Tsukuba

Yoichi Ochiai
University of Tsukuba



Guidance Field: Vector Field for Implicit Guidance in Virtual Environments

This new guidance method implicitly leads users to pre-defined points in virtual environments while allowing free exploration, by altering users' inputs according to a vector field (the guidance field).
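As a rough illustration of the idea, the sketch below blends a user's raw input direction with a field vector pointing toward the nearest pre-defined target; the function names, blending weight, and nearest-target field are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def guidance_vector(position, targets, strength=0.3):
    """Return a small vector nudging the user toward the nearest target (assumed field)."""
    targets = np.asarray(targets, dtype=float)
    diffs = targets - np.asarray(position, dtype=float)
    nearest = diffs[np.argmin(np.linalg.norm(diffs, axis=1))]
    norm = np.linalg.norm(nearest)
    return strength * nearest / norm if norm > 1e-6 else np.zeros(len(position))

def adjusted_input(user_dir, position, targets):
    """Blend the raw input direction with the guidance field and renormalize."""
    blended = np.asarray(user_dir, dtype=float) + guidance_vector(position, targets)
    n = np.linalg.norm(blended)
    return blended / n if n > 1e-6 else blended

# Example: the user pushes straight ahead; the output is gently biased toward a target.
print(adjusted_input([0.0, 1.0], [0.0, 0.0], [[5.0, 5.0]]))
```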

Ryohei Tanaka
The University of Tokyo

Takuji Narumi
The University of Tokyo

Tomohiro Tanikawa
The University of Tokyo

Michitaka Hirose
The University of Tokyo



HALUX: Projection-Based Interactive Skin for Digital Sports

Attempts to add haptic sensations to whole-body interactive experiences must overcome latency, which becomes a critical issue because it generates spatial disparity. HALUX places phototransistors and vibrators on the body and drives them with projector light, eliminating the latency introduced by communication.

Haruya Uematsu
The University of Electro-Communications

Daichi Ogawa
The University of Electro-Communications

Ryuta Okazaki
The University of Electro-Communications

Taku Hachisu
The University of Electro-Communications

Hiroyuki Kajimoto
The University of Electro-Communications

Takahiro Shitara
The University of Electro-Communications

Keisuke Hoshino
The University of Electro-Communications

Khurelbaatar Sugarragchaa
The University of Electro-Communications



HapTONE: Haptic Instrument for Enriched Musical Play

HapTONE delivers high-fidelity vibrotactile sensations, not only at the moment of key input but also as keys are pressed. It features key units with vibrators and distance sensors, and can reproduce the touch sensations of keyboard, stringed, percussion, and even non-musical instruments.

Daichi Ogawa
The University of Electro-Communications

Kenta Tanabe
The University of Electro-Communications

Vibol Yem
The University of Electro-Communications

Taku Hachisu
University of Tsukuba

Hiroyuki Kajimoto
The University of Electro-Communications

Akifumi Takahashi
The University of Electro-Communications

Ayaka Nishi
The University of Electro-Communications



HapticWave: Directional Surface Vibrations Using Wave-Field Synthesis

HapticWave is a novel haptic technology that delivers directional haptic sensations generated on a flat surface, without requiring users to wear a physical device.
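For readers unfamiliar with the underlying principle, the following sketch shows plain delay-and-sum steering across surface actuators, a simplified stand-in for wave-field synthesis; the geometry, wave speed, and function names are assumptions, not details of the HapticWave hardware.

```python
import numpy as np

WAVE_SPEED = 80.0  # m/s, assumed propagation speed of vibration in the surface

def actuator_delays(actuator_positions, direction):
    """Per-actuator delays (seconds) so the superposed wave travels along `direction`."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    # Actuators further along the travel direction emit later, reinforcing a wavefront
    # that appears to sweep across the surface from the chosen direction.
    projections = np.asarray(actuator_positions, dtype=float) @ d  # meters along direction
    delays = projections / WAVE_SPEED
    return delays - delays.min()  # shift so all delays are non-negative

# Four actuators at the corners of a 0.4 m plate, vibration steered along +x:
corners = [(0.0, 0.0), (0.4, 0.0), (0.0, 0.4), (0.4, 0.4)]
print(actuator_delays(corners, (1.0, 0.0)))
```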

Ravish Mehra
Oculus VR

Sean Keller
Oculus Research

David Perek
Oculus Research

Christoph Hohnerlein
Oculus Research

Elia Gatti
Oculus Research

Riccardo DeSalvo
Oculus Research



Laplacian Vision: Augmenting Motion Prediction via Optical See-Through Head-Mounted Displays and Projectors

This vision-augmentation system aims to assist the human ability to predict the short-term future by visualizing trajectory information of real-world objects via an optical see-through head-mounted display and a projector.
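The prediction step implied here can be pictured with a simple constant-gravity model; the sketch below is an assumed stand-in for the system's actual estimator, meant only to show how sampled future positions could feed a trajectory overlay.

```python
GRAVITY = (0.0, -9.81)  # m/s^2, assumed planar ballistic model

def predict_trajectory(position, velocity, horizon=0.5, steps=10):
    """Return `steps` future (x, y) points over `horizon` seconds for a tracked object."""
    px, py = position
    vx, vy = velocity
    points = []
    for i in range(1, steps + 1):
        t = horizon * i / steps
        points.append((px + vx * t + 0.5 * GRAVITY[0] * t * t,
                       py + vy * t + 0.5 * GRAVITY[1] * t * t))
    return points

# Example: a ball at the origin moving 2 m/s sideways and 3 m/s upward.
for p in predict_trajectory((0.0, 0.0), (2.0, 3.0)):
    print(round(p[0], 2), round(p[1], 2))
```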

Yuta Itoh
Keio University

Yuichi Hiroi
Keio University

Jiu Otsuka
Keio University

Maki Sugimoto
Keio University

Jason Orlosky
Osaka University

Kiyoshi Kiyokawa
Osaka University

Gudrun Klinker
Technische Universität München



Layered Telepresence: Simultaneous Multi-Presence Experience Using Eye-Gaze-Based Perceptual Awareness Blending

Layered Telepresence allows a viewer to experience simultaneous multi-presence by blending eye gaze and perceptual awareness of real-time audio/visual information received from multiple telepresence robots.

MHD Yamen Saraiji
Keio University

Shota Sugimoto
Keio University

Charith Fernando
Keio University

Kouta Minamizawa
Keio University

Susumu Tachi
The University of Tokyo



LiDARMAN: Reprogramming Reality With Egocentric Laser Depth Scanning

LiDARMAN reprograms reality by substituting visual perception. Using a Light Detection And Ranging (LiDAR) sensor to provide altered vision, the system offers a 3D reconstructed view from outside the body and enables novel applications.

Takashi Miyaki
The University of Tokyo

Jun Rekimoto
The University of Tokyo, Sony Computer Science Laboratories, Inc.



LightAir: A Novel System for Tangible Communication With Quadcopters Using Foot Gestures and Projected Images

This new paradigm of human-drone interaction combines foot gestures with images projected on the road. The system allows creation of a new type of tangible interaction with drones for augmented sports.

Dzmitry Tsetserukou
Skolkovo Institute of Science and Technology

Mikhail Matrosov
Skolkovo Institute of Science and Technology

Olga Volkova
Skolkovo Institute of Science and Technology



Perceptually Based Foveated Virtual Reality

This installation allows attendees to compare several perceptually based foveated rendering techniques running live inside a virtual reality headset with integrated eye-tracking. Foveated rendering lowers computational costs in the image periphery and, with carefully selected foveation algorithms, does not noticeably affect perceptual image quality.
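The core idea can be summarized as mapping eccentricity (angular distance from the tracked gaze point) to a coarser shading rate; the sketch below uses assumed thresholds and rates and is not NVIDIA's algorithm.

```python
import math

def shading_rate(pixel, gaze, pixels_per_degree=20.0):
    """Return a coarseness factor (1 = full rate) based on angular distance from gaze."""
    ecc_deg = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1]) / pixels_per_degree
    if ecc_deg < 5.0:    # fovea: shade every pixel
        return 1
    if ecc_deg < 15.0:   # near periphery: shade every 2x2 block
        return 2
    return 4             # far periphery: shade every 4x4 block

print(shading_rate((1800, 900), (960, 540)))  # a peripheral pixel -> coarse rate
```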

Anjul Patney
NVIDIA Corporation

Joohwan Kim
NVIDIA Corporation

Marco Salvi
NVIDIA Corporation

Anton Kaplanyan
NVIDIA Corporation

Chris Wyman
NVIDIA Corporation

Nir Benty
NVIDIA Corporation

Aaron Lefohn
NVIDIA Corporation

David Luebke
NVIDIA Corporation



Phyxel: Realistic Display Using Physical Objects With High-Speed Spatially Pixelated Lighting

This project explores computational display of shape and appearance. Based on persistence of vision, the system coordinates high-speed adaptive lighting and periodic motion of physical materials. It can be easily manipulated to reproduce images that are consistent with real human scene perception.

Takatoshi Yoshida
The University of Tokyo

Yoshihiro Watanabe
The University of Tokyo

Masatoshi Ishikawa
The University of Tokyo



Ratchair: Furniture That Learns to Move Itself With Vibration

This project presents a strategy for displacing big objects by attaching relatively small vibration sources. After learning how several random bursts of vibration affect its pose, an optimization algorithm discovers the optimal sequence of vibration patterns required to (slowly but surely) move the object to a specified position.
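The learn-then-plan loop described here could look roughly like the following sketch, which averages observed pose changes per vibration pattern and then greedily picks patterns that shrink the remaining error; the data layout and greedy planner are assumptions, not the authors' optimizer.

```python
import numpy as np

def learn_effects(samples):
    """samples: list of (pattern_id, observed_pose_delta). Average the effect per pattern."""
    effects = {}
    for pid, delta in samples:
        effects.setdefault(pid, []).append(np.asarray(delta, dtype=float))
    return {pid: np.mean(deltas, axis=0) for pid, deltas in effects.items()}

def plan(effects, start, goal, max_steps=50):
    """Greedy sequence of vibration patterns moving the pose from start toward goal."""
    pose = np.asarray(start, dtype=float)
    goal = np.asarray(goal, dtype=float)
    sequence = []
    for _ in range(max_steps):
        pid, delta = min(effects.items(),
                         key=lambda kv: np.linalg.norm(goal - (pose + kv[1])))
        if np.linalg.norm(goal - (pose + delta)) >= np.linalg.norm(goal - pose):
            break  # no learned pattern improves the error any further
        pose = pose + delta
        sequence.append(pid)
    return sequence, pose
```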

Tetiana Pershakova
Korea Advanced Institute of Science and Technology

Minjoo Cho
Korea Advanced Institute of Science and Technology

Alvaro Cassinelli
Korea Advanced Institute of Science and Technology

Daniel Saakes
Korea Advanced Institute of Science and Technology



Unlimited Corridor: Redirected Walking Techniques Using Visuo-Haptic Interaction

Unlimited Corridor enables users to walk freely around virtual environments by touching walls, although in reality they are walking around circular walls within a limited space. Attendees experience a VR search for confidential information in a skyscraper.
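Conceptually this is a curvature-gain form of redirected walking; the toy sketch below, with an assumed radius and step length rather than the authors' parameters, shows how a full physical lap can correspond to a straight virtual path.

```python
import math

RADIUS = 3.0  # meters: radius of the physical circular wall (assumed)
STEP = 0.5    # meters per step (assumed)

phys_heading = 0.0
phys_pos = [0.0, 0.0]   # where the user actually walks
virt_pos = [0.0, 0.0]   # where the user appears to walk in VR

for _ in range(38):  # roughly one lap around the real circle
    phys_heading += STEP / RADIUS                 # turn induced by the curved wall
    phys_pos[0] += STEP * math.cos(phys_heading)  # physical path curves back on itself ...
    phys_pos[1] += STEP * math.sin(phys_heading)
    virt_pos[0] += STEP                           # ... while the virtual path stays straight

print([round(v, 2) for v in phys_pos], [round(v, 2) for v in virt_pos])
```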

Keigo Matsumoto
The University of Tokyo

Yuki Ban
The University of Tokyo

Takuji Narumi
The University of Tokyo

Yohei Yanase
Unity Technologies Japan

Tomohiro Tanikawa
The University of Tokyo

Michitaka Hirose
The University of Tokyo



VR Technologies for Rich Sports Experiences

NTT is developing progressive VR technologies to provide sports experiences that enhance spectator enjoyment and player performance.

Daisuke Ochi
Nippon Telegraph and Telephone Corporation

Akio Kameda
Nippon Telegraph and Telephone Corporation

Kosuke Takahashi
Nippon Telegraph and Telephone Corporation

Motohiro Makiguchi
Nippon Telegraph and Telephone Corporation

Kouta Takeuchi
Nippon Telegraph and Telephone Corporation



X-SectionScope: Cross-Section Projection in Light-Field Clone Image

A novel interactive 3D information-visualization display that superimposes a cross-sectional image on an aerial volumetric image of a real object.

Yoshikazu Furuyama
The University of Tokyo

Atsushi Matsubayashi
The University of Tokyo

Yasutoshi Makino
The University of Tokyo

Hiroyuki Shinoda
The University of Tokyo



Yadori: Mask-Type User Interface for Manipulation of Puppets

This system for animatronic storytelling enables performers to manipulate puppets by wearing a mask-type device, allowing easier and more intuitive control.

Mose Sakashita
University of Tsukuba

Keisuke Kawahara
University of Tsukuba

Amy Koike
University of Tsukuba

Kenta Suzuki
University of Tsukuba

Ippei Suzuki
University of Tsukuba

Yoichi Ochiai
University of Tsukuba



ZoeMatrope: A System for Physical Material Design

This project introduces ZoeMatrope: a material display that can present and animate realistic materials by compositing real objects. ZoeMatrope can display a variety of materials, including diffuse, specular, transparent, spatially varying, and even augmented objects, with the high resolution, dynamic range, and light-field fidelity that originate from real objects.

Leo Miyashita
The University of Tokyo

Kota Ishihara
PKSHA Technology Inc.

Yoshihiro Watanabe
The University of Tokyo

Masatoshi Ishikawa
The University of Tokyo