2017
Virtual Physicality | Physical Virtuality

Exploring mixed realities

Supervision
Prof. Dr. Claudia Müller-Birn, Freie Universität Berlin
Prof. Carola Zwick, Kunsthochschule Berlin-Weißensee
Judith Glaser, Kunsthochschule Berlin-Weißensee

Augmented reality and virtual reality offer various forms of “mixed realities.” Physical objects and spaces can be enriched with layers of information, thus creating faceted and bespoke, multi-sensory experiences. The ambition of this project was to explore the specific qualities of these technologies and to create meaningful concepts for new hybrid realities and tools.

The goal for each interdisciplinary team was to create a functional prototype to test their concept. Embodied interaction principles and the presence or absence of gravity and other tacit perceptual phenomena served as starting points for designing for and with these technologies. Unity and/or A-Frame were the tools used to create these interactive experiences.

Aux Synesthesia

Peter Sörries, Design
Jan Batelka, Computer Science
Thushan Satkunanathan, Computer Science

Inspired by bats, which perceive their environment by means of echolocation, and by blind people, who compensate for the visual perception channel with clicking sounds, we asked whether an acoustic spatial experience on a performative level is also possible for people who use all their senses to the same extent. The impetus is to understand the heightened potential of the sense of hearing and to make it narratively accessible in a concrete application scenario.

Combining a digital and a physical prototype, the wearable artifact “reads in” the real space. Comparable to a Geiger counter, the environment is mapped using Kinect technology: approaching objects are detected and interpreted as a digital stereo signal whose frequency adapts to the current conditions. This acoustic scenario is supported by an abstracted Unity visualization, which allows the wearer of a head-mounted display to immerse themselves in the augmented reality we developed and to partially “appear” in the environment, coupled with the captured Kinect data. Looking ahead, we see potential in acoustic material recognition: analog tests conducted in advance showed that objects have individual acoustic properties caused by their materiality or surface. It is therefore conceivable, for example, to optimize room resonance by means of specifically placed objects with the help of the prototypes we have developed.
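As a minimal sketch of the Geiger-counter analogy, the following Python snippet maps the nearest obstacle in a depth frame to a click rate and a stereo pan position. The value ranges and function names are illustrative assumptions; the actual prototype couples the Kinect data to Unity.

```python
import numpy as np

def sonify_depth(depth_frame, min_d=0.5, max_d=4.0):
    """Geiger-counter-style sonification sketch: find the nearest
    obstacle in a 2-D depth frame (values in meters) and map it to
    a click rate plus a stereo pan position.
    The depth source (Kinect) and audio output are not shown."""
    in_range = (depth_frame > min_d) & (depth_frame < max_d)
    if not in_range.any():
        return 0.0, 0.0                      # nothing in range: silence
    nearest = depth_frame[in_range].min()
    # Closer objects -> faster clicks (2..40 clicks/s, a guessed range).
    proximity = 1.0 - (nearest - min_d) / (max_d - min_d)
    click_rate = 2.0 + 38.0 * proximity
    # Pan follows the horizontal position of the nearest point
    # (-1.0 = hard left, +1.0 = hard right in the stereo signal).
    _, cols = np.where((depth_frame == nearest) & in_range)
    pan = 2.0 * cols[0] / (depth_frame.shape[1] - 1) - 1.0
    return click_rate, pan
```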

Project Documentation ↗︎

GitHub ↗︎

Bottage

Siyu Lou, Design
Toni Wirth, Computer Science
Serkan Baris, Computer Science

Bottage is a minimal communication device. When designing Bottage, our aim was to create a new way of communicating, especially between two separate places. Bottage was inspired by the drift bottle: it records not only our voice messages but also our emotions. Unlike traditional channels such as the telephone, the message stays in the bottle until we open it. The emotions are visualized by LED light refracted through a glass lens.

The Bottage system consists of two bottles. When we speak into bottle A, the built-in chip records the voice, analyzes it, and transfers the data to bottle B. Bottle B receives the analysis of the sound and visualizes the emotion by controlling the vibration of the lens and the color and flashing frequency of the LED. When we open bottle B, we not only hear the voice but also see the other person’s emotional portrait: red represents negative emotion, white positive feelings, and the flashing frequency of the LED indicates how strong the emotion is.
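The color and frequency mapping can be sketched in a few lines of Python. The emotion analysis itself (valence and intensity extracted from the voice) is treated as a given here, and the blink-rate range is an assumption:

```python
def led_signal(valence, intensity):
    """Map an analyzed emotion to the LED behavior described above.
    valence: -1.0 (negative) .. 1.0 (positive), from the voice analysis.
    intensity: 0.0 (weak) .. 1.0 (strong emotion)."""
    # Red for negative emotion, white for positive feelings (RGB).
    color = (255, 0, 0) if valence < 0 else (255, 255, 255)
    # Stronger emotion -> faster flashing; 0.5-5 Hz is a guessed range.
    blink_hz = 0.5 + 4.5 * max(0.0, min(1.0, intensity))
    return color, blink_hz
```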

GitHub ↗︎

Comprehensive Learning

Alissa Wolter, Design
William Gu, Computer Science
Lutz Schäfer, Computer Science
Severina Virovska, Computer Science

Our concept “Comprehensive Learning” addresses the topic of learning, especially in chemistry. It is a modular kit with which students can experience chemistry experiments in groups in a playful way. Students interact with modules of different sizes that represent elements of the periodic table. There are several experiments to choose from; an overview of them is projected onto the workspace by a projector. After an experiment has been selected, the program guides the students step by step through the entire procedure. As soon as an experiment has been completed successfully, haptic and auditory feedback as well as background information on the respective element appears on the workspace.

With our modular kit, students can carry out varied series of experiments both together with teachers and independently. In contrast to the common frontal teaching, we offer an unconventional alternative in which students can learn and consolidate engaging content in a playful way. In addition, we see the digital interface as an opportunity to continually extend our concept to further learning processes.
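The step-by-step guidance could be modeled as a simple state machine over the tracked modules. The following Python sketch is illustrative only; the module IDs, step texts, and matching logic are assumptions, not the kit’s actual implementation:

```python
class GuidedExperiment:
    """Illustrative sketch of the projected, step-by-step guidance.
    Each step names the element module the students must place next."""

    def __init__(self, name, steps):
        self.name = name
        self.steps = steps      # list of (module_id, instruction) pairs
        self.current = 0

    def place_module(self, module_id):
        """Called whenever a tracked module lands on the workspace."""
        expected, _ = self.steps[self.current]
        if module_id != expected:
            return f"Not yet - this step needs the '{expected}' module."
        self.current += 1
        if self.current == len(self.steps):
            return "Done! Play feedback and show facts about the elements."
        return "Correct! " + self.steps[self.current][1]

# Hypothetical example experiment:
water = GuidedExperiment("Synthesize water", [
    ("H", "Place the first hydrogen module."),
    ("H", "Place the second hydrogen module."),
    ("O", "Now add one oxygen module."),
])
```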

Code ↗︎

Shift

Ningyuan Xu, Design
Julia Schmidt, Design
Alexis Iakovenko, Computer Science

“SHIFT – virtual hand therapy” is a tool that supports patients during the acute phase of hand therapy by delivering precise instructions and feedback in order to build motivation. The human hand is a multifunctional tool that is important for all kinds of everyday tasks, and regaining lost hand function is the goal of hand therapy. Because the treatment sessions are only temporary, exercises at home are decisive for a successful recovery, and for those, discipline and motivation are essential. Still, the majority of patients feel lost because they cannot remember their therapist’s exercises and instructions, and there is no feedback.

SHIFT allows you to record and save your individual movements together with your therapist. It works in connection with a sensor device that tracks finger and hand movements and saves them as movement patterns. At home, you follow your recorded movement on the screen: the overlay of the recorded motion, represented as a virtual hand, and your actual movement, represented as a shadow, gives feedback on how closely the two match. A background pattern appears throughout the exercise, giving additional feedback on the execution.

To develop the project further, the prototype needs to be improved, and more precise hardware is required to track movements accurately; user tests would follow. An additional function could be measuring the limitation of movement before and after the treatment.
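The comparison driving the overlay feedback can be approximated as a distance between the recorded and the live pose. A minimal Python sketch, assuming both poses arrive as arrays of per-joint angles; the joint set and thresholds are hypothetical:

```python
import numpy as np

def pose_deviation(recorded, live):
    """Mean absolute difference between the recorded movement pattern
    and the live pose, both given as per-joint angles in degrees."""
    recorded = np.asarray(recorded, dtype=float)
    live = np.asarray(live, dtype=float)
    return float(np.mean(np.abs(recorded - live)))

def overlay_feedback(deviation, good=10.0, close=25.0):
    """Coarse feedback level for the virtual-hand/shadow overlay;
    the degree thresholds are guessed values, not clinical ones."""
    if deviation <= good:
        return "match"
    if deviation <= close:
        return "close"
    return "off"
```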

Code ↗︎

Alice

Johanna Ewert, Design
Perrine Trachsel, Computer Science
Lucas Antelo Blanco, Computer Science
János Brodbeck, Computer Science

Alice is a virtual reality first aid program that aims to train you to act calmly and responsibly in emergency situations. It puts you in an everyday situation in which you are suddenly confronted with an emergency that demands your reaction.

It’s an easily transportable learning tool based on a pair of VR goggles and a hand tracker. As a user, you’re able to move your hands freely, which makes the interaction more natural than if you had to use controllers. As in any ordinary first aid course, we use a dummy. This enables you to feel a physical response while performing life-saving actions such as CPR. To make it even more realistic, we developed a system that allows you to actually touch and feel the objects in the virtual space. We use a modular system consisting of five cubes. For each scenario, the cubes have to be placed in their exact positions; the virtual picture then works like an overlay. Virtual objects, which normally have no physical body, can now be touched and interacted with. For example, when you see a chair in the virtual world, you can simply take a seat as you would in the real world.
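Conceptually, the overlay only works once every physical cube stands where its virtual counterpart expects it. A small Python sketch of such an alignment check, with entirely hypothetical cube names, positions, and tolerance:

```python
import numpy as np

# Hypothetical target layout for one scenario: where each of the five
# cubes must stand (x, y, z in meters, in the room's coordinate frame).
SCENARIO_LAYOUT = {
    "chair": (1.0, 0.0, 0.5),
    "table": (2.0, 0.0, 1.2),
    "door": (0.0, 0.0, 2.5),
    "shelf": (2.5, 0.0, 0.2),
    "dummy": (1.5, 0.0, 1.8),
}

def misaligned_cubes(tracked, tolerance=0.05):
    """Return the cubes farther than `tolerance` meters from their
    target; an empty list means the virtual overlay is valid.
    `tracked` maps cube names to measured (x, y, z) positions."""
    off = []
    for cube, target in SCENARIO_LAYOUT.items():
        d = np.linalg.norm(np.asarray(tracked[cube]) - np.asarray(target))
        if d > tolerance:
            off.append(cube)
    return off
```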

Air

Ylenia Gortana, Design
Gerold Schneider, Computer Science
Emil Milanov, Computer Science

A musical instrument free of physical constraints.

How might we create a musical instrument that is free of the constraints a real instrument’s physicality imposes? How might we turn anything into an instrument? How can intuitive motion be translated into music? To answer these questions, over the course of three months in autumn/winter 2017/18 we developed an integrated hardware and software system that allows the user to explore sound and music creation via dynamic hand and arm motion. The system consists of three parts: two wireless motion pickup devices, one for each hand, and a GUI application called EXI for filtering, processing, and shaping the raw sensor data from the pickups and connecting the results to the control channels of a wavetable synthesizer. EXI is built using Cycling ’74’s Max, while the pickups are a custom hardware solution developed as part of the project, with firmware built on the Teensy and Arduino frameworks. The current state of the system prototype is promising: it seems possible to develop the project into a marketable product within a reasonable time frame, and no technological, economic, or design obstacles have emerged so far.
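The filtering-and-mapping stage that EXI performs in Max can be illustrated in Python: exponential smoothing of one raw sensor axis, scaled to a 0–127 control value such as a MIDI CC driving one synthesizer channel. The class, parameter names, and ranges are assumptions:

```python
class MotionToControl:
    """Sketch of one pickup -> EXI -> synthesizer control channel:
    smooth a raw accelerometer axis and scale it to a 7-bit value."""

    def __init__(self, alpha=0.2, lo=-2.0, hi=2.0):
        self.alpha = alpha           # smoothing factor, 0 < alpha <= 1
        self.lo, self.hi = lo, hi    # assumed sensor range in g
        self.level = 0.0

    def step(self, raw_sample):
        # Exponential moving average tames sensor jitter.
        self.level += self.alpha * (raw_sample - self.level)
        # Normalize to 0..1, clamp, and scale to 0..127.
        x = (self.level - self.lo) / (self.hi - self.lo)
        return int(round(127 * min(1.0, max(0.0, x))))
```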

Code ↗︎

Gems

Yi-Ting Chen, Design
Nitzan Ron, Design
Maximilian Stendler, Computer Science
Antje-Carolin Goldau, Computer Science

GEMS is a smartphone application that visualizes intangible memories and associates them with the physical surroundings. By means of AR technology, GEMS creates a container for users to store and share their own memories, as well as to explore the memories of others. To record a memory, the user wears a Memory Device that captures the moment as sound. Later, the user can edit and upload the memory on their smartphone. A unique Memstone is then created from the sound recording and the weather conditions at the recorded place and time. Optionally, the user can also add text, pictures, or video to make the Memstone more vivid and colorful.

The Memstones are the interface between the digital data and the physical space. The user can open the app to view the Memstones with the AR camera and explore the different memories stored at a place. Furthermore, the user can use the smartphone to knock on a Memstone to hear the sound recording and see additional information about the memory.
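A possible data model for a Memstone, sketched in Python; the field names and the knock handler are assumptions for illustration, not the app’s actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Memstone:
    """One sound memory anchored to a place, enriched with the weather
    at the recorded place and time."""
    audio_path: str                  # recording from the Memory Device
    lat: float                       # anchor position in the world
    lon: float
    captured_at: datetime
    weather: str                     # e.g. "light rain, 8 °C"
    text: str = ""                   # optional extras added later
    media: list = field(default_factory=list)   # pictures, videos

def on_knock(stone: Memstone) -> dict:
    """What the app plays and shows when the user knocks on a
    Memstone through the AR camera."""
    return {
        "play": stone.audio_path,
        "overlay": {"weather": stone.weather,
                    "text": stone.text,
                    "media": stone.media},
    }
```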

GEMS is designed to lay the foundation for many possibilities: (1) sharing memories with others gives GEMS the opportunity to become a social media platform, (2) in combination with special places and monuments, GEMS turns the whole world into an interactive museum, and (3) connecting metadata to places could be interesting for historians, geologists, and tourism.

Code ↗︎
