
Mixed Reality: Serious Games For Creative Leaps In Learning


Via: University Of Central Florida - Institute For Simulation & Training

The Media Convergence Laboratory (MCL) is a partnership between the Institute for Simulation and Training, the School of Electrical Engineering and Computer Science, and the Digital Media Department at UCF.

MCL pursues creative leaps in experiential media innovation. Through interdisciplinary research and cross-industry applications of its core multi-sensory simulation technology, the lab aims to have a positive impact on many sectors of society.


Its research focus is on the use of mixed reality (the seamless merging of real and virtual content) to create experiences that engage and improve the performance of users.

The vision is to melt the boundaries between reality and the imagination by creating compelling, interactive, simulated experiences that spark the imagination, enlighten the mind, immerse the body and engage the spirit.

Combining real and virtual images is the stock in trade of the Media Convergence Lab at UCF’s Institute for Simulation and Training.

MR Engine diagram

Current applications of the lab’s research include situational awareness training, teacher screening and training, creative collaboration, experiential entertainment, free-choice learning, naturalistic decision-making, virtual heritage social networks, and cognitive and physical rehabilitation.


Here are some examples:

MR Kitchen

The challenge: Retraining a brain-injury victim to function in his own kitchen without exposing him to sharp objects or hot burners. The solution: Take pictures of the injured person’s actual kitchen. Build a bare-bones dummy version of the kitchen in the lab. Then use a head-mounted, goggle-type display to combine the pictures with the dummy kitchen, creating a safe “reality” in which the injured person can relearn kitchen tasks without fear of injury.
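In compositing terms, the approach amounts to registering imagery of the real kitchen onto the tracked mockup and blending it with the live view from the head-mounted camera. The sketch below, in Python with OpenCV and NumPy, illustrates one way such a homography-based overlay could work; the function and variable names are illustrative assumptions, not the lab’s actual pipeline.

```python
# Illustrative sketch of video see-through compositing for the MR Kitchen,
# assuming homography-based registration of kitchen photos onto the mockup
# surfaces seen by the head-mounted camera. Not the lab's code.
import cv2
import numpy as np

def composite_kitchen_view(camera_frame, kitchen_photo,
                           photo_corners, mockup_corners, alpha=0.7):
    """Warp a photo of the real kitchen onto the mockup surface in the
    head-mounted camera frame, then blend it with the live view."""
    h, w = camera_frame.shape[:2]
    # Map the photo's corners to where the tracked mockup appears in the frame.
    H, _ = cv2.findHomography(np.float32(photo_corners),
                              np.float32(mockup_corners))
    warped = cv2.warpPerspective(kitchen_photo, H, (w, h))
    # Blend only where the warped photo has content, so the user's hands and
    # real props stay visible through the overlay.
    mask = (warped.sum(axis=2) > 0).astype(np.float32)[..., None]
    blended = alpha * mask * warped + (1 - alpha * mask) * camera_frame
    return blended.astype(np.uint8)
```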

A test kitchen incorporates mixed reality

NSF Forest Fire

The goal of this work is to visualize forest fires faster than real time, at a level of detail realistic enough that the visualizations can be used in environmental policy experiments and in ecology displays at museums.


This means that the visualization must be driven by an ecologically correct fire simulation. For this reason, the front end is the program Farsite, a system accepted by the community of forest fire management experts. It also means that the user must have free movement within and above the forest.
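Farsite’s spatial outputs can be exported as rasters, such as a grid of fire arrival times, which a renderer can step through. The fragment below is a minimal sketch assuming an ESRI ASCII grid export with arrival times in minutes since ignition; the file name and the coupling scheme are assumptions made for illustration, not the project’s actual interface.

```python
# Minimal sketch of driving a visualization from a Farsite arrival-time
# raster, assuming an ESRI ASCII grid export (file name is hypothetical).
import numpy as np

def load_arrival_times(path="farsite_arrival_time.asc"):
    """Read an ESRI ASCII grid of fire arrival times (minutes) into an array."""
    header = {}
    with open(path) as f:
        for _ in range(6):                      # ncols, nrows, xllcorner, ...
            key, value = f.readline().split()
            header[key.lower()] = float(value)
        grid = np.loadtxt(f)
    grid[grid == header.get("nodata_value", -9999.0)] = np.nan
    return grid, header

def burned_cells(arrival_minutes, sim_minutes):
    """Boolean mask of cells the fire front has reached at the current time."""
    return arrival_minutes <= sim_minutes
```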


The simulation will cover about 30 years in an hour. Participants will control how they view the environment by walking through the forest, flying over it, or choosing a predefined path.
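The stated pacing, roughly 30 simulated years per wall-clock hour, implies a time compression factor on the order of 260,000. A small back-of-the-envelope sketch:

```python
# Back-of-the-envelope arithmetic for "about 30 years in an hour".
# The constants restate the article's figures; the stepping is illustrative.
SIM_YEARS = 30
WALL_HOURS = 1
HOURS_PER_YEAR = 24 * 365

speedup = SIM_YEARS * HOURS_PER_YEAR / WALL_HOURS   # ~262,800x real time

def simulated_years_elapsed(wall_seconds):
    """Simulated years that have passed after a given wall-clock time."""
    return (wall_seconds / 3600.0) * (SIM_YEARS / WALL_HOURS)

print(f"speedup: about {speedup:,.0f}x real time")
print(f"after 10 minutes at the display: {simulated_years_elapsed(600):.0f} simulated years")
```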

This spring, a team comprising Hughes, UCF economics professors Glenn Harrison and Lisa Rutstrom, and IST’s Steve Fiore will look at whether people who experience the devastation of a forest fire in mixed reality make different public policy decisions from those who only read about the fire’s impact.

Participants will view a desktop 3-D scenario of Volusia County. As they make their way virtually through the forest, they will use real money, $80 or $100, to decide whether to spend it on controlled burns and other prevention policies.

The technology could be used to look at other issues, Harrison says, such as hurricane and land-use planning.

Enhancing Museum Exhibitions using Mixed Reality

Current Mixed Reality experiences focus primarily on training, design and entertainment. This project presents a very different application, scientific visualization, and its use in informal education.

Specifically, it describes a case study that extends an existing museum dinosaur exhibit to include an encounter with ancient sea life.

The real-world assets and environment are augmented and, in some cases, occluded by the virtual entities that inhabited the seas at the time of the dinosaurs. Achieving this blending of the real and virtual motivated the development of novel real-time computer graphics algorithms and distributed simulation protocols, as well as new conventions in the creation and production of non-linear MR experiences.
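One rendering problem this raises is letting virtual creatures pass both in front of and behind real exhibit pieces. A minimal sketch of per-pixel, depth-based occlusion compositing is shown below; it assumes depth values are available for both the real scene (for example, from a model of the exhibit hall) and the rendered creatures, and all names are illustrative rather than the project’s actual algorithms.

```python
# Minimal sketch of real/virtual occlusion: a virtual creature's pixel is
# shown only where it is closer to the viewer than the real scene.
# Assumes per-pixel depth for both layers; names are illustrative.
import numpy as np

def occlusion_composite(real_rgb, real_depth, virtual_rgb, virtual_depth, virtual_mask):
    """Composite rendered creatures over the camera view with depth-based occlusion."""
    # A virtual pixel wins only where the creature exists and is nearer than the real scene.
    show_virtual = virtual_mask & (virtual_depth < real_depth)
    out = real_rgb.copy()
    out[show_virtual] = virtual_rgb[show_virtual]
    return out
```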