VR Mental Health

In this project, we use 360-degree VR video as an immersive, interactive storytelling method for health promotion. Talking about one’s feelings is helpful; however, due to stigma, many people with depression keep their suffering to themselves. In this VR environment, 360-degree videos reproduce a counseling session that gives users an opportunity to vent without fear of stigma or discrimination.

  • Date
    • 2017-Current
  • Roles
    • Project co-lead
    • Development support, troubleshooting
  • Related Publications
    • Syed Ali Hussain, Taiwoo Park, Irem Gokce Yildirim, and Zihan Xiang. Virtual Reality-based Counseling for People with Mild Depression. HCI International 2018, ICA 2018.

Modifying Visual Stimuli in Virtual Reality to Reduce Hand Tremor in Micromanipulation Tasks

  • Current experiment environment. By adding an extra IMU sensor, we aim to increase the precision of location tracking.

Involuntary hand tremor has been a serious challenge in micromanipulation tasks and thus draws significant attention from related fields. To minimize the effect of hand tremor, a variety of mechanically assistive solutions have been proposed. However, approaches that increase people's awareness of their own hand tremor have not been extensively studied. In this paper, a head-mounted-display-based virtual reality (VR) system to increase self-awareness of hand tremor is proposed. It shows the user a virtual image of a handheld device with emphasized hand tremor information. We hypothesize that, provided with this emphasized tremor information, subjects will control their hand tremor more effectively. Two methods of emphasizing hand tremor information are demonstrated: (1) direct amplification of the tremor and (2) magnification of the virtual object, compared against a control condition without emphasized tremor information. A human-subject study with twelve trials was conducted with four healthy participants, who performed a task of holding a handheld gripper device in a specific direction. The results showed that the proposed methods achieved a reduced level of hand tremor compared with the control condition.
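The first method, direct amplification of the tremor, can be sketched as a simple signal-processing loop: low-pass filter the tracked hand position to estimate voluntary motion, treat the high-frequency residual as tremor, and render the handheld device with that residual amplified. The function names, the exponential-moving-average filter, and the constants below are illustrative assumptions, not the paper's actual implementation.

```python
def make_tremor_amplifier(gain=3.0, alpha=0.1):
    """Return a function mapping raw tracked positions (1-D here for
    clarity) to displayed positions with the tremor component amplified
    by `gain`. `alpha` is the low-pass smoothing factor (assumed)."""
    smoothed = None

    def step(raw):
        nonlocal smoothed
        if smoothed is None:
            smoothed = raw                     # initialize low-pass state
        smoothed += alpha * (raw - smoothed)   # low-pass: voluntary motion
        tremor = raw - smoothed                # high-pass residual: tremor
        return smoothed + gain * tremor        # render with emphasized tremor
    return step
```

With `gain=1.0` the displayed position equals the raw tracked position; larger gains exaggerate only the fast residual, leaving slow voluntary movement untouched.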

We are extending the system to support more precise location tracking for micromanipulation simulation and assistance, using virtual reality technology including the Oculus Rift and HTC VIVE.

  • Related Publication
    • John Prada, *Taiwoo Park, Jintaek Lim, and *Cheol Song. Exploring the Potential of Modifying Visual Stimuli in Virtual Reality to Reduce Hand Tremor in Micromanipulation Tasks. Current Optics and Photonics. December 2017, 1(6), 642-648. (SCI-E Indexed)

JARVIS: Ubiquitous Mixed Reality Fitness Platform

  • Ubiquitous Mixed Reality Screenshot of JARVIS Fitness App

This project envisions genuinely ubiquitous mixed reality experiences. The next generation of mixed reality devices needs to incorporate the capability to closely interact with objects in a user’s surrounding environment, as proposed in Keiichi Matsuda’s concept of Hyper Reality. As a potential approach to enabling such capability, we propose leveraging Internet of Things (IoT) technologies that integrate sensors and connectivity into everyday objects such as toys, home appliances, and shopping carts. These sensors capture a user’s interaction with the objects in her vicinity, expanding the capability of mixed reality platforms beyond tracking users’ head and whole-body movements.

We describe the challenges of realizing the vision of ubiquitous mixed reality, and then present our ongoing work on developing a virtual fitness coach and its supporting platform as a concrete example of ubiquitous mixed reality technology.

This work is in press for the quarterly ACM SIGMOBILE Mobile Computing and Communications Review (GetMobile) and under major revision for Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT). Find the current draft of our vision paper here.

  • Date
    • 2015-Current
  • Role
    • Project Co-Lead, Frontend VR/AR Application Design and Development, User Experience Research
  • In collaboration with Mi Zhang (MSU), Youngki Lee (Singapore Management University).

Trapped! Escape room

Trapped! was a Halloween-themed virtual reality escape room in which multiple puzzles needed to be solved in order to escape the dungeon. Players were given access to numerous objects within immediate reach that pertained to three separate puzzles: mixing a potion in a certain order, identifying patterns in given objects, and recognizing feedback obtained through experimentation. Players learned about their environment through riddles presented in-game, allowing for a grounded approach to puzzle-solving. Furthermore, the dungeon was complete with a Halloween-themed aesthetic, including some original models and music from the team members themselves. Very spooky.

An escape room was previously developed by Byron Lau, Sage Miller, and Yilang Zhao for internal use. Trapped! was adapted from the structure of the previous iteration — the dimensions of the room, location of the door and keypad, etc. — but incorporated new puzzles and items, in addition to a complete environment overhaul.

  • Date
    • 2017-Current
  • Roles
    • Project co-lead
    • Development support and troubleshooting
    • Player experience researcher

VR Baseball Simulator

VR test sports simulator environment.

See how far you can hit pumpkins in this Halloween-themed VR baseball batting simulator! With a variety of bats, T-ball & Pitcher modes, and a detailed baseball stadium, enjoy sending pumpkins to infinity!

  • Date
    • 2017-Current
  • Roles
    • Project advisor
    • Contents director
    • Development support and troubleshooting
    • User experience specialist

JediFlight VR

JediFlight is an interaction mechanism built around virtual wings, where a player moves both the wings and their limbs to interact with the environment and other objects. The major contribution of the design is enabling a player to naturally and simultaneously move real and extended limbs (i.e., wings) in a multi-faceted interactive task setup, whereas previous work only explored the basic feasibility of extended-limb manipulation.

  • Date
    • 2017-Current
  • Roles
    • Project advisor
    • VR interaction design
  • Hardware
    • HTC VIVE and VIVE Trackers

MI491/891 Virtual Reality

Understand VR Technology and Create Your Own Virtual World.

Summer 2017

This course provides an overview of virtual reality, as well as a series of lab classes on how to quickly and effectively create virtual reality experiences. It introduces you to the history of virtual reality, a variety of VR application domains and their impacts, human factors, and major aspects of VR user experience. After the class, you are expected to understand the essential elements constituting virtual reality and how VR technologies can be used for public benefit, and to be able to turn your ideas into sample VR applications.

For the class, I have developed a lab curriculum based on GoogleVR and Unity3D, including supporting unitypackages that enable coding-free design and development of interactive VR apps. If you are interested in the lab curriculum, please find it in the following links. Should you have any questions, please contact me via email.

Materials are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Please notify Dr. Taiwoo Park (taiwoo.park_at_gmail.com) before you use the materials in a class or remix/publish them.

Also find student project examples in the following links:

MI845 Usability and Accessibility

Learn and Practice Usability Testing Methods.

Spring 2018

In this course, you will learn usability testing principles and strategies for planning and conducting an evaluation. Once the basics are established through early class meetings and group project lab discussion, you will focus your efforts on working with your team to plan, prepare, and conduct a usability evaluation. You will then analyze and present the results in a written report, oral presentation, and highlights tape or embedded video clips.

This course centers on a “sponsored” project, which comes from a virtual client with a real product. You will be working on the project with the sponsor from early in the semester, under the working assumption that the client engagement is real.

Touching the Virtual: Individual differences in approach and avoidance behaviors in VR

This study investigates the relationship between individual differences in motivational activation and approach/avoidance behaviors in a 3D virtual environment (VE). The primary hypotheses are that 1) motivational relevance shapes the facilitation or inhibition of behaviors while reaching for, holding, and manipulating virtual objects, and 2) variations in individuals’ trait appetitive system activation (ASA) and defensive system activation (DSA) moderate this relationship. In order to unobtrusively observe individuals’ unconscious and automatic behaviors, we measure eye gaze and the distance kept between the participant and virtual objects while they play a VR game involving a sorting task with emotional pictures. We expect that closer distance and longer visual inspection of virtual objects are associated with ASA, while greater distance and shorter interaction are related to DSA. The relationship between trait motivational activation and other individual factors, such as VR skill and experience, will also be investigated.
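As a rough illustration (not the study's actual analysis pipeline), the two behavioral proxies described above — distance kept from a virtual object and gaze dwell time on it — could be computed from per-frame logs like this. The sample format, field names, and the 90 Hz frame rate (typical of the HTC VIVE) are assumptions for this sketch.

```python
import math

def approach_metrics(samples, dt=1/90):
    """samples: list of dicts with 'head' and 'object' 3-D positions and a
    boolean 'gazed' flag marking frames in which gaze landed on the object.
    dt: frame period in seconds (90 Hz assumed for the HTC VIVE)."""
    # Per-frame Euclidean distance between head and object
    dists = [math.dist(s["head"], s["object"]) for s in samples]
    # Total time the gaze ray intersected the object
    dwell = dt * sum(1 for s in samples if s["gazed"])
    return {"mean_distance": sum(dists) / len(dists), "gaze_dwell_s": dwell}
```

Aggregating such metrics per participant would then allow correlating them with the ASA/DSA trait scores.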

  • Date
    • 2017-Current
  • Hardware
    • HTC VIVE, aGlass Eye Tracker
  • Original VR Environment Credits
    • Sage Miller, Yilang Zhao, Byron Lau

MI484 Innovative Interfaces

Build and evaluate your own interface prototype with Arduino, sensors, and 3D printing.

AT&T-MSU 2015-16 Instructional Technology Award: Best Technology Enhanced Class. (Click for Details)

Spring 2015 (MI491), Spring 2016, Fall 2016, Fall 2017

This course aims to empower students to creatively design, develop, and evaluate new experimental interaction devices by combining various sensors and microcontrollers, without requiring extensive prior knowledge in computer science or electrical engineering. The course provides students with hands-on experience of interface design and development, including the basics of sensor technologies for interaction devices, fundamental implementation skills for interface hardware and software, and user experience evaluation methodologies. Throughout the class, students will learn how to turn their ideas for novel interaction devices into reality and evaluate their usability.

Check out the students’ project archive below.

MI420 Interactive Prototyping

Experience professional interactive application design and evaluation process and create your own ‘working’ interactive prototype.

Spring 2017, Fall 2017

Learn how to create interactive application prototypes for a variety of platforms (e.g., mobile, web, smart devices, and vehicles) that satisfy user needs. Beginning from sketches, you will elaborate your design through iterative paper and digital prototyping and evaluation, while keeping up with the latest design trends. The latest technologies and related user experience design issues are covered as well.

Software used: Axure RP

  • Student Project Example (Jinghan Ni, Fall 2017)

CoSMiC: Crowd-Sourced Mobile App to Find a Missing Child

  • Conceptual UI of CoSMiC Application

Finding a missing child is an important problem concerning not only parents but also society at large. It is essential and natural to use serendipitous clues from neighbors when searching for a missing child. In this paper, we explore a new architecture of crowd collaboration to expedite this mission-critical process and propose a crowd-sourced collaborative mobile application, CoSMiC. It helps parents find their missing child quickly, on the spot, before he or she disappears completely. The key idea lies in constructing the location history of the child via crowd participation, thereby leading parents to their child easily and quickly. We implement a prototype application and conduct extensive user studies to assess the design of the application and investigate its potential for practical use.
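The key idea — assembling a child's location history from crowd-sourced sighting reports — can be sketched as a small data structure. The field names below are hypothetical illustrations, not the actual CoSMiC schema or protocol.

```python
from dataclasses import dataclass

@dataclass
class Sighting:
    reporter_id: str
    timestamp: float      # Unix time of the sighting
    lat: float
    lon: float

def location_history(sightings):
    """Order crowd reports chronologically to form a searchable trail."""
    return sorted(sightings, key=lambda s: s.timestamp)

def last_known_position(sightings):
    """The most recent report: the starting point for the parents."""
    trail = location_history(sightings)
    return (trail[-1].lat, trail[-1].lon) if trail else None
```

Each new crowd report extends the trail, so the search narrows to the vicinity of the most recent sighting rather than the point where the child was lost.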

Plant-Based Games for Anxiety Reduction

A growing body of research identifies anxiety and stress as critical health problems that influence quality of life and contribute to various illnesses. Studies suggest that gardening activities help with anxiety. Our goal is to create engaging ways for people to interact with plants and ultimately reduce anxiety and stress. We made three short games employing a person’s touch interaction with a plant as the input interface. Each of the three games implements a unique interaction: tapping, patting, and gentle pinching. We then tested the games with ten players, five of whom (the plant group) played the games with the plant as the input interface. The other five (the non-plant group) played the games with a pressure sensor board. The plant group showed decreased anxiety with borderline statistical significance (p=0.054) and a Cohen’s d of 0.20 (i.e., a ‘small’ effect), while the non-plant group showed a non-significant decrease in anxiety after gameplay (p=0.65). We further examined which in-game elements contributed to calming the participants, as well as the design elements that need to be improved for plant-based games.
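For reference, the effect size reported above (Cohen's d) is the difference in group means divided by the pooled standard deviation. A minimal sketch, with illustrative data rather than the study's actual anxiety scores:

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d with pooled standard deviation (sample sizes may differ)."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    # Unbiased sample variances
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled
```

By the conventional rule of thumb, |d| around 0.2 is a 'small' effect, 0.5 'medium', and 0.8 'large', which is how the reported d of 0.20 is labeled above.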

  • Publications
    • Taiwoo Park, Tianyu Hu, and Jina Huh. 2016. Plant-based Games for Anxiety Reduction. In Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play (CHIPlay), 199–204.