Mobile Gesture Interaction Platform

Designed and developed E-Gesture, a platform that enables eyes- and hands-free gesture interaction using a mobile device and a wristwatch-type motion sensor. Its key technical feature is feedback-based closed-loop sensor fusion, which minimizes the energy consumption of motion sensors while preserving gesture-sensing quality. I implemented the gesture processing architecture using the Android NDK (Native Development Kit), TinyOS, and HTK (Hidden Markov Model Toolkit).
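
The closed loop can be sketched roughly as follows. This is an illustrative reconstruction, not the actual E-Gesture implementation; the class, names, and threshold values are hypothetical. The idea is that the low-power accelerometer decides when to wake the power-hungry gyroscope, and recognition feedback continually adapts that decision:

    # Illustrative sketch only (not the actual E-Gesture code); names and
    # threshold values are hypothetical.
    class ClosedLoopGate:
        def __init__(self, wake_threshold=1.2, step=0.05):
            self.wake_threshold = wake_threshold  # accel magnitude (g) that wakes the gyro
            self.step = step                      # adaptation step driven by feedback

        def should_wake_gyro(self, accel_magnitude):
            # The gyro stays off (saving energy) until motion exceeds the threshold.
            return accel_magnitude > self.wake_threshold

        def feedback(self, was_gesture, gyro_was_on):
            # Closed loop: a gesture missed while the gyro was off lowers the
            # threshold; a wake-up that yielded no gesture raises it.
            if was_gesture and not gyro_was_on:
                self.wake_threshold -= self.step
            elif gyro_was_on and not was_gesture:
                self.wake_threshold += self.step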

  • Date
    • 2010-2014
  • Related Publications
    • Ju-Hwan Kim, Tek-Jin Nam, Taiwoo Park. “CompositeGesture: Creating Custom Gesture Interfaces with Multiple Mobile or Wearable Devices”, International Journal on Interactive Design and Manufacturing (IJIDeM), 2014.
    • Taiwoo Park, Jinwon Lee, Inseok Hwang, Chungkuk Yoo, Lama Nachman, Junehwa Song, “E-Gesture: A Collaborative Architecture for Energy-efficient Gesture Recognition with Hand-worn Sensor and Mobile Devices”, in Proceedings of ACM SenSys 2011, Seattle, WA, November 2011.
    • Taiwoo Park, Jinwon Lee, Inseok Hwang, Chungkuk Yoo, Lama Nachman, Junehwa Song. “Demo: E-Gesture – A Collaborative Architecture for Energy-efficient Gesture Recognition with Hand-worn Sensor and Mobile Devices”, ACM MobiSys Demonstration, 2011 (Also demonstrated at IEEE SECON 2012) – Best Demo Award

JARVIS: Ubiquitous Mixed Reality Fitness Platform

  • Screenshot of the JARVIS fitness app, an example of ubiquitous mixed reality

This project envisions genuinely ubiquitous mixed reality experiences. The next generation of mixed reality devices will need the capability to closely interact with objects in a user’s surrounding environment, as proposed in Keiichi Matsuda’s Hyper-Reality concept. As a potential approach to enabling this capability, we propose to leverage Internet of Things (IoT) technologies that integrate sensors and connectivity into everyday objects such as toys, home appliances, and shopping carts. These sensors capture a user’s interactions with the objects in her vicinity, expanding the capability of mixed reality platforms beyond tracking users’ head and whole-body movements.
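
As a purely hypothetical illustration of this idea (the object IDs, event names, and function below are mine, not the project's), an IoT-instrumented object could report interaction events that the mixed reality layer consumes alongside head and body tracking:

    # Hypothetical event shape; identifiers and event names are illustrative.
    import json
    import time

    def make_interaction_event(object_id, event_type, payload):
        """Package one object-interaction reading for the mixed reality layer."""
        return json.dumps({
            "object_id": object_id,   # e.g., "dumbbell-07"
            "event": event_type,      # e.g., "rep_complete"
            "payload": payload,       # raw sensor reading(s)
            "timestamp": time.time(),
        })

    print(make_interaction_event("dumbbell-07", "rep_complete", {"reps": 8}))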

We describe the challenges of realizing this vision of ubiquitous mixed reality, and then present our ongoing work on a virtual fitness coach and its supporting platform as a concrete example of the technology.

This work is in press at the quarterly ACM SIGMOBILE Mobile Computing and Communications Review (GetMobile) and under major revision for the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT). Find the current draft of our vision paper here.

  • Date
    • 2015-Current
  • Role
    • Project Co-Lead, Frontend VR/AR Application Design and Development, User Experience Research
  • Collaborators
    • Mi Zhang (MSU) and Youngki Lee (Singapore Management University)

Mobile Context-Aware Service Platform

Challenges in designing and developing a mobile context-aware service platform

Participated in designing and implementing a context-aware service platform and its applications. For the project, I built a suite of wearable sensor components and experimental toolkits for measuring energy consumption. I also developed situation-specific context-aware applications for kindergarten children and demonstrated their feasibility through several field studies.

  • Date
    • 2007-2012
  • Related Publications
    • Youngki Lee, Sitharam S. Iyengar, Chulhong Min, Younghyun Ju, Seungwoo Kang, Taiwoo Park, Jinwon Lee, Yunseok Rhee, and Junehwa Song. “MobiCon: Mobile Context-Monitoring Platform”, Communications of the ACM (CACM), March 2012.
    • Inseok Hwang, Hyukjae Jang, Taiwoo Park, Aram Choi, Youngki Lee, Chanyou Hwang, Yanggui Choi, Lama Nachman, Junehwa Song. “Leveraging Children’s Behavioral Distribution and Singularities in New Interactive Environments: Study in Kindergarten Field Trips”, in Proceedings of Pervasive 2012, Newcastle, UK, June 2012.
    • Inseok Hwang, Hyukjae Jang, Taiwoo Park, Aram Choi, Chanyou Hwang, Yanggui Choi, Lama Nachman, Junehwa Song. “Toward Delegated Observation of Kindergarten Children’s Exploratory Behaviors in Field Trips”, in Proceedings of the 13th ACM International Conference on Ubiquitous Computing (UbiComp 2011) (Poster), Beijing, China, 2011.
    • Seungwoo Kang, Youngki Lee, Chulhong Min, Younghyun Ju, Taiwoo Park, Jinwon Lee, Yunseok Rhee, Junehwa Song. “Orchestrator: An Active Resource Orchestration Framework for Mobile Context Monitoring in Sensor-rich Mobile Environments”, in Proceedings of IEEE PerCom 2010, Mannheim, Germany, 2010.
    • Seungwoo Kang, Jinwon Lee, Hyukjae Jang, Hyonik Lee, Youngki Lee, Souneil Park, Taiwoo Park, and Junehwa Song. “SeeMon: Scalable and Energy-efficient Context Monitoring Framework for Sensor-rich Mobile Environments”, in Proceedings of ACM MobiSys 2008, Colorado, USA, June 2008.

Pervasive Exergame Platform

Exertainer, a pervasive exergaming platform, supports multiple heterogeneous exercise devices, such as hula hoops, jump ropes, exercise bikes, and interactive treadmills, as well as wearable motion sensors. I prototyped the exercise devices and wearable sensors, in both hardware and software. The platform uses a variety of sensors, including accelerometers, gyroscopes, proximity sensors, rotation-speed sensors, and magnetic switches, to detect players’ activities and exercise context.
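
As one hedged example of this kind of sensing (the class and parameter values below are illustrative, not the platform's code), a jump rope's magnetic switch can be turned into a rotation rate by counting debounced pulses:

    # Illustrative sketch: debounced magnetic-switch pulses -> rotations per minute.
    import time

    class RotationCounter:
        def __init__(self, debounce_s=0.05):
            self.debounce_s = debounce_s  # ignore contact bounce within this window
            self.last_pulse = None
            self.timestamps = []

        def on_switch_closed(self, now=None):
            now = time.monotonic() if now is None else now
            if self.last_pulse is not None and now - self.last_pulse < self.debounce_s:
                return  # contact bounce, not a real rotation
            self.last_pulse = now
            self.timestamps.append(now)

        def rotations_per_minute(self, window_s=5.0):
            now = time.monotonic()
            recent = [t for t in self.timestamps if now - t <= window_s]
            return len(recent) * (60.0 / window_s)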

  • Date
    • 2010-2016
  • Related Publications
    • Taiwoo Park, Inseok Hwang, Youngki Lee, Junehwa Song. “Toward a Mobile Platform for Pervasive Games”, in Proceedings of the ACM SIGCOMM Workshop on Mobile Gaming, Helsinki, Finland, August 2012.
    • Taiwoo Park, Inseok Hwang, Uichin Lee, Sunghoon Ivan Lee, Chungkuk Yoo, Youngki Lee, Hyukjae Jang, Sungwon Peter Choe, Souneil Park, Junehwa Song. “ExerLink: Enabling Pervasive Social Exergames with Heterogeneous Exercise Devices”, in Proceedings of ACM MobiSys 2012, Lake District, UK, June 2012.
  • Awards
    • Best Demo Award, ACM MobiSys 2012
    • Best Demo Honorable Mention, IEEE SECON 2012

VR Mental Health

In this project, we use 360-degree VR video as an immersive, interactive storytelling method for health promotion. Talking about one’s feelings is helpful; however, due to stigma, many people with depression keep their suffering to themselves. In this VR environment, we used 360-degree videos to reproduce a counseling session in virtual reality, giving users an opportunity to vent without fear of stigma or discrimination.

  • Date
    • 2017-Current
  • Roles
    • Project co-lead
    • Development support, troubleshooting
  • Related Publications
    • Syed Ali Hussain, Taiwoo Park, Irem Gokce Yildirim, and Zihan Xiang. “Virtual Reality-based Counseling for People with Mild Depression”, HCI International 2018, ICA 2018.

Modifying Visual Stimuli in Virtual Reality to Reduce Hand Tremor in Micromanipulation Tasks

  • Current experimental environment. By adding an extra IMU sensor, we aim to increase the precision of location tracking.

Involuntary hand tremor is a serious challenge in micromanipulation tasks and has thus drawn significant attention from related fields. To minimize the effect of hand tremor, a variety of mechanically assistive solutions have been proposed. However, approaches that increase people’s awareness of their own hand tremor have not been extensively studied. In this paper, we propose a head-mounted-display-based virtual reality (VR) system to increase self-awareness of hand tremor. It shows the user a virtual image of a handheld device with emphasized hand tremor information. We hypothesize that, provided with this emphasized tremor information, subjects will control their hand tremor more effectively. Two methods of emphasizing hand tremor information are demonstrated: (1) direct amplification of the tremor and (2) magnification of the virtual object, compared against a control condition without emphasized tremor information. We conducted a human-subject study with twelve trials, in which four healthy participants performed a task of holding a handheld gripper device in a specific direction. The results showed that the proposed methods achieved a reduced level of hand tremor compared with the control condition.
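
Method (1) can be sketched roughly as follows. This is my illustrative reconstruction, not the paper's implementation, and the gain and smoothing values are hypothetical: the rendered pose is the low-pass-filtered hand position plus the high-frequency residual (the tremor) scaled up by a gain:

    # Illustrative reconstruction of direct tremor amplification; the gain
    # and smoothing factor are hypothetical, not from the paper.
    class TremorAmplifier:
        def __init__(self, gain=3.0, alpha=0.9):
            self.gain = gain      # how strongly the tremor is exaggerated
            self.alpha = alpha    # smoothing factor (closer to 1 = smoother path)
            self.smoothed = None

        def render_position(self, raw):
            """Map a raw tracked position (x, y, z) to the position to render."""
            if self.smoothed is None:
                self.smoothed = list(raw)
            # An exponential moving average approximates the intended hand path.
            self.smoothed = [self.alpha * s + (1 - self.alpha) * r
                             for s, r in zip(self.smoothed, raw)]
            # Tremor = raw minus intended path; re-add it, amplified.
            return [s + self.gain * (r - s)
                    for s, r in zip(self.smoothed, raw)]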

We are extending the system to support more precise location tracking for micromanipulation simulation and assistance, using virtual reality technology including the Oculus Rift and HTC VIVE.

  • Related Publication
    • John Prada, *Taiwoo Park, Jintaek Lim, and *Cheol Song. “Exploring the Potential of Modifying Visual Stimuli in Virtual Reality to Reduce Hand Tremor in Micromanipulation Tasks”, Current Optics and Photonics, 1(6), 642-648, December 2017. (SCI-E indexed)

Trapped! Escape Room

Trapped! was a Halloween-themed virtual reality escape room in which players had to solve multiple puzzles to escape a dungeon. Players had access to numerous nearby objects pertaining to three separate puzzles: mixing a potion in a certain order, identifying patterns in given objects, and recognizing feedback obtained through experimentation. Players made sense of their environment through riddles presented in-game, allowing for a grounded approach to puzzle-solving. Furthermore, the dungeon was finished with a Halloween-themed aesthetic, including original models and music from the team members themselves. Very spooky.

An escape room was previously developed by Byron Lau, Sage Miller, and Yilang Zhao for internal use. Trapped! was adapted from the structure of the previous iteration — the dimensions of the room, location of the door and keypad, etc. — but incorporated new puzzles and items, in addition to a complete environment overhaul.

  • Date
    • 2017-Current
  • Roles
    • Project co-lead
    • Development support and troubleshooting
    • Player experience researcher

VR Baseball Simulator

Test environment for the VR sports simulator.

See how far you can hit pumpkins in this Halloween-themed VR baseball batting simulator! With a variety of bats, T-ball & Pitcher modes, and a detailed baseball stadium, enjoy sending pumpkins to infinity!

  • Date
    • 2017-Current
  • Roles
    • Project advisor
    • Content director
    • Development support and troubleshooting
    • User experience specialist

JediFlight VR

JediFlight is an interaction mechanism built around virtual wings, in which a player moves both the wings and their natural limbs to interact with the environment and other objects. The design’s major contribution is enabling a player to naturally and simultaneously move real and extended limbs (i.e., the wings) in a multi-faceted interactive task setup, whereas previous work explored only the basic feasibility of extended-limb manipulation.
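
One plausible mapping (my sketch, not the project's code; the threshold and gain are illustrative) grants lift when a tracked arm makes a fast downward stroke, so the real limb and the virtual wing move together:

    # Hypothetical mapping from tracked arm motion to virtual wing lift.
    def wing_lift(arm_velocity_y, threshold=-0.8, gain=2.5):
        """Return upward lift (arbitrary units) for a vertical arm velocity in m/s."""
        if arm_velocity_y < threshold:  # fast downward stroke flaps the wing
            return gain * (threshold - arm_velocity_y)
        return 0.0                      # no lift while raising or resting the arm

    print(wing_lift(-1.5))  # vigorous downstroke -> positive lift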

  • Date
    • 2017-Current
  • Roles
    • Project advisor
    • VR interaction design
  • Hardware
    • HTC VIVE and VIVE Trackers

MI491/891 Virtual Reality

Understand VR Technology and Create Your Own Virtual World.

Summer 2017

This course provides an overview of virtual reality, along with a series of lab sessions on how to quickly and effectively create virtual reality experiences. It introduces the history of virtual reality, a variety of VR application domains and their impacts, human factors, and the major aspects of VR user experience. After the class, you are expected to understand the essential elements of virtual reality and how VR technologies can be used for public benefit, and to be able to turn your ideas into sample VR applications.

For the class, I developed a lab curriculum based on GoogleVR and Unity3D, including supporting unitypackages that enable coding-free design and development of interactive VR apps. If you are interested in the lab curriculum, please find it at the following links. Should you have any questions, please contact me via email.

Materials are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Please notify Dr. Taiwoo Park (taiwoo.park_at_gmail.com) before you use the materials for a class or remix/publish them.

Also find student project examples in the following links:

MI845 Usability and Accessibility

Learn and Practice Usability Testing Methods.

Spring 2018

In this course, you will learn usability testing principles and strategies for planning and conducting an evaluation. Once the basics are established through early class meetings and group project lab discussions, you will focus on working with your team to plan, prepare, and conduct a usability evaluation. You will then analyze and present the results in a written report, an oral presentation, and a highlights tape or embedded video clips.

This course centers on a “sponsored” project, which comes from a virtual client with a real product. You will work on the project with this sponsor from early in the semester, treating it as a serious client engagement.

Touching the Virtual: Individual differences in approach and avoidance behaviors in VR

This study investigates the relationship between individual differences in motivational activation and approach/avoidance behaviors in a 3D virtual environment (VE). The primary hypotheses are that (1) motivational relevance shapes the facilitation or inhibition of behaviors while reaching for, holding, and manipulating virtual objects, and (2) variations in individuals’ trait appetitive system activation (ASA) and defensive system activation (DSA) moderate this relationship. To unobtrusively observe individuals’ unconscious and automatic behaviors, we measure eye gaze and the distance kept between the participant and virtual objects while playing a VR game that involves sorting emotional pictures. We expect that closer distance and longer visual inspection of virtual objects are associated with ASA, while greater distance and shorter interaction are related to DSA. The relationship between trait motivational activation and other individual factors, such as VR skill and experience, will also be investigated.
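
For illustration only, the two behavioral measures described above could be logged per object roughly like this (the field names are mine, not the study's instrumentation):

    # Hypothetical per-object log: closest approach distance and gaze dwell time.
    from dataclasses import dataclass

    @dataclass
    class ObjectApproachLog:
        min_distance_m: float = float("inf")  # closest hand-to-object distance
        gaze_dwell_s: float = 0.0             # cumulative visual inspection time

        def update(self, hand_to_object_m, gazed, dt_s):
            self.min_distance_m = min(self.min_distance_m, hand_to_object_m)
            if gazed:
                self.gaze_dwell_s += dt_s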

  • Date
    • 2017-Current
  • Hardware
    • HTC VIVE, aGlass Eye Tracker
  • Original VR Environment Credits
    • Sage Miller, Yilang Zhao, Byron Lau

MI484 Innovative Interfaces

Build and evaluate your own interface prototype with Arduino, sensors, and 3D printing.

AT&T-MSU 2015-16 Instructional Technology Award: Best Technology Enhanced Class.

Spring 2015 (MI491), Spring 2016, Fall 2016, Fall 2017

This course aims to empower students to creatively design, develop, and evaluate new experimental interaction devices by combining various sensors and microcontrollers, without requiring extensive prior knowledge of computer science or electrical engineering. The course gives students hands-on experience in interface design and development, including the basics of sensor technologies for interaction devices, fundamental implementation skills for interface hardware and software, and user experience evaluation methodologies. Throughout the class, students learn how to turn their ideas for novel interaction devices into reality and evaluate their usability.

Check out the students’ project archive below.
