PipetteX

MR micropipette training for immersive chemical education

MR STEM App

Mar 2023 - Dec 2023 (10 months)

Summary

PipetteX offers immersive micropipette training through MR and a custom-designed controller. In this individual project, I took on the challenge of designing and developing an MR micropipette training environment from scratch using Unity, while also engineering a realistic micropipette-inspired controller, including its case and PCB design.

Role

MR Design · MR Development · HW Design

Tool

Unity · Fusion 360 · Figma

Context

Capstone Project w/ STEM Education Startup · Advised by Prof. Ian Oakley

Highlight

Background

This project was conducted through an industry-academia collaboration to develop a VR STEM education service for chemical labs. The initiative focused on creating a fundamental VR interaction tool to enhance learning experiences, starting with the micropipette, a key instrument in chemistry experiments.

Process

Co-design process of hardware and software

I independently managed the co-design process of hardware and software, from research to implementation. This end-to-end approach involved harmonizing physical and digital product design, prioritizing the integration of hardware and software to deliver an intuitive and realistic user experience.

Digital 👓: Preliminary Research → In-depth Interview → User Scenario → VR Scene Design

Physical 📍: Preliminary Research → Disassembly → HW Analysis → HW Design

Electrical ⚡️: Preliminary Research → Sensor Selection → PCB Design → Signal Processing

Research

Redefining VR experiment training for better skill development

Preliminary Research

Current VR training solutions rely heavily on standard controllers that lack the fidelity needed for skill-based training. I found that this limitation reduces the effectiveness of learning in VR, highlighting the need for specialized haptic devices.

In-depth Interview

To fully understand the functional requirements of a micropipette, I conducted interviews with two graduate-level teaching assistants from a chemistry lab course. I learned the essential concepts and procedure of using a micropipette, as well as frequent mistakes students make, such as incorrect pipette tilt.

Design Opportunity

With a realistic haptic device, VR offers a practical platform for students to train with a micropipette.

By integrating VR with a haptic device, the tactile feedback of using a micropipette is replicated, creating a bridge between virtual training and physical lab practice. This method enables students to safely refine essential lab skills, enhancing their confidence and competence before handling actual equipment.

Project Scope

Comprehensive micropipette training model with core features

To ensure scalability, I developed a fully functional micropipette model as a foundational tool for chemistry labs. The tutorial scenario incorporates key features like plunger stops, volume adjustment, and tip ejection, providing a structured framework for core skill training that can be extended to future experiments.

Key Features

The VR pipetting system replicates a real micropipette with functional plunger stops (first and second), volume adjustment, tip ejection, and tilt detection. These features provide realistic physical feedback, enabling immersive training that builds muscle memory and proper handling skills.

First Stop

Second Stop

Knob Rotation

Tip Eject

Tilt
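
As a rough illustration of how these five features could be represented on the software side, the sketch below (Unity C#) maps raw controller signals to a simple pipette state. All names, thresholds, and signal ranges here are assumptions for illustration, not the project's actual implementation.

```csharp
using UnityEngine;

// Illustrative only: a simplified state model for the five features above.
// Names, thresholds, and signal ranges are assumptions, not the project's code.
public enum PlungerStage { Idle, FirstStop, SecondStop }

public class PipetteFeatureState
{
    public PlungerStage Plunger { get; private set; } = PlungerStage.Idle;
    public int VolumeMicroliters { get; private set; } = 100;  // knob-set volume
    public bool TipAttached { get; set; }
    public bool TiltWarning { get; private set; }

    const int MinVolume = 10, MaxVolume = 1000;   // assumed adjustable range (µL)
    const float TiltWarnDeg = 30f;                // assumed cautionary tilt angle

    // Interpret one frame of raw controller readings:
    //   hallLevel:    0 (released) .. 1 (fully pressed) from the Hall effect sensor
    //   encoderSteps: signed detent count from the rotary encoder since last frame
    //   tiltDeg:      tilt from vertical reported by the IMU
    public void UpdateFrom(float hallLevel, int encoderSteps, float tiltDeg)
    {
        if (hallLevel > 0.85f)      Plunger = PlungerStage.SecondStop;
        else if (hallLevel > 0.45f) Plunger = PlungerStage.FirstStop;
        else                        Plunger = PlungerStage.Idle;

        VolumeMicroliters = Mathf.Clamp(VolumeMicroliters + encoderSteps * 10,
                                        MinVolume, MaxVolume);
        TiltWarning = tiltDeg > TiltWarnDeg;
    }
}
```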

Scenario

The VR pipetting system guides users through each step of the pipetting process, simulating real-world actions for a hands-on learning experience. From picking up the pipette to ejecting the tip, each stage is designed to replicate the tactile and functional aspects of using an actual micropipette, ensuring accurate and confident handling.

1) Grab the pipette

2) Attach a tip

3) Adjust the volume

4) First stop & release

5) Second stop & release

6) Eject the tip
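
To make this flow concrete, here is a minimal, hypothetical sequencing sketch in Unity C#: each tutorial step advances when the expected controller event arrives. Step names, the event signature, and the target volume are assumptions, not the project's actual scene logic.

```csharp
using UnityEngine;

// Illustrative only: sequencing the six tutorial steps listed above.
public enum TutorialStep
{
    GrabPipette, AttachTip, AdjustVolume,
    FirstStopAndRelease, SecondStopAndRelease, EjectTip, Completed
}

public class TutorialSequence : MonoBehaviour
{
    [SerializeField] int targetVolumeMicroliters = 200;   // volume the learner must set

    public TutorialStep Current { get; private set; } = TutorialStep.GrabPipette;

    // Called by interaction scripts whenever the tracked pipette state changes.
    public void OnPipetteChanged(bool grabbed, bool tipAttached, int volume,
                                 bool firstStopReleased, bool secondStopReleased,
                                 bool tipEjected)
    {
        bool advance = false;
        switch (Current)
        {
            case TutorialStep.GrabPipette:          advance = grabbed; break;
            case TutorialStep.AttachTip:            advance = tipAttached; break;
            case TutorialStep.AdjustVolume:         advance = volume == targetVolumeMicroliters; break;
            case TutorialStep.FirstStopAndRelease:  advance = firstStopReleased; break;
            case TutorialStep.SecondStopAndRelease: advance = secondStopReleased; break;
            case TutorialStep.EjectTip:             advance = tipEjected; break;
        }

        if (advance && Current != TutorialStep.Completed)
        {
            Current += 1;   // enum members are declared in order
            Debug.Log($"Tutorial advanced to: {Current}");
            // The guide panel would refresh its image and text here.
        }
    }
}
```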

Hardware Design

Physical device connecting virtual-physical world with tangible feedback

Creating a haptic-enabled pipette controller required precise coordination of mechanical and electrical design. This effort resulted in a seamless and highly responsive user experience, elevating the realism and functionality of virtual pipetting.

Iterative Mechanical Prototype

Through repeated prototyping and 3D modeling, I developed a form factor that closely mirrors a real micropipette in shape and feel. The design also accommodates PCB and sensor integration to ensure that all components work seamlessly together while performing essential functions in VR.

Disassembly

First Draft

Iteration

Final Prototype

Circuit Design

The custom-designed PCB integrates sensors that detect the user’s actions, such as plunger stops and volume adjustments. By carefully arranging components like the Hall effect sensor and rotary encoder, the circuit design ensures that every action within VR mirrors the tactile response of an actual micropipette.
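
As background on how a rotary encoder's two signals become signed volume steps, the sketch below shows classic quadrature decoding in plain C#. The actual firmware and PCB may read the encoder differently (e.g. via interrupts or a dedicated counter); this is only an illustration of the principle.

```csharp
// Illustrative only: quadrature decoding for a two-channel rotary encoder.
public class QuadratureDecoder
{
    // Indexed by (previous AB state << 2) | current AB state.
    static readonly int[] StepTable =
    {
         0, +1, -1,  0,
        -1,  0,  0, +1,
        +1,  0,  0, -1,
         0, -1, +1,  0
    };

    int _prev;                                   // last AB state (0..3)
    public int Position { get; private set; }    // accumulated detent count

    // Call on every sample of the encoder's A and B lines.
    public void Sample(bool a, bool b)
    {
        int curr = (a ? 2 : 0) | (b ? 1 : 0);
        Position += StepTable[(_prev << 2) | curr];
        _prev = curr;
    }
}
```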

Hardware Design

The final assembly brought together the casing, sensors, and internal components in a streamlined, ergonomic design. This hardware setup allows for wireless operation, enabling users to interact naturally within the VR environment without physical constraints, making the learning experience both accessible and realistic.

Hall Effect Sensor

The sensor detects the states of both the plunger and the tip ejector by sensing a magnet. It is also capable of differentiating between the first and second stop positions.

Rotary Encoder

This detects the rotation of the volume-adjusting knob and adjusts the volume setting accordingly.

IMU Sensor

When the tilt level of the device along the y-axis reaches a certain threshold, it triggers an alarm.

Battery Charging Module

The device operates wirelessly on its built-in rechargeable battery.
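
The following Unity C# sketch shows one way the application could consume these sensor readings. The real controller communicates wirelessly; here a line-based CSV protocol over a serial-style link is assumed purely for illustration (and System.IO.Ports requires Unity's .NET 4.x API compatibility level). The port name and packet format are placeholders.

```csharp
using System.Globalization;
using UnityEngine;

// Illustrative sketch of receiving the controller's sensor readings inside Unity.
// Assumed packet format per line: "0.92,3,12.5" =
//   hall level (0..1), encoder steps since last frame, tilt in degrees.
public class PipetteControllerBridge : MonoBehaviour
{
    [SerializeField] string portName = "COM3";       // placeholder port name
    System.IO.Ports.SerialPort _port;

    public float HallLevel { get; private set; }     // plunger press level
    public int EncoderDelta { get; private set; }    // signed knob steps this frame
    public float TiltDegrees { get; private set; }   // tilt from vertical (IMU)

    void Start()
    {
        _port = new System.IO.Ports.SerialPort(portName, 115200) { ReadTimeout = 5 };
        _port.Open();
    }

    void Update()
    {
        try
        {
            string[] f = _port.ReadLine().Split(',');
            if (f.Length != 3) return;
            HallLevel    = float.Parse(f[0], CultureInfo.InvariantCulture);
            EncoderDelta = int.Parse(f[1], CultureInfo.InvariantCulture);
            TiltDegrees  = float.Parse(f[2], CultureInfo.InvariantCulture);
        }
        catch (System.TimeoutException) { /* no new sample this frame */ }
    }

    void OnDestroy() { if (_port != null && _port.IsOpen) _port.Close(); }
}
```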


VR Scene Design

VR scene enhancing the physical device with immersive feedback

The VR scene is designed to complement the physical pipette device by providing immersive visual and contextual feedback. Through intuitive cues and realistic interactions, the virtual environment bridges the gap between physical actions and digital simulations, enhancing the user’s overall training experience.

Playground

The virtual playground is equipped with all the necessary lab apparatus required for the simulation, creating a realistic and ready-to-use environment for effective training.

Guide Panel

At each step of the training, a display positioned directly in front of the user shows the current phase of the process, accompanied by supporting images and text-based guidance to enhance understanding.

Pipette Action Guide Tooltip

A tooltip above the pipette indicates the current action and provides instructions if the user deviates from the intended guidance. If the pipette is tilted excessively, the tooltip turns yellow as a cautionary signal.

Action

Out of guidance

Tilt
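
A minimal Unity C# sketch of this cautionary cue is shown below: when the pipette leans too far from vertical, the tooltip background turns yellow. The threshold value and component wiring are assumptions, not the project's actual scripts.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative only: turn the pipette tooltip yellow when tilt exceeds a threshold.
public class TiltWarningTooltip : MonoBehaviour
{
    [SerializeField] Transform pipette;          // tracked pipette transform in the scene
    [SerializeField] Image tooltipBackground;    // tooltip UI element above the pipette
    [SerializeField] float warnAngleDeg = 30f;   // assumed cautionary threshold

    Color _normalColor;

    void Awake()
    {
        _normalColor = tooltipBackground.color;
    }

    void Update()
    {
        // Angle between the pipette's long axis and world "up".
        float tilt = Vector3.Angle(pipette.up, Vector3.up);
        tooltipBackground.color = tilt > warnAngleDeg ? Color.yellow : _normalColor;
    }
}
```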

High Fidelity Prototype

Integration of Hardware and Software for Immersive Training

The high-fidelity prototype is the result of the HW/SW co-design process, integrating a haptic-enabled pipette controller with an immersive VR environment. This system combines realistic tactile feedback with interactive virtual guidance, delivering an intuitive and practical training solution that bridges the gap between physical and virtual lab experiences.
