
In this section, I present two mini projects, one VR and one AR, both focused on enhancing kitchen experiences. Click the buttons below to jump to a specific project.
VR Mini Project
KitchenWise - VR Kitchen Appliance Use Training
An immersive VR experience designed to enhance users’ understanding and skills in safely and efficiently operating kitchen appliances, achieved through live demonstrations and hands-on practice.
Problem Statement
The variety and complexity of modern kitchen appliances not only deter people from cooking but also raise safety concerns.
Beginners and less experienced cooks often struggle to use kitchen appliances safely and properly. Modern kitchens are filled with appliances such as ovens, stoves, blenders, and microwaves, which can overwhelm newcomers. The fear of making mistakes discourages them from trying new recipes and techniques, leaving them unfamiliar with correct appliance use and more prone to accidents.
Storyboard
The storyboard helps me visualize user interactions and guides me in thinking through the complete process. It helps identify missing elements and provides a detailed framework for ensuring a seamless and engaging experience.
Users inquire about kitchen appliance usage.
Choose a specific appliance (e.g., microwave oven).
Experience a step-by-step microwave usage demo.
Understand the purpose of each button.
Learn safety tips with practical demonstrations.
Engage in a practice game for content review.
Storyboard sketch
360 Sketch
The 360-degree sketch and video provide me with insights into the arrangement and dimensions of the objects within the immersive environment. This also aids me in making necessary layout adjustments during the physical prototype design phase.
360-degree sketch of the kitchen scene
360-degree sketch of the kitchen scene viewed through the VR monitor
3D Props
I used a cardboard box to simulate the VR immersive kitchen interface. Although its flat faces meet at right angles rather than curving around the viewer, it effectively illustrates what users will encounter when they turn their heads left or right. This process helped me develop a realistic preview of what we intend to construct in the final VR environment.
Physical prototype materials I created
Physical Prototype
Physical prototype video
UI Elements
These UI elements are created in Figma and will be used for user interactions.
UI elements created for the VR scene
Digital Prototype - Scene
First, created the ground plane and defined the dimensions of the whole environment
Downloaded the kitchen appliance 3D assets created by Boxx-Games Assets
Placed objects on the plane and scaled their size appropriately
Downloaded the floor asset created by CaptainCatSparrow
Downloaded the skybox asset created by Render Knight
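The placement and scaling above were done by hand in the Unity Editor, but the same setup can also be scripted. A minimal sketch of that idea, assuming a prefab reference assigned in the Inspector (the prefab name, position, and scale factor here are placeholders, not values from the actual project):

```csharp
using UnityEngine;

// Hypothetical setup script: spawns a kitchen prop on the ground plane
// and scales it to fit the room. The prefab reference is assigned in
// the Inspector (e.g. an asset from the Boxx-Games pack).
public class KitchenSceneSetup : MonoBehaviour
{
    public GameObject microwavePrefab;                       // placeholder prefab
    public Vector3 spawnPosition = new Vector3(1.2f, 0.9f, 0.5f);
    public float scaleFactor = 0.8f;                         // tuned by eye against the plane

    void Start()
    {
        GameObject microwave = Instantiate(microwavePrefab, spawnPosition, Quaternion.identity);
        microwave.transform.localScale *= scaleFactor;
    }
}
```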
Digital prototype overview
Digital Prototype - Place UI elements
After setting up the whole environment, I imported UI elements created in Figma to Unity and adjusted their size accordingly.
UI elements placed in the VR scene
Digital Prototype - Background change
Users can change the background scene to immerse themselves in different environment settings.
Background change options
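A common way to implement such a background switch in Unity is to swap the skybox material at runtime. A sketch of that approach (the materials array would be filled in the Inspector, e.g. with variants from the skybox pack; the wiring to a UI button is an assumption):

```csharp
using UnityEngine;

// Cycles through a list of skybox materials each time the user
// picks a new background from the UI.
public class BackgroundSwitcher : MonoBehaviour
{
    public Material[] skyboxes;   // assigned in the Inspector
    private int current;

    // Hooked up to a UI button's OnClick event.
    public void NextBackground()
    {
        current = (current + 1) % skyboxes.Length;
        RenderSettings.skybox = skyboxes[current];
        DynamicGI.UpdateEnvironment(); // refresh ambient lighting to match
    }
}
```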
Digital Prototype
Created in Unity
Digital prototype video
Key Interactions
The application guides users in selecting a kitchen appliance and beginning to learn its features and usage tips. It displays on-screen prompts and provides haptic feedback through vibrations when users interact with objects, so users can actively engage in the learning process with clear guidance.
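One way to wire up such haptic feedback is through Unity's XR Interaction Toolkit, which can send a vibration impulse to the controller when an object is selected. A hedged sketch (component and event names follow XRI 2.x conventions and may differ between toolkit versions; the amplitude and duration values are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sends a short vibration when the user grabs or touches an appliance.
// Intended to be hooked up to an XRGrabInteractable's Select Entered event.
public class ApplianceHapticFeedback : MonoBehaviour
{
    [Range(0f, 1f)] public float amplitude = 0.5f;
    public float duration = 0.1f; // seconds

    public void OnSelectEntered(SelectEnterEventArgs args)
    {
        // Only controller-based interactors can rumble.
        if (args.interactorObject is XRBaseControllerInteractor interactor)
            interactor.xrController.SendHapticImpulse(amplitude, duration);
    }
}
```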
Instruction
Instructions in the VR scene
Safety tips are demonstrated using animations that show the consequences of using a kitchen appliance incorrectly. This visual representation conveys information effectively and makes the safety tips more memorable and understandable.
Kitchen appliance safety demonstration
Safety demonstration example
Improvements from the First Version
Add animated safety demonstrations
In the first version of the digital prototype, the instructions and tips were presented as text, which wasn't very user-friendly or interactive. To improve this, I incorporated animated illustrations to visually depict the consequences of not following safety tips when using different kitchen appliances.
Provide user feedback through vibrations and touch sensations
To make interaction in the virtual environment more engaging and lifelike, I added pop-up effects for selected objects and incorporated haptic feedback through vibrations. Users feel tactile responses when they interact with kitchen appliances or make selections, making the experience more realistic and interactive.
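The pop-up effect described here can be sketched as a simple scale animation that runs when an object is selected and reverses when it is released. A minimal, hypothetical version (the scale factor and timing are assumptions, and the two public methods would be wired to the interaction events):

```csharp
using System.Collections;
using UnityEngine;

// "Pop-up" effect: smoothly scales an object up when selected
// and back to its original size when released.
public class PopUpEffect : MonoBehaviour
{
    public float popScale = 1.15f;   // how much the object grows
    public float duration = 0.15f;   // seconds per transition
    private Vector3 baseScale;

    void Awake() { baseScale = transform.localScale; }

    public void OnSelected() { StartCoroutine(ScaleTo(baseScale * popScale)); }
    public void OnReleased() { StartCoroutine(ScaleTo(baseScale)); }

    private IEnumerator ScaleTo(Vector3 target)
    {
        Vector3 start = transform.localScale;
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            transform.localScale = Vector3.Lerp(start, target, t / duration);
            yield return null; // wait one frame
        }
        transform.localScale = target;
    }
}
```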
Include safety actions
To make the most of a learning space dedicated to kitchen appliances and safety tips, it's essential to cover what to do when safety issues occur. In the final version, beyond teaching how to use appliances safely, I added information on what actions users should take if something goes wrong, to ensure their safety.
Lessons Learned
Working through the project step by step, I learned how to choose a prototyping method that effectively conveys our ideas. From physical to digital prototyping, each phase helped me think thoroughly about my design and refine the interactions with the user in mind. While physical prototyping gave us a general sense of the interactions, the digital prototyping stage allowed a deeper examination of details like distance, size, and scale, placing the user at the center of our considerations.
Another lesson is that the storyboard cannot be skipped: it helps us establish clear project goals and build upon them, and it strengthened the overall structure of the project.
Unity and Bezi were new tools I hadn't had the chance to learn before. This experience taught me to build 3D scenes while considering how objects should scale when users interact with them in the actual product setting. In addition, interactions should be simple and easy to understand: users should intuitively grasp what to do from the outset, which encourages continued engagement. Lastly, I learned that the prototype demonstration should be clear enough for developers to interpret and follow what we have in mind.
Credits
https://assetstore.unity.com/packages/2d/textures-materials/sky/fantasy-skybox-free-18353
https://assetstore.unity.com/packages/3d/props/interior/free-kitchen-cabinets-and-equipment-245554
https://assetstore.unity.com/packages/2d/textures-materials/50-free-pbr-materials-242760
https://assetstore.unity.com/packages/vfx/free-fire-vfx-hdrp-239742
Unity Assets
CookWise - Recipe Assistant & Cooking Guidance
AR Mini Project
An immersive AR experience designed to empower home chefs with recipe recommendations and cooking guidance tailored to their available ingredients, enabling convenient meal preparation without the need for a trip to the grocery store.
Problem Statement
In everyday cooking, people are often frustrated when they have only a limited choice of ingredients at home but want to prepare a meal without going grocery shopping. The proposed system helps users get recommendations on what to cook with their available ingredients while also assisting them in cooking the meal.
Domain: Available recipe recommendations and corresponding cooking assistance
Context: This service provides recipe recommendations based on available ingredients for anyone without time to go grocery shopping. Additionally, users can select a recommended recipe and follow the cooking procedure. The application could be adopted by culinary schools, home cooks, and appliance manufacturers to support cooking.
Storyboard
Users wonder what meal they could prepare based on their available ingredients.
The device scans for available ingredients.
Confirmation of scanned ingredients.
Users pick a recipe to cook.
Users follow cooking instructions.
Users complete the dish successfully.
Storyboard sketch
Digital Prototype
Digital prototype video
Key Interactions
The application allows users to either scan their refrigerator or manually input the ingredients they have available. After inputting their available ingredients, users can review the scanned results and make edits if needed. Once the information is confirmed, the system will generate a list of recommended recipes for users to choose from. Users have the option to sort and filter the list based on their preferences.
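The recommendation step described above can be sketched as a simple filter over the confirmed ingredient list: keep only recipes whose required ingredients are all available, then order the results. This is a hypothetical illustration of the logic, not the app's actual implementation; the recipe data structure and the ordering rule (prefer recipes that use more of the pantry) are assumptions:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class RecipeRecommender
{
    // A recipe is recommended when every ingredient it needs is available.
    // Results are ordered so recipes using more ingredients come first.
    public static List<string> Recommend(
        IEnumerable<string> available,
        Dictionary<string, string[]> recipes)
    {
        var pantry = new HashSet<string>(available, StringComparer.OrdinalIgnoreCase);
        return recipes
            .Where(r => r.Value.All(pantry.Contains))
            .OrderByDescending(r => r.Value.Length)
            .Select(r => r.Key)
            .ToList();
    }
}
```

For example, with eggs, rice, and oil on hand, a fried-rice recipe needing all three would rank ahead of an omelette needing only eggs and oil, while a pancake recipe requiring flour would be excluded.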
Available recipe recommendations
UI design displaying the scanned ingredients' results
After users select a recipe, the system provides a step-by-step, interactive cooking guide, allowing them to follow instructions in real time through AR visual cues. Virtual ingredients and measurement guides are overlaid on users' real cooking equipment, helping with portioning and cooking techniques.
Guided cooking instruction
UI design showing cooking guidance
Improvements from the First Version
Explore Lens Studio
In version 1, I used Bezi to design all the interfaces and display them within the real environment. For the final version, I explored Lens Studio instead. During this process, I learned about adding markers and triggering objects upon marker detection.
Add Marker-based Tracking
To streamline the entire process and enable users to cook at their own pace, I introduced marker-based tracking for physical objects. This feature allows the device to automatically track the ingredients and offer corresponding instructions. Furthermore, this tracking system can be further improved to automatically detect users' progress by scanning the state of the ingredients. For instance, it can identify when an ingredient has been fully chopped, enabling the system to seamlessly guide users to the next steps without requiring them to manually advance to the next instruction. This enhancement eliminates the need for users to click or interact with the app each time they complete a particular cooking step.
Lessons Learned
After designing the AR experience, one difference I noticed was that it is more convenient to conceptualize an AR scene by actively constructing the experience and projecting objects onto the real environment: the scale and size of objects can be adjusted effectively against real surroundings. In contrast, VR generates objects within a fully virtual environment, so building everything from scratch is the more suitable approach there.
Exploring several tools helped me understand what kinds of interactions can be created and built for AR scenes. Each tool provides different features and functions, but a similar approach can be applied across them when building and projecting onto the real environment.
Credits
https://www.dannysrocketranch.com/
https://steamcommunity.com/sharedfiles/filedetails/?id=1254376660