Task 1: Trending Experiences
Week 1
What is Experiential Design?
After a few wrong answers from the class, Mr Razif explained:
Experiential Design focuses on experiences that deeply engage users across multiple senses and emotions. Under this umbrella lies Experience Design, which includes more specific areas such as User Experience (UX) Design, the practice of shaping how users interact with products, systems, or services.
For our project, we will be applying Experiential Design through mobile AR (Augmented Reality). This form of experiential design allows users to interact with digital elements layered onto their physical environment, making the experience more immersive and memorable.
He also gave us some keywords and tips, such as:
Minimum Viable Product (MVP):
An MVP is a simple version of the product, such as a wireframe or a prototype, created to test the core functionality. It helps in quickly validating ideas without building a full product.
Vertical Slice:
A vertical slice refers to creating a fully functioning small part of a larger project. Instead of doing a basic mock-up of everything, we pick a small feature and complete it in full detail. For example, in a shopping app, a vertical slice could be animating an item being added to a cart, complete with interactions and feedback.
Exploring Interaction with Visuals:
We learned that simple visual interactions — like animations, color changes, and visual feedback — can dramatically improve the user’s experience. The goal is to create a maximum, memorable impact through minimal, focused interactions.
Facing Difficulties and Moving Forward:
After establishing an idea, the next step is to identify potential challenges early. By doing so, we can adjust, improve, and ensure the project is finished within the timeframe (not like my games development disaster...)
AR in Education: Components for Learning
In the context of educational AR experiences, we explored three major components:
- Demonstration: Showing a concept or process visually through AR.
- Practical: Allowing users to actively engage with or manipulate digital elements.
- Gauging Effectiveness: Building features that check whether users have understood or learned from the experience.
Types of AR interactions we can create include:
- Scan and Show: Using a marker (like a printed image) that triggers an AR model or animation.
- Space Creation: Adding virtual elements into real-world environments, both indoors and outdoors, to create new spaces or scenes.
Reflection
At the beginning of the semester, I felt quite nervous about starting this course. Experiential Design sounds exciting, but also a bit intimidating because it combines creativity with technical challenges (also because all of the modules Mr Razif teaches are so difficult TT (not ur fault sir i promise)). However, I was relieved to know that we could form groups for our ongoing project — I’m teaming up with Naura, who has always been my emotional support hehe.
We have started brainstorming together to develop strong ideas. Our goal is for each of us to come up with two ideas over the weekend. Hopefully, by combining our efforts, we can create an amazing project concept that we’re both passionate about.
Week 2
Fig 1. A straightforward diagram showing different fields of design.
In this week’s Experiential Design class, we explored the subtle but significant differences between customer experience (CX) and user experience (UX). While CX focuses on the broader impression a person has of a brand—often influenced by their purchase or service experience—UX dives into the actual interaction with the product or service.
Mr Razif then reminded us how essential it is for companies to align their brand identity with the way they present and design their digital or physical experiences.
Fig 2. A Venn diagram showing the comparisons between UX and XD design.
We also revisited important foundations such as user personas (which we learned from Mr Yusri), which group users by common traits and behaviors, and information architecture, the hierarchy or structure of information in an interface, which is crucial to making the experiences we design intuitive.
A standout part of today’s session was learning about empathy maps—a tool that helps teams gain deeper insight into what users say, think, feel, and do. This tool supports decision-making by aligning team perspectives around real user experiences.
Fig 3. Our class activity User Journey Map.
Journey Mapping Activity: Taylor’s Campus Edition
Our class activity involved creating a user journey map of a typical day on Taylor’s campus. This included the journey to class, finding food or toilets, and post-class relaxation spots like the library or lakeside. I realized how many pain points we subconsciously tolerate—like long elevator waits or crowded student centers. Yet, there are also thoughtful gain points, like AR face recognition or the lakeside’s calming presence.
One big takeaway was how customizable journey maps can be. There’s no fixed format—they can blend with storyboards or be enhanced with emotion icons or mood indicators, depending on what we want to communicate. However, in this case, Mr Razif did mention that if we used emojis or visuals to represent the plus points, pain points, and solutions, it would be more appealing.
Trending Experiences Ideas
I also discussed two of my current ideas with Naura and Mr Razif. One was an AR Home Designer: an affordable AR home interior planning tool that lets users scan and recreate their rooms using LiDAR (or standard depth scanning) and then rearrange furniture virtually using physical QR markers or chip tags. Unfortunately, this idea was already busted because Mr Razif said there's already an existing AR app called IKEA Place. (Clearly I didn't do enough research before, and I guess it's hard to have an original thought in this world...)
Another setback was choosing what to focus on in the app, such as whether to center it on room scanning or furniture placement, since I would need to focus on one and not the other due to the time constraint and the resources I'd have to learn. Lastly, even if I did create something like IKEA Place, what would make my application different from it?
Fig 4. IKEA Place in simulation.
My second idea was a Gym Workout Buddy, an AR experience where a virtual trainer avatar demonstrates gym equipment usage or bodyweight exercises beside the user in real time. The avatar can be selected based on gender or fitness style, and would overlay correct posture/form in the physical space. I thought of this idea because I had visited the gym before on my own, and not only was it full of testosterone (guys.. everywhere..), but the equipment also looked confusing. Even when watching YouTube videos, it was difficult to build the same understanding because I could only see one perspective on the screen.
When I told Naura about this, she consulted our ChatGPT TDS assistant, and it mentioned it would be difficult to portray correct posture due to tedious rigging and other technical issues with the animation. Because of this, I decided to change my idea so that the mascot would demonstrate how to use the machine, but correct posture/form would not be included. Instead, the workout buddy would do floor exercises or anything with simple animations (planks, jumping rope) that would still serve its purpose as a workout buddy.
Mr Razif seemed quite okay with the idea so far, and I went back home to continue researching.
Personal Reflection
I found this activity really enjoyable and insightful. It made me more aware of the small frustrations I’ve gotten used to on campus and how they could be solved. Hearing my groupmates’ different perspectives also reminded me that even within a shared environment, user experience varies greatly. Everyone’s interaction with the same space can differ based on their priorities, habits, or even mobility.
This week's lecture also helped me realize how experience design is about empathy and systems thinking. It’s not just about problem-solving, but it’s about understanding how people move through environments and how subtle improvements can make a big difference.
Also, it's very difficult to have an original idea in 2025.
Here is the future journey map we made:
Fig 5. Our future user journey map for Taylor's students.
Week 3
This week, we were introduced to Extended Reality (XR), which includes Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR). These technologies all aim to enhance user experience, but each does so in unique ways.
Fig 6. The differences between AR, VR, and MR.
Augmented Reality (AR)
Example: IKEA Place
AR overlays virtual content onto the real world. It enhances physical space without replacing it — for example, placing a digital sofa in your living room through your phone.
Mixed Reality (MR)
Example: Smart Home Systems
MR allows users to interact with and manipulate virtual objects in real time, while also respecting real-world physics. It’s like AR but with deeper interaction — virtual switches that respond to touch or voice.
Virtual Reality (VR)
Example: Minecraft VR
VR replaces the real world entirely with a digital environment. It offers full immersion and a strong sense of “presence,” where users feel like they’re truly somewhere else.
Fig 7. A screenshot of the student's artwork in class.
Types of AR Experiences:
Mobile AR (Personal)
The most accessible AR. It involves interacting with digital content through smartphones, e.g., changing object colors or animations as you move the phone. This creates a two-layer interaction between user and environment.
Projection AR (Shared)
AR is projected into the physical world — think museum installations or digital graffiti that several people can see and interact with together.
Head Mounted Displays (HMD)
These include smart glasses or VR headsets that integrate SFX, VFX, touch/voice activation to create more immersive AR experiences.
There are two main approaches to activating AR content:
Marker-Based AR
Requires a visual trigger (like an image or QR code). Common in print or packaging — scan the marker to see content appear.
Marker-less AR
Uses sensors, GPS, or LiDAR to anchor AR content in the environment. This method doesn’t require physical prompts to activate content.
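Since we will be working with Vuforia in class, here is a minimal sketch of how marker-based activation could be scripted, assuming the Vuforia Engine 10+ ObserverBehaviour API and a hypothetical child object called content. In practice, Vuforia's default event handler already shows and hides content for you, so this is only to illustrate the idea.

```csharp
using UnityEngine;
using Vuforia;

// Sketch only: attach to an Image Target and assign the AR content
// (e.g., a 3D model) in the Inspector. The content is shown while the
// marker is tracked and hidden when tracking is lost.
public class MarkerContentToggle : MonoBehaviour
{
    [SerializeField] private GameObject content; // hypothetical child model

    private ObserverBehaviour observer;

    private void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        if (observer != null)
            observer.OnTargetStatusChanged += OnStatusChanged;
    }

    private void OnDestroy()
    {
        if (observer != null)
            observer.OnTargetStatusChanged -= OnStatusChanged;
    }

    private void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // Treat TRACKED and EXTENDED_TRACKED as "marker found".
        bool found = status.Status == Status.TRACKED ||
                     status.Status == Status.EXTENDED_TRACKED;
        content.SetActive(found);
    }
}
```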
We also learned that designing effective AR isn’t just about visuals — it blends UX design, usability, and interaction design. We also touched on:
Design Thinking: Keeping the user at the center of the problem-solving process.
Human-Computer Interaction (HCI): Understanding how users interact with technology to design more intuitive experiences.
Class Activity (I forgot what this one was specifically called)
We decided to propose an AR-powered hair salon mirror that uses facial tracking and 3D modeling to:
- Let customers preview hairstyles and colors in real time, accurately mapped to their face shape and hair type.
- Show live treatment progress updates — like how long until the next wash, or a visual timeline of each stage.
- Include “style limits” overlays, where clients can draw or set boundaries (e.g., desired hair length), and stylists can reference them throughout the session.
This idea combines marker-less AR and real-time visualization, creating a more transparent and reassuring salon experience.
Fig 8. My chosen image to put into the database. 2D, and has contrast.
Unity & Vuforia
Fig 9. Downloading Vuforia and uploading the image into the database (I got 5 stars!)
Fig 10. Yay my cube worked!
Mr Razif then told us to place a 3D model instead of the cube so that it feels like a more realistic experience (do it before you go home!). So I found a free 3D fish on the internet and aligned it with my fish image. This was an easier process than I expected, but I'm still going to be cautious with Unity because it traumatized me a bit too well in Sem 3...
Fig 11. My free 3D fish working :)
Personal Reflection
Week 4
By creating a list of actions on the button's On Click event and referencing the fish model in it, we used the GameObject.SetActive function, which takes a true or false value. Ticking or unticking the box sets the state the button puts the object in, and pressing the button while testing switches it to the opposite. Pressing 'STOP' stops the animation and pressing 'START' starts it again.
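For reference, here is a rough C# sketch of the same button behaviour, assuming placeholder references named fishModel and fishAnimator; each public method would be hooked up to a UI Button's On Click () list, just like we did in the Inspector. I've written the stop/start here as pausing the Animator's speed, which is just one possible way to do it.

```csharp
using UnityEngine;

// Sketch of the button behaviour described above, assuming a fish model
// with an Animator. Hook these methods up to UI Buttons via On Click ().
public class FishControls : MonoBehaviour
{
    [SerializeField] private GameObject fishModel;   // placeholder reference
    [SerializeField] private Animator fishAnimator;  // placeholder reference

    // Show/hide button: flips the model's active state each press,
    // the same true/false toggle as the SetActive checkbox.
    public void ToggleFish()
    {
        fishModel.SetActive(!fishModel.activeSelf);
    }

    // STOP button: freezes the current animation.
    public void StopAnimation()
    {
        fishAnimator.speed = 0f;
    }

    // START button: resumes the animation.
    public void StartAnimation()
    {
        fishAnimator.speed = 1f;
    }
}
```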
Fig 13. Adding Buttons and Controlled Animations
Fig 14. Changing colors when pressed (impacts user experience)
I also had a little bit of a problem when my sphere animation wouldn't play (as you can see in the video), and I suspected it was because I made a previous animation called Sphere_Bounce and changed my mind, deleted it, and made Sphere_Run instead. Because of my previous Unity experience in Games Development, I went to the Animator, and found that the system had tagged Sphere_Bounce as the default animation despite me deleting the file in the Sources folder already. Therefore, I deleted it in the animator and attached Sphere_Run as the default, and my problem was solved.
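If the editor fix had not worked, another option (just a fallback sketch, not what I actually did) would be to force the state from a script, assuming the Animator sits on the same object and the state is named Sphere_Run:

```csharp
using UnityEngine;

// Fallback sketch: explicitly play the intended state on start instead of
// relying on the controller's default state. Assumes the Animator lives on
// the same GameObject and the state is called "Sphere_Run".
public class PlaySphereRun : MonoBehaviour
{
    private void Start()
    {
        GetComponent<Animator>().Play("Sphere_Run");
    }
}
```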
Personal Reflection
3 Potential Ideas
| App | What It Does |
|---|---|
| IKEA Place | Lets users place IKEA furniture in AR to-scale |
| Magicplan / RoomScan Pro | Scans rooms to create floor plans, but not for adding decorative architecture |
| Houzz / Planner 5D | Interior design tools with some AR preview, but complex and brand/product focused |
| Morpholio AR Sketchwalk | For architects to walk through 3D models, not accessible for general users |
- Changing the room’s structure (wainscotting, arches, doors) and not adding furniture.
- RoomSketch is for people who want to visualize renovations or changing their space architecturally.
- Could be stylized with different themes (e.g., “modern,” “vintage,” “cottagecore”)
Software/Technicalities:
- Unity
- AR Foundation (Vuforia can only detect flat surfaces horizontally but not vertically, and I need vertical detection for putting elements on the walls, etc.; see the sketch after this list)
- Figma
- Blender
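To reassure myself that AR Foundation actually covers the wall part, here is a small configuration sketch. It assumes an ARPlaneManager sits on the AR Session Origin in the scene and simply requests vertical plane detection; exact property names may vary between AR Foundation versions.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: request vertical (wall) plane detection from AR Foundation.
// Assumes an ARPlaneManager exists in the scene and is assigned here.
public class WallPlaneSetup : MonoBehaviour
{
    [SerializeField] private ARPlaneManager planeManager;

    private void Start()
    {
        // Detect both walls and floors; RoomSketch mainly needs the walls.
        planeManager.requestedDetectionMode =
            PlaneDetectionMode.Vertical | PlaneDetectionMode.Horizontal;
    }
}
```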
Potential Challenges
- Getting overlays to “stick” on walls convincingly
- Avoiding floating/incorrectly placed elements
- Keeping UI clear for toggling designs
Mockup (with Canva)
This mockup represents an AR tool that lets users preview architectural changes to their room; in this case, a door.
In this scenario, the user scans a chip or QR code they have placed on the wall. Once scanned, a 3D door model appears in correct perspective on the vertical surface, allowing the user to visualize how it fits within their real room dimensions.
Personal Thoughts
Similar Applications and Uses
| App | What It Does |
|---|---|
| Vi Trainer | AI-based real-time feedback on form (not AR, but visual coaching) |
| AR Runner | Uses AR to create movement challenges on the ground—more gamified cardio |
| FitnessAR | Renders running routes in AR for visualizing terrain, not indoor exercises |
| Fiture / Freeletics | Fitness training with videos, not AR overlays |
Differentiation
- Focusing on gym + home bodyweight exercises (because rigging an animation with correct body posture would be difficult).
- Uses an AR trainer avatar performing beside you, not just a video (because our assignment is to create an AR Application TT).
- Has no dependency on AI, making it Unity/Vuforia-feasible (because I have no idea how to implement that).
Software/Technicalities:
- Unity
- Vuforia Ground Plane
- Figma
- Blender
Potential Challenges
- Avoiding avatar clipping or incorrect scale
- Matching animation timing with exercise logic
- Creating a visually clear and motivational interface
Mockups (with Canva)
Fig 18. Workout Buddy mockup (Floor Exercise)
This screen shows the user doing a plank exercise with an AR avatar alongside them, offering encouragement through a speech bubble or voice support. A real-time countdown timer runs at the top.
- Users can restart or skip the current routine using clear buttons.
- The AR scene is grounded using the Vuforia plane for added realism.
- The interface displays exercise stats: set count, rep duration, and weight.
- A “Read More” button leads to detailed form breakdowns in a written format with pictures
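To convince myself the countdown timer part is doable, here is a tiny Unity sketch of how it might run, assuming a placeholder UI Text label called timerLabel and a fixed duration set in the Inspector. The real version would hook into the restart and skip buttons.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Sketch of the on-screen countdown for a timed exercise (e.g., a plank).
// Assumes a UI Text label and a duration in seconds set in the Inspector.
public class ExerciseCountdown : MonoBehaviour
{
    [SerializeField] private Text timerLabel;         // placeholder UI reference
    [SerializeField] private float durationSeconds = 60f;

    private Coroutine running;

    // Hook to a Start/Restart button.
    public void StartCountdown()
    {
        if (running != null) StopCoroutine(running);
        running = StartCoroutine(CountDown());
    }

    private IEnumerator CountDown()
    {
        float remaining = durationSeconds;
        while (remaining > 0f)
        {
            timerLabel.text = Mathf.CeilToInt(remaining).ToString();
            remaining -= Time.deltaTime;
            yield return null; // update once per frame
        }
        timerLabel.text = "Done!";
    }
}
```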
Personal Thoughts
| App | What It Does |
|---|---|
| Wanna Kicks / AR Shoe apps | Show specific product overlays on foot (e.g., Nike, Adidas) |
| Snapchat / Instagram filters | Some cultural costume filters, but they’re cartoonish or novelty-based |
| Zara AR app | Shows models walking on products in-store (promo-based, not personal try-on) |
| YouCam Makeup | For makeup try-on, not full outfits |
| Zero10 AR Fashion | Experimental AR clothing try-on (fashion-tech runway level) |
Difference:
- Focused on identity exploration for mixed-race users
- Offers cultural education + reflection prompts, not just dress-up
- Not commercial—designed for self-discovery and empathy
Challenges
- Getting overlays to align correctly on the body
- Managing cultural sensitivity (presentation and respect)
- Front camera AR can be harder (consider mirror target as fallback)
Mockup (with Canva)
Fig 20. StyleSwap UI Try On (Chinese Clothing)
This mockup shows how users can try on traditional cultural attire using AR. The outfit in this example is a traditional Chinese dress, which appears as a layer over the user’s body in the real-time camera view. Informative text bubbles explain the symbolism of the design elements (e.g., the red color and jujube fruit patterns).
- Swipe gesture at the bottom encourages users to explore other cultural styles.
- Icons on the left allow users to switch countries or zoom for more details.
Personal Thoughts
Final Decision
After exploring all three concepts (RoomSketch, Workout Buddy, and StyleSwap AR), I’ve decided to proceed with RoomSketch AR as my final idea.
Overall, I think this is a safe and achievable concept for experimentation, especially with the support of online resources, existing applications, and Mr. Razif’s expertise. I also personally like the idea, having loved interior design since I was younger. By creating an application like this, maybe in the future I'll be able to develop it even further and market it to people who are broke (like me) but love redecorating houses (also me).
While the Workout Buddy idea was interesting on the surface, I felt it lacked the uniqueness I was looking for. It leaned a bit too closely to existing solutions and seemed to involve unnecessary complexity for something I’m not personally passionate about (the last time I went to the gym was 3 years ago). Challenges like rigging a trainer avatar and creating believable animations that genuinely affect the user's experience made it feel more like an overcomplicated assignment than something fun.
For StyleSwap AR, I genuinely loved the idea. It felt the most unique, empathetic, and culturally meaningful, particularly for people with mixed heritage like my best friend and myself. However, after considering Mr. Razif’s advice, I realized the technical execution would be quite demanding. The use of AR Foundation, accurate body tracking, custom 3D clothing models, and cultural research made it too ambitious for this semester, especially since I'm still building confidence in Blender. It’s an idea I would love to revisit one day, but for now, RoomSketch offers the best balance of feasibility, creativity, personal connection, and interactiveness for users.