Weekly Posts/Reflections
Week 1
What is Experiential Design?
After a few wrong answers from the class, Mr Razif explained:
Experiential Design focuses on experiences that deeply engage users across various senses and emotions. Under this umbrella lies Experience Design, which includes more specific areas such as User Experience (UX) Design, the practice of shaping how users interact with products, systems, or services.
For our project, we will be applying Experiential Design through mobile AR (Augmented Reality). This form of experiential design allows users to interact with digital elements layered onto their physical environment, making the experience more immersive and memorable.
He also gave us some keywords and tips, such as:
Minimum Viable Product (MVP):
An MVP is a simple version of the product, such as a wireframe or a prototype, created to test the core functionality. It helps in quickly validating ideas without building a full product.
Vertical Slice:
A vertical slice refers to creating a fully functioning small part of a larger project. Instead of doing a basic mock-up of everything, we pick a small feature and complete it in full detail. For example, in a shopping app, a vertical slice could be animating an item being added to a cart, complete with interactions and feedback.
Exploring Interaction with Visuals:
We learned that simple visual interactions — like animations, color changes, and visual feedback — can dramatically improve the user’s experience. The goal is to create a maximum, memorable impact through minimal, focused interactions.
Facing Difficulties and Moving Forward:
After establishing an idea, the next step is to identify potential challenges early. By doing so, we can adjust, improve, and ensure the project is finished within the timeframe (not like my games development disaster...)
AR in Education: Components for Learning
In the context of educational AR experiences, we explored three major components:
Demonstration: Showing a concept or process visually through AR.
Practical: Allowing users to actively engage with or manipulate digital elements.
Gauging Effectiveness: Building features that check whether users have understood or learned from the experience.
Types of AR interactions we can create include:
Scan and Show: Using a marker (like a printed image) that triggers an AR model or animation.
Space Creation: Adding virtual elements into real-world environments, both indoors and outdoors, to create new spaces or scenes.
Reflection
At the beginning of the semester, I felt quite nervous about starting this course. Experiential Design sounds exciting, but also a bit intimidating because it combines creativity with technical challenges (also because all of the modules Mr Razif teaches are so difficult TT (not ur fault sir i promise)). However, I was relieved to know that we could form groups for our ongoing project — I’m teaming up with Naura, who has always been my emotional support hehe.
We have started brainstorming together to develop strong ideas. Our goal is for each of us to come up with two ideas over the weekend. Hopefully, by combining our efforts, we can create an amazing project concept that we’re both passionate about.
Week 2
![]()
Fig 1. A straightforward diagram showing different fields of design.
In this week’s Experiential Design class, we explored the subtle but significant differences between customer experience (CX) and user experience (UX). While CX focuses on the broader impression a person has of a brand—often influenced by their purchase or service experience—UX dives into the actual interaction with the product or service.
Mr Razif then reminded us how essential it is for companies to align their brand identity with the way they present and design their digital or physical experiences.
![]()
Fig 2. A Venn diagram showing the comparisons between UX and XD design.
We also revisited important foundations such as user personas (which we learned from Mr Yusri), which group users by common traits and behaviors, and information architecture, the hierarchy or structure of information in an interface, which is crucial to making the experiences we design intuitive.
A standout part of today’s session was learning about empathy maps—a tool that helps teams gain deeper insight into what users say, think, feel, and do. This tool supports decision-making by aligning team perspectives around real user experiences.
![]()
Fig 3. Our class activity User Journey Map.
Journey Mapping Activity: Taylor’s Campus Edition
Our class activity involved creating a user journey map of a typical day on Taylor’s campus. This included the journey to class, finding food or toilets, and post-class relaxation spots like the library or lakeside. I realized how many pain points we subconsciously tolerate—like long elevator waits or crowded student centers. Yet, there are also thoughtful gain points, like AR face recognition or the lakeside’s calming presence.
One big takeaway was how customizable journey maps can be. There’s no fixed format—they can blend with storyboards or be enhanced with emotion icons or mood indicators, depending on what we want to communicate. However, in this case, Mr Razif did mention that if we used emojis or visuals to represent the plus points, pain points, and solutions, it'd be more appealing.
Trending Experiences Ideas
I also discussed two of my current ideas with Naura and Mr Razif. One was an AR Home Designer, where I wanted an affordable AR home interior planning tool that lets users scan and recreate their rooms using LiDAR (or standard depth scanning) and then rearrange furniture virtually using physical QR markers or chip tags. Unfortunately, this idea was already busted because Mr Razif said there's already an existing AR app called IKEA Place. (Clearly I didn't do enough research before, and I guess it's hard to have an original thought in this world..)
Another setback would be choosing what to focus on in the app - such as whether I wanted to make it about room scanning or furniture placement, as I would need to focus on one and not the other due to time constraints and the resources I would need to learn. Lastly, even if I did create something like IKEA Place, what would make my application different from theirs?
![]()
Fig 4. IKEA Place in simulation.
My second idea was a Gym Workout Buddy, an AR experience where a virtual trainer avatar demonstrates gym equipment usage or bodyweight exercises beside the user in real time. The avatar could be selected based on gender or fitness style, and would overlay correct posture/form in the physical space. I thought of this idea because I had visited the gym before on my own, and not only was it full of testosterone (guys.. everywhere..), but the equipment also looked confusing. Even when watching YouTube videos, it was difficult to translate the same understanding because I could only see one perspective on the screen.
When I told Naura about this, she consulted our ChatGPT TDS assistant, and it mentioned it would be difficult to portray correct posture due to tedious rigging and other technical issues with the animation. Because of this, I decided to change my idea so that the mascot would demonstrate how to use the machine, but correct posture/form would not be included. Instead, the workout buddy would do floor exercises or anything with simple animations (planks, jumping rope) that still serve its purpose as a workout buddy.
Mr Razif seemed quite okay with the idea so far, and I went back home to continue researching.
Personal Reflection
I found this activity really enjoyable and insightful. It made me more aware of the small frustrations I’ve gotten used to on campus and how they could be solved. Hearing my groupmates’ different perspectives also reminded me that even within a shared environment, user experience varies greatly. Everyone’s interaction with the same space can differ based on their priorities, habits, or even mobility.
This week's lecture also helped me realize how experience design is about empathy and systems thinking. It’s not just about problem-solving; it’s about understanding how people move through environments and how subtle improvements can make a big difference.
Also, it's very difficult to have an original idea in 2025.
Below is the future user journey map that we made:
![]()
Fig 5. Our future user journey map for Taylor's students.
Week 3
This week, we were introduced to Extended Reality (XR), which includes Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR). These technologies all aim to enhance user experience, but each does so in unique ways.
![]()
Fig 6. The differences between AR, VR, and MR.
Augmented Reality (AR)
Example: IKEA Place
AR overlays virtual content onto the real world. It enhances physical space without replacing it — for example, placing a digital sofa in your living room through your phone.

Mixed Reality (MR)
Example: Smart Home Systems
MR allows users to interact with and manipulate virtual objects in real time, while also respecting real-world physics. It’s like AR but with deeper interaction — virtual switches that respond to touch or voice.

Virtual Reality (VR)
Example: Minecraft VR
VR replaces the real world entirely with a digital environment. It offers full immersion and a strong sense of “presence,” where users feel like they’re truly somewhere else.
![]()
Fig 7. A screenshot of the student's artwork in class.
Types of AR Experiences:
Mobile AR (Personal)
The most accessible AR. It involves interacting with digital content through smartphones, e.g., changing object colors or animations as you move the phone. This creates a two-layer interaction between user and environment.
Projection AR (Shared)
AR is projected into the physical world — think museum installations or digital graffiti that several people can see and interact with together.
Head Mounted Displays (HMD)
These include smart glasses or VR headsets that integrate SFX, VFX, touch/voice activation to create more immersive AR experiences.
There are two main approaches to activating AR content:
Marker-Based AR
Requires a visual trigger (like an image or QR code). Common in print or packaging — scan the marker to see content appear.
Marker-less AR
Uses sensors, GPS, or LiDAR to anchor AR content in the environment. This method doesn’t require physical prompts to activate content.
We also learned that designing effective AR isn’t just about visuals — it blends UX Design, usability, and interaction design. We also touched on:
Design Thinking: Keeping the user at the center of the problem-solving process.
Human-Computer Interaction (HCI): Understanding how users interact with technology to design more intuitive experiences.
Class Activity (I forgot what this one was specifically called)
We decided to propose an AR-powered hair salon mirror that uses facial tracking and 3D modeling to:
Let customers preview hairstyles and colors in real time, accurately mapped to their face shape and hair type.
Show live treatment progress updates — like how long until the next wash, or a visual timeline of each stage.
Include “style limits” overlays, where clients can draw or set boundaries (e.g., desired hair length), and stylists can reference them throughout the session.
This idea combines marker-less AR and real-time visualization, creating a more transparent and reassuring salon experience.
![]()
Fig 8. My chosen image to put into the database. 2D, and has contrast.
Unity & Vuforia
![]()
Fig 9. Downloading Vuforia and uploading the image into the database (I got 5 stars!)
![]()
Fig 10. Yay my cube worked!
Mr Razif then told us to place a 3D model instead of the cube so that it feels like a more realistic experience (do it before you go home!). So I found a free 3D fish on the internet and aligned it with my fish image. This was an easier process than I expected, but I'm still going to be cautious with Unity because it traumatized me a bit too well in Sem 3...
![]()
Fig 11. My free 3D fish working :)
Personal Reflection
Week 4
This week we added UI buttons that control our animation. By creating a list of actions on the button (its OnClick event) and dragging the fish model into it, we used GameObject.SetActive, which takes a true or false value. Ticking or unticking the checkbox sets the state the button will switch the object to, and when testing, pressing the button flips it to the opposite state. Pressing 'STOP' will stop the animation and pressing 'START' will start it again.
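For reference, here is a rough sketch of how the same toggle could be wired up in a script instead of through the Inspector's OnClick list. The class name ToggleTarget and the field names are just placeholders for illustration, not our actual project code.

```csharp
using UnityEngine;
using UnityEngine.UI;

// A minimal sketch: one UI Button switches a target GameObject
// (e.g. the fish model) on and off via SetActive.
public class ToggleTarget : MonoBehaviour
{
    public Button toggleButton;  // the START/STOP button, assigned in the Inspector
    public GameObject target;    // the animated fish model

    void Start()
    {
        // Equivalent of adding an entry to the button's OnClick list.
        toggleButton.onClick.AddListener(Toggle);
    }

    void Toggle()
    {
        // SetActive(true/false) is the same checkbox we tick or untick in the Inspector.
        target.SetActive(!target.activeSelf);
    }
}
```

In our case no script was needed, since the true/false state was set straight from the button's OnClick list.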
![]()
Fig 13. Adding Buttons and Controlled Animations
![]()
Fig 14. Changing colors when pressed (impacts user experience)
I also had a little bit of a problem when my sphere animation wouldn't play (as you can see in the video), and I suspected it was because I had made a previous animation called Sphere_Bounce, changed my mind, deleted it, and made Sphere_Run instead. Because of my previous Unity experience in Games Development, I went to the Animator and found that the system had kept Sphere_Bounce as the default state even though I had already deleted the file in the Sources folder. Therefore, I deleted it in the Animator, set Sphere_Run as the default, and my problem was solved.
Personal Reflection
Week 5
![]()
Fig 17. Making sure our canvas is on top.
Week 6
This week we worked on linking the buttons to different scenes, by adding our scenes in Build Profiles and coding in VS using UnityEngine.SceneManagement.
![]()
Fig 20. Making sure our Scene List is numerically correct as it will determine the start of our application pages.
![]()
Fig 21. Two different codes that access SceneManager
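As a reference for Fig 21, here is a minimal sketch of the two common ways of calling SceneManager.LoadScene, by scene name and by build index. The class name SceneSwitcher and the example values are placeholders, not our exact code.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Both methods can be hooked up to a UI Button's OnClick event.
public class SceneSwitcher : MonoBehaviour
{
    // Option 1: load a scene by the name it has in the Build Profiles scene list.
    public void LoadByName(string sceneName)
    {
        SceneManager.LoadScene(sceneName); // e.g. "MainMenu"
    }

    // Option 2: load a scene by its index in that list.
    public void LoadByIndex(int buildIndex)
    {
        SceneManager.LoadScene(buildIndex); // e.g. 0 for the first page of the app
    }
}
```

Loading by index is why the numerical order of the Scene List in Fig 20 matters: index 0 is whatever scene sits at the top of the list.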
![]()
Fig 22. Additional step that goes hand in hand with the 2nd code option
![]()
Fig 23. I had a problem.