Final Project: ARchify Application
DELIVERABLES:
Google Drive to Folders + Walkthrough Video
WALKTHROUGH
TASK DELEGATION:
Most of this development happened well before we managed to debug our AR Scene, which previously gave us a constant error of "development console exception in callback: failed to create ImageTargetObserver: DATABASE_LOAD_ERROR" whenever we built to the phone, so most screenshots and testing were done on the Windows platform.
This error meant Vuforia was having trouble loading its database, and any attempt to even scan our ImageTarget did not work. (Surprisingly, almost no one else on the internet seems to have hit this, except for a few who never got solid answers as to why it happens, and the only one who managed to solve his issue was this guy, who didn't help me...)
Move, Rotation, & Scale, Reset
Considering our absolute failure in the previous Prototype stage, Naura and I were pretty stumped on how to move forward. Deciding that we would just stick to horizontal planes, we switched back to Vuforia, where we implemented Image Tracking to mark where our model would spawn (as per the proposal idea).
With the help of Renee, we were able to apply the same method she had for her AR app where she could Move, Rotate, and Scale her model.
Fig 1. A screenshot of part of the Draggable code
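The Draggable idea in Fig 1 can be sketched roughly like this. This is a hypothetical reconstruction, not our exact script; the class name matches, but the speeds, clamps, and input bindings here are illustrative assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of a Draggable component: drag to move,
// right mouse button to rotate, scroll wheel to scale.
public class Draggable : MonoBehaviour
{
    public float rotateSpeed = 100f;   // illustrative value
    public float scaleSpeed = 0.1f;    // illustrative value

    void OnMouseDrag()
    {
        // Follow the cursor: keep the model's current depth and
        // convert the mouse position back into world space.
        Vector3 screenPos = Input.mousePosition;
        screenPos.z = Camera.main.WorldToScreenPoint(transform.position).z;
        transform.position = Camera.main.ScreenToWorldPoint(screenPos);
    }

    void Update()
    {
        // Rotate around Y while the right mouse button is held.
        if (Input.GetMouseButton(1))
            transform.Rotate(Vector3.up,
                -Input.GetAxis("Mouse X") * rotateSpeed * Time.deltaTime);

        // Scale with the scroll wheel, clamped so the model
        // can never shrink away or blow up.
        float scroll = Input.GetAxis("Mouse ScrollWheel");
        if (scroll != 0f)
        {
            float s = Mathf.Clamp(transform.localScale.x + scroll * scaleSpeed, 0.05f, 3f);
            transform.localScale = Vector3.one * s;
        }
    }
}
```

On a phone the same idea maps to one-finger drag and pinch gestures instead of the mouse axes.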
After implementing this code, I also added a feature called TargetTracker, where the UI only appears IF the image is tracked. This way, the user knows they are supposed to scan the room and place the image target first, before accessing the models as instructed in our Tutorial Page.
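A minimal sketch of the TargetTracker behaviour, assuming Vuforia 10's ObserverBehaviour event API; the field names and which statuses count as "tracked" are my assumptions, not taken from our actual script.

```csharp
using UnityEngine;
using Vuforia;

// Hypothetical TargetTracker sketch: show the catalog UI only
// while the Image Target is actually being tracked.
public class TargetTracker : MonoBehaviour
{
    public ObserverBehaviour imageTarget;  // the Image Target in the scene
    public GameObject catalogUI;           // UI hidden until tracking starts

    void Start()
    {
        catalogUI.SetActive(false);
        imageTarget.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // TRACKED / EXTENDED_TRACKED both mean the target is usable.
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        catalogUI.SetActive(tracked);
    }
}
```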
To make the app more interactive for users, I implemented a ModelSpawner script on an empty GameObject named ModelManager. Another empty GameObject named SpawnPoint, at coordinates (0, 0, 0) (the center of the ImageTarget), lets the model spawn exactly where we want it. This solved our problem from the previous MVP prototype, where our models would spawn in random places at random, inconsistent scales.

Next, I created a ModelManager script to handle the models on screen once they are spawned. I linked it to the Reset button's On Click, so that if the user wants to, they can Reset the room (clearing all of the models using the Destroy(model) function).
Fig 2. AR Tutorial Page (on startup, with the other UI hidden)
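The ModelSpawner / Reset idea above can be sketched as a single manager script. This is a guess at the structure, under the assumptions described in the text (a SpawnPoint at the target's centre, Destroy(model) on Reset); the exact method names are hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical ModelManager sketch: spawn each catalog model at the
// SpawnPoint and keep a list so Reset can Destroy() everything at once.
public class ModelManager : MonoBehaviour
{
    public Transform spawnPoint;  // empty GameObject at (0, 0, 0) on the ImageTarget
    private readonly List<GameObject> spawned = new List<GameObject>();

    // Hooked up to each catalog button's On Click, passing that button's prefab.
    public void SpawnModel(GameObject prefab)
    {
        GameObject model = Instantiate(prefab, spawnPoint.position,
                                       spawnPoint.rotation, spawnPoint);
        model.transform.localScale = Vector3.one;  // consistent scale every spawn
        spawned.Add(model);
    }

    // Hooked up to the Reset button's On Click: clear the whole room.
    public void ResetRoom()
    {
        foreach (GameObject model in spawned)
            Destroy(model);
        spawned.Clear();
    }
}
```

Parenting each model under the SpawnPoint is what keeps positions and scales consistent relative to the ImageTarget.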
AR Catalog UI
We wanted to follow the Figma UI we created in the previous assignment, so I built the Tutorial Page (using a Panel) and wrote two scripts, CatalogToggle and CatalogCategoryManager. In CatalogToggle, clicking the slider button toggles the panel and expands it to the height we wanted, and in Update() it animates back to the untoggled version when closed.
Fig 3. Catalog Manager Inspector

Fig 4. Part of the code (most crucial!!)
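The CatalogToggle behaviour could look something like the sketch below. The heights and easing speed are made-up placeholder values; only the toggle-then-animate-in-Update structure comes from the description above.

```csharp
using UnityEngine;

// Hypothetical CatalogToggle sketch: the slider button toggles the
// state, and Update() eases the panel height toward the active state.
public class CatalogToggle : MonoBehaviour
{
    public RectTransform panel;
    public float collapsedHeight = 80f;   // illustrative
    public float expandedHeight = 420f;   // illustrative
    public float speed = 8f;

    private bool expanded;

    // Hooked up to the slider button's On Click.
    public void Toggle() => expanded = !expanded;

    void Update()
    {
        // Smoothly move the panel height toward the current target.
        float target = expanded ? expandedHeight : collapsedHeight;
        Vector2 size = panel.sizeDelta;
        size.y = Mathf.Lerp(size.y, target, speed * Time.deltaTime);
        panel.sizeDelta = size;
    }
}
```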
Models
Individual Control
Next, I modified the Draggable code, since previously you could only add one model each time you clicked the button. It took me a day or two to figure out, but eventually I updated how the models were instantiated. Originally, the code replaced the previous model every time a button was clicked, which meant only one model could exist at a time.
Next, I wanted individual control over each model. While I had multiple models in one scene, somehow they were all controlled by the same gestures at the same time.
Fig 6. Evidence of a Confused Natania
That's when I realized that before this, I could click anywhere on the screen and still control the model. Now that I had multiple models, that obviously wouldn't work anymore. By making the script detect input only when the model itself is clicked, via a Raycast against its Box Collider, I was able to control each one individually. Now it works!! So that's great (but I don't think it ended up showing in our final.. for some reason TT)
Fig 7. Playing around with the AR (making a wall of Partitions spawn with Individual Control, STILL on my laptop because we were still figuring out the build error)
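The raycast-selection fix can be sketched like this. It is an assumed reconstruction: the static "selected" field and the rotate gesture shown are illustrative, but the core check (only respond when the ray hits this model's own Box Collider) is exactly the idea described above.

```csharp
using UnityEngine;

// Hypothetical individual-control sketch: a model only becomes active
// when a ray through the pointer hits THIS object's Box Collider.
public class SelectableModel : MonoBehaviour
{
    private static SelectableModel selected;  // model currently picked

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            // Only select if the hit landed on this object's own collider.
            if (Physics.Raycast(ray, out RaycastHit hit) &&
                hit.collider.gameObject == gameObject)
            {
                selected = this;
            }
        }

        // Gestures apply only to the selected model (rotation as an example).
        if (selected == this && Input.GetMouseButton(1))
            transform.Rotate(Vector3.up,
                -Input.GetAxis("Mouse X") * 100f * Time.deltaTime);
    }
}
```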
Fig 8. Evidence of our wonderful teamwork
Collaboration + Solution
Reflection
We did get the occasional error where, on Naura's side, a file would be missing or something would be there that wasn't supposed to be. It even prevented her from opening the Unity file at all, but luckily I could delete it from my side using Version Control, and she could open the file again. At times like these, it's crucial to have a reliable and close teammate, because we show up for each other when the other can't, and we are each other's moral support (she is more of mine, to be honest HAHAHAHA). Without her I would be struggling so much overall; she really is the key ingredient that makes our app so high quality, with all its interactive features, visuals, and models (SHE DID ALL 24 MODELS?? Honestly, her Blender skills are so much better than mine).
Through this project, I learned a lot about the real struggles that developers and coders go through — how even the tiniest difference in how you build your app can determine whether it runs or crashes. For example, changing the Input Package Manager to "Both" worked for me during testing, even though in previous classes it was recommended to choose only one. It just goes to show that there is no one-size-fits-all solution when it comes to development.
I also realized how important it is to keep up with the most recent versions of software and plugins. If even one version is out of sync (like the Vuforia or Unity packages), things might not work at all, or worse, crash your project halfway through.
Most importantly, I learned that there are a thousand ways to reach the same goal, and just because one method didn’t work for someone else, doesn’t mean it won’t work for you. You just have to keep trying, keep tweaking, and be ready to problem-solve your way through it.
In fact, we tried around 4–5 different approaches before we even got our final app working:
- AR Foundation
- AR Magic Bar Lite (from our MVP proposal)
- Lean Touch Toolkit
- Manual screen-based prefab placement
- And finally, Vuforia !!! which was the one that actually allowed us to do everything we wanted.
Here's a rundown of everything we managed to achieve in the AR Scene:
- Marker-based model placement using Vuforia Image Targets.
- Multiple models can be instantiated from the UI catalog.
- Each model is interactable through individual control scripts.
- Drag (move) functionality using mouse or touch.
- Rotate (mouse right-click or pinch gesture).
- Scale (mouse scroll or pinch zoom).
- Tap-to-delete function with raycast verification.
- Reset button to remove all placed models at once.
- UI catalog with expandable/collapsible animation.
- Category-based filtering for catalog panels.
- Sound effects integrated for toggle and reset actions.
- Clean and modular script organization for maintainability.
Honestly, this journey has been chaotic, tiring, and technical but also one of the most satisfying and collaborative experiences I’ve had so far because of the amazing partner I had and our vision on the project together. The fact that our AR app works, runs smoothly, and has such strong interaction features is something we’re genuinely proud of.
LOL the ending demo video failed