Experiential Design / Task 1
Experiential Design - Task 1 : Trending Experience
WEEK 1: INTRODUCTION TO EXPERIENTIAL DESIGN
In the first week of Experiential Design, Mr. Razif introduced us to Augmented Reality (AR) as a whole. We were shown applications of AR in everyday life; one example Mr. Razif showed us was a feature from Google where you can place 3D models of animals in the real world through your phone. After that, we were also shown past works from previous students as a basis or inspiration for our future projects.
Image 1.1 - Examples of Google AR
-----------------------------------------------------------------------------------------------------------------------------
EXERCISE 1 - Imagine the scenario in either of the two places. What would the AR experience be and what extended visualization can be useful?
-----------------------------------------------------------------------------------------------------------------------------
Mr. Razif said that we could imagine ourselves in any place we want for this exercise. So, I decided to imagine myself in the experimental theatre of Taylor's University, where plays and performances are usually held.
The AR application would include features such as 3D human models and props that can be used to plan the blocking and composition of the actors in scenes of the play/show.
With the help of this application, stage managers and production crews would be greatly assisted, and productions would run much more smoothly. The application would also let actors take breaks during the blocking stage of production, when they would normally have to stand in place for long periods of time.
Image 1.2 - Rough concept of the AR Scene Blocking app
___________________________________________________________________________________
WEEK 2: JOURNEY MAP
In the second week, Mr. Razif explained more terminology of experiential design, covering fields and disciplines such as UX (User Experience), VCD (Visual Communication Design), PD (Product Design), etc.
Mr. Razif also explained the mapping techniques used in the process of Experiential Design. These include:
- User Mapping
- Empathy Map
- Journey Map
A. Journey Map
In this week's lecture, we delved deeper into the Journey Map. A journey map shows each step/process of an activity, and the pros, cons, and a possible solution for each step in the journey are written down. We were made to group up during this lecture and create our own journey maps; below is the journey map my team decided to create.
Image 2.1 - Journey Map
Our group decided to make a journey map of a theme park, where there are 10 steps in the journey of going and experiencing a theme park.
B. Introduction to Unity
Following the group presentations, Mr. Razif introduced us to the Unity engine.
All the installations relating to Unity had been done prior to the practical, so I will just document what Mr. Razif taught us during the practical session.
The first order of business is to choose a template; to follow what Mr. Razif did, I'll just use the 3D template.
Image 2.2
Downloading Vuforia
DISCLAIMER: You need to download Vuforia for Unity and get a license key to access more AR libraries.
Double-click the downloaded file to import the Vuforia package into the Unity project, then click update.
You'll now see the Vuforia Engine option when right-clicking in the Hierarchy tab. Now add an ARCamera from the Vuforia Engine dropdown.
Follow these exact steps for the license key:
- Select ARCamera, then in the tab on the right, look for the [Open Vuforia Engine Configuration] button
- Beside "License Key", press "Add License"
- On the website, press "Get Basic" and create your license
- After creating your license, click the license you've created and press the license key to copy it
- Going back to Unity, paste the license key into the empty box beside "App License Key"
After doing all of that, right click on the hierarchy tab and add "Image Target" from the Vuforia Engine drop-down. Then do the steps below to get your AR app to work when scanning an image of your choice.
- Go to the Vuforia website, go to the "Target Manager" page, and generate a database if you haven't already
- Click on the database you just generated, then click [Add Target] and choose what type of target you're scanning. For this example, choose image and click [Choose File], then pick an image of your choice (make sure the image you choose is simple and easy to detect). Do not forget to type 1 for your width.
- Download the database as a Unity Editor package, then double-click on the downloaded file to import it.
Making Object Appear on Scanned Images
Now, let's make an object appear when we scan the image.
- Create a cube object on top of the image and settle on its size (you can click the chain icon beside Scale to keep X, Y, and Z uniform)
- Now press the "play" button on top to start the app and show the image to the camera to summon THE CUBE.
Making Videos Appear on Scanned Images
What if we want to make videos appear when we scan an image? Let's follow the steps below to do so!
- Drag the video you want to appear into the Assets (Project) tab for easy access later on
- Click on "Image Target", and in the panel on the right, look for both [On Target Found] and [On Target Lost] and press the "+" button for each of them. After doing so, drag "Plane" from the Hierarchy panel into the box that says [None] below [Runtime] under both [On Target Found] and [On Target Lost]. Choose the VideoPlayer function and set "Play()" and "Stop()" respectively.
- Press the play button again and show your image to the camera; now the video plays.
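For my own reference, the same behaviour could also be done with a small script instead of choosing Play() and Stop() straight from the VideoPlayer in the Inspector. This is just a sketch (the class and method names are my own), hooked up to the same [On Target Found] and [On Target Lost] events:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical helper script: attach it to the Plane, then hook
// OnTargetFound/OnTargetLost to the Image Target's Vuforia events
// in place of the direct VideoPlayer.Play()/Stop() wiring.
public class VideoTargetHandler : MonoBehaviour
{
    [SerializeField] private VideoPlayer videoPlayer; // drag the Plane's VideoPlayer here

    public void OnTargetFound()
    {
        videoPlayer.Play(); // start the video when the image is detected
    }

    public void OnTargetLost()
    {
        videoPlayer.Stop(); // stop it when the image leaves the camera view
    }
}
```

Either way works; the Inspector wiring Mr. Razif showed us is simpler, but a script like this would let you add extra behaviour later (e.g. restarting the video from the beginning).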
CONCLUSION
In conclusion, I have learned the basics of making an AR app this week, specifically a marker-based AR app. I got to learn how to summon a 3D model by scanning an image, and even how to prompt a video to play whenever the image is scanned. An important note from this week's practical is that you have to install an external package for more AR-related features in Unity.
___________________________________________________________________________________
WEEK 3: USER CONTROLS AND UI (Buttons)
On the third week of Experiential Design, we were introduced to creating the user controls and UI of an AR application on Unity.
In this week's practical, we'll be making three different scenes for the AR app, which are:
- Menu Scene
- AR Scene
- Credits Scene
MENU SCENE
The first thing to do is to switch the build platform we are making the AR app for.
After that, we'll have to add a canvas to the scene. The canvas is where all the UI elements and such are placed.
Change the color to pink because it looks better.
Image 3.3
Image 3.4
After adding text and buttons, the menu scene is finally done.
AR SCENE
For the AR scene, I just added a back button, as Mr. Razif instructed, to avoid clutter in the AR view.
CREDITS SCENE
The contents of the credits scene are similar to the menu, with only sample text and a button leading back to the menu screen.
Image 3.6
Making the buttons actually usable
Now we have to make the buttons on each scene usable; to do that, we have to make a Scripts folder with a C# script file in it.
Image 3.7
Double-click on the file to open it in Microsoft Visual Studio and write the following code,
Image 3.8
The two functions below are used to go to the AR and Credits scenes respectively.
Image 3.9
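In case the screenshot is hard to read, the script looks roughly like this. The class name and scene names here are my assumptions; the scene names must match the scenes added in Build Settings, and only gotoARScene() is named in my notes, so the other two functions follow the same pattern:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of the scene-switching script. Each public function is
// assigned to a button's OnClick event in the Inspector.
public class SceneSwitcher : MonoBehaviour
{
    public void gotoARScene()
    {
        SceneManager.LoadScene("ARScene"); // scene must be listed in Build Settings
    }

    public void gotoCreditsScene()
    {
        SceneManager.LoadScene("CreditsScene");
    }

    public void gotoMenuScene()
    {
        SceneManager.LoadScene("MenuScene"); // for the back buttons
    }
}
```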
After saving the file, go back to Unity and select the button you want to assign the function to, as illustrated below. Do the same for the credits button and assign gotoCreditsScene().
Image 3.10
Below is a demonstration of the buttons working now,
Video 3.1
CONCLUSION
In conclusion, what I have learned this week is that creating the menu and UI is pretty easy, though some programming know-how is required. Of course, for the end product you will have to think of prettier graphics for the menus, UI, etc. There is also the coding part: making the buttons work and changing scenes from the menu to the AR application itself and then to the credits (and any other scenes). In the end, if I keep up the practice and learn more about marker-less AR production, I'm confident I will be able to master it.
___________________________________________________________________________________
WEEK 4: MARKERLESS AR EXPERIENCE
In the fourth week of Experiential Design, Mr. Razif gave a brief recap of designing a marker-based AR experience at the beginning of the class. After the recap, Mr. Razif went on to explain to the class how to export the AR application to our phones and use the application there.
Making Objects Appear on Ground Planes
We start off after importing the Vuforia package and putting in the "App License Key", just like before.
- Add the "Plane Finder" and "Ground Plane Stage" in the Hierarchy panel. Both are under the Vuforia Engine tab.
Plane Finder - allows the app to detect the ground
Ground Plane Stage - allows you to put items on the ground
To test the application without using your phone, you can download a Ground Plane image from Vuforia that lets you simulate how the app would work on the ground.
The image below will let the webcam detect it as a ground plane to place your 3D model on.
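For context, the Plane Finder already handles placement for you through its Content Positioning Behaviour, so no code was needed in class; but out of curiosity, a tap-to-place reaction could also be scripted roughly like this (a sketch only; the Vuforia API names here are from the version we installed and may differ between releases):

```csharp
using UnityEngine;
using Vuforia;

// Hypothetical sketch: attached to the Plane Finder, this reacts to a
// successful interactive hit test (a tap on a detected ground plane)
// and positions the stage content at the tapped anchor point.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] private ContentPositioningBehaviour contentPositioning; // on the Plane Finder

    // Hook this up to the Plane Finder's "On Interactive Hit Test" event.
    public void OnInteractiveHitTest(HitTestResult result)
    {
        if (result == null) return; // no ground plane found under the tap
        contentPositioning.PositionContentAtPlaneAnchor(result); // move the Ground Plane Stage there
    }
}
```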
Exporting Application onto your Mobile Device [Android]
- In Build Settings, switch the platform to your mobile device's operating system, then connect your device to your laptop and set the box beside "Run Device" to your device.
- Wait for everything to process and the app will open on your phone instantly. Tap the screen to place blocks if you unchecked "Stage Duplication".
That is all for setting up the application on your Android device. For Apple users, some things might be different, such as having to install other third-party programs to make sure the Unity app is compatible with the device, but since I use a Samsung, I won't have to deal with that hassle.
CONCLUSION
This week's lecture and practical were interesting ones; we finally got to export our application from our laptops to our mobile devices and use it like an actual AR app. In conclusion, to set up a markerless AR app, you have to put a Plane Finder and a Ground Plane Stage in the scene to make the app function. Depending on your mobile device's operating system, the process of making the device compatible with the Unity application may differ in convenience; thankfully I own an Android phone, where Unity works without any hassle or extra third-party applications. All in all, with this, I have learned almost everything I need to make my own app.
Project
We have been asked to think about ideas for our AR project since the beginning of the module, so in the docs below, I have compiled three proposals/ideas for an AR project.
___________________________________________________________________________________
REFLECTION
These past 4 weeks have taught me a lot about Augmented Reality as a piece of technology. I have learned the fundamentals and concepts of Augmented Reality, which enhanced my understanding of it by a mile. Applying various maps, such as the journey map, to visualize processes we do every day without a second thought, to find out which experiences in those processes cause inconvenience, and to think of ways to make those experiences better through AR technology, was terrific.
We were also introduced to the Unity engine, which will be the main program for us to develop the AR applications we will create for various purposes. My understanding of the program has deepened throughout the meticulously explained lectures and practical classes given by Mr. Razif. All in all, these past 4 weeks have been a great introduction to the world of Augmented Reality.


