SEA WORLD - Underwater

My final project is called SeaWorld, an app that tries to give you an underwater AR experience. The inspiration comes from the underwater environment, which is thought to be the place where life on Earth originated and which remains the natural habitat of most living organisms. One of those organisms is the sea jelly, better known as the jellyfish.


INSTRUCTIONS: I don’t have any UI with instructions on how to use the app yet, but basically, if you look around you’ll see the jellyfish; you can also add sea plants on the floor or add fish shoals.

DEMO

Demo link

This AR project works only in a WebXR-enabled browser.

CODE

My code builds on the WebXR-101 examples model_load_animation, shape-on-click, hit-testing-model, and in_front_camera.
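
As a rough illustration of the hit-testing part, here is a minimal sketch (assuming three.js and the standard WebXR Hit Test API, not my exact project code; the file name assets/seaplant.glb and the variable names are placeholders): tapping the screen places a copy of a loaded glTF model where the hit-test ray meets the detected floor.

```js
import * as THREE from 'three';
import { ARButton } from 'three/examples/jsm/webxr/ARButton.js';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, innerWidth / innerHeight, 0.01, 20);
const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.setSize(innerWidth, innerHeight);
renderer.xr.enabled = true;
document.body.appendChild(renderer.domElement);
document.body.appendChild(ARButton.createButton(renderer, { requiredFeatures: ['hit-test'] }));
scene.add(new THREE.HemisphereLight(0xffffff, 0x445566, 1));

let plantModel = null;                       // loaded once, cloned on every tap
new GLTFLoader().load('assets/seaplant.glb', (gltf) => { plantModel = gltf.scene; });

let hitTestSource = null;
renderer.xr.addEventListener('sessionstart', async () => {
  const session = renderer.xr.getSession();
  const viewerSpace = await session.requestReferenceSpace('viewer');
  hitTestSource = await session.requestHitTestSource({ space: viewerSpace });
});

// A screen tap ("select") means: place a plant at the next hit-test result.
let placeRequested = false;
renderer.xr.getController(0).addEventListener('select', () => { placeRequested = true; });

renderer.setAnimationLoop((time, frame) => {
  if (frame && hitTestSource && placeRequested && plantModel) {
    const hits = frame.getHitTestResults(hitTestSource);
    if (hits.length > 0) {
      const pose = hits[0].getPose(renderer.xr.getReferenceSpace());
      const plant = plantModel.clone();
      plant.position.copy(pose.transform.position);   // drop the plant on the detected floor
      scene.add(plant);
      placeRequested = false;
    }
  }
  renderer.render(scene, camera);
});
```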

Link for project: Glitch

PROBLEMS

  • The .gltf and .bin files had to live together in a project folder; dragging them directly into Glitch put them into the Glitch assets folder, and I couldn’t use that URL inside the code. Uploading them to Glitch by importing them from a GitHub repository instead caused trouble with these very large glTF files: I kept getting a terminal error saying the project I wanted to push was too large, but I used as many of them as I could.

  • The other problem was that I wanted a blue sea environment, which I couldn’t apply directly to the camera view. So I finally tried putting a blue semi-transparent cube very close to the camera, but then my models behind it weren’t appearing (see the sketch after this list).

    Link for this version: Glitch
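
For reference, here is a rough sketch (assuming three.js; the color, opacity, and distance values are made up, and `camera`/`scene` stand in for the project’s own objects) of one way to keep the blue tint while still showing the models behind it: parent a semi-transparent plane to the camera and disable depth writes, so the overlay no longer occludes the geometry drawn behind it.

```js
import * as THREE from 'three';

const waterTint = new THREE.Mesh(
  new THREE.PlaneGeometry(2, 2),
  new THREE.MeshBasicMaterial({
    color: 0x1565c0,
    transparent: true,
    opacity: 0.35,
    depthWrite: false,   // don't block the models behind the plane
    depthTest: false,    // always draw the tint regardless of depth
  })
);
waterTint.renderOrder = 999;           // render last so sorting can't hide the models
waterTint.position.set(0, 0, -0.3);    // 30 cm in front of the camera
camera.add(waterTint);
scene.add(camera);                     // the camera must be in the scene for its children to render
```

Another option would be scene fog, which tints everything by distance without an extra mesh in front of the camera.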

FUTURE

My future plan for this project is to add Resonance Audio: when you go under the water I want you to hear an underwater sound, and when you come back out, a normal environment sound. I came up with this idea after seeing a water wave model. I’d also like to add the name of each jellyfish on top of it, because all of these jellyfish have their own names, and learning more about them while creating this project was quite fascinating. They are beautiful.
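
A very rough sketch of how the underwater/above-water switch could work, using three.js’s built-in audio as a simple stand-in for Resonance Audio (the water level, file names, and variable names are all hypothetical):

```js
import * as THREE from 'three';

const listener = new THREE.AudioListener();
camera.add(listener);

const underwater = new THREE.Audio(listener);
const ambient = new THREE.Audio(listener);
const audioLoader = new THREE.AudioLoader();

// 'underwater.mp3' and 'ambient.mp3' are placeholder file names.
// Remember to call .play() on both after a user gesture (e.g. AR session start).
audioLoader.load('assets/underwater.mp3', (b) => { underwater.setBuffer(b); underwater.setLoop(true); });
audioLoader.load('assets/ambient.mp3', (b) => { ambient.setBuffer(b); ambient.setLoop(true); });

const WATER_LEVEL = 0;  // assumed y-height of the water surface

// Call this from the render loop: fade to the underwater sound below the surface.
function updateAudio() {
  const below = camera.position.y < WATER_LEVEL;
  underwater.setVolume(below ? 1 : 0);
  ambient.setVolume(below ? 0 : 1);
}
```

The real version would use Resonance Audio’s room and occlusion features rather than a plain volume switch.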

Feedback on Medium Fidelity Prototype

AR MENU APP POSTERS

Poster - option1

Poster - option2

Poster - option3


AR MENU APP FEEDBACK

The overall feedback I got from my users about the experience was very positive. They said they found the idea very useful. Customers depend on menus to help them evaluate, compare, and choose dishes. Space considerations prevent restaurants from adding too much information or too many pictures to their menus. As a result, restaurants rely heavily on reviews, word of mouth, and in-house staff to supplement food menus, attract customers, and sell dishes.

“Of course, this app works for both sides. It provides easy access to essential information about dishes to improve the food selection process for customers and helps restaurants sell more items.”

“Yes, it would be interesting to see what the food looks like before you order.”

The interactions weren’t the same in all of my prototypes. Before creating the final output I have to make the user interface of my app more precise. In the medium-fidelity prototype I used 3D text that appears where the user scans over the food, whereas in the high-fidelity prototype the buttons appear on the side of the menu. This makes it hard for the camera to capture a whole regular-sized menu (approx. A4 paper), so the users had a hard time seeing the animations tied to the buttons. They had to hold the phone closer to themselves in order to see the whole menu at full scale. Usability is something I need to fix; distance was an issue in my user testing.

In the prototypes I’ve made so far, I’ve focused on the specific interactions where users can see the options to press (a button for the 3D visualization of the meal, a recipe button, etc.), but if I want to apply them to the latest version of the product, I have to create these buttons for every meal on the menu, so there should be an indicator next to each meal that the user can select.

For the final prototype, I used Apple Reality Composer with the menu as the image target. Whenever the user pressed the “See” button, the 3D visualization of the meal appeared on top of the menu, which should be lying on the table. I must have made a mistake while creating the prototype in Reality Composer, because a semi-transparent version of my OBJ model was appearing before the camera detected a horizontal plane. This caused confusion among my users.

 