THE FAMOUS DONUT

Here’s mine, made in Blender. It was a pleasant experience; I had so much fun and learned a lot from this project. It’s a great project for learners to get to grips with 3D modeling in Blender. Such a good instructor, @blenderguru.

Viewport - Blender

Sculpting- Blender

Animation- Blender

Nodes- Blender

Modeling - Blender

Meditation Sphere - Clouds

A Loading Screen - Sky environment before the entrance to the Meditation Room.

MEDITATION ROOM - VIDEO AND SOUND PLAYER


Meditation Room Media Player

High-Fidelity Design

Beer Pong

I’ve built a VR Beer Pong game for the Oculus Quest 2.


VR Beer Pong Game

A virtual reality take on beer pong: players throw a ping pong ball across a table, aiming to land it in a cup of beer on the other end.


DEMO

ABOUT THE GAME

Beer pong, also known as Beirut, is a drinking game in which players throw a ping pong ball across a table with the intent of landing the ball in a cup of beer on the other end. The game typically consists of opposing teams of two or more players per side with 6 or 10 cups set up in a triangle formation on each side. Each team then takes turns attempting to throw ping pong balls into the opponent's cups. If a ball lands in a cup, the contents of that cup are consumed by the other team and the cup is removed from the table. The first team to eliminate all of the opponent's cups is the winner.


PACKAGES

For this project I’ve used the following packages:

  • XR Plugin Management - Oculus

  • Open XR Plugin

  • XR Interaction Toolkit

PROCESS

The most important functionalities in this project are picking up objects (grabbing) and moving around the space.

To pick up the objects, I’ve added the XR Grab Interactable component to the ping pong balls. This component automatically adds a Rigidbody to the objects it’s attached to. Later, I had to play around with the velocity scale, smoothing, and attach ease in order to create the illusion of real ping pong ball physics. The balls also have their own Physic Materials, with bounciness fixed between 0.5 and 0.8.
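The ball setup described above can be sketched as a small setup component. The component and property names (XRGrabInteractable, attach ease, throw velocity scale) are the XR Interaction Toolkit’s; the specific values are illustrative tuning starting points, not the exact project settings:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: configure a ping pong ball as described above.
// Assumes the XR Interaction Toolkit package is installed;
// the numeric values here are illustrative, not the project's.
public class PingPongBallSetup : MonoBehaviour
{
    void Awake()
    {
        // XRGrabInteractable adds a Rigidbody automatically if one is missing.
        var grab = gameObject.AddComponent<XRGrabInteractable>();
        grab.throwVelocityScale = 1.5f;  // tune for a believable throw
        grab.attachEaseInTime = 0.1f;    // soften the snap to the hand
        grab.smoothPosition = true;      // smooth tracked movement

        // A bouncy physic material, fixed between 0.5 and 0.8.
        var bouncy = new PhysicMaterial("Ball") { bounciness = 0.65f };
        GetComponent<SphereCollider>().material = bouncy;
    }
}
```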

In my XR Rig I’ve created LeftHand and RightHand Controllers and attached Oculus hand models to each. To set up the controllers, I’ve used the XR Rig, which includes the XR Interaction Manager, the Input Action Manager script (there by default), and the Locomotion System. The Locomotion System defaults to teleportation and snap turns, but for this project I deselected teleportation and added a Continuous Move Provider.
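Assuming the Action-based rig from the XR Interaction Toolkit, the locomotion setup above roughly comes down to components like these. The component names are the toolkit’s; the wiring and values are a sketch, and the input actions themselves would still be bound in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: continuous move on the left hand, snap turn on the right.
// Component names come from the XR Interaction Toolkit; values are illustrative.
public class RigLocomotionSetup : MonoBehaviour
{
    void Awake()
    {
        var system = gameObject.AddComponent<LocomotionSystem>();

        // Continuous movement, driven by the left-hand thumbstick.
        var move = gameObject.AddComponent<ActionBasedContinuousMoveProvider>();
        move.system = system;
        move.moveSpeed = 1.5f;   // metres per second

        // Snap turning, driven by the right-hand thumbstick.
        var turn = gameObject.AddComponent<ActionBasedSnapTurnProvider>();
        turn.system = system;
        turn.turnAmount = 45f;   // degrees per snap
    }
}
```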

I’ve arranged the functionalities as follows:

  • Snap turn to be on the right controller

  • Move to be on the left controller

The most difficult part of this project was animating the hands (action: when you press the buttons, the fingers move). I’ve used Oculus’s hand models, which are free on Oculus’s website - link. The download includes two .mb files, one each for the right and left hands; to open them I used my Maya account, but I’ve since realized some .fbx models are available online.

After attaching the models to the controller objects, I worked on positioning the hand models to match the controllers’ orientation. Orienting the hand’s pivot point on the controller’s X, Y, and Z axes took me a whole day, and I still need to work on it: it doesn’t really pivot around the wrist area.

To make the appropriate fingers move, I attached an Animator component to each model. To have an Animator I created the avatars and started to actually animate the hand. Inside hand_world there is the skeletal structure with all the bones; I rotated the fingers at the relevant indices to create a decent, natural-looking closed fist.

Hand_Animation.gif
 

Later, I made the animation happen programmatically when the buttons are pressed. To do that I used avatar masks. So there are two functionalities:

  • Trigger (Thumb and index fingers)

  • Grip (middle, pinky and ring fingers)

Note for the coding part: the animator parameters are floats.

CODING

There are two scripts for hand movement: Hand (for gripping, animating, etc.) and HandController (for triggering those actions). The hand controller reads the trigger press, interprets it, and then triggers the appropriate actions on the hand.

Here are the scripts for the hand and hand controller:

Trigger and Grip Layers


Hand Animation for each finger


Hand.cs
HandController.cs

using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

[RequireComponent(typeof(Animator))]
public class Hand : MonoBehaviour
{
    public float speed;
    Animator animator;
    SkinnedMeshRenderer mesh;
    private float gripTarget;
    private float triggerTarget;
    private float gripCurrent;
    private float triggerCurrent;
    private string animatorGripParam = "Grip";
    private string animatorTriggerParam = "Trigger";

    // Start is called before the first frame update
    void Start()
    {
        animator = GetComponent<Animator>();
        mesh = GetComponentInChildren<SkinnedMeshRenderer>();
    }

    // Update is called once per frame
    void Update()
    {
        AnimateHand();
    }

    internal void SetGrip(float v)
    {
        gripTarget = v;
    }

    internal void SetTrigger(float v)
    {
        triggerTarget = v;
    }

    void AnimateHand()
    {
        if (gripCurrent != gripTarget)
        {
            gripCurrent = Mathf.MoveTowards(gripCurrent, gripTarget, Time.deltaTime * speed);
            animator.SetFloat(animatorGripParam, gripCurrent);
        }
        if (triggerCurrent != triggerTarget)
        {
            triggerCurrent = Mathf.MoveTowards(triggerCurrent, triggerTarget, Time.deltaTime * speed);
            animator.SetFloat(animatorTriggerParam, triggerCurrent);
        }
    }

    public void ToggleVisibility()
    {
        mesh.enabled = !mesh.enabled;
    }
}
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(ActionBasedController))]
public class HandController : MonoBehaviour
{
    ActionBasedController controller;
    public Hand hand;

    // Start is called before the first frame update
    void Start()
    {
        controller = GetComponent<ActionBasedController>();
    }

    // Update is called once per frame
    void Update()
    {
        hand.SetGrip(controller.selectAction.action.ReadValue<float>());
        hand.SetTrigger(controller.activateAction.action.ReadValue<float>());
    }
}

PROBLEMS

The issue was that when I picked something up, the object snapped to the pivot point of the hand. I remedied that by hiding the hand when you pick up an object (the balls), so the balls don’t look like they’re stuck inside the hands. To do that, I added a SkinnedMeshRenderer reference inside the Hand class, and with the ToggleVisibility method I disable the hand’s visuals while the interaction is active. I hooked it up inside the Interactable Events / On Select (Enter/Exit).

public void ToggleVisibility()
{
    mesh.enabled = !mesh.enabled;
}
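Hooking the toggle up through the Inspector works; the same wiring can also be done in code. The selectEntered/selectExited events are the XR Interaction Toolkit’s (in older toolkit versions they are named onSelectEntered/onSelectExited); the component itself is a sketch, and the Hand reference is the script shown earlier:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: hide the hand while a ball is held, show it again on release.
// Assumes a Hand component (with ToggleVisibility) is assigned in the Inspector.
public class HideHandOnGrab : MonoBehaviour
{
    public Hand hand; // reference to the Hand script above

    void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(_ => hand.ToggleVisibility());
        grab.selectExited.AddListener(_ => hand.ToggleVisibility());
    }
}
```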

VIRTUAL BUSINESS CARD

I’ve created a virtual business card for UPBANK. You can easily send an email, make a phone call, or visit the bank’s website with one touch. The card also includes a 3D face scan of the employee, which helps people who struggle to remember names, plus some advertising for the bank.

PROCESS


Shoot the Bottles

All we do these days is stay home, stay safe, and social distance. People are bored, so I wanted to develop this app so that at least they can have some fun with it. This bottle shooting app is a simple game: you detect horizontal planes, place the bottles, then shoot them in order so that they fall down. I find it relaxing to break some glass without making a mess at home.
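The core loop above (detect a horizontal plane, tap to place a bottle) rests on AR Foundation’s plane raycasting. A minimal placement sketch, assuming an ARRaycastManager on the AR Session Origin and a bottle prefab of my own; ARRaycastManager and TrackableType are AR Foundation’s API:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: place a bottle on a detected horizontal plane where the user taps.
// Assumes this sits next to an ARRaycastManager on the AR Session Origin.
public class BottlePlacer : MonoBehaviour
{
    public GameObject bottlePrefab;   // assigned in the Inspector
    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the tap against detected plane surfaces.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            Instantiate(bottlePrefab, pose.position, pose.rotation);
        }
    }
}
```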

You can reach the project at the link below:

https://github.com/bestesaylar/AR-Beer-Game2

PROCESS

STAY AT HOME

When you feel like you don’t fit in during Covid-19

SPARK AR_ Face Mask

For this project I’ve created a special mask using the AR software Spark AR.


DEMO

AMNH

The main goal of this project is to use the Audio Listener component.

The Audio Listener component captures any sound effects playing within the game environment and outputs that audio through the speakers of the device being used.

The Audio Listener captures sound based on its position in 3D space. So if it’s capturing a sound that’s really far away, that sound effect will sound far away; the same applies to the direction the sound is coming from.

The sound (Mozart’s Symphony No. 40) comes from the black sphere hanging from the ceiling in the museum and gets louder as the camera comes close to it. I’ve placed the sphere far from the entrance, so as we move through the museum the sound grows louder as we approach the game object.
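This distance falloff comes from pairing the Audio Listener (on the camera) with a fully spatialized Audio Source on the sphere. A sketch of the source side, using Unity’s AudioSource properties; the rolloff distances are illustrative, not the project’s exact values:

```csharp
using UnityEngine;

// Sketch: a 3D audio source on the sphere, so the symphony gets louder
// as the camera (which carries the Audio Listener) approaches.
[RequireComponent(typeof(AudioSource))]
public class SpatialSymphony : MonoBehaviour
{
    void Start()
    {
        var source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;                          // fully 3D, not 2D
        source.rolloffMode = AudioRolloffMode.Logarithmic; // natural distance falloff
        source.minDistance = 2f;    // full volume within 2 m (illustrative)
        source.maxDistance = 40f;   // fades out beyond ~40 m (illustrative)
        source.loop = true;
        source.Play();
    }
}
```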

LALA LAND

Assignment 1


DEMO