
BUG

Human and AI interaction in the built environment


MY ROLE

Creator

TAGS

Unreal Engine, AI system, VR, Game design, Gesture detection, Immersive experience, Motion design, Level design

DESCRIPTION

"BUG" VR leverages Unreal Engine 5 to redefine immersive interactions, blending real-world actions with VR controller action and body gestures. The idea emerged from a contemplation of the evolving relationship between humans and artificial intelligence growth. 

PROJECT OBJECTIVES

> Engage with the digital and physical worlds in a seamless and captivating manner.

> Enable players to interact naturally with the environment and engage with an AI.

DIGITAL ENVIRONMENT

"Bug" mirrors the profound link between human communication and techonology in a Virtual Reality, blending body gesture and responsive robotics sparks a captivating dialogue between human and machinery. 


LOCOMOTION, DYNAMIC CAMERA

Dynamic camera movement introduces subtle tilts and shifts in the player's field of view, mimicking the organic motion one feels while walking.
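One common way to drive such tilts is a speed-scaled sinusoidal "head bob". The sketch below is engine-agnostic and illustrative only; the function name, parameters, and tuning values are assumptions, not the project's Unreal Engine implementation.

```python
import math

def camera_bob(time_s: float, speed: float,
               bob_freq: float = 2.0, bob_amp: float = 0.02,
               tilt_amp_deg: float = 0.5):
    """Return (vertical_offset, roll_degrees) for a walking camera.

    Illustrative values only; in practice these would be tuned in-engine.
    """
    phase = time_s * bob_freq * 2.0 * math.pi
    # Vertical bob scales with movement speed, mimicking a stride.
    offset = math.sin(phase) * bob_amp * speed
    # Slower side-to-side roll at half the bob frequency for a subtle tilt.
    roll = math.sin(phase * 0.5) * tilt_amp_deg * speed
    return offset, roll
```

When the player stands still (`speed = 0`), both offsets vanish, so the effect only appears during locomotion.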


GESTURE DETECTION FLOW

1. Centroid Calculation: The centroid (ci) is determined by averaging the coordinates of the gesture's data points.

2. Point Relocation: All points are adjusted to place the centroid at the origin, ensuring accurate recognition.

3. Scaling: The gesture is scaled within the range of -1 to 1, further enhancing recognition accuracy.
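The three normalization steps above can be sketched outside the engine as follows; the function name and point representation are illustrative assumptions, not the project's code.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def normalize_gesture(points: List[Point]) -> List[Point]:
    # 1. Centroid calculation: average the coordinates of the data points.
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n

    # 2. Point relocation: translate so the centroid sits at the origin.
    translated = [(x - cx, y - cy) for x, y in points]

    # 3. Scaling: fit the gesture into the [-1, 1] range on both axes.
    max_abs = max(max(abs(x), abs(y)) for x, y in translated) or 1.0
    return [(x / max_abs, y / max_abs) for x, y in translated]
```

Because every recorded gesture ends up centered at the origin and bounded by the same range, templates drawn at different positions and sizes become directly comparable.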


RECOGNITION DOCUMENTATION 


LEVEL DESIGN


FINAL RENDERING

1st DEMO RECORDING

OTHER PROJECTS

AI OPPONENT DEVELOPMENT


COMPONENTS

The design of AI components empowers an AI-controlled robot avatar to interact with players in real time within a VR environment. Without requiring physical sensors, the AI relies solely on sensing the player’s virtual body and movements, dynamically reacting to actions through advanced perception, stimuli processing, and decision-making. This enables the avatar to respond vividly to a wide range of player behaviors and even predict their next steps, creating an immersive and lifelike interaction.
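The perception → stimuli → decision pipeline described above could be sketched like this; every class name, method, range, and reaction label here is hypothetical, chosen only to illustrate the flow rather than reproduce the project's Unreal Engine components.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Stimulus:
    kind: str        # e.g. "sight" or "gesture"
    strength: float  # 0.0 (weak) to 1.0 (strong)
    distance: float  # distance to the player, in meters

class RobotAvatarAI:
    def __init__(self, sight_range: float = 10.0):
        self.sight_range = sight_range

    def perceive(self, player_distance: float, gesturing: bool) -> List[Stimulus]:
        """Build stimuli purely from the player's virtual body (no physical sensors)."""
        stimuli = []
        if player_distance <= self.sight_range:
            # Closer players produce stronger sight stimuli.
            strength = 1.0 - player_distance / self.sight_range
            stimuli.append(Stimulus("sight", strength, player_distance))
        if gesturing:
            stimuli.append(Stimulus("gesture", 1.0, player_distance))
        return stimuli

    def decide(self, stimuli: List[Stimulus]) -> str:
        """Pick a reaction from the strongest stimulus."""
        if not stimuli:
            return "idle"
        strongest = max(stimuli, key=lambda s: s.strength)
        if strongest.kind == "gesture":
            return "respond_to_gesture"
        return "approach" if strongest.distance > 2.0 else "observe"
```

Separating perception from decision-making keeps each stage testable on its own and makes it easy to add new stimulus types (such as damage events) without touching the reaction logic.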


#1 AI PERCEPTION 


#2 AI STIMULI 


#3 DETECTION SYSTEM

#4 DAMAGE REACTION
