VR – The Future Of Social Connection

With an estimated 3.5 billion users globally, social media has revolutionized how we connect with each other. Through the likes of Snapchat, Facebook, and Twitter, people from different parts of the world can share snapshots of their lives and hang out in the digital world. But where will people connect in the future? My answer: VR – the future of social connection.
Virtual reality allows us to connect and share on a whole new level. It enables us to gather with friends anywhere on the globe and share experiences that would never be possible in the real world (think fighting dragons), or simply to hang out and watch movies together.

Not only can people enrich existing friendships, but they can also make new friends through the ever-growing catalog of social VR experiences. Social VR is, in essence, getting together in a simulated world using a virtual reality (VR) system. Participants appear as avatars in environments that can be lifelike or fantastical, and they are free to interact with each other in all sorts of ways.


  • Virtual Reality
  • Social VR
  • Concept Development


The piece was selected as part of the ITP Winter Show interactive exhibition at NYU. We were honored to have over 300 people of different ages and demographics – strangers, couples, friends, and families – interact with the piece.

Why Social VR?

On traditional social media, you are limited to whatever chat options Facebook or any other platform provides. In a virtual social network, participants can create their avatars, put on a headset, and walk around in a game environment that they can also customize themselves. This way, they are able to connect on a much more personal level. Our goal is to foster a sense of accomplishment and success as a group, to allow players to react to constructive social behavior, and to let them socialize and emotionally relate to other players.

VR Design Specification



Design a social VR project that allows two participants to share an interactive narrative experience.

  • The story must take place in a single scene.
  • No voice – participants can communicate only through actions performed with the VR controllers.
  • The experience cannot be longer than 2.5 minutes.

My Solution


A social VR experience that brings people together by asking them to guess whether they are interacting with a human or a pre-programmed robot.



We wanted to test the idea of humans being replaced by machines. In particular, we set out to show that humans can sense the presence of another human being in the virtual world – that it should be impossible not to recognize another person.

  • Create an entertaining experience designed to distinguish between robots and humans
  • Empower connection and teamwork, whether between two strangers or lifelong friends
  • Prepare participants for a world where they can't tell what is real and what is not
  • Raise awareness that at some point we are going to live alongside robots
  • Cultivate human authenticity in the new era of rising machines

Our approach is similar to the Voight-Kampff test seen in Blade Runner, which is designed to distinguish between replicants and humans. ArtificeVR, however, adds social and gamification components.

Philosophy and Inspiration

At the core, we asked ourselves questions such as: What is the definition of being real? What defines machine consciousness? What if we couldn't tell the difference between androids and humans in the real world? What influences human decisions in virtual reality? My curiosity was inspired by the renowned Turing test and Alan Turing's concept of thinking machines.

"A computer would deserve to be called intelligent if it could deceive a human into believing that it was human."

– Alan Turing, 1950

Hardware

2 Oculus Quest VR headsets    ·    WiFi connectivity    ·    OptiTrack Motion Capture System


Software

Unity 3D    ·    Mirror Networking API    ·    OptiTrack Motive:Body


Team

Nick Tanic    ·    Concept    ·    3D Assets and Visuals

My Contribution

Concept    ·    C# software programming using Unity 3D    ·    Testing and deployment on Oculus Quest VR headsets    ·    MoCap


Duration

4 months


This game is deceptively simple. Each 30-second round pits a human investigator against a suspect who might be human or a robot in disguise. The suspect patiently tries to conceal their identity.
The investigator has to carefully watch three robot avatars during the round and decide which avatar is controlled by the human. This was a completely new way of imagining robots, as these robots looked and behaved exactly like humans.
But some are harder to identify – the experimental robots who think they are human. But which one are you?
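The round flow above can be sketched as a small Unity script. This is only an illustrative sketch, not the project's actual code – the class and member names (RoundManager, StartRound, SubmitGuess) are assumptions.

```csharp
using UnityEngine;

// Sketch of one 30-second round: a countdown runs while the investigator
// watches the three avatars; a guess (or the timer expiring) ends the round.
public class RoundManager : MonoBehaviour
{
    const float RoundLength = 30f;   // each round lasts 30 seconds
    float timeLeft;
    int humanAvatarIndex;            // which of the 3 avatars the human controls
    bool roundActive;

    public void StartRound(int humanIndex)
    {
        humanAvatarIndex = humanIndex;   // assigned when avatars are spawned
        timeLeft = RoundLength;
        roundActive = true;
    }

    void Update()
    {
        if (!roundActive) return;
        timeLeft -= Time.deltaTime;
        if (timeLeft <= 0f) EndRound(guessedCorrectly: false); // time ran out
    }

    // Called when the investigator points at an avatar and pulls the trigger.
    public void SubmitGuess(int avatarIndex)
    {
        if (!roundActive) return;
        EndRound(avatarIndex == humanAvatarIndex);
    }

    void EndRound(bool guessedCorrectly)
    {
        roundActive = false;
        Debug.Log(guessedCorrectly
            ? "Correct – that avatar was the human!"
            : "Wrong – that avatar was a robot.");
    }
}
```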


01. Concept

  • Concept development
  • Brainstorming
  • Research
  • Storyboard
  • Sketch
  • Script

02. Design

  • UI/UX
  • 3D assets
  • Scene building
  • Lights / shadows
  • Animation timeline
  • Audio / SFX

03. Programming

  • Game logic
  • Motion capture
  • Avatar control
  • Rigging
  • Inverse kinematics
  • Networking

04. Evaluation

  • Software testing
  • Hardware integration
  • Debugging
  • Playtest
  • Polish
  • Copy

05. MVP

  • On-site configuration
  • Show participation
  • Pitch deck
  • Data collection
  • Audience survey
  • Documentation

01. Concept

After multiple brainstorming and ideation sessions, we came up with the idea of making something that is 1) entertaining and 2) provocative. We started the process by sketching some characters and the world on a piece of paper. Then we used Google Tilt Brush to sketch the basics in VR with headsets on.

At the beginning we had two versions – a low-budget one, perfect for an MVP, and a high-budget one describing what this experience would be like with an unlimited amount of funding.

02. Design

ArtificeVR invites two participants to explore an alternate reality set in a post-epidemic environment. They spawn at the Artifice Lab, a made-up tech company that works with human consciousness.

Given the time and budget restrictions, we mostly used open-source design components and assets. Alongside the visual UI/UX, we wanted to enhance the experience with strong audio and SFX, so we hired a voice actor for our on-boarding and off-boarding experiences.

03. Programming

The project is built on the Unity 3D engine using C# and the Mirror high-level networking API. We used motion capture to record custom actions performed by human actors, then used the motion data to animate robot avatars in VR through RootMotion's inverse-kinematics character animation systems.
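To show the principle behind the avatar animation: inverse kinematics drives an avatar's limbs toward target transforms, which can be fed either by the headset controllers or by recorded motion-capture data. The project uses RootMotion's FinalIK; the sketch below demonstrates the same idea with Unity's built-in Animator IK, and the field names (leftHandTarget, rightHandTarget) are illustrative assumptions.

```csharp
using UnityEngine;

// Pull a humanoid avatar's hands toward two target transforms each frame.
// Requires a humanoid Animator with "IK Pass" enabled on the layer.
[RequireComponent(typeof(Animator))]
public class AvatarHandIK : MonoBehaviour
{
    public Transform leftHandTarget;    // e.g. a controller or mocap marker
    public Transform rightHandTarget;
    Animator animator;

    void Start() => animator = GetComponent<Animator>();

    // Unity calls this during the IK pass of the animation update.
    void OnAnimatorIK(int layerIndex)
    {
        SetHand(AvatarIKGoal.LeftHand, leftHandTarget);
        SetHand(AvatarIKGoal.RightHand, rightHandTarget);
    }

    void SetHand(AvatarIKGoal goal, Transform target)
    {
        if (target == null) { animator.SetIKPositionWeight(goal, 0f); return; }
        animator.SetIKPositionWeight(goal, 1f);   // fully follow the target
        animator.SetIKRotationWeight(goal, 1f);
        animator.SetIKPosition(goal, target.position);
        animator.SetIKRotation(goal, target.rotation);
    }
}
```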

We used a client/server architecture: the two headsets are the clients, and the server is simply a laptop running a Unity 3D server script. The server is the instance of the program that all other players connect to when they want to play together; it manages various aspects of the game, such as keeping score, and transmits that data back to the clients. Clients are instances of the game, usually running on different computers, that connect to the server. All communication happens within the same wireless network. The source code is available on GitHub.
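A minimal sketch of this split using Mirror's attributes – [SyncVar] state owned by the server, a [Command] sent from a client, a [ClientRpc] broadcast back. The class and member names here are illustrative, not the project's actual code.

```csharp
using Mirror;
using UnityEngine;

// Server keeps the score; both headset clients see updates automatically.
public class GameState : NetworkBehaviour
{
    // [SyncVar] fields live on the server and are replicated to all clients.
    [SyncVar] public int investigatorScore;

    // [Command] methods are invoked from a client but run on the server,
    // e.g. when the investigator submits a guess.
    [Command(requiresAuthority = false)]
    public void CmdReportGuess(bool guessedCorrectly)
    {
        if (guessedCorrectly) investigatorScore++;   // server keeps score
        RpcShowRoundResult(guessedCorrectly);        // notify both clients
    }

    // [ClientRpc] methods are called on the server and run on every client.
    [ClientRpc]
    void RpcShowRoundResult(bool guessedCorrectly)
    {
        Debug.Log(guessedCorrectly ? "The human was found!" : "The robot fooled you.");
    }
}
```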

04. Evaluation

At this phase we had our first working prototype. We conducted a playtest and made some adjustments based on participants' feedback. Beyond the experience itself, we held interviews right before and after each session, asking participants how they felt about the general idea and concept.

The results were astonishing. Some participants couldn't believe how hard it was for them to identify a human – even a close friend, someone they had known for a very long time. Others failed to identify their partners after being married for more than 20 years. It was inspiring to see how provocative this piece was for them.

05. Minimum Viable Product (MVP)

Our piece was selected as part of the ITP Winter Show interactive exhibition at NYU. We were honored to have over 300 people of different ages and demographics – strangers, couples, friends, and families – interact with the piece.

Special thanks to the following people for their generous support: Igal Nassima, Sarah Rothberg, Kat O’Sullivan, Nick Gregg, Shu-Ju Lin, Sacha Chang, Nick Grant, Idith Barak, August Luhrs, Maya Pruitt, Dylan Dawkins, Jacky Chen, Atharva Patil, Wenjing Liu, Chenhe Zhang, Tim Lobiak, Becca Moore