
LiveWeb (Final)

The Idea

For the Live Web final, Terrick, Mai, and I set out to create a multi-user augmented reality social application. The idea is for users to place their avatars in specific locations, leaving their mark behind in augmented space.

We were inspired by HYPER-REALITY by Keiichi Matsuda, which imagines a city saturated with augmented media. It pushed our idea further: we wanted to make an interactive piece in which multiple users can communicate with each other in augmented space.

Workflow

We initially wanted to experiment with the HoloLens. It was new territory for all of us, but we thought it would be interesting to use the HoloLens and adapt the multi-user interaction to it.

Unfortunately, we could not get Visual Studio to communicate with the HoloLens, and Visual Studio was integral to deploying to the device. We did get Unity to work with the HoloLens, but without Visual Studio the multi-user interaction as a whole would not be possible.

Augmented Reality

We had to change gears and use ARKit for augmented reality instead. We chanced upon the Placenote SDK, which worked well for our augmented reality application. The Placenote SDK lets you quickly build cloud-based AR apps that persistently save AR content in physical locations, indoors and outdoors.

Placenote does not need GPS, markers or beacons for geolocation. Instead, it lets you scan any space and turn it into a smart canvas for positioning AR objects. Placenote integrates with ARKit on iOS devices by wrapping ARKit's tracking functionality in a cloud-based computer vision and machine learning API that lets you build Persistent AR apps quickly and easily.
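As a rough idea of what this persistence looks like on the Unity side, here is a minimal sketch of scanning, saving, and reloading a space. It assumes Placenote's LibPlacenote singleton with StartSession/SaveMap/LoadMap calls; the exact signatures and callback shapes are our assumptions, not copied from the SDK documentation.

```csharp
using UnityEngine;

// Hedged sketch of persisting AR content with Placenote.
// Method names follow the Placenote Unity SDK's LibPlacenote singleton,
// but the exact signatures here are assumptions.
public class AvatarPersistence : MonoBehaviour
{
    private string savedMapId;

    // Begin scanning the space so Placenote can build a map of it.
    public void StartScanning()
    {
        LibPlacenote.Instance.StartSession();
    }

    // Upload the scanned map; Placenote returns a map ID we can reload later.
    public void SaveCurrentSpace()
    {
        LibPlacenote.Instance.SaveMap(
            (mapId) =>
            {
                savedMapId = mapId;   // store this ID wherever the app keeps its data
                Debug.Log("Saved map " + mapId);
            },
            (completed, faulted, percentage) =>
            {
                Debug.Log("Upload progress: " + percentage);
            });
    }

    // Relocalize against a previously saved map so content reappears
    // in the same physical spot.
    public void LoadSavedSpace()
    {
        LibPlacenote.Instance.LoadMap(
            savedMapId,
            (completed, faulted, percentage) =>
            {
                if (completed)
                {
                    LibPlacenote.Instance.StartSession();
                }
            });
    }
}
```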

We used PhotonEngine as our low-latency multiplayer platform. PhotonEngine's LoadBalancing API matches players into a shared game session (called a "room") and transfers messages in real time between the connected players, across platforms.
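A minimal connection sketch, assuming Photon Unity Networking (PUN 2) on the Unity side; the room name and player cap are illustrative, not from our actual project settings:

```csharp
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Connects to the Photon cloud and puts every participant into one shared room.
public class DrawingLobby : MonoBehaviourPunCallbacks
{
    void Start()
    {
        // Connect using the app ID stored in PhotonServerSettings.
        PhotonNetwork.ConnectUsingSettings();
    }

    public override void OnConnectedToMaster()
    {
        // Everyone joins (or creates) the same room so they share one session.
        RoomOptions options = new RoomOptions { MaxPlayers = 8 };
        PhotonNetwork.JoinOrCreateRoom("ar-drawing-room", options, TypedLobby.Default);
    }

    public override void OnJoinedRoom()
    {
        Debug.Log("Joined room with " + PhotonNetwork.CurrentRoom.PlayerCount + " player(s)");
    }
}
```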

The Final Idea

Inspired by the Google drawing app on desktop, we decided to make a multi-user drawing app where users can draw together in augmented space. We also wanted it to be a musical experience: each stroke drawn creates a sound, turning the piece into an augmented musical drawing composition.
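Since our original script was lost (see the issues below), here is a hedged sketch of how a shared, sound-making stroke could work: each point a user draws is broadcast to everyone in the room with a Photon RPC, appended to a LineRenderer, and plays a short note. The component layout, pitch mapping, and AudioClip field are all illustrative assumptions, not our exact code.

```csharp
using Photon.Pun;
using UnityEngine;

// Hypothetical sketch of a shared, sound-making stroke.
// Each new point is sent to all players via a Photon RPC, appended to a
// LineRenderer, and triggers a short note so drawing becomes composition.
[RequireComponent(typeof(PhotonView), typeof(LineRenderer), typeof(AudioSource))]
public class SharedStroke : MonoBehaviourPun
{
    private LineRenderer line;
    private AudioSource audioSource;
    public AudioClip strokeNote;   // assigned in the Inspector (illustrative)

    void Awake()
    {
        line = GetComponent<LineRenderer>();
        audioSource = GetComponent<AudioSource>();
        line.positionCount = 0;
    }

    // Called by the local user while drawing.
    public void AddLocalPoint(Vector3 worldPoint)
    {
        photonView.RPC(nameof(AddPoint), RpcTarget.AllBuffered, worldPoint);
    }

    // Runs on every connected device, so all users see (and hear) the stroke.
    [PunRPC]
    void AddPoint(Vector3 worldPoint)
    {
        line.positionCount += 1;
        line.SetPosition(line.positionCount - 1, worldPoint);

        // Map the point's height to pitch so each part of a stroke sounds different.
        audioSource.pitch = Mathf.Lerp(0.5f, 2f, Mathf.InverseLerp(-1f, 1f, worldPoint.y));
        audioSource.PlayOneShot(strokeNote);
    }
}
```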

We managed to document our work in progress on the multi-user drawing application.

The Final (Multi User Documentation)

Issues

We ran into a big problem when we added the music component to our script. We couldn't debug it in time, and we had gone so deep into the issue that we could not get back to our first working rendition. The documentation above shows that earlier version.

Future Iterations

It would be interesting to debug this issue and get the compositional piece working. We would also like to test the app on the HoloLens and get it running there.
