AME (Augmented Musical Expression) (MWMR Finals)
What is AME?
Augmented Musical Expression is an application that allows the user to draw in augmented space while producing a pre-rendered musical note as each stroke is drawn. One of my inspirations derives from the work of Keith Sonnier, a post-minimalist artist who creates anti-form light sculptures as a way to tell his narrative experiences.
Keith Sonnier, a post-minimalist artist
Keiichi Matsuda, Augmented City.
Google Creative Lab
Why create this piece?
AME offers an augmented drawing experience in public space without contributing to vandalism. It brings fun and quirkiness to users, who produce sounds with each stroke they draw.
Final Video Documentation
Raycasting. Camera.ViewportPointToRay (Unity Documentation)
Camera.ViewportPointToRay returns a ray going from the camera through a viewport point. The resulting ray is in world space, starting on the near plane of the camera and going through the position's (x,y) coordinates on the viewport (position.z is ignored).
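A minimal sketch of how this could drive drawing in AME: cast a ray from the AR camera through the user's touch and place a stroke point along it. The component name, the `strokePointPrefab` field, and the 2-meter fallback distance are my assumptions for illustration, not AME's actual code.

```csharp
using UnityEngine;

// Sketch (assumed names): convert a screen touch to a world-space ray
// and place a stroke point where the ray hits the scene.
public class StrokeRaycaster : MonoBehaviour
{
    public Camera arCamera;              // the AR session camera
    public GameObject strokePointPrefab; // hypothetical stroke-point prefab

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);

        // Screen position -> viewport point (0..1 range) -> world-space ray
        // starting on the camera's near plane.
        Vector3 viewportPoint = arCamera.ScreenToViewportPoint(touch.position);
        Ray ray = arCamera.ViewportPointToRay(viewportPoint);

        // If the ray hits scene geometry, draw there; otherwise fall back
        // to a fixed distance along the ray (2 m is an arbitrary choice).
        Vector3 point = Physics.Raycast(ray, out RaycastHit hit)
            ? hit.point
            : ray.GetPoint(2f);

        Instantiate(strokePointPrefab, point, Quaternion.identity);
    }
}
```

In practice a per-stroke point could also trigger one of the pre-rendered notes, tying the raycast result to the sound playback described above.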
Using Mesh (Unity Documentation)
A class that allows creating or modifying meshes from scripts. Meshes contain vertices and multiple triangle arrays. See the Procedural example project for examples of using the mesh interface. The triangle arrays are simply indices into the vertex arrays; three indices for each triangle. For every vertex there can be a normal, two texture coordinates, color and tangent. These are optional though and can be removed at will. All vertex information is stored in separate arrays of the same size, so if your mesh has 10 vertices, you would also have 10-size arrays for normals and other attributes.
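The parallel-array layout described above can be illustrated with a small procedural quad. This is a generic Unity example, not AME's stroke-mesh code; the component name is made up, and the winding order is chosen so the quad faces the default camera.

```csharp
using UnityEngine;

// Sketch: build one quad from scratch to show that a Mesh holds
// same-length per-vertex arrays (vertices, uv, ...) plus a triangle
// array of indices, three per triangle.
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class QuadBuilder : MonoBehaviour
{
    void Start()
    {
        var mesh = new Mesh();

        // Four vertices; every optional attribute array (uv, normals,
        // colors, tangents) must have the same length: 4 here.
        mesh.vertices = new Vector3[]
        {
            new Vector3(0, 0, 0),
            new Vector3(1, 0, 0),
            new Vector3(0, 1, 0),
            new Vector3(1, 1, 0),
        };
        mesh.uv = new Vector2[]
        {
            new Vector2(0, 0), new Vector2(1, 0),
            new Vector2(0, 1), new Vector2(1, 1),
        };

        // Two triangles as indices into the vertex array,
        // three indices each, clockwise so they face -Z.
        mesh.triangles = new int[] { 0, 2, 1, 2, 3, 1 };

        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```

A stroke mesh in AME would presumably follow the same pattern, appending vertices and triangle indices as the user's finger moves.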
It would be a heightened experience for the user if I were able to implement multi-user support in AME. I attempted this using Photon Engine, a networking framework for multiplayer games, but my implementation was not stable and broke easily.