
Waves

Waves is a meditative synthwave VR sound generation experience. Experiencers morph the world around them into a musical instrument of infinite possibilities. 

View on GitHub

Overview

The project was built for a graduate computer graphics class at NYU. It expands on a class assignment to render bicubic surface patches in WebVR. A single patch is presented to the experiencer, who can modify it using 16 Bézier control handles. In “Music Mode” (i.e. while holding the side triggers) the controller becomes a mallet that can ‘strike’ the patch, generating a sound. Lower sections of the patch emit lower notes, and thus we have the beginnings of a musical instrument.
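For readers unfamiliar with the math, a bicubic Bézier patch maps parameters (u, v) in [0, 1]² to a 3D point using a 4×4 grid of control points (the 16 handles). A minimal evaluation sketch, not our actual CG.js code (the function names are illustrative):

// Cubic Bernstein basis weights for a parameter t in [0, 1].
function bernstein3(t) {
   let s = 1 - t;
   return [s * s * s, 3 * s * s * t, 3 * s * t * t, t * t * t];
}

// Evaluate a point on a bicubic Bezier patch.
// controlPoints is a 4x4 array of [x, y, z] points (the 16 handles).
function bezierPatch(controlPoints, u, v) {
   let bu = bernstein3(u), bv = bernstein3(v);
   let p = [0, 0, 0];
   for (let i = 0; i < 4; i++)
      for (let j = 0; j < 4; j++)
         for (let k = 0; k < 3; k++)
            p[k] += bu[i] * bv[j] * controlPoints[i][j][k];
   return p;
}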


Challenges & Solutions

Aesthetics

The most successful (in our opinion) and most collaborative aspect of our project was its aesthetic design. We discussed the look as a group and planned out how to achieve it.

From the outset our project relied heavily on its aesthetic coherence. Without a specific ‘real world’ object or environment to ground the audience, we knew we needed a visual style that would tie everything together and make the experience feel cohesive.

The style we settled on after our discussions with Emilio (SVA) was a retro-digital/’80s/synthwave look, of course reminiscent of Tron and video games from the 1980s. The mood board that Emilio put together is attached (Appendix 1).

One of the main ways we achieved this aesthetic was to use gl.LINES rather than gl.TRIANGLE_STRIP when sending data to the GPU. In addition to drawStrip(), we have a function called drawLines(), which takes an array of vertices and… draws lines instead of triangles.

let drawLines = (mesh, color, width = 1) => {
   // Reduce redundant GL calls by caching the last-used state:
   if (width != prev_width) {
      gl.lineWidth(width);
      prev_width = width;
   }
   if (prev_mat != MATERIALS.emission) {
      gl.uniform1i(state.uMaterialLoc, MATERIALS.emission);
      prev_mat = MATERIALS.emission;
   }

   gl.uniform3fv(state.uColorLoc, color);
   gl.uniformMatrix4fv(state.uModelLoc, false, m.value());

   // Only re-upload vertex data if the mesh changed since the last draw:
   if (mesh != prev_mesh)
      gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(mesh.vertices), gl.STATIC_DRAW);
   gl.drawArrays(gl.LINES, 0, mesh.size);
   prev_mesh = mesh;
}

In many OpenGL/WebGL implementations the maximum line width is capped at 1 pixel, but we found that on the Quest we could set widths beyond that. We used this to vary thickness by object (for instance, the controllers are drawn with thin lines, while the patch is drawn with thick ones).
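WebGL lets you query the supported range, so a defensive version could clamp the requested width and degrade gracefully on platforms where the maximum is 1.0 (a sketch, assuming the same gl context; our actual drawLines calls gl.lineWidth directly):

// Query the implementation's supported line width range ([min, max]).
const range = gl.getParameter(gl.ALIASED_LINE_WIDTH_RANGE);

// Clamp requested widths to what the platform actually supports.
function setLineWidth(width) {
   gl.lineWidth(Math.min(Math.max(width, range[0]), range[1]));
}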


Abstractions

To make the flow of our program slightly easier to understand (especially for the non-CS James), James implemented a few helper classes that abstract out some concepts.

In CG.js

class Vector
- A class to store (x, y, z) point coordinates and perform vector maths on them
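A minimal sketch of the shape of this class, inferred from how it is used elsewhere in this writeup (the full version lives in CG.js):

class Vector {
   constructor(x, y, z) {
      this.x = x; this.y = y; this.z = z;
   }

   // Mutating add, used by PhysicsBody to integrate velocity.
   add(v) {
      this.x += v.x; this.y += v.y; this.z += v.z;
      return this;
   }

   // Euclidean distance between two Vectors, used for collision tests.
   static dist(a, b) {
      let dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
      return Math.sqrt(dx * dx + dy * dy + dz * dz);
   }
}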

There are a few objects in our scene (the sun, the handles) that still use solid shading and thus we generate their triangle strips in:

class ParametricMesh
- .vertices stores the flat vertex array for a triangle strip
- .size stores the number of vertices (the flat array length divided by VERTEX_SIZE)
- .createParametricMesh() is the function we created in class to return a vertex array based on a provided callback
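Both mesh classes take a parametric callback of the form (u, v, args) => vertex, where the first three entries of the returned flat vertex are the (x, y, z) position (see createCollisionGrid below). A hypothetical example callback:

// Example parametric callback: a gentle sine-wave height field.
// Returns a flat vertex whose first three entries are (x, y, z);
// a real callback would also append normals/UVs to fill out VERTEX_SIZE.
let waveSurface = (u, v, args) => {
   let amp = args ? args.amplitude : 0.1;
   let x = 2 * u - 1;                      // map u, v in [0, 1] to [-1, 1]
   let z = 2 * v - 1;
   let y = amp * Math.sin(6.28 * u) * Math.cos(6.28 * v);
   return [x, y, z /* ...normal, uv, etc. */];
};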

For the patches, though, we want to generate a grid of squares, not a triangle strip. In addition, for collision detection we store a higher-resolution array of points (Vectors) so that we can detect collisions with the controllers.

class ParametricGrid
- .vertices stores the flat vertex array for a grid of lines
- .size stores the number of vertices (the flat array length divided by VERTEX_SIZE)
- .createParametricGrid() returns an array of vertices that represents a grid of lines
- .createCollisionGrid() returns an array of collision points (as Vectors) along the mesh
class ParametricGrid {

   constructor(M, N, callbackType, args, collision = false) {
      this.vertices = ParametricGrid.createParametricGrid(M, N, callbackType, args);
      this.size = this.vertices.length / VERTEX_SIZE;

      this.collisionPoints = [];

      // Collision points are sampled at twice the grid resolution:
      if (collision) {
         this.collisionPoints = ParametricGrid.createCollisionGrid(M * 2, N * 2, callbackType, args);
      }
   }

   static createParametricGrid(M, N, callback, args) {
      let vertices = [];
      // Code here generates the grid, left out of the docs for space. See CG.js for details.
      return vertices;
   }

   static createCollisionGrid(M, N, callback, args) {
      let points = [];

      for (let row = 0; row < N; row++) {
         for (let col = 0; col < M; col++) {
            // Sample the parametric surface at (u, v)...
            let vertex = callback(col / M, row / N, args);
            // ...and convert the vertex back into a Vector.
            // (TODO: Make more efficient than this nonsense)
            points.push(new Vector(vertex[0], vertex[1], vertex[2]));
         }
      }
      return points;
   }
}
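Constructing the patch might then look something like this (the callback and control-point names here are illustrative, not our actual main-loop code):

// A 16x16 line grid; 'true' also builds the 32x32 collision-point grid.
let patchMesh = new ParametricGrid(16, 16, bezierPatchCallback, controlPoints, true);
drawLines(patchMesh, [1, 0, 1], 3);   // draw the patch with thick lines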

In geometry.js

For the shapes which fly off after the mesh is struck, I took a leaf out of the Unity game engine’s book, although in a very rudimentary way. A Geometry object contains a mesh (its vertex data), a Transform object, and a PhysicsBody object. The PhysicsBody applies rotation and translation to the Transform based on velocity and angular momentum, and the Transform tells the Geometry where to draw its mesh.

class PhysicsBody {
   constructor(velocity, angularMomentum) {
      this.velocity = velocity;
      this.angularMomentum = angularMomentum;

      //TODO: Apply frictions? (declared but not yet used in update)
      this.friction = 0.99;
      this.angularFriction = 0.9;
   }

   applyForce(force) {
      //Expects Vector
      this.velocity.add(force);
   }

   addTorque(transform, torque) {
      //Expects Vector (transform is currently unused)
      this.angularMomentum.add(torque);
   }
}

class Transform {
   constructor(pos = new Vector(0, 0, 0), rot = new Vector(0, 0, 0), scale = new Vector(1, 1, 1)) {
      this.position = pos;
      this.rotation = rot;
      this.scale = scale;
   }
}

//Based off the Unity GameObject system. Sort of.
class Geometry {
   constructor(transform, mesh) {
      this.mesh = mesh;    //e.g. CG.triangle or CG.square
      this.transform = transform;
      this.physicsBody = null;
      this.age = 0;
   }

   addPhysicsBody(velocity = new Vector(0, 0, 0), angularMomentum = new Vector(0, 0, 0)) {
      this.physicsBody = new PhysicsBody(velocity, angularMomentum);
   }

   applyTransform(m) {
      let transform = this.transform;
      m.translate(transform.position.x, transform.position.y, transform.position.z);
      m.rotateX(transform.rotation.x);
      m.rotateY(transform.rotation.y);
      m.rotateZ(transform.rotation.z);
      m.scale(transform.scale.x, transform.scale.y, transform.scale.z);
   }

   update() {
      // Integrate velocity and angular momentum each frame:
      if (this.physicsBody != null) {
         this.transform.position.add(this.physicsBody.velocity);
         this.transform.rotation.add(this.physicsBody.angularMomentum);
      }

      this.age += 1;
   }
}
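A hypothetical spawn-and-update loop using these classes, assuming the course’s matrix-stack API (m.save()/m.restore()) and CG.square as a stand-in mesh; a sketch of the idea rather than our exact code:

// Spawn a shape at the hit point with a small velocity and tumble.
let shards = [];
function spawnShard(hitPoint) {
   let g = new Geometry(new Transform(hitPoint), CG.square);
   g.addPhysicsBody(new Vector(0, 0.02, 0),       // drift upward
                    new Vector(0.01, 0.02, 0));   // slow tumble
   shards.push(g);
}

// Per frame: cull old shards, integrate physics, and draw.
function updateShards(m) {
   shards = shards.filter(s => s.age < 120);      // ~2 seconds at 60 fps
   for (let s of shards) {
      s.update();
      m.save();
      s.applyTransform(m);
      drawLines(s.mesh, [0, 1, 1]);
      m.restore();
   }
}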

Collisions

Once we had an array of points along the mesh at higher resolution than the low-resolution grid’s vertices, finding an intersection with them was simple: it is just a sphere-point test between the controller position and each collision point:

if (controller.isButtonDown(2)) {

   // pos is the controller position; find the nearest collision point.
   let closest = Infinity;
   let point = null;

   patches.forEach(function (patch) {
      // Check position of controller against collision points of mesh
      patch.mesh.collisionPoints.forEach(function (collisionPoint) {
         let d = Vector.dist(pos, collisionPoint);
         if (d < closest) {
            closest = d;
            point = collisionPoint;
         }
      });
   });

   if (closest < COLLISIONTHRESHOLD) {
      controller.hitHandler.updateHitState(true);
      // Only trigger a note on the frame the hit begins:
      if (controller.hitHandler.isNewHit()) {
         handleNoteHit(point, state);
      }
   } else {
      //Flag exit hit:
      controller.hitHandler.updateHitState(false);
   }
}
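handleNoteHit is where a collision turns into sound (and into the flying shapes from geometry.js). A hypothetical sketch of the bridge, assuming an audio instance of the class shown in the next section:

// The hit point both positions the sound in space and, via its
// height, sets the pitch inside playToneAt.
function handleNoteHit(point, state) {
   audio.playToneAt([point.x, point.y, point.z]);
   spawnShard(point);   // visual feedback: shapes fly off the patch
}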

Musical Synthesiser (Contributions by Youpeng Gu)

To let the Quest play a specific frequency of sound and mimic the effect of video games from the ’80s, we created the “synth” mode first. After studying the Web Audio API documentation and a lot of experimentation, we confirmed that Web Audio oscillators can be used on the Quest – and with spatial audio! The following is from our audio.js file.

playToneAt(position, orientation = [0, 0, 0]) {
    // Place the sound source at the hit point:
    this.panner.setPosition(position[0], position[1], position[2]);
    this.panner.setOrientation(orientation[0], orientation[1], orientation[2]);

    //Proof of concept - take height from position and map to freq:
    let freq = this.map_range(position[1], -1, 2, 27, 2700);
    this.osc = this.playTone(freq);
}


playTone(freq) {
    let dur = 2;
    let osc = this.context.createOscillator();
    osc.type = this.wave;
    osc.frequency.value = freq;

    osc
        .connect(this.panner)       // Spatial
        .connect(this.gainNode)
        .connect(this.context.destination);
    osc.start();

    osc.stop(this.context.currentTime + dur);
    return osc;
}
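The map_range used above is a plain linear remap; presumably something like this (a sketch, not the verbatim audio.js code):

map_range(x, inMin, inMax, outMin, outMax) {
    // Linearly remap x from [inMin, inMax] to [outMin, outMax].
    // With the values above, patch height in [-1, 2] becomes 27 Hz to 2.7 kHz.
    return outMin + (outMax - outMin) * (x - inMin) / (inMax - inMin);
}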

James modified the sound with the concept of an ‘envelope’, which effectively adds a fade-in and fade-out to the tone.

playTone(freq) {

    /**JH: Envelope is in basic terms how the sound enters and exits.
     * A low attack means the sound reaches peak volume relatively
     * quickly (as if the instrument was struck), and a high
     * release means the sound lingers around for longer. */

    this.envelope = {
        attack: 0.1,
        release: .5
    };

    let now = this.context.currentTime;
    let osc = this.context.createOscillator();

    let envGainNode = this.context.createGain();

    osc.type = this.wave;
    osc.frequency.value = freq;

    // Ramp gain up over 'attack' seconds, then back down over 'release':
    envGainNode.gain.setValueAtTime(0, now);
    envGainNode.gain.linearRampToValueAtTime(1, now + this.envelope.attack);
    envGainNode.gain.linearRampToValueAtTime(0, now + this.envelope.attack + this.envelope.release);

    osc
        .connect(this.panner)       // Spatial
        .connect(this.gainNode)
        .connect(envGainNode)       // Envelope
        .connect(this.context.destination);

    osc.start();

    let dur = this.envelope.attack + this.envelope.release;
    osc.stop(now + dur + 1);    //+1 helps prevent popping at the end of tones
    return osc;
}

Musical Styles

In its current version, Waves has a “synth” mode and a “piano” mode. The synth mode uses Web Audio oscillators to generate sawtooth waves on impact. The piano mode uses sampled piano notes in .wav format. Both modes are built on top of the spatial audio system: the location where you strike the mesh is where the sound originates.
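Piano mode routes the samples through the same panner. A minimal sketch, assuming the .wav files have already been fetched and decoded into AudioBuffers (the method name here is hypothetical):

playSampleAt(buffer, position) {
    // Same spatialisation as the synth: the note originates at the hit point.
    this.panner.setPosition(position[0], position[1], position[2]);

    let src = this.context.createBufferSource();
    src.buffer = buffer;                  // a decoded piano-note AudioBuffer
    src.connect(this.panner)
       .connect(this.gainNode)
       .connect(this.context.destination);
    src.start();
}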

The synth mode is far from perfect; occasionally notes interfere with one another or make jarring popping sounds. For the demonstration we stuck to piano mode, as it is far easier on the ears.


Next Steps

User Testing Feedback

The user testing during our last class provided a great deal of valuable insight, ranging from notes on clarifying the onboarding to ways to make the experience more musically interesting. Here is a list of actionable items based on that feedback.

  • Provide in-experience instructions on how the controllers work. This could be either:
    • Written instructions in a menu of sorts,
    • Visual cues/reminders on the controller avatars themselves
  • Explore different sets of notes. Right now we use the full western scale, which makes finding satisfying sounds difficult. Could the experience be more interesting if we used a pentatonic scale? A blues scale? (See the sketch after this list.)
  • Further refine the mesh manipulation. Many users wanted to be able to rotate the mesh as a whole, not just move it.
  • Make it multiplayer (we’re aware of this one…)
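As a sketch of the scale idea: the synth mode already produces a raw frequency from patch height, so restricting it to a pentatonic scale could be a matter of snapping that frequency to the nearest allowed note. A hypothetical sketch (the function and root frequency are illustrative):

// Snap a raw frequency to the nearest note of a major pentatonic
// scale (scale degrees in semitones above the root, 12-TET tuning).
const PENTATONIC = [0, 2, 4, 7, 9];

function snapToPentatonic(freq, rootFreq = 220) {
   // Distance from the root in semitones.
   let semis = Math.round(12 * Math.log2(freq / rootFreq));
   let octave = Math.floor(semis / 12);
   let within = ((semis % 12) + 12) % 12;
   // Pick the closest allowed degree within the octave.
   let degree = PENTATONIC.reduce((best, d) =>
      Math.abs(d - within) < Math.abs(best - within) ? d : best);
   return rootFreq * Math.pow(2, octave + degree / 12);
}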

Other General Future Goals

We hope to make this multiplayer as soon as possible, as we anticipate this will add an entirely new dimension to the piece. This goal requires us to reconcile the foundational patch code of the piece, which relies on world space without any transforms applied, with the transforms that are applied when calibrating.

We also hope to explore different soundscapes. We used piano samples because they were readily available to us and a decent proof of concept (and they don’t sound too bad), but we’d like to try other sounds – especially ones that complement the aesthetic better, such as synth keyboards, electric guitars and basses, etc.
