Making The Web Rock
Web Audio
Google Chrome Developer Advocate
Why another audio API?
We have <audio> already!
<audio> hides the steps of loading, decoding and playing
<audio controls src="mysound.ogg"></audio>
Sometimes that's the right thing!
Web Audio provides:
1) Precise, low-latency scheduling and playback of sounds
Web Audio provides:
2) An audio pipeline/routing system for effects and filters
Web Audio provides:
3) Hooks to analyze and visualize audio data on the fly
DEMO
(Analysis, Filtering, Visualization)
What is audio useful for, anyway?
- Gaming
- App UX feedback
- Musical applications
- Audio processing
Building Simple App/Game Audio is easy
- Load audio files with XHR
- Tell Web Audio to decode them into buffers
- Create source node, point at buffer, connect it
- Call start()!
Loading and Playing a Sound
var myBuffer = null;
var context = new AudioContext(); // webkit prefix alert!

function loadDogSound(url) {
  var request = new XMLHttpRequest();
  request.open( "GET", url, true );
  request.responseType = "arraybuffer";
  request.onload = function() {
    context.decodeAudioData( request.response,
      function(buffer) { myBuffer = buffer; } );
  };
  request.send();
}

function playSound( buffer ) {
  var sourceNode = context.createBufferSource();
  sourceNode.buffer = buffer;
  sourceNode.connect( context.destination );
  sourceNode.start( 0 );
}
Web Audio API is based on a graph
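Everything is a node you create from the context and wire together with connect(). A rough sketch of a small graph, reusing the context and myBuffer from the loading example (the filter and gain settings here are just illustrative):

var source = context.createBufferSource();    // a source node
source.buffer = myBuffer;
var filter = context.createBiquadFilter();    // an effect node
filter.type = "lowpass";
var volume = context.createGain();            // a gain (volume) node
volume.gain.value = 0.5;

source.connect( filter );                     // source -> filter -> volume -> speakers
filter.connect( volume );
volume.connect( context.destination );
source.start( 0 );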
Web Audio minimizes glitching
Web Audio runs in a separate thread, so audio and graphics don't compete as much.
You schedule Web Audio events in the future, and the system takes care of them.
Scheduling Sound Playback
function playEverySecondForTenSeconds( myBuffer ) {
  for (var i = 0; i < 10; i++) {
    var sourceNode = context.createBufferSource();
    sourceNode.buffer = myBuffer;
    sourceNode.connect( context.destination );
    sourceNode.start( context.currentTime + i );
  }
}
Scheduling in a complex world
For dynamic rhythms, you need to combine Web Audio and system timing - see the article, and the rough sketch below.
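A minimal sketch of that pattern - a setTimeout loop that looks a little way ahead and schedules anything due on the Web Audio clock (playNoteAt and the tempo values are placeholders):

var lookahead = 0.1;          // schedule audio this far ahead (seconds)
var timerInterval = 25;       // how often the JS timer wakes up (ms)
var nextNoteTime = context.currentTime;

function scheduler() {
  // schedule every note that falls before currentTime + lookahead
  while (nextNoteTime < context.currentTime + lookahead) {
    playNoteAt( nextNoteTime );   // placeholder: start a source at nextNoteTime
    nextNoteTime += 0.5;          // quarter notes at 120bpm
  }
  setTimeout( scheduler, timerInterval );
}
scheduler();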
Scheduling in Web Audio
Not just about start( time )!
ANY AudioParam can be scheduled -
frequency, gain, detune, delayTime...
Scheduling on AudioParams
interface AudioParam {
  attribute value;

  // Parameter automation
  void setValueAtTime( value, time );
  void linearRampToValueAtTime( value, time );
  void exponentialRampToValueAtTime( value, time );
  void setTargetAtTime( target, time, timeConstant );
  void setValueCurveAtTime( values, time, duration );
  void cancelScheduledValues( startTime );
}
Gain Fade Example
var envelope = context.createGain();
mySoundNode.connect( envelope );
envelope.connect( context.destination );

var now = context.currentTime;
envelope.gain.setValueAtTime( 0, now );                    // start silent
envelope.gain.linearRampToValueAtTime( 1.0, now + 2.0 );   // fade in over 2 seconds
envelope.gain.linearRampToValueAtTime( 0.0, now + 4.0 );   // fade back out over the next 2
mySoundNode.start(0);
Effects in Web Audio
- Biquad Filtering - lowpass, highpass, etc.
- Delay (see the sketch below)
- Compression
- Convolution
- Waveshaping
- Positioning/Panning/Doppler
- Custom JavaScript processing*
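As one sketch of wiring an effect into the graph, here's a delay with a feedback loop (the node names and amounts are illustrative, not from the demo):

var delay = context.createDelay();
delay.delayTime.value = 0.3;             // 300ms echo
var feedback = context.createGain();
feedback.gain.value = 0.4;               // how much of each echo feeds back

mySoundNode.connect( context.destination );    // dry signal
mySoundNode.connect( delay );                  // wet signal
delay.connect( feedback );
feedback.connect( delay );                     // feedback loop
delay.connect( context.destination );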
Advanced Game Audio...
...would of course need these things
(although Angry Birds might not)
<audio> Integration
Web Audio can also process <audio> streams (and WebRTC, too!)
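A minimal sketch, assuming an <audio> element with id "player" and some effect node already created - the element keeps handling loading and streaming, but its output runs through the graph:

var audioElement = document.getElementById( "player" );
var elementSource = context.createMediaElementSource( audioElement );
elementSource.connect( someEffectNode );       // process it like any other source
someEffectNode.connect( context.destination );
audioElement.play();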
Audio for Music Applications
- Filtering
- Compression
- Audio input
- Delays and delay effects
- Waveform synthesis: oscillators (sketch below)
- Envelopes
- Offline processing
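For example, a rough sketch of the oscillator piece - one OscillatorNode playing an A440 sine tone for a second (the values are illustrative):

var osc = context.createOscillator();
osc.type = "sine";                  // also "square", "sawtooth", "triangle"
osc.frequency.value = 440;          // A440
osc.connect( context.destination );

var now = context.currentTime;
osc.start( now );
osc.stop( now + 1.0 );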
Advanced Effects...
Most "musical effects" are more complex than just a single filter or delay.
AudioParams can also be driven by audio-rate signals -
a chorus effect is just an oscillator changing delayTime!
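A rough sketch of that chorus idea, assuming a delay node in the signal path - a slow oscillator, scaled by a gain node, connected straight into the delayTime AudioParam:

var delay = context.createDelay();
delay.delayTime.value = 0.03;          // ~30ms base delay

var lfo = context.createOscillator();  // low-frequency oscillator
lfo.frequency.value = 2;               // 2Hz wobble
var depth = context.createGain();
depth.gain.value = 0.005;              // modulate delayTime by +/- 5ms

lfo.connect( depth );
depth.connect( delay.delayTime );      // an audio-rate signal driving an AudioParam
lfo.start( 0 );

mySoundNode.connect( delay );
delay.connect( context.destination );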
DEMO
(you may need headphones for this one, sorry...)
Audio Input
Now we can get input too!
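A minimal sketch of getting live input into the graph (this uses the modern promise-based getUserMedia; older browsers needed a prefixed, callback-style version), with someEffectNode standing in for whatever processing you want:

navigator.mediaDevices.getUserMedia( { audio: true } )
  .then( function( stream ) {
    var liveInput = context.createMediaStreamSource( stream );
    liveInput.connect( someEffectNode );          // process the mic like any other node
    someEffectNode.connect( context.destination );
  })
  .catch( function( err ) {
    console.log( "Could not get audio input: " + err );
  });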
What's NOT there (yet) for web audio
- Any kind of plugin/VSTi hooks
- Multi-interface hooks (multi-channel, yes; multi-device, no)
Web MIDI
This is not cheesy background music!
That's "Standard MIDI files."
MIDI lets you connect controllers, synthesizers and more to your computer.
Asking for MIDI devices
window.addEventListener('load', function() {
  navigator.requestMIDIAccess().then(
    onMIDIInit,
    onMIDISystemError );
});
Enumerating MIDI output devices
(old style! Soon to stop working!)
function onMIDIInit( midi ) {
  var list = midi.outputs();
  for (var i = 0; i < list.length; i++) {
    list[i].send( [0x90, 3, 32] );
  }
}
Enumerating MIDI output devices
(New style! Use this!)
function onMIDIInit( midi ) {
  for (var output of midi.outputs.values())
    output.send( [0x90, 3, 32] );
}
MIDI Message syntax
MIDI has 16 virtual channels; each message is a status byte - the command in the high nibble, the channel in the low nibble - followed by up to two data bytes.
Enumerating MIDI input devices
(Old style! Soon to stop working!)
function onMIDIInit( midi ) {
  var list = midi.inputs();
  for (var i = 0; i < list.length; i++)
    list[i].onmidimessage = midiMessageReceived;
}
Enumerating MIDI input devices
(New style! Use this!)
function onMIDIInit( midi ) {
  for (var input of midi.inputs.values())
    input.onmidimessage = midiMessageReceived;
}
Parsing MIDI messages
function midiMessageReceived( ev ) {
  var cmd = ev.data[0] >> 4;
  var channel = ev.data[0] & 0xf;
  var noteNumber = ev.data[1];
  var velocity = 0;
  if (ev.data.length > 2)
    velocity = ev.data[2];

  // MIDI noteon with velocity=0 is the same as noteoff
  if ( cmd == 8 || ((cmd == 9) && (velocity == 0)) ) { // noteoff
    noteOff( noteNumber );
  } else if (cmd == 9) { // note on
    noteOn( noteNumber, velocity );
  } else if (cmd == 11) { // controller message
    controller( noteNumber, velocity );
  } else {
    // probably sysex!
  }
}
Web MIDI support
Implemented in Chrome under flag #enable-web-midi
Works on Android Chrome with USB OTG!
Firefox has expressed interest
Web Audio Support on Desktop
Chrome, Safari, Firefox
Web Audio Support on Mobile
Chrome for Android has support (higher latency)
iOS Safari 6.0 has Web Audio (with some caveats)
Firefox OS
Future App Opportunities
- Immersive gaming audio
- Audio feedback and input in app UX
- Music applications - from synthesis to DAW
What I want from you:
- Investigate/play around/build awesome stuff
- Tell us what's not there
- Help us prioritize to make the web platform awesome for audio apps!
End
Questions?
cwilso@google.com
@cwilso
+Chris Wilson