I have begun using Gapless-5 with my web audio player. Before using Gapless-5, I used a single HTML5 audio element and the Web Audio API's createMediaElementSource() to create a source node (a MediaElementAudioSourceNode) from it, so that I could implement a waveform visualization. Example/pseudocode:
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var myAudio = document.querySelector('audio.player');

// Route the audio element through an analyser so its samples can be read.
var source = audioCtx.createMediaElementSource(myAudio);
var analyser = audioCtx.createAnalyser();
analyser.fftSize = 2048;

// frequencyBinCount is fftSize / 2; one byte per time-domain sample.
var bufferLength = analyser.frequencyBinCount;
var dataArray = new Uint8Array(bufferLength);

source.connect(analyser);
analyser.connect(audioCtx.destination); // keep the audio audible
drawWaveform();
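The drawWaveform() call is left undefined above; a minimal sketch, assuming a <canvas id="waveform"> element exists and that the analyser, dataArray, and bufferLength variables come from the snippet, might look like:

```javascript
// Map a time-domain sample byte (0-255, 128 = silence) to a canvas y coordinate.
function sampleToY(sample, canvasHeight) {
  return (sample / 128.0) * (canvasHeight / 2);
}

// Minimal render loop: pull the analyser's time-domain data each frame
// and trace it across the canvas.
function drawWaveform() {
  var canvas = document.getElementById('waveform'); // assumed <canvas> element
  var canvasCtx = canvas.getContext('2d');

  function draw() {
    requestAnimationFrame(draw);
    analyser.getByteTimeDomainData(dataArray);

    canvasCtx.clearRect(0, 0, canvas.width, canvas.height);
    canvasCtx.beginPath();
    var sliceWidth = canvas.width / bufferLength;
    var x = 0;
    for (var i = 0; i < bufferLength; i++) {
      var y = sampleToY(dataArray[i], canvas.height);
      if (i === 0) {
        canvasCtx.moveTo(x, y);
      } else {
        canvasCtx.lineTo(x, y);
      }
      x += sliceWidth;
    }
    canvasCtx.stroke();
  }
  draw();
}
```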
Would it be possible for you to create and expose the AudioContext and source-node objects, so that they could be used for visualization purposes? I would also need a callback for when the current source changed, given that your library switches between multiple HTML5 audio elements and/or Web Audio API buffers.
You could access the AudioContext through the global variable window.gapless5AudioContext. As for getting the current MediaSource and callbacks when there is a new one, that would be more complicated, especially since Gapless5 handles HTML5 Audio and WebAudio objects simultaneously.
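For example, a small sketch of reusing that global context; the existence check and the fallback to a fresh context are assumptions for illustration, not documented Gapless5 behavior:

```javascript
// Reuse the AudioContext that Gapless5 exposes as a global, falling back
// to creating a fresh context if the library has not initialized one yet.
function getSharedAudioContext(globalObj) {
  if (globalObj.gapless5AudioContext) {
    return globalObj.gapless5AudioContext;
  }
  var Ctor = globalObj.AudioContext || globalObj.webkitAudioContext;
  return new Ctor();
}

// In a browser you would call: var audioCtx = getSharedAudioContext(window);
```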
Another project using Gapless5 had a similar problem with waveform visualization; it solved it by streaming the same files through its own HTML5 Audio instance, rather than trying to read from the same buffers as Gapless5 (especially since that project uses Gapless5 in WebAudio-only mode).
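That shadow-element workaround could be sketched roughly as follows; the getPlayerTime accessor, the resync interval, and the 0.25-second drift tolerance are all assumptions for illustration, not part of the Gapless5 API:

```javascript
// Returns true when two playback clocks have drifted beyond the tolerance
// (in seconds) and the shadow element should be resynced.
function needsResync(mainTime, shadowTime, toleranceSec) {
  return Math.abs(mainTime - shadowTime) > toleranceSec;
}

// Create a muted shadow <audio> element streaming the same track URL the
// player is using, and wire it to an AnalyserNode for visualization only.
function createShadowVisualizer(trackUrl) {
  var shadowAudio = new Audio(trackUrl);
  shadowAudio.muted = true; // Gapless5 remains the audible output
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
  var source = audioCtx.createMediaElementSource(shadowAudio);
  var analyser = audioCtx.createAnalyser();
  analyser.fftSize = 2048;
  source.connect(analyser); // deliberately NOT connected to audioCtx.destination
  shadowAudio.play();
  return { audio: shadowAudio, analyser: analyser };
}

// Periodically nudge the shadow element back to the player's position.
// getPlayerTime is a hypothetical accessor returning the player's current
// playback position in seconds.
function keepInSync(shadow, getPlayerTime) {
  setInterval(function () {
    var t = getPlayerTime();
    if (needsResync(t, shadow.audio.currentTime, 0.25)) { // tolerance is an assumption
      shadow.audio.currentTime = t;
    }
  }, 1000);
}
```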