How to play audio files synchronously in JavaScript?
I am working on a program to convert text into morse code audio.



Say I type in sos. My program will turn this into the array [1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1], where s = dot dot dot (or 1, 1, 1) and o = dash dash dash (or 2, 2, 2). This part is quite easy.
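For concreteness, the encoding step described above can be sketched like this (a hypothetical textToMorseArr helper covering only the letters needed for sos; a real table would span the whole alphabet):

```javascript
// Hypothetical sketch of the text-to-array encoding step.
// Only 's' and 'o' are included here; extend the table as needed.
const MORSE_TABLE = {
  s: [1, 1, 1], // dot dot dot
  o: [2, 2, 2], // dash dash dash
};

function textToMorseArr(text) {
  return text
    .toLowerCase()
    .split('')
    .map(ch => MORSE_TABLE[ch] || [])
    // join consecutive letters with a 0 (the pause marker)
    .reduce((arr, letter, i) => (i === 0 ? letter : arr.concat(0, letter)), []);
}
// textToMorseArr("sos") → [1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1]
```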



Next, I have 2 sound files:



var dot = new Audio('dot.mp3');
var dash = new Audio('dash.mp3');


My goal is to have a function that will play dot.mp3 when it sees a 1, play dash.mp3 when it sees a 2, and pause when it sees a 0.



The following sort of/ kind of/ sometimes works, but I think it's fundamentally flawed and I don't know how to fix it.



function playMorseArr(morseArr) {
  for (let i = 0; i < morseArr.length; i++) {
    setTimeout(function() {
      if (morseArr[i] === 1) {
        dot.play();
      }
      if (morseArr[i] === 2) {
        dash.play();
      }
    }, 250 * i);
  }
}


The problem:



I can loop over the array and play the sound files, but timing is a challenge. If I don't set the setTimeout() interval just right, and the previous audio file is not done playing when its 250 ms have elapsed, the next element in the array will be skipped. This matters because dash.mp3 is longer than dot.mp3. If my timing is too short, I might hear [dot dot dot pause dash dash pause dot dot dot], or something to that effect.



The effect I want



I want the program to go like this (in pseudocode):




  1. look at the ith array element

  2. if 1 or 2, start playing sound file or else create a pause

  3. wait for the sound file or pause to finish

  4. increment i and go back to step 1


What I have thought of, but don't know how to implement



So the pickle is that I want the loop to proceed synchronously. I've used promises in situations where I had several functions that I wanted executed in a specific order, but how would I chain an unknown number of functions?
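For what it's worth, a chain of unknown length can be built by folding the functions into a single promise with reduce; a sketch with a hypothetical runInSequence helper:

```javascript
// Hypothetical sketch: run an arbitrary number of promise-returning
// functions strictly one after another, collecting their results.
function runInSequence(tasks) {
  return tasks.reduce(
    (chain, task) => chain.then(results =>
      task().then(result => results.concat([result]))),
    Promise.resolve([])
  );
}
```

With async/await, a plain loop with await inside achieves the same effect without building the chain by hand.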



I also considered using custom events, but I have the same problem.










      javascript html5-audio synchronous playback






edited 2 hours ago by Shidersz
asked 3 hours ago by Jozurcrunch
3 Answers


















Audio elements have an ended event that you can listen for, so you can await a Promise that resolves when that event fires:



const audios = [undefined, dot, dash];
async function playMorseArr(morseArr) {
  for (let i = 0; i < morseArr.length; i++) {
    const item = morseArr[i];
    await new Promise((resolve) => {
      if (item === 0) {
        // insert desired number of milliseconds to pause here
        setTimeout(resolve, 250);
      } else {
        audios[item].onended = resolve;
        audios[item].play();
      }
    });
  }
}





answered 3 hours ago by CertainPerformance
• What is the purpose of undefined at audios? – guest271314, 3 hours ago

• Just a placeholder, since OP's morseArr has 1 corresponding to the dot audio, and 2 corresponding to the dash audio (but no audio corresponding to 0). Could've also done const item = morseArr[i] - 1 and had only two elements in the audios array. – CertainPerformance, 3 hours ago
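The two-element variant mentioned in that comment could look like the sketch below. Note the assumptions: the audios array is passed in as a parameter instead of using the question's dot/dash globals, and the 250 ms pause for a 0 is carried over from the question.

```javascript
// Sketch of the `morseArr[i] - 1` alternative: no placeholder entry,
// since a 0 never indexes into the array (it is handled as a pause).
async function playMorseArr(morseArr, audios) {
  // audios = [dotAudio, dashAudio]; 1 → audios[0], 2 → audios[1]
  for (const item of morseArr) {
    await new Promise(resolve => {
      if (item === 0) {
        setTimeout(resolve, 250); // pause between letters
      } else {
        const clip = audios[item - 1];
        clip.onended = resolve; // resolve when playback finishes
        clip.play();
      }
    });
  }
}
```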



















I will use a recursive approach that listens for the audio ended event: every time the currently playing audio stops, the function is called again to play the next one.



function playMorseArr(morseArr, idx)
{
  // Finish condition.
  if (idx >= morseArr.length)
    return;

  let next = function() { playMorseArr(morseArr, idx + 1) };

  if (morseArr[idx] === 1) {
    dot.onended = next;
    dot.play();
  }
  else if (morseArr[idx] === 2) {
    dash.onended = next;
    dash.play();
  }
  else {
    setTimeout(next, 250);
  }
}


You can initialize the procedure by calling playMorseArr() with the array and the start index:



          playMorseArr([1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1], 0);





edited 3 hours ago, answered 3 hours ago by Shidersz
            Do not use HTMLAudioElement for that kind of application.



HTMLMediaElements are asynchronous by nature: everything from the play() and pause() methods, through the obvious resource fetching, to the less obvious currentTime setting happens asynchronously.



This means that for applications that need precise timing (like a Morse-code reader), these elements are simply unreliable.



Instead, use the Web Audio API and its AudioBufferSourceNode objects, which you can control with µs precision.



            First fetch all your resources as ArrayBuffers, then when needed generate and play AudioBufferSourceNodes from these ArrayBuffers.



            You'll be able to start playing these synchronously, or to schedule them with higher precision than setTimeout will offer you (AudioContext uses its own clock).



Worried about the memory impact of having several AudioBufferSourceNodes playing your samples? Don't be. The data is stored only once in memory, in the AudioBuffer; AudioBufferSourceNodes are just lightweight views over this data and take up almost no extra space.






// I use a lib for Morse encoding, haven't tested it much though
// https://github.com/Syncthetic/MorseCode/
const morse = Object.create(MorseCode);

const ctx = new AudioContext();

(async function initMorseData() {
  // our AudioBuffer objects
  const [short, long] = await fetchBuffers();

  btn.onclick = e => {
    let time = 0; // a simple time counter
    const sequence = morse.encode(inp.value);
    console.log(sequence); // dots and dashes
    sequence.split('').forEach(type => {
      if (type === ' ') { // space => 0.5s of silence
        time += 0.5;
        return;
      }
      // create an AudioBufferSourceNode
      let source = ctx.createBufferSource();
      // assign the correct AudioBuffer to it
      source.buffer = type === '-' ? long : short;
      // connect to our output audio
      source.connect(ctx.destination);
      // schedule it to start at the end of the previous one
      source.start(ctx.currentTime + time);
      // increment our timer with our sample's duration
      time += source.buffer.duration;
    });
  };
  // ready to go
  btn.disabled = false;
})()
.catch(console.error);

function fetchBuffers() {
  return Promise.all(
    [
      'https://dl.dropboxusercontent.com/s/1cdwpm3gca9mlo0/kick.mp3',
      'https://dl.dropboxusercontent.com/s/h2j6vm17r07jf03/snare.mp3'
    ].map(url => fetch(url)
      .then(r => r.arrayBuffer())
      .then(buf => ctx.decodeAudioData(buf))
    )
  );
}

<script src="https://cdn.jsdelivr.net/gh/Syncthetic/MorseCode@master/morsecode.js"></script>
<input type="text" id="inp" value="sos"><button id="btn" disabled>play</button>








            share|improve this answer

























              Your Answer






              StackExchange.ifUsing("editor", function () {
              StackExchange.using("externalEditor", function () {
              StackExchange.using("snippets", function () {
              StackExchange.snippets.init();
              });
              });
              }, "code-snippets");

              StackExchange.ready(function() {
              var channelOptions = {
              tags: "".split(" "),
              id: "1"
              };
              initTagRenderer("".split(" "), "".split(" "), channelOptions);

              StackExchange.using("externalEditor", function() {
              // Have to fire editor after snippets, if snippets enabled
              if (StackExchange.settings.snippets.snippetsEnabled) {
              StackExchange.using("snippets", function() {
              createEditor();
              });
              }
              else {
              createEditor();
              }
              });

              function createEditor() {
              StackExchange.prepareEditor({
              heartbeatType: 'answer',
              autoActivateHeartbeat: false,
              convertImagesToLinks: true,
              noModals: true,
              showLowRepImageUploadWarning: true,
              reputationToPostImages: 10,
              bindNavPrevention: true,
              postfix: "",
              imageUploader: {
              brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
              contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
              allowUrls: true
              },
              onDemand: true,
              discardSelector: ".discard-answer"
              ,immediatelyShowMarkdownHelp:true
              });


              }
              });














              draft saved

              draft discarded


















              StackExchange.ready(
              function () {
              StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f54509959%2fhow-to-play-audio-files-syncronously-in-javascript%23new-answer', 'question_page');
              }
              );

              Post as a guest















              Required, but never shown

























              3 Answers
              3






              active

              oldest

              votes








              3 Answers
              3






              active

              oldest

              votes









              active

              oldest

              votes






              active

              oldest

              votes









              4














              Audios have an ended event that you can listen for, so you can await a Promise that resolves when that event fires:



              const audios = [undefined, dot, dash];
              async function playMorseArr(morseArr) {
              for (let i = 0; i < morseArr.length; i++) {
              const item = morseArr[i];
              await new Promise((resolve) => {
              if (item === 0) {
              // insert desired number of milliseconds to pause here
              setTimeout(resolve, 250);
              } else {
              audios[item].onended = resolve;
              audios[item].play();
              }
              });
              }
              }





              share|improve this answer
























              • What is the purpose of undefined at audios?

                – guest271314
                3 hours ago






              • 1





                Just a placeholder, since OP's morseArr has 1 corresponding to the dot audio, and 2 corresponding to the dash audio (but no audio corresponding to 0). Could've also done const item = morseArr[i] - 1 and had only two elements in the audios array

                – CertainPerformance
                3 hours ago
















              4














              Audios have an ended event that you can listen for, so you can await a Promise that resolves when that event fires:



              const audios = [undefined, dot, dash];
              async function playMorseArr(morseArr) {
              for (let i = 0; i < morseArr.length; i++) {
              const item = morseArr[i];
              await new Promise((resolve) => {
              if (item === 0) {
              // insert desired number of milliseconds to pause here
              setTimeout(resolve, 250);
              } else {
              audios[item].onended = resolve;
              audios[item].play();
              }
              });
              }
              }





              share|improve this answer
























              • What is the purpose of undefined at audios?

                – guest271314
                3 hours ago






              • 1





                Just a placeholder, since OP's morseArr has 1 corresponding to the dot audio, and 2 corresponding to the dash audio (but no audio corresponding to 0). Could've also done const item = morseArr[i] - 1 and had only two elements in the audios array

                – CertainPerformance
                3 hours ago














              4












              4








              4







              Audios have an ended event that you can listen for, so you can await a Promise that resolves when that event fires:



              const audios = [undefined, dot, dash];
              async function playMorseArr(morseArr) {
              for (let i = 0; i < morseArr.length; i++) {
              const item = morseArr[i];
              await new Promise((resolve) => {
              if (item === 0) {
              // insert desired number of milliseconds to pause here
              setTimeout(resolve, 250);
              } else {
              audios[item].onended = resolve;
              audios[item].play();
              }
              });
              }
              }





              share|improve this answer













              Audios have an ended event that you can listen for, so you can await a Promise that resolves when that event fires:



              const audios = [undefined, dot, dash];
              async function playMorseArr(morseArr) {
              for (let i = 0; i < morseArr.length; i++) {
              const item = morseArr[i];
              await new Promise((resolve) => {
              if (item === 0) {
              // insert desired number of milliseconds to pause here
              setTimeout(resolve, 250);
              } else {
              audios[item].onended = resolve;
              audios[item].play();
              }
              });
              }
              }






              share|improve this answer












              share|improve this answer



              share|improve this answer










              answered 3 hours ago









              CertainPerformanceCertainPerformance

              84.1k154168




              84.1k154168













              • What is the purpose of undefined at audios?

                – guest271314
                3 hours ago






              • 1





                Just a placeholder, since OP's morseArr has 1 corresponding to the dot audio, and 2 corresponding to the dash audio (but no audio corresponding to 0). Could've also done const item = morseArr[i] - 1 and had only two elements in the audios array

                – CertainPerformance
                3 hours ago



















              • What is the purpose of undefined at audios?

                – guest271314
                3 hours ago






              • 1





                Just a placeholder, since OP's morseArr has 1 corresponding to the dot audio, and 2 corresponding to the dash audio (but no audio corresponding to 0). Could've also done const item = morseArr[i] - 1 and had only two elements in the audios array

                – CertainPerformance
                3 hours ago

















              What is the purpose of undefined at audios?

              – guest271314
              3 hours ago





              What is the purpose of undefined at audios?

              – guest271314
              3 hours ago




              1




              1





              Just a placeholder, since OP's morseArr has 1 corresponding to the dot audio, and 2 corresponding to the dash audio (but no audio corresponding to 0). Could've also done const item = morseArr[i] - 1 and had only two elements in the audios array

              – CertainPerformance
              3 hours ago





              Just a placeholder, since OP's morseArr has 1 corresponding to the dot audio, and 2 corresponding to the dash audio (but no audio corresponding to 0). Could've also done const item = morseArr[i] - 1 and had only two elements in the audios array

              – CertainPerformance
              3 hours ago













              4














              I will use a recursive approach that will listen on the audio ended event. So, every time the current playing audio stop, the method is called again to play the next one.



              function playMorseArr(morseArr, idx)
              {
              // Finish condition.
              if (idx >= morseArr.length)
              return;

              let next = function() {playMorseArr(morseArr, idx + 1)};

              if (morseArr[idx] === 1) {
              dot.onended = next;
              dot.play();
              }
              else if (morseArr[idx] === 2) {
              dash.onended = next;
              dash.play();
              }
              else {
              setTimeout(next, 250);
              }
              }


              You can initialize the procedure calling playMorseArr() with the array and the start index:



              playMorseArr([1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1], 0);





              share|improve this answer






























                4














                I will use a recursive approach that will listen on the audio ended event. So, every time the current playing audio stop, the method is called again to play the next one.



                function playMorseArr(morseArr, idx)
                {
                // Finish condition.
                if (idx >= morseArr.length)
                return;

                let next = function() {playMorseArr(morseArr, idx + 1)};

                if (morseArr[idx] === 1) {
                dot.onended = next;
                dot.play();
                }
                else if (morseArr[idx] === 2) {
                dash.onended = next;
                dash.play();
                }
                else {
                setTimeout(next, 250);
                }
                }


                You can initialize the procedure calling playMorseArr() with the array and the start index:



                playMorseArr([1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1], 0);





                share|improve this answer




























                  4












                  4








                  4







                  I will use a recursive approach that will listen on the audio ended event. So, every time the current playing audio stop, the method is called again to play the next one.



                  function playMorseArr(morseArr, idx)
                  {
                  // Finish condition.
                  if (idx >= morseArr.length)
                  return;

                  let next = function() {playMorseArr(morseArr, idx + 1)};

                  if (morseArr[idx] === 1) {
                  dot.onended = next;
                  dot.play();
                  }
                  else if (morseArr[idx] === 2) {
                  dash.onended = next;
                  dash.play();
                  }
                  else {
                  setTimeout(next, 250);
                  }
                  }


                  You can initialize the procedure calling playMorseArr() with the array and the start index:



                  playMorseArr([1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1], 0);





                  share|improve this answer















                  I will use a recursive approach that will listen on the audio ended event. So, every time the current playing audio stop, the method is called again to play the next one.



                  function playMorseArr(morseArr, idx)
                  {
                  // Finish condition.
                  if (idx >= morseArr.length)
                  return;

                  let next = function() {playMorseArr(morseArr, idx + 1)};

                  if (morseArr[idx] === 1) {
                  dot.onended = next;
                  dot.play();
                  }
                  else if (morseArr[idx] === 2) {
                  dash.onended = next;
                  dash.play();
                  }
                  else {
                  setTimeout(next, 250);
                  }
                  }


                  You can initialize the procedure calling playMorseArr() with the array and the start index:



                  playMorseArr([1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1], 0);






                  share|improve this answer














                  share|improve this answer



                  share|improve this answer








                  edited 3 hours ago

























                  answered 3 hours ago









                  ShiderszShidersz

                  5,9662729




                  5,9662729























                      0














                      Do not use HTMLAudioElement for that kind of application.



                      The HTMLMediaElements are by nature asynchronous and everything from the play() method to the pause() one and going through the obvious resource fetching and the less obvious currentTime setting is asynchronous.



                      This means that for applications that need perfect timings (like a Morse-code reader), these elements are purely unreliable.



                      Instead, use the Web Audio API, and its AudioBufferSourceNodes objects, which you can control with µs precision.



                      First fetch all your resources as ArrayBuffers, then when needed generate and play AudioBufferSourceNodes from these ArrayBuffers.



                      You'll be able to start playing these synchronously, or to schedule them with higher precision than setTimeout will offer you (AudioContext uses its own clock).



                      Worried about memory impact of having several AudioBufferSourceNodes playing your samples? Don't be. The data is stored only once in memory, in the AudioBuffer. AudioBufferSourceNodes are just views over this data and take up no place.






                      // I use a lib for Morse encoding, didn't tested it too much though
                      // https://github.com/Syncthetic/MorseCode/
                      const morse = Object.create(MorseCode);

                      const ctx = new AudioContext();

                      (async function initMorseData() {
                      // our AudioBuffers objects
                      const [short, long] = await fetchBuffers();

                      btn.onclick = e => {
                      let time = 0; // a simple time counter
                      const sequence = morse.encode(inp.value);
                      console.log(sequence); // dots and dashes
                      sequence.split('').forEach(type => {
                      if(type === ' ') { // space => 0.5s of silence
                      time += 0.5;
                      return;
                      }
                      // create an AudioBufferSourceNode
                      let source = ctx.createBufferSource();
                      // assign the correct AudioBuffer to it
                      source.buffer = type === '-' ? long : short;
                      // connect to our output audio
                      source.connect(ctx.destination);
                      // schedule it to start at the end of previous one
                      source.start(ctx.currentTime + time);
                      // increment our timer with our sample's duration
                      time += source.buffer.duration;
                      });
                      };
                      // ready to go
                      btn.disabled = false
                      })()
                      .catch(console.error);

                      function fetchBuffers() {
                      return Promise.all(
                      [
                      'https://dl.dropboxusercontent.com/s/1cdwpm3gca9mlo0/kick.mp3',
                      'https://dl.dropboxusercontent.com/s/h2j6vm17r07jf03/snare.mp3'
                      ].map(url => fetch(url)
                      .then(r => r.arrayBuffer())
                      .then(buf => ctx.decodeAudioData(buf))
                      )
                      );
                      }

                      <script src="https://cdn.jsdelivr.net/gh/Syncthetic/MorseCode@master/morsecode.js"></script>
                      <input type="text" id="inp" value="sos"><button id="btn" disabled>play</button>








                      share|improve this answer






























                        0














                        Do not use HTMLAudioElement for this kind of application.

                        HTMLMediaElements are asynchronous by nature: everything from the play() and pause() methods, through the obvious resource fetching, to the less obvious currentTime setting happens asynchronously.

                        This means that for applications that need precise timing (like a Morse-code player), these elements are simply unreliable.

                        Instead, use the Web Audio API and its AudioBufferSourceNode objects, which you can control with microsecond precision.

                        First fetch all your resources as ArrayBuffers and decode them into AudioBuffers; then, when needed, create and play AudioBufferSourceNodes from those buffers.

                        You'll be able to start playing these synchronously, or to schedule them with far higher precision than setTimeout offers (the AudioContext uses its own clock).

                        Worried about the memory impact of having several AudioBufferSourceNodes playing your samples? Don't be. The data is stored only once in memory, in the AudioBuffer; AudioBufferSourceNodes are just lightweight views over that data and take up almost no space.
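                        The core of the approach is plain bookkeeping: accumulate an offset and give each sample an absolute start time, exactly like `source.start(ctx.currentTime + time)` below. As a rough, testable sketch of just that arithmetic (no Web Audio involved; the 0.1 s / 0.3 s / 0.5 s durations are made-up placeholders, not real sample lengths):

```javascript
// Placeholder durations: dot, dash, and inter-character silence (seconds).
const DURATIONS = { '.': 0.1, '-': 0.3, ' ': 0.5 };

// Compute absolute start times for a dot/dash/space sequence,
// mirroring the `time += duration` accumulation used with the AudioContext.
function scheduleTimes(sequence) {
  let time = 0;
  const starts = [];
  for (const type of sequence) {
    if (type !== ' ') starts.push({ type, at: time }); // silence gets no source
    time += DURATIONS[type];
  }
  return starts;
}
```

Because every start time is computed up front from one running counter, a long dash can never "push" the next sound late the way a setTimeout chain can.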






                        // I use a lib for Morse encoding (haven't tested it extensively)
                        // https://github.com/Syncthetic/MorseCode/
                        const morse = Object.create(MorseCode);

                        const ctx = new AudioContext();

                        (async function initMorseData() {
                          // our AudioBuffer objects
                          const [short, long] = await fetchBuffers();

                          btn.onclick = e => {
                            let time = 0; // a simple time counter
                            const sequence = morse.encode(inp.value);
                            console.log(sequence); // dots and dashes
                            sequence.split('').forEach(type => {
                              if (type === ' ') { // space => 0.5s of silence
                                time += 0.5;
                                return;
                              }
                              // create an AudioBufferSourceNode
                              const source = ctx.createBufferSource();
                              // assign the correct AudioBuffer to it
                              source.buffer = type === '-' ? long : short;
                              // connect to our output audio
                              source.connect(ctx.destination);
                              // schedule it to start at the end of the previous one
                              source.start(ctx.currentTime + time);
                              // increment our timer by the sample's duration
                              time += source.buffer.duration;
                            });
                          };
                          // ready to go
                          btn.disabled = false;
                        })()
                          .catch(console.error);

                        function fetchBuffers() {
                          return Promise.all(
                            [
                              'https://dl.dropboxusercontent.com/s/1cdwpm3gca9mlo0/kick.mp3',
                              'https://dl.dropboxusercontent.com/s/h2j6vm17r07jf03/snare.mp3'
                            ].map(url => fetch(url)
                              .then(r => r.arrayBuffer())
                              .then(buf => ctx.decodeAudioData(buf))
                            )
                          );
                        }

                        <script src="https://cdn.jsdelivr.net/gh/Syncthetic/MorseCode@master/morsecode.js"></script>
                        <input type="text" id="inp" value="sos"><button id="btn" disabled>play</button>
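                        If you'd rather keep the question's own [1, 1, 1, 0, 2, 2, 2, ...] encoding instead of pulling in a Morse library, mapping it onto the dot/dash/space characters the scheduling loop consumes is a one-liner. A hypothetical helper (the function name and character choices are just for illustration):

```javascript
// Map the question's numeric encoding (1 = dot, 2 = dash, 0 = pause)
// onto the '.'/'-'/' ' characters the scheduling loop consumes.
function toMorseChars(arr) {
  const map = { 1: '.', 2: '-', 0: ' ' };
  return arr.map(n => map[n]).join('');
}
```

You could then feed the result straight into the same `split('').forEach(...)` scheduling loop in place of `morse.encode(inp.value)`.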








                          edited 45 mins ago

























                          answered 1 hour ago by Kaiido





























