Android – how to play MP3 files in a Flutter application?

I wrote a Dart web application that retrieves MP3 files from a server and plays them; now I'm trying to write a mobile version using Flutter. I know dart:web_audio is the main option for web applications, but Flutter can't find it in my SDK. I know it's there, because I can compile the following to JavaScript:

    import 'dart:html';
    import 'dart:convert';
    import 'dart:web_audio';

    AudioContext audioContext;

    main() async {
      audioContext = new AudioContext();
      var ul = querySelector('#songs') as UListElement;
      var signal = await HttpRequest.getString('http://10.0.0.6:8000/api/filelist');
      print("signal: $signal");

      // Build a list item with a Play button for each song in the response.
      Map json = JSON.decode(signal);
      for (Map file in json['songs']) {
        var li = new LIElement()..appendText(file['title']);
        var button = new ButtonElement();
        button.setAttribute("id", "#${file['file']}");
        button.appendText("Play");

        li.append(button);
        new Song(button, file['file']);
        ul.append(li);
      }
    }

    class Song {
      ButtonElement button;
      bool _playing = false;
      AudioBufferSourceNode _source;
      String title;

      Song(this.button, this.title) {
        button.onClick.listen((e) => _toggle());
      }

      _toggle() {
        _playing = !_playing;
        _playing ? _start() : _stop();
      }

      // Fetch the MP3, decode it, and start playback.
      _start() {
        return HttpRequest
            .request("http://10.0.0.6:8000/music/$title", responseType: "arraybuffer")
            .then((HttpRequest httpRequest) {
          return audioContext
              .decodeAudioData(httpRequest.response)
              .then((AudioBuffer buffer) {
            _source = audioContext.createBufferSource();
            _source.buffer = buffer;
            _source.connectNode(audioContext.destination);
            _source.start(0);
            button.text = "Stop";
            _source.onEnded.listen((e) {
              _playing = false;
              button.text = "Play";
            });
          });
        });
      }

      _stop() {
        _source.stop(0);
        button.text = "Play";
      }
    }

How do I rewrite the dart:web_audio part of this code for a Flutter application? Can Flutter access MediaPlayer? If so, how would I reference it in pubspec.yaml?

Answer:

As Raju bitter mentioned above, Flutter once provided some built-in audio wrappers in its core engine, but they were later removed: https://github.com/flutter/flutter/issues/1364.

A Flutter application is just an iOS or Android app, so you can use the hello_services model (https://github.com/flutter/flutter/tree/master/examples/hello_services) to do anything the underlying iOS/Android platform can do, by writing some Java or Objective-C code and calling it from Flutter. The model is documented at https://flutter.io/platform-services. It's not yet as easy as we'd like, and many improvements are coming soon.
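For illustration only, here is a minimal sketch of what the Dart side of that platform-services idea can look like with Flutter's MethodChannel API (which postdates the hello_services example referenced above). The channel name, method names, and argument keys are hypothetical; the Java or Objective-C side would register the same channel and drive the native player (for example android.media.MediaPlayer) in response to these calls.

    import 'package:flutter/services.dart';

    // Hypothetical channel name; it only has to match whatever the
    // host-side (Java/Obj-C) code registers for the same channel.
    const MethodChannel _audioChannel = MethodChannel('example.app/audio');

    /// Asks the host platform to stream and play an MP3 from [url].
    Future<void> playRemoteMp3(String url) =>
        _audioChannel.invokeMethod('play', <String, String>{'url': url});

    /// Asks the host platform to stop the current playback.
    Future<void> stopPlayback() => _audioChannel.invokeMethod('stop');

On the Android side, a handler for the same channel would typically create a MediaPlayer, call setDataSource with the URL, prepare it asynchronously, and start it. Because that native code lives in the app's own android/ and ios/ folders, this approach needs no extra pubspec.yaml dependency beyond the Flutter SDK itself.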
