
I'm encountering some errors after upgrading the just_audio package from just_audio: 0.2.2 to just_audio: 0.6.5. I have tried to change the code but failed. Here is a screenshot of my code: [screenshots of code]
Error details: [screenshots of errors] Please help!

1
Hi there, can you tell us some more about the error you are getting? Are your classes and methods not recognized by the editor? - Azad Prajapat
Hello, the errors are only on the just_audio package's methods/variables. Maybe there are changes in the package which make these methods unrecognizable? I have added some more detail; please have a look and let me know if you need any other detail. Thank you - sitara
Hi there, it looks like these arguments and methods are no longer available in the latest version. I tried the example code of the latest package and it works fine; the streams are now members of AudioPlayer, so you can use them as _player.playbackEventStream - Azad Prajapat
Yes, you're right, I can use playbackEventStream. But what about the other methods like AudioPlaybackState and position, what should I use to replace them? - sitara
Hi, you need to figure out from the API documentation, or by testing, which methods are available and which were removed. Alternatively, you can stay on the previous version, which has all these methods. Thanks - Azad Prajapat

1 Answer


A lot has changed between 0.2.x and 0.6.x, so I will focus my answer on the specific error you showed concerning state.

In the new state model of 0.6.x, the state of the player consists of two orthogonal states called playing and processingState, as depicted in the following state diagram from the project README:

[state diagram of playing × processingState from the just_audio README]

playing can be true or false, while processingState can be one of these states:

enum ProcessingState {
  /// The player has not loaded an [AudioSource].
  idle,

  /// The player is loading an [AudioSource].
  loading,

  /// The player is buffering audio and unable to play.
  buffering,

  /// The player has enough audio buffered and is able to play.
  ready,

  /// The player has reached the end of the audio.
  completed,
}

One reason why this new composite state model was chosen is that in the old model, there was no way to distinguish between the following two states:

  1. Buffering while playing.
  2. Buffering while paused.

In the new model, these are easily distinguishable. If your app cares about both of these orthogonal states, you can listen to the playerStateStream which emits events encapsulating both states:

_audioPlayer.playerStateStream.listen((state) {
  if (state.playing) {
    // e.g. show a pause button
  } else {
    // e.g. show a play button
  }
  switch (state.processingState) {
    case ProcessingState.idle: break;      // no audio source loaded
    case ProcessingState.loading: break;   // audio source is loading
    case ProcessingState.buffering: break; // stalled while buffering
    case ProcessingState.ready: break;     // enough audio buffered to play
    case ProcessingState.completed: break; // reached the end of the audio
  }
});
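For comparison, a check written against the old 0.2.x API (which exposed a single AudioPlaybackState enum, if I recall the old names correctly; treat them as illustrative) might migrate along these lines:

```dart
// Old (0.2.x), illustrative:
//   if (player.playbackState == AudioPlaybackState.buffering) { ... }

// New (0.6.x): playing and processingState are checked separately,
// so "buffering while playing" and "buffering while paused" differ.
final state = _audioPlayer.playerState;
final bufferingWhilePlaying =
    state.playing && state.processingState == ProcessingState.buffering;
final bufferingWhilePaused =
    !state.playing && state.processingState == ProcessingState.buffering;
```

As for position, the new API exposes both a getter (_audioPlayer.position) and a stream (_audioPlayer.positionStream) you can listen to for UI updates.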

Internally, playerStateStream is implemented by simply using rxdart to combine playingStream and processingStateStream. In more advanced scenarios, you may use a similar technique to combine any other streams your app is interested in into a single stream. If you have a good argument for making a particular combination of streams available as a standard combination provided by the plugin, you are welcome to submit a feature request on GitHub.
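To sketch that technique (this assumes the rxdart package; CombinedState and the particular pair of streams combined here are just illustrations):

```dart
import 'package:just_audio/just_audio.dart';
import 'package:rxdart/rxdart.dart';

/// Illustrative holder for a combined view of player state and position.
class CombinedState {
  final PlayerState playerState;
  final Duration position;
  CombinedState(this.playerState, this.position);
}

/// Emits a new CombinedState whenever either source stream emits.
Stream<CombinedState> combinedStateStream(AudioPlayer player) =>
    Rx.combineLatest2<PlayerState, Duration, CombinedState>(
      player.playerStateStream,
      player.positionStream,
      (state, position) => CombinedState(state, position),
    );
```

You would listen to this one stream in your UI instead of subscribing to each source stream separately.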

Finally, since it appears you are also using audio_service, I would recommend looking at the latest audio_service example which demonstrates how to leverage the latest just_audio API to implement your background task in a more concise way.