The kmedia2 interfaces allow you to ignore the fact that wave files, mp3s and so on all consist of data streams. Instead, you only implement methods to play them. Thus, you can write a wave loading routine in a way that lets you play wave files (as a PlayObject), but nobody else can reuse your code.
Asynchronous streams would be the alternative: you define an interface which allows you to pass data blocks in and get data blocks out. In MCOP, this looks like:
interface Codec {
	in async byte stream indata;
	out async byte stream outdata;
};
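To make the data-block model concrete, here is a minimal C++ sketch of what a codec with an asynchronous input stream and output stream amounts to. This is an illustrative analogy only: the class, method names (`connectOutput`, `pushInput`) and the callback mechanism are invented for this sketch, not part of the real MCOP/aRts API, where stream plumbing is generated from the IDL.

```cpp
#include <cstdint>
#include <functional>
#include <utility>
#include <vector>

// Sketch of a push-style codec: blocks arrive on "indata", transformed
// blocks are forwarded asynchronously to whoever is connected to "outdata".
class Codec {
public:
    // A downstream consumer registers here to receive output blocks.
    void connectOutput(std::function<void(const std::vector<uint8_t>&)> sink) {
        m_sink = std::move(sink);
    }

    // An upstream producer pushes raw blocks here; each block is
    // transformed and forwarded, block by block, as it arrives.
    void pushInput(const std::vector<uint8_t>& block) {
        std::vector<uint8_t> out = decode(block);
        if (m_sink)
            m_sink(out);
    }

private:
    // Placeholder "decoding": a real codec would parse the byte format
    // here. The identity transform keeps the sketch self-contained.
    std::vector<uint8_t> decode(const std::vector<uint8_t>& block) {
        return block;
    }

    std::function<void(const std::vector<uint8_t>&)> m_sink;
};
```

The point is that the codec never needs to know who produces its input or consumes its output; any object with a matching stream can be connected on either side.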
Of course, codecs could also provide attributes to emit additional data, such as format information:
interface ByteAudioCodec {
	in async byte stream indata;
	out async byte stream outdata;
	readonly attribute long samplingRate, bits, channels;
};
This ByteAudioCodec could, for instance, be connected to a ByteStreamToAudio object to produce real float audio.
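The conversion step such a ByteStreamToAudio object performs can be sketched in plain C++. The function below is hypothetical (the real aRts object operates on MCOP streams, not on a vector); it only illustrates how the codec's format attributes would drive the conversion, assuming 16-bit signed little-endian samples.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative only: convert a block of raw sample bytes into float audio,
// using the "bits" format attribute a ByteAudioCodec would expose.
// This sketch handles only the bits == 16 case (signed little-endian).
std::vector<float> bytesToFloat(const std::vector<uint8_t>& bytes, long bits) {
    std::vector<float> audio;
    if (bits != 16)
        return audio;  // other sample widths omitted from the sketch
    for (std::size_t i = 0; i + 1 < bytes.size(); i += 2) {
        // Reassemble the little-endian 16-bit sample.
        int16_t sample =
            static_cast<int16_t>(bytes[i] | (bytes[i + 1] << 8));
        // Scale to the float range [-1.0, 1.0).
        audio.push_back(sample / 32768.0f);
    }
    return audio;
}
```

A real converter would also use samplingRate and channels to produce a properly described audio stream; they are left out here to keep the sketch short.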
Of course, other codec types could involve directly emitting video data, for instance:
interface VideoCodec {
	in async byte stream indata;
	out video stream outdata;	/* note: video streams do not exist yet */
};
Most likely, a codec concept should be employed rather than the “you know how to play and I don't” approach that, for instance, WavPlayObject currently uses. However, somebody needs to sit down and do some experiments before an API can be finalized.