39 votes

Some months ago, with Android ICS (4.0), I developed an Android kernel module that intercepted the "pcmC0D0p" playback device to fetch all system audio.

My goal is to stream ALL audio (or at least the music being played) to a remote speaker via AirPlay.

The kernel module worked, but there were several problems (kernel versions, root privileges, etc.), so I stopped working on it.

Now we have Android 4.1 and 4.2, and I have new hope!

Does anyone have an idea how to capture the audio on Android?

I had the following ideas:

  1. Connect to the same phone via Bluetooth, set the audio routing to BT and grab the audio at the "other end": this shouldn't work

  2. Intercept the audio with a kernel module as done before: hardcore; I got it working, but it's not practical

  3. JACK Audio Connection Kit: sadly, Android uses "tinyALSA" and not "ALSA". TinyALSA does NOT support any filters the way JACK does (but this is what gave me the idea for the kernel module)

  4. Use PulseAudio as a replacement for AudioFlinger, but this is also not feasible


EDIT (I forgot these):

  1. I compiled "tinymix" (a stripped-down version of the ALSA mixer) from tinyALSA (the ALSA used on Android) and tried to route the audio output to the mic input - but with no success (I couldn't make sense of the controls). This also requires rooting: not feasible

  2. I tested OpenSL ES, but I'm no C expert, and I got no further than "I can record the microphone, but nothing else" (maybe I was wrong?)


I just found ROUTE_TYPE_LIVE_AUDIO:

A device that supports live audio routing will allow the media audio stream to be routed to supported destinations. This can include internal speakers or audio jacks on the device itself, A2DP devices, and more.

Once initiated this routing is transparent to the application. All audio played on the media stream will be routed to the selected destination.

Maybe this helps in some way?
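For reference, here is a rough, untested sketch (Java, API 16+; the class and method names are my own) of how an app could at least observe this live-audio routing via the MediaRouter API. As far as I understand it, this only lets you see or select the route - it does not let you tap the PCM data itself:

    import android.content.Context;
    import android.media.MediaRouter;
    import android.media.MediaRouter.RouteInfo;
    import android.util.Log;

    public class RouteInspector {

        private static final String TAG = "RouteInspector";

        // Log the route currently used for the media (music) stream and get
        // notified when the user/system switches it, e.g. to an A2DP device.
        public static void watchLiveAudioRoute(Context context) {
            MediaRouter router =
                    (MediaRouter) context.getSystemService(Context.MEDIA_ROUTER_SERVICE);

            RouteInfo current = router.getSelectedRoute(MediaRouter.ROUTE_TYPE_LIVE_AUDIO);
            Log.d(TAG, "Current live-audio route: " + current.getName());

            router.addCallback(MediaRouter.ROUTE_TYPE_LIVE_AUDIO,
                    new MediaRouter.SimpleCallback() {
                        @Override
                        public void onRouteSelected(MediaRouter r, int type, RouteInfo info) {
                            // The framework re-routes the media stream itself;
                            // the app only observes the change.
                            Log.d(TAG, "Live audio now routed to: " + info.getName());
                        }
                    });
        }
    }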

I'm running out of ideas, but I want to "crack this nut" - maybe someone can help me?

EDIT:

I'm really new to C & kernel coding (though I successfully created a cross-compiled audio-interception module) - but isn't there any way to listen in at the point where the PCM data passes from userspace (Java, the C layer?) to kernel space (tinyALSA, kernel module), without hacking & rooting?

There's no support in Android for doing this, so it would only work on a custom ROM where you've added this functionality (e.g. by modifying some kernel module, as you suggested). Newer Qualcomm platforms support WiFi Display though, which is a different technology from AirPlay that serves as a kind of wireless HDMI connection to compatible devices (e.g. some newer TVs). IIRC, the MediaRouter is a widget that you can add to your app to allow the user to select where audio should be routed. It only lets the user select devices supported/detected by the phone/tablet, though. – Michael
Thank you for your response. That's really bad :*( But I'll keep on searching for a solution. I will extend my "test" documentation above - I forgot something. – Martin L.
"isn't there any way to listen in at the point where the PCM data passes from userspace (Java, the C layer?) to kernel space (tinyALSA, kernel module), without hacking & rooting?" Unfortunately for you, there isn't. Another thing you might want to keep in mind is that not all playback necessarily goes to the same ALSA playback device (pcmCxDyp). Normal playback might go to one device, low-power playback to another, and low-latency playback to yet another. amixer/tinymix won't do any good unless the platform provides some sort of readback of played data, which typically isn't the case. – Michael
Hi Michael, yes, you're right. But tinyALSA has one good point: it ALWAYS uses pcmC0D0p for playback and pcmC0D0c for capturing - I've analysed its source. At the moment I'm re-installing the VirtualBox for Android kernel development and giving my kernel module another try (after some months of cooling down). I won't give up :) – Martin L.
"it ALWAYS uses pcmC0D0p for playback and pcmC0D0c for capturing." Not necessarily. I've worked with phones that use libtinyalsa for USB audio playback, and in those cases pcmC1D0p was used. – Michael

1 Answer

2 votes

You can pass the audio traffic through your own local socket server:

  1. Create a local socket server.

  2. Modify the audio HTTP stream address so that it goes through your local socket server; for example, the address

    http://example.com/stream.mp3  ->

    http://127.0.0.1:8888/http://example.com/stream.mp3

    where 8888 is your socket port.

  3. Play

    http://127.0.0.1:8888/http://example.com/stream.mp3 
    

    with the default media player.

  4. Read all incoming requests on port 8888 in your local socket server, analyze them and pass them on to the example.com server.

  5. Read the response from example.com and pass it to the local client (the media player). HERE you can capture the audio stream (see the sketch below)!
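
To illustrate the idea, here is a minimal, untested sketch of such a capturing proxy in Java. The class name, the capture path "/sdcard/captured_stream.mp3" (which needs WRITE_EXTERNAL_STORAGE) and the hard-coded "audio/mpeg" content type are only assumptions for this example; a real implementation would also forward request and response headers properly.

    import java.io.FileOutputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.ServerSocket;
    import java.net.Socket;
    import java.net.URL;
    import java.util.Scanner;

    // Listens on 127.0.0.1:8888, fetches the URL that follows the first "/"
    // in the request path, and relays the bytes to the media player while
    // writing a copy of them to a capture file.
    public class CapturingProxy implements Runnable {

        private static final int PORT = 8888;

        @Override
        public void run() {
            try {
                ServerSocket server = new ServerSocket(PORT);
                while (true) {
                    Socket client = server.accept();
                    try {
                        handle(client);
                    } catch (Exception e) {
                        e.printStackTrace();
                    } finally {
                        client.close();
                    }
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        private void handle(Socket client) throws Exception {
            // The request line looks like:
            // GET /http://example.com/stream.mp3 HTTP/1.1
            Scanner scanner = new Scanner(client.getInputStream(), "US-ASCII");
            String requestLine = scanner.nextLine();
            String target = requestLine.split(" ")[1].substring(1); // drop leading '/'

            HttpURLConnection upstream =
                    (HttpURLConnection) new URL(target).openConnection();
            InputStream fromUpstream = upstream.getInputStream();

            OutputStream toPlayer = client.getOutputStream();
            toPlayer.write(("HTTP/1.0 200 OK\r\n"
                    + "Content-Type: audio/mpeg\r\n\r\n").getBytes("US-ASCII"));

            FileOutputStream capture =
                    new FileOutputStream("/sdcard/captured_stream.mp3");
            byte[] buf = new byte[8192];
            int n;
            try {
                while ((n = fromUpstream.read(buf)) != -1) {
                    toPlayer.write(buf, 0, n); // feed the media player...
                    capture.write(buf, 0, n);  // ...and capture the same bytes here
                }
            } finally {
                capture.close();
                fromUpstream.close();
                upstream.disconnect();
            }
        }
    }

Start it on a background thread (new Thread(new CapturingProxy()).start();) and point the default media player at http://127.0.0.1:8888/http://example.com/stream.mp3; every byte that reaches the player then also passes through the capture file. Note that this taps the compressed stream of that one player, not all system audio.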