I'm trying to bridge between Unity and WebRTC.
Update: Alright, I created a repo for this. I'm still seeing weird artifacts when rendering the textures and can't figure out why, if anyone wants to take a look.
Since WebRTC can provide frames from VideoTracks as textures, I thought it would be best to have it share the EGL context with Unity, so I can render them directly into the engine.
I figured that would be done by setting the video hardware acceleration options on the PeerConnectionFactory, as follows:
PeerConnectionFactory.initializeAndroidGlobals(mainActivity.getApplicationContext(), true);
PeerConnectionFactory factory = new PeerConnectionFactory(new PeerConnectionFactory.Options());
// Wrap whatever EGL context is current on the calling thread (intended to be Unity's).
EglBase rootEglBase = EglBase.createEgl14(EGL14.eglGetCurrentContext(), EglBase.CONFIG_PIXEL_RGBA_BUFFER);
// Share that context with both the local (encoder) and remote (decoder) side.
factory.setVideoHwAccelerationOptions(rootEglBase.getEglBaseContext(), rootEglBase.getEglBaseContext());
Of course, these are only a few assumptions about how it should work.
Since setVideoHwAccelerationOptions takes an EglBase.Context, I need to get the context from Unity and convert it into that type.
To do that, I found that EglBase.createEgl14 might do the trick, but I need the correct config attributes, which I can't find. I tried a few combinations, but that didn't work.
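For reference, the config constants that ship with EglBase (CONFIG_PLAIN, CONFIG_RGBA, CONFIG_PIXEL_BUFFER, CONFIG_PIXEL_RGBA_BUFFER) are just plain EGL attribute lists, so a hand-rolled one can be passed too. This is only a sketch of the kind of combination I've been trying; the attributes below are a guess (roughly CONFIG_RGBA, i.e. RGBA8888 with ES2), not necessarily what Unity's context was created with:

// A guessed config: RGBA8888 with ES2, roughly equivalent to EglBase.CONFIG_RGBA.
int[] configAttributes = new int[] {
        EGL14.EGL_RED_SIZE, 8,
        EGL14.EGL_GREEN_SIZE, 8,
        EGL14.EGL_BLUE_SIZE, 8,
        EGL14.EGL_ALPHA_SIZE, 8,
        EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
        EGL14.EGL_NONE
};
EglBase rootEglBase = EglBase.createEgl14(EGL14.eglGetCurrentContext(), configAttributes);
factory.setVideoHwAccelerationOptions(rootEglBase.getEglBaseContext(), rootEglBase.getEglBaseContext());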
I'm basically stuck; I don't know where to go from here.
One other option is to get the ByteBuffer from the frames and pass them to Unity, but that would be a performance hit and a waste of resources, since both Unity and WebRTC speak OpenGL. I feel I'm very close to the answer, but something is missing.
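For completeness, this is roughly what that CPU-copy fallback would look like (a sketch using the VideoRenderer.Callbacks API; the I420FrameBuffer class and the getLatestFrame() polling method are placeholders I'd call from C# through an AndroidJavaObject, not WebRTC API):

import java.nio.ByteBuffer;
import org.webrtc.VideoRenderer;

public class I420FrameBuffer implements VideoRenderer.Callbacks {
    private volatile byte[] latestFrame; // picked up from Unity via AndroidJavaObject

    @Override
    public void renderFrame(VideoRenderer.I420Frame frame) {
        if (frame.yuvFrame) {
            int chromaHeight = (frame.height + 1) / 2;
            int ySize = frame.yuvStrides[0] * frame.height;
            int uSize = frame.yuvStrides[1] * chromaHeight;
            int vSize = frame.yuvStrides[2] * chromaHeight;
            byte[] copy = new byte[ySize + uSize + vSize];
            copyPlane(frame.yuvPlanes[0], copy, 0, ySize);
            copyPlane(frame.yuvPlanes[1], copy, ySize, uSize);
            copyPlane(frame.yuvPlanes[2], copy, ySize + uSize, vSize);
            latestFrame = copy;
        }
        // Frames must always be handed back to WebRTC once we are done with them.
        VideoRenderer.renderFrameDone(frame);
    }

    private static void copyPlane(ByteBuffer src, byte[] dst, int offset, int length) {
        src.position(0);
        src.get(dst, offset, length);
    }

    // Polled from Unity; returns null until a new frame has arrived.
    public byte[] getLatestFrame() {
        byte[] frame = latestFrame;
        latestFrame = null;
        return frame;
    }
}

It would be attached with videoTrack.addRenderer(new VideoRenderer(new I420FrameBuffer())), but as said, it copies every frame through the CPU.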
Update: I figured out that eglGetCurrentContext() was not returning the context because it was not being called from the main Unity thread. Now that I have the context, the textureId of the I420Frame frames makes sense, but they are not rendering. I think it has to do with the config attributes I'm passing to EglBase.createEgl14, or else it might also be a threading thing?
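In case it matters, this is roughly how I'm handling the texture frames now (a sketch; TextureFrameQueue and takePendingFrame are placeholder names). renderFrame arrives on WebRTC's decoder thread, so I only queue the frame there and consume it from the thread that owns the shared Unity context:

import java.util.concurrent.atomic.AtomicReference;
import org.webrtc.VideoRenderer;

public class TextureFrameQueue implements VideoRenderer.Callbacks {
    private final AtomicReference<VideoRenderer.I420Frame> pending = new AtomicReference<>();

    @Override
    public void renderFrame(VideoRenderer.I420Frame frame) {
        // Called on WebRTC's decoder thread: only queue the frame here.
        VideoRenderer.I420Frame dropped = pending.getAndSet(frame);
        if (dropped != null) {
            // Return any frame we skip, otherwise the decoder runs out of buffers.
            VideoRenderer.renderFrameDone(dropped);
        }
    }

    // Called from the thread that owns the shared (Unity) EGL context.
    // frame.textureId is a GL_TEXTURE_EXTERNAL_OES texture and frame.samplingMatrix
    // has to be applied when sampling it; renderFrameDone(frame) must be called
    // once the texture has been drawn or copied.
    public VideoRenderer.I420Frame takePendingFrame() {
        return pending.getAndSet(null);
    }
}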