I'm developing a web application with Three.js that renders a scene in ASCII, much like the example given here. I'm having frame-rate issues, though.
I've tried all sorts of algorithms for converting the rendered scene into an ASCII string. Some were slower than the example and some much faster, but all of them were too slow for large scenes, even with the WebGL renderer.
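For reference, here's a simplified sketch of the kind of CPU-side loop I've been using (the `frameToAscii` helper and the `charset` ramp are just illustrative, and it assumes `renderer`, `scene`, and `camera` are already set up):

```javascript
// Simplified version of my current CPU-side conversion. Assumes the
// renderer was created with { preserveDrawingBuffer: true } so that
// readPixels still sees the frame after render() returns.
const gl = renderer.getContext();
const charset = ' .:-=+*#%@'; // darkest to brightest

function frameToAscii(width, height) {
  const pixels = new Uint8Array(width * height * 4);
  gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

  let out = '';
  // readPixels returns rows bottom-up, so walk them in reverse
  for (let y = height - 1; y >= 0; y--) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      // Rec. 601 luma from the RGB channels
      const luma = 0.299 * pixels[i] + 0.587 * pixels[i + 1] + 0.114 * pixels[i + 2];
      out += charset[Math.floor((luma / 256) * charset.length)];
    }
    out += '\n';
  }
  return out;
}
```

The synchronous `readPixels` call stalls the GPU pipeline, and building a fresh string character by character every frame doesn't help either.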
Now I'm considering moving the conversion over to the GPU via a shader, although I'm not sure how to make a Three.js shader output a string. Ideally I'd also like to be able to pass in a custom string of ASCII characters to use as a palette, though there's no string type in GLSL.
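The closest I've come to a plan is to sidestep strings entirely: rasterize the palette characters into a glyph-atlas texture with a 2D canvas, render the scene into a render target, and have a full-screen fragment shader snap each fragment to a character cell and map that cell's luminance to a glyph column. Here's a rough, untested sketch of what I mean (the `makeGlyphAtlas` helper and all the uniform names are hypothetical):

```javascript
// Untested sketch of the GPU pass I have in mind. The ASCII palette is
// baked into a horizontal glyph atlas (one cell per character), and a
// full-screen quad picks a glyph per character cell by brightness.
function makeGlyphAtlas(charset, cell = 8) {
  const canvas = document.createElement('canvas');
  canvas.width = cell * charset.length;
  canvas.height = cell;
  const ctx = canvas.getContext('2d');
  ctx.fillStyle = '#000';
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = '#fff';
  ctx.font = `${cell}px monospace`;
  ctx.textBaseline = 'top';
  for (let i = 0; i < charset.length; i++) {
    ctx.fillText(charset[i], i * cell, 0);
  }
  const tex = new THREE.CanvasTexture(canvas);
  tex.magFilter = THREE.NearestFilter;
  tex.minFilter = THREE.NearestFilter;
  return tex;
}

const charset = ' .:-=+*#%@';
const asciiMaterial = new THREE.ShaderMaterial({
  uniforms: {
    tScene: { value: null }, // assigned the render target's texture each frame
    tGlyphs: { value: makeGlyphAtlas(charset) },
    numGlyphs: { value: charset.length },
    cellSize: { value: new THREE.Vector2(8, 8) },
    resolution: { value: new THREE.Vector2(window.innerWidth, window.innerHeight) },
  },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform sampler2D tScene;
    uniform sampler2D tGlyphs;
    uniform float numGlyphs;
    uniform vec2 cellSize;
    uniform vec2 resolution;
    varying vec2 vUv;
    void main() {
      // snap this fragment to its character cell and sample the scene
      // once at the cell centre
      vec2 cellCoord = vUv * resolution / cellSize;
      vec2 cellUv = (floor(cellCoord) + 0.5) * cellSize / resolution;
      vec3 color = texture2D(tScene, cellUv).rgb;
      float luma = dot(color, vec3(0.299, 0.587, 0.114));
      // choose a glyph column by brightness
      float glyph = floor(luma * (numGlyphs - 1.0) + 0.5);
      // position inside the cell, remapped into the atlas
      vec2 inCell = fract(cellCoord);
      vec2 atlasUv = vec2((glyph + inCell.x) / numGlyphs, inCell.y);
      gl_FragColor = vec4(texture2D(tGlyphs, atlasUv).rgb, 1.0);
    }
  `,
});
```

The idea would be to render the scene into a `THREE.WebGLRenderTarget`, assign its `.texture` to `tScene`, and then draw a `PlaneGeometry(2, 2)` quad with this material through an orthographic camera. But that only produces pixels that *look* like ASCII, not an actual string, and I can't see how to get my palette string into GLSL other than baking it into a texture like this. Is there a better way?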
Thanks! :)