In short, I would like my custom shader to have access to its own output from the last frame, in order to create a pseudo-motion-blur effect through color accumulation, although motion blur isn't really the correct term.
Consider the following: I have a mesh with a mostly transparent texture whose offset constantly changes. I would like to build a shader with which the texture, instead of jumping, slowly fades out from the locations it has previously been at. A full-screen blur effect will not do; the blur has to stay on the object alone and needs to be independent of camera movement.
My idea so far has been to define a RenderTexture that holds the texture generated during the last frame. But since RenderTextures are rendered by a camera, and all I want is the mesh-specific texture, I don't know how to implement that.
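For what it's worth, Unity can update a RenderTexture without any camera at all via Graphics.Blit with a material; ping-ponging between two RenderTextures lets the shader read its own previous output. Here is a rough sketch of that idea, not a drop-in solution: the field names, the "_PrevTex" property, and the accumulation shader itself are placeholder assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch: accumulate the previous frame's result into a
// RenderTexture without a camera, by ping-ponging two buffers.
public class TrailAccumulator : MonoBehaviour
{
    public Material accumulationMaterial; // shader that fades the old result and draws the offset texture on top
    public Renderer targetRenderer;       // the mesh that should show the trail
    public int size = 512;

    RenderTexture currentRT, previousRT;

    void Start()
    {
        currentRT  = new RenderTexture(size, size, 0);
        previousRT = new RenderTexture(size, size, 0);
    }

    void Update()
    {
        // The accumulation shader samples "_PrevTex" (last frame's output),
        // darkens/fades it slightly, and blends the current texture over it.
        accumulationMaterial.SetTexture("_PrevTex", previousRT);
        Graphics.Blit(null, currentRT, accumulationMaterial);

        // Show the accumulated result on the mesh itself.
        targetRenderer.material.mainTexture = currentRT;

        // Swap buffers so this frame's output becomes next frame's input.
        var tmp = previousRT;
        previousRT = currentRT;
        currentRT = tmp;
    }
}
```

The key point is that Graphics.Blit renders a full-screen quad with the given material straight into the destination RenderTexture, so no scene camera or extra plane is needed.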
I have also tried to use GrabPass { } in the shader, but that again seems to grab the entire rendered screen rather than just the texture.
I may be misunderstanding how shaders work. If rendering the entire screen is the only option, then I could assign my material to a plane and render the texture that way, but I am unsure how, or rather where in my code, I would do that. Creating a primitive plane as well as an extra camera just to get a shader working also seems like a lot of machinery for an effect that I initially thought would be simple to implement. *shrug*
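In case the camera route turns out to be necessary after all, the setup is mostly glue code: a dedicated orthographic camera that sees only a quad carrying the material, rendering into a RenderTexture. A sketch under assumptions (the layer name and all field names are made up for illustration):

```csharp
using UnityEngine;

// Hypothetical sketch of the plane-plus-camera approach: a dedicated
// camera renders only a quad (on its own layer) into a RenderTexture.
public class TextureCameraSetup : MonoBehaviour
{
    public Material effectMaterial; // the material whose output we want to capture
    public int size = 512;
    public RenderTexture output;

    void Start()
    {
        // Assumes a layer named "TextureOnly" was created in the editor.
        int layer = LayerMask.NameToLayer("TextureOnly");

        // Quad that carries the material, placed on its own layer.
        var quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.layer = layer;
        quad.GetComponent<Renderer>().material = effectMaterial;

        // Orthographic camera that sees only that layer and renders to a texture.
        var camGO = new GameObject("TextureCamera");
        var cam = camGO.AddComponent<Camera>();
        cam.orthographic = true;
        cam.orthographicSize = 0.5f; // the built-in quad is 1x1 units
        cam.cullingMask = 1 << layer;
        cam.transform.position = quad.transform.position - Vector3.forward;

        output = new RenderTexture(size, size, 16);
        cam.targetTexture = output;
    }
}
```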
Any help or pointers would be much appreciated,
- s_m_w
Edit: Here is a very quick mock drawing of what I would like to achieve:
I achieve the current effect by manipulating the texture offset. The "blur" in the second picture needs to be independent of camera movement.