The problem is the following: I render sprites to a texture, then render that texture to the screen. Simple, right? However, each time a sprite is rendered to the texture, it overwrites every pixel it covers, regardless of the alpha value of those pixels. Example:
Here you can see two sprites with the text "ABC". The first is rendered in the top-right corner and the second on top of it with a Y offset. As you can see, the second sprite overwrites even those pixels of the first that should have been left alone, replacing them with the background color. When I do the same thing but render the sprites directly to the screen, instead of rendering them to a texture and then rendering that texture to the screen, the result is as expected: the second sprite overwrites only the pixels it is supposed to.
So I'm not quite sure what's going on here, because alpha blending is enabled for all render targets and the depth test is disabled for both screen and texture rendering. I'm using DirectX11 on Windows Phone 8 (XAML with DirectX interop).
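For context, the blend state is set up roughly like this for both the texture and the back buffer (a minimal sketch of the standard non-premultiplied alpha setup; the function and variable names are placeholders, not my exact code):

```cpp
#include <d3d11.h>

// Sketch: create a standard "source over" alpha blend state.
ID3D11BlendState* CreateAlphaBlendState(ID3D11Device* device)
{
    D3D11_BLEND_DESC desc = {};
    desc.RenderTarget[0].BlendEnable           = TRUE;
    desc.RenderTarget[0].SrcBlend              = D3D11_BLEND_SRC_ALPHA;     // color = src.rgb * src.a
    desc.RenderTarget[0].DestBlend             = D3D11_BLEND_INV_SRC_ALPHA; //       + dst.rgb * (1 - src.a)
    desc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;
    desc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ONE;           // alpha written to the target
    desc.RenderTarget[0].DestBlendAlpha        = D3D11_BLEND_ZERO;          //       = src.a
    desc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_ADD;
    desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

    ID3D11BlendState* state = nullptr;
    device->CreateBlendState(&desc, &state);
    return state;
}

// Bound before every draw call, for both render targets:
//   context->OMSetBlendState(state, nullptr, 0xFFFFFFFF);
```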
EDIT: Restating the problem a bit: let's say I clear a texture to red, so when I render it to the screen I get a red screen. Now I render a completely transparent sprite (color 0,0,0,0) onto that red texture. I would expect no change, and after rendering to the screen I'd still get a red screen. That is not what happens: rendering the transparent sprite actually writes the (0,0,0,0) color into the texture, and I end up with a red texture that has a transparent window in it. So after rendering it to the screen, I can see the color the screen was cleared to showing through. What's going on?
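In code, the repro sequence looks roughly like this (a sketch; `textureRTV`, `backBufferRTV`, `DrawTransparentSprite`, and `DrawTextureToScreen` are hypothetical names standing in for my actual render target views and draw helpers):

```cpp
// 1. Clear the offscreen texture to opaque red.
const float red[4] = { 1.0f, 0.0f, 0.0f, 1.0f };
context->ClearRenderTargetView(textureRTV, red);

// 2. Render a fully transparent sprite (every pixel is 0,0,0,0) into it.
context->OMSetRenderTargets(1, &textureRTV, nullptr); // no depth buffer bound
DrawTransparentSprite();                              // hypothetical helper

// 3. Composite the texture onto the back buffer.
context->OMSetRenderTargets(1, &backBufferRTV, nullptr);
DrawTextureToScreen();                                // hypothetical helper

// Expected: the texture is still solid red, so the screen shows red.
// Actual:   step 2 writes (0,0,0,0) into the texture, so step 3 shows the
//           back buffer's clear color through a "window" in the red texture.
```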