
Thanks for taking the time to check out my issue.

I am working on improving the ocean in my first attempt at a game. I have decided on using a bump map over my ocean tiles to add a little texture to the water. To do this, I draw my water tiles to a render target and then apply a pixel shader while drawing that render target to the back buffer.

The problem I am having is that the pixel shader seems to offset or displace the position of the render target that is drawn. Observe these two images:


This image shows the game without the pixel shader running. Notice the "shallow water" around the islands, which is a solid color here.

When the pixel shader is run, that shallow water is consistently offset to the right.

I am using the bump map provided in Riemer's novice bump mapping tutorial. One thought I had was that the dimensions of this bump map do not match those of the render target I am applying it to. However, I'm not entirely sure how I would create/resize this bump map.
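
The only idea I've come up with so far (untested, and it may be the wrong approach entirely) is to redraw the bump map once, at load time, into a render target that matches the size of my water scene, something like this:

    // One-time setup: stretch the bump map onto a render target that matches
    // the water scene, so both textures line up 1:1 in the pixel shader.
    RenderTarget2D resizedBump = new RenderTarget2D(_graphics, waterScene.Width, waterScene.Height);

    _graphics.SetRenderTarget(resizedBump);
    _graphics.Clear(Color.Gray); // ~0.5 grey = "no perturbation" for the shader

    SpriteBatch sb = new SpriteBatch(_graphics);
    sb.Begin(SpriteSortMode.Deferred, BlendState.Opaque, SamplerState.LinearWrap, null, null);
    sb.Draw(waterBumpMap, new Rectangle(0, 0, resizedBump.Width, resizedBump.Height), Color.White);
    sb.End();

    _graphics.SetRenderTarget(null);

I could then pass resizedBump to the shader instead of waterBumpMap, but I don't know if that's the right way to handle the size mismatch.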

My HLSL pixel shader looks like this:

#if OPENGL
    #define SV_POSITION POSITION
    #define VS_SHADERMODEL vs_3_0
    #define PS_SHADERMODEL ps_3_0
#else
    #define VS_SHADERMODEL vs_4_0_level_9_1
    #define PS_SHADERMODEL ps_4_0_level_9_1
#endif

matrix WorldViewProjection;
float xWaveLength;
float xWaveHeight;

texture bumpMap;
sampler2D bumpSampler = sampler_state
{
    Texture = <bumpMap>;
};

texture water;
sampler2D waterSampler = sampler_state
{
    Texture = <water>;
};
// MAG, MIN, MIRROR SETTINGS? SEE RIEMERS

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float2 TextureCords : TEXCOORD;
    float4 Color : COLOR0;
};

struct VertexShaderOutput
{
    float4 Pos : SV_POSITION;
    float2 BumpMapSamplingPos : TEXCOORD2;
    float4 Color : COLOR0;
};

VertexShaderOutput MainVS(in VertexShaderInput input)
{
    VertexShaderOutput output = (VertexShaderOutput)0;

    output.BumpMapSamplingPos = input.TextureCords/xWaveLength;
    output.Pos = mul(input.Position, WorldViewProjection);
    output.Color = input.Color;

    return output;
}

float4 MainPS(float4 pos : SV_POSITION, float4 color1 : COLOR0, float2 texCoord : TEXCOORD0) : COLOR
{
    float4 bumpColor = tex2D(bumpSampler, texCoord.xy);
    //get offset 
    float2 perturbation = xWaveHeight * (bumpColor.rg - 0.5f)*2.0f;

    //apply offset to coordinates in original texture
    float2 currentCoords = texCoord.xy;
    float2 perturbatedTexCoords = currentCoords + perturbation;

    //return the perturbed values
    float4 color = tex2D(waterSampler, perturbatedTexCoords);
    return color;
}

technique oceanRipple
{
    pass P0
    {
        //VertexShader = compile VS_SHADERMODEL MainVS();
        PixelShader = compile PS_SHADERMODEL MainPS();
    }
};

And my MonoGame draw method looks like this:

    public void DrawMap(SpriteBatch sbWorld, SpriteBatch sbStatic, RenderTarget2D worldScene, GameTime gameTime)
    {

        // Set Water RenderTarget
        _graphics.SetRenderTarget(waterScene);
        _graphics.Clear(Color.CornflowerBlue);
        sbWorld.Begin(_cam, SpriteSortMode.Texture);
        foreach (var t in BoundingBoxLocations.OceanTileLocationList)
        {
            TilePiece tile = (TilePiece)t;
            tile.DrawTile(sbWorld);
        }
        sbWorld.End();

        // set up gamescene draw
        _graphics.SetRenderTarget(worldScene);
        _graphics.Clear(Color.PeachPuff);

        // water
        sbWorld.Begin(SpriteSortMode.Immediate, BlendState.AlphaBlend);
        oceanRippleEffect.Parameters["bumpMap"].SetValue(waterBumpMap);
        oceanRippleEffect.Parameters["water"].SetValue(waterScene);
        //oceanRippleEffect.Parameters["xWaveLength"].SetValue(3f);
        oceanRippleEffect.Parameters["xWaveHeight"].SetValue(0.3f);
        ExecuteTechnique("oceanRipple");
        sbWorld.Draw(waterScene, Vector2.Zero, Color.White);
        sbWorld.End();

        // land
        sbWorld.Begin(_cam, SpriteSortMode.Texture);
        foreach (var t in BoundingBoxLocations.LandTileLocationList)
        {
            TilePiece tile = (TilePiece)t;
            tile.DrawTile(sbWorld);
        }
        sbWorld.End();

    }

Can anyone see anything in my code, or elsewhere, that might be causing this offset?

Any help is much appreciated. Thanks!

EDIT

If I modify the xWaveHeight shader parameter, it changes where the offset appears. A value of 0 produces no offset, but then the bump mapping is not applied either. Is there any way around this?

I understand that the offset is being caused by the pixel shader's perturbation, but I'm wondering if there is a way to undo this offset while preserving the bump mapping. The linked Riemer's tutorial includes a vertex shader. I'm not quite sure whether I need it, but when I include my vertex shader in the technique and modify the pixel shader to the following, no water is drawn at all.

float4 MainPS(in VertexShaderOutput output) : COLOR
{
    float4 bumpColor = tex2D(bumpSampler, output.BumpMapSamplingPos.xy);
    //get offset 
    float2 perturbation = xWaveHeight * (bumpColor.rg - 0.5f)*2.0f;

    //apply offset to coordinates in original texture
    float2 currentCoords = output.BumpMapSamplingPos.xy;
    float2 perturbatedTexCoords = currentCoords + perturbation;

    //return the perturbed values
    float4 color = tex2D(waterSampler, perturbatedTexCoords);
    return color;
}

1 Answer

First of all, for what you seem to be trying to do, bump mapping is actually the wrong approach: bump mapping is about changing the surface normal (basically "rotating" the pixel in 3D space) so that subsequent lighting calculations (such as reflection) see your surface as more complex than it really is (note that the texture of that pixel stays where it is). So bump mapping would not modify the position of the ocean tile texture at all; it would modify what is reflected by the ocean (for example, by changing the sample position of a skybox, so the reflection of the sky in the water is distorted). The way you are implementing it is more like "what if my screen were an ocean and reflected an image of tiles with ocean textures".

If you really want to use bump mapping, you would need some kind of big sky texture. Then, while (not after) drawing the ocean tiles, you would calculate a sample position into this sky texture for the reflection (based on the position of the tile on the screen) and modify that sample position with the bump map - all while drawing the tiles, not after drawing them to a render target.
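
Untested, but on the MonoGame side that could be set up roughly like this (skyTexture and oceanReflectionEffect are placeholder names; the actual reflection and bump math would live in that effect's pixel shader):

    // Bind the sky texture and the bump map while drawing the ocean tiles
    // themselves, so the effect can perturb the sky sample position per pixel.
    oceanReflectionEffect.Parameters["skyTexture"].SetValue(skyTexture);
    oceanReflectionEffect.Parameters["bumpMap"].SetValue(waterBumpMap);
    oceanReflectionEffect.Parameters["xWaveHeight"].SetValue(0.3f);

    // Passing the effect to Begin() applies it to every tile drawn in this batch.
    sbWorld.Begin(SpriteSortMode.Immediate, BlendState.AlphaBlend,
                  SamplerState.LinearWrap, null, null, oceanReflectionEffect);
    foreach (var t in BoundingBoxLocations.OceanTileLocationList)
    {
        ((TilePiece)t).DrawTile(sbWorld);
    }
    sbWorld.End();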

It is also possible to do this deferred (more similar to what you are doing now) - actually, there are multiple ways of doing so - but either way you would still sample the final color from a sky texture, not from the render target your tiles were drawn to. The render target with your tiles would instead contain "meta" information (depending on how exactly you want to do this). This information could be a color that is multiplied with the color from the sky texture (creating "colored" water, e.g. for different biomes or to simulate sunsets/sunrises), or a simple 1 or 0 telling whether there is any ocean at that pixel, or a per-tile bump map (which would allow you to apply a "screen-global" and a "per-tile" bump mapping in one go - you would still need a way to mark "this pixel is not ocean, don't do anything here" in the render target), or - if you use multiple render targets - all of these at once. Either way, the sample position used to read from your render target(s) is never modified by the bump mapping; only the sample position of the texture that is reflected by the ocean is. That way there is also no displacement of the ocean, since we aren't touching those sample positions at all.
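
A rough sketch of that deferred setup in MonoGame terms (untested; metaScene, skyTexture and oceanComposeEffect are placeholder names, and the compositing effect's pixel shader would sample skyTexture with the perturbed coordinates but metaScene with the unmodified ones):

    // 1) Draw the ocean tiles' "meta" information (e.g. tint color / ocean mask)
    //    into a separate render target - nothing is distorted in this step.
    _graphics.SetRenderTarget(metaScene);
    _graphics.Clear(Color.Transparent);
    sbWorld.Begin(_cam, SpriteSortMode.Texture);
    foreach (var t in BoundingBoxLocations.OceanTileLocationList)
    {
        ((TilePiece)t).DrawTile(sbWorld);
    }
    sbWorld.End();

    // 2) Composite: the effect reads skyTexture at the bump-perturbed position
    //    and metaScene at the original position, so the ocean itself never moves.
    _graphics.SetRenderTarget(worldScene);
    oceanComposeEffect.Parameters["skyTexture"].SetValue(skyTexture);
    oceanComposeEffect.Parameters["bumpMap"].SetValue(waterBumpMap);
    sbWorld.Begin(SpriteSortMode.Immediate, BlendState.AlphaBlend,
                  SamplerState.LinearClamp, null, null, oceanComposeEffect);
    sbWorld.Draw(metaScene, Vector2.Zero, Color.White);
    sbWorld.End();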

Now, to create a look closer to what you seem to be going for (judging by your images), you wouldn't use bump mapping at all, but instead apply a small noise offset to the sample position in your pixel shader (the rest of your code doesn't need to change). For that, your shader would look more like this:

texture noiseTexture;
sampler2D noiseSampler = sampler_state
{
    Texture = <noiseTexture>;
    MipFilter = LINEAR;
    MinFilter = LINEAR;
    MagFilter = LINEAR;
    AddressU = Wrap;
    AddressV = Wrap;
};

float2 noiseOffset;
float2 noisePower;
float noiseFrequency;

VertexShaderOutput MainVS(in VertexShaderInput input)
{
    VertexShaderOutput output = (VertexShaderOutput)0;

    output.Pos = mul(input.Position, WorldViewProjection);
    output.Color = input.Color;

    return output;
}

float4 MainPS(float4 pos : SV_POSITION, float4 color1 : COLOR0, float2 texCoord : TEXCOORD0) : COLOR
{
    float4 noise = tex2D(noiseSampler, (texCoord.xy + noiseOffset.xy) * noiseFrequency);
    float2 offset = noisePower * (noise.xy - 0.5f) * 2.0f;

    float4 color = tex2D(waterSampler, texCoord.xy + offset.xy);
    return color;
}

Here, noisePower would be (at most) roughly 1 over the number of horizontal/vertical tiles on the screen, noiseOffset can be used to "move" the noise over the screen over time (it should stay in the range [-1, 1]), and noiseFrequency is an artistic parameter (I would start with twice the maximum noise power and adjust from there; higher values make the ocean look more distorted). This way the borders of the tiles are distorted, but never moved by more than one tile in any direction (thanks to the noisePower parameter). It is also important to use the right kind of noise texture here: white noise, blue noise, maybe a "not really noise" texture built out of sine waves, etc. What matters is that the "average" value of the pixels is about 0.5, so no overall displacement happens, and that the values are well distributed across the texture. Apart from that, use whatever kind of noise looks best to you.
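
For example, the parameters could be set each frame along these lines (untested; tilesAcrossScreen and tilesDownScreen are placeholders for your actual visible tile counts):

    // noisePower: at most about 1 / number of visible tiles, so nothing is ever
    // displaced by more than one tile in any direction.
    oceanRippleEffect.Parameters["noisePower"].SetValue(
        new Vector2(1f / tilesAcrossScreen, 1f / tilesDownScreen));

    // noiseOffset: scroll the noise over time to animate the water; stays in [-1, 1].
    float t = (float)gameTime.TotalGameTime.TotalSeconds;
    oceanRippleEffect.Parameters["noiseOffset"].SetValue(
        new Vector2((float)Math.Sin(t * 0.1), (float)Math.Cos(t * 0.1)));

    // noiseFrequency: artistic choice - start around twice the maximum noise power
    // and increase it for a more distorted look.
    oceanRippleEffect.Parameters["noiseFrequency"].SetValue(2f / tilesAcrossScreen);

    oceanRippleEffect.Parameters["noiseTexture"].SetValue(noiseTexture);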

A side note on the shader code: I haven't tested it - just so you know - though there isn't much room for mistakes.

Edit: As a side note: of course the sky texture doesn't actually need to look like a sky ;)