
I'm new to GPU programming and very new to LibGDX as well. I'm slowly making my way, but I'm having trouble figuring out how to modify where objects are drawn.

Right now, I have a single draw call in my level class:

// Update the camera.
// ------------------
_cameraRef.update();
_cameraRef.apply(Gdx.gl10);
_batchRef.setProjectionMatrix(_cameraRef.combined);

// Draw.
// -----
_batchRef.begin();
_player.draw(_batchRef);
_batchRef.end();

Very simple. The player is drawn at his x and y coordinates.

However, what I'm trying to do is draw him at his x and y coordinates multiplied by a constant (1, 2, or 4, depending on the game's resolution).

So instead of being drawn at (4, 8), he'll be drawn at (8, 16).

Basically, is there a way to tell the batch to draw every object at 2x or 4x its x and y coordinates?

I think SpriteBatch::translate(x, y, z) might be the answer, but I can't get it to work the way I want.

Any help appreciated!


1 Answer


There is no GL method to multiply the translation coordinates. If you think that's really what you need, you can get the matrix as a float array and multiply the translation components yourself. I don't suggest that, though; simply use translate(x * resolutionFactor, y * resolutionFactor, z * resolutionFactor).
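For illustration, here is a minimal sketch of that suggestion, multiplying the position by the factor at draw time (resolutionFactor, playerTexture, playerX and playerY are hypothetical names, not from your code):

// Hedged sketch: pre-multiply the draw position by the resolution factor.
// resolutionFactor, playerTexture, playerX and playerY are assumed names.
float resolutionFactor = 2f; // 1, 2, or 4 depending on the game's resolution

_batchRef.begin();
// Draw at the scaled position instead of the raw (playerX, playerY).
_batchRef.draw(playerTexture, playerX * resolutionFactor, playerY * resolutionFactor);
_batchRef.end();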

Though you didn't give us much info, I'd guess you're having trouble with devices whose DPI is larger by a factor of 2 or so (Retina displays). If that's the case, your problem probably comes from "glOrtho". This method is mostly used to define the screen coordinate system in 2D, and for sprite drawing it is usually set to top = left = 0, bottom = viewHeight, right = viewWidth. Those viewWidth and viewHeight values are probably taken from the framebuffer itself and are wrong for your case. If you have access to this method, you should divide the width and height on Retina devices, or simply use the view's width and height to set it.
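
In LibGDX terms, one way to do that is to build the orthographic projection from the logical view size rather than the framebuffer size. A rough sketch, assuming _cameraRef is an OrthographicCamera and resolutionFactor is the DPI/scale factor (an assumed name):

// Hedged sketch: size the camera to the logical view, not the raw framebuffer.
// resolutionFactor is an assumed name (e.g. 2 on a Retina device).
float resolutionFactor = 2f;
float viewWidth  = Gdx.graphics.getWidth()  / resolutionFactor;
float viewHeight = Gdx.graphics.getHeight() / resolutionFactor;

_cameraRef.setToOrtho(false, viewWidth, viewHeight); // y-up coordinate system
_cameraRef.update();
_batchRef.setProjectionMatrix(_cameraRef.combined);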