I'm writing a small rendering engine using C++ and DirectX 11. I've managed to render models using indexed vertices. Now I want to support textures and vertex normals as well. I'm using Wavefront OBJ files, which use separate indices to reference the correct texture and normal coordinates (much like the vertex indices). This is the first time I'm combining vertices, texture coordinates and normals, so this is all a bit new to me.
The problem I'm facing is that the numbers of indices for vertices, normals and texture coordinates are not the same, and I'm trying to find the right way to use all of these indices in a vertex shader. To illustrate the problem more clearly, I've made some images.
The left image is a wireframe of a simple pyramid object and the right image is its UV coordinate layout (with the bottom of the pyramid in the center). In the left image you can see that the pyramid has 4 vertices, so there are 4 vertex indices. The right UV layout has 6 UV coordinates, so there are 6 texture indices. Each face of the pyramid has a normal of its own, so there are only 4 normal indices.
I found this question while searching on SO, and the approach described there seems like it should work in theory (I haven't tried it yet). Is this a common way to solve a problem like this, or are there better ways? What would you recommend I do?
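If I understand that answer correctly, the idea is to expand each v/vt/vn index triplet from the OBJ faces into a combined vertex and deduplicate identical triplets, so D3D11 only ever sees one vertex buffer and one index buffer. Below is an untested sketch of how I picture it; the names `ObjIndex`, `Vertex` and `BuildBuffers` are just placeholders I made up, and I'm assuming the OBJ indices have already been converted to 0-based:

```cpp
#include <DirectXMath.h>
#include <cstdint>
#include <map>
#include <tuple>
#include <vector>

struct ObjIndex              // one v/vt/vn triplet from an OBJ face
{
    int position;
    int uv;
    int normal;

    bool operator<(const ObjIndex& o) const
    {
        return std::tie(position, uv, normal) <
               std::tie(o.position, o.uv, o.normal);
    }
};

struct Vertex                // what actually goes into the D3D11 vertex buffer
{
    DirectX::XMFLOAT3 position;
    DirectX::XMFLOAT2 uv;
    DirectX::XMFLOAT3 normal;
};

// Turn the three separate attribute arrays plus the per-face index triplets
// into one vertex buffer and one index buffer that D3D11 can draw directly.
void BuildBuffers(const std::vector<DirectX::XMFLOAT3>& positions,
                  const std::vector<DirectX::XMFLOAT2>& uvs,
                  const std::vector<DirectX::XMFLOAT3>& normals,
                  const std::vector<ObjIndex>& faceIndices, // 3 per triangle
                  std::vector<Vertex>& outVertices,
                  std::vector<uint32_t>& outIndices)
{
    std::map<ObjIndex, uint32_t> cache; // triplet -> index of combined vertex

    for (const ObjIndex& idx : faceIndices)
    {
        auto it = cache.find(idx);
        if (it == cache.end())
        {
            // First time this exact v/vt/vn combination shows up:
            // emit a new combined vertex and remember its index.
            Vertex v;
            v.position = positions[idx.position];
            v.uv       = uvs[idx.uv];
            v.normal   = normals[idx.normal];

            uint32_t newIndex = static_cast<uint32_t>(outVertices.size());
            outVertices.push_back(v);
            cache[idx] = newIndex;
            outIndices.push_back(newIndex);
        }
        else
        {
            // Seen before: just reuse the existing combined vertex.
            outIndices.push_back(it->second);
        }
    }
}
```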
Thanks :)
P.S.: my vertex, normal and texture data is stored in separate arrays on the CPU side.
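To be a bit more concrete, the CPU-side data roughly looks like this (simplified, using the same placeholder `ObjIndex` struct from the sketch above; the indices in the comment are just an example):

```cpp
// Separate attribute arrays, filled by my OBJ loader.
std::vector<DirectX::XMFLOAT3> positions; // 4 entries for the pyramid
std::vector<DirectX::XMFLOAT2> uvs;       // 6 entries
std::vector<DirectX::XMFLOAT3> normals;   // 4 entries, one per face

// One v/vt/vn triplet per face corner, e.g. an OBJ line "f 1/2/1 2/3/1 3/1/1"
// becomes three ObjIndex entries (converted to 0-based).
std::vector<ObjIndex> faceIndices;
```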