ssloy/tinyrenderer

Lesson 3 Texture mapping not rendering correctly

atskae opened this issue · 11 comments

I am running into a similar problem as encountered in this issue #105 though I can't seem to get it right with the solutions proposed there.

I am currently rendering this image with my renderer:

Incorrect texture mapping

So it is not as smooth as the tutorial's rendering. I tried switching the order of the barycentric coordinates, but I think I've tried every combination and none of them looks right.

This is what my render looks like if I assign red to vertex 0, green to vertex 1, and blue to vertex 2 of each triangle:

Assign a color to each vertex

which I noticed is the reverse of the order shown in #105 (comment), but I tried reversing the barycentric coordinates and it still doesn't look right.

This is how I compute the UV coordinates (I update the `indices` variable to try various permutations of the barycentric coordinates):

float z = 0, u = 0, v = 0;            // accumulators must start at zero
std::vector<int> indices = {2, 1, 0}; // permutation of barycentric weights under test
for (int i = 0; i < 3; i++) { // for each vertex
    z += t_world[i].z * barycentric_coordinates[indices[i]];
    u += uv_coordinates[i].x * barycentric_coordinates[indices[i]];
    v += uv_coordinates[i].y * barycentric_coordinates[indices[i]];
}

My source code is here (under triangle_filled_barycentric_coordinates()). Any help would be appreciated, thank you!
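For readers hitting the same wall, here is a minimal, self-contained sketch of the interpolation step in isolation (the `Vec2`/`Vec3` structs and the function name are my own stand-ins, not from the original code). The key invariant is that weight `i` multiplies the attribute of vertex `i`: if the barycentric weights come back in the same vertex order (A, B, C) as the UV array, no index permutation like `{2, 1, 0}` should ever be needed.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Minimal stand-in types (hypothetical; the real code has its own Point/Vec classes).
struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Interpolate depth and UV across a triangle: weight i multiplies the
// attribute of vertex i, with no index permutation.
Vec3 interpolate(const std::vector<Vec2>& uv,
                 const std::vector<float>& z_world,
                 const std::vector<float>& bc) {
    float u = 0, v = 0, z = 0;  // accumulators must start at zero
    for (int i = 0; i < 3; i++) {
        u += uv[i].x * bc[i];
        v += uv[i].y * bc[i];
        z += z_world[i] * bc[i];
    }
    return {u, v, z};
}
```

With the weights at a vertex, e.g. (1, 0, 0), this returns exactly that vertex's UV and depth; at the centroid, with weights (1/3, 1/3, 1/3), it returns the averages.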

When I map red to vertex 0, green to vertex 1, and blue to vertex 2, I notice that, when compared to this render (which I assume is correct), the correct render is consistently in counterclockwise order, but my triangles are sometimes counterclockwise and sometimes clockwise. What would cause this issue? I checked that my copy of the .obj file and tinyrenderer's copy are the same. I read every vertex from that file the same way.

Would @ssloy have any insights on why this might happen?

I think your issue is related to not mapping the right vertices to the correct texture coordinates or their corresponding barycentric coordinates. The orientation of the vertices (whether they are clockwise or counterclockwise) isn't the primary issue because each vertex should correspond to a specific coordinate in the texture file (diffuse). Understanding the barycentric coordinates of each vertex of a face is more important.

Try experimenting more with the mappings, or review the original author's code for guidance. It's crucial to understand why the textures are not mapping correctly. I hope you can solve this problem soon.

Understanding the barycentric coordinates of each vertex of a face is more important.

Thank you for your comment! Do you mean the computation of the barycentric coordinates themselves could be incorrect in my implementation?

Try experimenting more with the mappings

Do you mean try mapping different barycentric coordinates to each vertex of the triangle? I think I tried all combinations... Would I have to change what barycentric coordinates map to certain triangle vertices depending on the triangle?

The orientation of the vertices (whether they are clockwise or counterclockwise) isn't the primary issue because each vertex should correspond to a specific coordinate in the texture file (diffuse).

I also noticed that I do not use the normal vectors in any way. Does that mean that I don't have to use those vectors for texture mapping?

I took a quick look at your implementation, and the function `compute_barycentric_coordinates` does not match the implementation in the author's source code.

If you set:

Point a = t[0];
Point b = t[1];
Point c = t[2];

Then it should be:

std::vector<int> v0 = {
// A -> C
(c-a).x,
// A -> B
(b-a).x,
// P -> A
(a-p).x
};
and so on...

The problem is that you should set the first value of v0 to AC.x rather than AB.x. (I only point this out because I had the same problem before, and only noticed that AC.x comes first by looking at the author's code.)

Thank you for looking at my code. I tried swapping AC and BC but I still don't get a smooth result...

swap_ac_bc

It's actually not intuitive to me why AC/BC are swapped.

This was my logic for computing barycentric coordinates to generate this image:

std::vector<float> compute_barycentric_coordinates(std::vector<Point>& t, Point p) {
    // Rewrite as variables for readability....
    Point a = t[0];
    Point b = t[1];
    Point c = t[2];

    std::vector<int> v0 = {
        // A -> C  
        (c-a).x,
        // A -> B
        (b-a).x,
        // P -> A 
        (a-p).x
    };

    std::vector<int> v1 = {
        // A -> C  
        (c-a).y,
        // A -> B
        (b-a).y,
        // P -> A 
        (a-p).y
    };

    std::vector<int> cross_product = {
        v0[1]*v1[2] - v0[2]*v1[1],
        v0[2]*v1[0] - v0[0]*v1[2],
        v0[0]*v1[1] - v0[1]*v1[0]
    };

    // Get the cross product in terms of [u, v, 1]
    // To normalize the z-coordinate, we divide each value by the z-coordinate
    std::vector<float> cross_product_float;
    for (int i=0; i<3; i++) {
        cross_product_float.push_back((float) cross_product[i] / cross_product[2]);
    }

    // Now cross_product_float is of form: [u, v, 1];
    // We can use the cross product vector to then compute
    // the Barycentric coordinates: (1 - u - v, u, v)
    std::vector<float> coefficients;
    coefficients.push_back(1.0 - (cross_product_float[0] + cross_product_float[1]));
    coefficients.push_back(cross_product_float[0]);
    coefficients.push_back(cross_product_float[1]);
    
    return coefficients;
}
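For what it's worth, the "swapped" order falls out of the math rather than convention. Writing P = A + w_B·AB + w_C·AC and rearranging gives w_C·AC + w_B·AB + 1·PA = 0 in both the x and y components, so the vector (w_C, w_B, 1) is orthogonal to both (AC.x, AB.x, PA.x) and (AC.y, AB.y, PA.y), i.e. proportional to their cross product. With the vectors stacked in (AC, AB, PA) order, the cross product's x component is therefore vertex C's weight and its y component is vertex B's. A sketch with that mapping made explicit (the `Point` struct and names are mine, not from either codebase):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Point { int x, y; };
Point operator-(Point a, Point b) { return {a.x - b.x, a.y - b.y}; }

// Barycentric weights (wA, wB, wC) of p in triangle t = {A, B, C}.
// v0/v1 stack (AC, AB, PA); cross(v0, v1) is proportional to (wC, wB, 1),
// so the x component is C's weight and the y component is B's weight.
std::vector<float> barycentric(const std::vector<Point>& t, Point p) {
    Point a = t[0], b = t[1], c = t[2];
    int v0[3] = {(c - a).x, (b - a).x, (a - p).x};
    int v1[3] = {(c - a).y, (b - a).y, (a - p).y};
    int cx = v0[1] * v1[2] - v0[2] * v1[1];
    int cy = v0[2] * v1[0] - v0[0] * v1[2];
    int cz = v0[0] * v1[1] - v0[1] * v1[0];  // 0 for a degenerate triangle
    float wC = (float)cx / cz;
    float wB = (float)cy / cz;
    return {1.0f - wB - wC, wB, wC};  // (wA, wB, wC), in vertex order
}
```

A quick sanity check: each vertex should get weight 1 for itself and 0 for the others, and the centroid should come out as (1/3, 1/3, 1/3). Returning `(1 - u - v, u, v)` with `u = cx/cz` instead silently hands C's weight to B's slot, which is exactly the kind of mismatch this thread is chasing.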

I really wanted to avoid looking at the author's code (I'm just stubborn) but I guess I might have to...

(edit) I took a look at the author's code and I am still unsure of my bug... 😅

Just experiment more, I guess. You will solve the issue by swapping more, or by making your code logic the same as the author's. There's no way the author's code gives a wrong result (or at least not one different from the picture he shows).

That's true, I'll keep trying 👍 I just want to make sure I also grasp the intuition, not just copy the logic.
Thanks for reaching out though, I appreciate it!

Could you actually point me to the commit where the author computes the color from the interpolated texture coordinates? The commits seem to jump from computing the barycentric coordinates (and not using it for texture) to using a Fragment Shader in a future chapter.

You will see texture mapping implemented starting from lesson 6. The author also restructures the code along the way, but you will get the idea. I suggest you try to interpolate the texture coordinates on your own first (it is homework anyway, and it's fun).

I finally solved this issue... I am so dumb. To calculate the bounding box I used the triangle screen coordinates, and I sorted the vector holding those screen coordinates in place. That reordered the triangle vertices, so by the time I calculated the barycentric coordinates / UV mapping they no longer matched the vertex order intended in the model file.
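In case it helps anyone hitting the same thing: the fix is to compute the bounding box with a running min/max instead of sorting the vertex vector in place, so the vertex order defined by the .obj file survives to the barycentric/UV step. A sketch (the `Point`/`BBox` structs are hypothetical stand-ins, not from my repository):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

struct Point { int x, y; };
struct BBox { int xmin, ymin, xmax, ymax; };

// Bounding box via running min/max. The triangle is taken by const
// reference and never reordered, so the vertex order from the model
// file is preserved for the later barycentric / UV computation.
BBox bounding_box(const std::vector<Point>& t) {
    BBox bb{t[0].x, t[0].y, t[0].x, t[0].y};
    for (std::size_t i = 1; i < t.size(); i++) {
        bb.xmin = std::min(bb.xmin, t[i].x);
        bb.ymin = std::min(bb.ymin, t[i].y);
        bb.xmax = std::max(bb.xmax, t[i].x);
        bb.ymax = std::max(bb.ymax, t[i].y);
    }
    return bb;
}
```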

After fixing that, I was able to generate the correct texture mapping!

Correct RGB mapping

Correct texture mapping