
I am by no means a mathematician, and I'm trying to understand what's happening with GLM's multiplication between vectors and matrices. From what I understand, GLM's matrix types are named columns x rows (so glm::mat2x3 has 2 columns and 3 rows), and in ordinary matrix multiplication the left-hand side of the expression must have as many columns as the right-hand side has rows. So here's where my question comes in.
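For example, if I understand GLM's conventions correctly, a glm::mat2x3 stores 2 columns of 3 components each, and operator[] returns a column. Here is a small sketch of what I mean (the comments reflect my understanding, so please correct me if it is wrong):

#define GLM_ENABLE_EXPERIMENTAL

#include <iostream>
#include <glm/mat2x3.hpp>
#include <glm/gtx/string_cast.hpp>

int main() {
    // Initialized column by column: column 0 is (1, 2, 3), column 1 is (4, 5, 6).
    glm::mat2x3 m = {
        1, 2, 3,
        4, 5, 6
    };

    // m[i] is the i-th column, a vec3.
    std::cout << glm::to_string(m[0]) << std::endl; // vec3(1.000000, 2.000000, 3.000000)
    std::cout << glm::to_string(m[1]) << std::endl; // vec3(4.000000, 5.000000, 6.000000)
}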

The thing that confuses me is that I can multiply a 2x3 matrix (a glm::mat2x3) by a three-dimensional vector:

| 1, 4 |
| 2, 5 | x [ 1, 2, 3 ]
| 3, 6 |

giving me an answer of [ 9, 12, 15 ]. But I can also multiply a 3x2 matrix (a glm::mat3x2) by the same 3D vector:

| 1, 3, 5 |
| 2, 4, 6 | x [ 1, 2, 3 ]

and I now get a two-dimensional vector of [ 22, 28 ]. Why is this possible? What is happening here?

What confuses me further is that I can switch the order, putting the vector on the left-hand side and the matrix on the right, and I still get a valid result. Again, why is this possible? What is happening? And is this GLM-specific behavior, or is it standard matrix behavior?

I apologize if this is easy math; however, after many hours of searching the internet and experimenting with GLM myself, I still do not understand the concept. I appreciate any assistance. Thanks.

EDIT: Here is a code example

#define GLM_ENABLE_EXPERIMENTAL

#include <iostream>
#include <glm/mat2x3.hpp>
#include <glm/mat3x2.hpp>
#include <glm/vec3.hpp> // included explicitly for glm::vec3
#include <glm/gtx/string_cast.hpp>

int main() {
    // 2 columns x 3 rows; initialized column by column: (1, 2, 3) and (4, 5, 6)
    glm::mat2x3 a = {
        1, 2, 3, 
        4, 5, 6
    };

    // 3 columns x 2 rows; initialized column by column: (1, 2), (3, 4) and (5, 6)
    glm::mat3x2 b = {
        1, 2,
        3, 4,
        5, 6
    };

    glm::vec3 c = {
        1, 2, 3
    };

    // These all compile correctly? And each gives different results.
    std::cout << glm::to_string(a * c) << std::endl;
    std::cout << glm::to_string(c * a) << std::endl;
    std::cout << glm::to_string(b * c) << std::endl;
    std::cout << glm::to_string(c * b) << std::endl;
}

The console output is the following:

vec3(9.000000, 12.000000, 15.000000)
vec2(14.000000, 32.000000)
vec2(22.000000, 28.000000)
vec3(5.000000, 11.000000, 17.000000)
RoQuOTriX: Matrices are defined as rows x columns. Your visualization is wrong (same for vectors).

Elijah Seed Arita: @RoQuOTriX Do you mean row length x column length, or number of rows x number of columns? Sorry for the confusion.

RoQuOTriX: I think the correct reading is number of rows x number of columns. I don't understand what you mean by the length of a row. The number of elements in that row? That would be the number of columns, and vice versa... If you have a 3x1 matrix (a vector), it has 3 rows and 1 column, not 1 row and 3 columns.

genpfault: You've tagged this with glm, did you mean glm-math?

Elijah Seed Arita: @genpfault Shoot, you are right. Thanks for the correction.

1 Answer


I found out what was going on! The problem ended up having nothing to do with matrix math at all, but rather with a quirk of GLM. In the code example, a * c and c * b compile only because c is implicitly converted to a two-dimensional vector, without any warning, so those two expressions actually compute a * vec2(1, 2) and vec2(1, 2) * b. Simple as that, but very confusing!
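To see the truncation in isolation, here is a minimal sketch. It assumes that GLM's vector conversion constructors are non-explicit by default, and that defining GLM_FORCE_EXPLICIT_CTOR before including GLM turns such conversions into compile errors; both match my reading of the GLM manual, but may vary by version.

#define GLM_ENABLE_EXPERIMENTAL
// #define GLM_FORCE_EXPLICIT_CTOR // should make the conversion below a compile error

#include <iostream>
#include <glm/vec2.hpp>
#include <glm/vec3.hpp>
#include <glm/gtx/string_cast.hpp>

int main() {
    glm::vec3 c = {1, 2, 3};

    // Implicit vec3 -> vec2 conversion: z is silently dropped.
    glm::vec2 truncated = c;

    std::cout << glm::to_string(truncated) << std::endl; // vec2(1.000000, 2.000000)
}

So a * c is really a * glm::vec2(1, 2), a legal (2-column matrix) x (2-component vector) product, which is why it prints vec3(9.000000, 12.000000, 15.000000); likewise c * b is glm::vec2(1, 2) * b, which yields the vec3(5.000000, 11.000000, 17.000000) above.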