I am by no means a mathematician, and I'm trying to understand what's happening in GLM's multiplication between vectors and matrices. From what I understand, GLM's matrices are sized as columns x rows, and in ordinary matrix multiplication the left-hand side must have the same number of columns as the right-hand side has rows. So here's where my question comes in.
The thing that confuses me is that I can multiply a 2x3 matrix by a three-dimensional vector
| 1, 4 |
| 2, 5 | x [ 1, 2, 3 ]
| 3, 6 |
giving me an answer of [ 9, 12, 15 ]. But I can also multiply a 3x2 matrix by the same 3D vector
| 1, 3, 5 |
| 2, 4, 6 | x [ 1, 2, 3 ]
and I now get a 2-dimensional vector of [ 22, 28 ]. Why is this possible? What is happening here?
What confuses me further is that I can switch the order, putting the vector on the left-hand side and the matrix on the right, and I still get a valid result. Again, why is this possible? What is happening? And is this GLM-specific behavior, or is it standard matrix behavior?
I apologize if this is easy math; after many hours of searching the internet and testing with GLM myself, I still do not understand the concept. I appreciate any assistance. Thanks.
EDIT: Here is a code example:
#define GLM_ENABLE_EXPERIMENTAL
#include <iostream>
#include <glm/vec3.hpp>
#include <glm/mat2x3.hpp>
#include <glm/mat3x2.hpp>
#include <glm/gtx/string_cast.hpp>

int main() {
    // GLM's mat2x3 is 2 columns x 3 rows, supplied column by column
    glm::mat2x3 a = {
        1, 2, 3,   // first column
        4, 5, 6    // second column
    };
    // GLM's mat3x2 is 3 columns x 2 rows
    glm::mat3x2 b = {
        1, 2,      // first column
        3, 4,      // second column
        5, 6       // third column
    };
    glm::vec3 c = { 1, 2, 3 };

    // These all compile correctly? And each gives different results.
    std::cout << glm::to_string(a * c) << std::endl;
    std::cout << glm::to_string(c * a) << std::endl;
    std::cout << glm::to_string(b * c) << std::endl;
    std::cout << glm::to_string(c * b) << std::endl;
}
The console output is the following:
vec3(9.000000, 12.000000, 15.000000)
vec2(14.000000, 32.000000)
vec2(22.000000, 28.000000)
vec3(5.000000, 11.000000, 17.000000)