Torch mv behavior not understandable


The following screenshots show that torch.mv is unusable in a situation that seems obviously correct. How is this possible? Any idea what the problem could be?

[screenshot 1]

[screenshot 2]

The first image shows the correct situation, where the vector has 10 rows for a matrix with 10 columns, but I included the other one as well just in case. Swapping one for the other also makes no difference.

[screenshot 3]

However, the @ operator works. The thing is, for my own reasons I want to use mv, so I would like to know what the problem is.
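The situation in the screenshots can be reproduced with a minimal sketch like the one below (the variable names `w` and `x` and the 10×10 / 10×1 shapes are assumptions based on the description above):

```python
import torch

w = torch.randn(10, 10)  # square matrix, as described in the question
x = torch.randn(10, 1)   # column vector: note this is 2-D, shape (10, 1)

# @ accepts the 2-D column vector and returns a (10, 1) result
print((w @ x).shape)     # torch.Size([10, 1])

# torch.mv insists on a 1-D vec and raises on the same input
try:
    torch.mv(w, x)
except RuntimeError as e:
    print("mv failed:", type(e).__name__)
```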

According to the documentation:

torch.mv(input, vec, *, out=None) → Tensor

If input is an (n × m) tensor and vec is a 1-D tensor of size m, out will be a 1-D tensor of size n.

The x here should be 1-D, but in your case it is 10×1 (2-D). You can remove the extra dimension (or create x with a single dimension in the first place).

tensor([ 0.1432, -2.0639, -2.1871, -1.8837,  0.7333, -0.4000,  0.4023, -1.1318,
         0.0423, -1.2136])
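Concretely, the fix can be sketched as follows (again assuming the names `w` and `x` and a 10×10 / 10×1 setup):

```python
import torch

w = torch.randn(10, 10)  # (n x m) matrix
x = torch.randn(10, 1)   # 2-D column vector, shape (10, 1)

# Drop the size-1 dimension so x becomes 1-D of size m.
x1d = x.squeeze(1)       # shape (10,); x.view(-1) or x.flatten() also work

out = torch.mv(w, x1d)   # 1-D result of size n
print(out.shape)         # torch.Size([10])

# The numbers agree with the @ result; @ just keeps them as a (10, 1) column.
assert torch.allclose(out.unsqueeze(1), w @ x)
```

So mv and @ compute the same product; mv is simply strict about vec being 1-D, while @ (torch.matmul) treats the (10, 1) operand as a matrix and returns a (10, 1) matrix.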

>>> w @ x
tensor([[ 0.1432],
        [ 0.7333],
        [ 0.4023],
        [ 0.0423],