Completed

Matrix Support for Multiply?

Brian Hall 11 years ago · updated by Aaron Meyers 8 years ago · 9 comments
I'd like to be able to do my own hand-rolled matrix multiplies (for instance, to decode directional lightmaps properly). To do that I need to be able to multiply my normal by unity_DirBasis, which is a 3x3 matrix.

Another option would be just to support lightmap decode options as nodes.
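
Roughly, this is the kind of multiply I mean, in plain Cg (just a sketch, assuming unity_DirBasis as defined in UnityCG.cginc; the function name is made up):

#include "UnityCG.cginc"

// Sketch only: weight a world-space normal against the three directional
// lightmap basis vectors in unity_DirBasis (a 3x3 from UnityCG.cginc).
half3 DirBasisWeights(half3 worldNormal)
{
    return saturate(mul(unity_DirBasis, worldNormal));
}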

This may come at some point, but it's not a high priority at the moment! It would be nice to have the matrix data types :)
Matrix multiplication is required for shaders that make use of Shader.SetGlobalMatrix. In particular, this is needed to light particles correctly, so it would be great to have this in ShaderForge.
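
For what it's worth, the shader side of that is just a global uniform plus a mul(). A minimal sketch, where _ParticleLight is a made-up property name set from script with Shader.SetGlobalMatrix("_ParticleLight", m):

// Sketch: a matrix pushed from C# via Shader.SetGlobalMatrix("_ParticleLight", m)
// is visible here as a global uniform. _ParticleLight is a hypothetical name.
float4x4 _ParticleLight;

float3 ToLightSpace(float3 worldPos)
{
    // w = 1 so the matrix's translation applies (position, not direction).
    return mul(_ParticleLight, float4(worldPos, 1.0)).xyz;
}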
Answer
Completed
This has now been implemented in 1.17 :)
Found a bug in the Multiply Matrix node. Create this shader graph:

[screenshot of the repro graph]

Close it, then reopen it. The A input of the Multiply Matrix node gets disconnected and there's an 'invalid input' error in the console.
Thanks, I'll look at this for the next update!

I just put together a shader with the Multiply Matrix node and came across this same bug. Close the shader, re-open it, and the A input of the Append node is disconnected!

Actually, it seems like more often it's the A input of the Multiply Matrix node >_<

Also, could the Multiply Matrix node please accept a Vector3 input? Right now we have to append a 0 to the end of every Vector3 we want to multiply by the matrix :)
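
For context, this is what the append-a-0 workaround amounts to in plain Cg (a sketch with made-up names):

// w = 0: the Vector3 is treated as a direction, so the matrix's
// translation part is ignored (this is what appending a 0 does).
float3 MulAsDirection(float4x4 m, float3 v)
{
    return mul(m, float4(v, 0.0)).xyz;
}

// w = 1: the Vector3 is treated as a position (translation applies).
float3 MulAsPosition(float4x4 m, float3 v)
{
    return mul(m, float4(v, 1.0)).xyz;
}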