Your comments

o_O Nothing changes even when I set "DepthTextureMode.None".

I have a custom component on the main camera that sets
camera.depthTextureMode = DepthTextureMode.Depth;
in both Start() and OnEnable(). (Also tried the DepthNormals setting.) There's a rough sketch of the component at the end of this comment.
Under Blending & Depth: Alpha Blended, "AutoSort" unchecked, "Write to Depth buffer" unchecked.
Hmm, Depth Bias works weirdly: 0.3, and indeed any positive number, makes it pitch black. Around -2 I start to see some grays if the camera is close enough (stock cube, unlit shader, Depth Bias plugged into Emission).
Scene Depth produces a (15, 15, 15) dark gray until I move the camera really far away, when it suddenly becomes full white.
I'm on Unity Free, so I couldn't test this on the deferred rendering path.
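
For reference, roughly what that component looks like (a minimal sketch; the class name is made up, and I'm using GetComponent<Camera>() where the old built-in camera property would also work):

using UnityEngine;

// Minimal sketch: forces the camera to render a depth texture.
[RequireComponent(typeof(Camera))]
public class DepthTextureEnabler : MonoBehaviour
{
    void OnEnable()
    {
        // Also tried DepthTextureMode.DepthNormals here.
        GetComponent<Camera>().depthTextureMode = DepthTextureMode.Depth;
    }

    void Start()
    {
        GetComponent<Camera>().depthTextureMode = DepthTextureMode.Depth;
    }
}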

0.14, yes.

Tomorrow I'll do some more tests.

Hmm, one scalar material parameter is animated... hmm.
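
To pin down what I mean by "animated": something like a script driving a float property every frame (a hypothetical sketch; the "_MyScalar" property name is made up):

using UnityEngine;

// Hypothetical sketch of an animated scalar material parameter.
public class ScalarParamAnimator : MonoBehaviour
{
    Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        // "_MyScalar" is a made-up property name for this sketch.
        rend.material.SetFloat("_MyScalar", Mathf.PingPong(Time.time, 1f));
    }
}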

Oh right, you meant branching, not RGBA slots. Of course then! :)

I'd just make the node as small as the basic Value node: one input, one output.

It was a regular Value node. However, now I'm having trouble reproducing it... I restarted Unity, loaded the shader, and it did come up showing 1x tiling with a 16 multiplier in that Value node. But as soon as I changed it, everything started behaving as expected. I created a new shader from scratch but still had no luck reproducing the bug... I'll keep an eye out for more clues.

I tried to recreate the results from UDK but couldn't. :\