If you check the math, the logit function maps the [0, 1] interval to the whole real line, [-inf, inf].
Sigmoid and softmax do exactly the opposite: they map the [-inf, inf] real line back to [0, 1].
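A minimal sketch in plain Python illustrating this inverse relationship (function names are mine, not from any particular library):

```python
import math

def logit(p):
    # Maps a probability in (0, 1) to the real line (-inf, inf).
    return math.log(p / (1 - p))

def sigmoid(x):
    # Maps any real number back to (0, 1); the inverse of logit.
    return 1 / (1 + math.exp(-x))

def softmax(xs):
    # Maps a vector of reals to a probability vector summing to 1.
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

p = 0.8
x = logit(p)
print(sigmoid(x))  # recovers 0.8
print(sum(softmax([2.0, 1.0, 0.1])))  # 1.0
```

Since sigmoid undoes logit exactly, a raw score fed into a sigmoid behaves like a logit of the output probability.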
This is why, in machine learning, we may place a sigmoid or softmax after the raw model outputs (the domains match).
And this is why "we may call" anything in machine learning that feeds into a sigmoid or softmax function a logit.
Here is a Geoffrey Hinton video using this term.