There is bound to be disagreement near the edge cases, but I can tell you my personal guidelines.
I look at these criteria when I decide to use `var`:
Conversely, these situations would push me to not use `var`:
Finally, I would never use `var` for the simple built-in types or their corresponding `Nullable<>` forms (`int`, `decimal`, `string`, `decimal?`, ...). There is an implicit assumption that if you use `var`, there must be "a reason".
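To make that last point concrete, here is a minimal sketch of how it plays out in practice. The `Order` class and the `Dictionary<string, List<Order>>` declaration are my own illustrative examples of "a reason" to use `var` (a long constructed type already spelled out on the right-hand side), not something taken from the guidelines above:

```csharp
using System;
using System.Collections.Generic;

class Order { }

class VarGuidelinesDemo
{
    static void Main()
    {
        // Simple built-in types and their nullable forms: spell the type out;
        // var would hide nothing useful here.
        int count = 0;
        decimal price = 19.99m;
        string name = "example";
        decimal? discount = null;

        // One plausible "reason" to reach for var: the constructed type is
        // already written out on the right-hand side, so var avoids repeating it.
        var ordersByCustomer = new Dictionary<string, List<Order>>();

        Console.WriteLine($"{count} {price} {name} {discount} {ordersByCustomer.Count}");
    }
}
```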
These are all guidelines. You should also think about the experience and skills of your coworkers, the complexity of the algorithm, the longevity/scope of the variable, etc.
Most of the time, there is no perfect right answer. Or, it doesn't really matter.
[Edit: removed a duplicate bullet]