Ok, I know this will be controversial, but... also interesting, I hope.
I'm starting from a Klaus quote from the latest post about the 500mm:
If you have fine tonal variations in feathers or so that'll make a difference as well - resolution isn't everything.
But I've opened a new thread because it's a generic thing.
Here's my question: today, is the contrast of a lens still important, in fine-grained terms? Let me explain what I mean. Imagine a fictional classification of lens contrast from 0 to 10. Clearly a lens scoring 9 is better than a lens scoring, say, 6. Such a gap can't be compensated for in post-production, because amplifying the contrast that much would produce coarse rather than fine tonal variations. Ok.
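To make the "coarse vs fine tonal variations" point concrete, here's a toy numpy sketch (my own illustration, not a real raw pipeline): a smooth gradient captured with low contrast at 8-bit precision, then stretched to full range in post. The stretch widens the range, but it can't invent intermediate levels, so the tonal steps simply get coarser.

```python
import numpy as np

# Simulate a smooth gradient recorded with low contrast:
# the values occupy only a quarter of the 8-bit range (64 levels).
gradient = np.linspace(96, 159, 1000)            # smooth scene tones
captured = np.round(gradient).astype(np.uint8)   # 8-bit capture, 64 levels

# "Fix" the contrast in post: stretch to the full 0-255 range.
stretched = ((captured.astype(float) - 96) / 63 * 255).round().astype(np.uint8)

print("distinct levels before stretch:", len(np.unique(captured)))   # 64
print("distinct levels after stretch: ", len(np.unique(stretched)))  # still 64
# The output now spans 0-255, but only 64 of those 256 levels are
# populated: the step size has quadrupled, i.e. fine gradations
# became coarse ones (posterization).
```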
But if I compare two lenses scoring 9.5 and 9, is that difference really something that can't be compensated for in post-production?
A clarification of my question. We have micro-contrast and overall contrast. Personally, I understand that micro-contrast matters more "in the lens", because trying to enhance it in post-production can create an "artificial" look. I'm thinking, for instance, of what happens when you overdo "Clarity" in Lightroom.
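On that "artificial" Clarity look, here's a rough sketch of the usual explanation (my assumption of how such tools behave, not Adobe's actual algorithm): local-contrast sliders act roughly like an unsharp mask with a large radius, output = input + amount × (input − blurred(input)). Pushed too far, this overshoots near edges, which is the halo effect.

```python
import numpy as np

def box_blur(x, radius):
    # Simple stand-in for a large-radius blur, with edge padding.
    padded = np.pad(x, radius, mode="edge")
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(padded, kernel, mode="valid")

def local_contrast(x, radius, amount):
    # Unsharp-mask-style local contrast boost.
    return x + amount * (x - box_blur(x, radius))

# A soft edge from shadow (0.2) to highlight (0.8):
signal = np.concatenate([np.full(50, 0.2),
                         np.linspace(0.2, 0.8, 20),
                         np.full(50, 0.8)])

mild  = local_contrast(signal, radius=15, amount=0.3)
harsh = local_contrast(signal, radius=15, amount=2.0)

# Overshoot beyond the original tonal range is the visible "halo":
print("original range:", signal.min(), signal.max())
print("mild range:   ", round(mild.min(), 3), round(mild.max(), 3))
print("harsh range:  ", round(harsh.min(), 3), round(harsh.max(), 3))
```

With a gentle amount the overshoot is small; with a heavy amount the tones near the edge swing well outside the original shadow/highlight values, which is exactly the halo look you see when Clarity is overdone.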
But what about the normal, overall contrast? Here I don't have a clear answer. If anybody has one, in either direction, do they have some evidence, i.e. a test case with comparisons?
PS Clearly I'm thinking of RAW post-processing, with plenty of bits...
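On the "plenty of bits" point, the same kind of toy stretch (again just an illustration, not a real raw converter) behaves very differently depending on capture bit depth: a quarter-range scene stretched to a full-range 8-bit output keeps only about 65 levels from an 8-bit source, but all 256 from a 14-bit raw source.

```python
import numpy as np

# Scene tones occupying only a quarter of the tonal range, in [0, 1]:
gradient = np.linspace(0.25, 0.50, 5000)

cap8  = np.round(gradient * 255)     # 8-bit capture
cap14 = np.round(gradient * 16383)   # 14-bit raw capture

def stretch_to_8bit(x):
    # Contrast-stretch to full range, then quantize to an 8-bit output.
    y = (x - x.min()) / (x.max() - x.min())
    return np.round(y * 255).astype(np.uint8)

print("8-bit source, output levels: ", len(np.unique(stretch_to_8bit(cap8))))
print("14-bit source, output levels:", len(np.unique(stretch_to_8bit(cap14))))
```

The 14-bit source has enough intermediate values that the stretched output still fills every 8-bit level, which is why a moderate contrast correction on raw data can stay smooth.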