I gave up trying to calibrate mine. With this monitor it depended on the lighting in the room as well as the distance and angle... and the only way to get a perfect match was to crouch ten feet away. I have since put my monitor on top of a stack of four textbooks on my desk, and even then it still isn't right until I'm a certain distance away. This hasn't been a problem yet (I've been using it since July), and so far it's served me well enough. As for what type of monitor I prefer, I am never going back to CRT. There are certain nuances in color and texture that I find very hard for CRTs to reproduce (artifacting in JPEGs being one of the most dramatic examples), and I now feel handicapped every time I'm forced to go back to one.
Incidentally, the reason you give for eventually switching to TFT is the same one my uncle gave, and he used the same example too.
For some reason I'm still intrigued by CRT monitors. Once I fix my house and get a new desk I'll hook up my old 60 MHz machine and see if I can find a monitor for it, just for comparison.
You're forgetting that the refresh rate also caps the maximum FPS the monitor can show. Some people just don't find 60 smooth enough and prefer something higher, like 75 or more.
I assume you're talking about video games here. I've always been sceptical about this. The reason people perceive choppiness up to very high frame rates is that in a video game, a frame is a still image of one exact point in time. Film frames, on the other hand, contain much more information, because the camera recorded light continuously for as long as its shutter was open, which is why films have motion blur. When you see choppiness in a game, you're really just responding to the gaps between screen updates.
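To make the distinction concrete, here's a small sketch of my own (in Python, with made-up numbers for the frame rate, shutter and object speed): a game frame samples a moving object at one instant, while a film frame effectively averages its position over the whole time the shutter is open, which is exactly where the smear comes from.

```python
# Sketch (my own illustration, not from the thread): compare what a game frame
# and a film frame "see" for an object moving at constant speed.
FRAME_RATE = 24.0        # film frame rate, frames per second
SHUTTER_FRACTION = 0.5   # 180-degree shutter: open for half the frame interval
SPEED = 480.0            # object speed in pixels per second (assumed value)

frame_interval = 1.0 / FRAME_RATE
shutter_time = SHUTTER_FRACTION * frame_interval

def game_sample(t):
    """A game frame: the object's position at one exact instant."""
    return SPEED * t

def film_sample(t, subsamples=100):
    """A film frame: light gathered over the whole time the shutter is open,
    approximated here by averaging many positions inside [t, t + shutter_time]."""
    positions = [SPEED * (t + shutter_time * i / (subsamples - 1))
                 for i in range(subsamples)]
    return sum(positions) / len(positions)

t = 0.0
print(f"game frame sees the object at a single point: {game_sample(t):.1f} px")
print(f"film frame sees a smear centred at {film_sample(t):.1f} px, "
      f"about {SPEED * shutter_time:.1f} px long")
```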
It's not unreasonable to claim that the human eye can notice even tiny gaps between screen updates at very high refresh rates (120 Hz and beyond). But it's a much bigger stretch to suggest this is still the case once motion blur is added to the equation. You'll have a much harder time telling blurred 60 FPS from blurred 120 FPS because the "gaps" have already been filled in. The limit might be a little higher than 60, but I doubt it's very much higher.
As soon as GPUs are able to render accurate motion blur, I think there will be very little need to crank the FPS up past 60.
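One crude way to picture what that would look like is brute-force temporal supersampling: render several sub-frames per displayed frame and average them. The sketch below is my own illustration with assumed numbers and a stand-in render function, not how a real engine does it (real renderers use cheaper tricks like velocity buffers), but it shows why the blurred frames stop looking like isolated instants.

```python
# Sketch under my own assumptions: fake motion blur by averaging several
# temporal sub-samples into each displayed frame. "render" here just returns
# the position of a moving dot; a real renderer would produce a full image
# per sub-sample.
DISPLAY_FPS = 60
SUBSAMPLES = 8           # temporal samples averaged into each displayed frame
SPEED = 600.0            # pixels per second, an assumed value

def render(t):
    """Stand-in for rendering the scene at time t: position of a moving dot."""
    return SPEED * t

def blurred_frame(frame_index):
    """Average SUBSAMPLES renders spread across the frame's 1/60 s interval."""
    t0 = frame_index / DISPLAY_FPS
    dt = 1.0 / (DISPLAY_FPS * SUBSAMPLES)
    samples = [render(t0 + i * dt) for i in range(SUBSAMPLES)]
    return sum(samples) / len(samples)

# With enough sub-samples, the smear captured by one frame runs almost up to
# where the next frame's smear starts, so there's no sharp jump between two
# isolated instants for the eye to pick up on.
for f in range(3):
    print(f"frame {f}: sharp sample {render(f / DISPLAY_FPS):6.1f} px, "
          f"blurred centre {blurred_frame(f):6.1f} px")
```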