You duuurty durrrrrrty hypocrite you.....
Anyways, regarding your system build, you may want to consider the Radeon 3850/3870. They're about half the cost of the 8800GT (at street price, not MSRP) and offer between 70-110% of its performance, depending on the game. Seems ATI is taking the "Wii" route with this generation of video cards - can't say I think it's a bad idea, since no game out there fully taxes the 8800 series yet. I get 30+ fps at all times in Crysis with every setting at maximum except antialiasing, and ~11,450 points in 3DMark06...
I don't see any reason to release another, even more powerful generation of cards when the people who bought 8800s/2900s are probably still trying to pay them off. The only place there's money right now is the enthusiast/budget segment, which is exactly where this card is aimed. Clever, clever...
Infos: http://www.extremetech.com/article2/0,1 ... 044,00.asp
When reading the charts, remember that if you're using a typical LCD, you almost certainly won't be running 1920x1200 - most panels are native 1280x1024 or 1680x1050. So the total number of pixels is:
1920x1200: 2,304,000
1680x1050: 1,764,000
1280x1024: 1,310,720
So... a widescreen LCD at 1680x1050 puts you somewhere midway between the first two items on the graph. 25-30 fps is generally considered "highly playable" - most TV standards run in that range too (PAL at 25 fps, NTSC at roughly 30), so if motion on TV looks smooth to you, you won't be complaining. Most people can't tell 30 fps from 60 fps unless they see them side by side.
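If you want to rough out where 1680x1050 actually lands between the two resolutions on the chart, here's a quick back-of-the-envelope sketch. It just interpolates fps on total pixel count, which ignores CPU and memory bottlenecks, and the fps numbers in it are placeholders, not figures from the article - swap in the real chart values:

    # Rough estimate: interpolate fps at a target resolution from two
    # benchmarked resolutions, scaling linearly with total pixel count.
    # Pure fill-rate assumption - treat the result as a ballpark only.

    def pixels(width, height):
        return width * height

    # Placeholder fps values - replace with the numbers from the chart.
    fps_at = {
        (1280, 1024): 45.0,   # hypothetical
        (1920, 1200): 28.0,   # hypothetical
    }

    def estimate_fps(target, low_res, high_res, fps_at):
        """Linear interpolation on pixel count between two data points."""
        p_lo, p_hi, p_t = pixels(*low_res), pixels(*high_res), pixels(*target)
        f_lo, f_hi = fps_at[low_res], fps_at[high_res]
        t = (p_t - p_lo) / (p_hi - p_lo)
        return f_lo + t * (f_hi - f_lo)

    for res in [(1280, 1024), (1680, 1050), (1920, 1200)]:
        print(res, f"{pixels(*res):,} pixels")
    est = estimate_fps((1680, 1050), (1280, 1024), (1920, 1200), fps_at)
    print(f"Estimated fps at 1680x1050: {est:.1f}")

With those placeholder numbers it spits out something in the high 30s for 1680x1050, which is consistent with the "midway between the first two items" eyeballing above.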