Wii has won.
On the other side of that coin, on a console there's no need to worry about drivers, no tweaking settings to make it work. And any game you pick up anywhere in the life span of the console WILL work. You can't take a PC from even '95 and play BF2 on it. But you can take an Xbox and play Halo, or something much further along in the lifespan, without worrying about compatibility.
I think Triaxx brings up the main point of contention between PC gaming and console gaming, but frankly I don't feel like arguing that argument again.
As to the whole Console Wars: Episode Who-Gives-a-F*CK!!, I'm gonna wait till I can actually use the Wii before passing judgement. As to the PS3, meh, whenever it gets here I'll give it a try...
And finally, to bring the thread back on topic:
Wii wins because "I'm(She's) cheap and fun"

- SwiftSpear
- Classic Community Lead
- Posts: 7287
- Joined: 12 Aug 2005, 09:29
Drone_Fragger wrote: The Xbox360 can be beaten down by a 500$ PC, easily.
Snipawolf wrote: I doubt that, but if I used that 500$ for *cough new processor, new GFX card cough* I could easily beat it...
Not likely. In 3 years' time DX10 will split the gaming market in two, and you will violently feel the sting of not having DX10-compatible hardware if you don't rebuild.
But I want my own computer, for me, not one that any of my computer-illiterate family can touch.
Now, I can get a good computer for 1100$; it will last at least 3 years, and in 3 years I can upgrade INSTEAD OF BUYING A NEW ONE and save a couple hundred...
http://youtube.com/watch?v=zea6FH1w1Zc
That looks so great. I'm totally getting a Wii. Anyone who says the Wii has bad graphics is a moron; it looks great! And games aren't just about visuals.
Imagine a game like F.E.A.R. or Doom 3 played with a Wii - it would be so much scarier because you're actually, you know, involved. I think a horror game on the Wii would be scary as hell.
- SwiftSpear
- Classic Community Lead
- Posts: 7287
- Joined: 12 Aug 2005, 09:29
The graphics for Red Steel aren't bad... but comparatively, look at something like Alan Wake or Crysis. There's no competition. The Wii can't do high def, so it couldn't possibly even come close to matching the graphical abilities of either the 360 or PS3. However, I don't know about you, but I don't have a high-def TV.
- Drone_Fragger
- Posts: 1341
- Joined: 04 Dec 2005, 15:49
SwiftSpear wrote: Not likely. In 3 years' time DX10 will split the gaming market in two, and you will violently feel the sting of not having DX10-compatible hardware if you don't rebuild.
In three years you'll either have rebuilt your computer (or at least the relevant parts) already, or it will be so outdated that it wouldn't run those games at minimum detail even without the DirectX issue. Games only demand what most people have already added to their rigs; look how long it took for shaders to be used. By the time games come only in DX10 flavour, pretty much all gamers will have replaced their incompatible hardware as part of their regular upgrade cycle.
SwiftSpear wrote: The Wii can't do high def, so it couldn't possibly even come close to matching the graphical abilities of either the 360 or PS3. However, I don't know about you, but I don't have a high-def TV.
HD is only a higher resolution; the assets themselves have become much more complex on the PC, 360, and PS3, and they would look much better even on SD.
Of course few people have HD; I think it has only become available this year around here. Some Capcom guy (the one responsible for Dead Rising) said that if you don't have an HDTV, there's no point in buying an HD console. Or if you've got a family and have to plug the consoles into a secondary TV, I guess. Many families would still use an SDTV as their secondary TV (most likely the old primary TV) even after buying an HDTV, since the HDTV ends up in the living room and the consoles rarely do.
EDIT: Drone, it's 1280x720 or 1920x1080. Of course the physical resolution of the TV may differ...
Triaxx2 wrote: On the other side of that coin, on a console there's no need to worry about drivers, no tweaking settings to make it work. And any game you pick up anywhere in the life span of the console WILL work. You can't take a PC from even '95 and play BF2 on it. But you can take an Xbox and play Halo, or something much further along in the lifespan, without worrying about compatibility.
Then why do a lot of games run like crap on the Xbox?
BTW, he said he can upgrade in a few years' time. And he can. They're not gonna be changing the PCIe interface for ages now, and the next GeForce (if you've seen the specs - like, WTF, 128 shaders, 768MB of RAM... insanity) runs everything needed for DX10. So he will only have to get a new graphics card.
I got a new PC recently, and instead of getting all the best stuff I bought the best performance-for-price stuff; next year I'll buy the super graphics card and the fastest processor. That's the best way to do it.
aGorm
Something to remember is that, as a result of not having hi-def, the Wii can do much more impressive graphics with less hardware. Hi-def is not just some magic chip they put in the console to increase the resolution; it works the same as for computers. Yeah, I could run Doom 3 on my computer a few years ago - at 640x480 with 4x antialiasing to compensate. It definitely still looked extremely good IMO, and I got about 30-60 fps on average. But do something like increase that resolution to 1080p (1920x1080, or "high definition") and there's no way I could possibly maintain 5 FPS with even nothing at all on the screen. I mean, really, 1600x1200 is currently considered the super-high-end resolution (there's 2048x1536 too, but no monitors can really do it well yet). That's 1,920,000 pixels to be rendered 60 times a second. But 1080p is even higher - 2,073,600!!!
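To make that arithmetic concrete, here's a rough back-of-the-envelope sketch in Python, assuming a flat 60 fps target and just the resolutions discussed above (nothing here is official hardware data, it's only the pixel counts):

```python
# Back-of-the-envelope pixel throughput at a 60 fps target.
# Resolutions are the ones mentioned above; purely illustrative arithmetic.
FPS = 60

resolutions = {
    "640x480 (what I ran Doom 3 at)": (640, 480),
    "1600x1200 (super-high-end monitor)": (1600, 1200),
    "1920x1080 (1080p 'high definition')": (1920, 1080),
}

for name, (w, h) in resolutions.items():
    per_frame = w * h
    per_second = per_frame * FPS
    print(f"{name}: {per_frame:,} pixels/frame, {per_second:,} pixels/second")
```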
This is why, even despite not having the Cell and having comparatively less expensive hardware, the X360 looks just as good as any current PS3 stuff you see. It's not rendering at 1080p, it's rendering at 1080i, or 1,036,800 pixels total, half of what the PS3 will have to calculate.
Now take that down even further to the Wii's max resolution of 480p (852x480), a total of 408,960 pixels 60 times a second, and you realize that the Wii only has to work less than one fifth as hard as the PS3 to attain the same level of visual quality, assuming you're using a standard- or enhanced-definition TV set like me and 9 out of 10 other North Americans.
This means that the Wii needs far less power to do the same thing. No stats have been released yet on its memory, CPU, or GPU speeds; it's still all rumor. But let's assume it has a 1GHz CPU, the graphical equivalent of something medium-power like a 7600, and 512MB of DDR. There isn't a direct correlation between the number of pixels rendered and performance, since lots of other things come into play. Luckily for the Wii, one of the major factors is prior developer experience with the console, and GC and Wii hardware are apparently very similar to program for. You can't just say it's going to perform like a 5GHz computer, but it will certainly perform a lot better than a direct MHz comparison between the big three machines would indicate.
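Putting rough numbers on that ratio (a sketch based on the resolutions assumed above - 1080p for the PS3, 1080i fields for the 360, 852x480 for the Wii - not on any released specs):

```python
# Relative per-frame pixel workload, using the resolutions assumed above.
# 1080i is counted as half of 1080p because only one field is drawn per refresh.
ps3_1080p = 1920 * 1080        # 2,073,600 pixels per frame
x360_1080i = ps3_1080p // 2    # 1,036,800 pixels per field
wii_480p = 852 * 480           #   408,960 pixels per frame

print(f"360 vs PS3: {x360_1080i / ps3_1080p:.2f}x the pixels")  # 0.50
print(f"Wii vs PS3: {wii_480p / ps3_1080p:.2f}x the pixels")    # 0.20, i.e. less than one fifth
```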
- Drone_Fragger
- Posts: 1341
- Joined: 04 Dec 2005, 15:49