For those of you in doubt about AMD doing something with ATI

LOrDo
Posts: 1154
Joined: 27 Feb 2006, 00:21

For those of you in doubt about AMD doing something with ATI

Post by LOrDo »

http://www.tgdaily.com/2006/10/25/amd_a ... processor/

Processors and GPUs merged together, eh? Says it'll offer lower power consumption, but they don't really go into the details. Well, gives me a reason to live another 3 years...
Discuss.
KDR_11k
Game Developer
Posts: 8293
Joined: 25 Jun 2006, 08:44

Post by KDR_11k »

Probably a disproportionately crappy GPU put into every CPU, and only the highest-end CPUs come with decent GPUs. So you have to buy the $1000 "extreme gamer's edition" to get a *900 chip.
Peet
Malcontent
Posts: 4384
Joined: 27 Feb 2006, 22:04

Post by Peet »

I don't see how that would be any better than integrated graphics, unless it had onboard RAM.
Lindir The Green
Posts: 815
Joined: 04 May 2005, 15:09

Post by Lindir The Green »

I got the impression that it would be like integrated graphics, except that when there's less demand for GPU time and more for CPU time, some of the GPU processing power could spill over, and vice versa. If not in this particular generation, then sometime in the future.

And then you can add a dedicated card, and get two graphics systems at once.

Eventually, though, I think there will be something like this: a low-core (2 or 4), high-speed processor with low-bandwidth RAM, connected to a high-core, low-speed processor with high-bandwidth RAM. The low-core one would be optimised for what a processor usually does, and the high-core one would be optimised for physics and graphics (which work best like this, because there are many, many separate calculations that can be done in any order). But if there's a lot of demand for one thing and less for the other, then some of the calculations could be done by the processor not optimised for them.
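
(A minimal illustration of the "many separate calculations in any order" point, not anything from the article: in the C++ sketch below, with made-up particle data, every iteration is independent, so the work can be handed to whichever cores, or spare GPU-style units, happen to be free, in any order.)

    #include <cstddef>
    #include <thread>
    #include <vector>

    struct Particle { float x, y, z, vx, vy, vz; };

    // Each particle's update touches only that particle, so iterations can run
    // in any order, on any core, with no coordination between them.
    void integrate(std::vector<Particle>& p, std::size_t begin, std::size_t end, float dt) {
        for (std::size_t i = begin; i < end; ++i) {
            p[i].x += p[i].vx * dt;
            p[i].y += p[i].vy * dt;
            p[i].z += p[i].vz * dt;
        }
    }

    int main() {
        std::vector<Particle> particles(100000, Particle{0, 0, 0, 1, 1, 1});
        const float dt = 0.016f;

        // Split the range between two workers; a hybrid chip could just as well
        // hand one half to the CPU cores and the other to idle GPU-style units.
        std::size_t half = particles.size() / 2;
        std::thread worker(integrate, std::ref(particles), std::size_t{0}, half, dt);
        integrate(particles, half, particles.size(), dt);
        worker.join();
    }

(Because no iteration reads another particle's data, the split point and the execution order don't affect the result.)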
mehere101
Posts: 293
Joined: 15 Mar 2006, 02:38

Post by mehere101 »

I think that this is a bad idea. Basically, graphics processing is far too different from generalized processing; general-purpose code isn't nearly as dependent on fast floating-point operations. What AMD should do is get competitive ATI Linux drivers out.
SwiftSpear
Classic Community Lead
Posts: 7287
Joined: 12 Aug 2005, 09:29

Post by SwiftSpear »

mehere101 wrote: I think that this is a bad idea. Basically, graphics processing is far too different from generalized processing; general-purpose code isn't nearly as dependent on fast floating-point operations. What AMD should do is get competitive ATI Linux drivers out.
I think it's pretty evident that they plan on using multiple processor cores. IOW, one or more dedicated standard processor cores alongside one or more dedicated graphics processor cores, which could mean lightning-fast inter-device communication... or it could mean worse overall performance for both pieces of hardware; that remains to be seen.
jcnossen
Former Engine Dev
Posts: 2440
Joined: 05 Jun 2005, 19:13

Post by jcnossen »

It could also be that the GPU gets to use unused CPU pipelines, which would be a great advantage. I think right now all the SIMD instruction support goes unused most of the time...
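
(Purely as an illustration of the mostly-idle SIMD units mentioned above, nothing from the article: SSE intrinsics in C++ doing four-wide float math in single instructions, the kind of work ordinary scalar code leaves untouched.)

    #include <xmmintrin.h>  // SSE intrinsics
    #include <cstdio>

    int main() {
        // Two sets of four floats, each held in a single 128-bit SSE register.
        __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
        __m128 b = _mm_set_ps(8.0f, 7.0f, 6.0f, 5.0f);

        // One multiply and one add cover all four lanes at once; scalar code
        // would issue eight separate floating-point operations for the same work.
        __m128 r = _mm_add_ps(_mm_mul_ps(a, b), a);

        float out[4];
        _mm_storeu_ps(out, r);
        std::printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    }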
aGorm
Posts: 2928
Joined: 12 Jan 2005, 10:25

Post by aGorm »

They're not saying they're gonna replace discrete graphics entirely though... this is mainly (at first anyway) for home entertainment systems, laptops, and possibly office work. However... think about it: if they're putting physics onto graphics cards, what's to stop them using the graphics part of a CPU to handle physics while your 3D card does what it was bought for? That would be a sweet way of doing things.

3 years is ages away though, so I'd not bother worrying about what it will be like.

aGorm
PauloMorfeo
Posts: 2004
Joined: 15 Dec 2004, 20:53

Post by PauloMorfeo »

CPUs eventually integrated the mathematical co-processor, and now no CPU ships without one. Maybe it will end up like that with CPUs and GPUs?
bamb
Posts: 350
Joined: 04 Apr 2006, 14:20

Post by bamb »

Who the hell wants to buy an ATI anyway, when its OpenGL support sucks ass? :?
aGorm
Posts: 2928
Joined: 12 Jan 2005, 10:25

Post by aGorm »

I don't (NVIDIA RULES!1!!!1!! (except for buying 3dfx... damn them))... but you could use that for physics and Nvidia for graphics... that would be sweet.

aGorm
imbaczek
Posts: 3629
Joined: 22 Aug 2006, 16:19

Post by imbaczek »

Today's GPUs are really just somewhat specialized vector processors, so that's a smart move. E.g. Folding@home running on some of the new ATI cards is 20x (twenty times) faster than on the fastest Opterons; this could also work as a dedicated physics processor and things like that. See also the PS3 and its Cell array.
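
(A rough sketch of what "specialized vector processor" means in practice, with toy numbers of my own rather than anything from Folding@home: GPU-style code is written as one small kernel applied independently to every element, so thousands of vector lanes can chew through the whole array in parallel, which is where speedups like the one quoted above come from.)

    #include <cstddef>
    #include <vector>

    // "Kernel": the same small computation for element i, depending on no other
    // element. A GPU runs thousands of these in parallel; a CPU just loops.
    float pair_energy(float r) {
        float inv_r6 = 1.0f / (r * r * r * r * r * r);
        return inv_r6 * inv_r6 - inv_r6;  // toy Lennard-Jones-style term
    }

    int main() {
        std::vector<float> distances(1 << 20, 1.5f);  // made-up input data
        std::vector<float> energies(distances.size());

        // On a vector processor this loop collapses into "run the kernel over
        // 2^20 elements"; the per-element code stays identical.
        for (std::size_t i = 0; i < distances.size(); ++i)
            energies[i] = pair_energy(distances[i]);
    }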
raneti
Posts: 148
Joined: 21 Sep 2006, 00:12

Post by raneti »

My ATI Radeon 9200 beats the whole GeForce MX series by about 2x. I had a GF4 MX and it didn't even have pixel shaders (besides, it burned out). ATI has always been cheaper too. Is that still the case? Because I'll have to buy a Pixel Shader 3 card in the coming months; games keep demanding higher shader versions.

As for the rest, it's only about companies making money. What if they increased processor size 3, 4, 5 times, included all the instructions necessary for computing anything, and video boards became just a carrier for graphics memory? Mass-produce that and voila, a cheap "gamer system". But companies make more money releasing 100 progressively better versions of processors over time, so that you end up buying 10 of them.

And by the way, AMD might just as well be an Intel branch kept around to keep the antitrust lawyers at bay. There seems to be a pair of everything: Intel vs AMD, GeForce series vs ATI series...
rattle
Damned Developer
Posts: 8278
Joined: 01 Jun 2006, 13:15

Post by rattle »

The GeForce MX series is crap and not really worth comparing against anything.

Oh yes, ATI IS cheaper. So cheap that my first 3D card forgot half of its textures every now and then... the Rage 3D was bollocks and so were its drivers, and I doubt that's ever going to change after seeing some newer models in action at a friend's place. Driver issues here, borked shaders there (that's with the latest drivers, in DX apps), and generally poor OpenGL support.
LOrDo
Posts: 1154
Joined: 27 Feb 2006, 00:21

Post by LOrDo »

ATI's OpenGL support sucks balls compared to Nvidia's. But most games come with a Direct3D option anyway, so I don't really care. Nvidia cards also overclock better. But ATI seems to excel at everything else, including price.
Rage is ooooooold... there's no point in even making comparisons with it anymore.
PauloMorfeo
Posts: 2004
Joined: 15 Dec 2004, 20:53

Post by PauloMorfeo »

LOrDo wrote: ... But most games come with a Direct3D option anyway, ...
Not on Linux! And since I spend most of my non-Spring time on Linux...

Even if I didn't, the mere fact that it would be much worse whenever I was on Linux would make me stop considering ATI...

That said, the Radeon 9000 is a good card and was definitely the best of its time in performance, technology, price and Linux drivers (yes, at the time ATI was making its Linux drivers completely open-source); that much I know for sure. And my next gfx card will be chosen mainly with Linux in mind, meaning Nvidia.
Caydr
Omnidouche
Posts: 7179
Joined: 16 Oct 2004, 19:40

Post by Caydr »

Wow, this is a good idea. AMD already builds multiple things into the CPU that Intel traditionally kept elsewhere on the motherboard; put the GPU into the CPU as well and I bet the manufacturing costs drop drastically. Of course, it would only be good for integrated setups and laptops, but cheaper laptops = good.