what's OGL doing to keep up with DX10?

Post just about everything that isn't directly related to Spring here!

Caydr
Omnidouche
Posts: 7179
Joined: 16 Oct 2004, 19:40

what's OGL doing to keep up with DX10?

Post by Caydr »

Just curious, I've heard bits and pieces but nothing really concrete. Does anyone know what the people behind OGL are doing to make it capable of being competitive with DX10?

For those that don't know, according to Microsoft (ha ha), DirectX 10 will be as much as 8 times faster than DirectX 9. Whether that's factoring in the superior hardware that's being developed, improved efficiency in general with Vista, or what, it's still a pretty significant claim. The entire API is being rewritten from the ground up. From what little I understand, one of the big differences is that instead of the CPU and GPU having to talk back and forth for everything that gets rendered, most everything graphics-related will be handled independently by the GPU. That means the precious few milliseconds each frame has to work with won't be wasted mostly on the CPU and GPU talking to each other.

As an example, a great deal of Intel's Conroe advantage comes from the fact that it's manufactured on a smaller die than before. So a signal that would usually have to travel 2 millimetres now only has to travel half a millimetre, boosting efficiency and all that good stuff *massively*... now picture, in the GPU's case, the difference could go from 40 millimetres 4 times over, to half a millimetre. Great implications. Anyway...

Even if you ignore most of the hype about 8x faster performance, the fact alone that DX10 removes the CPU from a lot of these processes tells me that the increase in efficiency over DX9 will be dramatic, and OGL will have to do something similar or it will become obsolete overnight.
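A toy model makes the point above concrete (the overhead numbers here are made up for illustration, not from any benchmark): if the CPU pays a fixed driver/API cost per draw call while the GPU's own work stays constant, cutting the per-call chatter shrinks frame time dramatically in call-heavy scenes.

```python
def frame_time_ms(draw_calls, cpu_overhead_ms_per_call, gpu_work_ms):
    """Toy model: frame time = CPU submission overhead + fixed GPU work.

    Assumes the cost of the CPU talking to the driver scales with the
    number of draw calls, while the GPU's rendering work is unchanged.
    """
    return draw_calls * cpu_overhead_ms_per_call + gpu_work_ms

# 2000 draw calls at 0.01 ms of API overhead each, plus 5 ms of GPU work
old = frame_time_ms(2000, 0.01, 5.0)      # 25.0 ms per frame (40 fps)

# Same scene with per-call overhead cut 8x (batched, GPU-driven submission)
new = frame_time_ms(2000, 0.01 / 8, 5.0)  # 7.5 ms per frame (~133 fps)
```

Note the GPU work (5 ms) is untouched in both cases; only the CPU↔GPU chatter shrinks, which is exactly why the gain depends on how call-heavy the scene is.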
AF
AI Developer
Posts: 20687
Joined: 14 Sep 2004, 11:32

Post by AF »

Without Cedega there's no DirectX for Linux, so OpenGL has a stronghold from which to work regardless of DirectX's capabilities. That is, unless DirectX suddenly becomes open source or a Linux version is made.

Something is in the works, though I've only heard bits and bobs myself. A multithreaded OpenGL setup?
Drone_Fragger
Posts: 1341
Joined: 04 Dec 2005, 15:49

Post by Drone_Fragger »

Actually, Nvidia put pressure on Microsoft to put OGL support in. The way this works may not make much sense, but here goes:

Nvidia will refuse to make DX10 cards unless Windows puts OGL support in. This is because UT2007 will use OGL, and Nvidia is part of the team that makes it (or paid lots of money for it to be Nvidia-branded, anyway). The idea being that if Nvidia makes no DX10 cards, die-hard Nvidia fans aren't going to get Vista, because they don't need DX10 if there are no precious DX10 Nvidia cards. Therefore Microsoft loses lots of money (50% of people use Nvidia, 49% ATI and 1% other). Now, that may not make sense, but it sorta works.
Caydr
Omnidouche
Posts: 7179
Joined: 16 Oct 2004, 19:40

Post by Caydr »

Pretty bold of Nvidia to make a statement like that, especially when they've lost the upper hand (for now) in graphics.
jcnossen
Former Engine Dev
Posts: 2440
Joined: 05 Jun 2005, 19:13

Post by jcnossen »

UT2007 on Windows isn't using OpenGL. They implement it using DX, and Loki, which ports a lot of games to Linux, will add OpenGL support. The fact is that documentation and support for DirectX are simply much better than for OpenGL, so basically all AAA game engines except Doom3's are using D3D.
Doom3 using OpenGL is somewhat logical, since John Carmack, who did the rendering, is on the OpenGL committee.

As this article shows, they are (as usual) coming up with the same stuff, just named slightly differently and a lot later :/

http://www.gamedev.net/columns/events/g ... asp?id=233
Min3mat
Posts: 3455
Joined: 17 Nov 2004, 20:19

Post by Min3mat »

opensource FTW!
that way you can choose the evil capitalist stuff and feel like you really made a choice and took a stand
one exception being spring, if the evil OGLers (pun!) steal our SY's it may be able to compete (but at what cost ={O)
HAARP
Posts: 182
Joined: 06 Apr 2006, 07:18

Post by HAARP »

On the other hand, Nvidia is the company that paid EA lots of money to make BF2 incompatible with older cards.
Snipawolf
Posts: 4357
Joined: 12 Dec 2005, 01:49

Post by Snipawolf »

HAARP wrote:On the other hand, Nvidia is the company that paid EA lots of money to make BF2 incompatible with older cards.
That sounds underhanded, heheheh...
hawkki
Posts: 222
Joined: 01 Jan 2006, 19:47

Post by hawkki »

From a technical perspective DX10 will be a huge leap. The PC will get closer to the way consoles work, so you get more performance out of your hardware.

Another point is: how much faster will everything actually run on DX10? Probably not a single one of the old games will be updated to DX10 compatibility, because it's so difficult and requires so much work. And new games designed for DX10 probably won't work on anything lower, because it's so different.

So the real performance gain is probably going to be hard to notice.
Drone_Fragger
Posts: 1341
Joined: 04 Dec 2005, 15:49

Post by Drone_Fragger »

hawkki wrote:From a technical perspective DX10 will be a huge leap. The PC will get closer to the way consoles work, so you get more performance out of your hardware.

Another point is: how much faster will everything actually run on DX10? Probably not a single one of the old games will be updated to DX10 compatibility, because it's so difficult and requires so much work. And new games designed for DX10 probably won't work on anything lower, because it's so different.

So the real performance gain is probably going to be hard to notice.
Wrong. DX10 games running on a DX9 card get no DX10 features (but still run), and a DX9 game running on a DX10 card gets about an 800% performance boost (according to Microsoft anyway, but no one believes what they say).
jcnossen
Former Engine Dev
Posts: 2440
Joined: 05 Jun 2005, 19:13

Post by jcnossen »

The 800% speed increase only happens in the places where the CPU and the Direct3D API are actually the bottleneck. If you didn't have enough pixel processing power on your hardware, you still won't have enough with DX10.
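This is just Amdahl's law: only the API-bound fraction of frame time benefits from the claimed 8x. A quick sketch (the fractions below are illustrative assumptions, not measurements):

```python
def overall_speedup(api_fraction, api_speedup):
    """Amdahl's law: speed up only the API/CPU-bound share of frame time.

    api_fraction: share of frame time spent in driver/API overhead (0..1)
    api_speedup:  factor by which that share gets faster (e.g. 8x)
    """
    return 1.0 / ((1.0 - api_fraction) + api_fraction / api_speedup)

# Heavily CPU/API-limited scene: 90% of the frame is driver overhead
print(overall_speedup(0.9, 8.0))  # ~4.7x overall, even with an 8x API boost

# Pixel-shader-limited scene: only 10% of the frame is API overhead
print(overall_speedup(0.1, 8.0))  # ~1.1x -- barely noticeable
```

So the "8x faster" headline is a best case for draw-call-bound workloads; a fill-rate-bound game sees almost nothing.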
Cabbage
Posts: 1548
Joined: 12 Mar 2006, 22:34

Post by Cabbage »

/me caresses his 28 pixel pipelines....

probably 56 by the time DX10 is out :P
mehere101
Posts: 293
Joined: 15 Mar 2006, 02:38

Post by mehere101 »

/me caresses my precious 128MB of texture memory that VISTA WILL UNCEREMONIOUSLY EAT.
Argh
Posts: 10920
Joined: 21 Feb 2005, 03:38

Post by Argh »

Um, am I the only one paying attention to the multi-threaded raytracing cards being developed by a German university project, which may make all of this crap moot, aside from shader implementations?

Just asking.
jcnossen
Former Engine Dev
Posts: 2440
Joined: 05 Jun 2005, 19:13

Post by jcnossen »

By the time the raytracing-specific hardware is done, we'll be able to do the same on programmable consumer gfx cards. Those are moving towards a general parallel processor design anyway, and will be much cheaper.
Argh
Posts: 10920
Joined: 21 Feb 2005, 03:38

Post by Argh »

Maybe.

I just chalk it up as one more reason not to spend a lot of money upgrading my hardware too much this year ;)

For those who have no idea what I'm talking about, go here.