SLI and Spring?

Argh
Posts: 10920
Joined: 21 Feb 2005, 03:38

Post by Argh »

It's (finally) time for me to upgrade my rig again. Haven't done anything since I put in the 7800GT last year, and a new SATA drive. I'm sticking with XP for now, but I'm going to go ahead and upgrade the mobo, CPU, RAM... and I'm going to get two GPUs.

For Spring, will that be useful, useless, or cause problems? I've messed with DRB's rig, so I have some experience (he has a pair of GeForce 7400s SLI'd, and with certain adjustments made, he can run Spring pretty darn fast), but I'd like to know if anybody else has done any real tweaking with SLI yet and has any advice on the topic before I put $300+ into two video cards. Whatever I buy won't be the "latest / greatest", because I never do that- it's a waste of money when I upgrade at least part of my machine every 9 months or so.
AF
AI Developer
Posts: 20687
Joined: 14 Sep 2004, 11:32

Post by AF »

I suspect you'll run into the same issue that 8xxx users, and people with threaded optimizations turned on, have.
Argh
Posts: 10920
Joined: 21 Feb 2005, 03:38

Post by Argh »

So, what, exactly, is causing these problems? Does it have to do with what OpenGL calls are being used, and how the drivers handle them? Can it be addressed by making Spring OpenGL 2.0 compliant? Is Smoth using Vista, or XP? 32-bit, or 64? I can always upgrade to XP64, ya know- that's what DRB has, and while it has a few issues, it's been a fairly decent white-elephant OS...

Or is it purely a problem with drivers, which can be at least partially addressed with tweaking the driver behaviors?

DRB's system ran Spring rather crappily, until I used a driver-tweaking tool and did a number of things to optimize its performance and make use of SLI correctly.

I'm mainly worried about the 8xxx-series issue... should I just look at an ATI card, since I'm still using XP and the drivers behave acceptably with Spring, even if not perfectly? I mostly want a pair of cards that are reasonably fast, so that I can play Bioshock and a few other latecomer DX9 games that I can't really play on my current, last-gen rig, which is perfectly acceptable for Spring.

I don't give a flying fig about DX10 support, since I have no intention of using Vista until it either stops sounding like a terrible idea or hardware changes force me to, frankly... and if M$ hasn't fixed Vista yet, I'll probably go Linux at that point.

Which is another thing entirely- if, perchance, I decided to go dual-boot and get my feet wet with Linux... would SLI work there, or is the driver support still too crappy?
AF
AI Developer
Posts: 20687
Joined: 14 Sep 2004, 11:32

Post by AF »

One problem is that the more smoth goes on about an issue, the more people think the issue only affects smoth.

The 8xxx series cards and their ATI equivalents are a huge shift in architecture design from the cards that came before them. That's also why the multi-threaded driver settings, which are supposed to speed programs up, have such a degrading effect on Spring.

What we're seeing is the equivalent of a single-threaded program running on a 2048-core CPU: we're only getting 1/2048th of the processing power. That normally shouldn't hurt programs this badly in the graphics world, because the ways of solving it happen to be the same ways of getting more frame rate out of previous architectures.

So basically, Spring's internals aren't behaving as well as they should, causing various slowdowns; perhaps the driver thread is being forced to wait for the other threads to finish so that it can pass something back to Spring, when there's a better way of doing it anyway.

Iamacup, IIRC, saw that he had SLI and wasn't getting double performance. Lordmatt even switched to Linux.


Also, so the story goes, SJ asked an Nvidia engineer about this ages ago (IIRC) and was told the slowdown could be because of glGet calls.
DemO
Posts: 541
Joined: 18 Jul 2006, 02:05

Post by DemO »

Don't go SLI unless you are going to buy at least two 8800 series cards.

A single 8800GT will outperform an SLI setup of any lower-series Nvidia cards, and it will also outperform an 8800GTS at much less cost.

If it hadn't been for the 8800GT being released just after I built my rig with an 8800GTX, I would have bought two for the same price and got more performance.

On the downside, it appears that 8800GTs are more or less out of stock at every credible online retailer I have checked recently. That's obviously because they are far and away the best deal in terms of "bang per buck" right now, at least from Nvidia.

Oh, and in case you are not aware, SLI does NOT give double performance- not even close. More like anywhere between 30% and 70% extra. The drivers are not well optimized yet, and there is considerable doubt about whether the SLI architecture is even capable of giving double performance, even with perfectly optimized drivers.
Tobi
Spring Developer
Posts: 4598
Joined: 01 Jun 2005, 11:36

Post by Tobi »

I took a quick look; I think there are 3 glGet calls that may be the cause, if the cause is indeed just in the glGet calls.

Each of them is getting the modelview matrix from OpenGL, so by emulating the gl/glu code manually, or by remembering the matrix instead of querying OpenGL for it, it should be pretty trivial to change this.

The places are:
UnitDrawer.cpp:1152
Camera.cpp:196
Camera.cpp:211

There are more glGets in the Lua code, but if fixing these 3 works for mods that don't use Lua, you know you are on the right track.
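
Something like this, roughly (a sketch only: the class and method names here are illustrative, not Spring's actual interface):

[code]
// Sketch: assumes a current OpenGL context. Cache the modelview matrix
// on the CPU side whenever we load it, so readers never have to ask the
// driver for it back.
#include <GL/gl.h>
#include <cstring>

class CachedCamera {
public:
	// Remember the matrix at the moment we load it ourselves...
	void SetModelview(const float m[16]) {
		std::memcpy(modelview, m, sizeof(modelview));
		glMatrixMode(GL_MODELVIEW);
		glLoadMatrixf(m);
	}

	// ...so later reads use the cached copy instead of a stalling
	// glGetFloatv(GL_MODELVIEW_MATRIX, out) round trip.
	const float* GetModelview() const { return modelview; }

private:
	float modelview[16];
};
[/code]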
Argh
Posts: 10920
Joined: 21 Feb 2005, 03:38

Post by Argh »

@Tobi:

Um, ok, I'm stupid (big surprise). What does a glGet have to do with these problems? I went here to educate myself a little bit more on this topic, and it doesn't help me out much.

IIRC, when I tweaked DRB's SLI rig, I had to use the tweak tools to force OpenGL 1.X compliance mode, or something. Are there OpenGL calls in 1.X that are not valid in OpenGL 2.X? If so, is there a list somewhere? I'll go look at the official site; maybe there's a relatively simple way to deal with this issue... or maybe Spring needs a separate renderer, with parallel code, in order to use modern GPUs effectively- or at least a switch that could be passed, if a card responds to a query asking what version of OpenGL it supports...
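
For what it's worth, asking the driver what version it supports is a one-liner (a sketch; assumes a current GL context, and unlike the per-frame glGets discussed above this is a cheap one-time query):

[code]
// Sketch: assumes a current OpenGL context.
#include <GL/gl.h>
#include <cstdio>

void PrintGLVersion() {
	// Returns a string like "2.1.2 NVIDIA 169.21", or NULL with no context.
	const GLubyte* version = glGetString(GL_VERSION);
	std::printf("OpenGL version: %s\n",
	            version != NULL ? (const char*)version : "(no context)");
}
[/code]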

@Dem0:

You're saying that instead of two SLI'd 8600s for about $300, I should get one 8800 for about the same price? I'll have to go read the reviews, I guess, and see what the numbers look like.
Argh
Posts: 10920
Joined: 21 Feb 2005, 03:38

Post by Argh »

This might be interesting, for troubleshooting purposes:

http://developer.download.nvidia.com/op ... 0specs.pdf

This is ol' Jabba, trying to deal with rendering 3DO, I suspect.

http://www.gamedev.net/community/forums ... _id=304782

What's interesting, now that I've poked at the source a bit, is that in one place Spring is doing this as described by the posters, and in another place it's doing glGetFloatv(GL_MODELVIEW_MATRIX,parameter) instead of doing the matrix math itself and sending the result to OpenGL, as suggested. But I'm way too stupid to know why, or whether it matters.

Page 77+ of the G80 chipset stuff seems to be talking about OpenGL and how it works with the new architecture. I can't follow all of it, but it almost looks like it wants a new, proprietary function call... again, I wonder if the FPS hits are due to emulation being invoked- Spring may need to query video cards and, if GF8+, then send GL_EXT_geometry_shader4... maybe I'm just not reading that right, of course; I'm having trouble following chunks of it when it gets into the mechanics of tri-stripping, etc.
Harbinger
Posts: 82
Joined: 26 Mar 2007, 22:14

Post by Harbinger »

One 8800GT owns two 8600GTs.
jcnossen
Former Engine Dev
Posts: 2440
Joined: 05 Jun 2005, 19:13

Post by jcnossen »

Argh wrote: Um, ok, I'm stupid (big surprise). What does a glGet have to do with these problems?
glGet requires the driver to process all pending GL commands before returning whatever the program requested, so it can stall the rendering process.
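
You can see it for yourself by timing the call (a rough sketch, assuming a current GL context and a C++11 compiler; how long it blocks depends entirely on how much work the driver has queued):

[code]
// Sketch: assumes a current OpenGL context.
#include <GL/gl.h>
#include <chrono>
#include <cstdio>

void TimeModelviewReadback() {
	float mv[16];
	const auto t0 = std::chrono::steady_clock::now();
	glGetFloatv(GL_MODELVIEW_MATRIX, mv); // potential CPU/driver sync point
	const auto t1 = std::chrono::steady_clock::now();
	const long long us =
		std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count();
	std::printf("glGetFloatv took %lld us\n", us);
}
[/code]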

My new 8800GT works OK with Spring, btw: 70 fps with everything on highest... still low, but acceptable. I wonder if you're even going to notice putting in a second card, since a single 8800GT can process everything Spring throws at it anyway. CPU and memory are going to be the bottleneck.
AF
AI Developer
Posts: 20687
Joined: 14 Sep 2004, 11:32

Post by AF »

A similar issue exists with Java Swing.

All the GUI work is done on an event queue in a single thread; anything that touches the GUI must run in that thread, or you get horrible performance or crashes.

So if you're not in the event thread and you want to query a GUI control, you use invoke-and-wait: the event thread executes its queue up until it reaches your invoke-and-wait command, then it returns your query's result and the program continues as normal. Which means that while your main thread is waiting on the event queue, it's not doing anything- it's just waiting, wasting time, holding up the processing.
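
In C++ terms the pattern looks something like this (a hypothetical sketch- the original analogy is Java Swing's invokeAndWait, and all names here are made up):

[code]
// Sketch of an "invoke and wait" pattern. The caller blocks until the
// event thread has worked through the queue to its task.
#include <deque>
#include <functional>
#include <future>
#include <mutex>

static std::deque<std::packaged_task<int()>> eventQueue;
static std::mutex eventMutex;

// Runs on the single "event thread": drains tasks one at a time.
void PumpEvents() {
	for (;;) {
		std::packaged_task<int()> task;
		{
			std::lock_guard<std::mutex> lock(eventMutex);
			if (eventQueue.empty())
				continue; // a real loop would sleep on a condition variable
			task = std::move(eventQueue.front());
			eventQueue.pop_front();
		}
		task(); // fulfils the caller's future
	}
}

// Called from any other thread: queue a query, then sit idle until the
// event thread reaches it. The dead time is the same as a blocking glGet.
int InvokeAndWait(std::function<int()> query) {
	std::packaged_task<int()> task(std::move(query));
	std::future<int> result = task.get_future();
	{
		std::lock_guard<std::mutex> lock(eventMutex);
		eventQueue.push_back(std::move(task));
	}
	return result.get(); // blocked: wasting time, holding up processing
}
[/code]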

This is pretty much the same thing as the glGet. As Jelmer said, while the gfx card is sorting through and executing all its commands, Spring is waiting for it to finish- holding up the graphics card, which could be sent more commands in this time, and the simulation, which could be moving units and projectiles around or accepting user input.

This would also explain the multithreaded-optimizations slowdown: there's no point in parallel processing if all the threads have to stop and wait until everything's finished 32 times a second.