But when I move fast I get ghosting, like I get with bloom.
It also still doesn't change the issue over distance. You can find good angles like this one, but I keep seeing red shadows and things. I am going to try another pair of glasses to see if that helps.
The ghosting comes from your low FPS. The widget switches the camera eye each frame and combines the previous frame with the current one, so one eye is always one frame old. That causes ghosting when the camera moves and the timespan between the last frame and the current one is too long.
The big secret is that this widget is a hack. There is no real fix for this short of an engine patch. The only other remedy is extremely high FPS, which makes the one-frame delay less noticeable.
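The FPS dependence above can be made concrete with a small sketch (Python, with a made-up camera speed): the stale eye lags by exactly one frame, so its screen offset is camera speed times frame time, and halving your FPS doubles the ghost.

```python
# Sketch of why low FPS worsens ghosting in a frame-alternating
# stereo widget: one eye's image is always one frame old, so its
# apparent offset equals camera speed * frame time.

def ghost_offset(camera_speed, fps):
    """World-space lag of the one-frame-old eye."""
    frame_time = 1.0 / fps
    return camera_speed * frame_time

speed = 600.0  # units per second while scrolling (made-up figure)

print(ghost_offset(speed, 30))   # about 20 units of ghosting at 30 FPS
print(ghost_offset(speed, 120))  # about 5 units at 120 FPS: 4x less
```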
smoth, I'm not sure if you read the help, but you must use a rotatable camera for convergence to work. When using the TA Overhead cam the convergence setting is disabled, because that camera can't rotate.
Bumping thread because I got a 3DTV a few months back and I'm finally getting ready to game on it. Is there any chance of making this work with ATI HD3D or nVidia 3D Vision so we get properly exported sequential left / right eye frames? It'd be pretty cool to be the first open source game on their rather sad list of native titles.
Hardly the first. All the games based on the IoQuake3 engine support stereo 3D.
Yeah, but you have to hack it. http://forums.nvidia.com/index.php?showtopic=170960 I'm talking about working out of the box: one checkbox, and if you have an nVidia 3D Vision rig, or an ATI / nVidia box hooked up to a standard 3DTV or monitor, you'd have Spring in 3D.
I don't know if that would work. From the widget, all I can do is draw alternating left/right frames on each DrawScreen event, but that is not the same as your shutter glasses, which are synced to the refresh rate of your actual monitor. It's probably doable with an engine patch, which I hope to tackle one day. As mentioned a few posts up, the widget is a hack and performs poorly when FPS is low.
I don't really see anything on their sites about how the APIs work. I'd hope it's exactly the same for nVidia and ATI, but experience teaches me that probably won't happen. I'd happily buy a new nVidia or ATI card and be the guinea pig for all this.
I was talking to the Oculus Rift guys at Quakecon today. I'm trying to convince them to release their libraries under the GPL or a BSD/ISC-style license, and to write support upstream in SDL.
I'd be willing to buy a pair of Rift goggles and ship them to anyone who is willing and able to write support into Spring, as long as they ship them back to me when done. :) Or I could just remotely test while you write the code.
They're an absolutely mind-blowing experience and I'm really looking forward to seeing what people are going to do with them. Spring could really be on the front lines this time.
The Rift uses relatively simple optics, which means there's quite a bit of geometric distortion, chromatic aberration, and other artifacts. These have to be "fixed" in software with fragment shaders; that's what the libraries would be for.
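The geometric fix is a radial "pre-warp": the lens adds pincushion distortion, so the fragment shader samples the rendered scene through the inverse barrel warp. Here's a sketch of that warp in Python; the polynomial form is the common one for lens correction, but the coefficients below are illustrative, not the Rift's real values.

```python
# Sketch of the radial pre-warp a Rift-style fragment shader performs:
# each output fragment at radius r from the lens center samples the
# scene texture at a radially scaled coordinate, cancelling the lens's
# pincushion distortion. Coefficients are made up for illustration.

K = (1.0, 0.22, 0.24, 0.0)  # assumed distortion coefficients k0..k3

def warp(x, y):
    """Map an output fragment (relative to lens center, in [-1, 1])
    to the texture coordinate it should sample from."""
    r2 = x * x + y * y
    scale = K[0] + K[1] * r2 + K[2] * r2 * r2 + K[3] * r2 ** 3
    return x * scale, y * scale

print(warp(0.0, 0.0))  # the center is untouched: (0.0, 0.0)
print(warp(0.5, 0.0))  # off-center fragments sample further out
```

Chromatic aberration is handled the same way, just with slightly different coefficients per color channel.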
The device itself is basically just a 1280x800 screen divided in half, 640x800 per eye. It takes a single DVI cable, so you'd just do standard left / right frame packing to drive it.
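The frame packing amounts to splitting the panel into two viewports and rendering the scene once into each with a shifted eye position. A minimal sketch of the viewport math, assuming the 1280x800 panel described above:

```python
# Sketch of side-by-side frame packing for a single 1280x800 panel:
# each eye gets a 640x800 viewport, and the renderer draws the scene
# twice with horizontally offset eye positions.

DISPLAY_W, DISPLAY_H = 1280, 800

def eye_viewports(width, height):
    """Return (x, y, w, h) viewport rects for the left and right eye."""
    half = width // 2
    left = (0, 0, half, height)
    right = (half, 0, half, height)
    return left, right

left, right = eye_viewports(DISPLAY_W, DISPLAY_H)
print(left)   # (0, 0, 640, 800)
print(right)  # (640, 0, 640, 800)
```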
Also, the head tracking, being an input device, is probably best supported partly in SDL. I'm sure you guys would much prefer being handed six-degree-of-freedom coordinates for where the head is pointing by SDL, rather than building anything into Spring itself.