LabRat · Posted October 3, 2006

I am not sure how to categorise this one, or where would be the best place to put it: here or bug reports. I am running a nice fast system (AMD 3300+, ATI X700 256MB) and have set VSYNC to 'on' in the ATI OpenGL control panel. While running the game with %limit_fps 0, CPU usage sits as near to 100% as the game can manage without crashing Windows, at a constant 50FPS (I am running on TV out until I replace my exploded monitor). If I %limit_fps 50, CPU usage drops to around 12-20%, so at %limit_fps 0 it appears frames are still being drawn even though the screen never displays them. Upgrading the SDL libs to 1.2.11 introduces a new attribute, SDL_GL_SWAP_CONTROL, which lets the program set/unset VSYNC itself. I can't find any way to query the monitor's current refresh rate through SDL anywhere on Google, but if a way could be found, then setting %limit_fps to match it should reduce processor usage immensely. In the meantime, maybe capping %limit_fps at a max of 100 or 85 or 72 or 60 (72 sounds a fair number to me) would help reduce CPU usage. If %limit_fps is set to 0, then just let it continue as-is for benchmarking purposes. Anyway, if you can make sense of that lot, feel free to comment.
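To illustrate, here's a minimal sketch (not EL's actual code) of how the new attribute could be used. The window size/bpp are placeholders, and the Windows-only refresh rate query is my own suggested workaround, since SDL 1.2 itself has no portable way to ask for it:

[code]
/* Sketch only: enable vsync via SDL 1.2.11's SDL_GL_SWAP_CONTROL.
 * The attribute must be set *before* SDL_SetVideoMode() creates
 * the GL context. */
#include <string.h>
#include <SDL.h>

#ifdef _WIN32
#include <windows.h>
/* SDL 1.2 can't report the refresh rate, but on Windows the Win32
 * API can give the current mode's frequency (returns 0 if unknown). */
static int get_refresh_rate(void)
{
    DEVMODE dm;
    memset(&dm, 0, sizeof dm);
    dm.dmSize = sizeof dm;
    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm))
        return (int)dm.dmDisplayFrequency;
    return 0;
}
#endif

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    /* Ask the driver to sync buffer swaps to the monitor's retrace. */
    SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 1);

    if (SDL_SetVideoMode(800, 600, 0, SDL_OPENGL) == NULL) {
        SDL_Quit();
        return 1;
    }

    /* ... render loop: SDL_GL_SwapBuffers() now waits for the
     * vertical retrace, capping frames at the refresh rate ... */

    SDL_Quit();
    return 0;
}
[/code]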
0ctane · Posted October 3, 2006

I am a bit surprised that your CPU load is near 100%. At limit_fps of zero I get a full GPU load, but not full CPU. Regardless, when I actually use an FPS limit, typically 50fps, the reported fps always comes out quite a bit lower, around 35-40fps. I have noticed this effect both on my Athlon 2200 / GeForce 440MX and on the more powerful Macs I use. I highly recommend an FPS limit, otherwise you are going to stress out that video card; EL will try to use every ounce of GPU power possible at limit_fps=0. I thought the NTSC standard was 30fps for broadcast TV. I am not sure about PAL. Do (non-HD) TVs go up to 50fps? I am not sure.
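Just a guess on why the reported fps lands under the limit (pure speculation, I have not read EL's limiter code): if it works anything like the sketch below, SDL_Delay() only guarantees a minimum wait, and on many systems it rounds up to the scheduler tick (often ~10ms), so a 50fps target can easily end up at 35-40fps:

[code]
/* Hypothetical frame limiter, not EL's actual code. SDL_Delay()
 * waits *at least* the requested time; with ~10ms scheduler
 * granularity, a 50fps target (20ms/frame) that has e.g. 15ms
 * left to kill may sleep ~20ms, landing nearer 35-40fps. */
#include <SDL.h>

static void limit_fps(Uint32 frame_start, Uint32 target_fps)
{
    Uint32 frame_ms = 1000 / target_fps;   /* 20ms at 50fps */
    Uint32 elapsed  = SDL_GetTicks() - frame_start;

    if (elapsed < frame_ms)
        SDL_Delay(frame_ms - elapsed);     /* may oversleep */
}
[/code]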
LabRat · Posted October 3, 2006 (edited November 15, 2006)

Picture 1: Task Manager with %limit_fps 50
Picture 2: Task Manager with %limit_fps 0

The PAL standard is 25 frames per second interlaced; TV-out graphics cards just run at 50Hz and output half the picture (one field) on each pass. I normally run on %limit_fps 12 as a rule: I don't watch the game closely enough for the slight jerkiness to bother me. I was just checking my benchmarks as I have just overclocked my CPU from 2GHz (3300+) to 2.2GHz (3600+).

Edit: The bottom picture shows where I loaded MSPaint, if you are wondering about the extra threads etc.
emajekral · Posted October 6, 2006 (edited October 6, 2006)

I've noticed both behaviours as well: the GPU one in Linux, the other in Windows. Dunno how to solve it though, so unless I'm doing speed tests I set limit_fps to about 5fps over my goal and enable vsync too. That refresh rate looked right to me, LabRat. I hear that American movies generally come out pitched too high in PAL format because they just speed the film up from 24fps to 25fps. Is that true? (Offtopic, I know.)
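If they really do just run the film faster, the size of the shift is easy to work out; quick back-of-envelope program (my own throwaway, nothing official):

[code]
/* Speeding 24fps film up to 25fps: pitch shift in semitones
 * is 12 * log2(25/24). Needs C99 (log2); link with -lm. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double ratio = 25.0 / 24.0;
    printf("speedup: %.2f%%\n", (ratio - 1.0) * 100.0);          /* ~4.17% */
    printf("pitch shift: %.2f semitones\n", 12.0 * log2(ratio)); /* ~0.71  */
    return 0;
}
[/code]

So about a 4% speedup and roughly 0.7 of a semitone, if true.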
LabRat · Posted October 6, 2006

Nah, the movie is still played in realtime; my TV and video just play back at NTSC speed, so it makes no difference at all. </offtopic>
majestyk · Posted October 10, 2006

emajekral wrote: "I hear that American movies are generally pitched too high in PAL format because they just speed up the film from 24fps to 25fps. Is that true?"

I had a similar effect while at university. For our cinema project we had very old Russian 35mm projectors. Due to the different voltage applied, they always ran a bit faster than in their country of origin; we calculated the speed to be around 26.2fps, which works out to a runtime about 5-7 minutes shorter over the length of a complete Hollywood movie (a 120-minute film at 25fps is 180,000 frames; played at 26.2fps that takes roughly 114.5 minutes). You didn't really notice the difference while watching, but you always got home early. Nice way to save time. :-)
Entropy · Posted October 12, 2006

The difference is more apparent in the audio.