Eternal Lands Official Forums
LabRat

%limit_fps bug/suggestion


I am not sure how to categorise this one, or where best to put it: here or bug reports.

 

I am running a nice fast system (AMD 3300+, ATI x700 256MB), and have set VSYNC to 'on' in the ATI OpenGL control panel.

 

While running the game with %limit_fps 0, CPU usage is as near to 100% as the game can manage without crashing Windows, and I get a constant 50 FPS (I am running on TV-out until I replace my exploded monitor).

 

If I set %limit_fps 50, my CPU usage drops to around 12%-20% - it appears that without the limit, frames are still being drawn even though the screen never displays them.

 

Upgrading the SDL libs to 1.2.11 introduces a new attribute, SDL_GL_SWAP_CONTROL, which allows the program to set/unset VSYNC.
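For reference, here is a minimal sketch of how that attribute would be used. This is the plain SDL 1.2 API, not EL's actual init code; the function name and window parameters are illustrative only.

```c
#include <SDL.h>

/* Sketch only: turn on VSYNC via SDL 1.2's SDL_GL_SWAP_CONTROL.
   The attribute must be set BEFORE the OpenGL video mode is created. */
int init_video_with_vsync(int width, int height)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return -1;

    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 1); /* 1 = wait for vertical retrace on swap */

    if (SDL_SetVideoMode(width, height, 0, SDL_OPENGL) == NULL)
        return -1;

    return 0;
}
```

With swap control on, SDL_GL_SwapBuffers() should wait for the retrace, which is what ought to stop the game rendering frames the screen never shows.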

 

I can't see anywhere on Google how to get the monitor's current refresh rate using SDL, but if a way could be found, then setting %limit_fps to match it should reduce processor usage immensely when VSYNC is activated.

 

In the meantime, maybe forcing %limit_fps to a max of 100, 85, 72 or 60 (72 sounds like a fair number to me) would help reduce the CPU usage.

 

If %limit_fps is set to 0, then just let it continue as-is for benchmarking purposes.
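The suggested clamp is trivial to sketch. The cap of 72 below is just the number floated above, not anything EL actually uses, and frame_delay_ms is a hypothetical helper showing what the limit would translate into for the main loop (e.g. via SDL_Delay).

```c
/* Cap the requested limit, but keep 0 as "uncapped" for benchmarking. */
#define MAX_LIMIT_FPS 72

int clamp_limit_fps(int requested)
{
    if (requested == 0)   /* 0 keeps the uncapped benchmark mode */
        return 0;
    return requested > MAX_LIMIT_FPS ? MAX_LIMIT_FPS : requested;
}

/* Milliseconds the main loop would sleep per frame at a given limit. */
int frame_delay_ms(int limit_fps)
{
    return limit_fps > 0 ? 1000 / limit_fps : 0;
}
```

So %limit_fps 200 would silently behave like %limit_fps 72, while %limit_fps 0 still runs flat out.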

 

Anyway, if you can make sense of that lot, feel free to comment.


I am a bit surprised that your CPU load is near 100%. At a limit_fps of zero I get a full GPU load, but not full CPU. Regardless, when I actually use an fps limit, typically 50fps, the reported fps always seems to come out noticeably lower, 35-40fps. I have noticed this effect with both my Athlon 2200 / GeForce 440MX and the more powerful Macs I use. I highly recommend an FPS limit; otherwise you are going to stress out that video card, since EL will try to use every ounce of GPU power possible at limit_fps=0.

 

I thought the NTSC standard was 30fps for broadcast TV. I am not sure about PAL. Do (non-HD) TVs go up to 50fps? I am not sure.


Picture 1: Task Manager with %limit_fps 50 (cpu1.jpg)

Picture 2: Task Manager with %limit_fps 0 (cpu2.jpg)

 

The PAL standard is 25 frames per second interlaced, but TV-out graphics cards just run at 50Hz and output half the picture (one field) on each pass.

 

I normally run on %limit_fps 12 as a rule - I don't watch the game closely enough for the slight jerkiness to bother me - but I was just checking my benchmarks, as I have just overclocked my CPU from 2 GHz (3300+) to 2.2 GHz (3600+).

 

Edit: the bottom picture shows where I loaded MSPaint, in case you are wondering about the extra threads etc.

Edited by LabRat


I've noticed both behaviours as well. The GPU one in Linux, the other in Windows. Dunno how to solve though, so unless I'm doing speed tests I set limit_fps to about 5 fps over my goal and enable vsync too.

 

That refresh rate looked right to me labrat :)

 

I hear that American movies are generally pitched too high in PAL format because they just speed up the film from 24fps to 25fps. Is that true? (Offtopic, I know)

Edited by emajekral


Nah, the movie is still played in realtime; my TV and video just play back at NTSC speed, so it makes no difference at all.



I had a similar effect while being at University. For our cinema project, we had very old russian 30mm projectors. Due to the different voltage applied they ran always a bit faster than in their original country; we calculated the speed to be around 26.2 fps, which averages to about 5-7 minutes shorter movie runtimes over the length of a complete hollywood movie. You didn't really notice the difference while watching, but you always got home early. Nice way to save time. :-)
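The arithmetic behind that is easy to check: at that speed a film finishes in 24/26.2, roughly 92%, of its nominal runtime, so the minutes saved depend on the film's length. A quick sketch (the 120-minute figure below is purely illustrative):

```c
/* A film shot at shot_fps but projected at played_fps finishes in
   (shot_fps / played_fps) of its nominal runtime. */
double projected_minutes(double nominal_minutes, double shot_fps, double played_fps)
{
    return nominal_minutes * shot_fps / played_fps;
}
```

For an illustrative 120-minute feature, projected_minutes(120.0, 24.0, 26.2) comes out at about 110 minutes, so longer films save proportionally more time.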



The difference is more apparent in the audio.

