
Confusing but probably logical behaviour...

Started by
4 comments, last by Adam_42 4 years, 5 months ago

I am playing around with a C++ project using DirectX, mostly out of curiosity and for the plain fun of learning things. At the moment it's a large grid (a plane, basically) and a player-controlled box that I control with a gamepad. It can run and jump. There are no real performance issues even though the grid is about 30,000 vertices and no culling is done. Yet.

I am using the DX::StepTimer for delta time between frames and it seems robust.

However, I noticed a HUGE difference in DT between frames when the application is running in the background. It is WAY faster when the app is in the background. So I started to dump the DT to the console every frame just to check. When the app is active the DT is about 10 ms, but when I tab to any other app (or click on the desktop) it shrinks to below 1 ms. It is still rendering. Shouldn't it be the other way around?

I am probably being really stupid and missing something, but I can't for the *** figure it out.

So WTH is the app NOT doing when I am tabbed out? Should I create a game that exclusively runs behind other windows?

Any insight or nudge in the right direction is greatly appreciated.


Is the frame rate locked to vertical sync? I think the user can set this as a preference in the driver. I think all background apps in reality render to a texture, and the window manager composites them to the framebuffer. The client area of the foreground app can, in principle, be rendered to directly, but I read this works only in fullscreen exclusive mode. Still, the sync may already happen.

Redrawing is a very performance-intensive operation for every OS, so they optimize heavily to avoid degrading the performance of your desktop when several programs are open, such as Photoshop together with Unity and Maya. To achieve this, the OS tracks which window has focus and which parts of a window are visible, and it manages those windows in the background.

VSync happens regardless of whether you are in the background or have focus, but it depends on the graphics card driver, user settings and program settings. VSync is a function that has to be turned on explicitly if it is not active by default in the driver, and vice versa. I ran a large test session on this flag (it is no more than a flag) on several machines using NVidia and AMD drivers with OpenGL.
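In Direct3D, for example, the program-side part of that flag is the sync interval passed to the swap chain's Present call each frame. A minimal sketch, assuming an already-created IDXGISwapChain (the helper name PresentFrame is made up for illustration):

```cpp
#include <dxgi.h>

// Present one frame. vsync == true waits for the vertical blank
// (sync interval 1); false presents immediately (sync interval 0).
// Note: the driver control panel can still override this globally.
HRESULT PresentFrame(IDXGISwapChain* swapChain, bool vsync)
{
    const UINT syncInterval = vsync ? 1u : 0u;
    return swapChain->Present(syncInterval, 0);
}
```

Checking the return value is worthwhile, since Present is also where device-removed errors surface.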

However, it might make a difference whether the program is really running in the background, partially or fully covered by another window, or minimized. If you minimize the window, Windows, for example, shrinks your window size to something around 177 x 50 px. Rendering to such a small window is faster than rendering at Full HD, 4K or 8K.

I see.

I have tried a few combinations of partly covered and minimized modes, but there isn't any noticeable difference in DT. As long as the application does not have focus the DT is very small, and vice versa.

Today I experimented with changing the refresh rate, turning VSync on and off, and disabling G-Sync (which to my knowledge shouldn't impact these things), but apart from naturally getting worse when going from 100 Hz to 60 Hz, the DT does not change much.

Have you tried using a profiler to find out what's different between the two cases?

Are you processing all of the windows messages every frame?
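For reference, the usual pattern is to drain the entire message queue with PeekMessage before each update/render pass; if the loop instead handles one message per frame, or blocks in GetMessage, timing can differ dramatically between the focused case (lots of input messages) and the unfocused case (almost none). A minimal sketch of the standard non-blocking loop (Update and Render are hypothetical placeholders):

```cpp
#include <windows.h>

// Standard game loop: empty the message queue each frame,
// then update and render exactly once.
void RunMessageLoop()
{
    bool running = true;
    while (running)
    {
        MSG msg = {};
        // PM_REMOVE pops each message; loop until the queue is empty.
        while (PeekMessage(&msg, nullptr, 0, 0, PM_REMOVE))
        {
            if (msg.message == WM_QUIT)
            {
                running = false;
                break;
            }
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }

        if (running)
        {
            // Update();  // game logic goes here
            // Render();  // draw one frame
        }
    }
}
```

A profiler, as suggested above, would quickly show whether the focused frames are spending their extra time in message handling or somewhere else.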

This topic is closed to new replies.
