path: root/src/wgl/wglinfo.c
author    José Fonseca <>    2014-11-19 18:10:39 +0000
committer José Fonseca <>    2014-11-19 21:05:20 +0000
commit    a819167f7daf8fe060efd75d02825b082f1fd255 (patch)
tree      33c3d0d9205c7336af91233c4d7832eae203bb04 /src/wgl/wglinfo.c
parent    bcd5a7d93770a1cd83a574c9d1026a45d48cc309 (diff)
wgl: Ensure PIXELFORMATDESCRIPTOR members are zeroed.
I suddenly started seeing many simple GL apps, including wglinfo, choosing
Microsoft's GDI OpenGL implementation, even though hardware accelerated pixel
formats were available. It turned out that:

- the screen was in 16bpp mode (some WHCK tests have the nasty habit of
  doing that)
- the NVIDIA OpenGL driver only reports R5G6B5 pixel formats (i.e., no alpha
  bits) in this case
- a non-zero cAlphaBits was being passed to ChoosePixelFormat (or, in the
  wglinfo case, garbage, as the structure wasn't being properly zeroed)
- ChoosePixelFormat will choose a SW pixel format if it has to in order to
  honour a non-zero cAlphaBits.

At least in the wglinfo-and-friends case the alpha bits are not needed, so
this change makes sure that HW accelerated formats will be chosen before SW
ones.

Reviewed-by: Roland Scheidegger <>
Diffstat (limited to 'src/wgl/wglinfo.c')
1 file changed, 1 insertion(+), 0 deletions(-)
diff --git a/src/wgl/wglinfo.c b/src/wgl/wglinfo.c
index 30b1307..b6285ec 100644
--- a/src/wgl/wglinfo.c
+++ b/src/wgl/wglinfo.c
@@ -123,6 +123,7 @@ print_screen_info(HDC _hdc, GLboolean limits, GLboolean singleLine,
+   memset(&pfd, 0, sizeof(pfd));
    pfd.cColorBits = 3;
    pfd.cRedBits = 1;
    pfd.cGreenBits = 1;