I have made an application that uses VBOs to render large volumes of data, around 160 MB total on the GPU. On Windows Vista the program crashes, and the OS tells me that the display driver has crashed but recovered. Java throws no exceptions, however, and the program keeps running; it just doesn't render anything.
The same program works fine with smaller data sets, around 80 MB. The weird part is that it also worked fine with 160 MB under Windows XP. And under XP, if I tried to allocate buffers that were too large for the GPU, it threw exceptions, so at least I got some errors to work with.
What could be happening here? Does Windows Vista reserve more GPU memory for itself, so that my program doesn't have enough left? Is there a way to get this working under Vista, or will I just have to keep the larger data sets on XP?
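One workaround I'm considering is splitting the single large upload into several smaller VBOs, since the ~80 MB case works. Below is a minimal sketch of just the chunking arithmetic; the class name, method name, and the 32 MB chunk size are my own placeholders, not anything from the real app, and the actual glGenBuffers/glBufferData calls (with a glGetError check after each upload) would go where the comment indicates.

```java
import java.util.ArrayList;
import java.util.List;

public class VboChunker {
    // Hypothetical per-VBO budget; 32 MB is an arbitrary guess, not a known limit.
    static final long CHUNK_BYTES = 32L * 1024 * 1024;

    // Returns the byte size of each chunk needed to cover totalBytes.
    static List<Long> chunkSizes(long totalBytes) {
        List<Long> sizes = new ArrayList<>();
        for (long off = 0; off < totalBytes; off += CHUNK_BYTES) {
            sizes.add(Math.min(CHUNK_BYTES, totalBytes - off));
        }
        return sizes;
    }

    public static void main(String[] args) {
        long total = 160L * 1024 * 1024; // the ~160 MB data set
        List<Long> sizes = chunkSizes(total);
        // For each chunk: glGenBuffers, glBufferData with that size,
        // then check glGetError() for GL_OUT_OF_MEMORY before continuing.
        System.out.println(sizes.size() + " chunks, first chunk " + sizes.get(0) + " bytes");
    }
}
```

That would at least tell me whether Vista chokes on one big allocation or on the total footprint.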