[quote]Or could you provide source code for a working example (tiny), so that I can run it on my own machine. Remote debugging == slow
[/quote]
I have an ATi card, so I might be able to debug it.
I can try to extract a small test application for you. It will take a few days for me to get around to doing this though!
Riven, maybe that’s true for a clever driver, but usually, at data-upload time, the buffer object is just a bunch of bytes to the driver. The actual data type only becomes known to the driver after a call to a glXXXPointer function.
chris0, another thing you could try is using 4 bytes instead of 3 for the color array (better data alignment). If that doesn’t work, could you post any unusual GL state you have enabled during rendering (line/point rendering, depth offset, etc.)?
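For illustration, the padding suggestion might look like the sketch below. The helper name is made up for this post; the idea is just to repack tightly-packed RGB bytes into 4-byte-per-vertex RGBA so each vertex’s colour starts on a 4-byte boundary, and then tell GL about it with glColorPointer(4, GL_UNSIGNED_BYTE, 0, ptr) instead of size 3:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical helper (not from the thread): repack tightly-packed RGB
 * bytes (3 per vertex) into RGBA bytes (4 per vertex) so every vertex's
 * colour lands on a 4-byte boundary. The padding byte doubles as an
 * opaque alpha component. */
static void pad_rgb_to_rgba(const unsigned char *rgb, unsigned char *rgba,
                            size_t vertex_count)
{
    for (size_t i = 0; i < vertex_count; ++i) {
        rgba[i * 4 + 0] = rgb[i * 3 + 0]; /* R */
        rgba[i * 4 + 1] = rgb[i * 3 + 1]; /* G */
        rgba[i * 4 + 2] = rgb[i * 3 + 2]; /* B */
        rgba[i * 4 + 3] = 255;            /* A: fills the padding byte */
    }
}
```

The GL side would then use a component count of 4, e.g. glColorPointer(4, GL_UNSIGNED_BYTE, 0, rgba) — assuming alpha is either ignored or blending is off.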
Yes, I was thinking that too. What I’ve just tried is using floats for colours instead of bytes (I remember reading somewhere that floats are the native format for colours as well). This has completely solved the problem: the RX600 now runs at well over 200fps (originally 3fps!). I guess I was making similar assumptions to Riven with regard to “automatic optimisations” that were in fact never made. I will also try bytes aligned on 4-byte boundaries later, and I’ll post back with the results, just for interest.
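For anyone hitting the same slowdown, the byte-to-float conversion is trivial; a minimal sketch (the helper name is my own, not from this thread) that normalises unsigned-byte components into the 0.0–1.0 range expected for float colours:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical helper (not from the thread): convert unsigned-byte
 * colour components (0..255) to normalised floats (0.0..1.0), the
 * format the RX600 driver apparently handles without a slow path. */
static void colours_to_float(const unsigned char *in, float *out, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        out[i] = (float)in[i] / 255.0f;
}
```

The float array would then be submitted with glColorPointer(3, GL_FLOAT, 0, colours) instead of GL_UNSIGNED_BYTE, at the cost of 4x the colour-data size.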
Thanks to both of you for your help.
Chris.