I recently began reading Introduction to Algorithms (great book, check it out), and it got me wondering: the running times of the algorithms analysed in the book are expressed as functions of the input size in asymptotic notation, but can the same be done for algorithms such as Screen Space Ambient Occlusion or indirect-illumination approximation algorithms? How are the running times of such algorithms expressed, and how would one go about analysing and comparing the running times of two computer graphics algorithms (for example, SSAO and HBAO)?

The running time of those algorithms would be expressed in terms of the number of input pixels they process (and, for sampling-based techniques like SSAO, the number of samples taken per pixel). Analysing them comes down to understanding exactly what each algorithm does per pixel. I'm not overly familiar with their exact implementations, so I can't tell you more than that.
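To make that concrete, here is a minimal sketch of the kind of cost model you would write down. It assumes a naive SSAO-style pass that evaluates a fixed number of occlusion samples per pixel (the function name and parameters are hypothetical, purely for illustration), so the total work is O(n · k) for n pixels and k samples per pixel:

```python
def ssao_sample_count(num_pixels, samples_per_pixel):
    """Total occlusion samples evaluated by a naive SSAO-style pass.

    Work is linear in the pixel count for a fixed per-pixel sample
    budget: O(n * k), with n = num_pixels and k = samples_per_pixel.
    """
    return num_pixels * samples_per_pixel

# At a fixed sample budget, quadrupling the pixel count (e.g. doubling
# both screen dimensions) quadruples the work:
base = ssao_sample_count(1920 * 1080, 16)
hires = ssao_sample_count(2 * 1920 * 2 * 1080, 16)
assert hires == 4 * base
```

Comparing two screen-space techniques analytically then reduces to comparing their per-pixel work (samples taken, texture fetches per sample, and so on), since both typically touch every pixel once; in practice the constants dominate, which is why such comparisons are usually settled by benchmarking on real hardware.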