
Supervision work is due by 4pm (16:00) on the day before the supervision. Partial work is perfectly fine as long as it is clear that you made an effort and your work shows where you got stuck.

Feel free to attach any questions to your work.

Supervision 1

Warmup questions

Only short answers, please

  1. Watch NVIDIA's real-time ray tracing demo from 2020 to get in the mood
  2. What are the ray parameters of the intersection points between ray (1,1,1) + t(−1,−1,−1) and the sphere centred at the origin with radius 1?
  3. Why do we need anti-aliasing?
  4. What is the difference between a point light source and an area light source?
  5. We use a lot of triangles to approximate stuff in computer graphics. Why are they good? Why are they bad? Can you think of any alternatives?
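
For warmup question 2, it may help to set up the general quadratic for a ray o + t·d against a sphere of radius r centred at the origin. Below is a minimal Python sketch of that setup; the example at the bottom deliberately uses a *different* ray than the one in the question, so it does not give the answer away.

```python
import math

def ray_sphere(o, d, r):
    """Return the ray parameters t where o + t*d meets the sphere |p| = r.

    Substituting p = o + t*d into p.p = r^2 gives the quadratic
    (d.d) t^2 + 2 (o.d) t + (o.o - r^2) = 0.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    a = dot(d, d)
    b = 2 * dot(o, d)
    c = dot(o, o) - r * r
    disc = b * b - 4 * a * c
    if disc < 0:
        return []                      # ray misses the sphere
    s = math.sqrt(disc)
    return sorted([(-b - s) / (2 * a), (-b + s) / (2 * a)])

# Example ray: origin (0, 0, -2) looking down +z at a unit sphere.
print(ray_sphere((0, 0, -2), (0, 0, 1), 1))   # hits at t = 1 and t = 3
```

For the sheet's ray you would substitute its origin and direction in the same way and solve by hand.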

Longer questions

Please give more detailed answers to these

  1. 2017 Paper 4 Question 3 (a)
  2. What information would you need to define the volume of space that might be visible from the camera during rendering (research view frustum if in doubt)?
  3. Write pseudo-code for the ray tracing algorithm, where the first line of code is as stated below.
    (if you use slide 32, make sure you explain each line in detail)

            for each pixel:          ...

  4. Explain how ray tracing can achieve the following effects:
    • reflections
    • refraction
    • shadows
  5. Provide two examples of distributed ray tracing and explain how the selected techniques work.
  6. Describe the Model, View, and Projection transformations. Comment on why we use homogeneous co-ordinates.
  7. When transforming objects into world co-ordinates using a matrix M, position vectors are pre-multiplied by M. Discuss whether this matrix is also suitable for transforming the objects' normals. If not, can you suggest an alternative?
  8. 2010 Paper 4 Question 4
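
For longer question 6, one concrete way to see the point of homogeneous co-ordinates is that translation, which is not linear in 3D, becomes a single 4×4 matrix multiply. A minimal sketch (plain lists, no libraries assumed):

```python
def mat_vec(M, v):
    """Multiply a 4x4 matrix (list of rows) by a 4-vector."""
    return [sum(M[i][j] * v[j] for j in range(4)) for i in range(4)]

def translation(tx, ty, tz):
    """4x4 homogeneous translation matrix."""
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

# A position has w = 1, so the translation column takes effect;
# a direction has w = 0, so the same matrix leaves it unmoved.
T = translation(5, 0, 0)
print(mat_vec(T, [1, 2, 3, 1]))  # position  -> [6, 2, 3, 1]
print(mat_vec(T, [1, 2, 3, 0]))  # direction -> [1, 2, 3, 0]
```

The same w = 1 vs w = 0 distinction is worth keeping in mind for question 7.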

Supervision 2

Warmup questions on OpenGL

  1. What is OpenGL?
  2. How is Vulkan different from OpenGL?
  3. Put the following stages of the OpenGL rendering pipeline in the correct order. Very briefly explain what each stage does and comment on whether each stage is programmable.
    • Rasterization
    • Vertex shader
    • Fragment shader
    • Primitive setup
    • Clipping
  4. Search for "normal map" images on the internet. Why do they tend to have an overall blue shade?
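
For warmup question 4, the usual tangent-space convention stores a normal component in [-1, 1] as a colour channel in [0, 1] via c = (n + 1) / 2. A small sketch to experiment with (the x, y, z → R, G, B channel order is the common convention, but treat it as an assumption here):

```python
def encode_normal(n):
    """Map a unit normal with components in [-1, 1] to RGB in [0, 1]."""
    return tuple((c + 1) / 2 for c in n)

# Encode a few normals of a mostly-flat surface and look at which
# channel dominates in the resulting colours.
for n in [(0, 0, 1), (0.3, 0.1, 0.95), (-0.2, 0.2, 0.96)]:
    print(n, "->", encode_normal(n))
```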

Longer questions on OpenGL

  1. Describe the Z-buffer algorithm. Compare the projection matrix on slide 86 with the projection matrix in 2010 Paper 4 Question 4, and discuss which one you need to use for Z-buffering.
  2. What is the worst-case scenario, in terms of the number of times a pixel colour is computed, when rendering N triangles using the Z-buffer algorithm? How could we avoid such a worst-case scenario?
  3. How could you use the following texture types to texture a sphere in OpenGL?
    • 2D
    • 3D
    • CUBE_MAP
    How do these techniques compare in terms of visual quality and storage?
  4. For downsampling an image, briefly explain how each of the following sampling techniques works (feel free to use khronos.org when unsure). Find or generate some illustrations of typical artefacts where relevant. Discuss performance, storage and visual quality.
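
For longer question 1, the heart of the algorithm is a per-fragment depth test. Here is a minimal Python sketch over a made-up stream of (x, y, depth, colour) fragments; this is an illustration of the idea, not OpenGL's actual API:

```python
import math

def zbuffer(width, height, fragments):
    """fragments: iterable of (x, y, depth, colour); smaller depth = nearer.

    Keeps, per pixel, the colour of the nearest fragment seen so far.
    """
    depth = [[math.inf] * width for _ in range(height)]
    frame = [[None] * width for _ in range(height)]
    for x, y, z, colour in fragments:
        if z < depth[y][x]:          # depth test
            depth[y][x] = z
            frame[y][x] = colour     # write only if the test passes
    return frame

# Two fragments land on the same pixel; the nearer (red) one wins,
# regardless of submission order.
img = zbuffer(2, 1, [(0, 0, 5.0, "blue"), (0, 0, 2.0, "red")])
print(img)   # [['red', None]]
```

Note the sketch still *shades* (here: carries a colour for) every fragment before testing, which is relevant to the worst-case count asked about in question 2.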

(Short) questions on colour & perception

  1. What is an image? How are digital images represented in memory?
  2. What is quantisation?
  3. What is colour banding?
  4. What is the difference between luma and luminance?
  5. Why is gamma correction needed?
  6. What are the differences between rods and cones?
  7. How can two different colour spectra appear the same? What is this phenomenon, and what are such spectra, called?
  8. What is the relation between LMS cone sensitivities, CIE XYZ and the RGB space of a monitor?
  9. Explain the purpose of tone-mapping and display-encoding steps in a rendering pipeline.
  10. What is the rationale behind sigmoidal tone-curves?
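
For questions 5 and 9, it can help to print what a gamma curve actually does to mid-grey. The sketch below uses the simple power-law approximation with gamma = 2.2; the real sRGB transfer function adds a linear toe near black, which is omitted here.

```python
def encode(linear, gamma=2.2):
    """Display-encode a linear light value in [0, 1] (power-law approximation)."""
    return linear ** (1 / gamma)

def decode(encoded, gamma=2.2):
    """Invert the encoding back to linear light."""
    return encoded ** gamma

# A dark linear value is stored as a much larger code value, so more
# of the available code values are spent on dark tones, roughly
# matching the eye's greater sensitivity there.
print(encode(0.18))                    # noticeably larger than 0.18
print(round(decode(encode(0.5)), 3))   # round-trips back to 0.5
```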