Our eyes see things by receiving light that objects emit or reflect. Monitors fool our eyes into seeing something by emitting that same amount of light. Game engines aim to construct realistic image data and transform it so that it correctly reaches the user's eye.
Monitor / TV Reproduction of Luminance
Let's say we're looking at an apple in an indoor bedroom environment:
What you are seeing above is light being emitted by your monitor in an attempt to reproduce light that was reflected off the real apple. If the real apple reflects 40 nits on the right side, then the monitor would ideally emit 40 nits from the corresponding pixels. If the monitor is in the same lighting environment as the apple, and the monitor emits the exact same amount of light over the image as the apple reflects, then it should look just like the apple (ignoring 3D / depth perception).
If for example the image data is too bright, then you see something that looks less realistic:
Of course the real-world problem is more complicated: if you are looking at the same photo in a brightly lit environment, then 40 nits would appear almost invisible. If you're looking at the same 40 nits in a completely dark room with the lights off, it would be way too bright. You often see this problem with smartphones; they try to adjust the brightness of the screen to match the ambient lighting.
Let's say we calculated that our apple object should reflect 40 nits. We want to give the monitor some signal that says "this pixel is 40 nits", assuming a perfectly dark room, and add a brightness adjustment option to the monitor and the game to compensate for differences in ambient lighting. How do we tell the monitor this?
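As a minimal sketch of the brightness-adjustment idea: we compute target luminance assuming a dark room, then scale it by a user-controlled setting before sending it to the display. The function name and the 2.5x factor below are illustrative assumptions, not values from any standard.

```python
# Hypothetical brightness adjustment applied in linear light,
# before the value is encoded for the monitor.

def apply_brightness(nits: float, brightness: float) -> float:
    """Scale a target luminance by a user brightness setting.

    brightness = 1.0 corresponds to the assumed perfectly dark
    reference room; larger values compensate for brighter ambients.
    """
    return nits * brightness

# Dark room: emit the 40 nits we computed for the apple.
dark = apply_brightness(40.0, 1.0)    # 40.0 nits
# Bright room: boost output (2.5x here is an assumed factor).
bright = apply_brightness(40.0, 2.5)  # 100.0 nits
```

Real monitors and games expose this as a single slider rather than a luminance multiplier, but the underlying effect is the same kind of scaling.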
Monitors generally conform to some sort of specification; most LDR monitors take and display the rec.709 gamma curve. This means any pixel value we output will be interpreted by the monitor as rec.709 data (whether it should be or not), which the monitor will transform into linear data and set the pixel to that intensity. Rec.709 corresponds to a display gamma curve of approximately y = x ^ 2.2.
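To make the round trip concrete, here is a small sketch of the pipeline described above: the engine encodes a target luminance with the inverse of the monitor's gamma curve, and the monitor decodes it with y = x ^ 2.2. The 100-nit reference white is an assumption for illustration; actual displays vary.

```python
# Sketch of gamma encoding/decoding. PEAK_NITS = 100 is an assumed
# reference white, not a value from the rec.709 specification.

GAMMA = 2.2        # approximate display gamma, per the text
PEAK_NITS = 100.0  # assumed peak luminance of the monitor

def encode_signal(nits: float) -> float:
    """Convert a desired luminance (nits) into a [0, 1] signal value.

    We apply the inverse of the monitor's gamma curve, so that the
    monitor's decode (x ** 2.2) lands back on our target luminance.
    """
    linear = max(0.0, min(1.0, nits / PEAK_NITS))
    return linear ** (1.0 / GAMMA)

def monitor_decode(signal: float) -> float:
    """What the monitor does: interpret the signal as gamma-encoded
    (y = x ** 2.2) and emit that fraction of its peak luminance."""
    return (signal ** GAMMA) * PEAK_NITS

signal = encode_signal(40.0)      # tell the monitor "this pixel is 40 nits"
emitted = monitor_decode(signal)  # the monitor reproduces ~40 nits
```

The important point is that the encode and decode are inverses: as long as both sides agree on the curve, the luminance we computed in the engine is what comes out of the panel.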