
Colour signal: \[ C(\lambda) = E(\lambda) \cdot S(\lambda) \]
Sensor response: \[\rho = \int_\lambda C(\lambda)\cdot R(\lambda)~\mathrm d\!\lambda = \int_\lambda E(\lambda) \cdot S(\lambda)\cdot R(\lambda)~\mathrm d\!\lambda\]
How to obtain colour information?
Electrons jump between orbits: when they drop from a higher-energy orbit to a lower-energy one, they emit a photon.
The emitted radiation has energy: \[Q = h \nu = \frac{h c}{\lambda}\]
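The formula above can be checked numerically. This is a minimal sketch, assuming standard CODATA values for \(h\) and \(c\) and an arbitrarily chosen green wavelength as the example:

```python
# Energy of a single photon, Q = h * c / lambda.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s

def photon_energy(wavelength_m):
    """Return the energy (in joules) of one photon of the given wavelength (in metres)."""
    return h * c / wavelength_m

# A 550 nm (green) photon carries roughly 3.6e-19 J.
q_green = photon_energy(550e-9)
```

Shorter wavelengths give higher photon energies, which is why the formula has \(\lambda\) in the denominator.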
Depends on power
\(\varphi\) angle to surface normal
In addition to radiometric measures, we use spectral radiometric measures by defining the previous measures per unit wavelength or frequency interval

The wavelength of the maximum emitted radiation is inversely proportional to the temperature \[\lambda_{max} = \frac {2.897\cdot 10^{-3}} T\] where \(\lambda_{max}\) is in metres and \(T\) in kelvin.
As the temperature increases, the amount of radiation at each wavelength increases, with the total radiant excitance being proportional to \(T^4\)
\[M = \int_0^\infty M_\lambda(T)\mathrm d\lambda = \sigma T^4\]
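Both laws are easy to sketch in code. The temperatures below are illustrative choices (roughly the sun's surface and a tungsten filament at the CCT of illuminant A), not values from the text:

```python
# Wien's displacement law and the Stefan-Boltzmann law for a black body.
WIEN_B = 2.897e-3   # Wien's displacement constant, m*K
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W*m^-2*K^-4

def peak_wavelength(T):
    """Wavelength (m) of maximum spectral exitance at temperature T (K)."""
    return WIEN_B / T

def radiant_exitance(T):
    """Total radiant exitance M (W*m^-2) at temperature T (K)."""
    return SIGMA * T ** 4

# The sun (~5800 K) peaks near 500 nm, inside the visible range;
# a tungsten filament (~2856 K) peaks in the infrared.
lam_sun = peak_wavelength(5800)
lam_bulb = peak_wavelength(2856)
```

Note how doubling the temperature halves \(\lambda_{max}\) but multiplies the total exitance by \(2^4 = 16\).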
The human vision system has a Colour Constancy property: we quickly adapt to the colour of the light.
In camera processing: we use White Balancing to estimate the illuminant and correct for it to mimic human colour constancy

Figure: design by Kasuga~jawiki; vectorization by Editor at Large; “The dress” modification by Jahobr, CC BY-SA 3.0.


Claude Monet: Cathédrale de Rouen
A series of 30 paintings of the same building at different times of day: the importance of the illuminant before the digital era.
When taking pictures, we want to perform white balance to correct the illuminant and obtain a plausible image. However, we do not want to apply it too strongly, as we want to preserve information about the illuminant, which is part of a picture and can set its tone/mood.
Typical tungsten-filament or incandescent lighting (used in light bulbs before LED lights became common)
These illuminants are early/obsolete approximations of daylight: B represents noon daylight while C represents average North sky daylight.
They are no longer used much as we now have much better estimations
The D series represents natural daylight. D65 is considered the standard illuminant.
D50 is used in the printing industry; it corresponds to horizon light and applies to many lighting situations.

D55 represents mid-morning / mid-afternoon daylight

D65 is the standard illuminant: it represents noon daylight and is used when mapping in and out of different devices (sRGB colour space)

D75 represents North sky daylight

Beyond these four canonical standards, it is possible to create D-like representations for other temperatures: all D illuminants are linear combinations of three SPDs.
While the D numbers represent the temperature (D65 stands for 6500 K), note that the exact CCT of the four canonical illuminants differ slightly from this value (the CCT of D65 is actually 6504 K). This is due to the SPDs of D illuminants being based on Planck’s law constants: our estimate of the constants is now more precise than what was known when the standard illuminants were created.
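The shift from nominal to effective CCT can be reproduced: when the D series was standardized, Planck's second radiation constant was taken as \(c_2 = 1.4380\cdot10^{-2}\ \mathrm{m\cdot K}\), whereas the modern value is \(1.4388\cdot10^{-2}\ \mathrm{m\cdot K}\), so the effective CCT is the nominal temperature scaled by the ratio of the two constants. A small sketch (the constant values are the standard ones, but verify against the CIE documents before relying on them):

```python
# Rescaling the nominal D-illuminant temperature by the ratio of the old
# and current values of Planck's second radiation constant c2.
C2_OLD = 1.4380e-2  # m*K, value in use when the D series was defined
C2_NEW = 1.4388e-2  # m*K, current value

def effective_cct(nominal_K):
    """Effective CCT (K) of a D illuminant with the given nominal temperature."""
    return nominal_K * C2_NEW / C2_OLD

cct_d65 = effective_cct(6500)   # ~6504 K, matching the value quoted above
```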
E is a theoretical reference for an equal-energy radiator: it has a constant SPD across the visible spectrum.
Series FL represent fluorescent lamps.
Standard fluorescent lamps

Broadband fluorescent lamps with a fuller spectrum

Narrow-band lamps

LEDs have a narrow bandwidth: for a correct “white” light, several LEDs are needed (the narrow peaks of individual LEDs are particularly easy to see in the LED-RGB1 illuminant).
Phosphor-converted blue light

Mix of phosphor-converted blue light and a red LED

RGB system (mix of three colour LEDs)

Phosphor-converted violet light

Two standards representing natural indoor light. ID50 and ID65 are equivalent to outdoor D50 and D65 filtered through glass windows: the ultraviolet content is absorbed by the glass.
There are a few other standard illuminants:
* FL3.1 — 3.15: further fluorescent illuminants
* HP 1 — 5: high-pressure discharge lamps
Until now, we have measured EM radiation: this is radiometry, where all radiation is measured equally, regardless of wavelength.
In photography, we also use photometry, weighting the measurements by the luminosity function of the human eye \(V(\lambda)\) \(\Rightarrow\) values are only sampled in the visible spectrum
Luminous Flux (power): radiant flux weighted by the human luminosity function
\[\Phi_L = 683 \int_{380}^{750}V(\lambda)\Phi_R(\lambda)~\mathrm d\!\lambda\]
Our sensitivity peaks around green wavelengths: we are most sensitive to changes in contrast around green.
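The luminous-flux integral can be approximated with a discrete sum. In this sketch, \(V(\lambda)\) is replaced by a crude Gaussian centred at 555 nm (an assumption for illustration only; real computations should use the tabulated CIE luminosity function), and the source is a hypothetical flat 1 mW/nm emitter:

```python
import numpy as np

# Discrete approximation of Phi_L = 683 * integral V(lambda) Phi_R(lambda) dlambda
lam = np.arange(380, 751, 5)                    # wavelength samples, nm
DLAM = 5.0                                      # sampling step, nm
V = np.exp(-0.5 * ((lam - 555) / 45.0) ** 2)    # crude stand-in for V(lambda)

def luminous_flux(phi_R):
    """phi_R: spectral radiant flux in W/nm sampled on `lam`; returns lumens."""
    return 683.0 * np.sum(V * phi_R) * DLAM

# A flat 1 mW/nm source across the visible band:
flux = luminous_flux(np.full_like(lam, 1e-3, dtype=float))
```

Only the part of the spectrum where \(V(\lambda)\) is large contributes to the flux, which is exactly the "green sensitivity peak" point above.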
| Radiometric term | Photometric term |
|---|---|
| Radiant Flux \(\Phi_R\) (Watts \(W\)) | Luminous Flux \(\Phi_L\) (Lumens \(\mathrm{lm}\)) |
| Radiant Exitance \(M\) (\(W\cdot m^{-2}\)) | Luminous Emittance (\(\mathrm{lm}\cdot m^{-2}\)) |
| Radiant Intensity \(I_R\) (Watts per steradian, \(W\cdot\mathrm{sr}^{-1}\)) | Luminous Intensity \(I_L\) (candela \(\mathrm{cd} = \mathrm{lm}\cdot\mathrm{sr}^{-1}\)) |
| Irradiance \(E\) (\(W\cdot m^{-2}\)) | Illuminance \(E\) (lux, \(\mathrm{lx} = \mathrm{lm}\cdot m^{-2}\)) |
| Radiance \(L\) (\(W\cdot m^{-2}\cdot\mathrm{sr}^{-1}\)) | Luminance \(L\) (\(\mathrm{cd}\cdot \mathrm m^{-2}\)) |
The color of an opaque object is characterized by its surface reflectance \(S(\lambda)\in[0, 1]\).


Surfaces are represented by bidirectional reflectance distribution functions (BRDF) with a diffuse component (reflects light equally at all angles), a glossy component (reflects light around the reflection direction), and a specular component (like a mirror, reflects light only in the reflection direction).
The irradiance/illuminance falling on a surface is proportional to the cosine of the incident angle \(\vartheta\): \[E(\vartheta) = E(\vartheta=0)\cdot\cos\vartheta\]

The surface radiance is thus \[L_R = \frac 1 \pi \int_{380}^{750} E(\lambda)\cos\vartheta S(\lambda)~\mathrm d\!\lambda\] and the surface luminance \[L_L = \frac {683} \pi \int_{380}^{750} E(\lambda)\cos\vartheta V(\lambda) S(\lambda)~\mathrm d\!\lambda\]
(\(E\) is the spectral irradiance, \(S\) the spectral reflectance factor of the surface)
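A discrete sketch of the luminance formula, combining the cosine law with the spectral integral. The spectra here are illustrative stand-ins (flat illumination, a 50% grey surface, and a Gaussian approximation of \(V(\lambda)\)), not measured data:

```python
import numpy as np

# L_L = (683/pi) * integral E(lambda) cos(theta) V(lambda) S(lambda) dlambda
lam = np.arange(380, 751, 5)                    # wavelength samples, nm
DLAM = 5.0
V = np.exp(-0.5 * ((lam - 555) / 45.0) ** 2)    # crude luminosity function
E = np.full_like(lam, 1.0, dtype=float)         # flat spectral irradiance
S = np.full_like(lam, 0.5, dtype=float)         # grey surface, 50% reflectance

def surface_luminance(E, S, theta):
    """Luminance of a diffuse surface lit at incidence angle theta (radians)."""
    return 683.0 / np.pi * np.cos(theta) * np.sum(E * V * S) * DLAM

# Tilting the surface away from the light scales luminance by cos(theta):
L0 = surface_luminance(E, S, 0.0)
L60 = surface_luminance(E, S, np.pi / 3)        # cos(60 deg) = 0.5
```

The \(1/\pi\) factor comes from the Lambertian assumption: a perfectly diffuse surface spreads the reflected flux over the hemisphere.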

Colour signal: \[ C(\lambda) = E(\lambda) \cdot S(\lambda) \]
Sensor response of a sensor \(k\): \[\rho_k = \int_\lambda C(\lambda)\cdot R_k(\lambda)~\mathrm d\!\lambda = \int_\lambda E(\lambda) \cdot S(\lambda)\cdot R_k(\lambda)~\mathrm d\!\lambda\]
Three sensors sensitive to different wavelengths: * Human Visual System: psychovisual measurements \[\left(\begin{matrix}X\\Y\\Z\end{matrix}\right) = \int_\lambda C(\lambda)\cdot \left(\begin{matrix}R_X(\lambda)\\R_Y(\lambda)\\R_Z(\lambda)\end{matrix}\right)~\mathrm d\!\lambda\] * Camera measurements \[\left(\begin{matrix}R\\G\\B\end{matrix}\right) = \int_\lambda C(\lambda)\cdot \left(\begin{matrix}R_R(\lambda)\\R_G(\lambda)\\R_B(\lambda)\end{matrix}\right)~\mathrm d\!\lambda\]
We express the values as vectors of size 31:
| Name | Continuous notation | Discrete vector notation |
|---|---|---|
| Colour signal | \(C(\mathbf x, \lambda)\) | \(\mathbf c(\mathbf x)\) |
| Reflectance | \(S(\mathbf x, \lambda)\) | \(\mathbf s(\mathbf x)\) |
| Illumination | \(E(\mathbf x, \lambda)\) | \(\mathbf e(\mathbf x)\) |
| Sensor \(k\) sensitivity | \(R_k(\lambda)\) | \(\mathbf r_k\) |
The sensor response is thus \[\rho_k(\mathbf x) = \mathbf c(\mathbf x)^\intercal \mathbf r_k = \mathbf s(\mathbf x)^\intercal \mathrm{diag}\left(\mathbf e(\mathbf x)\right)\mathbf r_k\]
Where \({}^\intercal\) is the matrix transposition and \[\mathrm{diag}(\mathbf e) = \begin{pmatrix} e_0(\mathbf x) & 0 & \cdots & 0 \\ 0 & e_1(\mathbf x) & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & e_{30}(\mathbf x)\end{pmatrix}\]
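The discrete model is a one-liner in practice. In this sketch the 31-sample spectra are random stand-ins, not measured data; it also shows that the diagonal matrix is just a notational device for an element-wise product:

```python
import numpy as np

# Discrete sensor response: rho_k = s^T diag(e) r_k, with 31 spectral samples.
rng = np.random.default_rng(0)
e = rng.uniform(0.5, 1.0, 31)    # illumination SPD at one pixel (stand-in)
s = rng.uniform(0.0, 1.0, 31)    # surface reflectance at one pixel (stand-in)
r_k = rng.uniform(0.0, 1.0, 31)  # sensitivity of sensor k (stand-in)

# Literal translation of the formula...
rho_diag = s @ np.diag(e) @ r_k
# ...which reduces to an element-wise product followed by a sum:
rho_fast = np.sum(s * e * r_k)
```

For a trichromatic system, stacking \(\mathbf r_1, \mathbf r_2, \mathbf r_3\) as the columns of a \(31\times 3\) matrix turns the three responses into a single matrix-vector product.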
Any physical sensor can have a number of filters (channels) with different sensitivities \(R_k(\lambda)\).
The human vision system and most cameras are trichromatic, i.e. they have three different channels.
The total colour response at position \(\mathbf x\) of such a system is thus a vector with three entries:
\[\rho(\mathbf x) = \begin{pmatrix}\rho_1(\mathbf x)\\\rho_2(\mathbf x)\\\rho_3(\mathbf x)\end{pmatrix}\]
We usually call the three responses: