What Is Light?
Photography is, at its most fundamental, the recording of light. The word itself comes from the Greek photos (light) and graphe (drawing) — literally, drawing with light. Before we pick up a camera or load a roll of film, we need to understand what light actually is, how it behaves, and why its properties matter so much to the images we make.
This first lesson lays the groundwork for everything that follows. The physics here is not difficult, but it is essential. Every decision a photographer makes — choosing a film stock, setting a white balance, metering a scene, deciding when to shoot — traces back to the nature of light itself.
The Electromagnetic Spectrum
Light is a form of electromagnetic radiation. That puts it in the same family as radio waves, microwaves, infrared, ultraviolet, X-rays, and gamma rays. What distinguishes these from one another is their wavelength — the distance between successive peaks of the electromagnetic wave.
The full electromagnetic spectrum spans an enormous range. Radio waves can have wavelengths measured in kilometers. Gamma rays have wavelengths smaller than an atom — on the order of picometers (trillionths of a meter). Somewhere in the middle of this vast continuum lies a remarkably narrow band that our eyes can detect: visible light.
James Clerk Maxwell unified electricity and magnetism in the 1860s, publishing his famous equations in 1865. He showed that electric and magnetic fields could propagate through space as waves, and he calculated their speed — which turned out to be the speed of light. This was the first theoretical proof that light was an electromagnetic wave. Heinrich Hertz confirmed Maxwell's predictions experimentally in 1887 by generating and detecting radio waves in his laboratory in Karlsruhe, Germany.
The key insight for photographers is that all electromagnetic radiation travels at the same speed in a vacuum — approximately 300,000 kilometers per second (299,792,458 m/s to be precise). What varies is the wavelength and, inversely, the frequency. Shorter wavelengths mean higher frequencies and higher energy. This relationship is described by a simple equation:
Wave equation: c = λ × f, where c is the speed of light, λ (lambda) is the wavelength, and f is the frequency. Since c is constant, shorter wavelength means higher frequency and higher energy per photon.
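The relationship can be checked with a few lines of code. This is an illustrative sketch (the wavelengths chosen are the red and violet edges of the visible band from this lesson):

```python
# Relate wavelength and frequency via c = lambda * f.
C = 299_792_458  # speed of light in a vacuum, m/s (exact defined value)

def frequency_hz(wavelength_m: float) -> float:
    """Frequency of an electromagnetic wave of the given wavelength."""
    return C / wavelength_m

# Red light at 700 nm vs. violet light at 380 nm:
red = frequency_hz(700e-9)     # ~4.28e14 Hz
violet = frequency_hz(380e-9)  # ~7.89e14 Hz
print(f"red:    {red:.3e} Hz")
print(f"violet: {violet:.3e} Hz")
```

Note that violet light, at roughly half the wavelength of red, has nearly double the frequency — the inverse relationship in action.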
Figure: The electromagnetic spectrum from radio waves to gamma rays, with the narrow visible band expanded below. Visible light spans roughly 380 nm (violet) to 700 nm (red).
Visible Light: The Narrow Window
Visible light occupies an astonishingly small slice of the electromagnetic spectrum — wavelengths from roughly 380 nanometers (violet) to about 700 nanometers (red). A nanometer is one billionth of a meter. The entire visible range spans barely an octave of frequency, yet it contains all the colors we can see and all the light we use to make photographs.
Why this particular range? It is not a coincidence. The sun's peak energy output falls in this band, and our eyes evolved to be sensitive to the wavelengths most abundantly available. The atmosphere is also relatively transparent to visible light, while absorbing much of the ultraviolet and infrared. Evolution equipped us with detectors perfectly tuned to the light most available on Earth's surface.
Within this narrow window, wavelength determines color. The shortest visible wavelengths, around 380–450 nm, appear violet and blue. Middle wavelengths, around 495–570 nm, appear green and yellow. The longest visible wavelengths, around 590–700 nm, appear orange and red. Isaac Newton demonstrated this in his famous prism experiments of 1666, showing that white light could be separated into a spectrum of colors and recombined back into white light.
Key concept: Wavelength determines color. Shorter wavelengths (380–450 nm) are violet and blue; medium wavelengths (495–570 nm) are green; longer wavelengths (590–700 nm) are orange and red. Our entire experience of color comes from this narrow range.
White Light and Color Mixing
What we perceive as white light is not a single wavelength — it is a mixture of many wavelengths across the visible spectrum. Sunlight at midday contains a roughly even distribution of all visible wavelengths, which is why we perceive it as white (or nearly so).
This fact has profound implications for photography. Film emulsions must respond to multiple wavelengths to produce accurate color reproduction. Early photographic emulsions were sensitive only to blue and ultraviolet light, which is why 19th-century photographs often show blue skies as white and red objects as black. It took decades of chemical research to create panchromatic emulsions sensitive to the full visible spectrum. Hermann Vogel discovered spectral sensitization in 1873, and truly panchromatic film did not become widely available until the early 20th century.
Color film takes this further. A color negative like Kodak Portra 400 contains three separate emulsion layers, each sensitized to a different portion of the spectrum — one for red, one for green, one for blue. These three channels are sufficient to reconstruct a convincing full-color image, because human color vision relies on three types of cone cells in the retina, each sensitive to a different wavelength range. This trichromatic model of vision, proposed by Thomas Young in 1802 and refined by Hermann von Helmholtz in the 1850s, is the foundation of all color photography and color display technology.
Particle and Wave: The Dual Nature
Light behaves both as a wave and as a stream of particles called photons. For most of photography, the wave model is sufficient — it explains reflection, refraction, diffraction, and interference, all of which affect how lenses form images. But the particle model matters too: when light strikes film, individual photons trigger chemical changes in individual silver halide crystals. The quantum nature of light is why very low light photography produces grainy, noisy images — there simply are not enough photons arriving to build a smooth picture.
Albert Einstein received the Nobel Prize in Physics in 1921 (awarded for his 1905 paper) for explaining the photoelectric effect using the concept of light quanta — what we now call photons. Each photon carries an energy proportional to its frequency: E = hf, where h is Planck's constant (approximately 6.626 × 10−34 joule-seconds). Blue photons carry more energy than red ones. This is relevant to film photography because higher-energy photons are more effective at triggering the silver halide reaction, which is partly why early emulsions were most sensitive to blue light.
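The E = hf relationship can be made concrete with a quick calculation. The 450 nm and 650 nm values below are representative blue and red wavelengths chosen for illustration:

```python
# Photon energy: E = h * f = h * c / lambda.
H = 6.626e-34    # Planck's constant, J*s (approximate)
C = 299_792_458  # speed of light in a vacuum, m/s

def photon_energy_j(wavelength_m: float) -> float:
    """Energy carried by a single photon of the given wavelength."""
    return H * C / wavelength_m

blue = photon_energy_j(450e-9)
red = photon_energy_j(650e-9)
print(f"blue photon: {blue:.3e} J")  # ~4.41e-19 J
print(f"red photon:  {red:.3e} J")   # ~3.06e-19 J
print(f"ratio: {blue / red:.2f}")    # a blue photon carries ~44% more energy
```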
Practical note: The dual nature of light explains two different photographic phenomena. Wave behavior governs how lenses focus light (refraction) and limits sharpness at small apertures (diffraction). Particle behavior governs how film records light (individual photons creating latent image specks in silver halide crystals).
Color Temperature
Not all white light is the same. The light from a candle flame looks warm and orange-yellow. Noon sunlight appears neutral. The light in open shade on a clear day has a distinctly cool, bluish quality. These differences are described using the concept of color temperature, measured in Kelvin (K).
Color temperature is based on the physics of black body radiation. A black body is an idealized physical object that absorbs all radiation that falls on it and, when heated, emits radiation with a characteristic spectrum that depends only on its temperature. As a black body gets hotter, its emitted light shifts from red to orange to yellow to white to blue-white. The temperature of the black body that most closely matches a given light source defines that source's color temperature.
Max Planck developed the theoretical framework for black body radiation in 1900, introducing the concept of energy quanta that would later become central to quantum mechanics. His radiation law precisely describes the spectrum of light emitted at each temperature, and it underpins the entire color temperature scale used in photography.
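One practical consequence of Planck's law is Wien's displacement law, which gives the wavelength at which a black body emits most strongly: λ_peak = b/T. A small sketch (the temperatures are the candle and daylight values from the list below):

```python
# Wien's displacement law, a consequence of Planck's radiation law:
# a black body at temperature T emits most strongly at lambda = b / T.
B_WIEN = 2.898e-3  # Wien's displacement constant, m*K (approximate)

def peak_wavelength_nm(temperature_k: float) -> float:
    """Peak emission wavelength of a black body, in nanometers."""
    return B_WIEN / temperature_k * 1e9

print(f"candle   (1800 K): {peak_wavelength_nm(1800):.0f} nm")  # ~1610 nm, infrared
print(f"daylight (5500 K): {peak_wavelength_nm(5500):.0f} nm")  # ~527 nm, green
```

A candle's peak emission is actually in the infrared — only the short-wavelength tail of its spectrum is visible, which is why its light looks so warm. Daylight peaks in the green, near the middle of the visible band.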
Common color temperatures that photographers encounter include:
- 1,800 K — Candle flame. Very warm, deep orange light.
- 2,700–3,000 K — Incandescent light bulbs and sunset/sunrise. Warm orange-yellow tones.
- 3,200 K — Professional tungsten studio lights. This is the standard for "tungsten-balanced" film such as Kodak Ektachrome 64T.
- 4,000–4,500 K — Fluorescent lighting and early morning/late afternoon sun.
- 5,500 K — Noon daylight. This is the standard reference for "daylight-balanced" film.
- 6,500 K — Overcast sky. Slightly cool, with more blue.
- 7,000–8,000 K — Open shade on a clear day. Noticeably blue.
- 10,000+ K — Clear blue sky (the sky itself as a light source, not direct sun).
Figure: The color temperature scale in Kelvin. Lower values produce warm, orange light; higher values produce cool, bluish light. Daylight-balanced film is calibrated for approximately 5,500 K.
Why Color Temperature Matters to Photographers
Film emulsions are manufactured to produce accurate colors under a specific color temperature. Most general-purpose color films — Kodak Portra, Fuji Superia — are daylight-balanced, meaning they are optimized for light around 5,500 K. If you shoot daylight-balanced film under tungsten lighting (about 3,200 K), the resulting images will have a strong orange-yellow color cast. Shooting in shade (7,000+ K) will produce a cool blue cast.
With black-and-white film, color temperature matters less for color accuracy (obviously) but still affects the image. Tungsten light has relatively less blue compared to daylight. Since many black-and-white films have varying sensitivity across the spectrum, the color temperature of your light source can affect the relative brightness of different objects in the scene. A blue shirt and a red shirt that appear equally bright in daylight may render very differently under tungsten light on a film that is more sensitive to blue wavelengths.
In the days before digital white balance, photographers used color correction filters to adapt to different lighting conditions. An 80A blue filter converts tungsten light to approximate daylight balance. An 85B amber filter does the reverse, allowing tungsten-balanced film to be used outdoors. These filters work by selectively absorbing wavelengths to shift the overall color of the light reaching the film.
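Filter strength is conventionally expressed in mireds (micro reciprocal degrees), where mired = 1,000,000 / T. A useful property is that a given filter produces a roughly constant mired shift regardless of the source. The sketch below computes the approximate shifts of the two filters just mentioned (the 80A and 85B figures it reproduces are standard published values, around ±131 mired):

```python
# Color-conversion filters are usually rated by their "mired shift":
# mired = 1,000,000 / T(K). Negative shifts cool (blue), positive warm (amber).

def mired(kelvin: float) -> float:
    """Convert a color temperature in Kelvin to mireds."""
    return 1_000_000 / kelvin

def shift(from_k: float, to_k: float) -> float:
    """Mired shift needed to convert light at from_k to match to_k."""
    return mired(to_k) - mired(from_k)

# An 80A-type blue filter: tungsten (3200 K) toward daylight (5500 K)
print(f"{shift(3200, 5500):.0f} mired")  # ~ -131 (bluing filter)
# An 85B-type amber filter: daylight (5500 K) toward tungsten (3200 K)
print(f"{shift(5500, 3200):.0f} mired")  # ~ +131 (warming filter)
```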
Practical tip: When shooting color film, pay attention to your light source. For the most natural color rendition, match your film to your lighting: daylight film in daylight, or use appropriate color correction filters. Many photographers embrace the warm cast of tungsten light on daylight film as a creative choice — the "warm glow" of indoor scenes shot on Portra 400 is a beloved aesthetic.
Beyond Visible: Infrared and Ultraviolet
While our eyes cannot see beyond the visible spectrum, photographic film can be manufactured to respond to wavelengths outside our perception. Infrared film, such as the now-discontinued Kodak High Speed Infrared or the still-available Rollei Infrared 400, is sensitive to wavelengths up to about 900 nm. Infrared photography produces striking, otherworldly images: green foliage glows bright white (because leaves strongly reflect infrared), blue skies turn nearly black, and skin takes on an ethereal, smooth quality.
At the other end, standard photographic film is naturally sensitive to ultraviolet light. In fact, UV sensitivity is something lens designers must account for — a UV filter on the lens absorbs ultraviolet wavelengths that would otherwise cause a bluish haze in the image, especially at high altitudes or near large bodies of water where UV levels are elevated. The ubiquitous "UV filter" that many photographers keep on their lenses serves this purpose, though it also protects the front element from scratches and dust.
Light Intensity and the Inverse Square Law
Beyond color and wavelength, the intensity of light — how much of it there is — is the other critical variable for photographers. Light from a point source obeys the inverse square law: the intensity drops off with the square of the distance from the source. Double the distance and you get one-quarter the light. Triple the distance and you get one-ninth.
This is a direct consequence of geometry. Light radiates outward in all directions (or at least a wide cone, in the case of focused sources), and as it travels farther, it spreads over a larger area. The surface area of a sphere increases with the square of the radius, so the same total energy is spread over four times the area at twice the distance.
Key concept: The inverse square law — I ∝ 1/d² — means light falls off rapidly with distance. This is why a subject 2 meters from a lamp receives only one-quarter the light of a subject 1 meter away. Understanding this principle is essential for any work with artificial lighting.
The inverse square law is why studio photographers obsess over the distance between their lights and their subjects. Moving a light even a small amount can dramatically change the exposure. It also explains why the sun, despite being 150 million kilometers away, provides relatively even illumination across a scene — because the distances within any earthly scene are vanishingly small compared to the distance to the sun, the intensity is effectively constant across the frame.
Reflection, Absorption, and Transmission
When light strikes an object, three things can happen: it can be reflected, absorbed, or transmitted. Most objects exhibit a combination of all three. A red apple absorbs blue and green light and reflects red wavelengths, which is why we see it as red. A pane of clear glass transmits most visible light, reflects a small percentage (about 4% per surface), and absorbs very little. A sheet of black velvet absorbs almost all light across the visible spectrum.
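The "about 4% per surface" figure for glass comes from the Fresnel equations. At normal incidence the reflectance of a boundary between two media reduces to a simple formula, sketched here with a typical refractive index of 1.5 for glass:

```python
# Fresnel reflectance at normal incidence for a boundary between media
# with refractive indices n1 and n2: R = ((n1 - n2) / (n1 + n2))^2.
def reflectance(n1: float, n2: float) -> float:
    """Fraction of light reflected at normal incidence."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Air (n ~ 1.0) to typical glass (n ~ 1.5):
r = reflectance(1.0, 1.5)
print(f"{r:.1%} per surface")  # 4.0% per surface
```

This per-surface loss is also why multi-element lenses need anti-reflective coatings: a dozen uncoated glass-air surfaces would reflect away a substantial fraction of the light.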
These interactions are what create the world of tones and colors that photography captures. Every photograph is fundamentally a record of which wavelengths were reflected from (or transmitted through) the objects in the scene toward the camera. The lens gathers this reflected and transmitted light and focuses it onto the film plane, where it creates the image.
Understanding reflection also helps explain metering. A light meter measures the intensity of light, but a camera's built-in meter (or a handheld reflected meter) measures light reflected from the scene. This means the meter's reading depends not just on the illumination but on the reflectivity of the subject. An 18% gray card reflects 18% of the light falling on it, and this is the reference standard that light meters are calibrated against. We will explore metering in much more detail in Lesson 7.
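As a preview of the metering lesson, the consequence of calibrating to 18% gray can be quantified. A reflected reading of a subject whose reflectance differs from 18% is off by log₂(R / 0.18) stops. The reflectance values below (a ~90% white card, ~3% black cloth) are common illustrative figures, not measurements:

```python
import math

# A reflected-light meter renders whatever it reads as middle gray (~18%).
# The resulting error, in stops, for a subject of reflectance R:
def metering_error_stops(reflectance: float, reference: float = 0.18) -> float:
    """Stops by which a reflected reading of this subject differs from
    a reading of an 18% gray card under the same illumination."""
    return math.log2(reflectance / reference)

print(f"white card (~90%): {metering_error_stops(0.90):+.1f} stops")
print(f"gray card   (18%): {metering_error_stops(0.18):+.1f} stops")
print(f"black cloth (~3%): {metering_error_stops(0.03):+.1f} stops")
```

A white subject reads about 2.3 stops "brighter" than middle gray, so a meter pointed at it will underexpose by that amount — the classic snow-scene problem.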
Looking Ahead
With this foundation in the nature of light, we are ready to explore how that light can be controlled and directed. In the next lesson, we will examine how lenses bend light to form images — from the simplest pinhole to the elegant multi-element designs found in the finest cameras. The physics of refraction will build directly on what we have learned here about the wave nature of light.
Every roll of film you shoot, every exposure you make, is an encounter with the electromagnetic spectrum. The more deeply you understand light, the more intentionally you can work with it.
Sources
- Wikipedia — Electromagnetic spectrum
- Wikipedia — Visible spectrum
- Wikipedia — Color temperature
- Wikipedia — Black-body radiation
- Wikipedia — Photoelectric effect (Einstein, 1905)
- Wikipedia — Hermann Vogel and spectral sensitization
- Wikipedia — Inverse-square law
- Hecht, Eugene. Optics, 5th edition. Pearson, 2017. Chapters 3–4 on electromagnetic waves and light propagation.