How Big is a Photon?

A photon of visible light must be small enough to fit through a 1 micron slit, yet it must also be large enough to cover two slits spaced more than a thousand microns apart.[1] How can both be true? The question is far from trivial, but after building up the physics in the previous three lessons, we can finally answer it.

Three Definitions, Three Answers

The electromagnetic field
Size: the entire expanding shell, potentially enormous, with no upper limit.
Why: the field propagates at $c$ and never stops expanding. The $1/r$ amplitude decay means $1/r^2$ intensity, which approaches zero but never reaches it.[2]

The detection probability peak
Size: depends entirely on how the probability was shaped.
Why: for spontaneous emission, the $\sin^2\theta / r^2$ dipole pattern; for a laser, the beam waist $w_0$, typically $\sim$ mm.[3]

The quantized interaction with matter
Size: a single atom, as small as it gets.
Why: detection collapses the entire extended field to one point, transferring exactly $\hbar\omega$ of energy to one atom.[4]
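
To get a feel for the second definition, here is a minimal numerical sketch (Python; the dipole orientation and the chosen 30° detection band are arbitrary illustrative choices) of how the $\sin^2\theta$ emission pattern shapes where a spontaneously emitted photon is likely to be registered:

```python
# Sketch: how the sin^2(theta) dipole pattern shapes the detection probability
# of a spontaneously emitted photon. The +/-30 degree band is an arbitrary choice.
import numpy as np

theta = np.linspace(0.0, np.pi, 200_000)     # polar angle measured from the dipole axis
dtheta = theta[1] - theta[0]
pattern = np.sin(theta) ** 2                 # un-normalized dipole intensity, ~ sin^2(theta)
weight = np.sin(theta)                       # solid-angle weight: dOmega ~ sin(theta) dtheta

total = np.sum(pattern * weight) * dtheta    # proportional to total detection probability
band = (theta > np.radians(60)) & (theta < np.radians(120))
in_band = np.sum((pattern * weight)[band]) * dtheta

print(f"Fraction detected within 30 deg of the equatorial plane: {in_band / total:.2f}")
# -> about 0.69: the probability peaks around the plane perpendicular to the dipole
#    axis, yet that "peak" still lives on the entire expanding shell of radius r.
```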

A photon from the Sun, at the moment it hits your retina, has a field that has expanded into a sphere about 16 light-minutes in diameter, roughly 300 million kilometres across. A photon from a laser pointer has its probability concentrated in a beam a few millimetres wide, but its field technically extends to infinity. In both cases, detection happens at one atom.
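
A quick back-of-the-envelope check of those numbers (Python; it assumes the usual mean Sun-Earth light travel time of about 8.3 minutes):

```python
# Size of the field shell of a solar photon at the moment it reaches Earth.
# Assumes the mean Sun-Earth light travel time of roughly 8.3 minutes.
c = 299_792_458                     # speed of light, m/s
travel_time = 8.3 * 60              # Sun-to-Earth travel time, s

radius = c * travel_time            # the shell has expanded to this radius on arrival
diameter_km = 2 * radius / 1e3

print(f"Shell diameter: {diameter_km:.2e} km")   # ~3.0e8 km, i.e. ~300 million km
```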

The Uncertainty Principle Connection

The impossibility of confining a photon's field to a small region is not a practical limitation — it is fundamental. For any quantum particle, position and momentum obey:

$$\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}$$

$\Delta x$: uncertainty in position, i.e. how well-localized the photon is
$\Delta p$: uncertainty in momentum, i.e. how well-defined the photon's direction is
$\hbar$: reduced Planck constant, $1.055 \times 10^{-34}$ J·s

For a photon with momentum $p = \hbar k = h/\lambda$, attempting to confine it to a region much smaller than $\lambda$ would require a momentum uncertainty larger than $p$ itself, which makes no physical sense for a particle whose momentum is set by its wavelength. A photon cannot be localized below the scale of its own wavelength, and in practice the field always extends far beyond that.
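
Here is that argument with numbers attached, sketched in Python for a 500 nm photon (the wavelength and the confinement scales are arbitrary illustrative choices):

```python
# How tightly you can try to localize a 500 nm photon before the required
# momentum spread overwhelms its own momentum. Values are illustrative.
import math

h = 6.626e-34                       # Planck constant, J*s
hbar = h / (2 * math.pi)
lam = 500e-9                        # wavelength, m
p = h / lam                         # photon momentum magnitude

for dx in (lam, lam / 10, lam / 100):       # progressively tighter confinement
    dp_min = hbar / (2 * dx)                # minimum spread allowed by dx*dp >= hbar/2
    print(f"dx = lambda/{lam / dx:>3.0f}: dp_min/p = {dp_min / p:.2f}")
# dx = lambda     -> dp_min/p ~ 0.08 (already noticeable)
# dx = lambda/10  -> dp_min/p ~ 0.8
# dx = lambda/100 -> dp_min/p ~ 8   (the spread now dwarfs the momentum itself)
```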

This is the same physics behind the Gouy phase in a Gaussian beam: a tighter focus (smaller $w_0$) demands greater transverse momentum spread, which shifts the phase and increases divergence. You cannot cheat diffraction.
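
A small sketch of that trade-off, using the standard Gaussian-beam relations for the far-field half-angle divergence, $\theta \approx \lambda / (\pi w_0)$, and the Rayleigh range, $z_R = \pi w_0^2 / \lambda$ (the 633 nm wavelength and the waist values are illustrative choices):

```python
# Focus/divergence trade-off for a Gaussian beam. Tightening the waist w0
# buys a smaller spot only at the price of faster spreading.
import math

lam = 633e-9                                    # wavelength, m (HeNe-like, illustrative)

for w0 in (1e-3, 100e-6, 10e-6):                # beam waist radius, m
    theta = lam / (math.pi * w0)                # far-field half-angle divergence, rad
    z_R = math.pi * w0**2 / lam                 # Rayleigh range: how far the beam stays tight
    print(f"w0 = {w0 * 1e6:6.0f} um -> divergence {math.degrees(theta):6.3f} deg, "
          f"Rayleigh range {z_R * 1e3:9.3f} mm")
# A 1 mm waist diverges by ~0.01 deg and stays collimated for metres;
# a 10 um waist diverges by ~1 deg and stays tight for only about half a millimetre.
```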

What We've Learned

The field can be attenuated, split by beam splitters, diluted by distance — it doesn't care, because the field is not quantized.[1] Only the energy exchange with matter is quantized. The field can take on any amplitude, but every absorption or emission event transfers exactly $\hbar\omega$.
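
To make that contrast concrete, here is a short sketch (Python; the 1 mW power and 633 nm wavelength are illustrative laser-pointer values) of how many of those fixed $\hbar\omega$ energy exchanges even a modest beam drives per second:

```python
# The field amplitude is continuous, but every exchange with matter is a lump
# of hbar*omega. Illustrative laser-pointer numbers: 1 mW at 633 nm.
h = 6.626e-34                     # Planck constant, J*s
c = 2.998e8                       # speed of light, m/s
lam = 633e-9                      # wavelength, m
power = 1e-3                      # beam power, W

E_photon = h * c / lam            # energy per quantum, equal to hbar * omega
print(f"Energy per photon: {E_photon:.2e} J ({E_photon / 1.602e-19:.2f} eV)")
print(f"Photons per second: {power / E_photon:.2e}")
# ~3e15 per second, and each absorption is all-or-nothing no matter how dilute
# the field has become.
```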

And somehow, energy spread across a field that spans light-minutes collapses into a single atom in a single instant.

That is the mystery.

Sources

[1] Huygens Optics — How big is a visible photon?
[2] Jackson, J.D. — Classical Electrodynamics, 3rd ed. (Wiley, 1999), Ch. 9 & 16
[3] Saleh, B.E.A. & Teich, M.C. — Fundamentals of Photonics (Wiley, 2007), Ch. 3
[4] Fox, M. — Quantum Optics: An Introduction (Oxford, 2006), Ch. 5–7
[5] Maxwell, J.C. — A Treatise on Electricity and Magnetism (1873)
[6] Mandel, L. & Wolf, E. — Optical Coherence and Quantum Optics (Cambridge, 1995)