Suppose we have a blob of magnetised plasma that emits synchrotron radiation. Let the temperature of the plasma be $ T $, so that the typical Lorentz factor of the (relativistic) electrons is $ \gamma_e = k T / m_e c^2 $, where $ m_e $ is the electron mass and $ c $ is the speed of light. Let the magnetic field be $ B $ and the number of electrons be $ N_e $; throughout, we work to order of magnitude and drop numerical factors of order unity. The rate at which the electrons cool by emitting synchrotron radiation is

$ L_s \approx N_e r_0^2 c B^2 \gamma_e^2 $

where $ r_0 $ is the classical electron radius. The electrons emit photons with a typical frequency

$ \nu_s \approx \frac{c}{r_0} \sqrt{\frac{B^2 r_0^3}{m_e c^2}} \gamma_e^2 $

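As a quick sanity check, both scalings can be evaluated numerically. Below is a minimal sketch in Python (CGS units); the field strength and Lorentz factor are hypothetical sample values, and factors of order unity are dropped, as in the text.

```python
# Order-of-magnitude sketch in CGS units; B and gamma_e are hypothetical
# sample values, and factors of order unity are dropped throughout.
r0 = 2.82e-13    # classical electron radius [cm]
c = 3.0e10       # speed of light [cm/s]
me_c2 = 8.2e-7   # electron rest energy m_e c^2 [erg]

B = 0.1          # magnetic field [G] (hypothetical)
gamma_e = 1.0e2  # electron Lorentz factor (hypothetical)

# Synchrotron cooling rate per electron, L_s / N_e ~ r0^2 c B^2 gamma_e^2
Ls_per_electron = r0**2 * c * B**2 * gamma_e**2
print(f"L_s per electron ~ {Ls_per_electron:.1e} erg/s")   # ~ 2e-13 erg/s

# Characteristic synchrotron frequency, nu_s ~ (c/r0) sqrt(B^2 r0^3 / m_e c^2) gamma_e^2
nu_s = (c / r0) * (B**2 * r0**3 / me_c2) ** 0.5 * gamma_e**2
print(f"nu_s ~ {nu_s:.1e} Hz")                             # ~ 2e10 Hz
```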
Electrons can also cool by inverse Compton scattering off ambient photons. The corresponding cooling rate is

$ L_c \approx N_e r_0^2 c u \gamma_e^2 $

This is the same expression as the synchrotron cooling rate, with the magnetic energy density $ B^2 $ replaced by the photon energy density $ u $.
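Since the two cooling rates differ only in the energy density that appears, their ratio is independent of $ \gamma_e $. A minimal sketch, reusing the sample values above together with a hypothetical photon energy density:

```python
# Sketch: the inverse Compton and synchrotron cooling rates share the same
# prefactor, so their ratio reduces to u / B^2. All values are hypothetical.
r0, c = 2.82e-13, 3.0e10   # classical electron radius [cm], speed of light [cm/s]
B, gamma_e = 0.1, 1.0e2    # hypothetical field [G] and Lorentz factor, as above
u = 1.0e-5                 # hypothetical photon energy density [erg/cm^3]

L_s = r0**2 * c * B**2 * gamma_e**2   # synchrotron cooling rate per electron
L_c = r0**2 * c * u * gamma_e**2      # inverse Compton cooling rate per electron
print(L_c / L_s, u / B**2)            # both ~ 1e-3: the ratio is just u / B^2
```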

When the inverse Compton cooling rate exceeds the synchrotron cooling rate, electrons lose their energy to upscattered photons faster than to synchrotron radiation, and the synchrotron emission is suppressed. This happens when the radiation energy density becomes comparable to the magnetic energy density, $ u \approx B^2 $. Observationally, this condition manifests itself as an upper bound on the brightness temperature of radio sources.

Suppose an astrophysical source is observed at a frequency $ \nu_s $. From the expression for $ \nu_s $ above, and using $ \gamma_e \propto T $, the magnetic field scales with temperature and frequency as $ B \propto \nu_s / T^2 $. The energy density of photons near $ \nu_s $ is bounded above by the black-body energy density at the electron temperature, $ u \approx k T \nu_s^3 / c^3 $, since a source cannot be brighter than a black body at that temperature. Therefore

$ \frac{L_c}{L_s} \approx \frac{u}{B^2} \approx \left(\frac{k T}{m_e c^2} \right)^5 \left(\frac{\nu_s r_0}{c} \right) < 1 $.

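The algebra behind this ratio can be verified symbolically. The sketch below (using sympy) inverts the $ \nu_s $ relation for $ B $, substitutes the black-body bound for $ u $, and recovers the expression above; the symbol names are illustrative.

```python
import sympy as sp

# Symbolic check that u / B^2 reduces to gamma_e^5 (nu_s r_0 / c).
gamma, nu, r0, c, me = sp.symbols('gamma_e nu_s r_0 c m_e', positive=True)

# Invert nu_s ~ gamma_e^2 B sqrt(r_0/m_e) (the nu_s expression, simplified)
B = nu * sp.sqrt(me / r0) / gamma**2
# Black-body bound with kT = gamma_e m_e c^2:  u ~ kT nu_s^3 / c^3
u = gamma * me * c**2 * nu**3 / c**3

print(sp.simplify(u / B**2))   # -> gamma_e**5*nu_s*r_0/c
```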
For sources observed at a frequency of about 1 GHz, the temperature therefore cannot exceed roughly $ 10^{12} $ K. This maximum temperature scales with frequency as $ \nu_s^{-1/5} $, and is hence not very sensitive to the observing frequency.
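A minimal numerical sketch of this bound, assuming only the order-of-magnitude relations above (CGS constants, factors of order unity dropped):

```python
# Sketch: maximum temperature implied by L_c/L_s < 1 at a given observing
# frequency, from (kT/m_e c^2)^5 (nu_s r_0 / c) < 1.
r0 = 2.82e-13     # classical electron radius [cm]
c = 3.0e10        # speed of light [cm/s]
me_c2_K = 5.9e9   # electron rest energy in temperature units, m_e c^2 / k [K]

def T_max(nu_s):
    """Maximum temperature [K] at observing frequency nu_s [Hz]."""
    gamma_max = (c / (nu_s * r0)) ** 0.2
    return gamma_max * me_c2_K

print(f"{T_max(1e9):.1e} K")       # ~ 4e12 K at 1 GHz, i.e. of order 10^12 K
print(T_max(1e10) / T_max(1e9))    # ~ 0.63 = 10^(-1/5): weak frequency dependence
```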