Let us consider two stars located at (-L, h/2) and (-L, -h/2), and suppose we have an infinite screen at x = 0. The light from the two stars creates a double-slit-like diffraction pattern on the screen. The size of the central ridge is determined by the condition that the difference in optical path lengths should be comparable to the wavelength

 \lambda \approx \sqrt{L^2 + \left(r + h/2\right)^2} - \sqrt{L^2 + \left(r - h/2\right)^2} \approx \frac{r h}{L} \quad \Rightarrow \quad r \approx \frac{L \lambda}{h},
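The far-field approximation above can be checked numerically. The values of L, h, and r below are arbitrary, chosen only to satisfy L much greater than r and h:

```python
import math

# Hypothetical geometry (metres), chosen so that L >> r, h.
L = 1.0e3   # distance from the stars to the screen
h = 1.0e-2  # separation between the two stars
r = 1.0e-1  # position on the screen

# Exact optical path difference to the point (0, r) on the screen.
exact = math.sqrt(L**2 + (r + h / 2)**2) - math.sqrt(L**2 + (r - h / 2)**2)

# Far-field approximation: path difference ~ r h / L.
approx = r * h / L

print(exact, approx)
```

For these numbers the two expressions agree to better than one part in 10^5, confirming that the expansion is accurate deep in the far field.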

where we assumed  L \gg r, h  (this regime is often called the far field). For a telescope to resolve the two objects, the diameter  D  of its aperture should be larger than the radius of the main diffraction ridge,  D > L\lambda/h . Substituting  h/L = \Delta \theta , where  \Delta \theta  is the angular separation of the stars, we obtain the diffraction limit

 \Delta \theta > \frac{\lambda}{D} .

A more rigorous derivation, which treats diffraction from a circular aperture (the Rayleigh criterion), adds the prefactor 1.22 to the right-hand side of this equation.
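As an illustration, the Rayleigh limit can be evaluated for a telescope; the wavelength and aperture diameter below are assumed values for a sketch, not taken from the text:

```python
import math

wavelength = 550e-9       # visible light, metres (assumed value)
aperture_diameter = 2.4   # metres (assumed value)

# Rayleigh criterion: smallest resolvable angular separation, in radians.
delta_theta = 1.22 * wavelength / aperture_diameter

# Convert radians to arcseconds for a more familiar astronomical unit.
arcsec = math.degrees(delta_theta) * 3600

print(f"{delta_theta:.2e} rad = {arcsec:.3f} arcsec")
# → 2.80e-07 rad = 0.058 arcsec
```

A 2.4 m visible-light aperture thus resolves separations of a few hundredths of an arcsecond, consistent with the estimate  \Delta \theta \sim \lambda / D  up to the 1.22 prefactor.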