A parallel beam of monochromatic light of wavelength 900 nm passes through a long slit of width 0.4 mm. The angular divergence within which most of the light is diffracted is
1. $8.9 \times 10^{-3}\ \text{rad}$
2. $7.2 \times 10^{-3}\ \text{rad}$
3. $3.6 \times 10^{-3}\ \text{rad}$
4. $4.5 \times 10^{-3}\ \text{rad}$
Answer: $4.5 \times 10^{-3}\ \text{rad}$
The correct answer is Option (4): $4.5 \times 10^{-3}\ \text{rad}$.

Given:
- Wavelength, $λ = 900\ \text{nm} = 9 \times 10^{-7}\ \text{m}$
- Slit width, $a = 0.4\ \text{mm} = 4 \times 10^{-4}\ \text{m}$

For single-slit diffraction, the first minimum occurs at $a\sinθ = λ$, so

$\sinθ = \frac{λ}{a} = \frac{9 \times 10^{-7}}{4 \times 10^{-4}} = 2.25 \times 10^{-3}$

Since $θ$ is small, $\sinθ ≈ θ$ (in radians), giving $θ = 2.25 \times 10^{-3}\ \text{rad}$.

Most of the diffracted light lies within the central maximum, which extends from $-θ$ to $+θ$ about the axis. The angular divergence is therefore

$2θ = 2 \times 2.25 \times 10^{-3} = 4.5 \times 10^{-3}\ \text{rad}$
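The arithmetic can be checked with a short script (a minimal sketch; the variable names are illustrative, not part of the problem):

```python
import math

# Single-slit diffraction: first minimum where a * sin(theta) = lambda.
wavelength = 900e-9   # m  (900 nm)
slit_width = 0.4e-3   # m  (0.4 mm)

sin_theta = wavelength / slit_width      # 2.25e-3
theta = math.asin(sin_theta)             # small angle, so theta ≈ sin(theta)

# Central maximum spans -theta to +theta, so divergence = 2*theta.
divergence = 2 * theta

print(f"theta      ≈ {theta:.3e} rad")
print(f"divergence ≈ {divergence:.2e} rad")  # ~4.5e-3 rad, Option (4)
```

The small-angle approximation is safe here: $\sinθ = 2.25 \times 10^{-3}$, so `asin` changes the result only beyond the quoted precision.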