A slit of width 'd' is illuminated by red light of wavelength 650 nm. What should be the value of the slit width 'd' so that the first minimum falls at a diffraction angle of $\frac{\pi}{6}$?

(1) $1.3 \times 10^{-6}$ m
(2) $1.3 \times 10^{-5}$ m
(3) $6.5 \times 10^{-6}$ m
(4) $7.5 \times 10^{-6}$ m
Answer: $1.3 \times 10^{-6}$ m

The correct answer is Option (1): $1.3 \times 10^{-6}$ m.

For single-slit diffraction, minima occur where $d\sin\theta = m\lambda$, i.e. $\sin\theta = \frac{m\lambda}{d}$, with:

m = order of the minimum = 1
λ = wavelength of light = 650 nm
d = slit width
θ = angle of diffraction = $\frac{\pi}{6}$

$\therefore \sin\left(\frac{\pi}{6}\right) = \frac{1 \times 650 \times 10^{-9}}{d}$

Since $\sin\left(\frac{\pi}{6}\right) = \frac{1}{2}$,

$d = 2 \times 650 \times 10^{-9}\ \text{m} = 1.3 \times 10^{-6}\ \text{m} = 1.3\ \mu\text{m}$
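The arithmetic above can be checked with a short sketch (variable names are illustrative, not from the source):

```python
import math

# Single-slit diffraction minima: d * sin(theta) = m * wavelength
wavelength = 650e-9      # m, red light from the problem
m = 1                    # first minimum
theta = math.pi / 6      # given diffraction angle

# Solve for the slit width d
d = m * wavelength / math.sin(theta)
print(d)  # approximately 1.3e-06 m, i.e. 1.3 micrometres
```

Since $\sin(\pi/6) = 0.5$ exactly, the division doubles the wavelength, matching Option (1).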