X-rays with a wavelength of 1.10 Å scatter at an angle of 34.5° from a crystal.
If n = 1, what is the distance between the planes of atoms in the crystal that give rise to this scattering?
The problem can be solved with Bragg's equation:
nλ = 2d sin θ
where n = an integer (the diffraction order)
λ = wavelength of the incident wave
d = distance between the planes of atoms, i.e. the spacing of the atomic lattice planes
θ = angle between the incident ray and the scattering planes of the atomic lattice
Given: λ = 1.10 Å = 1.10 × 10⁻¹⁰ m
θ = 34.5°
n = 1, d = ?
Therefore,
d = nλ / (2 sin θ)
= (1 × 1.10 × 10⁻¹⁰ m) / (2 × sin 34.5°)
= (1.10 × 10⁻¹⁰ m) / 1.133
d ≈ 9.7 × 10⁻¹¹ m, or 0.97 Å
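As a quick numerical check, the same rearranged Bragg's-law calculation can be done in a few lines of Python. This is just an illustrative sketch; the variable names (wavelength_m, theta_deg, order) are my own and not part of the original solution.

```python
import math

# Bragg's law: n * lambda = 2 * d * sin(theta); solve for the spacing d.
wavelength_m = 1.10e-10   # 1.10 Å converted to metres
theta_deg = 34.5          # scattering angle in degrees
order = 1                 # n = 1

d = order * wavelength_m / (2 * math.sin(math.radians(theta_deg)))
print(f"d = {d:.2e} m = {d * 1e10:.2f} Å")  # prints d ≈ 9.71e-11 m ≈ 0.97 Å
```

Running it reproduces the result above, d ≈ 9.7 × 10⁻¹¹ m (0.97 Å).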