Why must a wavefunction approach 0 as x approaches infinity for it to be normalizable? As much detail as possible in the explanation would be appreciated.
For a wavefunction $\psi(x)$ to be normalizable, the total probability of finding the particle somewhere in space must equal one:

$\int_{-\infty}^{\infty} |\psi(x)|^2 \, dx = 1.$

This is an improper integral over all of space, and it can only be finite if the integrand $|\psi(x)|^2$ dies off as $x \to \pm\infty$. If $\psi$ instead approached a nonzero constant (or grew) at large $|x|$, the tails of the integral would contribute an infinite amount of probability, the integral would diverge, and no choice of normalization constant could make it equal one. Physically, $\psi \to 0$ at infinity expresses the fact that the particle is localized to some region of space: the probability of finding it arbitrarily far away must vanish, which is another way of saying the particle actually 'exists' somewhere.

Strictly speaking, approaching zero is necessary but not sufficient: $|\psi(x)|^2$ must also fall off fast enough (faster than $1/|x|$) for the tail of the integral to converge. Thus any physically realistic, normalizable wavefunction must approach zero as $x \to \pm\infty$.
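As a concrete check, consider a simple decaying wavefunction (this particular form, $\psi(x) = A e^{-a|x|}$ with $a > 0$, is just an illustrative choice, not something fixed by the question). Because it vanishes at infinity fast enough, the normalization integral converges and fixes the constant $A$:

$\int_{-\infty}^{\infty} A^2 e^{-2a|x|} \, dx = 2A^2 \int_{0}^{\infty} e^{-2ax} \, dx = \frac{A^2}{a} = 1 \quad\Rightarrow\quad A = \sqrt{a}.$

By contrast, for a wavefunction that does not vanish at infinity, say the constant $\psi(x) = C$ with $C \neq 0$, the same integral gives $\int_{-\infty}^{\infty} C^2 \, dx = \infty$, so no normalization constant exists.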