Suppose that a very concentrated sample read 19 ppm when tested with an instrument calibrated for 0 - 3 ppm. The sample was then diluted to 1/10th of its original concentration, and the diluted sample read 2.6 ppm.
a. Based on the 2.6 ppm reading for the dilution, what was the true concentration of the original sample?
b. Briefly explain why the spectrophotometer is likely to show an incorrect result for the original sample.
a) Based on the 2.6 ppm reading of the diluted sample, the true concentration of the original sample was:
C = 2.6 ppm * 10 = 26 ppm, not the 19 ppm that the instrument read directly.
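If it helps, here is a quick sanity check of that arithmetic as a small Python snippet (the variable names are just illustrative, not part of the original problem):

# Sanity check of the dilution-factor calculation
diluted_reading_ppm = 2.6   # reading of the 1/10 dilution, within the 0 - 3 ppm range
dilution_factor = 10        # sample was diluted to 1/10th of its original concentration
original_conc_ppm = diluted_reading_ppm * dilution_factor
print(original_conc_ppm)    # prints 26.0, i.e. 26 ppm vs. the unreliable direct reading of 19 ppm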
b) The instrument is calibrated only over 0 - 3 ppm, so it can only be trusted for low concentrations within that range. A direct reading of 19 ppm is far outside the calibration range, so the instrument is extrapolating and its accuracy drops off. It's like trying to measure 250 mL in a beaker instead of a graduated cylinder: the tool is being used outside the conditions where it gives accurate results. Diluting the sample to 1/10th brings the concentration (2.6 ppm) back inside the calibrated range, where the reading can be trusted. The dilution also reduces the influence of any impurities in the sample, since they are diluted along with the analyte.
Hope this helps