In a calibration curve, how is the sensitivity of the method defined?
A calibration curve is a plot of a measured property (signal) of the analyte, such as absorbance or emission intensity, against the known concentrations of a series of standard solutions.
As the analyte concentration in the solution changes, the measured value (intensity) of that property changes with it.
Sensitivity is the ability of the method to distinguish small differences in analyte concentration. For a calibration curve, it is defined as the slope: the change in measured signal per unit change in concentration (m = ΔS/Δc). A steeper slope means a small concentration difference produces a larger change in signal, i.e., higher sensitivity. A shallow slope means small concentration differences produce signal changes that are hard to distinguish, especially once they approach the size of the measurement noise.
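As a minimal sketch of this idea (in Python with NumPy, using made-up example concentrations and signal values), the calibration sensitivity is simply the slope of a linear least-squares fit to the standard data:

```python
import numpy as np

# Hypothetical standards: known concentrations (e.g., mg/L) and the
# measured signal (e.g., absorbance) for each one.
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
signal = np.array([0.010, 0.205, 0.398, 0.602, 0.795, 1.001])

# Fit signal = m*conc + b; the slope m is the calibration sensitivity.
m, b = np.polyfit(conc, signal, 1)
print(f"Calibration sensitivity (slope): {m:.4f} signal units per mg/L")
print(f"Intercept (blank signal): {b:.4f}")

# A noise-aware variant (sometimes called analytical sensitivity) divides
# the slope by the standard deviation of the signal; here the noise is
# estimated from the fit residuals purely for illustration.
residuals = signal - (m * conc + b)
s = residuals.std(ddof=2)  # ddof=2 because two parameters were fitted
print(f"Analytical sensitivity: {m / s:.1f} per (mg/L)")
```

With these example numbers the slope comes out near 0.1 signal units per mg/L; a method with a larger slope would resolve smaller concentration differences for the same noise level.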