(3) An assay method has a simple linear calibration: S = 0.50[A] − 2.00, where S is the measured signal (arbitrary units) and [A] is the concentration of analyte in micromolar. A blank sample is measured five times and an average signal of 0.50 ± 0.50 (± one standard deviation) is obtained. What is the limit of detection for the method?
The answer is 8.0 µM. I'm unsure why, as LOD = S_mb + zσ, where the textbook says z is assumed to be 3 when no other assumptions can be made (is that true?).
That would give 2.0 µM.
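Here is my working laid out, in case I'm misapplying something. This is just a sketch of my reading of the formula, assuming z = 3 (the textbook default) and guessing that the missing step is converting the detection-limit signal back into a concentration through the calibration line:

```latex
% My working, assuming z = 3 for the blank-based detection limit.
% S_mb = mean blank signal, sigma = standard deviation of the blank.
\[
  S_{\mathrm{dl}} = S_{\mathrm{mb}} + z\sigma = 0.50 + 3(0.50) = 2.0 \quad \text{(signal units)}
\]
% Guessing at the missing step: solve the calibration S = 0.50[A] - 2.00
% for [A] at the detection-limit signal, which would give the quoted 8.0 uM.
\[
  [A]_{\mathrm{dl}} = \frac{S_{\mathrm{dl}} + 2.00}{0.50} = \frac{2.0 + 2.00}{0.50} = 8.0~\mu\text{M}
\]
```

Is that second step the right way to read it, i.e. does LOD = S_mb + zσ give a signal that still has to go back through the calibration, or should it come out in concentration units directly?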
Could somebody run me through this? Thank you