Hello everyone.
Please, I need clarification, in simple terms, on how the method detection limit is obtained.
I have checked the literature, and the statements made there were quite contrasting or unclear to me.
I say that because of statements like this one, which is the most common I see: "the detection limit, defined as 3σ (i.e. 3 times the standard deviation) of the measured lowest standard concentration (e.g. 10 nM), was ...". In other cases, error bars are mentioned instead.
I assume that is also a valid approach, but I've never seen it used before.
I would appreciate it if you could state, in simple terms, how to obtain the method detection limit of a particular analyte, A, assuming I am using HPLC with a fluorescence detector for quantification.
Oh, that's easier. For HPLC, the limit of detection is (often) defined as 3× the noise, i.e. the concentration at which the signal-to-noise ratio reaches 3. You look at the signal height of a blank, or a blank region of the run, and use that as the noise. In many cases the HPLC acquisition software will even do this for you.
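As a rough illustration (not any vendor's built-in routine), here is a minimal sketch of that S/N-based estimate. It assumes you already have the chromatogram as a NumPy array, a peak-free blank region, and the baseline-corrected peak height of your lowest standard at a known concentration; the function name, the example numbers, and the linear-response-through-the-origin assumption are all mine, not from any standard.

```python
import numpy as np

def lod_from_noise(blank_region, peak_height, std_conc, k=3.0):
    """Estimate LOD as the concentration where signal = k * noise.

    blank_region : baseline samples from a peak-free part of the run
    peak_height  : baseline-corrected peak height of the lowest standard
    std_conc     : concentration of that standard (e.g. in nM)
    Assumes the response is linear through the origin near the LOD.
    """
    noise = np.std(blank_region)        # RMS noise of the baseline
    response = peak_height / std_conc   # signal per unit concentration
    return k * noise / response         # concentration at S/N = k

# Hypothetical numbers: simulated baseline noise and a 10 nM standard peak
rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 0.5, size=500)
print(f"LOD ~ {lod_from_noise(baseline, peak_height=120.0, std_conc=10.0):.3f} nM")
```

One caveat: some conventions use peak-to-peak noise rather than RMS noise, so check which definition your acquisition software applies.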
In fact, the σ-based method you've described above is closer to how the Limit of Quantitation is defined: by ICH convention, LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the response of the lowest standard (or of the blank) and S is the slope of the calibration curve.
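A minimal sketch of that calibration-based calculation, assuming hypothetical replicate peak areas for the lowest standard and a made-up calibration series (none of these numbers come from the thread):

```python
import numpy as np

# Hypothetical replicate peak areas of the lowest standard (10 nM)
low_std_responses = np.array([101.2, 98.7, 103.5, 99.9, 102.1, 100.4])

# Hypothetical calibration data: concentration (nM) vs. peak area
conc = np.array([10.0, 25.0, 50.0, 100.0, 200.0])
area = np.array([101.0, 251.3, 498.9, 1003.2, 1998.5])

slope, intercept = np.polyfit(conc, area, 1)  # linear calibration fit
sigma = np.std(low_std_responses, ddof=1)     # SD of the response

lod = 3.3 * sigma / slope    # ICH-style detection limit
loq = 10.0 * sigma / slope   # ICH-style quantitation limit
print(f"LOD ~ {lod:.2f} nM, LOQ ~ {loq:.2f} nM")
```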
Precision:
I think I understand precision better. It is basically repeating a particular analysis several times, using the same reagent concentrations and reaction conditions, and looking at the variation in the response, e.g. the fluorescence signal intensity.
I would appreciate further insight on this as well.
You can report that, but depending on the standards to which your analysis is held, you may want to call it repeatability (replicates under identical conditions) rather than robustness, which usually refers to deliberately varying the conditions.
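For what it's worth, the figure usually reported for precision is the relative standard deviation (%RSD) of the replicate responses. A minimal sketch with made-up fluorescence intensities:

```python
import numpy as np

# Hypothetical fluorescence intensities from six replicate runs
replicates = np.array([1520.0, 1498.3, 1534.7, 1511.2, 1489.9, 1526.4])

mean = replicates.mean()
sd = replicates.std(ddof=1)       # sample standard deviation
rsd_percent = 100.0 * sd / mean   # relative standard deviation

print(f"mean = {mean:.1f}, SD = {sd:.1f}, %RSD = {rsd_percent:.2f}%")
```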