"low detection area" meaning in each calibration point gives a low area value (this shows the GC is working on low sensitivity right?) As for the calibration line, I still get a straight line graph because all the points are in the Low bias area values. It will not produce any error. For an example:
25 ppm = 300 (area value)
50 ppm = 600
150 ppm = 1800
200 ppm = 2400
But the week after that, following re-calibration, my results are:
25 ppm = 500 (area value)
50 ppm = 1000
150 ppm = 3000
200 ppm = 4000
As you can see, both calibrations give a nice straight line... so how do these two calibrations affect the precision of my results, my quality control, or my spike sample results? Do you know of any websites that explain this?
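For illustration, here is a minimal Python sketch of my own (not from any reference; the "unknown" peak area of 1500 is a made-up example) that fits both calibration sets. It shows that each set is perfectly linear (r = 1) yet the slopes differ (12 vs. 20 area units per ppm), so the same raw peak area converts to a very different concentration depending on which curve is in use:

import numpy as np

# Calibration standards (ppm) and their peak areas from the two weeks
conc = np.array([25.0, 50.0, 150.0, 200.0])
area_week1 = np.array([300.0, 600.0, 1800.0, 2400.0])
area_week2 = np.array([500.0, 1000.0, 3000.0, 4000.0])

def fit_line(conc, area):
    # Least-squares calibration line: area = slope * conc + intercept
    slope, intercept = np.polyfit(conc, area, 1)
    # Correlation coefficient, the usual linearity check
    r = np.corrcoef(conc, area)[0, 1]
    return slope, intercept, r

unknown_area = 1500.0  # hypothetical sample peak area

for label, area in (("week 1", area_week1), ("week 2", area_week2)):
    slope, intercept, r = fit_line(conc, area)
    ppm = (unknown_area - intercept) / slope
    print(f"{label}: slope = {slope:.1f} area/ppm, r = {r:.4f}, "
          f"area {unknown_area:.0f} -> {ppm:.1f} ppm")

Both fits report r = 1, yet the same area of 1500 reads as 125 ppm against the week 1 curve and 75 ppm against the week 2 curve, which is why linearity alone says nothing about the drift in sensitivity between calibrations.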