Hey guys! I've run into a weird PDA (photodiode array) detector problem. Here is the question:
A diode array detector contains 2048 pixels arranged in a line. The array is 40 mm long. The array is mounted inside a spectrophotometer, 100 mm from a grating polychromator with an angular dispersion D = 9.55×10⁴ rad·m⁻¹. What is the difference in wavelength, in nm, falling on adjacent pixels in the array?
To solve the problem I made a diagram (in the attachment):
In the diagram I drew a triangle with three sides: one is the spacing between adjacent pixels (denoted "d"); one is a ray of light exiting the polychromator at wavelength λ₁; the third is another exiting ray at wavelength λ₂. I think the angle between the two exiting rays corresponds to the given angular dispersion (denoted α). Also, the height of the triangle equals the distance from the polychromator to the array (denoted a). My idea was that to solve Q6 I need to calculate λ₁ and λ₂ and then subtract them to get Δλ. I have tried every trigonometric method I know to calculate λ₁, λ₂, and Δλ, but all failed.
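For what it's worth, here is a quick numeric sketch of the other approach I considered, assuming the small-angle relation Δλ ≈ Δθ / D, where Δθ = Δx / F is the angle subtended by one pixel as seen from the grating (variable names are mine; I'm not sure this is the intended method):

```python
# Small-angle sketch: wavelength interval falling on one pixel.
# Assumes dtheta/dlambda = D, so dlambda = dtheta / D, with dtheta = pixel_pitch / F.

n_pixels = 2048
array_length = 40e-3   # m (40 mm array)
F = 100e-3             # m (distance from polychromator to array)
D = 9.55e4             # rad/m (given angular dispersion)

pixel_pitch = array_length / n_pixels   # m, spacing between adjacent pixels
delta_theta = pixel_pitch / F           # rad, angle subtended by one pixel
delta_lambda = delta_theta / D          # m, wavelength interval per pixel

print(delta_lambda * 1e9)               # in nm, roughly 2.05
```

If this relation is right, the answer would be on the order of 2 nm per pixel, but I'd appreciate confirmation that this is how the angular dispersion is supposed to be used.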
Would anybody help me out?
Thank you very much!