So, I have a homework question:
"What is the boiling point of water 2 miles above sea level? Assume that the atmosphere follows the barometric formula with M = .0289 kg/mol and T = 300 K. Assume the enthalpy of vaporization of water is 44.0 kJ/mol independent of temperature."
We're also supposed to assume that the pressure at sea level is 1 atm.
So, barometric formula:
P = P_standard * e^(-(g * M * h) / (R * T))
I calculated the pressure at 2 miles (~3219 m) above sea level using the barometric formula (I got 0.365 atm).
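Here's a minimal Python sketch of that step (g = 9.81 m/s^2 and the 2 mi ≈ 3219 m conversion are values I assumed, not given in the problem):

```python
import math

# Values given in the problem (g and the mile conversion are my assumptions)
P0 = 1.0            # sea-level pressure, atm
M = 0.0289          # molar mass of air, kg/mol
T = 300.0           # atmosphere temperature, K
R = 8.314           # gas constant, J/(mol*K)
g = 9.81            # gravitational acceleration, m/s^2
h = 2 * 1609.34     # 2 miles in meters, ~3219 m

# Barometric formula: P = P0 * exp(-g*M*h / (R*T))
P = P0 * math.exp(-g * M * h / (R * T))
print(f"Pressure at {h:.0f} m: {P:.3f} atm")
```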
Then I used the Clapeyron equation, dP/dT = ΔH_vap / (T * (V_m,steam - V_m,water)), to get dP/dT.
I then inverted that to get dT/dP in units of K/Pa.
I converted that value to K/atm and multiplied it by the change in pressure (from the barometric formula) to get how much the temperature changed.
Then I calculated the new boiling point by applying that temperature change to the normal boiling point.
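Put into code, that linearized step looks something like this (just a sketch of my approach; the molar volumes, the ideal-gas estimate for steam, and starting from 373.15 K are my assumptions, not given in the problem):

```python
# My linearized Clapeyron step; the molar volumes are my assumptions
# (ideal-gas estimate for steam, ~18 cm^3/mol for liquid water).
R = 8.314            # gas constant, J/(mol*K)
T_bp = 373.15        # normal boiling point of water, K
P_sea = 101325.0     # 1 atm in Pa
dH_vap = 44.0e3      # enthalpy of vaporization, J/mol (given)

Vm_steam = R * T_bp / P_sea   # ideal-gas molar volume of steam, ~0.0306 m^3/mol
Vm_liquid = 18.0e-6           # molar volume of liquid water, m^3/mol

# Clapeyron equation: dP/dT = dH_vap / (T * (Vm_steam - Vm_liquid))
dP_dT = dH_vap / (T_bp * (Vm_steam - Vm_liquid))   # Pa/K
dT_dP = 1.0 / dP_dT                                # K/Pa
dT_dP_atm = dT_dP * 101325.0                       # K/atm

P_altitude = 0.365           # atm, the value I got from the barometric formula
delta_T = dT_dP_atm * (P_altitude - 1.0)           # pressure drop -> negative dT
T_new = T_bp + delta_T

print(f"dT/dP = {dT_dP_atm:.1f} K/atm, new boiling point = {T_new - 273.15:.1f} C")
```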
Now, unless my book is wrong (the book answer is 90.6 degrees C), I'm slipping up somewhere along the way.
I was wondering if there was a blatant error in the logic I used for this problem.
Thanks