# Physics307L F08:People/McCoy/Balmer

SJK 00:21, 3 November 2008 (EST)
Overall, you did a very nice job on this lab. Your use of linear fitting w/ error, calculation of SEM, and statistical comparisons was very good. You also were bold in your attempt to calibrate the data -- but unfortunately made an error. In this lab, your measurements are so precise, that the error in calibration is not totally obvious--so it's a little forgivable to miss it. However, you would have picked up on it by asking yourself the question, "is the calibration changing these numbers in the correct direction?" I also would argue philosophically with your calibration method (without the error).

## Background

For the weeks of October 13 and October 20, I chose to do the Balmer series lab, in which I used a constant-deviation spectrometer to measure the wavelengths of the Balmer series of the hydrogen atom. The Balmer series is the set of spectral lines generated by electrons transitioning from higher energy levels down to the second energy level; these transitions emit light in the visible spectrum. The lines can be measured accurately with a Pellin-Broca constant-deviation prism: because the prism's index of refraction depends on wavelength, rotating it selects which wavelength exits at the fixed viewing angle. The point of the lab was to measure the Rydberg constant, which relates wavelength to the quantum numbers of a transition for a given series, such as the Balmer or Lyman series.

• A description of the apparatus and materials used can be found in my lab notebook, which can be accessed here.
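As background for the calculations that follow, the Balmer-series wavelengths come from the Rydberg formula, ${\displaystyle 1/\lambda =R(1/2^{2}-1/n^{2})}$. A minimal sketch using the accepted Rydberg constant (this gives vacuum wavelengths; the line positions quoted later are air values, which differ by roughly 0.2 nm):

```python
# Sketch: Balmer-series wavelengths from the Rydberg formula
#   1/lambda = R * (1/2^2 - 1/n^2)
# using the accepted Rydberg constant for hydrogen.
R_H = 1.0967758e7  # m^-1, accepted value (quoted in the Results section)

def balmer_wavelength_nm(n):
    """Vacuum wavelength (nm) of the n -> 2 transition."""
    inv_lam = R_H * (1.0 / 2**2 - 1.0 / n**2)  # m^-1
    return 1e9 / inv_lam

for n in range(3, 7):
    print(f"n = {n}: {balmer_wavelength_nm(n):.1f} nm")
```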

## Data

My raw data can be found in the first portion of my lab notebook, which is linked above. To take data, I first calibrated the spectrometer using the lines generated by a mercury vapor lamp: by comparing their observed positions to the known wavelengths of those lines, I obtained a regression line that calibrates the spectrometer. Having done this, I observed the lines of hydrogen and deuterium, so that I could try to measure a difference between the spectra of the two gases as well as calculate the Rydberg constant of hydrogen. After calibrating the spectrometer and measuring the lines of hydrogen and deuterium, I calculated the Rydberg constant and the errors due to the calibration.

• All the calculations and results can be found here
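The calibration step described above (a regression of known mercury wavelengths against spectrometer readings) can be sketched as follows. The mercury wavelengths are accepted values, but the dial readings here are made-up illustrations, not my actual notebook data:

```python
import numpy as np

# Known mercury emission wavelengths (nm) and hypothetical dial readings.
known_hg = np.array([404.7, 435.8, 546.1, 577.0, 579.1])  # accepted Hg lines
measured = np.array([404.9, 436.1, 546.6, 577.7, 579.8])  # example readings only

# Least-squares line: corrected_wavelength = slope * reading + intercept
slope, intercept = np.polyfit(measured, known_hg, 1)

def calibrate(reading_nm):
    """Map a raw spectrometer reading onto the calibrated wavelength scale."""
    return slope * reading_nm + intercept
```

The hydrogen and deuterium readings are then passed through `calibrate` before computing the Rydberg constant.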

## Results

After doing all the calculations I found the final value for the Rydberg Constant such that

• ${\displaystyle R=1.09556(28)\times 10^{7}m^{-1}}$
This value, when compared to the known value of the Rydberg constant, is significantly low: a 99.7 percent confidence interval stretched about my value of the Rydberg constant does not include the accepted value, which is ${\displaystyle 1.0967758\times 10^{7}m^{-1}}$.SJK 00:16, 3 November 2008 (EST)
This is exactly the right way to compare your measurement with the accepted value. Very nice use of the confidence interval / normal distribution.
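The comparison above can be sketched numerically; the SEM here is taken from the parenthetical uncertainty in the quoted result:

```python
# Sketch of the 99.7% confidence-interval comparison, using the values quoted above.
mean_R = 1.09556e7        # my measured Rydberg constant, m^-1
sem_R = 0.00028e7         # standard error of the mean, from the quoted uncertainty
accepted_R = 1.0967758e7  # accepted value, m^-1

# For a normal distribution, a 99.7% confidence interval is about mean +/- 3*SEM.
lower, upper = mean_R - 3 * sem_R, mean_R + 3 * sem_R
print(lower <= accepted_R <= upper)  # prints False: the accepted value lies outside
```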

Because my value has such a significant discrepancy from the known value, my first assumption was that there was an error in my measurements or calculations that put too much weight on the highest calibration point, which deviated from its known value significantly more than any other point.

The lab manual asks us to observe the double yellow lines given off by sodium and to see whether they can be accurately resolved. Because I did not have a sodium bulb, I was unable to check this directly. However, since my spectral lines had a standard error on the order of one-tenth of a nanometer, the double yellow lines of sodium could probably be resolved; accurately measuring their wavelengths, though, does not seem feasible, because resolving them would require closing the slit to the point where the lines can no longer be made out.

The second thing we were asked was to calculate the Rydberg constant for deuterium, determine from it the wavelength of the H-alpha line, and compare that to hydrogen's. From my calculations, the wavelength of the alpha line for deuterium would be ${\displaystyle H_{a}=656.1nm}$, whereas that of hydrogen is known to be ${\displaystyle H_{a}=656.3nm}$, as seen in the full table of measured values from Georgia State University's HyperPhysics here. This difference is only twice the measured error and lies within the calibration error; moreover, the alpha line sits near the infrared end of the spectrum, where my measured values deviated most from the accepted ones. For these reasons, I do not think it is possible to resolve the difference between hydrogen and deuterium with this apparatus.
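The deuterium H-alpha value follows from the same Rydberg formula, evaluated with deuterium's constant. The value of `R_D` below is a hypothetical stand-in chosen to reproduce the 656.1 nm quoted above, not my actual fitted result (which is in the calculations linked earlier):

```python
# Sketch: H-alpha (n=3 -> n=2) wavelength from a gas's Rydberg constant.
def h_alpha_nm(R):
    """H-alpha wavelength in nm, given Rydberg constant R in m^-1."""
    return 1e9 / (R * (1.0 / 2**2 - 1.0 / 3**2))

R_D = 1.09738e7  # m^-1, hypothetical deuterium value (not my fitted result)
print(f"{h_alpha_nm(R_D):.1f} nm")  # prints 656.1 nm
```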

## Causes of Error

The greatest cause of error in this experiment was the calibration of the spectrometer and the large discrepancy that occurred at high wavelengths. This could be due to the prism not being clean or being partially worn by exposure to air, as it is old, or it could be a problem caused by gear backlash: as the spectrometer nears its maximum and minimum settings, the gears may not mesh as well as in the middle of the range. The solution I would try first is replacing the prism and seeing whether that gives better results. The second cause of error, gear backlash, could arise because imperfectly cut gears shift the relative position of the readings differently across the spectrometer's range; the only fixes would be taking apart the gear box or purchasing a new spectrometer, neither of which I consider a viable alternative. A third cause of error is the lack of a sodium bulb, which prevented us from characterizing the resolving power of the spectrometer; that would be a relatively easy problem to fix, as purchasing a new bulb is a very viable solution.