Balmer Lab: 10/1326/2008
Calibration of the spectrometer was done in Microsoft Excel by fitting a least-squares regression line to the known and measured values. In doing so, I found slopes for the two calibration sets of:
$m=1.0034$ and $m=1.0021$.
I fit the data with the y-intercept fixed at zero to get a direct proportionality between the measured and known wavelengths, and found the coefficient of determination to be greater than 99.99% for both calibration sets.
This proves that a linear regression is valid for calculating the known and measured values from each other.
 Steve Koch: I don't think this proves it, because in this particular lab your precision is at the 0.01% level!
 The two calibration lines are shown below.
My full calculations can be found in the linked Excel file.
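A minimal sketch of the zero-intercept fit described above. The calibration wavelengths below are hypothetical placeholders (the real data live in the Excel file); the slope of a least-squares line forced through the origin reduces to a single closed-form expression.

```python
import numpy as np

# Hypothetical calibration data (nm); the actual values are in the Excel file.
measured = np.array([435.8, 546.1, 577.0, 579.1])   # spectrometer readings
known    = np.array([435.5, 546.7, 577.6, 579.7])   # accepted line positions

# Least-squares slope with the y-intercept fixed at zero:
# minimizing sum((known - m*measured)^2) gives m = sum(x*y) / sum(x*x)
m = np.sum(measured * known) / np.sum(measured * measured)

# Coefficient of determination for the zero-intercept fit
residuals = known - m * measured
r2 = 1 - np.sum(residuals**2) / np.sum((known - known.mean())**2)
print(m, r2)
```

With placeholder data this close to the identity line, the slope comes out near 1 and the coefficient of determination is very high, as in the calibration sets above.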
Doing so resulted in adjusted values for the spectral lines:
Adjusted Values of Hydrogen Spectral Lines

| Quantum # | Wavelength (nm) | Wavelength (nm) |
|-----------|-----------------|-----------------|
| 3         | 656.36          | 658.01          |
| 4         | 486.58          | 486.77          |
| 5         | 434.67          | 434.89          |
| 6         | 410.83          | 410.67          |
SJK 00:07, 3 November 2008 (EST): I'm happy to see that you attempted to use the calibration to correct your measurements. Most people just sort of check that things are roughly OK. That said, I think you did not do it properly! Look at your numbers for the n=3 quantum number, for example: your calibration method pushed the value to a larger wavelength, whereas in your calibration the known value was lower than your measurement. So, I think you should divide, not multiply, by the slope. But even then, I don't see why forcing the line through the origin is appropriate: how do you know you don't have an offset? Finally, what could make even more sense (potentially) would be to interpolate your calibration (use a linear interpolation between the neighboring calibration points). I give an example of this by modifying your Excel sheet: File:Balmer Calculations Chad SJK.xls

I got these values by multiplying the slope of my regression line by their measured value.
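The interpolation approach suggested in the comment above can be sketched with NumPy's `np.interp`, which does exactly this piecewise-linear mapping between neighboring calibration points. The calibration pairs below are hypothetical placeholders, not the lab's actual data.

```python
import numpy as np

# Hypothetical calibration points (nm): raw readings and the accepted values
# they correspond to. np.interp requires the x-coordinates in ascending order.
cal_measured = np.array([404.7, 435.9, 546.3, 577.2])
cal_known    = np.array([404.7, 435.8, 546.1, 577.0])

def correct(reading_nm):
    """Map a raw reading onto the known scale by linear interpolation
    between the two neighboring calibration points."""
    return np.interp(reading_nm, cal_measured, cal_known)

print(correct(486.5))
```

Unlike a single global slope, this lets the correction vary across the spectrum, which matters if the instrument's error is not a pure scale factor.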
 From here I was able to calculate the Rydberg Constant using each wavelength and the formula:
$\frac{1}{\lambda}=R\left(\frac{1}{2^2}-\frac{1}{n^2}\right)$
where $R$ is the Rydberg constant, $\lambda$ is the wavelength, and $n$ is the quantum number.
 Solving the equation for R leads to:
$R=\frac{1}{\lambda\left(\frac{1}{2^2}-\frac{1}{n^2}\right)}$
Plugging in each value of $\lambda$ and $n$ gave 8 values of $R$; averaging these values yields:
$R=1.09556\times10^{7}\,\mathrm{m}^{-1}$
The standard error in $R$, computed strictly from the spread of the 8 values, I found to be $SE_R=2834\,\mathrm{m}^{-1}$.
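The averaging step can be sketched directly from the adjusted wavelengths in the table above. Note this reproduces the quoted mean, while the standard error lands near (not exactly at) the quoted figure, presumably due to rounding in intermediate steps.

```python
import numpy as np

# Adjusted wavelengths from the table above (nm), two per quantum number n
lines = {3: [656.36, 658.01], 4: [486.58, 486.77],
         5: [434.67, 434.89], 6: [410.83, 410.67]}

R_vals = []
for n, wavelengths in lines.items():
    for lam_nm in wavelengths:
        lam = lam_nm * 1e-9                       # convert nm -> m
        R_vals.append(1 / (lam * (1/2**2 - 1/n**2)))

R_vals = np.array(R_vals)
R_mean = R_vals.mean()
SE_R = R_vals.std(ddof=1) / np.sqrt(len(R_vals))  # standard error of the mean
print(R_mean, SE_R)
```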
Using the standard error of the projected $y$-values, as given by the LINEST function, along with the standard error of the data set, I calculated the error from the measurement and calibration and found a value of $CalError=145\,\mathrm{m}^{-1}$.
SJK 00:09, 3 November 2008 (EST): This part is confusing to me.

By summing the errors in quadrature, I was able to find the overall error as:
$Error_R=\sqrt{2834^{2}+145^{2}}=2838\,\mathrm{m}^{-1}$
 Therefore my final value for the Rydberg Constant can be written as:
$R=1.09556(28)\times10^{7}\,\mathrm{m}^{-1}$
Comparing my value to the accepted value of the Rydberg constant, $R=1.0967758\times10^{7}\,\mathrm{m}^{-1}$, shows that the accepted value does not lie within 3 standard deviations of my mean: a 99.7% confidence interval corresponds to $R=1.09556(85)\times10^{7}\,\mathrm{m}^{-1}$, so my answer is significantly lower than the accepted value.
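The 3-sigma check above can be written out numerically, using only the values quoted in this section:

```python
R_meas  = 1.09556e7    # my averaged value (1/m)
sigma_R = 2838         # combined error from quadrature sum (1/m)
R_known = 1.0967758e7  # accepted Rydberg constant (1/m)

discrepancy = abs(R_known - R_meas)
# Discrepancy is about 12158 1/m, versus a 3-sigma band of about 8514 1/m,
# so the accepted value falls outside the 99.7% confidence interval.
print(discrepancy, 3 * sigma_R, discrepancy > 3 * sigma_R)
```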
 Rydberg constant of Deuterium:
$R=\left(\frac{2\pi^2 m e^4}{h^3 c}\right)\left(\frac{\mu_0 c^2}{4\pi}\right)^2=1.0973733\times10^{7}\,\mathrm{m}^{-1}$
 From this the alpha line can be found as:
$\lambda=\frac{1}{R\left(\frac{1}{2^2}-\frac{1}{3^2}\right)}=\frac{1}{1.0973733\times10^{7}\times 0.13889}=656.1\,\mathrm{nm}$
Comparing this deuterium alpha line to the known hydrogen value gives a separation of about 0.2 nm between the lines.
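The deuterium alpha-line calculation can be checked numerically. The 656.28 nm hydrogen reference below is an assumption (the commonly tabulated H-alpha wavelength), used only to reproduce the roughly 0.2 nm separation:

```python
R_D = 1.0973733e7   # deuterium Rydberg constant (1/m), from above

# Balmer alpha line: transition n = 3 -> 2
lam_D = 1 / (R_D * (1/2**2 - 1/3**2))   # wavelength in m
lam_D_nm = lam_D * 1e9

H_alpha_nm = 656.28  # assumed standard hydrogen alpha wavelength (nm)
print(lam_D_nm, H_alpha_nm - lam_D_nm)
```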
