Based upon the currently accepted value for the Hubble constant, what would be the minimum distance needed to detect a red shift in starlight? Specifically, I am wondering whether the 100,000 light-year size of the Milky Way galaxy would be enough distance to detect a red shift.
Edit: I know we could measure a Doppler shift at those distances, but I'm wondering if calculations based upon the Hubble constant alone would produce a detectable red shift.
Wiki article on Hubble's law
The most recent calculation of the proportionality constant used 2003 data from the satellite WMAP combined with other astronomical data, and yielded a value of H0 = 70.1 ± 1.3 (km/s)/Mpc. This value agrees well with that of H0 = 72 ± 8 (km/s)/Mpc obtained in 2001 by using NASA's Hubble Space Telescope[4]. In August 2006, a less precise figure was obtained independently using data from NASA's orbital Chandra X-ray Observatory: H0 = 77 (km/s)/Mpc, or about 2.5×10^−18 s^−1, with an uncertainty of ±15%.[5] NASA summarizes existing data to indicate a constant of 70.8 ± 1.6 (km/s)/Mpc if space is assumed to be flat, or 70.8 ± 4.0 (km/s)/Mpc otherwise.[6] (2.3×10^−18 s^−1 = 1/(13.8 billion years))
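For a rough feel for the numbers, here is a back-of-the-envelope sketch (my own, not from the quoted article). It assumes the flat-space value H0 = 70.8 (km/s)/Mpc quoted above, the 100,000 light-year diameter from the question, and the small-velocity approximation z ≈ v/c:

```python
# Hubble's law: v = H0 * d, then z ≈ v / c for v << c (assumed approximation).

H0 = 70.8            # km/s per Mpc (flat-space value quoted above)
c = 299_792.458      # speed of light, km/s

LY_PER_MPC = 3.2616e6    # light years per megaparsec
d_ly = 100_000           # assumed Milky Way diameter, light years
d_mpc = d_ly / LY_PER_MPC

v = H0 * d_mpc           # recession velocity in km/s
z = v / c                # non-relativistic redshift

print(f"d = {d_mpc:.4e} Mpc, v = {v:.2f} km/s, z = {z:.2e}")
# roughly v ≈ 2 km/s and z ≈ 7e-6
```

Under those assumptions the Hubble-flow redshift across the galaxy comes out around z ~ 10^−5, which is tiny compared with the Doppler shifts from typical stellar peculiar velocities of tens to hundreds of km/s, so the cosmological contribution alone would be swamped at that distance.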