Question
I know that for nearby stars (<50 LY) we can use the parallax effect, and for distant galaxies we use redshift (and Hubble's constant). So how do we measure the distance to a star that is, say, 50,000 LY from Earth?
I know I am missing something, I just don't know what.
How can we assume there is a relationship between redshift and distance when the stars act like a (turbulent) fluid within the galaxy and can be moving in all sorts of different ways?
Edit: This is a physics question, because I really want to know what the velocity model for stars within the galaxy is, in order to use redshift. My instinct tells me that stars move like a swarm within the galaxy, with an overall rotation about the galaxy's center of gravity (in agreement with the edited horowitz answer).
Explanation / Answer
One neat trick for middle ranges requires a dynamically bound system whose components have measurable proper motions. There are a reasonable number of globular clusters that qualify.
If you project those motions across the sky, they will appear to converge (to some approximation) at two points (one ahead of the motion and one behind it), and those points give the direction of the cluster's real motion. Combine that with the measured line-of-sight velocity (from Doppler shifts of spectral lines), and you know the total velocity and can compute the distance.
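The geometry above is the moving-cluster (convergent-point) method. A minimal sketch, assuming the standard relations v_t = v_r tan(λ) and v_t = 4.74 μ d (the function name and the sample numbers are illustrative, not taken from any particular catalog):

```python
import math

def moving_cluster_distance(v_radial_km_s, angle_to_convergent_deg,
                            proper_motion_arcsec_yr):
    """Distance (parsecs) from the moving-cluster method.

    The tangential velocity is v_t = v_r * tan(lambda), where lambda is
    the angle between the star and the convergent point.  A star at
    distance d parsecs with proper motion mu arcsec/yr has
    v_t = 4.74 * mu * d  (4.74 km/s corresponds to 1 AU per year),
    so d = v_r * tan(lambda) / (4.74 * mu).
    """
    v_tangential = v_radial_km_s * math.tan(math.radians(angle_to_convergent_deg))
    return v_tangential / (4.74 * proper_motion_arcsec_yr)

# Hyades-like numbers (illustrative only):
# v_r ~ 39 km/s, lambda ~ 30 deg, mu ~ 0.11 arcsec/yr  ->  a few tens of parsecs
print(round(moving_cluster_distance(39.0, 30.0, 0.11), 1))
```

Note that the method only works out to distances where proper motions are still measurable, which is what limits it to relatively nearby clusters.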
For more distant compound objects (clusters and galaxies) that are still too close to use the cosmological scale (we can't use the Hubble relationship on anything in our Local Group, because the velocities arising from dynamical binding are larger than the cosmological recession velocities), one can use Cepheid variables and Type Ia supernovae as standard candles. Measuring the distance to clusters and nearby galaxies (the Magellanic Clouds in particular) also helps with finding distances inside our own galaxy, because it increases the total population of stars we can use to calibrate the HR diagram.
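The standard-candle step reduces to the distance modulus, m − M = 5 log₁₀(d / 10 pc). A minimal sketch (the absolute magnitude M = −4 is an illustrative Cepheid-like value, not a fitted period-luminosity result):

```python
def standard_candle_distance_pc(apparent_mag, absolute_mag):
    """Distance in parsecs from the distance modulus:
    m - M = 5 * log10(d / 10 pc)  =>  d = 10 ** ((m - M + 5) / 5)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A Cepheid with (illustrative) absolute magnitude M = -4 observed at
# apparent magnitude m = 14.5 sits at a distance modulus of 18.5:
print(round(standard_candle_distance_pc(14.5, -4.0)))  # ~50,000 pc
```

This is why a well-calibrated period-luminosity relation matters so much: any error in M propagates exponentially into d.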