A plane flies the 700 miles from Atlanta to Detroit in \(1\frac{1}{4}\) hours. What is the plane's average air speed, in miles per hour?



Use the formula \( r = d \div t \). Substitute 700 for \(d\) and \(1\frac{1}{4}\) for \(t\).
\(r = 700 \div 1\frac{1}{4}\)

\(r = 700 \div \frac{5}{4}\)

\(r = 700 \times \frac{4}{5} = 140 \times 4 = 560\)

The plane's average air speed is 560 miles per hour.
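The same division by a mixed number can be checked with exact fraction arithmetic; here is a minimal sketch using Python's standard `fractions` module (the variable names are illustrative, not part of the problem):

```python
from fractions import Fraction

# Rate = distance / time
distance = 700                 # miles
time = Fraction(5, 4)          # 1 1/4 hours written as the improper fraction 5/4

rate = distance / time         # dividing by 5/4 is multiplying by 4/5
print(rate)                    # 560
```

Using `Fraction` avoids any floating-point rounding, so the result is the exact value 560.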
