To find the distance a pitcher must throw the ball from home plate to second base on a baseball diamond, we can visualize the diamond as a square. Each side of the square, representing the distance between consecutive bases, measures 90 feet.
The path from home plate to second base forms the diagonal of this square. We can use the Pythagorean theorem to calculate the length of this diagonal. The theorem states that in a right triangle, the square of the length of the hypotenuse (the diagonal) is equal to the sum of the squares of the lengths of the other two sides.
In this case, both sides (the distance between home plate and first base, and the distance between first base and second base) are 90 feet. So, we can set up the equation:
Diagonal² = Side1² + Side2²
Diagonal² = 90² + 90²
Calculating this gives us:
Diagonal² = 8100 + 8100 = 16200
Now, taking the square root of both sides to find the diagonal:
Diagonal = √16200 = 90√2
Evaluating the square root gives us:
Diagonal ≈ 127.28 feet
Therefore, the pitcher must throw the ball approximately 127.28 feet to reach second base.
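As a quick check of the arithmetic, here is a minimal Python sketch; the constant name BASE_PATH_FEET is just an illustrative choice, and the script simply applies the Pythagorean theorem to the 90-foot base paths:

```python
import math

BASE_PATH_FEET = 90.0  # distance between consecutive bases, in feet

# Pythagorean theorem: diagonal^2 = 90^2 + 90^2
diagonal = math.sqrt(BASE_PATH_FEET**2 + BASE_PATH_FEET**2)

# The closed form 90 * sqrt(2) gives the same value
assert abs(diagonal - BASE_PATH_FEET * math.sqrt(2)) < 1e-9

print(f"Home plate to second base: {diagonal:.2f} feet")  # prints 127.28
```

Running the script prints 127.28 feet, matching the hand calculation above.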