To determine whether the given geometric series is convergent or divergent, we start by identifying the first term and the common ratio. The series is: 10 + 2 + 0.4 + 0.08 + ...
The first term (a) of the series is 10. To find the common ratio (r), we can divide the second term by the first term:
r = 2 / 10 = 0.2
Now, if we multiply the first term by the common ratio, it should produce the second term:
10 * 0.2 = 2
We can check the next terms as well:
2 * 0.2 = 0.4
0.4 * 0.2 = 0.08
Since the ratio between consecutive terms is the same (0.2), this confirms we have a geometric series.
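As a quick sanity check, we can sketch this ratio test in Python (the list of terms and the tolerance are just illustrative choices, not part of the original problem):

```python
import math

# Terms of the series as given in the problem.
terms = [10, 2, 0.4, 0.08]

# Ratio of each term to the one before it; for a geometric
# series these should all be equal.
ratios = [terms[i + 1] / terms[i] for i in range(len(terms) - 1)]

# Use a tolerance rather than == to allow for floating-point rounding.
is_geometric = all(math.isclose(r, ratios[0]) for r in ratios)
print(ratios, is_geometric)
```

Every ratio comes out as 0.2 (up to floating-point rounding), confirming the series is geometric.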
To determine convergence, we check the absolute value of the common ratio. A geometric series converges if the absolute value of the common ratio is less than 1:
|r| = |0.2| = 0.2
Since 0.2 is less than 1, the series converges.
To find the sum of an infinite geometric series, we use the formula:
S = a / (1 - r)
Where a is the first term and r is the common ratio. Plugging in the values:
S = 10 / (1 - 0.2)
S = 10 / 0.8
S = 12.5
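We can also verify this numerically: because |r| < 1, the partial sums should settle on a / (1 - r). The sketch below (the number of terms, 20, is an arbitrary illustrative choice) compares the running partial sum against the closed-form value:

```python
a, r = 10, 0.2

# Accumulate the first 20 terms of the series a, a*r, a*r**2, ...
partial_sum = 0.0
term = float(a)
for _ in range(20):
    partial_sum += term
    term *= r

closed_form = a / (1 - r)  # the infinite-sum formula
print(partial_sum, closed_form)
```

After only 20 terms the partial sum already agrees with the closed-form value 12.5 to well beyond ordinary floating-point display precision, since each additional term shrinks by a factor of 0.2.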
Therefore, the given geometric series is convergent and its sum is 12.5.