To show that 3 plus the square root of 5 is an irrational number, we start by recalling the definition: an irrational number is a real number that cannot be written as a ratio of two integers.
Let’s assume, for the sake of contradiction, that the square root of 5 is rational. Then we can write it as a fraction a/b, where a and b are integers, b ≠ 0, and the fraction is in lowest terms (i.e., gcd(a, b) = 1). Squaring both sides of √5 = a/b yields:
5 = a²/b², or equivalently, a² = 5b².
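For readers who want this step typeset, here is a minimal LaTeX sketch of the assumption and the squaring step, assuming the amsmath and amssymb packages; a and b are the integers from the assumption above:

```latex
% Assume, for contradiction, that sqrt(5) is a fraction a/b in lowest terms.
\begin{align*}
  \sqrt{5} &= \frac{a}{b}, \qquad a, b \in \mathbb{Z},\; b \neq 0,\; \gcd(a, b) = 1, \\
  5        &= \frac{a^{2}}{b^{2}}, \\
  a^{2}    &= 5\,b^{2}.
\end{align*}
```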
From this equation, we see that a² is a multiple of 5. Because 5 is prime, it can divide a² only if it divides a itself (equivalently: every prime in the factorization of a² already appears in a, since squaring only doubles the exponents). Thus, we can write a = 5k for some integer k. Substituting a = 5k back into a² = 5b² gives:
(5k)² = 5b² or 25k² = 5b².
Dividing both sides by 5 leads to 5k² = b², meaning that b² is also a multiple of 5. By the same reasoning as before, b must then be a multiple of 5 as well.
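The same descent step, typeset as a LaTeX (amsmath) fragment; k is the integer introduced above:

```latex
\begin{align*}
  a = 5k \quad&\Longrightarrow\quad (5k)^{2} = 5\,b^{2} \quad\Longrightarrow\quad 25\,k^{2} = 5\,b^{2}, \\
  5\,k^{2} = b^{2} \quad&\Longrightarrow\quad 5 \mid b^{2} \quad\Longrightarrow\quad 5 \mid b.
\end{align*}
```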
But if a and b are both divisible by 5, they share 5 as a common factor, which contradicts our initial assumption that gcd(a, b) = 1. Therefore, the assumption that the square root of 5 is rational must be false, implying that √5 is irrational.
Now, we return to our original expression, which is 3 + √5. Since 3 is a rational number and the sum of a rational number and an irrational number is always irrational, we can conclude that 3 + √5 is irrational.
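That last claim is easy to justify directly: if 3 + √5 were rational, subtracting the rational number 3 would leave √5 rational, contradicting what we just proved. Written out as a LaTeX sketch (assuming amssymb), with p and q introduced here purely for illustration:

```latex
% Suppose, for contradiction, that 3 + sqrt(5) = p/q for integers p, q with q != 0.
\[
  3 + \sqrt{5} = \frac{p}{q}
  \quad\Longrightarrow\quad
  \sqrt{5} = \frac{p}{q} - 3 = \frac{p - 3q}{q} \in \mathbb{Q},
\]
which contradicts the irrationality of $\sqrt{5}$ established above.
```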
Thus, we have proven that 3 plus the square root of 5 is indeed an irrational number.