*Algebra*

## Section 15.6. Adjoining Roots

Exercise 15.6.1 Since \(g \mid f\), write \(f = gh\) where \(h \in F[x]\). Then \(f' = g'h + gh'\). We are given that \(g \mid f'\), so \(g \mid f' - gh' = g'h\). By Theorem 12.3.10, \(F[x]\) is a UFD, so the irreducible polynomial \(g\) is prime; therefore \(g \mid g'\) or \(g \mid h\). But \(g\) can divide \(g'\) only if \(g' = 0\), since \(\deg g' < \deg g\); and \(g' \neq 0\), because \(\operatorname{char} F = 0\) and \(g\), being irreducible, is nonconstant. Therefore \(g \mid h\), so \(h = gk\) for some \(k \in F[x]\), and \(f = g^2 k\), that is, \(g^2 \mid f\).
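As a sanity check (illustrative only, not part of the proof), one can verify a concrete instance over \(\Q\): take the irreducible \(g = x^2 + 1\), let \(f = g^2(x + 2)\), and confirm that \(g\) divides both \(f\) and \(f'\). The helpers `mul`, `deriv`, and `rem` below are hand-rolled for this sketch:

```python
from fractions import Fraction

# Polynomials as coefficient lists over Q, constant term first.
def mul(p, q):
    r = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def deriv(p):
    return [i * c for i, c in enumerate(p)][1:]

def rem(p, q):
    """Remainder of p divided by q (polynomial long division over Q)."""
    p = [Fraction(c) for c in p]
    while len(p) >= len(q):
        c = p[-1] / q[-1]
        for i, qc in enumerate(q):
            p[len(p) - len(q) + i] -= c * qc
        p.pop()
    return p

g = [1, 0, 1]                # g = x^2 + 1, irreducible over Q
f = mul(mul(g, g), [2, 1])   # f = g^2 * (x + 2)
fp = deriv(f)
# g divides both f and f', as the exercise predicts for g^2 | f:
print(all(c == 0 for c in rem(f, g)))   # True
print(all(c == 0 for c in rem(fp, g)))  # True
```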

Exercise 15.6.2 Let \(a \in F\). The extension \(F(\sqrt{a})\) is \(F[x]/(x^2 - a)\), which is a field only if \(x^2 - a\) is irreducible, that is, only if \(a\) is not already a square in \(F\). The field \(F(\sqrt{a})\) has \(F\)-basis \(\{1, \sqrt{a}\}\). Suppose \((x + y\sqrt{a})^2 = z\) where \(x, y, z \in F\). Expanding, we obtain \(x^2 + ay^2 + 2xy\sqrt{a} = z\), so that \(x^2 + ay^2 = z\) and \(2xy = 0\). Since \(\operatorname{char} F = 0\), this implies \(x = 0\) or \(y = 0\). If \(y = 0\), then \(x^2 = z\), that is, \(z\) is already a square in \(F\). If \(x = 0\), then \(y^2 = z/a\). So the elements of \(F\) that have square roots in \(F(\sqrt{a})\) are precisely those that have square roots in \(F\), together with the elements \(z \in F\) such that \(z/a\) has a square root in \(F\); in the latter case a square root of \(z\) in \(F(\sqrt{a})\) is \(\sqrt{z/a}\,\sqrt{a}\).
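A small sketch of the two cases, representing \(x + y\sqrt{a}\) as a pair \((x, y)\) and squaring symbolically (the helper `square` is illustrative): with \(a = 2\), the element \(z = 9\) is already a square in \(\Q\), while \(z = 8\) satisfies \(z/a = 4 = 2^2\), so \(\sqrt{8} = 2\sqrt{2}\) in \(\Q(\sqrt{2})\).

```python
def square(x, y, a):
    """(x + y*sqrt(a))^2 = (x^2 + a*y^2) + 2*x*y*sqrt(a), returned as a pair."""
    return (x * x + a * y * y, 2 * x * y)

a = 2
# Case y = 0: z = 9 = 3^2 is a square in Q itself.
print(square(3, 0, a))  # (9, 0)
# Case x = 0: z = 8 with z/a = 4 a square in Q, so sqrt(8) = 2*sqrt(2).
print(square(0, 2, a))  # (8, 0)
```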

Suppose \(a, b \in \Z\) are two squarefree integers, not equal to 1. Then \(\Q(\sqrt{a})\) and \(\Q(\sqrt{b})\) are fields. Obviously \(a\) has a square root in \(\Q(\sqrt{a})\). Since \(b\) does not have a square root in \(\Q\), \(b\) has a square root in \(\Q(\sqrt{a})\) iff \(b/a\) has a square root in \(\Q\). Suppose this is the case. Each prime \(p\) occurs with exponent 0 or 1 in the prime factorization of \(a\), and likewise for \(b\); therefore each prime occurs in \(b/a\) with exponent \(e \in \{-1, 0, 1\}\). For \(b/a\) to be a square in \(\Q\), each such exponent must be even, hence 0, so that \(b = \pm a\). In the case \(b = -a\), we have \(b/a = -1\), which is not a square in \(\Q\). So \(b\) has a square root in \(\Q(\sqrt{a})\) iff \(b = a\). This implies that \(\Q(\sqrt{a})\) and \(\Q(\sqrt{b})\) are nonisomorphic extensions when \(a, b\) are distinct squarefree integers, not equal to 1.
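The claim can be brute-force checked for small squarefree integers. By the previous paragraph, \(b\) has a square root in \(\Q(\sqrt{a})\) iff \(b\) or \(b/a\) is a square in \(\Q\); the sketch below (with illustrative helpers `squarefree` and `is_rational_square`) confirms this happens only when \(b = a\):

```python
from fractions import Fraction
from math import isqrt

def squarefree(n):
    """True if |n| > 1 and n has no repeated prime factor."""
    n = abs(n)
    return n > 1 and all(n % (d * d) for d in range(2, isqrt(n) + 1))

def is_rational_square(q):
    """True if the rational q is the square of a rational."""
    if q < 0:
        return False
    return (isqrt(q.numerator) ** 2 == q.numerator
            and isqrt(q.denominator) ** 2 == q.denominator)

sf = [n for n in range(-30, 31) if squarefree(n)]
for a in sf:
    for b in sf:
        has_root = is_rational_square(Fraction(b)) or is_rational_square(Fraction(b, a))
        assert has_root == (a == b)
print("checked", len(sf), "squarefree values")
```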

Now in general a quadratic extension of \(\Q\) is formed by adjoining some square root \(\sqrt{a}\) to \(\Q\) (Proposition 15.3.3), where \(a \in \Q\) is not a square. There is some square \(q \in \Q\) such that \(aq\) is a squarefree integer, not equal to 1: multiply \(a\) by the square of its denominator to clear fractions, then divide out the largest square factor. Since \(\Q(\sqrt{a}) = \Q(\sqrt{aq})\), the extensions generated by the square roots of squarefree integers other than 1 are in fact all of the quadratic extensions of \(\Q\).
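The normalization just described can be sketched directly: given a nonsquare rational \(a\), multiply by the square of its denominator and strip square factors to obtain the squarefree integer generating the same extension (the helper name `squarefree_part` is illustrative).

```python
from fractions import Fraction
from math import isqrt

def squarefree_part(a):
    """Squarefree integer m with Q(sqrt(a)) = Q(sqrt(m)), for a nonsquare rational a."""
    # a * denominator^2 = numerator * denominator is an integer...
    n = a.numerator * a.denominator
    sign = -1 if n < 0 else 1
    n = abs(n)
    # ...from which we divide out every square factor.
    for d in range(2, isqrt(n) + 1):
        while n % (d * d) == 0:
            n //= d * d
    return sign * n

print(squarefree_part(Fraction(8)))       # 2,   since 8 = 2 * 2^2
print(squarefree_part(Fraction(3, 4)))    # 3,   since (3/4) * 2^2 = 3
print(squarefree_part(Fraction(-18, 5)))  # -10, since (-18/5) * 5^2 = -90 = -10 * 3^2
```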

Exercise 15.6.3 The primitive \(n\)th root of unity
\(\zeta_n = \exp(2\pi i/n)\) generates the same extension field of \(\Q\) as
does \(\zeta_n^e\) where \(e\) is any integer relatively prime to \(n\). So
\(\Q[\sqrt{d}]\) will contain a primitive \(n\)th root of unity iff
it contains \(\zeta_n\). If \([\Q(\zeta_n) : \Q] > 2\) then it is not
possible to have \(\Q(\zeta_n) \subseteq \Q[\sqrt{d}]\). According to
Exercise 15.3.5, a primitive \(n\)th root
of unity has degree 2 over \(\Q\) only when \(n \in \{3, 4, 6\}\), so only these primitive
roots can be contained in a quadratic number field. (We are disregarding \(1\)
and \(-1\), the primitive roots of unity of orders 1 and 2, which lie in \(\Q\)
itself and hence in every quadratic number field.) Now
\begin{align*}
\zeta_3 &= -\frac{1}{2} + i\frac{\sqrt{3}}{2} \in \Q[\sqrt{-3}] \\
\zeta_6 &= \frac{1}{2} + i\frac{\sqrt{3}}{2} \in \Q[\sqrt{-3}] \\
\zeta_4 &= i \in \Q[\sqrt{-1}]
\end{align*}
so the answer to the problem is \(d \in \{-1, -3\}\).
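Using the standard fact that \([\Q(\zeta_n) : \Q] = \varphi(n)\), the claim that the degree is 2 exactly for \(n \in \{3, 4, 6\}\) can be checked computationally, along with the displayed values of \(\zeta_3, \zeta_4, \zeta_6\) (a numerical sketch, not a proof):

```python
from cmath import exp, pi
from math import gcd

def phi(n):
    """Euler's totient, the degree of the cyclotomic extension Q(zeta_n)/Q."""
    return sum(1 for k in range(1, n + 1) if gcd(k, n) == 1)

# phi(n) = 2 exactly for n in {3, 4, 6} (phi(n) grows, so a finite scan suffices here).
assert [n for n in range(1, 200) if phi(n) == 2] == [3, 4, 6]

# Numeric check of the displayed roots of unity.
zeta = lambda n: exp(2j * pi / n)
assert abs(zeta(3) - (-0.5 + 1j * 3 ** 0.5 / 2)) < 1e-12
assert abs(zeta(6) - (0.5 + 1j * 3 ** 0.5 / 2)) < 1e-12
assert abs(zeta(4) - 1j) < 1e-12
print("degree-2 cases:", [n for n in range(1, 200) if phi(n) == 2])
```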