Recreational Math
I enjoy recreational math. I am especially fascinated by topics in modern/abstract algebra, category theory, and unexpectedly powerful abstractions in general. Here is a collection of mathematical observations, results, and questions that I think are interesting. There may be mistakes or simplifications in the following. (Some of these are taken from Proofs from The Book.)
- Relating dimensionality to algebraic structure: every finite division ring is a field (Wedderburn’s Little Theorem). A division ring is a ring in which every nonzero element has a multiplicative inverse, but multiplication need not be commutative (e.g., the quaternions). Surprisingly, if the division ring is finite, then multiplication must commute, making it a field. One intuition for this relationship between size and commutativity is that infinite structures provide enough “room” to accommodate noncommutative behavior, such as directional or rotational distinctions, whereas finite systems are constrained by their limited algebraic and combinatorial possibilities, forcing multiplication to be commutative.
- Geometry of roots: if \(P\) is a nonconstant polynomial, then all roots of its derivative \(P'\) lie within the convex hull of the roots of \(P\) (Gauss-Lucas Theorem).
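A quick numerical sanity check (not a proof), assuming numpy and scipy are available; the polynomial \(z^3 - 1\) is an arbitrary example, and convex-hull membership is tested as a small linear-programming feasibility problem:

```python
# Numerical spot check of Gauss-Lucas: every root of P' should be a convex
# combination of the roots of P. Illustration only, not a proof.
import numpy as np
from scipy.optimize import linprog

P = [1, 0, 0, -1]                # P(z) = z^3 - 1
roots = np.roots(P)              # roots of P (the cube roots of unity)
crit = np.roots(np.polyder(P))   # roots of P' (a double root at 0)

def in_convex_hull(w, pts):
    """Is w a convex combination of pts? Look for weights l >= 0 with sum 1."""
    A_eq = np.vstack([pts.real, pts.imag, np.ones(len(pts))])
    b_eq = np.array([w.real, w.imag, 1.0])
    res = linprog(np.zeros(len(pts)), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return res.success

print(all(in_convex_hull(w, roots) for w in crit))   # expected: True
```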
- Infinitely many proofs for the infinitude of the primes. Consider a sequence \(S\) of integers for which one can show that the set of primes \(\mathbb{P}_S\) dividing some member of \(S\) is infinite. Then \(S\) provides its own proof of the infinitude of the primes. The Fermat numbers \(F_n = 2^{2^n} + 1\) are an example of such a sequence: they are pairwise coprime, so each one contributes at least one new prime. The set of all nonzero values \(\{ p(n) : p(n) \neq 0, n \in \mathbb{N} \}\) of some nonconstant polynomial \(p(x)\) with integer coefficients (e.g., \(p(x) = x^2 + 1\)) is also such a “proof of the infinitude of the primes”. Since there are infinitely many nonconstant polynomials, there are infinitely many such “proofs”.
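As a toy illustration of the polynomial version (a sketch assuming sympy; it only gathers evidence and is not the proof), the set of primes dividing values of \(p(n) = n^2 + 1\) keeps growing as we extend the range:

```python
# Collect the primes dividing p(n) = n^2 + 1 for small n. The theorem says
# this set cannot be finite for any nonconstant integer polynomial.
from sympy import factorint

primes = set()
for n in range(1, 200):
    primes |= set(factorint(n**2 + 1))   # prime factors of p(n)
print(len(primes), sorted(primes)[:10])
```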
- Algebra and 5. A set of cool facts about the number 5.
- There is no ring whose unit group is cyclic with order 5.
- All groups of order 5 or less must be abelian.
- There is no general formula using radicals for the roots of polynomials of degree 5 or higher (Abel-Ruffini Theorem), though particular such polynomials may still be solvable by radicals.
- Galois’ criterion. A polynomial \(f(x)\) over a field \(K\) is solvable by radicals iff its Galois group \(\text{Gal}(f / K)\) is a solvable group, meaning it has a chain of subgroups where each is normal in the previous one and the corresponding quotient groups are abelian.
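A tiny computational companion to the last two items (a sketch assuming sympy’s permutation-group API; the group sizes are just examples): the symmetric groups up to \(S_4\) are solvable, while \(S_5\), the Galois group of a generic quintic, is not.

```python
# S_2, S_3, S_4 are solvable; S_5 is not -- which is why degree-5 polynomials
# have no general solution in radicals.
from sympy.combinatorics.named_groups import SymmetricGroup

for n in range(2, 6):
    print(n, SymmetricGroup(n).is_solvable)
# expected: True, True, True, False
```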
- Reals as a vector space over the rationals. The reals can be constructed as equivalence classes of Cauchy sequences of rationals (identifying sequences whose difference tends to zero), with addition and multiplication defined component-wise. This makes the reals a vector space (of infinite dimension) over the rationals. If the axiom of choice holds, then this vector space has a basis (a Hamel basis).
- Leibniz’s proof for the derivative of a quadratic. If \(x\) changes by \(\delta x\), then \(y = x^2\) changes to \(y + \delta y = (x + \delta x)^2 = x^2 + 2x\delta x + (\delta x)^2\). Subtracting \(y = x^2\) gives \(\delta y = 2x\delta x + (\delta x)^2\). Relabel \(\delta x, \delta y\) as \(dx, dy\); as they become ‘infinitely small’, the \((dx)^2\) term becomes negligible compared to \(2x\,dx\). So we get \(dy = 2x\,dx\), or \(dy/dx = 2x\).
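To see the bookkeeping at a glance, here is the same computation in display form, with the cubic case added as an extra illustration of the pattern (the cubic is my addition, not part of the original argument):

\[
\begin{aligned}
(x + dx)^2 &= x^2 + 2x\,dx + (dx)^2 &&\Longrightarrow\; dy = 2x\,dx + (dx)^2 \approx 2x\,dx,\\
(x + dx)^3 &= x^3 + 3x^2\,dx + 3x\,(dx)^2 + (dx)^3 &&\Longrightarrow\; dy = 3x^2\,dx + \cdots \approx 3x^2\,dx,
\end{aligned}
\]

where every term carrying two or more factors of \(dx\) is discarded as negligibly small.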
- Interesting isomorphisms between rings and groups.
- \(\mathbb{R}[x]/(x^2 + 1) \cong \mathbb{C}\).
- The field \(K\) of matrices of the form \(\begin{pmatrix} a & b \\ -b & a \end{pmatrix}\), with \(a, b \in \mathbb{R}\), is isomorphic to the field \(\mathbb{C}\) (see the sketch after this list).
- \(\mathbb{H} \cong \mathbb{C} \oplus \mathbb{C}j\) as a \(\mathbb{C}\)-vector space, where \(\mathbb{H}\) is the ring of quaternions; the Cayley-Dickson construction builds quaternion multiplication on such pairs of complex numbers. (This is not a ring isomorphism with \(\mathbb{C} \times \mathbb{C}\), which is commutative and has zero divisors, unlike \(\mathbb{H}\).)
- \(\mathbb{R} / \mathbb{Z} \cong \{ z \in \mathbb{C} : \vert z \vert = 1 \}\) as groups, via \(x \mapsto e^{2\pi i x}\).
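A quick numerical spot check of the matrix picture of \(\mathbb{C}\) above (a sketch assuming numpy; the map \(a + bi \mapsto \begin{pmatrix} a & b \\ -b & a \end{pmatrix}\) is the standard one):

```python
# Check that the 2x2 matrices [[a, b], [-b, a]] add and multiply exactly like
# the complex numbers a + bi (random spot checks, not a proof).
import numpy as np

def M(z):
    """Matrix representing the complex number z = a + bi."""
    return np.array([[z.real, z.imag], [-z.imag, z.real]])

rng = np.random.default_rng(0)
for _ in range(1000):
    a, b, c, d = rng.standard_normal(4)
    z, w = complex(a, b), complex(c, d)
    assert np.allclose(M(z) @ M(w), M(z * w))   # multiplication matches
    assert np.allclose(M(z) + M(w), M(z + w))   # addition matches
print("ok")
```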
- Irreducibility of polynomials. \(p(x)\) is irreducible in \(F[x]\) iff \(F[x]/(p(x))\) is a field.
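As an illustration (a sketch assuming sympy; the polynomial \(x^2 + 1\) is just a convenient example): \(x^2 + 1\) is irreducible over \(\mathbb{Q}\) and over \(\mathbb{F}_3\), so those quotients are fields, but over \(\mathbb{F}_5\) it factors as \((x+2)(x+3)\), so \(\mathbb{F}_5[x]/(x^2+1)\) has zero divisors and is not a field.

```python
# x^2 + 1 is irreducible over Q and F_3 (quotient is a field) but not over F_5,
# where x^2 + 1 = (x + 2)(x + 3), so F_5[x]/(x^2 + 1) is not a field.
from sympy import Poly, symbols, QQ, GF

x = symbols('x')
for dom in (QQ, GF(3), GF(5)):
    print(dom, Poly(x**2 + 1, x, domain=dom).is_irreducible)
# expected: QQ True, GF(3) True, GF(5) False
```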
- Isomorphism theorems across math (a small worked example follows this list).
- Rings: If \(\phi: R \to S\) is a ring homomorphism, then \(\ker(\phi)\) is an ideal of \(R\) and \(R/\ker(\phi) \cong \text{im}(\phi)\).
- Groups: If \(\phi: G \to H\) is a group homomorphism, then \(\ker(\phi)\) is a normal subgroup of \(G\) and \(G/\ker(\phi) \cong \text{im}(\phi)\).
- Topologies: If \(f: X \to Y\) is a continuous map that is a quotient map onto its image (e.g., if \(X\) is compact and \(Y\) is Hausdorff), then \(X / \sim \cong \text{im} f\) as topological spaces, where \(x \sim y\) iff \(f(x) = f(y)\). (Without that hypothesis, the induced map \(X/\sim \to \text{im} f\) is a continuous bijection but need not be a homeomorphism.)
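As a concrete instance of the group (and ring) versions above (a standard textbook example, included for illustration), take reduction mod 6:

\[
\phi : \mathbb{Z} \to \mathbb{Z}/6\mathbb{Z},\quad \phi(k) = k \bmod 6,
\qquad \ker(\phi) = 6\mathbb{Z},
\qquad \mathbb{Z}/\ker(\phi) = \mathbb{Z}/6\mathbb{Z} \cong \text{im}(\phi).
\]

Here \(\phi\) is surjective, so the image is all of \(\mathbb{Z}/6\mathbb{Z}\); read as a ring homomorphism, the same map illustrates the ring version, with the ideal \(6\mathbb{Z}\) playing the role of the kernel.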
- Abelian groups. Every group of order 5 or less is abelian.
- Cayley’s Theorem. Every group is isomorphic to a subgroup of the symmetric group (permutations) on the group’s elements. Cayley’s theorem establishes that \(G\) can act linearly on the vector space \(F^G\) (functions from \(G\) to \(F\), where \(F\) is a field), whose basis vectors are the indicator functions of the elements of \(G\), and the action of \(G\) on \(F^G\) permutes these basis functions. This makes \(G\) isomorphic to a subgroup of \(\text{GL}(F^G)\) (invertible linear transformations of the vector space \(F^G\)). Now we can construct the regular representation of \(G\) in terms of matrices: let \(\vert G \vert = n\), the dimension of the vector space \(F^G\); each group element \(g\) is represented by a permutation matrix \(M_g\), with exactly one \(1\) in each row and column: the \(1\) in column \(h\) sits in the row indexed by \(gh\) (see the sketch below). This makes \(G\) a subgroup of \(\text{GL}(n, F)\). We can also construct the group algebra \(F[G]\), the vector space over \(F\) with basis given by the elements of \(G\) and multiplication extending the group operation. For finite \(G\) and \(F\) of characteristic \(0\) (or characteristic not dividing \(\vert G \vert\)), the regular representation on \(F[G]\) contains every irreducible representation of \(G\).
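A minimal sketch of the permutation-matrix construction, assuming numpy and using \(\mathbb{Z}/3\mathbb{Z}\) purely for concreteness; the check is that \(M_g M_h = M_{gh}\):

```python
# Regular representation of Z/3Z: each element g gets a 3x3 permutation matrix
# M_g sending basis vector e_h to e_{g+h} (the 1 in column h sits in row g+h).
import numpy as np

n = 3
def mul(g, h):            # the group operation (here: addition mod n)
    return (g + h) % n

def M(g):                 # permutation matrix of left multiplication by g
    mat = np.zeros((n, n), dtype=int)
    for h in range(n):
        mat[mul(g, h), h] = 1
    return mat

for g in range(n):
    for h in range(n):
        assert np.array_equal(M(g) @ M(h), M(mul(g, h)))
print("the regular representation respects the group law")
```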
- Wedderburn-Artin Theorem. Every semisimple ring \(R\) is isomorphic to a finite direct sum of matrix rings over division rings. A ring is semisimple if it is Artinian (see the Noetherian/Artinian item below) and has no nonzero two-sided nilpotent ideals, where an ideal \(I\) in a ring \(R\) is nilpotent if there exists an integer \(n\) such that \(I^n = \{0\}\), and two-sided if \(I\) is both a left and a right ideal of \(R\). For example, the following are semisimple rings: all fields, all division rings, \(\text{Mat}_n(F)\), and \(F[G]\) for finite \(G\) when \(F\) has characteristic \(0\) (or characteristic not dividing \(\vert G \vert\), by Maschke’s theorem). A matrix ring \(\text{Mat}_n(D)\) is the ring of \(n \times n\) matrices, in this case over a division ring \(D\), a ring in which every nonzero element has a multiplicative inverse but multiplication need not be commutative (e.g., the quaternions). A direct sum of rings \(R_1 \oplus R_2 \oplus \ldots \oplus R_n\) is the set of tuples \((r_1, r_2, \ldots, r_n)\) where each \(r_i\) is in \(R_i\), with addition and multiplication defined component-wise. The Wedderburn-Artin theorem gives us a complete structural description of semisimple rings and lets us decompose group algebras \(F[G]\) into matrix rings over division rings, whose factors correspond to the irreducible representations of \(G\).
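A concrete instance (a standard representation-theory fact, included for illustration): \(S_3\) has two 1-dimensional irreducible representations (trivial and sign) and one 2-dimensional one, so over \(\mathbb{C}\)

\[
\mathbb{C}[S_3] \;\cong\; \mathbb{C} \oplus \mathbb{C} \oplus \text{Mat}_2(\mathbb{C}),
\qquad 1^2 + 1^2 + 2^2 = 6 = \vert S_3 \vert.
\]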
- Cayley’s Theorem as an instance of the Yoneda Lemma. Let \(F: \text{Grp} \to \text{Set}\) be the forgetful functor which maps each group to its “underlying set” but removes group structure. Cayley’s theorem gives an injective homomorphism \(G \to \text{Aut}(F(G))\), where \(\text{Aut}(F(G))\) is the automorphism group of the underlying set of \(G\) (i.e., \(S(G)\)). The Yoneda lemma implies, via the Yoneda embedding, that any category \(\mathcal{C}\) embeds fully and faithfully into the category of functors \(\text{Set}^{\mathcal{C}^\text{op}}\): the category of functors from \(\mathcal{C}^\text{op}\) to sets, i.e., contravariant functors from \(\mathcal{C}\) to sets. For each object \(C \in \mathcal{C}\), such a functor \(F\) assigns a set \(F(C)\), and for each morphism \(f : C \to D\) in \(\mathcal{C}\), it assigns a function \(F(f) : F(D) \to F(C)\) (reversing the direction of the morphism). We can first interpret \(G\) as a category with a single object \(\cdot\) where morphisms \(\text{Hom}(\cdot, \cdot)\) correspond to elements of \(G\); the functor \(F : G^\text{op} \to \text{Set}\) assigns to \(\cdot\) the underlying set of \(G\), with each morphism \(g \in \text{Hom}(\cdot, \cdot)\) mapped to \(F(g) : G \to G\), defined by precomposition (right multiplication): \(F(g)(x) = xg\); now the action of \(G\) on \(F(\cdot) = G\) is interpreted as a functor \(F\) in \(\text{Set}^{G^\text{op}}\), and Cayley’s theorem is this embedding read off on underlying sets.
- “Finite Simple Group of Order Two” ❤️
- Noetherian rings are rings where any ascending chain of ideals in \(R\), \(I_1 \subseteq I_2 \subseteq \ldots\), stabilizes eventually such that there exists some \(n\) where \(I_n = I_{n+1} = \ldots\). Examples of Noetherian rings include all fields, the integers, polynomial rings over fields, and finite rings. There are many interesting properties of Noetherian rings.
- Hilbert’s Basis Theorem: If \(R\) is Noetherian, then so is \(R[x]\).
- Lasker-Noether Theorem: Every ideal \(I\) in a Noetherian ring \(R\) can be written as an intersection of finitely many primary ideals, where \(Q\) is a primary ideal iff \(ab \in Q\) implies \(a \in Q\) or \(b^n \in Q\) for some \(n\). This generalizes the decomposition of integers into prime factors to ideals in Noetherian rings.
- Every Artinian ring is Noetherian, but not vice versa. An Artinian ring is a ring where any descending chain of ideals in \(R\), \(I_1 \supseteq I_2 \supseteq \ldots\), stabilizes eventually such that there exists some \(n\) where \(I_n = I_{n+1} = ...\). Examples of Artinian rings include fields, finite rings, semisimple rings, and \(\mathbb{Z}/n\mathbb{Z}\). \(\mathbb{Z}\) is Noetherian but not Artinian.
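For concreteness: in \(\mathbb{Z}\), every ideal is \((a) = a\mathbb{Z}\) and \((a) \subseteq (b)\) iff \(b \mid a\), so an ascending chain corresponds to a chain of successive divisors and must stabilize, while descending chains can continue forever:

\[
(2) \supsetneq (4) \supsetneq (8) \supsetneq (16) \supsetneq \cdots
\]

which is exactly why \(\mathbb{Z}\) is Noetherian but not Artinian.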
- Finite abelian groups. Every finite abelian group is isomorphic to a direct product of cyclic groups of prime power order (e.g., \(\mathbb{Z}/12\mathbb{Z} \cong \mathbb{Z}/4\mathbb{Z} \times \mathbb{Z}/3\mathbb{Z}\)).
- Finite groups. Every finite group is isomorphic to the Galois group of a finite field extension of \(\mathbb{C}(x)\), where \(\mathbb{C}(x)\) is the field of rational functions in one variable (of the form \(p(x)/q(x)\) for polynomials \(p, q\)) and the Galois group of a field extension \(K/F\) is the group of automorphisms of \(K\) that fix \(F\). Whether the same holds for \(\mathbb{Q}(x)\) is open and closely tied to the inverse Galois problem (next item): by Hilbert’s irreducibility theorem, a realization of a finite group over \(\mathbb{Q}(x)\) can be specialized to a realization over \(\mathbb{Q}\).
- Inverse Galois problem. (Open since the 19th century.) Is every finite group the Galois group of a Galois extension of \(\mathbb{Q}\)? (A Galois extension is a field extension that is normal and separable: “normal” means every irreducible polynomial over the base field with a root in the extension splits completely in it, and “separable” means the minimal polynomial of every element of the extension has distinct roots; for finite extensions, this is equivalent to being the splitting field of a separable polynomial.)
- The opposite of a set is a Boolean algebra (in a category-theoretic sense; more precisely, \(\text{Set}^{op}\) is equivalent to the category of complete atomic Boolean algebras). The category \(\text{Set}\) has sets as objects and functions \(f: X \to Y\) as morphisms; the opposite category \(\text{Set}^{op}\) has the same objects but reverses every arrow, so each \(f: X \to Y\) becomes a morphism \(g: Y \to X\). Naturally, such a reversed map should send elements of \(Y\) to subsets of \(X\) (their preimages), since a function can map multiple elements of its domain to one element of its codomain. To make these reversed maps composable, however, both the domain and codomain should be sets of sets, so that \(g\) maps subsets of \(Y\) to subsets of \(X\). These sets of sets carry a Boolean algebra structure in which conjunction and disjunction are intersection and union. This video describes this better.
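A small sketch of the “reversed arrows act on subsets” idea in plain Python (illustration only): the reversed map is just the preimage \(f^{-1}\), which sends subsets of \(Y\) to subsets of \(X\) and respects the Boolean operations.

```python
# Reversing f: X -> Y gives the preimage map f^{-1}: P(Y) -> P(X) on subsets,
# which preserves unions, intersections, and complements (Boolean structure).
X = {0, 1, 2, 3}
Y = {'a', 'b', 'c'}
f = {0: 'a', 1: 'a', 2: 'b', 3: 'c'}          # a function X -> Y

def preimage(S):                               # f^{-1}(S), a subset of X
    return {x for x in X if f[x] in S}

A, B = {'a'}, {'a', 'b'}
assert preimage(A | B) == preimage(A) | preimage(B)   # preserves union
assert preimage(A & B) == preimage(A) & preimage(B)   # preserves intersection
assert preimage(Y - A) == X - preimage(A)             # preserves complement
print("the preimage map respects the Boolean algebra structure")
```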
- Branching Programs. Barrington’s Theorem: every function \(f : \{0, 1\}^n \to \{0, 1\}\) computable by a Boolean circuit of depth \(d\) can be computed by a branching program of width 5 and length \(O(4^d)\).
- Time Hierarchy. If \(r, t\) are time-constructible functions satisfying \(r(n) \log r(n) = o(t(n))\), then \(\text{DTIME}(r(n)) \subsetneq \text{DTIME}(t(n))\).
- Space Hierarchy. If \(q, s\) are space-constructible functions satisfying \(q(n) = o(s(n))\), then \(\text{DSPACE}(q(n)) \subsetneq \text{DSPACE}(s(n))\).
- Matrix product checking. To check whether \(A \times B = C\) for three \(n \times n\) matrices \(A, B, C\), randomly sample \(r \in \{0, 1\}^n\) and check whether \(A(Br) = Cr\), repeating several times (Freivalds’ algorithm). Each trial takes only \(O(n^2)\) time (three matrix-vector products), and the error probability decreases exponentially in the number of trials.
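A minimal sketch of the check (Freivalds’ algorithm), assuming numpy; the matrix sizes and the corrupted entry are arbitrary choices for the demo:

```python
# Freivalds' check: is A @ B == C? Each trial is O(n^2); k trials give error
# probability at most 2^-k when A @ B != C.
import numpy as np

def freivalds(A, B, C, trials=20, rng=np.random.default_rng()):
    n = C.shape[0]
    for _ in range(trials):
        r = rng.integers(0, 2, size=n)          # random 0/1 vector
        if not np.array_equal(A @ (B @ r), C @ r):
            return False                         # certainly A @ B != C
    return True                                  # probably A @ B == C

rng = np.random.default_rng(0)
A, B = rng.integers(0, 10, (100, 100)), rng.integers(0, 10, (100, 100))
C = A @ B
print(freivalds(A, B, C))                        # True
C[3, 7] += 1                                     # corrupt one entry
print(freivalds(A, B, C))                        # almost surely False
```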
- NP-completeness proofs. I think that many proofs of a problem’s NP-completeness (especially for graph problems) via reductions from CircuitSAT or 3SAT are really nifty. Here is a great resource for some of these proofs.
- A complication for proving \(P ?= NP\). There exists an oracle \(A\) such that \(P^A = NP^A\), and an oracle \(B\) such that \(P^B \neq NP^B\) (Baker-Gill-Solovay). Therefore any proof resolving \(P ?= NP\) cannot relativize, i.e., it cannot remain valid when both classes are given access to an arbitrary oracle, whereas many familiar proof techniques, such as diagonalization, do relativize.
- The blind man. A blind man is trying to buy a red rag and a blue rag. But he is not sure if the salesman is trying to trick him or not. The salesman could, for instance, give him two red rags or two blue rags. The blind man shuffles rags behind his back, but keeps track of the identities of each (so at first he doesn’t know the true colors, but he knows which rag is which). He randomly pulls out a rag and asks the salesman what color it is. Then, he puts it behind his back and shuffles again, repeating this several times. If the salesman gives two conflicting answers for the same rag (e.g., calling it red one time and blue another time), then the blind man knows that he is lying. If, on the other hand, the salesman consistently calls one rag blue and the other rag red, then the blind man is sure that he indeed has one blue and one red rag. The moral of this story is that even a “handicapped” verifier can compute powerful functions with randomness and interaction.
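A toy simulation of the protocol in plain Python (illustration only; the dishonest salesman here guesses colors at random, which is just one adversary among many, though no adversary can pass with probability better than roughly \(2^{-(k-1)}\) over \(k\) queries):

```python
# If the salesman supplied two rags of the same color, his answers cannot
# depend on which rag he is actually shown, so he is caught with high probability.
import random

def run_protocol(k=10, honest=True):
    answers = {0: [], 1: []}               # colors reported for each rag identity
    true_color = {0: 'red', 1: 'blue'}     # the honest salesman's rags
    for _ in range(k):
        rag = random.randrange(2)          # blind man draws a rag at random
        if honest:
            answers[rag].append(true_color[rag])
        else:
            # Two identical rags look the same to the salesman, so his answer
            # cannot depend on `rag`; he effectively guesses.
            answers[rag].append(random.choice(['red', 'blue']))
    reported = [set(a) for a in answers.values() if a]
    consistent = all(len(c) == 1 for c in reported)
    return consistent and len(set.union(*reported)) == 2   # one red, one blue

trials = 10_000
print(sum(run_protocol(honest=True) for _ in range(trials)) / trials)   # ~1 (fails only if a rag is never drawn)
print(sum(run_protocol(honest=False) for _ in range(trials)) / trials)  # ~2^-9 for k = 10
```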