diff --git a/posts/permutations/1/index.qmd b/posts/permutations/1/index.qmd
index f3bf358..9d82a3b 100644
--- a/posts/permutations/1/index.qmd
+++ b/posts/permutations/1/index.qmd
@@ -14,6 +14,11 @@ categories:
- sorting algorithms
---
+
In the time since [my last post](../chebyshev/2) discussing graphs,
I have been spurred on to continue playing with them, with a slight focus on abstract algebra.
@@ -53,7 +58,7 @@ There is a group element for each permutation, so the order of the group is the
On the other hand, the lists differ in the first element and cannot be equal.
Sets are still useful as a container.
- For example, the elements of a group unordered.
+ For example, the elements of a group are unordered.
To keep vocabulary simple, I will do my best to refer to objects in a group as
"group elements" and the objects in a list as "items".
@@ -74,7 +79,7 @@ Take the element of $S_3$ which can be described as the action
Our canonical list in this case is $[1, 2, 3]$, matching the degree of the group.
This results in $[3, 1, 2]$, since "1" is now in position two, and similarly for the other items.
-Unfortunately, this choice is too results-oriented.
+Unfortunately, this choice is too result-oriented.
This choice makes it difficult to compose group elements in a meaningful way, since all
of the information about the permutation is in the position of items in the list,
rather than the items of the list themselves.
@@ -84,20 +89,19 @@ For example, under the same rule, $[c, b, a]$ is mapped to $[a, c, b]$.
### True List Notations (Two- and One-line Notation)
Instead, let's go back to the definition of the element.
-We have pairs of indices before and after the permutation is applied.
All we have to do is list out every index on one line, then the destination
of every index on the next.
This is known as *two-line notation*.
$$
-\begin{bmatrix}
+\begin{pmatrix}
1 & 2 & 3 \\
p(1) & p(2) & p(3)
-\end{bmatrix}
-= \begin{bmatrix}
+\end{pmatrix}
+= \begin{pmatrix}
1 & 2 & 3 \\
2 & 3 & 1
-\end{bmatrix}
+\end{pmatrix}
$$
For simplicity, the first row is kept as a list whose items match their indices.
@@ -107,18 +111,18 @@ All we have to do is sort the columns of the permutation by the second row,
then swap the two rows.
$$
-\begin{bmatrix}
+\begin{pmatrix}
1 & 2 & 3 \\
2 & 3 & 1
-\end{bmatrix}^{-1}
-= \begin{bmatrix}
+\end{pmatrix}^{-1}
+= \begin{pmatrix}
3 & 1 & 2 \\
1 & 2 & 3
-\end{bmatrix}^{-1}
-= \begin{bmatrix}
+\end{pmatrix}^{-1}
+= \begin{pmatrix}
1 & 2 & 3 \\
3 & 1 & 2
-\end{bmatrix} =
+\end{pmatrix}
$$
Note that the second row is now the same as the result from the naive notation.
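The sort-and-swap procedure can be sketched directly on the one-line form (an `invert` helper of my own, just for illustration):

```python
def invert(p):
    # sort the columns (i, p(i)) by the second row,
    # then read the first row back off in that order
    q = [0] * len(p)
    for i, pi in enumerate(p, start=1):
        q[pi - 1] = i
    return q

invert([2, 3, 1])  # [3, 1, 2]
```

Applying `invert` twice returns the original permutation, as expected of a group inverse.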
@@ -129,14 +133,14 @@ The *n*th item in the list now describes the position in which *n* can be found
the permutation.
$$
-\begin{bmatrix}
+\begin{pmatrix}
1 & 2 & 3 \\
2 & 3 & 1
-\end{bmatrix}
+\end{pmatrix}
\equiv [\![2, 3, 1]\!]
$$
-Double brackets are used to distinguish this as a pertation from ordinary lists.
+Double brackets are used to distinguish this as a permutation and not an ordinary list.
These notations make it straightforward to encode symmetric group elements on a computer.
After all, we only have to read the items of a list by the indices in another.
@@ -147,7 +151,7 @@ Here's a compact definition in Haskell:
newtype Permutation = P { unP :: [Int] }
apply :: Permutation -> [a] -> [a]
-apply = flip $ (\xs ->
+apply = flip (\xs ->
map ( -- for each item of the permutation, map it to...
(xs !!) -- the nth item of the first list
. (+(-1)) -- (indexed starting with 1)
@@ -155,7 +159,7 @@ apply = flip $ (\xs ->
-- written in a non-point free form
apply' (P xs) ys = map ( \n -> ys !! (n-1) ) xs
-print $ (P [2,3,1]) `apply` [1,2,3]
+print $ P [2,3,1] `apply` [1,2,3]
```
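For comparison, a rough Python mirror of `apply'` (the helper name is mine, not part of the Haskell above):

```python
def apply_p(xs, ys):
    # read ys at the 1-based indices stored in xs,
    # mirroring the Haskell apply'
    return [ys[n - 1] for n in xs]

apply_p([2, 3, 1], [1, 2, 3])  # [2, 3, 1], not [3, 1, 2]
```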
Note that this means `P [2,3,1]` is actually equivalent to $[\![2, 3, 1]\!]^{-1}$,
@@ -175,29 +179,29 @@ The verbosity of these notations also makes composing group elements difficult[^
by looking at the first and last rows.
For example, with group inverses:
$$
- \begin{bmatrix}
+ \begin{pmatrix}
1 & 2 & 3 \\
2 & 3 & 1
- \end{bmatrix}
- \begin{bmatrix}
+ \end{pmatrix}
+ \begin{pmatrix}
1 & 2 & 3 \\
2 & 3 & 1
- \end{bmatrix}^{-1}
+ \end{pmatrix}^{-1}
= \begin{matrix}
- \begin{bmatrix}
+ \begin{pmatrix}
1 & 2 & 3 \\
- 2 & 3 & 1
- \end{bmatrix}
+ \cancel{2} & \cancel{3} & \cancel{1}
+ \end{pmatrix}
\\
- \begin{bmatrix}
- 2 & 3 & 1 \\
+ \begin{pmatrix}
+ \cancel{2} & \cancel{3} & \cancel{1} \\
1 & 2 & 3
- \end{bmatrix}
+ \end{pmatrix}
\end{matrix}
- = \begin{bmatrix}
+ = \begin{pmatrix}
1 & 2 & 3 \\
1 & 2 & 3
- \end{bmatrix}
+ \end{pmatrix}
$$
@@ -205,17 +209,24 @@ The verbosity of these notations also makes composing group elements difficult[^
*Cycle notation* addresses all of these issues, but gets rid of the transparency with respect to lists.
Let's try phrasing the element we've been describing differently.
+
+> assign the first item to the second index, the second to the third, and the third to the first
+
We start at index 1 and follow it to index 2, and from there follow it to index 3.
-Continuing from index 3, we return to 1, and continuing in this manner would go on forever.
+Continuing from index 3, we return to index 1, and from there we'd loop forever.
This describes a *cycle*, denoted as $(1 ~ 2 ~ 3)$.
Cycle notation is much more delicate than list notation, since the notation is nonunique:
-- Naturally, the elements of a cycle may be cycled to produce an equivalent ones.
+- Naturally, the elements of a cycle may be cycled to produce an equivalent one.
- $(1 ~ 2 ~ 3) = (3 ~ 1 ~ 2) = (2 ~ 3 ~ 1)$
-- Cycles which have no common elements (i.e., are disjoint) commute, since they act on separate parts of the list.
+- Cycles which have no common elements (i.e., are disjoint) commute,
+ since they act on separate parts of the list.
- $(1 ~ 2 ~ 3)(4 ~ 5) = (4 ~ 5)(1 ~ 2 ~ 3)$
+
+#### Cycle Algebra
+
The true benefit of cycles is that they are easy to manipulate algebraically.
For some reason, [Wikipedia](https://en.wikipedia.org/wiki/Permutation#Cycle_notation)
does not elaborate on the composition rules for cycles,
@@ -226,7 +237,7 @@ While playing around with them and deriving these rules oneself *is* a good idea
- Cycles can be inverted by reversing their order.
- $(1 ~ 2 ~ 3)^{-1} = (3 ~ 2 ~ 1) = (1 ~ 3 ~ 2)$
- Cycles may be composed if the last index of the cycle on the left is the first index of the one on the right.
- Inversely, cycles may also be decomposed by partitioning it on an index and duplicating.
+ Conversely, cycles may also be decomposed by partitioning on an index and duplicating it.
- $(1 ~ 2 ~ 3) = (1 ~ 2)(2 ~ 3)$
- If an index in a cycle is repeated twice, it may be omitted from the cycle.
- $(1 ~ 2 ~ 3)(1 ~ 3) = (1 ~ 2 ~ 3)(3 ~ 1) = (1 ~ 2 ~ 3 ~ 1) = (1 ~ 1 ~ 2 ~ 3) = (2 ~ 3)$
@@ -234,9 +245,9 @@ While playing around with them and deriving these rules oneself *is* a good idea
Going back to $(1 ~ 2 ~ 3)$, if we apply this permutation to the list $[1, 2, 3]$:
$$
-(1 ~ 2 ~ 3) \left( \vphantom{0 \over 1} [1, 2, 3] \right)
- = (1 ~ 2)(2 ~ 3) \left( \vphantom{0 \over 1} [1, 2, 3] \right)
- = (1 ~ 2) \left( \vphantom{0 \over 1} [1, 3, 2] \right)
+(1 ~ 2 ~ 3) \left( \vphantom{0^{0^0}} [1, 2, 3] \right)
+ = (1 ~ 2)(2 ~ 3) \left( \vphantom{0^{0^0}} [1, 2, 3] \right)
+ = (1 ~ 2) \left( \vphantom{0^{0^0}} [1, 3, 2] \right)
= [3, 1, 2]
$$
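As a sanity check in code (an `apply_cycle` sketch of my own, with cycles as tuples of 1-based indices):

```python
def apply_cycle(cycle, xs):
    # send the item at each index in the cycle to the next index
    ys = list(xs)
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        ys[b - 1] = xs[a - 1]
    return ys

step = apply_cycle((2, 3), [1, 2, 3])  # [1, 3, 2]
apply_cycle((1, 2), step)              # [3, 1, 2]
apply_cycle((1, 2, 3), [1, 2, 3])      # [3, 1, 2] -- same result
```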
@@ -250,7 +261,7 @@ If we have a group *G*, then we can select a set of elements
$\langle g_1, g_2, g_3, ... \rangle$ as *generators*.
If we form all possible products -- not only the pairwise ones $g_1 g_2$,
but also $g_1 g_2 g_3$ and all powers of any $g_n$ -- then the products form a subgroup of *G*.
-Naturally, the set is called a *generating set*.
+Naturally, such a set is called a *generating set*.
Symmetric groups are of primary interest because of their subgroups, also known as permutation groups.
[Cayley's theorem](https://en.wikipedia.org/wiki/Cayley%27s_theorem),
@@ -295,12 +306,12 @@ In this algorithm, we swap two items when the latter is less than the former,
looping over the list until it is sorted.
Until the list is sorted, the algorithm finds all such adjacent inversions.
In the worst case, it will swap every pair of adjacent items, some possibly multiple times.
-This corresponding to the generating set
+This corresponds to the generating set
$\langle (1 ~ 2), (2 ~ 3), (3 ~ 4), (4 ~ 5), …, (n{-}1 ~~ n) \rangle$.

+](./bubble_sort.gif){.narrow}
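To make the correspondence concrete, here is a rough bubble sort that records each swap as a 2-cycle (a sketch of my own, not the animation code):

```python
def bubble_sort_swaps(xs):
    # sort while recording every swap as an adjacent 2-cycle (1-based)
    xs = list(xs)
    swaps = []
    changed = True
    while changed:
        changed = False
        for i in range(len(xs) - 1):
            if xs[i] > xs[i + 1]:
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
                swaps.append((i + 1, i + 2))
                changed = True
    return xs, swaps

bubble_sort_swaps([3, 1, 2])  # ([1, 2, 3], [(1, 2), (2, 3)])
```

Every recorded pair is an adjacent transposition, i.e. a member of the generating set above.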
### Selection Sort
@@ -315,7 +326,7 @@ Continuing until the last item, this gives the generating set

+](./selection_sort.gif){.narrow}
This behavior for selection sort is uncommon, and this animation omits the selection of a swap candidate.
The animation below shows a more destructive selection sort, in which the
@@ -326,7 +337,7 @@ Once the algorithm hits the end of the list, the candidate is swapped to the lea

+](./destructive_selection_sort.gif){.narrow}
Swap Diagrams
@@ -391,7 +402,7 @@ In other words, if we have two adjacent edges, the new edge corresponds to
a product of elements from the generating set.
Graph theory has a name for this operation: when we produce *all* new edges by linking vertices
that were separated by a distance of 2, the result is called the *square of that graph*.
-In fact, [higher graph](https://en.wikipedia.org/wiki/Graph_power) powers will reflect connections
+In fact, higher [graph powers](https://en.wikipedia.org/wiki/Graph_power) will reflect connections
induced by more conjugations of adjacent edges.
](./P4_powers.png)
If our graph is connected, then repeating this operation will tend toward a complete graph.
-Complete graphs contain every possible edge, and correspond to all possible 2-cycles,
- which in turn generate the symmetric group.
-Conversely, if a graph has *n* vertices, then for it to be connected, it must have at least $n-1$ edges.
-Thus, a generating set of 2-cycles must have at least $n-1$ items to generate the symmetric group.
+Complete graphs contain every possible edge, and so correspond to all possible 2-cycles,
+ which trivially generate the symmetric group.
+Conversely, if a graph has *n* vertices, then for it to be connected, it must have at least $n - 1$ edges.
+Thus, a generating set of 2-cycles must have at least $n - 1$ items to generate the symmetric group.
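A minimal sketch of the squaring operation on an adjacency-set representation (my own helper, with $P_4$ as the example):

```python
def square(adj):
    # link every pair of vertices separated by a distance of at most 2
    return {v: (nbrs | {w for u in nbrs for w in adj[u]}) - {v}
            for v, nbrs in adj.items()}

# path graph P4 as adjacency sets: 1 - 2 - 3 - 4
p4 = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
k4 = square(square(p4))  # squaring twice yields the complete graph K4
```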
Picking a different vertex labelling will correspond to a different generating set.
For example, in the image of $P_4$ above, if the edge connecting vertices 1 and 2
@@ -423,11 +434,12 @@ Under graph powers, we know that each connected graph tends toward a complete gr
But what groups do cluster graphs correspond to?
The simplest case to consider is what happens when the graph is $P_2 \oplus P_2$.
-One of the generating sets it corresponds to is $\langle (1 ~ 2), (3 ~ 4) \rangle$, a pair of disjoint cycles.
-The group they generate is
+If there is an edge connecting vertices 1 and 2 and an edge connecting vertices 3 and 4,
+ it corresponds to the generating set $\langle (1 ~ 2), (3 ~ 4) \rangle$.
+This is a pair of disjoint cycles, and the group they generate is
$$
-\{id, (1 ~ 2), (3 ~ 4), (1 ~ 2)(3 ~ 4) \}
+\{e, (1 ~ 2), (3 ~ 4), (1 ~ 2)(3 ~ 4) \}
\cong S_2 \times S_2
\cong C_2 \times C_2
$$
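One can check the order of this group directly; a rough closure computation (helpers are my own, with permutations stored as 0-based tuples):

```python
def compose(p, q):
    # (p . q)(i) = p(q(i)), permutations as 0-based tuples
    return tuple(p[i] for i in q)

def generate(gens):
    # close the generating set under composition
    group, frontier = set(gens), set(gens)
    while frontier:
        frontier = {compose(a, b) for a in group for b in frontier} - group
        group |= frontier
    return group

a, b = (1, 0, 2, 3), (0, 1, 3, 2)  # the 2-cycles (1 2) and (3 4)
four_group = generate([a, b])      # {e, (1 2), (3 4), (1 2)(3 4)}
```

The closure has exactly four elements, matching $C_2 \times C_2$.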
@@ -438,16 +450,17 @@ One way to look at this is by considering paths on each component:
or both at the same time.
This independence means that one group's structure is duplicated over the other's,
or more succinctly, gives the direct product.
-In general, if we denote γ as the map which "runs" the swap diagram and produces the group, then
+In general, if we denote *γ* as the map which "runs" the swap diagram and produces the group, then
$$
-\gamma( A \oplus B ) = S_{|A|} \times S_{|B|}, ~ A, B \text{ connected}
+\gamma( A \oplus B ) = S_{|A|} \times S_{|B|},
+ ~ A, B \text{ connected}
$$
where $|A|$ is the number of vertices in *A*.
-γ has the interesting property of mapping a sum-like object onto a product-like object.
-If we express a disconnected graph *U* as the disjoint union of its connected components $V_i$
+*γ* has the interesting property of mapping a sum-like object onto a product-like object.
+If we express a disconnected graph *U* as the disjoint union of its connected components $V_i$, then
$$
\begin{gather*}
@@ -457,7 +470,7 @@ $$
\end{gather*}
$$
-Which takes care of every simple graph.
+This describes *γ* for every simple graph.
It also shows that we're rather limited in the kinds of groups which can be expressed by a swap diagram.
diff --git a/posts/permutations/2/index.qmd b/posts/permutations/2/index.qmd
index fde20b5..43d1dc2 100644
--- a/posts/permutations/2/index.qmd
+++ b/posts/permutations/2/index.qmd
@@ -1,7 +1,7 @@
---
title: "A Game of Permutations, Part 2"
description: |
- Notes on permutohedra and some very large graphs.
+ Notes on an operation which produces some very large graphs.
format:
html:
html-math-method: katex
@@ -13,6 +13,12 @@ categories:
- group theory
---
+
+
```{python}
#| echo: false
@@ -45,13 +51,6 @@ The resulting figure is known as a [Cayley graph](https://mathworld.wolfram.com/
It is also common to describe an unlabelled graph as "Cayley" if it could
be generated by this procedure.
-Owing to the way in which they are defined, Cayley graphs have a few useful properties as graphs.
-At every vertex, we have as many outward edges as we do generators in the generating set,
- so the outward (and in fact, inward) degree of each vertex is the same.
-In other words, it is a regular graph.
-More than that, it is [vertex-transitive](https://mathworld.wolfram.com/Vertex-TransitiveGraph.html),
- since labelling a single vertex's outward edges will label that of the entire graph.
-
Cayley graphs depend on the generating set used, so they can take a wide variety of shapes.
Here are a few examples of Cayley graphs made from elements of $S_4$:
@@ -62,6 +61,13 @@ Here are a few examples of Cayley graphs made from elements of $S_4$:
Generating sets obtained from the previous MathWorld article.
](./s4_cayley_graphs.png)
+Owing to the way in which they are defined, Cayley graphs have a few useful properties as graphs.
+At every vertex, we have as many outward edges as we do generators in the generating set,
+ so the outward (and in fact, inward) degree of each vertex is the same.
+In other words, it is a regular graph.
+More than that, it is [vertex-transitive](https://mathworld.wolfram.com/Vertex-TransitiveGraph.html),
+ since labelling a single vertex's outward edges will label those of the entire graph.
+
In general, the Cayley graph is a directed graph.
However, if for every member of the generating set, we also include its inverse,
every directed edge will be matched by an edge in the opposite direction,
@@ -74,30 +80,32 @@ Graphs to Graphs
All 2-cycles are their own inverse, so generating sets which include only them
produce undirected Cayley graphs.
Since this kind of generating set can itself be thought of as a graph,
- we may consider an operation from graphs to graphs that maps a swap diagram to its Cayley graph.
+ we may consider an operation on graphs that maps a swap diagram to its Cayley graph.

-I've taken to calling this operation the "graph exponential"[^1] because of its apparent relationship
- with the disjoint union.
-Namely, it seems to be the case that $\exp( A \oplus B ) = \exp( A ) \times \exp( B )$,
- where $\times$ signifies the
+It seems to be the case that $\exp( A \oplus B ) = \exp( A ) \times \exp( B )$,
+ where $\oplus$ signifies the disjoint union and $\times$ signifies the
[Cartesian (box) product of graphs](https://en.wikipedia.org/wiki/Cartesian_product_of_graphs)[^2].
-
-[^1]: Originally, I called this operation the "graph factorial", since it involves permutations
- and the number of vertices in the resulting graph grows factorially.
+Unlike *γ* from the previous post, both the input and output of this operation are graphs.
+Because of this and the sum/product relationship, I've taken to calling this operation the
+ "graph exponential"[^3].
[^2]: Graphs have many product structures, such as the tensor product and strong product.
The Cartesian product is (categorically) more natural when paired with disjoint unions.
+[^3]: Originally, I called this operation the "graph factorial", since it involves permutations
+ and the number of vertices in the resulting graph grows factorially.
+
This operation is my own invention, so I am unsure whether or not
it constitutes anything useful.
In fact, the possible graphs grow so rapidly that computing anything about the exponential
of order 8 graphs starts to overwhelm a single computer.
+It is, however, interesting, as I hope to convince you.
A random graph will not generally correspond to an interesting generating set,
and therefore, will also generally have an uninteresting exponential graph.
@@ -109,71 +117,79 @@ They are among the simplest graphs one can consider, and as we will see shortly,
Some Small Exponential Graphs
-----------------------------
-Because of [the difficulty in determining graph isomorphism](https://en.wikipedia.org/wiki/Graph_isomorphism_problem),
- it is challenging for a computer to find a graph in an encyclopedia.
+Because of [the difficulty in determining graph isomorphism](
+ https://en.wikipedia.org/wiki/Graph_isomorphism_problem
+ ), it is challenging for a computer to find a graph in an encyclopedia.
Computers think of graphs as a list of vertices and their outward edges,
but this implementation faces inherent labelling issues.
These persist even if the graph is described as a list of (un)ordered pairs,
an adjacency matrix, or an incidence matrix,
- the latter two of which have very large memory footprints.
-I was able to locate a project named the
- [Encyclopedia of Finite Graphs](https://github.com/thoppe/Encyclopedia-of-Finite-Graphs),
- but it is only able to build a database simple connected graphs which
- can be queried by invariants (and is outdated since it uses Python 2).
+ the latter two of which have very large memory footprints[^4].
+
+[^4]: I was able to locate a project named the
+ [Encyclopedia of Finite Graphs](https://github.com/thoppe/Encyclopedia-of-Finite-Graphs),
+ but it is only able to build a database of simple connected graphs which
+ can be queried by invariants (and is outdated since it uses Python 2).
However, graphs are visual objects, and humans can compare them fairly easily
-- the name means "drawing" after all.
-Exponentials of 3- and 4-graphs are neither so small as to be uninteresting
+Exponentials of order 3 and order 4 graphs are neither so small as to be uninteresting
nor so big as to be unparsable by humans.
### Order 3
-
+{.narrow}
At this stage, we only really have two graphs to consider, since $P_3 = \bigstar_3$.
Immediately, one can see that $\exp( P_3 ) = \exp( \bigstar_3 ) = C_6$,
the 6-cycle graph (or hexagonal graph).
It is also apparent that $\exp( K_3 )$ is the utility graph, $K_{3,3}$.
-
+
-We can again demonstrate the sum rule of the graph exponential with $\exp( P_3 \oplus P_2 )$.
+Here, we can again demonstrate the sum rule of the graph exponential with $\exp( P_3 \oplus P_2 )$.
Simplifying, since we know $\exp( P_3 ) = C_6$, the result is $C_6 \times P_2 = \text{Prism}_6$,
the hexagonal prism graph.
### Order 4 (and beyond)
-::: {layout-ncol="2"}
+::: {layout="[[1,1],[1]]"}


-:::
-
+
+:::
With some effort, $\exp( P_4 )$ can be imagined as a projection of a 3D object,
the [truncated octahedron](https://en.wikipedia.org/wiki/Truncated_octahedron).
Because of its correspondence to a 3D solid, this graph is planar.
Both the hexagon and this solid belong to a class of polytopes called
[*permutohedra*](https://en.wikipedia.org/wiki/Permutohedron), which are figures
- that are also formed by permutations of the coordinate (1, 2, 3, ..., *n*) in Euclidean space.
-In fact, they are able to completely tessellate the $n-1$ dimensional subspace of
- $\mathbb{R}^n$ where the coordinates sum to the $n-1$th triangular number.
-Note that the previous graph in the sequence of $\exp(P_n)$, the hexagonal graph,
- is visible in the truncated octahedron.
-This corresponds to the projection $(x,y,z,w) \mapsto (x,y,z)$ over
- the coordinates of the permutohedra.
-
+ that are also formed by permutations of the coordinates of the point (1, 2, 3, ..., *n*) in Euclidean space[^5].
Technically, there is a distinction between the Cayley graphs and permutohedra
since their labellings differ.
Both have edges generated by swaps, but in the latter case, the connected vertices are expected to be
- separated by a certain Euclidean distance.
-More information about the distinction can be found at this article on
- [Wikimedia](https://commons.wikimedia.org/wiki/Category:Permutohedron_of_order_4_%28raytraced%29#Permutohedron_vs._Cayley_graph)[^3].
+ separated by a certain distance.
+More information about the distinction can be found at this article on [Wikimedia](
+ https://commons.wikimedia.org/wiki/Category:Permutohedron_of_order_4_%28raytraced%29#Permutohedron_vs._Cayley_graph
+ )[^6].
-[^3]: Actually, if one considers a *right* Cayley graph, where each generator is right-multiplied
+[^5]: In fact, these figures are able to completely tessellate the $n-1$ dimensional subspace of
+ $\mathbb{R}^n$ where the coordinates sum to the $n-1$th triangular number.
+ Note also that the previous graph in the sequence of $\exp(P_n)$, the hexagonal graph,
+ is visible in the truncated octahedron.
+ This corresponds to the projection $(x,y,z,w) \mapsto (x,y,z)$ over
+ the coordinates of the permutohedra.
+
+[^6]: Actually, if one considers a *right* Cayley graph, where each generator is right-multiplied
to the permutation at a node rather than left-multiplied, then a true correspondence is obtained,
at least for order 4.
@@ -193,12 +209,13 @@ It is certainly *not* isomorphic to $K_{4,4}$, since this graph has 8 vertices,
as opposed to 24 in $\exp( K_4 )$.
-### Graph Invariants
+Graph Invariants
+----------------
While I have managed to identify the families to which some of these graphs belong,
I am rather fond of computing (and conjecturing) sequences from objects.
-Not only is it much easier to consult something like the OEIS for these quantities,
- but when finding a matching sequence, there are ample articles to consult for more information.
+Not only is it much easier to consult something like [the OEIS](https://oeis.org/) for these quantities,
+ but after finding a matching sequence, there are ample articles to consult for more information.
By linking to their respective entries, I hope you'll consider reading more there.
Even though I have obtained these values empirically, I am certain that the sequences for
@@ -206,9 +223,9 @@ Even though I have obtained these values empirically, I am certain that the sequ
I also have great confidence in the sequences I found for $\exp( K_n )$.
-#### Edge Counts
+### Edge Counts
-Despite knowing how many vertices there are ($n!$, the order of the symmetric group),
+Despite knowing how many vertices there are (*n*!, the order of the symmetric group),
we don't necessarily know how many edges there are.
```{python}
@@ -246,7 +263,7 @@ Markdown(tabulate(
```
-#### Radius and Distance Classes
+### Radius and Distance Classes
The radius of a graph is the smallest eccentricity among its vertices, where the eccentricity of a vertex is its greatest distance to any other vertex.
Due to vertex transitivity, the greatest distance between two vertices is the same for every vertex.
@@ -291,7 +308,7 @@ Including the vertex itself (which is distance 0 away), there will be $r + 1$ su
where *r* is the radius.
These classes are the same for every vertex due to transitivity.
-In the case of these graphs, they are a partition of the factorial numbers.
+In the case of these graphs, they partition the *n*! vertices.
```{python}
#| echo: false
@@ -379,15 +396,14 @@ It seems to be the case that $\exp( K_n )$ are also integral graphs.
Perplexingly, the multiplicities for each of the eigenvalues appear to mostly be perfect powers.
This is the case until n = 8, which ruins the pattern because neither of
$9864 = 2^3 \cdot 3^2 \cdot 137$ or $6125 = 5^3 \cdot 7^2$ are perfect powers.
-I find both this and the fact that such a large prime appears among the factorization
- of the former rather creepy[^4].
-All other primes appearing in the factorization of the other numbers are small -- 2, 3, 5, and 7.
+I find both this[^7] and the fact that such a large prime appears in the factorization of the former
+ rather creepy, since all other primes appearing here are small -- 2, 3, 5, and 7.
-[^4]: Some physicists are fond of 137 for its closeness to the reciprocal
- of the fine structure constant (a bit of harmless numerology).
+[^7]: Some physicists are fond of 137 for its closeness to the reciprocal
+ of the fine structure constant (a bit of mostly-harmless numerology).
-#### Notes about Spectral Computation
+### Notes about Spectral Computation
For *n* = 3 through 6, exactly computing the spectrum
(or more accurately, the characteristic polynomial)
diff --git a/posts/permutations/3/index.qmd b/posts/permutations/3/index.qmd
index bb8f941..f223a1e 100644
--- a/posts/permutations/3/index.qmd
+++ b/posts/permutations/3/index.qmd
@@ -36,7 +36,7 @@ To summarize from the first post, a swap diagram is a graph where each vertex re
I singled out three graph families for their symmetry:
- Paths, which link adjacent swaps
-- Stars, in which all swaps are "adjacent" to a distinguished index
+- Stars, in which all swaps contain a distinguished index
- Complete graphs, which contain all possible swaps
Swap diagrams are ultimately limited by only being able to describe collections of 2-cycles.
@@ -98,10 +98,8 @@ Furthermore, *disconnected* vertices correspond to elements which commute with o
However, each vertex need not correspond to a single 2-cycle,
and can instead be *any* element of order 2, i.e., a product of disjoint 2-cycles.
-The most important Coxeter diagrams can be seen below.
-The $A_n$ diagrams are just the familiar path graphs.
-
+