Let $n$ be a fixed positive integer. Let $$A=\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix}\quad \text{and} \quad B=\begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1n} \\ b_{21} & b_{22} & \cdots & b_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ b_{n1} & b_{n2} & \cdots & b_{nn} \end{bmatrix}$$ be two $n\times n$ tables such that $\{a_{ij}\mid 1\le i,j\le n\}=\{b_{ij}\mid 1\le i,j\le n\}=\{k\in \mathbb{N}^*\mid 1\le k\le n^2\}$. One can perform the following operation on table $A$: Choose $2$ numbers in the same row or in the same column of $A$, interchange these $2$ numbers, and leave the remaining $n^2-2$ numbers unchanged. This operation is called a transposition of $A$. Find, with proof, the smallest positive integer $m$ such that for any tables $A$ and $B$, one can perform at most $m$ transpositions such that the resulting table of $A$ is $B$.
Problem
Tags: combinatorics
Pathological
21.04.2019 20:35
We claim that the answer is $2n(n-1).$ Let a $\mathbf{row}$ $\mathbf{transposition}$ be a transposition where we switch two elements of the same row, and define a $\mathbf{column}$ $\mathbf{transposition}$ analogously. First, we give an example showing that this bound cannot be improved. Let $A$ be the array where $a_{ij} = (i-1)n+j$ and $B$ be the array where $b_{ij} = i'n + j'$, where $i'$ is the unique integer in $\{0, 1, \cdots, n-1\}$ with $i' \equiv i$ (mod $n$) and $j'$ is the unique integer in $\{1, 2, \cdots, n\}$ with $j' \equiv j+1$ (mod $n$). We will show that transforming this $A$ into this $B$ requires at least $2n(n-1)$ transpositions. In fact, we will show that it requires at least $n(n-1)$ row transpositions; by symmetry the same argument gives at least $n(n-1)$ column transpositions, and adding the two bounds finishes the lower bound.
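To make the construction concrete, here is a small Python sketch; it is my own illustration, not part of the original solution, and the helper name `extremal_pair` is invented.

```python
# Hypothetical helper (not from the post): build the extremal pair A, B above.
def extremal_pair(n):
    # a_{ij} = (i-1)n + j, with 1-based indices i, j
    A = [[(i - 1) * n + j for j in range(1, n + 1)] for i in range(1, n + 1)]
    # b_{ij} = i'n + j', where i' in {0,...,n-1} satisfies i' ≡ i (mod n)
    # and j' in {1,...,n} satisfies j' ≡ j+1 (mod n)
    B = []
    for i in range(1, n + 1):
        ip = i % n
        B.append([ip * n + (j % n) + 1 for j in range(1, n + 1)])
    return A, B

A, B = extremal_pair(3)
# A = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
# B = [[5, 6, 4], [8, 9, 7], [2, 3, 1]]
```

For $n = 3$ one can see directly that every entry of $B$ sits one row above and one column to the left (both cyclically) of where it sits in $A$.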
To do this, let's color every element of $A, B$ according to its residue class modulo $n$, with the colors $c_1, c_2, \cdots, c_n$ corresponding to the residues $1, 2, \cdots, n$ (mod $n$). Then, we consider the following problem, which will imply the claim about row transpositions.
$\mathbf{Problem.}$ Let there be $n$ heaps, each of $n$ stones. Initially, the first heap has all stones of color $c_1$, the second has all stones of color $c_2$, $\cdots$, and the $n$th heap has all stones of color $c_n.$ In one move, we may take any two stones and switch them. We wish to attain the situation where the $i$th heap has all stones of color $c_{i+1}$ (indices modulo $n$). Then at least $n(n-1)$ moves are required.
It's easy to see that this implies the row transposition claim: view each column as a "heap" and each entry as a "stone" colored by its residue. In $A$, the $j$th column contains exactly the numbers of color $c_j$, while in $B$ the $j$th column contains exactly the numbers of color $c_{j+1}$. Column transpositions never move a stone between heaps, and each row transposition exchanges exactly two stones between heaps, so it corresponds to one move of the Problem.
$\mathbf{Proof.}$ For each stone of color $c_i$ currently lying in the $j$th heap, define its $\mathbf{distance}$ to be the unique $m \in \{0, 1, \cdots, n-1\}$ with $m \equiv i-1-j$ (mod $n$). Initially, every stone has distance $n-1$, so the sum of all distances is $n^2(n-1)$; in the final position every stone has distance $0.$ Now consider one move, switching a stone in heap $j$ with a stone in heap $j'$. The first stone's distance changes by a quantity $\equiv j - j'$ (mod $n$) and the second's by a quantity $\equiv j' - j$ (mod $n$), so the total change is a multiple of $n$; moreover each stone's distance changes by at most $n-1$ in absolute value, so the total changes by at most $2n-2$. Hence the sum of distances decreases by at most $n$ per move, and at least $\frac{n^2(n-1)}{n} = n(n-1)$ moves are needed.
$\blacksquare$
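As a quick sanity check of the two facts the proof relies on (the total distance starts at $n^2(n-1)$ and ends at $0$, and a single swap changes it by a multiple of $n$ of absolute value at most $2n-2$), here is a small numerical experiment in Python; it is only an illustration under the conventions above, not part of the argument.

```python
import random

def total_distance(heaps, n):
    # heaps[j-1] lists the colors (1..n) of the stones currently in heap j
    return sum((color - 1 - j) % n
               for j, heap in enumerate(heaps, start=1)
               for color in heap)

n = 5
start  = [[j] * n for j in range(1, n + 1)]            # heap j holds color c_j
target = [[(j % n) + 1] * n for j in range(1, n + 1)]  # heap j holds color c_{j+1}
assert total_distance(start, n) == n * n * (n - 1)
assert total_distance(target, n) == 0

# a random swap between two heaps changes the total by a multiple of n,
# hence (being at most 2n - 2 in size) by at most n in absolute value
heaps = [heap[:] for heap in start]
j1, j2 = random.sample(range(n), 2)
before = total_distance(heaps, n)
heaps[j1][0], heaps[j2][0] = heaps[j2][0], heaps[j1][0]
after = total_distance(heaps, n)
assert (before - after) % n == 0 and abs(before - after) <= n
```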
With the Problem, we have shown that the example earlier requires at least $n(n-1)$ row transpositions. Analogously, it requires at least $n(n-1)$ column transpositions and so we've exhibited an example which needs at least $2n(n-1)$ transpositions.
Now, we will turn to the harder and more exciting part of the problem, which is showing that $2n(n-1)$ transpositions actually suffice for any two arrays $A, B$ (in getting from $A$ to $B$). First of all, we will cite the following well-known lemma without proof.
$\mathbf{Lemma.}$ For any permutation $\sigma$ of $\{1, 2, \cdots, n\}$, the minimum number of transpositions needed to obtain it from the identity permutation is $n - c(\sigma)$, where $c(\sigma)$ denotes the number of cycles (fixed points included) in the cycle decomposition of $\sigma.$
As a corollary, note that any permutation of $\{1, 2, \cdots, n\}$ requires at most $n-1$ transpositions to be obtained from the identity permutation.
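As an aside (standard material, not tied to this particular solution), the quantity $n - c(\sigma)$ is easy to compute by counting cycles; the Python sketch below does exactly that, with `min_transpositions` being my own invented name.

```python
def min_transpositions(sigma):
    # sigma is a permutation of 0..n-1, given as a list: i maps to sigma[i];
    # the minimum number of transpositions sorting it is n - (#cycles).
    n = len(sigma)
    seen = [False] * n
    cycles = 0
    for start in range(n):
        if not seen[start]:
            cycles += 1
            i = start
            while not seen[i]:
                seen[i] = True
                i = sigma[i]
    return n - cycles

assert min_transpositions([0, 1, 2, 3]) == 0  # identity: four 1-cycles
assert min_transpositions([1, 0, 3, 2]) == 2  # two 2-cycles
assert min_transpositions([1, 2, 3, 0]) == 3  # a single 4-cycle: n - 1 moves
```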
With the lemma in mind, let's proceed. Our algorithm works in phases. In the $i$th phase, for $1 \le i \le n-1$, we look at the $i$th row of $B$ and, in at most $2n-1$ moves, perform transpositions within the bottom $n-i+1$ rows of $A$ (so not touching the first $i-1$ rows, which are already correct) so that the $i$th rows of $A$ and $B$ match. In the $n$th phase, the $n$th row of $A$ already contains the same set of numbers as the $n$th row of $B$, so the corollary above lets us rearrange it into the $n$th row of $B$ in $\le n-1$ row transpositions, for a grand total of at most $(n-1) \cdot (2n-1) + (n-1) = 2n(n-1)$ moves, as desired.
Now, all that remains to be shown is that we can indeed perform the $i$th phase in $2n-1$ moves, for all $1 \le i \le n-1.$ We will simply prove it for the $1$st phase, with the other $n-2$ phases proceeding similarly.
Before we construct an algorithm for $2n-1$ moves, we will first try constructing one for $2n$ moves. To do so, let's set $x_1, x_2, \cdots, x_n$ to be $b_{11}, b_{12}, \cdots, b_{1n},$ respectively. Then, we'll "fix" $x_1, x_2, \cdots, x_n$ in two steps, each of which requires at most $n$ moves.
Step 1. Put them in the correct column.
Step 2. "Jump" the $x_i$'s to the first row.
To begin Step 1, we first put $x_1$ in the correct column (the first one) using a row transposition, then put $x_2$ in the second column with another transposition, and so on. This takes at most $n$ moves: each move "fixes" one $x_i$, and we can never unfix an $x_i$ (if placing $x_i$ into the $i$th column swaps out some $x_j$, then since $j \neq i$, $x_j$ was not in its correct column to begin with).
Now, for the second step, we simply perform column transpositions on all the $x_i$'s to bring them to the first row. After Step 1 they lie in distinct columns (namely $x_i$ in column $i$), so these jumps do not interfere with one another, and this takes at most $n$ moves.
With this, we have now obtained an algorithm for doing Phase $1$ in $2n$ moves, and this clearly can generalize for the $i$th phase, $\forall 1 < i \le n-1$. In all, this gives us an algorithm which works in $\le 2n(n-1)+(n-1) = (2n+1)(n-1)$ transpositions, which is just a bit too much.
To optimize it, we will use the following slightly improved three-step algorithm to perform the $i$th phase, $\forall 1 \le i \le n-1$, in $2n-1$ transpositions instead of $2n$. Again, we'll just do it for the $1$st phase, with the $n-2$ other phases working analogously. Define $x_i$ as before. Before actually performing any transpositions, we locate a "cycle" among the $x_i$'s, defined as follows. Build a digraph on vertices $v_1, v_2, \cdots, v_n$: for each $x_i$, if it currently lies in the $j$th column of $A$, draw the edge $v_i \rightarrow v_j.$ Since every vertex of this digraph has outdegree $1$, it contains a cycle $v_{a_1} \rightarrow v_{a_2} \rightarrow \cdots \rightarrow v_{a_k} \rightarrow v_{a_1},$ which means that $x_{a_i}$ is in the $a_{i+1}$th column for each $i$ (indices mod $k$). Note that $k$ could be $1$. We can now proceed with our algorithm.
Step 1. Put the $x_i$'s in the correct column, except for the $x_{a_j}$'s.
Step 2. Jump all the $x_i$'s to the first row.
Step 3. Reorder the $x_{a_j}$'s.
To execute Step 1, we do the same thing as before, putting each non-cycle $x_i$ into the $i$th column with a row transposition, being careful to "skip over" the $x_{a_j}$'s. These swaps never touch an $x_{a_j}$: each swap exchanges some non-cycle $x_i$ with the entry of its own row lying in column $i$, and every $x_{a_j}$ sits in column $a_{j+1}$, which is a cycle column and hence never equal to such an $i$. This takes $\le n-k$ moves, since each of the $n-k$ elements not involved in the cycle needs at most one swap and, as before, a fixed element can never become unfixed.
For Step 2, we simply apply column transpositions to get all the $x_i$'s (including the $x_{a_j}$'s) into the first row. After Step 1 the $x_i$'s occupy pairwise distinct columns, so these jumps never displace one another, and this takes $\le n$ moves.
For Step 3, we again use the Lemma: after Step 2 all the $x_i$'s lie in the first row, with each non-cycle $x_i$ in column $i$ and each $x_{a_j}$ in column $a_{j+1}$. The resulting arrangement differs from the desired one by a single $k$-cycle, so $k-1$ row transpositions within the first row put the $x_{a_j}$'s in order.
Hence, Steps 1-3 took a grand total of $\le n-k + n + k-1 = 2n-1$ moves, and so we're done.
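To tie the three steps together, here is a rough Python sketch of a single phase (phase $1$) under the setup above. It is my own code with invented helper names (`do_phase`, `find`), uses 0-based indices, and is only a sketch: it finds a cycle, runs Steps 1-3, and checks that at most $2n-1$ transpositions were used.

```python
def do_phase(A, x):
    # A: n x n array (list of rows); x: the desired first row (b_11, ..., b_1n)
    n = len(A)
    moves = 0

    def find(val):  # current (row, column) of a value
        for r in range(n):
            for c in range(n):
                if A[r][c] == val:
                    return r, c

    # digraph i -> (current column of x_i); walk from 0 until a vertex repeats
    col_of = [find(x[i])[1] for i in range(n)]
    order, i = {}, 0
    while i not in order:
        order[i] = len(order)
        i = col_of[i]
    in_cycle = {j for j, t in order.items() if t >= order[i]}

    # Step 1: row transpositions placing every non-cycle x_i into column i
    for i in range(n):
        if i not in in_cycle:
            r, c = find(x[i])
            if c != i:
                A[r][c], A[r][i] = A[r][i], A[r][c]
                moves += 1

    # Step 2: column transpositions jumping every x_i into the first row
    for i in range(n):
        r, c = find(x[i])
        if r != 0:
            A[r][c], A[0][c] = A[0][c], A[r][c]
            moves += 1

    # Step 3: row transpositions inside the first row undoing the k-cycle
    for i in range(n):
        c = A[0].index(x[i])
        if c != i:
            A[0][c], A[0][i] = A[0][i], A[0][c]
            moves += 1

    assert A[0] == list(x) and moves <= 2 * n - 1
    return moves

A = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print(do_phase(A, [5, 6, 4]))  # 5, i.e., exactly 2n - 1 moves for n = 3
```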
Applying the same thing for the $i$th phase, $\forall 1 < i \le n-1$, and then using the $n$th phase as described above, we have an algorithm which works in $(2n-1)(n-1) + (n-1) = 2n(n-1)$ moves, as desired.
$\square$
Kepler_s_Dream
19.11.2021 15:40
Pathological wrote: [solution quoted above]
Can you speak $\text{Chinese}$?!
starchan
19.11.2021 15:42
Kepler_s_Dream wrote: [quoted solution omitted]
Can you speak $\text{Chinese}$?!
Isn't the solution written in Chinese?
Kepler_s_Dream
19.11.2021 15:44
starchan wrote:
Isn't the solution written in Chinese?
I don't think so.