Let $p$ be a prime number. Let $\mathbb F_p$ denote the integers modulo $p$, and let $\mathbb F_p[x]$ be the set of polynomials with coefficients in $\mathbb F_p$. Define $\Psi : \mathbb F_p[x] \to \mathbb F_p[x]$ by \[ \Psi\left( \sum_{i=0}^n a_i x^i \right) = \sum_{i=0}^n a_i x^{p^i}. \]Prove that for nonzero polynomials $F,G \in \mathbb F_p[x]$, \[ \Psi(\gcd(F,G)) = \gcd(\Psi(F), \Psi(G)). \]Here, a polynomial $Q$ divides $P$ if there exists $R \in \mathbb F_p[x]$ such that $P(x) - Q(x) R(x)$ is the polynomial with all coefficients $0$ (with all addition and multiplication in the coefficients taken modulo $p$), and the gcd of two polynomials is the highest degree polynomial with leading coefficient $1$ which divides both of them. A non-zero polynomial is a polynomial with not all coefficients $0$. As an example of multiplication, $(x+1)(x+2)(x+3) = x^3+x^2+x+1$ in $\mathbb F_5[x]$. Proposed by Mark Sellke
Problem
Source: USA December TST for 57th IMO 2016, Problem 3
Tags: number theory, greatest common divisor, modular arithmetic, USA TST, December TST, TST, Integer Polynomial
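Before the posts: the identity is easy to sanity-check numerically for small cases. The following is a minimal, self-contained Python sketch (the helper names `mul`, `poly_gcd`, `psi` are ad hoc illustrations, not from any post) that verifies $\Psi(\gcd(F,G)) = \gcd(\Psi(F),\Psi(G))$ for $F=(x+1)(x+2)(x+3)$ and $G=(x+2)(x+4)$ in $\mathbb F_5[x]$, reusing the multiplication example from the statement.

```python
# Sanity check of Psi(gcd(F, G)) = gcd(Psi(F), Psi(G)) for one example in F_5[x].
# A polynomial is a list of coefficients, index = exponent, reduced mod p.
p = 5

def trim(a):
    while a and a[-1] % p == 0:
        a.pop()
    return a

def mul(a, b):
    if not a or not b:
        return []
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] = (res[i + j] + ai * bj) % p
    return trim(res)

def rem(a, b):
    # remainder of a on division by the nonzero polynomial b
    a = a[:]
    inv = pow(b[-1], p - 2, p)          # inverse of the leading coefficient mod p
    while a and len(a) >= len(b):
        shift = len(a) - len(b)
        c = (a[-1] * inv) % p
        for i, bi in enumerate(b):
            a[i + shift] = (a[i + shift] - c * bi) % p
        trim(a)
    return a

def poly_gcd(a, b):
    while b:
        a, b = b, rem(a, b)
    inv = pow(a[-1], p - 2, p)          # normalize so the gcd is monic
    return [(c * inv) % p for c in a]

def psi(a):
    # Psi(sum a_i x^i) = sum a_i x^(p^i)
    out = [0] * (p ** (len(a) - 1) + 1)
    for i, c in enumerate(a):
        out[p ** i] = c % p
    return trim(out)

# F = (x+1)(x+2)(x+3) = x^3 + x^2 + x + 1 and G = (x+2)(x+4), so gcd(F, G) = x + 2
F = mul(mul([1, 1], [2, 1]), [3, 1])
G = mul([2, 1], [4, 1])
print(psi(poly_gcd(F, G)) == poly_gcd(psi(F), psi(G)))   # expected: True
```

Running it prints True; swapping in other small $F$, $G$ (or another prime) is a quick way to test any of the lemmas appearing in the solutions below.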
21.12.2015 18:37
Here's a pretty unique solution I found after the test
21.12.2015 18:49
Hello, ABCDE and I seem to disagree over whose solution is more beautiful. I'll post my solution:
21.12.2015 18:53
Dang, that's pretty nice. (Sorry ABCDE )
22.12.2015 06:09
22.12.2015 06:28
CantonMathGuy's solution above is how I, and I believe most people, solved it.
EDIT: This is wrong, as brian22 points out. Refer to CantonMathGuy's solution instead.
22.12.2015 07:50
24.12.2015 05:01
CantonMathGuy wrote: \[\Psi(Q)=\Psi\left(P\sum_{i=0}^{k}r_ix^i\right)=\sum_{i=0}^{k}\Psi(Pr_ix^i)=\sum_{i=0}^{k}r_i\Psi(P)^{p^i}\equiv0\pmod{\Psi(P)}\] Sorry if this sounds stupid, but isn't $\Psi : \mathbb F_p[x] \to \mathbb F_p[x]$? So why is $\Psi\left(P\sum_{i=0}^{k}r_ix^i\right)=\sum_{i=0}^{k}\Psi(Pr_ix^i)=\sum_{i=0}^{k}r_i\Psi(P)^{p^i}$ true, since you might have to first reduce the coefficients in the expression inside $\Psi$ modulo $p$ before you can change and evaluate the expression? Thanks.
24.12.2015 06:04
To be more clear, I mean: If $P=\sum_{i=0}^{k}p_ix^i$, then $\Psi(Pr_ix^i)$ will have coefficients $p_jr_i$, which could be greater than $p$, so we would need to subtract some number of $p$'s, giving $p_jr_i-mp$, so that $\sum_{i=0}^{k} (p_jr_i-mp)x^{something} = \sum_{i=0}^{k} r_i\Psi(P)^{p^i} - something$, which may not be $0 \pmod{\Psi(P)}$. $\sum_{i=0}^{k}r_i\Psi(P)^{p^i}$ could have coefficients greater than $p$, so you would have to first reduce the coefficients $\pmod p$ before you can say that it is $0 \pmod{\Psi(P)}$, right? But reducing the coefficients $\pmod p$ could make it no longer divisible by $\Psi(P)$ as you sort of changed the expression and coefficients.
24.12.2015 06:43
In $\mathbb{F}_{p}[x]$ we say that $F(x)\mid G(x)$ if there exists a polynomial $Q(x)$ such that $G(x)\equiv F(x)Q(x)\pmod{p}$. Reduction doesn't affect anything, because in $\mathbb{F}_p[x]$ the polynomials $P(x)$ and $P(x)+pQ(x)$ are the same.
24.12.2015 20:54
So just confirming, $x^3+x^2+x+1$ is divisible by $x+3$ in $\mathbb{F}_{5}[x]$ since $x^3+x^2+x+1 \equiv (x+1)(x+2)(x+3) \pmod 5$?
24.12.2015 21:54
MathPanda1 wrote: So just confirming, $x^3+x^2+x+1$ is divisible by $x+3$ in $\mathbb{F}_{5}[x]$ since $x^3+x^2+x+1 \equiv (x+1)(x+2)(x+3) \pmod 5$? That is correct.
17.03.2016 07:36
Fun problem! I hope the solution below is okay. Claim 1. $\Psi (F+G) = \Psi (F) + \Psi (G)$ and $\Psi(cF) = c\Psi(F)$ for integer $c$. This follows from the definition of $\Psi$ - easy. Claim 2. $\Psi (Fx^k) = \Psi(F)^{p^k}$ for all $k \in \mathbb{N}$. We go by induction. Let $F = \sum_{i=0}^n a_ix^i$. First we prove the statement for $k=1$. $$\Psi( Fx) = \Psi ( \sum_{i=1}^{n+1} a_{i-1}x^i) = \sum_{i=1}^{n+1} a_{i-1}x^{p^i} = (\sum_{i=1}^{n+1} a_{i-1}x^{p^{i-1}})^p = (\sum_{i=0}^n a_ix^{p^i})^p = \Psi(F)^p$$as required. Now if the result holds for $k$, then $$\Psi(Fx^{k+1}) = \Psi(Fx^k)^p = \Psi(F)^{p^{k+1}}$$so the result holds for $k+1$ as well, finishing the induction. Claim 3. $Q|P \implies \Psi(Q)|\Psi(P)$. Denote $P=QR$, and let $R = \sum_{i=0}^n a_ix^i$. Now $$\Psi(P) = \Psi(QR) = \Psi(Q\sum_{i=0}^n a_ix^i) = \sum_{i=0}^n a_i \Psi(Qx^i) = \sum_{i=0}^n a_i \Psi(Q)^{p^i} \equiv 0 \pmod{\Psi(Q)}$$which gives us that $Q|P \implies \Psi(Q)|\Psi(P)$ as required. Now for the main proof. We show that $\Psi(\text{gcd}(F,G))|\text{gcd}(\Psi(F),\Psi(G))$ and $\text{gcd}(\Psi(F),\Psi(G))|\Psi(\text{gcd}(F,G))$. First, we show that $\Psi(\text{gcd}(F,G))|\text{gcd}(\Psi(F),\Psi(G))$. Since $\Psi(\text{gcd}(F,G))| \Psi(F)$ and $\Psi(\text{gcd}(F,G)) | \Psi(G)$, the result follows. Second, we show that $\text{gcd}(\Psi(F),\Psi(G))|\Psi(\text{gcd}(F,G))$. Take $A, B \in \mathbb F_p[x]$ such that $AF+BG = \text{gcd}(F,G)$ (such $A, B$ exist by Bézout's identity). Now $\text{gcd}(\Psi(F),\Psi(G))|\Psi(F) | \Psi(AF)$ and $\text{gcd}(\Psi(F),\Psi(G))|\Psi(G)|\Psi(BG)$. This gives us $\text{gcd}(\Psi(F),\Psi(G))|\Psi(AF+BG) = \Psi(\text{gcd}(F,G))$ as required. These two relations are enough to claim that $\Psi(\gcd(F,G)) = \gcd(\Psi(F), \Psi(G))$, so we are done. $\blacksquare$
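Everything in the post above reduces to Claim 2, so it may be worth seeing it in action numerically. Here is a short Python sketch (pure Python, ad hoc helper names that are not from the post) checking $\Psi(Fx^k) = \Psi(F)^{p^k}$ for one choice of $F$ and $k$ with $p=3$.

```python
# Numerical check of Claim 2: Psi(F * x^k) = Psi(F)^(p^k), here with p = 3, k = 2.
p = 3

def trim(a):
    while a and a[-1] % p == 0:
        a.pop()
    return a

def mul(a, b):
    if not a or not b:
        return []
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] = (res[i + j] + ai * bj) % p
    return trim(res)

def power(a, n):
    r = [1]
    for _ in range(n):
        r = mul(r, a)
    return r

def psi(a):
    out = [0] * (p ** (len(a) - 1) + 1)
    for i, c in enumerate(a):
        out[p ** i] = c % p
    return trim(out)

F = [2, 0, 1]          # x^2 + 2
k = 2
x_k = [0] * k + [1]    # x^k
print(psi(mul(F, x_k)) == power(psi(F), p ** k))   # expected: True
```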
17.03.2016 07:43
@above: Wow our proofs are really similar - you even use the same variable names
31.05.2016 18:33
pi37 wrote: This implies that $\Psi$ is not only a poset homomorphism, but a poset isomorphism, which directly gives us the gcd statement. I think you also need to show that the image of $\Psi$ is closed under g.c.d. for this to work. Fortunately, not hard either.
31.07.2016 04:21
Doesn't seem too bad for a 3/6? My solution isn't super nice like some of the others on this thread but it is the easiest to find imho
As a side note, was this problem motivated by the Frobenius Endomorphism?
31.07.2016 04:25
@above: Many people (including me) felt that this was the easiest problem on December TST.
31.07.2016 07:13
CantonMathGuy wrote: @above: Many people (including me) felt that this was the easiest problem on December TST. Noted. I'll make sure P3 is harder next time. Jokes aside: I was generally of the impression that all three problems on this December TST were of comparable difficulty (all medium-easy). I personally voted this one as a bit harder than P1 or P2, but not by much.
20.09.2016 19:15
20.09.2016 20:14
By the way, it seems that the easier part of my solution is the same as brian22's idea. Also, it appears to be a deeper interpretation of CantonMathGuy's solution. But at least it's linear algebra!
22.09.2017 18:22
Does anyone know any article (and can you send me please?) about linear algebra? It seems this is going to be important topic in olympiad. Thanks!!!
22.09.2017 21:52
tenplusten wrote: Does anyone know any article (and can you send me please?) about linear algebra? It seems this is going to be important topic in olympiad. Thanks!!! I think you can do good in Olympiad without knowing linear algebra.
23.09.2017 12:23
TomMarvoloRiddle wrote: tenplusten wrote: Does anyone know any article (and can you send me please?) about linear algebra? It seems this is going to be important topic in olympiad. Thanks!!! I think you can do good in Olympiad without knowing linear algebra. Maybe... But I want to learn it ... Anyone can help me?
24.09.2017 00:20
tenplusten wrote: Does anyone know any article (and can you send me please?) about linear algebra? It seems this is going to be important topic in olympiad. Thanks!!! In terms of textbooks, I have heard good things about Linear Algebra Done Right. Most linear algebra textbooks are pretty bad though. Part II of my Napkin also covers linear algebra briefly.
28.12.2018 22:40
Can someone check this? Clearly $\Psi$ is linear. Now, note that if $A = \sum a_ix^i \in \mathbb F_p[x]$, and $B = x^k$, \[ \Psi(AB) = \Psi\left(\sum a_ix^{i+k}\right) = \sum a_ix^{p^{i+k}} = \left(\sum a_ix^{p^i}\right)^{p^k}, \]so $\Psi(AB) = \Psi(B) \circ \Psi(A)$ by linearity. Similarly this equals $\Psi(A)\circ \Psi(B)$. We have the following lemma: Lemma: Let $P,Q$ be polynomials with $\gcd(P,Q) = R$. Then, $\gcd(P(A), Q(A)) = R(A)$. Proof: Dividing $P,Q$ by $R$ it suffices to show this for the case $R = 1$. Then $P,Q$ do not have common roots (in $\overline{\mathbb F_p}$); we wish to show $P(A), Q(A)$ do not either. Indeed, if they did have some common root $\alpha$ then $A(\alpha)$ would be a common root of $P,Q$, contradiction. $\Box$ Now, let $H = \gcd(F,G)$, $F = HF'$, $G = HG'$. Then, $\Psi(F) = \Psi(HF') = \Psi(F')\circ \Psi(H)$, and $\Psi(G) = \Psi(G')\circ \Psi(H)$. Now, we see that $\gcd(\Psi(F), \Psi(G)) = \gcd(\Psi(F'), \Psi(G'))\circ \Psi(H)$. Thus it suffices to show that for $\gcd(F',G') = 1$, $\gcd(\Psi(F'),\Psi(G')) = x$. Clearly $x$ divides both $\Psi(F')$ and $\Psi(G')$. By Bézout there exist $A,B$ with $AF'+BG' = 1$. Taking $\Psi$ of both sides we have \[ \Psi(A)\circ \Psi(F') + \Psi(B)\circ \Psi(G') = x. \]Now, let $H' = \gcd(\Psi(F'), \Psi(G'))$, and we have $H'\mid \Psi(F')\mid \Psi(A)\circ \Psi(F')$ and $H'\mid \Psi(B)\circ\Psi(G')$, so $H'\mid x$ as well; thus $H' = x$ and we are done. $\blacksquare$
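The opening computation in this post, $\Psi(AB) = \Psi(B)\circ\Psi(A) = \Psi(A)\circ\Psi(B)$, can also be checked by machine. A minimal Python sketch over $\mathbb F_2$ (the helper names are ad hoc and not from the post):

```python
# Check of the composition identity Psi(A*B) = Psi(B) o Psi(A) over F_2.
p = 2

def trim(a):
    while a and a[-1] % p == 0:
        a.pop()
    return a

def add(a, b):
    n = max(len(a), len(b))
    return trim([((a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)) % p
                 for i in range(n)])

def mul(a, b):
    if not a or not b:
        return []
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] = (res[i + j] + ai * bj) % p
    return trim(res)

def compose(f, g):
    # f(g(x)) by Horner's rule
    res = []
    for c in reversed(f):
        res = add(mul(res, g), [c])
    return res

def psi(a):
    out = [0] * (p ** (len(a) - 1) + 1)
    for i, c in enumerate(a):
        out[p ** i] = c % p
    return trim(out)

A = [1, 1, 1]      # x^2 + x + 1
B = [1, 0, 1]      # x^2 + 1
print(psi(mul(A, B)) == compose(psi(B), psi(A)))   # expected: True
print(psi(mul(A, B)) == compose(psi(A), psi(B)))   # composition commutes here as well
```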
24.02.2019 10:58
Let $\Phi:\mathbb{F}_p[X]\to\mathbb{F}_p[Y]$ be given by \[\Phi\left(\sum_{i=0}^n a_iX^i\right) = \sum_{i=0}^n a_iY^{1+p+p^2+\cdots+p^{i-1}}.\]For example, $\Phi(X^2+X+1)=Y^{p+1}+Y+1$. Note that \[\Psi(F)(X)=X\Phi(F)(X^{p-1}).\]The crux of the solution is lemma 2 below, but we need lemma 1 as a prerequisite. Lemma 1: Suppose $P\in\mathbb{F}_p[Y]$. Then $P(Y^{p^k})=P(Y)^{p^k}$. Proof of Lemma 1: It suffices to show the lemma for $k=1$; the other $k$ follow by repetition of the $k=1$ case. This is the lemma that utilizes the fact that we are working mod $p$. Suppose \[P(Y)=\sum_{i=0}^np_iY^i.\]Then, by the multinomial theorem, \[P(Y)^p=\sum_{a_0+\cdots+a_n=p}\frac{p!}{a_0!a_1!\cdots a_n!}p_0^{a_0}p_1^{a_1}\cdots p_n^{a_n}Y^{a_1+2a_2+\cdots+na_n}.\]But if at least two of $a_0,\ldots,a_n$ are nonzero, then $\frac{p!}{a_0!a_1!\cdots a_n!}\equiv 0\pmod{p}$, so only the terms $p_i^pY^{ip}=p_iY^{ip}$ survive (using $p_i^p=p_i$), and the lemma is proved. $\blacksquare$ Lemma 2: Suppose $A,F\in\mathbb{F}_p[X]$. There exists a polynomial $\bar{A}\in\mathbb{F}_p[Y]$ such that \[\Phi(A\cdot F)=\bar{A}\cdot\Phi(F).\] Proof of Lemma 2: For brevity, let $\bar{F}=\Phi(F)$. Suppose \[A(X)=\sum_{i=0}^na_iX^i\]and \[F(X)=\sum_{i=0}^mf_iX^i.\]Let \[\bar{A}(Y):=\sum_{i=0}^n a_iY^{1+p+\cdots+p^{i-1}}\bar{F}(Y)^{p^i-1}.\]Then, we have that \begin{align*} \bar{A}(Y)\bar{F}(Y)&=\sum_{i=0}^n a_i Y^{1+p+\cdots+p^{i-1}}\bar{F}(Y)^{p^i} \\ &= \sum_{i=0}^n a_i Y^{1+p+\cdots+p^{i-1}}\bar{F}(Y^{p^i}) \\ &= \sum_{i=0}^n a_i Y^{1+p+\cdots+p^{i-1}}\sum_{j=0}^m f_j Y^{(1+p+\cdots+p^{j-1})p^i} \\ &= \sum_{i=0}^n\sum_{j=0}^m a_if_j Y^{1+p+\cdots+p^{i+j-1}} \\ &= \Phi(A\cdot F)(Y), \end{align*}as desired. $\blacksquare$ Suppose $\gcd(F,G)=H$. We see that $\Phi(H)\mid\Phi(F)$ by lemma 2, so $\Psi(H)\mid\Psi(F)$, and similarly $\Psi(H)\mid\Psi(G)$. Therefore, $\Psi(H)\mid\gcd(\Psi(F),\Psi(G))$. If we showed that there were $A',B'\in\mathbb{F}_p[X]$ so that \[\Psi(H)=A'\Psi(F)+B'\Psi(G),\]then we'd be done (because this would imply $\gcd(\Psi(F),\Psi(G))\mid\Psi(H)$, and we already have $\Psi(H)\mid\gcd(\Psi(F),\Psi(G))$). Suppose $H(X)=A(X)F(X)+B(X)G(X)$. Then, construct $\bar{A},\bar{B}\in\mathbb{F}_p[Y]$ according to lemma 2. Thus, \[\Phi(H)=\Phi(A\cdot F)+\Phi(B\cdot G)=\bar{A}\cdot\Phi(F)+\bar{B}\cdot\Phi(G).\]Let $A'(X)=\bar{A}(X^{p-1})$ and $B'(X)=\bar{B}(X^{p-1})$. Thus, \[\Psi(H)(X)=X\Phi(H)(X^{p-1})=A'(X)\Psi(F)(X)+B'(X)\Psi(G)(X),\]so $\gcd(\Psi(F),\Psi(G))=\Psi(H)$, as desired. $\square$
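The bridge identity $\Psi(F)(X)=X\,\Phi(F)(X^{p-1})$ used above is easy to confirm numerically. A small Python sketch for $p=3$ (ad hoc helpers; the exponent $1+p+\cdots+p^{i-1}$ is computed as $(p^i-1)/(p-1)$):

```python
# Check of the identity Psi(F)(X) = X * Phi(F)(X^(p-1)), with p = 3.
p = 3

def psi(a):
    out = [0] * (p ** (len(a) - 1) + 1)
    for i, c in enumerate(a):
        out[p ** i] = c % p
    return out

def phi(a):
    # Phi(sum a_i X^i) = sum a_i Y^(1 + p + ... + p^(i-1))
    e = [(p ** i - 1) // (p - 1) for i in range(len(a))]   # 1 + p + ... + p^(i-1)
    out = [0] * (e[-1] + 1)
    for i, c in enumerate(a):
        out[e[i]] = c % p
    return out

F = [2, 1, 0, 1]                        # x^3 + x + 2
left = psi(F)
phiF = phi(F)
# X * Phi(F)(X^(p-1)): exponent j in Phi(F) becomes exponent (p-1)*j + 1
right = [0] * len(left)
for j, c in enumerate(phiF):
    right[(p - 1) * j + 1] = c
print(left == right)                    # expected: True
```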
28.02.2019 16:15
Even though I love geometry, I found this problem to be the easiest USA TST P3 that I have tried till now. Weird. Anyway, I'll provide my solution (which I guess is nothing new for this forum, but still): I don't like $F$ and $G$ so I'll use $P$ and $Q$. First of all, a trivial observation (proof not included, obviously)- OBSERVATION $\Psi(P+Q)=\Psi(P)+\Psi(Q)$. In particular, $\Psi(aP)=a\Psi(P)$, where $a \in \mathbb{Z}$. Now, for some serious claims, which will help us conquer the problem. But before that a well-known lemma which we'll require later on- LEMMA Let $a_1,a_2, \dots ,a_n$ be $n$ integers and $p$ be a prime. Then $$(a_1+a_2+ \dots +a_n)^p \equiv a_1^p+a_2^p+ \dots +a_n^p \pmod{p}$$ Proof of Lemma The proof proceeds by induction on $n$. For $n=2$, we just need to apply binomial theorem on the LHS, and use the fact that $p \mid \binom{p}{i}$ for $i \in \{1,2, \dots ,p-1\}$. Then the inductive step is fairly simple. $\Box$ CLAIM 1 $\Psi(P) \mid \Psi(Px^m)$ for all $m \in \mathbb{Z}^+$. Proof of Claim 1 Let $P(x)=a_0+a_1x+ \dots +a_nx^n$. Then $$\Psi(Px^m)=\Psi(a_0x^m+a_1x^{m+1}+ \dots +a_nx^{m+n})=a_0x^{p^m}+a_1x^{p^{m+1}}+ \dots +a_nx^{p^{m+n}}$$Now, by Fermat's Little Theorem (in the form $a^p \equiv a \pmod{p}$, which holds for every $a \in \mathbb{F}_p$, including $a=0$), we have that $$a_i \equiv a_i^{p^m} \pmod{p} \quad \forall i \in \{0,1, \dots ,n\}$$Using the above equality, and working in $\mathbb{F}_p$, we have $$\Psi(Px^m)=\sum_{i=0}^n \left(a_ix^{p^i} \right)^{p^m} \overset{\text{LEMMA}}{\Longrightarrow} \Psi(Px^m)=\left(\sum_{i=0}^n a_ix^{p^i} \right)^{p^m}=(\Psi(P))^{p^m} \Rightarrow \Psi(P) \mid \Psi(Px^m) \quad \Box$$ CLAIM 2 (Key Claim) $\Psi(P) \mid \Psi(PQ)$ for all $P,Q \in \mathbb F_p[x]$. Proof of Claim 2 Let $Q(x)=b_0+b_1x+ \dots +b_kx^k$. Then using our initial observation, we have $$\Psi(PQ)=\Psi(b_0P+b_1xP+ \dots +b_kx^kP)=b_0\Psi(P)+b_1 \Psi(Px)+ \dots + b_k \Psi(Px^k) \overset{\text{CLAIM 1}}{\Longrightarrow} \Psi(P) \mid \Psi(PQ) \quad \Box$$ Return to the problem at hand. Let $\gcd(P,Q)=R$. Then, by Claim 2, $\Psi(R) \mid \Psi(P)$ and $\Psi(R) \mid \Psi(Q)$, which together gives that $\Psi(R) \mid \gcd(\Psi(P),\Psi(Q))$. Now, by Bezout's Identity, we can write $R=AP+BQ$ for polynomials $A,B \in \mathbb F_p[x]$. Then, using our observation once again, we get that $$\Psi(R)=\Psi(AP+BQ)=\Psi(AP)+\Psi(BQ) \overset{\text{CLAIM 2}}{\Longrightarrow} \Psi(R)=C\Psi(P)+D\Psi(Q) \text{ for some polynomials } C,D \in \mathbb F_p[x]$$But this means (again by Bezout's Identity) that $\gcd(\Psi(P),\Psi(Q)) \mid \Psi(R)$. Thus, we have $$\gcd(\Psi(P),\Psi(Q))=\Psi(R)=\Psi(\gcd(P,Q)) \quad \blacksquare$$
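The LEMMA above is stated for integers, but in Claim 1 it is applied to the polynomials $a_ix^{p^i}$; the same statement holds verbatim in $\mathbb F_p[x]$. Here is a quick Python check of that polynomial version, $(A+B)^p = A^p + B^p$ in $\mathbb F_3[x]$ (helper names are ad hoc illustrations):

```python
# Check of the "freshman's dream" used in the LEMMA: (A + B)^p = A^p + B^p in F_p[x], here p = 3.
p = 3

def trim(a):
    while a and a[-1] % p == 0:
        a.pop()
    return a

def add(a, b):
    n = max(len(a), len(b))
    return trim([((a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)) % p
                 for i in range(n)])

def mul(a, b):
    if not a or not b:
        return []
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] = (res[i + j] + ai * bj) % p
    return trim(res)

def power(a, n):
    r = [1]
    for _ in range(n):
        r = mul(r, a)
    return r

A = [1, 2, 0, 1]   # x^3 + 2x + 1
B = [2, 0, 1]      # x^2 + 2
print(power(add(A, B), p) == add(power(A, p), power(B, p)))   # expected: True
```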
04.05.2019 13:30
Claim 1: $\left(\sum_{i = 1}^n a_i\right)^p = \sum_{i = 1}^n a_i^p$ for $a_i \in \mathbb{F}_p[x]$. Proof. $p \left| \binom{p}{k}\right.$ for all $1 \le k < p$ since $p$ divides the numerator $p!$ but no number in the denominator is divisible by $p$. \begin{align*} \left(\sum_{i = 1}^n a_i\right)^p = \left(a_n + \sum_{i = 1}^{n - 1} a_i\right)^p &= a_n^p + \sum_{k = 1}^{p - 1} \binom{p}{k} a_n^k \left(\sum_{i = 1}^{n - 1} a_i\right)^{p - k} + \left(\sum_{i = 1}^{n - 1} a_i\right)^{p}\\ &= a_n^p + \left(\sum_{i = 1}^{n-1} a_i\right)^{p} = \cdots = \sum_{i = 1}^n a_i^p\qquad\qquad\qquad\square \end{align*} Let $\circ$ refer to composition of functions. Let $\mathbb{S}$ be the image of $\mathbb{F}_p[x]$ under $\Psi$. Claim 2: $(\mathbb{S}, +, \circ)$ is a ring. Proof. Clearly $\mathbb{S}$ is closed under addition. Let $P(x) = \sum_{i=1}^m a_i x^{p^i}$ and $Q(x) = \sum_{i=1}^n b_i x^{p^i}$ \[(Q(x))^{p^j} = \left(\sum_{i=1}^n b_i x^{p^i}\right)^{p^j} = \left(\sum_{i=1}^n b_i^{p} x^{p^{i + 1}}\right)^{p^{j - 1}} = \left(\sum_{i=1}^n b_i x^{p^{i + 1}}\right)^{p^{j - 1}} = \cdots = \sum_{i=1}^n b_i x^{p^{i + j}} \]\[(P\circ Q)(x) = P(Q(x)) = \sum_{j = 1}^m a_j (Q(x))^{p^j} = \sum_{j = 1}^m \sum_{i=1}^n a_j b_i x^{p^{i + j}} \in \mathbb{S}\]Thus $\mathbb{S}$ is closed under composition. Clearly both identities are present in $\mathbb{S}$ and additive inverses are present in $\mathbb{S}$. Thus, $(\mathbb{S}, +, \circ)$ is a ring. $\square$ It's well-known that $(\mathbb{F}_p[x], +, \times)$ is a ring. We'll call the rings $\mathbb{S}$ and $\mathbb{F}_p[x]$. Claim 3: $\mathbb{F}_p[x] \cong \mathbb{S}$ with $\Psi$ as the isomorphism. Proof. It's trivial to prove that $\Psi$ distributes over $+$ and is bijective. Multiplicative identity of $\mathbb{F}_p[x]$ is $1$ and that of $\mathbb{S}$ is $P(x) = x$. So clearly $\Psi(1) = x$. Now from the proof of claim 2 we know that for $P(x) = \sum_{i = 1}^m a_i x^{i}$ and $Q(x) = \sum_{i=1}^n b_i x^{i}$, \[(\Psi (P) \circ \Psi (Q))(x) = \sum_{i = 1}^m \sum_{j =1}^n a_i b_j x^{p^{i + j}} = \Psi\left(\sum_{i = 1}^m \sum_{j =1}^n a_i b_j x^{i + j}\right) = \Psi\left(P(x) \times Q(x)\right).\]Thus, $\Psi$ is an isomorphism from $\mathbb{F}_p[x] \to \mathbb{S}$. $\square$ Claim 4: If $P, Q \in \mathbb{F}_p[x]$ such that $Q | P$, then $\Psi(Q) | \Psi(P)$. Proof. Let $P = QR$. Since $x | T(x)$ for all $T \in \mathbb{S}$, replace $T$ with $\Psi(R)$ and $x$ with $\Psi(Q)$: \[\Psi(Q) | \Psi R (\Psi Q) = (\Psi R) \circ (\Psi Q) = \Psi(RQ) = \Psi(P).\qquad \square\] Claim 5: $\Psi(\gcd(P, Q)) | \gcd(\Psi(P), \Psi(Q))$ for all $P, Q \in \mathbb{F}_p[x]$. Proof. \[\left.\begin{cases} \gcd(P, Q) | P \\\gcd(P, Q) | Q \end{cases}\hspace{-1em}\right\} \implies \left.\begin{cases} \Psi(\gcd(P, Q)) | \Psi(P) \\\Psi(\gcd(P, Q)) | \Psi(Q) \end{cases}\hspace{-1em}\right\} \implies \Psi(\gcd(P, Q)) | \gcd(\Psi(P), \Psi(Q)) \qquad\square\] Claim 6: $\gcd(\Psi(P), \Psi(Q)) | \Psi(\gcd(P, Q))$ for all $P, Q \in \mathbb{F}_p[x]$. Proof. Let $\gcd(P, Q) = PX + QY$ for $X, Y \in \mathbb{F}_p[x]$ by Bezout's Identity for polynomials. Now, \[\Psi(\gcd(P, Q)) = \Psi(PX + QY) = \Psi(PX) + \Psi(QY)\]Since $P|PX$ and $Q|QY$, $\Psi(P)|\Psi(PX)$ and $\Psi(Q)|\Psi(QY)$. Therefore \[\gcd(\Psi(P), \Psi(Q))| \Psi(PX) + \Psi(QY)\]as desired. $\square$ Combining claim 5 and 6, we get the desired result, \[\gcd(\Psi(P), \Psi(Q)) = \Psi(\gcd(P, Q))\]for all $P, Q \in \mathbb{F}_p[x]$.
13.09.2019 06:20
For an $A \in \mathbb{F}_p[x]$, let $f_A(x)$ denote the polynomial $\Psi(A)$. Observe that $\Psi(kA) = k\Psi(A)$ for a constant $k$ and $\Psi(A+B) = \Psi(A) + \Psi(B).$ Lemma: $f_{AB}(x) = f_A(f_B(x))$.
Lemma: Let $d(x)$ be a monic polynomial. Then for relatively prime polynomials $A, B$ we have $\gcd(f_A(d), f_B(d)) = d$.
To finish, let $d = \gcd(F, G)$ and write $\gcd(\Psi(F), \Psi(G)) = \gcd( f_{F/d}(f_d), f_{G/d}(f_d))$ by the first lemma. Then apply the second lemma to get that this $\gcd$ is just $f_d=\Psi(\gcd(F, G))$. $\blacksquare$
25.02.2020 10:42
Note that $\Psi(P+Q)=\Psi(P)+\Psi(Q)$ and $\Psi(kP)=k\Psi(P)$. Lemma: For $P\in\mathbb F_p[x]$ and $k\in\mathbb Z_{\geq0}$, we have $\Psi(Px^k)=\Psi(P)^{p^k}$. Proof. Let $\textstyle P(x)=\sum_{i=0}^na_ix^i$. Write \[\Psi(Px^k)=\Psi\left(\sum_{i=0}^na_ix^{i+k}\right)=\sum_{i=0}^na_i\left(x^{p^i}\right)^{p^k}=\left(\sum_{i=0}^na_ix^{p^i}\right)^{p^k}=\Psi(P)^{p^k},\]where the third equality is by Frobenius. $\blacksquare$ Lemma: For $P,Q\in\mathbb F_p[x]$, if $P\mid Q$, then $\Psi(P)\mid\Psi(Q)$. Proof. Let $Q=PR$ and $\textstyle R=\sum_{i=0}^na_ix^i$. Then \[\Psi(Q)=\Psi\left(P\sum_{i=0}^na_ix^i\right)=\sum_{i=0}^na_i\Psi(Px^i)=\sum_{i=0}^na_i\Psi(P)^{p^i},\]which is divisible by $\Psi(P)$. $\blacksquare$ Let $H=\gcd(F,G)$ and $I=\gcd(\Psi(F),\Psi(G))$, so it suffices to prove $\Psi(H)=I$. Note the following: By the above lemma, we have $\Psi(H)\mid\Psi(F)$ and $\Psi(H)\mid\Psi(G)$, so $\Psi(H)\mid I$. Let $H=AF+BG$ for $A,B\in\mathbb F_p[x]$ by Bézout's lemma. We have $I\mid\Psi(F)\mid\Psi(AF)$ and $I\mid\Psi(G)\mid\Psi(BG)$, so $I\mid\Psi(AF)+\Psi(BG)=\Psi(H)$. The conclusion follows.
13.06.2020 18:50
Take some three polynomials $P(x)=\sum_{i=0}^{\infty}a_ix^i$, $Q(x)=\sum_{i=0}^{\infty}b_ix^i$ and $R(x)=\sum_{i=0}^{\infty}c_ix^i$, where, for simplicity, all but finitely many coefficients are zero. We have $\Psi(PQ) = \Psi(\sum_{i=0}^{\infty}\sum_{j=0}^{\infty}a_ib_jx^{i+j}) = \sum_{i=0}^{\infty}\sum_{j=0}^{\infty}a_ib_jx^{p^{i+j}}$, now notice $\Psi(P+Q)=\Psi(P)+\Psi(Q)$ and the main claim: $$gcd(\Psi(PQ), \Psi(PR)) = gcd(\sum_{i=0}^{\infty}\sum_{j=0}^{\infty}a_ib_jx^{p^{i+j}}, \sum_{i=0}^{\infty}\sum_{j=0}^{\infty}a_ic_jx^{p^{i+j}}) =$$$$= gcd(\sum_{i=0}^{\infty}\sum_{j=0}^{\infty}a_i(b_j-c_j)x^{p^{i+j}}, \sum_{i=0}^{\infty}\sum_{j=0}^{\infty}a_ic_jx^{p^{i+j}}) = gcd(\Psi(P(Q-R)), \Psi(PR))$$ This follows from $gcd(X,Y)=gcd(X-Y,Y)$, the standard number theory lemma but for polynomials. Now, assume $F=PQ$, $G=PR$ and $gcd(Q,R)=1$ and $gcd(F,G)=P$. Keep repeating the main result above, $gcd(\Psi(PQ), \Psi(PR))=gcd(\Psi(P(Q-R)), \Psi(PR))$, and by the Euclidean algorithm we arrive at $gcd(\Psi(F),\Psi(G)) = gcd(\Psi(P), \Psi(0)) = \Psi(P) = \Psi(gcd(F,G)) \implies gcd(\Psi(F),\Psi(G)) = \Psi(gcd(F,G))$, which we wanted to prove $\blacksquare$ EDIT: Am I mistaken or does this method generalize from $\mathbb{F}_p[x]$ to $\mathbb{Z}[x]$?
15.06.2020 18:55
First, we claim that $a_1^p+a_2^p+\ldots +a_n^p$ leaves a remainder with all terms divisible by $p$ when divided by $a_1+\ldots +a_n$ (as polynomials with integer coefficients in $a_1,\ldots,a_n$). We prove this with induction on $n$. For $n=2$, the claim is true: by the binomial theorem, $a_1^p+a_2^p$ differs from $(a_1+a_2)^p$ by terms whose coefficients $\binom{p}{i}$ are divisible by $p$, and $(a_1+a_2)^p$ is divisible by $a_1+a_2$. Now, suppose the claim is true for $n-1$. Now, $$a_1^p+a_2^p+\ldots+a_n^p=(a_1^p+\ldots + a_{n-2}^p+(a_{n-1}+a_n)^p)-\sum_{i=1}^{p-1}a_{n-1}^ia_n^{p-i}\binom{p}{i}$$For the first term, we are guaranteed a remainder divisible by $p$ by induction. On the other hand, for the latter term, $\binom{p}{i}$ is always divisible by $p$, so of course, subtracting it will preserve the fact that the remainder has all coefficients divisible by $p$, as desired. Now, note that if $P(x)=a_nx^n+a_{n-1}x^{n-1}+\ldots + a_0$, then $\Psi(P(x))=a_nx^{p^n}+\ldots +a_0x^{p^0}$ and $\Psi(xP(x))=\left(a_nx^{p^n}\right)^p+\ldots +\left(a_0x^{p^0}\right)^p$ as $a^p\equiv a\pmod p$. Now, using our claims, the remainder when $\Psi(xP(x))$ is divided by $\Psi(P(x))$ has all terms divisible by $p$, and since we are working in $\mathbb{F}_p$, this means $\Psi(P(x))|\Psi(xP(x))$. Using induction, this gives $\Psi(P(x))|\Psi(x^kP(x))$, and since $\Psi$ is additive, we get $\Psi(P(x))|\Psi(Q(x))$ whenever $P|Q$. To finish, we just use the Euclidean Algorithm. We have that $\gcd(\Psi(P(x)),\Psi(Q(x)))=\gcd(\Psi(P(x)),\Psi(Q(x)-R(x)P(x)))$ by properties of $\gcd$, additivity of $\Psi$, and the fact that $\Psi(P(x))|\Psi(P(x)R(x))$, so applying normal Euclidean algorithm gives $$\gcd(\Psi(P(x)),\Psi(Q(x)))=\gcd(\Psi(\gcd(P(x),Q(x))),\Psi(\gcd(P(x),Q(x))))=\Psi(\gcd(P(x),Q(x)))$$as desired.
28.11.2020 10:24
This is essentially the same idea as brian22's solution, but phrased differently. We define a notation $[a_0, a_1, a_2, \dots, a_n]$ recursively such that \[[a_0,a_1,\dots,a_n] = a_0 + [a_1,\dots,a_n]^p.\]Then we can prove by induction (using a property of the Frobenius endomorphism) that \[[a_0x, a_1x,\dots, a_nx] = \Psi(a_0 + a_1x+\dots +a_nx^n).\]Notice also that $[a_0,\dots,a_n] + [b_0,\dots,b_n] = [a_0+b_0,\dots, a_n+b_n]$ and $c[a_0,\dots,a_n] = [ca_0,\dots,ca_n]$ for any $c\in \mathbb{F}_p$. Claim. If $a(x) \mid f(x)$, then $\Psi(a(x)) \mid \Psi(f(x)).$ Proof. Let $a(x) = a_0 + a_1x+ \dots +a_nx^n$ and $f(x) = a(x)b(x)$ where $b(x) = b_0 + b_1x + \dots + b_m x^m$. Write $A = [a_0x, a_1x, \dots, a_nx] = \Psi(a(x))$, then it suffices for us to prove that \[\Psi(f(x)) = [b_0A, b_1A, \dots, b_mA].\]This can be easily shown by expanding out the entire thing. $\square$ Back to the original problem, we use the Euclidean algorithm. Suppose WLOG that $G(x) = F(x)\cdot Q(x) + R(x)$ where $\deg R < \deg F$, and assume the problem is true for the pair $(R,F)$. Our notation implies $\Psi$ is linear, so \[\gcd(\Psi(F), \Psi(G)) = \gcd(\Psi(F), \Psi(R) + \Psi(F\cdot Q)) = \gcd(\Psi(F), \Psi(R)) = \Psi(\gcd(F,R)) = \Psi(\gcd(F,G))\]as desired.
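The bracket notation above is easy to play with by machine. The following Python sketch (ad hoc helpers, $p=2$, not part of the post) checks the identity $[a_0x,a_1x,\dots,a_nx]=\Psi(a_0+a_1x+\dots+a_nx^n)$ for one polynomial:

```python
# Check of [a_0*x, ..., a_n*x] = Psi(a_0 + a_1 x + ... + a_n x^n) over F_2.
p = 2

def trim(a):
    while a and a[-1] % p == 0:
        a.pop()
    return a

def add(a, b):
    n = max(len(a), len(b))
    return trim([((a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)) % p
                 for i in range(n)])

def mul(a, b):
    if not a or not b:
        return []
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] = (res[i + j] + ai * bj) % p
    return trim(res)

def power(a, n):
    r = [1]
    for _ in range(n):
        r = mul(r, a)
    return r

def bracket(entries):
    # [q_0, q_1, ..., q_n] = q_0 + [q_1, ..., q_n]^p, with the empty bracket equal to 0
    if not entries:
        return []
    return add(entries[0], power(bracket(entries[1:]), p))

def psi(a):
    out = [0] * (p ** (len(a) - 1) + 1)
    for i, c in enumerate(a):
        out[p ** i] = c % p
    return trim(out)

A = [1, 0, 1, 1]                          # 1 + x^2 + x^3
entries = [mul([c], [0, 1]) for c in A]   # the polynomials a_i * x
print(bracket(entries) == psi(A))         # expected: True
```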
30.11.2022 17:32
Claim: $\Psi(F) \mid \Psi(G)$ iff $F \mid G$. We can split this claim into two parts, proving that $F \mid G \implies \Psi(F) \mid \Psi(G)$, and $\Psi(F) \mid \Psi(G) \implies F \mid G$. Notice that this claim is enough to complete the problem. For the first part, notice that you can use the fact that $\Psi(A+B) = \Psi(A)+\Psi(B)$ to split the polynomials into monomials. For the second part, FTSOC, we let $G = FA + R$ where $R$ is a nonzero polynomial with $\deg R < \deg F$. This implies that $\Psi(G) = \Psi(FA) + \Psi(R)$. By the first part, $\Psi(FA) \equiv 0 \pmod{\Psi(F)}$, so therefore $\Psi(F)\mid\Psi(R)$. Since $\deg R < \deg F$, $R \equiv 0$, therefore implying the result. $\blacksquare$
24.12.2022 02:41
Just brute force by inventing your own EA. Let $[F, G]$ denote the statement that the desired condition holds for $F, G$. First: $\Psi$ is linear so $[F, G] \leftrightarrow [F-kG, G]$. Next: By substitution, $[F, G] \leftrightarrow [xF, xG]$. Next: By symmetry, $[F, G] \leftrightarrow [G, F]$. Finally: I claim $[xF, G] \leftrightarrow [F, G]$ if $x \nmid G$. This is because $\Psi(xF)(x) = \Psi(F)(x^p) = \Psi(F)^p$, so it suffices to show $\gcd(\Psi(F)^p, \Psi(G)) = \gcd(\Psi(F), \Psi(G))$. This follows immediately from the claim that $\Psi(G)$ has no double roots for any such $G$ (with $x \nmid G$). This is because $\Psi(G)' = G(0)$, which must be nonzero by assumption, so $\Psi(G)$ shares no root with its derivative, finishing. Now, these operations together compose the backward Euclidean algorithm, which cancels the constant term and divides out factors of $x$ until the degree is minimal, finishing.
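The last step above, $\gcd(\Psi(F)^p,\Psi(G)) = \gcd(\Psi(F),\Psi(G))$ when $x\nmid G$, can be spot-checked numerically. A Python sketch for $p=2$ (ad hoc helpers, one concrete example only):

```python
# Check of gcd(Psi(F)^p, Psi(G)) = gcd(Psi(F), Psi(G)) when x does not divide G, with p = 2.
p = 2

def trim(a):
    while a and a[-1] % p == 0:
        a.pop()
    return a

def mul(a, b):
    if not a or not b:
        return []
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] = (res[i + j] + ai * bj) % p
    return trim(res)

def rem(a, b):
    a = a[:]
    inv = pow(b[-1], p - 2, p)
    while a and len(a) >= len(b):
        shift = len(a) - len(b)
        c = (a[-1] * inv) % p
        for i, bi in enumerate(b):
            a[i + shift] = (a[i + shift] - c * bi) % p
        trim(a)
    return a

def gcd(a, b):
    while b:
        a, b = b, rem(a, b)
    inv = pow(a[-1], p - 2, p)
    return [(c * inv) % p for c in a]

def psi(a):
    out = [0] * (p ** (len(a) - 1) + 1)
    for i, c in enumerate(a):
        out[p ** i] = c % p
    return trim(out)

F = [1, 1, 1]        # x^2 + x + 1
G = [1, 0, 1, 1]     # x^3 + x^2 + 1, nonzero constant term so x does not divide G
PF, PG = psi(F), psi(G)
PFp = mul(PF, PF)    # Psi(F)^p with p = 2
print(gcd(PFp, PG) == gcd(PF, PG))   # expected: True
```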
02.02.2023 18:35
Really nice. We have the three following key properties: $\Psi(f+g)=\Psi(f)+\Psi(g)$. This is obvious. $c\Psi(f)=\Psi(cf)$, where $c \in \mathbb{F}_p$. This is clear because $c^p=c$ in $\mathbb{F}_p$. $\Psi(f)^p=\Psi(\mathrm{id}\cdot f)$ (as a polynomial equality). This is because if $f(x)=a_nx^n+\cdots+a_0$, then both sides are equal to $a_nx^{p^{n+1}}+\cdots+a_0x^{p^1}$, since for any $a \in \mathbb{F}_p[x]$ we have $a(x^p)=a(x)^p$. Now we will essentially apply the Euclidean algorithm to $\Psi(\gcd(F,G))$ and essentially show that the same thing works on $\gcd(\Psi(F),\Psi(G))$. Indeed, WLOG let $\deg F>\deg G$; select $c \in \mathbb{F}_p$ and $k\in \mathbb{Z}_{\geq 0}$ such that $$\deg\left(F-cx^kG\right) < \deg F,$$so $$\Psi(\gcd(F(x),G(x)))=\Psi(\gcd(F(x)-cx^kG(x),G(x)))$$Likewise, we have $$\gcd(\Psi(F(x)),\Psi(G(x)))=\gcd(\Psi(F(x))-c\Psi(G(x))^{p^k},\Psi(G(x)))=\gcd(\Psi(F(x)-cx^kG(x)),\Psi(G(x))).$$Thus by repeatedly doing this (which is just the Euclidean algorithm on $F$ and $G$), we reduce $\Psi(\gcd(F,G))$ to $\Psi(\gcd(H,H))=\Psi(H)$ where $H=\gcd(F,G)$. The same sequence of operations reduces $\gcd(\Psi(F),\Psi(G))$ to $\gcd(\Psi(H),\Psi(H))$, which clearly equals $\Psi(H)$ as well. Thus we are done. $\blacksquare$ Remark: My initial solution to this was a bit different. Similarly to the third key property we may prove the more general statement that $\Psi(fg)=\Psi(f)(\Psi(g))$ (if $f \equiv \mathrm{id}$ we obtain the third key property). Then if $F \equiv ab$ and $G \equiv ac$ where $b \perp c$, we want to prove that $\Psi(a)=\gcd(\Psi(ab),\Psi(ac))=\gcd(\Psi(b)(\Psi(a)),\Psi(c)(\Psi(a)))$. From here the solution basically remains the same. The "necessity" of the third property or its generalization comes from a need to find some property of $\Psi$ which will connect it with $\gcd$. It's a bit unclear how the third property actually relates to $\gcd$, but in its general form it seems much clearer, since $\gcd$ is certainly related to products of polynomials. Indeed, from the general form it is essentially immediate that at the very least we have $\Psi(a) \mid \gcd(\Psi(b)(\Psi(a)),\Psi(c)(\Psi(a)))$, which indicates that this route should be pushed further.
27.07.2023 20:30
Neat problem; the key idea is to inductively show that all pairs $(F(x), G(x))$ in $\mathbb{F}_p[x]^2$ work. For $(F, G) \in \mathbb{F}_p[x]^2$, let \[ \alpha(F, G) = \begin{cases} 1 & (F, G) \ \text{works} \\ 0 & (F, G) \ \text{does not work}. \end{cases} \]We say that $(F, G) \sim (H, K)$ if $\alpha(F, G)=1$ implies $\alpha(H, K)=1$. Lemma: [Symmetry] We have $(F, G) \sim (G, F)$. Proof. Note that $\gcd(F, G)$ is symmetric with respect to $F, G$, so we are done. The above lemma completes the proof of the fact that $\sim$ is an equivalence relation. Lemma: [Euclidean algorithm] We have $(F, G) \sim (F-G, G)$, $(F, G) \sim (F+G, G)$. Proof. This holds due to the linearity property of $\Psi$. Lemma: [Factoring monomials] We have $(F, G) \sim (xF, xG)$. Proof. By definition, it can be easily seen that \begin{align*} \Psi(xF(x)) &= \Psi(F)(x^p) \\ &= \Psi(F)(x)^p , \end{align*}from which the lemma follows (this is by the use of $\gcd$ on $p$th powers of polynomials). Now onto the hardest one. Lemma: [Factoring monomials from one term] If $x \nmid G$, then $(F, G) \sim (xF, G)$. Proof. As in the proof of the prior lemma, $\Psi(xF(x)) = \Psi(F)^p$. Note that $\Psi(G)'(x)=G(0)$ for all $x$, and $G(0)$ must be nonzero, so $\Psi(G)$ must have no double root. Thus, $\gcd(\Psi(F)^p, \Psi(G)) = \gcd(\Psi(F), \Psi(G))$, and we are done from our first observation. Using the above lemmas, we can inductively show that $\alpha(F, G)=1$ for all $F, G$ by following an algorithm similar to that of the Euclidean Algorithm. We are done.
21.10.2023 21:01
The key is to offer the following multiplicative characterization of $\Psi$. Claim. We have $\Psi(fg) = \Psi(f) \circ \Psi(g)$. Proof. It suffices to prove the monomial case $$\Psi(x^n f(x)) = \Psi(f(x))^{p^n}$$because $\Psi$ is additive. To see this, set $f(x) = pg(x) + h(x)$ in $\mathbb Z$ and apply the binomial theorem. $\blacksquare$ Now we obviously have $\Psi(\gcd(f, g)) \mid \gcd(\Psi(f), \Psi(g))$ because if $f \mid g$, then $\Psi(f)\mid \Psi(g)$. (This is because any polynomial in the image of $\Psi$ has no constant term.) To show the other direction, use Bezout's lemma: there exist polynomials $a, b$ such that $$af+bg = \gcd(f, g).$$Taking $\Psi$ of both sides, $$\Psi(f) \circ \Psi(a) + \Psi(g) \circ \Psi(b) = \Psi(\gcd(f, g)).$$On the other hand, $\gcd(\Psi(f), \Psi(g))$ divides both terms on the LHS, so it divides the RHS too. Thus $\Psi(\gcd(f, g)) =\gcd(\Psi(f), \Psi(g))$, as needed.
03.04.2024 20:42
Napkin wrote: (USA Team Selection Test 2016) Let $\mathbb{F}_p$ denote the integers modulo a fixed prime $p$. Define $\Psi \colon \mathbb{F}_p[x] \mapsto \mathbb{F}_p[x]$ by, \begin{align*} \Psi\left(\sum_{i = 0}^n a_ix^i\right) = \sum_{i = 0}^n a_ix^{p^i} \end{align*}Denote by $S$ the image of $\Psi$. Show that, $S$ is a ring with addition given by polynomial addition, and multiplication given by \textit{function composition}. Prove that $\Psi \colon \mathbb{F}_p[x] \mapsto S$ is then a ring isomorphism. We begin by characterizing $S$. Take two arbitrary elements in $S$: \begin{align*} A = \sum_{i= 0}^{n} a_ix^{p^i}\\ B = \sum_{i = 0}^n b_ix^{p^i} \end{align*}with some $a_i$ and $b_i$ possibly $0$. Claim: $S$ is closed under addition. Proof. Simply note that, \begin{align*} A + B = \sum_{i = 0}^n (a_i + b_i)x^{p^i} = \sum_{i = 0}^n (a_i + b_i \bmod{p})x^{p^i} \in S \end{align*}Also, there is an additive identity, namely the $0$ polynomial. $\square$ Claim: $S$ obeys multiplication given by function composition. Proof. We first demonstrate $S$ obeys exponentiation by powers of $p$. Then we wish to show that, \begin{align*} \left( \sum_{i = 0}^n a_ix^{p^i} \right)^p &\in S\\ \end{align*}We are working with something of the form $(y_1 + y_2 + \dots + y_n)^p \bmod{p}$. Note that lower order terms not of the form $y_i^{p}$ die modulo $p$; to see this say that some term in our expansion is of the form, \begin{align*} \prod_{i = 0}^k y_i^{e_i} \end{align*}where we without loss of generality assume that the $k$ variables are $y_1$, $y_2$, and so on. The coefficient of this term is then, \begin{align*} \binom{p}{e_1} \cdot \binom{p - e_1}{e_2} \dots \end{align*}Clearly $p$ divides this coefficient, given that $e_1 < p$, thus proving that these terms vanish. This in turn proves that $S$ behaves well under exponentiation by $p$ and in fact that, \begin{align*} \left( \sum_{i = 0}^n a_ix^{p^i} \right)^{p^t} &\in S\\ \end{align*}lies in $S$ for any $t$, by induction. Now to see that $S$ obeys function composition, simply take $A, B \in S$ and note, \begin{align*} A \circ B = \sum_{i = 0}^n a_iB^{p^i} \in S \end{align*}which is clearly in $S$ as $S$ is closed under exponentiation by powers of $p$ and addition. Moreover it clearly has identity element $x$, is associative, and distributes over addition. $\square$ Therefore $S = (S, + , \circ)$ is a ring, finishing the first bullet. For the second bullet we wish to show that $\Psi \colon \mathbb{F}_p[x] \mapsto S$ is a ring isomorphism. Thus we wish to show $\Psi$ is a bijective map satisfying, $\Psi(x + y) = \Psi(x) + \Psi(y)$ $\Psi(x \times y) = \Psi(x) \circ \Psi(y)$ $\Psi(1) = x$ The first and last bullets follow easily from the definition of $\Psi$, and so does the bijectivity of the map. It remains to verify the second bullet; namely we wish to show, \begin{align*} \Psi(AB) = \Psi(A) \circ \Psi(B) \end{align*}The right hand side clearly is just, \begin{align*} \sum_{i = 0}^n a_i\Psi(B)^{p^i} &= \sum_{i = 0}^n \left( a_i\left(\sum_{j = 0}^n b_jx^{p^{j + i}} \right) \right)\\ &= \sum_{i = 0}^n \left(\sum_{j = 0}^n a_ib_jx^{p^{i + j}} \right) \end{align*}whereas we may observe the left hand side is, \begin{align*} \Psi\left(\sum_{i = 0}^n a_ix^i \cdot B \right) &= \sum_{i = 0}^n \Psi(a_ix^i \cdot B)\\ &= \sum_{i = 0}^n a_i \Psi\left(\sum_{j = 0}^n b_jx^{i + j}\right)\\ &= \sum_{i = 0}^n \left(\sum_{j = 0}^n a_ib_jx^{p^{i + j}} \right) \end{align*}which are clearly equal. This finishes the problem.
Remarks: Oops I got braindead and needed help on the last part (which was supposed to be easy ).
10.05.2024 07:17
Let $\mathbb{N}$ denote the set of natural numbers, including $0$. Let $R$ be the ring consisting of the set of $\mathbb{F}_p$-polynomials \[ \left\{a_n x^{p^n}+a_{n-1} x^{p^{n-1}}+\cdots+a_1 x^p+a_0x\mid n\in\mathbb{N};a_0,\ldots,a_n\in\mathbb{F}_p\right\}, \]with addition $+$ and multiplication $\circ$ (function composition). It is easy to check that $R$ is a ring with additive identity $0$ and multiplicative identity $x$. Let $\overline{\mathbb{F}_p}$ denote the algebraic closure of $\mathbb{F}_p$. Lemma 1. $\mathbb{F}_p[x]$ is isomorphic to $R$ with isomorphism $\Psi$. Proof. We first prove that $\Psi:\mathbb{F}_p[x]\rightarrow R$ is a homomorphism: $\Psi(f+g)=\Psi(f)+\Psi(g)$ is trivial. We have $\Psi(x^\alpha x^\beta)=x^{p^{\alpha+\beta}}=\left(x^{p^\alpha}\right)^{p^\beta}=\Psi(x^\alpha)\circ\Psi(x^\beta)$. Clearly we also have $\Psi(cx^\alpha)=c\Psi(x^\alpha)$ for any constant $c$. It follows that $\Psi(fg)=\Psi(f)\circ\Psi(g)$. $\Psi(1)=x$ is trivial. Clearly $\ker\Psi=\{0\}$ and $\Psi$ is surjective so $\Psi$ is an isomorphism. $\square$ Let $\overline{R}\supseteq R$ be the ring consisting of the set of $\overline{\mathbb{F}_p}$-polynomials \[ \left\{a_n x^{p^n}+a_{n-1} x^{p^{n-1}}+\cdots+a_1 x^p+a_0x\mid n\in\mathbb{N};a_0,\ldots,a_n\in\overline{\mathbb{F}_p}\right\}, \]with addition $+$ and multiplication $\circ$. Let $\tilde{\Psi}:\overline{\mathbb{F}_p}[x]\rightarrow\overline{R}$ be the extension of $\Psi$ given by \[ \sum_{i=0}^n a_i x^i\mapsto\sum_{i=0}^n a_i x^{p^i}. \]By the same argument as Lemma 1, $\tilde{\Psi}$ is an isomorphism. Lemma 2. For $f\in\overline{R}$ and all $a,b\in\overline{\mathbb{F}_p}$ and $c\in\mathbb{F}_p$, we have $f(a+b)=f(a)+f(b)$, $f(ca)=cf(a)$. Proof. The first part follows directly from freshman's dream. For the second part, note that $c^{p^\alpha}=c$ for all $\alpha\in\mathbb{N}$ by FLT. The result immediately follows. $\square$ Lemma 3. Every $f\in\overline{R}$ (viewed as a polynomial in $\overline{\mathbb{F}_p}[x]$) is separable. Proof. The formal derivative of $f$ is equal to the coefficient of $x$ in $f$, which is in $\overline{\mathbb{F}_p}$. Thus the formal derivative of $f$ is relatively prime with $f$ so $f$ is separable. $\square$ For $f\in\overline{R}$, let \[ Z(f):=\{r\in\overline{\mathbb{F}_p}\mid f(r)=0\} \]denote the set of roots of $f$. By Lemma 3, all the roots of $f$ have multiplicity $1$. It follows that \[ f(x)=\prod_{r\in Z(f)}(x-r). \]Lemma 4. For all $f\in\overline{R}$, $Z(f)$ forms a vector space over $\mathbb{F}_p$, with addition and multiplication the same as that of $\overline{\mathbb{F}_p}$. Proof. We first prove that $Z(f)\subseteq\overline{\mathbb{F}_p}$ is an abelian group: If $r_1,r_2\in Z(f)$, then $f(r_1+r_2)=f(r_1)+f(r_2)=0$ by Lemma 2 so $r_1+r_2\in Z(f)$. $Z(f)$ is commutative because the additive group of $\overline{\mathbb{F}_p}$ is abelian. $f(0)=0$ because $f$ has no constant term. Thus $0\in Z(f)$. If $r\in Z(f)$, then $f(-r)=-f(r)=0$ by Lemma 2 so $-r\in Z(f)$. For all $c\in\mathbb{F}_p$ and $r\in Z(f)$, we have $f(cr)=cf(r)=0$ by Lemma 2 so $cr\in Z(f)$. Thus $Z(f)$ is a vector space over $\mathbb{F}_p$. $\square$ From now on, "$=$" will denote equality up to multiplication by a unit of $\overline{\mathbb{F}_p}[x]$. Lemma 5. If $F,G\in\overline{R},H\in\overline{\mathbb{F}_p}[x]$ and $F(x)=H(G(x))$, then $H\in\overline{R}$. Proof. Assume for the sake of contradiction $H\not\in\overline{R}$. Then $H$ has a $cx^\alpha$ term where $\alpha$ is not a power of $p$. Take the minimum such $\alpha$.
Let \[ G(x)=:a_nx^{p^n}+a_{n-1}x^{p^{n-1}}+\cdots+a_0x. \]Clearly the $x^\alpha$ term of $H(G(x))$ appears only in $c(G(x))^\alpha$, so it has coefficient $ca_0^\alpha\neq 0$. This implies that $F$ has an $x^\alpha$ term, a contradiction. $\square$ Lemma 6. For nonzero polynomials $F,G\in\overline{\mathbb{F}_p}[x]$, we have $G\mid F$ if and only if $\tilde{\Psi}(G)\mid\tilde{\Psi}(F)$. Proof. Suppose $F=HG$ for some $H\in\overline{\mathbb{F}_p}[x]$. Then $\tilde{\Psi}(F)=\tilde{\Psi}(H)\circ\tilde{\Psi}(G)$. Since $\tilde{\Psi}(H)$ has no constant term, it follows that $\tilde{\Psi}(G)\mid\tilde{\Psi}(F)$. Now let $P:=\tilde{\Psi}(F)$ and $Q:=\tilde{\Psi}(G)$, and suppose $Q\mid P$. Then $Z(Q)$ is a subspace of $Z(P)$. Let $\alpha:=\dim Z(P)$ and $\beta:=\dim Z(Q)$. Let $e_1,\ldots,e_\beta$ be a basis for $Z(Q)$ and extend it to a basis $e_1,\ldots,e_\alpha$ for $Z(P)$. We have \begin{align*} P(x)&=\prod_{r\in Z(P)}(x-r)\\ &=\prod_{c_1,\ldots,c_\alpha\in\mathbb{F}_p}\left(x-\sum_{i=1}^\alpha c_ie_i\right)\\ &=\prod_{c_{\beta+1},\ldots,c_\alpha\in\mathbb{F}_p}\prod_{c_1,\ldots,c_\beta\in\mathbb{F}_p}\left(\left(x-\sum_{i=\beta+1}^\alpha c_ie_i\right)-\sum_{j=1}^\beta c_je_j\right)\\ &=\prod_{c_{\beta+1},\ldots,c_\alpha\in\mathbb{F}_p}Q\left(x-\sum_{i=\beta+1}^\alpha c_ie_i\right)\\ &=\prod_{c_{\beta+1},\ldots,c_\alpha\in\mathbb{F}_p}\left(Q(x)-\sum_{i=\beta+1}^\alpha c_i Q(e_i)\right)\\ &=A(Q(x)) \end{align*}where \[ A(x):=\prod_{c_{\beta+1},\ldots,c_\alpha\in\mathbb{F}_p}\left(x-\sum_{i=\beta+1}^\alpha c_i Q(e_i)\right). \]By Lemma 5, $A\in\overline{R}$ so $P=A\circ Q$. It follows that $F=\tilde{\Psi}^{-1}(A)\cdot G$ so $G\mid F$, as desired. $\square$ Lemma 7. Let $\mathbb{F}$ be an algebraic extension of $\mathbb{F}_p$. For nonzero polynomials $F,G\in\mathbb{F}[x]$, we have $G\mid F$ if and only if $\tilde{\Psi}(G)\mid\tilde{\Psi}(F)$ (where divisibility is taken in $\mathbb{F}[x]$). Proof. Since $\mathbb{F}$ is an algebraic extension, it is a subfield of $\overline{\mathbb{F}_p}$. Suppose $F=HG$ for some $H\in\mathbb{F}[x]$. Then $\tilde{\Psi}(F)=\tilde{\Psi}(H)\circ\tilde{\Psi}(G)$. Since $\tilde{\Psi}(H)$ has no constant term, it follows that $\tilde{\Psi}(G)\mid\tilde{\Psi}(F)$. If $\tilde{\Psi}(G)\mid\tilde{\Psi}(F)$, by Lemma 6 we have $F=GH$ for some $H\in\overline{\mathbb{F}_p}[x]$. Let \begin{align*} F(x)&=:a_\alpha x^\alpha+\cdots+a_0\\ G(x)&=:b_\beta x^\beta+\cdots+b_0\\ H(x)&=:c_\gamma x^\gamma+\cdots+c_0. \end{align*}Then $a_0=b_0c_0$ so $c_0\in\mathbb{F}$. Assume we already have $c_0,\ldots,c_i\in\mathbb{F}$. Then \[ b_{i+1}c_0+b_ic_1+\cdots+b_1c_i+b_0c_{i+1}=a_{i+1}\in\mathbb{F} \](where we set $b_{\beta+1},\ldots$ equal to $0$) so $c_{i+1}\in\mathbb{F}$. Thus $H\in\mathbb{F}[x]$ as desired. $\square$ Let $K$ be an algebraic extension of $\mathbb{F}_p$ (for the problem just take $K:=\mathbb{F}_p$). Fix $F,G\in K[x]$ and let $H_1:=\gcd(F,G)$ ($K[x]$ is a Euclidean domain because $K$ is a field). By Lemma 7, $\tilde{\Psi}(H_1)\mid\tilde{\Psi}(F)$ and $\tilde{\Psi}(H_1)\mid\tilde{\Psi}(G)$ so \[ \tilde{\Psi}(H_1)\mid\gcd(\tilde{\Psi}(F),\tilde{\Psi}(G)). \]Let $H_2:=\gcd(\tilde{\Psi}(F),\tilde{\Psi}(G))$. By Lemma 7, $\tilde{\Psi}^{-1}(H_2)\mid F$ and $\tilde{\Psi}^{-1}(H_2)\mid G$ so \[ \tilde{\Psi}^{-1}(H_2)\mid\gcd(F,G). \]It follows that $H_2\mid\tilde{\Psi}(H_1)$ and $\tilde{\Psi}(H_1)\mid H_2$ so $H_2=\tilde{\Psi}(H_1)$, as desired. $\square$
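Lemma 2 above (additivity of every $f\in\overline R$ as a map) can be illustrated in the smallest genuine extension. The Python sketch below (ad hoc helpers, not from the post) models $\mathbb F_4=\mathbb F_2[t]/(t^2+t+1)$ as pairs and checks that $\Psi(F)$ is additive on it for one choice of $F$:

```python
# Check of Lemma 2 in a genuine extension: Psi(F) is additive as a map on F_4 = F_2[t]/(t^2+t+1).
# Elements of F_4 are pairs (a, b) standing for a + b*t.
p = 2

def f4_add(x, y):
    return ((x[0] + y[0]) % 2, (x[1] + y[1]) % 2)

def f4_mul(x, y):
    # (a + b t)(c + d t) with t^2 = t + 1
    a, b = x
    c, d = y
    return ((a * c + b * d) % 2, (a * d + b * c + b * d) % 2)

def f4_pow(x, n):
    r = (1, 0)
    for _ in range(n):
        r = f4_mul(r, x)
    return r

def psi_eval(coeffs, x):
    # evaluate Psi(sum a_i X^i) = sum a_i X^(2^i) at x in F_4
    r = (0, 0)
    for i, c in enumerate(coeffs):
        if c % 2:
            r = f4_add(r, f4_pow(x, 2 ** i))
    return r

F = [1, 1, 1]   # x^2 + x + 1, so Psi(F) = X^4 + X^2 + X
elements = [(a, b) for a in range(2) for b in range(2)]
additive = all(psi_eval(F, f4_add(u, v)) == f4_add(psi_eval(F, u), psi_eval(F, v))
               for u in elements for v in elements)
print(additive)   # expected: True
```

The full lemma is over $\overline{\mathbb F_p}$, so this only illustrates one small case, but the same check works over any finite extension modeled this way.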
23.10.2024 01:11
All statements made in this solution are in $\mathbb F_p$. Run two Euclidean algorithms in parallel! Ritwin writes two polynomials in $\mathbb F_p[x]$, $B$ and $C$, on a whiteboard. Ivy writes $\Psi(B)$ and $\Psi(C)$ on the whiteboard. In each step, they select a nonnegative integer $k$, and say Ritwin has $(R_1,R_2)$ and Ivy has $(I_1,I_2)$ written respectively. Then, Ritwin replaces $R_2$ with $R_2-x^kR_1$ (either that or replace $R_1$ with $R_1-x^kR_2$), and Ivy replaces $I_2$ with $I_2-I_1^{p^k}$ (or the other way, but using the same index as Ritwin). Claim: During every step of this process, we always have $I_1=\Psi(R_1)$ and $I_2=\Psi(R_2)$. Use induction. This is clearly true at the start. Note that $$\Psi(R_2-x^kR_1)=\Psi(R_2)-\Psi(x^kR_1)=I_2-\Psi(x^kR_1).$$Now, we wish to show that $\Psi(x^kR_1)=\Psi(R_1)^{p^k}$, to show that Ivy's new polynomial is still the $\Psi$ of Ritwin's new polynomial. However, if we shift all the exponents of $x$ up by $k$ before applying $\Psi$, it multiplies every exponent later by $p^k$ due to the definition of $\Psi$. Meanwhile, on the RHS, raising the result of $\Psi(R_1)$ to the power of $p^k$ is also equivalent to multiplying every exponent by $p^k$ as we are working in $\mathbb F_p$ (Freshman's dream!), so we have shown the claim. Have Ritwin perform the Euclidean algorithm on $B$ and $C$ until one of his polynomials is $0$. Then, suppose he ends with $(0,G)$. By the above claim, Ivy ends with $(0,\Psi(G))$. However, note that because $$I_1\mid I_1^{p^k},$$Ivy is also performing a Euclidean algorithm, which means from Ivy's process that $$\gcd(\Psi(B),\Psi(C))=\Psi(G)$$and we are done! Remark: The way I found this solution was like this. First, I realized that we can freely subtract constant multiples of $B$ from $C$ because $\Psi$ is an additive function so the Euclidean algorithm is automatically being handled there. However, this alone isn't enough, as eventually in the algorithm, once the degrees are different, we must start subtracting multiples of $x^k$. So first, I just experimented with trying to replace $C$ with $C-xB$. When this is done, on the other side, $\gcd(\Psi(B),\Psi(C))$ becomes $\gcd(\Psi(B),\Psi(C-xB))=\gcd(\Psi(B),\Psi(C)-\Psi(xB)).$ In order for this to be a legal Euclidean algorithm move, we must have $\Psi(B)\mid \Psi(xB)$. After a bit of playing around however, I realized that $\Psi(xB)$ is just $\Psi(B)^p$, and then I realized this generalized to higher powers of $x$ as well and solved the problem. Finally, I phrased the solution this way to hopefully be easier to understand, and also of course as a tribute to some friends from MOP.
12.12.2024 09:25
Cool question, though I believe it was a bit on the easier side for a P3. As in, I was able to get it in 2ish hours. And if you know group theory it's even easier.