Find all functions $f\colon \mathbb{Z}^2 \to [0, 1]$ such that for any integers $x$ and $y$, \[f(x, y) = \frac{f(x - 1, y) + f(x, y - 1)}{2}.\] Proposed by Yang Liu and Michael Kural
Problem
Source: USA Winter Team Selection Test #1 for IMO 2018, Problem 2
Tags: function, algebra, functional equation, TST, probability, random walks
11.12.2017 23:04
Lemma 1: $f(x+n,y+n) =\dfrac{1}{2^n}\left( \binom{n}{0} f(x,y+n) + \binom{n}{1} f(x+1, y+n-1) + \dots + \binom{n}{n} f(x+n, y)\right)$.

Proof: Induct on $n$ and use the given condition.

Applying Lemma 1 with $2n$ in place of $n$ to both $f(x,y)$ and $f(x+1,y-1)$, it follows that $f(x,y) - f(x+1,y-1) = \displaystyle\sum_{i=0}^{2n+1} \dfrac{\binom{2n}{i}-\binom{2n}{i-1}}{2^{2n}} f(x-2n+i, y-i)$. Let $d = f(x,y) - f(x+1,y-1)$ and assume $d\neq 0$. Also let $c_i = \dfrac{\binom{2n}{i}-\binom{2n}{i-1}}{2^{2n}}$, so that $c_i\ge 0$ for $i\le n$ and $c_i \le 0$ for $i\ge n+1$. Therefore, since $d = \displaystyle\sum_{i=0}^{2n+1} c_i f(x-2n+i, y-i)$ and $f$ takes values in $[0,1]$, we get $d \le \displaystyle\sum_{i=0}^n c_i \cdot 1 + \displaystyle\sum_{i=n+1}^{2n+1} c_i \cdot 0 = \dfrac{\binom{2n}{n}}{2^{2n}}$, and similarly we can prove $d \ge -\dfrac{\binom{2n}{n}}{2^{2n}}$.

Lemma 2: $\binom{2n}{n} \le \dfrac{4^n}{\sqrt{2n+1}}$ for $n\ge 0$.

Proof: Induct on $n$.

As a consequence of Lemma 2, we get $|d| \le \dfrac{1}{\sqrt{2n+1}}$, so for sufficiently large $n$ we get a contradiction unless $d=0$. This implies $f(x,y) = f(x+1, y-1)$ for all $x,y$. Now notice $f(x+1,y) =\dfrac{f(x,y) + f(x+1,y-1)}{2} = f(x,y)$, and similarly $f(x,y) = f(x,y+1)$. It follows that $f$ must be constant, and every constant $f\equiv c$ with $c\in [0,1]$ clearly satisfies the condition, so we're done.
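Both lemmas, and the resulting bound on $d$, are easy to sanity-check numerically: any assignment of $[0,1]$-values on the two coordinate axes extends, via the recurrence, to a solution on the whole quadrant, so we can test the identities against random data. A minimal Python sketch (the grid size and test points are arbitrary choices of mine):

```python
from fractions import Fraction
from math import comb
import random

random.seed(0)

# Any choice of boundary values in [0,1] on the two axes extends, via the
# recurrence f(x,y) = (f(x-1,y) + f(x,y-1))/2, to a solution on the quadrant.
N = 30
f = {}
for i in range(N):
    f[(i, 0)] = Fraction(random.randint(0, 100), 100)
    f[(0, i)] = Fraction(random.randint(0, 100), 100)
for x in range(1, N):
    for y in range(1, N):
        f[(x, y)] = (f[(x - 1, y)] + f[(x, y - 1)]) / 2

# Lemma 1: f(x+n, y+n) = 2^-n * sum_i binom(n,i) * f(x+i, y+n-i)
x, y, n = 3, 4, 10
lhs = f[(x + n, y + n)]
rhs = sum(Fraction(comb(n, i)) * f[(x + i, y + n - i)] for i in range(n + 1)) / 2**n
assert lhs == rhs

# The telescoped bound: |f(a,b) - f(a+1,b-1)| <= binom(2n,n) / 4^n
a, b, n = 25, 25, 10
d = f[(a, b)] - f[(a + 1, b - 1)]
assert abs(d) <= Fraction(comb(2 * n, n), 4**n)
print("Lemma 1 and the bound on d check out")
```

Exact rational arithmetic avoids any floating-point slack in the telescoping bound.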
11.12.2017 23:44
Just a note: Lemma 1 in the above solution can be easily proved (and motivated) combinatorially, by viewing $2^{\ell}f(x+n,y+n)$ as the sum of $f$-values collected over all lattice paths of length $\ell$ (one unit to the right or up in each step) ending at $(x+n,y+n)$; each path starts at some $(p,q)$ with $p\leqslant x+n$, $q\leqslant y+n$, and in the case $\ell=n$ the starting points all lie on the "diagonal" of points at taxicab distance $n$ from $(x+n,y+n)$. I somehow think this is easier than P1.
12.12.2017 00:32
Is it not obvious, that $f(x-1,y)=f(x,y-1)$ because they have same parity?
12.12.2017 00:55
RagvaloD wrote: Is it not obvious, that $f(x-1,y)=f(x,y-1)$ because they have same parity? Note that $f:\mathbb{Z}^2\to\bf{[0,1]}$, which means that $f(x,y)$ can be an arbitrary real number between 0 and 1. This means we cannot just use parity arguments.
12.12.2017 02:12
This problem is well-known: You can see for example here, where the general problem is also treated. https://artofproblemsolving.com/community/q2h5327p335839
12.12.2017 04:04
ThE-dArK-lOrD wrote: I somehow think this is easier than P1.

I think it is the kind of problem that can be solved extremely quickly, but I do not think it is easy at all, especially after seeing the scores on the problem. Here are a couple more official solutions (more for instructive purposes, I don't think these are canonical for contestants).

Second solution (random walks, Mark Sellke): We show that if $x+y=x'+y'$ then $f(x,y)=f(x',y')$. Let $Z_n$, $Z'_n$ be random walks starting at $(x,y)$ and $(x',y')$ and moving down/left. Then $f(Z_n)$ is a martingale so we have \[\mathbb E[f(Z_n)]=f(x,y), \qquad \mathbb E[f(Z'_n)]=f(x',y') .\]We'll take $Z_n$, $Z'_n$ to be independent until they hit each other, after which they will stay together. Then \[|\mathbb E[f(Z_n)-f(Z'_n)]| \leq \mathbb E[|f(Z_n)-f(Z'_n)|] \leq 2p_n\]where $p_n$ is the probability that $Z_n$, $Z'_n$ have not collided by time $n$. But the distance between $Z_n$, $Z'_n$ is essentially a $1$-dimensional random walk, so they will collide with probability $1$, meaning $\lim_{n\to\infty} p_n=0$. Hence \[ |f(x,y)-f(x',y')| = |\mathbb E[f(Z_n)-f(Z'_n)]| = o(1)\]as desired.

Remark: If the problem were in $\mathbb Z^d$ for large $d$, this solution wouldn't work as written because the independent random walks wouldn't hit each other. However, this isn't a serious problem because $Z_n$, $Z'_n$ don't have to be independent before hitting each other. Indeed, if every time $Z_n,Z'_n$ agree on a new coordinate we force them to agree on that coordinate forever, we can make the two walks individually have the distribution of a coordinate-decreasing random walk but make them intersect eventually with probability $1$. The difference in each coordinate will be a $1$-dimensional random walk which gets stuck at $0$.

Third solution (martingales): Imagine starting at $(x, y)$ and taking a random walk down and to the left; the sequence of $f$-values along this walk is a martingale. As $f$ is bounded, this martingale converges with probability $1$.
Let $X_1, X_2, \dots$ each be random variables that represent either down moves or left moves with equal probability. Note that by the Hewitt-Savage 0-1 law, we have that for any real numbers $a < b$, \[ \Pr\left[ \lim_{n \to \infty} f((x, y)+X_1+X_2+\dots + X_n) \in [a, b] \right] \in \{0,1\}. \]Hence, there exists a single value $v$ such that \[ \Pr\left[\lim_{n \to \infty} f((x, y)+X_1+X_2+\dots + X_n) = v\right] = 1. \]By bounded convergence, this value $v$ must equal $f(x, y)$. Now, we show this value $v$ is the same for all $(x, y)$: any two starting points can both reach some common point with positive probability, and the limit value is determined by the tail of the walk, so their limit values must agree. Therefore, we are done.
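The collision estimate in the second solution above is easy to simulate: only the offset between the two walks along the anti-diagonal matters, and it performs a lazy one-dimensional walk. A rough Monte Carlo sketch (the sample size and step counts are arbitrary choices of mine):

```python
import random

random.seed(2018)

def no_collision_prob(offset, n_steps, trials=20000):
    """Estimate p_n: the probability that two independent down/left walks,
    started offset apart on the same anti-diagonal, have not met after n_steps."""
    misses = 0
    for _ in range(trials):
        d = offset
        for _ in range(n_steps):
            # each walk independently steps left (x -= 1) or down (y -= 1);
            # only the difference of the x-coordinates matters
            d += random.choice((-1, 0)) - random.choice((-1, 0))
            if d == 0:
                break
        else:
            misses += 1
    return misses / trials

for n in (10, 100, 1000):
    print("p_%d ~ %.3f" % (n, no_collision_prob(1, n)))
```

The estimates visibly decay toward $0$, matching $\lim_{n\to\infty}p_n=0$.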
12.12.2017 04:54
Just wanted to address an aspect of the above solutions which as far as I'm aware is not in the Olympiad canon (partly because I know someone's going to ask this sooner or later, partly because this intersects with topics discussed in this year's version of 21-701 Walk through Combinatorics and I want to make sure I haven't forgotten everything already). Let $X_1,X_2,\ldots$ be a sequence of random variables. We say that this sequence is a martingale if for every choice of values $a_1,\ldots, a_\ell$, \[\mathbb E\left[X_{\ell+1}\mid X_1 = a_1\wedge X_2 = a_2\wedge\cdots\wedge X_\ell = a_\ell\right] = a_\ell.\]In somewhat less formal language, this basically states that given prior knowledge of what values the variables $X_i$ took, the expected value of $X_{\ell+1}$ is equal to the most "recent" observed value. (The word martingale comes from gambling, referring to some gambling strategies that originated in 18th century France.) In the above solutions, we consider the random walk as a sequence of random variables $\{X_i\}_{i\in\mathbb N}$ such that $X_i$ is the random variable denoting the point $(x_i,y_i)$ on the $i^{\text{th}}$ step of said random walk. Then the sequence $\{f(X_i)\}_{i\in\mathbb N}$ is a martingale practically from the definition of $f$; in particular, one may explicitly compute (suppressing the conditioning on $X_\ell = (x_\ell, y_\ell)$ due to space constraints) \[\mathbb E\left[f(X_{\ell + 1})\right] = \frac12\cdot f(x_\ell, y_\ell - 1) + \frac12\cdot f(x_\ell - 1, y_\ell) = f(x_\ell, y_\ell).\]
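As a concrete illustration of the one-step computation above (this family of examples is my own choice, not from the thread): dropping the $[0,1]$ range restriction, $f(x,y)=t^x s^y$ solves the equation whenever $1/t+1/s=2$, and the martingale identity can be checked exactly:

```python
from fractions import Fraction

# Hypothetical family: f(x,y) = t^x * s^y solves 2 f(x,y) = f(x-1,y) + f(x,y-1)
# exactly when 1/t + 1/s = 2.  Take t = 3/2, s = 3/4 (so 2/3 + 4/3 = 2).
t, s = Fraction(3, 2), Fraction(3, 4)

def f(x, y):
    return t**x * s**y

# One-step martingale identity for the down/left walk: the average of the two
# possible next values equals the current value, exactly.
for (x, y) in [(0, 0), (5, -3), (-2, 7)]:
    assert (f(x - 1, y) + f(x, y - 1)) / 2 == f(x, y)
print("one-step martingale identity holds exactly")
```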
12.12.2017 18:45
Along the lines of martingales and random walks, here is a rather topological solution: Let $X$ be the set of all functions $\mathbb{Z}^2 \to [0, 1]$ and $F$ be the subset of $X$ consisting of those satisfying the requirement. Then $X$ is compact under the product (Tychonoff) topology. $F\subset X$ is closed and hence compact. Moreover it's easily checked $F$ is convex. Applying the Krein-Milman theorem, $F$ is the closed convex hull of its extreme points. But the extreme points of $F$ are only $f\equiv 0$ and $f\equiv 1$. The result follows. EDIT: Why are the extreme points of $F$ only $f\equiv 0$ and $f\equiv 1$? Suppose that $f_0(x,y)\in F$ is an extreme point of $F$. Note that $f_1(x,y):= f_0(x-1,y)$ and $f_2(x,y):=f_0(x,y-1)$ are also in $F$ and $f_0=\frac{1}{2}f_1 + \frac{1}{2}f_2$. Since $f_0$ is an extreme point, it means $f_0=f_1=f_2$. Hence, $f_0(x,y)=f_0(x-1,y)\,,\, f_0(x,y)=f_0(x,y-1)\,,\,\forall x,y\in \mathbb{Z}$. It implies $f_0$ is a constant function. If $f_0\equiv c\in (0,1)$, then $f_0$ can be represented as $f_0=\frac{1}{2}(c+\varepsilon)+\frac{1}{2}(c-\varepsilon)$, a convex combination of two constant functions in $F$, for some small $\varepsilon>0$. Thus, the only possibility is $f_0\equiv 0$ or $f_0\equiv 1$. Comment. It's not my argument; the problem is very popular, I saw it somewhere. Notice that the same argument applies when $f$ satisfies $ f(x,y)=\alpha_1f(x+1,y)+\alpha_2f(x,y+1)+\alpha_3f(x-1,y)+\alpha_4f(x,y-1)$, where $\sum \alpha_i=1\,,\,\alpha_i\geq 0$. The case $\alpha_i=1/4,i=1,\dots,4$ was given at a Bulgarian TST see here, it is the link provided by @silouan. The official solution of that problem is here, it also applies without any changes to the OP. Here is also the continuous version of the problem (in the case $\alpha_i=1/4$).
13.12.2017 22:36
How do you know those are the only extreme points? Also, another remark is that the problem is false when the range is $\mathbb R^+$. Indeed, let $f(x,y)=\frac{2^{x+y}}{3^x}$; then the random walk multiplies by either $3/2$ or $1/2$ each time step. In the context of the third (martingale) solution, the "obvious" step at the end fails here. To write it out, that step is saying that the expected value at time step $n$ is constant, and hence $f(x,y)$ (the time $0$ expectation) equals the large-$n$ limit of the expected value, which is the expected value of the large-$n$ limit, which is constant. However when $f$ is unbounded as in my example, the last deduction is false; the large-$n$ limit of the expectation is still $f(x,y)$ while the expected value of the large-$n$ limit is $0$. Since we obtained equality of $f$ values by comparison with the large-$n$ limit, this failure breaks the whole solution.
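A quick exact check of this counterexample in Python (the horizon $n=40$ is an arbitrary choice): the expectation of $f$ along the walk is preserved at every step, while the value at a typical position on the $n$-th anti-diagonal shrinks geometrically, which is exactly the failure of bounded convergence described above.

```python
from fractions import Fraction
from math import comb

def f(x, y):
    # the counterexample for range R^+: f(x,y) = 2^(x+y) / 3^x
    return Fraction(2)**(x + y) / Fraction(3)**x

# After n down/left steps from (0,0) the walk is at (-k, -(n-k)) with
# probability binom(n,k)/2^n; the expectation of f is preserved exactly...
n = 40
exp_f = sum(Fraction(comb(n, k), 2**n) * f(-k, -(n - k)) for k in range(n + 1))
assert exp_f == f(0, 0) == 1

# ...while the value at a typical position (k about n/2) is tiny, so
# E[lim f(Z_n)] = 0 even though lim E[f(Z_n)] = 1.
typical = f(-n // 2, -(n - n // 2))
assert typical < Fraction(1, 100)
print("expectation preserved; typical value =", float(typical))
```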
13.12.2017 22:59
Also: the fact that a large-$n$ limit exists, i.e. that any martingale converges, isn't always true (e.g. it's clearly false for a simple random walk) but is true when the martingale is bounded, or just positive (so that part is still fine in the positive, unbounded case). The proof goes by showing that there is no interval $[a,b]$ which is crossed infinitely many times; by taking all such intervals with rational endpoints we get convergence (the axioms of probability tell you that if a countable collection of events each has probability $0$, so does their union). To do this, the idea is to show that if the interval $[a,b]$ is crossed infinitely often, you can trade this martingale as a stock and make infinite profit by buying low and selling high, with a finite amount of starting capital. There are a few versions of this martingale convergence theorem with various conditions designed to rule out the simple counterexamples, all with the same proof idea. A proof can be found, for example, in this MIT management course: https://ocw.mit.edu/courses/sloan-school-of-management/15-070j-advanced-stochastic-processes-fall-2013/lecture-notes/MIT15_070JF13_Lec11Add.pdf
14.12.2017 11:56
geoishard wrote: How do you know those are the only extreme points? Suppose that $f_0(x,y)\in F$ is an extreme point of $F$. Note that $f_1(x,y):= f_0(x-1,y)$ and $f_2(x,y):=f_0(x,y-1)$ are also in $F$ and $f_0=\frac{1}{2}f_1 + \frac{1}{2}f_2$. Since $f_0$ is an extreme point, it means $f_0=f_1=f_2$. Hence, $f_0(x,y)= f_0(x-1,y)\,,\, f_0(x,y)=f_0(x,y-1)\,,\,\forall x,y\in \mathbb{Z}$. It implies $f_0$ is a constant function. If $f_0\equiv c\in (0,1)$, then $f_0$ can be represented as $f_0=\frac{1}{2}(c+\varepsilon)+\frac{1}{2}(c-\varepsilon)$, a convex combination of two constant functions in $F$, for some small $\varepsilon>0$. Thus, the only possibility is $f_0\equiv 0$ or $f_0\equiv 1$.
14.12.2017 23:02
Wow I'm super impressed right now. You win.
06.02.2018 19:06
Consider $g(x,y) = 2^{x+y} f(x,y)$. Now we see that $g(x,y) = g(x-1,y) + g(x, y-1)$. Consider a $(2n+1) \times (2n+1)$ board, with the corner squares $(x-2n,y-2n), (x,y-2n), (x-2n,y), (x, y)$. Note that we can find $g(x, y)$ as a linear combination of the values $g(x-a, y-2n+a)$ where $0 \le a \le 2n$, because the recursion adds the values in the squares just below and to the left. The coefficient of $g(x-a, y-2n+a)$ is the number of lattice paths from that point to $(x, y)$. So we have, using the definition of $g$, that $$f(x,y) =\frac{1}{2^{2n}}\left[ \binom{2n}{0} f(x-2n,y) + \binom{2n}{1} f(x-2n+1, y-1) + \dots + \binom{2n}{2n} f(x, y-2n)\right]$$Our main idea will be to show that the $f$-values on a long anti-diagonal cannot stay far apart, because of this large-scale averaging. We consider the difference between two values of $f$ on a line $y+x = \text{constant}$. Bounding $f(x+1,y-1) - f(x,y)$ by expanding and replacing the $f$-values for the first $n+1$ terms (i.e. the ones with negative coefficients) by $0$, and the next $n$ values of $f$ by $1$, and doing this in reverse as well, gives us the bounds $ -\frac{\binom{2n}{n}}{4^{n}} \le f(x+1,y-1) - f(x,y) \le \frac{\binom{2n}{n}}{4^{n}}$. Now using the Stirling asymptotic expression for large enough $n$, the absolute value of the difference is smaller than $\frac{1}{\sqrt{\pi n}}$, which tends to zero as $n \to \infty$. So $f$ is constant along each line $y+x = \text{constant}$. Now since, by our previous observation, the diagonal points determine the $f$-values at the corner points of the square, we deduce that $f$ is constant everywhere. Thus the only solution to this equation is $\boxed{f(x,y) = c \ \ \forall \ (x,y) \in \mathbb{Z}^2}$ Edit : Just realised that this is almost exactly the same solution as tastymath75025's.
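The path-counting step (the coefficient of each anti-diagonal value under the recursion $g(x,y)=g(x-1,y)+g(x,y-1)$ is a binomial coefficient) can be verified mechanically by expanding the recursion; a small sketch with an arbitrary depth:

```python
from math import comb

# Expand g(0,0) via g(x,y) = g(x-1,y) + g(x,y-1) for n rounds, collecting the
# coefficient attached to each point on the anti-diagonal x + y = -n.
n = 12
coeffs = {(0, 0): 1}
for _ in range(n):
    nxt = {}
    for (x, y), c in coeffs.items():
        nxt[(x - 1, y)] = nxt.get((x - 1, y), 0) + c
        nxt[(x, y - 1)] = nxt.get((x, y - 1), 0) + c
    coeffs = nxt

# The coefficient of g(-a, -(n-a)) counts the monotone lattice paths back up
# to (0,0), which is exactly binom(n, a).
for a in range(n + 1):
    assert coeffs[(-a, -(n - a))] == comb(n, a)
assert sum(coeffs.values()) == 2**n
print("coefficients match binomials")
```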
02.02.2019 17:06
Note that if $f$ is constant on each line parallel to $OY$, then $f$ is constant on $\mathbb{Z}^2$, thanks to the given equation. So suppose by way of contradiction that there are some $a,b\in \mathbb{Z}$ such that $f(a+1,b)>f(a,b)$ (the case $f(a+1,b)<f(a,b)$ is similar, just work with the function $F(x,y)=1-f(x,y)$ instead of $f$). Let $g: \mathbb{Z}^2 \to [-1,1]$ , $g(x,y)=f(x+1,y)-f(x,y)$. Consider $M=\sup \{g(x,y) \mid x,y\in \mathbb{Z}\}$ and note that $M>0$, as $g(a,b)>0$. Also, note that \[g(x-1,y)+g(x,y-1)=f(x,y)-f(x-1,y)+f(x+1,y-1)-f(x,y-1)=f(x+1,y-1)-f(x,y)=2f(x+1,y)-2f(x,y)=2g(x,y),\] so $g$ satisfies the same functional equation as $f$. Let $\varepsilon >0$ to be chosen later. We know that for some $u,v\in \mathbb{Z}$ we have $g(u,v)>M-\varepsilon$. Then $g(u-1,v)=2g(u,v)-g(u,v-1)>2M-2\varepsilon-M=M-2\varepsilon$ and inductively, $g(u-n,v)>M-2^n\varepsilon$. So $f(u,v)-f(u-n,v)=g(u-1,v)+\dots +g(u-n,v)>nM-(2^{n+1}-2)\varepsilon$. However, $f(u,v)-f(u-n,v)\le 1$, so we find that $1>nM-(2^{n+1}-2)\varepsilon $, or equivalently written $$\varepsilon > \frac {nM-1}{2^{n+1}-2}.$$ Now, if we first choose $n\in \mathbb{Z}_{>0}$ with $nM>1$ and then $\varepsilon >0$ small enough that $\frac {nM-1}{2^{n+1}-2}>\varepsilon$ (which is obviously possible), we get the desired contradiction! Hence our function $f$ is equal to some constant $c\in [0,1]$, which obviously works.
03.02.2019 15:40
Since $f(x,y)=\frac{f(x-1,y)+ f(x,y-1)}{2}$, we can prove by induction on $n$ that $f(x,y)=\frac{f(x-n,y)+ \binom{n}{1}f(x-n+1,y-1)+\ldots+\binom{n}{n-1} f(x-1,y-n+1)+ f(x,y-n)}{2^n}$ ---[1] Then, $f(x-1,y+1)=\frac{f(x-1-n,y+1)+ \binom{n}{1}f(x-n,y)+\ldots+\binom{n}{n-1} f(x-2,y-n+2)+ f(x-1,y+1-n)}{2^n}$ Therefore, $2^n \left(f(x,y)- f(x-1,y+1)\right)= -f(x-1-n,y+1)+ \left(1-\binom{n}{1}\right)f(x-n,y)+\left(\binom{n}{1}-\binom{n}{2}\right) f(x-n+1,y-1) + \ldots+\left(\binom{n}{n-1}- 1\right) f(x-1,y+1-n) + f(x,y-n)$ Since $f(x,y) \in [0,1]$, the positive coefficients sum to at most $\binom{n}{\left\lfloor \frac{n}{2} \right\rfloor}$ (by telescoping) and likewise for the negative ones, so $|LHS| \leq \binom{n}{\left\lfloor \frac{n}{2} \right\rfloor}$. If $f(x,y)\ne f(x-1,y+1)$, the left side grows like a nonzero constant times $2^n$, while $\binom{n}{\left\lfloor \frac{n}{2} \right\rfloor}=O\!\left(2^n/\sqrt{n}\right)$, which is impossible for large $n$; hence $f(x,y)= f(x-1,y+1)$. From [1], $f(x,y)= f(x-n,y)= f(x,y-n) \rightarrow f(x,y) = C, \;\forall (x,y) \in \mathbb{Z}^2$ $\blacksquare$
05.05.2020 14:15
$$f(x,y)=\frac{f(x-1,y)+f(x,y-1)}{2}\quad (*)$$Seeking a contradiction, suppose there exists $ f:\mathbb{Z}^2\to [-1,1]$ satisfying $ (*)$ which is not constant. Then, there exists a point $ (x_0,y_0)\in \mathbb{Z}^2$ for which either $ f(x_0+1,y_0)\ne f(x_0,y_0)$ or $ f(x_0,y_0+1)\ne f(x_0,y_0)$. WLOG, suppose the former holds (otherwise we can flip $ x$ and $ y$). We also can assume $ f(x_0+1,y_0)-f(x_0,y_0)>0$, otherwise we consider $ -f$. Let $$ g(x,y):=f(x+1,y)-f(x,y)$$Clearly, $ g$ also satisfies $ (*)$ and it is bounded. Denote $ M:=\sup_{(x,y)\in\mathbb{Z}^2} g(x,y)$, hence $ 0<M\le 2$. Fix some $ \varepsilon>0$ and let $ (x_0,y_0)$ satisfy $ M-\varepsilon <g(x_0,y_0)\le M$. The condition $ (*)$ gives us $$ g(x_0-1,y_0)\geq M-2\varepsilon$$Applying it again several times, we get $$ \displaystyle g(x_0-i,y_0)\geq M-2^i\varepsilon,\ i=1,2,\dots$$ It follows $$ \displaystyle g(x_0-i,y_0)\geq \frac{M}{2},\ i=1,2,\dots, n$$where $ n=\left\lfloor\log_2 \left(M/(2\varepsilon)\right)\right\rfloor.$ Finally, we have $$ \displaystyle f(x_0,y_0)-f(x_0-n,y_0)=\sum_{i=1}^n g(x_0-i,y_0)\geq \frac{nM}{2}$$The LHS is bounded, but we can make the RHS as large as we want by taking sufficiently small $ \varepsilon$ (thus $ n$ as large as we want), contradiction. $\blacksquare$ Comment. This is the same approach as used in this Bulgarian TST's problem. That claim is also true when $f$ is lower (or upper) bounded (but not this one, as pointed out in #10). Unfortunately, the method presented here cannot be applied under the weaker assumption (at least I cannot manage it). But the same idea as in #9, using the Krein-Milman theorem, can be successfully exploited. For motivation and more details, see my blog.
05.05.2020 21:34
Superb post on your blog, very informative. Thank you for the incredible work.
03.07.2020 20:13
CantonMathGuy wrote: Find all functions $f: \mathbb{Z}^2 \to [0, 1]$ such that for any integers $x$ and $y$, \[f(x, y) = \frac{f(x - 1, y) + f(x, y - 1)}{2}.\] Proposed by Yang Liu and Michael Kural My solution is similar to dgrozev's solution but posting for sake of completeness:
09.11.2020 08:01
Here is a different solution (I hope it works):
11.05.2021 05:20
We claim the only solutions are $f\equiv c$ with $c\in [0,1]$. Suppose otherwise. Then there exist $x,y$ such that $f(x,y)\ne f(x-1,y+1)$, as otherwise $f$ is constant on each anti-diagonal and the defining relation propagates that constant across diagonals, making $f$ constant. Now, let $n=2m$ be even. We have \[2^n[f(x,y)-f(x-1,y+1)] = \sum_{k=0}^n \binom nk f(x-k,y-n+k) - \sum_{k=0}^n \binom nk f(x-k-1,y+1-n+k)=\]\[\sum_{k=0}^{n+1} f(x-k,y-n+k)\cdot \left(\binom nk - \binom n{k-1}\right) = \]\[\sum_{k=0}^m \left(\binom nk - \binom{n}{k-1}\right)[f(x-k,y-n+k)-f(x-n-1+k,y+1-k)].\]The sum of the coefficients is clearly $\binom nm$, thus the absolute value of some difference of $f$-values is at least \[\frac{4^m}{\binom{2m}{m}}\cdot |c|\]where $c = f(x,y)-f(x-1,y+1)$. It is well-known that \[\binom{2m}{m}\le \frac{4^m}{\sqrt{\pi m}},\]so some difference of $f$-values is at least $\sqrt{\pi m}\,|c|$, which exceeds $1$ for large $m$; this is impossible since $f$ takes values in $[0,1]$. Boom. Remark: This solution omits the proof of the two results Tastymath75025 cites as lemmas, instead implicitly assuming them or replacing them with stronger results from Wikipedia.
27.09.2021 02:25
This problem is basically: find all ways to assign reals $r \in [0, 1]$ to each point on the integer lattice such that each point has value equal to the average of the two points to its left and down. I claim the answer is all $f$ constant. Based on this interpretation, it is natural to see, for each $f(x,y)$, we can expand all paths with manhattan distance $n$ from it to get that\[f(x, y) = \sum_{i = 0}^n \left[\frac{\binom{n}{i}}{2^n}\right]f(x - i, y - [n-i]).\]This is also doable with induction. This shows that\begin{align*}f(x+1, y-1) - f(x, y) &= \sum_{i = -1}^{n} \left[\frac{\binom{n}{i+1}}{2^n}\right]f(x - i, y - [n-i]) - \sum_{i = -1}^{n} \left[\frac{\binom{n}{i}}{2^n}\right]f(x - i, y - [n-i])\\&=\sum_{i = -1}^n \left[\frac{\binom{n}{i+1} - \binom{n}{i}}{2^n}\right]f(x - i, y - [n-i])\end{align*}for any possibly arbitrarily large $n$, where $\tbinom{n}{m}$ is defined regularly when $0 \leq m \leq n$ and is equal to $0$ elsewhere. For each such $n$, define the $n+2$ coefficients\[c_k = \frac{1}{2^n}\left(\binom{n}{k} - \binom{n}{k-1}\right)\]for $k \in [0, n+1]$ so that\[f(x+1 , y-1) - f(x, y) = \sum_{i = -1}^{n} [c_{i+1}]f(x - i, y - [n-i]).\]Choose $n = 2m$ to be arbitrarily large and even; we will bound this quantity and show that it must be $0$. Doing so will show that all points lying on some $x + y = \ell$ for any $\ell \in \mathbb{Z}$ have the same value, which clearly proves that $f$ must be constant, finishing. Indeed, note for $n = 2m$, that $c_0, c_1, \ldots , c_m$ are $\geq 0$ while $c_{m+1}, c_{m+2}, \ldots , c_{2m+1}$ are $\leq 0$. 
Since $f \in [0, 1]$, it follows that we can bound above and below as follows:\[0[c_0 + \ldots + c_{m}] + 1[c_{m+1} + \ldots + c_{2m+1}] \leq f(x+1, y-1) - f(x, y) \leq 1[c_0 + \ldots + c_{m}] + 0[c_{m+1} + \ldots + c_{2m+1}]\]and upon telescoping the sums, we get\[\frac{-\binom{2m}{m}}{2^{2m}} \leq f(x+1, y-1) - f(x, y) \leq \frac{\binom{2m}{m}}{2^{2m}}.\]It suffices to show that $\tfrac{1}{2^{2m}}\tbinom{2m}{m}$ goes to $0$ for arbitrarily large $m$. This is not hard to see:\[\frac{\binom{2(m+1)}{m+1}}{2^{2(m+1)}} = \frac{2m+1}{2m+2}\left[\frac{\binom{2m}{m}}{2^{2m}}\right]\]and thus\[\frac{\binom{2m}{m}}{2^{2m}} = \frac{(2m-1)!!}{(2m)!!} < \frac{1}{\sqrt{2m+1}},\]where the last inequality can be verified by induction. This does indeed prove that the term goes to $0$ as $m$ becomes large, and we are done.
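The two identities used at the end (the central-binomial ratio equals $(2m-1)!!/(2m)!!$ and lies below $1/\sqrt{2m+1}$) can be checked exactly in Python for a range of $m$ (the range is an arbitrary choice):

```python
from fractions import Fraction
from math import comb

for m in range(1, 120):
    r = Fraction(comb(2 * m, m), 4**m)
    # the central binomial ratio equals (2m-1)!!/(2m)!! ...
    double_factorial_ratio = Fraction(1)
    for j in range(1, m + 1):
        double_factorial_ratio *= Fraction(2 * j - 1, 2 * j)
    assert r == double_factorial_ratio
    # ... and r < 1/sqrt(2m+1), checked without floats as r^2 * (2m+1) < 1
    assert r * r * (2 * m + 1) < 1
print("both identities hold for m = 1..119")
```

Squaring the inequality keeps everything in exact rational arithmetic, so no floating-point tolerance is needed.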
27.09.2021 09:02
Suppose a maximum is attained at $f(a,b)=m$: $2f(a,b)=f(a-1,b)+f(a,b-1)$, which means $f(a-1,b)=f(a,b-1)=m$. We can then induct and see that for all $x\le a$ and all $y\le b$, $f(x, y) =m$. With this logic we can divide $\mathbb{Z}^2$ into regions on which $f$ is constant. Now suppose an absolute maximum of value $M$: $2p=2f(x+1,y)=f(x,y)+f(x+1,y-1)=M+p$. This means that $M=p$ and the function is constant. P.S: Do tell if there are any mistakes (there probably are)
27.11.2021 22:09
My most intense write-up yet. Writing this solution took 1.5 hours aaaaaaahhhh. Similar to Idio-logy's proof. The answer is all constant $f$, which work. We will show that no nonconstant function will work. Define $S=\{|f(x,y+1)-f(x+1,y)| \colon (x,y) \in \mathbb{Z}^2\}$. Note that $\sup S$ exists since $f$ is bounded; assume FTSOC that it is positive. Choose integers $a$ and $b$ such that $\sup S-|f(a,b+1)-f(a+1,b)|$ is arbitrarily small, and define \[r=\frac{|f(a,b+1)-f(a+1,b)|}{\sup S}.\]Define \[d_{x,y}=|f(a-x+y,b-y+1)-f(a-x+y+1,b-y)|\]for all integers $x$ and $y$. Claim 1: For all integers $x$ and $y$, \[2d_{x,y}=d_{x+1,y}+d_{x+1,y+1}.\]Proof: Trivial. As a corollary, if $d_{x,y} \geq \frac{\sup S}{2}$, then the sequence \[f(a-x+y-1,b-y+1),f(a-x+y,b-y),f(a-x+y+1,b-y-1)\]is monotonic. Thus, we have the following claim: Claim 2: If $d_{x,0},d_{x,1},\ldots,d_{x,x} \geq \frac{\sup S}{2}$, then \[|f(a-x-1,b+1)-f(a+1,b-x-1)|=d_{x+1,0}+d_{x+1,1}+\cdots+d_{x+1,x+1}.\] Claim 3: For integers $x$ and $y$ with $0 \leq y \leq x$, $d_{x,y} \geq (2^xr-2^x+1)\sup S$. Proof: We proceed by induction on $x$. The base case of $x=0$ is true by definition. For the inductive step, assume that $d_{k,y} \geq (2^kr-2^k+1)\sup S$. We know that \[d_{k+1,y}+\sup S \geq d_{k+1,y}+d_{k+1,y+1}=2d_{k,y} \geq 2(2^kr-2^k+1)\sup S,\]so $d_{k+1,y} \geq (2^{k+1}r-2^{k+1}+1)\sup S$. We can repeat the same proof to show that $d_{k+1,y+1} \geq (2^{k+1}r-2^{k+1}+1)\sup S$, so our induction is complete. Using this, we can show that when $d_{x-1,0},d_{x-1,1},\ldots,d_{x-1,x-1} \geq \frac{\sup S}{2}$, \begin{align*} d_{x,0}+d_{x,1}+\cdots+d_{x,x} &= d_{x-1,0}+d_{x-1,1}+\cdots+d_{x-1,x-1}+\frac{d_{x,0}+d_{x,x}}{2} \\ &\geq d_{x-1,0}+d_{x-1,1}+\cdots+d_{x-1,x-1}+\left(2^xr-2^x+1\right)\sup S,\end{align*}where the first equality is by Claims 1 and 2. Set $n=\lfloor \frac{1}{\sup S}+1\rfloor$, and notice that \[\sum_{k=1}^n \left(2^kr-2^k+1\right)\sup S>1\]when $r$ is sufficiently close to $1$.
Combined with Claim 2, this gives a contradiction.
28.11.2021 18:34
v_Enhance wrote:
Second solution (random walks, Mark Sellke): We show that if $x+y=x'+y'$ then $f(x,y)=f(x',y')$. Let $Z_n$, $Z'_n$ be random walks starting at $(x,y)$ and $(x',y')$ and moving down/left. Then $f(Z_n)$ is a martingale so we have \[\mathbb E[f(Z_n)]=f(x,y), \qquad \mathbb E[f(Z'_n)]=f(x',y') .\]We'll take $Z_n$, $Z'_n$ to be independent until they hit each other, after which they will stay together. Then \[|\mathbb E[f(Z_n)-f(Z'_n)]| \leq \mathbb E[|f(Z_n)-f(Z'_n)|] \leq 2p_n\]where $p_n$ is the probability that $Z_n$, $Z'_n$ never collide. But the distance between $Z_n$, $Z'_n$ is essentially a $1$-dimensional random walk, so they will collide with probability $1$, meaning $\lim_{n\to\infty} p_n=0$. Hence \[ |f(x,y)-f(x',y')| = |\mathbb E[f(Z_n)-f(Z'_n)]| = o(1)\]as desired. Remark: If the problem were in $\mathbb Z^d$ for large $d$, this solution wouldn't work as written because the independent random walks wouldn't hit each other. However, this isn't a serious problem because $Z_n$, $Z'_n$ don't have to be independent before hitting each other. Indeed, if every time $Z_n,Z'_n$ agree on a new coordinate we force them to agree on that coordinate forever, we can make the two walks individually have the distribution of a coordinate-decreasing random walk but make them intersect eventually with probability $1$. The difference in each coordinate will be a $1$-dimensional random walk which gets stuck at $0$.
This solution is very interesting; it can be expressed in a different tone. Consider any $x,x',y,y' \in \mathbf{Z}$ with $x+y=x'+y'$, and define $(Z_n)_{n \ge 0}$ to be the random walk which starts at $(x,y)$ and moves one unit left or one unit down with probability $1/2$ each at any step, and let $(Z'_n)_{n \ge 0}$ be the random walk obeying these transition laws which starts at $(x' , y').$ Then $f(x,y) - f(x', y') = \mathbf{E} [ f(Z_n) ] - \mathbf{E} [ f(Z'_n)]$ for every natural number $n;$ also one has $\mathbf{E} [ | f(Z_n) - f(Z' _ n) | ] \le 2 \mathbf{P} \{ Z_n \ne Z'_n \}.$ Now the sequence $(\mathrm{x}(Z_n) - \mathrm{x}(Z'_n) , \mathrm{y}(Z_n) - \mathrm{y}(Z'_n))$ (where $\mathrm{x}(P) , \mathrm{y}(P)$ denote the x and y coordinates of a point $P$ respectively) is supported on the line through the origin in direction $(1,-1)$, and its first coordinate is a lazy simple symmetric random walk on $\mathbf{Z}$; since such a walk is recurrent, it visits $0$ at some finite point of time with probability $1$, thus the claim is proved. $\square$ This has some further interesting consequences: suppose we ask the same question but with the given functional equation replaced by $$f(x,y) = p f(x-1 , y) + q f(x, y-1)$$where $0 < p < 1, p+q = 1$. Then also we can say that $f$ is constant (thanks to the Chung-Fuchs theorem), since in this case too the first coordinate of the difference walk has a step distribution with mean $0$ and finite second moment (it moves by $\pm 1$ with probability $pq$ each and stays put otherwise), thus it is recurrent and hence visits $0$ at some finite point of time with probability one, whence once again by exactly the same computation as above $f$ will be constant. $\square$ Remark: And as Evan mentioned in his remark, by modifying the coupling we may prove the result for higher dimensions by stitching together the conclusions for smaller dimensions even in this case.
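The biased case can be simulated as well (a sketch with arbitrary parameters $p=0.3$, horizon $2000$): the anti-diagonal offset between the two walks changes by $\pm1$ with probability $pq$ each and stays put otherwise, so it is mean-zero and the coupling still succeeds with probability tending to $1$.

```python
import random

random.seed(7)

def meet_prob(p, offset, n_steps, trials=20000):
    """Two independent walks each step left w.p. p and down w.p. 1-p.
    Estimate the chance their anti-diagonal offset hits 0 within n_steps."""
    hits = 0
    for _ in range(trials):
        d = offset
        for _ in range(n_steps):
            # offset changes by +-1 w.p. p(1-p) each, else stays put
            d += (random.random() < p) - (random.random() < p)
            if d == 0:
                hits += 1
                break
    return hits / trials

print("biased coupling meets with prob ~", meet_prob(0.3, 1, 2000))
```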
14.11.2022 01:30
Took so long to solve... ouch. Let $g(x, y) = |f(x-1, y) - f(x, y - 1)|$ Let $S = \sup g(x, y)$, which exists since $g$ is bounded. Assume $S > 0$. Then find an $x, y$ such that $|g(x, y) - S| < 2^{- \lceil \frac{3}{S} \rceil - 100}$ Then travel down or left $\lceil \frac{3}{S} \rceil + 5$ times from $x, y$ such that the value of the function is always decreasing; since $2g(x, y) \leq g(x - 1, y) + g(x, y-1)$ while each of $g(x-1,y)$, $g(x,y-1)$ is at most $S$, each time we take a step the difference from $S$ at most doubles. Thus since the sum of the geometric series is more or less infinitesimal after $\lceil \frac{3}{S} \rceil + 5$ steps ($\approx \frac{1}{2^{99}}$), the distance traversed is $\approx 3 > 2$, but this is impossible as the initial number and final number are both in $[0, 1]$, giving contradiction. Hence $S = 0$; since $g \geq 0$, this forces $g \equiv 0$, and it is easy to see that $f(x, y)$ must then be constant.
10.12.2022 09:40
The COUPLING technique is very important in stochastic processes. We will show for any $x,y$ we have $f(x,y)=f(x+1,y-1)$. WLOG, $x,y>N$. Consider a random walk (in the quadrant bounded by the lines $x=0$ and $y=0$) from a point: at each step it decreases $x$ by one with probability $\frac 12$ and decreases $y$ by one with probability $\frac 12$. It STOPS when it touches the line $x=0$ or $y=0$. If it stops on $(a,b)$, the process returns $f(a,b)$. Let $X$ be the random variable representing what I get when the walk stops; then $f(x,y)=\mathbb{E}[X]$. Define $Y$ similarly for $(x+1,y-1)$. We tweak the rules a little bit. Let $(X_n)$ be the random walk starting from $(x,y)$ and $(Y_n)$ be the random walk starting from $(x+1,y-1)$. The two walks are independent until they hit the same point, at which moment they BECOME THE SAME WALK. In other words, if $\tau$ is the smallest number such that $X_{\tau}= Y_{\tau}$ then $X_k=Y_k$ for all $k\ge \tau$. So how do we model $\tau$? Well, it can be modelled with a one-dimensional random walk on the taxicab distance of $X_i, Y_i$. Note the taxicab distance between $X_i,Y_i$, call it $d_i$, satisfies $d_0=2$, increases by 2 with probability $\frac 14$, stays the same with probability $\frac 12$, and decreases by 2 with probability $\frac 14$, and $\tau$ is the smallest integer such that $d_{\tau}=0$.
Since the Markov process on $(d_i)$ is recurrent, it follows that $$\lim_{n\to\infty} \mathbb{P}(\tau > n) = 0$$ Back to the problem: if the walks meet before either of them has stopped, then $X=Y$; since neither walk can reach an axis in fewer than $\min(x,y)-1$ steps, this happens in particular whenever $\tau < \min(x,y)-1$. So we have $$\mathbb{E}[X]-\mathbb{E}[Y] = \sum\limits_{t\ge 1} \mathbb{P}(\tau = t) (\mathbb{E}[X|\tau = t] - \mathbb{E}[Y|\tau = t]) = \sum\limits_{t\ge \min(x,y)-1} \mathbb{P}(\tau = t) (\mathbb{E}[X|\tau = t] - \mathbb{E}[Y|\tau = t])$$ Notice $|\mathbb{E}[X|\tau = t] - \mathbb{E}[Y|\tau = t]| \le 1$ since $\mathrm{Im}(f) \subset [0,1]$ Hence $$|\mathbb{E}[X]-\mathbb{E}[Y]| \le \mathbb{P}(\tau \ge \min(x,y)-1)$$ For a fixed $(x,y)$ we may instead stop the walks on the lines $x=-k$ and $y=-k$ (this is why we could take $x,y>N$ at the start); letting $k\to\infty$ makes the RHS as small as we like, so the LHS is $0$. This readily implies $f$ is constant.
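Since the distance chain $(d_i)$ has so few reachable states, $\mathbb{P}(\tau > n)$ can be computed exactly by propagating its distribution (a small dynamic-programming sketch; the horizons are arbitrary choices):

```python
from fractions import Fraction

def survival(n_steps):
    """Exact P(tau > n) for the distance chain: start at d=2, move +2 w.p. 1/4,
    stay w.p. 1/2, move -2 w.p. 1/4, absorbed at 0.  States are d = 2k."""
    dist = {1: Fraction(1)}  # dist[k] = P(chain alive and at distance 2k)
    for _ in range(n_steps):
        nxt = {}
        for k, p in dist.items():
            for k2, w in ((k + 1, Fraction(1, 4)), (k, Fraction(1, 2)),
                          (k - 1, Fraction(1, 4))):
                if k2 > 0:  # k2 == 0 means the walks have met: absorbed
                    nxt[k2] = nxt.get(k2, 0) + p * w
        dist = nxt
    return sum(dist.values())

for n in (4, 16, 64, 256):
    print(n, float(survival(n)))
```

The survival probabilities decay (on the order of $1/\sqrt{n}$, consistent with recurrence of the chain).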
07.04.2023 14:20
Pyramix wrote: Since the range of the function is bounded, the function takes a maximum value $M\leq1$ and a minimum value $m\geq0$. By definition, $\exists\ (x_1,y_1),(x_2,y_2)\in\mathbb{Z}^2$ for which $f(x_1,y_1)=M$ and $f(x_2,y_2)=m$. Why would this be true?
07.04.2023 16:13
MatteD wrote: Pyramix wrote: Since the range of the function is bounded, the function takes a maximum value $M\leq1$ and a minimum value $m\geq0$. By definition, $\exists\ (x_1,y_1),(x_2,y_2)\in\mathbb{Z}^2$ for which $f(x_1,y_1)=M$ and $f(x_2,y_2)=m$. Why would this be true? Since the range is bounded and the domain is well-defined, there must be a maximum/minimum. Maximum/minimum value is defined to be taken, so I think it is true.
07.04.2023 16:45
Pyramix wrote: MatteD wrote: Pyramix wrote: Since the range of the function is bounded, the function takes a maximum value $M\leq1$ and a minimum value $m\geq0$. By definition, $\exists\ (x_1,y_1),(x_2,y_2)\in\mathbb{Z}^2$ for which $f(x_1,y_1)=M$ and $f(x_2,y_2)=m$. Why would this be true? Since the range is bounded and the domain is well-defined, there must be a maximum/minimum. Maximum/minimum value is defined to be taken, so I think it is true. @MatteD/others please confirm if this claim is true... Also please notify me if I have errors in my solution (I think I do, because my solution seems overly simple...)
07.04.2023 16:57
What if $f(x,y) = \min\left\{1,\frac{1}{2^{x+y}}\right\}$, for example? This function is bounded but does not have a minimum.
08.04.2023 08:54
Thank you for pointing that out. @MatteD, @djmathman
05.05.2023 07:08
Why is the write-up so painful? We claim that all such functions are constant. First note that all constant functions work. We claim no other function exists. Otherwise, there must exist $a, b$ such that $f(a, b) \ne f(a-1, b+1)$. Define $g(x, y) = f(x-1,y+1) - f(x,y)$. More visually, $g(x,y)$ is the difference between the value one square up and to the left and the value at the current square. WLOG assume that $g(a,b) > 0$, or else we can flip $f$ so that it holds. As such, it follows that $2g(x, y) = g(x-1, y) + g(x,y-1)$. Thus, $\max\{g(x-1,y), g(x,y-1)\} \ge g(x,y)$. Let $M = \sup_{x \le a, y \le b}\{g(x, y)\}$. Then we can redefine $a, b$ so that $g(a, b)$ becomes arbitrarily close to $M$, say $g(a, b) \ge M - \frac{1}{2^{r}}$ for some $r$. However, if $g(a, b) \ge M - \frac{1}{2^{r}}$ then $g(a, b-1) \ge M - \frac{1}{2^{r-1}}$ and $f(a,b) - f(a, b-1) \ge \frac{M - \frac{1}{2^{r-1}}}{2}$, with equality when $g(a-1,b) = M$. Thus, \[ f(a,b) - f\bigl(a,b-(r-r'+1)\bigr) \ge \frac12\sum_{i=r'}^{r} \left(M - \frac{1}{2^i}\right) \]where $r'$ is the minimal integer for which $M - \frac{1}{2^{r'}} > 0$. For a sufficiently large $r$ it follows that \[ f(a,b) - f\bigl(a,b-(r-r'+1)\bigr) > 1, \]contradiction.
15.07.2023 15:47
The answer is constant functions only, which clearly work. We now prove they are the only ones. The main part of the problem is the following claim. Claim: We have $f(x,y)=f(x+1,y-1)$. Proof: Suppose otherwise. Let $M=\sup |f(x,y)-f(x+1,y-1)|$ (this exists since $f$ is bounded), where $M \neq 0$. Pick some $(x,y)$ such that $|f(x,y)-f(x+1,y-1)|\geq (1-\varepsilon)M$ is very close to the supremum: the choice of $\varepsilon$ will be made later. Let $a=\max\{f(x,y),f(x+1,y-1)\}$. By comparing the assertion when plugging in $x,y$ and $x+1,y-1$, we find that $|f(x-1,y)-f(x+1,y-2)|\geq(2-2\varepsilon)M$. Since both $|f(x-1,y)-f(x,y-1)|$ and $|f(x,y-1)-f(x+1,y-2)|$ are bounded above by $M$, by the triangle inequality they are also bounded below by $(1-2\varepsilon)M$. Therefore, since either $\tfrac{f(x-1,y)+f(x,y-1)}{2}$ or $\tfrac{f(x,y-1)+f(x+1,y-2)}{2}$ is equal to $a$, and the absolute difference between the two terms in either numerator is at least $(1-2\varepsilon)M$, it follows that one of $f(x-1,y),f(x,y-1),f(x+1,y-2)$ is at least $a+\tfrac{(1-2\varepsilon)M}{2}$; WLOG it is $f(x-1,y)$, but it really doesn't matter. Since $|f(x-1,y)-f(x,y-1)|\geq (1-2\varepsilon)M$, we find that $|f(x-2,y)-f(x,y-2)|\geq (2-4\varepsilon)M$, so then by similar reasoning both $|f(x-2,y)-f(x-1,y-1)|$ and $|f(x-1,y-1)-f(x,y-2)|$ are bounded above by $M$ and below by $(1-4\varepsilon)M$. Hence one of $f(x-2,y),f(x-1,y-1),f(x,y-2)$ is at least $\left(a+\tfrac{(1-2\varepsilon)M}{2}\right)+\tfrac{(1-4\varepsilon)M}{2}$. Repeating this argument $k$ times, we get that $f$ attains a value which is at least $$a+\frac{(1-2\varepsilon)M}{2}+\frac{(1-4\varepsilon)M}{2}+\cdots+\frac{(1-2^k\varepsilon)M}{2}.$$By taking $k$ large enough and then $\varepsilon$ small enough (in terms of $k$), this expression is unbounded, since $M \neq 0$, but this contradicts the fact that the codomain of $f$ is $[0,1]$. 
$\blacksquare$ Now, by induction on the fact that $f(x,y)=f(x+1,y-1)$, we find that $f$ is constant along every "diagonal" of the form $x+y=n$. Then for all $(x,y)$ with $x+y=n$, we have $f(x+1,y)=\tfrac{f(x,y)+f(x+1,y-1)}{2}$, so the value $f$ attains along the diagonal $x+y=n+1$ is the same as the value attained on $x+y=n$. This implies that $f$ is constant everywhere, since if two diagonals yield different values for $f$ then we may start at the one with a lower value of $x+y$ and reach a contradiction by induction. We are done. $\blacksquare$
31.08.2024 08:24
The answer is $f(x,y) = \boxed{c}$ for a constant $c\in[0,1]$, which obviously works. We begin by inducting to the $n$th layer to get \[f(x,y) = \frac{1}{2^n}\left(\sum_{i=0}^n \binom{n}{i} f(x-n+i, y-i)\right).\] Clearly if $f(x,y) = f(x+1,y-1)$ always holds, then the function is constant, so henceforth assume otherwise. Suppose that $d = f(x,y)-f(x+1,y-1) \neq 0$ for some $x,y$. Then, applying our iterated form of $f$ to both terms gives \begin{align*} d &= \frac{1}{2^n}\left(\sum_{i=0}^n \binom{n}{i} f(x-n+i, y-i) - \sum_{i=1}^{n+1} \binom{n}{i-1} f(x-n+i, y-i) \right) \\ &= \frac{1}{2^n}\sum_{i=0}^{n+1} \left(\binom{n}{i} - \binom{n}{i-1} \right) f(x-n+i, y-i), \end{align*} letting $\tbinom{n}{-1} = \tbinom{n}{n+1} = 0$. Now, let $c_0, c_1, \dots, c_{n+1}$ be the coefficients \[c_i = \frac{\binom{n}{i}- \binom{n}{i-1}}{2^n},\] so that \[d = \sum_{i=0}^{n+1} c_i\, f(x-n+i,y-i).\] Choose $n$ to be even, and let $n=2m$. Notice that $c_i\ge0$ when $i\le m$, and $c_i\le0$ when $i\ge m+1$. Hence, since $0\le f\le 1$, \[d \le 1\cdot\left(\sum_{i=0}^m c_i \right) +0\cdot\left(\sum_{i=m+1}^{2m+1} c_i\right) = \frac{\binom{2m}{m}}{4^m}, \] and similarly $d \ge -\frac{\binom{2m}{m}}{4^m}$. The Stirling Approximation for $n!$ gives that \[n! \approx \sqrt{2\pi n} \left(\frac{n}{e} \right)^n \implies \binom{2m}{m} \approx \frac{4^m}{\sqrt{\pi m}}.\] Since $d$ does not depend on $m$, this means that \[|d| \le \lim_{m \to \infty} \frac{\binom{2m}{m}}{4^m} = \lim_{m \to \infty} \frac{1}{\sqrt{\pi m}} = 0,\] which requires $d=0$, a contradiction. Thus, the only solutions are $f \equiv c$.
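The sign pattern of the $c_i$ and the telescoping value $\sum_{i\le m} c_i = \binom{2m}{m}/4^m$ can be verified numerically (a small sketch; the helper names `c` and `C` are just for illustration):

```python
from math import comb

def c(n, i):
    # c_i = (C(n, i) - C(n, i-1)) / 2^n, with C(n, -1) = C(n, n+1) = 0.
    def C(k):
        return comb(n, k) if 0 <= k <= n else 0
    return (C(i) - C(i - 1)) / 2 ** n

for m in (3, 10, 25):
    n = 2 * m
    cs = [c(n, i) for i in range(n + 2)]
    assert all(ci >= 0 for ci in cs[: m + 1])   # nonnegative for i <= m
    assert all(ci <= 0 for ci in cs[m + 1 :])   # nonpositive for i >= m+1
    assert abs(sum(cs)) < 1e-12                 # whole sum telescopes to 0
    # The positive part telescopes to C(2m, m) / 4^m, which tends to 0.
    assert abs(sum(cs[: m + 1]) - comb(n, m) / 4 ** m) < 1e-12
```

This confirms both facts the bound on $d$ relies on: the coefficients change sign exactly at the middle, and the positive mass shrinks like the central binomial coefficient.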
22.11.2024 01:57
Beautiful problem! We claim the only solutions are constants, which clearly work. Call a function skibidi if it satisfies the functional equation. Fix a skibidi function $f$. Consider some "stack" of $f$, where the $i$th layer in the stack for $i\ge 0$ consists of the values $f(r-i,s),f(r-i+1,s-1),\dots,f(r+k,s-k-i)$ for $k\ge 0$ and some fixed $r,s$. Lemma: The sum of squares of elements of layer $i$ is nondecreasing as $i$ increases. Proof: If one layer has $a_1,\dots,a_n$ then the previous layer has sum of squares \begin{align*}&\left(\frac{a_1+a_2}2\right)^2+\dots+\left(\frac{a_{n-1}+a_n}2\right)^2\\=\,&\frac{a_1^2}2+a_2^2+\dots+a_{n-1}^2+\frac{a_n^2}2-\frac14\left((a_1-a_2)^2+\dots+(a_{n-1}-a_n)^2\right)\\=\,&a_1^2+\dots+a_n^2-\frac12(a_1^2+a_n^2)-\frac14\left((a_1-a_2)^2+\dots+(a_{n-1}-a_n)^2\right),\end{align*} which is at most $a_1^2+\dots+a_n^2$ since the subtracted terms are nonnegative. $\square$ Let $g(x,y)=f(x,y)-f(x+1,y-1)$. The main idea is that $g$ is also skibidi, which is easy to check. Next, define $h(x,y)=g(x,y)-g(x+1,y-1)$. Now $h$ is also skibidi. Suppose $g$ is nonconstant. Then there is some layer of $h$ with sum of squares $H>0$, and thus every deeper layer of the stack of $h$ starting at that layer does as well, by our lemma. Now we can choose the stack of $g$ that generates the above stack of $h$ (adjacent elements of a layer of $g$ differ by exactly the values of $h$ along a layer), such that between every two layers we get \begin{align*}&b_1^2+\dots+b_n^2-\left(\frac{b_1+b_2}2\right)^2-\dots-\left(\frac{b_{n-1}+b_n}2\right)^2\\\ge\,&\frac14\left((b_1-b_2)^2+\dots+(b_{n-1}-b_n)^2\right)\\\ge\,&\frac H4.\end{align*}In particular, by telescoping this we can find some layer of $g$ with sum of squares $G>4$. Now choose the stack of $f$ that generates the above stack of $g$ starting at the layer with sum of squares $>4$. 
Between every two layers we get \begin{align*}&a_1^2+\dots+a_n^2-\left(\frac{a_1+a_2}2\right)^2-\dots-\left(\frac{a_{n-1}+a_n}2\right)^2\\\ge\,&\frac14\left((a_1-a_2)^2+\dots+(a_{n-1}-a_n)^2\right)\\\ge\,&\frac G4,\end{align*}since the differences $a_i-a_{i+1}$ are exactly the values of $g$ along a layer, whose squares sum to at least $G$. Telescoping this, we get that the sum of squares of layer $n$ is $\ge n\cdot\frac G4$, while layer $n$ has only $n+k$ elements for some fixed $k$. Thus layer $n$ has average squared element $\ge\frac n{n+k}\cdot\frac G4$, which is greater than $1$ for large enough $n$ since $G>4$. This is impossible, as $f$ takes values in $[0,1]$. Otherwise, if $g$ is constant and nonzero, then $f$ is clearly unbounded. Thus $g$ is identically $0$, so $f$ is constant.
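The averaging identity used in the lemma above can be checked on random data (a minimal numerical sketch; the values $a_i$ are arbitrary):

```python
import random

random.seed(1)
a = [random.random() for _ in range(10)]
n = len(a)

# Sum of squares of the pairwise averages (the "previous" layer).
avg_sq = sum(((a[i] + a[i + 1]) / 2) ** 2 for i in range(n - 1))
# Right-hand side of the lemma's identity.
identity = (
    sum(x * x for x in a)
    - (a[0] ** 2 + a[-1] ** 2) / 2
    - sum((a[i] - a[i + 1]) ** 2 for i in range(n - 1)) / 4
)
assert abs(avg_sq - identity) < 1e-12
# In particular the averaged layer never has a larger sum of squares.
assert avg_sq <= sum(x * x for x in a) + 1e-12
```

The subtracted terms quantify exactly how much "energy" is lost per layer, which is what the telescoping step exploits.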
25.11.2024 01:19
The answer is all constant $f$, which clearly work. It remains to show that $f$ must be constant. We begin with a claim. Claim: For any $(x,y)\in\mathbb{Z}^2$ and integer $s > 0$, we have $|f(x+1,y-1) - f(x,y)|\le 2^{-s}(1+\tbinom{s}{\lfloor s/2\rfloor})$. Proof: Fix $(x,y)$ and $s$. Define $a_{p} = f(p, x+y-s-p)$. It's easy to show by induction using the given FE that for any $q$, we have \[b_q := f(q,x+y-q) = \frac{1}{2^s}\sum_{i = 0}^s\binom{s}{i}a_{q-s+i}.\]Then, we have \begin{align*} b_{x+1} - b_x &= \frac{1}{2^s}\sum_{i = 0}^{s}\binom{s}{i}a_{x+1-s+i} - \frac{1}{2^s}\sum_{i = 0}^s\binom{s}{i}a_{x-s+i}\\ &= \frac{1}{2^s}\sum_{i =0}^s \binom{s}{i}a_{x+1-s+i} - \frac{1}{2^s}\sum_{i = -1}^{s-1}\binom{s}{i+1}a_{x+1-s+i}\\ &= \frac{1}{2^s}(a_{x+1} - a_{x-s}) + \frac{1}{2^s}\sum_{i = 0}^{s-1}\left(\binom{s}{i} - \binom{s}{i+1}\right)a_{x+1-s+i}\\ &\le \frac{1}{2^s}(1) + \frac{1}{2^s}\binom{s}{\lfloor s/2\rfloor} \end{align*}where the last step used $a_p \in [0,1]$ for all $p$. The proof of $b_{x+1} - b_x\ge -\tfrac{1}{2^s} - \tfrac{1}{2^s}\tbinom{s}{\lfloor s/2\rfloor}$ is nearly identical. $\square$ Using the claim (restricting to even values of $s$), we now have that for any $(x,y)$ \begin{align*} |f(x+1, y-1) - f(x,y)| &\le \lim_{s\to \infty}\left(\frac{1}{2^s} + \frac{\binom{s}{\lfloor s/2\rfloor}}{2^s}\right)\\ &= \lim_{s\to \infty}\frac{\binom{2s}{s}}{2^{2s}}\\ &=\lim_{s\to \infty} \frac{(2s)!}{(s!)^2 2^{2s}}\\ &=\lim_{s\to \infty} \frac{\sqrt{4\pi s}(2s/e)^{2s}(1 + O(1/s))}{(\sqrt{2\pi s}(s/e)^s(1 + O(1/s)))^2 2^{2s}}\\ &= \lim_{s\to \infty}\frac{C}{\sqrt{s}} = 0, \end{align*}where $C$ is some constant I was too lazy to calculate. Hence, $f(x,y) = f(x+1,y-1)$ for all $(x,y)$. Then $$f(x+1,y) = \tfrac{1}{2}(f(x,y) + f(x+1,y-1)) = f(x,y).$$Similarly, $f(x,y) = f(x-1, y+1),$ so $f(x,y+1) = \tfrac{1}{2}(f(x,y) + f(x-1, y+1)) = f(x,y)$. Thus, we have $f(x,y) = f(x+1,y) = f(x,y+1)$ for all $(x,y)$, which implies $f$ is constant.
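For what it's worth, the lazy constant is $C = 1/\sqrt{\pi}$; the asymptotic $\binom{2s}{s}/4^s \sim 1/\sqrt{\pi s}$ behind the limit can be checked numerically (a quick sketch using exact integer binomials via `math.comb`):

```python
from math import comb, pi, sqrt

# Central binomial asymptotic: C(2s, s) / 4^s ~ 1 / sqrt(pi * s),
# so the claimed bound 2^-s (1 + C(s, s//2)) on the diagonal difference
# of f tends to 0 as s grows.
for s in (10, 100, 1000):
    ratio = comb(2 * s, s) / 4 ** s * sqrt(pi * s)
    assert abs(ratio - 1) < 1 / s      # the ratio approaches 1

bound = lambda s: (1 + comb(s, s // 2)) / 2 ** s
assert bound(10_000) < 0.01            # the bound is already tiny here
```

Python's exact big-integer arithmetic makes this check reliable even for $s$ far beyond floating-point range.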