Let $G$ be a directed graph with infinitely many vertices. It is known that the outdegree of each vertex is greater than its indegree. Let $O$ be a fixed vertex of $G$. For an arbitrary positive integer $n$, let $V_{n}$ be the number of vertices that can be reached from $O$ by passing through at most $n$ edges ($O$ itself counts). Find the smallest possible value of $V_{n}$.
Source: Bulgarian IMO TST 2008, Day 2, Problem 3
Tags: floor function, ceiling function, induction, algebra, polynomial, combinatorics proposed, combinatorics
09.07.2013 19:26
The question doesn't say so, but presumably multiple parallel edges are forbidden. (Otherwise the answer is $V_n = n + 1$: from vertex $O$ we have an edge to a vertex $O_1$, from which we have two edges to a vertex $O_2$, from which we have three edges to a vertex $O_3$, etc.)

Experimenting for a little while leads one to the conclusion that the minimum is attained by the graph $G$ defined as follows: $G$ has vertex set $A_0 \cup A_1 \cup A_2 \cup \ldots$, where $A_0 = \{O\}$, the layers $A_{2i}$ and $A_{2i + 1}$ each consist of $i + 1$ vertices, every vertex in $A_i$ has a directed edge to every vertex in $A_{i + 1}$, and there are no other edges. This gives \[ V_n = \left\lfloor \frac{(n+2)^2}{4}\right\rfloor = \begin{cases} (k + 1)^2, & n = 2k, \\ (k+1)(k+2),& n = 2k + 1. \end{cases} \]

Now we show that this is minimal. Let $O_n$ be the sum of the outdegrees of all vertices visited by time $n$ and let $I_n$ be the sum of the indegrees of all vertices visited by time $n$. Since every vertex has larger outdegree than indegree, $O_n \geq V_n + I_n$. Also, every outgoing edge from a vertex visited by time $n - 1$ is an incoming edge to some vertex visited by time $n$, so $I_n \geq O_{n - 1}$. Thus $O_n \geq V_n + O_{n - 1}$, and inducting downwards we get $O_n \geq V_n + V_{n - 1} + \ldots + V_0$. The average outdegree of the vertices visited by time $n$ is $\frac{O_n}{V_n}$, so some vertex must have outdegree at least $\left\lceil\frac{O_n}{V_n}\right\rceil$. Since there are no parallel edges, it follows that \begin{align*} V_n & \geq V_{n - 1} + \left\lceil\frac{O_{n-1}}{V_{n-1}}\right\rceil \\ & \geq V_{n - 1} + 1 + \left\lceil\frac{V_{n - 2} + \ldots + V_0}{V_{n-1}}\right\rceil. \end{align*} By induction, we may assume that $V_i \geq \left\lfloor \frac{(i+2)^2}{4}\right\rfloor$ for $i < n$. The right-hand side of the previous display is minimized when (1) $V_0, \ldots, V_{n - 2}$ are chosen as small as possible, and then (2) $V_{n - 1}$ is chosen to make the whole expression as small as possible. Under the given constraints, it turns out [fudging and computations suppressed] that we should also choose $V_{n - 1}$ as small as possible. Putting all this together [more computations and possibly fudging suppressed], we get the bound we originally claimed for $V_n$.

It would be nice to see a solution that didn't rely on very careful arithmetic with sums of floors of polynomials and so on. The idea for this proof came to me from an argument in Patrick Byrnes' University of Minnesota Ph.D. thesis (2012), in which he proves that the largest rank sizes for an $r$-differential poset are given by the $r$-Fibonacci numbers.
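As a quick sanity check on the construction (this is just a check, not part of the proof), here is a short Python sketch that builds a truncated copy of the layered graph above, runs a breadth-first search from $O$, and confirms that the number of vertices within distance $n$ equals $\left\lfloor \frac{(n+2)^2}{4}\right\rfloor$ for small $n$. The layer sizes are the ones described in the post; the function names and the truncation depth are just conveniences for this check.

```python
# Sanity check (not part of the argument above): build a truncated copy of the
# layered graph A_0, A_1, A_2, ..., where |A_{2i}| = |A_{2i+1}| = i + 1 and
# every vertex of A_j points to every vertex of A_{j+1}, then count the
# vertices within distance n of O by BFS and compare with floor((n+2)^2/4).
from collections import deque

def build_layers(num_layers):
    """Return the layers A_0, ..., A_{num_layers - 1} as lists of vertex labels."""
    layers, label = [], 0
    for j in range(num_layers):
        size = j // 2 + 1                  # |A_{2i}| = |A_{2i+1}| = i + 1
        layers.append(list(range(label, label + size)))
        label += size
    return layers

def v_n(n):
    """Number of vertices within distance n of O in a truncation of the construction."""
    layers = build_layers(n + 2)           # one spare layer beyond A_n
    out = {v: [] for layer in layers for v in layer}
    for cur, nxt in zip(layers, layers[1:]):
        for u in cur:
            out[u].extend(nxt)             # all edges go from A_j to A_{j+1}
    dist, queue = {0: 0}, deque([0])       # vertex 0 is O (the only vertex of A_0)
    while queue:
        u = queue.popleft()
        for w in out[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return sum(1 for d in dist.values() if d <= n)

for n in range(12):
    assert v_n(n) == ((n + 2) ** 2) // 4, n
print("V_n = floor((n+2)^2/4) for n = 0, ..., 11")
```

The degree condition can be checked by hand from the layer sizes: a vertex of $A_j$ with $j \ge 1$ has indegree $|A_{j-1}|$ and outdegree $|A_{j+1}| = |A_{j-1}| + 1$, while $O$ has indegree $0$ and outdegree $1$.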
30.05.2019 04:05
JBL wrote: Also, every outgoing edge from a vertex visited by time $n - 1$ is an incoming edge to some vertex visited by time $n$, so $I_n \geq O_{n - 1}$. Thus $O_n \geq V_n + O_{n - 1}$, and inducting downwards we get $O_n \geq V_n + V_{n - 1} + \ldots + V_0$.

I'm afraid it's not so easy to see why this should be true (is it even true?), since you could certainly have an edge "pointing backwards in time": that is, there could be an edge from a vertex visited at time $a$ to a vertex visited at time $b$ with $b \le a.$ The following is a solution which avoids this annoying scenario using an extremal assumption.

Redefine $V_n$ to mean the set of reachable vertices instead of their number. The answer is $|V_n| = \left \lfloor \frac{(n+2)^2}{4} \right \rfloor.$ For a construction, do as JBL did above.

We'll now show that this is minimal. Consider a graph with the smallest possible $|V_n|$, and among such graphs, one which minimizes the number of edges among the vertices of $V_n.$ Let $S_i$ denote the set of vertices at distance exactly $i$ from $O$, for each $i = 0, 1, \cdots, n.$ In this way, $S_0 = \{O\}$ and $V_n = S_0 \cup S_1 \cup \cdots \cup S_n.$ We may assume WLOG that there is no $a \notin V_n$ with $a \rightarrow c$ for some $c \in S_n$, as we could just remove any such edge with no repercussions (fixing $a$ using the infinitely many other vertices of the graph if we messed it up).

Claim. For a vertex $a \in S_i$, every vertex $b$ with $b \rightarrow a$ satisfies $b \in S_{i-1}.$

Proof. Suppose the contrary, for the sake of contradiction, and consider the smallest $i$ for which some $a \in S_i$ fails to satisfy the claim. By our earlier assumption, we have that $i < n$. We consider four cases on $b$.

If $b \notin V_n$, then we simply consider a vertex $v_{i+1} \in S_{i+1}$ with $a \rightarrow v_{i+1}$, and remove the edges $b \rightarrow a$ and $a \rightarrow v_{i+1}.$ If we have messed up $b$, we use the infinitely many other vertices of the graph to fix it.

If $b \in S_0 \cup S_1 \cup \cdots \cup S_{i-2}$, then $a$ is at distance at most $(i-2) + 1 = i-1$ from $O$, contradiction.

If $b \in S_j$ for some $i \le j < n$, then we can start at $a$ and just keep moving: take the longest possible path $a \rightarrow v_1 \rightarrow v_2 \rightarrow \cdots \rightarrow v_k$ starting at $a$ and contained entirely within $V_n$. Eventually this path either forms a cycle with itself or enters $S_{j+1}$. If it forms a cycle, then we just remove all edges of the cycle. Otherwise, let $v_x$ be the first vertex of the path to enter $S_{j+1}$; we must have $v_{x-1} \in S_j.$ Then we remove the edges $a \rightarrow v_1, v_1 \rightarrow v_2, \cdots, v_{x-2} \rightarrow v_{x-1}, v_{x-1} \rightarrow v_x$ and add the edge $a \rightarrow v_x.$

If $b \in S_n$, then we simply remove $b \rightarrow a$ and use the infinitely many other vertices to fix $b.$

In every case we have obtained a graph with a smaller number of edges among the vertices of $V_n$, a contradiction. $\blacksquare$

Having made these assumptions about the graph, we're ready to proceed with the problem. Let $x_i = |S_i|$ for each $0 \le i \le n$, and let $e_i$ be the number of edges pointing from $S_{i-1}$ to $S_i$ for each $1 \le i \le n$; set $e_0 = 0$, which is consistent because, by the Claim, $O$ has no incoming edges.

Lemma 1. $e_i = x_{i-1} + e_{i-1}$, for all $1 \le i \le n.$

Proof. By the Claim, every edge out of a vertex of $S_{i-1}$ goes to $S_i$ and every edge into a vertex of $S_{i-1}$ comes from $S_{i-2}$, so summing outdegree minus indegree over $S_{i-1}$ gives $e_i - e_{i-1} \ge x_{i-1}.$ Now, if $e_i > x_{i-1} + e_{i-1}$, then some vertex of $S_{i-1}$ has outdegree exceeding its indegree by at least two, so there exists an edge from some vertex in $S_{i-1}$ to some vertex in $S_i$ which we can remove.
Hence, removing it gives a graph with fewer edges among the vertices of $V_n$, contradicting our minimality assumption. $\blacksquare$

Lemma 2. $|S_i| \ge \frac{e_i}{|S_{i-1}|} = 1 + \frac{e_{i-1}}{|S_{i-1}|}$ for each $1 \le i \le n.$

Proof. By Pigeonhole there exists some vertex $v \in S_{i-1}$ with at least $\frac{e_i}{|S_{i-1}|}$ edges into $S_i$; since there are no parallel edges, its out-neighbours are that many distinct vertices of $S_i$, which implies the left inequality. The right equality simply follows from Lemma 1. $\blacksquare$

Rearranging the inequality in Lemma 2 gives us that $$x_{i-1} (x_i-1) \ge e_{i-1}.$$Let $s_i = x_1 + \cdots + x_i$ for each $1 \le i \le n.$ Since $e_1 = x_0 + e_0 = 1$, Lemma 1 gives $e_i = x_{i-1} + x_{i-2} + \cdots + x_1 + x_0 = s_{i-1} + 1$ for each $2 \le i \le n.$ Hence, we have from Lemma 1, AM-GM, and the above that $$s_i = s_{i-1} + x_i = (e_i - 1) + x_i = e_{i-1} + x_{i-1} + (x_i - 1) \ge e_{i-1} + 2 \sqrt{x_{i-1}(x_i-1)} \ge e_{i-1} + 2 \sqrt{e_{i-1}},$$for each $2 \le i \le n$, where the first inequality is AM-GM and the second uses the displayed inequality above. Therefore, since $s_1 \ge 1$ and $s_2 \ge 3$ are easy to verify, we have by an easy induction (using $e_{i-1} = s_{i-2} + 1$ again) that $s_n \ge \left \lfloor \frac{(n+2)^2}{4} \right \rfloor - 1.$ Adding back $x_0 = 1$ then gives the required bound $|V_n| = s_n + 1 \ge \left \lfloor \frac{(n+2)^2}{4} \right \rfloor,$ as desired. $\square$
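For anyone who wants to see the closing induction numerically (again, just a check, not part of the proof): since each $s_i$ is an integer, the bound $s_i \ge e_{i-1} + 2\sqrt{e_{i-1}}$ together with $e_{i-1} = s_{i-2} + 1$ forces $s_i \ge \left\lceil (s_{i-2}+1) + 2\sqrt{s_{i-2}+1} \right\rceil$, and iterating this worst-case recurrence from the base cases $s_1 = 1$, $s_2 = 3$ reproduces $\left\lfloor \frac{(i+2)^2}{4}\right\rfloor - 1$ exactly. A short Python sketch (the names here are just for this check):

```python
# Numerical check only (not part of the proof): iterate the worst case of the
# bound s_i >= e_{i-1} + 2*sqrt(e_{i-1}) with e_{i-1} = s_{i-2} + 1, starting
# from the base cases s_1 = 1 and s_2 = 3, and compare with floor((i+2)^2/4) - 1.
# Taking ceilings is legitimate because each s_i is an integer (a sum of set sizes).
from math import ceil, sqrt

s = {1: 1, 2: 3}
for i in range(3, 40):
    e = s[i - 2] + 1                   # e_{i-1} = s_{i-2} + 1
    s[i] = ceil(e + 2 * sqrt(e))       # smallest integer permitted by the bound
for i in range(1, 40):
    assert s[i] == ((i + 2) ** 2) // 4 - 1, i
print("s_i = floor((i+2)^2/4) - 1 for i = 1, ..., 39")
```

This is exactly the sequence realized by the layered construction, where $x_i = |A_i| = \lfloor i/2 \rfloor + 1$.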