Let $\omega$ be a circle of radius $h$ with center $P$. Let $\omega_1$ be a circle of radius $AD$ with center $A$, and let $\omega_2$ be a circle of radius $BC$ with center $B$. Note that all three circles are externally tangent to each other. Also, $\omega$ is tangent to $CD$; let this tangency point be $E$.
For the configuration in the problem to exist, the tangent line to $\omega$ at $E$ must intersect both $\omega_1$ and $\omega_2$; these intersections are the points $D$ and $C$.
The smallest possible radius of $\omega_1$ occurs when $CD$ is tangent to both $\omega_1$ and $\omega_2$. If $AD$ were any smaller, then whenever the tangent $EC$ is drawn, it would lie entirely below $\omega_1$ and could not meet it, so $D$ would not exist, a contradiction.
Using the Pythagorean Theorem, we find that in this extremal configuration $\frac{1}{\sqrt{h}}=\frac{1}{\sqrt{AD}}+\frac{1}{\sqrt{BC}}$.
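To spell out the Pythagorean step (a sketch, under the assumption that in this extremal configuration all three circles are tangent to line $CD$ on the same side, at $D$, $E$, and $C$ respectively, with $E$ between $D$ and $C$): for two externally tangent circles of radii $r_1$ and $r_2$ tangent to a common line, the segment joining the centers has length $r_1+r_2$, and together with the feet of the perpendiculars to the line it yields a right triangle with legs $|r_1-r_2|$ and the distance $d$ between the tangency points, so
\[d^2+(r_1-r_2)^2=(r_1+r_2)^2\quad\Longrightarrow\quad d=2\sqrt{r_1 r_2}.\]
Applying this to the pairs $(\omega_1,\omega)$, $(\omega,\omega_2)$, and $(\omega_1,\omega_2)$ and using $DC=DE+EC$ gives
\[2\sqrt{AD\cdot BC}=2\sqrt{AD\cdot h}+2\sqrt{h\cdot BC},\]
and dividing through by $2\sqrt{AD\cdot BC\cdot h}$ produces the identity above.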
Since the actual length $AD$ is at least the extremal value determined by this equality, $\frac{1}{\sqrt{AD}}$ can only decrease, and we get the desired inequality.
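Explicitly (a sketch of this final step; the notation $AD_{\min}$ for the extremal length is introduced here, and the direction of the desired inequality is inferred from the structure of the argument): if $AD\ge AD_{\min}$, where $\frac{1}{\sqrt{h}}=\frac{1}{\sqrt{AD_{\min}}}+\frac{1}{\sqrt{BC}}$, then
\[\frac{1}{\sqrt{AD}}+\frac{1}{\sqrt{BC}}\le\frac{1}{\sqrt{AD_{\min}}}+\frac{1}{\sqrt{BC}}=\frac{1}{\sqrt{h}}.\]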