Let $x(t),t\in [1,\infty)$ be a nondecreasing positive function satisfying the following inequality: $$ x'(t) \le \int_t^{+\infty} x(s)\frac{k(s)}{s^2}\,ds, $$ for any $t \ge 1$, where $k(t),t\in [1,\infty)$ is a nonincreasing positive function such that $$ \int_1^{+\infty}\frac{k(s)}{s}\,ds <\infty. $$

Can we prove that $x(t)$ is a bounded function?


The boundedness statement is true

The general argument is similar to what I gave in my previous answer, which was essentially off by a log due to certain inefficiencies in the estimates. Here I rewrite the argument to get rid of the log loss.

To start, as explained in the previous answer, we immediately have that $x(t)$ is sublinear. Since the inequality concerning $x(t)$ is linear, we can assume, without loss of generality, that $\sup x(t)/t \leq 1$ by a simple rescaling of $x$. This means we can find a sequence of times $1 = t_0, t_1, t_2, \ldots$ defined by

$$ t_i = \inf \{ t\in [1,\infty) : \forall s \geq t, x(s) \leq 2^{-i} s \}. $$

(Note that since $x$ is differentiable it is in particular continuous, so $x(t_i) = 2^{-i} t_i$.)

Our goal is to estimate $t_i$. Specifically, we want to show that $2^{-i}t_i$ is bounded; this suffices, since for $t \in [t_i, t_{i+1}]$ we have $x(t) \leq 2^{-i} t \leq 2 \cdot 2^{-(i+1)} t_{i+1}$.

We will also use the notation

$$ K(t) = \int_t^\infty \frac{k(s)}{s} ~ds, \quad K_i = \int_{t_i}^{t_{i+1}} \frac{k(s)}{s} ~ds.$$

We note that the $K_i$ are summable by assumption (their sum is at most $K(t_0)$), and in particular $K_i \to 0$.


Integrating by parts the differential inequality for $x'$ we get (as I argued in the previous answer) for $1 \leq a < b$

$$ x(b) - x(a) \leq (b-a) \int_b^\infty \frac{x(s)}{s} \frac{k(s)}{s} ~ds + \int_a^b x(s) \frac{s - a}{s} \frac{k(s)}{s} ~ds $$

Estimating $(s-a)/s \leq 1$, and enlarging the first integral from $\int_b^\infty$ to $\int_a^\infty$ (the integrand is positive), we have by Gronwall's inequality

$$ x(b) \leq \left[ x(a) + (b-a) \int_a^\infty \frac{x(s)}{s} \frac{k(s)}{s} ~ds\right] \cdot e^{K(a) - K(b)} $$
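For reference, the form of Gronwall's inequality being applied is: if $A(b)$ is nondecreasing, $g \geq 0$, and $y(b) \leq A(b) + \int_a^b y(s)\, g(s)\,ds$ for all $b \geq a$, then $y(b) \leq A(b)\, e^{\int_a^b g(s)\,ds}$. Here we take

$$ y = x, \qquad g(s) = \frac{k(s)}{s} \quad \left(\text{so that } \int_a^b g(s)\,ds = K(a) - K(b)\right), \qquad A(b) = x(a) + (b-a) \int_a^\infty \frac{x(s)}{s} \frac{k(s)}{s}\,ds, $$

and $A$ is nondecreasing in $b$ since the integrand is positive.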

This implies, setting $b = t_{i+1}$ and $a = t_i$, that

$$ 2^{-1-i} t_{i+1} \leq \left[ 2^{-i} t_i + (t_{i+1} - t_i) \sum_{j = i}^\infty 2^{-j} K_j \right] e^{K_{i}} $$

(here we rewrote $\int_{t_{i}}^\infty x(s) k(s) s^{-2} ~ds = \sum_{j = i}^\infty \int_{t_j}^{t_{j+1}} x(s)s^{-1} \cdot k(s) s^{-1} ~ds$, used the bound $x(s)s^{-1} \leq 2^{-j}$ for $s \geq t_j$, and the fact that all functions involved are positive.)

Simplifying (by the summability of the $K_j$ we may restrict from here on to indices $i$ larger than some sufficiently large $i_0$, so that both bracketed factors below are positive), we get

$$ \left[ e^{-K_{i}} - 2\sum_{j = 0}^\infty 2^{-j} K_{i+j} \right] t_{i+1} \leq \left[2 - 2\sum_{j = 0}^\infty 2^{-j} K_{i+j} \right] t_i $$

(note the factor of $2$ arising from multiplying through by $2^{i+1}$, since $\sum_{j=i}^\infty 2^{-j} K_j = 2^{-i} \sum_{j=0}^\infty 2^{-j} K_{i+j}$).

So for all sufficiently large $i$ we have the bound (using $e^{-K_i} \geq 1 - K_i$, from the convexity of the exponential function)

$$ t_{i+1} \leq 2 t_i \cdot \frac{1}{e^{-K_{i}} - 2\sum_{j = 0}^\infty 2^{-j} K_{i+j}} \leq \frac{2 t_i}{1 - K_i - 2\sum_{j = 0}^\infty 2^{-j} K_{i + j}} $$
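As a numerical sanity check (not part of the proof), one can iterate the worst case of a recursion of this type, $t_{i+1} = 2 t_i / \big(1 - K_i - 2\sum_j 2^{-j} K_{i+j}\big)$, for a hypothetical summable profile such as $K_i = (i+1)^{-2}$, and watch the normalized quantity $2^{-i} t_i$ converge:

```python
# Sanity check of the recursion
#   t_{i+1} = 2 t_i / (1 - K_i - 2 * sum_j 2^{-j} K_{i+j})
# for the hypothetical summable profile K_i = 1/(i+1)^2 (an assumption made
# purely for illustration; any summable sequence behaves the same way).
# In terms of u_i = 2^{-i} t_i the worst case reads u_{i+1} = u_i / denom_i.

def K(i):
    return 1.0 / (i + 1) ** 2

i0 = 10            # start late enough that the denominator stays positive
u = 1.0            # u_i = 2^{-i} t_i, normalized so u_{i0} = 1
us = [u]
for i in range(i0, i0 + 5000):
    tail = sum(2.0 ** (-j) * K(i + j) for j in range(60))  # geometric tail
    u /= 1.0 - K(i) - 2.0 * tail
    us.append(u)

print(us[-1])      # the sequence converges, i.e. 2^{-i} t_i stays bounded
```

The sequence `us` increases but its increments are summable, mirroring the convergence of the infinite product below.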

The estimates on $t_i$

To show our desired conclusion it suffices to show that the infinite product

$$ \prod_{i = i_0}^\infty \left(1 - K_i - 2\sum_{j = 0}^\infty 2^{-j} K_{i+j}\right) $$

is bounded below away from zero. Now, the summability of $K_i$ implies that we can choose $i_0$ sufficiently large that $\sum_{i = i_0}^\infty K_i < \frac{1}{10}$. Since then $K_i + 2\sum_{j=0}^\infty 2^{-j} K_{i+j} \leq 5 \sum_{m = i_0}^\infty K_m < \frac12$ for every $i \geq i_0$, we may apply $\ln(1-u) \geq -2u \ln 2$, valid for $0 \leq u \leq \frac12$, to get

$$ \sum_{i = i_0}^\infty \ln \left(1 - K_i - 2\sum_{j = 0}^\infty 2^{-j} K_{i+j}\right) \geq - 2\ln 2 \sum_{i = i_0}^\infty \left( K_i + 2\sum_{j = 0}^\infty 2^{-j} K_{i + j} \right) $$

The first term in the sum is bounded by the summability of the $K_i$. For the second term we interchange the order of summation:

$$ \sum_{i = i_0}^\infty \sum_{j = 0}^\infty 2^{-j} K_{i + j} = \sum_{j = 0}^\infty 2^{-j} \sum_{i = i_0}^\infty K_{i + j} \leq \left(\sum_{j = 0}^\infty 2^{-j}\right) \sum_{i = i_0}^\infty K_{i} = 2 \sum_{i = i_0}^\infty K_{i} < \infty$$

so it is also bounded. This concludes the proof.
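As a quick numerical check of the interchange of summation (again with the hypothetical profile $K_i = (i+1)^{-2}$, truncated to finitely many terms):

```python
# Check sum_i sum_j 2^{-j} K_{i+j} <= 2 * sum_i K_i for a truncated
# hypothetical profile K_i = 1/(i+1)^2 (illustration only).

N, i0 = 1500, 3
K = [1.0 / (i + 1) ** 2 for i in range(N)]

double_sum = sum(2.0 ** (-j) * K[i + j]
                 for i in range(i0, N) for j in range(N - i))
tail_sum = sum(K[i] for i in range(i0, N))

print(double_sum, 2.0 * tail_sum)   # the first never exceeds the second
```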

  • It is not clear to me why $x(b) \leq \left[ x(a) + (b-a) \int_b^\infty \frac{x(s)}{s} \frac{k(s)}{s} ~ds\right] \cdot e^{K(a) - K(b)}$ is true. I guess that it should be $x(b) \leq \left[ x(a) + (b-a) \int_a^\infty \frac{x(s)}{s} \frac{k(s)}{s} ~ds\right] \cdot e^{K(a) - K(b)}$, and the next line is $2^{-1-i} t_{i+1} \leq \left[ 2^{-i} t_i + (t_{i+1} - t_i) \sum_{j = i}^\infty 2^{-j} K_j \right] e^{K_{i}}$. Then the rest of the argument works as before. – Totoro Feb 21 at 16:16
  • @Totoro: that part is exactly the same as my previous answer. Do you see how I got the bound for $x(t_2) - x(t_1)$ through integration by parts in the previous part? There are two boundary terms corresponding to the points $t_2\,(=b)$ and $t_1\,(=a)$. The $t_1$ term vanishes because I chose to integrate by parts against $t - t_1$, which vanishes there. – Willie Wong Feb 21 at 20:36
  • Oh, wait, I see what you meant; your question is with my application of Gronwall. Yes, you are correct. – Willie Wong Feb 21 at 20:42

This is not a complete answer

The argument below doesn't give boundedness.

However, it gives that under the assumption $\int_1^\infty x(s) k(s) s^{-2} ds$ converges, $x(t) = o(t^\lambda)$ for any $\lambda > 0$.

With stronger assumptions on $k$ the argument can also imply boundedness.

Sublinear growth

First one observes that if the integral $\int_1^\infty x(s) k(s) s^{-2} ds$ converges, then $x'(t) \to 0$ as $t\to\infty$ (the upper bound on $x'(t)$ is the tail of a convergent integral), and hence $x(t) = o(t)$ by, e.g., L'Hopital.

Upgrade the growth control

Integrating the bound for $x'(t)$ gives, for arbitrary $1 \leq t_1 < t_2$

$$ x(t_2) - x(t_1) \leq \int_{t_1}^{t_2} \int_t^\infty x(s) k(s) s^{-2} ds~dt $$

Integrating by parts in $t$, using that $dt = d(t - t_1)$,

$$ x(t_2) - x(t_1) \leq (t_2 - t_1) \int_{t_2}^\infty x(s) k(s) s^{-2} ds + \int_{t_1}^{t_2} (t - t_1) x(t) k(t) t^{-2} dt $$

Now, for any $\delta, \beta> 0$, there exists $\tau_{\delta,\beta}$ such that for every $s \geq \tau_{\delta,\beta}$, it holds that

$$ x(s) \leq \delta s, \qquad \text{and} \qquad \int_{s}^\infty k(r)r^{-1}\, dr < \beta $$

since we know that $x$ grows sublinearly and the relevant integral converges. This implies that, for $t_2 > \tau_{\delta,\beta}$,

$$ x(t_2) \leq x(\tau_{\delta,\beta}) + (t_2 - \tau_{\delta,\beta}) \delta \beta + \int_{\tau_{\delta,\beta}}^{t_2} x(s) \cdot k(s) s^{-1} ds $$

By Gronwall's inequality this means that

$$ x(t_2) \leq [ \delta \tau_{\delta,\beta} + (t_2 - \tau_{\delta,\beta}) \delta \beta ] e^{\beta} $$

This final inequality can be used to estimate $\tau_{\delta/2,\beta}$. Consider the inequality

$$ [\delta \tau_{\delta,\beta} + (s - \tau_{\delta,\beta}) \delta\beta] e^{\beta} \leq \frac{\delta}{2} s \tag{A}$$

Solving this inequality (and assuming $\beta$ is small enough that $\beta e^{\beta} < \frac12$), we see that it is satisfied whenever

$$ \frac{e^{\beta} \tau_{\delta,\beta}}{\frac12 - \beta e^{\beta}} \leq s $$

For convenience write

$$ \frac{e^\beta}{\frac12 - \beta e^\beta} = 2^{1+\sigma(\beta)} $$

and note that $\sigma$ is continuous and $\sigma(0) = 0$.
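Numerically, one can check both $\sigma(0) = 0$ and that the threshold $2^{1+\sigma(\beta)}\tau$ indeed satisfies inequality (A). The explicit formula for $\sigma$ below is just the definition solved for $\sigma$, valid while $\beta e^\beta < \frac12$; the sample values of $\delta, \beta, \tau$ are arbitrary:

```python
import math

def sigma(beta):
    # Defined by e^beta / (1/2 - beta * e^beta) = 2^(1 + sigma(beta)),
    # valid while beta * e^beta < 1/2.
    return math.log2(math.exp(beta) / (0.5 - beta * math.exp(beta))) - 1.0

print(sigma(0.0))   # sigma(0) = 0, and sigma is continuous near 0

# Check that s = 2^(1+sigma(beta)) * tau satisfies inequality (A):
#   [delta*tau + (s - tau)*delta*beta] * e^beta <= (delta/2) * s
delta, beta, tau = 0.3, 0.05, 10.0   # arbitrary sample values
s = 2.0 ** (1.0 + sigma(beta)) * tau
lhs = (delta * tau + (s - tau) * delta * beta) * math.exp(beta)
rhs = 0.5 * delta * s
print(lhs, rhs)
```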

This implies that $\tau_{\delta/2,\beta} \leq 2^{1 + \sigma(\beta)} \tau_{\delta,\beta}$.

Now, iterating this argument, starting from some sufficiently small fixed $\delta$, we can construct an increasing sequence of times $\tau_{2^{-k} \delta,\beta}$ such that for $T \geq \tau_{2^{-k} \delta,\beta}$ it holds that

$$ x(T) \leq 2^{-k} \delta T $$

We also have by iteration the bounds

$$ \tau_{2^{-k}\delta,\beta} \leq 2^{k + k\sigma(\beta)} \tau_{\delta,\beta} $$

So we have

$$ x(T) \lesssim_{\delta,\beta} T^{\sigma(\beta)} $$
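Spelling out the last step: given $T \geq \tau_{\delta,\beta}$, choose $k = \lfloor \log_2(T/\tau_{\delta,\beta}) / (1+\sigma(\beta)) \rfloor$, so that $\tau_{2^{-k}\delta,\beta} \leq 2^{k(1+\sigma(\beta))} \tau_{\delta,\beta} \leq T$ and hence

$$ x(T) \leq 2^{-k} \delta T \leq 2\delta \left(\frac{T}{\tau_{\delta,\beta}}\right)^{-\frac{1}{1+\sigma(\beta)}} T = 2\delta\, \tau_{\delta,\beta}^{\frac{1}{1+\sigma(\beta)}}\, T^{\frac{\sigma(\beta)}{1+\sigma(\beta)}} \lesssim_{\delta,\beta} T^{\sigma(\beta)}, $$

since $\frac{\sigma}{1+\sigma} \leq \sigma$ and $T \geq 1$.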


One can upgrade the estimate to get boundedness if one knows the decay rate of $\int_t^\infty k(s) s^{-1} ds$.

For example, if one knows that this integral is bounded by $C/(\ln t)^2$, then Gronwall will imply an estimate along the lines of

$$ \left[\tau_{\delta} + (s - \tau_\delta) \frac{C}{(\ln \tau_{\delta})^2}\right] e^{C/(\ln \tau_\delta)^2} \leq \frac12 s $$

replacing (A) (the common factor $\delta$ has been canceled).

Now, by our estimates in the previous section we would've found that

$$ \tau_{2^{-k}\delta} \leq 2^{k(1 + \sigma)} \tau_{\delta} $$

for some $\sigma$. Bootstrapping from this we would get that

$$ \tau_{2^{-k} \delta} \leq 2^{1 + O(k^{-2})} \tau_{2^{-(k-1)} \delta} $$

The summability of the series $\sum k^{-2}$ then gives boundedness of $x(t)$.

[This argument can carry through as long as we know, for example, that $k(s) \lesssim (\ln (1+s))^{-2-\gamma}$ for some $\gamma > 0$; but seemingly fails for $k(s) = (\ln(1+s))^{-2}$.]

  • Sharpening the arguments above a bit more, a sufficient condition for guaranteeing boundedness can be written as $$ \int_1^\infty \frac{1}{t} \int_t^\infty \frac{k(s)}{s} ds~dt < \infty $$ – Willie Wong Feb 19 at 18:21

Without requiring that the integral \begin{equation} \int_1^\infty x(s)\frac{k(s)}{s^2}\,ds \tag{1} \end{equation} be finite, the answer is no. Indeed, one can then take e.g. $k(s)=1/s$ and $x(s)=e^s$.
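The point is that for this pair the right-hand side of the differential inequality equals $+\infty$ for every $t$, so the inequality holds vacuously. Numerically, the hypothesis integral $\int_1^\infty k(s)/s\,ds$ converges (to $1$) while the partial right-hand-side integrals blow up; a crude trapezoidal check:

```python
import math

def trapezoid(f, a, b, n=100_000):
    # crude trapezoidal rule; plenty for a sanity check
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + k * h) for k in range(1, n)) + 0.5 * f(b))

# Hypothesis integral int_1^infty k(s)/s ds with k(s) = 1/s: converges to 1.
hyp = trapezoid(lambda s: 1.0 / s ** 2, 1.0, 10_000.0)

# Partial integrals int_1^T x(s) k(s) / s^2 ds with x(s) = e^s: diverge.
vals = [trapezoid(lambda s: math.exp(s) / s ** 3, 1.0, T)
        for T in (10.0, 20.0, 30.0)]

print(hyp)    # close to 1: the hypothesis integral is finite
print(vals)   # grows explosively: the right-hand side diverges
```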

I don't know whether the condition that the integral in (1) be finite changes the answer.

