Quite a surprising integral

So I was spending some time recently going over some Lebesgue integration theory (of course, in preparation for my qualifiers) when I came across a pretty cool problem.

Question

Prove that for any $p > -1$ we have
$$\int_{0}^1 \frac{x^p}{1-x}\ln\left( \frac{1}{x}\right) \ \mathrm{d}x = \sum_{k=1}^\infty \frac{1}{(p+k)^2}$$

Of course, as always, you are more than welcome to have a go at it, before coming back here to read on.
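(If you'd like a quick numerical sanity check before or after trying it: the snippet below compares a midpoint-rule approximation of the integral against a truncated version of the series. The choice of quadrature, sample value of $p$, and tail estimate are all my own; none of this is part of the proof.)

```python
import math

def lhs(p, n=200_000):
    """Midpoint-rule approximation of ∫_0^1 x^p/(1-x) · ln(1/x) dx.
    The midpoint rule conveniently avoids both endpoints, where the
    integrand (though integrable) misbehaves numerically."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x**p / (1.0 - x) * math.log(1.0 / x)
    return total * h

def rhs(p, terms=1_000_000):
    """Partial sum of Σ_{k≥1} 1/(p+k)^2, plus an integral-estimate tail."""
    s = sum(1.0 / (p + k) ** 2 for k in range(1, terms + 1))
    return s + 1.0 / (p + terms + 0.5)  # tail ≈ ∫_{terms}^∞ dk/(p+k)^2

p = 0.5  # illustrative choice; any p > -1 should behave similarly
print(lhs(p), rhs(p))
```

The two printed numbers should agree to several decimal places.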

Solution

One of the first things to notice in this problem is that the left-hand side is an integral, while the right-hand side is a summation; and this is fascinating by itself. So if we are to have any hope of proving this identity, we need to somehow introduce a sum on the left, or an integral on the right.

After fighting with this problem a bit, we recognize in the integrand one of our long-lost friends from Calculus.

$$ \sum_{n=0}^\infty x^n = \frac{1}{1-x} $$

Of course, provided that $x \in (0,1)$, which it is in this case! So there it is, a sum inside the integral just like we wanted. So we are tasked with proving

$$\int_0^1 \left[ \sum_{k=0}^\infty -\ln(x)x^{k+p} \right]\ \mathrm{d}x = \sum_{k=1}^\infty \frac{1}{(p+k)^2}$$

It is at this point we must proclaim to ourselves: “Damn, it would be very nice if we could interchange the integral and the sum…”. And this seems like a cause worth fighting for. So we are left trying to determine when it is legal to interchange the integral and the sum.

Theoretical Detour

Clearly, we can do this if the $f_k$ are non-negative, because of the monotonicity of the integral (i.e., the Monotone Convergence Theorem). Here is a quick sketch of that proof.

Proof: The Monotone Convergence Theorem tells us that if $f_k \uparrow f$ for non-negative Lebesgue integrable $f_k$, then $\int_E f_k \uparrow \int_E f$. So we can set $s_k(x) = \sum_{n=1}^k f_n(x)$ and $s(x) = \sum_{k=1}^\infty f_k(x)$. Since the $f_n$ are non-negative, $s_k \uparrow s$, and thus $$\int_{E} \sum_{k=1}^\infty f_k(x) = \int_E s = \lim_n \int_E s_n = \lim_n \int_E \sum_{k=1}^n f_k(x) = \lim_n \sum_{k=1}^n \int_E f_k(x) = \sum_{k=1}^\infty \int_E f_k(x)$$
And we win $\qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad\qquad\qquad \square$
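To see this non-negative interchange in action, here is a small numerical illustration. The example series $f_k(x) = (x/2)^k$ on $E = [0,1]$ is my own choice, not from the argument above; each $f_k \geq 0$, $s_k \uparrow s = 2/(2-x)$, and both sides come out to $2\ln 2$.

```python
import math

# Illustrative (hypothetical) example: f_k(x) = (x/2)^k on E = [0, 1], k >= 0.
# The partial sums increase monotonically to s(x) = 2/(2 - x), so the
# interchange of sum and integral is justified by monotone convergence.

K = 60  # (1/2)^60 is far below double precision, so 60 terms is effectively exact

# Sum of integrals: ∫_0^1 (x/2)^k dx = 1/((k+1) 2^k), computed in closed form
sum_of_integrals = sum(0.5 ** k / (k + 1) for k in range(K))

# Integral of the sum: ∫_0^1 2/(2-x) dx = 2 ln 2
integral_of_sum = 2 * math.log(2)

print(sum_of_integrals, integral_of_sum)
```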

Now all we need to do is generalize this to functions of arbitrary sign. As we hurriedly peruse our class notes from two semesters ago, we uncover a gem that shouldn’t have been forgotten in the first place: the Lebesgue Dominated Convergence Theorem. This gives us hope.

Theorem (Lebesgue Dominated Convergence Theorem): Suppose $E \subseteq X$ is a measurable subset of some measure space $X$, that $f_k \in L(E)$ [1], and that $f_k \to f$ almost everywhere in $E$. If there exists some $g \in L(E)$ such that $|f_k| \leq g$ for all $k$, then $\int_E f_k \to \int_E f$.

Now the nice thing is we can try and do the same sort of trick here. Given some sequence of functions $f_k$, we can let $$s_k = \sum_{n=1}^k f_n(x) \quad \text{ and } \quad s(x) = \sum_{k=1}^\infty f_k(x)$$
and we see $s_k \to s$, but the convergence is no longer monotone, because some of the $f_k(x)$ could take negative values. So is there anything else we can do? Well, sorta?

If we let
$$s_k = \sum_{n=1}^k |f_n(x)| \quad \text{ and } \quad s(x) = \sum_{k=1}^\infty |f_k(x)|$$
then we know that $s_k \uparrow s$ and the previous result holds. But wait a second, if $s$ is Lebesgue integrable, we could potentially use it as the $g$ function in DCT! To make this a bit more clear, from the previous result we have
$$\int_E s = \sum_{k=1}^\infty \int_E |f_k| $$
So if we have an additional hypothesis that
$$\sum_{k=1}^\infty \int_E |f_k| < \infty$$
then $s$ becomes Lebesgue integrable; in particular, $s$ is finite almost everywhere, so the series $\sum_{k=1}^\infty f_k(x)$ converges absolutely almost everywhere. So, given $h_n = \sum_{k=1}^n f_k$, we know that $|h_n| \leq s$ for all $n$ [2]. We also know that $h_n \to f$ almost everywhere, where $f = \sum_{k=1}^\infty f_k$. So by DCT, we have $$\lim_n \int_E h_n = \int_E f \implies \sum_{k=1}^\infty \int_E f_k = \int_E \sum_{k=1}^\infty f_k$$
and we win! $\qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad \qquad\qquad\qquad \square$

The above discussion can be encapsulated into a nice corollary to DCT as follows:

Corollary: Let $f_k \in L(E)$. If $\sum_{k=1}^\infty \int_E |f_k| < \infty$ then $$\int_E \sum_{k=1}^\infty f_k = \sum_{k=1}^\infty \int_E f_k$$
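As a numerical illustration of the corollary, consider the hypothetical example $f_k(x) = (-x)^k$ on $E = [0, 1/2]$ (my own choice, not from the problem): the $f_k$ alternate in sign, so monotone convergence alone does not apply, but $\sum_k \int_E |f_k| = \sum_k \frac{(1/2)^{k+1}}{k+1} = \ln 2 < \infty$, so the corollary does, and both sides come out to $\ln(3/2)$.

```python
import math

# Illustrative example of the corollary: f_k(x) = (-x)^k on E = [0, 1/2], k >= 0.
# The f_k alternate in sign, but Σ_k ∫ |f_k| = ln 2 < ∞, so the corollary
# lets us swap the sum and the integral.

K = 60  # (1/2)^60 is far below double precision, so 60 terms is effectively exact

# Sum of integrals: ∫_0^{1/2} (-x)^k dx = (-1)^k (1/2)^{k+1} / (k+1), in closed form
sum_of_integrals = sum((-1) ** k * 0.5 ** (k + 1) / (k + 1) for k in range(K))

# Integral of the sum: ∫_0^{1/2} dx/(1+x) = ln(3/2)
integral_of_sum = math.log(1.5)

print(sum_of_integrals, integral_of_sum)
```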

Back to the problem

Clearly we are going to let $f_k = -\ln(x)x^{k+p}$ for $k \geq 0$. The only thing we need to check in our problem is the additional hypothesis that $\sum_{k} \int_E |f_k| < \infty$ (the integrability of each $f_k$ is easily checked!). To this end, consider

$$\sum_{k=0}^\infty \int_0^1 -\ln(x)x^{k+p}\ \mathrm{d}x $$

which we want to show is finite [3].

Integrating by parts, we have that

$$\sum_{k=0}^\infty \bigg( \frac{1}{k+p+1} \bigg[-\ln(x)x^{k+p+1}\bigg]_0^1 +\frac{1}{k+p+1}\int_0^1x^{k+p}\ \mathrm{d}x \bigg) $$

Notice that $$\lim_{x\rightarrow 0} \ln(x)x^{k+p+1} = \lim_{x\rightarrow 0} \frac{\ln(x)}{x^{-(k+p+1)}} \sim \lim_{x\rightarrow 0} \frac{1/x}{-(k+p+1)x^{-(k+p+2)}} = \lim_{x\rightarrow 0} -\frac{x^{k+p+1}}{k+p+1} = 0$$
where $\sim$ denotes an application of L'Hôpital's rule, and the last limit is $0$ because $k+p+1 > 0$ whenever $p > -1$.

And so,

$$\frac{1}{k+p+1} \bigg[-\ln(x)x^{k+p+1}\bigg]_0^1 = 0$$

Therefore, we have that the required sum is simply given by

$$\sum_{k=0}^\infty \frac{1}{k+p+1}\int_0^1x^{k+p}\ \mathrm{d}x = \sum_{k=0}^{\infty} \frac{1}{(k+p+1)^2}$$
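As a sanity check on the integration by parts, we can compare a midpoint-rule approximation of each term integral $\int_0^1 -\ln(x)\,x^{k+p}\,\mathrm{d}x$ against the closed form $1/(k+p+1)^2$. The sample value of $p$ and the quadrature scheme are my own choices for illustration.

```python
import math

def term_integral(k, p, n=200_000):
    """Midpoint-rule approximation of ∫_0^1 -ln(x) x^(k+p) dx.
    The midpoint rule avoids x = 0, where ln(x) blows up."""
    h = 1.0 / n
    return sum(-math.log((i + 0.5) * h) * ((i + 0.5) * h) ** (k + p)
               for i in range(n)) * h

p = 0.5  # illustrative choice; any p > -1 works
for k in range(3):
    print(k, term_integral(k, p), 1.0 / (k + p + 1) ** 2)
```

Each row should show the quadrature value and $1/(k+p+1)^2$ agreeing to several decimal places.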

Under the change of index $(k+1 \rightarrow k)$, we have that the sum evaluates to

$$\sum_{k=1}^\infty \frac{1}{(k+p)^2}$$

and notice that for all $p>-1$, the above series is convergent [4].

Therefore we have shown that $\sum_k \int_E |f_k| < \infty$. So by the corollary discussed above, we have that

$$\int_0^1 f(x) \ \mathrm{d}x = \int_0^1 \sum_{k=0}^\infty -\ln(x)x^{k+p} \ \mathrm{d}x = \sum_{k=0}^\infty \int_0^1 -\ln(x)x^{k+p} \ \mathrm{d}x = \sum_{k=1}^\infty \frac{1}{(k+p)^2}$$

And with this we are done!

A very nice consequence of this is that, taking $p = 0$ and recalling that $\sum_{k=1}^\infty \frac{1}{k^2} = \frac{\pi^2}{6}$, we obtain the integral identity
$$\int_0^1 \frac{\ln(x)}{x-1}\ \mathrm{d}x = \frac{\pi^2}{6}$$
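And here is a quick numerical check of that last identity (the midpoint-rule quadrature is my own choice; it conveniently avoids both endpoints):

```python
import math

# Midpoint-rule approximation of ∫_0^1 ln(x)/(x-1) dx, to compare with π²/6.
n = 200_000
h = 1.0 / n
approx = sum(math.log((i + 0.5) * h) / ((i + 0.5) * h - 1.0)
             for i in range(n)) * h

print(approx, math.pi ** 2 / 6)
```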


  1. $L(E)$ is the set of Lebesgue integrable functions from $E$ to $\overline{\mathbb{R}}$.
  2. Triangle inequality (and of course also because $s$ is an infinite sum).
  3. Notice that in $(0,1)$, $\ln(x) < 0$, so $|\ln(x)| = -\ln(x)$.
  4. The $p$-test with some direct comparison test shenanigans suffices to guarantee this.