
Jensen's inequality and Hölder's defect

  • Writer: ottoipulkkinen
  • Jun 8, 2016
  • 2 min read

Back in the day, I was introduced to measure-theoretic probability by Esko Valkeila, a notable Finnish researcher of fractional stochastic processes. I got fascinated by the subject not only because of his highly inspiring lectures but also because of the course book 'Probability with Martingales' by David Williams. It is still an excellent introductory text.

Among the wealth of theorems presented in Williams' book, I was immediately impressed by one of the simplest, yet one of the most versatile of them, namely Jensen's inequality. It simply states that, for a convex up function f on an open interval and a finite-mean random variable X,
E f(X) ≥ f(EX).
In other words, the expected value of a convex up function is always bounded from below by its value at the mean of the random variable. Remember that a function is convex up (or simply convex, as opposed to concave) if the whole curve lies above its tangent lines at every point (there can be two separate lines at a point, corresponding to the left and right derivatives). In particular, this holds at the point EX, the expected value of X. Hence, with probability one,
f(X) ≥ f(EX) + c (X − EX)
for some constant c. Jensen's inequality follows simply by taking expectations (that is, by integrating with respect to the probability measure of X), since the term c (X − EX) has zero mean.
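As a quick numerical sanity check (my own sketch, not from the post), Jensen's inequality can be verified by Monte Carlo; the choice of f = exp and a standard normal X below is mine:

```python
import random
import math

random.seed(1)

# Convex function f and a sample of X (f = exp and a standard
# normal X are illustrative choices, not from the post).
f = math.exp
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]

mean_x = sum(xs) / len(xs)
e_fx = sum(f(x) for x in xs) / len(xs)  # sample estimate of E f(X)
f_ex = f(mean_x)                        # f at the sample mean, f(EX)

# Jensen's inequality: E f(X) >= f(EX)
print(e_fx >= f_ex)
```

With these choices E f(X) ≈ e^(1/2) while f(EX) ≈ 1, so the gap is easy to see numerically.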

A far less known result is that, for random variables with finite variance, there is an exact relation
E f(X) = f(EX) + (c/2) Var X,
with the correction term known as Hölder's defect. This is a truly remarkable result: the correction depends only on the first two moments of X, no matter how complicated its full distribution is. Otto Hölder derived it in his famous 1889 paper, the same paper in which he presented the inequality that now bears his name. It can be proven quite easily using the integral form of Taylor's theorem (as presented e.g. in Apostol):
f(X) = f(EX) + f'(EX)(X − EX) + ∫_{EX}^{X} (X − t) f''(t) dt.
Assuming a twice differentiable f, convexity simply means that f''(x) is non-negative, so Jensen's inequality clearly holds. Furthermore, by taking expectations and assuming a finite upper bound M for f''(x) on its domain of definition, we get
0 ≤ E f(X) − f(EX) ≤ (M/2) E[(X − EX)²] = (M/2) Var X,
so that, by the intermediate value theorem, there must be a non-negative constant c such that the exact relation above holds.
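A small Monte Carlo sketch of this bound (the choices f = exp and X uniform on (0, 1) are mine, which gives the bound f''(x) = exp(x) ≤ M = e on the interval):

```python
import random
import math

random.seed(2)

# Illustrative choices: X uniform on (0, 1), f = exp,
# so f''(x) = exp(x) is bounded by M = e on the interval.
f = math.exp
M = math.e
xs = [random.random() for _ in range(200_000)]

n = len(xs)
mean_x = sum(xs) / n
var_x = sum((x - mean_x) ** 2 for x in xs) / n
defect = sum(f(x) for x in xs) / n - f(mean_x)  # E f(X) - f(EX)

# Hölder's defect bound: 0 <= E f(X) - f(EX) <= (M/2) Var X
print(0.0 <= defect <= 0.5 * M * var_x)
```

Here the defect comes out well inside the bound, consistent with the exact relation holding for some non-negative c below M.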

Together with Ralf Metzler, I was able to use these results in a recent study on rates of chemical reactions and transcriptional gene regulation in live cell populations.

The point is that the rates of these reactions are determined by the Michaelis-Menten equation (from 1913), which is a concave function of the cellular concentration of reactant molecules - the rate of reaction can't increase indefinitely because the number of enzymes or other binding sites is limited. Due to the stochastic synthesis of molecules such as transcription factors in cells, the reactant concentration is a random variable with a unique value in each cell. The concave rate equation therefore needs to be averaged over the population, which leads to a variance correction to the rate, and to Hölder's defect. We call the new theory the Variance-Corrected Michaelis-Menten Equation, and it is yet to be tested in the laboratory.
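The population-averaging idea can be sketched numerically. The parameter values and the lognormal cell-to-cell spread of concentrations below are illustrative assumptions of mine, not the ones from the paper:

```python
import random

random.seed(3)

# Michaelis-Menten rate v(s) = Vmax * s / (K + s) is concave in the
# substrate concentration s. Vmax and K are illustrative values.
Vmax, K = 1.0, 0.5

def v(s):
    return Vmax * s / (K + s)

# Cell-to-cell variability in reactant concentration
# (a lognormal spread, chosen for illustration).
ss = [random.lognormvariate(0.0, 0.5) for _ in range(200_000)]

n = len(ss)
mean_s = sum(ss) / n
var_s = sum((s - mean_s) ** 2 for s in ss) / n

pop_rate = sum(v(s) for s in ss) / n  # population-averaged rate
naive = v(mean_s)                     # rate at the mean concentration

# Concavity (Jensen): the averaged rate lies below the naive rate,
# and the gap is a variance correction of order |v''(ES)| Var S / 2.
print(pop_rate <= naive)
```

The gap between the naive rate and the population average is exactly the kind of variance correction that Hölder's defect quantifies, here with the sign flipped because the rate function is concave rather than convex.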
