Geometric intuition for the mean value property of nonlinear elliptic equations

I wish to gain some understanding of the MVP of nonlinear elliptic equations through geometric intuition.


Linear elliptic equation case

First of all, I have a very good geometric explanation of the MVP of the Laplace equation, i.e.

MVP of the Laplace equation

If \Delta u=0 in \Omega, then for every ball B(x_0,r)\subset \Omega we have the following identity:

\frac{1}{\mu(\partial B)}\int_{\partial B}u(x)\,dS(x)=u(x_0)


I need to point out first that this property is not difficult to prove by the standard integration-by-parts method, but the following method has more geometric intuition, and in some sense explains why the property holds.
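The identity can also be checked numerically for a concrete harmonic function. The sketch below (my own illustration, not part of the argument) averages the harmonic function u(x,y)=x^2-y^2 over a circle:

```python
import math

# Test function: u(x, y) = x^2 - y^2 is harmonic in the plane.
def u(x, y):
    return x * x - y * y

def sphere_average(x0, y0, r, n=10000):
    """Average u over the circle of radius r centered at (x0, y0)."""
    total = 0.0
    for k in range(n):
        t = 2 * math.pi * k / n
        total += u(x0 + r * math.cos(t), y0 + r * math.sin(t))
    return total / n

# The average over any circle equals the value at the center.
print(sphere_average(1.0, 2.0, 0.5), u(1.0, 2.0))  # both ≈ -3.0
```

Any center and radius with the closed ball inside the domain gives the same agreement, which is exactly the identity above.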

The proof is not very difficult to express by formulas, but I wish to divide it into two parts, and explain each part with a picture and a verbal interpretation.

Part 1 of the proof:

We consider a 1-parameter family of foliations, and consider the integral identity associated with this foliation.

\int_{v\in S^{n-1}}\int_{\gamma_v} \frac{\partial (\partial_{v}u)}{\partial t}\,dt\,dv=\int_{\partial B(x_0,r)}\partial_{n}u\,dS-\int_{S^{n-1}}\partial_{v}u(x_0)\,dv=\int_{\partial B(x_0,r)}\partial_{n}u\,dS

(the second term vanishes since \partial_{-v}u(x_0)=-\partial_{v}u(x_0)).

Part 2 of the proof:

and we have:

\int_{v\in S^{n-1}}\int_{\gamma_v} \frac{\partial (\partial_{v}u)}{\partial t}\,dt\,dv=0

by the pointwise equation \Delta u=0; one key point is that \partial_{-v}\partial_{-v}u=\partial_{v}\partial_{v}u for all v\in S^{n-1}.
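A compact way to organize the two parts (a standard reformulation, not the original foliation argument) is to differentiate the spherical average in r, apply the divergence theorem (this is the content of Part 1), and then use the equation (Part 2):

```latex
\frac{d}{dr}\left(\frac{1}{|\partial B(x_0,r)|}\int_{\partial B(x_0,r)} u \, dS\right)
  = \frac{1}{|\partial B(x_0,r)|}\int_{\partial B(x_0,r)} \partial_{n} u \, dS
  = \frac{1}{|\partial B(x_0,r)|}\int_{B(x_0,r)} \Delta u \, dx
  = 0,
```

so the spherical average is constant in r; letting r \to 0 it converges to u(x_0), which is the MVP.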


This approach could easily be transferred to the general elliptic equation case, but it seems a little difficult to adapt to the Poisson equation, the non-homogeneous case.


Nonlinear elliptic equation case


ABP estimate for general nonlinear uniformly elliptic equations


The ABP estimate is the most basic estimate for fully nonlinear elliptic equations.
The ABP maximum principle states (roughly) that, if

a^{ij} \partial_i \partial_j u \geq f \ \text{in} \ \Omega \subset \mathbb{R}^n \quad (a^{ij} \geq C\,\mathrm{Id} > 0),

then (assuming sufficient regularity of the coefficients),

\sup_{\Omega} u \leq \sup_{\partial \Omega} u + C \Big(\int_{\Omega} |f|^n \Big)^{1/n} \quad (*)

I will give an intuitive explanation of the proof of (*).
Usually, in order to prove maximum principles, the key idea is to use the fact that at a local maximum the second derivative is negative semi-definite, then choose a good basis and obtain an identity for the first-order derivatives and an inequality for the second-order ones. This process is used, for instance, in the proof of the Hopf lemma and in some interior gradient estimates, where one considers auxiliary functions like e^{Au} or similar.

But the proof of ABP needs more geometric intuition and more tricks.

First we do a rescaling: if a^{ij} \partial_i \partial_j u \geq 1 in B_1 \subset \mathbb{R}^n (a^{ij} \geq C\,\mathrm{Id} > 0) and u|_{\partial B_1}\geq 0, then

|\inf_{B_1}u| \leq C |A|^{1/n} \quad (**)

And then we explain what the contact set is. It is the subset \Gamma^{+} of \Omega on which u agrees with its convex envelope, i.e. \Gamma^{+}=\{x \mid u(x)=(\text{convex envelope of } u)(x)\}. The geometric meaning is that at such a point u has at least one lower supporting plane. So \Gamma^{+} is just the set where u is very low. Or, from another point of view, if you consider -u as a range of mountains, then \Gamma^{+} is the set of places near the mountaintops from which one can see everything (locally).
Then we look at each point of \Gamma^{+}: the determinant of the Hessian matrix \det(u_{ij}) at such a point is controlled, thanks to the PDE a^{ij} \partial_i \partial_j u \geq 1 and the uniform ellipticity.
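In one dimension the contact set can be computed explicitly, since the convex envelope is the lower convex hull of the graph. A small sketch (my own illustration; the double well u(x)=(x^2-1)^2 is a hypothetical example) recovers \Gamma^{+}=\{|x|\geq 1\}:

```python
import bisect

def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def lower_hull(pts):
    """Lower convex hull of points sorted by x: the graph of the convex envelope."""
    hull = []
    for p in pts:
        # pop the last vertex while it lies above the chord to the new point
        while len(hull) >= 2 and cross(hull[-2], hull[-1], p) <= 0:
            hull.pop()
        hull.append(p)
    return hull

def envelope(hull, x):
    """Evaluate the convex envelope by linear interpolation between hull vertices."""
    xs = [p[0] for p in hull]
    i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(hull) - 2)
    (x1, y1), (x2, y2) = hull[i], hull[i + 1]
    return y1 + (y2 - y1) * (x - x1) / (x2 - x1)

u = lambda x: (x * x - 1) ** 2           # double well, minima at x = -1 and x = 1
xs = [-1.5 + 3 * k / 600 for k in range(601)]
hull = lower_hull([(x, u(x)) for x in xs])
contact = [x for x in xs if u(x) - envelope(hull, x) < 1e-9]
print(min(abs(x) for x in contact))       # → 1.0: the contact set is {|x| >= 1}
```

On [-1,1] the envelope is the flat segment joining the two wells (u lies strictly above it), while on |x| >= 1 the function is convex and coincides with its envelope.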

The determinant of the Hessian matrix can be viewed as the determinant of the Jacobian matrix of the gradient map \Phi: x \mapsto (u_1,\dots,u_n), and by the area formula we have:

\int_{\Phi(\Omega)} f(\Phi^{-1}(y))\,dy = \int_{\Omega} f(x)\,|J_{\Phi}(x)|\,dx,

It is easy to see that, for some constant c, B_{c\sup_{\Omega}|u|}(0)\subset \Phi(\Gamma^{+}) (based on the PDE at every point; the geometric intuition is just that the graph of u cannot be a very narrow cone). So, taking f=\chi_{\Gamma^{+}}, we have

|B_{c\sup_{\Omega}|u|}(0)| \leq \int_{\Gamma^{+}}|J_{\Phi}(x)|\,dx

so, since |B_{c\sup_{\Omega}|u|}(0)| \sim (\sup_{\Omega}|u|)^n, we have:

\sup_{\Omega}|u| \lesssim \Big(\int_{\Gamma^{+}}|J_{\Phi}(x)|\,dx\Big)^{\frac{1}{n}} \quad (***)

and the classical matrix inequality: for positive definite matrices A and B we have

\det(AB) \leq \Big(\frac{\mathrm{tr}(AB)}{n}\Big)^n \quad (****).

Combining (***) and (****), we have:

\sup_{\Omega}|u| \lesssim \Big\|\frac{a^{ij}u_{ij}}{D^*}\Big\|_{L^n(\Gamma^+)},

where D^* = (\det(a^{ij}))^{1/n}.
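Inequality (****) is just the AM-GM inequality applied to the eigenvalues of AB, which are nonnegative when A and B are positive definite (AB is similar to A^{1/2}BA^{1/2}). A quick numerical sanity check, with an arbitrary hypothetical pair of 2x2 SPD matrices:

```python
# det(AB) <= (tr(AB)/n)^n for A, B symmetric positive definite (n = 2 here).
# The matrices below are arbitrary SPD examples.
A = [[2.0, 0.5], [0.5, 1.0]]
B = [[1.5, -0.3], [-0.3, 0.8]]
M = [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
det_M = M[0][0] * M[1][1] - M[0][1] * M[1][0]
tr_M = M[0][0] + M[1][1]
print(det_M, (tr_M / 2) ** 2)  # 1.9425 <= 3.0625
```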



General approach to obtain the MVP for elliptic equations coming from geometry


MVP for K-hessian equation, with geometric explanation


MVP for K-curvature equation, with geometric explanation 


MVP for p-Laplace equation, with geometric explanation





k-Hessian equation and k-curvature equation

Here is the problem: how to understand the k-Hessian equation and the k-curvature equation.

k-Hessian equation

The k-Hessian equation is:

H_k(u)=\sigma_k(D^2(u))=f (*)

where u is admissible, i.e. \sigma_i(D^2(u))\geq 0 for all 1\leq i\leq k; this is just the condition that makes (*) an elliptic equation.
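Concretely, \sigma_k(D^2u) is the sum of the k\times k principal minors of the Hessian. The sketch below (my own illustration) computes it that way, using the hypothetical example u=|x|^2/2, whose Hessian is the identity, so that \sigma_k = \binom{n}{k}:

```python
from itertools import combinations

def det(M):
    """Determinant by Laplace expansion along the first row (fine for tiny matrices)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def sigma_k(H, k):
    """sigma_k of the eigenvalues of H = sum of all k x k principal minors of H."""
    return sum(det([[H[i][j] for j in S] for i in S])
               for S in combinations(range(len(H)), k))

# For u = |x|^2 / 2 we have D^2 u = Id, so sigma_k(D^2 u) = C(n, k); here n = 3.
H = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print([sigma_k(H, k) for k in (1, 2, 3)])  # [3.0, 3.0, 1.0]
```

Admissibility of a candidate Hessian can then be checked by verifying sigma_k(H, i) >= 0 for i = 1, ..., k.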

The most important results are the following three:

1. Solvability of (*) with Dirichlet boundary condition.

This is mainly the contribution of Caffarelli and his collaborators. Using auxiliary functions and the maximum principle, one can establish the C^{1,\alpha} and C^{2,\alpha} estimates in the interior; the C^{2,\alpha} estimate near the boundary is established using the invariance properties of the equation and a perturbation of the solution of the k-Hessian equation after a special rescaling.

2. Hessian measures.

This is mainly the work of X.J. Wang and Trudinger. They proved:

In the sense of viscosity solutions, if \sigma_k(D^2(u))=f, then we can associate a measure \mu with u, and the following hold:

when u\in C^2(\Omega), \mu(B_r(x))=\int_{B_r(x)}\sigma_k(D^2(u));

if u_1,\dots,u_n,\dots converge to u, then \mu_1,\dots,\mu_n,\dots converge to \mu in the weak sense.

This depends merely on a priori estimates for u.

3. Pointwise estimates via the Wolff potential.

The Wolff potential is:

W^{\mu}_{k}(x,r)= \int_{0}^r\Big(\frac{\mu(B_t(x))}{t^{n-2k}}\Big)^{\frac{1}{k}}\frac{dt}{t}
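The normalization t^{n-2k} can be tested numerically. Assuming \mu is Lebesgue measure (density f\equiv 1, a hypothetical choice), \mu(B_t)=\omega_n t^n, the integrand becomes c\,t, so W^{\mu}_k(x,r)\sim r^2 -- the scaling of the quadratic function solving \sigma_k(D^2u)=\text{const}:

```python
import math

def wolff(mu_ball, r, n, k, steps=20000):
    """Riemann sum for W^mu_k(x, r) = int_0^r (mu(B_t)/t^(n-2k))^(1/k) dt/t."""
    total = 0.0
    dt = r / steps
    for i in range(1, steps + 1):
        t = i * dt
        total += (mu_ball(t) / t ** (n - 2 * k)) ** (1.0 / k) / t * dt
    return total

# mu = Lebesgue measure: mu(B_t) = omega_n * t^n; the integrand reduces to c * t,
# so W scales like r^2, matching u = |x|^2/2 with sigma_k(D^2 u) constant.
n, k = 5, 2
omega_n = math.pi ** (n / 2) / math.gamma(n / 2 + 1)
mu = lambda t: omega_n * t ** n
ratio = wolff(mu, 2.0, n, k) / wolff(mu, 1.0, n, k)
print(ratio)  # ≈ 4.0: doubling r quadruples W, i.e. W ~ r^2
```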

One can easily use rescaling to understand why this potential is the natural one, and using it Labutin established the following pointwise estimate:

If u\in \Phi_k(B_{4R}(x)) and u\leq 0, then we have:

W^{\mu}_k(x,\tfrac{R}{2}) \lesssim |u(x)| \lesssim W^{\mu}_k(x,2R) + \sup_{B_{2R}}|u|

The RHS can be viewed as a corollary of the classical ABP estimate; the LHS requires combining several observations, the mean value property among others.

This result can be used to establish removability results for singular points.

k-curvature equation

1. Solvability of (*) with Dirichlet boundary condition.

This was also established by Caffarelli.

2. Curvature measures.

This was established very recently: the mean curvature case in 2014, by a Perron lift and a modification; the general case in 2016, by more involved calculations and methods.

3. Pointwise estimates via the Wolff potential.

This has not been established yet, and it is the main thing I focus on. Since the k-curvature equation can be viewed as a "projection" of the k-Hessian equation, the Calderón–Zygmund decomposition and the estimates for the k-Hessian equation may be useful.


My ideas

Look at it as an "average" over "loop space" / "surface space".

1. Grassmannian bundle

In algebraic geometry, the Grassmann d-plane bundle of a vector bundle E on an algebraic scheme X is a scheme over X:

p: G_{d}(E) \to X

such that the fiber p^{-1}(x)=G_{d}(E_{x}) is the Grassmannian of the d-dimensional vector subspaces of E_x. For example, G_{1}(E)=\mathbb{P}(E) is the projective bundle of E. In the other direction, a Grassmann bundle is a special case of a (partial) flag bundle. Concretely, the Grassmann bundle can be constructed as a Quot scheme.

Like the usual Grassmannian, the Grassmann bundle comes with natural vector bundles on it; namely, there are the universal or tautological subbundle S and the universal quotient bundle Q, which fit into

0 \to S \to p^{*}E \to Q \to 0.

Specifically, if V is in the fiber p^{-1}(x), then the fiber of S over V is V itself; thus, S has rank r = d and \wedge^{r}S is its determinant line bundle. Now, by the universal property of a projective bundle, the injection

\wedge^{r}S \to p^{*}(\wedge^{r}E)

corresponds to a morphism over X:

G_{d}(E) \to \mathbb{P}(\wedge^{r}E),

which is nothing but a family of Plücker embeddings.

The relative tangent bundle T_{G_{d}(E)/X} of G_{d}(E) is given by

T_{G_{d}(E)/X}=\operatorname{Hom}(S,Q)=S^{\vee}\otimes Q,

which is morally given by the second fundamental form. In particular, when d = 1, the earlier exact sequence tensored with the dual of S = \mathcal{O}(-1) gives:

0 \to \mathcal{O}_{\mathbb{P}(E)} \to p^{*}E\otimes \mathcal{O}_{\mathbb{P}(E)}(1) \to T_{\mathbb{P}(E)/X} \to 0,

which is the relative version of the Euler sequence.

2. Explanation via the fully nonlinear elliptic equation

Now, we can consider each principal minor \det(u_{ij})_{i,j\in \{i_1,\dots,i_k\}}, with \{i_1,\dots,i_k\}\subset\{1,\dots,n\}, as the determinant of the transformation (u_{i_1},\dots,u_{i_k}) \longrightarrow (e_{i_1},\dots,e_{i_k}); the sum of these minors over all such index sets is \sigma_k(D^2(u)).

Now we want to understand \sigma_k(D^2(u))=f at a point x_0 as saying that the average over the Grassmannian G_k(x_0) of the determinants of the transformation matrices of (u_{i_1},\dots,u_{i_k}) \longrightarrow (e_{i_1},\dots,e_{i_k}) equals f(x_0), i.e.:

\int_{G_k(x_0)} det(\frac{\partial u_{i_a}}{\partial e_{i_b}})     d\mu=f(x_0)

where \mu is the natural Haar measure on G_k(x_0) \simeq G_k.

But the difficulty in making the argument rigorous is that u_i is a scalar while e_i is a vector.
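For the Hessian at a single point, i.e. a fixed symmetric matrix, the averaging statement is in fact exact up to normalization: by the Cauchy–Binet formula, the average over G_k of the determinant of the restriction of D^2u(x_0) to a k-plane equals \sigma_k(D^2u(x_0))/\binom{n}{k}. A Monte Carlo sketch (my own check; the diagonal Hessian \mathrm{diag}(1,2,3) with n=3, k=2 is a hypothetical example):

```python
import math
import random

random.seed(0)

def random_2plane(n):
    """Orthonormal basis of a Haar-random 2-plane in R^n (Gram-Schmidt on Gaussians)."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    na = math.sqrt(sum(x * x for x in a))
    a = [x / na for x in a]
    d = sum(x * y for x, y in zip(a, b))
    b = [y - d * x for x, y in zip(a, b)]
    nb = math.sqrt(sum(y * y for y in b))
    b = [y / nb for y in b]
    return a, b

# Hypothetical Hessian D^2 u(x_0) = diag(1, 2, 3): sigma_2 = 1*2 + 1*3 + 2*3 = 11.
lam, n, k = [1.0, 2.0, 3.0], 3, 2
acc, trials = 0.0, 60000
for _ in range(trials):
    a, b = random_2plane(n)
    # determinant of the restriction of diag(lam) to span(a, b)
    qaa = sum(l * x * x for l, x in zip(lam, a))
    qbb = sum(l * y * y for l, y in zip(lam, b))
    qab = sum(l * x * y for l, x, y in zip(lam, a, b))
    acc += qaa * qbb - qab * qab
print(acc / trials, 11 / 3)  # the average tends to sigma_2 / C(3,2) = 11/3
```

This suggests the right normalization for the proposed identity is \int_{G_k(x_0)} \det(\cdot)\,d\mu = f(x_0)/\binom{n}{k} when \mu is the probability Haar measure.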


Regularity of the k-curvature equation

This is a note after reading the article "" of Caffarelli.

In his article, a large class of fully nonlinear elliptic equations is treated, including the k-curvature equation. Using the continuity method, we just need to establish a gradient estimate, a C^2 estimate in the interior, and a C^2 estimate near the boundary. We establish these estimates step by step: the first two are based on constructing special auxiliary functions and using the maximum principle; for the C^2 estimate near the boundary we need to investigate the influence of perturbations of the boundary carefully.