The large sieve and the Bombieri-Vinogradov theorem

1. Motivation

The large sieve is more a philosophy than a single result: a large family of inequalities that is very effective at controlling linear sums or mean-square sums of correlations of arithmetic functions. The underlying ideas originate in harmonic analysis and rely on little more than almost orthogonality.

One fundamental example is the estimation of the quantity,

\left|\sum_{n\leq x}\Lambda(n)\overline{\chi(n)}\right|

where {\Lambda} is the von Mangoldt function and {\chi} is a Dirichlet character.

A naive idea for controlling this quantity is to use the Cauchy-Schwarz inequality. But a direct application gives something even worse than the trivial estimate. Indeed, by the triangle inequality and the Chebyshev bound we obtain the trivial bound \left|\sum_{n\leq x}\Lambda(n)\overline{\chi(n)}\right|\leq \sum_{n\leq x}\Lambda(n)\ll x, whereas a direct application of Cauchy-Schwarz only yields the following:

\left|\sum_{n\leq x}\Lambda(n)\overline{\chi(n)}\right|\leq \left(\sum_{n\leq x}|\Lambda(n)|^2\right)^{\frac{1}{2}}\left(\sum_{n\leq x}|\chi(n)|^2\right)^{\frac{1}{2}}\ll x\log^{\frac{1}{2}}x
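The gap between the two bounds is easy to see numerically. The sketch below is our own illustration, not from the text: it takes \chi to be the non-principal character mod 4 (any fixed character would do), computes \Lambda directly, and compares the character sum with the trivial bound and the Cauchy-Schwarz bound; the helper names `mangoldt` and `chi4` are ours.

```python
from math import log, sqrt

def mangoldt(n):
    """von Mangoldt function: log p if n = p^k for a prime p, else 0."""
    for p in range(2, n + 1):
        if n % p == 0:
            while n % p == 0:  # strip the smallest prime factor
                n //= p
            return log(p) if n == 1 else 0.0
    return 0.0

def chi4(n):
    """Non-principal Dirichlet character mod 4."""
    return 0 if n % 2 == 0 else (1 if n % 4 == 1 else -1)

x = 5000
lam = [mangoldt(n) for n in range(x + 1)]

S = abs(sum(lam[n] * chi4(n) for n in range(1, x + 1)))
trivial = sum(lam[n] for n in range(1, x + 1))  # psi(x), roughly x
cauchy = sqrt(sum(lam[n] ** 2 for n in range(1, x + 1))) * \
         sqrt(sum(chi4(n) ** 2 for n in range(1, x + 1)))  # roughly x sqrt(log x)

# a direct Cauchy-Schwarz is worse than the trivial triangle-inequality bound
assert S <= trivial <= cauchy
```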

This does not mean that Cauchy-Schwarz is useless for this quantity; let us look carefully at the inequality and try to understand why the bound got worse. Every successful application of Cauchy-Schwarz exhibits two phenomena: first, it lowers the complexity of the quantity we wish to bound; second, it loses almost nothing, because it is applied close to its equality case. So we should reformulate the quantity so that the complexity drops while staying compatible with the equality condition of Cauchy-Schwarz. For example, for a character \chi modulo a prime p we have the following identity,

\left(\sum_{n\leq x}\Lambda(n)\overline{\chi(n)}\right)^2=\sum_{n,m\leq x}\Lambda(n)\Lambda(m)\overline{\chi(n)\chi(m)}=\sum_{k_1,k_2\in \mathbb F_p^{\times}}\overline{\chi(k_1)\chi(k_2)}\sum_{\substack{n,m\leq x\\ n\equiv k_1,\ m\equiv k_2 \pmod p}}\Lambda(n)\Lambda(m)

So we may understand this quantity through the distribution of primes in the arithmetic progressions \{pn+b\mid b\in\{1,2,\dots,p-1\}\}. But this is still difficult to estimate, essentially because we need to control the correlation of \Lambda with itself along the residue classes \mathbb F_p^{\times}\simeq \{pn+b\mid b\in\{1,2,\dots,p-1\}\}.
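The decomposition by residue classes can be sanity-checked numerically. In the sketch below (our own illustration, not from the text) we take p = 5 and \chi the Legendre symbol mod 5, which is real, so the conjugation bars can be dropped:

```python
from math import log

def mangoldt(n):
    """von Mangoldt function: log p if n = p^k for a prime p, else 0."""
    for p in range(2, n + 1):
        if n % p == 0:
            while n % p == 0:
                n //= p
            return log(p) if n == 1 else 0.0
    return 0.0

def chi5(n):
    """Legendre symbol mod 5: the quadratic residues mod 5 are {1, 4}."""
    r = n % 5
    return 0 if r == 0 else (1 if r in (1, 4) else -1)

x = 2000
lam = [mangoldt(n) for n in range(x + 1)]

lhs = sum(lam[n] * chi5(n) for n in range(1, x + 1))

# psi(x; 5, k): Chebyshev's psi restricted to the progression n = k (mod 5)
psi = {k: sum(lam[n] for n in range(1, x + 1) if n % 5 == k) for k in range(1, 5)}

# grouping by residue class: sum_n Lambda(n) chi(n) = sum_k chi(k) psi(x; 5, k)
assert abs(lhs - sum(chi5(k) * psi[k] for k in range(1, 5))) < 1e-9

# squaring turns it into a bilinear form over pairs of residue classes
square = sum(chi5(k1) * chi5(k2) * psi[k1] * psi[k2]
             for k1 in range(1, 5) for k2 in range(1, 5))
assert abs(lhs ** 2 - square) < 1e-6
```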

Now we change our perspective and recall a variant of the Cauchy-Schwarz inequality, called the Bessel inequality:

Bessel inequality

Let {g_1,\dots,g_J: {\bf N} \rightarrow {\bf C}} be finitely supported functions obeying the orthonormality relationship,

\displaystyle \sum_n g_j(n) \overline{g_{j'}(n)} = 1_{j=j'}

for all {1 \leq j,j' \leq J}. Then for any function {f: {\bf N} \rightarrow {\bf C}}, we have,

\displaystyle (\sum_{j=1}^J |\sum_{n} f(n) \overline{g_j(n)}|^2)^{1/2} \leq (\sum_n |f(n)|^2)^{1/2}.

Pf: The proof is not very difficult if we keep the orthogonal picture in mind: the functions \{g_{j}\}_{1\leq j\leq J} form an orthonormal system in \ell^2(\mathbb N), so extending it to an orthonormal basis and applying Parseval's identity (then dropping the coordinates outside the system) gives the inequality.
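A quick numerical illustration of the statement (our own sketch, assuming nothing beyond it): take the g_j to be rows of the normalized discrete Fourier matrix, an orthonormal system on \{0,\dots,N-1\}, and check both the orthonormality relation and the inequality against a random f.

```python
import cmath
import random

N, J = 16, 5  # ambient dimension and number of orthonormal vectors

# g_j(n) = exp(2 pi i j n / N) / sqrt(N): rows of the normalized DFT matrix
def g(j, n):
    return cmath.exp(2j * cmath.pi * j * n / N) / N ** 0.5

# orthonormality relation: sum_n g_j(n) conj(g_j'(n)) = 1_{j = j'}
for j in range(J):
    for jp in range(J):
        ip = sum(g(j, n) * g(jp, n).conjugate() for n in range(N))
        assert abs(ip - (1 if j == jp else 0)) < 1e-9

random.seed(0)
f = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]

# Bessel: projecting onto J <= N orthonormal vectors cannot increase the norm
lhs = sum(abs(sum(f[n] * g(j, n).conjugate() for n in range(N))) ** 2
          for j in range(J)) ** 0.5
rhs = sum(abs(v) ** 2 for v in f) ** 0.5
assert lhs <= rhs + 1e-9
```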

With this inequality in mind, and by the standard argument for passing from the orthogonal to the almost orthogonal setting explained in the previous note, we may expect the following almost orthogonal variant of the Bessel inequality to hold:

Generalised Bessel inequality

Let {g_1,\dots,g_J: {\bf N} \rightarrow {\bf C}} be finitely supported functions, and let {\nu: {\bf N} \rightarrow {\bf R}^+} be a non-negative function. Let {f: {\bf N} \rightarrow {\bf C}} be such that {f} vanishes whenever {\nu} vanishes. Then we have

\displaystyle (\sum_{j=1}^J |\sum_{n} f(n) \overline{g_j(n)}|^2)^{1/2} \leq (\sum_n |f(n)|^2 / \nu(n))^{1/2} \times ( \sum_{j=1}^J \sum_{j'=1}^J c_j \overline{c_{j'}} \sum_n \nu(n) g_j(n) \overline{g_{j'}(n)} )^{1/2}

for some sequence {c_1,\dots,c_J} of complex numbers with {\sum_{j=1}^J |c_j|^2 = 1}, with the convention that {|f(n)|^2/\nu(n)} vanishes whenever {f(n), \nu(n)} both vanish.
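The following sketch checks this statement numerically, with the witnessing coefficients c_j = a_j/\|a\| taken from the duality step of the usual proof, where a_j = \sum_n f(n)\overline{g_j(n)} (our own illustration; here \nu is bounded away from zero, so the vanishing condition on f is vacuous):

```python
import random

random.seed(1)
N, J = 20, 6

# generic finitely supported data: g_j, a positive weight nu, and f
g = [[complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
     for _ in range(J)]
nu = [random.uniform(0.5, 2.0) for _ in range(N)]
f = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]

# a_j = sum_n f(n) conj(g_j(n)); the left-hand side is the l^2 norm of (a_j)
a = [sum(f[n] * g[j][n].conjugate() for n in range(N)) for j in range(J)]
lhs = sum(abs(aj) ** 2 for aj in a) ** 0.5

# witnessing coefficients from the duality step: c_j = a_j / ||a||
c = [aj / lhs for aj in a]
assert abs(sum(abs(cj) ** 2 for cj in c) - 1) < 1e-9

# the bilinear form equals sum_n nu(n) |sum_j c_j g_j(n)|^2 >= 0
gram = sum(c[j] * c[jp].conjugate() *
           sum(nu[n] * g[j][n] * g[jp][n].conjugate() for n in range(N))
           for j in range(J) for jp in range(J))
rhs = (sum(abs(f[n]) ** 2 / nu[n] for n in range(N)) ** 0.5) * abs(gram) ** 0.5
assert lhs <= rhs + 1e-9
```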

 
