Derivative Blog

Mathematica et cetera

A proof of Euler’s Reflection Formula

I just found the following proof of Euler’s reflection formula. The thing I like about it is that all it requires is dumbly marching forward: there is no trickery.

When s and 1-s are both positive, that is, when 0<s<1,

    \[\Gamma(s)\Gamma(1-s)=\int_0^\infty t^{s-1}e^{-t}dt\int_0^\infty t^{-s}e^{-t}dt.\]

I’m going to join those integrals, giving a different name to each copy of t,

    \[=\int_0^\infty\int_0^\infty x^{s-1}e^{-x}y^{-s}e^{-y}dxdy\]

    \[=\int_0^\infty\int_0^\infty \frac 1x \left(\frac xy\right)^s e^{-(x+y)}dxdy.\]

Now change variables to u=\frac xy and v=x+y. The inverse transformation is x=\frac{uv}{u+1} and y=\frac v{u+1}, and the Jacobian is \frac v{(u+1)^2}. Then

    \[=\int_0^\infty\int_0^\infty\frac{u+1}{uv} u^s e^{-v}\frac v{(u+1)^2}dudv\]

    \[=\int_0^\infty\int_0^\infty\frac{u^{s-1}}{u+1}e^{-v}dudv=\int_0^\infty\frac{u^{s-1}}{u+1}du.\]

To evaluate this integral, split it into two parts

    \[\int_0^\infty\frac{u^{s-1}}{u+1}du=\int_0^1\frac{u^{s-1}}{u+1}du+\int_1^\infty\frac{u^{s-1}}{u+1}du\]

    \[=\int_0^1\frac{u^{s-1}}{1+u}du+\int_1^\infty\frac{u^{s-2}}{1+\frac 1u}du.\]

Now expand the denominators as geometric series (the integrals were arranged so that the expansion variable is less than 1)

    \[=\int_0^1u^{s-1}\sum_{h=0}^\infty (-1)^hu^hdu+\int_1^\infty u^{s-2}\sum_{h=0}^\infty (-1)^hu^{-h}du\]

and now change the order of the sums and integrals

    \[=\sum_{h=0}^\infty\int_0^1(-1)^hu^{s+h-1}du+\sum_{h=0}^\infty\int_1^\infty (-1)^hu^{s-h-2}du\]

    \[=\sum_{h=0}^\infty\left(\frac {(-1)^h}{s+h}-\frac {(-1)^h}{s-h-1}\right)\]

    \[=\frac 1s+\sum_{h=1}^\infty \frac {(-1)^h\, 2s}{s^2-h^2}.\]

Now define the function

    \[f(x)=x\prod_{n=1}^\infty \frac{4n^2-x^2}{4n^2}\frac{(2n-1)^2}{(2n-1)^2-x^2}\]

and observe that

    \[f(x)=\left(x\prod_{n=1}^\infty \left(1-\frac{x^2}{4n^2}\right)\right)\left(\prod_{n=0}^\infty \left(1-\frac{x^2}{(2n+1)^2}\right)\right)^{-1}\]

and use Euler’s products for the sine and cosine

    \[\sin(\pi x)=\pi x\prod_{n=1}^\infty \left(1-\frac{x^2}{n^2}\right)\quad \cos\left(\frac\pi 2 x\right)=\prod_{n=0}^\infty \left(1-\frac{x^2}{(2n+1)^2}\right)\]

to notice

    \[f(x)=\frac 2\pi \sin\left(\frac\pi 2 x\right)\frac 1{\cos\left(\frac\pi 2 x\right)}=\frac 2\pi\tan\left(\frac\pi 2 x\right)\]

now take the logarithmic derivative

    \[(\log f(x))'=\frac{\pi}{\sin(\pi x)}=\frac 1x+\sum_{n=1}^\infty \left(\frac{2x}{(2n-1)^2-x^2}-\frac{2x}{4n^2-x^2}\right)\]

The right-hand side is what we previously found with the variable s instead of x, after splitting that sum into even and odd h. Therefore

    \[\Gamma(s)\Gamma(1-s)=\frac{\pi}{\sin(\pi s)}.\]

We have the equality when 0<s<1, and since both sides are holomorphic away from the integers and coincide on an open set, they must be equal everywhere.

The Shoe-Banging Incident and Historiography

A montage, obviously.

In 1960, the Cold War was building to a crescendo. The communists had won the civil war in China, reasserted themselves in Korea, and weren’t showing any signs that they would withdraw from Eastern Europe. Another war was underway in Indochina, and the United States was starting to get involved. Year after year the opposing sides would develop bigger nuclear weapons and deadlier missiles to deliver them. On this day 61 years ago, before hundreds of reporters and delegates from every UN member state, Khrushchev had his finest hour. Irate at Sumulong, the Philippine delegate who had accused the Soviet Union of “swallowing up” the rights of the nations of Eastern Europe, he walked up to the podium and berated him while banging his shoe on the table. Or did he?

The United Nations official transcript for the assembly contains Sumulong’s speech, but not Khrushchev’s, and no mention of any shoe. An article on the UN’s site claims he was not at the rostrum when the incident is supposed to have occurred, and that it was the Romanian representative who called the point of order, not Khrushchev. Another point of contention is the origin of the shoe. One version has it that Khrushchev hit his hand on the table so hard he broke his watch, and after that took off his shoe to bang with instead. The Wikipedia article says his granddaughter claimed he was wearing tight shoes and had taken them off, so when he stood up shoeless, banged his hand on the table, and his watch fell off, he saw the shoe and picked it up. But the source it cites is incomplete in the archive, and the original is down. Note that in the first version the watch broke and in the second it fell off. Another version claims he somehow arranged a third shoe just to bang it. The New York Times article also has a version by John Loengard, who says he never banged it at all.

Come and take it

It’s an irresistible story, isn’t it? Xerxes, king of kings, demands Leonidas give up his weapons; Leonidas replies “Come and take them!”. So irresistible that twenty-three centuries later separatist Texans remembered it when asked to return a cannon. Another example: when the Ottoman Sultan demanded:

Sultan Mehmed IV to the Zaporozhian Cossacks: As the Sultan; son of Muhammad; brother of the sun and moon; grandson and viceroy of God; ruler of the kingdoms of Macedonia, Babylon, Jerusalem, Upper and Lower Egypt; emperor of emperors; sovereign of sovereigns; extraordinary knight, never defeated; steadfast guardian of the tomb of Jesus Christ; trustee chosen by God Himself; the hope and comfort of Muslims; confounder and great defender of Christians – I command you, the Zaporogian Cossacks, to submit to me voluntarily and without any resistance, and to desist from troubling me with your attacks.

— Turkish Sultan Mehmed IV

The Cossacks replied:

Zaporozhian Cossacks to the Turkish Sultan!

O sultan, Turkish devil and damned devil’s kith and kin, secretary to Lucifer himself. What the devil kind of knight are thou, that canst not slay a hedgehog with your naked arse? The devil shits, and your army eats. Thou shalt not, thou son of a whore, make subjects of Christian sons. We have no fear of your army; by land and by sea we will battle with thee. Fuck thy mother.

Thou Babylonian scullion, Macedonian wheelwright, brewer of Jerusalem, goat-fucker of Alexandria, swineherd of Greater and Lesser Egypt, pig of Armenia, Podolian thief, catamite of Tartary, hangman of Kamyanets, and fool of all the world and underworld, an idiot before God, grandson of the Serpent, and the crick in our dick. Pig’s snout, mare’s arse, slaughterhouse cur, unchristened brow. Screw thine own mother!

So the Zaporozhians declare, you lowlife. You won’t even be herding pigs for the Christians. Now we’ll conclude, for we don’t know the date and don’t own a calendar; the moon’s in the sky, the year with the Lord. The day’s the same over here as it is over there; for this kiss our arse!

— Koshovyi otaman Ivan Sirko, with the whole Zaporozhian Host

This one is also a cultural phenomenon in Ukraine and, to a lesser extent, Russia. Repin’s painting of the scene was bought by Tsar Alexander III for the highest price paid for a Russian painting up to that time. For example, here are Ukrainian people’s deputies (members of parliament) reenacting it, but as a reply to Putin:

 

The takeaway

This might seem like a trivial point to historians, but we shouldn’t take these stories too seriously. They’re very popular with the general public, probably because we use them to imagine the events unfolding like a movie. People make lists of this type of thing, and the stories are usually stated as fact too. Contrast that with, for example, the seven kings of Rome, where people usually make sure to say the story is legend and not to be taken literally. Here we can’t determine with certainty something that happened within living memory, in a period when cameras existed, and that was in the New York Times the following day; there’s still debate about it. It seems hopeless, then, to try to determine what Caesar said when crossing the Rubicon, or what Louis XIV claimed his metaphysical relation to the state was. I think the best policy is to always emphasize that these things are apocryphal when talking about them.

A346587

OEIS link. This is the sequence of k such that

    \[\frac{k}{\lambda(k)}\]

is incrementally largest, where \lambda is the Carmichael function.

I wrote a Math StackExchange post, which I’ll reproduce below:

If we fix a list of primes and write n=2^{a_0} p_1^{a_1}\dots p_k^{a_k}, where only a_0 can be 0, we can find what the exponents should be. There are four cases: a_0=0, a_0=1, a_0=2, and a_0>2. I’ll do the last one; the others are analogous.

    \[\lambda(n)=\text{lcm}(2^{a_0-2},(p_1-1)p_1^{a_1-1},\dots,(p_k-1)p_k^{a_k-1})\]

and now we choose each a_i to be at most one more (two more for a_0) than the largest exponent of p_i occurring in any of the other parentheses, because choosing a_i greater than that would multiply both n and \lambda(n) by the same factor, which cancels out in the ratio.

So this means that for any list of primes, the sequence contains at most finitely many terms divisible by exactly those primes, and we can find what the exponents are. For example, if we want the primes that divide n to be 2, 3, 5, 7, the possibilities for n are 105, 210, 315, 420, 630, 840, 1260, 2520, and of those only 1260 appears in the sequence. One thought I had is that we would want n to be divisible by some prime p such that p-1 has a prime raised to a large power in its factorization, but, for example, no n divisible by exactly 2 and 17 appears in the sequence. Larger Fermat primes don’t work either. Some of the terms in the sequence are primorials, as for the totient function, but some primorials don’t appear; for example, 30 and 210 don’t.


The question of describing the numbers which appear in the sequence feels like it shouldn’t be hard, but I haven’t been able to prove anything about it beyond the result above. It also seems like the sequence encodes something about the numbers one less than the primes, but it’s unclear what that is. Alas, a seemingly interesting problem I cannot solve.
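To experiment with small terms, here is a brute-force sketch of my own (not the program behind the OEIS entry): a direct Carmichael \lambda via trial division, plus a scan for the record values of n/\lambda(n) that define the sequence.

```cpp
#include <cstdint>
#include <numeric>  // std::lcm
#include <vector>

// Carmichael's lambda by trial division; only meant for small n.
uint64_t carmichael(uint64_t n) {
    uint64_t result = 1;
    // Power of two: lambda(2)=1, lambda(4)=2, lambda(2^k)=2^(k-2) for k>=3.
    int twos = 0;
    while (n % 2 == 0) { n /= 2; ++twos; }
    if (twos == 2) result = 2;
    else if (twos >= 3) result = 1ull << (twos - 2);
    // Odd prime powers: lambda(p^k) = (p-1) * p^(k-1).
    for (uint64_t p = 3; p * p <= n; p += 2) {
        if (n % p != 0) continue;
        uint64_t pk = 1;
        while (n % p == 0) { n /= p; pk *= p; }
        result = std::lcm(result, (p - 1) * (pk / p));
    }
    if (n > 1) result = std::lcm(result, n - 1);  // leftover prime
    return result;
}

// The k for which k/lambda(k) reaches a new record, up to the limit.
std::vector<uint64_t> records(uint64_t limit) {
    std::vector<uint64_t> out;
    double best = 0;
    for (uint64_t k = 1; k <= limit; ++k) {
        double ratio = double(k) / double(carmichael(k));
        if (ratio > best) { best = ratio; out.push_back(k); }
    }
    return out;
}
```

The \lambda values can be checked against the example above: \lambda(1260)=12, so 1260/\lambda(1260)=105.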

Counting the number of chess games

I recently came across a GitHub repo by John Tromp with an exceedingly simple premise: generate a bunch of random chess positions, verify whether they’re legal, and use statistics to approximate how many legal positions there are. Clearly that idea has the precise simplicity and novelty to make one feel like an idiot for not having come up with it oneself. He did that with a sample of 1000 positions and found a 95% confidence interval of (4\pm 1.1)\cdot 10^{44} legal positions. Then he decided to crowdsource a sample of 10000 positions, which should pin down one more digit of the estimate. I proved that 8 of them are legal by finding games that result in the random position. It was… odd. I expected a randomly generated chess position to be weird, but I wasn’t prepared for just how weird they were going to be. I think that’s sort of like how old movies imagined aliens to be little green men: the weirdness we can come up with is within the bounds of what we know. Here’s an example:

Anarchy! How many queens are there? And why does black have five knights? Anyway, the position is legal, as I found a game that produces it. As of writing there are still 132 positions missing proofs that they’re either legal or illegal. So I encourage everyone reading to go there and try to do them. The procedure I came up with to produce the proof games is:

  1. Count the number of black and white pieces. Subtract those from 16 to get the number of pieces you need to capture from each side.
  2. Find out whether either side is missing a piece they started with, for example if they don’t have a light-squared bishop. If so, you have to capture that piece; otherwise capturing pawns is usually more convenient.
  3. If either player has more than one light-squared or dark-squared bishop, make sure you can produce the extras by promotion.
  4. Get the pawns out of each other’s way for promotion. This is the trickiest part in my opinion. In the example above I had to preserve the white b- and g-pawns and the black c- and g-pawns. I also had to make room for 5 black and 4 white promotions, and get rid of the black dark-squared bishop. I had finished this step by move 11, the last capture of the game, when the board looked like this:

  5. Get the kings into their final positions, and if possible get the pieces around them too, so you don’t produce too many checks when placing the others.
  6. Do all the promotions necessary.
  7. Get all the pieces into position.

Edit: the 10000-position sample is complete. It yielded a 95% confidence interval of (4.45\pm 0.37)\cdot 10^{44} legal positions.
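The statistics behind these confidence intervals can be sketched as follows. Here N is the size of the sampled superset of encodable positions, n the sample size, and k the number found legal; the concrete numbers one plugs in are placeholders here, not Tromp’s actual constants.

```cpp
#include <cmath>
#include <utility>

// Monte Carlo estimate of the number of legal positions: sample n
// positions uniformly from a superset of size N, find k legal ones,
// and report {point estimate, 95% half-width} via the normal
// approximation to the binomial.
std::pair<double, double> estimate(double N, int n, int k) {
    double p = double(k) / n;
    double margin = 1.96 * std::sqrt(p * (1.0 - p) / n);
    return {N * p, N * margin};
}
```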

Consistent Legendre Symbols in LaTeX

Quadratic reciprocity tells us that

    \[\left(\frac 3p\right)\left(\frac p3\right)=(-1)^{(p-1)/2}\]

which is just terrible: the parentheses are different sizes, and in the Legendre symbol on the left the 3 and p are closer together while on the right they are further apart. Normally this doesn’t come up, because you rarely have two nearly identical fractions side by side and enclosed by parentheses, but when talking about quadratic residues it’s quite common. One solution is to change the \left and \right commands to \bigg:

    \[\bigg(\frac 3p\bigg)\bigg(\frac p3\bigg)=(-1)^{(p-1)/2}.\]

It looks good enough, but if you’re going to the trouble, you might as well define a command to do it for you. I copied one from the TeX StackExchange and modified it with the fix above:

\makeatletter
\def\legendre@dash#1#2{\hb@xt@#1{%
\kern-#2\p@
\cleaders\hbox{\kern.5\p@
\vrule\@height.2\p@\@depth.2\p@\@width\p@
\kern.5\p@}\hfil
\kern-#2\p@
}}
\def\@legendre#1#2#3#4#5{\mathopen{}\bigg(
\sbox\z@{$\genfrac{}{}{0pt}{#1}{#3#4}{#3#5}$}%
\dimen@=\wd\z@
\kern-\p@\vcenter{\box0}\kern-\dimen@\vcenter{\legendre@dash\dimen@{#2}}\kern-\p@
\bigg)\mathclose{}}
\newcommand\legendre[2]{\mathchoice
{\@legendre{0}{1}{}{#1}{#2}}
{\@legendre{1}{.5}{\vphantom{1}}{#1}{#2}}
{\@legendre{2}{0}{\vphantom{1}}{#1}{#2}}
{\@legendre{3}{0}{\vphantom{1}}{#1}{#2}}
}
\def\dlegendre{\@legendre{0}{1}{}}
\def\tlegendre{\@legendre{1}{0.5}{\vphantom{1}}}
\makeatother

It looks like this

\legendre{p}{3}\legendre{3}{p}=(-1)^{(p-1)/2}

    \[\legendre{p}{3}\legendre{3}{p}=(-1)^{(p-1)/2}.\]

Which is nice, since it isn’t identical to a fraction. It doesn’t, however, fix the issue with the spacing. I think that’s because the bounding box of the p in this font has some whitespace on top, while the 3 has some on the bottom. But you can fix that by changing the font.

Book Recommendations

  • Seeing Like A State by James C. Scott.

I just had to reproduce the cover below. It’s a picture of a road that is simultaneously supposed to be straight and run across a line of constant latitude. It’s very representative of the theme of the book: states trying to achieve certain narrow goals and getting unexpected results.

  • Nabokov’s Favorite Word Is Mauve by Ben Blatt.

Blatt quantifies several famous authors’ quirks. George R.R. Martin likes to describe things as “black as pitch”; women authors use the words pillows, lace, curls, dress, china, skirt, and curtains more often than men; and the word most indicative of a British as opposed to an American writer is wanked.

  • Homage to Catalonia by George Orwell.

Orwell’s memoir of his time fighting in the Spanish Civil War. Funnier than you would expect.

The Gregorian Calendar is Wrong

Caesar!

Picture this: it is January 48 BC. Caesar crossed the Rubicon nearly a year ago. He then marched on Rome, went to Spain to destroy the forces loyal to the Senate there, and came back. Now he is in Italy, while Pompey and most of the Senate are in Greece assembling an army to meet him. There is a real possibility that Pompey will use the resources of the richer Roman East to outnumber and defeat Caesar. Furthermore, the Senate’s navy, led by Bibulus, controls the Mediterranean and is blockading the Strait of Otranto.

However, Caesar knows something Bibulus doesn’t. The Roman calendar is a real mess. It is a lunar calendar with common years of 355 days and leap years of 377 or 378 days. The Pontifex Maximus, that is, Caesar, was responsible for manually adding in leap years to keep the calendar in sync. But Caesar had been busy conquering Gaul for the last ten years, so the calendar had drifted so far that it said January when it was actually early autumn. It was too dangerous to sail in winter, so Bibulus assumed Caesar wouldn’t dare a crossing. Caesar, of course, did dare the crossing. He then proceeded to destroy Pompey’s army at Pharsalus. Pompey fled to Egypt, where king Ptolemy XIII had him beheaded. I think the lesson here is not to mess with the guy who owns the calendar.

After winning the civil war, Caesar proceeded to fix the calendar so nobody could pull this trick ever again. The calendar he instituted, known as the Julian calendar, had the familiar 365-day common years and 366-day leap years every 4 years, for an average year length of 365.25 days. It was used into the XXth century in places like Russia and the Ottoman Empire. But in the XVIth century, Pope Gregory XIII noticed the calendar was 10 days out of sync, which was messing up the calculation of the date of Easter. The Pope didn’t use this knowledge to win any battles, but he did fix the calendar to the precision of a day for the next 7700 years, which seems good enough. The Gregorian calendar has an average year length of 365.2425 days. In the XIXth century John Herschel proposed modifying the Gregorian calendar by removing a leap year every 4000 years, which would bring it yet closer to the true tropical year. I will propose a better solution.

The true length of a tropical year is 365.242189 days. And the continued fraction of that is

    \[365.242189=365+\frac 1{4+\frac 1{7+\frac 1{1+\frac 1{3+\frac 1\ddots}}}}=[365;4,7,1,3,40,2,3,5].\]

This means that we can truncate the continued fraction and get a good approximation of the length of the year. Truncating just before a large coefficient yields an especially good approximation compared to the size of the denominator, so I propose a calendar with year length [365;4,7,1,3]=365+\frac{31}{128}=365+\frac{1}{4}-\frac 1{128}, which corresponds to common years of 365 days, intercalated with leap years of 366 days every 4 years, except that every 128th year is a common year instead. This is simpler than the rules for the Gregorian calendar, and yet it achieves a smaller error:

    \begin{equation*} \begin{split} 365.242189-365-\frac{1}{4}+\frac{1}{128} & = \phantom{-}0.0000015 \\ 365.242189-365-\frac{1}{4}+\frac{1}{100}-\frac 1{400} & = -0.000311 \end{split} \end{equation*}

So we’d be on the correct date for the next six hundred thousand years, which is close to the maximum length of time over which we can predict that sort of thing anyway.
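The computation above can be sketched as: expand the year length into continued-fraction terms, then fold a truncation back into a single fraction.

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// First n continued-fraction terms of x (naive floating-point version,
// fine for the first few terms).
std::vector<int64_t> cf_terms(double x, int n) {
    std::vector<int64_t> terms;
    for (int i = 0; i < n; ++i) {
        int64_t a = static_cast<int64_t>(x);
        terms.push_back(a);
        x -= double(a);
        if (x < 1e-12) break;  // effectively an integer: stop
        x = 1.0 / x;
    }
    return terms;
}

// Fold a finite continued fraction [a0; a1, ..., ak] into p/q.
std::pair<int64_t, int64_t> cf_value(const std::vector<int64_t>& terms) {
    int64_t p = 1, q = 0;  // built from the right, value is p/q
    for (auto it = terms.rbegin(); it != terms.rend(); ++it) {
        std::swap(p, q);   // reciprocal
        p += *it * q;      // add the integer term
    }
    return {p, q};
}
```

Folding [365;4,7,1,3] gives 46751/128, i.e. 365+31/128.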

Links

This is a list of links to things I find interesting.

Math

A blog series on the classical theorems of multivariable calculus.

A blog post with a proof of the Abel-Ruffini theorem. Complete with javascript group actions.

A blog post by Terence Tao.

A blog post by Scott Aaronson.

Using computers to generate numerical conjectures. My favorite is

    \[-\frac 4\pi =-1+\frac 1{-3+\frac 1{-3+\frac 4{-5+\frac 9{-7+\frac {16}{\dots}}}}}\]

This is very difficult to explain. Anyway, it’s a great explanation of what mathematics is, what axioms are, etc.

A twofold performance increase on the greatest common divisor operation by using the binary variant of Euclid’s algorithm.

The probability of ending up with a single coin when you start with n and toss them and discard the tails.

The highlight of this post is the identity

    \[\sigma_7(n) = \sigma_3(n) + 120\sum_{m=1}^{n-1}\sigma_3(m)\sigma_3(n-m).\]

Science

A blog series defending a tentative theory that the obesity epidemic is caused by environmental contaminants.

A collection of courses which attempts to teach physics concepts with the minimum amount of overhead.

A derivation of the Boltzmann distribution (of the density of the atmosphere versus height) in a non-handwavy way.

  • Saving the earth from the Sun’s red giant phase

A game about a universe where you can run at light speed. It’s great for building intuition about relativistic phenomena.

Finance

A graph of the S&P 500 Shiller PE since 1872.

The source of that data, Shiller himself.

Aswath Damodaran investigated the motives for China’s recent crackdown on its tech companies.

Art

  • Earth and Stars – Thomas Pesquet

Picture taken aboard the International Space Station.


History

A podcast series on the history of Rome, from its founding to the fall of the Western Empire. In astounding detail.

Orwell ponders the consequences of the then-new atomic bomb technology. This also explains something I always found unclear in 1984: why is it that the world is divided into superstates? Wouldn’t an autocracy of many conventionally sized states be as plausible? In his words, “we have before us the prospect of two or three monstrous super-states, each possessed of a weapon by which millions of people can be wiped out in a few seconds, dividing the world between them”.

A project that makes exhaustive lists of war related things, notably a list of all the wars between 1816 and 2007.

Successful bioterrorism attack.

Tech

Measurements of the latency of several computers and phones.

Miscellaneous

A university decided to do the obvious and minmax their ranking.

A344202

OEIS link. This is the sequence of primes p such that

    \[\gcd(\text{ord}_p(2), \text{ord}_p(3)) = 1.\]

I first thought of this sequence after reading a mathoverflow post about the Diophantine equation

    \[(2^x-1)(3^y-1)=2z^2.\]

The sequence arises when you consider solving the equation for x,y coprime. In this case you might also expect 2^x-1 and 3^y-1 to be coprime, in which case you would have

    \[2^x-1=p^2\quad 3^y-1=2q^2\]

for integers p,q, because if a product of coprime integers equals twice a square, one of them must be a square and the other twice a square. But you would be wrong to think that. However, the primes that can make that conjecture fail are somewhat rare: 683, 599479, 108390409, 149817457, 666591179, 2000634731, 4562284561, 14764460089, 24040333283, 2506025630791, 5988931115977, and none more up to 10^{13}. OEIS generates these nice graphs:

So it looks like the sequence is more or less linearly distributed in the logarithmic graph, so actually exponentially distributed.

I, together with someone called sheerluck/Andrew, wrote a program to calculate these numbers. It’s somewhat interesting in its own right. It uses the Sieve of Eratosthenes to generate batches of primes and then spawns a thread to check all the primes in a batch. The version on GitHub is configured to use 4 threads, and it took about a month to check all the way to 10^{13}, but you can edit it to use as many as you want. Each thread checks the primes in its designated batch one by one, calculating \text{ord}_p(2) for each prime and then checking whether \text{ord}_p(3) shares a factor with it. This is more efficient than calculating both orders, because when checking the order of 3 we can nearly always return early. The function that calculates the order of 2 is as follows:

#include <cstdint>
#include <map>
#include <ranges>
#include <vector>

// Returns the distinct prime factors of ord_p(2), given `factors`,
// the factorization of p - 1 as a map from prime to exponent.
// modpow_two(e, p) computes 2^e mod p and is defined elsewhere in the repo.
std::vector<uint64_t> order_two(std::map<uint64_t, uint64_t> factors, uint64_t p) {
    namespace view = std::ranges::views;
    uint64_t group_order = p - 1;
    std::vector<uint64_t> order;
    for (const auto& [P, e] : factors)
    {
        // Divide P out of p - 1 one power at a time; the first time
        // 2^exponent is no longer 1 (mod p), P divides ord_p(2).
        uint64_t exponent = group_order;
        for (const auto f : view::iota(0ull, e + 1))
        {
            if (modpow_two(exponent, p) != 1)
            {
                order.push_back(P);
                break;
            }
            exponent /= P;
        }
    }
    return order;
}

It uses the fact that the subset of (\mathbb{Z}/p\mathbb{Z})^\times generated by 2 is a subgroup, so by Lagrange’s theorem its order divides p-1. So we don’t have to check whether every power of 2 is congruent to 1, only those whose exponents divide p-1.
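For contrast, a naive order computation, repeated multiplication until reaching 1 with no Lagrange shortcut, is enough to spot-check individual small terms; a sketch:

```cpp
#include <cstdint>
#include <numeric>  // std::gcd

// Multiplicative order of a mod p (p prime, not dividing a), naively.
// The intermediate products stay within 64 bits as long as a*p does.
uint64_t naive_order(uint64_t a, uint64_t p) {
    uint64_t x = a % p, k = 1;
    while (x != 1) {
        x = x * a % p;
        ++k;
    }
    return k;
}

// Membership test for the sequence: the two orders must be coprime.
bool in_sequence(uint64_t p) {
    return std::gcd(naive_order(2, p), naive_order(3, p)) == 1;
}
```

For p=683, the first term listed above, the orders come out to 22 and 31, which are coprime.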

Also, the reason I calculate the order of 2 and then check whether it has a common factor with the order of 3, and not the other way around, is that calculating the order of 2 is slightly faster using this trick. I also profiled several ways of multiplying 64-bit unsigned ints without overflowing: peasant multiplication, 128-bit ints (__uint128_t from gcc), some inline assembly, and a trick based on the fact that a * b, the C++ unsigned multiplication, equals the integer product ab reduced mod 2^{64}. The one using assembly turned out to be fastest.
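For reference, here is a sketch of the __uint128_t variant (a gcc extension, and not the assembly version that profiled fastest), together with the modular exponentiation built on top of it:

```cpp
#include <cstdint>

// Multiply a*b mod m without overflow by widening to 128 bits.
uint64_t mulmod(uint64_t a, uint64_t b, uint64_t m) {
    return static_cast<uint64_t>(static_cast<__uint128_t>(a) * b % m);
}

// Square-and-multiply exponentiation, e.g. for computing 2^e mod p
// when checking orders.
uint64_t modpow(uint64_t base, uint64_t exp, uint64_t m) {
    uint64_t result = 1 % m;
    base %= m;
    while (exp > 0) {
        if (exp & 1) result = mulmod(result, base, m);
        base = mulmod(base, base, m);
        exp >>= 1;
    }
    return result;
}
```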

I conjecture that the sequence is infinite. If the analogous sequences for all pairs of naturals (\alpha,\beta) (not just (2,3), as we’ve considered so far) were all finite, some real coincidences would have to happen. So I suspect they’re all infinite.
