We have seen with astonishment shapes in the heavens that are nothing other than systems of such fixed stars limited to a common plane, such milky ways, if I may express myself in this way, that exhibit elliptical shapes in different positions in relation to the eye with a weakened shimmering as is appropriate to their infinite distance; they are systems of, so to speak, infinity times infinity greater diameter than that of our solar system, but that, without doubt, are generated in the same way, ordered and arranged by the same causes, and that maintain themselves by the same mechanism as this one in its constitution.—Immanuel Kant
Immanuel Kant wasn’t the first to posit that the “spiral nebulae” observed by many astronomers, such as Charles Messier, were actually galaxies like the Milky Way. Theologians, such as Emanuel Swedenborg, had proposed that a limitless God wouldn’t limit creation to a single galaxy. The British astronomers William and Caroline Herschel had discussed the possibility, and their proposals influenced Kant. Kant was the most prestigious philosopher of his time, and his endorsement gave the idea heft.
Copernicus and Kepler had shifted the view of the universe from geocentrism to heliocentrism, but up until Kant’s argument it was accepted that the universe was little larger than the Milky Way. Kant created room for the possibility that nature contained many galaxies equivalent to the Milky Way.
165 years later, in the early 1920s, the question was still being debated. Those who believed the Milky Way was the only galaxy argued that spin had been observed in the spiral nebulae. That spin would be impossible to observe if they were as vast as the Milky Way, since the outer stars would have to be moving faster than the speed of light. They also argued that novae observed in the nebulae were so bright that they outshone the rest of the nebulae; that would put the energy output of the novae so high as to be incomprehensible. Another problem with the novae was that more had been observed in the spiral nebulae than in the Milky Way, implying that the Milky Way had a different composition than the distant nebulae.
On the other side were those who took Kant’s position. Their most powerful argument was that of homogeneity–that the Milky Way wasn’t special, and that the rest of the universe would be very much like the part of it we can observe. They also argued that dark regions in the spiral nebulae strongly resembled regions of darkness in the Milky Way caused by vast expanses of dust hundreds of light years across.
The Milky Way is immense, estimated at 100,000 light years across, but visually it doesn’t fill the night sky. In fact, the band of stars, clusters, and dust doesn’t even cross the sky completely. Despite its enormity, it’s visible. It’s imaginable. It can be held in the mind. If the entire universe consisted mainly of the Milky Way and a few thousand nebulae caught in its orbit, then it’s still small enough and simple enough to be grasped by the human mind. On the other hand, if there are thousands or millions of island universes like the Milky Way, then the universe is so large it’s beyond comprehension.
The question was answered in 1924 by Edwin Hubble, through the use of a giant telescope and a spectrometer. But before discussing the answer to the question, we need to discuss how it was answered–with a little thing known as a standard candle.
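To preview how such a candle works: if an object’s intrinsic brightness (its absolute magnitude M) is known, its distance follows directly from how dim it appears (its apparent magnitude m) via the standard distance-modulus relation. A minimal sketch in Python; the magnitudes below are illustrative numbers of my own, not Hubble’s actual measurements:

```python
def distance_parsecs(m, M):
    """Distance implied by the distance modulus: m - M = 5*log10(d) - 5."""
    return 10 ** ((m - M + 5) / 5)

# Illustrative only: a candle of absolute magnitude -3.5 observed
# at apparent magnitude 18.2 lies hundreds of thousands of parsecs away.
d = distance_parsecs(18.2, -3.5)  # distance in parsecs
```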
Got a secret
Can you keep it?
Swear this one you’ll save
Better lock it in your pocket
Taking this one to the grave
If I show you then I know you
Won’t tell what I said
‘Cause two can keep a secret
If one of them is dead?
— The Pierces
Like most folks I enjoy a good conspiracy theory, and sometimes even a bad one. They allow the listener to suspend disbelief and consider for a moment that the social world in which he or she resides is an illusion. This is somewhat similar to the ability to slip into the fictional world of a story or novel, and bears similarity to the Gnostic assertion that the material realm is an entrapping illusion. In fact many very good films are based on the idea that conspiracies are afoot–the Bourne, Mission Impossible, Star Wars, and of course the Matrix series.
In modern societies, however, there are many conspiracy theories peddled as truth that cause real harm to their believers. One recent example from my country is PizzaGate–a ridiculous story based on the notion that a major Presidential candidate was operating a pederasty ring through a Washington DC pizza place–that led to a believer carrying out an armed raid. Another example is the belief that childhood vaccination is a cause of autism, an assertion founded on falsified data. Failure to vaccinate has led to numerous outbreaks of dangerous epidemic diseases in just the last few years. I should note that the longest-running conspiracy theories I know of aren’t theories at all, because in the minds of their believers they aren’t disprovable. In that sense they are fictitious certainties.
In his 2016 paper “On the Viability of Conspiratorial Beliefs,” Dr David Grimes of Oxford University applied probability models to four popular conspiracy theories–the NASA moon-landing hoax, climate-change fakery, a vaccination-autism link, and a suppressed cancer cure. His model simulates the probability of a conspiracy being leaked using a Poisson distribution, with conspiracy population parameters based on exponential decay, the Gompertz function, or no change. To understand Dr Grimes’s work a little better, I coded his equations in Octave and simulated his parameters, replicating the generalized results displayed in Figure 1 of the paper.
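As a rough illustration, the continuous model as I read it can be sketched in a few lines of Python: each year the surviving conspirator population N(t) follows one of the three decay schemes, and the Poisson probability of at least one leak that year is 1 − e^(−pN). The constants mirror the Octave code later in this post; treat the functional forms as my interpretation, not a verbatim transcription of the paper.

```python
import math

def leak_probability(decay, p=0.001, N0=100, years=80):
    """Cumulative probability that a conspiracy is exposed within `years`,
    under a per-conspirator annual leak probability p (my reading of the
    Grimes-style Poisson model, not the paper's exact code)."""
    alpha, beta = 1e-4, 0.085      # Gompertzian mortality parameters
    te, life_exp = 40, 78          # mean age / life expectancy of conspirators
    lam = math.log(2) / (life_exp - te)
    cum_fail = 0.0                 # cumulative probability of exposure so far
    for n in range(1, years + 1):
        if decay == "G":           # Gompertzian attrition of conspirators
            N = N0 * math.exp((alpha / beta) * (1 - math.exp(beta * (te + n))))
        elif decay == "E":         # exponential (half-life) attrition
            N = N0 * math.exp(-lam * n)
        else:                      # constant population
            N = N0
        step_fail = 1 - math.exp(-p * N)       # Poisson: P(>=1 leak this year)
        cum_fail += step_fail * (1 - cum_fail)  # exposure not yet having occurred
    return cum_fail
```

With a constant population of 100 and p = 0.1% per year, the 80-year exposure probability comes out very close to 1; the decaying populations yield lower but still substantial figures.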
In examining the work I found one point where I wanted to test a variation. The population parameters and probability distribution used in the paper are continuous–the growth models allow for fractions of a conspirator to leak information. I added integer rounding to the population parameters and substituted a discrete binomial probability function, treating the probability of a leak like the probability of a defective part rolling off an assembly line. I also added a parameter that allows a leak, presumably to the press, to require verification from additional defectors.
To test this formula I chose the fake-job-numbers conspiracy: the idea that the US Bureau of Labor Statistics was cooking up job numbers in 2012 to assist the reelection campaign of President Obama. A few problems with this are:
- The BLS employs more than 2500 individuals with a broad range of political preferences
- Job figures are also measured by private firms, such as Gallup and ADP, which show different raw numbers but similar trends
- There really isn’t any evidence that the average American voter pays close attention to economic releases
The downside to the discrete formula is that it uses factorials to calculate the binomial coefficient, and any factorial larger than 170! overflows to infinity in Octave’s double-precision arithmetic. To mitigate this, I used a conspiracy population of 100, with an individual chance of leaking of 0.1% per year, and a press requirement of 2 defectors to publish the leak. While it looks like it would be possible to carry out such a conspiracy once, the chance of the conspiracy being broken within 80 years is 91.49%. If one assumes a single leaker is sufficient, the chance of discovery rises to more than 99.99%. The chance of carrying it out with 2,500 people involved must be slim indeed.
The m-file code for the discrete distribution is below (note that in my earlier draft the Gompertzian and exponential decay formulas were swapped between branches, and the binomial term mistakenly used N(1) instead of N(n); both are corrected here):

function conspViaDisc(decayType, p, N0, numIt)
  x = 1:numIt;              % x-axis values (years)
  alpha = 10^-4;
  beta = 0.085;             % alpha & beta for the Gompertzian function
  te = 40;                  % mean age of conspirators
  lifeExp = 78;             % mean life expectancy of conspirators
  lambda = log(2)/(lifeExp - te);
  N(1) = N0;
  a = 1;                    % number of leakers needed to break the conspiracy
  if decayType == "G"
    for n = 2:numIt
      N(n) = int16(N0*exp((alpha/beta)*(1 - exp(beta*(te + n))))); % Gompertzian decay
    end
  elseif decayType == "E"
    for n = 2:numIt
      N(n) = int16(N0*exp(-lambda*n)); % exponential decay
    end
  else
    for n = 2:numIt
      N(n) = N0;            % no decay
    end
  end
  L(1) = 0.001;             % estimated probability of discovery in timestep 1
  cumL(1) = L(1);
  for n = 2:numIt
    Nn = double(N(n));      % conspirators still alive this year
    if Nn >= a
      summ = 0;             % accumulates P(fewer than a leakers)
      for m = 1:a
        s = m - 1;
        binCoef = factorial(Nn)/(factorial(s)*factorial(Nn - s));
        summ += binCoef*(p^s)*((1 - p)^(Nn - s));
      end
      L(n) = (1 - summ)*(1 - cumL(n-1)); % P(first exposure in year n)
      cumL(n) = cumL(n-1) + L(n);
    else                    % too few conspirators left to produce a leakers
      L(n) = 0;
      cumL(n) = cumL(n-1) + L(n);
    end
  end
  plot(x, cumL);            % cumulative probability of exposure over time
endfunction
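One way around the 170! ceiling is to compute the binomial coefficient in log space. The sketch below (in Python, using the standard library’s lgamma; the function name and structure are mine, not from the paper) evaluates P(fewer than a leakers) without ever forming a raw factorial, so a population of 2,500 poses no problem:

```python
import math

def prob_fewer_than(a, N, p):
    """P(X < a) for X ~ Binomial(N, p), computed in log space so that
    large N never overflows (no raw factorials like 2500! are formed)."""
    total = 0.0
    for s in range(a):
        # log C(N, s) = lgamma(N+1) - lgamma(s+1) - lgamma(N-s+1)
        log_coef = (math.lgamma(N + 1) - math.lgamma(s + 1)
                    - math.lgamma(N - s + 1))
        total += math.exp(log_coef + s * math.log(p)
                          + (N - s) * math.log(1 - p))
    return total

# With 2,500 conspirators at p = 0.1%/year, the per-year chance of
# at least 2 leakers is roughly 0.71
risk = 1 - prob_fewer_than(2, 2500, 0.001)
```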
“Come and listen to a story ’bout a man named Jed
Poor mountaineer barely kept his family fed
Then one day he was shooting for some food,
And up through the ground come a bubbling crude
(Oil that is, black gold, Texas tea)
Well the first thing you know old Jed’s a millionaire
Kin folk said Jed move away from there
Said California is the place you oughta be…”
—The Ballad of Jed Clampett
Oil and natural gas have been important in the US and globally for nearly 180 years, with that importance spiking following the invention and widespread adoption of the internal combustion engine in the 19th and 20th centuries. Beginning in 1973, the US consumed more crude oil than could be domestically produced. Dependence on global markets made the US economy vulnerable to oil-price shocks in the 1970s, and a later price spike acted as one of several proximate triggers for the 1990-1991 recession.
To better understand production and labor dynamics in US oil and gas production over time, I downloaded and examined production data from the Energy Information Administration (US DoE) and oil and gas employment data from the Bureau of Labor Statistics (US DoL). Graphs of the raw data are below:
A few points stand out when examining the data:
- Natural gas production bottomed out in the mid-1980s, rose modestly until the mid-1990s, and remained stable until the mid-2000s. Production is currently just off the 2014-2015 peak.
- Crude Oil production in the US peaked in the early 1970s, though oil and gas employment peaked around 1980. Since then, a second production peak occurred in 2015, though production has fallen off slightly since then.
- In spite of recent peaks in production, employment in oil and gas extraction is about 35% off the 1982 peak of 266 thousand workers.
To examine oil and gas combined production and productivity, data was downloaded and opened in MS Excel. Annual barrels of oil and millions of cubic feet of natural gas were converted to British Thermal Units (BTUs). September employment levels were used for each year. The total production series was divided by the employment series to calculate productivity in BTUs per worker.
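The conversion step can be sketched as follows; the heat-content factors here are approximate EIA averages (about 5.8 million Btu per barrel of crude and roughly 1,037 Btu per cubic foot of natural gas), not necessarily the exact figures behind the chart:

```python
BTU_PER_BARREL = 5.8e6      # approx. heat content of a barrel of crude oil
BTU_PER_CUBIC_FOOT = 1037   # approx. heat content of a cubic foot of natural gas

def productivity(barrels_oil, mmcf_gas, september_workers):
    """Combined production in Btu per worker; gas is given in millions
    of cubic feet, employment is the September level for that year."""
    total_btu = (barrels_oil * BTU_PER_BARREL
                 + mmcf_gas * 1e6 * BTU_PER_CUBIC_FOOT)
    return total_btu / september_workers
```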
The productivity data shows a nearly unbroken 20-year increase from 1983 to 2003, a modest decline, and some leveling until 2014, and then a return to productivity growth. As one would expect, productivity increases tend to correspond to falling or stable crude oil prices. Productivity by this measure tends to decline when prices are rising. This makes sense because rising prices can be expected to follow periods of falling production (affecting the numerator), and lead to increases in employment (affecting the denominator).
The following is my opinion only, freely given, worth what you’ve paid for it, isn’t meant to be career or investment advice, and isn’t necessarily the view of my employer or anybody else: The shale revolution has gone global, and demand growth is expected to be slow due to demographics and consumer preferences in the developed world. Because global supply is much less constrained than demand, I expect crude oil prices to stabilize in the 60 to 80 dollar per barrel range for the next several years, assuming no severe shocks to global supply or demand. For the profitability of US resources to be maintained, productivity must continue to improve, implying modest but steady increases in production and net employment changes occurring very slowly.