Tuesday, May 07, 2024

Will Perps Replace Broken AMMs?

AMMs are not in equilibrium. While LPs for the top Uniswap pools have become profitable, returns are within a rounding error of zero. Most people in DeFi, even those building AMMs, do not understand convexity costs, but as everyone figures it out, TVL will continue to stagnate, if not decline. Worse, many yield-farming scams are predicated on the underlying yields from LPing, and if these are built on a base return of zero at best, no amount of leverage can make them generate an attractive return.

2023 and 2024 YTD Profit/USD Traded (in bps)

For example, the eth-usdc pool lost 0.000048 USD per USD traded in 2023 (-0.48 bps) and has made 0.000071 USD per USD traded (0.71 bps) through 5/5/2024.
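To make the LP convexity cost concrete, here is a minimal sketch (my illustration, not from the post) of impermanent loss on a constant-product AMM such as Uniswap v2; trading fees must at least cover this gap for LPing to beat buy-and-hold.

```python
import math

def impermanent_loss(price_ratio: float) -> float:
    """Constant-product LP value vs. buy-and-hold, as a fraction.

    price_ratio is the ending price divided by the starting price.
    The result is always <= 0: the LP underperforms holding whenever
    the price moves, which is the convexity cost fees must overcome.
    """
    return 2 * math.sqrt(price_ratio) / (1 + price_ratio) - 1

for r in (1.0, 1.25, 1.5, 2.0):
    print(f"price x{r}: impermanent loss = {impermanent_loss(r):.3%}")
```

A doubling of price costs the LP roughly 5.7% relative to holding, which dwarfs the sub-basis-point per-dollar fee revenue quoted above unless volume is enormous relative to the pool.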

Perps to the Rescue?

In TradFi, futures volume dominates spot, but in DeFi, it's the opposite. Many think decentralized perp trading will soon grow to reflect its natural dominance, but that requires perps to become attractive to the big players: whales. Crypto entrepreneurs are great at pumping meme coins, but they are incompetent at creating robust, sustainable markets in assets that will get us through the inevitable fiat devaluation that will accompany the next financial crisis.

One problem in attracting whales is regulation. While the US has been making it easier for hedge funds to trade crypto, they are currently restricted to centralized exchanges like Coinbase and the Chicago Mercantile Exchange. Allowing US hedge funds to trade on the blockchain will take several years, as currently, regulators do not even comprehend how wallets like MetaMask work.

Outside the regulatory hurdle, no one with $10 million can trust a dex perp, because they are non-recourse, often pseudonymously administered, and lack integrity.

  • To the degree they are low-latency limit order books, they provide strong incentives for covert preferential treatment.

  • Staking tokens and rewards programs offered by perp protocols highlight that insiders are either stupid or comfortable running a Ponzi.

  • Only an insider conspiracy explains a funding rate that is too high on average and positively correlated with recent price movements.

  • Insiders are comfortable lying, which makes them untrustworthy

    • The official BitMEX perp funding rate mechanism has never worked as promoted, and never will.

    • Referencing the academic work of Gehr and Shiller is deliberately misleading.

Who Gets Colocation?

In the 1990s, the rise of the internet spawned electronic trading via exchanges like Archipelago and Island. This replaced the old system where white-listed Nasdaq market makers or a monopoly specialist would handle equity orders. As this new system was open, the market makers competed on speed, as the fastest would dominate. This led exchanges to offer high-frequency trading firms colocation services, where market makers could put servers in the same building as the exchange servers. Many claim this is unfair, but it's the fairest thing imaginable. Your average day trader cannot compete with HFTs with or without colocation, so the only effect is to make the competition among the elite open and equal. The alternative would be to give the advantage to those with covert agreements with exchange insiders, or to those who knew the owner of the building next door to the exchange. The monetary incentive was significant, and traders would do whatever it took to get the shortest and fastest connection to the exchanges; colocation made that competition explicit and fair.

Many faux-dex perps use limit order books (LOBs) instead of AMMs, and LOBs require low latency and high bandwidth so that LPs can quickly and costlessly cancel and replace their resting limit orders. For example, DyDx and Hyperliquid have their own private blockchains, which can reduce block times to under a second. This is only possible via effective centralization. However, these perp exchanges must aspire to decentralization to be credible, as FTX highlighted that centralized exchanges are risky regardless of their official mission statement. The global messaging and consensus mechanisms in any potentially decentralizable blockchain preclude centralized servers or officially white-listed colocation services.

Nonetheless, the advantage of having the quickest connection to these exchanges is the same as in TradFi. Insiders at DyDx, etc., can make a lot of money telling preferred traders how to get top-tier access to their network. With hundreds of milliseconds to work with, there is a large window in which to give insiders a decisive advantage. These exchanges have no independent third party auditing their messages or trading tape, so there is no downside. One must simply trust that no one in these organizations is susceptible to bribery.

Staking and Rewards

The earlier version of perp player DyDx offered a rewards program where users could profit by wash trading. When the rewards program ended, the volume and TVL dropped by 90%. This has happened often because it works (see here and here). Fake volume fools outsiders into thinking there is real demand, which pushes up the perp protocol’s token price, enabling insiders to cash out with big gains or a big VC investment (initial FTX volume was $300MM/day, though there were no mentions of it circa June 2019).

Another common crypto trick is to offer users extra rewards for locking up their tokens, often for a year. The exchanges would not even pay them with outside tokens; they would just inflate their own token by giving these stakeholders extra rewards. This is a simple Ponzi that works well until it doesn't. Staking makes sense for PoS blockchains like Ethereum, as they provide a useful service. The staker offers their collateral to bond the validator, giving the validator an incentive to act honestly: risk and return. Perp protocols GMX and Synthetix promoted staking their tokens for the sole reason of artificially reducing the supply.

These tactics are symptomatic of bad faith, not something any investor with other people’s money should trust.

The Perp Funding Rate Mechanism

In traditional futures markets, there are maturity dates when traders can deliver spot to settle their positions. This enables arbitrage if the futures and spot prices deviate by more than the relative cost of carry, as one can simultaneously buy one and sell the other and then close the position at maturity for a sure profit [see footnote below for the standard theory].1

Perpetual swaps have no expiration dates, so there is no delivery. This was necessary when BitMEX created them because the exchange only accepted Bitcoin; there was no ETH or USDC to deliver into. They proposed a perp premium funding rate mechanism as an alternative to the arbitrage mechanism used on traditional futures markets.

The perp premium is the percent difference between the perp and spot prices.

Perp Premium = PerpPrice / SpotPrice - 1

The standard story is that a bullish market sentiment causes the perpetual contract's price to exceed the spot price. To equilibrate the market, the longs will pay the shorts a funding rate when the perp premium is positive, discouraging longs and encouraging shorts. The logic works the opposite way when the perp premium is negative.

There are many variations in calculating the perp premium and in how it translates into an actual overnight rate. For example, the spot price can be taken from an oracle or a spot market on the exchange. The perp price used in this calculation has many degrees of freedom, such as the simple mid-price of the best bid and ask, or an 'impact' price given a hypothetical order of $1,000. The perp premium is generally averaged over 8 hours and then paid at the end of the period. Some sample every tick or every minute, and some pay every hour or continuously, but these differences are insignificant. These calculations are effectively impossible for an outside auditor to validate, given that differences within the standard error of these measurements are economically significant (e.g., a 0.03% higher perp premium raises the annualized funding rate by 11%).

Initially, the perp premium was mapped into a daily rate, which is usually the case today, though occasionally, it is mapped into 8- or 1-hour rates. There are usually max and min rates and a flat zone set at the default funding rate of 10.95%. For example, this is the function for BitMEX.
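A hedged sketch of such a clamped funding function follows; the shape (premium plus a clamped interest component, producing a flat default zone) matches the description above, but the parameter values are illustrative assumptions, not any exchange's exact specification.

```python
def clamp(x: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, x))

def funding_rate_8h(premium: float,
                    interest: float = 0.0001,  # assumed 0.01% per 8 hours
                    band: float = 0.0005) -> float:
    """Premium plus an interest component clamped to +/- `band`.

    When the premium is small, the clamp does not bind and the rate sits
    in the flat default zone: 0.01% per 8 hours, which annualizes to
    roughly 10.95% (x3 periods per day, x365 days).
    """
    return premium + clamp(interest - premium, -band, band)

print(funding_rate_8h(0.0))     # the flat default zone: 0.0001 per 8h
print(funding_rate_8h(0.002))   # large premium: premium minus the band
```

Note how much discretion hides in `interest` and `band`: small changes to either move the annualized rate by whole percentage points.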

Presumably, 'bullishness' drives the perp premium, though theoretically, this makes no sense. If people think the price will be higher in the future, it gets reflected in the spot, not the basis (see Samuelson's law of iterated expectations). In any case, to equilibrate bullishness with bearishness, the funding rate incents more shorts and fewer longs, reducing the perp premium. For example, a 0.09% perp premium implies a 33% annualized funding rate added to the shorts and subtracted from the longs (365 × 0.09%). Over a day, that is just a 0.09% return, while average daily volatility will be around 4.0%, which generates an insignificant reward/risk ratio, especially after trading costs.
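The back-of-envelope arithmetic above can be checked directly:

```python
# A 0.09% daily perp premium annualizes to roughly 33%, yet over a
# single day it is small relative to ~4% daily volatility, so the
# incentive to short the premium away is weak.
daily_premium = 0.0009                 # 0.09% per day
annualized = daily_premium * 365
print(f"annualized funding rate: {annualized:.1%}")

daily_vol = 0.04                       # assumed daily volatility
reward_to_risk = daily_premium / daily_vol
print(f"one-day reward/risk: {reward_to_risk:.4f}")   # before fees
```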

Yet the short's funding rate only materializes to any significant degree if that perp premium persists, and the current perp premium is but a single moment in that calculation; indeed, it might not even be sampled in calculating the average perp premium used for the next 8-hour period's funding payment.

In practice, traders look at the historical funding rate premium over the past few days, not the current perp premium. At any point in time, the perp premium does not incent buys and sells as in arbitrage trading, where if the spot price deviates from the futures price, an arbitrageur can instantly lock in a profit.

Arbitrage bounds prices on spot AMMs, and we can see that liquid AMM prices are almost always within the fee of the current world price on the major centralized exchanges. The funding rate mechanism provides a weak simulacrum of futures-spot arbitrage. It is more akin to how the Fed adjusts interest rates to manage the economy. Higher interest rates diminish investment, but it's a very imprecise mechanism that operates on long and variable lags. No economist would assert that the effect of interest rates on investment is like some arbitrage mechanism.

In practice, perp markets operate on a focal point, the current spot (Schelling points, Aumann's correlated equilibria). This makes sense because liquidity providers want to give customers what is promised to keep them returning. Indeed, many perp markets like Synthetix have AMMs that work fine without a perp funding rate pretext, highlighting their redundancy.

On one level, there is nothing wrong with this mechanism. BitMEX introduced it out of a need in the era before wrapping and stablecoins, and they needed a cover story to get traders to trust that their perp markets were 'trustless,' not merely based on focal points. However, given that we now know this perp premium mechanism has never operated like a governor on a steam engine and that it was just a white lie to get a market off the ground, its persistence is an insult. Exchanges, from Binance to BitMEX to DyDx, lie about how funding rates are determined and what they are for. Funding rates do not tie perp prices to spot via arbitrage or the emergent market price. The funding rates are set indirectly by insiders, who set the perp premium to their advantage.

Equilibrium Funding Rate Too High

Defi perp traders overwhelmingly want to lever long, not short. This makes sense because, conditional upon having money on blockchains, one is generally bullish; if you are bearish on ETH, you generally do not have anything on the blockchain. For example, the long-short trader open interest on GMX for the Arbitrum blockchain is below. Green is long; red is short.

Long-Short Perp Positions on GMX

There are two sides to every perp trade, which implies perp liquidity providers will be short on average. This is the general equilibrium on perp markets. The funding rate does not bring the trader perp long-short demand into equilibrium.

Most perp dexes either employ or are affiliated with their primary liquidity providers. This makes sense because when starting a market, it helps to seed it with liquidity. This short bias works well for the LPs, as it is easy to hedge their short perp positions on the blockchain with a long position. These LPs naturally want the default funding rate to pay them, the shorts. Working with the perp admins who calculate the nebulously sampled average perp premiums using 'impact' prices and variously amended price feeds for a spot price, the LPs can target the perp premium to be whatever they can get away with.

Theoretically, the perp funding rate for ETH-USDC should be around 4%, given that the USDC and ETH lending rates on the blockchain are 6% and 2%, respectively. The fact that the default rate for dex perps is 11% highlights the gamed nature of perps. An honest perp exchange would allow users to post ETH as collateral, short that ETH, and collect the funding rate. This would be, at most, the riskless USD rate, 5%. Perp markets do not allow this because it would reduce the returns for their LPs.

The House Money Effect

This perp premium farce allows perp insiders to reap extra returns when traders, who are generally long, are sitting on big profits. This is like when a bettor has won a lot: they have 'house money,' so they don't mind giving a big tip to the dealer or making a frivolous bet. One can see this by looking at the funding rate and the price level, where funding rates often exceed 50%. In traditional financial markets, the primary stylized fact is that price increases are associated with negative financing rates. The crypto pattern is anomalous, reflecting a perp conspiracy that would be provably illegal in TradFi, even to regulators as competent as the current SEC.

The graphs above make it hard to see which series is leading. If we cross-tab the funding rate with the prior return at a shorter duration, there's a clear positive relation; when cross-tabbed against future returns, there is no relation. Thus, there is no plausible story that the funding rate reflects a risk premium that shows up in future returns.
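The lead-lag check described above can be sketched as follows, on synthetic data (real funding and return series would replace the generated ones). Funding is constructed here to follow the prior day's return, so only the first correlation should be material:

```python
import random

# Synthetic data: daily returns, and a funding rate that loads on the
# PRIOR day's return plus noise (mimicking the pattern in the post).
random.seed(0)
returns = [random.gauss(0, 0.04) for _ in range(1000)]
funding = [0.0003 + 0.01 * returns[t - 1] + random.gauss(0, 0.0005)
           for t in range(1, 1000)]

def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

prior = corr(funding, returns[:-1])       # funding[t] vs return[t-1]
future = corr(funding[:-1], returns[2:])  # funding[t] vs return[t+1]
print(f"corr with prior return:  {prior:.2f}")   # clearly positive
print(f"corr with future return: {future:.2f}")  # near zero
```

If funding were a risk premium, the second correlation would be the material one; in the data, as in this construction, it is the first.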

Academic Pretext

In podcasts about perps, it is common to mention the perp premium mechanism as an ingenious blockchain application of Nobel prize-winning research. Robert Shiller created a well-known housing price index, and as housing is a major asset class in any economy, he thought an active futures market would be helpful. The problem is that, unlike a stock index, there is no way to deliver the houses that underlie a housing index. With 'delivery' out of the question, in 1992 he proposed a housing futures market without maturity dates: a perpetual futures. He used an econometric model to estimate monthly housing rents, which, like a stock's dividends, would be credited/debited to the daily margin. The futures price would then be the market's present value of these rents, just as a stock price is the present value of its dividends.

Outside of being a perpetual futures with no fixed delivery date, it had nothing like the perp premium tying the perp price to the spot price via a funding rate. The funding rate was calculated via an exogenous econometric model using macro data.

Adam K. Gehr's 1988 article on the Chinese Gold and Silver Exchange Society of Hong Kong (CGSES) is also commonly mentioned and is a better analog to crypto perp markets. The CGSES had a perpetual futures contract that used a nightly funding rate auction. If the price of gold closed at $111.00 and the daily interest rate auction was set at $0.15, the closing 'futures' price would be marked at the spot close plus the interest rate, $111.15. Thus, if the long sold at $112.00 the next day, he would make $0.85, using the futures close as the basis.
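Gehr's arithmetic can be sketched in a few lines (my illustration of the example above):

```python
def cgses_marked_close(spot_close: float, auction_interest: float) -> float:
    """CGSES-style perpetual futures close: the spot close plus the
    overnight interest set at the nightly funding auction."""
    return spot_close + auction_interest

# The example above: gold closes at $111.00, auctioned interest is $0.15,
# so the futures close is marked at $111.15. A long who sells at $112.00
# the next day nets the $1.00 move less the $0.15 funding he paid.
basis = cgses_marked_close(111.00, 0.15)
pnl_long = 112.00 - basis
print(f"long P&L: {pnl_long:.2f}")
```

The funding charge is embedded in the marked basis, so no separate funding payment is needed.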

Gehr suggested that, like in the CGSES, a short separate trading period after the spot close could facilitate perpetual futures, which have the advantage of automatically rolling. The mechanism would be just like in the CGSES, so if the spot price closed at 111.0, the futures market would trade for 15 minutes and set a price at 111.15.

In Gehr's model, the perp premium directly generates a funding rate, but it differs profoundly from the crypto perp approach. The spot market was taken as a given, and the proposed closing perp price was determined after the spot market closed, a direct analog to the overnight funding rate auction used at the CGSES at the end of daily trading. The perp closing price was a simple way to account for the overnight lending rate within the structure of daily margining, using the prior perp close as the cost basis.

It's ignorance or willful deception to assert these papers demonstrate the economic soundness of the perp premium funding rate mechanism.

Perps Aren't Whale-Friendly

For a small trader, the perp conspiracy is not a big deal. They get access to 50x leverage, something not possible for many elsewhere. Just as many don't mind paying a 50% premium for lottery tickets, the funding rate charge is tolerable given the easy access to leverage.

For a whale, however, it's a dealbreaker. If a whale wants to be a market maker on one of these exchanges, they can never compete if they aren't part of the insider club. If they want to put on a position, they know insider LPs will have the opposite position. In a large hedge fund, a portfolio manager with 5% alpha is considered exceptional, but a big crypto portfolio manager investing in perps is subject to insiders bumping the funding rate enough to offset any conceivable alpha.

If you think I am being paranoid, consider the case of perp.fi's virtual AMM, which used rewards to pump its token market cap to $1 billion. While one commenter opined it should be considered for a Nobel Prize, it contained the minor flaw that the LP collective could become insolvent. Eventually, the LPs' insolvency exceeded the insurance fund. When this was discussed on various user forums, the chat highlighted animus towards the accounts with large gains, a predictable rationalization when debating whether to renege on a large outstanding debt. For example:

" IMO it would be correct to add the option of not compensating whales like them who didn't bring any value to the protocol, just risks. They didn't do any active trading, didn't generate a lot fees therefore, just reaped funding (keeping neutral positions most likely)."

There are many good reasons to not pay debts to rich people. A trustless, decentralized contract has to eliminate any such discretion.

Decentralized perps can work, but they need an integrity enema to entice the whales needed to flourish. Protocols should emphasize decentralization, transparency, and immutability instead of focusing on creating a closer substitute for centralized exchanges.

1

The theory that explains the perp funding rate is a typical non-equilibrium story that sounds good at 30,000 feet but makes no sense. Like the explanation that there are "more buyers than sellers," the idea that long demand shows up in futures price premiums falls apart on inspection. In 1965, Paul Samuelson demonstrated that the law of iterated expectations implies current sentiment is reflected in spot prices, not forward/futures prices. Funding rates are instead a function of the relative interest rates of the two assets traded, such as the rate of interest on the USD and the dividend rate on stocks.

The theory is called ‘covered interest rate parity,’ and it works like this. One can take a dollar and earn interest directly via money markets or Treasury Bills, which generates:

USD path: 1 + rUSD

Alternatively, one can turn this dollar into ETH, earn the ETH lending rate, and then convert back into USD at the futures price:

ETH path: (1 / spotPrice) × (1 + rETH) × futurePrice

In equilibrium, these two paths generate the same net return. Relabeling futurePrice as perp and spotPrice as spot, and rearranging, we get:

perp / spot = (1 + rUSD) / (1 + rETH)

Which rearranges further to

Perp Premium = perp / spot - 1 ≈ rUSD - rETH

Here the perp premium is just the USD interest rate minus the ETH interest rate. In commodity markets, there are significant storage costs due to grain spoilage, or a lack of storage space for oil during a demand collapse, as in March 2020. There are no storage costs in crypto. The USDC and ETH lending rates are approximately 6% and 2%, respectively. Thus, in theory,

Perp Premium ≈ 6% - 2% = 4%
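The footnote's arithmetic can be verified numerically (the spot price here is an arbitrary illustration; the premium does not depend on it):

```python
r_usd, r_eth = 0.06, 0.02     # approximate lending rates from the text
spot = 3000.0                 # assumed ETH/USD spot price, illustrative

# No-arbitrage perp price: chosen so the ETH path matches the USD path
perp = spot * (1 + r_usd) / (1 + r_eth)

path_usd = 1 + r_usd                          # lend the dollar
path_eth = (1 / spot) * (1 + r_eth) * perp    # buy ETH, lend, convert back

assert abs(path_usd - path_eth) < 1e-12       # the two paths agree
print(f"perp premium: {perp / spot - 1:.2%}") # ~3.92%, i.e. roughly 4%
```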

Tuesday, April 30, 2024

Dark Spirits and Ancient Aliens

Michael Heiser

Joe Rogan and Tucker Carlson mentioned demonic spirits in a recent podcast, and my first thought was that they need to look up Michael Heiser. He was an Old Testament scholar who spent his career examining how various ethereal spirits fit into our world. Heiser died from cancer last year but has a ton of material online, as he did a weekly podcast discussing Bible topics, not limited to his niche focus on angels and demons (see an intro video here). It's too bad he died, because he would have made an excellent podcast guest for Rogan: he was an excellent communicator and enjoyed investigating UFOs, Ancient Aliens, and Zecharia Sitchin (he was generally skeptical but found it fun; see here). More recently, Tim Chaffey wrote a book on the Nephilim, so perhaps that's an alternative for one of them.

Most intellectuals find the idea of ethereal forces patently absurd, but this is just because they don't understand the assumptions of their naturalistic worldview. Everyone believes in unseen forces and miracles; religious people merely own it. Cosmologists rely on the multiverse's infinite universes to explain various fine-tuning, such as the cosmological constant and the initial entropy of the universe; they also believe in dark energy, dark matter, and inflation, though these forces cannot be observed directly or falsified in any way. Secular origin-of-life researcher Eugene Koonin uses the multiverse to overcome the many improbabilities required. These scientific theories are untestable, differing from pre-scientific theories only by replacing God with hypothetical fields and forces.


Burying miracles in assumptions does not make them any less miraculous. An infinite number of universes could easily explain Genesis (with Charlton Heston as God), that we are a hallucinating consciousness imagining life as we know it, or that we live in a simulation designed by an ancient effective altruist. The only ontological difference between religious and naturalistic miracles is that religious people assign their creator not just personhood but the authority to define absolute, objective morality. I find the Biblical virtues stand on their own as optimal praxis for personal and societal flourishing, but simple Bayesian logic implies I should defer my ethical judgments to my creator, given the knowledge disparity. The Bible, however, teaches that my view has been and always will be a minority take under the sun.

Spirit Beings Who Are Not God

If you believe in the New or Old Testament, you believe God is not the only being in the spiritual realm. There is not only the Trinity, but the angels Gabriel and Michael, and the fallen angel Lucifer. Hebrews 12:22 mentions 'countless thousands of angels,' and Psalm 89 mentions a divine assembly that includes many heavenly spirits, among many other mentions. No Jew or Christian thinks this implies polytheism.

A big problem centers on one of the words for God in the Old Testament. The word elohim is used thousands of times in the Hebrew Bible, and it usually refers to the 'God Most High,' Yahweh, aka God. This has led many to think every use of the word elohim refers to God, which is simply incorrect. First, note that elohim is a word like deer or sheep that can be singular or plural. For example, in Psalm 82, we read:

Elohim presides in the divine assembly; He renders judgment in the midst of the elohim

The word elohim here has two referents: the first, for God, is singular, and the second is plural because God is 'in the midst of' them. Elohim is used thousands of times in the OT to denote God (aka Yahweh), but it also references different ethereal beings:

  • The members of God’s council (Psa. 82:1, 6)

  • Gods and goddesses of other nations (Judg. 11:24; 1 Kgs. 11:33)

  • Demons (Hebrew: shedim—Deut. 32:17)

  • The deceased Samuel (1 Sam. 28:13)

  • Angels or the Angel of God (Gen. 35:7)

Elohim is just a word for spiritual beings, and these are often mentioned in the Old and New Testaments. God is an elohim, but he is also the Most High, as in Exo 15:11: "Who is like you among the elohim, Yahweh?" If He is most high, He has to be higher than something else.

Understanding that many other spiritual beings interact with God helps us understand the phrase in Genesis 1 where God says, "Let us make man in our image," or when God says, "man has become like one of us in knowing good and evil." Ancient Hebrew did not have the majestic we or trinitarian phrases.1

This is an essential point because the existence of other elohim is commonly overlooked by Christians, and this obscures some profound truths. How else would one make sense of 1 Kings 22, where God is in heaven discussing what to do with Ahab:

Micaiah continued, "Therefore hear the word of the LORD: I saw the LORD sitting on His throne, and all the host of heaven standing by Him on His right and on His left.

And the LORD said, 'Who will entice Ahab to march up and fall at Ramoth-gilead?'

And one suggested this, and another that.

Then a spirit came forward, stood before the LORD, and said, 'I will entice him.'

'By what means?' asked the LORD.

And he replied, 'I will go out and be a lying spirit in the mouths of all his prophets.'

'You will surely entice him and prevail,' said the LORD. 'Go and do it.'

God does not have a personality disorder; he has a council of other ethereal beings, other elohim. He does not need these helpers any more than we need dogs, friends, or children: they bring us joy, but also problems. Humans can empathize because we were made in his image. Many other verses make no sense if God exists in the spiritual world with only his alter egos (the trinity).

Spirits Go Bad

Many cannot fathom why a good and all-powerful Yahweh would create beings who become evil, and the answer generally focuses on the importance of free will, which is part of being made in the image of God. There is endless debate on free will, whether one believes in God or not. For example, the existence of evil given a good, all-powerful God is puzzling, but it's also difficult to imagine a world without evil; the word evil would lose its meaning, as well as its antonym, goodness. If they are inseparable, like two sides of a coin, is a world without good and evil better than one with them? One can speculate, but the bottom line is that elohim, like humans, can and do go bad, creating evil spirits that manipulate men.

There are three significant calamities in human history. The first is Adam and Eve's fall in the Garden of Eden, the only one most Christians recognize. However, two others get less attention: the rebellion that led to the flood, and the Tower of Babel incident, both of which directly involved evil elohim. There are many more texts relating to these latter falls in the Qumran texts (aka Dead Sea Scrolls) than to the incident with the apple, suggesting they deserve more of our attention.

The backstory for God's decision to flood the earth is mentioned in Gen 6:2-4

"the sons of God saw that the daughters of man were attractive. And they took as their wives any they chose. Then the LORD said, "My Spirit shall not abide in man forever, for he is flesh: his days shall be 120 years."

The Nephilim were on the earth in those days—and afterward as well—when the sons of God had relations with the daughters of men. And they bore them children who became the mighty men of old, men of renown."

The sons of God, fallen elohim, impregnated human females, creating 'mighty men of renown,' the Nephilim, translated as giants in the Septuagint. The Nephilim are referenced in several places, including descriptions of figures like Goliath, the Anakim, King Og of Bashan, and the Canaanite "giants in the land" targeted by Joshua's conquests. Christians often interpret the 'sons of God' who fathered the Nephilim as merely righteous Hebrews, in contrast with the unrighteous 'daughters of men.' This makes little sense because their offspring were clearly different, not just in size but in their capabilities. Why would a sinful-righteous human pairing create supermen? The acceleration of evil created by these half-breeds and their progeny was so great that Yahweh decided to eliminate most of humanity.

The flood did not fix the problem. A couple of hundred years after the flood, there is the Babel incident, in which humanity attempts to build a tower to heaven, a rebellion based on hubris that offends God. In response, God not only disperses humanity into various nations, but he disinherits them as His people and puts them under the authority of lesser elohim. This is recalled in Deuteronomy 32.

When the Most High gave the nations their inheritance, when He divided the sons of man, He set the boundaries of the peoples according to the number of the sons of God (Israel). But the LORD's portion is His people, Jacob His allotted inheritance. ~ Deu 32: 8-9

We know this refers to the Babel event because God apportions a table of 70 nations in Genesis 10 from Noah's three sons. In Genesis 11, after the judgment at Babel, each of these nations was assigned to a son of God, a lesser elohim, while one was kept for the Lord.

Historically, most Christians have read Deuteronomy 32 with 'sons of Israel' rather than 'sons of God.' Currently, most English translations have Israel, but many, such as the ESV, have God. While Israel is an accurate translation of the Hebrew Masoretic text used by Jerome, God was more common among the Qumran texts and the Septuagint. This mistranslation led to a lack of interest in its implications: in the 'sons of Israel' interpretation, the nations are assigned human rulers like Jacob, while in the 'sons of God' view, regional deities. The issue becomes prominent in Psalm 82.

Ps 82:1-2 Elohim presides in the divine assembly; He renders judgment in the midst of the elohim. How long will you judge unjustly and show partiality to the wicked? …

Ps 82: 6-7 I have said, 'You are elohim; you are all sons of the Most High.' But like mortals you will die, and like rulers you will fall.

God is not speaking to Jewish elders as elohim because humans do not sit in God's divine assembly (see Psalm 89 for a description of God's council). Further, it would make no sense to proclaim humans would die like mortals; they would have known that. Thus, Psalm 82 describes God judging the elohim he assigned to rule the nations, as described in Genesis 11 and Deuteronomy 32. You could easily miss it if you did not read 'sons of God' in Deu 32, which explains why this interpretation is relatively new (the Qumran texts that seal the inference were unavailable to Augustine, Calvin, etc.).

To recap, we have

  • Gen 6:2-4: Evil spiritual beings come down to earth, defiling women and creating a race of evil, mighty giants, the Nephilim

  • Deu 32:8-9: Recounts how God abandons 69 nations to lesser elohim but keeps one to himself that he would inherit through Abram.

  • Psalm 82: God condemns 69 elohim for being unrighteous rulers of their nations.

These are the types of dark forces that create problems for humanity.

The Watchers

The Book of Enoch (aka 1 Enoch) and The Book of Giants are prominent among the Qumran texts, and they expand upon the brief mentions above. Enoch refers to the sons of heaven as the Watchers. The Watchers see the daughters of men, desire them, and decide to come down from heaven and mate with them (as in Genesis 6). Their leader, Shemihazah, knows his plan is sinful, and he does not want to bear the responsibility alone, so the Watchers swear an oath on Mount Hermon, a classic conspiracy (they rebel against God, as noted in Ps 82). The Watchers teach humans various 'magical' practices such as medicines, metallurgy, and the knowledge of constellations. This knowledge gives people great power but exacerbates their sin and suffering.

Enoch describes how the offspring of the Watchers and women became giants who dominated humanity, 'devouring the labor of all the children of men and men were unable to supply them.' 1 Enoch 15:8 refers to the offspring of the giants as demons. These beings are described as spiritual, following their fathers' nature; they do not eat, are not thirsty, and know no obstacles.

One might reject this description as an older form of Ancient Aliens clickbait. However, we can know this is true because listening to spirits is forbidden by God, a prohibition that would make no sense if it were impossible.

You must not turn to mediums or spiritists; do not seek them out, or you will be defiled by them. I am the LORD your God. ~ Leviticus 19:31

While the Book of Enoch is not canonical, it should still be taken seriously. Both Peter and Jude not only quote from the Book of Enoch, but they quote the verses that directly describe the fallen elohim.2 Enoch is also quoted in 3 Maccabees, Ecclesiasticus, and the Book of Jubilees. It is favorably examined by the early Church fathers Justin Martyr, Irenaeus, Clement of Alexandria, Tertullian, and Origen.

A big reason Christians do not revere the Book of Enoch is that the influential theologian Augustine ignored it. Before becoming a Christian, Augustine was a Manichean, a sect that revered The Book of Enoch and emphasized the external battle of good vs. evil. Enoch's narrative does not mention human responsibility. It is a determinist view where evil originates from the deeds of the Watchers, in contrast to the Garden of Eden story, where sin emanates solely from our human nature. Ultimately, Augustine rejected Manichaeism; he thought our most pressing problem was the sin residing in all of us due to the initial fall of man in the Garden of Eden. Like many who fall away from an ideology, he tried to make as complete a break as possible and dismissed the Watchers story.

Jesus and the Dark Spirits

This view of dark forces explains the cosmic geography mentioned in the New Testament.

To the intent that now unto the principalities and powers in heavenly places ~ Ephesians 3:10

against principalities, against powers, against the rulers of the darkness of this world, against spiritual wickedness in high places. ~ Ephesians 6:12

If you believe in the New Testament, you can't reject the assertion that demonic forces have a major role in human life.

The gospel's good news directly addresses the problem created by demonic forces. Christ enabled those under the lesser elohim's dominion to turn from those gods via faith. The breach caused by the Babel rebellion had been closed; the gap between all humanity and the true God had been bridged.

We see this in Acts, where Luke records Paul's speech in Athens. In talking about God's salvation plan, Paul says:

And God made from one man every nation of humanity to live on all the face of the earth, determining their fixed times and the fixed boundaries of their habitation, to search for God, if perhaps indeed they might feel around for Him and find Him. And indeed He is not far away from each one of us. ~ Acts 17:26-27

Paul clearly alludes to the situation with the nations produced by God's judgment at Babel, as described in Deuteronomy 32. Paul's rationale for his ministry to the Gentiles was that God intended to reclaim the nations to restore the original Edenic vision. Salvation was not only for the physical children of Abraham but for anyone who would believe (Gal. 3:28-29)

In Acts 2, the apostles, filled with the Holy Spirit, begin to speak in various tongues, allowing them to communicate the gospel to people from different nations visiting Jerusalem. Luke describes the disciples' tongues as divided using the same Greek word (diamerizo) the Septuagint uses in Deuteronomy 32, 'When the Most High divided the nations, when He scattered humankind, He fixed the boundaries of the nations.' Luke then describes the crowd, composed of Jews from all the nations, as confused, using the same Greek word (suncheo) used in the Septuagint version of the Babel story in Genesis 11: 'Come, let us go down and confuse their language there.' This mirroring highlights that Jesus and the Holy Spirit would rectify the disinheritance at Babel and the subsequent oppression by corrupted elohim rulers.

Christ’s sacrifice gave people the ability to defy regional demons, but he did not eliminate them.

Demonic Ancient Aliens

In the Babylonian flood story, divine beings known as Apkallu possessed great knowledge, had sex with women, produced semi-divine offspring, and shared their supernatural knowledge with humanity. In contrast to the Bible, they were hailed as pre-flood cultural heroes, so Babylonian kings claimed to be descended from the Apkallus. To make the connection with Enoch even clearer, Apkallu idols were often buried in Babylonian house foundations for good luck, and were called 'watchers.' The Watchers relate to many mysteries, such as the discovery of bronze and iron or the creation of the pyramids. These demi-gods were presented as the good guys for many ancient Middle Eastern societies.

The ancient Greeks had their version of this, replacing Apkallu with Titans. These were regional semi-deities with knowledge from the gods that gave them great power. In Hesiod's Theogony, he mentions the titan semi-gods and how Prometheus stole fire from the gods. Aeschylus's play Prometheus Bound describes Prometheus's famous punishment for giving humans divine knowledge, as sinful humans would invariably use their greater knowledge to their ultimate detriment, the classic story of hubris. The takeaway here is that interacting with these demi-gods is a Faustian bargain.

Two Takes

A common literary theme is to spin either the good Apkallu or bad Promethean interpretation of human development. For instance, in The Lord of the Rings, the ring has great power but ultimately destroys those who possess it; in The Godfather, Michael wins the war with the five families but loses his soul. In Percy Shelley's Prometheus Unbound and George Bernard Shaw's Back to Methuselah, knowledge from the gods generates human intellectual and spiritual development that brings humanity's eventual liberation and enlightenment through knowledge and moral improvement.

There is nothing wrong with technological improvement or efficiency. Bezalel is described as having great wisdom and craftsmanship and is promoted by God to create the first Tabernacle and the Ark of the Covenant. Noah was considered uniquely righteous and built a boat that could hold an entire zoo. The problem is succumbing to the temptations created by powerful, dark spiritual powers who know many valuable things that humans do not. Understandably, this power would seduce many. Any elohim who does this is contravening God's plan for their glory, which is evil. Many humans glom onto them, as they would rather rule in hell than serve in heaven, or just not care about the long run and prioritize the ephemeral pleasures of status and its spoils (‘eat, drink, and be merry for tomorrow we die’).

How demons interact with humans is unclear. We probably cannot exterminate them if we try, but we know we can and should resist them. There are many opportunities to aid and abet evil for a short-term advantage, but it is foolish to gain the whole world and lose your soul. While I doubt anyone can tell if someone is a demon puppet, let alone a demon, both have existed and still do. We should not be naïve; we should be wary. Discerning good and evil is difficult when dealing with fallen elohim, as they lie and are smarter than us. A simple rule is to speak truth to lies because God is truth. If you must lie to make your point, there's a good chance you are allying with dark forces.

1

see other 'us' language in Gen 1:26, Gen 3:22, Gen 11:7, Isaiah 6:8

2

1 Enoch 19:1 quoted in 2 Peter 2:4

For if God did not spare the angels when they sinned, but cast them into Tartarus

1 Enoch 1:9 quoted in Jude 14-15

It was also about these men that Enoch, in the seventh generation from Adam, prophesied, saying, “Behold, the Lord came with many thousands of His holy ones, to execute judgment upon all, and to convict all the ungodly of all their ungodly deeds which they have done in an ungodly way, and of all the harsh things which ungodly sinners have spoken against Him.

Wednesday, April 24, 2024

Why Evolution is False

Recently deceased philosopher Daniel Dennett called Darwin's idea of natural selection the best idea anyone ever had, a universal solvent that melts away all the problems in biology. Dennett had contempt for Christians, coined the term 'brights' for those who shared his worldview, and thought it wise not to respect religion because of its damage to the 'epistemological fabric of society.' Like fellow atheist Richard Dawkins, he never debated biologists, just theologians.

In a 2009 debate, Dennett mentioned he brought his friend, an evolutionary biologist, to a 1997 debate with Michael Behe about Behe's book Darwin's Black Box (1996) because he felt unqualified to address the microbiology arguments. Dennett described Behe's book as 'hugely disingenuous propaganda, full of telling omissions and misrepresentations,' that was 'neither serious nor quantitative.' Dennett then added he would not waste time evaluating Behe's newest book, The Edge of Evolution (2007).1

Dennett emphasized that his approach to understanding the world was rational, reasoned, and evidence-based. Yet he never directly addressed Behe's arguments, instead sticking to the cowardly pleasure of debating non-scientist theologians over whom he had greater knowledge of biology. By enlisting a biologist to help him debate Behe, he admitted he could not evaluate the arguments alone. If he could not trust himself to evaluate Behe's argument, a rational approach would be to take a trusted source's opinion as a Bayesian prior, not as a fact so certain that its negation damages the epistemic fabric of society. Unfortunately, many, perhaps most, scientists agree with Dennett and think the only people who don't believe in evolution are ignorant (e.g., see Geoffrey Miller here, or Dawkins here).

If Dennett had read Behe's Edge of Evolution, he would have seen it as a logical extension of his earlier book, not moving the goalposts. Behe's argument isn't based on parochial microbiology knowledge; it's pretty simple once one gets the gist.

Behe highlighted the edge of evolutionary power using data on malarial resistance to two different antimalarial drugs. For atovaquone, resistance develops spontaneously in every third patient; given the number of malaria parasites in a single infected patient, the probability that the parasite successfully adapts to this threat can be estimated as one in 1e12, or 1e-12. For chloroquine, resistance occurs in only one of every billion patients, giving an estimated successful mutation probability of 1e-20. This roughly squares the original probability, which led Behe to suggest that at least two specific mutations were required to develop chloroquine resistance. This prediction was confirmed a few years later.

Extending this logic, given a base mutation rate of p per base pair per generation (e.g., p ~ 1e-8 for humans), if n specific mutations are needed, the probability of that happening scales as p^n. Given that new proteins require at least 10 changes to some ancestor (out of an average of 300 amino acids), the probability of an existing genome evolving into a new specific protein would be on the order of 1e-80. Given that only 1e40 organisms have lived on Earth, this implies that evolution is limited in what it can do (note most evolution we observe, as in dog breeds, just involves changes in allele frequencies).
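As a quick numerical sketch of this scaling (a few lines of Python; the rates and counts are the estimates quoted above, not measurements of mine):

```python
# Sketch of the p**n scaling argument, using the text's estimates.
p_site = 1e-8          # per-base-pair mutation rate per generation (humans)
n_needed = 10          # specific changes assumed needed for a new protein
organisms_ever = 1e40  # rough count of organisms that have ever lived

# If intermediate steps are neutral, n specific mutations scale as p**n.
p_new_protein = p_site ** n_needed                    # ~1e-80
expected_successes = organisms_ever * p_new_protein   # ~1e-40: effectively never

print(f"P(new protein) ~ {p_new_protein:.0e}")
print(f"Expected successes over all life: {expected_successes:.0e}")
```

Even granting every organism that ever lived one try, the expected number of successes is ~1e-40, which is the "edge" in Behe's argument.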

A reasonable criticism is that this argument works for a specific target, such as Behe's example of malaria overcoming an antibiotic. However, the state space of possible unknown proteins is much larger. For example, the average protein has 350 amino acids; with 20 amino acids, that's 20^350 or 1e455 possible permutations. If there are islands of functional proteins within this vast state space, say at only a rate of 1e-55, that leaves 1e400 potential targets. The 'singular target' criticism does not invalidate Behe's assertion, but addressing it would require too much space for this post, so I will just mention it as a defensible criticism.  
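The state-space counting above is just log arithmetic; here is a short Python check, where the 1e-55 island density is the assumed rate from the paragraph, not an established figure:

```python
import math

L_protein = 350   # average protein length in amino acids (from the text)
alphabet = 20     # amino-acid alphabet size

# 20**350 is too large for a float, so work in log10 space.
log10_sequences = L_protein * math.log10(alphabet)      # ~455, i.e., ~1e455 sequences
log10_island_density = -55                              # assumed density of functional islands
log10_targets = log10_sequences + log10_island_density  # ~400, i.e., ~1e400 targets

print(round(log10_sequences, 1), round(log10_targets, 1))
```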

However, most reviews of Behe's malaria-based argument centered on the simple assertion that generating n specific mutations scales as p^n.

Ken Miller (Nature, 2007):

Behe obtains his probabilities by considering each mutation as an independent event

Sean Carroll (Nature, 2007):

Behe's chief error is minimizing the power of natural selection to act cumulatively.

Jerry Coyne (The New Republic, 2007):

If it looks impossible, this is only because of Behe's bizarre and unrealistic assumption that for a protein-protein interaction to evolve, all mutations must occur simultaneously, because the step-by-step path is not adaptive.

These criticisms rest on two possible assumptions. One is that when multiple mutations are needed, the single mutations acquired along the way may each confer a fitness advantage, allowing selection to fix the intermediate cases. While this can be true, it is not true for the mutations malaria needs to develop chloroquine resistance, which requires at least two, and it is certainly not true in general. Indeed, a good fraction of intermediate steps reduce fitness, some severely (see here or here), which is why a fitness advantage requiring a two-step mutation does not arise with the same frequency as one requiring a single mutation: in the wild, the intermediate mutations are eliminated from the population.

The other assumption would be if the number of indirect paths overwhelms the specific case where sequential mutations occur. There are many indirect paths from one sequence of 300 amino acids into another. The question is their probability, the sum of these successful paths over all possible paths.

Intuitively, multiple specific simultaneous mutations are astronomically improbable and would constitute an impenetrable fitness valley, which is why evolution proponents are emphatic that it is an absurd approximation. Their intuition is that if one needs a handful of specific mutations when one already has 100 or 400 of the needed amino acids in the correct spot, a cumulative process of some sort should be likely, even if the final steps involve neutral mutations that neither help nor hurt fitness. However, none of Behe's critics generated a model to quantify their reasoning, even though they are all scientists in this field.

Model of the Behe Edge of Evolution Argument

Hypothesis: Prob(get n specific mutations sequentially) ~= Prob(get n specific mutations over 1e40 generations)

The model below uses a trinomial lattice to show that if the probability of getting one mutation correct is p, the probability of getting n mutations correct is on the order of p^n. Given the small probability of getting one mutation correct, this highlights what Michael Behe calls the edge of evolution: the fitness landscape generates an impenetrable barrier for naturalistic processes. Assuming simultaneous mutations towards the target captures most of the cumulative probability of reaching that target if mutations are neutral until the target is achieved. The other paths drift away at a rate that initially eliminates the benefit of being so close.

We can model this using a variant of Richard Dawkins' Weasel program, which he used to demonstrate the power of evolution. It starts with a random string drawn from 27 potential characters (26 letters and a space) with length 28.

neuh utnaswqvzwzsththeouanbm

Dawkins randomly generated strings, fixing characters that matched the target phrase from a Shakespearean play. The application to genetics is straightforward if we think of the phrase as a sequence of nucleotides or amino acids creating a protein.

xethinks dt is like a weasek

xethinks dt is like a weasel

xethinks it is like a weasel

methinks it is like a weasel

This algorithm fixes characters that make sense only at completion. This is like assuming necessary mutations are selected with foresight, which would require some outside designer shepherding the process. Evolutionists countered that Dawkins' weasel program was intended merely to demonstrate the difference between cumulative selection and single-step selection, but this is disingenuous, as removing forward-looking selection increases the number of steps needed from about 100 to 1e39, which would not make for a convincing TV demonstration (see Dawkins promoting his weasel here).
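The foresight at issue is easy to see in code. Below is a minimal Python reconstruction of a Weasel-style search with locked-in matches, built from the description in this post rather than Dawkins' original program (which bred a population of offspring each generation); the locking of correct characters is the feature being criticized:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "   # 27 potential characters

def weasel_with_fixing(seed: int = 0) -> int:
    """Cumulative selection with foresight: a character that matches the
    target never mutates again. Returns the number of single-character
    mutation attempts before the full target is reached."""
    rng = random.Random(seed)
    s = [rng.choice(CHARS) for _ in TARGET]
    steps = 0
    while "".join(s) != TARGET:
        i = rng.randrange(len(TARGET))
        if s[i] != TARGET[i]:        # 'fixing': correct characters are locked
            s[i] = rng.choice(CHARS)
        steps += 1
    return steps

print(weasel_with_fixing())  # converges in thousands of steps, not ~1e39
```

With locking, each mismatch only has to be hit once, so the search finishes quickly; remove the lock and matched characters can be lost again, which is what blows the expected step count up toward the blind-search regime.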

Nonetheless, the Weasel program is familiar to many who study evolution and can be used to illustrate my point. Consider the case where we are two characters away from the target sequence.

Start: 2 wrong, 26 correct

XXTHINKS IT IS LIKE A WEASEL

Case 1, closer: 1 wrong, 27 correct.

MXTHINKS IT IS LIKE A WEASEL

XETHINKS IT IS LIKE A WEASEL

Case 2, stasis: 2 wrong, 26 correct. Each of the two mismatched characters can change into 25 (27 − 2) other characters that are also wrong, for a total of 2 × 25 = 50 paths. For example, here are three.

YXTHINKS IT IS LIKE A WEASEL

XYTHINKS IT IS LIKE A WEASEL

AXTHINKS IT IS LIKE A WEASEL

Case 3, further: 3 wrong, 25 correct. Each of the 26 matching characters can become unmatched by changing into any of the 27 − 1 = 26 other characters. The total number of paths is 26 × 26 = 676. Here are two.

XXXHINKS IT IS LIKE A WEASEL

XBTHXNKS IT IS LIKE A WEASEL

To generalize the problem, let us define L as the string length, c the number of potential characters at each element in the string, and n the number of needed changes to the string (initial mismatches). The set of potential new strings can be split into three groupings with their distinctive number of paths:

1.      strings that have more characters matching the target: n paths (e.g., n moving from 3 to 2)

2.      strings that have the same number of characters matching the target: n*(c - 2) paths (e.g., one mismatched character changing to another, n staying at 3)

3.      strings that have fewer characters matching the target: (L - n)*(c - 1) paths (e.g., n moving from 3 to 4)

The probabilities are the same regardless of which n characters are referenced. For example, the two sequences below are mathematically identical regarding how many paths are towards and away from the target, so they will have the same probabilities of moving towards or away from the target.

XXTHINKS IT IS LIKE A WEASEL = METHINKS IT IS XIKE A WEASEX

This makes the problem simple to model because regardless of where the string starts, the number of cases that need probabilities is at most L+1, as n can range from 0 to L. All mutations are considered equally probable; the number of paths toward the target over the total number of possible paths {closer, same, farther} is the probability of moving closer. We can, therefore, calculate the probability of moving closer, staying the same distance, or moving farther from the target (i.e., nt+1 < nt, nt+1 = nt, nt+1 > nt) for each n <= L. This allows us to model the evolution of the string using a recombining lattice that evolves from an initial node, a standard tool for modeling option values.

At every row of the L+1 nodes in the lattice, we calculate the probabilities of moving up, across, and down via the following formulas.

1.      Prob(closer): n/(L*(c-1))

2.      Prob(same): n*(c-2)/(L*(c-1))

3.      Prob(farther): (L-n)*(c-1)/(L*(c-1))
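These three branch probabilities can be verified in a few lines of Python (a sketch of the counting argument, using the Weasel parameters L = 28 and c = 27); setting n = 2 reproduces the 2/728, 50/728, and 676/728 split:

```python
def branch_probs(n: int, L: int = 28, c: int = 27):
    """Probabilities that one random single-character change moves the
    string closer to, level with, or farther from the target, given n
    current mismatches."""
    total = L * (c - 1)             # all possible single-character changes
    closer = n                      # each mismatch has exactly one correct option
    same = n * (c - 2)              # mismatch -> a different wrong character
    farther = (L - n) * (c - 1)     # match -> any of the c-1 other characters
    assert closer + same + farther == total
    return closer / total, same / total, farther / total

print(branch_probs(2))  # ~ (0.0027, 0.0687, 0.9286)
```

Note that the three path counts sum to L*(c-1) for every n, which is why the denominator is the same in all three formulas.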

Figure 1 below shows a case with five rows representing a string of 4 characters. The columns represent generations defined by a change in one of the spaces in the sequence (a mutation). The bottom row of nodes reflects zero matches, and the top level is a complete match. In this case, the initial node on the left implies 2 mismatches out of 4. If the new mutation flips a mismatched position to the target, it is closer to the target and thus moves up one level, just below the top row; if it stays two away by having an incorrect letter changed to a different incorrect letter, it moves horizontally; if a correct letter changes to an incorrect letter, it moves down.

Figure 1

A complete match is a success, and we assume that once the sequence reaches the target, it does not regress (it is then subject to selection and fixates in the population). Thus, there is no path downward from the top row. While this is not realistic in practice, it inflates the total probability of reaching the target and does not affect the gist of my argument. The point is to highlight the relative importance of the direct path as opposed to the actual probability.

In Figure 2 below, we have a snip from a worksheet modeling the case where the starting string has two mismatched characters from the target sequence of 'methinks...' For the first mutation, there are 676 changes that move away from our target, 50 that maintain the same distance, and 2 that move toward the target, giving probabilities of 92.8%, 6.9%, and 0.3% for the node branches. At the upward node reached after step 1, the probability of going up again and reaching the target is 1/728, or 0.14%, so the probability of reaching the target in two mutations is 0.27% * 0.14%, or 3.77e-6.
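The lattice itself is easy to replicate outside a worksheet. Here is a Python reconstruction of the recombining lattice with an absorbing top node (my sketch of the worksheet logic, not the worksheet itself); it reproduces the 3.77e-6 direct-path probability and a cumulative probability near 4.2e-6 after 100 mutations:

```python
def lattice_first_hit(n0: int, steps: int, L: int = 28, c: int = 27):
    """Trinomial lattice over n = 0..L mismatches with an absorbing target
    at n = 0. Returns (first-arrival probability per step, cumulative)."""
    total = L * (c - 1)
    dist = [0.0] * (L + 1)       # dist[n] = P(currently n mismatches away)
    dist[n0] = 1.0
    node_prob = []
    for _ in range(steps):
        new = [0.0] * (L + 1)
        new[0] = dist[0]                                  # target is absorbing
        for n in range(1, L + 1):
            p = dist[n]
            if p == 0.0:
                continue
            new[n - 1] += p * n / total                   # closer
            new[n] += p * n * (c - 2) / total             # same distance
            if n < L:                                     # farther (0 paths at n = L)
                new[n + 1] += p * (L - n) * (c - 1) / total
        node_prob.append(new[0] - dist[0])                # newly absorbed mass
        dist = new
    return node_prob, dist[0]

node_prob, cum = lattice_first_hit(2, 100)
print(f"direct path (step 2): {node_prob[1]:.2e}")   # ~3.77e-6 = (2/728)*(1/728)
print(f"cumulative over 100 steps: {cum:.2e}")       # ~4.2e-6
```

The indirect paths add only ~10% to the direct-path probability over the first hundred mutations, which is the post's central point.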

Figure 2

The 'nodeProb' row represents the probability of reaching the target on that mutation, while the 'cum prob' row represents the cumulative sum of those nodes, given we assume reaching the target allows the organism to thrive with its new protein function.

The node probabilities asymptotically decline, so taking the hundredth 'nodeProb' as equal to all subsequent nodeProbs generates a conservative estimate of the number of generations (G) needed to double the direct path probability.

2*nodeProb(2) = cumProb(100) + G*nodeProb(100)

G = (2*nodeProb(2) - cumProb(100) )/nodeProb(100)

For this example

G = (2*3.77e-6 - 4.23e-6)/3.00e-37 ≈ 1.1e31

This implies at least 1e31 generations are needed for the cumulative probability to reach twice the direct-path probability. As estimates in this field have logarithmic standard errors (e.g., 1e-10 vs. 1e-11, as opposed to 1.0e-10 vs. 1.1e-10), a direct-path probability within a factor of 2 of the cumulative probability over 1e31 generations is effectively the same.
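The doubling calculation is plain arithmetic on the worksheet numbers quoted above; a two-line check in Python:

```python
# Worksheet figures quoted in the text
node_prob_2 = 3.77e-6     # direct-path first-arrival probability (step 2)
cum_prob_100 = 4.23e-6    # cumulative first-arrival probability through step 100
node_prob_100 = 3.00e-37  # first-arrival probability at step 100

# Generations needed for the cumulative probability to double the direct path,
# conservatively assuming every later step contributes nodeProb(100)
G = (2 * node_prob_2 - cum_prob_100) / node_prob_100
print(f"G ~ {G:.1e}")   # ~1.1e31 generations
```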

Modeling this with a lattice lets us see the relative importance of the direct path: over an infinite amount of time everything reaches the target, so the limiting probability is the same regardless of the starting point. With a lattice approach, we can model a finite case that is still large enough (e.g., 1e40 mutations) and compare the relative probabilities.

The probability of a direct path is approximately equal to a simultaneous path because, if we assume mutations obey a Poisson process, the probability of a simultaneous mutation is the same as sequential mutations, just one probability times the other. For example, the malarial parasite cycles through several generations during an infection, and the resistant parasite could acquire the necessary mutations in generations 3 and 5 or simultaneously in generation 3 (both would have the same probability).

Thus, Behe's hypothesis that the probability of reaching a target n mutations away scales with p^n is a reasonable estimate when intermediate steps are neutral.

The worksheet contains the Weasel program applied to starting points 2 and 3 characters away from the target. It generalizes straightforwardly to an arbitrary number n of needed mutations, 20 amino acids, and a protein of length L, as shown in the worksheet '12awayAminos.' Looking at the worksheet '2away' in the Excel workbook, the probability of reaching the target via a direct sequence is the probability of hitting the first top node. All the probabilities here are relative because this model assumes a mutation occurs somewhere along an existing string of amino acids, whereas mutations occur at a rate of only 1e-8 per nucleotide per generation in humans, so the extension to amino acid changes needs an extra adjustment (nucleotide changes do not necessarily change amino acids). The purpose of this post is just to show the relative probability of the direct and indirect paths, which is independent of the absolute probability.

There are an estimated 35 million single-nucleotide differences and 90 Mb of insertions and deletions between chimps and humans, creating 700 new genes not present in our last common ancestor. That implies at least 100 functional mutations fixed every generation over the past 6 million years. Genetic drift can generate the 100 newly fixed mutations, but this drift is random, and the probability that a random amino acid sequence would do something helpful is astronomically small (estimates range from 1e-11 to 1e-77, a range that highlights the bizarre nature of evolution debates). This creates a rift within the evolution community between those who emphasize drift and those who emphasize natural selection.2 This debate highlights that there are rational reasons to reject both; each side thinks the other side's mechanism cannot work, and I agree.

In Dawkins' Climbing Mount Improbable, he admits the de novo case is effectively impossible. He suggests indirect paths involving gene duplication, as the base structure provided by an existing gene would give the scaffolding needed for protein stability; then, one must merely nuance a few segments toward some new function. The above shows that if one needs a modest number of specific amino acids (e.g., a dozen), the probability of reaching such a target will be smaller than 1 in 1e100. For this copy-and-refine process to work, it requires a protein space with a dense set of targets (islands of functionality in the protein state space), which is possible but eminently debatable.

1

See the 2009 debate between Dennett and Christian philosopher Alvin Plantinga, especially around 64 minutes in.

2

The 1e-11 estimate applies to an 80-amino-acid string converting from weakly to strongly binding ATP, which is understandable given ATP binding is the most common protein binding task in a cell, as ATP powers everything. Further, this was studied in vitro, which is estimated to be 1e10 more likely to succeed than in vivo. The 1e-77 estimate is for beta-lactamase, an enzyme produced in bacteria with a specific function, unlike ATP binding, which is common. Other estimates include 1e-65 for cytochrome c, 1e-63 for the bacteriophage lambda repressor, and 1e-70 for a certain phage protein.