From crackpot Covid theorists to antivaxxers, hubris and fear haunt the wellness community

The randomness of illness is far too frightening for many to contemplate – so they rely on a fiction they’re special and can control their bodies
By Brigid Delaney
Sep 16 2021

It’s not unusual to meet people with alternative beliefs at a Sufi meditation course on a rooftop in Ubud. But when a mild, vaguely apologetic Australian woman in her mid-50s explained to me that she was responsible for her own breast cancer because she had repressed her needs and her sexuality and this repression had manifested itself as cancer in her breast, I thought: “Far out.”

It was 2014 and the woman had been staying at a retreat centre nearby, submitting to a range of alternative therapies in a last-ditch attempt to stay alive after her cancer had spread.

Her belief that she had caused her own cancer made her feel regretful and guilty, but conversely, as she explained, it also meant she might be able to reverse her diagnosis if she worked on her emotional problems.

That emotional problems caused physical illness was a common belief in the nascent wellness industry of the 1980s and 90s. Louise Hay, the megaselling author behind You Can Heal Your Life, pushed the line that various diseases signified a personal defeat – for example, rheumatoid arthritis meant “feeling victimized. Lack of love. Chronic bitterness. Resentment” and asthma was the result of “feeling stifled. Suppressed crying.”

Such obviously crackpot theories could be dismissed with a laugh, except Hay’s books were wildly popular, selling more than 30m copies worldwide.

Years later I still think of the terminally ill woman in Bali and feel angry that she had signed up for such bullshit – and that she had wasted what was likely to be the final months of her life blaming herself for her illness.

You don’t hear much about Louise Hay today, but trace elements of her philosophy survive in the wellness industry’s response to Covid.

There is the belief that we can control our bodies, and that the best defence against Covid is a powerful natural immune system, not a vaccine.

After resisting the notion that Covid is even real (the so-called “scamdemic” or “plandemic”), now those who push conspiracy theories are holding their nerve, this time arguing that vaccines are either dangerous, part of a plot by Big Pharma to increase profits, or not for well people who have strong immune systems.

When this corner of the wellness industry refuses to be vaccinated, it is not primarily out of fear of the vaccine’s side effects or because it was developed too quickly; more likely it comes from a place of arrogance: those who are well don’t need the vaccine because they have Rolls-Royce immune systems. On this view, the only people who get sick and die from Covid have a pre-existing illness, are in some way physically deficient, or have succumbed to the immune system-weakening emotion of fear.

A theory sharing some of these tenets found popular expression recently in the (now deleted) LinkedIn post of the head of a US salad chain, who said he was vaccinated himself and supportive of people being vaccinated: “78% of hospitalizations due to Covid are obese and overweight people. Is there an underlying problem that perhaps we have not given enough attention to? … no vaccine nor mask will save us.” That is, eat enough salad (take personal responsibility FFS!!) and you won’t need a vaccine. After a public backlash, he apologised, first to his staff and then in a new LinkedIn post.

In the unedited version of an interview with 60 Minutes posted to YouTube in 2020, chef Pete Evans, a notable antivaxxer, also touted the sovereignty of a pure immune system: “And am I fearful of Covid-19, if I came into contact with anybody [who has it]? No I’m not, because I believe in who I am and my ability to stay as healthy as I can through anything.”

More recently a Byron-based wellness influencer came under fire for a post published on the day of anti-lockdown protests that argued “Remember science is a THEORY, just like magic.”

Instead she advised followers to “look after your physical health and optimise your immune system with herbs, breathing exercises, organic foods. Start to grow your own food. Learn about soil.”

Dr James Rose, a social anthropologist at the University of Melbourne, told me last year that a sense of superiority can pervade the identity of conspiracy-based communities. He said identity can be reinforced when you position yourself above others “and Pete Evans and his community are very explicit about it – they believe they are better, more pure, that they are fitter and more active than other people.”

The feeling of superiority can also mean that people who attack the ideals of the group are dismissed as unevolved or “sheeple”, an arrogance which prevents meaningful debate or dissent.

But from Hay to the current crop of social media wellness influencers, there is a common thrum of neurosis underneath the bravado: that is the need to feel in control.


Evolving threat

New variants have changed the face of the pandemic. What will the virus do next?
By Kai Kupferschmidt
Aug 19 2021

A version of this story appeared in Science, Vol 373, Issue 6557.

Edward Holmes does not like making predictions, but last year he hazarded a few. Again and again, people had asked Holmes, an expert on viral evolution at the University of Sydney, how he expected SARS-CoV-2 to change. In May 2020, 5 months into the pandemic, he started to include a slide with his best guesses in his talks. The virus would probably evolve to avoid at least some human immunity, he suggested. But it would likely make people less sick over time, he said, and there would be little change in its infectivity. In short, it sounded like evolution would not play a major role in the pandemic’s near future.

“A year on I’ve been proven pretty much wrong on all of it,” Holmes says.

Well, not all: SARS-CoV-2 did evolve to better avoid human antibodies. But it has also become a bit more virulent and a lot more infectious, causing more people to fall ill. That has had an enormous influence on the course of the pandemic.

The Delta strain circulating now—one of four “variants of concern” identified by the World Health Organization, along with four “variants of interest”—is so radically different from the virus that appeared in Wuhan, China, in late 2019 that many countries have been forced to change their pandemic planning. Governments are scrambling to accelerate vaccination programs while prolonging or even reintroducing mask wearing and other public health measures. As to the goal of reaching herd immunity—vaccinating so many people that the virus simply has nowhere to go—“With the emergence of Delta, I realized that it’s just impossible to reach that,” says Müge Çevik, an infectious disease specialist at the University of St. Andrews.

Yet the most tumultuous period in SARS-CoV-2’s evolution may still be ahead of us, says Aris Katzourakis, an evolutionary biologist at the University of Oxford. There’s now enough immunity in the human population to ratchet up an evolutionary competition, pressuring the virus to adapt further. At the same time, much of the world is still overwhelmed with infections, giving the virus plenty of chances to replicate and throw up new mutations.

Predicting where those worrisome factors will lead is just as tricky as it was a year and a half ago, however. “We’re much better at explaining the past than predicting the future,” says Andrew Read, an evolutionary biologist at Pennsylvania State University, University Park. Evolution, after all, is driven by random mutations, which are impossible to predict. “It’s very, very tricky to know what’s possible, until it happens,” Read says. “It’s not physics. It doesn’t happen on a billiard table.”

Still, experience with other viruses gives evolutionary biologists some clues about where SARS-CoV-2 may be headed. The courses of past outbreaks show the coronavirus could well become even more infectious than Delta is now, Read says: “I think there’s every expectation that this virus will continue to adapt to humans and will get better and better at us.” Far from making people less sick, it could also evolve to become even deadlier, as some previous viruses including the 1918 flu have. And although COVID-19 vaccines have held up well so far, history shows the virus could evolve further to elude their protective effect—although a recent study in another coronavirus suggests that could take many years, which would leave more time to adapt vaccines to the changing threat.

Explaining the past

Holmes himself uploaded one of the first SARS-CoV-2 genomes to the internet on 10 January 2020. Since then, more than 2 million genomes have been sequenced and published, painting an exquisitely detailed picture of a changing virus. “I don’t think we’ve ever seen that level of precision in watching an evolutionary process,” Holmes says.

Making sense of the endless stream of mutations is complicated. Each is just a tiny tweak in the instructions for how to make proteins. Which mutations end up spreading depends on how the viruses carrying those tweaked proteins fare in the real world.

The vast majority of mutations give the virus no advantage at all, and identifying the ones that do is difficult. There are obvious candidates, such as mutations that change the part of the spike protein—which sits on the surface of the virus—that binds to human cells. But changes elsewhere in the genome may be just as crucial—yet are harder to interpret. Some genes’ functions aren’t even clear, let alone what a change in their sequence could mean. The impact of any one change on the virus’ fitness also depends on other changes it has already accumulated. That means scientists need real-world data to see which variants appear to be taking off. Only then can they investigate, in cell cultures and animal experiments, what might explain that viral success.

The most eye-popping change in SARS-CoV-2 so far has been its improved ability to spread between humans. At some point early in the pandemic, SARS-CoV-2 acquired a mutation called D614G that made it a bit more infectious. That version spread around the world; almost all current viruses are descended from it. Then in late 2020, scientists identified a new variant, now called Alpha, in patients in Kent, U.K., that was about 50% more transmissible. Delta, first seen in India and now conquering the world, is another 40% to 60% more transmissible than Alpha.

Read says the pattern is no surprise. “The only way you could not get infectiousness rising would be if the virus popped into humans as perfect at infecting humans as it could be, and the chance of that happening is incredibly small,” he says. But Holmes was startled. “This virus has gone up three notches in effectively a year and that, I think, was the biggest surprise to me,” Holmes says. “I didn’t quite appreciate how much further the virus could get.”
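The “three notches” can be made concrete by compounding the reported step-ups in transmissibility. This is only a rough sketch: the multiplier for D614G below is an assumption (the article says only “a bit more infectious”), and the Alpha and Delta figures are estimates with wide uncertainty.

```python
# Compound the reported transmissibility gains relative to the
# ancestral virus (illustrative only; real estimates are uncertain
# and context-dependent).
d614g_gain = 1.1                            # assumed: "a bit more infectious"
alpha_gain = 1.5                            # ~50% more transmissible than D614G
delta_low, delta_high = 1.4, 1.6            # 40-60% more transmissible than Alpha

low = d614g_gain * alpha_gain * delta_low
high = d614g_gain * alpha_gain * delta_high
print(f"Delta: roughly {low:.1f}x to {high:.1f}x the original virus")
```

Under these assumptions, Delta comes out at roughly 2.3 to 2.6 times as transmissible as the virus that emerged in Wuhan.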

Bette Korber at Los Alamos National Laboratory and her colleagues first suggested that D614G, the early mutation, was taking over because it made the virus better at spreading. She says skepticism about the virus’ ability to evolve was common in the early days of the pandemic, with some researchers saying D614G’s apparent advantage might be sheer luck. “There was extraordinary resistance in the scientific community to the idea this virus could evolve as the pandemic grew in seriousness in spring of 2020,” Korber says.

Researchers had never watched a completely novel virus spread so widely and evolve in humans, after all. “We’re used to dealing with pathogens that have been in humanity for centuries, and their evolutionary course is set in the context of having been a human pathogen for many, many years,” says Jeremy Farrar, head of the Wellcome Trust. Katzourakis agrees. “This may have affected our priors and conditioned many to think in a particular way,” he says.

Another, more practical problem is that real-world advantages for the virus don’t always show up in cell culture or animal models. “There is no way anyone would have noticed anything special about Alpha from laboratory data alone,” says Christian Drosten, a virologist at the Charité University Hospital in Berlin. He and others are still figuring out what, at the molecular level, gives Alpha and Delta an edge.

Alpha seems to bind more strongly to the human ACE2 receptor, the virus’ target on the cell surface, partly because of a mutation in the spike protein called N501Y. It may also be better at countering interferons, molecules that are part of the body’s viral immune defenses. Together those changes may lower the amount of virus needed to infect someone—the infectious dose. In Delta, one of the most important changes may be near the furin cleavage site on spike, where a human enzyme cuts the protein, a key step enabling the virus to invade human cells. A mutation called P681R in that region makes cleavage more efficient, which may allow the virus to enter more cells faster and lead to greater numbers of virus particles in an infected person. In July, Chinese researchers posted a preprint showing Delta could lead to virus levels in patient samples 1000 times higher than for previous variants. Evidence is accumulating that infected people not only spread the virus more efficiently, but also faster, allowing the variant to spread even more rapidly.

Deadly trade-offs

The new variants of SARS-CoV-2 may also cause more severe disease. For example, a study in Scotland found that an infection with Delta was about twice as likely to lead to hospital admission as one with Alpha.

It wouldn’t be the first time a newly emerging disease quickly became more serious. The 1918–19 influenza pandemic also appears to have caused more serious illness as time went on, says Lone Simonsen, an epidemiologist at Roskilde University who studies past pandemics. “Our data from Denmark suggests it was six times deadlier in the second wave.”

A popular notion holds that viruses tend to evolve over time to become less dangerous, allowing the host to live longer and spread the virus more widely. But that idea is too simplistic, Holmes says. “The evolution of virulence has proven to be quicksand for evolutionary biologists,” he says. “It’s not a simple thing.”

Two of the best studied examples of viral evolution are myxoma virus and rabbit hemorrhagic disease virus, which were released in Australia in 1960 and 1996, respectively, to decimate populations of European rabbits that were destroying croplands and wreaking ecological havoc. Myxoma virus initially killed more than 99% of infected rabbits, but then less pathogenic strains evolved, likely because the virus was killing many animals before they had a chance to pass it on. (Rabbits also evolved to be less susceptible.) Rabbit hemorrhagic disease virus, by contrast, got more deadly over time, probably because the virus is spread by blow flies feeding on rabbit carcasses, and quicker death accelerated its spread.

Other factors loosen the constraints on deadliness. For example, a virus variant that can outgrow other variants within a host can end up dominating even if it makes the host sicker and reduces the likelihood of transmission. And an assumption about human respiratory diseases may not always hold: that a milder virus—one that doesn’t make you crawl into bed, say—might allow an infected person to spread the virus further. In SARS-CoV-2, most transmission happens early on, when the virus is replicating in the upper airways, whereas serious disease, if it develops, comes later, when the virus infects the lower airways. As a result, a variant that makes the host sicker might spread just as fast as before.

Evasive measures

From the start of the pandemic, researchers have worried about a third type of viral change, perhaps the most unsettling of all: that SARS-CoV-2 might evolve to evade immunity triggered by natural infections or vaccines. Already, several variants have emerged sporting changes in the surface of the spike protein that make it less easily recognized by antibodies. But although news of these variants has caused widespread fear, their impact has so far been limited.


Google’s Project Taara Wirelessly Transmits 700TB Across a River in 20 Days

By Ryan Whitwam
Sep 16 2021

Google runs a plethora of aspirational projects to explore one moonshot or another, but only some become real products. The company’s Project Loon internet balloons didn’t make the cut, having shut down in early 2021. However, one aspect of Loon has lived on to become its own Googley project. Google says it has used the Free Space Optical Communications (FSOC) links developed for Project Loon to beam hundreds of terabytes of data nearly five kilometers, no wires necessary. 

Now under the purview of the company’s X labs, the little-known Project Taara is already enhancing connectivity in Kenya and India. Google says FSOC is essentially a fiber optic connection (up to 20 Gbps) without the wires, but it requires a direct line of sight. In Africa, Taara is now beaming data across the Congo River between Brazzaville in the Republic of Congo and Kinshasa in the Democratic Republic of Congo. After setting up the links over the past few years, Google is now sharing some of the project’s more impressive metrics.

Project Taara lead Baris Erkmen notes that Project Taara transmitted 700 TB over a recent 20-day period. This helped to back up wired connections in use by Google’s local partner Econet. Testing Taara in Africa makes sense because line-of-sight laser communication falls apart in a foggy locale like Google’s Bay Area home, and the fast-flowing Congo River has made connectivity in the region much more expensive. 

Even without a wire to insulate the optical signal from interference, Erkmen says the system had 99.9 percent uptime during that 20-day test. On the user side, there is no indication whether data passed through wires or FSOC nodes — Google aims to make the experience indistinguishable. The Taara links are placed high up, and they can automatically adjust their mirrors by up to five degrees to maintain a perfect connection. The system can hit a 5-centimeter target up to 10 kilometers away, according to Google. Taara has kept on ticking even in the face of inclement weather, flocks of birds, and other unexpected obstacles.
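As a sanity check on those figures, 700 TB moved over 20 days corresponds to an average sustained throughput of about 3.2 Gbps, comfortably below the stated 20 Gbps peak. A quick back-of-the-envelope sketch, assuming decimal terabytes:

```python
# Average sustained throughput for 700 TB transferred in 20 days.
data_bits = 700e12 * 8            # 700 TB (decimal) expressed in bits
seconds = 20 * 24 * 3600          # 20 days in seconds
avg_gbps = data_bits / seconds / 1e9
print(f"Average throughput: {avg_gbps:.2f} Gbps")  # → about 3.24 Gbps
```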


The Costs of 20 Years of War

Two decades of war caused the deaths of nearly 1 million people and will cost US taxpayers more than $8 trillion.
By Neta Crawford
Sep 2 2021

The United States reacted to the 9/11 attacks with a military mobilization of unprecedented cost. Over the past 20 years, the US military has spent or requested about $5.8 trillion in today’s dollars. Add in medical expenses and disability payments for veterans, which according to research by Harvard’s Linda Bilmes will likely exceed $2.2 trillion by 2050, and the total cost of two decades of war is more than $8 trillion. Included in these numbers are the $704 million in “death gratuities” that have been paid to the survivors of the 7,052 service members who were killed as well as payments to civilians who were injured and the families of civilians who were killed.

Every country goes to war believing that it can win and that it will do everything it can to protect its own soldiers and the lives of noncombatants. But when things go awry, increments of force are often added—or surged—on the theory that a few more troops will make the difference. The war continues, and the costs in blood and treasure go up.

Wars take a toll on politics, too. Military operations may be shrouded in well-intentioned but unnecessary secrecy, and mistakes are hidden or downplayed. Voices of caution are often ignored, derided, or silenced as citizens, the media, and decision-makers rally around the flag and defer to generals. The Costs of War Project hopes that this accounting, and our other work, will promote transparency and facilitate informed conversations about current and future wars.


How America’s Big Science Literacy Mistake Is Coming Back To Haunt Us

Without these two elements, we’re doomed to fail.
By Ethan Siegel
Sep 16 2021

In this day and age, it’s virtually impossible to have sufficient expertise to figure out what the complete, comprehensive, scientifically validated truth surrounding any issue is. Unless you yourself have spent many years studying, researching, and actively participating in furthering the scientific endeavor in a particular field, you can be certain — with an incredibly high degree of confidence — that your non-expertise will fundamentally limit the depth and breadth of your understanding. Put simply, your inexperience, relative to that of bona fide professionals, gives you too many blind spots that you yourself will be unaware of, to be able to distinguish what’s valid and conclusive from what’s not.

We have this persistent myth that’s been a part of society for a very long time: that if you just do your own “research,” figuring out what you’re capable of learning from reading and listening to other sources, you’ll be just as capable as the experts at distinguishing truths from falsehoods. That if you just learn enough of the relevant facts and apply your logic, intuition, and critical reasoning skills to any problem you encounter, you’ll be as scientifically literate as anyone, empowering you to make expert-level decisions as routinely as the experts themselves can.

This fundamental misunderstanding of what it means to be scientifically literate, and the accompanying, even if unintentional, devaluation of actual expertise, is in large part why so many of us mistrust and misunderstand science today. We can correct our course, but only if we understand what it actually means to be scientifically literate.

The common definition of science literacy. What does it mean to be scientifically literate? Most people use a simple proxy for measuring their science literacy, the same one they would use for measuring language literacy: their ability to answer a variety of questions about scientifically known facts and issues. If you understand the basics of things like:

• the law of gravitation,
• biological evolution,
• the geological layers of the Earth,
• the germ theory of disease,
• planetary motion,
• along with mathematical concepts like rates, ratios, and percentages,
you’re guaranteed to do well on any number of tests allegedly designed to measure scientific literacy.

When the results of these tests and surveys come back, they’re almost always accompanied by headlines deriding the state of science literacy where they’re measured. For example,

• some percentage of people don’t know that the Earth is round;
• some percentage of those who know that it’s round don’t know how to discern whether it’s round or flat for themselves;
• some percentage of people who can demonstrate that the Earth is, in fact, round are unable to take the further step of determining Earth’s circumference from their measurements.
For almost all of us, asking progressively more sophisticated questions like this will eventually reveal where our understanding ceases.

One common question that’s often asked in these surveys is whether the Earth revolves around the Sun or the Sun revolves around the Earth. Reliably, every time the question is asked, approximately 1 in 4 respondents get it wrong, answering that the Sun goes around the Earth instead of the other way around. But does this mean, as those reporting on it often imply, that Americans are hopeless, unable to learn and assimilate even the most basic facts about our physical reality?

Hardly. Just like IQ tests, SAT tests, or GRE tests, these types of tests measure one and only one thing: how well the people answering these questions perform at answering these questions. IQ tests don’t measure your intelligence; SATs don’t measure your scholastic aptitude, and GREs — both generally and in any particular subject — are a horrendous predictor of long-term career success.

What we’re using as a proxy for scientific literacy, the ability to answer a pre-chosen set of questions about the results of prior scientific inquiries, is woefully inadequate.

Why that definition fails us, every time. For starters, we’re not actually “doing science” when we’re making decisions or choices; we’re taking a shortcut. We’re using our current conception of the world and how it works — a conception that’s riddled with flaws, holes, and other blind spots — and basing our evaluation of scientific literacy on the ability to recall certain facts: facts that were most likely learned by rote. We aren’t figuring things out by putting the question to the laws of nature and listening to what nature tells us; we’re trying to remember the correct answer to questions that almost all of us have never, in fact, investigated ourselves.

Beyond that, even if we were to go away and perform the relevant experiments or acquire the relevant observations for ourselves, most of us would be woefully ill-equipped to draw the proper conclusions from the raw data. Most of us have no idea how to:

• properly calibrate or control those experiments,
• account for systematic uncertainties,
• follow the appropriate set of procedures for responsibly acquiring that data,
• or place those results in the context of all the other pieces of information that are relevant to the question we’re investigating within this particular field.
In short, the very fact that we’re lacking the required expertise to do this research prevents us from, in most cases, drawing valid conclusions even if we (think that we) understand the research for ourselves.

It’s tempting — just as Socrates was blamed for “corrupting the youth of Athens” through his teachings — to blame scientists and science educators for these perceived failings among the general populace. But such a line of thought is completely misguided, for a variety of reasons. For one, student performance often has very little to do with the instruction they receive. For another, long-term retention of facts learned in an academic setting has been demonstrated to be rare. And still another point is that people often decide what to believe for reasons other than scientific validity.

However, none of that actually gets to the core of the issue: that it’s an unreasonable expectation, even among scientists, to correlate answering a random set of questions about science correctly with one’s scientific literacy. These past 18 months have highlighted how poor nearly all of us are, ourselves, at:

• separating fact from fiction,
• choosing our experts wisely,
• drawing valid conclusions from the same raw data,
• understanding the full context in which certain studies are performed,
• following the full suite of data rather than the components that conform to our pre-existing biases,
• and, for lack of a better term, doing our own research.
Put simply, most of us are too underinformed to make an appropriately informed decision concerning many of the scientific issues facing us today. Even though the concept of “informed consent” is a pillar of medical care, being “informed” is much more about perception, in this context, than actually being adequately informed in any meaningful sense.

So, what is scientific literacy? Many important concepts in science, to put it bluntly, are beyond the ability of most people to master. That isn’t because they aren’t intelligent enough, but rather because most people are simply not going to put the requisite time and effort into learning how to properly conduct science in the particular field or sub-field they’re learning about. In order to actually become a competent scientist, years of specialized training are needed, and not only is it the case that innumerable new concepts need to be learned in gory detail, but an enormous number of misconceptions will need to be corrected in the process. Most new students, from the beginning of their undergrad education to the end of graduate school, require about a decade of full-time work in order to reach that point.

Rather than requiring that mastery to exist among everyone — a foolhardy and unattainable goal — a much more powerful measure of scientific literacy can be imparted to the general public through only two metrics:

• fostering an awareness of what the enterprise of science actually is,
• and fostering an appreciation for how applying the best known science to our societal problems positively impacts all of us.
As persuasively argued by Dr. Morris Shamos (mildly famous as the science advisor to Mr. Wizard) until his death in 2002, and then ignored by science educators everywhere, the combination of an awareness of and an appreciation for science would be transformative for our society.

At its very core, the enterprise of science is two things simultaneously, where neither one has any value without the other. On the one hand, science is the full suite of knowledge and data relevant to a particular issue: the cumulative answer to every aspect of one particular question, “what is true?” On the other hand, science is also a process for testing, inquiring, refining, and reproducing results that will probe and reveal further information about the Universe, beyond what is presently known. When one talks about “being aware” of the enterprise of science, it requires a recognition of your own incompetence in all areas compared to the greatest experts, including, if you yourself are a scientist, of your own ignorance and incompetence in various aspects of even your own field.

As far as an appreciation of science goes, consider your quality of life today, compared to that of your grandparents, your ancestors from centuries or millennia ago, or of a prehistoric human who might have lived 10,000+ years ago. Our ability to understand the environment around us, the laws of nature — including laws inherent to both the physical and life sciences — that all things obey, and to develop ubiquitous, life-improving technologies as a fruit of that knowledge, has led to so many of the modern advances we take for granted.

The electronic devices that permeate our society, the sophisticated medical care available to so many of us, and the improvements in our quality of life even over our lifetimes so far are undeniable. Science has made all of that possible, and to appreciate science is to appreciate the advances it’s brought into our lives.


To Survive Climate Change, We Need to Rebuild the World as We Know It

Extreme weather events like Hurricane Ida prove that we need to rethink everything we think we know about the built and natural worlds.
By Jake Bittle
Sep 6 2021

The evening of September 1, three days after Hurricane Ida made landfall in Louisiana, the remnants of the storm passed over New York City and dumped more than five inches of water on the metropolitan area in the span of a few hours. The resulting flood event stranded thousands of cars on the roadways, spewed water into countless basement apartments, and shut down almost the entire subway system. At least 13 people died in the city alone. 

According to data from NOAA, the Ida event qualified as at least a “hundred-year flood,” which means it has around a 1 percent chance of happening in any given year. Except the event came only a few weeks after another once-in-a-lifetime flood; that storm broke the one-hour record for precipitation in New York at 1.94 inches, but the Ida event shattered that record again, bringing 3.14 inches of rain to Central Park in a single hour.
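The “1 percent chance in any given year” framing compounds quickly over longer horizons: over a 30-year mortgage, the chance of seeing at least one hundred-year flood is about 26 percent. A quick sketch, assuming years are independent (an assumption climate change itself is undermining):

```python
# Probability of at least one "hundred-year" (1%-per-year) flood
# over a span of years, assuming each year is independent.
def prob_at_least_one(annual_p: float, years: int) -> float:
    return 1 - (1 - annual_p) ** years

print(f"{prob_at_least_one(0.01, 30):.0%}")  # ~26% over a 30-year mortgage
```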

Descriptors like “hundred-year flood” aren’t just terms of art. They reflect how we understand and interpret the natural world, and when we witness events like Elsa and Ida, we can be confident that our understanding is sorely out of date. If we want to catch up to disasters like these, we have to come to terms with the fact that we need nothing short of a radical, full-scale transformation of the world we live in; to survive, we may need to alter it beyond recognition. We have to rethink how we build, live, and do almost everything, and rebuild most of the physical environment we take for granted.

Many of us are wont to think of climate change as the kind of thing that happens in particular, defined areas: Fires burn in California; hurricanes strike Florida; heat waves scald the deserts of Arizona. If it has done nothing else, the past summer has shown that this framework is false. The temperate Pacific Northwest burned beneath a heat dome, an apocalyptic flood raged through the hills of Middle Tennessee, and the Caldor Fire jumped the Sierra Nevada and burned through Lake Tahoe.

The increase in extreme rainfall events is of particular note, since it threatens many of the northern cities that we have thought of as refuges from climate change. Warmer air can hold more moisture, which means that storms passing through that air can gather and release more rain. This phenomenon is behind the convective weather patterns known as “rain bombs,” and although the science is not yet definite, it appears to affect larger storm systems like Ida and the rainstorm that struck Tennessee.

The sheer territorial extent of areas vulnerable to extreme weather has dire consequences: It’s now difficult to know where and how to concentrate our resources, and almost impossible for the eventual victims to prepare for impact. The Federal Emergency Management Agency publishes a nationwide database of flood zone maps that purport to show which areas are vulnerable to flooding. If you look at the maps for New York City, you will see that they bear almost no resemblance to the places where flooding took place during Ida: elevated parts of Brooklyn like Bushwick, Bed Stuy, and Park Slope were buried beneath several feet of water. The folly of the flood maps is the same as the folly of the hundred-year flood: They’re both attempts to delimit and define a crisis that seems to have no real outer bound.

Imagine you wanted to prevent the kind of flooding that happened in New York City during Ida. What would you do? First, you would have to find a way to soak up some of the rainwater as it falls on the city streets, because concrete and asphalt are very bad at absorbing water. One way to do this would be to create thousands or perhaps tens of thousands of natural water sinks such as bioswales or grasslands, but you’d have to make sure there were several in every single neighborhood, and you’d have to find a place to put them that didn’t interfere with private property or public rights-of-way. You could also revamp and expand the city’s storm drain system, or create a pump system to match the robust one in a city like New Orleans, but those interventions would cost untold billions of dollars, and there’s no guarantee they could keep up with the flooding during a rain event like Ida.

Let’s move on to residential damage. If you can’t fix the problem of drainage and water absorption, you need to get people out of low-lying homes, which means you have to find a way to relocate many of the 100,000-plus people who live in illegal basement and sub-grade apartments. This would entail addressing the endemic shortage of housing that forced many low-income residents to live in such converted units in the first place. There are also a million or more residents who live in or near the coastal floodplain, making them vulnerable to storm surge events like Sandy, and you’d have to find new homes for them as well. Or you could mandate that buildings in the floodplain be elevated above a certain height, but what about all the houses that are already there?

Then there’s the transportation system. The city is lucky to have few major expressways that run below street grade, but even the elevated expressways turn into swimming pools during Ida-caliber rain events, so you’ll have to modify them to create adequate drainage, then find a way to keep all that excess water away from nearby homes and streets. The subway, too, is a natural destination for all that falling water, and in order to prevent major floods you will have to revamp the entire architecture of the underground train network. An estimated 20 percent of entrances are vulnerable to lethal flash flooding, so you’ll have to retrofit those at a cost of who knows how many billions of dollars. On a sunny day, the current system must pump around half a million gallons of water per hour in order to keep the stations dry; at its peak, the Ida remnants dropped around 4 million gallons of water on Brooklyn in the space of a single hour, eclipsing the pumps’ capacity roughly eightfold.
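The pump-capacity gap above can be checked with simple arithmetic. This sketch uses only the illustrative figures cited in the article, not an engineering model of the actual system:

```python
# Back-of-envelope comparison of the figures cited above.
pump_capacity_gal_per_hr = 500_000    # routine dry-day pumping rate for the subway
ida_peak_gal_per_hr = 4_000_000       # Ida's peak hourly rainfall over Brooklyn

# Ratio by which the storm's peak hourly rainfall exceeded pumping capacity.
shortfall_ratio = ida_peak_gal_per_hr / pump_capacity_gal_per_hr
print(shortfall_ratio)  # → 8.0, the "eightfold" gap
```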


SpaceX launches world’s first ‘amateur astronaut’ crew to orbit Earth

SpaceX launches world’s first ‘amateur astronaut’ crew to orbit Earth
Launch marks biggest advancement so far in space tourism as Elon Musk’s company conducts first chartered passenger flight
By Guardian Staff
Sep 15 2021

SpaceX has launched the world’s first crew of “amateur astronauts” on a private flight to circle Earth for three days.

Wednesday night’s successful launch marked the most ambitious leap yet in space tourism. It’s the first chartered passenger flight for Elon Musk’s space company and the first time a rocket streaked toward orbit with a crew that contained no professional astronauts.

“It blows me away, honestly,” the SpaceX director, Benji Reed, said on the eve of launch from Nasa’s Kennedy Space Center. “It gives me goosebumps even right now to talk about it.”

Leading the flight is Jared Isaacman, 38, who made his fortune with a payment-processing company he started in his teens. Isaacman is the third billionaire to launch this summer, following flights by Virgin Galactic’s Richard Branson and Blue Origin’s Jeff Bezos in July.

Isaacman is joined by Hayley Arceneaux, 29, a childhood cancer survivor who works as a physician assistant at St Jude Children’s Research Hospital in Memphis, Tennessee. Isaacman has pledged $100m of his own money to the hospital and is seeking another $100m in donations.

Also along for the ride are the sweepstakes winners Chris Sembroski, 42, a data engineer in Everett, Washington, and Sian Proctor, 51, a community college educator in Tempe, Arizona.

Arceneaux is set to become the youngest American in space and the first person in space with a prosthesis, a titanium rod in her left leg.

The passengers will spend three days orbiting Earth at an unusually high altitude of 357 miles (575km) – 100 miles (160km) higher than the International Space Station – before splashing down off the Florida coast this weekend.

While Nasa has no role in the process, its managers and astronauts are rooting for the flight, dubbed Inspiration4.

“To me, the more people involved in it, whether private or government, the better,” said the Nasa astronaut Shane Kimbrough, who is nearing the end of his six-month space station stay.

Isaacman, an accomplished pilot, persuaded SpaceX to take the Dragon capsule higher than it’s ever been. Initially reluctant because of the increased radiation exposure and other risks, SpaceX agreed after a safety review.

“Now I just wish we pushed them to go higher,” Isaacman told reporters on the eve of the flight. “If we’re going to go to the moon again and we’re going to go to Mars and beyond, then we’ve got to get a little outside of our comfort zone and take the next step in that direction.”

Isaacman is picking up the entire tab for the flight but won’t say how many millions he paid.

Though the capsule is automated, the four Dragon riders spent six months training for the flight to cope with any emergency. That training included centrifuge and fighter jet flights, launch and re-entry practice in SpaceX’s capsule simulator and a grueling trek up Washington’s Mount Rainier in the snow.

Four hours before liftoff, the four emerged from SpaceX’s huge rocket hangar, waving and blowing kisses to their families and company employees, before they were driven off to get into their sleek white flight suits. Once at the launch pad, they posed for pictures and bumped gloved fists, before taking the elevator up. Proctor danced as she made her way to the hatch.

SpaceX’s next private trip, early next year, will see a retired Nasa astronaut escorting three wealthy businessmen to the space station for a weeklong visit. The Russians are launching an actor, film director and a Japanese tycoon to the space station in the next few months.


The History of Publishing Is a History of Racial Inequality

The History of Publishing Is a History of Racial Inequality
A conversation with Richard Jean So about combining data and literary analysis to understand how the publishing industry came to be dominated by white writers.
By Rosemarie Ho
May 27 2021

Late last year, The New York Times published an opinion piece that illustrated an uncomfortable fact: The vast majority of American authors published after World War II have been white. This should not be a surprise to anyone who pays attention to contemporary literature, but the voluminous data included in the piece proved shocking even to the worst of pessimists. Between 1950 and 2018, the authors found, 95 percent of books published by major firms like Penguin Random House and Simon and Schuster were written by white people. The gap had barely narrowed by 2018, when white people wrote 89 percent of the books published that year. And in 2020, only 10 percent of the books on the New York Times best-seller list were written by people of color. The piece promptly went viral.

The main researcher behind the piece is Richard Jean So, an assistant professor of English and cultural analytics at McGill University whose research has culminated in Redlining Culture: A Data History of Racial Inequality and Postwar Fiction. Employing computational analysis and close reading, the book argues that the history of postwar American publishing is one of white stasis, where only a select few people of color are given opportunities to be published and promoted as whiteness shifts and reinforces its hegemony over the literary sphere. So only examined literary fiction from this period, but the numbers are bleak. Between 1950 and 2000, 97 percent of the novels published by Random House were written by white people, 90 percent of the novels reviewed in major periodicals were by white people, and 91 percent of major book prizes were awarded to white authors.

But the book goes one step further, arguing that this inertia is measurable through computational analysis of the form and content of these novels. At every level of publishing, So charges, you can find the unmistakable domination of whiteness, and this ought to make everyone, especially literary scholars, question received narratives about the success of multiculturalism in American arts and letters. Earlier this year we talked to So about Redlining Culture. This conversation has been edited and condensed for clarity.

—Rosemarie Ho

Rosemarie Ho: One of the findings in your book that I found really interesting was that postwar US fiction written by white authors is generally “a reflexive…narration of what it’s like to be a white person writing novels in the time of a growing racial diversity,” preferring “adverbs and modal verbs, indicating a version of reality defined by qualification and conditionality.” You also compare this to the corpus of Black literature written in the same period, and found that the latter is more distinguished by lexical diversity and dialogue. Can you talk more about your methodological approach? Can we really measure literary whiteness?

Richard Jean So: One thing I really wanted to do is try to get an account of cultural or literary whiteness: What does that really look like? I think sociologists have done this more through case studies—through interviews, they’ve noticed some traits—but I want to do this more at scale. I quantify that in different ways. I’m using two methods: one that looks at the way that white versus Black writers tend to describe white people, and I find a set of collocations or terms that tend to be used to describe people in stories. The second, slightly different method compares novels written by white authors, primarily best sellers and prize winners, versus novels written by Black writers. The two analyses converge on the same thing, which is: There’s a kind of recursiveness inherent to white texts. Which is to say whiteness is very interested in itself.

The major takeaway is that we can use machines to identify a white cultural voice. I think that’s important because people might believe one doesn’t exist or might think that’s too abstract. Whereas I want to say like, no, it’s real; whiteness is a kind of culture. It has a relationship to blackness. Sara Ahmed argues that a big part of whiteness is narcissism. I think that the data really bears that out—that white narcissism is really a thing that has certain qualities and a historical trajectory. And when we think about the cultural field, like best sellers, prizewinners—they also have a distinct white voice that’s very different from Blackness. And what really defines both of those categories is their whiteness, and how they, in some ways, are anti-Black in terms of their style, or their form, or the things that they care about. That’s something that we might have intuited before, but we weren’t quite clear about that.

RH: How broad can we take these claims to be?

RJS: All we can really know is based on the data that I have. So I would say [the book] has a pretty good account of Random House after the war, which is super similar to other publishing houses, but it’s super different from small publishers, like Grove or Graywolf, that stress diversity. But people need to do more research, build different kinds of models, use different methods. The one thing I am confident about is the amount of racism and the dominance of whiteness, and I’d be surprised if someone found something really different. I think it’s more like nuancing the results; if someone found two thousand novels of Black sci-fi, you know, it might be different. I’m not trying to get the last word—this is a foundation for more research.

RH: The book invokes economic language and measurement to think through the problem of the literary publishing industry as being extremely white. Writers of color, and Black writers in particular, are culturally redlined from opportunities and resources that their white counterparts have in abundance. What I thought was interesting is that the book doesn’t really look at the economics of publishing—how much money is spent on an average ad campaign or book advance at Random House, for example, for a white best-selling author versus a writer of color in the 1970s. Is there a reason why this is, other than the fact that it’s hard to collate all those contracts?

RJS: That data is super hard to get, though we’re working on some stuff right now—we’re trying to get all that advance information. This question of capitalism and how it interacts with questions of diversity is very interesting to me. I mean, publishers didn’t really give me any data, and they definitely were not going to give me sales data. I didn’t want to ignore the world of commerce, so I just had proxies in my book. Best sellers are kind of a proxy for making money; book reviews are kind of a proxy, since you want to get prizes and reviews in The New York Times and stuff. I just have these proxies for being successful because I think writers also don’t just write to make money. They want to be recognized, right?

This kind of cuts to another question about publishing, which is the fact that houses are so secretive about sales information, and they themselves don’t keep very good records. There’s always this mythology around publishing, which is most publishers don’t do it for money. But it’s important because writers need to make a living! I couldn’t resolve that for the book. I didn’t know how to talk about commerce in publishing, and then diversity. But someone else should do it. That’d be really good.

I do speculate that more conglomeration seems to hurt minority writers, because what really sells books for publishers is the mid-list, like genre fiction, romance, and so on. That’s where it’s really hard for minority writers to break in. In literary fiction you’ll have about one super famous literary writer [of color], but then romance will be like 100 percent white. The latter’s actually the engine of what makes them money. I just couldn’t include that in the book.

RH: Switching track a little bit—there’s a bit of scholarly debate in the book over whether academia has become much more sensitive to issues of race. Part of the argument that might prove more incendiary to scholars who read this book is the claim that the language of critical theory, and more specifically its overt focus on rhetorical moves and language, obscures the fact that racial inequality is still a burning trash-can fire in American history. I don’t know if that’s necessarily true for Asian American studies specifically, or for scholars who don’t work overtly in literary studies but are reading and writing about minority literatures nonetheless.

RJS: I tried to frame it in a way that was not incendiary! I’m trying to build bridges, you know, not to antagonize people. But you’re right. Just to reiterate the argument: I end the book after showing this, like, insane, pervasive racial inequality, by pointing out this puzzle: When you look at Norton anthologies, which are proxies for what’s being taught, anthologies became 60 percent white, compared to 89 percent previously. Something was happening in the university, syllabi were being diversified, people were really getting interested in talking about race. And yet when I look at a huge corpus of 60,000 academic articles in this period that discuss race, there’s a real disconnect between how people are not really talking about the structural inequality in the literary field as it’s happening at the time, and the attention paid to race. I do some analysis—even this keyword “inequality” is not really one of the words we really use.

I think you’re right. People in ethnic studies are using “inequality”; they do care about this. But I still believe, through my analysis, that the attention is much more toward rhetorical, discursive type things, not necessarily material conditions. Certainly disciplines like cultural sociology are looking at material production. But because no one’s grinding the numbers, people just did not realize how insane it was, because we focus so much on individual success examples. People actually thought things were really changing because Toni Morrison was super successful in the late ’80s. I think it’s an oversight. It’s not a real damning critique of scholars who do this work; I just think that no one ran the numbers, which actually are surprising.

There was a massive change in the university in the ’80s and ’90s, but I do think we might have overstated its impact—that suddenly you look around and everyone’s teaching Black literature in your department and across the nation. That created to a certain degree this illusion of a broader success of multiculturalist society. The publishing industry itself didn’t really change that much in terms of racial representation in the kinds of stories it was publishing. I don’t talk about this much in the book, but my final comment about this is, this explosion of white nationalism that we’re dealing with now should not be surprising, if you always knew how supposedly (white) liberals actually think about rightwing cultural forms.


Apple co-founder Steve Wozniak announces private space venture ‘unlike the others’

[Note:  This item comes from reader Randall Head.  DLH]

Apple co-founder Steve Wozniak announces private space venture ‘unlike the others’
By John Loeffler
Sep 13 2021

Apple co-founder Steve Wozniak and Ripcord Inc. founder Alex Fielding announced a new space company, Privateer, which aims to “keep space safe and accessible for all humankind.”

Wozniak, who co-founded Apple with Steve Jobs in April 1976, is no stranger to launching ventures of all sorts, so a private space company isn’t totally out of step for him. That said, there’s no indication yet of what Privateer actually aims to achieve.

The company does plan to provide more details in private group sessions at the upcoming AMOS conference this week, presumably with potential investors, according to Space Explored.

The new company did release a trailer of sorts announcing itself, which may provide some clue as to its purpose: footage of past rocket launches and spaceflights, something that looks like a burning crash-test dummy with a missing arm, a rotating space station, and an assortment of fire-streaked asteroids threatening to ruin someone’s day.

Analysis: what the heck is Woz getting up to now?

On the one hand, another private space company entering the fray is getting so passé that it barely qualifies as news at all. 

Private space operations are starting to heat up, with several new startups like Firefly Aerospace coming onto the scene in recent years aimed at providing low cost, small satellite launch services for the broader public beyond just government agencies, various militaries, and the handful of private businesses who already have a presence in low-earth orbit.

On the other hand, Wozniak is involved with this one, which does raise our level of interest. What makes his involvement notable is that Wozniak has a reputation for being a tech geek’s tech geek. While Steve Jobs gets the lion’s share of the accolades as Apple’s pioneer-in-chief (not undeservedly), Wozniak has always been the one backstage with the soldering iron and screwdriver over an open computer case.

While Steve Jobs sold the public on the Apple II personal computer and revolutionized the world, Wozniak was the chief designer of the computer itself, and has gone on in the years since to become something of a public advocate for issues like net neutrality, right to repair, and open-access technology. He is most definitely not the “Money Guy” in the room, and in the private space venture sector, the money guys dominate. We have no clue what Woz is up to, but we are most definitely intrigued.

Re: Prisons and epidemics

[Note:  This comment comes from friend David P. Reed.  DLH]

From: “David P. Reed” <>
Subject: RE: [Dewayne-Net] Prisons and epidemics
Date: September 13, 2021 at 6:48:28 PM EDT

Dewayne – to give some context to this JAMA study, let me point this out:

I’ve been reading Michael Lewis’s book The Premonition, which I recommend to anyone interested in a deeper understanding of how messed up the US Public Health Services are, deeply and fundamentally. I mean seriously messed up, resulting in hundreds of thousands of preventable deaths in the current pandemic that could have been managed by containment. I can’t summarize it here. But one point is clear:

Long before COVID-19, back in the GW Bush and Obama administrations, some doctors in the government *knew* that schools and prisons and nursing homes were critical points that had to be addressed very early in a potential pandemic, to have any hope of containing a pandemic before it became uncontainable. And early in January 2020, these and other doctors had the data to extrapolate that COVID-19 was likely to kill vast numbers of Americans (and in the world, many more) unless it was contained early.

Yet the official position of the CDC doctors, not a political position but the opinion of medical experts throughout the CDC, was that non-pharmacological interventions (NPIs) were utterly pointless in an epidemic. So much so that senior medical officers there told public health officials in all the states that NPIs were to be discouraged as a waste of resources, and that the only possible approach was to wait for a vaccine to be developed; nothing else would be useful. (The CDC also decided that no one should test patients unless they were on respirators or otherwise severely infected.)

This is only a small part of the book’s extensive reporting on what actually happened behind the scenes throughout the US health systems – not just the political appointees, but the careerists who had been there for a long time.

So the point with respect to this article is that there have NOT been a lot of studies of places like prisons and schools and their role in pandemic or epidemic spread. This is both a cause and effect of the CDC’s unwillingness to believe that NPI was valuable in the crisis of a pandemic.

This should have been studied years ago. After 1918, we should have studied what to do about pandemics when there is no magic-bullet cure or shield, and what to do when the spread pattern is exponential, as it is for all airborne infectious diseases. Doctors should have been taught that R0 isn’t a function of the virus alone, but also of a society’s behavioral choices, of medical testing, and so on.

So why is this study just coming out NOW? Well, it is a profound failure of US and World Public Health systems that such research was deemed essentially worthless by the leadership of the US medical research institutions.

It’s not that we have lacked the technology to do such research until today; indeed, we now have far better (genomically based) technology for tracking exactly how a virus like SARS-CoV-2 spreads from person to person, by tracking genome changes between successive rounds of the infected subgroup.

I had thought myself, a year ago, that such knowledge would surely have been known, at least at the CDC. But apparently the CDC actually has been profoundly uninterested in dealing with pandemics for decades. At least that is what Michael Lewis’s book shows.

Also, Anthony Fauci heads the National Institute of Allergy and Infectious Diseases, charged with doing research on *infectious diseases*. You’d think that this organization would have done or sponsored such research. But again, you’d be surprised. They are only interested in vaccines and drug-based treatments that might be developed by Big Pharma. Which might explain why Fauci never really supported doing significant testing or containment or NPIs with respect to COVID-19 throughout 2020, either in public or inside the government.

It wasn’t just Donald Trump (who didn’t lead). It’s actually, apparently, that the entire government public health apparatus has been asleep at the switch for decades. They didn’t have the ability or know-how. Now Fauci eventually could take credit for a vaccine, but it’s actually unclear how much he led even that effort – Moderna actually got its major funding push from DoD, and Pfizer funded its development mostly out of pocket, as I understand it.

As we go forward, whatever happens with COVID-19, it’s important to capture this context so that Americans will understand how big a failure happened in 2020, how many Americans have died unnecessarily as a result, and rather than allowing it to be covered up by a government propaganda move to say “Mission Accomplished – 2021”, we actually learn lessons in preparedness for such crises.

That’s why this study is heartening. We should have done it long ago. We could have done it long ago. But at least we are doing it today. We haven’t, however, done any comparable on-site studies of school airborne viral transmission. Schools are the single largest tool we have to contain a pandemic before it becomes catastrophic.

From: “David S. H. Rosenthal” <>
Subject: Prisons and epidemics
Date: September 13, 2021 at 10:18:27 AM EDT

“This cohort study provides one such comparative analysis, suggesting that
government implementation of emergent measures—such as nursing home and
prison visitation restrictions, school closures, mask mandates, and jail
decarceration—are important for effective epidemic mitigation. Furthermore,
its findings reflect that epidemic control depends not only on emergent responses
but also on longer-term policy determinants of public health vulnerability.
Specifically, our results suggest that the globally unparalleled system of mass
incarceration in the US, which is known to incubate infectious diseases and to
spread them to broader communities, puts the entire country at distinctive
epidemiologic risk. This study is thus consistent with existing expert consensus16
that public investment in a national program of large-scale decarceration and
reentry support is an essential policy priority for reducing racial inequality and
improving US public health and safety, pandemic preparedness, and biosecurity.”