Hurricane Ian is no anomaly. The climate crisis is making storms more powerful

Hurricane Ian is no anomaly. The climate crisis is making storms more powerful
By Michael E Mann and Susan Joy Hassol
Sep 30 2022
https://www.theguardian.com/commentisfree/2022/sep/30/hurricane-ian-climate-crisis-no-anomaly-storms-more-powerful

Climate change once seemed a distant threat. No more. We now know its face, and all too well. We see it in every hurricane, torrential rainstorm, flood, heatwave, wildfire and drought. It’s even detectable in our daily weather. Climate disruption has changed the background conditions in which all weather occurs: the oceans and air are warmer, there’s more water vapor in the atmosphere and sea levels are higher. Hurricane Ian is the latest example.

Ian made landfall as one of the five most powerful hurricanes in recorded history to strike the US and, with its 150mph winds at landfall, tied 2004’s Hurricane Charley as the strongest ever to hit the west coast of Florida. In isolation, that might seem like something we could dismiss as an anomaly or fluke. But it’s not – it’s part of a larger pattern of stronger hurricanes, typhoons and superstorms that have emerged as the oceans continue to set record levels of warmth.

Many of the storms of the past five years – Harvey, Maria, Florence, Michael, Ida and Ian – aren’t natural disasters so much as human-made disasters, whose amplified ferocity is fueled by the continued burning of fossil fuels and the increase in heat-trapping carbon pollution, a planet-warming “greenhouse gas”.

This Atlantic hurricane season, although it started out slow, has heated up, thanks to unusually warm ocean waters. Fiona hit Puerto Rico as a powerful category 4 storm, and hundreds of thousands of people there are still without power. The storm barreled on into the open Atlantic, eventually making landfall in the Maritime provinces to become Canada’s strongest storm on record. Then came Ian, which feasted on a deep layer of very warm water in the Gulf of Mexico.

Human-caused warming is not just heating the surface of the oceans; the warmth is diffusing down into the depths of the ocean, leading to year after year of record ocean heat content. That means that storms are less likely to churn up colder waters from below, inhibiting one of the natural mechanisms that dampen strengthening. It also leads to the sort of rapid intensification we increasingly see with these storms, where they balloon into major hurricanes in a matter of hours.

Too often we still hear, even from government scientists, the old saw that we cannot link individual hurricanes to climate change. There was a time when climate scientists believed that to be true. But they don’t any more. We have developed powerful tools to attribute the degree to which global warming affects extreme events. One study found, for example, that the devastating flooding from Hurricane Florence as it made landfall in North Carolina four years ago was as much as 50% greater and 80km (50 miles) larger due to the warmer ocean.

We can also draw upon basic physics, as we explained in Scientific American in 2017. Warmer oceans mean more fuel to strengthen hurricanes, with an average increase in the wind speeds of major hurricanes of about 18mph for each 1C (1.8F) of ocean surface warming, a roughly 13% increase. Since the power of a storm increases as roughly the third power of the wind speed, not merely the square, that amounts to a roughly 44% increase in the destructive potential of these storms.
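A quick back-of-the-envelope check of that figure (a sketch assuming the cubic scaling of destructive potential with wind speed described above):

$$P \propto v^{3} \;\Rightarrow\; \frac{P_{\text{new}}}{P_{\text{old}}} = (1.13)^{3} \approx 1.44,$$

so a roughly 13% increase in wind speed translates into roughly a 44% increase in destructive potential.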

There is also evidence that human-caused warming is increasing the size of these storms. All else being equal, larger storms pile up greater amounts of water, leading to larger storm surges like the 12 to 18 feet estimated for Ian in some locations. Add sea level rise, and that’s the better part of a foot of additional coastal flooding baked into every single storm surge. If humanity continues to warm the planet, and destabilize the Greenland and West Antarctic ice sheets, we could see yards, not feet, of eventual sea-level rise. Think of that as a perpetual coastal flooding event.

Then there is the flooding rainfall, like the 20 inches (50cm) we’re seeing across a large swath of Florida with Ian. Simple physics tells us that the amount of moisture that evaporates off the ocean into the atmosphere increases about 7% for each 1C of ocean surface warming. That means 7% more moisture to turn into flooding rains. But that’s not the whole story. Stronger storms can entrain more moisture into them – a double whammy that produced the record flooding we saw in Philadelphia a year ago with Hurricane Ida, and the flooding we saw with Harvey in Texas in 2017 and Florence in the Carolinas in 2018, the two worst flooding events on record in the US.
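For readers curious where that 7% figure comes from, it follows from the Clausius-Clapeyron relation for saturation vapor pressure (a rough sketch using typical surface values for the latent heat of vaporization, the water-vapor gas constant, and temperature):

$$\frac{1}{e_s}\frac{de_s}{dT} = \frac{L_v}{R_v T^{2}} \approx \frac{2.5\times10^{6}\ \mathrm{J\,kg^{-1}}}{(461\ \mathrm{J\,kg^{-1}\,K^{-1}})(288\ \mathrm{K})^{2}} \approx 0.065\ \mathrm{K^{-1}},$$

i.e. roughly 6 to 7% more water vapor for each degree Celsius of warming.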

Tampa’s wide, shallow coastal shelf and low topography, combined with rising sea levels and vulnerable infrastructure, make it particularly susceptible to a landfalling major hurricane. Tampa Bay has dodged multiple bullets in recent years in the form of major hurricanes that ultimately weakened or swerved away from the city. Ian is the latest example, as it passed to the east rather than to the west of Tampa Bay, sparing the sprawling urban population a devastating storm surge that would have flooded the homes of millions.

Unfortunately, Tampa’s luck will eventually run out. We must prepare for the inevitable calamity that will occur when the city is at the receiving end of a losing roll of the weather dice.

It is important to take steps to increase resilience and adapt to the changes that are inevitable, taking all of the precautions we can to spare our coasts from the devastating consequences of sea-level rise combined with stronger, more damaging hurricanes. But no amount of adaptation can shield Florida, or anywhere else, from the devastating consequences of the continued warming of our planet.

[snip]

The Pandemic’s Legacy Is Already Clear

The Pandemic’s Legacy Is Already Clear
All of this will happen again.
By Ed Yong
Sep 30 2022
https://www.theatlantic.com/health/archive/2022/09/covid-pandemic-exposes-americas-failing-systems-future-epidemics/671608/

Recently, after a week in which 2,789 Americans died of COVID-19, President Joe Biden proclaimed that “the pandemic is over.” Anthony Fauci described the controversy around the proclamation as a matter of “semantics,” but the facts we are living with can speak for themselves. COVID still kills roughly as many Americans every week as died on 9/11. It is on track to kill at least 100,000 a year—triple the typical toll of the flu. Despite gross undercounting, more than 50,000 infections are being recorded every day. The CDC estimates that 19 million adults have long COVID. Things have undoubtedly improved since the peak of the crisis, but calling the pandemic “over” is like calling a fight “finished” because your opponent is punching you in the ribs instead of the face.

American leaders and pundits have been trying to call an end to the pandemic since its beginning, only to be faced with new surges or variants. This mindset not only compromises the nation’s ability to manage COVID, but also leaves it vulnerable to other outbreaks. Future pandemics aren’t hypothetical; they’re inevitable and imminent. New infectious diseases have regularly emerged throughout recent decades, and climate change is quickening the pace of such events. As rising temperatures force animals to relocate, species that have never coexisted will meet, allowing the viruses within them to find new hosts—humans included. Dealing with all of this again is a matter of when, not if.

In 2018, I wrote an article in The Atlantic warning that the U.S. was not prepared for a pandemic. That diagnosis remains unchanged; if anything, I was too optimistic. America was ranked as the world’s most prepared country in 2019—and, bafflingly, again in 2021—but accounts for 16 percent of global COVID deaths despite having just 4 percent of the global population. It spends more on medical care than any other wealthy country, but its hospitals were nonetheless overwhelmed. It helped create vaccines in record time, but is 67th in the world in full vaccinations. (This trend cannot solely be attributed to political division; even the most heavily vaccinated blue state—Rhode Island—still lags behind 21 nations.) America experienced the largest life-expectancy decline of any wealthy country in 2020 and, unlike its peers, continued declining in 2021. If it had fared as well as just the average peer nation, 1.1 million people who died last year—a third of all American deaths—would still be alive.

America’s superlatively poor performance cannot solely be blamed on either the Trump or Biden administrations, although both have made egregious errors. Rather, the new coronavirus exploited the country’s many failing systems: its overstuffed prisons and understaffed nursing homes; its chronically underfunded public-health system; its reliance on convoluted supply chains and a just-in-time economy; its for-profit health-care system, whose workers were already burned out; its decades-long project of unweaving social safety nets; and its legacy of racism and segregation that had already left Black and Indigenous communities and other communities of color disproportionately burdened with health problems. Even in the pre-COVID years, the U.S. was still losing about 626,000 people more than expected for a nation of its size and resources. COVID simply toppled an edifice whose foundations were already rotten.

In furiously racing to rebuild on this same foundation, America sets itself up to collapse once more. Experience is reputedly the best teacher, and yet the U.S. repeated mistakes from the early pandemic when faced with the Delta and Omicron variants. It got early global access to vaccines, and nonetheless lost almost half a million people after all adults became eligible for the shots. It has struggled to control monkeypox—a slower-spreading virus for which there is already a vaccine. Its right-wing legislators have passed laws and rulings that curtail the possibility of important public-health measures like quarantines and vaccine mandates. It has made none of the broad changes that would protect its population against future pathogens, such as better ventilation or universal paid sick leave. Its choices virtually guarantee that everything that’s happened in the past three years will happen again.

The U.S. will continue to struggle against infectious diseases in part because some of its most deeply held values are antithetical to the task of besting a virus. Since its founding, the country has prized a strain of rugged individualism that prioritizes individual freedom and valorizes self-reliance. According to this ethos, people are responsible for their own well-being, physical and moral strength are equated, social vulnerability results from personal weakness rather than policy failure, and handouts or advice from the government are unwelcome. Such ideals are disastrous when handling a pandemic, for two major reasons.

First, diseases spread. Each person’s choices inextricably affect their community, and the threat to the collective always exceeds that to the individual. The original Omicron variant, for example, posed slightly less risk to each infected person than the variants that preceded it, but spread so quickly that it inundated hospitals, greatly magnifying COVID’s societal costs. To handle such threats, collective action is necessary. Governments need policies, such as vaccine requirements or, yes, mask mandates, that protect the health of entire populations, while individuals have to consider their contribution to everyone else’s risk alongside their own personal stakes. And yet, since the spring of 2021, pundits have mocked people who continue to think this way for being irrational and overcautious, and government officials have consistently framed COVID as a matter of personal responsibility.

Second, a person’s circumstances always constrain their choices. Low-income and minority groups find it harder to avoid infections or isolate when sick because they’re more likely to live in crowded homes and hold hourly-wage jobs without paid leave or the option to work remotely. Places such as prisons and nursing homes, whose residents have little autonomy, became hot spots for the worst outbreaks. Treating a pandemic as an individualist free-for-all ignores how difficult it is for many Americans to protect themselves. It also leaves people with vulnerabilities that last across successive pathogens: The groups that suffered most during the H1N1 influenza pandemic of 2009 were the same ones that took the brunt of COVID, a decade later.

America’s individualist bent has also shaped its entire health-care system, which ties health to wealth and employment. That system is organized around treating sick people at great and wasteful expense, instead of preventing communities from falling sick in the first place. The latter is the remit of public health rather than medicine, and has long been underfunded and undervalued. Even the CDC—the nation’s top public-health agency—changed its guidelines in February to prioritize hospitalizations over cases, implicitly tolerating infections as long as hospitals are stable. But such a strategy practically ensures that emergency rooms will be overwhelmed by a fast-spreading virus; that, consequently, health-care workers will quit; and that waves of chronically ill long-haulers who are disabled by their infections will seek care and receive nothing. All of that has happened and will happen again. America’s pandemic individualism means that it’s your job to protect yourself from infection; if you get sick, your treatment may be unaffordable, and if you don’t get better, you will struggle to find help, or even anyone who believes you.

In the late 19th century, many scholars realized that epidemics were social problems, whose spread and toll are influenced by poverty, inequality, overcrowding, hazardous working conditions, poor sanitation, and political negligence. But after the advent of germ theory, this social model was displaced by a biomedical and militaristic one, in which diseases were simple battles between hosts and pathogens, playing out within individual bodies. This paradigm conveniently allowed people to ignore the social context of disease. Instead of tackling intractable social problems, scientists focused on fighting microscopic enemies with drugs, vaccines, and other products of scientific research—an approach that sat easily with America’s abiding fixation on technology as a panacea.

The allure of biomedical panaceas is still strong. For more than a year, the Biden administration and its advisers have reassured Americans that, with vaccines and antivirals, “we have the tools” to control the pandemic. These tools are indeed effective, but their efficacy is limited if people can’t access them or don’t want to, and if the government doesn’t create policies that shift that dynamic. A profoundly unequal society was always going to struggle with access: People with low incomes, food insecurity, eviction risk, and no health insurance struggled to make or attend vaccine appointments, even after shots were widely available. A profoundly mistrustful society was always going to struggle with hesitancy, made worse by political polarization and rampantly spreading misinformation. The result is that just 72 percent of Americans have completed their initial course of shots and just half have gotten the first of the boosters necessary to protect against current variants. At the same time, almost all other protections have been stripped away, and COVID funding is evaporating. And yet the White House’s recent pandemic-preparedness strategy still focuses heavily on biomedical magic bullets, paying scant attention to the social conditions that could turn those bullets into duds.

Technological solutions also tend to rise into society’s penthouses, while epidemics seep into its cracks. Cures, vaccines, and diagnostics first go to people with power, wealth, and education, who then move on, leaving the communities most affected by diseases to continue shouldering their burden. This dynamic explains why the same health inequities linger across the decades even as pathogens come and go, and why the U.S. has now normalized an appalling level of COVID death and disability. Such suffering is concentrated among elderly, immunocompromised, working-class, and minority communities—groups that are underrepresented among political decision makers and the media, who get to declare the pandemic over. Even when inequities are highlighted, knowledge seems to suppress action: In one study, white Americans felt less empathy for vulnerable communities and were less supportive of safety precautions after learning about COVID’s racial disparities. This attitude is self-destructive and limits the advantage that even the most privileged Americans enjoy. Measures that would flatten social inequities, such as universal health care and better ventilation, would benefit everyone—and their absence harms everyone, too. In 2021, young white Americans died at lower rates than Black and Indigenous Americans, but still at three times the rate of their counterparts in other wealthy countries.

[snip]

Something Strange Happens When You Tear These Creatures Apart

Something Strange Happens When You Tear These Creatures Apart
Behold choanoflagellates, tiny creatures that can be one body and many bodies all at once.
By Katherine J. Wu
Sep 28 2022
https://www.theatlantic.com/science/archive/2022/09/choanoflagellates-multicellularity-individuality-origins-of-animal-life/671588/

When people die, our whole body dies with us. The heart stops pumping; the gut stops digesting; every cell that carries a person’s genetic blueprint eventually extinguishes, until their molecular signature is extinct. This is the curse of humans’—really, most animals’—multicellular makeup: The cells within our bodies are so specialized, so interdependent, that their fates are lashed together even in death.

Multicellularity does not have to manifest this way, however. Just a hop, skip, and a jump over from us on the tree of life are the choanoflagellates—little marine and freshwater creatures roughly the size of yeast. Choanoflagellates commonly appear as single cells with a long, whipping tail, a bulbous head, and a frilly collar, resembling, as my colleague Ed Yong has memorably described it, “a sperm wearing a skirt.” But under the right conditions, choanoflagellates can also bloom into many-celled bodies, joining individual cells together into single entities that, with some squinting and imagination, bear curious resemblances to the bodies of animals. Their bodies, like ours, are usually genetically identical all the way through; their bodies, like ours, can bend and flex, as if composed of muscles in motion. Their bodies, like ours, can even harbor tiny communities of bacteria that may help them survive.

When blown apart or sliced, though, choanoflagellate bodies don’t bleed or collapse into fleshy bits, like ours do. They disaggregate back into single cells, each now free to wander away from its former compatriots and go off on its own. There’s no theoretical limit to this resiliency, experts told me: Were a predator to consume 99 percent of a choano coalition, whatever single cells remained could persist—the rough equivalent of a single human finger, left over from an explosion, crawling away to start existence anew.

In a single lifetime, a choanoflagellate can “completely change its way of interacting with the environment, in truly fundamental ways,” says Nicole King, a choanoflagellate biologist at UC Berkeley. It can shift its means of being in this world. This flexible form of multicellularity isn’t one that the animal lineage kept, but its existence could reveal a lot about our origins all the same. The study of these little creatures has helped reshape how humans conceive of complex bodies, even what it means to be an individual—a notion that gets challenged every time a cell successfully separates itself from the body it once belonged to.

For any creature that can pull it off, managing multicellularity comes with obvious perks. Bigger bodies move more rapidly, use nutrients more efficiently, and more easily resist life’s stressors; they’re harder for predators to swallow and better at chasing down prey. In the course of evolution, multicellularity has proved to be such a boon that it’s thought to have arisen up to 25 separate times—maybe more—in the past 800 million years or so, begetting today’s fauna, flora, fungi, and more in all their wild and wondrous forms. But the modern members of the animal lineage—millions and millions of species of them that flit and fly and sprint and swim and wriggle and crawl—can all trace their origins to a singular uni-to-multi switch. “Everyone agrees that multicellularity in animals evolved once,” says Pawel Burkhardt, a neurobiologist at the University of Bergen, in Norway.

Just how our ancestors pulled this off, however, remains a big mystery. Choanoflagellates offer a crucial clue. The creatures are widely considered to be the closest living unicellular relatives of animals: a sister twig on the tree of life that grew up alongside ours. That positioning makes choanoflagellates one of the best modern glimpses of the branch from which the animal lineage once sprang, says Flora Rutaganira, a biologist at Stanford University.

More than 100 species of choanoflagellates have been identified so far by scientists. As far as experts can tell, the creatures are quite content to remain on their own in many circumstances. In their default unicellular state, they spend their days swimming and grazing on bacteria; when it’s time to make more of themselves, single choanos double in size, then split cleanly in two. Sometimes, though, choanos decide that a lonesome life is not enough. Instead of fully separating after division, newborn cells that might have once meandered off stay tethered together. In the species Salpingoeca rosetta, even the cells’ innards may interconnect, a bit like an umbilical cord between parent and child that never gets fully cut.

By repeating this process, choanoflagellate colonies can swell to contain dozens, even hundreds, of cells, King told me, and take on a menagerie of shapes. As Thibaut Brunet, a biologist at the Pasteur Institute, in Paris, has found, some, like the acrobatic Choanoeca flexa, can assemble into cup-shaped colonies when exposed to copious light; a plunge into darkness, meanwhile, prompts the groups to invert so that their wiggly ends face out instead of in, a conformation that makes it easier for the coterie to swim. Other species, including S. rosetta, blossom into the lumpy rosettes that give them their name when in the presence of certain types of bacteria. Take a careful gander at some of these orblike colonies, King told me, and they might look a shade familiar: Roughly, thematically, they look almost like animal embryos, ballooning outward into bodies, ready to be born.

Choanoflagellate colonies aren’t really built to last. In the laboratory, scientists can disperse colonies by shaking them or starving them, even squeezing them through a tight space. What was once a body then “just disintegrates” into its cellular components, Burkhardt told me, as if it was never whole. This toggling is an eerie ability, and a powerful one. As great as big bodies can be, they’re also cumbersome, especially when food becomes scarce. Single cells are easier to sate with limited nutrients and reproduce more rapidly; they’re more adaptable to changing conditions, because they don’t have to wait for dozens of their comrades to “come to consensus,” Rutaganira told me. Earth’s mass-extinction events have disproportionately impacted large animals, while sparing the speedy and small, says Pedro Márquez-Zacarías, an evolutionary biologist at the Santa Fe Institute.

Such flexible strategies might be an odd way to think about multicellularity, at least for humans, whose notions hew to the traits of our own bodies: stable, codependent, composed of cells that can survive only if they’re a part of a larger whole. That’s how King first conceived of the concept when she started her lab more than a decade ago. Now, though, “I’ve seen that multicellularity exists along a continuum,” she told me. Cooperation can manifest in a multitude of ways, from transient affiliations—cellular small talk—to permanent mutual reliance.

Humans have had their notions of independence challenged before. Super-social creatures such as bees and naked mole rats, for instance, live in such tight-knit familial societies that they can operate only as a collective; animal guts harbor bustling communities of microbes that evolve alongside their hosts. (Coincidentally, or not so coincidentally, certain choanos also seem capable of housing a bespoke microbiome while in colony form, as a team now being led by UC Berkeley’s Alain Garcia de las Bayonas has found.) It is not unusual to turn many into one. But choanos are among the creatures that flip that narrative and interrogate the worlds that exist within us. When in colonies, they are individuals made up of individuals; when they fragment, they turn one into many. Choanoflagellates’ definition of self “can exist at multiple stacking levels, like Russian dolls,” says María Rebolleda-Gómez, a biologist at UC Irvine. As long as natural selection and evolution can act on an entity—a group, a creature, a cell, a gene—there is arguably an individual lurking within.

[snip]

Is This the Beginning of the End of the Internet?

[Note:  This item comes from friend Tim Pozar.  DLH]

Is This the Beginning of the End of the Internet?
How a single Texas ruling could change the web forever
By Charlie Warzel
Sep 28 2022
https://www.theatlantic.com/ideas/archive/2022/09/netchoice-paxton-first-amendment-social-media-content-moderation/671574/

Occasionally, something happens that is so blatantly and obviously misguided that trying to explain it rationally makes you sound ridiculous. Such is the case with the Fifth Circuit Court of Appeals’s recent ruling in NetChoice v. Paxton. Earlier this month, the court upheld a preposterous Texas law stating that online platforms with more than 50 million monthly active users in the United States no longer have First Amendment rights regarding their editorial decisions. Put another way, the law tells big social-media companies that they can’t moderate the content on their platforms. YouTube purging terrorist-recruitment videos? Illegal. Twitter removing a violent cell of neo-Nazis harassing people with death threats? Sorry, that’s censorship, according to Andy Oldham, a judge of the United States Court of Appeals and the former general counsel to Texas Governor Greg Abbott.

A state compelling social-media companies to host all user content without restrictions isn’t merely, as the First Amendment litigation lawyer Ken White put it on Twitter, “the most angrily incoherent First Amendment decision I think I’ve ever read.” It’s also the type of ruling that threatens to blow up the architecture of the internet. To understand why requires some expertise in First Amendment law and content-moderation policy, and a grounding in what makes the internet a truly transformational technology. So I called up some legal and tech-policy experts and asked them to explain the Fifth Circuit ruling—and its consequences—to me as if I were a precocious 5-year-old with a strange interest in jurisprudence.

Techdirt founder Mike Masnick, who has been writing for decades about the intersection of tech policy and civil liberties, told me that the ruling is “fractally wrong”—made up of so many layers of wrongness that, in order to fully comprehend its significance, “you must understand the historical wrongness before the legal wrongness, before you can get to the technical wrongness.” In theory, the ruling means that any state in the Fifth Circuit (such as Texas, Louisiana, and Mississippi) could “mandate that news organizations must cover certain politicians or certain other content” and even implies that “the state can now compel any speech it wants on private property.” The law would allow both the Texas attorney general and private citizens who do business in Texas to bring suit against the platforms if they feel their content was removed because of a specific viewpoint. Daphne Keller, the director of the Program on Platform Regulation at Stanford’s Cyber Policy Center, told me that such a law could amount to “a litigation DDoS [Denial of Service] attack, unleashing a wave of potentially frivolous and serious suits against the platforms.”

To give me a sense of just how sweeping and nonsensical the law could be in practice, Masnick suggested that, under the logic of the ruling, it very well could be illegal to update Wikipedia in Texas, because any user attempt to add to a page could be deemed an act of censorship based on the viewpoint of that user (which the law forbids). The same could be true of chat platforms, including iMessage and Reddit, and perhaps also Discord, which is built on tens of thousands of private chat rooms run by private moderators. Enforcement at that scale is nearly impossible. This week, to demonstrate the absurdity of the law and stress test possible Texas enforcement, the subreddit r/PoliticalHumor mandated that every comment in the forum include the phrase “Greg Abbott is a little piss baby” or be deleted. “We realized what a ripe situation this is, so we’re going to flagrantly break this law,” a moderator of the subreddit wrote. “We like this Constitution thing. Seems like it has some good ideas.”

Everyone I spoke with believes that the very future of how the internet works is at stake. Accordingly, this case is likely to head to the Supreme Court. Part of this fiasco touches on the debate around Section 230 of the Communications Decency Act, which, despite its political-lightning-rod status, makes it extremely clear that websites have editorial control. “Section 230 tells platforms, ‘You’re not the author of what people on your platform put up, but that doesn’t mean you can’t clean up your own yard and get rid of stuff you don’t like.’ That has served the internet very well,” Dan Novack, a First Amendment attorney, told me. In effect, it allows websites that host third-party content to determine whether they want a family-friendly community or an edgy and chaotic one. This, Masnick argued, is what makes the internet useful, and Section 230 has “set up the ground rules in which all manner of experimentation happens online,” even if it’s also responsible for quite a bit of the internet’s toxicity.

But the full editorial control that Section 230 protects isn’t just a boon for giants such as Facebook and YouTube. Take spam: Every online community—from large platforms to niche forums—has the freedom to build the environment that makes sense to them, and part of that freedom is deciding how to deal with bad actors (for example, bot accounts that spam you with offers for natural male enhancement). Keller suggested that the law may have a carve-out for spam—which is often filtered because of the way it’s disseminated, not because of its viewpoint (though this gets complicated with spammy political emails). But one way to look at content moderation is as a constant battle for online communities, where bad actors are always a step ahead. The Texas law would kneecap platforms’ abilities to respond to a dynamic threat.

“It says, ‘Hey, the government can decide how you deal with content and how you decide what community you want to build or who gets to be a part of that community and how you can deal with your bad actors,’” Masnick said. “Which sounds fundamentally like a totally different idea of the internet.”

“A lot of people envision the First Amendment in this affirmative way, where it is about your right to say what you want to say,” Novack told me. “But the First Amendment is just as much about protecting your right to be silent. And it’s not just about speech but things adjacent to your speech—like what content you want to be associated or not associated with. This law and the conservative support of it shreds those notions into ribbons.”

The implications are terrifying and made all the worse by the language of Judge Oldham’s ruling. Perhaps the best example of this brazen obtuseness is Oldham’s argument about “the Platforms’ obsession with terrorists and Nazis,” concerns that he suggests are “fanciful” and “hypothetical.” Of course, such concerns are not hypothetical; they’re a central issue for any large-scale platform’s content-moderation team. In 2015, for example, the Brookings Institution issued a 68-page report titled “The ISIS Twitter census,” mapping the network of terrorist supporters flooding the platform. The report found that in 2014, there were at least 46,000 ISIS accounts on Twitter posting graphic violent content and using the platform to recruit and collect intelligence for the Islamic State.

I asked Masnick whether he felt that Oldham’s ruling was rooted in a fundamental misunderstanding of the internet, or whether it was more malicious—a form of judiciary trolling resulting from former President Donald Trump getting kicked off of Twitter.

He likened the ruling to this past summer’s Dobbs v. Jackson Women’s Health Organization, which overturned Roe v. Wade and took away Americans’ constitutional right to an abortion. “You had 50 years of conservative activists pushing for the overturning of Roe, but this Texas ruling actually goes against almost everything the conservative judicial activists have worked for for decades,” Masnick said. “You have Citizens United, Hobby Lobby, the [Masterpiece Cakeshop] case, which are all complicated, but at the core, they are rooted in how to conceive of First Amendment rights. And in all cases, the conservative justices on the Supreme Court have been all about the right to expand First Amendment rights inside organizations, especially the right to exclude.”

If the case ends up before the Supreme Court, many of the justices would have to decide against their priors in order to uphold the Texas law. Specifically, Justice Brett Kavanaugh would need to directly contradict his opinion in Manhattan Community Access Corp. v. Halleck, a case where Kavanaugh clearly argued that private forums have First Amendment rights to editorial discretion.

Keller, of Stanford’s Cyber Policy Center, has tried to game out future scenarios, such as social networks having a default non-moderated version that might quickly become unusable, and a separate opt-in version with all the normal checks and balances (terms-of-service agreements and spam filters) that sites have now. But how would a company go about building and running two versions of the same platform at once? Would the Chaos Version run only in Texas? Or would companies try to exclude Texas residents from their platforms?

“You have potential situations where companies would have to say, ‘Okay, we’re kicking off this neo-Nazi, but he’s allowed to stay on in Texas,’” Masnick said. “But what if the neo-Nazi doesn’t live in Texas?” The same goes for more famous banned users, such as Trump. Do you ban Trump’s tweets in every state except Texas? It seems almost impossible for companies to comply with this law in a way that makes sense. The more likely reality, Masnick suggests, is that companies will be unable to comply and will end up ignoring it, and the Texas attorney general will keep filing suit against them, causing more simmering resentment among conservatives against Big Tech.

What is the endgame of a law that is both onerous to enforce and seemingly impossible to comply with? Keller offered two theories: “I think passing this law was so much fun for these legislators, and I think they might have expected it would get struck down, so the theater was the point.” But she also believes that there is likely some lack of understanding among those responsible for the law about just how extreme the First Amendment is in practice. “Most people don’t realize how much horrible speech is legal,” she said, arguing that historically, the constitutional right has confounded logic on both the political left and right. “These legislators think that they’re opening the door to some stuff that might offend liberals. But I don’t know if they realize they are also opening the door to barely legal child porn or pro-anorexia content and beheading videos. I don’t think they’ve understood how bad the bad is.”

[snip]

Thousands were released from prison during covid. The results are shocking.

[Note:  This item comes from friend David Rosenthal.  DLH]

Thousands were released from prison during covid. The results are shocking.
By Molly Gill
Sep 29 2022
https://www.washingtonpost.com/opinions/2022/09/29/prison-release-covid-pandemic-incarceration/

A jaw-dropping new statistic shows that we are keeping many people in prison even though they pose no danger to the public. It is time to rethink our incarceration policies for those with a low risk of reoffending.

To protect those most vulnerable to covid-19 during the pandemic, the Cares Act allowed the Justice Department to order the release of people in federal prisons and place them on home confinement. More than 11,000 people were eventually released. Of those, the Bureau of Prisons (BOP) reported that only 17 of them committed new crimes.

That’s not a typo. Seventeen. That’s a 0.15 percent recidivism rate in a country where it’s normal for 30 to 65 percent of people coming home from prison to reoffend within three years of release.
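For reference, the arithmetic behind that rate (a rough check against the roughly 11,000 releases cited above):

$$\frac{17}{11{,}000} \approx 0.0015 = 0.15\%.$$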

Most of those 17 new offenses were for possessing or selling drugs or other minor crimes. Only one was violent (an aggravated assault), and none were sex offenses.

This extremely low recidivism rate shows there are many, many people in prison we can safely release to the community. These 11,000 releases were not random. People in low- and minimum-security prisons or at high risk of complications from covid were prioritized for consideration for release.

Except for people convicted of some offenses, such as sex offenses, no one was automatically barred from consideration because of their crime, sentence length or time served. The BOP instead assessed each eligible person individually, looking at their prison disciplinary record, any violent or gang-related conduct and their risk to the public.

The agency allowed a person’s release if they had a home to go to and would be able to weather all the burdens of home confinement. Home confinement requires people to wear an ankle monitor with GPS tracking, stay home except when given permission to leave for things such as work or doctor’s appointments and remain drug- and crime-free. No one was simply released onto the street without support or supervision.

The Cares Act policy teaches us that many of our prison sentences are unnecessarily lengthy. People who commit crimes should be held accountable, and that might include serious time in prison. Many of the people released to home confinement had years or even decades left to serve on their sentences. But they changed in prison and are no longer a danger to others, as the new data confirms.

Releases to home confinement were also focused on two groups of people who pose little to no risk to public safety: the elderly and the ill (i.e., those most likely to face serious covid complications). Study after study confirms that people become less likely to reoffend as they get older. America’s elderly prison population is growing rapidly, because of our use of lengthy prison terms.

People with serious chronic illnesses or physical disabilities are another group who can be safely released from long sentences. They are not dangerous, but their increased medical needs make them exponentially more expensive to incarcerate. Taxpayers aren’t getting much public safety bang for their buck when we incarcerate bedridden people.

[snip]

The Search for Intelligent Life Is About to Get a Lot More Interesting

[Note:  This item comes from friend David Rosenthal.  DLH]

The Search for Intelligent Life Is About to Get a Lot More Interesting
There are an estimated 100 billion galaxies in the universe, home to an unimaginable abundance of planets. And now there are new ways to spot signs of life on them.
By Jon Gertner
Sep 18 2022
https://www.nytimes.com/2022/09/15/magazine/extraterrestrials-technosignatures.html

When the space shuttle Atlantis lifted off from the Kennedy Space Center on Oct. 18, 1989, it carried the Galileo spacecraft in its cargo bay. Arrayed with scientific instruments, Galileo was ultimately bound for Jupiter, where it would spend years in orbit collecting data and taking pictures. After it left the shuttle, though, Galileo headed in the other direction, turning toward the sun and circling around Venus, in order to slingshot around the planet and pick up speed for its journey to the outer solar system. Along the way, it flew around Earth too — twice, in fact, at altitudes of 597 and 188 miles. This gave its engineering team an opportunity to test the craft’s sensors. The astronomer Carl Sagan, a member of Galileo’s science team, called the maneuver the first flyby in our planet’s history. It also allowed him to contemplate what a spacecraft might find when looking at a far-off planet for signs of intelligent life.

There was plenty to see. Our technology creates an intriguing mess. Lights blaze, and heat islands glow in paved-over urban areas. Atmospheric gases ebb and flow — evident today not only in rising concentrations of carbon dioxide and methane, but also in clouds of floating industrial byproducts. Sometimes there are radiation leaks. And all the while, billions of gadgets and antennas cast off a buzzing, planetary swarm of electromagnetic transmissions.

Would other planets’ civilizations be like ours? Would they create the same telltale chemical and electromagnetic signs — what scientists have recently begun calling technosignatures — that Galileo detected? The search for intelligence beyond Earth has long been defined by an assumption that extraterrestrials would have developed radio technologies akin to what humans have created. In some early academic papers on the topic, dating to the late 1950s, scientists even posited that these extraterrestrials might be interested in chatting with us. “That played into this whole idea of aliens as salvation — you know, aliens were going to teach us things,” Adam Frank, an astrophysicist at the University of Rochester, told me recently. Frank points out that the search for signals from deep space has, over time, become more agnostic: Rather than looking for direct calls to Earth, telescopes now sweep the sky, searching billions of frequencies simultaneously, for electronic signals whose origins can’t be explained by celestial phenomena. At the same time, the search for intelligent life has turned in a novel direction.

In 2018, Frank attended a meeting in Houston whose focus was technosignatures. The goal was to get the 60 researchers in attendance to think about defining a new scientific field that, with NASA’s help, would seek out signs of technology on distant worlds, like atmospheric pollution, to take just one example. “That meeting in Houston was the dawn of the new era, at least as I saw it,” Frank recalls. NASA has a long history of staying out of the extraterrestrial business. “Everybody was sort of there with wide eyes — like, ‘Oh, my God, is this really happening?’”

The result, at least for Frank, has been a new direction for his work, as well as some money to fund it. He and a few astronomy colleagues around the country formed the group Categorizing Atmospheric Technosignatures, or CATS, which NASA has since awarded nearly $1 million in grants. The ambition for CATS is to create a “library” of possible technosignatures. In short, Frank and his colleagues are researching what could constitute evidence that technological civilization exists on other planets. At this stage, Frank stresses, his team’s work is not about communicating with aliens; nor is it meant to contribute to research on extraterrestrial radio transmissions. They are instead thinking mainly about the atmospheres of distant worlds, and what those might tell us. “The civilization will just be doing whatever it’s doing, and we’re making no assumptions about whether anybody wants to communicate or doesn’t want to communicate,” he says.

This line of inquiry might not have been productive just a few years ago. But several advances have made the search for technosignatures feasible. The first, thanks to new telescopes and astronomical techniques, is the identification of planets orbiting distant stars. As of August, NASA’s confirmed tally of such exoplanets was 5,084, and the number tends to grow by several hundred a year. “Pretty much every star you see in the night sky has a planet around it, if not a family of planets,” Frank says; he notes that this realization has only taken hold in the past decade or so. Because there are probably at least 100 billion stars in the Milky Way galaxy, and an estimated 100 billion galaxies in the universe, the potential candidates for life — as well as for civilizations that possess technology — may involve numbers almost too large to imagine. Perhaps more important, our tools keep getting better. This summer, the first pictures from the new James Webb Space Telescope were released. But several other powerful ground- and space-based instruments are being developed that will allow us to view exceedingly distant objects for the first time or view previously identified objects in novel ways. “With things like J.W.S.T. and some of the other telescopes, we’re beginning to be able to probe atmospheres looking for much smaller signals,” Michael New, a NASA research official who attended the 2018 Houston conference, told me. “And this is something we just couldn’t have done before.”

As Frank puts it, more bluntly: “The point is, after 2,500 years of people yelling at each other over life in the universe, in the next 10, 20 and 30 years we will actually get data.”

In July, when NASA released the first batch of images from the Webb telescope, we could glimpse remote corners of the universe with newfound clarity and beauty — a panorama of “cosmic cliffs,” 24 trillion miles tall, constructed from gas and dust, for instance. The images were stunning but also bewildering; they defied description. What could we even compare them to? Webb was reaching farther in distance and into the past than any telescope before it, collecting light from stars that in some cases required more than 13 billion years to reach us. We will need to acclimate ourselves to the task of constantly looking at — and interpreting — things we’ve never seen before.

The Webb telescope can look near as well as far. During its first year, about 7 percent of its time will be spent observing our own solar system, according to Heidi B. Hammel, an interdisciplinary scientist who worked on the telescope’s development. Webb can analyze the atmospheres of nearby planets like Jupiter and Mars using its infrared sensors. These capabilities can also be directed at some of the closest Earth-size exoplanets, like those surrounding the small Trappist-1 star, 40 light-years away.

One goal of that focus is to discern a biosignature — that is, an indication that life exists (or has existed) on those worlds. On Earth, a biosignature might be the discarded shell of a clam, the fallen feather of a bird, a fossilized fern embedded in sedimentary rock. On an exoplanet, it might be a certain ratio of gases — oxygen, methane, H₂O and CO₂, say — that suggest the presence of microbes or plants. Nikole Lewis, an associate professor of astronomy at Cornell University whose team has been approved for 22.5 hours of Webb observation time this year to look at Trappist-1e, one of seven planets circling the Trappist-1 star, told me that well before declaring the discovery of a biosignature, she would have to carefully determine the planet’s atmosphere and potential habitability. “First, we have to find out if there’s air,” she says, “and then we can ask, ‘OK, what’s in the air?’” She estimates that it would take three or more years of observing a system to be able to say there’s a biosignature.

Biosignatures and technosignatures point the same way: toward life. But for now, they are being pursued by two separate scientific communities. One reason is historical: The study of biosignatures — which began in the 1960s, within the new discipline of exobiology — has been receiving support from NASA and academic institutions for decades. But “technosignature” was coined only recently, in 2007, by Jill Tarter, a pioneering figure in astronomy who has spent her career conducting searches for alien transmissions. Jason Wright, a professor of astronomy and astrophysics at Penn State who is a member of Frank’s CATS group, says he thinks of Tarter’s idea as a “rebranding” of the search for extraterrestrial intelligence, which has long been relegated to the scientific fringe. “When Jill coined the phrase,” Wright told me, “she was trying to emphasize that NASA was looking for microbes and slime and atmospheric biosignatures, but technosignatures were really under the same umbrella.” Any search for biosignatures on a distant planet, Wright contends, would logically overlap the search for technosignatures, once it became time to explain unusual observations. Does a telescopic reading suggest a life-sustaining atmosphere? Or is it possibly a sign of technology, too? Scientists looking for biosignatures, in other words, may encounter marks of technology as well.

Wright, Frank and the rest of the CATS team are thus interested in atmospheric markers that would probably never occur naturally. One recent group paper, for example, written primarily by Jacob Haqq-Misra, a CATS member who works at the nonprofit Blue Marble Space Institute of Science, considers how the presence of chlorofluorocarbons, an industrial byproduct, would give a distinct spectral signal and could be picked up by Webb. Haqq-Misra was also the first author on a recent paper suggesting that an exoplanet with agriculture — “exofarms” — might emit telltale atmospheric emissions. Another paper, one written mainly by Ravi Kopparapu, a CATS member who works at NASA’s Goddard Space Flight Center, makes the case that the emission of nitrogen dioxide, an industrial byproduct, could signal the existence of alien technology. Those emissions might be observable by a NASA space telescope, known as LUVOIR (Large Ultraviolet Optical Infrared Surveyor), that is slated to be deployed after 2040. These scenarios — aliens running factories, say, or aliens riding tractors at harvest time — might seem unlikely, but the scientists working on technosignatures are comfortable with the low odds. “If we focus on what’s detectable, based on these instruments that we’re building, that’s really the fundamental question,” Haqq-Misra told me.

When I visited Wright at his office at Penn State in the spring, he made the case that technosignatures are not only more detectable than biosignatures, possibly, but also more abundant and longer lived. Consider Earth as an example, he said. Its technology already extends all over the solar system. We have junk on the moon; we have rovers driving around Mars; we have satellites orbiting other planets. What’s more, several spacecraft — including two Pioneers, two Voyagers and the Pluto probe New Horizons, all launched by NASA — are venturing beyond the edge of the solar system into interstellar space. Such technosignatures could last billions of years. And we’re only 65 years into the age of space exploration. An older civilization could have seeded the galaxy with thousands of technosignatures, which could make them easier to detect.

“Look, I’m truly agnostic about whether there’s even anything to find,” Wright said. In 1961, he pointed out, the astronomer Frank Drake presented what’s now known as the Drake Equation, which is made up of many variables and attempts to help calculate the number of intelligent civilizations elsewhere in the galaxy. But with so little data to plug in to the variables, there has yet to be any solution to the equation.
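For reference, the standard form of Drake’s equation multiplies those variables together:

$$N = R_{*} \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L,$$

where $R_{*}$ is the rate of star formation in the galaxy, $f_p$ the fraction of stars with planets, $n_e$ the number of potentially habitable planets per such system, $f_l$, $f_i$ and $f_c$ the fractions on which life, intelligence and detectable technology arise, and $L$ the average time a civilization remains detectable. As Wright notes, most of these terms are still unconstrained by data.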

For Wright, Drake’s equation at least allows for a “plausibility” that something is out there. But is it life or complex life? Biosignatures, Wright said, are going to be “extremely challenging to detect — if they exist. So that’s two big ifs. It’s very possible that life is just so rare that there’s nothing within a kiloparsec for us to find.” But technology, he explained, could have started the same distance away — a kiloparsec is 3,261 light-years in distance — and moved closer to Earth over eons. It could be a traveling probe like one of our Voyagers or a systematic species migration; it could be an electronic signal, sent 3,250 years ago and, moving at the speed of light, just coming into our range.

“So we have a much bigger search radius for technology,” Wright said. “But also, perhaps complex life that builds technology is itself extremely rare, even when life forms.” He paused. “I don’t know,” he said. “What drives me is not the idea that we will find something in my lifetime. What drives me is that we’re not looking very well. And it’s too important a search, answering too important a question, not to do well.”

“The giggle factor” — that’s what anyone who does research on extraterrestrials is bound to encounter, according to Frank. As a graduate student in the ’80s, Frank was wary of the field as a career move. “I’d never worked in this before, I’d never published any papers,” he told me, referring to his pre-technosignature research. His reluctance was reinforced by the marginalization of the subject. Early on, in the 1970s, NASA had shown a willingness to fund radio-telescope searches for extraterrestrial activity. But the search for aliens aroused opposition. In 1978, Senator William Proxmire declared that taxpayers were being fleeced, a criticism NASA heeded by striking the search for extraterrestrials from its budget. The agency was willing to back survey projects again in the 1980s, but another senator, Richard Bryan, stopped the programs in 1993. “This hopefully will be the end of Martian hunting at the taxpayer’s expense,” Bryan said at the time.

Only recently has the stigma begun to wear off. At the urging of the Texas representative Lamar Smith (now retired), who was chairman of the House Science Committee, a bill was introduced in Congress directing NASA to allocate $10 million to the search for technosignatures. NASA quickly asked for a forum to get a clearer sense of what research was worth funding, positioning the effort as a departure from radio astronomy. “I was told the workshop had to be in a certain Texas congressional district,” Wright, who was asked to organize the Houston meeting, told me.

When Frank, who trained as a theoretical astrophysicist rather than an observational astronomer, attended the Houston meeting, he had been writing about how civilizations alter their planetary atmospheres. Because humans have changed our world so significantly through global warming — essentially by burning wood and fossil fuels — he had been wondering if this would happen everywhere. “When you pull back and think of the evolution of any planet, you find that what we’re going through may be a common transition that you do, or don’t, make it through,” Frank says. In his view, any species that expands and grows is probably going to create significant feedback effects on its planet. “Civilizations are basically focused on harvesting energy and putting it to work,” he says. “And there should be unintentional markers when you do that. You’re leaving traces.” You’re creating technosignatures. Such assumptions about energy generation and activity are mostly what guide the CATS group.

[snip]

‘A Crisis Coming’: The Twin Threats to American Democracy

‘A Crisis Coming’: The Twin Threats to American Democracy
By David Leonhardt
Sep 21 2022
https://www.nytimes.com/2022/09/17/us/american-democracy-threats.html

The United States has experienced deep political turmoil several times before over the past century. The Great Depression caused Americans to doubt the country’s economic system. World War II and the Cold War presented threats from global totalitarian movements. The 1960s and ’70s were marred by assassinations, riots, a losing war and a disgraced president.

These earlier periods were each more alarming in some ways than anything that has happened in the United States recently. Yet during each of those previous times of tumult, the basic dynamics of American democracy held firm. Candidates who won the most votes were able to take power and attempt to address the country’s problems.

The current period is different. As a result, the United States today finds itself in a situation with little historical precedent. American democracy is facing two distinct threats, which together represent the most serious challenge to the country’s governing ideals in decades.

The first threat is acute: a growing movement inside one of the country’s two major parties — the Republican Party — to refuse to accept defeat in an election.

The violent Jan. 6, 2021, attack on Congress, meant to prevent the certification of President Biden’s election, was the clearest manifestation of this movement, but it has continued since then. Hundreds of elected Republican officials around the country falsely claim that the 2020 election was rigged. Some of them are running for statewide offices that would oversee future elections, potentially putting them in position to overturn an election in 2024 or beyond.

“There is the possibility, for the first time in American history, that a legitimately elected president will not be able to take office,” said Yascha Mounk, a political scientist at Johns Hopkins University who studies democracy.

The second threat to democracy is chronic but also growing: The power to set government policy is becoming increasingly disconnected from public opinion.

The run of recent Supreme Court decisions — both sweeping and, according to polls, unpopular — highlights this disconnect. Although the Democratic Party has won the popular vote in seven of the past eight presidential elections, a Supreme Court dominated by Republican appointees seems poised to shape American politics for years, if not decades. And the court is only one of the means through which policy outcomes are becoming less closely tied to the popular will.

Two of the past four presidents have taken office despite losing the popular vote. Senators representing a majority of Americans are often unable to pass bills, partly because of the increasing use of the filibuster. Even the House, intended as the branch of the government that most reflects the popular will, does not always do so, because of the way districts are drawn.

“We are far and away the most countermajoritarian democracy in the world,” said Steven Levitsky, a professor of government at Harvard University and a co-author of the book “How Democracies Die,” with Daniel Ziblatt.

The causes of the twin threats to democracy are complex and debated among scholars.

The chronic threats to democracy generally spring from enduring features of American government, some written into the Constitution. But they did not conflict with majority opinion to the same degree in past decades. One reason is that more populous states, whose residents receive less power because of the Senate and the Electoral College, have grown so much larger than small states.

The acute threats to democracy — and the rise of authoritarian sentiment, or at least the acceptance of it, among many voters — have different causes. They partly reflect frustration over nearly a half-century of slow-growing living standards for the American working class and middle class. They also reflect cultural fears, especially among white people, that the United States is being transformed into a new country, more racially diverse and less religious, with rapidly changing attitudes toward gender, language and more.

The economic frustrations and cultural fears have combined to create a chasm in American political life, between prosperous, diverse major metropolitan areas and more traditional, religious and economically struggling smaller cities and rural areas. The first category is increasingly liberal and Democratic, the second increasingly conservative and Republican.

The political contest between the two can feel existential to people in both camps, with disagreements over nearly every prominent issue. “When we’re voting, we’re not just voting for a set of policies but for what we think makes us Americans and who we are as a people,” Lilliana Mason, a political scientist and the author of “Uncivil Agreement: How Politics Became Our Identity,” said. “If our party loses the election, then all of these parts of us feel like losers.”

These sharp disagreements have led many Americans to doubt the country’s system of government. In a recent poll by Quinnipiac University, 69 percent of Democrats and 69 percent of Republicans said that democracy was “in danger of collapse.” Of course, the two sides have very different opinions about the nature of the threat.

Many Democrats share the concerns of historians and scholars who study democracy, pointing to the possibility of overturned election results and the deterioration of majority rule. “Equality and democracy are under assault,” President Biden said in a speech this month in front of Independence Hall in Philadelphia. “We do ourselves no favor to pretend otherwise.”

Many Republicans have defended their increasingly aggressive tactics by saying they are trying to protect American values. In some cases, these claims rely on falsehoods — about election fraud, Mr. Biden’s supposed “socialism,” Barack Obama’s birthplace, and more.

In others, they are rooted in anxiety over real developments, including illegal immigration and “cancel culture.” Some on the left now consider widely held opinions among conservative and moderate Americans — on abortion, policing, affirmative action, Covid-19 and other subjects — to be so objectionable that they cannot be debated. In the view of many conservatives and some experts, this intolerance is stifling open debate at the heart of the American political system.

The divergent sense of crisis on left and right can itself weaken democracy, and it has been exacerbated by technology.

Conspiracy theories and outright lies have a long American history, dating to the personal attacks that were a staple of the partisan press during the 18th century. In the mid-20th century, tens of thousands of Americans joined the John Birch Society, a far-right group that claimed Dwight Eisenhower was a secret Communist.

Today, however, falsehoods can spread much more easily, through social media and a fractured news environment. In the 1950s, no major television network spread the lies about Eisenhower. In recent years, the country’s most watched cable channel, Fox News, regularly promoted falsehoods about election results, Mr. Obama’s birthplace and other subjects.

These same forces — digital media, cultural change and economic stagnation in affluent countries — help explain why democracy is also struggling in other parts of the world. Only two decades ago, at the turn of the 21st century, democracy was the triumphant form of government around the world, with autocracy in retreat in the former Soviet empire, Argentina, Brazil, Chile, South Africa, South Korea and elsewhere. Today, the global trend is moving in the other direction.

In the late 1990s, 72 countries were democratizing, and only three were growing more authoritarian, according to data from V-Dem, a Swedish institute that monitors democracy. Last year, only 15 countries grew more democratic, while 33 slid toward authoritarianism.

Some experts remain hopeful that the growing attention in the United States to democracy’s problems can help avert a constitutional crisis here. Already, Donald Trump’s efforts to overturn the 2020 election failed, partly because of the refusal of many Republican officials to participate, and both federal and state prosecutors are investigating his actions. And while the chronic decline of majority rule will not change anytime soon, it is also part of a larger historical struggle to create a more inclusive American democracy.

Still, many experts point out that it is not yet clear how the country would escape a larger crisis, such as an overturned election, at some point in the coming decade. “This is not politics as usual,” said Carol Anderson, a professor at Emory University and the author of the book, “One Person, No Vote,” about voter suppression. “Be afraid.”

The Will of the Majority

The founders did not design the United States to be a pure democracy.

They distrusted the classical notion of direct democracy, in which a community came together to vote on each important issue, and believed it would be impractical for a large country. They did not consider many residents of the new country to be citizens who deserved a voice in political affairs, including Native Americans, enslaved Africans and women. The founders also wanted to constrain the national government from being too powerful, as they believed was the case in Britain. And they had the practical problem of needing to persuade 13 states to forfeit some of their power to a new federal government.

Instead of a direct democracy, the founders created a republic, with elected representatives to make decisions, and a multilayered government, in which different branches checked each other. The Constitution also created the Senate, where every state had an equal say, regardless of population.

Pointing to this history, some Republican politicians and conservative activists have argued that the founders were comfortable with minority rule. “Of course we’re not a democracy,” Senator Mike Lee of Utah has written.

But the historical evidence suggests that the founders believed that majority will — defined as the prevailing view of enfranchised citizens — should generally dictate national policy, as George Thomas of Claremont McKenna College and other constitutional scholars have explained.

In the Federalist Papers, James Madison equated “a coalition of a majority of the whole society” with “justice and the general good.” Alexander Hamilton made similar points, describing “representative democracy” as “happy, regular and durable.” It was a radical idea at the time.

For most of American history, the idea has prevailed. Even with the existence of the Senate, the Electoral College and the Supreme Court, political power has reflected the views of people who had the right to vote. “To say we’re a republic not a democracy ignores the past 250 years of history,” Mr. Ziblatt, a political scientist at Harvard University, said.

Before 2000, only three candidates won the presidency while losing the popular vote (John Quincy Adams, Rutherford Hayes and Benjamin Harrison), and each served only a single term. During the same period, parties that won repeated elections were able to govern, including the Democratic-Republican Party of Thomas Jefferson’s time, the New Deal Democrats and the Reagan Republicans.

The situation has changed in the 21st century. The Democratic Party is in the midst of a historic winning streak. In seven of the past eight presidential elections, stretching back to Bill Clinton’s 1992 victory, the Democratic nominee has won the popular vote. Over more than two centuries of American democracy, no party has previously fared so well over such an extended period.

Yet the current period is hardly a dominant Democratic age.

What changed? One crucial factor is that, in the past, the parts of the country granted outsize power by the Constitution — less populated states, which tend to be more rural — voted in broadly similar ways as large states and urban areas.

This similarity meant that the small-state bonus in the Senate and Electoral College had only a limited effect on national results. Both Democrats and Republicans benefited, and suffered, from the Constitution’s undemocratic features.

Democrats sometimes won small states like Idaho, Montana, Utah and Wyoming in the mid-20th century. And California was long a swing state: Between the Great Depression and 2000, Democratic and Republican presidential candidates won it an equal number of times. That the Constitution conferred advantages on residents of small states and disadvantages on Californians did not reliably boost either party.

In recent decades, Americans have increasingly sorted themselves along ideological lines. Liberals have flocked to large metropolitan areas, which are heavily concentrated in big states like California, while residents of smaller cities and more rural areas have become more conservative.

This combination — the Constitution’s structure and the country’s geographic sorting — has created a disconnect between public opinion and election outcomes. It has affected every branch of the federal government: the presidency, Congress and even the Supreme Court.

In the past, “the system was still antidemocratic, but it didn’t have a partisan effect,” Mr. Levitsky said. “Now it’s undemocratic and has a partisan effect. It tilts the playing field toward the Republican Party. That’s new in the 21st century.”

In presidential elections, the small-state bias is important, but it is not even the main issue. A more subtle factor — the winner-take-all nature of the Electoral College in most states — is. Candidates have never received extra credit for winning state-level landslides. But in the past this feature did not matter very much, because landslides were rare in larger states, meaning that relatively few votes were “wasted,” as political scientists say.

Today, Democrats dominate a handful of large states, wasting many votes. In 2020, Mr. Biden won California by 29 percentage points; New York by 23 points; and Illinois by 17 points. Four years earlier, Hillary Clinton’s margins were similar.
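
To make the wasted-votes mechanism concrete, here is a toy Python calculation. The states, vote totals and electoral-vote counts are all invented for illustration (they are not 2020 figures); the point is only that winner-take-all counting lets one side pile up surplus votes in landslide states while the other side wins the contests that decide the outcome:

# Toy illustration of winner-take-all "wasted votes". All numbers hypothetical.
# A vote is "wasted" if it exceeds the one-vote margin needed to carry a state,
# or if it was cast for the losing candidate in that state.

states = {
    # state: (votes for A, votes for B, electoral votes)
    "Landslide State": (8_000_000, 4_000_000, 40),
    "Swing State 1":   (4_900_000, 5_100_000, 33),
    "Swing State 2":   (4_900_000, 5_100_000, 33),
}

popular = {"A": 0, "B": 0}
electoral = {"A": 0, "B": 0}
wasted = {"A": 0, "B": 0}

for a, b, ev in states.values():
    popular["A"] += a
    popular["B"] += b
    winner, loser = ("A", "B") if a > b else ("B", "A")
    high, low = max(a, b), min(a, b)
    electoral[winner] += ev
    wasted[winner] += high - (low + 1)   # surplus beyond what was needed to win
    wasted[loser] += low                 # losing votes elect nobody

print("popular vote:", popular)
print("electoral vote:", electoral)
print("wasted votes:", wasted)
# Candidate A wins the popular vote 17.8M to 14.2M, yet loses the Electoral
# College 40 to 66, because A's landslide surplus counts for nothing extra.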

[snip]

Charging cars at home at night is not the way to go, Stanford study finds

[Note:  This item comes from friend Judi Clark.  DLH]

Charging cars at home at night is not the way to go, Stanford study finds
The move to electric vehicles will result in large costs for generating, transmitting, and storing more power. Shifting current EV charging from home to work and night to day could cut costs and help the grid, according to a new Stanford study.
By Mark Golden
Sep 22 2022
https://news.stanford.edu/press/view/45245

The vast majority of electric vehicle owners charge their cars at home in the evening or overnight. We’re doing it wrong, according to a new Stanford study.

In March, the research team published a paper on a model they created for charging demand that can be applied to an array of populations and other factors. In the new study, published Sept. 22 in Nature Energy, they applied their model to the whole of the Western United States and examined the stress the region’s electric grid will come under by 2035 from growing EV ownership. In a little over a decade, they found, rapid EV growth alone could increase peak electricity demand by up to 25%, assuming a continued dominance of residential, nighttime charging.

To limit the high costs of all that new capacity for generating and storing electricity, the researchers say, drivers should move to daytime charging at work or public charging stations, which would also reduce greenhouse gas emissions. This finding has policy and investment implications for the region and its utilities, especially since California moved in late August to ban sales of gasoline-powered cars and light trucks starting in 2035.

“We encourage policymakers to consider utility rates that encourage day charging and incentivize investment in charging infrastructure to shift drivers from home to work for charging,” said the study’s co-senior author, Ram Rajagopal, an associate professor of civil and environmental engineering at Stanford.

In February, cumulative sales of EVs in California reached one million, accounting for about 6% of cars and light trucks. The state has targeted five million EVs on the road by 2030. When the penetration hits 30% to 40% of cars on the road, the grid will experience significant stress without major investments and changes in charging habits, said Rajagopal. Building that infrastructure requires significant lead time and cannot be done overnight.

“We considered the entire Western U.S. region, because California depends heavily on electricity imports from the other Western states. EV charging plus all other electricity uses have consequences for the whole Western region given the interconnected nature of our electric grid,” said Siobhan Powell, lead author of the March study and the new one.

“We were able to show that with less home charging and more daytime charging, the Western U.S. would need less generating capacity and storage, and it would not waste as much solar and wind power,” said Powell, mechanical engineering PhD ’22.

“And it’s not just California and Western states. All states may need to rethink electricity pricing structures as their EV charging needs increase and their grid changes,” added Powell, who recently took a postdoctoral research position at ETH Zurich.

Once 50% of the cars on the road in the Western U.S. – a region in which about half the population lives in California – are powered by electricity, more than 5.4 gigawatts of energy storage would be needed if charging habits follow their current course. That’s the capacity equivalent of 5 large nuclear power reactors. A big shift to charging at work instead of home would reduce the storage needed for EVs to 4.2 gigawatts.

Changing incentives

Current time-of-use rates encourage consumers to shift electricity use, such as running the dishwasher and charging EVs, to nighttime whenever possible. This rate structure dates from the time before significant solar and wind power supplies, when demand threatened to exceed supply during the day, especially on late afternoons in the summer.

Today, California has excess electricity during late mornings and early afternoons, thanks mainly to its solar capacity. If most EVs were to charge during these times, then the cheap power would be used instead of wasted. Alternatively, if most EVs continue to charge at night, then the state will need to build more generators – likely powered by natural gas – or expensive energy storage on a large scale. Electricity going first to a huge battery and then to an EV battery loses power from the extra stop.
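
As a rough way to see why timing matters, here is a toy Python sketch comparing overnight and midday charging against a stylized solar surplus. The hourly numbers are invented placeholders, not values from the Stanford study, which uses detailed grid and charging models:

# Toy comparison of night vs. midday EV charging against a stylized solar
# surplus. All numbers are illustrative placeholders.

# Surplus solar energy available for EV charging, in GWh per hour, hours 0-23.
solar_surplus = [0, 0, 0, 0, 0, 0, 0, 1, 3, 6, 8, 9,
                 9, 8, 6, 3, 1, 0, 0, 0, 0, 0, 0, 0]

ev_energy_per_day = 40.0  # GWh of EV charging demand per day (placeholder)

def charging_profile(start_hour, duration):
    """Spread the day's EV energy evenly over `duration` hours from `start_hour`."""
    load = [0.0] * 24
    for h in range(duration):
        load[(start_hour + h) % 24] = ev_energy_per_day / duration
    return load

def unmet_by_solar(load):
    """Energy (GWh) that must come from other generation or from storage."""
    return sum(max(l - s, 0.0) for l, s in zip(load, solar_surplus))

night = charging_profile(start_hour=23, duration=8)   # 11 p.m. start, typical today
midday = charging_profile(start_hour=9, duration=8)   # charging at work

print(f"night charging:  {unmet_by_solar(night):.1f} GWh not covered by surplus solar")
print(f"midday charging: {unmet_by_solar(midday):.1f} GWh not covered by surplus solar")

In this toy setup, none of the overnight charging overlaps the solar surplus, so all of it must come from other generation or storage, while most of the midday charging is absorbed by power that would otherwise be curtailed.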

At the local level, if a third of homes in a neighborhood have EVs and most of the owners continue to set charging to start at 11 p.m. or whenever electricity rates drop, the local grid could become unstable.

“The findings from this paper have two profound implications: the first is that the price signals are not aligned with what would be best for the grid – and for ratepayers. The second is that it calls for considering investments in a charging infrastructure for where people work,” said Ines Azevedo, the new paper’s other co-senior author and associate professor of energy science and engineering in the Stanford Doerr School of Sustainability, which opened on Sept. 1.

“We need to move quickly toward decarbonizing the transportation sector, which accounts for the bulk of emissions in California,” Azevedo continued. “This work provides insight on how to get there. Let’s ensure that we pursue policies and investment strategies that allow us to do so in a way that is sustainable.”

[snip]

Posits, a New Kind of Number, Improves the Math of AI

[Note:  This item comes from friend David P. Reed.  DLH]

Posits, a New Kind of Number, Improves the Math of AI
The first posit-based processor core gave a ten-thousandfold accuracy boost
By Dina Genkina
Sep 25 2022
https://spectrum.ieee.org/floating-point-numbers-posits-processor

Training the large neural networks behind many modern AI tools requires real computational might: For example, OpenAI’s most advanced language model, GPT-3, required an astounding million billion billion (roughly 10^24) operations to train, and cost about US $5 million in compute time. Engineers think they have figured out a way to ease the burden by using a different way of representing numbers.

Back in 2017, John Gustafson, then jointly appointed at A*STAR Computational Resources Centre and the National University of Singapore, and Isaac Yonemoto, then at Interplanetary Robot and Electric Brain Co., developed a new way of representing numbers. These numbers, called posits, were proposed as an improvement over the standard floating-point arithmetic processors used today.

Now, a team of researchers at the Complutense University of Madrid has developed the first processor core implementing the posit standard in hardware and shown that, bit for bit, the accuracy of a basic computational task increased by up to four orders of magnitude compared with computing using standard floating-point numbers. They presented their results at last week’s IEEE Symposium on Computer Arithmetic.

“Nowadays it seems that Moore’s law is starting to fade,” says David Mallasén Quintana, a graduate researcher in the ArTeCS group at Complutense. “So, we need to find some other ways of getting more performance out of the same machines. One of the ways of doing that is changing how we encode the real numbers, how we represent them.”

The Complutense team isn’t alone in pushing the envelope with number representation. Just last week, Nvidia, Arm, and Intel agreed on a specification for using 8-bit floating-point numbers instead of the usual 32-bit or 16-bit for machine-learning applications. Using the smaller, less-precise format improves efficiency and memory usage, at the cost of computational accuracy.

Real numbers can’t be perfectly represented in hardware simply because there are infinitely many of them. To fit into a designated number of bits, many real numbers have to be rounded. The advantage of posits comes from the way the numbers they represent exactly are distributed along the number line. In the middle of the number line, around 1 and -1, there are more posit representations than floating point. And at the wings, going out to large negative and positive numbers, posit accuracy falls off more gracefully than floating point.

“It’s a better match for the natural distribution of numbers in a calculation,” says Gustafson. “It’s the right dynamic range, and it’s the right accuracy where you need more accuracy. There’s an awful lot of bit patterns in floating-point arithmetic no one ever uses. And that’s waste.”

Posits accomplish this improved accuracy around 1 and -1 thanks to an extra component in their representation. Floats are made up of three parts: a sign bit (0 for positive, 1 for negative), several “mantissa” (fraction) bits denoting what comes after the binary version of a decimal point, and the remaining bits defining the exponent (2^exp).
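
For readers who want to see those parts directly, here is a small Python sketch that unpacks a standard 32-bit IEEE 754 float into its sign, exponent and mantissa fields; the reconstruction formula in the comment applies to ordinary (normal) numbers:

import struct

def float32_fields(x: float):
    """Unpack an IEEE 754 single-precision float into its three parts."""
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    sign = bits >> 31                  # 1 bit
    exponent = (bits >> 23) & 0xFF     # 8 bits, biased by 127
    mantissa = bits & 0x7FFFFF         # 23 fraction bits
    return sign, exponent, mantissa

s, e, m = float32_fields(-6.25)
# For normal numbers: value = (-1)**s * (1 + m / 2**23) * 2**(e - 127)
print(s, e, m, (-1) ** s * (1 + m / 2 ** 23) * 2 ** (e - 127))  # -> 1 129 4718592 -6.25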

Posits keep all the components of a float but add an extra “regime” section, an exponent of an exponent. The beauty of the regime is that it can vary in bit length. For small numbers, it can take as few as two bits, leaving more precision for the mantissa. This allows for the higher accuracy of posits in their sweet spot around 1 and -1.
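
The variable-length regime is easier to see in code than in prose. Below is a toy Python decoder for small posits following the published format (sign, regime, exponent, fraction). It is a readability sketch, not the Complutense hardware, and the default of two exponent bits follows the 2022 posit standard:

def decode_posit(bits: int, n: int = 8, es: int = 2) -> float:
    """Toy decoder for an n-bit posit with es exponent bits (a sketch, not a
    hardware datapath). value = (-1)^s * (2^(2^es))^k * 2^e * (1 + fraction)."""
    mask = (1 << n) - 1
    bits &= mask
    if bits == 0:
        return 0.0
    if bits == 1 << (n - 1):
        return float("nan")                # NaR, "not a real"
    sign = bits >> (n - 1)
    if sign:                               # negative: decode the two's complement
        bits = (-bits) & mask
    # Regime: a run of identical bits right after the sign bit.
    first = (bits >> (n - 2)) & 1
    run = 0
    while run < n - 1 and ((bits >> (n - 2 - run)) & 1) == first:
        run += 1
    k = run - 1 if first else -run
    # Exponent and fraction live in whatever bits remain after the regime.
    rest_len = max(n - 2 - run, 0)         # bits left after the regime terminator
    rest = bits & ((1 << rest_len) - 1)
    e_len = min(es, rest_len)
    exp = (rest >> (rest_len - e_len)) << (es - e_len) if e_len else 0
    f_len = rest_len - e_len
    frac = rest & ((1 << f_len) - 1)
    scale = 2.0 ** (k * 2 ** es + exp)
    return (-scale if sign else scale) * (1 + frac / 2.0 ** f_len)

print(decode_posit(0b01001011))            # regime k=0, exp=1, frac=3/8 -> 2.75

Because the regime of a value near 1 or -1 is only two bits long, most of the word is left for the fraction, which is where the extra accuracy in the posit "sweet spot" comes from.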

Deep neural networks usually work with normalized parameters called weights, making them perfect candidates to benefit from posits’ strengths. Much of neural-net computation consists of multiply-accumulate operations. Every time such a computation is performed, each partial sum has to be truncated anew, leading to accuracy loss. With posits, a special register called a quire can do the accumulation step efficiently, reducing that accuracy loss. But today’s hardware implements floats, and so far the computational gains from using posits in software have been largely overshadowed by the losses from converting between the formats.
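
A quick way to see the effect the quire is designed to avoid is to compare a running float32 accumulation, where every partial sum is rounded, with a single wide accumulation that rounds only once at the end. The NumPy sketch below uses float64 as a stand-in for the quire; it illustrates the rounding issue, not the posit hardware itself:

import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(10_000).astype(np.float32)
b = rng.standard_normal(10_000).astype(np.float32)

# Naive float32 multiply-accumulate: every partial sum is rounded to 32 bits.
acc = np.float32(0.0)
for x, y in zip(a, b):
    acc = np.float32(acc + x * y)

# "Quire-like" accumulation: keep the running sum wide and round once at the
# end (float64 stands in for the quire here).
wide = float(np.dot(a.astype(np.float64), b.astype(np.float64)))

print(f"rounded at every step: {float(acc)}")
print(f"rounded once at end:   {wide}")
print(f"difference:            {abs(float(acc) - wide):.3e}")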

With their new hardware implementation, which was synthesized in a field-programmable gate array (FPGA), the Complutense team was able to compare computations done using 32-bit floats and 32-bit posits side by side. They assessed their accuracy by comparing them to results using the much more accurate but computationally costly 64-bit floating-point format. Posits showed an astounding four-order-of-magnitude improvement in the accuracy of matrix multiplication, a series of multiply-accumulates inherent in neural network training. They also found that the improved accuracy didn’t come at the cost of computation time, only a somewhat increased chip area and power consumption.
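
The float side of that comparison is straightforward to reproduce in software. The sketch below measures how far a float32 matrix multiplication drifts from a float64 reference; the posit side would require the hardware or a posit library, so it is omitted here:

import numpy as np

rng = np.random.default_rng(1)
n = 512
A = rng.standard_normal((n, n))            # float64 inputs
B = rng.standard_normal((n, n))

reference = A @ B                          # float64 reference result
low_precision = A.astype(np.float32) @ B.astype(np.float32)

err = np.linalg.norm(low_precision - reference) / np.linalg.norm(reference)
print(f"relative error of float32 matmul vs. float64 reference: {err:.2e}")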

Although the numerical accuracy gains are undeniable, how exactly this would impact the training of large AIs like GPT-3 remains to be seen.

[snip]

California sheriff’s department deems 47 deputies ‘not suited’ for duty

[Note:  This item comes from friend Judi Clark.  DLH]

California sheriff’s department deems 47 deputies ‘not suited’ for duty
Roughly 10% of the force had their guns and arrest powers stripped after an internal audit of psychological exams
By Associated Press
Sep 27 2022
https://www.theguardian.com/us-news/2022/sep/27/alameda-county-sheriffs-deputies-psychological-exam

A northern California sheriff’s department stripped 47 deputies – 10% of the force – of their guns and arrest powers because they failed psychological exams, it was reported on Monday.

It was “horrible” to have to relieve the deputies of their duties, Lt Ray Kelly, a spokesperson for the Alameda county sheriff’s office, told KTVU-TV.

The TV station obtained a copy of a letter notifying the deputies of their change of status last Friday. The deputies will still receive their pay and benefits.

The move came after the sheriff’s office conducted an internal audit of deputies’ psychological examinations from January 2016 to the present.

The letter from Sheriff Gregory Ahern notified a deputy – whose name was blacked out in the copy – that the deputy had been graded “D. Not Suited” in a psychological evaluation and under state law could not serve as a peace officer.

The letter said the sheriff’s office had previously been operating under incorrect advice received several years ago from the California Commission on Peace Officer Standards and Training, which said it could hire candidates who received such a rating.

Ahern said the deputy can be hired if a second examination declares him or her “suitable” and the office intends to schedule an appointment for such an exam.

“Our intention is to resolve this issue as quickly as possible,” Ahern wrote. “We also intend to have you return to full duty status once you obtain a ‘suitable’ finding.”

Kelly said he hoped that the retests would occur in the next two months. The tests will be conducted by a psychologist who doesn’t work for the sheriff, he said.

Kelly said the audit followed the arrest earlier this month of former deputy Devin Williams Jr, 24. He is charged with shooting and killing a couple in their Dublin, California, home on 7 September.

His mother, Anitra Williams, told KTVU-TV that her son had been in a romantic relationship with Maria Tran and he believed she was unmarried.

The station said four sources said Williams had failed his psychological exam, although Kelly previously said Williams, who was hired in September 2021, had passed all psychological tests.

The Ella Baker Center for Human Rights, an Oakland non-profit that deals with race and criminal justice issues, accused the sheriff’s office of previously ignoring the problem of unsuited deputies.

“This further highlights the egregious levels of dysfunction and corruption that have plagued the sheriff’s office for years,” the center’s organizing director, Jose Bernal, said in a statement.

[snip]