Why your brain is not a computer

For decades it has been the dominant metaphor in neuroscience. But could this idea have been leading us astray all along?
By Matthew Cobb
Feb 27 2020

We are living through one of the greatest of scientific endeavours – the attempt to understand the most complex object in the universe, the brain. Scientists are accumulating vast amounts of data about structure and function in a huge array of brains, from the tiniest to our own. Tens of thousands of researchers are devoting massive amounts of time and energy to thinking about what brains do, and astonishing new technology is enabling us to both describe and manipulate that activity.

We can now make a mouse remember something about a smell it has never encountered, turn a bad mouse memory into a good one, and even use a surge of electricity to change how people perceive faces. We are drawing up increasingly detailed and complex functional maps of the brain, human and otherwise. In some species, we can change the brain’s very structure at will, altering the animal’s behaviour as a result. Some of the most profound consequences of our growing mastery can be seen in our ability to enable a paralysed person to control a robotic arm with the power of their mind.

Every day, we hear about new discoveries that shed light on how brains work, along with the promise – or threat – of new technology that will enable us to do such far-fetched things as read minds, detect criminals, or even be uploaded into a computer. Book after book appears, each claiming to explain the brain in a different way.

And yet there is a growing conviction among some neuroscientists that our future path is not clear. It is hard to see where we should be going, apart from simply collecting more data or counting on the latest exciting experimental approach. As the German neuroscientist Olaf Sporns has put it: “Neuroscience still largely lacks organising principles or a theoretical framework for converting brain data into fundamental knowledge and understanding.” Despite the vast number of facts being accumulated, our understanding of the brain appears to be approaching an impasse.

In 2017, the French neuroscientist Yves Frégnac focused on the current fashion of collecting massive amounts of data in expensive, large-scale projects and argued that the tsunami of data they are producing is leading to major bottlenecks in progress, partly because, as he put it pithily, “big data is not knowledge”.

“Only 20 to 30 years ago, neuroanatomical and neurophysiological information was relatively scarce, while understanding mind-related processes seemed within reach,” Frégnac wrote. “Nowadays, we are drowning in a flood of information. Paradoxically, all sense of global understanding is in acute danger of getting washed away. Each overcoming of technological barriers opens a Pandora’s box by revealing hidden variables, mechanisms and nonlinearities, adding new levels of complexity.”

The neuroscientists Anne Churchland and Larry Abbott have also emphasised our difficulties in interpreting the massive amount of data that is being produced by laboratories all over the world: “Obtaining deep understanding from this onslaught will require, in addition to the skilful and creative application of experimental technologies, substantial advances in data analysis methods and intense application of theoretic concepts and models.”

There are indeed theoretical approaches to brain function, including to the most mysterious thing the human brain can do – produce consciousness. But none of these frameworks are widely accepted, for none has yet passed the decisive test of experimental investigation. It is possible that repeated calls for more theory may be a pious hope. It can be argued that there is no possible single theory of brain function, not even in a worm, because a brain is not a single thing. (Scientists even find it difficult to come up with a precise definition of what a brain is.)

As observed by Francis Crick, the co-discoverer of the DNA double helix, the brain is an integrated, evolved structure, with different bits of it appearing at different moments in evolution and adapted to solve different problems. Our current comprehension of how it all works is extremely partial – for example, most sensory research in neuroscience has focused on sight, not smell, partly because smell is conceptually and technically more challenging. But olfaction and vision work differently, both computationally and structurally. By focusing on vision, we have developed a very limited understanding of what the brain does and how it does it.

The nature of the brain – simultaneously integrated and composite – may mean that our future understanding will inevitably be fragmented and composed of different explanations for different parts. Churchland and Abbott spelled out the implication: “Global understanding, when it comes, will likely take the form of highly diverse panels loosely stitched together into a patchwork quilt.”

For more than half a century, all those highly diverse panels of patchwork we have been working on have been framed by thinking that brain processes involve something like those carried out in a computer. But that does not mean this metaphor will continue to be useful in the future. At the very beginning of the digital age, in 1951, the pioneer neuroscientist Karl Lashley argued against the use of any machine-based metaphor.

“Descartes was impressed by the hydraulic figures in the royal gardens, and developed a hydraulic theory of the action of the brain,” Lashley wrote. “We have since had telephone theories, electrical field theories and now theories based on computing machines and automatic rudders. I suggest we are more likely to find out about how the brain works by studying the brain itself, and the phenomena of behaviour, than by indulging in far-fetched physical analogies.”

This dismissal of metaphor has recently been taken even further by the French neuroscientist Romain Brette, who has challenged the most fundamental metaphor of brain function: coding. Since its inception in the 1920s, the idea of a neural code has come to dominate neuroscientific thinking – more than 11,000 papers on the topic have been published in the past 10 years. Brette’s fundamental criticism was that, in thinking about “code”, researchers inadvertently drift from a technical sense, in which there is a link between a stimulus and the activity of the neuron, to a representational sense, according to which neuronal codes represent that stimulus.

The unstated implication in most descriptions of neural coding is that the activity of neural networks is presented to an ideal observer or reader within the brain, often described as “downstream structures” that have access to the optimal way to decode the signals. But the ways in which such structures actually process those signals is unknown, and is rarely explicitly hypothesised, even in simple models of neural network function.


Freeman Dyson, Math Genius Turned Visionary Technologist, Dies at 96

After an early breakthrough on light and matter, he became a writer who challenged climate science and pondered space exploration and nuclear warfare.
By George Johnson
Feb 28 2020

Freeman J. Dyson, a mathematical prodigy who left his mark on subatomic physics before turning to messier subjects like Earth’s environmental future and the morality of war, died on Friday at a hospital near Princeton, N.J. He was 96.

His daughter Mia Dyson confirmed the death. His son, George, said Dr. Dyson had fallen three days earlier in the cafeteria of the Institute for Advanced Study in Princeton, “his academic home for more than 60 years,” as the institute put it in a news release.

As a young graduate student at Cornell University in 1949, Dr. Dyson wrote a landmark paper — worthy, some colleagues thought, of a Nobel Prize — that deepened the understanding of how light interacts with matter to produce the palpable world. The theory the paper advanced, called quantum electrodynamics, or QED, ranks among the great achievements of modern science.

But it was as a writer and technological visionary that he gained public renown. He imagined exploring the solar system with spaceships propelled by nuclear explosions and establishing distant colonies nourished by genetically engineered plants.

“Life begins at 55, the age at which I published my first book,” he wrote in “From Eros to Gaia,” one of the collections of his writings that appeared while he was a professor of physics at the Institute for Advanced Study — an august position for someone who finished school without a Ph.D. The lack of a doctorate was a badge of honor, he said. With his slew of honorary degrees and a fellowship in the Royal Society, people called him Dr. Dyson anyway.

Dr. Dyson called himself a scientific heretic and warned against the temptation of confusing mathematical abstractions with ultimate truth. Although his own early work on QED helped bring photons and electrons into a consistent framework, Dr. Dyson doubted that superstrings, or anything else, would lead to a Theory of Everything, unifying all of physics with a succinct formulation inscribable on a T-shirt.

In a speech in 2000 when he accepted the Templeton Prize for Progress in Religion, Dr. Dyson quoted Francis Bacon: “God forbid that we should give out a dream of our own imagination for a pattern of the world.”

Relishing the role of iconoclast, he confounded the scientific establishment by dismissing the consensus about the perils of man-made climate change as “tribal group-thinking.” He doubted the veracity of the climate models, and he exasperated experts with sanguine predictions they found rooted less in science than in wishfulness: Excess carbon in the air is good for plants, and global warming might forestall another ice age.

In a profile of Dr. Dyson in 2009 in The New York Times Magazine, his colleague Steven Weinberg, a Nobel laureate, observed, “I have the sense that when consensus is forming like ice hardening on a lake, Dyson will do his best to chip at the ice.”

Dr. Dyson’s distrust of mathematical models had earlier led him to challenge predictions that the debris from atomic warfare could blot out the sun and bring on a devastating nuclear winter. He said he wished that were true — because it would add to the psychological deterrents to nuclear war — but found the theory wanting.

For all his doubts about the ability of mortals to calculate anything so complex as the effects of climate change, he was confident enough in our toolmaking to propose a technological fix: If carbon dioxide levels became too high, forests of genetically altered trees could be planted to strip the excess molecules from the air. That would free scientists to confront problems he found more immediate, like the alleviation of poverty and the avoidance of war.

He considered himself an environmentalist. “I am a tree-hugger, in love with frogs and forests,” he wrote in 2015 in The Boston Globe. “More urgent and more real problems, such as the overfishing of the oceans and the destruction of wildlife habitat on land, are neglected, while the environmental activists waste their time and energy ranting about climate change.” That was, to say the least, a minority position.

He was religious, but in an unorthodox way, believing good works to be more important than theology.


Key Missteps at the CDC Have Set Back Its Ability to Detect the Potential Spread of Coronavirus

[Note:  This item comes from friend David Rosenthal.  DLH]

The CDC designed a flawed test for COVID-19, then took weeks to figure out a fix so state and local labs could use it. New York still doesn’t trust the test’s accuracy.
By Caroline Chen, Marshall Allen, Lexi Churchill and Isaac Arnsdorf
Feb 28 2020

As the highly infectious coronavirus jumped from China to country after country in January and February, the U.S. Centers for Disease Control and Prevention lost valuable weeks that could have been used to track its possible spread in the United States because it insisted upon devising its own test.

The federal agency shunned the World Health Organization test guidelines used by other countries and set out to create a more complicated test of its own that could identify a range of similar viruses. But when it was sent to labs across the country in the first week of February, it didn’t work as expected. The CDC test correctly identified COVID-19, the disease caused by the virus. But in all but a handful of state labs, it falsely flagged the presence of the other viruses in harmless samples.

As a result, until Wednesday the CDC and the Food and Drug Administration only allowed those state labs to use the test — a decision with potentially significant consequences. The lack of a reliable test prevented local officials from taking a crucial first step in coping with a possible outbreak — “surveillance testing” of hundreds of people in possible hotspots. Epidemiologists in other countries have used this sort of testing to track the spread of the disease before large numbers of people turn up at hospitals.

This story is based on interviews with state and local public health officials and scientists across the country, which, taken together, describe a frustrating, bewildering bureaucratic process that seemed at odds with the urgency of the growing threat. The CDC and Vice President Mike Pence’s office, which is coordinating the government’s response to the virus, did not respond to questions for this story. It’s unclear who in the government originally made the decision to design a more complicated test, or to depart from the WHO guidance.

“We’re weeks behind because we had this problem,” said Scott Becker, chief executive officer of the Association of Public Health Laboratories, which represents 100 state and local public laboratories. “We’re usually up-front and center and ready.”

The CDC announced on Feb. 14 that surveillance testing would begin in five key cities: New York, Chicago, Los Angeles, San Francisco and Seattle. That effort has not yet begun.

On Wednesday, under pressure from health experts and public officials, the CDC and the FDA told labs they no longer had to worry about the portion of the test intended “for the universal detection of SARS-like coronaviruses.” After three weeks of struggle, they could now use the test purely to check for the presence of COVID-19.

It remains unclear whether the CDC’s move on Wednesday will resolve all of the problems around the test. Some local labs have raised concerns about whether the CDC’s test is fully reliable for detecting COVID-19.

In New York, scientists at both the city’s and state’s laboratories have seen false positives even when following the CDC’s latest directions, according to a person familiar with their discussions.

“Testing for coronavirus is not available yet in New York City,” city Department of Health spokeswoman Stephanie Buhle said in an email late Thursday. “The kits that were sent to us have demonstrated performance issues and cannot be relied upon to provide an accurate result.”

Until the middle of this week, only the CDC and the six state labs — in Illinois, Idaho, Tennessee, California, Nevada and Nebraska — were testing patients for the virus, according to Peter Kyriacopoulos, APHL’s senior director of public policy. Now, as many more state and local labs are in the process of setting up the testing kits, this capacity is expected to increase rapidly.

So far, the United States has had only 15 confirmed cases, a dozen of them travel-related, according to the CDC. An additional 45 confirmed cases involve people returning to the U.S. having gotten sick abroad. But many public health experts and officials believe that without wider testing the true number of infected Americans remains hidden.

“The basic tenet of public health is to know the situation so you can deal with it appropriately,” said Marc Lipsitch, professor of epidemiology at the Harvard T. H. Chan School of Public Health. He noted that Guangdong, a province in China, conducted surveillance testing of 300,000 people in fever clinics to find about 420 positive cases. Overall, Guangdong has more than 1,000 confirmed cases. “If you don’t look, you won’t find cases,” he said.

Janet Hamilton, senior director of Policy and Science at Council of State and Territorial Epidemiologists, said that with the virus spreading through multiple countries, “now is the time” for widespread surveillance testing.

“The disease,” she said, “is moving faster than the data.”

It remains to be seen what effect the delay in producing a working test will have on the health of Americans. If the United States dodges the rapidly spreading outbreaks now seen in Iran and South Korea, the impact will be negligible. But if it emerges that the disease is already circulating undetected in communities across the country, health officials will have missed a valuable chance to lessen the harm.

The need to have testing capacity distributed across local health departments became even more apparent Wednesday, when the CDC said it was investigating a case in California in which the patient may be the first infected in the United States without traveling to affected areas or known exposure to someone with the illness.

Doctors at the University of California, Davis Medical Center, where the patient is being treated, said testing was delayed for nearly a week because the patient didn’t fit restrictive federal criteria, which limited testing to symptomatic patients who had recently traveled to China.

“Upon admission, our team asked public health officials if this case could be COVID-19,” UC Davis said in a statement. UC Davis officials said because neither the California Department of Public Health nor Sacramento County could test for the virus, they asked the CDC to do so. But, the officials said, “since the patient did not fit the existing CDC criteria for COVID-19, a test was not immediately administered.”

After this case, and under pressure from public officials, the CDC broadened its guidelines Thursday for identifying suspected patients to include people who had traveled to Iran, Italy, Japan or South Korea.

The debate over whether federal, state and local officials should have already been engaged in widespread surveillance testing has become more heated as the virus has spread globally. The CDC had said the purpose of its five-city surveillance program was to provide the U.S. with an “early warning signal” to help direct its response. The cities were selected based on the likelihood that infection would be present, Hamilton said.

But Mark Pandori, director of the Nevada State Public Health Laboratory, which began offering testing on Feb. 11, said surveillance testing may not be the best use of resources right now. “A lot of people look at lab tests like they are magic,” Pandori said. “But the more lab tests you run, the more chances you have of getting false answers.”


How the Coronavirus Revealed Authoritarianism’s Fatal Flaw

[Note:  This item comes from reader Randall Head.  DLH]

China’s use of surveillance and censorship makes it harder for Xi Jinping to know what’s going on in his own country.
Feb 22 2020

China is in the grip of a momentous crisis. The novel coronavirus that emerged late last year has already claimed three times more lives than the SARS outbreak in 2003, and it is still spreading. More than 50 million people (more than the combined metro populations of New York, Los Angeles, Chicago, and San Francisco) remain under historically unprecedented lockdown, unable to leave their city—and in many cases, even their apartment. Many countries no longer accept visiting Chinese nationals, or if they do, quarantine them for weeks. Big companies are pulling out of trade shows. Production is suffering. Profound economic consequences are bound to ensue, not just in China but around the world.

How did Xi Jinping—the general secretary of the Communist Party of China, who has been consolidating his power since taking over the post in 2012—let things get to this point?

It might be that he didn’t fully know what was happening in his own country until it was too late.

Xi would be far from the first authoritarian to have been blindsided. Ironically, for all the talk of the technological side of Chinese authoritarianism, China’s use of technology to ratchet up surveillance and censorship may have made things worse, by making it less likely that Xi would even know what was going on in his own country.

Authoritarian blindness is a perennial problem, especially in large countries like China with centralized, top-down administration. Indeed, Xi would not even be the first Chinese ruler to fall victim to the totality of his own power. On August 4, 1958, buoyed by reports pouring in from around the country of record grain, rice, and peanut production, an exuberant Chairman Mao Zedong wondered how to get rid of the excess, and advised people to eat “five meals a day.” Many did, gorging themselves in the new regime canteens and even dumping massive amounts of “leftovers” down gutters and toilets. Export agreements were made to send tons of food abroad in return for machinery or currency. Just months later, perhaps the greatest famine in recorded history began, in which tens of millions would die because, in fact, there was no such surplus. Quite the opposite: The misguided agricultural policies of the Great Leap Forward had caused a collapse in food production. Yet instead of reporting the massive failures, the apparatchiks in various provinces had engaged in competitive exaggeration, reporting ever-increasing surpluses both because they were afraid of reporting bad news and because they wanted to please their superiors.

Mao didn’t know famine was at hand, because he had set up a system that ensured he would hear lies.

Smart rulers have tried to create workarounds to avoid this authoritarian dilemma. Dynastic China, for example, had institutionalized mechanisms to petition the emperor: a right that was theoretically granted to everyone, including the lowest farmers and the poorest city dwellers. This system was intended to check corruption in provinces and uncover problems, but in practice, it was limited in many ways, filtered through courtiers to a single emperor, who could listen to only so many in a day. Many rulers also cultivated their own independent sources of information in far-flung provinces.

Thanks to technology, there is a much more robust option for authoritarians in the 21st century: big-data analytics in a digital public sphere. For a few years, it appeared that China had found a way to be responsive to its citizens without giving them political power. Researchers have shown, for example, that posts on Weibo (China’s Twitter) complaining about problems in governance or corruption weren’t all censored. Many were allowed to stay up, allowing crucial information to trickle up to authorities. For example, viral posts about forced demolitions (a common occurrence in China) or medical mistreatment led to authorities sacking the officials involved, or to victim compensation that would otherwise not have occurred. A corrupt official was even removed from office after outraged netizens on social media pointed out the expensive watches he wore, which were impossible to buy on his government salary.

The public sphere in China during those years wasn’t a free-for-all, to be sure. One couldn’t call for collective action or for deposing the central government. But social media gave citizens a voice and a way to make an impact, and it served as an early-warning system for party leaders. (The only other topic that seemed to be off-limits was the censors themselves—researchers found that they eagerly zapped complaints directed at them.)

This responsive form of authoritarianism didn’t happen just on social media. Beginning in the early 2000s, China held “deliberative polls” in which citizens debated local budgets, important issues, and even reforms that would give them the right to information on government actions. In Zeguo township in Wenling, a municipality of more than 1 million residents, authorities created deliberative bodies wherein they engaged citizens (usually a few hundred, with randomness ensuring they were representative of the population) over a few days by providing information (including detailed accounts of the city’s budget) and hosting discussions to decide on issues of public significance. Authorities sometimes went as far as to pledge, in advance, to abide by the decisions of these bodies. For many years, such experiments flourished all over China and, combined with the digital public sphere, led scholars to wonder whether the “deliberative turn” in the country’s otherwise authoritarian state was not a means of weakening authoritarianism, but of making it more sustainable.

Yet, this deliberative turn was soon reversed.

Since taking power in 2012, Xi has shifted back to traditional one-man rule, concentrating more and more power into his hands. He has deployed an ever-more suffocating system of surveillance, propaganda, and repression, while attempting to create a cult of personality reminiscent of the Mao era, except with apps instead of little red books.


You’re Likely to Get the Coronavirus

Most cases are not life-threatening, which is also what makes the virus a historic challenge to contain.
Feb 24 2020

In May 1997, a 3-year-old boy developed what at first seemed like the common cold. When his symptoms—sore throat, fever, and cough—persisted for six days, he was taken to the Queen Elizabeth Hospital in Hong Kong. There his cough worsened, and he began gasping for air. Despite intensive care, the boy died.

Puzzled by his rapid deterioration, doctors sent a sample of the boy’s sputum to China’s Department of Health. But the standard testing protocol couldn’t fully identify the virus that had caused the disease. The chief virologist decided to ship some of the sample to colleagues in other countries.

At the U.S. Centers for Disease Control and Prevention in Atlanta, the boy’s sputum sat for a month, waiting for its turn in a slow process of antibody-matching analysis. The results eventually confirmed that this was a variant of influenza, the virus that has killed more people than any in history. But this type had never before been seen in humans. It was H5N1, or “avian flu,” discovered two decades prior, but known only to infect birds.

By then, it was August. Scientists sent distress signals around the world. The Chinese government swiftly killed 1.5 million chickens (over the protests of chicken farmers). Further cases were closely monitored and isolated. By the end of the year there were 18 known cases in humans. Six people died.

This was seen as a successful global response, and the virus was not seen again for years. In part, containment was possible because the disease was so severe: Those who got it became manifestly, extremely ill. H5N1 has a fatality rate of around 60 percent—if you get it, you’re likely to die. Yet since 2003, the virus has killed only 455 people. The much “milder” flu viruses, by contrast, kill fewer than 0.1 percent of people they infect, on average, but are responsible for hundreds of thousands of deaths every year.

Severe illness caused by viruses such as H5N1 also means that infected people can be identified and isolated, or that they died quickly. They do not walk around feeling just a little under the weather, seeding the virus. The new coronavirus (known technically as SARS-CoV-2) that has been spreading around the world can cause a respiratory illness that can be severe. The disease (known as COVID-19) seems to have a fatality rate of less than 2 percent—exponentially lower than most outbreaks that make global news. The virus has raised alarm not despite that low fatality rate, but because of it.

Coronaviruses are similar to influenza viruses in that they are both single strands of RNA. Four coronaviruses commonly infect humans, causing colds. These are believed to have evolved in humans to maximize their own spread—which means sickening, but not killing, people. By contrast, the two prior novel coronavirus outbreaks—SARS (severe acute respiratory syndrome) and MERS (Middle East respiratory syndrome, named for where the first outbreak occurred)—were picked up from animals, as was H5N1. These diseases were highly fatal to humans. If there were mild or asymptomatic cases, they were extremely few. Had there been more of them, the disease would have spread widely. Ultimately, SARS and MERS each killed fewer than 1,000 people.

COVID-19 is already reported to have killed more than twice that number. With its potent mix of characteristics, this virus is unlike most that capture popular attention: It is deadly, but not too deadly. It makes people sick, but not in predictable, uniquely identifiable ways. Last week, 14 Americans tested positive on a cruise ship in Japan despite feeling fine—the new virus may be most dangerous because, it seems, it may sometimes cause no symptoms at all.

The world has responded with unprecedented speed and mobilization of resources. The new virus was identified extremely quickly. Its genome was sequenced by Chinese scientists and shared around the world within weeks. The global scientific community has shared genomic and clinical data at unprecedented rates. Work on a vaccine is well under way. The Chinese government enacted dramatic containment measures, and the World Health Organization declared an emergency of international concern. All of this happened in a fraction of the time it took to even identify H5N1 in 1997. And yet the outbreak continues to spread.

The Harvard epidemiology professor Marc Lipsitch is exacting in his diction, even for an epidemiologist. Twice in our conversation he started to say something, then paused and said, “Actually, let me start again.” So it’s striking when one of the points he wanted to get exactly right was this: “I think the likely outcome is that it will ultimately not be containable.”

Containment is the first step in responding to any outbreak. In the case of COVID-19, the possibility (however implausible) of preventing a pandemic seemed to play out in a matter of days. Starting in January, China began cordoning off progressively larger areas, radiating outward from Wuhan City and eventually encapsulating some 100 million people. People were barred from leaving home, and lectured by drones if they were caught outside. Nonetheless, the virus has now been found in 24 countries.

Despite the apparent ineffectiveness of such measures—relative to their inordinate social and economic cost, at least—the crackdown continues to escalate. Under political pressure to “stop” the virus, last Thursday the Chinese government announced that officials in the Hubei province would be going door to door, testing people for fevers and looking for signs of illness, then sending all potential cases to quarantine camps. But even with ideal containment, the virus’s spread may have been inevitable. Testing people who are already extremely sick is an imperfect strategy if people can spread the virus without even feeling bad enough to stay home from work.

Lipsitch predicts that, within the coming year, some 40 to 70 percent of people around the world will be infected with the virus that causes COVID-19. But, he clarifies emphatically, this does not mean that all will have severe illnesses. “It’s likely that many will have mild disease, or may be asymptomatic,” he said. As with influenza, which is often life-threatening to older people and those with chronic health conditions, most cases pass without medical care. (Overall, around 14 percent of people with influenza have no symptoms.)

Lipsitch is far from alone in his belief that this virus will continue to spread widely. The emerging consensus among epidemiologists is that the most likely outcome of this outbreak is a new seasonal disease—a fifth “endemic” coronavirus. With the other four, people are not known to develop long-lasting immunity. If this one follows suit, and if the disease continues to be as severe as it is now, “cold and flu season” could become “cold and flu and COVID-19 season.”

At this point, it is not even known how many people are infected. As of Sunday, there have been 35 confirmed cases in the U.S., according to the World Health Organization. But Lipsitch’s “very, very rough” estimate when we spoke a week ago (banking on “multiple assumptions piled on top of each other,” he said) was that 100 or 200 people in the U.S. were infected. That’s all it would take to seed the disease widely. The rate of spread would depend on how contagious the disease is in milder cases. On Friday, Chinese scientists reported in the medical journal JAMA an apparent case of asymptomatic spread of the virus, from a patient with a normal chest CT scan. The researchers concluded with stolid understatement that if this finding is not a bizarre abnormality, “the prevention of COVID-19 infection would prove challenging.”


We have entered the Trump Unbound era — and journalists need to step it up.

By Margaret Sullivan
Feb 23 2020

When Donald Trump was elected, the media spent months figuring out how to cover a far-from-ordinary presidency.

Some will argue that many journalists never rose to that challenge — that they normalized Trump at every turn and never successfully conveyed to the public a clear and vivid picture of how he has toppled democratic norms and marched the country toward autocracy.

To be sure, they made adjustments.

Big Journalism began to call a lie a lie. It began to call racism by its name. It began to offer fact-checking in real time.

In other words, journalists adapted — within the framework of their tried-and-true beliefs.

We’re not “part of the resistance,” said New York Times Executive Editor Dean Baquet; “We’re not at war with the administration; we’re at work,” Washington Post Executive Editor Martin Baron said. These were thoughtful, reasonable remarks, and they set the tone for much of how the mainstream media — from NPR to the broadcast nightly news to regional newspapers — has proceeded.

And then came Trump’s impeachment. And his acquittal. And now, a new era for this president who chooses to believe he’s been vindicated.

In this new era, Trump has declared himself the nation’s chief law enforcement official. He has pardoned a raft of corrupt officials. He has exacted revenge on those he sees as his impeachment enemies — Lt. Col. Alexander Vindman, the decorated military veteran and national security staffer; and Gordon Sondland, Trump’s own handpicked ambassador to the European Union — simply because they testified under subpoena to what they knew about the White House’s dealings with Ukraine.

In other words, we are in entirely new territory now. Should the news media continue as usual? Should it retain its own traditions as the nation slides toward autocracy? Should it treat the Trump presidency as pretty much the usual thing, with a few more fact-checks and the occasional use of a word like “lie”?

No. We need a new and better approach if we’re going to do our jobs adequately.

First, we need to abandon neutrality-at-all-costs journalism, to replace it with something more suited to the moment. Call it Fairness First.

I’m talking about the kind of fairness that serves the public by describing the world we report on in honest and direct terms — not the phony kind of fairness that tries to duck out of difficult decisions by giving “both sides” of an argument equal time, regardless of their truth or merit.

Now more than ever, with a president feeling empowered and vindictive after his acquittal, we need to apply more scrutiny and less credulity to his increasingly extreme actions and statements.

Second, we need to be far more direct in the way stories are put together and presented.

I often talk to news consumers — citizens by another name — who insist that they want “just the facts” reporting. They’re understandably frustrated that they can’t seem to find that when so many news organizations, especially cable news, seem to have chosen political sides for commercial purposes. They want news that is unbiased — that doesn’t come with a side helping of opinion. Just tell me what happened, they say. I’ll make my own decisions about what it means.

That sounds good in theory. In practice, every piece of reporting on national politics is unavoidably the product of choices: What’s the angle? Who is quoted? What’s the headline? How much historical context is there? How prominent is it on a front page, a home page, an app?

Once again, President Trump’s annual address was full of dubious claims. (Sarah Cahlan/The Washington Post)

It’s in these small but crucial decisions that mainstream media often fails its audience: We simply are not getting across the big picture or the urgency. This happens, in part, because those news organizations that haven’t chosen up sides — those that want to serve all Americans — fear being charged with bias.

And so they soften the language. They blunt the impact.

Take the story of Trump’s angry reaction to the warning that Russia is interfering in the 2020 election to help his reelection. After hearing this, he reportedly moved to dump the acting director of national intelligence.

That’s big news that ought to be told with real urgency, right?

But not all of mainstream journalism saw it that way. On Friday morning, I searched and scrolled the home page of ABC News, whose evening news show attracts millions every night, the most-watched program of its kind. There were stories about the coronavirus, about the “mom of Idaho kids arrested in Hawaii,” and even a breathless in-case-you-missed-it piece about new fish sandwiches at Arby’s and Bojangles as Lent approaches. I could find the story in question only after a search for the term “Russia.”

And even those news organizations that did emphasize the story were using words that failed to get the importance across — headline after headline used the word “meddling” to describe the reported Russian intrusions into American democracy.

Meddling sounds like your nosy neighbor getting involved, over the backyard fence, in your family’s squabble.


The Billionaire Election

Does the world belong to them or to us?
By Anand Giridharadas
Feb 21 2020

Bernie Sanders wants to get rid of them. Amy Klobuchar is fine with them, but wants them to pay somewhat higher taxes. Joe Biden promises them that under him, “nothing would fundamentally change.” Tom Steyer is one of them, and wouldn’t be in the race if he weren’t, but seems slightly embarrassed about it. Elizabeth Warren wants to break up the companies that made many of them in the first place. Michael Bloomberg is trying to become president largely on the basis of being one. It would take Pete Buttigieg thousands of years to become one at his past rate of adult wealth creation, and yet he seems to be their top choice.

And waiting across the aisle, Donald Trump claims he’s one of them, which, because he’s Trump, means he probably isn’t.

I’m talking about billionaires, of course.

The Democratic debate on Wednesday made it clearer than ever that November’s election has become the billionaire referendum, in which it will be impossible to vote without taking a stand on extreme wealth in a democracy. The word “billionaire” came up more often than “China,” America’s leading geopolitical competitor; “immigration,” among its most contentious issues; and “climate,” its gravest existential threat.

Ms. Warren dominated the night by framing Mr. Bloomberg’s campaign as a bid to “substitute one arrogant billionaire for another.” When Mr. Sanders later confirmed his view that billionaires should not exist, one of the moderators, Chuck Todd, asked, “Mayor Bloomberg, should you exist?” Mr. Bloomberg replied, “I worked very hard for it, and I’m giving it away.”

With the debate careening between billionaire loathing and billionaire self-love, Mr. Buttigieg warned against making voters “choose between a socialist who thinks that capitalism is the root of all evil and a billionaire who thinks that money ought to be the root of all power.”

As the veteran Washington watchers Jim VandeHei and Mike Allen, of Axios, have observed, billionaires are less a major topic of this race than the total atmosphere of it. It’s not just the politicians. From Jack Dorsey and Mark Zuckerberg to Jeff Bezos and Rupert Murdoch, billionaires are the captains of an economy whose cruelties have given this year its populist verve, the boogeypeople for some candidates, the bankrollers of others, and the owners of the platforms of persuasion.

So what should we do about them? Voters are being treated to a vast range of answers to that question — from “Let’s tax them down to mere millionaire status” to “Let’s put them in charge of everything A.S.A.P.”

The debate is testing abiding American assumptions. A country more ardently capitalist than most is asking itself, as seriously as at any time in the modern era, whether the ultrarich, just because they are ultrarich, endanger democracy. And a country just as committed, contrarily, to its founding ideal of equality is asking whether to resign itself to a gilded revolving door in which you unseat billionaire leaders you hate by electing billionaires you don’t mind.

These conditions make it at once utterly remarkable, and totally explicable, that Mr. Sanders, the junior senator from Vermont and a democratic socialist, has become the front-runner for the Democratic nomination — and that Ms. Warren’s debate performance this week resonated as much as it did. You wouldn’t know it from watching cable news, where pundits are often aghast at the tastes of regular people who think green rooms are just rooms that are green, but in recent years, anger at billionaires has risen to a boil. This is thanks to the financial crisis, to endless wars cheered on by corporate and media elites and to yawning inequality. There is a growing sense that billionaires are not people who just happen to have drifted up from our midst, that in fact they are up there because they are standing on our backs, pinning us down.

Mr. Sanders and Ms. Warren, the senior senator from Massachusetts, have some meaningful differences of policy and personality. But the thread that connects their campaigns is their insistence that the “left behind” in America are not actually being left behind so much as stood on. They each seek to take the passive voice out of the grammar of American hardship: Your health insurance hasn’t somehow, mysteriously been made too expensive; your brick-and-mortar store hasn’t somehow, mysteriously been undercut. Someone did those things to you, probably by rigging the system to secure an undeserved advantage. And that person was probably a billionaire.

The degree of support for these ideas in 2020 is astonishing in a center-right country where, as John Steinbeck once wrote, explaining socialism’s limited growth in America: “We didn’t have any self-admitted proletarians. Everyone was a temporarily embarrassed capitalist.” It is a reflection of how fed up many Americans are with the old narratives about how, with a little pluck and patience, they too will rise. And it is a sign of a generational changing of the guard. As the (millennial) journalist Charlotte Alter, author of the new book “The Ones We’ve Been Waiting For,” told me, “Socialism is a generational Rorschach test: Boomers think of Soviet gulags and bad shoes, millennials think of Swedish health care and free education.”

In the mainstream of the Democratic Party, it has long been said that billionaires should pay more of their “fair share.” But, until recently, few would have questioned that you’d want more billionaires on the Forbes list, not fewer. Today a vocal chunk of the Democratic electorate is gravitating to a strikingly different conclusion: that America would actually be better off reducing its billionaire population through taxes and profit-trimming regulations.

(In fact, if I could ask one debate question, it would be this: Raise your hand if you would want there to be more billionaires at the end of your presidency than the start; raise your hand if you’d want fewer billionaires. Then, same question, but applied to millionaires. I think it would be revealing.)

Ballooning anti-billionaire sentiment is galvanizing billionaires. Some have been motivated to go on television to cast their critics as naïve and un-American. Others donate to centrist candidates like Mr. Biden, Mr. Buttigieg and Senator Amy Klobuchar of Minnesota, who serve a cocktail of down-home incrementalism shaken with wealth defense. But it took a special billionaire — Mr. Bloomberg, the former mayor of New York — to find a more direct way to thwart ascendant progressives. He is seeking to buy the election.

Just when the accountants thought they knew every tax-avoidance trick, here is the ultimate: become the leader of the free world. Of course, Mr. Bloomberg would say that he is running for an entirely different reason, which also happens to be very billionairey: He thinks he’s the only one with the wits and war chest to pull it off. “I alone can fix it,” as Mr. Trump once put it. It is something of a mantra for the billionaires.

There was never a way for Mr. Bloomberg to run as anything but Mr. Billionaire. The pitch he landed on was incorruptibility. “I will be the only candidate in this race who isn’t corruptible,” Mr. Bloomberg told an audience in Phoenix last November, “who isn’t going to take a penny from anyone, and will work for a dollar a year.” This was the best he could do: suggest that being a billionaire would make him more honest because billionaires are so rich they don’t have to listen to other billionaires.


Cripple the Intelligence Agencies? Not Smart

[Note:  This item comes from friend Robert Berger.  DLH]

What happens when intelligence officials warn that Russia is meddling in American politics again? Donald Trump gets mad — at the intelligence officials.
By NYT Editorial Board
Feb 21 2020

President Trump is intensifying his efforts to undermine the nation’s intelligence agencies.

On Wednesday, Mr. Trump announced that he was replacing the acting director of national intelligence, Joseph Maguire, with Richard Grenell, the ambassador to Germany.

Mr. Maguire is a retired Navy admiral who previously served as the head of the National Counterterrorism Center. Mr. Trump tapped him to lead the Office of the Director of National Intelligence last August to replace the outgoing director, Dan Coats. Until recently, Mr. Maguire was thought to have a decent shot at becoming the permanent director of the office, overseeing the nation’s spy agencies.

But that was before one of his aides gave a classified briefing on Feb. 13 to the House Intelligence Committee, in which she warned that Russia was attempting to meddle in the 2020 election with an eye toward aiding Mr. Trump — as it had in 2016.

Mr. Trump doesn’t like to hear about election interference, much less about interference by Russia. He sees the entire topic as an effort to devalue his 2016 victory. Members of his administration, as well as congressional Republicans, know that this is a matter to be broached delicately, if at all.

When the president learned of the briefing from a member of the committee, he was furious — not over the threat of foreign meddling, but that Congress had been told about it. According to a report in The Times, he was especially miffed that the meeting had included the committee’s Democratic chairman, Adam Schiff, who oversaw the recent impeachment proceedings.

The president took Mr. Maguire to the woodshed over what he saw as an act of disloyalty. He was angry about not being alerted earlier about the briefing and fretted that such delicate information would be “weaponized” by his political enemies, The Times reported. Less than a week later, Mr. Maguire was out. Administration officials insisted that the timing was a coincidence.

What is obviously not a coincidence is that Mr. Trump is once again turning for a critical appointment to someone who is short on relevant expertise but long on loyalty to him. The director of national intelligence, a position created after the Sept. 11, 2001, terror attacks, coordinates intelligence gathering and analysis across 17 federal agencies. The post has been held by former diplomats, senior military officers and, for more than six years, James Clapper, a seasoned intelligence professional.

Mr. Trump’s replacement pick, Mr. Grenell, has little intelligence experience and has never run a large bureaucracy. Before he was dispatched to Berlin, he worked as a public affairs consultant and commentator for Fox News. Before that, he was a communications official in President George W. Bush’s administration.

Although he is taking on a hugely important new job, Mr. Grenell will continue in his capacity as ambassador. As with so many of his appointments, Mr. Trump has installed Mr. Grenell in an “acting” capacity. This puts the president’s appointees on a short leash and avoids the inconvenience of Senate confirmation hearings.

Mr. Grenell has been an aggressive public cheerleader for Mr. Trump, fiercely and frequently defending him on Fox News and on social media. That appears to be the qualification that truly matters to this president — especially when it comes to overseeing an intelligence community that Mr. Trump has always believed has been out to get him.

Mr. Trump’s effort to pack the administration with political loyalists has gained momentum since the Senate acquitted him on impeachment charges earlier this month. In recent weeks, the president has removed multiple officials with connections to impeachment, including top National Security Council and Pentagon officials.


Fates of humans and insects intertwined, warn scientists

Experts call for solutions to be enforced immediately to halt global population collapses
By Damian Carrington
Feb 20 2020

The “fates of humans and insects are intertwined”, scientists have said, with the huge declines reported in some places only the “tip of the iceberg”.

The warning has been issued by 25 experts from around the world, who acknowledge that little is known about most of the estimated 5.5 million insect species. However, enough was understood to warrant immediate action, they said, because waiting for better data would risk irreversible damage.

The researchers said solutions were available and must be implemented immediately. These range from bigger nature reserves and a crackdown on harmful pesticides to individual action such as not mowing the lawn and leaving dead wood in gardens. They also said invertebrates must no longer be neglected by conservation efforts, which tend to focus on mammals and birds.

The alert has been published as two articles in the journal Biological Conservation.

“The current [insect] extinction crisis is deeply worrisome. Yet, what we know is only the tip of the iceberg,” the scientists write. “We know enough to act immediately. Solutions are now available – we must act upon them.”

“Insect declines lead to the loss of essential, irreplaceable services to humanity. Human activity is responsible for almost all current insect population declines and extinctions.”

Insect population collapses have been reported in Germany, Puerto Rico and elsewhere. The first global scientific review, published in February 2019, said widespread declines threatened to cause a “catastrophic collapse of nature’s ecosystems”. Insects pollinate three-quarters of crops, and another recent study showed widespread losses of such insects across Britain.

The report notes that only about a fifth of the world’s insect species have even been named, mostly from only single specimens.

“Many insect species are going extinct even before being described,” the researchers said. “It is likely that insect extinctions since the industrial era are around 5-10%, ie 250,000 to 500,000 species.”

This estimate is based on the extinctions of land snails. Prof Pedro Cardoso, at the Finnish Museum of Natural History and the lead author of the latest report, said: “It is the best estimate we have. There is no reason to think the trends are different between insects and land snails, but snails leave their shells behind as evidence.”

The paper also notes that British butterfly and beetle populations were said to be “fast disappearing” in the 1870s by the entomologist Archibald Swinton.

Long-term data on insect populations is rare. “We don’t know everything – in fact we know very little – but if we wait until we have better information to act it might be too late to recover many species,” Cardoso said.

“Many species are declining, probably the majority, and overall it seems the trend is for a large decline. But there are of course some species that are benefitting, for example the swarms of locusts currently in east Africa.”

Matt Shardlow, the chief executive of the conservation group Buglife, said a key report in 2016 told world governments that declines in wild pollinators presented risks to societies and ecosystems.

“However, in a repeat of the failure of politicians to respond to scientific warnings about climate change, the cautious, scientific language used has not produced an appropriate response from governments,” he said.

“Scientists are now turning up the heat on insect declines in the hope that politicians will understand the urgency and the link to human survival, and will take action before it is too late.”

The key causes of insect losses, according to the scientists, are the destruction of natural habitat for farming and buildings; the intensive use of pesticides; industrial pollution and light pollution; invasive alien species; and the climate crisis.


The Computer Scientist Responsible for Cut, Copy, and Paste, Has Passed Away

[Note:  This item comes from friend Judi Clark.  DLH]

By Andrew Liszewski
Feb 19 2020

The advent of the personal computer wasn’t just about making these powerful machines available to everyone, it was also about making them accessible and usable, even for those lacking a computer science degree. Larry Tesler, who passed away on Monday, might not be a household name like Steve Jobs or Bill Gates, but his contributions to making computers and mobile devices easier to use are the highlight of a long career influencing modern computing.

Born in 1945 in New York, Tesler went on to study computer science at Stanford University, and after graduation he dabbled in artificial intelligence research (long before it became a deeply concerning tool) and became involved in the anti-war and anti-corporate monopoly movements, with companies like IBM as one of his deserving targets. In 1973 Tesler took a job at the Xerox Palo Alto Research Center (PARC), where he worked until 1980. Xerox PARC is famously known for developing the mouse-driven graphical user interface we now all take for granted, and during his time at the lab Tesler worked with Tim Mott to create a word processor called Gypsy, best known for coining the terms “cut,” “copy,” and “paste” for the commands that remove, duplicate, or reposition chunks of text.

Xerox PARC is also well known for not capitalizing on the groundbreaking research it did in personal computing, so in 1980 Tesler moved to Apple Computer, where he worked until 1997. Over the years he held countless positions at the company, including Vice President of AppleNet (Apple’s in-house local area networking system, which was eventually canceled), and even served as Apple’s Chief Scientist, a title once held by Steve Wozniak, before eventually leaving the company.

In addition to his contributions to some of Apple’s most famous hardware, Tesler was known for his efforts to make software and user interfaces more accessible. Beyond the now ubiquitous “cut,” “copy,” and “paste” terminology, he was an advocate for an approach to UI design known as modeless computing, a cause reflected in his personal website. In essence, it ensures that user actions remain consistent throughout an operating system’s various functions and apps. When they’ve opened a word processor, for instance, users now just automatically assume that hitting any of the alphanumeric keys on their keyboard will result in that character showing up on-screen at the cursor’s insertion point. But there was a time when word processors could be switched between multiple modes, in which typing on the keyboard would either add characters to a document or, alternately, issue functional commands.

There are still plenty of software applications where tools and functionality change depending on the mode they’re in (complex apps like Photoshop, for example, where various tools behave differently and perform very distinct functions) but for the most part modern operating systems like Apple’s macOS and Microsoft’s Windows have embraced user-friendliness through a less complicated modeless approach.
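The distinction is easier to see in code. Here is a minimal, hypothetical sketch — not drawn from Gypsy or any real editor — contrasting a modal editor, where keys change meaning with the current mode, against a modeless one, where a printable key always inserts itself:

```python
class ModalEditor:
    """Keystrokes mean different things depending on the current mode."""

    def __init__(self):
        self.mode = "command"  # starts in command mode, as classic vi does
        self.text = []

    def press(self, key):
        if self.mode == "command":
            if key == "i":            # 'i' enters insert mode; any other
                self.mode = "insert"  # key would be an editing command
        elif key == "ESC":            # ESC drops back to command mode
            self.mode = "command"
        else:
            self.text.append(key)


class ModelessEditor:
    """A printable key always inserts itself: Tesler's preferred design."""

    def __init__(self):
        self.text = []

    def press(self, key):
        self.text.append(key)  # same action regardless of context


modal, modeless = ModalEditor(), ModelessEditor()
for key in "hi":
    modal.press(key)     # typed before entering insert mode: lost
    modeless.press(key)  # inserted immediately
print(repr("".join(modal.text)), "vs", repr("".join(modeless.text)))
```

In the modal case the first keystrokes silently vanish into command mode, while the modeless editor does what the user expects. That surprise is exactly the kind of inconsistency Tesler campaigned against (his license plate famously read “NO MODES”).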

After leaving Apple in 1997, Tesler co-founded a company called Stagecast Software, which developed applications that made it easier for children to learn programming concepts. In 2001 he joined Amazon, where he eventually became VP of Shopping Experience; in 2005 he switched to Yahoo, where he headed up that company’s user experience and design group; and in 2008 he became a product fellow at 23andMe. According to his CV, Tesler left 23andMe in 2009 and from then on focused mostly on consulting work.

While there are undoubtedly countless other contributions Tesler made to modern computing as part of his work on teams at Xerox and Apple that may never come to light, his known contributions are immense. Tesler is one of the major reasons computers moved out of research centers and into homes.