How Amazon moved into the business of U.S. elections

[Note:  This item comes from friend David Rosenthal.  DLH]

How Amazon moved into the business of U.S. elections
By Nandita Bose
Oct 15 2019

WASHINGTON (Reuters) – Inc’s (AMZN.O) cloud computing arm is making an aggressive push into one of the most sensitive technology sectors: U.S. elections.

The expansion by Amazon Web Services into state and local elections has quietly gathered pace since the 2016 U.S. presidential vote. More than 40 states now use one or more of Amazon’s election offerings, according to a presentation given by an Amazon executive this year and seen by Reuters. 

So do America’s two main political parties, the Democratic presidential candidate Joe Biden and the U.S. federal body charged with administering and enforcing federal campaign finance laws. 

While it does not handle voting on election day, AWS – along with a broad network of partners – now runs state and county election websites, stores voter registration rolls and ballot data, facilitates overseas voting by military personnel and helps provide live election-night results, according to company documents and interviews.

In the fullest public picture yet of Amazon’s strategic move into U.S. election infrastructure, Reuters reviewed previously unreported company presentations and documents, and conducted more than two dozen interviews with lawmakers, election administrators, and heads of election security and technology in nearly a dozen states and counties that use Amazon’s cloud. 

Amazon pitches itself as a low-cost provider of secure election technology at a time when local officials and political campaigns are under intense pressure to prevent a repeat of the 2016 presidential election, which saw cyber-attacks on voting systems and election infrastructure. 

“The fact that we have invested heavily in this area, it helps to attest to the fact that in over 40 states, the Amazon cloud is being trusted to power in some way, some aspect of elections,” said Michael Jackson, who leads Public Health & U.S. Elections at AWS, in a February webinar presentation to prospective government clients that was viewed by Reuters. 

The company’s efforts are welcomed by election administrators, who said in interviews that they often struggle to maintain outdated systems at the local level. 

In Oregon, for example, the state’s in-house servers that support election services shut down every time there is a power outage – a frequent occurrence as Oregon updates its electric grid, according to Peter Threlkel, chief information officer at the Oregon Secretary of State. A move to the cloud fixes that problem, and Oregon ran a pilot with AWS to move its voter registration system to the cloud, he said. 

Some security experts, like David O’Berry, co-founder of Precog Security, said moving to AWS is “a good option for campaigns, who do not have the resources to protect themselves.” 

Still, Amazon’s growing presence in the elections business could undermine what many officials view as a strength of the U.S. voting system: decentralization. 

Most security experts Reuters spoke to said that while Amazon’s cloud is likely much harder to hack than systems it is replacing, putting data from many jurisdictions on a single system raises the prospect that a single major breach could prove damaging. 

“It makes Amazon a bigger target” for hackers, “and also increases the challenge of dealing with an insider attack,” said Chris Vickery, director of cyber risk research at cybersecurity startup Upguard. 

A recent hack into Capital One Financial Corp’s (COF.N) data stored on Amazon’s cloud service was perpetrated by a former Amazon employee. The breach affected more than 100 million customers, underscoring how rogue employees or untrained workers can create security risks even if the underlying systems are secure. 

Amazon says its systems are reliable. “Over time, states, counties, cities, and countries will leverage AWS services to ensure modernization of their elections for increased security, reliability, and analytics for an efficient and more effective use of taxpayer dollars,” an AWS spokesperson told Reuters. 

Amazon’s push into the election business comes as the company faces criticism from politicians, labor unions and privacy advocates over its business practices and growing influence. President Donald Trump has accused the company of competing unfairly and repeatedly attacked the Washington Post, owned by Amazon CEO Jeff Bezos, for alleged bias, a charge Bezos and the paper deny.


Export of Sensitive U.S. Technology Slows Under Trump

[Note:  This item comes from friend Matt Drange.  DLH]

Export of Sensitive U.S. Technology Slows Under Trump
By Matt Drange and Mike Sullivan
Oct 15 2019

The Trump administration has made it harder for U.S. tech companies to export sensitive U.S. technologies, such as encryption software, semiconductors and drones. Export license approvals have dropped and rejections have risen in recent years, data obtained by The Information shows, and lawyers advising companies seeking export licenses say it’s taking longer to win approvals.

Still, the overall number of government-approved export licenses for technologies with both military and civilian uses far outpaces denials, the data shows. This suggests President Donald Trump’s desire to clamp down on the export of sensitive technologies, particularly to China, has had a relatively limited impact. 

All told, the Commerce Department approved more than 74,700 license applications, worth more than $64 billion, to export regulated technology in 2018, according to the data. That’s down from a high of just over 92,100 licenses in 2015 under then-President Barack Obama. 

The data, which is collected by a small division of the Department of Commerce, has never before been made public. It was obtained via a Freedom of Information Act request, and The Information is making the complete dataset, including every category the government collects, available for download here. The data include the dollar amount and destination country for technologies that require government approval for export. Many of the license applications were filed by tech companies, including some that do business with Chinese firms ZTE and Huawei, which have been sanctioned by the U.S. government. We’ve also prepared an interactive world map that lets you explore the data by country, available here.

License denials, meanwhile, have risen since Trump took office. More than 1,000 applications were denied in each of 2018 and 2017, representing more than $1 billion in goods combined and outpacing prior years. (In the chart above, the denials for 2014 represent a change in how the Commerce Department, which grants licenses, categorizes certain types, resulting in an artificial spike that year.) 

“We’ve seen slowdowns and unexpected rejections for China” since Trump took office, said Dan Fisher-Owens, an attorney at Berliner Corcoran & Rowe. “We’re seeing more rejections of national security–controlled items that would have been licensed to China two years ago, including clients who have received the same items in the past.”

A number of different exports appear to have been affected by Trump’s policies on China. Approvals of certain semiconductors and chip-related intellectual property, for example, dipped in 2018. Many of these items are at the heart of China’s “Made in China 2025” plan, which aims to have lucrative technology the country currently imports be made at home. 

To put the data in context, The Information had a half-dozen experts, including former Commerce Department officials as well as lawyers who submit applications to the agency on behalf of companies, study the numbers. Among the findings that stood out:

• A number of cryptographic-related items have been approved for export to North Korea, totaling more than $1.2 million, since 2010. Experts said these were likely for use either by an allied government, for instance at an embassy, or by a humanitarian-oriented nonprofit organization. (A Commerce Department spokesman said export laws allow for “approval of humanitarian items, items supporting United Nations humanitarian efforts and agricultural commodities or medical devices that are not luxury goods.”) 
• A handful of high-value IT-related goods were approved for export to Syria, items that experts say were most likely destined for NGOs working on humanitarian aid, rather than Syrian government users. (A Commerce Department spokesman said end users “would be parties engaged in activity supporting U.S. foreign policy objectives in Syria.”) 
• A handful of high-end cryptographic goods were approved for export to Israel, possibly for government use, experts said. This reflects a longstanding cooperation between the U.S. and Israel on cyber and network security issues. 
‘Chilling Effect’

The current political climate has had a “chilling effect” on Silicon Valley, Fisher-Owens said, adding that small and mid-size tech firms are seeking legal advice on export issues for the first time. Many of them, he said, do business with ZTE and Huawei, the Chinese hardware giants at the center of the U.S.-China trade and security policy battle in recent years. While large companies, such as established semiconductor makers, typically have export control programs with dedicated staff, many smaller tech firms “are not used to having export controls applied to them,” Fisher-Owens said.


This Is the Most Realistic Path to Medicare for All

This Is the Most Realistic Path to Medicare for All
It’s similar to how we got to Medicare in the first place: the failure of private insurance.
By J.B. Silvers
Oct 15 2019

Much to the dismay of single-payer advocates, our current health insurance system is likely to end with a whimper, not a bang. The average person simply prefers what we know versus the bureaucracy we fear.

But for entirely practical reasons, we might yet end up with a form of Medicare for All. Private health insurance is failing in slow motion, and all signs are that it will continue. It was for similar reasons that we got Medicare in 1965. Private insurance, under the crushing weight of chronic conditions and technologic breakthroughs (especially genetics), will increasingly be a losing proposition.

As a former health insurance company C.E.O., I know how insurance is supposed to work: It has to be reasonably priced, spread risks across a pool of policyholders and pay claims when needed. When companies can no longer do those fundamental tasks and make a decent profit, we will get single payer.

It’s already a tough business to be in. Right now the payment system for health care is just a mess. For every dollar of premium, administrative costs absorb up to 20 percent. That’s just too high, and it’s not the only reason for dissatisfaction.

Patients hate paying for cost-sharing in the form of deductibles and copays. Furthermore, narrow networks with a limited number of doctors and hospitals are good for insurers, because it gives them bargaining power, but patients are often left frustrated and hit with surprise bills.

As bad as these problems are, most people are afraid of losing coverage through their employers in favor of a government-run plan. Thus inertia wins — for now. 

But there’s a reason Medicare for All is even a possibility: Most people like Medicare. It works reasonably well. And what could drive changes to our current arrangement is a disruption — like the collapse of private insurance.

There are two things insurers hate to do — take risks and pay claims. Before Affordable Care Act regulations, insurance companies cherry-picked for lower-risk customers and charged excessive rates for some enrollees.

Those were actually the first indications of market failure. Since the enactment of the Affordable Care Act, insurers have actually had to take these risks as they were supposed to all along and provide rebates of excessive profits.

With insurers under such pressure, we’re now facing another sort of market dysfunction. Insurance companies are doing what they can to avoid paying claims. A recent report says that Obamacare plans average an 18 percent denial rate for in-network claims submitted by providers. Some reject more than a third. This suggests that even in a regulated marketplace like the Obamacare exchanges, insurers somehow manage to dispute nearly one out of every five claims.

These are systemic failures that can and should be fixed by regulation of the exchanges, better information on plan performance and robust competition. Unfortunately, consumers often still can’t make informed choices, and the options they have are limited.

But even if we fix these problems, there are two bigger factors looming that threaten the integrity of the entire system. Insurance at its root assumes that the payout required cannot be determined for each individual but can be estimated for the whole group. We can’t predict who will be affected by trauma or a broken bone, but in the aggregate, it is possible to estimate what will happen to the insured group as a whole. Some will suffer losses while the majority will be fine, and all will pay a fair average premium to cover the expenses that result.
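That aggregate logic can be sketched numerically. The figures below are purely illustrative assumptions (none come from the article): a pool where any one person's claim is unpredictable, but the pool-wide total tracks the expected value closely enough to set a fair premium.

```python
import random

random.seed(42)

POOL_SIZE = 10_000       # enrollees in the pool (illustrative)
LOSS_PROB = 0.05         # chance any one person files a claim this year
AVG_CLAIM = 20_000.0     # average cost of a claim, in dollars

# Expected cost per enrollee: probability times average claim.
expected_cost = LOSS_PROB * AVG_CLAIM          # $1,000 per person
premium = expected_cost * 1.20                 # plus a 20% loading for admin costs

# Simulate one year: nobody knows WHO will file a claim,
# but the pool-wide total lands near the expected value.
claims = sum(AVG_CLAIM for _ in range(POOL_SIZE) if random.random() < LOSS_PROB)
collected = premium * POOL_SIZE

print(f"premium per enrollee: ${premium:,.0f}")
print(f"claims paid: ${claims:,.0f}   premiums collected: ${collected:,.0f}")
```

The point of the sketch is the next paragraph's failure mode: once a chronic condition makes an individual's cost predictable every year, the "unpredictable individually, estimable in aggregate" assumption breaks, and no loading factor fixes it.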

Yet with the increases in chronic conditions and the promise of genetic information, these insurance requirements are not met. Someone with diabetes or rheumatoid arthritis will have the same condition and similar costs in each future year. And the woman with a positive BRCA gene is much more likely to develop breast cancer. In these cases, known costs simply must be paid. Instead of spreading these across all enrolled populations, they must be financed across time for the increasing numbers with such conditions. Loading private insurance companies with these expenses results in uncompetitive rates and market failure.

There is only one solution: pooling and financing some or all of these at the broadest levels. In a nutshell, that is how we get a single-payer government system.


Scientists Create a Material That Captures CO2 And Turns It Into Organic Matter

[Note:  This item comes from friend David Rosenthal.  DLH]

Scientists Create a Material That Captures CO2 And Turns It Into Organic Matter 
Oct 15 2019

Scientists have come up with an innovative way to counter the massive amounts of carbon dioxide we’re still pumping into the air, even as a climate crisis unfolds around us: turning that CO2 into a useful organic polymer.

The newly developed method sucks CO2 molecules out of the air, without expending much energy in the process. The material can then potentially be turned into an ingredient for packaging or clothing.

The secret weapon is a porous coordination polymer (PCP) made up of zinc metal ions.

Those ions are able to selectively capture CO2 molecules with 10 times greater efficiency than other PCPs, the scientists say. What’s more, the material is reusable, and was still running at maximum efficiency after 10 reaction cycles.

“We have successfully designed a porous material which has a high affinity towards CO2 molecules and can quickly and effectively convert it into useful organic materials,” says materials chemist Ken-ichi Otake, from Kyoto University in Japan.

The idea of carbon sequestration has been around for some time, but the low reactivity of carbon dioxide means it’s difficult to capture and lock away without using a lot of energy along the way – which kind of defeats the point.

PCPs (also known as metal-organic frameworks or MOFs) might hold the key to overcoming this obstacle. The one outlined in this new study uses a clever trick: an organic component with a propeller-like structure.

Using X-ray structural analysis, the researchers found that as CO2 molecules approach the PCP, its molecular structure rotates and rearranges, allowing the carbon dioxide to be trapped in the material.

The PCP is essentially working as a molecular sieve, able to recognise molecules by size and shape. Once the material has done its CO2-sucking job, it can be reused or recycled as an organic polymer. Organic polymers are able to be turned into polyurethane, which is used in clothing, packaging, domestic appliances and a variety of other areas.

We’re seeing a number of promising developments in the field of carbon storage. Earlier this year scientists from RMIT University in Australia presented a way of turning CO2 back into coal, using a chemical reaction involving the metal cerium.

Another team of researchers, from Rice University in the US, have been able to develop a device for turning CO2 into liquid fuel: in this case the metal bismuth is the key ingredient, and formic acid is the end result.

All these ideas require further research and need to work at larger scales, but progress is being made. That said, they shouldn’t distract us from the best way of cutting down the CO2 in the air and slowing global warming – reducing our carbon emissions.


The New Makers of Plant-Based Meat? Big Meat Companies

The New Makers of Plant-Based Meat? Big Meat Companies
Tyson, Smithfield, Perdue and Hormel have all rolled out meat alternatives, filling supermarket shelves with an array of plant-based burgers, meatballs and chicken nuggets.
By David Yaffe-Bellany
Oct 14 2019

Beyond Meat and Impossible Foods, scrappy start-ups that share a penchant for superlatives and a commitment to protecting the environment, have dominated the relatively new market for vegetarian food that looks and tastes like meat.

But with plant-based burgers, sausages and chicken increasingly popular and available in fast-food restaurants and grocery stores across the United States, a new group of companies has started making meatless meat: the food conglomerates and meat producers that Beyond Meat and Impossible Foods originally set out to disrupt.

In recent months, major food companies like Tyson, Smithfield, Perdue, Hormel and Nestlé have rolled out their own meat alternatives, filling supermarket shelves with plant-based burgers, meatballs and chicken nuggets. 

Once largely the domain of vegans and vegetarians, plant-based meat is fast becoming a staple of more people’s diets, as consumers look to reduce their meat intake amid concerns about its health effects and contribution to climate change. Over the last five months, Beyond Meat’s stock price has soared and Impossible Foods’ deal to provide plant-based Whoppers at Burger King has prompted a wave of fast-food chains to test similar products. Analysts project that the market for plant-based protein and lab-created meat alternatives could be worth as much as $85 billion by 2030.

Now, at supermarkets across the United States, shoppers can find plant-based beef and chicken sold alongside the packaged meat products that generations of Americans have eaten.

“There is a growing demand out there,” said John Pauley, the chief commercial officer for Smithfield, one of the largest pork producers in the country. “We’d be foolish not to pay attention.” 

In September, Nestlé released the Awesome Burger, its answer to the meatless patties of Beyond Meat and Impossible Foods. (“We do feel like it’s an awesome product,” a Nestlé spokeswoman said.) Smithfield started a line of soy-based burgers, meatballs and sausages, and Hormel began offering plant-based ground meat. 

There are also blended options — a kind of faux fake meat that falls somewhere in the existential gray area between the Beyond Burger and a cut of beef. Tyson is introducing a part-meat, part-plant burger. And Perdue is selling blended nuggets, mixing poultry with “vegetable nutrition” in the form of cauliflower and chickpeas.

Many supporters of meatless alternatives have hailed the new products as a sign that plant-based meat has gained widespread acceptance. 

“When companies like Tyson and Smithfield launch plant-based meat products, that transforms the plant-based meat sector from niche to mainstream,” said Bruce Friedrich, who runs the Good Food Institute, an organization that advocates plant-based substitutes. “They have massive distribution channels, they have enthusiastic consumer bases, and they know what meat needs to do to satisfy consumers.”

But the emergence of these meat companies in the plant-based-protein market has also prompted suspicion and unease among some environmental activists, who worry the companies could co-opt the movement by absorbing smaller start-ups, or simply use plant-based burgers to draw attention away from other environmental misdeeds.

“That’s a legitimate concern,” said Glenn Hurowitz, who runs the environmental advocacy organization Mighty Earth. For years, big oil companies bought clean-energy start-ups and essentially shut them down, he noted. 

“Making admittedly modest investments in plant-based protein is a legitimately good thing for these businesses to do,” Mr. Hurowitz said, but “it doesn’t entirely balance out all the pollution they’re causing.” 

Many of the major food companies began investing in plant-based meat or other vegan alternatives years ago. But the pace has accelerated over the past few months. 

“The entire end-to-end process happened in less than a year,” said Justin Whitmore, Tyson’s executive vice president for alternative protein. “We’ll move with the consumer, and we have the capacity that helps us move quickly.”


The Amelia Earhart Mystery Stays Down in the Deep

The Amelia Earhart Mystery Stays Down in the Deep
Robert Ballard’s expedition to a remote island in the South Pacific found no evidence of the vanished aviator’s plane. But the explorer and his crew haven’t given up.
By Julie Cohn
Oct 15 2019

For two weeks in August, a multimillion-dollar search from air, land and sea sought to solve the 80-year mystery of Amelia Earhart’s disappearance.

Robert Ballard, the ocean explorer famous for locating the wreck of the Titanic, led a team that discovered two hats in the depths. It found debris from an old shipwreck. It even spotted a soda can. What it did not find was a single piece of the Lockheed Electra airplane flown in 1937 by Amelia Earhart and Fred Noonan, which vanished during their doomed voyage around the world.

Dr. Ballard and his crew don’t consider it a failure. For one thing, he says, they know where the plane isn’t. And in the process, they may have dispensed with one clue that has driven years of speculation, while a team of collaborating archaeologists potentially turned up more hints at the aviator’s fate.

“This plane exists,” Dr. Ballard said. “It’s not the Loch Ness monster, and it’s going to be found.”

Dr. Ballard had avoided the Earhart mystery for decades, dismissing the search area as too large, until he was presented with a clue he found irresistible. Kurt Campbell, then a senior official in President Barack Obama’s State Department, shared with him what is known as the Bevington image — a photo taken by a British officer in 1940 at what is now known as Nikumaroro, an atoll in the Phoenix Islands in the Republic of Kiribati. American intelligence analysts had enhanced the image at Mr. Campbell’s request, and concluded a blurry object in it was consistent with landing gear from Earhart’s plane.

Motivated by this clue, and by 30 years of research on Nikumaroro by the International Group for Historic Aircraft Recovery, Dr. Ballard and his crew set a course for the island in August. They were joined by archaeologists from the National Geographic Society, which sponsored and documented the journey for “Expedition Amelia,” which will air on the National Geographic Channel on Sunday.

Dr. Ballard and Allison Fundis, the Nautilus’s chief operating officer, coordinated an elaborate plan of attack. First, they sent the ship five times around the island to map it with multibeam sonar, and deployed a floating autonomous surface vehicle to map shallower areas off the island’s shore. They also used four aerial drones for additional inspections of the surrounding reef.

Nikumaroro and its reef are just the tip of a 16,000-foot underwater mountain, a series of 13 sheer escarpments that drop off onto ramps, eventually fanning out at the base for six nautical miles.

If Earhart crashed there, they believe, rising tides would have dragged her plane over the reef, and down the escarpments. Fragments should have collected on the ramps, especially heavier components like the engine and the radio.

In deeper water the team deployed the Hercules and the Argus, remotely operated vehicles equipped with spotlights and high-definition cameras. These robots descended 650 feet around the entire island, and found nothing.

At that point, the crew focused on the northwest corner of the island near the S.S. Norwich City, a British freighter that ran aground on the island in 1929, eight years before Earhart’s disappearance. That is the area where the Bevington photo was taken.

While they searched there, crew members found so many beach rocks consistent in size and shape with the supposed landing gear in the Bevington image that it became a joke on the ship.

“Oh look,” Dr. Ballard would chuckle, “another landing gear rock.”

Ms. Fundis said, “We felt like if her plane was there, we would have found it pretty early in the expedition.” But she said they kept up their morale because Dr. Ballard reminded them that it took four missions to find the Titanic, and that one of those expeditions missed the ship by just under 500 feet.

The crew mapped the mountain’s underwater drainage patterns and searched the gullies that might have carried plane fragments down slope, to a depth of 8,500 feet. Crew members even searched roughly four nautical miles out to sea, in case the plane lifted off the reef intact and glided underwater as it sank.

Each time a new search tactic yielded nothing, Dr. Ballard said, he felt he was adding “nail after nail after nail” to the coffin of the Nikumaroro hypothesis.

Still, Dr. Ballard and Ms. Fundis confess that other clues pointing to Nikumaroro have left them with lingering curiosity about whether Earhart crashed there. For instance, Pan American Airways radio direction finders on Wake Island, Midway Atoll and Honolulu each picked up distress signals from Earhart and took bearings, which triangulated in the cluster of islands that includes Nikumaroro.

For years, many Earhart historians have been skeptical of the Nikumaroro theory. And Dr. Ballard, Ms. Fundis and their team’s return to the island will now depend on whether the archaeologists from the National Geographic Society came up with evidence that Earhart’s body was there.

Fredrik Hiebert, the society’s archaeologist in residence, has some leads. His team awaits DNA analysis on soil samples taken at a bivouac shelter found on the island.


Without encryption, we will lose all privacy. This is our new battleground

Without encryption, we will lose all privacy. This is our new battleground
The US, UK and Australia are taking on Facebook in a bid to undermine the only method that protects our personal information
By Edward Snowden
Oct 15 2019

In every country of the world, the security of computers keeps the lights on, the shelves stocked, the dams closed, and transportation running. For more than half a decade, the vulnerability of our computers and computer networks has been ranked the number one risk in the US Intelligence Community’s Worldwide Threat Assessment – that’s higher than terrorism, higher than war. Your bank balance, the local hospital’s equipment, and the 2020 US presidential election, among many, many other things, all depend on computer safety.

And yet, in the midst of the greatest computer security crisis in history, the US government, along with the governments of the UK and Australia, is attempting to undermine the only method that currently exists for reliably protecting the world’s information: encryption. Should they succeed in their quest to undermine encryption, our public infrastructure and private lives will be rendered permanently unsafe.

In the simplest terms, encryption is a method of protecting information, the primary way to keep digital communications safe. Every email you write, every keyword you type into a search box – every embarrassing thing you do online – is transmitted across an increasingly hostile internet. Earlier this month the US, alongside the UK and Australia, called on Facebook to create a “backdoor”, or fatal flaw, into its encrypted messaging apps, which would allow anyone with the key to that backdoor unlimited access to private communications. So far, Facebook has resisted this.

If internet traffic is unencrypted, any government, company, or criminal that happens to notice it can – and, in fact, does – steal a copy of it, secretly recording your information for ever. If, however, you encrypt this traffic, your information cannot be read: only those who have a special decryption key can unlock it.
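The core property (ciphertext is unreadable without the key) can be shown with a toy one-time pad. To be clear, this is only an illustration of the principle; real traffic protection uses vetted ciphers such as AES inside protocols like TLS, not hand-rolled XOR.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the matching key byte (a one-time pad)."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet me at noon"
key = secrets.token_bytes(len(message))   # random key, same length as the message

ciphertext = xor_bytes(message, key)      # what an eavesdropper on the wire sees
recovered = xor_bytes(ciphertext, key)    # only a key holder can undo it

assert recovered == message
```

Anyone copying the ciphertext off the network learns nothing without the key; with it, decryption is trivial. That asymmetry is all "encryption" means here.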

I know a little about this, because for a time I operated part of the US National Security Agency’s global system of mass surveillance. In June 2013 I worked with journalists to reveal that system to a scandalised world. Without encryption I could not have written the story of how it all happened – my book Permanent Record – and got the manuscript safely across borders that I myself can’t cross. More importantly, encryption helps everyone from reporters, dissidents, activists, NGO workers and whistleblowers, to doctors, lawyers and politicians, to do their work – not just in the world’s most dangerous and repressive countries, but in every single country.

When I came forward in 2013, the US government wasn’t just passively surveilling internet traffic as it crossed the network, but had also found ways to co-opt and, at times, infiltrate the internal networks of major American tech companies. At the time, only a small fraction of web traffic was encrypted: six years later, Facebook, Google and Apple have made encryption-by-default a central part of their products, with the result that today close to 80% of web traffic is encrypted. Even the former director of US national intelligence, James Clapper, credits the revelation of mass surveillance with significantly advancing the commercial adoption of encryption. The internet is more secure as a result. Too secure, in the opinion of some governments.

Donald Trump’s attorney general, William Barr, who authorised one of the earliest mass surveillance programmes without reviewing whether it was legal, is now signalling an intention to halt – or even roll back – the progress of the last six years. WhatsApp, the messaging service owned by Facebook, already uses end-to-end encryption (E2EE): in March the company announced its intention to incorporate E2EE into its other messaging apps – Facebook Messenger and Instagram – as well. Now Barr is launching a public campaign to prevent Facebook from climbing this next rung on the ladder of digital security. This began with an open letter co-signed by Barr, UK home secretary Priti Patel, Australia’s minister for home affairs and the US secretary of homeland security, demanding Facebook abandon its encryption proposals.

If Barr’s campaign is successful, the communications of billions will remain frozen in a state of permanent insecurity: users will be vulnerable by design. And those communications will be vulnerable not only to investigators in the US, UK and Australia, but also to the intelligence agencies of China, Russia and Saudi Arabia – not to mention hackers around the world.

End-to-end encrypted communication systems are designed so that messages can be read only by the sender and their intended recipients, even if the encrypted – meaning locked – messages themselves are stored by an untrusted third party, for example, a social media company such as Facebook.

The central improvement E2EE provides over older security systems is in ensuring the keys that unlock any given message are only ever stored on the specific devices at the end-points of a communication – for example the phones of the sender or receiver of the message – rather than the middlemen who own the various internet platforms enabling it. Since E2EE keys aren’t held by these intermediary service providers, they can no longer be stolen in the event of the massive corporate data breaches that are so common today, providing an essential security benefit. In short, E2EE enables companies such as Facebook, Google or Apple to protect their users from their scrutiny: by ensuring they no longer hold the keys to our most private conversations, these corporations become less of an all-seeing eye than a blindfolded courier.
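The endpoint-key idea can be sketched with a toy Diffie-Hellman exchange: each endpoint keeps a private value, publishes only a public value, and both derive the same session key that the relaying platform never sees. Real E2EE systems (for example the Signal protocol underlying WhatsApp) use vetted elliptic-curve groups and key ratcheting; the prime below is a stand-in chosen only so the sketch runs.

```python
import hashlib
import secrets

# Toy group parameters: a Mersenne prime far smaller than anything
# a real protocol would use, purely for illustration.
P = 2**127 - 1
G = 5

# Each endpoint generates a private value that never leaves the device.
alice_private = secrets.randbelow(P - 2) + 2
bob_private = secrets.randbelow(P - 2) + 2

# Only these public values ever transit the platform's servers.
alice_public = pow(G, alice_private, P)
bob_public = pow(G, bob_private, P)

# Both endpoints compute the same shared secret; the middleman,
# holding only the public values, cannot.
alice_shared = pow(bob_public, alice_private, P)
bob_shared = pow(alice_public, bob_private, P)
assert alice_shared == bob_shared

# Hash the shared secret into a symmetric session key held only on the endpoints.
session_key = hashlib.sha256(alice_shared.to_bytes(16, "big")).digest()
```

Because the private values and the derived `session_key` exist only on the two devices, a breach of the platform's servers exposes ciphertext and public values, not the keys: exactly the "blindfolded courier" property described above.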