Secret surveillance laws make it impossible to have an informed debate about privacy
By Cory Doctorow
Oct 26 2015
James Losey’s new, open access, peer-reviewed article in the Journal of International Communication analyzes how secret laws underpinning surveillance undermine democratic principles and how transparency from both companies and governments is a critical first step for supporting an informed debate:
Transparency is a critical step toward accountability of the mechanisms through which law enforcement and government agencies access communications data. Since 2010, a growing contingent of ICT companies have begun to publish transparency reports on the extent that governments request their user data, and some include requirements to remove content as well. However, governments have fallen short on providing the level of detail on surveillance programs that is necessary for informed debate. This article offers an overview of transparency reports currently published by ICT companies and discusses why increased transparency is a necessary but insufficient condition for accountability and supporting democratic debates on the practice and extent of surveillance of communications. Furthermore, this article discusses why governments are well-positioned to provide a greater level of transparency on the legal processes and technical means through which law enforcement actors and agencies access private communications data.
Surveillance of Communications: A Legitimization Crisis and the Need for Transparency
Exclusive: Elevated CO2 Levels Directly Affect Human Cognition, New Harvard Study Shows
By JOE ROMM
Oct 26 2015
In a landmark public health finding, a new study from the Harvard School of Public Health finds that carbon dioxide (CO2) has a direct and negative impact on human cognition and decision-making. These impacts have been observed at CO2 levels that most Americans — and their children — are routinely exposed to today inside classrooms, offices, homes, planes, and cars.
Carbon dioxide levels are inevitably higher indoors than the baseline set by the outdoor air used for ventilation, a baseline that is rising at an accelerating rate thanks to human activity, especially the burning of fossil fuels. So this seminal research has equally great importance for climate policy, providing an entirely new public health impetus for keeping global CO2 levels as low as possible.
In a series of articles, I will examine the implications for public health both today (indoors) as well as in the future (indoors and out) due to rising CO2 levels. This series is the result of a year-long investigation for Climate Progress and my new Oxford University Press book coming out next week, “Climate Change: What Everyone Needs to Know.” This investigative report is built on dozens of studies and literature reviews as well as exclusive interviews with many of the world’s leading experts in public health and indoor air quality, including authors of both studies.
What scientists have discovered about the impact of elevated carbon dioxide levels on the brain
Significantly, the Harvard study confirms the findings of a little-publicized 2012 Lawrence Berkeley National Laboratory (LBNL) study, “Is CO2 an Indoor Pollutant? Direct Effects of Low-to-Moderate CO2 Concentrations on Human Decision-Making Performance.” That study found “statistically significant and meaningful reductions in decision-making performance” in test subjects as CO2 levels rose from a baseline of 600 parts per million (ppm) to 1000 ppm and 2500 ppm.
Both the Harvard and LBNL studies made use of a sophisticated multi-variable assessment of human cognition used by a State University of New York (SUNY) Upstate Medical University team, led by Dr. Usha Satish. Both teams raised indoor CO2 levels while leaving all other factors constant. The findings of each team were published in the peer-reviewed open-access journal Environmental Health Perspectives put out by the National Institute of Environmental Health Sciences, a part of NIH.
The new study, led by Dr. Joe Allen, Director of Harvard’s Healthy Buildings program, and Dr. John Spengler, Professor of Environmental Health and Human Habitation at Harvard, used a lower CO2 baseline than the earlier study. They found that, on average, a typical participant’s cognitive scores dropped 21 percent with a 400 ppm increase in CO2. Their findings for four of the nine cognitive functions scored in a double-blind test of the impact of elevated CO2 levels are astonishing.
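The reported average effect can be illustrated with a simple linear extrapolation. The sketch below is illustrative only: the study reports an average 21 percent drop per 400 ppm increase, and the real dose-response relationship need not be linear. The function and parameter names are hypothetical.

```python
def estimated_score(baseline_score, co2_ppm, baseline_ppm=600):
    """Illustrative linear model: scores fall by ~21% of baseline
    for every 400 ppm of CO2 above the baseline level."""
    drop_fraction = 0.21 * (co2_ppm - baseline_ppm) / 400
    return max(0.0, baseline_score * (1 - drop_fraction))

# A score of 100 at a 600 ppm baseline falls to roughly 79 at 1000 ppm.
print(estimated_score(100, 1000))
```

At extreme concentrations the linear model bottoms out at zero, which is a limitation of the sketch, not a claim from the study.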
[Note: This item comes from friend Bob Frankston. DLH]
From: “Bob Frankston” <firstname.lastname@example.org>
Date: October 26, 2015 at 8:52:17 AM PDT
Subject: CCTV cameras worldwide used in DDoS attacks | ZDNet
Again … the real message is not in the particular vulnerability of reusing credentials. It’s a reminder that it’s going to take a while to evolve this new landscape of connected things. In the meantime, we need to learn to survive such problems rather than focusing on prevention and trying to put a wall between good and evil.
CCTV cameras worldwide used in DDoS attacks
Over 900 CCTV cameras have been enlisted as slaves in a botnet thanks to default credentials.
By Charlie Osborne for Zero Day
Oct 26 2015
Over 900 CCTV cameras have become slaves in a global botnet used to disrupt online services, researchers have discovered.
In the past year, we’ve seen refrigerators being hacked, Jeeps being remotely controlled by attackers while the driver is a helpless passenger, and everything from baby monitors to routers being criticized for poor security which can place not only our Internet of Things (IoT) devices at risk, but our personal privacy and security.
There are approximately 240 million surveillance cameras in use worldwide — counting only those which have been professionally logged and installed. Unfortunately, if default settings are left in place and forgotten about, surveillance cameras can become an easy target for cyberattackers setting up or empowering botnets — networks of slave systems which can flood Internet services with traffic after directions from a master controller, resulting in denial-of-service for legitimate traffic.
According to Incapsula’s research team, CCTV cameras are a common element of IoT-based botnets. In March last year, Incapsula discovered a 240 percent surge in botnet activity across the firm’s network — and much of this uptick was placed at the feet of enslaved CCTV cameras across the globe.
Now, a fresh attack is poised to disrupt online services. While investigating an HTTP GET flood attack — a type of distributed denial-of-service (DDoS) campaign — that peaked at around 20,000 requests per second, the researchers found that many of the attacking IPs belonged to CCTV cameras.
Traffic was able to surge through these connected devices due to installers failing to change default credentials in order to protect the cameras from infiltration.
All of the compromised devices were running BusyBox, a lightweight Unix utility bundle designed for systems with limited resources. Once an attacker gained access to a camera through the default credentials, they installed a variation of the ELF Bashlite malware, a type of malicious code which scans for network devices running BusyBox.
If devices are discovered, the malware then searches for open Telnet/SSH services which are susceptible to brute force dictionary attacks. This particular variant, however, was also equipped with the power to launch DDoS attacks.
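The root weakness described above — devices still running their factory logins — is something an operator can audit offline against an inventory. The sketch below is a hypothetical defensive check, not Incapsula’s methodology; the credential pairs and device records shown are made-up examples, not a real vendor list.

```python
# Hypothetical factory-default credential pairs (examples only).
DEFAULT_CREDENTIALS = {
    ("admin", "admin"),
    ("admin", "12345"),
    ("root", "root"),
}

def find_default_credential_devices(inventory):
    """Return addresses of devices whose (user, password) pair
    matches a known factory default and should be changed."""
    return [
        device["address"]
        for device in inventory
        if (device["user"], device["password"]) in DEFAULT_CREDENTIALS
    ]

cameras = [
    {"address": "10.0.0.5", "user": "admin", "password": "admin"},
    {"address": "10.0.0.6", "user": "admin", "password": "s3cure-pw"},
]
print(find_default_credential_devices(cameras))  # ['10.0.0.5']
```

Attackers run the same comparison by brute force over the network; doing it once against your own device list closes the door before they do.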
A map of all the hacked CCTV cameras involved in DDoS attacks is below:
Processed meats do cause cancer – WHO
Processed meats – such as bacon, sausages and ham – do cause cancer, according to the World Health Organization (WHO).
By James Gallagher and Helen Briggs
Oct 26 2015
Its report said 50g of processed meat a day – less than two slices of bacon – increased the chance of developing colorectal cancer by 18%.
Meanwhile, it said red meats were “probably carcinogenic” but there was limited evidence.
The WHO did stress that meat also had health benefits.
Cancer Research UK said this was a reason to cut down rather than give up red and processed meats.
And added that an occasional bacon sandwich would do little harm.
Processed meat is meat that has been modified to increase its shelf-life or alter its taste – such as by smoking, curing or adding salt or preservatives.
It is these additions which could be increasing the risk of cancer. High-temperature cooking, such as on a barbecue, can also create carcinogenic chemicals.
The WHO has come to the conclusion on the advice of its International Agency for Research on Cancer, which assesses the best available scientific evidence.
It has now placed processed meat in the same category as plutonium and alcohol, substances that definitely do cause cancer.
However, this does not mean they are equally dangerous. A bacon sandwich is not as bad as smoking.
“For an individual, the risk of developing colorectal (bowel) cancer because of their consumption of processed meat remains small, but this risk increases with the amount of meat consumed,” Dr Kurt Straif from the WHO said.
Estimates suggest 34,000 deaths from cancer every year could be down to diets high in processed meat.
That is in contrast to one million deaths from cancer caused by smoking and 600,000 attributed to alcohol each year.
Red meat does have nutritional value too and is a major source of iron, zinc and vitamin B12.
However, the WHO said there was limited evidence that 100g of red meat a day increased the risk of cancer by 17%.
An eight-ounce steak is 225g.
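The figures above can be combined in a back-of-the-envelope way. This sketch assumes, purely for illustration, that the reported increases compound multiplicatively per 50g of processed meat (18%) and per 100g of red meat (17%); the WHO report does not state that the relationship scales this way, so treat the output as arithmetic, not epidemiology.

```python
def relative_risk(processed_g=0, red_g=0):
    """Illustrative multiplicative model of colorectal cancer risk,
    relative to eating no processed or red meat daily."""
    return (1.18 ** (processed_g / 50)) * (1.17 ** (red_g / 100))

# Two slices of bacon a day (~50g processed meat):
print(round(relative_risk(processed_g=50), 2))  # 1.18
# An eight-ounce steak (~225g red meat), under the same assumption:
print(round(relative_risk(red_g=225), 2))
```

Even under this toy model the absolute risk for an individual stays small, which matches Dr. Straif’s point in the article.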
The WHO said its findings were important for helping countries give balanced dietary advice.
[Note: This item comes from friend Bruce Koball. DLH]
Russian Ships Near Data Cables Are Too Close for U.S. Comfort
By DAVID E. SANGER and ERIC SCHMITT
Oct 25 2015
WASHINGTON — Russian submarines and spy ships are aggressively operating near the vital undersea cables that carry almost all global Internet communications, raising concerns among some American military and intelligence officials that the Russians might be planning to attack those lines in times of tension or conflict.
The issue goes beyond old worries during the Cold War that the Russians would tap into the cables — a task American intelligence agencies also mastered decades ago. The alarm today is deeper: The ultimate Russian hack on the United States could involve severing the fiber-optic cables at some of their hardest-to-access locations to halt the instant communications on which the West’s governments, economies and citizens have grown dependent.
While there is no evidence yet of any cable cutting, the concern is part of a growing wariness among senior American and allied military and intelligence officials over the accelerated activity by Russian armed forces around the globe. At the same time, the internal debate in Washington illustrates how the United States is increasingly viewing every Russian move through a lens of deep distrust, reminiscent of relations during the Cold War.
Inside the Pentagon and the nation’s spy agencies, the assessments of Russia’s growing naval activities are highly classified and not publicly discussed in detail. American officials are secretive about what they are doing both to monitor the activity and to find ways to recover quickly if cables are cut. But more than a dozen officials confirmed in broad terms that it had become the source of significant attention in the Pentagon.
“I’m worried every day about what the Russians may be doing,” said Rear Adm. Frederick J. Roegge, commander of the Navy’s submarine fleet in the Pacific, who would not answer questions about possible Russian plans for cutting the undersea cables.
Cmdr. William Marks, a Navy spokesman in Washington, said: “It would be a concern to hear any country was tampering with communication cables; however, due to the classified nature of submarine operations, we do not discuss specifics.”
In private, however, commanders and intelligence officials are far more direct. They report that from the North Sea to Northeast Asia and even in waters closer to American shores, they are monitoring significantly increased Russian activity along the known routes of the cables, which carry the lifeblood of global electronic communications and commerce.
Just last month, the Russian spy ship Yantar, equipped with two self-propelled deep-sea submersible craft, cruised slowly off the East Coast of the United States on its way to Cuba — where one major cable lands near the American naval station at Guantánamo Bay. It was monitored constantly by American spy satellites, ships and planes. Navy officials said the Yantar and the submersible vehicles it can drop off its decks have the capability to cut cables miles down in the sea.
“The level of activity,” a senior European diplomat said, “is comparable to what we saw in the Cold War.”
One NATO ally, Norway, is so concerned that it has asked its neighbors for aid in tracking Russian submarines.
Adm. James Stavridis, formerly NATO’s top military commander and now dean of the Fletcher School of Law and Diplomacy, said in an email last week that “this is yet another example of a highly assertive and aggressive regime seemingly reaching backwards for the tools of the Cold War, albeit with a high degree of technical improvement.”
The war on the ‘hoverboard’
It has become the year’s must-have gadget, but is there room on the road for the self-balancing scooter?
By David K Gibson
Oct 22 2015
Two things made the news last week.
The first is that the US state of California rescinded a ban on motorised skateboards, which had been in place since 1977. As of the first day of 2016, electric versions will be street legal — though only in bike lanes. The second is that the Metropolitan Police in the UK — via Twitter — reminded citizens that those weird self-balancing things, often optimistically referred to as ‘hoverboards’, are unregistered motor vehicles and therefore illegal to ride on roads in England and Wales. They are likewise forbidden on sidewalks, thanks to the Highway Act of 1835. Yes, 1835.
Jeff Bezos, given an advance look at the Segway personal transporter back in 2001, was quoted as saying “…Cities will be built around this device.” Later accounts offer a bit more context, suggesting that the Amazon.com founder recognized that for the Segway to become popular, cities would have to be built around it, astutely noting that most of our cities have already been built.
That the Segway is the domain of mall security rather than our best technological selves boils down to the fact that there’s no place for it to go. It’s too fast for sidewalks, too slow for roadways, too wide for bike lanes. But since its unveiling, dozens of inventors have taken inspiration (and perhaps proprietary technology) from the Segway to create vehicles that would take us from point A to point B, if only there were a legal roadway between them. In the US, for something as seemingly straightforward as an electric bicycle, each state has its own rules and regulations, and local and municipal laws often trump those state laws. That makes for a tough environment in which to innovate.
A multimodal transportation infrastructure is the holy grail of transportation planning these days, and the concept of “Complete Streets” rests upon it. But “multimodal” should balance public transport, automobiles, bicycles, and pedestrians, as well as the spaces in between — spaces filled by Neighborhood Electric Vehicles, light quadracycles, low-speed electric trikes, and weird self-balancing things. Any one of these spaces might be the perfect spot to invent the economical, safe, and not-at-all-dorky future of transportation. But we’re going to have to follow the example of California rather than the Metropolitan Police, and be a little more chill about what rides where.
The Segway, by the way, is not legal for operation on UK public paths or roads, thanks to the Highway Act of 1835. Yes, 1835.
Europe votes on the future of its internet tomorrow
Proposed legislation aims to protect net neutrality across the EU, but major loopholes threaten to undermine it
By Amar Toor
Oct 26 2015
Europe’s internet is about to go on trial, and activists are very worried about its future.
On Tuesday, European lawmakers will vote on a proposal that aims to protect net neutrality — the principle that internet service providers (ISPs) should treat all web traffic equally, without discriminating against some services in favor of others. The proposed legislation broadly prohibits ISPs from charging websites for faster connections, ostensibly keeping the web open and equal. But it also includes major loopholes that could undermine the very principle that it claims to protect.
If lawmakers approve the regulations tomorrow, they will become law across the EU, replacing existing net neutrality laws already implemented in the Netherlands and Slovenia. And if the proposal is passed without amendments, experts say it could have devastating impacts on innovation, market competition, and consumer privacy.
“Europe will have far weaker network neutrality rules than the US.”
“Europe will have far weaker network neutrality rules than the US, and the European internet would become less free and less open,” writes Barbara van Schewick, a Stanford law professor and director of the Stanford Center for Internet and Society. (The US Federal Communications Commission passed net neutrality regulations in February.)
Tuesday’s vote comes after two years of negotiations among the 28 member states in the European Union. The European Parliament approved rules that would strengthen net neutrality in April 2014, but making them law requires agreement among the Parliament, the European Commission, and the Council of the European Union, a body of 28 EU ministers. The Council took issue with some of the key provisions laid out in the Parliament’s initial plan, and proposed amendments that would allow for crucial exceptions. A compromise proposal was announced earlier this year, and its current form includes troubling provisions.
Among the most contentious is a clause that would allow so-called “specialized services” to pay to have their content delivered faster. The idea is to protect IP services that demand high-quality connections and use the same access network as the internet but are not open to everyone, such as self-driving cars or remote medical operations. But critics say the current parameters are too broad, effectively allowing ISPs to create the kind of two-tiered system that net neutrality is designed to prevent. There are concerns over a similar loophole in the FCC’s net neutrality regulations, and companies have reportedly sought to exploit it for fast lane access.
A call for clarity
“Large corporations that pay to be in the fast lane will have higher costs, so we the customers will be forced to pay higher prices for their products and services,” van Schewick wrote in a lengthy Medium post last week explaining the major loopholes and their implications. “Small businesses that are unable to pay will be shut out of the market.”
European telecoms have argued that tighter regulations on specialized services would hinder their business, and ultimately harm the consumer. “If restrictive rules on traffic management and specialized services are approved, we risk to worsen the user experience and to reduce the overall growth and job creation potential of Europe’s digital economy,” Steven Tas, head of the industry group ETNO, told Reuters earlier this year.
Another provision pertains to zero-rating, a practice whereby the use of certain services or applications doesn’t count against a consumer’s monthly data allowance. The proposal both allows for zero-rating — which allows ISPs to favor one service over another — and leaves no room for member states to regulate it. (Netflix came under fire for a zero-rating scheme in Australia, as did Facebook’s Internet.org initiative in India.)
[Note: This item comes from friend Judi Clark. DLH]
Oklahoma Earthquakes Are a National Security Threat
North America’s biggest commercial oil storage hub is already on guard against terrorism, but quakes could prove the bigger risk.
By Matthew Philips
Oct 23 2015
In the months after Sept. 11, 2001, as U.S. security officials assessed the top targets for potential terrorist attacks, the small town of Cushing, Okla., received special attention. Even though it is home to fewer than 10,000 people, Cushing is the largest commercial oil storage hub in North America, second only in size to the U.S. government’s Strategic Petroleum Reserve. The small town’s giant tanks, some big enough to fit a Boeing 747 jet inside, were filled with around 10 million barrels of crude at the time, an obvious target for someone looking to disrupt America’s economy and energy supply.
The FBI, state and local law enforcement and emergency officials, and the energy companies that own the tanks formed a group called the Safety Alliance of Cushing. Soon, guards took up posts along the perimeter of storage facilities and newly installed cameras kept constant surveillance. References to the giant tanks and pipelines were removed from the Cushing Chamber of Commerce website. In 2004, the Safety Alliance simulated a series of emergencies: an explosion, a fire, a hostage situation.
After the shale boom added millions of additional barrels to Cushing, its tanks swelled to a peak hoard of more than 60 million barrels this spring. That’s about as much petroleum as the U.S. uses in three days, and it’s more than six times the quantity that triggered security concerns after Sept. 11. The Safety Alliance has remained vigilant, even staging tornado simulations after a few close calls.
Now the massive oil stockpile faces an emerging threat: earthquakes. In the past month, a flurry of quakes has hit within a few miles of Cushing, rattling the town and its massive tanks. According to the Oklahoma Geological Survey, more than a dozen quakes have registered 3.0 or higher on the Richter scale within a few miles of Cushing since mid-September. The biggest, registering 4.5, hit about three miles away on Oct. 10.
This is all part of the disturbing rise in earthquakes in Oklahoma, which has corresponded to increased fracking activity and oil production in the state. Since 2008, Oklahoma has gone from averaging fewer than two earthquakes per year that measure at least 3.0 in magnitude to surpassing California as the most seismically active state in the continental U.S. This year, Oklahoma is on pace to endure close to 1,000 earthquakes. Scientists at the National Earthquake Information Center in Colorado recently published a paper (PDF) raising concerns that the welter of moderate-sized earthquakes around Cushing could increase the risk of larger quakes in the future.
Seismologists believe the quakes are the result of wastewater injection wells used by the fracking industry. Horizontal oil wells in Oklahoma can produce as many as nine or 10 barrels of salty, toxin-laced water for every barrel of oil. Much of that fluid is injected back underground into wastewater disposal wells. It is this water, injected near faults, that many seismologists—including those at the U.S. Geological Survey—say has caused the spike in earthquakes.
The role that fracking plays in the rise of earthquakes has been hugely controversial in Oklahoma, where one in five jobs is tied to the oil and gas industry. This year, as Bloomberg reported, seismologists at the Oklahoma Geological Survey were pressured by oil companies not to make a link between the earthquakes and fracking-related wastewater injection wells. Under the weight of mounting scientific evidence, Republican Governor Mary Fallin’s administration in April finally acknowledged the role fracking played in earthquake activity.
In June, the Oklahoma Supreme Court said that a woman injured in an earthquake could sue an Oklahoma oil company for damages. That company, Tulsa-based New Dominion, is one of the pioneers of a new breed of high-volume wastewater injection wells that can suck down millions of barrels of water and bury it deep underground. In April, Bloomberg Businessweek profiled David Chernicky, its charismatic founder and chairman.
[Note: This item comes from friend Judi Clark. DLH]
America’s Top Fears 2015
By Sheri Ledbetter
Oct 13 2015
The Chapman University Survey of American Fears, Wave 2 (2015) provides an unprecedented look into the fears of average Americans. In April of 2015, a random sample of 1,541 adults from across the United States were asked to rate their level of fear on eighty-eight items spanning a huge variety of topics, including crime, the government, disasters, personal anxieties, and technology.
Domains of Fear
Fear Domain: Types of Questions Included
Crime: Murder, rape, theft, burglary, fraud, identity theft
Daily Life: Romantic rejection, ridicule, talking to strangers
Environment: Global warming, overpopulation, pollution
Government: Government corruption, Obamacare, drones, gun control, immigration issues
Judgment of Others: Appearance, weight, age, race
Man-Made Disasters: Bio-warfare, terrorism, nuclear attacks
Natural Disasters: Earthquakes, droughts, floods, hurricanes
Personal Anxieties: Tight spaces, public speaking, clowns, vaccines
Personal Future: Dying, illness, running out of money, unemployment
Technology: Artificial intelligence, robots, cyber-terrorism
Top Fear Domains, 2015
Each fear question asks Americans to rate their level of fear on a scale ranging from 1 (not afraid) to 4 (very afraid). The average score for each domain of fear provides insight into what types of fear are of greatest concern to Americans in 2015.
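The domain rankings described here are simply per-domain averages of the 1-to-4 item scores. A minimal sketch of that aggregation, using made-up response data rather than Chapman’s actual survey results:

```python
from statistics import mean

# Hypothetical responses: fear item -> list of 1-4 ratings.
responses = {
    "terrorism": [4, 3, 4, 2],
    "cyber-terrorism": [3, 3, 2, 2],
    "public speaking": [2, 1, 3, 2],
}

# Items grouped into domains, as in the survey's table.
domains = {
    "Man-Made Disasters": ["terrorism"],
    "Technology": ["cyber-terrorism"],
    "Personal Anxieties": ["public speaking"],
}

def domain_scores(responses, domains):
    """Average every rating for the items in each domain."""
    return {
        name: mean(score for item in items for score in responses[item])
        for name, items in domains.items()
    }

ranked = sorted(domain_scores(responses, domains).items(),
                key=lambda kv: kv[1], reverse=True)
print(ranked[0])  # ('Man-Made Disasters', 3.25)
```

Sorting the per-domain means in descending order yields the ranked list the survey reports.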
On average, Americans expressed the highest levels of fear about man-made disasters, such as terrorist attacks, followed by fears about technology (including corporate and government tracking of personal data) and fears about the government (such as government corruption and ObamaCare). The complete, ranked list of Domains of Fear follows:
DecodeDC: Episode 111: Conversation in the Digital Age…nvm, tldr.
How does our constant need to stay connected affect our ability to, well, connect?
By Dick Meyer
Oct 22 2015
It’s a bizarre question at first: Is our capacity for meaningful, soul-nourishing conversation something that can go away? Sherry Turkle, professor of psychology at MIT, and author of “Reclaiming Conversation: The Power of Talk in the Digital Age”, says yes, emphatically.
On this episode of DecodeDC, Dick Meyer has a long conversation with Turkle about conversation. Turkle is our foremost scholar of how new technology affects old emotions, behaviors and ways of bonding. Spoiler alert: We’re all at risk of becoming device-addicted, never-present techno-dweebs if we don’t wise up fast.
Audio: 18:15 min