2010 redistricting yields new breed of recalcitrant Republicans

By Will Femia
Sep 30 2013

Some amazing statistics about the districts of the 236 House GOPers from ’95-’96 shutdown vs. the 232 House GOPers today…

— Dave Wasserman (@Redistrict) September 30, 2013

In tonight’s discussion with Rachel about the role of gerrymandered Congressional districts in creating a political atmosphere in the House that encourages radicalism and recalcitrance over negotiation and accountability, David Wasserman, House editor for the Cook Political Report, shared some remarkable statistics about the current crop of House Republicans. On top of previous TRMS reporting on how redistricting allowed Republicans to win fewer votes but more House seats in the 2012 election, Wasserman presented the following:

Back in ’95 and ’96 when Republicans had 236 seats during that shutdown, there were 79 out of those 236 seats that were carried by Bill Clinton in 1992. That was many more than the 17 districts that Republicans represent that were won by Barack Obama in the 2012 election. So you’re talking about going from 79 districts where there was some incentive to compromise to 17. 

Republicans are living in a completely alternate universe from the rest of the country. Their districts are 75 percent white, compared to 63 percent for the national average and 50 percent for Democratic districts. Consider that only 37 Republicans in the House today out of 233 were around for the ’95, ’96 shutdown. And then, you also have the fact that 48 percent of all House Republicans, and this blows my mind, were elected after George W. Bush left office.

These people owe no allegiance to John Boehner. They ran against not only Democrats, but Republican leadership to get to Congress and they’re reflecting what the primary electorates, which decided their elections back home, wanted in the first place.

Wasserman’s statistics bear out what many have, sometimes jokingly, taken as common wisdom: whatever President Obama supports, reactionary Republicans automatically oppose.


The Emperor has no Problem: Is Wi-Fi Spectrum Really Congested?

By J. Pierre De Vries, et al
Sep 21 2013

“Wi-Fi congestion is a very real and growing problem.” So said then-FCC Chairman Julius Genachowski in a recent proceeding, and this sentiment is widely heard. However, we are aware of very few engineering studies that have a bearing on this matter, and their conclusions are equivocal. Beyond this, evidence for “congestion problems” in the 2.4 GHz ISM band is anecdotal at best.

While some users no doubt sometimes have service difficulties they ascribe to congestion, that is not sufficient to prove that there is a policy problem. In order to provide a basis for policy decisions, we offer an analysis of congestion metrics and results in the engineering literature. We conclude that there is little consensus on how to measure congestion, and that network metrics are difficult to correlate with user experience.

We therefore propose a list of user experience-oriented service impairment criteria that, if met, would demonstrate that congestion exists to a degree that justifies regulatory intervention: for more than one key scenario provided by an operator or technology, there is a significant increase in the percentage of users who cannot complete a valuable task on a persistent, ubiquitous basis, in spite of the use of state-of-the-art engineering and deployment best practices and users’ willingness to pay the market rate for the best available service level.
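As a toy illustration of how a criterion like this might be evaluated — the threshold, field names, and data shape below are our illustrative assumptions, not the paper’s actual methodology:

```python
def intervention_justified(scenarios, significant_rise=0.05):
    """Toy check of the impairment criterion sketched above: intervention is
    justified only if MORE THAN ONE key scenario shows a significant,
    persistent rise in the share of users unable to complete a valuable
    task, despite best-practice engineering. All names and the 5-point
    threshold are illustrative assumptions, not the paper's definitions."""
    impaired = [
        s for s in scenarios
        if s["task_failure_rate"] - s["baseline_failure_rate"] > significant_rise
        and s["persistent"]
        and s["best_practices_in_use"]
    ]
    # "More than one key scenario" must show the impairment.
    return len(impaired) > 1
```

The point of the sketch is the conjunction: a single scenario with occasional trouble, or trouble that vanishes with better deployment practices, would not meet the bar.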

Based on our assessment of public reports and experimental data, we conclude that there is currently no evidence for pervasive Wi-Fi congestion. We do not claim that the absence of evidence of congestion amounts to evidence for the absence of congestion. However, we question the argument that congestion occurring somewhere, sometimes is a justification for regulatory intervention.


A CEO who resisted NSA spying is out of prison. And he feels ‘vindicated’ by Snowden leaks

By Andrea Peterson
Sep 30 2013

Just one major telecommunications company refused to participate in a legally dubious NSA surveillance program in 2001. A few years later, its CEO was indicted by federal prosecutors. He was convicted, served four and a half years of his sentence and was released this month.

Prosecutors claimed that Qwest CEO Joseph Nacchio was guilty of insider trading and that his prosecution had nothing to do with his refusal to allow spying on his customers without the permission of the Foreign Intelligence Surveillance Court. But to this day, Nacchio insists that his prosecution was retaliation for refusing to break the law on the NSA’s behalf.

After his release from custody Sept. 20, Nacchio told the Wall Street Journal that he feels “vindicated” by the content of the leaks that show that the agency was collecting Americans’ phone records.

Nacchio was convicted of selling Qwest stock in early 2001, not long before the company hit financial troubles. However, he claimed in court documents that he was optimistic about the firm’s ability to win classified government contracts — something it had succeeded at in the past. And according to his timeline, in February 2001 — some six months before the Sept. 11 terrorist attacks — he was approached by the NSA and asked to spy on customers during a meeting he thought was about a different contract. He reportedly refused because his lawyers believed such an action would be illegal and the NSA wouldn’t go through the FISA Court. And then, he says, unrelated government contracts started to disappear.

His narrative matches the warrantless surveillance program reported by USA Today in 2006, which noted Qwest as the lone holdout from the program, hounded by the agency with hints that its refusal “might affect its ability to get future classified work with the government.” But Nacchio was prevented from bringing up any of this defense during his jury trial — the evidence needed to support it was deemed classified, and the judge in his case refused his requests to use it. And he still believes his prosecution was retaliation for refusing the NSA’s requests for bulk access to customers’ phone records. Some other observers share that opinion, and it seems consistent with evidence that has been made public, including some of the redacted court filings unsealed after his conviction.

The NSA declined to comment on Nacchio, referring inquiries to the Department of Justice. The Department of Justice did not respond to The Post’s request for comment.

Snowden leaked documents about NSA spying programs to the public and arguably broke the law in doing so. In contrast, Nacchio seems to have done what was in his power to limit an illegal government data collection program, and he went through what legal channels he could to make relevant information available for his defense — albeit unsuccessfully.


Our governing crisis, in one sentence

By Ezra Klein
Sep 30 2013

Greg Sargent boils our “current governing crisis” down to two sentences. I’m going to try to do it in one sentence:

The GOP has become an insurgent outlier in American politics — it is ideologically extreme; scornful of compromise; and dismissive of the legitimacy of its political opposition.

Okay, so that’s not my sentence. It’s Thomas Mann and Norm Ornstein’s sentence. But these past few weeks have acted like an advertisement for their book “It’s Even Worse Than It Looks: How the American Constitutional System Collided With the New Politics of Extremism.”

That pretty much does it. If I had more space, I might add Mann and Ornstein’s next sentence: “When one party moves this far from the mainstream, it makes it nearly impossible for the political system to deal constructively with the country’s challenges.”

And if I had even more space, I’d say that the Republican Party’s problem is not that all or even most of its members are so extreme, but that the “silent majority” that knows that governing-by-crisis is bad for the country and bad for the GOP is too scared to take on the party’s activist base.

But I don’t have more space, of course, and I’d never betray the purity of this post’s concept by adding more explanation on some technicality.

Uncle Sam Wants You to Help Us Design a Spectrum Monitoring Pilot Project

September 30, 2013

Meeting Americans’ increasing demand for broadband wireless technologies requires finding more spectrum. NTIA has been leading efforts to help meet President Obama’s goal of identifying 500 megahertz of spectrum for wireless broadband by 2020 while balancing the spectrum needs of federal agencies.

However, finding spectrum bands that can be shifted from their current applications to enable new broadband services is a difficult task. While clearing spectrum bands to make way for new wireless services has been a viable approach for many years, options for relocating incumbent operations are dwindling, getting more expensive, and taking longer to implement. Given this, NTIA has been working with the Federal Communications Commission, other federal agencies, and industry stakeholders to explore ways to share the spectrum without displacing existing systems in the same bands.

In a June 2013 executive memorandum on “Expanding America’s Leadership in Wireless Innovation,” President Obama noted that spectrum sharing can and should be used to enhance efficiency among all users and can expedite commercial access to additional spectrum bands where technically and economically feasible. The memorandum directs federal agencies to take a number of additional steps to accelerate shared access to spectrum and tasks NTIA to design and conduct a pilot program to monitor spectrum usage in real time in selected communities throughout the country.

NTIA encourages all those interested in providing input on the development of this new spectrum monitoring project to submit comments in response to our Notice of Inquiry (NOI). The NOI seeks comment from stakeholders on the measurement system’s design, features, deployment options, operational parameters, expected utility, potential benefits, and other issues. The deadline for submitting comments is this Thursday, October 3, 2013, and they can be sent via email to measurementNOI@ntia.doc.gov. Comments will continue to be accepted electronically in the event of a government shutdown.

This spectrum monitoring initiative would provide many potential benefits. By collecting better data on how much spectrum is being used in selected metro areas, NTIA can evaluate the potential for relocating federal systems or determine where spectrum sharing would be more viable. The new system would initially include a new network of fixed radiofrequency sensors installed at selected sites in up to 10 major metropolitan areas to collect data across particular bands of interest. The measurement equipment would automatically feed data to a centralized database for storing, retrieving, and analyzing spectrum usage and occupancy information. More comprehensive and longer-term data collections like those proposed in the new pilot program could provide better general information that could be analyzed along with other considerations to evaluate the potential for deploying additional federal and nonfederal systems on a shared basis.
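The data flow just described — fixed sensors in selected metro areas feeding a centralized database — might be sketched as follows. The record fields, sensor ID, and band value here are illustrative assumptions, not NTIA’s actual design or schema:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class SpectrumMeasurement:
    """One reading from a fixed RF sensor (all fields are illustrative)."""
    sensor_id: str    # which metro-area sensor produced the reading
    band_mhz: float   # center frequency of the monitored band of interest
    occupancy: float  # fraction of the window the band was observed in use
    timestamp: str    # ISO-8601 end of the measurement window

def to_upload_record(m: SpectrumMeasurement) -> str:
    """Serialize a measurement for transfer to the centralized database."""
    return json.dumps(asdict(m), sort_keys=True)

# Example: a hypothetical sensor reporting 42% occupancy in a 1755 MHz band.
reading = SpectrumMeasurement("denver-07", 1755.0, 0.42, "2013-09-30T12:00:00Z")
record = to_upload_record(reading)
```

The value of a shared record format like this is that a central store can compute occupancy statistics per band and per city, which is the kind of aggregate the NOI asks commenters to help define.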

Snowden says his “sole intention” was to prompt national security debate

Whistleblower advocate reads statement on Snowden’s behalf before EU committee.
By Cyrus Farivar
Sep 30 2013

Former National Security Agency contractor Edward Snowden spoke publicly for the first time in many weeks, sort of. The famed leaker didn’t speak for himself; rather, someone read a written statement on his behalf before a committee hearing at the European Parliament in Brussels on Monday.

“I thank the European Parliament and the LIBE [Civil Liberties, Justice, and Home Affairs] Committee for taking up the challenge of mass surveillance,” said Jesselyn Radack, who read the statement for Snowden.

Radack is a former ethics adviser to the United States Department of Justice; she now serves as director of national security and human rights at the Government Accountability Project, a nonprofit organization.

The American attorney and other known transparency advocates testified before the LIBE committee. One of the additional speakers was Thomas Drake, a former NSA employee and whistleblower (see some of the other related documents included in the day’s testimony).

“The surveillance of whole populations, rather than individuals, threatens to be the greatest human rights challenge of our time,” Snowden, via Radack, continued. “The success of economies in developed nations relies increasingly on their creative output, and if that success is to continue, we must remember that creativity is the product of curiosity, which in turn is the product of privacy.

“If we are to enjoy such debates in the future, we cannot rely upon individual sacrifice. We must create better channels for people of conscience to inform not only trusted agents of government but independent representatives of the public outside of government. When I began my work, it was with the sole intention of making possible the debate we see occurring here in this body and in many other bodies around the world.”

Snowden has been lying low in Russia since his temporary asylum was granted in August 2013. The Guardian reported earlier this month that “Western diplomats and Russian government sources say they have no idea where he is staying or whether he has the protection of the Russian state or its security services.”

However, it seems unlikely that Russian officials have no idea where he is.


NSA stores metadata of millions of web users for up to a year, secret files show

• Vast amounts of data kept in repository codenamed Marina
• Data retained regardless of whether person is NSA target
• Material used to build ‘pattern-of-life’ profiles of individuals
By James Ball
Sep 30 2013

The National Security Agency is storing the online metadata of millions of internet users for up to a year, regardless of whether or not they are persons of interest to the agency, top secret documents reveal.

Metadata provides a record of almost anything a user does online, from browsing history – such as map searches and websites visited – to account details, email activity, and even some account passwords. This can be used to build a detailed picture of an individual’s life.

The Obama administration has repeatedly stated that the NSA keeps only the content of messages and communications of people it is intentionally targeting – but internal documents reveal the agency retains vast amounts of metadata.

An introductory guide to digital network intelligence for NSA field agents, included in documents disclosed by former contractor Edward Snowden, describes the agency’s metadata repository, codenamed Marina. Any computer metadata picked up by NSA collection systems is routed to the Marina database, the guide explains. Phone metadata is sent to a separate system.

“The Marina metadata application tracks a user’s browser experience, gathers contact information/content and develops summaries of target,” the analysts’ guide explains. “This tool offers the ability to export the data in a variety of formats, as well as create various charts to assist in pattern-of-life development.”

The guide goes on to explain Marina’s unique capability: “Of the more distinguishing features, Marina has the ability to look back on the last 365 days’ worth of DNI metadata seen by the Sigint collection system, regardless whether or not it was tasked for collection.” [Emphasis in original.]

On Saturday, the New York Times reported that the NSA was using its metadata troves to build profiles of US citizens’ social connections, associations and in some cases location, augmenting the material the agency collects with additional information bought in from the commercial sector, which is not subject to the same legal restrictions as other data.

The ability to look back on a full year’s history for any individual whose data was collected – either deliberately or incidentally – offers the NSA the potential to find information on people who have later become targets. But it relies on storing the personal data of large numbers of internet users who are not, and never will be, of interest to the US intelligence community.

Marina aggregates NSA metadata from an array of sources, some targeted, others on a large scale. Programs such as Prism – which operates through legally-compelled “partnerships” with major internet companies – allow the NSA to obtain content and metadata on thousands of targets without individual warrants.


Ocean warming and acidification deliver double blow to coral reefs

Under business-as-usual conditions, corals start dissolving into the oceans.
By Jeremy Jacquot
Sep 30 2013

The dual threats of ocean acidification and anthropogenic warming have the potential to wreak havoc on marine life over the coming decades. Corals require acid-sensitive calcium carbonate for structure and heat-sensitive symbionts for sustenance, so they seem to have the most to lose from a warmer, more acidic ocean. Indeed, numerous studies have already indicated that calcifying organisms, including corals, would be among the worst affected.

Although many studies have looked at heat and acidification, few have addressed the possible synergistic effects of these processes on intact coral reefs. To that end, a team of Australian researchers exposed patches of coral reefs to varying seawater temperature and pH conditions associated with a range of CO2 emission scenarios. Their findings, though nuanced, do not bode well for the long-term well-being of coral reefs.

At its core, the study aimed to answer two questions with important ramifications for the future of coral reefs. First, how do current reef calcification rates compare to those of pre-industrial conditions? Second, will reefs respond differently to a possible future in which emission growth continues unabated versus one in which growth is moderately curbed?

Setting up the experiment

In order to better understand how corals—as members of a community rather than individuals—might respond to future conditions, the authors “built” replicate patches by piecing together species collected from the Great Barrier Reef. (The Reef doubled as the control, or reference, site for their experiment.) The coral reef patches were composed of a mixture of hard corals, macroalgae, vertebrates, and invertebrates, all of which were collected locally from a shallow depth. The underlying sediments and structure were made up of the skeletons of corals, calcareous algae, foraminifers (shelled amoeba-like protists), and mollusks.

Each patch was placed in a separate enclosed tank, and a continuous supply of warm/cold seawater and CO2 was piped in. The tank lids were clear to let sunlight in. A buoy at the reference site allowed the researchers to monitor daily and monthly variability in temperature and partial pressure of CO2. They then incorporated this natural variability into their experiment.

The four emission scenarios they picked matched those used by several Intergovernmental Panel on Climate Change modeling studies. The first two correspond to pre-industrial and present-day conditions, while the latter two correspond to the low and high ends of future emission scenarios. Pre-industrial conditions were simulated by lowering the present-day seawater temperature (~24.3 to 27.8°C, depending on monthly variability) by 1°C and raising the pH (~8.1) by 0.1 unit. For the lower end of the future scenarios, which assumes some level of emission reductions, the temperature was raised by 2°C and the pH lowered by 0.2 unit. For the higher end, which assumes unabated or “business-as-usual” emission growth, the temperature was raised by 4°C and the pH lowered by 0.4 unit. Three replicate patches were assembled for each emission scenario.


With data breaches on the rise, BitTorrent debuts ‘private, secure, and free’ server-less chat experiment

By Emil Protalinski
Sep 30 2013

BitTorrent today announced a new experiment simply called BitTorrent Chat. The company is inviting users to check out the private alpha over at labs.bittorrent.com.

You can apply now, but BitTorrent says just a small number will be invited in. How many exactly is not clear, but we wouldn’t be surprised if it was just a few hundred.

The company itself admits it isn’t sure where this latest experiment will go, but it does have a vision: serverless chat that is private, secure and free. Unsurprisingly, it will be using the BitTorrent protocol.

BitTorrent justifies its experiment in server-less messaging by citing a report by Symantec that notes more than 6 million people have been impacted by data breaches this year alone. This seems to be a US-specific number as the report states: “In 2012, the Identity Theft Resource Center (ITRC) documented 447 breaches in the United States, exposing 17,317,184 records. In the first half of 2013, there have so far been 255 incidents, exposing 6,207,297 records.”

As a result, here is BitTorrent’s pitch:

So over at Labs, we’re working on something that could solve for conversation security. BitTorrent Chat applies distributed technology to the idea of IM. Our goal is to ensure that your messages stay yours: private, secure, and free.

Unfortunately, details surrounding BitTorrent Chat are still very scarce. We presume it will start as a desktop offering and later roll out to mobile, although it could very well launch on both at once.

When we contacted the company to figure out whether this is exclusively a Web service or if it requires you to download a chat client first, we were told it’s the latter. We will update this article with more details, such as platform support, as we get them.


A Facebook Like Is Now Covered by the First Amendment

[Note:  This item comes from reader Monty Solomon.  DLH]


The Founders could not have anticipated Facebook. In another way, though, they totally anticipated Facebook.

SEP 19 2013
The Atlantic

In November of 2009, B.J. Roberts, the sheriff of Hampton, Virginia, ran for re-election. A group of workers in Roberts’s office, however, among them one Bobby Bland, weren’t enthused about the prospect of their boss’s continuation in his role. So they took to their Facebook accounts to protest the run: They Liked the campaign of Roberts’s opponent, Jim Adams. Despite the minuscule mutiny, however, Roberts won the election. He then chose not to retain Bland and the others as his employees. The dismissals, Roberts said at the time, were the result not only of budgeting concerns, but also of the workers’ hindrance of “the harmony and efficiency of the office.” The sheriff had not liked his workers’ Likes.

Bland and his colleagues took Roberts to court, arguing that, in the dismissals, Roberts had violated their First Amendment rights. In April of 2012, however, the U.S. District Court of Eastern Virginia dismissed the case on the grounds that a Like didn’t involve an “actual statement,” and therefore was “insufficient speech to merit constitutional protection.”

Yesterday, however, that decision was overturned. A federal appeals court ruled that a Facebook Like is, indeed, a form of expression that is covered by the First Amendment. Clicking a button is, per the decision, a protected form of speech.