Marcy Wheeler commits journalism

[Note:  This item comes from friend David Rosenthal.  DLH]

From: “David S. H. Rosenthal” <dshr@abitare.org>
Subject: Marcy Wheeler commits journalism
Date: July 29, 2014 at 18:12:17 EDT
To: dewayne@warpspeed.com

About the NSA “reform” being debated in Congress:

<http://www.emptywheel.net/2014/07/29/leahy-freedom-act-permits-fbis-continued-uncounted-use-of-back-door-searches/>
<http://www.emptywheel.net/2014/07/29/leahy-usa-freedoms-bulky-corporate-persons/>
<http://www.emptywheel.net/2014/07/29/a-good-idea-that-may-backfire-fiscr-fast-track/>

You don’t need to understand all the details to know that this kind
of analysis is what the press should be doing.

David. 

ISPs tell government that congestion is “not a problem,” impose data caps anyway

Shocking government research also finds Internet users don’t want data caps.
By Jon Brodkin
Jul 29 2014
<http://arstechnica.com/business/2014/07/isps-tell-government-that-congestion-is-not-a-problem-impose-data-caps-anyway/>

After consulting focus groups of Internet customers, government researchers have come to a conclusion that should surprise no one: people don’t want data caps on home Internet service.

But customers are getting caps anyway, even though ISPs admit that congestion isn’t a problem. The US Government Accountability Office (GAO) today released preliminary findings of research involving surveys of cellular carriers, home Internet providers, and customers.

The majority of top wireline ISPs are at least experimenting with data caps. But while cellular carriers say they impose usage-based pricing (UBP) to manage congestion on wireless networks, that’s not the case with cable, fiber, and DSL. “Some wireless ISPs told us they use UBP to manage congestion,” the GAO wrote. On the other hand, “wireline ISPs said that congestion is not currently a problem.”

Why set up data limits and charge extra when users go over them, then? “UBP can generate more revenues for ISPs to help fund network capacity upgrades as data use grows,” the GAO wrote.
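To make the revenue logic concrete, here is a rough sketch of how a capped plan with overage fees works. All of the numbers (a 300 GB cap, $50 base rate, $10 per extra 50 GB block) are made-up illustrations, not any real ISP's pricing:

```python
import math

def monthly_bill(usage_gb, cap_gb=300, base_rate=50.0,
                 overage_per_50gb=10.0):
    """Return one month's bill under a hypothetical capped plan.

    A flat rate covers usage up to the cap; anything beyond it
    is billed per started block of 50 GB.
    """
    if usage_gb <= cap_gb:
        return base_rate
    extra_blocks = math.ceil((usage_gb - cap_gb) / 50)
    return base_rate + extra_blocks * overage_per_50gb

print(monthly_bill(250))  # under the cap: 50.0
print(monthly_bill(420))  # 120 GB over, 3 blocks: 80.0
```

The point the GAO is making is visible in the second call: the overage charge kicks in regardless of whether the network was ever congested.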

The GAO said it interviewed “some experts” who think usage-based pricing “may be unnecessary because the marginal costs of data delivery are very low, [and] heavier users impose limited additional costs to ISPs.” Limiting heavy users could even “limit innovation and development of data-heavy applications,” the GAO wrote.

Customers told the GAO they don’t want data caps, at least on home Internet.

Eight focus groups of nine or 10 people each were polled about data caps on both cellular service and wireline home Internet. While they were generally accepting of limits on cellular data, most did not want any limits on home Internet usage, in part because they manage limited wireless plans by connecting mobile devices to their home Wi-Fi. The GAO wrote:

In only two groups did any participants report experience with wireline UBP [usage-based pricing]. However, in all eight groups, participants expressed strong negative reactions to UBP, including concerns about:

• The importance of the Internet in their lives and the potential effects of data allowances.
• Having to worry about data usage at home, where they are used to having unlimited access.
• Concerns that ISPs would use UBP as a way of increasing the amount they charge for Internet service.

[snip]

Former NSA chief makes up to $1 million a month selling cybersecurity services

Gen. Keith Alexander stepped down from the NSA after the Snowden leaks; now he’s back with a new security firm related to his government work
By Carl Franzen

Jul 29 2014

General Keith Alexander was in charge of the National Security Agency when all hell broke loose and former security contractor Edward Snowden leaked documents showing the organization was spying far beyond what most people were aware of (or comfortable with). But he’s not letting that episode stop him from launching what looks to be an exceptionally lucrative private career selling…you guessed it, cybersecurity software.

As Bloomberg first reported last week, Alexander has spent the last few months since his retirement as NSA head in March giving paid talks on cybersecurity to banks and other large financial institutions. Bloomberg also noted that Alexander has charged up to $1 million a month for his services, and even co-founded his own private security firm, IronNet Cybersecurity, Inc. In a more recent interview with Foreign Policy, Alexander admitted that his firm has developed “unique” technology for detecting and fighting so-called “advanced persistent threats” — cyberattacks that can extend for months or years at a time without being noticed, and are directed against specific targets like big companies or governments.

Beyond the somewhat uncomfortable optics created by America’s leading spymaster turning his skill set to the private security sector, there are other problems with Alexander’s new job. As Foreign Policy points out, the former NSA chief plans to file patents on his firm’s technology, patents that are “directly related to the job he had in government.” In other words, Alexander stands to profit directly off his taxpayer-funded experience, and may do so with a competitive advantage over rival private firms. Alexander claimed the technology he would be patenting was distinct enough from his work as head of the NSA, but that excuse is not likely to assuage rival cybersecurity firms, nor those concerned with the revolving door between government and related private industries.

How much data can one smart home generate? About 1 GB a week.

By Stacey Higginbotham

Jul 29 2014
 
SUMMARY:
The internet of things is about data. So in this week’s podcast we talk to a Splunk executive who has connected his home and uses the data to inform his lifestyle and purchases.

In my connected home I’ve focused mostly on automation, but Stephen Sorkin, the chief strategy officer at Splunk, has decided to go a much nerdier route. He focused on data, specifically gathering data from his circuit breaker, his connected weather station and his pool. He sends that data to Splunk and has used it to reach some startling conclusions — among them that his home generates about 200 MB of data a day.
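As a quick back-of-the-envelope check on those figures (using decimal megabytes and gigabytes), 200 MB a day works out to roughly 1.4 GB a week, in the same ballpark as the headline’s “about 1 GB”:

```python
# Sanity check: ~200 MB of sensor data per day from one connected home.
mb_per_day = 200
gb_per_week = mb_per_day * 7 / 1000  # decimal units: 1 GB = 1000 MB
print(f"about {gb_per_week:.1f} GB per week")
```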

In this week’s podcast Sorkin discusses how he is using data to make decisions about when to water his lawn and when to replace old, inefficient appliances. It’s one thing to guess that a new washer or dryer will save you money, but Sorkin has the data to prove it. We discuss this and why users should be in control of their data. And Kevin Tofel was on vacation this week, but my colleague Kevin Fitchard joined us to talk about the Wink hub, the connected kitchen and new networks for the internet of things. Listen up.

Host: Stacey Higginbotham
Guests: Kevin Fitchard and Stephen Sorkin of Splunk

  • A few thoughts on Wink. It has Lutron and a nice interface
  • Stop trying to push the iOS v. Android worldview on the internet of things
  • Does the internet of things need cellular networks?
  • Why open data is not just useful, but essential
  • Tips to monitor your electric meter and pool
[snip]

Re: The Server Needs To Die To Save The Internet

[Note:  This comment comes from friend Steve Schear.  DLH]

From: Steven Schear <steven.schear@googlemail.com>
Subject: Re: [Dewayne-Net] The Server Needs To Die To Save The Internet
Date: July 28, 2014 at 21:46:59 EDT
To: dewayne@warpspeed.com

This seems an awful lot like Mojo Nation. Around 2002, I helped Jim McCoy patent the concept of a network in which users offering resources (e.g., disk space or bandwidth) are compensated in a network-centric currency. Unless Jim allowed the patent to lapse due to non-payment, it should still be in force. One of MN’s developers, Bram Cohen, went on to create BitTorrent.


The Server Needs To Die To Save The Internet
By Natasha Lomas
Jul 23 2014
<http://techcrunch.com/2014/07/23/maidsafe/>

New White House Report Highlights Economic Cost Of Delaying Action On Climate Change

By SAHIL KAPUR
Jul 29 2014
<http://talkingpointsmemo.com/dc/white-house-climate-change-report>

A new White House report released Tuesday warns that delaying environmental action would be costly and argues that swift action serves as “climate insurance” to mitigate the “most severe and irreversible potential consequences of climate change.”

The report, titled “The Cost of Delaying Action to Stem Climate Change,” comes in the wake of the Obama administration’s decision to bypass Congress and propose new rules on coal-fired power plants aimed at slashing carbon pollution by 30 percent by 2030.

Citing the Intergovernmental Panel On Climate Change, the report lists economic consequences of putting off action.

“Impacts include decreased agricultural production; coastal flooding, erosion, and submergence; increases in heat-related illness and other stresses due to extreme weather events; reduction in water availability and quality; displacement of people and increased risk of violent conflict; and species extinction and biodiversity loss,” it reads. “Although these impacts vary by region, and some impacts are not well-understood, evidence of these impacts has grown in recent years.”

The report claims that the rule on power plant emissions would “generate large positive net benefits, which EPA estimates to be in the range of $27-50 billion annually in 2020 and $49-84 billion in 2030” — including health benefits from cutting harmful emissions.

The president’s action is fiercely opposed by congressional Republicans, most of whom doubt or deny the scientific consensus that human activities are significantly exacerbating global climate change.

[snip]

From Concorde to the iPhone, state intervention drives technological innovation

History tells us that state involvement is the best route to prosperity. Our politicians need to think big and accept the risk of failure
By Paul Mason
Jul 27 2014
<http://www.theguardian.com/commentisfree/2014/jul/27/concorde-iphone-history-state-intervention-technological-innovation>

If you wanted to radically alter the economy, making a country such as Britain as dynamic as China or Brazil, what would the state have to do? Intervene, obviously, but how?

That has become a hard question to answer since the onset of free-market economics. Much of the old apparatus of state control has been dismantled. Plus, the political culture in which planners, engineers and technical innovators inhabited the same offices has been shattered.

If a modern-day civil servant wanted to place the proposal to build, say, Concorde on a minister’s desk, there wouldn’t even be an obvious ministry to go to. It was the UK’s Ministry of Supply that set up the supersonic airliner project: it commissioned a prototype aircraft, immediately, at its first meeting.

When state innovation projects worked, they did so because their owners were politicians, who were allowed to think big and who accepted the risk of failure. In the free-market model, the private sector is given the task of driving innovation. But though the individual results look spectacular – from info-tech, genetics and materials science to neuro-medicine – we have not experienced the same “lift-off” as in previous industrial revolutions, where all the innovations synergise, producing high dynamism and rising wealth for all.

Instead, in the developed world, amid rapid tech innovation, we sway between low growth and stagnation. If the information revolution creates the possibility of a “third capitalism” – as different from the industrial era as it was from the age of Sir Francis Drake – then it is, so far, a possibility unrealised.

And a growing number of economists believe it will remain so unless we rethink the role of the state. The Sussex University professor Mariana Mazzucato, whose calls for an “entrepreneurial state” were greeted with incomprehension four years ago, recently put together a conference attended by ministers, central bankers and serious investors. The buzzwords were: think big, and do “mission-oriented finance”.

Mazzucato points out that the state played a role in financing nearly every key technology in an iPhone, from GPS to the touch screen. She says that, even now, the lion’s share of funding for climate change technologies comes from state investment banks and public utilities, with just 6% coming from private capital. The problem is, the modern state sees this as accidental and residual. It avoids major projects, and their associated risks, seeing its role as mainly to act where the market “fails” – as with the near evaporation of venture capital funding for technology startups in the UK.

Mazzucato, in a paper with LSE professor Carlota Perez, points out the danger of leaving tech to the private sector. In an economy bloated with printed money and cheap credit, if capital can’t find real-world, high-growth, high-profit opportunities to invest in, it will pool into the finance system, creating one bubble after another.

Seen from this angle, the financial crisis looks less like the product of bad practices in the City, and more like a structural crisis. At all previous takeoff points, capital in the finance system flowed out into the real economy, where a paradigm had been established making it easy for businesspeople to invest in tried-and-tested models, with predictable and growing demand.

[snip]