Rachel Maddow Show: Ferguson failures contrast with St. Louis shooting response

Rachel Maddow Show: Ferguson failures contrast with St. Louis shooting response
Aug 21 2014

Rachel Maddow contrasts the response by St. Louis officials to a deadly police shooting with the bungling of the Michael Brown case in Ferguson, and talks with Kenneth Murdock, a WGNU-AM radio host, about the grievances of the people of Ferguson.

Video: 19:49 min

False facts and the conservative distortion machine: It’s much more than just Fox News

[Note:  This item comes from friend John McMullen.  DLH]

False facts and the conservative distortion machine: It’s much more than just Fox News
Social scientists use “knowledge distortion” index to test partisanship and reality. Guess who is wrong most often?
Aug 18 2014

Citizens are misinformed — often badly so. It’s not just that they lack good information — which would merely make them uninformed — they have plenty of bad information that leads them to believe untrue things. Or more likely the other way around: They believe untrue things, and that leads them to collect — even invent — bad information to flesh out what they already believe.

This was vividly illustrated by a 1991 study that found that the more people watched TV during the first Gulf War, the less they knew about fundamental issues and facts, even as they were more likely to support the war. Wanting to believe that the U.S. was involved in a noble cause, for example, only 13 percent knew that when Iraq first threatened to invade Kuwait, the U.S. said it would take no action, while 65 percent falsely “knew” that the U.S. said it would support Kuwait militarily.

But the problem is hardly limited to this one example, or to issues of war and peace more generally. Misinformation in public life isn’t the exception, it’s the rule, and researchers have been grappling with that fact, and its implications, for some time now. A new study published in Social Science Quarterly employs a “knowledge distortion index” and looks at two competing explanations for why this is so — one more top-down, the other more bottom-up — using three Washington state initiatives from the 2006 general election cycle to examine the dynamics of what is going on in this particular sort of political environment.

The study, “How Voters Become Misinformed: An Investigation of the Emergence and Consequences of False Factual Beliefs,” found that “voters’ values and partisanship had the strongest associations with distorted beliefs, which then influenced voting choices. Self-reported levels of exposure to media and campaign messages played a surprisingly limited role,” despite the presence of significantly mistaken “facts,” which were used to help construct the knowledge distortion index.
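The article does not reproduce the study's formula, but the general idea of a directional "knowledge distortion" index can be sketched as follows. This is a purely illustrative construction (the function name, coding scheme, and example data are assumptions, not the paper's actual method): each wrong answer counts toward the index only insofar as the error favors the respondent's own side of the issue.

```python
# Illustrative sketch (not the study's actual method): a signed
# "knowledge distortion" index over a battery of factual questions.
# Each question has a correct answer, and its wrong answer is coded
# by which side of the issue that error favors. Errors aligned with
# the respondent's own position add to the index; contrary errors
# subtract from it.

def distortion_index(responses, answer_key):
    """responses: list of (answer_given, respondent_side) pairs.
    answer_key: list of (correct_answer, side_a_wrong_answer_favors)."""
    score = 0
    for (given, side), (correct, favors) in zip(responses, answer_key):
        if given != correct:  # factual error on this item
            score += 1 if side == favors else -1
    return score

# Hypothetical three-question battery for one "pro" respondent:
key = [(True, "pro"), (False, "anti"), (True, "pro")]
answers = [(False, "pro"), (False, "pro"), (True, "pro")]
print(distortion_index(answers, key))  # one self-serving error -> 1
```

A respondent who answered every item correctly would score zero, as would one whose errors cut equally against both sides; only systematically self-serving errors push the index away from zero, which is the pattern the study associates with values and partisanship.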

“Two of the competing theories on how people analyze political issues and develop factual beliefs are heuristics and cultural cognition,” the study’s lead author, Justin Reedy, told Salon. “Both of these theories recognize that citizens can develop distorted factual beliefs because of their political views, but they disagree about how those distortions might happen. Heuristics researchers generally think that citizens have limited attention for politics and try to process information quickly and efficiently.”


America in Decay

[Note:  This item comes from friend John McMullen.  DLH]

America in Decay
By Francis Fukuyama
September/October 2014

The creation of the U.S. Forest Service at the turn of the twentieth century was the premier example of American state building during the Progressive Era. Prior to the passage of the Pendleton Act in 1883, public offices in the United States had been allocated by political parties on the basis of patronage. The Forest Service, in contrast, was the prototype of a new model of merit-based bureaucracy. It was staffed with university-educated agronomists and foresters chosen on the basis of competence and technical expertise, and its defining struggle was the successful effort by its initial leader, Gifford Pinchot, to secure bureaucratic autonomy and escape routine interference by Congress. At the time, the idea that forestry professionals, rather than politicians, should manage public lands and handle the department’s staffing was revolutionary, but it was vindicated by the service’s impressive performance. Several major academic studies have treated its early decades as a classic case of successful public administration.

Today, however, many regard the Forest Service as a highly dysfunctional bureaucracy performing an outmoded mission with the wrong tools. It is still staffed by professional foresters, many highly dedicated to the agency’s mission, but it has lost a great deal of the autonomy it won under Pinchot. It operates under multiple and often contradictory mandates from Congress and the courts and costs taxpayers a substantial amount of money while achieving questionable aims. The service’s internal decision-making system is often gridlocked, and the high degree of staff morale and cohesion that Pinchot worked so hard to foster has been lost. These days, books are written arguing that the Forest Service ought to be abolished altogether. If the Forest Service’s creation exemplified the development of the modern American state, its decline exemplifies that state’s decay.

Civil service reform in the late nineteenth century was promoted by academics and activists such as Francis Lieber, Woodrow Wilson, and Frank Goodnow, who believed in the ability of modern natural science to solve human problems. Wilson, like his contemporary Max Weber, distinguished between politics and administration. Politics, he argued, was a domain of final ends, subject to democratic contestation, but administration was a realm of implementation, which could be studied empirically and subjected to scientific analysis.

The belief that public administration could be turned into a science now seems naive and misplaced. But back then, even in advanced countries, governments were run largely by political hacks or corrupt municipal bosses, so it was perfectly reasonable to demand that public officials be selected on the basis of education and merit rather than cronyism. The problem with scientific management is that even the most qualified scientists of the day occasionally get things wrong, and sometimes in a big way. And unfortunately, this is what happened to the Forest Service with regard to what ended up becoming one of its crucial missions, the fighting of forest fires.

Pinchot had created a high-quality agency devoted to one basic goal: managing the sustainable exploitation of forest resources. The Great Idaho Fire of 1910, however, burned some three million acres and killed at least 85 people, and the subsequent political outcry led the Forest Service to focus increasingly not just on timber harvesting but also on wildfire suppression. Yet the early proponents of scientific forestry didn’t properly understand the role of fires in woodland ecology. Forest fires are a natural occurrence and serve an important function in maintaining the health of western forests. Shade-intolerant trees, such as ponderosa pines, lodgepole pines, and giant sequoias, require periodic fires to clear areas in which they can regenerate, and once fires were suppressed, these trees were invaded by species such as the Douglas fir. (Lodgepole pines actually require fires to propagate their seeds.) Over the years, many American forests developed high tree densities and huge buildups of dry understory, so that when fires did occur, they became much larger and more destructive.

After catastrophes such as the huge Yellowstone fires in 1988, which ended up burning nearly 800,000 acres in the park and took several months to control, the public began to take notice. Ecologists began criticizing the very objective of fire prevention, and in the mid-1990s, the Forest Service reversed course and officially adopted a “let burn” approach. But years of misguided policies could not simply be erased, since so many forests had become gigantic tinderboxes.


Ferguson’s citizen journalists revealed the value of an undeniable video

Ferguson’s citizen journalists revealed the value of an undeniable video
Until the police stop treating communities as war zones and people as enemy combatants, keep your phone handy
By Dan Gillmor
Aug 16 2014

In Ferguson, Missouri this week, the public has turned the notion of “see something, say something” back on the state, via a digital tool of enormous power: online pictures and video. Their efforts – which began days before reporters descended when Twitter user @TheePharaoh posted pictures immediately after a police officer killed an unarmed black teenager, Michael Brown – have helped bring international attention to both Brown’s death and law enforcement’s disproportionate response to the ensuing protests.

Antonio French, an alderman in nearby St Louis, spent days posting to Twitter pictures and a series of videos of the demonstrations and police actions that he captured on his mobile phone – and was reportedly arrested and then released on Wednesday evening. He is a citizen journalist of the best kind: a credible witness who has helped inform the wider public about a critical matter. Can anyone plausibly doubt that he and the two professional journalists who were briefly taken into custody after police demanded they stop recording were targeted because they were documenting law enforcement actions?

Ferguson isn’t the first example of this kind of citizen journalism, which has been going on for years in any number of other places including Iran, Egypt, Occupy Wall Street and Syria. But the videos, blog posts, tweets, and photos from French and others on the ground have complemented the work of the traditional journalists on the scene – and have reminded us of what is becoming a civic duty in today’s America.

It’s a sad comment on the state of law enforcement, but I now encourage people who see the police doing something that seems out of the ordinary to document it with pictures or video and save it (if not post it online). I say that reluctantly, because law enforcement is not, per se, our enemy: “To protect and serve” is a deeply honorable motto, and communities are vastly better off where it is followed in good faith. But law enforcement today too often violates the civil liberties of those they are sworn to protect, and the increasing militarization of American law enforcement (an offshoot of the Wars on (Some) Drugs and Terror) is poisoning the trust of many citizens. (For others, particularly in minority communities who have borne the brunt of the “broken windows” model, that trust died long ago.)

Video and pictures are an equalizer: they’re not the only ones, and most of the power remains with the state, but they can be essential tools to help restore some balance in a system that, in recent years, has tilted in favor of those who interpret “protect and serve” as license to act with impunity. Among other uses, documentation and dissemination are helping professional and citizen journalists alike bring more clarity to events like those in Ferguson, via “crowd-powered” coverage.


NSA and GCHQ agents ‘leak Tor bugs’, alleges developer

NSA and GCHQ agents ‘leak Tor bugs’, alleges developer
British and American intelligence agents attempting to hack the “dark web” are being deliberately undermined by colleagues, it has been alleged.
By Leo Kelion
Aug 22 2014

Spies from both countries have been working on finding flaws in Tor, a popular way of anonymously accessing “hidden” sites.

But the team behind Tor says other spies are tipping them off, allowing them to quickly fix any vulnerabilities.

The agencies declined to comment.

The allegations were made in an interview given to the BBC by Andrew Lewman, who is responsible for all the Tor Project’s operations.

He said leaks had come from both the UK Government Communications Headquarters (GCHQ) and the US National Security Agency (NSA).

By fixing these flaws, the project can protect users’ anonymity, he said.

“There are plenty of people in both organisations who can anonymously leak data to us to say – maybe you should look here, maybe you should look at this to fix this,” he said. “And they have.”

Mr Lewman is part of a team of software engineers responsible for the Tor Browser – software designed to prevent users’ internet activity from being traced. The programs involved also offer access to otherwise hard-to-reach websites, some of which are used for illegal purposes.

The dark web, as it is known, has been used by paedophiles to share child abuse imagery, while online drug marketplaces are also hosted on the hidden sites.

Mr Lewman said that his organisation received tips from security agency sources on “probably [a] monthly” basis about bugs and design issues that potentially could compromise the service.

However, he acknowledged that because of the way the Tor Project received such information, he could not prove who had sent it.

“It’s a hunch,” he said. “Obviously we are not going to ask for any details.

“You have to think about the type of people who would be able to do this and have the expertise and time to read Tor source code from scratch for hours, for weeks, for months, and find and elucidate these super-subtle bugs or other things that they probably don’t get to see in most commercial software.

“And the fact that we take a completely anonymous bug report allows them to report to us safely.”

He added that he had been told by William Binney, a former NSA official turned whistleblower, that one reason NSA workers might have leaked such information was because many were “upset that they are spying on Americans”.

In response, a spokesman from the NSA public affairs office said: “We have nothing for you on this one.”


The National Science Foundation thinks scientists need their own clouds

The National Science Foundation thinks scientists need their own clouds
By Derrick Harris
Aug 22 2014
<http://gigaom.com/2014/08/22/the-national-science-foundation-thinks-scientists-need-their-own-clouds/>

The National Science Foundation is giving a combined $20 million to two projects that are building cloud computing testbeds for scientists. They’ll feature a wide variety of processor, storage and networking options so researchers can test their workloads against new architectures.

The National Science Foundation is investing $20 million to help launch two cloud computing testbeds that will allow scientists to experiment with new types of computing architectures. The agency is giving $10 million apiece to the projects, called Chameleon and CloudLab, in the hopes they will help scientists figure out if different types of hardware, processors or distributed designs are better suited to tackle particular computing challenges.

Chameleon will comprise 650 computing nodes and 5 petabytes of storage, and will allow researchers access to a broad range of options. According to a press release, these will include “low-power processors, general processing units (GPUs) and field-programmable gate arrays (FPGAs), as well as a variety of network interconnects and storage devices. Researchers can mix-and-match hardware, software and networking components and test their performance.”

CloudLab aims to accomplish a similar result, albeit with a very different architecture — its 15,000 processing cores and 1 petabyte of storage will be spread across three university data centers. According to the release, “Each site will have unique hardware, architecture and storage features, and will connect to the others via 100 gigabit-per-second connections on Internet2’s advanced platform, supporting OpenFlow (an open standard that enables researchers to run experimental protocols in campus networks) and other software-defined networking technologies.”

The testbeds seem fairly wise within the realm of science, where certain applications and areas of research almost certainly can benefit from on-demand access to infrastructure that won’t make its way into public clouds anytime soon. Many scientists have already benefited from the sheer scale of resources available on public cloud platforms such as Amazon Web Services, Microsoft Azure and Google Compute Engine, but limited options and configurability also limit the types of jobs that can be run. Security and performance concerns could likewise keep research that involves sensitive data or requires guaranteed fast network connections off public clouds.

However, like any sort of private or specialized cloud effort before it, the NSF’s NSFCloud initiative (under which these two projects fall) will likely have to figure out a way to match the user experience that public clouds provide. A clunky experience might suffice for testing out architectures that will eventually be deployed locally on physical gear, but if the goal is to host production jobs and achieve real results on NSF-funded infrastructure, it might be difficult to resist the (limited, although always expanding) features and ease of use that commercial clouds provide.

Comcast donations help company secure support of Time Warner Cable merger

Comcast donations help company secure support of Time Warner Cable merger
Mayors, governors, charities praise Comcast’s giving ways in letters to FCC.
By Jon Brodkin
Aug 22 2014

Comcast’s proposed $45.2-billion acquisition of Time Warner Cable has been criticized by angry customers, consumer advocacy groups, and even some members of Congress.

But Comcast has plenty of support, too, much of it from politicians and organizations that benefit from its political and charitable donations. With the deadline to submit initial comments on the merger to the Federal Communications Commission set to expire Monday, a number of elected officials and charities have urged the FCC to think favorably of Comcast during its merger review.

Charities supporting the acquisition include the Greater Washington Urban League, the Urban League of Broward County in Florida, the Boys and Girls Club of Rockford, Illinois, and the United Way of Tucson in Arizona. “Comcast has dedicated itself to advancing organizations like ours through financial support and partnerships,” the Greater Washington Urban League wrote.

Comcast fans also come from political organizations. The Democratic Governors Association asked the FCC “to consider Comcast’s impressive body of work and all that they do in helping strengthen the middle class and investing in our nation’s infrastructure.”

Comcast gave $225,000 to the Democratic Governors Association this year, according to the Center for Responsive Politics. The group includes the governors of Vermont, New Hampshire, California, Montana, New York, Colorado, Washington, Delaware, Maryland, and Illinois. Comcast is a prolific donor, giving money even to politicians and organizations who criticize the company.

“On behalf of the Democratic Governors Association (DGA), I am writing to emphasize the crucial role Comcast Corporation has played in business and communities all across America,” Executive Director Colm O’Comartun wrote to the FCC. “Comcast’s record of infrastructure innovation helps residents and businesses meet the constantly evolving challenges of the 21st century. The company has invested billions in technology improvements over the last 20 years. These improvements benefit all of their customers and improve the playing field for other citizens through the American spirit of competition for service excellence. In the years ahead, Comcast will work to strengthen the mission of supporting an economy built to last in every state.”

Similarly, an FCC filing from the Pennsylvania State Mayors’ Association mentioned only Comcast’s positive attributes without acknowledging the horrid customer service experiences many Comcast subscribers complain about.

“As the mayor of Whitehall Borough, Pennsylvania, and the president of the Pennsylvania State Mayors’ Association, I have had the opportunity to observe and interact with Comcast,” James Nowalk wrote to the FCC yesterday under the Mayors’ Association letterhead. “In my opinion, Comcast has been an exceptional corporate sponsor which has given substantial support to my municipality and mayoral association.”