Locast, a Free App Streaming Network TV, Would Love to Get Sued

Locast, a Free App Streaming Network TV, Would Love to Get Sued
Want to watch the Super Bowl and other network TV for free? A start-up called Locast will let you, and (so far) the big broadcasters aren’t trying to stop it.
By Edmund Lee
Jan 31 2019
https://www.nytimes.com/2019/01/31/business/locast-streaming-free-network-tv.html

On the roof of a luxury building at the edge of Central Park, 585 feet above the concrete, a lawyer named David Goodfriend has attached a modest four-foot antenna that is a threat to the entire TV-industrial complex. 

The device is there to soak up TV signals coursing through the air — content from NBC, ABC, Fox, PBS and CBS, including megahits like “This Is Us” and this Sunday’s broadcast of Super Bowl LIII. Once plucked from the ether, the content is piped through the internet and assembled into an app called Locast. It’s a streaming service, and it makes all of this network programming available to subscribers in ways that are more convenient than relying on a home antenna: It’s viewable on almost any device, at any time, in pristine quality that doesn’t cut in and out. It’s also completely free.

If this sounds familiar, you might be thinking of Aereo, the Barry Diller-backed start-up that in 2012 threatened to upend the media industry by capturing over-the-air TV signals and streaming the content to subscribers for a fee — while not paying broadcasters a dime. NBC, CBS, ABC and Fox banded together and sued, eventually convincing the Supreme Court that Aereo had violated copyright law. The clear implication for many: If you mess with the broadcasters, you’ll file for bankruptcy and cost your investors more than $100 million.

Mr. Goodfriend took a different lesson. A former media executive with stints at the Federal Communications Commission and in the Clinton administration, he wondered if an Aereo-like offering that was structured as a noncommercial entity would remain within the law. Last January, he started Locast in New York. The service now has about 60,000 users in Houston, Chicago, Boston, Philadelphia, Dallas and Denver as well as New York, and will soon add more in Washington, D.C.

Mr. Goodfriend, 50, said he hoped to cover the entire nation as quickly as possible. “I’m not stopping,” he said. “I can’t now.”

The comment is basically a dare to the networks to take legal action against him. By giving away TV, Mr. Goodfriend is undercutting the licensing fees that major broadcasters charge the cable and satellite companies — a sum that will exceed $10 billion this year, according to Kagan, a research group within S&P Global Market Intelligence. For cable customers, the traditional network channels typically add about $12 to a monthly bill. 

With consumers increasingly willing to piece together their own bespoke packages of content — paying a few bucks to Netflix here, a few to HBO there — anything that encourages people to cut their cable cords is a challenge to the cable TV empire. That calculus makes tiny Locast, whose modest website (“Help us free your TV!”) asks for donations starting at $5, perhaps the most audacious media experiment in years. 


‘Do you know you’re supposed to get television for free?’

With a shaved head and a short mustache, Mr. Goodfriend looks much younger than his age, and he speaks with the enthusiasm and the cadence of an earnest law student. 

“We really did our homework,” he said. “We are operating under parameters that are designed to be compliant within the law.”

The copyright code has an exemption for nonprofits. Mr. Goodfriend, who does not draw a salary, said he has collected $10,000 in donations so far, mostly in $5 increments. He took out a high-interest loan, at around 15 percent, to fund the operation, which to date has cost more than $700,000. 

Mr. Goodfriend is not a rich tech entrepreneur or a wealthy heir — just a lawyer who has made a decent living. Locast could still meet the fate of Aereo and be sued into financial oblivion by the networks. So why is he doing this?

The answer is partly principle, and partly intellectual mischief: With his public-private background, he has spotted an imbalance in the media ecosystem, he said, and decided to give the whole thing a shake.

“I ask people all the time, ‘Do you know you’re supposed to get television for free?’” Mr. Goodfriend said during an interview in Central Park, gesturing to a gaggle of visitors. “Most people under 50 don’t get it.”

Although his practice is in Washington, where he also teaches law at Georgetown and lectures at George Washington University, Mr. Goodfriend had come to New York to inspect the installation of the antenna, on the Trump International Hotel and Tower. 

(This is another area where Locast has to operate carefully: The organization must install signal equipment in every city where it operates, because all broadcast stations are regional and retransmissions can be made only to local residents. If you live in, say, Miami, you can’t get Locast until Mr. Goodfriend puts up an antenna there.)

[snip]

A World Without Clouds

A World Without Clouds
A state-of-the-art supercomputer simulation indicates that a feedback loop between global warming and cloud loss can push Earth’s climate past a disastrous tipping point in as little as a century.
By Natalie Wolchover
Feb 25 2019
https://www.quantamagazine.org/cloud-loss-could-add-8-degrees-to-global-warming-20190225/

On a 1987 voyage to the Antarctic, the paleoceanographer James Kennett and his crew dropped anchor in the Weddell Sea, drilled into the seabed, and extracted a vertical cylinder of sediment. In an inch-thick layer of plankton fossils and other detritus buried more than 500 feet deep, they found a disturbing clue about the planet’s past that could spell disaster for the future.

Lower in the sediment core, fossils abounded from 60 plankton species. But in that thin cross-section from about 56 million years ago, the number of species dropped to 17. And the planktons’ oxygen and carbon isotope compositions had dramatically changed. Kennett and his student Lowell Stott deduced from the anomalous isotopes that carbon dioxide had flooded the air, causing the ocean to rapidly acidify and heat up, in a process similar to what we are seeing today.

While those 17 kinds of plankton were sinking through the warming waters and settling on the Antarctic seabed, a tapir-like creature died in what is now Wyoming, depositing a tooth in a bright-red layer of sedimentary rock coursing through the badlands of the Bighorn Basin. In 1992, the finder of the tooth fossil, Phil Gingerich, and collaborators Jim Zachos and Paul Koch reported the same isotope anomalies in its enamel that Kennett and Stott had presented in their ocean findings a year earlier. The prehistoric mammal had also been breathing CO2-flooded air.

More data points surfaced in China, then Europe, then all over. A picture emerged of a brief, cataclysmic hot spell 56 million years ago, now known as the Paleocene-Eocene Thermal Maximum (PETM). After heat-trapping carbon leaked into the sky from an unknown source, the planet, which was already several degrees Celsius hotter than it is today, gained an additional 6 degrees. The ocean turned jacuzzi-hot near the equator and experienced mass extinctions worldwide. On land, primitive monkeys, horses and other early mammals marched northward, following vegetation to higher latitudes. The mammals also miniaturized over generations, as leaves became less nutritious in the carbonaceous air. Violent storms ravaged the planet; the geologic record indicates flash floods and protracted droughts. As Kennett put it, “Earth was triggered, and all hell broke loose.”

The PETM doesn’t only provide a past example of CO2-driven climate change; scientists say it also points to an unknown factor that has an outsize influence on Earth’s climate. When the planet got hot, it got really hot. Ancient warming episodes like the PETM were always far more extreme than theoretical models of the climate suggest they should have been. Even after accounting for differences in geography, ocean currents and vegetation during these past episodes, paleoclimatologists find that something big appears to be missing from their models — an X-factor whose wild swings leave no trace in the fossil record.

Evidence is mounting in favor of the answer that experts have long suspected but have only recently been capable of exploring in detail. “It’s quite clear at this point that the answer is clouds,” said Matt Huber, a paleoclimate modeler at Purdue University.

Clouds currently cover about two-thirds of the planet at any moment. But computer simulations of clouds have begun to suggest that as the Earth warms, clouds become scarcer. With fewer white surfaces reflecting sunlight back to space, the Earth gets even warmer, leading to more cloud loss. This feedback loop causes warming to spiral out of control.

For decades, rough calculations have suggested that cloud loss could significantly impact climate, but this concern remained speculative until the last few years, when observations and simulations of clouds improved to the point where researchers could amass convincing evidence.

Now, new findings reported today in the journal Nature Geoscience make the case that the effects of cloud loss are dramatic enough to explain ancient warming episodes like the PETM — and to precipitate future disaster. Climate physicists at the California Institute of Technology performed a state-of-the-art simulation of stratocumulus clouds, the low-lying, blankety kind that have by far the largest cooling effect on the planet. The simulation revealed a tipping point: a level of warming at which stratocumulus clouds break up altogether. The disappearance occurs when the concentration of CO2 in the simulated atmosphere reaches 1,200 parts per million — a level that fossil fuel burning could push us past in about a century, under “business-as-usual” emissions scenarios. In the simulation, when the tipping point is breached, Earth’s temperature soars 8 degrees Celsius, in addition to the 4 degrees of warming or more caused by the CO2 directly.
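
To see what kind of behavior such a tipping point produces, here is a deliberately crude toy in Python. The 1,200 parts-per-million threshold and the 8-degree jump are the article's figures; the 280 ppm preindustrial baseline and the 2-degrees-per-doubling direct term are simplifications chosen only so the totals land near the roughly 12 degrees discussed below. It is a sketch of the idea, not the Caltech simulation.

    # Crude toy of a cloud tipping point: gradual, logarithmic CO2 warming plus an
    # all-or-nothing jump once a threshold concentration is crossed. Illustrative
    # numbers only; this is not the Caltech simulation.

    from math import log2

    PREINDUSTRIAL_PPM = 280          # baseline CO2 concentration
    DIRECT_C_PER_DOUBLING = 2.0      # simplified direct warming per CO2 doubling (assumed)
    BREAKUP_PPM = 1200               # threshold at which simulated stratocumulus vanish (article)
    CLOUD_LOSS_JUMP_C = 8.0          # extra warming once the cloud decks are gone (article)

    def total_warming_c(co2_ppm):
        """Total warming: a smooth CO2 term plus the cloud-loss jump past the threshold."""
        direct = DIRECT_C_PER_DOUBLING * log2(co2_ppm / PREINDUSTRIAL_PPM)
        cloud_jump = CLOUD_LOSS_JUMP_C if co2_ppm >= BREAKUP_PPM else 0.0
        return direct + cloud_jump

    for ppm in (410, 800, 1190, 1200):
        print(f"{ppm:>5} ppm -> {total_warming_c(ppm):4.1f} C of warming")

Below the threshold the toy's warming creeps up gradually; at 1,200 ppm it jumps by 8 degrees at once, which is the qualitative shape of the cliff described in the next paragraph.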

Once clouds go away, the simulated climate “goes over a cliff,” said Kerry Emanuel, a climate scientist at the Massachusetts Institute of Technology. A leading authority on atmospheric physics, Emanuel called the new findings “very plausible,” though, as he noted, scientists must now make an effort to independently replicate the work.

To imagine 12 degrees of warming, think of crocodiles swimming in the Arctic and of the scorched, mostly lifeless equatorial regions during the PETM. If carbon emissions aren’t curbed quickly enough and the tipping point is breached, “that would be truly devastating climate change,” said Caltech’s Tapio Schneider, who performed the new simulation with Colleen Kaul and Kyle Pressel.

Huber said the stratocumulus tipping point helps explain the volatility that’s evident in the paleoclimate record. He thinks it might be one of many unknown instabilities in Earth’s climate. “Schneider and co-authors have cracked open Pandora’s box of potential climate surprises,” he said, adding that, as the mechanisms behind vanishing clouds become clear, “all of a sudden this enormous sensitivity that is apparent from past climates isn’t something that’s just in the past. It becomes a vision of the future.”

[snip]

Haisong Tang: Innovation in China: Rise & Transform

[Note:  This item comes from friend Jennifer Snow.  DLH]

Haisong Tang: Innovation in China: Rise & Transform
By TWIN Global – The World Innovation Network
Nov 29 2018

Haisong Tang discusses the future—and past—of business, fashion, technology and the military in China, France, Israel, the United States and points in between in this wide-ranging and incisive talk. 

Video: 23:38 min

White House to set up panel to counter climate change consensus, officials say

White House to set up panel to counter climate change consensus, officials say
By Juliet Eilperin, Josh Dawsey and Brady Dennis
Feb 24 2019
https://www.washingtonpost.com/national/health-science/white-house-to-select-federal-scientists-to-reassess-government-climate-findings-sources-say/2019/02/24/49cd0a84-37dd-11e9-af5b-b51b7ff322e9_story.html

The White House plans to create an ad hoc group of select federal scientists to reassess the government’s analysis of climate science and counter conclusions that the continued burning of fossil fuels is harming the planet, according to three administration officials.

The National Security Council initiative would include scientists who question the severity of climate impacts and the extent to which humans contribute to the problem, according to these individuals, who asked for anonymity to discuss internal deliberations. The group would not be subject to the same level of public disclosure as a formal advisory committee.

The move would represent the Trump administration’s most forceful effort to date to challenge the scientific consensus that greenhouse gas emissions are helping drive global warming and that the world could face dire consequences unless countries curb their carbon output over the next few decades.

The idea of a new working group, which top administration officials discussed Friday in the White House Situation Room, represents a modified version of an earlier plan to establish a federal advisory panel on climate and national security. That plan — championed by William Happer, a senior director at the NSC and a physicist who has challenged the idea that carbon dioxide could damage the planet — would have created an independent federal advisory committee.

The Federal Advisory Committee Act imposes several ground rules for such panels, including that they meet in public, are subject to public records requests and include a representative membership.

While the plan is not finalized, NSC officials said they would take steps to assemble a group of researchers within the government. The group will not be tasked with scrutinizing recent intelligence community assessments of climate change, according to officials familiar with the plan.

The National Security Council declined requests to comment on the matter.

During the Friday meeting, these officials said, deputy national security adviser Charles Kupperman said Trump was upset that his administration had issued the National Climate Assessment, which must be published regularly under federal law. Kupperman added that congressional Democrats had seized upon the report, which is the product of more than a dozen agencies, to bolster their case for cutting carbon emissions as part of the Green New Deal.

Attendees at the session, which included acting interior secretary David Bernhardt and senior officials from across the government, debated how best to establish a group of researchers that could scrutinize recent federal climate reports.

Happer, who headed an advocacy group called the CO2 Coalition before joining the administration in the fall, has challenged the scientific consensus on climate change inside and outside of government.

Public records show the coalition, which describes its mission as informing policymakers and the public of the “important contribution made by carbon dioxide to our lives and the economy,” has received money from far-right organizations and donors with fossil fuel interests.

In 2017, according to federal tax filings obtained by the Climate Investigations Center, the group received $170,000 from the Mercer Family Foundation and more than $33,000 from the Charles Koch Institute.

One senior administration official said the president was looking for “a mixture of opinions” and disputed a massive inter-agency report in November that described intensifying climate change as a threat to the United States.

“The president wants people to be able to decide for themselves,” the aide said.

Several scientists, however, said the federal government’s recent findings on climate change had received intense scrutiny from other researchers in the field before they became public.

Christopher Field, director of the Stanford Woods Institute, who served on the National Academy of Sciences review panel for the scientific report that formed the basis of last year’s climate assessment, said the committee met several times “to do a careful, page by page evaluation of the entire report.”

“The whole review process is confrontational from the very get-go, but it’s based in scientific credibility, in a traceable chain of evidence through publications,” said Field, an earth system science and biology professor.

[snip]

No, ‘Oumuamua is not an alien spaceship. It might be even weirder.

[Note:  This item comes from friend David Rosenthal.  DLH]

No, ‘Oumuamua is not an alien spaceship. It might be even weirder.
By Phil Plait
Feb 18 2019
https://www.syfy.com/syfywire/no-oumuamua-is-not-an-alien-spaceship-it-might-be-even-weirder

Not to put too fine a point on it, but what the frak is ‘Oumuamua?

Oh, you remember ‘Oumuamua. It caused quite a stir last year; first seen in late 2017 by the Pan-STARRS survey telescope in Hawaii, it was quickly found to have a very unusual orbit. Instead of the usual ellipse or circle around the Sun like normal solar system objects, it was found to have a hyperbolic orbit. That means it was moving too quickly to be bound to the Sun, and that, in turn, means it came from Out There. Like really out there: interstellar space, the void between the stars.

Subsequent observations confirmed it: ‘Oumuamua was just passing through the solar system, with so much extra velocity (about 25 km/sec) that it was moving faster than the Sun’s escape velocity. This was a one-time visitor, screaming through the solar system and heading back out into The Black once again.

Yeah, it’s not coming back.

That was certainly enough to make it the object of intense scrutiny. We’d never seen something from interstellar space pass through the solar system before! But what was it? At first it was classified as a comet, then an asteroid, and then maybe a comet again (this confusion is reflected in its provisional designations; at first it was C/2017 U1, for “comet”, then A/2017 U1, for “asteroid”, then finally 1I/2017 U1, for “interstellar”). It was hard to tell what it was; it was too small, faint, and far away to get good observations, and worse, it was only seen on its way out, so it was farther from us literally every day.

Then another very weird thing happened: More observations allowed a better determination of its trajectory, and it was found that it wasn’t slowing down fast enough. As it moves away, the Sun’s gravity pulls on it, slowing it down … but it wasn’t slowing down enough.

Some force was acting on it, accelerating it very slightly. Comets are made of rock and ice, so maybe the ice was turning into gas, and as this was blown off it acted like a very gentle rocket. The problem with this is that no such venting was detected. If it were like comets in our solar system, you’d expect to see lots of carbon monoxide (CO) and carbon dioxide (CO2) coming from it, but none was seen. So maybe it was some other kind of ice, like water. But again, if it is like our local comets, it would take so much water that we’d have noticed.

That’s when a couple of astronomers posited something interesting: Maybe this force was radiation pressure, literally the force of sunlight hitting it and giving it a tiny push. That makes some sense, but for the math to work out with the acceleration seen, ‘Oumuamua had to be flat. Like, really flat: So thin that it looked more like a solar sail, a very thin sheet of material designed to catch sunlight and accelerate. But that, in turn, meant that ‘Oumuamua was artificial. As in, a spaceship.

Besides the obvious (it seems like a big leap!), I have my problems with this idea. Not much has changed with that hypothesis since I wrote that, and while I wouldn’t dismiss it being an alien probe out of hand, the evidence doesn’t support that conclusion, and in fact points against it.

So, I ask again: What the frak is ‘Oumuamua?

A new paper has come out that might have a solution, and it’s really clever. Maybe ‘Oumuamua’s not flat. Maybe it’s fluffy.

When the astronomers speculated it might be thin and flat, giving it a large area like a sail, they had to assume a density for it. That’s because the amount of pressure sunlight exerts is very small, so if an object is massive it has to be spread out very thin and big to catch enough sunlight to accelerate it enough to match the observations. So they assumed it had some normal density, like 1 – 3 grams per cubic centimeter (roughly somewhere between the density of water and that of rock).

The new paper turns that around. Instead of assuming a density to find the area, let’s assume the size determined using normal methods is correct, use that to get an area, and from there get the density needed to match the observations.

Assuming a size for ‘Oumuamua of 50 – 130 meters, what they get is a very low density: About 0.00005 grams per cc. That’s incredibly low, and at first it seems ridiculously so. That’s 100 times less dense than air! No solid object could have a density that low!
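
For a rough sense of how that number falls out, here is a back-of-the-envelope Python sketch of the same move: balance radiation pressure against the object's mass and solve for density instead of area. It assumes a light-absorbing sphere, and the excess acceleration it uses is only the approximate ballpark reported for 'Oumuamua, not an input taken from the new paper.

    # Back-of-the-envelope version of "assume the size, solve for the density."
    # Purely illustrative: a light-absorbing sphere is assumed, and the excess
    # acceleration below is a rough ballpark, not a value from the new paper.

    SOLAR_CONSTANT = 1361.0   # W/m^2 of sunlight at 1 AU
    LIGHT_SPEED = 3.0e8       # m/s
    EXCESS_ACCEL = 5.0e-6     # m/s^2, approximate non-gravitational acceleration at 1 AU

    def implied_density(radius_m):
        """Density (g/cm^3) a sphere of this radius must have for sunlight
        alone to supply the excess acceleration: a = (S/c) * area / mass."""
        pressure = SOLAR_CONSTANT / LIGHT_SPEED          # N/m^2 on an absorbing surface
        # a = pressure * pi R^2 / (rho * 4/3 pi R^3)  =>  rho = 3 * pressure / (4 R a)
        rho_kg_per_m3 = 3.0 * pressure / (4.0 * radius_m * EXCESS_ACCEL)
        return rho_kg_per_m3 / 1000.0                    # convert kg/m^3 to g/cm^3

    for radius in (25.0, 50.0, 65.0):
        print(f"radius {radius:3.0f} m -> density {implied_density(radius):.1e} g/cm^3")

With those crude assumptions the implied densities come out around 0.00001 to 0.00003 grams per cc, the same absurdly low regime as the paper's figure; the exact value shifts with the shape, reflectivity, and size you assume.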

… so what if it’s not solid?

[snip]

Cyber-Mercenary Groups Shouldn’t be Trusted in Your Browser or Anywhere Else

[Note:  This item comes from friend David Rosenthal.  DLH]

Cyber-Mercenary Groups Shouldn’t be Trusted in Your Browser or Anywhere Else
By Cooper Quintin
Feb 22 2019
https://www.eff.org/deeplinks/2019/02/cyber-mercenary-groups-shouldnt-be-trusted-your-browser-or-anywhere-else

DarkMatter, the notorious cyber-mercenary firm based in the United Arab Emirates, is seeking to become approved as a top-level certificate authority in Mozilla’s root certificate program. Giving such a trusted position to this company would be a very bad idea. DarkMatter has a business interest in subverting encryption, and would be able to potentially decrypt any HTTPS traffic they intercepted. One of the things HTTPS is good at is protecting your private communications from snooping governments—and when governments want to snoop, they regularly hire DarkMatter to do their dirty work.

Membership in the root certificate program is the way in which Mozilla decides which certificate authorities (CAs) get to have their root certificates trusted in Firefox. Mozilla’s list of trusted root certificates is also used in many other products, including the Linux operating system.  

Browsers rely on this list of authorities, which are trusted to verify and issue the certificates that allow for secure browsing, using technologies like TLS and HTTPS. Certificate authorities are the basis of HTTPS, but they are also its greatest weakness. Any of the dozens of certificate authorities trusted by your browser could secretly issue a fraudulent certificate for any website (such as google.com or eff.org). A certificate authority (or another organization, such as a government spy agency) could then use the fraudulent certificate to spy on your communications with that site, even if it is encrypted with HTTPS. Certificate Transparency can mitigate some of the risk by requiring public logging of all issued certificates, but it is not a panacea.
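
To make that trust relationship concrete, here is a small Python sketch (not taken from the EFF post) that opens a TLS connection and prints the subject and issuer of the certificate the server presents. The handshake succeeds only because that certificate chains up to a root CA already in the local trust store, the kind of curated list that Mozilla's root program maintains.

    import socket
    import ssl

    hostname = "www.eff.org"   # any HTTPS site works here

    # create_default_context() loads the platform's trusted root CAs -- the same
    # kind of curated list that Mozilla's root program feeds into Firefox.
    context = ssl.create_default_context()

    with socket.create_connection((hostname, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
            # getpeercert() returns details only after the chain has validated;
            # the handshake fails outright if it doesn't end at a trusted root.
            print("Subject:", dict(item[0] for item in cert["subject"]))
            print("Issuer: ", dict(item[0] for item in cert["issuer"]))
            print("Expires:", cert["notAfter"])

If a DarkMatter root were added to that store, a certificate it issued for any hostname would sail through the same check, which is exactly the concern.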

The companies on your browser’s trusted CA list rarely commit such fraud, since not issuing malicious certificates is the foremost responsibility for a certificate authority. But it can and does still happen. The concern in this case is that DarkMatter has made its business spying on internet communications, hacking dissidents’ iPhones, and other cyber-mercenary work. DarkMatter’s business objectives directly depend on intercepting end-user traffic on behalf of snooping governments. Giving DarkMatter a trusted root certificate would be like letting the proverbial fox guard the henhouse. 

Currently, the standard for being accepted as a trusted certificate authority in the browser is a technical and bureaucratic one. For example, do the organization’s documented practices meet the minimum requirements? Can the organization issue standards-compliant certificates? DarkMatter will likely meet those standards, eventually. But the standards don’t take into account an organization’s history of trying to break encryption, or its conflicts of interest.

Other organizations have used this fact to game the system in the past and worm their way into our browsers. In 2009, Mozilla allowed CNNIC, the Chinese state certification authority, into the root CA program, after CNNIC assured Mozilla and the larger community that it would not abuse this power to create fake certificates and break encryption. In 2015 CNNIC was caught in a scandal when an intermediate CA authorized by CNNIC issued illegitimate certificates for several Google-owned domains. Google, Mozilla, and others quickly revoked CNNIC’s authority in their browsers and operating systems after learning about the breach of trust. CNNIC is not the only example of this. In 2013 Mozilla considered dropping the Swedish company Teliasonera after accusations that it had helped enable government spying. Teliasonera ultimately did not get dropped, but it continues to have security problems to this day. 

DarkMatter was already given an “intermediate” certificate by another company, called QuoVadis, now owned by DigiCert. That’s bad enough, but the “intermediate” authority at least comes with ostensible oversight by DigiCert. Without that oversight, the situation will be much worse. We would encourage Mozilla and others to revoke even this intermediate certificate, given DarkMatter’s known practices subverting internet security.

[snip]

Why a Focus on “Fake News” and Facebook Misses the Internet’s Real Problems – and Solutions

[Note:  This item comes from friend David Rosenthal.  DLH]

Why a Focus on “Fake News” and Facebook Misses the Internet’s Real Problems – and Solutions
By Yves Smith
Feb 23 2019
https://www.nakedcapitalism.com/2019/02/focus-fake-news-facebook-misses-internets-real-problems-solutions.html

Yves here. Needless to say, I have little sympathy for the handwringing over “fake news,” and worse, the plans to regulate content provision on the Internet. There is no evidence that “Russian” campaigns on social media had any impact on election results, and political scientists like Tom Ferguson have debunked the idea. In addition, Marina Bart explained why Cambridge Analytica’s claims about its ability to sway voters were hogwash. 

In recent years, Google has repeatedly changed its search algos to downgrade alternative sites and we now barely appear on searches when our original reporting used to dominate results.

By Jennifer Cobbe, the co-ordinator of Cambridge University’s Trustworthy Technologies strategic research initiative, which researches trust, computer and internet technologies. She researches and writes on law, tech and surveillance issues. Originally published at openDemocracy

Yesterday morning, the House of Commons Digital, Culture, Media and Sport Select Committee published its long-awaited final report into disinformation and ‘fake news’. The report – which follows a long and at times dramatic investigation – is full of interesting and insightful details about political microtargeting (the targeting of political messaging to relatively small groups of people) and the spread of disinformation.

But the report’s myopic focus on one company – Facebook – means that it misses the bigger picture – including the internet’s dominant variety of capitalism.

It is of course welcome that attention is being paid to these problems, and there is much in the Committee’s report that’s good. The report is undoubtedly right to find that Britain’s electoral laws are woefully inadequate for the age of the algorithm and are badly in need of reform. Its recommendation that inferences drawn from analysis of other data about people should be more clearly considered to be personal data likewise seems eminently sensible.

Is It OK to Manipulate People to Extract Their Money, Just not for Politics?

But there are also clear shortcomings. Focusing on disinformation itself as a target for regulation brings an obvious problem. By calling for interventions based on ‘harmful’ content, the report asks the Government to step into the dangerous territory of regulating lawful political conversations between people. Are private companies to be mandated to police these communications on the Government’s behalf? There are numerous good reasons why this is deeply undesirable (not to mention incompatible with human rights laws).

The biggest oversight, however, is in diagnosing disinformation as essentially a problem with Facebook, rather than a systemic issue emerging in part from the pollution of online spaces by the business model that Facebook shares with others: the surveillance and modification of human behaviour for profit.

‘Surveillance capitalism’, as it’s known, involves gathering as much data as possible about as many people as possible doing as many things as possible from as many sources as possible. These huge datasets are then algorithmically analysed so as to spot patterns and correlations from which future behaviour can be predicted. A personalised, highly dynamic, and responsive form of behavioural nudging then seeks to influence that future behaviour to drive engagement and profit for platforms and advertisers. These targeted behaviour modification tools rely on triggering cognitive biases and known short-cuts in human decision-making. Platforms and advertisers extensively experiment to find the most effective way to influence behaviour.
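
As a caricature of that predict-and-nudge loop (every name and number below is invented, and no real platform is remotely this simple), the logic reduces to: log reactions, predict what each person will engage with, then serve more of it.

    # Toy caricature of "predict engagement, then serve whatever maximises it".
    # All of the data here is invented; it only illustrates the loop described above.

    past_reactions = {
        "user_a": {"outrage_post": 0.9, "cute_animals": 0.2, "campaign_ad": 0.7},
        "user_b": {"outrage_post": 0.1, "cute_animals": 0.8, "campaign_ad": 0.3},
    }

    def most_engaging_variant(user):
        """'Predict' future behaviour from logged behaviour -- here, simply
        pick the variant the user reacted to most strongly in the past."""
        scores = past_reactions[user]
        return max(scores, key=scores.get)

    # The 'nudge': each feed is filled with whatever the model expects the user
    # to engage with, because engagement is what platforms and advertisers sell.
    for user in past_reactions:
        print(user, "is shown:", most_engaging_variant(user))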

[snip]