Locast, a Free App Streaming Network TV, Would Love to Get Sued

Want to watch the Super Bowl and other network TV for free? A start-up called Locast will let you, and (so far) the big broadcasters aren’t trying to stop it.
By Edmund Lee
Jan 31 2019

On the roof of a luxury building at the edge of Central Park, 585 feet above the concrete, a lawyer named David Goodfriend has attached a modest four-foot antenna that is a threat to the entire TV-industrial complex. 

The device is there to soak up TV signals coursing through the air — content from NBC, ABC, Fox, PBS and CBS, including megahits like “This Is Us” and this Sunday’s broadcast of Super Bowl LIII. Once plucked from the ether, the content is piped through the internet and assembled into an app called Locast. It’s a streaming service, and it makes all of this network programming available to subscribers in ways that are more convenient than relying on a home antenna: It’s viewable on almost any device, at any time, in pristine quality that doesn’t cut in and out. It’s also completely free.

If this sounds familiar, you might be thinking of Aereo, the Barry Diller-backed start-up that in 2012 threatened to upend the media industry by capturing over-the-air TV signals and streaming the content to subscribers for a fee — while not paying broadcasters a dime. NBC, CBS, ABC and Fox banded together and sued, eventually convincing the Supreme Court that Aereo had violated copyright law. The clear implication for many: If you mess with the broadcasters, you’ll file for bankruptcy and cost your investors more than $100 million.

Mr. Goodfriend took a different lesson. A former media executive with stints at the Federal Communications Commission and in the Clinton administration, he wondered if an Aereo-like offering that was structured as a noncommercial entity would remain within the law. Last January, he started Locast in New York. The service now has about 60,000 users in Houston, Chicago, Boston, Philadelphia, Dallas and Denver as well as New York, and will soon add more in Washington, D.C.

Mr. Goodfriend, 50, said he hoped to cover the entire nation as quickly as possible. “I’m not stopping,” he said. “I can’t now.”

The comment is basically a dare to the networks to take legal action against him. By giving away TV, Mr. Goodfriend is undercutting the licensing fees that major broadcasters charge the cable and satellite companies — a sum that will exceed $10 billion this year, according to Kagan, a research unit of S&P Global Market Intelligence. For cable customers, the traditional network channels typically add about $12 to a monthly bill.

With consumers increasingly willing to piece together their own bespoke packages of content — paying a few bucks to Netflix here, a few to HBO there — anything that encourages people to cut their cable cords is a challenge to the cable TV empire. That calculus makes tiny Locast, whose modest website (“Help us free your TV!”) asks for donations starting at $5, perhaps the most audacious media experiment in years. 

Locast has about 60,000 users in seven cities, with hopes of eventually expanding nationwide. Jeenah Moon for The New York Times

‘Do you know you’re supposed to get television for free?’

With a shaved head and a short mustache, Mr. Goodfriend looks much younger than his age, and he speaks with the enthusiasm and the cadence of an earnest law student. 

“We really did our homework,” he said. “We are operating under parameters that are designed to be compliant within the law.”

The copyright code has an exemption for nonprofits. Mr. Goodfriend, who does not draw a salary, said he has collected $10,000 in donations so far, mostly in $5 increments. He took out a high-interest loan, at around 15 percent, to fund the operation, which to date has cost more than $700,000. 

Mr. Goodfriend is not a rich tech entrepreneur or a wealthy heir — just a lawyer who has made a decent living. Locast could still meet the fate of Aereo and be sued into financial oblivion by the networks. So why is he doing this?

The answer is partly principle, and partly intellectual mischief: With his public-private background, he has spotted an imbalance in the media ecosystem, he said, and decided to give the whole thing a shake.

“I ask people all the time, ‘Do you know you’re supposed to get television for free?’” Mr. Goodfriend said during an interview in Central Park, gesturing to a gaggle of visitors. “Most people under 50 don’t get it.”

Although his practice is in Washington, where he also teaches law at Georgetown and lectures at George Washington University, Mr. Goodfriend had come to New York to inspect the installation of the antenna, on the Trump International Hotel and Tower. 

(This is another area where Locast has to operate carefully: The organization must install signal equipment in every city where it operates, because all broadcast stations are regional and retransmissions can be made only to local residents. If you live in, say, Miami, you can’t get Locast until Mr. Goodfriend puts up an antenna there.)


A World Without Clouds

A state-of-the-art supercomputer simulation indicates that a feedback loop between global warming and cloud loss can push Earth’s climate past a disastrous tipping point in as little as a century.
By Natalie Wolchover
Feb 25 2019

On a 1987 voyage to the Antarctic, the paleoceanographer James Kennett and his crew dropped anchor in the Weddell Sea, drilled into the seabed, and extracted a vertical cylinder of sediment. In an inch-thick layer of plankton fossils and other detritus buried more than 500 feet deep, they found a disturbing clue about the planet’s past that could spell disaster for the future.

Lower in the sediment core, fossils abounded from 60 plankton species. But in that thin cross-section from about 56 million years ago, the number of species dropped to 17. And the planktons’ oxygen and carbon isotope compositions had dramatically changed. Kennett and his student Lowell Stott deduced from the anomalous isotopes that carbon dioxide had flooded the air, causing the ocean to rapidly acidify and heat up, in a process similar to what we are seeing today.

While those 17 kinds of plankton were sinking through the warming waters and settling on the Antarctic seabed, a tapir-like creature died in what is now Wyoming, depositing a tooth in a bright-red layer of sedimentary rock coursing through the badlands of the Bighorn Basin. In 1992, the finder of the tooth fossil, Phil Gingerich, and collaborators Jim Zachos and Paul Koch reported the same isotope anomalies in its enamel that Kennett and Stott had presented in their ocean findings a year earlier. The prehistoric mammal had also been breathing CO2-flooded air.

More data points surfaced in China, then Europe, then all over. A picture emerged of a brief, cataclysmic hot spell 56 million years ago, now known as the Paleocene-Eocene Thermal Maximum (PETM). After heat-trapping carbon leaked into the sky from an unknown source, the planet, which was already several degrees Celsius hotter than it is today, gained an additional 6 degrees. The ocean turned jacuzzi-hot near the equator and experienced mass extinctions worldwide. On land, primitive monkeys, horses and other early mammals marched northward, following vegetation to higher latitudes. The mammals also miniaturized over generations, as leaves became less nutritious in the carbonaceous air. Violent storms ravaged the planet; the geologic record indicates flash floods and protracted droughts. As Kennett put it, “Earth was triggered, and all hell broke loose.”

The PETM doesn’t only provide a past example of CO2-driven climate change; scientists say it also points to an unknown factor that has an outsize influence on Earth’s climate. When the planet got hot, it got really hot. Ancient warming episodes like the PETM were always far more extreme than theoretical models of the climate suggest they should have been. Even after accounting for differences in geography, ocean currents and vegetation during these past episodes, paleoclimatologists find that something big appears to be missing from their models — an X-factor whose wild swings leave no trace in the fossil record.

Evidence is mounting in favor of the answer that experts have long suspected but have only recently been capable of exploring in detail. “It’s quite clear at this point that the answer is clouds,” said Matt Huber, a paleoclimate modeler at Purdue University.

Clouds currently cover about two-thirds of the planet at any moment. But computer simulations of clouds have begun to suggest that as the Earth warms, clouds become scarcer. With fewer white surfaces reflecting sunlight back to space, the Earth gets even warmer, leading to more cloud loss. This feedback loop causes warming to spiral out of control.

For decades, rough calculations have suggested that cloud loss could significantly impact climate, but this concern remained speculative until the last few years, when observations and simulations of clouds improved to the point where researchers could amass convincing evidence.

Now, new findings reported today in the journal Nature Geoscience make the case that the effects of cloud loss are dramatic enough to explain ancient warming episodes like the PETM — and to precipitate future disaster. Climate physicists at the California Institute of Technology performed a state-of-the-art simulation of stratocumulus clouds, the low-lying, blankety kind that have by far the largest cooling effect on the planet. The simulation revealed a tipping point: a level of warming at which stratocumulus clouds break up altogether. The disappearance occurs when the concentration of CO2 in the simulated atmosphere reaches 1,200 parts per million — a level that fossil fuel burning could push us past in about a century, under “business-as-usual” emissions scenarios. In the simulation, when the tipping point is breached, Earth’s temperature soars 8 degrees Celsius, in addition to the 4 degrees of warming or more caused by the CO2 directly.
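The feedback the article describes can be made concrete with a toy calculation. This is emphatically not the Caltech large-eddy simulation: the 1,200 ppm threshold and the roughly 8-degree jump are taken from the article, the standard logarithmic CO2 forcing is textbook, but treating the cloud breakup as a simple step function is an assumption made purely for illustration.

```python
import math

# Toy sketch of a cloud-albedo tipping point. Warming from CO2 alone
# grows logarithmically; past the (article-reported) 1,200 ppm
# threshold, stratocumulus decks break up and the lost reflectivity
# adds a roughly fixed extra warming. The step-function cloud response
# is an illustrative assumption, not the paper's model.

def equilibrium_warming(co2_ppm, sensitivity_per_doubling=3.0,
                        tipping_ppm=1200, cloud_jump_c=8.0):
    """Warming (deg C) relative to a 280 ppm preindustrial baseline."""
    direct = sensitivity_per_doubling * math.log2(co2_ppm / 280.0)
    extra = cloud_jump_c if co2_ppm >= tipping_ppm else 0.0
    return direct + extra

for ppm in (280, 560, 1100, 1200, 1300):
    print(f"{ppm:>5} ppm -> {equilibrium_warming(ppm):5.1f} C")
```

The discontinuity at the threshold is the point: warming creeps up smoothly with CO2, then jumps once the cloud deck collapses, matching the "goes over a cliff" behavior the researchers describe.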

Once clouds go away, the simulated climate “goes over a cliff,” said Kerry Emanuel, a climate scientist at the Massachusetts Institute of Technology. A leading authority on atmospheric physics, Emanuel called the new findings “very plausible,” though, as he noted, scientists must now make an effort to independently replicate the work.

To imagine 12 degrees of warming, think of crocodiles swimming in the Arctic and of the scorched, mostly lifeless equatorial regions during the PETM. If carbon emissions aren’t curbed quickly enough and the tipping point is breached, “that would be truly devastating climate change,” said Caltech’s Tapio Schneider, who performed the new simulation with Colleen Kaul and Kyle Pressel.

Huber said the stratocumulus tipping point helps explain the volatility that’s evident in the paleoclimate record. He thinks it might be one of many unknown instabilities in Earth’s climate. “Schneider and co-authors have cracked open Pandora’s box of potential climate surprises,” he said, adding that, as the mechanisms behind vanishing clouds become clear, “all of a sudden this enormous sensitivity that is apparent from past climates isn’t something that’s just in the past. It becomes a vision of the future.”


Haisong Tang: Innovation in China: Rise & Transform

[Note:  This item comes from friend Jennifer Snow.  DLH]

By TWIN Global – The World Innovation Network
Nov 29 2018

Haisong Tang discusses the future—and past—of business, fashion, technology and the military in China, France, Israel, the United States and points in between in this wide-ranging and incisive talk. 

Video: 23:38 min

White House to set up panel to counter climate change consensus, officials say

By Juliet Eilperin, Josh Dawsey and Brady Dennis
Feb 24 2019

The White House plans to create an ad hoc group of select federal scientists to reassess the government’s analysis of climate science and counter conclusions that the continued burning of fossil fuels is harming the planet, according to three administration officials.

The National Security Council initiative would include scientists who question the severity of climate impacts and the extent to which humans contribute to the problem, according to these individuals, who asked for anonymity to discuss internal deliberations. The group would not be subject to the same level of public disclosure as a formal advisory committee.

The move would represent the Trump administration’s most forceful effort to date to challenge the scientific consensus that greenhouse gas emissions are helping drive global warming and that the world could face dire consequences unless countries curb their carbon output over the next few decades.

The idea of a new working group, which top administration officials discussed Friday in the White House Situation Room, represents a modified version of an earlier plan to establish a federal advisory panel on climate and national security. That plan — championed by William Happer, the NSC’s senior director and a physicist who has challenged the idea that carbon dioxide could damage the planet — would have created an independent federal advisory committee.

The Federal Advisory Committee Act imposes several ground rules for such panels, including that they meet in public, are subject to public records requests and include a representative membership.

While the plan is not finalized, NSC officials said they would take steps to assemble a group of researchers within the government. The group will not be tasked with scrutinizing recent intelligence community assessments of climate change, according to officials familiar with the plan.

The National Security Council declined requests to comment on the matter.

During the Friday meeting, these officials said, deputy national security adviser Charles Kupperman said Trump was upset that his administration had issued the National Climate Assessment, which must be published regularly under federal law. Kupperman added that congressional Democrats had seized upon the report, which is the product of more than a dozen agencies, to bolster their case for cutting carbon emissions as part of the Green New Deal.

Attendees at the session, which included acting interior secretary David Bernhardt and senior officials from across the government, debated how best to establish a group of researchers that could scrutinize recent federal climate reports.

Happer, who headed an advocacy group called the CO2 Coalition before joining the administration in the fall, has challenged the scientific consensus on climate change inside and outside of government.

Public records show the coalition, which describes its mission as informing policymakers and the public of the “important contribution made by carbon dioxide to our lives and the economy,” has received money from far-right organizations and donors with fossil fuel interests.

In 2017, according to federal tax filings obtained by the Climate Investigations Center, the group received $170,000 from the Mercer Family Foundation and more than $33,000 from the Charles Koch Institute.

One senior administration official said the president was looking for “a mixture of opinions” and disputed a massive inter-agency report in November that described intensifying climate change as a threat to the United States.

“The president wants people to be able to decide for themselves,” the aide said.

Several scientists, however, said the federal government’s recent findings on climate change had received intense scrutiny from other researchers in the field before they became public.

Christopher Field, director of the Stanford Woods Institute who served on the National Academy of Sciences review panel for the scientific report that formed the basis of last year’s climate assessment, said the committee met several times “to do a careful, page by page evaluation by the entire report.”

“The whole review process is confrontational from the very get-go, but it’s based in scientific credibility, in a traceable chain of evidence through publications,” said Field, an earth system science and biology professor.


No, ‘Oumuamua is not an alien spaceship. It might be even weirder.

[Note:  This item comes from friend David Rosenthal.  DLH]

By Phil Plait
Feb 18 2019

Not to put too fine a point on it, but what the frak is ‘Oumuamua?

Oh, you remember ‘Oumuamua. It caused quite a stir last year; first seen in late 2017 by the Pan-STARRS survey telescope in Hawaii, it was quickly found to have a very unusual orbit. Instead of the usual ellipse or circle around the Sun like normal solar system objects, it was found to have a hyperbolic orbit. That means it was moving too quickly to be bound to the Sun, and that, in turn, means it came from Out There. Like really out there: interstellar space, the void between the stars.

Subsequent observations confirmed it: ‘Oumuamua was just passing through the solar system, with so much extra velocity (about 25 km/sec) that it was moving faster than the Sun’s escape velocity. This was a one-time visitor, screaming through the solar system and heading back out into The Black once again.
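The "faster than escape velocity" claim is easy to check on the back of an envelope. The sketch below uses standard constants and the roughly 25 km/s excess speed quoted above; the 1 AU reference distance is chosen here just for illustration.

```python
import math

# Back-of-envelope check of the hyperbolic-orbit claim: an object whose
# speed exceeds the Sun's escape velocity at its distance cannot be
# gravitationally bound. Constants are standard values; the ~25 km/s
# hyperbolic excess speed is the figure quoted in the article.

GM_SUN = 1.32712440018e20   # Sun's gravitational parameter, m^3/s^2
AU = 1.495978707e11         # astronomical unit, m

def escape_speed(r_m):
    """Solar escape speed (m/s) at distance r_m from the Sun."""
    return math.sqrt(2 * GM_SUN / r_m)

v_esc = escape_speed(AU) / 1000.0   # km/s at 1 AU
v_infinity = 25.0                   # km/s, excess speed from the article
# Energy conservation for an unbound orbit: total speed at 1 AU.
v_at_1au = math.sqrt(v_esc**2 + v_infinity**2)

print(f"escape speed at 1 AU: {v_esc:.1f} km/s")
print(f"'Oumuamua-like speed at 1 AU: {v_at_1au:.1f} km/s (unbound)")
```

Anything carrying tens of km/s of excess speed on top of the local escape velocity is on a one-way trip, which is exactly why it will not be coming back.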

Yeah, it’s not coming back.

That was certainly enough to make it the object of intense scrutiny. We’d never seen something from interstellar space pass through the solar system before! But what was it? At first it was classified as a comet, then an asteroid, and then maybe a comet again (this confusion is reflected in its provisional designations; at first it was A/2017 U1, for “asteroid”, then C/2017 U1, for “comet”, then finally I/2017 U1, for “interstellar”). It was hard to tell what it was; it was too small, faint, and far away to get good observations, and worse, it was only seen on its way out, so it was farther from us literally every day.

Then another very weird thing happened: More observations allowed a better determination of its trajectory, and it was found that it wasn’t slowing down fast enough. As it moves away, the Sun’s gravity pulls on it, slowing it down … but it wasn’t slowing down enough.

Some force was acting on it, accelerating it very slightly. Comets are made of rock and ice, so maybe the ice was turning into gas, and as this was blown off it acted like a very gentle rocket. The problem with this is that no such venting was detected. If it were like comets in our solar system, you’d expect to see lots of carbon monoxide (CO) and carbon dioxide (CO2) coming from it, but none was seen. So maybe it was some other kind of ice, like water. But again, if it is like our local comets, it would take so much water that we’d have noticed.

That’s when a couple of astronomers posited something interesting: Maybe this force was radiation pressure, literally the force of sunlight hitting it and giving it a tiny push. That makes some sense, but for the math to work out with the acceleration seen, ‘Oumuamua had to be flat. Like, really flat: So thin that it looked more like a solar sail, a very thin sheet of material designed to catch sunlight and accelerate. But that, in turn, meant that ‘Oumuamua was artificial. As in, a spaceship.

Besides the obvious (it seems like a big leap!), I have my problems with this idea. Not much has changed with that hypothesis since I wrote that, and while I wouldn’t dismiss it being an alien probe out of hand, the evidence doesn’t support that conclusion, and in fact points against it.

So, I ask again: What the frak is ‘Oumuamua?

A new paper has come out that might have a solution, and it’s really clever. Maybe ‘Oumuamua’s not flat. Maybe it’s fluffy.

When the astronomers speculated it might be thin and flat, giving it a large area like a sail, they had to assume a density for it. That’s because the amount of pressure sunlight exerts is very small, so if an object is massive it has to be spread out very thin and big to catch enough sunlight to accelerate it enough to match the observations. So they assumed it had some normal density like 1 – 3 grams per cubic centimeter (roughly somewhere between the density of water and rock).

The new paper turns that around. Instead of assuming a density to find the area, let’s assume the size determined using normal methods is correct, use that to get an area, and from there get the density needed to match the observations.

Assuming a size for ‘Oumuamua of 50 – 130 meters, what they get is a very low density: About 0.00005 grams per cc. That’s incredibly low, and at first it seems ridiculously so. That’s 100 times less dense than air! No solid object could have a density that low!
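The inversion the paper performs can be sketched numerically. The sketch below assumes a simple sphere, the standard solar radiation pressure at 1 AU, and the reported non-gravitational acceleration of roughly 4.9 × 10⁻⁶ m/s² at 1 AU; because the geometry is simplified, expect it to reproduce the article's figure only to order of magnitude.

```python
# Rough version of the paper's approach: instead of assuming a density
# and solving for the area a solar sail would need, assume a size and
# solve for the density radiation pressure alone would require.
# Spherical geometry is a simplifying assumption; the real analysis is
# more careful, so only the order of magnitude should match.

P_RAD = 4.56e-6   # solar radiation pressure at 1 AU, N/m^2 (absorbing body)
A_OBS = 4.92e-6   # reported non-gravitational acceleration at 1 AU, m/s^2

def required_density(radius_m):
    """Density (g/cm^3) a sphere needs for sunlight to supply A_OBS.

    a = P * pi R^2 / ((4/3) pi R^3 * rho)   =>   rho = 3P / (4 R a)
    """
    rho_si = 3 * P_RAD / (4 * radius_m * A_OBS)   # kg/m^3
    return rho_si / 1000.0                        # g/cm^3

for r in (25, 50, 65):
    print(f"R = {r:3d} m  ->  rho = {required_density(r):.1e} g/cm^3")
```

Whatever the exact geometry, the answer comes out around a hundred-thousandth of a gram per cubic centimeter: far too low for any solid body, which is what motivates the "fluffy" hypothesis.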

… so what if it’s not solid?


Cyber-Mercenary Groups Shouldn’t be Trusted in Your Browser or Anywhere Else

[Note:  This item comes from friend David Rosenthal.  DLH]

Feb 22 2019

DarkMatter, the notorious cyber-mercenary firm based in the United Arab Emirates, is seeking to become approved as a top-level certificate authority in Mozilla’s root certificate program. Giving such a trusted position to this company would be a very bad idea. DarkMatter has a business interest in subverting encryption, and would be able to potentially decrypt any HTTPS traffic they intercepted. One of the things HTTPS is good at is protecting your private communications from snooping governments—and when governments want to snoop, they regularly hire DarkMatter to do their dirty work.

Membership in the root certificate program is the way in which Mozilla decides which certificate authorities (CAs) get to have their root certificates trusted in Firefox. Mozilla’s list of trusted root certificates is also used in many other products, including the Linux operating system.  

Browsers rely on this list of authorities, which are trusted to verify and issue the certificates that allow for secure browsing, using technologies like TLS and HTTPS. Certificate authorities are the basis of HTTPS, but they are also its greatest weakness. Any of the dozens of certificate authorities trusted by your browser could secretly issue a fraudulent certificate for any website (such as google.com or eff.org). A certificate authority (or another organization, such as a government spy agency) could then use the fraudulent certificate to spy on your communications with that site, even if it is encrypted with HTTPS. Certificate Transparency can mitigate some of the risk by requiring public logging of all issued certificates, but is not a panacea.
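Beyond Certificate Transparency, one client-side defense against a rogue-but-"trusted" CA is certificate pinning: the client checks the presented certificate's fingerprint against one recorded out of band, so even a fraudulent certificate that chains to a trusted root is rejected. A minimal sketch, in which the certificate bytes and pins are placeholders rather than real certificates:

```python
import hashlib

# Sketch of certificate pinning. Instead of accepting any certificate
# that chains to the trust store, the client also requires the
# certificate's SHA-256 fingerprint to match a pin recorded out of
# band. The DER bytes below are placeholders, not a real certificate.

def fingerprint(der_bytes: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate, as hex."""
    return hashlib.sha256(der_bytes).hexdigest()

def pin_ok(der_bytes: bytes, pinned_fingerprints: set) -> bool:
    """True only if the presented certificate matches a recorded pin."""
    return fingerprint(der_bytes) in pinned_fingerprints

# Placeholder "certificate": in practice these bytes would come from
# ssl.SSLSocket.getpeercert(binary_form=True) during the handshake.
presented = b"-- placeholder DER bytes --"
pins = {fingerprint(b"-- placeholder DER bytes --")}

assert pin_ok(presented, pins)                 # matching pin accepted
assert not pin_ok(b"-- forged cert --", pins)  # rogue-CA cert rejected
```

Pinning has its own operational costs (rotating a certificate means rotating the pin), which is part of why the ecosystem leans on CA vetting in the first place, and why who gets into the root program matters so much.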

The companies on your browser’s trusted CA list rarely commit such fraud, since not issuing malicious certificates is the foremost responsibility for a certificate authority. But it can and does still happen. The concern in this case is that DarkMatter has made its business spying on internet communications, hacking dissidents’ iPhones, and other cyber-mercenary work. DarkMatter’s business objectives directly depend on intercepting end-user traffic on behalf of snooping governments. Giving DarkMatter a trusted root certificate would be like letting the proverbial fox guard the henhouse. 

Currently, the standard for being accepted as a trusted certificate authority in the browser is a technical and bureaucratic one. For example, do the organization’s documented practices meet the minimum requirements? Can the organization issue standards-compliant certificates? DarkMatter will likely meet those standards, eventually. But the standards don’t take into account an organization’s history of trying to break encryption, or its conflicts of interest.

Other organizations have used this fact to game the system in the past and worm their way into our browsers. In 2009, Mozilla allowed CNNIC, the Chinese state certification authority, into the root CA program, after CNNIC assured Mozilla and the larger community that it would not abuse this power to create fake certificates and break encryption. In 2015 CNNIC was caught in a scandal when an intermediate CA authorized by CNNIC issued illegitimate certificates for several Google-owned domains. Google, Mozilla, and others quickly revoked CNNIC’s authority in their browsers and operating systems after learning about the breach of trust. CNNIC is not the only example of this. In 2013 Mozilla considered dropping the Swedish company TeliaSonera after accusations that it had helped enable government spying. TeliaSonera ultimately did not get dropped, but it continues to have security problems to this day.

DarkMatter was already given an “intermediate” certificate by another company, called QuoVadis, now owned by DigiCert. That’s bad enough, but the “intermediate” authority at least comes with ostensible oversight by DigiCert. Without that oversight, the situation will be much worse. We would encourage Mozilla and others to revoke even this intermediate certificate, given DarkMatter’s known practices subverting internet security.


Why a Focus on “Fake News” and Facebook Misses the Internet’s Real Problems – and Solutions

[Note:  This item comes from friend David Rosenthal.  DLH]

By Yves Smith
Feb 23 2019

Yves here. Needless to say, I have little sympathy for the handwringing over “fake news,” and worse, the plans to regulate content provision on the Internet. There is no evidence that “Russian” campaigns on social media had any impact on election results, and political scientists like Tom Ferguson have debunked the idea. In addition, Marina Bart explained why Cambridge Analytica’s claims about its ability to sway voters were hogwash.

In recent years, Google has repeatedly changed its search algos to downgrade alternative sites and we now barely appear on searches when our original reporting used to dominate results.

By Jennifer Cobbe, the co-ordinator of Cambridge University’s Trustworthy Technologies strategic research initiative, which researches trust, computer and internet technologies. She researches and writes on law, tech and surveillance issues. Originally published at openDemocracy

Yesterday morning, the House of Commons Digital, Culture, Media and Sport Select Committee published its long-awaited final report into disinformation and ‘fake news’. The report – which follows a long and at times dramatic investigation – is full of interesting and insightful details about political microtargeting (the targeting of political messaging to relatively small groups of people) and the spread of disinformation.

But the report’s myopic focus on one company – Facebook – means that it misses the bigger picture – including the internet’s dominant variety of capitalism.

It is of course welcome that attention is being paid to these problems, and there is much in the Committee’s report that’s good. The report is undoubtedly right to find that Britain’s electoral laws are woefully inadequate for the age of the algorithm and are badly in need of reform. Its recommendation that inferences drawn from analysis of other data about people should be more clearly considered to be personal data likewise seems eminently sensible.

Is It OK to Manipulate People to Extract Their Money, Just not for Politics?

But there are also clear shortcomings. Focusing on disinformation itself as a target for regulation brings an obvious problem. By calling for interventions based on ‘harmful’ content, the report asks the Government to step into the dangerous territory of regulating lawful political conversations between people. Are private companies to be mandated to police these communications on the Government’s behalf? There are numerous good reasons why this is deeply undesirable (not to mention incompatible with human rights laws).

The biggest oversight, however, is in diagnosing disinformation as essentially a problem with Facebook, rather than a systemic issue emerging in part from the pollution of online spaces by the business model that Facebook shares with others: the surveillance and modification of human behaviour for profit.

‘Surveillance capitalism’, as it’s known, involves gathering as much data as possible about as many people as possible doing as many things as possible from as many sources as possible. These huge datasets are then algorithmically analysed so as to spot patterns and correlations from which future behaviour can be predicted. A personalised, highly dynamic, and responsive form of behavioural nudging then seeks to influence that future behaviour to drive engagement and profit for platforms and advertisers. These targeted behaviour modification tools rely on triggering cognitive biases and known short-cuts in human decision-making. Platforms and advertisers extensively experiment to find the most effective way to influence behaviour.
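The gather-analyze-predict-nudge pipeline described above can be made concrete with a deliberately tiny model. Real platforms do this at enormous scale with far richer models; this first-order "what usually follows what" counter, trained on an invented event log, is only meant to show the mechanism.

```python
from collections import Counter, defaultdict

# Toy version of the surveillance-capitalism pipeline: log behavior,
# find patterns, predict the next action. The event log is invented.

def train(events):
    """Count, for each action, which action tends to follow it."""
    follows = defaultdict(Counter)
    for prev, nxt in zip(events, events[1:]):
        follows[prev][nxt] += 1
    return follows

def predict_next(follows, action):
    """Most frequently observed follow-up to `action`, or None."""
    if action not in follows:
        return None
    return follows[action].most_common(1)[0][0]

log = ["open_app", "scroll", "like", "scroll", "like", "scroll",
       "click_ad", "open_app", "scroll", "like"]
model = train(log)
# The predicted habit is exactly what a behavioral nudge would exploit.
print(predict_next(model, "scroll"))
```

The point of the toy is the asymmetry it illustrates: once behavior is logged and modeled, the platform, not the user, knows which prompt is most likely to work next.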


Smartwatches Are Changing the Purpose of the EKG

Wearables help cast the medical test as a talisman of health-care competence.
Feb 17 2019

Think of the stereotypical representations of medicine, as they might appear on a television show: the crisp white coat, of course, and the stethoscope dangling at the ready. Syringes and intravenous lines, maybe. An X-ray or a CT scan slammed theatrically into a light box.

But any medical scene is incomplete without an electrocardiogram (EKG) machine running in the background, its jagged line tracing across the screen reassuringly, or alarmingly to cue a dramatic threat. The EKG is the backbeat of many hospital scenes on television. Important medical things are happening here, it says.

To tap into that potent association, many private medical practices, urgent-care clinics, community hospitals, technology companies, and health-care-product designers use EKG imagery in their advertising. Most of those images bear little resemblance to actual EKG tracings. The spikes and bumps generated for signs or emblems (like the logo of the daytime talk show The Doctors, for example) mostly amount to arbitrary peaks and valleys. They do not reflect the output of a human heart, healthy or diseased.

But accuracy might be less important than allegory. Like the white coat or the caduceus, the EKG has become talismanic, more valuable for the symbolism it provides than any diagnostic information it can convey. Now that EKGs are making their way into smartwatches, their symbolic purpose could risk overtaking their medical one.

Wearable medical technology promises a new, and better, way to manage personal health. Whether it’s Fitbits counting steps and calories burned, continuous blood glucose monitors aiding insulin dosing for diabetic patients, or Bluetooth earpieces offering round-the-clock heart rate and body temperature tracking, wearable devices sell the promise of the coldly clinical made portably intimate. Continuous EKG monitoring, like that available in the latest Apple Watch, might seem like a small technological leap, putting what was once the sole purview of hospitals and doctor’s offices neatly around a consumer’s wrist.

But continuous EKG monitoring is a little different from other, more discrete medical information. Unlike devices that measure more cleanly numerical metrics—step counts or target heart rates or blood glucose levels—a wearable EKG display doesn’t give the user an easy sense of hitting targets or falling short. Reading an EKG tracing is nuanced and interpretive, more art than math. A Fitbit gives you a number. An EKG paints a picture.

The 12-lead EKG, the gold standard of the diagnostic, measures the flow of current from 12 points on the patient’s body, offering a 360-degree view of the heart’s electrical activity. Its tracing reports the patient’s heart rate, rhythm, and regularity. Because the various parts of the heart produce different shapes of electrical activity owing to their size and muscularity, the EKG can also detect which chambers are beating at what time, and whether these chambers are correctly synced up and beating effectively. The larger a muscle is, the stronger its electrical impulses, so the size of an EKG wave can also indicate whether parts of the heart muscle are enlarged or dangerously thickened.
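The "rate, rhythm, and regularity" reading of a tracing can be made concrete. Given the times of successive R-peaks (the tall spike of each heartbeat), rate follows from the mean interval between peaks and regularity from the spread of those intervals. The peak times below are invented for illustration; detecting the peaks in a raw signal is the genuinely hard, interpretive part.

```python
from statistics import mean, pstdev

# Heart rate and rhythm regularity from R-peak times (seconds).
# The peak times here are made up; real EKG analysis must first find
# the R-peaks in a noisy tracing, which is where the artistry lies.

def rate_and_regularity(r_peak_times):
    intervals = [b - a for a, b in zip(r_peak_times, r_peak_times[1:])]
    bpm = 60.0 / mean(intervals)        # beats per minute
    jitter = pstdev(intervals)          # seconds; high jitter -> irregular
    return bpm, jitter

steady = [0.0, 0.8, 1.6, 2.4, 3.2]      # even 0.8 s spacing
bpm, jitter = rate_and_regularity(steady)
print(f"{bpm:.0f} bpm, interval jitter {jitter:.3f} s")
```

An evenly spaced train of peaks works out to a steady 75 beats per minute with essentially zero jitter; an arrhythmia would show up as the same function returning a large spread.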

The most urgent diagnostic use of the device is determining the presence and location of cardiac damage caused by decreased blood flow. Areas of the heart getting less oxygen will show changes in their electrical conduction, and the 12-lead EKG provides real-time information: not just indicating whether a patient is having a heart attack, but also which coronary vessels are most likely blocked. The 12-lead EKG can also detect the location of scarring left behind by prior, sometimes silent heart attacks. An EKG tracing will clearly show an area of dead heart muscle no longer conducting electrical signals. Dead meat don’t beat, as cardiologists put it.

Still, there’s plenty that a snapshot EKG can’t do, including diagnosing intermittent problems with rhythm or changes that only occur with certain activities. An EKG doesn’t capture the shape or function of the heart’s valves, nor can it diagnose precarious plaques in the coronary arteries that could signal heart attacks waiting to happen.

The fewer leads an EKG has, the less information it can give you. A one-lead EKG, such as the kind that appears in the latest iteration of the Apple Watch, gives just a single vantage point. For the diagnosis of some cardiac abnormalities, that might be akin to solving only one side of a Rubik’s Cube. A watch endowed with this kind of EKG feature likely won’t have a large public-health impact, despite promotional materials from Apple touting the “momentous achievement” of a wearable “that can provide critical data for doctors and peace of mind for you.” Apple’s not alone, either. Another smartwatch-EKG offering, from Withings, promises “the opportunity to take an ECG anytime and anywhere.”


A Different Kind of Theory of Everything

[Note:  This item comes from friend David Rosenthal.  DLH]

A Different Kind of Theory of Everything
Physicists used to search for the smallest components of the universe. What if that’s not the point?
By Natalie Wolchover
Feb 19 2019

In 1964, during a lecture at Cornell University, the physicist Richard Feynman articulated a profound mystery about the physical world. He told his listeners to imagine two objects, each gravitationally attracted to the other. How, he asked, should we predict their movements? Feynman identified three approaches, each invoking a different belief about the world. The first approach used Newton’s law of gravity, according to which the objects exert a pull on each other. The second imagined a gravitational field extending through space, which the objects distort. The third applied the principle of least action, which holds that each object moves by following the path that takes the least energy in the least time. All three approaches produced the same, correct prediction. They were three equally useful descriptions of how gravity works.
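Feynman’s three descriptions can be written down compactly. A rough sketch in standard notation (the symbols here are the conventional ones, not drawn from the article): Newton’s law posits a direct mutual pull, the field picture posits a gravitational potential pervading space that objects respond to, and the least-action picture singles out the path that makes the action stationary.

```latex
% 1. Newton's law: a direct pull between two masses
F = \frac{G m_1 m_2}{r^2}

% 2. Field picture: a potential \Phi, sourced by mass density \rho,
%    extends through space; each object follows its gradient
\nabla^2 \Phi = 4\pi G \rho, \qquad \ddot{\mathbf{x}} = -\nabla \Phi

% 3. Least action: among all conceivable paths, the actual motion
%    makes the action S stationary
S[\mathbf{x}(t)] = \int \Big( \tfrac{1}{2} m \dot{\mathbf{x}}^2 - m\,\Phi(\mathbf{x}) \Big)\, dt,
\qquad \delta S = 0
```

Each formulation yields the same trajectories for the two gravitating objects, which is precisely the multiplicity Feynman found mysterious.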

“One of the amazing characteristics of nature is this variety of interpretational schemes,” Feynman said. What’s more, this multifariousness applies only to the true laws of nature—it doesn’t work if the laws are misstated. “If you modify the laws much, you find you can only write them in fewer ways,” Feynman said. “I always found that mysterious, and I do not know the reason why it is that the correct laws of physics are expressible in such a tremendous variety of ways. They seem to be able to get through several wickets at the same time.”

Even as physicists work to understand the material content of the universe—the properties of particles, the nature of the big bang, the origins of dark matter and dark energy—their work is shadowed by this Rashomon effect, which raises metaphysical questions about the meaning of physics and the nature of reality. Nima Arkani-Hamed, a physicist at the Institute for Advanced Study, is one of today’s leading theoreticians. “The miraculous shape-shifting property of the laws is the single most amazing thing I know about them,” he told me, this past fall. It “must be a huge clue to the nature of the ultimate truth.”

Traditionally, physicists have been reductionists. They’ve searched for a “theory of everything” that describes reality in terms of its most fundamental components. In this way of thinking, the known laws of physics are provisional, approximating an as-yet-unknown, more detailed description. A table is really a collection of atoms; atoms, upon closer inspection, reveal themselves to be clusters of protons and neutrons; each of these is, more microscopically, a trio of quarks; and quarks, in turn, are presumed to consist of something yet more fundamental. Reductionists think that they are playing a game of telephone: as the message of reality travels upward, from the microscopic to the macroscopic scale, it becomes garbled, and they must work their way downward to recover the truth. Physicists now know that gravity wrecks this naïve scheme, by shaping the universe on both large and small scales. And the Rashomon effect also suggests that reality isn’t structured in such a reductive, bottom-up way.

If anything, Feynman’s example understated the mystery of the Rashomon effect, which is actually twofold. It’s strange that, as Feynman says, there are multiple valid ways of describing so many physical phenomena. But an even stranger fact is that, when there are competing descriptions, one often turns out to be more true than the others, because it extends to a deeper or more general description of reality. Of the three ways of describing objects’ motion, for instance, the approach that turns out to be more true is the underdog: the principle of least action. In everyday reality, it’s strange to imagine that objects move by “choosing” the easiest path. (How does a falling rock know which trajectory to take before it gets going?) But, a century ago, when physicists began to make experimental observations about the strange behavior of elementary particles, only the least-action interpretation of motion proved conceptually compatible. A whole new mathematical language—quantum mechanics—had to be developed to describe particles’ probabilistic ability to play out all possibilities and take the easiest path most frequently. Of the various classical laws of motion—all workable, all useful—only the principle of least action also extends to the quantum world.
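The way least action survives into the quantum world can be made a little more concrete. In Feynman’s own path-integral formulation of quantum mechanics, a particle doesn’t take one trajectory; every conceivable path contributes an amplitude weighted by its action, and the amplitudes are summed (the standard formula, included here for illustration):

```latex
% Amplitude to go from x_i to x_f in time t:
% a sum over all paths, each weighted by its action S
\langle x_f \,|\, e^{-iHt/\hbar} \,|\, x_i \rangle
  = \int \mathcal{D}[x(t)]\; e^{\, i S[x(t)]/\hbar}
```

Paths near the least-action path have nearly equal phases and interfere constructively, while wildly different paths cancel out, which is why, at everyday scales, the classical least-action trajectory is the one we see.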

It happens again and again that, when there are many possible descriptions of a physical situation—all making equivalent predictions, yet all wildly different in premise—one will turn out to be preferable, because it extends to an underlying reality, seeming to account for more of the universe at once. And yet this new description might, in turn, have multiple formulations—and one of those alternatives may apply even more broadly. It’s as though physicists are playing a modified telephone game in which, with each whisper, the message is translated into a different language. The languages describe different scales or domains of the same reality but aren’t always related etymologically. In this modified game, the objective isn’t—or isn’t only—to seek a bedrock equation governing reality’s smallest bits. The existence of this branching, interconnected web of mathematical languages, each with its own associated picture of the world, is what needs to be understood.


China Bans Millions From Flights, Trains In Social-Credit Crackdown

China Bans Millions From Flights, Trains In Social-Credit Crackdown
By Tyler Durden
Feb 20 2019

China has banned millions of people from any number of activities for being labeled as ‘untrustworthy’ on the country’s Orwellian social credit system. 

Banned from things such as air and train travel, blacklisted individuals are being punished in a broad effort to boost “trustworthiness” among the 1.4 billion Chinese citizens tracked by the massive system – which assigns both positive and negative scores to various metrics, reports SCMP.  

People with high social credit scores receive “green channel” benefits, while those who violate laws are punished with restrictions and penalties.

Hangzhou, the capital city of China’s Zhejiang province, rolled out its social credit system early last year, rewarding “pro-social behaviors” such as blood donations, healthy lifestyles, and volunteer work while punishing those who violate traffic laws, smoke and drink, or speak poorly about the government.

Human rights advocates have voiced concerns that the social credit system does not take into account individual circumstances, and has unfairly labeled people and companies as untrustworthy. 

Over 3.59 million Chinese enterprises were added to the official creditworthiness blacklist last year, banning them from a series of activities, including bidding on projects, accessing security markets, taking part in land auctions and issuing corporate bonds, according to the 2018 annual report released by the National Public Credit Information Centre. -SCMP

According to the NPCIC report, Chinese officials collected 14.21 million pieces of information on “untrustworthy conduct” by both businesses and individuals—including failure to repay loans, illegal fund collection, false and misleading advertising, swindling customers, and, for individuals, acts such as taking reserved seats on trains or causing trouble in hospitals, SCMP reports.

Meanwhile, around 17.46 million people who are “discredited” were prevented from buying plane tickets, while 5.47 million were disallowed from purchasing tickets to China’s high-speed train system. 

Besides restrictions on buying tickets, local authorities also used novel methods to put pressure on untrustworthy subjects, including preventing people from buying premium insurance, wealth management products or real estate, as well as shaming them by exposing their information in public.

A total of 3.51 million untrustworthy individuals and entities repaid their debts or paid off taxes and fines last year due to pressure from the social credit system, the report said. -SCMP

The NPCIC report also noted untrustworthiness issues at 1,282 peer-to-peer (P2P) lending platforms, more than half of them located in Zhejiang, Guangdong and Shanghai, which were placed on the creditworthiness blacklist because of illegal fundraising or the inability to repay investors.

Health care product maker Quanjian Group and vaccine maker Changsheng Bio-Technology were added to the creditworthiness blacklist because of their involvement in major health sector scandals.

Quanjian was accused of making false marketing claims about the benefits of a product that a four-year-old cancer patient drank, while Changsheng, the major Chinese manufacturer of rabies vaccines, was fined US$1.3 billion in October after it was found to have fabricated records. -SCMP

Legal experts have expressed concern that the accelerated use of the creditworthiness system tramples on what little privacy rights people have in China.