Save Your Money: Vast Majority Of Dietary Supplements Don’t Improve Heart Health or Put Off Death

Save Your Money: Vast Majority Of Dietary Supplements Don’t Improve Heart Health or Put Off Death
Jul 16 2019
https://www.hopkinsmedicine.org/news/newsroom/news-releases/save-your-money-vast-majority-of-dietary-supplements-dont-improve-heart-health-or-put-off-death

In a massive new analysis of findings from 277 clinical trials using 24 different interventions, Johns Hopkins Medicine researchers say they have found that almost all vitamin, mineral and other nutrient supplements or diets cannot be linked to longer life or protection from heart disease.

Although they found that most of the supplements or diets were not associated with any harm, the analysis showed possible health benefits only from a low-salt diet, omega-3 fatty acid supplements and possibly folic acid supplements for some people. Researchers also found that supplements combining calcium and vitamin D may in fact be linked to a slightly increased stroke risk.

Results of the analysis were published on July 8 in Annals of Internal Medicine.

Surveys by the Centers for Disease Control and Prevention show that 52% of Americans take at least one vitamin or other dietary/nutritional supplement daily. As a nation, Americans spend $31 billion each year on such over-the-counter products. An increasing number of studies – including this new one from Johns Hopkins – have failed to prove health benefits from most of them.

“The panacea or magic bullet that people keep searching for in dietary supplements isn’t there,” says senior author of the study Erin D. Michos, M.D., M.H.S., associate director of preventive cardiology at the Ciccarone Center for the Prevention of Cardiovascular Disease and associate professor of medicine at the Johns Hopkins University School of Medicine. “People should focus on getting their nutrients from a heart-healthy diet, because the data increasingly show that the majority of healthy adults don’t need to take supplements.”

For the current study, the researchers used data from 277 randomized clinical trials that evaluated 16 vitamins or other supplements and eight diets for their association with mortality or cardiovascular outcomes including coronary heart disease, stroke and heart attack. Altogether, the trials included data from 992,129 research participants worldwide.

The vitamin and other supplements reviewed included: antioxidants, β-carotene, vitamin B-complex, multivitamins, selenium, vitamin A, vitamin B3/niacin, vitamin B6, vitamin C, vitamin E, vitamin D alone, calcium alone, calcium and vitamin D together, folic acid, iron and omega-3 fatty acid (fish oil). The diets reviewed were a Mediterranean diet, a reduced saturated fat diet (less fat from meat and dairy), modified dietary fat intake (less saturated fat, or replacing those calories with unsaturated fats or carbohydrates), a reduced fat diet, a reduced salt diet in both healthy people and those with high blood pressure, an increased alpha-linolenic acid (ALA) diet (nuts, seeds and vegetable oils), and an increased omega-6 fatty acid diet (nuts, seeds and vegetable oils). The strength of the evidence for each intervention was also graded as high, moderate, low or very low.

The majority of the supplements, including multivitamins, selenium, vitamin A, vitamin B6, vitamin C, vitamin E, vitamin D alone, calcium alone and iron, showed no link to increased or decreased risk of death or heart disease.

In the three studies of 3,518 people with healthy blood pressure who followed a low-salt diet, there were 79 deaths. The researchers found a 10% decrease in the risk of death among these people, an association they classified as supported by moderate evidence.

In the five studies in which 3,680 participants with high blood pressure were put on a low-salt diet, the risk of death from heart disease decreased by 33%; there were 674 heart disease deaths across the study periods. The researchers also classified this as moderate evidence of an impact.

Forty-one studies with 134,034 participants evaluated the possible impact of omega-3 fatty acid supplements. In this group, 10,707 people had events such as a heart attack or stroke indicating heart disease. Overall, these studies suggested that supplement use was linked to an 8% reduction in heart attack risk and a 7% reduction in coronary heart disease risk compared with those not taking the supplements. The researchers ranked the evidence for this beneficial link as low.
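To make the statistics concrete, here is a minimal sketch, with invented event counts rather than the study's data, of how a relative risk and the corresponding percentage reduction are computed from trial outcomes:

```python
# Illustrative only: the event counts below are invented, not taken from the
# Annals of Internal Medicine analysis. The sketch shows how a relative risk
# (RR) and a "% reduction" figure like those quoted above are computed.

def relative_risk(events_treated, n_treated, events_control, n_control):
    """Risk in the treated group divided by risk in the control group."""
    return (events_treated / n_treated) / (events_control / n_control)

# Hypothetical pooled counts: a supplement arm vs. a control arm.
rr = relative_risk(events_treated=460, n_treated=10_000,
                   events_control=500, n_control=10_000)
print(f"RR = {rr:.2f}, i.e. a {100 * (1 - rr):.0f}% relative risk reduction")
# -> RR = 0.92, i.e. an 8% relative risk reduction: the size of effect
#    reported above for omega-3 supplements and heart attack.
```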

[snip]

Amazon deforestation accelerating towards unrecoverable ‘tipping point’

Amazon deforestation accelerating towards unrecoverable ‘tipping point’
Data confirms fears that Jair Bolsonaro’s policy encourages illegal logging in Brazil
By Jonathan Watts
Jul 25 2019
https://www.theguardian.com/world/2019/jul/25/amazonian-rainforest-near-unrecoverable-tipping-point

Deforestation of the Brazilian Amazon has surged above three football fields a minute, according to the latest government data, pushing the world’s biggest rainforest closer to a tipping point beyond which it cannot recover.

The sharp rise – following year-on-year increases in May and June – confirms fears that president Jair Bolsonaro has given a green light to illegal land invasion, logging and burning.

Clearance so far in July has hit 1,345 sq km, a third higher than the previous monthly record under the current Deter B satellite monitoring system, which began in 2015.

With five days remaining, this is on course to be the first month for several years in which Brazil loses an area of forest bigger than Greater London.

The steady erosion of tree cover weakens the role of the rainforest in stabilising the global climate. Scientists warn that the forest is in growing danger of degrading into a savannah, after which its capacity to absorb carbon will be severely diminished, with consequences for the rest of the planet.

“It’s very important to keep repeating these concerns. There are a number of tipping points which are not far away,” said Philip Fearnside, a professor at Brazil’s National Institute of Amazonian Research. “We can’t see exactly where they are, but we know they are very close. It means we have to do things right away. Unfortunately that is not what is happening. There are people denying we even have a problem.”

It may also complicate ratification of Brazil’s biggest ever trade deal with the European Union if EU legislators decide the South American nation is not keeping its side of the bargain, which includes a commitment to slow deforestation in line with the Paris climate agreement.

The official numbers from the National Institute for Space Research are a growing embarrassment to Bolsonaro, who has tried to fob them off as lies and criticised the head of the institute. Earlier this week, the president insisted the numbers should be screened by the ministry of science and technology and shown to him before being made public so that he did not get “caught with his pants down”.

This has raised fears that the data could be vetted in future rather than automatically updated online each day as is currently the case.

In his first seven months in power, Bolsonaro, who was elected with strong support from agribusiness and mining interests, has moved rapidly to erode government agencies responsible for forest protection.

He has weakened the environment agency and effectively put it under the supervision of the agricultural ministry, which is headed by the leader of the farming lobby. His foreign minister has dismissed climate science as part of a global Marxist plot. The president and other ministers have criticised the forest monitoring agency, Ibama, for imposing fines on illegal land grabbers and loggers. The government has also moved to weaken protections for nature reserves, indigenous territories and zones of sustainable production by forest peoples and invited businesspeople to register land counter-claims within those areas.

This has emboldened those who want to invade the forest, clear it and claim it for commercial purposes, mostly in the speculative expectation it will rise in value, but also partly for cattle pastures, soya fields and mines.

[snip]

Neuroscientists decode brain speech signals into written text

Neuroscientists decode brain speech signals into written text
Study funded by Facebook aims to improve communication with paralysed patients
By Ian Sample
Jul 30 2019
https://www.theguardian.com/science/2019/jul/30/neuroscientists-decode-brain-speech-signals-into-actual-sentences

Doctors have turned the brain signals for speech into written sentences in a research project that aims to transform how patients with severe disabilities communicate in the future.

The breakthrough is the first to demonstrate how the intention to say specific words can be extracted from brain activity and converted into text rapidly enough to keep pace with natural conversation.

In its current form, the brain-reading software works only for certain sentences it has been trained on, but scientists believe it is a stepping stone towards a more powerful system that can decode in real time the words a person intends to say.

Doctors at the University of California, San Francisco took on the challenge in the hope of creating a product that allows paralysed people to communicate more fluidly than with existing devices that pick up eye movements and muscle twitches to control a virtual keyboard.

“To date there is no speech prosthetic system that allows users to have interactions on the rapid timescale of a human conversation,” said Edward Chang, a neurosurgeon and lead researcher on the study published in the journal Nature.

The work, funded by Facebook, was possible thanks to three epilepsy patients who were about to have neurosurgery for their condition. Before their operations went ahead, all three had a small patch of tiny electrodes placed directly on the brain for at least a week to map the origins of their seizures.

During their stay in hospital, the patients, all of whom could speak normally, agreed to take part in Chang’s research. He used the electrodes to record brain activity while each patient listened to nine set questions and read aloud from a list of 24 potential responses.

With the recordings in hand, Chang and his team built computer models that learned to match particular patterns of brain activity to the questions the patients heard and the answers they spoke. Once trained, the software could identify almost instantly, and from brain signals alone, what question a patient heard and what response they gave, with an accuracy of 76% and 61% respectively.
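To illustrate the general shape of this approach (a sketch under stated assumptions, not the authors' published model, which used far richer features and decoding machinery), the task can be framed as a multiclass classifier over per-trial neural feature vectors:

```python
# A minimal sketch of the decoding idea. Random arrays stand in for electrode
# recordings; the shapes and the logistic-regression classifier are
# assumptions for demonstration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_electrodes, n_timepoints = 200, 128, 50
# X: per-trial cortical activity (e.g. high-gamma power), one flattened
# feature vector per trial.
X = rng.normal(size=(n_trials, n_electrodes * n_timepoints))
# y: which of the nine set questions the patient heard on each trial.
y = rng.integers(0, 9, size=n_trials)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f}")  # chance level on noise
# With real recordings in place of noise, this is the kind of setup that can
# identify heard questions and spoken answers well above chance.
```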

“This is the first time this approach has been used to identify spoken words and phrases,” said David Moses, a researcher on the team. “It’s important to keep in mind that we achieved this using a very limited vocabulary, but in future studies we hope to increase the flexibility as well as the accuracy of what we can translate.”

Though rudimentary, the system allowed patients to answer questions about the music they liked; how well they were feeling; whether their room was too hot or cold, or too bright or dark; and when they would like to be checked on again.

Despite the breakthrough, there are hurdles ahead. One challenge is to improve the software so it can translate brain signals into more varied speech on the fly. This will require algorithms trained on a huge amount of spoken language and corresponding brain signal data, which may vary from patient to patient.

Another goal is to read “imagined speech”, or sentences spoken in the mind. At the moment, the system detects brain signals that are sent to move the lips, tongue, jaw and larynx – in other words, the machinery of speech. But for some patients with injuries or neurodegenerative disease, these signals may not suffice, and more sophisticated ways of reading sentences in the brain will be needed.

While the work is still in its infancy, Winston Chiong, a neuroethicist at UCSF who was not involved in the latest study, said it was important to debate the ethical issues such systems might raise in the future. For example, could a “speech neuroprosthesis” unintentionally reveal people’s most private thoughts?

[snip]

Researchers Develop Technology To Harness Energy From Mixing Of Freshwater And Seawater

[Note:  This item comes from reader Randall Head.  DLH]

Researchers Develop Technology To Harness Energy From Mixing Of Freshwater And Seawater
By Scott Snowden
Jul 30 2019
https://www.forbes.com/sites/scottsnowden/2019/07/30/researchers-develop-technology-to-harness-energy-from-mixing-of-freshwater-and-seawater/

Researchers from Stanford University in California have developed an affordable, durable technology that could harness the energy generated when freshwater mixes with seawater.

In a paper recently published in the American Chemical Society’s journal ACS Omega, they suggest that this “blue energy” could make coastal wastewater treatment plants energy independent.

“Blue energy is an immense and untapped source of renewable energy,” Kristian Dubrawski, a postdoctoral scholar in civil and environmental engineering at Stanford and a coauthor of the study, said in a statement. “Our battery is a major step toward practically capturing that energy without membranes, moving parts or energy input.”

The researchers tested a prototype of the battery, monitoring its energy production while flushing it with alternating hourly exchanges of wastewater effluent from the Palo Alto Regional Water Quality Control Plant and seawater collected nearby from Half Moon Bay. Over 180 cycles, battery materials maintained 97 percent effectiveness in capturing the salinity gradient energy.
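Assuming the fade compounds roughly geometrically from cycle to cycle (an assumption for illustration, not a claim from the paper), that durability figure implies a very small average loss per cycle:

```python
# Implied per-cycle durability, assuming performance fades roughly
# geometrically from cycle to cycle (an illustrative assumption).
retention_after_180_cycles = 0.97
per_cycle_retention = retention_after_180_cycles ** (1 / 180)
print(f"average retention per cycle: {per_cycle_retention:.5f}")  # ~0.99983
print(f"average loss per cycle: {(1 - per_cycle_retention):.5%}")  # ~0.017%
```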

The technology could work in any location where fresh and saltwater intermix, but wastewater treatment plants offer a particularly valuable case study. Wastewater treatment is energy-intensive, accounting for about three percent of the total US electrical load. The process – essential to community health – is also vulnerable to power grid shutdowns. Making wastewater treatment plants energy independent would not only cut electricity use and emissions but also make them immune to blackouts – a major advantage in places such as California, where recent wildfires have led to large-scale outages.

Every cubic meter of freshwater that mixes with seawater produces about 0.65 kilowatt-hours of energy – enough to power the average US household for about 30 minutes. Globally, the theoretical amount of recoverable energy from coastal wastewater treatment plants is about 18 gigawatts – enough to continuously power roughly 15 million average US homes.
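A quick back-of-envelope check of those figures, taking an average US household draw of about 1.25 kilowatts (roughly 30 kilowatt-hours a day) as an assumed round number:

```python
# Back-of-envelope check of the figures above. The average US household draw
# of ~1.25 kW (about 30 kWh/day) is an assumed round number.
ENERGY_PER_M3_KWH = 0.65   # salinity-gradient energy per cubic meter of freshwater
HOUSEHOLD_DRAW_KW = 1.25   # assumed average US household power draw

minutes = ENERGY_PER_M3_KWH / HOUSEHOLD_DRAW_KW * 60
print(f"one cubic meter ~ {minutes:.0f} minutes of household power")   # ~31 min

GLOBAL_RECOVERABLE_W = 18e9  # 18 gigawatts
homes = GLOBAL_RECOVERABLE_W / (HOUSEHOLD_DRAW_KW * 1e3)
print(f"18 GW ~ {homes / 1e6:.0f} million homes powered continuously")  # ~14 million
```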

The Stanford group’s battery isn’t the first technology to succeed in capturing blue energy, but it’s the first to use battery electrochemistry instead of pressure or membranes. If it works at scale, the technology would offer a simpler, more robust and cost-effective solution.

The process first releases sodium and chloride ions from the battery electrodes into the solution, making the current flow from one electrode to the other. Then, a rapid exchange of wastewater effluent with seawater leads the electrodes to reincorporate sodium and chloride ions and reverse the current flow. Energy is recovered during both the freshwater and seawater flushes, with no upfront energy investment and no need for charging. This means that the battery is constantly discharging and recharging without needing any input of energy.
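Purely as a schematic, that alternating cycle can be written out as a two-phase loop; the sketch encodes only what the paragraph above states and none of the underlying electrochemistry:

```python
# A schematic of the alternating cycle described above: ion release under
# forward current in effluent, ion uptake under reversed current in seawater,
# energy out in both phases, no external charging step. Illustrative only.
from itertools import cycle

phases = cycle([
    ("freshwater flush (wastewater effluent)", "electrodes release Na+ and Cl-",
     "forward current"),
    ("seawater flush", "electrodes reincorporate Na+ and Cl-",
     "reversed current"),
])

for _ in range(4):  # two full discharge/recharge cycles
    flush, electrode_action, current = next(phases)
    print(f"{flush:40s} | {electrode_action:36s} | {current}, energy recovered")
```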

While lab tests showed power output is still low per electrode area, the battery’s scale-up potential is considered more feasible than previous technologies because of its small footprint, simplicity, constant energy creation and lack of membranes or instruments to control charge and voltage. The electrodes are made with Prussian blue, a material widely used as a pigment and medicine, which costs less than $1 a kilogram, and polypyrrole, a material used experimentally in batteries and other devices, which sells for less than $3 a kilogram in bulk.

[snip]

Alaska’s sweltering summer is ‘basically off the charts’

Alaska’s sweltering summer is ‘basically off the charts’
By Matthew Cappucci, Juliet Eilperin, Andrew Freedman and Brady Dennis
Jul 31 2019
https://www.washingtonpost.com/climate-environment/alaskas-sweltering-summer-is-basically-off-the-charts/2019/07/31/d3ffa082-b2d9-11e9-951e-de024209545d_story.html

Steve Perrins didn’t see the lightning, but he couldn’t miss the smoke that followed.

It was around dinnertime on July 23 at Alaska’s oldest hunting lodge, nestled in the wilderness more than 100 miles northwest of Anchorage. What began as a quiet evening at the Rainy Pass Lodge soon turned frantic as Alaska’s latest wildfire spread fast.

The Alaska National Guard soon evacuated 26 people and two dogs by helicopter from the lodge, which serves as a checkpoint for the Iditarod Trail Sled Dog Race.

The fire came within a half-mile of the lodge. In the days that followed, Perrins and his family housed and fed dozens of federal and state firefighters who rushed to contain the blaze — one of many raging across Alaska.

“It’s the hottest summer we’ve had, ever,” said Perrins, who began working at the lodge in 1977.

The nation’s 49th state is warming faster than any other, having heated up more than 2 degrees Celsius (3.6 degrees Fahrenheit) over the past century — double the global average. And parts of the state, including its far northern reaches, have warmed even more rapidly in recent decades.

This trend, driven in part by the burning of fossil fuels, is transforming the nation’s only Arctic state. Scientists around the world, including in the U.S. government, predict the warming will continue unless countries drastically reduce their greenhouse gas emissions in coming years.

Temperatures have been above average across Alaska every day since April 25. None of the state’s nearly 300 weather stations have recorded a temperature below freezing since June 28 — the longest such streak in at least 100 years. On Independence Day, the temperature at Ted Stevens Anchorage International Airport hit 90 degrees for the first time on record.

More than 2 million acres have gone up in flames across the state as thousands of firefighters have worked to contain wildfires. Stores have sold out of fans and ice. Moose have been spotted seeking respite in garden sprinklers.

Alaska, which logged its warmest June on record, now seems destined to register not only its warmest July but its warmest month of any kind.

“Usually if you were to break this sort of record, you’d do it by a sliver of a degree,” said Brian Brettschneider, a climatologist and research associate at the International Arctic Research Center. He said that the state is on course to shatter the record by more than one degree Fahrenheit.

The combination of relentless high pressure, extremely warm sea surface temperatures and high humidity is “basically off the charts,” Brettschneider said.

The entire Arctic is suffering under extreme temperatures. In Siberia, sweeping wildfires are sending smoke thousands of miles away and lofting dark soot particles onto the vulnerable Arctic ice cover. Arctic sea ice is melting at an alarming pace and could break the 2012 record. In addition, the weather system that caused last week’s heat wave in Western Europe has now settled above the Atlantic side of the Arctic, accelerating surface-ice melting in Greenland.

Mark Parrington, a senior scientist with the Copernicus Atmosphere Monitoring Service in Europe, said that through July 28, wildfires in the Arctic region, including Siberia and Alaska, had emitted 125 megatons of carbon dioxide – the highest year-to-date total since such monitoring began in 2003.

“We’re seeing something exceptional this year,” Parrington said, even though the acreage burned in Alaska is not yet a record.

Even as researchers in Alaska are working to capture climate change’s impact on the region, sharp cuts by Gov. Mike Dunleavy (R) to the state’s education budget threaten to trigger an exodus of some of the very scientists who are trying to explain the unprecedented changes that residents are experiencing.

“I think it will lead to many of the best Arctic scientists in the UA system [leaving] the state,” Christopher Arp, an associate research professor at the University of Alaska at Fairbanks Water and Environmental Research Center, said in an email. “Having scientists live where they do research is very important in my view, so I think that will have a negative impact on Arctic research that will be very challenging to reverse.”

Meanwhile, this summer’s heat has transformed Alaska’s landscape and waterways, affecting humans and wildlife alike.

[snip]

Our planet is in crisis. But until we call it a crisis, no one will listen

[Note:  This item comes from friend Jock Gill.  DLH]

Our planet is in crisis. But until we call it a crisis, no one will listen
We study disaster preparedness, and ‘climate change’ is far too mild to describe the existential threat we face
By Caleb Redlener, Charlotte Jenkins and Irwin Redlener
Jul 31 2019
https://www.theguardian.com/commentisfree/2019/jul/31/climate-crisis-change-language

When Senator Kamala Harris was asked about climate change during the Democratic debate in June, she did not mince words. “I don’t even call it climate change,” she said. “It’s a climate crisis.”

She’s right – and we, at Columbia University’s National Center for Disaster Preparedness, wish more people would call this crisis what it is.

The language we use to refer to the climate crisis has changed over time, often due to political pressures. In 1975, the geophysicist Wallace S Broecker published the first major paper on planetary heating – Climatic Change: Are We on the Brink of a Pronounced Global Warming? – and for a while the term “global warming” was the most common. But in the decades following, politicians and members of the media began to use the softer, more euphemistic term “climate change” to describe changing weather and atmospheric conditions.

That wasn’t an accident. In the early years of George W Bush’s first term as president, scientists were making serious progress in establishing overwhelming evidence that we were, in fact, facing a global crisis. Public opinion on climate change was shifting; Americans were curious about how worried they should be by the damage being done to our atmosphere.

Enter Frank Luntz, a renowned Republican pollster and strategist. Luntz was concerned that the Republican party was losing the communications battle. He advised Republicans to cast doubt on scientific consensus on the dangers of greenhouse gases and to publicly hammer home a message of uncertainty.

In 2002, Luntz wrote a memo to Bush urging him and the rest of his party to use the term “climate change” instead of “global warming”. Climate change sounded “less frightening”, he pointed out, “like you’re going from Pittsburgh to Fort Lauderdale”.

Luntz succeeded. “Climate change” began to eclipse “global warming” in the American vernacular, downplaying its menacing predecessor.

Technically, the term climate change makes sense. The climate is, indeed, changing. But the term is far too mild to describe the existential threat to the planet that these changes pose. Jeffrey Sachs, former director of the Earth Institute at Columbia University, agrees. “We should have a term that emphasizes the incredible cost and dangers,” he told us. “We do not need to be shy about this.”

Under the administration of Donald Trump, the situation is even worse. Now, instead of re-framing language about the climate crisis, Republican officials simply remove references to it entirely and put pressure on researchers and analysts who disagree.

It is time to update the language we use to better reflect the seriousness of global heating. “People do not understand the scale and pace of the climate emergency,” Jamie Henn, strategy and communications director for 350.org, an international climate campaign, told us. “This is not an issue with one future date where we will start to see effects. We may hit tipping points at any time in which we will see immediate problems.”

There is no longer any doubt that climate change is an unprecedented planetary emergency. And the terms we use to describe this crisis must deliberately reflect an appropriate sense of urgency.

[snip]

Computers that can see

Computers that can see
By Benedict Evans
Jul 19 2019
https://www.ben-evans.com/benedictevans/2019/7/19/computers-that-can-see

One of the basic mechanics of tech over the past few decades has been that we tend to start with special-purpose components, but then over time we replace them with general-purpose components. When performance is expensive, optimizing for one task gets you a better return, but then economies of scale, Moore’s Law and so on mean that the general-purpose component overtakes the single-purpose component. There’s an old engineering joke that a Russian screwdriver is a hammer – “just hit it harder!” – and that’s what Moore’s Law tends to do in tech. “Never mind your clever efficiency optimizations, just throw more CPU at it!”

You could argue this is happening now with machine learning – “instead of complex hand-crafted rules-based systems, just throw data at it!” But it’s certainly happening with vision. The combination of a flood of cheap image sensors coming out of the smartphone supply chain with computer vision based on machine learning means that all sorts of specialized inputs are being replaced by imaging plus ML. 

The obvious place to see this is in actual physical sensors – there are all sorts of industrial applications where people are exploring using vision instead of some more specialised sensing system. Where is that equipment? Has that task been done? Is there a flood on the floor of the warehouse? Many of these things don’t necessarily *look* like vision problems, but now they can be turned into one.
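As a hypothetical sketch of what turning one of those sensing tasks into a vision problem might look like, consider replacing a dedicated flood sensor with a camera frame and a binary image classifier; the ResNet backbone and transforms below are standard torchvision, while the fine-tuned weights file and camera snapshot are placeholders:

```python
# Hypothetical sketch: a warehouse flood "sensor" built from a camera frame
# plus a binary image classifier. The backbone and transforms are standard
# torchvision; the weights file and camera frame would come from your own
# training run and camera.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.resnet18(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, 2)  # classes: 0 = dry, 1 = flooded
model.load_state_dict(torch.load("warehouse_flood_classifier.pt"))  # hypothetical
model.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

frame = Image.open("warehouse_cam_frame.jpg").convert("RGB")  # hypothetical frame
with torch.no_grad():
    logits = model(preprocess(frame).unsqueeze(0))
print("flooded" if logits.argmax(dim=1).item() == 1 else "dry")
```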

This is also, of course, the debate between Elon Musk and the autonomy community around LIDAR. A priori, you should be able to drive with vision alone, without all these expensive impractical special-purpose sensors – after all, people do. Just throw enough images and enough data at the problem (“hit it harder!”) and we should be able to make it work. Pretty much everyone does actually agree in theory – the debate is about how long it will take for vision to get there (consensus = ‘not yet’), and how hard it is to solve all the other components of autonomy. Giving the car a perfect model of the world around it, with or without LIDAR, is not the only problem to solve. 

This also overlaps with two other current preoccupations of mine: how can we get software to be good at recommendation and discovery, and how does machine learning move internet platforms away from being mechanical Turks? 

So far the internet has been very good at giving us what we already know we want, either in logistics (Amazon) or search (Google). It’s been much worse at knowing what we might like, without any explicit request. And to the extent that we can do this, we need people to tell it first. You have to buy a lot of stuff on Amazon or like a lot of things on Instagram or Spotify for your recommendations to be any good. Meanwhile, these systems often lack any real understanding of what the data you’re interacting with really is – hence all the jokes about Amazon recommendations – “Dear Amazon, I bought a refrigerator but I’m not collecting them – don’t show me five more”. In all of this, the user is being treated as a mechanical Turk – the system doesn’t know or understand, but people do, so find ways to get people to tell it (this is also what PageRank did).

Now, suppose I post five photos of myself and Mr Porter knows what clothes to recommend, without my having to buy anything first, or go through any kind of onboarding? Suppose I wave my smartphone at my living room, and 1stDibs or Chairish know what lamps I’d like, without my having to spend days browsing, liking or buying across an inventory of thousands of items? And what happens if a dating app actually knows what’s in the photos? No more swiping – just take a selfie and it tells you what the match is. Seven or eight years ago this would have been science fiction, but today it’s ‘just’ engineering, product and route to market.
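One plausible construction (a sketch, not any particular company's method) embeds the user's photo and a catalog's product images with the same ImageNet backbone and ranks items by cosine similarity; all file names below are hypothetical:

```python
# A sketch of photo-driven recommendation: embed the user's photo and the
# catalog images in one visual feature space, then rank by similarity.
# File names are hypothetical placeholders.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the classifier head; keep 512-d features
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(path):
    """Unit-normalized feature vector, so a dot product is cosine similarity."""
    with torch.no_grad():
        vec = backbone(preprocess(Image.open(path).convert("RGB")).unsqueeze(0))
    vec = vec.squeeze(0)
    return vec / vec.norm()

user_taste = embed("my_living_room.jpg")  # hypothetical user photo
catalog = {name: embed(f"{name}.jpg") for name in ("lamp_a", "lamp_b", "lamp_c")}
ranked = sorted(catalog, key=lambda name: float(user_taste @ catalog[name]),
                reverse=True)
print("recommend first:", ranked[0])
```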

[snip]