iPod: How It Changed Apple

Fifteen years ago, the iPod and its iTunes music distribution system were born. No one saw how they were to transform Apple, enable the Smartphone 2.0 era, and unlock hundreds of billions of dollars of revenue for the company.
By Jean-Louis Gassée
Oct 31 2016

Many of the facts surrounding the iPod’s birth and rise are well known. In The Perfect Thing, Steven Levy, one of the Valley’s best annalists (regular readers will note the spelling, this one is meliorative), lovingly and accurately chronicles how the iPod “became a full-blown cultural phenomenon, […] revolutionizing the way we experience music and radio.”

Less lovingly, when Apple’s music player came out fifteen years ago, I quickly dismissed it as yet another MP3 player, one of many since the genre emerged in 1998. An incorrigible gadgeteer, I already owned several such devices, including the popular Rio from Diamond Multimedia and one from SanDisk — which continues to make MP3 players to this day. This one will set you back $49.99 for 8GB.

By fixating on the iPod itself, I completely overlooked iTunes, introduced a few months earlier. I wasn’t alone in missing the forest for a tree: As popular as the iPod would become, no one imagined that it would be iTunes that would unleash Apple’s potential as it unlocked hundreds of billions of dollars of revenue and profit.

With iTunes, Steve Jobs managed to break the back, for lack of a better word, of music “majors”, the content owners and distributors, by selling music “by the slice”, one song at a time. Apple’s co-founder would repeat this feat with cellular carriers, starting with AT&T, hypnotizing them into letting the iPhone manufacturer dictate configuration, pricing, and content distribution. (Sadly, unlocking and unbundling hasn’t worked as well for TV programs and streaming video distribution. One can hope…)

The paramount iTunes/iPod innovation was micro-payments. Paying 99 cents on-line? No way. But Jobs convinced the credit card companies and, reportedly, allowed Apple to take losses on some iTunes transactions — only to sell more high-margin iPods.
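To see why 99-cent transactions scared the card networks, consider a hypothetical fee schedule of 30 cents plus 2.9% per charge (illustrative numbers, not Apple’s actual terms). A quick sketch:

```python
def card_fee(amount, fixed=0.30, rate=0.029):
    """Hypothetical per-transaction card fee: a fixed charge plus a percentage."""
    return fixed + amount * rate

def net_to_seller(amount):
    """What the seller keeps after the card fee."""
    return amount - card_fee(amount)

single = net_to_seller(0.99)              # one 99-cent song billed alone
batched = net_to_seller(10 * 0.99) / 10   # ten songs aggregated into one charge
```

On these assumed terms, a song billed alone nets about 66 cents while an aggregated song nets about 93 cents; batching several purchases into one charge is the standard way to make micro-payments viable, and helps explain how Apple could tolerate thin or negative margins on some transactions.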

We can also mention in passing that iTunes was the first Apple app also made available on Windows, quickly followed by an Airport base station management client, if memory serves.

The well-designed hardware/software, iPod/iTunes combo quickly became ubiquitous, boosted by a clever, graphically minimalist, Jobs-like, if that’s the right word, Silhouette advertising campaign.


Imagine a world without animals. You’ll soon see how much we need them

Hand-pollinating crops, growing meat in labs and dealing with the stench of scavenging fungus – this is the future we face
By Jules Howard
Oct 31 2016

A couple of years ago we heard news that 50% of all vertebrate species had disappeared in 40 years. On Thursday, we were greeted with news that by 2020 the figure is likely to rise to 66% of all vertebrates. It is no wonder that the conservationists are shouting. It is no wonder that they are so desperate to get their message heard. Animals, it seems, are on the way out. And no one appears to much care.

So, allow me to entertain the idea of a post-animal Britain. Could we make the best of this world, in true Theresa May fashion? Are animals, perhaps, all a bit overrated? Maybe it wouldn’t be so bad?

Sure, there would be some tough choices at first. One particularly pressing matter would be finding a way to cross-pollinate flowering crops. As is well known, trees and insects co-evolved, the plants offering a sugary reward to insects in return for their pollination services. How might we achieve this without bees and flies? Simple. The problem of cross-fertilisation could quite simply be solved by robots or people on day release from jail (or even children who don’t get into grammar schools). They could be made to hand-fertilise flowers. They could be like little unthinking bees.

Indeed, hand-fertilisation is already common practice in some parts of the world, where invertebrate populations have already been ravaged. Think of the savings of such a plan! Robots don’t need sugary water produced by plants, after all. They run on cheap oil and gas, which there’s plenty of, forever. Without invertebrates, everyone wins, right?

More seriously, there could be other benefits. Think of climate change. Global emissions of CO2 would be greatly reduced without them, seeing that the gas that sprouts forth from farm animals accounts for 15% of human-caused global CO2 emissions. So that would be good. We won’t even miss the farm animals either, since it seems increasingly likely that we will be 3D printing our dinners in the future. Indeed, lab-grown meat is already becoming a very real prospect, which means that sheep and pigs and chickens, all very costly to run, could disappear quite happily.

And pets, too. Pets can go. They contribute to global emissions as well. If we were to get rid of them, within two generations the very notion of having a pet would probably end up being absurd. The very idea that some houses had cats that would warm a person’s lap while they watched Netflix could seem almost perverted to our great-great-grandchildren whose idea of having a meaningful relationship with an animal would, by then, involve tossing 35 Pokéballs at an imaginary one while sitting on the toilet.

It’s not all rosy, however. I’m the first to admit that there would be problems in a post-animal world. Sure, green algal blooms would cover the entire face of the planet – most of the land and sea – and all plants would remain uncropped and there would be the stench of scavenging fungus which would come to fill the niche left by Earth’s animal decomposers. But we can handle a bit of a whiff or maybe get a man in to fix this for us or something. Oh, and the crops wouldn’t grow because there would be no worms to oxygenate the soil. And the soil communities – tiny nematode worms and mites – would die so the soil would essentially be dead and likely to blow away, being completely unsuitable for any forms of crops. But … let’s not get bogged down in the details, OK? Again, children and robots can probably fill this role. It’ll be fine.


Conflict and Common Goals: the Government and Silicon Valley

[Note: This item comes from friend Jen Snow. DLH]

By Levi Maxey
Oct 30 2016

Earlier this month, Director of National Intelligence (DNI) James Clapper described reaching out to the private sector as a “daunting task,” adding that “there is still much to be done” to improve information sharing in the age of digital communications. Brad Brekke, the FBI’s director of private sector engagement, added that there is a need to go even further than that, and “move from information sharing to collaboration.” These statements allude to the ongoing tensions between the government and the tech giants of Silicon Valley.

Tensions remain

These tensions are not new by any means, but were clearly—and publicly—demonstrated in the wake of the San Bernardino shooting in December 2015. Despite a warrant, Apple refused to help the FBI access information on one of the suspect’s iPhones. Apple claimed that the FBI’s request to change the software of the seized phone—which would allow the FBI to automatically attempt the 10,000 possible pin combinations without activating a security system designed to erase all data after too many incorrect guesses—would weaken the encryption of Apple’s entire system as it could be used on all iPhones, anywhere in the world.
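The mechanism at issue is easy to model. A toy Python sketch (not Apple’s implementation) shows why a four-digit passcode with an auto-erase counter resists exhaustive guessing:

```python
class Device:
    """Toy model of a passcode lock with an auto-erase counter."""
    MAX_TRIES = 10

    def __init__(self, pin):
        self._pin = pin
        self._failures = 0
        self.wiped = False

    def try_pin(self, guess):
        if self.wiped:
            return False            # data already destroyed
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_TRIES:
            self.wiped = True       # erase triggers: brute force is now useless
        return False

# An exhaustive search over all 10,000 four-digit codes fails long before
# it can cover the space, because the erase limit triggers first.
device = Device("7354")
for candidate in (f"{n:04d}" for n in range(10000)):
    if device.try_pin(candidate):
        break
```

The FBI’s request amounted to software that would suppress the wipe counter, which is why Apple argued the technique, once built, would work against any iPhone.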

Cipher Brief expert and former Senior Deputy General Counsel at the CIA, Robert Eatinger, argues that by refusing the lawful request from the FBI, a private company has made a decision that affects all Americans. “We were not asked if we wanted to amend the Constitution to withhold from our government the authority to search the contents of cellphones used by any person, for any purpose, anywhere in the world,” he writes. Apple has effectively “imposed terms of governance on the American people without our consent.”

The situation begs for a deeper public discussion on the trade-offs between strong technical security for the consumer and providing the government sufficient tools to effectively mitigate security threats.

There are issues inherent in digital communications that do not recognize national borders. The pros and cons of individual government policies can fall by the wayside when considering the global nature of the Internet and the industry surrounding it. True, the U.S. has a unique opportunity to cooperate with tech companies because many of them are based there, but the products of these companies are used around the world. Should the security of people living under foreign governments affect the decision-making calculus on how Americans ensure their own security? Is Apple able to grant the U.S. intelligence community access to iPhones while denying Chinese intelligence services the same?

Though the public debate is portrayed as one of competing values—privacy vs. security—these questions are not necessarily about opposing principles. Both the U.S. government and private industry have common goals: the promotion of free speech and individual liberty while combatting extremism and violence. Rather, it seems that new technologies have raised conflicting views on how best to achieve those goals—and the shared values, it could be argued, are getting lost in the debate.

Nuala O’Connor, President and CEO of the Center for Democracy and Technology asserts “all Americans—including both company executives and law enforcement officials across the nation—want to keep our country safe and secure.”

But progress is possible

Tech companies sharing information with government is not new. This became immediately apparent in 2013, after former NSA contractor Edward Snowden revealed the NSA program PRISM, which allowed the collection from prominent tech companies of Internet communications data being transmitted abroad. Recently it was revealed that Yahoo has been scanning emails in real time for intelligence agencies, using a modified spam-filter to search for a character string associated with a foreign terrorist organization.


The Merger of AT&T and Time Warner Is Too Big (and Bad) To Succeed

By Tim Karr
Oct 31 2016

AT&T executives think their plan to take over Time Warner is too big to fail. But the proposed merger’s astronomical cost may prove them wrong.

The $85 billion deal, combining the nation’s largest phone, Internet and pay-TV provider with an entertainment and news colossus whose holdings include CNN, HBO, TBS, TNT and Warner Bros. Studios, would be one of the largest media mergers ever.

The resulting enterprise would have an approximate market value of $300 billion. That’s nearly three times the value of Comcast after it bought NBCUniversal in 2011.

For the deal to go through, AT&T and Time Warner need the approval of government regulators, especially those at the Department of Justice, who will vet it to see if it violates antitrust laws.

If approved, the merger would saddle AT&T with a whopping debt load estimated to be $187 billion, according to industry analysts. AT&T took on tens of billions of dollars in debt last year when it completed its $69 billion takeover of DirecTV, a move soon followed by price increases for DirecTV and AT&T broadband customers.

There’s every reason to believe that similar rate hikes would follow AT&T’s acquisition of Time Warner. But the deal’s biggest proponents certainly won’t admit that. “I think anyone who characterizes this as a means to raise prices is ignoring the basic premise of what we’re trying to do here,” AT&T CEO Randall Stephenson told investors during a recent conference in California.

Stephenson and Time Warner CEO Jeff Bewkes went on a largely unsuccessful tour of media outlets to pump the deal and address its doubters. “This will be essentially a catalyst to more competition, more innovation,” Bewkes said during a stop at CNN studios in New York. “More competition leads to lower prices and happier consumers.”

Stephenson and Bewkes are ignoring some fundamental math here. AT&T will need to regularly pay interest to service its massive debt. The telecommunications giant doesn’t print cash; it bills customers. In other words, to pay its interest, AT&T will have to hike prices.
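A back-of-the-envelope version of that math, assuming an illustrative blended interest rate of 4.5% and a rough combined subscriber base (both figures are assumptions, not from AT&T’s filings):

```python
DEBT = 187e9          # estimated post-merger debt load (from the article)
RATE = 0.045          # assumed blended annual interest rate
SUBSCRIBERS = 155e6   # assumed wireless + pay-TV accounts, illustrative

annual_interest = DEBT * RATE                         # interest owed each year
per_account_monthly = annual_interest / SUBSCRIBERS / 12
```

On those assumptions, that is roughly $8.4 billion a year in interest, or about $4.50 per account per month just to keep the debt serviced, before a dollar of principal is repaid; hence the expectation of rate hikes.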

It’s likely those cost increases would occur in its consumer-product markets, including broadband, where AT&T faces little to no real competition. It could also raise the rates it charges and restrict the offerings it makes to other cable distributors that carry channels like HBO and CNN, limiting what people can see and generating new costs that would be passed along to pay-TV subscribers everywhere.

This trickle-down of higher prices would be felt not in the boardroom but in the living room, where working families are already struggling to keep up with escalating costs for Internet, wireless services and pay-TV.


Before the Flood – Full Movie

[Note: This item comes from friend David Isenberg. National Geographic has made the full feature film available on the internet for free viewing. DLH]

Oct 31 2016

Join Leonardo DiCaprio as he explores the topic of climate change, and discovers what must be done today to prevent catastrophic disruption of life on our planet.

About Before the Flood:
Before the Flood, directed by Fisher Stevens, captures a three-year personal journey alongside Academy Award-winning actor and U.N. Messenger of Peace Leonardo DiCaprio as he interviews individuals from every facet of society in both developing and developed nations who provide unique, impassioned and pragmatic views on what must be done today and in the future to prevent catastrophic disruption of life on our planet.

Video: 1:35:33

What the heck is happening at Apple?

By Robert X. Cringely
Oct 31 2016

“What the heck is happening at Apple?” people ask me. “Has the company lost its mojo? Why no new product categories? Why didn’t Apple, instead of AT&T, buy Time Warner? And why are the new MacBook Pros so darned expensive?”

After first getting out of the way the fact that Apple is still the richest public company in the history of public companies, let’s take these questions in reverse order, beginning with the MacBook Pros. In addition to their nifty OLED finger bar above the keyboard, these new Macs seem to have gained an average of $200 over the preceding models of the same size. What makes Apple think they can get away with that?

Apple can get away with that because it always has gotten away with it. Apple has always prided itself on high profit margins, and really socking it to the customer when a new product is released is a tradition in Cupertino dating back to the Apple III. Remember, the original 128K Macintosh cost more than $2,400, and it was close to useless.

High prices not only mean high margins, they also act to control demand, making it somewhat easier to handle problems that come with any truly new model. And those who are willing to pay more are often more understanding, too. Apple fanboys are proud to be the first and proud to have spent so much. It’s a luxury thing, I suppose.

What we can count on is that MacBook Pro prices won’t get any higher for many years and these models will decrease in price as production ramps up (expect $100 off just before and after Christmas) and especially when they are replaced by subsequent models with more powerful processors.

And as IBM reported a couple weeks ago, even at higher prices, Macs tend to be cheaper to own. I’m writing this on a mid-2010 non-Retina 13-inch MacBook Pro I bought six years ago last June. Yes, over time I increased the memory from four to 16 gigs, took the hard drive up from 240 gigs to a terabyte Fusion drive, and replaced both the battery and the keyboard when they wore out, but that still puts me only about $1,600 into this device with which I have so far generated well over $1 million in revenue. I have no plans to replace it.

In the same period I have also gone through three Windows notebooks from Toshiba, Acer and HP.

This very durability presents a problem for Apple that they’ve tried to deal with by eventually stopping software support for older machines. That’s why the Mac Minis of my kids now run Ubuntu. Old Macs get handed down or sold on Craigslist and that’s a problem for Apple, but not nearly as big a problem as the fact that pretty much everyone who wants a smart phone now has one.

Yes, Apple has a problem — a problem most other companies would love to have: customers like the products too much so the market is becoming saturated.

All Apple needs is a new product category, right? Another iMac, iPod, iPhone, iPad will do nicely. Where is it?

It isn’t anywhere and in that sense Apple has lost its mojo.

Apple has a problem of big numbers: unless a new product category can produce $5-10 billion in revenue in its first year, it almost isn’t worth doing. And Apple has such new products — headphones and Apple Music primarily — but those just don’t seem like much when Amazon — Amazon — seems to be inventing new categories all the time. But Amazon’s requirements for success are much lower than Apple’s and its tolerance for failure (Amazon Fire phone, anyone?) is higher.

Apple, whether it admits so or not, has to live with the memory of Steve Jobs.

And so Apple is both paralyzed and isolated. These two characteristics have to be considered together to understand where the company stands. While I doubt that Apple is out of good ideas, I also don’t doubt that the company is close to incapable of seriously committing to any of the ones it has. What would Steve do? And the absence of Steve Jobs is made even worse by the fact that Apple generally stands apart from the rest of the industries in which it competes. Where is Apple at industry conferences, for example? With the exception of technical standards bodies, where Apple shines, the company just isn’t out there. Rather than holding up a finger to test the wind, Apple’s tendency is to examine its own navel because great ideas are supposed to, well, just appear.

Except great ideas don’t just appear. As Steve Jobs (and Picasso) said, they are stolen. And to my knowledge nobody from Apple has been out stealing anything for a long time.

So Goldman Sachs is upset that Apple didn’t at least bid for Time Warner. I’m pretty sure Apple didn’t even know Time Warner was for sale.

The basic problem here is that Apple insists on “thinking different” when in fact there’s not much real thinking happening there at all — just waiting for something to percolate.


The Urban Broadband Gap

[Note: This item comes from friend Harold Feld. DLH]

By Doug Dawson
Oct 26 2016

It’s natural to think that all city-dwellers have great broadband options. But when you look closer you find out it’s often not really so. For various reasons there are sizable pockets of urban folks with gaping broadband needs.

Sometimes the broadband gap is just partial. I was just talking to a guy yesterday from Connecticut who lives in a neighborhood that largely commutes to New York City for work. These are rich neighborhoods of investment bankers, stockbrokers and other white collar households. They have cable modem service from Comcast and can get home broadband, but he tells me that cell phone coverage is largely non-existent. He can’t even use his cellphone outside of his house. There is a lot of talk about broadband migrating to wireless, but 5G broadband isn’t going to benefit people that can’t even get low-bandwidth cellular voice service.

I also have a good friend who lives in a multi-million dollar home in Potomac, Maryland – the wealthiest town in one of the wealthiest counties in the country. He has no landline broadband – no cable company, no Verizon FiOS, and not even any usable DSL. His part of the town has winding roads and sprawling lots and was built over time. I’m sure that it never met the cable company’s franchise density requirement of at least 15 or 20 homes per street mile of fiber – so it never got built. I am sure that most of the city has broadband, but even within the richest communities there are homes without.

You often see this problem just outside of city boundaries. Cities generally have franchise agreements that require the cable company to serve everybody, or almost everybody. But since counties rarely have these agreements, the cable and phone companies are free to pick and choose who to serve outside of town. You will see some neighborhoods outside of a city with a cable company network while another similar neighborhood nearby goes without. It’s easy to find these pockets by looking for satellite TV dishes. The difference between the two neighborhoods is often due to nothing more than the whim of the telco and cable companies at the time of original construction.

The fault for not having broadband can’t always be laid on the cable company. Apartment owners and real estate developers for new neighborhoods are often at fault. For example, there are many apartments around where the apartment owner made a deal years ago with a satellite TV provider to provide bulk cable TV service on a revenue sharing basis. In electing satellite TV, the apartment owner excluded the cable company and today has no broadband.

Real estate developers often make the same bad choices. For instance, some hoped to provide broadband themselves, but it never came to fruition. I’ve even seen some developments that just waited too long to invite in the cable company or telco, and the service providers declined to build after the streets were paved. The National Broadband Map is a great resource for understanding local broadband coverage. In my own area there are two neighborhoods on the map that show no broadband. When I first saw the map I assumed these were parks, but there are homes in both of these areas. I don’t know why these areas are sitting without broadband, but it’s as likely to be a developer issue as a cable company issue.

There have also been several articles written recently that accuse the large cable companies and telcos of economic redlining. These companies may use some of the above excuses for not building to the poorer parts of an urban area, but overlaying broadband coverage and incomes often paints a startling picture. Since deciding where a cable company expands is often at the discretion of local and regional staff it’s not hard to imagine bias entering the process.

I’ve seen estimates that between 6 and 8 million urban households don’t have broadband available. These have to be a mixture of the above situations – the neighborhoods are outside of a franchise area, or the developers or apartment owners didn’t allow ISPs in, or the ISPs are engaging in economic redlining. But whatever the reasons, this is a lot of households, especially when added to the 14 million rural homes without broadband.


Seeking Ownership of Both the Information and the Superhighway

By Jim Rutenberg
Oct 30 2016

On the face of it, there is something Strangelovian about the proposed merger between AT&T and Time Warner.

A company that controls the signal to the wireless devices of more than 130 million people, and to televisions in some 25 million households, buys a major movie studio and one of the biggest collections of cable channels in the country — potentially attaining a dominant position from which to control the information flow to a large percentage of Americans. A cultural-political Doomsday Machine is born. Mass media hegemony, or some such, follows. Or does it?

Like a lot of news consumers, I’ve been struggling to get my head around this deal, which would give AT&T control of the Warner Bros. movie studio and cable networks including CNN, HBO and TBS. It would be gargantuan, carrying an $85 billion price tag. And it would further concentrate media ownership into a few powerful hands, playing to fears of a big corporate media takeover of the wild and woolly web, which has been so central to this year’s great political upheaval. But it’s all very fuzzy.

What is it about this proposed merger that has both the left and the right, on the presidential trail and on Capitol Hill, so suspicious of it, if not downright opposed? Are the stakes really so high and the potential damage so great?

For answers, I turned to Senator Al Franken of Minnesota, the former “Saturday Night Live” comedian who led the congressional opposition to Comcast’s failed bid to buy Time Warner Cable nearly two years ago. He was one of the first to voice concerns about the AT&T-Time Warner deal.

To be fair, Mr. Franken, a Democrat, started by knocking down my “Dr. Strangelove” comparison, given that the movie is, after all, about the planet’s thermonuclear annihilation.

“I think you’re being alarmist,” he told me.

He laid out the case in less cinematic terms, because, as the comedian John Oliver has pointed out in his warnings about corporate media power on his HBO show, the big stakes of the wired future are hidden in the small print.

“This is AT&T, which owns DirecTV, so that’s a large pay-TV provider,” Mr. Franken said. “Also, they’re the second biggest in mobile broadband. And they’re trying to buy this content company, Time Warner, which has Warner Bros. and all this very desirable TV content.

“When the company that controls the pipes, so to speak, owns this very, very large content provider, it can cause a whole bunch of different horribles for consumers,” he said.

First and foremost, Mr. Franken said, there’s the dual threat of less choice and higher prices.

On paper at least, AT&T could hoard some of its most popular movies or channels, making them more readily available to DirecTV, to promote DirecTV subscriptions, while pushing competitors off DirecTV or into the deep wilderness of the DirecTV channel guide. Or, Mr. Franken said, the new company could charge competitors like Comcast and Cox more for its most popular channels, price increases that would no doubt be passed on to the consumer.

Jeffrey L. Bewkes, the Time Warner chief executive, dismissed those fears when we spoke Friday, saying that they don’t jibe with the company’s business imperatives: to offer the most channels for the best price, and to have its own channels as widely distributed as possible.

“It would be like selling toothpaste and not putting it in Duane Reade,” he said, referring to the ubiquitous New York City drugstore chain. “It doesn’t make any sense.”

AT&T and Time Warner executives say they are prepared to abide by the conditions government regulators are certain to impose, including prohibitions against using the new company’s power to discriminate against rivals.


Why the Industrial Revolution didn’t happen in China

By Ana Swanson
Oct 28 2016

To economic historians like Joel Mokyr, there’s nothing inevitable about the incredible wealth and health of the modern world. But for a spark in a little corner of Europe that ignited the Industrial Revolution — which spread incredible advances in technology and living standards first across the north Atlantic coast in the 1700s and 1800s and gradually around the world — we could all be living the nasty, brutish and short lives of our ancestors centuries before.

Mokyr, who teaches at Northwestern University, dives into the mystery of how the world went from being poor to being so rich in just a few centuries in a forthcoming book, “A Culture of Growth: The Origins of the Modern Economy.”

Drawing on centuries of philosophy and scientific advancements, Mokyr argues that there’s a reason the Industrial Revolution occurred in Europe and not, for example, in China, which had in previous centuries shown signs of more scientific advancement: Europe developed a unique culture of competitive scientific and intellectual advancement that was unprecedented and not at all predestined.

This interview has been edited for length and clarity.

Why is it important to consider this question, of why the Industrial Revolution occurred?

It is a question that needs to be asked if we want to know how we became what we are. The 19th and 20th centuries are in many ways the most transformative centuries in all of human history. Until about 1800, the vast bulk of people on this planet were poor. And when I say poor, I mean they were on the brink of physical starvation for most of their lives.

Life expectancy in 1750 was around 38 at most, and much lower in some places. The notion that today we would live 80 years, and spend much of those in leisure, is totally unexpected. The lower middle class in Western and Asian industrialized societies today has a higher living standard than the pope and the emperors of a few centuries back, in every dimension. That is the result of one thing: Our ability to understand the forces of nature and harness them for our economic needs.

If we understood how that happened, we would understand human history. For thousands of years, the material conditions that people lived in changed very little. Then all of a sudden, in 1800, it just zooms up.

That came out of Western Europe and its offshoot in North America after 1800. If it hadn’t been for that, you and I would be looking at a life expectancy of maybe 40, and I probably wouldn’t be sipping cappuccino from a fancy machine and talking to you on my smartphone. Look at what we have achieved in every dimension. Technology hasn’t just increased our income, it’s changed every aspect of daily life.

The question is, was all this inevitable? My answer is, absolutely no.

So why did this dramatic change occur? And why did it start in Europe, rather than in China?

China has a glorious past in its scientific achievements. And yet they were never able to turn it into economic growth as the West did. If you look at Europe and China in the 19th century, Europe is advancing at breathtaking speed. It’s building a rail network, steamships, factories. By the early 20th century, China looked like it was going to be completely occupied by imperialist powers. Clearly the technological and economic development of East and West diverged from 1850 on. The $64,000 question is “Why?”

People have given different answers, and I’m giving mine. One way of thinking about it is culture. But to state, “Hey, the Chinese have a different culture because they were Confucianists, and the Europeans were Christian,” I don’t buy that for a second. It’s much more subtle and complicated. The way I would phrase it is that culture is not independent of political and institutional circumstances.

China and Europe are different in many ways, but one is that after the Mongol conquest in the 13th century, China remains a unified empire run by a single Mandarin bureaucracy. There is nothing that competes with or threatens China. China does get invaded by Manchu tribes in 1644, but they don’t change the structure of the state. They learned to speak Chinese, dress like Chinese and eat like Chinese.

In Europe, no one ever succeeds in unifying it, and you have continuous competition. The French are worried about the English, the English are worried about the Spanish, the Spanish are worried about the Turks. That keeps everybody on their toes, which is something economists immediately recognize as the competitive model. To have progress, you want a system that is competitive, not one that is dominated by a single power.


Your home’s online gadgets could be hacked by ultrasound

[Note: This item comes from friend Jen Snow. DLH]

Your home’s online gadgets could be hacked by ultrasound
By Sally Adee
Oct 28 2016

This may have happened to you. You idly browse a pair of shoes online one morning, and for the rest of the week, those shoes follow you across the Internet, appearing in adverts across the websites you visit.

But what if those ads could pop out of your browser and hound you across different devices? This is the power of ultrasound technology, says Vasilios Mavroudis at University College London – and it offers a whole new way in for hacking attacks and privacy invasions. He and his colleagues will spell out their concerns at next week’s Black Hat cybersecurity conference in London.

So far, this kind of ultrasound technology has mainly been used as a way for marketers and advertisers to identify and track people exposed to their messages, like a cross-device cookie. High-frequency audio “beacons” are embedded into TV commercials or browser ads. These sounds, which are inaudible to the human ear, can be picked up by any nearby device that has a microphone and can then activate certain functions on that device. But the technology has many more applications. Some shopping reward apps, such as Shopkick, already use it to let retailers push department or aisle-specific ads and promotions to customers’ phones as they shop.
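The underlying signal processing is modest. A minimal pure-Python sketch (the frequencies are illustrative; real beacon schemes encode data across several tones):

```python
import math

FS = 48_000          # sample rate in Hz; common on consumer audio hardware
BEACON_HZ = 18_500   # near-ultrasonic: inaudible to most adults, within mic range

def make_beacon(duration=0.1, freq=BEACON_HZ, fs=FS):
    """Synthesize a short near-ultrasonic tone as a list of samples."""
    n = int(duration * fs)
    return [0.1 * math.sin(2 * math.pi * freq * i / fs) for i in range(n)]

def power_at(signal, freq, fs=FS):
    """Signal power at one frequency (a single-bin DFT, Goertzel-style)."""
    re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
    return re * re + im * im

beacon = make_beacon()
# The beacon stands out sharply at its own frequency versus an audible one.
strong = power_at(beacon, BEACON_HZ)
weak = power_at(beacon, 1_000)
```

Because the tone sits above the range most adults can hear but below the microphone’s Nyquist limit (24 kHz at a 48 kHz sample rate), it can ride along in an ad’s soundtrack unnoticed while any listening app detects it reliably.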

“It doesn’t require any special technology,” Mavroudis says. “If you’re a supermarket, all you need are regular speakers.”

Who is listening?

But the technology has been identified as a privacy risk. In March, the US Federal Trade Commission (FTC) rapped the knuckles of 12 app developers who used ultrasound for cross-device tracking – even when the apps weren’t turned on. This means that the apps could collect information about users without their awareness.

The software developer providing this code quickly withdrew it, but an FTC spokesperson says that the commission continues to be interested in cross-device tracking: “We’re continuing to look at the ways that can be achieved.”

And this is just one of the problems Mavroudis and his colleagues discovered when examining the vulnerabilities of ultrasound-based technologies.

One worry is that these programs may not just be picking up ultrasound. “Any app that wants to use ultrasound needs access to the full range of the microphone,” says Mavroudis. That means it would be possible, in theory, for the app to spy on your conversation.

The ultrasonic audio beacons that these apps pick up can also be imitated. This means that hackers could create fake beacons to send unwanted or malicious messages to your device, like malware. Mavroudis and his team realised that this would be possible when they found evidence of people trying to cheat a shopping rewards app by recording the ‘silent’ beacons (or just downloading recordings from the Internet) and then playing them to the app to supercharge their reward points. “That was when we realised how easy it would be to spoof these,” he says.

Home invasion

Mavroudis says that these vulnerabilities do not affect many people yet, as ultrasound apps are still niche. But the simplicity of ultrasound could make it an attractive technology for use in applications across the Internet of Things (IoT), says Mu Mu, a lecturer at the University of Northampton, UK.

As more IoT devices become connected and interlinked, they could overwhelm a home’s Wi-Fi channel, and different technologies will need to step in. Ultrasound is a good candidate for pairing home-connected devices that have a speaker and microphone. For example, Google’s Chromecast app uses ultrasound to pair your mobile phone with its streaming dongle.