Apple has redesigned its privacy website to offer quick overview primers and more detailed white papers
The page, simply titled Privacy, has existed for some time and contained much of the same information, most notably including instructions for how users can request a copy of all the data Apple stores about them. The new redesign, however, first breaks the topics down into simpler, clearer descriptions.
It seems intended to give concerned, if inexperienced, users a fast way to grasp the issues —and to see Apple as being different from all its competitors.
Alongside the newly rewritten explanatory introductions, however, Apple has also provided four detailed white papers. These cover privacy issues in Location Services, Photos, Safari web browsing, and the new Sign in with Apple.
The papers total over 40 pages of information about Apple policies and specifics of how the company uses which technologies in what circumstances or with which devices.
By themselves, these papers and the revised primer are a clear, comprehensive guide to issues that are increasingly concerning everyone. However, they’re also only the latest round in Apple’s ambition to increase awareness both of privacy and of its role in protecting us.
One of Apple’s new primers on Privacy, as shown on its redesigned site.
Apple’s privacy push
Tim Cook repeatedly says that privacy is central to Apple, and often adds that the company considers privacy right from the very start of designing any product. You might think that’s right or you might think it’s hype, but Apple has definitely been talking up privacy for a long time.
“We’ve always had a very different view of privacy than some of our… colleagues,” Steve Jobs said back in 2010. “We take privacy extremely seriously. Privacy means people know what they’re signing up for, in plain English and repeatedly.”
“I believe people are smart and some people want to share more data than other people do,” he continued. “Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you’re going to do with their data.”
Even as Jobs effectively launched Apple’s stance on privacy in that interview, the company was already seeing a commercial advantage to being rigidly protective of user data.
Around this same time, Jobs was visiting publishers in an attempt to get them to put their news sites on the iPad’s special news section, and they were resisting. A part of that resistance, which would be repeated years later over Apple News+, was that Apple wanted to take a cut of revenue.
However, a greater part was that in that scenario, readers would be Apple customers —and Apple would not release any user data without that user’s explicit permission. Knowing your audience, and being able to market to them, is key, and while Apple was protecting us all from unwanted advertising, it was able to do as much marketing to us as it liked.
Nonetheless, Apple did keep adding features, especially to iOS, that were specifically to help us keep control over our own privacy choices. This only increased when Tim Cook permanently took over from Steve Jobs in 2011.
Some privacy moves that Cook introduced were inspired by the discovery in 2012 that an app named Path, which had passed approvals and so was on the App Store, was illicitly uploading users’ address book data.
Under Cook, the Settings app in iOS 6 added a whole new Privacy section, which gave users the ability to toggle on or off permissions for apps to use certain data. Then iOS 7 added Activation Lock, which meant a device could not be set up without its original Apple ID and password, even if it had been completely erased.
Activation Lock helped keep the data on a device secure, because no one could get around it. Police also later said that the feature had led to a significant drop in iPhone thefts.
It was now 2013, and Apple was not just adding features to its software, it was beginning to try raising awareness in general. Apple introduced an annual Transparency Report, which details what it has done about privacy in the year —and precisely how many government requests for data it’s received.
If you’d been paying attention to security, and most people were not, then Apple would’ve been looking pretty good at this point.
Then it was revealed that the NSA had iPhone spyware, and suddenly Apple’s claims of independence and privacy were being doubted.
Tim Cook denied being complicit with the NSA and issued a statement:
Apple has never worked with the NSA to create a backdoor in any of our products, including iPhone. Additionally, we have been unaware of this alleged NSA program targeting our products. We care deeply about our customers’ privacy and security. Our team is continuously working to make our products even more secure, and we make it easy for customers to keep their software up to date with the latest advancements. Whenever we hear about attempts to undermine Apple’s industry-leading security, we thoroughly investigate and take appropriate steps to protect our customers. We will continue to use our resources to stay ahead of malicious hackers and defend our customers from security attacks, regardless of who’s behind them.
There was still the issue of how the NSA could get past Apple’s security, but then the very next year it briefly appeared as if just about anyone could. In what became known as Celebgate, the personal and intimate photos of 200 celebrities were seemingly taken as hackers got into iCloud.
They didn’t. It was done by phishing: these celebrities were each conned into handing over enough information for their iCloud accounts to be opened.
Try telling that to a public and, especially, a press just relishing a story with Apple, Hollywood celebrities, and intimate photos. Tim Cook certainly tried.
Detail from one of Apple’s new White Papers on Privacy.
After it all blew over, though, Cook became more reflective.
“When I step back from this terrible scenario that happened and say what more could we have done, I think about the awareness piece,” he told the Wall Street Journal. “I think we have a responsibility to ratchet that up.”
“That’s not really an engineering thing,” he continued. “We want to do everything we can do to protect our customers, because we are as outraged if not more so than they are.”
San Bernardino, the iPhone 5c, and the FBI
Celebgate is the reason you now get emailed whenever you log into iCloud from an unknown device. But it is not why the issue of Apple and privacy was ratcheted up. Arguably all the attention Apple gets today about privacy stems from December 2, 2015, and the San Bernardino attack.
During the FBI’s investigation of that incident, it first requested and then increasingly pressed Apple to help. Specifically, it wanted Apple to create a new version of iOS that would let investigators bypass its security and read the phone data of any suspect they needed. And Apple said no.
In retrospect, it was a clear continuation of everything Jobs and then Cook had said, but at the time, throughout early 2016, it was seen as Apple versus the FBI. The Bureau painted this as Apple being on the side of criminals, and in a public statement, Apple painted itself as being on the side of our civil liberties.
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.
This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.
The statement, a letter by Tim Cook, is still available on Apple’s site, and it takes very careful steps through the argument that any backdoor would ultimately be exploited by criminals.
Through a mixture of technology, security and that “awareness piece” that Cook described, Apple won the day and the FBI backed down.
Since then, Cook has said publicly that he wishes the case against the FBI had gone to court, and he has been increasingly vocal about privacy issues, most notably this year.
Las Vegas privacy billboard
Just a few days into 2019, Apple, which was not exhibiting at the Las Vegas Consumer Electronics Show, nonetheless displayed a billboard ad about privacy there.
Las Vegas billboard ad. (Photo: Chris Velazco via Twitter)
It’s also had its problems this year: along with the likes of Google Home and Amazon Alexa, Apple’s Siri was revealed to record users’ requests and have staff analyse them. It was all anonymised, and it was all in the ambition of improving Siri’s accuracy, but it backfired.
Apple might again say that this was about awareness more than engineering, though. It said that it would cease using the recordings and review its policies. Then it actually did cease using them, and it really did revise its policies. And this is why you are now asked for permission to let Siri use recordings in this way.
Facebook’s Mark Zuckerberg has previously described Apple’s attitude to privacy as being “glib,” suggesting that it’s fine to have a high horse when you’ve got all that money.
He does have a point, even if his firm is not exactly destitute either. You can see Apple as being on the side of the angels in all this. Or you can more cynically see it as making a profitable virtue out of the fact that it’s always made its money through selling devices instead of through selling data about its users.
Either way, though, you can’t argue that Apple is jumping on a bandwagon. Apple’s been publicly vehement about privacy for at least the last decade, and the newly redesigned Privacy website is just the latest part of that.
How the Dumb Design of a WWII Plane Led to the Macintosh
The B-17 Flying Fortress rolled off the drawing board and onto the runway in a mere 12 months, just in time to become the fearsome workhorse of the US Air Force during World War II. Its astounding toughness made pilots adore it: The B-17 could roar through angry squalls of shrapnel and bullets, emerging pockmarked but still airworthy. It was a symbol of American ingenuity, held aloft by four engines, bristling with a dozen machine guns.
Imagine being a pilot of that mighty plane. You know your primary enemy—the Germans and Japanese in your gunsights. But you have another enemy that you can’t see, and it strikes at the most baffling times. Say you’re easing in for another routine landing. You reach down to deploy your landing gear. Suddenly, you hear the scream of metal tearing into the tarmac. You’re rag-dolling around the cockpit while your plane skitters across the runway. A thought flickers across your mind about the gunners below and the other crew: “Whatever has happened to them now, it’s my fault.” When your plane finally lurches to a halt, you wonder to yourself: “How on earth did my plane just crash when everything was going fine? What have I done?”
For all the triumph of America’s new planes and tanks during World War II, a silent reaper stalked the battlefield: accidental deaths and mysterious crashes that no amount of training ever seemed to fix. And it wasn’t until the end of the war that the Air Force finally resolved to figure out what had happened.
To do that, the Air Force called upon a young psychologist at the Aero Medical Laboratory at Wright-Patterson Air Force Base near Dayton, Ohio. Paul Fitts was a handsome man with a soft Tennessee drawl, analytically minded but with a shiny wave of Brylcreemed hair, Elvis-like, which projected a certain suave nonconformity. Decades later, he’d become known as one of the Air Force’s great minds, the person tasked with the hardest, weirdest problems—such as figuring out why people saw UFOs.
For now though, he was still trying to make his name with a newly minted PhD in experimental psychology. Having an advanced degree in psychology was still a novelty; with that novelty came a certain authority. Fitts was supposed to know how people think. But his true talent was to realize that he didn’t.
When the thousands of reports about plane crashes landed on Fitts’s desk, he could have easily looked at them and concluded that they were all the pilot’s fault—that these fools should have never been flying at all. That conclusion would have been in keeping with the times. The original incident reports themselves would typically say “pilot error,” and for decades no more explanation was needed. This was, in fact, the cutting edge of psychology at the time. Because so many new draftees were flooding into the armed forces, psychologists had begun to devise aptitude tests that would find the perfect job for every soldier. If a plane crashed, the prevailing assumption was: That person should not have been flying the plane. Or perhaps they should have simply been better trained. It was their fault.
But as Fitts pored over the Air Force’s crash data, he realized that if “accident prone” pilots really were the cause, there would be randomness in what went wrong in the cockpit. These kinds of people would get hung up on anything they operated. It was in their nature to take risks, to let their minds wander while landing a plane. But Fitts didn’t see noise; he saw a pattern. And when he went to talk to the people involved about what actually happened, they told of how confused and terrified they’d been, how little they understood in the seconds when death seemed certain.
The examples slid back and forth on a scale of tragedy to tragicomic: pilots who slammed their planes into the ground after misreading a dial; pilots who fell from the sky never knowing which direction was up; the pilots of B-17s who came in for smooth landings and yet somehow never deployed their landing gear. And others still, who got trapped in a maze of absurdity, like the one who, having jumped into a brand-new plane during a bombing raid by the Japanese, found the instruments completely rearranged. Sweaty with stress, unable to think of anything else to do, he simply ran the plane up and down the runway until the attack ended.
Fitts’s data showed that during one 22-month period of the war, the Air Force reported an astounding 457 crashes just like the one in which our imaginary pilot hit the runway thinking everything was fine. But the culprit was maddeningly obvious for anyone with the patience to look. Fitts’s colleague Alphonse Chapanis did the looking. When he started investigating the airplanes themselves, talking to people about them, sitting in the cockpits, he also didn’t see evidence of poor training. He saw, instead, the impossibility of flying these planes at all. Instead of “pilot error,” he saw what he called, for the first time, “designer error.”
The reason why all those pilots were crashing when their B-17s were easing into a landing was that the flaps and landing gear controls looked exactly the same. The pilots were simply reaching for the landing gear, thinking they were ready to land. And instead, they were pulling the wing flaps, slowing their descent, and driving their planes into the ground with the landing gear still tucked in. Chapanis came up with an ingenious solution: He created a system of distinctively shaped knobs and levers that made it easy to distinguish all the controls of the plane merely by feel, so that there was no chance of confusion even when flying in the dark.
By law, that ingenious bit of design—known as shape coding—still governs landing gear and wing flaps in every airplane today. And the underlying idea is all around you: It’s why the buttons on your videogame controller are differently shaped, with subtle texture differences so you can tell which is which. It’s why the dials and knobs in your car are all slightly different, depending on what they do. And it’s the reason your virtual buttons on your smartphone adhere to a pattern language.
But Chapanis and Fitts were proposing something deeper than a solution for airplane crashes. Faced with the prospect of soldiers losing their lives to poorly designed machinery, they invented a new paradigm for viewing human behavior. That paradigm lies behind the user-friendly world that we live in every day. They realized that it was absurd to train people to operate a machine and assume they would act perfectly under perfect conditions.
Instead, designing better machines meant figuring out how people acted without thinking, in the fog of everyday life, which might never be perfect. You couldn’t assume humans to be perfectly rational sponges for training. You had to take them as they were: distracted, confused, irrational under duress. Only by imagining them at their most limited could you design machines that wouldn’t fail them.
This new paradigm took root slowly at first. But by 1984—four decades after Chapanis and Fitts conducted their first studies—Apple was touting a computer for the rest of us in one of its first print ads for the Macintosh: “On a particularly bright day in Cupertino, California, some particularly bright engineers had a particularly bright idea: Since computers are so smart, wouldn’t it make sense to teach computers about people, instead of teaching people about computers? So it was that those very engineers worked long days and nights and a few legal holidays, teaching silicon chips all about people. How they make mistakes and change their minds. How they refer to file folders and save old phone numbers. How they labor for their livelihoods, and doodle in their spare time.” (Emphasis mine.) And that easy-to-digest language molded the smartphones and seamless technology we live with today.
Along the long and winding path to a user-friendly world, Fitts and Chapanis laid the most important brick. They realized that as much as humans might learn, they would always be prone to err—and they inevitably brought presuppositions about how things should work to everything they used. This wasn’t something you could teach out of existence. In some sense, our limitations and preconceptions are what it means to be human—and only by understanding those presumptions could you design a better world.
Today, this paradigm shift has produced trillions in economic value. We now presume that apps that reorder the entire economy should require no instruction manual at all; some of the most advanced computers ever made now come with only cursory instructions that say little more than “turn it on.” This is one of the great achievements of the last century of technological progress, with a place right alongside GPS, Arpanet, and the personal computer itself.
It’s also an achievement that remains unappreciated because we assume this is the way things should be. But with the assumption that even new technologies need absolutely no explaining comes a dark side: When new gadgets make assumptions about how we behave, they force unseen choices upon us. They don’t merely defer to our desires. They shape them.
User friendliness is simply the fit between the objects around us and the ways we behave. So while we might think that the user-friendly world is one of making user-friendly things, the bigger truth is that design doesn’t rely on artifacts; it relies on our patterns. The truest material for making new things isn’t aluminum or carbon fiber. It’s behavior. And today, our behavior is being shaped and molded in ways both magical and mystifying, precisely because it happens so seamlessly.
I got a taste of this seductive, user-friendly magic recently, when I went to Miami to tour a full-scale replica of Carnival Cruise’s so-called Ocean Medallion experience. I began my tour in a fake living room, with two of the best-looking project staffers pretending to be husband and wife, showing me how the whole thing was supposed to go.
Using the app, you could reserve all your activities way before you boarded the ship. And once on board, all you needed to carry was a disk the size of a quarter; using that, any one of the 4,000 touchscreens on the ship could beam you personalized information, such as which way you needed to go for your next reservation. The experience recalled not just scenes from Her and Minority Report, but computer-science manifestos from the late 1980s that imagined a suite of gadgets that would adapt to who you are, morphing to your needs in the moment.
Behind the curtains, in the makeshift workspace, a giant whiteboard wall was covered with a sprawling map of all the inputs that flow into some 100 different algorithms that crunch every bit of a passenger’s preference behavior to create something called the “Personal Genome.” If Jessica from Dayton wanted sunscreen and a mai tai, she could order them on her phone, and a steward would deliver them in person, anywhere across the sprawling ship.
The server would greet Jessica by name, and maybe ask if she was excited about her kitesurfing lesson. Over dinner, if Jessica wanted to plan an excursion with friends, she could pull up her phone and get recommendations based on the overlapping tastes of the people she was sitting with. If some people like fitness and others love history, then maybe they’ll all like a walking tour of the market at the next port.
Jessica’s Personal Genome would be recalculated three times a second by 100 different algorithms using millions of data points that encompassed nearly anything she did on the ship: How long she lingered on a recommendation for a sightseeing tour; the options that she didn’t linger on at all; how long she’d actually spent in various parts of the ship; and what’s nearby at that very moment or happening soon. If, while in her room, she had watched one of Carnival’s slickly produced travel shows and seen something about a market tour at one of her ports of call, she’d later get a recommendation for that exact same tour when the time was right. “Social engagement is one of the things being calculated, and so is the nuance of the context,” one of the executives giving me the tour said.
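The "overlapping tastes" idea described a couple of paragraphs up is, at its core, a set-intersection ranking: score each activity by how many people in the group it matches. Here is a minimal, purely illustrative Python sketch of that idea — the function, names, and data are hypothetical, not Carnival's actual Personal Genome algorithms:

```python
def recommend_for_group(group_tastes, activities):
    """Rank activities by how many group members they appeal to.

    group_tastes: dict mapping person -> set of interests.
    activities: dict mapping activity -> set of interest tags.
    Returns activities that match at least one person, best first.
    """
    scored = []
    for activity, tags in activities.items():
        # A person "matches" if their interests intersect the activity's tags.
        matches = sum(1 for tastes in group_tastes.values() if tastes & tags)
        scored.append((matches, activity))
    scored.sort(reverse=True)  # most matches first
    return [activity for matches, activity in scored if matches > 0]


group = {
    "Jessica": {"fitness", "food"},
    "Sam": {"history", "food"},
    "Lee": {"fitness", "history"},
}
activities = {
    "market walking tour": {"history", "food", "fitness"},
    "spin class": {"fitness"},
    "wine tasting": {"food"},
}

# The market tour matches all three diners, so it ranks first.
print(recommend_for_group(group, activities))
```

A real system like the one described would of course weight signals (dwell time, context, social engagement) rather than simply count overlaps, but the intuition is the same.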
It was like having a right-click for the real world. Standing on the mocked-up sundeck, knowing that whatever I wanted would find me, and that whatever I might want would find its way either onto the app or the screens that lit up around the cruise ship as I walked around, it wasn’t hard to see how many other businesses might try to do the same thing. In the era following World War II, the idea that designers could make the world easier to understand was a breakthrough.
But today, “I understand what I should do” has become “I don’t need to think at all.” For businesses, intuitiveness has now become mandatory, because there are fortunes to be made by making things just a tad more frictionless. “One way to view this is creating this kind of frictionless experience is an option. Another way to look at it is that there’s no choice,” said John Padgett, the Carnival executive who had shepherded the Ocean Medallion to life. “For millennials, value is important. But hassle is more important, because of the era they’ve grown up in. It’s table stakes. You have to be hassle-free to get them to participate.”
By that logic, the real world was getting to be disappointing when compared with the frictionless ease of this increasingly virtual world. Taken as a whole, Carnival’s vision for seamless customer service that can anticipate your every whim was like an Uber for everything, powered by Netflix recommendations for meatspace. And these are in fact the experiences that many more designers will soon be striving for: invisible, everywhere, perfectly tailored, with no edges between one place and the next. Padgett described this as a “market of one,” in which everything you saw would be only the thing you want.
The Market of One suggests to me a break point in the very idea of user friendliness. When Chapanis and Fitts were laying the seeds of the user-friendly world, they had to find the principles that underlie how we expect the world to behave. They had to preach the idea that products built on our assumptions about how things should work would eventually make even the most complex things easy to understand.
Steve Jobs’ dream of a “bicycle for the mind”—a universal tool that might expand the reach of anyone—has arrived. High technology has made our lives easier; made us better at our jobs, and created jobs that never existed before; it has made the people we care about closer to us. But friction also has value: It’s friction that makes us question whether we do in fact need the thing we want. Friction is the path to introspection. Infinite ease quickly becomes the path of least resistance; it saps our free will, making us submit to someone else’s guess about who we are. We can’t let that pass. We have to become cannier, more critical consumers of the user-friendly world. Otherwise, we risk blundering into more crashes that we’ll only understand after the worst has already happened.
Excerpted from USER FRIENDLY: How the Hidden Rules of Design Are Changing the Way We Live, Work, and Play by Cliff Kuang with Robert Fabricant. Published by MCD, an imprint of Farrar, Straus and Giroux November 19th 2019. Copyright © 2019 by Cliff Kuang and Robert Fabricant. All rights reserved.
A Tesla Cybertruck Mishap, a Massive Data Leak, and More News
Hackers are stealing and Elon is squealing, but first: a cartoon about subscription dreams.
Here’s the news you need to know, in two minutes or less.
Want to receive this two-minute roundup as an email every weekday? Sign up here!
Meet the Tesla Cybertruck, Elon Musk’s Ford-fighting pickup truck
Tesla CEO Elon Musk last night unveiled his newest baby, an all-electric pickup called the Tesla Cybertruck. He demonstrated that it can take a sledgehammer to the door with nary a scratch, and he also accidentally demonstrated that it can’t take a ball to the window. But behind the showmanship and Elon’s audible disbelief at the onstage mishap is a truck with a 500-mile range and the torque that comes from an electric motor. It represents an important new market expansion for Tesla. Now it just has to actually put the darn thing into production.
1.2 billion records found exposed online in a single server
Hackers have long used stolen personal data to break into accounts and wreak havoc. And a dark web researcher found one data trove sitting exposed on an unsecured server. The 1.2 billion records don’t include passwords, credit card numbers, or Social Security numbers, but they do contain cell phone numbers, social media profiles, and email addresses—a great start for someone trying to steal your identity.
Fast Fact: 2025
That’s the year NASA expects to launch the first dedicated mission to Europa, where water vapor was recently discovered. The mission to Jupiter’s moon will involve peering beneath Europa’s icy shell for evidence of life.
WIRED Recommends: The Gadget Lab Newsletter
First of all, you should sign up for WIRED’s Gadget Lab newsletter, because every Thursday you’ll get the best stories about the coolest gadgets right in your inbox. Second of all, it will give you access to early Black Friday and Cyber Monday deals so you can get your shopping done early.
News You Can Use:
Here’s how to hide nasty replies to your tweets on Twitter.