Spooky Action — Real Life
As the devices that are central to our work, leisure, and personal relationships recede further and further from a layperson’s understanding, occult metaphors become a useful way of naming the inscrutable new powers at work in our lives. The terms of magic, possession, and haunting express both awe and fear of the cloaked nature of technology that seems on the point of being able to operate without us.
The results? On fully eight of the nine measures, “polarization increases more for the old than the young.” If Facebook is the problem, then how come the problem is worst among those who don’t use Facebook?
Starting in the summer of 2018, the European Union may require that companies be able to give users an explanation for decisions that automated systems reach. This might be impossible, even for systems that seem relatively simple on the surface, such as the apps and websites that use deep learning to serve ads or recommend songs. The computers that run those services have programmed themselves, and they have done it in ways we cannot understand.
Because the second AI is working so hard to identify images as fake, the first learns to mimic the real in ways it couldn’t on its own. In the process, these two neural networks can push AI toward a day when computers declare independence from their human teachers.
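The “two neural networks” in this quote describe a generative adversarial network (GAN). As an illustration only, here is a minimal numpy sketch of that adversarial loop, with a toy one-parameter generator and a logistic discriminator; all parameters, learning rates, and distributions are invented for the example, not taken from any real system.

```python
# Toy sketch of adversarial training: a generator g(z) = a*z + b tries to
# mimic samples from a "real" distribution, while a discriminator
# D(x) = sigmoid(w*x + c) learns to tell real from fake. The generator
# improves precisely because the discriminator keeps catching it.
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-np.clip(t, -30, 30)))

rng = np.random.default_rng(0)
mu, sd = 4.0, 0.5      # the "real" data distribution: N(4, 0.5)
a, b = 1.0, 0.0        # generator parameters
w, c = 0.1, 0.0        # discriminator parameters
lr = 0.05

for _ in range(2000):
    real = rng.normal(mu, sd, 32)
    z = rng.normal(0.0, 1.0, 32)
    fake = a * z + b

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w -= lr * np.mean(-(1 - d_real) * real + d_fake * fake)
    c -= lr * np.mean(-(1 - d_real) + d_fake)

    # Generator step: adjust (a, b) so the discriminator calls fakes real.
    d_fake = sigmoid(w * fake + c)
    dldx = -(1 - d_fake) * w       # gradient of -log D(fake) w.r.t. x
    a -= lr * np.mean(dldx * z)
    b -= lr * np.mean(dldx)

# After training, the generator's mean (b) has drifted toward the real mu:
# neither network was ever shown the answer directly.
```

The point of the sketch is structural: each network’s loss is defined only by the other’s behavior, which is what the quote means by the generator learning “to mimic the real in ways it couldn’t on its own.”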
If one recalls the Eric Raymond line that a computer should never ask you something that it should be able to work out, then a computer that can see everything you see and know what you're looking at, and that has the next decade of development of machine learning to build on, ought to remove whole layers of questions that today we assume we have to deal with manually.
In order to advance technologically, we apparently need the public and decision makers to repeatedly fall for the promises of honey-tongued promoters, to enter analogues of the Steve Jobs "reality distortion field." An effective fake-news-detecting method could easily destroy the illusions that progress is based on.
We need to go beyond basic skills to raise the first generation of native digital understanders – people who, unlike most of the rest of us, know where and how their technology is made. Imagine a Britain where tech no longer scares or dazzles us, where it is as useful but unremarkable as a wristwatch.
When Booming Ben, the last heath hen, died on Martha’s Vineyard, they said he’d spent his last days crying out for a female that never came to him. The Vineyard Gazette dedicated an entire issue to his memory: “There is no survivor, there is no future, there is no life to be recreated in this form again. We are looking upon the uttermost finality which can be written, glimpsing the darkness which will not know another ray of light.”
An opinionated interface equips users to express complex concepts like self, story, or environment by prescribing a method of capture without restricting the result, like a blank canvas with a set of specific brushes.
Includes quotes from me, but I swear that’s not why I’m sharing it.
"The growing importance of cameras — of images rather than just text — is altering much about culture. It’s transforming many people’s personal relationships. It’s changing the kind of art and entertainment we produce. You might even credit cameras — or blame them — for our more emotional, and less rational, politics."
Stickers are not a devolution of language. They aren’t even an evolution. They’re a cultural memory of the way things used to be, made possible by recent advances in technology.
If the robots take over the world, it won't be by knocking the door down. At the moment, I think it's certainly as big a risk that we have a GMO moment, and there's a powerful reaction against the technology which prevents us from reaping the benefits, which are enormous. I think that's as big a risk as the risks from the technologies themselves.
"For a time, she will be gone without our knowing it. The information will travel like the compressional wave ahead of an earthquake, detectable only by special equipment."
This implicit belief in some sort of base progress for humankind has been massively eroded by a combination of fear, unchecked growth in inequality, and a general yearning for the better days of yore—better days that never existed for many of us. I often think of a Pew study from 2014 that highlighted just how pessimistic America's older generations are about the financial prospects of their kids and youth as a whole. This sentiment has helped fuel the protective populism that put President Trump in the White House.
I feel the writing is on the wall for digital display advertising, our main revenue stream for supporting online news. I see more and more small businesses taking money they would once have spent with local news outlets, and spending it on digital ads — not on local websites, but on promoted Facebook posts and Google keyword advertising.
If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats—the human equivalent of the cat around the corner—it’s a trait that should have been selected against.
Old-school Twitter could be a bubble of sorts, too, depending on who you followed. But the chronological timeline at least gave equal weight to every tweet, regardless of whether it was likely to please or upset you.
The ranked timeline, even in its modest present form, has changed that. You’re now far more likely to see certain types of tweets than others when you log in. The question is: What types of tweets are you seeing more of?
As the mobile market has matured and become more commoditised, web technologies have also matured. The opportunity is still there, in fact the timing may now be better than ever. Although I understand the reasons the project was stopped, I feel Firefox OS was ended prematurely, before it had the chance to reach its full potential. What it needed was a reboot and some ruthless and clear direction, which unfortunately it didn’t get.
Global elites — those with high income and educational levels, who live in capital cities — are considerably more enthusiastic about innovation than the general population, the FT/Qualcomm Essential Future survey found. This gap, unless addressed, will continue to cause political friction.
Of course it is one thing to point out the problems with Facebook’s dominance, but it’s quite another to come up with a strategy for dealing with it; too many of the solutions — including demands that Zuckerberg use Facebook for political ends — are less concerned with the abuse of power and more with securing said power for the “right” causes.
The alt-right does not exist. It's nothing more than white supremacists who have repackaged the hate and served it up in a more palatable form for human consumption.
Snowflakes are to be mocked because they take things personally; their feelings are hurt. The outrage of populist correctness, however, is framed more as righteous indignation. It is not you who is offended. You are offended on behalf of the people. On behalf of your country. Your outrage is morally superior.
I said, “Do you mind me asking if you’ve had bad experiences with people cheating on you?”
“I have so far abstained from sex,” he said. “I have never had a girlfriend.”
“You’re saving yourself for the sexbots?”
He nodded slowly, shrewdly raising his eyebrows. You bet he was saving himself for the sexbots.
VR is an embodied medium: creators are taking that detached eye and reattaching it to someone’s face. VR reminds us of the nuances of experiences, what connects people with each other, with places, with things in the real world. And that to me is the key to really understanding what kind of storytelling could even exist in a VR space.
New technology does scare us now, just as it always has, but few films have articulated why. Video conferencing apps like Skype have provided a fresh twist on the mirror-scare cliché, but the true terror of modern technology is the isolation and stress of constant connection, which isn’t as easy to put in cinematic terms.
The next time I'd meet Luckey he'd be many, many millions of dollars richer, and Oculus would be a Facebook-owned company. But despite that very real marker of success, our topic of conversation each time we met remained the same: How are you going to convince people it's worth it? And isn't it going to be way too expensive?
We ultimately believe the pixels are the source of truth, and being able to understand that truth at scale unlocks new product experiences and business workflows.
What is the future of typography? And what does the web have in store for us? It’s easy to think of the latest trends in virtual and augmented reality, or drone typography, or to discuss how flat design might conquer the world. What I’m really interested in though are the technologies and ideas that are likely to stick around for the long term.
There was a hard, if old, lesson in the fate of the web: to preserve anything public-spirited takes more than good vibes – it requires some really hard-line institutional structure to maintain spaces that bring out the best in us.
This is why standards nerds argue so passionately. They’ve seen how long it takes to steer technology away from its roots, no matter what zippy startup culture can get away with on the surface. Choices we agree on now are going to stick around, and get baked into the foundational brick of our biggest, most critical systems. Be careful what you toss in there! Be careful what you assume!
According to a study by Ball State University, nearly nine in 10 jobs that disappeared since 2000 were lost to automation in the decades-long march to an information-driven economy, not to workers in other countries.
Even if those jobs returned, a high school diploma is simply no longer good enough to fill them. Yet rarely discussed in the political debate over lost jobs are the academic skills needed for today’s factory-floor positions, and the pathways through education that lead to them.
It’s not unusual to try to cram the previous generation’s content into the latest technological medium. It’s happened over and over throughout time, whether you’re talking about bringing the newspaper to radio, radio to cinema, or cinema to television. The shift from the web and apps to messaging or voice will follow a similar pattern until people start to experiment with, take risks on, and ultimately develop experiences that are inherent to and optimized for the messaging and voice contexts.
The success of Amazon’s Echo and the onslaught of connected devices have reinvigorated gadget manufacturers to give alternative interactions another go. The question is whether these companies can uncover something people like enough to change their behavior and keep the technology alive.
A new medium requires new technology and new design. Each time we encounter a new medium, we do what humans do: we retrofit our assumptions and needs from prior experiences. Bots are frontier technology, starting to deliver on a promise that computing has been making for decades: that we will be able to speak to computers in our own language, be it text or spoken word.
The problem with new technologies is that we’re really, really bad at predicting their futures. Our immediate frame of reference is so small, and our experience of time so fleeting, that any attempt to imagine how technologies will develop is bound to fail. We suffer from what Carolyn Marvin, in her book When Old Technologies Were New, calls “cognitive imperialism.” We can only imagine futures which are broad extensions of our own contexts and needs.
“I learned a lesson from watching other companies who held onto things too long. If you look at the history of companies that have succeeded and the ones that have failed, there’s a pretty clear pattern that the ones that have succeeded typically morph every couple of years into something new. And that change is fairly uncomfortable.”
The Bad Product Fallacy
Your personal use cases and opinion are a shitty predictor of a product’s future success.
Today, ten years after the iPhone launched, I have some of the same sense of early constraints and assumptions being abandoned and new models emerging. If in 2004 we had 'Web 2.0', now there's a lot of 'Mobile 2.0' around.
The end of Moore’s law does not mean that the computer revolution will stall. But it does mean that the coming decades will look very different from the preceding ones, for none of the alternatives is as reliable, or as repeatable, as the great shrinkage of the past half-century.
In Britain today, around a quarter of smartphone users get news alerts through their phones, according to Newman’s study. But in other nations, that figure is larger: 40% of Taiwanese users have news alerts on their phones, and 33% of US users do.
As the gadgets around us get more and more capable, they’ll need to get more polite, and more socially aware. The real design challenges will become less about screens and things, and more about scripts and cues. When technology gets laced into the fabric of everything, what we’re left with is etiquette.
Being able to talk to computers abolishes the need for the abstraction of a “user interface” altogether. Just as mobile phones were more than existing phones without wires, and cars were more than carriages without horses, so computers without screens and keyboards have the potential to be more useful, powerful and ubiquitous than people can imagine today.
“Hey Siri, predict the future of audio interfaces.” If she were smarter, she’d respond that 2017 will be the tipping point — the year we fall headfirst into our always-on, audio-enabled, ambient computing future.
We’ll have users asking our [digital assistant], “Are you male or female?” “Do you have a girlfriend?” “Would you want to be my girlfriend?” Friendship is one of the use cases we support, and it’s quite popular among the users.
The lack of secure identification and authentication inherent in the internet’s genetic code has also prevented easy transactions, thwarted financial inclusion, destroyed the business models of content creators, unleashed deluges of spam, and forced us to use passwords and two-factor authentication schemes that would have baffled Houdini.
A new form of information manipulation is unfolding in front of our eyes. It is political. It is global. And it is populist in nature. The news media is being played like a fiddle, while decentralized networks of people are leveraging the ever-evolving networked tools around them to hack the attention economy.
The world is not something we study neutrally, that we gather neutral knowledge about, on which we can act neutrally. Rather, we make the world by understanding it, and the way we understand it changes it.
Cuarón was, against all odds, confident that better days lie ahead. “I used to think that any solution would come from the paradigms that I know,” he says. “Now I think that the only thing is to think of the unimaginable. For the new generation, the unimaginable is not as unimaginable.”
Audiences are moving to more innovative, modern platforms created by tech companies in California like Google, Facebook and Snap. Meanwhile, media companies have been much too slow to shift to digital; they’ve clung to print and broadcast, even when it was clear audiences are moving elsewhere. This means the budgets for quality journalism are focused on the wrong places, creating a void that is filled by the cheapest possible content, often from questionable sources. The attention has moved, but the content-creation resources mostly haven’t.