Imperica's pick of interesting, long-form articles on contemporary culture from around the Internet.
Subscribe to our newsletter to get it all wrapped up, every Friday.
Customer journey mapping is a logical way to organize the elements of the consumer experience into a cohesive whole. It’s a framework that gives designers a way to better understand what users need.
But, in real life, people don’t follow logical paths. Technology, culture and basic human unpredictability mean that experience and service designers must assume that every customer journey will be unique, non-linear and fragmented. Therefore, if your narrative and benefits rely upon a linear interaction with all parts of an experience, it’s certain that the vast majority of customers/users will never experience most of them.
A simple journey with just 4 steps in fact has 24 (4!) different possible orderings. And this complexity is compounded dramatically for services or experiences that operate in multiple countries or regions, or that need to serve multiple levels of user expertise. That means that for any substantial service or experience design project, it’s unrealistic to map all the possible paths.
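The combinatorics above can be checked in a few lines. This is a minimal sketch, not from the original article; the step names are hypothetical placeholders for any four-step journey:

```python
from itertools import permutations

# Four hypothetical touchpoints in a customer journey.
steps = ["discover", "compare", "sign up", "use"]

# Every order in which a customer might encounter the four steps.
paths = list(permutations(steps))
print(len(paths))  # 4! = 24 possible orderings
```

With a fifth step the count jumps to 120, which is why mapping every path quickly becomes impractical.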
Many humans don’t settle for just barely scraping by; they improve themselves towards self-actualization. Could we do the same at the level of civilization? Rather than merely avoiding disasters, let’s truly reach for the stars.
In this essay, I describe a decentralized system for humanity to collectively graph the steps towards self-actualization, from conquering malaria and a shared planetary database, to universal basic income and even asteroid mining. But vision is not enough. We need execution, and the resources to power it. There’s a new tool: the blockchain token launch. It’s a low-friction way for a community of aspirational thinkers to fund big ideas with big dollars, and to benefit from the success of those ideas. We start with a token launch for the map itself. That in turn propels token launches for the first steps of the map, which in turn propels the second steps, and so on.
In a shiny Airstream trailer, on the roof of his company’s new headquarters, Wayne McGregor looks across the Olympic Park in Stratford, east London. This is not your usual dance HQ. But McGregor isn’t what you’d expect from a choreographer. The resident brainbox of British dance is always questing for new territory. His work with ballet companies often attracts headlines – it’s a world new to extreme moves, music by Mark Ronson and the White Stripes, big ideas about the multiverse – but his own company is a research lab for innovation.
Now, the science geek is using his own DNA, and collaborating with scientists at the Wellcome Trust Sanger Institute, as the inspiration for a show called Autobiography. “If you’re looking for a document of my life with a narrative arc about me growing up in Stockport, you’ll be frustrated,” he grins. Instead, it’s Who Do You Think You Are? but with genes. The show began taking shape when McGregor wondered how artificial intelligence (AI) might animate his archive of 25 years’ working in dance. This led him to consider the body itself as “a living archive. Not as a nostalgia-fest but as an idea of speculative future. Each cell carries in it the whole blueprint of your life, basically.” Your genetic code tells the story of your past – and predicts possible stories of your future.
Big brands pay the salaries and provide investment returns for many millions of people via pension funds. So if anyone declares that big brands are dying, they receive a great deal of attention.
There is quite a long history of such alarms, going back at least to 1993’s “Marlboro Friday”. Recently, there have been claims that big/global brands are losing to small/local brands. Theories have hastily been put forward why this might be, leading to marketing strategy recommendations.
But what is fact and what is fiction? And what strategies make sense for big brands? We report extensive new analyses along with the scientific research (published in peer-reviewed journals).
They say there’s no such thing as bad publicity, but social media companies might yet prove that old dictum untrue. They’ve made headlines daily lately, in a fairly appalling way: Facebook selling anti-Semitic ads, swaying an election with “fake news”, Twitter being a platform for extremists. And so on.
So. A tough but necessary question: is social media a failure? Let’s think about it for a moment, not with condemnation, blame, or shame, but just clarity, purpose, and understanding.
The economics of social media are stellar. Facebook earns piles of cash. Twitter isn’t as successful, but it’s still a publicly traded company — a billion-dollar tale of modern-day fortune.
This is a story about how the airport became the setting for the Great American Freakout. Once an icon of progress, then another stale waiting room of modern life, the airport has now entered a third phase.
This summer, Ann Coulter threw a three-day tantrum over a Delta seat assignment, comparing the airline gate attendants to Nurse Ratched, the sadistic warden who rules over the lunatics in One Flew Over the Cuckoo’s Nest. There was some truth to the observation. It was the latest incident in a year of airport fracases—including a brawl at the Spirit Airlines counter in Fort Lauderdale, Florida (May), the concussion of the 69-year-old David Dao who wouldn’t relinquish his seat (April), widespread pro-immigrant protests (January), two full-on panic stampedes one year ago, and a steady drumbeat of racial and religious profiling at security and immigration—that have confirmed the airport’s new role in American life as the marble-floored home of our national, fear-fueled psychosis.
The airport is, on the one hand, as representative a civic space as America has. Nearly half of American adults fly commercial each year, making the airport nearly as common a shared experience as the voting booth. It is also roiled by the ceaseless friction of its many internal borders, real and felt, that separate safety from danger, admittance from expulsion, brown from white, the rich from the rest. Real anxiety has swelled in this liminal space for decades, as airlines grew stingier, the security state grew stricter, and the borders in airport basements grew busier. But as with many conflicts in American life, the rise of Donald Trump has both clarified and exacerbated the fault lines.
If you ask a child to draw a cat, you’ll learn more about the child than you will about cats. In the same way, asking neural networks to generate images helps us see how they reason about the information they’re given. It’s often difficult to interpret neural networks—that is, to relate their functioning to human intuition—and generative algorithms offer a way to make neural nets explain themselves.
Neural networks are most commonly implemented as classifiers—models that are able to distinguish, say, an image of a cat from an image of a dog, or a stop sign from a fire hydrant. But over the last three years, researchers have made astonishing progress in essentially reversing these networks. Through a handful of generative techniques, it’s possible to feed a lot of images into a neural network and then ask for a brand-new image that resembles the ones it’s been shown. Generative AI has turned out to be remarkably good at imitating human creativity at superficial levels.
For the past 12 months of my life, I paid the bargain price of $1,250 per month to sleep diagonally in a bunk bed in a 10ft by 10ft room that I shared with a 32-year-old man. Because I am 6ft 4in, sleeping diagonally in my undersized accommodation was the only way I could make it through the night without getting cramps.
Welcome to my life in the hacker house.
In July last year, I left my home in the comfy suburbs of Washington DC to make the 3,000-mile drive west to San Francisco, with my mother along for the ride. I had just graduated from college that May, and as the cliched story goes, I was in pursuit of the tech dream. I didn’t have a lease, or a job. Because of the high rent in the Bay Area, you usually can’t secure a lease without a job offer, and well, you can’t exactly say the jobs were coming easy. So I just went for it.
Upon reaching Louisville, Kentucky, I received a call from a friend. “You should look up hacker houses,” he said. “It’s a place where a bunch of tech people live to hack and build stuff.”
In “The End of Philosophy and the Task of Thinking” (1964), Heidegger famously takes stock of the present and future of philosophy in the time of cybernetics. “Philosophy is ending in the present age,” he writes. “It has found its place in the scientific attitude of socially active humanity. But the fundamental characteristic of this scientific attitude is its cybernetic, that is, technological character. The need to ask about modern technology is presumably dying out to the same extent that technology more decisively characterizes and directs the appearance of the totality of the world and the position of man in it.” For the late Heidegger, writing near the last decade of his life and well ensconced in his mountain chalet, the rapid technological development of the global north spells an impending doom: the end to philosophical thinking and to a properly authentic relationship to the world. The planetary control apparatuses that we subsume under the sign of “cybernetics” have replaced the traditional role of metaphysics and, thus, usurped philosophy. “Philosophy is metaphysics. Metaphysics thinks beings as a whole—the world, man, God—with respect to Being, with respect to the belonging together of beings in Being.”
Now, for Heidegger, it is cybernetics that thinks the totality. So, new questions are raised. Whither philosophy in the half century since Heidegger announced its death knell? Can philosophy survive the complete digitization of the world? Can metaphysics still have currency in an age of ubiquitous computation?
Mainstream narratives about What Theatre Is and What Video Games Are often mar the perception of these kindred art forms (both derived from a common ground of formalised play). Many non-theatregoers think that theatre is all Shakespeare and Andrew Lloyd-Webber, while many non-gamers assume games are all guns-blazing first-person shooters.
So, in case you’ve never seen anything but adverts for Call of Duty, ‘AAA’ or ‘triple A’ is the word for the blockbuster, billion-dollar industry level of game development. Think ‘Hollywood’. This is by no means a value judgment, rather one of scale and ambition. There are some great, slick action movies, westerns and romantic epics made by Hollywood, and also a lot of by-numbers forgettable stuff, but AAA and Hollywood share a scale of budget, studio size and ambition.
So there is a ‘AAA’ level of the games industry – your Assassin’s Creed, Halo, Final Fantasy, etc. But just as you get Hollywood in film (and the West End in theatre), you also get the indie film industry (and the DIY and fringe performance scenes). That scale of production exists in games, too, which is often overlooked by people just starting to think about games and gaming.
Rebecca Porter and I were strangers, as far as I knew. Facebook, however, thought we might be connected. Her name popped up this summer on my list of “People You May Know,” the social network’s roster of potential new online friends for me.
The People You May Know feature is notorious for its uncanny ability to recognize who you associate with in real life. It has mystified and disconcerted Facebook users by showing them an old boss, a one-night stand, or someone they just ran into on the street.
These friend suggestions go far beyond mundane linking of schoolmates or colleagues. Over the years, I’d been told many weird stories about them, such as when a psychiatrist told me that her patients were being recommended to one another, indirectly outing their medical issues.
What makes the results so unsettling is the range of data sources—location information, activity on other apps, facial recognition on photographs—that Facebook has at its disposal to cross-check its users against one another, in the hopes of keeping them more deeply attached to the site. People generally are aware that Facebook is keeping tabs on who they are and how they use the network, but the depth and persistence of that monitoring is hard to grasp. And People You May Know, or “PYMK” in the company’s internal shorthand, is a black box.
When things get serious in the media space, my friends at the Knight Foundation rally the troops. Last week, I was invited to a workshop Knight held with the Aspen Institute on trust, media and democracy in America. I prepared a whitepaper for the workshop, which I’m publishing here at the suggestion of several of the workshop participants, who found it useful.
The paper I wrote - “Mistrust, efficacy and the new civics: understanding the deep roots of the crisis of faith in journalism” - served two purposes for me. First, it’s a rough outline of the book I’m working on over the next year about mistrust and civics, which means I can pretend that I’ve been working on my book this summer. Second, it let me put certain stakes in the ground for my discussion with my friends at Knight. Conversations about mistrust in journalism have a tendency to focus on the uniqueness of the profession and its critical civic role in the US and in other open societies. I wanted to be clear that I think journalism has a great deal in common with other large institutions that are suffering declines in trust. Yes, the press has come under special scrutiny due to President Trump’s decision to demonize and threaten journalists, but I think mistrust in civic institutions is much broader than mistrust in the press.
Because mistrust is broad-based, press-centric solutions to mistrust are likely to fail. This is a broad civic problem, not a problem of fake news, of fact checking or of listening more to our readers. The shape of civics is changing, and while many citizens have lost confidence in existing institutions, others are finding new ways to participate. The path forward for news media is to help readers be effective civic actors. If news organizations can help make citizens feel powerful, like they can make effective civic change, they’ll develop a strength and loyalty they’ve not felt in years.