Artificial intelligence is beginning to replace many of the workplace roles that men dominate. The parts of those jobs that will have staying power are those that rely more heavily on emotional intelligence — skills in which women typically excel.

Many researchers are reporting, and our research confirms, that artificial intelligence (AI) will reshape our economy — and the roles of workers and leaders along with it. Jobs that don’t disappear will see a significant shift as the tasks that are easily and inexpensively accomplished by robots become automated. The work that remains will very likely focus on relating. To adapt and prosper, the smart worker will invest in “human relating” skills — empathy, compassion, influence, and engagement. For simplicity, let’s call these emotional quotient (EQ) skills. These are skills in which women commonly excel.

Gender differences are a sensitive topic and we address them in this article with trepidation. There is a fine line between understanding commonalities and stereotyping, and the debate about nature versus nurture is robust. But whether you believe that men and women, on average, have different types of brains (as Simon Baron-Cohen, a British clinical psychologist and professor of developmental psychopathology at the University of Cambridge, has theorized) or that gender differences are a result of cultural norms and conditioning (as numerous other studies have explored), the real-world results are similar: Men and women, on average, excel in different dimensions and take on different roles in the workforce. By no means does that suggest that men and women are not equal — just different.

It is clear that men have quite an advantage in the working world — just check out the latest research by McKinsey & Co. on gender equality in the workplace. Men have greater representation in leadership roles and in higher-paid industries, hold nearly 80% of board seats, and earn higher compensation on average, even for the same jobs.

We believe that AI has the ability to help level the playing field. It will do so, we think, by replacing many roles and functions where men typically dominate.

Read more (MIT Sloan)

Fred Turner is one of the world’s leading authorities on Silicon Valley. A professor at Stanford and a former journalist, he has written extensively on the politics and culture of tech. We sat down with him to discuss how Silicon Valley sees itself, and what it means when the tech industry says it wants to save the world.

Let’s start with the idea that technology is always a force for good. This strain of thought is pervasive in Silicon Valley. Where does it come from? What are its origins?

It owes its origins to 1960s communalism. A brief primer on the counterculture: there were actually two countercultures. One, the New Left, did politics to change politics. It was very much focused on institutions, and not really afraid of hierarchy.

The other—and this is where the tech world gets its mojo—is what I've called the New Communalists. Between 1966 and 1973, we had the largest wave of commune building in American history. These people were involved in turning away from politics, away from bureaucracy, and toward a world in which they could change their consciousness. They believed small-scale technologies would help them do that. They wanted to change the world by creating new tools for consciousness transformation.

This is the tradition that drives claims by companies like Google and Facebook that they are making the world a better place by connecting people. It's a kind of connectionist politics. Like the New Communalists, they are imagining a world that’s completely leveled, in which hierarchy has been dissolved. They’re imagining a world that’s fundamentally without politics.

It's worth pointing out that this tradition, at least in the communes, has a terrible legacy. The communes were, ironically, extraordinarily conservative.

When you take away bureaucracy and hierarchy and politics, you take away the ability to negotiate the distribution of resources on explicit terms. And you replace it with charisma, with cool, with shared but unspoken perceptions of power. You replace it with the cultural forces that guide our behavior in the absence of rules.

So suddenly you get these charismatic men running communes—and women in the back having babies and putting bleach in the water to keep people from getting sick. Many of the communes of the 1960s were among the most racially segregated, heteronormative, and authoritarian spaces I've ever looked at.

Read more (Logic)

My dinner-party party piece for many years was to say, “Well, actually, I invented Baileys. You know, Baileys Irish Cream. I did that back in 1973.”

If one of the unfortunate listening group is a woman – and this is based on actual past experience – she is likely to respond something like this: “Oh-my-God. Baileys. My mother absolutely adores it. Did you hear that, Jocasta? This man invented Baileys. It’s unreal. I don’t believe it. He must be terribly rich. Baileys Cream. Wow!”

And it’s not as if these rather posh people really adore Baileys. Or even hold it in the same esteem as, say, an obscure Islay single malt or a fine white burgundy from Meursault. Not a bit of it. They might have respected it years ago but most people of legal drinking age regard Baileys as a bit naff. To my mind, they’d be very wrong.

On December 3rd, 2007, Diageo announced the sale of the billionth bottle of Baileys since it was first introduced in 1973. That’s a thousand million bottles. And they will have sold at least a further 250 million bottles in the decade since then, bringing the total up to something in the area of 1,250,000,000. If we assume that every bottle of Baileys delivered eight generous servings, that suggests that around 10 billion glasses of Baileys have been poured since it all began.
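For anyone checking the back-of-the-envelope sum, here is the arithmetic behind that glass count, taking the figures quoted above (1.25 billion bottles, eight servings per bottle) at face value rather than as independently verified numbers:

```latex
% Back-of-the-envelope estimate using only the figures quoted in the paragraph:
% roughly 1.25 billion bottles sold, assumed eight generous servings per bottle.
\[
  1{,}250{,}000{,}000 \ \text{bottles} \times 8 \ \frac{\text{servings}}{\text{bottle}}
  = 10{,}000{,}000{,}000 \ \text{servings} \approx 10 \ \text{billion glasses}
\]
```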

The initial thought behind Baileys Irish Cream took about 30 seconds. In another 45 minutes the idea was formed. Baileys was like that for me. A decade of experience kicked in and delivered a great idea. It wasn’t as instant as it seemed. This is the story of its creation.

Read more (The Irish Times)

Look, whatever else might have happened in the world this week, it pales into insignificance when compared to THIS. Just enjoy it on a loop; you're welcome. 

Anyway, I'm in quite a good mood today and so am going to try not to ruin it by ranting too much at you. It's Friday! It's the weekend (practically)! This week's Curios contains an uncommon number of excellent links! Oh, ok, fine, everything's still AWFUL, obviously, but manageably so. Sit back, relax, let my words permeate your consciousness like those weird little brain-burrowing worms in Star Trek II: The Wrath of Khan - because what could be nicer than having a whole week's worth of web insinuated into your consciousness on a Friday afternoon? Well, yes, fine, but you probably can't get away with that in the office whereas this can legitimately be timesheeted as 'general internet research' - HAPPY FRIDAY EVERYONE WELCOME TO WEB CURIOS!

Since 1965, British artist Stephen Willats has self-published Control magazine, a seminal forum for artists’ writings on art practice and social organization. With over 150 contributors throughout its 50-year run, Control has drawn on research from cybernetics, advertising theory, and behavioral science to develop models for how artworks operate in dialogue with an audience and society at large. Last year Willats published the 20th issue of Control, in which he continues to pose incisive questions about the ethics of information systems and networked artistic practice that feel more crucial than ever.

Cybernetics was famously defined by Norbert Wiener as “the scientific study of control and communication in the animal and the machine.” The models of feedback that cyberneticians developed were transdisciplinary from the outset, bridging the worlds of computation and engineering with those of design, art, and counterculture.

According to Anthony Hudek, “It is … Control’s function as a self-determining information network, instead of its content, that makes it truly cybernetic”: while being about networks, the magazine also represents a network in itself. Willats’ choice of title, Control, signals this departure from traditional models of editorial authority, seeking instead to develop a conceptual practice determined by the networked relationships of coordinating agents. Artists’ publishing served as a key means of actualizing these ideas. The magazine has always been self-published, self-funded, and free of advertising, while also attaining a broadly international reach.

The interview that follows focuses specifically on Control’s early years, notable for their iconic cover illustrations by designer Dean Bradley. Released between 1965 and 1970, Control’s first issues mark a period when cybernetic ideas resonated broadly within the visual arts, from Jasia Reichardt’s 1968 Cybernetic Serendipity exhibition at the ICA London, to Stewart Brand’s Whole Earth Catalog in California. Willats’ own practice deployed the frameworks that he and his collaborators devised across Control’s pages in a variety of ways, from computer simulations to social and educational projects such as the Centre for Behavioral Art (1972-73). Control is not only a key node within Willats’ body of work; it offers a fascinating toolkit for reconsidering the present status of social hierarchy and networked interaction.

Read more (Avant)

Ever wondered how to use protected images without permission, without paying royalties, or even crediting the creator? The European Union allows you to do exactly that, provided you do so via embedding. Rightly, creators and content providers refuse to accept this legal loophole. However, they are not just dinosaurs failing to embrace progress: this is a serious case of legislation lagging behind.

We are surrounded by hysteria about the future of artificial intelligence and robotics—hysteria about how powerful they will become, how quickly, and what they will do to jobs.

I recently saw a story in MarketWatch that said robots will take half of today’s jobs in 10 to 20 years. It even had a graphic to prove the numbers.

The claims are ludicrous. (I try to maintain professional language, but sometimes …) For instance, the story appears to say that we will go from one million grounds and maintenance workers in the U.S. to only 50,000 in 10 to 20 years, because robots will take over those jobs. How many robots are currently operational in those jobs? Zero. How many realistic demonstrations have there been of robots working in this arena? Zero. Similar stories apply to all the other categories where it is suggested that we will see the end of more than 90 percent of jobs that currently require physical presence at some particular site.

Mistaken predictions lead to fears of things that are not going to happen, whether it’s the wide-scale destruction of jobs, the Singularity, or the advent of AI that has values different from ours and might try to destroy us. We need to push back on these mistakes. But why are people making them? I see seven common reasons.

Read more (MIT Technology Review)

Last year I was working on an article about the tech industry when I decided to interview a software engineer who writes for Quillette under the pseudonym “Gideon Scopes”. Gideon had mentioned to me in passing that he had Asperger’s Syndrome (a mild variant of autism spectrum disorder) and I wanted to find out more about the industry from the point of view of someone who is not neurotypical.

I first asked him when he knew he wanted to work in technology. He told me that he first knew it when he was five. His family got their first home computer and he was transfixed. Later, he came across a brief introduction to the BASIC programming language in a book and proceeded to teach himself to program. He was only seven.

As a child he taught himself programming out of books, mostly alone at home. He told me that his family were not particularly supportive of his hobby. His mother was not happy to see him focus so intently on one interest and viewed his study of programming “as the equivalent of a kid spending too much time watching TV.”

Growing up in suburban New York, he told me that a compiler for a programming language would cost at least $100, and programming books generally cost $40-60 each. His only source of income was a $1 per week allowance, so it would take him a year or two to save for just one item. This was despite the fact that his parents were in a high income bracket, and could have easily provided resources to help him learn. He learned anyway.

Despite his cognitive ability, however, Gideon underperformed early on in his schooling. He thinks it may have been because he experienced the school environment as overly rigid and inflexible, and the work was just not challenging enough to engage him. It wasn’t until he was able to take accelerated math and science classes that his grades reflected his ability.

Fast forward several years, and today Gideon is a successful senior software engineer at a prestigious technology company in New York. He loves his job and he loves where he works. He is grateful that his company values his work rather than how he promotes himself or how he dresses. He feels that the technology industry rewards talent and hard work, and that it is one of the best places for “Aspies” to be. He tells me that the only drawbacks are the occasional bar event (he doesn’t like the noise) and a weird and somewhat rigid political culture.

Read more (Quillette)

In early 2006, I got a call from Chris Kelly, then the chief privacy officer at Facebook, asking if I would be willing to meet with his boss, Mark Zuckerberg. I had been a technology investor for more than two decades, but the meeting was unlike any I had ever had. Mark was only twenty-two. He was facing a difficult decision, Chris said, and wanted advice from an experienced person with no stake in the outcome.

When we met, I began by letting Mark know the perspective I was coming from. Soon, I predicted, he would get a billion-dollar offer to buy Facebook from either Microsoft or Yahoo, and everyone, from the company’s board to the executive staff to Mark’s parents, would advise him to take it. I told Mark that he should turn down any acquisition offer. He had an opportunity to create a uniquely great company if he remained true to his vision. At two years old, Facebook was still years away from its first dollar of profit. It was still mostly limited to students and lacked most of the features we take for granted today. But I was convinced that Mark had created a game-changing platform that would eventually be bigger than Google was at the time. Facebook wasn’t the first social network, but it was the first to combine true identity with scalable technology. I told Mark the market was much bigger than just young people; the real value would come when busy adults, parents and grandparents, joined the network and used it to keep in touch with people they didn’t get to see often.

My little speech only took a few minutes. What ensued was the most painful silence of my professional career. It felt like an hour. Finally, Mark revealed why he had asked to meet with me: Yahoo had made that billion-dollar offer, and everyone was telling him to take it.

It only took a few minutes to help him figure out how to get out of the deal. So began a three-year mentoring relationship. In 2007, Mark offered me a choice between investing and joining the board of Facebook. As a professional investor, I chose the former. We spoke often about a range of issues, culminating in my suggestion that he hire Sheryl Sandberg as chief operating officer, and then my help in recruiting her. (Sheryl had introduced me to Bono in 2000; a few years later, he and I formed Elevation Partners, a private equity firm.) My role as a mentor ended prior to the Facebook IPO, when board members like Marc Andreessen and Peter Thiel took on that role.

Read more (Washington Monthly)

In September 2003, philosophy changed my life. I didn’t realise it then, but hindsight is 20/20. And, looking back, I now realise how important it would become for my future. I was entering secondary school in Portugal, and we were offered some optional classes. We could choose either French or German as our secondary language; I chose German. We were also given a choice between Philosophy and Latin; I went with Philosophy. It’s not that I knew what was in store for me. To be honest, I think I just didn’t feel like learning two new languages in one go.
