

Why effective altruism struggles on sexual misconduct

EA was built on making the world a better place. Can the movement police itself when it comes to sexual harassment?

Kelsey Piper is a senior writer at Future Perfect, Vox’s effective altruism-inspired section on the world’s biggest challenges.

On February 3, Time magazine published an article about sexual harassment experienced by several women at parties, lunches, and events with ties to the effective altruism community — a community we’ve written about at Future Perfect and which I consider myself a part of. The stories in the article are horrifying and infuriating.

The first question you ask when you read stories like these is simply, “What needs to change so this doesn’t happen again?” I’ve read plenty of stories about sexual harassment, and they often have common features: dysfunctional institutions that try to cover up what’s going on, warning signs ignored.

But the problem becomes even more difficult in a community with no obvious hierarchy, or a loose-knit one like effective altruism. EA does have an institution whose mission is to grow and nurture the community, the Centre for Effective Altruism (CEA), but the CEA has authority over only a fraction of what happens under the community’s broader umbrella.

That means that while bad actors can be, and have been, banned from major professional events after complaints, it’s difficult to completely cast them out of the broad, diffuse EA world, which no one organization or individual controls. Where does that leave us?

The Time story also raises a no less urgent question about EA broadly: Does it have a sexual harassment problem? More precisely: Are there qualities or factors inherent in EA that increase the likelihood of such behavior?

These are questions the community has to reckon with. One of the big claims that effective altruism makes is that the stakes in the world we live in are enormous, and that individual choices can save many lives, or meaningfully affect policy on crucial, big questions.

I want that hope to be realized. But for it to come true, there need to be functional processes for weeding out bad actors and systems, formal and informal, that ensure a healthy and safe culture. Power, money, and an environment where trusting, well-intentioned young people mingle in settings with blurred social and professional lines are all risk factors for misconduct, and existing mechanisms don’t seem adequate to address that risk.

Why sexual harassment can be so hard to address in cultures like EA

What do you do if you’re sexually harassed at work? Well, that’s illegal, so you can report it to HR, and if you face retaliation — or if the company doesn’t respond — you can hire a lawyer.

What do you do if you’re sexually harassed at a professional conference? You can report it to the conference organizer, sponsoring organizations, your employer, etc.

These mechanisms aren’t perfect — not everyone can access them, for one thing. But they have one critical thing going for them: They exist, thanks to decades of determined effort by activists and others.

But what do you do if you get sexually harassed in a context that’s halfway social and halfway professional — drinks at a bar after work, which are often portrayed as a necessary part of moving up the corporate ladder but not officially a part of “work”?

Or what about the “tech founder houses” cropping up in San Francisco and Los Angeles, where people network and live together in the hope of getting access to investors, and which, as my Vox colleague Rebecca Jennings reported last year, have in a few cases turned out to be places where sexual abuse and coercion allegedly happened?

To whom do you report it? The media, which can investigate but not litigate? What can you actually do to make sure it doesn’t happen again?

Those are questions that effective altruism needs to answer.

Abuse and misconduct without centralized accountability

I live in San Francisco, where weird niche subcultures are practically normal, and I’ve long found it interesting to watch how these subcultures go through revelations that some members (usually, but not always, men) were engaged in a pattern of abusive behavior.

There are two challenges. One is figuring out what happened. Often, community members trying to unravel these allegations don’t have relevant professional training in running a sound, helpful investigation without further harm to victims. (Before I started at Vox, I participated in various community-based processes for addressing sexual assault and misconduct, not specifically in the EA community but in event spaces and communities I was part of. It left me convinced that most people — including me — simply do not have the skills to do this work, especially unpaid in their free time, in a way that successfully supports victims, protects potential future victims, and arrives at the truth.)

The second is that, even when it’s possible to get to the truth with enough effort, it’s often not obvious within these subcultures precisely whose job it is to get to the truth, or what their duties are once they find it, especially when cases of misconduct occur in informal, nonprofessional contexts.

These are the contexts where there’s no governing authority or formal procedure in place to deal with abuse allegations, or where a respected organization in the space might be able to ban someone from events it hosts, but little more.

The CEA has a community health team that takes reports on misconduct in the community and responds, often by banning people from the professional events and conferences it sponsors. “We try to support local EA organizers in establishing a healthy culture in their group, for example through workshops and other materials about how to prevent and address problems. We try to make it clear that misconduct doesn’t fly in the community, and we can take some action on that (like banning people from CEA-run events based on harmful behavior),” Julia Wise, community liaison for CEA’s community health team, told me.

But the CEA can only take steps to control EA events it specifically oversees, and that may not be enough. In one of the incidents in the Time story, a woman says she was harassed over dinner in New York. What’s the recourse there? Shunning? Blacklisting? Should a centralized authority be established that’s responsible for all community events? Should there be a consultable directory of bad actors available for women to check before making dinner plans?

In the case of EA, as with other subcultures, the rules for being included in the community are unwritten and unclear, which compounds the issue. If you go to someone’s “EA party” — yes, there are EA parties, and EA hobbyist events, and EA group houses — the hosts are just unselected people who decided to call their parties and events and houses that. There’s no EA certification program. This state of affairs can lead people to expect the kind of process-based handling of sexual harassment that’s appropriate to a professional community — except that EA, in its current phase of development, doesn’t provide that.

“We’re available if local organizers want advice on a situation,” Wise told me. “But ultimately we don’t control community members’ decisions, especially when things happen in an informal setting.”

EA and sexual misconduct

Then there’s the community itself. If we look at the bigger picture, there are well-known dynamics that effective altruism needs to address as it grows: its gender and racial imbalances, the degree to which it mixes social and professional lives, and the conflict-of-interest challenges of having a tight-knit community where benefactors socialize with would-be recipients who depend on their largesse.

Perhaps more so than most subcultures, EA straddles the border between being a social subculture and a professional one. I’m appreciative of the EA community’s acceptance of nontraditional romantic relationships — I’m in committed relationships (of six years and nine years) with two women myself — but Time spotlighted that, too, as a potential driver of harassment and misbehavior.

For these reasons, and because of the decentralized, amorphous way EA operates, I fully expect that there are people who haven’t yet spoken out about their experiences, and I hope they will in the weeks ahead.

For the effective altruism community, the stakes couldn’t be higher.

As weird subcultures go, EA does a lot more than most to bring people into the fold, actively persuading young people to steer their careers into EA-priority areas. That means it incurs a special kind of responsibility if people in its professional community experience harassment.

One possible solution is for an organization like the CEA to take on a much more active role in identifying and protecting against bad actors, in a far more expansive sense than banning them from events it runs. “We’re looking into whether we can affect cultural change in the EA community, through education or pushing for better organizational policies,” Wise told me. “We can’t fix this — or any — problem for the community as a whole, but we will continue to try to use our position to shift things in a positive direction.”

But making a large global community without centralized governance safe for everyone is an incredibly hard task, and frankly, I’m not sure the CEA has the resources to do it. And trying and failing to do something like that can leave people mistakenly believing that an environment is policed and protected when it’s not.

Getting this right is particularly urgent because effective altruism is built on the idea that it’s possible to “do good better.” The last year has been full of revelations about the ways the existing EA movement hasn’t always lived up to that — and a reminder of the staggering complexity of doing good at scale.

Money and power attract bad actors, corrode people’s idealism, encourage self-serving narratives, increase the cost of mistakes, and make the institutional flexibility to fix them rare and hard to come by. The sexual harassment conversation is an important test of whether EA is able to grow into the role it wants to have.

A version of this story was initially published in the Future Perfect newsletter.
