Greg M. Epstein is the Humanist Chaplain at Harvard and MIT, and the author of the New York Times bestselling book Good Without God.
Described as "a godfather to the [humanist] movement" by The New York Times Magazine in recognition of his efforts to build inclusive, inspiring, and ethical communities for the nonreligious and allies, Greg was also named one of the top faith and moral leaders in the United States by Faithful Internet, a project of the United Church of Christ and the Stanford Law School Center for Internet and Society.
In June, TechCrunch Ethicist in Residence Greg M. Epstein attended EmTech Next, a conference organized by the MIT Technology Review.
The conference, which took place at MIT's famous Media Lab, examined how AI and robotics are changing the future of work.

Greg's essay, "Will the Future of Work Be Ethical?" reflects on his experiences at the conference, which produced what he calls "a religious crisis, despite the fact that I am not just a confirmed atheist but a professional one as well."
In it, Greg explores themes of inequality, inclusion and what it means to work in technology ethically, within a capitalist system and market economy.

Accompanying the story for Extra Crunch are a series of in-depth interviews Greg conducted around the conference, with scholars, journalists, founders and attendees. Below he speaks to two key organizers: Gideon Lichfield, the editor in chief of the MIT Technology Review, and Karen Hao, its artificial intelligence reporter.
Lichfield led the creative process of choosing speakers and framing panels and discussions at the EmTech Next conference, and both Lichfield and Hao spoke and moderated key discussions.

Gideon Lichfield is the editor in chief at MIT Technology Review. Image via MIT Technology Review

Greg Epstein: I want to first understand how you see your job: what impact are you really looking to have?

Gideon Lichfield: I frame this as an aspiration.
Most of the tech journalism, most of the tech media industry that exists, is born in some way of the era just before the dot-com boom, when there was a lot of optimism about technology.
And so I saw its role as being to talk about everything that technology makes possible.
Sometimes in a very negative sense.
More often in a positive sense.
You know, all the wonderful ways in which tech will change our lives.
So there was a lot of cheerleading in those days.

In more recent years, there has been a lot of backlash, a lot of fear, a lot of dystopia, a lot of all of the ways in which tech is threatening us.
The way I've formulated the mission for Tech Review would be to say, technology is a human activity.
It's not good or bad inherently.
It's what we make of it.

The way that we get technology that has fewer toxic effects and more beneficial ones is for the people who build it, use it, and regulate it to make well-informed decisions about it, and for them to understand each other better.
And I said the role of a tech publication like Tech Review, one that is under a university like MIT, probably uniquely among tech publications, we're positioned to make that our job.
To try to influence those people by informing them better and instigating conversations among them.
And that's part of the reason we do events like this.
So that ultimately better decisions get taken and technology has more beneficial effects.
So that's like the high-level aspiration.
How do we measure that day to day? That's an ongoing question.
But that's the goal.

Yeah, I mean, I would imagine you measure it qualitatively. In the sense that what I see when I look at a conference like this is, I see an editorial vision, right? I mean, I'm imagining that you and your staff have a lot of, sort of, editorial meetings where you set, you know, what are the key themes that we really need to explore. What do we need to inform people about, right?

Yes.

What do you want people to take away from this conference then?

A lot of the people in the audience work at medium and large companies. And they're thinking about, what effect are automation and AI going to have in their companies? How should it affect their workplace culture? How should it affect their high-end decisions? How should it affect their technology investments? And I think the goal for me is, or for us is, that they come away from this conference with a rounded picture of the different factors that can play a role.

There are no clear answers.
But they ought to be able to think in an informed and nuanced way.
If we're talking about automating some processes, or contracting out more of what we do to a gig-work-style platform, or different ways we might train people on our workforce or help them adapt to new job opportunities, or if we're thinking about laying people off versus retraining them.
All of the different implications that that has, and all the decisions you can take around that, we want them to think about that in a useful way so that they can take those decisions well.

You're already speaking, as you said, to a lot of the people who are winning, and who are here getting themselves more educated and therefore more likely to just continue to win. How do you weigh where to push them to fundamentally change the way they do things, versus getting them to incrementally change?

That's an interesting question. I don't know that we can push people to fundamentally change. We're not a labor movement. What we can do is put people from labor movements in front of them and have those people speak to them and say, "Hey, these are the consequences that the decisions you're taking are having on the people we represent."
Part of the difficulty with this conversation has been that it has been taking place, up till now, mainly among the people who understand the technology and its consequences.
Which was the people building it and then a small group of scholars studying it.
Over the last two or three years I've gone to conferences like ours and other people's, where issues of technology ethics are being discussed.
Initially it really was only the tech people and the business people who were there.
And now youre starting to see more representation.
From labor, from community organizations, from minority groups.
But it's taken a while, I think, for the understanding of those issues to percolate, and then for people in those organizations to take on the cause and say, yeah, this is something we have to care about.

In some ways this is a tech ethics conference. If you labeled it as such, would that dramatically affect the attendance? Would you get fewer of the actual business people to come to a tech ethics conference rather than a conference that's about tech but that happened to take on ethical issues?

Yeah, because I think they would say it's not for them.

Right.

Business people want to know, what are the risks to me? What are the opportunities for me? What are the things I need to think about to stay ahead of the game? The case we can make is [that] ethical considerations are part of that calculus.
You have to think about what are the risks going to be to you of, you know, getting rid of all your workforce and relying on contract workers.
What does that do to those workers and how does that play back in terms of a risk to you?

Yes, you've got Mary Gray, Charles Isbell, and others here with serious ethical messages. What about the idea of giving back versus taking less? There was an L.A. Times op-ed recently, by Joseph Menn, about how it's time for tech to give back.
It talked about how 20% of Harvard Law grads go into public service after graduation, but if you look at engineering graduates, the percentage is smaller than that.
But even going beyond that perspective, Anand Giridharadas, popular author and critic of contemporary capitalism, might say that while we like to talk about giving back, what is really important is for big tech to take less.
In other words: pay more taxes.
Break up their companies so they're not monopolies.
Maybe pay taxes on robots, that sort of thing.
What's your perspective?

I don't have a view on either of those things. I think the interesting question is really, what can motivate tech companies, what can motivate anybody who's winning a lot in this economy, to either give back or take less? It's about what causes people who are benefiting from the current situation to feel they need to also ensure other people are benefiting.

Maybe one way to talk about this is to raise a question I've seen you raise: what the hell is tech ethics, anyway?

I would say there isn't a tech ethics.
Not in the philosophy sense your background is from.
There is a movement.
There is a set of questions around it, around what technology companies' responsibility should be. And there's a movement to try to answer those questions.

A bunch of the technologies that have emerged in the last couple of decades were thought of as being good, as being beneficial.
Mainly because they were thought of as being democratizing.
And there was this very naïve Western viewpoint that said if we put technology and power in the hands of the people they will necessarily do wise and good things with it.
And that will benefit everybody.

And these technologies, including the web, social media, smartphones, you could include digital cameras, you could include consumer genetic testing, all things that put a lot more power in the hands of the people, have turned out to be capable of having toxic effects as well. That took everybody by surprise.
And the reason that has raised a conversation around tech ethics is that it also happens that a lot of those technologies are ones in which the nature of the technology favors the emergence of a dominant player.
Because of network effects or because they require lots of data.
And so the conversation has been, what is the responsibility of that dominant player to design the technology in such a way that it has fewer of these harmful effects? And that again is partly because the forces that in the past might have constrained those effects, or imposed rules, are not moving fast enough.
It's the tech makers who understand this stuff.
Policy makers and civil society have been slower to catch up to what the effects are.
They're starting to now.

This is what you are seeing now in the election campaign: a lot of the leading candidates have platforms that are about the use of technology and about breaking up big tech. That would have been unthinkable a year or two ago.

So the discussion about tech ethics is essentially saying these companies grew too fast, too quickly.
What is their responsibility to slow themselves down before everybody else catches up?

Another piece that interests me is how sometimes the giving back, the generosity of big tech companies or tech billionaires, or whatever it is, can end up being a smokescreen.
A way to ultimately persuade people not to regulate.
Not to take their own power back as a people.
Is there a level of tech generosity that is actually harmful in that sense?

I suppose.
It depends on the context.
If all that's happening is corporate social responsibility drives that involve dropping money into different places, but there isn't any consideration of the consequences of the technology itself that those companies are building and their other actions, then sure, it's a problem.
But it's also hard to say giving billions of dollars to a particular cause is bad, unless what is happening is that the government is then shirking its responsibility to fund those causes because it's coming out of the private sector.
I can certainly see the U.S. being particularly susceptible to this dynamic, where government sheds responsibility. But I don't think we're necessarily there yet.