Monday, October 22, 2018

On teaching ethics to tech companies

Kara Swisher (who is unafraid to call it like it is!) has a new op-ed in the NYT titled "Who Will Teach Silicon Valley to Be Ethical?". She asks:
How can an industry that, unlike other business sectors, persistently promotes itself as doing good, learn to do that in reality? Do you want to not do harm, or do you want to do good? These are two totally different things. 
And how do you put an official ethical system in place without it seeming like you’re telling everyone how to behave? Who gets to decide those rules anyway, setting a moral path for the industry and — considering tech companies’ enormous power — the world?

There are things that puzzle me about this entire discussion about ethics and tech. It seems like an interesting idea for tech companies to incorporate ethical thinking into their operations. Those of us who work in this space are clamoring for more ethics education for budding technologists.

There is of course the cynical view that this is merely window dressing to make it look like Big Tech (is that a phrase now?) cares without actually having to change their practices.

But let's put that aside for a minute. Suppose we assume that tech companies are indeed (in some shape or form) concerned about the effects of technology on society, and that their leaders do want to do something about it.

What I really don't understand is the idea that we should teach Silicon Valley to be ethical. This seems to play into the overarching narrative that tech companies are trying to do good in the world and slip up because they're not adults yet -- a problem that can be resolved by education that will allow them to be good "citizens" with upstanding moral values.

This seems rather ridiculous. When chemical companies were dumping pesticides on the land by the ton and Rachel Carson wrote Silent Spring, we didn't shake our heads sorrowfully at the companies and send them moral philosophers. We founded the EPA!

When the milk we drink was being adulterated with borax and formaldehyde and all kinds of other horrific additives that Deborah Blum documents so scarily in her new book 'The Poison Squad', we didn't shake our heads sorrowfully at food vendors and ask them to grow up. We passed a law that led eventually to the formation of the FDA.

Tech companies are companies. They are not moral agents, or even immoral agents. They are amoral profit-maximizing vehicles for their shareholders (and this is not even a criticism). Companies are supposed to make money, and do it well. Facebook's stock price didn't slip when it was discovered how their systems had been manipulated for propaganda. It slipped when they proposed changes to their newsfeed ratings mechanisms to address these issues.

It makes no sense to rely on tech companies to police themselves, and to his credit, Brad Smith of Microsoft made exactly this point in a recent post on face recognition systems. Regulation, policing, and whatever else we might imagine have to come from the outside. While I don't claim that regulation mechanisms all work as they are currently conceived, the very idea of checks and balances seems more robust than merely hoping that tech companies will get their act together on their own.

Don't get me wrong. It's not even clear what has to be regulated here. Unlike with poisoned food or toxic chemicals, it's not clear how to handle poisonous speech or toxic propaganda. And that's a real discussion we need to have.

But let's not buy into Silicon Valley's internal hype about "doing good". Even Google has dropped its "Don't be evil" credo.

Thursday, October 11, 2018

Google's analysis of the dilemma of free speech vs hate speech

Breitbart just acquired a leaked copy of an internal Google doc taking a cold hard look at the problems of free speech, fake news, and censorship in the current era. I wrote a tweet storm about it, but also wanted to preserve it here because tweets, once off the TL, cease to exist.

Breitbart acquired an internal google doc discussing the misinformation landscape that the world finds itself in now: … 
I almost wish that Google had put out this document to read in public. It's a well-thought-out exploration of the challenges faced by all of us in dealing with information dissemination, fake news, censorship, and the like. And to my surprise, it is (mostly) willing to point fingers back at Google and other tech companies for their role in it (although there are some glaring omissions, like the building of the new censored search tool in China). It's not surprising that people inside Google are thinking carefully about these issues, even as they flail around in public. And the analysis is comprehensive without attempting to provide glib solutions.

Obviously, since this is a doc generated within Google, the space of solutions is circumscribed to those that have tech as a major player. For example, the idea of publicly run social media isn't really on the table, nor are better ways to decentralize value assignment for news, or alternate models for search that don't require a business model. But with those caveats in mind, the analysis of the problems is reasonable.

Monday, October 08, 2018

A new sexual harassment policy for TCS conferences

One of my most visited posts is the anonymous post by a theoryCS colleague describing her own #metoo moments inside the TCS conference circuit. It was a brutal and horrific story to read.

Concurrently (I don't know if the blog post had an effect, but one can but hope it helped push things along), a committee was set up under the auspices of TCMF (FOCS), ACM, SIAM, and EATCS to
Draft a proposal for joint ToC measures to combat discrimination, harassment, bullying, and retaliation, and all matters of ethics that might relate to that.
That committee has now completed its work, and a final report is available. The report was also endorsed at the FOCS business meeting this week. The report is short, and you should read it. The main takeaways/recommendations are that every conference should
  • adopt a code of conduct and post it clearly;
  • recruit and train a group of advocates to provide confidential support to those facing problems at a conference;
  • have mechanisms for authors to declare a conflict of interest without needing to be openly specific about the reasons.
There are many useful references in the report, as well as more concrete suggestions about how to implement the above recommendations. This committee was put together fast, and generated a very useful report quickly. Well done!