
    Christopher Wolf on Cyber Hate and Digital Citizenship

    Holocaust Denial on Trial

    Christopher Wolf is widely recognized as one of the leading American practitioners of privacy and data security law. He is a partner at the Washington D.C. law firm Hogan & Hartson.

    Wolf also has extensive experience combating online hate speech and Holocaust denial. He chaired the Washington board of the Anti-Defamation League (ADL) and is a national ADL leader, serving on their Internet task force. He is also chairman emeritus of the International Network Against Cyber-Hate (INACH), an NGO that coordinates national efforts to fight online hate speech. Wolf has represented the ADL and INACH at conferences in London, Paris, Warsaw, Stockholm, Berlin, Jerusalem, and Herzliya.

    He is interviewed by Dan Leshem, program manager of the Holocaust Denial On Trial website. For more information please visit us at www.hdot.org. This podcast is underwritten by the Jewish Community Federation of San Francisco, the Peninsula, Marin and Sonoma Counties.

    Q. [Leshem]

    Can you please describe your background and how you came to be involved with combating online hate speech?

    A. [Wolf]

    Sure, I’m happy to be speaking with you today on a subject that is really vitally important to society, I think, and especially to the Jewish community. The issue of anti-Semitism and hate speech online really becomes more compelling every day, as there are new technologies that the haters can use to spread their messages.

    I got involved as a result of my involvement with the Anti-Defamation League. I’ve been a lay leader at the ADL since the late 1980s, and as my career at the ADL as a volunteer progressed, I also migrated from being a generalist litigator at a law firm to focusing on the Internet in my legal practice. Currently I chair the privacy and data security practice and do quite a lot of Internet law at the law firm of Hogan & Hartson. My work in this area is a wonderful combination of both my extracurricular and my curricular interests.

    Q. [Leshem]

    Excellent. And what roles do you currently fill with various anti-hate groups?

    A. [Wolf]

    At the Anti-Defamation League I chair the Internet task force that I founded in 1995. I should add that the ADL has been monitoring the issue of the misuse of technology by hate groups since the 1980s, even before the Internet, when hate groups were using dial-up bulletin boards.

    I also am involved with the International Network Against Cyber Hate, INACH. I served as its chair for the last several years, and recently passed that baton to one of the leaders of the group in France. INACH is an NGO based in Amsterdam with about 14 or 15 member NGOs from around the world, all unified in this fight against online hate speech.

    Q. [Leshem]

    And can you briefly describe a little more about what kind of work INACH does, and what kind of work the ADL’s Internet Task Force does? What does it produce?

    A. [Wolf]

    Sure. INACH tries to coordinate the activities of the various member organizations, and I can describe what that is by telling you what the ADL does. The ADL has 24/7 monitoring of the Internet to identify those websites and that content online (and today that means on social networking sites like Facebook and MySpace and YouTube) that constitute hate speech. And we will bring such content to the attention of the hosts, because typically that content violates the terms of service of commercial sites like Facebook, MySpace, and YouTube.

    And we also monitor a lot of these sites because the content can rise to the level of illegal speech, which, given the First Amendment, is a very limited category in the United States, but one that nevertheless exists. If there are direct threats against identifiable individuals, then in that case we’ll report it to law enforcement. We monitor all of this material and report on it so that, quite simply, the lies that are inherent in hate speech can be exposed and countered.

    A lot of the work we do today has to do with cyber-bullying, because much of the hate speech that we encounter online, in addition to what I’ll call the traditional websites of hate groups, has to do with kids bullying other kids and using anti-Semitic, racial, or homophobic epithets, or otherwise engaging in hostile conduct, which can lead later in life to hateful activity.

    Q. [Leshem]

    That’s very interesting. Given your experience in this field, what do you think it is that makes the Internet such fertile ground for messages of violence and hate?

    A. [Wolf]

    Well, maybe the best way to answer that question is to use a kind of case study, and the one I would use is the one involving the 88-year-old man, James von Brunn, who is currently charged with first-degree murder for killing a security guard at the United States Holocaust Memorial Museum here in Washington, where I am.

    Before the Internet, Mr. von Brunn was a self-proclaimed white supremacist, an anti-Semite, and a Holocaust denier. And now, as I say, he stands accused of murder at the Holocaust museum. But in the old days, pre-Internet, he was relegated to using the mail to communicate his rage to his like-minded haters, and the only place for him to have his benighted views applauded was in whatever meeting he might be able to go to in dark alleys, as I like to say.

    And the Internet changed all of that, because like his fellow bigots, Mr. von Brunn found the Internet was a boon to his warped causes. He maintained a website called Holy Western Empire, where he touted and provided excerpts from his book, a book that denied the Holocaust and praised Hitler. And he was found online in chat rooms and on bulletin boards, he had links on his site to other haters’ websites, and he had online a virtual fan club that cheered on his kind of vicious thinking.

    And so we will never know whether his rage would have burned out but for the Internet, but we do know that von Brunn, the murderer, found validation on the Internet for the rage that he harbored, because it allowed him to interact with like-minded people in a way that otherwise might have been very difficult.

    Q. [Leshem]

    And beyond that level of the ease of communication that the Internet provides, do you think that there’s anything… Let me rephrase. One of the things I encountered reading through INACH’s reports, and also some of the material you’ve written for the ADL, is that you and the other member organizations seem to be making a claim that there is something about the way information is transmitted on the Internet that makes it particularly ripe for hate messages. Is that right?

    A. [Wolf]

    Right. And part of that is caused by the anonymity that is possible on the Internet. There are a lot of people who are saying things online where they cannot be held accountable, and the messages cannot be attributed to them. What I’m thinking of when I say that are the comment sections of legitimate websites, like newspapers and elsewhere, where you can register under a pseudonym and express yourself. In the wake of the Bernie Madoff scandal we saw in the Palm Beach Post, for example, incredibly anti-Semitic comments posted by people who wouldn’t stand in a public square and be interviewed or say such mean and hateful things. But online, sitting in the privacy of their homes and hiding behind a shield of anonymity, they feel at ease saying those kinds of things.

    Q. [Leshem]

    And why do you think it is that, as you point out, starting back in the early days of the Internet, when it was just dial-up and there was no graphical interface even, these haters (I hate to use such a colloquial-sounding phrase) were drawn to these tools, that they were sort of the early adopters? We see these kinds of hate groups be early adopters of Facebook, and early adopters of YouTube and Wikipedia, until those sites started changing their policies and really enforcing them.

    A. [Wolf]

    Well, you also see child predators and other people engaged in socially unacceptable, and in the case of child predators illegal, behavior going online because it is a safe place for them to do things that otherwise are not acceptable in our society. So I’m not sure it’s because they are any more technologically adept than right-thinking people. I think it’s because it’s a technology that makes the expression of their hateful ideas easier.

    Q. [Leshem]

    One of the other things I noticed reading over some of the wonderful material you guys have prepared is that antisemitism and Holocaust denial seem to hold a sort of privileged place within the world of online hate speech. Can you explain that a little bit?

    A. [Wolf]

    Well, you know, there’s nothing new under the sun, and antisemitism has been with us through the ages, for thousands of years. And so we’re not surprised to see it migrate online. It is one of the most virulent forms of hate around the world. And while it is socially unacceptable in some circles in the United States, and happily we can say that it is not the epidemic here that it was in Europe at one time, of course, and that it is in the Middle East, it still is a very widespread form of hatred. It’s very easy for people to demonize Jewish people because it’s been done for millennia.

    Q. [Leshem]

    In one of the presentations INACH prepared in 2004, the person who wrote the report on the state of French Internet hate speech refers to the Web as having both “shop windows” of hatred, by which he means the websites themselves, and “back shops” of racism, by which he means the forums on those sites that you already referred to. Do you still think those are the two main ways in which hate is spread online?

    A. [Wolf]

    Well, I would add a third category, which has probably arisen since that report was written, and that’s the proliferation of social networking online, which provides new technological platforms for all kinds of communication, including hate speech. I mentioned earlier the comment sections that you find in newspapers; you find them on other websites as well. There’s a brand new Google application called Sidewiki where people can comment on any website as long as they have that application installed on their Google Toolbar. There are so many new ways for people to express themselves through social networking that it has become yet another channel for hate.

    And again, they do it hiding behind the veil of anonymity, and while I’m a privacy lawyer I think there are limits to the legitimacy of anonymity, particularly when people ought to be held accountable for their speech. You know, I harken back to the days when the ADL fought for a law in Georgia that required Ku Klux Klan members to be unmasked. They were permitted to assemble and to espouse their hate-filled views but they had to identify themselves, if you will. And one wonders whether such a requirement ought to be imposed online so that people are held accountable for what they say and do.

    Q. [Leshem]

    Which raises the question of how much the legal system can help in combating online hate speech. How did they pass such a law regarding the Klan, and how could some of those same legal resources be used today?

    A. [Wolf]

    Well, it’s interesting. That law was upheld by the Georgia Supreme Court, but the United States Supreme Court never had occasion to rule on it under the First Amendment. Our First Amendment is essentially a safe haven for virtually all Internet content, and that’s why at INACH, when there’s a suggestion that websites be shut down, I remind them that the borderless nature of the Internet means that, like chasing cockroaches, squashing one doesn’t solve the problem, because the websites will run under the wall to the United States, cross the border, if you will, and be hosted here.

    So, the law is not a powerful tool. I’m not saying it has no role to play in fighting online hate speech. Countries with speech codes should make sure that those laws are deployed in appropriate cases. But in some cases going after hate mongers makes them martyrs. And I’m thinking of Ernst Zündel and Fredrick Töben, who were prosecuted, and even David Irving. That wasn’t an online situation, but he was made somewhat of a martyr by his prosecution.

    The law is one tool in the fight against online hate speech. It may sound like a cliché, but I really do firmly believe that the best antidote to hate speech is counter speech that exposes hate speech for what it is: deceitful and false. That not only sets the record straight but promotes the values of tolerance and diversity. And to paraphrase a famous Jewish Supreme Court justice, Brandeis, sunlight is still the best disinfectant. It is always better to expose hate to the light of day than to let it fester in darkness. And so we think, at the ADL, that the best answer to bad speech is more speech.

    Q. [Leshem]

    Absolutely. I want to talk about two particular instances of hate speech online, and I’m wondering what your reactions to them are.

    First, I’ve seen that the ADL has several videos of its own on YouTube. I’ve also seen that there are infinitely more videos on YouTube that attack the ADL. In an effort to diminish those attacks, it seems that one strategy the ADL employs is to disallow comments at the end of its videos. Given that this takes away some of the power of sites like YouTube, how could the sunshine policy you were talking about earlier be applied to the types of comments people post to YouTube videos? And further, what is the best way to combat the proliferation of videos that attack the ADL and promote antisemitism?

    A. [Wolf]

    Well, in our monitoring, obviously, we can’t capture all of the videos that are antisemitic. And it’s not just those that attack the ADL that concern us. But we do bring videos to the attention of the abuse folks at YouTube. And in fact the ADL is the resource YouTube uses on its site in the resource center on dealing with problematic speech. We would like to see the day when ISPs and service providers are more proactive in what they do to filter out hate speech. It’s very difficult to define it. For many of us we know it when we see it, but it’s very hard to articulate a standard. But we still think that certain content could be identified and filtered technologically. And we’re hoping that day will come, and we encourage ISPs to think about that. They don’t have a legal obligation to do it; it’s more of an ethical issue than a legal issue.

    Q. [Leshem]

    Just to play devil’s advocate for a moment, do we really want ISPs deciding what content should be filtered or not?

    A. [Wolf]

    Well, good point. Professor Jeffrey Rosen at George Washington University Law School actually wrote a cover story for the New York Times magazine about the staff at Google whose job it is to filter. And the issue was who made them the supreme judge of this sort of thing. And that’s why we embrace the First Amendment and principles of free expression. We don’t want to decide for others, as we don’t want them to decide for us what is appropriate expression.

    But when websites and service providers say that they will not tolerate antisemitic, racist, or homophobic speech, that admonition has to be given teeth, I think. And they can rely on the community and the ADL to report violations, but with the ability, as you pointed out, of people to post very quickly, and to have those posts proliferate, perhaps the ISPs should take a more active role in weeding that kind of content out. And there are some bright lines that can be drawn, I think.

    Q. [Leshem]

    Can you give us any examples?

    A. [Wolf]

    Well, in Europe, for example, the bright lines they’ve decided on are that you can’t display Nazi symbols and you can’t deny the Holocaust. That obviously wouldn’t fly in the United States. But the use of epithets that we all know very well could be filtered out, that kind of language. So, there are ways to do it, I think.

    Q. [Leshem]

    The other instance I wanted to ask you about was the recent controversy over Facebook’s refusal, or foot-dragging approach, when it came to eliminating certain Holocaust denial groups from its site. What position did the ADL take in that argument, and what do you think the right solution is?

    A. [Wolf]

    ADL told Facebook, and I firmly believe it as well, that hosting a Holocaust denial site is, per se, hate speech. Facebook’s position was that they would monitor the contents of the site, and that if it evolved into hate speech they would take some action. We pointed out to them that denying the Holocaust, per se, is hate speech. I should add that MySpace recognizes that principle and does not allow Holocaust denial groups.

    Q. [Leshem]

    And do they filter it themselves, or do they respond quickly when outside groups point such content out?

    A. [Wolf]

    I believe that they filter it themselves.

    Q. [Leshem]

    That’s very interesting. I wonder what the right approach to educating users is. It seems, from your description, that sunshine is the best solution, that what you’re talking about in essence is some kind of education of web users. I know that when I was growing up (I’m in my thirties) there was no web education in school, for instance. But do you envision any type of web education, or what some people call digital literacy, as being part of a solution to this issue?

    A. [Wolf]

    Absolutely. And I think it’s a national disgrace that we don’t have, as a requirement in our elementary and middle school education, courses instructing children on the appropriate use of these modes of communication. We wouldn’t dare let our kids go into bad neighborhoods in the real world, and yet we really don’t do much to help them protect themselves against going into bad neighborhoods online. More fundamentally, there are rules of etiquette, ethics, and morality that are unique to the Internet and that ought to be taught to our kids. The whole issue of downloading and stealing intellectual property ‐ sound recordings and movies, for example ‐ is a problem that’s unique to the Internet. So is the use of anonymity to engage in cyber-bullying, or to migrate from that to hate speech.

    There are only a few school systems in the country that require this kind of education, and as I understand it, the curriculum is quite limited. These tools are the major means of communication not only for young people, but for those of us who are, I guess, known as digital immigrants. They’re digital natives ‐ they grew up with it ‐ and we’re the digital immigrants. And it is absolutely essential that kids get better education about what is, and what is not acceptable behavior online.

    Q. [Leshem]

    Do you know of any resources that you’ve come across that you think are valuable, and that could be used in classrooms to educate students?

    A. [Wolf]

    Actually, the Media Awareness Project in Canada, which is funded in part with Canadian tax dollars, has some very good curricula. And of course the Anti-Defamation League itself has some very good cyber-bullying curriculum. The states of Virginia and Texas, I believe, have some curriculum. I’ve not seen it. But this is something that even at the Federal level ought to be mandated, or at least made available, through the Department of Education. It absolutely is fundamental not only to create a more civil society, but also to empower kids to protect themselves online, so that when they grow up they understand that certain words and behavior are not acceptable.

    Q. [Leshem]

    One of the challenges we face here at the Holocaust Denial on Trial project is that in many ways it is so much easier to tell a lie than it is to tell the truth. And Holocaust deniers have gotten very sophisticated with the design of their websites, with the kind of polished video, very polished audio recordings, and multimedia presentations that have really left behind the roots of Holocaust denial and the Neo-Nazi movement. And we find that, given how much simpler it is to tell the story of a lie, whether in video or any multimedia format, it’s quite challenging to tell the story of the truth. Have you thought about that, or do you have any thoughts on the sort of best practices for combating hate online?

    A. [Wolf]

    Well, all of those points are absolutely valid, and for many years the position of the ADL was not to dignify the lies spread by haters with a response. But because of the proliferation of Holocaust denial, and other forms of antisemitism, we have undertaken to show that that kind of thought is fallacious.

    It’s a lot more work. There is no question about it. It is much harder to disprove something than to prove it. But look at what Prime Minister Netanyahu did recently at the U.N., when he brought the recently disclosed blueprints of Auschwitz. You don’t need those blueprints to prove that the Holocaust happened, but it’s one more piece of evidence. And things that we take for granted perhaps we shouldn’t. I remember visiting Auschwitz when I was in Poland not so long ago, and there were school kids there, one of whom was in tears at the end of the tour, having learned firsthand what the Holocaust was about. And so the antidote to hate speech is counter speech, and it may take more words and more pictures, but we all have an obligation, I think, to engage in it.

    Q. [Leshem]

    Often when students encounter the information on our site for the first time, they will write to us asking how they can respond. I think a lot of people that confront this type of material, once they realize what’s going on, feel very moved to respond. Do you have any suggestions on how people can get involved, or what involvement might mean?

    A. [Wolf]

    Well, we actually have a program at the ADL called Confronting Anti-Semitism. And we have, for college kids, some campus materials on how to deal with antisemitism, anti-Zionism, and anti-Israel sentiments on campus. And I think if you go to the ADL website that material is available.

    Q. [Leshem]

    And do you have anything for cyber-bullying, which you said is quickly becoming a focus?

    A. [Wolf]

    Indeed. Same place: right on the ADL website, www.adl.org.

    Q. [Leshem]

    Okay, wonderful. Well, that’s been very enriching. Those are all the questions I have for you, but I think that our audience will find a lot to ponder and take away from the information you’ve presented.

    A. [Wolf]

    Well, I very much appreciate your giving me the opportunity to speak with you, and I admire all the work that you’ve been doing there in Georgia.

    Q. [Leshem]

    Thank you so much.

    A. [Wolf]

    And nice to talk with you.