Ramapo’s Gross Center hosts AI and antisemitism talk

The Gross Center for Holocaust and Genocide Studies hosted a talk titled “Artificial Intelligence (AI) and Antisemitism” last Wednesday. The talk was part of a five-part webinar series on the relationship between antisemitism and AI.

The talk featured author and talk show host Edwin Black, who shared his own views on antisemitism and AI. Black began the event by dedicating the talk to his father, a Polish Jewish man who fought in World War II.

After holding a moment of silence for his father and for those who have dealt with antisemitism, Black opened the talk by explaining that AI isn’t actually “thinking” and that AI “is artificial but not really intelligent.” 

Black went on to describe the link between antisemitism and computerized technology, citing the use of punch cards and barcodes in 1940s-era Germany. 

“If AI is based on the internet and the internet is toxic, AI is toxic,” he said. “If misinformation is a weapon, AI has that weapon mastered.”

He cited AI’s wide reach in distributing information, comparing it to the Ford Motor Company’s distribution of an antisemitic text and to antisemitic radio broadcasts in 1930s- and 1940s-era Germany.

After sharing his ideas on AI, Black opened the discussion to questions from the students and educators in attendance.

Dr. Jacob Ari Labendz, director of the Gross Center, asked Black whether his main concern is that AI may engage in antisemitism on its own or that people may use AI to spread hate. Black responded that he worries AI may be programmed to engage in hate and that the nature of the technology may magnify its effects.

In response, Labendz brought in his own experience on Ramapo’s campus. He mentioned that he is cautious about letting students use AI as a tool but tells them that “any mistakes [the AI] makes are considered [the student’s] mistakes” and that everything must be cited correctly. He also asks any student who uses AI for an assignment to write a paper justifying their use of the tool.

Black then shared his thoughts on Labendz’s approach, calling the use of this type of software “Promethean” and cautioning that any use may open a “Pandora’s box.” He suggested that the servers running AI should be destroyed, though he acknowledged this may not be entirely possible because so many different AI programs have already been created.

Labendz said via email that one of the most significant worries raised in the discussion is that AI could be used as a “force multiplier” for antisemitism and other forms of hate. He also pointed to the devaluing of the human element in education, citing the example of professors being replaced by AI.

“Students today are growing up in a media environment very different than people of other generations,” he stated.

Labendz expressed some hope, however, noting that based on his interactions with students at Ramapo, most are aware of the challenges of social media and electronic communication, including the presence of false and slanted information.

“My faith for the future rests in our capacity for self-awareness,” he stated.

The final two talks in the series, “Artificial Intelligence as a Tool for Holocaust Research” and “Virtually Modeling the Janowska Concentration Camp,” will be held on April 9 and May 7 via Zoom. These discussions aim to bring more information to the conversation, give attendees tools to critically assess media and explore how technology is being used to educate people about the Holocaust.


jgray11@ramapo.edu


Featured photo courtesy of Tara Winstead, Pexels