How Google’s search engine supports conspiracy theorists and hate figures
Simon Fraser University (SFU) Communication Professor Ahmed Al-Rawi's research examines the intersections of political extremism, misinformation and social media. He leads The Disinformation Project at SFU, which examines fake news discourses in Canadian news media and on social media.
He also collaborates with researchers at SFU on the study of abusive language on social media. He is a frequent commentator in the news media, recently discussing how social media fuels support for extremism while contributing to the spread of mis/disinformation.
One of Al-Rawi's recent studies focuses on the way artificial intelligence (AI) reproduces and promotes prejudice, hate and conspiracy online. For a recent article, he collaborated with Postdoctoral Fellow Carmen Celestini, Master's student Nathan Worku, and a PhD candidate.
They looked at the subtitles that Google automatically suggests for 37 known conspiracy theorists and found that, in every case, Google's subtitle was inconsistent with the actor's conspiratorial behaviour.
For example, influential Sandy Hook school shooting denier and conspiracy theorist Alex Jones is listed as "American radio host," and Jerad Miller, a white nationalist responsible for a 2014 Las Vegas shooting, is listed as "American performer."
Al-Rawi stresses that the perceived neutrality of algorithmic search engines like Google is deeply problematic. He argues that giving known conspiracists neutral rather than negative subtitles can mislead the public and amplify extremist views.
We met with Professor Al-Rawi to discuss his work.
Most internet users perceive Google as a neutral search engine. However, your article mentions some of the biases present in these algorithms. Please describe what is happening here.
Yes, this is exactly the point behind writing this paper. When I first looked at these algorithmically produced labels, I felt there was something very wrong with them, so I proposed following a reverse engineering method to explore further. These labels do not receive enough public scrutiny, unlike the case of Facebook and, to a lesser extent, Twitter. I think search engines are exacerbating the problem of disinformation not only because of these labels, but also due to the affordances they offer people in terms of easily searching for and finding disinformation.
If individuals are well known to be conspiracy theorists, why doesn鈥檛 Google identify them as such?
I think it is similar to the issue of social media sites that were very reluctant in the beginning to de-platform controversial users because of the fear of alienating audiences and/or losing revenues.
What are your recommendations for Google? Are policy-makers paying attention?
Due to increasing public and official pressure, social media companies have recently made many changes that make them more active in moderating their sites. I hope the same will soon happen with Google's search features.
What are your recommendations for internet search engine users? How can we be more attuned to the inner workings of the internet?
I think we all need to be critical of our online surroundings, and I encourage everyone to search for other well-known controversial figures to see how Google has labeled them. I think we need more insight into what is known as the black boxes of algorithms, and the often-biased rules they follow. The same applies to better understanding social media platforms by following a similar procedure with regards to what hashtags or keywords are allowed or not on different sites.
We all need to be diligent with the content we read online because we cannot take what we view for granted. It is useful鈥攁nd critical鈥攖o continue to question and challenge our sources of information about the issues we care about.
Read more about Al-Rawi's research in The Conversation Canada.
The Disinformation Project has been made possible in part by the Canadian Department of Heritage.
SFU scholars can reach out to their faculty communications and marketing team for support sharing their work as a news story or on social channels. They can become SFU media experts, pitch an article to The Conversation Canada, or nominate their work for a Scholarly Impact of the Week profile.
SFU's Scholarly Impact of the Week series does not reflect the opinions or viewpoints of the university, but those of the scholars. The timing of articles in the series is chosen weeks or months in advance, based on a published set of criteria. Any correspondence with university or world events at the time of publication is purely coincidental.
For more information, please see SFU's Code of Faculty Ethics and Responsibilities and the statement on academic freedom.