How Artificial Intelligence Enriches Data to Reduce Gender-Based Violence in Jamaica

by Oxfam Canada | February 7, 2024
Image: Green and purple collage depicting two Black women holding protest signs against gender-based violence. At the front, in black and white, a phrase reads, "Using AI to understand gender-based violence in Jamaica."
Collage: Emma Buchanan/Oxfam

Why the WE-Talk project uses AI to study what shapes people's perceptions of gender-based violence in Jamaica.

The WE-Talk project aims to reduce and prevent gender-based violence (GBV) against women, girls, boys and other disadvantaged groups in Jamaica. By raising awareness of harmful norms and practices that contribute to GBV, the project works to shift attitudes and behaviours that perpetuate this type of violence while strengthening the capacity of civil society organizations to address it effectively.

In partnership with the international tech company Quilt AI, the project used artificial intelligence (AI) to analyze how people talk about GBV online. The goal was to gain deeper insights into the influences and information that shape perceptions and attitudes on GBV in Jamaica. The outcome is the first-ever comprehensive dataset on Jamaicans' online discussion of GBV. With this knowledge, WE-Talk will craft appropriate campaigns and activities to tackle the root causes of this type of violence.

Ruth Howard, WE-Talk's program manager, explains the significance of using AI in analyzing Jamaican online discussions about gender-based violence.

What Was the Research?

WE-Talk's research covered different types of gender-based violence, such as domestic violence, intimate partner violence, financial and sexual violence and exploitation, and online violence. Quilt AI used its proprietary tools to analyze 245,000 Google search queries and more than 16,000 individual social media posts made between 2021 and 2023. The company identified trustworthy sources and messaging across social media platforms by analyzing search data, keywords, videos, images, profiles, and posts.
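Quilt AI's tools are proprietary, but the first step in this kind of analysis, sorting posts into GBV-related topics by keyword matching, can be sketched roughly as follows. The categories and keywords below are invented for illustration; they are not the project's actual taxonomy:

```python
# Illustrative sketch: tagging social media posts with GBV-related topics
# via keyword matching. Categories and keywords are hypothetical examples,
# not the actual taxonomy used by Quilt AI or WE-Talk.

TOPIC_KEYWORDS = {
    "domestic violence": ["domestic violence", "abuse at home"],
    "intimate partner violence": ["partner abuse", "ipv"],
    "online violence": ["cyberbullying", "online harassment"],
}

def tag_post(text):
    """Return the list of topic labels whose keywords appear in the post."""
    lowered = text.lower()
    return [topic for topic, keywords in TOPIC_KEYWORDS.items()
            if any(kw in lowered for kw in keywords)]

posts = [
    "New report on domestic violence support services in Kingston",
    "Online harassment of women journalists is getting worse",
]
for post in posts:
    print(post, "->", tag_post(post))
```

Real pipelines go well beyond literal keyword matching (handling spelling variants, Patois terms, and context), but this tag-and-count structure underlies topic prevalence figures of the kind the study reports.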

What Were the Results?

The study identified 11 harmful gender stereotypes and narratives that justify or contribute to GBV. These beliefs prevent survivors from seeking help and make it challenging to develop strategies to prevent GBV. The stereotypes and narratives were classified by prevalence as dominant, emerging, stable, or receding.

Genderless violence is one of the most dominant stereotypes that downplay violence against women. It suggests that GBV isn't a real issue because other types of violence in Jamaica, such as gang disputes, affect men more than women.

The study identified emerging narratives, including 'the invisible matriarch,' 'a child disciplined,' and 'trivializing risk factors.' While these are areas of concern, they also present opportunities for positive change through appropriate interventions.

 

  • The invisible matriarch stereotype is the belief that women hold more financial, professional, and social power than men and, therefore, can't experience GBV.
  • A child disciplined narrative is the belief that corporal punishment is an acceptable means of discipline. 
  • The trivializing risk factors narrative is the belief that GBV risk factors are so ingrained in Jamaican society that they should be ridiculed rather than addressed. 

The sentiment analysis of people's online opinions about gender-based violence shows that some harmful stereotypes attract more negative discussion than others. However, there is also a substantial amount of positive talk, about a third of all analyzed discourse, that challenges these stereotypes, promotes gender equality, and supports marginalized groups. This analysis is helpful because it gives WE-Talk an idea of what kind of messaging may be more effective in changing Jamaican perceptions about GBV.
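The specifics of Quilt AI's sentiment models aren't public, but a common baseline approach, lexicon-based scoring, gives a feel for how posts can be labelled positive, negative, or neutral and a positive share computed. The word lists and example posts below are invented for illustration:

```python
# Minimal lexicon-based sentiment sketch. The word lists are illustrative;
# production sentiment models are far more sophisticated and, as the
# project found, need tuning for local language such as Jamaican Patois.

POSITIVE = {"support", "equality", "respect", "empower", "protect"}
NEGATIVE = {"blame", "deserve", "weak", "shame", "fault"}

def sentiment(text):
    """Classify a post as positive, negative, or neutral by word counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

posts = [
    "we must support survivors and empower women",
    "she was weak and it was her fault",
    "a new policy was announced today",
]
labels = [sentiment(p) for p in posts]
positive_share = labels.count("positive") / len(labels)
print(labels, positive_share)
```

Aggregating labels like this over the full corpus is how a figure such as "about a third of discourse is positive" can be produced.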

The AI analysis identifies journalists, reporters, radio hosts, mainstream music artists, celebrities, meme pages, content creators and comedians as influencers perpetuating harmful stereotypes.

The analysis also indicates that men's lifestyle and self-help influencers, progressive religious leaders and organizations, and influencers in the gender equality space are promoting positive counternarratives. Despite their smaller followings, these influencers play a significant role in counternarrative discussions, and some could be powerful advocates for driving change.

The study reveals which groups of people should be the focus of social media campaigns to change harmful beliefs. 

To combat stereotypes such as genderless violence and trivializing risk factors, targeted efforts toward middle-aged men, younger women, and millennial men are required. The analysis indicates younger women should be the primary focus of healthy sexuality education initiatives. The findings also reveal the project's four target groups — teen moms, youth, older women, and economic dependents — face distinct vulnerabilities and risks of GBV.

This X post is an example of a positive influencer speaking up against the harmful 'child disciplined' stereotype, which suggests that corporal punishment is an acceptable means of discipline.

How Does AI Benefit WE-Talk's Work?

AI-driven data analysis provides unique insights into people's attitudes and beliefs about GBV. The technology can capture the perspectives of a much larger population than traditional research methods such as street-based or household surveys, interviews, or focus groups.

AI readily identified specific online influencers who spread harmful myths that normalize or perpetuate GBV. It also identified trailblazers who are speaking out against gender stereotypes and discrimination.

The findings from the AI analysis will be used to develop a media campaign that will help to move away from gendered notions that condone violence. WE-Talk will use the insights to determine which messaging, social media platforms, and people to target.

AI Limitations to Consider

AI doesn't work in a vacuum. It's crucial to involve human experts familiar with the local context to improve the accuracy of the results. For instance, our project partner, WMW Jamaica, and GBV expert Carol Watson Williams from reThink Social Development, a Jamaican social research organization, worked with Quilt AI to customize the AI algorithms for the country's context. This collaboration involved identifying and detecting cultural terms and phrases in Jamaican Patois and reviewing preliminary findings through a local lens aligned with local values and ethics.

Studying GBV through digital media might lead to a limited view of the problem, because people with greater internet access and higher levels of education are overrepresented in such studies. This can produce an inaccurate picture of who is affected by this form of violence and of how widely the attitudes and beliefs expressed online are shared by the wider population.

AI analytics can help characterize groups of people, but it cannot directly determine someone's age or financial status; it infers these attributes from other signals. Accurately analyzing groups across different backgrounds and identities can therefore be challenging when there isn't enough information available.
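As a toy illustration of such proxy-based guessing, consider inferring an age bracket from indirect profile signals. Every rule and signal below is hypothetical, invented only to show the shape of the inference:

```python
# Hypothetical sketch of proxy-based demographic inference. Age is rarely
# stated directly, so it is guessed from indirect signals; all rules and
# signals here are invented for illustration.

def guess_age_group(profile):
    """Infer a rough age bracket from indirect profile signals."""
    if "graduation_year" in profile:
        # Assume university graduation at roughly age 22.
        approx_age = 2024 - profile["graduation_year"] + 22
        return "18-29" if approx_age < 30 else "30+"
    text = profile.get("recent_post", "").lower()
    if any(phrase in text for phrase in ("no cap", "fr fr")):
        # Slang associated with younger users: a weak, error-prone proxy.
        return "18-29 (low confidence)"
    return "unknown"

print(guess_age_group({"graduation_year": 2020}))
print(guess_age_group({"recent_post": "No cap, that show was wild"}))
print(guess_age_group({}))
```

In practice such proxies are noisy and unevenly available across groups, which is exactly the limitation described above.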

Under Oxfam's and its partners' ethical standards, private online groups and unsecured platforms weren't mined for data. As a result, the analysis may not fully capture the severity of violent discourse towards women, girls, and other minority groups.

Thanks to Our Supporters!

WE-Talk is possible thanks to the financial support of the Government of Canada, provided through Global Affairs Canada, and the generous Canadian public.

Government of Canada logo: "In partnership with Canada."
