The Saga of Gendered Endangerment in Digital Spaces
Societal structures have maintained their biases against women since time immemorial. Traditions of dehumanisation and humiliation have served this purpose, most prominently through shaming, through comparison with “upheld individuals and ideals”, through the morphing of images, and through revenge pornography. For the existing authority and system to stay intact, a woman’s autonomy becomes provocative, and her visibility a form of power that must be taken away at the earliest. Artificial Intelligence is now making itself comfortable within this old pattern of power dynamics, one maintained and evolved to suit each scientific advance, continuing the saga most visibly in the growing use and facilitation of deepfakes: an estimated 99% of deepfake targets are women, and 90–95% of deepfakes are non-consensual, produced on platforms built mostly by men to target women’s bodies specifically. These images can be replicated endlessly, shared, and stored on private devices, making them ever harder to locate and remove, and causing harm to women’s physical, sexual, psychological, social, political, economic and other rights and freedoms. Laura Bates, a feminist activist and author, has argued that the best way to address this risk is “to recognise that the online-offline divide is an illusion.”
According to a survey reported by The Economist, 38% of women have personally experienced this form of violence, and 85% have witnessed digital violence against others. The speed and anonymity of these platforms let perpetrators act with near impunity, while victims’ limited understanding of the methods and procedures involved leaves them with little recourse. This is an overwhelmingly gendered issue: AI platforms provide easier access and tools, making digital abuse faster, more targeted, and harder to detect.
One sustainable method for preventing such exploitation, among others, has been to recruit more women as researchers and builders of technology and to work with women-centred organisations. But this approach faces its own gender gap: according to a study published by Harvard Business School, women adopt AI tools at a rate almost 25% lower than men on average, a pattern compounded by other societally developed biases. With less engagement from women on these platforms, AI will continue to develop in ways that retain these inequalities, widening the gap in how it responds to particular genders. It is therefore especially important for women to take on these roles and be represented in this field, and for victims to be given space to speak up rather than being shamed or told to handle it quietly. We need voices that bring attention to the issue and shape policies that safeguard vulnerable groups. Development should be inclusive, with space for everyone to exist.
Sources:
https://www.thenewsminute.com/news/grok-bikini-prompts-and-the-casual-dehumanisation-of-women
