Gender, Power, and Risk in the Age of Artificial Intelligence
DOI: https://doi.org/10.82015/NNR.2025.100113

Keywords: Artificial Intelligence, Gender, Algorithmic Bias, Technological Feminism, Surveillance, Power, Risk, Ethics, Intersectionality, Social Justice

Abstract
The advent of Artificial Intelligence (AI) as a pervasive and structuring technology does not occur in a social vacuum but is grafted onto a pre-existing world, marked by profound gender inequalities.
This article explores the conceptual triad of gender, power, and risk within the socio-technical ecosystem of AI, arguing that algorithmic systems are not neutral but are active agents in shaping, reinforcing, and, in rare cases, potentially countering existing power dynamics.
Through a critical analysis spanning the philosophy of technology, media sociology, and feminist studies, the article examines the multiple risks of gendered AI: discriminatory biases embedded in datasets and algorithms, differentiated surveillance and the erosion of agency, and the reproduction of stereotypes in generative models.
Concurrently, it investigates the power asymmetries in the design, development, and governance of AI, which remain dominated by a male, Western-centric technological culture. The article is not limited to diagnosing problems: it also proposes an ethical-political path towards a feminist, intersectional, and pluralistic AI, founded on principles of algorithmic justice, democratic participation, and responsible design. With over 60 bibliographic references, the essay aims to serve as a critical mapping and a tool for reflection for navigating one of the most controversial and decisive territories of our time.
References
Amoore, L. (2020). Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Duke University Press. ISBN: 9781478008420.
Bender, E. M., Gebru, T., McMillan-Major, A., and Shmitchell, S. (2021). On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, 610–623. https://doi.org/10.1145/3442188.3445922.
Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press. ISBN: 9781509526390.
Bianchi, F., Kalluri, P., Durmus, E., Ladhak, F., Cheng, M., Nozza, D., Hashimoto, T., Jurafsky, D., Zou, J., and Caliskan, A. (2023). Easily Accessible Text-to-Image Generation Amplifies Demographic Stereotypes at Large Scale. Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency, 1493–1504. https://doi.org/10.1145/3593013.3594095.
Bolukbasi, T., Chang, K. W., Zou, J. Y., Saligrama, V., and Kalai, A. T. (2016). Man is to computer programmer as woman is to homemaker? Debiasing word embeddings. Advances in Neural Information Processing Systems, 29: 4349–4357. https://arxiv.org/abs/1607.06520.
Browne, S. (2015). Dark Matters: On the Surveillance of Blackness. Duke University Press. ISBN: 9780822359384.
Buolamwini, J., and Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proceedings of Machine Learning Research, 81, 1–15. http://proceedings.mlr.press/v81/buolamwini18a.html.
Butler, J. (1990). Gender Trouble: Feminism and the Subversion of Identity. Routledge. ISBN: 9780415389556.
Cheryan, S., Plaut, V. C., Davies, P. G., and Steele, C. M. (2009). Ambient belonging: How stereotypical cues impact gender participation in computer science. Journal of Personality and Social Psychology, 97(6), 1045–1060. https://doi.org/10.1037/a0016239.
Crawford, K. (2021). Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press. ISBN: 9780300209570.
Crenshaw, K. (1989). Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics. University of Chicago Legal Forum, 1989(1), 139–167. https://chicagounbound.uchicago.edu/uclf/vol1989/iss1/8.
Dastin, J. (2018). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. https://www.reuters.com/article/idUSKCN1MK0AG/.
D’Ignazio, C., and Klein, L. F. (2020). Data Feminism. The MIT Press. ISBN: 9780262044004.
Eubanks, V. (2018). Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin's Press. ISBN: 9781250074317.
European Commission. (2021). Proposal for a Regulation laying down harmonised rules on artificial intelligence (Artificial Intelligence Act). https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52021PC0206.
European Parliament and Council. (2016). Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation). https://eur-lex.europa.eu/eli/reg/2016/679/oj.
Foucault, M. (1976). La volontà di sapere. Storia della sessualità 1 [The Will to Knowledge: The History of Sexuality, Vol. 1]. Feltrinelli. ISBN: 9788807880730.
Gebru, T. (2020). Race and Gender. In The Oxford Handbook of Ethics of AI (pp. 253–274). Oxford University Press. https://doi.org/10.1093/oxfordhb/9780190067397.013.13.
Gilligan, C. (1982). In a Different Voice: Psychological Theory and Women's Development. Harvard University Press. ISBN: 9780674445444.
Google LLC. (2023). 2023 Diversity Annual Report. https://shorturl.at/PQZ7j.
Haraway, D. (1988). Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective. Feminist Studies, 14(3), 575–599. https://doi.org/10.2307/3178066.
Hooks, B. (2000). Feminism Is for Everybody: Passionate Politics. South End Press. ISBN: 9780896086289.
Jobin, A., Ienca, M., and Vayena, E. (2019). The global landscape of AI ethics guidelines. Nature Machine Intelligence, 1(9), 389–399. https://doi.org/10.1038/s42256-019-0088-2.
Lupton, D. (2016). The Quantified Self: A Sociology of Self-Tracking. Polity Press. ISBN: 9780745667852.
Microsoft. (2023). 2023 Diversity & Inclusion Report. https://www.microsoft.com/en-us/diversity/inside-microsoft/annual-report.
Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press. ISBN: 9781479837243.
O’Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown Publishing Group. ISBN: 9780553418811.
Richardson, K. (2015). The Asymmetrical 'Relationship': Parallels Between Prostitution and the Development of Sex Robots. ACM SIGCAS Computers & Society, 45(3), 290–293. https://doi.org/10.1145/2874239.2874281.
Sap, M., Card, D., Gabriel, S., Choi, Y., and Smith, N. A. (2019). The Risk of Racial Bias in Hate Speech Detection. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 1668–1678. https://doi.org/10.18653/v1/P19-1163.
Saxenian, A. (1994). Regional Advantage: Culture and Competition in Silicon Valley and Route 128. Harvard University Press. ISBN: 9780674753402.
Schechner, S., and Secada, M. (2019, February 22). You Give Apps Sensitive Personal Information. Then They Tell Facebook. The Wall Street Journal. https://shorturl.at/Ylr6e.
Selbst, A. D., Boyd, D., Friedler, S. A., Venkatasubramanian, S., and Vertesi, J. (2019). Fairness and Abstraction in Sociotechnical Systems. Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT* '19), 59–68. https://doi.org/10.1145/3287560.3287598.
Stark, L., and Levy, K. (2018). The Surveillant Society of the Digitally Mediated Public Sphere. Surveillance & Society, 16(2), 242–248. https://doi.org/10.24908/ss.v16i2.12944.
Tronto, J. C. (1993). Moral Boundaries: A Political Argument for an Ethic of Care. Routledge. ISBN: 9780415906425.
Vallor, S. (2016). Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting. Oxford University Press. ISBN: 9780190498511.
Wachter, S., Mittelstadt, B., and Floridi, L. (2017). Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation. International Data Privacy Law, 7(2), 76–99. https://doi.org/10.1093/idpl/ipx005.
West, M., Kraut, R., and Chew, H. E. (2019). I'd blush if I could: closing gender divides in digital skills through education. UNESCO. https://unesdoc.unesco.org/ark:/48223/pf0000367416.
World Economic Forum. (2023). Global Gender Gap Report 2023. https://www.weforum.org/publications/global-gender-gap-report-2023/.
Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs. ISBN: 9781610395694.