- Priyanka Vaidyanath
Algorithm Equality and International Law: An Incongruous Reality
Gender Bias – A Primer
Gender bias has scourged our society since the dawn of civilization. Gender biases are prejudices against one gender by another, expressed through thought or action: perceiving one gender to be unequal to another in terms of rights and dignity. Gender bias also includes favouritism of one gender over the other. Gender stereotypes have affected the public, private and domestic lives of women, built on the perception that women are the weaker gender. Gender bias is not seen only between men and women but against other genders as well; the current article, however, is limited to the bias between men and women.
There needs to be collective action and cooperation to ensure that women enjoy human rights and dignity in every corner of the world. In today's globalized, borderless world, the first step in rooting out bias in any domain falls upon international law and international institutions, which must strive towards equality. International law has always been perceived as more masculine than feminine, owing to the higher representation of men among the subjects of international law. The World's Women statistics reveal that women hold only 28% of managerial positions, the same proportion as in 1995, and fewer than 25% of parliamentary seats globally, once again highlighting the under-representation of women in decision-making processes.
Algorithm Bias – A Present-Day Danger
Algorithm bias is a defect in a machine-learning system arising from flawed training data sets or from one-sided assumptions made during the development of the algorithm. Gender bias has persisted in society for ages. Though there are international and regional instruments striving to bring equality between the genders, they appear incompetent in the face of this new 'algorithmic bias'. We are currently witnessing the fourth industrial revolution, driven by data and algorithmic systems, which are used extensively and intimately in public and private spaces. Algorithm-driven 'artificial agents' such as Alexa, Siri and Watson are used in decision-making processes in different spheres of life. Gender bias against women has pervaded AI algorithms to such a depth that a system identifies the pronoun 'he' easily but fails to do the same with 'hers'. Gender bias is one among the many biases (racial bias, minority bias) to which an algorithm can be subject. Artificial intelligence and machine learning rely on huge data sets from which algorithms predict outcomes. It must not be forgotten that the training of data, the labelling of data and the modelling techniques are all human work at the primary level, and they shape the algorithm's decision-making and thereby its outcomes. Statistics from the World Economic Forum reveal a shocking gender gap among AI professionals: 78% are male and only 22% female. The effect of such bias in algorithms is not merely a matter of theoretical academic work; it has had an impact in the real world.
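To make the mechanism concrete, the minimal Python sketch below (a purely hypothetical illustration, not any company's actual system) shows how a simple classifier trained on historically skewed hiring data reproduces that skew, favouring a male candidate over an equally skilled female candidate.

```python
# Illustrative sketch only: a model trained on historically biased hiring data
# learns to reproduce that bias. All data here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

gender = rng.integers(0, 2, n)          # 1 = male, 0 = female (synthetic)
skill = rng.normal(0, 1, n)             # a neutral qualification score

# Hypothetical past decisions: skill matters, but men were favoured historically.
hired = (skill + 1.5 * gender + rng.normal(0, 1, n)) > 1.0

model = LogisticRegression().fit(np.column_stack([gender, skill]), hired)

# Two candidates with identical skill, differing only in the gender feature.
print("P(hire | female):", model.predict_proba([[0, 1.0]])[0, 1])
print("P(hire | male):  ", model.predict_proba([[1, 1.0]])[0, 1])
# The male candidate receives a markedly higher probability purely because the
# historical bias was encoded in the training data.
```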
The recruiting tool used by Amazon illustrates AI bias: the algorithm-driven system shortlisted male candidates' résumés based on past recruiting patterns and outright discarded applications that referenced women. While society seems to be looking forward to reducing gender discrimination, a Google employee, James Damore, defended the company's gender wage gap by citing supposed psychological differences between men and women. The Berkeley Haas Center for Equity, Gender and Leadership has compiled a report on occurrences of algorithmic bias, of which 44.2% reveal gender bias. A few instances of gender bias illustrated in the report are: the algorithm responsible for assessing creditworthiness for the Apple Card offered women lower credit limits than men; and Gender Shades evaluated gender classification by AI at three big tech companies – Microsoft, IBM and Face++ – finding that accuracy for female subjects was 89.3% (Microsoft), 79.7% (IBM) and 78.7% (Face++), while accuracy for male subjects was 97.4%, 94.4% and 99.3% respectively. The word embeddings trained on Google News articles and used in natural language processing and machine learning encode significantly gendered associations with particular words. Research conducted using AdFisher, a tool for studying Google's ads, revealed that a woman would come across fewer ads for high-paying executive jobs than a man. Algorithm bias in the health-care system poses a higher risk to women's health than to men's. There is a health gap between men and women: men appear in illness data more often than women, which does not mean that women do not fall ill. Data recording higher illness in men may be misread by an algorithm, which then fails to diagnose a similar health issue in a woman. For instance, since 'a heart attack is more common in men than women', a gender-biased data set regarding the symptoms of a heart attack may misdiagnose a woman's symptoms, pushing her into greater harm.
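The disparity documented by Gender Shades is, at its core, a difference in accuracy between subgroups, something any institution can audit. The short sketch below shows one way such a subgroup audit could be written; the labels, predictions and group tags are hypothetical stand-ins, not data from the study.

```python
# A minimal subgroup-accuracy audit in the spirit of Gender Shades: compute the
# classifier's accuracy separately for female and male subjects and report the gap.
import numpy as np

def subgroup_accuracy(y_true, y_pred, group):
    """Accuracy of predictions within each group label."""
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    return {str(g): float((y_pred[group == g] == y_true[group == g]).mean())
            for g in np.unique(group)}

# Hypothetical evaluation set: ground truth, model predictions, subject gender.
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 0, 1]
group  = ["F", "F", "F", "F", "F", "M", "M", "M", "M", "M"]

acc = subgroup_accuracy(y_true, y_pred, group)
print(acc)                                          # {'F': 0.8, 'M': 0.6}
print("accuracy gap:", abs(acc["F"] - acc["M"]))    # roughly 0.2
```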
Accepting that algorithmic gender bias is the new-age problem the world faces right now, it is imperative to examine the existing 'social data'. Inherent social and historical inequalities have sneaked into algorithms through the data entered by individuals. Under-represented communities such as racial minorities, minority women and ethnic minorities are marginalized and prone to more discrimination than ever owing to insufficient data. Caroline Criado Perez, in her book 'Invisible Women', has produced astonishing research exposing the data bias and gender differentiation that have persisted for ages. The world has evolved with a persistent gender gap and gendered pronouns; one can imagine the consequences of feeding such biased data into AI systems.
International Law
It has been almost 73 years since the Universal Declaration of Human Rights echoed the need for equal rights between men and women, followed by the International Bill of Rights for Women (CEDAW) in 1979 to eliminate discrimination against women, yet humanity is nowhere closer to equality between the genders. As society progresses with advanced science and technology to ease everyday life, it is also enabling newer forms of gender discrimination. 'Women's rights are human rights' is a painful contention to have to make in the 21st century, after a years-long struggle for equality between the genders. Reviewing the consequences of algorithmic gender bias stated above, rights such as the right to life, the right to health and medical care, and the right to equal employment opportunity clearly appear to be violated. The international community has come together to eliminate all forms of discrimination against women, but in vain.
This is not to discount the initiatives by international actors to address the ethical issues in AI, but they are yet to address gender bias. The Montreal Declaration for the Responsible Development of Artificial Intelligence, which took up the responsibility of developing 'responsible' AI, makes no mention of gender bias or the need for gender equality. AI4People's Ethical Framework, which attempts to bring about a "good AI society", fails to include a gender perspective in its recommendations. UNESCO has drafted the first Recommendation on the Ethics of AI, concluded after an intergovernmental meeting, which addresses 'Gender' as one of its policy areas. Further, UNESCO has also convened a global dialogue on AI and gender equality, recommending the inclusion of gender equality in the principles of AI. The dialogue focused on the practical aspects of operationalizing the inclusion of gender equality in AI principles, which has to be done at the technical level as well as by building a sense of gender inclusion through education and societal awareness.
Owing to the spurt in the use of AI, the OECD adopted an intergovernmental policy standard on AI with an inclusive view to respecting and fostering human rights and democratic values. The recommendation aims to bridge the gap between the genders with a view to creating growth, well-being and sustainable development.[1] The G20 Ministerial Statement on Trade and Digital Economy 2019 embodies the same principles, seeking to ensure trustworthy AI and thereby reduce the gender gap in such systems. Likewise, Article 22 of the GDPR envisages the right not to be subject to an automated decision which produces a legal effect on, or similarly significantly affects, 'him or her'. This right comes with exceptions where the decision is necessary for a contract, is authorized by Union or Member State law, or where the individual has given explicit consent to such use of data.
Way Ahead
Aware of the elements of masculinity in international law and international relations, where concepts such as 'state', 'security' and 'military' are perceived to be masculine, there is a dearth of strong platforms to further the cause of gender equality. Goal 5 of the UN Sustainable Development Goals is one of the agendas catering to gender equality. There is a dire need to frame a strong ethical AI policy and to define the limits and structures within which algorithms may operate. All institutions that use algorithms in their decision-making processes must adopt 'bias evaluation' to make good the inherent gender bias existing in society. The data fed into these systems can impact women in areas known and beyond, underscoring the need for consciously gender-neutral data. Technology itself can be utilized to find existing (un)conscious bias before a system is put to use in decision-making. Prevention is better than cure!
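By way of illustration of what such a 'bias evaluation' might look like in practice, the sketch below applies the four-fifths (80%) rule, borrowed from employment-discrimination practice, to a model's favourable-decision rates for women and men. The figures and the threshold are assumptions chosen for demonstration, not a settled legal standard for AI.

```python
# One possible pre-deployment bias check: compare the rates of favourable
# decisions for two groups and flag the system if the ratio falls below 0.8
# (the 'four-fifths rule'). All numbers here are hypothetical.
def selection_rate(decisions):
    return sum(decisions) / len(decisions)

def disparate_impact(decisions_women, decisions_men, threshold=0.8):
    """Ratio of the lower selection rate to the higher, and whether it breaches the threshold."""
    r_w, r_m = selection_rate(decisions_women), selection_rate(decisions_men)
    ratio = min(r_w, r_m) / max(r_w, r_m)
    return ratio, ratio < threshold

# Hypothetical model outputs: 1 = favourable decision (e.g. shortlisted), 0 = not.
women = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]   # 30% favourable
men   = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]   # 60% favourable

ratio, flagged = disparate_impact(women, men)
print(f"impact ratio = {ratio:.2f}, review required: {flagged}")
# -> impact ratio = 0.50, review required: True
```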
Algorithms have already seeped into different layers of society, and gender discrimination has seeped in with them. Women have been struggling to find a place and position equal to that of men through various means: international cooperation, national and regional legislation, the work of NGOs, women's education, women's empowerment and more. It is better late than never; training data to be gender-neutral at the nascent stage may reduce the intensity of gender discrimination in the future. Big-tech companies such as Google, Facebook and Twitter owe a huge corporate responsibility in bridging the gap between genders, given the impact of their platforms on people's daily lives. Some 4.8 billion people use social media, highlighting the reach that these companies have. It is here that international law has to proactively accommodate and deliberate upon policies with which corporate bodies must comply to ensure gender neutrality. When there are such discrepancies between the binary genders, the harm that algorithms can cause to non-binary genders is beyond imagination. Though we are in the 21st century, a sense of inclusiveness across genders is yet to be achieved. New and advanced technology should only benefit civilization by leading it towards an unbiased, equal world, not a step behind.
[1] OECD, Recommendation of the Council on Artificial Intelligence, Section 1.1: Principles for responsible stewardship of trustworthy AI.
Priyanka Vaidyanath has completed her B.A. LL.B. and LL.M. and is currently pursuing her PhD at Christ (Deemed to be) University. Her research focuses on the intersection of artificial intelligence and international humanitarian law. Her areas of interest include international law, technology law and cyber law.
Image: Credit to Medium