Editorial 1 : Bias in the machine
Introduction: Meta’s actions in May 2021 appear to have had an adverse human rights impact on Palestinians. A report commissioned by Meta last year on how its policies harmed the rights of Palestinian Instagram and Facebook users during the attacks on Gaza in 2021 was damning.
More on the news
- An investigation by The Guardian has found that a new feature on WhatsApp (which, like Facebook and Instagram, is owned by Meta) that generates images in response to queries seems to promote anti-Palestinian bias, if not outright bigotry.
- Searches for “Palestinian” and “Palestinian boy” resulted in images of children holding guns.
- In contrast, searching for “Israeli boy” shows children playing sports or smiling, and even for “Israeli army” shows jolly, pious — and unarmed — people in uniform.
- The controversy around AI-generated stickers has not occurred in a vacuum.
- Meta’s social media platforms have been accused of being biased against content from and in support of Palestinians.
What is AI?
- AI is the ability of a computer, or a robot controlled by a computer, to do tasks that are usually done by humans because they require human intelligence and discernment.
- Although there is no AI that can perform the wide variety of tasks an ordinary human can do, some AI can match humans in specific tasks.
Characteristics & Components:
- The ideal characteristic of artificial intelligence is its ability to rationalize and take actions that have the best chance of achieving a specific goal.
- Machine Learning (ML), a subset of AI, allows systems to learn from data and improve without being explicitly programmed.
- Deep Learning (DL) techniques enable this automatic learning through the absorption of huge amounts of unstructured data such as text, images, or video.
What is AI bias?
- AI bias refers to a systematic anomaly in the output produced by a machine learning algorithm.
- Bias in AI occurs when the machine gives consistently different outputs for one group of people compared to another.
- Typically, these biased outputs follow classical societal biases along lines of race, gender, biological sex, nationality or age.
- This may result from prejudiced assumptions made during the algorithm development process or from prejudices in the training data.
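The definition above, that bias means consistently different outputs for one group compared to another, can be made concrete with a simple disparity measure. The following is a minimal Python sketch using toy data; the function name and the example predictions are illustrative assumptions, not drawn from the editorial.

```python
# Hypothetical illustration: measuring group-wise disparity in a model's outputs.
# The predictions and group labels below are toy stand-ins.

def demographic_parity_gap(predictions, groups):
    """Absolute difference in positive-outcome rates between two groups.

    predictions: list of 0/1 model outputs
    groups: parallel list of group labels ("A" or "B")
    """
    rate = {}
    for g in ("A", "B"):
        outputs = [p for p, grp in zip(predictions, groups) if grp == g]
        rate[g] = sum(outputs) / len(outputs)
    return abs(rate["A"] - rate["B"])

# Toy example: group A receives a positive outcome 75% of the time,
# group B only 25% of the time.
preds = [1, 1, 1, 0, 1, 0, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(demographic_parity_gap(preds, groups))  # 0.5
```

A gap near zero means the two groups receive positive outcomes at similar rates; a large gap is the kind of consistent group-wise difference the definition describes.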
What are the types of AI Bias?
- Cognitive bias - These are unconscious errors in thinking that affect individuals’ judgements and decisions.
- These biases can seep into machine learning algorithms either through designers unknowingly introducing them into the model or through a training data set that already includes those biases.
- Lack of complete data - If the data is incomplete, it may not be representative of the population and may therefore include bias.
- It is also difficult to identify the factor that causes a biased output, owing to the ‘black box effect’ in AI.
What can be done to correct these biases?
- For some time now, considerable work has been done around bias in artificial intelligence and machine learning (ML) models.
- Since the programs are amoral, they can reflect — perhaps even enhance — the prejudices in the data used to train them.
- Addressing the prejudices in the machine, then, requires active interventions and even regulation.
- Blind Taste Test Mechanism - It works by checking whether the results produced by an AI system depend on a specific variable such as sex, race, economic status or sexual orientation.
- Open-Source Data Science (OSDS) - Opening the code to a community of developers may reduce the bias in the AI system.
- Human-in-the-Loop systems - These aim to do what neither a human being nor a computer can accomplish on their own.
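The blind taste test described above can be sketched as a dependence check: run the system on the real data, then on the same data with the protected attribute randomly reassigned, and see how often decisions change. The following is a minimal Python illustration; the `score` rule, thresholds, and applicant records are deliberately simplified assumptions, not a real system.

```python
import random

# Hypothetical "blind taste test" sketch: compare a model's decisions on the
# real data against the same data with the protected attribute scrambled.

def score(applicant):
    # A deliberately biased toy rule: income matters, but so does group.
    bonus = 10 if applicant["group"] == "A" else 0
    return applicant["income"] + bonus

def blind_taste_test(applicants, attribute="group", trials=100, seed=0):
    """Return how often decisions flip when `attribute` is randomly reassigned.

    A rate near 0 suggests the outputs do not depend on the attribute.
    """
    rng = random.Random(seed)
    baseline = [score(a) >= 50 for a in applicants]
    changed = 0
    for _ in range(trials):
        values = [a[attribute] for a in applicants]
        rng.shuffle(values)
        masked = [{**a, attribute: v} for a, v in zip(applicants, values)]
        decisions = [score(a) >= 50 for a in masked]
        changed += sum(b != d for b, d in zip(baseline, decisions))
    return changed / (trials * len(applicants))

applicants = [
    {"income": 45, "group": "A"},  # passes only because of the group bonus
    {"income": 60, "group": "B"},
    {"income": 40, "group": "B"},
    {"income": 55, "group": "A"},
]
print(blind_taste_test(applicants))  # > 0, because `score` depends on group
```

Because the toy scoring rule rewards group "A", scrambling the group labels changes some decisions, exposing the dependence; an unbiased system would show a flip rate of zero under the same test.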
Conclusion: No search should paint people, especially children, from an entire community as inherently violent. For all its uses for human beings, AI should not be used to dehumanise so many of them.
Editorial 2 : Delayed, denied
Introduction: A study of over 4 lakh FIRs in Haryana, published recently in the American Political Science Review, demonstrated that the path of legal remedy is not so straightforward for women.
More findings of the study
- It found that cases of violence against women, in which women are the primary complainants, are less likely to be registered and more likely to be dismissed in court or to result in acquittals.
- A gender bias is visible even in other types of cases, from registration to prosecution, resulting in what the researchers call “multi-stage” discrimination.
More evidence to support the bias against women
- Reports and anecdotal evidence have long shown that complaints by women — not just in Haryana, but across India — are less likely to be taken seriously at the level of the police station.
- Women are made to wait longer and are frequently “counselled” to withdraw complaints.
- The inequality persists at every level of the legal process, shored up by the sentiment, often publicly expressed, that women tend to exaggerate their complaints or misuse the law.
- For example, the Madhya Pradesh High Court, commenting recently in Rajan v. The State of Madhya Pradesh on the “misuse” of Section 498A (cruelty to women) of the Indian Penal Code, remarked that wives these days file “a package of five cases” against their husbands and in-laws.
- Similarly, the Calcutta High Court observed in August, in Swapan Kumar Das v State of West Bengal, that Section 498A is used by women to unleash “legal terrorism”.
- As the data from the study shows, not only do such pronouncements have no basis in fact, they further discourage women from seeking justice.
How to solve gender prejudice problem?
- Legal Reforms:
- Review and amend discriminatory laws
- Strengthen anti-discrimination laws
- Fast-track gender-based violence cases
- Introduce quotas and representation
- Sensitize legal professionals
- Awareness and Education:
- Promote legal literacy
- Encourage women to report abuse
- Engage with men and boys
- Support Services:
- Establish women's shelters and support centers
- Legal aid and counseling
- Strengthen the Police:
- Sensitization and training
- Encourage more female officers
- Data Collection and Monitoring:
- Collect gender-disaggregated data
- Monitor the implementation of laws and policies to ensure they are effectively combating gender prejudice.
- Public Awareness:
- Promote public awareness campaigns
- Encourage reporting
- International Agreements:
- Implement international agreements and conventions: Ensure that India's legal system is in compliance with international treaties and agreements related to women's rights, such as the Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW).
Conclusion: Increasing the number of women in the police force (11.7 per cent, as shared by the Minister of State for Home Affairs in the Rajya Sabha earlier this year) should be prioritised, as should greater sensitisation and training at every level of the system, from the thana (police station) to the bench.