Research#1

https://en.unesco.org/courier/2020-4/we-must-educate-algorithms
Human society is moving rapidly into the digital age, where data-driven consumption and artificial intelligence profoundly shape how people produce and consume. AI technology that appears objective and neutral has instead reproduced gender discrimination. The questions I raise seek to explore gender imbalance in digital consumption scenarios.
Gender inequality is an issue that objectively exists in every corner of society, and by discussing it here I would like to draw attention to how it surfaces in algorithms.
The interview noted that a certain amount of subjectivity shows up in the choice of words and turns of phrase, even when we believe we are writing a purely factual text. The researchers' approach consisted of dissecting the different stages of “sexist contagion” to identify where the biases enter.
Unlike in other fields, the problem of discrimination here became apparent early on: only three years after these algorithms came into wide use, observant researchers began drawing attention to the biases in specific algorithms.
Research#2
Case Study
“On 20 August 2019, Apple Inc. and Goldman Sachs jointly launched the Apple Card credit card. In November of the same year, American entrepreneur David Heinemeier Hansson took to Twitter to accuse the Apple Card of algorithmic gender discrimination: his credit limit was 20 times his wife's, even though she had the higher credit score and the couple filed joint tax returns. Steve Wozniak likewise noted that his Apple Card credit limit was 10 times his wife's, although the couple shared multiple bank and credit card accounts.
Soon after, a spokesperson for the New York Department of Financial Services announced that the Department would investigate the matter to determine whether the Apple Card algorithm violated New York's anti-discrimination law and to ensure that all consumers are treated equally, regardless of gender.”
In fact, as early as 1972, scholars pointed out that there was clear gender discrimination in the credit market: women were disadvantaged in access to credit even when judged to have the same repayment ability as men. It has also been argued that adverse selection and moral hazard problems in the credit market make women more vulnerable to taste-based discrimination in traditional lending. As a result, female borrowers succeed less often in the conventional credit market and, on average, pay higher borrowing rates.
In 2016, researchers at Jinan University analysed 170,817 orders generated on the online lending platform Renren between March 2012 and December 2014 and found evidence of gender discrimination. A 2018 study from Jiangnan University found the same effect, concluding that women were discriminated against more than men in China's internet lending market. Related research not only reached the same conclusions but also found that women's educational background did nothing to improve their borrowing success rates; the sketch below illustrates the kind of success-rate comparison such studies rest on.
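These findings come down to a simple statistical question: do male and female borrowers have different funding success rates, and is the gap larger than chance would explain? A minimal Python sketch of that comparison follows; the counts are hypothetical stand-ins, not the actual Jinan or Jiangnan data.

```python
from math import sqrt
from scipy.stats import norm

# Hypothetical counts in the spirit of the P2P-lending studies cited above
# (NOT the actual datasets): funded orders out of total orders, by gender.
funded = {"female": 1_850, "male": 2_400}
total = {"female": 10_000, "male": 10_000}

p_f = funded["female"] / total["female"]   # female funding success rate
p_m = funded["male"] / total["male"]       # male funding success rate

# Two-proportion z-test: is the male-female gap in success rates significant?
p_pool = (funded["female"] + funded["male"]) / (total["female"] + total["male"])
se = sqrt(p_pool * (1 - p_pool) * (1 / total["female"] + 1 / total["male"]))
z = (p_m - p_f) / se
p_value = 2 * norm.sf(abs(z))

print(f"success rates: female {p_f:.1%}, male {p_m:.1%}")
print(f"gap {p_m - p_f:.1%}, z = {z:.2f}, p = {p_value:.2g}")
```

A raw gap like this is only suggestive; the studies cited above go further by controlling for repayment ability, education, and other borrower characteristics before attributing the remaining gap to discrimination.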
In summary, there is severe gender discrimination in the financial consumer space, and algorithms built on mathematics are not automatically neutral. This social issue is the focus of the Shanghai Mana Data Technology Development Foundation (hereafter the “Mana Data Foundation”).
Even though women are not inferior to men in repayment ability, educational background, record of honouring contracts, or other key financial indicators, they are still discriminated against by algorithms, because those algorithms inherit the gender discrimination already present in society. This discrimination is hard to detect because of the algorithm's “black box” nature. Even if a woman discovers that she has been discriminated against, she cannot argue with an algorithm that is held up as accurate and objective; the customer service agent will simply tell her that “this is the result of the system, and there is nothing we can do about it”.
An important cause of algorithmic sexism in the financial consumer sector is the lack of a “female perspective” in artificial intelligence innovation, where men dominate the discourse. Returning to the Apple Card case: before any internet product is launched, it undergoes continuous testing to find and eliminate vulnerabilities and to ensure stability. If the Apple Card team had brought a gender perspective to that process and tested the product for gender bias before it went live, the algorithmic sexism would have been easy to spot, as the sketch below illustrates.
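Such a test does not even require access to the model's internals: treat the system as a black box and submit pairs of applications that are identical except for gender. Below is a minimal Python sketch of this counterfactual audit; the model, features, and data are hypothetical stand-ins, not Apple's or Goldman Sachs's actual system.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical historical data: if past credit decisions were biased, the
# recorded limits depend on gender even with income and score held fixed.
n = 5_000
income = rng.normal(60_000, 15_000, n)
score = rng.normal(700, 50, n)
gender = rng.integers(0, 2, n)             # 0 = female, 1 = male (illustrative)
limit = 0.1 * income + 20 * score + 2_000 * gender + rng.normal(0, 500, n)

X = np.column_stack([income, score, gender])
model = GradientBoostingRegressor().fit(X, limit)  # stand-in "credit algorithm"

# Black-box counterfactual audit: flip only the gender field and measure how
# the predicted limit changes for otherwise identical applicants.
X_flipped = X.copy()
X_flipped[:, 2] = 1 - X_flipped[:, 2]

gap = model.predict(X) - model.predict(X_flipped)
male_minus_female = np.where(X[:, 2] == 1, gap, -gap)
print(f"mean limit gap (male minus female, identical profiles): "
      f"{male_minus_female.mean():,.0f}")
```

A materially nonzero gap means the system treats gender-flipped twins differently. In practice an audit would also probe features correlated with gender, since simply dropping the gender column does not remove proxy discrimination.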
We are entering an algorithmic society, and algorithms will play an ever larger role in the financial consumer space. We should stay alert to algorithmic sexism in economic consumption scenarios and respond to it, in order to protect consumers' vital interests.
