
Week 3: initial project ideas

As the information age advances, algorithms greatly improve how efficiently people can use data and allow data to be reused and repurposed. But in some respects these same technologies are exacerbating various forms of inequality.

There is a huge potential for growing income inequality. Leaders of many countries have stressed that the benefits of global economic growth should be shared more equally.
In the information age almost anything can become data, and at this stage data is wealth: whoever holds more data can build more useful algorithms, which in turn indirectly makes the distribution of wealth between people even more unequal.

People are easily “cheated” by algorithms. For example, a job seeker is rejected over a personality test, simply because the algorithm decided he was not the right person; a borrower goes to the bank for a loan and is charged a higher interest rate than others, simply because he lives in an area with many people who have bad credit records; a minor offender receives a heavy sentence, simply because one of his friends or relatives is a repeat offender. Moreover, after being “scored” by the algorithm, all of them are kept in the dark and cannot get a reasonable explanation.

The use of algorithms and big data in different fields can make the lives of the poor worse and magnify social inequalities to varying degrees.

Employers ask candidates to submit credit reports, equating a credit score with accountability or reliability. Doing so creates a dangerous, self-reinforcing feedback loop of poverty: if someone cannot get a job, and therefore has no income, because of a poor credit history, that credit history only gets worse, making it harder and harder to find work. Yet employers tend to assume that a credit history full of data is more reliable even than human judgment, and they hardly consider what assumptions are hidden behind the numbers. A toy sketch of this loop is given below.
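To make the loop concrete, here is a minimal, purely illustrative sketch. The hiring threshold, score updates, and time scale are assumptions invented for this example; no real credit-scoring or hiring system works exactly like this.

```python
# Toy illustration of the credit-score/employment feedback loop described above.
# All numbers (threshold, score changes, caps) are made-up assumptions for
# illustration only, not a real scoring model.

def simulate(initial_score: int, months: int = 24,
             hire_threshold: int = 600) -> list[int]:
    """Track a hypothetical credit score when getting hired depends on it."""
    score = initial_score
    history = [score]
    for _ in range(months):
        employed = score >= hire_threshold   # employer screens on the score
        if employed:
            score = min(850, score + 5)      # income -> bills paid -> score improves
        else:
            score = max(300, score - 10)     # no income -> missed payments -> score falls
        history.append(score)
    return history

if __name__ == "__main__":
    print(simulate(650)[-1])  # starts above the threshold: ends higher (770)
    print(simulate(590)[-1])  # starts just below it: only deteriorates (350)
```

Two applicants who start only a few dozen points apart end up on completely different trajectories, which is exactly the kind of knock-on effect I want to study.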

In isolation, the effects of these algorithms are bad enough; worse, they reinforce one another.

The main directions of my research will therefore include:
The impact of big data and algorithms on inequality.
Why do these technologies have such a knock-on effect?
What can be done to address this inequality more effectively?
Do the effects vary across countries, and why?

3 replies to “Week 3: initial project ideas”

  1. Darcie Harding says:

    This got me remembering when I worked at a Canadian bank processing credit applications. Since I was very new and young, I was sent the customers the bank largely considered poor credit risks because they had no credit history or very little money. This included people who had never borrowed money before, people who had few or no assets, and people who were new to Canada. Needless to say, I didn’t stay in the job long, since most people were not approved for loans or credit cards and it seemed very unfair to me, as most of them just needed someone to give them a chance. I understand that business decisions have to be made, but I wish they would consider the human element. I hope your project work can help people who just need that first chance!

  2. Wang Hanyu says:

    Hi Ye, I noticed that both you and Jiayi are interested in exploring the relationship between inequality and algorithms in your projects. Perhaps you could consider talking to each other to see whether you could benefit from each other’s thoughts on the topic.

  3. Jiayi Zhao says:

    ‘People are easily “cheated” by algorithms. For example, a job seeker is rejected over a personality test, simply because the algorithm decided he was not the right person’
    I strongly agree with this. Because of my undergraduate major, I have many classmates looking for jobs at Internet companies. With such a large number of graduates every year, companies use unreasonable algorithms to screen people out crudely during resume screening. I think this is inequality.
