Ethics, Sustainability and Social Impact in Digital Gender Equality
ETHICS, SUSTAINABILITY AND SOCIAL IMPACT
Table of Contents
INTRODUCTION
PART A
1. AI to combat racial bias
2. Issues with AI, gender and racial bias
(3) JUST AI fellows and their views
PART B
(1) Key messages of the report
(2) Recommendations and barrier addressing
(3) Important barriers in mobile internet use
CONCLUSION
REFERENCES
INTRODUCTION
The terms ethics, sustainability and social impact are discussed here in the context of digital gender equality. The report discusses racial and gender bias in AI and how it can be combated. Later on, it discusses the views of the JUST AI fellows on racial and gender justice. The report then examines the various barriers to achieving justice in terms of digital connectivity and identifies the key barriers along the way (Timmers, 2019).
PART A
1. AI to combat racial bias-
Over the years, the use of AI in the corporate world has surged at an enormous rate. Face recognition and fingerprint biometrics are now quite common. In such a technocratic environment, the role of AI cannot be denied, but wider use has also brought a large number of controversies, and racial bias is one of them.
Racial bias is one of the most harmful elements in society, and modern corporations are not free of it; such instances can be seen in daily life. To address these issues effectively, there is a pressing need for a new field of AI that can combat them. Prevailing AI systems work as mere technological tools owned by the technology itself; what is needed instead is AI that is consciously guided and driven, not merely owned.
Over time, such ethical issues have come to light: in face recognition, AI has shown discrepancies in identifying Black people and people with darker skin pigmentation, and it has shown similar failures when identifying women. Several studies have found that AI systems struggle to deal with human behavioural aspects. That is why there is an immediate need to introduce a new discipline of AI that can combat such issues and operate in a more humanistic way. As discussed, it must be capable of dealing with differences in human behaviour and the variety of human races (Dennis, 2021).
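One practical way to make such discrepancies visible is to report a system's error rates separately for each demographic group instead of quoting a single overall accuracy. The sketch below is a minimal illustration of that idea in Python; the records, group labels and numbers are entirely hypothetical and stand in for a real labelled evaluation set.

# Minimal sketch of a disaggregated accuracy audit.
# Assumes a hypothetical evaluation set of (group, true label, predicted label)
# records; a real audit would use the system's actual predictions and carefully
# collected demographic attributes.
from collections import defaultdict

def accuracy_by_group(records):
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, y_true, y_pred in records:
        total[group] += 1
        correct[group] += int(y_true == y_pred)
    return {g: correct[g] / total[g] for g in total}

# Hypothetical records for illustration only.
records = [
    ("lighter-skinned men", 1, 1), ("lighter-skinned men", 0, 0),
    ("darker-skinned women", 1, 0), ("darker-skinned women", 0, 0),
]

for group, acc in accuracy_by_group(records).items():
    print(f"{group}: accuracy = {acc:.2f}")

A large gap between the per-group figures is exactly the kind of discrepancy described above, and it remains invisible if only an aggregate accuracy is reported.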
2. Issues with AI, gender and racial bias-
As discussed, modern AI greatly needs to be reformed in order to combat such bias issues. AI is not a gender-free technology: although it is structured to tackle complex societal problems, it is less competent when it comes to human race and gender, features that lie outside the behavioural patterns it models (Domnich and Anbarjafari, 2021).
For example, one of the best-known applications of AI is the use of algorithms to automate headhunting. It has been seen that such systems do injustice to female candidates and fail to treat them equally. It has also been reported that face recognition technologies label dark-skinned women as male, and in many cases estimate their age as older than they are. The reason behind these problems is largely inattentiveness: AI systems learn from the images uploaded into them, and it is careless that some corporations feed in images of white people almost exclusively. The intention may not be malicious, yet the harms cannot be avoided. There are many more such cases in which AI technology that is assumed to be gender- and race-neutral has produced shameful results.
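One simple, widely used check for the kind of screening bias described above is to compare selection rates across groups and flag large gaps. The sketch below, in Python with hypothetical outcomes invented for illustration, applies the common "four-fifths" rule of thumb: if one group's selection rate falls below 80 per cent of the highest group's rate, the result is flagged for review. This is an assumed, simplified check, not a legal test or the method of any particular vendor.

# Minimal sketch of a selection-rate (adverse impact) comparison for an
# automated screening tool. All outcomes below are hypothetical.
def selection_rate(outcomes):
    # Share of candidates the algorithm marked as selected (True).
    return sum(outcomes) / len(outcomes)

outcomes = {
    "male candidates":   [True, True, False, True, True, False, True, True],
    "female candidates": [True, False, False, False, True, False, False, False],
}

rates = {group: selection_rate(o) for group, o in outcomes.items()}
reference = max(rates.values())
for group, rate in rates.items():
    ratio = rate / reference
    status = "review for adverse impact" if ratio < 0.8 else "within 4/5 threshold"
    print(f"{group}: rate={rate:.2f}, ratio={ratio:.2f} -> {status}")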
Redressing these issues is an urgent requirement for corporate organisations; keeping their credibility alive depends on it. For example, algorithmic improvements can be carried out so that algorithmic flaws are eliminated, and approaches such as the "Safe Face Pledge" can be widely disseminated so that radical changes and reforms are introduced. Even governmental biometric data collection falls short in this regard, and regimes are needed that ensure better representation of people from all races, genders and other sub-groups. AI technology and its dark side must be tabled in corporate conversations to make it a discussable issue, and companies should be self-driven in addressing it (Katyal and Jung, 2021).
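As one illustration of what an "algorithmic improvement" could look like in practice, training examples can be reweighted so that under-represented groups contribute as much to the model's learning as over-represented ones. The sketch below, in Python, is a hypothetical illustration of that single idea; it is not the approach of any of the initiatives named above, and a real mitigation would need to be validated on the actual system and data.

# Minimal sketch of per-group reweighting for an imbalanced training set.
# Group names and counts are invented for illustration.
from collections import Counter

def group_weights(groups):
    # Weight each example inversely to its group's frequency so that every
    # group carries the same total weight overall.
    counts = Counter(groups)
    n_groups, n_total = len(counts), len(groups)
    return [n_total / (n_groups * counts[g]) for g in groups]

groups = ["group_a"] * 900 + ["group_b"] * 100  # heavily imbalanced
weights = group_weights(groups)

total_a = sum(w for g, w in zip(groups, weights) if g == "group_a")
total_b = sum(w for g, w in zip(groups, weights) if g == "group_b")
print(round(total_a, 3), round(total_b, 3))  # both roughly 500.0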
(3) JUST AI fellows and their views-
JUST AI performs a key role in spreading awareness of racial and gender justice in corporate organisations. All of its thinkers have invaluable ideas for addressing technocratic injustice, but two of them particularly stand out.
Dr Erinma Ochu has shared her views on this. She insists on reconfiguring the ways in which digital revolutions such as automation, interconnectivity, sensing technologies and real-time data are shaped, so that a new culture can be generated. Her thinking is holistic, and she always suggests feasible ways of dealing with prevailing perils; for example, she has suggested embedding digital literacies at the intersection of art, science, technology and health. By following her views, the potential sources of such discrepancies can be removed from the outset, and such ethical issues can be resolved not only in the corporate world but everywhere.
Yasmin Boudiaf uses technology to change people's behaviour. Her work in this regard is exceptional, and the way she addresses the issues and suggests a way forward is admirable. Along with racial and gender bias in AI, she also engages with inequality in society of any nature or form. She renders her services to the giant corporations of the market, including Google. This role will play a big part in bringing equality and in coping with AI bias issues. She helps corporations shape their business models and designs, and in this way assists them in adopting the best business practices. Her role may bring enormous change because of her reach to clients and her passion for reaching more people. All of these qualities make her a forerunner in this field (van Wynsberghe, 2021).
PART B
(1) Key messages of the report -
The report discusses the digital gender gap at length, specifically in low- and middle-income countries. It is a big threat for the world, especially at a time when the world has experienced the horrendous tragedy of the pandemic. The catastrophic situation that the coronavirus brought widened this gap and made women more vulnerable. The report suggests that now is the right time to address the issue, since in such situations the digital activeness of women is the only way to keep them connected to the rest of the world; it will further assist them in becoming more self-dependent and self-employed. The current scenario is very poor, particularly in developing nations such as India, Bangladesh, Pakistan and other South Asian countries, and in Kenya and other African nations, where a big gap already existed and the heat of the pandemic deepened it further (Shin, 2021).
The report further discusses this digital gap in the context of a few developing and underdeveloped nations where it is truly alarming. Because of social, economic, cultural, geographical and demographic dynamics, examining the problem and then addressing it is a complex task. The report also discloses a couple of less visible elements: a disconnected woman is not deprived only as an individual, since her disconnection also affects the rest of her family. As the report notes, women's pattern of mobile use is very holistic in nature; for instance, children generally use the same device for their study lectures, and because she is closely connected to the other members of the family, the device becomes more useful for them all. All of these messages can be drawn from the report (Elias and Lemish, 2021).
(2) Recommendations and barrier addressing-
Different stakeholders have put forward different recommendations to counter the digital gender gap, and all of them have their own importance in mitigating the situation. Here, however, three of them are discussed that are found to be relatively substantial for nations such as India and Kenya.
(1) Affordability approach by mobile operators- In developing and underdeveloped nations, many people are not in a position to afford a mobile phone, and if a household can afford only one device, the woman will almost certainly be the one left without it. Mobile operators can therefore play a huge role in addressing this issue. They can support the industry in producing devices at a lower cost while still providing modern features, and such phones should be internet-enabled so that they connect users with the rest of the world. Operators may also provide easy payment facilities, which can encourage potential customers: because of poor financial health, such people cannot pay the entire sum at once, so an easy payment method can help them decide to purchase.
(2) Literacy approach by policy-makers- In countries where a substantial share of the population cannot read and write, policy-makers play a big role in bringing about larger, revolutionary changes, and this approach is again highly salient. Governments can invest more in digital literacy and can also encourage women through financial and non-financial measures. Because of their political power and control over finance, they can collaborate with the private sector and make these efforts more effective.
(3) Accessibility approach by internet companies- In developing and underdeveloped nations, even when women have mobile phones they often remain disconnected because of poor educational backgrounds and other social obstacles; in such countries social barriers are more damaging than other factors. Internet companies can therefore make a big difference through their efforts. They can provide internet services in more regional languages and dialects, make services easier to operate, and provide better tools so that they are more comfortable and simpler to use. The use of AI can also improve these services and build women's confidence (Koteeswaran, 2021).
(3) Important barriers in mobile internet use-
An ample range of barriers has been identified that hinders nations from becoming more free of digital bias. The following are a few of the barriers affecting gender-biased mobile use.
Handset cost- In developing and underdeveloped nations, mobile handsets are very costly because of poor technological infrastructure. African nations cite this as a key problem: countries such as Kenya, Mozambique and Nigeria have a substantial number of people who identified it as a major reason for being disconnected.
Reading and writing difficulties- In Asia, education is still a major social issue, and a huge proportion of the population is unable to read and write. Respondents cited this as one of the biggest causes of not using mobile phones, and it is recorded highest in Pakistan.
Harmful content- In a few nations the perceived threat of harmful content is a barrier. In stereotyped societies this is a common problem: people believe that giving a mobile phone to a woman will somehow erode her cultural values and expose her to indecent content, and this belief becomes a root cause of not giving women phones.
Data cost- People's low incomes are a big driver of this barrier. Specifically in countries with weak income-generating economies, people may own a mobile phone yet not use internet services, because data is costly and beyond their means. In Asian nations this is one of the main causes.
Information security- In Latin American nations, people often regard the phone as a tool for storing data, and they have expressed concerns about data and information security, which is a big worry for them. Another cited problem is being contacted by unknown people, which causes discomfort. This issue is even more serious for women, since in male-dominated settings such contact is not accepted for women (Rufai, 2021).
CONCLUSION
From the report it can be concluded that there is an intensive requirement to introduce a new form of AI that can deal with such bias issues and bring about equality. The report has also identified the factors that are mainly responsible for the inequality in women's use of mobile phones and internet connectivity.
REFERENCES
Timmers, P., 2019. Ethics of AI and cybersecurity when sovereignty is at stake. Minds and Machines, 29(4), pp.635-645.
Dennis, 2021. Looking back to leap forward: a framework for operationalizing the structural racism construct in minority health research. Ethnicity & Disease, 31(Suppl 1), p.301.
Domnich, A. and Anbarjafari, G., 2021. Responsible AI: Gender bias assessment in emotion recognition. arXiv preprint arXiv:2103.11436.
Katyal, S. K. and Jung, J. Y., 2021. The Gender Panopticon: AI, Gender, and Design Justice. UCLA Law Review, 68(3).
van Wynsberghe, A., 2021. Sustainable AI: AI for sustainability and the sustainability of AI. AI and Ethics, 1(3), pp.213-218.
Shin, H. S., 2021. The fintech gender gap. Available at SSRN 3799864.
Elias, N. and Lemish, D., 2021. Parents' Social Uses of Mobile Phones in Public Places: The Case of Eateries in Two National Contexts. International Journal of Communication, 15, p.19.
Koteeswaran, B. M., 2021. Bridging the Gender Gap.
Rufai, A., 2021. A survey of cyber-security practices in Nigeria. International Research Journal of Advanced Engineering and Science, pp.222-226.