COVID-19 Could Bring Bias in AI to Pandemic Level Crisis

During the COVID-19 pandemic, artificial intelligence (AI) has become a trusted ally and partner in our daily lives. While there are countless benefits of AI, embedded bias could be determining who keeps their job, what news we see – even who lives or dies – without our even knowing it.

AI-based technology is enabling us to stay connected to our communities, order essential supplies and perform our jobs while adhering to stay-at-home orders. It ensures we’re entertained by using algorithms to compare our past Netflix viewings to recommend our next binge watch. AI even enables robots to answer the call for contactless food delivery to our homes and deliver personal protective equipment (PPE) to hospitals without exposing supply workers to the virus.

These developments have saved and enhanced lives during the pandemic. However, even in the best of times, the sharpest minds at the most sophisticated companies struggle to ensure their use of AI is neither discriminatory nor inequitable. For instance, even the most advanced AI facial recognition programs often fail to correctly identify persons of color — a problem large tech companies, including Amazon, Microsoft and IBM, recently acknowledged in their laudable decisions to withdraw these programs from law enforcement use for at least the next year. Further risks appear most prominently in our current use of AI to facilitate review of online media content, employment decisions, and healthcare opportunities.

The original article can be found here.

According to the Artificial Intelligence World Society Innovation Network (AIWS.net), AI can be an important technology and a potential tool for COVID-19 prediction. Regarding AI ethics, AIWS.net initiated and promoted the design of the AIWS Ethics framework, comprising four components — transparency, regulation, promotion and implementation — for the constructive use of AI that avoids bias and discrimination.