1 - Introduction: Ethical Data Use

At Environics Analytics, we believe in using the power of data analytics for good. Unfortunately, data can also be used for discriminatory practices. The purpose of this training is to provide education on how data has been misused in the past, and on what we can do as responsible data stewards to avoid repeating those mistakes.

Your training objectives are:

  1. Understand Potential Stigmatization & how it applies here at EA.
  2. Understand Attribute Disclosure & how it applies here at EA.

This training will:

  1. Outline some common stigmatizations that can perpetuate negative attitudes about individuals or communities.
  2. Define areas that are out-of-bounds based on EA's end-user license agreement.
  3. Explain ways in which data has been used to selectively classify neighbourhoods or communities in a discriminatory manner.
  4. Detail the legally protected areas defined in the Canadian Human Rights Act.

Before we get started, let's talk about stigma. Stigma is the negative stereotype itself, while discrimination is the behaviour that results from that stereotype. We encourage you to read more about stigma and discrimination.

What are some of the more common stigmatizations in Canada?

  • Racism experienced by First Nations, Inuit, and Métis peoples
  • Racism experienced by African, Caribbean, and Black Canadians
  • Racism & colonialism that undermines self-determination & sovereignty
  • Sexual stigma and gender identity stigma as experienced by LGBTQ2+ people
  • Mental illness stigma
  • HIV stigma
  • Substance use stigma
  • Obesity stigma

Examples:

  • Mental Health Records: Stigmatization can occur when people are reluctant to seek mental health treatment due to fear of their records being accessed or shared, potentially affecting their employment or social status.
  • HIV/AIDS Status: Disclosure of one's HIV/AIDS status can lead to social stigmatization, discrimination, and even violence.
  • LGBTQ2+ Discrimination: People may experience discrimination and stigmatization based on their gender identity or sexual orientation, leading to negative consequences in numerous areas of their lives.

Addressing data stigmatization often involves implementing policies, regulations, and ethical guidelines to protect individuals' privacy and rights. Additionally, raising awareness and promoting inclusivity and diversity can help reduce stigmatization associated with various data types.

It also involves proactively assessing potentially negative consequences. This starts with raising awareness internally and engaging with impacted communities to learn about their concerns. An example is the Our Data Bodies Project.

The Our Data Bodies (O.D.B.) Project is a collaborative, participatory research and organizing effort working in three cities: Charlotte, North Carolina; Detroit, Michigan; and Los Angeles, California. The O.D.B. Project asks three main questions:

  • How do marginalized adults experience and make sense of the collection, storage, sharing and analysis of their personal information in housing, criminal justice, employment, and municipal open data systems?
  • How, if at all, do marginalized adults connect their ability to meet their basic material and social needs—food, shelter, safety, employment, health, social services, belonging, family integrity, cultural expression—to their inclusion in (or exclusion from) data-based systems?
  • What strategies do marginalized adults deploy, if any, to protect their digital privacy, self-determination, and data rights?

Ready to continue? Let's move on to Understanding Potential Stigmatization.