2 - Understanding Potential Stigmatization

When misused, demographic data can lead to harmful stigmatization of minority or vulnerable populations.

For example, suppose you are working on a project to determine which areas were the most hesitant to receive the COVID-19 vaccine. During this review, you observe much lower vaccination rates in minority and recent immigrant communities. You raise this concern, but your manager says those are the statistics and we should stand by them. This is potentially stigmatizing project work because the results could be used to negatively impact those communities.

It is the lack of context around such data, or the interpretation applied to it, that can reinforce misconceptions, stereotypes, and stigma.

For example, Nova Scotia public health authorities characterized two predominantly Black communities with higher COVID-19 incidence rates as non-compliant with COVID-19 restrictions rather than exploring other explanations for the high incidence, such as a higher percentage of frontline workers who are more likely to be exposed to COVID-19.

Because geo-demographic analysis is at the core of EA's business, it is important that we understand how our data can be used to create these harmful stigmas.

This training will walk you through how we can review EA's data to identify potentially stigmatizing group attributes and provide more detail on some common examples.

Data for Good Use Case: The Downtown Vancouver Business Improvement Association (DVBIA)

Challenge: COVID-19 lockdown measures forced several waves of closures and reopenings for DVBIA members. To understand the impacts on key business and retail corridors, the DVBIA needed data on their visitor base to compare with pre-pandemic levels.

Solution: Equipped with EA data, an automated template was created to analyze month-by-month visitor traffic to downtown Vancouver's retail and business corridors.

Results: The automated template and weighted reports have allowed the DVBIA to respond to sudden changes in visitor traffic and identify what their primary visitors now look like (young, single, and diverse), while recognizing that their family-based visitors have decreased significantly since the start of the pandemic.

How is this using data for good? Using mobile movement data, the DVBIA provided support for small businesses trying to respond and recover after the pandemic. You can read the full case study on our website.

Digital Redlining

Digital redlining is the use of technology to create and reinforce racial boundaries. It is a growing concern as big data algorithms increasingly allow segmentation based on ethnicity. Such algorithms can produce racialized search results in which members of minority groups only see material believed to be relevant to them, a belief that may itself be driven by harmful stereotypes. This concern grows with the rise of algorithms that try to push related content to individuals.

Graphic portraying digital redlining

What are the roots of digital redlining?

Historically, neighbourhoods with higher percentages of visible minority group members, Black or brown, were deemed high-risk communities by banks. This limited those communities' access to funds and prevented the upward mobility experienced by communities outside the redline.

How can EA's data be used for good to reduce digital redlining?

Using information such as FootFall, EA can create maps to understand whether tourism is promoted more in neighbourhoods with predominantly white populations than in areas with higher percentages of visible minority group members. This information could be used to identify under-represented areas for tourism and travel.
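The comparison described above could be sketched as follows. This is a minimal illustration only: the neighbourhood records, field names, and thresholds are entirely hypothetical, and EA products do not necessarily expose data in this shape. It simply compares promotion rates per visitor between areas with higher and lower visible-minority shares.

```python
# Hypothetical records: neighbourhood name, visible-minority share (0-1),
# monthly foot traffic, and number of tourism promotions run there.
neighbourhoods = [
    {"name": "A", "vis_minority_share": 0.15, "footfall": 120_000, "promotions": 14},
    {"name": "B", "vis_minority_share": 0.62, "footfall": 115_000, "promotions": 3},
    {"name": "C", "vis_minority_share": 0.08, "footfall": 90_000, "promotions": 11},
    {"name": "D", "vis_minority_share": 0.55, "footfall": 95_000, "promotions": 2},
]

def promotions_per_10k_visitors(n):
    # Normalize promotion counts by foot traffic so areas are comparable.
    return n["promotions"] / (n["footfall"] / 10_000)

def underpromoted(data, minority_cutoff=0.5, rate_gap=0.5):
    """Return names of higher-minority-share areas whose promotion rate is
    less than `rate_gap` times the average rate in the remaining areas."""
    rest = [promotions_per_10k_visitors(n) for n in data
            if n["vis_minority_share"] < minority_cutoff]
    baseline = sum(rest) / len(rest)
    return [n["name"] for n in data
            if n["vis_minority_share"] >= minority_cutoff
            and promotions_per_10k_visitors(n) < rate_gap * baseline]

print(underpromoted(neighbourhoods))  # flags under-represented areas B and D
```

A real analysis would, of course, control for many more factors (attractions, transit access, seasonality) before drawing conclusions; the sketch only shows the shape of the question.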

Conscious efforts should be made not to exclude based on non-relevant or protected characteristics.

Data for Good Use Case: True North Sports + Entertainment (TNSE)

Challenge: To better understand and serve their fanbase, TNSE wanted to construct a more robust picture of the fans who attend Winnipeg Jets and Manitoba Moose games.

Solution: A profile analysis was conducted by leveraging TNSE’s existing data to understand fans’ demographics and purchase behaviours relative to the Canadian market.

Results: This work revealed that fans tend to have large young families, are more likely to be part of a visible minority group, and often earn above-average household incomes. The segmentation analysis helped provide TNSE with a more complete view of their fans who attended games and what the market potential was for TNSE’s brand at large.

How is this using data for good? By understanding demographic information, TNSE was able to understand the diversity of their market and tailor their offerings to groups that might otherwise have been excluded from promotions and events. You can read the full case study on our website.

Food Deserts

Food deserts are neighbourhoods with lower-than-average incomes, in both rural and urban areas, that have limited access to full-service supermarkets or grocery stores. Food deserts have a profound and lasting negative effect on residents' health because they limit access to healthy foods.

What to be aware of?

Any data set that includes neighbourhood income statistics could be used by supermarket and grocery store chains to identify areas they deem a poor fit for expansion on income grounds.

How could data be used for good to prevent food deserts?

For example, public sector clients could use data to identify where food deserts exist so they can create policies to bring access to supermarkets/grocery stores to these areas.
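A first-pass screen of the kind described above could look like the following. This is a minimal sketch with hypothetical field names and thresholds (the income cutoff and distance-to-grocer limit are illustrative, not established definitions): it flags lower-income areas with no full-service grocer within a set distance.

```python
# Hypothetical area records: median household income and distance (km)
# to the nearest full-service grocery store.
areas = [
    {"name": "Riverside", "median_income": 38_000, "km_to_grocer": 4.2},
    {"name": "Hillcrest", "median_income": 72_000, "km_to_grocer": 0.8},
    {"name": "Eastgate",  "median_income": 41_000, "km_to_grocer": 0.5},
]

def food_deserts(data, income_cutoff=45_000, max_km=2.0):
    """Flag areas that are both lower-income and far from a grocer.
    Both conditions must hold: low income alone, or distance alone,
    does not make a food desert."""
    return [a["name"] for a in data
            if a["median_income"] < income_cutoff and a["km_to_grocer"] > max_km]

print(food_deserts(areas))  # only Riverside meets both conditions
```

The key design point is the conjunction: Eastgate is lower-income but well served, so it is not flagged, which mirrors the definition of a food desert given above.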

Data for Good Use Case: Cushman & Wakefield Asset Services (CWAS)

Challenge: Like many shopping mall and real estate management companies, Cushman & Wakefield Asset Services needed actionable data about their visitors to provide key inputs that inform leasing, marketing, and investment decisions.

Solution: CWAS turned to MobileScapes, a product from Environics Analytics that uses privacy-compliant mobile movement data. By geofencing their properties and competitors, CWAS captured anonymized, permission-based location data from recent visitors. These were used to determine data-driven trade areas for each mall, understand competitive visitation, and develop holistic visitor profiles by overlaying third-party data available at the neighbourhood level.

Results: The insights are assisting CWAS in unlocking new ways to connect with their communities. When one mall saw that it had an opportunity to attract a more family-oriented shopper, it hosted a temporary exhibit of life-size animatronic dinosaurs. The display significantly increased traffic to help tenants boost sales from a new target audience.

How is this using data for good? By understanding the target audience, companies can better plan their business offerings to serve the community. You can read the full case study on our website.

Digital Exclusion of "Undesirable" Job Candidates

Artificial intelligence (AI) is increasingly used to filter the massive number of potential job candidates. This can lead to the digital exclusion of whoever the AI algorithm defines as an "undesirable" job candidate.

What to be aware of?

When presenting demographic information around job statistics, we should not reinforce negative biases that could widen the digital divide. Some EA data sets, such as CrimeStats, could be misused by companies running hiring algorithms to automatically exclude candidates from higher-crime neighbourhoods.

One employee being excluded from group

How could data be used for good to reduce digital exclusion of job candidates?

Using demographic data, we can create models that detail job statistics by gender, age, and other socioeconomic factors. This data could be cross-referenced with information about secondary school enrollment, application rates or other information to determine where members of visible minority groups have been unfairly eliminated from job considerations.
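One simple way to operationalize the cross-referencing described above is a selection-rate comparison, in the spirit of the "four-fifths" rule of thumb used in US employment guidance (a group whose selection rate falls below 80 percent of the highest group's rate may warrant review). The data and group labels below are entirely hypothetical.

```python
# Hypothetical application and hiring counts by group.
applications = {
    "Group 1": {"applied": 400, "hired": 60},   # selection rate 0.15
    "Group 2": {"applied": 350, "hired": 21},   # selection rate 0.06
}

def flagged_groups(data, threshold=0.8):
    """Return groups whose selection rate is below `threshold` times the
    highest group's rate. A flag is a prompt for review, not proof of bias."""
    rates = {g: v["hired"] / v["applied"] for g, v in data.items()}
    top = max(rates.values())
    return [g for g, r in rates.items() if r < threshold * top]

print(flagged_groups(applications))  # Group 2's rate is well below 80% of Group 1's
```

A flagged disparity is a starting point for investigation (application quality, pipeline differences, proxy variables in the algorithm), not a conclusion in itself.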

Data for Good Use Case: Mackenzie Health Foundation

Challenge: Between 2011 and 2012, the Mackenzie Health Foundation's one-size-fits-all marketing strategy was starting to falter. During that time, the Foundation saw its donor base shrink by 35 percent. The organization needed a better way to engage the community.

Solution: Using PRIZM, our proprietary segmentation system, the Foundation profiled its existing donors and developed target groups with vivid profiles. Overlaying databases like SocialValues, DemoStats and Opticks Powered by Vividata, the Foundation was able to get a better understanding of their demographic, psychographic, and behavioural tendencies and understand the best manner to message and reach its core donors.

Results: By looking at the community through PRIZM, the Mackenzie Health Foundation discovered they serve a highly diverse, wealthy, and fast-growing community that tends to be very generous in its charitable giving. Almost 50 percent of the 1.2 million residents in York Region, near Toronto, are immigrants, and approximately 30 percent of residents speak a non-official language at home, which explained why the Foundation's one-size-fits-all approach was struggling. Despite mailing 30 percent fewer packages, overall annual response rates from the new data-driven annual giving direct mail and telemarketing campaigns jumped by 62 percent, resulting in a 45 percent increase in overall direct mail revenue while lowering costs.

How is this using data for good? By better understanding their target demographic, the organization was able to improve fundraising efforts by understanding and leveraging the social values of the community it serves. You can read the full case study on our website.

Client use cases need to justify why certain neighbourhoods are excluded based on demographic, lifestyle, or other characteristics. Any exclusion should relate directly to the business objective, and you should be able to justify that the business case is appropriate.

Ready to continue? Let's move on to Legally Protected Areas & Disinformation.