Privacy, Technology, and Fair Housing: In a Nutshell

How to protect privacy and the right to housing in an increasingly digital world.
August 22, 2023

What would you do if you discovered that your application for a mortgage or a rental unit was rejected, and you suspected it was due to discrimination? Existing remedies would require you to prove that the housing provider was illegally discriminating against you. But what do you do when it's not a person making that decision, but an algorithm?

The housing space is becoming increasingly technology-driven. Algorithms, built on troves of personal data, are being used to predictively screen tenants, approve mortgages, and more. The problem is that there are no consistent, concrete, enforceable, or agreed-upon standards at a regulatory, or even industry, level to ensure that the data used to build these algorithms is appropriately collected and secured, that models are responsibly trained, and that the resulting systems do not produce discriminatory outcomes.

TechEquity Collaborative and the National Fair Housing Alliance teamed up to explore the current state of privacy, technology, and data in housing—and how we can implement privacy-preserving technologies and regulations to center the right to housing moving forward.

Privacy, Technology, and Fair Housing

Read the full report here.

This examination of privacy in housing tech grew out of the National Fair Housing Alliance's (NFHA) Tech Equity Initiative and TechEquity Collaborative's Tech, Bias, and Housing Initiative.

How to Preserve Privacy in a Data-Driven Housing Market

One of the best ways to protect people’s privacy in data-driven technologies is something called a data-minimization framework. This framework narrows the type of data that can be collected and analyzed to a specific, justifiable purpose—reducing the potential for excessive data collection and strengthening people’s rights to privacy.
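To make that concrete, here is a minimal sketch, in Python, of what purpose-bound collection could look like in practice. The purposes, field names, and `collect_application` helper are all hypothetical illustrations rather than a prescribed implementation; the point is simply that any field not justified for a declared purpose is never retained.

```python
# Minimal sketch of purpose-bound data collection (data minimization).
# All purposes and field names here are hypothetical illustrations.

ALLOWED_FIELDS = {
    # Each approved processing purpose maps to the narrow set of
    # fields that can be justified for it -- nothing else is kept.
    "tenant_screening": {"income", "rental_history", "employment_status"},
    "mortgage_underwriting": {"income", "debts", "credit_history"},
}

def collect_application(raw_record: dict, purpose: str) -> dict:
    """Keep only the fields justified for the declared purpose.

    Rejects any purpose that has not been pre-approved, and drops
    every extra field in the applicant's record (e.g., browsing
    history or social media data) before anything is stored.
    """
    if purpose not in ALLOWED_FIELDS:
        raise ValueError(f"No approved data-collection purpose: {purpose!r}")
    allowed = ALLOWED_FIELDS[purpose]
    return {key: value for key, value in raw_record.items() if key in allowed}

# Example: extraneous fields are filtered out, never retained.
raw = {"income": 52000, "rental_history": "...", "zip_code": "94110",
       "social_media_handle": "@example"}
print(collect_application(raw, "tenant_screening"))
# -> {'income': 52000, 'rental_history': '...'}
```

In a real system, the approved purposes and their justifications would be set by policy and subject to audit, not hard-coded by a developer; the sketch only shows where the narrowing happens.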

Policies that advance privacy at all costs (meaning ones that would allow for no data collection or estimation) can provide cover to companies or systems that produce disproportionately adverse outcomes for certain protected groups. On the other hand, if there are no requirements to monitor, measure, and set guardrails around the use of personal data, other harms can emerge. For example, a team of journalists at ProPublica was the first to uncover that Facebook's ad algorithm used personal data, collected for an entirely different purpose, to enable discrimination against Black users. Three years after the National Fair Housing Alliance and other civil rights advocates settled a case asserting that Facebook used its advertising platform to discriminate against certain protected groups, such as African Americans, Hispanics, and Asian Americans, the U.S. Department of Justice charged Facebook with racial discrimination in its targeted housing advertisements.

We need to ensure that data collection is specific, anonymized where possible, and still allows for discrimination testing and the search for less discriminatory alternative models. A balanced approach that allows for discrimination testing after privacy-enhancing technologies have been applied to the data, rather than one that simply deletes key fields, offers a way forward.
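As a simple illustration of why that matters, here is a sketch, again in Python, of one basic outcome-level check: the adverse impact ratio, which compares each group's approval rate to the most favored group's. The 0.8 threshold (the familiar "four-fifths rule") and the group labels are illustrative assumptions, not the report's methodology. Note that if the group field had simply been deleted from the data, even this elementary test would be impossible.

```python
# Sketch of a basic outcome-level discrimination test: the adverse
# impact ratio. The 0.8 threshold and the group labels below are
# illustrative assumptions only.
from collections import defaultdict

def adverse_impact_ratios(decisions):
    """decisions: iterable of (group_label, approved: bool) pairs.

    Returns each group's approval rate divided by the highest
    group's rate; values below ~0.8 are conventionally flagged
    for further review.
    """
    approved = defaultdict(int)
    total = defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        approved[group] += int(ok)
    rates = {g: approved[g] / total[g] for g in total}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical screening outcomes: 80% approval for group_a,
# 55% approval for group_b.
decisions = ([("group_a", True)] * 80 + [("group_a", False)] * 20
             + [("group_b", True)] * 55 + [("group_b", False)] * 45)
for group, ratio in adverse_impact_ratios(decisions).items():
    status = "flag for review" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({status})")
```

The same check works whether the group labels are self-reported, estimated, or derived after privacy-enhancing transformations have been applied, which is exactly the flexibility a balanced approach preserves.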

How to Protect Our Housing Rights

Right now, we have a patchwork of state and federal privacy laws that operate in silos, offering protections in individual sectors and circumstances without a comprehensive approach to people’s privacy and civil rights. We need a federal privacy framework incorporating agile civil rights and consumer protections that can stand up to the increasingly digitized systems in which we live. 

Reckoning with the imbalance of power between individuals, companies, and regulators is a critical first step in building a better future. To tackle that imbalance, we think three major shifts are immediately required to ensure that our civil rights and privacy are protected as new technologies emerge and existing ones grow. These shifts can and should be applied at both the company and regulatory levels.

  1. Shift responsibility from the individual to companies and regulators – Let's rebalance the burden for safety and harm reduction from consumers to the people with the information, power, and resources: those at companies and regulatory agencies.
  2. Strengthen the review of these tools prior to their use on the public – With tech tools increasingly affecting everyday lives, they should meet critical standards of business necessity, non-discrimination, and harm minimization before being deployed and used on the public. Independent, third-party algorithm reviewers should be involved in the process.
  3. Develop an intersectional approach to designing and regulating tools and models – When your consumer data is used to train algorithms that later deny you housing, your rights to privacy, housing justice, and non-discrimination may all be at stake. Regulatory bodies must require protections that are intersectional, broad, and nimble enough to apply across all sectors of modern life simultaneously. Companies must also take a transparent and comprehensive approach to monitoring and mitigating harms.

The world is digital. Everything we do is tracked, monitored, stored, and often monetized—for good, bad, and everything in between. An anti-discrimination framework will only get us so far. If a landlord or property manager is using video technology or online rental management software to surveil tenants, people in the building could be equally affected by monitoring algorithms—irrespective of race, gender, or other protected characteristics. The collection, storage, use, and transmission of personal data pose ethical and humanitarian questions we must address. 

We must ensure that our rights are not so narrowly defined that they can be dismissed by a model developer who hasn't considered the implications of the data they are using, or enforceable only through complex sectoral protections that have limited utility in the modern, intersectional, heavily digitized context in which we live our lives. We need an integrated, federal approach to privacy and technology, and it must extend to our most basic need: housing.