AI might decide where you get to live

AI is now performing background checks for housing applicants... but is this a good thing?

Anyone who has applied to rent an apartment knows that it can be a laborious process. From background checks to credit checks, the ordeal is not a fun one. No one wants a potential landlord snooping around in their history. Interestingly, humans might not be doing the checking at all before long.

Instead, as reported by The Verge, automated background checks performed by artificial intelligence (AI) and algorithms may soon determine who is fit to rent a place to live—and who isn’t. The development has outraged many people, who claim AI isn’t capable of understanding the complexity of criminal records and the human stories behind them.

CoreLogic, the California-based company behind the product at the center of this controversy, contends that its system helps landlords keep communities safe without the bias of human judgment.


But who is right?

Behind the Machine

The concept of digitizing a background check is nothing new. Fingerprints aren’t taken with ink pads and cards anymore. Likewise, no one rummages through a filing cabinet to pull a criminal record.

Today, everything is digital, and most people have come to accept that. However, the very thing that makes digital records so inviting is also creating problems. Making records easily accessible online and through private databases has opened the door for companies like CoreLogic to cash in.

In essence, CoreLogic offers landlords an array of screening tools to help decide whether a person is trustworthy and deserves a lease. These products include ScorePLUS, a screening model that calculates a single score representing the risk a prospective tenant poses, and CrimCHECK, which searches a database for criminal records.

The company claims that CrimCHECK draws on 80 million booking and incarceration records from more than 2,000 facilities across the United States. So, what’s the problem? With these two tools, there isn’t much of one. After all, plenty of software services let users look up criminal records in a database. The problem lies in a third tool: CrimSAFE.

Potential for False Flagging

CrimSAFE has been widely criticized as unfair after it flagged Mikhail Arroyo, a debilitated man recovering from a coma, and his mother. CoreLogic describes the service as offering “current and comprehensive data and predictive tools that make your leasing decisions easier, faster and more effective.”

The tool gives landlords a report that includes a simple verdict on a person’s risk level as a tenant. A person with a criminal background, for example, is reported as “background failed.” CrimSAFE’s one-page forms leave disqualified tenants asking questions that landlords cannot answer.

For landlords, the program can “eliminate the need for judgment calls.” That claim leaves many shaking their heads, wondering whether judgment calls really should be removed from the equation.

In Mikhail’s case, the CrimSAFE tool’s flaws were on full display. The young man had suffered an accident that left him needing assistance with all acts of daily living, and his record featured a single retail theft charge—less than a misdemeanor. Yet the CrimSAFE system reported him as having a “disqualifying record” without telling the landlord what the charge actually was. The property owner refused to let him move into an apartment with his mother despite his condition.

Many argue that the vague nature of CoreLogic’s results can lead to unfair treatment of prospective residents. There was, of course, virtually no possibility that Mikhail could commit a crime of any fashion in his debilitated state. That fact, some argue, would have been easy for a human landlord to see and weigh in decision-making, yet the CrimSAFE program overlooked it entirely.

Accuracy is Key

As everything from cars to background checks becomes more automated, human judgment and conventional wisdom remain the key to making a hands-off world work. Without them, there will be (and already have been) many more cases like Mikhail’s: hopeful tenants disputing charges uncovered by a narrow-sighted AI program while the apartment they applied for slips away to another applicant.

Accuracy points to another potential flaw of automated background check systems: whether they even look at the right person.

Eric Dunn, director of litigation at the National Housing Law Project, notes that many automated background check systems are notoriously inaccurate. He told The Verge, “I’ve looked at more criminal records reports than I could count. And I would say that well over half the ones I’ve looked at had some kind of inaccuracy.”

One of the most common problems? A shared name. CrimSAFE may report a record for a crime that an applicant never committed because the program actually pulled the history of a different person with the same name. The prospective tenant then has to dispute the record through a lengthy process with CoreLogic’s help department.
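To see why shared names cause false positives, consider a minimal sketch of the matching problem. The records, field names, and logic below are hypothetical illustrations, not CoreLogic’s actual data or algorithm; they simply show how matching on a name alone flags the wrong person, while requiring even one extra identifier avoids the collision.

# Hypothetical illustration of the name-collision problem described above.
# The data and matching logic are invented for this sketch; they are not
# CoreLogic's actual system.

criminal_records = [
    {"name": "John Garcia", "dob": "1961-03-14", "offense": "felony assault"},
    {"name": "Maria Lopez", "dob": "1988-11-30", "offense": "fraud"},
]

# The applicant shares a name, but not an identity, with the first record.
applicant = {"name": "John Garcia", "dob": "1994-07-02"}

def name_only_match(applicant, records):
    """Flags every record that shares the applicant's name."""
    return [r for r in records if r["name"] == applicant["name"]]

def name_and_dob_match(applicant, records):
    """Requires a second identifier (date of birth) before flagging."""
    return [
        r
        for r in records
        if r["name"] == applicant["name"] and r["dob"] == applicant["dob"]
    ]

# Name-only matching attributes the 1961 felony to the wrong John Garcia.
print(name_only_match(applicant, criminal_records))    # one false positive

# Requiring one extra identifier eliminates the collision.
print(name_and_dob_match(applicant, criminal_records))  # [] (no record found)

Even this toy fix is imperfect: two people can share both a name and a birth date, and real records often have missing or inconsistent fields. That is one reason critics like Dunn push for individualized review rather than fully automated verdicts.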

Dunn believes there is a solution to the problem: a more individualized screening process that takes personal factors and a face-to-face relationship into account.

Racial and Social Typing

While many debate whether automated background checking is fair to those with a criminal record, others are more concerned with another factor: the technology’s disproportionate effect on African American and Latino prospective tenants. Unfortunately, CoreLogic and companies like it argue that, as background check providers, they are not subject to the Fair Housing Act; in their view, only the people using their tools are.

The Fair Housing Act was passed in 1968 to protect tenants from discrimination by landlords on the basis of characteristics such as race, religion, and national origin. CoreLogic argues that landlords are responsible for the tools they use, not the companies that make them. Consequently, the company believes it is not in violation of the Department of Housing and Urban Development’s guidance on the use of criminal records in housing and real estate transactions.

With several lawsuits currently addressing the issue, a question arises: Should landlords in racially diverse areas be able to use this technology if it negatively affects certain groups more often than others?

The use of automated background checks for leases is not new. However, the technology is now in the spotlight thanks to stories and lawsuits detailing the flaws of tools from companies like CoreLogic. Should the government provide oversight of these programs? Perhaps regulation is needed to resolve current problems and prevent future damage.

Automated background checks need to be kept in check for a safe, just society. Meanwhile, human landlords must remember the small details of life that machines simply cannot consider…and avoid the trap of blindly following advice from an automated system.