Facebook has once again found itself the target of a government investigation.
The Department of Housing and Urban Development (HUD) filed charges against the social media giant for alleged housing discrimination. The agency argues Facebook’s ad targeting system allowed property owners to exclude applicants by race and gender.
Passed in 1968, the Fair Housing Act made it illegal to discriminate in housing based on race, color, religion, sex, and national origin; later amendments extended those protections to disability and familial status.
Facebook’s Advertising Problem Isn’t New
ProPublica first revealed in 2016 that Facebook’s advertising system was vulnerable to discriminatory practices. The organization reported that the platform’s “Ethnic Affinities” tool classified users based on the pages and posts they liked.
Consequently, advertisers could prevent users who belonged to certain groups from ever seeing their ads.
Facebook promised significant reforms after the ProPublica article was published. However, the media organization ran a follow-up piece a year later that found it was still possible to post ads that excluded users based on their race, national background, and religious affiliation.
Facebook expressed surprise in response to HUD’s charges. The firm noted it had recently removed thousands of targeting options from its advertising portal.
However, the firm only made many of those changes after being sued by the National Fair Housing Alliance.
The corporation also stated that it had been working with HUD to reform its advertising practices, but the collaboration ended after the agency allegedly asked for access to private user data.
Background Check AI Company Also Accused of Discrimination
Notably, Facebook is not the first tech platform to face major backlash for participating in housing discrimination.
Recently, digital screening company CoreLogic was criticized for allowing landlords to conduct potentially biased background checks. Using an artificial intelligence (AI) powered tool, the company could tell property owners whether applicants had criminal backgrounds and whether they were trustworthy.
However, The Verge pointed out that the company’s AI had a tendency to disproportionately disqualify black and Latino applicants in a way that a human landlord might not.
CoreLogic argued that as a background check company, the Fair Housing Act doesn’t apply to the service it provides. Nevertheless, a federal judge in Connecticut ruled the company can be sued by a man who says its flawed background reporting prevented him from renting an apartment.
What Needs to Happen Next
As recently as 2011, the Justice Department levied fines against newspapers for publishing ads that featured discriminatory language. While Facebook has been good about banning biased housing posts, the platform’s targeted ad tools let advertisers narrowcast the availability of their properties. Hence, HUD believes Facebook has indirectly violated the law.
Moreover, the social media giant has known about the issue for years, yet it hasn’t made many substantial changes. One has to ask why. The Verge notes Facebook’s advertising is effective because it allows advertisers to decide who sees what they have to offer. If the company made its targeting system less precise, it would undoubtedly weaken its efficacy and value as a marketing channel.
In Q4 2018, Facebook made $16.6 billion in advertising revenue. As such, it’s not inconceivable that the company allowed its ad portal to be misused, believing its earnings would offset potential fines and damage awards.
Conversely, it’s possible the technology company’s leadership was unaware that essentially classifying people by race would inevitably lead to discrimination.
Similarly, CoreLogic may not have intended for its product to bar black and Latino applicants from securing housing. But a court may rule against the firm regardless of intent.
Right now, there are two big takeaways from the Facebook and CoreLogic lawsuits. First, the government needs to better define how decades-old legislation applies to contemporary tech companies. Second, the technology sector needs the same level of oversight and regulation as every other industry.
Leaving corporations to police themselves isn’t a good idea; just ask Facebook founder Mark Zuckerberg.