New bill would 'open up Big Tech's hood,' make companies explain how they decide which content to show
A new federal bill seeks to demystify how social media platforms determine which posts users see, without touching a law that has become a lightning rod in Congress.
The Algorithmic Justice and Online Platform Transparency Act of 2021, announced Thursday by Sen. Ed Markey, D-Mass., and Rep. Doris Matsui, D-Calif., aims to expose and address social injustices that are exacerbated by algorithmic amplification online.
In this context, "algorithms" refers to the parts of software programs that sites like Facebook, Twitter and Google use to decide which content and advertisements each user sees.
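To make that concrete, here is a deliberately simplified sketch of what such a ranking system does: it scores each candidate post on predicted engagement and shows the highest scorers first. The signals and weights below are invented for illustration; no platform's actual ranking system is public.

```python
# Purely illustrative sketch of a feed-ranking "algorithm" in the bill's sense.
# The signals and weights here are hypothetical, not any platform's real system.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_click_rate: float   # model's guess the user will click
    predicted_share_rate: float   # model's guess the user will share
    author_followed: bool         # does the user follow the author?

def rank_feed(posts: list[Post], limit: int = 10) -> list[Post]:
    """Order candidate posts by a weighted engagement score (made-up weights)."""
    def score(p: Post) -> float:
        return (0.6 * p.predicted_click_rate
                + 0.3 * p.predicted_share_rate
                + (0.1 if p.author_followed else 0.0))
    return sorted(posts, key=score, reverse=True)[:limit]

feed = rank_feed([
    Post("a", 0.12, 0.02, True),
    Post("b", 0.30, 0.10, False),
    Post("c", 0.05, 0.01, True),
])
print([p.post_id for p in feed])  # posts ordered by predicted engagement
```

The bill's transparency provisions target exactly the pieces this sketch glosses over: what signals feed the score, how they are weighted, and whether the result disadvantages protected groups.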
The bill would prohibit platforms from using algorithms that discriminate based on protected characteristics like race and gender, empower the Federal Trade Commission to review platforms’ algorithmic processes, and create a new inter-agency task force to investigate discrimination in algorithms.
Platforms would also have to explain to users how they use algorithms and what information they use to run them.
“It is time to open up Big Tech’s hood, enact strict prohibitions on harmful algorithms, and prioritize justice for communities who have long been discriminated against as we work toward platform accountability,” Markey said in a statement.
However, one industry group backed by companies including Amazon, Facebook, Google and Twitter warned that exposing platforms’ processes could be risky.
“No one wants tech to exacerbate racial inequality or deprive people of opportunity,” Adam Kovacevich, founder and CEO of Chamber of Progress, said in a statement. “One approach would be expanding our existing civil rights and discrimination laws in housing, employment, and credit. There’s some danger that fully lifting the hood on tech algorithms could provide a road map for hackers, Russian trolls, and conspiracy theorists.”
Researchers and government agencies have accused the platforms of employing discriminatory algorithms in the past. For example, in 2019, the Department of Housing and Urban Development accused Facebook of breaking housing discrimination laws with its ad targeting. Shortly after that, researchers from Northeastern University, the University of Southern California, and nonprofit group Upturn found Facebook’s ad delivery algorithm could discriminate based on race and gender, even if that’s not what advertisers intended.
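The Northeastern, USC and Upturn finding is easiest to see with a toy simulation: even when an advertiser targets all groups equally, a delivery system that optimizes for predicted engagement can show an ad to one group far more often than another. All numbers below are invented for illustration and do not reflect Facebook's actual system.

```python
# Toy illustration of engagement-optimized ad delivery skewing by group,
# even with neutral targeting. All rates below are invented.
import random

random.seed(0)

# Hypothetical predicted click-through rates the model has learned per group.
predicted_ctr = {"group_a": 0.08, "group_b": 0.03}

def deliver(impressions: int) -> dict[str, int]:
    """Allocate each impression to whichever eligible user scores highest."""
    shown = {"group_a": 0, "group_b": 0}
    for _ in range(impressions):
        # Each auction: one candidate user per group; the optimizer picks
        # the user with the higher predicted engagement score.
        scores = {g: predicted_ctr[g] * random.random() for g in predicted_ctr}
        shown[max(scores, key=scores.get)] += 1
    return shown

print(deliver(10_000))  # heavily skewed toward group_a despite equal targeting
```

In this toy setup, group_a wins roughly four auctions in five, which is the kind of delivery skew the researchers observed regardless of what the advertiser intended.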
Facebook said at the time it stands “against discrimination in any form” and pointed to changes it made to its ad targeting tools to address some of the concerns.
Not touching Section 230
The new bill is a notable approach to tech reform in part because of what it does not do: touch the hotly debated legal shield that protects companies from liability over what users post online.
Section 230 of the Communications Decency Act is a 1990s-era law that says online platforms are not responsible for their users’ speech and empowers platforms to moderate their services essentially as they see fit. In recent years, both Democrats and Republicans have criticized the shield as too broad.
But altering Section 230 is no easy task. Democrats and Republicans disagree on its problems and how to solve them. Progressives advocate for removing liability protection for platforms that fail to moderate certain types of content, fearing the proliferation of hate speech. Conservatives say the law should limit what platforms are allowed to moderate, claiming the platforms suppress posts expressing conservative viewpoints (the companies have denied this).
Many legal scholars have warned of the potential unintended harms of scaling back Section 230. For example, platforms could end up with an incentive to restrict speech far more broadly than lawmakers intend.
Progressive digital rights group Fight for the Future sees the new bill as a responsible way of addressing harm by Big Tech companies “without poking holes in Section 230,” according to a statement.
While introduced by two Democrats, the bill touches on a key tenet put forth by Republicans earlier this year on how they seek to handle tech reform. In an April memo, Republican staff for the House Energy and Commerce Committee urged an emphasis on transparency in content moderation practices. Markey and Matsui’s bill would require online platforms to publish annual reports for the public about their content moderation practices.