Here’s how US lawmakers could finally rein in Facebook

"Here's my message to Mark Zuckerberg: Your time to invade our privacy, promote toxic content and rape children and teens is over. Congress will take action."

Congress is currently considering about a dozen proposed bills targeting Big Tech, some of which could force Meta to change how it handles algorithmic recommendations and user data collection, as well as its ability to make acquisitions. Late last year, a bipartisan group of 10 state attorneys general launched an investigation into Meta, focused on the potential harms of its Instagram platform to young users.

And last week, a federal judge said the Federal Trade Commission could proceed with a lawsuit seeking to break up Meta, after the company had argued that the complaint should be dismissed. (The case could drag on for years.) The FTC and several state attorneys general are also reportedly investigating Meta's Oculus virtual reality unit over antitrust concerns, according to a Bloomberg report Friday that cited people with knowledge of the case.
Some industry observers have pointed to newly appointed federal officials such as FTC Chair Lina Khan, a vocal critic of the tech industry, as well as lawmakers' sharper focus, as cause for optimism that something could finally happen on the regulatory front.

"You see much less of the politicized comments and much more focus and coordination on these issues, the underlying technology behind them and the business model," said Katie Paul, director of the tech advocacy group Tech Transparency Project. "Clearly, many of these members of Congress have done their homework, and they understand what they are looking at."

Yet after years of talk and glimpses of progress, it remains unclear whether or when U.S. lawmakers and regulators will take successful action, as their EU and UK counterparts have, to limit Meta's power, as well as Big Tech's power more broadly. And the window of opportunity may be limited, as preparations for the US midterm elections could divert attention from advancing new legislation.
Recent revelations from former Facebook employee and whistleblower Frances Haugen, and the hundreds of internal documents she leaked, have heightened bipartisan support for new legislation on protecting children online. But the outlook for the many other Meta-related proposals is murkier, and not just because of the company's enormous lobbying power.
Despite their agreement that something must be done to counter Big Tech's dominance - and to crack down on Meta in particular - Democrats and Republicans disagree on what the core problem actually is. Republicans accuse Facebook of anti-conservative bias, despite a lack of evidence, while Democrats are concerned that the company is not doing enough to protect against hate speech, misinformation and other problematic content.
The stakes of action, or inaction, only continue to grow. The "Facebook Papers" revealed a wide range of potential real-world harms stemming from Meta's platforms. Yet lawmakers are still largely catching up on understanding and regulating the company's existing platforms, even as Meta pushes ahead with its shift toward becoming a "metaverse business" and perhaps shaping a whole new generation of user experiences.

"Congress must seize this historic moment - a crucial turning point to rein in Big Tech," Senator Richard Blumenthal, the Connecticut Democrat who chairs the Senate Commerce Consumer Protection Committees, told CNN Business. "After seeing Big Tech's injuries and abuses, in our hearings and their own lives, Americans are ready for action - and results."

Here are a few of the approaches legislators could take.

Section 230

One of the first places legislators and experts often look when considering new rules for technology companies like Meta is a piece of federal law known as Section 230 of the Communications Decency Act.

The 25-year-old law prevents tech companies from being held accountable for the content that users post on their platforms. For years, big tech companies have relied on the law to avoid being held accountable for some of the most controversial content on their platforms, by using it to dismiss lawsuits over messages, videos and other content created by users.

Momentum has grown on Capitol Hill around the idea of scrapping or updating Section 230, which could expose technology platforms to more lawsuits over hate speech and misinformation. Proposed changes include making platforms responsible for hosting child abuse content. President Biden has also proposed that platforms be held accountable for hosting misinformation related to vaccines. (Social media companies and industry organizations have lobbied hard against changes to Section 230.)

But there is a major obstacle to this approach, experts say: the First Amendment. Even if lawmakers got rid of Section 230 and, for example, Meta faced lawsuits over misinformation on its platforms, that speech is protected by the First Amendment. That means the company would probably still win in the end, according to Jeff Kosseff, a cybersecurity law professor at the U.S. Naval Academy and author of a book on Section 230 called "The Twenty-Six Words That Created the Internet."

"Where Section 230 really makes a difference is in things like defamation cases," Kosseff said. "But that's not really what drives the debate around Facebook and other social media - it's more of this legitimate but horrible type of content."

Kosseff also raised the concern that attempts to hold tech platforms accountable for certain types of speech - such as health misinformation - could give the government considerable leeway to decide what content falls into those categories.

"There have been some countries that have enacted laws on fake news and they have abused them just as one would expect one would," he said.

Algorithms

Haugen, meanwhile, has called for reform of Section 230 to hold platforms accountable for how their algorithms promote content. In that scenario, Meta and other technology companies would still not be responsible for user-generated content, but could be held accountable for the ways their algorithms promote that content and help it go viral.

Bipartisan legislation introduced in the House in November would take a slightly different path by requiring large technology companies to give users access to a version of their platforms where what they see is not shaped by algorithms at all.
Perhaps in anticipation of such a law, Meta-owned Instagram has said it will bring back users' ability to view a reverse-chronological version of their feed (one not reordered by its algorithm) next year. Facebook already offers this option, but it can be frustrating to use - rather than sitting in the settings menu where users might expect to find it, it is toggled via a button buried in a long menu on the left side of the News Feed screen, and it resets each time you close the page.

Privacy

Lawmakers have also used recent hearings on Meta to push for updated privacy laws.

"We have done nothing to update our privacy laws in this country, our federal privacy laws. Nothing. Zilch," Minnesota Democratic Senator Amy Klobuchar said during Haugen's hearing.

Currently, progress on this front is coming more at the state level than at the federal level.

The California Consumer Privacy Act, which took effect last year, gives consumers the right to require large corporations to disclose what data they have collected about them. Under the law, consumers can also ask companies to delete their data and, in some cases, sue companies over data breaches. Meanwhile, Virginia's Consumer Data Protection Act (set to take effect in 2023) also gives consumers more control over their online data, but it contains more exceptions than the California law and does not allow consumers to sue businesses. A federal bill could help provide consistent, nationwide standards for how data can be collected and sold online.

Congress is considering the KIDS Act, which aims to protect Internet users under 16 in various ways, including by banning the use of age verification data for commercial purposes, as well as the SAFE DATA Act, which would give consumers more choice in how their data is collected and used.

A new tech oversight body

In his testimony to a Senate subcommittee earlier this month, Instagram chief Adam Mosseri proposed the creation of an industry body that would set standards for "how to verify age, how to build age-appropriate experiences, how to build parental controls" and other best practices on social media.

But lawmakers did not seem enthusiastic about the idea of leaving standards and oversight to industry players. "Self-policing depends on trust, and trust is gone," Blumenthal said during Mosseri's hearing.

Instead, lawmakers and advocates are pushing for the creation of a new federal regulatory body responsible for overseeing Big Tech. It could be tasked with developing the frameworks and structures needed to regulate the technology industry, similar to the mechanisms in government that help oversee the banking industry, TTP's Paul said. It could also, as Haugen testified, serve as "a regulatory home where someone like me could do a tour of duty."
Such a body would help complement the limited existing accountability structures around Meta. Facebook's Oversight Board - which says it acts independently even though its members are appointed and paid by the company - is only responsible for weighing in on content moderation decisions. Even then, the board has so far focused on narrower, one-off decisions rather than the many broader, structural problems the company faces (although it has pushed for greater transparency).

The role of the FTC

If Congress passes any Big Tech laws, the FTC will play a key role in enforcing them. And even if we do not see new legislation in the next year, Meta will not necessarily be off the hook.


The judge's ruling Tuesday in the FTC case opens the door to perhaps the most existential threat yet to Meta: the FTC is seeking to unwind Meta's acquisitions of Instagram and WhatsApp. (Meta has previously said it is confident that "the evidence will reveal the fundamental weakness of the [FTC's] claims.")

The case will give Khan, the FTC chair, a chance to make her mark as a federal regulator - and there is some reason to believe Meta is nervous. Last July, company officials petitioned the FTC asking that Khan recuse herself from all cases involving the social media giant (she has not). Meta also argued that the FTC's case should be dismissed on the grounds that Khan should not have been allowed to vote to approve the updated complaint; the judge, however, sided with the FTC.

In addition to the agency's lawsuit, Khan said last month that the FTC is considering drafting new rules that would better regulate how U.S. companies can use data and algorithms. The effort could lead to "market-wide requirements" targeting "harms that may stem from commercial surveillance and other data practices," Khan said in a letter to Blumenthal. It could deal another potential blow to Meta's business model.
And Friday's report that the FTC is also working with state attorneys general to investigate potential anticompetitive practices involving Meta's Oculus - a key unit in its plans for the metaverse - indicates that the company's future ambitions are also at risk of a regulatory crackdown.

--CNN's Brian Fung contributed to this report.
