Supreme Court clarifies one social media challenge, leaves one for later
If government doesn’t have the power to dictate what a reporter reports (and it doesn’t), it follows that government doesn’t have authority to dictate what social media companies must include on their sites.
That’s a simple proposition. And it is accurate, affirmed by every justice on the U.S. Supreme Court in a July ruling. As clear as the decision in Moody v. NetChoice was, arguments will persist that the power of social media companies is either abused or too pervasive. A companion unresolved issue is the platforms’ blanket immunity from liability for content posted by users.
In 2021, conservative legislators in Florida and Texas reacted to their belief that Facebook and others were filtering out conservative viewpoints and boosting liberal ideals. Both states passed laws ordaining that the platforms must follow the states’ rules on content instead of setting their own.
Soon after, the 11th Circuit Court heard a challenge to the Florida law and issued a mixed opinion. Then, the 5th Circuit Court, which covers Texas, Mississippi and Louisiana, heard an appeal of the Texas statute. The 5th Circuit judges went whole hog in agreeing that private companies should defer to government controls. “We reject the idea that corporations have a freewheeling First Amendment right to censor what people say,” was the 5th Circuit’s view.
In sorting out the different rulings, Supreme Court Justice Elena Kagan, writing for the court, slapped down both lower courts, chastising the 5th Circuit more severely. “However imperfect the private marketplace of ideas, here was a worse proposal — the government itself deciding when speech was imbalanced, and then coercing speakers to provide more of some views or less of others,” she wrote.
Absolutely. The entire premise of the First Amendment is that individuals bear both the burden and the opportunity to decide which expressions have merit, which don’t and which are irrelevant. Interposing government filters, the Founders believed, would make the government more powerful than the people, defeating the very essence of democracy.
The Supreme Court’s opinion relied heavily on a 1974 decision involving the Miami Herald. In that case, a politician aggrieved by the newspaper’s reporting demanded that the government force the newspaper to publish his response. The court ruled for the newspaper, saying the First Amendment protected the newspaper’s “editorial discretion” in deciding what to print.
Technology has changed how most people get information, but the Moody opinion says, “the principle does not change because the curated compilation has gone from the physical to the virtual world.”
‘Imbalance’ not the only rationale for controls
Again, the legislators in Florida and Texas were reacting to a perceived ideological imbalance, whether real or imagined. The laws they passed were rooted in the idea of fairness, which is a perfectly legitimate goal. And, as noted, the push came from the right side of the political spectrum.
Strange as it may be, however, the strongest advocacy for ending “editorial discretion” as practiced by social media companies comes from the left. A leader in that arena is New York Law School professor emerita and former ACLU president Nadine Strossen. She sees content curation (platforms deciding what can and can’t be posted) as “[a] great threat to free speech [that] comes from powerful private sector forces, with great power over speech, which have had a strong speech-suppressive impact.”
The potential to manipulate public opinion is something Kagan didn’t question. According to the opinion, Facebook, which is not the most popular social media platform, still accepts 100 billion posts per day. Five hundred hours of new video are uploaded to YouTube every minute. Those are staggering numbers, and there’s no question that a corporation with an audience so vast could engage in all sorts of treachery. Strossen says, simply, that’s too much power to place in private hands.
To make that claim, however, is to ignore the purpose for which the platforms exist. The big ones aren’t political. They are businesses. They are big businesses. Their content may lean politically left or right on any given day, but their filters are designed to provide a return on investment for advertisers by entertaining the public. This is not to accuse social media platform owners of caring nothing about the greater good, but they set their algorithms to reject profanity, nudity, hate speech, bullying, gore and such in the holy name of audience retention. And only in the name of audience retention.
More importantly, the business model of social media companies depends on setting their own controls. Imagine a retailer not allowed to select its own stock. Same thing.
The blanket immunity question lingers
The immunity issue is based on Section 230 of the federal Communications Decency Act of 1996. In its review of the Texas law, the 5th Circuit addressed it. The Supreme Court apparently didn’t think it was necessary to do so – at least not yet.
Most provisions of the Communications Decency Act failed in the courts, as the prurient content of many internet sites makes apparent. In Reno v. ACLU, decided by a 7-2 margin the year after Congress passed the act, Justice John Paul Stevens wrote for the majority that the internet deserved the same First Amendment protection as newspapers, magazines or books. No more, and no less.
Section 230 survived the constitutional test, though, and remains in effect. It gives internet service providers blanket immunity from liability for content posted by users. That’s something no traditional media company has ever enjoyed.
When adopted, Section 230 made perfect sense. There was no social media. The internet served emailers and chat room chatterers. Providers of those services didn’t screen or monitor what people wrote, so blaming the platforms was like blaming a table because somebody put a rotten apple on it.
Today, things are different. Very different. Social media platforms have “commodified” the internet. They set content standards and aggressively filter what people post. But Section 230 says the owners cannot be sued for damages based on the content they allow. X, Instagram, Facebook and all the others practice the same “editorial discretion” as traditional media. What they don’t have is any legal responsibility for anything they allow to be written or uploaded to their platforms.
One word for this is “duality.” Individuals and all mass communication media except social media face consequences for their decisions. It’s a double standard and one, admittedly, with no easy or apparent remedy. The U.S. Justice Department and countless experts, individuals and assorted agencies have pondered the situation, yet no consensus has emerged.
In many of the Supreme Court’s decisions during this year’s term, conservative justices sided with conservative views and liberal justices were in lockstep with liberal views. On this question, they were unified in their respect for freedom of expression. That’s good news for the First Amendment. The issue of social media’s immunity from liability remains, however, and eventually it must be decided whether it’s right for social media companies to have both the power to make choices and no responsibility for the choices they make.
Charles Mitchell is a member of the panel of experts at the Overby Center for Southern Journalism and Politics.