People say that companies like
Facebook and Twitter can do business, and refuse to do business, with whoever
they want, because they are private companies.
However, I don’t see much difference between refusing to serve customers
because of their political views and refusing to do business with people
because of their religious views. I know
all about refusing to sell to people because of their religious views, because
it happened to my grandparents. My Methodist grandmother bought a house in her name alone because my Jewish grandfather could not buy in the restricted neighborhood where the house was located. The deed stated no Jews or
blacks. We needed the Fair Housing Act of 1968 to stop such nonsense. Perhaps
we need a legislative solution to stop discrimination against people for their
political views.
Facebook, Twitter and similar
companies take advantage of an exemption in the copyright law that says they
are not responsible for material on their site that violates copyrights because
they don't exercise editorial control. Since they supposedly exercise no editorial control, they are considered a platform, not a publisher. If they are blocking content for failure to meet their obviously political “community standards,” there is a good argument that they are no longer a platform; they have become a publisher. I think the law should be changed so that companies like Facebook and Twitter have a choice: do not censor content and keep the exemption, or exercise control over content and lose the copyright exemption. Platform censorship should be limited to stopping people who advocate violence or spread explicit pornography.
If that’s too radical a change
to pass Congress, I have two alternatives.
The milder legislative alternative is to force companies like Facebook
and Twitter to publish a full explanation of their “community standards.” At
the moment, there is no way anybody can know in advance what will offend My
Lords Dorsey
and Zuckerberg. They should have to
publish guidelines in advance so that users actually have a chance of skirting
their censorship. The companies also should be required to give anyone whose posts are censored, or whose account is suspended or banned, an “Error Report” that shows the offending post(s), explains what is wrong with them, and tells the user how to correct them, so that the user can avoid future censorship or an outright ban. These “Error Reports” could also be used to push back against censorship that is extreme or completely arbitrary. In other words, sunlight could be a good disinfectant for this censorship behavior.
The other solution, I think, would require an anti-trust suit and settlement. Platforms like Facebook and Twitter are natural monopolies. Part of their convenience is that everyone is on the same platform, exchanging information. The monopoly that needs to be broken up is control over what the user sees.
There is no technical reason that only Facebook can select what you see. Facebook could be forced to share its data with third parties who would do the selecting, and users could then choose who selects what they see. Facebook would still manage the platform and could place some ads based on customer data. The third parties selecting content could also place some ads alongside the content they show users.
Facebook would no longer be allowed to remove any content except content advocating violence or containing pornography. With users free to choose which company selects content for them, competition should remove any need for government content regulation.
The first company hit with such an anti-trust settlement might scare the others straight. The settlement would represent a tremendous loss of value to the platform company, which would no longer receive the monopoly rents provided by exclusive content filtration. Hopefully, the abuse of content-filtration monopolies for political censorship would stop.