Back in July, Steve Bannon, then still a chief advisor to President Trump, expressed his view that the largest data/media companies such as Facebook (FB) and Google (GOOGL) should be highly regulated, much like public utilities or, to a lesser degree, the cable and telecom industries.
At the time, many felt Bannon's view was a minority position within the administration, and in the days following his firing in August it was pretty much dismissed out of hand.
But this week's acknowledgement by Facebook that Russian firms and money sponsored news and bought ads https://www.reuters.com/article/us-facebook-propaganda/facebook-says-some-russian-ads-during-u-s-election-promoted-live-events-idUSKCN1BN2VG to influence the 2016 election has revived discussion as to whether Facebook should have not only tighter internal controls but possibly government oversight.
The argument for some form of regulation stems from the huge market share that Facebook and Google (along with its YouTube unit) control in social media and search, respectively.
Some argue they are far from monopolies, given competitors ranging from major news outlets to broadcast TV and other sites, and that the technical barriers to entry for internet services are low enough that regulation is unnecessary.
But one of Bannon's points that some are starting to agree with is that we may need to adopt a new "frontier view" of monopoly regulation. While Google and Facebook don't control a network of wires, their dominance creates other barriers, including so-called "network effects," that make it hard for new entrants to compete.
In fact, the notion that Facebook might be a utility was initially expressed by founder and CEO Mark Zuckerberg in a 2006 interview with Time, in which he called it a "social utility," a view he routinely repeated over the next few years but has recently stepped back from, while still maintaining the company has a huge social responsibility.
The initial acknowledgment of a need to monitor content came a few years ago, when videos, from plain stupid stunts to very disturbing live streams of suicides and shootings, started appearing with greater frequency.
But as upsetting and socially disruptive as those incidents may have been, even sparking and helping to organize protests, they didn't really affect Facebook's essential business of an advertising-driven revenue model.
Advertisers Running Scared
Which brings us to a challenge to Facebook that is larger, and probably more likely, than regulation: a slowdown or outright withdrawal of advertising by major corporations.
Blue chip brands such as Procter & Gamble (PG), which had already reduced its ad spending allocated to FB, have always worried their ads might be placed next to unsavory or misaligned content.
This week brought news that Facebook's ad-targeting tools actually let advertisers screen for hate groups https://www.propublica.org/article/facebook-enabled-advertisers-to-reach-jew-haters using neo-Nazi and anti-Semitic categories such as "Jew hater" to target ads.
This is a big black eye, and Facebook knows that if it doesn't address the issue swiftly, competently and completely, it could poison the entire platform, as the spillover into what each person "likes" gets blurry as network connections expand. No major company wants its product even remotely associated with such groups.
The question is not only how, and at what cost, such monitoring will be done (will each ad and site page need to be vetted?) but where the lines will be drawn. At what point does "editing" become censorship? And does placing controls on what was built as an open platform become bad for business?
As of now, investors don't seem concerned: the shares are up 45% year to date and stand just 1% below their all-time high.
— The Option Specialist