As the “fake news” crisis escalates in the U.S. and Europe, Arjuna Capital is asking Facebook and Google to evaluate the impact fabricated news and related hate speech are having on their businesses and platforms. The independent RIA is concerned that if these technology giants don’t adequately address this issue, they’ll lose the trust of their users and may jeopardize long-term shareholder wealth.

On February 2, Arjuna Capital, in partnership with investment advisory firm Baldwin Brothers Inc. and with assistance from nonprofit Open MIC, filed a shareholder resolution with Facebook asking it to report its policies for blocking ads and posts from fake news sites, evaluating fabricated content claims, and managing the issue without impacting free speech. Arjuna Capital filed a similar resolution on December 29 with Alphabet Inc., the parent company of Google.

This isn’t Arjuna Capital’s first foray into Silicon Valley. Last year, it filed resolutions on the issue of gender pay equity with nine major tech companies, six of which reported that their gender pay gaps were closed, nearly closed or would soon close.

Prompting the latest resolutions were the troubling “revelations that came out of the back half of 2016 regarding the scale and impact of fake news,” Natasha Lamb, a managing partner at Arjuna Capital and the firm’s director of equity research and shareholder engagement, tells Financial Advisor. “All of a sudden, fake news has become a material issue, for investors and our democracy.”

Shareholders are concerned Facebook users will become disenchanted and move on to the next thing, as they did with the MySpace and Napster platforms, she says. “As for our democracy, if we don’t have an informed electorate, we have a real problem,” she adds.
 
Lamb hopes to have productive dialogues with Facebook and Alphabet before their annual shareholder meetings in June. “If that can’t be achieved, we expect the issue will go to a vote,” she says.

So far, on the gender pay issue, neither company has provided “the kind of meaningful disclosure we see from their peers,” she says. However, she adds, “Given the current threat fake news poses to Facebook’s brand, I would expect greater responsiveness.”
 
Facebook’s transformation from a technical platform to a media platform “provides a briar’s nest of complications,” she says. “How is Facebook addressing the issue without impinging on free speech rights, without censorship, and without creating systemic bias every time they tweak their algorithm?”

Facebook provides “the platform and the financial incentive to propagate fabricated content,” she says. “Whether the users’ primary intent is to make money or manipulate, it doesn’t much matter; the platform has served both ends.” Furthermore, she asks, “When users are empowered to flag fake news, what stops them from flagging real news as fake?”

Fixing the problem of fake news “is not as simple as updating an advertising policy or algorithm,” adds Lamb. “Investors want to ensure [companies] have systems in place to adapt to a rapidly changing information landscape.”