(Daily Caller) My research team is currently monitoring online political content being sent to voters in swing states through more than 2,500 computers owned by a politically diverse group of registered voters (our “field agents”), and we are concerned about what we’re seeing.
We are aggregating and analyzing search results on the Google and Bing search engines, messages displayed on Google’s home page, autoplay videos suggested on YouTube, tweets sent to users by the Twitter company (as opposed to tweets sent by other users), email suppression on Gmail, and more.
We have so far preserved more than 1.9 million “ephemeral experiences” – exposure to short-lived content that impacts people and then disappears, leaving no trace – that Google and other companies are able to use to shift opinions and voting preferences, and we expect to have captured more than 2.5 million by Election Day.
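To give a concrete sense of what preserving an “ephemeral experience” involves, here is a minimal sketch, in Python, of how a field-agent client could timestamp, hash, and archive a piece of short-lived content before it vanishes. The function names, the storage schema, and the example record are illustrative assumptions, not the actual monitoring software.

```python
# Minimal sketch of how a field-agent client might preserve an "ephemeral
# experience" before it disappears. All names (open_archive, preserve_experience,
# the SQLite schema, the example record) are hypothetical illustrations, not
# the project's actual software.
import hashlib
import json
import sqlite3
from datetime import datetime, timezone

def open_archive(path="ephemeral_archive.db"):
    """Create (or open) a local archive table for captured content."""
    db = sqlite3.connect(path)
    db.execute(
        """CREATE TABLE IF NOT EXISTS experiences (
               captured_at TEXT,   -- UTC timestamp of capture
               agent_id    TEXT,   -- anonymous ID of the field agent's machine
               platform    TEXT,   -- e.g. 'google_search', 'youtube_autoplay'
               sha256      TEXT,   -- content hash, to detect duplicates or tampering
               payload     TEXT    -- the raw content, stored as JSON
           )"""
    )
    return db

def preserve_experience(db, agent_id, platform, content):
    """Timestamp, hash, and store one piece of short-lived content."""
    payload = json.dumps(content, sort_keys=True)
    digest = hashlib.sha256(payload.encode("utf-8")).hexdigest()
    db.execute(
        "INSERT INTO experiences VALUES (?, ?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), agent_id, platform, digest, payload),
    )
    db.commit()
    return digest

if __name__ == "__main__":
    db = open_archive()
    # Hypothetical capture: the ordered search results one agent saw at one moment.
    preserve_experience(
        db,
        agent_id="agent-0001",
        platform="google_search",
        content={"query": "example query", "results": ["url-1", "url-2", "url-3"]},
    )
```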
In emails leaked from Google to The Wall Street Journal in 2018, Googlers (that’s what they call themselves) discussed how they might be able to use “ephemeral experiences” to change people’s views about Trump’s travel ban. The company later denied that this plan was ever implemented, but leaked content (including multiple blacklists) and startling revelations by Tristan Harris, Zach Vorhies, and other whistleblowers show that Google is indeed out to remake the world in its own image. As the company’s CFO, Ruth Porat, said in a November 11th, 2016 video that leaked in 2018, “we will use the great strength and resources and reach we have” to advance Google’s values.
Since early 2016, my team has been developing and improving Nielsen-type monitoring systems that allow us to do to Google-and-the-Gang what they do to us and our children 24/7: to track their activity, and, specifically, to preserve that very dangerous and persuasive ephemeral content.
Since 2013, I have been conducting rigorous controlled experiments to quantify how persuasive that kind of content can be. I’ve so far identified about a dozen new forms of online manipulation that make use of ephemeral experiences, and nearly all these techniques are controlled exclusively by Google and, to a lesser extent, other tech companies.
These new forms of influence are stunning in their impact. Search results that favor one candidate (in other words, that lead people who click on high-ranking results to web pages that glorify that candidate) can shift the voting preferences of undecided voters by up to 80 percent in some demographic groups after a single search. Carefully crafted search suggestions that flash at you while you are typing a search term can turn a 50/50 split among undecided voters into a 90/10 split, with no one knowing they have been manipulated. A single question-and-answer interaction on a digital personal assistant can shift the voting preferences of undecided voters by more than 40 percent.
In 2020, the 1.5 million ephemeral experiences we aggregated from the computers of our 1,735 field agents showed us manipulations that were sufficient, in theory, to have shifted more than six million votes to Joe Biden (whom I supported) – again, without people knowing they were being manipulated. Among other findings: Google was sending more go-vote reminders to liberals and moderates than to conservatives; that is a brazen and powerful manipulation that would have gone completely undetected if no one had been monitoring.
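To illustrate how a skew like that could be surfaced from field-agent data, here is a minimal sketch that compares the rate at which each political group was shown a go-vote reminder. The record format and the group labels are hypothetical; this shows the comparison in principle, not the actual analysis pipeline.

```python
# Minimal sketch of how a skew in go-vote reminders might be surfaced from
# field-agent data. The record format and the 'leaning' labels are hypothetical;
# this illustrates the comparison, not the project's actual analysis code.
from collections import defaultdict

def reminder_rate_by_leaning(records):
    """records: list of dicts like
    {'agent_leaning': 'liberal'|'moderate'|'conservative', 'saw_reminder': True/False}.
    Returns the share of captured sessions in which each group saw a reminder."""
    seen = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["agent_leaning"]] += 1
        if r["saw_reminder"]:
            seen[r["agent_leaning"]] += 1
    return {group: seen[group] / total[group] for group in total}

if __name__ == "__main__":
    # Hypothetical sample: liberals see a reminder 6 times out of 10,
    # conservatives 2 times out of 10.
    sample = (
        [{"agent_leaning": "liberal", "saw_reminder": i < 6} for i in range(10)]
        + [{"agent_leaning": "conservative", "saw_reminder": i < 2} for i in range(10)]
    )
    print(reminder_rate_by_leaning(sample))  # {'liberal': 0.6, 'conservative': 0.2}
```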
Our preliminary analyses of the data we have collected so far in 2022 are equally disturbing. In swing states, and especially in Wisconsin, Arizona, and Florida, we are finding a high level of liberal bias in Google search results, but not in search results on Bing (the same pattern we have found in every election since 2016). In several swing states, 92 percent of the autoplay videos being fed to YouTube users are coming from liberal news sources (YouTube is owned by Google). Unless Google backs down, it will shift hundreds of thousands of votes on Election Day itself with those brazen targeted go-vote reminders – and we will catch the company doing so.
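For readers who want to see what a tally like that 92 percent figure looks like in practice, here is a minimal sketch that computes the share of autoplay recommendations coming from each category of source. The records and the labels are hypothetical placeholders; how a news source gets classified in the first place is a separate question this sketch does not address.

```python
# Minimal sketch of the kind of tally behind a figure like "92 percent of
# autoplay videos came from liberal news sources." The records and the source
# labels are hypothetical placeholders; the classification scheme itself is
# not shown here.
from collections import Counter

def autoplay_share_by_label(records):
    """records: list of dicts like {'video_id': ..., 'source_label': 'liberal'/'conservative'/'other'}.
    Returns the percentage of autoplay recommendations per source label."""
    counts = Counter(r["source_label"] for r in records)
    total = sum(counts.values())
    return {label: 100.0 * n / total for label, n in counts.items()}

if __name__ == "__main__":
    # Hypothetical sample: 92 autoplay videos labeled 'liberal', 8 labeled 'other'.
    sample = (
        [{"video_id": i, "source_label": "liberal"} for i in range(92)]
        + [{"video_id": 100 + i, "source_label": "other"} for i in range(8)]
    )
    print(autoplay_share_by_label(sample))  # {'liberal': 92.0, 'other': 8.0}
```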