It appears Facebook, now called Meta, is once again preparing to flex its technological might, societal influence, and intelligence-community connections ahead of a consequential presidential election.
Meta has appointed former CIA agent Aaron Berman — who up until recently served on its "misinformation" team — to oversee its elections policies.
According to Berman's Meta-verified LinkedIn page, he spent 17 years working as a senior intelligence analyst at the CIA. His duties included contributing to the oftentimes classified President's Daily Brief, "a daily summary of high-level, all-source information and analysis on national security issues produced for the president and key cabinet members and advisers"; briefing various National Security Council officials and members of Congress; and engaging with entities within the administrative state.
Berman first joined Facebook as a "Senior Product Policy Manager, Misinformation" in July 2019. In that role, he "built the misinformation policy team’s US workforce and put policies into practice during critical events."
Berman clarified his censorial role in a company video, noting he was "part of the team that writes the rules for Facebook" and in the business of determining "what is acceptable and what is not."
The former intelligence agent indicated he also was tasked with finding a balance between "harmful content" and "protecting freedom of speech."
In 2020, that balance appeared to favor Democrats or at the very least disfavor Republicans and other conservatives.
The New York Times indicated that, at the time, the company "hired tens of thousands of employees to secure the site for the election, consulted with legal and policy experts and expanded partnerships with fact-checking organizations."
While a foundation Mark Zuckerberg started poured over $400 million into election offices nationwide, the Meta CEO executed on his purported "responsibility to protect our democracy."
In a September 2020 post, Zuckerberg said, "This election is not going to be business as usual."
With the help of former statists like Berman, the company worked to "fight misinformation," "remove misinformation," limit the forwarding of information on Messenger, and remove undesirable political groups from the platform. The impact proved historic.
The company has since admitted to suppressing damning stories about Hunter Biden's laptop, which the majority of Americans reckon would have changed the outcome of the election, according to a poll conducted last year by the New Jersey-based Technometrica Institute of Policy and Politics.
The National Review reported that Berman's colleagues, under pressure from the FBI, ensured few would learn about how then-candidate Joe Biden may have been seriously compromised.
Last month, Berman received a promotion: he is now "Lead for Elections Content Policies," a role in which he will put to work his alleged skills in the areas of content moderation, trust and safety, and policy writing.
According to his LinkedIn profile, in his new role Berman will lead "a team responsible for elections-related content policies worldwide"; oversee policy development; advise senior executives; coordinate "with teams on implementation via technical and human workflows"; and put "policies into practice on key elections."
Miranda Devine, a journalist with the New York Post and a victim of Meta's last round of election-altering censorship, said, "It's happening again, right in front of our eyes."
The Daily Mail reported that Berman is just one of several former CIA agents who are working or have recently worked at the company.
Here is Berman with other censorship czars discussing "best practices for social media companies to mitigate online misinformation and disinformation" at Stanford University's 2021 conference on "Social Media, Ethics, and COVID-19 Misinformation (INFODEMIC)," in a session titled "The Role of Social Media Companies," featuring Dr. Anne Merritt, Aaron Berman, and Brian Clarke.