Meta Ad Insider Exposes Its Inner Workings

Summary
– Brian Boland, a former Meta executive, testified that the company’s culture prioritized growth and user engagement over safety, driven by Mark Zuckerberg’s clear directives.
– He stated that Meta’s “move fast and break things” ethos discouraged considering potential harms, focusing instead on rapid product deployment.
– Boland described the company’s algorithms as “relentless” and amoral tools designed to maximize engagement, not user wellbeing.
– He disputed Meta’s public stance that user safety is a primary interest, claiming safety issues were often treated as public relations problems rather than opportunities for deep investigation.
– Boland left Meta in 2020, forfeiting significant unvested stock, after raising concerns directly to Zuckerberg and concluding the company’s priorities were growth and profit.

A former high-level Meta executive provided critical testimony in a California courtroom, detailing how the company’s internal culture and financial incentives allegedly prioritized user growth and engagement over safety concerns. Brian Boland, who spent eleven years at the social media giant, described a system designed to maximize profit, which he believes drew more users, including teenagers, onto platforms like Facebook and Instagram without sufficient regard for potential harm.
Boland’s appearance followed testimony from CEO Mark Zuckerberg, who framed the company’s mission as a balance between safety and free expression. In contrast, Boland explained the underlying business mechanics. He testified that Zuckerberg cultivated a top-down culture where growth and profit consistently took precedence over user wellbeing. Starting in advertising roles in 2009 and eventually serving as Vice President of Partnerships before leaving in 2020, Boland said he evolved from having “deep blind faith” in Meta to holding the “firm belief that competition and power and growth were the things that Mark Zuckerberg cared about most.”
He pointed to the company’s early “move fast and break things” mantra as a defining cultural ethos. The idea, he explained, was generally to avoid overthinking potential negative consequences and instead launch products quickly to learn from the market. At its peak, this philosophy was so ingrained that employees would find notes on their desks asking, “what will you break today?”
According to Boland, Zuckerberg’s priorities were unmistakably communicated to staff. Whether shifting focus to mobile-first design or racing to outpace competitors like the rumored Google+ network, the objectives were clear. Boland recalled a digital countdown clock in the office during a period the company called a “lockdown,” which underscored the urgency to meet aggressive goals. He stated that during his tenure, there was never a similar concerted effort or “lockdown” dedicated to addressing user safety issues. Instead, he said engineers understood that “the priorities were on winning growth and engagement.”
Meta’s leadership has consistently rejected the claim that it sacrifices user wellbeing for engagement. Both Zuckerberg and Instagram CEO Adam Mosseri have recently testified that creating positive user experiences aligns with the company’s long-term interests and guides its decisions.
Boland directly challenged this narrative. “My experience was that when there were opportunities to really try to understand what the products might be doing harmfully in the world, that those were not the priority,” he stated. “Those were more of a problem than an opportunity to fix.” He described a reactive approach to safety crises, where the primary focus was managing media coverage rather than conducting deep, introspective investigations. While he encouraged his own advertising team to proactively identify “broken parts,” he testified that this mindset did not permeate the broader company culture.
The plaintiff’s attorney, Mark Lanier, also had Boland explain the nature of the algorithms that power Meta’s platforms. Boland described these systems as possessing “immense power” and being “absolutely relentless” in pursuing their programmed objectives, which often centered on boosting engagement. “There’s not a moral algorithm, that’s not a thing,” Boland said. “Doesn’t eat, doesn’t sleep, doesn’t care.”
During cross-examination, Meta’s attorney noted that Boland did not work on teams specifically focused on youth safety. Boland agreed that advertising models and algorithms are not inherently negative. He also acknowledged that many of his concerns related to user-generated content, which falls outside the scope of the current lawsuit.
When asked if he had ever raised his worries directly with Zuckerberg, Boland recalled a conversation where he presented data suggesting the company’s algorithms were causing “harmful outcomes” and recommended a deeper inquiry. He said Zuckerberg’s response was along the lines of, “I hope there’s still things you’re proud of.” Boland resigned soon after.
He disclosed that he forfeited over $10 million in unvested Meta stock when he departed, though he acknowledged earning more than that during his career. Speaking out, he admitted, remains a “nerve-wracking” experience. “This is an incredibly powerful company,” Boland concluded.
(Source: The Verge)