For years, Big Tech companies have claimed they are simply platforms, not publishers, to avoid responsibility in Africa. However, a Lagos State High Court judge changed that on January 13, 2026. The court ordered Meta Platforms Inc., which owns Facebook, Instagram, and WhatsApp, to pay $25,000 in damages to Femi Falana, a well-known human rights lawyer.
This case, which began in early 2025, was not about a banned account or a lost password; it was about dignity. It centred on a deepfake ad that misled millions of Nigerians by claiming that Falana, one of their most respected legal figures, was secretly dying.
The problem started in January 2025 when a sponsored video appeared on Facebook from a page called “AfriCare Health Center.” The video skillfully used Falana’s image and a fake voice to say he had been suffering from a prostate condition for 16 years. It falsely claimed he faced ongoing pain and failed treatments, all to promote a questionable herbal remedy.
Falana, who has never had this condition, was horrified. He had not only been impersonated; his image was being used to scam vulnerable Nigerians seeking medical help. When his legal team, led by Olumide Babalola, contacted Meta to take down the video, they received the usual slow, automated responses. So Falana chose to go to court.
Monetisation Triggers Compliance Obligations
Meta’s defense, led by Mr. Tayo Oyetibo (SAN), focused on the common argument of intermediary liability. They claimed, “We didn’t create the video; a user did. We just host the site.” However, Justice Olalekan Oresanya of the Lagos High Court (Tafawa Balewa Square) disagreed.
In a ruling that will likely be referenced in law schools across Africa, the judge stated that Meta is not just a neutral bystander. He decided that Meta acts as a “Joint Data Controller” because it:
- Earns money from the content (through ad revenue),
- Decides who sees it (using algorithms), and
- Sets the platform rules.
“A global technology company which hosts pages for commercial benefit owes a duty of care to persons affected by content disseminated on its platform,” Justice Oresanya ruled. “Where a platform monetises content and the harm from misinformation is reasonably foreseeable, it cannot escape liability.”
The $5 Million Gamble That Ended in a $25,000 Award
Femi Falana originally sued for $5 million (about ₦7.5 billion). He claimed that the invasion of his privacy caused him “mental and emotional distress.” The court agreed that his privacy was violated under Section 37 of the Nigerian Constitution and the Nigerian Data Protection Act (NDPA). However, it awarded him only $25,000.
Why the lower amount? In defamation and privacy cases, courts distinguish “punitive” damages (meant to punish the offender) from “compensatory” damages (meant to make up for the harm suffered), and compensatory awards track the harm the court finds proven rather than the figure claimed. While $25,000 might seem small for a company of Meta’s size, the legal win is significant. It sets a precedent: Nigerian courts now have a case showing that Big Tech can be held responsible for the ads it hosts and profits from.
What This Means for You
For the average Nigerian scrolling through Facebook, this judgment is a wake-up call.
- For Content Creators: Meta may start enforcing stricter rules against misleading content to avoid further legal exposure.
- For Scammers: Using AI deepfakes to impersonate celebrities (like Elon Musk or Dangote) is now riskier.
- For Victims: If a platform ignores your report about a fake profile or scam ad, you can now refer to this court ruling.
While Falana did not receive the $5 million he wanted, he achieved something more important. He held a giant company accountable, even if just a little. In the battle of African law against Silicon Valley, it’s clear that David just scored a point.
