A file photo shows the icons of Facebook and WhatsApp on an iPhone in Gelsenkirchen, Germany. Last spring, as false claims about vaccine safety threatened to undermine the world's response to COVID-19, researchers at Facebook wrote that they could reduce vaccine misinformation by tweaking how vaccine posts show up on users' newsfeeds, or by turning off comments entirely. Yet despite internal documents showing these changes worked, Facebook was slow to take action. AP
Representatives from Facebook, Google, Twitter and TikTok will be questioned by members of a parliamentary committee scrutinizing the British government's draft online safety legislation.
Governments on both sides of the Atlantic want tougher rules aimed at protecting social media users, especially younger ones, but the United Kingdom's efforts are much further along. U.K. lawmakers are questioning researchers, journalists, tech executives and other experts for a report to the government on how to improve the final version of the online safety bill.
The hearing comes the same week YouTube, TikTok and Snapchat were questioned by a U.S. Senate panel. They provided little firm commitment to support U.S. legislation bolstering protection of children from online harm, which lawmakers say ranges from eating disorders and sexually explicit content to material promoting addictive drugs.
Facebook whistleblower Frances Haugen appeared before the U.K. committee this week, telling members that the company's systems make online hate worse and that it has little incentive to fix the problem. She said time is running out to regulate social media companies that use artificial intelligence systems to determine what content people see.
Haugen was a Facebook data scientist who copied internal research documents and turned them over to the U.S. Securities and Exchange Commission. They also were provided to a group of media outlets, including The Associated Press, which reported numerous stories about how Facebook prioritized profits over safety and hid its own research from investors and the public.
The U.K.'s online safety bill calls for a regulator to ensure tech companies comply with rules requiring them to remove dangerous or harmful content or face penalties worth up to 10% of annual global revenue. The European Union is working on similar digital rules.
British lawmakers are still grappling with thorny issues such as ensuring privacy and free speech and defining legal but harmful content, including online bullying and advocacy of self-harm.
They're also trying to get a handle on misinformation that flourishes on social media.
Maria Ressa, a Filipino journalist who shared this year's Nobel Peace Prize for her fight for freedom of expression at grave personal risk, acknowledged the challenge, telling the committee on Wednesday that a law to curb disinformation is needed.
"Regulation is our last hope," Ressa said. "The problem is that you will be a model for everyone else around the world, so you must be a gold standard, and that's tough." At the same time, "doing nothing pushes the world closer to fascism," she added.