Facebook was due to appear before the Senate’s Select Committee on Foreign Interference through Social Media on Friday, alongside controversial video-sharing platform TikTok.
While TikTok’s name is still on the schedule, Facebook has pulled out.
A statement from the committee said it had been in communication with Facebook to arrange its appearance at a public hearing, and that the social media giant was initially willing to participate and had been tentatively confirmed as a witness before deciding to cancel.
“Facebook has since stated that key personnel are not willing to make themselves available on this date,” the committee said.
“Facebook has expressed a preference for any appearance to be after the US election.”
A Facebook spokesperson told ZDNet the company intends to cooperate with the committee, but that a “scheduling issue” means it cannot testify this coming Friday.
“We are committed to cooperating with the Senate Committee on this inquiry and answering the questions they may have. Due to a scheduling issue we’ve requested to appear at a later day,” the spokesperson said.
The committee was stood up in December to inquire into, and report on, the risk posed to Australia’s democracy by foreign interference through social media.
Committee chair Senator Jenny McAllister on Wednesday thanked TikTok for its “constructive” approach to the inquiry and its willingness to appear before the committee. Meanwhile, she said it was “disappointing Facebook has not adopted the same approach”.
“Facebook’s platform has been used by malicious actors to run sophisticated disinformation campaigns in elections around the globe,” McAllister said.
With 84% of the nation’s population on Facebook — around 17 million Australians use the site every month — McAllister believes the public deserves to know how the company manages the risks presented by the platform to Australia’s democracy and public discourse.
“Facebook claims they can be trusted to support Australia’s democratic processes but seem unwilling to participate in our processes of democratic accountability,” she said.
“As chair of the inquiry, I will be talking to my colleagues about options we have to ensure that Facebook answers the legitimate questions Australians have for the platform.”
Earlier this month before a House of Representatives Committee, Facebook said that during the quarter when the 2019 Australian federal election was held, it removed around 1.5 billion fake accounts from its platform.
“These fake accounts are the things that people try to use to share harmful content,” Facebook vice president of public policy Simon Milner said at the time.
“Almost 100% of that was removed because of our actions, using artificial intelligence to find these accounts and get rid of them. We spend a lot of effort trying to protect our platform from fake accounts.”
Facebook told the committee that during the 2019 election period, approximately 10 million unique people were involved in 45 million election-related interactions on the platform.
Only 17 individual pieces of content, however, were directly fact-checked during this period.
“Once a post has been found, we use artificial intelligence to apply the same treatment to similar posts that make the same claim … the ultimate number of posts that would have received fact treatment would be a number much higher, in the thousands,” Facebook’s Australia and New Zealand public policy manager Joshua Machin added.
The pair admitted, however, that Facebook does not fact-check political advertising “because we believe it’s important for the debate to play out”.
“I would say Facebook does the same as any media platform. If you see a billboard … an ad for a campaign … because that person is trying to target that constituency, an opponent might think that that ad contains false information and they have an opportunity to respond to that, beat that with an ad further down the road,” Milner said.
“There’s no expectation that the company that enabled you to put that ad on that billboard had to put something on it saying, ‘Hey, this information has been marked as false’, so we apply exactly the same approach on our service when it comes to political advertising.
“We don’t think it’s right that we should be the arbiters of truth.”