Australian government agencies are making direct, "informal" approaches to internet providers and telcos to block access to websites, rather than continue to use a controversial power in the Telecommunications Act.
The use of "informal" requests means agencies can also avoid reporting the block requests to a central register that was set up to bring transparency to the space.
The register, run by the Australian Communications and Media Authority (ACMA), relies on agencies that request website blocks under section 313(3) powers in the Telecommunications Act 1997 voluntarily reporting their request numbers.
It was designed to avoid a repeat of a notorious overblocking incident that impacted 250,000 legitimate websites in 2013.
But almost a decade on, while an overblocking incident of the same scale hasn't materialised, neither has an accurate central register of all website takedown requests.
There are two issues at play.
First, while the ACMA encourages federal, state and territory agencies to report S313(3) requests they make, a spokesperson told iTnews that this is "good practice" and not a legal requirement.
"As such failure to report to the ACMA on the use of section 313(3) of the Telecommunications Act is not illegal," the ACMA spokesperson said.
Second, some agencies are avoiding the issue entirely by not using section 313(3) at all.
A Department of Infrastructure, Transport, Regional Development, Communications and the Arts spokesperson acknowledged the practice of informal website takedowns and blocks.
The department drew up the guidelines that recommend agencies report their use of section 313(3) blocks.
The spokesperson said they were "not in a position to comment on the specific scenarios referred to as it does not appear a section 313(3) request has been made in the circumstances."
Blocking strategies
S313(3) was first interpreted as enabling agencies to take down websites in 2012 when the Australian Federal Police set a precedent by using the laws to block a number of child abuse sites.
In the current financial year, TEQSA - the Tertiary Education Quality and Standards Agency - reported taking down more than 100 academic contract cheating sites. It is extensively documented as a user of S313(3).
The ACMA itself also reported blocking illegal offshore gambling and affiliate marketing sites.
However during the same period, the federal health department made an unreported request to take down the suspected telehealth scam Dr ASAP.
And, using Netcraft’s automated takedown technology, the Australian Securities and Investments Commission (ASIC) and the Australian Competition and Consumer Commission (ACCC) made 5579 unreported takedowns of cryptocurrency scams, fake online retailers and other illegal sites.
Other agencies have likely made similarly unreported, informal takedown requests; iTnews only obtained these figures by submitting freedom of information requests to the agencies involved.
RMIT electrical and computer engineering associate professor Mark Gregory told iTnews that any increased use of informal takedowns was "a matter of considerable concern".
He said that maintaining a centralised reporting and disclosure scheme of the use of the power was “vital”.
“There have been a very large number of takedown notices issued and the number of informal requests is expected to be similar or larger, now that government agencies and other organisations have adopted this practice,” Gregory told iTnews.
“Secrecy, when used by government, its agencies and other authorised organisations, is a matter of public concern and can be used to undermine democracy and free speech."
Other observers and organisations declined to comment when approached by iTnews.
No central content takedown register
While a register for formal website takedowns under S313(3) exists, there is no equivalent for reporting content removal requests, another takedown-like mechanism being used at the federal level.
In the previous year, content removal requests were sent by the Australian Electoral Commission (AEC) to remove unauthorised electoral material from platforms like Facebook and TikTok, and by the Therapeutic Goods Administration to remove advertising from undisclosed online platforms.
A central registry of such individual content removals is yet to be established.
The most comprehensive statistics on the type of content being pulled from platforms by the government is provided by the eSafety Commissioner’s annual reports.
eSafety blocks content in response to complaints of material such as cyberbullying, adult cyber abuse and other illegal or age-restricted content.
There is also a registry of content takedowns related to disinformation and misinformation that is voluntarily produced by digital platforms, rather than by government.
Signatories to the digital platforms association’s voluntary industry code for dealing with disinformation and misinformation, such as Google and Redbubble, also produce publicly accessible records of measures taken against the content.
However, transparency reports on content removals related to disinformation and misinformation could be managed by the ACMA in future, as the watchdog has said it is considering registering its own mandatory code, which would supersede the platforms' voluntary one.