CCAN’s Public Statement Marking One Year of War in Gaza and Further Regional Escalation

4 October 2024

The Christchurch Call Advisory Network (CCAN) consists of not-for-profit organizations and individuals from civil society, academia, and the technical community convened as part of the Christchurch Call. CCAN exists to provide expert advice on implementing the commitments in the Call in a manner consistent with international human rights law and a free, open, and secure internet. 

One year on from the terrorist attack of October 7, the devastating war in Gaza continues to terrorize civilians. Now, with the further escalation in Lebanon and Israel, an even greater regional war looms. In this urgent context, CCAN again calls attention to the many actions and behaviors by parties to the conflict, governments, tech companies, and others that are inconsistent with the commitments of the Christchurch Call. While not all parties to the conflict are Call signatories, signatories to the Call should not only adhere to the Call commitments but also take a stand against actions inconsistent with the Call, including:

  • Internet shutdowns and repeated attacks on civilian telecommunications infrastructure, including communications devices;  
  • The leveraging of user data, user-generated content, and communication channels for targeting and other military purposes in a manner inconsistent with international humanitarian law;
  • Restricting access to information and restricting (or requiring the restriction of) freedom of expression through content moderation that is not in line with international human rights standards;
  • The dissemination of state and non-state sponsored propaganda and disinformation, hate speech, incitement to genocide, violent extremism, and other content that violates international human rights standards and humanitarian law. 

The harmful impact of these actions resonates globally, exposing already vulnerable communities to further violence and harm, online and offline.

Against the backdrop of the Call’s commitment to uphold human rights and protect the most vulnerable against violent extremism and terrorism, CCAN reaffirms its responsibility to safeguard these commitments and to hold signatories to the Call accountable, particularly in times of armed conflict and crisis.

In light of the above, CCAN urges governments and tech companies to uphold human rights, international humanitarian law, and the Call commitments, and reiterates its previous call for actors in the conflict, platforms, and governments to:

  • Ensure that all people across the region have unhindered and reliable access to the internet and digital communications platforms without human-initiated outages, consistent with the Call’s commitment to “free, open and secure Internet and international human rights law,” and for actors not to interfere with this access unless it is in full compliance with the requirements of the applicable human rights instruments;        
  • Neither force the removal of content that is within the bounds of free expression, nor destroy content that contains evidence of possible criminal war activity;       
  • Exercise great care in their use of social media, and not allow their social media accounts to disseminate false information; 
  • Involve civil society in crisis response as early as possible; and
  • Apply human rights standards and support frameworks to reduce amplification of terrorist content on the internet.

Sincerely,

Christchurch Call Advisory Network (CCAN)
christchurchcall.network


The Fifth Anniversary of the Terrorist Attack on Christchurch Masjidain on 15 March 2019

This is a solemn time for all of us in the Christchurch Call community as we remember the whānau (families) of the 51 shuhada, as well as the survivors and witnesses of the terrorist attack and their whānau. Because the attack was the largest terror incident in the modern history of Aotearoa New Zealand, the work of the Call has particular relevance for the small but resilient Muslim community. Part of our faith-community worldview calls on us to consistently stride from darkness into light. The Call has been one of the ways in which grief and loss have been used to craft a way forward, to ensure that others may never have to traverse this same tragic path.

The Christchurch Call was established on 15 May 2019, when the New Zealand Prime Minister at the time, Jacinda Ardern, and French President Emmanuel Macron brought together Heads of State and leaders from the technology sector to adopt a commitment to eliminate terrorist and violent extremist content online. It recognises that the internet is being actively used to disseminate content which impacts on the human rights of others, and that a collective approach is needed to address such threats. Over the last five years the Call has emerged as a thriving community which extends to 55 government supporters, 18 major technology companies, 11 partner organisations, and an active and diverse civil society spanning almost all global time zones.

With a fundamental baseline commitment to respect freedom of expression within a free, open and secure internet, the Call provides an inclusive global platform for individuals, civil society organisations, technology companies and governments. This unique platform has resulted in some significant and successful outcomes. These include the first agreement of its kind, in which governments and tech companies jointly agreed to a series of commitments and to ongoing collaboration to make the internet safer from terrorist and violent extremist content online. The live-streamed video of the gruesome killing of innocent worshippers on 15 March in Christchurch took more than 30 minutes to take down; by contrast, the incident protocols developed by the Call, which coordinate efforts across governments and tech platforms, resulted in the live stream of the Buffalo attack in May 2022 being brought down within two minutes. For the Muslim community, the Call has been an integral part of the healing process, since making the online ecosystem safer is a positive legacy of those who are shaheed. Their memory is cherished and serves as a source of inspiration for others to stand firm in the face of hatred and uphold the values of compassion, peace, and mutual respect – not only in the streets of our towns and cities but also in the parallel online spaces in which we dwell.

We acknowledge that the Call itself is also five years old. In this respect we offer shukur (gratitude) to the diverse and inclusive Call community for collectively striving to bring light to the dark spaces of terrorist and violent extremist content online.

Abdur Razzaq
Co-Chair, Christchurch Call Advisory Network
Chairperson of FIANZ Royal Commission of Inquiry & National Coordinator of 15 March Anniversary


Christchurch Call Advisory Network Submission to Ofcom: Consultation On Protecting people from illegal harms online

23 February 2024

Thank you for the opportunity to provide a response to Ofcom’s consultation on Protecting people from illegal harms online. The Christchurch Call Advisory Network (CCAN) is the civil society arm of the Christchurch Call community. CCAN’s mission is to provide expert advice on implementing the commitments in the Call in a manner consistent with a free, open, and secure Internet and with international human rights law.   

The Call objectives overlap significantly with various facets of the Online Safety Act, and CCAN members have a wide variety of relevant expertise. As Ofcom notes in Volume 1.11, “Of particular relevance to Ofcom’s functions under the Act are the right to freedom of expression (Article 10 ECHR) and the right to privacy (Article 8 ECHR).” CCAN has put together a concise response that highlights our collective knowledge in these areas, and in human rights and technology more broadly, specifically focusing on Ofcom’s approach to moderation of terrorist and violent extremist content online.  

Ofcom’s assessment of the causes and impacts of online harms: Preserving a free, open, and secure Internet

CCAN maintains that there is a danger in simplistically framing encryption as a risk factor merely because encryption can be used by terrorists, violent extremists, and other actors intending to cause harm to others. CCAN agrees with the report that encryption is not inherently a risk, and advocates that end-to-end encryption enables safe communication for people all around the world, from ordinary users to vulnerable populations, which is essential to both freedom of expression and privacy, two fundamental rights that Ofcom has noted are particularly relevant to the Online Safety Act. 

Encryption is a technical feature that is vital for Internet security for two reasons: (1) it ensures the confidentiality and integrity of data (for example, in financial transactions), and (2) it reduces vulnerabilities for ordinary Internet users. Requiring service providers to enable access to encrypted content is not technically feasible given how end-to-end encryption is implemented. Framing end-to-end encryption as a “service” therefore increases risk and can hamper the Christchurch Call commitment to upholding a free, open and secure Internet. Encryption is not a service that platforms and Internet actors offer (like video streaming); it is a technical feature, grounded in mathematics, that is essential for preserving the integrity and confidentiality of data on the Internet, and it is not and should not be treated as an additional “service”. CCAN members the Internet Society and the Electronic Frontier Foundation (EFF) have written on the UK Online Safety Act and its effect on encryption, which can be accessed here, and here.

Framing end-to-end encryption as a risk factor implies that there are actions that providers can take to mitigate this risk. At the same time, the Online Safety Act clearly states that Ofcom does not have the power to require the use of proactive content moderation technologies within encrypted environments, except for Child Sexual Exploitation and Abuse (CSEA) material.

Ofcom’s framing of encryption as a risk factor places indirect pressure on providers that effectively circumvents the exceptions for E2EE laid out in the Online Safety Act, implicitly pushing service providers not to roll out encryption on their services. CCAN members are strongly against this type of indirect pressure.

User access and account removal: Upholding freedom of expression

Ofcom suggests that “Accounts should be removed if there are reasonable grounds to infer they are run by or on behalf of a terrorist group or organisation proscribed by the UK Government.” Such a categorical approach is based on a much lower standard of evidence (“reasonable grounds to infer”) than would be required under UK criminal law. It also does not take into account the serious harms to freedom of expression, including to access to information and to important evidence of crimes, including human rights violations, that such removals can cause. 

Designated terrorist organizations are sometimes state-sponsored, part of elected governments, or have the resources to form quasi-governments. Not all of their activities and accounts relate to terrorist activity; some provide public services and announcements. In certain circumstances, some designated terrorist organizations hold governmental power that obliges the local population to join compulsory military service, for example. Association with such accounts might not even be voluntary. 

Removing suspected terrorist accounts without due diligence and without considering the impact on freedom of expression and on third parties’ access to information hampers the UK’s ability to uphold the following Call commitment: “Respect, and for Governments to protect, human rights, including by avoiding directly or indirectly contributing to adverse human rights impacts through business activities and addressing such impacts where they occur.” 

Furthermore, despite Ofcom’s acknowledgement in A2.4 that “it is not an offense to portray terrorism (for example in a video clip from a film or TV show) or to report on terrorism (for example as news or current affairs),” categorical and low-evidence approaches to suspending accounts and removing content often lead to the removal of reporting, and even condemnation, especially when combined with automated content moderation. 

Instead of the proposed approach, decisions to remove accounts, and with them access to all the information those accounts provide, should be based on the type of content the accounts are disseminating, rather than on the mere fact that the accounts appear on a list. Here CCAN very much agrees with Ofcom that: “Services should consider the purpose and meaning of content when making illegal content judgements, having regard to the whole context in which it appears. Ofcom would take into account a user’s right to freedom of expression in enforcing the safety duty.” (A.2.4, p.19)

However, A.2.18 allows for a broader interpretation: “Content which does none of the above, but which relates somehow to a proscribed organisation, may still be illegal content.” Treating accounts held by proscribed terrorist organizations as higher risk sounds legitimate. In practice, however, because companies are usually very risk averse, they make limited attempts to contextualize potentially illegal content and prefer not to allow any proscribed organization to have an account at all (a practice called ‘collateral censorship’). 

In other words, Ofcom departs from the positive obligations under UK law to determine criminal conduct, and gives a blank check to service providers to apply standards that fall short of the basic legal safeguards for human rights. This delegation of powers to the private sector under lowered standards could ultimately lead the courts to declare takedown decisions illegal under UK law. It is worth noting that the lack of transparency around removal notices made by UK police under terms of service has already been critiqued from a human rights perspective, including by the Oversight Board for Meta in a case where it considered a request to remove a “drill rap” video under the company’s terms of service rather than through a legal order. 

Destruction of evidence

The removal and takedown of certain types of content can result in harm, including the destruction of evidence; such evidence can be critical for law enforcement and/or international bodies like the International Criminal Court, or for investigations being carried out by the United Nations. This is further highlighted by CCAN members in this report for the Global Internet Forum to Counter Terrorism (GIFCT) and in this whitepaper. Such removals would hinder the UK’s ability to uphold the Call commitment to: “Ensure appropriate cooperation with and among law enforcement agencies for the purposes of investigating and prosecuting illegal online activity in regard to detected and/or removed terrorist and violent extremist content, in a manner consistent with rule of law and human rights protections.” CCAN suggests that Ofcom work to create an evidence preservation mechanism for cases where such content does need to be removed for legal reasons. 

CCAN maintains that Ofcom should take a more holistic approach (if allowed by law) and consider both context and content, undertaking a human rights impact assessment to ensure that informational content does not become inaccessible. 

Information gathering and supervision: Ensuring Transparency 

One “best practice” that makes the Christchurch Call innovative is the commitment by governments and online service providers to “Recognise the important role of civil society in supporting work on the issues and commitments in the Call.” CCAN urges Ofcom to work with civil society to implement the Online Safety Act. CCAN understands that Ofcom has research, media literacy, and engagement functions and believes these can all be used to work with CCAN, as well as with individual civil society organizations, academics, and in particular groups representing impacted communities. This kind of relationship serves a dual function: (1) CCAN, as a civil society network, provides technical advice, and (2) the relationship is iterative; for example, it enables Ofcom to address potential pitfalls relating to the preservation of human rights and a free, open, and secure Internet. 

“The statutory information gathering powers conferred on Ofcom by the Act give us the legal tools to obtain information in support of our online safety functions. These powers will help us to address the information asymmetry that exists between Ofcom and regulated services.” (28.2). CCAN reminds Ofcom that this information asymmetry is even more drastic for civil society, and particularly for users and affected communities, and urges Ofcom to use its supervisory function and information notices in a transparent manner. These provisions of the Online Safety Act could enable Ofcom to share key information relevant to the impact of online service providers on human rights, for example by providing transparency into companies’ use of the GIFCT database and the number of takedowns carried out under terms of service rather than in response to legal orders. 

However, used improperly or without sufficient public reporting, information notices could be used to circumvent legal protections in place for user privacy, or simply in a way that does not take into account the limitations of small, medium, and large companies. They could also further entrench the disadvantage faced by civil society and users, who already sit on the wrong side of the information asymmetry. Ofcom’s supervisory functions more broadly could have the effect of creating opaque, bilateral relationships between government and companies, cutting out essential civil society input and hampering the UK’s ability to carry out the Call commitment of “Recognis[ing] the important role of civil society in supporting work on the issues and commitments in the Call, including through: Offering expert advice on implementing the commitments in this Call in a manner consistent with a free, open and secure Internet and with international human rights law; [and] Working, including with governments and online service providers, to increase transparency.” Finally, if used too extensively, some of the enforcement provisions could encourage companies to take sweeping measures to comply with the Online Safety Act that do not sufficiently consider the impact on users’ rights.  

We believe that Ofcom could also play a role in preserving researchers’ access to platform data. Such access would support consultation with civil society organizations, and it can surface issues that Ofcom may not previously have noticed and can then act on in its regulatory capacity.

In sum, CCAN urges Ofcom to ensure that its work adheres to the principles of a free, open and secure Internet and is human rights compliant. CCAN strongly urges Ofcom to use its information gathering and supervisory powers in a transparent way that upholds the UK’s Christchurch Call commitment to a free, open, and secure Internet.

Thank you for the opportunity to provide CCAN’s perspective at this stage of the Online Safety Act. CCAN encourages Ofcom to engage continuously with civil society organizations throughout its consultative phase and as it transitions to policy. 

Sincerely,

Christchurch Call Advisory Network 

Recommendations

  1. We suggest that Ofcom work to create an evidence preservation mechanism for cases where content does need to be removed for legal reasons. Such a mechanism should have strong privacy and legal protections for access, alongside methods for international mechanisms to access preserved evidence. CCAN members would be willing to consult with the UK Government on such an initiative.
  2. We urge Ofcom to work with civil society to implement the Online Safety Act. CCAN understands that Ofcom has research, media literacy, and engagement functions and believes these can all be used to work with CCAN, as well as with individual civil society organizations, academics, and in particular groups representing impacted communities. 
  3. CCAN suggests a more formalized role between companies and civil society that can contribute to meaningful engagement. Ofcom overwhelmingly relies on tech companies and does not direct tech companies to work with civil society. One of the Christchurch Call commitments is to: “work with civil society to promote community-led efforts to counter violent extremism in all its forms, including through the development and promotion of positive alternatives and counter-messaging.” It is not clear whether Ofcom has plans to direct tech companies to work with civil society.
  4. CCAN advises Ofcom not to consider encryption a risk factor. 
  5. CCAN suggests that Ofcom take a more holistic approach (if allowed by law), consider context as well as content, and undertake human rights impact assessments to ensure that nonviolent, informational content does not become inaccessible. Such a mechanism must still protect privacy. 
  6. We recommend a stronger, proactive approach to addressing emerging threats, as opposed to putting in place reactive measures. This must be done together with the tech sector and with intersecting subject-matter experts and organizations. CCAN members would be willing to consult with the UK Government on such initiatives.

Christchurch Call Advisory Network Statement on Israel-Hamas War

9 November 2023

The Christchurch Call Advisory Network (CCAN) consists of not-for-profit organizations and individuals from civil society, academia, and the technical community convened as part of the Christchurch Call. CCAN exists to provide expert advice on implementing the commitments of the Call in a manner consistent with international human rights law and a free, open, and secure internet.

The current conflict between Israel and Hamas has seen behaviour by the actors in the conflict, governments, social media platforms, and others that is inconsistent with the Christchurch Call commitments. While non-signatories to the Call cannot be held to those commitments, governments and other entities that are signatories should adhere to them.

The Christchurch Call Advisory Network urges actors in the conflict, platforms, and governments to:

  • Ensure that all people in the region have continuous access to the Internet and digital communications platforms without human-initiated outages, consistent with the Call’s commitment to “free, open and secure Internet and international human rights law”, and for actors not to interfere with this access unless it is in full compliance with the requirements of the applicable human rights instruments;        
  •  Neither force the removal of content that is within the bounds of free expression, nor destroy content that contains evidence of possible criminal war activity;       
  • Exercise great care in their use of social media, and not allow their social media accounts to disseminate false information; 
  • Include civil society in crisis response as early as possible;
  • Apply human rights standards and support frameworks to reduce amplification of terrorist content on the Internet.

Signed,
Christchurch Call Advisory Network 


Christchurch Call Evaluation Pilot Project

12 September 2022

To mark the third year since the Christchurch Call to Action, the Christchurch Call Advisory Network (CCAN) embarked on a first-ever independent evaluation of the work done by supporting governments and companies (“supporters”) to further the Call. Through this pilot evaluation, the CCAN engaged with key signatories to understand how their commitments under the Call had shaped supporters’ approaches to curbing the spread of terrorist and violent extremist content in a manner consistent with human rights and a free, open and secure internet. In addition, our pilot evaluation examined ways in which supporting governments and companies engaged in multi-stakeholder discussion and policy development within the broader Call Community. This short brief summarises our initial findings, which we will elaborate on further in a forthcoming report. It also includes a description of our methodology and several insights for future work that seeks to take stock of the impact of the Call.

Preliminary findings are available in the document below.

Christchurch Call Commitments Evaluation Project 2022


Executive Summary – Christchurch Call Summit 2022

12 September 2022

The Christchurch Call Advisory Network (CCAN) has led work over the past year to understand CCAN Members’ responses to dehumanizing speech and discourse. The executive summary, presented at the Christchurch Call Summit 2022, is linked below, and the full report is available here.

Executive Summary of Anti-Dehumanization Policy


Position Statement – Christchurch Call Summit 2022

12 September 2022

The Christchurch Call Advisory Network attended a Summit in September 2022. We have written a collective position statement and submitted the outcomes of two research projects the CCAN has been undertaking.

Christchurch Call Advisory Network (CCAN) position statement 2022