About CCAN

The Christchurch Call Advisory Network (CCAN) consists of leading organisations and individuals from civil society, academia, and the technical community convened as part of the Christchurch Call, which emerged as a response to the Christchurch massacre on March 15, 2019, when 51 Muslims were killed during prayer. The CCAN exists to provide expert advice on implementing the commitments in the Call in a manner consistent with a free, open, and secure internet and with international human rights law. CCAN balances a spectrum of civil society, internet governance, and human rights concerns, including the right to be free from deprivation of life and incitement to violence, as well as the right to free expression. CCAN accomplishes its work through meaningful collaboration with governments, civil society organisations, and private sector actors.

CCAN membership has been drawn from interested civil society groups, who represent a range of perspectives, including human rights, freedom of expression, digital rights, counter-radicalization, victim support and public policy. Many of the Advisory Network members have been engaged on the Christchurch Call since its launch and are committed to continuing to share their expertise.


Latest updates

  • The Fifth Anniversary of the Terrorist Attack on Christchurch Masjidain on 15 March 2019

    This is a solemn time for all of us in the Christchurch Call community as we remember the whānau (families) of the 51 shuhada, as well as the survivors and witnesses of the terrorist attack and their whānau. The attack was the largest terror incident in the modern history of Aotearoa New Zealand, and the work of the Call has particular relevance for the country's small but resilient Muslim community. Part of our faith-community worldview calls on us to consistently stride from darkness into light. The Call has been one of the ways in which grief and loss have been used to craft a way forward, so that others may never have to traverse this same tragic path. The Christchurch Call was established on 15 May 2019, when the then New Zealand Prime Minister, Jacinda Ardern, and French President Emmanuel Macron brought together Heads of State and leaders from the technology sector to adopt a commitment to eliminate terrorist and violent extremist content online. It recognises that the internet is being actively used to disseminate content which impacts on the human rights of others, and that there is a need for a collective approach to address such threats. Over the last five years the Call has grown into a thriving community of 55 government supporters, 18 major technology companies, 11 partner organisations, and an active and diverse civil society spanning almost all global time zones.

    With a fundamental baseline commitment to respect freedom of expression within a free, open and secure internet, the Call provides an inclusive global platform for individuals, civil society organisations, technology companies and governments. This unique platform has produced some significant and successful outcomes. The Call was the first agreement of its kind, in which governments and tech companies jointly agreed to a series of commitments and to ongoing collaboration to make the internet safer from terrorist and violent extremist content. The live-streamed video of the gruesome killing of innocent worshippers on 15 March in Christchurch took more than 30 minutes to take down; by contrast, the incident protocols developed through the Call, which coordinate efforts across governments and tech platforms, meant that the live stream of the Buffalo attack in May 2022 was brought down within two minutes. For the Muslim community, the Call has been an integral part of the healing process, since making the online ecosystem safer is a positive legacy of those who are shaheed. Their memory is cherished and serves as a source of inspiration for others to stand firm in the face of hatred and uphold the values of compassion, peace, and mutual respect, not only in the streets of our towns and cities but also in the parallel online spaces in which we dwell.

    We acknowledge that the Call itself is also five years old. In this respect we offer shukur (gratitude) to this diverse and inclusive community for collectively striving to bring light to the dark spaces of terrorist and violent extremist content online.

    Abdur Razzaq
    Co-Chair, Christchurch Call Advisory Network
    Chairperson of FIANZ Royal Commission of Inquiry & National Coordinator of 15 March Anniversary.

  • Christchurch Call Advisory Network Submission to Ofcom: Consultation On Protecting people from illegal harms online

    23 February 2024

    Thank you for the opportunity to provide a response to Ofcom’s consultation on Protecting people from illegal harms online. The Christchurch Call Advisory Network (CCAN) is the civil society arm of the Christchurch Call community. CCAN’s mission is to provide expert advice on implementing the commitments in the Call in a manner consistent with a free, open, and secure Internet and with international human rights law.   

    The Call objectives overlap significantly with various facets of the Online Safety Act, and CCAN members have a wide variety of relevant expertise. As Ofcom notes in Volume 1.11, “Of particular relevance to Ofcom’s functions under the Act are the right to freedom of expression (Article 10 ECHR) and the right to privacy (Article 8 ECHR).” CCAN has put together a concise response that highlights our collective knowledge in these areas, and in human rights and technology more broadly, specifically focusing on Ofcom’s approach to moderation of terrorist and violent extremist content online.  

    Ofcom’s assessment of the causes and impacts of online harms: Preserving a free, open, and secure Internet

    CCAN maintains that there is a danger in simplistically framing encryption as a risk factor because of its potential use by terrorists, violent extremists and other actors intending to cause harm. CCAN agrees with the report that encryption is not inherently a risk, and advocates that end-to-end encryption enables safe communication for people all around the world, from ordinary users to vulnerable populations. This is essential to both freedom of expression and privacy, the two fundamental rights Ofcom has noted are particularly relevant to the Online Safety Act.

    Encryption is a technical feature that is vital for Internet security for two reasons: (1) it ensures the confidentiality and integrity of data (for example, in financial transactions), and (2) it reduces vulnerabilities for ordinary Internet users. Requiring service providers to enable access to end-to-end encrypted content is not technically feasible given how end-to-end encryption is implemented. Framing end-to-end encryption as a “service” therefore increases risk and can potentially hamper the Christchurch Call commitment to upholding the principle of a free, open and secure Internet. Encryption is not a service that platforms and Internet actors offer (like video streaming); it is a mathematical technique essential for preserving the integrity and confidentiality of data on the Internet, and it is not and should not be treated as an additional “service”. CCAN members the Internet Society and the Electronic Frontier Foundation (EFF) have written on the UK Online Safety Act and its effect on encryption, which can be accessed here and here.
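
    To illustrate the underlying point, the minimal sketch below (the choice of the open-source PyNaCl library and the variable names are purely illustrative, not drawn from the Ofcom consultation or the Call texts) shows why end-to-end encryption is a mathematical property rather than a removable "service": decryption requires a private key that exists only on users' devices, so an intermediary relaying the ciphertext has no technical means to read or scan it.

        from nacl.public import PrivateKey, Box
        from nacl.exceptions import CryptoError

        # Each endpoint generates its own key pair; private keys never leave the device.
        sender_key = PrivateKey.generate()
        recipient_key = PrivateKey.generate()

        # The sender encrypts with their private key and the recipient's public key.
        ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"meet at noon")

        # A platform relaying the message only ever handles `ciphertext`:
        # authenticated, opaque bytes it cannot read, scan, or silently alter.
        plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
        assert plaintext == b"meet at noon"

        # Any party without one of the two endpoint private keys, including the
        # service provider, cannot decrypt; the attempt fails with CryptoError.
        outsider_key = PrivateKey.generate()
        try:
            Box(outsider_key, sender_key.public_key).decrypt(ciphertext)
        except CryptoError:
            print("ciphertext is unreadable without an endpoint private key")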

    Framing end-to-end encryption as a risk factor implies that there are actions that providers can take to mitigate this risk. At the same time, the Online Safety Act clearly states that Ofcom does not have the power to require the use of proactive content moderation technologies within encrypted environments, except for Child Sexual Exploitation and Abuse (CSEA) material.

    Ofcom’s framing of encryption as a risk factor places indirect pressure on providers that would effectively circumvent the exceptions for E2EE laid out in the Online Safety Act, implicitly pushing service providers not to roll out encryption on their services. CCAN members are strongly against this type of indirect pressure.

    User access and account removal: Upholding freedom of expression

    Ofcom suggests that “Accounts should be removed if there are reasonable grounds to infer they are run by or on behalf of a terrorist group or organisation proscribed by the UK Government.” Such a categorical approach is based on a much lower standard of evidence (“reasonable grounds to infer”) than would be required under UK criminal law. It also does not take into account the serious harms that such removals can cause to freedom of expression, including loss of access to information and to important evidence of crimes such as human rights violations.

    Designated terrorist organizations are sometimes state-sponsored, part of elected governments, or have the resources to form quasi-governments. Not all of their activities and accounts involve terrorist activity; some provide public services and announcements. In certain circumstances, some designated terrorist organizations hold governmental power and can, for example, oblige the local population to perform compulsory military service, so association with such accounts might not even be voluntary.

    Removing suspected terrorist accounts without due diligence, and without considering the impact on freedom of expression and on third parties’ access to information, hampers the UK’s ability to uphold the following Call commitment: “Respect, and for Governments to protect, human rights, including by avoiding directly or indirectly contributing to adverse human rights impacts through business activities and addressing such impacts where they occur.”

    Furthermore, despite Ofcom’s acknowledgement in A2.4 that “it is not an offense to portray terrorism (for example in a video clip from a film or TV show) or to report on terrorism (for example as news or current affairs),” categorical and low-evidence approaches to suspending accounts and removing content often lead to the removal of reporting, and even of condemnation, especially when combined with automated content moderation.

    Instead of the proposed approach, decisions to remove accounts, and with them access to all the information those accounts provide, should be based on the type of content the accounts are disseminating, rather than on the fact that an organisation appears on a list. Here CCAN very much agrees with Ofcom that: “Services should consider the purpose and meaning of content when making illegal content judgements, having regard to the whole context in which it appears. Ofcom would take into account a user’s right to freedom of expression in enforcing the safety duty.” (A.2.4, p.19)

    However, A.2.18 allows for a broader interpretation: “Content which does none of the above, but which relates somehow to a proscribed organisation, may still be illegal content.” Treating accounts held by proscribed terrorist organizations as higher risk sounds legitimate. In practice, however, because companies are usually very risk averse, they make limited attempts to contextualize content and prefer not to allow any proscribed organization to have an account at all (a practice known as ‘collateral censorship’).

    In other words, Ofcom departs from the positive obligations under UK law to determine criminal conduct, and gives service providers a blank check to apply standards that fall below the basic legal safeguards that preserve human rights. This delegation of powers to the private sector under lowered standards could ultimately lead the courts to declare takedown decisions illegal under UK law. It is worth noting that the lack of transparency around removal notices made by UK police under terms of service has already been critiqued from a human rights perspective, including by the Oversight Board for Meta in a case where it considered a request to remove a “drill rap” video under the company’s terms of service rather than through a legal order.

    Destruction of evidence

    Removal and takedown of certain types of content can cause harm, including the destruction of evidence that may be critical for law enforcement and/or international bodies such as the International Criminal Court, or for investigations being carried out by the United Nations. This is further highlighted by CCAN members in this report for the Global Internet Forum to Counter Terrorism (GIFCT) and this whitepaper. Such removals would hinder the UK’s ability to uphold the Call commitment to: “Ensure appropriate cooperation with and among law enforcement agencies for the purposes of investigating and prosecuting illegal online activity in regard to detected and/or removed terrorist and violent extremist content, in a manner consistent with rule of law and human rights protections.” CCAN suggests that Ofcom work to create an evidence-preservation mechanism for cases where such content does need to be removed for legal reasons.

    CCAN maintains that Ofcom should use a more holistic approach (if allowed by law) and consider both context and content, undertaking a human rights impact assessment to ensure that informational content does not become inaccessible.

    Information gathering and supervision: Ensuring Transparency 

    One “best practice” that makes the Christchurch Call innovative is the government and online service provider commitment to “Recognise the important role of civil society in supporting work on the issues and commitments in the Call.” CCAN urges Ofcom to work with civil society to implement the Online Safety Act. CCAN understands that Ofcom has research, media literacy, and engagement functions and believes these can all be used to work with CCAN, as well as with individual civil society organizations, academics, and in particular groups representing impacted communities. This kind of relationship serves a dual function: (1) CCAN, as a civil society network, provides technical advice, and (2) it is iterative; for example, it enables Ofcom to address potential pitfalls relating to the preservation of human rights and a free, open and secure Internet.

    Ofcom states: “The statutory information gathering powers conferred on Ofcom by the Act give us the legal tools to obtain information in support of our online safety functions. These powers will help us to address the information asymmetry that exists between Ofcom and regulated services.” (28.2). CCAN reminds Ofcom that this information asymmetry is even more drastic for civil society, and particularly for users and affected communities, and urges Ofcom to use its supervisory function and information notices in a transparent manner. These provisions of the Online Safety Act could enable Ofcom to share key information relevant to the impact of online service providers on human rights, for example by providing transparency into companies’ uses of the GIFCT database and the number of takedowns carried out under terms of service rather than in response to legal orders.

    However, used improperly or without sufficient public reporting, information notices could be used to circumvent legal protections in place for user privacy, or applied in a way that does not take into account the limitations of small, medium, and large companies. They could also further disadvantage civil society and users, who already face a steep information asymmetry. Ofcom’s supervisory functions more broadly could have the effect of creating opaque, bilateral relationships between government and companies, cutting out essential civil society input and hampering the UK’s ability to carry out the Call commitment of “Recognis[ing] the important role of civil society in supporting work on the issues and commitments in the Call, including through: Offering expert advice on implementing the commitments in this Call in a manner consistent with a free, open and secure Internet and with international human rights law; [and] Working, including with governments and online service providers, to increase transparency.” Finally, if used too extensively, some of the enforcement provisions could encourage companies to take sweeping measures to comply with the Online Safety Act that do not sufficiently consider the impact on users’ rights.

    We believe that Ofcom could also play a role in preserving researchers’ access to platform data. Such access would also support consultation with civil society organizations, and can surface issues that Ofcom may not previously have noticed and can then act on in its regulatory capacity.

    In sum, CCAN urges Ofcom to ensure that its approach adheres to a free, open and secure Internet and is compliant with human rights. CCAN strongly implores Ofcom to use its information gathering and supervisory powers in a transparent way that upholds the UK’s Christchurch Call commitment to a free, open, and secure Internet.

    Thank you for the opportunity to provide CCAN’s perspective at this stage of the Online Safety Act’s implementation. CCAN encourages Ofcom to engage continuously with civil society organizations throughout its consultative phase and as it transitions to policy.

    Sincerely,

    Christchurch Call Advisory Network 

    Recommendations

    1. We suggest that Ofcom work to create an evidence-preservation mechanism for cases where content does need to be removed for legal reasons. Such a mechanism should have strong privacy and legal protections for access, alongside methods for international mechanisms to access preserved evidence. CCAN members would be willing to consult with the UK Government on such an initiative.
    2. We urge Ofcom to work with civil society to implement the Online Safety Act. CCAN understands that Ofcom has research, media literacy, and engagement functions and believes these can all be used to work with CCAN, as well as with individual civil society organizations, academics, and in particular groups representing impacted communities.
    3. CCAN suggests a more formalized relationship between companies and civil society that can contribute to meaningful engagement. Ofcom overwhelmingly relies on tech companies and does not direct them to work with civil society. One of the Christchurch Call commitments is to “work with civil society to promote community-led efforts to counter violent extremism in all its forms, including through the development and promotion of positive alternatives and counter-messaging.” It is not clear whether Ofcom has plans to direct tech companies to work with civil society.
    4. CCAN advises Ofcom not to treat encryption as a risk.
    5. CCAN suggests that Ofcom use a more holistic approach (if allowed by law), considering context and content and undertaking a human rights impact assessment to ensure that nonviolent, informational content does not become inaccessible. Such a mechanism must still protect privacy.
    6. We recommend a stronger, proactive approach to addressing emerging threats, as opposed to putting in place reactive measures. This must be done together with the tech sector and intersecting subject-matter experts and organizations. CCAN members would be willing to consult with the UK Government on such initiatives.
  • Christchurch Call Advisory Network Statement on Israel-Hamas War

    Statement for public release
    9 November 2023

    The Christchurch Call Advisory Network (CCAN) consists of not-for-profit organizations and individuals from civil society, academia, and the technical community convened as part of the Christchurch Call. CCAN exists to provide expert advice on implementing the commitments of the Call in a manner consistent with international human rights law and a free, open, and secure internet.

    The current conflict between Israel and Hamas has seen behaviour by the actors in the conflict, governments, social media platforms, and others that is inconsistent with the Christchurch Call commitments. While non-signatories to the Call cannot be held to those commitments, governments and other entities that are signatories should adhere to them.

    The Christchurch Call Advisory Network urges actors in the conflict, platforms, and governments to:

    • Ensure that all people in the region have continuous access to the Internet and digital communications platforms without human-initiated outages, consistent with the Call’s commitment to a “free, open and secure Internet and international human rights law”, and not interfere with this access unless such interference is in full compliance with the requirements of the applicable human rights instruments;
    • Neither force the removal of content that is within the bounds of free expression, nor destroy content that contains evidence of possible war crimes;
    • Exercise great care in their use of social media, and not allow their social media accounts to disseminate false information;
    • Include civil society in crisis response as early as possible;
    • Apply human rights standards and support frameworks to reduce amplification of terrorist content on the Internet.

    Signed,
    Christchurch Call Advisory Network 

  • CCAN members go to RightsCon 2023

    RightsCon Costa Rica is on next week, from 5-8 June 2023. Some members of CCAN are speaking, including Access Now, Research ICT Africa, Article 19 and the Global Network Initiative. The Christchurch Call itself is hosting a roundtable, “Countering gender-based hate online and its drive towards violent extremism”, a workshop on what’s next for the Christchurch Call, and a couple of meet-and-greet events.

    CCAN members are bringing our new two-page flyer, which introduces the Christchurch Call and CCAN and explains how you can become a member. The flyer can be downloaded below.

  • Christchurch Call Advisory Network welcomes Rt Hon Jacinda Ardern to her new role

    Media Release: 4 April 2023

    The Christchurch Call Advisory Network (CCAN) congratulates Rt Hon Jacinda Ardern on her appointment as The Prime Minister’s Special Envoy for the Christchurch Call. We wholeheartedly support her new appointment and look forward to working with her as civil society members of the Christchurch Call community.

    Ms Ardern’s work in building the Call community has provided a useful forum for stakeholders to work together on complex issues. It is therefore appropriate for her to continue to lead this work.

    The Christchurch Call continues to be an essential endeavour to combat the ongoing danger of terrorist and violent extremist content online. The Christchurch Call community and supporters, including the Christchurch Call Advisory Network, continue to focus on the priority areas of:

    • algorithmic transparency and audits, to reduce the virality of extremist content and understand its impact
    • positive interventions to respond to extremist content
    • building the Christchurch Call community for collective action on the Call commitments
    • crisis and incident response protocols to remove live-streamed video of mass murder events and related manifestos

    The Christchurch Call Advisory Network (CCAN) exists to provide expert advice on implementing the commitments in the Call in a manner consistent with a free, safe, open, interoperable, and secure internet and with international human rights law. CCAN accomplishes its work through meaningful collaboration with governments, civil society organisations, and private sector actors.