Our research and technology development work is organised into labs, each of which runs specific tracks:

Public Interest Technology Lab

Technology Design, Development & Deployment

We contribute to the design, development and implementation of open-source technologies and communities that increase free expression, circumvent censorship, and obstruct repressive surveillance, as a way to promote human rights and open societies (privacy- and security-enhancing technologies).

We undertake research including surveys, design, development, and implementation programs focused on increasing:

  • Access to the internet, including tools to circumvent website blocks, connection blackouts, and widespread censorship;
  • Awareness of access, privacy, or security threats and protective measures, including how-to guides, instructional apps, data collection platforms, and other efforts that increase the efficacy of internet freedom tools;
  • Privacy enhancement, including the ability to be free from repressive observation and the option to be anonymous when accessing the internet; and
  • Security from danger or threat when accessing the internet, including encryption tools.

Example

Open Observatory of Network Interference (OONI)

We collaborate with the Open Observatory of Network Interference (OONI) to run tests designed to examine whether (and how) websites are blocked, and whether systems that could be responsible for censorship and/or surveillance are present on a tested ISP's network. This increases transparency around internet censorship through the collection of network measurements. We also work with other technical partners on internet scanning, network penetration testing, and intrusion detection testing.
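
As a rough illustration of what such a web-connectivity measurement involves, the sketch below resolves a hostname, fetches the page, and compares the result against a control measurement taken from an unfiltered vantage point. It is a simplified, assumption-laden example written for this description, not OONI's actual test implementation; the measure and looks_anomalous helpers and the thresholds used are hypothetical.

  # Simplified sketch of a web-connectivity comparison (not OONI's actual code).
  import socket
  import urllib.request

  def measure(url, hostname):
      """Collect DNS answers and basic HTTP results for one URL."""
      result = {"dns": None, "http_status": None, "body_len": None, "error": None}
      try:
          infos = socket.getaddrinfo(hostname, 443, proto=socket.IPPROTO_TCP)
          result["dns"] = sorted({info[4][0] for info in infos})
      except OSError as exc:
          result["error"] = "dns: %s" % exc
          return result
      try:
          with urllib.request.urlopen(url, timeout=10) as resp:
              body = resp.read()
              result["http_status"] = resp.status
              result["body_len"] = len(body)
      except OSError as exc:
          result["error"] = "http: %s" % exc
      return result

  def looks_anomalous(local, control):
      """Heuristic comparison of a local measurement against a control one."""
      if local["error"] and not control["error"]:
          return True                       # failed locally but not at the control
      if local["http_status"] != control["http_status"]:
          return True                       # e.g. a redirect to a block page
      if local["body_len"] and control["body_len"]:
          # A much smaller body often means a block page was served instead.
          return local["body_len"] < 0.3 * control["body_len"]
      return False

  # local = measure("https://example.org/", "example.org")
  # The control measurement would be collected on an unfiltered network.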

Social Justice Technologies

We collaborate with GAMAs in an effort to fight hate across all technologies, from Facebook to VR to video games, and to help bring civil rights into the digital context. As part of this remit, we work on topics such as assessing how effective machine learning can be in helping identify and counter hate; recommending how platforms can change to incentivize better online behaviour; advising how governments should evaluate and deploy algorithms in administrative and other matters; deciding what tools may be needed to monitor extremists on platforms like Gab; identifying and deciding how best to fill gaps in protecting vulnerable populations online; and identifying new and emerging trends in cyberhate and civil rights.
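
As a toy, hedged illustration of the kind of supervised text classifier whose effectiveness this work assesses, the sketch below trains a tiny model with scikit-learn; the example sentences, labels and model choice are invented for illustration, and real assessments require large, carefully annotated and audited datasets.

  # Toy illustration (not a production system) of a supervised hate-speech classifier.
  from sklearn.feature_extraction.text import TfidfVectorizer
  from sklearn.linear_model import LogisticRegression
  from sklearn.pipeline import make_pipeline

  texts = [
      "you people do not belong here",        # labelled hateful (toy label)
      "go back to where you came from",       # labelled hateful (toy label)
      "great match last night, well played",  # labelled benign (toy label)
      "thanks for sharing this article",      # labelled benign (toy label)
  ]
  labels = [1, 1, 0, 0]  # 1 = hateful, 0 = benign

  model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
  model.fit(texts, labels)

  # Probability that a new post is hateful, according to this toy model.
  print(model.predict_proba(["you people should leave"])[:, 1])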

Example: we currently sit on the Working Group for the IEEE P7003 Standard for Algorithmic Bias Considerations.

Cyber Policy Lab

Information Security Threats and Political Risk Analysis

Through mixed research methods we document and analyse information controls, both technical and non-technical (regulatory and social). We also make policy recommendations with the goal of bridging technology and policy. First, we examine increased technical capabilities at the national level that are meant to deny (e.g., Internet filtering), disrupt (e.g., network shutdowns), and monitor (e.g., network surveillance) online activities. Second, we examine how, through non-technical legal measures such as an expanded use of defamation, slander, and "veracity" laws, governments seek to deter bloggers and independent media from posting material critical of the government or specific government officials, however benign (including humour).

Examples

Politically-motivated security threats

Politically-motivated information security threats seek to deny (e.g., Internet filtering, denial-of-service attacks), manipulate (e.g., website defacements) or monitor (e.g., targeted malware) information related to the work of the targeted individuals and organisations.

Surveillance and the Right to Privacy in the Digital Age

This track builds on work done at the United Nations level, in particular the OHCHR Workshop of 19 February 2018 under Human Rights Council Resolution 34/7. The track examines:

  • National, regional and international norms, standards, principles, and best practices regarding the promotion and protection of the right to privacy in the digital age, including the responsibility of business enterprises in this regard;
  • Surveillance and communications interception;
  • Securing and protecting online confidentiality;
  • Processing of personal data by individuals, governments, business enterprises and private organizations;
  • Safeguards, oversight and remedies, including effective oversight of intelligence bodies and intelligence collection activities; and
  • New and emerging issues, including artificial intelligence, machine learning and algorithmic decision-making and their impact on human rights; how big data can be used for social good; and decentralised technologies.

Cyber Security, Crime and Intelligence Analysis

We examine the relationship between cyber-security, cyber-crime and cyber-intelligence. We seek to answer the following questions:

  • What is the state of cybersecurity legislation and policy in Africa?
  • What are the drivers of cybersecurity policymaking?
  • What would a citizen-centric cybersecurity framework look like, and what is needed to make things better in the future?

In light of the legislative void on data collection and retention in most African countries and the current passage of cyber-crime legislation, we pay particular attention to three areas: bulk data collection and retention; bulk equipment interference and remote computer searches; and public-private partnerships in cybersecurity.

Bulk Data Collection and Retention

This track examines the intrusion caused by blanket retention and mass surveillance of data, and the potential cybercrime issues this raises beyond the human rights concerns. It considers, for example, whether data-gathering capabilities should be authorised on the basis of a judicial warrant, rather than data being gathered a priori with judicial warrants required only to access it.

Bulk Equipment Interference and Remote Searches

This track examines targeted hacking: how, with a search warrant and under suitable conditions, it can be an effective investigative tool, but also how it can be misused to target civil society and political opponents. It examines hacking into computers for domestic criminal investigations, computer espionage conducted under the guise of national security, and the impact of both on human rights.

Public-Private Partnerships in Cybersecurity

The protection of Critical Information Infrastructure (CII) is recognized as central to national security and the public interest, and is normally the responsibility of the public sector. Yet most CII is privately owned, so its governance demands the involvement of private-sector and civil-society stakeholders to safeguard its safety, reliability, and resilience.

Internet Governance and Regulation

Governance

Some of the questions we address in our research include:

  • What new forms of governance exist or can be devised to regulate the internet to serve a public good?
  • How can states be held accountable for the policy commitments/promises/narratives that they make in international forums about digital inclusion?
  • What preconditions are required/investments needed for African countries to become active participants in the digital economy and policy space?
  • What are the opportunities and the risks of digital economies for Africa, for example of micro-work?

Regulation

This track takes a holistic approach to the question of whether, and how, the internet should be regulated.

Questions asked are:

  1. Is there a need to introduce specific regulation for the internet? Is it desirable or possible?
  2. What should the legal liability of online platforms be for the content that they host?
  3. How effective, fair and transparent are online platforms in moderating content that they host? What processes should be implemented for individuals who wish to reverse decisions to moderate content? Who should be responsible for overseeing this?
  4. What role should users play in establishing and maintaining online community standards for content and behaviour?
  5. What measures should online platforms adopt to ensure online safety and protect the rights of freedom of expression and freedom of information?
  6. What information should online platforms provide to users about the use of their personal data?
  7. In what ways should online platforms be more transparent about their business practices—for example in their use of algorithms?
  8. What is the impact of the dominance of a small number of online platforms in certain online markets?
  9. What diversity, inclusion and ethical issues need addressing on the internet?

Innovation, Decentralized Technologies and Digital Economy

  • Blockchain
  • Net neutrality and zero-rated services.

Example: Artificial Intelligence & Social Good

The application of big data for social change is a relatively new trend. The major ICT corporations view big data as a critical driver for generating new insights across a range of fields, from health care to the environment and education. At the same time, concerns around big data centre on the tracking and targeting of consumers.

This track addresses challenges, solutions and policy recommendations on the topic of big data and human rights to shed light on how big data can be used for social good.

We contribute to standard-setting in the design and application of algorithms in decision-making processes within both the public governance and administrative sectors.
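
As one hedged illustration of the kind of check such standards (for example, algorithmic bias considerations) may call for, the sketch below compares a decision system's selection rates across groups using the widely cited four-fifths (disparate impact) heuristic; the data, group names and threshold are purely hypothetical.

  # Hypothetical illustration of a disparate-impact check across groups.
  from collections import defaultdict

  # (group, decision) pairs; 1 = favourable outcome, 0 = unfavourable. Toy data.
  decisions = [
      ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
      ("group_b", 1), ("group_b", 0), ("group_b", 0), ("group_b", 0),
  ]

  totals, positives = defaultdict(int), defaultdict(int)
  for group, outcome in decisions:
      totals[group] += 1
      positives[group] += outcome

  rates = {g: positives[g] / totals[g] for g in totals}
  impact_ratio = min(rates.values()) / max(rates.values())

  print(rates)          # selection rate per group: {'group_a': 0.75, 'group_b': 0.25}
  print(impact_ratio)   # 0.33; values below ~0.8 are commonly flagged for review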

We also research and advise on the societal impact of computer analytics, popular information services and autonomous intelligent systems, including artificial intelligence applications and their subsets: machine learning, deep learning, and reinforcement learning.