
Call for Applications: Country Researcher for Global Index on Responsible AI

The Local Development Research Institute (LDRI) is an action-oriented non-profit think tank supporting African countries in their efforts to take practical, evidence-informed actions to end extreme poverty, end hunger, and reduce inequalities. We build the capacity of governments and other stakeholders across Sub-Saharan Africa to use data and evidence for better socio-economic development outcomes. As a member of the Data for Development Network (D4D.net) Global Research Hub, we are excited to announce our collaboration with the Global Index on Responsible Artificial Intelligence (GIRAI), a groundbreaking initiative led by Research ICT Africa. Together, we aim to develop a comprehensive set of benchmarks for assessing countries' commitments to responsible AI on a global scale.

LDRI will be responsible for leading research for a subset of 20 countries in the Sub-Saharan region. Applications are open for the following countries: Benin, Botswana, Ivory Coast, Ghana, Liberia, Malawi, Morocco, Mozambique, Namibia, Nigeria, Rwanda, Senegal, Sierra Leone, South Africa, Togo, Tunisia, Zambia, and Uganda. As a Country Researcher, you will play a crucial role in gathering and assessing evidence on responsible AI commitments and progress in your designated country, contributing to the advancement of accountable and rights-based AI principles globally.

Join us in shaping the future of responsible AI practices and promoting human rights-based AI principles globally. As a Country Researcher for the Global Index, you will make a significant impact in fostering accountability, refining best practices, and driving positive change in the AI landscape.

About the Global Index on Responsible AI:

The GIRAI is a groundbreaking tool designed to provide a reliable and comparative assessment of responsible AI practices in countries worldwide. By integrating human rights obligations into AI ethics principles, the Global Index establishes concrete benchmarks for measuring responsible AI based on existing human rights treaties and standards. The Global Index aims to foster accountability, refine best practices, develop precise policy interventions, and encourage regional and international cooperation in promoting responsible AI.

To do so, the GIRAI has built a Global Research Network composed of Regional Hubs: civil society organisations, research institutions, think tanks, and universities that are well recognised in their regions for their work on digital technologies, AI, and human rights. These organisations coordinate the work of more than 130 country-level researchers.

Responsibilities of a Country Researcher:

As a Country Researcher, your main responsibility will be conducting primary data collection in your assigned country to complete the GIRAI survey. To prepare for this, researchers will participate in mandatory capacity-building sessions where they will be introduced to the concepts of responsible AI through the lens of human rights and familiarised with the GIRAI tools and methodology, data collection strategies, and supervision procedures. They will also receive a Researcher's Handbook, which provides definitions, justifications, identifications, and examples for each thematic area of the GIRAI. Country Researchers are expected to adhere to data quality standards, follow the established data flow processes, and comply with the timelines and milestones for completing data collection, all of which are set by the GIRAI team and the Regional Hubs.

The GIRAI survey is organised across thematic areas covering the core components of responsible AI, based on human rights standards and democratic principles related to AI. Examples include Gender Equality, Data Protection and Privacy, Bias and Unfair Discrimination, Labor Protection and Right to Work, Accountability and Responsibility, and Transparency and Explainability. Country Researchers will be asked to provide and assess evidence on frameworks, government actions, and non-state actors working to advance each thematic area and its intersection with AI.

Requirements for Country Researchers:

  • In-depth knowledge of AI governance and policy, as well as the social and ethical implications of AI and digital technologies.
  • Advanced research skills and experience in AI-related fields, with a focus on responsible AI, data protection, data policy, and data sharing.
  • Familiarity with recent policy developments related to AI in the designated country.
  • Strong command of written and spoken English and fluency in one or more official languages of the designated country. 
  • Experience in online data collection and assessment of documentary evidence.
  • Demonstrated ability to work independently, meet deadlines, and adhere to data quality standards.
  • No conflicts of interest that could interfere with the ability to conduct impartial assessments of available evidence.
  • Stable internet connection (with an average download speed of 25 Mbps).
  • Availability to work full-time for 3 months, ideally from September to November 2023. 

Application Process:

To apply for the position of Country Researcher, please fill out the application form here by September 22nd, 2023. Your application should include:

  1. A brief motivation letter expressing your interest in the role of Country Researcher.
  2. Prior research experience, with specific attention to AI and familiarity with the AI landscape in the designated country.
  3. Indication of your availability and commitment to work full-time for three months, from November 2023 to January 2024.
  4. References or examples of research papers or reports you have authored or co-authored.

We look forward to receiving your application and reviewing your qualifications.