Senior Researcher, Emerging Technology and Industry Practice - Ada Lovelace Institute

Nuffield Foundation

Location Hybrid · We are London based (Farringdon). Our staff have the option to work part of the week (2 days) from home, with some possibility to work more flexibly depending on location and requirements of the role. 
Salary Starting salary from £47,444 per annum. Permanent, full-time (35 hours) - reduced hours considered
Team Ada Lovelace Institute
  • Closing: 9:00am, 20th Jun 2023 BST

Job Description

The Ada Lovelace Institute (Ada) is hiring a Senior Researcher to lead our research into industry practices and emerging technologies. Among other work, this researcher will lead a series of projects exploring the effectiveness of responsible AI and ethics/accountability practices and demystifying emerging technologies, including:

  • AI auditing practices as a method for assessing and inspecting algorithmic systems

  • Impact assessments as a method for identifying and documenting potential risks

  • Transparency mechanisms like transparency standards or datasheets

  • Emerging generative AI governance mechanisms, like red-teaming or bias testing

  • The societal impacts of emerging technologies like general-purpose AI or synthetic data

This role is an excellent opportunity to oversee a series of projects exploring the practical, on-the-ground experiences of responsible AI practitioners, and to produce outputs that will feed into contemporary AI legislative and policy debates.

The role 

This role sits within Ada’s Emerging Technology and Industry Practice Directorate. Together with six other team members, this role will undertake research exploring the societal implications of emerging technologies and the steps developers of these technologies can take to address them.

This Senior Researcher position will oversee a research team of one to two people within this directorate, developing methods for AI and data practitioners and regulators to evaluate and assess the potential risks, harms and impacts of AI and data-driven technologies. This role will report directly to the Associate Director for Emerging Technology and Industry Practice.

Working with the Associate Director, this role will be responsible for developing and executing a research agenda that explores the practices industry firms can implement to improve accountability over AI products, and for demystifying the limitations, opportunities and potential societal impacts of emerging technologies.

There are three potential projects this role may immediately oversee:

  • A project to explore lessons learned from a local government's attempt to require algorithmic bias audits of employment tools.

  • A project with a law firm to study how a third-party algorithmic auditing agency can develop and implement practices for algorithmic auditing.

  • A project exploring generative AI governance approaches.

This role will work on these projects with the support of up to two research staff and wider Ada functions, including our Communications, Operations, and Policy & Public Affairs teams. This role may also advise and contribute to other projects within the Emerging Technology and Industry Practice Directorate.

In addition to these projects, this role will be responsible for developing communication strategies for outputs, and for conceptualising, facilitating and attending meetings, workshops and events, with a view to achieving strategic impact with key stakeholders.

Some of our work to date

This Senior Researcher will lead the next iteration of our Ethics & Accountability in Practice programme, which produced research exploring (a) what methods exist for holding AI systems accountable, (b) what methods need to be developed, (c) what can be learned from testing these methods, and (d) how these methods can be reflected in policy and law.

Our previous research has looked at the following areas:

To date, Ada’s methodologies include the use of working groups and expert convenings, public deliberation initiatives, desk-based research and synthesis, policy and legal analysis and translation, and ethnographic research. We welcome new kinds of expertise and methodologies into our team, and for this role we are hoping to attract candidates with a background in data science and/or computer science.

About you

You are a researcher or professional who may have a background in research for a policy department, a regulator, a technology company, a research institute, a charity or an academic organisation. You have experience and familiarity with AI and data science concepts, and can engage with technical communities and lay audiences on these topics. You are curious and passionate about the issues that arise at the intersection of technology and society, and are committed to bringing an interdisciplinary and intersectional lens to understanding them. Importantly, you’ll be comfortable taking initiative, working independently and, at times, to short deadlines.

You’ll enjoy working in a team environment, willing to jump into projects and keen to explore areas of policy, technology and practice that you don’t already understand. You’ll appreciate the importance of high standards of rigour in research, but also want to think creatively about communicating and influencing in novel ways. 

For further information about the role and the skills and experience we're looking for, please download the full job description here.

About the Ada Lovelace Institute 

The Ada Lovelace Institute is an independent research institute funded and incubated by the Nuffield Foundation since 2018. Our mission is to ensure data and artificial intelligence work for people and society. We do this by building evidence and fostering rigorous debate on how data and AI affect people and society.  We recognise the power asymmetries that exist in ethical and legal debates around the development of data-driven technologies and seek to level those asymmetries by convening diverse voices and creating a shared understanding of the ethical issues arising from data and AI. Finally, we seek to define and inform good practice in the design and deployment of AI technologies.   

The Institute has emerged as a leading independent voice on the ethical and societal impacts of data and AI. We have built relationships in the public, private and civil society sectors in the UK and internationally. Please find details of our work here.

Our research takes an interconnected approach to issues such as power, social justice, distributional impact and climate change (read our strategy to find out more), and our team have a wide range of expertise that cuts across policy, technology, academia, industry, law and human rights.  We value diversity in background, skills, perspectives and life experiences. As part of the Nuffield Foundation, we are a small team with the practical support of an established organisation that cares for its employees.

We strongly encourage applicants from backgrounds that are underrepresented in the research, policy and technology sectors (for example those from a marginalised community, those who did not go to university or had free school meals as a child). We are committed to tackling societal injustice and inequality through our work, and believe that all kinds of experiences and backgrounds can contribute to this mission.  

How to apply

The closing date for applications is 9:00am (BST) on Tuesday 20th June 2023, with interviews scheduled to take place on Thursday 29th June 2023.

You will be required to answer some questions as part of this application process, and to upload an up-to-date copy of your CV. The Applied platform lets you save a draft application and return to it before the application deadline.

We are committed to inclusive working practices and during the application process we commit to:

  • paying travel costs (and any childcare or care costs) for interviews where in-person attendance is required

  • making any reasonable adjustments – for example, providing documents in different formats or arranging a sign language interpreter for interviews

  • offering, as a Disability Confident employer, a guaranteed first-stage interview to disabled candidates who meet the essential criteria for the role

Should you need to make an application in a different format or require any adjustments as part of the application process, please get in touch with us: recruitment@nuffieldfoundation.org

Our benefits package includes:

  • 28 days holiday per annum plus all public holidays (with the option to buy or sell up to 5 days)

  • A salary exchange pension scheme that offers employer contributions of up to 11%

  • Life assurance scheme

  • Family leave policies that provide an enhanced level of pay

  • Cycle to work scheme and loans towards season tickets

  • Opportunities for learning and development

  • Wellbeing support, including an employee assistance provider, personal health reviews with Bupa and a staff network of trained Mental Health First Aiders

Removing bias from the hiring process

  • Your application will be anonymously reviewed by our hiring team to ensure fairness
  • You’ll need a CV/résumé, but it’ll only be considered if you score well on the anonymous review

Applications closed Tue 20th Jun 2023