Unit information: Introduction to AI Societies, Ethics and Futures in 2030/31

Please note: Programme and unit information may change as the relevant academic field develops. We may also make changes to the structure of programmes and assessments to improve the student experience; occasionally this includes not running units if they are not viable.

Unit name: Introduction to AI Societies, Ethics and Futures
Unit code: ALSSM0001
Credit points: 20
Level of study: M/7
Teaching block(s): Teaching Block 2 (weeks 13-24)
Unit director: Dr. James Freeman
Open unit status: Not open
Units you must take before you take this one (pre-requisite units): None
Units you must take alongside this one (co-requisite units): None
Units you may not take alongside this one: None
School/department: Arts, Law and Social Sciences Faculty Office
Faculty: Faculty of Arts, Law and Social Sciences

Unit Information

Why is this unit important?

How can we anticipate and tackle the ramifications of Artificial Intelligence for human societies now and in the near future?

Modern AI promises to give millions of people access to personalised services, education and healthcare previously accessible only to the wealthiest. It promises better public policy, seamless translation and exchange of knowledge, and a new era of discovery in medicine, science and the humanities. It is said to have the potential to dramatically lower the cost of innovation for creative arts and businesses and help us design products and infrastructure that prioritise sustainability.

Yet AI poses unequal risks to us as individuals and confronts our societies with difficult questions about trust, responsibility, human autonomy, privacy, and justice. While it could keep some of us safe and extend lifespans, it has the potential to upend livelihoods, entrench biases, and widen inequalities. While offering hope in solving some of the world’s most pressing problems, AI also poses collective societal threats.

With so much at stake, how can we balance harms, risks and benefits for individuals alongside societal implications and environmental impacts? This unit shares answers from a variety of disciplines, but most importantly it challenges you to develop an approach of your own.

How does this unit fit into your programme of study?

Introduction to AI Societies, Ethics and Futures connects the technical knowledge of modern AI gained in other units with key socio-technical perspectives and foundational ethical concepts that help us reason about how AI should be used and evaluate its risks, benefits and consequences.

A pre-requisite for responsible AI is that all those involved in the development, application, and governance of AI technologies can foresee the likely consequences of their decisions. To that end, this unit delivers a key programme outcome by enabling you to make strong links between AI's technical capabilities and how and why these might impact individuals, societies, and our environment differently, depending on your choices.

The unit's teaching and assessment design will also allow you to connect these big societal questions about AI with specialist interests or use-cases developed elsewhere in your programme. We will show that engaging with these debates and the professional standards of given domains can enrich the development of AI technologies themselves.

Your learning on this unit

An overview of content

This unit equips you to connect your own use and development of AI to the broader societal impacts, transformations, and ethical dilemmas it presents now and in the near future.

Rather than merely contemplating technological change or ethical questions in the abstract, this unit encourages you to participate actively in shaping the future of AI in society. It helps you define responsible and ethical use of AI in the specialist fields that interest you, and to place this in the context of wider societal change, benefits, risks and harms.

To do so, you will draw upon three types of interdisciplinary perspective:

First, you will be able to apply key socio-technical perspectives to place the impacts of the current ‘AI revolution’ in context. You will learn to identify AI’s implications in your field of interest by recognising not only how data and technologies come to transform societies in large (seen) and small (unseen) ways, but also how the needs and priorities of human societies shape technological adoption.

Second, you will practise connecting ethical frameworks, rules, and dilemmas specifically relating to AI, data, and algorithmic decision-making with the professional values and ethics of diverse subject domains, including medicine, business, engineering, law, and the creative industries. We will consider the core governance and decision-making strategies that can be put in place at each stage of AI development to make it more likely that technologies serve human society rather than the other way around.

Third, by showcasing how Bristol’s leading academics are developing approaches to responsible and sustainable AI across a range of fields, the unit will encourage you to make a set of cross-disciplinary connections that can inform your own approach to the design, implementation, and evaluation of AI in your areas of interest.

How will students, personally, be different as a result of the unit?

This unit aims to help you lead the future of responsible AI through your own practice. As well as gaining a more nuanced understanding of the ramifications of AI for individuals and societies, you will acquire a set of conceptual and analytical tools that allow you to confidently identify the societal and ethical implications of AI in given use cases and make design decisions or policy recommendations that best reconcile AI's potential benefits and harms.

By practising this ability, you will build up your own approach that links wider debates and trends with developments in specific domains that interest you (e.g. law, medicine, creative arts, engineering). You'll practise communicating AI's complex implications to stakeholders and learn to move easily between technical, ethical, professional, industry and governance responses to this period of rapid technological change. No matter your future role, you will be able to apply an informed approach to questions of AI regulation, responsibility and safety, recognising why these require interdisciplinary perspectives on ethics and society as well as technical fluency.

Learning Outcomes

On successful completion of this unit, you will be able to:

1. Apply key socio-technical frameworks for understanding the relationship between technology and societal change.

2. Identify the biases present in data and the differentiated impacts that can arise from algorithmic decision-making.

3. Evaluate the societal and ethical implications of AI in given use cases.

4. Construct and communicate recommendations for different stakeholders that best reconcile AI's potential benefits and harms.

5. Critically reflect on the development of your own approach to responsible AI in the context of regulation, industry standards, and/or professional ethics.

How you will learn

Each week will be taught using a range of examples and case studies relating key themes, principles and concepts to different fields. To ensure relevance for all students, the main cases referred to in teaching will often be supported by wider examples from specialist fields, and students will keep a logbook in which they reflect on how these issues relate to their existing knowledge.

The unit will be delivered through a mix of lectures and problem-based learning workshops in which students will work together across and within their programmes to understand the implications of a given AI application.

How you will be assessed

Tasks which help you learn and prepare you for summative tasks (formative):

Problem-based learning tasks completed in workshop classes.

Reflections logbook recording professional practice.

Tasks which count towards your unit mark (summative):

AI Evaluation Studies Portfolio, 3000 words or equivalent (75%) [ILOs 2-4].

The portfolio will consist of three evaluation case studies. For each study, you will select the format that best demonstrates your communication strengths from a choice of: developer guidelines, policy report, narrated video, or presentation deck.

Two of the case studies will extend problem-based learning tasks set in class and must be completed independently.

One of the case studies will be of your own design (programmes may further specify that this is linked to the specialist domain associated with your wider course). You will be offered the choice to work on this case study as a group or as an individual; we will modify the length requirements accordingly.

Professional practice reflection, 1000 words (25%) [ILOs 1 and 5].

When assessment does not go to plan

When required by the Board of Examiners, you will normally complete reassessments in the same formats as those outlined above. However, the Board reserves the right to modify the form or number of reassessments required. Details of reassessments are normally confirmed by the School shortly after the notification of your results at the end of the academic year.

Resources

If this unit has a Resource List, you will normally find a link to it in the Blackboard area for the unit. Sometimes there will be a separate link for each weekly topic.

If you are unable to access a list through Blackboard, you can also find it via the Resource Lists homepage. Search for the list by the unit name or code (e.g. ALSSM0001).

How much time the unit requires
Each credit equates to 10 hours of total student input. For example, a 20-credit unit will take you 200 hours of study to complete. Your total learning time is made up of contact time, directed learning tasks, independent learning and assessment activity.

See the University Workload statement relating to this unit for more information.

Assessment
The assessment methods listed in this unit specification are designed to enable students to demonstrate the named learning outcomes (LOs). Where a disability prevents a student from undertaking a specific method of assessment, schools will make reasonable adjustments to support a student to demonstrate the LO by an alternative method or with additional resources.

The Board of Examiners will consider all cases where students have failed or not completed the assessments required for credit. The Board considers each student's outcomes across all the units which contribute to each year's programme of study. For appropriate assessments, if you have self-certificated your absence, you will normally be required to complete the assessment the next time it runs (for assessments at the end of TB1 and TB2 this is usually in the next re-assessment period).
The Board of Examiners will take into account any exceptional circumstances and operates within the Regulations and Code of Practice for Taught Programmes.