Participation, democracy, sustainability

Author: Caitlin Hafferty

Governing nature recovery in Scotland: growing transformative change

Rewilding on the banks of Loch Ness (source: Caitlin Hafferty)

Our study examines how governance influences transformative change in nature recovery initiatives. Part of the Leverhulme Centre for Nature Recovery at the University of Oxford, our research explores the governance conditions that enable or constrain the transformative potential of nature recovery for delivering simultaneous community, biodiversity, and climate benefits.

Successful nature recovery involves benefitting people and fostering deep connections between humans and nature. This requires embracing local, indigenous, and scientific knowledge through holistic, integrated, and participatory decision-making processes. However, current approaches risk perpetuating norms and practices that may exacerbate inequalities and injustices, hindering the changes needed for transformative human-ecological well-being. In particular, there is considerable opportunity to investigate the role of natural and private capital, along with supportive policy mechanisms, in achieving scientifically robust ecological and climate goals while addressing social risks, enhancing community benefits, and strengthening local democracies. There is continued debate about how natural capital markets can support high-integrity and equitable nature recovery, including the risks and trade-offs for meaningful community participation and empowerment. It is also important to explore how collaboration at the local level can be carefully balanced with broader priorities, incentives, standards, and regulations.

We aim to understand, examine, and advocate for inclusive governance frameworks that actively address inequalities and promote collaboration to tackle the biodiversity and climate crises, build community wealth and a circular economy, strengthen democracy and social justice, ensure a just transition, and diversify land ownership. Key questions will revolve around:

  • Views on the interactions between community and socio-economic benefits and nature recovery initiatives in Scotland.
  • How local knowledges, values, and community benefits can be captured, mapped, and integrated into monitoring, evaluation, and broader decision-making processes for landscape and ecological change.
  • Emerging opportunities for and tensions between mechanisms for financing, incentivising, and certifying nature recovery and delivering community benefits, strengthening local democracies, and contributing to a ‘just’ transformation.
  • How we can achieve the balance between high-integrity, credible, and scalable nature recovery and delivering democratic, socially inclusive forms of governance that are place-based and locally sensitive, while aligning with broader standards and priorities.

Overall, our study investigates how politics and governance – including framings, actors, and institutional dynamics – shape transformative change in nature recovery. We will explore how different governance and institutional arrangements affect social and ecological outcomes, drawing initially from case study landscapes in Scotland, and grounding these in national and international debates. This project ultimately aims to understand, promote, and embed new ideas and pathways for reimagining and remaking the future to support justice and well-being for humans and nature. In doing so, it aims to deliver conceptually-driven, pragmatic and actionable options for policy-makers and practitioners on how to ‘grow’ transformative change through nature recovery.

If you are involved in nature recovery initiatives in Scotland, we would love to hear from you!

We are conducting a study on the governance conditions that enable and constrain the transformative potential of nature recovery initiatives for meeting multiple ecological, climate, and social objectives. Your insights, expertise, and experiences will help us understand how we can grow transformative pathways that support justice and well-being for both humans and nature. Your contributions may also help inform pragmatic and actionable recommendations for policy-makers and practitioners in Scotland, across the UK, and beyond.

We appreciate that participating in academic studies takes time, and we believe it is important that research relationships are participatory and reciprocal with a positive, mutual exchange of benefits. If there is anything that the research team can do to help you in return, please do let us know!

Research team:

Caitlin Hafferty (Leverhulme Centre for Nature Recovery, University of Oxford).

Contact: caitlin.hafferty@ouce.ox.ac.uk

Caitlin will be occasionally joined by two MSc students from the School of Geography and the Environment.

Who can participate?

Anyone involved in working with nature to benefit both people and biodiversity in Scotland is warmly invited to participate. This includes anyone involved in the policy, financing, strategy, design, and/or delivery of a range of nature recovery projects, such as rewilding and restoration, marine and peatland restoration, urban greening, species introduction and management, community-led conservation, and more. We welcome insights from a range of private, public, and civil or community-led initiatives, including blended finance collaborations and community wealth building.

We are initially interested in participants involved in nature recovery projects in Argyll and Bute, Inverness-shire, and Aberdeenshire; however, we also welcome broader perspectives from across Scotland, the UK, and further afield. Please feel free to get in touch if you have any questions about your eligibility to participate.

What’s involved?

A 45-60 minute interview conducted in-person or online. Interviews will be semi-structured, informal and conversational, and can be conducted at a time and location to suit you. All interviews will be confidential and used solely for research purposes.

The research team will be based in Scotland and able to conduct in-person interviews in Argyll and Bute, Inverness-shire, and Aberdeenshire over specific periods of time in 2024.

In-person interviews are being conducted in Inverness-shire between the 20th June and 10th July 2024. The research team would love to visit a diversity of nature recovery projects in the area and conduct interviews while walking around the site. Alternatively, sit-down interviews (e.g., in a café) or online interviews can be conducted flexibly.

Further dates for in-person interviews in Inverness-shire, Argyll and Bute, and Aberdeenshire, are to be confirmed. Please get in touch with Caitlin to discuss.

How to enquire & more information

Please contact Caitlin Hafferty (caitlin.hafferty@ouce.ox.ac.uk) for more information or to arrange an interview. Please include a brief description of and/or link to your nature recovery project/s or organisation. We look forward to hearing from you!

For more information about the project, what to expect, what happens to data provided, and more, you can view and/or download the project information sheet in the PDF below.

Participatory governance for scaling-up Nature-based Solutions

On January 18th 2024, I presented a seminar entitled “Participatory Governance for Scaling-up Nature-based Solutions: Social Science Insights from a fast-paced, impact-focused interdisciplinary project” as part of the CCRI seminar series.

In the seminar, I discussed how, within the context of the dual biodiversity and climate crises, Nature-based Solutions (NbS) address societal challenges with multiple benefits for people and nature. Advocates argue that NbS fundamentally have both ecological and social goals, and as such they have attracted considerable attention from national to international scales. However, there are concerns about the potential exclusion and marginalisation of local communities and other groups. Participatory and democratic approaches are often promoted as an antidote to these issues, ensuring more equitable and inclusive decision-making outcomes.

The seminar explored how different framings and messages around NbS can open up or close down opportunities for participation, and the implications of this for delivering multiple socio-economic and ecological outcomes in a way that is equitable and sustainable. It presented insights from a fast-paced, solutions-focused interdisciplinary project at the University of Oxford on Scaling-up Nature-based Solutions in the UK. This work was part of the Leverhulme Centre for Nature Recovery and a larger NERC-funded project, the Agile Initiative, which aims to revolutionise how research responds to urgent global environmental policy and practice challenges. In particular, I focused on the social science contributions, also reflecting on the lessons learned from contributing social science expertise to a solutions-oriented interdisciplinary project at the science-policy interface.

Watch the seminar on YouTube and download the PowerPoint slides below.

Are you a practitioner interested in engagement and participation for Nature-based Solutions? Sign up to a free webinar in February 2024 which launches our new guidance, the Recipe for Engagement. Sign up here.

How can we build trust and integrity for connected communities and transformative democracy?

Participatory and decentralised governance and citizen engagement are often promoted as key parts of the solution to the world’s most pressing societal challenges. There is a critical opportunity to leverage participatory approaches to bring people together, promote collaboration and deliberative discussion, and help tackle existing power structures.

The evidence-backed benefits of participatory and democratic processes include: building trust and integrity, enhancing the perceived credibility of decisions and decision-making institutions; negotiating political divisions and polarisation, promoting solidarity and togetherness; improving socio-economic and environmental outcomes through more plural, flexible and anticipatory governance processes; enhanced quality of knowledge and evidence through the incorporation of diverse knowledge types and realities; and fostering empowerment, collective action, and community benefits through localised and bottom-up approaches.

In July 2023, I presented a seminar titled “How to build standards of trust, accountability, and inclusion for sustainable places” to the Department of Levelling Up, Housing and Communities (DLUHC). I presented in collaboration with Oxford University’s Agile Initiative and Leverhulme Centre for Nature Recovery projects.

The talk was part of the DLUHC 2023 Science Seminar Series, curated by the Chief Scientific Advisor’s Office, which aims to seamlessly integrate scientific evidence into DLUHC’s focus areas, aligning with their research interests and priorities. Our aim was to bridge the gap between academic research and real-world applications in the realm of urban planning and regeneration, housing, and fostering sustainable, thriving and connected communities.

Our core message revolved around the power of ‘engagement’, which is part of broader transformative efforts for more participatory and deliberative democracy and justice. We underscored the significance of involving the public in decision-making processes concerning local places and communities. The evidence we presented shed light on the connection between engagement and the establishment of trust, inclusion, and integrity in political decision-making.

A key highlight of our presentation was the exploration of digital tools for engagement. This was particularly relevant to DLUHC’s initiatives for digital planning and ‘PropTech’, which aim to promote innovative tools and technologies for citizen engagement with planning policy and practice. We delved into both the technical and ethical aspects of technological innovation, offering insight into their application. While digital tools can undoubtedly enhance the effectiveness of engagement in many ways, it is crucial to be cautious about the ethical risks, such as issues related to digital literacy and infrastructure. Our recent academic paper pre-print (free to download) explores these technical and ethical debates around digital tools for democratic and participatory engagement, making relevant recommendations for practitioners and policymakers.

The seminar also emphasised the necessity of embedding a culture of democratic engagement within DLUHC and across Government more broadly. We stressed the importance of building the capacity and capability to implement best practices in engagement processes, ensuring decisions align with development, sustainability, and local community needs.

Our research gains particular relevance in the rapidly evolving landscape of democratic and digital transformation in the UK. With increasing calls for democratic reform and citizen participation, and an ever-growing toolkit of digital technologies and platforms at our fingertips, the dynamics of planning and environmental decision-making are undergoing a significant shift. On a global scale, influential organisations like the OECD and the European Union are promoting digital tools as catalysts for fostering more interactive, human-centred approaches. Closer to home, the United Kingdom is making bold strides in digital transformation, positioning digital technologies front and centre in public service provision and engagement.

Our presentation not only enriched the learning of DLUHC staff but is also available for viewing by the broader UK public sector. The presentation slides can also be downloaded here.

In a world where democratic reform, sustainable transformations, and community empowerment have never been more critical, our seminar served as a reminder of the pivotal role that engagement plays in driving standards of trust, accountability, and inclusion in decision-making and public institutions.

This blog post has been adapted from its original version, posted here.

From automated transcription to qualitative analysis: 3 easy steps

This blog post has been reposted from my old blog (original post March 2022)

I have been using automated transcription software Otter.ai throughout my 3-year PhD to facilitate data collection and analysis. This tool has been indispensable for transcribing events (e.g. workshops and conferences), in-depth interviews and focus groups with research participants, meetings with colleagues, and much more. 

If you’re new to using automated transcription, navigate to my previous blog posts which offer an introduction and tutorial. Importantly, automated transcription comes with a specific set of ethical and privacy considerations, which you can read more about in this post. Since writing these, I’ve run different talks and workshops on automated transcription – you can read a summary of the key messages from these here, including links to presentation slides and recordings. 

In this post, I share some insights and tips from my experience using Otter.ai to generate, edit, and prepare transcripts ready for qualitative analysis. I use qualitative analysis software NVivo by QSR in this example. NVivo helps qualitative researchers to organise, analyse, and find insights in unstructured or qualitative data like interviews, open-ended survey responses, social media content, etc. However, there are lots of other proprietary tools you can use for analysing text, as well as free and open source options such as Voyant Tools. You can also use programming languages like R and Python to conduct text mining and analytics (e.g. see this guide for text mining in R). Of course, computer-aided qualitative analysis isn’t the only way to go and manual coding remains just as important. 
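
To give a flavour of what basic text analytics can look like outside dedicated software, here is a minimal Python sketch (my own illustration, not part of the original workflow or any particular package) that counts the most frequent words in a transcript exported as plain text. The file name `interview_transcript.txt` is a hypothetical example.

```python
# A minimal sketch: word frequencies in an exported transcript, using only
# the Python standard library. "interview_transcript.txt" is a hypothetical
# file name for a transcript exported (e.g. from Otter.ai) as plain text.
import re
from collections import Counter

# Common filler/stop words to ignore; extend this set to suit your own data.
STOP_WORDS = {
    "the", "and", "a", "an", "to", "of", "in", "that", "it", "is", "was",
    "for", "on", "with", "as", "so", "you", "we", "i", "this", "be", "are",
    "like", "uhm", "um", "ah", "yeah", "just", "really", "kind", "sort",
}

def word_frequencies(path, top_n=20):
    """Return the top_n most frequent words in a plain-text transcript."""
    with open(path, encoding="utf-8") as f:
        text = f.read().lower()
    words = re.findall(r"[a-z']+", text)  # very simple tokenisation
    words = [w for w in words if w not in STOP_WORDS and len(w) > 2]
    return Counter(words).most_common(top_n)

if __name__ == "__main__":
    for word, count in word_frequencies("interview_transcript.txt"):
        print(f"{word}: {count}")
```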

The core messages in this post should hopefully be relevant for a broad audience of researchers, regardless of what specific tools and approaches they are using. Equally, while I use Otter.ai in this example, there are plenty of other free and paid tools available in 2022, many of which have pretty similar core features.

1. Edit the transcript

Once you’ve uploaded a recording into Otter.ai (or used the live transcription function) and it has finished transcribing, you’ll need to manually edit it. Although Otter does a pretty accurate job of converting audio to text, it will always need human input to check that there are no mistakes. This is a particularly important consideration for researchers who want to make sure that their participants’ contributions are accurately represented. It’s also beneficial to spend time going through each transcript to get a ‘feel’ for the data.

So, the first step is to read back through the transcript and correct any mistakes. Different methods work for different people, but I tend to read through and edit the transcript while listening back to the audio recording at around 1.5x to 2x speed, slowing down and speeding up as necessary. Now that I’ve been using this method for a long time, it’s become increasingly straightforward and efficient (it takes a few goes to really get used to it!).

The features offered in Otter.ai are particularly useful for editing because you can listen to the audio while editing in your internet browser. As shown in the photo below, individual words are highlighted as the audio recording plays. However, do make sure that you have a reliable internet connection so that everything saves properly (I’ve learnt this the hard way by losing lots of edited data and having to start again!). If I’m working somewhere with a poor WiFi connection, I usually export the edited transcript as a text file at regular intervals, so that if the edited transcript doesn’t save properly I at least don’t lose all of my edits.

A screenshot of the Otter.ai browser interface showing how words in the transcript are highlighted as the audio plays, speaker labelling, and the speed settings. (Transcript source: public webinar “Engaging for the Future”, Commonplace).

The key things that I check for when editing include:

  • Punctuation errors – e.g. full stops, commas, and question marks where they shouldn’t be (or a lack of punctuation in the right places).
  • Random paragraph breaks – sometimes, for example when a speaker pauses mid-sentence, Otter.ai automatically starts a new paragraph, so it’s worth checking to see if this has happened and merge paragraphs where necessary.
  • Lack of paragraph breaks – Otter.ai has a tendency to generate long monologues of speech, which might need to be broken up into smaller paragraphs to make it easier to read. 
  • Spelling errors and incorrect words – I find this happens quite a lot when transcribing different accents, when specific names and locations are mentioned, or when abbreviations are used. 
  • Linked to the above, please do carefully check for any words which could be interpreted as rude or inappropriate – I won’t repeat any here, but I have removed some rather interesting misinterpretations of words from some of my transcripts (!).
  • Mislabelled speakers – it’s really important to check that Otter.ai has labelled your speakers correctly and not mislabelled anyone (this can happen, for example, when someone interrupts someone else mid-sentence, or if two people have very similar sounding voices). 
  • Remove repetition and utterances – in natural spoken language, people tend to repeat words, use filler words (like “uhm”, “ah”, and “like”), and can stop talking or change the course of conversation mid-sentence. While utterances and repetition can be useful to retain in the transcript for some purposes, there are other times when you might want to edit these out.
  • Removing any identifiers – for research in particular, it’s important to make sure that you protect the anonymity of participants at all times. Because Otter.ai transcribes verbatim, the text will include everything in the conversation (e.g. people’s names, names of businesses, areas, etc.). This is a particularly important consideration when conducting online interviews, for example, when the boundaries between private and professional lives can become blurred (particularly when participants are joining the interview from their home) and you risk capturing personal information. A simple scripted find-and-replace can support this step (see the sketch after this list). 
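
As a rough illustration of that last point, here is a minimal Python sketch for replacing known identifiers with placeholders before importing a transcript into analysis software. The names, places, and file names are hypothetical examples, and a scripted find-and-replace like this supplements, rather than replaces, a careful manual read-through.

```python
# A minimal sketch (my own illustration, not part of the original post):
# replacing known identifiers in a transcript with pseudonyms before analysis.
# The IDENTIFIERS mapping and file names are hypothetical examples; always
# follow up with a manual check of the redacted transcript.
import re

# Map real names/places (as they appear in the transcript) to placeholders.
IDENTIFIERS = {
    "Jane Smith": "[Participant A]",
    "Acme Community Trust": "[Organisation 1]",
    "Drumnadrochit": "[Village]",
}

def redact(text, identifiers=IDENTIFIERS):
    """Replace each known identifier with its placeholder (case-insensitive)."""
    for name, placeholder in identifiers.items():
        text = re.sub(re.escape(name), placeholder, text, flags=re.IGNORECASE)
    return text

if __name__ == "__main__":
    with open("interview_transcript.txt", encoding="utf-8") as f:
        raw = f.read()
    with open("interview_transcript_redacted.txt", "w", encoding="utf-8") as f:
        f.write(redact(raw))
```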

One important thing to note is that once you have made edits, the audio will need time to realign with the text (and this doesn’t always happen accurately). This is usually fine, because if you notice a mistake with audio alignment in the Otter.ai app you can always check the text against the original audio recording using the time stamps, but it’s useful to keep in mind.

2. Annotate the transcript 

While editing the transcript, I start to annotate key quotes that I think are useful or interesting for analysis. Usually I have a few research questions and/or themes from the academic literature in mind when analysing data, which helps to guide this process. I also add comments about emerging themes, or anything I think is interesting/relevant for the analysis stage. In Otter.ai, you can highlight text using the “highlight” function (all highlights are then summarised at the top of your transcript, below the title and key words). In addition to highlights, you can add individual comments to sections of the text, which appear in the margin in order.

A screenshot of the Otter.ai browser interface showing how you can highlight and comment on text. (Transcript source: public webinar “Engaging for the Future”, Commonplace).

I won’t go into too much detail here as I have covered this in my previous blog posts (e.g. this tutorial and this webinar), but automated transcription software can generate some really useful summaries of your transcript. This is particularly useful if you want to quickly see some of the (potential) themes in the transcript before conducting more in-depth analysis, e.g. if you’re working on a collaborative project and want to send your colleagues a brief summary. The image below shows the key words automatically generated by Otter.ai (which can also be viewed as a word cloud); these are the words that appear most frequently in the transcript. In this example, you can see from the key words that this webinar was about community engagement in a planning setting. Otter.ai will also tell you the amount of time (%) that each person speaks for in the transcript, amongst other quick insights.

A screenshot of the Otter.ai browser interface showing automatically generated key words. (Transcript source: public webinar “Engaging for the Future”, Commonplace).
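
If you want a similar quick insight without the Otter.ai interface, a rough Python sketch like the one below can estimate each speaker’s share of a transcript by counting words per speaker (a proxy for speaking time, not an exact equivalent). It assumes each turn starts with a line like “Speaker Name: …”; exported formats vary, so the parsing will likely need adjusting for your own files.

```python
# A rough sketch (my own example): estimating each speaker's share of a
# conversation from a plain-text transcript by counting words per speaker.
# Assumes turns begin with a line like "Speaker Name: ..."; adjust the
# parsing to match however your transcription tool exports speaker labels.
from collections import Counter

def speaker_word_share(path):
    """Return each speaker's share of total words as a percentage."""
    counts = Counter()
    current = "Unknown"
    with open(path, encoding="utf-8") as f:
        for line in f:
            if ":" in line:
                maybe_speaker, _, rest = line.partition(":")
                # Treat short prefixes (e.g. "Caitlin Hafferty") as speaker labels.
                if len(maybe_speaker.split()) <= 4:
                    current = maybe_speaker.strip()
                    line = rest
            counts[current] += len(line.split())
    total = sum(counts.values()) or 1
    return {speaker: round(100 * n / total, 1) for speaker, n in counts.items()}

if __name__ == "__main__":
    for speaker, share in speaker_word_share("interview_transcript.txt").items():
        print(f"{speaker}: {share}% of words")
```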

3. Prepare for analysis

It’s very straightforward to export a file from your chosen automated transcription software and move it to qualitative analysis software (in Otter.ai you can export your transcript as a TXT, DOCX, or PDF file, etc.). I export my files from Otter.ai in .txt format and import them into NVivo by QSR (“import” > “text file”). If you haven’t used NVivo before, there are some great tutorials on YouTube and on their website.

Once the files are imported to NVivo, I copy and paste all of my comments from Otter.ai (see previous step) and add them as “annotations”. I do this by finding the relevant words (CTRL+F), highlighting the text in NVivo and adding the annotation (CTRL+A), then pasting the corresponding comment. There might be more efficient ways to do this, but it works well for me – when I’m analysing qualitative interviews, for example, repeatedly going through the transcript really helps to increase my familiarity with it. 

A screenshot of NVivo by QSR showing one way that comments from Otter.ai can be used to create annotations and themes for analysis. (Transcript source: public webinar “Engaging for the Future”, Commonplace).

Once I’ve added all the comments into the transcript in NVivo as annotations, I then start more in-depth analysis (coding). While I go through the transcript and code it into different themes, the annotations are really useful for highlighting quotes and insights which I may have otherwise overlooked. This is helpful for me because I have a thorough record of the various stages I went through to analyse my data, including emerging themes. If you’re unfamiliar with this software, make sure to check out the numerous free resources and tutorials available online (I’ve pasted a few links below).

I will also add the caveat here that this is just one way that I’ve been using automated transcription with qualitative analysis software, out of many potential approaches which can be facilitated by software (or not). While these tools have been really useful for me, this isn’t necessarily the best or most efficient way of doing it.

10 recommendations for best practice stakeholder engagement

I collaborated with Natural England to produce outputs for embedding an evidence-led, best practice culture of engagement. This included delivering recommendations which are useful for any organisation thinking about improving their strategy for public and stakeholder engagement.

The available evidence for best practice public and stakeholder engagement was reviewed in a report and summarised in an accompanying infographic pack. Engagement is a process by which members of the public (or other key stakeholders like local authorities, businesses, and charities) can become involved in decisions which affect their lives. Engagement is essential for healthy democracies and ensuring that people are at the heart of tackling environmental issues, helping us to make better decisions for more sustainable and equitable outcomes for everyone.

The outputs from this research are suitable for anyone who is thinking about engaging, including practitioners, practice enablers, researchers, and policy makers who aim to involve members of the public and other key stakeholders in decision-making processes. While this work was focused on engagement in environmental decision-making, it is more broadly relevant to other areas of research and practice.

The report provides the evidence behind what engagement is and why it is important, what the benefits are, the potential risks of ‘poor’ engagement and how to mitigate them, how different ‘types’ of engagement can provide useful classifications for practitioners, and how practitioners can use theory (i.e., different ways of thinking and knowing) to inform best practice. This includes the challenges and opportunities of engaging during COVID-19 and in an increasingly digitised world, particularly considering the ethical implications of digital technologies.

The report outlines how the available evidence can be used to inform the creation of an evidence-led, best-practice engagement culture. It outlines 10 recommendations which consider engagement strategies, frameworks, standards, models, methods, toolkits (and so forth).

One central message in this review is that ‘best practice’ engagement and its outcomes will vary between different situations. Practitioners should recognise that the quality of the process and outcomes will change depending on the purpose and objectives for engaging, as well as organisational cultures of engagement, institutional capacity, wider socio-economic and political contexts, and the characteristics of participants.

The 10 recommendations in the report are:

1. Engagement is an ongoing process, not just a one-off activity.

2. Take time to understand the local context in which engagement is being carried out.

3. Engage stakeholders in dialogue as early as possible in the decision-making process.

4. Recognise the importance of integrating local and scientific knowledge and implement this in practice.

5. Manage power dynamics effectively, for example by using skilled facilitators who can help marginalised voices be heard and build trust in the process.

6. Think about the length and time scale of the engagement process and how often it might be necessary to engage with participants. 

7. Recognise that different (digital/remote and in-person) tools and approaches for engagement will work differently in different situations.

8. Engagement coordinators need to manage participants’ expectations of the engagement process.

9. There are risks to engagement, some of which can be managed or mitigated.

10. Frameworks for engagement need to be institutionalised within organisations as a culture of engagement.