[Call for Participants] Are you active in an online community?

Do you have a role in monitoring and assessing user behaviour and content?

Do you have an active role in monitoring and assessing user behaviour in a (public or private) online community, such as a Facebook page or group, Subreddit, community site, forum or wiki, for example through moderating posts, creating community guidelines, or managing the group?

The Everything in Moderation? Project would love to hear from you!

Researchers at Horizon Digital Economy Research at the University of Nottingham are studying how the emergence of end-to-end encrypted platforms and services impacts the practice of moderation (whether human, machine, or hybrid). This project is in partnership with the Internet Society and will run for a year until the end of April 2023. 

As part of the project, we are interested in hearing about current attitudes and practices in moderation from people who are actively involved in (public or private) online communities. We are looking to interview people who have experience of a range of online moderation techniques – from self-moderation practices and community guidelines to technology-mediated practices such as automatic removal or muting of posts. We are especially interested in understanding attitudes towards end-to-end encryption as it relates to moderation, but it is not necessary to have experience of this.

We would therefore like to invite anyone who is actively involved in an online community and takes on a role in monitoring and assessing user behaviour and content in that community to take part in an online interview with one of our researchers. It should take around an hour (or however long you wish), and to reimburse you for your time we will offer you a £15 Amazon voucher.

In addition to this interview, we are also seeking to engage longer term with a variety of online communities as the project progresses. If you think you or your group would be interested in this, please do let us know.

If you are interested, please contact me! (Dr Liz Dowthwaite)

[Call for Participants] Exploring public perceptions and attitudes towards using automated techniques for credit scoring

Another of my PhD students is recruiting participants for a series of online workshops. Are you a credit card user, or have you had experience with any other form of loan application? We are running a study to explore general perceptions and attitudes towards the use of automated approaches for making credit decisions, and the importance placed on explanations of those decisions. We are inviting people to take part in our study, which forms part of a PhD research project exploring the responsible use of automated techniques for credit scoring.

Who can participate?

  • Be over 18 years old
  • Be resident in the UK
  • Be a credit card user or have experience of any form of loan application

What you will do:

You will take part in a focus group session to discuss themes of the study and fill out a survey on the importance placed on credit explanations.

Where?

The study will be carried out online on Microsoft Teams, and all data collected will be anonymised.

How long will it take?

The focus group session will last about 90 minutes, and the survey should take about 10 minutes to complete.

What’s in it for you?

You will receive a £15 Amazon voucher for your kind participation!

Interested?

Please sign up directly here, or to find out more, view the study page here. Please also see the information sheet and privacy notice.

Please feel free to share with friends or relations who are also eligible to participate. 

If you have any questions, please contact Edwina Abam (firstname.lastnameATnottingham.ac.uk)

[Call for Participants] Workshops for children aged 10-17 on online privacy (Nottingham based)

One of my PhD students is reviewing the design of child-facing privacy policies online, and is currently recruiting 10-17 year olds to take part in workshops on the design of privacy policies and other tools for informing internet users about their privacy. Face-to-face workshops will predominantly take place in the Nottingham area, but there is potential for online workshops as well if interest and availability is high. If you are a teacher, parent, or guardian of 10-17 year olds, please consider encouraging them to take part. In particular, if you are a teacher in the Nottingham area who would be willing to host a workshop at your school, please let us know!

Ephraim says:

Testing for Transparency is a range of workshops that will be conducted by researchers at the University of Nottingham. The workshops are an opportunity for researchers to talk to young people about how the internet affects their lives. Recently, laws and standards have emerged around the concept of Age-Appropriate Design, such as the P2089 Standard for an Age Appropriate Digital Services Framework by the Institute of Electrical and Electronics Engineers, and the UK government’s Age Appropriate Design Code. The Age Appropriate Design Code is a set of guidelines created by the government to help online services built for young people serve and account for them better. The guidelines consider the different needs of young people at different ages. Testing for Transparency will explore designs for age-appropriate privacy policies and engage young people in a creative exercise focused on how to make them more readable, more effective, and better tools for communicating privacy information to young users.

The workshops will take place in person at the University of Nottingham and in several public locations and schools over the course of 2022. The sessions will be 2 hours long and will consist of (1) a short introduction and ice-breaking activity, (2) a presentation about the problems of privacy policies and why transparency about privacy matters to young people, (3) a privacy policy design segment where groups of young people will be invited to come up with their own privacy interfaces and (4) a presentation of each group’s policy designs.

The workshop will focus on what young people think about privacy policies that they have seen. What could internet service providers do to make them better? The design segment is an opportunity for young people to share solutions and discuss them with others their age. The designs created and ideas discussed will be shared with experts in privacy as well as professionals who create websites and apps targeted toward young people. There will also be an opportunity for participants who wish to stay involved to take part in a Working Group that will feed back to researchers as the research evolves.

For more info, email Ephraim Luwemba (firstname.lastnameATnottingham.ac.uk)

[Call for Participants] The impact of a mental health platform for young people on subjective wellbeing

One of my first year PhD students is in the process of finalising his proposal for the work he will carry out over the next three years. To do this he is running a pilot study on the “Impact of the Kooth Platform on Subjective Well-being”. We are therefore currently seeking young people between the ages of 16 and 25 who are willing to use a digital mental health and wellbeing support platform for a period of 6 weeks, and report back their experiences through short surveys.

If you are, or know anyone who is, in this age group and interested, please get in touch. We also ask that you have not used the ‘Kooth’ app before. Specifically, he is exploring whether there are changes in participants’ subjective wellbeing before and after use of the platform. For the duration and afterwards, you will have free access to the platform, where you will be able to explore your emotional wellbeing and mental health. You will be asked to use the platform 3 times a week for 6 weeks. Weekly surveys will ask you to rate your subjective wellbeing and which parts of the platform you engaged with. Nothing you post will be visible to researchers, but we will collect data about your usage patterns, for example how long you spend on each part of the platform and what order you visit them in.

It will not be necessary for you to discuss your medical or mental health history or that of others, and you are under no obligation to disclose any information you do not want to. The surveys are designed to take around 5 minutes and will take place online. You will receive a £25 shopping voucher for contributing to the study.

For more information, or to sign up, contact Gregor Milligan (firstname.lastname@nottingham.ac.uk)

[Call for participants] Study of an emotional wellbeing platform

One of my PhD students is recruiting participants for an online study on an emotional wellbeing platform. Please do consider taking part if you can:

💭 Would you like to help us and help yourself at the same time?

🗣 Would you like free access to a unique and innovative emotional health platform in exchange for giving us some feedback?

I am currently looking for participants to take part (virtually) in an exciting 6-week pilot study in partnership with the University of Nottingham. On behalf of My Internal World, I will be investigating ways to improve user experience on their platform.

For the duration, you will have free access to the platform where you will be given a chance to explore your emotional wellbeing through a personalised journey.

What will I have to do?

The study will involve taking brief emotional wellbeing assessments and telling us what you think about the interface.

To register your interest, please email Emma.Gentry@nottingham.ac.uk

*Please note, participants must be over the age of 18, currently employed, and for the purposes of this study, we are currently not recruiting individuals with a diagnosed mental health condition.

CALL FOR PARTICIPANTS: Testing for Transparency

One of my PhD students, Ephraim Luwemba, is looking to interview anyone who has been involved in the process of service design on children’s websites. If you currently work on digital services that are likely to be accessed by children or have worked on them in the past, he would appreciate your input. He is looking for professionals involved in all aspects of creating a digital service, including but not limited to QA and user testing specialists, UI and UX designers, copywriters, content creators, project managers, project architects, and contract lawyers who have been involved in the drafting of website policies and terms of service.

The interviews will take around 45 minutes, and will take place online unless you prefer face-to-face and this is feasible. You will be thanked with a £15 Amazon voucher, but if you prefer, we can also donate the same amount to a charity of your choice.

The interviews form an important part of Ephraim’s PhD, which is looking at how privacy policies are presented to young people online. His goal is to design a framework for creating new presentation methods for privacy policies and similar notices (e.g., cookie statements), informed by the thoughts and feedback of children. By also interviewing the people who design digital services, he hopes to ensure the project is relevant and that he has a realistic impression of the technical and institutional challenges faced when attempting to create transparent privacy policies.

If you are interested in taking part, please see the information here to contact Ephraim.

CALL FOR PARTICIPANTS: Workshops on technology use for health and wellbeing

The “TAS for Health” project is exploring attitudes towards the use of technology in health and wellbeing decision-making in the home, across users including patients, carers, and family members. We are particularly interested in how use relates to shared values such as trust, self-efficacy, and privacy.

We are currently recruiting for a series of workshops, which will explore how people currently use technology to support their own health and wellbeing, and that of others, and how they may do so in the future. We are looking for four groups of people:

  1. People who have made use of technology such as apps or smart devices to support their health and wellbeing during lockdown.
  2. People who have experience caring for or living with others with conditions such as multiple sclerosis, dementia, or stroke, with or without technological support.
  3. People who have multiple sclerosis.
  4. People who have had a stroke.

If you fit into any of these groups, we’d like to invite you to take part in an online workshop, in which we will discuss the use of technology in healthcare decision-making. It will not be necessary for you to discuss your medical history, or that of others, and you are under no obligation to disclose any information you do not want to. Workshops are designed to last around 2 hours and will take place online, with adjustments to be made depending on participant needs. You will receive a £20 shopping voucher for contributing to the study.

Workshops will take place in October and November 2021, with dates to be confirmed once we have an idea of participant availability.

For more information, or to sign up, contact Dr Liz Dowthwaite. If possible, please give an indication of your availability in your email.

DEADLINE EXTENDED: CALL FOR PAPERS: RRI and Trustworthy Autonomous Systems

We are inviting submissions for articles for publication in a Special Issue of the Journal of Responsible Technology: “Reflections of Responsible Research and Innovation for Trustworthy Autonomous Systems”

Scope

An essential concept for the development of socially beneficial trustworthy autonomous systems (TAS) is responsibility in the context of research and innovation. The framing of responsibility itself can be confusing – often mistaken for liability or research ethics – and challenging to put into practice. Responsible Research and Innovation (RRI) provides governance frameworks and tools for engaging in a process of care and responsiveness. However, what does it mean to be responsible and responsive within TAS? This special issue will focus on RRI in practice; for example, how have RRI action plans been deployed successfully in TAS studies? What are the main challenges and issues around engagement? What can we learn from challenge cases and best RRI practices? This is an opportunity to reflect on and discuss RRI’s limitations, potential, and opportunities within TAS studies.

For queries relating to scope, please contact Dr Liz Dowthwaite (that’s me!)

Articles

We are seeking two types of articles:

  1. Short reflective articles (2000-3000 words) that discuss experiences of applying RRI in practice, within a particular project or programme of research. We are looking for papers that critically reflect on the barriers and facilitators of ‘doing’ RRI, how action plans were successfully (or unsuccessfully) deployed, and the lessons learnt for future work and for defining RRI best practices. We are not looking for a hypothetical discussion of how RRI should be framed or considered without a real-world example.
  2. Longer original research articles (absolute maximum 10,000 words) that report the results of more formal research carried out into the application of RRI. Papers should include full Introduction, Methods, Results, and Discussion sections. We are looking for papers that add to our understanding of the important considerations for RRI and tackle the above questions in a more empirical manner. We are not looking for review papers (e.g. literature reviews) that simply scope out current thinking, but welcome those that then proceed to present new insights and theories.

Submission

Submission deadline: 20th January 2022

Final decisions: 20th April 2022

You are invited to submit your manuscript at any time before the submission deadline. The journal’s submission platform (Editorial Manager®) is now available for receiving submissions to this Special Issue. Please refer to the Guide for Authors to prepare your manuscript and select the article type of “VSI: RRI reflections for TAS” when submitting your manuscript online. Both the Guide for Authors and the submission portal can be found on the Call for papers web page.

Please also note that the Article Processing Charge (APC) is waived for this Special Issue.

Guest Editors

Dr. Elvira Perez Vallejos, The University of Nottingham, UK

Dr. Liz Dowthwaite, The University of Nottingham, UK

Dr. Pepita Barnard, The University of Nottingham, UK

Dr. Ben Coomber, The University of Nottingham, UK

BPS Cyberpsychology 2021

For the past couple of days I have been attending the British Psychological Society’s Cyberpsychology Section Virtual Conference. My colleagues and I are well represented, with four talks from projects that I worked on (two by me, and one each by my colleagues Elvira Perez Vallejos and Virginia Portillo) and two others from colleagues I have the great fortune to be working with on other projects (Mat Rawsthorne and Camilla Babbage).

Day One

On the first day, I presented our work from the ReEnTrust project, on attitudes towards online wellbeing and trust in younger and older adults. The video can be seen here. One of the major aims of the ReEnTrust project was to identify the most important issues that affect trust in users’ online service interactions, and how these interactions affect wellbeing. Our work package related especially to how attitudes and experiences of these issues differed across younger and older adults. We carried out a series of 3-hour workshops with two age groups: 4 workshops with 35 young people aged 16-25, and 5 workshops with 40 older adults aged 65 and over. As part of these workshops participants completed pre- and post-session questionnaires focusing on trust and wellbeing, as well as digital literacy. We measured two aspects of wellbeing: fulfilment of the basic psychological needs for autonomy, competence, and relatedness, to examine eudaimonia, or the experience of purpose in life; and subjective wellbeing, measuring experiences of positive and negative emotions online. High levels of need fulfilment and high levels of positive affect lead to a fulfilled life, or what may be termed ‘flourishing’. We also asked participants to rate statements related to their trust in the internet, and how important trust is when online, and measured what we called ‘digital confidence’ using a 6-item scale, which aimed to get users to rate their own online digital literacy.

Whilst both groups did experience considerable benefits of being online, and recognised the potential for both positive and negative effects of the online world, young people were more concerned about the wellbeing effects of being online. Older adults seemed to focus more on the positives, including increased ability to communicate with friends and family, and opportunities to take part in things that they could not do offline. However, some older adults did mention a concern for others who they perceive as using the internet more, especially younger adults and teenagers. Negative factors for young people often surrounded the type of content they saw, the potential for negative social comparison, and a lack of control over their information. Lower levels of autonomy among young people were also related to higher negative affect, but this was not the case for older adults.

Older adults more explicitly related negative experiences to a feeling of being overwhelmed and a lack of competence. Indeed, competence online and digital confidence were the major differences between the two groups, with older adults being more adversely affected. Young people had both higher digital confidence and competence fulfilment than older adults; higher levels of competence fulfilment in young people and older people’s digital confidence were related to lower levels of negative affect. Older adults were more bothered by their own (perceived) lack of understanding, and this related both to their trust in websites and their sense of wellbeing; they also placed more importance on trust than young people did.

Whilst there were interesting differences between the two age groups, there were also striking similarities in how young people and older adults consider and experience their wellbeing when online. Overall the results suggest that both young people and older adults experience moderate levels of wellbeing and need satisfaction. Both groups have similar levels of autonomy and relatedness satisfaction, and encounter similar levels of positive and negative experiences online, with the positive slightly outweighing the negative. Being online has the potential to satisfy basic psychological needs and contribute to human flourishing; however, both groups also talked about stress, anxiety and pressure, as well as the time-consuming nature of being online. Increased autonomy was related to higher levels of positive affect, lower negative affect, and increased relatedness, but the study revealed relatively low scores for autonomy in both groups. Many users speak of a lack of control when they are online, and this needs to be addressed.

In terms of trust, both groups had only moderate levels of trust in the online world. Although older adults place more importance on trust online, both groups felt quite strongly that websites have a responsibility to act in a trustworthy manner towards their users, and that websites do not do enough to ensure this. Both groups related trust back to familiarity, reputation, safety and security, and data issues, often referring to the design and content of the websites they use. Whilst these experiences of trust bore little relationship to their levels of basic needs or subjective wellbeing, their responses did resonate with concerns about autonomy, competence and relatedness. This suggests that basic psychological needs are a useful lens through which to understand the experiences of internet users, and to frame discussions of wellbeing. More work needs to be done to relate this directly to measuring online wellbeing and trust, to ensure that the future design of platforms enhances the human experience and allows people to flourish.

Some results from our UnBias project presented by Elvira

Also on the first day, my colleague Elvira discussed results from our UnBias project looking at the impact of algorithmic decision-making processes on young people’s well-being. Algorithms rule online environments and are essential for performing data processing, filtering, personalisation and other tasks. The algorithms that govern online platforms are often obfuscated by a lack of transparency in their online T&Cs and user agreements. This lack of transparency speaks to the need to protect the most vulnerable users from potential online harms. Little attention has been given to children and young people’s experiences of algorithmically-mediated online platforms, or their impact on mental health and well-being, despite one third of internet users being children below the age of 18. ‘Youth juries’ are youth-led interactive sessions that encourage participants to share and discuss their personal experiences and opinions of the online world in a safe space. We carried out a series of youth juries with a total of 260 children and young people (13-17 years old) to bring their opinions to the forefront, and to elicit discussion of their experiences of using online platforms and the perceived psychosocial impact on users. Perceived benefits included convenience, entertainment and personalised search results. Negative aspects included participants’ concerns for their privacy, safety and trust when online. We recommend that online platforms acknowledge and act on their responsibility to protect the privacy of their young users, recognising significant developmental milestones and the impact that technology may have on young users. We argue that governments need to incorporate policies that require technologists and others to embed the safeguarding of users’ well-being within the core of the design of Internet products and services, to improve the user experiences and psychological well-being of all, but especially those of children and young people.

Day Two

On Day Two I presented our ReEnTrust work on online trust amongst older adults. The video is here. Despite the increase in the number of adults over 65 years old using the internet, this group is often neglected in discussions of online trust. We therefore set out to explore the factors that affect the online trust of older adults. We drew on data from a total of 40 participants across five workshops with adults aged 65 years and over. Co-created scenarios based on everyday online tasks – online shopping and seeking information – were used to facilitate discussion about trust on the internet. For each scenario, participants were asked to identify points that they felt were related to trust, whether positive or negative. Specifically, they were asked: “What are the most important factors related to trust here? How do you feel about it? How do you respond when this happens? And what do you think websites should do about it?”

Reputation was often highlighted, linked to intertwining factors including recognition of the brand, being a well-established company, and being a platform that they had used previously. For some, brand reputation was also related to having a real-world connection, such as a familiar bricks-and-mortar store. A platform’s reputation appears to act as a protective factor for the user, offering security because of the platform’s need to nurture a good reputation.

Participants often reported difficulties in understanding how familiar behaviours in the real world might be translated online, for example in how search results are produced versus looking something up in the Yellow Pages. Participants also felt that platforms were being dishonest by obfuscating information relating to their business, such as hiding data collection behind cookie notifications, or simply not making clear where a company was based (having products appear weeks later from China was frustrating!). Activities such as profiling, tracking and surveillance also strongly impacted users’ trust online. Whilst some found profiling useful in terms of recommendations, they often had frustrations about inaccurate assumptions, repetitive advertising, and concerns about being placed in a ‘filter bubble’. Profiling of users led to a sense of losing privacy, with participants relating their experiences to being watched by ‘Big Brother’. They also had concerns about the extent to which they were being tracked online, and how others might find information about them, especially on social media, although some felt that they had little to hide and if they weren’t doing anything ‘dodgy’ it was not a problem.

Another common concern was that the internet is simply not safe to use. Many outlined various protective strategies, including looking for visual indicators of security, such as the padlock icon in web addresses. There were also a lot of concerns about a lack of control over what they were shown online, and in their choices of which websites and services to use. They often felt forced to do things like accept cookies, create user accounts, or accept permissions when downloading apps. They also felt a power imbalance due to the dominance of just a few companies, and often highlighted that they felt compelled to abandon their personal values for the convenience those companies offer. For example, Amazon was often referred to, including in terms of benefits such as being able to rely on established policies and procedures, but also in terms of concern about the platform’s dominance and especially what they saw as dubious ethical practices. This raises the question of whether this represents an abandonment of trust in favour of convenience.

We recommend that the concept of trust, or more importantly, trustworthiness, is incorporated into the design of products, technologies and services to build user confidence and increase the wellbeing of users. To do this, different user groups must be consulted and involved from the very start of the design stages. This is especially important for groups who are not traditionally seen as online users, and may become neglected. The over 65s are often such a group in online research and our research has shed light on their online experiences.

My colleague Virginia also presented work from UnBias: “Transparency in the age of Big Data: What do children want to know?” In the UK 99% of 12-15 year olds are online. Despite this, children’s voices have often been overlooked when making recommendations about the lack of transparency in data management (collection and usage) by online platforms. We explored children’s experiences of interacting with online services that shape their lives, in particular recommender systems (Google, YouTube, Netflix, etc.), and their ideas for a fairer and more transparent online environment, through the youth juries described above. Recommendations predominantly revolved around how platforms use the information they collect from users, in particular a desire to be informed about what is collected from them, who is using it, and why. Participants also highlighted the benefits and barriers the Internet brings to their lives and the importance of education in allowing users to understand how the online world works. Children also wanted more choice and control over how their data is used. Meaningful transparency and education are required to allow people to reflect, question and develop their own ideas on key issues related to Internet technologies, along with regulation to ensure transparency is both meaningful and maintained.

Some more results from our UnBias project presented by Virginia

Mat presented “Prototyping an unobtrusive measure of online psychological flexibility in a moderated mental health peer support forum”. His abstract:
Using data from the REBOOT study (RCT of the Effectiveness of Big White Wall Compared to Other Online Support), the work explores the potential of analysing language used by contributors in internet support groups to gauge their ability to respond to new circumstances and possibly predict outcomes.
Design/Background: Current methods for analysing online conversations are labour intensive, and automated linguistic inquiry methods utilising key word counts and collocations do not scale to provide the full context (and therefore accurate meaning) of concordances. However, computer science advances in these areas are often not informed by psychology. Self-report measures for digital mental health are prone to bias, so unobtrusive techniques may enable triangulation.
Methods: Applying Natural Language Processing of items from relevant clinical questionnaires to bootstrap the training of an algorithm to classify statements by how people relate to themselves and others (informed by the Relational Frame Theory account of empathy and perspective-taking, and mechanisms of social comparison). Create a collaborative machine learning model to incorporate human expertise to refine its ability to label different types of post, and test their relationship with outcomes. Combining clinical knowledge and service user lived experience of anxiety and depression to assess and improve the face validity and transparency of the categorisation decisions.
Analysis: Classifier accuracy (area under the curve, confusion matrix) and comparison with non-posting participants in both active and control arms where outcome data exist.
Conclusions/Expected Implications: Initial assessment of whether non-professional conversational processes can be linked to wellbeing, and therefore whether there are types of interaction moderators should monitor and encourage.

Can we capture the context of what people talk about in peer support forums using NLP?
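To make the evaluation step in Mat’s abstract a little more concrete, here is a minimal, purely illustrative sketch of a bag-of-words text classifier scored with AUC and a confusion matrix. The toy posts, labels, and scikit-learn pipeline are my own assumptions for illustration only; this is not Mat’s actual method, nor the REBOOT study’s data.

```python
# Illustrative sketch only: classify toy "forum posts" and report AUC + confusion matrix.
# All data and labels are invented; this is not the REBOOT pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score

# Toy training posts, labelled 1 where the writer relates to self/others, 0 otherwise.
train_texts = [
    "I keep comparing myself to everyone else on here",
    "thanks for sharing, that sounds really hard for you",
    "I wonder how my family feel when I cancel plans",
    "went to the shops today and bought some bread",
    "the weather was nice so I sat in the garden",
    "watched a film last night, it was fine",
]
train_labels = [1, 1, 1, 0, 0, 0]

# Held-out posts to evaluate on.
test_texts = [
    "reading your post made me think about my own situation",
    "my bus was late again this morning",
]
test_labels = [1, 0]

vectoriser = TfidfVectorizer()
classifier = LogisticRegression()
classifier.fit(vectoriser.fit_transform(train_texts), train_labels)

probabilities = classifier.predict_proba(vectoriser.transform(test_texts))[:, 1]
predictions = (probabilities >= 0.5).astype(int)

print("AUC:", roc_auc_score(test_labels, probabilities))  # area under the ROC curve
print(confusion_matrix(test_labels, predictions))         # rows: true class, columns: predicted
```

In practice, of course, the labels would be bootstrapped from clinical questionnaire items and refined with human expertise, as the abstract describes, rather than hand-written as above.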

Camilla presented “Developing an app to improve wellbeing for young people with Tourette Syndrome: Interviews with young people and professionals”. Her abstract:
Many young people with neurodevelopmental disorders show reduced quality of life and also experience co-occurring emotional and behavioural difficulties. Young people with Tourette Syndrome (TS), which causes involuntary tics, report emotional dysfunction to be more impairing than their tics. Digital self-help interventions targeting mood management are effective for young people, recommended in guidelines, and could combat resource deficiencies. Currently no such intervention exists, therefore the aim of this research is to explore what young people and professionals would desire and consider useful in a wellbeing app for young people with TS.
Design/Background: The study design included semi-structured interviews analysed using thematic analysis. Methods: 15 young people aged 9-17 with TS and Tic Disorders were interviewed via video-call or face-to-face. 16 professionals with an average of 9 years’ TS work experience were interviewed face-to-face, by video-call or by phone call.
Results: Themes from both samples centred on desired features of the app. For young people this included psychoeducational and reminder functions, and calming elements like music and games. Professional themes highlighted a need for features that would facilitate the use of tic and mood-management strategies, help young people to plan ahead, and provide family psychoeducation about TS.
Conclusions: Professionals and young people showed overlaps and differences in themes relevant to features desired in the app. In order to develop wellbeing apps that are both engaging and effective for young people with neurodevelopmental disorders, it is important to include both perspectives.

How to design digital interventions for young people with Tourette Syndrome and tic disorder

Great work everyone!

TAS/RUSI Conference: Trusting Machines? Cross-Sector lessons from Healthcare and Security

Over the past 3 days I have been attending the joint TAS/RUSI conference on ‘Trusting Machines? Cross-Sector lessons from Healthcare and Security’. There have been really interesting panels on the use of ‘artificial intelligence’/autonomous systems/machine learning in varied contexts across the security and healthcare spaces, identifying synergies in the issues that emerge in both contexts.

Today I had the opportunity to present our current work in the project ‘Trustworthy autonomous systems to support healthcare experiences (TAS for Health)‘, which I co-lead with my colleague Nils Jaeger. Or, as we alternatively call it, ‘Reflect’. The project looks at the use of smart mirrors (get it, ‘Reflect’) in the home to support treatment, detection, and prevention of ill health. This includes day-to-day wellbeing and general health as well as producing case studies of three distinct user groups: people with multiple sclerosis, people with dementia, and people who have had a stroke.

The aims of this 12-month project are to improve understanding of the role of autonomous systems in experiences and provision of health monitoring at home, and to examine how autonomous systems might support shared decision-making amongst service users and support networks. This includes investigation of the roles of values, trust, and wellbeing in the design and use of these systems, as well as identifying tensions and synergies between different stakeholders, including service users, carers, family members, and clinicians. The research will inform the design of future applications of trustworthy autonomous systems in healthcare and wellbeing, and help develop the foundational concepts of autonomy and trust that are central to trustworthy autonomous systems.

Smart mirrors are two-way displays which show information collected from other apps, such as the time and date, the weather, your calendar, or daily activity. They also use ambient monitoring to collect data about appearance and behaviour, and relate this to the data collected by other apps to assess everyday health. For example, they may provide reminders and monitoring of everyday personal care to determine when behaviour changes, to give advice, and if necessary to alert a medical professional or family member. They may also be used for fatigue management using appearance data, which prompts the mirror to check on how the user is feeling and offer advice.

Would you want your mirror to talk back to you?
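To picture how such a mirror might combine ambient observations with data from other apps, here is a small, hypothetical sketch of a routine-deviation check that decides whether to do nothing, nudge the user, or suggest alerting a nominated contact. The data fields, thresholds, and decision rules are all invented for illustration; they are not the Reflect project’s design.

```python
# Hypothetical sketch of a smart-mirror routine check; not the Reflect project's design.
from dataclasses import dataclass

@dataclass
class DailyObservation:
    completed_personal_care: bool   # e.g. inferred from ambient monitoring
    minutes_active: int             # e.g. pulled from an activity-tracking app
    looked_fatigued: bool           # e.g. inferred from appearance data

def decide_response(history: list[DailyObservation]) -> str:
    """Return 'none', 'nudge', or 'alert' based on recent behaviour changes."""
    recent = history[-3:]  # look at the last three days
    missed_care = sum(not day.completed_personal_care for day in recent)
    low_activity = sum(day.minutes_active < 20 for day in recent)
    fatigued = sum(day.looked_fatigued for day in recent)

    if missed_care == len(recent) and low_activity == len(recent):
        return "alert"   # sustained change: suggest contacting a family member or clinician
    if missed_care > 0 or fatigued >= 2:
        return "nudge"   # gentle on-mirror reminder, or a check-in on how the user feels
    return "none"

# Example: two quiet, fatigued days after a normal one triggers a nudge.
week = [
    DailyObservation(True, 45, False),
    DailyObservation(False, 10, True),
    DailyObservation(False, 15, True),
]
print(decide_response(week))  # -> "nudge"
```

In the real system, of course, questions of who sets these thresholds, who is alerted, and how visible the monitoring is to others in the home are exactly the trust issues the project is investigating.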

The TAS for health project has three main work packages. The first explores how autonomous systems can support treatment, detection, and prevention of ill health through a smart mirror in the home. The second investigates specifically how this technology might be used to support potentially vulnerable users. The third examines tensions and synergies between perceptions of different system users, particularly in terms of values in decision making and attitudes towards the use of autonomous systems.

Integral to these studies is understanding different aspects of trust and how this impacts acceptance and confidence in decision-making, continued use of such systems, and supporting interactions with and between users. This may include trust that the system makes a good recommendation or that the system does not compromise users, for example, by giving away unnecessary personal information or undermining professional recommendations. Trust may also be affected by the technology’s visibility to others in the home. Visibility has the potential to draw attention to health conditions, which may cause feelings of stigma. Equally, use of these systems may enhance trust in oneself, by providing evidence to back up their lived experience.

Another project I am working on – understanding attitudes towards digital contact tracing – was also represented in the session. It was a fun session, and the panel discussion afterwards gave me plenty to think about regarding our methods for engaging participants in thinking about potential uses for smart mirrors – including science fiction as a starting point, co-creating design fictions with Lego, and plenty more.