I recently had a fellowship proposal turned down, which was obviously disappointing. Nevertheless, the exercise of applying was very much worthwhile, if only to get a handle on where I want my research to go (and I’d like to thank everyone involved for helping me sharpen up my thoughts).
The proposal required a description on a single side of A4. I thought I’d share that description here because a) the need isn’t going to change any time soon, and these are issues I think more people should be aware of, and b) I’ll be applying to other funders for this over the next few years, and any feedback would be appreciated. (More to the point, forcing all of the surrounding factors onto one side of A4 meant that each sentence below could probably justify a blog post on its own, so feel free to make requests…)
Designing with, not designing for: communicating privacy and informational control for users with complex communication needs
This fellowship will support research exploring ethical and social perspectives on the mind, mental well-being, and mental health, in the context of language development and maintenance (Challenge 2). It focuses on users with a range of disabilities who rely on augmentative and alternative communication (AAC). This work will develop an understanding of how different AAC communities view informational privacy in their language development and maintenance. That understanding will be used to develop technologies and processes that enable users to negotiate external access, control information flow, and express underlying concerns about the personal information that communication equipment processes. AAC devices enable users to construct utterances, many of which describe themselves or aspects of their lives, including their interactions with others, and as such can be considered ‘personal data’. The growing sophistication of commercial AAC allows the inclusion of sensor data to guide communication of an individual’s daily narrative. AAC users have historically been designed for rather than designed with: as a result, AAC privacy technologies reflect the wishes of carers and society, not of users. This has eroded trust in the technology; yet users are in a position where their only choice is to engage with technology that has not engaged with them.
There are an estimated 19,710 adults in the UK using AAC (Gross 2010), and NHS spending on AAC devices over the past five years has exceeded £14 million, with similar amounts costed for education and social services.
This raises a wide range of implicit issues related to privacy, including anonymity, autonomy, personalisation of services, and identity management. The work proposed here will a) develop technologies and processes that enable AAC users to specify who they share their data with; b) develop technologies and processes to modify and selectively produce information about their daily activities; and c) develop support materials for carers, clinicians, therapists, and educational professionals to engage in conversations with AAC users on privacy-related topics. Deployment of the outputs will take place in partnership with AAC manufacturers, family networks and the above key stakeholders, as well as users. Post-deployment work will allow recognition of users’ attitudes towards privacy, and its effect on mental well-being, as users will now have a means of expressing these attitudes. This recognition will have measurable impacts on the control users have over their personal information, and hence on their quality of life. At the governance level, the outputs of this project will allow user representatives to work with commercial manufacturers and service providers to improve key aspects of communication and control.
Work Package and Impacts
The proposed work is structured in three phases:
Phase 1: The first six months of the project will work with AAC users to assess AAC devices and methodologies with regard to personal-data disclosure choices. This project focuses on adult AAC users with a relatively high level of cognitive function, and in particular those who have experienced privacy issues on social media.
Phase 2: Months 6–26 will comprise the development and testing of privacy technologies for end users (in the sense of exploring a means for users to express privacy and consent choices and interactions). The primary method will be to explicitly explore with users the tensions between privacy and capability that exist for this user group (at the higher level, this would involve questions like ‘Would you let someone fit a GPS to your wheelchair and track you if it meant you could talk about the places you’ve been?’ or ‘Would you accept a much better ability to talk about the previous day if it meant you could never lie about it?’).
Phase 3: Months 26–34 will focus on validation and dissemination, particularly towards user communities and commercial interests. Dissemination will include the production of guidance documents for the AAC community clarifying legal issues such as rights clearance, data protection legislation, and compliance scheduling.