Voice assistants have become embedded in people's private spaces and domestic lives, where they gather enormous amounts of personal information; as a result, they evoke serious privacy concerns. This paper reports the findings of a mixed-method study with 65 digital natives, examining their attitudes to privacy as well as their actual and intended behaviour in privacy-sensitive situations and contexts. It also presents their recommendations to governments and organisations with regard to protecting their data. The results show that the majority are concerned about privacy but are willing to disclose personal data if the benefits outweigh the risks. The prevailing attitude is characterised by uncertainty about what happens to their data, powerlessness over controlling its use, mistrust of big tech companies, and uneasiness about the lack of transparency. Few take steps to self-manage their privacy; instead, they rely on the government to take measures at the political and regulatory level. The respondents, however, show scant awareness of existing or planned legislation such as the GDPR and the Digital Services Act, respectively. A few participants are keen to defend the analogue world and limit digitalisation in general, which in their opinion only opens the gate to surveillance and misuse.
We present a conversational social robot behaviour design that draws on psychotherapy research to support individual self-reflection and wellbeing, without requiring the robot to parse or otherwise understand what the user is saying. This simplicity-focused approach enabled us to combine our wellbeing aims with privacy and robustness. We implemented a fully autonomous and standalone (not network-enabled) prototype and conducted a proof-of-concept study as an initial step to test the feasibility of our behaviour design: whether people would successfully engage with our simple behaviour and could interact meaningfully with it. We deployed our robot unsupervised for 48 hours in the homes of 14 participants. All participants engaged in self-reflection with the robot without reporting any interaction challenges or technical issues. This supports the feasibility of our specific behaviour design, as well as the general viability of our non-parsing simplicity approach to conversation, which we believe to be an exciting avenue for further exploration. Our results thus pave the way for further research into how conversational behaviour designs like ours may support people living with loneliness.