From Digital Care to Privacy Anxiety: A Communication Perspective on Human–Computer Interaction Ethics in the Xiaohongshu App
Authors:
Shikun Zhang
Keywords:
Women’s health apps; algorithmic governance; health anxiety; privacy concerns; data justice.
Doi:
10.70114/aimedr.2026.1.1.P1
Abstract
In today’s platform-driven, data-rich digital environment, women’s health apps have become central to everyday health communication and bodily tracking. Using Xiaohongshu as a case, this paper examines a key paradox: a tool promoted as “digital care” — providing scientific information, personalized services, and community support to reduce uncertainty and anxiety — can, through its specific HCI design and algorithmic operations, instead intensify women’s health anxiety and privacy concerns. In doing so, it reshapes information asymmetries, users’ control over data, and trust in platform–user relationships. Building on health communication, emotion communication, and platform studies, and engaging debates on algorithmic governance and data justice, the paper develops two concepts: “algorithmic perception” and “privacy perception.” It asks how women, in actual use, interpret interface cues such as data forms, notifications, consent requests, and tracking reminders, and how they imagine algorithmic influence on visibility, agenda priority, risk framing, emotional experience, and data security. Under conditions of limited algorithmic transparency and weak data control, the study examines how users respond by adjusting routines, avoiding features, or accommodating the app’s demands. Methodologically, the paper relies on semi-structured interviews with long-term users (three or more years of use). These interviews trace participants’ information pathways and probe their understandings of algorithms and data collection, perceived risks and fairness, and their emotional and trust-related experiences around both health and privacy.
The analysis identifies three algorithmic mechanisms that jointly fuel anxiety and privacy risks: (1) personalization that creates “filter bubbles” of negative health content; (2) data-driven “ideal body” and “ideal fertility” norms that encourage comparison and self-discipline; and (3) commercial targeting coupled with medicalized narratives that turn ordinary bodily experiences into monetizable “potential risks.” This study situates women’s health apps within wider debates on digital culture, HCI ethics, and data justice.