Mobile health apps, or mHealth apps, are becoming increasingly popular. Prominent manufacturers of wearables, fitness trackers, and smartwatches such as Fitbit or Garmin offer apps that record sleeping patterns, steps taken in a day, or their users’ heart rates. Long gone are the times when people did not track their training and exercises, listen to meditation coaches on their phones, or log their eating habits to lose weight. It seems that an entire culture has developed around being well and living a healthy lifestyle: so-called healthism.
Furthermore, health services that were formerly available only in non-digital form, such as doctor’s appointments or diagnoses, are increasingly offered as telemedical services that aim either to substitute for the old service or to make it more cost-efficient and widely accessible.
Many of these products have been criticized for invading their users’ privacy by collecting sensitive data and using that data to create manipulative choice architectures that nudge users into purchasing ever more premium services and products. While such business models are paradigmatic of what Shoshana Zuboff (2019) calls surveillance capitalism, I think that the ethical problems this technology raises go beyond concerns for people’s privacy and entail much broader societal challenges.
In recent years, the European Commission as well as the WHO have promoted mHealth apps as a way to improve the health of our societies while simultaneously providing opportunities for economic growth. Health insurance companies have adopted mHealth apps to track their customers’ fitness levels and provide them with benefits if they stay healthy. Making use of the health culture may, at first glance, seem like a win-win situation: citizens live a healthy lifestyle and avoid getting sick, thereby reducing costs for the health sector and becoming more empowered, while software companies benefit financially by selling the respective digital products and make the collected data available to research institutes and public health organizations, thereby creating new jobs and prosperity.
To some, however, this development leads further down the slippery slope of clustering disadvantages. What if you are in a position in which you cannot take care of yourself by going for long runs regularly? What if you do not want to use an mHealth app because privacy is important to you? What if you do not want to trade your general practitioner, who has known you for the better part of your life, for an app that connects you with a medical call center?
The fear of having to trade off values like privacy or the relationship with another human being, as well as the fear of not being able to live up to certain health standards, is particularly worrisome for those who are already disadvantaged when it comes to health care. Especially during the COVID-19 pandemic, vulnerable groups like elderly people, people with disabilities, or pregnant women often relied on mHealth apps because they could not risk infection by visiting a doctor. Furthermore, people with disabilities or chronic illnesses use mHealth apps to achieve a certain level of independence and safety. It seems that vulnerability forces people to give up certain values – like privacy or attachment – if they must use mHealth to even out their disadvantages. This, however, entrenches social injustices even further.
Even though such trade-offs affect all users, people who must rely on mHealth apps to live a decent life should always have the choice of an alternative that does not require them to give up on other fundamental values for the sake of their health. Only then can we ensure that they are not missing out on privacy protection, contact with another human being, or the freedom to define their own standard for a healthy life.