By Matt Fisher, Healthcare Attorney
LinkedIn: Matthew Fisher
X: @matt_r_fisher
Host of Healthcare de Jure – #HCdeJure
The privacy of female health data, especially data connected to reproductive health, has been a central topic of conversation ever since the U.S. Supreme Court's decision in Dobbs. As the discussion, and attempts to access such data, evolve, it is essential to understand how non-traditional healthcare entities approach that data. The importance is heightened by the potentially questionable privacy practices of newer entrants into the healthcare ecosystem. Will discourse change the landscape voluntarily, or will outside action be required?
Research into Female mHealth Apps
mHealth applications focused on female or women's health are drawing a fair amount of attention when it comes to privacy practices. The applications, broadly referred to as female technology (FemTech), are designed to help individuals personally track different aspects of their health, often with a heavy focus on reproductive health in a variety of ways. Consistently, FemTech applications collect user data through manual or automatic means and may also connect to other accounts or applications maintained by the user.
How does FemTech approach privacy, though? To set the field to a degree, a large portion of FemTech falls outside the traditional healthcare industry. What does that mean? It means that HIPAA likely does not apply, because FemTech more often follows a direct-to-consumer approach without engaging the rest of the healthcare industry. The direct-to-consumer route usually means no insurance coverage or other billing, as the application is either "free" or carries a subscription cost. If HIPAA does not apply, then the baseline privacy and security protections that come with HIPAA are absent.
Absent a regulatorily mandated baseline, the actual approach to privacy in FemTech (and more broadly) is not what might be expected and almost certainly not what is desired. New research focused on some of these questions is fairly informative. Cutting to the chase: the research found a lot of problems in the approach to privacy among the FemTech applications that were reviewed.
Privacy Approach Findings
The reviewed FemTech apps varied somewhat in their approach to the privacy policy, but some themes emerged. The policies tended to be generic rather than specific and would be updated without requiring users to reaccept them. Neither finding is surprising, since both practices occur often across industries.
The descriptions of data collection are interesting. Not only did the apps gather information by accessing device data and importing data from other sources, but a number would automatically import data from Google, Facebook, or another account used for sign-in in place of a unique account. The potential for data to be pulled in from other sources means that an unexpectedly broad data set is being assembled in a new place.
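To make the mechanics concrete, here is a minimal, hypothetical sketch (in Python, not drawn from any reviewed app) of what a "Sign in with Google" integration can copy into an app's own records. The userinfo endpoint and scopes referenced are Google's real OAuth 2.0 ones; the token handling and profile merging around them are illustrative assumptions.

    import requests

    # Real Google OAuth 2.0 endpoint; what an app receives here is governed
    # by the scopes the user granted at sign-in (e.g. "openid", "email",
    # "profile").
    GOOGLE_USERINFO = "https://www.googleapis.com/oauth2/v3/userinfo"

    def import_account_data(access_token: str) -> dict:
        """Hypothetical import step: copy profile data into the app's own store."""
        resp = requests.get(
            GOOGLE_USERINFO,
            headers={"Authorization": f"Bearer {access_token}"},
            timeout=10,
        )
        resp.raise_for_status()
        profile = resp.json()  # may include name, email, picture, locale
        # Once copied, this identity data lives in the app's database,
        # outside the controls of the original Google account.
        return {
            "external_id": profile.get("sub"),
            "name": profile.get("name"),
            "email": profile.get("email"),
        }

The point is not that such an import is improper, but that a single sign-in tap can seed the new data set with identity details the user may not realize were copied.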
The next step is how the collected data can be used. Delivering the services and personalizing them to the individual are not surprising uses. Nor is the ability to use data for research and further development. Additionally, all of the reviewed FemTech apps shared data with third-party advertisers. Again, it should not be shocking that data are used for those purposes when use of the app itself is free, but it does show that information is sent to a lot of different parties under terms that may be buried in a privacy policy.
One area where descriptions and practices became vague was data deletion. The policies confused terms in a manner that introduced potential inconsistency. In particular, anonymization was mixed into deletion, which suggests that information would not actually be deleted or could be put to more expansive use.
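The difference matters in practice. As a minimal sketch, assuming a hypothetical two-table schema (identifiers in a users table, sensitive entries in a cycle_logs table), deletion and "anonymization" leave very different things behind:

    import sqlite3

    def delete_user(db: sqlite3.Connection, user_id: int) -> None:
        """True deletion: both the identifiers and the health entries are removed."""
        db.execute("DELETE FROM cycle_logs WHERE user_id = ?", (user_id,))
        db.execute("DELETE FROM users WHERE id = ?", (user_id,))
        db.commit()

    def anonymize_user(db: sqlite3.Connection, user_id: int) -> None:
        """'Anonymization' as many policies seem to use it: strip the obvious
        identifiers but keep the sensitive entries for continued use."""
        db.execute(
            "UPDATE users SET name = NULL, email = NULL WHERE id = ?",
            (user_id,),
        )
        db.commit()
        # The cycle_logs rows survive, still keyed to the same internal
        # user_id, and could potentially be re-linked to the individual.

A policy that uses the two words interchangeably is effectively reserving the second option while sounding like the first.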
Leaving aside the terms of the privacy policies themselves, the user interfaces also showed a lot of variety. The research found that only 7 of the 20 reviewed FemTech apps contained granular privacy controls within the app, and even where those controls were present, usability was not necessarily high. In the absence of tools within the app, a user might be able to contact the developer to exercise controls. The overall takeaway, however, is that enhancing privacy takes effort, and barriers are put in place that make it harder.
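For contrast, "granular" controls would mean a separate, user-adjustable toggle for each use of the data rather than a single all-or-nothing consent. A hypothetical sketch (the field names are illustrative, not taken from any reviewed app):

    from dataclasses import dataclass, asdict

    @dataclass
    class PrivacySettings:
        personalize_insights: bool = True       # core service delivery
        use_for_research: bool = False          # secondary use, off by default
        share_with_advertisers: bool = False    # third-party sharing, off by default
        import_from_linked_accounts: bool = False

    def consent_receipt(settings: PrivacySettings) -> dict:
        """Flatten the toggles into a record suitable for an audit log."""
        return asdict(settings)

The research suggests most of the reviewed apps offered nothing close to this, and the few that did made the controls hard to use.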
In terms of user entry of data, the apps sought a fairly extensive range of information. Given the intended use of the various apps, it was expected that a large swath of information would be requested: if a user wants in-depth analysis or insights into their health, it follows that the underlying information has to be entered.
Research Conclusions
The researchers commented that the findings raise a lot of concern about how and where data can be used. Even where regulatory frameworks beyond HIPAA existed, the terms of the privacy policy still contained permissions to share information widely. Further, the disconnect between what the privacy policy allowed and the controls available in each app, combined with the confusion of terms, could be expected to leave users uncertain as to what would actually happen with their data and potentially unable to seek restrictions.
Practical Impact and Takeaways
One of the biggest takeaways from the research is the need to fully understand what regulatory ecosystem a FemTech (or other) app operates under, along with a careful review of user policies to see how those regulations may be made inapplicable or protections waived. This is especially true when use of an app is free because, as has been said often before, "free" use means something is being monetized. In this day and age, that something is most often data, because data can drive so many different uses and insights. The free rein to reap benefit from data could arguably be more valuable than a small user subscription fee, which could explain why the means of limiting use are made so opaque.
It is also necessary to consider the interaction of data collected through FemTech apps with evolving social and legal matters across the country. As states take diverging approaches to regulating and criminalizing reproductive health and women's health, creating repositories of sensitive data that do not benefit from protections or restrictions could carry a lot of unintended consequences. For example, changes to the HIPAA Privacy Rule taking effect in mid-2024 will change the process for obtaining access to reproductive health care records in some instances, such as criminal enforcement. However, the new processes and procedures only apply under HIPAA. If, as is usually the case, a FemTech app is not subject to HIPAA, then the new limitations on access will not apply. If limitations don't exist, would an app developer try resisting a request on potentially shaky footing, or would it just freely turn over the data? Either scenario leaves the user in a tricky situation that carries a lot of personal risk.
Building on a discussion that has been raised before, the findings, together with business models understood to fall outside of HIPAA, suggest the need for a different approach to privacy. When sensitive information is regulated in one place but not in another just because of happenstance, a distinction that can easily be lost on users, there is a lot of opportunity for alleged misuse of information. Even absent misuse, there will be a lack of aligned vision that breeds distrust and raises the opportunity for negative repercussions.
Where to Go?
Where do all of these factors leave FemTech, healthcare, technology, and more? In a place where change is needed. Just saying that new privacy rules or regulations are needed is not enough. The discussion has to move beyond words to actions that can be enforced and required to be followed. There should also be an honest reckoning with how data are viewed, both by individuals and businesses. There is value in the data, which must be recognized and appropriately accounted for. That means there is a lot of work to come, and hopefully all are up to shouldering the burden.
This article was originally published on The Pulse blog and is republished here with permission.