From fertility and menstrual cycle monitoring apps to prescription drug and other healthcare apps, consumer privacy violations are being discovered at an increasing rate, and they are drawing growing attention from the Federal Trade Commission (FTC). With recently proposed financial penalties and more robust enforcement of consumer and media notification requirements, app owners are no doubt taking heed.

Just last month, the FTC proposed an eye-opening $100,000 civil fine and other enforcement actions against the owner of an ovulation tracking app with more than one million downloads on the Google Play Store. According to the FTC, the app, known as Premom, deceived users by sharing their personally identifiable information (PII) and personal health data with third parties without prior consent.

It’s alleged the company failed to comply with the FTC’s Health Breach Notification Rule (HBNR). The rule includes mandates (mostly dormant until recently) that app owners notify consumers and, in some cases, the media as well as the FTC itself regarding data breaches involving their company or their service providers. 

The latest enforcement action comes as the FTC seeks to give the HBNR sharper teeth, announcing a unanimous vote to update the rule’s language surrounding breaches and user consent.

The vote came amid concern about unauthorized disclosures of personal healthcare data via SDKs: hidden pieces of third-party code in the software supply chain used to build mobile apps, which can covertly share data with unauthorized organizations such as advertisers based both inside and outside the United States.

A similar story appeared in Bleeping Computer a few months ago concerning a mental health therapy app called BetterHelp. The FTC settlement and fine were a whopping $7.8M!

Going beyond HIPAA in some cases

The FTC appears to be signaling its intention to use its enforcement powers to crack down far more consistently, and with more far-reaching definitions of what constitutes a healthcare data breach. As SC Media reported alongside the news of the abovementioned HBNR language update, the FTC aims to take a broader view of which data can actually reveal a consumer’s health information.

For example, if a consumer simply provides their name and contact information to a healthcare app provider that is later breached, advertisers or threat actors can learn that the person sought out or received a specific type of treatment.

With this significantly broader definition of relevant data, healthcare app developers face a scenario where even a seemingly minor breach can trigger notifications to thousands or millions of users, not to mention the government and members of the press. SC Media noted that the FTC is seeking revised definitions covering the rule’s application to healthcare apps not addressed by HIPAA, as well as the definition of PHR-identifiable health information. Alongside that language update, the agency also hopes to clarify what counts as a breach.

Apple's health-shaming ad enters the fray

Taking heed of the public’s unease, Apple recently released a new commercial called “The Waiting Room,” a concerning yet humorous take on data privacy breaches in a healthcare setting, where an audible voice-over reveals each patient’s private health issue, much to the consternation of the room full of people. The ad was designed to highlight Apple’s commitment to data privacy on the iPhone. The campaign is a great reminder that the mobile industry as a whole needs to do a better job of proactively safeguarding consumer data.


A wake-up call for mobile app developers

The recent FTC actions seek to help well-intentioned app developers prevent unfair trade practices. Governmental actions are designed to do just that: increase safety and ensure privacy in the market. But when an app developer turns out to be a criminal organization in the first place, a whole host of additional and even more dramatic concerns arise. 

Tighter consumer data privacy regulations mean mobile app developers will need to secure their apps and APIs to more tightly control the usage of consumer data, or risk compliance investigations, civil penalties, corporate reputation damage, and more. They will also need to better scrutinize their business arrangements with third-party services to ensure that data sharing policies and practices are transparent and adhered to, and that PII is not shared without consent. Finally, developers will need to dive deeper into their app supply chain code to root out lurking dangers such as back doors that illicitly communicate with unauthorized servers, whether operated by legitimate suppliers or by hackers, as developers may be on the hook for such privacy violations.
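The consent and data-sharing controls described above can be sketched in code. The example below is a minimal, hypothetical illustration (the names `ConsentLedger` and `forward_event`, and the PII field list, are assumptions for this sketch, not any real SDK's API): every outbound analytics event is gated on recorded user consent, and PII fields are stripped even when sharing is allowed.

```python
from dataclasses import dataclass, field

# Hypothetical field names an app might treat as PII; real apps would
# derive this list from their privacy policy and applicable regulations.
PII_FIELDS = {"name", "email", "phone", "precise_location"}

@dataclass
class ConsentLedger:
    """Illustrative record of which purposes each user has opted into."""
    grants: dict = field(default_factory=dict)  # user_id -> set of purposes

    def grant(self, user_id, purpose):
        self.grants.setdefault(user_id, set()).add(purpose)

    def allows(self, user_id, purpose):
        return purpose in self.grants.get(user_id, set())

def forward_event(ledger, user_id, event, purpose="analytics"):
    """Return a payload safe to hand to a third-party SDK, or None if blocked."""
    if not ledger.allows(user_id, purpose):
        return None  # no consent on record: share nothing at all
    # Redact PII even when sharing is permitted; pass only non-identifying keys.
    return {k: v for k, v in event.items() if k not in PII_FIELDS}

ledger = ConsentLedger()
event = {"screen": "cycle_log", "email": "a@b.com", "name": "Jane"}

blocked = forward_event(ledger, "user-1", event)  # no consent yet -> None
ledger.grant("user-1", "analytics")
shared = forward_event(ledger, "user-1", event)   # PII keys removed
```

The key design point is that the consent check and the redaction both happen on the app's side of the boundary, before any third-party SDK sees the event, rather than trusting the SDK's own configuration.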

The FTC’s recent enforcement actions, Apple’s advertising campaign, and app security recommendations from companies like Verimatrix all send a clear message to the healthcare app development community: Respecting patient privacy is crucial; take responsibility for the data entrusted to you, or face potential repercussions.