The Need for Proper Health Information Privacy Protection: A Comparative View




 Jan 30

        While privacy breaches involving general personal information have frequently made the news, with companies investigated for exploiting their users' online behavioral data or data breaches exposing financial information, breaches of personal health information (PHI) went relatively unnoticed until recently. However, because PHI is highly sensitive, breaches of PHI privacy are more serious, and any gaps in regulation should be closed with alacrity.

        The current regulatory gap concerns the PHI collected by countless health apps, devices, and websites. In the US, PHI collected by such devices is not covered by the federal Health Insurance Portability and Accountability Act (HIPAA), which governs the privacy and security of PHI for physicians, hospitals, and the broader health care industry. These "developers, vendors, and service providers for personal health devices" are NOT covered under HIPAA and are merely required to comply with the Breach Notification Rule under Section 5 of the Federal Trade Commission Act. This lack of regulation has allowed significant trafficking of PHI to occur "in the shadows," where regulation is a grey area and PHI is supposedly de-identified but with little oversight. For example, a class-action lawsuit in Louisiana alleges that health system websites shared PHI, including medical conditions and prescriptions, with major companies such as Meta. Such unregulated sale of PHI not only invades privacy but may also discourage people from seeking or relying on health assistance.

        Similarly, the FTC recently barred the company GoodRx from sharing consumers' PHI for advertising and fined it $1.5M for failing to report the unauthorized disclosure of PHI to major advertising companies, including Meta and Google. The case set the first precedent for applying the Health Breach Notification Rule under the FTC Act, although notably GoodRx was punished for failing to live up to its claims of abiding by Digital Advertising Alliance and HIPAA principles, rather than purely for the sale of PHI. The punishment GoodRx received seems rather minor: the only substantive penalty was the $1.5M fine, with the remaining sanctions amounting to orders to comply with the rules GoodRx should have been following to begin with. Considering that GoodRx is a major company with $187M in revenue from the third quarter of 2022 alone, a $1.5M fine is far below GDPR fines, which can reach 4% of annual global turnover; annualizing that quarterly revenue suggests a GDPR-style cap of roughly $30M, some twenty times the actual fine. Nonetheless, Samuel Levine, the director of the FTC's Bureau of Consumer Protection, stated that "digital health companies and mobile apps should not cash in on... personally identifiable health information," indicating that the FTC may push further to strengthen regulation of the sale of PHI without users' awareness and consent.
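        To make the scale of that gap concrete, here is a rough back-of-the-envelope comparison in Python; the annualized revenue figure is an assumption derived by simply multiplying the reported Q3 2022 revenue by four, not a reported number.

```python
# Rough comparison of the FTC's $1.5M GoodRx fine against a GDPR-style
# cap of 4% of annual global turnover. Annual revenue is approximated
# here by annualizing the reported Q3 2022 figure ($187M), which is an
# assumption for illustration only.
q3_revenue_usd = 187_000_000
annualized_revenue = q3_revenue_usd * 4          # ~$748M, rough estimate
gdpr_style_cap = 0.04 * annualized_revenue       # ~$29.9M
ftc_fine = 1_500_000

print(f"Estimated annual revenue: ${annualized_revenue / 1e6:.0f}M")
print(f"4% GDPR-style cap:        ${gdpr_style_cap / 1e6:.1f}M")
print(f"Actual FTC fine:          ${ftc_fine / 1e6:.1f}M "
      f"({ftc_fine / gdpr_style_cap:.1%} of the cap)")
```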

        Another example of data brokers selling PHI comes from a study by Duke University's Cyber Policy Program, which found US citizens' mental health data being advertised and sold by data brokers with seemingly little to no oversight of safety or security. Of the 37 data brokers contacted, 26 responded and 11 directly sold the researchers mental health data, which included PHI on people with depression, insomnia, ADHD, and anxiety, as well as identifiable data such as ethnicity, age, gender, zip code, net worth, credit score, and date of birth. Moreover, the study found that data brokers were, generally speaking, rather unwilling to disclose to their customers and users how personal data was collected or how it could be corrected. Given this gap in legislation, the study recommended either a new federal privacy law or an expansion of existing HIPAA protections, as well as a ban on the sale of mental health data.

        There are many other situations in which HIPAA fails to adequately protect privacy. In the era of the Internet of Things, legacy devices with outdated operating systems represent a major security weakness. For instance, a study found that 75% of infusion pumps scanned in hospitals (infusion pumps tend to be the most numerous IoT devices in the healthcare industry) had known security gaps. In Ontario, the leading cause of unauthorized disclosure of PHI was found to be misdirected faxes. Compounding such security gaps, the overturning of Roe v. Wade means that failures to protect certain PHI, such as reproductive health data, carry far more dire consequences, for example by discouraging patients from seeking abortion care in states that criminalize abortion. One suggested remedy for such security weaknesses is a "zero-trust framework," in which every stage of a digital interaction is continuously validated under the principle of "never trust, always verify," through frequently updated passwords, software, policies, and testing, and the practice of least privilege.
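        As a minimal illustration of that principle, the sketch below re-verifies identity, token freshness, and least-privilege permissions on every single request rather than trusting anything "inside" the perimeter; the roles, scopes, and helper names here are hypothetical, not taken from any particular zero-trust product.

```python
# A minimal "never trust, always verify" sketch: every request is
# re-authenticated and checked against a least-privilege policy, instead
# of being trusted once it is inside the network. All names below
# (PHI_SCOPES, verify_request, etc.) are illustrative assumptions.
import hashlib
import hmac
import time

SECRET_KEY = b"rotate-me-frequently"  # rotated per policy in practice

# Least-privilege role definitions: each role gets only the actions it needs.
PHI_SCOPES = {
    "nurse": {"read:vitals"},
    "physician": {"read:vitals", "read:records", "write:orders"},
}

def sign(role: str, expires: int) -> str:
    """Issue a short-lived, tamper-evident credential for a role."""
    msg = f"{role}|{expires}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

def verify_request(role: str, expires: int, sig: str, action: str) -> bool:
    """Re-verify identity, freshness, and privilege on EVERY request."""
    if time.time() > expires:                     # short-lived credentials
        return False
    if not hmac.compare_digest(sig, sign(role, expires)):
        return False                              # invalid or forged token
    return action in PHI_SCOPES.get(role, set())  # least-privilege check

exp = int(time.time()) + 300  # token valid for five minutes
token = sign("nurse", exp)
print(verify_request("nurse", exp, token, "read:vitals"))   # True
print(verify_request("nurse", exp, token, "write:orders"))  # False: not permitted
```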

        Although this particular issue centers on HIPAA's coverage of PHI in the US, the sale of PHI by data brokers and its use by companies remain relevant issues for the EU, Canada, and countries with newly developing privacy laws, since such practices are difficult to regulate unless companies implement strong principle-based privacy policies when collecting and using data. In Canada's case, although PIPEDA and provincial health privacy laws do cover PHI, they do so in a more general, principle-focused manner rather than through strict regulations, and notably the privacy agencies are given no enforcement powers (unlike under the GDPR or the FTC Act). Additionally, Canadian provinces may apply slightly differing standards; for instance, British Columbia prohibits PHI from being stored in the US even when encrypted. A unified, more direct regulation protecting PHI is needed, along with greater enforcement powers for violations of PHI-related privacy laws.

        Comparatively, South Korea is one of the leading countries in attempting to utilize PHI safely. For instance, the Bioethics and Safety Act allows health and medical information to be used for research in specific, tightly regulated circumstances, such as when a research plan is approved by a designated review board and donors give written consent. Additionally, Korea's Personal Information Protection Act (PIPA) allows de-identified data to be used, with various government agencies having the legal authority to collect PHI and, once it is de-identified, provide access to it for research purposes. While there is always room for improvement, such as better safety controls to protect patient privacy and safeguards against unauthorized access or usage beyond the stated purposes, an established regulatory system like this is one way to protect privacy rights related to PHI while still deriving economic and societal benefits from it.
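        To illustrate what de-identification can look like in practice, here is a minimal sketch that drops direct identifiers and coarsens quasi-identifiers; the field names and the sample record are hypothetical, and the rules shown loosely mirror HIPAA-style safe-harbor conventions rather than PIPA's actual technical requirements.

```python
# A minimal de-identification sketch: remove direct identifiers outright
# and generalize quasi-identifiers (zip code, date of birth) so records
# are harder to re-link to individuals. Fields and thresholds here are
# illustrative assumptions, not a legal standard.
DIRECT_IDENTIFIERS = {"name", "phone", "email"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with identifying detail reduced."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "zip_code" in out:                   # keep only the 3-digit prefix
        out["zip_code"] = out["zip_code"][:3] + "**"
    if "date_of_birth" in out:              # keep only the birth year
        out["date_of_birth"] = out["date_of_birth"][:4]
    return out

patient = {
    "name": "Jane Doe", "phone": "555-0100", "email": "jd@example.com",
    "zip_code": "90210", "date_of_birth": "1984-06-15",
    "diagnosis": "insomnia",
}
print(deidentify(patient))
# {'zip_code': '902**', 'date_of_birth': '1984', 'diagnosis': 'insomnia'}
```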




