A drop in your daily step count. A missed period. A loss of hearing. If it’s collected by a smartwatch or other wearable, that health data isn’t protected the same way your medical records are.

And as wearables like smartwatches and headphones sweep up an increasing amount of health data — flagging potential medical issues that could be used to target ads or discriminate against someone — some lawmakers and researchers are calling for a reconsideration of the current approach.

In a sign of the increasing urgency of the problem during the current virtual care boom, U.S. senators last month reintroduced a bill that would make it illegal for companies like Apple, Amazon, or Google to sell or share the data collected by wearables. The act would be enforced by the U.S. Department of Health and Human Services in the same manner it enforces the Health Insurance Portability and Accountability Act, or HIPAA.

Legal experts consider the move a step in the right direction, but caution that further action is needed to address the vast amounts of information being absorbed by health tech startups and technology giants alike.

“We are on a collision course with how to regulate health data as all the different types of wearables and health tech explode,” said Carmel Shachar, executive director of the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard.

“HIPAA doesn’t extend to the world of health tech, and it should,” she added.

Sen. Bill Cassidy (R-La.) originally introduced the bill, called the Smartwatch Data Act, together with Sen. Jacky Rosen (D-Nev.) in 2019, but told STAT he felt the need to revive it after seeing a large insurance company offering to provide its customers with smartwatches.

“These watches are collecting data in an environment akin to an exam room,” Cassidy, who is also a trained physician, told STAT. “There is an expectation of privacy.”

When HIPAA was written in 1996, its creators likely did not envision a world in which people trusted large tech companies with tracking sensitive behavioral data such as sleep patterns, menstrual cycles, and activity levels. As a result, there is now a morass of health and health-adjacent information with virtually no shield against monetization or discrimination, outside of the general requirement that companies abide by the privacy policies they share with users.

Harvard law professor Glenn Cohen likens the situation to an iceberg, where the tip represents the data covered by HIPAA and the rest represents all the information that is not shielded by the law. Today, there is nothing stopping an employer or insurer from using that unprotected data to price its products or deny someone a job.

“I like to remind people that the ‘P’ in HIPAA isn’t privacy,” Cohen said. “The law made sense when we were talking about health care information, not health information” more broadly.

The problem has come to the fore amid the pandemic, as moves mount to permanently define the home as a place of clinical care. Earlier this month, Amazon health tech subsidiary Amazon Care joined the hospital chains Ascension and Intermountain Health to create a lobbying organization called Moving Health Home that aims to “fundamentally change the way policymakers think about the home as a site of clinical service,” according to a press release.

While these kinds of shifts would undoubtedly make it easier to access telehealth and other forms of virtual care, they also raise important ethical questions about what constitutes health data and how concepts like informed consent are defined outside of a traditional medical environment.

It’s not clear whether the bill will pass, but there are reasons to believe it may gain more traction than in 2019, given how the landscape has changed. This January, the period-tracking app Flo settled allegations by the Federal Trade Commission that it disclosed users’ personal health information — including when they were having their period and whether they intended to get pregnant — to Facebook, despite promising to keep their data private.

“We are looking closely at whether developers of health apps are keeping their promises and handling sensitive health information responsibly,” Andrew Smith, director of the FTC’s Bureau of Consumer Protection, said in a statement.

Along with collecting vast amounts of health and health-adjacent data, health tech companies are also increasingly playing a role in medical research. Both Google and Apple, for example, make it possible for people to participate in remote medical studies using their phones. As more health tech companies launch virtual research programs, it is time for a new definition of informed consent, said Cohen and Shachar. To them, asking consumers to scroll through a novella of fine print doesn’t cut it.

Another important ethical question to consider with the rise of virtual care is how health data is transferred among the subsidiaries of large tech companies. Although the new bill would stop tech giants from sharing such data with, or selling it to, one another, it would not prevent them from sharing it among their own subsidiaries.

For example, there is nothing in the current law — or proposed bill — that would stop an entity such as Google-owned Fitbit from sharing its wearable data with Google Health, the company’s research and wellness subsidiary, or with Nest, its smart thermostat and camera company. In 2019, then-presidential candidate Sen. Elizabeth Warren (D-Mass.) proposed forcing tech companies including Amazon, Facebook, and Google to spin off their subsidiaries for this reason, arguing that the mergers were anticompetitive.

Cohen and Shachar said they believed that, because the act does not protect against such intra-company data transfers, it could wind up incentivizing health tech monopolies. They added that they hoped future legislative efforts would block this type of data transfer.

“Would you rather have lots of people know little bits of information about you, or one person know everything about you?” said Cohen. “If that one person is the love of your life, that’s great, but if that one person is the person who sold you your car, maybe it’s not so great.”
