The Supreme Court’s ruling in Dobbs v. Jackson Women’s Health Organization granted state legislatures the authority to regulate abortion. The decision quickly allowed trigger bans on the procedure to take effect in states such as Texas and Arkansas. Even before the Court’s ruling, data brokers had begun selling location data, harvested through ordinary apps, for individuals visiting abortion facilities. This data often revealed where the individual traveled from and how long they stayed at the facility.
In the wake of Dobbs, concerns have come to light regarding the potential misuse of sensitive personal health data originating from period tracking apps. Questions have arisen concerning whether “femtech” app data can be used to identify and prosecute individuals violating abortion laws. Due to lax federal laws and regulations in the United States, the onus falls on femtech companies to immediately and proactively find ways to protect users’ sensitive health data.
What is “Femtech”?
The term “femtech” was coined in 2016 by Ida Tin, the CEO and co-founder of period tracking app Clue. Femtech refers to health technology directed at supporting reproductive and menstrual healthcare. The femtech industry is currently estimated to have a market size between $500 million and $1 billion. Femtech apps are widely used, with the popular period-tracking app Flo Health touting more than 200 million downloads and 48 million monthly users.
Apps like Clue, Flo Health, and Stardust allow individuals to record and track their menstrual cycle and receive personalized predictions about their next period or ovulation. Although femtech apps collect highly sensitive health data, they are largely unregulated in the United States, and there is a growing push for a comprehensive framework to prevent the sensitive health data these apps collect from being sold or provided to third parties and law enforcement.
Current Regulatory Framework
Three federal agencies have regulatory authority over femtech apps: the Federal Trade Commission (“FTC”), the United States Food and Drug Administration (“FDA”), and the Department of Health and Human Services (“HHS”). Their authority over femtech data privacy is limited in scope. While the FDA can clear the apps for contraceptive use, the FTC and HHS have drawn greater focus in regulating femtech. The Health Insurance Portability and Accountability Act (HIPAA), administered by HHS, fails to protect sensitive health data from being collected and sold, and femtech apps are not covered under the Act. The FTC is currently exploring rules on harmful commercial surveillance and lax data security practices following President Joe Biden’s July 2022 executive order encouraging the FTC to “consider actions . . . to protect consumers’ privacy when seeking information about and provision of reproductive health care services.” The executive order’s definition of “reproductive healthcare services,” however, does not appear to include femtech apps. Thus, a massive gap remains in protecting the sensitive health data consumers willingly provide to femtech apps, which may sell or provide such data to law enforcement or third parties. Because femtech apps generally offer both free and paid versions, their reach is broad, making the issue all the more immediate.
Unease over the potential misuse of health data collected by femtech apps heightened following the FTC’s complaint against Flo. The agency alleged that the app violated Section 5 of the Federal Trade Commission Act (“FTCA”) by misleading consumers about how it handled sensitive health data. While the app promised to keep such data private, the FTC found it was instead sharing the data with marketing and analytics firms, including Facebook and Google. Flo ultimately settled with the FTC but refused to admit any wrongdoing.
FTC Commissioners Rohit Chopra and Rebecca Kelly Slaughter issued a joint statement following the settlement stating that, in addition to misleading consumers, they believed the app had also violated the FTC’s Health Breach Notification Rule (“Rule”), which requires “vendors of unsecured health information . . . to notify users and the FTC if there has been an unauthorized disclosure.” The FTC declined to apply the Rule against Flo because such enforcement would have been “novel.” Disclosures of this kind would help users navigate the post-Dobbs digital landscape, especially in light of news reports that law enforcement in certain states has begun issuing search warrants and subpoenas in abortion cases.
There is additional concern that location data collected by femtech apps could fall into the hands of data brokers. The FTC recently charged Kochava, a data brokerage firm, with unfair trade practices under Section 5 of the FTCA for selling consumers’ precise geolocation data at abortion clinics. While Kochava’s data is not linked to femtech, the FTC’s settlement with Flo suggests that the sale of sensitive reproductive health data from femtech apps is not out of the realm of possibility. Despite the FTC’s announcement that it is exploring new rules on commercial surveillance and lax data security, experts have questioned whether such rulemaking is best done through the FTC or Congress, since the FTC’s rules are “typically more changeable than a law passed by Congress.”
As noted, most femtech apps are not covered entities under HIPAA and are therefore not required to comply with it. HIPAA encompasses three main rules under Title II: the Security Rule, the Privacy Rule, and the Breach Notification Rule. Although HIPAA is not a privacy bill, it has grown to “provide expansive privacy protections for [protected health information] (“PHI”).” Due to the narrow definition of covered entity, the current structure of HIPAA provides little protection to femtech app users even though these apps collect health data that is “individually identifiable.”
Even momentum to amend HIPAA so that femtech falls within the scope of covered entities may fall short, since HIPAA’s Privacy Rule permits covered entities to disclose protected health information (“PHI”) for law enforcement purposes pursuant to a subpoena or court-ordered warrant. While the Rule does not require covered entities to disclose PHI, this permission could be troublesome in states hostile to abortion. Even if HIPAA’s definition of covered entities were expanded, it would still be up to each company to decide whether to disclose PHI to law enforcement. Some femtech companies, though, may be more willing to protect user data and have already begun to do so.
Future Outlook and What Apps Are Doing Post-Dobbs
In a post-Dobbs update, Flo introduced an “anonymous mode” that lets users access the app without providing their name, email, or any technical identifiers. Flo said the decision was made “in an effort to further protect sensitive reproductive health information in a post-Roe America.” The FTC, however, has cautioned that claims of anonymized data are often deceptive and that such data is easily traceable. Users may therefore still be at risk of having their sensitive health data handed over to law enforcement. Further, research shows that femtech apps often have significant shortcomings in making their privacy policies easy to read, and that users are often unaware of what their consent means.
While femtech has the potential to bring much-needed attention to a group often under-researched and underrepresented in medicine, enhancing current data privacy standards should be at the forefront for developers, legislators, and regulators. Although femtech companies may be incentivized to sell sensitive health data, their resources may be better spent lobbying for the passage of legislation like the American Data Privacy and Protection Act (“ADPPA”) and the My Body, My Data Act; otherwise, the lack of data privacy measures may turn users away from femtech altogether. While no current reports show that menstruating individuals are abandoning femtech apps, it may be too soon to tell the effects post-Dobbs.
The ADPPA is a bipartisan bill that would be the “first comprehensive information privacy legislation” and would charge the FTC with administering the Act. The ADPPA would regulate “sensitive covered data,” including “any information that describes or reveals the past, present, or future physical health, mental health, disability, diagnosis, or healthcare condition or treatment of an individual” as well as “precise geolocation information.” The ADPPA’s scope would extend beyond covered entities as defined by HIPAA and would encompass femtech apps. The bill would reduce the amount of commercially available data accessible to law enforcement and give consumers more rights to control their data. The Act, however, is not perfect, and some legislators have argued that it would make it more difficult for individuals to bring claims for privacy violations. While it is unlikely that Congress will pass or even consider the ADPPA before the next Congress convenes in January 2023, it marks a start to long-awaited federal privacy law discussions.
On a state level, California moved quickly to enact two laws strengthening privacy protections for individuals seeking abortion, including a prohibition on cooperation with out-of-state law enforcement regardless of whether the individual is a California resident. Although California is working to become an abortion safe haven, abortion access is costly, and the individuals most impacted by the Supreme Court’s decision will likely be unable to fund trips to the state to take advantage of its strong privacy laws.
As menstruating individuals continue to navigate the post-Dobbs landscape, femtech companies should provide consumers transparency about how their reproductive health data is collected and how it may be shared, especially with respect to a growing healthcare service that individuals are exploring online: abortion pills.
In August 2020, Marlene Stollings, the head coach of the Texas Tech Women’s Basketball Team, allegedly forced her players to wear heart rate monitors during practices and games. Stollings would subsequently view the player data and reprimand each player who did not achieve her target heart rate. It could be argued that Stollings was simply pushing her players to perform better; however, former player Erin DeGrate described Stollings’ use of the data as a “torture mechanism.” This is just one reported example of how athletic programs use athlete data collected from wearable technology to the student athlete’s detriment.
As of 2021, the market for wearable devices in athletics was valued at $79.94 billion and is expected to grow to $212.67 billion by 2029. With major market competitors such as Nike, Adidas, Under Armour, Apple, and Alphabet, Inc., the expected growth comes as no surprise. Some wearable technology is worn by everyday consumers simply to track how many calories they have burned in a day or whether they met their exercise goals. Professional and college athletes, on the other hand, use wearable technology to track health and activity data to better understand their bodies and gain a competitive edge. While professional athletes can negotiate which types of technology they wear and how the technology is used through their league’s respective collective bargaining agreement, collegiate athletes have no such negotiating power. Universities ultimately possess a sort of “constructive authority” to determine what technology students wear, what data is collected, and how that data is used, without considering the student-athlete’s level of comfort. If a student-athlete chooses to opt out of wearable technology usage, it may hinder their playing time or lead to their being kicked off the team.
Studies show that collecting athlete biometric data has a positive effect on a player’s success and helps reduce the risk of injury. For instance, professional leagues utilize wearables to create heat maps that analyze an athlete’s decision-making abilities. The Florida State Seminoles basketball program also routinely uses wearables to track and monitor early signs of soft tissue damage, which helped reduce the team’s overall injury rate by 88%. However, there are significant trade-offs, including the invasion of an athlete’s privacy and possible misuse of the data.
Section I of this article will examine the types of information collected from athletes and how that information is collected. Section II will discuss college athletes’ right to privacy under state biometric laws. Section III will discuss how data privacy laws are changing with respect to collecting athlete biometric data. Finally, Section IV will discuss possible solutions to the problems biometric data collection poses.
I. What Data is Collected & How?
Many people around the country use smartwatch technology such as Fitbits, Apple Watches, or Samsung Galaxy Watches to track their everyday lifestyle. Intending to maintain a healthy lifestyle, people usually allow these devices to monitor the number of steps they take throughout the day, how many calories they burn, the variance of their heart rate, or even their sleep schedule. On the surface, there is nothing inherently problematic about this data collection; biometric data collected on college athletes, however, is much more intrusive. Athletic programs are beginning to enter into contractual relationships with big tech companies to provide wearable technology for their athletes. For example, Rutgers University’s football program partnered with Oura to provide wearable rings for its athletes. The types of data these devices collect include blood oxygenation levels, glucose, gait, blood pressure, body temperature, body fatigue, muscle strain, and even brain activity. While many college athletes voluntarily rely on wearable technology to develop a competitive edge, some collegiate programs now mandate that students wear the technology so the athletic program can collect the data. Collegiate athletes do not have the benefit of negotiations or the privileges of a collective bargaining agreement, but they do sign a national letter of intent that requires a waiver of certain rights in order to play for the university. Although college athletes have little to no bargaining power, they should be given the chance to negotiate the national letter of intent to address biometric data privacy, because it is ultimately their bodies producing the data.
II. Biometric Privacy Laws
Currently, no federal privacy law protects student athletes’ biometric data from collection. Nonetheless, some states have enacted biometric privacy statutes to deal with the issue. Illinois, for example, which houses thirteen NCAA Division I athletic programs, enacted the Biometric Information Privacy Act (BIPA) in 2008. BIPA creates standards for how companies in Illinois must handle biometric data. Specifically, BIPA prohibits private companies from collecting biometric data unless the company (1) informs the individual in writing that their biometric data is being collected or stored, (2) informs the individual in writing why the data is being collected and for how long collection will continue, and (3) receives a written release from the individual. This is a step in the right direction for protecting athletes’ privacy, since the statute’s language implies athletes would have to provide informed consent before their biometric data is collected. However, BIPA does not apply to universities and their student-athletes because universities fall under the Section 25(c) exemption for financial institutions. Illinois courts, including a recent decision in Powell v. DePaul University, have explained that the 25(c) exemption extends to “institutions of higher education that are significantly engaged in financial activities such as making or administering student loans.”
So, although Illinois has been praised for being one of the first states to address the emerging use of biometric data by private companies, it does not protect collegiate athletes who are “voluntarily” opting into the wearable technology procedures set by their teams.
III. Data Collection Laws are Changing
While BIPA does not protect collegiate athletes, other states have enacted privacy laws that may. In 2017, Washington followed in Illinois’s footsteps by enacting its own biometric privacy law, substantively similar to BIPA but with an expanded definition of “biometric data.” Specifically, the law defines biometric identifiers as “data generated by automatic measurements of an individual’s biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises or other unique biological patterns or characteristics that are used to identify a specific individual.” By adding the phrases “data generated by automatic measurements of an individual’s biological characteristics” and “other unique biological patterns or characteristics that are used to identify a specific individual,” the Washington law may encompass the complex health data collected from student-athletes. The statute’s language is broad and thus likely covers an athlete’s biometric data, because such data is unique to a certain individual and could be used as a characteristic to identify that individual.
IV. Possible Solutions to Protect Player Biometric Data
Overall, it is hard to believe that the collection of student-athlete biometric data will see increased restrictions any time soon. There is too much on the line for college athletic programs to stop collecting biometric data, since programs want to do whatever it takes to gain a competitive edge. Nonetheless, it would be possible to restrict who has access to athletes’ biometric data. In 2016, Nike and the University of Michigan signed an agreement worth $170 million under which Nike would provide Michigan athletes with apparel and, in return, Michigan would allow Nike to obtain personal data from its athletes through wearable technology. The contract hardly protected the University’s student-athletes and was executed in secrecy; its details were revealed only through a Freedom of Information Act request. Since the University negotiated the use of the student athletes’ biometric data on the athletes’ behalf, it can likely be assumed that the University owns the data. Therefore, athletes should push for negotiable scholarship terms that allow them to restrict access to their biometric data so that only the athletic program’s medical professionals may obtain it.
One would think that HIPAA protects this information from the outset. Yet there is a “general consensus” that HIPAA does not apply to information collected by wearables because (a) wearable technology companies are not considered “covered entities,” (b) athletes consent to these companies having access to their information, or (c) an employment exemption applies. Allowing student-athletes to restrict access before their college careers start would likely mitigate the peer pressure from coaches to consent to data collection. Further, restricting access would show that athletes do not consent to companies having access to their information, which could trigger HIPAA’s protections. It would also place the information in the hands of a medical professional, keeping it confidential, while still allowing the athlete to analyze the data with that professional to gain the competitive edge biometric data provides.
Anthony Vitucci is a third-year law student at Northwestern Pritzker School of Law.