Forbes describes how Fitbit data are being used in a personal injury case:
The young woman in question was injured in an accident four years ago. Back then, Fitbits weren’t even on the market, but given that she was a personal trainer, her lawyers at McLeod Law believe they can say with confidence that she led an active lifestyle. A week from now, they will start processing data from her Fitbit to show that her activity levels are now under a baseline for someone of her age and profession.
In this case, the data are being used in a way that's beneficial to the individual on whom the data were collected. But that's just this case.
This is such a perfect example of something that I preach to anyone who doesn't seem to care about data privacy. The mindset goes something like this: "I don't care if the government or Big Company X has my data. I haven't done anything wrong."
It's fine if you don't care about privacy, but if you use the innocent-today-therefore-innocent-forever logic to arrive at your apathy, you've gone way astray. You can't possibly anticipate how today's data will be used to implicate you in the future.
You can't possibly foresee how the fact that you went to lunch at a cafe on Broad Street at 11:43 on a Wednesday morning in July will become relevant, and be subpoenaed, in a court case involving people you don't even know. You can't possibly be certain that a pattern of perfectly innocuous web searches you ran in 2009 won't raise suspicion in light of an accusation someone makes in 2021. You can't be sure that future laws will err on the side of maintaining civil liberties rather than ensnaring enemies of the state.
The line "If you torture the data long enough, it will confess to anything" is often attributed to the economist Ronald Coase. I would offer a 21st century version: we're all guilty given enough database unions.
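To make the quip concrete, here is a minimal sketch of what "enough database unions" means in practice. The datasets, names, and records below are entirely made up for illustration: two collections that are each innocuous on their own, joined on a shared identifier to produce a combined profile that invites a narrative neither dataset contains by itself.

```python
# Purely illustrative, invented data: neither dataset is suspicious alone.
cafe_visits = [
    {"person": "alice", "place": "Broad St Cafe", "when": "2021-07-14 11:43"},
    {"person": "bob",   "place": "Broad St Cafe", "when": "2021-07-14 11:45"},
]
searches = [
    {"person": "alice", "query": "how to dissolve a partnership", "when": "2021-07-13"},
]

# A simple "database union": merge records keyed on the shared identifier.
profile = {}
for row in cafe_visits + searches:
    profile.setdefault(row["person"], []).append(
        {k: v for k, v in row.items() if k != "person"}
    )

# The joined profile now supports an insinuation no single dataset made:
# "alice searched for X the day before being in the same cafe as bob."
print(profile["alice"])
```

The point is not that this particular join is nefarious; it's that every additional dataset multiplies the stories that can be told about the same person.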