Smile, We Can See You!

The Human Crop in Digital Data Harvests

K. L. L’am-Li

LOCATION, CONTACTS, BIOMETRICS, KEYSTROKE PATTERNS, CAMERA, Wi-Fi networks, clipboard contents, device restart permissions.

That’s just a smidge of the data that TikTok collects on the phones it’s installed on.

Whilst one of the most extreme examples of a company subverting user privacy, TikTok is far from the only firm guilty of mass-harvesting data from its users. Social media corporations are especially predatory, breaching personal privacy to obtain data for algorithm optimisation, sale to third parties and other uses the customer never consented to.

Except we gave them permission. Pretty willingly, as well.

Realistically speaking, few of us have given the terms and conditions more than a cursory glance when signing up for a new service. Even fewer have waded through the legal jargon that plagues the document. Yet within the terms and conditions of Instagram and various other companies, it's listed right there:

“By using such Applications, you acknowledge and agree to the following: (i) if you use an Application to share information, you are consenting to information about your profile on the Service being shared; (ii) your use of an Application may cause personally identifying information to be publicly disclosed and/or …”

Terms of Use, Instagram, 2022

However shocking it may once have been, the fact that Instagram and the rest of the social media cohort understand us better than we understand ourselves is now common knowledge (and concerningly accepted). So, to avoid beating the decomposing horse, I'd like to touch on a personal aspect that's rarely considered when the subject of digital privacy is brought up.

Your health, and all the information about it.

I had a fascinating discussion with my science teacher last week. We'd been studying pedigrees, and she mentioned something I'd never thought to consider: when we pay for an ancestry test, we willingly hand our genetic information to a company that could, theoretically, sell it to third parties with a use for it. If an individual took an ancestry test and found they carried genes linked to early-onset dementia, arthritis or some other cocktail of maladies, it would be a simple check for a health insurance company to deny them coverage for mental illness or rehabilitation on finding that they are especially susceptible to it, based on their genetic makeup.

Whilst a frantic Google search reassured me that the major ancestry companies don't sell personally identifying data to third parties, several do provide aggregate population data to private companies involved in health research. It's not a leap to see how, given enough time, the sharing of data for purposes beyond medical research and not-for-profits could become a very real part of the future.

Nice Fitbit you’ve got there

The current privacy policy for Fitbit is sound. The company doesn't sell information to third parties and is very clear about the data it collects, using it to inform customers as accurately as it can about their personal health. No issues here.

The issue stems from one small clause within its privacy policy, which states that:

“You may also direct us to share your information in other ways, for example, when you give a third-party application access to your account, or give your employer access to information when you choose to participate in an employee wellness program. Remember that their use of your information will be governed by their privacy policies and terms.”

Fitbit Privacy Policy, 6th June 2023

Fitbit also encourages users to interact with and meet other users in their local area, and through this social layer the company has unintentionally provided a loophole for social media firms to extend their reach into some of the most personal data a person can reveal. Snapchat's parent company, Snap, has collaborated with Fitbit to bring Bitmoji (the personalised avatar stickers used on Snapchat and Instagram) to its watches, which requires linking a Snapchat account to your watch, and by association, to your data.

Fitbit has done its best to maintain the integrity of its customers' data, but social media companies thrive on bending rules to get where they aren't meant to be. And it isn't just Fitbit that's sharing data, though in Fitbit's case the fault hardly lies with the company. Thousands of other businesses, domestic and international, selling everything from health-tracking watches to diet apps, are far more cavalier with their privacy policies. Particularly with applications from countries lax in their regulation of user data, such as China's media titan TikTok, the problems extend well beyond a minor issue to brush off. As these databases store increasingly private details of users, serious questions arise about what data should legally be available to companies regardless of customer consent, and the risk of a security breach shifts from serious to downright ruinous.

Privacy is as much a component of life as security and love, but like those two, the fabric surrounding it has been stretched thin. As we integrate and automate ever larger portions of our lives for convenience's sake, we need to keep in mind what we're giving up in return for these shortcuts.

We’ve all been taught the saying no pain, no gain. What on earth have we gained, really?