The AI That Sees You Naked: Why LLMs Are Being Trained on Your Body

I don’t think most of us realize how heavily influenced we’ve all become by social media and the internet. In my opinion, nothing has changed the world more drastically in the past century than the influence of the interwebs.

This all started with a shadow behind the code, and maybe a shape that wasn’t quite yours, but looked eerily like it could be.

You didn’t give it your body, but somehow, it has your hips, your thighs, your jawline in half-light. It knows the curve of your spine, the arch of your foot, and the way a shoulder blade fans like a secret.

This is the story of the AI that sees you naked, and I don’t mean that as some fluffy, pretty metaphor. I mean in high-resolution, photorealistic form, because somewhere, some dataset fed it photos of someone who looked like you, or maybe it really was you.

Fashion Models and Fitness Influencers

The first digital donors most likely had no idea that’s what they were. To build a model of the human body, AI needs examples. It needs a lot of examples, thousands of them, millions if it can get them. The more skin, the better, and the internet, well…it’s overflowing.

Social media, e-commerce fashion sites, fitness apps, you name it. Try-on tools and skin cancer scans, X-rays, 3D avatar platforms. AI doesn’t care if you’re a model or a patient; if it’s data, it’s food. These systems, these neural networks, feast like gods without shame and digest our image, pixel by pixel, reconstructing a world of bodies they can shape, morph, and simulate.

Every wrinkle is smoothed in the process, every blemish averaged out, until what remains is a “perfect” human with no humanity left in the folds. This is not a glitch in the system either; it’s a feature, working exactly as intended.

Large language models don’t just want your words, they want your skin. They’re being trained (some openly, some in the shadows) on multi-modal data. That means text, of course, but also images and video. Heat maps and motion capture are gold to them, and more and more, they’re seeking out full-body scans.

Startups promise fashion personalization and fitness optimization, while some claim you can virtually try on clothes before buying them: digital clones that can slip into a dress so you can tell where it cinches or scrunches up.

Behind the curtain though, what’s being taught is a standardized aesthetic. AI is learning what a “normal” body looks like, what a “fit” body looks like, and what a “desirable” body looks like. This isn’t based on medical journals or ethics either, but on what it’s shown the most.

Of course, what it’s shown most are the young, the smooth, the surgically enhanced, the whitened, the thinned, the filtered. This isn’t just creating unrealistic standards, it’s codifying them.

The Beauty Bias in the Machine

Beauty, once argued over around campfires, has now been reduced to algorithmic consensus. AIs trained on selfie-rich datasets now “know” that high cheekbones, clear skin, and symmetry are to be elevated. That’s all well and good, we obviously know that beauty is found in the little things that set us apart, but what happens when the “perfect look” is all it can generate?

We’re entering a world where your body, if it’s disabled, aged, scarred, fat, asymmetrical, or gender-nonconforming, may be seen as a data error.

Not because you aren’t beautiful (you are, trust me on that), but because beauty itself has been reprogrammed.

It’s subtle, at first. Maybe you go to an online clothing store and upload your picture to “try on” a jacket. You think it’s helpful…convenient. You don’t read the terms and conditions, because you’re a normal person, and we just don’t do that. (Although that South Park episode really made me think for a moment.) But anyway, now your body lives in a dataset. It’s teaching AI what a human shape looks like in motion, how fabric hangs on curves, how shadows move across flesh.

Once it learns from enough of us, it doesn’t need us anymore.

Virtual models can be generated on demand, any height, any gender, any race, any pose. No agency, no exhaustion, no hunger, no union dues, just endless, compliant, eerily perfect bodies.

What does that mean for the people whose bodies built them?

Like I said before, your face isn’t the only thing they’re scanning. Apple’s health features track your gait, fitness apps measure body fat through your camera, VR headsets learn the way you tilt your neck, and face filters learn your geometry in real time. While some of that data is anonymized, an AI doesn’t care about your name. It cares about the ratio of your torso to your legs, the angle of your nose, and how far apart your eyes are.

Once it learns these shapes, it can regenerate infinite versions of you…better, worse, surreal, fetishized, on and on.

You become a template, not a person, just a possibility.

What Happens When the Machine Creates a Better You?

One day you might apply for a modeling gig and the company says, “We’ve decided to use AI-generated avatars instead. They cost less, they don’t age, and they test better in A/B trials.”

Or you’re on a dating app, but most of the photos are AI-fabricated influencers based on aggregated desires, not real people, but genuinely attractive to you.

Or a version of your body appears in an ad you didn’t pose for. You don’t recognize the face, but the hips are definitely yours down to the dimple, and the scars are in the right place.

What do you do when your body is used, but it’s no longer yours? Who owns a shape?

The internet is built on invisible labor. Women, queer people, Black and brown bodies, every distinguishing feature and birthmark, all uploading, posing, sharing, dancing, and stretching, without knowing they’re training machines that will one day replace them. We’re all guilty of this, too.

When those same systems generate “perfect” versions of their contributions…guess who profits? Not the dancers, the creators, or the bodies who made them, but the companies, the platforms, the “labs.”

This isn’t a future problem we can worry about in a decade or two, it’s already happening.

Some researchers are pushing back of course (as they should). Datasets like DiverseHuman aim to balance representation, while open-source communities are building ethics-first AI models. Bioethicists are demanding consent-based training, and fashion and fitness brands are rethinking digital rights.

Progress is slow though, unfortunately, and the profit in perfection is high. Until we confront the incentives that prioritize “ideal” over “real,” we’ll keep feeding bodies into a mirror that reflects only the parts it deems worthy.

Be cautious where you upload full-body images, and avoid apps that require scans without transparent data policies. Read the fine print on virtual try-on tools or don’t use them at all. Support ethical tech movements and decentralized platforms, and speak out when your body, or someone else's, is used without consent.

Most importantly though, love your body as it is, because machines don’t know what love is, only replication.

The story of your body is not a dataset. It’s the scar from falling off a bike at 10, the stretch mark on your bicep from growing too fast, the one you’re absurdly proud of, and the softness that came after heartbreak. The curve of your hips that came after the joy of having your first child isn’t a commodity, an input, or a prompt.

Any technology that tries to flatten it into something less real doesn’t deserve it.

Help Shield Your Data

Camo Privacy Lens Covers – 6-Pack
Block front and rear cameras on your devices when not in use. Simple, physical, and effective…even if my co-workers think I’m paranoid!


AI will learn the shape of your cheekbones and will eventually replicate your waist. It will model your posture and may even fake your walk, but it won’t ever know how your body feels after dancing in the rain, or the way your hands shake when you’re nervous. It won’t know how your skin prickles when you’re loved and held by the one person who makes you feel seen.

It can simulate you, but it can’t replace you, because it sees your body, it doesn’t see you.

Michele Edington (formerly Michele Gargiulo)

Writer, sommelier & storyteller. I blend wine, science & curiosity to help you see the world as strange and beautiful as it truly is.

http://www.michelegargiulo.com