
Consent in the Digital Realm: Billboard Apologizes to Taylor Swift for Use of a Deepfake Image

When you think of Taylor Swift, what comes to mind? Most likely, the myriad accomplishments the 34-year-old has under her belt. A powerhouse of the music industry, Swift has stayed relevant for almost 20 years and shows no signs of slowing down. So when Billboard magazine ranked her at #2 on its Greatest Pop Stars of the 21st Century list and featured a clip from Kanye West’s “Famous” music video in the announcement post, people were rightfully upset.

The clip in question showed a naked wax figure of Taylor Swift accompanied by the lyric, “I made that b*tch famous.” The 2016 video sparked controversy over its misogynistic treatment of Swift and the fact that West never obtained her consent.

Billboard has since issued an apology for featuring that clip as a “highlight” of Taylor Swift’s career, but the incident still reflects a larger problem of the modern age: the manufacture of false sexual content depicting non-consenting individuals.

This isn’t the only time Swift has been a victim of this type of sexual abuse. On January 24, sexually explicit images of the pop star went viral on X. Those images were not real, and the incident reached Congress, sparking conversations about the rising issue of AI deepfakes and sexual harassment in the internet age.

Deepfake Porn

What happened to Taylor Swift in January is an example of deepfake porn, in which an image is edited to replace one person’s likeness with that of another, typically by placing someone’s face on a nude photo of someone else’s body.

As AI broadens its capabilities, deepfake porn has spread like wildfire; in fact, 98% of deepfake videos online are pornographic, and 99% of the victims are women.

Swift is among the thousands of celebrities who have been victims of deepfake pornography. Megan Thee Stallion, for example, tweeted in June about an explicit AI deepfake video of her circulating online, further highlighting the turmoil that deepfake pornography causes.

And celebrities aren’t the only ones being targeted. Last month, a Canadian teenager learned of a deepfake image of herself when someone sent her a link to a topless photo bearing her face. The original, fully clothed photo had been taken when she was 13.

Over one-third of US states have laws addressing non-consensual deepfake pornography, and even among those states, the penalties vary. Hopefully all 50 states will eventually impose legal consequences for these crimes, but in the meantime, there are steps you can take to protect yourself if you fall victim.

What Can We Do?

Much like Billboard’s use of that clip of Taylor Swift, deepfake images can take a real toll on a victim’s mental wellness.

Deepfakes are a form of sexual abuse and can lead to trauma. If you have experienced something like this, know that your circumstance is valid and you are not alone. If you’re looking for next steps to take in recovering and making your voice heard, here are some things you can do:

Online Organizations

The Cyber Civil Rights Initiative provides referrals to attorneys if you’re considering legal action, as well as a 24-hour crisis hotline (1-844-878-2274).

Additionally, takedown requests under the Digital Millennium Copyright Act can help get deepfake images removed from websites.


Seek Mental Health Support

To speak with a trained crisis counselor, you can text or call 988. Therapy can also be a great resource for working through the experience, and you can use RAINN’s guide to finding a therapist who suits your needs.

Remember That It’s Not Your Fault and You Are Not Alone

It’s common for survivors of sexual abuse, including deepfake porn, to blame themselves, but it is truly not your fault. Nobody deserves this type of online harassment. Your loved ones have your back, and it’s okay to open up to them.


Hope for the Future

Since this is such a new issue, the law hasn’t quite caught up, but states are in the process of adapting legislation to address the dangers of deepfake porn. Hopefully one day there will be more protections against being victimized in this way and consequences for people behind these deepfakes.

We’re already seeing celebrities like Taylor Swift use their platforms to combat deepfake porn, and the outpouring of support they’ve received shows that the public finds this unacceptable. Until the law catches up, it’s important to stand with victims and make our voices heard.
