Content warning: pornography, revenge porn, exploitation
A number of my friends make part or all of their living creating art, some or all NSFW—we’ll call it porn for simplicity’s sake but trust me, some of it really is art. This post is for you, so you know what legislation is being put forward.
SISEA, or the Stop Internet Sexual Exploitation Act, is a US Senate bill currently in committee. Purportedly, it will defend against pornographic material posted without the affirmative consent of those depicted, but the methods it takes to do so are poorly thought out. The potential for damage is far out of proportion to the good it would accomplish.
SISEA will place significant burdens on users and any online platform that hosts and makes available to the general public “pornographic images” (hereinafter “covered platforms”).
“Pornographic image” means any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of “sexually explicit” conduct, i.e., actual or simulated—
(i) sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; (ii) bestiality; (iii) masturbation; (iv) sadistic or masochistic abuse; or (v) lascivious exhibition of the anus, genitals, or pubic area of any person. 18 USC 2256.
If you make, upload, or host any of the above, pay attention, because this applies to you.
What’s SISEA about?
The stated goals of SISEA are to give people the ability to have pornographic images uploaded without their consent removed, and this certainly does that. Sex workers will actually be able to control distribution of their work across platforms, and people who don’t consent to distribution will have leverage to have images scrubbed from the internet.
Every platform is required to verify the age and identity of each user, and obtain a consent form for each individual depicted. The consent can limit the geographic area where the image can be distributed. The platform is also required to prohibit any image from being downloaded from the platform.
Platforms will also have to display a prominent banner with instructions for having a contested image removed, operate a 24-hour telephone hotline to receive removal requests, remove a contested image within two hours of a request, and block re-upload of any image removed under this Act. Further, platforms will have to check each image uploaded against the Federal Trade Commission database created under this Act, which hosts a list of individuals who do not consent to any upload of any pornographic image to any platform.
That’s good, right?
In theory, yeah. The thinking behind the bill is that it allows individuals depicted to control where their image can be displayed, and theoretically prevents viral spread of any nonconsensual images. That’s great.
I’m sensing a “but.”
A big but. A mondo booty.
Prohibiting downloads is an issue: professionals who rely on downloads to fuel sales will lose an entire distribution method.
Platforms will have to track and limit geographic access for any limited consents provided, stretching already thinly-spread moderation staff even further.
The burden of operating a telephone hotline and responding within two hours to a removal request is unreasonable for smaller platforms, which would likely pull out of the pornography market rather than face the FTC investigations for unfair or deceptive trade practices that this Act will require.
The consents, though—did I read that right? They need consents for all images uploaded even before the Act?
Yeah, you read that right. Consents for any image, uploaded at any time before or after the Act is enacted, are required or the image has to be pulled down.
When asked for comment, the proprietor of prominent giant art site Macrophile.com had this to say:
I will just have to turn off Macrophile.com and dump a couple of decades of art… there’s no way I can research past art at all. Hell… I have explicit art there from people who have died, who I have kept running. So it will be an online death for those folks.
Is that all?
Nope. Most disturbingly, users and individuals will have to give the platform their personally identifiable information along with their consents. Platforms will have to retain the above information and keep it secure, but accessible in case of FTC audit. The likelihood of that identifiable information being leaked or hacked should give any individual or platform pause—Pandora does not go back in that box once you let her out.
The database is the biggest deal, though.
And what’s this about a database?
Glad you asked. The Act establishes an FTC database where individuals can opt in to protection from any uploads of pornographic material whatsoever, anywhere on the internet. The protection is significant, because platforms will be fined up to $1,000 per day that an offending image is uploaded.
It’s all or nothing, though—anyone with a professional or amateur porn image on the internet will have to remove it to get this level of protection.
A single mistake on the part of a small platform can prove incredibly costly. Platforms, again, especially smaller platforms, would probably rather pull out of pornography altogether than face this level of exposure.
So this really doesn’t do what it set out to?
Nope. Call and write your senators, because this can’t make it out of committee. Hopefully Georgia flips the Senate and this bill dies ignominiously, and legislators take a smarter approach to porn regulation.
Sit, stay, speak. Good dog.