Revenge porn and the tricky, delicate balance between freedom of speech and privacy

Sharing nude photos or videos online is a growing phenomenon. What sometimes happens to those intimate images and videos, such as being spread beyond the intended recipient out of spite, is part of a horrifying trend known as revenge porn.

Dr. Spring Chenoa Cooper, a 37-year-old public health professor at the City University of New York (CUNY), said she's more than aware of these circumstances. Not only does she conduct research in sexual assault prevention, but she is suing her ex-boyfriend, Ryan Broems, for allegedly circulating highly personal images of her. Some were consensually captured or shared with him while the relationship was ongoing; many, however, were taken without her explicit approval and spread across the internet with a malicious goal in mind, Cooper told The Daily Caller News Foundation.

Many state and local laws across the U.S. do not necessarily forbid the further dissemination of private photos that were originally shared with the subject's consent. Federal laws, like those prohibiting extortion and sexual assault, can be applied to certain cases, but depending on the specific circumstances and the discretion of the presiding judge or jury, the incident in question may not fall within the scope of those legal definitions.

A small bipartisan group of senators, as well as a number of politically balanced House representatives, are trying to change that at the federal level. Democratic Sen. Kamala Harris of California introduced the Ending Nonconsensual Online User Graphic Harassment (ENOUGH) Act in November 2017, along with fellow Democratic Sen. Amy Klobuchar of Minnesota and Republican Sen. Richard Burr of North Carolina. The prospective legislation would federally outlaw the sharing of personal, explicit images of others without proper consent, but with some stipulations: prosecutors would have to prove the accused perpetrator clearly knew the images were supposed to stay private and shared them with the purpose of causing harm.

“Perpetrators of exploitation who seek to humiliate and shame their victims must be held accountable,” Harris said in a statement at the time. “It is long past time for the federal government to take action to give law enforcement the tools they need to crack down on these crimes.”

Democratic Rep. Jackie Speier of California, one of the first to spark legislative interest, introduced a companion bill in the other chamber and has since garnered several cosponsors from across the aisle.

“The Internet is used for a lot of great work, but such incredible real-time information sharing and accessibility is also abused by bad actors,” Republican Rep. David Joyce of Ohio told TheDCNF. “As a Co-Chair of the Bipartisan Task Force to End Sexual Violence, I was proud to join Rep. Speier’s legislation, which would create federal criminal liability for revenge porn. Sharing intimate photos of someone without their consent and with the intent to cause harm should be illegal.”

Some state and local governments aren't waiting out the slow bureaucratic and legislative processes that frequently stall action at the federal level.

California passed a bill in 2013 allowing a maximum punishment of six months in jail and a $1,000 fine for anyone "convicted of illegally distributing private images with the intent to harass or annoy."

The New York City government passed a similar revenge porn law on Nov. 16, 2017, with a punishment of up to a $1,000 fine, a year in jail, or both. Cooper and her legal representation are already putting the statute to use in what is believed to be the first revenge porn lawsuit brought under it.

Cooper said her whole life was turned upside down the night she received messages on Tumblr threatening to expose nude images of her if she did not provide more. The educator refused, which is considered good practice from a cybersecurity standpoint.

The next day, Cooper and her friends spent hours scanning the account that had digitally accosted her, eventually discovering personal images that she knows only Ryan possessed, at least at one point.

“I got an order of protection against him that same day, and assumed that this nightmare was over,” Cooper explained. “However, a few weeks later, I woke up to several Facebook messages from strangers: they sent me nude photos of myself; they called me names; they demanded more photos, threatening to ‘expose’ me further if I didn’t comply.”

After engaging with some of these strangers, despite many of them tormenting her, Cooper realized that some of the posts they were sharing included her contact information, including that of her place of employment. Administrators at CUNY have been generally supportive, Cooper said. They offered to have security officials walk her to her car during certain hours, among other security measures, but the school isn't really equipped to handle such a nascent, twenty-first-century problem.

Every day for weeks, she was sent images of herself that were never meant to be seen by the public. The "revenge porn community" began to harass her constantly, even creating and circulating doctored posts that fraudulently depicted her in embarrassing situations.

“I came to be terrorized by Facebook message requests,” she said.

Cooper takes offense at the argument that because she allowed her ex to capture at least some of the intimate content, she bears some responsibility for its eventual spread.

“I’ve never heard someone give advice such as ‘you shouldn’t own anything so that you never have something stolen,’” Cooper, who specializes in sexual health, said. “To blame my trust in a partner for this crime is re-victimizing, and I’ve heard it hundreds of times.”

The attorney representing Cooper, Daniel Szalkiewicz, is glad the NYC law gives victims a legal means of redress, but said in many respects it doesn't go far enough.

“The New York City Revenge Porn Law is a first step, but sadly does not help all victims,” Szalkiewicz’s website reads. “It allows only for actions to be brought when the person who discloses the pictures had intent to cause serious emotional harm — a term the drafters of the law did not define.”

He also takes issue with the NYC law's requirement that an individual disclose the intimate images at issue, because it "is going to make the victim a spectacle, calling into question the severity of his or her emotional harm."

Several privacy experts, on the other hand, foresee a number of adverse consequences if laws addressing this problem are too broad. For example, could a student browsing the internet from a dorm room who stumbles upon an image, unaware that the subject never consented to its online publication, be held liable for viewing it or sharing it with a friend?

“We are very sympathetic to what victims of revenge porn go through,” Sophia Cope, staff attorney at the Electronic Frontier Foundation (EFF), a digital rights nonprofit, told TheDCNF. “But it’s important that Congress and state legislatures not write overly broad laws that harm free speech online. Any legislation should target the direct abusers while also taking into account newsgathering and other activities in the public interest.”

Liz Woolery, senior policy analyst at the Center for Democracy & Technology, for the most part agrees, also stressing how critical it is to make any potential law as narrow and targeted as possible.

Woolery told TheDCNF:

One of the challenges with revenge-porn laws is that they often seek to criminalize a person’s distribution of an image which they lawfully possess. Some cases may be covered by privacy torts or hacking crimes, such as when a person hacks into a victim’s online account or otherwise obtains unauthorized access to an image.

But when seeking to criminalize the archetypical case, where a person shares an image that was consensually created and shared, then a law should have some additional element of criminal knowledge and intent, as in, the person sharing the image knows it was shared in confidence and intends to cause the person depicted in the image some kind of harm (emotional, reputational, financial).

Szalkiewicz dismissed most concerns about an all-encompassing or more rigorous statute. The "sharing of the image in and of itself is not the illegal act" in New York City, but rather that it was shared with third parties "with the intent to cause economic, physical, or substantial emotional harm and in a manner where the depicted individual is identifiable," he said.

The U.S. passed the Fight Online Sex Trafficking Act (FOSTA) in March — a bill that amends portions of Section 230 of the Communications Decency Act (CDA), which grants websites legal immunity for users' conduct. The law, while not directly applicable, is related to the revenge porn issue.

Companies are now more liable for user behavior — specifically in the policing and removal of users engaged in illegal sex trafficking. This shows an appetite within the federal government, and perhaps the public, to hold tech companies accountable for their creations and how they are used.

It will likely be hard to make the case that the sex trafficking reforms apply to revenge porn.

That’s one of the many reasons Szalkiewicz wants to “eliminate” the civil protections from the CDA, or create a more comprehensive law to cover more harmful actions.

“A federal law would be immensely helpful to victims of revenge porn, who often run into jurisdictional issues depending on state (and in this case city) laws and those who live in states without revenge porn laws,” said Szalkiewicz. “Victims need some recourse because even the staunchest First Amendment advocate would say there is an inherent wrong in sharing these images.”

But what can be done outside the arena of politics? Are there other measures that can help ensure companies are doing their utmost to police their own platforms — all without going too far with content moderation?

“Social media platforms and other content hosts should enable users to report incidents of revenge porn for review and removal — often getting an image taken down is the most salient remedy for victims,” said Woolery. “At the same time, platforms should ensure that they have robust appeals mechanisms in place for any of their content takedown procedures; we see time and again that any moderation process designed for one use will also be abused and manipulated by others seeking to silence those they disagree with.”

One problem, as evident in Cooper’s case, is that simple reporting isn’t enough if those reports take weeks, even months, to be taken into consideration, or aren’t addressed at all.

Facebook has launched initiatives aimed at identifying and purging revenge porn from the platform. The company, however, has been very unhelpful in Cooper’s situation and has not responded to several appeals, according to Szalkiewicz and Cooper.

The professor tried to report a Facebook group through the digital processes offered on the platform, she said.

“Facebook responded with an automated response that I should block that person if I found them threatening,” said Cooper. “There is no mechanism to email Facebook or respond to their response, other than giving them feedback about their customer service.”

Facebook declined to comment on the record, but pointed TheDCNF to blog posts outlining the actions it takes, both through humans and artificial intelligence. Conducting completely fair and accurate content moderation is a difficult task for a social media company with billions of pieces of content on its platform. Even if the salacious images are removed from a social media site, they could be elsewhere on the internet, or already in a person’s possession (whether digitally or physically).

Cooper never heard back from Facebook, and Szalkiewicz also claimed he didn’t receive any help after many attempts.

The same allegedly goes for Tumblr, where Cooper said most of her personal images were shared by multiple accounts. Szalkiewicz and Cooper described Tumblr as a virtual refuge for revenge porn enthusiasts, one that shows little interest in clamping down on online menaces and evildoers.

“We are exploring ways to hold Tumblr responsible for their involvement,” Szalkiewicz said.

In July 2017, a Manhattan judge forced Tumblr to release the account information of more than 300 once-anonymous Tumblr users accused of spreading revenge porn, ruling in favor of Szalkiewicz's 27-year-old unnamed client. A separate EFF attorney at the time described the decision as one that "discourages other speakers from exercising their own anonymous speech rights."

Like EFF’s Cope, Woolery worries about the language of relevant legislation, especially since so many elected officials make laws general and vague, particularly when it comes to governing tech.

Still, sometimes laws can be regarded as too specific or not exhaustive enough.

For instance, an Oregon judge ruled that a 61-year-old man did not do anything illegal when he took an upskirt photo of a 13-year-old girl without permission.

The judge expressed his desire to do more to punish the man, but conceded that, due to the way the laws were written, he couldn't. Privacy rules prohibit covert photography in areas like bathrooms, dressing areas, and locker rooms, but the department store aisle where the photo was taken was deemed public, and the act therefore legal. As for the photo being of a minor, the defense successfully argued the relevant laws apply only to nudity, not to underwear-clad buttocks.

“It’s incumbent on us as citizens to cover up whatever we don’t want filmed in public places,” the defendant’s attorney said at the time, according to The Guardian. He also referenced the famous photos of Marilyn Monroe wearing a white dress over an air-blowing subway grating, arguing that upskirt sightings can occur by coincidence.

“In this case, one law in question, designed to protect against invasion of privacy, specified nudity, while the other law, encouraging child sexual abuse, specified engagement in sexual conduct — and neither nudity nor sexual conduct were included in the defendant’s upskirt photo,” Woolery explained in her argument that the nuances of a bill’s text are critical.

Legislators are attempting to fill in these gaps.

“The ENOUGH Act is a commonsense bipartisan proposal that establishes a federal criminal penalty for perpetrators while ensuring online speech is not burdened,” Rep. Cheri Bustos, a Democrat from Illinois and cosponsor of Speier’s ENOUGH Act, told TheDCNF. “We must stand with survivors by ensuring our laws are keeping up with changing technology and that the Department of Justice has the tools they need to hold predators fully accountable.”

The Japanese took an interesting and non-regulatory approach to the apparently prevalent problem of upskirt photos, Woolery said.

“Rather than passing new legislation, phone manufacturers have agreed to ensure that all camera phones have a shutter sound — making it more noticeable when an individual is taking a picture near you,” said Woolery.

“In addition, arguably one of the best methods for tackling problems presented (or made easier) by new technology is adapting and learning new social norms that address the new situations these technologies present,” she continued. “Of course, revenge porn is not a unique product of the cell phone era, but those technological developments — to discreetly snap a photo, to easily copy and distribute images online — have made the problem more widespread.”

With the rapid rise of technology, the sharing of personal and intimate content has intensified as well. How society deals with the balance between the First Amendment and personal privacy will test the priorities of the American populace and their apparent respect for both the ideals of freedom of speech and the freedom to be left alone.

This report, by Eric Lieberman, was cross posted by arrangement with the Daily Caller News Foundation.

