Make your case
Noah Stein ’08
Keeping the Internet Honest
There’s a war going on against the armies of bots that create false information, steal people’s identities, and influence everything from consumers’ buying behaviors to our elections. Assistant Attorney General Noah Stein talks about what it’s like being on the front lines.
by Vince Beiser
Illustration by Pepe Serra
Most prosecutors bring cases against people they believe have done bad things. Noah Stein’s cases often involve people who don’t even exist.

Stein is an assistant attorney general with the New York State Attorney General’s Bureau of Internet and Technology, an outfit that focuses exclusively on crimes committed in the digital realm. That puts Stein at the forefront of the ever-evolving battle to rein in the trolls, scammers, identity thieves, digital stalkers, and others who make the online world an often perilous place. Stein has helped shut down everyone from scalpers who use bots to scoop up hundreds of choice concert tickets to scammers who manufacture thousands of fake social media followers.

“There’s a lot of incentive for people to create fake activity of various forms,” says Stein. “There’s money in it. And there’s still not enough policing of that, so it’s an area where there’s a lot of need.” State prosecutors like Stein have led the way on many large-scale digital cases, from the scandal over Facebook’s selling data to Cambridge Analytica to antitrust actions against big social media platforms. “We’re part of that vanguard,” says Stein. “To me, this is a dream job.”

Raised on Public Service
Stein grew up in policy-saturated Washington, D.C., the son of two lawyers who quit law for work in the public sector (his father) and in the nonprofit arena (his mother). “I was always interested in government and positive reform in general,” says Stein.

Yet law wasn’t his first pursuit after he graduated from Johns Hopkins University, where he majored in English. His first job after college was as an analyst with McKinsey & Company, followed by a role as a project manager at a software start-up in Manhattan. Mulling his next move, Stein found himself fascinated by newspaper accounts of New York’s then-Attorney General Eliot Spitzer’s anti-corruption crusades targeting Wall Street firms. (This, Stein is quick to point out, was before the revelations that Spitzer indulged in a little corruption himself.) “Spitzer was taking these legal tools and using them to reform industries. That seemed really powerful, worthwhile, and effective,” Stein says.

With the goal of becoming a criminal prosecutor, Stein started at Fordham Law in 2005. “My first year, I had an adjunct professor who was in a very senior position at the U.S. Attorney’s Office for the Southern District, a high-flying lawyer who was just brilliant. Because Fordham is here in New York City, where there is so much interesting legal work going on, that was the kind of adjunct teaching the class,” he says.

Yet by the time he graduated in 2008, Stein was no longer sure criminal law was a good fit.

Instead, he took a job doing commercial law for the firm of Patterson Belknap Webb & Tyler. He quickly discovered that he liked the work but was not excited about the idea of becoming a junior partner. “I got to first-chair a pro bono trial in federal court and it was the most fun I had as a litigator. But the idea of spending years to become a junior partner, where I’d support a senior partner who did most of the trial work, all in hopes that decades later I would get to become the senior partner myself—that didn’t seem so appealing,” he says. “I wanted to have my own subject matter expertise and get to direct my work and my career.”

Stein was much more excited about the increasingly complicated intersections of law and technology, and the idea of becoming a subject matter expert in an area of law that was just beginning to develop. When a job with the Bureau of Internet and Technology opened up, he interviewed. “I met with the team and I was really blown away,” he recalls. “I liked how thoughtful they were, and how they were actually fans of technology. It wasn’t the typical critique of tech—you know, ‘It’s all evil. It’s all bad.’ It was more, ‘Look, there’s this whole new area of human activity. And like any other area of human activity, it needs to be policed around the edges. And how are we going to do that?’”

Fighting All Forms of Fakery, Online
Stein is especially interested in what he calls “automated mass deception”—cases where fraudsters manufacture phony online armies to push for some result, using automated bots or real individuals posing as other people. It’s a growing problem that can include everything from companies that fill a shoddy product’s Amazon pages with rave reviews to Russian trolls filling voters’ Facebook feeds with lies meant to swing their votes. “The worst instance of this is the political use,” says Stein. “I think that’s really terrible.”

Besides influencing the outcome of elections, fake online personae can also influence policy, as in 2017, when the Federal Communications Commission began collecting public comments about its proposal to scrap net neutrality—that is, to drop the rules requiring Internet service providers to treat all online traffic equally, and instead allow them to offer faster access to the highest-paying websites. Some 22 million comments poured into the FCC’s website. “Very quickly, there were reports of comments that looked fake,” says Stein, who persuaded his office to launch an investigation. Not only were millions of the comments posted by fake people, but some were posted under identities stolen from real individuals, including people who were no longer living. Stein would not comment on the case, because it is still ongoing, but press accounts reveal that his office subpoenaed more than a dozen entities, including telecommunications trade groups, lobbyists, and advocacy organizations, in a bid to find out who was responsible for the mass deception.

When an Influencer Isn’t
This year, Stein brought a case against a company called Devumi, which sold bogus social media followers to anyone who wanted to inflate their YouTube views or Instagram numbers. “They would advertise online and say, ‘Grow your followers,’ or ‘Get more likes for your posts,’ and sell YouTube views by the hundreds of thousands. It was all fake—a way to appear more popular,” says Stein. The clicks were all generated by automated bots, or sometimes by sock puppets—multiple fake accounts controlled by an actual person.

Adding another layer of deception, scammers sometimes copied real people’s social media profiles and used them without their knowledge, as a way to make the automated accounts’ activity appear more realistic. In the process, Devumi raked in millions of dollars selling fake social media engagement to customers, among them professional athletes, reality show stars, and models.

You might wonder what’s the harm in people inflating their own popularity online. “As an influencer, the money you can charge a brand to mention their product is tied to the size, activity, and engagement level of your audience,” Stein explains. By helping unscrupulous influencers artificially pump up their follower counts, Devumi was essentially helping them defraud their clients.

Yet bringing a case against the company was uncharted territory. For one thing, at the time, there was no law that expressly said it was illegal to manufacture fake followers and likes. There is one, however, against undisclosed endorsements—meaning, it’s illegal to pay someone to talk up your product without acknowledging that you are doing so. Stein argued that creating and selling fake followers was a form of undisclosed endorsement, and charged that Devumi’s use of real people’s identities was illegal impersonation.

Stein’s work helped pressure Devumi to stop selling fake followers and to pay a financial settlement in 2019. It was, according to the attorney general, the first time a law enforcement agency had successfully brought such a case against fake social media activity. And precedent was one of Stein’s goals. “If we can focus the public on the problem of mass deception, and establish that it is illegal and unacceptable, maybe we can change the incentives to engage in this kind of massive fraud. Maybe this encourages platforms to do more to fight it. Maybe it gets the word out to the public—you know, ‘Don’t automatically believe everything you see on the Internet.’”

There does seem to be growing momentum in this direction. Prodded by governments and an unrelenting torrent of bad publicity, major platforms like Facebook have started scouring their user bases and filing suits against companies that sell fake likes and followers. After the Devumi matter, Facebook began bringing its own lawsuits against sellers of fake Instagram activity. And in July, California enacted a law making it a crime to use bots to try to influence buying or voting decisions.

Four years into the job, one of Stein’s primary goals is to help make the Internet “a less terrible place” for women and other groups who tend to be vulnerable to trolling, harassment, and worse. “I want to find out who’s engaging in that kind of abuse, to ratchet up the pain for them, and make them stop,” he says. At the same time, he’s looking for ways to increase protections around data privacy so ordinary folks can be less vulnerable online. “There are all these safeguards for your credit card information or your medical records,” he says. “But if somebody takes and then discloses, without your authorization, pictures of your naked body, or information about your sexual expression, that’s not covered. And that’s crazy.”

Last summer, Stein helped settle a case against an online dating app that touted a feature that allowed users to share nude photos with intimate partners but did a sloppy job of protecting those photos, which were left vulnerable to hackers. According to Stein, the company knew about the holes in its security for a full year but did nothing about them. Ultimately, it agreed to a settlement payout, setting another precedent. “I wanted to make the point that this type of sensitive information—nudity and sexuality—should have strong protection,” says Stein.

That kind of result keeps Stein going. “We can use law to set rules for how technology gets to be used and how it doesn’t, and what happens to people when they misuse it,” he says. “So you identify these areas of need, and then you get to be creative. That’s what I love about this work.”

If you’re working on an interesting case or in an emerging area of law, we want to hear about it. Tell us what you’re doing at