Three years ago, 15-year-old Zackery Nazario died while subway surfing. His mother believes social media companies are to blame for filling his feed with videos of kids riding the roof of train cars.
The lawsuit she filed in the wake of his death may determine whether companies like Meta are allowed to target minors with their algorithms — if an appeals court allows it to proceed.
“They kept pushing and pushing this stuff at him,” Norma Nazario previously told amNewYork, referring to Meta and TikTok parent company ByteDance’s algorithms that suggested subway surfing videos to her son. “I can’t even imagine how he felt, the pressure that he felt to go do this.”
After Meta and ByteDance lost their motions to dismiss Nazario’s Manhattan Supreme Court lawsuit last year, the companies appealed to the Appellate Division, First Department, which heard arguments Thursday.

ByteDance and Meta attorney Timothy Hester argued that Zackery’s actions alone caused his death, and that, as publishers, the social media companies have a First Amendment right to use their algorithms as they see fit.
“Just like a newspaper decides, ‘I’m going to run a story on the front page because it’s the one my users want to read,’ it’s the same thing in the social media context,” Hester said. “It’s impossible for the internet to operate without some mechanism that allows users to connect with the content.”
Associate Justice Sallie Manzanet-Daniels questioned whether that line of reasoning holds true if the person affected is a minor, and if targeting those underage via algorithms is where a line should be drawn on First Amendment rights.
“I think that all of your arguments with regard to the freedom of speech and the free access of information are not arguments that offend most of us,” Manzanet-Daniels said. “But, when we’re thinking about our children being targeted by this information, is there no room for saying that should be the line — that these algorithms should not be allowed to target minors?”
She agreed with Hester that a number of cases have explored the right of social media companies to use algorithms and keep content on their sites, but said there wasn’t much case law exploring the tech giants’ rights to do so when it pertained to kids — suggesting maybe this case should be the start of that.
“What case has ever been litigated that explores … the targeting of minors, which we know exists?” Manzanet-Daniels asked. “Explain why that difference should not at least entitle these plaintiffs to … explore whether there was indeed an active campaign at the time to make sure anyone who fit the profile of a young, urban male in New York City [saw subway surfing videos]? Why would that not be something that the plaintiffs should be able to at least get some discovery on?”
Matt Bergman, Nazario’s attorney, argued that the way Meta and ByteDance use algorithms can’t so easily be compared to news outlets placing stories on the front page.
It’s much more aggressive, Bergman said. It involves sending people push notifications encouraging them to watch certain videos and pushing videos onto their feeds because the resulting views financially benefit the companies or their advertisers — not necessarily because users have indicated any interest.
“The son wasn’t looking for this material, he was being fed it in order to maximize his engagement, and the profits and advertising derived [from it],” Bergman said.
Bergman added that because the social media companies “knew” their videos were motivating children to subway surf, they are liable for Zackery’s death. He emphasized this point when judges asked why the claims against Meta and ByteDance should not be dismissed, as they were against the MTA, an initial defendant in Nazario’s suit.
“These algorithms are designed knowingly to take advantage of the undeveloped prefrontal cortexes of young people,” Bergman said, adding that the MTA simply provided the trains while the social media companies caused Zackery to use them in a dangerous way. “They take advantage of the desire for social acclimation of young people.”
Whether it’s OK for the companies to target children — and whether courts should weigh that question — is an issue Justice Manzanet-Daniels kept coming back to. She emphasized that laws governing social media companies and algorithms may be outdated, as the technology moves faster than Congress can regulate through legislation.
“We know Congress is not exactly functional now,” Manzanet-Daniels said. “That doesn’t help society in dealing with the ever-growing issue … It’s left to the courts.”
When reached for comment, a Meta spokesperson told amNewYork Law it would “vigorously defend” itself against the suit, that posting videos of subway surfing “violates” its policies and that the company removes those videos when it becomes aware of them. ByteDance did not respond to a request for comment.



































