A legal case and a security surveillance system collided to ruin a Girl Scout troop’s trip.
Kelly Conlon, an attorney from New Jersey, says she wasn’t allowed to see a Rockettes show at Radio City Music Hall after she was identified by a facial recognition system, according to a report from NBC New York. Conlon told the outlet that guards approached her while she was in the building’s lobby and said she wasn’t allowed to be there because of her connection to a legal case against the company that owns the hall.
“I believe they said that our recognition picked you up,” she told NBC, saying that she was asked to identify herself and that “they knew my name before I told them. They knew the firm I was associated with before I told them.” She says she ended up waiting outside while her daughter watched the show with other members of her Girl Scout troop.
Madison Square Garden Entertainment (or MSG), the owner of Radio City and many other venues, hasn’t confirmed whether it was facial recognition that alerted security to Conlon’s presence. However, it does make it clear that it uses the tech. “We have always made it clear to our guests and to the public that we use facial recognition as one of our tools to provide a safe and secure environment and we will continue to use it to protect against the entry of individuals who we have prohibited from entering our venues,” the company said in a statement sent to The Verge by Mikyl Cordova, a spokesperson for the company.
MSG refused to provide details about its system, such as whose facial recognition tech it uses. There are many companies that develop these kinds of systems, with some selling them to businesses and governments. However, the company has a long history with facial recognition systems — it was testing them by early 2018, according to a report from The New York Times. As NBC shows in its report, the company has signage posted at the venue to tell people that security uses facial recognition, as it’s legally required to do.
It’s possible there are other ways Conlon could have been identified before the show; if she’d been asked to present her identification or tickets with her name on them at any point, that would’ve been an opportunity for other security systems to flag her. But she told NBC that she was picked out pretty much as soon as she went through the metal detector.
The incident stems from the fact that Conlon is a lawyer at a firm that’s involved in a lawsuit against MSG. While she told NBC that she hasn’t worked on the case, MSG’s policy “precludes attorneys from firms pursuing active litigation against the company from attending events at our venues until that litigation has been resolved,” according to Cordova. Its reasoning is that “litigation creates an inherently adversarial environment.” Cordova says that “all impacted attorneys were notified of the policy” and that Conlon’s firm was notified twice.
The policy has been controversial from a legal standpoint. When lawyers from another case brought it up, Judge Kathaleen McCormick — who presided over two different Elon Musk cases this year as he tried to get out of buying Twitter and argued over his pay package with Tesla shareholders — called it “the stupidest thing I’ve ever read,” according to documents obtained by Reuters.
Another judge in a separate case ruled that “plaintiffs may not be denied entry into any shows where they possess a valid ticket” while noting that MSG did have the right not to sell them tickets in the first place. The company didn’t answer The Verge’s questions about whether it had systems in place that would’ve prevented Conlon from purchasing a ticket, either through its systems or from resellers.
Despite the ruling, MSG sent another letter to law firms saying that they weren’t allowed onto its premises and that it could revoke their tickets, according to Reuters. It seems likely that the question of whether MSG’s ban is allowed will be litigated in many courtrooms for who knows how long. That probably won’t be the case for its use of facial recognition itself — in New York, it’s legal for businesses to use the technology, and reports have shown that the NYC government has received millions in funding for its own surveillance systems. (The state has curtailed facial recognition in at least a few instances, though; schools currently aren’t supposed to use it.)
Even as they become more commonplace, facial recognition systems aren’t accepted everywhere. Their ability to quickly scan large numbers of people and attempt to match faces to identities in a database makes them attractive to governments and businesses, but members of the public and privacy advocates have pushed back against their use.
Beyond concerns about how the technology can be used to intensify policing or track people’s movements, facial recognition opponents often point to studies suggesting that many of the systems are less accurate when identifying people who aren’t white. There have been cases where people were arrested after facial recognition software matched them to someone they didn’t actually resemble.
Some states and cities have passed laws meant to curb police and other government agencies’ access to the tech, and massive tech companies like Google, Microsoft, IBM, and Amazon have weighed in on different sides of the debate. Even the controversial facial recognition firm Clearview AI has said that it’ll stop selling its systems to most private companies after it was accused of building its database with pictures taken from social networks without users’ knowledge.