
Meet Airbnb’s official party pooper, who cut party reports by 55% in two years

Naba Banerjee, Airbnb

Source: Prashant Joshi | Airbnb

Naba Banerjee is a proud party pooper.

As the person responsible for Airbnb's global party ban, she has spent more than three years figuring out how to outsmart party-throwing users, flag “repeat party houses,” and, most importantly, develop an anti-party AI system with enough training data to stop high-risk reservations before a would-be party host even gets to the checkout page.

It has been a bit of a game of whack-a-mole: every time Banerjee's algorithms shut down one scheme, new ones pop up.

According to a company representative, Airbnb defines a party as a gathering held at an Airbnb listing that “causes significant disruption to neighbors and the surrounding community.” To determine violations, the company considers whether the gathering is an open-invite event and whether there is excessive noise, trash, an excessive number of visitors, parking problems for neighbors, and other factors.

Banerjee joined the company's trust and safety team in May 2020 and now leads the group. In her short time at the company, she has overseen a ban on high-risk reservations for users under 25, a pilot program for anti-party AI in Australia, heightened defenses on holiday weekends, multimillion-dollar host insurance and, this summer, a global rollout of Airbnb's reservation screening system.

Some measures have worked better than others, but the company says party reports fell 55% between August 2020 and August 2022 – and since Banerjee's system launched globally in May, more than 320,000 guests have been blocked or redirected from booking attempts on Airbnb.

Overall, the company's business is getting stronger as the post-pandemic travel boom wanes. Last month, the company reported earnings that beat analysts' expectations on earnings per share and revenue. The latter increased by 18% year-on-year, although the number of nights and experiences booked via the platform was lower than expected.

Turning a parent's party radar into an algorithm

Airbnb says the pandemic and hosts' fears of property damage are the main reasons for its anti-party push, but there have also been darker incidents.

Five people died at a Halloween party at an Airbnb in 2019. This year, at least five people were killed at parties at Airbnbs between Memorial Day and Labor Day weekends. In June, the company was sued by a family who lost their 18-year-old son in a shooting at an Airbnb party in 2021.

When Banerjee first joined Airbnb's trust team in the summer of 2020, she remembers people around her asking, “How do you solve this problem?” The stream of questions from people above and below her on the corporate ladder added to her fear. Airbnb's party problem was complex, and in some ways she didn't know where to start.

As a mother of five, Banerjee knows how to sniff out a clandestine party.

Last summer, Banerjee's 17-year-old daughter had a friend who wanted to throw an 18th birthday party, and the friend thought about booking an Airbnb for it. Banerjee remembers her daughter telling her about the plan and asking whether she should warn her friend not to book an Airbnb because of the AI protections. The friend ended up hosting the party at her house.

“As a mother of teenagers and when I see teenage friends of my children, the antenna is particularly sharp and you have a radar for, ‘Oh my God, okay, this is a party that's going to happen soon,'” Banerjee said. “Our data scientists, our machine learning engineers and we started looking at these signals.”

For Banerjee, it was a matter of translating this antenna into a usable algorithm.

In an April 2020 meeting with Nate Blecharczyk, the company's co-founder and chief strategy officer, Banerjee recalled developing strategies to tackle Airbnb's party problem on three different time scales: now, within a year, and further into the future.

For the “now” timescale, they talked about looking at platform data, examining the patterns and signals in current party reports, and seeing how those puzzle pieces fit together.

The first step, in July 2020, was to introduce a ban on high-risk reservations for users under 25, particularly those who either did not have much experience on the platform or did not have good reviews from hosts. Although Airbnb says it has blocked or redirected “thousands” of guests worldwide, Banerjee still observed users trying to get around the ban by having an older friend or relative book the reservation for them. Two months later, Airbnb announced a global party ban, but that was mostly lip service, at least until the company had the technology to back it up.

Around the same time, Banerjee sent out a series of invitations. Instead of invitations to a party, they were invitations to party-risk-mitigation workshops, sent to Airbnb designers, data scientists, machine learning engineers, and members of the operations and communications teams. In Zoom meetings, they examined the results of the booking ban for guests under 25 and began implementing further plans: Banerjee's team set up a 24-hour safety line for hosts, established a neighborhood support line, and decided to increase staffing at customer support call centers.

One of the biggest changes, however, was removing the ability for hosts to list their accommodation as available for gatherings of more than 16 people.

Now that they had a significant amount of data about how potential partygoers might behave, Banerjee had a new goal: to develop the AI equivalent of the neighbor who checks on the house when a high schooler's parents leave him home alone for the weekend.

Around January 2021, Banerjee recalled hearing from Airbnb's offices in Australia that disruptive parties were on the rise at Airbnbs there, as in North America, because travel had come to a relative standstill and Covid was in full swing. Banerjee considered introducing the under-25 ban in Australia, but after speaking with Blecharczyk, she decided to experiment with an anti-party machine learning model instead.

But Banerjee was nervous. Shortly afterward she called her father in Calcutta, India – for her it was between 10 and 11 p.m., for him it was mid-morning. Banerjee is the first female engineer in her family, and her father is one of her biggest supporters, she said, typically the person she calls during the most difficult moments of her life.

Banerjee said: “I remember telling him, ‘I'm just really scared. I feel like I'm on the verge of doing one of the most important things of my career, but I still don't know if we're going to succeed. The pandemic is continuing, the business is hurting… We have something that we think will be great, but we don't know yet. I'm on the edge of uncertainty right now, and it just makes me really nervous.'”

Banerjee remembered her father telling her that she had pulled through moments like this before and would do so again. He would worry more, he told her, if she were overconfident.

In October 2021, Banerjee's team launched the pilot program for its reservation screening AI in Australia. The company saw a 35% decrease in parties in the regions of the country that implemented the program compared with those that did not. The team spent months analyzing the results and updating the system with more data, including records of safety and property damage incidents and of user agreement violations.

How the AI system works to stop parties

Listings on Airbnb

Source: Airbnb

Imagine you're 21 years old and planning a Halloween party in your hometown. Your plan: book an Airbnb home for one night, send “BYOB” texts, and avoid cliche Instagram captions.

There's just one problem: Airbnb's AI system is working against you from the second you log in.

The party ban algorithm takes hundreds of factors into account: the reservation's proximity to the user's birthday, the user's age, the length of stay, the listing's distance from the user's home, how far in advance the reservation is made, weekend vs. weekday, the type of listing, and whether the listing is in a densely populated area rather than a rural one.
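As an illustration only, factors like these could be turned into a feature vector for a risk model. The field names and the 3-day birthday window below are invented for this sketch; Airbnb's actual system weighs hundreds of signals.

```python
from datetime import date

def party_risk_features(res: dict) -> dict:
    """Turn a reservation into model features (illustrative sketch only)."""
    checkin: date = res["checkin"]
    booked_on: date = res["booked_on"]
    # Compare check-in to the guest's birthday within the check-in year.
    birthday = res["guest_birthday"].replace(year=checkin.year)
    return {
        "guest_age": res["guest_age"],
        "length_of_stay_nights": res["nights"],
        "days_booked_in_advance": (checkin - booked_on).days,
        "is_weekend_checkin": checkin.weekday() >= 4,  # Friday or Saturday
        "is_entire_home": res["listing_type"] == "entire_home",
        "km_from_guest_home": res["distance_km"],
        "near_guest_birthday": abs((checkin - birthday).days) <= 3,
    }
```

A last-minute, one-night, entire-home booking close to the guest's home and birthday would light up several of these features at once, which is exactly the pattern Banerjee describes below.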

Deep learning is a subset of machine learning that uses neural networks – that is, the systems process information in a way inspired by the human brain. Functionally, the systems are certainly not comparable to the human brain, but they follow the pattern of learning by example. In the case of Airbnb, one model focuses specifically on the risk of parties, while another targets property damage, for example.

“When we started looking at the data, we realized that in most cases these were bookings made at extremely short notice, perhaps through a guest account that was created at the last minute, for a possible party weekend like New Year’s Eve or Halloween, and they might book an entire house for one night,” Banerjee told CNBC. “And if you look at where the guest actually lives, it’s really close to where the accommodation was booked.”

After the models complete their analysis, the system assigns a party risk score to each reservation. Depending on the risk tolerance Airbnb has set for that country or region, the reservation is either blocked or given a green light. The team also implemented “enhanced party mitigation measures” for holiday weekends such as July 4th, Halloween and New Year’s Eve.
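A minimal sketch of such a decision layer might look like the following. The thresholds, region codes, review margin, and the size of the holiday-weekend tightening are all hypothetical values for demonstration, not Airbnb's actual settings.

```python
# Hypothetical per-region risk tolerances; lower = stricter.
REGION_THRESHOLD = {"US": 0.60, "CA": 0.60, "AU": 0.70}
DEFAULT_THRESHOLD = 0.80
HOLIDAY_TIGHTENING = 0.15   # stricter on holiday weekends
REVIEW_MARGIN = 0.10        # borderline scores go to a human agent

def screen_reservation(risk_score: float, region: str,
                       holiday_weekend: bool = False) -> str:
    """Return 'block', 'human_review', or 'allow' for a scored reservation."""
    threshold = REGION_THRESHOLD.get(region, DEFAULT_THRESHOLD)
    if holiday_weekend:
        threshold -= HOLIDAY_TIGHTENING
    if risk_score >= threshold:
        return "block"
    if risk_score >= threshold - REVIEW_MARGIN:
        return "human_review"   # unclear cases get a human look
    return "allow"
```

Routing borderline scores to a human agent rather than auto-blocking matches the article's description of flagged reservations being escalated for human review.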

In some cases, such as when the correct decision is not entirely clear, reservation requests are flagged for human review, and these human agents can look at the message thread to assess party risk. But the company is also starting to “invest heavily” in large language models for content understanding to help understand party incidents and fraud, Banerjee said.

“The LLM trend is something where if you're not on that train, it's like missing out on the internet,” Banerjee told CNBC.

Banerjee said her team found a higher risk for parties in the U.S. and Canada, and the next riskiest would likely be Australia and certain European countries. In Asia, reservations seem to be significantly less risky.

The algorithms are trained in part on tickets flagged as parties or property damage, as well as hypothetical incidents and past incidents that occurred before the system went live to see if those incidents would have been reported. They are also trained on what “good” guest behavior looks like, such as when someone checks in and out on time, leaves a review on time, and there are no incidents on the platform.
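Turning such records into training labels might look like the sketch below, where positives come from party or damage tickets, negatives from demonstrably clean stays, and ambiguous records are dropped. The field names here are invented for illustration.

```python
def label_stay(record: dict):
    """1 = party/damage incident, 0 = clean stay, None = ambiguous (dropped)."""
    if record.get("party_ticket") or record.get("property_damage_ticket"):
        return 1
    clean_stay = (record.get("checked_in_on_time")
                  and record.get("checked_out_on_time")
                  and not record.get("incident_reported"))
    return 0 if clean_stay else None

def build_training_set(records: list) -> list:
    """Keep only (record, label) pairs with an unambiguous label."""
    pairs = [(r, label_stay(r)) for r in records]
    return [(r, y) for r, y in pairs if y is not None]
```

Excluding ambiguous stays is one common way to keep noisy labels out of the training set; the next paragraph notes why even the "clean stay" side of this labeling can carry bias.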

But as with many forms of AI training data, the idea of “good” guests is prone to bias. Airbnb has conducted anti-discrimination experiments in the past, such as hiding guests' photos, preventing hosts from seeing a guest's full name before confirming a booking, and introducing a smart-pricing tool intended to narrow income disparities among hosts, which in the end unintentionally widened the gap.

Airbnb said its reservation verification AI has been evaluated by the company's anti-discrimination team and that the company frequently tests the system in areas such as precision and recall.

Going international

Almost exactly a year ago, Banerjee was at a garden center with her husband and mother-in-law when she received a call from Airbnb CEO Brian Chesky.

She thought he was calling about the results of the Australian pilot program, but instead he asked her about trust in the platform. Given all the conversations she'd had about machine learning models and features, she remembered him asking whether she would feel comfortable sending one of her college-age children to an Airbnb, and if not, what would make her feel safe.

That call ultimately led to the decision to expand the Banerjee team's reservation verification AI globally the following spring.

Things were soon in full swing, with TV appearances for Banerjee, one of which she caught on the gym TV between pull-ups. She asked her daughter for advice on what to wear. The next thing she knew, the team was preparing a live demo of the reservation screening AI for Chesky. Banerjee was nervous.

Last fall, the team sat down with Chesky after working with front-end engineers to simulate a party risk: someone booking an entire villa at the last minute on a holiday weekend. The test was whether the model would flag it in real time. It worked.

Banerjee recalled that Chesky's only feedback was to change the existing message – “Your reservation cannot be completed at this time because we identified a party risk” – to be more customer-friendly and perhaps offer a chance to appeal or make another booking for that weekend. They followed his advice. Now the message reads: “The details of this reservation indicate that there may be an unauthorized party at the property. You still have the option to book a hotel or private room, or you can contact us if you have any questions.”

Banerjee remembers the hectic activity of the next few months, but also a feeling of calm and confidence. In April 2023, she visited her family in India for the first time in about a year. She told her father about the excitement of the rollout, which happened in batches over the next month.

Last Labor Day, Banerjee was visiting her son in Texas when the algorithm blocked or redirected 5,000 potential party bookings.

But no matter how quickly the AI models learn, Banerjee and her team must keep monitoring and adjusting the systems as party-minded users find ways around the obstacles.

“The interesting thing about the world of trust and security is that it never remains static,” Banerjee said. “Once you put up a defense, some of these bad actors out there who might be trying to buck the system and throw a party will get smarter and try to do something different.”
