Melbourne teacher’s story about Airbnb AI shocks Q+A panel


A panel of tech experts has ripped into Airbnb after a Melbourne teacher said she could not create an account on the accommodation platform because of the colour of her skin.

Francesca Dias said her white partner had to do it for her.

“So recently I found I couldn’t activate an Airbnb account basically because the facial recognition software couldn’t match two photographs or photo ID of me, and so I ended up having my white male partner make the booking for me,” she told ABC Q+A’s panel on Monday night.

Ms Dias wanted to hear from the experts on how society could “avoid AI bias and reinforcing discrimination”.

Catriona Wallace, a future technology expert and founder of the Responsible Metaverse Alliance, explained the problem was the data sets AI starts with.

“To train the algorithms, data sets have to be collected from somewhere, usually society, and often society does not have good representation of the full population in its data sets because that’s how biased we’ve been historically,” Dr Wallace said. “And those historical sets are used to train the machines that are going to make decisions of the future, like what happened to Francesca.

“It is staggering that this is still the case and it’s Airbnb, right? You’d think a big international global company would get that s*** sorted but they still haven’t and we’re seeing that over and over again.

“Even with big brands and tech companies, they are still using data sets that haven’t been properly transformed to reflect the population and it continues on.”
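To make that point concrete, here is a minimal, purely illustrative Python sketch. The numbers, group labels and threshold are invented for the example and have nothing to do with Airbnb's actual system; it simply shows how a verification threshold tuned on data dominated by one group can concentrate rejection errors on an under-represented group:

```python
# Toy illustration (not Airbnb's system): how a match threshold tuned on
# unrepresentative data can produce very different false-rejection rates per group.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical assumption: genuine "same person" match scores run slightly lower
# for Group B, e.g. because that group was under-represented when the
# face-matching model was trained.
scores_group_a = rng.normal(loc=0.80, scale=0.05, size=9000)   # well represented
scores_group_b = rng.normal(loc=0.70, scale=0.08, size=1000)   # under-represented

# The operator picks a threshold giving roughly a 2% false-rejection rate overall.
# Because Group A dominates the data, the threshold is effectively tuned to Group A.
all_scores = np.concatenate([scores_group_a, scores_group_b])
threshold = np.quantile(all_scores, 0.02)

frr_a = np.mean(scores_group_a < threshold)   # share of genuine users rejected
frr_b = np.mean(scores_group_b < threshold)

print(f"threshold = {threshold:.3f}")
print(f"false-rejection rate, Group A: {frr_a:.1%}")
print(f"false-rejection rate, Group B: {frr_b:.1%}")
```

On these made-up numbers the overall error rate looks acceptable, but almost all of the rejections land on the smaller group — the pattern Dr Wallace describes.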

Q+A host Patricia Karvelas appeared equally appalled.

“If you look at something like Airbnb you’d think capitalism would sort it out,” Karvelas said. “Just in terms of consumer numbers. There are lots of brown women.”

Technology journalist Angharad Yeo said the problem was that diverse data sets were not front of mind for companies investing in new technology.

“Because the technology is still new, I think it’s very easy for them to get very excited that it’s being implemented at all,” she said.

However, she offered an “optimistic” view for the future.

“I think it’s one of those areas that really puts a spotlight on these biases because you have real examples when something doesn’t work,” Ms Yeo said.

“It’s very easy to ignore a bias when it’s just like ‘oh maybe I didn’t get promoted over a co-worker and maybe that was because of bias and maybe it wasn’t’.

“When it’s a little bit more hidden, it’s easier to ignore, but when it’s ‘I literally cannot use this service because the AI isn’t working’, then that really makes you go, hang on a second we have a real problem here.”

Toby Walsh, chief scientist at the UNSW AI Institute, said it wasn’t a technical problem or a capitalism problem but a regulatory one.

He said racial discrimination laws existed and should be applied “forcefully” so it was in Airbnb’s best interest “to get it right”.

“We know actually technically how to fix this,” he said.

Airbnb Australia directed news.com.au to a page about the photo matching process on its website.

“No facial matching process is completely accurate every time,” the website states.

“The effectiveness of this process can vary based on the quality and resolution of the photos — and changes in a person’s appearance between the two photos (for example, change in age, change in weight or different outfits).

“As a result, this process may sometimes ‘match’ photos that are not, in fact, of the same person or fail to match photos that are of the same person.”

In a crackdown on house parties, Airbnb is now also using AI globally to weed out guests who might throw parties.

The AI looks for red flags such as how recently a user created their account, whether they are trying to book a property in the same city they live in, and the length of the stay.

“If someone is booking a room during New Year’s Eve for one night only, and they are from the same city as the host, that’s likely to be a party,” Naba Banerjee, head of safety and trust at Airbnb, told the BBC.
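As a rough illustration of how red flags like these could be combined, here is a hypothetical, rule-based Python sketch. Airbnb’s real system is a machine-learning model built on many more signals, and none of the field names or thresholds below come from the company:

```python
# Illustrative sketch only: a simple rule-based version of the red flags described
# above (recent account, booking in the guest's own city, a one-night stay on a
# high-risk date). Airbnb's actual model and its weightings are not public.
from dataclasses import dataclass
from datetime import date

@dataclass
class BookingRequest:
    account_created: date
    guest_city: str
    listing_city: str
    check_in: date
    nights: int

HIGH_RISK_DATES = {date(2024, 12, 31)}   # e.g. New Year's Eve

def party_risk_flags(b: BookingRequest) -> list[str]:
    flags = []
    if (b.check_in - b.account_created).days < 30:
        flags.append("account created recently")
    if b.guest_city.lower() == b.listing_city.lower():
        flags.append("booking in the guest's own city")
    if b.nights == 1 and b.check_in in HIGH_RISK_DATES:
        flags.append("one-night stay on a high-risk date")
    return flags

# The scenario from the article: a local booking one night only over New Year's Eve.
request = BookingRequest(
    account_created=date(2024, 12, 20),
    guest_city="Melbourne",
    listing_city="Melbourne",
    check_in=date(2024, 12, 31),
    nights=1,
)
print(party_risk_flags(request))
```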
