Airbnb guest says images were altered in false £12,000 damage claim

Airbnb has apologised to a woman after an apartment host falsely claimed she had caused thousands of pounds’ worth of damage, allegedly using digitally manipulated images to support the claim.

The London-based academic was refunded almost £4,300, and the short-term rental company launched an internal review of how the case was handled.

According to a security expert, the case highlights how cheap and easy-to-use AI software can be used to manipulate images and present false evidence in consumer disputes.

The picture of the ‘cracked’ coffee table allegedly in the New York apartment.

The woman, who lives in London, booked a two-and-a-half-month stay in a Manhattan apartment in New York while working there, but decided to leave early after feeling unsafe in the area.

Shortly after she left, the host told Airbnb that she had caused more than £12,000 of damage and submitted photographs of an apparently cracked coffee table as part of his case. Among his allegations, he claimed she had urinated on a mattress and damaged a robot vacuum cleaner, a sofa, a microwave, a TV and an air conditioner.

The woman denied causing any damage to the apartment. She said she left it in good condition and had only two visitors during her seven-week stay. On close examination, the two photographs of the coffee table appear to show different damage, leading the woman to believe they were digitally manipulated or AI-generated. She said the host was retaliating because she had ended her stay early.

Airbnb initially told her she would have to pay the host a total of £5,314 after the photos had been “carefully reviewed”. She appealed against the decision.

There were differences between the two photos of the coffee table.

“I reported to them that I could provide a witness who was with me at checkout and who can swear to the condition in which the property was left: clean, undamaged and in good order,” she says. “I also pointed out visual inconsistencies in the images of the same object (the wooden table) provided by the host, which showed clear signs of fabrication.”

“These inconsistencies would not be possible in real, unedited photographs of the same object. They immediately raised red flags, and even a basic review of the evidence should have identified them. Airbnb not only failed to spot the obvious manipulation but completely ignored clear evidence that the material had been fabricated.”

Five days after Guardian Money raised questions with Airbnb, the company told the woman it had accepted her appeal and offered her £500. Then, when she said she would never book with Airbnb again, it offered a refund of £854 – a fifth of the booking cost. She refused to accept this, and Airbnb subsequently refunded the full cost of the reservation (£4,269) and removed the host’s negative review from her profile.

“My concern is for future customers, who may fall victim to similar fraudulent claims and who lack the tools to push back, paying up out of fear of further escalation,” the woman says.

“Given that such images can now be generated by AI and are apparently accepted by Airbnb, it should not be so easy for a host to get away with fabricating evidence.”

The man who made the complaint against her is listed on Airbnb as a “superhost”, which the site says denotes an experienced and highly rated host. He did not respond to a request for comment.

Airbnb said the host had been warned for violating its terms and would be removed from the platform if there was another similar report. The company said it could not verify the images he had submitted as part of his complaint.

Airbnb apologised and said it would investigate how her case was handled. “We take damage claims seriously – our specialist team reviews all available evidence to reach fair and proportionate outcomes for both parties.”

Serpil Hall, director of economic crime at the management consultancy Baringa, said manipulating images and videos is now “easier than ever”, with software that is cheap, widely available and requires very little skill.

In one recent case, an insurer found a rise in false claims for vehicle and home repairs that relied on manipulated photos.

“Many companies have recently concluded that images can no longer be taken at face value [during disputes], and that forensic tools and fraud-intelligence models are needed to verify them.”
