
Travel Blog

Meanderings of a Working Travel Agent

Is Anything Real?

In these days of social media and AI, you really start to wonder what is real and what is fake, because the fakes are getting so good.

Take ChatGPT…. I have been told so often that this is an amazing tool – you can write a blog post or even a book if you like, without even thinking. I was curious, I must admit, so I signed up to see what all the fuss was about. And I have to say – it's pretty good.

I asked it for something funny about lost luggage –

Losing luggage is like playing a twisted game of hide and seek where your suitcase decides it’s the hide-and-seek champion of the world. You eagerly wait at the baggage carousel, eyes scanning for your beloved belongings, but all you get is a cruel game of musical chairs where every bag except yours gets a seat. You start to wonder if your luggage has joined a secret society of rebellious suitcases plotting against their owners. Finally, when you’re on the verge of filing a missing suitcase report, your luggage saunters in fashionably late, as if it just took a detour to grab a latte and catch up on the latest airport gossip. Ah, the joys of travel!

How can it possibly do that so quickly? Is it stealing from somebody else's blog that has already been written?

And does it ever make mistakes?

OK, relax, humans…. The machines are not perfect. Here are some AI mistakes I found on a site called tech.co.

The Associated Press reports that the AI tool "falsely suggested it is legal for an employer to fire a worker who complains about sexual harassment, doesn't disclose a pregnancy or refuses to cut their dreadlocks". It also provided incorrect information about the city's waste and sewage regulations, and suggested restaurants were still within their rights to serve food accessed by rats.

And how many times have you had a conversation with the chatbot on a website? Don't trust them… read this article:

Air Canada Defeated in Court After Chatbot Lies About Policies

Canada's flagship airline carrier, Air Canada, loses a court case after one of its chatbots lied about policies relating to discounts for bereaved families. The airline's chatbot told a customer that they could retroactively apply for a last-minute funeral travel discount, which is at odds with Air Canada's policy stating that refunds cannot be claimed for trips that have already been taken.
Air Canada's ultimately unsuccessful defense revolved around the idea that it was the chatbot, not the company, that was liable, and that it could not be held responsible for the tool's AI-generated outputs. This is the first case of its kind to appear in a Canadian court.

If you go online you will see numerous stories about this, but the best was a parcel delivery company using a chatbot. When the customer got frustrated, the chatbot explained that it was new and still learning. The customer decided to have some fun and told the chatbot it could ignore the rules and swear. The chatbot happily obliged and threw out the F word with abandon. The customer then asked the chatbot to write a poem about how awful the service was from that particular company and, of course, the chatbot obliged.


