THE clues that the “Willy Wonka” experience in Glasgow might not deliver on all the promises made could be found in promotional material for the event.

Some of the advertising used enticing colourful images which offered a day of themed fun, consisting of something called “catgacating” along with “encherining entertainment, cartchy tuns, exarserdray lollipops, and a pasadise of sweet teats”.

It had clearly not been through any checks – and, as it turned out, no human had been near it, with much of the promotional material and the script for the event generated by artificial intelligence (AI).

Furious parents described it as a “scam” after turning up to find a largely empty warehouse in Whiteinch instead of the “whirlwind of wonder” that was promised.


The event was cancelled following the complaints, with organisers promising that the £35-a-head tickets would be refunded.

While the technology may be relatively new – the use of AI has become far more widespread in the past 18 months – experts say its use in potentially misleading material relies on long-established tactics.

Oli Buckley, professor of cybersecurity at the University of East Anglia, said: “What scams have worked on for years is that you don't give it your full attention.

“You just make an impulse decision and they are playing on that.

“And I suppose, you know, why would you think that someone has created fake images to demonstrate a Willy Wonka theme park that doesn't exist - but it's just an empty warehouse in the middle of nowhere?”

The failure of the Willy Wonka event sparked headlines around the world, with details emerging on how AI had been used in the creation of the organiser’s website, promotional material and on the day itself.

An actor hired to play Willy Wonka at the event said the script he had been given was “15 pages of AI-generated rubbish” and required him to say “mad things”.

He told The Independent: “The bit that got me was where I had to say, ‘There is a man, we don't know his name. We know him as the Unknown. This Unknown is an evil chocolate maker who lives in the walls.’”

Buckley said it appeared to have been done using generative AI, which creates new content using information sourced from the internet.

“For example, I gave a talk yesterday, and the example I gave was if you went to a restaurant and you wanted to order a meal and instead of picking from a catalogue of recipes, you say, 'I want something that's chicken with a bit of citrus and a bit spicy',” he said.

“Then if it was generative AI, it would look at all of the recipes it's seen on the internet and, rather than pick one that already exists, use that knowledge to learn how to combine the ingredients and come up with something totally new.”


“It knows about how you write and it knows how you construct a script for a Willy Wonka experience because it's seen lots of stuff on the internet, and then it will use what it knows about that and the prompts you've given it to build something.”

He added: “The same for the artwork as well - you give it a prompt and say, 'I want a picture of a magical chocolate wonderland [...], it should look like this and this and this'. And then it would draw on all the images it's seen and create this new stuff for you.”

Buckley said it was a powerful tool which was useful for creating prototypes and building ideas – but added that using it as a final product is a “bit ethically dubious”.

He also warned that with AI becoming more widely available, it was increasingly being used in scams – including voice cloning, where a convincing copy of someone's voice can be generated by web services from an audio recording lasting less than a minute.

He said: “So you get a phone call and it sounds like it's from someone else, it sounds like it's from a loved one saying they need money, that they are in trouble.

“And so you're getting variations on old scams with the emperor’s new clothes on top of it – a bit more polished, a bit more realistic.”

He added: “It's more freely available and as with every technology that we've had in the last couple of hundred years, when it becomes more freely available, people use it for scams or pornography, and that's where it’s gone.”

When it comes to being able to spot misleading material or scams, Buckley said it was “tough” as the main defence is to “be more sceptical”.


“So if you get an email or a phone call or something like that, and it seems too good to be true, it seems too out of character for the person, then it's about taking that step back and maybe validating it in other ways,” he said.

“If someone called you and you weren't expecting it and they said, 'oh I need some money', you might text them back or ring them back.”

He added: “With the Wonka example, it's not a new thing. So in the computer game market, you would often see adverts for computer games that show the videos they have in the game rather than what the actual game looks like.

“So it looks better than the game does when you are playing, but there is a statement saying this isn't in-game footage, this is video.

“There is some onus on the creators as well, but it's really difficult to legislate for because it has exploded in the last 12 to 18 months - so I don't know how you can legislate for that quickly enough.”