
I made a rather rookie mistake.

I decided to build an AI tool for gathering and organizing security news, to help me stay on top of what is going on. I wrote a 'serverless' Google Cloud function to run on a schedule, invoke an AI prompt to gather the security news items from the past x weeks that I am interested in, organize them into categories, summarize them, and publish the summaries and links on a web page so that I can review them. All worked well. ✅

...until I noticed that some of the links were broken. Further investigation showed that almost none of the links worked properly: they were malformed, pointed to non-existent paths, or led to example.com. The problem was not with the links, but with the news itself. All hallucinations. 🦄🐲👾
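A quick sanity check would have flagged most of these links before they reached the page. A minimal sketch with the standard library; the set of placeholder hosts is my own assumption, not something from the original tool:

```python
from urllib.parse import urlparse

# Hypothetical blocklist of placeholder domains that hallucinated
# links often point to -- the exact list is an assumption.
PLACEHOLDER_HOSTS = {"example.com", "example.org", "example.net", "localhost"}

def looks_hallucinated(url: str) -> bool:
    """Flag links that are malformed or point at well-known placeholder hosts."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        return True  # not an absolute http(s) URL at all
    host = parsed.netloc.lower().removeprefix("www.")
    return host in PLACEHOLDER_HOSTS

print(looks_hallucinated("https://example.com/security-news"))  # True
print(looks_hallucinated("not-a-real-url"))                     # True
print(looks_hallucinated("https://krebsonsecurity.com/"))       # False
```

This only catches the obviously fake links; a plausible-looking URL to a non-existent path would additionally need an HTTP request to verify.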

Having spent some time refining my prompt, experimenting with multiple AI models and learning techniques for 'grounding' the model, I realized:
The whole thing was a very bad idea.

If you ask AI to gather news, it will 'want' to give you lots of news. To please you, it will even invent some. That is the way it works. There are techniques to tune the model's 'temperature', to push the model to ground its results, and to force it to provide evidence that the items exist; you might also be better off with a more advanced or more recently trained model. But no matter what you do, it may hallucinate. That is the way it works.

AI can do wonders when processing data🤖🌈 or summarizing the past, but telling an AI to go and gather fresh news from the past x weeks is a bad idea (at least with today's tech). You need to gather this kind of data yourself; then you can have the AI filter, process, and organize it -- this seems to be the only way to get rid of the hallucinations.
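The gather-first, summarize-later flow described above can be sketched as follows. This is a minimal illustration with the standard library: the feed contents and function names are made up, and the actual LLM call is deliberately left out, since the post does not specify which model or API was used -- the point is only that the prompt contains nothing but real, pre-fetched items, so the model has no link-shaped gaps to fill with inventions:

```python
import xml.etree.ElementTree as ET

# A stand-in for a real feed; in the actual tool you would fetch the
# RSS of a security-news site over HTTP. These items are illustrative.
SAMPLE_RSS = """<rss version="2.0"><channel>
  <item><title>Vendor patches critical flaw</title>
        <link>https://news.example-feed.test/patch</link></item>
  <item><title>New ransomware campaign observed</title>
        <link>https://news.example-feed.test/ransomware</link></item>
</channel></rss>"""

def parse_items(rss_xml: str) -> list[dict]:
    """Extract (title, link) pairs from an RSS document."""
    root = ET.fromstring(rss_xml)
    return [{"title": item.findtext("title"), "link": item.findtext("link")}
            for item in root.iter("item")]

def build_prompt(items: list[dict]) -> str:
    """Hand the model only real, pre-fetched items to categorize and
    summarize -- it never gets the chance to invent a link."""
    lines = [f"- {it['title']} ({it['link']})" for it in items]
    return ("Categorize and summarize these security news items. "
            "Use only the links given below:\n" + "\n".join(lines))

items = parse_items(SAMPLE_RSS)
prompt = build_prompt(items)  # this string goes to whichever model you use
```

With this split, the model's job is reduced to the thing it is good at (filtering, categorizing, summarizing given text), and every link on the page traces back to a feed entry that actually existed.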


This post was first published on LinkedIn on 2026-05-10.

This is my personal website; the opinions expressed here are strictly my own and do not reflect those of my employer. My English blog is experimental, and only a small portion of my Hungarian blog is available in English. The contents of my blog may be freely used under the Creative Commons CC BY license.