

Yeah, LLMs seem pretty unlikely to do that, though if they figure it out that would be great. That’s just not their wheelhouse. You have to know enough about what you’re attempting to ask the right questions and recognize bad answers. The thing you’re trying to do needs to be within your reach without AI, or you’re unlikely to be successful.
I think the problem is more the over-promising of what AI can do (or people who don’t understand it at all making assumptions because it sounds human-like).
Sure, but that’s really the fault of the moron, not the AI for existing. You could definitely blame the AI sellers, though, who would be happy to claim AI can do it.
It’s a useful tool, but like fire, if idiots get their hands on it, bad things will happen.