Soon, maybe. But for now there are tools to tell. Your own brain is an example.
Tools like that are unreliable because the people who may deceive you have access to them too. They can be incorporated into the process (just keep modifying the image until it no longer registers as AI generated), which is pretty similar to how it works already.
Not that it really matters. AI generated images can still cost people jobs, trick gullible people, and affect people's emotions, beliefs, aesthetic tastes, and politics, even if they are easily detected as fake or openly admitted to be.
I know it's not fashionable anymore to say that anyone shouldn't be able to do whatever they feel like, but some people should not have access to what they can't create themselves.