That's not gaslighting. Gaslighting is when you repeat a lie about something that happened over and over until people create false memories and believe the lie is the truth. This is the AI being caught in a lie. Those are not the same thing.
If he had asked the AI "where is the closest McDonald's", gotten an answer, then asked whether the AI had his location information and it had lied "no", that would be closer to gaslighting. Even then it still wouldn't really be gaslighting, because the AI has no intent; its developers simply programmed it to lie about location information.
I think a better definition of gaslighting is lying to someone in an effort to convince them they're crazy or can't trust their own perceptions, memory, or judgement.
The term comes from the 1944 film Gaslight, in which a husband secretly manipulates details of his wife's environment and then denies it, to convince her she's losing her mind.