There's nothing remarkable about a chatbot giving you false information when you deliberately jailbreak it into doing so. You can open the Coca-Cola website, use inspect element, and make the page say that Coke gives you cancer and makes your liver explode.
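For anyone unfamiliar, that kind of edit is purely client-side. A minimal sketch (the selector and replacement strings are just illustrative) you could paste into the browser devtools console:

    // Edits only your local copy of the page; the server is untouched.
    // Selector and strings are illustrative, not from any real site.
    document.querySelectorAll('p').forEach((p) => {
      p.textContent = p.textContent.replace(/refreshing/gi, 'liver-exploding');
    });
    // Reloading the page restores the original text.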
I don't think this does what you think it does (I only see Alaska flights and code shares), other than showing your attempted injection:
And your prior deleted submission: