While most users turn to ChatGPT for professional and personal tasks, some try to test the AI's intelligence by tricking it with games. One such playful attempt to outsmart the chatbot in a number game ended badly for a user, who was humorously roasted instead.

In a post shared on X, the user @kimmonismus posted a screenshot of a conversation with ChatGPT. The user asks ChatGPT to “Pick a number from 1 to 50,” and ChatGPT responds with “20.” The user then states: “We will not communicate with you and I will not use ChatGPT for 20 days.” They thought they had pulled a trick on the AI, but they were not ready for the chatbot's next response.