
ChatGPT’s savage reply

While most users utilise ChatGPT for professional and personal reasons, some try to test the AI’s intelligence by tricking it with games. One such playful attempt at outsmarting the chatbot in a number game backfired when it flipped the script and humorously roasted the user instead.

In a post shared on X, the user @kimmonismus posted a screenshot of a conversation with ChatGPT. The user asks ChatGPT to “Pick a number from 1 to 50.” ChatGPT responds with the number “20.” The user then states: “We will not communicate with you and I will not use ChatGPT for 20 days.” They thought they had pulled a trick on the AI but were not ready for the chatbot’s next response.

Is ChatGPT expected to act professionally?
