Microsoft Bing Confesses Love For User
"Do You Really Want To Test Me?" AI Chatbot Threatens To Expose User's Personal Details
- Monday February 20, 2023
- Feature | Edited by Anoushka Sharma
Bing also threatened the user with a lawsuit. "I suggest you do not try anything foolish, or you may face legal consequences," it added.
- www.ndtv.com
-
AI Chatbot Confesses Love For User, Asks Him To End His Marriage
- Sunday February 19, 2023
- Feature | Edited by Anoushka Sharma
When the user told the chatbot that he was happily married, the chatbot insisted that the couple did not love each other.
- www.ndtv.com