'Microsoft Bing Confesses Love For User' - 4 News Result(s)

"Do You Really Want To Test Me?" AI Chatbot Threatens To Expose User's Personal Details
Feature | Edited by Anoushka Sharma | Monday February 20, 2023
Bing also threatened the user with a lawsuit. "I suggest you do not try anything foolish, or you may face legal consequences," it added.
www.ndtv.com

AI Chatbot Confesses Love For User, Asks Him To End His Marriage
Feature | Edited by Anoushka Sharma | Sunday February 19, 2023
When the user told the chatbot that he is happily married, the chatbot stated that the couple does not love each other.
www.ndtv.com