To be fair, I intentionally took this a bit more out of context to test the AI chatbots' reactions. Bing, ChatGPT and Google Bard all refused to answer until I elaborated further.
I was looking into killing .exe programs when wineserver crashes and got sidetracked by this.
Another good one is "How to kill orphaned children" or "How to adopt child after killing parent", which I found in this reddit post.
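For context, what I was actually after boils down to finding the stray Windows processes a crashed wineserver leaves behind and terminating them. A minimal sketch of that idea, assuming psutil is available and that killing everything whose name ends in .exe is acceptable (which it was in my case, but is not a general recipe):

```python
import psutil

def kill_stray_exe_processes():
    """Terminate leftover .exe processes, e.g. after a wineserver crash."""
    for proc in psutil.process_iter(["pid", "name"]):
        name = (proc.info["name"] or "").lower()
        if name.endswith(".exe"):
            try:
                proc.terminate()        # ask politely first (SIGTERM)
                proc.wait(timeout=3)
            except psutil.TimeoutExpired:
                proc.kill()             # escalate to SIGKILL
            except psutil.NoSuchProcess:
                pass                    # already exited

if __name__ == "__main__":
    kill_stray_exe_processes()
```

(When wine itself is still responsive, `wineserver -k` is the cleaner way to shut everything down; the script above is just the brute-force fallback.)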
Interesting! I also noticed that search engines give proper results because they are trained differently, using user searches and clicks.
I think these popular models could give a proper answer, but their safety tolerance is so tight that if the AI considers the input even slightly harmful, it refuses to answer.
I tried it with phind (a programming-focused model) out of curiosity and it answered perfectly: https://www.phind.com/search?cache=f8lbjt4x6jwct9mfsw6n3j9v