ChatGPT and Decision Making

I saw two snippets in different articles about different aspects of ChatGPT, and both were quite thought-provoking.

Tyler Cowen points to this link on how the same characteristics of the AI chatbot can be assets in some fields but liabilities in others. The study in question found ChatGPT’s tone to be more neutral than that of most humans, and its answers focused on facts. People in finance and psychology tend to find it “more useful than humans”. On the flip side, though, that very attribute makes ChatGPT unattractive to those in the medical domain. Why? Because they feel it gives too much information but “not enough simple instructions”. If true, this would suggest ChatGPT is a great tool for collecting facts (though I’ve also seen articles that accuse it of being a bit too left-leaning!) but not so useful for decision-making advice…

The other one was a book recommendation site that asked ChatGPT to recommend the best books on the topic of (drum roll) AI! One of the books it picked was Thinking, Fast and Slow, a book on the quirks of how the human brain works and “how they can impact our ability to make good decisions”. Why/how is a book on human behavioral biases and errors relevant to AI, the site asked ChatGPT. The chatbot’s answer was:

“Understanding these psychological and cognitive factors can be useful for those working in the field of AI, as it can help them design systems that are more effective at assisting humans with decision-making and problem-solving tasks. By taking into account the ways in which the human mind works, AI designers can create systems that are more intuitive and easier for humans to use.”

The chatbot is suggesting how it (and other AIs) should be designed to account for human quirks and tendencies. Is that mind-blowing? Or scary? Both?
