{"id":4868,"date":"2026-01-09T16:09:19","date_gmt":"2026-01-09T15:09:19","guid":{"rendered":"https:\/\/ba.be\/uncategorized\/chatgpt-is-bullshit\/"},"modified":"2026-01-30T13:18:57","modified_gmt":"2026-01-30T12:18:57","slug":"chatgpt-is-bullshit","status":"publish","type":"post","link":"https:\/\/ba.be\/en\/front-en\/chatgpt-is-bullshit\/","title":{"rendered":"ChatGPT is bullshit"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"4868\" class=\"elementor elementor-4868 elementor-4711\" data-elementor-post-type=\"post\">\n\t\t\t\t<div class=\"elementor-element elementor-element-3021938 e-flex e-con-boxed e-con e-parent\" data-id=\"3021938\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-6c1c75d elementor-widget elementor-widget-text-editor\" data-id=\"6c1c75d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>What bothers me most about the debate is the use of the term hashtag #hallucineren in the context of LLMs. It was even voted (digital) word of the year in the Netherlands, but it doesn&#8217;t quite capture the essence. <\/p><p>LLMs don&#8217;t drink, don&#8217;t take drugs, and can&#8217;t have fevers. The term is essentially a way to justify what an LLM actually does: lie when they don&#8217;t know the answer and then, like a cunning con artist, pass off their fiction as fact. <\/p><p>Incidentally, I fall into the same trap here by humanizing AI; the system is not conscious. According to researchers in Glasgow, these models sell bullshit in the philosophical sense of the word: they provide information without regard for the truth. Or, better yet, they simply generate text without the intention of telling the truth.  
<\/p><p>In Shakespeare&#8217;s language it sounds even sharper: &#8220;the models are in an important way indifferent to the truth of their outputs&#8221;.<\/p><p>It remains incredibly difficult to avoid anthropomorphism. We seem so eager to attribute consciousness and human-like qualities to AI, while under the hood it&#8217;s just empty zeros and ones. <\/p><p>Interesting reading: <a href=\"https:\/\/lnkd.in\/e62zXn2Z\" target=\"_blank\" rel=\"noopener\">Michael Townsen Hicks, James Humphries &amp; Joe Slater, &#8220;ChatGPT is bullshit&#8221;<\/a> <\/p><p><a href=\"https:\/\/www.linkedin.com\/posts\/janguldentops_chatgpt-is-bullshit-ethics-and-information-activity-7415114925280411648-9831\/?utm_source=social_share_send&amp;utm_medium=member_desktop_web&amp;rcm=ACoAAAATmswBNh40vouXGOAUsNhL7mwTzQVYaLU\" target=\"_blank\" rel=\"noopener\">originally posted on LinkedIn<\/a><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>What bothers me most about the debate is the use of the term #hallucineren (Dutch for &#8220;hallucinating&#8221;) in the context of LLMs. 
It was even voted (digital) word of the year in [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":2533,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[76,78,80],"tags":[],"class_list":["post-4868","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-front-en","category-2-the-bad-en","category-3-the-ugly-en"],"_links":{"self":[{"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/posts\/4868","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/comments?post=4868"}],"version-history":[{"count":1,"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/posts\/4868\/revisions"}],"predecessor-version":[{"id":4869,"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/posts\/4868\/revisions\/4869"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/media\/2533"}],"wp:attachment":[{"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/media?parent=4868"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/categories?post=4868"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ba.be\/en\/wp-json\/wp\/v2\/tags?post=4868"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}