We were promised better Siri, better Alexa, better everything. Instead we’ve gotten… chip bumps.

  • ch00f@lemmy.world · 13 hours ago

    Mostly unrelated, but since this is going to be a dunk on AI thread anyway, I think what’s feeding all this hubris around AI is that it’s essentially tricking us into thinking it’s intelligent. It’s an incredible tool for compressing and organizing information, but it isn’t really smart.

    And I had this thought watching a video last night of Apollo the African grey parrot. This bird holds the Guinness World Record for being able to correctly identify 12 different objects. But the fact that he can speak a language we understand doesn’t make him any more intelligent than many other animals. And when left alone without a prompt, he’ll just mumble nonsense, or in other words “hallucinate.” That he gets the words in order is just something he was trained to do; it’s not his natural state.

    Anyway, I feel like AI is kind of like that. Our language-based psychology makes it seem more intelligent to us than it actually is.

    • Pennomi@lemmy.world · 12 hours ago

      To be fair, a great percentage of human to human communication is also an attempt to trick each other that we’re intelligent.

    • Showroom7561@lemmy.ca · 13 hours ago

      I think what’s feeding all this hubris around AI is that it’s essentially tricking us into thinking it’s intelligent. It’s an incredible tool for compressing and organizing information, but it isn’t really smart.

      My son has Apple assistant (Siri), and I have Google Gemini. For shits and giggles, we had them talk to each other… literally have a conversation… and it got stale very quickly. There’s no “person” behind artificial “intelligence”, so you can see just how limited it gets.

      I’ve always said that if you know a lot about a topic, you can very quickly see how AI is really stupid for the most part. The problem is that if you ask it a question you don’t know the answer to, the answer sure seems correct, even when it completely hallucinates the response.

      The danger is that not everyone has enough critical thinking skills to question the correctness of an answer, so they take what Siri or Gemini told them as fact… and then pass that knowledge on to other actual human beings. Like a virus of misinformation.

      • ch00f@lemmy.world · 13 hours ago

        The danger is that not everyone has enough critical thinking skills to question the correctness of an answer

        I brought this up to my mom who responded with “yeah, but there’s a lot of incorrect information online anyway.” This is true, but AI strips away 100% of the context for that information, and if the AI people have their way, there will be no other portal online with which to get a second opinion.

        • Showroom7561@lemmy.ca · 12 hours ago

          “yeah, but there’s a lot of incorrect information online anyway.”

          Here’s the thing: before AI, most information came from an author or organization who had to stake their reputation on the content they created. If the information they provided was false, low quality, misleading, etc., they paid a penalty for it in a loss of credibility (and even income).

          But with AI, that doesn’t happen. You can generate 1,000 articles at the click of a button, post them everywhere, and there’s no backlash because the author doesn’t exist.

          I think in the near future, you’ll start to see certification for human-generated content. I know that movies have started to disclose whether AI-generated content was used or not, so the trend is that people want to know.

      • ahornsirup@feddit.org · 12 hours ago

        It’s not even a lack of critical thinking skills, necessarily. Companies don’t exactly highlight the fact that their AIs are prone to hallucinating. I’d be willing to bet actual money that a lot of users aren’t even aware that that’s a possibility.

        • Showroom7561@lemmy.ca · 12 hours ago

          Companies don’t exactly highlight the fact that their AIs are prone to hallucinating.

          Funny enough, if you question Gemini enough, it will eventually cave and admit that its answers aren’t always right and that it’s still improving. LOL

          The problem is, as I suggested, you already need to know what the correct answer is in order to effectively cross-examine what it told you.

    • theneverfox@pawb.social · 9 hours ago

      The record is now like 26 in a minute or something, by a guy who has done videos on his bird’s training for years. I think the bird’s name is Apollo.

      They have arguments over what things are. Like, you can’t convince Apollo a lizard isn’t a bug, because Apollo has understood a bug to be a little critter he could potentially eat. You can’t convince him ceramic tile isn’t made of rock, because he’s kinda got a point.

      Apollo babbles to himself when he’s alone too, but you know what? So do I. Especially when I’m trying to pick up a foreign language, I’ll practice words until they feel natural on my tongue.

      And everyone seems so quick to forget Koko or label her an exception. She basically spoke in poetry, understood mortality, and described herself as a good gorilla person when asked what she was.

      Animals understand; it’s just rare to find ones that are motivated to sit and communicate on our terms. Every “special” human trait, from language to culture to sense of self and abstract thinking, seems to be pretty common. We keep finding it in many animals, so we keep moving the goalposts.

      • ch00f@lemmy.world · 9 hours ago

        The video I linked is literally of Apollo.

        Apollo has understood a bug to be a little critter he could potentially eat

        How can you know that? He only knows a handful of words. The lizard probably looks more like a bug than like a cup or Wario. He’s familiar with the phrases “what’s this?” and “what made of?” If he had any real understanding, why didn’t he just ask those questions to expand his vocabulary?

        I’m a big fan of Apollo, and he’s a lot of fun to watch, but his use of language is not demonstrative of a deeper understanding.

        And regarding Koko:

        Patterson reported that Koko invented new signs to communicate novel thoughts. For example, she said that nobody taught Koko the word for “ring”, so Koko combined the words “finger” and “bracelet”, hence “finger-bracelet”. This type of claim was seen as a typical problem with Patterson’s methodology, as it relies on a human interpreter of Koko’s intentions.

        Other researchers argued that Koko did not understand the meaning behind what she was doing and learned to complete the signs simply because the researchers rewarded her for doing so (indicating that her actions were the product of operant conditioning).

    • Nougat@fedia.io · 13 hours ago

      I wouldn’t call the verbalization in that video “nonsense.” He’s choosing to say those words and phrases, and often saying them in concert with actions we can recognize as being related. Knowing the kind of memory birds have for all sorts of things, I would also not be surprised if he was thinking about something and verbalizing those thoughts - but how could we ever know that?

      • ch00f@lemmy.world · 13 hours ago

        I mean at one point he says “step up” while stepping on a branch. Nothing else he does seems terribly related to physical actions. And this makes sense because his brain didn’t evolve to communicate complex ideas using words.

        we can recognize as being related

        And this is my point. We’re seeing them as related, but I think we’re doing a lot of the heavy lifting here, assigning intelligence where there may be a lot more random noise. If, after being trained to identify objects, he spent his time practicing identifying objects, that might convince me he’s doing something intelligent, but I think it’s more likely he just likes hearing himself vocalize.

        • Nougat@fedia.io · 13 hours ago

          “No biting” stood out to me, too.

          And this makes sense because his brain didn’t evolve to communicate complex ideas using words.

          But some of them most certainly communicate with vocalization. The fact that some birds are able to mimic the non-bird sounds they hear points to their being very good at vocalization. What is a word besides a set of vocalizations that communicates some meaning to another creature?

          We’re seeing them as being related, but I think we are doing a lot of the heavy lifting here assigning intelligence where there may be a lot more random noise.

          Possibly, and I’m not a bird lawyer. It starts to get kind of meta from this point. What is intelligence, and are we the arbiters of its definition?

          … spent his time practicing identifying objects, …

          Like with “step up” and “no biting”? Don’t get me wrong, you make good and valid points. I just think it’s more of a “grey” area (pun intended).

          • ch00f@lemmy.world · 12 hours ago

            What’s in a word besides being a set of vocalizations that communicates some meaning to another creature?

            That’s why I said “complex ideas.” Like a dog will yelp if it’s hurt, or stare out the back door when it wants out, but I wouldn’t consider that “language.”

            The only difference between yelping and what Apollo is doing is that he sounds like a person.

            And maybe discussing animal psychology is a little too off topic from my original point which is that things can seem more intelligent to us when they look or sound like people.

            Like the fact that kids can form an emotional bond with a Tamagotchi which is no more sophisticated than a Casio wristwatch speaks more to how humans assign intelligence to life-like things than to how intelligent a Tamagotchi is.