  • The video I linked is literally of Apollo.

    Apollo has understood a bug to be a little critter he could potentially eat

    How can you know that? He only knows a handful of words. The lizard probably looks more like a bug than like a cup or Wario. He’s familiar with the phrases “what’s this?” and “what made of?” If he had any real understanding, why didn’t he just ask those questions to expand his vocabulary?

    I’m a big fan of Apollo, and he’s a lot of fun to watch, but his use of language is not demonstrative of a deeper understanding.

    And regarding Koko:

    Patterson reported that Koko invented new signs to communicate novel thoughts. For example, she said that nobody taught Koko the word for “ring”, so Koko combined the words “finger” and “bracelet”, hence “finger-bracelet”.[22] This type of claim was seen as a typical problem with Patterson’s methodology, as it relies on a human interpreter of Koko’s intentions.

    Other researchers argued that Koko did not understand the meaning behind what she was doing and learned to complete the signs simply because the researchers rewarded her for doing so (indicating that her actions were the product of operant conditioning).



  • What’s in a word besides being a set of vocalizations that communicates some meaning to another creature?

    That’s why I said “complex ideas.” Like, a dog will yelp if it’s hurt or stare out the back door when it wants out, but I wouldn’t consider that “language.”

    The only difference between yelping and what Apollo is doing is that he sounds like a person.

    And maybe discussing animal psychology is getting a little too far off topic from my original point, which is that things can seem more intelligent to us when they look or sound like people.

    Like, the fact that kids can form an emotional bond with a Tamagotchi, which is no more sophisticated than a Casio wristwatch, speaks more to how humans assign intelligence to life-like things than to how intelligent a Tamagotchi is.



  • I mean, at one point he says “step up” while stepping on a branch. Nothing else he does seems terribly related to his physical actions. And this makes sense, because his brain didn’t evolve to communicate complex ideas using words.

    we can recognize as being related

    And this is my point. We’re seeing them as being related, but I think we’re doing a lot of the heavy lifting here, assigning intelligence where there may be mostly random noise. Like, if after being trained to identify objects he spent his time practicing identifying objects, that might convince me he’s doing something intelligent, but I think it’s more likely he just likes hearing himself vocalize.


  • Mostly unrelated, but since this is going to be a dunk-on-AI thread anyway, I think what’s feeding all this hubris around AI is that it’s essentially tricking us into thinking it’s intelligent. It’s an incredible tool for compressing and organizing information, but it isn’t really smart.

    And I had this thought watching a video last night of Apollo the African grey parrot. This bird holds the Guinness World Record for being able to correctly identify 12 different objects. But that he can speak a language we understand doesn’t make him any more intelligent than many other animals. And when left alone without a prompt, he’ll just mumble nonsense, or in other words, “hallucinate.” That he gets the words in order is just something he was trained to do. It’s not his natural state.

    Anyway, I feel like AI is kind of like that. Our language-based psychology makes it seem more intelligent to us than it actually is.


  • I think one major issue is that we’ve somehow run out of hardware to upgrade (except the camera, I guess), so now we’re tying software upgrades to hardware.

    Like, I was willing to play ball when Siri came out and my iPhone 4 couldn’t run it (I had to upgrade to the 4s). I knew Siri was just an app in the cloud, but I figured it needed some hardware to preprocess audio or something?

    But why the hell can’t these AI features just work on current phones? Oh, because the business model requires selling more hardware. So are we just assuming that nobody will pay money for AI assistants?