• Lvxferre [he/him]@mander.xyz · 4 points · 1 day ago

    There's no previous context to speak of; each screenshot shows a self-contained "conversation", with no earlier input or output. And there's no history to clear, since Gemini app activity is not even turned on.

    And even with your suggested prompt, one of the issues is still there:

    The other issue is not being tested in this shot as it's language-specific, but it is relevant here because it reinforces that the issue is in the training, not in the context window.

    • brucethemoose@lemmy.world · 4 points · 1 day ago

      Was just a guess. The AI is still shitty, lol.

      What I'm trying to get at is the misconception that AI can't generate novel content outside its training dataset. It can: an astronaut riding a horse is the classic test case, an image that did not exist anywhere before diffusion models. By the same logic it should be able to extrapolate a fuller wine glass. It's just too dumb to do it, lol.