I don’t disagree with the ambiguity here, but let’s focus on LLMs.
I don’t understand the argument that an LLM is more conscious than traditional software. Take, for example, a spreadsheet with a timed macro.
A spreadsheet with a timed macro stores state in both short-term memory (RAM) and long-term memory (the spreadsheet table itself), and it accepts many kinds of input. The macro timer gives it repeated, continuing execution and a chance to process both past and present inputs with some complexity. (A rough sketch of the pattern I mean follows.)
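For the curious, here is a rough Python analogue of that timed-macro loop, not real spreadsheet code; the file name, timer interval, and the constant "new input" are made-up assumptions purely for illustration:

    # Rough analogue of a spreadsheet with a timed macro (illustrative only;
    # the file name, interval, and input value are arbitrary assumptions).
    import csv
    import time

    TABLE = "sheet.csv"      # long-term memory: the table persisted on disk
    INTERVAL_SECONDS = 5     # the macro timer

    def load_rows():
        # Read the persisted table; a missing file just means no history yet.
        try:
            with open(TABLE, newline="") as f:
                return list(csv.reader(f))
        except FileNotFoundError:
            return []

    def macro_tick(rows, new_input):
        # Short-term memory: whatever this function holds in RAM while it runs.
        # It combines past rows (long-term state) with the present input.
        total_so_far = sum(float(r[1]) for r in rows)
        rows.append([time.strftime("%H:%M:%S"), str(new_input),
                     str(total_so_far + new_input)])
        return rows

    def save_rows(rows):
        # Write the updated table back, so state survives into the next tick.
        with open(TABLE, "w", newline="") as f:
            csv.writer(f).writerows(rows)

    if __name__ == "__main__":
        while True:                                  # repeated, continuing execution
            rows = load_rows()
            rows = macro_tick(rows, new_input=1.0)   # stand-in for fresh input
            save_rows(rows)
            time.sleep(INTERVAL_SECONDS)

Each tick reads what it wrote before, folds in the new input, and writes the result back, which is the whole point of the comparison.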
An LLM has limited short-term memory (its current place in the computation, held in VRAM for the duration of a single call) and zero long-term memory of its own. It accepts only one input at a time, is stateless between calls, and cannot recall past experience to construct a chain of consciousness.
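For contrast, a minimal sketch of what "stateless between calls" means in practice, assuming a hypothetical generate() placeholder standing in for whatever inference API is actually used; any appearance of memory is the caller replaying history, not the model remembering:

    # Hypothetical, simplified caller for a stateless LLM. `generate` is a
    # stand-in, not a real API; the model retains nothing between invocations.
    from typing import List

    def generate(prompt: str) -> str:
        # Placeholder: a real call would run the model on `prompt`.
        return "<model reply to: " + prompt[-40:] + ">"

    def chat_turn(history: List[str], user_message: str) -> str:
        # The only "memory" is whatever the caller packs back into the prompt.
        prompt = "\n".join(history + [user_message])
        reply = generate(prompt)
        history.extend([user_message, reply])  # kept by the caller, not the model
        return reply

    if __name__ == "__main__":
        history: List[str] = []
        print(chat_turn(history, "Hello"))
        # Only answerable because the caller re-sends the whole history:
        print(chat_turn(history, "What did I just say?"))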
By those measures, the spreadsheet is more plausibly conscious than the LLM. But no one argues for spreadsheet consciousness because it’s ridiculous. People argue for LLM consciousness simply because we sentient, messy meat bags are evolutionarily programmed to construct theories of mind for everything, and in doing so we mistakenly personify things that remind us of ourselves. LLMs simply feel “alive-ish” enough to pass our evolutionary sniff test.
But in reality, they’re less conscious than the spreadsheet.