Man, there’s some lore behind Nvidia support on Linux. The short of it: Nvidia are assholes; they could pretty much give Linux users an experience on par with Windows, but don’t want to. Things have generally been improving over the last 5 years or so, though. CUDA and PCI passthrough also usually work, so getting a cheap Intel Arc to draw your DE and using the Nvidia card for the heavy lifting is a relatively cheap, ‘drop-in’ fix for your workflow.
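If you go that route, here’s a minimal sketch of what “Nvidia for the heavy lifting” looks like in practice. It assumes PyTorch built with CUDA support and that your Nvidia card is CUDA device 0 (the Arc never shows up to CUDA, so the desktop and compute stay cleanly separated):

# Minimal sketch, assuming PyTorch with CUDA and the Nvidia card at index 0.
import os

# Pin compute to the Nvidia card before CUDA initializes; this mostly
# matters on boxes with more than one Nvidia GPU, but being explicit is cheap.
os.environ.setdefault("CUDA_VISIBLE_DEVICES", "0")

import torch

print(torch.cuda.is_available())      # True if the Nvidia driver stack is healthy
print(torch.cuda.get_device_name(0))  # should name the Nvidia card, not the Arc

Your DE keeps rendering on the Intel card the whole time, so a crashed or busy CUDA workload doesn’t take the desktop down with it.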
Was using CUDA a problem?
Does someone want to convince me that the whole AI industry is training their models on Windows machines?
Yeah, I know.
But the thing is, while I understand that for hobbyists this is all entertaining Linux gossip that feeds YouTube videos, it’s very, very far from what a normal user expects.
Which is, you know… no lore at all.
For the record, I ran an A770 on Linux for a bit on fairly recent drivers, and there was definitely something wrong (frame pacing? straight-up worse performance? I’m not sure). I should give it another shot one of these days.