jogai_san@lemmy.world to Selfhosted@lemmy.world · English · 8 hours ago
Self-Host Weekly (30 January 2026) — selfh.st
19 comments
irmadlad@lemmy.world · English · 3 hours ago
If I had the proper equipment, I'd run AI if it were self-contained and not pinging out to another LLM.
David J. Atkinson@c.im · 3 hours ago
@irmadlad @selfhosted That is precisely the challenge. I'm not sure it is possible.
irmadlad@lemmy.world · English · 18 minutes ago
I mean, I can run a few of the private AI stacks, but they're so excruciatingly slow that it's not worth the time. I'd want something fairly responsive.