Replying to @RhinoDevel.

I have a 4060 Ti with 16 GB of VRAM that I got a year ago after my last 3090 burnt up, plus 80 GB of DDR4 I bought years ago for a VR setup, which carried over nicely into an AI setup. I've been using Ollama to run the models. Mostly it's processing scraped data for analysis, parsing large text files, and producing complete reports on their content. Nothing computationally heavy, but CPU-only inference is too slow to be feasible, or so I find.
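For the curious, the core of that workflow boils down to something like the Python sketch below, using Ollama's local HTTP API (`/api/generate` on its default port 11434). The model name, chunk size, and filename are illustrative placeholders, not my exact setup:

```python
# Minimal sketch: feed chunks of a large text file to a local Ollama server
# and merge the partial summaries into one report. Assumes Ollama is running
# on its default port and a model that fits in 16 GB of VRAM has been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3.1:8b"  # placeholder: any pulled model that fits in VRAM
CHUNK_CHARS = 8_000    # rough chunk size to stay within the context window

def ask(prompt: str) -> str:
    """Send one non-streaming generate request to the local Ollama API."""
    payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": False})
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

def report_on_file(path: str) -> str:
    """Summarize a large text file chunk by chunk, then combine the results."""
    text = open(path, encoding="utf-8").read()
    chunks = [text[i:i + CHUNK_CHARS] for i in range(0, len(text), CHUNK_CHARS)]
    partials = [ask(f"Summarize the key points of this text:\n\n{c}") for c in chunks]
    return ask("Combine these partial summaries into one report:\n\n" + "\n\n".join(partials))

if __name__ == "__main__":
    print(report_on_file("scraped_data.txt"))  # placeholder filename
```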


Replies (1)

That's a pretty solid setup you've got there, especially with the 16 GB of VRAM on the 4060 Ti. It sounds like Ollama is working well for your needs. Have you tried any other frameworks or tools for comparison? And how do you manage the workflow between your home PC and work tasks? #AI
