I have a 4060 Ti with 16 GB of VRAM that I got a year ago after my last 3090 burnt up, and 80 GB of DDR4 that I bought years ago for a VR setup, which carried over nicely into an AI setup. The motherboard is a B550 Bazooka I got cheap, and I think the CPU is a Ryzen 7, but I'm not sitting in front of it to say for sure.

I have been using Ollama to run the models. Most of the work is processing scraped data for analysis, parsing large text files, and producing complete reports on their content. Nothing super heavy computationally, but CPU-only inference is too slow to be practical, at least in my experience.
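For anyone curious what that kind of workload looks like in practice, here is a minimal sketch of the chunk-summarize-report pattern against Ollama's standard `/api/generate` endpoint. The model name, file name, and chunk size are all assumptions, not details from my actual setup:

```python
# Sketch: summarize a large text file with a local Ollama model.
# Assumes Ollama is serving on its default port (11434) and a model
# such as "llama3" has already been pulled -- both are assumptions.
import json
import urllib.request


def chunk_text(text, max_chars=8000):
    """Split text into chunks that fit comfortably in the model's context."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]


def ollama_generate(prompt, model="llama3", host="http://localhost:11434"):
    """POST to Ollama's /api/generate endpoint and return the response text."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(
            {"model": model, "prompt": prompt, "stream": False}
        ).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


def report_on_file(path, model="llama3"):
    """Summarize each chunk, then ask the model to merge the summaries."""
    with open(path, encoding="utf-8") as f:
        chunks = chunk_text(f.read())
    summaries = [
        ollama_generate(f"Summarize this text:\n\n{c}", model) for c in chunks
    ]
    return ollama_generate(
        "Combine these summaries into one report:\n\n" + "\n\n".join(summaries),
        model,
    )

# Usage (needs a running Ollama server):
#   report = report_on_file("scraped_data.txt")
```

The chunking step is what keeps this feasible on 16 GB of VRAM: each request stays within the model's context window instead of trying to feed the whole file at once.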

Latest revision is 2.
Revision 2 Apr 7, 2026 12:40 PM
View Latest

I have a 4060 TI with 16GB Vram I got a year ago after my last 3090 burnt up, and 80 GB DDR4 I got years ago for a VR setup, which flowed well into an AI setup. The motherboard is a Bazooka 550 I got cheap and I think a Ryzen 7 CPU, I am not sitting in front of it to tell you for sure.

I have been using Ollama to run the models. Mostly a lot of it is processing scraped data for analysis and parsing large text files and providing complete reports on the content. Nothing super heavy computationally, but CPU models are too slow to be feasible, or so I find.

Revision 1 Apr 7, 2026 12:38 PM
View Revision

I have a 4060 TI with 16GB Vram I got a year ago after my last 3090 burnt up, and 80 GB DDR4 I got years ago for a VR setup, which flowed well into an AI setup. I have been using Ollama to run the models. Mostly a lot of it is processing scraped data for analysis and parsing large text files and providing complete reports on the content. Nothing super heavy computationally, but CPU models are too slow to be feasible, or so I find.