We also took the local large model in two different directions and ran into quite a few unexpected surprises along the way. Ollama really isn't complicated, so I hope everyone gives it a try and truly embraces AI!
It's available for both Windows and Mac, so have fun with it.
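Beyond the chat window, Ollama also exposes a local REST API on port 11434, which makes it easy to script. Here is a minimal Python sketch; the model name qwen2.5:1.5b is just an example, swap in whichever model you actually pulled:

```python
# Minimal sketch: talk to a locally running Ollama over its REST API.
# Assumes Ollama is running on the default port 11434 and the model has
# already been pulled (e.g. `ollama pull qwen2.5:1.5b`).
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

def ask(prompt: str, model: str = "qwen2.5:1.5b") -> str:
    """Send a single prompt and return the full, non-streamed reply."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask("Explain in one sentence what Ollama does."))
```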
About the choice of model
If you don't have a GPU, go with the smallest 1.5B model; the load is a bit lighter than with LM Studio.
Beyond that, just experiment yourself and find the model whose size just about fills your video memory; a rough estimate is sketched below.
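To get a feel for "just about fills the video memory", a back-of-the-envelope estimate helps: a quantized model needs roughly parameter count × bits per weight ÷ 8 bytes for the weights, plus some headroom for the KV cache and runtime. The numbers in this sketch are assumptions, not measurements:

```python
# Rough VRAM estimate for a quantized model (sketch only; real usage also
# depends on context length, KV cache size, and the runtime's own overhead).
def estimate_vram_gb(params_billion: float, bits_per_weight: int = 4,
                     overhead_gb: float = 1.5) -> float:
    weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb + overhead_gb

for size in (1.5, 7, 14, 32):
    print(f"{size:>5}B @ 4-bit ≈ {estimate_vram_gb(size):.1f} GB VRAM")

# On a 12 GB card like the 4070 Super, a 4-bit 14B model is roughly the
# practical ceiling; a 1.5B model runs comfortably even without a GPU.
```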
For reference, here is the configuration of my Windows PC:
Configuration list
Processor: Intel Core i5-13600KF
Graphics card: "Gaming Judge" RTX 4070 Super
SSD: Armor SD10 1TB
Memory: DDR5 6000 MHz 16GB × 2
I've put a download link that works in China in the comments section; grab it there if you need it.