Ollama is the single easiest way to run AI models on your laptop

[ollama.ai]

[Github Ollama Project]

[Python Ollama Project]

pip install ollama

Add your local Python Scripts folder to your PATH. Depending on how Python was installed (Microsoft Store package or the regular installer), it will be in a location like one of these:

%localappdata%\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\Scripts

%localappdata%\Programs\Python\Python38\Scripts
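
If you're not sure which one applies on your machine, Python can report its own script directories. A quick check using the standard sysconfig module (the nt_user scheme should correspond to per-user installs like the Store Python path above, but treat that as an assumption):

# print the Scripts directories used for system-wide and per-user installs
import sysconfig
print(sysconfig.get_path("scripts"))             # system-wide install
print(sysconfig.get_path("scripts", "nt_user"))  # per-user install on Windows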

Models are downloaded to %userprofile%\.ollama\models.
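
The downloads can run to several gigabytes each, so it's handy to see what's on disk. A minimal sketch, assuming the default location above and that models are stored as ordinary files under it:

# list downloaded models and their approximate sizes (assumes the default .ollama\models location)
from pathlib import Path

models_dir = Path.home() / ".ollama" / "models"
for f in sorted(models_dir.rglob("*")):
    if f.is_file():
        print(f"{f.relative_to(models_dir)}  {f.stat().st_size / 1e9:.1f} GB")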

ollama search

Found 6 available models:
orca_mini_3b
orca_mini_7b
orca_mini_13b
replit_code_3b
nous_hermes_13b
wizard_vicuna_13b_uncensored

ollama run wizard_vicuna_13b_uncensored

By default the CLI echoes your prompt back (the ">>>" line) before each response. You can change that by commenting out the print call in the generate() function in:

%localappdata%\Programs\Python\Python38\Lib\site-packages\ollama\cmd\cli.py

def generate(*args, **kwargs):
    if prompt := kwargs.get("prompt"):
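        # commenting out this print stops the CLI from echoing ">>> <prompt>" before each response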
        #print(">>>", prompt, flush=True)
        generate_oneshot(*args, **kwargs)
        return

    if sys.stdin.isatty():
        return generate_interactive(*args, **kwargs)

    return generate_batch(*args, **kwargs)
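
Since generate() falls through to generate_batch() when stdin isn't a terminal, you should also be able to pipe a prompt in instead of using the interactive REPL. Untested here, but based on that code path, something like:

echo "Write a haiku about laptops" | ollama run wizard_vicuna_13b_uncensored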