Open Interpreter: using a locally deployed LLM

Shekhar Khandelwal
2 min readJan 18, 2024

Open Interpreter GitHub page — https://github.com/KillianLucas/open-interpreter/

To use an open-source, locally deployed model with Open Interpreter, you need an LLM running on your local machine. You can use either LM Studio or Ollama for this.

Ollama — https://medium.com/@khandelwal-shekhar/deploy-llm-locally-using-ollama-277136e4b1f6

LM Studio — https://lmstudio.ai/

Install open-interpreter:

pip install open-interpreter

Open a terminal and run:

interpreter

It asks for an OpenAI API key, since by default it runs against the OpenAI API. If you want to run it with a local model instead, run:

interpreter --local

For this to work, you must have an LLM served by LM Studio on your machine.
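Under the hood, LM Studio's local server exposes an OpenAI-compatible API (by default at http://localhost:1234/v1), and that is what Open Interpreter talks to. As a rough sketch (the model name and prompt below are placeholders, not values from this article), the requests it sends are standard chat-completion payloads:

```python
import json

def build_chat_payload(prompt, model="local-model"):
    """Build an OpenAI-style chat-completion request body,
    like those sent to LM Studio's /v1/chat/completions endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

payload = build_chat_payload("List the files in the current directory")
print(json.dumps(payload, indent=2))
```

Any server that accepts this request shape (LM Studio, Ollama, etc.) can sit behind Open Interpreter's local mode.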

Now ask a question at the interpreter prompt.

You can ask the interpreter to perform all kinds of tasks, ranging from writing and executing code to drafting content such as blogs. For better responses, it is important to use task-specific LLMs. Also, the bigger the LLM, the better the responses. Hence, the quality of the results depends heavily on your machine's computing power, since you are running the LLM locally.

Python wrapper for open-interpreter

To replicate --local and connect to LM Studio, use these settings:

from interpreter import interpreter

interpreter.offline = True # Disables online features like Open Procedures
interpreter.llm.model = "openai/x" # Tells OI to send messages in OpenAI's format
interpreter.llm.api_key = "fake_key" # LiteLLM, which we use to talk to LM Studio, requires this
interpreter.llm.api_base = "http://localhost:1234/v1" # Point this at any OpenAI compatible server
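If you deployed with Ollama instead, the same settings apply, since Ollama also exposes an OpenAI-compatible endpoint (default port 11434). The sketch below is a config fragment under that assumption; the model name is hypothetical and should match whatever you pulled with `ollama pull`. Once configured, you can start a conversation programmatically with interpreter.chat():

```python
from interpreter import interpreter

interpreter.offline = True                              # Disables online features
interpreter.llm.model = "openai/llama2"                 # Hypothetical: "openai/" prefix keeps OpenAI format; suffix is your pulled model
interpreter.llm.api_key = "fake_key"                    # Required by LiteLLM, unused by the local server
interpreter.llm.api_base = "http://localhost:11434/v1"  # Ollama's OpenAI-compatible endpoint

interpreter.chat("How many files are in the current directory?")
```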

Happy Learning!

