# docker-langchain

**Repository Path**: beyond-prototype/docker-langchain

## Basic Information

- **Project Name**: docker-langchain
- **Description**: Docker to run langchain and LLM
- **Primary Language**: Python
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2023-12-31
- **Last Updated**: 2024-01-01

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Docker for langchain

## 1. Define Dockerfile

Create a file called [Dockerfile](Dockerfile) in the root of the project and add the following contents.

### 1.1 Install Python

Rather than installing Python on the local machine, we can use Docker to create a container with Python installed.

```dockerfile
# Use an official Python runtime as a parent image
FROM python:3.10-slim AS base
```

Change the pip package index if necessary. Here the Tsinghua mirror is used as an example to speed up package downloads.

```dockerfile
RUN pip config set global.index-url https://pypi.tuna.tsinghua.edu.cn/simple
RUN pip config set install.trusted-host pypi.tuna.tsinghua.edu.cn
```

Upgrade pip to the latest version.

```dockerfile
RUN pip3 install --upgrade pip
```

### 1.2 Install langchain and its dependencies

```dockerfile
RUN pip install requests==2.31.0
RUN pip install beautifulsoup4==4.12.2
RUN pip install langchain==0.0.352
RUN pip install transformers==4.25.1
```

### 1.3 Install pymilvus and its dependencies

Pymilvus is the official Python client for Milvus (a well-known open-source vector database).

```dockerfile
RUN pip install grpcio==1.37.1
RUN pip install grpcio-tools==1.37.1
RUN pip install pymilvus==2.3.0
```

### 1.4 Create a volume for the application

```dockerfile
VOLUME [ "/app" ]
```

We put the entrypoint script and the other Python scripts in the host machine's **[./app](app)** folder and mount it into the container at runtime, so the container can access the scripts from the host machine.

### 1.5 Define the entrypoint

```dockerfile
ENTRYPOINT ["/bin/bash", "/app/entrypoint.sh"]
```

When the container starts, the entrypoint script [app/entrypoint.sh](app/entrypoint.sh) is launched to execute the Python scripts, e.g. hello_langchain.py below.

```bash
#!/bin/bash
python3 /app/hello_langchain.py
```

## 2. Build the Docker image

```bash
docker build -t bp/langchain:0.0.2 .
```

## 3. Run the Docker container

```bash
docker run --net host -v ./app:/app bp/langchain:0.0.2
```

Please note that we use **--net host** to share the host machine's network, so the container can access services on the host. We use **-v ./app:/app** to mount the host machine's **[./app](app)** folder into the container at runtime.

## 4. Docker Compose

### 4.1 Docker Ollama

The ollama service is a Docker container that runs an Ollama-served model such as Mistral. Please refer to [docker-compose-ollama.yml](docker-compose-ollama.yml).

```yml
services:
  ollama:
    container_name: ollama
    image: ollama/ollama
    volumes:
      - ${DOCKER_VOLUME_DIRECTORY:-.}/volumes/ollama:/root/.ollama
    ports:
      - "11434:11434"

networks:
  default:
    name: ollama
```

### 4.2 Langchain

The langchain container that runs the application scripts is defined in [docker-compose.yml](docker-compose.yml) below.

```yaml
include:
  - ./docker-compose-ollama.yml

services:
  langchain:
    image: bp/langchain:0.0.2
    container_name: bplangchain
    volumes:
      - ./app:/app
```

We include the ollama compose file in docker-compose.yml so that the langchain container can access the ollama container.
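Since both services sit on the same Compose network, the langchain container can resolve the Ollama API at `http://ollama:11434`. As a quick sanity check, LangChain's own `Ollama` LLM wrapper can be pointed at that address. This is a minimal sketch rather than a script shipped in this repository, and it assumes the `mistral:instruct` model has already been pulled into the ollama container (e.g. with `ollama pull mistral:instruct`).

```python
# Hedged sketch: call the Ollama service through LangChain's Ollama wrapper.
# Assumes langchain==0.0.352 (as installed in the Dockerfile) and that the
# mistral:instruct model is already available inside the ollama container.
from langchain.llms import Ollama

# "ollama" is the compose service name, resolvable inside the compose network.
llm = Ollama(base_url="http://ollama:11434", model="mistral:instruct")

print(llm("who are you"))
```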
We can call the ollama container directly from the langchain container using its service name **ollama**. Please refer to [app/hello_mistral.py](app/hello_mistral.py), which will be executed by the langchain container.

```python
import requests

# Set the URL for the Ollama API
url = "http://ollama:11434/api/generate"
prompt = "who are you"

# Make a POST request to the Mistral language model API with the input text
response = requests.post(url, json={"model": "mistral:instruct", "prompt": prompt})
```

Please remember to update [app/entrypoint.sh](app/entrypoint.sh) to execute the [hello_mistral.py](app/hello_mistral.py) script.

```bash
#!/bin/bash
python3 -c 'print("Hello, langchain 2024! ")'
python3 /app/hello_mistral.py
```

Finally, use the following command to start the containers.

```bash
sudo docker compose -f docker-compose.yml up -d
[+] Running 3/3
 ✔ Network ollama         Created  0.0s
 ✔ Container ollama       Started  0.0s
 ✔ Container bplangchain  Started  0.1s
```

We can check the logs from the langchain container to see the response from Mistral.

```bash
sudo docker compose -f docker-compose.yml logs -f langchain
bplangchain | Hello, langchain 2024!
bplangchain | Hello, Mistral!
bplangchain | 200
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:10.228181712Z","response":" I","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:10.419699004Z","response":" am","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:10.618428004Z","response":" an","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:10.807201921Z","response":" artificial","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:10.995209962Z","response":" intelligence","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:11.182526462Z","response":" designed","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:11.369557004Z","response":" to","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:11.556764629Z","response":" assist","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:11.744266379Z","response":" with","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:11.931092046Z","response":" various","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:12.117416463Z","response":" tasks","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:12.305158255Z","response":" and","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:12.565411171Z","response":" answer","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:12.751554255Z","response":" questions","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:12.938195463Z","response":" to","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:13.124278588Z","response":" the","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:13.310953047Z","response":" best","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:13.497314339Z","response":" of","done":false}
bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:13.688855047Z","response":" my","done":false}
{"model":"mistral:instruct","created_at":"2024-01-01T07:52:13.875354089Z","response":" ability","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:14.06135863Z","response":".","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:14.248503464Z","response":" I","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:14.444638131Z","response":" don","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:14.631600631Z","response":"'","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:14.819221381Z","response":"t","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:15.005906423Z","response":" have","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:15.192520381Z","response":" a","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:15.394447131Z","response":" physical","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:15.581238756Z","response":" form","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:15.768074548Z","response":" or","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:15.955235673Z","response":" personal","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:16.142249048Z","response":" identity","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:16.329051257Z","response":",","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:16.515602382Z","response":" but","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:16.702644173Z","response":" I","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:16.889553882Z","response":" can","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:17.076452799Z","response":" process","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:17.263071715Z","response":" information","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:17.450023674Z","response":" and","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:17.636521049Z","response":" communicate","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:17.823074382Z","response":" with","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:18.010190966Z","response":" users","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:18.196853174Z","response":" through","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:18.384203299Z","response":" text","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:18.570895924Z","response":".","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:18.757827883Z","response":" How","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:18.944806174Z","response":" can","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:19.13195105Z","response":" I","done":false} bplangchain | 
{"model":"mistral:instruct","created_at":"2024-01-01T07:52:19.318624091Z","response":" help","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:19.505924716Z","response":" you","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:19.692787633Z","response":" today","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:19.879939342Z","response":"?","done":false} bplangchain | {"model":"mistral:instruct","created_at":"2024-01-01T07:52:20.067064092Z","response":"","done":true,"context":[733,16289,28793,28705,693,460,368,733,28748,16289,28793,315,837,396,18278,10895,5682,298,6031,395,4118,9796,304,4372,4224,298,272,1489,302,586,5537,28723,315,949,28742,28707,506,264,5277,1221,442,3327,8208,28725,562,315,541,1759,1871,304,16287,395,5443,1059,2245,28723,1602,541,315,1316,368,3154,28804],"total_duration":17793666050,"load_duration":5689031002,"prompt_eval_count":12,"prompt_eval_duration":2453690000,"eval_count":52,"eval_duration":9647251000} bplangchain | bplangchain exited with code 0 ``` ### 4.3 Update application Whenever you make changes to the application code e.g. the [entrypoint.sh](app/entrypoint.sh) or the python scripts included, you do **NOT** need to rebuild the image and you just need to restart the container. ```bash sudo docker compose -f docker-compose.yml restart langchain ``` ### 4.4 Clean up You can run the following commands to clean up the containers. ```bash sudo docker compose -f docker-compose.yml down ``` ## 5. Docker for milvus ### 5.1. Milvus Download the milvus standalone docker compose configuraion file and save it to **[docker-compose-milvus.yml](docker-compose-milvus.yml)** ```bash wget https://github.com/milvus-io/milvus/releases/download/v2.3.0/milvus-standalone-docker-compose.yml -O docker-compose-milvus.yml ``` Run the milvus container with docker compose. ```bash sudo docker compose -f docker-compose-milvus.yml up -d [+] Running 4/4 ✔ Network milvus Created 0.0s ✔ Container milvus-minio Started 0.0s ✔ Container milvus-etcd Started 0.0s ✔ Container milvus-standalone Started 0.0s ``` Shutdown milvus container with docker compose. ```bash sudo docker compose -f docker-compose-milvus.yml down [+] Running 4/3 ✔ Container milvus-standalone Removed 0.1s ✔ Container milvus-etcd Removed 0.1s ✔ Container milvus-minio Removed 0.6s ✔ Network milvus Removed 0.0s ``` ### 5.2 Connect to Milvus To simplify the netowrk archicture, we just remove the milvus network from **docker-compose-milvus.yml** ```yml # networks: # default: # name: milvus ``` And then include **docker-compose-milvus.yml** in the [docker-compose.yml](docker-compose.yml) so that all the services are in the same network as Ollama. ```yml include: - ./docker-compose-milvus.yml - ./docker-compose-ollama.yml services: langchain: image: bp/langchain:0.0.2 container_name: bplangchain volumes: - ./app:/app depends_on: - "standalone" - "minio" - "etcd" - "ollama" ``` Update the [entrypoint.sh](app/entrypoint.sh) to execute the python script **hello_milvus.py** which will connect to milvus. ```bash #!/bin/bash python3 -c 'print("Hello, langchain 2024!")' python3 /app/hello_milvus.py ``` Please note that the service name **standalone** is used in the [hello_milvus.py](app/hello_milvus.py) to connect to milvus. ```python connections.connect("default", host="standalone", port="19530") ``` Start compose and you should see the output below in the terminal. 
Start compose and you should see the output below in the terminal.

```bash
sudo docker compose -f docker-compose.yml up -d
[+] Running 6/6
 ✔ Network ollama               Created  0.0s
 ✔ Container ollama             Started  0.0s
 ✔ Container milvus-minio       Started  0.0s
 ✔ Container milvus-etcd        Started  0.0s
 ✔ Container milvus-standalone  Started  0.0s
 ✔ Container bplangchain        Started  0.0s
```

Check the application logs. You should see the following output if the connection is successful (no exceptions should be thrown).

```bash
sudo docker compose -f docker-compose.yml logs -f langchain
bplangchain | Hello, langchain 2024!
bplangchain | Hello, Milvus!
bplangchain | Hello, Milvus @_@!
```

Use the following command to restart the container if there is a timeout exception.

```bash
sudo docker compose -f docker-compose.yml restart langchain
[+] Restarting 1/1
 ✔ Container bplangchain  Started
```

Finally, clean up the environment.

```bash
sudo docker compose -f docker-compose.yml down
[+] Running 6/6
 ✔ Container bplangchain        Removed  0.0s
 ✔ Container milvus-standalone  Removed  0.1s
 ✔ Container ollama             Removed  0.1s
 ✔ Container milvus-etcd        Removed  0.1s
 ✔ Container milvus-minio       Removed  0.6s
 ✔ Network ollama               Removed
```

## 6. Embedding with Milvus

TODO
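A minimal sketch of what this section could cover, assuming LangChain's `Milvus` vector store and `HuggingFaceEmbeddings` wrapper (the latter additionally requires the `sentence-transformers` package, which is not installed in the Dockerfile above):

```python
# Hedged sketch: embed a few texts and store them in Milvus via LangChain
# (assumes langchain==0.0.352, pymilvus==2.3.0, and the extra
# sentence-transformers package for HuggingFaceEmbeddings).
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import Milvus

embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")

# "standalone" is the Milvus service name on the shared compose network.
vector_store = Milvus.from_texts(
    texts=[
        "Ollama serves the Mistral model on port 11434.",
        "Milvus standalone listens on port 19530.",
    ],
    embedding=embeddings,
    connection_args={"host": "standalone", "port": "19530"},
)

print(vector_store.similarity_search("Which port does Milvus use?", k=1))
```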