1. Create a Dockerfile for the Ollama Service (Dockerfile.ollama)
FROM python:3.11-slim

ENV DEBIAN_FRONTEND=noninteractive
WORKDIR /app

# Install required dependencies
RUN apt update && apt install -y curl && rm -rf /var/lib/apt/lists/*

# Install Ollama
RUN curl -fsSL https://ollama.com/install.sh | bash

# Ensure Ollama is in PATH
ENV PATH="/root/.ollama/bin:$PATH"

# Copy entrypoint script and make it executable
COPY ollama-entrypoint.sh /usr/local/bin/ollama-entrypoint.sh
RUN chmod +x /usr/local/bin/ollama-entrypoint.sh

EXPOSE 11434
ENTRYPOINT ["/usr/local/bin/ollama-entrypoint.sh"]
2. Create the Entry Point Script for Ollama (ollama-entrypoint.sh)
#!/bin/bash
# Ensure Ollama is installed and in PATH
export PATH="/root/.ollama/bin:$PATH"

# Start Ollama
# /root/.ollama/bin/ollama serve &
ollama serve &

# Wait for Ollama to initialize
sleep 5

# Pull the Llama3 model
# /root/.ollama/bin/ollama pull llama3:8b
ollama pull llama3.2

# Keep the container running
wait
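The fixed sleep 5 in the script above can be fragile on slow machines: if the server takes longer to start, the pull fails. A more robust option is to poll until Ollama responds. This is a sketch, assuming the server answers GET /api/tags on port 11434 once it is ready:

```shell
#!/bin/bash
# wait_for: retry a command up to N times, one second apart,
# returning 0 on the first success and 1 if every attempt fails.
wait_for() {
  local tries=$1; shift
  local i=0
  until "$@"; do
    i=$((i + 1))
    if [ "$i" -ge "$tries" ]; then
      return 1
    fi
    sleep 1
  done
  return 0
}

# In the entrypoint, replace "sleep 5" with a real readiness check:
#   ollama serve &
#   wait_for 30 curl -sf http://localhost:11434/api/tags > /dev/null
#   ollama pull llama3.2
```

The retry count (30) and the /api/tags probe are assumptions; any cheap request that only succeeds once the server is up will do.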
3. Create a Dockerfile for the Streamlit Service (Dockerfile.streamlit)
FROM python:3.11-slim
WORKDIR /app
# Copy dependency file and install packages
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy your app code
COPY . .
EXPOSE 8501
CMD ["streamlit", "run", "app.py", "--server.port=8501", "--server.address=0.0.0.0"]
4. Create a Docker Compose File (docker-compose.yml)
version: '3.8'
services:
  ollama:
    build:
      context: .
      dockerfile: Dockerfile.ollama
    ports:
      - "11434:11434"
  streamlit:
    build:
      context: .
      dockerfile: Dockerfile.streamlit
    ports:
      - "8501:8501"
    depends_on:
      - ollama
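One detail worth noting about this compose file: inside the compose network, the Streamlit container reaches Ollama at http://ollama:11434 (the service name acts as the hostname), not at localhost. A common way to wire this up is to pass the URL in as an environment variable. This is a hypothetical sketch — the variable name OLLAMA_BASE_URL is an assumption, and your app.py would need to read it (e.g., via os.environ) instead of defaulting to localhost:

```yaml
  streamlit:
    build:
      context: .
      dockerfile: Dockerfile.streamlit
    environment:
      # Assumed variable name; app.py must read this itself.
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "8501:8501"
    depends_on:
      - ollama
```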
Steps to Run
- Place the two Dockerfiles and the entrypoint script in your project folder.
- Make sure you have a valid requirements.txt (e.g., with streamlit in it).
- Place your app.py in the same folder.
- Run the following command in the project directory:
docker-compose build --no-cache
This produces two images: ollama-service:latest and streamlit-service:latest. Run both as containers. Then, in Docker Desktop, right-click the streamlit-service:latest container to get a dropdown and pick "Open in browser" to view the running app.
PART II
To push your Ollama + Streamlit Docker Compose setup to Docker Hub, follow these steps:
1. Log in to Docker Hub
First, ensure you're logged in:
docker login
Enter your Docker Hub username and password when prompted.
2. Tag Your Images Properly
Since Docker Compose builds images with generic names like ollama-ollama and ollama-streamlit, you must tag them with your Docker Hub repository name before pushing. Assume your Docker Hub username is yourusername; replace it with your actual username.
docker tag ollama-ollama yourusername/ollama-service:latest
docker tag ollama-streamlit yourusername/streamlit-service:latest
In my case:
docker tag ollama-ollama chandra65/ollama-service:latest
docker tag ollama-streamlit chandra65/streamlit-service:latest
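Tagging (and later pushing) both images follows the same pattern, so the repeated commands can be generated by a short loop. This is a sketch — the local image names (ollama-ollama, ollama-streamlit) depend on your project folder name, so check docker images first. The loop below is a dry run that only prints the commands; replace the echo lines with the real docker tag / docker push calls to execute them:

```shell
#!/bin/bash
DOCKER_USER=chandra65   # replace with your own Docker Hub username

# remote_name: map a service name (ollama|streamlit) to its Hub tag.
remote_name() {
  printf '%s/%s-service:latest' "$DOCKER_USER" "$1"
}

# Dry run: print the tag/push commands for both services.
for svc in ollama streamlit; do
  echo "docker tag ollama-${svc} $(remote_name "$svc")"
  echo "docker push $(remote_name "$svc")"
done
```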
3. Push Images to Docker Hub
Now, push each image separately:
docker push yourusername/ollama-service:latest
docker push yourusername/streamlit-service:latest
In my case:
docker push chandra65/ollama-service:latest
docker push chandra65/streamlit-service:latest
4. Verify on Docker Hub
Go to Docker Hub and check if your images appear under your repositories.
5. Update docker-compose.yml to Use Pushed Images (Optional)
If you want to use these images without rebuilding them locally, update docker-compose.yml like this:
version: '3.8'
services:
  ollama:
    image: yourusername/ollama-service:latest
    ports:
      - "11434:11434"
  streamlit:
    image: yourusername/streamlit-service:latest
    ports:
      - "8501:8501"
    depends_on:
      - ollama
6. Pull & Run from Docker Hub
On any machine, you can now pull and run your services directly:
docker-compose up
This will fetch the images from Docker Hub and run them without rebuilding! 🚀