Wednesday, September 17, 2025

Docker + nginx + HTTPS setup for Company Recommender

This document contains a ready-to-use Docker deployment for your FastAPI backend and Angular frontend, plus an NGINX reverse proxy. It also includes instructions for obtaining Let’s Encrypt certificates with Certbot (manual step).

Important: I placed all files and config examples below. Follow the numbered steps in Deployment to build, obtain certificates, and run in production.


Files included

  • docker-compose.yml — orchestrates backend, frontend, nginx, and certbot (optional)

  • backend/Dockerfile — builds your FastAPI app (uvicorn)

  • frontend/Dockerfile — builds Angular production bundle and serves using nginx

  • nginx/nginx.conf — main nginx config with HTTP -> HTTPS redirect

  • nginx/conf.d/immai.acintia.com.conf — site config (reverse proxy to backend + static hosting for frontend)

  • README_DEPLOY.md — deployment steps and Certbot instructions


docker-compose.yml

version: '3.8'
services:
  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    container_name: company_recommender_backend
    environment:
      - PORT=8000
      - OLLAMA_URL=https://immai.acintia.com
    expose:
      - "8000"
    restart: unless-stopped

  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
    container_name: company_recommender_frontend
    restart: unless-stopped
    expose:
      - "80"

  nginx:
    image: nginx:stable
    container_name: company_recommender_nginx
    ports:
      - "80:80"
      - "443:443"
    depends_on:
      - frontend
      - backend
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
      - ./nginx/conf.d:/etc/nginx/conf.d:ro
      - ./certs:/etc/letsencrypt/live:ro
      - ./nginx/html:/usr/share/nginx/html:ro
    restart: unless-stopped

  certbot:
    image: certbot/certbot
    container_name: company_recommender_certbot
    volumes:
      - ./certs:/etc/letsencrypt/live
      - ./nginx/html:/var/www/html
    entrypoint: ''
    command: "/bin/sh -c 'sleep infinity'"
    restart: 'no'

networks:
  default:
    driver: bridge

Notes:

  • certbot here is present to allow you to run one-off cert issuance commands using the certbot container (instructions below).

  • ./certs will hold the live certificate files after you create them (mounted read-only into nginx). Note that certbot's live/ directory normally contains symlinks into ../archive; if nginx reports missing certificate files, mount the whole /etc/letsencrypt directory into both containers instead of only live/.


backend/Dockerfile

# backend/Dockerfile
FROM python:3.11-slim

WORKDIR /app

# system deps (if needed)
RUN apt-get update && apt-get install -y --no-install-recommends build-essential curl && rm -rf /var/lib/apt/lists/*

# the build context is ./backend (see docker-compose.yml), so paths are relative to it
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

ENV PORT=8000

CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000", "--workers", "1"]

Make sure backend/requirements.txt contains: fastapi, uvicorn[standard], langgraph, langchain-core, langchain-community, pydantic, and any other packages your main.py imports.


frontend/Dockerfile (Angular)

# frontend/Dockerfile
# Build stage
FROM node:20 AS build
WORKDIR /app
# the build context is ./frontend (see docker-compose.yml), so paths are relative to it
COPY package*.json ./
RUN npm ci --legacy-peer-deps
COPY . .
RUN npm run build -- --configuration production

# Production stage
FROM nginx:stable
COPY --from=build /app/dist/ /usr/share/nginx/html/
# custom nginx conf for serving the SPA (404 -> index.html); spa.conf must live
# inside the frontend/ build context, so copy it there from nginx/spa.conf
COPY spa.conf /etc/nginx/conf.d/default.conf

EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]

Assumes your Angular build output lands in /app/dist/ (adjust if your project name differs; ng build by default creates dist/<project-name>).
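For instance, with the project name company-recommender-frontend (as used later in this document), the copy line would look like the sketch below; on Angular 17+ the application builder nests the output one level deeper under browser/:

```dockerfile
# adjust the path to match the project name in angular.json
COPY --from=build /app/dist/company-recommender-frontend/ /usr/share/nginx/html/
# on Angular 17+ (application builder), the output is nested:
# COPY --from=build /app/dist/company-recommender-frontend/browser/ /usr/share/nginx/html/
```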


nginx/nginx.conf

user  nginx;
worker_processes  auto;
error_log  /var/log/nginx/error.log warn;
pid        /var/run/nginx.pid;

events { worker_connections 1024; }

http {
    include       /etc/nginx/mime.types;
    default_type  application/octet-stream;
    sendfile        on;
    keepalive_timeout  65;

    include /etc/nginx/conf.d/*.conf;
}

nginx/conf.d/immai.acintia.com.conf

# Redirect HTTP to HTTPS
server {
    listen 80;
    server_name immai.acintia.com;

    root /var/www/html;

    location /.well-known/acme-challenge/ {
        alias /var/www/html/.well-known/acme-challenge/;
    }

    location / {
        return 301 https://$host$request_uri;
    }
}

# HTTPS server
server {
    listen 443 ssl http2;
    server_name immai.acintia.com;

    ssl_certificate /etc/letsencrypt/live/immai.acintia.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/immai.acintia.com/privkey.pem;

    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_prefer_server_ciphers on;

    # Serve frontend static files
    location / {
        proxy_pass http://frontend:80;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # API proxying
    location /recommend {
        proxy_pass http://backend:8000/recommend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /chat {
        proxy_pass http://backend:8000/chat;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # optionally allow access to certbot challenge files served from nginx html
    location ^~ /.well-known/acme-challenge/ {
        alias /var/www/html/.well-known/acme-challenge/;
        allow all;
    }
}

The proxy_pass uses the Docker service names frontend and backend defined in docker-compose.yml so nginx communicates over the Docker network.


nginx/spa.conf (used by frontend image)

server {
  listen 80;
  server_name _;
  root /usr/share/nginx/html;
  index index.html;

  location / {
    try_files $uri $uri/ /index.html;
  }
}

README_DEPLOY.md — Deployment steps (summary)

  1. Clone repo and place the backend and frontend project folders side-by-side with docker-compose.yml and nginx/ folder.

project-root/
  ├─ backend/        # your FastAPI app (main.py, requirements.txt)
  ├─ frontend/       # your Angular project (package.json, angular.json)
  ├─ nginx/
  ├─ docker-compose.yml
  └─ certs/          # created after certbot

  2. Build images and start containers (without certs yet):

docker compose up -d --build

  3. Obtain Let’s Encrypt certificates using the certbot container + webroot method (run on the host):

  • Ensure DNS for immai.acintia.com points to the server IP.

  • Ensure port 80 is reachable externally and not blocked by firewall.

Run this command on the host (it starts a one-off certbot container):

# create the directory for challenge files
mkdir -p nginx/html/.well-known/acme-challenge

# run certbot interactively to obtain certs
docker run --rm -it \
  -v "$(pwd)/certs:/etc/letsencrypt/live" \
  -v "$(pwd)/nginx/html:/var/www/html" \
  certbot/certbot certonly --webroot \
    --webroot-path /var/www/html \
    -d immai.acintia.com \
    --email your-email@example.com --agree-tos --non-interactive

If successful, certificate files will be in ./certs/immai.acintia.com/ and will be mounted into the nginx container.

  4. Reload nginx to pick up the certificates:

docker compose restart nginx

  5. (Optional) Set up automatic renewal (a cron job on the host) using certbot renew, and reload nginx after each renewal.
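A sketch of a host crontab entry for this, assuming the project lives at /opt/company-recommender (a hypothetical path; adjust it). It runs renewal through the compose certbot service and reloads nginx afterwards:

```
# m h dom mon dow   command
0 3,15 * * * cd /opt/company-recommender && docker compose run --rm certbot certbot renew --webroot -w /var/www/html --quiet && docker compose exec nginx nginx -s reload
```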

Local dev alternative (self-signed)

If you don't want to use certbot yet, you can create a self-signed cert and mount it into ./certs with the same filenames fullchain.pem and privkey.pem for testing.
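For example, with openssl (a sketch: the 30-day lifetime and the subject are arbitrary placeholders):

```shell
# generate a throwaway self-signed pair with the filenames nginx expects
mkdir -p certs/immai.acintia.com
openssl req -x509 -nodes -newkey rsa:2048 \
  -keyout certs/immai.acintia.com/privkey.pem \
  -out certs/immai.acintia.com/fullchain.pem \
  -days 30 -subj "/CN=immai.acintia.com"
```

Browsers will warn about the untrusted issuer, which is fine for local testing.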


Additional operational notes

  • If you use Cloudflare, you may prefer to enable Cloudflare proxy and use their TLS — in that case you'd point nginx to use Cloudflare origin certs or use ssl_certificate accordingly.

  • Make sure your backend FastAPI app binds to 0.0.0.0 (the provided Dockerfile uses that). If your FastAPI uvicorn.run call uses 127.0.0.1, update it.

  • In production, consider using environment variables or a .env file for secrets and configuration. Also increase uvicorn worker count appropriately.

  • Monitor logs: docker compose logs -f nginx and docker compose logs -f backend.


Troubleshooting

  • Certbot fails: ensure port 80 isn't blocked and DNS resolves. Run certbot with --staging to test.

  • Backend 502 from nginx: check the proxy_pass host/port match your compose service names and ports; use docker compose ps to verify.


If you want, I can also:

  • Provide a .env and systemd unit file for auto-start on server boot.

  • Add a healthcheck to the backend service, and simplify the nginx config to proxy all API routes through a single location block (proxy_pass http://backend:8000;) instead of one location per endpoint.
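A sketch of such a healthcheck on the backend service (FastAPI serves /docs by default, and the backend Dockerfile above installs curl; substitute a dedicated /health endpoint if you add one):

```yaml
  backend:
    # ...existing build/environment settings...
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/docs"]
      interval: 30s
      timeout: 5s
      retries: 3
```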


End of file.

Angular frontend for company-recommender-frontend

You have a FastAPI backend for recommending the type of company registration in India using an LLM, and now you want a web frontend that can talk to it (/recommend and /chat).

I’ll give you a simple Angular (latest) frontend with a form to collect inputs and a chat interface.


Steps to Build Frontend

1. Create Angular Project

npm install -g @angular/cli
ng new company-recommender-frontend
cd company-recommender-frontend
ng serve

2. Generate Components & Service

ng g c components/recommendation-form
ng g c components/chat
ng g s services/api

3. Angular Service (src/app/services/api.service.ts)

This will handle HTTP calls to your FastAPI backend.

import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';

@Injectable({
  providedIn: 'root'
})
export class ApiService {
  private baseUrl = 'http://127.0.0.1:8000'; // Change to https://immai.acintia.com in production

  constructor(private http: HttpClient) {}

  getRecommendation(payload: any): Observable<any> {
    return this.http.post(`${this.baseUrl}/recommend`, payload);
  }

  chat(payload: any): Observable<any> {
    return this.http.post(`${this.baseUrl}/chat`, payload);
  }
}

4. Recommendation Form Component (src/app/components/recommendation-form/recommendation-form.component.ts)

import { Component } from '@angular/core';
import { ApiService } from '../../services/api.service';
import { v4 as uuidv4 } from 'uuid';

@Component({
  selector: 'app-recommendation-form',
  templateUrl: './recommendation-form.component.html'
})
export class RecommendationFormComponent {
  business_type = '';
  investment_amount: number | null = null;
  number_of_partners = 1;
  is_foreign_citizen = false;
  session_id = uuidv4(); // generate unique session

  recommendation: string | null = null;

  constructor(private api: ApiService) {}

  submitForm() {
    const payload = {
      business_type: this.business_type,
      investment_amount: this.investment_amount,
      number_of_partners: this.number_of_partners,
      is_foreign_citizen: this.is_foreign_citizen,
      session_id: this.session_id
    };

    this.api.getRecommendation(payload).subscribe({
      next: (res) => {
        this.recommendation = res.response;
      },
      error: (err) => {
        console.error(err);
      }
    });
  }
}

HTML (recommendation-form.component.html)

<div class="p-4 max-w-lg mx-auto">
  <h2 class="text-xl font-bold mb-2">Company Recommender Form</h2>

  <form (ngSubmit)="submitForm()">
    <label>Business Type:</label>
    <input [(ngModel)]="business_type" name="business_type" class="border p-2 w-full mb-2" required />

    <label>Investment Amount (INR):</label>
    <input type="number" [(ngModel)]="investment_amount" name="investment_amount" class="border p-2 w-full mb-2" />

    <label>Number of Partners:</label>
    <input type="number" [(ngModel)]="number_of_partners" name="number_of_partners" class="border p-2 w-full mb-2" />

    <label>
      <input type="checkbox" [(ngModel)]="is_foreign_citizen" name="is_foreign_citizen" />
      Foreign Citizen Involved
    </label>

    <button type="submit" class="bg-blue-500 text-white px-4 py-2 rounded mt-3">Get Recommendation</button>
  </form>

  <div *ngIf="recommendation" class="mt-4 p-3 bg-gray-100 rounded">
    <h3 class="font-bold">Recommendation:</h3>
    <p>{{ recommendation }}</p>
  </div>
</div>

5. Chat Component (src/app/components/chat/chat.component.ts)

import { Component, Input } from '@angular/core';
import { ApiService } from '../../services/api.service';

@Component({
  selector: 'app-chat',
  templateUrl: './chat.component.html'
})
export class ChatComponent {
  @Input() session_id!: string;
  userInput = '';
  messages: { sender: string, text: string }[] = [];

  constructor(private api: ApiService) {}

  sendMessage() {
    if (!this.userInput.trim()) return;

    this.messages.push({ sender: 'You', text: this.userInput });
    const payload = {
      user_input: this.userInput,
      session_id: this.session_id
    };

    this.api.chat(payload).subscribe({
      next: (res) => {
        this.messages.push({ sender: 'AI', text: res.response });
      },
      error: (err) => console.error(err)
    });

    this.userInput = '';
  }
}

HTML (chat.component.html)

<div class="p-4 max-w-lg mx-auto mt-6 border rounded">
  <h2 class="text-xl font-bold mb-2">Chat with AI</h2>

  <div class="h-60 overflow-y-auto bg-gray-100 p-2 mb-2 rounded">
    <div *ngFor="let msg of messages">
      <p [ngClass]="{'font-bold text-blue-700': msg.sender==='AI'}">
        {{ msg.sender }}: {{ msg.text }}
      </p>
    </div>
  </div>

  <div class="flex">
    <input [(ngModel)]="userInput" name="chatInput" class="flex-1 border p-2" placeholder="Ask something..." />
    <button (click)="sendMessage()" class="bg-green-500 text-white px-4 ml-2 rounded">Send</button>
  </div>
</div>

6. App Component (src/app/app.component.html)

<app-recommendation-form></app-recommendation-form>
<app-chat [session_id]="session_id"></app-chat>

Note: AppComponent must define a session_id property for this binding to work (e.g. session_id = uuidv4(); with import { v4 as uuidv4 } from 'uuid';), ideally shared with the recommendation form so the chat continues the same session.

7. Install Dependencies

npm install uuid

Also import FormsModule and HttpClientModule in AppModule (needed for ngModel and HttpClient). In a standalone-components project, use provideHttpClient() in the app config and import FormsModule in each component instead.

✅ Now you have a working Angular frontend for your FastAPI backend:

  • Fill form → /recommend

  • Chat with AI → /chat


Do you want me to also dockerize this Angular frontend and add it to your existing docker-compose.yml (with backend + nginx reverse proxy + HTTPS)?

-------------------


Upgrading reverse proxy server at https://immai.acintia.com

Upgrading the reverse proxy to serve your app securely at https://immai.acintia.com using NGINX + Let’s Encrypt (Certbot).

We’ll use docker-compose with a dedicated reverse proxy container and automatic SSL certificate management.


🔹 Step 1: Update NGINX Config for SSL

Replace your nginx/nginx.conf with this:

server {
    listen 80;
    server_name immai.acintia.com;

    # Redirect all HTTP traffic to HTTPS
    location / {
        return 301 https://$host$request_uri;
    }
}

server {
    listen 443 ssl;
    server_name immai.acintia.com;

    # SSL Certificates (managed by certbot)
    ssl_certificate /etc/letsencrypt/live/immai.acintia.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/immai.acintia.com/privkey.pem;
    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_prefer_server_ciphers on;

    # Serve frontend Angular app
    location / {
        proxy_pass http://frontend:80;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # Proxy backend FastAPI
    location /api/ {
        proxy_pass http://backend:8000/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # Proxy Ollama (optional)
    location /ollama/ {
        proxy_pass http://ollama:11434/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}

🔹 Step 2: Extend docker-compose.yml

Here’s the updated file with nginx + certbot:

version: "3.9"

services:
  backend:
    build: .
    container_name: company-recommender-backend
    expose:
      - "8000"
    environment:
      - OLLAMA_URL=http://ollama:11434
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    expose:
      - "11434"
    volumes:
      - ollama_data:/root/.ollama

  frontend:
    build: ./frontend
    container_name: company-recommender-frontend
    expose:
      - "80"
    environment:
      - API_URL=/api
    depends_on:
      - backend

  proxy:
    image: nginx:alpine
    container_name: reverse-proxy
    restart: always
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/conf.d/default.conf:ro
      - certbot-etc:/etc/letsencrypt
      - certbot-var:/var/lib/letsencrypt
    depends_on:
      - frontend
      - backend
      - ollama

  certbot:
    image: certbot/certbot
    container_name: certbot
    volumes:
      - certbot-etc:/etc/letsencrypt
      - certbot-var:/var/lib/letsencrypt
      - ./nginx/nginx.conf:/etc/nginx/conf.d/default.conf:ro
    entrypoint: >
      sh -c "trap exit TERM; while :; do certbot renew --quiet; sleep 12h & wait $${!}; done"

volumes:
  ollama_data:
  certbot-etc:
  certbot-var:

🔹 Step 3: Issue the SSL Certificate

Before starting everything, you need to request the certificate once. Note that docker compose prefixes named volumes with the project name, so either use the prefixed names (e.g. <project>_certbot-etc) or run the command through docker compose run certbot:

docker run -it --rm \
  -v certbot-etc:/etc/letsencrypt \
  -v certbot-var:/var/lib/letsencrypt \
  -v "$(pwd)/nginx/nginx.conf:/etc/nginx/conf.d/default.conf" \
  certbot/certbot certonly --webroot \
  --webroot-path=/var/lib/letsencrypt \
  --agree-tos --no-eff-email \
  -m youremail@example.com \
  -d immai.acintia.com

Replace youremail@example.com with your real email (for renewal notices). Note that webroot validation only succeeds if nginx serves /.well-known/acme-challenge/ from the same webroot path (/var/lib/letsencrypt here); the nginx config above has no such location block, so add one, or stop the proxy and use --standalone for the first issuance.


🔹 Step 4: Start the Stack

docker-compose up --build -d

Now:

  • The stack is up and serving HTTPS.

  • The certbot container attempts renewal every 12 hours. nginx does not reload renewed certificates automatically, so reload the proxy after a renewal (docker compose exec proxy nginx -s reload).


🔹 Step 5: Verify HTTPS

Open:

curl -I https://immai.acintia.com

You should see a 200 response (HTTP/2 200 with most curl builds) and no certificate warnings.


👉 Do you want me to also configure auto-redirect of /api requests inside Angular (so your Angular code never needs to know /api explicitly), or keep it like this (frontend calls /api/...)?

NGINX reverse proxy (one domain support)

Add an NGINX reverse proxy so your users hit just one domain/port (e.g. http://localhost) and NGINX will serve:

  • Frontend at /

  • Backend API at /api/

  • Ollama at /ollama/ (optional, in case you want to expose it too)


🔹 Step 1: Create nginx.conf (reverse proxy)

Make a new folder nginx/ in your project root and add nginx.conf:

server {
  listen 80;

  server_name _;

  # Serve frontend Angular app
  location / {
    proxy_pass http://frontend:80;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
  }

  # Proxy backend FastAPI
  location /api/ {
    proxy_pass http://backend:8000/;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
  }

  # Proxy Ollama (optional)
  location /ollama/ {
    proxy_pass http://ollama:11434/;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
  }
}

🔹 Step 2: Add Reverse Proxy Service to docker-compose.yml

Extend your compose file:

version: "3.9"

services:
  backend:
    build: .
    container_name: company-recommender-backend
    expose:
      - "8000"
    environment:
      - OLLAMA_URL=http://ollama:11434
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    expose:
      - "11434"
    volumes:
      - ollama_data:/root/.ollama

  frontend:
    build: ./frontend
    container_name: company-recommender-frontend
    expose:
      - "80"
    environment:
      - API_URL=/api   # frontend calls backend via proxy
    depends_on:
      - backend

  proxy:
    image: nginx:alpine
    container_name: reverse-proxy
    ports:
      - "80:80"
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/conf.d/default.conf:ro
    depends_on:
      - frontend
      - backend
      - ollama

volumes:
  ollama_data:

🔹 Step 3: Adjust Angular Runtime Config

Since the reverse proxy maps backend → /api/, update frontend/src/assets/env.js:

(function(window) {
  window["env"] = window["env"] || {};
  window["env"]["API_URL"] = "${API_URL}";
})(this);

In docker-compose.yml, the frontend service already sets:

environment:
  - API_URL=/api

So the Angular app will now call /api/..., which gets routed to the backend by NGINX.


🔹 Step 4: Rebuild & Run

From your project root:

docker-compose up --build

✅ Final Setup

  • Visit → http://localhost → Angular frontend served.

  • Frontend API calls → /api/... → routed to backend.

  • Optional Ollama calls → /ollama/....

Everything is cleanly under one domain and one port (80) 🎯.


👉 Do you want me to also add HTTPS (SSL/TLS) with Let’s Encrypt support to this proxy so you can put it on a real server/domain securely?

Live API URL injection

Let’s set up live API URL injection so you can change backend URL without rebuilding the Angular frontend. This is super useful for Docker because the container just reads env variables at runtime.


🔹 Step 1: Use env.js instead of Angular’s static environment.ts

Already we created src/assets/env.js:

(function(window) {
  window["env"] = window["env"] || {};
  window["env"]["API_URL"] = "${API_URL}";
})(this);

This file will get replaced with real env vars when the container starts.


🔹 Step 2: Modify Angular ApiService

Update src/app/services/api.service.ts:

import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';

@Injectable({
  providedIn: 'root'
})
export class ApiService {
  private apiUrl = (window as any)["env"]?.API_URL || "http://localhost:8000";

  constructor(private http: HttpClient) {}

  getRecommendation(payload: any): Observable<any> {
    return this.http.post(`${this.apiUrl}/recommend`, payload);
  }

  chat(payload: any): Observable<any> {
    return this.http.post(`${this.apiUrl}/chat`, payload);
  }
}

🔹 Step 3: Add an Entrypoint Script in Docker

Create frontend/docker-entrypoint.sh:

#!/bin/sh

# Replace placeholder vars in env.js with real env vars
echo "Injecting runtime environment variables..."
sed -i "s|\${API_URL}|${API_URL:-http://localhost:8000}|g" /usr/share/nginx/html/assets/env.js

exec "$@"

Make it executable:

chmod +x frontend/docker-entrypoint.sh
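You can dry-run the substitution on the host before baking it into the image (a sketch using a temporary copy of env.js):

```shell
# simulate what docker-entrypoint.sh does at container start
mkdir -p /tmp/envjs-test
printf '%s\n' 'window["env"]["API_URL"] = "${API_URL}";' > /tmp/envjs-test/env.js
API_URL="/api" sh -c 'sed -i "s|\${API_URL}|${API_URL:-http://localhost:8000}|g" /tmp/envjs-test/env.js'
cat /tmp/envjs-test/env.js
```

After running it, the file should read window["env"]["API_URL"] = "/api"; instead of containing the placeholder.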

🔹 Step 4: Update Frontend Dockerfile

Modify frontend/Dockerfile:

# Stage 1: Build Angular app
FROM node:18-slim as build

WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build -- --configuration production

# Stage 2: Serve with Nginx
FROM nginx:alpine

# Copy build output
COPY --from=build /app/dist/frontend /usr/share/nginx/html

# Copy custom entrypoint
COPY docker-entrypoint.sh /docker-entrypoint.sh
RUN chmod +x /docker-entrypoint.sh

# Copy nginx config
COPY nginx.conf /etc/nginx/conf.d/default.conf

EXPOSE 80

ENTRYPOINT ["/docker-entrypoint.sh"]
CMD ["nginx", "-g", "daemon off;"]

🔹 Step 5: Update docker-compose.yml

version: "3.9"

services:
  backend:
    build: .
    container_name: company-recommender-backend
    ports:
      - "8000:8000"
    environment:
      - OLLAMA_URL=http://ollama:11434
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama

  frontend:
    build: ./frontend
    container_name: company-recommender-frontend
    ports:
      - "3000:80"
    environment:
      - API_URL=http://backend:8000
    depends_on:
      - backend

volumes:
  ollama_data:

✅ Final Behavior

  • At runtime, when you start the container, docker-entrypoint.sh injects API_URL into env.js.

  • Your Angular app reads from window.env.API_URL.

  • You can point the frontend to a different backend without rebuilding the Angular image.

Example:

docker-compose up --build

Frontend → http://localhost:3000
Backend → http://localhost:8000
Ollama → http://localhost:11434


👉 Do you want me to also add an NGINX reverse proxy in front (so that frontend and backend are served under one domain, e.g. /api), or keep them on separate ports (3000 + 8000)?

Angular Front end(UI)

Perfect 👍 Let’s do this with an Angular (latest) frontend: five clear steps plus code, so it works with the backend and Ollama inside the same docker-compose setup.


✅ Step 1: Create Angular Frontend

Run this on your host machine (outside Docker):

npm install -g @angular/cli
ng new frontend --routing --style=css

This creates a new Angular app in frontend/.

Move into the folder:

cd frontend

✅ Step 2: Add an API Service

Inside Angular, create a service for backend calls:

ng generate service services/api

This generates src/app/services/api.service.ts. Replace with:

import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';

@Injectable({
  providedIn: 'root'
})
export class ApiService {
  private apiUrl = (window as any)['env']?.API_URL || 'http://localhost:8000';

  constructor(private http: HttpClient) {}

  getRecommendation(payload: any): Observable<any> {
    return this.http.post(`${this.apiUrl}/recommend`, payload);
  }

  chat(payload: any): Observable<any> {
    return this.http.post(`${this.apiUrl}/chat`, payload);
  }
}

✅ Step 3: Allow Environment Config at Runtime

Angular normally hardcodes environment.ts, but for Docker we want runtime config.

Create a file: src/assets/env.js:

(function(window) {
  window["env"] = window["env"] || {};
  window["env"]["API_URL"] = "${REACT_APP_API_URL}";
})(this);

Add it in angular.json under assets so it’s copied into the build:

"assets": [
  "src/favicon.ico",
  "src/assets",
  "src/assets/env.js"
]
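env.js only takes effect if the browser loads it before the app bootstraps, so also reference it from src/index.html (this step is an assumption here, since a freshly generated project does not include it):

```html
<!-- in src/index.html, inside <head>, before the app bundles -->
<script src="assets/env.js"></script>
```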

✅ Step 4: Update AppComponent to Call Backend

Modify src/app/app.component.ts:

import { Component } from '@angular/core';
import { ApiService } from './services/api.service';

@Component({
  selector: 'app-root',
  template: `
    <h1>AI Company Recommender</h1>
    <button (click)="getRecommendation()">Get Recommendation</button>
    <p *ngIf="response">{{ response }}</p>
  `
})
export class AppComponent {
  response: string | null = null;

  constructor(private api: ApiService) {}

  getRecommendation() {
    const payload = {
      business_type: "IT Startup",
      investment_amount: 100000,
      number_of_partners: 2,
      is_foreign_citizen: false,
      session_id: "demo-session"
    };
    this.api.getRecommendation(payload).subscribe(res => {
      this.response = res.response;
    });
  }
}

✅ Step 5: Dockerize Angular Frontend

Create frontend/Dockerfile:

# Stage 1: Build Angular app
FROM node:18-slim as build

WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build -- --configuration production

# Stage 2: Serve with Nginx
FROM nginx:alpine

# Copy build output
COPY --from=build /app/dist/frontend /usr/share/nginx/html

# Copy nginx config. Note: this image has no entrypoint script, so the
# ${REACT_APP_API_URL} placeholder in env.js is NOT substituted at runtime;
# see the "Live API URL injection" section for that.
COPY nginx.conf /etc/nginx/conf.d/default.conf

EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]

Create frontend/nginx.conf:

server {
  listen 80;

  location / {
    root   /usr/share/nginx/html;
    index  index.html;
    try_files $uri /index.html;
  }

  # Allow API proxy if needed
  location /api/ {
    proxy_pass http://backend:8000/;
  }
}

✅ docker-compose.yml (final with Angular frontend)

version: "3.9"

services:
  backend:
    build: .
    container_name: company-recommender-backend
    ports:
      - "8000:8000"
    environment:
      - OLLAMA_URL=http://ollama:11434
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama

  frontend:
    build: ./frontend
    container_name: company-recommender-frontend
    ports:
      - "3000:80"
    environment:
      - REACT_APP_API_URL=http://backend:8000
    depends_on:
      - backend

volumes:
  ollama_data:

✅ Usage

From project root (where docker-compose.yml is):

docker-compose up --build

Now:

  • Frontend → http://localhost:3000

  • Backend → http://localhost:8000

  • Ollama → http://localhost:11434


👉 Do you want me to also set up live API URL injection (so Angular doesn’t need rebuilds when backend URL changes), or is the static env.js approach good enough for your case?

+Front end service ->docker-compose.yml (all in one)

This extends the docker-compose.yml above so that it runs backend + Ollama + frontend in one go. It assumes the frontend is a simple Node.js/React app (e.g. a Vite or CRA build).


✅ docker-compose.yml (with frontend)

version: "3.9"

services:
  backend:
    build: .
    container_name: company-recommender-backend
    ports:
      - "8000:8000"
    environment:
      - OLLAMA_URL=http://ollama:11434
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama

  frontend:
    build: ./frontend   # assumes you have a frontend/ directory
    container_name: company-recommender-frontend
    ports:
      - "3000:3000"
    environment:
      - REACT_APP_API_URL=http://backend:8000   # frontend talks to backend inside Docker
    depends_on:
      - backend

volumes:
  ollama_data:

✅ Frontend Dockerfile (in frontend/Dockerfile)

If your frontend is React (Vite or CRA):

# Use Node.js for building the frontend
FROM node:18-slim

WORKDIR /app

# Install dependencies first
COPY package.json package-lock.json ./
RUN npm install

# Copy all frontend files
COPY . .

# Start dev server (hot reload) or build for production
# For dev mode:
CMD ["npm", "start"]

# For production (static build with Nginx), replace above CMD with:
# RUN npm run build
# FROM nginx:alpine
# COPY --from=0 /app/dist /usr/share/nginx/html
# EXPOSE 80
# CMD ["nginx", "-g", "daemon off;"]

👉 If you want production-ready deployment, I’d recommend the Nginx option (the commented block above), because it’s lighter and faster than running a Node dev server.


✅ Usage

From project root (where docker-compose.yml is):

docker-compose up --build
  • Backend → http://localhost:8000

  • Frontend → http://localhost:3000 (and it calls backend via Docker network using http://backend:8000)

  • Ollama → http://localhost:11434



docker-compose.yml for backend ALONE

To run your FastAPI backend in one container and later connect it with Ollama (running either locally or in another container), here’s a simple docker-compose.yml setup.


✅ docker-compose.yml

version: "3.9"

services:
  backend:
    build: .
    container_name: company-recommender-backend
    ports:
      - "8000:8000"
    environment:
      - OLLAMA_URL=http://ollama:11434  # backend talks to ollama service
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama  # persist models

volumes:
  ollama_data:

✅ Changes in your main.py

Right now you hardcoded:

ollama_url = "https://immai.acintia.com"

Update this to read from an environment variable so it works inside Docker:

import os

ollama_url = os.getenv("OLLAMA_URL", "http://localhost:11434")

That way:

  • If you run with docker-compose, it will connect to the ollama container.

  • If you run locally without Docker, it defaults to http://localhost:11434.
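The fallback behavior can be sketched in plain Python (no FastAPI needed):

```python
import os

# no OLLAMA_URL in the environment: the local default is used
os.environ.pop("OLLAMA_URL", None)
print(os.getenv("OLLAMA_URL", "http://localhost:11434"))  # http://localhost:11434

# inside docker-compose, the injected service URL wins
os.environ["OLLAMA_URL"] = "http://ollama:11434"
print(os.getenv("OLLAMA_URL", "http://localhost:11434"))  # http://ollama:11434
```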


✅ Usage

From your project root:

docker-compose up --build

Then visit:

http://localhost:8000

Your backend will talk to the Ollama container automatically 🎯.



Dockerize CAAI_AI (As per Request from Maran)

 This supplements the CAAI_AI repository.

Below is a Dockerfile you can use to dockerize your FastAPI backend (main.py). I’ll assume your project structure looks like this:

project-root/
│── main.py
│── requirements.txt
│── Dockerfile

✅ Dockerfile

# Use an official lightweight Python image
FROM python:3.11-slim

# Set environment variables
ENV PYTHONUNBUFFERED=1 \
    PYTHONDONTWRITEBYTECODE=1

# Set work directory
WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    build-essential curl git && \
    rm -rf /var/lib/apt/lists/*

# Copy requirements first for efficient caching
COPY requirements.txt .

# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Copy the entire project
COPY . .

# Expose FastAPI port
EXPOSE 8000

# Run the FastAPI app with uvicorn
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]

✅ requirements.txt

If you don’t already have one, here’s what your requirements.txt should contain based on your code:

fastapi
uvicorn[standard]
pydantic
langchain
langchain-community
langgraph

(If you want specific versions pinned, add == specifiers.)


✅ Build & Run Docker Container

From your project root:

# Build the image
docker build -t company-recommender-backend .

# Run the container
docker run -d -p 8000:8000 company-recommender-backend

⚡ Your FastAPI backend will now be available at:

http://localhost:8000
