
Build and Deployment

API server build:
cd api
pnpm build
Build output is generated in the api/dist directory.

Web frontend build:
cd web
pnpm build
Build output is generated in the web/dist directory.
API server:
cd api
NODE_ENV=production node dist/index.js
Or using PM2:
pm2 start dist/index.js --name sonamu-api
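PM2 sends a shutdown signal on pm2 restart/pm2 reload; handling it lets in-flight requests finish before the process exits. A minimal sketch (hypothetical helper -- `close` is a placeholder for however your dist/index.js shuts down its Fastify instance, not a Sonamu API):

```typescript
// Sketch: drain in-flight requests when PM2 restarts the process.
// `close` is a placeholder for your server's shutdown routine
// (e.g. fastify.close()).
function registerShutdown(close: () => Promise<void>): void {
  for (const signal of ["SIGINT", "SIGTERM"] as const) {
    process.once(signal, () => {
      close()
        .then(() => process.exit(0))   // clean exit once connections drain
        .catch(() => process.exit(1)); // exit non-zero if shutdown failed
    });
  }
}

registerShutdown(async () => {
  // e.g. await fastify.close();
});
```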
Web server:
Serve static files with Nginx or Apache:
# nginx.conf
server {
  listen 80;
  server_name example.com;

  root /var/www/sonamu/web/dist;
  index index.html;

  location / {
    try_files $uri $uri/ /index.html;
  }

  location /api {
    proxy_pass http://localhost:1028;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
  }
}

Environment Variables

Create a .env file:
# api/.env.production
NODE_ENV=production
PORT=1028

# Database
DB_HOST=prod-db.example.com
DB_PORT=5432
DB_NAME=sonamu_prod
DB_USER=postgres
DB_PASSWORD=secure_password

# Session
SESSION_SECRET=your-super-secret-key

# CORS
CORS_ORIGIN=https://example.com

# Cache (Redis)
REDIS_HOST=redis.example.com
REDIS_PORT=6379
REDIS_PASSWORD=redis_password
Use them in sonamu.config.ts:
export default {
  server: {
    listen: {
      port: Number(process.env.PORT) || 1028,
    }
  },
  database: {
    connections: {
      main: {
        host: process.env.DB_HOST,
        port: Number(process.env.DB_PORT),
        database: process.env.DB_NAME,
        user: process.env.DB_USER,
        password: process.env.DB_PASSWORD,
      }
    }
  },
  session: {
    secret: process.env.SESSION_SECRET,
  }
} satisfies SonamuConfig;
Keep secrets out of version control.
Method 1: Environment Variables
# Add .env files to .gitignore
echo ".env" >> .gitignore
echo ".env.*" >> .gitignore
Method 2: Secret Management Tools
  • AWS Secrets Manager
  • HashiCorp Vault
  • Azure Key Vault
// AWS Secrets Manager example
import { SecretsManager } from "@aws-sdk/client-secrets-manager";

const client = new SecretsManager({ region: "us-east-1" });

async function getSecret(secretName: string) {
  const response = await client.getSecretValue({ SecretId: secretName });
  return JSON.parse(response.SecretString!);
}

const dbSecret = await getSecret("prod/database");
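The fetched secret can then feed the database settings directly, bypassing the DB_* environment variables (a hypothetical wiring -- it assumes the prod/database secret stores host, user, and password keys, and that the config module can use top-level await):

```typescript
// Hypothetical: sonamu.config.ts built from a Secrets Manager payload
// instead of DB_* environment variables. The secret's key names
// (host, user, password) are assumptions.
const dbSecret = await getSecret("prod/database");

export default {
  database: {
    connections: {
      main: {
        host: dbSecret.host,
        port: 5432,
        database: "sonamu_prod",
        user: dbSecret.user,
        password: dbSecret.password,
      },
    },
  },
} satisfies SonamuConfig;
```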

Database

1. Check migration status:
NODE_ENV=production pnpm sonamu migrate status
2. Test on Shadow DB (recommended):
NODE_ENV=production pnpm sonamu migrate shadow-test
3. Apply to actual DB:
NODE_ENV=production pnpm sonamu migrate run
4. Rollback (if needed):
NODE_ENV=production pnpm sonamu migrate rollback
PostgreSQL backup:
# Manual backup
pg_dump -h prod-db.example.com -U postgres -d sonamu_prod > backup.sql

# Compressed backup
pg_dump -h prod-db.example.com -U postgres -d sonamu_prod | gzip > backup.sql.gz

# Restore
gunzip < backup.sql.gz | psql -h prod-db.example.com -U postgres -d sonamu_prod
Automatic backup (Cron):
# crontab -e
# Daily backup at 2 AM
0 2 * * * /usr/bin/pg_dump -h prod-db.example.com -U postgres sonamu_prod | gzip > /backups/sonamu_$(date +\%Y\%m\%d).sql.gz

# Delete backups older than 30 days
0 3 * * * find /backups -name "sonamu_*.sql.gz" -mtime +30 -delete

Monitoring and Logging

Using Winston:
// sonamu.config.ts
import winston from "winston";

export default {
  server: {
    logger: winston.createLogger({
      level: process.env.LOG_LEVEL || "info",
      format: winston.format.json(),
      transports: [
        new winston.transports.File({
          filename: "logs/error.log",
          level: "error"
        }),
        new winston.transports.File({
          filename: "logs/combined.log"
        })
      ]
    })
  }
} satisfies SonamuConfig;
Using the logger:
class UserModelClass extends BaseModelClass {
  @api({ httpMethod: "POST" })
  async createUser(params: UserSaveParams): Promise<User> {
    this.logger.info("Creating user", { email: params.email });

    try {
      const user = await this.save(params);
      this.logger.info("User created", { userId: user.id });
      return user;
    } catch (error) {
      this.logger.error("Failed to create user", { error, params });
      throw error;
    }
  }
}
Health check endpoint:
// health.frame.ts
import { BaseFrameClass, api } from "sonamu";

class HealthFrameClass extends BaseFrameClass {
  @api({ httpMethod: "GET" })  // Omit guards = public API
  async healthCheck(): Promise<{
    status: string;
    timestamp: Date;
    uptime: number;
    database: string;
  }> {
    // Check DB connection
    let databaseStatus = "ok";
    try {
      await this.getDB("r").raw("SELECT 1");
    } catch (error) {
      databaseStatus = "error";
    }

    return {
      status: databaseStatus === "ok" ? "healthy" : "unhealthy",
      timestamp: new Date(),
      uptime: process.uptime(),
      database: databaseStatus
    };
  }
}

export const HealthFrame = new HealthFrameClass();
Sentry integration:
// sonamu.config.ts
import * as Sentry from "@sentry/node";

if (process.env.NODE_ENV === "production") {
  Sentry.init({
    dsn: process.env.SENTRY_DSN,
    environment: process.env.NODE_ENV,
    tracesSampleRate: 1.0,
  });
}

export default {
  server: {
    lifecycle: {
      async onStart() {
        console.log("Server started");
      },
      async onError(error: Error) {
        if (process.env.NODE_ENV === "production") {
          Sentry.captureException(error);
        }
      }
    }
  }
} satisfies SonamuConfig;

Performance Optimization

1. Connection Pool settings:
export default {
  database: {
    connections: {
      main: {
        pool: {
          min: 2,
          max: 10,
          acquireTimeoutMillis: 30000,
          idleTimeoutMillis: 30000
        }
      }
    }
  }
} satisfies SonamuConfig;
2. Enable caching:
import { bentostore } from "bentocache";
import { redisDriver } from "bentocache/drivers/redis";

export default {
  server: {
    cache: {
      default: "redis",
      stores: {
        redis: bentostore()
          .useL2Layer(redisDriver({
            connection: {
              host: process.env.REDIS_HOST,
              port: Number(process.env.REDIS_PORT) || 6379
            }
          }))
      }
    }
  }
} satisfies SonamuConfig;
3. Enable compression:
import compress from "@fastify/compress";

export default {
  server: {
    lifecycle: {
      async onStart(fastify) {
        await fastify.register(compress);
      }
    }
  }
} satisfies SonamuConfig;
Nginx configuration:
server {
  # ...

  location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff|woff2|ttf)$ {
    expires 1y;
    add_header Cache-Control "public, immutable";
  }
}

Docker

Create Dockerfile:
# api/Dockerfile
FROM node:18-alpine

WORKDIR /app

# Install pnpm
RUN npm install -g pnpm

# Copy and install dependencies
COPY package.json pnpm-lock.yaml ./
RUN pnpm install --frozen-lockfile

# Copy source code
COPY . .

# Build
RUN pnpm build

# Run in production mode
ENV NODE_ENV=production
EXPOSE 1028

CMD ["node", "dist/index.js"]
docker-compose.yml:
version: '3.8'

services:
  postgres:
    image: postgres:15-alpine
    environment:
      POSTGRES_DB: sonamu_prod
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    ports:
      - "5432:5432"

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"

  api:
    build: ./api
    environment:
      NODE_ENV: production
      DB_HOST: postgres
      DB_PORT: 5432
      DB_NAME: sonamu_prod
      DB_USER: postgres
      DB_PASSWORD: ${DB_PASSWORD}
      REDIS_HOST: redis
      REDIS_PORT: 6379
    ports:
      - "1028:1028"
    depends_on:
      - postgres
      - redis

  web:
    build: ./web
    ports:
      - "80:80"
    depends_on:
      - api

volumes:
  postgres_data:
Run:
docker-compose up -d

CI/CD

.github/workflows/deploy.yml:
name: Deploy

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'

      - name: Install pnpm
        run: npm install -g pnpm

      - name: Install dependencies
        run: pnpm install

      - name: Run tests
        run: pnpm test

      - name: Build
        run: pnpm build

      - name: Deploy to server
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.SERVER_HOST }}
          username: ${{ secrets.SERVER_USER }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          script: |
            cd /var/www/sonamu
            git pull
            pnpm install
            pnpm build
            pm2 restart sonamu-api

Security

Let’s Encrypt + Nginx:
# Install Certbot
sudo apt install certbot python3-certbot-nginx

# Issue certificate
sudo certbot --nginx -d example.com -d www.example.com

# Test automatic renewal
sudo certbot renew --dry-run
Nginx HTTPS configuration:
server {
  listen 443 ssl http2;
  server_name example.com;

  ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem;
  ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

  ssl_protocols TLSv1.2 TLSv1.3;
  ssl_ciphers HIGH:!aNULL:!MD5;

  # ...
}

# HTTP to HTTPS redirect
server {
  listen 80;
  server_name example.com;
  return 301 https://$server_name$request_uri;
}
CORS configuration:
// sonamu.config.ts
export default {
  server: {
    cors: {
      origin: process.env.CORS_ORIGIN || "http://localhost:3028",
      credentials: true,
      methods: ["GET", "POST", "PUT", "DELETE", "PATCH"]
    }
  }
} satisfies SonamuConfig;
Rate limiting:
import rateLimit from "@fastify/rate-limit";

export default {
  server: {
    lifecycle: {
      async onStart(fastify) {
        await fastify.register(rateLimit, {
          max: 100,  // 100 requests
          timeWindow: '1 minute'
        });
      }
    }
  }
} satisfies SonamuConfig;

Best Practices

Security:
  • HTTPS configured
  • Sensitive info managed via environment variables
  • CORS properly configured
  • Rate Limiting set up
  • SQL Injection prevention (using Puri)
  • XSS prevention (input validation)
Performance:
  • Connection Pool optimized
  • Redis caching enabled
  • Static file caching
  • DB indexes optimized
  • N+1 queries eliminated
Monitoring:
  • Health check endpoint
  • Error monitoring (Sentry)
  • Log collection (Winston)
  • DB backup automation
Deployment:
  • Production environment variables set
  • Migration tested (Shadow DB)
  • CI/CD pipeline
  • Rollback plan