System Architecture Overview
High-level architecture of the EthosPower AI platform including deployment, data flow, and key components.
Last updated: 31 January 2026
EthosPower AI is an edge-first web application built for the power generation and industrial battery consulting industry. This document provides a high-level overview of the system architecture.
Technology Stack
The platform is built on a carefully selected technology stack optimized for performance, developer experience, and edge deployment:
Frontend & Framework
- Next.js 15 - React framework with App Router
- TypeScript - Type-safe development
- Tailwind CSS - Utility-first styling
- Lucide React - Icon library
Content Management
- Payload CMS 3.x - Headless CMS with admin UI
- MDX - Markdown with React components for blog and docs
- gray-matter - Frontmatter parsing
Edge Runtime
- Cloudflare Workers - Serverless edge compute
- OpenNext - Next.js to Cloudflare adapter
- D1 - SQLite at the edge
- R2 - Object storage for media
- KV - Key-value caching
- Vectorize - Vector embeddings for RAG
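In a Workers project, these resources surface as bindings on the handler's `env` object. A minimal sketch of the corresponding TypeScript shape (the binding names here are assumptions; the real names are whatever the project's wrangler config declares, and the real types come from `@cloudflare/workers-types`):

```typescript
// Minimal stand-ins for types normally imported from "@cloudflare/workers-types".
type D1Database = { prepare(sql: string): unknown };
type R2Bucket = { get(key: string): Promise<unknown> };
type KVNamespace = { get(key: string): Promise<string | null> };
type VectorizeIndex = { query(vector: number[], opts?: object): Promise<unknown> };
type Ai = { run(model: string, input: object): Promise<unknown> };

// Hypothetical binding names for illustration only.
export interface Env {
  DB: D1Database;            // D1: CMS content and sessions
  MEDIA: R2Bucket;           // R2: uploaded media
  CACHE: KVNamespace;        // KV: short-lived response cache
  VECTORIZE: VectorizeIndex; // Vectorize: document embeddings for RAG
  AI: Ai;                    // Workers AI: embeddings and inference
}

// The binding names a Worker entrypoint would expect at runtime.
export const BINDING_NAMES = ["DB", "MEDIA", "CACHE", "VECTORIZE", "AI"] as const;
```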
Deployment Architecture
┌─────────────────────────────────────────────────────────────┐
│ Cloudflare Edge Network │
├─────────────────────────────────────────────────────────────┤
│ │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────────┐ │
│ │ Workers │ │ D1 │ │ R2 │ │
│ │ (OpenNext) │ │ (Database) │ │ (Media Storage) │ │
│ └──────────────┘ └──────────────┘ └──────────────────┘ │
│ │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────────┐ │
│ │ KV │ │ Vectorize │ │ Workers AI │ │
│ │ (Cache) │ │ (Embeddings) │ │ (Inference) │ │
│ └──────────────┘ └──────────────┘ └──────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────┘
Key Subsystems
RAG Pipeline
The Retrieval-Augmented Generation system powers the AI chat assistant:
- Document Ingestion - Content chunked and embedded via Workers AI
- Semantic Search - Vectorize finds relevant chunks
- Context Assembly - Sliding window manages token limits
- Response Generation - Claude generates grounded answers
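The context-assembly stage above can be sketched as a pure function: pack the retrieved chunks into a token budget, highest-scoring first. Token counts are estimated at roughly 4 characters per token, an assumption; production code would use the model's real tokenizer.

```typescript
interface Chunk {
  text: string;
  score: number; // similarity score returned by the vector search
}

// Rough heuristic: ~4 characters per token.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

export function assembleContext(chunks: Chunk[], maxTokens: number): string {
  const sorted = [...chunks].sort((a, b) => b.score - a.score);
  const kept: string[] = [];
  let used = 0;
  for (const chunk of sorted) {
    const cost = estimateTokens(chunk.text);
    if (used + cost > maxTokens) continue; // skip chunks that would overflow the window
    kept.push(chunk.text);
    used += cost;
  }
  return kept.join("\n\n");
}
```

The assembled string becomes the grounding context passed to the model alongside the user's question.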
MCP Integration
Model Context Protocol connectors enable AI agents to interact with external systems:
- ERPNext - CRM and business operations
- Task Master - Project and task management
- Neo4j - Knowledge graph queries
- Qdrant - Vector memory storage
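MCP is JSON-RPC 2.0 under the hood: an agent invoking a tool on one of these connectors sends a `tools/call` request. A sketch of that message shape (the tool name and arguments below are hypothetical, not the platform's actual API):

```typescript
// Shape of an MCP tool invocation, per the JSON-RPC 2.0 framing MCP uses.
interface McpToolCall {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: {
    name: string;                      // tool exposed by the connector
    arguments: Record<string, unknown>; // tool-specific arguments
  };
}

export const exampleCall: McpToolCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "erpnext_list_projects",        // hypothetical tool name
    arguments: { customer: "CUST-0001" }, // hypothetical argument
  },
};
```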
Customer Portal
Authenticated area for customer self-service:
- Magic Link Auth - Passwordless email authentication
- Session Management - D1-backed sessions with 30-day expiry
- Dashboard - Projects, tasks, and invoices from ERPNext
Data Flow
Public Request Flow
- Request hits Cloudflare edge
- Workers handle routing via OpenNext
- Static assets served from R2/KV cache
- Dynamic content queries D1 database
- AI features use Vectorize + Workers AI
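The KV-cache-then-D1 part of this flow is a cache-aside pattern. A sketch with the store surface reduced to `get`/`put` so the pattern stands alone (the key name and TTL are illustrative):

```typescript
// Minimal store surface, compatible with a KV namespace binding.
interface Store {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
}

export async function cachedQuery(
  cache: Store,
  key: string,
  loadFromD1: () => Promise<string>, // stands in for a D1 prepared statement
  ttlSeconds = 600,
): Promise<string> {
  const hit = await cache.get(key);
  if (hit !== null) return hit; // served from the edge cache
  const fresh = await loadFromD1(); // cache miss: query the database
  await cache.put(key, fresh, { expirationTtl: ttlSeconds });
  return fresh;
}
```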
Authenticated Request Flow
- Session cookie validated against D1
- Customer ID extracted from session
- ERPNext queries filtered by customer
- Results cached in KV (5-15 min TTL)
- Response returned with security headers
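Two details of this flow worth making concrete: cache keys must be scoped per customer so entries are never shared across tenants, and TTLs can be jittered within the 5-15 minute band so entries don't all expire at once. The key format below is an assumption for illustration:

```typescript
// Customer-scoped cache key: one customer's ERPNext data can never be
// served to another from the cache.
export function customerCacheKey(customerId: string, resource: string): string {
  return `erpnext:${customerId}:${resource}`;
}

// Random TTL in the 5-15 minute band (in seconds) to spread out expirations.
export function cacheTtlSeconds(min = 300, max = 900): number {
  return min + Math.floor(Math.random() * (max - min + 1));
}
```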
Performance Considerations
- Edge-first - All compute happens at the nearest Cloudflare PoP
- Aggressive caching - KV cache for ERPNext responses
- Lazy loading - Heavy components (charts, maps) loaded on demand
- Streaming - SSE for AI responses reduces time-to-first-byte
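The SSE point can be made concrete: each model token is framed as a `data:` event and flushed to the client as it arrives, so text renders before the full completion exists. A sketch (the `[DONE]` sentinel is a common convention, not a requirement of the SSE format):

```typescript
// Frame one payload as a server-sent event: each line of the payload gets
// its own "data:" field, and a blank line terminates the event.
export function sseEvent(data: string): string {
  return data.split("\n").map((line) => `data: ${line}`).join("\n") + "\n\n";
}

// Wrap a token stream (e.g. from a model API) as an SSE response body.
export function sseStream(tokens: AsyncIterable<string> | Iterable<string>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    async start(controller) {
      for await (const token of tokens) {
        controller.enqueue(encoder.encode(sseEvent(token)));
      }
      controller.enqueue(encoder.encode("data: [DONE]\n\n"));
      controller.close();
    },
  });
}
```

The resulting stream is returned with `Content-Type: text/event-stream`, and the browser consumes it via `EventSource` or a streaming `fetch` reader.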