Backend Architecture
Overview
The ResQConnect backend is built using FastAPI (Python 3.10+) and follows a microservice-like architecture. It leverages Google Firebase (Firestore) for data persistence and authentication, and Celery with Redis for asynchronous task processing (primarily for AI agent workflows).
High-Level Architecture
```mermaid
graph TD
    UserWeb[("User (Web)")] -->|HTTP / Next.js| Frontend["Frontend (Web)"]
    UserMobile[("User (Mobile)")] -->|Flutter App| Mobile["Mobile App"]
    subgraph "API Layer"
        Frontend -->|REST API| LB["Load Balancer / Reverse Proxy"]
        Mobile -->|REST API| LB
        LB -->|HTTP| API["Backend API (FastAPI)"]
    end
    subgraph "Backend Services"
        API -->|Auth/Data| Firestore[(Google Firestore)]
        API -->|Enqueues Task| Redis[(Redis Broker)]
        Redis -->|Pulls Task| Worker["Celery Worker"]
        Worker -->|Update| Firestore
    end
    subgraph "External Integrations"
        Worker -->|LLM Calls| OpenAI["OpenAI API"]
        Worker -->|Tracing| Langfuse["Langfuse"]
        Worker -->|Notifications| FCM["Firebase Cloud Messaging"]
    end
```
Component Descriptions
1. API Layer (FastAPI)
The core entry point for all client applications.
- Role: Authentication, request validation, and routing.
- Key Features:
- Authentication: Verifies Firebase ID Tokens via HTTPBearer.
- Validation: Uses Pydantic models to ensure data integrity.
- Routing: Dispatches requests to appropriate controllers (users, requests, tasks, etc.).
2. Data Layer (Firestore)
A NoSQL, document-oriented database.
- Role: Primary data store for the application.
- Collections: users, requests, tasks, resources, disasters, messages, observations.
- Access Pattern: Repository pattern implemented in app/crud/.
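A minimal sketch of the repository pattern used in `app/crud/`. The class and method names here are illustrative assumptions; the real implementation would wrap a Firestore `CollectionReference` from the `google-cloud-firestore` client, while a plain dict stands in below so the access pattern is visible and testable.

```python
import uuid
from typing import Any, Optional


class RequestRepository:
    """Repository-pattern sketch for the `requests` collection.

    A dict stands in for the Firestore collection; only the CRUD
    surface that callers depend on is shown.
    """

    def __init__(self) -> None:
        self._docs: dict[str, dict[str, Any]] = {}

    def create(self, data: dict[str, Any]) -> str:
        doc_id = uuid.uuid4().hex  # Firestore would auto-generate this ID
        self._docs[doc_id] = dict(data)
        return doc_id

    def get(self, doc_id: str) -> Optional[dict[str, Any]]:
        return self._docs.get(doc_id)

    def update(self, doc_id: str, changes: dict[str, Any]) -> None:
        self._docs[doc_id].update(changes)

    def delete(self, doc_id: str) -> None:
        self._docs.pop(doc_id, None)
```

Keeping Firestore access behind this interface lets route handlers and Celery tasks stay ignorant of the database client, and lets tests swap in an in-memory implementation like the one above.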
3. Asynchronous Task Queue (Celery & Redis)
Handles background processing to keep the API responsive.
- Broker: Redis stores task messages.
- Worker: Consumes tasks from Redis and executes them.
- Tasks:
  - Agent Workflows: Matching help requests with available resources/volunteers.
  - Notifications: Sending push notifications via FCM.
  - Data Aggregation: Generating summaries or reports.
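The enqueue/consume flow can be illustrated with the standard library: below, a `queue.Queue` stands in for the Redis broker and a thread for a Celery worker process. This is an analogy, not Celery code; `match_request` is a hypothetical stand-in for an agent-workflow task.

```python
import queue
import threading

# Stand-ins for the real pieces: the Queue plays the Redis broker,
# the thread plays a Celery worker process.
broker: queue.Queue = queue.Queue()
results: dict[str, str] = {}


def match_request(task: dict) -> str:
    """Toy version of an agent workflow: pair a request with a volunteer."""
    return f"request {task['request_id']} -> volunteer {task['volunteer_id']}"


def worker() -> None:
    while True:
        task = broker.get()      # blocks, like a Celery worker polling Redis
        if task is None:         # sentinel: shut down
            break
        results[task["request_id"]] = match_request(task)


t = threading.Thread(target=worker, daemon=True)
t.start()

# The API enqueues and returns immediately; the worker runs in the background.
broker.put({"request_id": "r1", "volunteer_id": "v7"})
broker.put(None)
t.join()
```

The point of the indirection is the same as in the real system: the HTTP request finishes as soon as the task is on the broker, and the slow work (LLM calls, matching) happens out of band.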
4. AI & LLM Integration (OpenAI & Langfuse)
Powers the intelligent features of the platform.
- OpenAI: Provides LLM capabilities for natural language processing and decision making.
- Langfuse: Traces and monitors LLM execution for debugging and optimization.
- Agents & Chatbots:
- ChatbotService: Handles real-time user queries (/chatbot/ask) using context-aware prompts.
- Agent Workflows: Custom Celery workers (agent_v2) that orchestrate complex tasks like resource allocation and disaster response planning (/workflow-outputs).
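One concrete piece of the ChatbotService, sketched under assumptions: the function name, message shape, and truncation budget below are illustrative, but the pattern (stuff retrieved context into a system message, then call the OpenAI API with the message list) is the standard way to build context-aware prompts.

```python
def build_chat_prompt(question: str, context_docs: list[str],
                      max_context_chars: int = 2000) -> list[dict[str, str]]:
    """Assemble an OpenAI-style message list: system context + user question.

    Context documents (e.g. recent observations fetched from Firestore)
    are joined and truncated so the prompt stays within a size budget.
    """
    context = "\n---\n".join(context_docs)[:max_context_chars]
    system = (
        "You are ResQConnect's disaster-response assistant. "
        "Answer using only the context below.\n\n" + context
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]
```

The returned list is what would be passed to the chat-completions call, with Langfuse tracing wrapped around that call to record the exact prompt and response.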
Directory Structure
```
backend/
├── app/
│   ├── api/          # Route handlers (endpoints)
│   ├── core/         # Configuration, security, logging
│   ├── crud/         # Database operations (CRUD)
│   ├── schemas/      # Pydantic models (data validation)
│   ├── services/     # Business logic & external integrations
│   ├── agent_v2/     # AI agent logic
│   └── main.py       # Application entry point
├── tests/            # Pytest suite
└── requirements.txt  # Python dependencies
```