# System Overview
This section gives a single mental model of how ResQConnect works across the three codebases:
- `frontend/` (Next.js web app)
- `backend/` (FastAPI + Firestore + Celery)
- `mobile/` (Flutter + Android on-device LLM runtime)
For this overview, the mobile app is treated as connected to the backend, matching the production architecture.
## Platform Responsibilities
| Codebase | Primary Responsibility | Key Integrations |
|---|---|---|
| `frontend` | Operator and citizen web UX (dashboard, maps, chatbot UI, request/task screens) | Firebase Auth, Backend REST APIs |
| `backend` | Source of truth, orchestration, agentic workflows, chatbot endpoint, moderation flow | Firestore, Redis, Celery, OpenAI, Langfuse, FCM |
| `mobile` | Field UX, help capture, offline assistant, local inference + eventual sync | Android native runtime, Backend APIs, local storage |
## System Context Diagram
```mermaid
graph TD
    Citizen[Citizen / Affected User]
    Responder[Responder / Volunteer]
    Admin[Admin / Dispatcher]
    Web[Frontend Web App<br/>Next.js]
    Mobile[Mobile App<br/>Flutter + Android bridge]
    API[Backend API<br/>FastAPI]
    DB[(Firestore)]
    Queue[(Redis)]
    Worker[Agent Worker<br/>Celery]
    LLM[OpenAI LLM]
    Trace[Langfuse]
    Push[FCM Push]
    LocalModel[On-device LLM<br/>MediaPipe]

    Citizen --> Web
    Responder --> Web
    Admin --> Web
    Citizen --> Mobile
    Responder --> Mobile
    Web -->|REST + Firebase token| API
    Mobile -->|REST + Firebase token| API
    API --> DB
    API --> Queue
    Queue --> Worker
    Worker --> DB
    Worker --> LLM
    Worker --> Trace
    Worker --> Push
    Mobile -->|offline inference| LocalModel
```
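The diagram's write path (API → Queue → Worker → DB/LLM/Push) can be modeled end-to-end in plain Python. The names below are illustrative stand-ins for FastAPI, the Redis-backed Celery queue, Firestore, OpenAI, and FCM, not the project's actual code:

```python
# Illustrative model of the backend write path from the diagram.
# Each external service is replaced by a simple in-memory stand-in
# so the flow is runnable in isolation.
from collections import deque

db = {}            # stand-in for Firestore
queue = deque()    # stand-in for the Redis-backed Celery queue
pushed = []        # stand-in for FCM push delivery

def api_create_request(req_id: str, payload: dict) -> None:
    """API layer: persist the record, then enqueue agent work."""
    db[req_id] = {**payload, "status": "received"}
    queue.append(req_id)

def llm_plan(payload: dict) -> str:
    """Stand-in for the OpenAI call that drafts a response plan."""
    return f"Dispatch team to {payload['location']}"

def worker_tick() -> None:
    """Worker: pull one job, call the LLM, update the DB, notify via push."""
    req_id = queue.popleft()
    plan = llm_plan(db[req_id])
    db[req_id].update(status="planned", plan=plan)
    pushed.append((req_id, plan))

api_create_request("req-1", {"location": "Sector 7", "need": "water"})
worker_tick()
print(db["req-1"]["status"])  # -> planned
```

The key design point the model preserves is that the API only writes and enqueues; all LLM calls, tracing, and push notifications happen in the worker, keeping request latency low and AI work retryable.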
## How To Read This Section
- Codebase Map: What exists in each repo and how responsibilities are separated.
- Dataflow: How data moves from user actions to stored records and operational outputs.
- Agentic Workflow: How AI-assisted planning is generated, moderated, and converted to action.
- Chatbot Architecture: Request lifecycle for `/chatbot/ask`, context assembly, and safeguards.
- Offline LLM: Mobile on-device inference, online/offline routing, and sync strategy.