# Mobile Workflows
## 1. Offline Model Download

```mermaid
sequenceDiagram
    participant User
    participant Start as StartScreen (Flutter)
    participant Channel as LlmPlatformChannel
    participant Native as MainActivity (Kotlin)
    participant Net as HuggingFace URL
    participant FS as App filesDir
    User->>Start: Tap "Download Offline Model"
    Start->>Channel: downloadModel()
    Channel->>Native: MethodChannel.invoke("downloadModel")
    Native->>Net: Open URL connection
    loop file chunks
        Net-->>Native: bytes
        Native-->>Channel: downloadProgress(downloaded,total)
        Channel-->>Start: progress stream event
    end
    Native->>FS: write qwen.task
    Native->>Native: save model_path in SharedPreferences
    Native-->>Channel: success=true
    Channel-->>Start: Future<bool> completed
    Start-->>User: Navigate to chat on success
```
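The "file chunks" loop above can be sketched as follows. This is an illustrative Java sketch (the app's native side is Kotlin); the 8 KB chunk size and the `onProgress` callback shape are assumptions, not the app's actual API.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.UncheckedIOException;
import java.util.function.BiConsumer;

// Illustrative sketch of the download loop: read from the model URL stream,
// write to a file in filesDir, and report progress after each chunk
// (mirroring the downloadProgress(downloaded, total) callback).
public class ChunkedDownload {

    public static long copyWithProgress(InputStream in, OutputStream out,
                                        long total,
                                        BiConsumer<Long, Long> onProgress) {
        byte[] buffer = new byte[8192]; // assumed chunk size
        long downloaded = 0;
        try {
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
                downloaded += read;
                onProgress.accept(downloaded, total); // -> progress stream event
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return downloaded;
    }
}
```

Reporting `(downloaded, total)` after every chunk, rather than only on completion, is what lets the Flutter side render a live progress bar.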
## 2. Inference Invocation Path

This is the low-level execution path as currently implemented in code. Note: the active ChatScreen is still scripted; wiring the chat input to this path is pending.
```mermaid
sequenceDiagram
    participant UI as Flutter Caller
    participant Dart as LlmPlatformChannel.generateResponse
    participant Native as MainActivity
    participant Engine as InferenceModel
    participant MP as MediaPipe LlmInferenceSession
    participant Metrics as MetricsRecorder
    UI->>Dart: generateResponse(prompt)
    Dart->>Dart: start stopwatch + read memory
    Dart->>Native: invokeMethod("generateResponse")
    Native->>Engine: getInstance() if needed
    Engine->>MP: addQueryChunk(prompt)
    Engine->>MP: generateResponseAsync(...)
    MP-->>Native: final response text
    Native-->>Dart: result.success(response)
    Dart->>Dart: stop stopwatch + compute metrics
    Dart->>Metrics: recordEntry(...)
    Dart-->>UI: response text
```
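The stopwatch wrapper around the platform-channel call can be sketched like this. It is a Java illustration of the Dart-side logic; the `MetricEntry` fields shown are assumptions based on the diagram, not the app's actual schema.

```java
import java.util.function.Supplier;

// Sketch of the timing wrapper: start a stopwatch, invoke the native method,
// stop the stopwatch, and build a metrics entry for MetricsRecorder.
// Field names are illustrative, not the app's real schema.
public class TimedInference {

    public record MetricEntry(String prompt, String response,
                              long latencyMs, int responseChars) {}

    public static MetricEntry generateWithMetrics(String prompt,
                                                  Supplier<String> invokeNative) {
        long start = System.nanoTime();            // start stopwatch
        String response = invokeNative.get();      // invokeMethod("generateResponse")
        long latencyMs = (System.nanoTime() - start) / 1_000_000; // stop stopwatch
        return new MetricEntry(prompt, response, latencyMs, response.length());
    }
}
```

Measuring on the Dart side means the latency figure includes platform-channel overhead, not just MediaPipe inference time, which is worth noting when interpreting the recorded metrics.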
## 3. Help Request Form Flow

```mermaid
flowchart TD
    A[Open Help Request Screen] --> B[Fill disaster type + details]
    B --> C[Optional photo via image_picker]
    C --> D[Get location permission]
    D --> E{Location captured?}
    E -- No --> F[Show validation snackbar]
    E -- Yes --> G[Tap Send Request]
    G --> H[Local simulated submit]
    H --> I[Show success snackbar]
    I --> J[Navigate back]
```
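The validation branch in the flowchart reduces to a simple pre-submit check; a minimal Java sketch, with hypothetical field names:

```java
// Minimal pre-submit validation mirroring the flowchart's branches: the
// request can be sent only when the required text fields and a captured
// location are present. Field names here are hypothetical.
public class HelpRequestValidator {

    public static boolean canSubmit(String disasterType, String details,
                                    Double latitude, Double longitude) {
        boolean hasText = disasterType != null && !disasterType.isBlank()
                && details != null && !details.isBlank();
        boolean hasLocation = latitude != null && longitude != null; // "Location captured?"
        return hasText && hasLocation; // false -> show validation snackbar
    }
}
```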
## 4. Metrics Storage and Export

```mermaid
flowchart LR
    A[generateResponse called] --> B[Collect latency + memory + text metrics]
    B --> C[Append MetricEntry in memory]
    C --> D[Persist llm_metrics.json in app-docs]
    D --> E{User exports CSV?}
    E -- Yes --> F[Ask Android for public Downloads path]
    F --> G[Write llm_metrics.csv]
    E -- No --> H[Keep JSON history only]
```
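The CSV-export step can be sketched as a straightforward serialization of the in-memory entries. The column names below are assumptions for illustration; the real `llm_metrics.csv` schema is defined by `MetricsRecorder`.

```java
import java.util.List;
import java.util.StringJoiner;

// Sketch of the export step: flatten the in-memory metric entries into the
// text written to llm_metrics.csv. Header columns are assumed; the actual
// schema lives in MetricsRecorder.
public class MetricsCsvExport {

    public static String toCsv(List<String[]> rows) {
        StringJoiner csv = new StringJoiner("\n");
        csv.add("timestamp,latency_ms,memory_mb,response_chars"); // assumed columns
        for (String[] row : rows) {
            csv.add(String.join(",", row));
        }
        return csv.toString();
    }
}
```

Keeping JSON as the canonical on-device history and generating CSV only on demand means the export can be regenerated at any time without a separate write path on every inference.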