Add Foreground Service to keep WebSocket alive #79
Conversation
Important: Review skipped due to path filters.

⛔ Files ignored due to path filters (1)

CodeRabbit blocks several paths by default. You can override this behavior by explicitly including those paths in the path filters.
Actionable comments posted: 5
🧹 Nitpick comments (5)
app/app.json (1)
32-36: Permissions look right; avoid duplication between app.json and plugin.

You’ve added FGS + POST_NOTIFICATIONS here and the plugin also injects them. It’s harmless (deduped), but consider keeping them in one place to prevent drift.
app/plugins/with-ws-fgs.js (1)
27-35: Good: injects the right FGS/notifications permissions; optional mic gate.

This aligns with Android 14+ FGS policy. If microphone is true, consider also adding RECORD_AUDIO here to keep all related perms co-located.

```diff
 if (microphone) {
   addPerm('android.permission.FOREGROUND_SERVICE_MICROPHONE');
-  // If you actually record audio, you'll also need RECORD_AUDIO elsewhere.
+  addPerm('android.permission.RECORD_AUDIO');
 }
```

app/app/hooks/useAudioStreamer.ts (3)
123-133: Ensure channel exists before notifyInfo().

If FGS hasn’t run yet, the “Streaming” channel may not exist. Create it defensively.

```diff
 const notifyInfo = useCallback(async (title: string, body: string) => {
   try {
+    await notifee.createChannel({
+      id: FGS_CHANNEL_ID,
+      name: 'Streaming',
+      importance: AndroidImportance.LOW,
+    });
     await notifee.displayNotification({
       title,
       body,
       android: { channelId: FGS_CHANNEL_ID },
     });
   } catch {
     // ignore if not available
   }
 }, []);
```
99-101: Timer refs: prefer platform-agnostic types.

NodeJS.Timeout can be wrong in RN. Use `ReturnType<typeof setTimeout>` / `ReturnType<typeof setInterval>`.

```diff
-  const reconnectTimeoutRef = useRef<NodeJS.Timeout | null>(null);
-  const heartbeatRef = useRef<NodeJS.Timeout | null>(null);
+  const reconnectTimeoutRef = useRef<ReturnType<typeof setTimeout> | null>(null);
+  const heartbeatRef = useRef<ReturnType<typeof setInterval> | null>(null);
```
253-256: Support auth headers/subprotocols for the “advanced backend.”

RN WebSocket supports headers/protocols; exposing optional params will help with token-based or subprotocol auth without URL hacks.

```diff
 interface UseAudioStreamer {
   // ...
-  startStreaming: (url: string) => Promise<void>;
+  startStreaming: (url: string, opts?: { protocols?: string | string[]; headers?: Record<string, string> }) => Promise<void>;
```

```diff
-  const startStreaming = useCallback(async (url: string): Promise<void> => {
+  const startStreaming = useCallback(async (url: string, opts?: { protocols?: string | string[]; headers?: Record<string, string> }): Promise<void> => {
     // ...
-    const ws = new WebSocket(trimmed);
+    const ws = new WebSocket(trimmed, opts?.protocols, { headers: opts?.headers });
```

I can wire this through your UI and add a token provider if you share the advanced backend’s auth shape.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (4)
- app/app.json (3 hunks)
- app/app/hooks/useAudioStreamer.ts (3 hunks)
- app/package.json (1 hunks)
- app/plugins/with-ws-fgs.js (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
`app/**/*.{ts,tsx,js,jsx}`
📄 CodeRabbit inference engine (CLAUDE.md)
`app/**/*.{ts,tsx,js,jsx}`: Follow standard React Native conventions for TypeScript/JavaScript in the mobile app
Mobile app WebSocket streaming must implement the Wyoming protocol: send JSONL header (audio-start/chunk/stop) followed by binary payloads
Files:
- app/plugins/with-ws-fgs.js
- app/app/hooks/useAudioStreamer.ts
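The Wyoming framing the guideline above describes (a JSONL header followed by binary payloads) can be sketched in isolation. This is an illustrative TypeScript sketch, not the hook's actual implementation; the helper names and the exact `data` fields are assumptions.

```typescript
// Illustrative Wyoming-style framing: a JSON header line terminated by "\n",
// with payload_length telling the peer how many raw bytes follow.

interface WyomingEvent {
  type: string; // e.g. "audio-start", "audio-chunk", "audio-stop"
  data?: Record<string, unknown>;
  payload_length?: number | null;
}

// Build the newline-terminated JSONL header for an event.
function encodeHeader(event: WyomingEvent, payload?: Uint8Array): string {
  const header = {
    ...event,
    payload_length: payload ? payload.byteLength : null,
  };
  return JSON.stringify(header) + "\n";
}

// Frame one audio chunk: the header line goes on the wire first,
// then the raw PCM bytes as a separate binary message.
function frameAudioChunk(
  pcm: Uint8Array,
  rate = 16000,
  width = 2,
  channels = 1,
): [string, Uint8Array] {
  const header = encodeHeader(
    { type: "audio-chunk", data: { rate, width, channels } },
    pcm,
  );
  return [header, pcm];
}
```

A header-only event such as `audio-stop` would simply be `encodeHeader({ type: "audio-stop" })` with no payload message following it.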
🔇 Additional comments (5)
app/package.json (1)
17-17: Pinning expo-build-properties with ~ is fine.

This reduces drift across devs/CI; no action needed.
app/plugins/with-ws-fgs.js (2)
47-51: Foreground service types are set correctly for Android 14+.

dataSync (+ microphone when enabled) is appropriate for a WS keep-alive/data channel.

41-46: Retain service creation logic: it’s required to override Notifee’s default AAR service entry, which declares `<service android:name="app.notifee.core.ForegroundService" android:foregroundServiceType="shortService"/>` during manifest merging, so you can set the proper `android:foregroundServiceType` on Android 14+ (github.com, docs.page).

Likely an incorrect or invalid review comment.
app/app/hooks/useAudioStreamer.ts (2)
360-373: Intentional: don’t auto-stop on unmount.

Acknowledged. Be sure owners explicitly call stopStreaming() on session end to avoid leaked sockets after navigation.

135-153: Wyoming framing LGTM.

JSONL header + optional binary payload matches the guideline for audio-start/chunk/stop.
```json
[
  "./plugins/with-ws-fgs",
  {
    "android": {
      "permissions": [
        "FOREGROUND_SERVICE",
        "FOREGROUND_SERVICE_DATA_SYNC",
        "POST_NOTIFICATIONS"
      ]
    }
  }
],
```
💡 Verification agent
🧩 Analysis chain
Plugin options don’t match the plugin’s API.
with-ws-fgs expects { microphone?: boolean }. The passed { android: { permissions: [...] } } is ignored. If you’ll ever record from mic, pass microphone: true; otherwise remove options to avoid confusion.
Apply:
```diff
- [
-   "./plugins/with-ws-fgs",
-   {
-     "android": {
-       "permissions": [
-         "FOREGROUND_SERVICE",
-         "FOREGROUND_SERVICE_DATA_SYNC",
-         "POST_NOTIFICATIONS"
-       ]
-     }
-   }
- ],
+ ["./plugins/with-ws-fgs", { "microphone": false }],
```

If you plan to capture mic audio later, also add RECORD_AUDIO under android.permissions.
🏁 Script executed:
```bash
#!/bin/bash
# Locate the with-ws-fgs plugin file in the repository
file=$(fd -I with-ws-fgs -t f | head -1)
echo "Found plugin file: $file"
# Display the first 200 lines for inspection
sed -n '1,200p' "$file"
```

Length of output: 2125
Plugin options only support the microphone flag; remove the android field
with-ws-fgs is defined as (config, { microphone?: boolean }) and ignores other options. Replace your plugin entry with:
```diff
- [
-   "./plugins/with-ws-fgs",
-   {
-     "android": {
-       "permissions": [
-         "FOREGROUND_SERVICE",
-         "FOREGROUND_SERVICE_DATA_SYNC",
-         "POST_NOTIFICATIONS"
-       ]
-     }
-   }
- ],
+ ["./plugins/with-ws-fgs", { "microphone": false }],
```

If you plan to record audio, use { "microphone": true } and manually add android.permission.RECORD_AUDIO to your AndroidManifest (e.g. via expo-build-properties or another config plugin).
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Suggested change:

```json
["./plugins/with-ws-fgs", { "microphone": false }],
```
🤖 Prompt for AI Agents
In app/app.json around lines 51 to 62, the with-ws-fgs plugin entry passes an
unsupported "android" object; the plugin only accepts a single optional
"microphone" boolean. Replace the current plugin options with either no options
or { "microphone": true } and remove the entire "android" field. If you need
audio recording also add android.permission.RECORD_AUDIO to AndroidManifest via
expo-build-properties or another config plugin.
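As a rough model of the behavior described above, the plugin's single `{ microphone?: boolean }` option can be thought of as selecting a permission list. The function below is a hypothetical pure-TypeScript reduction for illustration only; the real with-ws-fgs plugin injects permissions into the AndroidManifest via Expo config plugins rather than returning an array.

```typescript
// Hypothetical reduction of with-ws-fgs option handling: the one supported
// option gates the microphone-related permissions; everything else is fixed.

interface WsFgsOptions {
  microphone?: boolean;
}

const BASE_PERMISSIONS = [
  "android.permission.FOREGROUND_SERVICE",
  "android.permission.FOREGROUND_SERVICE_DATA_SYNC",
  "android.permission.POST_NOTIFICATIONS",
];

function permissionsFor(options: WsFgsOptions = {}): string[] {
  const perms = [...BASE_PERMISSIONS];
  if (options.microphone) {
    perms.push(
      "android.permission.FOREGROUND_SERVICE_MICROPHONE",
      // Actually capturing audio also requires RECORD_AUDIO.
      "android.permission.RECORD_AUDIO",
    );
  }
  return perms;
}
```

Note that an unrecognized option shape (like the `android` object in the original app.json entry) simply never reaches this logic, which is the review's point.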
```typescript
async function startForegroundServiceNotification(title: string, body: string) {
  ensureFgsRegistered();
  await ensureNotificationPermission();

  // Create channel if needed
  await notifee.createChannel({
    id: FGS_CHANNEL_ID,
    name: 'Streaming',
    importance: AndroidImportance.LOW,
  });

  // Start (or update) the foreground notification
  await notifee.displayNotification({
    id: FGS_NOTIFICATION_ID,
    title,
    body,
    android: {
      channelId: FGS_CHANNEL_ID,
      asForegroundService: true,
      ongoing: true,
      pressAction: { id: 'default' },
    },
  });
}
```
🛠️ Refactor suggestion
Harden FGS notification path: don’t fail start on notification errors.
Wrap in try/catch and ensure the channel exists; if POST_NOTIFICATIONS is denied, proceed with streaming anyway.
```diff
 async function startForegroundServiceNotification(title: string, body: string) {
-  ensureFgsRegistered();
-  await ensureNotificationPermission();
-
-  // Create channel if needed
-  await notifee.createChannel({
-    id: FGS_CHANNEL_ID,
-    name: 'Streaming',
-    importance: AndroidImportance.LOW,
-  });
-
-  // Start (or update) the foreground notification
-  await notifee.displayNotification({
-    id: FGS_NOTIFICATION_ID,
-    title,
-    body,
-    android: {
-      channelId: FGS_CHANNEL_ID,
-      asForegroundService: true,
-      ongoing: true,
-      pressAction: { id: 'default' },
-    },
-  });
+  try {
+    ensureFgsRegistered();
+    await ensureNotificationPermission();
+    await notifee.createChannel({
+      id: FGS_CHANNEL_ID,
+      name: 'Streaming',
+      importance: AndroidImportance.LOW,
+    });
+    await notifee.displayNotification({
+      id: FGS_NOTIFICATION_ID,
+      title,
+      body,
+      android: {
+        channelId: FGS_CHANNEL_ID,
+        asForegroundService: true,
+        ongoing: true,
+        pressAction: { id: 'default' },
+      },
+    });
+  } catch (e) {
+    console.warn('[AudioStreamer] FGS notification failed:', (e as any)?.message ?? e);
+  }
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Suggested change:

```typescript
async function startForegroundServiceNotification(title: string, body: string) {
  try {
    ensureFgsRegistered();
    await ensureNotificationPermission();
    // Create channel if needed
    await notifee.createChannel({
      id: FGS_CHANNEL_ID,
      name: 'Streaming',
      importance: AndroidImportance.LOW,
    });
    // Start (or update) the foreground notification
    await notifee.displayNotification({
      id: FGS_NOTIFICATION_ID,
      title,
      body,
      android: {
        channelId: FGS_CHANNEL_ID,
        asForegroundService: true,
        ongoing: true,
        pressAction: { id: 'default' },
      },
    });
  } catch (e) {
    console.warn(
      '[AudioStreamer] FGS notification failed:',
      (e as any)?.message ?? e
    );
  }
}
```
🤖 Prompt for AI Agents
In app/app/hooks/useAudioStreamer.ts around lines 56 to 79, the foreground
service notification startup is currently unprotected and can throw (e.g.,
permission denied), which would abort streaming; wrap the channel creation and
displayNotification calls in a try/catch so any errors are logged but do not
rethrow, check permission status before attempting notification and if
POST_NOTIFICATIONS is denied skip notification steps and continue, and ensure
the channel creation is attempted only when permitted or handled safely so
streaming proceeds regardless of notification failures.
```typescript
// Ensure Foreground Service is up so the JS VM isn’t killed when backgrounded
await startForegroundServiceNotification('Streaming active', 'Keeping WebSocket connection alive');

setIsConnecting(true);
setError(null);
console.log(`[AudioStreamer] Initializing WebSocket: ${trimmed}`);
if (websocketRef.current) await stopStreaming(); // close any existing
```
Bug: FGS is stopped immediately when reconnecting. Reorder calls.
You start the foreground notification, then call stopStreaming() which stops the FGS you just started. This can drop the keep-alive right before dialing the new socket.
```diff
- // Ensure Foreground Service is up so the JS VM isn’t killed when backgrounded
- await startForegroundServiceNotification('Streaming active', 'Keeping WebSocket connection alive');
-
- console.log(`[AudioStreamer] Initializing WebSocket: ${trimmed}`);
- if (websocketRef.current) await stopStreaming(); // close any existing
+ // Close any existing socket first (stopStreaming() currently stops FGS)
+ if (websocketRef.current) await stopStreaming();
+ // Ensure Foreground Service is up so the JS VM isn’t killed when backgrounded
+ await startForegroundServiceNotification('Streaming active', 'Keeping WebSocket connection alive');
+ console.log(`[AudioStreamer] Initializing WebSocket: ${trimmed}`);
```

Alternatively, add a parameter to stopStreaming(skipFgsStop?: boolean) and skip stopping FGS during handoffs.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Suggested change:

```typescript
// Close any existing socket first (stopStreaming() currently stops FGS)
if (websocketRef.current) await stopStreaming();
// Ensure Foreground Service is up so the JS VM isn’t killed when backgrounded
await startForegroundServiceNotification('Streaming active', 'Keeping WebSocket connection alive');
console.log(`[AudioStreamer] Initializing WebSocket: ${trimmed}`);
```
🤖 Prompt for AI Agents
In app/app/hooks/useAudioStreamer.ts around lines 242 to 247, the foreground
service is started before calling stopStreaming(), which causes stopStreaming to
immediately stop the FGS you just started; either move the call to
stopStreaming() so it runs before startForegroundServiceNotification (ensuring
any existing socket is closed first) or add a skipFgsStop?: boolean parameter to
stopStreaming and call stopStreaming(true) here so the handoff keeps the FGS
running while reconnecting; update the stopStreaming implementation and all its
callers accordingly if you choose the parameter approach.
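The ordering bug above can be demonstrated with a tiny model of the lifecycle. Everything here is hypothetical (the real hook manages a Notifee service and a WebSocket, not booleans); it only illustrates why closing the old socket before starting the foreground service keeps the FGS alive across a reconnect, and how a skipFgsStop flag would work.

```typescript
// Minimal model of the FGS/socket lifecycle to illustrate call ordering.
// `fgsRunning` stands in for the Notifee foreground service state.

class StreamerModel {
  fgsRunning = false;
  socketOpen = false;

  startFgs(): void { this.fgsRunning = true; }
  stopFgs(): void { this.fgsRunning = false; }

  // Mirrors the suggested stopStreaming(skipFgsStop?) signature:
  // callers doing a handoff can keep the FGS running.
  stopStreaming(skipFgsStop = false): void {
    this.socketOpen = false;
    if (!skipFgsStop) this.stopFgs();
  }

  // Fixed order: tear down the old socket first, then bring up the FGS,
  // then dial the new socket. Starting the FGS before stopStreaming()
  // would let stopStreaming() kill the service we just started.
  startStreaming(): void {
    if (this.socketOpen) this.stopStreaming();
    this.startFgs();
    this.socketOpen = true;
  }
}
```

With the original order (startFgs before stopStreaming), the model would end a reconnect with `fgsRunning === false`, which is exactly the dropped keep-alive the review describes.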
```typescript
// Start heartbeat
if (heartbeatRef.current) clearInterval(heartbeatRef.current);
heartbeatRef.current = setInterval(() => {
  try {
    if (websocketRef.current?.readyState === WebSocket.OPEN) {
      websocketRef.current.send(JSON.stringify({ type: 'ping', t: Date.now() }));
    }
  } catch {}
}, HEARTBEAT_MS);
```
💡 Verification agent
🧩 Analysis chain
Heartbeat should use Wyoming JSONL framing (or drop it).
Sending a raw JSON message without newline isn’t Wyoming and may confuse strict servers. Use the existing sendWyomingEvent so it’s JSONL-framed, payload_length null, and consistent.
```diff
- heartbeatRef.current = setInterval(() => {
-   try {
-     if (websocketRef.current?.readyState === WebSocket.OPEN) {
-       websocketRef.current.send(JSON.stringify({ type: 'ping', t: Date.now() }));
-     }
-   } catch {}
- }, HEARTBEAT_MS);
+ heartbeatRef.current = setInterval(() => {
+   if (websocketRef.current?.readyState === WebSocket.OPEN) {
+     // Fire-and-forget; ignore promise
+     void sendWyomingEvent({ type: 'ping', data: { t: Date.now() } });
+   }
+ }, HEARTBEAT_MS);
```

Confirm your backend tolerates unknown Wyoming event types like “ping”; if not, consider a zero-length audio-chunk as a keep-alive or rely solely on FGS.
🏁 Script executed:
```bash
#!/bin/bash
set -eux
# Frontend: locate sendWyomingEvent definition and usage
rg -n "sendWyomingEvent" -g "*.ts" -g "*.tsx"
# Confirm import of sendWyomingEvent in useAudioStreamer.ts
rg -n "sendWyomingEvent" -g "app/app/hooks/useAudioStreamer.ts"
# Find HEARTBEAT_MS constant
rg -n "HEARTBEAT_MS" -g "app/**/*.ts" -g "app/**/*.tsx"
# Backend: locate ws_pcm endpoint
rg -nP "async def ws_pcm" -g "backends/**/*.py"
# Backend: inspect JSONL parsing loop and handling of unknown event types
rg -n "for event" -g "backends/**/*.py"
rg -n "type == 'ping'" -g "backends/**/*.py"
```

Length of output: 1712
🏁 Script executed:
```bash
#!/bin/bash
set -eux
# List all Python files under backends (up to depth 5)
find backends -type f -maxdepth 5 -name "*.py"
# Search for any ws_pcm handler definitions
grep -RIn "ws_pcm" backends || true
# If the advanced backend main.py exists, print its top 200 lines
if [ -f backends/advanced/src/main.py ]; then
  echo "----- backends/advanced/src/main.py -----"
  sed -n '1,200p' backends/advanced/src/main.py
fi
# Search for JSONL parsing loops and event type handling
grep -RIn "for.*in.*websocket" backends || true
grep -RIn "json.loads" backends || true
grep -RIn "type *==" backends || true
```

Length of output: 9443
🏁 Script executed:
```bash
#!/bin/bash
set -eux
# Print advanced backend ws_pcm handler region (lines around 650-780)
sed -n '650,780p' backends/advanced/src/advanced_omi_backend/main.py || true
# Print simple backend JSONL parsing and event loop (first 150 lines)
sed -n '1,150p' backends/simple/main.py || true
```

Length of output: 11936
Use sendWyomingEvent for heartbeat pings
Replace the manual JSON ping with sendWyomingEvent to emit a newline-terminated Wyoming header (proper payload_length) and leverage built-in ping handling in the /ws_pcm endpoint. For example:
```diff
- heartbeatRef.current = setInterval(() => {
-   try {
-     if (websocketRef.current?.readyState === WebSocket.OPEN) {
-       websocketRef.current.send(JSON.stringify({ type: 'ping', t: Date.now() }));
-     }
-   } catch {}
- }, HEARTBEAT_MS);
+ heartbeatRef.current = setInterval(() => {
+   if (websocketRef.current?.readyState === WebSocket.OPEN) {
+     // Fire-and-forget; JSONL framing + payload_length = 0
+     void sendWyomingEvent({ type: 'ping', data: { t: Date.now() } });
+   }
+ }, HEARTBEAT_MS);
```

The advanced backend’s /ws_pcm handler explicitly checks for header["type"] === "ping", so this change is fully supported.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Suggested change:

```typescript
// Start heartbeat
if (heartbeatRef.current) clearInterval(heartbeatRef.current);
heartbeatRef.current = setInterval(() => {
  if (websocketRef.current?.readyState === WebSocket.OPEN) {
    // Fire-and-forget; JSONL framing + payload_length = 0
    void sendWyomingEvent({ type: 'ping', data: { t: Date.now() } });
  }
}, HEARTBEAT_MS);
```
🤖 Prompt for AI Agents
In app/app/hooks/useAudioStreamer.ts around lines 263 to 272, replace the manual
JSON ping send with a call to sendWyomingEvent so the heartbeat emits a
newline-terminated Wyoming header including correct payload_length;
specifically, when websocketRef.current?.readyState is WebSocket.OPEN call
sendWyomingEvent(websocketRef.current, { type: 'ping', t: Date.now() }) (or the
local equivalent) inside the interval and remove the raw JSON.stringify send;
preserve the try/catch surrounding the send and keep clearing/setting the
interval logic unchanged so the built-in /ws_pcm ping handling is used.
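For reference, a Wyoming-framed heartbeat is nothing more than a newline-terminated JSON header with no binary payload following it. This standalone sketch uses made-up helper names for illustration, and `payload_length: null` for a header-only event is an assumption consistent with the review's diffs:

```typescript
// Build the JSONL line a Wyoming-framed heartbeat would put on the wire.
function buildPingLine(now: number = Date.now()): string {
  return JSON.stringify({
    type: "ping",
    data: { t: now },
    payload_length: null, // header-only event: no binary payload follows
  }) + "\n";
}

// Gate sends on socket state, mirroring the readyState check in the hook.
// 1 is WebSocket.OPEN in both browsers and React Native.
function shouldSendHeartbeat(readyState: number, OPEN = 1): boolean {
  return readyState === OPEN;
}
```

A strict JSONL parser on the server can then read one line, `JSON.parse` it, and dispatch on `type` without ever touching the binary branch.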
| "web": "expo start --web" | ||
| }, | ||
| "dependencies": { | ||
| "@notifee/react-native": "^9.1.8", |
💡 Verification agent
🧩 Analysis chain
Verify Notifee + Expo/RN compatibility and native build requirements.
Adding @notifee/react-native is correct for FGS, but it requires a custom dev client/EAS build (won’t work in Expo Go). Also confirm v9.1.8 supports RN 0.79.2 and Expo SDK 53 on your target CI images.
🌐 Web query:
Is @notifee/react-native@9.1.8 compatible with React Native 0.79 and Expo SDK 53 (Android)? Any special setup beyond manifest (e.g., smallIcon, dev client requirement)?
💡 Result:
Short answer: Yes — Notifee v9 works with React Native 0.79 and Expo SDK 53 on Android, but it requires native builds (prebuild/EAS/dev-client) and a few Android build/manifest/icon setup steps.
What to know / do
- Native build required: Notifee is a native module — with Expo you must add the Notifee plugin and run npx expo prebuild and rebuild (or use EAS + a custom/dev client for development). (notifee.app)
- compileSdk / Android 14: Notifee v9 expects compileSdk >= 34 (and related Android/Gradle/JDK requirements). If you request Android 13 notification permission, ensure targetSdk >= 33. Use JDK 11+ and update EAS build image if needed. (github.com, notifee.app)
- smallIcon and icons: You must add proper Android notification icons (smallIcon) to your Android resources and set android.smallIcon on notifications — otherwise icons will be wrong. Follow Notifee’s Android Appearance docs. (notifee.app)
- Manifest notes: v9 removed the USE_FULL_SCREEN_INTENT permission by default; if your app needs it or needs specific foreground service types for Android 14+, add them manually per the v9 release notes. (github.com)
Quick checklist for Expo SDK 53 + RN 0.79
- Add @notifee/react-native and add the plugin to app config, run npx expo prebuild. (notifee.app)
- Ensure compileSdkVersion >= 34, targetSdk >= 33 (if using Android13 permissions), Gradle/AGP versions and JDK11+. (notifee.app, github.com)
- Add notification smallIcon resource and set smallIcon in notifications. (notifee.app)
- Use EAS / development client to run native builds during development (Notifee won’t run in plain Expo Go). (notifee.app)
If you want, tell me whether you’re using managed or bare Expo and I’ll give the exact app.json/app.config and eas.json snippets you’ll need.
Citations:
- 1: https://notifee.app/react-native/docs/installation?utm_source=openai
- 2: https://github.com/invertase/notifee/blob/main/docs/react-native/release-notes.mdx?utm_source=openai
- 3: https://notifee.app/react-native/docs/installation?utm_source=openai
- 4: https://notifee.app/react-native/docs/android/appearance/?utm_source=openai
- 5: https://github.com/invertase/notifee/blob/main/docs/react-native/release-notes.mdx?utm_source=openai
- 6: https://notifee.app/react-native/docs/installation?utm_source=openai
- 7: https://notifee.app/react-native/docs/installation?utm_source=openai
- 8: https://github.com/invertase/notifee/blob/main/docs/react-native/release-notes.mdx?utm_source=openai
- 9: https://notifee.app/react-native/docs/android/appearance/?utm_source=openai
- 10: https://notifee.app/react-native/docs/installation?utm_source=openai
Configure Notifee native setup and Android requirements
- Add the Notifee plugin in your Expo config and run `npx expo prebuild`, or use an EAS custom/dev client (won’t work in Expo Go).
- Ensure `compileSdkVersion >= 34`, `targetSdkVersion >= 33`, Gradle/AGP and JDK 11+.
- Include a proper Android notification smallIcon resource and set `android.smallIcon`.
- Manually add the `USE_FULL_SCREEN_INTENT` permission or any needed foreground service types per Notifee v9 release notes.
🤖 Prompt for AI Agents
In app/package.json around line 12, the Notifee native setup and Android
requirements are missing or incomplete; update the project by adding the Notifee
Expo plugin to app.json/app.config.js (and run npx expo prebuild or use an
EAS/custom dev client), ensure Android build config uses compileSdkVersion ≥ 34
and targetSdkVersion ≥ 33 with Gradle/AGP and JDK 11+, add a proper notification
smallIcon drawable to Android resources and set android.smallIcon in the Expo
config, and manually add the USE_FULL_SCREEN_INTENT permission and any required
foreground service types per Notifee v9 release notes.
This pull request adds a Foreground Service (via Notifee) to keep the WebSocket connection alive while the app is running in the background.
I have only tested this with the simple backend. I'm not sure if it would work with the advanced backend because of the authentication.
Admittedly, I have no experience with mobile development and used AI to implement this. But it works for me, so I leave this to the discretion of better devs than I.