I’m building a React Native app (Android/iOS) and running into audio session conflicts when trying to combine WebRTC voice/video with YouTube background music playback.
## Use case

- Join a LiveKit room with camera + microphone enabled
- Play a YouTube iframe/WebView for synchronized background music
- Users can toggle music and microphone audio on/off independently
- Both audio sources should play simultaneously on Android and iOS devices (a minimal sketch of this setup follows below)
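Roughly, the setup looks like this (a simplified sketch: the server URL, token, and video ID are placeholders, error handling is omitted, and component/hook names follow the `@livekit/react-native` and `react-native-webview` exports as I understand them):

```tsx
import * as React from 'react';
import { Button } from 'react-native';
import { LiveKitRoom, useLocalParticipant } from '@livekit/react-native';
import { WebView } from 'react-native-webview';

// Placeholders; real values come from our backend.
const WS_URL = 'wss://example.livekit.cloud';
const TOKEN = '<participant-token>';

export function MusicRoomScreen() {
  return (
    <LiveKitRoom serverUrl={WS_URL} token={TOKEN} connect audio video>
      <RoomControls />
    </LiveKitRoom>
  );
}

function RoomControls() {
  const { localParticipant } = useLocalParticipant();
  const [musicOn, setMusicOn] = React.useState(true);

  return (
    <>
      {/* Background music via a YouTube embed; this should keep playing
          while the WebRTC session is live. */}
      {musicOn && (
        <WebView
          source={{ uri: 'https://www.youtube.com/embed/VIDEO_ID?autoplay=1' }}
          mediaPlaybackRequiresUserAction={false}
          allowsInlineMediaPlayback
        />
      )}
      {/* Music and microphone toggle independently of each other. */}
      <Button title="Toggle music" onPress={() => setMusicOn((on) => !on)} />
      <Button
        title="Toggle mic"
        onPress={() =>
          localParticipant.setMicrophoneEnabled(!localParticipant.isMicrophoneEnabled)
        }
      />
    </>
  );
}
```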
## Current issues

- Random audio interruptions and crashes
- YouTube music pauses unpredictably when LiveKit connects
- Microphone audio cuts out intermittently
- Inconsistent behavior between Android and iOS
## What I've tried

- Android: `AudioType.MediaAudioType()` with `manageAudioFocus: false`
- iOS: `useIOSAudioManagement` with the `mixWithOthers` option
- Various `AudioSession.configureAudio()` combinations (latest attempt sketched below)
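Concretely, my latest attempt looks roughly like this. The option names follow my reading of the `AudioSession` API in `@livekit/react-native`, so treat the exact shapes as approximate; I may well be holding it wrong, which is partly what I'm asking:

```ts
import { AudioSession } from '@livekit/react-native';

// Run once before connecting to the room.
async function configureMixedAudio() {
  // Android: opt out of audio-focus management so WebRTC does not
  // pause other media (the YouTube WebView) when the room connects.
  await AudioSession.configureAudio({
    android: {
      audioTypeOptions: {
        manageAudioFocus: false,
      },
    },
  });

  // iOS: playAndRecord with mixWithOthers, so the microphone and the
  // WebView's music can run concurrently instead of interrupting
  // each other.
  await AudioSession.setAppleAudioConfiguration({
    audioCategory: 'playAndRecord',
    audioCategoryOptions: ['mixWithOthers', 'defaultToSpeaker'],
    audioMode: 'videoChat',
  });

  await AudioSession.startAudioSession();
}
```

I also tried the `useIOSAudioManagement` hook in place of the manual `setAppleAudioConfiguration` call above, with the same intermittent results.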
## Questions

1. Is this use case (WebRTC voice + background music playback) officially supported?
2. Should audio mixing be handled at the WebRTC level, or at the app-level audio session?
3. Are there any examples/docs for concurrent audio scenarios?
4. Is server-side mixing the recommended approach here?
## Environment

- SDK: LiveKit React Native SDK v2.9.x
- Devices: physical Android and iOS devices