Merge branch 'develop'

skydoves committed Sep 5, 2023
2 parents ebcdea7 + 6523eb6 commit 8c67f3e
Showing 25 changed files with 740 additions and 88 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -115,6 +115,7 @@ Video roadmap and changelog is available [here](https://github.com/GetStream/pro

### 0.4.0 milestone

- [X] Screensharing from mobile
- [ ] Complete Livestreaming APIs and Tutorials for hosting & watching
- [ ] Android SDK development.md cleanup (Daniel)
- [ ] Upgrade to more recent versions of webrtc (Kanat)
@@ -131,7 +132,6 @@ Video roadmap and changelog is available [here](https://github.com/GetStream/pro

- [ ] Testing on more devices
- [ ] Enable SFU switching
- [ ] Screensharing from mobile
- [ ] Camera controls
- [ ] Tap to focus
- [ ] H264 workaround on Samsung 23 (see https://github.com/livekit/client-sdk-android/blob/main/livekit-android-sdk/src/main/java/io/livekit/android/webrtc/SimulcastVideoEncoderFactoryWrapper.kt#L34 and
@@ -6,9 +6,9 @@ object Configuration {
const val minSdk = 24
const val majorVersion = 0
const val minorVersion = 3
const val patchVersion = 2
const val patchVersion = 3
const val versionName = "$majorVersion.$minorVersion.$patchVersion"
const val versionCode = 7
const val versionCode = 8
const val snapshotVersionName = "$majorVersion.$minorVersion.${patchVersion + 1}-SNAPSHOT"
const val artifactGroup = "io.getstream"
const val streamVideoCallGooglePlayVersion = "1.0.0"
2 changes: 1 addition & 1 deletion docusaurus/docs/Android/02-tutorials/01-video-calling.mdx
@@ -31,7 +31,7 @@ If you're new to android, note that there are 2 `build.gradle` files, you want t
```kotlin
dependencies {
// Stream Video Compose SDK
implementation("io.getstream:stream-video-android-compose:0.3.2")
implementation("io.getstream:stream-video-android-compose:0.3.3")

// Optionally add Jetpack Compose if Android studio didn't automatically include them
implementation(platform("androidx.compose:compose-bom:2023.08.00"))
2 changes: 1 addition & 1 deletion docusaurus/docs/Android/02-tutorials/02-audio-room.mdx
@@ -35,7 +35,7 @@ If you're new to android, note that there are 2 `build.gradle` files, you want t
```groovy
dependencies {
// Stream Video Compose SDK
implementation("io.getstream:stream-video-android-compose:0.3.2")
implementation("io.getstream:stream-video-android-compose:0.3.3")
// Jetpack Compose (optional/ android studio typically adds them when you create a new project)
implementation(platform("androidx.compose:compose-bom:2023.08.00"))
2 changes: 1 addition & 1 deletion docusaurus/docs/Android/02-tutorials/03-livestream.mdx
@@ -35,7 +35,7 @@ If you're new to android, note that there are 2 `build.gradle` files, you want t
```kotlin
dependencies {
// Stream Video Compose SDK
implementation("io.getstream:stream-video-android-compose:0.3.2")
implementation("io.getstream:stream-video-android-compose:0.3.3")

// Jetpack Compose (optional/ android studio typically adds them when you create a new project)
implementation(platform("androidx.compose:compose-bom:2023.08.00"))
57 changes: 57 additions & 0 deletions docusaurus/docs/Android/06-advanced/04-screen-sharing.mdx
@@ -0,0 +1,57 @@
---
title: Screen sharing
description: Setup for screen sharing
---

## Introduction

The Stream Video Android SDK supports screen sharing from an Android device. The SDK uses the [Android Media Projection API](https://developer.android.com/guide/topics/large-screens/media-projection) to capture the screen.

To share their screen, a user must have the `screenshare` capability configured for the call they are in.
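
For illustration, a minimal sketch of checking this capability before offering a screen-share action (assuming you already have a `call` instance; `OwnCapability` comes from the `org.openapitools.client.models` package used by the SDK):

```kotlin
import org.openapitools.client.models.OwnCapability

// Only offer the screen-share action when the backend has granted
// the `screenshare` capability for this call.
val canScreenShare = call.state.ownCapabilities.value
    .contains(OwnCapability.Screenshare)

if (canScreenShare) {
    // Safe to request the MediaProjection permission and start sharing.
}
```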

## How to start sharing your screen

You need to be in an active call (a `Call` instance in the Active call state) to start screen sharing.

You must ask the user for screen-sharing permission before you can start sharing the screen. Request the permission with the [Media Projection API](https://developer.android.com/guide/topics/large-screens/media-projection), then pass the intent data returned from the permission result to `Call.startScreenSharing(intentData)`.

An example implementation:

```kotlin
val startMediaProjection = registerForActivityResult(StartActivityForResult()) { result ->
    if (result.resultCode == Activity.RESULT_OK && result.data != null) {
        call.startScreenSharing(result.data!!)
    }
}

val mediaProjectionManager = context.getSystemService(MediaProjectionManager::class.java)
startMediaProjection.launch(mediaProjectionManager.createScreenCaptureIntent())
```

You can check if screen sharing is currently active by observing `call.screenShare.isEnabled`.
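
As a sketch, in a Compose UI you can collect this `StateFlow` to drive a label or toggle (the property access matches the demo app in this commit; the surrounding composable is hypothetical):

```kotlin
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import io.getstream.video.android.core.Call

@Composable
fun ScreenShareLabel(call: Call) {
    // isEnabled is a StateFlow<Boolean>; collect it as Compose state.
    val isScreenSharing by call.screenShare.isEnabled.collectAsState()

    Text(text = if (isScreenSharing) "Stop screen-sharing" else "Start screen-sharing")
}
```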

## Stopping screen sharing

Screen sharing can be stopped with `Call.stopScreenSharing()`. It is also stopped automatically when the call goes into the Inactive state.

The user can also disable screen sharing directly from the system (depending on the OEM, there is usually a button in the notification bar for disabling screen sharing).

Screen sharing can also be disabled through the screen-sharing notification's action button (described in the next section).
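
Putting the two together, a click handler might toggle sharing based on the current state (a sketch modelled on the demo app's settings menu in this commit; `startMediaProjection` is the launcher from the example above, `context` is the current `Context`, and `isScreenSharing` is the collected value of `call.screenShare.isEnabled`):

```kotlin
// Toggle screen sharing from a UI click handler.
if (!isScreenSharing) {
    val mediaProjectionManager =
        context.getSystemService(MediaProjectionManager::class.java)
    startMediaProjection.launch(mediaProjectionManager.createScreenCaptureIntent())
} else {
    call.stopScreenSharing()
}
```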

## Screen sharing notification

A notification is always displayed to the user while screen sharing is active. The notification itself can't be hidden; it is required by the Android OS. The notification title and description can be customised.

Override the strings `stream_video_screen_sharing_notification_title` and `stream_video_screen_sharing_notification_description` to customise the notification text.

There is also a "Stop screen sharing" action button on the notification; its text can be modified by overriding `stream_video_screen_sharing_notification_action_stop`.

All notifications on Android must belong to a notification channel. The Stream Video Android SDK automatically creates a channel for the screen-sharing notification. You can customise the channel title and description (visible to the user in the system application settings) by overriding `stream_video_screen_sharing_notification_channel_title` and `stream_video_screen_sharing_notification_channel_description`.

```xml
<string name="stream_video_screen_sharing_notification_title">You are screen sharing</string>
<string name="stream_video_screen_sharing_notification_description"></string>
<string name="stream_video_screen_sharing_notification_action_stop">Stop screen sharing</string>
<string name="stream_video_screen_sharing_notification_channel_title">Screen-sharing</string>
<string name="stream_video_screen_sharing_notification_channel_description">Required to be enabled for screen sharing</string>
```
@@ -31,7 +31,7 @@ Let the project sync. It should have all the dependencies required for you to fi
```groovy
dependencies {
// Stream Video Compose SDK
implementation("io.getstream:stream-video-android-compose:0.3.2")
implementation("io.getstream:stream-video-android-compose:0.3.3")
// Stream Chat
implementation(libs.stream.chat.compose)
@@ -16,7 +16,11 @@

package io.getstream.video.android.ui.call

import android.app.Activity
import android.media.projection.MediaProjectionManager
import android.widget.Toast
import androidx.activity.compose.rememberLauncherForActivityResult
import androidx.activity.result.contract.ActivityResultContracts
import androidx.compose.foundation.background
import androidx.compose.foundation.clickable
import androidx.compose.foundation.layout.Column
@@ -29,6 +33,8 @@ import androidx.compose.material.Card
import androidx.compose.material.Icon
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import androidx.compose.runtime.rememberCoroutineScope
import androidx.compose.ui.Alignment
import androidx.compose.ui.Modifier
@@ -54,6 +60,21 @@ internal fun SettingsMenu(
val reactions =
listOf(":fireworks:", ":hello:", ":raise-hand:", ":like:", ":hate:", ":smile:", ":heart:")

val screenSharePermissionResult = rememberLauncherForActivityResult(
contract = ActivityResultContracts.StartActivityForResult(),
onResult = {
if (it.resultCode == Activity.RESULT_OK && it.data != null) {
call.startScreenSharing(it.data!!)
}
onDismissed.invoke()
},
)

val isScreenSharing by call.screenShare.isEnabled.collectAsState()
val screenShareButtonText = if (isScreenSharing) {
"Stop screen-sharing"
} else { "Start screen-sharing" }

Popup(
alignment = Alignment.BottomStart,
offset = IntOffset(30, -200),
@@ -92,6 +113,37 @@ internal fun SettingsMenu(

Spacer(modifier = Modifier.height(12.dp))

Row(
modifier = Modifier.clickable {
if (!isScreenSharing) {
scope.launch {
val mediaProjectionManager = context.getSystemService(
MediaProjectionManager::class.java,
)
screenSharePermissionResult.launch(
mediaProjectionManager.createScreenCaptureIntent(),
)
}
} else {
call.stopScreenSharing()
}
},
) {
Icon(
painter = painterResource(id = R.drawable.stream_video_ic_screensharing),
tint = VideoTheme.colors.textHighEmphasis,
contentDescription = null,
)

Text(
modifier = Modifier.padding(start = 20.dp),
text = screenShareButtonText,
color = VideoTheme.colors.textHighEmphasis,
)
}

Spacer(modifier = Modifier.height(12.dp))

if (showDebugOptions) {
Row(
modifier = Modifier.clickable {
@@ -34,7 +34,7 @@ internal fun enterPictureInPicture(context: Context, call: Call) {
val screenSharing = call.state.screenSharingSession.value

val aspect =
if (currentOrientation == ActivityInfo.SCREEN_ORIENTATION_PORTRAIT && screenSharing == null) {
if (currentOrientation == ActivityInfo.SCREEN_ORIENTATION_PORTRAIT && (screenSharing == null || screenSharing.participant.isLocal)) {
Rational(9, 16)
} else {
Rational(16, 9)
@@ -212,7 +212,7 @@ internal fun DefaultPictureInPictureContent(call: Call) {
val video = session?.participant?.video?.collectAsStateWithLifecycle()
val pictureInPictureAspectRatio: Float = 16f / 9f

if (session != null) {
if (session != null && !session.participant.isLocal) {
VideoRenderer(
modifier = Modifier.aspectRatio(pictureInPictureAspectRatio, false),
call = call,
@@ -69,7 +69,8 @@ public fun ParticipantsGrid(
val screenSharingSession = call.state.screenSharingSession.collectAsStateWithLifecycle()
val screenSharing = screenSharingSession.value

if (screenSharing == null) {
// We do not display our own screen-sharing session
if (screenSharing == null || screenSharing.participant.isLocal) {
ParticipantsRegularGrid(
call = call,
modifier = modifier,
24 changes: 23 additions & 1 deletion stream-video-android-core/api/stream-video-android-core.api
@@ -15,6 +15,7 @@ public final class io/getstream/video/android/core/Call {
public final fun getLocalMicrophoneAudioLevel ()Lkotlinx/coroutines/flow/StateFlow;
public final fun getMicrophone ()Lio/getstream/video/android/core/MicrophoneManager;
public final fun getMonitor ()Lio/getstream/video/android/core/CallHealthMonitor;
public final fun getScreenShare ()Lio/getstream/video/android/core/ScreenShareManager;
public final fun getSessionId ()Ljava/lang/String;
public final fun getSpeaker ()Lio/getstream/video/android/core/SpeakerManager;
public final fun getState ()Lio/getstream/video/android/core/CallState;
@@ -51,9 +52,11 @@ public final class io/getstream/video/android/core/Call {
public final fun setVisibility (Ljava/lang/String;Lstream/video/sfu/models/TrackType;Z)V
public final fun startHLS (Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun startRecording (Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun startScreenSharing (Landroid/content/Intent;)V
public final fun stopHLS (Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun stopLive (Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun stopRecording (Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun stopScreenSharing ()V
public final fun subscribe (Lio/getstream/video/android/core/events/VideoEventListener;)Lio/getstream/video/android/core/EventSubscription;
public final fun subscribeFor ([Ljava/lang/Class;Lio/getstream/video/android/core/events/VideoEventListener;)Lio/getstream/video/android/core/EventSubscription;
public final fun switchSfu (ZLkotlin/coroutines/Continuation;)Ljava/lang/Object;
@@ -392,6 +395,8 @@ public final class io/getstream/video/android/core/MediaManagerImpl {
public final fun getContext ()Landroid/content/Context;
public final fun getEglBaseContext ()Lorg/webrtc/EglBase$Context;
public final fun getScope ()Lkotlinx/coroutines/CoroutineScope;
public final fun getScreenShareTrack ()Lorg/webrtc/VideoTrack;
public final fun getScreenShareVideoSource ()Lorg/webrtc/VideoSource;
public final fun getVideoSource ()Lorg/webrtc/VideoSource;
public final fun getVideoTrack ()Lorg/webrtc/VideoTrack;
}
@@ -663,6 +668,22 @@ public final class io/getstream/video/android/core/RingingState$TimeoutNoAnswer
public fun toString ()Ljava/lang/String;
}

public final class io/getstream/video/android/core/ScreenShareManager {
public static final field Companion Lio/getstream/video/android/core/ScreenShareManager$Companion;
public fun <init> (Lio/getstream/video/android/core/MediaManagerImpl;Lorg/webrtc/EglBase$Context;)V
public final fun disable (Z)V
public static synthetic fun disable$default (Lio/getstream/video/android/core/ScreenShareManager;ZILjava/lang/Object;)V
public final fun enable (Landroid/content/Intent;Z)V
public static synthetic fun enable$default (Lio/getstream/video/android/core/ScreenShareManager;Landroid/content/Intent;ZILjava/lang/Object;)V
public final fun getEglBaseContext ()Lorg/webrtc/EglBase$Context;
public final fun getMediaManager ()Lio/getstream/video/android/core/MediaManagerImpl;
public final fun getStatus ()Lkotlinx/coroutines/flow/StateFlow;
public final fun isEnabled ()Lkotlinx/coroutines/flow/StateFlow;
}

public final class io/getstream/video/android/core/ScreenShareManager$Companion {
}

public final class io/getstream/video/android/core/SpeakerManager {
public fun <init> (Lio/getstream/video/android/core/MediaManagerImpl;Lio/getstream/video/android/core/MicrophoneManager;Ljava/lang/Integer;)V
public synthetic fun <init> (Lio/getstream/video/android/core/MediaManagerImpl;Lio/getstream/video/android/core/MicrophoneManager;Ljava/lang/Integer;ILkotlin/jvm/internal/DefaultConstructorMarker;)V
@@ -874,6 +895,7 @@ public final class io/getstream/video/android/core/call/RtcSession {
public final fun reconnect (Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun setLocalTrack (Lstream/video/sfu/models/TrackType;Lio/getstream/video/android/core/model/MediaTrack;)V
public final fun setMuteState (ZLstream/video/sfu/models/TrackType;)V
public final fun setScreenShareTrack ()V
public final fun setSubscriber (Lio/getstream/video/android/core/call/connection/StreamPeerConnection;)V
public final fun setTrack (Ljava/lang/String;Lstream/video/sfu/models/TrackType;Lio/getstream/video/android/core/model/MediaTrack;)V
public final fun setTracks (Ljava/util/Map;)V
@@ -902,7 +924,7 @@ public final class io/getstream/video/android/core/call/connection/StreamPeerCon
public fun <init> (Lkotlinx/coroutines/CoroutineScope;Lio/getstream/video/android/core/model/StreamPeerType;Lorg/webrtc/MediaConstraints;Lkotlin/jvm/functions/Function1;Lkotlin/jvm/functions/Function2;Lkotlin/jvm/functions/Function2;I)V
public final fun addAudioTransceiver (Lorg/webrtc/MediaStreamTrack;Ljava/util/List;)V
public final fun addIceCandidate (Lio/getstream/video/android/core/model/IceCandidate;Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun addVideoTransceiver (Lorg/webrtc/MediaStreamTrack;Ljava/util/List;)V
public final fun addVideoTransceiver (Lorg/webrtc/MediaStreamTrack;Ljava/util/List;Z)V
public final fun createAnswer (Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun createOffer (Lkotlin/coroutines/Continuation;)Ljava/lang/Object;
public final fun getAudioTransceiver ()Lorg/webrtc/RtpTransceiver;
10 changes: 10 additions & 0 deletions stream-video-android-core/src/main/AndroidManifest.xml
@@ -59,5 +59,15 @@
</intent-filter>
</receiver>
<activity android:name=".notifications.internal.DismissNotificationActivity" />

<receiver android:name=".notifications.internal.StopScreenshareBroadcastReceiver"
android:exported="false">
<intent-filter android:priority="-1">
<action android:name="io.getstream.video.android.action.CANCEL_SCREEN_SHARE" />
</intent-filter>
</receiver>

<service android:name=".screenshare.StreamScreenShareService"
android:foregroundServiceType="mediaProjection"/>
</application>
</manifest>
@@ -16,6 +16,7 @@

package io.getstream.video.android.core

import android.content.Intent
import android.view.View
import androidx.annotation.VisibleForTesting
import io.getstream.log.taggedLogger
@@ -57,6 +58,7 @@ import org.openapitools.client.models.JoinCallResponse
import org.openapitools.client.models.ListRecordingsResponse
import org.openapitools.client.models.MemberRequest
import org.openapitools.client.models.MuteUsersResponse
import org.openapitools.client.models.OwnCapability
import org.openapitools.client.models.RejectCallResponse
import org.openapitools.client.models.SendEventResponse
import org.openapitools.client.models.SendReactionResponse
@@ -117,6 +119,7 @@ public class Call(
val camera by lazy { mediaManager.camera }
val microphone by lazy { mediaManager.microphone }
val speaker by lazy { mediaManager.speaker }
val screenShare by lazy { mediaManager.screenShare }

/** The cid is type:id */
val cid = "$type:$id"
@@ -516,6 +519,7 @@
} else {
RealtimeConnection.Disconnected
}
stopScreenSharing()
client.state.removeActiveCall()
client.state.removeRingingCall()
(client as StreamVideoImpl).onCallCleanUp(this)
@@ -662,6 +666,28 @@
return clientImpl.stopRecording(type, id)
}

/**
* User needs to have [OwnCapability.Screenshare] capability in order to start screen
* sharing.
*
* @param mediaProjectionPermissionResultData - intent data returned from the
* activity result after asking for screen sharing permission by launching
* MediaProjectionManager.createScreenCaptureIntent().
* See https://developer.android.com/guide/topics/large-screens/media-projection#recommended_approach
*/
fun startScreenSharing(mediaProjectionPermissionResultData: Intent) {
if (state.ownCapabilities.value.contains(OwnCapability.Screenshare)) {
session?.setScreenShareTrack()
screenShare.enable(mediaProjectionPermissionResultData)
} else {
logger.w { "Can't start screen sharing - user doesn't have wnCapability.Screenshare permission" }
}
}

fun stopScreenSharing() {
screenShare.disable(fromUser = true)
}

suspend fun startHLS(): Result<Any> {
return clientImpl.startBroadcasting(type, id)
.onSuccess {