Problems with saving video on server in k8s cluster #128

Open
truecorax opened this issue Aug 19, 2024 · 10 comments

Comments

@truecorax

I run Jitsi in a local bare-metal k8s cluster with the local-storage storageClass. I created PVs, and Jitsi successfully connected to them and created dirs and files. There's no problem with saving video to the local machine, but when I try to save to the server I get an error:

Jibri 2024-08-19 13:17:30.397 FINE: [71] [hostname=teamok-jitsi-prosody.teamok.svc.cluster.local id=teamok-jitsi-prosody.teamok.svc.cluster.local] MucClient$3.handleIQRequest#513: Received an IQ with type set: IQ Stanza (jibri http://jitsi.org/protocol/jibri) [[email protected]/-LOGAde-xxFs,[email protected]/focus,id=amlicmlAYXV0aC5tZWV0LmppdHNpLy1MT0dBZGUteHhGcwBNM0RGQS0yNTMArR2PG9W3dic=,type=set,]
Jibri 2024-08-19 13:17:30.397 INFO: [71] XmppApi.handleJibriIq#230: Received JibriIq <iq xmlns='jabber:client' to='[email protected]/-LOGAde-xxFs' from='[email protected]/focus' id='amlicmlAYXV0aC5tZWV0LmppdHNpLy1MT0dBZGUteHhGcwBNM0RGQS0yNTMArR2PG9W3dic=' type='set'><jibri xmlns='http://jitsi.org/protocol/jibri' action='start' recording_mode='file' room='[email protected]' session_id='c9cc925b-b555-4e57-b62f-73073ce2d27c' app_data='{"file_recording_metadata":{"share":true}}'/></iq> from environment [MucClient id=teamok-jitsi-prosody.teamok.svc.cluster.local hostname=teamok-jitsi-prosody.teamok.svc.cluster.local]
Jibri 2024-08-19 13:17:30.398 INFO: [71] XmppApi.handleStartJibriIq#262: Received start request, starting service
Jibri 2024-08-19 13:17:30.401 INFO: [71] XmppApi.handleStartService#373: Parsed call url info: CallUrlInfo(baseUrl=jitsi.teamok.area, callName=chat152-member67-68.27994178584757-1724062614766, urlParams=[])
Jibri 2024-08-19 13:17:30.401 INFO: [71] JibriManager.startFileRecording#128: Starting a file recording with params: FileRecordingRequestParams(callParams=CallParams(callUrlInfo=CallUrlInfo(baseUrl=jitsi.teamok.area, callName=chat152-member67-68.27994178584757-1724062614766, urlParams=[]), email='', passcode=null, callStatsUsernameOverride=, displayName=), sessionId=c9cc925b-b555-4e57-b62f-73073ce2d27c, callLoginParams=XmppCredentials(domain=recorder.meet.jitsi, port=null, username=recorder, password=*****))
Jibri 2024-08-19 13:17:30.402 FINE: [71] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] FfmpegCapturer.<init>#92: Detected os as OS: LINUX
Jibri 2024-08-19 13:17:30.402 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: ConfigSourceSupplier: Trying to retrieve key 'jibri.chrome.flags' from source 'config' as type kotlin.collections.List<kotlin.String>
Jibri 2024-08-19 13:17:30.403 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: ConfigSourceSupplier: Found value [--use-fake-ui-for-media-stream, --start-maximized, --kiosk, --enabled, --autoplay-policy=no-user-gesture-required] for key 'jibri.chrome.flags' from source 'config' as type kotlin.collections.List<kotlin.String>
Starting ChromeDriver 126.0.6478.182 (5b5d8292ddf182f8b2096fa665b473b6317906d5-refs/branch-heads/6478@{#1776}) on port 1563
Only local connections are allowed.
Please see https://chromedriver.chromium.org/security-considerations for suggestions on keeping ChromeDriver safe.
ChromeDriver was started successfully.
Jibri 2024-08-19 13:17:31.032 INFO: [71] org.openqa.selenium.remote.ProtocolHandshake.createSession: Detected dialect: OSS
Jibri 2024-08-19 13:17:31.042 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: FallbackSupplier: checking for value via suppliers:
  LambdaSupplier: 'JibriConfig::recordingDirectory'
  ConfigSourceSupplier: key: 'jibri.recording.recordings-directory', type: 'kotlin.String', source: 'config'
Jibri 2024-08-19 13:17:31.043 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: LambdaSupplier: Trying to retrieve value via JibriConfig::recordingDirectory
Jibri 2024-08-19 13:17:31.043 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: FallbackSupplier: failed to find value via LambdaSupplier: 'JibriConfig::recordingDirectory': org.jitsi.metaconfig.ConfigException$UnableToRetrieve$Error: class java.lang.NullPointerException
Jibri 2024-08-19 13:17:31.044 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: ConfigSourceSupplier: Trying to retrieve key 'jibri.recording.recordings-directory' from source 'config' as type kotlin.String
Jibri 2024-08-19 13:17:31.045 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: ConfigSourceSupplier: Found value /data/recordings for key 'jibri.recording.recordings-directory' from source 'config' as type kotlin.String
Jibri 2024-08-19 13:17:31.045 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: FallbackSupplier: value found via ConfigSourceSupplier: key: 'jibri.recording.recordings-directory', type: 'kotlin.String', source: 'config'
Jibri 2024-08-19 13:17:31.045 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: FallbackSupplier: checking for value via suppliers:
  LambdaSupplier: 'JibriConfig::finalizeRecordingScriptPath'
  ConfigSourceSupplier: key: 'jibri.recording.finalize-script', type: 'kotlin.String', source: 'config'
Jibri 2024-08-19 13:17:31.046 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: LambdaSupplier: Trying to retrieve value via JibriConfig::finalizeRecordingScriptPath
Jibri 2024-08-19 13:17:31.046 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: FallbackSupplier: failed to find value via LambdaSupplier: 'JibriConfig::finalizeRecordingScriptPath': org.jitsi.metaconfig.ConfigException$UnableToRetrieve$Error: class java.lang.NullPointerException
Jibri 2024-08-19 13:17:31.047 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: ConfigSourceSupplier: Trying to retrieve key 'jibri.recording.finalize-script' from source 'config' as type kotlin.String
Jibri 2024-08-19 13:17:31.047 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: ConfigSourceSupplier: Found value /config/finalize.sh for key 'jibri.recording.finalize-script' from source 'config' as type kotlin.String
Jibri 2024-08-19 13:17:31.048 FINE: [71] MainKt$setupMetaconfigLogger$1.debug#234: FallbackSupplier: value found via ConfigSourceSupplier: key: 'jibri.recording.finalize-script', type: 'kotlin.String', source: 'config'
Jibri 2024-08-19 13:17:31.048 INFO: [71] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] FileRecordingJibriService.<init>#134: Writing recording to /data/recordings/c9cc925b-b555-4e57-b62f-73073ce2d27c, finalize script path /config/finalize.sh
Jibri 2024-08-19 13:17:31.048 FINE: [71] JibriMetrics.incrementStatsDCounter#41: Incrementing statsd counter: start:recording
Jibri 2024-08-19 13:17:31.049 INFO: [71] JibriStatusManager$special$$inlined$observable$1.afterChange#75: Busy status has changed: IDLE -> BUSY
Jibri 2024-08-19 13:17:31.049 FINE: [71] WebhookClient$updateStatus$1.invokeSuspend#109: Updating 0 subscribers of status
Jibri 2024-08-19 13:17:31.050 INFO: [71] XmppApi.updatePresence#203: Jibri reports its status is now JibriStatus(busyStatus=BUSY, health=OverallHealth(healthStatus=HEALTHY, details={})), publishing presence to connections
Jibri 2024-08-19 13:17:31.050 FINE: [71] MucClientManager.setPresenceExtension#160: Setting a presence extension: org.jitsi.xmpp.extensions.jibri.JibriStatusPacketExt@7cae0660
Jibri 2024-08-19 13:17:31.050 FINE: [71] MucClientManager.saveExtension#185: Replacing presence extension: org.jitsi.xmpp.extensions.jibri.JibriStatusPacketExt@40811ac8
Jibri 2024-08-19 13:17:31.051 INFO: [71] XmppApi.handleStartJibriIq#275: Sending 'pending' response to start IQ
Jibri 2024-08-19 13:17:31.052 INFO: [88] AbstractPageObject.visit#32: Visiting url jitsi.teamok.area
Jibri 2024-08-19 13:17:31.055 FINE: [48] org.jitsi.xmpp.extensions.DefaultPacketExtensionProvider.parse: Could not add a provider for element busy-status from namespace http://jitsi.org/protocol/jibri
Jibri 2024-08-19 13:17:31.055 FINE: [48] org.jitsi.xmpp.extensions.DefaultPacketExtensionProvider.parse: Could not add a provider for element health-status from namespace http://jitsi.org/protocol/health
Jibri 2024-08-19 13:17:31.077 SEVERE: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSelenium.joinCall$lambda$3#333: An error occurred while joining the call
org.openqa.selenium.InvalidArgumentException: invalid argument
  (Session info: chrome=126.0.6478.126)
  (Driver info: chromedriver=126.0.6478.182 (5b5d8292ddf182f8b2096fa665b473b6317906d5-refs/branch-heads/6478@{#1776}),platform=Linux 5.10.0-28-amd64 x86_64) (WARNING: The server did not provide any stacktrace information)
Command duration or timeout: 0 milliseconds
Build info: version: 'unknown', revision: 'unknown', time: 'unknown'
System info: host: 'teamok-jitsi-jitsi-meet-jibri-79d5d589fb-k5x6r', ip: '10.244.43.197', os.name: 'Linux', os.arch: 'amd64', os.version: '5.10.0-28-amd64', java.version: '17.0.11'
Driver info: org.openqa.selenium.chrome.ChromeDriver
Capabilities {acceptInsecureCerts: false, acceptSslCerts: false, browserConnectionEnabled: false, browserName: chrome, chrome: {chromedriverVersion: 126.0.6478.182 (5b5d8292ddf..., userDataDir: /tmp/.org.chromium.Chromium...}, cssSelectorsEnabled: true, databaseEnabled: false, fedcm:accounts: true, goog:chromeOptions: {debuggerAddress: localhost:46587}, handlesAlerts: true, hasTouchScreen: false, javascriptEnabled: true, locationContextEnabled: true, mobileEmulationEnabled: false, nativeEvents: true, networkConnectionEnabled: false, pageLoadStrategy: normal, platform: LINUX, platformName: LINUX, proxy: Proxy(), rotatable: false, setWindowRect: true, strictFileInteractability: false, takesHeapSnapshot: true, takesScreenshot: true, timeouts: {implicit: 0, pageLoad: 300000, script: 30000}, unexpectedAlertBehaviour: ignore, unhandledPromptBehavior: ignore, version: 126.0.6478.126, webStorageEnabled: true, webauthn:extension:credBlob: true, webauthn:extension:largeBlob: true, webauthn:extension:minPinLength: true, webauthn:extension:prf: true, webauthn:virtualAuthenticators: true}
Session ID: 91ceaefa4c4046deb14bdbb59bdaaed1
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)
	at org.openqa.selenium.remote.ErrorHandler.createThrowable(ErrorHandler.java:214)
	at org.openqa.selenium.remote.ErrorHandler.throwIfResponseFailed(ErrorHandler.java:166)
	at org.openqa.selenium.remote.http.JsonHttpResponseCodec.reconstructValue(JsonHttpResponseCodec.java:40)
	at org.openqa.selenium.remote.http.AbstractHttpResponseCodec.decode(AbstractHttpResponseCodec.java:80)
	at org.openqa.selenium.remote.http.AbstractHttpResponseCodec.decode(AbstractHttpResponseCodec.java:44)
	at org.openqa.selenium.remote.HttpCommandExecutor.execute(HttpCommandExecutor.java:158)
	at org.openqa.selenium.remote.service.DriverCommandExecutor.execute(DriverCommandExecutor.java:83)
	at org.openqa.selenium.remote.RemoteWebDriver.execute(RemoteWebDriver.java:543)
	at org.openqa.selenium.remote.RemoteWebDriver.get(RemoteWebDriver.java:271)
	at org.jitsi.jibri.selenium.pageobjects.AbstractPageObject.visit(AbstractPageObject.kt:35)
	at org.jitsi.jibri.selenium.JibriSelenium.joinCall$lambda$3(JibriSelenium.kt:297)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:840)
Jibri 2024-08-19 13:17:31.078 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSelenium.onSeleniumStateChange#218: Transitioning from state Starting up to Error: FailedToJoinCall SESSION Failed to join the call
Jibri 2024-08-19 13:17:31.078 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] StatefulJibriService.onServiceStateChange#39: File recording service transitioning from state Starting up to Error: FailedToJoinCall SESSION Failed to join the call
Jibri 2024-08-19 13:17:31.078 INFO: [88] XmppApi$createServiceStatusHandler$1.invoke#311: Current service had an error Error: FailedToJoinCall SESSION Failed to join the call, sending error iq <iq xmlns='jabber:client' to='[email protected]/focus' id='8NX7V-27' type='set'><jibri xmlns='http://jitsi.org/protocol/jibri' status='off' failure_reason='error' should_retry='true'/></iq>
Jibri 2024-08-19 13:17:31.079 FINE: [88] JibriMetrics.incrementStatsDCounter#41: Incrementing statsd counter: stop:recording
Jibri 2024-08-19 13:17:31.079 INFO: [88] JibriManager.stopService#250: Stopping the current service
Jibri 2024-08-19 13:17:31.079 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] FileRecordingJibriService.stop#182: Stopping capturer
Jibri 2024-08-19 13:17:31.079 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSubprocess.stop#75: Stopping ffmpeg process
Jibri 2024-08-19 13:17:31.079 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSubprocess.stop#89: ffmpeg exited with value null
Jibri 2024-08-19 13:17:31.079 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] FileRecordingJibriService.stop#184: Quitting selenium
Jibri 2024-08-19 13:17:31.080 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] FileRecordingJibriService.stop#191: No media was recorded, deleting directory and skipping metadata file & finalize
Jibri 2024-08-19 13:17:31.080 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSelenium.leaveCallAndQuitBrowser#344: Leaving call and quitting browser
Jibri 2024-08-19 13:17:31.080 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSelenium.leaveCallAndQuitBrowser#347: Recurring call status checks cancelled
Jibri 2024-08-19 13:17:31.090 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSelenium.leaveCallAndQuitBrowser#353: Got 0 log entries for type browser
Jibri 2024-08-19 13:17:31.099 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSelenium.leaveCallAndQuitBrowser#353: Got 62 log entries for type driver
Jibri 2024-08-19 13:17:31.104 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSelenium.leaveCallAndQuitBrowser#353: Got 0 log entries for type client
Jibri 2024-08-19 13:17:31.105 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSelenium.leaveCallAndQuitBrowser#362: Leaving web call
Jibri 2024-08-19 13:17:31.136 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSelenium.leaveCallAndQuitBrowser#369: Quitting chrome driver
Jibri 2024-08-19 13:17:31.221 INFO: [88] [session_id=c9cc925b-b555-4e57-b62f-73073ce2d27c] JibriSelenium.leaveCallAndQuitBrowser#371: Chrome driver quit
Jibri 2024-08-19 13:17:31.222 INFO: [88] JibriStatusManager$special$$inlined$observable$1.afterChange#75: Busy status has changed: BUSY -> IDLE
Jibri 2024-08-19 13:17:31.222 FINE: [88] WebhookClient$updateStatus$1.invokeSuspend#109: Updating 0 subscribers of status
Jibri 2024-08-19 13:17:31.222 INFO: [88] XmppApi.updatePresence#203: Jibri reports its status is now JibriStatus(busyStatus=IDLE, health=OverallHealth(healthStatus=HEALTHY, details={})), publishing presence to connections

I tried it with charts 1.3.8 and 1.4.0.
Can anyone show me the way?

@spijet
Collaborator

spijet commented Aug 29, 2024

Hello @truecorax!

Can you please show your publicURL value and the whole jibri: section? It looks like Jibri cannot connect to your call room, which usually happens when the Jibri pod is unable to reach the Jitsi Meet installation via the URL provided in .Values.publicURL. Since Jibri has to "emulate" a real (albeit invisible) person joining the room, it has to connect to Jitsi Meet as if it were connecting from some remote location, not from inside the cluster.

@truecorax
Author

truecorax commented Aug 30, 2024

Hello @spijet

Could it be because the k8s cluster is behind NAT? On the other hand, users can connect to the call rooms just fine.
Or could it be because we use JWT auth in Jitsi?

publicURL: "jitsi.alpha.teamok.com"
jibri:
  ## Enabling Jibri will allow users to record
  ## and/or stream their meetings (e.g. to YouTube).
  enabled: true

  ## Use external Jibri installation.
  ## This setting skips the creation of Jibri Deployment altogether,
  ## instead creating just the config secret
  ## and enabling recording/streaming services.
  ## Defaults to disabled (use bundled Jibri).
  useExternalJibri: false

  ## Enable single-use mode for Jibri.
  ## With this setting enabled, every Jibri instance
  ## will become "expired" after being used once (successfully or not)
  ## and cleaned up (restarted) by Kubernetes.
  ##
  ## Note that detecting expired Jibri, restarting and registering it
  ## takes some time, so you'll have to make sure you have enough
  ## instances at your disposal.
  ## You might also want to make LivenessProbe fail faster.
  singleUseMode: false

  ## Enable recording service.
  ## Set this to true/false to enable/disable local recordings.
  ## Defaults to enabled (allow local recordings).
  recording: true

  ## Enable livestreaming service.
  ## Set this to true/false to enable/disable live streams.
  ## Defaults to disabled (livestreaming is forbidden).
  livestreaming: true

  ## Enable multiple Jibri instances.
  ## If enabled (i.e. set to 2 or more), each Jibri instance
  ## will get an ID assigned to it, based on pod name.
  ## Multiple replicas are recommended for single-use mode.
  replicaCount: 1

  ## Enable persistent storage for local recordings.
  ## If disabled, jibri pod will use a transient
  ## emptyDir-backed storage instead.
  persistence:
    enabled: true
    size: 10Gi
    ## Set this to existing PVC name if you have one.
    existingClaim: teamok-jitsi-jitsi-meet-jibri
    storageClassName: local-storage

  shm:
    ## Set to true to enable "/dev/shm" mount.
    ## May be required by built-in Chromium.
    enabled: true
    ## If "true", will use host's shared memory dir,
    ## and if "false" — an emptyDir mount.
    # useHost: false
    size: 2Gi

  ## Configure the update strategy for Jibri deployment.
  ## This may be useful depending on your persistence settings,
  ## e.g. when you use ReadWriteOnce PVCs.
  ## Default strategy is "RollingUpdate", which keeps
  ## the old instances up until the new ones are ready.
  # strategy:
  #   type: RollingUpdate

  image:
    repository: jitsi/jibri

  podLabels: {}
  podAnnotations: {}
  resources: {}

  breweryMuc: jibribrewery
  timeout: 90

  ## jibri XMPP user credentials:
  xmpp:
    user: jibri
    password:

  ## recorder XMPP user credentials:
  recorder:
    user: recorder
    password:

  livenessProbe:
    initialDelaySeconds: 5
    periodSeconds: 5
    failureThreshold: 2
    exec:
      command:
        - /bin/bash
        - "-c"
        - >-
          curl -sq localhost:2222/jibri/api/v1.0/health
          | jq '"\(.status.health.healthStatus) \(.status.busyStatus)"'
          | grep -qP 'HEALTHY (IDLE|BUSY)'

  readinessProbe:
    initialDelaySeconds: 5
    periodSeconds: 5
    failureThreshold: 2
    exec:
      command:
        - /bin/bash
        - "-c"
        - >-
          curl -sq localhost:2222/jibri/api/v1.0/health
          | jq '"\(.status.health.healthStatus) \(.status.busyStatus)"'
          | grep -qP 'HEALTHY (IDLE|BUSY)'

  extraEnvs: {}

  ## Override the image-provided configuration files:
  #  See https://github.com/jitsi/docker-jitsi-meet/tree/master/jibri/rootfs
  custom:
    contInit:
      _10_config: ""
    defaults:
      _autoscaler_sidecar_config: ""
      _jibri_conf: ""
      _logging_properties: ""
      _xorg_video_dummy_conf: ""

@d-mo

d-mo commented Aug 31, 2024

Same problem here with similar configuration.

@jens-kuerten

Same problem. This happens even without any auth enabled.

@jens-kuerten

It seems Jibri expects the public_url to start with https://.

So instead of publicURL: "jitsi.example.com" (as the comment in the values suggests), it needs to be publicURL: "https://jitsi.example.com".
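In chart values that would look like this (the hostname here is a placeholder):

```yaml
# The scheme is required; without it, Jibri's Chrome session fails with
# "InvalidArgumentException: invalid argument" when it tries to visit the URL.
publicURL: "https://jitsi.example.com"
```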

@truecorax
Author

truecorax commented Sep 24, 2024

Yep, with https:// Jitsi started recording. Thank you.

@truecorax
Author

But now there's another problem. I can't find the URL to download the video, and in the logs I get an error with finalize.sh:

Jibri 2024-09-24 10:18:50.022 INFO: [69] [session_id=d1261748-08b5-487d-93f9-7d691ab4d070] JibriSelenium.leaveCallAndQuitBrowser#369: Quitting chrome driver
Jibri 2024-09-24 10:18:50.098 INFO: [69] [session_id=d1261748-08b5-487d-93f9-7d691ab4d070] JibriSelenium.leaveCallAndQuitBrowser#371: Chrome driver quit
Jibri 2024-09-24 10:18:50.098 INFO: [69] [session_id=d1261748-08b5-487d-93f9-7d691ab4d070] FileRecordingJibriService.stop#232: Finalizing the recording
Jibri 2024-09-24 10:18:50.098 INFO: [69] JibriServiceFinalizeCommandRunner.doFinalize#44: Finalizing the jibri service operation using command [/config/finalize.sh, /data/recordings/d1261748-08b5-487d-93f9-7d691ab4d070]
Jibri 2024-09-24 10:18:50.103 SEVERE: [69] JibriServiceFinalizeCommandRunner.doFinalize#63: Failed to run finalize script
java.io.IOException: Cannot run program "/config/finalize.sh": error=2, No such file or directory
        at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1143)
        at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1073)
        at org.jitsi.jibri.util.ProcessWrapper.start(ProcessWrapper.kt:88)
        at org.jitsi.jibri.service.impl.JibriServiceFinalizeCommandRunner.doFinalize(JibriServiceFinalizeCommandRunner.kt:47)
        at org.jitsi.jibri.service.impl.FileRecordingJibriService.stop(FileRecordingJibriService.kt:233)
        at org.jitsi.jibri.JibriManager.stopService(JibriManager.kt:253)
        at org.jitsi.jibri.api.xmpp.XmppApi.handleStopJibriIq(XmppApi.kt:344)
        at org.jitsi.jibri.api.xmpp.XmppApi.handleJibriIq(XmppApi.kt:238)
        at org.jitsi.jibri.api.xmpp.XmppApi.handleIq(XmppApi.kt:219)
        at org.jitsi.xmpp.mucclient.MucClient.handleIq(MucClient.java:551)
        at org.jitsi.xmpp.mucclient.MucClient$3.handleIQRequest(MucClient.java:514)
        at org.jivesoftware.smack.AbstractXMPPConnection$3.run(AbstractXMPPConnection.java:1561)
        at org.jivesoftware.smack.AbstractXMPPConnection$10.run(AbstractXMPPConnection.java:2146)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: java.io.IOException: error=2, No such file or directory
        at java.base/java.lang.ProcessImpl.forkAndExec(Native Method)
        at java.base/java.lang.ProcessImpl.<init>(ProcessImpl.java:314)
        at java.base/java.lang.ProcessImpl.start(ProcessImpl.java:244)
        at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1110)
        ... 15 more
Jibri 2024-09-24 10:18:50.104 FINE: [69] MainKt$setupMetaconfigLogger$1.debug#234: FallbackSupplier: checking for value via suppliers:
  LambdaSupplier: 'JibriConfig::singleUseMode'
  ConfigSourceSupplier: key: 'jibri.single-use-mode', type: 'kotlin.Boolean', source: 'config'
Jibri 2024-09-24 10:18:50.104 FINE: [69] MainKt$setupMetaconfigLogger$1.debug#234: LambdaSupplier: Trying to retrieve value via JibriConfig::singleUseMode
Jibri 2024-09-24 10:18:50.104 FINE: [69] MainKt$setupMetaconfigLogger$1.debug#234: FallbackSupplier: failed to find value via LambdaSupplier: 'JibriConfig::singleUseMode': org.jitsi.metaconfig.ConfigException$UnableToRetrieve$Error: class java.lang.NullPointerException
Jibri 2024-09-24 10:18:50.104 FINE: [69] MainKt$setupMetaconfigLogger$1.debug#234: ConfigSourceSupplier: Trying to retrieve key 'jibri.single-use-mode' from source 'config' as type kotlin.Boolean
Jibri 2024-09-24 10:18:50.107 FINE: [69] MainKt$setupMetaconfigLogger$1.debug#234: ConfigSourceSupplier: Found value false for key 'jibri.single-use-mode' from source 'config' as type kotlin.Boolean
Jibri 2024-09-24 10:18:50.107 FINE: [69] MainKt$setupMetaconfigLogger$1.debug#234: FallbackSupplier: value found via ConfigSourceSupplier: key: 'jibri.single-use-mode', type: 'kotlin.Boolean', source: 'config'
Jibri 2024-09-24 10:18:50.107 INFO: [69] JibriStatusManager$special$$inlined$observable$1.afterChange#75: Busy status has changed: BUSY -> IDLE

@truecorax
Author

Oh, there's no default finalize.sh script, so I should create one on my own.
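A minimal sketch of such a script, assuming (as the log above shows) that Jibri passes the per-session recording directory as the first argument; the function name and the ARCHIVE_DIR destination are made up for illustration:

```shell
#!/bin/bash
# Hypothetical /config/finalize.sh. Jibri invokes it as:
#   /config/finalize.sh /data/recordings/<session-id>
# This sketch just moves finished .mp4 files into an archive directory;
# replace the mv with an upload (S3, rsync, webhook call...) as needed.
ARCHIVE_DIR="${ARCHIVE_DIR:-/data/recordings/done}"   # assumed destination

finalize_recording() {
    local session_dir="$1"
    mkdir -p "$ARCHIVE_DIR"
    # Move every finished recording out of the per-session directory.
    find "$session_dir" -type f -name '*.mp4' -exec mv -t "$ARCHIVE_DIR" {} +
    echo "finalized: $session_dir -> $ARCHIVE_DIR"
}

if [ -n "${1:-}" ]; then
    finalize_recording "$1"
fi
```

Note that the error in the log above (error=2, No such file or directory) means the file simply isn't present at /config/finalize.sh, so the script also has to be mounted into the pod (e.g. from a ConfigMap) and marked executable.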

@d-mo

d-mo commented Oct 6, 2024

Works for me with https:// prefix in publicURL. Thanks!

@spijet
Collaborator

spijet commented Oct 8, 2024

Sorry for the delay. 💀

Yes, Jibri seems to require the scheme (so, http:// or https://) to always be present in publicURL. However, you don't always need to specify the URL yourself, as the chart can generate it for you from the ingress config:

{{- define "jitsi-meet.publicURL" -}}
{{- if .Values.publicURL }}
{{- .Values.publicURL -}}
{{- else -}}
{{- if .Values.web.ingress.tls -}}https://{{- else -}}http://{{- end -}}
{{- if .Values.web.ingress.tls -}}
{{- (.Values.web.ingress.tls|first).hosts|first -}}
{{- else if .Values.web.ingress.hosts -}}
{{- (.Values.web.ingress.hosts|first).host -}}
{{- else -}}
{{ required "You need to define a publicURL or some value for ingress" .Values.publicURL }}
{{- end -}}
{{- end -}}
{{- end -}}

As for the file recording, Jibri makes the recording and saves it to the local filesystem. You can either use a finalize script to upload the recording somewhere, or use a PVC as permanent storage for your recordings.

If you choose to use a PVC for your recordings, you can also mount the same PVC to another service, like a web server with an auto-index feature (e.g. Apache, Nginx or any PHP web server with h5ai). Or use this cool hack (which is 100% anti-k8s, but I like it nonetheless):

find "/proc/$(pgrep -f /opt/jitsi/jibri/jibri.jar | head -n1)/root/data/recordings" -type f -name '*.mp4'

If you run this command on a node that currently runs a Jibri pod, you'll get a list of the recordings stored in that pod (or the PVC).
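As a sketch of the web-server option, here is a hypothetical read-only Nginx deployment that mounts the same PVC, assuming the existing claim name from the values earlier in this thread and an access mode that permits a second mount (ReadWriteMany, or ReadWriteOnce with both pods scheduled onto the same node):

```yaml
# Hypothetical deployment serving the Jibri recordings PVC read-only.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: recordings-browser
spec:
  replicas: 1
  selector:
    matchLabels: {app: recordings-browser}
  template:
    metadata:
      labels: {app: recordings-browser}
    spec:
      containers:
        - name: nginx
          image: nginx:stable
          ports: [{containerPort: 80}]
          volumeMounts:
            - name: recordings
              mountPath: /usr/share/nginx/html/recordings
              readOnly: true
      volumes:
        - name: recordings
          persistentVolumeClaim:
            claimName: teamok-jitsi-jitsi-meet-jibri
```

You'd still need to supply an nginx config with "autoindex on;" (or a tool like h5ai) to get browsable directory listings, plus a Service/Ingress in front of it.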
