My machine is quite beefy: an Intel Core i7-6900K CPU and an NVIDIA GTX 1080 Ti.
Purely by coincidence, I happened to run nvidia-smi in an SSH session after remotely rebooting my machine. The only thing using the GPU after a reboot was lightdm running in Xorg, and yet it reported a consistent ~38% GPU utilization.
A few hours later when I arrived home, I SSH'ed in again before logging in and confirmed my GPU was still pegged at >35%. Immediately after logging in (and therefore dismissing lightdm-webkit2-greeter), GPU usage dropped back to 0%.
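For anyone who wants to reproduce this, utilization is easy to sample over SSH with stock nvidia-smi query flags (a minimal sketch; the one-second interval is arbitrary):

# Print overall GPU utilization as CSV, repeating every second:
nvidia-smi --query-gpu=utilization.gpu --format=csv -l 1

# The default output's process table also shows what is holding the
# GPU (Xorg, the greeter's WebKit processes, etc.):
nvidia-smi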
I understand that there is some overhead associated with running WebKit, but consuming more GPU resources than most triple-A video games just to display a wallpaper seems so ridiculously excessive as to qualify as a bug. Especially considering that machines often sit at the lightdm greeter for hours on end after an unexpected reboot until the user gets to their desk, this bug represents a significant waste of energy and hardware wear and tear.
My lightdm-webkit2-greeter.conf is pretty much Arch's stock default, pasted below:
#
# [greeter]
# debug_mode = Greeter theme debug mode.
# detect_theme_errors = Provide an option to load a fallback theme when theme errors are detected.
# screensaver_timeout = Blank the screen after this many seconds of inactivity.
# secure_mode = Don't allow themes to make remote http requests.
# time_format = A moment.js format string so the greeter can generate localized time for display.
# time_language = Language to use when displaying the time or "auto" to use the system's language.
# webkit_theme = Webkit theme to use.
#
# NOTE: See moment.js documentation for format string options: http://momentjs.com/docs/#/displaying/format/
#
[greeter]
debug_mode = false
detect_theme_errors = true
screensaver_timeout = 300
secure_mode = true
time_format = LT
time_language = auto
webkit_theme = antergos
#
# [branding]
# background_images = Path to directory that contains background images for use by themes.
# logo = Path to logo image for use by greeter themes.
# user_image = Default user image/avatar. This is used by themes for users that have no .face image.
#
# NOTE: Paths must be accessible to the lightdm system user account (so they cannot be anywhere in /home)
#
[branding]
background_images = /usr/share/backgrounds
logo = /usr/share/pixmaps/archlinux-logo.svg
user_image = /usr/share/pixmaps/archlinux-user.svg
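My suspicion is that the antergos theme's background animation is what keeps the GPU busy. A quick way to test that would be to point the greeter at a plainer theme and restart lightdm; the sketch below assumes the stock "simple" theme shipped with the greeter package is installed, so treat it as an untested suggestion rather than a fix:

# Hypothetical A/B test: swap the animated antergos theme for the
# bundled "simple" theme (back up the config first):
sudo sed -i 's/^webkit_theme = antergos$/webkit_theme = simple/' /etc/lightdm/lightdm-webkit2-greeter.conf
sudo systemctl restart lightdm
# ...then re-check utilization with nvidia-smi from another SSH session.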
I think you are probably right about it being related to the animation. I'd be interested to know if you see the same behavior with our 3.0.0 release candidate.