Background story: I recently bought a computer with an AMD 7000-series CPU and GPU.

amdgpu_top reports 15~20 watts in normal desktop usage, but as soon as I have a video playing in VLC, it goes to a constant 45 watts, which is undesirable behavior, especially in summer. (I hope that is just a reporting issue… but my computer is hot.)

When I run DRI_PRIME=1 vlc and then play videos, amdgpu_top doesn’t report the power surge. (I have the iGPU enabled.)

Is there anything more convenient than modifying individual .desktop files? KDE malfunctions when I put export DRI_PRIME=1 in .xprofile, so that’s a no-go.
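
(For reference, the per-app .desktop change I mean is roughly this; a rough sketch with Arch paths, where the sed pattern just prefixes the Exec line with the env var:)

# copy the system launcher and force the env var in the local copy
cp /usr/share/applications/vlc.desktop ~/.local/share/applications/
sed -i 's|^Exec=|Exec=env DRI_PRIME=1 |' ~/.local/share/applications/vlc.desktop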


Solved: removing the Mesa-related hardware acceleration packages makes VLC fall back to libplacebo, which doesn’t do these weird things.

  • Maxy@lemmy.blahaj.zone

Are you just running an AMD CPU with integrated graphics, or do you also have a dedicated graphics card? From what I can gather online, the DRI_PRIME variable is mostly used for render offloading to a dedicated GPU, but your question appears to be about iGPUs.

    You can also try to manually enable hardware decoding in VLC’s settings. Just go to Tools > Preferences > Input & Codecs and choose VA-API (AMD’s preferred standard).
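
If you want to test it from a terminal before changing settings, the same thing can be forced for a single run (a sketch; the video path is just a placeholder):

# force VA-API hardware decoding for this run only
vlc --avcodec-hw=vaapi ~/Videos/test.mkv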

• axzxc1236@lemm.ee (OP)

“do you also have a dedicated graphics card?”

Yes, an RX 7800 XT.

My worry is that playing a 1080p video needs 30 watts (assuming amdgpu_top is not wrong); I would like to move that workload to the integrated GPU, which I enabled in the BIOS.

Thank you for your answer. I can confirm that switching to VA-API lowers my power usage by a lot (from 45 down to 20~21 watts reported).

• axzxc1236@lemm.ee (OP)

Through some more testing, I found out that the Mesa-related hardware acceleration packages can cause this power surge; on Arch Linux that means mesa-vdpau and libva-mesa-driver.

If I don’t have these packages installed, VLC falls back to libplacebo, which doesn’t seem to cause the extra power usage.
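
For anyone on Arch who wants to reproduce this, the removal is something like the following (a sketch; check what depends on these packages first):

# drop Mesa’s VDPAU and VA-API drivers plus their now-orphaned deps
sudo pacman -Rns mesa-vdpau libva-mesa-driver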

  • Sh1nyM3t4l4ss@lemmy.world

One thing you could do is plug your monitor straight into the iGPU outputs and use DRI_PRIME only for applications that need the powerful dGPU.

    Unless you want to run either everything or nothing on a specific GPU, I don’t think there’s a more convenient way than setting DRI_PRIME per application.

    • cyanarchy@sh.itjust.works

I do this. It’s part of a GPU passthrough setup, but in practice there aren’t many applications that require PRIME offload. I don’t use it for web browsers, where I watch a lot of videos. I haven’t used VLC in a while, but I’m pretty sure I don’t use it there either. Games and graphical applications, mostly. If I were doing video editing or modeling, I would probably want it there too.

• axzxc1236@lemm.ee (OP)

Yes, an RX 7800 XT. I can confirm DRI_PRIME does switch to the integrated GPU on demand:

$ DRI_PRIME=0 glxinfo | grep "OpenGL renderer"
OpenGL renderer string: AMD Radeon Graphics (gfx1101, LLVM 16.0.6, DRM 3.54, 6.5.5-arch1-1)
$ DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
OpenGL renderer string: AMD Radeon Graphics (raphael_mendocino, LLVM 16.0.6, DRM 3.54, 6.5.5-arch1-1)
      
      • Sentau@feddit.de

I am assuming you have the monitor connected directly to the 7800 XT, which is why it is the default GPU.

Is the decoding being done in hardware when watching the video? amdgpu_top shows whether an application (VLC in this case) is using the decoding hardware (the column named DEC).

Also, using the iGPU for video decoding should be more efficient: the massive number of cores in the dGPU aren’t needed while decoding, yet they are kept powered up because the dGPU is active.
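
A quick way to check is to run both side by side (a rough sketch; any local video file works):

# terminal 1: play a video with VLC
vlc some_video.mkv
# terminal 2: watch VLC’s row; a non-zero DEC/VCN value means the
# fixed-function decoder is doing the work
amdgpu_top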

• axzxc1236@lemm.ee (OP)

The problem has been solved (it was caused by Mesa’s video decoding packages), but I will answer anyway.

Yes, the VCN (Video Core Next) column stays at a constant value while playing video (3% for VA-API with Mesa, 5% for VDPAU with Mesa, 0% for libplacebo), and GFX fluctuates between 0% and 1%.

Just playing a 1080p video (not even a high-bitrate one) is enough to make the GPU fan spin up, which is disappointing.

          • Sentau@feddit.de

Hmm, must be some bug in Mesa or the way it interacts with VLC. I use VA-API with Mesa for my decoding purposes on a laptop with a Vega iGPU and an RDNA1 dGPU, and I don’t see high energy usage. In fact I get much better battery life with VA-API hardware decoding.

  • Honeybee@lemmy.ml

Which distro are you using? Fedora, Manjaro, and a few others disabled hardware acceleration for certain codecs, making CPU usage and power consumption spike. For Fedora you can enable RPM Fusion and install the hardware-accelerated versions to get back to normal.

    https://rpmfusion.org/
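
The swap looks roughly like this (a sketch; check the RPM Fusion docs for the current package names):

# replace Fedora’s stripped-down Mesa drivers with the full builds
sudo dnf swap mesa-va-drivers mesa-va-drivers-freeworld
sudo dnf swap mesa-vdpau-drivers mesa-vdpau-drivers-freeworld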

• axzxc1236@lemm.ee (OP)

Turns out in my case it’s the Mesa driver causing the problem; after removing Mesa’s VA-API and VDPAU drivers, VLC can still play everything just fine, with CPU usage at 2~3%.