My use case: I’m streaming video to a Linux virtual mount and want the video files compressed on the fly.

Rclone has an experimental compression remote, but this data is important to me, so that’s no good. I know rsync can do compression, but will it work for video files, and how do I get rsync to watch the virtual mount point and automatically compress and move each individual file over to rclone for upload to the Cloud? This is mostly to save on upload bandwidth and storage costs.

Thanks!

Edit: I’m stupid for not mentioning this, but the problem I’m facing is that I don’t have much local storage, which is why I wanted a transparent compression layer that pushes everything directly to the Cloud. This might not be worth it though, since video files are already compressed. I will take a look at Handbrake though, thanks!

  • WIPocket@lemmy.world · 6 months ago

    What is the format of these videos? I’m afraid you won’t get much compression out of conventional file compressors, as video files are usually already compressed to the point where you would have to re-encode them to get a smaller file.

    • MigratingtoLemmy@lemmy.worldOP · 6 months ago

      You’re right, I don’t know why I didn’t consider that. This is going to be a mix of security camera footage and live-streamed video that I’ll store on the cloud, and the problem is that I have horrible upload speeds along with no local storage for caching.

  • Björn Tantau@swg-empire.de · 6 months ago

    What is your end goal? Do you want to back up your videos with minimal storage costs? Compression won’t help you (because videos are already compressed) unless you can accept data loss through re-encoding. Handbrake (or pure ffmpeg) would be the tool to re-encode lots of files. This could save you space but you may have some loss of quality, depending on the configuration you use and how the original videos are encoded.
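    For illustration, a minimal ffmpeg re-encode could look something like the sketch below (the file names and the CRF value are placeholders, not a recommendation; raising the CRF shrinks files at the cost of quality):

        # re-encode every .mp4 in the current directory to HEVC, copying the audio untouched
        for f in *.mp4; do
            ffmpeg -i "$f" -c:v libx265 -preset medium -crf 26 -c:a copy "${f%.mp4}.mkv"
        done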

    If you just want the videos to be available for streaming, tools like Jellyfin or Emby would do the job. They are servers that re-encode your media for streaming on the fly, depending on the client capabilities and your bandwidth settings.

    • MigratingtoLemmy@lemmy.worldOP · 6 months ago

      The problem is that I don’t have local storage, and I don’t have very good upload bandwidth either. Compression could in theory solve the bandwidth problem and the cloud storage costs to an extent, but I completely missed the part about video being already compressed. I’ll take a look at Handbrake though. Essentially, what I want is a transparent layer that will compress (or rather re-encode, since plain compression seems pointless) video files without touching local storage and shove them into my virtual FUSE system for upload directly to the Cloud.

  • Sims@lemmy.ml · 6 months ago

    Not much help, but a quick search revealed this: https://github.com/nschlia/ffmpegfs

    It seems to be read-only though, so I’m not sure it covers the use case you described. If you can program a little (AI help?), find a simple FUSE filesystem in a language you know, fiddle with it, and call ffmpeg or similar when files come in.

  • michaelmrose@lemmy.world · 6 months ago

    Already-compressed files like video don’t meaningfully compress further. Compression isn’t magic.

  • 9point6@lemmy.world · 6 months ago

    Unless you’ve got raw uncompressed video, any kind of transparent compression like you describe is only going to cost you in energy bills for no benefit. Most video is already compressed with specialised video compression as part of the file format; you can’t keep compressing stuff and getting smaller files.

    The alternative is lossy compression, which you could automate with some scripts or a transcoding tool like tdarr. This would reduce the quality of the video in order to reduce the file size.

    • MigratingtoLemmy@lemmy.worldOP · 6 months ago

      The problem is that I don’t have the local storage to maintain a watch folder for continually streaming video. I want to write semi-directly to the Cloud, which is why I’m looking for a transparent re-encoding layer. Can Handbrake do this?

      • 9point6@lemmy.world · 6 months ago

        I’m not sure about doing it transparently; that’s more in the tdarr wheelhouse, I’d say. You’d dump the files into a monitored folder and it would replace each one with a version transcoded to your specification.

        Transcoding video takes a fair bit of time and energy too, FWIW, so you’re going to need enough local storage to hold both the full-size file and the smaller one.

        I have to question the idea though: cloud storage is always more expensive than local for anything remotely non-temporary, and transcoding a load of video all the time is going to increase your energy bills. If you have any kind of internet bandwidth restrictions, that’s gonna factor in too.

        I’d say it would be better to save up for a cheap external hard drive to store your video on. For a year’s subscription to a cloud storage service that would provide enough space for a media library, you could probably get twice the amount of storage forever.

        • MigratingtoLemmy@lemmy.worldOP · 6 months ago

          I’m going to be using an SBC for this, which doesn’t have the capacity for an extra storage drive. Also, I’m planning to move in a couple of months, and I wouldn’t want to deal with storage in the middle of all of this. The cloud isn’t as insanely expensive as I initially thought; B2 is $6/TB, and I hope that with re-encoded streams at an OK resolution I wouldn’t go beyond 1.5 TB a month (I’ll be deleting stuff with bucket policies, of course).

          I’ll take a look at tdarr alongside ffmpeg, thanks!

          • 9point6@lemmy.world · 6 months ago

            Okay, fair play, if you’re doing this super short term it could make sense. Though I question what SBC you’re using that’s capable of transcoding video but can’t take an external drive.

            $12/month for your 2 TB of usage would make sense for maybe 5 months before it would be cheaper to buy an external disk, and of course that storage is gone once the subscription ends, vs. a hard disk which will probably last you a decade or so.

            • MigratingtoLemmy@lemmy.worldOP · 6 months ago

              I was considering the usual BananaPi/OrangePi/Radxa/Pine64 SBCs; do those not have enough horsepower? I’d like to stay under $80 for my SBC purchase, and it will be doing double duty, managing some services like DNS and music scrobbling alongside uploading to the cloud.

              • 9point6@lemmy.world · 6 months ago

                Hardware transcoding on SBCs is generally not fantastic; you’re gonna want to look for one that has VAAPI/VDPAU support, or you’re gonna be looking at 100% CPU for half a day to transcode a film, which will make your other services effectively unavailable in the meantime.
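                As a rough illustration, hardware encoding through VAAPI in ffmpeg looks roughly like the sketch below (the render node path, codec, and quality value are assumptions and depend entirely on what the board’s driver actually exposes):

                    # software decode, upload frames to the GPU, encode with the VAAPI H.264 encoder
                    # (hevc_vaapi exists too where the hardware supports it)
                    ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 \
                        -vf 'format=nv12,hwupload' -c:v h264_vaapi -qp 24 -c:a copy output.mp4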

                I used to run my Plex server on a Pi 4 with 4 GB of RAM and it basically crashed any time transcoding kicked in; I swapped to an Intel NUC so I could get Quick Sync for transcoding.

                I’ll point out though, every SBC you’ve listed has USB, which is all you need for an external disk. If you’re worried about size, I’ve got a 5 TB external drive that’s about 5 cm square, which is basically the footprint of any SBC you could use in this scenario.

  • tal@lemmy.today · 6 months ago

    https://github.com/bcopeland/gstfs

    If you want to do it at the filesystem level, which is what you seem to be asking for, this could do it. I have not used it myself.

    If you want to just watch a local directory or directory tree for a file being closed (like, the stream is complete) and then run a command on it (like, to compress and upload it), it sounds like you could use inotifywait with the close_write event.
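    A minimal, untested sketch combining the two steps (the watch directory, rclone remote name, and encoder settings are all placeholders; only the original recording ever touches local disk, since ffmpeg pipes the re-encoded output straight into rclone rcat):

        # when a file in ./incoming is fully written, re-encode it, stream the result
        # to cloud storage, then delete the local copy
        inotifywait -m -e close_write --format '%w%f' ./incoming | while read -r f; do
            ffmpeg -i "$f" -c:v libx265 -crf 28 -c:a copy -f matroska - \
                | rclone rcat "remote:videos/$(basename "${f%.*}").mkv"
            rm -- "$f"
        done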

    • MigratingtoLemmy@lemmy.worldOP · 6 months ago

      Thanks, but the second problem I’m working with (and what I forgot to mention) is that I have no local storage - I would like to write semi-directly to cloud storage. I can probably manage a few GB for caching and that’s it.

  • slazer2au@lemmy.world · 6 months ago

    This may sound stupid, but what about tdarr?

    Stash the file in a staging directory that tdarr watches, have tdarr convert it to something small like H.265, and output the converted file to a folder rsync watches.
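    Since rsync doesn’t watch folders by itself, the upload leg could just run on a timer; a hedged sketch (the paths and destination are placeholders):

        # every 5 minutes, push whatever tdarr has finished and delete the local copy
        while true; do
            rsync -av --remove-source-files /srv/converted/ backuphost:/videos/
            sleep 300
        done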

    • MigratingtoLemmy@lemmy.worldOP · 6 months ago

      Thank you, but there’s another problem: I don’t have local storage to write the files to and then upload. I need to write semi-directly to the Cloud.

  • 𝒎𝒂𝒏𝒊𝒆𝒍@sopuli.xyz · 6 months ago

    As the rest said, lossless compression won’t really work on media files since they’re already compressed. There are probably some FUSE-based compression layers you could mount over your cloud storage mount point (if it supports mounting in Linux), and that would be transparent, but for video files I believe your only solution is to re-encode them; Handbrake is a nice GUI tool for that.

  • TheHolm@aussie.zone · 6 months ago

    Do you plan to compress video (which is generally an already-compressed format) when saving to the remote location? I don’t see a use case for it: either you use lossless compression and don’t shrink it in any meaningful way, or you re-encode to a different format and lose quality. The second option is simpler to achieve by re-encoding before sending out.

    • MigratingtoLemmy@lemmy.worldOP · 6 months ago

      I plan to re-encode on the fly before I push the files to the Cloud. This would have to happen in real time though, since I don’t have the space to cache files (and I have slow upload speeds).

  • catloaf@lemm.ee · 6 months ago

    This sounds like an XY problem. I would first recommend not streaming to disk, but streaming to a program that does compression, and writing that to disk.

    If that’s unavoidable, can you stream to a pipe or socket? A compressor should be able to read from there and do its work.
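    A hedged sketch of the pipe idea (the fifo path, camera URL, encoder settings, and rclone remote are all placeholders); the re-encoded stream never touches the disk:

        # create a named pipe, start the uploader reading from it in the background,
        # then let ffmpeg re-encode the incoming stream into the pipe
        mkfifo /tmp/cam.mkv
        rclone rcat "remote:recordings/cam-$(date +%F).mkv" < /tmp/cam.mkv &
        ffmpeg -i rtsp://camera.local/stream -c:v libx265 -crf 28 -c:a aac -f matroska /tmp/cam.mkv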

    If you can’t control the stream to disk, there are plenty of file watcher tools that can run arbitrary commands.

    • MigratingtoLemmy@lemmy.worldOP · 6 months ago

      Streaming to a socket sounds like a decent idea. I don’t know how, or to which program, I could stream; is there a way for Handbrake to transparently re-encode video and send it to a virtual FUSE mount point? The problem I have is no local storage to keep video on.

  • Decronym@lemmy.decronym.xyz (bot) · 6 months ago

    Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

    Fewer Letters   More Letters
    DNS             Domain Name Service/System
    NUC             Next Unit of Computing brand of Intel small computers
    Plex            Brand of media server package
    SBC             Single-Board Computer

    4 acronyms in this thread; the most compressed thread commented on today has 3 acronyms.

    [Thread #759 for this sub, first seen 23rd May 2024, 19:45]