My use case: I'm streaming video to a virtual mount point on Linux and want the video files compressed on the fly.

Rclone has an experimental compression remote, but this data is important to me, so that's no good. I know rsync can do compression, but will it work for video files? And how do I get rsync to watch the virtual mount point and automatically compress and hand off each file to rclone for upload to the Cloud? This is mostly to save on upload bandwidth and storage costs.

Thanks!

Edit: I’m stupid for not mentioning this, but the problem I’m facing is that I don’t have much local storage, which is why I wanted a transparent compression layer that pushes everything directly to the Cloud. It might not be worth it, though, since video files are already compressed. I’ll take a look at Handbrake, thanks!

  • @WIPocket@lemmy.world · 20 points · 10 months ago

    What is the format of these videos? I’m afraid you won’t get much compression out of conventional file compressors, as video files are usually already compressed to the point where you would have to re-encode them to get a smaller file.

    • @MigratingtoLemmy@lemmy.world (OP) · 1 point · edited 10 months ago

      You’re right, I don’t know why I didn’t consider that. This is going to be a mix of security-camera footage and live-streaming video that I’ll store in the Cloud, and the problem is that I have horrible upload speeds along with no local storage for caching.

  • Björn Tantau · 10 points · 10 months ago

    What is your end goal? Do you want to back up your videos with minimal storage costs? Compression won’t help you (because videos are already compressed) unless you can accept data loss through re-encoding. Handbrake (or pure ffmpeg) would be the tool to re-encode lots of files. This could save you space but you may have some loss of quality, depending on the configuration you use and how the original videos are encoded.
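
    As a sketch of the ffmpeg route mentioned above (the helper name, file names, and CRF value are illustrative placeholders, not a recommended configuration):

    ```shell
    # Hypothetical helper: lossy re-encode to H.265, leaving audio untouched.
    # CRF 28 is a common starting point; lower = better quality, bigger file.
    reencode() {
      src="$1" dst="$2"
      ffmpeg -i "$src" -c:v libx265 -preset medium -crf 28 -c:a copy "$dst"
    }
    ```

    Usage would be something like `reencode camera.mp4 camera-small.mp4`; how much space this saves depends entirely on how the source was encoded.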

    If you just want the videos to be available for streaming, tools like Jellyfin or Emby would do the job. They are servers that re-encode your media for streaming on the fly, depending on the client capabilities and your bandwidth settings.

    • @MigratingtoLemmy@lemmy.world (OP) · 2 points · 10 months ago

      The problem is that I don’t have local storage, and I don’t have very good upload bandwidth either. Compression could in theory solve the bandwidth problem and cut cloud storage costs to an extent, but I completely missed the part about video being already compressed. I’ll take a look at Handbrake, though. Essentially, what I want is a transparent layer that re-encodes video files (plain compression seems pointless) without touching local storage and shoves them into my virtual FUSE filesystem for upload directly to the Cloud.

  • Sims · 4 points · 10 months ago

    Not much help, but a quick search revealed this: https://github.com/nschlia/ffmpegfs

    This seems to be read-only though, so I’m not sure it covers the use case you described. If you can program a little (AI help?), find a simple FUSE filesystem in a language you know, fiddle with it, and call ffmpeg or similar on incoming files.

  • @9point6@lemmy.world · 3 points · 10 months ago

    Unless you’ve got raw uncompressed video, any kind of transparent compression like you describe is only going to cost you in energy bills for no benefit. Most video is already compressed with specialised video compression as part of the file format; you can’t keep compressing stuff and getting smaller files.

    The alternative is lossy compression, which you could automate with some scripts or a transcoding tool like tdarr. This would reduce the quality of the video in order to reduce the file size.

    • @MigratingtoLemmy@lemmy.world (OP) · 1 point · 10 months ago

      The problem is that I don’t have the local storage to maintain a watch folder for continually streaming video. I want to write semi-directly to the Cloud, which is why I’m looking for a transparent re-encoding layer. Can Handbrake do this?

      • @9point6@lemmy.world · 1 point · 10 months ago

        I’m not sure about transparently; that’s more in the tdarr wheelhouse, I’d say. You’d dump the files into a monitored folder and it would replace them with versions transcoded to your specification.

        Transcoding video takes a fair bit of time and energy too, FWIW, so you’re going to need enough local storage to hold both the full-size file and the smaller one.

        I have to question the idea though, cloud storage is always more expensive than local for anything remotely non-temporary, and transcoding a load of video all the time is going to increase your energy bills. If you have any kind of internet bandwidth restrictions that’s gonna factor in too.

        I’d say it would be better to save up for a cheap external hard drive to store your video on. For a year’s subscription to a cloud storage service that would provide enough space for a media library, you could probably get twice the amount of storage forever.

  • slazer2au · 2 points · 10 months ago

    This sounds stupid but what about tdarr?

    Stash the file in a staging directory that tdarr watches, have tdarr convert the file to something small like H.265, and output the converted file to a folder rsync watches.
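
    The rsync leg of that pipeline could look something like this (the function name, directory, and remote are made-up placeholders; `--remove-source-files` keeps the local staging side from filling up):

    ```shell
    # Push tdarr's output folder to a remote host and delete the local
    # copies once they've transferred; run this from cron or a watch loop.
    push_converted() {
      src_dir="$1" dest="$2"
      rsync -av --remove-source-files "$src_dir"/ "$dest"
    }
    ```

    For example, `push_converted /srv/tdarr-out backup@nas:/videos` would drain the output folder to an SSH-reachable host.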

    • @MigratingtoLemmy@lemmy.world (OP) · 2 points · 10 months ago

      Thank you, but there’s another problem: I don’t have local storage to write the files to and then upload. I need to write semi-directly to the Cloud.

  • @tal@lemmy.today · 2 points · edited 10 months ago

    https://github.com/bcopeland/gstfs

    If you want to do it at the filesystem level, which is what it sounds like you’re asking for, it sounds like this could do it. I have not used it.

    If you want to just watch a local directory or directory tree for a file being closed (like, the stream is complete) and then run a command on it (like, to compress and upload it), it sounds like you could use inotifywait with the close_write event.
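
    A minimal sketch of that inotifywait approach, assuming inotify-tools, ffmpeg, and rclone are installed; the directory, the `mycloud:` remote, and the encoder settings are all placeholders:

    ```shell
    WATCH_DIR="${WATCH_DIR:-/srv/incoming}"   # hypothetical spool directory
    REMOTE="${REMOTE:-mycloud:videos}"        # hypothetical rclone remote

    # Re-encode one finished file, upload it, then drop the original.
    process_file() {
      f="$1"
      tmp="${f%.*}.x265.mkv"
      ffmpeg -i "$f" -c:v libx265 -crf 28 -c:a copy "$tmp" &&
        rclone moveto "$tmp" "$REMOTE/$(basename "$tmp")" &&
        rm -f -- "$f"
    }

    # close_write fires when the writer closes the file,
    # i.e. the stream for that file is complete.
    watch_loop() {
      inotifywait -m -e close_write --format '%w%f' "$WATCH_DIR" |
        while IFS= read -r path; do
          process_file "$path"
        done
    }
    ```

    Note this still needs enough local space to hold one file plus its re-encoded copy at a time.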

    • @MigratingtoLemmy@lemmy.world (OP) · 1 point · 10 months ago

      Thanks, but the second problem I’m working with (and what I forgot to mention) is that I have no local storage - I would like to write semi-directly to cloud storage. I can probably manage a few GB for caching and that’s it.

  • @catloaf@lemm.ee · 1 point · 10 months ago

    This sounds like an XY problem. I would first recommend not streaming to disk, but streaming to a program that does compression, and write that to disk.

    If that’s unavoidable, can you stream to a pipe or socket? A compressor should be able to read from there and do its work.

    If you can’t control the stream to disk, there are plenty of file watcher tools that can run arbitrary commands.
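
    The pipe idea above can be sketched with a named FIFO, assuming ffmpeg and rclone (`rclone rcat` streams stdin straight to a remote); the function name, FIFO path, and remote are placeholders:

    ```shell
    # The recorder writes to a named pipe instead of a file; ffmpeg
    # re-encodes whatever comes through and rclone rcat streams the result
    # to cloud storage, so nothing but the pipe buffer touches local disk.
    stream_to_cloud() {
      fifo="$1" remote="$2"
      [ -p "$fifo" ] || mkfifo "$fifo"
      # Matroska muxes fine to a pipe; plain mp4 needs a seekable output.
      ffmpeg -i "$fifo" -c:v libx265 -crf 28 -c:a copy -f matroska - |
        rclone rcat "$remote"
    }
    ```

    You'd run something like `stream_to_cloud /tmp/cam0.fifo mycloud:cam0.mkv &` and point the recording software at `/tmp/cam0.fifo` as if it were a regular file. The catch is that transcoding has to keep up with the incoming stream, or the writer blocks on the full pipe.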

    • @MigratingtoLemmy@lemmy.world (OP) · 1 point · 10 months ago

      Streaming to a socket sounds like a decent idea. I don’t know how, though, or which program I could stream to; is there a way for Handbrake to transparently re-encode video and send it to a virtual FUSE mount point? The problem I have is no local storage to keep video on.

  • 𝒎𝒂𝒏𝒊𝒆𝒍 · 1 point · 10 months ago

    As the rest said, lossless compression won’t really work on media files, as they’re already compressed. There are probably some FUSE-based compression layers you could mount over your cloud storage mount point (if it supports mounting on Linux), and that would be transparent, but for video files I believe your only solution is to re-encode them; Handbrake is a nice GUI tool.