Rclone is a command line tool which enables you to mount a cloud share, and to sync files and directories to and from many cloud storage providers. Rclone also supports encryption and decryption of files synced to these services.

After installation, set up rclone by using the command: rclone config

The information below this point might be obsolete.

rclone mount --allow-other remote:path/to/files /path/to/local/mount &

The --allow-other mount option is important if you want to share the remote share with your docker containers or, for example, through samba. The local mount point should be inside /mnt/disks/ if you want to share the files with your docker containers.

Making your rclone mount accessible to your docker containers

For your docker containers to be able to access your remote share mount, you have to specify the path to the mount point in the docker container interface. Also remember to use the --allow-other mount option, as well as mounting the remote share inside /mnt/disks/. Set the container/host volume with a mode of Read Write,Slave, else the files will not show up inside the container. This is due to the fact that only mount points within /mnt/disks/ support Slave modes.

Is there a list somewhere of what file types work when mounted? E.g. I can't open MS Office files. Loving the possibilities though - just wish my upload was faster so I could seriously consider uploading my Plex files. At the moment I'm uploading to cloud sync so I won't need my server on all the time.

I don't know why Office files wouldn't work. Maybe because they aren't really streamable, but I doubt that should matter, especially given their size. But, yeah, I've been looking for a long time for a solution like this, and there is finally a viable one, as long as you have the speed. There is a community forum on rclone that might be worth a look if no one here knows. I'm just going to have to spend the next few years uploading until we finally get some cost-effective fiber in my area, which I am told will be soon. We already have one provider, but they are ridiculously expensive, have very small caps, and upload is restricted to 50 Mbps. We will eventually be getting some competition from what I can find, and they have the same price and speeds as Google Fiber. Ironically enough, I have 3 siblings that live in Google Fiber areas, or soon-to-be Google Fiber areas, and my parents are moving to another.
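Putting the mount and container guidance together, a minimal sketch could look like the following. The remote name `mycloud`, the mount path, and the throwaway test container are illustrative assumptions, not from the original posts; `--vfs-cache-mode writes` is a commonly suggested extra flag for applications (such as MS Office) that need random-access writes rather than pure streaming.

```shell
# Illustrative sketch: remote name "mycloud" and all paths are assumptions.
MOUNT_POINT="/mnt/disks/mycloud"          # must live under /mnt/disks/ for Slave mode
mkdir -p "$MOUNT_POINT" 2>/dev/null || true

if command -v rclone >/dev/null 2>&1; then
  # --allow-other lets other users (docker, samba) see the FUSE mount;
  # --vfs-cache-mode writes caches files opened for writing, which helps
  # applications that need random access instead of streaming.
  rclone mount mycloud: "$MOUNT_POINT" \
    --allow-other \
    --vfs-cache-mode writes &
fi

if command -v docker >/dev/null 2>&1; then
  # "rw,slave" matches the "Read Write,Slave" access mode in the unRAID UI;
  # slave propagation means a (re)mount on the host appears inside the container.
  docker run -d --name rclone-demo \
    -v "$MOUNT_POINT":/data:rw,slave \
    alpine tail -f /dev/null
fi
```

To take the share down again, unmount with `fusermount -u /mnt/disks/mycloud` before stopping the array.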
If you appreciate my work, then please consider buying me a coffee.

This is a simple plugin which installs rclone on your unRAID system. You can install the plugin from CA or from the plugin menu using this link. The plugins have now been merged, so both the stable and beta branches are available in the same plugin. You can now initiate an update of rclone from the settings page. The plugin will still update rclone on restart of the server, or if it is removed and reinstalled.

What is the problem you are having with rclone?

Logs show "ETA - Elapsed time: 7h32m50.5s, Transferred: 0 / 0 Byte" even after hours of running rclone.

Run the command 'rclone version' and share the full output of the command.

rclone: Version "v1.56.0" starting with parameters

Which cloud storage system are you using? (eg Google Drive)

Object Storage to AWS S3

The command you were trying to run (eg rclone copy /tmp remote:tmp)

The rclone config contents with secrets removed.

acl = bucket-owner-full-control

A log from the command with the -vv flag

04:50:12 DEBUG : rclone: Version "v1.56.0" starting with parameters
04:50:12 DEBUG : Creating backend with remote "ooss:FS2_FILEVAULT_PROD"
04:50:12 DEBUG : Using config file from "/home/bconnect/.config/rclone/rclone."
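For a report like the one above, a useful next step is usually to re-run the transfer with full debug output captured to a file. This is only a hedged sketch: the source path and log location are placeholders, and `ooss:FS2_FILEVAULT_PROD` is just the remote name taken from the log above.

```shell
# Placeholder paths; substitute your real source directory.
SRC="/tmp/source"
LOG_FILE="/tmp/rclone-debug.log"

if command -v rclone >/dev/null 2>&1; then
  # The version string belongs in the support template.
  rclone version

  # -vv enables DEBUG output; --progress shows live transfer stats, which
  # helps distinguish "nothing to transfer" from "transfers stalling".
  rclone copy "$SRC" ooss:FS2_FILEVAULT_PROD \
    -vv --progress --log-file="$LOG_FILE"
fi
```

Attaching the resulting log file to the forum post gives responders the backend and retry details that a bare "Transferred: 0 / 0 Byte" line does not.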