Version v1.35

s3-about
Nick Craig-Wood 2017-01-02 15:30:34 +00:00
parent f538fd8eb4
commit 5b8b379feb
36 changed files with 1671 additions and 383 deletions


@ -12,7 +12,7 @@
<div id="header">
<h1 class="title">rclone(1) User Manual</h1>
<h2 class="author">Nick Craig-Wood</h2>
<h3 class="date">Jan 02, 2017</h3>
</div>
<h1 id="rclone">Rclone</h1>
<p><a href="http://rclone.org/"><img src="http://rclone.org/img/rclone-120x120.png" alt="Logo" /></a></p>
@ -46,8 +46,8 @@
<ul>
<li><a href="http://rclone.org/">Home page</a></li>
<li><a href="http://github.com/ncw/rclone">Github project page for source and bug tracker</a></li>
<li><a href="https://forum.rclone.org">Rclone Forum</a></li>
<li><a href="https://google.com/+RcloneOrg" rel="publisher">Google+ page</a></li>
<li><a href="http://rclone.org/downloads/">Downloads</a></li>
</ul>
<h1 id="install">Install</h1>
@ -168,7 +168,7 @@ destpath/sourcepath/two.txt</code></pre>
<h2 id="rclone-move">rclone move</h2>
<p>Move files from source to dest.</p>
<h3 id="synopsis-3">Synopsis</h3>
<p>Moves the contents of the source directory to the destination directory. Rclone will error if the source and destination overlap and the remote does not support a server side directory move operation.</p>
<p>If no filters are in use and if possible this will server side move <code>source:path</code> into <code>dest:path</code>. After this <code>source:path</code> will no longer exist.</p>
<p>Otherwise for each file in <code>source:path</code> selected by the filters (if any) this will move it into <code>dest:path</code>. If possible a server side move will be used, otherwise it will copy it (server side if possible) into <code>dest:path</code> then delete the original (if no errors on copy) in <code>source:path</code>.</p>
<p><strong>Important</strong>: Since this can cause data loss, test first with the --dry-run flag.</p>
@ -325,9 +325,25 @@ two-3.txt: renamed from: two.txt</code></pre>
<p>Or like this to output any .txt files in dir or subdirectories.</p>
<pre><code>rclone --include &quot;*.txt&quot; cat remote:path/to/dir</code></pre>
<pre><code>rclone cat remote:path</code></pre>
<h2 id="rclone-copyto">rclone copyto</h2>
<p>Copy files from source to dest, skipping already copied</p>
<h3 id="synopsis-20">Synopsis</h3>
<p>If source:path is a file or directory then it copies it to a file or directory named dest:path.</p>
<p>This can be used to upload single files to other than their current name. If the source is a directory then it acts exactly like the copy command.</p>
<p>So</p>
<pre><code>rclone copyto src dst</code></pre>
<p>where src and dst are rclone paths, either remote:path or /path/to/local or C:.</p>
<p>This will:</p>
<pre><code>if src is file
copy it to dst, overwriting an existing file if it exists
if src is directory
copy it to dst, overwriting existing files if they exist
see copy command for full details</code></pre>
<p>This doesn't transfer unchanged files, testing by size and modification time or MD5SUM. It doesn't delete files from the destination.</p>
<pre><code>rclone copyto source:path dest:path</code></pre>
<h2 id="rclone-genautocomplete">rclone genautocomplete</h2>
<p>Output bash completion script for rclone.</p>
<h3 id="synopsis-21">Synopsis</h3>
<p>Generates a bash shell autocompletion script for rclone.</p>
<p>This writes to /etc/bash_completion.d/rclone by default so will probably need to be run with sudo or as root, eg</p>
<pre><code>sudo rclone genautocomplete</code></pre>
@ -337,12 +353,12 @@ two-3.txt: renamed from: two.txt</code></pre>
<pre><code>rclone genautocomplete [output_file]</code></pre>
<h2 id="rclone-gendocs">rclone gendocs</h2>
<p>Output markdown docs for rclone to the directory supplied.</p>
<h3 id="synopsis-22">Synopsis</h3>
<p>This produces markdown docs for the rclone commands to the directory supplied. These are in a format suitable for hugo to render into the rclone.org website.</p>
<pre><code>rclone gendocs output_directory</code></pre>
<h2 id="rclone-listremotes">rclone listremotes</h2>
<p>List all the remotes in the config file.</p>
<h3 id="synopsis-23">Synopsis</h3>
<p>rclone listremotes lists all the available remotes from the config file.</p>
<p>When used with the -l flag it lists the types too.</p>
<pre><code>rclone listremotes</code></pre>
@ -350,7 +366,7 @@ two-3.txt: renamed from: two.txt</code></pre>
<pre><code> -l, --long Show the type as well as names.</code></pre>
<h2 id="rclone-mount">rclone mount</h2>
<p>Mount the remote as a mountpoint. <strong>EXPERIMENTAL</strong></p>
<h3 id="synopsis-24">Synopsis</h3>
<p>rclone mount allows Linux, FreeBSD and macOS to mount any of Rclone's cloud storage systems as a file system with FUSE.</p>
<p>This is <strong>EXPERIMENTAL</strong> - use with care.</p>
<p>First set up your remote using <code>rclone config</code>. Check it works with <code>rclone ls</code> etc.</p>
@ -371,7 +387,7 @@ two-3.txt: renamed from: two.txt</code></pre>
<ul>
<li>All the remotes should work for read, but some may not for write
<ul>
<li>those which need to know the size in advance won't - eg B2</li>
<li>maybe should pass in size as -1 to mean work it out</li>
<li>Or put in an upload cache to cache the files on disk first</li>
</ul></li>
@ -398,6 +414,29 @@ two-3.txt: renamed from: two.txt</code></pre>
--uid uint32 Override the uid field set by the filesystem. (default 502)
--umask int Override the permission bits set by the filesystem. (default 2)
--write-back-cache Makes kernel buffer writes before sending them to rclone. Without this, writethrough caching is used.</code></pre>
<h2 id="rclone-moveto">rclone moveto</h2>
<p>Move file or directory from source to dest.</p>
<h3 id="synopsis-25">Synopsis</h3>
<p>If source:path is a file or directory then it moves it to a file or directory named dest:path.</p>
<p>This can be used to rename files or upload single files to other than their existing name. If the source is a directory then it acts exactly like the move command.</p>
<p>So</p>
<pre><code>rclone moveto src dst</code></pre>
<p>where src and dst are rclone paths, either remote:path or /path/to/local or C:.</p>
<p>This will:</p>
<pre><code>if src is file
move it to dst, overwriting an existing file if it exists
if src is directory
move it to dst, overwriting existing files if they exist
see move command for full details</code></pre>
<p>This doesn't transfer unchanged files, testing by size and modification time or MD5SUM. src will be deleted on successful transfer.</p>
<p><strong>Important</strong>: Since this can cause data loss, test first with the --dry-run flag.</p>
<pre><code>rclone moveto source:path dest:path</code></pre>
<h2 id="rclone-rmdirs">rclone rmdirs</h2>
<p>Remove any empty directories under the path.</p>
<h3 id="synopsis-26">Synopsis</h3>
<p>This removes any empty directories (or directories that only contain empty directories) under the path that it finds, including the path if it has nothing in it.</p>
<p>This is useful for tidying up remotes that rclone has left a lot of empty directories in.</p>
<pre><code>rclone rmdirs remote:path</code></pre>
<h2 id="copying-single-files">Copying single files</h2>
<p>rclone normally syncs or copies directories. However if the source remote points to a file, rclone will just copy that file. The destination remote must point to a directory - rclone will give the error <code>Failed to create file system for &quot;remote:file&quot;: is a file not a directory</code> if it isn't.</p>
<p>For example, suppose you have a remote with a file in it called <code>test.jpg</code>, then you could copy just that file like this</p>
@ -504,9 +543,16 @@ rclone sync /path/to/files remote:current-backup</code></pre>
<p>Normally rclone will look at modification time and size of files to see if they are equal. If you set this flag then rclone will check only the size.</p>
<p>This can be useful transferring files from dropbox which have been modified by the desktop sync client which doesn't set checksums or modification times in the same way as rclone.</p>
<h3 id="statstime">--stats=TIME</h3>
<p>Commands which transfer data (<code>sync</code>, <code>copy</code>, <code>copyto</code>, <code>move</code>, <code>moveto</code>) will print data transfer stats at regular intervals to show their progress.</p>
<p>This sets the interval.</p>
<p>The default is <code>1m</code>. Use 0 to disable.</p>
<p>If you set the stats interval then all commands can show stats. This can be useful when running other commands, <code>check</code> or <code>mount</code> for example.</p>
<h3 id="stats-unitbitsbytes">--stats-unit=bits|bytes</h3>
<p>By default data transfer rates will be printed in bytes/second.</p>
<p>This option allows the data rate to be printed in bits/second.</p>
<p>Data transfer volume will still be reported in bytes.</p>
<p>The rate is reported as a binary unit, not SI unit. So 1 Mbit/s equals 1,048,576 bits/s and not 1,000,000 bits/s.</p>
<p>The default is <code>bytes</code>.</p>
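<p>To make the binary-unit convention concrete, here is a quick shell check of the two figures quoted above (an illustration only, not an rclone command):</p>

```shell
# 1 Mbit/s in the binary convention rclone reports, vs the SI convention:
echo $((1024 * 1024))   # 1048576 bits/s
echo $((1000 * 1000))   # 1000000 bits/s
```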
<h3 id="delete-beforeduringafter">--delete-(before,during,after)</h3>
<p>This option allows you to specify when files on your destination are deleted when you sync folders.</p>
<p>Specifying the value <code>--delete-before</code> will delete all files present on the destination, but not on the source <em>before</em> starting the transfer of any new or updated files. This uses extra memory as it has to store the source listing before proceeding.</p>
@ -563,6 +609,12 @@ c/u/q&gt;</code></pre>
<p>rclone uses <a href="https://godoc.org/golang.org/x/crypto/nacl/secretbox">nacl secretbox</a> which in turn uses XSalsa20 and Poly1305 to encrypt and authenticate your configuration with secret-key cryptography. The password is SHA-256 hashed, which produces the key for secretbox. The hashed password is not stored.</p>
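<p>The key-size arithmetic can be sketched from the shell. This only illustrates why SHA-256 suits secretbox, which takes a 32 byte key; <code>hunter2</code> is a made-up sample password and the details of rclone's actual derivation may differ:</p>

```shell
# SHA-256 always produces 32 bytes (64 hex characters) - the key size
# nacl secretbox expects. 'hunter2' is just a placeholder password.
key=$(printf '%s' 'hunter2' | sha256sum | cut -d ' ' -f 1)
echo "${#key}"   # 64
```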
<p>While this provides very good security, we do not recommend storing your encrypted rclone configuration in public if it contains sensitive information, maybe except if you use a very strong password.</p>
<p>If it is safe in your environment, you can set the <code>RCLONE_CONFIG_PASS</code> environment variable to contain your password, in which case it will be used for decrypting the configuration.</p>
<p>You can set this for a session from a script. For unix like systems save this to a file called <code>set-rclone-password</code>:</p>
<pre><code>#!/bin/echo Source this file don&#39;t run it
read -s RCLONE_CONFIG_PASS
export RCLONE_CONFIG_PASS</code></pre>
<p>Then source the file when you want to use it. From the shell you would do <code>source set-rclone-password</code>. It will then ask you for the password and set it in the environment variable.</p>
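<p>The reason the file must be sourced rather than executed is a general unix rule, not something rclone-specific: a child process cannot change its parent's environment, as this small demonstration shows.</p>

```shell
# Running a command in a child shell cannot set variables here...
DEMO_PASS=unset
sh -c 'DEMO_PASS=from-child'   # the change is lost when the child exits
echo "$DEMO_PASS"              # unset
# ...which is why 'source set-rclone-password' must run in the current shell.
```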
<p>If you are running rclone inside a script, you might want to disable password prompts. To do that, pass the parameter <code>--ask-password=false</code> to rclone. This will make rclone fail instead of asking for a password if <code>RCLONE_CONFIG_PASS</code> doesn't contain a valid password.</p>
<h2 id="developer-options">Developer options</h2>
<p>These options are useful when developing or debugging rclone. There are also some more remote specific options which aren't documented here which are used for testing. These start with remote name eg <code>--drive-test-option</code> - see the docs for the remote in question.</p>
@ -671,10 +723,9 @@ y/e/d&gt;</code></pre>
<p>Rclone has a sophisticated set of include and exclude rules. Some of these are based on patterns and some on other things like file size.</p>
<p>The filters are applied for the <code>copy</code>, <code>sync</code>, <code>move</code>, <code>ls</code>, <code>lsl</code>, <code>md5sum</code>, <code>sha1sum</code>, <code>size</code>, <code>delete</code> and <code>check</code> operations. Note that <code>purge</code> does not obey the filters.</p>
<p>Each path as it passes through rclone is matched against the include and exclude rules like <code>--include</code>, <code>--exclude</code>, <code>--include-from</code>, <code>--exclude-from</code>, <code>--filter</code>, or <code>--filter-from</code>. The simplest way to try them out is using the <code>ls</code> command, or <code>--dry-run</code> together with <code>-v</code>.</p>
<h2 id="patterns">Patterns</h2>
<p>The patterns used to match files for inclusion or exclusion are based on &quot;file globs&quot; as used by the unix shell.</p>
<p>If the pattern starts with a <code>/</code> then it only matches at the top level of the directory tree, <strong>relative to the root of the remote</strong> (not necessarily the root of the local drive). If it doesn't start with <code>/</code> then it is matched starting at the <strong>end of the path</strong>, but it will only match a complete path element:</p>
<pre><code>file.jpg - matches &quot;file.jpg&quot;
- matches &quot;directory/file.jpg&quot;
- doesn&#39;t match &quot;afile.jpg&quot;
@ -722,9 +773,9 @@ y/e/d&gt;</code></pre>
<p>Rclone implements bash style <code>{a,b,c}</code> glob matching which rsync doesn't.</p>
<p>Rclone always does a wildcard match so <code>\</code> must always escape a <code>\</code>.</p>
<h2 id="how-the-rules-are-used">How the rules are used</h2>
<p>Rclone maintains a combined list of include rules and exclude rules.</p>
<p>Each file is matched in order, starting from the top, against the rules in the list until it finds a match. The file is then included or excluded according to the rule type.</p>
<p>If the matcher fails to find a match after testing against all the entries in the list then the path is included.</p>
<p>For example given the following rules, <code>+</code> being include, <code>-</code> being exclude,</p>
<pre><code>- secret*.jpg
+ *.jpg
@ -745,11 +796,26 @@ y/e/d&gt;</code></pre>
<p>A similar process is done on directory entries before recursing into them. This only works on remotes which have a concept of directory (Eg local, google drive, onedrive, amazon drive) and not on bucket based remotes (eg s3, swift, google compute storage, b2).</p>
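<p>The first-match-wins evaluation described above can be sketched with shell <code>case</code> globs standing in for rclone's matcher (a rough illustration only; note that unlike rclone's patterns, a shell <code>*</code> also matches across <code>/</code>):</p>

```shell
# Evaluate the example rule list: '- secret*.jpg' then '+ *.jpg'.
# A path that matches no rule at all is included by default.
decide() {
  case "$1" in
    secret*.jpg) echo "excluded" ;;  # - secret*.jpg (first match wins)
    *.jpg)       echo "included" ;;  # + *.jpg
    *)           echo "included" ;;  # fell off the bottom: included
  esac
}
decide 'secret17.jpg'   # excluded
decide 'holiday.jpg'    # included
decide 'report.txt'     # included
```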
<h2 id="adding-filtering-rules">Adding filtering rules</h2>
<p>Filtering rules are added with the following command line flags.</p>
<h3 id="repeating-options">Repeating options</h3>
<p>You can repeat the following options to add more than one rule of that type.</p>
<ul>
<li><code>--include</code></li>
<li><code>--include-from</code></li>
<li><code>--exclude</code></li>
<li><code>--exclude-from</code></li>
<li><code>--filter</code></li>
<li><code>--filter-from</code></li>
</ul>
<p>Note that all the options of the same type are processed together in the order above, regardless of what order they were placed on the command line.</p>
<p>So all <code>--include</code> options are processed first in the order they appeared on the command line, then all <code>--include-from</code> options etc.</p>
<p>To mix up the order of includes and excludes, the <code>--filter</code> flag can be used.</p>
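<p>The grouping can be pictured with a small shell sketch (an illustration of the described behaviour under stated assumptions, not rclone's actual flag parser):</p>

```shell
# Collect flags by type: every --include rule ends up before every
# --exclude rule, regardless of their order on the command line.
includes='' excludes=''
for arg in '--exclude=*.bak' '--include=*.jpg' '--exclude=*.tmp' '--include=*.png'; do
  case "$arg" in
    --include=*) includes="$includes ${arg#--include=}" ;;
    --exclude=*) excludes="$excludes ${arg#--exclude=}" ;;
  esac
done
echo "includes:$includes excludes:$excludes"   # includes: *.jpg *.png excludes: *.bak *.tmp
```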
<h3 id="exclude---exclude-files-matching-pattern"><code>--exclude</code> - Exclude files matching pattern</h3>
<p>Add a single exclude rule with <code>--exclude</code>.</p>
<p>This flag can be repeated. See above for the order the flags are processed in.</p>
<p>Eg <code>--exclude *.bak</code> to exclude all bak files from the sync.</p>
<h3 id="exclude-from---read-exclude-patterns-from-file"><code>--exclude-from</code> - Read exclude patterns from file</h3>
<p>Add exclude rules from a file.</p>
<p>This flag can be repeated. See above for the order the flags are processed in.</p>
<p>Prepare a file like this <code>exclude-file.txt</code></p>
<pre><code># a sample exclude rule file
*.bak
@ -758,10 +824,12 @@ file2.jpg</code></pre>
<p>This is useful if you have a lot of rules.</p>
<h3 id="include---include-files-matching-pattern"><code>--include</code> - Include files matching pattern</h3>
<p>Add a single include rule with <code>--include</code>.</p>
<p>This flag can be repeated. See above for the order the flags are processed in.</p>
<p>Eg <code>--include *.{png,jpg}</code> to include all <code>png</code> and <code>jpg</code> files in the backup and no others.</p>
<p>This adds an implicit <code>--exclude *</code> at the very end of the filter list. This means you can mix <code>--include</code> and <code>--include-from</code> with the other filters (eg <code>--exclude</code>) but you must include all the files you want in the include statement. If this doesn't provide enough flexibility then you must use <code>--filter-from</code>.</p>
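<p>A sketch of the rule list produced by <code>--include *.{png,jpg}</code>, using shell globs as a stand-in for rclone's matcher (illustration only, not rclone's code):</p>

```shell
# The include rules run first; the implicit '- *' catches everything else.
decide() {
  case "$1" in
    *.png|*.jpg) echo "included" ;;  # + *.png, + *.jpg
    *)           echo "excluded" ;;  # implicit '- *' added by --include
  esac
}
decide 'photo.jpg'   # included
decide 'notes.txt'   # excluded
```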
<h3 id="include-from---read-include-patterns-from-file"><code>--include-from</code> - Read include patterns from file</h3>
<p>Add include rules from a file.</p>
<p>This flag can be repeated. See above for the order the flags are processed in.</p>
<p>Prepare a file like this <code>include-file.txt</code></p>
<pre><code># a sample include rule file
*.jpg
@ -772,9 +840,11 @@ file2.avi</code></pre>
<p>This adds an implicit <code>--exclude *</code> at the very end of the filter list. This means you can mix <code>--include</code> and <code>--include-from</code> with the other filters (eg <code>--exclude</code>) but you must include all the files you want in the include statement. If this doesn't provide enough flexibility then you must use <code>--filter-from</code>.</p>
<h3 id="filter---add-a-file-filtering-rule"><code>--filter</code> - Add a file-filtering rule</h3>
<p>This can be used to add a single include or exclude rule. Include rules start with <code>+</code> and exclude rules start with <code>-</code>. A special rule called <code>!</code> can be used to clear the existing rules.</p>
<p>This flag can be repeated. See above for the order the flags are processed in.</p>
<p>Eg <code>--filter &quot;- *.bak&quot;</code> to exclude all bak files from the sync.</p>
<h3 id="filter-from---read-filtering-patterns-from-a-file"><code>--filter-from</code> - Read filtering patterns from a file</h3>
<p>Add include/exclude rules from a file.</p>
<p>This flag can be repeated. See above for the order the flags are processed in.</p>
<p>Prepare a file like this <code>filter-file.txt</code></p>
<pre><code># a sample exclude rule file
- secret*.jpg
@ -787,6 +857,7 @@ file2.avi</code></pre>
<p>This example will include all <code>jpg</code> and <code>png</code> files, exclude any files matching <code>secret*.jpg</code> and include <code>file2.avi</code>. Everything else will be excluded from the sync.</p>
<h3 id="files-from---read-list-of-source-file-names"><code>--files-from</code> - Read list of source-file names</h3>
<p>This reads a list of file names from the file passed in and <strong>only</strong> these files are transferred. The filtering rules are ignored completely if you use this option.</p>
<p>This option can be repeated to read from more than one file. These are read in the order that they are placed on the command line.</p>
<p>Prepare a file like this <code>files-from.txt</code></p>
<pre><code># comment
file1.jpg
@ -1045,8 +1116,8 @@ The hashes are used when transferring data as an integrity check and can be spec
<td align="left">Amazon Drive</td>
<td align="center">Yes</td>
<td align="center">No</td>
<td align="center">Yes</td>
<td align="center">Yes</td>
<td align="center">No <a href="https://github.com/ncw/rclone/issues/575">#575</a></td>
</tr>
<tr class="odd">
@ -1204,7 +1275,7 @@ y/e/d&gt; y</code></pre>
<p>Google documents can only be exported from Google drive. When rclone downloads a Google doc it chooses a format to download depending upon this setting.</p>
<p>By default the formats are <code>docx,xlsx,pptx,svg</code> which are a sensible default for an editable document.</p>
<p>When choosing a format, rclone runs down the list provided in order and chooses the first file format the doc can be exported as from the list. If the file can't be exported to a format on the formats list, then rclone will choose a format from the default list.</p>
<p>If you prefer an archive copy then you might use <code>--drive-formats pdf</code>, or if you prefer openoffice/libreoffice formats you might use <code>--drive-formats ods,odt,odp</code>.</p>
<p>Note that rclone adds the extension to the google doc, so if it is called <code>My Spreadsheet</code> on google docs, it will be exported as <code>My Spreadsheet.xlsx</code> or <code>My Spreadsheet.pdf</code> etc.</p>
<p>Here are the possible extensions with their corresponding mime types.</p>
<table style="width:49%;">
@ -2042,7 +2113,7 @@ y/e/d&gt; y</code></pre>
<p>Amazon Drive has rate limiting so you may notice errors in the sync (429 errors). rclone will automatically retry the sync up to 3 times by default (see <code>--retries</code> flag) which should hopefully work around this problem.</p>
<p>Amazon Drive has an internal limit of file sizes that can be uploaded to the service. This limit is not officially published, but all files larger than this will fail.</p>
<p>At the time of writing (Jan 2016) this is in the area of 50GB per file. This means that larger files are likely to fail.</p>
<p>Unfortunately there is no way for rclone to see that this failure is because of file size, so it will retry the operation as it would any other failure. To avoid this problem, use the <code>--max-size 50000M</code> option to limit the maximum size of uploaded files. Note that <code>--max-size</code> does not split files into segments, it only ignores files over this size.</p>
<h2 id="microsoft-one-drive">Microsoft One Drive</h2>
<p>Paths are specified as <code>remote:path</code></p>
<p>Paths may be as deep as required, eg <code>remote:directory/subdirectory</code>.</p>
@ -2129,6 +2200,7 @@ y/e/d&gt; y</code></pre>
<p>Note that One Drive is case insensitive so you can't have a file called &quot;Hello.doc&quot; and one called &quot;hello.doc&quot;.</p>
<p>Rclone only supports your default One Drive, and doesn't work with One Drive for business. Both these issues may be fixed at some point depending on user demand!</p>
<p>There are quite a few characters that can't be in One Drive file names. These can't occur on Windows platforms, but on non-Windows platforms they are common. Rclone will map these names to and from an identical looking unicode equivalent. For example if a file has a <code>?</code> in it, it will be mapped to <code>?</code> instead.</p>
<p>The largest allowed file size is 10GiB (10,737,418,240 bytes).</p>
<h2 id="hubic">Hubic</h2>
<p>Paths are specified as <code>remote:path</code></p>
<p>Paths are specified as <code>remote:container</code> (or <code>remote:</code> for the <code>lsd</code> command.) You may put subdirectories in too, eg <code>remote:container/path/to/dir</code>.</p>
@ -2670,6 +2742,52 @@ nounc = true</code></pre>
<p><strong>NB</strong> This flag is only available on Unix based systems. On systems where it isn't supported (eg Windows) it will not appear as a valid flag.</p>
<h2 id="changelog">Changelog</h2>
<ul>
<li>v1.35 - 2017-01-02
<ul>
<li>New Features</li>
<li>moveto and copyto commands for choosing a destination name on copy/move</li>
<li>rmdirs command to recursively delete empty directories</li>
<li>Allow repeated --include/--exclude/--filter options</li>
<li>Only show transfer stats on commands which transfer stuff
<ul>
<li>show stats on any command using the <code>--stats</code> flag</li>
</ul></li>
<li>Allow overlapping directories in move when server side dir move is supported</li>
<li>Add --stats-unit option - thanks Scott McGillivray</li>
<li>Bug Fixes</li>
<li>Fix the config file being overwritten when two rclones are running</li>
<li>Make rclone lsd obey the filters properly</li>
<li>Fix compilation on mips</li>
<li>Fix not transferring files that don't differ in size</li>
<li>Fix panic on nil retry/fatal error</li>
<li>Mount</li>
<li>Retry reads on error - should help with reliability a lot</li>
<li>Report the modification times for directories from the remote</li>
<li>Add bandwidth accounting and limiting (fixes --bwlimit)</li>
<li>If --stats provided will show stats and which files are transferring</li>
<li>Support R/W files if truncate is set.</li>
<li>Implement statfs interface so df works</li>
<li>Note that write is now supported on Amazon Drive</li>
<li>Report number of blocks in a file - thanks Stefan Breunig</li>
<li>Crypt</li>
<li>Prevent the user pointing crypt at itself</li>
<li>Fix failed to authenticate decrypted block errors
<ul>
<li>these will now return the underlying unexpected EOF instead</li>
</ul></li>
<li>Amazon Drive</li>
<li>Add support for server side move and directory move - thanks Stefan Breunig</li>
<li>Fix nil pointer deref on size attribute</li>
<li>B2</li>
<li>Use new prefix and delimiter parameters in directory listings
<ul>
<li>This makes --max-depth 1 dir listings as used in mount much faster</li>
</ul></li>
<li>Reauth the account while doing uploads too - should help with token expiry</li>
<li>Drive</li>
<li>Make DirMove more efficient and complain about moving the root</li>
<li>Create destination directory on Move()</li>
</ul></li>
<li>v1.34 - 2016-11-06
<ul>
<li>New Features</li>
@ -3567,6 +3685,36 @@ h='&#104;&#x6f;&#116;&#x6d;&#x61;&#x69;&#108;&#46;&#x63;&#x6f;&#x6d;';a='&#64;';
document.write('<a h'+'ref'+'="ma'+'ilto'+':'+e+'" clas'+'s="em' + 'ail">'+e+'<\/'+'a'+'>');
// -->
</script><noscript>&#x6d;&#x61;&#120;&#100;&#x31;&#x33;&#x5f;&#108;&#x75;&#x69;&#122;&#x5f;&#x63;&#x61;&#114;&#108;&#x6f;&#x73;&#32;&#x61;&#116;&#32;&#104;&#x6f;&#116;&#x6d;&#x61;&#x69;&#108;&#32;&#100;&#x6f;&#116;&#32;&#x63;&#x6f;&#x6d;</noscript></li>
<li>Stefan Breunig <script type="text/javascript">
<!--
h='&#x79;&#114;&#100;&#x65;&#110;&#46;&#100;&#x65;';a='&#64;';n='&#x73;&#116;&#x65;&#102;&#x61;&#110;&#x2d;&#x67;&#x69;&#116;&#104;&#x75;&#98;';e=n+a+h;
document.write('<a h'+'ref'+'="ma'+'ilto'+':'+e+'" clas'+'s="em' + 'ail">'+e+'<\/'+'a'+'>');
// -->
</script><noscript>&#x73;&#116;&#x65;&#102;&#x61;&#110;&#x2d;&#x67;&#x69;&#116;&#104;&#x75;&#98;&#32;&#x61;&#116;&#32;&#x79;&#114;&#100;&#x65;&#110;&#32;&#100;&#x6f;&#116;&#32;&#100;&#x65;</noscript></li>
<li>Alishan Ladhani <script type="text/javascript">
<!--
h='&#x75;&#x73;&#x65;&#114;&#x73;&#46;&#110;&#x6f;&#114;&#x65;&#112;&#108;&#x79;&#46;&#x67;&#x69;&#116;&#104;&#x75;&#98;&#46;&#x63;&#x6f;&#x6d;';a='&#64;';n='&#x61;&#108;&#x69;&#x2d;&#108;';e=n+a+h;
document.write('<a h'+'ref'+'="ma'+'ilto'+':'+e+'" clas'+'s="em' + 'ail">'+e+'<\/'+'a'+'>');
// -->
</script><noscript>&#x61;&#108;&#x69;&#x2d;&#108;&#32;&#x61;&#116;&#32;&#x75;&#x73;&#x65;&#114;&#x73;&#32;&#100;&#x6f;&#116;&#32;&#110;&#x6f;&#114;&#x65;&#112;&#108;&#x79;&#32;&#100;&#x6f;&#116;&#32;&#x67;&#x69;&#116;&#104;&#x75;&#98;&#32;&#100;&#x6f;&#116;&#32;&#x63;&#x6f;&#x6d;</noscript></li>
<li>0xJAKE <script type="text/javascript">
<!--
h='&#x75;&#x73;&#x65;&#114;&#x73;&#46;&#110;&#x6f;&#114;&#x65;&#112;&#108;&#x79;&#46;&#x67;&#x69;&#116;&#104;&#x75;&#98;&#46;&#x63;&#x6f;&#x6d;';a='&#64;';n='&#48;&#120;&#74;&#x41;&#x4b;&#x45;';e=n+a+h;
document.write('<a h'+'ref'+'="ma'+'ilto'+':'+e+'" clas'+'s="em' + 'ail">'+e+'<\/'+'a'+'>');
// -->
</script><noscript>&#48;&#120;&#74;&#x41;&#x4b;&#x45;&#32;&#x61;&#116;&#32;&#x75;&#x73;&#x65;&#114;&#x73;&#32;&#100;&#x6f;&#116;&#32;&#110;&#x6f;&#114;&#x65;&#112;&#108;&#x79;&#32;&#100;&#x6f;&#116;&#32;&#x67;&#x69;&#116;&#104;&#x75;&#98;&#32;&#100;&#x6f;&#116;&#32;&#x63;&#x6f;&#x6d;</noscript></li>
<li>Thibault Molleman <script type="text/javascript">
<!--
h='&#x75;&#x73;&#x65;&#114;&#x73;&#46;&#110;&#x6f;&#114;&#x65;&#112;&#108;&#x79;&#46;&#x67;&#x69;&#116;&#104;&#x75;&#98;&#46;&#x63;&#x6f;&#x6d;';a='&#64;';n='&#116;&#104;&#x69;&#98;&#x61;&#x75;&#108;&#116;&#x6d;&#x6f;&#108;';e=n+a+h;
document.write('<a h'+'ref'+'="ma'+'ilto'+':'+e+'" clas'+'s="em' + 'ail">'+e+'<\/'+'a'+'>');
// -->
</script><noscript>&#116;&#104;&#x69;&#98;&#x61;&#x75;&#108;&#116;&#x6d;&#x6f;&#108;&#32;&#x61;&#116;&#32;&#x75;&#x73;&#x65;&#114;&#x73;&#32;&#100;&#x6f;&#116;&#32;&#110;&#x6f;&#114;&#x65;&#112;&#108;&#x79;&#32;&#100;&#x6f;&#116;&#32;&#x67;&#x69;&#116;&#104;&#x75;&#98;&#32;&#100;&#x6f;&#116;&#32;&#x63;&#x6f;&#x6d;</noscript></li>
<li>Scott McGillivray <script type="text/javascript">
<!--
h='&#x67;&#x6d;&#x61;&#x69;&#108;&#46;&#x63;&#x6f;&#x6d;';a='&#64;';n='&#x73;&#x63;&#x6f;&#116;&#116;&#46;&#x6d;&#x63;&#x67;&#x69;&#108;&#108;&#x69;&#118;&#114;&#x61;&#x79;';e=n+a+h;
document.write('<a h'+'ref'+'="ma'+'ilto'+':'+e+'" clas'+'s="em' + 'ail">'+e+'<\/'+'a'+'>');
// -->
</script><noscript>&#x73;&#x63;&#x6f;&#116;&#116;&#46;&#x6d;&#x63;&#x67;&#x69;&#108;&#108;&#x69;&#118;&#114;&#x61;&#x79;&#32;&#x61;&#116;&#32;&#x67;&#x6d;&#x61;&#x69;&#108;&#32;&#100;&#x6f;&#116;&#32;&#x63;&#x6f;&#x6d;</noscript></li>
</ul>
<h1 id="contact-the-rclone-project">Contact the rclone project</h1>
<h2 id="forum">Forum</h2>
MANUAL.md
@ -1,6 +1,6 @@
% rclone(1) User Manual
% Nick Craig-Wood
% Jan 02, 2017
Rclone
======
@ -37,7 +37,8 @@ Links
* [Home page](http://rclone.org/)
* [Github project page for source and bug tracker](http://github.com/ncw/rclone)
* [Rclone Forum](https://forum.rclone.org)
* <a href="https://google.com/+RcloneOrg" rel="publisher">Google+ page</a>
* [Downloads](http://rclone.org/downloads/)
# Install #
@ -288,7 +289,8 @@ Move files from source to dest.
Moves the contents of the source directory to the destination
directory. Rclone will error if the source and destination overlap and
the remote does not support a server side directory move operation.
If no filters are in use and if possible this will server side move
`source:path` into `dest:path`. After this `source:path` will no
@ -652,6 +654,45 @@ Or like this to output any .txt files in dir or subdirectories.
rclone cat remote:path
```
## rclone copyto
Copy files from source to dest, skipping already copied
### Synopsis
If source:path is a file or directory then it copies it to a file or
directory named dest:path.
This can be used to upload single files to other than their current
name. If the source is a directory then it acts exactly like the copy
command.
So
rclone copyto src dst
where src and dst are rclone paths, either remote:path or
/path/to/local or C:\windows\path\if\on\windows.
This will:
if src is file
copy it to dst, overwriting an existing file if it exists
if src is directory
copy it to dst, overwriting existing files if they exist
see copy command for full details
This doesn't transfer unchanged files, testing by size and
modification time or MD5SUM. It doesn't delete files from the
destination.
```
rclone copyto source:path dest:path
```
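As a sketch of how this might look in practice (the `copyto-demo` names below are invented for illustration, and local paths are valid rclone paths, so the demo is self-contained):

```shell
# Create a local file to copy.
mkdir -p copyto-demo
echo "quarterly numbers" > copyto-demo/report-final.txt

if command -v rclone >/dev/null 2>&1; then
  # Plain "copy" would keep the name report-final.txt; copyto lets us
  # choose the destination name in one step:
  rclone copyto copyto-demo/report-final.txt copyto-demo/uploads/report-2017-Q1.txt
fi
```

The same invocation works with any configured remote in place of the local destination path.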
## rclone genautocomplete

Output bash completion script for rclone.
@ -774,7 +815,7 @@ mount won't do that, so will be less reliable than the rclone command.
### Bugs ###

* All the remotes should work for read, but some may not for write
  * those which need to know the size in advance won't - eg B2
  * maybe should pass in size as -1 to mean work it out
  * Or put in an upload cache to cache the files on disk first
@ -808,6 +849,68 @@ rclone mount remote:path /path/to/mountpoint
--write-back-cache   Makes kernel buffer writes before sending them to rclone. Without this, writethrough caching is used.
```
## rclone moveto
Move file or directory from source to dest.
### Synopsis
If source:path is a file or directory then it moves it to a file or
directory named dest:path.
This can be used to rename files or upload single files to other than
their existing name. If the source is a directory then it acts exactly
like the move command.
So
rclone moveto src dst
where src and dst are rclone paths, either remote:path or
/path/to/local or C:\windows\path\if\on\windows.
This will:
if src is file
move it to dst, overwriting an existing file if it exists
if src is directory
move it to dst, overwriting existing files if they exist
see move command for full details
This doesn't transfer unchanged files, testing by size and
modification time or MD5SUM. src will be deleted on successful
transfer.
**Important**: Since this can cause data loss, test first with the
--dry-run flag.
```
rclone moveto source:path dest:path
```
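A local sketch of `moveto` renaming a file (the `moveto-demo` names are invented for the demo; local paths work as rclone paths):

```shell
mkdir -p moveto-demo
echo "hello" > moveto-demo/old-name.txt

if command -v rclone >/dev/null 2>&1; then
  # moveto deletes the source on success, so try --dry-run first:
  rclone moveto --dry-run moveto-demo/old-name.txt moveto-demo/new-name.txt
  rclone moveto moveto-demo/old-name.txt moveto-demo/new-name.txt
fi
```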
## rclone rmdirs
Remove any empty directories under the path.
### Synopsis
This removes any empty directories (or directories that only contain
empty directories) under the path that it finds, including the path if
it has nothing in it.
This is useful for tidying up remotes that rclone has left a lot of
empty directories in.
```
rclone rmdirs remote:path
```
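For example, on a local path (the `rmdirs-demo` names are invented for the demo), empty directories get pruned while non-empty ones survive:

```shell
mkdir -p rmdirs-demo/empty/a rmdirs-demo/empty/b rmdirs-demo/keep
touch rmdirs-demo/keep/data.txt

if command -v rclone >/dev/null 2>&1; then
  # Removes rmdirs-demo/empty/a, /b and then rmdirs-demo/empty itself;
  # rmdirs-demo and rmdirs-demo/keep stay because data.txt is in the tree:
  rclone rmdirs rmdirs-demo
fi
```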
Copying single files
--------------------
@ -1120,12 +1223,31 @@ modification times in the same way as rclone.
### --stats=TIME ###

Commands which transfer data (`sync`, `copy`, `copyto`, `move`,
`moveto`) will print data transfer stats at regular intervals to show
their progress.

This sets the interval.

The default is `1m`. Use 0 to disable.
If you set the stats interval then all commands can show stats. This
can be useful when running other commands, `check` or `mount` for
example.
### --stats-unit=bits|bytes ###
By default data transfer rates will be printed in bytes/second.
This option allows the data rate to be printed in bits/second.
Data transfer volume will still be reported in bytes.
The rate is reported as a binary unit, not SI unit. So 1 Mbit/s
equals 1,048,576 bits/s and not 1,000,000 bits/s.
The default is `bytes`.
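A quick sketch of the unit arithmetic, plus a hypothetical invocation (`mybackup:` stands in for whatever remote you have configured):

```shell
# The rate is a binary unit, so 1 Mbit/s corresponds to:
echo $((1024 * 1024))   # 1048576 bits/s, not 1000000

# Print stats every 30 seconds with rates shown in bits/s:
# rclone sync /home/user/photos mybackup:photos --stats 30s --stats-unit bits
```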
### --delete-(before,during,after) ###

This option allows you to specify when files on your destination are
@ -1254,6 +1376,20 @@ If it is safe in your environment, you can set the `RCLONE_CONFIG_PASS`
environment variable to contain your password, in which case it will be
used for decrypting the configuration.
You can set this for a session from a script. For unix like systems
save this to a file called `set-rclone-password`:
```
#!/bin/echo Source this file don't run it
read -s RCLONE_CONFIG_PASS
export RCLONE_CONFIG_PASS
```
Then source the file when you want to use it. From the shell you
would do `source set-rclone-password`. It will then ask you for the
password and set it in the environment variable.
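A sketch of the whole round trip, feeding the password from a pipe purely to show the effect (interactively you would type it at the hidden prompt; the password here is made up):

```shell
# Write the helper exactly as shown above:
cat > set-rclone-password <<'EOF'
#!/bin/echo Source this file don't run it
read -s RCLONE_CONFIG_PASS
export RCLONE_CONFIG_PASS
EOF

# Sourcing it reads the password and exports it for that shell:
printf 'swordfish\n' | bash -c '
  . ./set-rclone-password
  test "$RCLONE_CONFIG_PASS" = "swordfish" && echo "password exported"
'
```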
If you are running rclone inside a script, you might want to disable
password prompts. To do that, pass the parameter
`--ask-password=false` to rclone. This will make rclone fail instead
@ -1489,19 +1625,16 @@ and exclude rules like `--include`, `--exclude`, `--include-from`,
try them out is using the `ls` command, or `--dry-run` together with
`-v`.
**Important** Due to limitations of the command line parser you can
only use any of these options once - if you duplicate them then rclone
will use the last one only.
## Patterns ##

The patterns used to match files for inclusion or exclusion are based
on "file globs" as used by the unix shell.
If the pattern starts with a `/` then it only matches at the top level
of the directory tree, **relative to the root of the remote** (not
necessarily the root of the local drive). If it doesn't start with `/`
then it is matched starting at the **end of the path**, but it will
only match a complete path element:
    file.jpg  - matches "file.jpg"
              - matches "directory/file.jpg"
@ -1590,13 +1723,14 @@ Rclone always does a wildcard match so `\` must always escape a `\`.
## How the rules are used ##

Rclone maintains a combined list of include rules and exclude rules.

Each file is matched in order, starting from the top, against the rule
in the list until it finds a match. The file is then included or
excluded according to the rule type.

If the matcher fails to find a match after testing against all the
entries in the list then the path is included.

For example given the following rules, `+` being include, `-` being
exclude,
@ -1627,16 +1761,44 @@ based remotes (eg s3, swift, google compute storage, b2).
Filtering rules are added with the following command line flags.
### Repeating options ###
You can repeat the following options to add more than one rule of that
type.
* `--include`
* `--include-from`
* `--exclude`
* `--exclude-from`
* `--filter`
* `--filter-from`
Note that all the options of the same type are processed together in
the order above, regardless of what order they were placed on the
command line.
So all `--include` options are processed first in the order they
appeared on the command line, then all `--include-from` options etc.
To mix up the order includes and excludes, the `--filter` flag can be
used.
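A sketch of the difference (the `filter-demo` file names are invented; local paths work as rclone paths):

```shell
# A small tree to filter over:
mkdir -p filter-demo
touch filter-demo/a.jpg filter-demo/b.png filter-demo/notes.txt filter-demo/secret1.jpg

if command -v rclone >/dev/null 2>&1; then
  # Repeated --filter flags keep their relative order, so the secret*
  # exclude is tested before the *.jpg include -- something a mix of
  # --include and --exclude flags cannot express:
  rclone ls filter-demo --filter '- secret*' --filter '+ *.jpg' --filter '+ *.png' --filter '- *'
fi
```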
### `--exclude` - Exclude files matching pattern ###

Add a single exclude rule with `--exclude`.
This flag can be repeated. See above for the order the flags are
processed in.
Eg `--exclude *.bak` to exclude all bak files from the sync.

### `--exclude-from` - Read exclude patterns from file ###

Add exclude rules from a file.
This flag can be repeated. See above for the order the flags are
processed in.
Prepare a file like this `exclude-file.txt`

    # a sample exclude rule file
@ -1652,6 +1814,9 @@ This is useful if you have a lot of rules.
Add a single include rule with `--include`.
This flag can be repeated. See above for the order the flags are
processed in.
Eg `--include *.{png,jpg}` to include all `png` and `jpg` files in the
backup and no others.
@ -1665,6 +1830,9 @@ flexibility then you must use `--filter-from`.
Add include rules from a file.
This flag can be repeated. See above for the order the flags are
processed in.
Prepare a file like this `include-file.txt`

    # a sample include rule file
@ -1689,12 +1857,18 @@ This can be used to add a single include or exclude rule. Include
rules start with `+ ` and exclude rules start with `- `. A special
rule called `!` can be used to clear the existing rules.
This flag can be repeated. See above for the order the flags are
processed in.
Eg `--filter "- *.bak"` to exclude all bak files from the sync.

### `--filter-from` - Read filtering patterns from a file ###

Add include/exclude rules from a file.
This flag can be repeated. See above for the order the flags are
processed in.
Prepare a file like this `filter-file.txt`

    # a sample exclude rule file
@ -1718,6 +1892,9 @@ This reads a list of file names from the file passed in and **only**
these files are transferred. The filtering rules are ignored
completely if you use this option.
This option can be repeated to read from more than one file. These
are read in the order that they are placed on the command line.
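For example, with two hypothetical list files (`mybackup:` stands in for a configured remote):

```shell
# Two lists of files to transfer:
printf 'file1.jpg\nfile2.jpg\n' > from-photos.txt
printf 'notes.txt\n' > from-docs.txt

# Both lists are read, in command-line order, and only the named files
# are transferred:
# rclone copy /home/user mybackup:home --files-from from-photos.txt --files-from from-docs.txt
```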
Prepare a file like this `files-from.txt`

    # comment
@ -1950,7 +2127,7 @@ operations more efficient.
| Openstack Swift | Yes † | Yes | No | No | No |
| Dropbox | Yes | Yes | Yes | Yes | No [#575](https://github.com/ncw/rclone/issues/575) |
| Google Cloud Storage | Yes | Yes | No | No | No |
| Amazon Drive | Yes | No | Yes | Yes | No [#575](https://github.com/ncw/rclone/issues/575) |
| Microsoft One Drive | Yes | Yes | No [#197](https://github.com/ncw/rclone/issues/197) | No [#197](https://github.com/ncw/rclone/issues/197) | No [#575](https://github.com/ncw/rclone/issues/575) |
| Hubic | Yes † | Yes | No | No | No |
| Backblaze B2 | No | No | No | No | Yes |
@ -2166,7 +2343,7 @@ then rclone will choose a format from the default list.
If you prefer an archive copy then you might use `--drive-formats
pdf`, or if you prefer openoffice/libreoffice formats you might use
`--drive-formats ods,odt,odp`.
Note that rclone adds the extension to the google doc, so if it is
called `My Spreadsheet` on google docs, it will be exported as `My
@ -3280,8 +3457,9 @@ This means that larger files are likely to fail.
Unfortunately there is no way for rclone to see that this failure is
because of file size, so it will retry the operation as it would any
other failure. To avoid this problem, use the `--max-size 50000M`
option to limit the maximum size of uploaded files. Note that
`--max-size` does not split files into segments, it only ignores files
over this size.
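For reference, rclone size suffixes are binary, so `50000M` means 50000 MiB (`acd:` below stands in for a configured Amazon Drive remote):

```shell
# 50000M expressed in bytes:
echo $((50000 * 1024 * 1024))   # 52428800000

# Hypothetical upload that silently skips anything at or over that size:
# rclone copy /data acd:backup --max-size 50000M
```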
Microsoft One Drive
-----------------------------------------
@ -3427,6 +3605,8 @@ platforms they are common. Rclone will map these names to and from an
identical looking unicode equivalent. For example if a file has a `?`
in it, it will be mapped to `?` instead.
The largest allowed file size is 10GiB (10,737,418,240 bytes).
Hubic
-----------------------------------------
@ -4400,6 +4580,44 @@ flag.
Changelog
---------
* v1.35 - 2017-01-02
* New Features
* moveto and copyto commands for choosing a destination name on copy/move
* rmdirs command to recursively delete empty directories
* Allow repeated --include/--exclude/--filter options
* Only show transfer stats on commands which transfer stuff
* show stats on any command using the `--stats` flag
* Allow overlapping directories in move when server side dir move is supported
* Add --stats-unit option - thanks Scott McGillivray
* Bug Fixes
* Fix the config file being overwritten when two rclones are running
* Make rclone lsd obey the filters properly
* Fix compilation on mips
* Fix not transferring files that don't differ in size
* Fix panic on nil retry/fatal error
* Mount
* Retry reads on error - should help with reliability a lot
* Report the modification times for directories from the remote
* Add bandwidth accounting and limiting (fixes --bwlimit)
* If --stats provided will show stats and which files are transferring
* Support R/W files if truncate is set.
* Implement statfs interface so df works
* Note that write is now supported on Amazon Drive
* Report number of blocks in a file - thanks Stefan Breunig
* Crypt
* Prevent the user pointing crypt at itself
* Fix failed to authenticate decrypted block errors
* these will now return the underlying unexpected EOF instead
* Amazon Drive
* Add support for server side move and directory move - thanks Stefan Breunig
* Fix nil pointer deref on size attribute
* B2
* Use new prefix and delimiter parameters in directory listings
* This makes --max-depth 1 dir listings as used in mount much faster
* Reauth the account while doing uploads too - should help with token expiry
* Drive
* Make DirMove more efficient and complain about moving the root
* Create destination directory on Move()
* v1.34 - 2016-11-06
* New Features
* Stop single file and `--files-from` operations iterating through the source bucket.
@ -5118,6 +5336,11 @@ Contributors
* Felix Bünemann <buenemann@louis.info>
* Durval Menezes <jmrclone@durval.com>
* Luiz Carlos Rumbelsperger Viana <maxd13_luiz_carlos@hotmail.com>
* Stefan Breunig <stefan-github@yrden.de>
* Alishan Ladhani <ali-l@users.noreply.github.com>
* 0xJAKE <0xJAKE@users.noreply.github.com>
* Thibault Molleman <thibaultmol@users.noreply.github.com>
* Scott McGillivray <scott.mcgillivray@gmail.com>
# Contact the rclone project # # Contact the rclone project #

@ -1,6 +1,6 @@
rclone(1) User Manual rclone(1) User Manual
Nick Craig-Wood Nick Craig-Wood
Nov 06, 2016 Jan 02, 2017
@ -40,6 +40,7 @@ Links
- Home page - Home page
- Github project page for source and bug tracker - Github project page for source and bug tracker
- Rclone Forum
- Google+ page - Google+ page
- Downloads - Downloads
@ -284,7 +285,8 @@ Move files from source to dest.
Synopsis Synopsis
Moves the contents of the source directory to the destination directory. Moves the contents of the source directory to the destination directory.
Rclone will error if the source and destination overlap. Rclone will error if the source and destination overlap and the remote
does not support a server side directory move operation.
If no filters are in use and if possible this will server side move If no filters are in use and if possible this will server side move
source:path into dest:path. After this source:path will no longer source:path into dest:path. After this source:path will no longer
@ -599,6 +601,40 @@ Or like this to output any .txt files in dir or subdirectories.
rclone cat remote:path rclone cat remote:path
rclone copyto
Copy files from source to dest, skipping already copied
Synopsis
If source:path is a file or directory then it copies it to a file or
directory named dest:path.
This can be used to upload single files under a name other than their
current one. If the source is a directory then it acts exactly like the copy
command.
So
rclone copyto src dst
where src and dst are rclone paths, either remote:path or /path/to/local
or C:.
This will:
if src is file
copy it to dst, overwriting an existing file if it exists
if src is directory
copy it to dst, overwriting existing files if they exist
see copy command for full details
This doesn't transfer unchanged files, testing by size and modification
time or MD5SUM. It doesn't delete files from the destination.
rclone copyto source:path dest:path
rclone genautocomplete rclone genautocomplete
Output bash completion script for rclone. Output bash completion script for rclone.
@ -705,8 +741,7 @@ that, so will be less reliable than the rclone command.
Bugs Bugs
- All the remotes should work for read, but some may not for write - All the remotes should work for read, but some may not for write
- those which need to know the size in advance won't - eg B2, - those which need to know the size in advance won't - eg B2
Amazon Drive
- maybe should pass in size as -1 to mean work it out - maybe should pass in size as -1 to mean work it out
- Or put in an upload cache to cache the files on disk first - Or put in an upload cache to cache the files on disk first
@ -736,6 +771,59 @@ Options
--write-back-cache Makes kernel buffer writes before sending them to rclone. Without this, writethrough caching is used. --write-back-cache Makes kernel buffer writes before sending them to rclone. Without this, writethrough caching is used.
rclone moveto
Move file or directory from source to dest.
Synopsis
If source:path is a file or directory then it moves it to a file or
directory named dest:path.
This can be used to rename files or upload single files under a name
other than their existing one. If the source is a directory then it acts
exactly like the move command.
So
rclone moveto src dst
where src and dst are rclone paths, either remote:path or /path/to/local
or C:.
This will:
if src is file
move it to dst, overwriting an existing file if it exists
if src is directory
move it to dst, overwriting existing files if they exist
see move command for full details
This doesn't transfer unchanged files, testing by size and modification
time or MD5SUM. src will be deleted on successful transfer.
IMPORTANT: Since this can cause data loss, test first with the --dry-run
flag.
rclone moveto source:path dest:path
rclone rmdirs
Remove any empty directories under the path.
Synopsis
This removes any empty directories (or directories that only contain
empty directories) under the path that it finds, including the path if
it has nothing in it.
This is useful for tidying up remotes that rclone has left a lot of
empty directories in.
rclone rmdirs remote:path
Copying single files Copying single files
rclone normally syncs or copies directories. However if the source rclone normally syncs or copies directories. However if the source
@ -1043,12 +1131,29 @@ modification times in the same way as rclone.
--stats=TIME --stats=TIME
Rclone will print stats at regular intervals to show its progress. Commands which transfer data (sync, copy, copyto, move, moveto) will
print data transfer stats at regular intervals to show their progress.
This sets the interval. This sets the interval.
The default is 1m. Use 0 to disable. The default is 1m. Use 0 to disable.
If you set the stats interval then all command can show stats. This can
be useful when running other commands, check or mount for example.
--stats-unit=bits|bytes
By default data transfer rates will be printed in bytes/second.
This option allows the data rate to be printed in bits/second.
Data transfer volume will still be reported in bytes.
The rate is reported as a binary unit, not SI unit. So 1 Mbit/s equals
1,048,576 bits/s and not 1,000,000 bits/s.
The default is bytes.
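The binary convention above can be spelled out with a little arithmetic (illustrative only):

```python
# --stats-unit bits reports rates with binary prefixes:
# 1 Mbit/s means 2**20 bits/s, not the SI 10**6 bits/s.
binary_mbit = 1024 * 1024   # 1,048,576 bits
si_mbit = 1000 * 1000       # 1,000,000 bits
difference = binary_mbit - si_mbit
```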
--delete-(before,during,after) --delete-(before,during,after)
This option allows you to specify when files on your destination are This option allows you to specify when files on your destination are
@ -1173,6 +1278,18 @@ If it is safe in your environment, you can set the RCLONE_CONFIG_PASS
environment variable to contain your password, in which case it will be environment variable to contain your password, in which case it will be
used for decrypting the configuration. used for decrypting the configuration.
You can set this for a session from a script. For unix like systems save
this to a file called set-rclone-password:
#!/bin/echo Source this file don't run it
read -s RCLONE_CONFIG_PASS
export RCLONE_CONFIG_PASS
Then source the file when you want to use it. From the shell you would
do source set-rclone-password. It will then ask you for the password and
set it in the environment variable.
If you are running rclone inside a script, you might want to disable If you are running rclone inside a script, you might want to disable
password prompts. To do that, pass the parameter --ask-password=false to password prompts. To do that, pass the parameter --ask-password=false to
rclone. This will make rclone fail instead of asking for a password if rclone. This will make rclone fail instead of asking for a password if
@ -1404,10 +1521,6 @@ exclude rules like --include, --exclude, --include-from, --exclude-from,
--filter, or --filter-from. The simplest way to try them out is using --filter, or --filter-from. The simplest way to try them out is using
the ls command, or --dry-run together with -v. the ls command, or --dry-run together with -v.
IMPORTANT Due to limitations of the command line parser you can only use
any of these options once - if you duplicate them then rclone will use
the last one only.
Patterns Patterns
@ -1415,9 +1528,10 @@ The patterns used to match files for inclusion or exclusion are based on
"file globs" as used by the unix shell. "file globs" as used by the unix shell.
If the pattern starts with a / then it only matches at the top level of If the pattern starts with a / then it only matches at the top level of
the directory tree, relative to the root of the remote. If it doesn't the directory tree, RELATIVE TO THE ROOT OF THE REMOTE (not necessarily
start with / then it is matched starting at the END OF THE PATH, but it the root of the local drive). If it doesn't start with / then it is
will only match a complete path element: matched starting at the END OF THE PATH, but it will only match a
complete path element:
file.jpg - matches "file.jpg" file.jpg - matches "file.jpg"
- matches "directory/file.jpg" - matches "directory/file.jpg"
@ -1504,13 +1618,14 @@ Rclone always does a wildcard match so \ must always escape a \.
How the rules are used How the rules are used
Rclone maintains a list of include rules and exclude rules. Rclone maintains a combined list of include rules and exclude rules.
Each file is matched in order against the list until it finds a match. Each file is matched in order, starting from the top, against the rule
The file is then included or excluded according to the rule type. in the list until it finds a match. The file is then included or
excluded according to the rule type.
If the matcher falls off the bottom of the list then the path is If the matcher fails to find a match after testing against all the
included. entries in the list then the path is included.
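This first-match-wins behaviour can be modelled in a few lines (again with fnmatch standing in for rclone's glob syntax):

```python
import fnmatch

def included(path, rules):
    """rules is an ordered list of ('+' or '-', pattern) pairs.
    The first rule that matches decides; no match means included."""
    for kind, pattern in rules:
        if fnmatch.fnmatch(path, pattern):
            return kind == "+"
    return True  # fell off the bottom of the list
```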
For example given the following rules, + being include, - being exclude, For example given the following rules, + being include, - being exclude,
@ -1541,16 +1656,44 @@ Adding filtering rules
Filtering rules are added with the following command line flags. Filtering rules are added with the following command line flags.
Repeating options
You can repeat the following options to add more than one rule of that
type.
- --include
- --include-from
- --exclude
- --exclude-from
- --filter
- --filter-from
Note that all the options of the same type are processed together in the
order above, regardless of what order they were placed on the command
line.
So all --include options are processed first in the order they appeared
on the command line, then all --include-from options etc.
To mix up the order includes and excludes, the --filter flag can be
used.
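The grouping rule above could be modelled like this. The flag names are the real ones; the function itself is an illustrative sketch, not rclone's parser:

```python
# Filter flags are processed grouped by type, in this fixed order,
# each group keeping its own command-line order.
FLAG_ORDER = ["--include", "--include-from", "--exclude",
              "--exclude-from", "--filter", "--filter-from"]

def effective_rule_order(args):
    """args: list of (flag, value) pairs in command-line order."""
    return [value
            for flag in FLAG_ORDER
            for f, value in args
            if f == flag]
```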
--exclude - Exclude files matching pattern --exclude - Exclude files matching pattern
Add a single exclude rule with --exclude. Add a single exclude rule with --exclude.
This flag can be repeated. See above for the order the flags are
processed in.
Eg --exclude *.bak to exclude all bak files from the sync. Eg --exclude *.bak to exclude all bak files from the sync.
--exclude-from - Read exclude patterns from file --exclude-from - Read exclude patterns from file
Add exclude rules from a file. Add exclude rules from a file.
This flag can be repeated. See above for the order the flags are
processed in.
Prepare a file like this exclude-file.txt Prepare a file like this exclude-file.txt
# a sample exclude rule file # a sample exclude rule file
@ -1566,6 +1709,9 @@ This is useful if you have a lot of rules.
Add a single include rule with --include. Add a single include rule with --include.
This flag can be repeated. See above for the order the flags are
processed in.
Eg --include *.{png,jpg} to include all png and jpg files in the backup Eg --include *.{png,jpg} to include all png and jpg files in the backup
and no others. and no others.
@ -1579,6 +1725,9 @@ you must use --filter-from.
Add include rules from a file. Add include rules from a file.
This flag can be repeated. See above for the order the flags are
processed in.
Prepare a file like this include-file.txt Prepare a file like this include-file.txt
# a sample include rule file # a sample include rule file
@ -1603,12 +1752,18 @@ This can be used to add a single include or exclude rule. Include rules
start with + and exclude rules start with -. A special rule called ! can start with + and exclude rules start with -. A special rule called ! can
be used to clear the existing rules. be used to clear the existing rules.
This flag can be repeated. See above for the order the flags are
processed in.
Eg --filter "- *.bak" to exclude all bak files from the sync. Eg --filter "- *.bak" to exclude all bak files from the sync.
--filter-from - Read filtering patterns from a file --filter-from - Read filtering patterns from a file
Add include/exclude rules from a file. Add include/exclude rules from a file.
This flag can be repeated. See above for the order the flags are
processed in.
Prepare a file like this filter-file.txt Prepare a file like this filter-file.txt
# a sample exclude rule file # a sample exclude rule file
@ -1632,6 +1787,9 @@ This reads a list of file names from the file passed in and ONLY these
files are transferred. The filtering rules are ignored completely if you files are transferred. The filtering rules are ignored completely if you
use this option. use this option.
This option can be repeated to read from more than one file. These are
read in the order that they are placed on the command line.
Prepare a file like this files-from.txt Prepare a file like this files-from.txt
# comment # comment
@ -1867,7 +2025,7 @@ more efficient.
Openstack Swift Yes † Yes No No No Openstack Swift Yes † Yes No No No
Dropbox Yes Yes Yes Yes No #575 Dropbox Yes Yes Yes Yes No #575
Google Cloud Storage Yes Yes No No No Google Cloud Storage Yes Yes No No No
Amazon Drive Yes No No #721 No #721 No #575 Amazon Drive Yes No Yes Yes No #575
Microsoft One Drive Yes Yes No #197 No #197 No #575 Microsoft One Drive Yes Yes No #197 No #197 No #575
Hubic Yes † Yes No No No Hubic Yes † Yes No No No
Backblaze B2 No No No No Yes Backblaze B2 No No No No Yes
@ -2078,7 +2236,7 @@ rclone will choose a format from the default list.
If you prefer an archive copy then you might use --drive-formats pdf, or If you prefer an archive copy then you might use --drive-formats pdf, or
if you prefer openoffice/libreoffice formats you might use if you prefer openoffice/libreoffice formats you might use
--drive-formats ods,odt. --drive-formats ods,odt,odp.
Note that rclone adds the extension to the google doc, so if it is Note that rclone adds the extension to the google doc, so if it is
called My Spreadsheet on google docs, it will be exported as called My Spreadsheet on google docs, it will be exported as
@ -3229,8 +3387,9 @@ means that larger files are likely to fail.
Unfortunately there is no way for rclone to see that this failure is Unfortunately there is no way for rclone to see that this failure is
because of file size, so it will retry the operation, as any other because of file size, so it will retry the operation, as any other
failure. To avoid this problem, use --max-size 50G option to limit the failure. To avoid this problem, use --max-size 50000M option to limit
maximum size of uploaded files. the maximum size of uploaded files. Note that --max-size does not split
files into segments, it only ignores files over this size.
Microsoft One Drive Microsoft One Drive
@ -3372,6 +3531,8 @@ they are common. Rclone will map these names to and from an identical
looking unicode equivalent. For example if a file has a ? in it will be looking unicode equivalent. For example if a file has a ? in it will be
mapped to instead. mapped to instead.
The largest allowed file size is 10GiB (10,737,418,240 bytes).
Hubic Hubic
@ -4292,6 +4453,51 @@ it isn't supported (eg Windows) it will not appear as an valid flag.
Changelog Changelog
- v1.35 - 2017-01-02
- New Features
- moveto and copyto commands for choosing a destination name on
copy/move
- rmdirs command to recursively delete empty directories
- Allow repeated --include/--exclude/--filter options
- Only show transfer stats on commands which transfer stuff
- show stats on any command using the --stats flag
- Allow overlapping directories in move when server side dir move
is supported
- Add --stats-unit option - thanks Scott McGillivray
- Bug Fixes
- Fix the config file being overwritten when two rclones are
running
- Make rclone lsd obey the filters properly
- Fix compilation on mips
- Fix not transferring files that don't differ in size
- Fix panic on nil retry/fatal error
- Mount
- Retry reads on error - should help with reliability a lot
- Report the modification times for directories from the remote
- Add bandwidth accounting and limiting (fixes --bwlimit)
- If --stats provided will show stats and which files are
transferring
- Support R/W files if truncate is set.
- Implement statfs interface so df works
- Note that write is now supported on Amazon Drive
- Report number of blocks in a file - thanks Stefan Breunig
- Crypt
- Prevent the user pointing crypt at itself
- Fix failed to authenticate decrypted block errors
- these will now return the underlying unexpected EOF instead
- Amazon Drive
- Add support for server side move and directory move - thanks
Stefan Breunig
- Fix nil pointer deref on size attribute
- B2
- Use new prefix and delimiter parameters in directory listings
- This makes --max-depth 1 dir listings as used in mount much
faster
- Reauth the account while doing uploads too - should help with
token expiry
- Drive
- Make DirMove more efficient and complain about moving the root
- Create destination directory on Move()
- v1.34 - 2016-11-06 - v1.34 - 2016-11-06
- New Features - New Features
- Stop single file and --files-from operations iterating through - Stop single file and --files-from operations iterating through
@ -5070,6 +5276,11 @@ Contributors
- Felix Bünemann buenemann@louis.info - Felix Bünemann buenemann@louis.info
- Durval Menezes jmrclone@durval.com - Durval Menezes jmrclone@durval.com
- Luiz Carlos Rumbelsperger Viana maxd13_luiz_carlos@hotmail.com - Luiz Carlos Rumbelsperger Viana maxd13_luiz_carlos@hotmail.com
- Stefan Breunig stefan-github@yrden.de
- Alishan Ladhani ali-l@users.noreply.github.com
- 0xJAKE 0xJAKE@users.noreply.github.com
- Thibault Molleman thibaultmol@users.noreply.github.com
- Scott McGillivray scott.mcgillivray@gmail.com


@ -15,7 +15,8 @@ Making a release
* make tag * make tag
* edit docs/content/changelog.md * edit docs/content/changelog.md
* make doc * make doc
* git commit -a -v * git status - to check for new man pages - git add them
* git commit -a -v -m "Version v1.XX"
* make retag * make retag
* # Set the GOPATH for a current stable go compiler * # Set the GOPATH for a current stable go compiler
* make cross * make cross


@ -7,6 +7,44 @@ date: "2016-11-06"
Changelog Changelog
--------- ---------
* v1.35 - 2017-01-02
* New Features
* moveto and copyto commands for choosing a destination name on copy/move
* rmdirs command to recursively delete empty directories
* Allow repeated --include/--exclude/--filter options
* Only show transfer stats on commands which transfer stuff
* show stats on any command using the `--stats` flag
* Allow overlapping directories in move when server side dir move is supported
* Add --stats-unit option - thanks Scott McGillivray
* Bug Fixes
* Fix the config file being overwritten when two rclones are running
* Make rclone lsd obey the filters properly
* Fix compilation on mips
* Fix not transferring files that don't differ in size
* Fix panic on nil retry/fatal error
* Mount
* Retry reads on error - should help with reliability a lot
* Report the modification times for directories from the remote
* Add bandwidth accounting and limiting (fixes --bwlimit)
* If --stats provided will show stats and which files are transferring
* Support R/W files if truncate is set.
* Implement statfs interface so df works
* Note that write is now supported on Amazon Drive
* Report number of blocks in a file - thanks Stefan Breunig
* Crypt
* Prevent the user pointing crypt at itself
* Fix failed to authenticate decrypted block errors
* these will now return the underlying unexpected EOF instead
* Amazon Drive
* Add support for server side move and directory move - thanks Stefan Breunig
* Fix nil pointer deref on size attribute
* B2
* Use new prefix and delimiter parameters in directory listings
* This makes --max-depth 1 dir listings as used in mount much faster
* Reauth the account while doing uploads too - should help with token expiry
* Drive
* Make DirMove more efficient and complain about moving the root
* Create destination directory on Move()
* v1.34 - 2016-11-06 * v1.34 - 2016-11-06
* New Features * New Features
* Stop single file and `--files-from` operations iterating through the source bucket. * Stop single file and `--files-from` operations iterating through the source bucket.


@ -1,12 +1,12 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone" title: "rclone"
slug: rclone slug: rclone
url: /commands/rclone/ url: /commands/rclone/
--- ---
## rclone ## rclone
Sync files and directories to and from local and remote object stores - v1.34-DEV Sync files and directories to and from local and remote object stores - v1.35-DEV
### Synopsis ### Synopsis
@ -79,16 +79,16 @@ rclone
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -110,7 +110,8 @@ rclone
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -126,6 +127,7 @@ rclone
* [rclone cleanup](/commands/rclone_cleanup/) - Clean up the remote if possible * [rclone cleanup](/commands/rclone_cleanup/) - Clean up the remote if possible
* [rclone config](/commands/rclone_config/) - Enter an interactive configuration session. * [rclone config](/commands/rclone_config/) - Enter an interactive configuration session.
* [rclone copy](/commands/rclone_copy/) - Copy files from source to dest, skipping already copied * [rclone copy](/commands/rclone_copy/) - Copy files from source to dest, skipping already copied
* [rclone copyto](/commands/rclone_copyto/) - Copy files from source to dest, skipping already copied
* [rclone dedupe](/commands/rclone_dedupe/) - Interactively find duplicate files delete/rename them. * [rclone dedupe](/commands/rclone_dedupe/) - Interactively find duplicate files delete/rename them.
* [rclone delete](/commands/rclone_delete/) - Remove the contents of path. * [rclone delete](/commands/rclone_delete/) - Remove the contents of path.
* [rclone genautocomplete](/commands/rclone_genautocomplete/) - Output bash completion script for rclone. * [rclone genautocomplete](/commands/rclone_genautocomplete/) - Output bash completion script for rclone.
@ -138,11 +140,13 @@ rclone
* [rclone mkdir](/commands/rclone_mkdir/) - Make the path if it doesn't already exist. * [rclone mkdir](/commands/rclone_mkdir/) - Make the path if it doesn't already exist.
* [rclone mount](/commands/rclone_mount/) - Mount the remote as a mountpoint. **EXPERIMENTAL** * [rclone mount](/commands/rclone_mount/) - Mount the remote as a mountpoint. **EXPERIMENTAL**
* [rclone move](/commands/rclone_move/) - Move files from source to dest. * [rclone move](/commands/rclone_move/) - Move files from source to dest.
* [rclone moveto](/commands/rclone_moveto/) - Move file or directory from source to dest.
* [rclone purge](/commands/rclone_purge/) - Remove the path and all of its contents. * [rclone purge](/commands/rclone_purge/) - Remove the path and all of its contents.
* [rclone rmdir](/commands/rclone_rmdir/) - Remove the path if empty. * [rclone rmdir](/commands/rclone_rmdir/) - Remove the path if empty.
* [rclone rmdirs](/commands/rclone_rmdirs/) - Remove any empty directories under the path.
* [rclone sha1sum](/commands/rclone_sha1sum/) - Produces an sha1sum file for all the objects in the path. * [rclone sha1sum](/commands/rclone_sha1sum/) - Produces an sha1sum file for all the objects in the path.
* [rclone size](/commands/rclone_size/) - Prints the total size and number of objects in remote:path. * [rclone size](/commands/rclone_size/) - Prints the total size and number of objects in remote:path.
* [rclone sync](/commands/rclone_sync/) - Make source and dest identical, modifying destination only. * [rclone sync](/commands/rclone_sync/) - Make source and dest identical, modifying destination only.
* [rclone version](/commands/rclone_version/) - Show the version number. * [rclone version](/commands/rclone_version/) - Show the version number.
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017


@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone authorize" title: "rclone authorize"
slug: rclone_authorize slug: rclone_authorize
url: /commands/rclone_authorize/ url: /commands/rclone_authorize/
@ -52,16 +52,16 @@ rclone authorize
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -83,7 +83,8 @@ rclone authorize
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -92,6 +93,6 @@ rclone authorize
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017


@@ -1,5 +1,5 @@
  ---
- date: 2016-11-06T10:19:14Z
+ date: 2017-01-02T15:29:14Z
  title: "rclone cat"
  slug: rclone_cat
  url: /commands/rclone_cat/
@@ -63,16 +63,16 @@ rclone cat remote:path
  --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
  --dump-filters Dump the filters to the output
  --dump-headers Dump HTTP headers - may contain sensitive info
- --exclude string Exclude files matching pattern
- --exclude-from string Read exclude patterns from file
- --files-from string Read list of source-file names from file
- -f, --filter string Add a file-filtering rule
- --filter-from string Read filtering patterns from a file
+ --exclude stringArray Exclude files matching pattern
+ --exclude-from stringArray Read exclude patterns from file
+ --files-from stringArray Read list of source-file names from file
+ -f, --filter stringArray Add a file-filtering rule
+ --filter-from stringArray Read filtering patterns from a file
  --ignore-existing Skip all files that exist on destination
  --ignore-size Ignore size when skipping use mod-time or checksum.
  -I, --ignore-times Don't skip files that match size and time - transfer all files
- --include string Include files matching pattern
- --include-from string Read include patterns from file
+ --include stringArray Include files matching pattern
+ --include-from stringArray Read include patterns from file
  --log-file string Log everything to this file
  --low-level-retries int Number of low level retries to do. (default 10)
  --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@@ -94,7 +94,8 @@ rclone cat remote:path
  --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
  --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
  --size-only Skip based on size only, not mod-time or checksum
- --stats duration Interval to print stats (0 to disable) (default 1m0s)
+ --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
+ --stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
  --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
  --timeout duration IO idle timeout (default 5m0s)
  --transfers int Number of file transfers to run in parallel. (default 4)
@@ -103,6 +104,6 @@ rclone cat remote:path
  ```
  ### SEE ALSO
- * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV
- ###### Auto generated by spf13/cobra on 6-Nov-2016
+ * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
+ ###### Auto generated by spf13/cobra on 2-Jan-2017
@@ -1,5 +1,5 @@
  ---
- date: 2016-11-06T10:19:14Z
+ date: 2017-01-02T15:29:14Z
  title: "rclone check"
  slug: rclone_check
  url: /commands/rclone_check/
@@ -55,16 +55,16 @@ rclone check source:path dest:path
  --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
  --dump-filters Dump the filters to the output
  --dump-headers Dump HTTP headers - may contain sensitive info
- --exclude string Exclude files matching pattern
- --exclude-from string Read exclude patterns from file
- --files-from string Read list of source-file names from file
- -f, --filter string Add a file-filtering rule
- --filter-from string Read filtering patterns from a file
+ --exclude stringArray Exclude files matching pattern
+ --exclude-from stringArray Read exclude patterns from file
+ --files-from stringArray Read list of source-file names from file
+ -f, --filter stringArray Add a file-filtering rule
+ --filter-from stringArray Read filtering patterns from a file
  --ignore-existing Skip all files that exist on destination
  --ignore-size Ignore size when skipping use mod-time or checksum.
  -I, --ignore-times Don't skip files that match size and time - transfer all files
- --include string Include files matching pattern
- --include-from string Read include patterns from file
+ --include stringArray Include files matching pattern
+ --include-from stringArray Read include patterns from file
  --log-file string Log everything to this file
  --low-level-retries int Number of low level retries to do. (default 10)
  --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@@ -86,7 +86,8 @@ rclone check source:path dest:path
  --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
  --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
  --size-only Skip based on size only, not mod-time or checksum
- --stats duration Interval to print stats (0 to disable) (default 1m0s)
+ --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
+ --stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
  --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
  --timeout duration IO idle timeout (default 5m0s)
  --transfers int Number of file transfers to run in parallel. (default 4)
@@ -95,6 +96,6 @@ rclone check source:path dest:path
  ```
  ### SEE ALSO
- * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV
- ###### Auto generated by spf13/cobra on 6-Nov-2016
+ * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
+ ###### Auto generated by spf13/cobra on 2-Jan-2017
@@ -1,5 +1,5 @@
  ---
- date: 2016-11-06T10:19:14Z
+ date: 2017-01-02T15:29:14Z
  title: "rclone cleanup"
  slug: rclone_cleanup
  url: /commands/rclone_cleanup/
@@ -52,16 +52,16 @@ rclone cleanup remote:path
  --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
  --dump-filters Dump the filters to the output
  --dump-headers Dump HTTP headers - may contain sensitive info
- --exclude string Exclude files matching pattern
- --exclude-from string Read exclude patterns from file
- --files-from string Read list of source-file names from file
- -f, --filter string Add a file-filtering rule
- --filter-from string Read filtering patterns from a file
+ --exclude stringArray Exclude files matching pattern
+ --exclude-from stringArray Read exclude patterns from file
+ --files-from stringArray Read list of source-file names from file
+ -f, --filter stringArray Add a file-filtering rule
+ --filter-from stringArray Read filtering patterns from a file
  --ignore-existing Skip all files that exist on destination
  --ignore-size Ignore size when skipping use mod-time or checksum.
  -I, --ignore-times Don't skip files that match size and time - transfer all files
- --include string Include files matching pattern
- --include-from string Read include patterns from file
+ --include stringArray Include files matching pattern
+ --include-from stringArray Read include patterns from file
  --log-file string Log everything to this file
  --low-level-retries int Number of low level retries to do. (default 10)
  --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@@ -83,7 +83,8 @@ rclone cleanup remote:path
  --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
  --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
  --size-only Skip based on size only, not mod-time or checksum
- --stats duration Interval to print stats (0 to disable) (default 1m0s)
+ --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
+ --stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
  --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
  --timeout duration IO idle timeout (default 5m0s)
  --transfers int Number of file transfers to run in parallel. (default 4)
@@ -92,6 +93,6 @@ rclone cleanup remote:path
  ```
  ### SEE ALSO
- * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV
- ###### Auto generated by spf13/cobra on 6-Nov-2016
+ * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
+ ###### Auto generated by spf13/cobra on 2-Jan-2017
@@ -1,5 +1,5 @@
  ---
- date: 2016-11-06T10:19:14Z
+ date: 2017-01-02T15:29:14Z
  title: "rclone config"
  slug: rclone_config
  url: /commands/rclone_config/
@@ -49,16 +49,16 @@ rclone config
  --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
  --dump-filters Dump the filters to the output
  --dump-headers Dump HTTP headers - may contain sensitive info
- --exclude string Exclude files matching pattern
- --exclude-from string Read exclude patterns from file
- --files-from string Read list of source-file names from file
- -f, --filter string Add a file-filtering rule
- --filter-from string Read filtering patterns from a file
+ --exclude stringArray Exclude files matching pattern
+ --exclude-from stringArray Read exclude patterns from file
+ --files-from stringArray Read list of source-file names from file
+ -f, --filter stringArray Add a file-filtering rule
+ --filter-from stringArray Read filtering patterns from a file
  --ignore-existing Skip all files that exist on destination
  --ignore-size Ignore size when skipping use mod-time or checksum.
  -I, --ignore-times Don't skip files that match size and time - transfer all files
- --include string Include files matching pattern
- --include-from string Read include patterns from file
+ --include stringArray Include files matching pattern
+ --include-from stringArray Read include patterns from file
  --log-file string Log everything to this file
  --low-level-retries int Number of low level retries to do. (default 10)
  --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@@ -80,7 +80,8 @@ rclone config
  --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
  --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
  --size-only Skip based on size only, not mod-time or checksum
- --stats duration Interval to print stats (0 to disable) (default 1m0s)
+ --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
+ --stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
  --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
  --timeout duration IO idle timeout (default 5m0s)
  --transfers int Number of file transfers to run in parallel. (default 4)
@@ -89,6 +90,6 @@ rclone config
  ```
  ### SEE ALSO
- * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV
- ###### Auto generated by spf13/cobra on 6-Nov-2016
+ * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
+ ###### Auto generated by spf13/cobra on 2-Jan-2017
@@ -1,5 +1,5 @@
  ---
- date: 2016-11-06T10:19:14Z
+ date: 2017-01-02T15:29:14Z
  title: "rclone copy"
  slug: rclone_copy
  url: /commands/rclone_copy/
@@ -88,16 +88,16 @@ rclone copy source:path dest:path
  --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
  --dump-filters Dump the filters to the output
  --dump-headers Dump HTTP headers - may contain sensitive info
- --exclude string Exclude files matching pattern
- --exclude-from string Read exclude patterns from file
- --files-from string Read list of source-file names from file
- -f, --filter string Add a file-filtering rule
- --filter-from string Read filtering patterns from a file
+ --exclude stringArray Exclude files matching pattern
+ --exclude-from stringArray Read exclude patterns from file
+ --files-from stringArray Read list of source-file names from file
+ -f, --filter stringArray Add a file-filtering rule
+ --filter-from stringArray Read filtering patterns from a file
  --ignore-existing Skip all files that exist on destination
  --ignore-size Ignore size when skipping use mod-time or checksum.
  -I, --ignore-times Don't skip files that match size and time - transfer all files
- --include string Include files matching pattern
- --include-from string Read include patterns from file
+ --include stringArray Include files matching pattern
+ --include-from stringArray Read include patterns from file
  --log-file string Log everything to this file
  --low-level-retries int Number of low level retries to do. (default 10)
  --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@@ -119,7 +119,8 @@ rclone copy source:path dest:path
  --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
  --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
  --size-only Skip based on size only, not mod-time or checksum
- --stats duration Interval to print stats (0 to disable) (default 1m0s)
+ --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
+ --stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
  --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
  --timeout duration IO idle timeout (default 5m0s)
  --transfers int Number of file transfers to run in parallel. (default 4)
@@ -128,6 +129,6 @@ rclone copy source:path dest:path
  ```
  ### SEE ALSO
- * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV
- ###### Auto generated by spf13/cobra on 6-Nov-2016
+ * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
+ ###### Auto generated by spf13/cobra on 2-Jan-2017
@@ -0,0 +1,121 @@
---
date: 2017-01-02T15:29:14Z
title: "rclone copyto"
slug: rclone_copyto
url: /commands/rclone_copyto/
---
## rclone copyto
Copy files from source to dest, skipping already copied
### Synopsis
If source:path is a file or directory then it copies it to a file or
directory named dest:path.
This can be used to upload single files to other than their current
name. If the source is a directory then it acts exactly like the copy
command.
So

    rclone copyto src dst

where src and dst are rclone paths, either remote:path or
/path/to/local or C:\windows\path\if\on\windows.

This will:

    if src is file
        copy it to dst, overwriting an existing file if it exists
    if src is directory
        copy it to dst, overwriting existing files if they exist
        see copy command for full details
This doesn't transfer unchanged files, testing by size and
modification time or MD5SUM. It doesn't delete files from the
destination.
```
rclone copyto source:path dest:path
```
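As a sketch of the behaviour described above, two hypothetical invocations (the remote name `remote:` and all paths are made up for illustration):

```shell
# src is a file: copy it to dst under a new name,
# overwriting remote:backups/report-2017.txt if it already exists
rclone copyto /home/user/report.txt remote:backups/report-2017.txt

# src is a directory: copyto acts exactly like the copy command
rclone copyto /home/user/photos remote:photos
```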
### Options inherited from parent commands
```
--acd-templink-threshold int Files >= this size will be downloaded via their tempLink. (default 9G)
--acd-upload-wait-per-gb duration Additional time per GB to wait after a failed complete upload to see if it appears. (default 3m0s)
--ask-password Allow prompt for password for encrypted configuration. (default true)
--b2-chunk-size int Upload chunk size. Must fit in memory. (default 96M)
--b2-test-mode string A flag string for X-Bz-Test-Mode header.
--b2-upload-cutoff int Cutoff for switching to chunked upload (default 190.735M)
--b2-versions Include old versions in directory listings.
--bwlimit int Bandwidth limit in kBytes/s, or use suffix b|k|M|G
--checkers int Number of checkers to run in parallel. (default 8)
-c, --checksum Skip based on checksum & size, not mod-time & size
--config string Config file. (default "/home/ncw/.rclone.conf")
--contimeout duration Connect timeout (default 1m0s)
--cpuprofile string Write cpu profile to file
--delete-after When synchronizing, delete files on destination after transfering
--delete-before When synchronizing, delete files on destination before transfering
--delete-during When synchronizing, delete files during transfer (default)
--delete-excluded Delete files on dest excluded from sync
--drive-auth-owner-only Only consider files owned by the authenticated user. Requires drive-full-list.
--drive-chunk-size int Upload chunk size. Must a power of 2 >= 256k. (default 8M)
--drive-formats string Comma separated list of preferred formats for downloading Google docs. (default "docx,xlsx,pptx,svg")
--drive-full-list Use a full listing for directory list. More data but usually quicker. (obsolete)
--drive-upload-cutoff int Cutoff for switching to chunked upload (default 8M)
--drive-use-trash Send files to the trash instead of deleting permanently.
--dropbox-chunk-size int Upload chunk size. Max 150M. (default 128M)
-n, --dry-run Do a trial run with no permanent changes
--dump-auth Dump HTTP headers with auth info
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info
--exclude stringArray Exclude files matching pattern
--exclude-from stringArray Read exclude patterns from file
--files-from stringArray Read list of source-file names from file
-f, --filter stringArray Add a file-filtering rule
--filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files
--include stringArray Include files matching pattern
--include-from stringArray Read include patterns from file
--log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
--max-depth int If set limits the recursion depth to this. (default -1)
--max-size int Don't transfer any file larger than this in k or suffix b|k|M|G (default off)
--memprofile string Write memory profile to file
--min-age string Don't transfer any file younger than this in s or suffix ms|s|m|h|d|w|M|y
--min-size int Don't transfer any file smaller than this in k or suffix b|k|M|G (default off)
--modify-window duration Max time diff to be considered the same (default 1ns)
--no-check-certificate Do not verify the server SSL certificate. Insecure.
--no-gzip-encoding Don't set Accept-Encoding: gzip.
--no-traverse Don't traverse destination file system on copy.
--no-update-modtime Don't update destination mod-time if files identical.
-x, --one-file-system Don't cross filesystem boundaries.
--onedrive-chunk-size int Above this size files will be chunked - must be multiple of 320k. (default 10M)
--onedrive-upload-cutoff int Cutoff for switching to chunked upload - must be <= 100MB (default 10M)
-q, --quiet Print as little stuff as possible
--retries int Retry operations this many times if they fail (default 3)
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum
--stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4)
-u, --update Skip files that are newer on the destination.
-v, --verbose Print lots more stuff
```
### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 2-Jan-2017
@@ -1,5 +1,5 @@
  ---
- date: 2016-11-06T10:19:14Z
+ date: 2017-01-02T15:29:14Z
  title: "rclone dedupe"
  slug: rclone_dedupe
  url: /commands/rclone_dedupe/
@@ -130,16 +130,16 @@ rclone dedupe [mode] remote:path
  --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
  --dump-filters Dump the filters to the output
  --dump-headers Dump HTTP headers - may contain sensitive info
- --exclude string Exclude files matching pattern
- --exclude-from string Read exclude patterns from file
- --files-from string Read list of source-file names from file
- -f, --filter string Add a file-filtering rule
- --filter-from string Read filtering patterns from a file
+ --exclude stringArray Exclude files matching pattern
+ --exclude-from stringArray Read exclude patterns from file
+ --files-from stringArray Read list of source-file names from file
+ -f, --filter stringArray Add a file-filtering rule
+ --filter-from stringArray Read filtering patterns from a file
  --ignore-existing Skip all files that exist on destination
  --ignore-size Ignore size when skipping use mod-time or checksum.
  -I, --ignore-times Don't skip files that match size and time - transfer all files
- --include string Include files matching pattern
- --include-from string Read include patterns from file
+ --include stringArray Include files matching pattern
+ --include-from stringArray Read include patterns from file
  --log-file string Log everything to this file
  --low-level-retries int Number of low level retries to do. (default 10)
  --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@@ -161,7 +161,8 @@ rclone dedupe [mode] remote:path
  --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
  --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
  --size-only Skip based on size only, not mod-time or checksum
- --stats duration Interval to print stats (0 to disable) (default 1m0s)
+ --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
+ --stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
  --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
  --timeout duration IO idle timeout (default 5m0s)
  --transfers int Number of file transfers to run in parallel. (default 4)
@@ -170,6 +171,6 @@ rclone dedupe [mode] remote:path
  ```
  ### SEE ALSO
- * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV
- ###### Auto generated by spf13/cobra on 6-Nov-2016
+ * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
+ ###### Auto generated by spf13/cobra on 2-Jan-2017
@@ -1,5 +1,5 @@
  ---
- date: 2016-11-06T10:19:14Z
+ date: 2017-01-02T15:29:14Z
  title: "rclone delete"
  slug: rclone_delete
  url: /commands/rclone_delete/
@@ -66,16 +66,16 @@ rclone delete remote:path
  --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
  --dump-filters Dump the filters to the output
  --dump-headers Dump HTTP headers - may contain sensitive info
- --exclude string Exclude files matching pattern
- --exclude-from string Read exclude patterns from file
- --files-from string Read list of source-file names from file
- -f, --filter string Add a file-filtering rule
- --filter-from string Read filtering patterns from a file
+ --exclude stringArray Exclude files matching pattern
+ --exclude-from stringArray Read exclude patterns from file
+ --files-from stringArray Read list of source-file names from file
+ -f, --filter stringArray Add a file-filtering rule
+ --filter-from stringArray Read filtering patterns from a file
  --ignore-existing Skip all files that exist on destination
  --ignore-size Ignore size when skipping use mod-time or checksum.
  -I, --ignore-times Don't skip files that match size and time - transfer all files
- --include string Include files matching pattern
- --include-from string Read include patterns from file
+ --include stringArray Include files matching pattern
+ --include-from stringArray Read include patterns from file
  --log-file string Log everything to this file
  --low-level-retries int Number of low level retries to do. (default 10)
  --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@@ -97,7 +97,8 @@ rclone delete remote:path
  --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
  --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
  --size-only Skip based on size only, not mod-time or checksum
- --stats duration Interval to print stats (0 to disable) (default 1m0s)
+ --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
+ --stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
  --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
  --timeout duration IO idle timeout (default 5m0s)
  --transfers int Number of file transfers to run in parallel. (default 4)
@@ -106,6 +107,6 @@ rclone delete remote:path
  ```
  ### SEE ALSO
- * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV
- ###### Auto generated by spf13/cobra on 6-Nov-2016
+ * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
+ ###### Auto generated by spf13/cobra on 2-Jan-2017
@@ -1,5 +1,5 @@
 ---
-date: 2016-11-06T10:19:14Z
+date: 2017-01-02T15:29:14Z
 title: "rclone genautocomplete"
 slug: rclone_genautocomplete
 url: /commands/rclone_genautocomplete/
@@ -64,16 +64,16 @@ rclone genautocomplete [output_file]
       --dump-bodies               Dump HTTP headers and bodies - may contain sensitive info
       --dump-filters              Dump the filters to the output
       --dump-headers              Dump HTTP headers - may contain sensitive info
-      --exclude string            Exclude files matching pattern
-      --exclude-from string       Read exclude patterns from file
-      --files-from string         Read list of source-file names from file
-  -f, --filter string             Add a file-filtering rule
-      --filter-from string        Read filtering patterns from a file
+      --exclude stringArray       Exclude files matching pattern
+      --exclude-from stringArray  Read exclude patterns from file
+      --files-from stringArray    Read list of source-file names from file
+  -f, --filter stringArray        Add a file-filtering rule
+      --filter-from stringArray   Read filtering patterns from a file
       --ignore-existing           Skip all files that exist on destination
       --ignore-size               Ignore size when skipping use mod-time or checksum.
   -I, --ignore-times              Don't skip files that match size and time - transfer all files
-      --include string            Include files matching pattern
-      --include-from string       Read include patterns from file
+      --include stringArray       Include files matching pattern
+      --include-from stringArray  Read include patterns from file
       --log-file string           Log everything to this file
       --low-level-retries int     Number of low level retries to do. (default 10)
       --max-age string            Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@@ -95,7 +95,8 @@ rclone genautocomplete [output_file]
       --s3-acl string             Canned ACL used when creating buckets and/or storing objects in S3
       --s3-storage-class string   Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
       --size-only                 Skip based on size only, not mod-time or checksum
-      --stats duration            Interval to print stats (0 to disable) (default 1m0s)
+      --stats duration            Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
+      --stats-unit string         Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
       --swift-chunk-size int      Above this size files will be chunked into a _segments container. (default 5G)
       --timeout duration          IO idle timeout (default 5m0s)
       --transfers int             Number of file transfers to run in parallel. (default 4)
@@ -104,6 +105,6 @@ rclone genautocomplete [output_file]
 ```
 ### SEE ALSO
-* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV
-###### Auto generated by spf13/cobra on 6-Nov-2016
+* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
+###### Auto generated by spf13/cobra on 2-Jan-2017
@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone gendocs" title: "rclone gendocs"
slug: rclone_gendocs slug: rclone_gendocs
url: /commands/rclone_gendocs/ url: /commands/rclone_gendocs/
@ -52,16 +52,16 @@ rclone gendocs output_directory
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -83,7 +83,8 @@ rclone gendocs output_directory
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -92,6 +93,6 @@ rclone gendocs output_directory
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017
@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone listremotes" title: "rclone listremotes"
slug: rclone_listremotes slug: rclone_listremotes
url: /commands/rclone_listremotes/ url: /commands/rclone_listremotes/
@ -59,16 +59,16 @@ rclone listremotes
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -90,7 +90,8 @@ rclone listremotes
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -99,6 +100,6 @@ rclone listremotes
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017
@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone ls" title: "rclone ls"
slug: rclone_ls slug: rclone_ls
url: /commands/rclone_ls/ url: /commands/rclone_ls/
@ -49,16 +49,16 @@ rclone ls remote:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -80,7 +80,8 @@ rclone ls remote:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -89,6 +90,6 @@ rclone ls remote:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017
@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone lsd" title: "rclone lsd"
slug: rclone_lsd slug: rclone_lsd
url: /commands/rclone_lsd/ url: /commands/rclone_lsd/
@ -49,16 +49,16 @@ rclone lsd remote:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -80,7 +80,8 @@ rclone lsd remote:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -89,6 +90,6 @@ rclone lsd remote:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017
@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone lsl" title: "rclone lsl"
slug: rclone_lsl slug: rclone_lsl
url: /commands/rclone_lsl/ url: /commands/rclone_lsl/
@ -49,16 +49,16 @@ rclone lsl remote:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -80,7 +80,8 @@ rclone lsl remote:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -89,6 +90,6 @@ rclone lsl remote:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017
@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone md5sum" title: "rclone md5sum"
slug: rclone_md5sum slug: rclone_md5sum
url: /commands/rclone_md5sum/ url: /commands/rclone_md5sum/
@ -52,16 +52,16 @@ rclone md5sum remote:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -83,7 +83,8 @@ rclone md5sum remote:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -92,6 +93,6 @@ rclone md5sum remote:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017
@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone mkdir" title: "rclone mkdir"
slug: rclone_mkdir slug: rclone_mkdir
url: /commands/rclone_mkdir/ url: /commands/rclone_mkdir/
@ -49,16 +49,16 @@ rclone mkdir remote:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -80,7 +80,8 @@ rclone mkdir remote:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -89,6 +90,6 @@ rclone mkdir remote:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017
@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone mount" title: "rclone mount"
slug: rclone_mount slug: rclone_mount
url: /commands/rclone_mount/ url: /commands/rclone_mount/
@ -59,7 +59,7 @@ mount won't do that, so will be less reliable than the rclone command.
### Bugs ### ### Bugs ###
* All the remotes should work for read, but some may not for write * All the remotes should work for read, but some may not for write
* those which need to know the size in advance won't - eg B2, Amazon Drive * those which need to know the size in advance won't - eg B2
* maybe should pass in size as -1 to mean work it out * maybe should pass in size as -1 to mean work it out
* Or put in an upload cache to cache the files on disk first * Or put in an upload cache to cache the files on disk first
@ -125,16 +125,16 @@ rclone mount remote:path /path/to/mountpoint
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping; use mod-time or checksum. --ignore-size Ignore size when skipping; use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -156,7 +156,8 @@ rclone mount remote:path /path/to/mountpoint
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g. 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -165,6 +166,6 @@ rclone mount remote:path /path/to/mountpoint
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017


@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone move" title: "rclone move"
slug: rclone_move slug: rclone_move
url: /commands/rclone_move/ url: /commands/rclone_move/
@ -13,7 +13,8 @@ Move files from source to dest.
Moves the contents of the source directory to the destination Moves the contents of the source directory to the destination
directory. Rclone will error if the source and destination overlap. directory. Rclone will error if the source and destination overlap and
the remote does not support a server side directory move operation.
If no filters are in use and if possible this will server side move If no filters are in use and if possible this will server side move
`source:path` into `dest:path`. After this `source:path` will no `source:path` into `dest:path`. After this `source:path` will no
@ -65,16 +66,16 @@ rclone move source:path dest:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping; use mod-time or checksum. --ignore-size Ignore size when skipping; use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -96,7 +97,8 @@ rclone move source:path dest:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g. 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -105,6 +107,6 @@ rclone move source:path dest:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017


@ -0,0 +1,124 @@
---
date: 2017-01-02T15:29:14Z
title: "rclone moveto"
slug: rclone_moveto
url: /commands/rclone_moveto/
---
## rclone moveto
Move file or directory from source to dest.
### Synopsis
If source:path is a file or directory then it moves it to a file or
directory named dest:path.
This can be used to rename files or upload single files to other than
their existing name. If the source is a directory then it acts exactly
like the move command.
So
rclone moveto src dst
where src and dst are rclone paths, either remote:path or
/path/to/local or C:\windows\path\if\on\windows.
This will:
if src is file
move it to dst, overwriting an existing file if it exists
if src is directory
move it to dst, overwriting existing files if they exist
see move command for full details
This doesn't transfer unchanged files, testing by size and
modification time or MD5SUM. src will be deleted on successful
transfer.
**Important**: Since this can cause data loss, test first with the
--dry-run flag.
```
rclone moveto source:path dest:path
```
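For illustration, a couple of hedged invocations (the remote name and paths here are hypothetical):

```shell
# Rename a single file in place on a remote; an existing dest file is overwritten.
rclone moveto remote:backup/report-draft.txt remote:backup/report-final.txt

# When src is a directory, moveto behaves like the move command.
# --dry-run previews the operation, which is advisable given the data-loss warning above.
rclone moveto --dry-run /home/user/photos remote:photos-archive
```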
### Options inherited from parent commands
```
--acd-templink-threshold int Files >= this size will be downloaded via their tempLink. (default 9G)
--acd-upload-wait-per-gb duration Additional time per GB to wait after a failed complete upload to see if it appears. (default 3m0s)
--ask-password Allow prompt for password for encrypted configuration. (default true)
--b2-chunk-size int Upload chunk size. Must fit in memory. (default 96M)
--b2-test-mode string A flag string for X-Bz-Test-Mode header.
--b2-upload-cutoff int Cutoff for switching to chunked upload (default 190.735M)
--b2-versions Include old versions in directory listings.
--bwlimit int Bandwidth limit in kBytes/s, or use suffix b|k|M|G
--checkers int Number of checkers to run in parallel. (default 8)
-c, --checksum Skip based on checksum & size, not mod-time & size
--config string Config file. (default "/home/ncw/.rclone.conf")
--contimeout duration Connect timeout (default 1m0s)
--cpuprofile string Write cpu profile to file
--delete-after When synchronizing, delete files on destination after transferring
--delete-before When synchronizing, delete files on destination before transferring
--delete-during When synchronizing, delete files during transfer (default)
--delete-excluded Delete files on dest excluded from sync
--drive-auth-owner-only Only consider files owned by the authenticated user. Requires drive-full-list.
--drive-chunk-size int Upload chunk size. Must be a power of 2 >= 256k. (default 8M)
--drive-formats string Comma separated list of preferred formats for downloading Google docs. (default "docx,xlsx,pptx,svg")
--drive-full-list Use a full listing for directory list. More data but usually quicker. (obsolete)
--drive-upload-cutoff int Cutoff for switching to chunked upload (default 8M)
--drive-use-trash Send files to the trash instead of deleting permanently.
--dropbox-chunk-size int Upload chunk size. Max 150M. (default 128M)
-n, --dry-run Do a trial run with no permanent changes
--dump-auth Dump HTTP headers with auth info
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info
--exclude stringArray Exclude files matching pattern
--exclude-from stringArray Read exclude patterns from file
--files-from stringArray Read list of source-file names from file
-f, --filter stringArray Add a file-filtering rule
--filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping; use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files
--include stringArray Include files matching pattern
--include-from stringArray Read include patterns from file
--log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
--max-depth int If set limits the recursion depth to this. (default -1)
--max-size int Don't transfer any file larger than this in k or suffix b|k|M|G (default off)
--memprofile string Write memory profile to file
--min-age string Don't transfer any file younger than this in s or suffix ms|s|m|h|d|w|M|y
--min-size int Don't transfer any file smaller than this in k or suffix b|k|M|G (default off)
--modify-window duration Max time diff to be considered the same (default 1ns)
--no-check-certificate Do not verify the server SSL certificate. Insecure.
--no-gzip-encoding Don't set Accept-Encoding: gzip.
--no-traverse Don't traverse destination file system on copy.
--no-update-modtime Don't update destination mod-time if files identical.
-x, --one-file-system Don't cross filesystem boundaries.
--onedrive-chunk-size int Above this size files will be chunked - must be multiple of 320k. (default 10M)
--onedrive-upload-cutoff int Cutoff for switching to chunked upload - must be <= 100MB (default 10M)
-q, --quiet Print as little stuff as possible
--retries int Retry operations this many times if they fail (default 3)
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum
--stats duration Interval between printing stats, e.g. 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4)
-u, --update Skip files that are newer on the destination.
-v, --verbose Print lots more stuff
```
### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 2-Jan-2017


@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone purge" title: "rclone purge"
slug: rclone_purge slug: rclone_purge
url: /commands/rclone_purge/ url: /commands/rclone_purge/
@ -53,16 +53,16 @@ rclone purge remote:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping; use mod-time or checksum. --ignore-size Ignore size when skipping; use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -84,7 +84,8 @@ rclone purge remote:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g. 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -93,6 +94,6 @@ rclone purge remote:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017


@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone rmdir" title: "rclone rmdir"
slug: rclone_rmdir slug: rclone_rmdir
url: /commands/rclone_rmdir/ url: /commands/rclone_rmdir/
@ -51,16 +51,16 @@ rclone rmdir remote:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping; use mod-time or checksum. --ignore-size Ignore size when skipping; use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -82,7 +82,8 @@ rclone rmdir remote:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g. 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -91,6 +92,6 @@ rclone rmdir remote:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017


@ -0,0 +1,102 @@
---
date: 2017-01-02T15:29:14Z
title: "rclone rmdirs"
slug: rclone_rmdirs
url: /commands/rclone_rmdirs/
---
## rclone rmdirs
Remove any empty directories under the path.
### Synopsis
This removes any empty directories (or directories that only contain
empty directories) under the path that it finds, including the path if
it has nothing in it.
This is useful for tidying up remotes where rclone has left behind a
lot of empty directories.
```
rclone rmdirs remote:path
```
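A hedged example (the remote and path are hypothetical); since deletion is involved, the inherited --dry-run flag can show what would be removed first:

```shell
# List the empty directories that would be removed, without deleting anything.
rclone rmdirs --dry-run remote:bucket/old-data

# Actually remove them, including the path itself if it ends up empty.
rclone rmdirs remote:bucket/old-data
```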
### Options inherited from parent commands
```
--acd-templink-threshold int Files >= this size will be downloaded via their tempLink. (default 9G)
--acd-upload-wait-per-gb duration Additional time per GB to wait after a failed complete upload to see if it appears. (default 3m0s)
--ask-password Allow prompt for password for encrypted configuration. (default true)
--b2-chunk-size int Upload chunk size. Must fit in memory. (default 96M)
--b2-test-mode string A flag string for X-Bz-Test-Mode header.
--b2-upload-cutoff int Cutoff for switching to chunked upload (default 190.735M)
--b2-versions Include old versions in directory listings.
--bwlimit int Bandwidth limit in kBytes/s, or use suffix b|k|M|G
--checkers int Number of checkers to run in parallel. (default 8)
-c, --checksum Skip based on checksum & size, not mod-time & size
--config string Config file. (default "/home/ncw/.rclone.conf")
--contimeout duration Connect timeout (default 1m0s)
--cpuprofile string Write cpu profile to file
--delete-after When synchronizing, delete files on destination after transferring
--delete-before When synchronizing, delete files on destination before transferring
--delete-during When synchronizing, delete files during transfer (default)
--delete-excluded Delete files on dest excluded from sync
--drive-auth-owner-only Only consider files owned by the authenticated user. Requires drive-full-list.
--drive-chunk-size int Upload chunk size. Must be a power of 2 >= 256k. (default 8M)
--drive-formats string Comma separated list of preferred formats for downloading Google docs. (default "docx,xlsx,pptx,svg")
--drive-full-list Use a full listing for directory list. More data but usually quicker. (obsolete)
--drive-upload-cutoff int Cutoff for switching to chunked upload (default 8M)
--drive-use-trash Send files to the trash instead of deleting permanently.
--dropbox-chunk-size int Upload chunk size. Max 150M. (default 128M)
-n, --dry-run Do a trial run with no permanent changes
--dump-auth Dump HTTP headers with auth info
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info
--exclude stringArray Exclude files matching pattern
--exclude-from stringArray Read exclude patterns from file
--files-from stringArray Read list of source-file names from file
-f, --filter stringArray Add a file-filtering rule
--filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping; use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files
--include stringArray Include files matching pattern
--include-from stringArray Read include patterns from file
--log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
--max-depth int If set limits the recursion depth to this. (default -1)
--max-size int Don't transfer any file larger than this in k or suffix b|k|M|G (default off)
--memprofile string Write memory profile to file
--min-age string Don't transfer any file younger than this in s or suffix ms|s|m|h|d|w|M|y
--min-size int Don't transfer any file smaller than this in k or suffix b|k|M|G (default off)
--modify-window duration Max time diff to be considered the same (default 1ns)
--no-check-certificate Do not verify the server SSL certificate. Insecure.
--no-gzip-encoding Don't set Accept-Encoding: gzip.
--no-traverse Don't traverse destination file system on copy.
--no-update-modtime Don't update destination mod-time if files identical.
-x, --one-file-system Don't cross filesystem boundaries.
--onedrive-chunk-size int Above this size files will be chunked - must be multiple of 320k. (default 10M)
--onedrive-upload-cutoff int Cutoff for switching to chunked upload - must be <= 100MB (default 10M)
-q, --quiet Print as little stuff as possible
--retries int Retry operations this many times if they fail (default 3)
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum
--stats duration Interval between printing stats, e.g. 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4)
-u, --update Skip files that are newer on the destination.
-v, --verbose Print lots more stuff
```
### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 2-Jan-2017


@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone sha1sum" title: "rclone sha1sum"
slug: rclone_sha1sum slug: rclone_sha1sum
url: /commands/rclone_sha1sum/ url: /commands/rclone_sha1sum/
@ -52,16 +52,16 @@ rclone sha1sum remote:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping; use mod-time or checksum. --ignore-size Ignore size when skipping; use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -83,7 +83,8 @@ rclone sha1sum remote:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
--stats duration Interval to print stats (0 to disable) (default 1m0s) --stats duration Interval between printing stats, e.g. 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -92,6 +93,6 @@ rclone sha1sum remote:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017


@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone size" title: "rclone size"
slug: rclone_size slug: rclone_size
url: /commands/rclone_size/ url: /commands/rclone_size/
@ -49,16 +49,16 @@ rclone size remote:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -80,7 +80,8 @@ rclone size remote:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
      --stats duration               Interval to print stats (0 to disable) (default 1m0s)       --stats duration               Interval between printing stats, e.g. 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -89,6 +90,6 @@ rclone size remote:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017


@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone sync" title: "rclone sync"
slug: rclone_sync slug: rclone_sync
url: /commands/rclone_sync/ url: /commands/rclone_sync/
@ -68,16 +68,16 @@ rclone sync source:path dest:path
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -99,7 +99,8 @@ rclone sync source:path dest:path
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
      --stats duration               Interval to print stats (0 to disable) (default 1m0s)       --stats duration               Interval between printing stats, e.g. 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -108,6 +109,6 @@ rclone sync source:path dest:path
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017


@ -1,5 +1,5 @@
--- ---
date: 2016-11-06T10:19:14Z date: 2017-01-02T15:29:14Z
title: "rclone version" title: "rclone version"
slug: rclone_version slug: rclone_version
url: /commands/rclone_version/ url: /commands/rclone_version/
@ -49,16 +49,16 @@ rclone version
--dump-bodies Dump HTTP headers and bodies - may contain sensitive info --dump-bodies Dump HTTP headers and bodies - may contain sensitive info
--dump-filters Dump the filters to the output --dump-filters Dump the filters to the output
--dump-headers Dump HTTP headers - may contain sensitive info --dump-headers Dump HTTP headers - may contain sensitive info
--exclude string Exclude files matching pattern --exclude stringArray Exclude files matching pattern
--exclude-from string Read exclude patterns from file --exclude-from stringArray Read exclude patterns from file
--files-from string Read list of source-file names from file --files-from stringArray Read list of source-file names from file
-f, --filter string Add a file-filtering rule -f, --filter stringArray Add a file-filtering rule
--filter-from string Read filtering patterns from a file --filter-from stringArray Read filtering patterns from a file
--ignore-existing Skip all files that exist on destination --ignore-existing Skip all files that exist on destination
--ignore-size Ignore size when skipping use mod-time or checksum. --ignore-size Ignore size when skipping use mod-time or checksum.
-I, --ignore-times Don't skip files that match size and time - transfer all files -I, --ignore-times Don't skip files that match size and time - transfer all files
--include string Include files matching pattern --include stringArray Include files matching pattern
--include-from string Read include patterns from file --include-from stringArray Read include patterns from file
--log-file string Log everything to this file --log-file string Log everything to this file
--low-level-retries int Number of low level retries to do. (default 10) --low-level-retries int Number of low level retries to do. (default 10)
--max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y --max-age string Don't transfer any file older than this in s or suffix ms|s|m|h|d|w|M|y
@ -80,7 +80,8 @@ rclone version
--s3-acl string Canned ACL used when creating buckets and/or storing objects in S3 --s3-acl string Canned ACL used when creating buckets and/or storing objects in S3
--s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA) --s3-storage-class string Storage class to use when uploading S3 objects (STANDARD|REDUCED_REDUNDANCY|STANDARD_IA)
--size-only Skip based on size only, not mod-time or checksum --size-only Skip based on size only, not mod-time or checksum
      --stats duration               Interval to print stats (0 to disable) (default 1m0s)       --stats duration               Interval between printing stats, e.g. 500ms, 60s, 5m. (0 to disable) (default 1m0s)
--stats-unit string Show data rate in stats as either 'bits' or 'bytes'/s (default "bytes")
--swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G) --swift-chunk-size int Above this size files will be chunked into a _segments container. (default 5G)
--timeout duration IO idle timeout (default 5m0s) --timeout duration IO idle timeout (default 5m0s)
--transfers int Number of file transfers to run in parallel. (default 4) --transfers int Number of file transfers to run in parallel. (default 4)
@ -89,6 +90,6 @@ rclone version
``` ```
### SEE ALSO ### SEE ALSO
* [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.34-DEV * [rclone](/commands/rclone/) - Sync files and directories to and from local and remote object stores - v1.35-DEV
###### Auto generated by spf13/cobra on 6-Nov-2016 ###### Auto generated by spf13/cobra on 2-Jan-2017


@ -2,41 +2,41 @@
title: "Rclone downloads" title: "Rclone downloads"
description: "Download rclone binaries for your OS." description: "Download rclone binaries for your OS."
type: page type: page
date: "2016-11-06" date: "2017-01-02"
--- ---
Rclone Download v1.34 Rclone Download v1.35
===================== =====================
* Windows * Windows
* [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.34-windows-386.zip) * [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.35-windows-386.zip)
* [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.34-windows-amd64.zip) * [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.35-windows-amd64.zip)
* OSX * OSX
* [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.34-osx-386.zip) * [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.35-osx-386.zip)
* [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.34-osx-amd64.zip) * [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.35-osx-amd64.zip)
* Linux * Linux
* [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.34-linux-386.zip) * [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.35-linux-386.zip)
* [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.34-linux-amd64.zip) * [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.35-linux-amd64.zip)
* [ARM - 32 Bit](http://downloads.rclone.org/rclone-v1.34-linux-arm.zip) * [ARM - 32 Bit](http://downloads.rclone.org/rclone-v1.35-linux-arm.zip)
* [ARM - 64 Bit](http://downloads.rclone.org/rclone-v1.34-linux-arm64.zip) * [ARM - 64 Bit](http://downloads.rclone.org/rclone-v1.35-linux-arm64.zip)
* FreeBSD * FreeBSD
* [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.34-freebsd-386.zip) * [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.35-freebsd-386.zip)
* [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.34-freebsd-amd64.zip) * [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.35-freebsd-amd64.zip)
* [ARM - 32 Bit](http://downloads.rclone.org/rclone-v1.34-freebsd-arm.zip) * [ARM - 32 Bit](http://downloads.rclone.org/rclone-v1.35-freebsd-arm.zip)
* NetBSD * NetBSD
* [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.34-netbsd-386.zip) * [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.35-netbsd-386.zip)
* [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.34-netbsd-amd64.zip) * [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.35-netbsd-amd64.zip)
* [ARM - 32 Bit](http://downloads.rclone.org/rclone-v1.34-netbsd-arm.zip) * [ARM - 32 Bit](http://downloads.rclone.org/rclone-v1.35-netbsd-arm.zip)
* OpenBSD * OpenBSD
* [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.34-openbsd-386.zip) * [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.35-openbsd-386.zip)
* [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.34-openbsd-amd64.zip) * [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.35-openbsd-amd64.zip)
* Plan 9 * Plan 9
* [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.34-plan9-386.zip) * [386 - 32 Bit](http://downloads.rclone.org/rclone-v1.35-plan9-386.zip)
* [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.34-plan9-amd64.zip) * [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.35-plan9-amd64.zip)
* Solaris * Solaris
* [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.34-solaris-amd64.zip) * [AMD64 - 64 Bit](http://downloads.rclone.org/rclone-v1.35-solaris-amd64.zip)
You can also find a [mirror of the downloads on github](https://github.com/ncw/rclone/releases/tag/v1.34). You can also find a [mirror of the downloads on github](https://github.com/ncw/rclone/releases/tag/v1.35).
You can also download [the releases using SSL](https://downloads-rclone-org-7d7d567e.cdn.memsites.com/). You can also download [the releases using SSL](https://downloads-rclone-org-7d7d567e.cdn.memsites.com/).


@ -1,4 +1,4 @@
package fs package fs
// Version of rclone // Version of rclone
var Version = "v1.34-DEV" var Version = "v1.35-DEV"

333
rclone.1

@ -1,7 +1,7 @@
.\"t .\"t
.\" Automatically generated by Pandoc 1.16.0.2 .\" Automatically generated by Pandoc 1.16.0.2
.\" .\"
.TH "rclone" "1" "Nov 06, 2016" "User Manual" "" .TH "rclone" "1" "Jan 02, 2017" "User Manual" ""
.hy .hy
.SH Rclone .SH Rclone
.PP .PP
@ -63,9 +63,9 @@ Home page (http://rclone.org/)
Github project page for source and bug Github project page for source and bug
tracker (http://github.com/ncw/rclone) tracker (http://github.com/ncw/rclone)
.IP \[bu] 2 .IP \[bu] 2
Rclone Forum (https://forum.rclone.org)
.IP \[bu] 2
Google+ page Google+ page
.RS 2
.RE
.IP \[bu] 2 .IP \[bu] 2
Downloads (http://rclone.org/downloads/) Downloads (http://rclone.org/downloads/)
.SH Install .SH Install
@ -394,7 +394,8 @@ Move files from source to dest.
.SS Synopsis .SS Synopsis
.PP .PP
Moves the contents of the source directory to the destination directory. Moves the contents of the source directory to the destination directory.
Rclone will error if the source and destination overlap. Rclone will error if the source and destination overlap and the remote
does not support a server side directory move operation.
.PP .PP
If no filters are in use and if possible this will server side move If no filters are in use and if possible this will server side move
\f[C]source:path\f[] into \f[C]dest:path\f[]. \f[C]source:path\f[] into \f[C]dest:path\f[].
@ -785,6 +786,50 @@ rclone\ \-\-include\ "*.txt"\ cat\ remote:path/to/dir
rclone\ cat\ remote:path rclone\ cat\ remote:path
\f[] \f[]
.fi .fi
.SS rclone copyto
.PP
Copy files from source to dest, skipping already copied
.SS Synopsis
.PP
If source:path is a file or directory then it copies it to a file or
directory named dest:path.
.PP
This can be used to upload single files to other than their current
name.
If the source is a directory then it acts exactly like the copy command.
.PP
So
.IP
.nf
\f[C]
rclone\ copyto\ src\ dst
\f[]
.fi
.PP
where src and dst are rclone paths, either remote:path or /path/to/local
or C:.
.PP
This will:
.IP
.nf
\f[C]
if\ src\ is\ file
\ \ \ \ copy\ it\ to\ dst,\ overwriting\ an\ existing\ file\ if\ it\ exists
if\ src\ is\ directory
\ \ \ \ copy\ it\ to\ dst,\ overwriting\ existing\ files\ if\ they\ exist
\ \ \ \ see\ copy\ command\ for\ full\ details
\f[]
.fi
.PP
This doesn\[aq]t transfer unchanged files, testing by size and
modification time or MD5SUM.
It doesn\[aq]t delete files from the destination.
.IP
.nf
\f[C]
rclone\ copyto\ source:path\ dest:path
\f[]
.fi
.SS rclone genautocomplete .SS rclone genautocomplete
.PP .PP
Output bash completion script for rclone. Output bash completion script for rclone.
@ -920,8 +965,7 @@ won\[aq]t do that, so will be less reliable than the rclone command.
All the remotes should work for read, but some may not for write All the remotes should work for read, but some may not for write
.RS 2 .RS 2
.IP \[bu] 2 .IP \[bu] 2
those which need to know the size in advance won\[aq]t \- eg B2, Amazon those which need to know the size in advance won\[aq]t \- eg B2
Drive
.IP \[bu] 2 .IP \[bu] 2
maybe should pass in size as \-1 to mean work it out maybe should pass in size as \-1 to mean work it out
.IP \[bu] 2 .IP \[bu] 2
@ -960,6 +1004,70 @@ rclone\ mount\ remote:path\ /path/to/mountpoint
\ \ \ \ \ \ \-\-write\-back\-cache\ \ \ \ \ \ \ \ \ \ Makes\ kernel\ buffer\ writes\ before\ sending\ them\ to\ rclone.\ Without\ this,\ writethrough\ caching\ is\ used. \ \ \ \ \ \ \-\-write\-back\-cache\ \ \ \ \ \ \ \ \ \ Makes\ kernel\ buffer\ writes\ before\ sending\ them\ to\ rclone.\ Without\ this,\ writethrough\ caching\ is\ used.
\f[] \f[]
.fi .fi
.SS rclone moveto
.PP
Move file or directory from source to dest.
.SS Synopsis
.PP
If source:path is a file or directory then it moves it to a file or
directory named dest:path.
.PP
This can be used to rename files or upload single files to other than
their existing name.
If the source is a directory then it acts exactly like the move command.
.PP
So
.IP
.nf
\f[C]
rclone\ moveto\ src\ dst
\f[]
.fi
.PP
where src and dst are rclone paths, either remote:path or /path/to/local
or C:.
.PP
This will:
.IP
.nf
\f[C]
if\ src\ is\ file
\ \ \ \ move\ it\ to\ dst,\ overwriting\ an\ existing\ file\ if\ it\ exists
if\ src\ is\ directory
\ \ \ \ move\ it\ to\ dst,\ overwriting\ existing\ files\ if\ they\ exist
\ \ \ \ see\ move\ command\ for\ full\ details
\f[]
.fi
.PP
This doesn\[aq]t transfer unchanged files, testing by size and
modification time or MD5SUM.
src will be deleted on successful transfer.
.PP
\f[B]Important\f[]: Since this can cause data loss, test first with the
\-\-dry\-run flag.
.IP
.nf
\f[C]
rclone\ moveto\ source:path\ dest:path
\f[]
.fi
.SS rclone rmdirs
.PP
Remove any empty directories under the path.
.SS Synopsis
.PP
This removes any empty directories (or directories that only contain
empty directories) under the path that it finds, including the path if
it has nothing in it.
.PP
This is useful for tidying up remotes that rclone has left a lot of
empty directories in.
.IP
.nf
\f[C]
rclone\ rmdirs\ remote:path
\f[]
.fi
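The same idea can be tried out on a local filesystem with `find` — a rough analogue only, not what rclone does internally, and the temp paths below are invented for the demo:

```shell
# Rough local analogue of `rclone rmdirs` using find.
# Build a throwaway tree: a/b/c is a chain of empty dirs, d holds a file.
demo=$(mktemp -d)
mkdir -p "$demo/a/b/c" "$demo/d"
touch "$demo/d/keep.txt"
# -depth visits children before parents, so directories that become empty
# once their empty children are deleted are removed as well.
find "$demo" -mindepth 1 -depth -type d -empty -delete
find "$demo" -mindepth 1 -type d    # only "$demo/d" survives
rm -rf "$demo"
```

As with `rclone rmdirs`, the chain `a/b/c` disappears entirely while `d` is kept because it still contains a file.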
.SS Copying single files .SS Copying single files
.PP .PP
rclone normally syncs or copies directories. rclone normally syncs or copies directories.
modified by the desktop sync client which doesn\[aq]t set checksums or modified by the desktop sync client which doesn\[aq]t set checksums or
modification times in the same way as rclone. modification times in the same way as rclone.
.SS \-\-stats=TIME .SS \-\-stats=TIME
.PP .PP
Rclone will print stats at regular intervals to show its progress. Commands which transfer data (\f[C]sync\f[], \f[C]copy\f[],
\f[C]copyto\f[], \f[C]move\f[], \f[C]moveto\f[]) will print data
transfer stats at regular intervals to show their progress.
.PP .PP
This sets the interval. This sets the interval.
.PP .PP
The default is \f[C]1m\f[]. The default is \f[C]1m\f[].
Use 0 to disable. Use 0 to disable.
.PP
If you set the stats interval then all commands can show stats.
This can be useful when running other commands, \f[C]check\f[] or
\f[C]mount\f[] for example.
.SS \-\-stats\-unit=bits|bytes
.PP
By default data transfer rates will be printed in bytes/second.
.PP
This option allows the data rate to be printed in bits/second.
.PP
Data transfer volume will still be reported in bytes.
.PP
The rate is reported as a binary unit, not an SI unit.
So 1 Mbit/s equals 1,048,576 bits/s and not 1,000,000 bits/s.
.PP
The default is \f[C]bytes\f[].
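A worked example of the binary interpretation, using an invented byte rate:

```shell
# 128 KiB/s expressed in bits/s is exactly one binary Mbit/s.
bytes_per_s=131072                  # 128 * 1024, an invented example rate
bits_per_s=$((bytes_per_s * 8))
echo "$bits_per_s"                            # prints 1048576
echo "$((bits_per_s / 1024 / 1024)) Mbit/s"   # prints "1 Mbit/s"
```

With SI units the same rate would read 1.05 Mbit/s, which is why the distinction matters when comparing against an ISP's advertised speed.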
.SS \-\-delete\-(before,during,after) .SS \-\-delete\-(before,during,after)
.PP .PP
This option allows you to specify when files on your destination are This option allows you to specify when files on your destination are
@ -1448,6 +1574,24 @@ If it is safe in your environment, you can set the
password, in which case it will be used for decrypting the password, in which case it will be used for decrypting the
configuration. configuration.
.PP .PP
You can set this for a session from a script.
For unix like systems save this to a file called
\f[C]set\-rclone\-password\f[]:
.IP
.nf
\f[C]
#!/bin/echo\ Source\ this\ file\ don\[aq]t\ run\ it
read\ \-s\ RCLONE_CONFIG_PASS
export\ RCLONE_CONFIG_PASS
\f[]
.fi
.PP
Then source the file when you want to use it.
From the shell you would do \f[C]source\ set\-rclone\-password\f[].
It will then ask you for the password and set it in the environment
variable.
.PP
If you are running rclone inside a script, you might want to disable If you are running rclone inside a script, you might want to disable
password prompts. password prompts.
To do that, pass the parameter \f[C]\-\-ask\-password=false\f[] to To do that, pass the parameter \f[C]\-\-ask\-password=false\f[] to
@ -1710,17 +1854,14 @@ exclude rules like \f[C]\-\-include\f[], \f[C]\-\-exclude\f[],
\f[C]\-\-filter\f[], or \f[C]\-\-filter\-from\f[]. \f[C]\-\-filter\f[], or \f[C]\-\-filter\-from\f[].
The simplest way to try them out is using the \f[C]ls\f[] command, or The simplest way to try them out is using the \f[C]ls\f[] command, or
\f[C]\-\-dry\-run\f[] together with \f[C]\-v\f[]. \f[C]\-\-dry\-run\f[] together with \f[C]\-v\f[].
.PP
\f[B]Important\f[] Due to limitations of the command line parser you can
only use any of these options once \- if you duplicate them then rclone
will use the last one only.
.SS Patterns .SS Patterns
.PP .PP
The patterns used to match files for inclusion or exclusion are based on The patterns used to match files for inclusion or exclusion are based on
"file globs" as used by the unix shell. "file globs" as used by the unix shell.
.PP .PP
If the pattern starts with a \f[C]/\f[] then it only matches at the top If the pattern starts with a \f[C]/\f[] then it only matches at the top
level of the directory tree, relative to the root of the remote. level of the directory tree, \f[B]relative to the root of the remote\f[]
(not necessarily the root of the local drive).
If it doesn\[aq]t start with \f[C]/\f[] then it is matched starting at If it doesn\[aq]t start with \f[C]/\f[] then it is matched starting at
the \f[B]end of the path\f[], but it will only match a complete path the \f[B]end of the path\f[], but it will only match a complete path
element: element:
@ -1850,13 +1991,14 @@ Rclone always does a wildcard match so \f[C]\\\f[] must always escape a
\f[C]\\\f[]. \f[C]\\\f[].
.SS How the rules are used .SS How the rules are used
.PP .PP
Rclone maintains a list of include rules and exclude rules. Rclone maintains a combined list of include rules and exclude rules.
.PP .PP
Each file is matched in order against the list until it finds a match. Each file is matched in order, starting from the top, against the rule
in the list until it finds a match.
The file is then included or excluded according to the rule type. The file is then included or excluded according to the rule type.
.PP .PP
If the matcher falls off the bottom of the list then the path is If the matcher fails to find a match after testing against all the
included. entries in the list then the path is included.
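This first-match-wins evaluation can be sketched in shell — illustrative only, since rclone's actual matcher is implemented in Go:

```shell
# Sketch of first-match-wins rule evaluation.
# Rules are "+ glob" (include) or "- glob" (exclude); no match means include.
match() {
  path=$1; shift
  for rule in "$@"; do
    sign=${rule%% *}
    pat=${rule#* }
    # $pat is deliberately unquoted so it is used as a glob pattern
    case $path in
      $pat) [ "$sign" = "+" ] && echo include || echo exclude; return ;;
    esac
  done
  echo include   # fell off the bottom of the list, so included
}
match "secret.bak" "+ *.jpg" "- *.bak"   # prints "exclude"
match "notes.txt"  "+ *.jpg" "- *.bak"   # prints "include"
```

Because the first matching rule wins, the order of the list matters: with `- *.jpg` before `+ *.jpg`, a jpg is excluded even though a later rule would include it.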
.PP .PP
For example given the following rules, \f[C]+\f[] being include, For example given the following rules, \f[C]+\f[] being include,
\f[C]\-\f[] being exclude, \f[C]\-\f[] being exclude,
@ -1893,15 +2035,48 @@ google drive, onedrive, amazon drive) and not on bucket based remotes
.SS Adding filtering rules .SS Adding filtering rules
.PP .PP
Filtering rules are added with the following command line flags. Filtering rules are added with the following command line flags.
.SS Repeating options
.PP
You can repeat the following options to add more than one rule of that
type.
.IP \[bu] 2
\f[C]\-\-include\f[]
.IP \[bu] 2
\f[C]\-\-include\-from\f[]
.IP \[bu] 2
\f[C]\-\-exclude\f[]
.IP \[bu] 2
\f[C]\-\-exclude\-from\f[]
.IP \[bu] 2
\f[C]\-\-filter\f[]
.IP \[bu] 2
\f[C]\-\-filter\-from\f[]
.PP
Note that all the options of the same type are processed together in the
order above, regardless of what order they were placed on the command
line.
.PP
So all \f[C]\-\-include\f[] options are processed first in the order
they appeared on the command line, then all \f[C]\-\-include\-from\f[]
options etc.
.PP
To mix up the order includes and excludes, the \f[C]\-\-filter\f[] flag
can be used.
.SS \f[C]\-\-exclude\f[] \- Exclude files matching pattern .SS \f[C]\-\-exclude\f[] \- Exclude files matching pattern
.PP .PP
Add a single exclude rule with \f[C]\-\-exclude\f[]. Add a single exclude rule with \f[C]\-\-exclude\f[].
.PP .PP
This flag can be repeated.
See above for the order the flags are processed in.
.PP
Eg \f[C]\-\-exclude\ *.bak\f[] to exclude all bak files from the sync. Eg \f[C]\-\-exclude\ *.bak\f[] to exclude all bak files from the sync.
.SS \f[C]\-\-exclude\-from\f[] \- Read exclude patterns from file .SS \f[C]\-\-exclude\-from\f[] \- Read exclude patterns from file
.PP .PP
Add exclude rules from a file. Add exclude rules from a file.
.PP .PP
This flag can be repeated.
See above for the order the flags are processed in.
.PP
Prepare a file like this \f[C]exclude\-file.txt\f[] Prepare a file like this \f[C]exclude\-file.txt\f[]
.IP .IP
.nf .nf
@ -1921,6 +2096,9 @@ This is useful if you have a lot of rules.
.PP .PP
Add a single include rule with \f[C]\-\-include\f[]. Add a single include rule with \f[C]\-\-include\f[].
.PP .PP
This flag can be repeated.
See above for the order the flags are processed in.
.PP
Eg \f[C]\-\-include\ *.{png,jpg}\f[] to include all \f[C]png\f[] and Eg \f[C]\-\-include\ *.{png,jpg}\f[] to include all \f[C]png\f[] and
\f[C]jpg\f[] files in the backup and no others. \f[C]jpg\f[] files in the backup and no others.
.PP .PP
@ -1936,6 +2114,9 @@ If this doesn\[aq]t provide enough flexibility then you must use
.PP .PP
Add include rules from a file. Add include rules from a file.
.PP .PP
This flag can be repeated.
See above for the order the flags are processed in.
.PP
Prepare a file like this \f[C]include\-file.txt\f[] Prepare a file like this \f[C]include\-file.txt\f[]
.IP .IP
.nf .nf
@ -1969,12 +2150,18 @@ Include rules start with \f[C]+\f[] and exclude rules start with
A special rule called \f[C]!\f[] can be used to clear the existing A special rule called \f[C]!\f[] can be used to clear the existing
rules. rules.
.PP .PP
This flag can be repeated.
See above for the order the flags are processed in.
.PP
Eg \f[C]\-\-filter\ "\-\ *.bak"\f[] to exclude all bak files from the Eg \f[C]\-\-filter\ "\-\ *.bak"\f[] to exclude all bak files from the
sync. sync.
.SS \f[C]\-\-filter\-from\f[] \- Read filtering patterns from a file .SS \f[C]\-\-filter\-from\f[] \- Read filtering patterns from a file
.PP .PP
Add include/exclude rules from a file. Add include/exclude rules from a file.
.PP .PP
This flag can be repeated.
See above for the order the flags are processed in.
.PP
Prepare a file like this \f[C]filter\-file.txt\f[] Prepare a file like this \f[C]filter\-file.txt\f[]
.IP .IP
.nf .nf
@ -2002,6 +2189,9 @@ This reads a list of file names from the file passed in and
\f[B]only\f[] these files are transferred. \f[B]only\f[] these files are transferred.
The filtering rules are ignored completely if you use this option. The filtering rules are ignored completely if you use this option.
.PP .PP
This option can be repeated to read from more than one file.
These are read in the order that they are placed on the command line.
.PP
Prepare a file like this \f[C]files\-from.txt\f[] Prepare a file like this \f[C]files\-from.txt\f[]
.IP .IP
.nf .nf
@ -2511,9 +2701,9 @@ Yes
T}@T{ T}@T{
No No
T}@T{ T}@T{
No #721 (https://github.com/ncw/rclone/issues/721) Yes
T}@T{ T}@T{
No #721 (https://github.com/ncw/rclone/issues/721) Yes
T}@T{ T}@T{
No #575 (https://github.com/ncw/rclone/issues/575) No #575 (https://github.com/ncw/rclone/issues/575)
T} T}
@ -2805,7 +2995,7 @@ rclone will choose a format from the default list.
If you prefer an archive copy then you might use If you prefer an archive copy then you might use
\f[C]\-\-drive\-formats\ pdf\f[], or if you prefer \f[C]\-\-drive\-formats\ pdf\f[], or if you prefer
openoffice/libreoffice formats you might use openoffice/libreoffice formats you might use
\f[C]\-\-drive\-formats\ ods,odt\f[]. \f[C]\-\-drive\-formats\ ods,odt,odp\f[].
.PP .PP
Note that rclone adds the extension to the google doc, so if it is Note that rclone adds the extension to the google doc, so if it is
called \f[C]My\ Spreadsheet\f[] on google docs, it will be exported as called \f[C]My\ Spreadsheet\f[] on google docs, it will be exported as
@ -4188,8 +4378,10 @@ This means that larger files are likely to fail.
Unfortunately there is no way for rclone to see that this failure is Unfortunately there is no way for rclone to see that this failure is
because of file size, so it will retry the operation, as any other because of file size, so it will retry the operation, as any other
failure. failure.
To avoid this problem, use \f[C]\-\-max\-size\ 50G\f[] option to limit To avoid this problem, use \f[C]\-\-max\-size\ 50000M\f[] option to
the maximum size of uploaded files. limit the maximum size of uploaded files.
Note that \f[C]\-\-max\-size\f[] does not split files into segments, it
only ignores files over this size.
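.PP
For example, a copy that skips anything over the 50\ GB limit might
look like this (the remote name and paths are illustrative):
.IP
.nf
\f[C]
rclone\ copy\ \-\-max\-size\ 50000M\ /home/me/videos\ remote:videos
\f[]
.fi
.PP
Files larger than this are simply excluded from the transfer; they are
not uploaded in parts.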
.SS Microsoft One Drive
.PP
Paths are specified as \f[C]remote:path\f[]
Rclone will map these names to and from an identical looking unicode
equivalent.
For example, if a file has a \f[C]?\f[] in it, it will be mapped to
\f[C]？\f[] instead.
.PP
The largest allowed file size is 10GiB (10,737,418,240 bytes).
.SS Hubic
.PP
Paths are specified as \f[C]remote:path\f[]
On systems where it isn\[aq]t supported (eg Windows) it will not appear
as a valid flag.
.SS Changelog
.IP \[bu] 2
v1.35 \- 2017\-01\-02
.RS 2
.IP \[bu] 2
New Features
.IP \[bu] 2
moveto and copyto commands for choosing a destination name on copy/move
.IP \[bu] 2
rmdirs command to recursively delete empty directories
.IP \[bu] 2
Allow repeated \-\-include/\-\-exclude/\-\-filter options
.IP \[bu] 2
Only show transfer stats on commands which transfer stuff
.RS 2
.IP \[bu] 2
show stats on any command using the \f[C]\-\-stats\f[] flag
.RE
.IP \[bu] 2
Allow overlapping directories in move when server side dir move is
supported
.IP \[bu] 2
Add \-\-stats\-unit option \- thanks Scott McGillivray
.IP \[bu] 2
Bug Fixes
.IP \[bu] 2
Fix the config file being overwritten when two rclones are running
.IP \[bu] 2
Make rclone lsd obey the filters properly
.IP \[bu] 2
Fix compilation on mips
.IP \[bu] 2
Fix not transferring files that don\[aq]t differ in size
.IP \[bu] 2
Fix panic on nil retry/fatal error
.IP \[bu] 2
Mount
.IP \[bu] 2
Retry reads on error \- should help with reliability a lot
.IP \[bu] 2
Report the modification times for directories from the remote
.IP \[bu] 2
Add bandwidth accounting and limiting (fixes \-\-bwlimit)
.IP \[bu] 2
If \-\-stats is provided it will show stats and which files are
transferring
.IP \[bu] 2
Support R/W files if truncate is set.
.IP \[bu] 2
Implement statfs interface so df works
.IP \[bu] 2
Note that write is now supported on Amazon Drive
.IP \[bu] 2
Report number of blocks in a file \- thanks Stefan Breunig
.IP \[bu] 2
Crypt
.IP \[bu] 2
Prevent the user pointing crypt at itself
.IP \[bu] 2
Fix failed to authenticate decrypted block errors
.RS 2
.IP \[bu] 2
these will now return the underlying unexpected EOF instead
.RE
.IP \[bu] 2
Amazon Drive
.IP \[bu] 2
Add support for server side move and directory move \- thanks Stefan
Breunig
.IP \[bu] 2
Fix nil pointer deref on size attribute
.IP \[bu] 2
B2
.IP \[bu] 2
Use new prefix and delimiter parameters in directory listings
.RS 2
.IP \[bu] 2
This makes \-\-max\-depth 1 dir listings as used in mount much faster
.RE
.IP \[bu] 2
Reauth the account while doing uploads too \- should help with token
expiry
.IP \[bu] 2
Drive
.IP \[bu] 2
Make DirMove more efficient and complain about moving the root
.IP \[bu] 2
Create destination directory on Move()
.RE
.IP \[bu] 2
v1.34 \- 2016\-11\-06
.RS 2
.IP \[bu] 2
Felix Bünemann <buenemann@louis.info>
.IP \[bu] 2
Durval Menezes <jmrclone@durval.com>
.IP \[bu] 2
Luiz Carlos Rumbelsperger Viana <maxd13_luiz_carlos@hotmail.com>
.IP \[bu] 2
Stefan Breunig <stefan-github@yrden.de>
.IP \[bu] 2
Alishan Ladhani <ali-l@users.noreply.github.com>
.IP \[bu] 2
0xJAKE <0xJAKE@users.noreply.github.com>
.IP \[bu] 2
Thibault Molleman <thibaultmol@users.noreply.github.com>
.IP \[bu] 2
Scott McGillivray <scott.mcgillivray@gmail.com>
.SH Contact the rclone project
.SS Forum
.PP