b2: correct docs on SHA1s on large files

s3-about
Nick Craig-Wood 2017-11-03 12:49:15 +00:00
parent f60e2a7aac
commit 6552581a17
1 changed file with 16 additions and 11 deletions

@@ -116,8 +116,22 @@ method to set the modification time independent of doing an upload.
The SHA1 checksums of the files are checked on upload and download and
will be used in the syncing process.
-Large files which are uploaded in chunks will store their SHA1 on the
-object as `X-Bz-Info-large_file_sha1` as recommended by Backblaze.
+Large files (bigger than the limit in `--b2-upload-cutoff`) which are
+uploaded in chunks will store their SHA1 on the object as
+`X-Bz-Info-large_file_sha1` as recommended by Backblaze.
+
+For a large file to be uploaded with an SHA1 checksum, the source
+needs to support SHA1 checksums. The local disk supports SHA1
+checksums so large file transfers from local disk will have an SHA1.
+See [the overview](/overview/#features) for exactly which remotes
+support SHA1.
+
+Sources which don't support SHA1, in particular `crypt`, will upload
+large files without SHA1 checksums. This may be fixed in the future
+(see [#1767](https://github.com/ncw/rclone/issues/1767)).
+
+File sizes below `--b2-upload-cutoff` will always have an SHA1
+regardless of the source.
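To make the requirement concrete, here is a minimal Go sketch (not rclone's actual implementation — rclone obtains the SHA1 from the source backend) of how a whole-file SHA1 can be accumulated while a file is read chunk by chunk and then attached to the object as `large_file_sha1` file info, which B2 surfaces as the `X-Bz-Info-large_file_sha1` header. The file name, chunk size, and the commented-out part-upload step are illustrative placeholders.

```go
package main

import (
	"crypto/sha1"
	"encoding/hex"
	"fmt"
	"io"
	"os"
)

// chunkSize stands in for the configured chunk size; the value here is
// an illustrative placeholder only.
const chunkSize = 5 * 1024 * 1024

func main() {
	f, err := os.Open("big-file.bin") // hypothetical local source file
	if err != nil {
		panic(err)
	}
	defer f.Close()

	whole := sha1.New() // accumulates the SHA1 of the complete file
	buf := make([]byte, chunkSize)

	for {
		n, err := io.ReadFull(f, buf)
		if n > 0 {
			whole.Write(buf[:n])
			// A real uploader would send buf[:n] as one large-file part here.
		}
		if err == io.EOF || err == io.ErrUnexpectedEOF {
			break
		}
		if err != nil {
			panic(err)
		}
	}

	// The finished digest is what ends up stored on the object, appearing
	// as the X-Bz-Info-large_file_sha1 header when the object is fetched.
	fileInfo := map[string]string{
		"large_file_sha1": hex.EncodeToString(whole.Sum(nil)),
	}
	fmt.Println(fileInfo)
}
```

A source that cannot produce a SHA1 up front (such as `crypt`) has nothing to put in `large_file_sha1`, which is why such uploads currently go without it.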
### Transfers ###
@@ -233,15 +247,6 @@ start and finish the upload) and another 2 requests for each chunk:
/b2api/v1/b2_finish_large_file
```
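Given the request pattern described above (one call each to start and finish the upload, plus another two per chunk), a rough request count can be estimated with the following Go sketch; the helper name is hypothetical and not part of rclone.

```go
// largeFileRequests is an illustrative helper (not part of rclone) that
// estimates the number of B2 API calls for a chunked upload: one call each
// to start and finish the large file, plus two calls per chunk.
func largeFileRequests(fileSize, chunkSize int64) int64 {
	chunks := (fileSize + chunkSize - 1) / chunkSize // round up
	return 2 + 2*chunks
}
```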
-### B2 with crypt ###
-
-When using B2 with `crypt` files are encrypted into a temporary
-location and streamed from there. This is required to calculate the
-encrypted file's checksum before beginning the upload. On Windows the
-%TMPDIR% environment variable is used as the temporary location. If
-the file requires chunking, both the chunking and encryption will take
-place in memory.
### Specific options ###
Here are the command line options specific to this cloud storage