Update docs for Dropbox and Google Cloud Storage

s3-about
Nick Craig-Wood 2014-07-17 20:03:11 +01:00
parent 7d8bac2711
commit 07f9a1a9f0
11 changed files with 282 additions and 22 deletions


@ -12,6 +12,8 @@ Rclone is a command line program to sync files and directories to and from
* Google Drive
* Amazon S3
* Openstack Swift / Rackspace cloud files / Memset Memstore
* Dropbox
* Google Cloud Storage
* The local filesystem
Features
@ -86,7 +88,11 @@ first with the `--dry-run` flag.
rclone ls [remote:path]
List all the objects in the path.
List all the objects in the path with sizes.
rclone lsl [remote:path]
List all the objects in the path with sizes and timestamps.
rclone lsd [remote:path]
@ -111,6 +117,11 @@ Checks the files in the source and destination match. It
compares sizes and MD5SUMs and prints a report of files which
don't match. It doesn't alter the source or destination.
rclone md5sum remote:path
Produces an md5sum file for all the objects in the path. This is in
the same format as the standard md5sum tool produces.
General options:
```
@ -173,7 +184,7 @@ The modified time is stored as metadata on the object as
Google drive
------------
Paths are specified as drive:path. Drive paths may be as deep as required.
Paths are specified as remote:path. Drive paths may be as deep as required.
The initial setup for drive involves getting a token from Google drive
which you need to do in your browser. `rclone config` walks you
@ -181,9 +192,45 @@ through it.
To copy a local directory to a drive directory called backup
rclone copy /home/source drv:backup
rclone copy /home/source remote:backup
Google drive stores modification times accurate to 1 ms.
Google drive stores modification times accurate to 1 ms natively.
Dropbox
-------
Paths are specified as remote:path. Dropbox paths may be as deep as required.
The initial setup for dropbox involves getting a token from Dropbox
which you need to do in your browser. `rclone config` walks you
through it.
To copy a local directory to a dropbox directory called backup
rclone copy /home/source dropbox:backup
Md5sums and timestamps in RFC3339 format accurate to 1ns are stored in
a Dropbox datastore called "rclone". Dropbox datastores are limited
to 100,000 rows so this is the maximum number of files rclone can
manage on Dropbox.
Google Cloud Storage
--------------------
Paths are specified as remote:path. Google Cloud Storage paths may be
as deep as required.
The initial setup for Google Cloud Storage involves getting a token
from Google which you need to do in your browser. `rclone config`
walks you through it.
To copy a local directory to a google cloud storage directory called backup
rclone copy /home/source remote:backup
Google Cloud Storage stores md5sums natively and rclone stores
modification times as metadata on the object, under the "mtime" key in
RFC3339 format accurate to 1ns.
Single file copies
------------------


@ -5,7 +5,7 @@
"menu": "menu"
},
"baseurl": "http://rclone.org",
"title": "rclone - rsync for object storage",
"description": "rclone - rsync for object storage: google drive, s3, swift, cloudfiles, memstore...",
"title": "rclone - rsync for cloud storage",
"description": "rclone - rsync for cloud storage: google drive, s3, swift, cloudfiles, dropbox, memstore...",
"canonifyurls": true
}


@ -1,8 +1,8 @@
---
title: "Rclone"
description: "rclone syncs files to and from Google Drive, S3, Swift and Cloudfiles."
description: "rclone syncs files to and from Google Drive, S3, Swift, Cloudfiles, Dropbox and Google Cloud Storage."
type: page
date: "2014-04-26"
date: "2014-07-17"
groups: ["about"]
---
@ -16,6 +16,8 @@ Rclone is a command line program to sync files and directories to and from
* Google Drive
* Amazon S3
* Openstack Swift / Rackspace cloud files / Memset Memstore
* Dropbox
* Google Cloud Storage
* The local filesystem
Features


@ -1,7 +1,7 @@
---
title: "Documentation"
description: "Rclone Documentation"
date: "2014-04-26"
date: "2014-07-17"
---
Install
@ -71,11 +71,15 @@ first with the -dry-run flag.
rclone ls [remote:path]
List all the objects in the path.
List all the objects in the path with sizes.
rclone lsl [remote:path]
List all the objects in the path with sizes and timestamps (see the example output below).
rclone lsd [remote:path]
List all directoryes/objects/buckets in the path.
List all directories/objects/buckets in the path.
rclone mkdir remote:path
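As a rough illustration of the listing output (the remote and file names below are made up, and the exact column spacing may differ), `rclone lsl` prints the size, modification time and path of each object on one line:

    rclone lsl remote:backup
        42 2014-07-17 20:03:11.123456789 photos/holiday.jpg
        10 2014-07-16 09:12:45.000000000 notes.txt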
@ -96,6 +100,10 @@ Checks the files in the source and destination match. It
compares sizes and MD5SUMs and prints a report of files which
don't match. It doesn't alter the source or destination.
rclone md5sum remote:path
Produces an md5sum file for all the objects in the path. This is in
the same format as the standard md5sum tool produces.
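Because that output follows the standard md5sum format, it can be fed straight back to the md5sum tool to verify a local copy of the same tree. A minimal sketch (the remote name, paths and SUMS file are only examples):

    # write checksums for every object under remote:backup
    rclone md5sum remote:backup > /tmp/SUMS
    # check a local copy of the same files against those checksums
    cd /home/source && md5sum -c /tmp/SUMS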
General options:
```


@ -4,10 +4,12 @@ description: "Rclone docs for Google drive"
date: "2014-04-26"
---
<i class="fa fa-google"></i> Google Drive
-----------------------------------------
Paths are specified as `drive:path`
Drive paths may be as deep as required, eg
`drive:directory/subdirectory`.
Drive paths may be as deep as required, eg `drive:directory/subdirectory`.
The initial setup for drive involves getting a token from Google drive
which you need to do in your browser. `rclone config` walks you

docs/content/dropbox.md Normal file

@ -0,0 +1,80 @@
---
title: "Dropbox"
description: "Rclone docs for Dropbox"
date: "2014-07-17"
---
<i class="fa fa-dropbox"></i> Dropbox
---------------------------------
Paths are specified as `remote:path`
Dropbox paths may be as deep as required, eg
`remote:directory/subdirectory`.
The initial setup for dropbox involves getting a token from Dropbox
which you need to do in your browser. `rclone config` walks you
through it.
Here is an example of how to make a remote called `remote`. First run:
rclone config
This will guide you through an interactive setup process:
```
n) New remote
d) Delete remote
q) Quit config
e/n/d/q> n
name> remote
What type of source is it?
Choose a number from below
1) swift
2) s3
3) local
4) google cloud storage
5) dropbox
6) drive
type> 5
Dropbox App Key - leave blank to use rclone's.
app_key>
Dropbox App Secret - leave blank to use rclone's.
app_secret>
Remote config
Please visit:
https://www.dropbox.com/1/oauth2/authorize?client_id=XXXXXXXXXXXXXXX&response_type=code
Enter the code: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX_XXXXXXXXXX
--------------------
[remote]
app_key =
app_secret =
token = XXXXXXXXXXXXXXXXXXXXXXXXXXXXX_XXXX_XXXXXXXXXXXXXXXXXXXXXXXXXXXXX
--------------------
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d> y
```
You can then use it like this,
List directories in top level of your dropbox
rclone lsd remote:
List all the files in your dropbox
rclone ls remote:
To copy a local directory to a dropbox directory called backup
rclone copy /home/source remote:backup
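To keep that backup up to date afterwards, sync can be used in the same way; as the general docs recommend, it is worth previewing with the `--dry-run` flag first (paths as in the example above):

    rclone --dry-run sync /home/source remote:backup
    rclone sync /home/source remote:backup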
Modified time
-------------
Md5sums and timestamps in RFC3339 format accurate to 1ns are stored in
a Dropbox datastore called "rclone". Dropbox datastores are limited
to 100,000 rows so this is the maximum number of files rclone can
manage on Dropbox.
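A quick, if rough, way to see whether a directory tree is anywhere near that limit before syncing it is to count the files locally (the path is only an example):

    rclone ls /home/source | wc -l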


@ -0,0 +1,117 @@
---
title: "Google Cloud Storage"
description: "Rclone docs for Google Cloud Storage"
date: "2014-07-17"
---
<i class="fa fa-google"></i> Google Cloud Storage
-------------------------------------------------
Paths are specified as `remote:bucket` (or `remote:` for the `lsd`
command). You may put subdirectories in too, eg `remote:bucket/path/to/dir`.
The initial setup for Google Cloud Storage involves getting a token from Google
which you need to do in your browser. `rclone config` walks you
through it.
Here is an example of how to make a remote called `remote`. First run:
rclone config
This will guide you through an interactive setup process:
```
n) New remote
d) Delete remote
q) Quit config
e/n/d/q> n
name> remote
What type of source is it?
Choose a number from below
1) swift
2) s3
3) local
4) google cloud storage
5) dropbox
6) drive
type> 4
Google Application Client Id - leave blank to use rclone's.
client_id>
Google Application Client Secret - leave blank to use rclone's.
client_secret>
Project number optional - needed only for list/create/delete buckets - see your developer console.
project_number> 12345678
Access Control List for new objects.
Choose a number from below, or type in your own value
* Object owner gets OWNER access, and all Authenticated Users get READER access.
1) authenticatedRead
* Object owner gets OWNER access, and project team owners get OWNER access.
2) bucketOwnerFullControl
* Object owner gets OWNER access, and project team owners get READER access.
3) bucketOwnerRead
* Object owner gets OWNER access [default if left blank].
4) private
* Object owner gets OWNER access, and project team members get access according to their roles.
5) projectPrivate
* Object owner gets OWNER access, and all Users get READER access.
6) publicRead
object_acl> 4
Access Control List for new buckets.
Choose a number from below, or type in your own value
* Project team owners get OWNER access, and all Authenticated Users get READER access.
1) authenticatedRead
* Project team owners get OWNER access [default if left blank].
2) private
* Project team members get access according to their roles.
3) projectPrivate
* Project team owners get OWNER access, and all Users get READER access.
4) publicRead
* Project team owners get OWNER access, and all Users get WRITER access.
5) publicReadWrite
bucket_acl> 2
Remote config
Go to the following link in your browser
https://accounts.google.com/o/oauth2/auth?access_type=&approval_prompt=&client_id=XXXXXXXXXXXX.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&response_type=code&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdevstorage.full_control&state=state
Log in, then type paste the token that is returned in the browser here
Enter verification code> x/xxxxxxxxxxxxxxxxxxxxxxxxxxxx.xxxxxxxxxxxxxxxxxxxxxx_xxxxxxxx
--------------------
[remote]
type = google cloud storage
client_id =
client_secret =
token = {"AccessToken":"xxxx.xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx","RefreshToken":"x/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx_xxxxxxxxx","Expiry":"2014-07-17T20:49:14.929208288+01:00","Extra":null}
project_number = 12345678
object_acl = private
bucket_acl = private
--------------------
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d> y
```
This remote is called `remote` and can now be used like this
See all the buckets in your project
rclone lsd remote:
Make a new bucket
rclone mkdir remote:bucket
List the contents of a bucket
rclone ls remote:bucket
Sync `/home/local/directory` to the remote bucket, deleting any excess
files in the bucket.
rclone sync /home/local/directory remote:bucket
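Since md5sums are stored natively, `rclone check` (see the main docs) is a cheap way to verify the sync afterwards; the paths below simply reuse the example above:

    rclone check /home/local/directory remote:bucket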
Modified time
-------------
Google Cloud Storage stores md5sums natively and rclone stores
modification times as metadata on the object, under the "mtime" key in
RFC3339 format accurate to 1ns.


@ -4,8 +4,8 @@ description: "Rclone docs for the local filesystem"
date: "2014-04-26"
---
Local Filesystem
----------------
<i class="fa fa-file"></i> Local Filesystem
-------------------------------------------
Local paths are specified as normal filesystem paths, eg `/path/to/wherever`, so


@ -4,13 +4,11 @@ description: "Rclone docs for Amazon S3"
date: "2014-04-26"
---
Paths are specified as `remote:bucket` or `remote:`
<i class="fa fa-archive"></i> Amazon S3
---------------------------------------
S3 paths can't refer to subdirectories within a bucket (yet).
So to copy a local directory to a s3 container called backup
rclone sync /home/source s3:backup
Paths are specified as `remote:bucket` (or `remote:` for the `lsd`
command). You may put subdirectories in too, eg `remote:bucket/path/to/dir`.
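For example, now that paths can point inside a bucket, a local tree can be copied into a subdirectory of an existing bucket (the names here are only illustrative):

    rclone copy /home/source remote:bucket/backup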
Here is an example of making an s3 configuration. First run


@ -4,13 +4,17 @@ description: "Swift"
date: "2014-04-26"
---
<i class="fa fa-space-shuttle"></i> Swift
----------------------------------------
Swift refers to [Openstack Object Storage](http://www.openstack.org/software/openstack-storage/).
Commercial implementations of that include:
* [Rackspace Cloud Files](http://www.rackspace.com/cloud/files/)
* [Memset Memstore](http://www.memset.com/cloud/storage/)
Paths are specified as `remote:container` or `remote:`
Paths are specified as `remote:container` (or `remote:` for the `lsd`
command). You may put subdirectories in too, eg `remote:container/path/to/dir`.
Here is an example of making a swift configuration. First run


@ -20,6 +20,8 @@
<li><a href="/drive/"><i class="fa fa-google"></i> Drive</a></li>
<li><a href="/s3/"><i class="fa fa-archive"></i> S3</a></li>
<li><a href="/swift/"><i class="fa fa-space-shuttle"></i> Swift</a></li>
<li><a href="/dropbox/"><i class="fa fa-dropbox"></i> Dropbox</a></li>
<li><a href="/googlecloudstorage/"><i class="fa fa-google"></i> Google Cloud Storage</a></li>
<li><a href="/local/"><i class="fa fa-file"></i> Local</a></li>
</ul>
</li>