Before you can start creating jobs that send data to or receive data from the cloud, you need to configure a cloud storage connection in Settings -> Storage Connectors:
Resilio Connect Agents can work with several types of cloud storage. Each type is configured separately. To be able to add cloud storage, the MC license must include the "Storage connector" feature in at least one package.
No connection probed
The Management Console itself won't probe the connection to your cloud storage. Connection tests are performed by one of the chosen Agents, or by the Agent configured in the job.
- Name of the storage (any name that will help you later identify this storage)
- Description. Optional
- Access key ID. Should be copied from your Amazon account
- Secret access key. Should be copied from your Amazon account
- Region name. Specify it as dash-delimited text, as shown in the "Region" column of the officially supported AWS regions (for example, us-east-1).
- Use bucket. Optional*. Can be left empty if the access keys have permission to list the buckets; otherwise, it won't be possible to test the connection to the cloud or browse through buckets.
- Starting with MC v3.3, access by Amazon IAM Roles is supported. The bucket checkbox is optional. If it's checked, the Agents' connections will be limited to that bucket only. If not, all the buckets that the IAM Roles allow will be listed by the Agent in the path picker when configuring the job (permission to list the buckets does not guarantee write access inside them).
Note: after the role is revoked, the Agent will not report access errors until the access token expires or the Agent is restarted.
(*) Empty bucket name
If you leave it empty, the Agent will be able to access all buckets available to the provided access keys. Thus you MUST specify a bucket name as part of the path during job configuration, and that bucket MUST already exist in the storage.
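If you want to sanity-check the region string before saving the connector, the dash-delimited identifier format can be approximated with a small regular expression. A minimal sketch in Python; the pattern and helper name are my own illustration, not part of Resilio or AWS tooling:

```python
import re

# AWS region identifiers are dash-delimited, e.g. "us-east-1" or
# "ap-southeast-2". This pattern is a loose approximation of that shape.
REGION_RE = re.compile(r"^[a-z]{2}(-[a-z]+)+-\d+$")

def looks_like_aws_region(region: str) -> bool:
    """Return True if the string matches the dash-delimited region format."""
    return bool(REGION_RE.match(region))

print(looks_like_aws_region("us-east-1"))              # True
print(looks_like_aws_region("US East (N. Virginia)"))  # False: use the identifier, not the display name
```

The check only catches the common mistake of entering a region display name instead of its identifier; it does not verify the region actually exists.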
- Name of the storage (any name that will help you later identify this storage)
- Description. Optional
- Access key or SAS token. You can get it in Home -> Access keys or Shared access signature of your Storage account
- Storage account name
- Leave the endpoint at its default value unless you are using a non-default connection endpoint
- Use container. Optional*. Can be left empty if the access keys have permission to list the containers; otherwise, it won't be possible to test the connection to the cloud or browse through containers.
(*) Empty container name
If you leave it empty, the Agent will be able to access all containers available to the provided access keys. Thus you MUST specify a container name as part of the path during job configuration, and that container MUST already exist in the storage.
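As a reference for the endpoint field: in public Azure, the default Blob endpoint is derived from the storage account name. A sketch of that derivation (the function is illustrative, not Resilio code; the sovereign-cloud suffix is an example of when a non-default endpoint applies):

```python
def blob_endpoint(account_name: str, suffix: str = "core.windows.net") -> str:
    """Default Azure Blob endpoint for a storage account.

    The suffix differs in sovereign clouds (e.g. Azure China uses
    "core.chinacloudapi.cn"), which is when you would override the
    default endpoint in the connector.
    """
    return f"https://{account_name}.blob.{suffix}"

print(blob_endpoint("mystorageacct"))
# https://mystorageacct.blob.core.windows.net
```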
- Name of the storage (any name that will help you later identify this storage)
- Description. Optional
- Access key or SAS token. You can get it in Home -> Access keys or Shared access signature of your Storage account
- Storage account name
- Leave the endpoint at its default value unless you are using a non-default connection endpoint
- Use share. Optional*. Can be left empty if the access keys have permission to list the shares; otherwise, it won't be possible to test the connection to the cloud or browse through shares.
(*) Empty share name
If you leave it empty, the Agent will be able to access all shares available to the provided access keys. Thus you MUST specify a share name as part of the path during job configuration, and that share MUST already exist in the storage.
- Name of the storage (any name that will help you later identify this storage)
- Description. Optional
- Endpoint. The endpoint that the Agent will send S3 API requests to*
- Access Key ID. Should be provided to you by your S3-compatible storage vendor
- Secret access key. Should be provided to you by your S3-compatible storage vendor
- Region. Specify the bucket's region.
- 'Use SSL' checkbox. If checked, the TLS protocol will be used for communication with the storage.
- Use bucket. Optional**. Can be left empty if the access keys have permission to list the buckets; otherwise, it won't be possible to test the connection to the cloud or browse through buckets.
(*) S3 Compatible Storages' peculiarities
Each object storage, and S3-compatible storage in particular, has its own specifics that may require additional settings. See the "Limitations and peculiarities" block here for more details. Storages known NOT to be supported: Amazon Glacier, Minio, Hitachi S3.
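To illustrate how the endpoint field and the 'Use SSL' checkbox combine, here is a hedged sketch of the base URL an S3 API request would be addressed to (the host and port are placeholders; the function is an illustration, not Resilio's implementation):

```python
def s3_base_url(endpoint: str, use_ssl: bool) -> str:
    """Build the base URL for S3 API requests from the connector settings.

    `endpoint` is the host (optionally with a port) entered in the
    connector; the 'Use SSL' checkbox selects the scheme.
    """
    scheme = "https" if use_ssl else "http"
    return f"{scheme}://{endpoint}"

print(s3_base_url("s3.example.com:9000", use_ssl=True))
# https://s3.example.com:9000
print(s3_base_url("s3.example.com:9000", use_ssl=False))
# http://s3.example.com:9000
```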
(**) Empty bucket name
If you leave it empty, the Agent will be able to access all buckets available to the provided access keys. Thus you MUST specify a bucket name as part of the path during job configuration, and that bucket MUST already exist in the storage.
- Name of the storage as it appears everywhere in the MC
- Description. Optional
- Access key. Get it (or create one) in your project settings
- Secret access key. Can also be obtained in Storage Settings
- Project ID. Can be found in the Project info.
- Use bucket. Optional*. Can be left empty if the access keys have permission to list the buckets; otherwise, it won't be possible to test the connection to the cloud or browse through buckets.
(*) Empty bucket name
If you leave it empty, the Agent will be able to access all buckets available to the provided access keys. Thus you MUST specify a bucket name as part of the path during job configuration, and that bucket MUST already exist in the storage.
Available in Resilio Connect v3.8.0 and newer.
Create a new application. Be sure to save the Client secret value. The Application ID and Client secret value will be needed later when configuring the connector in the Management Console:
https://portal.azure.com/#view/Microsoft_AAD_RegisteredApps/ApplicationsListBlade
Microsoft Graph ‘Application’ level permissions Files.ReadWrite.All, Sites.Read.All, User.Read should be granted.
Go to the ‘API permissions’ section -> Add a permission. In the ‘Microsoft APIs’ tab choose ‘Microsoft Graph’, then ‘Application permissions’, and search for the specific permission. Be sure that “Admin consent” is granted for the added permissions.
On the Management Console, go to the Storage Connectors menu and add a new Sharepoint storage.
Fill in Tenant ID and Client ID with the information from the Application. Use the saved Client secret value.
Root (site name) is required.
Drive - the Document library name. Optional; if not provided, the Agent will enumerate drives according to its permissions. In that case, be sure to add the drive to the path inside the job, or use the folder picker to browse through the drive.
Be sure to apply "Preset for Sharepoint" in the Agent Profile before configuring the job.
Read here for more details about synchronizing with Sharepoint Online.
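For background: in Microsoft Graph terms, Sharepoint document libraries are exposed as drives under a site, and an application holding the permissions listed above can enumerate them. A sketch of the Graph endpoint involved (the site ID is a placeholder; this illustrates the public Graph API, not Resilio's internal calls):

```python
GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def drives_url(site_id: str) -> str:
    """Microsoft Graph endpoint listing a site's document libraries (drives)."""
    return f"{GRAPH_BASE}/sites/{site_id}/drives"

# Placeholder site ID; a real one is returned by Graph's /sites lookup.
print(drives_url("contoso.sharepoint.com,site-guid,web-guid"))
```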
Available in Resilio AE 4.1.0 and newer.
- Name of the storage (any name that will help you later identify this storage)
- Description. Optional
- Access Key ID. Get it from the user profile in the Oracle Cloud console; see here for details.
- Secret access key. Get it from the user profile in the Oracle Cloud console; see here for details.
- Namespace. The namespace is available in the bucket info. Alternatively, it’s available on the tenancy page if the user has sufficient privileges. Provide it in the format <namespace>.compat.objectstorage.<region>.oraclecloud.com. For example, ax123qwertyd.compat.objectstorage.us-ashburn-1.oraclecloud.com
- Region. Must be provided as the region identifier. You can learn the identifier here.
- Use bucket. Optional. Can be left empty if the access keys have permission to list the buckets; otherwise, it won't be possible to test the connection to the cloud or browse through buckets.
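The S3-compatibility endpoint described above can be assembled mechanically from the namespace and the region identifier. A small sketch using the document's own example values (the helper is illustrative, not Resilio tooling):

```python
def oracle_compat_endpoint(namespace: str, region: str) -> str:
    """Build the Oracle Cloud S3-compatibility endpoint:
    <namespace>.compat.objectstorage.<region>.oraclecloud.com
    """
    return f"{namespace}.compat.objectstorage.{region}.oraclecloud.com"

print(oracle_compat_endpoint("ax123qwertyd", "us-ashburn-1"))
# ax123qwertyd.compat.objectstorage.us-ashburn-1.oraclecloud.com
```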
Once you have a storage configured and have an Agent to take the role of a cloud agent, you can create a job with cloud storage.
- Once you decide that one of the Agents should deliver data to/from the cloud, apply the cloud storage Profile to it. Be sure to use "Preset for Sharepoint" for Sharepoint Online synchronization.
- Create and configure the job normally. Click this Agent's path and select "Storage Connectors" from the path macro dropdown, then select the preconfigured storage connector.
- Once you have picked the storage, enter the path for that storage or browse through the buckets; see below for completing the "Path" field correctly.
(*) Bucket/container/drive name not configured?
- If you left it empty in the cloud storage configuration, you MUST now enter it as the first component of the path. It CAN be the only component, or you may add more subfolders.
- If you pre-defined the name in the cloud storage configuration, the Agent will create the path you enter inside that bucket/container/drive. The bucket/container/drive must already exist on the storage.
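The two rules above can be expressed as a small path resolver. A hedged illustration, assuming "/"-separated paths; this is not Resilio's actual parser:

```python
def split_cloud_path(path: str, configured_bucket: str = "") -> tuple:
    """Resolve the bucket/container/drive name and the subpath for a job path.

    If the connector has no bucket configured, the first path component is
    treated as the bucket name (and must already exist on the storage).
    """
    parts = [p for p in path.strip("/").split("/") if p]
    if configured_bucket:
        # Connector pre-defines the bucket; the whole path lives inside it.
        return configured_bucket, "/".join(parts)
    if not parts:
        raise ValueError("path must start with a bucket/container/drive name")
    return parts[0], "/".join(parts[1:])

print(split_cloud_path("mybucket/photos/2024"))
# ('mybucket', 'photos/2024')
print(split_cloud_path("photos/2024", configured_bucket="mybucket"))
# ('mybucket', 'photos/2024')
```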
The Storage connector path macro is only available for Agents of version 2.9 and newer that have the "Storage connector" feature in their license package.
Starting with Resilio Active Everywhere v4.0.0, it's also available for High Availability groups.
Selective Sync is not supported for cloud storage. If the Selective Sync option is checked, the path configuration will report an error: