Deprecated
This functionality is deprecated and no longer supported in Resilio Connect v3.0.0 and newer: that version greatly reduced RAM requirements. See here for more details. The guide below applies to Resilio Connect v2.12 and older.
Resilio Connect versions earlier than 3.0.0 consume approximately 2KB of RAM for every file/folder (4KB if you are syncing POSIX or NTFS permissions).
This article covers the following use case (all examples below assume that you are not syncing file permissions, so each file/folder takes 2KB of RAM):
- You need to deliver millions of files and you don't have enough available RAM
- Files on the source Agent may change and new files may be added, but nothing is removed
- You need to keep all Agents synchronized
- The source and target systems don't have enough RAM to keep the entire file tree in RAM
This can be achieved in two steps:
1st - Deliver the whole array in batches (using the transfer_job_files_limit parameter) to the remote Agent without consuming all of the RAM.
2nd - Keep the delivered files in sync, synchronizing only newly added files.
Note the following limitations and peculiarities:
1. Only 1-to-1 transfer. The initial transfer with the transfer_job_files_limit parameter only supports 1-to-1 transfer. Adding multiple destinations may cause the job to stall and never finish.
2. Only supported by Agents v2.10.2 and newer.
3. The batch limit value must be greater than the nesting depth of folders in the whole data array. For example, setting transfer_job_files_limit: 10 while there are 15 levels of nested folders won't work.
4. The batch limit can't be smaller than the number of folders in the whole data array.
5. If the destination Agent has the same files and they are newer than the files on the source, they will be overwritten.
6. Data transfer may be slow for a flat folder structure (e.g. 2M files on the same level).
7. The destination peer may show the "indexing" status throughout the Step 1 transfer.
Step 1 - Initial files synchronization
- Calculate the amount of subfolders (recursively) in the folder you are going to synchronize.
- Create a job profile in advance and import the following JSON settings into it:
{
"transfer_job_skip_locked_files": "true",
"transfer_job_force_owner_to_hash_file": "false",
"file_deduplication": "false",
"transfer_job_files_limit": "xxx"
}
Adjust the transfer_job_files_limit value to fit your RAM. For example, if your computer has 4GB of RAM, we recommend setting this value to a maximum of 1000000 (2GB for files/folders, 1GB for networking buffers, and 1GB reserved for the OS).
- Configure the distribution job and use the profile created above.
- Start the job and mark down the job start time.
Once the job has completed the initial transfer, proceed to the second step.
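The two sizing checks above (counting subfolders recursively and picking a transfer_job_files_limit that fits your RAM) can be sketched in Python. This is an illustrative helper, not part of Resilio Connect; it assumes the ~2KB-per-entry figure from this guide (4KB with permissions) and the same 1GB + 1GB reservation for networking buffers and the OS:

```python
import os

def count_subfolders(root):
    """Recursively count subfolders under root.
    transfer_job_files_limit must not be smaller than this number
    (see limitation 4 above)."""
    return sum(len(dirs) for _, dirs, _ in os.walk(root))

def estimate_files_limit(total_ram_gb, syncing_permissions=False, reserved_gb=2.0):
    """Conservative transfer_job_files_limit: usable RAM divided by the
    per-entry cost. reserved_gb covers networking buffers plus the OS
    (1GB + 1GB in the 4GB example above)."""
    per_entry_kb = 4 if syncing_permissions else 2
    usable_kb = max(total_ram_gb - reserved_gb, 0) * 1024 * 1024
    return int(usable_kb // per_entry_kb)

# A 4GB machine without permission syncing allows ~1,048,576 entries,
# consistent with the "maximum of 1000000" recommendation above.
print(estimate_files_limit(4))
```

Note that syncing permissions halves the limit, since each entry then costs 4KB instead of 2KB.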
Step 2 - Keeping the array of data synchronized
- Edit the parameter "Max file age for scanning (seconds)" in the Job profile.
- This parameter tells the Agent to only sync files that have changed during the last X seconds. For example, setting it to 86400 makes the Agent sync only files changed during the last 24 hours. Ensure that this value covers the job start time you marked down in Step 1.
- Set up a Sync job.
Agents will only recheck the files that fall into the configured time range. This speeds up syncing, as Agents won't have to recheck all those millions of files.
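To pick a "Max file age for scanning (seconds)" value that safely covers the job start time from Step 1, you can compute the elapsed time and add a margin. This is a hypothetical helper for illustration (the margin size is an assumption, not a product requirement):

```python
import time

def max_file_age_seconds(job_start_epoch, margin=3600):
    """Seconds of history the Agent should scan: time elapsed since the
    Step 1 job started, plus a safety margin (assumed 1 hour here)."""
    return int(time.time() - job_start_epoch) + margin

# If the Step 1 job started 24 hours ago, this returns roughly
# 86400 + 3600 seconds.
print(max_file_age_seconds(time.time() - 86400))
```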
Enforcing an absolute time window for files to be synced
In other complex setups, for Step 2 the administrator might need to force the Agent to scan files using an absolute time window rather than a sliding one. This can be done with 2 custom parameters:
- Min file age for scanning (seconds) - starting time of the window in UNIXTIME format
- Max file age for scanning (seconds) - ending time of the window in UNIXTIME format
For example, to only sync files from May 25 2019 9:00am (UTC) to May 29 2019 9:00am (UTC), set:
- Max file age for scanning (seconds) = 1559120400
- Min file age for scanning (seconds) = 1558774800
You can use this site for UNIXTIME conversion.
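Alternatively, the UNIXTIME values above can be computed with the Python standard library instead of a website; a minimal sketch:

```python
from datetime import datetime, timezone

def to_unixtime(year, month, day, hour=0, minute=0):
    """Convert a UTC date/time to a UNIXTIME value."""
    return int(datetime(year, month, day, hour, minute,
                        tzinfo=timezone.utc).timestamp())

# The window from the example above:
print(to_unixtime(2019, 5, 25, 9))  # 1558774800 -> Min file age for scanning
print(to_unixtime(2019, 5, 29, 9))  # 1559120400 -> Max file age for scanning
```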