How to back up files directly between Synology DiskStation and Apple AirPort Time Capsule?

Problem

Synology DSM (up to v6.0) does not allow using a Remote Folder (i.e. a mounted network folder of another NAS or PC) as a backup source or destination. DSM does support network backup to an rsync-compatible server, but unfortunately the Apple AirPort Time Capsule is not one.

  • How to back up, sync or mirror Synology DiskStation files to/from a Remote Folder (mounted folder) of any NAS?
  • How to set up direct and automatic DiskStation folder synchronisation to an Apple AirPort Time Capsule and vice versa?
  • How to sync an Apple AirPort Time Capsule with a Synology DiskStation?

Solution

At the moment, we have two ways to back up DiskStation data directly to the Time Capsule (without any 3rd-party software on an intermediary computer):

  • create a user-defined script with rsync on the DiskStation to back up local files to the mounted remote folder of the Time Capsule, or
  • trick the DiskStation into thinking it is backing up locally, while the local backup folder is in fact a remote folder – it’s just a matter of changing the backup creation sequence under DSM.

I prefer the latter, but I also give you a ready-to-use bash rsync script so you can decide for yourself (or use it on a NAS other than a Synology).

Pros of the cheat solution (using the Backup & Replication app up to DSM 5.2 or the Hyper Backup app since DSM 6.0) over the rsync script solution:

  • A backup of the Synology DSM configuration is also made.
  • You can decide whether metadata (e.g. thumbnails) should be backed up.
  • You can back up DSM apps and their data too (e.g. Note Station, Photo Station, Web Station, Mail Server, MariaDB etc.).
  • You can restore the entire backup set from the GUI with a single click (missing files are added and duplicates are overwritten).
  • You get native DSM backup notifications and graphical progress info.
  • The native DSM GUI for managing your backup is easier than editing files.
  • Lower DiskStation CPU utilisation.

A cheat solution

Introduction

Unfortunately, this trick works only with local single-version backup (done by the /volume1/@appstore/HyperBackup/bin/synolocalbkp binary), so there is no versioning available – simple mirroring only.

Based on my findings, local multi-version backup (done by the /volume1/@appstore/HyperBackup/bin/synoimgbkptool binary) uses hard linking, which cannot cross filesystem boundaries (i.e. all files have to reside on the same partition) and therefore won’t work with mounted remote folders (Error: Invalid cross-device link in /var/log/messages). You’ll find a bit more info in the troubleshooting section below.
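
You can check this boundary yourself before choosing a backup mode: hard links are only possible when both paths report the same device ID. A minimal sketch over SSH (assuming a `stat` that supports `-c`, as on DSM; the paths are examples to replace with your own):

```shell
# Hard links require both paths to live on the same filesystem (same device ID)
same_fs() {
    [ "$(stat -c %d "$1")" = "$(stat -c %d "$2")" ]
}

# Example: a local volume vs. the mounted Time Capsule folder
if same_fs "/volume1" "/volume1/Backup/Time Capsule"; then
    echo "same filesystem - hard links possible"
else
    echo "different filesystems - expect 'Invalid cross-device link'"
fi
```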

Anyway, let’s get started…

Procedure

On Apple AirPort Time Capsule (ATC):
  1. Create a disk account using AirPort Utility > Disks > Secure Shared Disks > With accounts, called for example Backup. ATC will then also create a shared folder named after that account, i.e. Backup.
  2. Set a static IP for the ATC (e.g. 192.168.1.3). I don’t know why, but a Bonjour address like AirPort-Time-Capsule.local doesn’t work with the current DSM implementation of CIFS used to mount remote shared folders – only IPs are valid right now. Bear in mind to pick an IP outside of your router’s DHCP range to avoid any IP conflicts (assuming your DHCP range is, for example, 192.168.1.200–250, 192.168.1.3 will be safe).
On Synology DiskStation (DS):
  1. Create a shared folder called Backup and, inside it, a subfolder called Time Capsule.
  2. Create local backup task using
    • DSM 5: Backup & Replication App > Create > Data backup task > Local Backup Destination > Backup data to local shared folder
    • DSM 6 and later: Hyper Backup App > Create > Data backup task > Local folder & USB (single-version)
      (Note: it’s simple data mirroring without deduplication, see Synology Help; I couldn’t get versioning and data compression to work, always getting a Failed to access backup destination error.)

    and on subsequent screens provide:

    • Backup shared folder as destination and
    • Time Capsule subfolder as Directory in Backup Settings.

    Don’t run the backup and skip setting a schedule for now – you will be able to edit everything later.

And now goes the cheat…

  1. Move all the files created earlier by the app within the /Backup/Time Capsule folder (there should be some config and db files) to the ATC Backup shared folder, leaving the /Backup/Time Capsule folder on the DS empty. Otherwise, DSM won’t let you mount a remote folder onto it, because the target folder must be empty.
  2. Mount the ATC Backup shared folder onto the /Backup/Time Capsule folder using File Station > Tools > Mount Remote Folder > CIFS Shared Folder (see Synology Help) and then:
    • in the Folder field, provide the ATC static IP with the Backup shared folder name, like \\192.168.1.3\Backup,
    • in the Account name field, provide the ATC user account (here, our Backup account name) and its password,
    • in the Map to: field, browse to /Backup/Time Capsule – the ATC will be mapped/accessed here like a symlink, but the files will reside on the ATC,
    • check the option Mount automatically on startup.
  3. You are now ready to start your backup! Easy, right? You can also edit your backup task now, but do not delete or change the backup directory! Otherwise, you must unmount the ATC shared folder and repeat the procedure from step 2 of the cheat.

Troubleshooting the cheat solution

1. How to investigate any issues?

High-level Log

Go to Backup & Replication/Hyper Backup > Log and find the root cause of your problem. Hover the mouse pointer over a log entry to see a details popup. There are three log levels: Information, Warning and Error:

  • An Error stops/breaks the backup process.
  • A Warning informs about backup issues (e.g. symbolic links that cannot be backed up) but does not stop the backup process.
  • Information entries give timing data.

Low-level Log

If the high-level in-app log above does not correspond to or explain the issues you are experiencing, try examining the low-level backup log via SSH using this command:

sudo cat /var/log/messages | grep synolocalbkp

or, narrowing the log entries to a given day (e.g. 2018-03-05):

sudo cat /var/log/messages | grep '2018-03-05.*synolocalbkp'

By the way, note that the synolocalbkp tool is the part of the Hyper Backup app binaries responsible for implementing the Local folder & USB (single-version) option. Multi-version local folder backup is done by the synoimgbkptool binary. Knowing the names of all Hyper Backup app binaries can be helpful when filtering the /var/log/messages output. You can list them over SSH using:

sudo ls /volume1/@appstore/HyperBackup/bin/
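
Once you know the binary names, you can filter the log for both local-backup tools in one pass (on DSM, prefix the command with sudo as above; the LOGFILE variable is just a parameterised example):

```shell
# Show log entries produced by either local-backup binary
LOGFILE="${LOGFILE:-/var/log/messages}"
grep -E 'synolocalbkp|synoimgbkptool' "$LOGFILE"
```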


2. Backup starts (I can find synobkpinfo.db and @app directory on the Time Capsule), but then stops immediately and shows failure

This issue is usually connected with too-long path names (deep folder nesting) or "unusual" file names. Look in the Log for Errors – the error details will give you the problematic folder path. I managed to work around some path issues just by zipping the offending folders (usually Xcode application code). Only then could I successfully back up around 2 TB from several volumes and volume subfolders.

Moreover, looking into:

  • DSM 5 Help > Backup & Replication > Data Backup > General notes
  • DSM 6 Help > Hyper Backup > Data Backup > Source

you can also find some clues (though I haven’t tested these limits myself):

  • The maximum length of the complete folder path for backup is 2048 characters.
  • If over 32000 files or folders are uploaded to an EXT3 shared folder via AFP, all backup tasks including the shared folder will fail.

3. DiskStation backs up locally instead of backing up to the Time Capsule

This is a TC mounting problem: if you restarted the DS while the TC HDD was sleeping/hibernated, the DS won’t mount it. If you miss this, the DS, when starting the backup task, will back up to the local folder that was used to mount the remote one (i.e. the DS backs up to itself – oops!).
If you’ve encountered that, stop the backup task and:
1) Delete all locally backed-up files (leaving the backup target folder empty).
2) Wake the TC HDD using Finder (or any other way) – just visit some TC volume to wake it up.
3) Reconnect the TC Remote Folder – go to DS File Station > Tools > Mount List > Remote Folder, highlight your TC Remote Folder and click Reconnect.
4) Start the backup task once again. Don’t worry, it will find the previous backup files on the TC and take them into account during mirroring.

4. Failed to start backup task

This might be one of two things.

The first is very similar to issue 3 – the TC HDD has fallen asleep and the DiskStation cannot reach the mounted folder. In this case, just wake the TC up by visiting any of its volumes using Finder on Mac or File Explorer on Windows, then make sure you can browse the mounted TC folder via the DS File Station (reconnect the TC Remote Folder as described in issue 3, step 3, if needed) and start the backup task again.

The other can be connected with the permissions of your DS backup destination folder. Investigate them under File Station > right-click your folder > Properties and look at the Permission and Advanced Permissions tabs. As a last resort, log into the DS via SSH and try:

sudo chmod -R a+rwx /volume1/backup

(replace /volume1/backup with the path to your backup destination) to give all permissions to all users.

An rsync solution

In short, this alternative solution is a bash script with rsync running on the Synology DiskStation (DS), dispatched on a schedule to back up (sync) files to a mounted remote folder of the Apple AirPort Time Capsule (ATC) or any other NAS/network folder. The script consists of 3 files and produces one log file at backup time:

  • sync.sh – the main script run by the DS,
  • sync-exclude-list.txt – editable by the user to define files and folders excluded from the backup (usually hidden system files),
  • sync-backups.sh – editable by the user to define source and destination backup paths,
  • sync.log – an info file refreshed every time the main script runs, storing the list of backed-up files.

Download it here: synology-rsync-backup.zip

To use it properly you need to:

  1. Create a backup shared folder on the ATC (see cheat solution, ATC step 1 above, for how to do it).
  2. Map your ATC backup shared folder on the DS, so the DS has direct and permanent access to it (see the mounting steps in the cheat solution above for how to do it).
  3. Store the script files on the DS, e.g. in the /volume1/Private/Scripts/backup/ folder.
  4. Create a task in Control Panel > Task Scheduler > Create > User-defined script and in the Run command field write:

    ash /volume1/Private/Scripts/backup/sync.sh

    where /volume1/Private/Scripts/backup/ is the example path where you stored the script files on the DS.

  5. Edit the files sync-backups.sh and sync-exclude-list.txt to your needs using a simple text editor (they are already filled with some examples).
    DO NOT USE a word processor like Word or WordPad to edit these files – they have to be plain text, edited in a Notepad-like editor. Otherwise, you might break them with hidden formatting characters, leading to unexpected results at execution time.
  6. That’s all.

Now you can run your task and observe the contents of sync.log or the destination folder. If it doesn’t work as expected, you can spot errors by running the script in an SSH session on the DS (using the same command as in step 4). There you can see what rsync says about any errors (usually wrong/invalid or inaccessible source/destination paths).

Discussion (42 comments)

  1. Greetings!
    Thanks for sharing. I have followed your instructions with the "cheat" method; I managed up to the step of transferring all generated backup files to the TimeCapsule. With Hyper Backup, it is able to see the mounted folder (residing on the TimeCapsule). The problem comes when I try to back up to the TimeCapsule: Hyper Backup shows "Processing data" forever and gets stuck there. I left it for the whole night but no progress. I have tried this a couple of times and it remains the same. Any possible causes of this problem? How to make it work properly?

    Very appreciate and thank you.

    Regards.

    YaQi

    Running on DS718+ with DSM 6.1 update 1

    • Read Hyper Backup > Log to investigate what is going on. Usually, all previous issues were connected with:
      1) Path names to back up (see the Troubleshooting the cheat solution section).
      2) Folder permissions. Try sudo chmod -R a+rwX /volume1/backup on the DS, following Marco’s findings.

      • After trying your code for modifying permissions, the problem remains. Before my post, for testing purposes I only set Hyper Backup to back up a couple of empty folders and some folders containing very few jpg images, all with very simple names of fewer than 20 characters.

        1) I got no error warnings when I first created the backup on the local drive before moving the files to the TimeCapsule.
        2) After moving everything to the TimeCapsule, I am able to mount the backup (on the TimeCapsule) in Hyper Backup with the ability to view version information and browse files inside the backup. [I got to this point before my first post.]
        3) I have tried to back up again without adding any new items to the backup; Hyper Backup shows "Backing Up…" as well as "Processing data…", and in the "Target" box "Size" shows "Calculating…". And this is where I get stuck.
        4) I quit Hyper Backup and restarted everything over again, repeating up to point 3), then altered the settings to include a few more files in the backup, and eventually Hyper Backup showed exactly the same as described in point 3).

        I have no knowledge of unix command lines; however, I have tried the code in your troubleshooting section, hoping to get it to work. But the problem remains.

        Regards.

          • Try deleting all backed-up files on the ATC, i.e. make the ATC backup folder empty, restart the backup task and come back with the results.

          • Hi, I have tried deleting all backup files on the Time Capsule. But the DS prompted "Offline" and refused to continue with the backup. This is just like what Martin described in his comment: "However, NAS notices the deletion of the files and turns stubborn..". But he is luckier than I am; I am again stuck here.
            I then put all the files back into the destination folder on the Time Capsule, and Hyper Backup went back to the state of "Processing Data" forever when I tried to back up, as described in my previous post. I have tried this a couple of times – creating a new backup, moving it to the TimeCapsule, mounting the remote folder, deleting all files in the destination folder – and it shows offline every single time.

            Regards.

          • Have you examined Hyper Backup Log? What entries show up while HB is in “stuck”?

          • It just shows:
            Level: Information | Date & Time | User: SYSTEM | Event: [Local][Backup to TimeCapsule] Backup task started.

            There is no orange warning nor red error text.

            and Hyper Backup just stays in the previously described state, showing "Backing up…", "Processing data…" but no progress.

          • Try to examine low level backup log via SSH using this command:
            sudo cat /var/log/messages | grep synolocalbkp
            or if you want latest log (e.g. from day 2018-03-05):
            sudo cat /var/log/messages | grep '2018-03-05.*synolocalbkp'

          • I have obtained a couple of lines of log – sorry for showing so many, as I don’t know which are useful.
            However, I do spot a few lines with err and fail. Do they represent some sort of problem?
            2018-03-06T17:40:26+08:00 YaQi_Synology synoscgi_SYNO.Backup.Task_1_create[16409]: (16409) [info] task_state_machine.cpp:311 task [27] from state [Initial] to state [Backupable] with action [Task create]
            2018-03-06T17:40:57+08:00 YaQi_Synology img_backup: [16586]img_backup.cpp:2080 Local Backup Task has been started: task_ID: 27
            2018-03-06T17:40:57+08:00 YaQi_Synology img_backup: [16586]img_backup.cpp:1562 Action: [local backup], Repo Path: [Backup], LinkKey: [YaQi_Synology_00113282B38D_27], Cloud backup: [0], Target ID: [TimeCapsule.hbk], task ID: [27]
            2018-03-06T17:40:57+08:00 YaQi_Synology img_backup: [16586]img_backup.cpp:1564 app config Path: [/volume1/@tmp/BKP_APP_JWQoun]
            2018-03-06T17:40:57+08:00 YaQi_Synology img_backup: [16586]img_backup.cpp:1566 data Path: [/homes]
            2018-03-06T17:40:57+08:00 YaQi_Synology img_backup: [16586]img_backup.cpp:1566 data Path: [/music]
            2018-03-06T17:40:57+08:00 YaQi_Synology img_backup: [16586]img_backup.cpp:1566 data Path: [/photo/YaQi Zhang Photos/High School]
            2018-03-06T17:40:58+08:00 YaQi_Synology img_backup: (16586) [info] snapshot.cpp:328 take share [homes] backup snapshot [/volume1/@sharesnap/homes/GMT+08-2018.03.06-17.40.57]
            2018-03-06T17:40:59+08:00 YaQi_Synology img_backup: (16586) [info] snapshot.cpp:328 take share [music] backup snapshot [/volume1/@sharesnap/music/GMT+08-2018.03.06-17.40.58]
            2018-03-06T17:41:00+08:00 YaQi_Synology img_backup: (16586) [info] snapshot.cpp:328 take share [photo] backup snapshot [/volume1/@sharesnap/photo/GMT+08-2018.03.06-17.40.59]
            2018-03-06T17:41:04+08:00 YaQi_Synology synoimgbkp_tagmgr: [16786]tag_db.cpp:1095 info: last version tag db [/volume1/@img_bkp_cache/ClientCache_image_image_local.N5Yncv/last_version_tagdb] not exists [No such file or directory]
            2018-03-06T17:41:10+08:00 YaQi_Synology export: Export finished result is 0
            2018-03-06T17:41:13+08:00 YaQi_Synology img_backup: (16586) backup_controller.cpp:3064 (16586)[BkpCtrl] All workers flush done, continue:(4)
            2018-03-06T17:41:26+08:00 YaQi_Synology img_backup: (16586) [info] snapshot.cpp:172 remove share [homes] backup snapshot [GMT+08-2018.03.06-17.40.57]
            2018-03-06T17:41:27+08:00 YaQi_Synology img_backup: (16586) [info] snapshot.cpp:172 remove share [music] backup snapshot [GMT+08-2018.03.06-17.40.58]
            2018-03-06T17:41:27+08:00 YaQi_Synology img_backup: (16586) [info] snapshot.cpp:172 remove share [photo] backup snapshot [GMT+08-2018.03.06-17.40.59]
            2018-03-06T17:41:27+08:00 YaQi_Synology img_backup: (16586) [err] backup_progress.cpp:431 Backup task [Backup to TimeCapsule] completes with result [1]. Time spent: [30 sec].
            2018-03-06T17:41:27+08:00 YaQi_Synology img_backup: (16586) [err] backup_progress.cpp:446 Total Size(Bytes):[228809654], Modified Size(Bytes):[228809654], Total Directory:[180], Modified Directory:[180], Total File:[940], Modified File:[940],
            2018-03-06T17:41:27+08:00 YaQi_Synology img_backup: [16586]img_backup.cpp:1913 Storage Statistics: TargetSize:[225328], LastBackupTargetSize:[0], SourceSize:[225972], TotalFile:[940], ModifyFile:[0], NewFile:[940], UnchangeFile:[0], RemoveFile:[0], RenameFile:[0], RenameLogicSize:[0], CopyFile:[13], CopyLogicSize:[802648], CopyMissFile:[56], CopyMissFileLogicSize:[3870962]
            2018-03-06T17:45:41+08:00 YaQi_Synology kernel: [ 599.320529] CIFS VFS: Send error in SessSetup = -13
            2018-03-06T17:45:41+08:00 YaQi_Synology kernel: [ 599.326233] CIFS VFS: cifs_mount failed w/return code = -13
            2018-03-06T17:45:41+08:00 YaQi_Synology synoscgi_SYNO.FileStation.Mount_1_mount_remote[18013]: cifsrecord.cpp:74 Fail to mount.cifs \\192.168.31.3\backup\TimeCapsule.hbk /volume1/Backup/TimeCapsule.hbk ([13] Success)
            2018-03-06T17:45:41+08:00 YaQi_Synology synoscgi_SYNO.FileStation.Mount_1_mount_remote[18013]: SYNO.FileStation.Mount.cpp:316 Failed to mount with MAC options. Will retry with sec=ntlm options.
            2018-03-06T17:47:01+08:00 YaQi_Synology kernel: [ 679.315736] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:01+08:00 YaQi_Synology kernel: [ 679.325510] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:01+08:00 YaQi_Synology kernel: [ 679.334840] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:01+08:00 YaQi_Synology kernel: [ 679.346919] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:03+08:00 YaQi_Synology kernel: [ 680.856727] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:03+08:00 YaQi_Synology kernel: [ 680.930908] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:03+08:00 YaQi_Synology kernel: [ 680.951920] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:03+08:00 YaQi_Synology kernel: [ 680.961687] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:03+08:00 YaQi_Synology kernel: [ 680.970988] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:03+08:00 YaQi_Synology kernel: [ 680.981364] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:20+08:00 YaQi_Synology kernel: [ 697.807670] cifs_vfs_err: 5 callbacks suppressed
            2018-03-06T17:47:20+08:00 YaQi_Synology kernel: [ 697.812840] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:28+08:00 YaQi_Synology kernel: [ 706.224945] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:38+08:00 YaQi_Synology kernel: [ 716.152892] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:38+08:00 YaQi_Synology kernel: [ 716.294264] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:38+08:00 YaQi_Synology kernel: [ 716.302362] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:38+08:00 YaQi_Synology kernel: [ 716.314060] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:39+08:00 YaQi_Synology img_backup: [18700]img_backup.cpp:2080 Local Backup Task has been started: task_ID: 27
            2018-03-06T17:47:39+08:00 YaQi_Synology img_backup: [18700]img_backup.cpp:1562 Action: [local backup], Repo Path: [Backup], LinkKey: [YaQi_Synology_00113282B38D_27], Cloud backup: [0], Target ID: [TimeCapsule.hbk], task ID: [27]
            2018-03-06T17:47:39+08:00 YaQi_Synology img_backup: [18700]img_backup.cpp:1564 app config Path: [/volume1/@tmp/BKP_APP_1Gp0LM]
            2018-03-06T17:47:39+08:00 YaQi_Synology img_backup: [18700]img_backup.cpp:1566 data Path: [/homes]
            2018-03-06T17:47:39+08:00 YaQi_Synology img_backup: [18700]img_backup.cpp:1566 data Path: [/music]
            2018-03-06T17:47:39+08:00 YaQi_Synology img_backup: [18700]img_backup.cpp:1566 data Path: [/photo/YaQi Zhang Photos/High School]
            2018-03-06T17:47:39+08:00 YaQi_Synology kernel: [ 717.471396] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:40+08:00 YaQi_Synology img_backup: (18700) [info] snapshot.cpp:328 take share [homes] backup snapshot [/volume1/@sharesnap/homes/GMT+08-2018.03.06-17.47.39]
            2018-03-06T17:47:41+08:00 YaQi_Synology img_backup: (18700) [info] snapshot.cpp:328 take share [music] backup snapshot [/volume1/@sharesnap/music/GMT+08-2018.03.06-17.47.40]
            2018-03-06T17:47:41+08:00 YaQi_Synology img_backup: (18700) [info] snapshot.cpp:328 take share [photo] backup snapshot [/volume1/@sharesnap/photo/GMT+08-2018.03.06-17.47.41]
            2018-03-06T17:47:43+08:00 YaQi_Synology kernel: [ 721.213593] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:43+08:00 YaQi_Synology kernel: [ 721.226266] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:43+08:00 YaQi_Synology kernel: [ 721.234266] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:43+08:00 YaQi_Synology kernel: [ 721.242257] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:43+08:00 YaQi_Synology kernel: [ 721.250577] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:43+08:00 YaQi_Synology kernel: [ 721.418350] cifs_vfs_err: 2 callbacks suppressed
            2018-03-06T17:47:43+08:00 YaQi_Synology kernel: [ 721.423561] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:47:43+08:00 YaQi_Synology kernel: [ 721.523316] CIFS VFS: bogus file nlink value 0
            2018-03-06T17:48:23+08:00 YaQi_Synology kernel: [ 761.287589] CIFS VFS: bogus file nlink value 0

          • Thanks for your reply and patience. After many failed attempts and your hint about "img_backup", I reviewed your post on the cheat method carefully again and noticed you mentioned not to use versioned backup. I guess this is why I have always failed while trying. Maybe "img_backup" is the program that creates backups with versioning. And the current version of Hyper Backup has different names for the backup options – "Local folder & USB" and "Local folder & USB (single-version)" – and the latter is the option highlighted in your post, which I didn’t notice.
            It is sad that versioning does not work with your method.

            But anyway, it seems to be backing up. My old TimeCapsule justifies its value!

            Thank you very much for sharing, really appreciate.

            regards.

          • Yep, currently this is a "single-version local backup" solution. Thanks to you, I’ve already updated the text with appropriate explanations and hints. I’m also happy you’ve finally made it work.

          • Cheers!
