How to back up files directly between a Synology DiskStation and an Apple AirPort Time Capsule?

Problem

Synology DSM (up to v6.0) does not allow using a Remote Folder (i.e. a mounted network folder of another NAS or PC) as a backup source or destination. DSM does support network backup to an rsync-compatible server, but unfortunately the Apple AirPort Time Capsule is not one.

  • How to back up, sync or mirror Synology DiskStation files to/from a Remote Folder (mounted folder) of any NAS?
  • How to set up direct and automatic DiskStation folder synchronisation to an Apple AirPort Time Capsule and vice versa?
  • How to sync an Apple AirPort Time Capsule with a Synology DiskStation?

Solution

At the moment, we have two ways to back up DiskStation data directly to the Time Capsule (without any third-party software on an intermediary computer):

  • create a user-defined script with rsync on the DiskStation that backs up local files to the remote folder (the mounted one) of the Time Capsule, or
  • trick the DiskStation into thinking it backs up locally while the local backup folder is in fact a mounted remote folder – it is just a matter of changing the order in which the backup is set up in DSM.

I prefer the latter, but I also provide a ready-to-use bash rsync script so you can decide for yourself (or use it on a NAS other than Synology).

Pros of the cheat solution (using the Backup & Replication App up to DSM v5.2 or the Hyper Backup App since DSM v6.0) over the rsync script solution:

  • A backup of the Synology DSM configuration is also made.
  • You can decide whether metadata (e.g. thumbnails) should be backed up.
  • You can back up DSM apps and their data too (e.g. Note Station, Photo Station, Web Station, Mail Server, MariaDB etc.).
  • You can restore the entire backup set from the GUI with just one click (missing files are added and duplicates are overwritten).
  • You get native DSM backup notifications and graphical progress info.
  • The native DSM GUI for managing your backup is easier than editing script files.
  • Lower DiskStation CPU utilisation.

A cheat solution

On Apple AirPort Time Capsule (ATC):
  1. Create a disk account using AirPort Utility > Disks > Secure Shared Disks > With accounts, called for example: Backup. The ATC will then also create a shared folder with that account name, i.e. Backup.
  2. Set a static IP for the ATC (e.g. 192.168.1.3). I don’t know why, but a Bonjour address like AirPort-Time-Capsule.local doesn’t work in the current DSM implementation of CIFS that is used to mount remote shared folders – only IPs are valid right now. So, remember to pick an IP outside of your router’s DHCP LAN range to avoid any IP conflicts (assuming your DHCP range is, for example, 192.168.1.200 to 250, 192.168.1.3 will be safe).
On Synology DiskStation (DS):
  1. Create a shared folder called Backup and inside it create a subfolder called Time Capsule.
  2. Create a backup task using
    • DSM 5: Backup & Replication App > Create > Data backup task > Local Backup Destination > Backup data to local shared folder
    • DSM 6 and later: Hyper Backup App > Create > Data backup task > Local Data Copy (it’s simple data mirroring, see Synology Help). I could not get Local Shared Folder & External Storage (a Synology-powered backup format with versioning and data compression) to work; it always failed with a Failed to access backup destination error.

    and provide the Backup shared folder as the destination and the Time Capsule subfolder as the Directory in Backup Settings. Don’t set any schedule for it right now – you will be able to edit everything later.
    And now comes the cheat…

  3. Move all files created earlier by the backup app within the /Backup/Time Capsule folder to the ATC Backup shared folder, leaving the /Backup/Time Capsule folder on the DS empty. Otherwise, DSM won’t let you mount the remote folder onto it, because the mount target must be empty.
  4. Mount the ATC Backup shared folder onto the /Backup/Time Capsule folder using File Station > Tools > Mount Remote Folder > CIFS Shared Folder (see Synology Help; a rough command-line equivalent is sketched after this list) and then:
    • in the field Folder provide the ATC static IP with the Backup shared folder name, like: \\192.168.1.3\Backup,
    • in the field Account name provide the ATC user account (here our Backup account) and its password,
    • in the field Map to browse to /Backup/Time Capsule – the ATC share will be mapped and accessed here like a symlink, but the files will reside on the ATC,
    • check the option Mount automatically on startup.
  5. You are now ready to start your backup! Easy, right? From now on you can also edit your backup task, but do not delete or change the backup directory! Otherwise, you will have to unmount the ATC shared folder and repeat the procedure from step 2 on the DS.
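
For reference only (the procedure above uses the File Station GUI), a rough command-line equivalent of that CIFS mount is sketched below. The IP, account name and local path are the example values from this article, the password is a placeholder, and vers=1.0 is only an assumption for older Time Capsule firmware that speaks SMB1 only:

    # Illustrative CLI equivalent of File Station > Tools > Mount Remote Folder (run as root over SSH)
    # vers=1.0 forces SMB1, which older AirPort Time Capsule firmware may require (assumption)
    mount -t cifs //192.168.1.3/Backup "/volume1/Backup/Time Capsule" \
        -o username=Backup,password=YourPassword,vers=1.0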

Troubleshooting cheat solution

1. Backup starts (I can find synobkpinfo.db and the @app directory on the Time Capsule), but then stops immediately and shows a failure

This issue is mainly connected with too long path names (deep folder nesting) or „unusual” file names. Go to Backup & Replication/Hyper Backup > Log and find the root cause of your problem (hover the mouse pointer over an error entry to see the details popup). There are three levels of log entries: Information, Warning and Error. Errors stop the backup – these are your concern. Warnings inform about some backup issues (e.g. symbolic links cannot be backed up) but they don’t fail the backup process. Information entries give the timing data. The error details give you the problematic folder path. I’ve handled some path issues simply by zipping folders (usually they were Xcode application code). Now I successfully back up around 2 TB from several volumes and volume subfolders.

Looking into:

  • DSM 5 Help > Backup & Replication > Data Backup > General notes
  • DSM 6 Help > Hyper Backup > Data Backup > Source

you can also find some clues (but I haven’t tested it like that); a quick way to hunt down overly long paths is sketched after this list:

  • The maximum length of the complete folder path for backup is 2048 characters.
  • If over 32000 files or folders are uploaded to an EXT3 shared folder via AFP, all backup tasks including the shared folder will fail.
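
If you suspect path-length or file-name issues, a quick sanity check over SSH can list the longest paths in a backup source before you even start the task. A minimal sketch, assuming /volume1/Photos is one of your backup sources:

    # List the 20 longest absolute paths under the backup source (illustrative path)
    find /volume1/Photos -print 2>/dev/null | awk '{ print length, $0 }' | sort -rn | head -n 20
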
2. DiskStation backs up locally instead of backing up to the Time Capsule

This is a TC mounting problem: if you restarted the DS while the TC HDD was sleeping/hibernated, the DS won’t mount it. If you miss this, then when the backup task starts, the DS will back up to the local folder that was used to mount the remote one (i.e. the DS is backing up to itself – oops!).
If you’ve encountered that, stop the backup task and:
1) Delete all locally backed-up files (leaving the backup target folder empty).
2) Wake the TC HDD using Finder (or any other way) – just visit some TC volume to wake it up.
3) Reconnect the TC Remote Folder – go to DS File Station > Tools > Mount List > Remote Folder, highlight your TC Remote Folder and click Reconnect.
4) Start the backup task once again. Don’t worry, it will find the previous backup files on the TC and will take them into account during the mirroring process. (A small mount check you can schedule before the backup is sketched below.)
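
To catch this situation automatically, a small check scheduled (e.g. via Task Scheduler) shortly before the backup task can verify that the CIFS share is really mounted. A minimal sketch, assuming the example IP and share name used in this article:

    #!/bin/sh
    # Pre-backup check: is the Time Capsule CIFS share really mounted?
    # //192.168.1.3/Backup is the example share from this article (adjust to yours)
    if mount | grep -q '//192.168.1.3/Backup'; then
        echo "Time Capsule share is mounted - safe to start the backup."
    else
        echo "Time Capsule share is NOT mounted - reconnect it in File Station first." >&2
        exit 1
    fi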

3. Failed to start backup task

This is a very similar issue to the previous one – the TC HDD has fallen asleep and the DiskStation cannot reach the mounted folder. In this case just wake up the TC by visiting any of its volumes using Finder on a Mac or File Explorer on Windows, then make sure you can browse the mounted TC folder via DS File Station (reconnect the TC Remote Folder as described in issue 2, step 3, if needed) and start the backup task again.
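
If no Mac or PC is at hand, simply touching the mounted folder from the DS shell may be enough to spin the Time Capsule disk up again; a minimal sketch, assuming the mount point used in this article:

    # Accessing the CIFS mount point may wake the sleeping Time Capsule disk (illustrative path)
    ls "/volume1/Backup/Time Capsule" > /dev/null 2>&1 && echo "TC reachable" || echo "TC still unreachable"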

An rsync solution

In short, this alternative solution is a bash script with rsync that runs on the Synology DiskStation (DS) on a schedule and backs up (syncs) files to a mounted remote folder of the Apple AirPort Time Capsule (ATC) or of any other NAS. The script consists of 3 files and produces one log file during backup:

  • sync.sh – the main script run by the DS,
  • sync-exclude-list.txt – edited by the user to define files and folders excluded from the backup (usually hidden system files),
  • sync-backups.sh – edited by the user to define source and destination backup paths,
  • sync.log – an info file refreshed every time the main script runs, storing the list of backed-up files.

Download it here: synology-rsync-backup.zip
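
At its core the script boils down to a single rsync call per backup definition. A minimal sketch of such a call with illustrative paths (the real script takes the paths from sync-backups.sh and the exclusions from sync-exclude-list.txt):

    #!/bin/sh
    # Minimal sketch of the rsync call behind the script (illustrative paths)
    SOURCE="/volume1/Photos/"                            # local DS folder to back up
    DESTINATION="/volume1/Backup/Time Capsule/Photos/"   # mounted ATC folder
    EXCLUDES="/Volume1/Private/Scripts/backup/sync-exclude-list.txt"
    LOG="/Volume1/Private/Scripts/backup/sync.log"

    # -a preserves attributes, -v lists transferred files;
    # --delete mirrors deletions on the destination (drop it if you prefer an additive backup)
    rsync -av --delete --exclude-from="$EXCLUDES" "$SOURCE" "$DESTINATION" > "$LOG" 2>&1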

To use it we need to:

  1. Create a backup shared folder on the ATC (see the cheat solution, ATC step 1 above, for how to do it).
  2. Map your ATC backup shared folder on the DS, so the DS has direct and permanent access to it (see the cheat solution, DS step 4 above, for how to do it).
  3. Store the script files on the DS, e.g. in the /Volume1/Private/Scripts/backup/ folder.
  4. Create a task in Control Panel > Task Scheduler > Create > User-defined script and in the Run command field write:

    ash /Volume1/Private/Scripts/backup/sync.sh

    where /Volume1/Private/Scripts/backup/ is the example path where you stored the script files on the DS.

  5. Edit the files sync-backups.sh and sync-exclude-list.txt according to your needs using a simple text editor (they already contain some examples; the general structure is sketched after this list).
    DO NOT USE a word processor like Word or WordPad to edit these files – they have to be plain text, so use a Notepad-like editor. Otherwise, you may break the files with hidden formatting characters, leading to unexpected results when the script executes.
  6. That’s all.
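
To give an idea of what step 5 amounts to, here is a sketch of a backup definition in the spirit of sync-backups.sh, based on the structure also visible in the comments below. The paths are illustrative; note the plain straight quotes – exactly what a word processor would silently replace with curly ones:

    #!/bin/sh
    # sync-backups.sh - backup definitions (illustrative paths, plain straight quotes only)

    # Mount point of the Apple Time Capsule share on the DiskStation:
    TIMECAPSULE="/volume1/Backup/Time Capsule"

    # Backup: Photos (the destination replicates the source structure under the TC mount)
    SOURCE="/volume1/Photos/"
    DESTINATION="${TIMECAPSULE}${SOURCE}"
    sync    # the sync routine is provided by sync.sh, which sources this file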

Now you can run your task and watch the contents of sync.log or the destination folder. If it doesn’t work as expected, look for errors by running the script over SSH on the DS (the same command as in step 4) – there you will see what rsync says about any errors (usually wrong, invalid or inaccessible source/destination paths).

Discussion (27 comments)

  1. Hi,
    Thank you for your process, it is exactly what I was looking for.

    Nevertheless, after applying exactly what you say (a cheat solution), when starting the backup, it starts indeed (I found synobkpinfo.db and the @app directory in the Time Capsule directory), but it stops immediately and shows a failure mode.

    Did you completely test your solution?
    Do you see a reason?
    Thank you for your help.

    • Yes, I did. I’ve also stumbled upon that issue and it’s mainly connected with too long path names (deep folder nesting) or „unusual” file names. Go to Backup & Replication > Log and find the root cause of your problem (hover the mouse pointer over an error entry to see the details popup). There are three levels of logs: Information, Warning and Error. Errors stop the backup – this is your concern. Warnings inform about some backup issues (e.g. symbolic links cannot be backed up) but don’t fail the backup process. Information gives the timing data. Error details give you the problematic folder path. I’ve handled some path issues just by zipping folders (usually they were Xcode application code). Now I successfully back up around 2 TB from several volumes or volume subfolders.

      Looking into DSM Help > Backup & Replication > Data Backup > General notes, you can also find some clues (but I haven’t tested it like that):
      The maximum length of the complete folder path for backup is 2048 characters.
      If over 32000 files or folders are uploaded to an EXT3 shared folder via AFP, all backup tasks including the shared folder will fail.

      Did that solve your problem?
      If not, come back with errors’ details.

    • Another issue you can stumble upon is a TC mounting problem: if you restarted the DS while the TC HDD was sleeping/hibernated, then the DS won’t mount it. If you miss it, then in effect, when starting the backup task, the DS will back up to the local folder that was used to mount the remote one, i.e. to itself (oops!).
      If you’ve encountered that, then stop the backup task and:
      1) Delete all locally backed-up files (leaving the backup target folder empty).
      2) Wake TC HDD using Finder – just visit some TC volume to wake it up.
      3) Reconnect TC Remote Folder – go to DS File Station > Tools > Mount List > Remote Folder, highlight your TC Remote Folder and click Reconnect.
      4) Start backup task once again. Don’t worry, it will find previous backup files on TC and will take them into account during mirroring process.

  2. Thanks for your solution, but it didn’t work for me!
    The errors are:
    [Local to share] Failed to export system configuration
    [Local to share] Failed to run backup task.

    Any solution?

    Other question: how to back up app settings with scripts?
    Thanks

    Schlew

    • At which stage does it happen? Did Syno back up anything before this failure? Is the TC backup folder mounted (is the ATC reachable)?
      Usually the failure is connected with:
      1) an incorrect backup config,
      2) the ATC cannot be reached (incorrect login or hibernated),
      3) issues with the backed-up files’ names or paths.
      See the Troubleshooting cheat solution section I’ve added.

      • Hi and thanks for your reply!
        ATC is reachable and I’ve tried your cheats.
        It happens just after I click on „save now”.
        The backup can create synobkpinfo.db file and @app folder on my TC, but cannot perform the backup.
        If I unmount TC folder, backup works fine on synology.
        Any idea?
        Thanks.
        Schlew

        • So, it looks like a file-name issue on the TC side – see point 1 in the Troubleshooting cheat solution section I’ve recently updated. This also happened to me before I wrote this solution.

          • Hi!
            I’ve just tested this:
            Folder to backup: volume1\test (inside is just an empty folder called test)
            Backup directory: \\192.168.1.10\Schlew\Backup on my TC
            Backup failed when attempting to save configuration.
            As you can see, the complete folder path is nowhere near the 2048-character limit and just one empty folder has to be uploaded to the TC.

            I’ve tried via your script but nothing happened. Don’t know what I’m doing wrong.

            This is the script:

            #!/bin/bash
            # Run command: ash /volume1/backup/backup/sync.sh

            # Provide your Apple Time Capsule mount dir on Synology:
            TIMECAPSULE=”/volume1/backup/TC”

            # Then instead of providing entire path like:
            # DESTINATION=”/volume1/Private/Backup/Time Capsule/Backup/Photo/”
            # We can replicate source structure like this:
            # DESTINATION=${TIMECAPSULE}${SOURCE}

            # Here goes your backup definition:

            # Backup test
            SOURCE=”/volume1/test/”
            DESTINATION=${TIMECAPSULE}${SOURCE}
            sync

            Sorry again and thanks a lot

            Schlew

          • log file:
            Started: Sun Jan 17 19:52:53 CET 2016
            Script executed from: /usr/syno/synoman/webapi
            Script location: /volume1/backup/backup
            Excluded list:
            @eaDir/
            .DS_Store
            .apdisk
            .*
            .*/

          • ssh error:

            Loups> ash /volume1/backup/backup/sync.sh
            : not foundckup/backup/sync.sh: /volume1/backup/backup/sync-backups.sh: line 3:

            Yet sync-backups.sh is in the folder volume1/backup/backup.

            Don’t understand. Sorry

          • 1) Have you mounted the TC folder as a Remote Folder using the Time Capsule account?
            2) It looks like the error string got cut off. I can’t figure out what exactly happened.

            PS
            What about the cheat solution after reading the troubleshooter?

          • You know what? I made it!
            With your script. The only thing I’ve changed is that I modified the .sh files with Notepad instead of WordPad!
            Thanks a lot for your help and for your script!!!!!
            It works really well!
            Schlew

          • Thanks for your feedback! I would never have guessed you edited them using a word processor. I’ve already updated the text with this warning.

            PS
            Have you really given up on the cheat solution?

          • Yes, I gave up on the cheat solution. I didn’t find out why it didn’t work with the Backup & Replication App.
            I will use your script. The essential thing is that my music, photos, and other files are saved…. and that I can use my Time Capsule to do it!
            Thanks again!
            Schlew

  3. Hi Krzysztof,

    I have tried to follow your excellently written „cheat” on my DS 214play with the latest DSM (6.0.2), to no avail. When deleting all files in the DS Backup folder, the status of the task turns to offline, and I can’t get it online again. I use dedicated local IP addresses for both the Time Capsule router and the Synology NAS and managed to mount the shared folder as described. However, the NAS notices the deletion of the files and turns stubborn..

    I am trying to sync my files on my harddisk on a Time Capsule with the DS 214. QNAP has a module called RTRR which is exactly what I am looking for, but I can’t get this to work e.g. with your cheat on the Synology. Is there any other way to do this?

    Any help would be greatly appreciated.

    Best regards,
    Martin

    • I’ve managed to replicate your issue and am trying to find a solution. Be patient.

    • OK, try this:
      Assuming you have created a local backup task with the destination /Volume 1/Backup/Test, do not delete the backup files created in it, but move them to the TC backup folder (or copy, then delete). Then mount the TC backup folder under /Volume 1/Backup/Test.
      This works for me but come back with your result.

      • Hi Krzysztof,

        Thanks! The backup works now, however it’s very slow (max 7 Mbit/s although it’s a 1000 Mbit ethernet connection).

        Thanks for your help, appreciated.
        Martin

        • 7 megabits or 7 megabytes per second?
          Anyway, I’ll measure this speed soon, because the backup cannot proceed during an SHR consistency check (I had to exchange one drive and the SHR repair takes a few days). Meanwhile, check whether your TC is reliably connected (i.e. the Ethernet connector sits snugly in its socket). One time I found my TC was working at only 100 Mbit/s because I had plugged the Ethernet connector in loosely.

        • On my 1 Gb/s network the backup speed varies between 6 MB/s and 20 KB/s, usually around 40 KB/s. I think the speed depends on the backup file structure, i.e. large files back up faster due to the relatively small backup administration overhead (written backup metadata).

        • One more thing…
          I couldn’t get Local Shared Folder & External Storage, a Synology-powered backup format with versioning and data compression, to work; I always got a Failed to access backup destination error. I had to use Hyper Backup App > Create > Data backup task > Local Data Copy (it’s simple data mirroring).

  4. Hi,

    I have been trying to use your cheat method but keep getting the error message „failed to start backup task”.

    Looked at all your troubleshooting comments but cannot understand why it does not work.

    Please advise

    Regards

    Winston

  5. Hi,

    Yes, I can browse, copy and delete folders and files as normal.

    Regards

    Winston
