CrashPlan packages for Synology NAS

UPDATE – The instructions and notes on this page apply to all three versions of the package hosted on my repo: CrashPlan, CrashPlan PRO, and CrashPlan PROe.

CrashPlan is a popular online backup solution which supports continuous syncing. With it your NAS becomes even more resilient – it could be stolen or destroyed and you would still have your data. Whilst you can pay a small monthly charge for a storage allocation in the Cloud, one neat feature CrashPlan offers is for individuals to collaboratively back up their important data to each other – for free! You could install CrashPlan on your laptop and have it continuously backing up your documents to your NAS, even whilst away from home.


CrashPlan is a Java application, and one that’s typically difficult to install on a NAS – therefore an obvious candidate for me to simplify into a package, given that I’ve made a few others. I tried and failed a few months ago, getting stuck at compiling the Jtux library for ARM CPUs (the Oracle Java for Embedded doesn’t come with any headers).

I noticed a few CrashPlan setup guides linking to my Java package, and decided to try again based on these: Kenneth Larsen’s blog post, the Vincesoft blog article for installing on ARM processor Iomega NAS units, and this handy PDF document which is a digest of all of them, complete with download links for the additional compiled ARM libraries. I used the PowerPC binaries Christophe had compiled on his blog, so thanks go to him. I wanted to make sure the package didn’t require the NAS to be bootstrapped, so I picked out the few generic binaries that were needed (bash, nice and cpio) directly from the Optware repo.

UPDATE – For version 3.2 I also had to identify and then figure out how to compile Tim Macinta’s Fast MD5 library, to fix the supplied library on ARM systems (CrashPlan only distributes libraries for x86). I’m documenting that process here in case more libs are required in future versions. I identified it from the error message in log/engine_error.log, and by running objdump -x I could see that the Java_com_twmacinta_util_MD5_Transform_1native function mentioned in the error was present in the x86 lib but not in the version I had compiled from W3C Libwww. I took the headers from an install of OpenJDK on a regular Ubuntu desktop. I then used the Linux x86 source from the download bundle on Tim’s website – the closest match – and compiled it directly on the syno using the command line from a comment in another version of that source:
gcc -O3 -shared -I/tmp/jdk_headers/include /tmp/fast-md5/src/lib/arch/linux_x86/MD5.c -o
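The symbol check described above can be reproduced with a short script. This is a sketch: `check_jni_symbol` is my own name for it (not part of the package), and the library path you pass is whichever .so you want to inspect.

```shell
# Sketch: check whether a shared library exports the JNI symbol named in
# log/engine_error.log. Pass the path to the .so you want to inspect.
check_jni_symbol() {
  if objdump -x "$1" 2>/dev/null | grep -q 'Java_com_twmacinta_util_MD5_Transform_1native'; then
    echo "symbol present"
  else
    echo "symbol missing"
  fi
}
```

Running this against the x86 library and your own compiled one shows at a glance which export is missing.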

Aside from the challenges of getting the library dependencies fixed for ARM and QorIQ PowerPC systems, there was also the matter of compliance – Code 42 Software’s EULA prohibits redistribution of their work. I had to make the syno package download CrashPlan for Linux (after the end user agrees their EULA), then I had to write my own script to extract this archive and mimic their installer, since their installer is interactive. It took a lot of slow testing, but I managed it!

CPPROe package info

My most recent package version introduces handling of the automatic updates which Code 42 sometimes publish to the clients. This proved quite a challenge to get working, as testing was very laborious. I can confirm that it worked with the update from CrashPlan PRO 3.2 to 3.2.1, and from CrashPlan 3.2.1 to 3.4.1.




  • This package is for Marvell Kirkwood, Marvell Armada 370/XP, Intel and Freescale QorIQ/PowerQUICC PowerPC CPUs only, so please check which CPU your NAS has. It will work on an unmodified NAS, no hacking or bootstrapping required. It will only work on older PowerQUICC PowerPC models that are running DSM 5.0. It is technically possible to run CrashPlan on older DSM versions, but it requires chroot-ing to a Debian install. Christophe has recently released packages to automate this.
  • In the User Control Panel in DSM, enable the User Homes service.
  • Install the package directly from Package Center in DSM. In Settings -> Package Sources add my package repository URL which is
  • You will need to install either one of my Java SE Embedded packages first (Java 6 or 7). Read the instructions on that page carefully too.
  • If you previously installed CrashPlan manually using the Synology Wiki, you can find uninstall instructions here.
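The Java prerequisite can be verified by hand before installing. A minimal sketch (the function name is mine; the package's preinst script performs the equivalent checks):

```shell
# Sketch: verify JAVA_HOME is set and points at an executable java binary,
# mirroring the checks the package's preinst script performs
check_java() {
  if [ -n "${JAVA_HOME}" ] && [ -x "${JAVA_HOME}/bin/java" ]; then
    echo "Java OK"
  else
    echo "Java missing - install the Java SE Embedded package first"
  fi
}
```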


  • The package downloads the CrashPlan installer directly from Code 42 Software, following acceptance of their EULA. I am complying with their wish that no one redistributes it.
  • CrashPlan is installed in headless mode – backup engine only. This is configured by a desktop client, but operates independently of it.
  • The engine daemon script checks the amount of system RAM and scales the Java heap size appropriately (up to the default maximum of 512MB). This can be overridden in a persistent way if you are backing up very large backup sets by editing /volume1/@appstore/CrashPlan/syno_package.vars. If you’re considering buying a NAS purely to use CrashPlan and intend to back up more than a few hundred GB then I strongly advise buying one of the Intel models which come with 1GB RAM and can be upgraded to 3GB very cheaply. RAM is very limited on the ARM ones. 128MB RAM on the J series means CrashPlan is running with only one fifth of the recommended heap size, so I doubt it’s viable for backing up very much at all. My DS111 has 256MB of RAM and currently backs up around 60GB with no issues. I have found that a 512MB heap was insufficient to back up more than 2TB of files on a Windows server. It kept restarting the backup engine every few minutes until I increased the heap to 1024MB.
  • As with my other syno packages, the daemon user account password is randomized when it is created using the openssl binary. DSM Package Center runs as the root user so my script starts the package using an su command. This means that you can change the password yourself and CrashPlan will still work.
  • The default location for saving friends’ backups is set to /volume1/crashplan/backupArchives (where /volume1 is your primary storage volume) to eliminate the chance of them being destroyed accidentally by uninstalling the package.
  • The first time you run the server you will need to stop it and restart it before you can connect the client. This is because a config file that’s only created on first run needs to be edited by one of my scripts. The engine is then configured to listen on all interfaces on the default port 4243.
  • Once the engine is running, you can manage it by installing CrashPlan on another computer, and editing the file conf/ on that computer so that this line:
    is uncommented (by removing the hash symbol) and set to the IP address of your NAS, e.g.:
    On Windows you can also disable the CrashPlan service if you will only use the client.
  • If you need to manage CrashPlan from a remote location, I suggest you do so using SSH tunnelling as per this support document.
  • The package supports upgrading to future versions while preserving the machine identity, logs, login details, and cache. Upgrades can now take place without requiring a login from the client afterwards.
  • If you remove the package completely and re-install it later, you can re-attach to previous backups. When you log in to the Desktop Client with your existing account after a re-install, you can select “adopt computer” to merge the records, and preserve your existing backups. I haven’t tested whether this also re-attaches links to friends’ CrashPlan computers and backup sets, though the latter does seem possible in the Friends section of the GUI. It’s probably a good idea to test that this survives a package reinstall before you start relying on it. Sometimes, particularly with CrashPlan PRO I think, the adopt option is not offered. In this case you can log into CrashPlan Central and retrieve your computer’s GUID. On the CrashPlan client, double-click on the logo in the top right and you’ll enter a command line mode. You can use the GUID command to change the system’s GUID to the one you just retrieved from your account.
  • The log which is displayed in the package’s Log tab is actually the activity history. If you’re trying to troubleshoot an issue you will need to use an SSH session to inspect the two engine log files which are:
  • When CrashPlan downloads and attempts to run an automatic update, the update script will most likely fail and stop the package. This is typically caused by syntax differences in the Synology versions of certain Linux shell commands (like rm, mv, or ps). If this happens, wait several minutes before taking action, because the update script tries to restart CrashPlan 10 times at 10 second intervals. After this, simply start the package again in Package Center and my scripts will fix the update, then run it. One final package restart is required before you can connect with the CrashPlan Desktop client (remember to update that too).
  • After their backup is seeded some users may wish to schedule the CrashPlan engine using cron so that it only runs at certain times. This is particularly useful on ARM systems because CrashPlan currently prevents hibernation while it is running (unresolved issue, reported to Code 42). To schedule, edit /etc/crontab and add the following entries for starting and stopping CrashPlan:
    55 2 * * * root /var/packages/CrashPlan/scripts/start-stop-status start
    0  4 * * * root /var/packages/CrashPlan/scripts/start-stop-status stop

    This example would configure CrashPlan to run daily between 02:55 and 04:00am. CrashPlan by default will scan the whole backup selection for changes at 3:00am so this is ideal. The simplest way to edit crontab if you’re not really confident with Linux is to install Merty’s Config File Editor package, which requires the official Synology Perl package to be installed too (since DSM 4.2). After editing crontab you will need to restart the cron daemon for the changes to take effect:
    /usr/syno/etc.defaults/rc.d/ stop
    /usr/syno/etc.defaults/rc.d/ start

    It is vitally important that you do not improvise your own startup commands or use a different account because this will most likely break the permissions on the config files, causing additional problems. The package scripts are designed to be run as root, and they will in turn invoke the CrashPlan engine using its own dedicated user account.
  • If you update DSM later, you will need to re-install the Java package or else UTF-8 and locale support will be broken by the update.
  • If you decide to sign up for one of CrashPlan’s paid backup services as a result of my work on this, I would really appreciate it if you could use this affiliate link, or consider donating using the PayPal button on the right.
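The RAM-based heap scaling mentioned in the notes works roughly like this sketch. The intermediate tier values below are illustrative only (the real figures live in the package's daemon script); just the 512MB cap comes from CrashPlan's own default.

```shell
# Sketch of RAM-scaled Java heap sizing. The tier values are illustrative;
# the package's actual daemon script defines the real ones. Only the 512MB
# cap is CrashPlan's documented default maximum heap.
heap_for_ram() {
  RAM_MB=$1
  if   [ "$RAM_MB" -le 128 ]; then echo 80
  elif [ "$RAM_MB" -le 256 ]; then echo 192
  elif [ "$RAM_MB" -le 512 ]; then echo 384
  else echo 512   # CrashPlan's default maximum heap
  fi
}
```

As described above, a USR_MAX_HEAP value in syno_package.vars overrides the computed figure, which is how very large backup sets get a bigger heap.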

Package scripts

For information, here are the package scripts so you can see what the package is going to do. You can get more information about how packages work by reading the Synology Package wiki.


#--------CRASHPLAN installer script
#--------package maintained at

[ "${SYNOPKG_PKGNAME}" == "CrashPlan" ] && DOWNLOAD_FILE="CrashPlan_3.6.3_Linux.tgz"
[ "${SYNOPKG_PKGNAME}" == "CrashPlanPRO" ] && DOWNLOAD_FILE="CrashPlanPRO_3.6.3_Linux.tgz"
[ "${SYNOPKG_PKGNAME}" == "CrashPlanPROe" ] && DOWNLOAD_FILE="CrashPlanPROe_3.6.3_Linux.tgz"
DAEMON_USER="`echo ${SYNOPKG_PKGNAME} | awk {'print tolower($_)'}`"
DAEMON_PASS="`openssl rand 12 -base64 2>/dev/null`"
SYNO_CPU_ARCH="`uname -m`"
[ "${SYNO_CPU_ARCH}" == "x86_64" ] && SYNO_CPU_ARCH="i686"
NATIVE_BINS_FILE="`echo ${NATIVE_BINS_URL} | sed -r "s%^.*/(.*)%\1%"`"
TEMP_FOLDER="`find / -maxdepth 2 -name '@tmp' | head -n 1`"
#the Manifest folder is where friends' backup data is stored
#we set it outside the app folder so it persists after a package uninstall
MANIFEST_FOLDER="/`echo $TEMP_FOLDER | cut -f2 -d'/'`/crashplan"
UPGRADE_FILES="syno_package.vars conf/my.service.xml conf/service.login conf/service.model"

source /etc/profile
PUBLIC_FOLDER="`cat /usr/syno/etc/smb.conf | sed -r '/\/public$/!d;s/^.*path=(\/volume[0-9]{1,4}\/public).*$/\1/'`"

preinst ()
{
  if [ -z ${PUBLIC_FOLDER} ]; then
    echo "A shared folder called 'public' could not be found - note this name is case-sensitive. "
    echo "Please create this using the Shared Folder DSM Control Panel and try again."
    exit 1
  fi

  if [ -z ${JAVA_HOME} ]; then
    echo "Java is not installed or not properly configured. JAVA_HOME is not defined. "
    echo "Download and install the Java Synology package from"
    exit 1
  fi
  if [ ! -f ${JAVA_HOME}/bin/java ]; then
    echo "Java is not installed or not properly configured. The Java binary could not be located. "
    echo "Download and install the Java Synology package from"
    exit 1
  fi
  #is the User Home service enabled?
  synouser --add userhometest Testing123 "User Home test user" 0 "" ""
  UHT_HOMEDIR=`cat /etc/passwd | sed -r '/User Home test user/!d;s/^.*:User Home test user:(.*):.*$/\1/'`
  if echo $UHT_HOMEDIR | grep '/var/services/homes/' > /dev/null; then
    if [ ! -d $UHT_HOMEDIR ]; then
  synouser --del userhometest
  #remove home directory (needed since DSM 4.1)
  [ -e /var/services/homes/userhometest ] && rm -r /var/services/homes/userhometest
  if [ "${UH_SERVICE}" == "false" ]; then
    echo "The User Home service is not enabled. Please enable this feature in the User control panel in DSM."
    exit 1
    WGET_FILENAME="`echo ${WGET_URL} | sed -r "s%^.*/(.*)%\1%"`"
    wget ${WGET_URL}
    if [[ $? != 0 ]]; then
      if [ -d ${PUBLIC_FOLDER} ] && [ -f ${PUBLIC_FOLDER}/${WGET_FILENAME} ]; then
        echo "There was a problem downloading ${WGET_FILENAME} from the official download link, "
        echo "which was \"${WGET_URL}\" "
        echo "Alternatively, you may download this file manually and place it in the 'public' shared folder. "
        exit 1
  exit 0

postinst ()
{
  #create daemon user
  synouser --add ${DAEMON_USER} ${DAEMON_PASS} "${DAEMON_ID}" 0 "" ""
  #save the daemon user's homedir as variable in that user's profile
  #this is needed because new users seem to inherit a HOME value of /root which they have no permissions for.
  su - ${DAEMON_USER} -s /bin/sh -c "echo export HOME=\'${DAEMON_HOME}\' >> .profile"

  #extract CPU-specific additional binaries
  mkdir ${SYNOPKG_PKGDEST}/bin

  #extract main archive
  #extract cpio archive
  cat "${TEMP_FOLDER}/${EXTRACTED_FOLDER}"/${CPI_FILE} | gzip -d -c | ${SYNOPKG_PKGDEST}/bin/cpio -i --no-preserve-owner
  echo "#uncomment to expand Java max heap size beyond prescribed value (will survive upgrades)" > ${SYNOPKG_PKGDEST}/syno_package.vars
  echo "#you probably only want more than the recommended 512M if you're backing up extremely large volumes of files" >> ${SYNOPKG_PKGDEST}/syno_package.vars
  echo "#USR_MAX_HEAP=512M" >> ${SYNOPKG_PKGDEST}/syno_package.vars
  echo >> ${SYNOPKG_PKGDEST}/syno_package.vars

  #the following Package Center variables will need retrieving if launching CrashPlan via cron

  cp ${TEMP_FOLDER}/${EXTRACTED_FOLDER}/scripts/run.conf ${OPTDIR}/bin
  mkdir -p ${MANIFEST_FOLDER}/backupArchives    
  #save install variables which Crashplan expects its own installer script to create
  echo BINSDIR=/bin >> ${VARS_FILE}
  echo MANIFESTDIR=${MANIFEST_FOLDER}/backupArchives >> ${VARS_FILE}
  #leave these ones out which should help upgrades from Code42 to work (based on examining an upgrade script)
  #echo INITDIR=/etc/init.d >> ${VARS_FILE}
  #echo RUNLVLDIR=/usr/syno/etc/rc.d >> ${VARS_FILE}
  echo INSTALLDATE=`date +%Y%m%d` >> ${VARS_FILE}
  echo JAVACOMMON=\${JAVA_HOME}/bin/java >> ${VARS_FILE}
  cat ${TEMP_FOLDER}/${EXTRACTED_FOLDER}/install.defaults >> ${VARS_FILE}
  #remove temp files
  #change owner of CrashPlan folder tree
  exit 0

preuninst ()
{
  #make sure engine is stopped
  su - ${DAEMON_USER} -s /bin/sh -c "${OPTDIR}/bin/${ENGINE_SCRIPT} stop"
  sleep 2
  exit 0
}

postuninst ()
{
  if [ -f ${SYNOPKG_PKGDEST}/syno_package.vars ]; then
    source ${SYNOPKG_PKGDEST}/syno_package.vars
  fi

  if [ "${LIBFFI_SYMLINK}" == "YES" ]; then
    rm /lib/
  fi
  #if it doesn't exist, but is still a link then it's a broken link and should also be deleted
  if [ ! -e /lib/ ]; then
    [ -L /lib/ ] && rm /lib/
  fi
  #remove daemon user
  synouser --del ${DAEMON_USER}
  #remove daemon user's home directory (needed since DSM 4.1)
  [ -e /var/services/homes/${DAEMON_USER} ] && rm -r /var/services/homes/${DAEMON_USER}
  exit 0
}

preupgrade ()
{
  #make sure engine is stopped
  su - ${DAEMON_USER} -s /bin/sh -c "${OPTDIR}/bin/${ENGINE_SCRIPT} stop"
  sleep 2
  #if identity and config data exists back it up
  if [ -d ${DAEMON_HOME}/.crashplan ]; then
    mkdir -p ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/conf
    mv ${DAEMON_HOME}/.crashplan ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig
      if [ -f ${OPTDIR}/${FILE_TO_MIGRATE} ]; then
      if [ -d ${OPTDIR}/${FOLDER_TO_MIGRATE} ]; then

  exit 0

postupgrade ()
{
  #use the migrated identity and config data from the previous version
  if [ -d ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/.crashplan ]; then
    mv ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/.crashplan ${DAEMON_HOME}
      if [ -f ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/${FILE_TO_MIGRATE} ]; then
    if [ -d ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/${FOLDER_TO_MIGRATE} ]; then
    rmdir ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/conf
    rmdir ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig
    #make CrashPlan log entry
    TIMESTAMP="`date +%D` `date +%I:%M%p`"
    echo "I ${TIMESTAMP} Synology Package Center updated ${SYNOPKG_PKGNAME} to version ${SYNOPKG_PKGVER}" >> ${LOG_FILE}
    #daemon user has been deleted and recreated so we need to reset ownership (new UID)
    chown -R ${DAEMON_USER} ${DAEMON_HOME}/.crashplan
    #read manifest location from the migrated XML config, and reset ownership on that path too
    if [ -f ${SYNOPKG_PKGDEST}/conf/my.service.xml ]; then
      MANIFEST_FOLDER=`cat ${SYNOPKG_PKGDEST}/conf/my.service.xml | grep "<manifestPath>" | cut -f2 -d'>' | cut -f1 -d'<'`
    #the following Package Center variables will need retrieving if launching CrashPlan via cron
    grep "^CRON_SYNOPKG_PKGNAME" ${SYNOPKG_PKGDEST}/syno_package.vars > /dev/null \
     || echo "CRON_SYNOPKG_PKGNAME='${SYNOPKG_PKGNAME}'" >> ${SYNOPKG_PKGDEST}/syno_package.vars
    grep "^CRON_SYNOPKG_PKGDEST" ${SYNOPKG_PKGDEST}/syno_package.vars > /dev/null \
     || echo "CRON_SYNOPKG_PKGDEST='${SYNOPKG_PKGDEST}'" >> ${SYNOPKG_PKGDEST}/syno_package.vars
  exit 0


#--------CRASHPLAN start-stop-status script
#--------package maintained at

if [ "${SYNOPKG_PKGNAME}" == "" ]; then
  #if this script has been invoked by cron then some Package Center vars are undefined
  source "`dirname $0`/../target/syno_package.vars"

#Main variables section
DAEMON_USER="`echo ${SYNOPKG_PKGNAME} | awk {'print tolower($_)'}`"
TEMP_FOLDER="`find / -maxdepth 2 -name '@tmp' | head -n 1`"
MANIFEST_FOLDER="/`echo $TEMP_FOLDER | cut -f2 -d'/'`/crashplan"
LIBFFI_SO_NAMES="5 6" #armada370 build of is newer, and uses
source ${OPTDIR}/install.vars

JAVA_MIN_HEAP=`grep "^${CFG_PARAM}=" "${OPTDIR}/bin/${ENGINE_CFG}" | sed -r "s/^.*-Xms([0-9]+)[Mm] .*$/\1/"`
SYNO_CPU_ARCH="`uname -m`"

case $1 in
    #set the current timezone for Java so that log timestamps are accurate
    #we need to use the modern timezone names so that Java can figure out DST 
    SYNO_TZ=`cat /etc/synoinfo.conf | grep timezone | cut -f2 -d'"'`
    SYNO_TZ=`grep "^${SYNO_TZ}" /usr/share/zoneinfo/Timezone/tzname | sed -e "s/^.*= //"`
    grep "^export TZ" ${DAEMON_HOME}/.profile > /dev/null \
     && sed -i "s%^export TZ=.*$%export TZ='${SYNO_TZ}'%" ${DAEMON_HOME}/.profile \
     || echo export TZ=\'${SYNO_TZ}\' >> ${DAEMON_HOME}/.profile
    #this package stores the machine identity in the daemon user home directory
    #so we need to remove any old config data from previous manual installations or startups
    [ -d /var/lib/crashplan ] && rm -r /var/lib/crashplan

    #check persistent variables from syno_package.vars
    if [ -f ${SYNOPKG_PKGDEST}/syno_package.vars ]; then
      source ${SYNOPKG_PKGDEST}/syno_package.vars
    USR_MAX_HEAP=`echo $USR_MAX_HEAP | sed -e "s/[mM]//"`

    #create or repair libffi symlink if a DSM upgrade has removed it
    for FFI_VER in ${LIBFFI_SO_NAMES}; do 
      if [ -e ${OPTDIR}/lib/${FFI_VER} ]; then
        if [ ! -e /lib/${FFI_VER} ]; then
          #if it doesn't exist, but is still a link then it's a broken link and should be deleted
          [ -L /lib/${FFI_VER} ] && rm /lib/${FFI_VER}
          ln -s ${OPTDIR}/lib/${FFI_VER} /lib/${FFI_VER}

    #fix up some of the binary paths and fix some command syntax for busybox 
    #moved this to start-stop-status from because Code42 push updates and these
    #new scripts will need this treatment too
    for TARGET in ${SCRIPTS_TO_EDIT}; do
    find ${OPTDIR} \( -name \*.sh ${FIND_TARGETS} \) | while IFS="" read -r FILE_TO_EDIT; do
      if [ -e ${FILE_TO_EDIT} ]; then
        #this list of substitutions will probably need expanding as new CrashPlan updates are released
        sed -i "s%^#!/bin/bash%#!${SYNOPKG_PKGDEST}/bin/bash%" "${FILE_TO_EDIT}"
        sed -i -r "s%(^\s*)nice -n%\1${SYNOPKG_PKGDEST}/bin/nice -n%" "${FILE_TO_EDIT}"
        sed -i -r "s%(^\s*)(/bin/ps|ps) [^\|]*\|%\1/bin/ps w \|%" "${FILE_TO_EDIT}"
        sed -i -r "s%\`ps [^\|]*\|%\`ps w \|%" "${FILE_TO_EDIT}"
        sed -i "s/rm -fv/rm -f/" "${FILE_TO_EDIT}"
        sed -i "s/mv -fv/mv -f/" "${FILE_TO_EDIT}"

    #any downloaded upgrade script will usually have failed until the above changes are made so we need to
    #find it and start it, if it exists
    UPGRADE_SCRIPT=`find ${OPTDIR}/upgrade -name ""`
    if [ -n "${UPGRADE_SCRIPT}" ]; then
      rm ${OPTDIR}/${ENGINE_SCRIPT}.pid

      #make CrashPlan log entry
      TIMESTAMP="`date +%D` `date +%I:%M%p`"
      echo "I ${TIMESTAMP} Synology repairing upgrade in ${SCRIPT_HOME}" >> ${LOG_FILE}

      mv ${SCRIPT_HOME}/upgrade.log ${SCRIPT_HOME}/upgrade.log.old
      su - ${DAEMON_USER} -s /bin/sh -c "cd ${SCRIPT_HOME} ; ."
      mv ${SCRIPT_HOME}/ ${SCRIPT_HOME}/
      exit 0

    #updates may also overwrite our native binaries
    if [ "${SYNO_CPU_ARCH}" == "x86_64" ]; then
      cp ${SYNOPKG_PKGDEST}/bin/ ${OPTDIR}/lib
      cp -f ${SYNOPKG_PKGDEST}/bin/ ${OPTDIR}
      cp -f ${SYNOPKG_PKGDEST}/bin/jna-3.2.5.jar ${OPTDIR}/lib
      cp -f ${SYNOPKG_PKGDEST}/bin/* ${OPTDIR}/lib

    #set appropriate Java max heap size
    RAM=$((`free | grep Mem: | sed -e "s/^ *Mem: *\([0-9]*\).*$/\1/"`/1024))
    if [ $RAM -le 128 ]; then
    elif [ $RAM -le 256 ]; then
    elif [ $RAM -le 512 ]; then
    #CrashPlan's default max heap is 512MB
    elif [ $RAM -gt 512 ]; then
    if [ $USR_MAX_HEAP -gt $JAVA_MAX_HEAP ]; then
    if [ $JAVA_MAX_HEAP -lt $JAVA_MIN_HEAP ]; then
      #can't have a max heap lower than min heap (ARM low RAM systems)
    sed -i -r "s/(^${CFG_PARAM}=.*) -Xmx[0-9]+[mM] (.*$)/\1 -Xmx${JAVA_MAX_HEAP}m \2/" "${OPTDIR}/bin/${ENGINE_CFG}"
    #disable the use of the x86-optimized external Fast MD5 library if running on ARM and QorIQ CPUs
    #seems to be the default behaviour now but that may change again
    if [ "${SYNO_CPU_ARCH}" != "x86_64" ]; then
      grep "^${CFG_PARAM}=.*c42\.native\.md5\.enabled" "${OPTDIR}/bin/${ENGINE_CFG}" > /dev/null \
       || sed -i -r "s/(^${CFG_PARAM}=\".*)\"$/\1 -Dc42.native.md5.enabled=false\"/" "${OPTDIR}/bin/${ENGINE_CFG}"

    #move the Java temp directory from the default of /tmp
    grep "^${CFG_PARAM}=.*Djava\.io\.tmpdir" "${OPTDIR}/bin/${ENGINE_CFG}" > /dev/null \
     || sed -i -r "s%(^${CFG_PARAM}=\".*)\"$%\1${TEMP_FOLDER}\"%" "${OPTDIR}/bin/${ENGINE_CFG}"

    #reset ownership of all files to daemon user, so that manual edits to config files won't cause problems
    chown -R ${DAEMON_USER} ${DAEMON_HOME}    

    #now edit the XML config file, which only exists after first run
    if [ -f ${SYNOPKG_PKGDEST}/conf/my.service.xml ]; then

      #allow direct connections from CrashPlan Desktop client on remote systems
      #you must edit the value of serviceHost in conf/ on the client you connect with
      #users report that this value is sometimes reset so now it's set every service startup 
      sed -i "s/<serviceHost>127\.0\.0\.1<\/serviceHost>/<serviceHost>0\.0\.0\.0<\/serviceHost>/" "${SYNOPKG_PKGDEST}/conf/my.service.xml"
      #this change is made only once in case you want to customize the friends' backup location
      if [ "${MANIFEST_PATH_SET}" != "True" ]; then

        #keep friends' backup data outside the application folder to make accidental deletion less likely 
        sed -i "s%<manifestPath>.*</manifestPath>%<manifestPath>${MANIFEST_FOLDER}/backupArchives/</manifestPath>%" "${SYNOPKG_PKGDEST}/conf/my.service.xml"
        echo "MANIFEST_PATH_SET=True" >> ${SYNOPKG_PKGDEST}/syno_package.vars

      #since CrashPlan version 3.5.3 the value javaMemoryHeapMax also needs setting to match that used in bin/run.conf
      sed -i -r "s%(<javaMemoryHeapMax>)[0-9]+[mM](</javaMemoryHeapMax>)%\1${JAVA_MAX_HEAP}m\2%" "${SYNOPKG_PKGDEST}/conf/my.service.xml"
      echo "Wait a few seconds, then stop and restart the package to allow desktop client connections." > "${SYNOPKG_TEMP_LOGFILE}"
    if [ "${CRON_LAUNCHED}" == "True" ]; then
      [ -e /var/packages/${SYNOPKG_PKGNAME}/enabled ] || touch /var/packages/${SYNOPKG_PKGNAME}/enabled

    #delete any stray Java temp files
    find /tmp -name "jna*.tmp" -user ${DAEMON_USER} | while IFS="" read -r FILE_TO_DEL; do
      if [ -e ${FILE_TO_DEL} ]; then
        rm ${FILE_TO_DEL}

    #increase the system-wide maximum number of open files from Synology default of 24466
    echo "65536" > /proc/sys/fs/file-max

    #raise the maximum open file count from the Synology default of 1024 - thanks Casper K. for figuring this out
    ulimit -n 65536

    if [ "${SYNO_CPU_ARCH}" == "x86_64" ]; then
      #Intel synos running older DSM need rwojo's glibc version shim for inotify support
      GLIBC_VER="`/lib/ | grep -m 1 version | sed -r "s/^[^0-9]*([0-9].*[0-9])\,.*$/\1/"`"
      if [ "${GLIBC_VER}" == "2.3.6" ]; then
        su - ${DAEMON_USER} -s /bin/sh -c "LD_PRELOAD=${SYNOPKG_PKGDEST}/lib/ ${OPTDIR}/bin/${ENGINE_SCRIPT} start"
        su - ${DAEMON_USER} -s /bin/sh -c "${OPTDIR}/bin/${ENGINE_SCRIPT} start"
      su - ${DAEMON_USER} -s /bin/sh -c "${OPTDIR}/bin/${ENGINE_SCRIPT} start"
    exit 0

    su - ${DAEMON_USER} -s /bin/sh -c "${OPTDIR}/bin/${ENGINE_SCRIPT} stop"
    if [ "${CRON_LAUNCHED}" == "True" ]; then
      [ -e /var/packages/${SYNOPKG_PKGNAME}/enabled ] && rm /var/packages/${SYNOPKG_PKGNAME}/enabled
    exit 0

    PID=`/bin/ps w | grep "app=${APP_NAME}" | grep -v grep | awk '{ print $1 }'`
    if [ -n "$PID" ]; then
      exit 0
      exit 1

    echo "${LOG_FILE}"
    exit 0
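The busybox-compatibility rewrites in the start section above can be illustrated standalone. This sketch applies a subset of those substitutions to a scratch file; the bash path used here is a placeholder for the package's own ${SYNOPKG_PKGDEST}/bin/bash.

```shell
# Sketch: a subset of the busybox-compatibility edits applied to CrashPlan's
# scripts. The bash path is a placeholder for ${SYNOPKG_PKGDEST}/bin/bash.
fix_script() {
  sed -i "s%^#!/bin/bash%#!/opt/CrashPlan/bin/bash%" "$1"
  sed -i "s/rm -fv/rm -f/" "$1"   # busybox rm has no -v flag
  sed -i "s/mv -fv/mv -f/" "$1"   # busybox mv has no -v flag
}
```

Because Code 42's pushed updates ship fresh copies of these scripts, the real package re-applies its substitutions on every startup rather than once at install time.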


  • 0027 Fixed open file handle limit for very large backup sets (ulimit fix)
  • 0026 Updated all CrashPlan clients to version 3.6.3, improved handling of Java temp files
  • 0025 glibc version shim no longer used on Intel Synology models running DSM 5.0
  • 0024 Updated to CrashPlan PROe and added support for PowerPC 2010 Synology models running DSM 5.0
  • 0023 Added support for Intel Atom Evansport and Armada XP CPUs in new DSx14 products
  • 0022 Updated all CrashPlan client versions to 3.5.3, compiled native binary dependencies to add support for Armada 370 CPU (DS213j), now updates the new javaMemoryHeapMax value in my.service.xml to the value defined in syno_package.vars
  • 0021 Updated CrashPlan to version 3.5.2
  • 0020 Fixes for DSM 4.2
  • 018 Updated CrashPlan PRO to version 3.4.1
  • 017 Updated CrashPlan and CrashPlan PROe to version 3.4.1, and improved in-app update handling
  • 016 Added support for Freescale QorIQ CPUs in some x13 series Synology models, and installer script now downloads native binaries separately to reduce repo hosting bandwidth, PowerQUICC PowerPC processors in previous Synology generations with older glibc versions are not supported
  • 015 Added support for easy scheduling via cron – see updated Notes section
  • 014 DSM 4.1 user profile permissions fix
  • 013 implemented update handling for future automatic updates from Code 42, and incremented CrashPlanPRO client to release version 3.2.1
  • 012 incremented CrashPlanPROe client to release version 3.3
  • 011 minor fix to allow a wildcard on the cpio archive name inside the main installer package (to fix CP PROe client since Code 42 Software had amended the cpio file version to
  • 010 minor bug fix relating to daemon home directory path
  • 009 rewrote the scripts to be even easier to maintain and unified as much as possible with my imminent CrashPlan PROe server package, fixed a timezone bug (tightened regex matching), moved the script-amending logic from to with it now applying to all .sh scripts each startup so perhaps updates from Code42 might work in future, if wget fails to fetch the installer from Code42 the installer will look for the file in the public shared folder
  • 008 merged the 14 package scripts each (7 for ARM, 7 for Intel) for CP, CP PRO, & CP PROe – 42 scripts in total – down to just two! ARM & Intel are now supported by the same package, Intel synos now have working inotify support (Real-Time Backup) thanks to rwojo’s shim to pass the glibc version check, upgrade process now retains login, cache and log data (no more re-scanning), users can specify a persistent larger max heap size for very large backup sets
  • 007 fixed a bug that broke CrashPlan if the Java folder moved (if you changed version)
  • 006 installation now fails without User Home service enabled, fixed Daylight Saving Time support, automated replacing the ARM symlink which is destroyed by DSM upgrades, stopped assuming the primary storage volume is /volume1, reset ownership on /var/lib/crashplan and the Friends backup location after installs and upgrades
  • 005 added warning to restart daemon after 1st run, and improved upgrade process again
  • 004 updated to CrashPlan 3.2.1 and improved package upgrade process, forced binding to each startup
  • 003 fixed ownership of /volume1/crashplan folder
  • 002 updated to CrashPlan 3.2
  • 001 initial public release

2,508 thoughts on “CrashPlan packages for Synology NAS”

  1. Joachim

    Hi Patters,

    For your information, the “this affiliate link” does not work.

    Kind regards,

    1. patters Post author

      Yes unfortunately Google dropped the entire Affiliates programme some time last year, and CrashPlan never switched to a different affiliate plan.

  2. RobertL

    Thanks for the package.

    I am using the seed drive service from Crashplan, and just received my drive. When I open CrashPlan on my Mac and select destinations, I can see the USB volume mounted by my Synology NAS, but CrashPlan reports that “the backup engine does not have access to the given location”.
    I have made sure all users have permission to use this drive. What am I missing?

    1. Philip Hawkshaw

      I have this issue even though I’m selecting a folder. It is an archive brought from another Linux machine – could it be file permissions?

  3. Roms

    Everything used to work perfectly for a few months. But now, the CrashPlan headless service on my Synology is at status “stopped”. And when I choose the menu “Action -> Run”, it doesn’t start the service (still at status “stopped”). Have you seen this behavior before?
    Thanks for your help !

      1. Steve

        I thought that you couldn’t use Java 7 with an Intel DS but lo and behold it works! Thanks so much.

      2. patters Post author

        Yes Synology fixed that issue with Java 7 when DSM 5.0 came out. I wouldn’t show it in the list of available packages on the repo if it didn’t work :)

      3. Jennifer

        I think the last DSM update killed it. I uninstalled Java 8, reinstalled Java 7 and it’s working again. Thanks!

      4. Jennifer

        Oops. I take that back. It ran for a while, then stopped again. The last update I had done was for the Transmission package. I stopped the Transmission package and Crashplan now stays running. As soon as I start Transmission, Crashplan stops again.

        DS213 | DSM 5.0.443

  4. Scott

    Looks like my last comment got hidden below the fold. Wanted to make sure I got this to you for 414j support. Thanks again!
    Looks like the newest update for Java is now showing the version numbers in the log.

    Here is the results from cat command:

    and from uname command:

    If you need anything, else I’ll see what I can do. Thanks!

    1. John T. Hoffoss

      Regarding 414j support, I tried the overly simplistic tactic of downloading the package, executing ‘tar zvxpf ‘, editing INFO to add comcerto2k, ‘tar cvpzf INFO LICENSE package.tgz scripts’ and manually installing. Installation appears to succeed, but when trying to start the service, receiving the following:

      DiskStation> ./scripts/ start
      find: /volume1/@appstore/CrashPlan/upgrade: No such file or directory
      ./scripts/ line 7: can’t create : nonexistent directory
      -sh: /volume1/@appstore/CrashPlan/bin/CrashPlanEngine: not found

      But it’s there:

      DiskStation> ls -la /volume1/@appstore/CrashPlan/bin/
      drwxr-xr-x 2 crashpla root 4096 Aug 17 18:10 .
      drwxr-xr-x 3 crashpla root 4096 Aug 17 18:09 ..
      -rwxr-xr-x 1 crashpla root 4771 Aug 17 18:10 CrashPlanEngine
      -rwxr-xr-x 1 crashpla root 906921 Jun 3 2013 bash
      -rwxr-xr-x 1 crashpla root 184958 Jun 3 2013 cpio
      -rwxrwxrwx 1 crashpla root 972340 Jun 3 2013 jna-3.2.5.jar
      -rwxr-xr-x 1 crashpla root 31756 Jun 3 2013
      -rwxrwxr-x 1 crashpla root 126646 Jun 3 2013
      -rwxrwxr-x 1 crashpla root 44089 Jun 3 2013 nice
      -rw-r–r– 1 crashpla root 620 Aug 17 18:10 run.conf

      DiskStation> ls -la /var/packages/CrashPlan
      drwxr-xr-x 3 root root 4096 Aug 17 18:09 .
      drwxrwxrwx 8 root root 4096 Aug 17 18:09 ..
      -rw-r–r– 1 root root 30316 Aug 17 17:51 INFO
      –wxr-x— 1 root root 0 Aug 17 18:09 enabled
      lrwxrwxrwx 1 root root 32 Aug 17 18:09 etc -> /usr/syno/etc/packages/CrashPlan
      drwxrwxrwx 2 admin users 4096 Nov 26 2012 scripts
      lrwxrwxrwx 1 root root 28 Aug 17 18:09 target -> /volume1/@appstore/CrashPlan

  5. Jeremy T

    The current JavaSE packages appear to install properly and java runs on the 414j and its armv7, but the CrashPlan package is not flagged for this architecture yet. Patters, I would be happy to help with any testing that needs to be done on that architecture, as I have a 414j available.

  6. Roms

    On my side, I uninstalled the CrashPlan engine (from my Synology 214Play – Intel), installed Java 7 instead of Java 8, reinstalled the CrashPlan engine, and performed an adoption in CrashPlan (otherwise CrashPlan considered my Synology a new machine instead of the same one as before the uninstall/reinstall).
    It worked for a few days. But now the service is stopped, and when I click on Run, again it doesn’t launch. Back to the initial problem…
    Another strange issue: the client user interface on my PC (talking to the Synology backup engine) always crashes with a critical error, saying that it was disconnected from the backup engine.
    Knowing that everything worked fine during my first 2 months (I backed up 400 GB!), I don’t understand which parameter changed to cause these troubles.
    CrashPlan support won’t help when it’s a headless install.
    Is this the proper place for these questions, or is there a specific Synology forum for CrashPlan? (I found a section “Third-party Packages”, but no CrashPlan sub-section in it)
    Thanks !

    1. Roms

      One last detail: I opened the command line from the PC GUI (by double-clicking the CrashPlan icon) and typed the command: deauthorize (=> Deauthorize the computer. This completely disables the backup service and requires login to resume). Then it asked me to log in (still in the PC GUI). Maybe this command is not compatible with the headless version, forcing the backup engine to stop and preventing it from restarting?

    2. Bjorn

      Roms, this sounds like it could be a memory issue. The bigger your backup set becomes, the greater the need for memory. Have you changed the default of 512MB in the settings? Then again, you should see OutOfMemoryExceptions in the server-side log if this happens. Can’t remember off the top of my head where it’s located, but it should be in a log folder close to the installed application.

      1. Roms

        Hello Bjorn, I feel like an idiot, but I can’t find /usr on my Synology (the CrashPlan logs on Linux are supposed to be in /usr/local/crashplan/log, but I always navigate my NAS via File Station and it doesn’t show me any /usr/local…).

      2. Roms

        Now I have PuTTY and I can navigate my Synology directories!
        /usr/local exists, but not /usr/local/crashplan… Does anybody know where the CrashPlan logs are on Synology?
        Thanks!

      3. Bjorn

        I am away and can’t check this, but you should be able to use the “find” command to locate the file. Type this at the PuTTY prompt:
        find / -name "*.log"
        This will find all files ending in .log on your hard drive.

      4. Roms

        Thanks!
        In case someone else needs this information, the logs on Synology are here: /volume1/@appstore/CrashPlan/log

  7. nvt1

    There has been a lot of activity, and now the instructions that used to work don’t.
    My DS1010+ crashed and I had to do a restore and reinstall of DSM… I lost my CrashPlan setup and configuration. Frustrating, as I had not updated DSM for weeks/months because everything was working fine with the CrashPlan headless application.
    So once I got the unit restored and set up, I went about following the instructions.
    Added the source and saw the Java installs. Successfully downloaded the Java 8 install and ran and installed it.
    But when I looked for CrashPlan all I saw was CrashPlan PROe. I tried that, but it sets up my device as a CrashPlan server, which is not what I want.
    I want to install the headless CrashPlan client so I can continue my backup, which had been working for the past several months and was 60% complete.

    Is there some new technique or source to install the CrashPlan headless client, or has this reverted to a manual install – and if so, which one?
    I believe my unit is an Intel dual-core 1.6 GHz x86.

    Thanks for any help/guidance

    1. nvt1

      Minor update – it showed up as CrashPlan (not sure why it did not show up before).
      I installed it and it starts and then stops immediately.
      I have tried Java v7 and v6 (uninstalled v8 and tried v7, uninstalled v7 and installed v6).
      Same response:
      CrashPlan starts and then stops immediately.

      1. Shane

        You will have to change the amount of RAM that CrashPlan is using. The posted instructions list where and what values to use, depending on the unit and how much RAM your Synology has.

      2. sirevag

        Same issue here… newest DSM, DS713+… it used to work, not anymore… tried Java 6, 7, 8… no success… the CrashPlan log says it starts then stops… tried adjusting the memory settings file, no success… anyone? I can check all logfiles etc, but not sure what the problem is… here is one logfile output (/volume1/@appstore/CrashPlan/log/engine_output.log)

        [08.03.14 19:34:37.247 INFO main root ] Checking Java memory heap max.
        [08.03.14 19:34:37.253 INFO main root ] Previous Java memory max heap size was 512
        [08.03.14 19:34:37.263 INFO main root ] END Loading Configuration
        jtux Loaded.
        [08.03.14 19:34:43.082 INFO main root ] ***** STOPPING *****
        [08.03.14 19:34:43.084 INFO Thread-0 root ] Stopping service…
        [08.03.14 19:34:43.375 INFO Thread-0 root ] Selector shutting down…

      3. sirevag

        update: it appears that some java process was hanging with a listening port 4243, and I found that by using netstat -tulpn | grep 4243 and was now able to start up the crashplan finally…

      4. nvt1

        I tried the netstat command
        netstat -tulpn | grep 4243
        Ran with output
        tcp 0 0* LISTEN 23730/java

        Still no success

        I also edited the memory in
        and increased it to 1536 (I upgraded to 3 GB)

        Still to no avail.
        I can’t seem to grab the output log, but essentially it says:
        Date/Time : CrashPlan Started v3.6.3 GUID
        Date/Time : CrashPlan Stopped v3.6.3 GUID
        The date and time are the same for all starts and stops.

        Thanks for any suggestions

      5. patters Post author

        The first time CrashPlan runs, it only accepts connections to the engine from the same host ( My scripts edit this to (allowing connections from all hosts), but I can only make this edit once the config file exists. The config gets created the first time it runs – that’s why CrashPlan must be launched twice before it works. This is explained in the notes on this page, which I think a lot of people aren’t reading. Perhaps some of you are stopping and starting too quickly for that first run. Look at the log in Package Center to confirm that it started the first time before you stop it.

        If you somehow have an orphaned process listening only for requests from then reboot your NAS and it should be fine after that.
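        The binding change patters describes amounts to editing the engine’s listen address once the config exists. A rough sketch of that edit – note the file name conf/my.service.xml and the serviceHost tag are assumptions based on a standard CrashPlan install, not taken from this page:

```shell
# Sketch only: flip the engine's listen address from localhost to all
# hosts so remote clients can connect. File name and XML tag are
# assumed from a standard CrashPlan install; adjust the volume path.
CONF=/volume1/@appstore/CrashPlan/conf/my.service.xml
if [ -f "$CONF" ]; then
  sed -i 's|<serviceHost></serviceHost>|<serviceHost></serviceHost>|' "$CONF"
fi
```

The engine has to be restarted after the edit for it to take effect.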

      6. nvt1

        Thanks patters – great work here and great support and guidance too which is greatly appreciated
        I am embarrassed to say that a simple reboot of the Synology device solved the problem, and it is now connecting and re-syncing.

        One remaining question
        Should I / can I upgrade from Java v6 to Java v7 or Java v8?

      7. patters Post author

        Glad to hear you’re back up and running. With the threat of that Synolocker malware, CrashPlan is a more relevant protection than ever.

        Java 6 isn’t maintained any more so it’s a potential security risk, and 8 is too new for most applications, so I’d recommend Java 7.

  8. Eloi

    I installed using Java v8 and later downgraded to v7, since I saw some people getting it to work on that, but it still just runs and dies off by itself. I tried multiple times already. The logs just say starting up, loading configuration, jtux loaded, then it goes into stopping.

    1. sirevag

      hi, that was my problem too! For some reason one of the java-processes from perhaps a different java version was still running, locking up the listening port for the Crashplan service.. see my reply a little bit up on this thread to solve your issue

      1. Eloi

        Weirdly, it just started working after I did 2 reboots. No idea why, but I’m not complaining.

  9. Eloi

    Has anyone tried disabling the service on the Windows machine that you are using as a client, like illustrated above?

    “Once the engine is running, you can manage it by installing CrashPlan on another computer, and editing the file conf/ on that computer so that this line:
    is uncommented (by removing the hash symbol) and set to the IP address of your NAS, e.g.:
    On Windows you can also disable the CrashPlan service if you will only use the client.”

    The moment I disable the service, it seems unable to access the service on the NAS. I double-checked the settings in the UI and they seem to reflect the ones on the NAS as well. Anyone else seeing that?

  10. mohammedslimani

    Same here: it keeps starting and stopping by itself… The logs just say starting up, loading configuration, then stopping…

    Sometimes it starts working for a while for no reason… I don’t know what to do… tested with Java 6, Java 7 and Java 8.

  11. Erik

    Thanks for a great tutorial! It was easy to follow, and I managed to get everything working smoothly with the CrashPlan backup.

    However, in the process of setting this up, I have lost the possibility of external access to my DS213. I have an AirPort Extreme, and it has worked perfectly fine in the past 1-2 years with the port forwarding. Previously, I have been able to use EZ-Internet, the Set Up Router option (in the Control Panel), or do it manually on the AirPort Extreme (all have worked). Now none of these work, despite many attempts and several restarts of the DiskStation as well as the router. When trying to set up the port forwarding, it just goes into a waiting mode – sometimes it hangs, and sometimes it comes back, but without being able to write the rules to the router.

    I have not changed anything else except installing the CrashPlan packages, so it seems this has caused some sort of conflict. Does anyone else recognize this? If I remove the CrashPlan and Java packages and then reinstall, will CrashPlan Central be able to pick up the image, or will I have to back up everything once again? (It took a month or two this time, but it’s almost done now.)

    Please help, thanks in advance!

  12. Michael Barrientos

    Bug report: installed the most recent versions as of today (Aug 10, 2014):

    DSM 5.0-4493 Update 3 (14/7/19)
    Java: ejdk-8u6-fcs-b23-linux-i586-12_jun_2014
    CrashPlan 3.6.3-0027

    > ps w

    27732 crashpla 635m S N /volume1/@appstore/java8/ejdk1.8.0_06/linux_i586/jre/bin/java -Dfile.encoding=UTF-8 -Dapp=CrashPlanServic

    Truncates “CrashPlanService” as shown above, breaking both
    /volume1/@appstore/CrashPlan/bin/CrashPlanEngine (at _findpid()) and
    /var/packages/CrashPlan/scripts/ for status

    What this looks like to the user is that after install the package is “stopped”, but it is in fact running. The package action “Stop” does not work, and “Start” may attempt to run multiple instances – not sure. Possible workaround: install the package, let it run and initialize the conf, then restart DSM. I have modified the broken scripts and it works for me, but I have not tested the workaround.
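    The failure mode here is that busybox ps truncates long command lines, so a grep for the full “app=CrashPlanService” token never matches when the java path is long. One way a script could tolerate this – a sketch only, with the match token taken from the truncated ps output quoted above:

```shell
# Sketch only: match a prefix of the -Dapp token that survives busybox
# ps truncation, then print the PID from the first column.
_findpid() {
  ps w | grep 'app=CrashPlanServ' | grep -v grep | awk '{ print $1 }'
}
```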



    1. patters Post author

      Thanks. I always thought that was a shoddy way of detecting the running process, but that’s actually the official CrashPlan script, not mine. Currently I’m midway through a total re-write to use a daemon script of my own instead.

      1. patters Post author

        You should be able to get around this by using Java 7 instead of Java 8 until I can release the new package version. The path to the Java 7 binary is not so long so the truncation of “CrashPlanService” won’t occur in the output of the ps command.

      2. gnucoder

        Thank you for your feedback, but I’ve tried all Java versions from 6 to 8 and none of them works :( I’ve had no backup for 3 weeks :'( Can someone tell me how to fix the script?

      3. stansween

        I’m unable to start the CrashPlan package on my Synology. It immediately gets ‘stopped’ and I’m unable to debug the issue. These are the versions I’m using:
        DSM version: DSM 5.0-4493 Update 3
        Java: Java SE Embedded 7 – 1.7.0_60-0027
        Crashplan: 3.6.3-0027

  13. azeus

    To help analyse the bug, I’ve checked and, like Michael Barrientos, the launched command is truncated:

    ps w | grep -i crash

    6776 crashpla 635m S N /volume1/@appstore/java8/ejdk1.8.0_06/linux_i586/jre/bin/java -Dfile.encoding=UTF-8 -Dapp=CrashPlanServic

    I managed to change the environment (.profile in /root and /etc/profile) like this:

    PATH=$PATH:/volume1/@appstore/java/bin # Synology Java Package
    JAVA_HOME=/volume1/@appstore/java # Synology Java Package
    CLASSPATH=.:/volume1/@appstore/java/lib # Synology Java Package

    And I created a symlink in /volume1/@appstore like this:

    DiskCraft> ls -all java
    lrwxrwxrwx 1 root root 52 Aug 11 20:02 java -> /volume1/@appstore/java8/ejdk1.8.0_06/linux_i586/jre

    Now my command line is still truncated, but it has more characters (as the java path is shorter):

    7566 crashpla 635m S N
    /volume1/@appstore/java8/ejdk1.8.0_06/linux_i586/jre/bin/java -Dfile.encoding=UTF-8 -Dapp=CrashPlanService -DappBaseName=CrashPlan -Xms20m -Xmx512m

    I can see the process running in DSM (it wasn’t the case before the change) but it’s still not backing up.

    I’ve read the CrashPlanEngine script, and in my understanding the correct command should be something like:

    /volume1/@appstore/java8/ejdk1.8.0_06/linux_i586/jre/bin/java -Dfile.encoding=UTF-8 -Dapp=CrashPlanService -DappBaseName=CrashPlan -Xms20m -Xmx512m -Dnetworkaddress.cache.ttl=300 -Dnetworkaddress.cache.ttl=300 -Dnetworkaddress.cache.negative.ttl=0 -Dc42.native.md5.enabled=false /volume1/@appstore/CrashPlan/lib/com.backup42.desktop.jar:/volume1/@appstore/CrashPlan/lang


    I don’t know why the command is truncated :'(

    Please help.

  14. Herwig

    Thank you for this great tutorial. I have, however, a question regarding access to the headless client from a local installation of the CrashPlan software: CrashPlan is up and running on my DiskStation, and the file was modified according to the description above. The software however does not connect; it only shows the local hard disk. Any hints?

  15. azeus


    I had 2 problems:

    – The first was because of the long Java 8 path… I fixed that, but my backup was still not working (even with Java 6 and Java 7).
    – The second problem was that I was trying to adopt my old backup (to avoid re-uploading 500 GB of data)… When I tried to adopt, CrashPlan kept rebooting all the time… I don’t know if it’s a problem on the CrashPlan server side or the Synology side… Anyway, I started a whole new backup from scratch and it’s working now.

  16. Argon


    I have the same problem. I own a DS412+ and installed Java 8 with the help of your packages. Then I installed CrashPlan. When I want to run CrashPlan it immediately says “Stopped”, so the package isn’t running.

    I uninstalled Java 8, installed Java 7 and reinstalled CrashPlan. The same problem occurs. When I SSH into the Synology I still see this in top:
    /volume1/@appstore/java8/ejdk1.8.0_06/linux_i586/jre/bin/java -Dfile.encoding=UTF-8 -Dapp=CrashPlanService -DappBaseName=Crash

    Is it still using Java 8 instead of Java 7? I don’t know if this is the problem or not.
    Is there some sort of manual fix for this?

    @patters Or can you fix this and update the package? :-)

  17. patters Post author

    I have nearly finished a comprehensive overhaul of the package which will include DS414j support, will run as root to eliminate the need to re-do permissions every upgrade, and will target DSM 5.0 systems only. There have been a number of changes since the original package – is now included with DSM for instance, and Intel systems no longer need the glibc version-faking shim since DSM 5.0. This version will also fix the problems with the CrashPlan daemon wrapper script by using a much better start-stop-status script. Finally firewall settings will be pre-populated with a CrashPlan application entry for users that use the DSM firewall.

    Unfortunately however on Thursday lightning struck a neighbour’s house which broke my phone and inexplicably killed the ethernet ports on my broadband router. Amazingly I still have functioning Internet via wifi, and have confirmed my NAS is still ok but I can’t get it online to test before I release. A new router is on its way, but it may take a week…

  18. olije

    Hi Patters, great work! I was wondering, will this also clear up the issues with CrashPlan not working on a DS213+ (it keeps disconnecting when approached through the desktop application, and it seems to get stuck on certain files, therefore never progressing/finishing a backup set)?
    I realise the 512MB memory limit is a problem with my backup size of ~800GB, but I was hoping for a solution, e.g. through different memory stack usage. However, if this is a fundamental problem, what DS (preferably 2-bay) would you recommend?
    Thanks a lot!

    1. Kipik

      Funny – this is exactly the same issue I’ve had since Java 1.7.0_60 on my 213+. Never had this problem before.

  19. stansween

    I’m unable to start the CrashPlan package on my Synology DS213+. It immediately gets ‘stopped’ and I’m unable to debug the issue. These are the versions I’m using:
    DSM version: DSM 5.0-4493 Update 3
    Java: Java SE Embedded 7 – 1.7.0_60-0027
    Crashplan: 3.6.3-0027

    1. Kipik

      I would stop/uninstall CrashPlan, uninstall Java 7, reinstall Java, then reinstall CrashPlan. That should do it :)

  20. squareeyes

    I could only get CrashPlan working using v7 of Java. Is that normal? It just refused with v8.

  21. DJ

    Maybe a strange thought but could we move this to some forum? Maybe on the Synology website? There’s extremely useful information, hints, bug-finding, etc.

  22. Rick

    Thank you for making a Synology package! This makes setting up Crashplan so much easier.

    Like everyone I had problems getting my Synology to hibernate. In my setup I have several CrashPlan clients which back up local files to CrashPlan on my Synology (DS212j). Using the crontab in this setup is not an option because the clients are not active during certain times of the day.

    I found the post by hammer useful, but it still didn’t do the trick. CrashPlan still reads default.service.xml every once in a while, preventing hibernation. To fix this I moved the file to /etc (which is in RAM), replacing it with a symbolic link pointing to the new location:

    1. Move the default.service.xml file to the /etc directory:
    mv /volume1/@appstore/CrashPlan/conf/default.service.xml /etc/
    2. Make a symbolic link pointing to the file at the new location:
    ln -s /etc/default.service.xml /volume1/@appstore/CrashPlan/conf/default.service.xml
    3. Just to make sure, change the ownership of the symbolic link to the CrashPlan user:
    chown -h crashplan:root /volume1/@appstore/CrashPlan/conf/default.service.xml

    I hope this helps some of you. As mentioned, I also followed Hammer’s instructions. For the completeness of this post, these are his instructions:

    1. Symlink “log/app.log” to “/tmp” (a tmpfs filesystem, so the file will probably only be written to RAM), as the application occasionally updates “log/app.log” even when CrashPlan is doing nothing.
    Run the following commands:
    rm /volume1/@appstore/CrashPlan/log/app.log
    ln -s /tmp/CrashPlan_app.log /volume1/@appstore/CrashPlan/log/app.log
    chown -h crashplan:users /volume1/@appstore/CrashPlan/log/app.log
    2. Edit “/volume1/@appstore/CrashPlan/conf/” to prevent “log/service.log.0″ from being written constantly with a lot of unneeded messages. Set all existing log levels to WARN or OFF (see below).
    3. On all backup sets: Disable “Advanced settings (Configure)” -> “Watch file system in real time”.
    4. Mount volumes using noatime.

    An alternative to 1. is to symlink “log/app.log” to “/dev/null” (ln -s /dev/null /volume1/@appstore/CrashPlan/log/app.log), but this creates error messages in “log/service.log.0″. The errors can be muted by adding “ = OFF” to “conf/”.

    Here’s my complete “conf/”:

    # Backup
    # Code 42 Software, Inc. 2005

    # Package Level modifications
    = WARN
    = OFF
    = WARN
    = WARN
    = WARN
    = WARN
    = WARN
    = WARN
    = WARN
    = WARN
    = WARN
    = WARN

    # Only if symlinking app.log to /dev/null
    = OFF

  23. hulldini

    I have a DS412+ and am running CrashPlan 3.6.3 on it. When I try to connect from my computer (a Mac), the app runs for a while before stopping with the error message “CrashPlan has been disconnected from the backup engine”. It seems that because of this it is not adding the new folders that I have added to the backup. When I log in to the CrashPlan website and look at the files that can be restored, that folder isn’t listed there. Do you have any idea what I can do to fix this?

    1. Fred


      I have had the same problem for a few days, although I didn’t change anything in the Java or CrashPlan packages… maybe the latest small upgrade from Synology?

      CrashPlan is constantly scanning, and then indeed the package stops and is restarted on my DS214play, disconnecting the PC client. Right now I have stopped the package – completely useless.

      I am considering uninstalling and reinstalling, but last time I did that a folder with ~600 GB of data was removed from the backup, even though I recovered the previous settings! Had to do it again – took about a month…

      I think I’ll wait for the next upgrade that is announced for soon!

      I have a feeling CrashPlan doesn’t like people using their software on a NAS…

      Thanks in advance for your help if you have any insights!

  24. Wouter Vellekoop

    Mine is working great!

    I also had to change the port to 5000.

    So now the hibernation problem is the only thing that remains?
    Maybe a stupid suggestion, but isn’t it possible to shut CrashPlan off in the computer client?
    Via Settings > Backup > “Backup will run”, and then configure whatever your need is?
    Will this work?

  25. JaimeMC

    Having the same problem. I constantly get “CrashPlan has been disconnected from the backup engine” after it analyzes the folders. I recently upgraded my DSM to the latest version because of all the Synology hacks going on, so I went ahead and upgraded to Java 7 and the latest package for CrashPlan.

  26. gnucoder

    @hulldini and @Fred: guys, I had the same problem as you on my DS412+. First of all, look at your logs. In my case, I actually had 2 problems:

    – CrashPlan was not showing as running in DSM because of the latest version of Java 8 – the path to java was too long… Solution: change the path of your Java 8 to something shorter via a symlink.
    – After the first fix, CrashPlan got stuck scanning and restarting again and again… The problem was that I was trying to adopt an old backup. I never found a solution… I just started a full new backup from scratch… and it works again.

    1. JaimeMC

      I was originally doing the adoption to use an old backup. I did the same thing you suggested and started a new backup after I saw it was having issues, but mine is still not working. It still does its first scan and then crashes as soon as that scan is done.

  27. Fred

    Well, I upgraded yesterday to the latest version, DSM 5.0-4493 Update 4 (I guess I had Update 3 before), and it seems much better! I could back up new pictures without a problem. No idea if this is really the reason… but try the latest upgrade!

    But now CrashPlan crashes when trying to back up my video folder – it might be because a file is too big. I have seen that before; just removing the file from the backup list solves it. Will try tonight…

    1. olije

      Yes, same goes for me. Updated to Update 4 and suddenly CrashPlan started working as it always has. Strange but great! However, it would be good to understand the root cause to prevent future issues. Hopefully Patters can interpret the changes made in Update 4 and link them to the issues described by so many people earlier. Cheers!

      1. svenc

        What I have found out so far… the CrashPlan (PROe) client stops because the directory /tmp/@tmp is missing (as shown in one of the logs of the CrashPlan client itself). After manually creating that folder (mkdir /tmp/@tmp) and changing the permissions (chmod 0777 /tmp/@tmp), the client works without problems. I have had this problem twice on new Synologys (DS1813+ and DS412+), and as I remember it started with Update 3 (only with fresh installs so far, but I am not quite sure about that).

      2. Fred

        Well, unfortunately mine still crashes after a while :'(
        (although there was definitely an improvement after the upgrade)
        I have 2 backup sets: one high-priority with smaller documents and one low-priority with videos. It seems to crash on the second one, while scanning a small file.

        I had to redo this part during the last month or so (~600 GB), so I really wouldn’t want to uninstall and start from scratch – again – as people above mention was the only solution :(

        Waiting for the next crashplan upgrade then… Thanks in advance, Patters!

  28. Mala

    Dear all
    Thanks for the great package – I’ve been using it for one year without too many hiccups.

    Since the DSM 5.0-4493 Update 4 upgrade a few days ago, my CrashPlan 3.6.3-0027 (green) has frozen.
    I can see this from the GUI.
    What I mean by this is that I have a connection to CP Central, but the upload is stuck (Pause is off).
    Just to check, I moved the first file where it was stuck, in case it was a problem file – no joy.
    Logs don’t say anything…
    I have Java 8 installed, everything was fine until I updated DSM.
    Has anyone got the same problem? Can anyone give any helpful advice?
    Cheers Mala

      1. Mala

        Just had one more issue which I solved

        CrashPlan was running, but the GUI disconnects at 460GB.
        Re-opening the GUI starts the scan from scratch, and it always closes at or before 460GB.

        Solution: I took some of the folders out of the backup to see if there were too many jobs.
        After doing this, my GUI scan progressed to the end, and Crashplan is now backing up.
        I will now add folders bit by bit.

        Thank you all

      2. olije

        Unfortunately, CrashPlan has suddenly stopped again, after running for many days without a problem. I would be very curious what the root cause of this problem is, as it seems quite random. Is it linked to the total size stored at CrashPlan (are they actively changing/sabotaging stuff at their end to prevent big backup sets?), or is it linked to filenames/paths? Because I really don’t understand why it runs perfectly for so many days and then suddenly, without anything changed that I know of, drops and refuses to start properly again.

        Patters, can you explain this, as you have the needed in-depth knowledge? Would be great to understand it! Thanks.

        DS213+ | DSM 5.0-4493 Update 4

      3. Shane

        Do a search in this community for heap size. I suspect, like a lot of the problems here, it is due to the fact that when CrashPlan installs on the Synology, the heap can be too low for the number of files being backed up.

        You can think of heap as memory that is used to keep a list of things to do. In this case it is a list of files to be backed up. If the heap is small then memory runs out and CrashPlan stops. If the heap is large enough then there is enough space for that list and any work/list/data that Java requires.

        The Synology I have is the 1512+, and I bought it because it could be upgraded to 3 GB of memory. Currently I am backing up 4.7 TB of data without a glitch. I would also say that I add about 12 GB of data every week (I have some action cams, have kids, and ride a motorbike, so there is lots of stuff I want to keep).

        I have 10 TB of disk currently and have had my system for over a year now (all data has been backed up, and it usually takes about a day if I add a bunch more video). I know what you are looking for is here; it’s just a matter of looking through the posts. There are also lots of good people here who have made tons of suggestions and replies, including Patters.
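        As a concrete illustration of the heap adjustment Shane describes, this sketch raises the maximum heap via the file and variable other commenters in this thread report using (syno_package.vars, USR_MAX_HEAP); pick a value that fits your installed RAM.

```shell
# Sketch only: raise CrashPlan's Java max heap on the Synology package.
# File and variable names are as reported by users in this thread;
# 1024M is just an example value for a NAS with spare RAM.
VARS=/volume1/@appstore/CrashPlan/syno_package.vars
if [ -f "$VARS" ]; then
  sed -i 's|^USR_MAX_HEAP=.*|USR_MAX_HEAP=1024M|' "$VARS"
fi
```

Restart the CrashPlan package afterwards so the new heap size takes effect.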

      4. olije

        Thanks, Shane, for your reply. What you’re basically saying is that a DS213+ (and many other similar DiskStations) is just not fit for running backups like this, since the memory cannot be expanded. That would mean spending a lot more money (~$1000) on a DiskStation and HDDs, which would be overkill compared with what I need.

        What still puzzles me though, is that after Update 4 my backup suddenly started running again and backed up many 1000’s of files more before crashing again. This would suggest that in Update 4 something was changed that positively affected the allowed heap size or other way of memory management in my DS213+. And that is exactly the kind of ‘root causes’ I would like to understand from highly skilled guys like Patters as this could potentially point towards solutions for the lower-spec diskstations. A workaround, of course, would be for Synology to enable memory upgrades for these lower spec machines. Cheers.

      5. John B.

        Memory seems to be a recurring problem. My DS411+II is backing up 8 TB of data, so I split this into 6 separate backup sets, reducing the concurrent memory requirement for CrashPlan.
        The NAS powers on at 8 am and off again at 11 pm (except Fri & Sat, when it runs 24/7).
        My backups run all the time, and the selection is verified once every day at 8 am.
        Another user (sorry, can’t remember the name right now) had an excellent post about Advanced Settings. The only setting that changes between my sets is the “Data de-duplication” setting. The remainder is always Compression=off, Encryption=on, Watch=off, OpenFiles=on.

    1. olije

      Hi, try it with Java 7, as Patters also suggested. As stated before, my CrashPlan stopped working some months ago, but all has been working perfectly again since I installed Update 4 of 5.0-4493. As I am using Java 7, hopefully this will also work for you. It really is great that it works again. Great work Patters; Code42 should be very happy with you, as undoubtedly many people use CrashPlan only because you have enabled headless use of it!

      1. Mala

        Thanks for the advice. I installed Java 7 as suggested, and everything is working after the following alterations.

        I also applied some Java RAM allocation fixes which I had missed, as 512 MB seemed to be causing problems according to the forums:
        /volume1/@appstore/CrashPlan/syno_package.vars – USR_MAX_HEAP=1536M
        /volume1/@appstore/CrashPlan/bin/run.conf – SRV_JAVA_OPTS -Xmx512m
        /volume1/@appstore/CrashPlan/bin/run.conf – GUI_JAVA_OPTS -Xmx1536m

        I already have the maximum RAM installed (3072 MB).

    2. Mala

      Now I can see my server again

      I edited the text file in “C:\Program Files\CrashPlan\conf\” and put in the IP address of my Synology: “serviceHost=x.x.x.x”

  29. DavidW

    Patters, FYI – I just returned from VMworld where I managed to spend time with both the Synology and Code42 folks. They’re both well aware of your work and said that they heard from plenty of other attendees that we’re all using your work.
    Unfortunately, Code42 has no plans to support headless operations for Crashplan (at least none that they would talk about) and Synology’s hands are tied. The Synology folks in particular were very thankful that you’re doing this work and helping their customers take advantage of Crashplan, even though Code42 doesn’t support it.
    Thanks as always for all the great work!

    1. Fred

      Well, if you think of it, Code42 has absolutely no interest in CrashPlan working on a NAS… if it’s installed on a personal computer, the disks are usually smaller than 1 TB, especially now with SSDs. And to my knowledge, it’s not possible to back up an external hard drive (they claim it’s a technical problem, but I’m not sure I believe that…)
      On a NAS, it’s usual to have 4 TB of data, on a machine that is on 24/7 and can back up constantly… a bad deal for them, as backup size is unlimited! ;)

      1. olije

        Not necessarily, depends on their cost side. Moreover, I doubt people would join Crashplan if you could only backup from or via your PC. I wouldn’t because I have all my relevant data on the DS and definitely do not want my PC on 24/7 only to enable the possibility for backups. Therefore I think code42 should actually actively market this opportunity with a good value proposition. I would even be willing to pay a bit more or otherwise have some kind of limit or differentiated plans. And last but not least, looking a few years ahead, the (private) cloud is here to stay and expand, so $/TB will also rapidly decrease, making e.g. 4 TB not such a big deal…

      2. John B.

        @Fred: I have several TB on my NAS that I would like an *external* backup of.
        Remember that RAID redundancy is not a backup.
        I therefore have an unlimited Family package and would most likely use an official CrashPlan client, if available.

        @AJ: backing up to ‘Friends’ is a nice option, but I don’t have any friends with several TB of free space that I may use for my own purposes, and vice-versa. I think the ‘Friends’ option is good, but for smaller backups; i.e. I’d be more likely to have a few GBs to spare than TBs.

      3. Alexander

        IMO, you don’t “spare” space for your friends and neither do they for you. You buy the hard drives needed and install them in your friend’s machine.
        My dad and I do that. Half of the disks in my Synology are mine, the other his for his backup. If one of those disks needs to be replaced, he buys it. Same with my disks in his machine…

      4. AJ Willmer

        I have a friend on the East Coast (I am on the West Coast, so we don’t share earthquakes or superstorms), and we back up to each other. I have a second backup at a friend’s in Los Angeles, a backup that was seeded.

      5. John B.

        While it sounds like a good (and probably also a fast) option, and a great way to spread risk, it’s simply not for me. I don’t have spare room for other peoples disks, nor room for other peoples data. I therefore chose to backup to CrashPlan Central. Problem solved.

  30. Mala

    Thanks for enabling headless.
    I am just wondering if most of the stability problems are with [CrashPlan – vanilla Green];
    I see few posts mentioning PRO or PROe.

    Does anyone know if the upload speeds are greater with PRO or PROe?
    I am not a business, but PROe still offers great value.

    Does CrashPlan offer discounts on other accounts like they do for CrashPlan Green?

    Thank you

  31. Bowers (@chrisipedia)

    Does Crashplan not start for anyone else on DSM 5.0-4493 Update 4 with Java 1.8.0_6-0027 and Crashplan 3.6.3-0027?

    Everything installs, but Crashplan doesn’t run. It used to run on previous versions.

  32. jxgw

    First, thank you very much for this package! I would like to exclude the @eaDir folder and all sub-folders/files from being backed up. I think it will greatly reduce the number of files and thus help with some of my memory issues. I have a 1512+ with 3 GB of RAM and USR_MAX_HEAP=1536M, which is working, but it is really not necessary for me to back up all the thumbnails. The problem is I am NOT regex-savvy. Here is what I have in my exclude list.


    I thought the */@eaDir* would do it, but this does not seem to be working. I built this list a while ago, and it looks like there have been updates with either DSM 5 or Photo Station that changed the THUMB_*.jpg naming. I can probably add another exclusion for SYNOPHOTO_THUMB_PREVIEW.jpg, but it will still build out the folder structure under the @eaDir.
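    For what it’s worth, CrashPlan’s exclusions are regular expressions matched against full paths, not shell globs, so a bare */@eaDir* won’t behave as hoped. A sketch of a pattern that should catch @eaDir and everything beneath it (verified here with Python’s regex engine as a stand-in for Java’s; the example paths are hypothetical):

```python
import re

# Candidate CrashPlan exclusion: match any path containing an @eaDir component,
# including everything nested inside it.
pattern = re.compile(r".*/@eaDir(/.*)?$")

assert pattern.match("/volume1/photo/@eaDir/IMG_0001.JPG/SYNOPHOTO_THUMB_XL.jpg")
assert pattern.match("/volume1/photo/holiday/@eaDir")
assert not pattern.match("/volume1/photo/holiday/IMG_0001.JPG")
```

    Java’s java.util.regex and Python’s re agree on this subset of syntax, but it’s worth confirming against the real client before relying on it.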

    1. John B.

      I added my exclusions through the GUI, and just wrote them as they are (the last one being temporary files for Chrome downloads):

  33. azeus

    Just to let you know that I changed my USR_MAX_HEAP value from 512M to 1024M and Crashplan started working again.
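    For anyone wanting to make the same change from a shell, here is a sketch using a throwaway copy of the file (the real path, per this thread, is /volume1/@appstore/CrashPlan/syno_package.vars, and the package needs a restart afterwards for the new heap size to take effect):

```shell
# Demonstrate the edit on a stand-in file rather than the live config.
conf=/tmp/syno_package.vars
printf 'USR_MAX_HEAP=512M\n' > "$conf"
sed -i 's/^USR_MAX_HEAP=.*/USR_MAX_HEAP=1024M/' "$conf"
cat "$conf"    # now reads USR_MAX_HEAP=1024M
```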

  34. Mala

    Well, after thinking I had things working, the GUI started crashing again.
    I found a 64-bit version of the CrashPlan client, so “in for a penny, in for a pound” I upgraded.
    BIG MISTAKE. I tried to do a restore, as suggested by CrashPlan, before computer adoption (it didn’t work). I adopted my DS, and somehow the backup was lost for the second time with CP.
    I don’t like the way adoption works; I have never managed it without losing my backups.

    So what now? Well, I cannot see my DS any more in the GUI. I checked everything (Telnet, ports, firewall); nothing works, but my DS is available on my network.
    Conclusion… I’m beaten for once; I am giving up on CP.

    1. davidgelb

      Do what I did and save yourself a ton of headaches… I purchased a used Mac Mini on eBay for a few hundred dollars. It can be set up to auto-mount all your DS volumes, and backing up a DS via the Mac is officially supported by CP. It worked perfectly and backed up quicker than via the DS alone.

      So much easier and reliable. So happy I did this and now my DS remains 100% backed up.

      Let me know if you have any questions.

      1. olije

        Hi David,

        Sounds like a nice workaround, although it partly defeats the purpose of having one single, low-power-consumption unit as your central storage with all its inherent possibilities. It might be more worthwhile spending the few hundred dollars on an upgrade to a DS1513+, which seems to have sufficient memory (after a maximum upgrade) to work well. What do you think?

        BTW, which Mac OS version should I be looking for, should I want to move in this direction? I have seen some Mac Minis for ~$150, but they run Tiger, which does not work with CP any more.


      2. ericdano

        Why do that? It is really easy to get a Synology unit to run Crashplan.

        Also, if you get a “cheap” Mac Mini and it isn’t able to run 10.9 or anywhere close, you can always stick Ubuntu Linux on it…

      3. Jesse Graves

        Haha, if it’s “really easy”, I could really use some pointers per my Sept 7th post. :)

      4. davidgelb

        Hi olije,

        I am using a 1513+ and was having the same issues that everyone else is. It seemed that I spent more time trying to get CP to work than it actually spent working. I got a Mini that is running OS X 10.8, and it works perfectly. I was so tired of having to fix CP on the DS; this way, it just works. Always. Reliable and fast.

        Plus, now that my large backup is complete and I am down only to small daily backups, I will schedule it to run only at night and the remainder of the time, I may use it as the computer connected to my TV – to run Skype and watch the DVDs and Blurays I have ripped to my DS.

        Hope that helps!

      5. olije

        Hi David,

        Thanks for your answer. It really surprises me that even with a 1513+ there is trouble. I was contemplating buying one, but now I’m cured of that option. What I’m curious about is why you have the 1513+ in the first place, then. Is it sheer storage room? You might as well work with a few USB HDDs plugged in; a lot cheaper!

        What you clearly illustrate is actually the root cause of all the discussions here. Despite the good work from Patters, CP is still a DIY kit that needs a lot of attention and maintenance. As posted earlier, I think they would be very wise to make an official, well-working package for the Synology DiskStations. Anyhow, I’ll keep the Mac option in mind, although I find it pretty expensive. I might as well start backing up to Amazon S3 or HiDrive. I also found IDrive, which I used a long time ago from my PC. Not bad pricing at $47/year for 1 TB, which would be enough for me for the near future. Unfortunately, as one of the few, they don’t support the DS213+ (the 1513+ is supported, however).

        I’ll need to keep on pondering on a solution…


      6. davidgelb

        Hi Olije,

        Yes, it is not the easiest solution, but it certainly is one of the least expensive. I decided to go with the 1513+ for volume. I realize I could have gone with the less expensive box and add USB drives, but I like having everything enclosed in one box as well as the beefier hardware.

        I am sticking with CP because of the unlimited storage space, plus, we purchased a family plan to back up the entire family, so for the cost, it can’t be beat. Plus, with the Mac Mini option, it is super easy and reliable.

        As I mentioned, I also plan to use the Mini as a home theater PC, so I will certainly get my use out of it.

        Good luck finding what works best for you!

  35. Matt Bentley

    My 213+ was doing great for a while, but now, no matter what I do, I get an out-of-memory error…

    I’ve edited the Xmx and the user settings in the synoresource file to try and give CP more memory, but any time I go over the base amount the CrashPlan client can’t connect.

    My backup set is about 600 GB.

    Any ideas?

    1. olije

      Yeah, same here. Very frustrating!

      As pointed out several times on this page by several users, the main problem seems to be the memory overflowing. Apparently it can happen (though less frequently) even on systems with e.g. 2 or 3 GB of memory.

      Therefore, what I would really like to plead for is solving the FUNDAMENTAL issue of memory overflowing. Without exactly knowing how the software works (and risking nonsense in the next sentences), I could imagine the application monitoring memory usage and, at a certain level, either clearing out unneeded information or writing the data to a temp file for later use. In other words, PREVENT the memory from overflowing, no matter how much memory is available. Or, if CP prevents this way of working, perhaps back up a set in smaller chunks (e.g. max 10k files or max 5 GB per chunk).

      PATTERS, it would be great if you could comment on this layman’s approach of mine. I really would like to understand the fundamentals of this recurring problem. As I am up for renewal of CP in a couple of months, I am curious whether this problem is structural or not. If so, I will drop the use of CP and try to find a different solution, although with regret…

      1. tonysqrd

        I second your call for help!!! I am also willing to PAY PATTERS for a more stable package and support.

  36. Jesse Graves

    My CrashPlan starts and stops as well. I found I have hundreds of restart logs containing this:

    Starting CrashPlan Engine … Using standard startup
    ./CrashPlanEngine: line 17: ./../ Permission denied

    A directory permissions issue? How can I tell which user is executing the CrashPlanEngine script?
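    One hedged way to answer that from a shell (DSM’s BusyBox ps may lack some flags, in which case the USER column of plain `ps w` is the fallback; `$$` stands in here for the real CrashPlanEngine PID):

```shell
pid=$$                           # stand-in for the CrashPlanEngine PID
user=$(ps -o user= -p "$pid")    # POSIX form; on BusyBox, use `ps w` and read the USER column
echo "process $pid runs as $user"
```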

  37. Derek

    Thanks patters for your work, quick question:

    How does this actually work in practice? Does my PC back up to my NAS, and is my NAS the only direct connection to CrashPlan? If so, would this mean that my whole family could back up to the NAS and yet I’d only need to purchase a CP license for my NAS?

    1. John B.

      Not any of those for the recent DSM 5x updates, but I remember having to install the java version once (don’t remember the details though). In fact, CP has been impressively stable for me for months. If you drag the Control Panel and Package Center to the DSM desktop, it will be easy to check if all packages are running after each update.

  38. Thomas

    Do you know when the updated version of CrashPlan for Synology will be available? I am trying to decide whether to wait it out or manually revert the DSM firmware. Thank you so much for this program. It has been great to have direct backup on Synology.


  39. Laurrow

    I too have the same problem: CrashPlan starts and stops.
    I have 2.2 TB to back up and a DS214 with 512 MB of RAM.
    Is it a problem of RAM? I have the impression that I have had the problem since Update 5 of DSM 5.0.

    Sorry for my English.
    Thank you, Google Translate.

    1. OecherWolke

      2.2 TB to back up with only 512 MB of RAM will be a problem. How many files are in that backup set? With such a low amount of memory, you should reduce the number of files per job to < 50,000. Check the advanced settings and set the dedup and compression settings to “auto”.

  40. OecherWolke

    First of all: thanks Patters for your work.
    The package is working great.
    We have nearly twenty customers with Synology NAS systems running patters’ package, and I have seen most of the problems mentioned here.
    Most of them can be solved as follows:
    – use Synology machines with Atom CPUs (especially with larger backup sets)
    – maximize the RAM of the Synology (especially with large backup sets)
    – modify the synology_install_vars file to use 1024 MB or more of memory
    – split the backups into multiple jobs; I always try to keep the number of files per job to fewer than 100,000
    – big files (especially movies) should get their own job, with compression and deduplication disabled (with highly compressed movies there won’t be much loss, and the CPU and RAM usage of the CrashPlan application decreases a lot)
    – after each DSM update, first check that the Java Manager and the installed Java are up-to-date
    – stay at Java 7; this is working great
    – if something isn’t working (wrong-ownership errors, etc.), uninstall the package and check that the complete app folder at /volume1/@appstore/CrashPlan(PROe) is deleted. Also delete the CrashPlan user in the user settings of the Synology. Re-install the package and do as told (wait a few seconds after installation and stop the package, then start it again).

    1. OecherWolke

      One addition: check the log files of CrashPlan (/volume1/@appstore/CrashPlan(PROe)/logs).
      Check the service.log.0 and the engine_output.log; CrashPlan says a lot about problems in the different log files. A problem with ownership of the “” file is most of the time caused by manually changing ownerships. Uninstall the package and reinstall it.

    2. olije

      Thanks for sharing your experience. So, if I were to sell my 213+ and buy a 214play instead, you reckon my problems with CP would be solved? FYI: my backup sets are smaller than 100k files and I have no huge videos in the backup. Total backup size is approx 800 GB.

      1. OecherWolke

        Hi olije,
        I can’t tell definitely with a 214play, because our customers are using the DS411+II and up, but looking at the technical details of the 214play, it should be at the same performance level as the DS411+II. The customer using the 411+II is saving about 750 GB with about 500,000 files of different kinds. We split the jobs into 5 or 6 backup sets, and this way it is working without problems.

  41. Serg

    My CrashPlan does not work. Synology 213+, DSM 5.0, backup of 420 GB. It stopped working after installation of DSM Update 4, and Update 5 did not help or change anything. I tried different Java packages: with 8, CrashPlan shows “Stopped”; with 6 and 7 it is running but apparently restarting (I see a row of mountains in the CPU Resource Monitor: peaks of 60–70%, then falls to 3–4%). I have been trying to make it run for 2 weeks now, have tried a few dozen restarts, uninstalled and reinstalled both Java and CrashPlan about 10 times, and changed the Max Heap size to 1024M. The logs don’t show anything strange; the engine log ends with “jtux loaded”, and the engine_crash log is empty.

    The only thing I can think of is the Java command line, which appears truncated:

    DiskStation> ps -w |grep crash
    21532 crashpla 1151m S N /volume1/@appstore/java7/jre/bin/java -Dfile.encoding=UTF-8 -Dapp=CrashPlanService -DappBaseName=CrashPla
    21781 root 4716 S grep crash
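    That cut-off is almost certainly just ps truncating its display, not a truncated command. On Linux, the full argument list of any process can be read from /proc; a sketch, using the current shell as a stand-in PID:

```shell
pid=$$                              # substitute the CrashPlan java PID here
tr '\0' ' ' < /proc/$pid/cmdline    # arguments are NUL-separated in /proc
echo
```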

    I can’t understand what the hell is with that program. Maybe someone can help me, PLEASE!!!!!!

    1. OecherWolke

      Have you had a look at the CrashPlan log files inside the CrashPlan directory? Check the service.log.0 of CrashPlan.
      How many files do you back up? Reduce the number of files per job to < 100,000; less is better.

  42. Tony Rose

    I hope someone can help. I upgraded my Synology to DSM 5.0, after which I uninstalled CrashPlan, then reinstalled Java 7 and CrashPlan. I pointed CrashPlan to the existing instance “Diskstation” when prompted in CrashPlan itself. From the client I can see “Diskstation” with a green (active) circle, but it says “Diskstation is unavailable – backup location is not accessible”. Likewise, the CrashPlan log file on the headless server says “not ready for backup from macbook pro. Reason: the backup location is not accessible”.

    Any thoughts anyone?

    1. olije

      have you adopted your previous backup via the frontend interface? That’s always necessary after reinstalling CP. Hope this helps. Cheers.

