CrashPlan packages for Synology NAS

UPDATE – The instructions and notes on this page apply to all three versions of the package hosted on my repo: CrashPlan, CrashPlan PRO, and CrashPlan PROe.

CrashPlan is a popular online backup solution which supports continuous syncing. With it your NAS becomes even more resilient – it could even be stolen or destroyed and you would still have your data. Whilst you can pay a small monthly charge for a storage allocation in the Cloud, one neat feature CrashPlan offers is for individuals to collaboratively back up their important data to each other – for free! You could install CrashPlan on your laptop and have it continuously backing up your documents to your NAS, even whilst away from home.

[Screenshot: CrashPlan client running on Windows]

CrashPlan is a Java application, and one that’s typically difficult to install on a NAS – therefore an obvious candidate for me to simplify into a package, given that I’ve made a few others. I tried and failed a few months ago, getting stuck at compiling the Jtux library for ARM CPUs (the Oracle Java for Embedded doesn’t come with any headers).

I noticed a few CrashPlan setup guides linking to my Java package, and decided to try again based on these: Kenneth Larsen’s blog post, the Vincesoft blog article for installing on ARM processor Iomega NAS units, and this handy PDF document which is a digest of all of them, complete with download links for the additional compiled ARM libraries. I used the PowerPC binaries Christophe had compiled on his chreggy.fr blog, so thanks go to him. I wanted to make sure the package didn’t require the NAS to be bootstrapped, so I picked out the few generic binaries that were needed (bash, nice and cpio) directly from the Optware repo.

UPDATE – For version 3.2 I also had to identify and then figure out how to compile Tim Macinta’s Fast MD5 library, to replace the supplied libmd5.so on ARM systems (CrashPlan only distributes x86 libraries). I’m documenting that process here in case more libs are required in future versions. I identified it from the error message in log/engine_error.log and by running objdump -x libmd5.so. I could see that the same Java_com_twmacinta_util_MD5_Transform_1native function mentioned in the error was present in the x86 lib but not in my compiled libmd5.so from W3C Libwww. I took the headers from an install of OpenJDK on a regular Ubuntu desktop. I then used the Linux x86 source from the download bundle on Tim’s website – the closest match – and compiled it directly on the syno using the command line from a comment in another version of that source:
gcc -O3 -shared -I/tmp/jdk_headers/include /tmp/fast-md5/src/lib/arch/linux_x86/MD5.c -o libmd5.so
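
If more libraries need the same treatment in a future version, a quick way to check whether a compiled library exports the JNI function CrashPlan expects is to dump its symbol table and search for the name quoted in the engine error log – a minimal sketch, assuming the library is in the current directory:

# list the dynamic (exported) symbols and look for the JNI entry point named in log/engine_error.log
objdump -T libmd5.so | grep Java_com_twmacinta_util_MD5_Transform_1native

If grep prints nothing, the library is missing the function and will need recompiling as described above.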

Aside from the challenges of getting the library dependencies fixed for ARM and QorIQ PowerPC systems, there was also the matter of compliance – Code 42 Software’s EULA prohibits redistribution of their work. I had to make the syno package download CrashPlan for Linux (after the end user agrees to their EULA), then I had to write my own script to extract this archive and mimic their installer, since their installer is interactive. It took a lot of slow testing, but I managed it!

[Screenshot: CrashPlan PROe package info]

My most recent package version introduces handling of the automatic updates which Code 42 sometimes publish to the clients. This proved quite a challenge to get working, as testing was very laborious. I can confirm that it worked with the update from CrashPlan PRO 3.2 to 3.2.1, and from CrashPlan 3.2.1 to 3.4.1:

[Screenshot: CrashPlan automatic update repair]

 

Installation

  • This package is for Marvell Kirkwood, Marvell Armada 370/XP, Intel and Freescale QorIQ/PowerQUICC PowerPC CPUs only, so please check which CPU your NAS has. It will work on an unmodified NAS – no hacking or bootstrapping required. On the older PowerQUICC PowerPC models it will only work if they are running DSM 5.0. It is technically possible to run CrashPlan on older DSM versions, but it requires chroot-ing into a Debian install; Christophe from chreggy.fr has recently released packages to automate this.
  • In the User Control Panel in DSM, enable the User Home service.
  • In Package Center in DSM, go to Settings -> Package Sources and add my package repository URL, which is http://packages.pcloadletter.co.uk. The package can then be installed directly from Package Center.
  • You will need to install one of my Java SE Embedded packages first (Java 6 or 7). Read the instructions on that page carefully too.
  • If you previously installed CrashPlan manually using the Synology Wiki, you can find uninstall instructions here.
 

Notes

  • The package downloads the CrashPlan installer directly from Code 42 Software, following acceptance of their EULA. I am complying with their wish that no one redistributes it.
  • CrashPlan is installed in headless mode – backup engine only. This is configured by a desktop client, but operates independently of it.
  • The engine daemon script checks the amount of system RAM and scales the Java heap size accordingly (up to the default maximum of 512MB). If you are backing up a very large backup set you can override this persistently by editing /volume1/@appstore/CrashPlan/syno_package.vars (see the example after these notes). If you’re considering buying a NAS purely to run CrashPlan and intend to back up more than a few hundred GB, I strongly advise one of the Intel models, which come with 1GB of RAM and can be upgraded to 3GB very cheaply. RAM is very limited on the ARM models: 128MB on the J series means CrashPlan runs with only one fifth of the recommended heap size, so I doubt it’s viable for backing up very much at all. My DS111 has 256MB of RAM and currently backs up around 60GB with no issues. I have found that a 512MB heap was insufficient to back up more than 2TB of files on a Windows server – it kept restarting the backup engine every few minutes until I increased the heap to 1024MB.
  • As with my other syno packages, the daemon user account’s password is randomized using the openssl binary when the account is created. DSM Package Center runs as the root user, so my script starts the package using an su command. This means that you can change the password yourself and CrashPlan will still work.
  • The default location for saving friends’ backups is set to /volume1/crashplan/backupArchives (where /volume1 is your primary storage volume) to eliminate the chance of them being destroyed accidentally by uninstalling the package.
  • The first time you run the server you will need to stop it and restart it before you can connect the client. This is because a config file that’s only created on first run needs to be edited by one of my scripts. The engine is then configured to listen on all interfaces on the default port 4243.
  • Once the engine is running, you can manage it by installing CrashPlan on another computer, and editing the file conf/ui.properties on that computer so that this line:
    #serviceHost=127.0.0.1
    is uncommented (by removing the hash symbol) and set to the IP address of your NAS, e.g.:
    serviceHost=192.168.1.210
    On Windows you can also disable the CrashPlan service if you will only use the client.
  • If you need to manage CrashPlan from a remote location, I suggest you do so using SSH tunnelling as per this support document (see the example after these notes).
  • The package supports upgrading to future versions while preserving the machine identity, logs, login details, and cache. Upgrades can now take place without requiring a login from the client afterwards.
  • If you remove the package completely and re-install it later, you can re-attach to previous backups. When you log in to the Desktop Client with your existing account after a re-install, you can select “adopt computer” to merge the records, and preserve your existing backups. I haven’t tested whether this also re-attaches links to friends’ CrashPlan computers and backup sets, though the latter does seem possible in the Friends section of the GUI. It’s probably a good idea to test that this survives a package reinstall before you start relying on it. Sometimes, particularly with CrashPlan PRO I think, the adopt option is not offered. In this case you can log into CrashPlan Central and retrieve your computer’s GUID. On the CrashPlan client, double-click on the logo in the top right and you’ll enter a command line mode. You can use the GUID command to change the system’s GUID to the one you just retrieved from your account.
  • The log which is displayed in the package’s Log tab is actually the activity history. If you’re trying to troubleshoot an issue you will need to use an SSH session to inspect the two engine log files which are:
    /volume1/@appstore/CrashPlan/log/engine_output.log
    /volume1/@appstore/CrashPlan/log/engine_error.log
  • When CrashPlan downloads and attempts to run an automatic update, the script will most likely fail and stop the package. This is typically caused by syntax differences in the Synology versions of certain Linux shell commands (like rm, mv, or ps). If this happens, wait several minutes before taking any action, because the update script tries to restart CrashPlan 10 times at 10 second intervals. After that, simply start the package again in Package Center and my scripts will fix the update, then run it. One final package restart is required before you can connect with the CrashPlan Desktop client (remember to update that too).
  • After their backup is seeded some users may wish to schedule the CrashPlan engine using cron so that it only runs at certain times. This is particularly useful on ARM systems because CrashPlan currently prevents hibernation while it is running (unresolved issue, reported to Code 42). To schedule, edit /etc/crontab and add the following entries for starting and stopping CrashPlan:
    55 2 * * * root /var/packages/CrashPlan/scripts/start-stop-status start
    0  4 * * * root /var/packages/CrashPlan/scripts/start-stop-status stop

    This example would configure CrashPlan to run daily between 02:55 and 04:00am. CrashPlan by default will scan the whole backup selection for changes at 3:00am so this is ideal. The simplest way to edit crontab if you’re not really confident with Linux is to install Merty’s Config File Editor package, which requires the official Synology Perl package to be installed too (since DSM 4.2). After editing crontab you will need to restart the cron daemon for the changes to take effect:
    /usr/syno/etc.defaults/rc.d/S04crond.sh stop
    /usr/syno/etc.defaults/rc.d/S04crond.sh start

    It is vitally important that you do not improvise your own startup commands or use a different account because this will most likely break the permissions on the config files, causing additional problems. The package scripts are designed to be run as root, and they will in turn invoke the CrashPlan engine using its own dedicated user account.
  • If you update DSM later, you will need to re-install the Java package or else UTF-8 and locale support will be broken by the update.
  • If you decide to sign up for one of CrashPlan’s paid backup services as a result of my work on this, I would really appreciate it if you could use this affiliate link, or consider donating using the PayPal button on the right.
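
Two of the notes above deserve concrete examples. The heap override is a one-line edit to syno_package.vars – a sketch, assuming the package is installed on /volume1 and you want a 1GB heap:

#edit /volume1/@appstore/CrashPlan/syno_package.vars - uncomment the USR_MAX_HEAP line and raise the value
USR_MAX_HEAP=1024M

The start-stop-status.sh script reads this value each time the package starts, so restart the package from Package Center for the change to take effect.

For remote management via SSH tunnelling, the tunnel is a single command run on the computer with the CrashPlan Desktop client – again just a sketch, with the hostname as an example only:

#forward local port 4243 to the CrashPlan engine on the NAS
ssh -L 4243:localhost:4243 root@your-nas-hostname

Leave serviceHost in the client’s conf/ui.properties at its default of 127.0.0.1 so that it connects through the tunnel.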
 

Package scripts

For information, here are the package scripts so you can see what the package is going to do. You can get more information about how packages work by reading the Synology Package wiki.

installer.sh

#!/bin/sh

#--------CRASHPLAN installer script
#--------package maintained at pcloadletter.co.uk

DOWNLOAD_PATH="http://download.crashplan.com/installs/linux/install/${SYNOPKG_PKGNAME}"
[ "${SYNOPKG_PKGNAME}" == "CrashPlan" ] && DOWNLOAD_FILE="CrashPlan_3.6.3_Linux.tgz"
[ "${SYNOPKG_PKGNAME}" == "CrashPlanPRO" ] && DOWNLOAD_FILE="CrashPlanPRO_3.6.3_Linux.tgz"
[ "${SYNOPKG_PKGNAME}" == "CrashPlanPROe" ] && DOWNLOAD_FILE="CrashPlanPROe_3.6.3_Linux.tgz"
DOWNLOAD_URL="${DOWNLOAD_PATH}/${DOWNLOAD_FILE}"
CPI_FILE="${SYNOPKG_PKGNAME}_*.cpi"
EXTRACTED_FOLDER="${SYNOPKG_PKGNAME}-install"
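#the daemon account name is simply the package name in lower case (crashplan, crashplanpro or crashplanproe)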
DAEMON_USER="`echo ${SYNOPKG_PKGNAME} | awk {'print tolower($_)'}`"
DAEMON_PASS="`openssl rand 12 -base64 2>/dev/null`"
DAEMON_ID="${SYNOPKG_PKGNAME} daemon user"
DAEMON_HOME="/var/services/homes/${DAEMON_USER}"
OPTDIR="${SYNOPKG_PKGDEST}"
VARS_FILE="${OPTDIR}/install.vars"
ENGINE_SCRIPT="CrashPlanEngine"
SYNO_CPU_ARCH="`uname -m`"
[ "${SYNO_CPU_ARCH}" == "x86_64" ] && SYNO_CPU_ARCH="i686"
NATIVE_BINS_URL="http://packages.pcloadletter.co.uk/downloads/crashplan-native-${SYNO_CPU_ARCH}.tgz"   
NATIVE_BINS_FILE="`echo ${NATIVE_BINS_URL} | sed -r "s%^.*/(.*)%\1%"`"
INSTALL_FILES="${DOWNLOAD_URL} ${NATIVE_BINS_URL}"
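#locate a @tmp folder near the filesystem root (e.g. /volume1/@tmp) to use for downloads and temporary files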
TEMP_FOLDER="`find / -maxdepth 2 -name '@tmp' | head -n 1`"
#the Manifest folder is where friends' backup data is stored
#we set it outside the app folder so it persists after a package uninstall
MANIFEST_FOLDER="/`echo $TEMP_FOLDER | cut -f2 -d'/'`/crashplan"
LOG_FILE="${SYNOPKG_PKGDEST}/log/history.log.0"
UPGRADE_FILES="syno_package.vars conf/my.service.xml conf/service.login conf/service.model"
UPGRADE_FOLDERS="log cache"

source /etc/profile
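#determine the path of the 'public' shared folder from the Samba config (used as a fallback download location)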
PUBLIC_FOLDER="`cat /usr/syno/etc/smb.conf | sed -r '/\/public$/!d;s/^.*path=(\/volume[0-9]{1,4}\/public).*$/\1/'`"


preinst ()
{
  if [ -z ${PUBLIC_FOLDER} ]; then
    echo "A shared folder called 'public' could not be found - note this name is case-sensitive. "
    echo "Please create this using the Shared Folder DSM Control Panel and try again."
    exit 1
  fi

  if [ -z ${JAVA_HOME} ]; then
    echo "Java is not installed or not properly configured. JAVA_HOME is not defined. "
    echo "Download and install the Java Synology package from http://wp.me/pVshC-z5"
    exit 1
  fi
  
  if [ ! -f ${JAVA_HOME}/bin/java ]; then
    echo "Java is not installed or not properly configured. The Java binary could not be located. "
    echo "Download and install the Java Synology package from http://wp.me/pVshC-z5"
    exit 1
  fi
  
  #is the User Home service enabled?
  UH_SERVICE=maybe
  synouser --add userhometest Testing123 "User Home test user" 0 "" ""
  UHT_HOMEDIR=`cat /etc/passwd | sed -r '/User Home test user/!d;s/^.*:User Home test user:(.*):.*$/\1/'`
  if echo $UHT_HOMEDIR | grep '/var/services/homes/' > /dev/null; then
    if [ ! -d $UHT_HOMEDIR ]; then
      UH_SERVICE=false
    fi
  fi
  synouser --del userhometest
  #remove home directory (needed since DSM 4.1)
  [ -e /var/services/homes/userhometest ] && rm -r /var/services/homes/userhometest
  if [ "${UH_SERVICE}" == "false" ]; then
    echo "The User Home service is not enabled. Please enable this feature in the User control panel in DSM."
    exit 1
  fi
  
  cd ${TEMP_FOLDER}
  for WGET_URL in ${INSTALL_FILES}
  do
    WGET_FILENAME="`echo ${WGET_URL} | sed -r "s%^.*/(.*)%\1%"`"
    [ -f ${TEMP_FOLDER}/${WGET_FILENAME} ] && rm ${TEMP_FOLDER}/${WGET_FILENAME}
    wget ${WGET_URL}
    if [[ $? != 0 ]]; then
      if [ -d ${PUBLIC_FOLDER} ] && [ -f ${PUBLIC_FOLDER}/${WGET_FILENAME} ]; then
        cp ${PUBLIC_FOLDER}/${WGET_FILENAME} ${TEMP_FOLDER}
      else     
        echo "There was a problem downloading ${WGET_FILENAME} from the official download link, "
        echo "which was \"${WGET_URL}\" "
        echo "Alternatively, you may download this file manually and place it in the 'public' shared folder. "
        exit 1
      fi
    fi
  done
 
  exit 0
}


postinst ()
{
  #create daemon user
  synouser --add ${DAEMON_USER} ${DAEMON_PASS} "${DAEMON_ID}" 0 "" ""
  
  #save the daemon user's homedir as variable in that user's profile
  #this is needed because new users seem to inherit a HOME value of /root which they have no permissions for.
  su - ${DAEMON_USER} -s /bin/sh -c "echo export HOME=\'${DAEMON_HOME}\' >> .profile"

  #extract CPU-specific additional binaries
  mkdir ${SYNOPKG_PKGDEST}/bin
  cd ${SYNOPKG_PKGDEST}/bin
  tar xzf ${TEMP_FOLDER}/${NATIVE_BINS_FILE} && rm ${TEMP_FOLDER}/${NATIVE_BINS_FILE}

  #extract main archive
  cd ${TEMP_FOLDER}
  tar xzf ${TEMP_FOLDER}/${DOWNLOAD_FILE} && rm ${TEMP_FOLDER}/${DOWNLOAD_FILE} 
  
  #extract cpio archive
  cd ${SYNOPKG_PKGDEST}
  cat "${TEMP_FOLDER}/${EXTRACTED_FOLDER}"/${CPI_FILE} | gzip -d -c | ${SYNOPKG_PKGDEST}/bin/cpio -i --no-preserve-owner
  
  echo "#uncomment to expand Java max heap size beyond prescribed value (will survive upgrades)" > ${SYNOPKG_PKGDEST}/syno_package.vars
  echo "#you probably only want more than the recommended 512M if you're backing up extremely large volumes of files" >> ${SYNOPKG_PKGDEST}/syno_package.vars
  echo "#USR_MAX_HEAP=512M" >> ${SYNOPKG_PKGDEST}/syno_package.vars
  echo >> ${SYNOPKG_PKGDEST}/syno_package.vars

  #the following Package Center variables will need retrieving if launching CrashPlan via cron
  echo "CRON_SYNOPKG_PKGNAME='${SYNOPKG_PKGNAME}'" >> ${SYNOPKG_PKGDEST}/syno_package.vars
  echo "CRON_SYNOPKG_PKGDEST='${SYNOPKG_PKGDEST}'" >> ${SYNOPKG_PKGDEST}/syno_package.vars

  cp ${TEMP_FOLDER}/${EXTRACTED_FOLDER}/scripts/${ENGINE_SCRIPT} ${OPTDIR}/bin
  cp ${TEMP_FOLDER}/${EXTRACTED_FOLDER}/scripts/run.conf ${OPTDIR}/bin
  mkdir -p ${MANIFEST_FOLDER}/backupArchives    
  chown -R ${DAEMON_USER} ${MANIFEST_FOLDER}
  
  #save install variables which Crashplan expects its own installer script to create
  echo TARGETDIR=${SYNOPKG_PKGDEST} > ${VARS_FILE}
  echo BINSDIR=/bin >> ${VARS_FILE}
  echo MANIFESTDIR=${MANIFEST_FOLDER}/backupArchives >> ${VARS_FILE}
  #leave these ones out which should help upgrades from Code42 to work (based on examining an upgrade script)
  #echo INITDIR=/etc/init.d >> ${VARS_FILE}
  #echo RUNLVLDIR=/usr/syno/etc/rc.d >> ${VARS_FILE}
  echo INSTALLDATE=`date +%Y%m%d` >> ${VARS_FILE}
  echo JAVACOMMON=\${JAVA_HOME}/bin/java >> ${VARS_FILE}
  cat ${TEMP_FOLDER}/${EXTRACTED_FOLDER}/install.defaults >> ${VARS_FILE}
  
  #remove temp files
  rm -r ${TEMP_FOLDER}/${EXTRACTED_FOLDER}
  
  #change owner of CrashPlan folder tree
  chown -R ${DAEMON_USER} ${SYNOPKG_PKGDEST}
  
  exit 0
}


preuninst ()
{
  #make sure engine is stopped
  su - ${DAEMON_USER} -s /bin/sh -c "${OPTDIR}/bin/${ENGINE_SCRIPT} stop"
  sleep 2
  
  exit 0
}


postuninst ()
{
  if [ -f ${SYNOPKG_PKGDEST}/syno_package.vars ]; then
    source ${SYNOPKG_PKGDEST}/syno_package.vars
  fi

  if [ "${LIBFFI_SYMLINK}" == "YES" ]; then
    rm /lib/libffi.so.5
  fi
  
  #if it doesn't exist, but is still a link then it's a broken link and should also be deleted
  if [ ! -e /lib/libffi.so.5 ]; then
    [ -L /lib/libffi.so.5 ] && rm /lib/libffi.so.5
  fi
    
  #remove daemon user
  synouser --del ${DAEMON_USER}
  
  #remove daemon user's home directory (needed since DSM 4.1)
  [ -e /var/services/homes/${DAEMON_USER} ] && rm -r /var/services/homes/${DAEMON_USER}
  
 exit 0
}

preupgrade ()
{
  #make sure engine is stopped
  su - ${DAEMON_USER} -s /bin/sh -c "${OPTDIR}/bin/${ENGINE_SCRIPT} stop"
  sleep 2
  
  #if identity and config data exists back it up
  if [ -d ${DAEMON_HOME}/.crashplan ]; then
    mkdir -p ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/conf
    mv ${DAEMON_HOME}/.crashplan ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig
    for FILE_TO_MIGRATE in ${UPGRADE_FILES}; do
      if [ -f ${OPTDIR}/${FILE_TO_MIGRATE} ]; then
        cp ${OPTDIR}/${FILE_TO_MIGRATE} ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/${FILE_TO_MIGRATE}
      fi
    done
    for FOLDER_TO_MIGRATE in ${UPGRADE_FOLDERS}; do
      if [ -d ${OPTDIR}/${FOLDER_TO_MIGRATE} ]; then
        mv ${OPTDIR}/${FOLDER_TO_MIGRATE} ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig
      fi
    done
  fi

  exit 0
}


postupgrade ()
{
  #use the migrated identity and config data from the previous version
  if [ -d ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/.crashplan ]; then
    mv ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/.crashplan ${DAEMON_HOME}
    for FILE_TO_MIGRATE in ${UPGRADE_FILES}; do
      if [ -f ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/${FILE_TO_MIGRATE} ]; then
        mv ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/${FILE_TO_MIGRATE} ${OPTDIR}/${FILE_TO_MIGRATE}
      fi
    done
    for FOLDER_TO_MIGRATE in ${UPGRADE_FOLDERS}; do
    if [ -d ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/${FOLDER_TO_MIGRATE} ]; then
      mv ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/${FOLDER_TO_MIGRATE} ${OPTDIR}
    fi
    done
    rmdir ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig/conf
    rmdir ${SYNOPKG_PKGDEST}/../${DAEMON_USER}_data_mig
    
    #make CrashPlan log entry
    TIMESTAMP="`date +%D` `date +%I:%M%p`"
    echo "I ${TIMESTAMP} Synology Package Center updated ${SYNOPKG_PKGNAME} to version ${SYNOPKG_PKGVER}" >> ${LOG_FILE}
    
    #daemon user has been deleted and recreated so we need to reset ownership (new UID)
    chown -R ${DAEMON_USER} ${DAEMON_HOME}/.crashplan
    chown -R ${DAEMON_USER} ${SYNOPKG_PKGDEST}
    
    #read manifest location from the migrated XML config, and reset ownership on that path too
    if [ -f ${SYNOPKG_PKGDEST}/conf/my.service.xml ]; then
      MANIFEST_FOLDER=`cat ${SYNOPKG_PKGDEST}/conf/my.service.xml | grep "<manifestPath>" | cut -f2 -d'>' | cut -f1 -d'<'`
      chown -R ${DAEMON_USER} ${MANIFEST_FOLDER}
    fi
    
    #the following Package Center variables will need retrieving if launching CrashPlan via cron
    grep "^CRON_SYNOPKG_PKGNAME" ${SYNOPKG_PKGDEST}/syno_package.vars > /dev/null \
     || echo "CRON_SYNOPKG_PKGNAME='${SYNOPKG_PKGNAME}'" >> ${SYNOPKG_PKGDEST}/syno_package.vars
    grep "^CRON_SYNOPKG_PKGDEST" ${SYNOPKG_PKGDEST}/syno_package.vars > /dev/null \
     || echo "CRON_SYNOPKG_PKGDEST='${SYNOPKG_PKGDEST}'" >> ${SYNOPKG_PKGDEST}/syno_package.vars
  fi
  
  exit 0
}
 

start-stop-status.sh

#!/bin/sh

#--------CRASHPLAN start-stop-status script
#--------package maintained at pcloadletter.co.uk

if [ "${SYNOPKG_PKGNAME}" == "" ]; then
  #if this script has been invoked by cron then some Package Center vars are undefined
  source "`dirname $0`/../target/syno_package.vars"
  SYNOPKG_PKGNAME="${CRON_SYNOPKG_PKGNAME}" 
  SYNOPKG_PKGDEST="${CRON_SYNOPKG_PKGDEST}"
  CRON_LAUNCHED=True
fi

#Main variables section
DAEMON_USER="`echo ${SYNOPKG_PKGNAME} | awk {'print tolower($_)'}`"
DAEMON_HOME="/var/services/homes/${DAEMON_USER}"
OPTDIR="${SYNOPKG_PKGDEST}"
TEMP_FOLDER="`find / -maxdepth 2 -name '@tmp' | head -n 1`"
MANIFEST_FOLDER="/`echo $TEMP_FOLDER | cut -f2 -d'/'`/crashplan"
LOG_FILE="${SYNOPKG_PKGDEST}/log/history.log.0"
ENGINE_SCRIPT="CrashPlanEngine"
APP_NAME="CrashPlanService"
SCRIPTS_TO_EDIT="${ENGINE_SCRIPT}"
ENGINE_CFG="run.conf"
LIBFFI_SO_NAMES="5 6" #armada370 build of libjnidispatch.so is newer, and uses libffi.so.6
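#name of the variable in bin/run.conf which holds the engine's Java options (heap sizes etc.)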
CFG_PARAM="SRV_JAVA_OPTS"
source ${OPTDIR}/install.vars

JAVA_MIN_HEAP=`grep "^${CFG_PARAM}=" "${OPTDIR}/bin/${ENGINE_CFG}" | sed -r "s/^.*-Xms([0-9]+)[Mm] .*$/\1/"`
SYNO_CPU_ARCH="`uname -m`"


case $1 in
  start)    
    #set the current timezone for Java so that log timestamps are accurate
    #we need to use the modern timezone names so that Java can figure out DST 
    SYNO_TZ=`cat /etc/synoinfo.conf | grep timezone | cut -f2 -d'"'`
    SYNO_TZ=`grep "^${SYNO_TZ}" /usr/share/zoneinfo/Timezone/tzname | sed -e "s/^.*= //"`
    grep "^export TZ" ${DAEMON_HOME}/.profile > /dev/null \
     && sed -i "s%^export TZ=.*$%export TZ='${SYNO_TZ}'%" ${DAEMON_HOME}/.profile \
     || echo export TZ=\'${SYNO_TZ}\' >> ${DAEMON_HOME}/.profile
    #this package stores the machine identity in the daemon user home directory
    #so we need to remove any old config data from previous manual installations or startups
    [ -d /var/lib/crashplan ] && rm -r /var/lib/crashplan

    #check persistent variables from syno_package.vars
    USR_MAX_HEAP=0
    if [ -f ${SYNOPKG_PKGDEST}/syno_package.vars ]; then
      source ${SYNOPKG_PKGDEST}/syno_package.vars
    fi
    USR_MAX_HEAP=`echo $USR_MAX_HEAP | sed -e "s/[mM]//"`

    #create or repair libffi symlink if a DSM upgrade has removed it
    for FFI_VER in ${LIBFFI_SO_NAMES}; do 
      if [ -e ${OPTDIR}/lib/libffi.so.${FFI_VER} ]; then
        if [ ! -e /lib/libffi.so.${FFI_VER} ]; then
          #if it doesn't exist, but is still a link then it's a broken link and should be deleted
          [ -L /lib/libffi.so.${FFI_VER} ] && rm /lib/libffi.so.${FFI_VER}
          ln -s ${OPTDIR}/lib/libffi.so.${FFI_VER} /lib/libffi.so.${FFI_VER}
        fi
      fi
    done

    #fix up some of the binary paths and fix some command syntax for busybox 
    #moved this to start-stop-status from installer.sh because Code42 push updates and these
    #new scripts will need this treatment too
    FIND_TARGETS=
    for TARGET in ${SCRIPTS_TO_EDIT}; do
      FIND_TARGETS="${FIND_TARGETS} -o -name ${TARGET}"
    done
    find ${OPTDIR} \( -name \*.sh ${FIND_TARGETS} \) | while IFS="" read -r FILE_TO_EDIT; do
      if [ -e ${FILE_TO_EDIT} ]; then
        #this list of substitutions will probably need expanding as new CrashPlan updates are released
        sed -i "s%^#!/bin/bash%#!${SYNOPKG_PKGDEST}/bin/bash%" "${FILE_TO_EDIT}"
        sed -i -r "s%(^\s*)nice -n%\1${SYNOPKG_PKGDEST}/bin/nice -n%" "${FILE_TO_EDIT}"
        sed -i -r "s%(^\s*)(/bin/ps|ps) [^\|]*\|%\1/bin/ps w \|%" "${FILE_TO_EDIT}"
        sed -i -r "s%\`ps [^\|]*\|%\`ps w \|%" "${FILE_TO_EDIT}"
        sed -i "s/rm -fv/rm -f/" "${FILE_TO_EDIT}"
        sed -i "s/mv -fv/mv -f/" "${FILE_TO_EDIT}"
      fi
    done

    #any downloaded upgrade script will usually have failed until the above changes are made so we need to
    #find it and start it, if it exists
    UPGRADE_SCRIPT=`find ${OPTDIR}/upgrade -name "upgrade.sh"`
    if [ -n "${UPGRADE_SCRIPT}" ]; then
      rm ${OPTDIR}/${ENGINE_SCRIPT}.pid
      SCRIPT_HOME=`dirname $UPGRADE_SCRIPT`

      #make CrashPlan log entry
      TIMESTAMP="`date +%D` `date +%I:%M%p`"
      echo "I ${TIMESTAMP} Synology repairing upgrade in ${SCRIPT_HOME}" >> ${LOG_FILE}

      mv ${SCRIPT_HOME}/upgrade.log ${SCRIPT_HOME}/upgrade.log.old
      chown -R ${DAEMON_USER} ${SYNOPKG_PKGDEST}
      su - ${DAEMON_USER} -s /bin/sh -c "cd ${SCRIPT_HOME} ; . upgrade.sh"
      mv ${SCRIPT_HOME}/upgrade.sh ${SCRIPT_HOME}/upgrade.sh.old
      exit 0
    fi

    #updates may also overwrite our native binaries
    if [ "${SYNO_CPU_ARCH}" == "x86_64" ]; then
      cp ${SYNOPKG_PKGDEST}/bin/synology-x86-glibc-2.4-shim.so ${OPTDIR}/lib
    else    
      cp -f ${SYNOPKG_PKGDEST}/bin/libjtux.so ${OPTDIR}
      cp -f ${SYNOPKG_PKGDEST}/bin/jna-3.2.5.jar ${OPTDIR}/lib
      cp -f ${SYNOPKG_PKGDEST}/bin/libffi.so.* ${OPTDIR}/lib
    fi

    #set appropriate Java max heap size
    RAM=$((`free | grep Mem: | sed -e "s/^ *Mem: *\([0-9]*\).*$/\1/"`/1024))
    if [ $RAM -le 128 ]; then
      JAVA_MAX_HEAP=80
    elif [ $RAM -le 256 ]; then
      JAVA_MAX_HEAP=192
    elif [ $RAM -le 512 ]; then
      JAVA_MAX_HEAP=384
    #CrashPlan's default max heap is 512MB
    elif [ $RAM -gt 512 ]; then
      JAVA_MAX_HEAP=512
    fi
    if [ $USR_MAX_HEAP -gt $JAVA_MAX_HEAP ]; then
      JAVA_MAX_HEAP=${USR_MAX_HEAP}
    fi   
    if [ $JAVA_MAX_HEAP -lt $JAVA_MIN_HEAP ]; then
      #can't have a max heap lower than min heap (ARM low RAM systems)
      JAVA_MAX_HEAP=$JAVA_MIN_HEAP
    fi
    sed -i -r "s/(^${CFG_PARAM}=.*) -Xmx[0-9]+[mM] (.*$)/\1 -Xmx${JAVA_MAX_HEAP}m \2/" "${OPTDIR}/bin/${ENGINE_CFG}"
    
    #disable the use of the x86-optimized external Fast MD5 library if running on ARM and QorIQ CPUs
    #seems to be the default behaviour now but that may change again
    if [ "${SYNO_CPU_ARCH}" != "x86_64" ]; then
      grep "^${CFG_PARAM}=.*c42\.native\.md5\.enabled" "${OPTDIR}/bin/${ENGINE_CFG}" > /dev/null \
       || sed -i -r "s/(^${CFG_PARAM}=\".*)\"$/\1 -Dc42.native.md5.enabled=false\"/" "${OPTDIR}/bin/${ENGINE_CFG}"
    fi

    #move the Java temp directory from the default of /tmp
    grep "^${CFG_PARAM}=.*Djava\.io\.tmpdir" "${OPTDIR}/bin/${ENGINE_CFG}" > /dev/null \
     || sed -i -r "s%(^${CFG_PARAM}=\".*)\"$%\1 -Djava.io.tmpdir=${TEMP_FOLDER}\"%" "${OPTDIR}/bin/${ENGINE_CFG}"

    #reset ownership of all files to daemon user, so that manual edits to config files won't cause problems
    chown -R ${DAEMON_USER} ${SYNOPKG_PKGDEST}
    chown -R ${DAEMON_USER} ${DAEMON_HOME}    

    #now edit the XML config file, which only exists after first run
    if [ -f ${SYNOPKG_PKGDEST}/conf/my.service.xml ]; then

      #allow direct connections from CrashPlan Desktop client on remote systems
      #you must edit the value of serviceHost in conf/ui.properties on the client you connect with
      #users report that this value is sometimes reset so now it's set every service startup 
      sed -i "s/<serviceHost>127\.0\.0\.1<\/serviceHost>/<serviceHost>0\.0\.0\.0<\/serviceHost>/" "${SYNOPKG_PKGDEST}/conf/my.service.xml"
      
      #this change is made only once in case you want to customize the friends' backup location
      if [ "${MANIFEST_PATH_SET}" != "True" ]; then

        #keep friends' backup data outside the application folder to make accidental deletion less likely 
        sed -i "s%<manifestPath>.*</manifestPath>%<manifestPath>${MANIFEST_FOLDER}/backupArchives/</manifestPath>%" "${SYNOPKG_PKGDEST}/conf/my.service.xml"
        echo "MANIFEST_PATH_SET=True" >> ${SYNOPKG_PKGDEST}/syno_package.vars
      fi

      #since CrashPlan version 3.5.3 the value javaMemoryHeapMax also needs setting to match that used in bin/run.conf
      sed -i -r "s%(<javaMemoryHeapMax>)[0-9]+[mM](</javaMemoryHeapMax>)%\1${JAVA_MAX_HEAP}m\2%" "${SYNOPKG_PKGDEST}/conf/my.service.xml"
    else
      echo "Wait a few seconds, then stop and restart the package to allow desktop client connections." > "${SYNOPKG_TEMP_LOGFILE}"
    fi
    if [ "${CRON_LAUNCHED}" == "True" ]; then
      [ -e /var/packages/${SYNOPKG_PKGNAME}/enabled ] || touch /var/packages/${SYNOPKG_PKGNAME}/enabled
    fi

    #delete any stray Java temp files
    find /tmp -name "jna*.tmp" -user ${DAEMON_USER} | while IFS="" read -r FILE_TO_DEL; do
      if [ -e ${FILE_TO_DEL} ]; then
        rm ${FILE_TO_DEL}
      fi
    done

    #increase the system-wide maximum number of open files from Synology default of 24466
    echo "65536" > /proc/sys/fs/file-max

    #raise the maximum open file count from the Synology default of 1024 - thanks Casper K. for figuring this out
    #http://support.code42.com/Administrator/3.6_And_4.0/Troubleshooting/Too_Many_Open_Files
    ulimit -n 65536

    if [ "${SYNO_CPU_ARCH}" == "x86_64" ]; then
      #Intel synos running older DSM need rwojo's glibc version shim for inotify support
      #https://github.com/wojo/synology-x86-glibc-2.4-shim
      GLIBC_VER="`/lib/libc.so.6 | grep -m 1 version | sed -r "s/^[^0-9]*([0-9].*[0-9])\,.*$/\1/"`"
      if [ "${GLIBC_VER}" == "2.3.6" ]; then
        su - ${DAEMON_USER} -s /bin/sh -c "LD_PRELOAD=${SYNOPKG_PKGDEST}/lib/synology-x86-glibc-2.4-shim.so ${OPTDIR}/bin/${ENGINE_SCRIPT} start"
      else
        su - ${DAEMON_USER} -s /bin/sh -c "${OPTDIR}/bin/${ENGINE_SCRIPT} start"
      fi
    else
      su - ${DAEMON_USER} -s /bin/sh -c "${OPTDIR}/bin/${ENGINE_SCRIPT} start"
    fi
    exit 0
  ;;

  stop)
    su - ${DAEMON_USER} -s /bin/sh -c "${OPTDIR}/bin/${ENGINE_SCRIPT} stop"
    if [ "${CRON_LAUNCHED}" == "True" ]; then
      [ -e /var/packages/${SYNOPKG_PKGNAME}/enabled ] && rm /var/packages/${SYNOPKG_PKGNAME}/enabled
    fi
    exit 0
  ;;

  status)
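    #identify the engine process by the app=CrashPlanService property on its Java command line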
    PID=`/bin/ps w| grep "app=${APP_NAME}" | grep -v grep | awk '{ print $1 }'`
    if [ -n "$PID" ]; then
      exit 0
    else
      exit 1
    fi
  ;;

  log)
    echo "${LOG_FILE}"
    exit 0
  ;;
esac
 

Changelog:

  • 0027 Fixed open file handle limit for very large backup sets (ulimit fix)
  • 0026 Updated all CrashPlan clients to version 3.6.3, improved handling of Java temp files
  • 0025 glibc version shim no longer used on Intel Synology models running DSM 5.0
  • 0024 Updated to CrashPlan PROe 3.6.1.4 and added support for PowerPC 2010 Synology models running DSM 5.0
  • 0023 Added support for Intel Atom Evansport and Armada XP CPUs in new DSx14 products
  • 0022 Updated all CrashPlan client versions to 3.5.3, compiled native binary dependencies to add support for Armada 370 CPU (DS213j), start-stop-status.sh now updates the new javaMemoryHeapMax value in my.service.xml to the value defined in syno_package.vars
  • 0021 Updated CrashPlan to version 3.5.2
  • 0020 Fixes for DSM 4.2
  • 018 Updated CrashPlan PRO to version 3.4.1
  • 017 Updated CrashPlan and CrashPlan PROe to version 3.4.1, and improved in-app update handling
  • 016 Added support for Freescale QorIQ CPUs in some x13 series Synology models, and installer script now downloads native binaries separately to reduce repo hosting bandwidth, PowerQUICC PowerPC processors in previous Synology generations with older glibc versions are not supported
  • 015 Added support for easy scheduling via cron – see updated Notes section
  • 014 DSM 4.1 user profile permissions fix
  • 013 implemented update handling for future automatic updates from Code 42, and incremented CrashPlanPRO client to release version 3.2.1
  • 012 incremented CrashPlanPROe client to release version 3.3
  • 011 minor fix to allow a wildcard on the cpio archive name inside the main installer package (to fix CP PROe client since Code 42 Software had amended the cpio file version to 3.2.1.2)
  • 010 minor bug fix relating to daemon home directory path
  • 009 rewrote the scripts to be even easier to maintain and unified as much as possible with my imminent CrashPlan PROe server package, fixed a timezone bug (tightened regex matching), moved the script-amending logic from installer.sh to start-stop-status.sh with it now applying to all .sh scripts each startup so perhaps updates from Code42 might work in future, if wget fails to fetch the installer from Code42 the installer will look for the file in the public shared folder
  • 008 merged the 14 package scripts each (7 for ARM, 7 for Intel) for CP, CP PRO, & CP PROe – 42 scripts in total – down to just two! ARM & Intel are now supported by the same package, Intel synos now have working inotify support (Real-Time Backup) thanks to rwojo’s shim to pass the glibc version check, upgrade process now retains login, cache and log data (no more re-scanning), users can specify a persistent larger max heap size for very large backup sets
  • 007 fixed a bug that broke CrashPlan if the Java folder moved (if you changed version)
  • 006 installation now fails without User Home service enabled, fixed Daylight Saving Time support, automated replacing the ARM libffi.so symlink which is destroyed by DSM upgrades, stopped assuming the primary storage volume is /volume1, reset ownership on /var/lib/crashplan and the Friends backup location after installs and upgrades
  • 005 added warning to restart daemon after 1st run, and improved upgrade process again
  • 004 updated to CrashPlan 3.2.1 and improved package upgrade process, forced binding to 0.0.0.0 each startup
  • 003 fixed ownership of /volume1/crashplan folder
  • 002 updated to CrashPlan 3.2
  • 001 initial public release
 
 

2,380 thoughts on “CrashPlan packages for Synology NAS”

  1. Chris (@g0afr3ak)

    After adding a lot of photos the backup didn’t work (stuck at “Analyzing….”). Upgraded the RAM on my DS 412+ to 2GB, changed USR_MAX_HEAP to 1024M and now it’s working again!

    1. Germaine

      How did you amend the permissions on the syno_package.vars file? I’m unable to do that so I can’t change the Max Heap.

  2. Carlos

    Hello,
    Sorry I’m newbie I’ve the crash plan installed and java on a DS412+. but how I can configure what files from NAS to backup to crash plan cloud??

    thanks in advance

    1. Aram

      Hi Carlos,
      Rather than repeating every thing that has been written already, I suggest you Google “CrashPlan Headless Client”. Also if you run into trouble, reading through the comments on this site will tell you all you need.

  3. marv

    I have successfully installed the package and Crashplan can back up the files on my Synology NAS. The only problem I have is that on one folder, I select the complete folder, but Crashplan only sees it as one file. Also the icons of the selected files and folders in that folder are the same. On other folders I don’t have that problem. It probably has something to do with rights on that folder but I can’t seem to find what the actual problem is.

  4. Pope

    Hi Patters (or anyone else who may be able to help!)

    I’ve been running CrashPlan on my Synology for a few years without problem and had some problems when the new Synology update came out but following instructions in the comments here I got it sorted.

    But…about 2 weeks ago the CrashPlan server stopped working and I’ve only just got around to looking at it. CrashPlan wasn’t starting and wasn’t logging anything when it didn’t start, so I uninstalled both it and Java and then installed the latest Java SE Embedded 7 package and then opened the log for that package to check it was all ok.

    In the log is an error stating:

    Error occurred during initialization of VM
    java/lang/ClassNoFoundException: error in opening JAR file /volume1@appstore/java7/jre/lib/rt.jar

    JAVA_HOME=/volume1/@appstore/java7/jre
    TZ=Europe/Dublin

    Tried rebooting – no difference
    Tried uninstalling and reinstalling java – no difference

    I did a quick google and found a post of StackOverflow that suggested that java hadn’t been installed correctly and the pack files hadn’t been unpacked to jar files, so I logged in using SSH and checked the files in that directory and the rt.jar file is there as expected so that doesn’t seem to be the problem…

    Anyone got any ideas?

    Thanks

  5. RAJ

    Just thought I’d share my experience, as I’ve benefited greatly from many of you, most of all you, Patters… thanks for keeping this up and running, and all the patches along the way.

    I originally was looking to do this @September of 2013. I had a stock Synology 1511+ with original 1GB of memory. Currently have 5 drives of various sizes, using Synology’s Hybrid Raid Structure… for a total of 5.4 TB of usable storage. It is now 65% used. Ever since then to now… I have benefited from every Patter’s patch, forum comments and advice.

    I have since upgraded to 3GB of memory (as per advice), as I found the more I sent up… the more I needed. And currently have 2.2TB uploaded onto CrashPlan Servers… with approx .5 TB left to go that I want to move.

    Current setup:

    DSM 5.0-4482
    Patter’s CrashPlan Version 3.6.3-0027
    Patter’s Java SE Embedded 8 Version 1.8.0_0132-0023
    CrashPlan + 3 Version 3.6.3

    Aside from using the above, and upgrading to 3GB of memory… the following 3 tweaks helped me get it all working seamlessly. Using Putty to SSH onto the DS1511+, and logging in as root:

    1. EDIT syno_package.vars
    a. cd /volume1/@appstore/CrashPlan
    b. vi syno_package.vars
    c. Change “#USR_MAX_HEAP=512M” to “USR_MAX_HEAP=2560M”
    d. Hit “ESCAPE” – “:wq” – “ENTER”

    2. EDIT run.conf
    a. cd /volume1/@appstore/CrashPlan/bin
    b. vi run.conf
    c. Change “-Xms20m -Xmx512m” to “-Xms128m -Xmx2560m” on first line
    d. Hit “ESCAPE” – “:wq” – “ENTER”

    3. EDIT CrashPlanEngine
    a. cd /volume1/@appstore/CrashPlan/bin
    b. vi CrashPlanEngine
    c. Add lines:
    #Increase open files limit
    ulimit -n 65536
    d. Hit “ESCAPE” – “:wq” – “ENTER”

    The above changes pretty much result in 95-98% of the memory being used to send the files up to CrashPlan… leaving very little left to do other stuff… so if you bang on the server frequently… you may have to schedule around that.. I’m okay with it for now, as my priority for now is to get all of the stuff I want backed up – onto CrashPlans servers. Eventually when the final .5 TB is uploaded… I will experiment with the above tweaks to get it optimized. I’m guessing (hoping) that I only need the dedicated memory during the upload process… and hopefully once everything is up there… less will be needed for incremental backups.

    Occasionally I would have to uninstall everything after an update, reinstall, make the above changes again…. then adopt my computer… but its a small price to pay to be able to pick up from where it left off…. after a resync of course… but the files didn’t need to be re-uploaded.

    Hope this helps,…. and a thousand thanks again Patters!

    RAJ

    1. patters Post author

      Your points 2 and 3 are done automatically by the package by the way. Point 1 is the only required customization.

  6. Joe

    It appears as though the CrashPlan and CrashPlan Pro packages will no longer start after the most recent DSM update 5.0-4482. Anyone have any insight into this? Thank you.

    1. Michael

      Have the same Problem with the latest DSM update. Crashplan App doesn’t start. Reinstalling Java didn’t help.

  7. Mitchell

    I have tried to edit the syno_package.vars file, but it is read-only. Can someone recommend how to make it editable? I tried to reboot the server, but that did not help. Thanks.

    1. Shane

      Use Winscp. You can download that from Portable apps.com I use it and it makes things easy.

      1. Mitchell

        Thanks Shane. I could connect with FTP. However, I could not navigate to \volume1.

      2. Shane

        Use the SCP protocol. That will work better. You can then click Edit and then modify the file. Ensure that you login using ROOT and NOT admin. Use ROOT.

      3. Mitchell

        Thanks. I just did it, and it is still write protected. When I try to change the permissions, it denies me. I tried to rename it, and it is denied too.

      4. Shane

        Looking at the rights this is what I see on /volume1/@appstore/CrashPlan\syno_packages.vars

        rw-r--r--

        Do yours look like that? You can also right click on the file and get the linux properties which have the Owner, Group and Other Rights.

      5. Mitchell

        I just saw your recommendation to log in as Root. That worked! I just restarted crashplan and it connected. I will check it in the AM and see if it is working and has not disconnected. Shane, thank you so much for your help with the editing and recommendation to use winscp. I will leave a follow-up post. Mitchell

      6. Shane

        No problem Mitchell. This forum has helped me a lot and any help that I can give back I try to do so.

        You’re welcome. Glad it helped.

  8. seb

    Hello, my crashplan refuse to start :(

    DS414 / MARVELL Armada XP / 1.33 GHz / 1024 Mo
    DSM 5.0-4482
    Java 8 correctly installed from packages.pcloadletter.co.uk

    Then i installed Crashplan 3.6.3 but it never launched.
    In log i ve Crashplan start , then 1s later, Crashplan stopped

    any idea ?

  9. david

    Hi all, I was running CrashPlan on my 1513+ and seemed to be working well. Had to increase the RAM when I increased the backup load, but it seemed to work fine after that.

    Now I am getting the “Unable to connect to the backup engine” again. The only thing that changed was that I reformatted the computer that is running the CrashPlan desktop app and once I reinstalled that, cannot connect to the backup engine.

    I tried an uninstall of both JAVA and CP, rebooted, re-installed both and modified the syno_package.vars file (upped the heap to 3000M) and modified the run.conf file and then rebooted again.

    After all that, still no luck connecting to the backup engine.

    Any suggestions?? This is so frustrating! I just wish CrashPlan would support headless servers already!!!

    Thank you,
    David

  10. Alex

    Hello,

    I have added “http://packages.pcloadletter.co.uk” to the package sources, but CrashPlan does not appear.

    1. B. Goodman

      I thought that too, until I noticed the light grey scroll bar. Once I scrolled UP, I found CrashPlan!

  11. Eric

    Upgrade on a DS214 to DSM 5.0 was a bit painful.

    Even though I stayed with Java 1.7 (the latest package doesn’t seem to work with 1.8 yet anyway), I had to “backtrack” to 3.6.3-026.
    With 3.6.3-027, it seems that CrashPlan simply chokes when scanning the list of (already) backed-up files. CrashPlan was running on the server. Code 42 was reporting regular connections but nothing was getting saved (and Code 42 was sending warning messages about the situation).
    I am backing up 259 709 files, so maybe there is a link to the fix for large backup sets???

    In any case, thanks to Patters for maintaining the package(s)!

    1. C Farley

      Eric, I’m a noob at this, but can I ask how you “backtracked” to 026? I’ve been wanting to do that for a couple of months now.
      I have never gotten 027 to work on my 412+ with DSM 5. I’ve had to “backtrack” myself to DSM 4 in order to get CPpro working again.

      THANKS!

  12. Chris Standley

    I have Crashplan working perfectly (as far as I can tell!).
    I want to use it to back up the CloudStation folder on my NAS that, in turn, I’m using to backup my two video edit suites.
    Problem is, Crashplan’s file browser cannot see the Cloud Station folder. It sees the admin folder (where the CloudStation folder resides – correct?) but once I twirl this down – nothing. I’m sure it’s a permissions issue but cannot figure this out.
    Any (straightforward) advice?

    1. DJ

      I had to give the crashplan user specific read-access to the “homes” share. Then the CP client sees the folders. If you already did this I would not know why.

  13. Pingback: On a home media system set-up | The Howle Blog

  14. b5b8d2

    Ok, I just upgraded my DS412+ from 4.3 to the latest 5.0 without issue! My system is syncing my backup as I type this! Here’s the exact steps I took..

    I was running the Java Manager 1.6 from Synology/Oracle for years now without issue so that’s what I’m doing again.

    0. You made backups of everything, right?!? :)
    1. Uninstall Java Manager
    2. Uninstall Crashplan
    3. Upgrade DSM/Reboot
    4. Make sure everything system wise looks ok
    5. Goto Package Manager and install Java Manager 1.7 from Synology, this doesn’t actually install it yet it just sets the system up so you can install it
    6. Goto Control Panel and you should see a Java Manager Icon, run it and it should say Install Or Upgrade Java, but the classpath should be blank
    7. After you hit Install/Upgrade it should give you a screen telling you that you need to download Java from Oracle, go ahead and do that, the file you want is: jdk-7u55-linux-i586.tar.gz which is 133 megs
    8. Now go ahead and install it, once it does its thing you should now have a full java classpath
    9. Reboot
    10. Go back to Package Manager and install Patters Crashplan 3.6.3-0027
    11. Reboot
    12. Once it comes back verify that the Crashplan service started, then connect with your client and it should ask you for your account information, enter that and it should ask you if you want to adopt the backup, which of course you do!
    13. After it works out a few things you’ll also need to enter your private backup key and maybe answer another question or two, but after all of that you SHOULD be in business!

    You MAY have to reboot again after you enter your account information, I think I had to..

    Overall though the process went very smoothly, I didn’t have to edit any files, services came right up and so far I’m about 80% Block Sync’d of 326Gigs/640K files..

    Super happy!

    Thanks again Patters, you rock!

  15. AJ Willmer

    I installed your package AFTER uninstalling the CrashplanWiki manual install. I noticed that I had to explicitly grant read permission to user crashplan to all folder hierarchies that I wanted to back up. Is this expected behaviour or should the user crashplan be part of an administrators group?

  16. Arjan

    I have added “http://packages.pcloadletter.co.uk” to the package sources, but CrashPlan does not appear

    Can anyone help?

    I Have a DS414J

    1. patters Post author

      This product has a new CPU type (Mindspeed Comcerto 2000). Did you get Java installed and does it correctly display the version information in the package Log in Package Center? If so I can add support for this model to the other packages like CrashPlan and Serviio.

      Please can you also post the result from running cat /proc/cpuinfo while connected to your NAS via SSH? Log in as the user root using the same password you defined for the admin account.

      Please also run uname -a and paste the result here. Thanks!

      1. Scott

        Java 7 & 8 both install but both don’t list anything in log. Get a “java: not found” error line. Any tips?

        Also have a 414j that would like to get the CrashPlan package running on.

  17. adamT

    In what directory on the synology does crashplan install to? Im using Crashplan ProE 3.6.1.4

  18. FreKai

    I am having a problem every time I update DSM or CrashPlan on my DSJ 213. After the update has completed and CrashPlan is reinstalled, I choose the option that the DS is taking over for an old device and resuming. CrashPlan then starts to back up again – BUT all older backups (from my PCs which are using the DS as a backup location) are gone… why? It seems they got deleted when CrashPlan starts fresh after the update, but that’s not what the option to resume for another device should be for…

    I am using the DS as a backup location for my PCs and I have an online subscription (CrashPlan Central).

    How do I resume correctly, so that all the old backup sets are available? In CrashPlan Central all the old backups are there, but not on the DS itself.

    Can anybody help me please?

    1. Heisenberg

      I had the same issue during my last update – permissions had been reset for the ‘crashplan’ user account on the DS and my previous backups were not showing in the Crashplan desktop client. I granted read permissions to the ‘crashplan’ account, restarted the desktop client and then I was able to select the correct backup folders. De-duplication then did its magic and recognised the backup locations – everything was up and running after a while (depends on the size of your backups).

      Hope this helps you out.

  19. shayaknyc

    Ok, here’s something weird. I successfully installed Java 7 and the regular Crashplan packages. Crashplan was running just fine. Came back this morning, crashplan stopped. Won’t restart. Logs don’t update either. Tried uninstalling crashplan, restarting nas, reinstall crashplan, won’t stay running, logs empty. Any ideas?

    1. Yorn

      I have this problem too.

      I think we have to follow the guide from “b5b8d2″ above and restart it like 3 times. Since the patch came out recently I uninstalled all packages of Java and CrashPlan, installed the update so it’d restart. Then I installed patters version of Java (ejre-7u60-fcs-b19-linux-i586-headless-07_may_2014.tar.gz) and restarted, then I installed CrashPlan and restarted. It works now, but I had to redo the whole of my backups. I’m really worried I’m going to have to recreate every time, basically, and spend several days backing everything up to Crashplan again each time.

      1. shayaknyc

        I figured it out this morning! For some reason the directory /tmp/@tmp didn’t exist, so I created it and gave it full permissions. Now CrashPlan works!

    2. Chris

      I had the exact same problem and behavior. Without any restart or reinstall of CrashPlan, I uninstalled/installed the Java package. Restarted CrashPlan and everything was working without having lost any settings.
      Using CrashPlan (reg) and Java 8 packages.

  20. narble

    Here’s an odd one. Suggestions appreciated.

    Installed the CrashPlan package successfully and am able to connect to it with the CrashPlan desktop app, but when I try to login, the desktop app tells me my credentials are invalid.

    I can login to the CrashPlan website just fine and when I change the ui.properties file to point back to the service running on my desktop, it accepts my credentials just fine.

    Any ideas?

  21. NativePaul

    Firstly a massive thank you to patters for making this possible! It’s greatly appreciated.
    My DS413j used to handle Crashplan but now refuses to backup to Crashplan cloud (since recent DSM updates). It says it’s connecting but constantly crashes. I have about 1TB of data to push to the cloud and suspect the 512MB of un-upgradable RAM is insufficient. The primary purpose of my NAS is to manage backups to Crashplan. Is there anything I can do to make this work on the DS413j? If not, can anyone recommend a good & affordable alternative?

    1. Bjorn

      I dont have Crashplan+ myself, but it seems like the memory problem is due to a large set of files to backup. Why not use the feature in Crashplan+ that allows you to use several backup sets, and segment your data into smaller pieces. As I said, I dont have + so I can’t try it myself.

      1. NativePaul

        Thanks Bjorn. I tried breaking my folders into much smaller backup sets that run at different times but still having no luck. It now gets stuck on Analyzing files for a while, then crashes and never uploads anything. I uncommented the heap_size line to make it 512MB (but that is how much RAM the 413j has). I gave the crashplan user read access to all folders. But still no dice. Any other thoughts?

      2. NativePaul

        As a test, I created a new backup set to upload one 30MB file. Crashplan said it was complete, but the restore function in the GUI and on the website don’t have the file available. The GUI reports that none of my folders have been modified since September 2013 (when I signed up to Crashplan) despite reporting progress and success for months. I’ve run out of patience now and have no idea what to do.

      3. Techiedork

        Paul – I was in the same boat – despite Patters’ amazing work and support, Crashplan on a Diskstation is just too unreliable to be a good backup option. Every DSM update breaks it one way or another.

        I ended up getting a used Mac Mini from eBay. Since they officially support backing up a NAS via a Mac or Linux system (not a Windows PC), I just have the Mac mini sitting on top of the DS with the shares mounted. Crashplan works great (and is super fast) and reliably backs up the NAS. Updates to DSM won’t cause any issues and if for some reason I have to reformat the Mac Mini, I can just adopt the backup and pick up where I left off.

        It was some cost I wasn’t anticipating, but the reliability is completely worth it.

      4. NativePaul

        Techiedork – Only saw your comment now. Thanks for the tip – I had to settle for a very similar solution in the end. Fortunately I had an old windows laptop lying around so that’s now my dedicated backup machine. Took some fiddling to get the network drives mapped but looks like I’m finally backing up again.

    2. NativePaul

      After butting my head against a wall for over two weeks I had to give up. Having tried every suggestion online I am now running a ridiculous solution with an old Windows laptop on the LAN dedicated to CrashPlan Central backups from my DS413j. Patters is a hero for the work he’s put into this but it is a real shame that CrashPlan won’t support NAS backups with a solution that simply works.

  22. Ivom74

    I reinstalled CrashPlan a lot of times. The problem is that after logging in with my existing account, the CrashPlan prompt to confirm that this is the same computer (use the old GUID) doesn’t appear.
    I have to back up all my data again and delete the old computer GUID.

    Is there a way to change the GUID of CrashPlan in an ini file?
    By the way, it’s installed on a Synology DS212.

    1. patters Post author

      In the CrashPlan client you can enter commands by double-clicking the CrashPlan icon in the top corner. There’s a GUID command to set it manually.

  23. BabakHeidariDDS

    Changing the GUID back to the old DiskStation GUID fixes the update problem, after you re-establish the connection by reinstalling Java and CrashPlan on the DS and restarting 2-3 times. Change the GUID so you won’t have to re-backup everything.
    Wish CrashPlan and Synology would work together and make a trouble-free application out of Patters’ great work. It would make both companies more useful. More people would buy/subscribe to their products.
    I am a dentist that uses the synology diskstation as a data server for X-ray data, pt data, images via SQL.
    Loading crashplan is the best way I have found to automate my 3 location backup for the DS.
    Thanks Patters
    PS. Long night figuring out this prob again, no more synology updates for me.

    Reply
      1. BabakHeidariDDS

        Thanks to you for everything.
        One question – I always have 23 files that don’t get backed up. How can I find out what they are? The log file on my PC doesn’t show incoming errors from the DiskStation, and I haven’t found a log file on the DiskStation that details this either.
        Thanks again.

  24. svar

    When DSM gets updated, the CP package stops and won’t restart until I uninstall and reinstall Java 8. Not a big deal, as it has worked every time so far – just wanted everyone to know.

    Reply
  25. Pingback: UPDATED 2014: How to setup CrashPlan Cloud Backup on a Synology NAS running DSM 5.0 - Scott Hanselman

  26. belnas

    What else can I try?

    Setup:
    DS213j | DSM 5.0-4493 Update 1
    community Java package, 7u60

    backing up 1.3TB (in 4 sets of 500GB, 400GB, 300GB, 100GB)

    Adopting the computer + synchronizing block information works well (connected for 20h),
    but when it starts backing up, I get “CrashPlan disconnected from backup engine” every 2 minutes :(

    I tried Java 6, 7 & 8,
    installing, uninstalling, rebooting….
    leaving USR_MAX_HEAP=512M,
    changing it to USR_MAX_HEAP=400M….
    ….

    I can’t upgrade the 512MB RAM of my box :(
    What else can I try?

    * At one point, with DSM 4.0 + Oracle Java 7u55 it was working SUPER well and fast :(

    Thanks

    Reply
  27. Heisenberg

    Just successfully upgraded to latest DSM and Java, step by step as follows. Some steps or reboots may not have been necessary but I wanted to stay with a proven workflow!


    Synology DS214
    DSM 5.0-4482 upgrade > 5.0-4493 > 5.0-4493 Update 1
    CrashPlan 3.6.3-0027
    Java SE Embedded 7 – 1.7.0_55-0024 upgrade > 1.7.0_60-0025

    NOTE: *All DSM updates require Java reinstall*

    –STEP 1–
    Stop CrashPlan and uninstall Java

    REBOOT

    –STEP 2–
    Update DSM (in my case this was a 2-step upgrade to 5.0-4493 Update 1 with forced reboots between upgrades)

    –STEP 3–
    Reinstall Java SE Embedded 7 (if upgrading you will need to download the new version from Oracle, see link below, registration is required) and copy to ‘public’ folder. If you are not upgrading, simply reinstall the version you already have.

    LINK: http://www.oracle.com/technetwork/java/embedded/downloads/javase/index.html
    FILE NAME: ejre-7u60-fcs-b19-linux-arm-vfp-sflt-client_headless-07_may_2014.gz

    REBOOT

    –STEP 4–
    Restart CrashPlan engine

    –STEP 5–
    Wait for CPU activity to settle then stop CrashPlan

    REBOOT

    –STEP 6–
    Check CrashPlan user account has at least 'read' permissions for your backup folders

    –STEP 7–
    Run CrashPlan

    –STEP 8–
    Run CrashPlan desktop client

    *The last time I performed this upgrade (April) I also uninstalled the headless CrashPlan engine, as well as Java, but that's not necessary (at least in my situation as described above) and saves you having to re-sign in to your CrashPlan account, adopting previous backups and going through the data de-duplication process. Following the steps above I was able to start my desktop CrashPlan client and it behaved as if nothing had changed.

    Hope this helps.

    Reply
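
If you prefer SSH to the Package Center for the stop/start steps above, DSM’s own package tool can do the same job – a minimal sketch, assuming the package ID matches the @appstore folder name (CrashPlan) mentioned elsewhere in this thread:

    # stop the package before the DSM/Java work, start it again afterwards
    /usr/syno/bin/synopkg stop CrashPlan
    /usr/syno/bin/synopkg start CrashPlan
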
  28. hal sandick

    Since running the last DSM update for my server (details below), the CrashPlan package on my server says it is stopped. However, when I open the CrashPlan client on my PC I can initiate a backup. The log on the Synology server shows that
    1) it started when the server booted up
    2) the scan I initiated occurred, and files were uploaded.

    Any suggestions?

    Thanks for the great package!

    hal

    Details:
    DS213+
    updated to DSM 5.0-4493 Update 1
    java 1.8.0_0132-0023
    your CrashPlan package 3.6.3-0027

    Reply
    1. hal sandick

      Okay, I uninstalled Java 8 and installed Java 7, and now CrashPlan says it’s running.

      hal

      Reply
  29. Jeppe

    I’m getting this error in my logs:

    JVM temp directory /@tmp exists but we can’t write to it!

    Had trouble with CP on the newest DSM. Removed both Java and CP.
    Installed the Synology Java Manager and the newest 1.7, plus CrashPlan.
    Did a lot of reboots in the process.

    Now CP won’t start and the logs state the mentioned line and nothing else.

    Any ideas?

    BR Jeppe

    Reply
  30. B. Goodman

    Also, do the CPU usage settings “When user is away…” and “When user is present” do anything on a Synology NAS? (DS412+)

    Reply
  31. Jeppe

    Hey,
    I’m having issues with CP on a 214+.

    It has been working before, but stopped for some reason. After trying a lot of things with different Java versions, numerous reboots and so on, I finally crumbled yesterday and removed both Java and CP.

    I figured that starting over would be an “easy” fix, but it turned out not to be.

    I used the Synology Java Manager and installed the newest build of version 1.7. Then rebooted and installed CP. Then rebooted…

    But CP won’t start, it just stops immediately.

    I managed to get hold of some log files and found this line in one of them:
    “JVM temp directory /@tmp exists but we can’t write to it!”

    Can anyone help me out here? What is the solution?

    BR
    Jeppe

    Reply
    1. patters Post author

      Normally there is a hidden, system-created temp folder on your main volume (/volume1/@tmp). I have to move the Java temp folder to this location because on a Syno the default location of /tmp has very little space. I don’t understand why your NAS would not have this. Can you check for this folder via SSH?

      Reply
      1. Jeppe

        It is there. I will try and set the permissions manually on the @tmp folder and then re-install Java package and CP.

      2. Jeppe

        Hmmmm, didn’t do the trick.

        I see the same issue on my own DS412+ (the other was a friend’s 214+). I’m in the process of migrating from a QNAP and didn’t get around to setting up CP yet (other than just hitting install on Java and CP).

      3. Jeppe

        Hmm, got it working on my own DS412+ just by re-installing Java and CP.

        On the DS214+ it’s another case, however. Are there any other log files that could show some useful information?
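
For anyone else hitting the “JVM temp directory /@tmp exists but we can’t write to it!” error, the SSH check patters suggests above amounts to something like the following – a sketch, assuming the main volume is volume1 and that a world-writable temp directory (mode 1777, as on /tmp) is acceptable:

    # confirm the folder exists and inspect its ownership and permissions
    ls -ld /volume1/@tmp
    # if the CrashPlan daemon user cannot write to it, opening it up like /tmp may help
    chmod 1777 /volume1/@tmp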

  32. patters Post author

    It used to be that Synology only updated DSM a couple of times a year, but now it’s becoming clear that they’re going for a much tighter update cycle in order to fix security vulnerabilities and so on. I will soon update the Java package to check for the PATH and JAVA_HOME modifications each time it starts so it can survive DSM updates. The big problem used to be the locale stuff getting removed by DSM each time it updated, but that’s been baked into the OS from about DSM 4.2. This should in turn make CrashPlan much more reliable. Stay tuned.

    Reply
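
The kind of startup check patters describes would look roughly like this – a sketch only, not the package’s actual code, with placeholder paths for the profile file and the Java install location:

    # re-apply the environment changes if a DSM update has wiped them
    if ! grep -q 'JAVA_HOME=' /etc/profile; then
      echo 'JAVA_HOME=/path/to/java/package' >> /etc/profile
      echo 'PATH=$PATH:$JAVA_HOME/bin' >> /etc/profile
      echo 'export JAVA_HOME PATH' >> /etc/profile
    fi
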
  33. Hagion Pneuma

    After the last DSM update, the package stopped working. I uninstalled it, but now I can’t find it anymore in the package center. There’s only Java 6/7/8 and Serviio.

    Reply
  34. Matt Maher

    I’d really appreciate some help here. I’m so lost!

    I have everything installed correctly (or so it seemed), but I just can’t get it to work. I get the dreaded “unable to connect to the backup engine” message on the client PC.

    I have Java installed on my Synology. I have Crashplan installed. Both are running. I have the ui.properties file edited on my client PC. I have the Crashplan service stopped on the client PC. But I just can’t get Crashplan to open so I can set it all up to run the backup. I’ve reset the client PC and the Synology a few times now.

    I’ve looked through all these comments, and I’ve been working on this for hours now. Please help!

    Reply
    1. B. Goodman

      Any chance you ruined the ui.properties by editing with Notepad in Windows? You probably already knew about that, but just thought I’d mention it since nobody else responded yet. FWIW, I copied that file to the Synology, used the editor in the Synology, then copied it back to the right spot in Windows.

      Reply
      1. Matt Maher

        Thanks for your response, but I think the file is OK. I used Notepad++, and the file looks fine when opened. I did have trouble saving my changes, but I renamed the old file and created a new one with the settings I wanted. I have the correct IP address, and the servicehost and serviceport are both “commented in” or however you describe that…

  35. Perry

    Hi, I have exactly the same issue as Matt: the “Unable to connect to the backup engine” message on the client PC, and everything seems to be running as it should.

    But when i do netstat:
    DiskStation> netstat -an | grep 424
    tcp 0 0 0.0.0.0:4242 0.0.0.0:* LISTEN
    tcp 0 0 0.0.0.0:4243 0.0.0.0:* LISTEN
    This is different from what CP says it should be.
    The second line does not show 127.0.0.1:4243

    Can this be the issue?
    How to change it?

    I have a DS414 with 2 NICs; NIC 2 is not used. The CrashPlan service is enabled and bound to interface LAN1 in Control Panel > Info Center.

    Reply
  36. Perry

    Hi,
    Issue solved. The issue was on the client side: in ui.properties, servicehost was not commented out.
    Thanks for all the work you put into making Crashplan available for Synology!!

    Reply
    1. Matt Maher

      Wait, I’m not sure I understand. You’re saying that you ended up having to comment out the servicehost line? But that’s the part that actually connects to the NAS. Without that isn’t the UI just pointing to the client computer?

      Reply
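
For reference, the client-side ui.properties edit being discussed in this exchange generally looks like the two lines below – a sketch, with 192.168.1.10 standing in for your NAS address and 4243 matching the listener shown in the netstat output above (4243 is also the default, so the servicePort line is often unnecessary):

    serviceHost=192.168.1.10
    servicePort=4243

Commenting the serviceHost line back out points the desktop client at its own local engine again.
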
  37. Babak

    Hello,
    CrashPlan says 21 files cannot be backed up. It does not specify exactly which files, and the log file is nowhere to be found on the Synology DiskStation. Does anyone know what I can do to find out the exact file names? I would greatly appreciate your help!

    Reply
  38. belnas

    Patience,
    the secret sauce was patience.

    1) installed everything with reboot between each step,
    2) connected with the client & adopt
    3) DIDN’T connect with the client again for 1 day!
    (every time I connected with the client, CP disconnected from backup engine)
    4) after one day, I connected with the client and everything was working smoothly.

    Thanks.
    I’m backing up again: http://i.imgur.com/i9WZqX9.png

    - setup -
    DS213j | DSM 5.0-4493 update 1
    community [patters] Java SE, 7u60
    USR_MAX_HEAP=1024M
    backing up 1.3TB (in 4 sets of 500GB, 400GB, 300GB, 100GB)

    Reply
  39. Joshua Abrams

    Hi Patters, thank you so much for this great package! I had no problem downloading and installing your package as well as Java, and all appear to be running. I also configured a remote client and that looks fine as well; however, I am getting an error: “unable to back up – no encryption key”. I had this working with no problem in the past – I recall it allowing me to enter the encryption key and all worked fine – but I am no longer able to get it to prompt me for the encryption password, so I can’t get it working again. I thank anyone reading this for their help. Joshua

    Reply
  40. tonysqrd

    Hi there. This is one of the best packages that I have running on my Synology DS1813+, and I have been running it with no issues. Sometimes I had to re-install it after a DSM update, and this blog helped me figure out what I needed to do. Since the latest DSM update, though, my Synology CPU has been running between 48% and 60%, and it has been like this for 4-5 days now. In Resource Monitor it is Java that is causing the CPU spike; when I stop the CrashPlan package the CPU usage drops. The CrashPlan site says that the backup is 99% done, and when I open the Windows client it says that it is “analysing”. I uninstalled everything and even downgraded the Java software. Same results.

    Anyone has a similar issue? Any suggestions?

    Reply
    1. Steve

      @tonysqrd, I am having the exact same issue on my DS414+, but I thought it was because of a recent update.

      I updated to the latest Java 8 tonight, so I will post back if that helps the CPU usage at all.

      Reply
      1. tonysqrd

        @steve, I did the same. I upgraded to latest Java but same results. My next step is to uninstall everything, reboot and install again… unless you found a better way.

      1. Steve

        This has fixed (so far) the “disconnected from backup engine” error for me

        1. Stopped and uninstalled CP from DSM
        2. Uninstalled Java 8 from DSM
        3. Reboot Synology
        4. Delete /@appstore/CrashPlan, (and anything else that said Crashplan)
        5. Installed Java 6
        6. Install CP, wait, start/stop, etc.
        7. Went through the adoption process on the client PC

        So far so good, we will see if it lasts. The “files” section was finally able to get all the way through my files without disconnecting so that is a good sign. Now I am waiting for the “synchronizing block information” to complete, which may be several hours.

      2. Bert

        Same for me, Steve – each time there is an update of DSM I have to perform the same procedure, except that I do not delete the folder /@appstore/Crashplan and I use Java 7.

      3. Steve

        Update: I had to edit the /volume1/@appstore/CrashPlan/syno_package.vars file to make it work. Un-comment the line and update the memory to 1024M. Just like the instructions above ;)

      4. tonysqrd

        Thanks for the update Steve. And how is your Synology CPU behaving so far?
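
The syno_package.vars edit Steve describes just above looks roughly like this – a sketch, assuming the package lives on volume1 as in the path he gives:

    # /volume1/@appstore/CrashPlan/syno_package.vars
    # un-comment the heap line and raise the value, e.g.:
    USR_MAX_HEAP=1024M

Stop and start the CrashPlan package afterwards so the engine picks up the new heap size.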

  41. Jesse Graves

    I have a Synology DS1513+ running DSM 5.0-4493 Update 1, patters’ Java 1.6.0_38-0026, and patters’ CrashPlan package 3.6.3-0027. Previously (running older versions of everything), when CrashPlan went to 3.6.3 it stopped working. It does this every two minutes:

    I 06/24/14 03:11PM CrashPlan started, version 3.6.3, GUID 621622929056596338
    I 06/24/14 03:11PM Backup scheduled to always run
    I 06/24/14 03:11PM [Default] Scanning for files to back up
    I 06/24/14 03:12PM [Default] Starting backup to CrashPlan Central: 348 files (1.20GB) to back up

    Looking at the log engine_output.log, nothing looks bad except this WARN:
    [06.24.14 15:11:50.346 WARN main com.backup42.common.config.ServiceConfig] ServiceConfig version higher than it should be. uniqueId=642382000797778291

    engine_error.log is blank.

    When I connect with a GUI client, it shows the backup starting, but within 10 to 30 minutes it loses its connection with the backend server.

    Trying to debug this, I upgraded to the versions I listed above. Now I’m stumped. Can anyone offer any suggestions or point me towards a more appropriate forum to ask this?

    Reply
    1. DougE

      Jesse,

      Did you figure this out? If not does anyone else have the same problem? I’m having the exact same problem that Jesse is having and I’ve followed the same steps.

      Thanks in advance all.

      Doug

      Reply
      1. Jesse Graves

        I have not figured this out. Someone recently posted this list of steps for a clean re-install.

        This has fixed (so far) the “disconnected from backup engine” error for me

        1. Stopped and uninstalled CP from DSM
        2. Uninstalled Java 8 from DSM
        3. Reboot Synology
        4. Delete /@appstore/CrashPlan, (and anything else that said Crashplan)
        5. Installed Java 6
        6. Install CP, wait, start/stop, etc.
        7. Went through the adoption process on the client PC

        I will eventually try it. Step 4 is perhaps the magic.

      2. Jesse Graves

        I think maybe I was unclear. The list of steps I just re-posted did not fix my problem. I just intend to try them. Step 4 is the only change from what I’ve already done in the past.

  42. Jeppe

    Need help figuring this one out.

    I migrated a lot of data using FXP between my old Qnap and my new Synology DS412+.

    Got Java 8 and CP installed, and connected to the headless client and set up the account.

    I didn’t choose to adopt, so I’ll have to reupload, but I can live with that.

    For the time being I’m only backing up three main folders (dokumenter/photo/web). I selected the folders in CP and it started scanning.

    However, it sees only one file in both “dokumenter” and “web”, and in “photo” it sees a lot more files and GB than my Synology reports. “Dokumenter” and “web” contain a lot of files.

    Any ideas what could be wrong?

    Crashplan: http://oi60.tinypic.com/1ghgrt.jpg
    Synology “dokumenter”: http://oi60.tinypic.com/16gx6cn.jpg
    Synology “web”: http://oi58.tinypic.com/ot455y.jpg
    Synology “photo”: http://oi59.tinypic.com/110ht1y.jpg

    BR
    Jeppe

    Reply
    1. RAJ

      Hi Jeppe,

      Not sure about the rest, but with regard to “photo” seeing a lot more files and GB than your Synology reports… it may be Synology’s @eaDir directories. The @eaDir directories contain extended attributes and thumbnails that take up quite a bit of space, not unlike Windows Thumbs.db files.

      On the CP Settings menu where you choose which directories to back up… there is an option for “Filename Exclusions:”. In that, I included “@eaDir”. This will force CP to ignore these files and folders. When I did that, the number of files and the amount of data dropped significantly.

      Hope that helps,

      RAJ

      Reply
      1. Jeppe

        Thx RAJ,

        Got it working when manually setting read permissions for the CP daemon user.

        Excluding @eaDir did the job and reduced the backup significantly :o)

        BR
        Jeppe

  43. Dino

    Hi all,
    can someone please help me by providing a summary of the changes needed in the config files etc.? I’ve reinstalled CrashPlan, the settings have reverted, and I don’t remember which areas I need to change.

    thanks

    Reply
  44. Tom

    Great tutorial, thanks. I am using CrashPlan on my Mac and figured out how to change the serviceHost setting (in /Users/<username>/Library/Application Support/CrashPlan/ui.properties) to point to my Synology. But now I can’t edit my Mac’s CrashPlan settings – I have to comment it in and out to switch between Mac and Syno. Is there any better way to achieve this other than installing the CrashPlan GUI on another system (where I might have the same problem, btw)? I just don’t have a free system with no CrashPlan…

    Thanks
    Tom

    Reply
    1. Mohammed

      I have installed CrashPlan on my Synology DS213+ and on my Mac laptop. However, when I go into the ui.properties file it does not contain the serviceHost line that I need to amend to point the program to the NAS device. Instead, it states:
      “#Tue Jul 15 23:05:18 GST 2014
      MessageDisplayed.UpgradeReminder=1405450641813
      window=74,87,800,610”

      Can anyone help me?

      Reply
      1. Bjorn

        Have a look and see if you can’t find another ui.properties file. I think there were at least 2 on my pc installation, in different locations.
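
On Tom’s question above about switching the Mac client between the local engine and the Syno, one rough approach is a pair of one-liners that rewrite the serviceHost line in place – a sketch, not an official CrashPlan mechanism, assuming a serviceHost line is already present in the Application Support file Tom quotes and that 192.168.1.10 is the NAS address:

    # point the Mac GUI at the NAS
    sed -i '' 's/^#*serviceHost=.*/serviceHost=192.168.1.10/' \
      "$HOME/Library/Application Support/CrashPlan/ui.properties"
    # comment it back out to return to the local Mac engine
    sed -i '' 's/^serviceHost=.*/#serviceHost=127.0.0.1/' \
      "$HOME/Library/Application Support/CrashPlan/ui.properties"

Restart the CrashPlan desktop app after each switch so it re-reads ui.properties.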

  45. Toon

    I would really like to have CrashPlan on my NAS (DS713+), but I have some questions.
    I’ve been using CP+ (single license) for a couple of years now (on my iMac), so I have a lot of data backed up by now. When I want to start using CP on my NAS, will I need to create a completely new backup set, or can I just append or merge it with the ones I already have?

    Reply
  46. Hagion Pneuma

    Just after I finished getting everything running, there are new updates for DSM (“Update 2”) and Java (60-0025). Can anybody tell me if I can install them safely, or will I have to uninstall and reinstall everything again?

    Reply
  47. David

    I have a couple of issues. Hoping someone can help.

    When I connect from a new machine it seems that I can no longer connect to the backup that is on the CrashPlan site. And if I enter my code for CrashPlan+ it wants to delete the backups that are on the CrashPlan site. Is there something I need to back up or migrate over when I move to a new machine, or when I have to reinstall CrashPlan from scratch after Java breaks?

    Secondly, I noticed that I can no longer get the x86 download from the CrashPlan site – it always gives me the x64 client, and I cannot connect from the x64 client. Luckily I still had a copy of the x86 installer, but this might be a problem for some other folks too.

    Reply
  48. Chad McDonald

    Patters, do you have any older versions of this package with the CrashPlan PROe client 3.5.3 available? If not, is there a way I can use your package but still install something other than the currently available release of the CrashPlan PROe client (3.6.3 as of this writing)?

    Reply
  49. tonysqrd

    Please help! I just got the DS1513+ and installed the latest DSM 5.0-4493 Update 2. I have CrashPlan v3.6.3-0027 and Java v1.8.0_0132-0026.

    I cannot connect to CrashPlan from the desktop. I followed almost every direction in the post but no luck. I installed it and uninstalled it and rebooted everything numerous times. I used SSH to connect to it and created the /tmp/@tmp folder with permissions for all, but it did not help.

    I have the DS1813+ as well, and that runs the same versions; other than a high CPU issue it is running fine. Any help will be highly appreciated!

    Reply
