Upgrading to 9.X

Use Latest Versions

Use Hosted Documentation

Use this page on our live documentation instead of the documentation on your local CloudBolt instance. Our online docs have the latest versions of the instructions and utilities required for an upgrade. Using the latest versions ensures that you will get the benefit of any updates we have made.

  • Download the latest version of cb_export.py and copy it to your existing CloudBolt 8.X instance. You can run this utility from anywhere on your filesystem. For example, if you put cb_export.py in the /var/tmp directory, then you would call python /var/tmp/cb_export.py [ARGS] to run it. See Export CloudBolt 8.X below for more details on cb_export.py.
  • Download the latest version of cb_import.py and copy it to your new CloudBolt 9.X instance. You can run this utility from anywhere on your filesystem. For example, if you put cb_import.py in the /var/tmp directory, then you would call python /var/tmp/cb_import.py [ARGS] to run it. See Import CloudBolt on 9.X below for more details on cb_import.py.

Install CloudBolt 9.X

You install CloudBolt 9.X in the same way that you install any CloudBolt instance; the only difference is that it runs on the CentOS 7.6 operating system instead of CentOS 6. See Installing CloudBolt.

Export CloudBolt 8.X

We have prepared a command line tool, cb_export.py, to archive and compress all of the standard CloudBolt files and directories into a single .tar.gz compressed archive. The utility is flexible enough to let users easily add files and directories to the export, or mark specific files or directories to be excluded from it.

On our largest internal testing servers, a full export completed in approximately 60 minutes.

cb_export Usage

The export utility will not run if either the Apache web server (httpd) or the job engine is active. This is to prevent an active CloudBolt from making changes to files and directories as they are being compressed, which could break the export.
  • service httpd stop will stop Apache
  • There are 3 potential ways to stop the job engine, depending on your version of CloudBolt
    • supervisor-based job engine: supervisorctl stop jobengine:*
    • celery-based job engine: supervisorctl stop celeryd:*
    • cron-based job engine: service crond stop
Note: service crond stop only stops cron from automatically restarting the job engine and does not kill running job engine processes. Running job engine processes will exit either within 60 seconds or once they complete all of their jobs, whichever is later.
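The pre-export shutdown described above can be scripted. A minimal sketch; run only the job engine line that matches your CloudBolt version:

```shell
# Stop Apache so no web requests modify files during the export
service httpd stop

# Stop the job engine -- uncomment the line for your CloudBolt version:
supervisorctl stop 'jobengine:*'    # supervisor-based job engine
# supervisorctl stop 'celeryd:*'    # celery-based job engine
# service crond stop                # cron-based job engine (see the note above)
```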
cb_export can be called as a Python script from anywhere on the filesystem once copied to your CloudBolt instance (e.g. python /tmp/cb_export.py -is), or it can be called as a Django management command on CloudBolt 9.x or greater (e.g. python manage.py cb_export -is). Either way, the utility runs as a Django management command behind the scenes because it needs access to various settings defined in Django.

You can run cb_export with the --help or -h option to print the help text, but the same information is also provided here in a different format.

The export utility will include the following directories:

  • VARDIR and /opt/cloudbolt, where VARDIR is defined in Django settings
  • MEDIA_ROOT and STATIC_TESTS_DIR, where each is defined in Django settings

The export utility will enable maintenance mode when it starts and disable it when the export completes, much like an upgrade.


  • Default Files
    • You can view what will be included in the default export with the -l or --list-defaults option. This will also list the sizes of those files and directories.
  • Dry Run
    • You can run this utility without making any changes to examine its output messages by running it with the -d or --dry-run option.
  • Job Logs
    • By default, the export will include the job logs directory defined by the JOBTHREAD_LOGPATH Django setting, which defaults to /var/log/cloudbolt/jobs/. This can be turned off with the -nj or --no-job-logs option. If your job log directory is too large, you might wish to handle transferring your job logs outside of this export process. The job logs need to be moved to the new CloudBolt instance so that links to view or download a job log from the job details page will still work for jobs that were run on the old CloudBolt instance.
  • MySQL
    • By default, the export will include a dump of the MySQL database this CloudBolt instance uses, produced with the mysqldump command. When the database dump is included, the exporter takes the database name, username, and password from the ‘default’ database defined in the DATABASES Django setting. If the password is not defined in those settings, you will be prompted for it while the utility runs.
    • The utility also includes the default MySQL configuration files, if they exist: /etc/my.cnf and /etc/mysql/my.cnf.
    • All the MySQL functionality (database dump and config files) can be excluded with the -nm or --no-mysql option.
  • Apache
    • By default, the utility includes the following configuration files for Apache: /etc/httpd/conf/httpd.conf, /etc/httpd/conf.d/ssl.conf, and /etc/httpd/conf.d/wsgi.conf.
    • These files can be excluded with the -na or --no-apache option.
  • CloudBolt Secrets
    • By default, the CloudBolt secrets directory (/var/opt/cloudbolt/secrets) is excluded from the export, but it can be included with the -is or --include-secrets option. The directory contains secret keys used to encrypt and decrypt sensitive fields stored on the CloudBolt database. Your new CloudBolt instance must have these keys to properly access sensitive fields included in the database dump. If you do not include the secrets directory in this export, then you must transfer it to your new CloudBolt in a different manner.
  • SSL Certificates
    • Users can include the -ic or --include-certs option to add certificates to the export. Certificates will be included from the following directories: /etc/pki/tls/certs and /etc/pki/tls/private.
    • Including certificates in an unencrypted .tar.gz file has security implications because secret information will be saved in a format that anyone can read, so be sure you want to include this information before using this option.
  • Extra Files and Directories
    • Users can specify their own files and directories to include as arguments after the options. The user-specified directories and files will be moved to the same paths on the destination machine by the cb_import utility.
    • For example, if you wanted to include certain pet-related data from a CloudBolt 8.X machine in the export, you could run python cb_export.py /tmp/dog_photos/ /var/opt/cat_stories/*.txt.
  • Exclude Files and Directories
    • Users can specify files and directories to exclude from the import (including default CloudBolt directories or their children) using the -ex or --exclude option. Users can include this option as many times as they want. Excluded directories that are within included directories will be temporarily moved to /var/tmp/cb_export_tmp while their parent directory is archived and compressed. Paths to exclude will always override default paths and paths to include.
    • For example, if you wanted to include all CloudBolt logs in an export except for the job engine log, you could run python manage.py cb_export /var/log/cloudbolt/ -ex /var/log/cloudbolt/jobengine.log.
  • Output Directory
    • By default, the final compressed archive will be saved to the /var/tmp/ directory. This can be overridden with the -o or --output option. The compressed archive will be named cbdumpv1_YYYYMMDD_HHMMSS.tgz, where the YYYYMMDD_HHMMSS is a timestamp.
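Putting the options above together, a typical export session might look like the following sketch; the paths and option combinations are illustrative, not required:

```shell
# See what the default export would include, with sizes
python /var/tmp/cb_export.py --list-defaults

# Rehearse the export without making changes
python /var/tmp/cb_export.py --dry-run --include-secrets

# Real export: include secrets, skip job logs, write the archive to /mnt/export/
python /var/tmp/cb_export.py --include-secrets --no-job-logs \
    --output /mnt/export/
```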

Import CloudBolt on 9.X

We have prepared a command line tool, cb_import.py, to take care of extracting and moving the various files and directories contained within the compressed archive created by cb_export.py.

cb_import Usage

You must verify values for the following Django settings before running this utility: DATABASES, VARDIR, MEDIA_ROOT, and STATIC_TESTS_DIR. The default values can be found in /opt/cloudbolt/settings.py and any overrides can be found in /var/opt/cloudbolt/proserv/customer_settings.py. This utility will use these settings to determine which database to connect to and where to extract certain directories.

cb_import can be called as a Python script from anywhere on the filesystem once copied to your CloudBolt instance (e.g. python /tmp/cb_import.py cbdump...tar.gz), or it can be called as a Django management command on CloudBolt 9.x or greater (e.g. python manage.py cb_import cbdump...tar.gz). Either way, the utility runs as a Django management command behind the scenes because it needs access to various settings defined in Django.

The import utility will extract the compressed archive specified in the command invocation to the system’s temporary directory, as determined by the Python standard library function tempfile.gettempdir. The system’s temporary directory can be changed from the default of /tmp by setting a new directory in the TMPDIR environment variable. As the import runs, it copies files and directories from the extracted archive to their destinations, deleting them as they are processed to conserve space. The extracted archive is deleted when the importer exits, whether it succeeds or fails.
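If /tmp is too small to hold the extracted archive, you can point the importer at a larger temporary directory before running it. A sketch, using an example archive name and an assumed mount point:

```shell
# TMPDIR is read by Python's tempfile.gettempdir, which the importer uses
export TMPDIR=/mnt/bigdisk/tmp
mkdir -p "$TMPDIR"

# Example archive name -- substitute your actual cbdumpv1_*.tgz file
python /var/tmp/cb_import.py /var/tmp/cbdumpv1_20240101_120000.tgz
```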

Space requirements are estimated cautiously to prevent a failure mid-import that leaves CloudBolt in a broken state. If the utility determines that your system lacks enough disk space to run the import safely, you can fix it with one or a combination of the following options:

  • Add more disk space.
  • Free unused space on the CentOS 7.6 machine. The /tmp/ and /var/tmp directories might be good options for cleaning.
  • Decrease the size of your export. Try excluding the job logs from the export with the -nj option or excluding the MySQL configuration files and dump with the -nm option. All of this data can be copied over after the import command completes without issue.
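Before starting the import, you can check free space on the relevant filesystems yourself. A quick sketch:

```shell
# Check free space where the archive is extracted (/tmp) and staged (/var/tmp);
# on your CloudBolt host, also check /opt/cloudbolt and your VARDIR.
df -h /tmp /var/tmp
```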
All files and directories written by this utility retain their owner, group, permission, and timestamp information. Existing files that match those in the export will be overwritten. Files within existing directories that the import utility writes to will not be deleted, only updated or created.

You can run cb_import with the --help or -h option to print the help text, but the same information is also provided here in a different format.


  • Dry Run
    • You can run this utility without making any permanent changes to examine its output messages by running it with the -d or --dry-run option. This will extract the compressed archive to a temporary directory to examine its contents, but it will be deleted once the utility finishes running (or if it encounters an exception).
  • Backups
    • By default, the utility will create backups of existing files and directories before potentially overwriting them. Files will be backed up in place by moving the existing file to <FILE_NAME>.bak. Directories will be backed up in place as compressed archives with the name <DIR_NAME>.bak.tar.gz. Backups can be turned off with the -nb or --no-backups option.
  • MySQL Load
    • If the extracted directory contains a MySQL database dump (the file is cb_mysql_dump.sql), then that will be used to load the database, overwriting any existing database that matches the default defined in the DATABASES Django setting. The database load can be skipped with the -nd or --no-database-load option. If the database load fails for any reason, cb_mysql_dump.sql will be copied to the /tmp/ directory so that the database load can be run manually with mysql < /tmp/cb_mysql_dump.sql.
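A cautious import session might rehearse first with a dry run and keep backups enabled (the default). The archive name below is illustrative:

```shell
# Inspect what the import would do without making permanent changes
python /var/tmp/cb_import.py --dry-run /var/tmp/cbdumpv1_20240101_120000.tgz

# Real import, with in-place backups of anything it overwrites
python /var/tmp/cb_import.py /var/tmp/cbdumpv1_20240101_120000.tgz

# If the database load failed, retry it manually from the copied dump:
# mysql < /tmp/cb_mysql_dump.sql
```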

Run Django Database Migrations

If your export included a database dump and load, you will need to run database migrations. This is because the database on your new CloudBolt 9.X instance will be in the exact same state as the database on your CloudBolt 8.X instance, but the CloudBolt 9.X instance will have code and schema changes that the CloudBolt 8.X instance lacks.

Fortunately, running Django database migrations is straightforward.

  1. Log in to your CloudBolt 9.X machine.
  2. Run the Django migration management command: python /opt/cloudbolt/manage.py migrate. /opt/cloudbolt/ is the default directory for CloudBolt, but your root CloudBolt directory could be somewhere else.
  3. Wait for the migrations to complete, then restart httpd: systemctl restart httpd.
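The steps above amount to the following, assuming the default /opt/cloudbolt/ directory:

```shell
python /opt/cloudbolt/manage.py migrate   # apply 9.X schema changes to the imported database
systemctl restart httpd                   # restart Apache once migrations finish
```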

Create New Database Objects

To pull in new functionality that is dependent on out-of-the-box database objects (such as new Orchestration Actions and Recurring Jobs), run the create_objects Python script, which is run behind the scenes as part of every CloudBolt upgrade.
  1. Log in to your CloudBolt 9.X machine.
  2. Run the create_objects.py script. By default, this script can be run with python /opt/cloudbolt/initialize/create_objects.py, but it might be in a different directory if you have overridden the default CloudBolt directory structure.
This is necessary because the import process completely overwrites the database of your new CloudBolt 9.X instance with data from your old 8.X instance, which wipes out any new functionality that depends on database objects. The create_objects script will not overwrite changes that you have made to existing out-of-the-box Actions or Plugins.
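As a sketch, assuming the default CloudBolt directory layout:

```shell
# Recreate out-of-the-box database objects (Orchestration Actions, Recurring
# Jobs, etc.) that the database overwrite removed; safe for customized objects
python /opt/cloudbolt/initialize/create_objects.py
```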

Collect Static Files

This step is only required if your CloudBolt instance has UI Extensions (XUIs) or customized HTML templates, but it will not damage your CloudBolt to perform it. By default, XUIs are in the /var/opt/cloudbolt/proserv/xui/ directory and customized HTML templates are in the /var/opt/cloudbolt/proserv/templates/ directory.

To collect static files for CloudBolt:
  1. Log in to your CloudBolt 9.X machine.
  2. Run python /opt/cloudbolt/manage.py collectstatic. If you want to review a dry run to confirm the changes this will make, you can use the -n or --dry-run option.
  3. Django will ask you to confirm that you want to rebuild your static files before making any changes to CloudBolt.
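The collection steps above can be run as follows, assuming the default /opt/cloudbolt/ directory:

```shell
# Preview which static files would be collected, without changing anything
python /opt/cloudbolt/manage.py collectstatic --dry-run

# Collect for real; Django asks for confirmation before rebuilding
python /opt/cloudbolt/manage.py collectstatic
```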

Change Upgrade Prefix

It is unlikely you will need to change your upgrade prefix. When your CloudBolt server checks for compatible upgraders, it can use a prefix to select a different version. If you are directed by a CloudBolt support engineer, you can configure your prefix by doing the following:
  1. Open your customer_settings.py file.
  2. Update the VERSION_INFO variable, a dict with the key ‘RELEASE_VERSION_PREFIX’, by adding the line: VERSION_INFO.update({'RELEASE_VERSION_PREFIX': 'the_new_prefix'})
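One way to append that line from the shell, assuming the default proserv path and that VERSION_INFO is already defined in the file (back the file up first):

```shell
CS=/var/opt/cloudbolt/proserv/customer_settings.py
cp "$CS" "$CS.bak"    # keep a backup before editing

cat >> "$CS" <<'EOF'
VERSION_INFO.update({'RELEASE_VERSION_PREFIX': 'the_new_prefix'})
EOF
```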