Backup Discourse to Google Cloud Storage Instead of S3


Discourse has built-in tools to move system backups to S3 automatically for off-site storage. If you want to use a different provider, such as Google Cloud Storage, you have to move the backups yourself.

Following is a script that you can run via cron to upload the most recent backup from discourse to google cloud storage and remove the one before that (if that’s what you want to do). You’ll have to update it for your own environment, although the backups should be in the same place on all standard installations of discourse.

#!/bin/bash
# This script will copy the latest discourse backup to google cloud
# Then it will remove the oldest backup from google cloud

LATESTBACKUP=$(ls -t1 /var/discourse/shared/standalone/backups/default/*.gz | head -n1)
PREVBACKUP=$(basename "$(ls -1tr /var/discourse/shared/standalone/backups/default/*.gz | head -n1)")

/home/user/gsutil/gsutil cp $LATESTBACKUP gs://your-backup-bucket
/home/user/gsutil/gsutil rm gs://your-backup-bucket/$PREVBACKUP

The gsutil utility can be installed by following Google’s gsutil installation instructions.
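
Before pointing the script at a real bucket, you can sanity-check the newest/oldest selection logic locally with throwaway files (everything here is a scratch file created by the sketch itself; no gsutil involved):

```shell
#!/bin/sh
# Create a scratch directory holding three fake backups with known timestamps
dir=$(mktemp -d)
touch -t 202401010100 "$dir/a.tar.gz"   # oldest
touch -t 202401020100 "$dir/b.tar.gz"
touch -t 202401030100 "$dir/c.tar.gz"   # newest

# Same selection logic as the backup script: newest to upload, oldest to remove
LATESTBACKUP=$(ls -t1 "$dir"/*.gz | head -n1)
PREVBACKUP=$(basename "$(ls -1tr "$dir"/*.gz | head -n1)")

echo "would upload: $(basename "$LATESTBACKUP")"
echo "would remove: $PREVBACKUP"
rm -rf "$dir"
```

This should report c.tar.gz as the upload candidate and a.tar.gz as the removal candidate.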

Setting Up Automated Backups for WordPress and Others to S3 with Serverpilot


ServerPilot is an automated server building solution, similar to Forge from the creators of Laravel.  These are great solutions for those of us who want to focus on building applications and programming rather than on server maintenance.  While the server setup is not difficult, particularly for one or a few sites, if you have many sites on your server and many development environments with multiple SSL certificates, etc., these tools can save a lot of time and headaches.

While this is written for ServerPilot, the majority of it is applicable regardless of what solution you use to manage your server, even if you set everything up yourself.  ServerPilot will also run on Vultr, DigitalOcean, or other VPS providers, so there might be some other tailoring as well. Additionally, while ServerPilot is often used with WordPress sites, this script will back up all databases and all files, so it will work for Drupal, ProcessWire, or anything else you have on your server.

While ServerPilot handles many aspects of managing your web sites, it doesn’t provide any backup services.  And while Vultr and DigitalOcean, for example, offer server-wide backups, these are impractical for many uses, particularly on servers with multiple sites for different clients.

On to the steps:

  1. ssh into your server
  2. su to root on your server and cd /tmp
    su - root
    cd /tmp
  3. Now download and set up the command line tools for Amazon Web Services (AWS)
    curl "" -o "awscli-bundle.zip"
    unzip awscli-bundle.zip ** note that on serverpilot administered servers unzip is not available by default so you will need to apt-get install unzip
    ./awscli-bundle/install -b ~/bin/aws
  4. Now you need to configure the CLI with your keys
    aws configure ** if you don’t know how to get your keys, read this.
  5. Now you should be able to run a command like aws s3 ls (a directory listing of your S3 buckets). If you cannot, you need to give the user associated with the keys you entered in the configuration the appropriate S3 permissions.
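
If step 5 fails, the fix is usually an IAM policy on the user whose keys you configured. A minimal sketch of such a policy (the bucket name is a placeholder you would replace):

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListAllMyBuckets"],
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:PutObject", "s3:GetObject"],
      "Resource": ["arn:aws:s3:::bucketname", "arn:aws:s3:::bucketname/*"]
    }
  ]
}
```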

Now we are going to set up the database backup script.  In ServerPilot you have a .my.cnf file by default which contains your root mysql password.  This means that commands like mysqldump do not require a password when run by this user.  You can create this file manually, or better, use mysql_config_editor (MySQL 5.6+) to create a mylogin.cnf, which stores the password in obfuscated form.

  1. cd ~
  2. touch backup.sh && chmod 700 backup.sh && nano backup.sh (the script name backup.sh is arbitrary)
  3. Enter the following into this file (modify as needed).  For the most part this would simply be changing the “ExcludeDatabases” line to contain any other databases you don’t want to back up.  If you only have one database you can drop all of that and just use the mysqldump command for that one database.  This script assumes you have multiple sites setup, otherwise you probably wouldn’t be using serverpilot anyhow…
    # Skip system databases and the SHOW DATABASES header row; adjust for your environment
    ExcludeDatabases="Database|information_schema|performance_schema|mysql"
    databases=`mysql -u mysqladminuser -e "SHOW DATABASES;" | tr -d "| " | egrep -v $ExcludeDatabases`
    for db in $databases; do
    echo "Dumping database: $db"
    mysqldump --add-drop-table -u mysqladminuser $db | bzip2 -c > /whereyouwantthem/backups/`date +%Y%m%d`.$db.sql.bz2
    done

Now we will open the script again, clean it up a bit, add our tar backups, and then copy everything off to S3.  The entire script is below:

  1. #!/bin/bash
    # Adjust these two variables for your environment
    ExcludeDatabases="Database|information_schema|performance_schema|mysql"
    BackUpDir=/whereyouwantthem/backups

    databases=`/usr/bin/mysql -u mysqladminuser -e "SHOW DATABASES;" | tr -d "| " | /bin/egrep -v $ExcludeDatabases`
    TodaysDate=`date +%Y%m%d`

    # Setup or clear backups directory
    if [ -d "$BackUpDir" ]; then
    rm $BackUpDir/*.bz2
    fi
    if [ ! -d "$BackUpDir" ]; then
    mkdir $BackUpDir
    fi

    # Dump all the databases into the backup directory
    for db in $databases; do
    echo "Dumping database: $db"
    /usr/bin/mysqldump --add-drop-table -u mysqladminuser $db | bzip2 -c > $BackUpDir/$TodaysDate.$db.sql.bz2
    done

    ExcludeDirectories="--exclude=/directory/to/exclude/*.jpg --exclude=/directory/to/exclude/logs"

    # This will copy all serverpilot user accounts and all of their files for a complete backup
    tar -cvpjf $BackUpDir/$TodaysDate.allsites.tar.bz2 $ExcludeDirectories /srv/users/

    # Copy everything in the directory to S3 for offsite backup
    # Occasionally you should delete some of these on aws to save space
    /usr/local/bin/aws s3 cp $BackUpDir/ s3://bucketname/$TodaysDate --recursive --include "*.bz2" --region your-region-1

  2. Then add this to the root (or admin user) crontab, crontab -e, to run, perhaps weekly on Sundays at 2am – 0 2 * * sun /root/backup.sh (assuming you saved the script as /root/backup.sh)

That’s about it. Obviously you should tailor this for your own needs and also clean up your aws buckets from time to time so that you are not paying for storage you don’t need.  I hope this is helpful for others.
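
That cleanup can itself be scripted. Below is a hedged sketch: the is_expired helper runs as-is (GNU date assumed), while the aws loop is left commented because it depends on your bucket name and on the YYYYMMDD prefix naming used by the script above:

```shell
#!/bin/sh
# Anything older than roughly 30 days is a candidate for deletion
CUTOFF=$(date -d "30 days ago" +%Y%m%d)

# Succeeds (exit 0) when a YYYYMMDD-dated prefix is older than the cutoff
is_expired() { [ "$1" -lt "$CUTOFF" ] 2>/dev/null; }

# Hypothetical cleanup loop -- uncomment and set your bucket name to use it:
# aws s3 ls s3://bucketname/ | awk '{print $2}' | tr -d / | while read p; do
#   is_expired "$p" && aws s3 rm "s3://bucketname/$p" --recursive
# done
```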

Processwire and CKEditor – Stop Removing Divs and Id Attributes


CKEditor is the default editor in the ProcessWire CMS. Generally speaking it is a well-functioning editor, but it does have some advanced features which can lead to frustration.

One of these is the advanced content filter. Its settings let you allow certain elements and attributes that would otherwise be removed when your document is saved.

One very commonly used attribute is the id attribute – i.e.

<a id="linktoheading">

By default CKEditor, as it is set up in ProcessWire, will strip this out. To change this you need to edit the field for which you would like this attribute left in, click on the “Input” tab, and then scroll down to “Extra Allowed Content” and enter the following:

a *[id]

This allows the id attribute (the “*” makes it optional).

Another thing that is commonly stripped out is classes on div tags. To allow, for example,

<div class="overflow">

you could enter the following in the same box as above:

div (overflow)
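
If you want both of the examples from this post on the same field, the “Extra Allowed Content” box accepts multiple rules; in CKEditor’s allowed-content notation they can be separated with semicolons:

```
a *[id]; div (overflow)
```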

Hope this saves you hours of searching around!

Update Links in WordPress with PHPMyAdmin


Oftentimes during site migrations you may need to make bulk updates to links or other content across many pages at once.  phpMyAdmin is a great way to accomplish this.

A general snippet would be:

UPDATE wp_posts SET post_content = REPLACE(post_content, 'whatyouhave', 'whatyouwant')

For example,
UPDATE wp_posts SET post_content = REPLACE(post_content, 'http://', 'https://')

Would update all http links to https links


UPDATE wp_posts SET post_content = REPLACE(post_content, '/subdomain/sub2/', '/newdomain/new2/')

Would update URLs from one path to another.
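
Because REPLACE rewrites every matching row in one shot, it is worth previewing the affected rows with a SELECT first. A sketch, assuming the default wp_ table prefix and the path example above:

```
SELECT ID, post_title
FROM wp_posts
WHERE post_content LIKE '%/subdomain/sub2/%'
```

Once the list looks right, run the UPDATE with the same search string. Note this approach only suits plain text like post content; URLs stored in serialized option or meta values should not be edited with a simple REPLACE, as that can corrupt the serialized data.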

WordPress Upload and Import Images – Woocommerce Migration


During a recent migration from a hosted ecommerce solution to WooCommerce we had to move a large number of product images.  With a combination of the WooCommerce CSV export and CSV import plugins, moving large numbers of products is quite easy.  That’s for the data.

For the images, you cannot just upload files into the WordPress media library via SFTP and have them immediately recognized, so you need to do a few things first.

  1. Using your sftp program – upload your image files to /wp-content/uploads/subdirectory
  2. Install and activate the “Add From Server” plugin
  3. Go to Media -> Add From Server and navigate to the “subdirectory” that has your uploaded images
  4. Then select all the images and click “import”

Then all of your images will be uploaded to the “uploads/subdirectory” directory.

That’s not all, however, if you are using these images in WooCommerce products.  You then need to regenerate the thumbnails, so…

  1. Install and activate the “Regenerate Thumbnails” plugin
  2. Go to Tools -> Regenerate Thumbnails and click to regenerate the thumbnails.

At this point all your images are ready to use.
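
If WP-CLI is installed on the server, the import and thumbnail regeneration can also be done from the shell without the two plugins (the upload path is a placeholder):

```
wp media import /path/to/wp-content/uploads/subdirectory/*.jpg
wp media regenerate --yes
```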

Drupal to Processwire Notes and Snippets


These notes are for a project converting a relatively complex Drupal 6 site to ProcessWire.  This particular site had a number of relatively complex interrelationships between the data that had to be accounted for. It had also come through multiple versions of Drupal over the years and had some cleaning up to do.

Because of the amount of data that needed to be cleaned, working through CSV files was more useful than a straight database-to-database transfer.  More will be added to this later.

SQL Snippets

  • Creating current drupal taxonomy links on pages to their processwire counterparts (some other work comes before this with views exports, CSV imports and then creating a temporary file holding the relationships).  The Google Sheets commands =SPLIT(A1,",") and =TRANSPOSE(A1:Z1) are helpful for converting the views CSV export to rows.
  • UPDATE conditions_patterns_temp INNER JOIN field_title ON conditions_patterns_temp.Conditions = SET conditions_patterns_temp.cond_id = field_title.pages_id
  • Updating urls from a temp table in phpmyadmin
  • UPDATE pages INNER JOIN field_title ON = field_title.pages_id INNER JOIN temp_iherb_functions ON = SET = temp_iherb_functions.url
  • Build a table of page ids to create relationships later
  • UPDATE temp_fherb_ingredients INNER JOIN field_fherb_chinese_name ON temp_fherb_ingredients.formulaname = SET temp_fherb_ingredients.formulaid = field_fherb_chinese_name.pages_id

Excel Snippets

  • Since this is an old drupal site (pre D7 fields) there are many times where you have to split a body field into what will become multiple fields.  In the CSV via google sheets the following may be useful:
  • A1 part1:part2
  • B1  =RIGHT(A1,LEN(A1)-FIND(":",A1)) = part2
  • C1  =SUBSTITUTE(A1,B1,"") = part1:
  • A1 part1>part2
  • B1  =LEFT(A1,FIND(">",A1)-1) = part1
  • C1  =SUBSTITUTE(A1,B1,"") = >part2

PHP Snippets

  • This is a small script that steps through the mapping table and sets the sort-order column so that you can create fields with multiple relationships in ProcessWire:

  • $justusedid = 0;
    $query = $database->prepare("SELECT formulaid, indherbid, sortorder FROM temp_fherb_ingredients ORDER BY formulaid");
    $query->execute();
    $result = $query->fetchAll();
    if (count($result)) {
    foreach ($result as $row) {
    $formulaid = $row['formulaid'];
    $indherbid = $row['indherbid'];
    // Restart the counter each time we reach a new formula
    if ($justusedid != $formulaid) { $cnumber = 0; } else { $cnumber++; }
    $justusedid = $formulaid;
    $update = $database->prepare("UPDATE temp_fherb_ingredients SET sortorder = $cnumber WHERE formulaid = $formulaid AND indherbid = $indherbid");
    $update->execute();
    }
    }

Using CSV Files to Update Database Tables with phpMyAdmin


Whether you are moving between different CMS systems or making mass changes for more general reasons, it is often useful to import data with CSV files.  Often this is because it is easier to manipulate the data in a spreadsheet and then create mass updates.

phpMyAdmin doesn’t allow you to update data from a CSV file directly.  You can import the data into an empty table, but you cannot update existing records.  To get around this, you can follow these steps:

  1. Export the data that you need from the current table – in part so you can see how it is structured.
  2. Then edit this file in a spreadsheet program.  After your changes are finished, export the file as CSV.  In this example we are updating references to page urls in a table “pages” and our data file simply has an id and a url instead of all the data.
  3. In phpmyadmin create a new temporary table
    CREATE TABLE pages_url_update LIKE pages (which would create a replica -- or just create the columns you need -- in this example an id and a url)
  4. Open that table in phpMyAdmin and then choose “import”
    — Select the csv to upload
    — And choose CSV Using LOAD DATA under Format
  5. Then select the true data table and click on the sql button
  6. Then do
    UPDATE pages
    INNER JOIN pages_url_update ON pages.id = pages_url_update.id
    SET pages.url = pages_url_update.url
  7. Then drop the temporary table – DROP TABLE pages_url_update
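
For reference, the “CSV Using LOAD DATA” import in step 4 is equivalent to running a statement like the following yourself (the file path and CSV dialect are assumptions; match them to how you exported the file):

```
LOAD DATA LOCAL INFILE '/path/to/pages_urls.csv'
INTO TABLE pages_url_update
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, url)
```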

Using Octopress For Blogging


Static site generators can be very useful for sites that do not need a large CMS. The benefits are largely speed, security and relative ease once you get everything setup. The downsides are a loss of some of the dynamic connections with your content, but these are minimized by using the right tools.

Octopress is one of many static site generators. The documentation on their site is thorough, and most people with a background in command line skills can get a blog set up within a couple of hours.

Below are some very basic commands to serve as a reminder once you get going:

  • Add a post: rake new_post["title"]
  • Add a page: rake new_page["page-name"] (or ["folder/folder/page.html"], for example)
  • Deploy your blog:
    • rake generate
    • rake deploy
    • git add .
    • git commit -m 'commit message'
    • git push origin source

Postfix Relay Setup With Drupal and Basic Postfix Management


This article covers some basic steps to install the postfix mail server as a relay on Ubuntu, get Drupal to work properly in all mail environments, plus some general postfix management commands. These instructions may have to be adapted slightly to your environment and linux distribution, but they are generally applicable.

For more help in getting an initial server setup, please read Ubuntu Setup on a Linode VPS, and our PHP, MySQL, Apache Setup on Ubuntu articles.

  1. Postfix Installation
  2. Postfix Configuration Notes (As A Relay)
  3. Relaying Postfix Mail From Root – Localhost
  4. Drupal Email Configuration for Restrictive SMTP Servers
  5. Postfix Queue Management Commands

1. Postfix Installation

Below are the basic installation steps for postfix on an Ubuntu server (many have it already installed). You can also use the drupal smtp module for this function, but for high volume mailing and/or better control a local relay, such as postfix, is often helpful.

sudo apt-get install postfix
If postfix is already installed you can reconfigure it by issuing the following command:
sudo dpkg-reconfigure postfix
Then enter the following for our purposes of setting up a relay:

Choose Satellite Configuration
System Mail Name: your maildomain name
SMTP relay (we often use authsmtp, but whatever your ISP smtp server is would go here)
Root and Postmaster mail recipient:
Other destinations to accept mail for: localhost.localdomain,localhost (accept defaults)
Force synchronous updates on mail queue: No, unless you know what you are doing
Local Networks: defaults
Procmail for local delivery: Yes
Mailbox size limit: 0
Local address extension character: +
Internet Protocol: All

2. Postfix Configuration

If you ran the reconfigure command above you are most of the way there, but you should edit your config file anyhow and look over the following. It should look like this:

sudo nano /etc/postfix/

myhostname =
mydomain =
myorigin = localhost
smtpd_banner = $myhostname ESMTP $mail_name
biff = no
append_dot_mydomain = no
alias_maps = hash:/etc/aliases
alias_database = hash:/etc/aliases
mydestination = localdomain, localhost, localhost.localdomain, localhost
mynetworks =
mailbox_size_limit = 0
recipient_delimiter = +
# SECURITY NOTE: Listening on all interfaces. Make sure your firewall is
# configured correctly
inet_interfaces = loopback-only
relayhost = ## whatever yours is
smtp_connection_cache_destinations =
smtp_sasl_auth_enable = yes
smtp_sasl_password_maps = static:username:password
smtp_sasl_security_options = noanonymous
default_destination_concurrency_limit = 4 ## to stay within your ISP’s policy
soft_bounce = yes
inet_protocols = all

Then make sure your aliases are ok.  Edit your /etc/aliases file (sudo nano /etc/aliases) and fill it out for your needs:

# See man 5 aliases for format
postmaster:    root

After this you can run:

sudo newaliases

And finally restart postfix

sudo /etc/init.d/postfix restart

3. Relaying Postfix Mail From Root – Localhost

As we have set up postfix above to relay all mail to an external smtp server, you will lose mail from root – or root@localhost – accounts.  To remedy this you can set up an alias via the following steps:

# cd /etc/postfix
# touch canonical
# echo "sender_canonical_maps = hash:/etc/postfix/canonical" >> /etc/postfix/
# echo "root" > /etc/postfix/canonical
# echo "root@localhost >> /etc/postfix/canonical
# postmap /etc/postfix/canonical
# /etc/init.d/postfix restart
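
For reference, the finished /etc/postfix/canonical file is a simple two-column lookup table mapping local senders to a deliverable address; with a hypothetical destination address it would look like:

```
root            admin@example.com
root@localhost  admin@example.com
```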

To Test:
# echo TEST | mail -s TEST

4. Drupal Email Configuration for Restrictive SMTP Servers

By default, drupal doesn’t work well with ISPs that require appropriately formed From and Reply-To headers. There are issues in the queue to have this work properly in version 7, but for 5 and 6 you will need to edit the core contact module (sites/modules/contact.module).

Below is a copy of the patch file for version 5 of drupal, but it is very close to 6. It is best to edit the file yourself rather than try to submit this as a patch as we have other edits in the file and the line numbers may not be correct for you.

+++ contact.module 2009-01-29 20:48:45.000000000 -0500
@@ -418,10 +418,17 @@ function contact_mail_user_submit($form_
// Prepare all fields:
$to = $account->mail;
if ($user->uid) {
-    $from = $user->mail;
+//    $from = $user->mail;
+// Changed To Set Proper From and Reply-To And Add New Variable For User Copy
+    $usercopy = $user->mail;
+    $from = variable_get('site_mail', '');
+    $headers = array('Reply-To' => $user->mail);
}
else {
-    $from = $form_values['mail'];
+//    $from = $form_values['mail'];
+// Changed To Set Proper From and Reply-To
+    $from = variable_get('site_mail', '');
+    $headers = array('Reply-To' => $form_values['mail']);

// Format the subject:
@@ -431,11 +438,14 @@ function contact_mail_user_submit($form_
$body = implode("\n\n", $message);

// Send the e-mail:
-  drupal_mail('contact-user-mail', $to, $subject, $body, $from);

+//  drupal_mail('contact-user-mail', $to, $subject, $body, $from);
+//  Changed To Add Headers
+  drupal_mail('contact-user-mail', $to, $subject, $body, $from, $headers);
// Send a copy if requested:
if ($form_values['copy']) {
-    drupal_mail('contact-user-copy', $from, $subject, $body, $from);
+//    drupal_mail('contact-user-copy', $from, $subject, $body, $from);
+//  Changed To Add Headers
+    drupal_mail('contact-page-copy', $from, $subject, $body, $from, $headers);

5. Postfix Queue Management Commands

Now that you have everything working correctly, here are some useful postfix commands

# mailq  (show items in the queue – when you are relaying this is generally failed items)
# postcat -q queueid (read the contents of a message in the queue)
# postqueue -f (try to deliver all items in the queue immediately)
# postsuper -d ALL (remove all items in the queue)

Helpful SQL Queries Commands For Drupal


Simplifying mass updates and changes to websites is one of the main reasons to use a database-driven content management system like Drupal.  While Drupal has a variety of helpful tools for automating tasks, there are many times when it is easier, and some when it is your only option, to make changes to the database directly.  Below is a list of MySQL commands that we use occasionally to make changes and updates to our sites.  They are here largely for reference; please do not use them without knowing what they mean and without a backup of your database.

  • General Search and Replace Function
  • UPDATE tablename SET tablefield = replace(tablefield,"findstring","replacestring") WHERE type = "x"
  • Example:  Create mass changes in drupal node body fields

    UPDATE `node_revisions` SET `body` = replace(body,'texttofind','texttoreplacewith')


  • Create a spreadsheet of your drupal users usernames, emails and uid (we use this at times to create email newsletter subscription lists as an example)
  • This should be done in phpmyadmin so you can easily export the results in a CSV file for import, say into Mailchimp…
  • Example:  Select the desired user information from users with a specific role
  • SELECT users.name, users.uid, users.mail FROM (users LEFT JOIN users_roles ON users.uid = users_roles.uid) WHERE users_roles.rid = "3" ORDER BY users.uid