How To Backup Your Website Through SSH Command Line

February 21, 2012

How To Backup Your Website Through SSH Command Line


Backing up your website or blog can be an expensive and arduous task, requiring a variety of plugins or additional plans from your hosting provider, but it needn’t be. If you have SSH access to your web host (generally you would need at least a virtual private server for this), then it’s easy to back up, restore and migrate your entire website with only a few commands. Let me show you how.

What is SSH Command Line?

SSH gives you the ability to talk directly to your web server. It doesn’t give you a pretty interface or a nice GUI, just a straight-up powerful command line. This can be daunting to some people, but the sheer power, speed, and level of automation it provides can be an absolute life-saver, and it makes the process of migrating sites incredibly easy.

Most shared hosts unfortunately don’t allow SSH access to your account, at least not by default. If you’re hosting with GoDaddy, you can enable it though, so be sure to check first.

To log in via SSH, open up the Terminal in OS X (or get some free SSH software for Windows) and type in the following:

ssh username@yourdomain.com
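If your host runs SSH on a non-standard port, you can pass it to the same command with the -p option; the 2222 below is only an example port, not something your host necessarily uses:

ssh -p 2222 username@yourdomain.com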

You’ll be prompted for your password. If you’ve never used SSH before, you might be surprised that typing in your password doesn’t show anything on screen. Don’t worry, that’s for security.

Once logged in, you’ll be presented with a command prompt, similar to the following:

-bash-3.2$

This means everything is fine, so go ahead and continue with these commands.

Start by taking a look around and trying to navigate to your web directory. Type:

ls

To ‘list’ the current files and folders.

cd directoryname

to change to a directory. In this case, I’m going to navigate to the httpdocs directory, which is the root of my website (where all my WordPress files are stored). You can then ‘ls’ again, just to be sure.
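Putting those two commands together (httpdocs being the web root in my case, so substitute your own directory name):

cd httpdocs

ls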


At this point, we’re ready to begin the SSH backup process.

Backing Up the Database:

Since the majority of readers will be doing this with a WordPress install, you will almost certainly have a database to back up in addition to any files stored on the site. First off, you’ll need 3 bits of information to back up your database, all of which can be found within wp-config.php (if you’re running WordPress, that is):

  • Database name
  • Database user
  • Database password
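If you don’t fancy opening the file in an editor, a quick grep will pull those three values out (this assumes wp-config.php sits in the directory you’re currently in):

grep -E "DB_NAME|DB_USER|DB_PASSWORD" wp-config.php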

Then, issue this simple command, being sure to replace the username, database name, and backup filename where necessary:

mysqldump --add-drop-table -u username -p databasename > backupfilename.sql

Hit enter, and enter your password. Once it’s run, you can then issue another ‘ls’ command to check that the file has been output. Congratulations, this is all the information in your database as a single SQL file, ready to backup or import somewhere else.
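One optional tweak: date-stamping the filename means successive backups won’t overwrite each other. The $(date +%F) part simply expands to today’s date in YYYY-MM-DD form:

mysqldump --add-drop-table -u username -p databasename > backup-$(date +%F).sql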

Note: I’ve assumed that your database server is running on the same server that hosts your site. On a GoDaddy host, however, the MySQL database is actually stored on a separate server to which you don’t have SSH access. In cases like these, you will need to access phpMyAdmin via the hosting control panel, but that is outside the scope of this tutorial.

Backing Up Files:

Now that we have the database stored as a single file on the server, we can go ahead and bundle both it and your website files into a single backup archive. To do this, we are going to issue one simple command. You need only replace yourbackupfilename with whatever you want it to be called.

tar -vcf yourbackupfilename.tar .

Let me break that down. Tar is a common Linux archive format; on its own it bundles files together without compressing them (compression comes from an extra option, shown below). -vcf is simply a set of options that says “create a new archive, and tell me what you’re doing as you go”. Next is the name of the file we want to create, and finally the single period tells it to include everything in the current directory. We could have written * instead, but this would miss hidden files such as .htaccess, which is essential for WordPress.
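If you’d like the archive compressed as well, tar can pipe everything through gzip for you by adding the -z option; the resulting file conventionally gets a .tar.gz extension:

tar -vczf yourbackupfilename.tar.gz .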

That’s it. Once that’s run, you will have a single .tar file (or .tar.gz, if you opted for compression) containing every file on your site. You could log in via FTP at this point and download it, but let me show you one final step that allows you to restore all these files.

Restoring Everything:

Let’s say the worst has happened, and something has gone horribly wrong with your site. You’ve got a tar file of everything that you backed up last week, so now you’d like to restore the site from it. First off, log in via FTP and upload the backup file to your server. Perhaps you’ve been storing your backups in a special directory; either way, move the latest complete backup file into the root of your site, and we’ll begin.

Start by unpacking all the files, the reverse of what we did to back them up:

tar -vxf yourbackupfilename.tar

The crucial difference here is the -vxf switch, where the x tells tar to extract the files instead of creating a new archive. Also, there is no period at the end of the command this time.
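If you made a compressed .tar.gz archive earlier, just add the -z option when unpacking it too:

tar -vxzf yourbackupfilename.tar.gz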

The last step is to suck your database back in to where it was before. Make sure you have a blank database set up with the same name, user, and password as before, or you’ll need to change your site’s configuration settings too. To suck the data back in, issue this command:

mysql -u username -p databasename < databasebackupfilename.sql
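If the database itself no longer exists (rather than just being empty), you can recreate a blank one from the same command line before running that import; this assumes your MySQL user has the CREATE privilege:

mysql -u username -p -e "CREATE DATABASE databasename;"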

Next week: Automating Your Backups

That’s enough to get you started with SSH backups for now. Next week I’ll show you how to automate the task with a simple shell script and a CRON command. If you have some Amazon S3 storage space, I’ll even show you how to automatically upload your backup files to a storage bucket once they’re done.
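As a taste of what’s coming, a crontab entry along these lines would run a backup script every night at 2am; the path /home/username/backup.sh is purely hypothetical here, standing in for the script we’ll put together next week:

0 2 * * * /home/username/backup.sh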

One last tip – when I first began to use the command line, this one really impressed me – try pressing the tab key when you’re typing in a long filename; if the name is unique enough, it will attempt to autocomplete the rest of the filename!

 

SSH Commands / Tricks - Best 25

February 21, 2012

25 Best SSH Commands / Tricks

1) Copy ssh keys to user@host to enable password-less ssh logins.

ssh-copy-id user@host

To generate the keys use the command ssh-keygen
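If you haven’t generated a key pair yet, the basic invocation below creates one with an RSA key (shown here simply as a common choice); just press enter through the prompts to accept the defaults:

ssh-keygen -t rsa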

2) Start a tunnel from some machine’s port 80 to your local port 2001

ssh -N -L2001:localhost:80 somemachine

Now you can access the website by going to http://localhost:2001/

3) Output your microphone to a remote computer’s speaker

dd if=/dev/dsp | ssh -c arcfour -C username@host dd of=/dev/dsp

This will output the sound fr...


Continue reading...
 

Working with Permissions - 2

February 16, 2012

Permissions - Part II 

Working with Permissions

Now that you should have the basics of file permissions down, this section includes some "tips and tricks" that I use to ensure that permissions are properly set on servers.

Home Directories

Most admins overlook the setting of permissions on users’ home directories. I believe this is because most admins do not have a good understanding of Unix permissions (especially those coming from a Windows background). Because of this, most Linux servers are d...


Continue reading...
 

File & Directory Permissions on Linux

February 16, 2012

File & Directory Permissions on Linux

Special Bits

So far, you have been shown the basic manipulation of file and directory permissions. However, there are quite a few circumstances in which these will not be enough. For instance, what if you have a few people that have different "Default Groups", but are members of another group, and you need them all to have write access to certain files? Or what if you have some less knowledgeable users that may accidentally delete other people...


Continue reading...
 

20 things to plan for an IT Disaster Recovery

September 16, 2011

                         20 Things to Plan for an IT Disaster Recovery


Implementing a disaster recovery solution depends on three factors: 1) time, 2) resources, and 3) budget.

Most organizations don’t even think about DR while the IT infrastructure and applications are running without any issues. They only think about DR when something breaks and creates a major negative impact on the business.

If you are a sysadmin, or someone who is responsible for keeping the IT running,...


Continue reading...
 

HTOP

September 16, 2011

HTOP:
 
 htop is just like top, but on steroids.

Once you are used to htop, you’ll never go back to top again.

htop is a ncurses-based process viewer.

You can interact with htop using the mouse. You can scroll vertically to view the full process list, and scroll horizontally to view the full command line of each process.

This article explains 15 essential htop command examples.

Install Htop

The top command is available on all Linux systems by default.

To use htop, you need to install it first. Go to htop ...


Continue reading...
 

2 Easy Steps to Enable SSL/HTTPS on Tomcat Server

September 16, 2011

2 Easy Steps to Enable SSL / HTTPS on Tomcat Server


If you are running a Tomcat server that currently runs only on HTTP, follow the 2 easy steps below to configure Tomcat for SSL.

1. Create Keystore using Java keytool

First use the keytool to create a java keystore as shown below. Make sure to note down the password that you enter while creating the keystore.

# $JAVA_HOME/bin/keytool -genkey -alias tomcat -keyalg RSA
Enter keystore password:
Re-enter new password:
What is your first and last name?...

Continue reading...
 

SELinux Features

September 12, 2011

SELinux Features:
  • Restricts access by subjects (users and/or processes) to objects (files)
  • Provides Mandatory Access Controls (MACs)
  • MACs extend Discretionary Access Controls (DACs (Standard Linux Permissions))
  • Stores MAC permissions in extended attributes of file systems
  • SELinux provides a way to separate: users, processes (subjects), and objects, via labeling, and monitors/controls their interaction
  • SELinux is integrated into the Linux kernel
  • Implements sandboxes for subjects and objects
  • Default RH...

Continue reading...
 

Understanding TOP command output in Linux

September 12, 2011

Understanding TOP command output in Linux


 How do I determine CPU and Memory utilization, based on running processes in Linux using TOP?
The top command provides a real-time look at what is happening with your system. Top produces so much output that a new user may get overwhelmed by everything that’s presented and what it means.
Let’s take a look at TOP one line at a time.

The first line in top:
top - 22:09:08 up 14 min,  1 user,  load average: 0.21, 0.23, 0.30
“22:09:08” is the current time; “up...

Continue reading...
 

Comparison of Ext3 and Ext4 File Systems

September 12, 2011

Here are the quick facts and a comparison of the Ext3 and Ext4 file systems. Hope this helps!


Feature                      | Ext3                        | Ext4
Stands for                   | Third extended file system  | Fourth extended file system
Introduced                   | 2001                        | 2008
Kernel support               | From Linux kernel 2.4.15    | From Linux kernel 2.6.19
Maximum individual file size | 16 GB to 2 TB               | 16 GB to 16 TB
Maximu...

Continue reading...
 
