Leon's Weblog

October 19, 2009

Transferring Linux Server to a 1&1 Shared Host

Filed under: Personal,Software Dev — Leon @ 1:40 am

Recently, I was forced to relocate my Linux server, so I decided to try out 1&1’s Shared Web Hosting package. This option is a lot cheaper than paying colocation fees at a server farm and is a bit easier to maintain. The challenge was recreating, within 1&1’s restricted environment, the functionality I used to have on my own LAMP server. I’ll describe some of the challenges and solutions below. This is a follow-up to an earlier guide that I wrote on Configuring a 1&1 Shared Host.

Getting Started
The main features required to get started are:

  • Linux host with LAMP setup (Apache, MySQL, and PHP)
  • SSH access into home directory
  • IMAP mail (since I was giving up my mail server as well)

The Linux Hosting Business Package at $10 per month meets these requirements.

Setting up the home directory
I wanted to pack a lot of functionality into this account, so planning the home folder structure first was important. Since the entire home folder is web accessible, we have to carefully set the permissions on every file and folder that is not intended for the web.

The home directory has a protected logs folder where 1&1 stores each user’s Apache access logs. Similarly, we will need an ~/opt folder to store programs and a ~/tmp folder for temporary files we are working on. Since I was planning to configure and use the content of these folders under one account, I changed the folder permissions to be accessible to my account only using chmod 700 ~/tmp. We can also protect these folders from web access with the following .htaccess file:
 Order allow,deny
 Deny from all
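
Over SSH this amounts to a couple of commands, something like the following (the deny-all .htaccess shown above then goes into each of the protected folders):
 #create the working folders and restrict them to the account owner
 > mkdir ~/opt ~/tmp
 > chmod 700 ~/opt ~/tmp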

The files and folders containing our settings need to be protected as well. These files typically start with a dot so that they are not visible by default when listing folder contents. We need to prevent web users from downloading them by including the following lines in the home folder’s .htaccess file. Additionally, we will change the file permissions of these files to limit access to the owner only.
 #prevent web access to hidden (dot) files
 <FilesMatch "^\.">
  Order allow,deny
  Deny from all
 </FilesMatch>
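
The permission change itself is just a chmod on the dotfiles, for example:
 #limit a settings file to its owner only
 > chmod 600 ~/.bashrc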

Setting up SVN
I covered the topic of setting up a Subversion repository in a previous post, and I can confirm that this setup still works like a charm. To transfer my existing SVN repository for the site to the new server, I used the svnadmin dump and load commands as follows:
 #on old server
 > svnadmin dump /path/to/repo > reponame.dump
 #copy the dump file to the 1&1 server, create a new repository, and load the dump file
 > cd ~/opt/svn
 > svnadmin create reponame
 > svnadmin load reponame < reponame.dump

With this done, I checked out the latest revision of my site from the SVN repository into the home directory. Note that there will be a .svn directory in every versioned folder, which we probably don’t want visitors seeing over the web; the .htaccess filter defined above takes care of this.
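
The checkout itself can be run directly on the host over a file:// URL, along these lines (the repository name is a placeholder, and this assumes the web root is still empty):
 #check out the site working copy into the home directory
 > cd ~
 > svn checkout file://$HOME/opt/svn/reponame .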

Transferring MySQL databases
Now that we have our code base loaded on the server, we need to transfer the MySQL databases. I found that making a dump of the databases on the old server using the mysqldump command works best. This article goes over the process well. One important problem to note is that 1&1 does not let users choose their own database names, so we have to remove the reference to the original database name from the dump file. This can be done with any text editor; the reference only appears once, near the top of the dump file.
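
As a rough outline (the user names, database names, and 1&1 database host below are placeholders; 1&1 assigns its own database name, user, and host, which you can look up in the admin panel):
 #on the old server: dump the database, including the CREATE DATABASE/USE statements
 > mysqldump -u olduser -p --databases old_dbname > old_dbname.sql
 #edit old_dbname.sql to remove the CREATE DATABASE and USE lines near the top,
 #then load the dump into the database that 1&1 assigned
 > mysql -h db1234.1and1.com -u dbo123456789 -p db123456789 < old_dbname.sql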

Setting up AWstats
The log file analyzer on 1&1 leaves much to be desired, so I set up AWstats instead. There is an excellent tutorial available for setting up AWstats on 1&1; however, to get it working I had to install some missing Perl modules. In particular, the XWhois module is needed for whois lookups. The easiest way to install Perl modules is via CPAN and, after a little Googling, I found a guide for setting that up on 1&1 as well.
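
For reference, installing a module from the CPAN shell into the home directory looks roughly like this (Net::XWhois is the CPAN name for the XWhois module; the PREFIX path matches the ~/opt/perl layout used in the script below, and the exact configuration steps depend on your CPAN version):
 #start the CPAN shell and tell MakeMaker to install under the home directory
 > perl -MCPAN -e shell
 cpan> o conf makepl_arg "PREFIX=/kunden/homepages/xx/dxxxxxxxxx/htdocs/opt/perl"
 cpan> install Net::XWhois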

I prefer generating static report pages instead of parsing the statistics files dynamically each time the AWstats page is loaded (less load on the server), so I configured the following script to run nightly in cron.
 #!/bin/sh
 #load the environment variables explicitly (cron does not source the user's bash profile)
 export PATH=$PATH:/kunden/homepages/xx/dxxxxxxxxx/htdocs/opt/svn/bin
 export PERL5LIB=${PERL5LIB}:/kunden/homepages/xx/dxxxxxxxxx/htdocs/opt/perl/lib

 #output folder for the static pages, one per month (e.g. 200910)
 dt=`date -d yesterday +%Y%m`
 outdir=/kunden/homepages/xx/dxxxxxxxxx/htdocs/awstats/$dt/

 if ! test -d $outdir
 then
   mkdir $outdir
   chmod 755 $outdir
 fi

 #update the statistics and generate the static report pages for yesterday's month
 mm=`date -d yesterday +%m`
 yy=`date -d yesterday +%Y`
 /kunden/homepages/xx/dxxxxxxxxx/htdocs/opt/awstats/tools/awstats_buildstaticpages.pl -month=$mm -year=$yy -config=lbsharp -update \
   -awstatsprog=/kunden/homepages/xx/dxxxxxxxxx/htdocs/opt/awstats/wwwroot/cgi-bin/awstats.pl -dir=$outdir

Note that the script has to set the environment variables explicitly so that it has access to the Perl modules we installed earlier: cron jobs do not run in a login shell, so the user’s bash profile is never sourced.
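
For completeness, the nightly schedule itself is just a crontab entry pointing at the script above (the script name and log location here are made up; use whatever path you saved the script under):
 #build the AWstats pages every night at 4:05 AM and keep the output for troubleshooting
 5 4 * * * /kunden/homepages/xx/dxxxxxxxxx/htdocs/opt/awstats/buildstats.sh >> /kunden/homepages/xx/dxxxxxxxxx/htdocs/tmp/awstats-cron.log 2>&1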

Setting up Unison File Synchronizer
I’m a big fan of Unison and, as I described in a previous article, have used it religiously. In this case, I wanted to use Unison to synchronize and back up the content in the home directory that is not versioned in SVN. To set up Unison, download the latest compiled Linux binary from this site (the text-only version will do) and copy the file to the ~/opt folder. Make sure the file has execute permissions and that the path to the executable is on the PATH by updating the ~/.bashrc file as I described earlier. With this in place, you should be able to synchronize the content of your 1&1 home directory with any client over SSH.
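
From the client side, a synchronization run looks something like this (the local folder, SSH user, and remote data folder are placeholders for whatever you want to keep in sync):
 #synchronize a local folder with the unversioned content on the 1&1 host over SSH;
 #if the unison binary is not on the remote PATH, point at it with -servercmd
 > unison ~/backup/site ssh://uxxxxxxxx@example.com//kunden/homepages/xx/dxxxxxxxxx/htdocs/data \
     -servercmd /kunden/homepages/xx/dxxxxxxxxx/htdocs/opt/unison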

Not bad for a $10/month host…
