 

A Pipe and a Keyboard

A sort of Linux scrapbook


Category Archives: Tech stuff


Shell scripting and LFTP

Posted on May 11, 2013 by Richard

I have been introducing myself to Linux shell scripting.

I maintain a fair number of WordPress installations, and part of that maintenance is the task of backing them up.  The actual backups are easy, as they are done by simple scripts on the servers which are called by cron jobs.  Each server has three jobs – one to back up the database (nightly), one to back up critical files such as uploaded files and configuration files (weekly), and one to back up the entire site (monthly).
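
A rough sketch of what those crontab entries could look like on one of the servers (the script names and times here are placeholders, not the actual ones used) –

# hypothetical server-side crontab
# nightly database dump at 02:00
0 2 * * * /usr/local/bin/backup-db.sh
# weekly backup of critical files, Sundays at 03:00
0 3 * * 0 /usr/local/bin/backup-critical.sh
# monthly full-site backup, first of the month at 04:00
0 4 1 * * /usr/local/bin/backup-full.sh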

While backup files on the server are all very well, they are not much use if the server collapses, so I need to transfer those backups, and this is where the need for shell scripting came in.

To retrieve the files, I initially had to fire up my FTP programme and then log into each server in turn, download the files and then delete redundant files from the server.  That proved to be an extremely tedious job and of course it had to be done every day.

My first and obvious idea was to create a simple shell script to log onto a server and FTP a file down onto the local machine.  There were two initial problems – I would have to have a script for each individual server, and there was the problem of specifying the file I wanted to download.  For various reasons I wanted to be able to specify precisely which file I wanted to download (and the actual names varied from server to server), so this involved wildcards.

My initial solution was to create an array of variables – one set for each server containing the server address, the username, the password and where the file was to be stored locally, and some variables that were specific to the day, as each backup file was time stamped.

MYDATE=$(date +"%d.%m.%y")
REMOTEFILE="*$MYDATE.*"

SERVER[0]="domain1.com"
USER[0]="user1"
PASSW[0]="password1"
LOCAL[0]="/home/sites/BlogBackups/Site1"

SERVER[1]="domain2.com"
USER[1]="user2"
PASSW[1]="password2"
LOCAL[1]="/home/sites/BlogBackups/Site2"

I then created a simple loop which cycled through the array, connecting to each server in turn, downloading the files and then deleting the files on the server.

count=0
# cycle through the array, one pass per server
while [[ $count -lt ${#SERVER[@]} ]]
do
# drive a non-interactive ftp session via a here-document
ftp -v -i -n ${SERVER[$count]} <<END_OF_SESSION
user ${USER[$count]} ${PASSW[$count]}
cd webspace/www/backups
lcd ${LOCAL[$count]}
mget $REMOTEFILE
mdelete $REMOTEFILE
bye
END_OF_SESSION
(( count++ ))
done

I was quite pleased with this effort, as it ran perfectly and did precisely what I intended.

Until the end of the month, that is.

On the first of the month, I ran my script and it locked up.

A little investigation found the cause – the script was downloading files greater than 200 MB, but on completing the download it simply stopped, for no apparent reason.  There were no error messages, no timeouts, nothing.  I searched the Web but couldn't find any mention of a possible cure.  In the end I posted a query on a couple of forums, and while they didn't provide an answer they did suggest something else – that I try something other than FTP.

Unfortunately, for various reasons none of the suggestions worked, so I started my own search and found LFTP.

LFTP is essentially FTP with a load more bells and whistles.  It took a while to sort it out as my only reference was the Web.

The basic command line to retrieve a file from the server is

lftp -u username,password ServerURL -e "get /webspace/www/backups/remotefile"

I modified this to use mget rather than get, as that allows for wildcards (a rough sketch of the per-server call is below).  I then used a loop to cycle through my array, and that's where the fun began.
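
The sketch, using the same placeholder names as before –

# mget accepts wildcards, so $REMOTEFILE can be a pattern such as *11.05.13.*
lftp -u username,password ServerURL -e "mget /webspace/www/backups/$REMOTEFILE; exit"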

First of all it started throwing up warnings left right and centre about bad certificates.

WARNING: Certificate verification: Not trusted
WARNING: Certificate verification: Expired
WARNING: Certificate verification: certificate common name doesn't match

This was easily cured by adding a line at the bottom of /etc/lftp.conf –

set ssl:verify-certificate no

My next problem was that it would access the first server all right but would refuse to connect to any other, giving an error that there was “No Route to Host”.  It was logging in all right on Port 21, but was then trying random ports to do the actual transfer and kept cycling through them.

I cured the long wait between cycles by adding three more lines to /etc/lftp.conf –

set net:timeout 10
set net:reconnect-interval-base 5
set net:reconnect-interval-multiplier 1

While that sped things up a bit, it didn’t solve the problem!  I tried forcing Active mode over Passive but that had no effect.

The solution in the end was to add yet another line to /etc/lftp.conf –

set ftp:ssl-allow false
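
For reference, the full set of additions to /etc/lftp.conf ended up as –

set ssl:verify-certificate no
set net:timeout 10
set net:reconnect-interval-base 5
set net:reconnect-interval-multiplier 1
set ftp:ssl-allow false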

I ran the script, deliberately including some large files, and it worked.

My final script –

count=0
while [[ $count -lt ${#SERVER[@]} ]]
do
lftp -u ${USER[$count]},${PASSW[$count]} ${SERVER[$count]} -e "mget -E /webspace/httpdocs/backups/$REMOTEFILE -O ${LOCAL[$count]}; exit"
(( count++ ))
done

I added in a couple of extra features – the -E flag deletes the remote file, but only on a successful download, and the -O flag sets the target location for the file.  I ran it against the full server load (having previously uploaded some of the large files) and it ran perfectly.

LFTP is an immensely powerful tool, providing such commands as mirror, which can be used to synchronise a local folder with a remote one.  I can see myself using it a lot more!
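
As a minimal sketch (the credentials and paths here are placeholders), pulling a remote folder down into a local one looks something like this –

# copy everything under the remote backups folder into the local folder
lftp -u username,password ServerURL -e "mirror --verbose /webspace/www/backups /home/sites/BlogBackups; exit"

Adding the -R flag reverses the direction and pushes the local folder up to the server instead.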

Posted in Linux, Tech stuff | Tagged Linux | Leave a reply

Thunderbird opening multiple windows

Posted on April 13, 2013 by Richard

I have been having a small but irritating problem with Thunderbird.

I use the Nightly (Firebird) version which is currently V21, but I believe this problem is not new.

Whenever I opened Thunderbird, it insisted on opening several copies of itself.  It started off with three windows, but over time this grew to seven, which meant a tiresome session of closing windows every time.

I searched the Internet and found that this is not a new problem.  The answer generally given is to close the extra windows using Alt+F4, but that didn't work for me.  It closed the windows all right, but the next time I ran Thunderbird the problem was still there.

The solution is simple.

Delete the file “session.json”.

For the uninitiated, open your file browser and ensure that “View –> Show hidden files” is selected.

Browse to ~/.thunderbird (note – dot thunderbird!) and you will find "session.json" in your default profile folder.  If for any reason you have several profile folders, the active one is listed in the file "profiles.ini".
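
From a terminal, a one-liner along these lines should do the same job (this assumes a single profile folder ending in .default – check profiles.ini if yours is named differently) –

# remove the saved session state; Thunderbird rebuilds it on the next start
rm ~/.thunderbird/*.default/session.json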

Well, it worked for me…

Posted in Linux, Tech stuff | 3 Replies

How to cancel a print job

Posted on March 21, 2013 by Richard

Last night I wanted to print off a couple of pages.

I’m using Linux Mint 14 (Nadia) and a wireless HP Officejet J4680.

For some reason, the first page got stuck in the queue and refused to print.  What was worse, I couldn't cancel it.  I tried all the usual rebooting but to no avail.  The Preferences –> Printers dialogue showed the queue, but the Stop button had no effect.

The solution, as always, is simple.

Open http://localhost:631/jobs/ in your browser.

That’ll do the trick.
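
For anyone who prefers the terminal, the CUPS command-line tools can do the same job –

# list the jobs currently in the queue
lpstat -o
# cancel every job on every printer
cancel -a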

Posted in Linux, Tech stuff | Tagged Linux | 3 Replies

Backing up WordPress

Posted on March 10, 2013 by Richard

One of the more important jobs in running a WordPress site (or any site for that matter) is backing up.

There are two parts to this – the files and the database.

Not all files need to be backed up, as in the event of a catastrophic failure the core WordPress files can be restored from a standard installation.  However, the contents of the wp-content folder are critical, as this is where the images, plugins etc. are kept.  Granted, plugins can also be restored from files available on the Web, but restoring them from a backup can save a lot of time and effort.

I experimented with several plugins to do my backing up, but had to abandon them all as they were unreliable, unworkable or very resource hungry.  I had to find an alternative method, as the idea of connecting to several sites and doing an FTP download plus a phpMyAdmin dump each and every day was a non-runner.

I decided that Cron was my solution.

I set up a couple of small scripts – one to compress the contents of wp-content and the other to do a mysqldump.

Files –

/bin/tar -czf /usr/local/hosts/webspace/httpdocs/backups/backup-`date +%d.%m.%y`.tar.gz /usr/local/hosts/webspace/httpdocs/www/wp-content

Database –

/usr/bin/mysqldump --host=dbserver --user=dbusername --password=dbpassword databasename --quick > /usr/local/hosts/webspace/httpdocs/backups/backupdb-`date +%d.%m.%y`.sql

(need I point out that the paths, names and passwords are not the actual ones used?!)

I saved both scripts on the server and then set up a cron job to run each one.
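
The crontab entries for those two scripts would have looked something along these lines (the times and script names are illustrative, not the real ones) –

# nightly database dump at 01:30
30 1 * * * /bin/sh /usr/local/hosts/webspace/backup-db.sh
# weekly wp-content archive, Sundays at 02:30
30 2 * * 0 /bin/sh /usr/local/hosts/webspace/backup-files.sh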

This worked perfectly on two of the sites I was testing on, but the third refused to work properly.  The mysqldump command produced an empty file each time, despite the script being identical in structure across all three sites.

It took a while to find the problem, but I eventually solved it.  It didn’t accept the password for the database as the password started with an ampersand ‘&’.  I tried various methods, such as escaping the ampersand and inserting quotes around the password but each time it failed.  In the end, I took the simple route – I set up an additional user for the database and gave that user a slightly simpler password.  It worked!

I now have a series of sites set up and all are running smoothly.  Because of the flexibility of Cron I could set the backups to run anywhere from once a minute to once a year.  However I have chosen a more sensible option. The databases are backed up every night and the files are backed up every week. That should be sufficient?

My next job is to refine the Files backup to ignore all cached files, as they are very bulky and aren’t necessary for a restore anyway.
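
One likely route (untested here, and it assumes the cached files live under wp-content/cache, where most caching plugins put them) is tar's --exclude option –

/bin/tar -czf /usr/local/hosts/webspace/httpdocs/backups/backup-`date +%d.%m.%y`.tar.gz --exclude='*/cache/*' /usr/local/hosts/webspace/httpdocs/www/wp-content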

Posted in Blogging, Tech stuff | Leave a reply

Testing C.D.N.

Posted on February 19, 2013 by Richard

In my quest for efficiency I have been experimenting with CDN.

A Content Delivery Network (CDN) promises a lot in return for a simple setup.  I decided to give it a try.

I signed up with CloudFlare, as they have a fair reputation and are free.  Setup was simple, as all I had to do was give them the name of the site.  I then pointed the site's domain at their name servers and sat back to watch the results.

To test the effectiveness, I measured page load speed on three sites, with the following results –

Domain name              Size       Load Time      Average time per KB
headrambles.com          52.75 KB   0.63 seconds   0.01 seconds
apipeandakeyboard.com    57.08 KB   0.58 seconds   0.01 seconds
smokingoutthetruth.com   39.77 KB   0.95 seconds   0.02 seconds

Of the three, this site is the fastest, and it is also the site that is registered with CloudFlare.  However, as the difference in speed is a matter of a few milliseconds, I doubt if I will bother setting up Smoking or Rambles.

On another note, I have also been playing around with caching.  Having tried several caching plugins (one of which caused an almost total failure of the site) I settled back into WP Super Cache.  I had heard that it conflicts with my theme, but having tested the site on both browser and mobile phone I could see no problems.

However one thing I did discover is that by playing around with the Advanced settings, I could gain an appreciable increase in speed.  Using the default settings, Smoking loaded in around 3.41 seconds.  Having tweaked Super Cache, that dropped to the 0.93 seconds as shown in the table above. 

Now that is what I call an appreciable return.

Posted in Tech stuff | 2 Replies
