WordPress Hello Dolly -> Rick Roll

For a little bit of fun, a number of months ago I edited the default Hello Dolly WordPress plugin and changed it to a Rick Roll.

Personally, I’m not of the Hello Dolly generation and the song didn’t interest me at all, whilst Rick Rolling all the other WordPress users is fun!

Posted in Code | Leave a comment

SugarCRM Soap v2 Modules and Link Fields lister

Here at ANAT we use SugarCRM, a reasonably good, free and open source Customer Relationship Management system, which is used to keep the membership database information, send out monthly digest emails to subscribers, do document control, and much more.

Unfortunately it has a downside that isn’t initially apparent when you install it: its SOAP API is dirty.

What I mean by this is that you THINK you should be able to do something, like write a script to add a new user and subscribe them to the newsletter (what SugarCRM calls a prospect_list), but when you actually try it you find there’s very little documentation, and in some cases the API simply doesn’t work.

What’s more, there are two SOAP APIs. The default one that you’ll likely come across is the v1 API. For those who have had their hearts ripped out by that and are moving on to try the v2 API, I’ve written up a module and link_field lister.

Once you’ve put your URL, username and password in, the script will connect to your SugarCRM SOAP service, then run through and list all of the available modules, their fields, and also the Link fields, which are a crucial way of accessing related modules and information in most of the v2 API calls, yet are barely explained in any of the docs.

Note that the script uses the nusoap libraries (included in the zip file) and YOU WILL NEED TO EDIT THE URL, USERNAME, and PASSWORD variables at the top of the script to point to your SugarCRM install with a valid username and password.
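
If you’d like a quick sanity check that the v2 endpoint is actually reachable before running the lister, fetching its WSDL from the shell works well. The base URL below is a placeholder for your own install, and the service path assumes the stock SugarCRM directory layout:

# Hypothetical base URL; point this at your own SugarCRM install
SUGAR_URL='http://crm.example.com/sugarcrm'

# The v2 SOAP service publishes a WSDL; getting XML back means the endpoint is alive
curl -s "$SUGAR_URL/service/v2/soap.php?wsdl" | head -n 5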

Posted in Code | 3 Comments

Fun with sudoer

As part of our ongoing push to make our servers more secure, the tech department decided to lock down the sudo command on our servers by restricting which users could use it and enabling e-mail alerts for unauthorised access. For those of you who do not know of it, sudoers is the configuration file on UNIX/Linux systems that controls who can use sudo and how. Note: sudo allows non-root users to run commands as root (admin).

Restricting user access was easy, as there are many useful guides on the net. One of these guides mentioned using visudo to edit the sudoers file, as it opens the file in your default editor and then runs an integrity check before saving; I highly recommend this.
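
By way of illustration, the kind of restriction entries those guides walk you through look roughly like this (the user and group names here are hypothetical, not from our actual configuration):

# Members of the admin group may run any command as root
%admin        ALL=(ALL) ALL
# bob may only restart Apache, and nothing else
bob           ALL=(root) /usr/sbin/apache2ctl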

The fun began when trying to set up e-mail alerts. All the guides and forum posts I had read told me the options I needed but had conflicting syntax. After trying every combination I could think of I resorted to RTFM and read the man page (the one embedded in Ubuntu 10.x, not the one on the web), which showed the exact syntax below.

Defaults      mailto=user\@domain.org.au
Defaults      mail_no_user
Defaults      mail_badpass

Every other guide had assumed that I would automatically include the Defaults prefix and neglected to mention it (including the web-based man page), so I spent half a day chasing my tail.

I hope this helps you avoid the same pain.

Posted in Code, Gripes, OMFG, WTF | Leave a comment

When Sysadmins Ruled the Earth

If you’re a fellow geek who wants an engaging story then you’ll probably enjoy this as much as we did.

http://baens-universe.com/articles/when_sysadmins_ruled_the_earth

Posted in OMFG | Leave a comment

Empty directory cleanup

This script will remove all empty subdirectories within the directory supplied as a command line parameter.
e.g. ./CleanupDirs.sh /docs/stuff will remove any empty subdirectories of /docs/stuff.

#!/bin/bash
 
# This script was created by Dale Caon
# On the 15th of November, 2010
# for ANAT, the Australian Network for Art and Technology.
# Title : Empty Directory Cleanup Script.
# Description : This should be run to recursively delete all empty subfolders in a directory.
# Requires : A POSIX-compliant OS (or close). NOTE: Mac OSX does not count.
 
# Ensure that current working directory matches target dir
cd "$1"
if [ $? != 0 ];
then
        echo "Error: Failed to change directory - Exiting"
exit 1
fi
 
# find directories recursively in the current directory and output the list within single quotes
# use sort to reverse the order (so deeper directories come first)
# use xargs to convert the sorted list into command line args for rmdir
# rmdir errors when removing non-empty dirs, therefore deleting only the empty dirs and leaving files intact.
 
find ./ -type d -printf "'%p'\n" | sort -r | xargs rmdir > /dev/null 2>&1

This took us ages to puzzle out so please make use of this.
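
For what it’s worth, on systems with GNU findutils a single find invocation can do the same job; a minimal alternative sketch (again, stock Mac OSX need not apply):

#!/bin/bash
# Delete empty directories under the directory given as $1.
# -delete implies depth-first traversal, so nested empty dirs collapse in one pass;
# -mindepth 1 stops the target directory itself from being removed.
find "$1" -mindepth 1 -type d -empty -delete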

Posted in Code | Leave a comment

MySQL Backup and Rsync Script

As part of running a Web server, in this case a LAMP (Linux, Apache, MySQL, PHP) server, you need to ensure you have a good backup system.

As per the Tao of Backup, you need something that allows you to back up and keep point-in-time snapshots.
We use Rsnapshot to create our point-in-time backups, however we also need to get a compressed SQL database dump out of MySQL for it to copy.
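
For context, the Rsnapshot side of this only takes a few lines of configuration; a rough sketch (the paths and retention counts are just examples, and remember that rsnapshot.conf fields must be separated by tabs, not spaces):

# /etc/rsnapshot.conf excerpt -- fields are TAB-separated
snapshot_root   /var/cache/rsnapshot/
interval        daily   7
interval        weekly  4
# Pick up the compressed MySQL dumps created by the script below
backup          /backups/mysql/         localhost/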

Given we have a Google Apps email account (Gmail for domains) with over 7GB of inbox space on the free version (and 25GB+ if you pay $50/yr per user), we both Rsnapshot the MySQL backup and email a copy to ourselves.

Assumptions:

  • You are using a LAMP server
  • You are backing up with Rsnapshot
  • Your MySQL backup is small enough to be emailed (although you can stop the emailing if you want)
  • You have root MySQL database access (or at least access to back up more than one database; you can trim the code down if it’s just one DB).

#!/bin/bash
#unset PATH
 
# USER VARIABLES
MYSQLUSER=root                         # The mysql user
MYSQLPWD='[INSERT YOUR PASSWORD]'                    # The mysql root password -- Note, might not be needed if you are using .my.cnf, but you'll need to change the script if that's the case.
MYSQLHOST=localhost                    # This should stay localhost
MYSQLBACKUPDIRTAR="backups/mysql/$HOSTNAME"            # The leading / is removed for tar, otherwise it sends an annoying email each time it compresses the files.
MYSQLBACKUPDIR=/${MYSQLBACKUPDIRTAR}            # The folder where the backed-up databases will be put.
EMAILADDRESS='[INSERT YOUR EMAIL ADDRESS]'                # The email address to send the backed up database to. e.g [email protected]
EMAILBODY="Attached should be the latest zipped backup of the MySQL database on $HOSTNAME"   # Where the body of the main email comes from, or the content itself.
RSYNCHOST='[[email protected]]' # This should be set to however you'd SSH into the host you want to RSYNC to. e.g [email protected]
echo "Saving compressed MySQL database backup to $MYSQLBACKUPDIR, please wait"
 
#Other Variables
CURRENTDATE=$(date '+%d-%m-%Y') #The date (e.g 23-02-2008) for use in the zip file name
MYSQLBACKUPFILENAME="MySQL_${HOSTNAME}_[$CURRENTDATE].sql" # Should look something like 'MySQL_office.anat_[23-10-2010].sql'
 
##                                                      ##
##      --       DO NOT EDIT BETWEEN THIS HERE     --   ##
##                                                      ##
 
# CREATE MYSQL BACKUP
# Create backup dir (in case it doesn't exist yet) and remove old .gz files to make way for new ones.
mkdir -p  $MYSQLBACKUPDIR
rm -fv $MYSQLBACKUPDIR/*.gz # -f stops the complaint on the first run, when there are no old .gz files yet
 
#Dump individual sql files... The important bit!
# Skip information_schema, as mysqldump can't lock its (virtual) tables
for i in $(echo 'SHOW DATABASES;' | mysql -u$MYSQLUSER -p$MYSQLPWD -h$MYSQLHOST | grep -v '^Database$' | grep -v '^information_schema$' ); do
  mysqldump                                                    \
  -u$MYSQLUSER -p$MYSQLPWD -h$MYSQLHOST                         \
  -Q -c -C --add-drop-table --add-locks --quick --lock-tables   \
  $i > $MYSQLBACKUPDIR/$i.sql
 echo -n "."
done;
echo ""
 
##                                         ##
##      --       AND THIS HERE     --      ##
##                                         ##
## Hint, you can edit below this line      ##
 
echo "Done, dumping files to $MYSQLBACKUPDIR"
 
# This will compress the sql files individually as they should get rsnapshotted, so if a database hasn't changed then a new file doesn't have to be created for it
# Note, the change directory and -C in the Tar command is to stop it from generating an annoying error saying that it's removing the leading / from the tar names.
 
cd $MYSQLBACKUPDIR/
#find * -name '*.sql' -exec tar -vcf $MYSQLBACKUPDIR/{}.tar -C / $MYSQLBACKUPDIRTAR/{} \;
# another way is to run :: find $MYSQLBACKUPDIR/ -name '*.sql' -exec tar -zvcf {}.tar.gz -C / {} \;
# The above would  tar and gzip at the same time, but I'm not sure if it's in an rsyncable way
 
## Compress the SQL files ##
find $MYSQLBACKUPDIR/ -name '*.sql' -exec gzip --best --rsyncable {} \; 
 
TARLIST=''
###  CREATE A MASTER .TAR FILE OF ALL GZIPS ####
cd $MYSQLBACKUPDIR/
# This will create a listing of all the .tar.gz files in the directory
for i in $( find * -name '*.gz' );
do
TARLIST="$TARLIST $MYSQLBACKUPDIRTAR/$i"
done;
 
# Now do the actual tarballing of the files we've listed above #
tar -vcf $MYSQLBACKUPDIR/$MYSQLBACKUPFILENAME.tar -C / $TARLIST
echo "Compressed $MYSQLBACKUPDIR/$MYSQLBACKUPFILENAME.tar"
echo
 
#### Rsync push the backed up files somewhere ####
rsync -avh $MYSQLBACKUPDIR/*.gz -e ssh $RSYNCHOST:/backup/mysql/$HOSTNAME/
## Note, you might need to create the directory on the target server, or maybe change to something like /backup/mysql/$HOSTNAME and hopefully the parent directory already exists.
echo "---"
echo "Rsync'd daily backup"
 
### Mail the zipped file as an attachment... If your database is too big you'll have to replace this with rsync, or FTP backup ##
echo $EMAILBODY | mutt -s "[$HOSTNAME] Daily MySQL backup" -a $MYSQLBACKUPDIR/$MYSQLBACKUPFILENAME.tar -- $EMAILADDRESS
echo "emailed the zipped file"
 
### All databases in one file, Emergency Backup in case they all die ###
# mysqldump --add-drop-database --all-databases > $MYSQLBACKUPDIR/allMySQL-current.sql
# gzip --rsyncable $MYSQLBACKUPDIR/allMySQL-current.sql
# echo "Finished the all in one backup $MYSQLBACKUPDIR/allMySQL-current.sql.gz"
 
# Before Rsnapshot decides to copy all of this, let's remove the unwanted files
rm $MYSQLBACKUPDIR/$MYSQLBACKUPFILENAME.tar
rm -f $MYSQLBACKUPDIR/*.sql # Safety net: removes any stray dumps that failed to compress (normally there are none left)
 
echo
echo "Listing of compressed files in $HOSTNAME:$MYSQLBACKUPDIR"
echo "------------------------------------------------------------------"
ls -Asch1 $MYSQLBACKUPDIR/
echo "---"
echo "DONE!"

Please note : Whilst the majority of the work on this script was done by Michael Kubler, and some by Dale Caon, parts of it were copied from online sources. Unfortunately links to them have been lost in the sands of time.
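
If you want this to be hands-off, schedule the script from root’s crontab so it finishes before Rsnapshot’s daily rotation runs; the time and the script path below are only an example:

# Run the MySQL backup at 2:30am, ahead of the daily Rsnapshot run (edit with: sudo crontab -e)
30 2 * * *  /usr/local/bin/mysql-backup.sh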

Posted in Code | 1 Comment

Useful Aliases

Here is a list of useful aliases used by the ANAT tech department. Please feel free to use these as appropriate:

alias aRestart='sudo apache2ctl -k restart'
alias acs='apt-cache search'
alias agg='sudo apt-get upgrade'
alias agi='sudo apt-get install'
alias agr='sudo apt-get remove'
alias agu='sudo apt-get update'
alias apt='sudo apt-get install'
alias aliasd='nano ~/.bash_aliases'
alias alogs='sudo tail -f /var/log/apache2/access_logs/*_log /var/log/apache2/error_logs/*_log'
alias bashrc='nano ~/.bashrc'
alias cdetca='cd /etc/apache2/sites-available'
alias cdwww='cd /var/www/'
alias chownWWW='sudo chown -Rc www-data:www-data * .htaccess'
alias da='du -hsc *'
alias dir='ls --color=auto --format=vertical'
alias directoryExec='sudo find * -type d -exec chmod -c ug+x {} \;'
alias ds='du -hsc * | sort -n'
alias egrep='egrep --color=auto'
alias eximQ='sudo exim -bp'
alias eximRemove='exim -Mrm'
alias eximRouting='exim -bt'
alias eximStats='eximstats -nr /var/log/exim4/mainlog'
alias fgrep='fgrep --color=auto'
alias grep='grep --color=auto'
alias l='ls -CF'
alias la='ls -A --color=auto'
alias ldir='ls -l | egrep '\''^d'\'''
alias lf='ls -l | egrep -v '\''^d'\'''
alias ll='ls -Alsch --color=auto'
alias load='uptime'
alias logs='find /var/log/ -name "*.log" -type f -exec tail -f {} +'
alias logsudo='sudo find /var/log/  -name "*.log" -type f -exec sudo tail -f {} +'
alias ls='ls --color=auto'
alias maillog='tail -n 50 -f /var/log/exim4/mainlog /var/log/exim4/rejectlog'
alias mailstats='eximstats /var/log/exim4/mainlog'
alias mem='free -m'
alias netstats='sudo netstat -tulpn'
alias openz='gzip -d -c'
alias rm='rm -v'
alias rma='rm -Rv'
alias vdir='ls --color=auto --format=long'

Note: We have removed some aliases for security reasons.

Assumption: You are using Ubuntu on a web server. The aliases we have listed are mainly used on a standard LAMP (Linux, Apache, MySQL, PHP) server running Ubuntu. Some commands are useful on other distros, whilst the apt-get ones and anything with hard-coded directories might not be appropriate. Also, on some hosts Apache is called httpd.
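
To install them, drop the alias lines into ~/.bash_aliases and reload your shell configuration; Ubuntu’s default ~/.bashrc already sources that file if it exists:

nano ~/.bash_aliases    # paste the alias lines in and save
source ~/.bashrc        # reload so the aliases take effect in the current shell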

Explanation of some interesting aliases

aRestart : Just a quick way of restarting the Apache web server. This does a graceful restart so it won’t kick off any existing connections, although if someone’s downloading a large file (which will hopefully be resumable in any decent download manager) and you can’t wait, you can try the more forceful command /etc/init.d/apache2 restart.

acs : ‘apt-cache search’. This is Debian/Ubuntu specific and will search for packages (programs). The output can be a little too verbose and you might have to pipe it through grep to find what you want, e.g. acs php | grep php

agi : This will run apt-get install for the package you want. I usually run it with ‘-s‘ first to simulate the install and see what it will actually do, e.g. agi -s php5

aliasd : This is what you’ll likely use to edit your existing aliases. Note that ~/.bash_aliases only takes effect if your ~/.bashrc sources it, which Ubuntu’s default one does (see the installation note above).

chownWWW : This is great to use when you are migrating various files into an Apache-hosted directory, but be careful! This is run as root and is very powerful. You certainly don’t want to run it in the / root folder or somewhere similarly bad. Usually you’ll run it in the /var/www/ or ~/public_html/ folders, depending on how your server is configured.

da : I use this a LOT when trying to work out where my disk space is going. An alternative is to install ncdu (ncurses disk usage).

logs : This is a great way of tailing all the logs in the /var/log/ directory and seeing whenever they get appended to. Mainly used for checking Apache logs or getting some more info about what your server is or isn’t doing. Note, this only tails files ending in .log, as I couldn’t get it to ignore gzipped files by their type and there’s no point tailing compressed, log-rotated logs.

directoryExec : This is used to add execute permissions to all subdirectories so you can actually view them. That is, in order for a user to list the contents of a directory they need execute permission on that directory. As with the chownWWW command this is a powerful, recursive command and should only be used when needed.

An older version of the directoryExec script is below for reference purposes. (Note its bug: quoting the command substitution feeds the whole directory list to the loop as a single string, which is part of why it was replaced with the find -exec version above.)

alias directoryExec='for directory in "$( sudo find * -type d )"
do
sudo chmod -c ug+x $directory
done'

eximQ : This is an easy to remember shortcut if you use the exim4 email server. It’ll show the queue of emails waiting to be processed.

ll : I use this ALL the time to view a listing of files. I basically can’t use a terminal prompt without it, as it gives lots of useful information. You can remove the h (so it’ll be ls -Alsc) if you don’t want human-readable sizes, as it can sometimes be a little easier to find the larger files without it.

netstats : As per http://www.cyberciti.biz/faq/what-process-has-open-linux-port/ this shows which ports are open and which programs are opening them.

Got any more? Let us know of any useful aliases you’ve developed or found!

Posted in Code | 2 Comments

Webalizer Scripting Shenanigans

After much swearing and cursing, the ANAT tech department has finally come up with an automated, self-scaling script to maintain web stats across multiple sites hosted on a single server using Webalizer. Please feel free to utilise these scripts.

#!/bin/bash
##################################################
## Script to add new logs to existing web stats ##
##################################################

BASE_OUTPUT_DIR='/home/anat/public_html/webstats/'
BASE_LOGS_DIR='/var/log/apache2/access_logs/'
LOGS_APPENDED_CHARS='_log.1'
SED_LOGS_DIR=${BASE_LOGS_DIR//\//\\\/}                 # Replace / with \/ to get around sed regex substitution
SED_LOGS_APPENDED_CHARS=${LOGS_APPENDED_CHARS/\./\\.}  # As above, but replace . with \.
for LOG in $( find "$BASE_LOGS_DIR" -name "*$LOGS_APPENDED_CHARS" -type f )
do
    # Derive the site name by stripping the log directory prefix and the _log.1 suffix
    SITE=$( echo "$LOG" | sed -e "s/$SED_LOGS_DIR//" -e "s/$SED_LOGS_APPENDED_CHARS//" )
    mkdir -p "$BASE_OUTPUT_DIR$SITE/"
    webalizer -o "$BASE_OUTPUT_DIR$SITE/" -n "$SITE" "$LOG"
done

For the full script, with once-off import functionality and other features, please view the Google Doc here
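
Since Webalizer updates its stats incrementally, the script above is meant to run on a schedule. A hypothetical root crontab entry (the script path is an assumption) that fires after the nightly log rotation has produced the fresh *_log.1 files:

# Update the web stats shortly after logrotate runs (edit with: sudo crontab -e)
30 6 * * *  /usr/local/bin/webalizer-update.sh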

Posted in Code | Leave a comment

The Tao of Backup

This is a great way of learning the 7 lessons of reliable backups (just ignore the tiny amount of outdated advertising).

http://www.taobackup.com/

  1. Coverage — Backup everything you can
  2. Frequency — Backup as often as you are making changes
  3. Separation — Geographically redundant storage (hint: if you don’t have 3 copies of the data in at least 2 locations then your data might as well not exist).
  4. History — Versioning/snapshots of data changes over time. This helps recover from corruption, virus infiltration and PEBKAC.
  5. Testing — If you haven’t tested your backups then Murphy’s law says they will likely fail when you need them the most.
  6. Security — If I want to hack you, I’ll go for your unsecured backups.
  7. Integrity — If your original data has been corrupted without you knowing, then your copies of it are useless.

Posted in Links, Need to know | 2 Comments

Rootless Root – The Unix Koans of Master Foo

I felt I had to share this amusing take on the art of Unix/Linux by Eric Steven Raymond.

http://catb.org/esr/writings/unix-koans/

Posted in Links, OMFG | Leave a comment