Monthly Archives: July 2009


Are web frameworks really worth it?

As the site matures towards its 500th day online and keeps growing, it's also becoming painfully clear how cumbersome a framework (it's written in Ruby on Rails) can be. Agreed, I am not the most seasoned Rails programmer, and my Ruby code sometimes makes people cry, but still. Note that this rant is about web-based frameworks, though it might apply in other environments too.

Web development frameworks have one main advantage: it's a framework. You don't have to write 90% of your code anymore. If you want user-login-forgot-password functionality, or a nice captcha, you often only have to enable one module, et voilà, you're rolling. Great for advocating so-called agile development. I wrote an audit tracking tool (points raised in a systems audit and outlining their risks) at work in Rails in a few hours; needless to say, it was impressive.

The downside of a framework is that it is a framework: you are bound by its rules and often have to bend yourself to make things work.

The first 90% of your application gets done in 10% of the time, and you'll be debugging for the remaining 90%. As said, it's my personal opinion as a not fully fledged Rails guy, but a project that matures over time seems to run into problems with frameworks. (Database migrations tend not to work well in the/my real world.)

Sorry, just a rant. It's 3 AM, and these migrations are driving me up the wall, to the extent that I downloaded MDB2 and Smarty, and am assessing how long a rewrite would take.

Good night,


Tar based incremental backups

A small bash script I wrote to take incremental backups on a Unix server and push them to a Windows file server. On the file server, we add this directory to the normal backup.

This is on a mail server, where emails are stored in Maildir format. We create full backups every Sunday, and daily incrementals the rest of the week. This script is called nightly from a cron job. Gotta love the scripting abilities of bash.
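For the curious, the cron side is just a single line; something like this in root's crontab (the time, path, and address are only examples):

```shell
# hypothetical crontab entry: run the backup nightly at 02:30;
# cron mails the script's echo output (the report) to MAILTO
MAILTO=admin@example.com
30 2 * * * /usr/local/sbin/mail-backup.sh
```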

It might help you out, so here goes:

#!/bin/bash
# backup script is doing following items:
# dump all incremental email into a backup file, gzip the backup file and
# move the file to an external file server

START_TIME=`/bin/date`
echo "backup started at: ${START_TIME}"

DOW_N=`/bin/date +"%w"` # day of week as a number, 0 (sun), 1 (mon), ...
DOW_T=`/bin/date +"%F"` # date stamp, e.g. 2009-07-26

FILE_SERVER="/mnt/fileserver/"            # mounted over SMB
TO_BACKUP="/var/vmail"                    # Maildir tree to back up, example path
BACKUP_LOG="/var/backups/mail.snar"       # tar's incremental snapshot file, example path
TEMP_FILE="/tmp/mail-backup-${DOW_T}.tar" # archive in progress, example path

# if it's a sunday, delete the snapshot file so tar takes a full backup
if [ ${DOW_N} -eq 0 ]; then
  /bin/rm -f ${BACKUP_LOG}
fi

/bin/tar -c -f ${TEMP_FILE} --listed-incremental=${BACKUP_LOG} ${TO_BACKUP}
/bin/gzip -f ${TEMP_FILE}
FILE_SIZE=`/bin/ls -lah ${TEMP_FILE}.gz | awk '{ print $5 }'`
/bin/mv ${TEMP_FILE}.gz ${FILE_SERVER}

## report, this goes in an email through cron
END_TIME=`/bin/date`
echo "backup ended at: ${END_TIME}"
echo "data moved: ${FILE_SIZE}"
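One caveat worth remembering: to restore, you extract the Sunday full archive first and then each daily incremental in order, passing --listed-incremental=/dev/null so GNU tar replays the archives without updating a snapshot file. A self-contained sketch on throwaway data (the file names here are made up, not the ones the script above produces):

```shell
#!/bin/bash
# demo: build and restore a full + incremental tar pair on throwaway data
set -e
WORK=`mktemp -d`
cd ${WORK}

mkdir data
echo "sunday" > data/a.txt
/bin/tar -c -f full.tar --listed-incremental=snapshot data # full backup
echo "monday" > data/b.txt
/bin/tar -c -f incr.tar --listed-incremental=snapshot data # only b.txt is new

mkdir restore; cd restore
# full first, then incrementals in order; /dev/null skips snapshot bookkeeping
/bin/tar -x -f ../full.tar --listed-incremental=/dev/null
/bin/tar -x -f ../incr.tar --listed-incremental=/dev/null
ls data # both a.txt and b.txt are back
```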