I went down to Newcastle today for the Maker Faire, which is being run as part of the Newcastle Science Fest '10. I went down to participate last year and have been looking forward to this year's event ever since. The Maker Faire isn't a full-on technical meetup; there is a lot of emphasis on family-friendly activities. Having said that, there are plenty of people working on some really interesting things. Below I've included a video highlighting some of what caught my eye. I'm heading down again next weekend for the BarCampNorthEast3 event. Again, I attended last year and had a fantastic time. As I'm writing this there are still 28 tickets available, so if you're in Newcastle or can get there, do check it out. In addition to the exhibitors in the video, there are several more I would particularly like to highlight but couldn't include, principally due to my poor photography skills.
The Webcycle is an interesting project which limits the speed of your internet connection based on how hard you're cycling: the faster you pedal, the faster the connection.
The Curiosity Collective took the Google Maps directions a little too literally and ended up travelling to Newcastle from Ipswich via the Netherlands. They had a variety of projects this year around the concept of mapping.
Sugru was in attendance demoing their mouldable silicone. Definitely the type of thing worth keeping alongside the duct tape.
CamBam were showing what their software could do when coupled to a CNC milling machine.
Hexapodrobot had a number of their robots on their stand. The finish on them looked very professional.
Brian Degger and Cathal Garvey were showcasing DIY biology. The idea is to take the garage-based startup from the software world and transfer it to biology. Given that my principal interest is in biology/biochemistry/biomedicine, I'm really not sure what to make of this. It definitely needs further consideration; I may well post about this again in the future.
Mike Cook was demonstrating several of his Arduino projects. Definitely worth a look, as are his tutorials.
Oomlout were demoing a number of their projects. It's a wonder they find time to run what is probably the best UK-based shop for Arduino-related goodness. Check them out.
The instructions on the tarsnap site are really very easy to follow. I was momentarily caught out by not importing the code signing key, but after getting that sorted out it was fine. I did need to use sha256sum rather than sha256 as suggested. Installation went well and then I had a little play with creating, listing, deleting and recovering data from backups. It was at this point that my only real gripes with the software became obvious: you can't humanize the data-size figures when using --list-archives, and there is no short option for --list-archives. As gripes go these are fairly minor, though, and everything else works nicely.
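For reference, the basic operations I played with look roughly like this (a sketch; the archive name and paths are placeholders, not what I actually ran):

```shell
# Create a new archive containing the given directory
tarsnap -c -f mybackup-2010-03-14 /home/streety

# List every archive stored under this key
# (no short option, and sizes are shown in bytes)
tarsnap --list-archives

# Recover: extract an archive into the current directory
tarsnap -x -f mybackup-2010-03-14

# Delete an archive once it is no longer needed
tarsnap -d -f mybackup-2010-03-14
```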
With the tarsnap client running on my server it was time to automate my backups. I put together a small script which creates a dump of my database and then creates a new backup with the tarsnap client.
#!/bin/bash
# date stamp used in the log lines and the archive name
dateString=$(date +%Y-%m-%d)
echo "Beginning backup for $dateString" >> /home/streety/sources/backup/tarsnap.log
#dump the mysql database
rm -f /home/streety/mysql-backup.sql
mysqldump --user=backup -ppassword --all-databases > /home/streety/mysql-backup.sql
#backup to tarsnap
tarsnap -c -f linode-jscom-$dateString /home/streety /etc/apache2
echo "Backup complete for $dateString" >> /home/streety/sources/backup/tarsnap.log
That script worked fine when I ran it from the shell, but cron didn't seem to be running it. It turned out I needed to specify the full path to the tarsnap binary. Easily enough done.
# m h dom mon dow command
5 0 * * * /home/streety/sources/backup/backup.sh >> /home/streety/sources/backup/output.log 2>&1
With everything working I wanted to get permissions set up. Again this was very easy.
I've heard enough horror stories about lost data to know that backups are important. For the files that mainly live on my laptop I use Jungledisk to automatically back up the important ones daily. At the time I signed up the program cost $20 and my storage costs are about $0.50 a month. Today you have to pay at least $2/month plus the storage fees. Not bad, but after a couple of years those fees are going to add up. When I used shared hosting I sporadically backed up the files and emailed a database dump to my Gmail account daily. This worked perfectly well as the files rarely changed and Gmail was able to hold several hundred copies of the database for my little blog. When I needed to, I could simply go in and delete last year's backup emails.

Recently, though, I've started renting a VPS from Linode and I'm now in the position where both the files and the database change frequently. I need a way to back up both, and as I'm lazy I want it to be automated, so I started looking around for information on how other people were handling this.
I came across a post from John Eberly discussing how he automates his backups to Amazon S3. This looked like a good place to start, but I was sceptical about how rsync would work with Amazon S3 as described, and the approach kept only a single backup. Based on this I formulated the following plan: at the start of each week, copy the directories to be backed up to a temporary directory using rsync and encrypt the result with GnuPG, then push the resulting file to Amazon S3. On each subsequent day, make a differential backup using rsync's batch mode, encrypt and push it to S3. Repeat at the start of the next week. After putting a surprisingly short script together I had a working approach. Except nothing was actually being pushed to S3. I still need to investigate why this was happening, but it isn't at the top of my list of things to do, as I have since found a far better way to handle my backups.
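The plan above can be sketched roughly as follows. This is a reconstruction, not my actual script: the paths, bucket name, GPG recipient and the use of s3cmd for the upload are all assumptions.

```shell
#!/bin/bash
# Weekly full backup plus daily differentials via rsync batch mode.
src=/home/streety
work=/tmp/backup
week=$(date +%G-W%V)   # e.g. 2010-W11, names the current weekly cycle
day=$(date +%u)        # 1 = Monday ... 7 = Sunday

if [ "$day" -eq 1 ]; then
    # Start of the week: take a full copy and keep it as the baseline
    rsync -a "$src/" "$work/full-$week/"
    tar -czf "$work/full-$week.tar.gz" -C "$work" "full-$week"
    gpg --encrypt --recipient backups@example.com "$work/full-$week.tar.gz"
    s3cmd put "$work/full-$week.tar.gz.gpg" "s3://my-backup-bucket/"
else
    # Later in the week: --write-batch records only the changes needed
    # to bring the baseline copy up to date, giving a small diff file
    rsync -a --write-batch="$work/diff-$week-$day" "$src/" "$work/full-$week/"
    gpg --encrypt --recipient backups@example.com "$work/diff-$week-$day"
    s3cmd put "$work/diff-$week-$day.gpg" "s3://my-backup-bucket/"
fi
```

To restore, you would decrypt the weekly full and then replay each day's batch file with rsync's --read-batch option.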
I'm not an expert at backups. Nor am I a security expert. Nor am I interested in becoming an expert at either backups or security. This means someone has likely already built a better backup utility than I could. I believe I have found it in Tarsnap. Below is a list of what tarsnap does. I've highlighted the features which take it above and beyond my approach.
- Backups on my schedule
- Files are encrypted
- Utility pricing: pay only for what you use, with no standing charges
- Open source: I can check that only what I want to happen is really happening
- Efficient: backups take up no more space than my full+differential strategy, and yet each backup can be manipulated independently of any other backup
- Permissions: with tarsnap I can allow my server to create and read backups but not delete them
The efficiency is nice, but a difference between $0.50/month and $0.60/month isn't a massive deal. What is a big deal is the permissions. Backing up my files anywhere with an online connection has always made me slightly uneasy. Email works well, as once an email is sent it can't be called back. If you want to back up to Amazon S3, you have to grant unrestricted access to read, edit and delete, which means it is possible to lose all your backups. Tarsnap is not vulnerable to this weakness, and that is a big deal. It's one less thing to worry about, which is certainly worth the $0.15/GB premium over S3 alone. My next post will detail how I have implemented backups using tarsnap.
I have recently returned to a project I began last winter. The project is to design an alarm clock which eases waking up on these dark and dismal winter mornings. The idea is fairly simple: 30 minutes or so before I want to wake up, a bank of LEDs will begin to shine, simulating the dawn. This will hopefully prime me for the alarm itself. Last winter I got timekeeping working with a DS1307 real-time clock and was able to control the LEDs through an Arduino.
Although I'm keeping the real-time clock, I'm switching the LEDs from white to blue light, and I'm adding an LCD to display the time and likely allow programming. So far I have soldered up the bank of LEDs so they are almost ready to go, and set up the Arduino to pull the current time from the DS1307 and display it on the LCD. The project is going well. Hopefully this year I'll actually finish it while it is still of some use; when spring comes around I'll let nature do the work.