In a recent project based on Zend Framework (ZF) I wanted to send out some fairly complex emails following user registrations and for various alerts. ZF has a decent class for sending email, which just left producing the text to send in each message. For each type of email much of the content would stay the same, with only a few things, like the name and date, changing. I could have strung these elements together in a variable, but that would probably have become very messy very quickly, in the same way that mixing HTML and logic gets messy quickly.
The solution to the HTML/logic issue is to use templates, and unsurprisingly the same approach also works well for email. There is no shortage of templating systems for PHP and I suspect most would work perfectly adequately for this task. As I was already using ZF, though, I decided to go with Zend_View.
The class which follows wraps Zend_Mail and Zend_View together. Variables for use in the template can be assigned quickly and simply, and when the time comes to send the email additional default variables are pulled in from a config file. The config file also holds the directory where the templates are stored and the address the email should be sent from.
/**
 * A template based email system
 *
 * Supports the sending of multipart txt/html emails based on templates
 *
 * @author Jonathan Street
 */
class Acai_Mail
{
    /**
     * Variable registry for template values
     */
    protected $templateVariables = array();

    /**
     * Template name
     */
    protected $templateName;

    /**
     * Zend_Mail instance
     */
    protected $zendMail;

    /**
     * Email recipient
     */
    protected $recipient;

    /**
     * Set default options
     */
    public function __construct()
    {
        $this->zendMail = new Zend_Mail();
    }

    /**
     * Set variables for use in the templates
     *
     * Magic method which stores the value assigned to any property of this
     * class for use later when rendering the templates
     *
     * @param string $name  The name of the variable to be stored
     * @param mixed  $value The value of the variable
     */
    public function __set($name, $value)
    {
        $this->templateVariables[$name] = $value;
    }

    /**
     * Set the template file to use
     *
     * @param string $filename Template filename
     */
    public function setTemplate($filename)
    {
        $this->templateName = $filename;
    }

    /**
     * Set the recipient address for the email message
     *
     * @param string $email Email address
     */
    public function setRecipient($email)
    {
        $this->recipient = $email;
    }

    /**
     * Send the constructed email
     *
     * @todo Add from name
     */
    public function send()
    {
        // Get data from config:
        //  - from address
        //  - directory holding the template files
        //  - default template variables
        $config       = Zend_Registry::get('Config');
        $templateDir  = $config->email->template->dir;
        $fromAddr     = $config->email->from;
        $templateVars = $config->email->vars->toArray();

        foreach ($templateVars as $key => $value) {
            // If a variable is present in the config but has not been set
            // explicitly, add it to the list
            if (!array_key_exists($key, $this->templateVariables)) {
                $this->{$key} = $value;
            }
        }

        // Render the subject, plain text and HTML templates; each part is
        // optional and falls back to false if its template file is missing
        $viewConfig = array('basePath' => $templateDir);

        $subjectView = new Zend_View($viewConfig);
        foreach ($this->templateVariables as $key => $value) {
            $subjectView->{$key} = $value;
        }
        try {
            $subject = $subjectView->render($this->templateName . '.subj');
        } catch (Zend_View_Exception $e) {
            $subject = false;
        }

        $textView = new Zend_View($viewConfig);
        foreach ($this->templateVariables as $key => $value) {
            $textView->{$key} = $value;
        }
        try {
            $text = $textView->render($this->templateName . '.txt');
        } catch (Zend_View_Exception $e) {
            $text = false;
        }

        $htmlView = new Zend_View($viewConfig);
        foreach ($this->templateVariables as $key => $value) {
            $htmlView->{$key} = $value;
        }
        try {
            $html = $htmlView->render($this->templateName . '.html');
        } catch (Zend_View_Exception $e) {
            $html = false;
        }

        // Pass everything to the Zend_Mail instance created in the constructor
        $mail = $this->zendMail;
        $mail->setFrom($fromAddr);
        $mail->addTo($this->recipient);
        $mail->setSubject($subject);
        $mail->setBodyText($text);
        if ($html !== false) {
            $mail->setBodyHtml($html);
        }

        // Send the email, using the development transport if configured
        $transport = $config->email->transport;
        if ($transport == 'Dev') {
            $tr = new Acai_Mail_Transport_Dev;
            $mail->send($tr);
            return;
        }
        $mail->send();
    }
}
You may want to make some changes to how the class fetches its default values depending on your setup.
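For reference, as written the class expects to find a Zend_Config object registered under the key 'Config' with roughly the structure below. This is only a minimal sketch: the values are made up, and whether you build the config from an array as shown or load it from an .ini file is up to you.

$config = new Zend_Config(array(
    'email' => array(
        'from'      => 'noreply@example.com',       // default From: address (illustrative)
        'transport' => 'Smtp',                      // anything other than 'Dev' uses the default transport
        'template'  => array(
            'dir' => '/path/to/application/emails', // directory holding the .subj/.txt/.html templates
        ),
        'vars' => array(
            'siteName' => 'Example Site',           // default variables merged into every template
        ),
    ),
));
Zend_Registry::set('Config', $config);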
Using the class is very simple. Here is the code I use to send a confirmation email to a new user.
$emailObj = new Acai_Mail;
$emailObj->setRecipient($data['email']);
$emailObj->setTemplate('register');
$emailObj->activationLink = $actiUri;
$emailObj->send();
Hopefully you find the above code of some use and can integrate it into your own projects. If you have any questions or suggestions please do post them in the comments below.
Three weekends ago I went down to Newcastle to attend BarCampNorthEast3. For anyone who doesn't know what a barcamp is, Wikipedia provides a decent explanation:
BarCamp is an international network of user generated conferences (or unconferences) - open, participatory workshop-events, whose content is provided by participants. The first BarCamps focused on early-stage web applications, and related open source technologies, social protocols, and open data formats.
The idea is that those attending also present. I decided to talk about the work I did setting up a dedicated search engine for a private phpBB-powered forum, as it's something not many people are ever likely to need to do. Below is the short PowerPoint presentation I put together. Excessive use of PowerPoint is discouraged, so there isn't much in it. Further details are below.
phpBB does have its own search functionality, so it's reasonable to ask why anything else is needed. Unfortunately the users of this forum had found that the site was slowing down. The problem was believed to be the search functionality, so it was set to make only the past year of content and future content searchable. This meant there were several years' worth of content which didn't show up in the search engine. I had arrived in this community relatively recently and so had missed a lot of good content. Initially I asked for access to the database, which would have made making the site fully searchable straightforward. Unfortunately the owner of the site didn't have the technical know-how to feel safe granting me, a relative newcomer, access to the database.
Three Steps
To get around this I decided to do what Google does: fetch the site one HTML page at a time and extract the content from that. The project could be broken down into three steps: create a mirror of the site locally, insert the content extracted from the local mirror into a database, and finally set up Sphinx to index the content.
The Problems
As the site is password protected, creating the local mirror required some additional work to handle login sessions. The HTML of the pages throws up several errors when run through the W3C validator, which created problems for extracting content. Finally, all the important information in the URLs is in the query string.
Wget
Wget is a really nice tool for downloading information from the internet, and it was relatively easy to cajole it into handling logging in. Unfortunately I soon realized it was downloading the same content again and again. This wasn't a fault in wget itself but a reflection of the redundant linking structure of phpBB: a topic might have twenty posts, and each post had a unique URL which pulled in the content for the entire topic. Wget does allow you to filter the URLs it downloads, but it doesn't filter on the query string, which is what I needed.
Zend_HTTP
In the end I created a custom script to handle crawling the site with Zend_HTTP handling the actual HTTP requests.
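The script itself was specific to that forum, but the general shape was something like the sketch below. The login URL, form field names and the 't' query parameter are placeholders rather than exact phpBB details; the important points are the shared cookie jar, which keeps the session alive, and checking the query string so that each topic is only fetched once.

$client = new Zend_Http_Client();
$client->setCookieJar(); // keep the forum session cookie between requests

// Log in once at the start of the crawl
$client->setUri('http://forum.example.com/ucp.php?mode=login');
$client->setParameterPost(array(
    'username' => 'crawler',
    'password' => 'secret',
    'login'    => 'Login',
));
$client->request(Zend_Http_Client::POST);

$seenTopics = array();
$queue      = array('http://forum.example.com/index.php');

while ($url = array_shift($queue)) {
    // Only fetch each topic once, whichever post within it the link points at
    parse_str((string) parse_url($url, PHP_URL_QUERY), $query);
    if (isset($query['t'])) {
        if (isset($seenTopics[$query['t']])) {
            continue;
        }
        $seenTopics[$query['t']] = true;
    }

    $client->resetParameters();
    $client->setUri($url);
    $html = $client->request(Zend_Http_Client::GET)->getBody();

    file_put_contents('mirror/' . md5($url) . '.html', $html);
    // ... parse $html for further viewforum/viewtopic links and push them onto $queue
}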
Scraping HTML
Running each downloaded page through the PHP Tidy extension and then feeding the resulting text to SimpleXML worked in most, though not all, cases. Since the barcamp I have re-implemented this part of the project in Python with BeautifulSoup, which was able to handle all of the downloaded pages.
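Roughly, the PHP version of this step looked like the sketch below. The "postbody" class and the XPath expression are placeholders for whatever markup your particular phpBB theme produces; the point is that Tidy turns the invalid HTML into XHTML that SimpleXML will accept.

$raw = file_get_contents('mirror/some-page.html');

// Clean the invalid markup into well-formed XHTML
$clean = tidy_repair_string($raw, array(
    'output-xhtml'     => true,
    'numeric-entities' => true, // avoid named entities SimpleXML doesn't know
), 'utf8');

$xml = simplexml_load_string($clean);
if ($xml !== false) {
    // XHTML carries a default namespace, so register it before using XPath
    $xml->registerXPathNamespace('x', 'http://www.w3.org/1999/xhtml');
    foreach ($xml->xpath('//x:div[@class="postbody"]') as $post) {
        $text = trim(dom_import_simplexml($post)->textContent);
        // ... insert $text into the MySQL table that Sphinx will index
    }
}
// a handful of pages never parsed cleanly even after Tidy; those were the ones
// BeautifulSoup later handled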
Releasing Sphinx
I considered three options for the search engine: two Lucene-based projects, Zend_Search and Solr, and then Sphinx. I had heard that Zend_Search would be rather slow at indexing, and I felt that Solr, although the most powerful option, would be overly complex for my needs. I therefore decided to give Sphinx a try.
Sphinx is set up to pull content straight from a MySQL database, so once I had a complete copy of the forum locally I extracted all the posts into a MySQL table ready to be fed to Sphinx. First, though, I had to get Sphinx talking to MySQL. To fetch content directly from a MySQL database Sphinx requires the mysql-devel package to be installed, and a search using apt-get couldn't find the package. Fortunately a quick Google search turned up this page, which suggested a fix.
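With MySQL support in place, the relevant part of sphinx.conf is just a source block pointing at the table of scraped posts and an index built from it. The sketch below uses made-up database, table and column names; it is the structure rather than the values that matters.

# cut-down sphinx.conf sketch; names are illustrative only
source forum_posts
{
    type          = mysql
    sql_host      = localhost
    sql_user      = sphinx
    sql_pass      = secret
    sql_db        = forum_mirror
    sql_query     = SELECT id, topic_id, title, body FROM posts
    sql_attr_uint = topic_id
}

index forum_posts
{
    source = forum_posts
    path   = /var/data/sphinx/forum_posts
}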
Beyond getting MySQL support working, the only other problem was that the paths in the manual didn't match the paths to the Sphinx executables on my system. The API and the example configuration files could easily be adapted to my needs and I quickly had a working search engine. The rest of the slides are self-explanatory so I'll stop here. If you have any questions post them in the comments below and I'll do my best to answer them.
I went down to Newcastle today for the Maker Faire, which is being run as part of the Newcastle Science Fest '10. I went down to participate last year and have been looking forward to this year's event since. The Maker Faire isn't a full-on technical meetup; there is a lot of emphasis on family-fun-style activities. Having said that, there are plenty of people working on some really interesting things. Below I've included a video highlighting some of what caught my eye. I'm heading down again next weekend for the BarCampNorthEast3 event. Again, I attended last year and had a fantastic time. As I'm writing this there are still 28 tickets available, so if you're in or can get to Newcastle do check it out. In addition to the exhibitors in the video there are also several I would particularly like to highlight but couldn't include, principally due to my poor photography skills:
The Webcycle is an interesting project which limits the speed of your internet connection based on how hard you're cycling. The faster you cycle the faster the internet connection. Also here.
The Curiosity Collective took the google maps directions a little too literally and ended up travelling to Newcastle from Ipswich via The Netherlands. They had a variety of projects this year around the concept of mapping.
Sugru was in attendance demoing their moldable silicone. Definitely the type of thing worth keeping with the Duct tape.
CamBam were showing what their software could do when coupled to a CNC milling machine.
Hexapodrobot had a number of their robots on their stand. The finish on them really looked very professional.
Brian Degger and Cathal Garvey were showcasing DIY biology, the idea being to take the garage-based startup model from the software world and transfer it to biology. Given that my principal interest is in biology/biochemistry/biomedicine I'm really not sure what to make of this. It definitely needs further consideration; I may well post about it again in the future.
Mike Cook was demonstrating several of his Arduino projects. Definitely worth a look, as are his tutorials.
Oomlout were demoing a number of their projects. It's a wonder they find the time to run what is probably the best UK-based shop for Arduino-related goodness. Check them out.
The instructions on the tarsnap site are really very easy to follow. I was momentarily caught out by not importing the code-signing key, but after getting that sorted out it was fine. I did need to use sha256sum rather than sha256 as suggested. Installation went well and I then had a little play with creating, listing, deleting and recovering data from backups. It was at this point that my only real gripes with the software became obvious: you can't get human-readable size figures when using --list-archives, and there is no short option for --list-archives. As gripes go these are fairly minor, though, and everything else works nicely.
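For anyone who hasn't tried it, the operations I was playing with look roughly like this (the archive names and paths are just placeholders):

# create an archive from a directory
tarsnap -c -f server-2010-03-28 /home/streety

# list the archives stored under this key
tarsnap --list-archives

# list the files inside one archive
tarsnap -t -f server-2010-03-28

# restore an archive into the current directory
tarsnap -x -f server-2010-03-28

# delete an archive
tarsnap -d -f server-2010-03-28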
With the tarsnap client running on my server it was time to automate my backups. I put together a small script which creates a dump of my database and then creates a new backup with the tarsnap client.
#!/bin/bash
dateString=`date +%F`

echo "Beginning backup for $dateString" >> /home/streety/sources/backup/tarsnap.log

# dump the mysql database
rm -f /home/streety/mysql-backup.sql
mysqldump --user=backup -ppassword --all-databases > /home/streety/mysql-backup.sql

# backup to tarsnap
tarsnap -c -f linode-jscom-$dateString /home/streety /etc/apache2

echo "Backup complete for $dateString" >> /home/streety/sources/backup/tarsnap.log
That script worked fine when I ran it from the shell but cron didn't seem to be running it. I needed to specify the path to the tarsnap script. Easily enough done.
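In case it saves anyone else the head-scratching: cron runs with a very limited PATH, so either the script needs the full path to the tarsnap binary or the crontab needs a PATH line. Something along these lines works; the schedule, PATH and script name below are illustrative.

# crontab entry (crontab -e); adjust the schedule and paths to suit
PATH=/usr/local/bin:/usr/bin:/bin
0 2 * * * /home/streety/sources/backup/backup.sh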
The original key is then removed from the system and kept in a secure place. The new limited key should allow us to create and read from backups but not to delete them.
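Generating the restricted key is handled by tarsnap-keymgmt: given the original key, it writes out a new key file with only the permissions you ask for. The file locations below are illustrative.

# write a key that can create (-w) and read (-r) archives, but not delete them
tarsnap-keymgmt --outkeyfile /root/tarsnap-limited.key -r -w /root/tarsnap.key

# point the client at the limited key; the full key can now be moved off the server
tarsnap --keyfile /root/tarsnap-limited.key --list-archives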
Testing it out, I kept forgetting to use sudo, but it all works: I can create backups and list the existing ones, but I can't delete them, at least not from this server. Success.
I've been running this script for a little more than a month now and so far I'm very happy with it.
I've heard enough horror stories about lost data to know that backups are important. For the files that mainly live on my laptop I use Jungledisk to automatically back up the important ones daily. At the time I signed up the program cost $20 and my storage costs are about $0.50 a month; today you have to pay at least $2/month plus the storage fees. Not bad, but after a couple of years those fees are going to add up. When I used shared hosting I sporadically backed up the files and emailed a database dump to my Gmail account daily. This worked perfectly well, as the files rarely changed and Gmail could hold several hundred copies of the database for my little blog. When I needed to I could simply go in and delete last year's backup emails. Recently though I've started renting a VPS from Linode, and I'm now in the position where both the files and the database are changing frequently. I need a way to back up both, and as I'm lazy I want it automated. I started looking around for how other people were handling this.
The Plan
I came across a post from John Eberly discussing how he automates his backups to Amazon S3. This looked like a good place to start, but I was sceptical about how rsync would work with Amazon S3 as described, and it kept only a single backup. Based on this I formulated the following plan: at the start of each week, copy the directories to be backed up to a temporary directory using rsync and encrypt them with GnuPG, then push the resulting file to Amazon S3. On each subsequent day, make a differential backup using rsync's batch mode, encrypt and push to S3. Repeat at the start of the next week. After putting together a surprisingly short script I had a working approach. Except nothing was actually being pushed to S3. I still need to investigate why, but it isn't at the top of my list of things to do as I have since found a far better way to handle my backups.
Tarsnap
I'm not an expert at backups. Nor am I a security expert. Nor am I interested in becoming an expert at either backups or security. This means someone has likely already built a better backup utility than I could. I believe I have found it in Tarsnap. Below is a list of what tarsnap does. I've highlighted the features which take it above and beyond my approach.
Multiple backups
Backups on my schedule
Files are encrypted
Utility pricing - pay only for what you use with no standing charges
Open source - I can check that only what I want to happen is really happening
Efficient - backups take up no more space than my full+differential strategy and yet each backup can be manipulated independently of any other backup
Permissions - With tarsnap I can allow my server to create and read backups but not delete them
The efficiency is nice, but the difference between $0.50/month and $0.60/month isn't a massive deal. What is a big deal is the permissions. Backing up my files anywhere with an online connection has always made me slightly uneasy. Email works well because once an email is sent it can't be called back. If you back up to Amazon S3 you have to grant unrestricted access to read, edit and delete, which means it is possible to lose all your backups. Tarsnap is not vulnerable to this weakness, and that is a big deal. It's one less thing to worry about, which is certainly worth the $0.15/GB premium over S3 alone. My next post will detail how I have implemented backups using tarsnap.