Third Mac mini

So, after four and a half years of solid service from my second Mac mini (a 2.53Ghz Core 2 Duo polycarbonate), which followed three and a half years of solid service from my first (a 1.5Ghz Core Solo polycarbonate), I now have a third. It’s an aluminium unibody Intel Core i7 (2.7Ghz) and I’ve pimped it out with 16Gb of RAM and a 512Gb SSD, which is as fast as this model will ever go. It’s quite a departure from its predecessor in terms of speed and usability.


I was going to splash out on a new iMac, but something stopped me at the last minute (most likely the imagined image of the credit card bill arriving) and then the opportunity to acquire this Mac mini presented itself, so I’ve saved myself a fair whack of cash. I’m still running it on the two 1600×1200 20″ monitors I bought ten years ago, which, because I spent rather a lot of money on them in 2004, are of very good quality and simply refuse to die. Buying an iMac would have made these old faithfuls redundant. The only real feature I’ve sacrificed in not buying the iMac is an up-to-date graphics card, which would have been nice, but an extravagant indulgence given how rarely I actually play demanding games.

This Mac mini should last me at least until the end of 2016, at which point I will consider my options again. I expect the monitors will probably last until then too, by which time they’ll be even more old-fashioned but even harder to retire, due to my irrational loyalty to their enduring service.

In other news I now have a Windows PC on my desk at work. This is the only statement I’m willing to make about it.


Driven to drop Google Drive for Dropbox

Cloud computing is a wonderful thing, whether you are a business or a consumer. It isn’t the answer to everything, but it has certainly solved some common problems, not least the issue of back-ups. These days, for a few dollars per month, anybody can transparently back up most if not all of their important files to servers on the Internet and have those files synchronised between multiple computers and mobile devices such as smartphones and tablets.

There’s also no shortage of companies willing to offer their cloud storage services. Some services, like Amazon’s S3 service, are geared towards developers for integration into software (although Amazon now have a consumer offering), but there are many aimed at consumers who want a simple way of achieving transparent backup of their personal files. Microsoft, Symantec and Google all offer solutions, although not all are cross-platform.

Google Drive

Up until last week I used Google Drive, having taken up the service when it launched earlier in the year. It costs $4.99 per month for 100Gb of storage and comes with software which you install on your computer to automatically manage the synchronisation of your files, so long as you save them in the special “Google Drive” directory.

However, Google Drive was not without its problems from the very start. The software is not particularly well written and has some apparent bugs. It suffers from massive memory management problems and is prone to crashing without warning. This was especially annoying during my initial upload of files, which would have taken around a week had the software remained running; instead, it would quit every few hours. Because I was either asleep or away from home and unable to keep restarting it each time it crashed, my initial upload took far longer.

But it got there in the end, and for around six months it successfully kept my files safe and synchronised between my computers. I still had the memory issues (it typically used between 700Mb and 1Gb of RAM even when idle), so I often found myself having to quit the software in order to free up some RAM when I needed it. This wasn’t ideal, as it meant I had to remember to restart Google Drive to ensure my files were kept up to date, but I lived with it.

Restoration test

Then, at the end of November, came a real test of the value of Google Drive. The hard disk in my desktop Mac Mini developed unrecoverable hardware problems, and I had to replace it. Although this was a time-consuming process it was not a disaster for me as I had all my important data in one cloud service or another. I have all my music on iTunes Match, all my development work on Github and all other files that I would be upset about losing in Google Drive. I have other files that aren’t on any cloud service stored on an external hard drive; these are files that could be replaced relatively easily if I had to and it’s not worth backing them up.

So I merrily removed the old hard disk without attempting to remove any of my data from it and installed the new one in its place (putty knives and swearing are always involved when upgrading an old-shape Mac Mini). I installed the operating system and all my software from scratch on the new hard disk and then began the process of restoring my data from the various cloud services. Github and iTunes Match worked like a charm straight off the bat, but Google Drive was, unfortunately, an entirely different story.

I installed the latest version of the software and entered my Google account details. It thought about it for a bit, allocated itself a whopping 3.25Gb of RAM, and then started to download my files. “OK”, I thought, “the RAM thing is even more annoying than it was before, but whatever”, and left it to do its thing. After downloading around 700Mb, it displayed a window saying that “an unknown issue occurred and Google Drive needs to quit”. The window also said that if this happens repeatedly I should disconnect my account.

It did this seven further times. Each time it downloaded around 100Mb of data before displaying the error again, and after the seventh time it downloaded no more data, no matter how many more times I ran it. It had retrieved only 1.3Gb of my 55Gb of data. So I tried disconnecting my account and logging in again. It insisted on starting the download from scratch, forcing me to discard the 1.3Gb already downloaded, and then did exactly the same thing: repeated errors, then “maxing out” at around 1.3Gb of files after numerous restarts. It was, frankly, ridiculous.

Out of frustration I called upon Google’s support, to which, as a paying customer, I was entitled. Their suggestion, which arrived 48 hours later, was to uninstall and re-install the software. Needless to say I was not particularly impressed. I did not believe for a second that this would fix the problem; I was simply being taken through a standard support script. This was the final straw with Google Drive: after all the upload issues and memory issues, now an apparent inability to restore from my precious backup when I needed to.

If the console messages were anything to go by, I am 99% sure that it was crashing due to poor memory management (i.e. it was running out of memory). On that basis, following their reinstallation advice would have been a waste of my time, as would attempting to explain my technical suspicions to them. I needed my files back and I needed my cloud service back, on my timescale and not on Google’s.


I am fortunate to own two computers, and this was my saving grace. I still had the copy of the Google Drive directory on my other computer, so I still had a local and up-to-date copy of all my files. If, however, I had only one computer, I would have been entirely at the mercy of Google to get my files back. That was not something I was comfortable with, so I decided I had two choices:

  1. Persevere with Google’s support and, assuming they manage to fix the issue, continue to tolerate their piss-poor software going forward.
  2. Use the other copy of my files I had, find an alternative cloud storage service, upload them to it, and dump Google Drive.

I chose the latter. I had heard good things about Dropbox. They are a small firm for whom online storage is the entire business, rather than just another product, as it is for Google. It is absolutely in their interest to get their offering right, because if they don’t, they don’t have a dominant global search engine business (for example) to fall back upon. I wouldn’t be surprised if Google Drive grew half-arsed out of a project that a Google developer created on his “do your own thing” day of the week, a privilege extended to Google developers as standard, to the envy of most others.

Dropbox is twice the price of Google Drive, costing $9.99 per month for 100Gb instead of $4.99. This isn’t a high price to pay for a reliable solution in my opinion. Like Google Drive, it too comes with software to be installed on your computer(s) which creates a special directory into which you save your files and it sits there in the background and uploads and downloads files as required. The difference between the Dropbox software and the Google Drive software is that the Dropbox software does so without using all your RAM and without quitting every few hours. Amazeballs!

It took around 7 days to upload my files to Dropbox, during which the software did not crash even once and used no more than 400Mb of RAM at its peak. I was supremely impressed with this; by contrast, Google Drive’s memory management was so poor that it never released memory it no longer needed, so its RAM usage just kept going up and up. This is how Google Drive should have been from the very beginning, and the fact that Dropbox can do it means there is no excuse for Google Drive not to. I am currently in the process of downloading these newly-uploaded files to my other computer en masse and, guess what, still no crashes, and it doesn’t seem to think that downloading 55Gb is a somehow insurmountable task, so it doesn’t give up after the first 1.3Gb.

Other things I like about Dropbox:

  1. Great mobile app for iPhone and iPad. This, too, Just Works, and allows viewing of a wide range of file types. It also backs up the camera photos from each device, which is a nice touch.
  2. It has an API, which allows it to be integrated into other software and services, such as IFTTT. This is more exciting for me than it probably would be for most people, but it’s something that Google Drive doesn’t have.

Of course, Dropbox may well have problems of its own which are not yet apparent. If any transpire I will of course report on them, but my initial tests and use of the service are very promising, and certainly far better than the comparable early days with Google Drive.

So there you are. If you’re looking for advice on which cloud backup service to use, I recommend Dropbox. It’s compatible with Mac OS, Linux, Microsoft Windows, iOS (iPhone, iPad) and Android. Enjoy.


Mac, Apache, MySQL and PHP (MAMP)

Mac OS X serves as an excellent development environment, even if you are not actually developing Mac OS or iOS applications. It is the darling of many a LAMP (Linux, Apache, MySQL and PHP) developer who enjoys a slick desktop operating system with good UNIX-like underpinnings but doesn’t necessarily want to put up with all the various limitations and complications that running a Linux desktop brings, consistent improvements in this regard over recent years notwithstanding.

The only trouble with this is that if you want to develop LAMP applications and work on a Mac then traditionally you’ve needed a two-box setup: a Mac on your desk and Linux on a development server. For many this isn’t an issue, and indeed when you’ve got a team of developers it’s optimal, but what if you wanted a self-contained development environment restricted to just one box? What if you wanted that box to be your laptop so you could take it anywhere?


“Virtual machine!”, I hear you cry. Yes, this is a possible solution, and for many it works well. Good virtualisation software is free these days, but using a local VM is cumbersome. Not only does it consume a large slice of your RAM, it also puts a lot of strain on the CPU, meaning that if you are running off your battery your battery life will be decreased. You have to start up the VM whenever you need it, and there can be complications with the networking; for example, if you have connected to a public wireless network, it’s possible that your VM might not be extended the same access.

There is a software package for Mac OS called MAMP (the M for Mac OS replacing the L for Linux). This is a point-and-click installer which bundles Apache, MySQL and PHP for installation on Mac OS. I don’t like this solution, for a number of reasons, including:

  1. Limited functionality unless you “go pro” (at quite considerable cost). Any self-respecting developer will require multiple virtual hosts as a minimum and won’t need or want a clicky-button interface to get what they want.
  2. You are entirely at the mercy of the distributors of MAMP with regards to component software versions that are made available to you and when.

Alternative solution

There’s an alternative to this. You don’t have to fork out £39 for a package of what is otherwise freely and widely available software. With the help of my friend and colleague Ben Nimmo I present the following assembled and tested instructions for turning your Mac into a native MAMP server without using the packaged download.


MySQL

  1. Download the latest MySQL .dmg and install both the *.pkgs within it (don’t use the TAR/GZ archives). You may wish to install the Workbench too; it’s really good these days.
  2. Find where the mysql.sock file is expected to be in /etc/php.ini (should be /var/mysql/mysql.sock)
  3. Create the folder and link the socket file to the expected location.
sudo mkdir /var/mysql
sudo ln -s /private/tmp/mysql.sock /var/mysql/mysql.sock
  4. Add MySQL to the command line by editing /Users/username/.bash_profile, adding this line, and then either restarting Terminal or source-ing the file:
export PATH=$PATH:/usr/local/mysql/bin
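If you want the new PATH available immediately, the same export can be applied to the current shell and sanity-checked before (or instead of) source-ing .bash_profile; a quick sketch:

```shell
# Make MySQL's client tools visible to this shell session
export PATH=$PATH:/usr/local/mysql/bin

# Confirm the directory is now on the PATH
echo "$PATH" | grep -q "/usr/local/mysql/bin" && echo "PATH updated"
```

If `mysql --version` then resolves, you’re all set.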


PHP comes with Mac OS, so it’s not necessary to download and install it. However, there are a couple of steps necessary to configure it:

  1. Copy the default php.ini file:
sudo cp /etc/php.ini.default /etc/php.ini
  2. Edit /etc/php.ini and uncomment the xdebug zend_extension line to enable xdebug (not essential, but recommended):
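For reference, on a Snow Leopard-era install the line in question looks something like the following; the exact extension path varies by OS release, so uncomment whatever your own php.ini actually contains rather than copying this verbatim:

```ini
zend_extension="/usr/lib/php/extensions/no-debug-non-zts-20090626/xdebug.so"
```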


Apache too comes with Mac OS, so again, no need to download and install it. Its configuration, however, is a little more complex, but nothing scary. The described configuration will provide a special Apache “sandbox” environment for your projects. It uses the existing “Sites” directory in your Mac OS home directory.

  1. Create a subdirectory in this directory for each of your projects, ensuring that the directory name does not contain any characters that would be illegal in a URL. Within each of these subdirectories create another subdirectory called “web”; this will become the web root of each project. The extra subdirectory is in case you wish to use a framework in your projects which may keep some of its files outside of the web server root (Symfony is a good example of this).
  2. Create a subdirectory called “logs” in your “Sites” directory; Apache will maintain two log files, access and error, for all the sandbox sites.
  3. Enable PHP5 with Apache by editing /etc/apache2/httpd.conf and uncomment the following line:
LoadModule php5_module libexec/apache2/libphp5.so
  4. Change the user and group to your username and “staff” respectively, also in /etc/apache2/httpd.conf:
User sbf
Group staff
  5. While still in /etc/apache2/httpd.conf, find the following configuration and change “Deny from all” to “Allow from all”:
<Directory />
    Options FollowSymLinks
    AllowOverride None
    Order deny,allow
    Deny from all
</Directory>
  6. Create and edit /etc/apache2/users/user.conf with the following, changing “sbf” to your username:
<VirtualHost *:80>

    ServerName dev.local
    DocumentRoot /Users/sbf/Sites/

    RewriteEngine on
    RewriteLogLevel 1
    RewriteLog /var/log/apache2/rewrite.log

    # sites in the format http://[site].dev.local
    RewriteCond %{HTTP_HOST} ^[^.]+\.dev\.local
    RewriteCond %{REQUEST_URI} !^/error/.*
    RewriteCond %{REQUEST_URI} !^/icons/.*
    RewriteRule ^(.+) %{HTTP_HOST}$1 [C]
    RewriteRule ^([^.]+)\.dev\.local/(.*) /Users/sbf/Sites/$1/web/$2

    # Logging
    CustomLog /Users/sbf/Sites/logs/sandbox.access.log combined
    ErrorLog /Users/sbf/Sites/logs/sandbox.error.log

</VirtualHost>

  7. Restart Apache:
sudo apachectl restart

Then, for each of your sites, add an entry in /etc/hosts in the format “name.dev.local”, pointing to 127.0.0.1, where “name” corresponds to a subdirectory in your “Sites” directory. Don’t forget that the public subdirectory of each site is assumed to be “web”, so make a symlink to this if the framework you use has a different convention.
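As a concrete example, assuming a hypothetical project living in ~/Sites/myapp/web, the hosts entry would be:

```
127.0.0.1    myapp.dev.local
```

The site would then be served at http://myapp.dev.local/.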

You should then be able to access each of your sites from URLs using the convention http://name.dev.local/ – where “name” again is a subdirectory within your “Sites” directory.

I’ve tested this setup procedure and It Works For Me [tm]. If, however, it doesn’t quite work for you as described, please let me know where you’re going wrong and how, if you were able, to resolve it, and I will update these instructions accordingly.


Python SimpleHTTPServer

I don’t use Python on a daily basis, barely at all in fact (knowingly), but I stumbled across a handy purpose for it today which I can see myself using. Type this command in a Linux shell or a Mac OS terminal and it will start a local web server on port 8000, using the current working directory as the document root and logging access requests to STDOUT:

python -m SimpleHTTPServer 8000
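As an aside, if you happen to have Python 3 on your machine, the module was renamed there, so the equivalent command is:

```shell
python3 -m http.server 8000
```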

Potential uses include:

  1. Developing web applications on the local filesystem which use no server-side processing but do use Javascript and AJAX calls that a web browser would not normally allow without being started with special flags (anyone who’s seen “XMLHttpRequest cannot load $RESOURCE. Origin null is not allowed by Access-Control-Allow-Origin” in their error console will know what I’m talking about).
  2. Quickly sharing a directory with a colleague on the same network rather than uploading files to a shared resource such as a file server or e-mail system. Obviously you’d need to ensure that whatever port you choose is allowed in your local incoming firewall if you have one.

There’s even better news for PHP users. PHP 5.4, released this week, includes a built-in web server which can be started with an equivalent command. The advantage of this, of course, is that it will also process PHP scripts instead of being limited to static files.
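For the curious, the PHP equivalent uses the -S flag, serving the current directory and executing any PHP files within it:

```shell
php -S localhost:8000
```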


New life for old Mac mini

Mac Mini (original case)

18 months ago I replaced my aged 2006 Mac Mini with an up-to-date model, which is still my main desktop computer today. The old Mac Mini was relegated to being a quasi-media centre, but because it was actually a desktop computer it wasn’t a very good media centre, and due to its age it was no longer a very good desktop computer either, which is why I replaced it.

I never used it as a media centre beyond the odd occasion, and it’s spent the last 18 months mostly consuming enough power to sleep and collecting dust. Until this evening, that is. I’m moving again soon (which I’ll cover in full in a different post) and I’m trying to take as little as possible with me. I’ve been using an old Dell PC as a local Linux development server, which isn’t anything special but does the job nicely. There are three problems with it, however: it’s as ugly as hell, it chews through electricity because it was manufactured at a time when computer manufacturers thought it grew on trees, and it belongs to my housemate.

I don’t really want to take it with me when I move for all of those reasons, although I’m sure the last one could be eliminated with £30 or so. Then I remembered that I had this entirely idle old Mac Mini tucked away in a corner of my network doing nothing. I wondered if it would accept an installation of Ubuntu Server, given that it’s an Intel-based Mac (the original Intel Core Solo model, no less). Sure enough, it turns out that it can, and it works a treat.

My old Mac Mini has a 60Gb hard disk and 1.25Gb of RAM. It’s not going to break any records with its single-core 1.5Ghz processor, but for running a local Apache2 server it’s all I need. The only caveat is that it won’t boot into Linux straight from the hard disk on its own; I have to keep a CD with rEFIt on it in the CD drive for that. It’s certainly not the end of the world.

From a cold boot to a login prompt with all services started it uses just 85Mb of RAM, and with all the software I need on it and my Git repositories in place it’s using just 2.5Gb of its hard disk. All this on a 65 watt power supply. In addition, despite being very different internally to my new Mac Mini, the two look identical from the outside and so look pretty good stacked on top of each other.

So don’t throw out your old Mac Mini, give it a proper job to see out its old age! The only thing I can’t do with this which I was thinking about doing with the Dell PC was putting an x100P card in it. I’ll live.


Mac gaming spree

I’ve recently been surrendering large amounts of my spare weekend time to playing computer games after a hiatus of several years. Back when I had a PC I used to play computer games quite a lot, but since switching to Mac it has, until recently, been more difficult to do so, partly due to inadequate graphics hardware but mostly due to the fact that traditionally there simply weren’t that many decent games available for Mac OS X.

I have two computers: a Mac Mini with a 2.53Ghz Core 2 Duo processor and a 256Mb Nvidia Geforce 9400 graphics controller, and a Macbook Pro with a 2.8Ghz Core 2 Duo processor, which actually has two graphics controllers. It has the same controller as the Mac Mini for “normal” operations, plus an extra 512Mb Nvidia Geforce 9600 GT controller which you switch on (requiring a logout instead of a reboot) when you want some serious graphics grunt. The reason why it doesn’t just use the super-duper one all the time is that it absolutely hobbles the otherwise excellent battery life, so you only enable it when you really need it.

I’ve mainly been playing Half Life 2, which is available for the Mac along with a plethora of other games via the Steam platform. Half Life 2 really puts my Macbook to the test, but it fares very well: I’m able to play the game at full screen resolution with nearly all the graphics settings turned up to maximum (meaning that it renders very pretty scenes) and still get a consistent frame rate of between 30 and 60 frames per second (FPS), which is good enough for me. The computer gets jolly hot whilst it’s doing this but appears to be designed to deal with it.

The other thing I’ve been playing is an old favourite from the turn of the century, Quake III Arena, the source code for which is now freely available and can be easily compiled on Mac OS X. All you need are the PAK files from the original game disc (as the content in these files is still under copyright). This game runs at a consistent, unwavering 90 FPS even on my Mac Mini’s relatively humble graphics controller with all the graphics settings turned up to maximum. It’s by no means a clever game, but it’s an awful lot of fun if you just want to blow off some steam in an unapologetic shoot-’em-up.

I’d really love to get Grand Theft Auto: San Andreas working too. There is a way, apparently, but it’s flaky and difficult to play on a laptop, and I can’t enjoy that game without all the mods and cheats I used to use, none of which will work on a Mac even if the game does. It’d be fabulous if other games publishers in addition to Aspyr used something like Steam to distribute their games to multiple platforms. It’s clearly a system that’s working very well, and I think publishers need to take the Mac platform more seriously as it gets more and more popular, especially amongst younger people, who are their principal market.

If I had more time I would probably play games a lot more, as they’re a great (and relatively cheap) way to escape and blow off some steam. That said, I wouldn’t want to spend every spare minute playing them; I know what happens to people who do that.


MIDI connection between digital piano and Macbook Pro

Carrie always was fond of the piano stool.

Right, I’ve torn my hair out over this for long enough now and so I’m pleading for help. I must be missing something very basic and elementary, but I cannot for the life of me find what it is.

First some back-story, which you will need to know if you don’t follow me on Twitter. I have recently inherited my late mother’s Technics digital piano (model PX9), as my Dad is moving house and there is no room for it at his new place. I’ve had my eye on it for some years now and my current home does have room for it, so he brought it up last week for me. Given that I cannot actually play the piano, I wish to connect it to my laptop using its MIDI ports.

My reasons for this are simple. All I want to do is:

  1. Download music from the Internet as MIDI files.
  2. Connect my laptop (Macbook Pro, Mac OS X 10.6.6) to the piano using a USB MIDI interface.
  3. Play the MIDI files on the digital piano from the laptop.

So far, I have achieved the first two goals, but I am having serious trouble with the third. Please note that I do not wish to compose music, create MIDI files or use the digital piano as a MIDI keyboard, all I want to do is have the laptop play the piano.

I have tried using the following software to achieve this:

  • iTunes – will play MIDI files, but not via MIDI, only using the computer’s speakers.
  • GarageBand – will play MIDI files on the laptop and will use the piano as a MIDI keyboard. I see MIDI signals being recognised, but it will not use the piano as a MIDI output and I cannot find any settings or options to that effect. Various Google searches suggest that GarageBand does not support MIDI output, despite supporting MIDI input.
  • Reason – this baffled the hell out of me; I couldn’t even load my MIDI file into it, much less find any MIDI output options.
  • Logic Express 9 – again, this is a complicated piece of professional software and I still could not find any MIDI output options. This surprised me given that this is supposed to be Apple’s professional composition software (in contrast to GarageBand which is aimed at amateurs), so I may well have missed them somewhere.

My question to those who know about this sort of thing is simple: How do I achieve what I want to do? What software do I need and which settings do I need to set? Surely it cannot be that difficult? I would imagine that it would be a case of having an option somewhere that changes the output audio device from the local sound card to a MIDI device. I know it’s possible because, back in the day, I had a similar interface for my Acorn Archimedes, and I distinctly remember achieving this with some considerable ease using basic bundled software.

I have confirmed that my USB MIDI interface is working correctly and that it is connected to the digital piano correctly; the fact that GarageBand recognises input signals from the piano confirms this. I would welcome advice from anyone who can help.

Incidentally, I think it’s amazing that an up-to-date laptop is able to connect and talk to a 24-year-old piece of equipment using nothing more than a smart cable that cost £2.50 from Amazon.


Command line Twitter script

I mentioned on Twitter the other day that I have a relatively basic but functional command line Twitter script, for use when you can’t or just don’t want to load the full-fat Twitter site in your web browser or you don’t want to use a third party GUI client. I received much more interest in this than I thought I would (i.e. more than zero) and so after some thought* I’ve decided to make it available to anyone who wants it.

The script, which supports a single Twitter account, supports the following actions:

  • Update (tweet), with option to specify an existing update ID to reply to.
  • Retrieve your public timeline (your tweets and those of the people you follow).
  • Retrieve your own timeline.
  • Retrieve the public timeline.
  • Retrieve your recent mentions.
  • Retrieve recent re-tweets of your updates.
  • Search Twitter.

In order to work, you must authorise your Twitter account with SuperTweet and provide the script with the username and secret that you specify in your SuperTweet account. This is because the script does not support OAuth (at least, not yet). Also, if you use Twitter to post URLs (and you probably do) you will also require a Bit.ly API key. Edit the script and provide both sets of credentials at the top.

The script is self-contained, with all the various classes it depends on in the same file. I’ve verified that it works “out of the box” on Ubuntu Linux 10.04 and Mac OS 10.6. It will probably work on many other systems too, assuming they have PHP installed. With a small modification to the first line you can probably get it to work on Windows too, but that’s as far as I can advise; you’re on your own from there. The version I use doesn’t contain the included classes, as I link it to my local class library.

To get started, download the script, unpack it with gzip -d, add execute permissions with chmod +x and then type ./twitter.php commands for a usage summary. Some commands when called without arguments will present further usage summaries which will tell you how to use them. You’ll probably want to start with something like:

./twitter.php update "Testing @stuartford's pitiful command line Twitter script."

Don’t forget to add your Twitter and Bit.ly credentials to the top of the script otherwise it definitely won’t work first time for you and I’d rather that didn’t happen.

If you don’t understand most of this post then this script probably isn’t for you, sorry, it is what it is, no warranty, etc.

* I say it required some thought because it’s rare that I make my code available for public use. I don’t know why, because I believe I am a talented software developer; I guess it might be the slight family creative gene within me forcing a behaviour equivalent to that of an artist who’s perpetually reluctant to show people his work. Who knows; that’s one for the shrink’s chair. Certainly the script isn’t my best work. It grew out of something quick and dirty, and as any developer will tell you, anything that grows out of something quick and dirty will always be quick and dirty at its heart.
