Friday 13 November 2015

Fixing 'Driver: San Francisco' in OS X 10.10.5

The 10.10.5 update for OS X included a newer version of cURL. This change breaks 'Driver: San Francisco' (via Steam).

Luckily I worked out a fairly simple fix. In short, when running the game you need to override the default OS X libcurl library with an older version. The newer version that comes with OS X 10.10.5 remains in place and continues to work for everything else.

Download the curl v83.1.2 source

cd /tmp
curl -O

Extract the source

tar xvfz curl-83.1.2.tar.gz
cd curl-83.1.2/curl

Compile curl for the i386 architecture (to match 'Driver: San Francisco', which is a 32-bit binary)

export CFLAGS="-arch i386"
export LDFLAGS="-arch i386"
./configure --with-gssapi --enable-hidden-symbols --disable-static --enable-threaded-resolver --with-darwinssl --without-libssh2

Install the older version of curl in /usr/local

make
sudo make install

Launch 'Driver: San Francisco' with the newly compiled older library

cd ~
DYLD_LIBRARY_PATH=/usr/local/lib ~/Library/Application\ Support/Steam/steamapps/common/Driver\ San\ Francisco/Driver\ San\

I hope this works for you as well as it worked for me!

Monday 3 March 2014

EASYACC Multi-port USB charger review

EasyAcc 25W USB charger K-5B25
Multi-port USB charger
Purchased: 2014-03-01

First Impressions

  • Nice size for what it is. It's not small but it's not too big either.
  • Good build quality. Feels solid, not cheap -- kinda like an Apple MacBook charger.
  • Port selection seems a little random. 2.1A, 1.3A, 2.1A, 1.0A, 1.0A.
  • There's probably a good explanation for it. Whatever.
  • Clearly labelled ports and their charging values are printed on the back. Handy.
  • Universal power cable. IEC_60320 C7/C8. Yay - standards!
  • No obvious manual. Good. No waste of paper.

That's it. I like it.


I'll see how it goes. But for now, going strong.

Could it be better?

  • They could print a permalink URL on the device pointing to a support and FAQ page.  A link to the online manual is a must.
  • Put the charging ports into an order of some sort. I'm nitpicking.

Monday 10 February 2014

Is Rapid Development a security threat?

The business world borrows Open Source software methodologies so that they can become agile (little a).  Release early, release often.  Continuous builds.  Unit testing.  This works out well.  Lower costs, quicker time-to-market, more reliable, repeatable and redundant.

We're into update cycles now.  We know about version numbers, feature sets, upgrades, updates.  That latest security thing?  Just update it.  Make sure it says v15.0.20140210_Build_29e4.  It's mundane.  One of my browser's plugins wants an update.  And my phone has some.  And my tablet.  TV.  Car.  Meh.  It's probably just a new wallpaper option, anyway.  Via the new in-app store!  I'll reboot on the weekend and it can apply the updates and reboot three times if it wants to.  I'll make some breakfast and eat it with my daughter, watching Regular Show.

But could Rapid Development actually be providing crackers with the exact information they need?

A committed attacker probably has about a week to crack a lot of systems.  But where should a potential cracker go to look for such weaknesses?  Backlogs, issue lists and change logs.  "It crashes right after I click save.  Here's the stacktrace and corresponding log entries: ..."  "Oops.  Forgot to uncomment the call to the fn() that cleans up all the tmp files before we exit().  Sorry.  Pull request 6ac1649f26daed8e1e0ccafba43568a67ed00686."  Thanks for the exploit, fellas.

Perhaps this is why +Google  never reply to their users?  "Customer service rule 2: Just read what people are complaining about, don't say anything, fix the problem, release the update, announce '..., bug fixes, ...' and move on."

Saturday 19 May 2012

Prevent the FitBit daemon from filling your system logs

On the Mac the FitBit daemon logs to the system.log but it can be a bit verbose.

The following change to the FitBit launch daemon should send the output to its own file:
--- com.fitbit.fitbitd.plist.orig 2012-05-19 09:24:54.000000000 +0100
+++ com.fitbit.fitbitd.plist      2012-05-19 09:06:51.000000000 +0100
@@ -44,5 +44,9 @@
+    <key>StandardErrorPath</key>
+    <string>/var/log/fitbitd.log</string>
+    <key>StandardOutPath</key>
+    <string>/var/log/fitbitd.log</string>
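For context, the added keys sit inside the plist's top-level <dict>, alongside the existing launchd keys. A sketch of the edited file -- the Label value and the surrounding structure shown here are illustrative, not copied from the real plist:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.fitbit.fitbitd</string>
    <!-- ... existing keys ... -->
    <key>StandardOutPath</key>
    <string>/var/log/fitbitd.log</string>
    <key>StandardErrorPath</key>
    <string>/var/log/fitbitd.log</string>
</dict>
</plist>
```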
Don't forget to create the file ahead of time and give it the correct permissions.
$ sudo touch /var/log/fitbitd.log && sudo chown nobody: /var/log/fitbitd.log
Restart fitbitd:
$ cd /Library/LaunchDaemons
$ sudo launchctl unload com.fitbit.fitbitd.plist
$ sudo launchctl load com.fitbit.fitbitd.plist

Monday 12 March 2012

Getting Heimdall to work on your Mac

I have a Samsung Galaxy Android phone and a Mac. I'm a bit of a hacker and like the hardware I own to do as it's told. Unfortunately most of the software available for flashing a Samsung phone (Odin) is for Windows. There's no reason that a Mac can't do these things but most of the hackers out there run Windows or Linux so that's what the software is made for.

However, there are some geniuses out there who wrote a multi-platform piece of software called Heimdall. Unfortunately, like many, I had problems with it and wrote it off as yet-another-piece-of-multi-platform-software-that-doesn't-actually-work.

Specifically, the issue I was having was:

$ heimdall flash --kernel zImage
Heimdall v1.3.1, Copyright © 2010-2011, Benjamin Dobell, Glass Echidna

This software is provided free of charge. Copying and redistribution is

If you appreciate this software and you would like to support future
development please consider donating:

Initialising connection...
Detecting device...
Claiming interface...
ERROR: Claiming interface failed!


One day, however, I'd had enough. I was determined to get to the bottom of the problem and get it working. Luckily, it wasn't that hard.

As a Samsung device owner I had, of course, installed their awful piece of software, Kies. This useless software is the cause of the problem. More accurately, the Kernel Extensions that they load:

$ kextstat | grep -v apple
Index Refs Address    Size       Wired      Name (Version)
  55    0 0x574aa000 0x5000     0x4000     com.roxio.BluRaySupport (1.1.6) <54 53 52 51 49 17 12 11 10 7 6 5 4 3 1>
  70    0 0x57574000 0x3000     0x2000     com.devguru.driver.SamsungComposite (1.2.4) <33 4 3>
  72    0 0x57831000 0x7000     0x6000     com.devguru.driver.SamsungACMData (1.2.4) <71 33 5 4 3>
  94    0 0x57674000 0x3000     0x2000     com.devguru.driver.SamsungACMControl (1.2.4) <33 4 3>
 132    0 0x580ac000 0xd2000    0xd1000    com.vmware.kext.vmx86 (3.1.2) <11 5 4 3 1>
 133    0 0x5779f000 0xc000     0xb000     com.vmware.kext.vmci (3.1.2) <5 4 3 1>
 134    0 0x577ab000 0x6000     0x5000     com.vmware.kext.vmioplug (3.1.2) <33 29 5 4 3 1>
 135    0 0x57745000 0xa000     0x9000     com.vmware.kext.vmnet (3.1.2) <5 4 3 1>
 137    3 0x6c30e000 0x29000    0x28000    org.virtualbox.kext.VBoxDrv (4.0.8) <7 5 4 3 1>
 138    0 0x5804d000 0x7000     0x6000     org.virtualbox.kext.VBoxUSB (4.0.8) <137 44 33 7 5 4 3 1>
 139    0 0x5791d000 0x4000     0x3000     org.virtualbox.kext.VBoxNetFlt (4.0.8) <137 7 5 4 3 1>
 141    0 0x57825000 0x3000     0x2000     org.virtualbox.kext.VBoxNetAdp (4.0.8) <137 5 4 1>

The offending extensions are the com.devguru.driver.Samsung* ones, installed by Kies. These need to be unloaded to get Heimdall to work correctly.

So, let's unload them:

$ sudo kextunload -b com.devguru.driver.SamsungComposite
$ sudo kextunload -b com.devguru.driver.SamsungACMData
$ sudo kextunload -b com.devguru.driver.SamsungACMControl

... and that's it. Heimdall should now work as advertised. At this point I recommend uninstalling Kies and eradicating all evidence of its existence.

Have fun with your rooted and newly flashed Samsung device!

Sunday 30 October 2011

Pulling a file out of a bag

I have been playing with Chef recently in order to replace the hand-crafted automatic deployment system that we've created over the last few years. There are plenty of cookbooks, recipes and examples available on the Internet but I wasn't able to find a recipe that would build a multi-line file from the contents of a data bag. So using a little bit of Ruby I was able to build an array of strings in memory before writing the whole thing out to a file.

Here's my example which builds the /root/.ssh/authorized_keys file from a data bag within Chef:
key_list = []
ssh_users = data_bag("ssh-authorized_keys")
ssh_users.each do |id|
    ssh_user = data_bag_item("ssh-authorized_keys", id)
    key_list << ssh_user["key"]   # assumes each item stores its public key under a "key" attribute
end

file "/root/.ssh/authorized_keys" do
    content key_list.join("\n")
end
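Stripped of the Chef resources, the recipe is just accumulating strings in an array and joining them with newlines. A minimal plain-Ruby sketch of that pattern -- the sample data and the "key" attribute name are made up for illustration:

```ruby
# Stand-ins for data bag items; in Chef these come back from data_bag_item.
items = [
  { "id" => "alice", "key" => "ssh-rsa AAAA... alice@example" },
  { "id" => "bob",   "key" => "ssh-rsa BBBB... bob@example" },
]

# Build the array of lines in memory...
key_list = []
items.each { |item| key_list << item["key"] }

# ...then join them into a single multi-line file body.
content = key_list.join("\n")
puts content
```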

Thursday 28 January 2010


With the recent advance of HD content, storage requirements are growing aggressively and demand more network bandwidth than ever before.  Disks are getting larger and cheaper but most modern consumer computing devices don't have enough internal storage to handle all of our requirements when it comes to photos, music, video, documents, etc.; and, to make matters worse, on a daily basis I use three to four different computers, all of which require backing up in case anything should go wrong.

Right now reliable, safe, long-term data storage means having multiple, redundant disks on the network - which can be a considerable cost.  Our home requires somewhere in the vicinity of 2-4TB of storage to handle everything comfortably, with some room for growth.  Whilst this is not a lot of storage in today's terms, any future-proof device would still be quite large, noisy, hot and have a low spouse acceptance factor.

So, what's the answer?

Cloud computing and storage have always appealed to me, and they've now reached the point where they're just as good as, if not better than, the local alternatives.

The home network will always require some local storage.  Home broadband hasn't yet reached the level where everything can be streamed from the Internet as an on-demand service - not in the UK at any rate (and certainly not in the US or Australia).  So, a local 1-2TB, low-power NAS for transient video data is still needed and, because of the nature of the content, backups aren't required.

Music is roughly in the same camp.  Since most of the music is synchronised with a portable music player, in the event of a disaster a simple re-sync should be sufficient.

Photos are a tricky call and one that I've had to take quite a firm stance on.  Typically, when taking a bunch of photos, only a few are ever really any good and worth showing to people.  What do we do with the rest of them?  They usually languish in a folder somewhere getting pushed from computer to computer, medium to medium, without ever being accessed again.  So, I say, get rid of 'em.  The rest should be put into the cloud for sharing with the rest of humanity.  Flickr, PhotoBucket, Picasa... take your pick.  Let them worry about the storage and bandwidth.

Documents are fairly easy.  There are plenty of services out there that will do simple, cloud-based file storage sitting on top of Amazon's S3 or Rackspace.  Again, take your pick and get them off your local disk.

If you take this approach it becomes fairly obvious that your local PC, laptop or tablet (hello, iPad!) becomes a commodity device, as it simply consists of an OS and a bunch of applications that let you access the cloud and your content.  Nothing of consequence is ever stored locally.  It also means that you become free, untethered from any particular machine or environment.

So, what happens if the cloud disappears?

First of all, all of The Cloud would have to disappear at once, which is unlikely.  Secondly, you can always backup your data.  There are services that offer backups of your cloud-based content.  Backupify are an example of such an organisation.  Their restore and export procedures are not yet complete but their experts will help you out should the worst happen.

As a geek I also tend to have a slightly above-average set of requirements when it comes to the Internet: all of the above as well as running a blog, a micro-blog, having several domains, source code repositories, etc., all of which have to be hosted somewhere.  Previously the sensible option was to host these myself.  However, recently I've come to the conclusion that out-of-the-box software (such as Wordpress & Google Mail) does a pretty good job as it comes and no longer requires me to install, configure, upgrade and maintain everything myself -- leaving me with more time to do other things.

Not only have I saved myself some effort but I've also saved quite a bit of money, as the majority of the cost before was running a server.  The cloud now provides a plethora of services for free (usually subsidised by advertising), with relatively low-cost upgrade paths that also provide scalability, should you need it.

The following table is a simple side-by-side cost comparison for the content that I was hosting myself previously versus the cost, today, of utilising cloud technologies:

Function              | Hosted solution            | Yearly cost              | Cloud solution       | Yearly cost
Server hosting        |                            | £720                     | N/A                  | £0
Email                 | Exim, Dovecot, Horde & Imp | £0                       | Google Apps Mail     | £0
Calendar              | Horde & Kronolith          | £0                       | Google Apps Calendar | £0
Contacts              | Horde & Turba              | £0                       | Google Apps Contacts | £0
Docs and spreadsheets | Microsoft Office           | £160 (£320 over 2 years) | Google Apps Docs     | £0
Blog                  | Wordpress software         | £0                       | Wordpress hosting    | £23 (£7 domain mapping + £16 custom CSS)
Online filesystem     | SSH FS                     | £0                       | Dropbox              | £0
Photos                | Gallery 2                  | £0                       | Flickr Pro           | £16
Encryption            | SSL certificates           | £35                      | (included)           | £0
Backups               | Custom script              | £0                       | Backupify            | £0
Total                 |                            | £915                     |                      | £39

Not bad, eh?
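As a sanity check, the table's totals do add up. A throwaway Ruby snippet with the yearly figures copied from the table, one entry per row from top to bottom:

```ruby
# Yearly costs in pounds: hosted column vs cloud column.
hosted = [720, 0, 0, 0, 160, 0, 0, 0, 35, 0]
cloud  = [0, 0, 0, 0, 0, 7 + 16, 0, 16, 0, 0]

puts "Hosted: £#{hosted.sum}  Cloud: £#{cloud.sum}"
# Hosted: £915  Cloud: £39
```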

I still, however, have some concerns about turning all my data over to the cloud.  Will these companies disappear one day, without a trace - taking my data with them?  Possibly.  Will the data be backed up somewhere?  Maybe. Will I be able to access it?  Probably not.

These questions are unanswerable right now and only time will tell - but I really can't think of a better way of doing it, today, without having a vast, "expensive" storage array at home.  Hopefully, someday soon, there will be a breakthrough in storage technology and I can store my ever-growing digital life safely and locally, somewhere that I trust.  But, if that does happen, you can be sure that content generators will find a way to use it -- resulting in an arms race again, pushing the limits of storage technology ever forward.

I do, however, feel lighter for having made the move.