Now that my website is static again I needed to put together a new script for the hit counter. Apparently the hit-count data from the last time this site was delivered statically wasn't very important to me at some point, because otherwise I would still have it stored somewhere.
So I went to work. Ruby is my favorite scripting language, so I went with it without thinking twice. Admittedly I failed to get FastCGI working, so I settled for plain CGI. Figuring that out again took some time, but I managed - good enough! It's just a visitor counter. Still, at the very least it shouldn't count bot hits. So I took another look at Voight-Kampff and realized it hadn't been updated in a while(1) - no good! Some more digging and I came across CrawlerDetect. It claims to be better than other libraries at the same task, was updated recently, and has a Ruby gem - awesome! What could go wrong?
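For a sense of scale, a bare-bones CGI hit counter in Ruby can be just a handful of lines. This is a hypothetical sketch, not my actual script - the counter file location is made up, and there's no bot filtering or file locking:

```ruby
#!/usr/bin/env ruby
require "tmpdir"

# Hypothetical location for the counter file; a real script would use
# a path inside the site's data directory.
COUNT_FILE = File.join(Dir.tmpdir, "hits.txt")

# Read, increment, and persist the hit count. No locking - fine for a
# low-traffic personal site, but racy under concurrent requests.
def count_hit(path)
  count = (File.read(path).to_i rescue 0) + 1
  File.write(path, count.to_s)
  count
end

# A CGI response is just headers, a blank line, and the body.
puts "Content-Type: text/plain"
puts
puts count_hit(COUNT_FILE)
```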
All I was writing was a simple script. No reason to bother with Rails or Rack. The CrawlerDetect gem worked without them in my local tests, so I seemed good to go!
But when I tried to install its dependencies I ran into trouble. It turned out the crawler_detect gem depended on a gem that wasn't compatible with the Ruby version natively installed on my webserver. But my server runs RVM, so no problem. Right?
Even after making sure my script ran with the right Ruby interpreter, I ran into more trouble: now it wouldn't find the required gems in its load path. Great. But the load path is just an array, so I did the sensible thing and just pushed the missing path onto it.
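The load path really is just an array ($LOAD_PATH, alias $:), so the fix I tried looked roughly like this - the gem directory here is a hypothetical example, not the actual path on my server:

```ruby
# $LOAD_PATH is an ordinary array of directories that `require`
# searches for plain .rb files.
gem_lib = "/home/me/.rvm/gems/ruby-3.2.0/gems/crawler_detect-1.2.0/lib" # hypothetical
$LOAD_PATH << gem_lib unless $LOAD_PATH.include?(gem_lib)
```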
But Ruby kept throwing an error about the gem not being available in some directory - for some reason it wasn't checking all the directories in the load path. That's when I'd had enough.
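My best guess at the culprit, in hindsight: requiring a gem goes through RubyGems, which activates gems from the directories in GEM_PATH, not from $LOAD_PATH - so pushing a directory onto the load path never tells RubyGems where the gem lives. Something like this (with a hypothetical RVM gem directory) would probably have been the right knob:

```ruby
require "rubygems"

# RubyGems resolves gems from Gem.path, which is derived from the
# GEM_HOME/GEM_PATH environment variables - not from $LOAD_PATH.
rvm_gems = "/home/me/.rvm/gems/ruby-3.2.0" # hypothetical RVM gem home
ENV["GEM_PATH"] = [rvm_gems, ENV["GEM_PATH"]].compact.join(File::PATH_SEPARATOR)
Gem.clear_paths # make RubyGems re-read the environment
```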
I ended up writing the script in PHP, which ran without any fancy extra server configuration. Fans of PHP suffering from Stockholm syndrome tell us that PHP is a good and competitive language these days, but writing just a few lines of it reminded me that that is not the case. But it works. As long as I remember to run the composer command necessary to install CrawlerDetect every time I update the website! I don't even.
No way I'm going to let it stay like this. It's just good enough for now. But ultimately I will have to write my own bot parser here.
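When that day comes, a first cut could be as small as a User-Agent pattern list. This is a tiny hypothetical sample in Ruby, nowhere near the coverage of CrawlerDetect's list:

```ruby
# Hand-rolled bot check: match the User-Agent against known crawler
# patterns. A real list needs regular maintenance - exactly the
# problem footnote (1) describes.
BOT_PATTERNS = [
  /bot/i, /crawl/i, /spider/i,
  /slurp/i, # Yahoo
  /facebookexternalhit/i
].freeze

def bot?(user_agent)
  ua = user_agent.to_s
  BOT_PATTERNS.any? { |re| re.match?(ua) }
end
```

With that in place, `bot?("Googlebot/2.1 (+http://www.google.com/bot.html)")` is true, while an ordinary browser User-Agent comes back false.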
(1) The problem here is actually that the list of bots and crawlers the library uses hasn't been updated in months.