History of the Bible

I ran a different type of study with our church youth group today: we investigated the history of the Bible. It was a combination of trivia-style and lecture-style presentation, and I’ve uploaded the notes I used. It took a little while to put together; most of the time was spent reading on the Internet and distilling the important (and fun) facts, as history can be a little dry. I have included references for most of the resources I used, except Wikipedia (which is pretty much the foundation for most of the material). Having done this, I have a greater appreciation for biblical scholars and academics.

Adding CSRF token to jQuery AJAX requests

When using a jQuery-backed framework such as Backbone, the underlying jQuery AJAX requests are typically abstracted away at the model layer. To insert Cross-Site Request Forgery (CSRF) tokens or other session data into each request, one approach is to proxy a method in the call stack and add the token via an option (example). The disadvantage is that if you need to call $.ajax directly, you’ll have to insert the CSRF token as a header option all over again.

The DRY way? Use jQuery’s ajaxPrefilter API:

$.ajaxPrefilter(function(options, originalOptions, jqXHR) {
  var token;
  if (!options.crossDomain) {
    token = $('meta[name="csrf-token"]').attr('content');
    if (token) {
      return jqXHR.setRequestHeader('X-CSRF-Token', token);
    }
  }
});

GUS Gives

I was approached two years ago by a colleague from the Queensland Department of Education and Training (DET) to help set up a web application for GUS Gives – a charity portal that collects payments from members and provides detailed analytics for charity organisations. My role was to provide data management support and the first task on the list was creating sample data sets in order to test the report generation functions of the website.

I had done similar work while at DET with rule-based generation of staff and student data sets. Using the same technique, I developed an application that generates random people/members as a CSV file (the file type was chosen by the application developers). The reference data of first and last names, locations, phone number prefixes, salaries, etc. was drawn from public sources: the US Census Bureau, the Australian Bureau of Statistics, OpenStreetMap, and Wikipedia.
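The generation technique can be sketched roughly as follows. The field names and reference lists here are illustrative stand-ins, not the project’s actual data requirements:

```ruby
require 'csv'

# Illustrative reference data only; the real project collates much larger
# lists from the US Census Bureau, ABS, OpenStreetMap, and Wikipedia.
FIRST_NAMES = %w[James Mary Priya Wei]
LAST_NAMES  = %w[Smith Jones Nair Chen]
SUBURBS     = %w[Brisbane Logan Ipswich]
HEADERS     = %w[id first_name last_name suburb salary]

# Build one random member record, keyed by column name.
def random_member(id)
  {
    'id'         => id,
    'first_name' => FIRST_NAMES.sample,
    'last_name'  => LAST_NAMES.sample,
    'suburb'     => SUBURBS.sample,
    'salary'     => rand(30_000..120_000)
  }
end

# Write `count` random members to a CSV file, header row first.
def generate_csv(path, count)
  CSV.open(path, 'w') do |csv|
    csv << HEADERS
    (1..count).each { |i| csv << random_member(i).values_at(*HEADERS) }
  end
end
```

The same approach scales by swapping in larger reference lists and adding per-field rules (weighted sampling, locale-specific phone prefixes, and so on).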

Now that GUS Gives looks like it’s a non-starter, I have uploaded the source to GitHub. The project also contains the data requirements and the collated reference data, so the application can run without further dependencies.

Parallel Bible

My 85-year-old grandma has been getting into reading various bible translations on her iPad. In order to improve her Hindi reading skills, she was looking for a side-by-side Malayalam and Hindi bible. While we could find Malayalam-English and Hindi-English versions, we couldn’t find a Malayalam-Hindi one that she could access on her computer or iPad.

I decided to help her out and the result is Parallel Bible. I have also uploaded the source code to GitHub.

The application is a static site generator using Razor templates; I took this approach as I can’t host a Ruby on Rails or .NET website with my current hosting provider. It uses jQuery Mobile, as my grandma does the majority of her browsing on her iPad and I wanted the site to be reasonably usable on a mobile device. To run the application, the English bible data must be downloaded, while the Indian translations are screen scraped. The translations are then merged during template execution.

I did find it interesting that the number of chapters and verses varied per translation. The differences were most frequent between the English and Malayalam versions, with quite a few instances of ±1 verse per chapter.
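Handling those off-by-one verse counts during the merge is straightforward. A minimal sketch (in Ruby here rather than the site’s actual Razor templates) pads the shorter list with empty strings:

```ruby
# Merge two translations of a chapter side by side. Verse counts can
# differ by ±1 between translations, so pad the shorter list so every
# row of the output has a cell for each translation.
def merge_chapter(malayalam_verses, hindi_verses)
  length = [malayalam_verses.length, hindi_verses.length].max
  (0...length).map do |i|
    [malayalam_verses[i] || '', hindi_verses[i] || '']
  end
end
```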

Reducing Rails asset precompile times on JRuby

Rails asset precompile times on JRuby are considerably slower than on MRI. I came across this post, which provided suggestions on speeding up the asset precompile task.

Using the following options cut my asset precompile time from 4 mins 37 secs to 2 mins 8 secs: Node.js instead of therubyrhino for JavaScript compilation, forcing the JVM to 32-bit mode (this can be omitted on a 32-bit JVM), and disabling JIT compilation. Node.js contributed the majority of that saving, since I’m already running a 32-bit VM.

EXECJS_RUNTIME='Node' JRUBY_OPTS="-J-d32 -X-C" rake assets:precompile

Silencing noise in the Rails development log

The standard Rails development log contains a lot of noise that is rarely meaningful for debugging. The Quiet Assets gem is a mandatory part of my Rails development process as it removes the logging noise of the asset pipeline.
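For reference, a minimal Gemfile entry looks something like this (version constraint omitted; check the gem’s README for current requirements):

```ruby
# Gemfile
gem 'quiet_assets', group: :development
```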

Also, if WEBrick is used as the development server, the following warning is logged for each asset pipeline request (whether or not it is silenced by Quiet Assets):

WARN  Could not determine content-length of response body.
Set content-length of the response or set Response#chunked = true

While some have suggested monkey-patching WEBrick or installing Thin as an alternative server (which doesn’t work under JRuby), the simplest way to remove those statements is to specify WEBrick explicitly in your Gemfile:

gem 'webrick', '~> 1.3.1', group: :development

Mixing DatabaseCleaner transaction and truncation strategies

RSpec by default wraps each example in a database transaction that is rolled back on completion. When using Capybara with the :js option (to enable testing with Selenium or WebKit), this transactional rollback is unusable, as the browser is driven from a separate thread with its own database connection; any uncommitted data therefore cannot be read by the browser.

Most people suggest employing DatabaseCleaner with a truncation strategy to overcome this deficiency. However, the performance of cleaning via truncation versus transaction depends on the size of the data: with smaller fixtures, transactions are preferable.
Instead of using one or the other, we can mix strategies and have the best of both worlds:

config.use_transactional_fixtures = false # Using DatabaseCleaner instead

config.before(:suite) do
  DatabaseCleaner.strategy = :transaction # Default strategy
  DatabaseCleaner.clean_with(:truncation) # Initially clean with truncation
end

config.before(:each, type: :request) do
  # Swap to truncation as Selenium runs in a separate thread with a different
  # database connection
  DatabaseCleaner.strategy = :truncation
end

config.after(:each, type: :request) do
  # Reset so non-request specs can use transactions
  DatabaseCleaner.strategy = :transaction
end

config.before(:each) do
  DatabaseCleaner.start
end

config.after(:each) do
  DatabaseCleaner.clean
end

On my current test suite of 461 tests, 73 of which are request specs, this dropped the run time by just over a minute.

Mocking instances created via ActiveRecord’s find

Most Ruby mocking frameworks have the ability to mock a new object created via a constructor. However, when an object is created via ActiveRecord’s find or find_by_* methods, the .new method isn’t invoked. Instead, the .instantiate method is called.

For example, to specify :instantiate as the object creation method using FlexMock:
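A sketch of how that might look, assuming a hypothetical User model (FlexMock’s new_instances accepts the name of the allocating method as an argument):

```ruby
# Requires the flexmock gem; User is a hypothetical ActiveRecord model.
flexmock(User).new_instances(:instantiate) do |instance|
  instance.should_receive(:admin?).and_return(true)
end

user = User.find(1) # created via .instantiate, so the stub applies
user.admin?         # returns the stubbed value, true
```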