Amazon SES on Rails 4

Last week we launched a new website hosted on Amazon AWS and needed to set up its mailers.

After requesting the removal of the sandbox limits and being granted production access (the process can take days, so plan ahead and ask for it in advance), we verified the domain and moved on to setting up the mailer.

A quick Google search gave us tons of suggestions, but the SMTP configurations we found didn’t work, and we wanted a simple solution that wouldn’t involve installing a couple of gems just to use an SMTP mailer.

Here is the configuration we found to work flawlessly. Note the port we use: the examples we tried with authentication :login and port 465 don’t work.
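A sketch of that configuration (the region endpoint and environment variable names here are assumptions; use your SES region’s SMTP endpoint and the dedicated SES SMTP credentials, which are not your AWS keys):

```ruby
# config/environments/production.rb -- sketch, adapt endpoint and credentials
config.action_mailer.delivery_method = :smtp
config.action_mailer.smtp_settings = {
  address:              "email-smtp.us-east-1.amazonaws.com", # your SES region's endpoint
  port:                 587,                                  # :login on port 465 didn't work for us
  user_name:            ENV["SES_SMTP_USERNAME"],             # SES SMTP credentials
  password:             ENV["SES_SMTP_PASSWORD"],
  authentication:       :plain,
  enable_starttls_auto: true
}
```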

HID emulation on an iPhone is not allowed

Last week, in a project of ours, we were exploring the BLE capabilities of the CoreBluetooth framework on iOS. The vast majority of apps have the iPhone working as a central Bluetooth hub, interacting with various BLE peripherals that advertise their services.
We wanted instead to have the iPhone act as a peripheral and advertise its services to another Bluetooth-enabled device.

For the sake of simplicity we decided to implement an HID device: the iPhone would act as a Bluetooth keyboard.

We set up our service and characteristic constants as:

#define humanInterfaceDeviceService @"1812"

#define bootKeyboardInputReportCharacteristic @"2A22"

#define bootKeyboardOutputReportCharacteristic @"2A32"

and created a CBPeripheralManager and implemented the CBPeripheralManagerDelegate protocol in our ViewController. More info here.
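The setup looked roughly like this (a sketch with error handling and the remaining delegate callbacks omitted; the literal UUID strings stand for the standard HID service and the Boot Keyboard Input Report characteristic):

```objc
// Called once the CBPeripheralManager created in viewDidLoad is powered on
- (void)peripheralManagerDidUpdateState:(CBPeripheralManager *)peripheral
{
    if (peripheral.state != CBPeripheralManagerStatePoweredOn) return;

    // Boot Keyboard Input Report characteristic (0x2A22)
    CBMutableCharacteristic *inputReport =
        [[CBMutableCharacteristic alloc] initWithType:[CBUUID UUIDWithString:@"2A22"]
                                           properties:CBCharacteristicPropertyRead | CBCharacteristicPropertyNotify
                                                value:nil
                                          permissions:CBAttributePermissionsReadable];

    // HID service (0x1812) exposing the characteristic
    CBMutableService *hidService =
        [[CBMutableService alloc] initWithType:[CBUUID UUIDWithString:@"1812"] primary:YES];
    hidService.characteristics = @[ inputReport ];

    [peripheral addService:hidService];
    [peripheral startAdvertising:@{ CBAdvertisementDataServiceUUIDsKey : @[ hidService.UUID ] }];
}
```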

Unfortunately the service was not being discovered, and after some tedious debugging we found out that Apple has intentionally reserved HID support on iOS: the system filters out all HID-related services during the discovery process, although there’s no clear documentation about this.

Changing the constants to non-standard UUIDs solved the discovery problem, although this way the iPhone can no longer work as a standard HID device.

Geolocation galore for your Rails app

Since we had to geolocate our users and didn’t require a high level of accuracy, we decided to explore IP-based geolocation of the request instead of using the browser’s native geolocation API.

We found and tried a couple of gems:


Geo_ip retrieves the geolocation of an IP address using the ipinfodb.com API. You need to register for an API key, but once you have obtained it and set it in your app, you are set.

You just need to add this in your controller:

location = GeoIp.geolocation(request.remote_ip)

and in location you’ll have a hash with the latitude and longitude, the country, the city, and the ZIP code.
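The shape of that hash looks roughly like this (the key names below are assumptions for illustration; check the gem’s documentation for the exact ones):

```ruby
# Hypothetical shape of the hash returned by GeoIp.geolocation
# (key names are assumptions, not the gem's documented ones)
location = {
  latitude:     45.4643,
  longitude:    9.1895,
  country_name: "Italy",
  city:         "Milan",
  zip_code:     "20121"
}

location[:city] # => "Milan"
```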


Geokit-rails uses the geokit gem to offer a set of location-based features for your app beyond IP geolocation: the list includes ActiveRecord distance-based queries, distance calculations, and geocoding.

As for the IP geocoding, the code to use is the following:

location = Geocoders::MultiGeocoder.geocode(request.remote_ip)

The interesting part is that Geokit is IP-geocoder agnostic: it provides failover among multiple IP geocoders and queries them in the order you provide. Please check the Geokit documentation for the configuration options.

You can also write your own geocoder!
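The failover behaviour can be pictured with a toy sketch (plain Ruby, not Geokit’s actual code; the provider lambdas are made up): each provider in the configured order is asked until one answers.

```ruby
# Toy failover: try each provider in the configured order and
# return the first non-nil answer.
GEOCODERS = [
  ->(ip) { nil },                                          # first provider fails (timeout, quota, ...)
  ->(ip) { { city: "Milan", lat: 45.4643, lng: 9.1895 } }  # second one answers
]

def geocode_with_failover(ip)
  GEOCODERS.each do |geocoder|
    result = geocoder.call(ip)
    return result if result
  end
  nil
end

location = geocode_with_failover("8.8.8.8")
```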

A special mention goes to the rubygeocoder gem, which we didn’t use for our app (it seems it doesn’t behave well on Heroku, where our app is served from), but it looks interesting nonetheless.

Google OAuth 2.0 integration between your RoR and mobile applications

We have a working web application that uses Google OAuth 2.0 authentication to allow access only to owners of email addresses on our own domain.
The solution is out of the box, or close to it: put devise, omniauth, and omniauth-google-oauth2 in your Gemfile, set them up, generate the keys here, and you are ready to go.

But having decided to create a native mobile application that accesses the web application’s DB, we had to implement the same login method. In the past we could’ve used token_authenticable of Devise fame, but since it has been (rightfully) deprecated for security reasons, we have two options:
1. We could use Cross-Client Authentication; you can check this approach here. Unfortunately the documentation seems heavily Android-centric, but we’ll delve into the iOS approach in the future.
2. We could ask for a short-lived access_token in the native app and then exchange it with the web application to provide authentication.

In this article we focus on how to adapt the web application to accept the access_token and grant access once it has been validated.

We have to add a before_filter to ApplicationController:
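A sketch of such a filter, assuming a hypothetical User.find_by_google_access_token helper that validates the token and returns the matching user:

```ruby
class ApplicationController < ActionController::Base
  before_filter :authenticate_user_from_token!

  private

  # Hypothetical sketch: authenticate every request with the
  # access_token parameter instead of a session cookie.
  def authenticate_user_from_token!
    user = User.find_by_google_access_token(params[:access_token])
    if user
      sign_in(user, store: false) # don't persist the user in the session
    else
      head :unauthorized
    end
  end
end
```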

We are passing store: false, so the user is not stored in the session and a valid access_token is needed for every request.

Now we just need to add a method to the User class and that’s it.
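A possible sketch of that method, assuming Google’s tokeninfo endpoint (the method name and query are illustrative, not the article’s exact code):

```ruby
require "net/http"
require "json"

class User < ActiveRecord::Base
  # Hypothetical sketch: ask Google whether the token is still valid,
  # then look the user up by the email address it resolves to.
  def self.find_by_google_access_token(token)
    return nil if token.blank?
    uri  = URI("https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=#{token}")
    info = JSON.parse(Net::HTTP.get(uri))
    find_by(email: info["email"]) unless info["error"]
  rescue JSON::ParserError
    nil
  end
end
```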

Here we validate the token against the Google API to ensure it is still valid, retrieve the email address, and check whether it’s in our database.

Obviously the first option, Cross-Client Authentication, is preferable, since we are not passing an access_token around (albeit a short-lived one), but this approach has the benefit of not tying our application to any SDK 😉

Playing with POP

A couple of weeks ago Facebook shared a video with some behind-the-scenes details about the making of Paper, their fancy new iOS client.

Among the talks a bomb was dropped: they were going to open source their animation framework, POP, and they finally did!

POP has been released for iOS and OS X, and it can animate everything, not only CALayers: you can even animate a volume level or any other property of an object.
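A taste of the API (a minimal sketch; the view and target point are made up):

```objc
// Spring a view's center towards a new point with POP
POPSpringAnimation *spring = [POPSpringAnimation animationWithPropertyNamed:kPOPViewCenter];
spring.toValue          = [NSValue valueWithCGPoint:CGPointMake(160.0, 300.0)];
spring.springBounciness = 10.0; // 0-20, the higher the bouncier
[someView pop_addAnimation:spring forKey:@"bounceToCenter"];
```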

And to whet your appetite before heading to the repo, check Codeplease’s posts on POP.

My love-hate relationship with Emacs

Hi, I’m Emanuel and I have a problem: I love and hate Emacs.

Today, instead of offering a solution to a known problem or posting a link to cool stuff, I’m here to ask for hints: I want to hear from other Emacs users what they use to address my UI problems.

I’ve always been a Vim user (compulsory editor flame war image here), but I wanted to take the red pill and embrace Emacs for a while before making an informed decision. Would I be able to go back? Well, it seems I wasn’t, and now I want to stick with Emacs. The change wasn’t easy: I kept relapsing to my editor of choice due to impending deadlines, but almost a year ago I decided to take the plunge and installed only Emacs on my new laptop (I’m on OS X, using Emacs with Cocoa installed via Homebrew), with no other serious editor available.

And I loved it! For a RoR developer, Emacs is a perfectly viable solution: I fell in love with its bindings (after the necessary hiatus to forget dd for deleting a line), and my muscle memory reset to the new environment.

But I don’t always write software, and I’d love to use Emacs to jot down thoughts and write articles too (switching editors is almost more taxing than context switching); whenever I do, though, I feel it gets in the way. When I thought of writing this article I was sure I’d come up with a list of different problems, but in the end I found out my gripes are mainly with the UI, and almost all of them point to buffers. My Emacs feels like a one-window program: I usually split the screen into 2 or 3 parts, but I’d love to use tabs or multiple windows to arrange my thoughts and make the most of my screen real estate. Sure, I can navigate through buffers, but sometimes it feels like throwing everything into a black hole, where it’s hard to get something back, or even to know what’s there.

And when I need to do some plain text editing, I’d love a zero-friction way to just open a tab and start jotting down words, like a default Markdown mode for empty files (did I mention I don’t even want to think about where to save the file beforehand?), something that would make my Notational Velocity setup obsolete.

Maybe there’s no solution (I doubt it) and I’ll have to keep using nv, Emacs, and Xcode; or maybe there’s a simple one and someone will just RTFM me. Either way, I’d love to hear what your setup is and how you addressed similar problems. Any suggestion is most welcome: you can find me on Twitter (@onigiri). And please, no suggestions to switch back to Vim or Sublime, or that TextMate is open source now, or Atom, the new kid on the block. That’s not the goal of this article 😉

Let your Raspberry Pi see this wonderful world!

At Mikamai we’re always experimenting with new technologies, and we’ve already expressed our love for the Raspberry Pi.

Lately we finally got our hands on a Pi Camera and started using it. Well, the Pi Camera has been around for a while, but the wait was fruitful, since just recently a Python library was released into the wild!

Raspbian comes with a toolset of useful CLI programs to start grabbing pictures and videos right after unboxing, but a pure Python interface is just perfect for writing Raspberry Pi applications.
Enter Picamera

The setup is really easy:

sudo apt-get install python-picamera

From grabbing a single picture

#simple grab
import picamera

with picamera.PiCamera() as camera:
    camera.capture('image.jpg')
to setting up a time lapse, the effort is nil.

#simple time lapse
import time
import picamera

with picamera.PiCamera() as camera:
    for filename in camera.capture_continuous('img{counter:03d}.jpg'):
        time.sleep(60) # wait 1 minute

And if we use the GPIO pins to add an infrared proximity sensor (as we did with Arduino), we can build our own cheap alarm camera system!
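A sketch of the idea (the sensor pin, wiring, and active level are assumptions; it needs the RPi.GPIO package and real hardware to run):

```python
# Hypothetical alarm camera: snap a photo whenever the IR sensor fires.
import time
import picamera
import RPi.GPIO as GPIO

SENSOR_PIN = 17  # BCM numbering; adjust to your wiring

GPIO.setmode(GPIO.BCM)
GPIO.setup(SENSOR_PIN, GPIO.IN)

with picamera.PiCamera() as camera:
    while True:
        if GPIO.input(SENSOR_PIN):  # something is in front of the sensor
            camera.capture('intruder-%d.jpg' % int(time.time()))
            time.sleep(5)   # don't flood the SD card with frames
        time.sleep(0.1)     # poll ten times per second
```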

So get yourself a camera, then head to the Picamera documentation and show your Raspberry Pi some pictures.

Rotating a UIView around an arbitrary point

Developing on iOS often brings joy, thanks to the default behaviours and design choices made by Apple’s designers, but you can also get many headaches when trying to step off the beaten path.
Fortunately, this time the solution is easy and straightforward, and the headaches are saved for other, more substantial problems. 🙂

Recently I had to let users rotate a view in an iPhone app with their fingers, but when I tested it, the view rotated around its center.

I wanted it to mimic the real world: if you rotate a piece of paper lying on a table, it usually spins around your two fingers.

To rotate a view, we need to apply an affine transformation to it, and looking at the UIView Class Reference we read:
“The origin of the transform is the value of the center property, or the layer’s anchorPoint property if it was changed.”

So the anchorPoint of the view’s layer is the point around which transformations (in this case a rotation) are applied. If we change it to the midpoint between our two finger touches, we are set and running!

Unfortunately, its coordinates are normalised to the UIView’s bounds, ranging from 0 to 1, with the center represented by (0.5, 0.5).

This little snippet of code does the trick:

CGPoint firstTouch  = [(UIRotationGestureRecognizer*)sender locationOfTouch:0 inView:self];
CGPoint secondTouch = [(UIRotationGestureRecognizer*)sender locationOfTouch:1 inView:self];

CGPoint middle = CGPointMake((firstTouch.x + secondTouch.x) / 2,
                             (firstTouch.y + secondTouch.y) / 2);

self.layer.anchorPoint = CGPointMake(middle.x / self.bounds.size.width,
                                     middle.y / self.bounds.size.height);

with self being our UIView subclass. If you want to try a demo, you can clone our simple demo app on GitHub.
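One caveat worth knowing: changing anchorPoint makes the view jump, because the layer’s position is interpreted relative to the new anchor point. A common fix (a sketch to adapt to your gesture handler) is to compensate the layer’s position when moving the anchor:

```objc
// Move the anchor point without making the view jump:
// recompute the layer position so the view stays where it is on screen.
- (void)setAnchorPoint:(CGPoint)anchorPoint forView:(UIView *)view {
    CGPoint newPoint = CGPointMake(view.bounds.size.width  * anchorPoint.x,
                                   view.bounds.size.height * anchorPoint.y);
    CGPoint oldPoint = CGPointMake(view.bounds.size.width  * view.layer.anchorPoint.x,
                                   view.bounds.size.height * view.layer.anchorPoint.y);

    // Account for any transform already applied to the view
    newPoint = CGPointApplyAffineTransform(newPoint, view.transform);
    oldPoint = CGPointApplyAffineTransform(oldPoint, view.transform);

    CGPoint position = view.layer.position;
    position.x += newPoint.x - oldPoint.x;
    position.y += newPoint.y - oldPoint.y;

    view.layer.position    = position;
    view.layer.anchorPoint = anchorPoint;
}
```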