I’ve always liked pair programming, since my college days. I remember spending late nights in the computer lab doing coding exercises, always working with a partner. You’d talk to everyone in the room trying to figure out how to solve things, you’d sit with people from other teams and share code. I didn’t know you’d call this pairing back then.
And then I started an internship to get my degree, and I was assigned to a project with someone from my university. We would work together a lot; we’d sit together for hours solving the problems we had with Enterprise JavaBeans (OMG!!), the Oracle web server, and the related technologies of the time. We’d code together on the same things, sometimes at my computer and sometimes at his, and I still didn’t know it was called pair programming.
After I moved to Spain I worked as a consultant, and I didn’t do much of that. Yes, I’d talk to colleagues and we’d solve some things together, but working together was frowned upon. How can you have two people working on the same thing? It would take twice as long to get things done, they’d say.
And then I had my first Rails job, a totally different environment, and we worked together a lot: sitting with someone for hours on end solving problems, debugging, creating things, and having fun. It was at this point that I started to learn about pair programming.
It was a thing: the way of working that had always seemed best to me actually had a name. And there were a lot of people talking about the subject. A consultancy like Hashrocket would do it and talk about what worked well for them, and it was good to realise there was a movement of people promoting this way of working.
And so since then I’ve had a few jobs and I’ve always been a bit of a proponent of it, and I’ve always tried pairing whenever possible, since I enjoy working with others this way.
The software we’re building is getting more and more complex with time, the demands of our users keep increasing, and we’re required to handle many different parts. With microservices and the cloud this is only getting harder. And for me it’s becoming clearer that in order to solve these things what we need is more collaboration, and that pairing is a more effective way of getting work done.
There is only so much information one can keep in their head; when you are pairing you have much more capacity. When you have a good partner and you’re driving, you can feel more relaxed, since someone else is spotting the many little mistakes that would otherwise interrupt your flow, instead of you hitting them and having to fix them.
When tackling a problem that needs discussion, pairing gives you immediate access to a smart person sitting next to you to help you weigh alternatives and make a decision.
When writing tests for a feature, having a partner is very useful for going through edge cases and suggesting possibilities you haven’t considered, making things more robust from the beginning.
All the features done with a partner get an automatic +1. Features developed while pairing are better designed and more robust, and other team members will usually feel more inclined to accept them.
Pairing is a social skill that needs to be developed; it’s not the most natural way for people to work. You need to learn how to work with your partner, both when you’re writing the code and when you’re the “co-pilot”. It takes time to break old habits too: getting rid of constant distractions and focusing on the task with your partner can be challenging.
I’m preparing a workshop to help you get the most out of pair programming. If you’re starting now or want to improve the way you work with your colleagues, please sign up to the newsletter to stay informed.
Many companies keep their AWS accounts separated by environment, by team, and so on, and you can find yourself with users and credentials on many different accounts. From a security perspective this is not ideal, as managing it can be very hard: people change teams, leave the company, and so on, and cleaning up their access across all accounts can be very tedious and time consuming.
A way to deal with this is to have a user in one account and then assume a role in the other accounts, so there’s only one place to manage your users and credentials.
I’m going to assume you already know how to create a role in the target account and configure your file. If not, AWS has a good basic introduction.
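For reference, a cross-account profile in the AWS shared config file (`~/.aws/config`) typically looks like the sketch below; the account IDs, role name, and profile names are made up:

```ini
# ~/.aws/config (illustrative values)
[profile target-profile]
role_arn       = arn:aws:iam::123456789012:role/developer
source_profile = default
mfa_serial     = arn:aws:iam::210987654321:mfa/jane
```

With this in place, the credentials from `default` are used to assume the `developer` role in the target account, prompting for an MFA token when `mfa_serial` is set.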
This has some advantages for security, but if you are using MFA it can be a bit more tedious to have to constantly type the tokens. The AWS CLI, when you give it the --profile flag, will perform an STS:AssumeRole call for you and cache the results for one hour.
This can be sufficient for your needs, but if you’re working with some of the SDKs, like boto, you may encounter problems. If you’re using Ansible, it uses boto2 underneath, which doesn’t support profiles to assume a role with MFA. Boto3 works correctly, but you need to provide the MFA token on every invocation, since it won’t cache sessions for you.
To make life easier for me and some colleagues while migrating from boto2 to boto3, I created a very small tool called roly.
The AWS SDKs will respect the AWS_* environment variables you have set, and roly just leverages this fact.
You can use roly in two ways:
    $ roly export target-profile
    Assume Role MFA token code: XXXX
    export AWS_ACCESS_KEY_ID='XXXXXXXXXXXXXXXX'
    export AWS_SECRET_ACCESS_KEY='XXXXXXXXXXXXXXXX'
    export AWS_SESSION_TOKEN='XXXXXXXXXXXXXXXX'
If needed, it will prompt for your MFA token and then print the commands needed to set your environment variables. The session will last for an hour. Copy and paste them and you’re set.
Or you can set the credentials and call a command just once.
$ roly exec target-profile command
You can find the source and binaries at the GitHub repository.
I hope you find it useful.
For a long time I’ve used my own command alias to navigate through my work-related projects and jump to them using the
Originally the x comes from Xing, where I worked at the time. It also happens that the letter x is located next to the c, which makes it very
It’s been a while since I worked on the blog’s design. The design dated back to 2011, which is like an eon in Internet time, and I didn’t particularly like how it looked anymore, so I decided to change it.
I normally don’t do any design work, and CSS is not something I’ve particularly enjoyed, but I’ve always used this space to work on that area.
Recently I’ve been using AWS Lambda at work for some projects, and one of the limitations you run into is using packages with compiled dependencies.
Compiling them on your own machine won’t work; the “recommended” way is to start an EC2 instance, compile your dependencies there, and then copy them back to your machine.
Fortunately AWS now provides a Docker image for Amazon Linux, which we can leverage to build our dependencies. We can avoid launching an instance and get faster results.
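As a sketch of the idea, assuming a Python project whose dependencies live in a requirements.txt and using the amazonlinux image from Docker Hub (the paths, packages, and output directory here are illustrative):

```shell
# Build pip dependencies inside an Amazon Linux container so the
# compiled artifacts match Lambda's runtime environment.
docker run --rm -v "$PWD":/build -w /build amazonlinux \
  sh -c "yum install -y gcc python-devel python-pip && \
         pip install -t ./vendored -r requirements.txt"

# ./vendored can then be zipped together with your handler into the
# Lambda deployment package.
```

The key point is the volume mount: the container compiles against Amazon Linux libraries, but the results land in your working directory on the host.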
This tutorial is an extraction from a talk about Docker and Rails apps I recently presented at the Barcelona on Rails user group. I’ll explain how to integrate Docker into an existing Rails app workflow.
During the talk I presented a tool that helps me focus on developing the different containers, making changes to the Dockerfile and the apps themselves, while leaving out the details of using fig.
The tool is called guard-fig.
It’s been over 3 months since my change to the site operations team at Xing and time has really been flying by.
Everything is new, and many times I really don’t know what to do, and I find that very delightful. The investigation, the learning, figuring out how to achieve something, reading forums or Stack Overflow, and trying to fit the puzzle together in your head. I think this is what I love most about computers: that moment when you understand the machine, what it does and why it does it. It’s the hack, making the computer do what you want. Getting out of my comfort zone is helping me appreciate the basics more.
A couple of months ago I started growing a vegetable garden at home with my girlfriend. It’s been a great experience. She’s always been into plants, we have lots of them around the house, but we had never tried to grow food. I think the fact that I get something out of the experience that is not just aesthetic but also functional (I can eat the vegetables) has made it more appealing to me.
There have been some experiences so far that I can somehow relate to developing a software project.
I was invited, together with my friend Jean Carlos Meninno, to give a presentation at the GDG DevFest Barcelona 2013. It was a great opportunity to talk about the work we’ve been doing recently for XING and the things we’ve been learning about developing large-scale Backbone applications.
Here you can find the slides: http://diasjorge.github.io/google-dev-fest-slides/
Hope you like them.
During my time working at XING, I believe my single biggest contribution to the company is a side project I’ve developed called Xing scripts. This project started with a personal need to work with our development environment in a more automated way. I’m a big proponent of automating everything you can, and so when I started working I realized there were these tasks I would do over and over again. Since I couldn’t bear doing all this manual work, I started writing my own scripts.
Yesterday I was at work with a colleague and we wanted to merge a long-running branch we had. The branch was full of useless commits, so we wanted to clean it up. We tried an interactive rebase but got a lot of conflicts, since git doesn’t know how to resolve merge conflicts that we had previously fixed. As you probably know this is no fun, so we did what any sane person would do and found a nice solution for this called git-rerere.
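For anyone who hasn’t used it: rerere (“reuse recorded resolution”) just needs to be enabled, after which git records how you resolve each conflict and replays that resolution whenever the same conflict reappears, for example during an interactive rebase. A minimal setup looks like this:

```shell
# Enable rerere so git records and reuses conflict resolutions.
git config --global rerere.enabled true

# Optional: automatically stage files once a recorded resolution
# has been applied.
git config --global rerere.autoupdate true
```

From then on, resolving a conflict once during a rebase is enough; replays of the same conflict are resolved for you.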
Recently I had to reinstall my work computer, since I had to update to Lion and could only do a fresh install. So I decided to try to automate the installation process, since some of my colleagues are going through the same thing and it seems like every time we waste hours or days solving the same issues over and over.
As a programmer, the impetus to go as fast as you can should be taken with caution: it helps you move forward, but it can also hold you back.
It is perhaps just my experience, but I’ve hardly ever had the opportunity to work on green-field projects. Rather, I’ve worked on legacy ones where most of the original developers, or even all of them, were no longer part of the team. Projects with little to no documentation and in some cases no tests at all. You probably know this feeling: it sucks. You want to do things, but everything you touch breaks something else, and you can obviously see there was a lack of care.
These have been some agitated months for me. I quit working at JustLanded after almost two years there and then went to work in consulting. The experience was not so great, actually it was really bad, the kind of experience that has taught me to choose my future career moves very carefully and never again believe marketing people. Fortunately I got an offer to work at the XING offices in Barcelona, so I packed my stuff and moved there. Now this is a really good place to work: everything was as we discussed, they’ve been very helpful with my relocation, and the environment is great, with lots of smart people who want to do a good job. So nowadays I’m very happy and enjoying my new city.
If you’ve ever forgotten to push your changes to the remote repository before trying to deploy, you will know how frustrating it is to think that everything has gone live when actually it has not. This little script will help you avoid these situations.
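A minimal sketch of such a check, written as a shell function you could run before deploying; the remote name and the idea of comparing against the branch’s remote counterpart are assumptions about a typical setup, not the exact script from the post:

```shell
# Fail when the current branch has commits that are not on its
# remote counterpart, so a deploy of stale code can be aborted.
check_pushed() {
  branch=$(git rev-parse --abbrev-ref HEAD)
  # Update our view of the remote branch; fail if it doesn't exist.
  git fetch -q origin "$branch" || return 1
  # Any commits reachable from HEAD but not from origin/<branch>?
  if [ -n "$(git log --oneline "origin/$branch..HEAD")" ]; then
    echo "Unpushed commits on $branch; push before deploying." >&2
    return 1
  fi
  return 0
}
```

Hooked into a deploy task, this turns “oops, I forgot to push” into an immediate, explicit failure instead of a silent no-op deploy.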
When using Passenger with RVM, I’ve had some issues with project-specific gemsets, where Bundler was unable to find the gems. After a lot of searching I found out about using the “config/setup_load_paths.rb” file to tell Passenger where to locate your gems, but then I had a new issue with RVM trying to use the system Ruby instead of the Ruby version from my .rvmrc file.
After going to the IRC channel, I got some help that fixed my problem. The culprit was my .rvmrc file.
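For context, the config/setup_load_paths.rb pattern that RVM documented for Passenger at the time looked roughly like this (a sketch, not my exact file):

```ruby
# config/setup_load_paths.rb
# Make Passenger use the project's RVM ruby and gemset before
# Bundler sets up the load paths.
if ENV['MY_RUBY_HOME'] && ENV['MY_RUBY_HOME'].include?('rvm')
  begin
    rvm_path     = File.dirname(File.dirname(ENV['MY_RUBY_HOME']))
    rvm_lib_path = File.join(rvm_path, 'lib')
    $LOAD_PATH.unshift rvm_lib_path
    require 'rvm'
    # Pick the ruby/gemset from the project's .rvmrc.
    RVM.use_from_path! File.dirname(File.dirname(__FILE__))
  rescue LoadError
    raise 'RVM ruby lib is currently unavailable.'
  end
end

require 'rubygems'
require 'bundler/setup'
```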
As promised, here are the slides for the “Conferencia Rails” workshop on process automation. Thanks to all the people who were there. I’m also releasing the redmine CLI I created, along with the CLI twitter client.
The presentation was created using the slideshow gem, which generates an HTML document for you.
Hope you enjoy it.
Today I spent several hours with my friend Gleb trying to find a weird bug we were having importing some RSS feeds.
We have a rake task that grabs an XML feed and imports it into our system. When we call this rake task from the command line it runs fine, but when we run it from inside our application, we get some wrong characters (you know, the usual ???) in the imported items.
If you’re using Emacs to write your Jekyll blog posts, there is a mode to help you with some common tasks. It is originally from metajack. Recently I thought it would be a nice addition to have syntax highlighting support for Jekyll posts, so I got my hands dirty, and after some hours of Lisp hacking (this was my first attempt at Lisp programming) it became a reality. It is based on nxhtml, so you need that for it to work.
If you’re using capistrano-ext to deploy to a different server using a custom environment, you’ve probably noticed that it always tries to run the migrations for the production environment, like this:
Digging through Capistrano’s source, I found the solution is really simple: just set the rails_env variable to the environment you want, in this example staging. So inside config/deploy/staging.rb:
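Based on the description above, the setting in question looks like this:

```ruby
# config/deploy/staging.rb
set :rails_env, "staging"
```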
Then, when migrations get executed, they’ll run with RAILS_ENV=staging.
Recently I moved my blog to Jekyll. While I can now write posts directly in my favorite editor, Emacs, there was some functionality I missed from my previous custom blog engine, such as archives. Looking at how I could achieve this, I found Raoul Felix’s approach to the problem. Instead of patching Jekyll, he wrote a small library that wraps around it, called jekyll_ext. Using it was really easy, and based on some of the extensions he created, I was able to provide this functionality on my site.
Although I now had archives generated for me, I was still missing a way to display this information on my site, so I decided to create my own extension.
I’ve been developing some new mailers at work, and I’ve found it really helpful to be able to view the emails as they are sent to the users. So I’ve implemented a cucumber step to help me achieve that, inspired by a similar webrat step for web pages.
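The core of such a step is just saving the delivered email’s body to an HTML file and opening it in a browser. A minimal sketch, assuming ActionMailer deliveries and the launchy gem; the helper and step names are illustrative, not my exact code:

```ruby
require 'tmpdir'

# Save an email's HTML body to a temp file so it can be opened in a
# browser for visual inspection.
def save_email_for_preview(mail_body, dir = Dir.tmpdir)
  path = File.join(dir, 'email_preview.html')
  File.write(path, mail_body)
  path
end

# Wired into a Cucumber step definition it could look like:
#
#   Then /^show me the email$/ do
#     mail = ActionMailer::Base.deliveries.last
#     Launchy.open("file://#{save_email_for_preview(mail.body.to_s)}")
#   end
```

This mirrors the classic webrat “Then show me the page” step, only pointed at the last delivered email instead of the current page.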