Sunday, 29 September 2013

Professional Scrum Master I

Last week, I was able to become a Professional Scrum Master. Can you believe that? :)

I have known some things about Scrum for a while now, but I had never attended a real Scrum course until about a week ago. The course took 2 days to complete and was conducted by one of the local training centers. After the course I was given a password to take the Professional Scrum Master I assessment.

Besides taking Scrum training, to pass the assessment I believe you clearly need to take the "Open Assessment" (which is free of charge and can be found on the same website) as many times as you need to feel comfortable passing it. The real assessment contains 80 questions and you have only 60 minutes to answer them. A number of the questions are significantly more difficult than those in the open assessment. To pass the exam, you need a score of at least 85%. Personally, I thought I had passed comfortably, but it turned out that I got "just" 89%.

After passing it, you need to wait a few hours before your certificate is ready to be printed from a PDF document :) Oh, and you are given the right to use the badge:

Tuesday, 10 September 2013

Effective Akka by Jamie Allen; O'Reilly Media

This short book, written by Jamie Allen, contains a number of pieces of advice for Akka developers. I believe you should already be familiar with the Akka framework before reading it, because the author assumes that you already know how to use at least the basic features of the framework.

The first chapter of the book discusses approaches to designing actor-based applications. It's hard to disagree with the author about the presented ideas, but I think it's something that most Akka developers already know.

Effective Akka's second chapter presents two quite small patterns used in real-world applications. I liked the first one, but the second I consider a tip rather than a "pattern", as Jamie calls it. Applications of the patterns are presented with unit-tested source code, which is definitely a plus.

The third chapter (the last one!) presents general advice on using Akka, but I feel developers should be familiar with it already, as it is not much different from general programming and design rules. The only difference is that here Allen shows how they are relevant to building actor-based applications. You will also find ideas for creating resilient, high-performance systems.

In conclusion, I'd say the book seems nice to me. On the other hand, as an Akka developer, I'd love to read a book that would push me two levels higher in building actor systems, and this book left me a little disappointed in that regard.

Saturday, 17 August 2013

Simple app using Gradle, Spray and Heroku

After posting quite a few times in June, I slowed down and posted just once (a book review) over July and August. One of the reasons is that it was a quite busy period for me. I thought I'd get back to writing by posting some cool stuff here.

Recently I've read two books about the Gradle build system, and I've even reviewed one of them here. I decided to create an application that is built with Gradle, so that I can get more practice with it.

As cloud computing is getting more and more popular, I thought I would create an app that runs on the Heroku platform. The application will use the Spray-Can server to create a Scala, actor-based web application. It will offer a simple REST API and use the spray-json module for JSON conversion between string and case class representations.

Build setup

In the first step we will create a build file (build.gradle) that tells Gradle what dependencies the application needs, how to get them, and how to create the resulting package.
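A sketch of what that build file could look like - the module versions and the main class name here are illustrative assumptions, not taken from the original project:

```groovy
// build.gradle - illustrative sketch; versions and mainClassName are assumptions
apply plugin: 'scala'
apply plugin: 'application'

mainClassName = 'Main'

repositories {
    mavenCentral()
    maven { url 'http://repo.spray.io' }
}

dependencies {
    compile 'org.scala-lang:scala-library:2.10.2'
    compile 'com.typesafe.akka:akka-actor_2.10:2.1.4'
    compile 'io.spray:spray-can:1.1-M8'
    compile 'io.spray:spray-routing:1.1-M8'
    compile 'io.spray:spray-json_2.10:1.2.5'
}

// "stage" is the task Heroku runs; it just triggers clean and installApp
task stage(dependsOn: ['clean', 'installApp'])
```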

We define 2 repositories that we will be getting dependencies from. There are a few dependencies, which include the Scala library and the Akka and Spray modules. We also define a new task, "stage", that will later be run by Heroku. Basically it just triggers two other Gradle tasks: clean and installApp. The latter gathers the dependencies and creates a distributable package with a script that runs the app.

Some of you might be wondering what Akka is. Let me just tell you that it is an innovative and exciting framework for building scalable, distributed systems. It is used internally by Spray and will also be used by me in the app.

Now we need to write some Scala code for an actual application.

Creating Spray app

First, let's create a Scala App.
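A minimal sketch of what that App could look like, assuming spray-can 1.x's actor-based Http.Bind API - the object and actor-system names are my own:

```scala
import akka.actor.{ActorSystem, Props}
import akka.io.IO
import spray.can.Http

object Main extends App {
  // the actor system hosting both spray-can and our own actor
  implicit val system = ActorSystem("hello-world-system")

  // the single actor that will handle HTTP requests
  val handler = system.actorOf(Props[HelloWorldActor], "hello-world")

  // Heroku passes the port to use in the PORT environment variable
  val port = sys.env.getOrElse("PORT", "8080").toInt
  IO(Http) ! Http.Bind(handler, interface = "0.0.0.0", port = port)
}
```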

It just initializes Akka's actor system and creates a single actor in it (an actor of class HelloWorldActor). Then this actor is bound to the port provided in an environment variable, or 8080 if none is provided.

Let's now focus on the behavior of this actor. We will be creating a JSON-based REST API, so I've created some code for our domain and the conversion of its case class to JSON (and from JSON back to the case class if needed).
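A sketch of that domain code, using spray-json's jsonFormat helpers - the Person fields here are illustrative assumptions:

```scala
import spray.json._

// our tiny domain model
case class Person(id: Int, name: String)

// spray-json (de)serialization support for Person
object PersonJsonProtocol extends DefaultJsonProtocol {
  implicit val personFormat = jsonFormat2(Person)
}
```

With personFormat in implicit scope, Person(1, "Krzysztof").toJson gives you a JsValue, and JsonParser(someString).convertTo[Person] goes the other way.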

Now, the behavior of HelloWorldActor:
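A sketch of what that could look like with spray-routing, assuming the Person case class and its JSON protocol from the domain code above - the details are my reconstruction, not the post's original code:

```scala
import akka.actor.Actor
import spray.routing._

// the route lives in a trait so it can be unit tested without an actor
trait HelloWorld extends HttpService {
  import PersonJsonProtocol._
  import spray.json._

  val route: Route =
    path("api" / "persons" / IntNumber) { id =>
      get {
        // build a Person and return its JSON representation
        complete(Person(id, "Krzysztof").toJson.compactPrint)
      }
    }
}

// the actor only plugs the route into the actor system
class HelloWorldActor extends Actor with HelloWorld {
  def actorRefFactory = context
  def receive = runRoute(route)
}
```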

In Spray (or Spray-Routing, I should say), you create something called a route, which is a set of routing rules. HelloWorldActor basically just runs the route from the trait HelloWorld. You usually keep these two things separate, as it allows you to unit test them more easily.

The route defined in HelloWorld specifies that whenever there's a request for the path api/persons/X, X is converted to an integer and a closure is run that returns a JSON object with my name :). As you see in the snippet above, to get the JSON representation of a case class I can just use the .toJson method on Person - there is a similar method to get a case class back from a string.

The app should be runnable by now. You can just run "gradle run" and check the result at http://localhost:8080/api/persons/5

Deploying to Heroku

Now that we have a runnable application in place, we can think of running the app in the cloud. To run this app on Heroku, we need to provide a special file: Procfile. It will tell Heroku how to actually run the web application.

The Procfile can contain just a single line:
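For a build made by Gradle's application plugin, that line could look as follows - `<app-name>` stands for your project's name, which I'm leaving as a placeholder:

```
web: ./build/install/<app-name>/bin/<app-name>
```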

As you see, Heroku will just invoke the script that was created by the installApp task of Gradle's application plugin.

As a last point, I'd like to tell you that Heroku's official buildpack (the set of scripts that builds the app) for Gradle-based applications is a bit outdated, and most probably this application cannot be run straight away.

But I've already forked Heroku's Gradle buildpack repository on GitHub and updated it to fetch the newest Gradle version (1.7). You can freely use it by setting an environment variable using the Heroku console:
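Something along these lines - the repository URL here is a placeholder for the actual fork:

```
heroku config:set BUILDPACK_URL=https://github.com/<user>/heroku-buildpack-gradle.git
```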


That's basically it! Our REST API should now work on Heroku. I've created a repository on GitHub with the code of this application, with even some more additions. As you can see, creating an app based on Heroku, Spray and Gradle was pretty quick and easy.

Writing this post was fun and I look forward to posting again. I think that next time I might write about running web application on Raspberry Pi... :)

Saturday, 3 August 2013

Gradle Beyond the Basics by Tim Berglund, O'Reilly Media

This is a quite short book (only 4 chapters) that presents some of Gradle's more advanced topics. Tim Berglund covers topics such as file copying and processing tasks, building custom plugins, hooking into life-cycle events, and dependency management.

I enjoyed the book. It is easy to read, as the author shows many code snippets as examples for the topics he covers. And because of the book's relatively short length, you don't need to spend a lot of time reading about details you probably don't care about.

The book is definitely not for those who are new to the Gradle build system. I believe you should at least be familiar with the topics covered in the previously published "Building and Testing with Gradle". The author assumes that the reader already knows how to use Gradle and quickly starts describing further features.

On the other hand, if you already know how to use it, you might not need to read this book at all. If your job requires the more advanced Gradle tools, you might as well use only the official online documentation, which most probably already covers all the topics from the book. The main benefit of reading Tim's book is that he provides step-by-step instructions for using these advanced features, like creating custom plugins.

Sunday, 30 June 2013

Hello world with Vagrant

Hello everyone!

Not long ago I wrote a review of Vagrant: Up and Running. This time, I'd like to post a tutorial on using Vagrant for those who are totally new to it. Let me just remind you that Vagrant is a useful tool for managing virtual machines and their settings for resources, networking and more. In most cases, people use it to create VMs for VirtualBox, but it is also possible to set up Amazon EC2 machines directly from Vagrant.

Setting up

After installing Vagrant (and VirtualBox), to create a new virtual machine you just need to type in your terminal:
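For the Ubuntu 12.04 ("precise") box that was commonly used at the time, the command could look like this - the box name and URL are the then-standard ones, assumed rather than taken from the original post:

```
vagrant init precise64 http://files.vagrantup.com/precise64.box
```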

This creates new settings for a VM that will be based on Ubuntu 12.04. A new file will be created in your current directory: Vagrantfile. It's a text file with all the settings of your VM. Actually, it contains just Ruby source code, but don't think you need to know Ruby to use Vagrant efficiently.

The initially created Vagrantfile contains some default settings and a huge number of settings that are commented out, just to give you an idea of what else can be configured there. But for now let's just keep the defaults.

Starting the machine

To start your newly defined virtual machine you just need to type:
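In a terminal, from the directory containing the Vagrantfile:

```
vagrant up
```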

If it's the first time you run it, the VM will be created. If the base image for the VM (a clean Ubuntu) is needed, it will be downloaded automatically from the URL specified in the "init" command.

To actually use the machine you need to use ssh:
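Vagrant wraps the SSH connection details for you:

```
vagrant ssh
```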

Finishing work

After you finish your work, you would usually stop it with:
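Halting the VM keeps its disk around for the next boot:

```
vagrant halt
```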

Or you can destroy the whole machine, so that all the resources are freed (including hard disk space).
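Destroying is just as short - note that this deletes the VM's disk:

```
vagrant destroy
```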

More resources

If you need high performance from the virtual machine, you probably need to adjust the resources it is using. When using Vagrant with VirtualBox, you need to add some additional settings in the Vagrantfile. These settings are specified in the format of VirtualBox's "modifyvm" command:
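In a Vagrantfile v2 configuration this could look as follows - the memory and CPU values are example numbers of my choosing:

```ruby
Vagrant.configure("2") do |config|
  config.vm.provider "virtualbox" do |vb|
    # arguments passed straight to "VBoxManage modifyvm"
    vb.customize ["modifyvm", :id, "--memory", "2048", "--cpus", "2"]
  end
end
```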

Sharing a folder

When you work on a guest virtual machine, there is often a need to share a folder between the host and guest operating systems. A nice thing about Vagrant is that a single line of configuration is enough to set up this shared folder (and mount it on the guest OS).
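A sketch of that line, inside the Vagrantfile's configure block - the host and guest paths are examples:

```ruby
# host path (relative to the Vagrantfile) -> mount point on the guest
config.vm.synced_folder "./src", "/home/vagrant/src"
```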

After the next "vagrant up", you will be able to use the folder on the guest.

Port forwarding

When you run some kind of a (web?) service on the guest, you need a way to connect to it. In Vagrant, it is just another one-liner!
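The one-liner, matching the port numbers discussed below (host 9090 forwarded to guest 9000):

```ruby
config.vm.network :forwarded_port, guest: 9000, host: 9090
```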

This line specifies that when you connect from the host to "localhost:9090", you will actually connect to the guest machine on port 9000. It's as simple as that. This way you can easily test a web application running on the guest, using your web browser on the host.

Additional software - provisioning

Managing software inside a virtual machine is called provisioning. There are a few mechanisms available in Vagrant for this job. Here I will describe only the most basic one: shell provisioning.

Shell provisioning is just a set of shell "tasks" executed after the machine boots. You can write the exact shell commands in the Vagrantfile, or point to a shell script that should be executed.

In a clean Ubuntu you should start by running "apt-get -f install", just to be able to install additional software using the apt-get package manager.

To do this after each machine boot, just put the following line in your Vagrantfile:
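Using inline shell provisioning, that line could be:

```ruby
config.vm.provision :shell, inline: "apt-get -f install"
```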

If you want to run a script, you can specify a path to it, e.g.:
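The script name here is a hypothetical example:

```ruby
config.vm.provision :shell, path: "install_git.sh"
```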

I assume that this script contains instructions for installing git on the system. I'd like to point out that when using Vagrant you should use "apt-get install -y {package_name}", which makes apt-get answer positively to any "y/N" question.

You might wonder what you can do to make some scripts run only once (on the first boot), rather than on every "vagrant up". A simple trick is to wrap the script's body in an if statement that checks for the presence of some file (let's say you keep the log of installing git in it):
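A sketch of such a guard - the log file path is an assumption of mine:

```shell
#!/bin/bash
# run the installation only if its log file does not exist yet
if [ ! -f /var/log/git_install.log ]; then
  apt-get install -y git | tee /var/log/git_install.log
fi
```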

Then the script will run only the first time you start the VM. If you want to run it again, you need to delete that particular file.


Any GUI? If you set the flag "vb.gui = true" inside the VirtualBox configuration (in the Vagrantfile), you will have your GUI. But you would probably also need to install packages like GNOME to make real use of it.

More provisioning techniques? There are also ways to use provisioning tools like Puppet or Chef, but I'm not an expert on those, so you'll need to find out about them on your own :)

Custom base images? No problem, just look for "how to create your own box" in the Vagrant documentation.

More network settings? You can do a lot - actually anything (I think) that is possible with plain VirtualBox.

More machines defined inside a Vagrantfile? Yes, that's doable. Normally you use only a single Vagrantfile per project, even if you need more virtual machines.


You can find an example definition of a Vagrant environment on my GitHub. There is a single-machine definition of a VM for Scala + MongoDB + Play (actually + sbt) development.

Note: It's common for Vagrant users to run the developed code on the VM, but edit it in their favourite IDE on the host OS.

Monday, 24 June 2013

Graph Databases by Ian Robinson, Jim Webber, Emil Eifrem; O'Reilly Media

This book significantly helps in understanding what graph databases are and how to use them properly. The authors introduce the basic ideas behind graph databases. They explain why the need for such databases emerged - why there's a need for a database engine in which relationships are first-class citizens.

I believe the most important chapter of this book is the one that explains data modelling with graphs. The way you need to think when using a graph DB is totally different from other types of DB. The authors base their teaching on a set of examples, each discussed in detail. Various use cases are shown, and you'll be surprised how efficient a data model can be when used properly.

You will also be able to learn the basics of Cypher, the language used for querying the graph database. It's not a really comprehensive introduction, so it cannot be used as a reference. The book shows examples of querying Neo4j, which is probably the most popular graph database implementation. I don't think you will be very comfortable using Neo4j immediately after reading this book. It rather intends to make you familiar with the fundamental concepts of graph databases and to show how they differ from still-more-popular solutions like RDBMSs.

Some additional topics are covered as well, like an overview of using a graph database in an agile (also TDD-based) manner, an introduction to Neo4j internals (the different available APIs and ways of running it), and an overview of other NoSQL storage.

I really liked reading it, and the book made me more interested in graph DBs, as it provided solid arguments for using them in various applications. On the other hand, after reading it I still think there's a lot for me to learn (from other resources) before I become comfortable with Neo4j. I would recommend this book to all developers who are new to the concepts of graph databases and who want to become familiar with their strong points before trying concrete graph database solutions like Neo4j.

Graph Databases - O'Reilly Product Page

Thursday, 20 June 2013

Tricity JUG: Apache Lucene in practice

On June 15th I was one of the participants of a workshop titled "Apache Lucene in practice". The meeting was organized by the Tricity Java User Group and was led by Dominika Puzio and Patryk Makuch, whose previous experience includes building search engines for many of the systems that belong to Wirtualna Polska.

Apache Lucene is a high-performance text search engine written entirely in Java. That's why the workshop included developing a simple Java web application for searching through Wikipedia content. The actual Wikipedia content had been downloaded earlier by the organizers and was distributed before the event.

The project that was developed during the meeting has been pushed to GitHub; you can access it through my fork. As a participant, I didn't need to spend a lot of time writing the same code as the presenters. What I was really doing was just running git checkout on the next commit after each completed step, and only sometimes experimenting with the given code. This way I could listen carefully to the presenters most of the time, as they explained each step in great detail. The presenters showed that they have a lot of experience in the field, and they shared a lot of it.

Should I mention that organizers ordered pizzas for participants? :)

I really enjoyed the workshop. I believe that I've learned a lot, and I think I'm now fully capable of building a Lucene-based search engine on my own :). I'm glad I could participate in the event, and I'm looking forward to the next TJUG meetings.

Sunday, 16 June 2013

Vagrant: Up and Running by Mitchell Hashimoto, O'Reilly

Vagrant: Up and Running is a very concise book that helps you get started with Vagrant, a very smart tool with a growing role in software development. I believe the title of the book is very appropriate, as you can hit the ground running and start using the tool comfortably in your projects very soon after reading it.

There's a quite short introduction to what Vagrant is, and then the author shows exactly how to use it properly. Mitchell shows how to install it and how to interact with it through the offered set of commands. You'll find instructions for creating a virtual machine, provisioning it, and setting up networking, including between a set of machines. The author discusses many common use cases for every feature of the tool. A portion of the book shows the usage of Vagrant plugins and teaches you about developing such plugins. At the end of the book, you'll also find reference chapters that describe Vagrant's options one by one.

I've really enjoyed reading this book. Mitchell Hashimoto shows how to solve the most common problems - problems that most readers will encounter if they try Vagrant, or virtualization in general - like automatic software installation, port forwarding and folder sharing. And he does it all with quite simple and very easy to understand examples.

I fully recommend it to anyone who is interested in what Vagrant offers its users and how to make the best use of it.

Thursday, 13 June 2013

Functional Programming Principles in Scala

Most of my previous posts were about functional programming, and I would like to write about this topic one more time - I think the last time for now. This time I'd like to share my experience with one of the MOOCs (massive open online courses) that I've taken recently: Functional Programming Principles in Scala.

This course is led by Scala's inventor, prof. Martin Odersky. The course is divided into 7 weeks. Each week contains about 2 hours of video lectures and a programming assignment that takes, on average, another few hours to complete.

In the video lectures, the speaker tells you about the ideas behind functional programming and describes the concepts that FP consists of. Professor Odersky first presents some theory (sometimes full of math) behind each of these concepts, and then he shows an example for each of them with code that he writes in Scala. So there's actual practical application shown in the lecture part.

What you learn while watching the lectures, you can immediately try in practice by solving the problem described in the programming assignment. The assignments are sometimes challenging, but every time it is just pure fun to work on them. Let me just tell you that the assignments include working on:

  • anagrams generator
  • logic game solver
  • Huffman coding
  • binary trees

I can assure you that most participants (myself included) are very pleased they could attend. And I fully recommend taking the course in the next available session if you are, like me, interested in functional programming. If you have any questions about the course, feel free to ask me privately or in the comments section.

You can find more details on the course page.

Monday, 10 June 2013

Functional Thinking by Neal Ford, O'Reilly Media

In his video, Neal Ford presents functional programming's foundations and principles, as he believes this paradigm is going to dominate software development while mainstream technologies adopt more and more of its features.

He believes that learning the syntax of a new programming language is easy, so instead he teaches you a new way of thinking. The code examples shown in the video present those ideas in languages like Groovy, Clojure, Scala or even Java. But the speaker doesn't really try to give you a comprehensive lesson on the syntax of these languages; he discusses the basics of the paradigm that stands behind that new syntax.

Topics covered in the video include the background of functional programming: immutability, higher-order functions, laziness and more. There are also points where the speaker contrasts the new principles with object-oriented design. I particularly liked the part where Neal teaches that functional programming is about separating "what" needs to be done from "how", so that you can make the "how" part someone else's problem. The lesson I learned is that you can use functional programming to minimize the number of things you need to worry about.

I feel that the people who would benefit most from watching this video are software developers who already have experience with object-oriented technologies like Java or .NET and want to learn new concepts whose role in programming is getting larger and larger. To any developer new to functional thinking, I can definitely recommend Neal's video.

I review for the O'Reilly Blogger Review Program

Sunday, 9 June 2013

O'Reilly Blogger Review program

I'm more than happy to announce that I've been approved for O'Reilly Blogger Review program!

Let me just remind you that O'Reilly is a well-known publisher of books and other resources on information technology topics. Their blogger review program allows anyone who runs a blog to get free copies of books or videos in exchange for an honest review. The available resources include newly released books and videos.

The first resource I've chosen to review is the video Functional Thinking by Neal Ford. You can expect the review quite soon!

I review for the O'Reilly Blogger Review Program

Thursday, 6 June 2013

Introduction to functional programming

This is the moment for my first real post. For a start I wanted to write on a topic I feel rather comfortable with, so I went for functional programming. I'll try to give you an introduction to this interesting topic. Some of the concepts described will be followed by examples in Scala, which is my language of choice for FP.

Functional programming can probably be defined in many ways. I'd say that it is a paradigm that enforces the following two principles (I believe they are the foundations of FP):
  • usage of functions (including higher-order* ones) as first-class citizens - so that they are the basic building blocks of a computer program.
  • avoidance of mutable state, side effects and assignment statements.

First I'd like to focus on the former. This principle is very attractive to me, as my favourite programming principle is:

Encapsulate what varies

I believe that many people identify encapsulation with the process of hiding the representation of an object's data. I think that's a mistake, as hiding behaviour is at least as important as hiding data representation. An easy way to create/pass/use functions helps you achieve that in a pretty convenient manner.

Scala's way of working with functions is very effective, which is typical for an FP language. But in some other languages, like Java, you usually need to explicitly create a separate interface. One might think it takes a bit more effort than it actually should.

I believe that the usage of functions is particularly useful when you work with collections. In Scala you can easily process them, as they offer the following operations:

  • map - translates from one representation to another
  • filter - selects only elements that match given predicate
  • reduce/fold - combines elements of the collection to a single object 

Some people call these the functional operators, as the ideas behind them are very common in functional programming. Their example usage is shown in the snippet below.
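For example, on a simple list of numbers (a minimal illustration of my own):

```scala
val numbers = List(1, 2, 3, 4, 5)

// map: translate each element to another representation
val doubled = numbers.map(_ * 2)        // List(2, 4, 6, 8, 10)

// filter: keep only the elements that match a predicate
val evens = numbers.filter(_ % 2 == 0)  // List(2, 4)

// reduce: combine all elements into a single value
val sum = numbers.reduce(_ + _)         // 15
```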

Another principle that is at the very core of functional programming is:

Data immutability
Data immutability means that an object, once created, will never have its state changed. This is particularly useful if you try to write parallel, multi-threaded code. You see, when data is never going to change, there is no need to synchronize access to it. And when synchronization concerns disappear, the developer has a lot less to worry about.

In an FP language, it is feasible to work on immutable data, as there are useful techniques for creating copies of an object that differ in some specified property. This is illustrated below.
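In Scala, case classes give you such a copy method for free - the Person class here is just an illustration:

```scala
case class Person(name: String, age: Int)

val p = Person("Krzysztof", 25)

// copy creates a new object differing only in the given property;
// p itself is left untouched
val older = p.copy(age = 26)  // Person("Krzysztof", 26)
```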

When you do functional programming, you might notice that recursion is closer to these principles than iteration. The latter is usually connected with mutable state, while recursion avoids it as much as it can. One can argue that iteration is more efficient than recursion, but there's a technique that allows recursion to match iteration's performance. This technique is called tail recursion and is offered in Scala. I am not going to describe its specifics now, as it is quite far from the intended topic of this post.

One can question, as I do, whether you can totally avoid mutable state in programming. I'd say that in real-world applications you can't (or shouldn't). But if there is a genuine need for mutable state, there are tools to support you with it. One such tool is an actor (or a whole actor system), which allows you to synchronize access to and operations on some data. Actors are often used in Scala with a library called Akka. One day I'll definitely write about Akka.

There is still more

In this post I've described the principles that I believe are fundamental to functional programming. Particular FP languages like Scala offer other features that I feel support FP even more, like the Option type, pattern matching, lazy evaluation and implicit parameters/conversions. If you are interested in the topic, I encourage you to try those things as well.

*higher-order functions are functions that take other functions as arguments or return them

Tuesday, 4 June 2013

About me

Hello everyone! My name is Krzysztof Ropiak. This is the first post on my newly created blog. As I'm a software developer, I intend to post about my professional experiences, which are usually programming-oriented. My past experience includes building Java and .NET systems, as well as creating mobile applications for iOS.

I think in the future you'll be able to find here texts about concepts such as object-oriented design or functional programming. Some of my other professional interests are the scalability of computer systems and cloud computing - I will be more than happy to post about those as well, as they are really exciting. I also expect to write about topics of a more scientific nature, like natural language processing or machine learning. But then again, probably every post here will be written from a developer's perspective. I also plan to write reviews of technical books and other sources of knowledge.

I'd like to publish a new text at least once a month. This way I can keep my readers satisfied and stay in practice. I believe I will start posting really soon. I think the next post will cover an introduction to functional programming, which is becoming more and more important in enterprise applications, and one can predict a big future for it in the cloud computing market - that's quite a handy topic for a start, isn't it?

I hope you will enjoy reading my blog and that you'll come back here time and time again!