opinionated or not

June 2, 2012

Opinionated framework on top of an unopinionated toolkit. If I had to choose my favorite engineering principle behind Gradle, there you go.

Sticking to the principle is challenging. Imagine a subdomain where there are no clear conventions, no popular, easy-to-harvest patterns. How do you come up with the “opinion” in such a field? We gather use cases and then try to resolve them in the best way we can. Sometimes the convention emerges organically and we can unanimously rejoice. Sometimes several conventions emerge and we are not quite sure which one to pick. We might choose the most fitting one or wait until we gather more knowledge, more use cases. Sometimes we don’t see any reasonable conventions and we refrain from forming an opinion for the time being.

Good old ant doesn’t have many opinions. It is a toolkit. It lets me script project automation in an imperative fashion. Kick-starting a new software project almost always started with taking the build.xml from the previous project. The level of copy-paste in those build.xmls is legendary. Yet, back in the day, I loved ant as it allowed me to do amazing automation in projects. Thank you ant!

Maven delivers plenty of opinions. It is a framework. However, if my context does not quite fit maven’s opinions then I might be in trouble. It was a great technological leap from ant to maven. I can still remember my feverish efforts advocating maven2, pulling my conservative teammates into the new era of declarative dependencies and convention-over-configuration. Thanks, maven!

We want Gradle to offer the inspiring technological leap you might remember from when you migrated your first project from ant to maven. This time, though, you will taste it when upgrading from maven. Some of you have enjoyed that feeling already and I am so glad to have you on board. Some of you will come along at some point, I’m sure. We want to deliver opinionated conventions so that you can build your standard projects with one-liner build scripts. We also want to provide an unopinionated toolkit so that you are never locked in by the design choices we have made.

I’ve just got back to Krakow from the amazing city of Berlin. I had the pleasure of brainstorming, debating and drinking cytrynowka with my fellow Gradle gang members. Among the many things that are crystal clear to me now is this one: the question is not IF Gradle will dominate the space of project automation but WHEN. Have a great weekend everybody :)


IDE & patterns for huge maven project

December 11, 2010

I recently started working on a huge multi-module maven project. Trust me, working with ~150 maven modules at a time is a pain for a developer. On the other hand, there’s a lot of room for me to improve the development environment & make dev cycles efficient.

Some more context: the team is pretty big; over 100 developers on a single codebase working in a ‘continuous integration’ fashion: little branching, keeping the integration gap small, automated testing, CI servers, etc. Each developer checks out the entire trunk that contains ~150 maven modules linked via the default ‘x-SNAPSHOT’ dependency. Maven veterans are probably flinching now because this way of managing internal dependencies in a large project is probably considered ‘lame’. The maven ‘by-the-book’ approach may put emphasis on pinning internal dependency links to specific versions. I’m a pragmatic developer and I see the pros & cons of both approaches. This blog post, however, does not dwell on the subject of internal dependencies. I want to focus on the most effective dev environment set-up and the choice of the IDE.

“Toggle IDE/local-repo dependency” pattern

I couldn’t find a good name for this pattern. Theoretically, it can give a nice dev performance boost in a huge multi-module project. Here’s the idea: out of the 150 modules we will only work on a handful. Therefore I want to be able to quickly decide which module should be a concrete project/module in the IDE and which should be a mere jar dependency resolved via the local maven repo. It is a sort of IDE/local-repo dependency toggle.

Here’s how I see it working:

1. The developer pulls in a smaller module set into an IDE by selecting a relevant sub-parent in the multi-module tree.
2. At some point the developer needs to perform changes in a module that is not pulled into the IDE. He pulls in the relevant module(s). Dependencies inside the IDE are updated from jar dependencies to project dependencies (Eclipse) or module dependencies (IDEA).

Here’s how my team has coped with the project so far and how the above pattern could be used.

eclipse:eclipse or idea:idea

At the moment I think more than half of the team uses the maven eclipse:eclipse plugin to generate configuration that is consumable by Eclipse. In our environment it is not very efficient because after every svn update a developer has to regenerate the project configs (in case someone changed internal/external dependencies) and restart Eclipse (‘just’ refreshing Eclipse occasionally didn’t work well).

Team members who use IntelliJ IDEA never use the idea:idea maven plugin. I cannot confirm it myself, but I heard feedback that it was not generating correct configurations for the latest & greatest IDEA. However, most devs simply don’t need idea:idea because the maven support built into IDEA 9+ is simply brilliant. I can hardly imagine someone wanting to step back and generate configurations via idea:idea.

Toggling dependencies IDE/local-repo is possible but somewhat inconvenient. It requires specifying project names by hand on the command line, then refreshing/restarting the IDE.

M2Eclipse

I don’t recall anyone using it in my team. The plugin seems to be making progress – I heard several good reviews. In our particular environment developers complained about the reduced perceived performance of Eclipse and they fell back to generating configs via eclipse:eclipse. I tested m2eclipse myself recently with our project; there were things I liked about it. Yet, I still found Eclipse choking (even if I fine-tuned the plugin).

Toggling dependencies IDE/local-repo is possible but I cannot comment on how it behaves as we haven’t used m2eclipse much.

IntelliJ IDEA

The built-in maven plugin for IDEA is getting more & more popular in my team. It does not seem to reduce the perceived performance of the IDE. However, IDEA tends to be a bit slower than Eclipse anyway. IDEA 9+ is fast enough for me – I tend to use it more often than Eclipse. My experience with IDEA 8 and earlier is limited because I rarely had enough patience to wait until it started. The spanking new IDEA 10 is marketed as faster than 9. I have not tested it yet but I want to extend my thanks to JetBrains for working on it!!!

Back to the maven support in IDEA and the context of our huge multi-module project. Options that we absolutely needed to tweak were:
1. Disable files/folders synchronization on IDEA frame activation. In our case, IDEA choked every time someone ran any maven goal outside of IDEA.
2. Disable automatic maven project reimporting (it’s the default anyway). This is related to #1.

Also, be sure to read some of the posts on how to improve IDEA performance in general (most notably: choose wisely which plug-ins are enabled in IDEA; get an SSD drive; etc.)

In our project we found one feature very useful: the separation of two different actions, ‘reimporting maven projects’ and ‘updating folders’. The former makes sure the internal/external project dependencies are linked correctly. The latter makes sure sources are generated for the modules that need them (generate-sources).

Finally, toggling dependencies IDE/local-repo works like a charm in IntelliJ!!! Basically, all I need to do is “Import module from existing model” and point it at the relevant pom.xml. IDEA takes care of updating the dependency links of my existing modules to use IDE links rather than local-repo links. How cool is that?!

I still keep using both Eclipse & IDEA as I work with various developers. However, it’s going to be harder to get back to Eclipse…

Disclaimer: please note the date of this blog post! I don’t know whether the problems I mentioned with some of the tools will still be valid in the future. Hopefully all those tools will be great, fast & very convenient for working with huge maven projects. Respect to all the people who work on the dev tools that we use today (Eclipse, IntelliJ IDEA, eclipse:eclipse, idea:idea, m2eclipse). Thanks a lot!!!


XML please, I don’t want to recompile

November 15, 2010

There’s one quite naive argument that sometimes determines the technology choice. I want to write about it because I’ve been hearing this argument on and off for years. It’s about favouring XML to avoid recompilation.

I remember that in the late days of EJB2, one of the selling points of Spring was that you could reconfigure the wiring of your beans without recompilation. At that time I was quite enthusiastic about this feature… even though the argument was almost completely negligible.

My friend said today:

“Another advantage [of XML workflow] is you can do some modification to the flow without recompiling the whole app”

It’s true that you don’t need to recompile. However, if you want to fully implement the change to the XML workflow you probably still have to:

  • run tests
  • validate the app works
  • check-in the change to your VCS (new revision created)
  • make sure the integration build on the newly created revision works OK
  • validate the application (built with the new revision) works on the integration box

In most cases, updating XML by hand on the integration/prod box is not practical. Don’t fall for the ‘no recompilation’ argument because it is negligible. Usually you still need a build & QA cycle.

This post is not intended to claim that XML is bad for you (although I probably agree with that statement =)). This post claims that ‘no recompilation’ argument is void.

Nevertheless, if I have to use a workflow in my java app it will be declared in java and not in xml. I’m a great fan of fluent, type-safe & intellisense-ready configurations!
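
To illustrate what I mean by fluent & type-safe, here is a minimal, entirely hypothetical sketch of a workflow declared in plain java (FluentWorkflow and its methods are made up for this post, not any real workflow engine’s api):

import java.util.ArrayList;
import java.util.List;

//hypothetical fluent workflow builder, sketched only to contrast with XML
public class FluentWorkflow {

    private final String name;
    private final List<String> steps = new ArrayList<String>();

    private FluentWorkflow(String name) {
        this.name = name;
    }

    public static FluentWorkflow named(String name) {
        return new FluentWorkflow(name);
    }

    public FluentWorkflow then(String step) {
        steps.add(step); //the compiler & the IDE check this chain for us
        return this;
    }

    public static void main(String[] args) {
        //the whole flow lives in java: navigable, refactorable, intellisense-ready
        FluentWorkflow checkout = FluentWorkflow.named("checkout")
                .then("validateCart")
                .then("takePayment")
                .then("shipOrder");
        System.out.println(checkout.name + ": " + checkout.steps);
    }
}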


expected exception in tests

July 26, 2010

I have a feeling too many people get it wrong so let me stress:

@Test(expected = SomeException.class) // IS BAD FOR YOU

I do enjoy jUnit a lot, thanks guys for sharing! Yet, I don’t like the @Test(expected) feature of jUnit. In my TDD classes I simply teach people to avoid using it. I admit the feature can potentially make a few tests cleaner. However, it tends to bring more trouble than value. Newbies too often misuse it, which leads to poor-quality tests. Therefore:

Rule 1: don’t use @Test(expected)

My colleague (cheers Bartek!) is pursuing a Master’s at the Uni, where they teach him testing. One day a lecturer was presenting some test code:

@Test(expected=Exception.class)
public void testSomething() {
  //line 1: throws NullPointerException
  //lines 2-30: some crappy test code
}
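
Spelled out as a runnable sketch (the list field and the names are mine, not the lecturer’s), the trap looks like this:

import static org.junit.Assert.assertEquals;

import java.util.List;

import org.junit.Test;

public class HiddenFailureTest {

  private List<String> basket; //the "bug in the setup": never initialized

  @Test(expected = Exception.class) //green, but for the wrong reason
  public void testSomething() {
    basket.add("bread"); //line 1: throws NullPointerException
    //nothing below ever runs
    assertEquals(1, basket.size());
  }
}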

The test is useless. Due to some bug in the setup the test method breaks in line 1. Therefore most of the test code is not even executed. The expected exception does a great job of hiding this issue; the test is green! Therefore:

Rule 2: If you violate Rule #1 then at least make the exception very specific. Example:

@Test(expected=BreadOutOfStockException.class)
public void shouldBuyBread() {}

Even if you use a specific exception, the test still does not document exactly where the exception is thrown. Sometimes this may lead to subtle bugs. Therefore:

Rule 3: Don’t violate Rule #1 when the test method has multiple lines.

Instead you can simply write:

@Test
public void shouldBuyBread() throws Exception {
  //given
  me.goToBakery(bakery);
  bakery.setBreadAmount(0);

  try {
    //when
    me.buyBread();
    //then
    fail();
  } catch (BreadOutOfStockException e) {}
}

Above is my preferred way of documenting ‘exceptional’ behavior. There you go, it is The Ultimate Test Template for ‘exceptional’ behavior.

JUnit is getting better. New goodies for exception handling in tests keep arriving. @Rule ExpectedException better documents which action triggers the exception. Also, it nicely maps to the //given //when //then style. The downside is that you need some extra code if you want to interrogate the exception, for example if you want to assert that the exception message contains a substring (api as of 07-2010). Nevertheless:
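
For completeness, a minimal sketch of the @Rule style, reusing the bakery objects from the template above (the ExpectedException rule itself is standard jUnit 4.7+ api; note the expectation has to be declared before the action):

@Rule
public ExpectedException thrown = ExpectedException.none();

@Test
public void shouldBuyBread() throws Exception {
  //given
  me.goToBakery(bakery);
  bakery.setBreadAmount(0);

  //then (declared up front)
  thrown.expect(BreadOutOfStockException.class);

  //when
  me.buyBread();
}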

Rule 4: Favor @Rule ExpectedException over @Test(expected) when the test method has multiple lines.

Rule 5: Keep an eye on new versions of your testing libraries.

If you are already at RI in TDD then consider my Rule #1 a safety net. You never know when a new junior team member will join. He may easily misuse the patterns he sees in the codebase. The Javadoc for JUnit does not say how to safely use @Test(expected) (at the moment).

If you are learning testing, don’t get too upset with my dogmatic rules. As long as you write any tests at all, I am smiling already. You smile too, write tests and be happy. Soon you’ll get your RI rank in TDD.

Rule 6: Write tests & smile!


//given //when //then forever

December 7, 2009

@Test
public void shouldDoSomethingCool() throws Exception {
    //given

    //when

    //then
}

I like to call it the Ultimate Test Template. I’m so fond of those 3 little comments but, surprisingly, I didn’t buy it at first. Micheal (it’s you, tapestry-maven-iPhone fan boy =), a friend from ThoughtWorks, showed it to me a couple of years ago. Actually, he didn’t show it to me – I just overheard him coaching a young dev about it. That day I thought I didn’t need any hip comments because my tests were great anyway. It was foolish.

Don’t be a fool like me and start writing //given //when //then today. Life is too short for messing around – you want to get to level 85 in software craftsmanship soon, right? Here’s the deal: use the template for 1 iteration and if you don’t like the results then I will give you your money back. Seriously, no matter what you think about it – buy it! BTW, if you need to document some ‘exceptional’ behavior in your test, there is a template for tests with exceptions as well.

Lately, I’ve been selling //given //when //then quite relentlessly. I even try to sell it via Mockito api. (The link also shows how to install the template in Eclipse so don’t miss it!)
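
Here is what a filled-in template might look like with the BDD aliases from the Mockito api (just a toy example on a mocked List; BDDMockito.given/willReturn is real api, the scenario is made up):

import static org.junit.Assert.assertEquals;
import static org.mockito.BDDMockito.given;
import static org.mockito.Mockito.mock;

import java.util.List;

import org.junit.Test;

public class UltimateTemplateTest {

  @Test
  public void shouldDoSomethingCool() throws Exception {
    //given
    List<String> shelf = mock(List.class);
    given(shelf.get(0)).willReturn("bread");

    //when
    String firstItem = shelf.get(0);

    //then
    assertEquals("bread", firstItem);
  }
}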

I tried to lobby for the Church of given-when-then in Krakow, Warsaw & Kiev. I heard rumors that Wroclaw is developing a growing number of brothers and sisters in faith =)

I think I forgot to thank Dan North for given-when-then and Liz Keogh for the idea of BDD aliases in Mockito. There you go!


Mockito in Zurich

June 21, 2009

I will have the pleasure of presenting Mockito at Jazoon in Zurich. My session is on Thursday – make sure you don’t miss it. During the session I’m going to show a few slides and do live TDD with Mockito & other mocking frameworks. I’m going to use Infinitest. Guys, it’s huge. Brett Schuchert wrote about it, and Bartek told me to try it several times. Finally I did and I must say I’m impressed. Infinitest is like transcending to real TDD.


is debugger a modern translator?

May 19, 2009

A couple of days ago we interviewed a young adept of the craft. We let him code but he got a bit stressed out and wrote very complicated code. At some point he stopped understanding the code he had written. So he started using the debugger to figure out the meaning of the code. This is when the debugger became a sort of modern electronic translator.

The thing is, we probably don’t want to use the debugger as a translator. We prefer to understand the code we write. Instead of debugging I usually recommend writing a small test that covers the logic in question. You can try this exercise next time you are about to launch the debugger.

Don’t get me wrong – the debugger is a handy tool in every developer’s kit. It’s just that the less you use it, the better you are at producing high-quality code. Actually, it’s probably the other way around – when you get fluent with TDD/simple design you will soon discover how rarely you use the debugger.


subclass-and-override vs partial mocking vs refactoring

January 13, 2009

Attention all noble mockers and evil partial mockers. Actually… both styles are evil :) Spy, don’t mock… or better: do whatever you like, just keep writing beautiful, clean and non-brittle tests.

Let’s get to the point: partial mocking smelled funny to me for a long time. I thought I didn’t need it because, frankly, I hadn’t found a situation where I could use it.

Until a few days ago, when I found a place for it. Did I just come to terms with partial mocks? Maybe. Interestingly, the partial mock scenario seems to be related to working with code that we don’t have full control of…

I’ve been hacking a new feature for mockito, an experiment which is supposed to enhance the feedback from a failing test. On the way, I encountered a spot where partial mocking appeared handy. I decided to code the problem in a few different ways and see for myself whether partial mocking is a true blessing.

Here is an implementation of a JunitRunner. I trimmed it a little bit so that only the interesting stuff stays. The runner should print warnings only when the test fails. The super.run(notifier) call is wrapped in a separate method so that I can replace this method from the test:

[missing code listing: initial-code1]
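
The original listing did not survive as text, so here is a rough sketch of what the post describes (class and method names follow the text; the warning-printing details are my guess, not the real Mockito code):

import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;
import org.junit.runner.notification.RunNotifier;
import org.junit.runners.BlockJUnit4ClassRunner;
import org.junit.runners.model.InitializationError;

public class MockitoJUnitRunner extends BlockJUnit4ClassRunner {

  public MockitoJUnitRunner(Class<?> klass) throws InitializationError {
    super(klass);
  }

  @Override
  public void run(RunNotifier notifier) {
    //print warnings only when a test fails
    notifier.addListener(new RunListener() {
      @Override
      public void testFailure(Failure failure) {
        System.err.println("[mockito warning] " + failure.getMessage());
      }
    });
    runTestBody(notifier);
  }

  //super.run() wrapped in a separate method so a test can replace it
  protected void runTestBody(RunNotifier notifier) {
    super.run(notifier);
  }
}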

What would the test look like? I’m going to subclass-and-override to replace the runTestBody() behavior. This gives me the opportunity to test the run() method.

[missing code listing: subclass-and-override1]
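
Again, the original listing is gone; here is a sketch of the subclass-and-override test based on the description (assuming the runner sketched above, in the same package):

import static org.mockito.Matchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.Test;
import org.junit.runner.notification.RunListener;
import org.junit.runner.notification.RunNotifier;

public class MockitoJUnitRunnerTest {

  public static class SomeTest {
    @Test public void dummy() {}
  }

  @Test
  public void shouldRegisterWarningListener() throws Exception {
    //given: replace runTestBody() with a test-specific implementation
    RunNotifier notifier = mock(RunNotifier.class);
    MockitoJUnitRunner runner = new MockitoJUnitRunner(SomeTest.class) {
      @Override
      protected void runTestBody(RunNotifier n) {
        //don't run the real tests here
      }
    };

    //when
    runner.run(notifier);

    //then
    verify(notifier).addListener(any(RunListener.class));
  }
}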

It’s ugly but shhh… let’s blame the jUnit API.

The runTestBody() method is quite naughty but I’ve got a powerful weapon at hand: partial mocking. What’s that? Typically, mocking consists of replacing an entire object. Partial mocking stands for replacing only a part of the real implementation, for example a single method.

Here is the test, using hypothetical Mockito partial mocking syntax. Actually, shouldn’t I call it partial stubbing?

[missing code listing: partial-mocking1]
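
The listing image is missing here as well; below is a sketch using the spy-based partial mocking that later shipped in Mockito (spy/doNothing are real Mockito api, but the post’s original, then-hypothetical syntax may have looked different; same-package access to the protected method is assumed):

import static org.mockito.Matchers.any;
import static org.mockito.Mockito.doNothing;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.verify;

import org.junit.Test;
import org.junit.runner.notification.RunListener;
import org.junit.runner.notification.RunNotifier;

public class MockitoJUnitRunnerPartialMockTest {

  public static class SomeTest {
    @Test public void dummy() {}
  }

  @Test
  public void shouldRegisterWarningListener() throws Exception {
    //given: stub out runTestBody() only, keep the rest of the runner real
    RunNotifier notifier = mock(RunNotifier.class);
    MockitoJUnitRunner runner = spy(new MockitoJUnitRunner(SomeTest.class));
    doNothing().when(runner).runTestBody(notifier);

    //when
    runner.run(notifier);

    //then
    verify(notifier).addListener(any(RunListener.class));
  }
}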

Both tests are quite similar. I test the MockitoJUnitRunner class by replacing the implementation of runTestBody(notifier). The first example uses a test-specific implementation of the class under test. The second test uses a kind of partial mocking. This sort of comparison was very nicely done in this blog post by Phil Haack. I guess I came to a similar conclusion and I believe that:

  • subclass-and-override is not worse than partial mocking
  • subclass-and-override might give cleaner code just like in my example. It all depends on the case at hand, though. I’m sure there are scenarios where partial mocking looks & reads neater.

Hold on a sec! What about refactoring? Some say that instead of partial mocking we should design the code better. Let’s see.

The code might look like this. There is a specific interface JunitTestBody that I can inject from the test. And yes, I know I’m quite bad at naming types.

[missing code listing: refactored-code1]
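
A sketch of what the refactored design could look like, following the description (again a reconstruction, not the original listing):

import org.junit.runner.notification.RunNotifier;
import org.junit.runners.BlockJUnit4ClassRunner;
import org.junit.runners.model.InitializationError;

interface JunitTestBody {
  void run(RunNotifier notifier);
}

class MockitoJUnitRunner extends BlockJUnit4ClassRunner {

  public MockitoJUnitRunner(Class<?> klass) throws InitializationError {
    super(klass);
  }

  @Override
  public void run(RunNotifier notifier) {
    run(notifier, new JunitTestBody() {
      public void run(RunNotifier n) {
        MockitoJUnitRunner.super.run(n);
      }
    });
  }

  //the test body is passed in as an argument so a test can inject its own
  void run(RunNotifier notifier, JunitTestBody testBody) {
    //...print warnings only when the test fails...
    testBody.run(notifier);
  }
}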

Now I can inject a proper mock or an anonymous implementation of the entire JunitTestBody interface. I’m not concerned about the injection style because I don’t feel it matters that much here. I’m simply passing JunitTestBody as an argument.

[missing code listing: refactored-test]
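
And a sketch of the corresponding test, injecting a mock JunitTestBody as an argument (a reconstruction again; a same-package test is assumed so the two-arg run() is visible):

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.Test;
import org.junit.runner.notification.RunNotifier;

public class MockitoJUnitRunnerRefactoredTest {

  public static class SomeTest {
    @Test public void dummy() {}
  }

  @Test
  public void shouldRunTheInjectedTestBody() throws Exception {
    //given
    RunNotifier notifier = mock(RunNotifier.class);
    JunitTestBody testBody = mock(JunitTestBody.class);
    MockitoJUnitRunner runner = new MockitoJUnitRunner(SomeTest.class);

    //when
    runner.run(notifier, testBody);

    //then
    verify(testBody).run(notifier);
  }
}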

Let’s draw some conclusions. In this particular scenario choosing refactoring over partial mocking doesn’t really push me towards a better design. I don’t want to get into details, but the junit API constrains me a bit here so I cannot refactor that freely. Obviously, you may figure out a better refactoring – I’m just yet another java developer. On the other hand, the partial mocking scenario is a very rare finding for me. I believe there might be something wrong in my code if I had to partial mock too often. After all, look at the tests above – can you say they are beautiful? I can’t.

So,

  • I cannot say subclass-and-override < partial mocking
  • not always refactoring > subclass-and-override/partial mocking
  • partial mocking might be helpful but I’d rather not overuse it.
  • partial mocking scenario seems to lurk in situations where I cannot refactor that freely.

Eventually, I chose subclass-and-override for my implementation. It’s simple, reads nicer and feels less mocky & more natural.


build systems @ devoxx

December 11, 2008

I couldn’t attend the session on gradle because there was a Mockito session at the same time. For some unknown reason I chose Mockito…

About gradle. Later on, Hans Dockter (AKA the gradle guy) was kind enough to help me write a build.gradle for Mockito. Also, we played with multi-project builds and the brand new gradle jetty plugin.

Then I went to Jason Van Zyl’s session, mostly on maven3. Third time lucky, right? Let’s not mention M1 or M2 here because I would have to use words gentlemen don’t use. Maven3 looks quite promising and I wonder what my first-choice build system will be in mid-2009: gradle, maven3, or maybe something else? Felix ran a wild experiment and cut an interesting, object-oriented java build system.

Oops, did I just call maven a build system? Oh, I know it is “not just a build tool” but 94.72% of developers use it to build projects, so pardon me for taking this mental shortcut.

After Jason Van Zyl’s session I had the impression that he believes the best fit for a build system is a declarative architecture. I tend to disagree: I believe the build has an inherently scripting nature, so the natural fit for a build system is some scripting language. Quick reminder: xml is not a scripting language. A fully declarative build system like maven eventually makes me hit a boundary – the point where the stuff that can be declared or configured is not good enough. Then I need a painless way of “specialising” my build, for example by scripting. Mind you, I’m not nearly a build architect, so don’t take my rambling too seriously.

I didn’t mean to complain about maven – there are many really good ideas in it. However, allow me to have a favorite build system different from maven. Good luck with your maven builds. You never know when they’ll stop building…


I wish there was a mocking framework…

November 3, 2008

Let’s start from the beginning. Lately, I’ve been quite busy (perhaps lazy is a better word, though) and I couldn’t find time to sharply reply to a few interesting blog articles. Dan North became a friend of Mockito and wrote about the end of endotesting. His post sparked a few interesting comments. Steve Freeman wrote:

“But, it also became clear that he wrote Mockito to address some weak design and coding habits in his project and that the existing mocking frameworks were doing exactly their job of forcing them into the open. How a team should respond to that feedback is an interesting question.”

Well, I wrote Mockito because I needed a simple tool that didn’t get in the way and gave readable & maintainable code.

He also wrote: “In fact, the syntactic noise in jMock really helps to bring this out, whereas it’s easy for it to get lost with easymock and mockito.”

When something is hard to test, it gives excellent feedback about the quality of the code under test. If you’re a developer then your main responsibility is to respond to this feedback. The jMock guys came up with this excellent metaphor: “you should listen to your tests”.

I wish there was a mocking framework which gave positive feedback when the production code was clean and negative feedback when the design was weak. When I used jMock I had the impression that the feedback was negative all the time, regardless of the quality of the design. Good code or bad code – the syntactic noise was present in every test. If the feedback is always negative, how do I know when the design is weak?
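
To make the “syntactic noise” point concrete, here is roughly the same interaction test written with jMock 2 and with Mockito (Warehouse and Order are made-up types; the jMock 2 and Mockito calls themselves are real api):

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.jmock.Expectations;
import org.jmock.Mockery;
import org.junit.Test;

public class SyntacticNoiseTest {

  //made-up collaborators, just to have something to mock
  public interface Warehouse { void remove(String product, int quantity); }
  public static class Order {
    private final String product;
    private final int quantity;
    public Order(String product, int quantity) { this.product = product; this.quantity = quantity; }
    public void fillFrom(Warehouse warehouse) { warehouse.remove(product, quantity); }
  }

  @Test
  public void jMockStyle() {
    //the expectation block is the noise the quote talks about - present whether the design is good or bad
    Mockery context = new Mockery();
    final Warehouse warehouse = context.mock(Warehouse.class);
    context.checking(new Expectations() {{
      oneOf(warehouse).remove("bread", 1);
    }});

    new Order("bread", 1).fillFrom(warehouse);
    context.assertIsSatisfied();
  }

  @Test
  public void mockitoStyle() {
    //the same test with hardly any ceremony - again regardless of the design
    Warehouse warehouse = mock(Warehouse.class);
    new Order("bread", 1).fillFrom(warehouse);
    verify(warehouse).remove("bread", 1);
  }
}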

Patrick Kua writes in his blog:

“That’s why that even though Mockito is a welcome addition to the testing toolkit, I’m equally frightened by its ability to silence the screams of test code executing poorly designed production code”

Mockito tends to give positive feedback. It keeps the syntactic noise low so the tests are usually clean regardless of the design. If the feedback is always positive how do I know when the design is weak?

So, what brings out design issues better – jMock or Mockito? Go figure: the first gives too much noise, the second is too silent. Before you join the debate, please note that the discussion is fairly academic. After all, it is nothing more than looking at the mocking syntax and thinking about how it influences the production code.

Can we really say that the quality of the production code is different when it is tested with a different mocking framework? We strive to build fabulous mocking frameworks, but high-quality code is the result of skill, experience and so much more than just a choice between Mockito or jMock, TypeMock or RhinoMock, expectations before or after, etc.

At this moment, for some unknown reason, one of the readers suddenly remembered the test he wrote last week. The one that required 17 mock objects. Which one? The one where the mock is told to return the mock, to return the mock, to return the mock, to return the mock, to return the mock. Then assert something equals seven. Oh yeah, that one! The test looked quite horrible but hey, you know what they say, better a bad test than no test. Obviously, this ugly jMock made the test unreadable. The design was absolutely flawless because there were many patterns used. Fortunately, the team started using Mockito the following week! The rumor was that a Mockito test became unreadable only if the number of mocks per test (MPT) exceeded 25. Awesome! The next week, the eager developer was unpleasantly surprised with Mockito. The test with an MPT of 17 looked just as horrible. To solve the problem, the team decided to install EasyMock.

I haven’t participated in enough java projects but I dare to say this: the quality of design was no better when we did hand mocking, EasyMocking, jMocking or drinking Mockitos. Somehow drinking Mockitos was a much more pleasant experience, maybe because it’s a darn good drink.

No hangover guaranteed. (Eastern Europeans only).