About a year ago I attended a meetup at King in Stockholm featuring James Bach. During his talk James drew a number of well-known geometrical shapes like triangles, squares and circles on the whiteboard. Some were filled with lines and some were empty. He then asked the audience a question along the lines of:
-"Raise a hand if you see a filled triangle on the board."
We could clearly see that there was a filled triangle on the board, and eager to please the great Mr. Bach we all happily raised our hands. He then looked at us in the audience for a while before turning to the board and saying:
-"What do you mean? The triangle is empty!"
While pointing at the empty circle on the board...
We all assumed that we shared a common definition of a triangle, and, focused on the task at hand, we all missed the opportunity to ask just one simple question that would have given us a common understanding of what he meant when saying "triangle".
I sometimes think back on this, and on how often we miss the opportunity to ask questions and clarify things.
It becomes extremely apparent among our applicants when they are faced with a technical assignment we use in our hiring process.
The assignment consists of some high-level instructions and access to a WSDL for a simple calculator. The short instructions contain the following information:
- Set a scope and define requirements based on what you can find out through the supplied WSDL
- Provide a set of automated tests that you think are necessary to verify the requirements using .Net / C#
- Tests should cover functional and non-functional requirements for the given service
- Make a simple report of your findings
- If you have any questions about the assignment or the process, just get back to us.
So the applicants can basically set a scope, come up with their own requirements, cover them with the needed tests and produce a simple status report. Simple enough, one would think?
So far all applicants have provided some sort of automated test suite. Some have created good test cases, showing that they at least know the basics of input validation, equivalence classes, boundary values, exception handling, etc.
Some have provided some sort of status report; others have actually covered the seeded defects with tests but never reported finding anything out of the ordinary.
No one has provided any information regarding scope or requirements...
When we ask why they haven't listed even the simplest requirement, some say that it's obvious what a calculator does (1), but more often we hear that they did not know what the requirements were (2).
- We are back to the "assumption is the mother of all f_up's" scenario again. If I don't know what their understanding of the application is, it's hard for me to say that the supplied test cases are in any way conclusive.
- If they don't know what the requirements are, then why not just come back to us with some questions about that? The fact is that we have a ready set of requirements to supply them with if they ask for it.
We end up with many applicants who flunk this fairly simple assignment just because they make incorrect assumptions or do not bother to ask for more information.
If you are not asking questions, or using some tool or model like specification by example, BDD or similar to reach a common understanding of the task at hand, it's about time you started!
"The only true wisdom is in knowing you know nothing"
Image by @kirsty_joan