Test Automation Tools – Tools Are For The People

  • 08/05/2017
  • Posted by EuroSTAR

Especially as an automation tool provider, it is easy to forget the human component at the heart of any process. In this blog post, we explain the success story of our unsuccessful conference submission to the EuroSTAR testing conference.

I’m the co-founder of a startup that develops a novel GUI automation tool called Automa. As a tool provider, it is very tempting for us to see automation (for instance in testing) as the silver bullet for any problem. Fortunately, there is a community of testers who, despite being extremely busy themselves, take the time to share their experiences and give us a more rounded point of view.

We submitted a conference proposal to the EuroSTAR testing conference. In the proposal, we advocated a development and QA process where virtually all tests are automated and manual testing is only used to find new bugs not yet covered by the automated tests. This is the approach we use to develop our software, and it has served us very well.

Our proposal was not accepted, but strange as this may sound, this is where the good part began. When your submission isn’t accepted, the EuroSTAR team (and the programme chair Michael Bolton in particular) give you the chance to ask for specific reasons why your submission wasn’t accepted. Keen to find out, we asked to receive this information.

Michael Bolton got back to us with a list of several very valid points raised by the reviewers that explained why our talk would not have been a good fit for the conference. The list was already very educational, but Michael did not stop there. Although it went beyond our proposal itself, he went on to give feedback on specific posts on our blog that were attached to our submission.

We were pretty impressed by this. With over 400 conference submissions to read through (and potentially give feedback on), Michael took the time to say “by the way, I looked at your other stuff and here is what I think”. He said we’re on the right track in many ways. However, in the discussion that ensued, with Michael kindly responding to several further emails, it became clear that we needed to expand our views of testing and automation.

The first key takeaway from our discussion with Michael is probably already clear to the readers of this blog: Testing is a far more general activity than checking for regressions. In our new understanding, testing is any activity that provides or enables information about how well a piece of software meets the needs of a human being. It may sound strange, but in the developer background we come from, testing is still very much equated with regression checking.

The second lesson we took from talking to Michael is that automated and manual regression tests are very different things. Automated regression checks cannot catch defects they were not designed for, but they can serve as living documentation and are cheap to run many times. Manual tests, on the other hand, provide a lot more “qualitative” information, but are expensive to execute. The two approaches are different (and we like to think complementary!) ways of achieving the same ultimate goal.

The final realisation Michael helped us have is that automation is not just useful for regression checking. In the case of our GUI automation product for instance, it can also be useful for setting up or tearing down a test context, or any other repetitive action such as creating a new ticket in an issue tracker.
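To make the setup/teardown idea concrete, here is a minimal sketch in plain Python (deliberately not Automa’s actual API; the workspace and file names are invented for illustration). The automation prepares the context a tester would otherwise have to build by hand, and cleans it up afterwards, while the interesting testing in between stays human:

```python
import shutil
import tempfile
from pathlib import Path


def set_up_context() -> Path:
    """Create a throwaway workspace with the fixture data a tester
    would otherwise have to prepare by hand before each session."""
    workdir = Path(tempfile.mkdtemp(prefix="test-ctx-"))
    # Hypothetical fixture: a pre-filled ticket for the session under test.
    (workdir / "ticket.txt").write_text("summary: login button misaligned\n")
    return workdir


def tear_down_context(workdir: Path) -> None:
    """Remove the workspace so the next session starts from a clean slate."""
    shutil.rmtree(workdir)


workdir = set_up_context()
try:
    # In between, a human explores the prepared context (here we only
    # check that the automation did its job).
    assert (workdir / "ticket.txt").exists()
finally:
    tear_down_context(workdir)
```

The same pattern applies to any repetitive action around a test, such as creating a ticket in an issue tracker: automate the chores at the edges, and leave the judgement in the middle to a person.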

Taken together, the above observations make two things very clear: Testing is an inherently human activity, and automation is a tool that helps humans perform this activity. As such, any automation must be seen in a human context. This is remarkably reminiscent of the first principle of the famous Agile Manifesto:

“Individuals and interactions over processes and tools”

Please do remember however that the Agile Manifesto also states:

“while there is value in the items on the right, we [the Agile
Manifesto’s authors] value the items on the left more.”

The conclusion is thus: Automation does not replace human testing, but it can help those doing it. This may sound simple, but for an automation tool provider it takes some effort to arrive at ;-)

The above realisations would not have been possible had we not submitted our conference proposal to EuroSTAR. The fact that Michael Bolton and his committee provide such good feedback means that time spent writing a conference proposal is not wasted, even if the submission is not accepted. We thank Michael Bolton and his committee for the feedback on our conference submission, which went far beyond what we would have expected.



Michael Herrmann is the co-founder and tech lead at BugFree Software, an EU-funded startup that develops a novel GUI automation tool called Automa. He holds an M.Sc. from the University of Oxford, with an award-winning Master’s thesis in quantum computing. Prior to co-founding BugFree Software, Michael worked at Austria’s largest web portal for classified ads, where he was the tech lead of the platform’s anti-fraud system that handles millions of messages between users per month. Michael was a speaker at the international EVA Vienna 2008 conference “Digital Cultural Heritage – Essential for Tourism”, where he presented a paper on image-based measuring of ancient coins.
