The Curse of Interesting Times by Paul Gerrard

  • 04/02/2014
  • Posted by EuroSTAR

Last year we had a very popular blog post series here on the EuroSTAR Blog called ‘A View From The Chair’ with Michael Bolton. Following the success of and interest in that series, we have decided to continue it with the 2014 Programme Chair, Paul Gerrard. Here is the first blog post of this year’s series.

The saying ‘May you live in interesting times’ has often been quoted as a Chinese curse, most famously by Robert F. Kennedy. Kennedy went on to say, ‘like it or not, we live in interesting times’, but he suggested that the 1960s were a time of great opportunity for creativity and change.

Apparently, there’s no record of the original Chinese curse. It’s interesting that ‘interesting times’ could be regarded as a curse at all, but it is understandable. If ‘interesting’ means we are living in a period of change and uncertainty, then it’s hard to argue with that. We are indeed living in interesting times. But, like Kennedy, I’d like to take the glass-half-full perspective.

The reason I mention this is that there are just a few weeks to go before the Call for Submissions for EuroSTAR 2014 closes (on February 14th – Saint Valentine’s Day). This is the time of year when I would be writing submissions for the show. I won’t be, of course, but I wondered: if I were submitting something, what might I be thinking about?

I like to talk about stuff that other people aren’t talking about. (You don’t???) So, inspiration for me usually comes from thinking about new technologies and changes taking place in the IT, software or computing business. Let me tell you what has caught my attention over the past 12 months or so.

The banking crisis in 2008 and the recession it caused have been a challenge to all of us in our personal and professional lives. If you had a steady job in a business that was growing, you were one of the lucky ones, I guess. For most people, job security diminished and contract rates were squeezed as projects were cancelled or shelved. For those in work, the expectation was inevitably to do more with less. Several people I know got out of testing or IT, or even retired.

But we now hear of recovery. Optimism is returning. Sustained growth for most European economies now seems likely (there is even talk that we might be entering a boom). Companies have hoarded a mountain of money (earning little interest) and are now about to invest in projects that have been shelved for years. The floodgates are about to open and we’ll all get rich. Or not. It seems to me that the increasing demand for people with the right skills, and especially new skills, cannot be satisfied – at least in the short term. How will it affect your business or your career?

One response to the potential skills crisis is the Shift-Left approach. Over the course of the last year or so, “Shift-Left” has emerged as the term most often used to denote a restructuring of development and test teams. The ‘testing is dead’ meme was alarmist, but for a significant number of companies, the promise of moving testers into development teams, embedding them with programmers and adopting Continuous Delivery and DevOps approaches is too great to resist. (A cynic might say that embedding testers in development teams removes the problem of having to justify huge test budgets and the delays that go with them.)

The principle is that testers will validate and thereby improve requirements, and create test scenarios to enhance business stories that can feed the Behaviour-Driven or Test-Driven Development approach. With a DevOps culture, a company can practise Continuous Delivery, driven by information derived from production testing, web analytics and test analytics. Changes must be delivered in a very short timeframe, so all changes are scoped to be very small – just a few story points in size, perhaps. It’s a very different culture and approach from those Waterfall ways, needless to say.
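To make that concrete, here is a minimal sketch of the kind of small, tester-authored scenario that might feed such a TDD or BDD loop. The ‘discount code’ story, the function name and the figures are all hypothetical.

```python
# A hypothetical "discount code" story expressed as small, checkable scenarios.
# The production function below stands in for whatever the developers build.

def apply_discount(basket_total, code):
    """Apply a 10% discount for a valid code on qualifying baskets (invented rule)."""
    if code == "SPRING10" and basket_total >= 20.0:
        return round(basket_total * 0.9, 2)
    return basket_total


def test_discount_applies_to_qualifying_basket():
    # Given a basket worth 25.00 and a valid code (the business story)
    # When the discount is applied
    # Then the customer pays 10% less
    assert apply_discount(25.00, "SPRING10") == 22.50


def test_discount_ignored_below_threshold():
    # Given a basket below the qualifying threshold
    # Then the total is unchanged
    assert apply_discount(15.00, "SPRING10") == 15.00


if __name__ == "__main__":
    test_discount_applies_to_qualifying_basket()
    test_discount_ignored_below_threshold()
    print("All scenario checks passed")
```

The point is not the code but the granularity: a scenario this small can be written, automated and delivered within a single short cycle.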

Online businesses in particular are increasing dramatically in number and size. Most do not have the baggage of waterfall or structured methods, so they often have development teams but no dedicated test team. Have no doubt that testing is happening; it just isn’t separated as a staged activity. It’s embedded in the development process. Their software delivery processes are very highly automated and require great discipline and repeatability. In effect, the team operate very much like a production-line cell. The team have control over their local processes, but must align with the cadence of the delivery of new stories from business analysts and testers. It’s a dramatic change, and a lot of companies are attempting it.

It may be a contentious view, but I would argue that Continuous Delivery is not Agile but rather the alternative to Waterfall and Agile. I’ll be writing about this proposal in a future newsletter.

The debate on whether testers should learn how to write code is ongoing, but the shift-left paradigm seems to force the issue. Surely, if testers are pairing with developers, they have to learn how to be programmers? Well, I don’t think it is as clear-cut as that. In one team structure, the tester might be closely aligned to a business analyst, in which case perhaps no coding skills are necessary. But at the other extreme, programmers might pair off to write unit tests and the code itself. In this case, the pair might have the same coding and testing skills. There is no ‘tester v developer’ anymore.

I think that, actually, there is a range of coding skill levels between these two extremes. For example, being able to read, understand and review code requires one set of skills; writing and running unit tests requires another; and the ability to script utilities to manipulate test data or XML, or to scrape web content, requires others still. We need to stop talking about testers becoming programmers and start talking about testers acquiring new capabilities. Some capabilities can be acquired without a massive investment in time or training. I’ll be blogging about that in a future edition too.
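As an illustration of the middle of that range, here is the sort of small, throwaway script a tester might write to manipulate XML test data. It uses only the Python standard library, and the element names, fields and addresses are invented for the example.

```python
# A throwaway test-data utility: anonymise email addresses in an XML fixture.
# The structure and field names here are invented purely for illustration.
import xml.etree.ElementTree as ET

SAMPLE = """
<customers>
  <customer id="1"><name>Alice</name><email>alice@example.com</email></customer>
  <customer id="2"><name>Bob</name><email>bob@example.com</email></customer>
</customers>
"""

def anonymise(xml_text):
    """Replace real-looking email addresses with safe, obviously-fake test values."""
    root = ET.fromstring(xml_text)
    for customer in root.findall("customer"):
        email = customer.find("email")
        email.text = "test+{}@example.invalid".format(customer.get("id"))
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    print(anonymise(SAMPLE))
```

Writing something like this is a capability, not a career change: it needs far less investment than becoming a full-time programmer.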

The Open Source revolution is gaining pace. A few years ago, the choice of free tools out there was limited and they weren’t that great. Now, there are thousands and thousands. Of course, most are written by developers for developers. But the tools available to testers now cover test execution, testing frameworks, incident tracking, performance/load generation, network monitoring, continuous integration, data analysis – the list goes on and on.

In the past, it was small companies or the likes of Google and Amazon who adopted such tools, but an increasing number of fast-growing businesses now owe their existence to open source. The corporates, who for a long time have used Open Source infrastructure products, may now be changing their risk-averse attitude. The majority of software development has been performed using open languages and developer tools for some years now. For testers, the drift away from proprietary tools and towards Open Source is gaining pace because the skills are available in the marketplace and an increasing number of service companies offer specialist support too.

(BTW, we want to make space in the Expo area for something to support the open source community. If you have ideas or suggestions for how this might work, do let us know.)

The concept of Big Data has been around for a few years now. Big could mean vast volumes and rapid growth in them – the emergence of NoSQL (not only SQL) databases and products like Hadoop will address the challenge of scale. More interesting to me, however, is the variety and lack of structure in that data, which poses the greater challenge for developers and testers. To date, there has been a lot of hype, but applications of Big Data and case studies are appearing.

I would suggest that performance test results and their analysis are an example of Big Data, and this might be familiar to you. But I’ve said elsewhere that Test Analytics – the capture, merging and analysis of (mostly automated) test outputs together with other lifecycle data and production logs to guide the development process – holds much promise. It’s a Big Data discipline. The ‘test analyst’ role might mean something different in the future, or maybe some testers will just become data scientists?
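As a rough sketch of what that might look like in miniature, the fragment below merges summarised test failures with production error counts to produce a crude, per-feature risk ranking. The data, feature names and weighting are entirely invented, and a real Test Analytics pipeline would obviously operate at far greater scale.

```python
# A toy 'Test Analytics' example: combine nightly test results with production
# error logs to suggest where test effort might be focused next.
# All data, feature names and weights below are invented for illustration.
from collections import Counter

# Summarised nightly test results: (feature, outcome)
test_results = [
    ("checkout", "pass"), ("checkout", "fail"),
    ("search", "pass"), ("search", "pass"),
    ("profile", "pass"),
]

# Summarised production error log entries: one feature name per error
production_errors = ["checkout", "checkout", "profile", "checkout"]

def risk_report(results, errors):
    """Rank features by a crude blend of test failures and production errors."""
    failures = Counter(f for f, outcome in results if outcome == "fail")
    prod = Counter(errors)
    features = set(prod) | {f for f, _ in results}
    # Weight production errors more heavily than test failures (arbitrary choice)
    return sorted(features, key=lambda f: prod[f] * 2 + failures[f], reverse=True)

if __name__ == "__main__":
    for feature in risk_report(test_results, production_errors):
        print(feature)
```

Here the ranking puts checkout first, then profile, then search; trivial at this size, but the same merge-and-rank idea is what turns raw test output into something that guides the development process.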

Now, where does all this Big Data come from? The familiar web and other service logging are obvious sources, but in fact most of the electronic devices in factories, retailers, banks and so on are capturing data that is used for analysis. Soon, almost every device with a power supply could be connected to the internet. The Internet of Things might well be the ‘next big thing’.

The Internet of Things, or IoT, will include your mobile, your tablet, your car and most of the electronic devices in your ‘smart home’. It’s been estimated that the average smart home will log 75,000 rows of data per day. The searches you run, the web sites you visit, the phone calls you make, the bills you pay, the things you buy, your (or your phone’s) location, and video of what you are doing and where are all recorded somewhere. That’s an awful lot of data, and it will have value to someone you don’t know and probably don’t trust.

The natural conclusion of all this will be the ‘Internet of Everything’. How exciting will that be? Except… who is going to test it and how? I’ll leave that question to you, as homework.

Right now, the things I mention above are on my radar. I am not suggesting that you submit a polemic theorising about the outcome of some of the issues I’ve mentioned. But if you have, or expect to have, real-life experiences of them in the next few months, then perhaps you could bring your story to EuroSTAR?

I hope your interesting times are not a curse.
