


About Accenture Testing Services
To succeed in a high-velocity, software-driven world, Accenture helps companies transform testing into a continuous and efficient end-to-end quality engineering function. Accenture offers the entire breadth and depth of Testing services across a broad array of applications, digital technologies and industries delivered under flexible business arrangements to help companies shape their digital future and lead their industry while delivering an exceptional customer experience. With more than 31,000 testing professionals across more than 30 delivery centers in 17 countries, Accenture serves over 1,000 testing clients across more than 40 industries. The company combines industry specialization, pioneering cognitive automation, digital leadership and cloud innovations delivered through an unmatched global network of alliances and delivery centers to power high performance.

About Accenture
Accenture is a global management consulting, technology services and outsourcing company, with more than 358,000 people serving clients in more than 120 countries. Combining unparalleled experience, comprehensive capabilities across all industries and business functions, and extensive research on the world’s most successful companies, Accenture collaborates with clients to help them become high-performance businesses and governments. The company generated net revenues of US$31.0 billion for the fiscal year ended Aug. 31, 2015. Its home page is www.accenture.com.


Confessions of an Integration Tester

This presentation looks at the role of the integration tester, focusing on confessions drawn from experience

  • High Speed Testing Cycles

    This presentation looks at high speed testing cycles

  • OAT The Cinderella of the Lifecycle

    This presentation looks at OAT and how it is the Cinderella of the software lifecycle

  • Testing: A Heretic’s View

    This presentation looks at testing from a heretic’s perspective

  • Going OO – A Model For Test Process Transition

    This talk looks at going OO and a model that the presenter used for test process transition

  • A Complete Guide to Evaluating Testing Tools

    This presentation aims to show a complete guide to evaluating testing tools.

  • Automated Testing Environment for SIM Application Toolkit

    This presentation looks at the automated test environment for SIM Application Toolkit

  • Design and Replay Exists and It Works

    This presentation focuses on design and replay and how to make it work properly

  • Get It Right First Time dot com

    This presentation discusses how to get web testing to work first time

  • Web Testing – Re-Inventing the Wheel

    This presentation takes a look at web testing and the challenges it has brought to testing in a totally new environment

  • E-Testing the Airline

    This presentation looks at the e-testing of an airline

  • E-Testing: Focus on User Perception

    This presentation takes a look at e-testing and how a focus on user perception can be beneficial

  • The ABC of e-Testing

    This presentation takes a look at the ABC of e-testing

  • From Art of Software Testing to Leadership

    This presentation looks at the art of software testing and how it can move towards effective leadership

  • Towards a More Mature Test Process

    This presentation looks at moving towards a more mature test process

  • Risk – Is This the Centre of Our Universe

    This presentation takes a look at risk and discusses if it is the centre of a tester’s universe

  • Using Risk Analysis to Prioritise Your Testing

    This presentation looks at the use of risk analysis to prioritise your testing

  • Quality Assurance of ‘Handed On Software’

    This presentation looks at QA of ‘handed on’ software

  • It’s Just Common Sense

    This presentation looks at a common sense approach to a project

  • Experiences of Test Driven Development

    This presentation takes a look at the presenter’s experiences of test driven development

  • Structured Testing Methodology for the Embedded System

    This presentation takes a look at structured testing methodologies for embedded systems

  • How to Achieve Maintainability, Manageability & Traceability

    This presentation looks at how to achieve maintainability, manageability and traceability

  • Did Your Quality Initiative Fail Again?

    This presentation focuses on quality initiatives and why they sometimes do not work

  • Testing has Become a Bottleneck in Software Development

    This presentation takes a look at whether testing has become a bottleneck in software development

  • Testing Broadband Internet at a Large Telecom Operator

    This talk looks at a case study where the presenter was involved in the testing of broadband internet at a large telecom operator

  • Track Testing

    This presentation looks at the specialist technique of track testing

  • eXtreme Programming: The Role of Inspections & Reviews

    This presentation focuses on eXtreme Programming and the role of inspections and reviews

  • Time Management Control in a Validation Project: Why is it so Hard?

    This presentation takes a look at why time management is so difficult during validation projects

  • The Troubled Project: Best Practice From Theory to Reality

    This presentation looks at an actual troubled project and how best practice can be implemented from theory to actual reality

  • Usability Testing – Starting with Satisfied Customers

    This presentation takes a look at usability testing and how it is possible to start off with satisfied customers

  • Management of Test Cases Using Database Concepts

    This talk explains how test cases can be managed using database concepts

  • End to End Testing, As Little as Possible

    This presentation looks at end to end testing and why the presenter tries to do as little of it as possible

  • Pitfalls in Euro Testing

    This presentation talks about the pitfalls involved in Euro Testing

  • Dot.com or Dot.Bomb? A Dress Rehearsal Strategy

    This presentation looks at whether a lot of internet based testing is ‘dot.com’ or ‘dot.bomb’

  • Prediction of Software Failures Based on Systematic Testing

    This presentation looks at the prediction of software failures based on systematic testing

  • Structured Internet Performance Testing

    This presentation takes a look at structured internet performance testing

  • Iterative Development – Don’t Forget the Testers

    This presentation takes a look at iterative development and why it is so important to not forget the testers

  • Sharing Knowledge with Test Patterns

    This presentation looks at how to share knowledge with Test Patterns

  • Improving Developer’s Tests

    This presentation looks at how developers’ tests can be improved upon

  • Experiences From End to End Business Integration Testing

    This presentation looks at experiences from end to end business integration testing

  • Failure Is Not An Option

    This presentation focuses on the internet and looks at how failure is not an option

  • Risk & Benefits in Monolithic & Iterative Life-Cycles

    This presentation takes a look at the risks and benefits in Monolithic and Iterative lifecycles

  • Implementing the Theory of Constraints

    This presentation looks at implementing the Theory of Constraints

  • Testing a Chain of Applications

    This presentation looks at testing a chain of applications

  • Pain Vs Gain Testing, The Impact of Web Sites

    This presentation looks at the impact of web sites on testing as a profession

  • Build It Fast & Build It Right

    The presenter looks at how to actually build it fast but also build it right

  • Clearing the Testing Minefield

    The presenter looks at how to clear the testing minefield

  • System Testing Process in Iterative Environment

    This presentation looks at systems testing in an iterative environment

  • E-Commerce: The Enemy Within

    The presentation looks at the enemy for testers within E-Commerce

  • Testing in Quick Time – Some Case Studies

    This presentation takes a look at some case studies focusing on testing in quick time

  • Managing a Testing Improvement Programme – If Only I’d Known

    This presentation looks at the setting up of a test improvement programme by the presenter and what he wishes he had known in advance

  • Test Tools, What Exists and What to Select

    This presentation looks at what tools exist in the marketplace and which you should select

  • The Role of Metrics in Testing Process Improvement

    This presentation looks at the role of metrics in TPI

  • Delivering Education in Software Testing: Challenges & Solutions

    This presentation looks at how education on software testing can be delivered and the challenges & solutions involved

  • Performance Testing “Step on It”

    This presentation focuses on performance testing and how to speed it up

  • Theory and Practice in TPI

    This talk looks at the theory behind TPI and also TPI in practice

  • AGEDIS-Research into Automated Generation & Execution

    This presentation looks at research into automated generation and execution

  • Development & Validation of a Metric Based Test Maturity Model (1)

    This presentation takes a look at the development and validation of a Metric Based test maturity model

  • Building Internal Certification Schemes for Testers

    This presentation looks at how internal certification schemes for testers can be implemented

  • Emotional Intelligence As The Key To Software Quality

    This talk looks at emotional intelligence as the key to software quality

  • Testing Beyond Your IQ

    This talk looks at testing beyond your IQ

  • Application Fields for Evolutionary Testing

    This presentation looks specifically at application fields for Evolutionary testing

  • Release Metrics: When to Stop Testing with a Clear Conscience

    This presentation looks at release metrics and knowing when enough testing is enough

  • Managing the Learning Curve of Software Testers

    This talk looks at how the learning curve for software testers can be managed

  • Non Functional Testing Standards Workshop

    This workshop looks specifically at non functional testing standards

  • I Don’t Need a Test Strategy

    This presentation looks at whether a test strategy is always needed and looks at the occasions where ‘I Don’t Need A Test Strategy’

  • Considerations in Automatic Tests Development

    This presentation takes a look at considerations in automatic tests development

  • Traditional Coverage Metrics: How Meaningful Are They?

    This presentation takes a look at whether traditional coverage metrics are actually meaningful any longer

  • Recruiting & Retaining the Right Test Personnel

    This talk looks at the recruitment process for testers and how to retain the right test personnel

  • SPICEing Your Software Standards – A Tasteful Approach

    This presentation takes a look at SPICE and how it can be applied to software standards

  • Successful Test Strategies

    This talk looks at examples of successful test strategies

  • State of the Art Automation

    This presentation looks at the tools used for state of the art test automation

  • The Importance Of Agreed Software Quality Metrics

    This presentation looks at the importance of agreed software quality metrics

  • Can Anybody Work In A Test Team

    This presentation discusses whether anybody can become a member of a test team

  • Software Testing Requirements in Safety Related Standards

    This presentation looks at testing requirements in safety related standards

  • The Embedded Internet – Towards 100 Billion Devices

    This presentation looks at the embedded internet and the move towards 100 billion devices

  • The Benefits of Test Automation in End to End Testing of Mobile Networks

    This presentation from the 2002 EuroSTAR Software Testing Conference focuses on the importance of test automation in end to end testing of mobile networks.

  • Mind Mapping 101 for Testers

    This presentation takes a look at the advantages of mindmapping for test professionals

  • Changing Testing At Legal & General

    This presentation takes a look at how testing has changed at Legal & General

  • A Test Career, Especially When Things Look Bad

    This presentation takes a look at a test career, specifically when things look bad

  • KISS with Metrics

    This presentation takes a look at KISS and how it can be used with test metrics

  • Quantifying the Benefits of Investing in Test

    This presentation looks at how the benefits of investing in test can be quantified

  • Stetsons and Testers – A Practical Guide To Recruiting Testers

    This presentation explains a practical guide to recruiting testers

  • Get an Organisation Test Wise

    This presentation looks at how an organisation can become test wise

  • Risk Based Testing: A Common Language for Project Stakeholders

    This presentation takes a look at how risk based testing can be a common language for project stakeholders

  • Get Your Message Across: Reporting for Maximum Impact

    This presentation looks at how testers can get their message across for maximum impact

  • Risk Based Testing & Metrics: Experiences at ABN Amro Bank

    This presentation presents a case study from ABN Amro bank where Risk Based Testing and Metrics were used

  • Requirements: Adding Value By Testing Requirements

    This presentation takes a look at how value can be added by testing requirements

  • Common Web-Based Bottlenecks – Isolation and Identification

    This presentation focuses on web-based bottlenecks and more specifically how they can be isolated and then identified.

  • Experiences of TPI Using TIM

    This presentation takes a look at experiences of using TPI with TIM

  • Adding Value by Static Testing

    This presentation looks at how added value can be achieved through static testing

  • Testing to Die For

    This presentation compares testing to a samurai way of life

  • Developer Testing To Reduce TTM on Sony-Ericsson Smartphones

    This presentation takes a look at a case study where developer testing was used to reduce TTM on a Sony Ericsson device

  • TPI Initiating SPI

    This presentation looks at TPI and how it can initiate SPI

  • Performance Testing: The Right Tool

    This presentation takes a look at performance testing and how to select the right tool for it

  • Measuring for Success

    This presentation looks at how success can be achieved and then measured

  • Extreme Programming – What We Should Learn From It

    This presentation takes a look at extreme programming and what testers should be learning from it.

  • Upgrade Your Test Process: An Approach That Really Works

    This presentation looks at an approach to upgrading your test process that really works

  • If It Wasn’t for The Users

    This presentation looks at the scenario – what if it wasn’t for the users?

  • Test Assessments based on TPI

    This presentation looks at test assessments based on TPI

  • Test Management Tools: What Can They Do For You?

    This presentation looks at an array of test management tools and shows what they can do for you

  • Myths and Legends of Testing

    This presentation looks at some of the common myths and legends associated with software testing

  • Think Design, Improve Testability

    This presentation looks at how by thinking design you can improve testability

  • Testing WebSphere Voice Server

    This presentation looks at experiences and lessons learnt from testing the WebSphere Voice Server

  • TPI Made Easy(er)

    This presentation looks at how TPI can be made easy or at least easier

  • Test Monkeys – The New Members of Your Team

    This presentation looks at new members of the test team – Test Monkeys!

  • Adding Value Through Outsourcing Testing Related Risk

    This presentation takes a look at how added value can be achieved through the outsourcing of testing related risk

  • Quantifying Test Value: Some Examples and a Case Study

    This presentation looks at quantifying test value and looks at a case study illustrating this

  • Four Quantifiable Ways Testing Adds Value

    This presentation looks at four quantifiable ways testing can add value

  • What is the Value of Testing & How Can we Increase It?

    This presentation takes a look at the value of testing and looks at how we can increase it

  • Increasing the Value of Every Person On Your Team

    This presentation takes a look at how you can increase the value of every person on your test team

  • If We Could Do It Over Again, What Would We Do Differently

    This presentation looks at what we might have done differently if we had the chance

  • Testing In The Initial Project Phases

    This presentation takes a look at the testing function in the initial project phases

  • Structured Testing in DSDM Projects

    This presentation takes a look at structured testing in DSDM Projects

  • Real Life Management of Complex Testing Environments

    This presentation takes a look at the real life management of complex testing environments

  • Challenges in Testing Real-Time Systems

    This presentation looks at the challenges in testing real-time systems

  • Managing Integration Testing in the Large – The Cluster Approach

    This presentation takes a look at the cluster approach and managing integration testing

  • How To Justify the Cost of Testing to Business Decision Makers

    This presentation looks at how to justify the cost of testing to key decision makers

  • A Day in the Life of a Test Manager

    This presentation looks at a day in the life of a test manager

  • Testing as A Driver for Development Changes

    This presentation looks at testing as a driver for development changes

  • Testing Iterative Incremental Development, Sticking to the Principles

    This presentation takes a look at testing iterative incremental development projects

  • Cost & Quality Effective Management of your Test Environments

    This presentation takes a look at the cost and quality effective management of test environments

  • Enterprise Wide Testware Architecture

    This presentation takes a look at enterprise wide Testware Architecture

  • Information Security and Testing

    This presentation looks at information security and its relationship with testing

  • Adventures in Session Based Testing

    This presentation takes a look at adventures in session-based testing

  • The Seven Deadly Sins of Test Management

    This presentation outlines the seven deadly sins of test management and how you can prevent them

  • Using Earned Value to Track Software Testing Projects

    This presentation looks at how using earned value to track software testing projects can have its advantages

  • A Quantitative Approach to Test Schedule Control

    This presentation takes a look at a quantitative approach to test schedule control

  • Around The Test Management World in 180 Days

    A novel presentation looking at the test management world

  • Testing Challenges of Incremental Component Based Development

    This presentation looks at the challenges of incremental component based development

  • How to Set Up & Operate A Usability Testing Lab

    This presentation looks at the process of setting up and operating a usability testing lab

  • Emerging Techniques in Testing Embedded Systems

    This presentation takes a look at emerging techniques in testing embedded systems

  • No Silver Bullets but 20 Golden Rules

    This presentation explains that there are no silver bullets but rather a set of 20 rules

  • Testing Daily Builds in an Iterative Development Process

    This talk looks at how tests can be conducted on daily builds in an iterative development process

  • Testers Are Doing It For Themselves

    This presentation from Julie Gardiner looks at the role of a software tester in a team and how testers can take ownership of their career development, looking specifically at the mindset and attributes of a good software tester and at creating an action plan for investing in that development.

  • A Way of Making Developers’ Tests More Successful

    This presentation looks at a way of making developers’ tests more successful

  • The Interplay Between Model Coverage and Code Coverage

    This presentation looks at the interplay between code coverage and model coverage

  • Foundations for Successful Testing

    The presenter looks at the vital foundations for successful software testing

  • Statistical Usage Testing Using the MaTeLo Tool

    This presentation focuses on the test tool, MaTeLo

  • Testing Improvements – Measuring the Benefits

    This presentation looks at how the benefits can be measured as a direct result of testing improvements

  • Developer Testing with XP

    This presentation focuses on developer testing with XP

  • Automatic Extraction of Test Cases from a Semiformal Specification

    This presentation looks at the extraction of test cases from a semiformal specification

  • Influencing Management with Project Intelligence (PI)

    This presentation looks at how management can be influenced by project intelligence

  • Why Not to Automate Testing?

    This presentation discusses test automation and test driven development and when not to automate

  • KISS with Metrics and ROI – 12 Months On

    This presentation looks at the situation with KISS and Metrics 12 months on

  • Testers Introduction to Agile Software Development

    This presentation aims to introduce testers to agile software development

  • Higher Efficiency by Integrated Test Design & Automation

    This presentation looks at how higher efficiency can be achieved through integrated test design and automation

  • Testing Solutions for Datawarehousing

    This presentation focuses on testing solutions for data warehousing

  • Building Success: Beyond The Obvious

    This presentation takes a look at building on success and perhaps looking at some not too obvious means of accomplishing this.

  • Successful Test Evaluation in Concurrent Systems

    This talk looks at successful test evaluation in concurrent systems

  • Improving Testing In A Small Organisation

    This presentation speaks about how testing can be improved within a small organisation

  • People Issues In Successful Testing Projects

    A talk that looks at the importance of people issues in successful testing projects

  • Use of Inspection as a Risk Mitigation Tool

    This presentation looks at the use of inspection as a risk mitigation tool during software testing

  • How CMMI Contributes to The Testing of Software

    A look at how CMMI contributes to the testing of software

  • Effective People Skills for Testers and Developers

    This presentation looks at effective people skills for both testers and developers

  • What is Success for the Tester?

    This talk looks at what constitutes success to a tester

  • Exploratory Testing from a Geological Perspective

    This presentation takes a look at exploratory testing from a geological perspective

  • Improve Your Estimates by Using Historical Data

    This presentation explains how you could improve your estimates by using historical data

  • Building on Real Success

    This presentation looks at how to build further on real success

  • The Agile Way to Success

    This presentation looks at how agile testing can lead to success

  • Process Improvement – It’s Not Rocket Science

    A look at process improvement and how it is not rocket science

  • 10 Practical Rules for Change Management

    This presentation outlines 10 practical rules for change management

  • Banking on Success – Implementing Test Management in ABN AMRO

    A presentation that discusses how test management was implemented at ABN AMRO

  • Test Improvement Success at Philips Medical Systems

    A look at test improvement success at Philips Medical Systems

  • Automated Test Ability: A Prerequisite for Successful Test Automation

    This talk looks at test automation and the conditions necessary for it to succeed

  • Solid As A Rock

    This presentation takes a look at the importance of a solid presence in a project

  • Be A Better Tester, Be A Beta Tester

    This talk discusses how to become a better tester by becoming a ‘beta tester’

  • Test Process Improvement – An Empirical Study

    This presentation looks at an empirical study conducted on TPI

  • The Accidental Project Manager, QA to the Rescue

    This presentation looks at the role of the ‘accidental’ project manager and how QA can assist

  • Testers Rights: What Test Should Demand from Others, and Why?

    This keynote talk by Tom Gilb from the 2003 EuroSTAR Software Testing Conference discusses testers’ rights and what they should be demanding of other stakeholders involved in software development, and why!

  • Scripted Versus Exploratory Testing

    This presentation takes a look at scripted versus exploratory testing – the advantages and disadvantages of both

  • Testing a Key Public Infrastructure

    This presentation takes a look at the testing undertaken on key public infrastructure

  • Safe Security Testing – Avoiding the Pitfalls

    This presentation looks at security testing and how to achieve safe security testing and avoid the pitfalls along the way

  • Test Manager As A Service Manager

    This presentation looks at whether the role of a test manager is interchangeable with that of a service manager

  • Building a Test Organisation, The Long and Winding Road

    This presentation looks at the long process associated with setting up and building a test organisation within a company

  • Ghostbusters: Safety Testing

    This presentation looks at safety testing from the perspective of a ‘ghostbuster’

  • The Factory: A Success Story Over 3 Phases

    This presentation looks at a success story entitled ‘The Factory’ – an interesting case study

  • The Double Backslash Model (\\ Model)

    This presentation looks at the double backslash model (\\ model)

  • The Complexities of Performance Testing

    This talk focuses on the complexities involved whilst engaging in performance testing

  • The Art of Managing Fixed Price/Date/Quality Test Project

    This presentation takes a look at the art of managing test projects with a fixed price and deadline

  • Designing & Implementing the Toyota Unified (Test) Process

    This presentation looks at how the Toyota Unified Process was designed and implemented and how the test function was approached.

  • Another Week Of Testing

    This presentation looks at the question – when has enough testing been done?

  • Test Managers: Chameleons of the Project World

    A look at the function of test managers and a discussion as to whether they are essentially chameleons of the project world.

  • Oh and We Want Some Performance Testing

    This presentation looks at the need for performance testing and the increasing demand for it.

  • Customer Specific Risk Based Tests

    This talk looks at risk based tests and how to make them more customer specific

  • Iterative Test – Make It Work

    This presentation looks at iterative testing and how you can make it work for you

  • Test Design – One Successful Way Forward

    This presentation focuses on test design and takes a look at a successful way forward

  • Q Gates: Structuring the QA Process

    This presentation takes a look at how Q Gates can aid in structuring the QA process

  • Usability Testing: Recipe for Success

    This presentation looks at usability testing and how you can mix up a recipe for success

  • Not Testing From Outside the Firewall? It Will Come Back To Haunt You

    This presentation looks at testing outside the firewall and its importance

  • The Journey to Test Automation Maturity

    This presentation looks at the journey towards Test Automation Maturity

  • What We Knew 10 Years Ago But Still Do Not Do and Why

    In the presentation, the presenter looks at things testers knew 10 years ago but still do not do today and explains why this is so.

  • Increasing Responsibilities and Economy of Software Inspection

    This presentation takes a look at the increasing responsibilities and economy of software inspection

  • Development Manager’s View of Testing

    This presentation looks at testing through the eyes of a development manager.

  • Secrets of Test Driven Development

    In this presentation, the presenter shares some secrets of test driven development

  • Optimising Testing of Embedded Software

    This presentation looks at how to optimise the testing of embedded software

  • Becoming A Test Expert

    In this presentation, the presenter gives some hints and tips on how to become a test expert

  • Testability Reviews

    This presentation focuses on Testability reviews

  • Helping Developers To Accept Testing

    A talk on how developers can be helped to ‘eventually’ accept testing

  • Agile Development: The Story Of A Story

    This presentation looks at an agile project and how it became the story of a story

  • Non Functional Testing of Embedded Systems

    This presentation takes a look at non functional testing of embedded systems

  • TPI Benchmark

    This presentation is an in-depth analysis of the TPI benchmark

  • Review UML As Part Of The Project Team

    This presentation looks at the process of reviewing UML as part of the project team.

  • How Requirements and Testing Fit Together

    This presentation takes a look at how requirements and testing do fit together

  • Breaking Down The Barriers Between Testers and Developers

    This presentation looks at ways to break down the barriers between testers and developers

  • Integrated Development/Test Project Teams

    An insight into the workings of an integrated development/test team.

  • Integrated System and Test Development – The UML Testing Profile

    This presentation takes a look at the UML Testing profile and test development

  • Reusable Testware Through Object Orientation

    This presentation looks at reusable testware in the context of object orientation

  • 13 Common Attacks On The QA-Test Manager

    This presentation details 13 common attacks on the QA-Test Manager and how to deal with them

  • Effective Testing With Limited Resources

    This presentation looks at how to ensure effective testing with limited resources

  • Technical Testing: Bridging The Gap Between Testers and Development

    A look at how the gap between testers and developers can be bridged whilst performing technical testing

  • The Essence of Requirement Specification

    This presentation looks at requirements specification and its very essence

  • The Sundering Flood – Tests We Didn’t Do

    This presentation focuses on those tests that aren’t done and the results and consequences of this

  • Testability – Knowing Your Tests

    This presentation takes a look at the importance of knowing your tests.

  • Test Set Reduction by Optimising Decision Tables

    This presentation discusses test set reduction by optimising decision tables

  • A Practical Model for Programme Test Management

    This presentation takes a look at a practical model for programme test management

  • Testing Competence

    This presentation focuses on testing competence and how it can be achieved and/or improved upon

  • Planning For Plan-Driven Vs Agile Testing

    This presentation looks at how to plan for the plan driven vs agile testing debate

  • What Can Be Done About Software Reliability

    This presentation focuses on software reliability and what can be done to improve it.

  • Future Analysis: Preventing and Detecting Imminent Software Disasters

    This presentation looks at how to prevent and detect imminent software disasters using a technique known as future analysis

  • Risk Mitigation Trees: Review Test Handovers with Stakeholders

    This presentation looks at risk mitigation trees and how test handovers with stakeholders are reviewed

  • Assessing Tester Ability – Some Proposed Test Techniques

    This presentation looks at test techniques that help to assess tester ability

  • Automation Tools and Exploratory Testing

    This presentation looks at certain automation tools and how they work while engaging in exploratory testing

  • Beyond Conformance: Preparing Software for Hostile Environments

    This presentation looks at how software is prepared for hostile environments

  • The Relative Attributes of Test Techniques

    This presentation takes a look at the relative attributes of different test techniques

  • How To Define Efficient Test Goals & Strategy

    This presentation looks at the best way to define efficient test goals and strategy

  • Realising The IB-Group Organisational Test Policy

    This presentation takes a look at how the IB-Group’s test policy was realised with the aid of Capgemini

  • The Testing Maturity Model in the Real World

    A look at the test maturity model as applied in the real world

  • Automated Testability, The Missing Link in Test Automation

    This presentation looks at automated testability as the missing link in test automation

  • Building Test Automation Frameworks With Finite State Machines

    This talk takes a look at the process of building and implementing test automation frameworks with finite state machines
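    As a rough illustration of the underlying technique (an assumed sketch, not the speaker’s framework), a finite state machine can drive test generation directly: walk the transition graph and emit an input sequence that fires every transition at least once.

```python
# Minimal sketch of FSM-driven test generation (illustrative; the login
# machine below is hypothetical). Repeatedly search from the current state
# for the nearest transition not yet exercised, then append the inputs
# needed to reach and fire it.

from collections import deque

transitions = {  # (state, input) -> next state
    ("logged_out", "login_ok"): "logged_in",
    ("logged_out", "login_bad"): "logged_out",
    ("logged_in", "logout"): "logged_out",
}

def cover_all_transitions(start):
    todo = set(transitions)          # transitions still to exercise
    state, sequence = start, []
    while todo:
        queue, seen, found = deque([(state, [])]), {state}, None
        while queue and not found:   # BFS for a path firing an unvisited transition
            s, path = queue.popleft()
            for (src, inp), dst in transitions.items():
                if src != s:
                    continue
                if (src, inp) in todo:
                    found = (path + [inp], dst, (src, inp))
                    break
                if dst not in seen:
                    seen.add(dst)
                    queue.append((dst, path + [inp]))
        if not found:
            break                    # remaining transitions unreachable
        path, state, fired = found
        sequence += path
        todo.discard(fired)
    return sequence

print(cover_all_transitions("logged_out"))
```

    Replaying the returned sequence against the implementation, asserting on the expected state after each step, yields a transition-coverage test suite.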

  • Effective Feedback From Quality Tools During Development

    A presentation that looks at the importance of effective feedback from quality tools during development

  • Test Automation: A Developer’s or Tester’s Job?

    A talk that looks at whether test automation is a developer’s or a tester’s job.

  • Six Steps Towards Successful Load Testing

    This presentation details six steps towards successful load testing

  • Becoming A ‘Trusted Advisor’ – How To Present The Testing Message Effectively

    This presentation explains how you can become a trusted test advisor and looks at the best ways to present the testing message effectively.

  • Communication and Assertiveness

    This talk looks at the importance of communication and assertiveness for test professionals

  • Career And Competence Development Of Test Personnel

    This presentation looks at the development of test professionals’ careers and the competencies that require development in a personal capacity

  • How To Break Software Security

    This talk looks at methods by which software can be broken

  • Measuring the Effectiveness of a Test

    This presentation looks at how best to measure the effectiveness of a test.

  • What Testers Can Learn From Scientific Methods

    This presentation explains how test professionals can learn much from looking at scientific methods

  • Setting Up and Managing A Test Lab

    This presentation looks at the entire process from start to finish of setting up and managing a test lab.

  • Alternative Test Tools In Action

    A look at alternative test tools available

  • The Man In The Middle

    A presentation looking at performance testing efforts and the role of the middle-man

  • Testing Real Time Properties in Large Architectures

    This presentation takes a look at testing real time properties in large architectures.

  • Better By Design

    A presentation from EuroSTAR Software Testing Conference 2004 looking at the importance of test design, the human element, and how test design contributes to overall success

  • First Structure, Then Tool; But Do Not Overstructure

    This presentation looks at the importance of structure during testing but also highlights the need to not over-structure and become too rigid

  • When Scrap Becomes Product: Showing Added Value With Solved Defects

    This talk looks at situations where scrap or discarded items become the product, how this was accomplished and the added value of solved defects.

  • Process Improvement Techniques That Deliver Measurable Added Value

    A presentation that discusses different process improvement techniques that can ultimately deliver added value.

  • Self Management Of A Career In Testing

    A presentation looking at how individuals can manage their own career in testing – the presenter shares experiences and some hints and tips.

  • SLA for Testing – Approach to Quantify Quality

    This presentation looks at an approach to quantify quality – SLA for Testing

  • Operational Excellence Through Efficient Software Testing Metrics

    A talk that looks at how organisational excellence can be achieved through efficient software testing metrics.

  • Offshore Testing – Heaven or Hell

    This presentation looks at different experiences from outsourcing test projects – both the good and the bad

  • How Industrial Testing Can Benefit From Formal Methods

    This talk details how industrial testing could benefit from adopting formal approaches

  • The Challenge of Test Outsourcing

    This talk discusses the challenges associated with outsourcing testing

  • Measuring ROI: Agony and Ecstasy

    This presentation takes a look at different ways of measuring ROI and when it goes right (ecstasy) or wrong (agony)

  • Security Test of (Web) Applications

    A presentation that focuses on security testing and more specifically the security testing of web applications.

  • Advanced Industry – Academy Learning in Software Testing

    A look at research conducted by the presenter on the influence of academy learning on software testing and how the two could complement one another.

  • Exploratory Testing: Revolution or Evolution

    This presentation focuses on Exploratory Testing and discusses its emergence: is it revolutionary or evolutionary?

  • Rightsourcing Software QA and Test To Deliver Business Results

    A look at outsourcing/rightsourcing and how it can deliver the desired business results

  • Verification Of Critical Telecommunications Systems

    A presentation that focuses on the verification of telecommunication systems – a case study from Ericsson

  • ITIL & Test Management: Friends or Foe

    This presentation looks at whether ITIL and Test Management mix and if they are in fact friends or foes.

  • Optimising Testing With Bayesian Networks

    This presentation looks at ways to optimise testing by using Bayesian networks.
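    As background to the talk’s theme, the Bayesian mechanics can be shown with a deliberately tiny, hypothetical example (a single node rather than a full network): the belief that a module is faulty is updated after each test outcome, so remaining effort can be steered toward the modules that stay most suspect.

```python
# One-node sketch of Bayesian updating for test optimisation (illustrative
# numbers only). A faulty module still passes a test 60% of the time; a
# healthy one passes 99% of the time.

def update(p_faulty, test_passed, p_pass_faulty=0.6, p_pass_ok=0.99):
    """One Bayes step: return P(faulty | test outcome)."""
    p_obs_faulty = p_pass_faulty if test_passed else 1 - p_pass_faulty
    p_obs_ok = p_pass_ok if test_passed else 1 - p_pass_ok
    evidence = p_obs_faulty * p_faulty + p_obs_ok * (1 - p_faulty)
    return p_obs_faulty * p_faulty / evidence

p = 0.3                      # prior belief that the module is faulty
for passed in (True, True, True):
    p = update(p, passed)    # three passing tests lower the belief
print(round(p, 3))
```

    A real Bayesian network links many such nodes (code churn, complexity, defect history), but the update step at each node is the same.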

  • TPI Automotive – Closing The Software Quality Gap

    This presentation looks at TPI as applied to the automotive industry and explains how you can close the ‘quality gap’

  • Component Based Testing

    An insightful and detailed analysis of component based testing

  • The Economics of Test Automation

    The presenter takes a detailed look at the intricate economics of Test Automation and welcomes discussion and debate.

  • Smarter Infrastructural Testing: A Service Orientated Approach

    This presentation takes a look at Infrastructural Testing and applying a SOA approach – the advantages and how to implement it.

  • Systematic Test Design – A Smarter Strategy

    This presentation looks at systematic test design and discusses whether or not it is a smarter strategy to apply.

  • Applying TPI – An Unusual Case

    This presentation looks at the application of TPI in a particularly unusual project.

  • Ultimate Model Based Testing

    This talk by Harry Robinson of Google from EuroSTAR Software Testing Conference 2005 looks at moving from traditional testing methods to Model Based Testing and ad hoc testing.

  • Comparing Quality To A Medical Illness

    This presentation looks at testing from the perspective of a medical illness and looks at the unique similarities of each.

  • Model Based Testing Using UML 2.0

    This presentation looks at model based testing using UML 2.0

  • Performance Testing, Back to Basics

    This presentation discusses performance testing from the most basic level, and how doing so benefits the testing effort as a whole

  • Measuring For Testing Skill Development

    This presentation looks at how testers can develop themselves through constant self appraisal and looks at means for measuring development.

  • Software Inspection – A Failure Story

    This presentation looks at a project where software inspection failed to work as it should and details what happened and how to get back on track.

  • Adding Autism Competencies To Testing

    An inspirational presentation looking at how the presenter began to use the competencies of Autistic people for testing.

  • The Use Case Problem: Time Testers Spoke Out

    A discussion looking at what has been termed the Use Case Problem, explaining why testers should speak out against it.

  • Toll Collect: A Challenge For Non-Functional Testing

    A case study looking at the challenges encountered during a project that utilised non-functional testing.

  • Testing Can Be Adaptive

    This presentation looks at how testing can adapt to different environments and highlights examples of adaptive testing in practice.

  • Testing In Agile Maintenance Processes

    A look at testing in an agile maintenance process – this talk looks at the challenges faced and provides tips for success

  • Configuration Management Adds Value To Your Test

    This presentation looks at how configuration management adds value to the overall test process

  • Improving Embedded Test Value – ROI is King

    This presentation looks at ways to improve embedded test value and focuses on the importance of ROI (return on investment)

  • Improving Software By Advanced Data Analysis

    This presentation looks at how to improve software as a direct result of using advanced data analysis

  • Smarter Selling Of Testing

    This is an interactive session looking at how testers can sell the value of testing to various stakeholders

  • Testware Architecture

    This presentation looks at the testware architecture available in the marketplace and discusses it in detail.

  • Experiences of Design Based Testing With TTCN-3

    This presentation takes a look at the TTCN-3 technique and the presenter’s experiences of using it in Design Based Testing

  • TMM – Add Value by Improving Your Test Process

    A presentation that takes a look at TMM and explains how it can be used to improve your test process

  • The V Model, Adding Real Value

    This presentation looks at the V Model and details how it can be utilised to deliver real added value.

  • Agile Inspections for Testability: Test Before the Code is Written

    This presentation looks at an approach where testing is conducted even before code has been written.

  • Analysing Defect Data from an Iterative Development Process

    A talk highlighting how testers can and perhaps should analyse defect data from an iterative development process.

  • Starting Up & Managing A Test Department – Result Driven Development

    A case study looking at the set-up of a test department within the presenter’s organisation and, following on from that, the management of the department after it has been set up.

  • What Any Developer Or Tester Should Know Even After Midnight

    A thought provoking presentation looking at the software delivery process from both the perspective of the developer and the tester while also highlighting key essentials that both should insist on.

  • Back to Basics: Functional & Structural Testing Techniques in Practice

    Get back to basics as this presentation looks at techniques in practice and discusses the pros and cons of each technique.

  • Understanding System Requirements

    This presentation discusses system requirements and how testers can learn to understand these requirements to ensure better quality.

  • Making Sense of Metrics

    This presentation takes a look at metrics, their role with testing and how you can make sense of the most complicated set of metrics.

  • Defining Quality With The Help of Your Customers

    This presentation looks at the role of the customer in the testing process and how defining their requirements can lead to increased quality.

  • Techniques For Ensuring Quality Within Outsourced Projects

    This presentation looks at how you can ensure quality within outsourced projects and techniques that may be used to reach this point.

  • Economic Aspects of Software Testing

    A presentation that highlights the economic aspects of testing and how they can work in an organisational context.

  • Command Driven Testing: A Step Beyond Keyword Driven Testing

    A presentation that focuses on Command Driven Testing and its usefulness

  • Risk Based Testing Packaged Software – The PRICES Model

    This presentation takes a look at the PRICES model and how it applies to the Risk Based testing of packaged software

  • Personal Test Maturity Matrix

    The presenter looks at the Personal Test Maturity Matrix and discusses how it can be used to improve individual testers’ performance.

  • Process Improvement at Marks & Spencer

    A case study looking at how Marks & Spencer implements process improvement.

  • Getting The Most Out Of EuroSTAR 2005

    A Presentation to show EuroSTAR Conference attendees how to make the best out of their conference attendance through some hints and tips from those that attended EuroSTAR over many years.

  • Open Source Software Test Tools

    A look at some of the open source test tools in the marketplace.

  • A Test Design Poster for Smarter Testing

    The presenter outlines their ideas for smarter testing through a specially designed poster detailing the entire process

  • Knowing What’s Right, Doing What’s Wrong

    This presentation takes a look at situations where the best way forward is clearly evident but another route is chosen, and also how to avoid this pitfall.

  • Future Opportunities for Testers

    An in depth look at what opportunities there are for testers and how to make the most of these opportunities.

  • IT Trends For The Future

    A look at IT trends for the future and how testing fits into this.

  • Developing Testers: What We Can Learn from Athletes

    This talk describes how an athletic training programme is run and compares that with the way most testers are developed.

  • We Can’t Afford Not To Test

    A look at some of the misconceptions that surround testing as a function, and the reasons that give rise to them – ‘we need to ship, regardless of what testing says…’

  • Achieving Test Case Efficiency by Combining Use Case Scenarios and Orthogonal Arrays

    This presentation explores the use of two effective testing techniques – orthogonal arrays and use case scenarios, to design tests that provide a high level of test coverage while minimizing the total number of tests.
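    The combinatorial idea can be approximated with a simple greedy all-pairs sketch (illustrative only; the parameters below are hypothetical, and this is not the talk’s orthogonal-array construction): keep adding whichever candidate test covers the most not-yet-seen value pairs until every pair of parameter values appears in at least one test.

```python
# Greedy all-pairs sketch. Full coverage of this 3x2x2 parameter space
# would need 12 tests; pairwise coverage needs far fewer.

from itertools import combinations, product

params = {
    "browser": ["firefox", "chrome", "safari"],
    "os": ["windows", "linux"],
    "locale": ["en", "de"],
}

def pairwise_tests(params):
    names = list(params)

    def pairs_of(test):
        # all (param, value) pairs exercised by one full test
        return {((a, test[a]), (b, test[b])) for a, b in combinations(names, 2)}

    uncovered = {
        ((a, v1), (b, v2))
        for a, b in combinations(names, 2)
        for v1 in params[a]
        for v2 in params[b]
    }
    candidates = [dict(zip(names, c)) for c in product(*params.values())]
    tests = []
    while uncovered:
        # pick the candidate covering the most still-uncovered pairs
        best = max(candidates, key=lambda t: len(pairs_of(t) & uncovered))
        tests.append(best)
        uncovered -= pairs_of(best)
    return tests

suite = pairwise_tests(params)
print(len(suite), "tests instead of 12")
```

    Greedy selection is not guaranteed minimal; dedicated pairwise tools and true orthogonal arrays can do slightly better, but the coverage guarantee is the same.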

  • Selling Testing to Management

    A look at the importance of being able to clearly outline and demonstrate the value of testing to top-level management.

  • Testing – It’s Been Emotional

    Derived from personal experiences over 30 years, this will be a mind map based talk from the heart, not a ‘death by PowerPoint’ presentation. A look at various aspects of testing from the presenter’s experience.

  • Testing for Accessibility

    This presentation will give an insight into the relationship between accessibility and usability for disabled people.

  • Value At Risk: A Model for Quantifying Risk

    This presentation focuses on deriving a model for risk evaluation and risk based testing, based on a simple value at risk model, addressing limitations of the traditional models.

  • Testing Hats

    This inspirational presentation draws on ideas from the testing industry and outside it. Gain fresh perspectives from wearing somebody else’s hat.

  • Center Based Testing Approach: Predictability, Productivity & People

    This presentation offers a viewpoint that testing can only be predictable and productive to any meaningful degree when carried out in a controlled environment where the test process is industrialised, i.e. in a Test Center

  • Controlling Mindsets As Key To Success in Testing

    Testers and programmers live in different worlds – different views of the same product and expectations based on their own, limited view lead to misunderstanding and disappointment.

    This talk from Echhard Buchholtz of Nokia in Germany looks at how the mindsets of software testers and programmers differ and ways in which they can cooperate for common benefit.

  • A Survey Of Unit Testing Practices

    This presentation looks at a survey conducted into unit testing practices in 19 different companies along with an analysis of the results.

  • Aligning Development and Testing Lifecycles

    The key thing to do when producing a testing strategy is to align the testing and development activities – this presentation looks at how this can be done.

  • How to Become a Happy Tester

    This presentation shows you that changing your attitude and thinking differently about yourself and others can and will change the situation and most importantly your reactions to it.

  • Can Testing & Development be Combined

    A look at the effects on testing of multifunctional teams within the presenter’s organisation. The effect on projects and how to make these teams work!

  • Taming the Testing Monster

    In this presentation, a number of techniques are described and outlined as applied to the incremental development model that has proven to be very effective in tackling certain challenges.

  • Use of a Weighting Factor To Improve Software Verification

    This talk from Mark Flecken highlights a case study undertaken at Ericsson for software development and verification projects of the new telecommunication data services. This presentation demonstrates that a weighting factor can be used for improving a verification process in large defect data sets. The weighting factor can be used to illustrate if the verification process is good at finding faults and to find weaknesses in testing.

  • Building A Positive Passionate Organisation

    One of the key ingredients of a successful and performing testing organisation is a positive and passionate workforce – this presentation looks at how to build such a workforce.

  • Things Testers Miss

    This talk uses real examples of bugs that slipped past the test team and the reasons why those bugs were missed

  • Effective Testing using Pairwise Techniques

    This presentation explains two different approaches to Pairwise testing techniques using examples from real projects for illustration.

  • Why Risk-Based Testing and TPI Fails

    Let’s be honest, we all learned how not to fail by failing ourselves! This presentation looks at combining Risk Based Testing and TPI.

  • The Software Testing Body of Knowledge

    This presentation provides a platform for discussion and debate about the Software Testing Body of Knowledge (BOK): ‘What is it?’, ‘Who should use it?’, ‘Does it exist?’ and importantly ‘Do we need it?’

  • Real Life Tested Recipes To Build A Close-Knit Testing Team

    A presentation sharing real-life, tested recipes for building a close-knit testing team.

  • Use Community Tools To Improve Team Work

    The presenter talks about his experiences of using tools commonly used in open source projects on the internet in an enterprise environment.

  • The Test Business Team

    A discussion on how taking a business-like approach might help test and QA managers to be more influential within their organisation.

  • Rise and Fall of a Test Organisation

    This presentation looks at the rise and fall of several professional independent test organisations as witnessed by the presenter.

  • Assessing Data Quality, The Biggest Risk in Town

    A look at how testers manage data quality and the risks associated with this. This presentation will look at data quality reference models and data verification models.

  • Testers Are Doing It For Themselves

    This presentation concentrates on how every testing professional can move their career forward through personal investment. Different types of personal investment are looked at, including hints and tips on how to help yourself, your teams and the testing community!

  • Management by Numbers

    This presentation addresses some of the typical people issues that are encountered when utilising metrics and shows how cognitive science and social psychology can play an important role.

  • Making The Most of Test Design

    Without effective test design, all other testing efforts are more or less a waste – this presentation looks at how to make the most of test design.

  • Model Based Testing from UML Models

    This presentation gives a practical introduction to model based testing, showing how to write models in UML 2.0 for testing purposes.

  • Use Values In Building A Testing Team

    A presentation looking at how value based leadership can be harnessed in testing organisations. Learn how to pick the right people for specific functions within your test team.

  • Jujutsu Testing

    Jujutsu is based on the Budo analogy and this presentation looks at how testing and Jujutsu are based on a solid set of basic techniques critical to success.

  • Keyword Driven Test Automation

    A look at the recognised technique of Data-Driven Test Automation that has been developed in many different ways.

  • Taking Your Team The Iterative Direction

    This presentation takes the reader along on a customer experience story where the team’s morale is lifted again. It looks at ways to speed up the testing process through working in iterations.

  • The Open Project & Its Enemies

    This speculative presentation seeks to raise questions and discussion on the role of leaders in projects; collaboration, confrontation and cooperation in projects and also the relationship of the dream team to preferred style of leaders. Is it desirable or possible?

  • Performance Test Lead!

    An in-depth look at the similarities between American Football positions and positions within the Performance Test Team

  • A Special Member For The Dreamteam

    An overview of the creation of Specialisterne, how they created a special environment for people with autism to work as testers, and how their unique skills are a match for testing

  • Managing Contractual Aspects of Test

    This presentation sets out the aspects of a contracts schedule that ensure supplier obligations are documented and met.

  • The Security Testers Toolbox

    An overview of free tools that you can use for security testing (web) applications.


  • Feedback Focused Process Improvement

    A presentation looking at how we could learn from Toyota (the lowest-defect car maker in the world) – how we can learn from their models and methods that have revolutionised manufacturing and then inspired agile practices in IT.

  • Communicating to Decision Makers

    A presentation highlighting the importance of effective communication to top level management and how to get your point across

  • Managing Global Expansion of Testing

    A look at how one division within Mentor Graphics has grown across three continents and how an iterative development process has been implemented to achieve twice-yearly releases.

  • Working With Model-Based Testing

    This presentation shows ways in which you can implement model based testing in your organisation

  • Keeping Your Metrics Message Simple

    This presentation looks at how over-detailed reporting leads to misunderstanding and poor decision making by obscuring key facts, through examining messages that test managers wished to convey and how they were actually interpreted.

  • What Kind of Fish Is A Tester?

    A quirky presentation looking at how people are different and how unique skillsets can be used in putting together a test team.

  • Investing in the Dream Team

    This presentation looks at the real life story of investing in the dream team at CTG and will not talk about complex theories but provide you with no-nonsense practical models that you can apply in your organisation

  • Forensic Software Engineering: Take The Guesswork Out of Testing

    A presentation about failure patterns and how to take the uncertainty out of testing.

  • Defining the Profession

    A discussion of the overall state of the test profession – the presenter shares his experiences helping to shape and define the profession, along with lessons learnt along the way.

  • Test Automation: From Record/Playback to Frameworks

    This presentation will look at the different automation code structures and their relative merits. It will also look at test frameworks and review progress in the field.

  • Setting Up A Test Centre In A Maintenance Department

    This presentation discusses the process whereby a test centre is set up and looks at the aims and challenges encountered.


  • Testing – Beyond The Final Frontier

    A presentation that takes a slightly ‘tongue in cheek’ look at where we are now and where we are heading.

  • When Testing Stops and Politics Take Over

    This presentation looks at real life experiences and shows how a structured project can turn into a race against time and budget to reach one specific goal.

  • Focusing Test Efforts While Servicing Large Systems

    An in-depth analysis of the challenges behind servicing complex software, the ideas behind our system of collecting and reporting change-related data, implementation details and some results we achieved with this.

  • An Exploratory Tester’s Notebook

    This presentation demonstrates more formal approaches to documenting exploratory testing and includes a look at an informal exploratory test session, simple mapping and diagramming techniques.

  • One Flew East, One Flew West… Lessons Learned Testing a Travel Website

    A case study looking at a major project testing a major company’s new travel website

  • Defining the Role – Making It Great Being Special

    A look at how, within the testing community, there is a need for special people with different professional and personal skills – a fascinating presentation!

  • Ready to Ship

    This presentation looks at metrics and how to establish and utilise a set of simple yet powerful KPIs.

  • Where Testing Creativity Grows

    This presentation puts creativity in the testing context and talks about environments that can foster creativity and techniques that may be used in different situations

  • Testing the Heathrow Terminal 5 Baggage Handling System

    An in depth look at the massive test project that was the Heathrow Terminal 5 Baggage Handling System

  • Testing is the Best Work There Is!

    A presentation looking at the enjoyable aspects of testing and what makes it such a great profession. Examples from within Ericsson are used to support this.

  • Defining the Test Profession: A Cultural Perspective

    This presentation will contend that when defining our profession, we should consider taking a cultural perspective which, while including factors such as a testing body of knowledge and skill set, will also consider wider cultural aspects.

  • Usability Testing for Flash Applications

    This presentation highlights the key approaches for usability testing of Flash applications.

  • Managing The Development Testing Via DevTest Score

    This presentation looks at the use of DevTest score in a development testing project.

  • Using Lessons Learned to Challenge ERP Projects

    This presentation looks at research into ERP (in particular SAP) project experiences and attempts to summarise the broad categories of failure so that the reader may adapt any of these experiences to their own situation.

  • Regulating the Profession

    A look at ways in which testing as a profession can be regulated as it moves more and more towards a professional status.

  • Test Approaches & Standards: A Survivor’s Tale

    This presentation compares how different approaches work, introducing metrics for test effectiveness, efficiency and measurement, as well as the “human factor”

  • Do Agile Methods Make More Satisfied Customers

    This presentation discusses Agile Testing from a customer point of view and highlights some of the benefits that may be gained and, importantly, the challenges and problems that may occur.

  • The Professional Test Analyst

    The importance of the test analyst role is severely underestimated within the software testing industry – this presentation looks at how a good, well-trained and experienced test analyst can benefit a project.

  • Testing – A Dangerous Profession

    A thought provoking presentation looking at how testers can be more prone to depression and burnout

  • Realistic TPI by Professionals: A Case Study

    This presentation outlines how an internal testing profession was recognised and improved upon in the retail IT division of HBOS.

  • I am an Agile Tester – This is Why!

    This presentation will look at customer stories, anecdotes, quotes and expert opinions with an aim to provide a clear overview on what an Agile Tester actually is.

  • Does Certification Define The Profession

    This presentation is based on a number of interviews in which managers and testers answered different sets of questions about certification.

  • Testing and Developing: Equal Partners in Software Development

    This presentation gives an overview of the complete software development and test process and focuses on the close co-operation between developers and testers.

  • Software Testing of the NHS Spine

    This presentation describes the contribution that software testing has made to the successful delivery of a vital programme for the NHS (National Health Service of England)

  • Whatever! Testing with Attitude

    This presentation will highlight ways to take testing to a new level of professionalism and influence in business and IT.

  • Workshop on Agile Test Strategies and Experiences

    This workshop looks at the topic of Agile followed by an interactive discussion with audience participation on the issues raised and those affecting participants.

  • Managing System-Level Integration Tests

    This presentation shows two different scenarios – the integration of a legacy system and the integration of a messaging platform for a mobile provider.

  • How to Test A SOA System

    This presentation looks at the challenges encountered during testing a SOA System and the considerations necessary to set up a strategy for this

  • Can “Softer Skills” Help Define Your Career Growth

    This presentation looks at softer skills and how they can assist testers in their careers.

  • Software Testing: A Profession of Paradoxes

    This presentation aims to create awareness of a number of testing paradoxes that challenge us all. Some are well-known, while others are more obscure – but all are worth investigating.

  • Do I Call Myself a Professional Tester?

    This presentation provides attendees with a platform to discuss and debate the need for professional Performance Testers in today’s software testing world

  • The Future of the Test Profession

    This presentation looks at results and conclusions from research conducted on the maturity of testing in the Automotive, Aviation, Medical Systems and Pharmaceutical industries.

  • Developing & Maintaining Test Environments is a Profession Too!

    This presentation looks at the critical role that maintaining test environments proved to play in real-life situations within Rabobank

  • Defining our Profession By Setting the Standards

    This presentation looks at a number of standards in the testing discipline and explains how they impact every day testing decisions.

  • Path to Professionalism: Skills of Star Performers

    This presentation discusses the specific, trainable skills which testers can learn to increase their credibility, improve their effectiveness and enhance their overall value to their organisation.

  • Metrics: Tips & Pitfalls

    This presentation will provide working professionals with some tips to help them use metrics more effectively in the daily role.

  • Defining the Test Profession at Sony Ericsson

    This presentation investigates the use of TPI within Sony Ericsson – pitfalls and what made certain projects successful

  • A Practical Experiment in Model Based Testing Adoption

    This presentation will introduce processes for undertaking MBT and highlight where they differ from traditional testing.

  • Testing is… Learning From Different Perspectives

    A presentation looking at different approaches to testing, testing from different perspectives and the challenges involved as a result

  • Grand Testing Master Theory – Mind Mapping Workshop

    This session leverages recent research discoveries about how Chess Grand Masters think and how this can be applied to testing

  • Test Engineering at Google

    An insight into testing at Google from a senior test engineer who works there

  • From Keywords to Model-Based Test Automation

    This presentation looks at a solution that generates keyword sequences to be executed on the System under Test

  • Multidisciplinary Test Engineer – Is That A Future?

    This presentation looks at the vast role of the test engineer and discusses the future of such a complex role

  • Why I Am Not Yet Certified

    The Certified versus Uncertified argument is looked at in detail

  • Becoming Agile – Reshaping Testing for an Agile Team

    This presentation looks at real-life examples and will give a flavour of what it is like to bring a test perspective into an agile project and what it is like to be on an agile project with a sudden need for test expertise.

  • Software Testing for the Home Entertainment Industry

    A look into how software testing for the home entertainment industry works.

  • Why do we Bother with Test Strategies

    This presentation will provide an insight into why railroading the ‘industry standard’ test-strategy-to-test-case terminology into a PRINCE2 environment does not always work.

  • Preparing your Team for the Future

    This presentation looks at how you can best prepare your test team for change and ultimately for the future.

  • Two Futures of Software Testing

    A discussion looking at alternate paths/futures for Software Testing as a profession. Where are we now, and where are we going?

  • An Entity Model of Software Testing

    This presentation will use a mixture of entity diagrams and practical examples to present a model that all testing follows to some degree – The Test Entities Model

  • Test Patterns: A New Concept for Testing

    This presentation looks at a new concept – Test Patterns – and how they can support the testing process.

  • Personal Test Improvement – Dealing with the Future

    A presentation where the basic skills necessary to become an excellent tester are discussed, along with how to focus on what you need to learn.

  • ISO 29119: The New International Software Testing Standard

    A discussion on the new ISO Standard applied to software testing

  • Software Testing: A Framework Based Approach

    This presentation looks at the advantages of adopting a broad framework to incorporate assumptions, values, concepts and practices that give a realistic picture.

  • SOA – Agile or Fragile

    This presentation looks at SOA and how it can be used to best effect in organisations

  • Where Do ‘Old’ Testers Go?

    This presentation targets professional testers of all experience levels, but especially those who have reached a plateau in their career progress and are determined to do something about it.

  • A Statement for the Future

    This workshop looks at putting together a statement for the future direction of testing.

  • Performance Testing in 5 Steps: A Guideline to a Successful Load Test

    This presentation looks at performance testing, the challenges you may encounter and highlights a Performance Testing methodology and how it can contribute to a successful performance test.

  • What Testers Can Learn From Management Gurus

    This presentation looks at what testers can learn from management gurus like Prahalad, de Vries, Maslow, Covey and Collins.

  • Agile Test Management Using SCRUM

    This presentation looks at lessons learned with using SCRUM as a test management method and the benefits of using it in your daily role as a test manager.

  • Test Maturity Model Integration (TMMi) – Process Improvement for the Present and the Future

    This presentation will present experiences implementing the TMMi model and the associated benefits.

  • What’s Wrong With My Test Data

    This presentation looks at the problems of test data management and ways to address them during application and database development.

  • Establishing Testing Knowledge and Experience Sharing at Siemens

    This presentation looks at how knowledge and experience is shared at Siemens through numerous different ways and how the company has benefited as a result

  • The Future of Testing is in New Zealand

    A presentation looking at the opportunities for testers in New Zealand

  • Testing – Can the Past Tell Us the Future

    This presentation is aimed at people working in testing for 20 years or more and asks the question, Can the past tell us the future?

  • Probabilistic Stochastic Test Data

    This presentation discusses how to generate random test data that is repeatable and reasonably representative of the entire population of possible test data inputs, for both positive and negative testing.
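
    The core idea can be sketched in a few lines of Python (an illustrative sketch, not taken from the talk; the record fields and value ranges are invented for the example): seeding a dedicated random generator makes the “random” data fully repeatable, and deliberately generating out-of-range values covers negative testing.

    ```python
    import random
    import string

    def make_test_data(seed, n):
        """Generate n pseudo-random records; the same seed always
        yields exactly the same data, so failures are reproducible."""
        rng = random.Random(seed)  # dedicated, seeded generator
        records = []
        for _ in range(n):
            name = "".join(rng.choices(string.ascii_lowercase, k=8))
            age = rng.randint(-5, 130)  # range deliberately spills into invalid ages
            records.append({"name": name, "age": age,
                            "valid": 0 <= age <= 120})  # negative cases flagged
        return records

    # Same seed, same data - a failing test can be re-run on identical input.
    assert make_test_data(42, 10) == make_test_data(42, 10)
    ```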

  • Testing – An Olympic Challenge

    This presentation will outline the full life cycle of the testing project for the 2008 Olympic Games – a fascinating insight into a truly unique project!

  • Software Testing – A User’s Guide

    This presentation looks at the relationship between testing and management and focuses on identifying the concerns of all parties in different business contexts.

  • The Specified Testers of the Future

    This presentation looks at the roles of testers in the future, with traditional roles dwindling and becoming much more specified.

  • Red Hot Testing In A Green World

    This presentation looks at the impact of IT on the environment and how testing can play a role in reducing this effect.

  • Are Real Test Metrics Predictive for the Future?

    This presentation discusses the results of 36 different projects that used metrics

  • Make Your Testing Smarter – Know Your Context

    This presentation looks at the mantra – ‘Test Smarter not Harder’ and how we set about accomplishing this

  • Testing is an Island – A Software Testing Dystopia

    Software Testing is complex – this presentation looks at the different aspects of complexity involved.

  • Preserving Our Future Through Customer Satisfaction

    A presentation looking at a Customer-Focused approach to testing while also looking at the Developer Testing paradox.

  • The Truth about Model Based Quality Improvements

    In this presentation, model based techniques for quality control are discussed along with their advantages and disadvantages.

  • Mobile Software Testing – Facing the Future Challenges

    This presentation examines current and emerging trends in mobile systems and presents approaches for challenges facing developers and testers.

  • Heuristics: Solving Problems Rapidly

    This workshop uses exercises and reports to explore heuristics: how they work, how they can lead us astray and, importantly, how they can be applied to software testing.

  • Imagination is More Important Than Knowledge

    A presentation looking at how testers need to keep pace with all other areas of IT in a constantly changing environment – Software Testing in a complex, networked world.

  • Model-Based Testing of a Financial Application – A Case Study

    This presentation looks at the deployment of a Model Based Testing solution to a project of migrating a critical financial application.

  • Testers in the Board of Directors

    A presentation that explores how testing can increase in importance and take its rightful and well-deserved place in Management

  • Introducing TMMi Level 2 – A Practical Approach

    This presentation gives an insight into how one company tries to reach TMMi Level 2 in a more practical manner.

  • Put Test in the Driving Seat

    This presentation looks at how to help test managers make the transition from self-pitying victims to winners securing project success in the future.

  • The Damage Zone

    A presentation on Off-shoring as a strategy and how to manage the process where inevitably there will be setbacks and disappointment – ‘The Damage Zone’

  • Tester Needed? No Thanks, We Use MBT!

    This presentation will go into detail regarding the current state of Model Based Testing and its future while also sharing experiences from a pilot project with Model Based Testing.

  • Testing, The Next Level

    A presentation focusing on the challenges encountered during testing and how to overcome them

  • The End of Testing As We Know It

    In this presentation, James presents his vision for what testing will look like in five to ten years, the tools required to do it right, and the technological innovation necessary to get there.

  • Best Practices for Testing in Virtual Environments

    This presentation provides practical guidance and best practices for testing in virtual labs at VMware and several customer sites to help you get started and see immediate results

  • Agility and Process Maturity, Of Course They Mix!

    In this presentation, the method will be shown from the tester’s perspective, describing both the great wins and the challenges we still have to address.

  • Traditional QA meets Agile Development

    A presentation detailing how to successfully transform testing from a traditional process to a highly iterative approach.

  • Route Cards to Reach 2013 – Managing the Profession

    A workshop where attendees build route cards for the future of the testing profession up until 2013 and how to get there

  • Test Environments, the Future Achilles’ Heel

    Looking into the future of testing, one thing becomes very clear: test environments will become more and more the Achilles’ heel of test projects. This presentation discusses this in detail.

  • Analysing your Defect Data for Improvement Potential

    In this presentation, you will see how you can organise and perform an analysis of your company’s defect data in an effective way. The typical information collected for the analysis will be demonstrated, along with how the data can be analysed and presented.
    It will also be demonstrated how to turn such analysis results into practical, focused improvement actions and the resulting benefits from them.

  • Best of Both Worlds – Exploratory Testing in a Structured Way

    This presentation looks at how scripted testing can be combined with exploratory testing, and looks at whether we should regard exploratory testing as a philosophy or a test design technique

  • Keynote 4: Moving to an Agile Environment: What Went Right, What Went Wrong

    Ray Arell (Intel, USA) is a Senior Engineering Manager and Agilist at Intel. This presentation focuses on what went wrong, why, and the effect it had on his team when Ray and his team decided to go Agile!

  • Keynote 3: Agile and Process Maturity – Of Course They Mix!

    Will process maturity or agility be the future of software development and test?

  • Changing How You Manage & Communicate Change

    People often ask why senior managers are so poor at implementing change.

  • Keynote 1: Today’s Testing Innovations

    As a consultant, Lee Copeland has spoken with thousands of software testers in hundreds of different organisations.

  • Adverse Effects of Distributed Development on Product Quality

    This presentation reports on a large-scale study including 22 distributed projects in various product domains, in which product quality problems were analysed.

  • Evolution of New Feature Verification in 3G Networks

    This presentation will provide a whistle-stop tour along a timeline from 2004 to today, dealing with challenges encountered in verification of new features in 3rd generation Mobile Telephone Networks within Ericsson, and how these challenges were overcome.

  • More and Better ‘Test Ideas’

    The centre of the testing universe is for me the test ideas that can be documented or generated in the mind of a tester as the testing evolves.

  • The Supertesters – A Slightly True Story

    This is a theatre performance where ‘The Supertesters’ mix serious test issues and common prejudices with laughs

  • Program Test Management: A Survival Kit

    This presentation will illustrate effective techniques for successful Program Test Management.

  • A Test Service Centre – In Reality

    Why don’t we get better even though all of us try our best? The solution is not to work harder; it’s to work smarter.

  • Fuzzing for Software Testing and Quality Assurance

    Fuzzing is a proactive method for discovering zero-day security flaws in software.

  • Bring Requirements to Life with a Paper SIT

    In this presentation the speakers will explain to you why a SIT in a SOA environment is so difficult and why that can result in a delay in the software delivery.

  • Empowering Open-Source Testing Frameworks

    In this presentation we discuss how the combination of open-source and commercial products can increase the user acceptance of the overall product, increase its capabilities and limit the customer’s dependency on a commercial vendor.

  • Modelling Scenarios with a Framework Based on Data

    Scenarios representing realistic business sequences are important tools for finding problems that matter to stakeholders.

  • Top Ten Quality Tips for Agile

    This presentation will demonstrate how common problems can be prevented using ten practical, real life tips that the participants can take back to the workplace and implement.

  • The Ethics Debate

    This workshop is a hosted ‘House of Commons’ debate, derived from a popular Dutch TV show, on the topic of ‘Ethics within the Testing Profession.’

  • Free Testing Tools, The Market Scanned

    This presentation presents the results from several years of studying the test tool market, with a strong emphasis on an analysis of the usage of free testing tools.

  • Test Data Management – The Next Hype?

    Test Data has until recently been the forgotten component in software development and testing, even though developers and testers often struggle with their test data.

  • A Lucky Shot at Agile?

    In this presentation Zeger describes life as a tester in a team of agile rookies.

  • ISO 29119: The New International Software Testing Standard (1)

    This presentation describes progress on the development of ISO 29119 and the challenges with creating a generic testing standard that is applicable to all organisations and all types of projects.

  • Executable Requirements with Behaviour-Driven Development and Cucumber

    This presentation will show you how to set up Cucumber and demonstrates how to write and execute requirements with Cucumber.

  • Prescriptive or Exploratory Testing: Which is More Effective?

    In this talk the results of an industry-based experiment are presented, comparing the usability and defect-detection effectiveness of EP, BVA and ST to exploratory testing.

  • Using Real-World Agile Practices on Your Testing Project

    This presentation reviews a survey of 25+ real-world agile projects and highlights a number of successful agile practices that you can use in your own agile testing projects.

  • Experiences Working with MBT and Qtronic

    This presentation covers requirements on, and possibilities for, test organisations considering introducing MBT.

  • Real Exploratory Testing, Now with a SOA Twist

    The use of ET is explained in this presentation for SOA projects.

  • Software Test Automation Success

    The implementation of STA is notoriously challenging and seldom lives up to expectations.

  • Don’t Shoot the Messenger!

    This session is most important for people who are keen to learn new techniques to improve their communication skills.

  • What next for MBT? Implementing with Tools

    This presentation discusses a selection of tools in relation to MBT.

  • SOA: What’s in it for Testers?

    In this presentation you will hear about the risks which need to be addressed by testing when SOA is used.

  • Automated Testing for the ATM Channel

    This presentation is a case study covering the development of an automated QA system for an ATM service to underpin an agile life cycle.

  • Many can Quarrel, Fewer can Argue

    This presentation is a valuable resource for everyone who wants to improve test related communication.

  • Effective Bug Management – Challenges and Best Practices

    In this talk, Michael shares the experience collected over many years of involvement in bug management activities.

  • Software Metrics – So Simple, Yet So Dangerous

    Discover ways of using metrics in the ‘right way’ and how to deal with harmful ‘side effects’ of metrics in this presentation.

  • Regression Testing in a Migration Project: Challenge or Nightmare

    In this presentation the speaker tells the audience about the challenges he faced as test coordinator.

  • A Sneaky Way to Introduce More Automated Testing

    Have you established test automation or are you planning to do so?

  • The Power of Risk

    This presentation starts where all other presentations on risk based testing stop.

  • Help! We have a QA Problem!

    This is about a real case of too many developers feeding too few testers, causing a testing backlog of half a year.

  • Spend Wisely – Test Well

    This workshop will provide experienced test managers with powerful ammunition to dispel prevailing myths about the bottom line impact of testing.

  • Effective Test Automation a la Carte

    The advanced keyword-driven approach to automated testing does not require programming skills from testers.
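
    A keyword-driven setup can be sketched minimally in Python (an invented illustration, not the speaker’s framework; all keyword names and the example “application” state are made up): testers write tables of keyword rows, while the keyword implementations are maintained separately by automation engineers.

    ```python
    # Keyword implementations, written once by automation engineers.
    state = {}

    def open_app():
        state.clear()
        state["open"] = True

    def enter_text(field, value):
        state[field] = value

    def verify(field, expected):
        assert state.get(field) == expected, f"{field!r} != {expected!r}"

    # Keyword name -> implementation; testers only ever see the names.
    ACTIONS = {"open app": open_app, "enter text": enter_text, "verify": verify}

    def run(test_table):
        """Execute a test expressed as (keyword, *args) rows."""
        for keyword, *args in test_table:
            ACTIONS[keyword](*args)

    # A test case as a tester would write it - no programming required.
    run([("open app",),
         ("enter text", "user", "alice"),
         ("verify", "user", "alice")])
    ```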

  • Risk-Based Testing – Details of our Success

    This presentation describes how we have implemented RBT in a system test organisation in the product industry.

  • Successful Introduction of a Practical Test Strategy

    What to do if your testing is fairly good, but there’s still something missing?

  • Test Process Improvement on a Shoestring

    Find out how to select and rank the most effective and appropriate measures for cost reduction in your situation.

  • Outsourcing Partnership – Shared Perspectives

    This is a presentation based around my experiences of creating and managing an external testing relationship.

  • Measuring and Managing the End Game of a Software Project

    How do you know when a project is ready to ship? Learn how to estimate, predict and manage your software project as it gets closer to its release date.

  • Incrementing Scenario Testing: Beyond Exploratory Testing

    IST expands the ideas of Exploratory Testing by automating the creation of the test room.

  • How Exploratory Testing Helps get Structured Testing Started

    This talk will help you bring uncontrolled projects under control, using the example of the speaker’s own team.

  • Performance Testing Effort: Estimation or Guesstimation?

    This presentation provides the delegates with a platform to discuss and debate about Performance Testing Effort in the software testing industry.

  • Is “Agile” Distracting You?

    Jonathon will talk about the popular distractions from creating value, and how to avoid them.

  • What to Automate and What not to Automate

    In this session an overview of the choices and reasons for doing automation is discussed.

  • Burning Issues of the Day

    In this presentation Michael talks about issues that are relevant to testers at the conference.

  • Serving two Masters: Requirements and Risk

    This presentation focuses on the differences between the interests of a project manager and those of a business executive during software testing.

  • Success with Automated Regression Test

    This presentation is about achieving success with automated regression test.

  • Agile – Common Sense with a New Name Tag?

    Agile development and testing has qualified itself as a buzzword of our times and is the default way of working when it comes to development projects.

  • Exploitation Testing, for Testing SLAs

    This presentation will give an insight into IT Service Management and a practical approach for translating SLA items into test specifications.

  • If Only we Could Make Them Listen!

    In the real world of testing today, software testers are expected to provide the information needed for project sponsors and managers to make go-live decisions. Given this is a key part of our job, why then do we often get it wrong, or why are we not listened to?

  • Low-Budget Tooling: Excel-ent?

    Professional software testers need tools to perform their tasks efficiently. In many cases management decides not to buy a tool as the cost seems too high. This presentation will present a set of surprisingly productive approaches to get along without commercial solutions.

  • Stay Agile with Model-Based Testing

    In this presentation, we will explain how we started to use Model-Based Testing in an Agile Development project and share our experiences.

  • A Visual Approach to Risk-Based Integration Testing

    The objectives of risk-based testing, and misconceptions about them among project stakeholders and testing teams, can sometimes cause confusion and division. This presentation highlights a possible solution.

  • Collaboration not Competition

    Testing and Test Management is a business that is, by its very nature, competitive. We are reluctant to exchange ideas, information and strategies, all in the name of ‘confidentiality’

  • Agile Projects with Open Source Tools

    When using agile methods you not only need to change the behaviour of each individual, you also need to re-think your way of doing QA.

  • Trinity Testing: A New Way of Working

    As an industry we need more effective ways to test software, if only to cope with the “do more with less” mantra.

  • Play to Learn – Learn to Play

    Sometimes, explaining test concepts and test work to people who have little or no practical experience in software development in general, and software testing in particular, is just too difficult.

  • Prioritising Tests? Use Your Gut Instinct

    Testing efforts have always been limited by time and resource constraints. If the development schedule runs over, scheduled testing time may be cut. Therefore, it is important to have tests prioritised so the most important ones are run first.

  • Automatic GUI Test with Ruby and Watir

    Testing of internet applications using Ajax in an agile development environment requires a strong regression testing component.

  • Introducing Exploratory Testing to Save the Project

    In this presentation, Johan will present his experiences from introducing an exploratory testing approach into a highly scripted organisation.

  • Passion for Details

    When the Harvard Business School published the article ‘The Surprising Right Fit for Software Testing’, not many would have imagined that the surprise was a work environment where people with autism spectrum disorder can unfold their passion for details doing software testing for the company Specialisterne (The Specialists).
    In this presentation the founder of Specialisterne will share his experiences from channelling the untapped resources of people with autism spectrum disorder into software testing.

    You will learn about the power of passion for details, new ways of accessing resources and about adding a new ‘people’ dimension to software testing.

  • How Much Testing is Enough? Achieving the Optimum Balance

    Everybody remembers hard discussions regarding the added value of testing. The use of a risk-based approach can help you by addressing the risks covered by testing, but it cannot help you in the discussion on the optimal depth of testing.
    The depth of testing in most risk-based testing approaches is based on the available budget. This budget is divided over the prioritised risks, but will it lead to the optimal test approach?
    In order to ‘win’ this discussion on the efficiency of testing, the following question has to be answered: how can you reach the optimum balance in testing, so that you test neither too little nor too much?
    An approach for measuring the added value of testing, based on the Juran curve, will be presented.

  • Passion for Testing By Examples

    If you want to become a great tester with passion, you will have to learn, train, and practice a lot of things. Sometimes people get lost in all the learning and education stuff, and especially putting “theoretical” concepts and topics into practice fails quite often.

    To make a difference in this session I would like to share with you three practical, metaphoric analogy examples from different areas that will help you to better understand, implement, and remember the corresponding testing concepts and practices until the end of your life.

    The first example is a real-world story from engineering related to risk-based testing. The second example is a practical experiment that addresses the testing of non-functional requirements. The third example is on integration testing.
    Feel free to share these examples afterwards with your colleagues and stakeholders to show them excellent testing practices as one result of your passion for testing.

  • A Pragmatic Approach To Improving Your Testing Process

    Many test managers know they need to improve the test process in their organisation, but are not clear how they should go about it. How do you understand the effectiveness of your current test processes? How do you then move forward with quick wins and prepare for the introduction of longer term gains? This presentation will provide you with ideas to gather the information, identify the areas for improvement, and then show effective ways to present the information to management with the aim to get their long term commitment. Once change starts you will need to monitor the roll out and impact, to gauge whether test effectiveness has improved and the original objectives are being met.

  • Test Strategies in Agile Projects

    How do you create an effective test strategy in an environment of constant change?
    How do I know what should be tested?
    What problems exist and how do I handle them?
    How do I know when I have tested enough?

    This presentation guides you through the common obstacles, starting from the requirements in the elaboration of User Stories to the creation and execution of test ideas. The test strategies set the context in how to plan your work, select tools, create a measurable Definition of Done, suggest efficient test methods and techniques and how to build a good reasoning structure while exploring the system to find the most important bugs.

  • Numbers For Passion – Passion For Numbers

    The motivation for this paper is the surprising result we achieved in our organization while introducing metrics and measurements.

    The purpose of this presentation is not to give the impression that this success story can be generalized. Success depended highly on the culture of the company, its learning ability and its willingness to change.

    On the other hand, once past the initial resistance, the introduction of metrics and measurements has more positive effects than negative effects.

    This made us conclude that numbers can lead to more “test” passion and also that even sceptics can get a passion for numbers.

  • Alternative Paths for Self-Education In Software Testing

    Software Testing is the most controversial profession in computer science. Given the lack of credible classroom training in software testing, successful software testers have to educate themselves, often in their spare time. Because it’s crucial for software testers to know what they are doing, testers have recently evolved several paths to self-education.
    In this session, Markus Gärtner explains alternative paths to the knowledge a software tester needs. While traditional classroom training provides one way to teach a professional on software testing, several emerging techniques value collaborative learning approaches over certification, thereby forming communities of software testing professionals.
    The session introduces the following approaches:
    – Rapid Software Testing
    – Testing Katas and Testing Dojos
    – Weekend Testers
    – Miagi-Do school of Software Testing

  • TMMi – What Is Not In The Textbook

    This talk is aimed at novices and experienced practitioners alike, and is intended to inform people about what TMMi is; where, when and how to use TMMi (backed up by practical examples); and most importantly why it makes sense for individuals and organisations to apply TMMi for themselves.
    The talk will cover “what is not in the textbook” by providing insightful and diverse applications of the TMMi model. TMMi is meant to be used by everyone in the test community, regardless of industry and the type of software delivery approach, e.g. agile or waterfall.
    TMMi is NOT a stuffy formal tool aimed at experienced test practitioners! In fact, TMMi is designed to be understood by management, too! In closing, the talk will cover why TMMi makes good business sense.

  • Learning to Love Data Warehouse Testing

    The challenges of data warehouse testing can sometimes be underestimated, but like any other development, testing needs to play a key part if the project is to be successful. When faced with testing a data warehouse for the first time, how do you make sense of the complexities of ETL applications, understand the intricacies of the rules that need to be applied, and get to the heart of the end-users’ reporting requirements, in order to decide the best test approach?

    Based on my encounter with a major Data Warehouse project, this presentation provides details of how a real test team went about testing it, what they learned along the way – and how they’d do it differently next time.
    What started off as quite an intimidating testing assignment, has transformed into an opportunity to spread the Testing word, and utilise our tools of the trade to demonstrate effective Data Warehouse testing.

  • Implementing Continuous Integration – A Case Study

    During this minitrack you will be able to see a real example of how “continuous everything” – integration, building, installation and test automation – has been implemented in an agile and fast-changing development environment. Believe it, it’s possible!
    Laura is test lead of the core development line at Infojobs.net. During the past few years she has been in charge of validating and managing the quality of all those projects that change the infrastructure of Infojobs’ products. Maintaining the configuration and integrity of the test environments has been one of her goals and she is especially proud of successfully implementing test automation within the QA team.

    After only 6 years of experience in the world of testing, she is taking advantage of a fully automated development and QA process and she was chosen to show DEV and QA achievements at the present conference.

  • Building Effective Test Data Management in a Distributed Environment

    IT organizations today handle incredible amounts of complex test data, which consumes significant time and effort. Increased outsourcing and distributed environments add further complexity to streamlining and managing data for testing.
    This paper presents a practical approach to test data management in complex test environments, offering a framework and best practices that facilitate test data localization, test data generation, scrubbing, masking, de-personalization, data cleansing and data optimization.
    We will also cover the industry best practices, tools and frameworks that aid test data management.
    Key points:
    • Learn effective ways of managing the complex test data in the distributed environments.
    • Gain knowledge about various methodologies and approaches used to efficiently handle test data.
    • Cognizant’s test data management framework that adopts proven techniques to manage test data
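    The scrubbing and de-personalization steps listed above can be sketched in a few lines. This is a minimal illustration only (not Cognizant's framework); the record layout, field names and salt are hypothetical:

```python
import hashlib

def mask_email(email: str, salt: str = "s3cret") -> str:
    """De-personalize an email deterministically: the same input always
    maps to the same masked value, preserving referential integrity
    when the same customer appears in several test tables."""
    local, _, _domain = email.partition("@")
    digest = hashlib.sha256((salt + local).encode()).hexdigest()[:8]
    return f"user_{digest}@example.com"

def scrub_record(record: dict) -> dict:
    """Return a copy of a test-data record with personal fields masked."""
    masked = dict(record)
    if "email" in masked:
        masked["email"] = mask_email(masked["email"])
    if "name" in masked:
        masked["name"] = "Test User"
    return masked

record = {"id": 42, "name": "Alice Jones", "email": "alice@corp.com"}
print(scrub_record(record))
```

    Deterministic hashing (rather than random replacement) is one common design choice here, since joins between masked tables must still line up.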

  • Advancing Testing Using Axioms

    Test Axioms have been formulated as a context-neutral set of rules for testing systems and they represent the critical thinking processes required to test any system.
    • The Axioms enumerate the key areas of test strategy and provide a checklist of concerns to be addressed in any test approach.
    • Any company can use the Axioms as the basis of context-neutral testing assessment and to identify areas requiring improvement without using artificial maturity levels.
    • Axioms define sixteen skills areas required by testers, and could form the basis of a tester development framework and a certification regime that has meaning to practitioners.
    • The Quantum Theory of Testing: “If evidence arrives in discrete quanta, can we assign a value to it?”
    This talk will introduce the First Equation of Testing, the Testing Uncertainty Principle, the Quantum Theory of Testing, Testing Relativity and the Exclusion Principle.

  • Critical Thinking for Testers

    How do you interpret the information you receive? Is it complete, correct, valid, appropriate?
    Critical Thinking is a skill every tester needs to do their job, and yet it’s rarely taught. This is your chance to understand the basic concepts and practice your skills in a “safe” environment. The topics covered in this tutorial include:
    • Why we need critical thinking skills
    • Ways in which language is used to get a particular message over
    • Dishonest “tricks” that can be used against us and how to spot them

    There are practical exercises, and we’ll conclude with a debate on how this new skill can be applied to our everyday work-related tasks.

  • Open Source Tools for Test Management

    This track session was the winner of the inaugural VideoSTAR competition, where prospective speakers submitted short videos on what they wished to speak on and the testing community voted for their favourite.
    Testing teams face multiple challenges when it comes to test planning, execution and tracking. Join C.V. Narayanan as he explains the principles of test management such as test planning, tool selection and the process of setting up an effective integrated open source based test management tool set supported by branded tools. During the session he will address the planning & execution of tests, defect management and different types of testing covering functional and non-functional testing using integrated open source platforms.

  • Alternatives to Testing

    Traditional Testing R.I.P. Alternative Testing. Do we have to test like we always have?

  • The Mobile App Quality Challenge: Why Mobile Apps Need Real-World Testing Coverage

    In the highly competitive world of mobile applications, getting high-quality apps to market quickly can make or break the success of a product or company. With new applications for BlackBerry, iPhone and Android battling for media attention and consumer dollars, the pressure to get apps built, tested and launched has never been greater.

    But the testing methods that have worked for web and desktop app companies (e.g. in-house QA, outsourcing, emulators/simulators, beta testers) do not meet the testing needs of mobile apps. The testing matrices in the mobile app world are far too complex. Companies must test their apps across handset makers and models, wireless carriers, OS, browsers and location.
    This calls for a new approach – crowdsourcing. Doron Reuveni provides insight into the growing trend of crowdsourced testing for mobile applications, as well as addresses the benefits & challenges of this new model.

  • Keynote 5: Moving Beyond The Status Quo – Differentiating Ourselves and Demonstrating Value

    The work should speak for itself. We’re working so incredibly hard, and it’s never appreciated. We just shipped a poor-quality product and no one listened. This agile testing nonsense doesn’t work, nor does it truly apply to us anyway. “They” always seem to want “more”.

    Do some of these comments sound familiar to you? They do to me. They represent a phenomenon I call Status Quo or Commodity Test Thinking. We want to believe that simply doing our jobs is enough, but it’s not. You’re attending a wonderful event focused on sharpening old skills and introducing many new ones. But what’s next?
    In this keynote, Bob Galen will share methods for sharpening your communication and marketing (yes marketing!) skills as test leaders and testers so that your key partners better understand your role, value and contributions. Come prepared to engage & communicate.

  • Keynote 4: For the Love of the Game

    This is a tour of how an award-winning computer game came to life. It is a look behind the scenes of an uncompromising software business where failure is not an option and testing is more than checking for correctness.
    In this talk, Dino Patti describes what it takes to bring together creativity, innovation and a passion for excellence. Game development naturally has a lot of unknowns, and reducing the challenge to the core has been absolutely necessary for them to keep the development and testing under control and the budget within a reasonable frame. He’ll talk about the principles they used as their mantra, and explain their direct hands-on approach and the learnings which came out of that. What are the quality challenges? And what kind of testing is needed when simply meeting specifications is not enough?

    • The challenges in the computer game industry
    • Why testing computer games is different
    • How to inspire creativity and passion in your teams

  • Say What? Testing a Voice Activated System

    Whilst working as the Test Manager for one of Australia/New Zealand’s ‘big 4’ banks, an interesting proposal was made. The bank wanted to introduce a voice activated staff directory system. The bank telephone operators were directing 30,000 calls per month and it was felt that some sort of automation could save huge amounts of time and effort. I was asked to Test Manage this project and I jumped at the chance.

    This presentation is a journey through that project from test case design to acceptance. I’ll look at the use of traditional and new test case design techniques in a real live situation and show how testing at all levels can be a whole lot of fun whilst taxing the brain to the point that ‘testers block’ was becoming an issue.
    It taught me a lot about myself, others and what a bit of passion, trust and ingenuity can do.

  • The Carrot or the Whip – What Motivates Testers?

    People are one of the most important ingredients in any successful software testing venture. Building a skilful, passionate and dedicated team can be the difference between efficient and effective testing, compared to failing to meet test budgets, schedules or quality goals. A key aspect of successful test team leadership is understanding how to motivate your team, so they are inspired to achieve successful outcomes every time.

    In this presentation, practical approaches for motivating testing teams are presented. Each approach has been implemented by test managers and test leads around the world. These practical, proven-in-use strategies are supported by examination of the benefits of each approach from the test manager’s and test team’s perspectives. A selection of commonly ineffective, de-motivating techniques that should be avoided are also presented, providing insight into strategies that typically reduce tester motivation and result in lower quality testing.

  • A Lean And Mean Approach To Model Based Testing

    This presentation will show how to use open source tools for model based testing, without having to learn complex new techniques and/or having to invest in expensive programs.
    Based on state transition diagrams and “key word” driven Excel sheets, it is demonstrated that you can start experimenting with model based testing. I present a method for online and offline tests, and also show how to automate these tests.
    This is an experimental demonstration of the tools for test- or project managers, testers and programmers. The first half will be general, the second half more technical.
    Key points:
    • Model based testing and test automation
    • Using open source tools
    • Demonstrating state transition diagrams for testing
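    A state-transition model of the kind the key points describe can be sketched with nothing more than a dictionary and a random walk; the login-dialog model below is a hypothetical example, not the speaker's material:

```python
import random

# Toy state-transition model of a login dialog (hypothetical).
# Keys are states; values map actions to the resulting state.
MODEL = {
    "LoggedOut": {"enter_valid_credentials": "LoggedIn",
                  "enter_bad_credentials": "LoggedOut"},
    "LoggedIn":  {"log_out": "LoggedOut",
                  "open_settings": "Settings"},
    "Settings":  {"close_settings": "LoggedIn"},
}

def generate_test(model, start, steps, seed=0):
    """Walk the model randomly, returning the (action, state) sequence.
    Each generated sequence is a candidate offline test case."""
    rng = random.Random(seed)
    state, path = start, []
    for _ in range(steps):
        action = rng.choice(sorted(model[state]))  # pick any legal action
        state = model[state][action]
        path.append((action, state))
    return path

for action, state in generate_test(MODEL, "LoggedOut", 5):
    print(f"{action:26s} -> {state}")
```

    In a keyword-driven setup, each action name would map to a row in the Excel sheet that drives the automation.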

  • Testers’ Toolbox – 7 Cognitive Techniques

    This software testing conference workshop will take you on a magical 90 minute journey through seven very useful but mostly unknown tools for perception and comprehension which will help you every day.

    The tools and their techniques are easy to learn and very powerful to use. And they will help you to master testing in the industry’s currently very demanding transition from that of a structured V-model history to a leaner, more agile and exploratory approach.

    The techniques are:

    • Gall-Peters Projection
    • Popper’s Theory of Testability
    • Mind Control
    • The Stroop Effect
    • The Necker Cube
    • The Spinning Dancer
    • e-prime

  • Testing Package Solutions: Business As Usual?

    The last few years have seen a large increase in the number of organizations implementing software packages like SAP, PeopleSoft, Siebel, Oracle and so on. Obviously organizations run huge risks with these implementations, yet these risks are usually very much underestimated (by the organisation, not by the testers!). The risks lie not so much in the functionality of the package as in the implementation in the organization and in the custom-made modifications. This presentation deals with testing when implementing packages, and with the aspects that differ from those found in ‘traditional’ custom-made software.

  • Inspiring Passion in Test Teams

    Some of the most successful and highly regarded test teams are composed of individuals passionate about their role and committed to performing at their best. Unfortunately, testing is sometimes viewed as a less than fulfilling role requiring little or no skill, and individuals enter it underwhelmed and unaware of the positive and rewarding challenge it can bring.

    So how can individuals be inspired to be passionate about their role as testers? Where does inspiration and passion come from? This presentation will evaluate sources of inspiration and the power of learning to inspire and motivate. Suggestions are provided to assist test leads and managers in establishing continuous learning environments.

  • Embedding a Passion for Quality Across the Lifecycle – We Just Love Testers!

    This presentation could have been titled – ‘A passion for testing – how a tester held a project to ransom’. With the right focus test teams can be seen to add real value to a project instead of being the eternal problem. Part of the problem is that with the wrong focus test teams can actually get in the way of the generation of good quality software.

    Using case studies we will look at how we might avoid a situation where testers could be accused of being a problem, in other words being accused of having a negative passion for testing, and the benefits this will bring to the whole project not just testing.

  • From Model-Driven Testing to Test-Driven Modelling

    Test-Driven Development is one of the most successful agile development practices, which enables a higher quality code and software based on it. Model-Driven Development aims at raising the abstraction level of software development by shifting from programming to modelling. While there are a number of initiatives on Model-Driven Testing, there is a lack of initiative on promoting Test-Driven Modelling (TDM).

    In this session, we will introduce TDM by demonstrating how to apply it to modelling the data, service, and user interface layers of a small but illustrative case study.

    TDM promotes defining examples for designed abstractions already in the model. It enables involving quality engineers into a close collaboration with developers by preparing sample models, which contributes to producing better quality models and better quality software based on these models.

  • Weekend Testing – Skilled Software Testing Unleashed

    This is a detailed experience report of Weekend Testing (WT), an innovative idea that is bringing passionate testers under one roof. The main areas of focus include an introduction to Weekend Testing, its mission and vision, journey, roadblocks, testimonials, achievements and future plans, among other interesting experiences.

    What is special about Weekend Testing?
    • WT is about sharing passion to improve testing skills and contribute to software community.
    • Testers are free to TEST their ideas in WT sessions; the TESTER is the TEST MANAGER.
    • Development of testing skills [Questioning, Bug hunting, Observation, Recognizing and Clearing traps, bug Investigation, Note taking, Collaboration, Rapid learning, Time management etc] is given prime importance.
    • WT is FREE of cost and is conducted every weekend in multiple chapters across the globe.
    • An opportunity to share passion for SKILLED TESTING

  • A Scrum Master and Test Manager’s Dialogue

    How to bridge the gap between sound independent testing and agile methodologies like Scrum?
    We DO have an issue, because a main Scrum principle is: “Trust the Team”. A Scrum sprint delivers “working software, ready for deployment”. So “no need for independent testing” may be a strongly advocated viewpoint of Scrum people you are collaborating with. And it makes sense! To a certain extent at least.

    What’s a tester’s position here? Do we agree, or do we oppose? Or something in between? Witness the discussion between a passionate Scrum master and a passionate test manager! Be there when two clashing viewpoints evolve into a really smart agreement on effectively combining independent testing in a Scrum setting.

    Egbert Bouman, a strong advocate of sound testing practices, and Geert Bossuyt, a leading Scrum authority will bring you an entertaining and valuable experience.

  • Testing ‘Slow Flows’ Fast

    We live in a world where instant fulfillment has become a buzzword in business. Everything needs to go faster and faster in our ‘online’ world, including our information services. Too often, we forget about ‘slower’ processes that need to be tested. These processes, which often interface independent systems and are sensitive to different timers, are considered impossible or very difficult to test. They are often tested only in a system testing phase, where real end-to-end tests would mitigate risks much further.

    This track session from the EuroSTAR Software Testing Conference 2010 in Copenhagen will show you how you can adapt your own test automation framework to become ‘interrupt driven’, using some simple adjustments. An interrupt-driven test automation framework allows you to test your ‘slow’ processes end-to-end, reducing risks in cross-system communications and ensuring correct time-related processing. The session contains a step-by-step demonstration.
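    As a rough illustration of the idea (not the speaker's actual framework), an 'interrupt driven' runner can be sketched as an event loop that dispatches registered callbacks when simulated timer events arrive, instead of blocking each test on a sleep:

```python
import heapq

class InterruptDrivenRunner:
    """Minimal sketch of an 'interrupt driven' test runner (illustrative,
    names are hypothetical): instead of sleeping until a slow back-end
    step finishes, each test registers a callback that fires when the
    matching event arrives, so many slow flows can be in flight at once."""
    def __init__(self):
        self._events = []     # (due_time, name): simulated event source
        self._handlers = {}   # event name -> callback
        self.log = []

    def post(self, due_time, name):
        heapq.heappush(self._events, (due_time, name))

    def on(self, name, callback):
        self._handlers[name] = callback

    def run(self):
        # Dispatch events in time order to whichever test is waiting.
        while self._events:
            t, name = heapq.heappop(self._events)
            if name in self._handlers:
                self._handlers.pop(name)(t)

runner = InterruptDrivenRunner()
# Two slow end-to-end flows awaited concurrently:
runner.on("invoice_settled", lambda t: runner.log.append(("invoice", t)))
runner.on("batch_exported",  lambda t: runner.log.append(("batch", t)))
runner.post(30, "batch_exported")   # simulated timer-driven events
runner.post(10, "invoice_settled")
runner.run()
print(runner.log)
```

    In a real framework the `post` calls would come from message-queue listeners or timers rather than the test itself.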

  • Building a Quality Driven Team

    How do you deal with team members and customers who, maybe subconsciously, jeopardise quality? What can you do to get them ‘test-infected’?

    In this session I will explain what a quality-driven team is and how you get one, taking an agile perspective. In agile teams, every step you take as a team should be the right step. As a tester you can provide value by getting your team to do Acceptance TDD, Unit Testing, Exploratory Testing and Early Acceptance Testing.

    To get non-testers to do testing requires influencing skills. Based on my experience, I will discuss techniques on how to influence people. As a tester, you should be a leader!

  • Help I’m Stuck in a Scrum Project!

    Many testers think it is difficult to adapt and produce high-quality tests in a Scrum project. The different parts of the test process are often questioned, and it is hard to maintain the orderliness that good testing is founded on. During the presentation you will get some practical tips on how to continue your efficient testing when you start working according to Scrum.

    This presentation isn’t mainly about the Scrum process, but about how you can map traditional testing into a Scrum project. Is it really possible to not write a test plan? How can system tests and bug reports be handled?
    Key points:
    • Practical tips on how to test in a Scrum project
    • How it is possible to not write a test plan and how to handle your bug reports in a Scrum project
    • How testers can be confident about working in a Scrum team

  • Agile Test Design & Automation of a Life-Critical Medical Device

    We present a practical, hands-on example of how we, a combined team of 6 test specialists consisting of both manual and automated testers, cooperated to create an agile yet structured, traceable and time-efficient test process involving both manual testing and test automation.

    The development team practiced Scrum, so requirements evolved during development. The rest of the project organization followed a V-model approach, due to the regulatory requirements inherent in creating a life-critical product. Thus, as testers, we faced the demand for an agile yet structured and traceable test process. Due to the size and complexity of the product, it was necessary to automate the manual tests.

    The presentation will mainly focus on how we organised and managed the process of devising expandable, flexible, maintainable and automation-friendly tests, rather than on the technicalities of the automation. It will be an honest, real-life account of our achievements and the errors we committed.

  • eResept – Implementing Electronic Prescription In Norway

    Test management and testing in a program with several distributed projects is extremely challenging and vulnerable. This presentation will focus on how to manage different test phases through a program owned by the Department of Health in Norway, which has overall responsibility for the program. The presentation will address some main issues in collaboration between distributed projects when it comes to development, testing and deliveries.

    Some projects are owned by the government, others by private companies. I will talk about politics and competition, and finally how a bicycle made people ride together.

    Collaboration and trust have been important words in the contracts between the parties in this program. Important issues to focus on are the challenges to collaboration when deliveries were late: did we manage to keep the focus on collaboration, or did it shift to competition?

  • Testing – From A Critic’s Perspective!

    Daily requirement changes, continual integration of new features, user interface changes and technology challenges have become more common in today’s testing projects. With these changing dynamics, traditional testing methodologies need to change.

    In this presentation Palak Kedia shares her experience and some practical techniques for keeping testing efforts on track while reacting to fast-paced projects with changing priorities, technologies and user needs. Besides this, the need for collaboration with the other project teams and the question of ‘independent thinker’ vs. ‘independent tester’ are also discussed.

    Let’s look at the testing challenges with a new perspective and discover ways to take control of the situation, rather than be controlled by it.

  • Lean Kaizen Applied to Software Testing

    The Test Kaizen Blitz presentation will explain how ATP has applied lean principles to software testing, as a new way of looking at “continuously” improving all of ATP’s test activities. I will walk through the main processes step by step, to show in detail how we have used it in our test setup.

    I will show how this fits into our way of getting inspiration for process improvement, and the impact on our overall test improvement view.

    It was very important that we rewrote all of our findings, so Management could understand the meaning of the findings and value of the improvement areas. This was necessary in order to get Management acceptance of the investment.
    I will present our findings and how we have categorised, prioritized and implemented them in our test organization.

  • Session-Based Test Management in Practice

    In Session-Based Test Management, you divide testing into chunks you can deal with. Each chunk focuses on particular areas of functionality, problems, usages or alike. You document the testing done, not just what you thought you could do. You are free to get ideas, and follow them. The tester makes the operational decisions.

    And should the tester choose to follow the ‘wrong’ path, the testing is saved by the time box and the debrief – but at the cost that the test manager must be engaged in the work, not simply looking at how many test cases were run that day.

    And it’s dynamic. Almost instantly a number of testers can be directed towards new functionality, new areas of problems, new risks. And yes – it is in fact very structured. And it’s also difficult and demanding – and rewarding.
    It’s a must-have in the tester’s toolbox.
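    The "chunk" the abstract describes can be represented very simply; this is a minimal sketch with hypothetical field names, not a definitive SBTM schema:

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """One SBTM chunk: a time-boxed charter plus a record of the
    testing actually done (notes), ready for the debrief."""
    charter: str
    duration_min: int = 90          # the time box
    notes: list = field(default_factory=list)
    bugs: list = field(default_factory=list)

    def debrief_summary(self) -> str:
        # What the test manager reviews at the debrief.
        return (f"Charter: {self.charter} | {self.duration_min} min | "
                f"{len(self.notes)} notes, {len(self.bugs)} bugs")

s = Session("Explore CSV export with malformed input")
s.notes.append("Tried empty file, quoted commas, 1M rows")
s.bugs.append("Crash on BOM-prefixed header")
print(s.debrief_summary())
```

    The point of the structure is that the notes document testing done, not test cases planned, which is exactly the distinction the abstract draws.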

  • What We Can Learn From Big Bugs That Got Away

    If you have ever shipped a piece of software, you probably have your own “big bug that got away” story. Some missed bugs are bizarre edge cases; for others, you have to ask how they got missed. In either case, the point is not so much that the bug got through testing and out into the wild; this workshop will focus on learning from these bugs.

    In this fun and exciting workshop, Ken Johnston presents his own Microsoft bug stories, such as the elusive “Sasquatch” and “Billing 101” bugs that got away. Ken leads attendees to explore root cause concepts, such as seasonality, serialization, certification blind spots, and QoS for services.

    Take away models for doing your own root cause analysis and implementing process improvements within your team. Bring your current bug challenges and explore others’ bugs in this highly interactive session, where you learn while commiserating with your peers.

  • Test Accounting

    Successful software development and testing is never accidental. It always arises from organizations where transferring useful information is valued and accounted for. Most organizations value accounting as a means to report business results. Yet few organizations use accounting for monitoring the delivery of their IT services on a day-to-day basis.

    ‘Test accounting’ can be defined as the bookkeeping involved in making quality records of transferring information and IT services, and in preparing statements concerning the assets, liabilities, and operating results of those IT services.

    In this talk I will set up bookkeeping for testing purposes, using accounts for disciplines and stakeholders, e.g. for Design, Build, Testing and Business owners. I will explain how accounting standards and measures can be applied, to end up with usable ‘balance sheets’ and ‘result statements’.

    Accounting standards teach us how to capture estimates including their underlying planning assumptions. With that, both control and agility are provided.

  • Testing and Lean Principles

    The word Lean was originally popularized in 1990 to characterize the Japanese approach to the automobile industry.
    Although the approach initially aimed to improve the manufacturing industry, the ideas have since spread to public authorities, healthcare, software development and testing.

    Some of the cornerstones for lean development can be summarized as: create value for customers, eliminating waste, just in time, demand driven output, respect for people and continuous improvements.

    Although lean has been adopted in software development methods such as agile development and Scrum, testing has been regarded as a support procedure for these methods, without using ideas directly related to the lean principles.
    This presentation will show how lean, from a testing point of view, is interpreted and used, with examples from several organizations. It will show that testing cannot improve on its own, but needs close cooperation with the other disciplines within software and systems development.

  • Branch Out Using Classification Trees For Test Case Design

    Classification trees are a structured, visual approach to identify and categorize class partitions for test objects. They visually document test requirements to make them easy to create and comprehend. Julie will explain this powerful technique and how it helps all stakeholders understand exactly what is involved in testing and offers an easier way to validate test designs. Using examples, Julie will show you how to create classification trees, how to construct test cases from them, and how they complement other testing techniques in every stage of testing. Julie will demonstrate a free classification tree editing tool that helps you build, maintain, display, and use classification trees.
    Key points:
    • The essentials to the Classification tree technique
    • Why it should be considered as an essential technique in the tester’s tool box
    • When and when not to use Classification Trees
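    The mechanics of turning a tree's class partitions into test cases can be sketched as below; the ticket-pricing tree is a hypothetical example, and full combination coverage is shown only as the simplest selection strategy:

```python
from itertools import product

# Hypothetical classification tree for a ticket-pricing function:
# each classification lists its class partitions (the tree's leaves).
TREE = {
    "customer": ["adult", "child", "senior"],
    "day":      ["weekday", "weekend"],
    "payment":  ["cash", "card"],
}

def all_combinations(tree):
    """Full combination coverage: one test case per leaf combination."""
    names = sorted(tree)
    return [dict(zip(names, combo))
            for combo in product(*(tree[n] for n in names))]

cases = all_combinations(TREE)
print(len(cases))   # 3 * 2 * 2 = 12 candidate test cases
print(cases[0])
```

    In practice a tree editor lets you pick a subset of these combinations (e.g. pairwise) rather than running all of them.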

  • Keynote 3: When Passion Obscures the Facts: The Case for Evidence-Based Testing

    Stuart Reid talks about ‘When Passion Obscures the Facts: The Case for Evidence-Based Testing’.

  • Risk Based Testing – A Must For Medical Devices

    A basic design input in the development of medical devices is a risk analysis whose purpose is to identify possible unintended misuse and harm to the patient when using a medical device. Mitigations suggest means in the product design, which will minimize the risk when using the device.

    Testing medical devices is based on risk analysis in order to convince medical authorities of the safe use of the device. If the launched product fails anyway, it will surely lead to a mandatory recall from the market. In order to prove process maturity to the medical authorities, it is mandatory to apply systems for requirement management and for configuration management. The testing activities are part of these management systems.

    Some examples of FDA (Food and Drug Administration) recalls from the American market will illustrate the seriousness of testing medical devices.

  • Virtual Test Labs, The Next Frontier

    Are you spending too much time setting up test environments? Do you have too many “can’t repro” defects? Test lab virtualization may be the answer you’re looking for. It’s no longer just a promise; it can be a reality in modern test labs today.

    Darshan Desai explains how to leverage virtualization to solve some of your complex testing problems. Virtualization provides the ability to create and share test environments quickly and do more testing in the same amount of time. Darshan explains how virtualization reduces the total cost of ownership of test labs and helps you test earlier on production-like environments. More importantly, you’ll be able to file high-quality, actionable defect reports that are reproducible for the developer.

    Learn how successful teams at Microsoft use virtual test labs and understand the best practices and the pitfalls to watch out for when you go virtual.

  • Can Exploratory Testing Save Lives?

    In this presentation Fredrik describes how Exploratory Testing emerged at the MedTech company Maquet (which develops life-supporting systems) and how it has strongly contributed to high product quality and a market-leading product. Maquet has also cooperated with James Bach, which has resulted in tool support for Session-Based Test Management.

  • I Don’t Want To Be A Tester Any More!

    Let’s be expectation engineers!
    This talk will present expectation engineering, where testers and requirements engineers join forces and use their skills to express the stakeholders’ expectations in such a way that the expectations not only serve as a communication platform with stakeholders, but also serve directly as a basis for test cases to verify the fulfillment of the expectation. This will reduce redundant work, and enhance the chances of the development being faster and easier (and more fun).

  • Causes of Losing Valuable Testing Time

    Many methodologies tell us how to practice testing. Management does not want methods; they want reliable systems and a short time to market. When testing is on the critical path, each delay in testing has a great impact. It is therefore important to reduce the time killers in the test project.

    During this year’s presentation, I will share the results of my quest for the causes of lost testing time. The result is a powerful checklist that indicates the areas of attention for any tester who wants to save testing time. Learn how to recognize the time killers in your project and create an awareness that helps you shorten the time to market.

    With this checklist we can really help IT managers. It helps them address the challenges of the project. And the best thing is, we testers can provide them with it!

  • Ready For The Cloud

    The IT world is headed towards Cloud Computing. Are you ready to move your data, applications, and services into a brand new technology platform? Moving to the Cloud can be unnecessarily dangerous and risky.

    In this presentation, Frank Cohen delivers an immediately useful checklist and actionable knowledge for any medium or large organization making the move to the Cloud. Cohen shows how proven techniques, tools and methodologies – Business Service Management, load testing and application service monitoring – can be adapted to achieve secure and profitable IT operations in the Cloud. He also shows how open source Cloud testing tools and processes reduce risks, increase security and deliver service excellence.

  • Exploratory Testing Champions

    Come and enjoy my story about how I introduced Exploratory Testing Champions at a Swedish telecom company. This was to meet the astonishing request to implement ET in only 8 days in an organization with 80 testers!
    My presentation will give you a road map and a tool kit to enable you to set up a similar project. What can it take to get Exploratory testing into your current processes and ways of working?
    I present a background of the company; the scope, contents and results of the workshops; and the responsibilities and expectations of the Exploratory Testing Champions. I will present real facts from this live case: the schedule, comments from participants, and where we are today.

  • Working Ourselves Out of a Job: A Passion for Improvement

    Should testers be needed? If everything went well, if no-one made mistakes, if nothing needed improving, then testing and testers would not be needed.

    Finding defects and failures late is more expensive than preventing them. In many organisations testing is not well performed – it is late, expensive, ineffective and inefficient. Time to change our ways.

    If “improvement” means late, expensive testing done faster, we have a job for life. But instead, improving IT means having a more rewarding job working in a professional, consultative and advisory role.

    In this talk we will discuss the need to:
    – Focus on improving the service to the customer, not on improving testing
    – Focus on enabling others to improve, not on increasing the need for our involvement
    – Focus our passion on the big picture while helping others to focus on the details

  • Keynote 2: Monty Python’s Flying Test Lab

    Monty Python’s Flying Circus revolutionized comedy and brought zany British humour to a world-wide audience.
    However, buried deep in the hilarity and camouflaged in its twisted wit lie many important testing lessons: tips and techniques you can apply to real-world problems to deal with turbulent projects, changing requirements, and stubborn project stakeholders.
    Rob Sabourin examines some of the most famous Python bits: “The Spanish Inquisition” telling us to expect the unexpected, “The Dead Parrot” asking if we should really deliver this product to the customer, “The Argument” teaching us about bug advocacy, “Self Defense Against Fresh Fruit” demonstrating the need to pick the right testing tool, and a host of other goofy gags, each one with a lesson for testers.

  • Keynote 1: Putting Tradition to the Test

    Antony Marcano, RiverGlide, UK, takes a look at three different stories from the everyday lives of some very different people, to see how the influences of a changing world have led them to go beyond the constraints of traditional roles.
    In this talk Antony Marcano draws on aspects of his own life experience to take you on some unexpected journeys through life, tradition and change.

  • Pushing the Boundaries of User Experience Test Automation

    One of my current responsibilities is to find ways to automate as much as practical of the ‘testing’ of the user experience (UX) of complex web-based applications. In my view, full test automation of UX is impractical and probably unwise, however we can use automation to find potential problems in UX even of rich, complex applications.

    I, and others, are working to find ways to use automation to discover various types of these potential problems. Here’s an overview of some of the points I made. The work is ongoing and is likely to have demonstrated significant business benefits by the time of the EuroSTAR conference. In my experience, heuristics are useful in helping identify potential issues. Various people have managed to create test automation that essentially automates various heuristics. All the examples I’ve provided are available as free open-source software (FOSS). I’ve learnt to value open source because it reduces the cost of experimentation and allows us to extend and modify the code, e.g. to add new heuristics relatively easily (you still need to be able to write code, but the code is freely and immediately available).

    Automation is (often) necessary, but not sufficient. Automation and automated tests can be beguiling, and paradoxically increase the chances of missing critical problems if we choose to rely mainly or even solely on the automated tests. Even with state-of-the-art (the best we can do across the industry) automated tests, I still believe we need to ask additional questions about the software being tested. Sadly, in my experience, most automated tests are poorly designed and implemented, which increases the likelihood of problems eluding them.
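    The idea of automating a UX heuristic can be made concrete with a small sketch. The example below is not from the talk; it is an assumed, minimal illustration of one common heuristic (“every image should have alt text”) automated with only the Python standard library:

```python
# Illustrative sketch (not from the talk): automating one simple UX heuristic,
# "every image should have alt text", using only the Python standard library.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that lack a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; value may be None
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                self.violations.append(attr_map.get("src", "<no src>"))

page = ('<html><body>'
        '<img src="logo.png" alt="Company logo">'
        '<img src="chart.png">'
        '</body></html>')
checker = MissingAltChecker()
checker.feed(page)
print(checker.violations)  # ['chart.png']
```

    A real heuristic suite would of course check many more properties (contrast, link text, layout breakage), but each check follows this same pattern: codify the heuristic, scan the page, report candidates for a human to judge.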

  • Defect Prevention: How To Make It Really Happen

    The first EuroSTAR conference I attended really opened my eyes. I learned that the real challenge of testing is not finding defects and getting them fixed, but catching bugs as soon as possible, preferably before they sneak into the product being built. The next few years of my career as a tester proved that this is easier said than done. I saw projects fail because suddenly, during acceptance test, all kinds of problems appeared. I heard project managers complain about testers delaying the project because they found too many defects. Still, I never forgot the lesson from that first EuroSTAR. Convinced that prevention is better than cure, I consistently applied some simple principles that really work.

    1. Focus on Business Context

    When I start a new assignment as a test manager, I always look for the business case and project goals, whether or not I am asked to. I check them by simply asking questions of various people. If I don’t get consistent answers, I have found my first major defect and I will write my first test report. But how do you make sure this message is heard and results in proper actions?

    2. Co-operate with Requirement Engineers and Designers

    Good requirements are crucial for a successful project. Testers should have a big role in validating them, but in real life they often experience resistance and don’t know how to act. How can you, as a tester, add value to the project team in this area? How can you become a respected sparring partner for requirements engineers? Some useful techniques are simulation, walkthroughs and test modeling. Some communication skills will help too.

    3. Co-operate with Developers

    Often, component testing seems a mysterious activity, misunderstood by testers as well as developers. When applied in the right way, it can prevent lots of problems and dramatically improve project performance. Therefore I have some tips for testers on becoming friends with the developers, learning to understand each other and working together as a team.

  • Implementing Change = Methods + People + Processes

    Stop talking about how great changes can be made by focusing on Methods, People or Processes alone. The only way to make persistent change is to work with Methods AND People AND Processes. For example, having the best tools and highly motivated people will not ensure a persistent change if it does not fit into the organization’s processes. When pursuing quality as a Test Manager, you probably want to implement improvements constantly: not only in the way your test team is working, but also in the requirements team and the development team. What about the Management Team? Ever wanted to change the way they look at things too?

    This presentation provides a new view on Change Management and explains the importance of working with all three aspects:

    – Methods/tools
    – People
    – Processes

    on all organizational levels:

    – Strategic
    – Tactical
    – Operational.

    The challenge is that each organizational level speaks its own language. At the strategic level, arguments and metrics should be expressed in money, traffic lights or customer satisfaction – not in test techniques, numbers of tests or user stories as on the operational level. The maturity level of the organization is another aspect that has a great influence on how and what changes can be made successfully. The presentation will include important lessons learned from maturity models. Metrics are important for communicating the need for changes and the progress made; the possible use of metrics is very different at maturity levels 1 and 4. The presentation includes several real examples of metrics from a number of companies at different maturity levels.

    The speech is supported by several cases where the approach to change management described above was used, and in particular a case where maturity according to the Testing Maturity Model was increased from level 1 to 4 in just three years – requiring dozens of changes.

  • Customer Testing Quality in Outsourced Development

    The insurance company made the decision to outsource most of its IT development and technical maintenance to suppliers. This placed new requirements on testing and quality assurance in the company and raised a lot of questions:
    – How do we ensure that suppliers perform a test which provides a solution that is not filled with defects?
    – What are the responsibilities for the test activities between supplier and customer?
    – How do we ensure effective testing without delays due to misunderstandings between supplier and customer?
    – What are the test criteria for the supplier and how should they report on them?
    – How do we ensure that test material used by one supplier for development can be re-used by another supplier for maintenance testing in future?
    – How is defect handling, test reporting etc. best done between supplier and customer?

    From this, the company created a new test model and test policy which include setting test and quality requirements for the supplier. The model has a defined test contract appendix which sets the requirements for the suppliers. These include that suppliers should in future use the company’s own templates and must uphold the company’s test policy. This was done to ensure that all suppliers follow the same guidelines, as many projects had more than one supplier involved in application and technical development. The model has a high focus on ensuring test quality, test reporting and approval in each test phase, according to the defined acceptance criteria.

    In-house, the company focused on communicating with and educating anyone working as a tester within acceptance testing or as a test manager. This was to ensure that they were adequately trained to perform test activities of high quality, and had the competencies to ensure test quality from suppliers and that deliveries by suppliers were as required. During implementation of the new model there was a specific focus on communication with, and approval by, management to ensure success.

  • Houston We Have a Problem

    Prevent the surprise, become a pro-active test manager. Too often projects suddenly seem to spin out of control. Challenges and risks keep stacking up and the defect count grows exponentially. At the same time, management can put pressure on you, asking when testing will be completed.

    A surprise? Not really: defects only paint half the picture. The test effort, after all, is primarily determined by the number of tests that need to be completed. For an on-the-spot status of testing and an accurate view of the quality and risks of the entire project, we need to organize the test process to provide flexible, up-to-date metrics and trends on a daily basis; for example, a view of baseline vs. actuals and ETCs (estimates to complete) on test cases. Advanced metrics will provide answers on what needs to be done tomorrow to stay on track, the location and root cause of issues, and who is required to take action. The test effort remaining for an acceptable product (or a specific risk level) can also be estimated fairly accurately.

    In addition, early involvement and preparation in the development life cycle, performing test intakes rather than reviews, will help you bridge the gap between different development teams and allow you to verify consistency between business requirements, the integration model, functional specifications and technical specifications. It facilitates knowledge transfer and provides you with the “story” behind the specifications. This will help prevent structural issues at an early stage and avoid blocking issues during test execution.

    This presentation combines daily test metrics and trends with test process dynamics and shows you how to become a “pro-active” test manager. Even better, you can apply it tomorrow and take your test process to a distinctly higher maturity level.

  • Happiness As Quality Measurement

    Professors from around the world are discovering happiness as a new topic for their studies. In the IT world we also often hear that we have to bring happiness to the client, project leader, developers, users, and so on. In short: quality of life (another description for happiness) has a lot of studies to offer. They are based on psychology, sociology and the world of

    During our trip through the world of ‘quality of life’ we try to bring that quality to the software testing life. We learn about our reptilian brain and why it can be stronger than the work of a complete development team. We find out how to translate The Popsicle Index to the testing world. We learn how ‘zero tolerance for digital failure’ and ‘“good enough” technology’ go together in the New Normal. Discover how past experiences with an IT application influence our future thoughts and quality view of it.

    We also build on ‘mindfulness’ (which comes from the world of healing deeply depressed people), because we all feel depressed sometimes in the development lifecycle of a software project. Come with a beginner’s mind to this refreshing session about everything you didn’t know about quality in real life, so you can implement it in the software testing life!

  • Life’s Too Short: Make a Difference

    When you’ve always worked for large corporations with tens of thousands of employees, following
    PRINCE2, V-model, waterfall development practices, in big testing departments with an established
    career path: tester, lead, manager – where’s the next logical place to go? Forget logical. What about an Agile startup of 15 people, as the first tester they’ve ever hired? Illogical perhaps, but the best jump I’ve ever made – being able to see the results of your work instead of being a cog in the machine? Priceless.

    This mini-presentation is the story of my own personal journey from Waterfall to Agile – increasingly
    frustrated at organisational silos and a development process that seemed ill-suited to our problems, I’d
    spotted a job that promised to be different in almost every way. Would I survive? How would I cope?
    And what could I possibly offer? I’d heard plenty about Agile: the greatest thing ever ever ever in the whoooole history of the planet, an over-hyped con, hostile to testers, tester heaven, just another way of selling expensive consultants and training, 100% automation no manual testing at all ever, yes there is manual testing but they don’t value exploratory, not only is there exploratory testing but it’s crucial to Agile.

    Which version of Agile would I find? Would I like it? Could a manual tester really be useful? I’ll talk about how things rapidly diverged from my original plans, how I found myself and later testers fitting into the team, what skills I found myself using, which I didn’t, and skills traditional testers might be surprised to know they already have.

    I want you to leave my talk with:
    * A better idea of what a first experience of Agile might be like if you’ve never tried it.
    * Stuff that might trip you up – and what’s WONDERFUL too!
    * Why being prepared to be unprepared might just be your saving grace.
    * A burning desire to go back home and work on improving quality, in whatever environment you’re in.

  • Chasing Quality in Cloud Computing: Testing Different Levels of Quality Requirements

    The IT world is changing quickly; in fact it’s accelerating into a completely new era. During the next couple of years the IT and testing scene will migrate towards servicing and sourcing, partly provided in private, public and hybrid clouds. There is no time to lose; in the tradition of the testing world, we are already late. Architects, designers, developers and suppliers such as Microsoft, Google, Amazon and IBM are working day and night to provide the technology and infrastructure for the near future: the cloudy future of IaaS, PaaS, SaaS, web services, mobile apps, virtualisation, social media and networks – in fact, the real Internet age.

    More and more applications consist of ‘cloudy’ solutions. Every piece of software that moves into the cloud increases the dependency on the ‘evil’ internet, and the end-to-end test complexity increases rapidly. Since all testing in the cloud IS end-to-end, the testing challenges are huge! Quality can be bought from the cloud: it brings you flexible performance and storage. Pay as you go. But what about the continuity, availability, elasticity and controllability of these cloud services that are delivered through the insecure minefield called the internet? Cloud Computing introduces a new set of quality requirements at different levels, e.g. at the level of the software supplier, at the level of the internet and at the end-to-end level. So, in short: how do we test the cloud?

  • You Say Kvalitet, I Say Quality: Stories Behind Establishing Quality Benchmarks

    At the heart of the matter was one word – QUALITY (or Kvalitet in Swedish). “Your software isn’t good enough” was the assertion from the customer. “You’re being unreasonable” was the retort from the vendor. “You are our last chance” was the message we received from the customer’s CEO. “You” being a small specialist (SWOT) team comprising a Senior Project Manager, a Systems Architect, a Technical Project Lead, a Senior Business Analyst and a Senior Testing Consultant. This is not an unfamiliar scenario for those of us who have seen the software landscape change from bespoke to prêt-à-porter solutions during the past 15 to 20 years. It is also not an unfamiliar outcome for lawyers to get rich in these situations and for the customer to lose credibility in the marketplace.

    In May 2009 I was asked to assist with a Transport Ticketing project that had been struggling to Go Live for the previous 2 years (the overall project lifespan was 7 years). The solution was 98% ready – just 360 requirements of the original 16,000 required sign-off. This had been the situation for over 12 months, with no agreed end in sight. How could they be so close and yet so far away? How we achieved (what appeared impossible just seven short months earlier) is the story I will tell at EuroSTAR 2011. How we turned “your software isn’t good enough” into “we’re going live next Saturday” will never be a Nordic legend, but it will be remembered as the project that saved a software vendor from bankruptcy and another major city from a multi-million dollar failure.

    At the centre of this story is how we re-focused everyone on QUALITY/KVALITET. What’s essential, what’s negotiable, what’s unacceptable? These questions were central to our success. Our team didn’t identify any new requirements or write any new software. We just analysed the available data, spoke to the right people, re-worked the Test Strategy and agreed on a new GO LIVE strategy with an agreed date – and then MADE IT HAPPEN.

    An interesting footnote from this case study is that the customer hired and paid us (and expected us to deliver), so we got little in the way of recognition. The vendor was over the moon, as we had saved them from an expensive lawsuit and (almost certain) bankruptcy. The vendor still calls us. The customer experienced NO significant outages – KVALITET!!

  • Testing & The Bottom Line: A New Method to Estimate The Value Of Good Software Testing

    In this presentation we will propose a new practical approach to estimating the economic impact of software quality and security testing on cost reduction, and also to estimating the cost of testing itself.

    Quality assurance is sometimes seen as an extra cost. In the fields of quality assurance and testing, the situation is worsened further by the difficulty of applying quantitative models widely adopted in more traditional industries: companies writing and testing software have had a hard time estimating the total costs of software testing procedures because of a lack of proper tools for these calculations. This is the problem we aimed to solve with our estimation framework.

    Of course, rather than costing extra, a carefully designed testing framework actually helps companies reduce losses caused by quality issues, post-release vulnerabilities and the consequent fix programs. Our cost estimation method aims at quantifying these savings in a statistically coherent, yet flexible manner.

    In the formulation of our framework, we use statistical models for quantifying the costs of different kinds of bugs, the times when bugs are found, the effect of testing on these times, and the effect of the testing approach and amount on how great the bug-discovery improvement is. While the method is initially described in statistical terms, we also demonstrate, through both symbolic manipulation and calculations using simulated data, how the framework can be used to estimate the total cost of testing and the total cost savings achieved by more thorough and earlier bug discovery.
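    The abstract does not give the model’s actual formulas, but the core intuition, that the expected cost of a bug depends on when it is found, can be sketched with an assumed toy calculation; the phases, probabilities and per-phase fix costs below are invented purely for illustration:

```python
# Illustrative sketch only: a toy expected-cost model for bug-discovery timing.
# The phases, probabilities and costs are invented for illustration.

def expected_bug_cost(find_probs, phase_costs):
    """Expected cost per bug, given the probability of finding it in each
    phase and the cost of fixing it when found in that phase."""
    assert abs(sum(find_probs) - 1.0) < 1e-9  # probabilities must sum to 1
    return sum(p * c for p, c in zip(find_probs, phase_costs))

phases = ["unit test", "system test", "production"]
costs = [100, 1000, 10000]  # assumed cost to fix a bug found in each phase

# Shifting discovery earlier changes only the probabilities, not the costs:
without_early_testing = expected_bug_cost([0.1, 0.3, 0.6], costs)
with_early_testing    = expected_bug_cost([0.6, 0.3, 0.1], costs)

print(round(without_early_testing))  # 6310
print(round(with_early_testing))     # 1360
```

    Multiplying the per-bug difference by an estimated bug count then yields a rough savings figure; a real model, like the statistical one the talk describes, would also account for uncertainty in the probabilities and costs.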

    The model is very promising in that, while being easy to deploy for first-time users, it also allows for robust statistical cost estimation and the use of previous data to refine the model, so that the initial model can be customized to fit a particular environment.

  • Boot Camp and Beyond

    Quality in the software development lifecycle can be seen as being underpinned by good Process, Tools, and People. After working on a number of pre-sales assignments last year, I realised that I could comfortably define our test process and tools approach, but when it came to ‘people’ questions, the response was disjointed. I could provide good individual answers to questions, like ‘What are the skills and experience of your test team?’ or ‘How would you implement a knowledge transfer plan?’, but there seemed to be no framework holding it all together and driving quality from a ‘people’ perspective.

    Then I was asked to assist some colleagues in France, who ran a service centre where they supported software applications for a large retail client. An opportunity had arisen to discuss with the client a testing service, where additional applications would be brought into the service centre and tested on the client’s behalf. My French colleagues needed help defining test processes and tools, which I was only too pleased to help with, but something caught my eye in their solution outline. They planned to utilise a tailored customer ‘academy’ they already had in place for the service centre, which brought in new team members and got them up to speed in everything they needed to know to hit the ground running. I had seen the academy concept in action before – but only very specifically around test analysis. What was new about the French model was that it encompassed customer context and culture, ongoing technical skills, and very specific functional knowledge that staff would have to gain in order to succeed.

    Much of the ‘people’ information I’d provided for individual projects was applicable in this wider model,
    and could be brought together to form a more cohesive approach to resource and knowledge
    management. I realised that with a bit of coordination, I could create a compelling model not only for
    creating a specific test team, but one which could pro-actively drive the quality of the testing (and the
    software) in the longer term.

  • Mind Maps As Part Of The Agile Testing Process

    Being part of a highly dynamic agile team where requirements change almost weekly puts a great deal of stress on the test department to keep up with new test cases and keep existing ones updated correctly.

    I found my solution in mind maps; these provide a very visible and flexible solution to the test case challenge. I will speak about a real-life example of how mind maps helped me on my current project to create a better test process in an agile team.

    • Initial test process on the project
    • Challenges in a highly dynamic agile team
    • Different attempts to overcome the challenges
    • New test process
    • My experiences with mind maps

  • Test Data Management & Project Quality Go Hand in Hand

    Traditionally, the testing community has perceived test data the same way most organisations perceive testing: boring, time-consuming and non-value-adding. But new winds are blowing. Prompted by the complex project and test environments of today, testing departments are now taking the first small steps towards recognising the importance of a focused test data management (TDM) function. Maybe the wider testing community will too? We have long passed the good old days when a mainframe test data copy would do the trick: the challenges in implementing a TDM function in today’s complex set-ups are many and insidious, and it needs a well-executed plan.

    This presentation draws on experiences and hardships gained from a TDM optimisation project and provides a live demo, inspiration and guidelines for moving forward with implementing and optimising a TDM function. The project was run alongside a large-scale ongoing SOA programme at a major Danish pension fund. The project focused on three areas: Technical, Process, and People & Communication.

    In the Technical area, the project developed a TDM Dashboard. As a main management component, the
    Dashboard provides a test data copy function from Production to Test and between test environments. Besides, it offers an overview of the test data in the different applications and environments.

    The Process area developed a TDM strategy and optimised the test data processes in order to deliver valid, transversal test data quicker. It focused on a wide range of areas such as production copying, data generation, handling of requirements, data cleaning, profile usage, data pools and data re-use.

    The People & Communication area focused on proactively including stakeholders in the test data process and on communicating roles and responsibilities as well as new functions and processes.

    Not only has the project delivered measurable and visible results, the number of defects in Production has been reduced; stressing that a well-implemented TDM function, with a continuous focus on optimising TDM, adds value and is worth the effort.

  • Model Based Test Design for Performance Testing and Other Non-Functional Requirements

    MBT (Model Based Testing) has been used within my department in Ericsson since 2007. As an MBT tool we have been using Conformiq Modeler, which is a commercially available tool. This has been a great success, and is now our main way of working when verifying functional requirements.

    Until now, MBT has only very rarely been used, within Ericsson or elsewhere, for verification of non-functional requirements, such as performance testing, load testing, stability and robustness tests and characteristics measurements.

    This presentation covers the work of two Master’s students, who in 2010 studied the possibilities of using MBT for verifying non-functional requirements. One of the results of this study was a new method, inspired by MDPE (Model Driven Performance Engineering), where non-functional requirements can be covered by test models describing the functional behavior. Test cases can then be generated from these models with an MBT tool.

    The proposed method provides different possibilities for handling the non-functional requirements. The requirements can, for example, be introduced with new dedicated states in the behavioral model, or be introduced by extending the existing state model. Another possibility is to implement the non-functional requirements in the test harness, thereby keeping the model simple. The most realistic scenario, however, is a combination of all of the above. The grouping and allocation of both functional and non-functional requirements should be considered as early as the test analysis phase.

    The new method has been tried out and evaluated. It has proven useful and fully applicable, and there are clear indications that it is beneficial and that project lead time can be reduced by using it. We have therefore now started to apply this method in our new development projects.

    The presentation includes examples of real cases where MBT has been used for verifying non-functional requirements.

  • Situational Leadership Styles on Test Approaches

    David Kolb describes four different experiential learning styles: doing, experiencing, observing and planning. Each person has their own preferred learning style and none can be called best. To maximize the retention of what you’ve learned, you should actually run through all 4 stages, independent of which stage you used as a starting point. On top of that, you might not always use the same starting point, depending on the situation, your experience and the way you are guided in your tasks.

    Hersey and Blanchard have provided us with a model to describe how we can guide others. This model has four different leadership styles which are dependent on a person’s motivation and knowledge. Every manager and every managed person fits at least one style, be it telling, selling, participating or delegating. Again no style can be called best, and styles may vary depending on the situation.

    It’s clear that every tester has their own preferred testing style. Some testers plan everything ahead, while others are fond of exploratory testing. Our own approach depends not only on our preferences but also on other circumstances, such as our experience, the budget and the deadlines we must meet.

    Can we draw any lines between David Kolb’s learning cycle and the different test approaches? Are agile, exploratory and structured testing, to name a few, different starting points in a testing cycle, like the learning styles in Kolb’s learning circle? What would this testing cycle look like and could we conclude that it doesn’t matter what approach you use because in the end, you’ve done them all? Should we conclude that no approach is best? And what is the role of the test manager in all this? Can we influence the testing style of our testers by changing the way we guide them, using Hersey and Blanchard’s model of leadership and guiding? And what guidance fits which testing approach best?

    In this presentation, we’ll dive into the theory of why a certain (test) approach might or might not work for you, and which management approach fits each testing style best.

  • Keynote 4: Power to the People – Achieve Fast, Sustainable Change in Quality by Harnessing the Crowd

    Why do quality improvement initiatives so often fail? Even when you have management buy-in, committed resources and clear objectives, what happens when the shiny new processes and ways of working you have been labouring over run into the cold hard reality of implementation? Why don’t your people want to change? What if your people were the change? What if your new processes were written by the very people who would have to use them? How might that transform their creation, their acceptance – and their eventual adoption?

    In 2010, the Application Services (IT) division of Deutsche Bank made the decision to become a world-class IT organisation – and this is the story of our journey. We identified the major areas of weakness in our delivery capability and created an improvement programme to address them. So far so like every other change initiative. But unlike previous, failed, transformation programmes, this one leveraged hundreds of people throughout DB in the definition, piloting and eventual implementation of the programme. We did this by creating specific communities for each of our core change areas – testing, project management, agile, and more. By the time we formally launched our first deliverable, 1200 people had joined the communities – and had helped shape the overall programme itself. Now, instead of a quality programme being defined by a few individuals and then “forced” onto the rest of the organisation, we had a quality initiative defined by the organisation, for the organisation. As a result, process definition was quick, sign-off was almost instantaneous – and implementation was cheap and quick – you don’t need to spend much time training people in a process they created themselves!

    This presentation will explore our continuing journey to world-class IT delivery – why we needed to change and how we gave the power of change to our people; how we created communities and how we won over the naysayers; how we know whether we made, and continue to make, a difference.

    Key takeaways:
    The secret to Deutsche Bank’s success lies in our communities (crowds). Participants will therefore learn the fundamentals of:
    Crowd management – launching successful communities
    Crowd production – using communities to create and sustain change
    Crowd control – barriers, channelling and kettling – how to ensure your communities stay on track

  • Keynote 3: Pursuing Quality? You Won’t Get There by Testing

    Manual testing, exploratory testing, user acceptance testing, integration testing, reliability testing, and so on, are techniques that require (at least partially) a working system and because of this they are very late arrivals in the quality game. All any of these techniques can hope to do is point out the lack of quality. Obviously bug removal is useful, but to be a full partner in actually helping achieve quality in the first place, testing and development have to be closer partners.

    James Whittaker shows how to weave testing and development together by both blurring the lines between the two roles and involving users in the development/testing process as well in a developer-tester-user quality partnership. In order to achieve this, we have to throw away traditional testing artefacts like test plans and test cases and focus only on quality activities that add measurable value at every stage of development. James explains these activities and shows examples of how they are performed at Google.

  • Keynote 2: Death to The Testing Phase

    While doing research for his latest book, Gojko interviewed a team with a 99% defect detection efficiency rate, a team that changed from six month delivery cycles filled with problems to daily releases without any big bugs, a team that had zero bugs in productions over a five year period and many more examples of fantastic quality. One of the major recurring themes in all these stories was cancelling testing as a phase.

    In this presentation, Gojko explains why cancelling testing as a phase forces teams to be much more effective with quality assurance and presents war stories and examples that will inspire you to improve your testing process.

  • Testing EURO Changeover – Too Big To Fail?

    You are facing testing a project called “Do or die!”. The biggest testing event – Estonia’s biggest bank (Swedbank) is changing over to the EURO currency. You have six months to do it. Roll-back or postponement is not an option. You have 15 testing teams to coordinate and you have to test ONLY the whole IT system of a bank and the changeover to EURO process. What will you do? I was appointed IT Test Manager for the programme. I’ll tell you what I did!

    The testing teams to be coordinated needed one “common thing” to rule them all. It turned out to be the common test environment, and so the environment-driven test plan was born. Lesson learned: plan the common things, and everyone aligns – YES! No single test plan was agreed, but the teams’ test plans were published to a commonly agreed and structured place. Teams aligned their activities against the plans of others and, when needed, re-planned their activities together with other teams. Lesson learned: don’t over-manage, believe in self-organisation! Who’s afraid of the EURO? The business is! Why not let them ease their worries by playing it all through themselves? Over and over and over again. Lesson learned: those afraid the most, test the best.

    What happened, how we did it, and what went well and not so well are the topics covered in my talk, flavoured with examples, tips & tricks for the Test Manager of the next “Do or die!” project.

  • FEAR The Psychology In The Pursuit of Quality

    The test profession has grown quite mature and most test professionals know about test methods and techniques. Still, the stakeholders often don’t see the results they expected. Project after project, I discovered that this problem in the pursuit of quality has to do with psychology.

    Testers and developers often do things for which there seems to be no rational explanation. They act out of FEAR, which stands for Fantasy Experienced As Reality. That is to say: for various reasons they cannot (or will not) understand the actual situation and thus act differently than would be good for the project. Instead of extending their technical skills, effort should be put in upgrading their communication skills.

    Interaction between testers, developers and clients is a sensitive area; all should understand that, instead of blaming each other, good cooperation is beneficial to the overall quality of the software product, their organisation(s) and even their personal well-being. I will show how to effectively increase this mutual understanding.

    Test Process Improvement consultants often seem to forget that badly motivated people will jeopardize even the best process. You will learn how to pay attention to people issues when doing TPI, and I will show you how to avoid reaching the point of disillusionment.

    And of course, I will show and explain the “graph of testers happiness”!

  • Raising the Quality Bar In Agile Teams By Means of Early Test Automation

    When testing in agile projects, testers are required to deliver the best service possible to the developers. To deliver this much-needed service, the team needs already “done” features to keep working on, needs to prevent known defects from resurfacing, and needs to keep builds from crashing.

    I will present successfully applied methods and techniques used in agile software development projects. The focus of this approach is to let test automation become that needed “early” service to both developers and manual testers. By following these methods and techniques, testers will be able to impact quality positively by contributing to test automation and test improvement through better design of test cases for automation. For high-quality products, it is fundamental that developers are able to maintain that high quality as new functionality is added. For this to succeed, test automation needs to be pushed back as close as possible to the developers.

    In order to keep up with the pace of the developers and to avoid the “Death by Maintenance” syndrome, a proper approach to “early” test automation is required. I will introduce examples of a proper approach as well as some handy test techniques for easing the scripting of tests.

    The presentation is about how to incorporate successful test automation into an agile project. It is about the processes used daily by developers, as well as incorporating our tests into the build processes and hence improving the quality of our products. I will also emphasize the importance of pragmatic cooperation with the developers and their tool smiths.

  • The Pursuit of Quality – Chasing Tornadoes or Just Hot Air

    Rain is great for farmers and their crops, but terrible for tourists. Wind is essential for sailors and windmills but bad for the rest of us. Quality, like weather, is good or bad depending on who you are. Just like beauty, comfort, facility, flavour, intuitiveness, excitement and risk, quality is a concept that most people understand, but few can explain. It’s worse: quality is an all-encompassing, collective term for these and many other difficult concepts.

    Quality is not an attribute of a system – it is a relationship between systems and stakeholders, and the model of Quality that prevails has more to do with the stakeholders, who take different views, than with the system itself. Measurable quality attributes make techies feel good, but they don’t help stakeholders if they can’t be related to experience. If statistics don’t inform the stakeholders’ vision or model of quality, we think we do a good job; they think we waste their time and money. Whether documented or not, testers need and use models to identify what is important and what to test. A control flow graph has meaning (and value) to a programmer but not to a user. An equivalence partition has meaning to users but not to the CEO. Control flow graphs and equivalence partitions are models with value in some, but never all, contexts.

    If we want to help stakeholders make better-informed decisions then we need test models that do more than identify tests. We need models that take account of the stakeholders’ perspective and have meaning in the context of their decision-making. If we measure quality using technical models (quality attributes, test techniques) we delude both our stakeholders and ourselves into thinking we are in control of Quality. We’re not.

    In this talk, Paul uses famous, funny and tragic examples of system failures to illustrate ways in which test models (and therefore testing) failed. He argues strongly that the pursuit of Quality requires testers to have better test models, and to learn how to create them, fast.

  • Acceptance Testing At Its Best

    End users, and more precisely end users involved in acceptance testing, decide whether a new application or system will go live or not. Therefore it is very important that they are in the same pursuit of quality as the rest of the project. End users are not dedicated testers, although sometimes we expect them to be. Just by looking at their available time for testing, we already know they are not. The fact that they are not trained as testers doesn’t make it easier.

    But are we really looking for dedicated testers here?

    During this presentation, Erik will explain how you can involve end users in such a way that we optimise their added value during their testing activities. An error often made in projects is that end users are only involved during test execution. It’s by having them participate in the test process at regular, well-selected moments that we can get the best out of acceptance testing.

    By means of a case study, Erik points out these moments. To start with, the acceptance testers need to know the goal of their testing activities. Knowing that, the acceptance testers are already involved at the end of the analysis phase to help write and prioritise high-level test scenarios, together with setting up the entry criteria for starting the acceptance test phase. Subsequently, the acceptance testers get regular demos of the software already delivered. These demos provide valuable information, both for the project team and for the end users.

    And finally, after having assessed the test readiness of the system through system testing, the end users will execute their test cases closely monitored by the test coordinator. While executing the tests, it is up to the test coordinator to make sure the end users are always updated on the defects.

    The presentation will provide the audience with practical advice, examples and templates on how to set up their acceptance testing in a flexible way without drowning in administrative tasks.

  • Finding Quality – Having a Work-Life Balance

    I will share my story – a real-life experience – following previous talks at EuroSTAR about Test Management and stress. The reason I want to tell my story is that experiences like this can hopefully help fellow Test Managers look at their own lives and consider whether things are getting “out of hand”. If I can influence just one person to take his or her life up for revision – before he or she ends up in the same situation as I did – I have reached my goal.

    From my own experience, I will share with the audience the tools that helped me balance my work with my personal life and helped me gain some perspective on all the things that I had gotten myself involved in (Life wheel, Disney model, etc.). I will also touch on some of the warning signs that I did not pay much attention to before I was ill, and that I now try to pay “just enough” attention to without them limiting my work or my life. I will tell how expectations – also (mainly) from myself – guided my choices and how I still need to be aware of what I engage myself in.

    I was (am) striving for (high) quality in both my professional and personal life, wanting to do well at work and at home. But one year ago I was suddenly not able to do anything. For 2½ months I was away from work and it took an additional 4 months of part-time work before I was back in “full shape”, working full time. I am the mother of two pre-teens and have a career as a Test Manager consultant – how do I make this work? How did I learn to say “no” to all the exciting things that are out there?

  • Agile on Huge Banking Mainframe Legacy Systems

    In this presentation I will go through my experiences of going agile, when working with a huge mainframe legacy system where most of the development is done using PL/1 or Cobol.

    I will present our main challenges, including poor testability, poor maintainability and limited test automation tool support, and how we dealt with them. I will show how an agile mindset, good design customs and innovative test tool usage helped us accomplish our goals. I will demonstrate our automated unit test setup and present our system test cooperation with an offshore test centre in Bangalore. Finally, I will discuss whether we managed to become agile, or whether we just made a huge improvement in our development process and in our test process in particular.

  • Pursuing Quality & Why You Will Never Catch It

    Pursuing quality is a lofty goal, and a good theme for this conference. But just because you are bent on pursuing it, doesn’t mean that you will necessarily achieve it, for a number of reasons:

    – quality of software (at any realistic scale) is not achievable by any one person, be they a developer, tester or manager
    – testing cannot instill quality into a software product; it can only indicate or measure what quality is already there
    – because testing is infinite, it can never be finished, so how can you know about the quality of testing?

    Does this mean that we should abandon the pursuit of quality? What should we pursue instead? Mediocrity? Rubbish? Nothing at all? It can be worthwhile to go after something even if you know that the ultimate goal is unachievable, because you may well achieve more than you would have done otherwise. But how do you know that you are moving towards quality?

    Here are some things that don’t work:

    – the “lone tester crying in the wilderness” – one tester bemoaning the lack of quality or desire for quality by everyone else
    – an antagonistic approach (“I’m better than you, I’m on the side of quality! – and I can prove that you’re not!”)
    – vagueness – yes, we want quality, but I can’t tell exactly what it is, but I think I’ll know it when I see it. (And by the way, this isn’t it yet)

    Here are some things that do work:

    – cooperation with other people – together we can achieve more than any one person alone
    – communication – a necessity in working with other people, but there are good and bad ways of communicating
    – making quality goals specific enough to measure whether they have been achieved or not
    – taking small steps in the right direction

  • Practical Approaches to Motivating Testers

    Why is the latest thinking in motivation so different from the ‘carrot and stick’ approaches that have been used as long as we can remember? What makes reward-driven approaches with incentivised bonuses reduce productivity? At this workshop you will gain a practical introduction to the latest motivation theories and see how they can be applied to testers.

    But do you trust new theories with one of the most important aspects of the job? Not only will this workshop contrast the new theories with the tried and tested approaches that have been around since the first bug was found, it will also present the results of a multi-national tester motivation survey designed to answer just these questions. We will see how actual test practitioners respond to various situations, consider the practices testers say work best for them, and see whether the testing discipline is ready to adopt 21st century motivation practices.

  • Growing to a Next Level Test Organisation

    Many organisations start improving their testing by implementing some kind of line organisation for testing (test expertise center, test service center), hereafter called TEC. Although a good starting point for improvements, in practice the TEC is often not much more than a resource pool of testers, possibly supplying certain templates or giving advice to projects.

    The next maturity level for a TEC is to grow into a test factory, responsible for delivering pre-agreed test results. From experiences gathered mostly at a large railroad infrastructure organisation, this presentation shows the path to this next level of test maturity and responsibility. However, this is not a straight path, but one with ups and downs and many curves, and getting there isn’t easy. It requires change in organisational processes but, more difficult, also in the way people work, their behavior and their attitude.

    In my practice, I follow the principles of the Basic Change Method (from Dutch management guru Ben Tiggelaar). BCM is a combination of the most effective insights from cognitive and behavioral science and focuses on making people change their common behavior by management of both behavior intentions and change situations. Usually change management is mainly focused on end results. But the underestimated factor between change plans and desired results is behavior.

    Issues that will be discussed are:
    • using the TEC as a lever for test improvement
    • envisioning the roadmap
    • formulating improvement actions
    • (management) commitment
    • organising the improvement (team)
    • planning the change
    • implementing the improvements
    • changing behavior
    • measuring results.

  • Striving for Quality In One IT Landscape

    In 2010 the Danish PBS and the Norwegian BBS companies merged into Nets, becoming the leading provider of solutions for payments, cards and information within Northern Europe. Each company had its test process defined, established, proven and operational. Denmark worked with a process called OPUS, based on the waterfall model – classic, thoroughly defined and solid – whilst Norway defined and used their U04 process built on scrum principles – effective, result-oriented and consolidated within the company. For a while both processes cohabited in happiness and harmony; developers, system architects, testers and users each worked comfortably with the existing approach on their side of the sea.

    One day management communicated that Nets would have one single IT landscape – ergo, one test process. At this point the saga of negotiating scrum through the laborious processes required in a waterfall-based organisation began, whilst negotiating waterfall into scrum was out of the question – not even a single hit on Google. Working towards a common definition of test in a scrum approach and a waterfall model, without getting stuck between departmental snags, was an arduous task and demanded an agile undertaking – my apologies for the epigram.

    The test team’s first priority is striving for quality, whichever path is taken. So this story does not finish with the test teams experiencing conflicts, subjective or biased management decisions, and somehow making the two processes fit together. The main focus was directed towards quality fully integrated into the project lifecycle, no matter the approach. At this point testers, business, process people, IT architects and developers had worked in partnership to achieve the “One IT landscape” process and ensure quality in every phase. Squeezing two antagonistic approaches into one single process, and collaborating to achieve a common project lifecycle, will set the basis for this presentation and further discussion.

  • Automated Reliability Testing via Hardware Interfaces

    The case study described in this presentation has taken place at a medical equipment manufacturer. The product developed was a medical x-ray device used during surgery operations. The system generates x-rays (called exposure) and a detector creates images of the patient based on the detected x-ray beams (called image acquisition). The image pipeline is real-time with several images per second, so the surgeon can e.g. see exactly where he is cutting the patient.

    The presentation describes the approach that has been taken to develop an automatic testing framework in order to execute reliability test cases and identify reliability issues. To achieve the control of the system under test, the existing hardware interfaces (physical buttons of the different keyboards, handswitches and footswitches) were used to inject the system with actions (with the use of LabVIEW). This has been done to minimize the so-called probe effect.

    The expected results of the test cases have been automatically retrieved from the log files generated by the system. This way the test framework could react on system failures immediately, without wasting valuable test time on scarce test systems. The log files were used to extract information about the performed actions and failures in order to measure the MTBF (Mean Time Between Failures) of different critical system functions (like start-up of the system, and image acquisition). The Crow-AMSAA model for reliability measurements has been chosen to report reliability metrics to the organization. A Return-On-Investment calculation has been performed to get buy-in from senior management who provided additional funding to further develop the testing framework, and to apply the same ideas to different products and projects.

    The presentation explains the points which were crucial for the success of this approach to automated reliability testing and briefly explains future plans and extensions (e.g. operational profiles).
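    The log-driven MTBF measurement described above can be sketched in a few lines. This is a hypothetical illustration only: the log format, level names and function names are assumptions, not the framework from the case study.

```python
# Hypothetical sketch: extract failure events from a system log and
# compute MTBF (total observed time / number of failures).
# Log format "<ISO timestamp> <LEVEL> <message>" is an assumption.
from datetime import datetime

def parse_events(log_lines):
    """Parse log lines into (timestamp, level) pairs, in log order."""
    events = []
    for line in log_lines:
        ts, level, _ = line.split(" ", 2)
        events.append((datetime.fromisoformat(ts), level))
    return events

def mtbf_hours(events):
    """MTBF over the observed window, in hours; None if no failures."""
    failures = sum(1 for _, level in events if level == "FAILURE")
    if failures == 0:
        return None  # no failures observed; MTBF undefined here
    span = events[-1][0] - events[0][0]
    return span.total_seconds() / 3600 / failures

log = [
    "2011-01-01T00:00:00 INFO startup",
    "2011-01-01T06:00:00 FAILURE image-acquisition",
    "2011-01-01T18:00:00 FAILURE startup",
    "2011-01-02T00:00:00 INFO shutdown",
]
print(mtbf_hours(parse_events(log)))  # 24h window, 2 failures -> 12.0
```

    A reliability-growth model such as Crow-AMSAA would then be fitted on top of per-failure data like this to report the metrics mentioned above.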

  • Where Exploration And Automation Meet: Getting The Most From Automated Functional Tests

    Agile teams invest heavily in automated functional tests. When done well, this investment is paid back with fast feedback enabling teams to release software quickly and often. By structuring tests in the right way, teams can further leverage this investment by using these tests as a platform for exploratory testing that could find issues a regression test suite won’t.

    This talk will cover the process and cost benefits of creating automated tests that can be reused to support exploratory testing. Concepts and practices will be described within the context of a mature agile delivery team. It will provide an overview of current practices for automated functional and exploratory testing within agile teams. The main focus of the talk will be to provide an overview and demonstration of techniques that will allow testers to utilise existing automated functional tests to support exploratory testing. The benefits of this approach – such as reducing the time needed for exploratory testing, increasing exploratory testing coverage, improving repeatability when reproducing bugs, and improving the robustness of your automated functional regression tests – will be reviewed.
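    One common way to structure tests for this kind of reuse can be sketched as follows. This is a hypothetical example: the App class and step names are illustrative assumptions, not any specific team’s framework.

```python
# Hypothetical sketch: automated functional tests built from small
# reusable steps, so the same steps can also drive the system to an
# interesting state as a starting point for exploratory testing.

class App:
    """Stand-in for a system under test driven via a UI/API layer."""
    def __init__(self):
        self.state = ["start"]
    def do(self, action):
        self.state.append(action)
        return self

# Reusable steps: the building blocks a regression suite would use.
def login(app, user):
    return app.do(f"login:{user}")

def open_orders(app):
    return app.do("open:orders")

# 1) Regression use: steps composed into a scripted, asserted check.
def test_orders_page_reachable():
    app = open_orders(login(App(), "alice"))
    assert app.state[-1] == "open:orders"

# 2) Exploratory use: the same steps reach the interesting state
#    quickly, then the human tester takes over and probes manually.
def explore_from_orders():
    app = open_orders(login(App(), "alice"))
    return app  # hand the live session to the tester here
```

    Because the scripted check and the exploratory session share the same steps, maintenance effort is shared and exploration starts from a known, quickly reachable state.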

  • Team Work Within Test Teams

    The intent is to share with the industry experiences of how to build an effective testing team and provide its members with a sufficient level of challenge to keep them motivated to work together. This paper does not narrate the standard testing process followed in the project, but focuses on the innovative practices followed by the testing team in order to improve testing efficiency and make the testing team work closely together to achieve the goal. Each member of the testing team is motivated to play a role that is more than just a Tester: apart from testing, the tester also works on process improvements and plays the role of business expert within the project.

    One major factor is motivation. Proper motivation can save a company time, money and resources. It is more cost-effective to keep existing resources through proper training and motivation than it is to constantly recruit and train new ones. There is an old saying: you can take a horse to water but you cannot force it to drink; it will drink only if it’s thirsty. The same can be said of people – they will do what they want to do, or what they are otherwise motivated to do, whether the aim is to excel at the workplace or on the personal front. They must be motivated or driven to it, either by themselves or through external stimulus.

    Tuckman’s model of team development – Forming, Storming, Norming, Performing – was first proposed by Bruce Tuckman in 1965, who maintained that these phases are all necessary and inevitable in order for the team to grow, to face up to challenges, to tackle problems, to find solutions, to plan work, and to deliver results.

    As the popular saying goes, ‘There is no ‘I’ in Team,’ and therefore working as a team requires individuals to be able to effectively work together and in doing so learning to put personal feelings to one side. This is certainly no easy task as everyone comes together having experienced a whole range of issues and often carrying much baggage. In order to better understand the skills required for effective teamwork within the workplace, it may be helpful to first identify which aspects are likely to result in a team being inefficient and unproductive. Effective teamwork is not something that is quick and easy to establish. Each member needs to communicate and listen properly while not trying to pull focus. It is great to be able to work well under your own steam, but for a company to function successfully, there also needs to be effective teamwork. Sometimes this happens easily, but often it takes time and effort to work well together. Everybody needs to learn to accept and understand each other and not all pull in different directions. Instead of using teamwork as an opportunity to show off or pull focus, each member needs to focus on the good of the group.

    Particularly within the testing project environment, trust is key to maintaining good working relationships between developers, BAs and other testers in the team. Trust requires honesty, and honesty is crucial to being able to work efficiently without worrying that one is going to be stabbed in the back or manipulated out of a job. Without accountability it is impossible to work efficiently as a team, in the workplace or indeed any other environment. A really crucial aspect of effective teamwork is being able to communicate well with one another. This requires the following skills: active listening, negotiation skills, non-verbal communication awareness and patience. It is also important to be aware of tone, gestures and facial expressions, as these will have an impact on how one is perceived by other members of the team.
    (E²)Q + p + P = TW
    E = Efficiency
    E = Effectiveness
    Q = Quality
    TW = Team Work
    Applying this formula will primarily allow the testing team to focus and ensure that the components are delivered with the best of quality. The main objectives of this formula are:
    • Meeting the deadline
    • Quality of deliverables
    • Self learning
    • Continuous improvement

  • Mixing Open & Commercial Tools

    Mixing open source and commercial software is a challenge we face today. The right combined solution offers advantages in flexibility, functionality, performance, and management that aren’t available when either open source or commercial technologies are used alone. But one of the major issues is that they don’t always play well together. Some of them can’t be loaded together or they fail to integrate properly.

    In this presentation we will provide a case study of successfully blending open source and commercial software for testing Java applications. The testing environment consisted of Subversion as a versioned repository for tests, JIRA for issue management, and IBM Rational Functional Tester mixed with an open framework for automated functional, regression and GUI testing. These tools provide rich, high-level Java APIs useful for integrating with each other, minimizing the integration costs. During the presentation, we will explore the costs (the open tools used have no fees; integration and training costs …), the advantages (innovation, best-of-breed products …), the risks (dedicated support, community feedback, ability to add extensions …) and the product quality achieved by our solution compared to a fully open source or fully commercial approach. From our experience, we will provide hints and tips to guide testers and managers in choosing a good mix of open and commercial tools based on budget, technology, know-how and the requested quality.

  • How to Create “Good Testers”

    During this presentation I will cover three key questions:
    1. What is a good tester?
    2. Why would you want to be one?
    3. How can you create or become one?

    Firstly, I don’t believe there is an industry wide definition of “good tester” that fits every situation; but I do believe good testers exist, and that they are very special and valuable individuals! So, I’ll share what I believe constitutes a good tester in some different situations and you can identify how this relates to your world. I’ll also give my opinion on how this relates to our obsession for certification.

    As for wanting to be a good tester, I will ask why you wouldn’t! It’s our work that provides the most valuable management information on the planet; gives confidence to stakeholders that their system is going to bring the benefits they want; and the skills we need cover the entire project life-cycle. If only we could unlock our minds from the tethers others try to constrain us with.

    Finally, I will share my experiences of how to create good testers. For each of the situations I defined earlier, I’ll expand on how your organisation can identify people to target for a career move into testing. Maybe you’ll identify how to improve your own career prospects too!

  • Keynote 5: Creating Customer Value with Agile Testing

    Roles are changing for Testers in Agile Teams. Rather than being a compliance officer who tracks conformance to a spec, I want to act as the user’s proxy during the Sprint. It is important that I discover the implications of product backlog items, explore the consequences and ensure that the working software will stand up to real world use. I want to make sure that my team is building the right product, a product that will delight our customers. I need to have the process and communication barriers between me and the other people on my team removed to allow us all to focus on maximizing customer value rather than dealing with bureaucratic overhead. Come learn about techniques that are allowing many teams worldwide to achieve these goals today.

  • Keynote 6: Quality in Use: The Beating Heart of the Customer Experience

    As testers, does our zeal sometimes cause us to miss the big picture of what the customer and the business really need? Do we focus on software quality at the expense of human and commercial factors? “Quality in use” measures human, business and societal impacts. This includes usability, accessibility, and flexibility for the customer and business, as well as commercial, human and environmental safety. For the people selling, supporting or using the products, this is the beating heart of the customer experience. How well are people supported to efficiently and effectively carry out their tasks? Is the product accessible to all the people who want to use it? Without the big picture, software will not be acceptable, will not keep our organisations in profit, and may not be legal. The benefits of designing in and testing these attributes will increase the marketplace for our products and services.

  • Architecture Testing Wrongly Ignored!

    State-of-the-art testing approaches typically include different testing levels such as reviews, unit testing, component testing, integration testing, system testing, and acceptance testing. It is also commonly accepted that unit testing is done by developers (who are responsible, at least to some extent, for checking the quality of their units) and that system testing is done by professional independent testers. But who is responsible for adequately testing the architecture, which is one of the key artifacts in developing and maintaining flexible, powerful, and sustainable products and systems? History has shown that too many project failures and troubles are caused by deficiencies in the architecture. Furthermore, what does the term architecture testing mean, and why is this term seldom used?

    To answer these questions, Peter describes what architecture testing is all about and explains a list of pragmatic practices and experiences to implement it successfully. He offers practical advice on the required tasks and activities as well as the needed involvement, contributions, and responsibilities of software architects in the area of testing – because a close cooperation between testers and architects is the key to drive and sustain a culture of prevention rather than detection across the lifecycle.

    Finally, if we claim to be in pursuit of quality then adequate architecture testing is not only a lever for success but a necessity. And this results not only in better quality but also speeds up development by facilitating change and decreasing maintenance efforts.

  • Test Process Aligned To Scrum

    Kristina L. Tangen’s presentation on Experiences from project PERFORM.
    Kristina L. Tangen, Test Manager – Steria’s Deliveries to PERFORM

  • Playing Around With Risks

    I looked at my cards. 2 Aces. The best hand possible in poker on an empty board. At this point there is no risk that I can be beaten. I decide to exploit the situation: get as much value as possible, without letting my opponents know I have such a good hand. I don’t raise. 3 cards come on the board. I wait. The 4th card comes. I still wait. The fifth and last card comes and I make my move. I put all my money on the line and, as it turns out, I get beaten by someone who has made a straight. How is this possible? I had the best hand, I evaluated the risk and still lost. The reason is obvious: the board changed 3 times, and with each extra card, my risk of losing also changed. And I did not adapt. I didn’t re-evaluate my risks and act accordingly.

    There are quite a few games that deal with risks and risk responses; poker and Monopoly are two examples. There are world championships held in these games, and there is general consensus on who the best players in the world are. Those players have game tactics. What if we could map those tactics onto Risk Based Testing? Can we improve our process based on those successful game tactics? In this presentation, I will elaborate on a few game tactics and map them onto the Risk Based Testing process. I will give concrete examples of similarities between them and demonstrate that they can be adapted to improve our test process.

  • How to Apply Lean Test Management – Ban Waste, Gain Efficiency

    Cost reductions and the quest for more efficiency are increasingly evident in today’s business world, and it follows that our testing processes will ultimately be affected. Introducing test techniques and methods for structured testing produces more consistent and predictable results.

    Introducing a risk based approach to testing makes it easier for the business to determine to what extent testing is necessary and most efficient. The resulting Go/No-Go decision process may not be sufficient for all companies, so other creative methods need to be investigated. Many management theories point to “Lean” as one of the solutions. One of the key steps in using “Lean” is identifying which steps add value to the customer and which do not. This track will give you the information to start using “Lean” within testing and, more specifically, within test management.

    The presenter will also look at Lean Six Sigma as being one of the more popular theories that introduces the concept of “Lean” in combination with obtaining higher quality products. This subject will also be explained in combination with testing and test management. This track will focus on applying Lean Six Sigma techniques to test management processes using practical examples from customer cases. The audience can take home a practical “Lean Test Management” overview which they can apply in their own companies.

    This track is especially of interest to business managers, IT managers, QA managers and test managers that are involved in improving the quality of test management processes.

  • Dashboards: How Do You Effectively Report Your Work?

    As a tester, how do you effectively report your work in a way that is most meaningful to management? One way is to provide a story that prompts conversation and questions. Managers claim that they want to know about the progress of testing–but is that what they really want? Perhaps they wouldn’t care much about the state of testing if they knew the state of the product. At the very least, they’ll want to know about both. More than anything else, managers want to know about threats to the project schedule.

    One complication is that product status is highly multi-variate. In particular, our knowledge of each product area can be in a different state. The important variables for each area include:
    – the amount of testing effort being expended currently;
    – the amount of test coverage obtained so far; and
    – information about threats to the schedule.

    However, the good news is that one of the most basic forms of data display–a table–can easily show this information in a way that is clear, concise, and compelling.

    In this presentation, Michael Bolton takes you on a detailed tour of a straightforward, easily-maintained testing dashboard that is designed to keep the entire project team informed of product status and testing activity, to foster productive conversation, and to help prompt focused questions from management.
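    The per-area table described above can be sketched in a few lines of code. The sketch below is not Bolton’s actual dashboard; the product areas, effort levels, coverage scale and threat comments are invented purely for illustration of the three variables listed (current effort, coverage so far, schedule threats).

    ```python
    def render_dashboard(rows):
        """Format (area, effort, coverage, comment) tuples as an aligned plain-text table."""
        header = ("Product Area", "Test Effort", "Coverage", "Schedule Threats")
        table = [header] + list(rows)
        # Compute each column's width from the widest cell in that column.
        widths = [max(len(str(row[i])) for row in table) for i in range(4)]
        return "\n".join(
            "  ".join(str(cell).ljust(width) for cell, width in zip(row, widths))
            for row in table
        )

    rows = [
        ("Login",     "high",   "2/3", "none known"),
        ("Reporting", "paused", "1/3", "blocked: test data missing"),
        ("Billing",   "low",    "3/3", "none known"),
    ]
    print(render_dashboard(rows))
    ```

    Even this minimal rendering shows why a table works: a manager scanning it sees at a glance which areas are being worked on, how far testing has progressed, and where the schedule threats are.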

  • Using Functional Test Automation To Prevent Defects From Escaping The Development Stage

    Making mistakes is a part of being human, so the likelihood of being able to write a piece of software without any defects (in any of the stages) is quite low. Instead of aiming for no defects at all, perhaps we should place more importance on finding defects at the right time. The right time to find defects can generally be answered with “as early as possible”. In our experience, this means not letting any defects escape the development cycle. For any given development process, bringing user-centred tests closer to the development not only gives us a good chance of defect “prevention”, but also gives us the time to ensure that the development is going in the direction the customer really wants it to. If we talk in terms of the V model, this means bringing the integration, system and acceptance testing into the development phase. If we look at agile processes, then it means parallel test development or ATDD. As always though, we need to get the maximum benefit (most requirements checked, most defects caught) from minimum resources.

    This talk presents strategies and best practices from a test process that can be used for both traditional and agile projects to ensure that the end-user and/or customer perspective is the focus throughout the development phase. The test process is exemplified based on the lifecycle of a new feature.

    For the feature, we look at:
    – When and how to set acceptance criteria and test specifications
    – Writing black box tests that put the application’s business logic through its paces
    – Using continuous integration to gain frequent quality information
    – Reacting to test results to stop defects escaping

    The process is then considered in terms of its advantages and disadvantages. Alex evaluates what information the process provides and the cost required to gain it alongside the information that the process doesn’t provide and how this can be attained with minimal extra effort.

  • What The Top 10 Most Disruptive Technology Trends Mean For QA And Testing

    New and emerging technologies such as mobile apps, tablets, 4G, cloud computing, and HTML5 are making big headlines and impacting software engineering and testing organizations in various industries. These technological innovations are allowing sensitive data to be accessed through the web and on mobile devices more than ever before.

    With so much critical data flowing to smart phones and tablets, there is immense pressure to ensure that apps – those that a company produces for its customers and supports for employees, vendors or partners – are reliable, scalable, private and secure. And this evolution of technologies and user behavior dramatically impacts those who are responsible for developing and testing applications.

    The ways web and mobile apps are designed, developed and delivered are changing dramatically, and therefore the ways these apps are being tested are being taxed and stretched to the breaking point. Using real-world examples, Doron Reuveni identifies the top ten technology trends that have transformed the software industry and outlines what they mean for the QA and testing community today.

  • About Testers and Garbage Men

    Coming from EUROSTAR 2010, stuck at Copenhagen Airport, a colleague and I started to philosophize about a topic for the EUROSTAR 2011 Conference. At least that way waiting for our plane to take off wouldn’t be a waste of time. Now, if someone mentions to me the word “waste” or “garbage”, I automatically think of an old movie of Laurel and Hardy, “A chump at Oxford”, where they play the role of two garbage men.

    That’s how the idea grew for this presentation. Defects, unnecessary documents, unwanted processes, meetings that lead nowhere, waiting 14 days to receive a test environment, no focus on risks and last but not least, misuse of talent. That’s a lot of waste, no? Waste of time, effort, money and people.

    What does this have to do with Laurel and Hardy? Ever seen their movies? They are the best example of how to reduce waste. Or … was it to create waste? That’s why Laurel and Hardy are the theme for the presentation.

    In the first part of the presentation we use movie fragments to explain what waste is and that waste doesn’t JUST exist, but is created by people. Why? Because they are “people”, human beings who tend to act in a pre-conditioned way. In the second part of the presentation we discuss how to classify all kinds of waste and we handle the fact that “waste creates waste”. In the third and last part of the presentation we discuss an approach to stimulate people to find and reduce waste. We’ll discuss the relation between a traditional way of testing and an agile way of testing to conclude: “One of the main differences between Agile and other testing methodologies is the amount of waste”. Or: “By reducing waste, the reason to work in an ‘Agile’ way is gone!” One last remark: the ghosts of Stan and Ollie will give the presentation and they’ll make it memorable.

  • ‘Top Challenges we face in IT Today to Hinder Our Pursuit of Quality’ Lloyd Roden

    Some people thrive on challenges, while others struggle with how to deal with them. Handled well, challenges can make us stronger in our passion, drive, and determination. Lloyd Roden describes the challenges we face today in software testing and how we can respond in a positive, constructive manner.
    One of the challenges Lloyd often sees is identifying and eliminating metrics that lie. While we (hopefully) do not set out to deceive, we must endeavour to employ metrics that have significance, integrity, and operational value. Another challenge test leaders face is providing estimates that have clarity, accuracy, and meaning. Often we omit a vital ingredient when developing test estimates.
    A third challenge is convincing test managers to actually test regularly to attain credibility and respect with the team they are leading. A further challenge is to see why the use of the term “best practice” can be so damaging.

  • ‘Acceptance Test Driven Development using Robot Framework’ Pekka Klarch & Janne Harkonen

    Acceptance test driven development (ATDD) is an important agile practice merging requirement gathering with acceptance testing. In its core are concrete examples, created together with the team, that provide collaborative understanding and, as automated acceptance tests, make sure that the features are implemented correctly.

    Robot Framework is an open source test automation framework suitable for ATDD and acceptance testing in general. It has a flexible test data syntax that non-programmers can also use, and its simple test library API makes extending the framework easy. The rich ecosystem around the framework contains, for example, a separate test data editor, several generic test libraries (web testing, Swing, databases, Windows GUIs, SSH, …) and plugins for common build and continuous integration tools.

    This presentation gives an introduction both to ATDD and Robot Framework. It contains different demonstrations and all the material will be freely available after the presentation.
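    The “simple test library API” mentioned above really is this simple: a Robot Framework test library can be a plain Python class, and its public methods become keywords (for example, `push_number` is matched by the keyword `Push Number`). The library below is a hypothetical sketch, not one of the framework’s bundled libraries; the class and keyword names are invented for illustration.

    ```python
    class CalculatorLibrary:
        """A minimal Robot Framework test library: each public method
        becomes a keyword usable from the plain-text test data."""

        def __init__(self):
            self._stack = []

        def push_number(self, number):
            # Robot passes arguments as strings, so convert explicitly.
            self._stack.append(float(number))

        def add_numbers(self):
            b, a = self._stack.pop(), self._stack.pop()
            self._stack.append(a + b)

        def result_should_be(self, expected):
            # Raising an exception is how a keyword fails the test.
            result = self._stack[-1]
            if result != float(expected):
                raise AssertionError(f"Expected {expected}, got {result}")
    ```

    In a Robot Framework test data file, an acceptance test built on this library would read almost like the example it automates: `Push Number  2`, `Push Number  3`, `Add Numbers`, `Result Should Be  5` — which is what makes the format approachable for non-programmers collaborating on ATDD examples.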

  • ‘How to Test Mobile Apps’ Jeroen Mengerink

    “There’s an app for everything”. Development is moving quickly and testing needs to keep up: in pursuit of quality. But how do you test an app? Not only do we need to comply with customer requirements, we also have to deal with new aspects like “store acceptance”. Stores apply an extensive, growing list of acceptance criteria, which needs to be incorporated in the testing process.

    Test complexity increases with the growing number of mobile devices that need to be supported. How to cope with that? To enable early technical testing, simulated test environments are made available by the mobile manufacturers, e.g. for portability testing.

    Apps are “cloudy” solutions for modern functional problems. This requires dedicated end-to-end testing! Functional and technical, but also non-functional.
    Testing has to redefine its position in the App development process.

  • ‘Catching a Highspeed Train’ Nathalie Rooseboom de Vries van Delft

    In 2007 I took my first steps in a program for the High Speed Train called ‘Fyra’ in the Netherlands. I was hired for a specific project within the program: End-to-End testing of the vending, control and financial systems regarding the Fyra. This is an experience talk about this end-to-end test. What will I talk about:

    I used a generic test plan and generic approach. The actual execution was done (mostly) on a pragmatic basis. I used knowledge of both the ‘traditionalist school’ and the ‘context driven school’. I adapted ‘standard stuff’ to cope with the complexity and agility (not meant as ‘Agile’ here) within the project. I also changed components when I learned something new, e.g. exploratory techniques from EuroSTAR ’09, when I thought it would do a better job. I had new, existing and changing components in my ‘chain’. I also had changing suppliers and processes, new processes and even, at one time, a changing (upgrading) infrastructure. I didn’t have the luxury of saying: “Hey, tough luck, I will have to postpone my testing until you’re done”. It took a lot of adaptability, on-the-spot risk analysis and re-planning. Using timed components in a generic plan can help immensely with rescheduling.

    A lot of activities were actually non-testing activities. Expectation management, communication and politics swallowed up a large amount of the available time. The execution of the tests only used up a few percent of the total time.
    The bulk of the time was spent getting the environment ready, lining up the people involved, instructing the people involved, designing test cases and writing stories, and a lot of time was used to evaluate the results and follow up. Real life testing with real life customers, using a Closed User Group, cameras and a survey agency. Not only did I get a lot of info on the actual systems and their usage, but management got to see ‘their’ clients. This resulted in some crucial changes, but also in the awareness that ideas might not always be in the customer’s interest.

    And…No experience track without any anecdotes right? I have some very cool ones!

    – Proving the newspaper wrong;
    – A white swan causing a black swan situation;
    – the Sliding Momentum; and
    – the PornTrain.

    As an attendee of this track you get to hear about hands-on experiences in an end-to-end test and how I applied the theory I had learned in practice. Tales from the trenches AND the books.

  • ‘An Evolution to Specification By Example’ Adam Knight

    For the last four years my colleagues at RainStor and I have been evolving a process for testing a structured data archiving system in an Agile development environment. In this talk I will discuss the evolution of a team from a rudimentary Agile implementation on an unreleased product to our current process, which uses the fundamental elements of Specification By Example to successfully deliver software functionality across 30 different platform/backend configurations to a series of high profile and demanding customers. Last year our company was used as a case study for successful implementation in Gojko Adzic’s book on Specification By Example.

    My report will discuss the lessons learned during the early implementation and the challenges faced in moving away from a compressed waterfall approach. Through a process of incremental change we have identified and tackled the fundamental issues that undermined the development effort as a team. I’ll describe some of the mistakes made in attempting to implement a more formal process of requirements documentation into an Agile implementation and the benefits we uncovered on moving to a more flexible user story based approach. I’ll also discuss some of the issues around trying to implement user stories in a server system with no GUI and very technical and performance based requirements.

    Raising the importance of quality and the status of testing, both within the development team and the organisation as a whole, has allowed the challenges facing the team to be recognised and respected. The result has been a more collaborative approach between developers and testers, both through “collaborative specification” of user stories and in tackling the problems that impact the delivery of value to the customers. I also plan to discuss how we’ve expanded from documenting acceptance criteria for each user story to documenting Criteria, Assumptions and Risks for each feature and how, rather than taking a ‘Done/Not Done’ approach, we assess each of these categories to measure the confidence we have in each new feature being implemented.

    Having the test team involved and influential through the entire development process has also allowed us to implement a number of features to make the product more testable. I will discuss the benefits of having development understand and prioritise testability issues, with some illustrative examples.

    I will discuss the challenges and benefits of developing our own metadata driven test harnesses as opposed to an off the shelf solution. I’ll detail how having control over these harnesses has allowed us to work towards a self documenting test system using realistic customer examples as “Automated Specifications” of the RainStor system allowing us to explain current behaviour to Product Management in terms of well understood customer scenarios.

  • ‘Operational Testing: Walking a Mile in the User’s Boots’ Gitte Ottosen

    When developing software, we base our work on requirements – be they line items, use cases or scenarios. But there is often a long way from the requirements to what the user really wants, and the user is often far away from where the requirements specification is being defined. When testing is based on the requirements and focused solely on the features being implemented, we tend to forget an important perspective – whether the system is fit for its purpose. We need to take testing one step further – to walk a mile in the user’s boots – to understand, and to test, based on intensive domain knowledge.

    In Systematic we especially discovered this need when developing our product, a command and control system for the army, and we therefore decided to take testing to the trenches – to implement operational testing of the system. The company already used domain advisors extensively when designing the system, in order to ensure the users’ voices were heard, but now we wanted to take it yet another step towards REAL use and REAL users, designing and executing large operational scenarios with REAL operational users. In this presentation you will be introduced to the process we went through and the results this led to.

    In the presentation I will describe the process we have gone through in order to implement the concept, describing both successes and challenges. I will try to ensure that I also include reactions from the users that are involved as well as feedback from development.

  • ‘Revolution, Evolution & Lean – A Test Process Improvement Diary’ Tapani Aaltio

    The move from a waterfall life cycle to an agile one is not straightforward for people with long experience in software development and testing. Their ways of working are ingrained and hard to change. In this kind of situation, taking small steps in the right direction simply takes too long. You need to throw away your old process (=Revolution) and then improve the new process constantly (=Evolution). Usually, it is not possible to have a full scale revolution at once. That makes the evolution even more important, to get rid of the bad practices from the time before the revolution. An effective approach to speed up the evolution is to take the seven wastes of lean management as a guideline – to detect and remove relics not blown away by the revolution.

    This presentation is my diary of test process improvement, spanning from EuroSTAR 2010 in Copenhagen and the Workshop on Lean Test Management (by Bob van de Burgt & Iris Pinkster), to the 2011 conference in Manchester. It highlights in interesting and surprising ways the evolution of three agile teams in a major Finnish company.

    Waste is often produced by applying good practices in the wrong place, e.g. because of a weak test strategy or by involving too many people in defect management. Waste can also be caused by very practical things like bad seating arrangements or people not attending meetings. However, the biggest waste is optimizing the wrong things – e.g. the detection of defects instead of their prevention. These are examples of the things that have come up so far – but there will be more during the year. I will also introduce some practical and light ways to measure the consequences of the waste.
