The View From 4 Weeks Out
The run-up to a major code implementation can be a tense and anxious time. In this, the first of five weekly blog posts, Peter Morgan sets the project context and describes the project's activities as the date approaches. As of four weeks out, the implementation date looks more likely than not – though sometimes the overall status changes three times in the same day!
The implementation date is now about four weeks away, so this article is an indication of progress as the project team aim for, and come in to land on, that date. It is slightly outside of real time: by the time you read this, it will be about two weeks out – but it is NOT written retrospectively. I simply wanted to be sure that the date had not moved significantly to the right, which would have meant these weekly missives continuing for 18 weeks… And as this still appears, you obviously know that there have been no significant schedule-slipping difficulties since.
Before continuing, I need to set the implementation in its project context. The mid-May delivery is the latest in a series that enables data to be loaded into, and retrieved from, an Oracle database, labelled by the business 'The Finance Datawarehouse'. The latest delivery will give the business views of data not previously available, and will tie up the six new transaction data sets with previously loaded data, all in the cause of enabling better decisions to be made. Previous data feeds are not being changed, at least in functional terms, but care has to be taken that the daily and monthly loading already in place does not cause difficulties for the new monthly and quarterly loads. The problem prevention has to work both ways – the new feeds must not interrupt the existing data loads either.
Data is loaded into 'staging' tables within the Oracle database, using Informatica as the ETL tool. The database management platform then loads this information into the datawarehouse, with referential integrity ensured as part of the management tool. Information is then extracted and built into reports, with QlikView as the chosen solution. The business readily agreed that whilst IT were responsible for loading data and enabling reports to be produced, it was very clearly within the business's remit to check the data available in those reports. Business Objects is used to produce fixed reports summarising the data that has been loaded, and JobTrac is the scheduling tool – as you can see, there is a mixture of technologies.
The strategy for implementations is to have three tranches of code promotions, of which only the last actually 'does' anything. The first two are preparation for the final implementation, de-risking that last code promotion. This implementation strategy was devised by Brett Gonzales and is recommended by many. Four weeks out came the first of the preparatory implementations, adding new database records to the Oracle database when the data model was promoted. Other elements included the creation of the necessary folders on the Informatica server, and minor amendments to existing job schedules to prevent database table locking becoming an issue.
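The schedule amendments are worth a small illustration. The aim is simply that two jobs writing to the same table never have overlapping run windows. A minimal sketch, with entirely made-up job names, tables, and times (the real schedules live in JobTrac):

```python
from datetime import time

# Illustrative only: check that no two jobs hit the same table in
# overlapping windows, which is when table locking becomes an issue.
jobs = [
    {"name": "daily_load",     "table": "stg_transactions", "start": time(1, 0), "end": time(2, 30)},
    {"name": "quarterly_load", "table": "stg_transactions", "start": time(3, 0), "end": time(4, 0)},
]

def overlaps(a, b):
    """True when two jobs write the same table in overlapping run windows."""
    return a["table"] == b["table"] and a["start"] < b["end"] and b["start"] < a["end"]

clashes = [(a["name"], b["name"])
           for i, a in enumerate(jobs) for b in jobs[i + 1:] if overlaps(a, b)]
print(clashes)  # [] -- the staggered windows do not clash
```

Staggering the quarterly load to start after the daily window closes is exactly the kind of minor amendment the preparatory release carried.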
Four weeks out, System Testing should have been completed and UAT started. There was some over-run of System Test activities, but sufficient code was promoted to the UAT environment for UAT to begin. The first week of UAT was mainly data set-up and checking that code had been correctly implemented. As the text books stipulate, UAT is a controlled environment, and all code promotions have to be taken from the Configuration Management tool. This can cause delays, but at least code promotions are then undertaken with the same rigour that surrounds similar activities in the PROD environment. Early activities in the UAT environment tested the first code-set for production, checking that the correct database tables existed and that the amendments to parameter tables (insertions only) had been performed correctly. Previously implemented data feeds were also confirmed to still perform correctly – selected data sets from the PROD environment were loaded into the UAT environment, and all data records loaded successfully. Regression testing – passed!
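Those early UAT checks amount to a smoke test: do the promoted tables exist, and did the parameter insertions land? A minimal sketch, again with hypothetical names and an in-memory database standing in for Oracle:

```python
import sqlite3

# Hypothetical smoke check mirroring the early UAT activities: confirm the
# promoted tables exist and that the parameter rows were inserted
# (insertions only -- existing rows must be untouched).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE param_codes (code TEXT PRIMARY KEY, descr TEXT);
    INSERT INTO param_codes VALUES ('Q1', 'Quarter one'), ('Q2', 'Quarter two');
    CREATE TABLE stg_quarterly_feed (feed_id INTEGER, code TEXT);
""")

EXPECTED_TABLES = {"param_codes", "stg_quarterly_feed"}
EXPECTED_NEW_CODES = {"Q1", "Q2"}

# 1. Do all the tables promoted with the data model actually exist?
present = {row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")}
missing_tables = EXPECTED_TABLES - present

# 2. Did the parameter insertions land?
codes = {row[0] for row in conn.execute("SELECT code FROM param_codes")}
missing_codes = EXPECTED_NEW_CODES - codes

print(sorted(missing_tables), sorted(missing_codes))  # [] [] when all is well
```

An empty result on both checks is what "code correctly implemented" looked like at the end of that first UAT week.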
At the end of the week, more defects had been raised (and some more closed too!). On the final day of the week, the number of System Test cases outstanding was 6 (out of 239), all of which were carried forward to UAT, as the test conditions would be hit in that phase of testing. 33 of the UAT test lines had been passed (out of a total of 105). There were two minor problems with the first preparatory implementation into PROD. These did not stop activities on the live database continuing, and solutions for both will be targeted for delivery in the second preparatory release. The final day of the week saw 4 show-stopper defects raised, 1 in System Test and 3 in UAT. All of these were fixed, shown to be fixed, and closed before the end of that day (without working far into the night, I am pleased to say). They arose for a variety of reasons, but of the UAT defects, only one required a new code promotion via the Configuration Management tool – a blessing, as using the 'proper' route took 5½ hours to promote code on this particular day.
The good news is that the implementation in mid-May looks more and more probable as the days go by, and the business users are even beginning to smile and rub their hands in anticipation.