Introduction – quality is too important to be left to the quality fairies!

[Image: screen showing live build and XenRT test results for XenServer]

Quality is a critical property of a software product. And like most properties of software, it doesn’t just spontaneously happen but must be actively engineered in.

Have you ever experienced senior management and business-level attention on quality during the early phases of release planning? Hopefully yes, but in my experience it is more usual for all the business, executive and management attention to focus on release dates and features. Quality is silently traded out in favour of one or both of these. And later, after release, people can be disappointed and upset when quality is poor.

But if this happens, then as quality professionals it is our fault!

If she or he does nothing else, the quality professional must ensure that quality is given an equal place at the table when business, release and project decisions are being taken. It is OK for quality to be traded out, but only if it is done consciously, with full awareness of the implications.

In the XenServer group we strive to treat quality as a first-class citizen of a release project’s scope. This blog describes how we do this, with examples and real-life scenarios from the evolution of XenServer over the last few years.

Quality matters because, besides being a standalone product, Citrix XenServer is Citrix’s flagship platform for XenApp, XenDesktop and CloudPlatform. It is also a key component of Citrix NetScaler SDX.

Quality Goals

It is normal for a release to have goals – hopefully these will be made explicit (e.g. documented, reviewed and signed off). These goals are usually couched in terms of release dates, product features, market penetration, revenue targets etc. It is rarer for releases to also have explicit quality goals.

Defining quality goals allows stakeholders to debate and ultimately agree on the level of quality the engineering team should shoot for. This in turn allows for informed decisions on the relative investment in quality, not to mention the inevitable trade-offs against time and features.

The XenServer group defines quality goals in terms of the number of Customer Raised Unique Defects (CRUDs) on the eventual release. As this is a lagging indicator of quality, we supplement it with targets for limiting the number of known defects still open when we ship. We sometimes set additional, more specific goals, e.g. targeting improvements to specific features or to specific properties of the product such as reliability, performance or usability.
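To make this concrete, here is a minimal sketch, in Python, of how a CRUD count might be checked against an agreed goal, assuming a simple defect-tracker export. The record fields and the goal value are hypothetical and invented for illustration; they are not our actual tooling or data.

```python
# Hypothetical defect records exported from a bug tracker; field names
# and values are illustrative only.
defects = [
    {"id": "XS-101", "source": "customer", "release": "6.2", "duplicate_of": None},
    {"id": "XS-102", "source": "internal", "release": "6.2", "duplicate_of": None},
    {"id": "XS-103", "source": "customer", "release": "6.2", "duplicate_of": "XS-101"},
    {"id": "XS-104", "source": "customer", "release": "6.2", "duplicate_of": None},
]

def crud_count(defects, release):
    """Count Customer Raised Unique Defects (CRUDs) for a release:
    customer-raised defects that are not duplicates of another defect."""
    return sum(
        1 for d in defects
        if d["release"] == release
        and d["source"] == "customer"
        and d["duplicate_of"] is None
    )

CRUD_GOAL = 2  # illustrative target agreed as part of the quality goals

count = crud_count(defects, "6.2")
print(f"CRUDs on 6.2: {count} (goal: at most {CRUD_GOAL})")
print("Goal met" if count <= CRUD_GOAL else "Goal missed")
```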

Quality Plans

Once we have agreed the level of quality we are shooting for, we need to convince ourselves we can achieve it. We do this by appointing a Quality Manager, and creating and signing off a Quality Plan.

A key principle is that someone be responsible for the quality of a release, i.e. that it meets its quality goals. We appoint someone, typically a senior member of the QA team, to the temporary role of “Quality Manager” for the release. The Quality Manager has to sign off on the quality goals, and will only do so if they are satisfied that their Quality Plan is resourced, feasible and generally adequate to meet those goals. If the Quality Manager or the various reviewers of the Quality Plan do not believe this to be the case, then either the plan has to be improved (e.g. better resourced) or the quality goals have to be revised downwards.

So what does a Quality Plan have in it?

Testing, certainly. But ‘testing quality in’ is a lousy and inefficient quality strategy. A better strategy is to build quality in, planning for ways to prevent defect ‘injection’ rather than merely detecting defects after the fact. The Quality Plan will therefore often include actions to increase the level of peer review on requirements, designs and code, to extend unit testing or static analysis, or to improve engineering practices used within the group.
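As a small illustration of the difference between testing quality in and building it in, here is a unit-test sketch in Python. The `parse_memory_size` helper is hypothetical, invented purely for this example; the point is that tests like these run with every build, so a defect ‘injected’ here is caught minutes after it is written rather than weeks later in system test.

```python
import pytest

def parse_memory_size(text):
    """Hypothetical helper: parse a VM memory size such as '4GiB' or
    '512MiB' into bytes. Illustrative only, not real XenServer code."""
    units = {"MiB": 2**20, "GiB": 2**30}
    for suffix, factor in units.items():
        if text.endswith(suffix):
            return int(text[:-len(suffix)]) * factor
    raise ValueError(f"unrecognised memory size: {text!r}")

def test_parses_gibibytes():
    assert parse_memory_size("4GiB") == 4 * 2**30

def test_parses_mebibytes():
    assert parse_memory_size("512MiB") == 512 * 2**20

def test_rejects_unknown_unit():
    # Ambiguous units are rejected rather than silently misinterpreted.
    with pytest.raises(ValueError):
        parse_memory_size("4GB")
```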

When it comes to system testing, a typical XenServer quality plan will cover use of our well-established Continuous Integration and Automated Regression Test capabilities. See my earlier blog on XenRT! It will also identify areas where test coverage will be improved, or where additional testing from third parties (other Citrix groups or people outside Citrix) will be solicited.
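To give a flavour of what an automated system-level regression test looks like, here is a pytest-style sketch. The `host` fixture and its `install_vm`, `vm_state` and `reboot_vm` helpers are hypothetical stand-ins, not the real XenRT or XenAPI interfaces; a real harness would drive an actual XenServer host rather than the in-memory fake used here to keep the sketch runnable.

```python
import pytest

@pytest.fixture
def host():
    # Hypothetical fixture: in a real harness this would provision and
    # connect to a freshly installed host under test.
    class FakeHost:
        def __init__(self):
            self.vms = {}

        def install_vm(self, name, template):
            self.vms[name] = {"template": template, "state": "running"}
            return name

        def vm_state(self, name):
            return self.vms[name]["state"]

        def reboot_vm(self, name):
            self.vms[name]["state"] = "running"

    return FakeHost()

def test_windows_vm_lifecycle(host):
    """Regression test: a Windows VM installs, boots and survives a reboot."""
    vm = host.install_vm("win-regression", template="Windows Server (64-bit)")
    assert host.vm_state(vm) == "running"
    host.reboot_vm(vm)
    assert host.vm_state(vm) == "running"
```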

The Quality Plan is also used as an opportunity for continuous improvement. By measuring quality outcomes on previous releases we can refine our quality goals and quality plans to gradually drive up quality. A typical approach we take is to analyse the CRUDs on the previous release and, through this, identify weak points in the product quality and in the engineering processes that gave rise to them. Then we can cost, plan and execute corrective action within the Quality Plan.

On Citrix XenServer 6.2, for instance, in response to problems on XenServer 6.1 we significantly extended automated test coverage of large-pool and large-host Windows VM deployments, and ring-fenced development resource to fix the bugs we found. When we came under pressure to justify the spend on these activities (“why are you doing that instead of developing my favourite new feature?”) we were able to point at the data showing the underlying quality problem and its impacts on our customers. Dropping this work was always an option, but it had to be a conscious one, considered in the same breath as slipping the release or dropping a feature.
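The CRUD analysis that drives this can be as simple as grouping the previous release’s customer-reported defects by the area of the product they fell in. The sketch below is illustrative only; the component names and counts are invented, not real XenServer defect data.

```python
from collections import Counter

# Hypothetical CRUD records from the previous release.
cruds = [
    {"id": "XS-201", "component": "storage"},
    {"id": "XS-202", "component": "windows-guests"},
    {"id": "XS-203", "component": "windows-guests"},
    {"id": "XS-204", "component": "pool-operations"},
    {"id": "XS-205", "component": "windows-guests"},
]

# Rank product areas by the number of customer-reported defects; the worst
# offenders become candidates for extra coverage in the next Quality Plan.
by_component = Counter(d["component"] for d in cruds)
for component, count in by_component.most_common():
    print(f"{component}: {count} CRUDs")
```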

Summary

Quality is a critical property of a software product. Quality doesn’t spontaneously happen but must be actively engineered in.

Attaching quality goals to releases and making someone accountable for quality plans ensures quality has a voice and an equal place at the table.

With quality firmly on the agenda it can’t be silently traded out for scope or time.

And what else?

For most of our customers, XenServer is just one component of a wider Citrix solution. Interoperability and solutions testing is a key part of XenServer quality plans, and indeed of the activities of all Citrix engineering and product groups. See this excellent blog from my colleagues in the Citrix Solutions Lab. Or, indeed, this one! A great example of a high-quality Citrix solution is virtualised GPU, where we’ve worked with partners such as NVIDIA, Dell and CAD software vendors to test large-scale architectures. Or this one from Sagnik Datta, an engineer in my XenServer QA team, describing how we work with OEM vendors such as HP, IBM and Supermicro to QA and ensure hardware compatibility pre-release.