IT services are increasingly interlinked in workflows that cross platform, device and organisational boundaries, for example in cyber-physical systems, business-to-business workflows or the use of cloud services. In such contexts, quality engineering facilitates the necessary all-embracing consideration of quality attributes.
In such contexts, an end-to-end view of quality, from management to operation, is vital. Quality engineering integrates methods and tools from enterprise architecture management, software product management, IT service management, software engineering and systems engineering, as well as from software quality management and information security management. It thus goes beyond the classic disciplines of software engineering, information security management and software product management, since it integrates management issues (such as business and IT strategy, risk management, business process views, knowledge and information management, and operative performance management), design considerations (including the software development process, requirements analysis and software testing) and operative considerations (such as configuration, monitoring and IT service management). In many fields of application, quality engineering is closely linked to compliance with legal and business requirements, contractual obligations and standards. Among the quality attributes, the reliability, security and safety of IT services play a predominant role.
In quality engineering, quality objectives are implemented in a collaborative process. This process requires the interaction of largely independent actors whose knowledge is based on different sources of information.
Quality objectives describe basic requirements for software quality. In quality engineering they typically address the quality attributes of availability, security, safety, reliability and performance. Quality models such as ISO/IEC 25000 and methods such as the Goal Question Metric (GQM) approach make it possible to attribute metrics to quality objectives and thus to measure the degree to which the objectives are attained. Such measurement is a key component of the quality engineering process and, at the same time, a prerequisite for its continuous monitoring and control. To measure quality objectives effectively and efficiently, it is advisable to combine manually determined metrics (e.g. from expert estimates or reviews) with automatically determined ones (e.g. from statistical analysis of source code or automated regression tests) as a basis for decision-making.
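The attribution of metrics to quality objectives described above can be illustrated with a small sketch. The goal, questions, metric names and values below are all invented for illustration; the structure follows the general Goal Question Metric idea of refining a goal into questions and questions into weighted metrics, whose normalised values are aggregated into a degree of attainment.

```python
# Illustrative GQM sketch (all names and values are hypothetical):
# a goal is refined into questions, each question into weighted metrics;
# the degree of attainment is a weighted average over the tree.
from dataclasses import dataclass, field


@dataclass
class Metric:
    name: str
    value: float        # measured value, normalised to [0, 1]
    weight: float = 1.0


@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

    def attainment(self) -> float:
        total = sum(m.weight for m in self.metrics)
        return sum(m.value * m.weight for m in self.metrics) / total


@dataclass
class Goal:
    objective: str
    questions: list = field(default_factory=list)

    def attainment(self) -> float:
        return sum(q.attainment() for q in self.questions) / len(self.questions)


goal = Goal("High availability of the billing service", [
    Question("Is the service reachable?", [
        Metric("uptime_ratio", 0.999),        # automatically monitored
        Metric("failover_review_score", 0.8), # manual expert review
    ]),
    Question("Are incidents resolved quickly?", [
        Metric("mttr_within_sla_ratio", 0.9), # from the ticketing system
    ]),
])

print(f"degree of attainment: {goal.attainment():.3f}")  # prints 0.900
```

Note how the same aggregation combines manually determined metrics (the review score) with automatically collected ones (the uptime ratio), as the text suggests.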
The end-to-end quality management approach of quality engineering involves numerous actors with different responsibilities, tasks, expertise and organisational involvement.
Roles typically involved in quality engineering include:
- Business architect
- IT architect
- Security officer
- Requirements engineer
- Software quality manager
- Test manager
- Project manager
- Product manager
- Security architect
Typically, these roles are distributed across geographic and organisational boundaries. Appropriate measures are therefore needed to coordinate their heterogeneous tasks, to consolidate and synchronise the data and information required to fulfil those tasks, and to make that information available to each actor in an appropriate form.
Knowledge management plays an important part in quality engineering. The quality engineering knowledge base comprises manifold structured and unstructured data, ranging from code repositories, requirements specifications, standards, test reports and enterprise architecture models to system configurations and runtime logs. Software and system models play an important role in mapping this knowledge. The data in the quality engineering knowledge base are generated, processed and made available both manually and by tools, in a geographically, organisationally and technically distributed context. Of prime importance are the focus on quality assurance tasks, the early recognition of risks, and appropriate support for the collaboration of actors.
This results in the following requirements for a quality engineering knowledge base:
- Knowledge is available in the required quality. Important quality criteria are that knowledge is consistent and up to date, as well as complete and of adequate granularity for the tasks of the respective actors.
- Knowledge is interconnected and traceable in order to support interaction between the actors and to facilitate the analysis of data. Such traceability relates not only to the interconnectedness of data across different levels of abstraction (e.g. the connection of requirements with the services realising them) but also to their traceability over time, which requires appropriate versioning concepts. Data can be interconnected both manually and (semi-)automatically.
- Information has to be available in a form consistent with the domain knowledge of the respective actors. The knowledge base therefore has to provide adequate mechanisms for information transformation (e.g. aggregation) and visualisation. The RACI concept is an example of a suitable model for assigning actors to information in a quality engineering knowledge base.
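A RACI assignment of the kind mentioned above can be sketched as a simple matrix. The artefacts and role assignments below are invented for illustration; the letters follow the standard RACI reading of Responsible, Accountable, Consulted and Informed.

```python
# Minimal RACI sketch (artefacts and assignments are hypothetical):
# each information artefact in the knowledge base maps actors to their
# RACI role: R = Responsible, A = Accountable, C = Consulted, I = Informed.
raci = {
    "security_concept": {
        "Security officer": "A",
        "IT architect": "R",
        "Project manager": "I",
    },
    "requirements_spec": {
        "Requirements engineer": "R",
        "Product manager": "A",
        "Test manager": "C",
    },
    "test_report": {
        "Test manager": "R",
        "Software quality manager": "A",
        "Project manager": "I",
    },
}


def actors_with(role_letter: str, artefact: str) -> list:
    """Return the actors holding a given RACI role for an artefact."""
    return [actor for actor, role in raci[artefact].items()
            if role == role_letter]


print(actors_with("R", "test_report"))  # ['Test manager']
```

Such a matrix tells the knowledge base which actors must be able to edit an artefact (R/A) and which only need a read-oriented, possibly aggregated view (C/I).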
- In contexts where actors from different organisations or organisational levels interact with each other, the quality engineering knowledge base has to provide mechanisms for ensuring confidentiality and integrity.
- Quality engineering knowledge bases offer a range of analysis and information-retrieval capabilities in order to support the quality control tasks of the actors.
The quality engineering process comprises all tasks, carried out manually or in a (semi-)automated way, that serve to identify, fulfil and measure quality attributes in a given context. The process is highly collaborative in that it requires the interaction of actors who largely act independently of one another.
The quality engineering process has to integrate existing sub-processes, which may range from highly structured processes such as IT service management to processes with little structure such as agile software development. Another important aspect is a change-driven procedure, in which change events, such as changed requirements, are handled in the local context of the information and actors affected by the change. This presupposes methods and tools that support change propagation and change handling.
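The change-driven handling of a change event can be sketched as a traversal of traceability links. The artefact names, links and role assignments below are invented; the point is that propagation confines the change to its local context, so only the affected artefacts and their responsible actors are notified.

```python
# Hedged sketch (artefacts, links and owners are hypothetical):
# propagate a change event along traceability links so that only the
# artefacts and actors in its local context are identified.
from collections import deque

# artefact -> artefacts that depend on it (traceability links)
depends_on_me = {
    "requirement_R1": ["design_D3", "test_case_T7"],
    "design_D3": ["service_S2"],
    "test_case_T7": [],
    "service_S2": [],
}

# responsible actor per artefact
owner = {
    "design_D3": "IT architect",
    "test_case_T7": "Test manager",
    "service_S2": "Project manager",
}


def propagate(changed: str) -> dict:
    """Breadth-first traversal of traceability links; returns the
    affected artefacts together with the actors to be notified."""
    affected = {}
    queue = deque(depends_on_me.get(changed, []))
    while queue:
        artefact = queue.popleft()
        if artefact not in affected:
            affected[artefact] = owner.get(artefact, "unassigned")
            queue.extend(depends_on_me.get(artefact, []))
    return affected


print(propagate("requirement_R1"))
```

A changed requirement thus reaches the dependent design, test case and service, and through the ownership mapping the respective actors, without involving unrelated parts of the knowledge base.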
The objective of an efficient quality engineering process is the coordination of automated and manual quality assurance tasks. Code reviews and the elicitation of quality objectives are examples of manual tasks, while regression tests and the collection of code metrics are examples of automated tasks. The quality engineering process (or its sub-processes) can be supported by tools such as ticketing systems or security management tools.
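The coordination of automated and manual tasks can be sketched as follows. The task list, the toy metric function and the ticket format are all invented; the sketch only shows the dispatch principle of executing automated tasks directly while routing manual tasks to the responsible role, as a ticketing system would.

```python
# Illustrative sketch (task registry and helpers are hypothetical):
# automated quality assurance tasks are executed directly; manual tasks
# are converted into tickets addressed to the responsible role.
def collect_code_metrics() -> dict:
    """Toy automated task standing in for a real metrics collector."""
    return {"lines_of_code": 1200}


tasks = [
    {"name": "collect code metrics", "kind": "automated",
     "run": collect_code_metrics},
    {"name": "regression tests", "kind": "automated",
     "run": lambda: {"passed": True}},
    {"name": "code review", "kind": "manual",
     "role": "Software quality manager"},
    {"name": "elicit quality objectives", "kind": "manual",
     "role": "Requirements engineer"},
]

results, tickets = [], []
for task in tasks:
    if task["kind"] == "automated":
        results.append((task["name"], task["run"]()))
    else:
        tickets.append(f"TICKET: {task['name']} -> {task['role']}")

print(results)
print(tickets)
```

Both outputs feed the same decision basis: the automated results enter the metric store, while the tickets track the manual tasks until their outcomes are recorded.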
- Txture is a tool for textual IT-Architecture documentation and analysis.
- mbeddr is a set of integrated and extensible languages for embedded software engineering, plus an integrated development environment (IDE).