Non-functional list of ideas

A while ago the lead business analyst in my project asked me if I could supply him with a list of non-functional tests.
While I was aware that he probably meant the non-functional tests defined, executed and/or logged for our project, I did not have a lot of time to search our testware for them, so I gave him the list of definitions below to digest as a starter.

I shared that list so the items on it can be used to draw test ideas from. Since the initial publication I have decided that this should be a living document. My first addition is based on the list of Software Quality Characteristics by Rikard Edgren, Henrik Emilsson and Martin Jansson (the Test Eye). I would like to continue to add definitions and possibly the supporting items that the Test Eye added to theirs for each of them.

accessibility: usability of a product, service, environment or facility by people with the widest range of capabilities

accountability: degree to which the actions of an entity can be traced uniquely to the entity

adaptability: The capability of the software product to be adapted for different specified environments without applying actions or means other than those provided for this purpose for the software considered.

analyzability: The capability of the software product to be diagnosed for deficiencies or causes of failures in the software, or for the parts to be modified to be identified

availability: The degree to which a component or system is operational and accessible when required for use.

capacity: degree to which the maximum limits of a product or system parameter meet requirements

capability: Can the product perform valuable functions?

Completeness: all important functions wanted by end users are available.
Accuracy: any output or calculation in the product is correct and presented with significant digits.
Efficiency: performs its actions in an efficient manner (without doing what it’s not supposed to do.)
Interoperability: different features interact with each other in the best way.
Concurrency: ability to perform multiple parallel tasks, and run at the same time as other processes.
Data agnosticism: supports all possible data formats, and handles noise
Extensibility: ability for customers or 3rd parties to add features or change behavior.

changeability: The capability of the software product to enable specified modifications to be implemented.

charisma: Does the product have “it”?

Uniqueness: the product is distinguishable and has something no one else has.
Satisfaction: how do you feel after using the product?
Professionalism: does the product have the appropriate flair of professionalism and feel fit for purpose?
Attractiveness: are all types of aspects of the product appealing to eyes and other senses?
Curiosity: will users get interested and try out what they can do with the product?
Entrancement: do users get hooked, have fun, in a flow, and fully engaged when using the product?
Hype: does the product use too much or too little of the latest and greatest technologies/ideas?
Expectancy: the product exceeds expectations and meets the needs you didn’t know you had.
Attitude: do the product and its information have the right attitude and speak to you with the right language and style?
Directness: are (first) impressions impressive?
Story: are there compelling stories about the product’s inception, construction or usage?

compatibility: degree to which a product, system or component can exchange information with other products, systems or components, or perform its required functions, while sharing the same hardware or software environment

compatibility: How well does the product interact with software and environments?

Hardware Compatibility: the product can be used with applicable configurations of hardware components.
Operating System Compatibility: the product can run on intended operating system versions, and follows typical behavior.
Application Compatibility: the product, and its data, works with other applications customers are likely to use.
Configuration Compatibility: product’s ability to blend in with configurations of the environment.
Backward Compatibility: can the product do everything the last version could?
Forward Compatibility: will the product be able to use artifacts or interfaces of future versions?
Sustainability: effects on the environment, e.g. energy efficiency, switch-offs, power-saving modes, telecommuting.
Standards Conformance: the product conforms to applicable standards, regulations, laws or ethics.

compliance: The capability of the software product to adhere to standards, conventions or regulations in laws and similar prescriptions.

confidentiality: degree to which a product or system ensures that data are accessible only to those authorized to have access 

flexibility: the ease with which a system or component can be modified for use in applications or environments other than those for which it was specifically designed

installability: The capability of the software product to be installed in a specified environment

integrity: degree to which a system or component prevents unauthorized access to, or modification of, computer programs or data

interoperability: The capability of the software product to interact with one or more specified components or systems

IT-ability: Is the product easy to install, maintain and support?

System requirements: ability to run on supported configurations, and handle different environments or missing components.
Installability: product can be installed on intended platforms with appropriate footprint.
Upgrades: ease of upgrading to a newer version without loss of configuration and settings.
Uninstallation: are all files (except the user’s or system files) and other resources removed when uninstalling?
Configuration: can the installation be configured in various ways or places to support customer’s usage?
Deployability: product can be rolled-out by IT department to different types of (restricted) users and environments.
Maintainability: are the product and its artifacts easy to maintain and support for customers?
Testability: how effectively can the deployed product be tested by the customer?

learnability: The capability of the software product to enable the user to learn its application

localizability: How economical will it be to adapt the product for other places?

maintainability: The ease with which a software product can be modified to correct defects, modified to meet new requirements, modified to make future maintenance easier, or adapted to a changed environment

maintainability: Can the product be maintained and extended at low cost?

Flexibility: the ability to change the product as required by customers.
Extensibility: will it be easy to add features in the future?
Simplicity: the code is not more complex than needed, and does not obscure test design, execution and evaluation.
Readability: the code is adequately documented and easy to read and understand.
Transparency: Is it easy to understand the underlying structures?
Modularity: the code is split into manageable pieces.
Refactorability: are you satisfied with the unit tests?
Analyzability: ability to find causes for defects or other code of interest.

modifiability: ease with which a system can be changed without introducing defects

modularity: degree to which a system or computer program is composed of discrete components such that a change to one component has minimal impact on other components

non-functional requirement: A requirement that does not relate to functionality, but to attributes such as reliability, efficiency, usability, maintainability and portability

operability: The capability of the software product to enable the user to operate and control it

performance: Is the product fast enough?

Capacity: the many limits of the product, for different circumstances (e.g. slow network.)
Stress handling: how does the system cope when exceeding various limits?
Responsiveness: the speed with which an action is (perceived as) performed.
Availability: the system is available for use when it should be.
Throughput: the product’s ability to process many, many things.
Endurance: can the product handle load for a long time?
Feedback: is the feedback from the system on user actions appropriate?
Scalability: how well does the product scale up, out or down?
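
To turn characteristics like responsiveness, throughput and capacity into concrete test ideas, it helps to put numbers on them. Below is a minimal sketch in Python, assuming a hypothetical process_order operation as the thing under test and made-up thresholds; in a real project the limits would come from the requirements.

```python
import time

# Hypothetical operation under test; replace with a real call to the product.
def process_order(order_id):
    time.sleep(0.01)  # simulate a small amount of work
    return {"order_id": order_id, "status": "processed"}

def measure(operation, runs=100):
    """Time an operation repeatedly and return per-call durations in seconds."""
    durations = []
    for i in range(runs):
        start = time.perf_counter()
        operation(i)
        durations.append(time.perf_counter() - start)
    return durations

if __name__ == "__main__":
    durations = measure(process_order, runs=100)
    avg = sum(durations) / len(durations)
    worst = max(durations)
    throughput = len(durations) / sum(durations)

    # Assumed thresholds -- illustrative only, not real requirements.
    assert avg < 0.05, f"average response time too high: {avg:.3f}s"
    assert worst < 0.2, f"worst-case response time too high: {worst:.3f}s"
    print(f"avg {avg:.3f}s, worst {worst:.3f}s, throughput {throughput:.1f}/s")
```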

pleasure: degree to which a user obtains pleasure from fulfilling personal needs

portability: The ease with which the software product can be transferred from one hardware or software environment to another

portability: Is transferring of the product to different environments and languages enabled?

Reusability: can parts of the product be re-used elsewhere?
Adaptability: is it easy to change the product to support a different environment?
Compatibility: does the product comply with common interfaces or official standards?
Internationalization: it is easy to translate the product.
Localization: are all parts of the product adjusted to meet the needs of the targeted culture/country?
User Interface-robustness: will the product look equally good when translated?

recoverability: The capability of the software product to re-establish a specified level of performance and recover the data directly affected in case of failure

reliability: The ability of the software product to perform its required functions under stated conditions for a specified period of time, or for a specified number of operations

reliability: Can you trust the product in many and difficult situations?

Stability: the product shouldn’t cause crashes, unhandled exceptions or script errors.
Robustness: the product handles foreseen and unforeseen errors gracefully.
Recoverability: it is possible to recover and continue using the product after a fatal error.
Resource Usage: appropriate usage of memory, storage and other resources.
Data Integrity: all types of data remain intact throughout the product.
Safety: the product will not be part of damaging people or possessions.
Disaster Recovery: what if something really, really bad happens?
Trustworthiness: is the product’s behavior consistent, predictable, and trustworthy?

replaceability: The capability of the software product to be used in place of another specified software product for the same purpose in the same environment

reusability: degree to which an asset can be used in more than one system, or in building other assets

robustness: The degree to which a component or system can function correctly in the presence of invalid inputs or stressful environmental conditions

safety: The capability of the software product to achieve acceptable levels of risk of harm to people, business, software, property or the environment in a specified context of use

satisfaction: freedom from discomfort and positive attitudes towards the use of the product

scalability: The capability of the software product to be upgraded to accommodate increased loads

security: Attributes of software products that bear on its ability to prevent unauthorized access, whether accidental or deliberate, to programs and data

security: Does the product protect against unwanted usage?

Authentication: the product’s identifications of the users.
Authorization: the product’s handling of what an authenticated user can see and do.
Privacy: the ability to not disclose protected data to unauthorized users.
Security holes: the product should not invite social engineering vulnerabilities.
Secrecy: the product should under no circumstances disclose information about the underlying systems.
Invulnerability: ability to withstand penetration attempts.
Virus-free: product will not transport virus, or appear as one.
Piracy Resistance: no possibility to illegally copy and distribute the software or code.
Compliance: security standards the product adheres to.
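
Several of these items, authentication and authorization in particular, translate naturally into small, explicit checks. The sketch below is an illustration only, not a real security implementation: the role names, permissions and is_allowed helper are hypothetical examples of a role-based model.

```python
# Minimal role-based authorization sketch; roles, actions and the mapping
# are hypothetical examples, not part of any real product.
ROLE_PERMISSIONS = {
    "viewer": {"read"},
    "editor": {"read", "write"},
    "admin": {"read", "write", "delete"},
}

def is_allowed(role, action):
    """Return True if the given role is permitted to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

# Test ideas expressed as checks: authorized and unauthorized access.
assert is_allowed("admin", "delete")
assert not is_allowed("viewer", "write")
assert not is_allowed("unknown-role", "read")  # unknown roles get nothing
```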

stability: The capability of the software product to avoid unexpected effects from modifications in the software

suitability: The capability of the software product to provide an appropriate set of functions for specified tasks and user objectives

supportability: Can customers’ usage and problems be supported?

Identifiers: is it easy to identify parts of the product and their versions, or specific errors?
Diagnostics: is it possible to find out details regarding customer situations?
Troubleshootable: is it easy to pinpoint errors (e.g. log files) and get help?
Debugging: can you observe the internal states of the software when needed?
Versatility: ability to use the product in more ways than it was originally designed for.

survivability: degree to which a product or system continues to fulfill its mission by providing essential services in a timely manner in spite of the presence of attacks

testability: The capability of the software product to enable modified software to be tested

testability: Is it easy to check and test the product?

Traceability: the product logs actions at appropriate levels and in usable format.
Controllability: ability to independently set states, objects or variables.
Isolateability: ability to test a part by itself.
Observability: ability to observe things that should be tested.
Monitorability: can the product give hints on what/how it is doing?
Stability: changes to the software are controlled, and not too frequent.
Automation: are there public or hidden programmatic interfaces that can be used?
Information: ability for testers to learn what needs to be learned…
Auditability: can the product and its creation be validated?
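
Controllability and observability are easier to assess with a concrete example in hand. The sketch below uses a hypothetical DiscountService in Python to show a design that supports both: the clock and the log destination are injected, so a test can force any state independently and observe what the code did.

```python
from datetime import date

class DiscountService:
    """Hypothetical example: the discount depends on the current date."""

    def __init__(self, today_fn=date.today, log=print):
        # Injected dependencies make the class controllable (any date can be
        # forced) and observable (log output can be captured) in a test.
        self._today = today_fn
        self._log = log

    def discount(self, amount):
        rate = 0.10 if self._today().month == 12 else 0.0
        self._log(f"discount rate {rate} applied to {amount}")
        return amount * (1 - rate)

# Test idea: force a December date and capture the log instead of printing it.
captured = []
service = DiscountService(today_fn=lambda: date(2014, 12, 24), log=captured.append)
assert service.discount(100) == 90.0
assert "rate 0.1" in captured[0]
```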

traceability: The ability to identify related items in documentation and software, such as requirements with associated tests

usability: extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use

usability: Is the product easy to use?

Affordance: the product invites users to discover its possibilities.
Intuitiveness: it is easy to understand and explain what the product can do.
Minimalism: there is nothing redundant about the product’s content or appearance.
Learnability: it is fast and easy to learn how to use the product.
Memorability: once you have learnt how to do something you don’t forget it.
Discoverability: the product’s information and capabilities can be discovered by exploration of the user interface.
Operability: an experienced user can perform common actions very fast.
Interactivity: the product has easy-to-understand states and possibilities of interacting with the application (via GUI or API).
Control: the user should feel in control over the proceedings of the software.
Clarity: is everything stated explicitly and in detail, with a language that can be understood, leaving no room for doubt?
Errors: there are informative error messages, it is difficult to make mistakes and it is easy to recover after making them.
Consistency: behavior is the same throughout the product, and there is one look & feel.
Tailorability: default settings and behavior can be specified for flexibility.
Accessibility: the product is possible to use for as many people as possible, and meets applicable accessibility standards.
Documentation: there is a Help that helps, and matches the functionality.

Any additions to this list or comments on the definitions used are welcome.

SOURCES

  • http://www.computer.org/sevocab
    (search items ISO 29119 and ISO 25010)
  • ISTQB Glossary
  • CRUSSPIC STMPL (RST courseware)
  • Software Quality Characteristics and Internal Software Quality Characteristics by Rikard Edgren, Henrik Emilsson and Martin Jansson in The Little Black Book on Test Design

TMap Day 2014

On October 28, 2014 I visited Sogeti’s TMap Day.
This year’s focus was on their new book “Neil’s Quest for Quality”,
subtitled “A TMap© HD Story”.

This blog post describes my first impressions of that day. When I have read the book I will either return to this post and adjust it or write a separate one on the book’s content.

Note: I have read the book and have adjusted the post.

The book is written as a novel. It contains “the TMap Human Driven story, consisting of a business novel, building blocks, Mr. Mikkel’s musings and contributions from the innovations board in testing”.

A quality-driven approach

The new TMap presents itself as a quality-driven approach that is captured in the TMap© Suite which consists of the following three parts:

  1. TMap Next
  2. TMap© HD
  3. Building Blocks, described in the TMap HD book and gathered, maintained and extended on http://www.tmap.net

Inspired by Lean, Agile and DevOps

Both authors explained their contributions to the book and expressed that the elements (more detail below) are mostly inspired by the market move towards Agile and DevOps, and that the whole is based on a Lean approach to software development.

Aldert Boersma, one of the authors, positioned quality-driven as follows.

[Figure: positioning of TMap HD]

The approach described in TMap HD© distinguishes five basic elements:

[Figure: the five TMap HD elements]

The (short) TMap HD descriptions for these concepts are:
People
Only people can realize the move from “Testing according to TMap” to “Testing with TMap”. People with a broad knowledge of quality and testing, that is.

Simplify
Make things as simple as possible – but not simpler than that.
Start small and work from there. A complicated process will only lose focus on the result.

Integrate
Integration with respect to testing denotes a shared way of working, with a shared responsibility for quality. Testing is not a stand-alone process.

Industrialize
Industrialization is important in improving testing and optimizing quality. Test tools are used to test more, more often, and faster.

Confidence
The goal of TMap HD© is providing confidence in IT solutions through a quality-driven approach. Confidence is the fifth element over and above all others.

Building Blocks

The first four elements should help you choose and apply building blocks that give (build) confidence. There is no prescribed set of blocks to use, and the main blocks themselves are seen as larger blocks from which smaller parts are chosen.

The currently available building blocks are:

  • Test manager
    In the Test manager presentation the remark was made that fewer people will be test managers, but the activities will remain
  • Test manager in traditional
  • Assignment
  • Test organization
  • Test plan
  • Product risk analysis
    Oddly, during the test management presentation and in the book this was called
    Product Risk & Benefit Analysis, but that is not part of the website (yet)
  • Test strategy
  • Performance testing
  • Test approaches
  • Crowd-testing
  • Test varieties
    This replaces the current Test Levels and Test Types.
    They are divided into Experience-based and Coverage-based
  • Test manager in agile
  • Permanent test organisation
  • Model-based testing
  • Quality policy
  • Using test tools
  • Quality-driven characteristics
  • Integrated test organization
  • Reviewing requirements

As a kind of closing motto for the morning, the following phrase was offered as a summary for test managers: “Do not report trouble but offer choices for the client”.

So…

The TMap Day and the book have left me with rather distinct, and slightly contradictory, impressions.

A move in the right direction

Sogeti has embraced the fact that software development, and with it software testing, has changed over the last decades. The rise of Agile and Lean on the one side and the decline of Waterfall on the other haven’t gone unnoticed, and the new brand certainly addresses these developments. There is also an influence that Sogeti has carefully avoided mentioning but that I believe is clearly present: Context-Driven Testing. In spite of calling it environment, circumstance or situation, TMap HD shares the principle that, based on the context, software testing is and should be different and should use what best suits that context.

Criticism

Ever since TMap and TMap Next appeared, and particularly since their training and certification program, there has been a lot of criticism. This criticism especially focuses on the rigid factory-school view on software testing and the limited value of TMap certification. While Sogeti itself did not react to this much, many of the authors, most of them no longer working for Sogeti, did. The common denominator in their responses was that the content was misunderstood and was never meant to be followed to the letter.

Next to a wider interest in, and influence of, new software development approaches, the emergence of building blocks shows that part of the criticism has been taken to heart and that TMap should and can now be used more flexibly.

Superficial

In the Netherlands we have a saying, “Oude wijn in nieuwe zakken” (literally: old wine in new wineskins), expressing that although it looks new it’s still the same old stuff. I believe this also applies to TMap HD. Even with the influence of Agile and Lean and the introduction of Building Blocks in the book, I am still left with the feeling that beneath the surface the nature of the solutions is the same as before. This feeling is strengthened by the fact that TMap Next is still declared to be the core of the testing approach and that all existing training courses and certificates remain. Especially that last part has led to the rigid and limited testing approach that many Dutch testers employ.

So while in theory there is hope for positive change, I fear that in reality not much will change for the better.

The CDT Brigade

On August 26, 2014 @perftestman posted the following tweet:
“I think a lot of the CDT brigade just like the sound of they’re  own voice –
everything’s CONTEXT driven! ”

It was her reaction to my earlier tweet, which got some attention:
Dear @pt_wire I use effective, documented systematic testing processes and methods. But not generic one size fits all. #CDT vs #ISO29119

For a while both James Bach and I reacted to it, but the limitation of 140 characters, tweeting on a mobile phone and, not least, work made me stop engaging in the Twitter feed. It wasn’t the first time that people resorted to this kind of fallacy, thus avoiding discussion of the actual content. This made me think about it, and in this post I will share my little thought exercise.

Addressing the first part of her tweet: what is a brigade?

  • A group of people who have the same beliefs
    I wouldn’t compare CDT to a belief system, but one cannot deny that the CDT community shares values, talks about them, sometimes feels strongly about them and expresses them out in the open. This doesn’t apply specifically to CDT, as e.g. the ISTQB brigade shows similar behavior.
    I also do not think that the CDT community is out to convert people into context-driven testers. Convince by pointing out alternatives and different approaches – yes; answer questions – yes; share knowledge – yes; share experiences – yes; but testers are allowed to decide for themselves how and whether they want to use it.
  • A group of people organized to act together
    The CDT community certainly confers regularly, meets (live or online) and challenges each other. They are however not organized as a single group or organization, nor do I think they want to be. They are mostly independent and critical minds who have discovered that taking the context into account and using it helps them deliver more value, and that building skill and acquiring and sharing knowledge and experience with others helps them get better at doing that.
  • A large group of soldiers that is part of an army
    I fear that @perftestman intended to use this definition. This comparison, however, lacks credibility. The CDT community might seem to figuratively go into battle over some subject and, when feeling strongly about it, engage in fierce discussion. And unfortunately a few of them occasionally even lash out at individuals who oppose CDT values or cannot handle the challenging style of discussion. But even if we form a community, we are not so organized that we form a specific group of sorts, nor do we intend to destroy or conquer. The community essentially is a collection of like-minded professionals who rather aim to convince with facts, ideas and experience, with the intent to advance the craft.

In a follow-up post I will address an earlier tweet, "Too many of these so called test guru are impostors – detached from the realities of software development and #lovethesoundofyourownvoice", compare the contributors to ISO 29119 to the CDT brigade, and thus attempt to discover who the potential impostors are meant to be in this tweet and on other occasions that spawned remarks like this.