[Image: Pyramid showing the 5 levels of DAMM maturity]

The Digital Accessibility Maturity Model: Dimension #7 – Testing and Validation

Written by: Sheri Byrne-Haber

In my last post in the Digital Accessibility Maturity Model (DAMM) series, I detailed the Development Lifecycle Dimension of DAMM. In this post I’ll discuss the Testing and Validation Dimension, the aspects and artifacts associated with this dimension, and what each of the five maturity levels looks like for this Dimension. (DAMM Definitions and Acronyms)

The Testing and Validation Dimension measures the degree of maturity associated with accessibility testing processes and approaches. This encompasses an evolution from a chaotic testing environment, to active, structured testing, to a continuously updated testing approach that includes assistive technology and integrated testing by users with disabilities.

Aspects

  • Accessibility Testing Process – The degree to which accessibility testing is being performed effectively. This includes ensuring both that technical testing is performed effectively and that functional tests by users with disabilities occur throughout the development lifecycle.
  • Accessibility Testing Infrastructure – The degree to which an effective accessibility testing infrastructure is in place. This includes support for automated, manual, global and use case testing in a single, unified and reconciled data set (see the automation sketch after this list). Testing infrastructure includes the following requirements:
    • Software tools to manage development requirements and integration;
    • Integration of one or more automated tools with a defect tracking tool like Jira or HP Quality Center; and
    • Integration of an automated tool with the CMS.
  • Accessibility Testing Artifacts Creation Process – The maturity of the organization relating to specific development artifacts that must be developed and/or filed with the Accessibility Program Office as part of the accessibility testing process.
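
As a concrete illustration of the infrastructure aspect above, here is a minimal sketch of an automated accessibility scan wired into a test runner. It assumes Playwright and @axe-core/playwright are available; the fileDefect helper and the example URL are hypothetical placeholders for an organization’s own Jira (or similar) defect-tracking integration.

```typescript
// Minimal sketch: automated accessibility scan inside a Playwright test,
// with results forwarded to a (hypothetical) defect tracker before failing.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

// Hypothetical hook into a defect-tracking system (e.g. Jira) --
// replace with your organization's integration.
async function fileDefect(summary: string): Promise<void> {
  console.log(`Would file defect: ${summary}`);
}

test('home page has no detectable accessibility violations', async ({ page }) => {
  await page.goto('https://example.com/'); // placeholder URL

  // Run axe-core against the rendered page.
  const results = await new AxeBuilder({ page }).analyze();

  // File one defect per rule violation, then fail the test if any remain.
  for (const violation of results.violations) {
    await fileDefect(`${violation.id}: ${violation.help} (${violation.nodes.length} node(s))`);
  }
  expect(results.violations).toEqual([]);
});
```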

Artifacts

  • Accessibility Testing Artifacts – The maturity associated with a master accessibility testing plan and associated accessibility testing artifacts. Such a plan would define increasing levels of formal accessibility testing as product- and project-specific needs dictate, define the overall testing approach for accessibility and the key gateways at which accessibility will be considered, and include documented approaches for sampling an application and for validating that such sampling occurs.
  • Quality Control Plan – processes for validating accessibility throughout the development lifecycle.
  • Accessibility Testing Sampling – The level of maturity associated with sampling ICT for accessibility testing (see the sampling sketch after this list).
  • Usability Testing Artifacts, which includes:
    • Usability Goals;
    • Usability Test Plans;
    • Usability Test Checklists;
    • Usability Test Results / Reports; and
    • Non-Disclosure Agreements.
  • User Group Profiles – describe the relevant characteristics of the people expected to use a product. For example, for a web-based office product, a User Group Profile might include details on demographics, job responsibilities and tasks, frequency of use, type of hardware, place of use, type of software, experience using the Internet, and task knowledge.
  • Pilot Program Artifacts, which includes:
    • Recruiting Screener
    • Pilot Program Non-disclosure Agreements
  • Assistive Technology Catalog
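
To illustrate the sampling artifact described above, here is a small sketch of one possible template-based sampling approach. The PageRecord shape, template names, and samplePages helper are illustrative assumptions, not anything DAMM prescribes; DAMM only requires that the sampling approach be documented and validated.

```typescript
// Minimal sketch of a template-based sampling helper for manual accessibility testing.
interface PageRecord {
  url: string;
  template: string; // e.g. "article", "checkout", "search results"
}

// Fisher-Yates shuffle of a copy of the input array.
function shuffle<T>(items: T[]): T[] {
  const copy = [...items];
  for (let i = copy.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [copy[i], copy[j]] = [copy[j], copy[i]];
  }
  return copy;
}

// Draw up to `perTemplate` pages from each template so every distinct
// layout is represented in the manual test sample.
function samplePages(inventory: PageRecord[], perTemplate: number): PageRecord[] {
  const byTemplate = new Map<string, PageRecord[]>();
  for (const page of inventory) {
    const group = byTemplate.get(page.template) ?? [];
    group.push(page);
    byTemplate.set(page.template, group);
  }
  const sample: PageRecord[] = [];
  for (const group of byTemplate.values()) {
    sample.push(...shuffle(group).slice(0, perTemplate));
  }
  return sample;
}

// Example: sample two pages from each hypothetical template for manual review.
const toReview = samplePages(
  [
    { url: '/news/launch', template: 'article' },
    { url: '/news/update', template: 'article' },
    { url: '/cart/checkout', template: 'checkout' },
  ],
  2
);
console.log(toReview.map((p) => p.url));
```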

Maturity Levels

Level 1 – Initial

  • There is no formal accessibility testing policy or standards defined for the organization.
  • Accessibility testing artifacts generated during the testing process are filed only sporadically with the Accessibility Program Office, if one exists.
  • Accessibility testing is carried out in an ad-hoc fashion, if at all.

Level 2 – Managed

  • A formal accessibility testing plan exists for each project, meeting the organizational policies and standards.
  • Some artifacts from the testing process that are produced by individual teams or lines of business have been identified and generally defined, and are filed with the Accessibility Program Office, as needed.
  • Accessibility testing is carried out in an ad-hoc fashion, and is occasionally documented in the overall testing plan.
  • Little to no functional testing is performed by persons with disabilities.

Level 3 – Defined

  • Accessibility testing policy addresses the following issues:
    • Developer unit testing;
    • Testing with assistive technology;
    • The need to have a diverse tester pool including end users;
    • Functionality and content testing; and
    • Defining the roles and responsibilities of everyone who performs a formal function in the testing process.
  • A detailed master accessibility testing plan is in place and is followed.
  • Increasing levels of formal accessibility testing occur as product/project risk and ROI allow.
  • A specific set of accessibility testing artifacts are produced by individual teams or lines of business and filed with the Accessibility Program Office.
  • Formal accessibility testing is being conducted on all projects under the scope of the organization’s Accessibility Policy.
  • Functional testing by persons with disabilities is occurring and documented.
  • Accessibility testing results are included as part of the “ship” and/or product or project completion decisions (see the release gate sketch after this list).
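
As one way to fold accessibility results into ship decisions, here is a minimal sketch of a release gate. The Violation shape mirrors axe-core’s violations output, and the serious/critical threshold is an illustrative assumption rather than a DAMM requirement.

```typescript
// Minimal sketch of a release gate driven by accessibility scan output.
interface Violation {
  id: string;
  impact: 'minor' | 'moderate' | 'serious' | 'critical';
  nodes: unknown[];
}

function shipDecision(violations: Violation[]): { ship: boolean; reason: string } {
  // Treat serious and critical issues as blocking; everything else is
  // tracked but does not stop the release.
  const blocking = violations.filter(
    (v) => v.impact === 'serious' || v.impact === 'critical'
  );
  return blocking.length > 0
    ? { ship: false, reason: `${blocking.length} serious/critical accessibility issue(s) open` }
    : { ship: true, reason: 'No blocking accessibility issues' };
}
```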

Level 4 – Quantitatively Managed

  • Compliance level metrics, from the organizational level down to the product level, are filed with the Accessibility Program Office (APO); see the roll-up sketch after this list.
  • The APO is performing independent testing and validation of the testing results provided by covered organizations.
  • The APO or an independent audit team is validating the testing approach selected by each team against project risk models.
  • Accessibility testing artifacts created by individual teams or lines of business are actively reviewed by the Accessibility Program Office.
  • The APO may, as needed, request additional testing information, after which amended artifacts may be filed.
  • Accessibility testing is conducted on all ICT by well-trained, qualified individuals.
  • Use case testing by persons with disabilities is occurring at multiple stages in the development lifecycle.
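
To illustrate how product-level results might roll up into an organization-level compliance metric, here is a small sketch. The ProductCompliance shape, product names, and figures are hypothetical; the metric here is simply the share of tested criteria that pass, weighted by how many criteria each product tested.

```typescript
// Minimal sketch of rolling product-level accessibility results up to an
// organization-level compliance metric filed with the APO.
interface ProductCompliance {
  product: string;
  criteriaPassed: number; // e.g. WCAG success criteria passing
  criteriaTested: number;
}

// Organization-level metric, weighted by the number of criteria tested per product.
function organizationCompliance(products: ProductCompliance[]): number {
  const passed = products.reduce((sum, p) => sum + p.criteriaPassed, 0);
  const tested = products.reduce((sum, p) => sum + p.criteriaTested, 0);
  return tested === 0 ? 0 : passed / tested;
}

// Example: two hypothetical products reported to the Accessibility Program Office.
const report = organizationCompliance([
  { product: 'web-portal', criteriaPassed: 42, criteriaTested: 50 },
  { product: 'mobile-app', criteriaPassed: 30, criteriaTested: 40 },
]);
console.log(`Org compliance: ${(report * 100).toFixed(1)}%`); // 80.0%
```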

Level 5 – Optimizing

  • Testing artifacts are actively updated and disseminated based on lessons learned from each group.
  • Accessibility testing approaches and models are constantly refined to accurately validate compliance.
  • The assistive technologies used for testing purposes and user group profiles are actively updated and maintained based on market conditions.
  • Accessibility testing artifacts required by teams are actively updated and maintained for form and ease of use.
  • Accessibility testing best practices and techniques are actively maintained and updated.
  • As risk areas are identified by functional tests, manual validation items are created.

Coming Up

In my next post I’ll discuss Dimension #8 of DAMM – the Support and Documentation Dimension – which measures the degree of maturity with which an organization provides accessible support services and documentation, including ensuring that support systems meet the communication needs of people with disabilities and that documentation is provided in an accessible or alternative format.
