Quality statement

Introduction

A) The policy has been prepared in accordance with the Government’s guidance, 10 Steps to Cyber Security:

https://www.gov.uk/government/publications/cyber-risk-management-a-board-level-responsibility/10-steps-summary

B) Amongst other quality assurance measures, we engage third-party testing of the vulnerabilities and strengths of our Platforms. This establishes independent and authoritative verification that the Platforms are:

  • easy to use for anyone who needs to use them, regardless of the device they’re using;
  • stable, secure and fast, regardless of how many people use the Platforms; and,
  • capable of quick iteration to better meet our users’ needs.

C) We strive to maintain the most secure service for our users and to ensure that our performance is rugged and dependable, and that it incorporates the essential precautions to protect our infrastructure and users’ data.

D) Cyber Essentials Plus Certificate

We are in the process of achieving the Government’s independently assessed Cyber Essentials Plus certificate.

This process includes independent testing of whether the controls implemented collectively defeat threats from hacking and phishing. The testing also covers all internet gateways and all servers providing services directly to unauthenticated internet-based users.

More information about this certificate may be found here:

https://www.gov.uk/government/publications/cyber-essentials-scheme-overview

Our Quality Assurance policy requires the following approaches and exercises to be undertaken.

1 The scope of our tests

The third party assessment incorporates tests to identify:

  • security issues that might affect our service; and,
  • the value and performance of the technology chosen to build our service.

The infrastructure tested may be considered in five categories:

  1. Boundary firewalls and internet gateways
  2. Secure configuration
  3. Access control
  4. Malware protection
  5. Patch management

When we speak about testing in this Policy, we mean the assessment, by various means, of the Platforms owned and maintained by DisputesEfiling.com Limited which deliver our customers’ services.

2 A whole team approach

The service manager has overall responsibility for maintaining the quality of a service. However, because quality relates to every part of a service, the service manager is required to make sure all members of the team know how to:

  • 2.1 set goals for quality and measure the service’s performance against them;
  • 2.2 identify problems with any aspect of our service; and,
  • 2.3 take action to fix any issues and improve quality.

3 Quality testing

Because we cannot know how resilient our product is until it is tested, we arrange simulations of both normal and unusual conditions; for example, when our service has lots of visitors or is under attack.
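A minimal sketch of this kind of simulation, assuming a hypothetical stand-in handler in place of a real service endpoint (the function name and timings are illustrative, not part of this Policy):

```python
# Sketch of a load simulation: many concurrent requests against a
# hypothetical handler, checking that every request still succeeds.
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(payload):
    # Hypothetical stand-in for a real service endpoint.
    time.sleep(0.001)  # simulate a small amount of processing work
    return {"status": "ok", "echo": payload}

def run_load_test(n_requests=200, workers=20):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(handle_request, range(n_requests)))
    elapsed = time.perf_counter() - start
    ok = sum(1 for r in results if r["status"] == "ok")
    return ok, n_requests, elapsed

ok, total, elapsed = run_load_test()
print(f"{ok}/{total} requests succeeded in {elapsed:.2f}s")
```

In a real assessment the stand-in handler would be replaced by calls to the deployed service, and the request volumes would be chosen to reflect expected and peak traffic.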

Testing for quality helps us:

  • 3.1 build the best system we can;
  • 3.2 make sure our service performs as our users need it to do;
  • 3.3 build and regularly develop our service at a commercially acceptable cost, taking into account business change, risk and other relevant factors.

4 Agile approach

As part of using agile methods, we test in a way that confirms the following as quickly as possible:

  • 4.1 our code is working the way it’s expected to; and,
  • 4.2 our service is protected against malicious attacks.

5 Developing by iteration

When we test the Platforms, we seek to automate as much of the process as possible. For example, using a continuous integration system (where our tests form part of our codebase) means our code is tested automatically every time we make a change.

We seek user feedback shortly after making changes which means we can respond quickly and make changes when needed. We can also spot bugs before they develop into bigger issues that may be more complicated and expensive to fix.

6 Types of testing

We run different types of test to examine all aspects of the Platforms’ performance and integrity, for example:

  • 6.1 load and performance testing – to check how much traffic our service can handle and how stable and responsive it is under pressure;
  • 6.2 vulnerability and penetration testing – to check how secure our system is;
  • 6.3 exploratory testing – to manually check for bugs and defects;
  • 6.4 accessibility testing – to check that people with disabilities can use our service; and,
  • 6.5 acceptance and unit testing – to check that our code works as intended.
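As an illustration of 6.5, a unit test of this kind might look as follows; the function under test (`validate_reference`) and its rules are hypothetical stand-ins for real service code:

```python
# Sketch of a unit test using Python's standard unittest framework.
import unittest

def validate_reference(ref):
    """Hypothetical service function: a case reference is valid when it
    has the form 'ABC-1234' (three letters, a hyphen, four digits)."""
    if not isinstance(ref, str) or len(ref) != 8 or ref[3] != "-":
        return False
    return ref[:3].isalpha() and ref[4:].isdigit()

class TestValidateReference(unittest.TestCase):
    def test_accepts_well_formed_reference(self):
        self.assertTrue(validate_reference("ABC-1234"))

    def test_rejects_malformed_references(self):
        for bad in ["", "AB-1234", "ABC-12X4", None]:
            self.assertFalse(validate_reference(bad))

# Run the suite programmatically so the result can be inspected.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestValidateReference)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Tests like these run automatically in continuous integration, so a change that breaks the expected behaviour is caught before it reaches users.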

7 Technical debt

This is the expression used to cover any compromises we make on quality in order to develop something quickly in the short term. The extra effort (or interest) required to improve what we have built is something we must expend (or pay) in the future.

As our technical debt grows, we recognise that our code will become more difficult to work with. This means adding new features will get harder, take longer and introduce more bugs. If we compromise on quality to deliver something quickly, we will understand and provide the means to pay down the interest and the technical debt in the near, medium or longer term.

8 When do we undertake the work required by this policy?

  • 8.1 third party assessments take place at least annually;
  • 8.2 technical patching always occurs in a timely fashion; and,
  • 8.3 malware protection is regularly updated throughout the year.

9 Responsibility and review

  • 9.1 This Policy is supervised by the service manager whose role it is to ensure compliance and review;
  • 9.2 This QA policy shall be reviewed annually by the service manager and a summary of findings produced; and,
  • 9.3 Amendments to the policy arising from the annual review shall be implemented in a timely fashion and communicated to relevant colleagues by the service manager.