1 Simple Way Developers Can Improve Testability

Mar 12, 2024

It does not matter what automation testing tool or framework you use. If you want to deliver a quality product to your users, you should also want to give a testable product to your testers.

IDs

IDs are great; you probably use IDs for several things already. You might have anchor points throughout a page to give users a convenient place to jump to. You might have them on your forms or the fields inside them. You might use IDs for A/B testing your call to action or for applying styling. And that is exactly the problem with IDs: they serve too many purposes.

Test IDs

Adding Test IDs is the single best way to massively improve your testability, regardless of your automation test tool or framework.

What is a Test ID?

A Test ID is an HTML attribute (commonly data-test-id but there are many aliases) added to core elements to speed up and simplify testing.

<button data-test-id="login-button" type="submit">Login</button>
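
Because it's a plain HTML attribute, any tool that understands CSS selectors can target it directly. Here's a minimal sketch using nothing but the standard DOM API (no particular test framework assumed):

// Select purely by the Test ID attribute; no tags or classes involved
const loginButton = document.querySelector('[data-test-id="login-button"]');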

Test IDs can be added to any element, but if you're getting started, I would suggest the following:

  • Buttons

  • Links

  • Forms & Inputs

  • User Feedback (alerts, modals, form validation, loading spinners, etc.)

  • Menus

  • Lists

  • Checkout Summaries

How do Test IDs save engineering time?

Let's say a developer adds a "forgotten password" link to a login form; the resulting code might look like this:

<form class="login" action="/login" method="post">
  <h1 class="h3 mb-3 font-weight-normal">Please Login</h1>
  <label for="inputEmail" class="sr-only">Email address</label>
  <input type="email" id="inputEmail" class="form-control mb-2" placeholder="Email address" required autofocus />
  <label for="inputPassword" class="sr-only">Password</label>
  <input type="password" id="inputPassword" class="form-control mb-2" placeholder="Password" required />
  <button class="btn btn-lg btn-primary btn-block" type="submit">Login</button>
  <p class="mt-5 mb-3 text-muted">
    <a href="/forgotten-password">Forgot password?</a>
  </p>
</form>

An automation tester will look at this and have to create a selector. They might use a combination of tags and classes for the element and its parents. Nothing here is too tricky, but producing a unique selector on a large page often takes several minutes.

Now let's look again with Test IDs added:

<form data-test-id="login-form" class="login" action="/login" method="post">
  <h1 class="h3 mb-3 font-weight-normal">Please Login</h1>
  <label for="inputEmail" class="sr-only">Email address</label>
  <input data-test-id="login-username" type="email" id="inputEmail" class="form-control mb-2" placeholder="Email address" required autofocus />
  <label for="inputPassword" class="sr-only">Password</label>
  <input data-test-id="login-password" type="password" id="inputPassword" class="form-control mb-2" placeholder="Password" required />
  <button data-test-id="login-button" class="btn btn-lg btn-primary btn-block" type="submit">Login</button>
  <p class="mt-5 mb-3 text-muted">
    <a data-test-id="login-forgotten-password" href="/forgotten-password">Forgot password?</a>
  </p>
</form>

These Test IDs take a developer only seconds to add and a tester only seconds to use, so the engineering team as a whole has already saved several minutes per selector. But the real benefits are still to come.
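
As a rough sketch of "seconds to use", here's what a login test built purely on those Test IDs could look like. This is Playwright-style TypeScript; the path, credentials, and final assertion are placeholders you would adapt to your own app:

import { test, expect } from '@playwright/test';

test('user can log in', async ({ page }) => {
  // Placeholder path; assumes a baseURL is configured for the environment
  await page.goto('/login');

  // Every selector targets a Test ID, never a tag or a class
  await page.locator('[data-test-id="login-username"]').fill('user@example.com');
  await page.locator('[data-test-id="login-password"]').fill('example-password');
  await page.locator('[data-test-id="login-button"]').click();

  // Placeholder assertion: swap in whatever your app shows after a successful login
  await expect(page.locator('[data-test-id="login-form"]')).toBeHidden();
});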

How do Test IDs improve test quality?

The quality of the selector is also far better; here's a comparison using the example above:

[data-test-id="login-forgotten-password"] // <-- Robust Test ID

form.login > p.text-muted > a // <-- Flaky CSS Selector

Side Note: DoesQA has a specific selector type for Test IDs 😉

Both of the selectors above were unique at the time they were created. But whereas the Test ID should remain unique permanently, the CSS selector will stop being unique as soon as another link is added to the login form; here's an example:

<form class="login" action="/login" method="post">
  <h1 class="h3 mb-3 font-weight-normal">Please Login</h1>
  <!-- ... -->
  <button class="btn btn-lg btn-primary btn-block" type="submit">Login</button>
  <p class="mt-5 mb-3 text-muted">
    <a href="/forgotten-password">Forgot password?</a>
    <a href="/register">Create Account</a>
  </p>
</form>

With the addition of the new link, the old CSS selector no longer identifies a single element, and previously passing tests will start failing. Revisiting those tests to repair the selector is wasted time that could be spent expanding test coverage elsewhere.
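
To make the failure mode concrete, here's a small sketch of what each selector now resolves to against the updated markup (assuming the Test IDs from the earlier version are still in place on the first link):

// The old CSS selector now matches two links: strict tools fail immediately,
// while lenient ones may silently interact with the wrong element
console.log(document.querySelectorAll('form.login > p.text-muted > a').length); // 2

// The Test ID selector keeps resolving to exactly one element
console.log(document.querySelectorAll('[data-test-id="login-forgotten-password"]').length); // 1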

How do Test IDs make testing more efficient?

We have already examined how Test IDs can save some time; with some imagination, we can extrapolate this to substantial efficiency gains in the long term. But there's a much more direct way to show that Test IDs will bring massive time savings.

What happens when the login form gets a face-lift? As it's such an integral area of your service, you have hundreds of tests flowing through it. Your functional regression tests shouldn't care if your form inputs now look slightly different (that's the job of your Visual Regression tests).

The simple solution is that your existing Test IDs can be reused. While development is ongoing, you could even have the same test pack running on different environments, one with the new form and another with the old. Here's a new form reusing the Test IDs:

<form data-test-id="login-form" class="max-w-md mx-auto mt-8 bg-white shadow-md rounded px-8 pt-6 pb-8 mb-4" action="/login" method="post">
  <div class="mb-4">
    <label class="block text-gray-700 text-sm font-bold mb-2" for="username">Username</label>
    <input data-test-id="login-username" class="shadow appearance-none border rounded w-full py-2 px-3 text-gray-700 leading-tight focus:outline-none focus:shadow-outline" id="username" type="text" placeholder="Username" required />
  </div>
  <div class="mb-6">
    <label class="block text-gray-700 text-sm font-bold mb-2" for="password">Password</label>
    <input data-test-id="login-password" class="shadow appearance-none border rounded w-full py-2 px-3 text-gray-700 mb-3 leading-tight focus:outline-none focus:shadow-outline" id="password" type="password" placeholder="Password" required />
  </div>
  <div class="flex items-center justify-between">
    <button data-test-id="login-button" class="bg-blue-500 hover:bg-blue-700 text-white font-bold py-2 px-4 rounded focus:outline-none focus:shadow-outline" type="submit">Sign In</button>
    <a data-test-id="login-forgotten-password" class="inline-block align-baseline font-bold text-sm text-blue-500 hover:text-blue-800" href="/forgotten-password">Forgot Password?</a>
  </div>
</form>
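
As a sketch of the "same test pack, different environments" idea, here's how that could be wired up with Playwright-style projects. The project names and URLs are placeholders; any runner with per-environment configuration works the same way:

import { defineConfig } from '@playwright/test';

// Both projects run the exact same spec files; only the target environment differs.
// Because the specs select elements by Test ID, they pass against the old form
// and the redesigned form alike.
export default defineConfig({
  projects: [
    { name: 'current-login-form', use: { baseURL: 'https://current.example.com' } },
    { name: 'redesign-preview', use: { baseURL: 'https://preview.example.com' } },
  ],
});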

More Testing, Less Coding

Automation Testing is the overlapping area on a Venn diagram between Coding and Testing. You want to save manual testing time by running high-quality automation tests with broad coverage, but traditionally, you've needed coders to write these tests while wanting them designed with a tester's mind.

Times are changing, and creating a high-quality test pack without code is both possible and desirable. However, building page objects and elements has traditionally required some knowledge of CSS or XPath. With Test IDs inserted by developers, those coding skills become entirely unnecessary, and the hand-crafted selectors they replace are slower to write and less reliable anyway.

Note to Devs

Howdy 👋, adding Test IDs to your department's code standards would be fantastic. It will add minimal time to your development work, and the automation test team will repay you with faster feedback, broader test coverage, and greater overall confidence in your work.

Thank you,

What is the best way to get started using Test IDs?

While writing this, I added 97 Test IDs to the DoesQA app by first adding the Test Selectors ESLint Plugin, running the linter, and working through the warnings. It only took 20 minutes to do everything, and now I can switch the plugin's rules from warnings to errors to ensure every new commit adheres to them as part of our in-house code standards.

I recommend this approach.
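
If you want to follow the same route, a minimal configuration sketch could look something like the following; check the preset and rule names against the plugin's README before copying it:

// .eslintrc.cjs -- a minimal sketch; verify preset and rule names against the plugin's docs
module.exports = {
  plugins: ['test-selectors'],
  // The recommended preset reports missing Test IDs as warnings while you work through them
  extends: ['plugin:test-selectors/recommended'],
  rules: {
    // Once the backlog of warnings is cleared, individual rules can be promoted to
    // errors so new code cannot land without Test IDs, e.g. (hypothetical rule name):
    // 'test-selectors/button': 'error',
  },
};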

Now give these buttons a good test 😜

Want Better Automation Tests?

High-quality test coverage with reliable test automation.