
Five Reasons for Doing Data-Driven Testing

In preparation for Nick Olivo’s webinar on building frameworks for data-driven testing, which begins at 2pm EST on November 15, I thought it would be a good idea to examine the importance and popularity of this particular testing method.

What is Data-Driven Testing?

Picture the application under test (AUT) as a black box with an input interface and an output interface. Typically there is some user interface where the user of the AUT can enter data. This data can take the form of login information, credit card details, a list of items to purchase, membership status, a product code, dates, coordinates, etc. Once the data is entered into the black box, certain operations are performed as part of the application's functionality. Exactly which operation is performed on a given data set often depends on the values that are entered. For example, in a claim-processing application you might enter the type of coverage, the type of expense, and the date of the expense; based on this information the application calculates whether or not you are eligible for a refund. Another example is displaying user-interface elements based on the data the user enters.

The operation that is performed on a given data set is decided by the application logic, or the business rules behind it. You can think of this logic as a controller for a railroad track: it directs the train – the operations for a given data set – to a certain branch of the track. When this happens, there are a number of things we'd want to test, such as:

  • Did the application produce any result at all?
  • Did it raise an exception?
  • Is this result correct?
  • Was the correct operation performed?
  • Etc.

By changing the data sent to the interface we exercise the logic corresponding to business rules (requirements) of the application, and then we look at the result. Since the application is controlled through data sent to its interface, we refer to this as data-driven testing. Here are five reasons why you may want to give data-driven testing a shot.
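
As a minimal sketch, the whole idea fits in a few lines. Everything here is hypothetical – the function stands in for the AUT and the business rule is invented for illustration – but it shows how the data, not the script, decides which logic branch gets exercised:

```python
# A minimal, hypothetical sketch of data-driven testing. A real AUT
# would be driven through its UI or API rather than a direct call.

def is_eligible_for_refund(coverage, expense_type, amount):
    """Invented business rule: full coverage refunds any expense;
    basic coverage refunds only medical expenses under 500."""
    if coverage == "full":
        return True
    if coverage == "basic":
        return expense_type == "medical" and amount < 500
    return False

# Each row is one test case: the data drives which branch runs.
test_data = [
    ({"coverage": "full",  "expense_type": "dental",  "amount": 900}, True),
    ({"coverage": "basic", "expense_type": "medical", "amount": 100}, True),
    ({"coverage": "basic", "expense_type": "dental",  "amount": 100}, False),
    ({"coverage": "none",  "expense_type": "medical", "amount": 100}, False),
]

for inputs, expected in test_data:
    result = is_eligible_for_refund(**inputs)
    assert result == expected, f"Failed for {inputs}"
print("All cases passed")
```

Each new row in the table is a new test; the loop itself never changes.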

1. Multiply the value of your existing tests

If you already have tests for your application and your test scripts use specific data, you can parameterize the scripts and feed the application under test from an external data source. Once you have a framework for managing the data and executing tests based on it, you can create new tests simply by adding records to your test data repository. That's a very efficient way of creating new tests, and it increases the value of your existing tests while improving coverage of your application. If you do automated testing, data-driven testing is practically a must-have.
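
For instance, a parameterized script might read its rows from an external source. The CSV content and the login() stand-in below are invented for illustration; in a real setup the data would live in a spreadsheet, CSV file, or database maintained alongside the tests:

```python
import csv
import io

# Hypothetical external data source, inlined here to keep the sketch
# self-contained. In practice this would be a file or a database table.
CSV_DATA = """username,password,expected
alice,secret123,ok
bob,wrongpass,denied
"""

def login(username, password):
    # Stand-in for driving the AUT; the credentials are invented.
    return "ok" if (username, password) == ("alice", "secret123") else "denied"

# One parameterized script; adding a row to the data adds a test.
for row in csv.DictReader(io.StringIO(CSV_DATA)):
    actual = login(row["username"], row["password"])
    assert actual == row["expected"], f"Failed for {row['username']}"
print("login data-driven tests passed")
```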

2. Combine positive and negative tests

In addition to increasing the number of your tests and the corresponding test coverage, you can easily build tests that enter invalid data into the interface to see how the application handles it. Is it going to provide the correct feedback to the user, or is it going to perform some wild operation and log you in to the wrong account? You won't know for sure until you try. You can create “positive” tests by using valid data and “negative” tests by using invalid, or incorrect, data and observing how the application reacts.
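
One way to mix positive and negative cases in a single table is to give each row an expected-exception column. parse_quantity() below is a hypothetical input handler, not part of any real AUT:

```python
# Hypothetical input handler standing in for the AUT's validation logic.
def parse_quantity(text):
    value = int(text)  # raises ValueError on non-numeric input
    if value <= 0:
        raise ValueError("quantity must be positive")
    return value

# (input, expected result, expected exception) – one table, both kinds
# of test: rows with exc=None are positive, the rest are negative.
cases = [
    ("3",   3,    None),        # positive test: valid data
    ("0",   None, ValueError),  # negative test: out of range
    ("abc", None, ValueError),  # negative test: malformed input
]

for text, expected, exc in cases:
    if exc is None:
        assert parse_quantity(text) == expected
    else:
        try:
            parse_quantity(text)
            raise AssertionError(f"expected {exc.__name__} for {text!r}")
        except exc:
            pass
print("positive and negative cases passed")
```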

3. Increase test coverage

Clearly, if you have more tests, it's likely that you'll increase coverage of the application code. But unless you're systematic about it, adding tests may not have the desired effect. If you monitor the code coverage of your application as you run your tests – by using a tool like AQtime, for example – you can see what portion of the code is executed for a given test, and use that information to create a test data set with the sole purpose of maximizing code coverage. That's an excellent way to reach certain corner cases that you would not otherwise test.
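
As a rough illustration of the idea – dedicated tools like AQtime do this properly – the standard library's sys.settrace can record which lines a test exercises, so you can check whether new data rows actually reach new code. The classify() function is invented for illustration:

```python
import sys

# Hypothetical function under test with three branches.
def classify(n):
    if n < 0:
        return "negative"
    if n == 0:
        return "zero"
    return "positive"

executed = set()  # line numbers seen inside classify()

def tracer(frame, event, arg):
    # Record every executed line of classify(); ignore other frames.
    if event == "line" and frame.f_code.co_name == "classify":
        executed.add(frame.f_lineno)
    return tracer

sys.settrace(tracer)
classify(5)                  # exercises only the "positive" branch
sys.settrace(None)
first_run = len(executed)

sys.settrace(tracer)
classify(-1); classify(0)    # data chosen to hit the remaining branches
sys.settrace(None)

print(first_run < len(executed))  # the extra rows reached new lines
```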

4. Validate (some) requirements

Above, I mentioned business rules and the application logic corresponding to them. Often, these are synonyms for important requirements. For example, is the application going to issue a refund to an insured party whose coverage doesn't include that type of expense? Data-driven testing alone likely won't be sufficient to validate every requirement, but it can be a great first step if you are deliberate about it when you create your data-driven tests.
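
One lightweight way to be deliberate about it is to tag each data row with the requirement it exercises, which yields a rough traceability report as a by-product of the test run. The requirement IDs and the refund_allowed() stand-in below are invented for illustration:

```python
# Hypothetical data rows, each tagged with the requirement it covers.
rows = [
    {"req": "REQ-12", "coverage": "full",  "expected": True},
    {"req": "REQ-12", "coverage": "none",  "expected": False},
    {"req": "REQ-31", "coverage": "basic", "expected": False},
]

def refund_allowed(coverage):
    # Stand-in for the real business rule in the AUT.
    return coverage == "full"

covered = set()
for row in rows:
    if refund_allowed(row["coverage"]) == row["expected"]:
        covered.add(row["req"])

print(sorted(covered))  # requirements whose rows all behaved as expected
```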

5. Improve the architecture of your tests

Managing test data separately from the tests may force you to change the way you look at your test suite. Test scripts can start looking like use cases, and test data can extend the reach of those use cases to multiple business rules related to them. The use cases for which you have scripts may be very important, or they may be edge cases. Your tests can help you generate more test data for other parts of your process, especially if you work with APIs and service-oriented architectures.

Click below to join Nick Olivo on November 15th to discuss how to build frameworks for data-driven testing and how to use them for test automation with TestComplete.

Comments

1. I find that the spreadsheets used in data-driven automated tests provide an excellent medium to communicate with the business. My business users were much more fluent in ‘spreadsheet’ than in ‘user story’.

   Data-driven automated tests are also a natural evolution of a test matrix. And if your application doesn’t match the math in the spreadsheet, you’ll need to change your application (or change your spreadsheet).

2. Goran Begic says:

   Thank you very much, Jon. You bring up an excellent point. Data often ‘is’ the story. Spreadsheet ‘language’ can be clear and precise, and if the data is organized well it can also be much easier to review than the narrative around it.
