Keyword or Table Driven Test Automation
Nearly everything discussed so far in defining our ideal automation framework describes the best features of "keyword driven" test automation, sometimes also called "table driven" test automation. It is typically an application-independent automation framework designed to process our tests. These tests are developed as data tables using a keyword vocabulary that is independent of the test automation tool used to execute them. This keyword vocabulary should also be suitable for manual testing, as you will soon see.
Action, Input Data, and Expected Result ALL in One Record:
The data table records contain the keywords that describe the actions we want to perform. They also provide any additional data needed as input to the application, and where appropriate, the benchmark information we use to verify the state of our components and the application in general.
For example, to verify the value of a user ID textbox on a login page, we might have a data table record as seen in Table 1:
| WINDOW    | COMPONENT     | ACTION      | EXPECTED VALUE |
| LoginPage | UserIDTextbox | VerifyValue | "MyUserID"     |
Table 1
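To make the record format concrete, the following is a minimal sketch of how such pipe-delimited table records might be read into a simple structure for processing. The file layout, the TestRecord fields, and the load_table helper are illustrative assumptions, not features of any particular tool.

```python
from dataclasses import dataclass

@dataclass
class TestRecord:
    window: str      # logical window name, e.g. "LoginPage"
    component: str   # logical component name, e.g. "UserIDTextbox"
    action: str      # action keyword, e.g. "VerifyValue"
    expected: str    # input data or expected value, e.g. "MyUserID"

def load_table(path: str) -> list[TestRecord]:
    """Parse each pipe-delimited line of a test table into a TestRecord (sketch only)."""
    records = []
    with open(path, encoding="utf-8") as table:
        for line in table:
            fields = [f.strip().strip('"') for f in line.split("|") if f.strip()]
            if len(fields) == 4 and fields[0].upper() != "WINDOW":  # skip header row
                records.append(TestRecord(*fields))
    return records
```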
Reusable Code, Error Correction and Synchronization:
Application-independent component functions are developed that accept application-specific variable data. Once these component functions exist, they can be used on each and every application we choose to test with the framework.
Figure 2 presents pseudo-code that would interpret the data table record shown in Table 1 (and repeated in Table 2). In our design, the primary loop reads a record from the data table, performs some high-level validation on it, sets focus on the proper object for the instruction, and then routes the complete record to the appropriate component function for full processing. The component function is responsible for determining what action is being requested and for routing the record further based on that action.
Framework Pseudo-Code

Primary Record Processor Module:
    Verify "LoginPage" exists. (Attempt recovery if not.)
    Set focus to "LoginPage".
    Verify "UserIDTextbox" exists. (Attempt recovery if not.)
    Find the "Type" of component "UserIDTextbox". (It is a Textbox.)
    Call the module that processes ALL Textbox components.

Textbox Component Module:
    Validate the action keyword "VerifyValue".
    Call the Textbox.VerifyValue function.

Textbox.VerifyValue Function:
    Get the text stored in the "UserIDTextbox" Textbox.
    Compare the retrieved text to "MyUserID".
    Record our success or failure.
Figure 2
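As a rough illustration only, the routing described in Figure 2 might look something like the following in Python. The gui driver object and its methods (exists, set_focus, component_type, get_text), along with the log object, are hypothetical stand-ins for whatever automation tool and reporting facility are actually in use.

```python
def process_record(gui, record, log):
    """Primary record processor: validate the record, set focus, and route it."""
    if not gui.exists(record.window):                      # attempt recovery if not
        log.failure(f"{record.window} not found")
        return
    gui.set_focus(record.window)
    if not gui.exists(record.window, record.component):    # attempt recovery if not
        log.failure(f"{record.component} not found")
        return
    ctype = gui.component_type(record.window, record.component)   # e.g. "Textbox"
    COMPONENT_MODULES[ctype](gui, record, log)

def textbox_module(gui, record, log):
    """Textbox component module: validate the action keyword and dispatch."""
    if record.action == "VerifyValue":
        textbox_verify_value(gui, record, log)
    else:
        log.failure(f"Unknown Textbox action: {record.action}")

def textbox_verify_value(gui, record, log):
    """Compare the textbox text to the expected value and record the result."""
    actual = gui.get_text(record.window, record.component)
    if actual == record.expected:
        log.success(f"{record.component} value is {record.expected!r}")
    else:
        log.failure(f"{record.component} value {actual!r} != {record.expected!r}")

# One entry per supported component type; only Textbox is shown in this sketch.
COMPONENT_MODULES = {"Textbox": textbox_module}
```

Processing a full test table is then simply a loop over the loaded records, calling process_record for each one; that loop is the "primary loop" of the framework.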
Test Design for Man and Machine, With or Without the Application:
Table 2 reiterates the actual data table record run by the automation framework above:
| WINDOW    | COMPONENT     | ACTION      | EXPECTED VALUE |
| LoginPage | UserIDTextbox | VerifyValue | "MyUserID"     |
Note how the record uses a vocabulary that can be processed by both man and machine. With minimal training, a human tester can understand the record instruction, deciphered in Figure 3:
On the LoginPage, in the UserIDTextbox, Verify the Value is "MyUserID".

Figure 3
Once they learn or can reference this simple vocabulary, testers can start designing tests without knowing anything about the automation tool used to execute them.
Another advantage of the keyword driven approach is that testers can develop tests without a functioning application as long as preliminary requirements or designs can be determined. All the tester needs is a fairly reliable definition of what the interface and functional flow is expected to be like. From this they can write most, if not all, of the data table test records.
Sometimes it is hard to convince people that this advantage is realizable. Yet, take our login example from Table 2 and Figure 3. We do not need the application to construct any login tests. All we have to know is that we will have a login form of some kind that will accept a user ID, a password, and contain a button or two to submit or cancel the request. A quick discussion with development can confirm or modify our determinations. We can then complete the test table and move on to another.
We can develop other tests similarly for any part of the product for which we can obtain or deduce reliable information. In fact, if in such a position, testers can actually help guide the development of the UI and flow, providing developers with upfront input on how users might expect the product to function. And since the test vocabulary we use is suitable for both manual and automated execution, designed testing can commence immediately once the application becomes available.
It is, perhaps, important to note that this does not suggest that these tests can be executed automatically as soon as the application becomes available. The test record in Table 2 may be perfectly understood and executable by a person, but the automation framework knows nothing about the objects in this record until we can provide that additional information. That is a separate piece of the framework we will learn about when we discuss application mapping.
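For illustration only, that additional information might resemble the following: a lookup from the logical names used in the test tables to tool-specific recognition strings. Both the structure and the recognition syntax shown here are assumptions; each automation tool defines its own.

```python
# Hypothetical application map fragment: logical names from the test tables on
# the left, tool-specific recognition strings (illustrative syntax) on the right.
APP_MAP = {
    "LoginPage": {
        "LoginPage":     "Type=Window;Caption=Login",
        "UserIDTextbox": "Type=EditBox;Name=userId",
    },
}
```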
Findings:
The keyword driven automation framework is initially the hardest and most time-consuming data driven approach to implement. After all, we are trying to fully insulate our tests from both the many failings of the automation tools, as well as changes to the application itself.
To accomplish this, we are essentially writing enhancements to many of the component functions already provided by the automation tool, such as error correction, error prevention, and enhanced synchronization.
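As a minimal sketch of the kind of enhancement meant here, the helper below synchronizes on a component by polling for its existence with a bounded retry instead of failing on the first miss. The gui.exists call and the timeout values are assumptions carried over from the earlier sketch.

```python
import time

def wait_for_component(gui, window, component, timeout=30.0, interval=0.5):
    """Poll until the component exists or the timeout expires; return the outcome."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if gui.exists(window, component):
            return True
        time.sleep(interval)
    return False
```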
Fortunately, this heavy initial investment is mostly a one-shot deal. Once in place, keyword driven automation is arguably the easiest of the data driven frameworks to maintain and perpetuate, providing the greatest potential for long-term success.
Additionally, there may now be commercial products suitable for your needs that can decrease, but not eliminate, much of the up-front technical burden of implementing such a framework. This was not the case just a few years ago.