In my previous post, we discussed areas of test automation that can deliver long-lasting, reusable results within a manageable amount of time:
- Check each entry field with maximum-length string data.
- Check each required field with null data.
- Challenge each field's type-checking validation with data of a type other than what is allowed.
- Test with all data values simultaneously at their maximum or most stressful settings.
- Verify that a nominal data value is accepted, stored, retrieved, and displayed.
- Validate password handling.
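To make these checks concrete, here is a minimal sketch in Python. The `validate_username` function and its rules (required, string type, maximum of 20 characters) are hypothetical stand-ins for a real application's field rules; the assertions map one-to-one onto the checks listed above.

```python
# Hypothetical field rule: username is required, must be a string,
# and may be at most MAX_LEN characters long.
MAX_LEN = 20

def validate_username(value):
    """Return True if the value passes the field's validation rules."""
    if value is None:                  # required field: null is rejected
        return False
    if not isinstance(value, str):     # type check: only strings allowed
        return False
    return len(value) <= MAX_LEN       # maximum-length rule

# Boilerplate tests, one per check in the list above:
assert validate_username("a" * MAX_LEN)            # maximum-length string accepted
assert not validate_username("a" * (MAX_LEN + 1))  # over-length string rejected
assert not validate_username(None)                 # required field with null data
assert not validate_username(12345)                # wrong data type rejected
assert validate_username("alice")                  # nominal value accepted
```

Each assertion is a candidate for a reusable, generic test that applies to almost any string entry field.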
Let’s discuss the next steps.
Expansion and Improvements
Test-design specification templates should be expanded beyond the typical empty sections that testers fill in; they should also include generic validation rules, test-case checklists, common failure modes, and libraries of generic tests. The templates should cover many of the usual tests for application features, so that these do not have to be added or reinvented each time. Part of this generic design should be a set of "boilerplate" tests: the first tests every testing effort should include, even before more sophisticated, instance-specific tests are considered. In this way, generic test-design specifications allow any tester, such as a new contingent staff member, to perform a range of generic tests even before any customized, application-specific test planning.
- Identify a record or field upon which to operate (based on input name and parameter info).
- Verify nonexistence.
- Add item.
- Read and verify existence of identical unchanged data.
- Modify and verify matching modified data.
- Delete and verify removal of item.
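The steps above can be sketched as one generic, reusable routine. The in-memory `RecordStore` here is a hypothetical stand-in; a real test would drive the application's own storage API through the same sequence.

```python
# Hypothetical in-memory store standing in for the system under test.
class RecordStore:
    def __init__(self):
        self._records = {}

    def exists(self, key):
        return key in self._records

    def add(self, key, data):
        self._records[key] = data

    def read(self, key):
        return self._records[key]

    def modify(self, key, data):
        self._records[key] = data

    def delete(self, key):
        del self._records[key]

def crud_check(store, key, original, modified):
    """Generic CRUD test pattern; returns True if every step verifies."""
    assert not store.exists(key)           # verify nonexistence
    store.add(key, original)               # add item
    assert store.read(key) == original     # read back identical unchanged data
    store.modify(key, modified)            # modify item
    assert store.read(key) == modified     # verify matching modified data
    store.delete(key)                      # delete item
    assert not store.exists(key)           # verify removal
    return True

assert crud_check(RecordStore(), "cust-42", {"name": "Ada"}, {"name": "Grace"})
```

Because the sequence is the same regardless of the record type, the pattern can be applied to any identifiable record or field with only the identification step customized.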
- Identify item with type characteristics (for example, a data field) at an abstract level. This should not be limited to simple data types, but should include common business data types (for example, telephone number, address, ZIP code, customer, Social Security number, calendar date, and so on).
- Enumerate the generic business rules that are associated with the type.
- Define equivalence partitions and boundaries for the values for each business rule.
- Select test-case values from each equivalence class.
- Generate randomized item from equivalence classes.
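The partitioning steps above can be sketched for a single hypothetical business rule, here "order quantity must be between 1 and 99" (the rule and its limits are illustrative, not from a real application):

```python
import random

# Hypothetical business rule: order quantity must be between LOW and HIGH.
LOW, HIGH = 1, 99

def is_valid_quantity(n):
    return LOW <= n <= HIGH

# Equivalence partitions for the rule: below range, in range, above range.
partitions = {
    "below": range(LOW - 10, LOW),        # invalid: too small
    "valid": range(LOW, HIGH + 1),        # valid values
    "above": range(HIGH + 1, HIGH + 11),  # invalid: too large
}

# Boundary values: the edges on either side of each partition limit.
boundaries = [LOW - 1, LOW, HIGH, HIGH + 1]
assert [is_valid_quantity(b) for b in boundaries] == [False, True, True, False]

# Select a randomized test-case value from each equivalence class;
# every member of a class should produce the same verdict.
for name, part in partitions.items():
    sample = random.choice(list(part))
    assert is_valid_quantity(sample) == (name == "valid")
```

The same structure applies to richer business types (telephone numbers, ZIP codes, calendar dates); only the rule, partitions, and boundaries change.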
- Reduce ad-hoc reinvention of test designs by capturing and reusing common test patterns.
- Recognize and abstract test issues. Capture them in a form that can be reapplied as a higher-level chunk of activity instead of always dropping down to the detailed instance level.
- Package the results and experience of past test-design work into a reusable deliverable that can be applied to similar test efforts.
- Create checklists of generalized “things to test” that will be a resource for reuse on future versions or applications. These generalized tests should augment the separate list of feature details (business rules) and associated tests that truly are unique to a specific project.
- Catalogue common and generic failure modes as items to verify, and make them part of test-design templates.
- Collect and document explicit test coverage categories, so that they can be incorporated into analysis and test design.
- Ensure that each category contains as complete a set of tests as possible to ensure coverage of the category. Divide the results into generic and unique subsets of tests. Reuse the generic portions of these lists for future efforts.
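One way to capture such a catalog in reusable form is as a simple data structure, with each coverage category split into generic and project-unique subsets. The category names and checklist entries below are illustrative examples, not items from the original article:

```python
# Hypothetical catalog of coverage categories; the "generic" subsets are
# reused across projects, while "unique" entries are filled in per project.
CATALOG = {
    "field-validation": {
        "generic": ["max-length input", "null in required field",
                    "wrong data type", "nominal value round-trip"],
        "unique": [],
    },
    "crud": {
        "generic": ["verify nonexistence", "add item", "read unchanged data",
                    "modify and verify", "delete and verify removal"],
        "unique": [],
    },
}

def checklist_for(category, project_items=()):
    """Build a checklist: reusable generic checks plus project-specific ones."""
    return CATALOG[category]["generic"] + list(project_items)

# A new test effort starts from the generic list rather than a blank page:
assert "max-length input" in checklist_for("field-validation")
assert len(checklist_for("crud", ["audit-log entry created"])) == 6
```

Keeping the generic and unique subsets separate makes it obvious which parts of a past effort can be carried forward to the next one.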
Testers should build test designs around reusable test patterns that are common to a large number of application-testing problems. Including these patterns as a standard part of the test-design template reduces time spent on test design by eliminating the need to start from a blank specification. Reusing the patterns has the added benefit of codifying expert-tester knowledge, increasing the likelihood of catching common failures.
The introduction of automated testing is itself a project, and should be treated as one. Plan it as you would any major project. Make it a first-level item. Involve all members of your team, from developers to management. Expect discussion, setbacks, and a few hefty headaches as you grow.
SOUND OFF: Have you implemented automated testing in your organization? What hurdles did you face?