Tuesday, May 6, 2008

Automated Testing Methodology

Commonly Used Methodologies:

Record/Playback Method

· The Test Tool's Recording Mechanism records keystrokes, mouse actions, verification lists, etc.

Functional Decomposition Method

· The Application is broken down into Business Functions

· Automated Scripts are developed using the Tool's Scripting Language to perform those functions

· Data-Driven Process using Input & Verification Data Files

Test-Plan Driven Method

· The Test Cases are broken down into 'generic' testing actions

· Scripts are developed using the Tool's Scripting Language to perform these 'generic' actions

· Input File controls the processing as well as providing the Input and Verification data

Record/Playback Method

Advantages:

· Easy to Use

· Tester just starts recording and executes the Manual Test Case

Disadvantages:

· Reliability

The tester can make errors, which are then recorded, and have to be corrected

Failure on replay due to timing issues

Failure on replay due to events that occur that were not recorded (pop-ups, messages, etc.)

· Maintenance

Scripts contain hard-coded data that must be updated if the Application or the Data changes

Scripts must be enhanced or corrected after recording

Scripts & verification need to be re-recorded if the Application processing changes

Functional Decomposition Method

Business Functionality is broken down into its fundamental operations.

Test Case Example: "Post a Payment and Verify Account Data is Updated Correctly"

This could be broken down into the following operations:

Navigation: Access Payment Screen from Main Menu

Post a Payment

Verify Current Balance Updated

Navigation: Return to Main Menu

Navigation: Access Account Record

Verify Account Data Updated (Balance, Next Payment Date, etc.)

Navigation: Return to Main Menu

Using this breakdown, we can extract Business Functionality:

Navigation Routine(s)

Payment Posting & Verification Routine

Account Data Verification Routine

Routines can be Data-Driven, using Input and Verification files.

Numerous Test Cases can be automated by simply adding Input and Verification files for each.
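The data-driven idea above can be sketched in Python as a stand-in for the tool's scripting language. The Screen class, field names, and file layout below are hypothetical illustrations, not the actual tool API:

```python
import csv

class Screen:
    """Hypothetical stand-in for the test tool's GUI interface."""
    def __init__(self):
        self.fields = {}
    def set(self, field, value):
        self.fields[field] = value
    def get(self, field):
        return self.fields.get(field, "")
    def press(self, button):
        # In the real tool this would drive the application; here we
        # simulate posting a payment updating the current balance.
        if button == "Post":
            self.fields["current_balance"] = self.fields.get("amount", "")

def post_payment(screen, row):
    # Business Function routine, data-driven from one input record.
    screen.set("account", row["account"])
    screen.set("amount", row["amount"])
    screen.press("Post")

def verify_balance(screen, expected):
    # Verification routine: compare screen data with the verification record.
    return screen.get("current_balance") == expected["current_balance"]

def run_test_case(screen, input_file, verify_file):
    # The same routine serves many Test Cases; only the Input and
    # Verification files change.
    with open(input_file, newline="") as f_in, open(verify_file, newline="") as f_ver:
        inputs = list(csv.DictReader(f_in))
        expected = list(csv.DictReader(f_ver))
    results = []
    for row, exp in zip(inputs, expected):
        post_payment(screen, row)
        results.append(verify_balance(screen, exp))
    return results
```

Adding a new Test Case then means adding a new pair of input/verification files, with no script changes.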

Functional Decomposition Method

Architecture:

Driver Scripts

· Perform Initialization (if required), then call Test Case Scripts in the desired order

· Can be arranged to account for Test Case dependencies (If Test 1 fails, skip to Test 4, etc.)
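The dependency handling described above ("If Test 1 fails, skip to Test 4") can be sketched as follows; a minimal Python illustration in which the Test Case scripts are hypothetical callables returning pass/fail:

```python
def run_driver(test_scripts, skip_to_on_failure=None):
    """Driver sketch: run Test Case scripts in order; if a script
    fails and has an entry in skip_to_on_failure, jump ahead to the
    named script instead of running its dependents."""
    skip_to_on_failure = skip_to_on_failure or {}
    names = list(test_scripts)          # insertion order (Python 3.7+)
    results = {}
    i = 0
    while i < len(names):
        name = names[i]
        passed = test_scripts[name]()   # call the Test Case script
        results[name] = passed
        if not passed and name in skip_to_on_failure:
            i = names.index(skip_to_on_failure[name])  # skip dependents
        else:
            i += 1
    return results
```

Here a failure of TC1 would skip its dependents TC2 and TC3 and resume at TC4.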

Test Case Scripts

· Call Business Function Scripts to perform the required Test Case actions and verifications

Business Function Scripts

· Perform specific Business Functions within the application

· Call Subroutine Scripts and User-Defined Functions to perform specific actions

Subroutine Scripts

· Perform application-specific tasks required by two or more Business Function Scripts

User-Defined Functions

· Perform general tasks that can be used by any number of scripts

· Examples: Navigation, entering data using an Input File, verifying data using a Verification File, etc.

Advantages:

Maintenance

· Modular design: If a Business Function changes, only 1 or 2 scripts must be modified

· Complex Test Cases can be constructed by calling Business Function Scripts from a Main Routine

· Data-Driven: Script can be used for many Test Cases by using different input/verification files

Reliability

· Tester error is eliminated, as is "scripter error" after the script has been properly coded and tested

· Unexpected events (pop-ups, messages, etc.) can be anticipated and coded for

Disadvantages:

Maintenance

· Each Business Function requires a script. There may be hundreds of Business Functions

· Changes in Test Cases require updates to several sets of input/verification files for each Test Case

· Format of input/verification records must be strictly adhered to or the tests will fail

· Testers must maintain the input/verification records as well as the Test Case documentation

Totally Data-Driven Method

Testing Activity is broken down into its fundamental actions.

Examples: Data Entry

Select item in list_box/combo_box

Set a radio_button on or off

Set check_box on or off

Set a spin_control to a value

Enter text in an edit_box or field

Actions

Press a push_button or key

Select a menu_item

Select a tab

Verification

Verify correct Screen or Window is displayed

Verify Data (corollary to Data Entry)

Verify Field or Object Attributes

Each Testing Action is Associated with a Key-Word

Each Key-Word is Associated with a Utility Script

Data Entry Key-Word: "Enter:" Utility Script: Enter()

Action Key-Word: "Action:" Utility Script: Action()

Verify Window Key-Word: "Verify:" Utility Script: Verify()

Verify Data Key-Word: "Verify_Data:" Utility Script: Ver_Data()

Verify Attributes Key-Word: "Verify_Attributes:" Utility Script: Ver_Attr()

A Spreadsheet can be used for Input to this process:

Key-Words are placed in Column-1

Parameters or Field/Object names are placed in Column-2

Data or Field/Object names are placed in Column-3

Column 4 is used for comments

How it Works:

Spreadsheet is saved as a tab-delimited (text) file

A Controller script reads and processes the tab-delimited file

Switch/Case is used to match on the Column-1 Key-Word

A "list" is created from the remaining columns (Column-2~Column-3|Column-2~Column-3|etc.)

This continues until a blank line is reached

The Controller script then calls the Utility Script associated with the Key-Word, passing it the created list.

The Utility Script processes the input list from the Controller

Splits the input list into an array

Processes the Column-2 & Column-3 data in much the same way as the Controller Script

Returns to Controller script when finished

A Driver script processes multiple Test Cases, calling the Controller script

The File Name of the tab-delimited text file representing the Test Case is passed to the Controller
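The Controller logic described above can be sketched as follows: a minimal Python illustration with hypothetical Utility Script handlers, assuming the column layout given earlier (Key-Word in Column-1, parameters/fields in Column-2, data in Column-3):

```python
def enter(pairs):
    # Utility Script sketch for the "Enter:" Key-Word; in the real
    # tool this would set each field to its value.
    return ["set %s=%s" % (field, value) for field, value in pairs]

def action(pairs):
    # Utility Script sketch for the "Action:" Key-Word.
    return [("do %s %s" % (field, value)).strip() for field, value in pairs]

# Switch/Case equivalent: Column-1 Key-Word -> Utility Script
UTILITIES = {"Enter:": enter, "Action:": action}

def controller(lines):
    """Read tab-delimited test-case lines: match on the Column-1
    Key-Word, collect (Column-2, Column-3) pairs until a blank line,
    then call the associated Utility Script with that list."""
    log = []
    i = 0
    while i < len(lines):
        keyword = lines[i].split("\t")[0]
        pairs = []
        while i < len(lines) and lines[i].strip():
            cols = lines[i].split("\t")
            if len(cols) > 1:           # assumes well-formed rows
                pairs.append((cols[1], cols[2] if len(cols) > 2 else ""))
            i += 1
        if keyword in UTILITIES:
            log.extend(UTILITIES[keyword](pairs))
        i += 1                          # skip the blank separator line
    return log
```

A Driver would simply read each Test Case's tab-delimited file and pass its lines to `controller()`.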

Architecture:

Driver Scripts

· Perform Initialization (if required), then call the Controller Script passing it the Test Case file name

· Can also be arranged to account for Test Case dependencies (If Test 1 fails, skip to Test 4, etc.)

Controller Script

· Calls Utility Scripts associated with Key-Words to perform the Test Case actions and verifications

Utility Scripts

· Perform specific Testing tasks required: Data Entry, Actions, Data Verification, etc.

· Call Subroutine Scripts and User-Defined Functions to perform specific actions

Business Function Utility Scripts

· Perform required application-specific tasks which may be a combination of Testing tasks

· Are associated with application-specific Key-Words and are parameterized within the Spreadsheet

User-Defined Functions

· Perform general tasks that can be used by any number of scripts

Architecture using TestDirector: TestDirector acts as the Driver

Single Test-Case Drivers call the Controller Script, passing the name of the tab-delimited file.

File Structure:

C:\
· AUT -------------------- Application Under Test - Main Dir
    · APP_Init ----------- Initialization Scripts
    · DLL_Lib ------------ DLLs to import (if required)
    · Exp_Res ------------ Common Expected Results Dir
    · Fcn_Lib ------------ Application-Specific Functions
    · GUI_Lib ------------ GUI Files for the Application
    · Parameters --------- Parameter Files (set variables represented in the spreadsheet)
    · Scripts ------------ Application-Specific Scripts
        · AUT_Util ------- Application-Specific Utility Scripts
        · Bus_Fcn -------- Application-Specific Business Function Scripts
        · Drivers -------- Application-Specific Driver Scripts
        · Recorded ------- Application-Specific Recorded Scripts
    · TestData ----------- Tab-Delimited Files
    · TestPlan ----------- Spreadsheets / Workbooks
    · TestRept ----------- Customized Test Reports
· ToolKit_Fcn ------------ General ToolKit Function Libraries
· ToolKit_Util ----------- General Utility Scripts

Using TestDirector as the Driver:

To use Mercury Interactive's TestDirector, a slightly different architecture must be applied. TestDirector is designed to manage and call each different test or test case. Therefore we insert a Test Case Driver between TestDirector and the Controller Script:

TestDirector

· Calls the Test Case Driver script for each Test Case

· May also call a Test Case Script composed of Business Functions

· Additionally may call a recorded script, or any type of script

Test Case Driver

· Calls the Controller script, passing to it the Test Case name

· Test Case Driver folder contains the Spreadsheet file and the tab-delimited file

· Test Case Driver folder may also contain any other input/verification files required

From this point, the architecture is the same: the Controller script calls the Utility / Business Function Scripts associated with the Key-Words, and the Utility Scripts perform the specific Testing Tasks delineated within the Spreadsheet.

Business Function Scripts perform application-specific Testing Tasks, and User-Defined Functions perform generic tasks that can be used by any script as required.

Test Case Driver

Test Case Driver Folder contains all files & data relevant to the Test Case:

C:\Mercury
· AUT ------------------- Application Under Test - Main Dir
    · TestCases --------- Test Case Driver Scripts (Called by TestDirector)
        · TC0001 -------- Test Case Driver Script Folder
            · db (Folder)
            · exp (Folder)
            · Header (Test Properties)
            · Script (WinRunner TSL Script)
            · Script.bak (Script Backup Copy)
            · TC001.xls (Spreadsheet Test Case)
            · TC001.txt (Tab-Delimited File)
            · Results (Folder)
                · TC001R.txt (Customized Test Results)

Summary:

The Keyword-Driven method allows the Tester to create Test Cases in a Spreadsheet and have them executed as Automated Test Cases without having to use or learn the Test Tool.

· Automated script development and maintenance is reduced: only a few dozen scripts are needed rather than hundreds.

· Test Case (Spreadsheet) maintenance is neither increased nor reduced, since the Tester must already update the manual test cases when changes or revisions are made.

· Only one "Test Tool Expert" is required; the whole testing staff does not have to learn the test tool (unless they want to).

· Business Function Scripts and Recorded Scripts can still be used under this architecture if required.

· Parameterized input as variable data allows a Test Case (Spreadsheet) to be re-used in multiple-database environments.

· Pre-written "ToolKit" Functions and Utility Scripts greatly reduce "ramp-up" time.

Thursday, August 2, 2007

Password Recovery Testing

What is the Issue?
The great majority of web applications provide a way for users to recover (or reset) their password in case they have forgotten it. The exact procedure varies among applications, depending in part on the required level of security, but the approach is always to use an alternate way of verifying the user's identity. One of the simplest (and most common) approaches is to ask the user for his/her e-mail address and send the old password (or a new one) to that address. This scheme assumes that the user's email account has not been compromised and that it is secure enough for this purpose.
Alternatively (or in addition to that), the application could ask the user to answer one or more "secret questions", which are usually chosen by the user among a set of possible ones. The security of this scheme lies in the ability to provide a way for someone to identify themselves to the system with answers to questions that are not easily answerable via personal information lookups.

How To Test?

Password Reset
The first step is to check whether secret questions are used. Sending the password (or a password reset link) to the user's email address without first asking a secret question means relying 100% on the security of that email address, which is not suitable if the application needs a high level of security.
On the other hand, if secret questions are used, the next step is to assess their strength.
As a first point, how many questions must be answered before the password can be reset? The majority of applications require the user to answer only one question, but some critical applications require correct answers to two or even more different questions.
As a second step, we need to analyze the questions themselves. Often a self-reset system offers a choice of multiple questions; this is a good sign for the would-be attacker, as it presents him/her with options. Ask yourself whether you could obtain answers to any or all of these questions via a simple Google search or with some social-engineering attack. As a penetration tester, here is a step-by-step walk-through of assessing a password self-reset tool:

  • Are there multiple questions offered?
    • If so, try to pick a question which would have a “public” answer; for example, something Google would find with a simple query
    • Always pick questions which have a factual answer such as a “first school” or other facts which can be looked up
    • Look for questions which have few possible options such as “what make was your first car”; this question would present the attacker with a short-list of answers to guess at and based on statistics the attacker could rank answers from most to least likely
  • Determine how many guesses you have (if possible)
    • Does the password reset allow unlimited attempts?
    • Is there a lockout period after X incorrect answers? Keep in mind that a lockout system can be a security problem in itself, as it can be exploited by an attacker to launch a Denial of Service against users
  • Pick the appropriate question based on analysis from above point, and do research to determine the most likely answers
  • How does the password-reset tool (once a successful answer to a question is found) behave?
    • Does it allow immediate change of the password?
    • Does it display the old password?
    • Does it email the password to some pre-defined email address?
    • The most insecure scenario here is if the password reset tool shows you the password; this gives the attacker the ability to log into the account, and unless the application provides information about the last login the victim would not know that his/her account has been compromised.
    • A less insecure scenario is if the password reset tool forces the user to immediately change his/her password. While not as stealthy as the first case, it allows the attacker to gain access and locks the real user out.
    • The best security is achieved if the password reset is done via an email to the address the user initially registered with, or some other email address; this forces the attacker to not only guess at which email account the password reset was sent to (unless the application tells that) but also to compromise that account in order to take control of the victim access to the application.

The key to successfully exploiting and bypassing a password self-reset is to find a question or set of questions whose answers can be easily acquired. If you are completely unsure of any of the answers, always look for questions which give you the greatest statistical chance of guessing correctly. In the end, a password self-reset tool is only as strong as its weakest question. As a side note, if the application sends or displays the old password in cleartext, it means that passwords are not stored in hashed form, which is a security issue in itself.

Password Remember

The "remember my password" mechanism can be implemented with one of the following methods:
  1. Allowing the "cache password" feature in web browsers. Although not directly an application mechanism, this can and should be disabled.
  2. Storing the password in a permanent cookie. The password must be hashed/encrypted and not sent in cleartext.
For the first method, check the HTML code of the login page to see whether browser caching of the passwords is disabled. The code for this will usually be along the following lines:
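The snippet itself is not reproduced here; a typical example (the surrounding form markup is illustrative, but the `autocomplete="off"` attribute is the standard mechanism) would be:

```html
<form method="post" action="/login" autocomplete="off">
  <input type="text" name="username">
  <input type="password" name="password" autocomplete="off">
</form>
```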
The password autocomplete should always be disabled, especially in sensitive applications, since an attacker who is able to access the browser cache could easily obtain the password in cleartext (public computers are a very notable example of this attack).

To check the second implementation type, examine the cookie stored by the application. Verify that the credentials are not stored in cleartext but are hashed. Examine the hashing mechanism: if it appears to be a common, well-known one, check its strength; for homegrown hash functions, attempt several usernames to check whether the hash function is easily guessable. Additionally, verify that the credentials are only sent during the login phase, and not together with every request to the application.
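These cookie checks can be partly automated. A rough Python sketch follows; the heuristics and sample cookie values are illustrative assumptions, not any application's real format:

```python
import hashlib
import string

def looks_like_cleartext(cookie_value, known_password):
    # Worst case: the known password appears directly in the cookie.
    return known_password in cookie_value

def looks_like_common_hash(cookie_value):
    # Hex strings of these lengths suggest MD5 / SHA-1 / SHA-256; a
    # recognizable format is a cue to test the hash's strength (e.g.
    # whether an unsalted hash of the password matches).
    hex_digits = set(string.hexdigits)
    return (len(cookie_value) in (32, 40, 64)
            and all(c in hex_digits for c in cookie_value))

# Example: a cookie holding an unsalted MD5 of the password is not
# cleartext, but it is still trivially crackable for weak passwords.
pwd = "secret"
cookie = hashlib.md5(pwd.encode()).hexdigest()
```

Hashing several known username/password pairs and comparing against the stored cookie values is one way to probe whether a homegrown scheme is easily guessable.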