Patterson Consulting is Closing

Patterson Consulting, LLC of Tennessee is in the process of closing and all operations will cease effective March 1, 2022. Please refer to our company page or FAQ for details.

Code Quality

Test Design Studio Helps Improve Your Code!

The rich integrated development environment allows automation engineers to reach new levels of productivity. With the introduction of new features in v2.0, you can now reach new levels of quality in the code you are creating.

Watch It In Action

What is Code Quality?

Code quality, as it pertains to automated tests, is often thought to be a subjective measurement of the following key code characteristics:

Ease of Maintenance

How easily can your code be maintained when the AUT is modified or new functionality is required?

Low Complexity

Closely related to readability: how easily can someone comprehend the purpose of your code?

How Does Test Design Studio Improve Code Quality?

Several key features of Test Design Studio will help improve the quality of your code:

Real-Time Code Analysis

Helps you identify syntax errors and potential logic errors quickly, as they are created. We all make mistakes, but Test Design Studio helps you find them before you even save your work.

Code Metrics

Helps you objectively measure the complexity and maintainability of your code.

Real-Time Code Analysis

Test Design Studio identifies errors and warnings in real time, as you type! All items of concern are underlined in the editor so that you can quickly identify the problem and fix it. Not sure why something is underlined? Just hover your mouse over the item and a tooltip will tell you.

Unified Functional Testing performs syntax checking as well, but only when you save a file or remember to check it manually. Test Design Studio, by contrast, is constantly analyzing your code and gives you immediate feedback.

Syntax checking, however, is only half the story, and this is where Test Design Studio really shines by doing something you won't find in Unified Functional Testing. Syntax checks only ensure you follow the basic grammar of the language. Other errors are typically not revealed by Unified Functional Testing until you try to execute your tests. Why? Because the code is syntactically correct even when the logic is dead wrong.

These are the most time-consuming errors to fix because it is only during execution that you discover these mistakes, many of which could have been easily avoided. Common oversights include:

  • declaring the same variable more than once because you copied/pasted code
  • forgetting to declare variables when Option Explicit is used
  • mistyping the name of a function/variable

When these issues are discovered at run-time, it usually means a significant loss of productivity. Tests have to be re-executed and application state must be restored.

New users, in particular, have trouble with these errors because they simply aren't familiar with the basics of VBScript.
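As a hypothetical illustration (none of these names come from a real framework), the snippet below contains several of the oversights just described, and a plain syntax check would pass every line:

```vbscript
Option Explicit

Dim userName
Dim userName               ' duplicate declaration, often left over from copy/paste

Sub Login()
    Dim timeout            ' declared but never used; forget something?
    userNmae = "admin"     ' misspelled (and undeclared) variable; only fails at run time
End Sub
```

Each of these would only surface as a run-time error (or a silent defect) without real-time analysis.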

Try to find the errors or potential code problems in the screen shot below. Can you see them? There is absolutely nothing wrong with the syntax, so don't expect any help from Unified Functional Testing. When you give up or want to check how you did, compare your findings against the second screen shot, which shows the same code with errors and warnings displayed.


Can you see all the errors in the code below?

Syntax Errors Hidden Screen Shot
Syntax Errors Exposed Screen Shot
  • Lines 3 & 17 Same variable declared twice
  • Line 5 MyFunction Warning that the function never set a return value. Should this be a Sub?
  • Line 5 neverUsed Warning that a parameter was declared and never used. Forget something?
  • Line 7 fso Warning that a variable was declared and never used. Forget something?
  • Line 10 Reporter.ReportEvent This method takes 3 arguments, not 2.
  • Line 10 micFailed Oops, should be 'micFail'
  • Line 11 Must use 'Set' when assigning an object
  • Line 12 Should be 'Exit Function'

Example Code Analysis Rules

Test Design Studio has many built-in rules for analyzing code and reporting potential issues. The most common rules are listed below, and we can always expand this list to include more. Have an idea for a new rule? Just let us know and we'll see about adding it to the rules engine.

General Rules

  • Promote use of the 'Option Explicit' statement to help enforce language rules.
  • Cannot make duplicate declarations of variables in the same scope.
  • Must use the various Exit statements in the proper context (i.e. using 'Exit Function' only within a 'Function' declaration).
  • Check for proper use of parentheses when invoking a function.
  • Function calls must provide the proper number of arguments.
  • Function calls and variable usage must refer to a known entity (identifies use of invalid or misspelled items).
  • Ensure that object-based assignment statements use the 'Set' keyword, and non-object-based assignments do not.
  • Warn on use of obsolete language elements (including the ability to mark your own elements as obsolete with XML comments).
  • Identifier names cannot be too long.
  • 'Select Case' constructs must have at least one 'Case' statement.
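A few hypothetical lines that would trip some of the general rules above:

```vbscript
Dim fso
fso = CreateObject("Scripting.FileSystemObject")   ' flagged: object assignment without 'Set'

Sub CleanUp()
    Exit Function                                  ' flagged: 'Exit Function' inside a 'Sub'
End Sub

Select Case 1
End Select                                         ' flagged: no 'Case' statements
```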

Class Rules

  • Ensure classes instantiated with the 'New' keyword are located in the same file as the statement that instantiates them.
  • Default Properties/Functions must be 'Public'.
  • Only one member of a class can be 'Default'.
  • A default property can only be defined on the 'Get' declaration.
  • 'Class_Initialize' and 'Class_Terminate' cannot have arguments.
  • Property declarations must have consistent argument signatures.
  • Warn if public variables are used in a class instead of public properties.
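A small hypothetical class sketch shows how several of these rules fit together:

```vbscript
Class Calculator
    Private m_total                      ' private field exposed via a property, not a public variable

    Public Default Property Get Total    ' 'Default' must be Public and only on the 'Get' declaration
        Total = m_total
    End Property

    Private Sub Class_Initialize()       ' flagged if declared with arguments
        m_total = 0
    End Sub
End Class
```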

Function/Sub Rules

  • 'Sub' declarations must not attempt to return a value.
  • Warn if a 'Function' declaration has no return value (did the developer forget?).
  • Warn if a parameter is declared but never actually used.
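For example, a hypothetical pair of declarations that would trigger each of these rules:

```vbscript
Sub LogMessage(text)
    LogMessage = True        ' flagged: a 'Sub' must not attempt to return a value
End Sub

Function GetTimeout(retries) ' flagged: parameter declared but never used
    ' flagged: no value is ever assigned to 'GetTimeout'; should this be a Sub?
End Function
```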

You can even use a powerful rules engine to define naming conventions for your framework to make sure all code contributors are following guidelines for your organization.

Naming Conventions (Preview)

A popular and effective practice for any software development activity is to adopt and use naming conventions. Consistently naming variables, functions, constants, and other language elements provides a familiarity with code even when reviewing something written by another author. Defining the naming conventions is the easy part. Consistently following those conventions is the hard part. Well, it used to be hard. Now Test Design Studio has introduced a feature that allows you to define rules for naming conventions, and the powerful code analysis engine will help ensure you are following your own guidelines! We ship with examples of some popular VBScript naming conventions to help get you started, and you can use the extensive rules engine to further add or customize the rules to meet the needs of your organization.
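As a sketch of what such a convention might look like (these prefixes are illustrative, not the shipped defaults):

```vbscript
' Hypothetical convention: string variables use a 'str' prefix, globals a 'g_' prefix
Dim strUserName          ' conforms to the string-prefix rule
Dim g_intRetryCount      ' conforms to the global-prefix rule
Dim temp                 ' a naming rule could flag this as non-conforming
```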


Finding all these errors and warnings has no benefit if the user doesn't see them. That's why every file in your project is checked for issues, with the results displayed in a single Error List Tool Window.

Error List Screen Shot

You can toggle the visibility of warnings or errors and even filter the results to only show the file you're working on. We make it easy for you to see everything you want to see and nothing you don't.

Code Metrics

Code Metrics are a useful tool implemented by Test Design Studio to provide an objective analysis of the complexity of your code. Higher code complexity typically leads to higher defect rates and decreased maintainability. The following metric values are calculated for major language elements including entire tests, class declarations, functions, and properties:

Cyclomatic Complexity
Measures the number of paths through your code. Inclusion of branch and loop statements (like If and For) increases the number of paths.
Lines of Code
Counts the number of executable lines, ignoring white space and comments.
Halstead Metrics
Measures the vocabulary of your code by counting unique and total instances of operators and operands. While not shown in the tool, these values are factored into the maintainability index.
Maintainability Index
All of the above metrics are used to calculate a maintainability index between 0 and 100. Values of 0-9 indicate high maintenance, 10-19 moderate maintenance, and 20-100 low maintenance.
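Exact counting rules vary from tool to tool, but as a rough sketch, cyclomatic complexity starts at 1 for the single straight-line path and adds 1 for each branch or loop statement:

```vbscript
' Rough cyclomatic complexity: 1 (base path) + 'If' + 'ElseIf' = 3
Function Classify(score)
    Dim result
    result = "unknown"
    If score > 90 Then            ' +1 path
        result = "high"
    ElseIf score > 50 Then        ' +1 path
        result = "medium"
    End If
    Classify = result
End Function
```

Adding a loop or another branch to this function would raise the count further, which is why deeply nested logic tends to score poorly.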

This information is presented in a hierarchical format where test engineers can quickly locate code that might be hard to maintain or overly complex. Taking time to refactor the code can result in fewer errors and improved maintainability.

Code Metrics Screen Shot