
Parasoft SOAtest User’s Guide

Parasoft Corporation
101 E. Huntington Drive, 2nd Floor
Monrovia, CA 91016
Phone: (888) 305-0041
Fax: (626) 305-9048
E-mail: [email protected]
URL: www.parasoft.com

PARASOFT END USER LICENSE AGREEMENT

This Agreement has 3 parts. Part I applies if you have not purchased a license to the accompanying software (the "SOFTWARE"). Part II applies if you have purchased a license to the SOFTWARE. Part III applies to all license grants. If you initially acquired a copy of the SOFTWARE without purchasing a license and you wish to purchase a license, contact Parasoft Corporation ("PARASOFT"): (626) 256-3680, (888) 305-0041 (USA only), (626) 256-9048 (Fax), [email protected], http://www.parasoft.com

PART I -- TERMS APPLICABLE WHEN LICENSE FEES NOT (YET) PAID

GRANT. DISCLAIMER OF WARRANTY. Free of charge SOFTWARE is provided on an "AS IS" basis, without warranty of any kind, including without limitation the warranties of merchantability, fitness for a particular purpose and non-infringement. The entire risk as to the quality and performance of the SOFTWARE is borne by you. Should the SOFTWARE prove defective, you and not PARASOFT assume the entire cost of any service and repair. This disclaimer of warranty constitutes an essential part of the agreement. SOME JURISDICTIONS DO NOT ALLOW EXCLUSIONS OF AN IMPLIED WARRANTY, SO THIS DISCLAIMER MAY NOT APPLY TO YOU AND YOU MAY HAVE OTHER LEGAL RIGHTS THAT VARY BY JURISDICTION.

PART II -- TERMS APPLICABLE WHEN LICENSE FEES PAID

GRANT OF LICENSE. PARASOFT hereby grants you, and you accept, a limited license to use the enclosed electronic media, user manuals, and any related materials (collectively called the SOFTWARE in this AGREEMENT). You may install the SOFTWARE in only one location on a single disk or in one location on the temporary or permanent replacement of this disk for use by a single user. If you wish to install the SOFTWARE in multiple locations, you must license additional copies of the SOFTWARE from PARASOFT. If you wish to have multiple users access the SOFTWARE, you must either license additional copies of the SOFTWARE from PARASOFT or request a multi-user license from PARASOFT.
You may not transfer or sub-license, either temporarily or permanently, your right to use the SOFTWARE under this AGREEMENT without the prior written consent of PARASOFT.

LIMITED WARRANTY. PARASOFT warrants for a period of thirty (30) days from the date of purchase, that under normal use, the material of the electronic media will not prove defective. If, during the thirty (30) day period, the software media shall prove defective, you may return them to PARASOFT for a replacement without charge. THIS IS A LIMITED WARRANTY AND IT IS THE ONLY WARRANTY MADE BY PARASOFT. PARASOFT MAKES NO OTHER EXPRESS WARRANTY AND NO WARRANTY OF NONINFRINGEMENT OF THIRD PARTIES' RIGHTS. THE DURATION OF IMPLIED WARRANTIES, INCLUDING WITHOUT LIMITATION, WARRANTIES OF MERCHANTABILITY AND OF FITNESS FOR A PARTICULAR PURPOSE, IS LIMITED TO THE ABOVE LIMITED WARRANTY PERIOD; SOME JURISDICTIONS DO NOT ALLOW LIMITATIONS ON HOW LONG AN IMPLIED WARRANTY LASTS, SO LIMITATIONS MAY NOT APPLY TO YOU. NO PARASOFT DEALER, AGENT, OR EMPLOYEE IS AUTHORIZED TO MAKE ANY MODIFICATIONS, EXTENSIONS, OR ADDITIONS TO THIS WARRANTY. If any modifications are made to the SOFTWARE by you during the warranty period; if the media is subjected to accident, abuse, or improper use; or if you violate the terms of this Agreement, then this warranty shall immediately be terminated. This warranty shall not apply if the SOFTWARE is used on or in conjunction with hardware or software other than the unmodified version of hardware and software with which the SOFTWARE was designed to be used as described in the Documentation. THIS WARRANTY GIVES YOU SPECIFIC LEGAL RIGHTS, AND YOU MAY HAVE OTHER LEGAL RIGHTS THAT VARY BY JURISDICTION.

YOUR ORIGINAL ELECTRONIC MEDIA/ARCHIVAL COPIES. The electronic media enclosed contain an original PARASOFT label. Use the original electronic media to make "back-up" or "archival" copies for the purpose of running the SOFTWARE program. You should not use the original electronic media in your terminal except to create the archival copy. After recording the archival copies, place the original electronic media in a safe place. Other than these archival copies, you agree that no other copies of the SOFTWARE will be made.

TERM. This AGREEMENT is effective from the day you install the SOFTWARE and continues until you return the original SOFTWARE to PARASOFT, in which case you must also certify in writing that you have destroyed any archival copies you may have recorded on any memory system or magnetic, electronic, or optical media and likewise any copies of the written materials.

CUSTOMER REGISTRATION. PARASOFT may from time to time revise or update the SOFTWARE.
These revisions will be made generally available at PARASOFT's discretion. Revisions or notification of revisions can only be provided to you if you have registered with a PARASOFT representative or on the Parasoft Web site. PARASOFT's customer services are available only to registered users.

PART III -- TERMS APPLICABLE TO ALL LICENSE GRANTS

SCOPE OF GRANT. DERIVED PRODUCTS. Products developed from the use of the SOFTWARE remain your property. No royalty fees or runtime licenses are required on said products.

PARASOFT'S RIGHTS. You acknowledge that the SOFTWARE is the sole and exclusive property of PARASOFT. By accepting this agreement you do not become the owner of the SOFTWARE, but you do have the right to use the SOFTWARE in accordance with this AGREEMENT. You agree to use your best efforts and all reasonable steps to protect the SOFTWARE from use, reproduction, or distribution, except as authorized by this AGREEMENT. You agree not to disassemble, de-compile or otherwise reverse engineer the SOFTWARE.

SUITABILITY. PARASOFT has worked hard to make this a quality product; however, PARASOFT makes no warranties as to the suitability, accuracy, or operational characteristics of this SOFTWARE. The SOFTWARE is sold on an "as-is" basis.

EXCLUSIONS. PARASOFT shall have no obligation to support SOFTWARE that is not the then current release.

TERMINATION OF AGREEMENT. If any of the terms and conditions of this AGREEMENT are broken, this AGREEMENT will terminate automatically. Upon termination, you must return the SOFTWARE to PARASOFT or destroy all copies of the SOFTWARE and Documentation. At that time you must also certify, in writing, that you have not retained any copies of the SOFTWARE.

LIMITATION OF LIABILITY. You agree that PARASOFT's liability for any damages to you or to any other party shall not exceed the license fee paid for the SOFTWARE. PARASOFT WILL NOT BE RESPONSIBLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL DAMAGES RESULTING FROM THE USE OF THE SOFTWARE ARISING OUT OF ANY BREACH OF THE WARRANTY, EVEN IF PARASOFT HAS BEEN ADVISED OF SUCH DAMAGES. THIS PRODUCT IS SOLD "AS-IS". SOME STATES DO NOT ALLOW THE LIMITATION OR EXCLUSION OF LIABILITY FOR INCIDENTAL OR CONSEQUENTIAL DAMAGES, SO THE ABOVE LIMITATION OR EXCLUSION MAY NOT APPLY TO YOU. YOU MAY ALSO HAVE OTHER RIGHTS WHICH VARY FROM STATE TO STATE.

ENTIRE AGREEMENT. This Agreement represents the complete agreement concerning this license and may be amended only by a writing executed by both parties. THE ACCEPTANCE OF ANY PURCHASE ORDER PLACED BY YOU IS EXPRESSLY MADE CONDITIONAL ON YOUR ASSENT TO THE TERMS SET FORTH HEREIN, AND NOT THOSE IN YOUR PURCHASE ORDER. If any provision of this Agreement is held to be unenforceable, such provision shall be reformed only to the extent necessary to make it enforceable. This Agreement shall be governed by California law (except for conflict of law provisions). All brand and product names are trademarks or registered trademarks of their respective holders.
Copyright 1993-2008 Parasoft Corporation, 101 E. Huntington Drive, 2nd Floor, Monrovia, CA 91016. Printed in the U.S.A., November 16, 2009.

Table of Contents Introduction Welcome ....................................................................................................................................... 13 About the Documentation Library - PDFs and Related Resources............................................... 14 Contacting Parasoft Technical Support ........................................................................................ 16

Installation and Licensing Windows Standalone Installation .................................................................................................. 21 Windows Plugin Installation .......................................................................................................... 22 Linux/Solaris Standalone Installation ............................................................................................ 24 Linux/Solaris Plugin Installation .................................................................................................... 26 Service Pack Installation............................................................................................................... 28 Licensing....................................................................................................................................... 29

The SOAtest UI Exploring the SOAtest UI .............................................................................................................. 33

Migrating from SOAtest and WebKing Migration Guide for Existing SOAtest and WebKing Users .......................................................... 41 Command Line Interface (cli) Migration ........................................................................................ 51

SOAtest Tutorial About this Tutorial ......................................................................................................................... 58 Creating Projects and Test (.tst) files............................................................................................ 59 WSDL Verification......................................................................................................................... 63 Functional Testing ........................................................................................................................ 67 Scenario Testing........................................................................................................................... 87 Advanced Strategies..................................................................................................................... 92 Creating and Deploying Stubs ...................................................................................................... 102 Testing Plain XML Services .......................................................................................................... 126 Extending SOAtest with Scripting ................................................................................................. 128 Asynchronous Testing .................................................................................................................. 134 WS-Security .................................................................................................................................. 140 Design and Development Policy Enforcement.............................................................................. 159 Automation/Iteration (Nightly Process) ......................................................................................... 165 Running Regression Tests in Different Environments .................................................................. 
167 Web Functional Testing ................................................................................................................ 171 Web Static Analysis ...................................................................................................................... 183

Team-Wide Deployment Team-Wide Deployment - Configuration Overview Configuring a Team Deployment: Introduction ............................................................................. 191 Connecting All SOAtest Installations to Your Source Control System.......................................... 192 Connecting All SOAtest Installations to Team Server................................................................... 199

Connecting SOAtest Server to Report Center ............................................................................. 203 Connecting All SOAtest Installations to Parasoft Project Center .................................................. 206 Configuring Task Assignment ....................................................................................................... 215 Configuring Team Test Configurations and Rules ........................................................................ 220 Configuring Task Goals ................................................................................................................ 225 Sharing Project and Test Assets .................................................................................................. 227 Configuring Automated Nightly Testing ........................................................................................ 229

Team-Wide Deployment - Usage Overview Using a Team Deployment: Daily Usage Introduction .................................................................. 231 Creating Tests and Analyzing Source Files from the GUI ............................................................ 232 Reviewing and Responding to Tasks Generated During Nightly Tests ........................................ 233 Accessing Results and Reports .................................................................................................... 234 Reassigning Tasks to Other Team Members ............................................................................... 238 Monitoring Project-Wide Test Results........................................................................................... 239

Test and Analysis Basics Customizing Settings and Configurations Modifying General SOAtest Preferences ...................................................................................... 242 Creating Custom Test Configurations........................................................................................... 244

Running Tests and Analysis Testing from the GUI .................................................................................................................... 255 Testing from the Command Line Interface (soatestcli) ................................................................. 257 Testing from the Web Service Interface........................................................................................ 287

Reviewing Results Viewing Results ............................................................................................................................ 290 Generating Reports ...................................................................................................................... 295 Configuring Reporting Settings ..................................................................................................... 297 Understanding Reports ................................................................................................................. 300

Functional/Integration Testing End-to-End Test Scenarios Configuring End-to-End Test Scenarios: Overview ...................................................................... 307 Adding Projects, .tst files, and Test Suites ................................................................................... 308 Working with Projects and .tst files ............................................................................................... 313 Reusing/Modularizing Test Suites ................................................................................................ 317 Configuring Test Suite Properties (Test Flow Logic, Variables, etc.)............................................ 319 Adding Standard Tests ................................................................................................................. 331 Adding Set-Up and Tear-Down Tests ........................................................................................... 332 Adding Test Outputs ..................................................................................................................... 333 Adding Global Test Suite Properties............................................................................................. 337 Reusing and Reordering Tests ..................................................................................................... 344 Parameterizing Tests (with Data Sources or Values from Other Tests) ....................................... 345 Configuring Testing in Different Environments ............................................................................. 369 Validating the Database Layer...................................................................................................... 373 Validating EJBs............................................................................................................................. 
374 Validating Java Application-Layer Functionality ........................................................................... 375 Monitoring and Validating Messages and Events Inside ESBs and Other Systems..................... 376 Executing Functional Tests........................................................................................................... 377

Reviewing Functional Test Results............................................................................................... 379 Creating a Report of the Test Suite Structure............................................................................... 380 Managing the Test Suite ............................................................................................................... 381

SOA Functional Tests Automatic Creation of Test Suites for SOA: Overview.................................................................. 384 Creating Tests From a WSDL....................................................................................................... 385 Creating Tests From XML Schema............................................................................................... 389 Creating Tests From AmberPoint Management System .............................................................. 391 Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository........................ 393 Creating Tests From BPEL Files .................................................................................................. 396 Creating Tests From Software AG CentraSite Active SOA .......................................................... 399 Creating Tests From JMS System Transactions .......................................................................... 401 Creating Tests From Sonic ESB Transactions ............................................................................. 404 Creating Tests From TIBCO EMS Transactions........................................................................... 407 Creating Tests From Traffic .......................................................................................................... 410 Creating Tests From a UDDI ........................................................................................................ 413 Creating Tests From a WSIL ........................................................................................................ 416 Creating Asynchronous Tests....................................................................................................... 419 Testing RESTful Services ............................................................................................................. 
420 Sending MTOM/XOP Messages................................................................................................... 421 Sending and Receiving Attachments ............................................................................................ 422 Accessing Web Services Deployed with HTTPS .......................................................................... 423 Configuring Regression Testing ................................................................................................... 425 Validating the Value of an Individual Response Element ............................................................. 427 Validating the Structure of the XML Response Message ............................................................. 428

Web Functional Tests Web Functional Testing: Overview ............................................................................................... 430 Recording Tests from a Browser .................................................................................................. 431 Generating JUnit Tests from a Web Browser Recording .............................................................. 441 Configuring Browser Playback Options ........................................................................................ 447 Configuring User Actions (Navigation, Delays, etc.) .................................................................... 449 Configuring Wait Conditions ......................................................................................................... 454 Validating or Storing Values ......................................................................................................... 459 Stubbing Test Requests/Responses ............................................................................................ 465 Running Web Scenarios in Headless Mode ................................................................................. 468 Running Static Analysis as Functional Tests Execute .................................................................. 471 Customizing Recording Options ................................................................................................... 472 Creating Custom Locators for Validations and Extractions.......................................................... 473 Understanding Web Functional Test Errors.................................................................................. 475 Creating a Report of Test Suite Maintainability............................................................................. 476

Security Testing Security Testing: Introduction ....................................................................................................... 479 Authentication, Encryption, and Access Control ........................................................................... 481 Penetration Testing....................................................................................................................... 484

Runtime Error Detection Performing Runtime Error Detection............................................................................................. 490

Event Monitoring (ESBs, Java Apps, Databases, and other Systems) Monitoring Intra-Process Events: Overview.................................................................................. 495 Using SOAtest’s Event Monitor .................................................................................................... 496 Monitoring IBM WebSphere ESB ................................................................................................. 500 Monitoring Oracle or BEA AquaLogic Service Bus ....................................................................... 503 Monitoring Software AG webMethods Broker............................................................................... 505 Monitoring Sonic ESB ................................................................................................................... 508 Monitoring TIBCO EMS ................................................................................................................ 510 Monitoring Other JMS Systems .................................................................................................... 512 Monitoring Java Applications ........................................................................................................ 514 Monitoring Databases ................................................................................................................... 518 Monitoring Stub Events................................................................................................................. 520 Monitoring a Custom API-Based Events Source .......................................................................... 524 Extensibility API Patterns.............................................................................................................. 525 Generating Tests from Monitored Transactions............................................................................ 530

Service Virtualization: Creating and Deploying Stubs Understanding Stubs .................................................................................................................... 532 Creating Stubs from Functional Test Traffic ................................................................................. 535 Creating Stubs from Recorded HTTP Traffic ................................................................................ 538 Creating Stubs from WSDLs or Manually ..................................................................................... 539 Working with Stubs ....................................................................................................................... 541 Configuring Stub Server Deployment Settings ............................................................................ 554

Load Testing Load Testing your Functional Tests: Introduction ......................................................................... 559 Load Test Documentation and Tutorial......................................................................................... 560 Preparing Web Functional Tests for Load Testing........................................................................ 561

SOA Quality Governance and Policy Enforcement SOA Policy Enforcement: Overview ............................................................................................. 570 Defining the Policy ........................................................................................................................ 573 Enforcing Policies on WSDLs, Schemas, and SOAP Messages.................................................. 575

Static Analysis Performing Static Analysis ............................................................................................................ 578 Configuring SOAtest to Scan a Web Application .......................................................................... 583 Reviewing Static Analysis Results ................................................................................................ 605 Suppressing the Reporting of Acceptable Violations .................................................................... 608 Customizing Static Analysis: Overview........................................................................................ 611 Creating Custom Static Analysis Rules ........................................................................................ 612 Modifying Rule Categories, IDs, Names, and Severity Levels...................................................... 616

Code Review Code Review Introduction............................................................................................................. 620 General Code Review Configuration ............................................................................................ 622 Configuring and Running Pre-Commit Code Review Scans......................................................... 628 Configuring and Running Post-Commit Code Review Scans ....................................................... 635 Working with the Code Review UI ................................................................................................ 641 Authors - Examining and Responding to Review Comments ....................................................... 647 Reviewers - Reviewing Code Modifications.................................................................................. 650 Monitors - Overseeing the Review Process.................................................................................. 653 Code Review Tips and Tricks ....................................................................................................... 655

Platform Support and Integrations Using AmberPoint Management System with SOAtest ................................................................ 657 Using Oracle/BEA with SOAtest ................................................................................................... 658 Using HP with SOAtest ................................................................................................................. 659 Using JMS with SOAtest............................................................................................................... 665 Using Microsoft with SOAtest ....................................................................................................... 666 Using IBM/Rational with SOAtest ................................................................................................. 672 Using Software AG CentraSite Active SOA with SOAtest ............................................................ 682 Using Software AG webMethods with SOAtest ............................................................................ 685 Using Sonic with SOAtest ............................................................................................................. 686 Using TIBCO with SOAtest ........................................................................................................... 687

Testing Through Different Protocols Using HTTP 1.0 ............................................................................................................................ 689 Using HTTP 1.1 ............................................................................................................................ 691 Using JMS .................................................................................................................................... 694 Using SonicMQ ............................................................................................................................. 706 Using IBM WebSphere MQ .......................................................................................................... 710 Using TIBCO Rendezvous............................................................................................................ 720 Using RMI ..................................................................................................................................... 722 Using SMTP.................................................................................................................................. 723 Testing a CORBA Server.............................................................................................................. 724 Using .NET WCF TCP .................................................................................................................. 726 Using .NET WCF HTTP ................................................................................................................ 730 Using .NET WCF Flowed Transactions ........................................................................................ 734

Reference
    Built-in Test Configurations .... 741
    Built-in Static Analysis Rules .... 744
    Preference Settings .... 747
    Extensibility (Scripting) Basics .... 764
    Using Eclipse Java Projects in SOAtest .... 772
    Available Tools .... 773

Messaging Tools
    SOAP Client .... 777
    Messaging Client .... 782
    Message Stub .... 784

    Common Messaging Options .... 792
    Literal XML View Options .... 793
    Form XML View Options .... 794
    Scripted XML View Options .... 804
    Form Input View Options .... 805
    MapMessage Input Options .... 828
    REST Client .... 829
    webMethods .... 833
    EJB Client .... 841
    Call Back .... 847
    UDDI Query .... 852
    Transmit Tool .... 854
    ISO 8583 .... 856

XML Tools
    XML Validator .... 862
    XML Data Bank .... 863
    XML Transformer .... 869
    XSLT .... 872
    XML Encryption .... 873
    XML Signer .... 878
    XML Signature Verifier .... 882
    XML Encoder .... 885
    XML Decoder .... 886

Viewing Tools
    Traffic Viewer .... 888
    Event Monitor .... 891
    Edit .... 892
    Write File .... 893
    File Stream Writer .... 894
    Results Stream Writer .... 895
    stderr .... 896
    stdout .... 897

Validation Tools
    Diff .... 899
    WS-I .... 909
    DB .... 911
    XML Assertor .... 917
    WS-BPEL Semantics Validator .... 923

Web Application Tools
    Browser Testing .... 925
    Browser Contents Viewer .... 927
    Browser Validation .... 929
    Browser Data Bank .... 931
    Browser Stub .... 932
    Scanning .... 933
    Check Links .... 934
    Spell .... 936
    HTML Cleanup .... 938
    Search .... 944

Other Tools
    Header Data Bank .... 947
    JSON Data Bank .... 950
    Object Data Bank .... 951

    Text Data Bank .... 952
    Coding Standards .... 954
    FTP Client .... 955
    External .... 956
    Aggregate .... 959
    Extension (Custom Scripting) .... 960
    Attachment Handler .... 962
    Decompression .... 965
    Jtest Tracer Client .... 966
    WSDL Content Handler .... 969
    Browse .... 970
    WSDL Semantics Validator .... 971

Introduction

In this section:
• Welcome
• About the Documentation Library - PDFs and Related Resources
• Contacting Parasoft Technical Support
• Installation and Licensing
• The SOAtest UI


Welcome

Parasoft SOAtest is a full-lifecycle quality platform for ensuring secure, reliable, compliant business processes. It provides enterprises with an integrated solution for:
• Quality governance: To continuously measure how each service conforms to the often dynamic expectations defined by both your own organization and your partners.
• Environment management: To reduce the complexity of testing in today’s heterogeneous environments—with limited visibility/control of distributed components or vendor-specific technologies.
• End-to-end testing: To continuously validate all critical aspects of complex transactions, which may extend beyond the message layer through a web interface, ESBs, databases, and everything in between.
• Process visibility and control: To establish a sustainable workflow that helps the entire team efficiently develop, share, and manage the evolution of quality assets throughout the lifecycle.

Getting Started
• Existing SOAtest or WebKing Users: We recommend starting at “Migrating from SOAtest and WebKing”, page 40, then proceeding to “SOAtest Tutorial”, page 57.
• New SOAtest Users: We recommend starting at “SOAtest Tutorial”, page 57.

Open Source Acknowledgements
This product includes software developed by the Eclipse Project (http://www.eclipse.org/). For a list of additional software used, choose Help> About Parasoft SOAtest, then click the SOAtest icon.


About the Documentation Library - PDFs and Related Resources

The SOAtest documentation library includes the following items:
• The SOAtest User’s Guide (the current guide): Explains how to use the SOAtest functionality that is built upon Eclipse (if you have the standalone version of SOAtest) or that is added to Eclipse (if you have the SOAtest plugin). To access this guide from the Eclipse help system, choose Help> Help Contents, then open the SOAtest User’s Guide book. The PDF is available in the manuals directory within the SOAtest installation directory.
• The RuleWizard User’s Guide: Explains how to use RuleWizard to create custom rules. Note that RuleWizard requires a special license. To access this guide, open RuleWizard by choosing SOAtest> Launch RuleWizard, then choose Help> Documentation from within the RuleWizard GUI.
• The SOAtest Static Analysis Rules Guide: Describes all of the coding standards rules included with SOAtest. To access this guide from the Eclipse help system, choose Help> Help Contents, then open the SOAtest Static Analysis Rules book. To generate a custom HTML-format guide with the descriptions for only the rules you have enabled, use the procedure described in “Viewing Rule Descriptions”, page 744.

Additional user guides in the Eclipse help system (for example, the Workbench User Guide) describe native Eclipse functionality and strategies.

Search Tips
The Eclipse help system provides search functionality. However, by default, it searches for the entered term in all available documentation—including Eclipse documentation. If you want to restrict your searches to the SOAtest documentation, perform the following steps:
1. Choose Help> Help Contents.
2. Click the Search Scope link in the help viewer.
3. Click the New button in the Select Search Scope dialog.
4. Enter SOAtest in the List name field of the New Search List dialog.
5. Check the boxes for the books you want to search (for example, SOAtest User's Guide, SOAtest Static Analysis Rules).
6. Click OK.
7. Select the Search only the following topics button.
8. Click OK.
In addition, the User Guide PDF is fully searchable. Popular PDF readers provide both search and find functionality to help you locate the information you are looking for.



Bookmarking Your Most Commonly-Accessed Topics
You can use the Eclipse help "bookmark" feature to enable easy access to the SOAtest help topics you refer to most frequently. To bookmark a topic, open it in the Eclipse help system, then click the Bookmark Document button in the top right of the help system’s toolbar.

To access bookmarked topics, click the bookmark icon on the bottom left of the Eclipse help system.


Contacting Parasoft Technical Support

This topic explains several ways to contact technical support, as well as how to prepare and send "support archives" that help the technical support team diagnose any problems you are experiencing. Sections include:
• Obtaining Live Online Support (Windows only)
• Using the SOAtest Forum
• Contacting us via Phone or E-mail
• Preventing SOAtest from Running Out of Memory
• Preparing a "Support Archive" and Sending it to Technical Support

Obtaining Live Online Support (Windows only)
SOAtest experts are available online to answer your questions. This live support allows you to chat in real-time with the SOAtest team and perform desktop sharing if needed. To receive live online support, go to http://www.parasoft.com/jsp/pr/live_experts.jsp. This live tech support feature currently supports only the Microsoft Windows operating system.

Using the SOAtest Forum
Parasoft's SOAtest Forum is an active, online meeting place where you can converse with and learn from peers and Parasoft team members. Post your questions and participate in the latest discussions at http://forums.parasoft.com.

Contacting us via Phone or E-mail

USA Headquarters
Tel: (888) 305-0041 or (626) 256-3680
Email: [email protected]

Netherlands
Tel: +31-70-3922000
Email: [email protected]

France
Tel: +33 (0) 64 89 26 00
Email: [email protected]

Germany
Tel: +49 731 880309-0
Email: [email protected]

UK
Tel: +44 (0)1923 858005
Email: [email protected]

Asia
Tel: +886 2 6636-8090
Email: [email protected]

Other Locations
See http://www.parasoft.com/contacts.

Preventing SOAtest from Running Out of Memory
To prevent SOAtest from running out of memory, add memory parameters to the script or shortcut being used to start SOAtest. The two parameters are the initial size of the JVM (Xms) and the maximum size of the JVM (Xmx). Typically, both are set to the same size (for instance, 256MB). However, if you have occasional problems but don't want to always allocate a large amount of memory, you can set the parameters to different sizes (for example, 256MB as the initial size and 512MB for the maximum size).
Examples:
• SOAtest standalone: soatest.exe -J-Xms256m -J-Xmx256m
• SOAtest plugin for Eclipse: eclipse.exe -vmargs -Xmx384m
Note that the maximum size you can set depends on your OS and JVM. If you are running the SOAtest Eclipse plugin under Sun Java 1.5 and get a java.lang.OutOfMemoryError: PermGen space error message, start Eclipse with eclipse -vmargs -XX:MaxPermSize=256m
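On Linux, the same memory settings can be applied through a small wrapper script. The following is a minimal sketch only; the SOATEST_HOME path is an assumption and should be adjusted to your actual installation directory.

```shell
#!/bin/sh
# Sketch of a launcher wrapper for standalone SOAtest on Linux.
# SOATEST_HOME is an assumption -- point it at your install directory.
SOATEST_HOME=/opt/parasoft/soatest
# Pass the JVM heap settings through to the launcher, preserving any
# additional arguments given to this script.
exec "$SOATEST_HOME/soatest" -J-Xms256m -J-Xmx512m "$@"
```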

Preparing a "Support Archive" and Sending it to Technical Support
If you are experiencing testing problems such as build failures, the best way to remedy the problem is to create a zip archive containing the source file(s) that caused that failure (if applicable), as well as related test information, then send that zip file to Parasoft's support team. To facilitate this process, SOAtest can automatically create an archive when testing problems occur. On average, these archives are about half a megabyte, and are created in about one minute.
By default, SOAtest does not create an archive when testing problems occur. You can either manually prepare and send a support archive when needed, or you can modify SOAtest archive creation options so that SOAtest automatically prepares and sends an archive when testing problems occur.
To configure SOAtest to automatically prepare and send archives when testing problems occur:
1. Open the Technical Support panel by choosing SOAtest> Preferences, then selecting the Technical Support category.
2. Check Enable auto-creation of support archives.
3. If you want to send the archive from SOAtest, check Send archives by e-mail.





If you enable this option, be sure to set the e-mail options in Preferences> E-mail if you have not already done so.
4. In the Items to include area, check the items you want included. Available options are:
• Environmental data: Environment variables, JVM system properties, platform details, additional properties (memory, other).
• General application logs: Various platform/application logs.
5. If you want verbose logs included in the archive, check Enable verbose logging. Note that this option cannot be enabled if the logging system has custom configurations.
• Verbose logs are stored in the xtest.log file within the user-home temporary location (on Windows, this is <drive>:\Documents and Settings\<user>\Local Settings\Temp\parasoft\xtest).
• Verbose logging state is persistent across sessions (it is restored on application startup).
• The log file is a rolling file: it won't grow over a certain size, and each time it reaches the maximum size, a backup is created.
6. If you want verbose logs to include output from source control commands, check Enable source control output. Note that the output could include fragments of your source code.
7. If the support team asked you to enter any advanced options, check Advanced options, then enter them here.
8. If you do not want to use the default archive location (listed in the Archives location field), specify a new one in the Archives location field.
9. Click Apply, then OK.

To manually create a support archive:
• Choose SOAtest> Preferences, select the Technical Support category, select the desired archive options, then click Create Archive.

To open the Technical Support Archive Manager, which allows you to review, e-mail, or delete recent support archives:
• Choose SOAtest> Preferences, select the Technical Support category, then click Browse Recent Archives.

When creating a support archive, it is best to ensure that it contains all the information that is relevant to the problem and no unrelated information.

Best Practice: Creating an Archive with the Most Relevant Data
When a technical support archive is created, the complete application logs are included. The logs may contain information from many test runs over a long period of time—but chances are that only a small part of that information is relevant to the problem you are experiencing. To help technical support isolate the cause of the problem, create a technical support archive containing application logs only for the testing session that produces the problem. To do this:
1. Clean the application logs by turning on verbose logging. If verbose logging is already enabled, disable it and re-enable it.
2. Run the testing session that causes problems.
3. Prepare a technical support archive.




Installation and Licensing

In this section:
• Windows Standalone Installation
• Windows Plugin Installation
• Linux/Solaris Standalone Installation
• Linux/Solaris Plugin Installation
• Service Pack Installation
• Licensing


Windows Standalone Installation

This topic explains how to install the standalone version of SOAtest (which is built upon the Eclipse framework)—as well as the Parasoft Load Test product—on a Windows system.

System Requirements
• At least 1 GB RAM per processor (2 GB is recommended)
• Windows 2000, 2003, XP (Professional or Server Edition), or Vista

Installation
To install the standalone version of SOAtest on a Windows system:
1. Run the setup executable that you downloaded from the Parasoft Web site.
2. Follow the installation program's onscreen instructions.
After you complete the installation program, SOAtest will be installed in the specified installation directory. The SOAtest workspace will be created at %USERPROFILE%\soatest\workspace. For example:
• Windows XP: C:\Documents and Settings\[user]\soatest\workspace
• Windows Vista: C:\Users\[user]\soatest\workspace

Startup
To start SOAtest:
• Double-click the SOAtest desktop icon or choose Programs> Parasoft> SOAtest 6.x> SOAtest from the Windows Start menu.
To start Load Test:
• Double-click the Load Test desktop icon or choose Programs> Parasoft> Load Test> Load Test from the Windows Start menu.
Note: You must install a license before you begin using Load Test or SOAtest.

Licensing
See the Licensing topic for details.


Windows Plugin Installation

This topic explains how to install the SOAtest plugin into a working copy of Eclipse on Windows. Parasoft Load Test will also be installed during this process.

System Requirements
• At least 1 GB RAM per processor (2 GB is recommended)
• Windows 2000, 2003, XP (Professional or Server Edition), or Vista
• Eclipse 3.4, 3.3, 3.2.1 or higher
• Sun Microsystems JRE 1.5 or higher (32-bit)

Known Eclipse Issues
• On Windows platforms, there is a known issue with the Eclipse 3.3 UI not refreshing properly.
• On all platforms, there is a known system updates issue with Eclipse 3.4. This may affect your ability to install SOAtest service packs in the future. If you run into a problem updating SOAtest, please contact SOAtest technical support for assistance. Note that SOAtest standalone ships on a patched Eclipse Ganymede 3.4.1, which does not suffer from this update issue.
• If you already have Parasoft Jtest or Parasoft C++test, SOAtest needs to be installed into a separate Eclipse installation.

Installation
To install the SOAtest plugin on a Windows system:
1. In Windows Explorer, locate and double-click the self-extracting archive.
2. Click Yes when a dialog asks whether you want to install SOAtest.
3. Click Yes after you have read and agreed with the license information.
4. Click Next after you have read the readme file.
5. Enter the desired destination directory for the SOAtest Extension files, then click Next. The default destination directory is C:\Program Files\Parasoft\SOAtestExtension.
6. Enter your Eclipse installation directory, then click OK.
7. Close Eclipse if it is open, then click OK to close the dialog reminding you to close this program. SOAtest will then start copying files and installing the necessary files into the workbench. A dialog box with a progress indicator will show installation progress. When the installation is complete, a notification dialog box will open.
8. Click the OK button to close the notification dialog box.

Startup
To start SOAtest:
1. Start Eclipse by double-clicking the appropriate desktop icon or choosing the appropriate menu item from the Windows Start menu.
2. Open the SOAtest perspective by choosing Window> Open Perspective> Other, then choosing SOAtest in the Select Perspective dialog that opens.
3. If the SOAtest menu is not visible in the Eclipse toolbar, choose Window> Reset Perspective. If the SOAtest menu still is not visible, ensure that you have the latest version of SOAtest by choosing Help> Software Updates> Pending Updates and installing any pending updates.
To start Load Test:
• Double-click the Load Test desktop icon or choose Programs> Parasoft> Load Test> Load Test from the Windows Start menu.
Note: You must install a license before you begin using Load Test or SOAtest.

Licensing
See the Licensing topic for details.

WTP and Pydev Plugin Installation
WTP and Pydev plugins greatly improve the usability of various text editors and must be installed for SOAtest’s syntax highlighting to work. These plugins can be downloaded and installed from the following locations:
WTP download site: http://download.eclipse.org/webtools/downloads/
Pydev download site: http://pydev.sourceforge.net/download.html


Linux/Solaris Standalone Installation

This topic explains how to install the standalone version of SOAtest (which is built upon the Eclipse framework)—as well as the Parasoft Load Test product—on a Linux or Solaris system.

System Requirements
• At least 1 GB RAM per processor (2 GB is recommended)
• Linux or Solaris
• For Linux:
    • GTK+ 2.10 or higher
    • GLib 2.12 or higher
    • Pango 1.14 or higher
    • X.Org 1.0 or higher

Installation
To install the standalone version of SOAtest on a Linux or Solaris system:
1. If you haven’t already done so, copy the installation file to the directory where you would like to install SOAtest.
2. Change directories to the directory where you are going to install SOAtest.
3. Extract the necessary files by entering the appropriate command at the prompt:
• Linux: tar -xzf soatest_6.x_linux.tar.gz
• Solaris: unzip soatest_6.x_solaris.zip
During extraction, a directory named SOAtest will be created; this directory will contain the program files needed to run SOAtest. The SOAtest workspace will be installed at <$HOME>/.SOAtest/workspace (for Solaris) or <$HOME>/.SOAtest_linux/workspace (for Linux).

Startup
To run the SOAtest GUI:
• Change directories to the soatest directory, then enter the following command at the prompt: ./soatest
To run SOAtest from the command line:
• Change directories to the soatest directory, then execute SOAtest with the desired command line arguments: ./soatestcli
For details on using soatestcli, see the User Guide topic on Testing from the Command Line Interface (soatestcli).
To start Load Test:
• Change directories to the loadtest directory, then enter the following command at the prompt: ./loadtest
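For illustration only, a command-line run might look like the following sketch. The flag names and values shown here are assumptions, not confirmed options; consult the Testing from the Command Line Interface (soatestcli) topic for the options your version actually supports.

```shell
# Hypothetical soatestcli invocation (flag names are assumptions --
# verify them against the soatestcli documentation before use).
./soatestcli \
  -data /home/developer/soatest/workspace \
  -config "builtin://Demo Configuration" \
  -report /tmp/soatest-report
```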



Note: You must install a license before you begin using SOAtest or Load Test.

Licensing
See the Licensing topic for details.


Linux/Solaris Plugin Installation

This topic explains how to install the SOAtest plugin into a working copy of Eclipse on a Linux or Solaris system. Parasoft Load Test will also be installed during this process.

Prerequisites
• At least 1 GB RAM per processor (2 GB is recommended)
• Linux or Solaris
• For Linux:
    • GTK+ 2.10 or higher
    • GLib 2.12 or higher
    • Pango 1.14 or higher
    • X.Org 1.0 or higher
• Eclipse 3.4, 3.3, 3.2.1 or higher
• Sun Microsystems JRE 1.5 or higher (32-bit)

Known Eclipse Issues
• On all platforms, there is a known system updates issue with Eclipse 3.4. This may affect your ability to install SOAtest service packs in the future. If you run into a problem updating SOAtest, please contact SOAtest technical support for assistance. Note that SOAtest standalone ships on a patched Eclipse Ganymede 3.4.1, which does not suffer from this update issue.
• If you already have Parasoft Jtest or Parasoft C++test, SOAtest needs to be installed into a separate Eclipse installation.

Installation
To install the SOAtest plugin on a Linux or Solaris system:
1. If you haven't already done so, move the installation file that you downloaded to the directory where you want to install SOAtest. Typically, this should be a directory other than your Eclipse location so that the files can be more easily updated independently in the future.
2. Change directories to the directory where you are going to install SOAtest.
3. Extract the necessary files by entering the appropriate command at the prompt:
• Linux: tar -xzf soatest_6.x_linux_eclipse_plugin.tar.gz
• Solaris: unzip soatest_6.x_solaris_eclipse_plugin.zip
4. Verify that a directory named soatest-extension has been created within your current directory; all SOAtest files are extracted into this directory.
5. Change to the soatest-extension directory.
6. Run the install script: ./install

7. Provide the location of your current Eclipse installation directory. For example:

This script will link an existing Eclipse installation with the
SOAtest plugins.

Please enter the directory that contains the installation you want
to link to, or Ctrl-C to quit.
> /home/developer/app/eclipse
Eclipse installation found in /home/developer/app/eclipse
Installing...
Done.

Note: The SOAtest plugin can be uninstalled by deleting the links directory that was created at the top level of the Eclipse installation directory.

Startup
To run the SOAtest GUI:
• Change directories to your eclipse directory, then start Eclipse as you normally do (using the Eclipse executable): ./eclipse
To run SOAtest from the command line:
• Change directories to the soatest-extension directory, then execute SOAtest with the desired command line arguments: ./soatestcli
For details on using soatestcli, see the User Guide topic on Testing from the Command Line Interface (soatestcli).
To start Load Test:
• Change directories to the loadtest-extension directory, then enter the following command at the prompt: ./loadtest

Note: You must install a license before you begin using Load Test or SOAtest.

Licensing
See the Licensing topic for details.

WTP and Pydev Plugin Installation
WTP and Pydev plugins greatly improve the usability of various text editors and must be installed for SOAtest’s syntax highlighting to work. These plugins can be downloaded and installed from the following locations:
WTP download site: http://download.eclipse.org/webtools/downloads/
Pydev download site: http://pydev.sourceforge.net/download.html


Service Pack Installation

This topic explains how to update your current version of SOAtest or download the latest service pack.

Updating SOAtest
To update SOAtest:
1. From the SOAtest menu, choose Check for Updates. SOAtest will then check whether any updates are available.
2. If updates are reported, select the available updates, click Next, then complete the wizard to install the updates.

Using an Alternative Update Site

Some teams prefer to use an internal update site instead of the public one (for example, so that they can internally standardize their versions). To configure SOAtest to access an alternative update site:
1. Choose SOAtest> Preferences, then select Updates.
2. Enter the desired update site in the Update Site Location field.
3. If you want to configure Eclipse proxy connections (not related to SOAtest proxy configuration/behavior), click the Configure Proxy Settings link.


Licensing

This topic explains how to set licensing information for SOAtest and Load Test.

SOAtest

The following instructions focus on how to license SOAtest from the GUI. On SOAtest installations that are licensed for command line mode, you can define license information in a local settings file, then reference this file when you run SOAtest in command line mode.
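For reference, a local settings file is a plain Java-style properties file. A minimal sketch for pointing command-line runs at a LicenseServer might look like the following; the exact key names are an assumption here, so verify them against the Testing from the Command Line Interface (soatestcli) topic for your version:

```properties
# Hypothetical local settings sketch for command-line licensing.
# Key names are assumptions; confirm against your version's CLI documentation.
soatest.license.use_network=true
soatest.license.network.host=license.mycompany.example
soatest.license.network.port=2002
```

The file would then be passed to soatestcli with the -localsettings option.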

Using a Machine-Locked License

To install a machine-locked license:
1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the License category in the left pane.
3. Contact your Parasoft representative to receive your license. You will need to provide the Machine ID listed in the Local License area.
   •	If you have a Server license and you want to obtain the Machine ID without opening the GUI, run soatestcli from the command line. The Machine ID will be reported in the output message.
4. Enter your license password in the Local License section of the License preferences page.
5. Click Apply. The License preferences page will display the features that you are licensed to use, and the expiration date for your license.
6. Click OK to set and save your license.

Using Parasoft LicenseServer

Setting the license if all users cannot write to the SOAtest installation directory
The user who has write access to the SOAtest installation directory should configure the license on behalf of all team members. If the license is set by a user who does not have write access to the SOAtest installation directory, the license information will be stored at the workspace level, and will need to be re-entered for each new workspace.

To install a license when the Parasoft LicenseServer (available separately) manages SOAtest licensing across your team or organization:
1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the License category in the left pane.
3. Enable the Use LicenseServer option. The LicenseServer section of the License preferences page will become active.
4. If you want to use SOAtest for a short period of time when you will not have access to the LicenseServer (e.g., because you expect to be working from home, you will be travelling, your team will be upgrading the machine hosting LicenseServer, etc.), check borrow and specify how long you need to "borrow" the license.
   •	When you borrow a license, one of the available licenses (on the LicenseServer) is locked to your machine for the specified amount of time. You can then disconnect from the network and run SOAtest without accessing the LicenseServer.
   •	Licenses can be borrowed from 1 hour to 14 days.
   •	License borrowing requires PST 2.6 or higher.
5. If the appropriate LicenseServer is not already set, select it from the Autodetected servers list and click Set. Or, manually enter your organization’s LicenseServer host (either a name or an IP address) in the Host name field, then enter your organization’s LicenseServer port in the Port number field.
6. Indicate the license type that you want this SOAtest installation to use by selecting the appropriate option in the Edition box. Available options are:
   •	Professional Edition: Covers static analysis and functional testing.
   •	Architect Edition: Covers static analysis, functional testing, and custom rule creation (RuleWizard).
   •	Server Edition: Covers static analysis, functional testing, custom rule creation (RuleWizard), and the command-line interface.
   •	Custom Edition: Covers custom licensing needs. If you are using a custom license, select this option, then click the Choose button and specify which of your available license features you want to apply to this installation.
7. Click OK to set and save your LicenseServer settings.

If your organization needs additional licenses or updated licenses, the manager or architect should contact Parasoft to obtain these licenses, then add these licenses to LicenseServer as described in the LicenseServer documentation.

Tip - Deactivating Licenses

If you want to deactivate a SOAtest LicenseServer license, enable the Start deactivated, release automatically when idle option. To reactivate the license, disable that option.

When the license is deactivated:
•	The SOAtest view is cleared and displays a message indicating that a license is not available.
•	All SOAtest operations currently in progress (for instance, static analysis or test case execution) are canceled.
•	The SOAtest LicenseServer license is released.

When the license is reactivated:
•	The SOAtest view is restored and will display errors (if available).

Load Test

Using a Machine-Locked License

To install a machine-locked license:
1. Launch Load Test.
2. Open the Password window in one of the following ways:
   •	If you see a dialog box asking if you would like to install a password, click Yes.
   •	If you do not see this dialog box, choose File> View Password.
3. Contact your Parasoft representative to receive your license.
4. In the top portion of the Password Information dialog, enter your expiration date and password.
5. Click OK to set and save your license.

Using Parasoft LicenseServer

To install a license when the Parasoft LicenseServer (available separately) manages Load Test licensing across your team or organization:
1. Launch Load Test.
2. Open the Password window in one of the following ways:
   •	If you see a dialog box asking if you would like to install a password, click Yes.
   •	If you do not see this dialog box, choose File> View Password.
3. Select the Use license server option.
4. Select a license server from the Autodetect drop-down menu and click Set to populate the Host and Port fields. You can also click Refresh to refresh the list of license servers. If your license server is not found in the Autodetect list, you can manually enter the host name in the Host field and the port number in the Port field (the default port is 2002).
5. In the Timeout field, enter the number of seconds after which a timeout should occur. If the specified amount of time passes before a license is retrieved, Load Test will not run.
6. Click OK to set and save your license server information.

The SOAtest UI

In this section:
•	Exploring the SOAtest UI

Exploring the SOAtest UI

This topic describes the SOAtest controls that are added to the Eclipse IDE. Sections include:
•	The SOAtest Perspective
•	Views
•	Toolbar Buttons
•	SOAtest Menu Commands
•	The Scanning Perspective
•	The Load Test Perspective

The SOAtest Perspective

The SOAtest perspective, which is built into the Eclipse workbench, provides a set of capabilities designed to help you configure, run, and review tests. You can open this perspective in any of the following ways:
•	Click the SOAtest Perspective button in the shortcut bar (on the top right of the workbench).
•	Click the Open Perspective button in the shortcut bar, choose Other, then choose SOAtest in the Select Perspective dialog that opens.
•	Choose Window> Open Perspective> Other, then choose SOAtest in the Select Perspective dialog that opens.

The SOAtest perspective provides special views, toolbar buttons, and menu items you will use to configure, run, and review tests.

Views

SOAtest functionality relies on the following views:
•	Test Case Explorer
•	SOAtest view
•	Console view
•	Test Progress view
•	Editor view
•	Stub Server view

For details on other views provided by the Eclipse workbench (such as the Tasks and Problems views), see the Workbench User Guide, which can be accessed from Help> Contents.

Test Case Explorer

The Test Case Explorer displays available SOAtest projects and tests. The Test Case Explorer can have multiple Eclipse projects open at the same time, and each project can have multiple Test Suites open at the same time.

Test Case Explorer Menu Buttons

At the top right corner of the Test Case Explorer are the following menu buttons:
•	Refresh: Refreshes the contents of the Test Case Explorer.
•	Collapse All: Collapses all of the nodes within the Test Case Explorer.
•	Search: Allows you to search for any node (e.g., Test Suites, tests, chained tools) within the Test Case Explorer. After clicking the Search button, the following options display:
   •	Containing: Enter the text or string contained within the test.
   •	Within the whole tree: Select to search for the specified text within the whole tree.
   •	Within the selected node: Select to search for the specified text within the selected node.
   •	Wrap around: Select to wrap the search around to the beginning of the tree when it reaches the end.
   •	Case sensitive: Select to perform a case-sensitive search.
•	Filter: Allows you to configure filters that hide specified projects or tests within the Test Case Explorer.
•	Statistics: Shows statistics (e.g., the number of tests Passed, Failed, Errors, Skipped, and Run) next to each test suite node within the Test Case Explorer.

SOAtest view

The SOAtest view is where SOAtest lists its test findings. This view is open by default. If this view is not available, choose SOAtest> Show View> SOAtest to open it. For details on reviewing the results reported in this view, see “Viewing Results”, page 290.

Console view

The Console view displays summary information about any executed test—including how many tests were run, how many tests failed, and how many tests were skipped. You can also configure the Console to show test variables as described in “Monitoring Test Variable Usage”, page 328.

Test Progress view

This view is where SOAtest reports test progress and status. For details, see “Test Progress View”, page 290.

Editor view

The Editor view is the largest panel in the workbench. This is where SOAtest displays tool/test configuration panels or source code, depending on which Test Case Explorer or Navigator node is selected. For instance, if you double-click a SOAP Client tool node in the Test Case Explorer, a SOAP Client tool configuration panel opens in an Editor.

Stub Server view

The Stub Server view allows you to manage and interact with the local stub server as well as dedicated stub servers running on remote machines. For details on how to use this view, see “Working with Stubs”, page 541.

Toolbar Buttons

SOAtest adds the following buttons to the toolbar:
•	Test
•	Import My Recommended Tasks

Test

The Test button allows you to quickly run any available Test Configuration. If you simply click the Test button, SOAtest will run a test based on the Favorite Test Configuration.

If you use the pull-down menu to the right of the Test button, you can start a test using any of the available Test Configurations. The current Favorite Test Configuration is always listed as the first command in the Test Using pull-down menu, followed by the most recently-run Test Configurations, then by commands that provide access to User-defined Test Configurations, Built-in Test Configurations, and Team Test Configurations.


Import My Recommended Tasks

The Import My Recommended Tasks button lets you import a selected category of results that are available on Parasoft Team Server. This allows you to use the GUI to review and analyze the results from command-line tests. If you simply click the Import My Recommended Tasks button, SOAtest will import the subset of all of your testing tasks that 1) you are responsible for (based on SOAtest’s task assignments) and 2) SOAtest has selected for you to review and address today (based on the maximum number of tasks per team member per day that your team has configured SOAtest to report).

If you use the pull-down menu to the right of the Import My Recommended Tasks button, you can select which type of results you want to import. For details on the available options, see “Accessing Results and Reports”, page 234.

SOAtest Menu Commands

The SOAtest menu provides the following commands:
•	Test Using [Favorite Configuration]: Starts a test using the Test Configuration currently set as the Favorite Test Configuration.
•	Test History: Starts a test based on the selected Test Configuration. Only the most recently-run Test Configurations are listed here.
•	Test Using: Starts a test based on the selected Test Configuration. All available Test Configurations are listed here.
•	Test Configurations: Opens the Test Configuration dialog, which lets you view, modify, and create Test Configurations. See “Creating Custom Test Configurations”, page 244 for details.
•	Launch RuleWizard: Opens RuleWizard, a tool for graphically or automatically creating custom static analysis rules. See “Creating Custom Static Analysis Rules”, page 612 for details.
•	Explore> Team Server: Opens the Team Server browser dialog, which allows you to access, configure, and update the Test Configurations, rules, rule mapping files, and reports available on Team Server.
•	Explore> Team Server Reports: Opens the HTML report files that are available on Team Server. See “Accessing Team Server Reports through the GUI”, page 236 for details.
•	Explore> Report Center Reports: Opens Report Center reports based on information from SOAtest tests and other sources. See “Accessing Results through Report Center”, page 237 for details.
•	Import: Imports the selected category of results that are available on Parasoft Team Server. See “Importing Results From Team Server into the SOAtest GUI”, page 234 for details.
•	Show View: Opens GUI elements (such as the SOAtest view or Suppressions view) that are not currently visible. See “Views”, page 33 for details.
•	Preferences: Opens the Preferences dialog. See “Modifying General SOAtest Preferences”, page 242 for details.
•	Support: Provides several ways to contact the support team.
•	Help: Opens the User’s Guide in the online help system.
•	Deactivate | Activate License: Deactivates/activates a SOAtest LicenseServer license. See “Licensing”, page 29 for details.

The Scanning Perspective

The Scanning perspective is designed to facilitate reviewing and retesting of resources scanned during static analysis. To open the Scanning perspective:
•	Choose Window> Open Perspective> Other> Scanning.

This perspective is similar to the SOAtest perspective, but has two additional features:
•	The Quick Test toolbar button for testing a single URL or file. This button can be added to any perspective by choosing Window> Customize Perspective> Commands and clicking the checkbox next to SOAtest Scanning.
•	The Scanned Resources view. This view can be added to any perspective by choosing Window> Show View> SOAtest> Scanned Resources Listing.

For details on using this perspective, see “Reviewing and Retesting Scanned Resources”, page 581.

The Load Test Perspective

The Load Test perspective is designed to help you prepare your web functional tests for load testing. To open the Load Test perspective:
•	Choose Window> Open Perspective> Other> Load Test.

This perspective is similar to the SOAtest perspective, but it also provides the following features:
•	Two toolbar buttons (Configure for Load Test and Validate for Load Test), which allow you to run automated test configuration and validation.
•	A Load Test Explorer, which lists the available web functional tests. Note that any web functional test components that are not relevant to load testing—for example, browser-based validations or data banks—will not be shown in this view.
•	Load Test Explorer right-click menus for running automated test configuration and validation (the same commands available from the toolbar buttons).
•	Specialized test configuration panels, which are accessed by double-clicking a test in the Load Test Explorer.

For details on using this perspective, see “Preparing Web Functional Tests for Load Testing”, page 561.

Migrating from SOAtest and WebKing

In this section:
•	Migration Guide for Existing SOAtest and WebKing Users
•	Command Line Interface (cli) Migration

Migration Guide for Existing SOAtest and WebKing Users

This topic is a general migration guide for existing SOAtest and WebKing users. Sections include:
•	About This Migration Guide
•	What’s New?
•	Setting Up Projects
   •	Choosing an Appropriate Project Setup Strategy
   •	Creating Projects from Tests Under Source Control
   •	Creating Projects from Tests NOT Under Source Control
   •	Using a Team Project Set File (.psf) to Share a New Project Across the Team
•	Importing Existing Preferences
•	Importing Stubs
•	Familiarizing Yourself with the SOAtest 6.x Interface
   •	Test Case Explorer
   •	Editors
   •	Environments
   •	Running a Test
   •	Saving Test Suite (.tst) Files
   •	SOAtest View and Console View
   •	Source Control Integration
•	Setting up the Automated Nightly Process using CLI
•	Deprecated Features

About This Migration Guide

This guide is designed to help existing SOAtest or WebKing users get up and running in SOAtest 6.x as rapidly as possible. It is intended for users already familiar with SOAtest or WebKing; new users should review the SOAtest Tutorial first.

What’s New?

For details on what’s new, see http://www.parasoft.com/soatest6.

Setting Up Projects

Since the SOAtest 6.x interface was integrated into the Eclipse framework, it now follows the Eclipse framework hierarchy for managing your test assets. You no longer need to open .tst files one at a time. Instead, you can manage all your .tst files in projects within a workspace.
•	A workspace corresponds to a directory on the local machine. SOAtest will ask you for the desired workspace location at startup and will remember that location in future runs. When you start SOAtest 6.x, an Eclipse workspace is automatically created in:
   •	Windows: %USERPROFILE%\soatest\workspace
   •	Linux: $HOME/.soatest_linux/workspace
   •	Solaris: $HOME/.soatest/workspace
   For example:
   •	Windows XP: C:\Documents and Settings\[user]\soatest\workspace
   •	Windows Vista: C:\Users\[user]\soatest\workspace
   •	Linux: /home/[user]/.soatest_linux/workspace
•	A workspace can contain multiple projects, each of which correlates to a directory inside the workspace on the local machine. A project can contain multiple .tst files along with any related files and artifacts, such as data source Excel spreadsheets, keystores, etc.
•	The .tst file inside a project serves the same function as what was called a "project file" in previous releases.

Choosing an Appropriate Project Setup Strategy

The following sections describe several ways of creating new projects that may be useful to existing SOAtest or WebKing users. Other ways of creating new projects (e.g., from a WSDL) can be found in the Tutorial. To ensure that tests can easily be shared across your organization, a designated team member—usually a team lead or manager—must decide which project setup strategy to use. The entire team should then adopt the same strategy.
•	If your tests are stored in a source control system, use Creating Projects from Tests Under Source Control.
•	If your tests are NOT stored in a source control system and you want to copy your old tests into a new location on your file system, use Copying Tests to a New Location.
•	If your tests are NOT stored in a source control system and you want the existing files to remain in the same location on your file system, use Leaving Tests in the Original Location.

Note: Once a file is opened within SOAtest 6.0 or later, it is automatically saved in a new format that cannot be opened in earlier versions of SOAtest or WebKing.

Creating Projects from Tests Under Source Control

By default, SOAtest 6.x ships with CVS source control support. Support for additional source control systems can be added by installing the appropriate Eclipse plug-ins. To create a project consisting of test suites that are checked into source control:
1. Choose File> Import.
2. In the window that opens, expand the folder that corresponds to your source control system (e.g., SVN or CVS).
3. Select Project(s) from <name of source control>, then click Next.
4. Enter the necessary Repository Location Information for the source control folder containing your tests, then click Finish.
5. After the project is available in your workspace, add the .project and .parasoft files into source control. They will be visible in the Navigator view and should be shared by the entire team.
   •	Do NOT add the .metadata folder into source control.
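As a sketch of step 5, the two metadata files could be added from a command-line CVS checkout as follows; the project path is a placeholder:

```shell
# Sketch: add the shared Eclipse metadata files to CVS from inside
# the checked-out project directory (path is a placeholder).
cd ~/workspace/MyTests
cvs add .project .parasoft
cvs commit -m "Share SOAtest project metadata with the team"
```

The .metadata folder lives at the workspace level and, as noted above, should not be added.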

Creating Projects from Tests NOT Under Source Control

The first (and strongly recommended) option for tests not stored in a source control system is to copy the old tests into your new workspace. Doing so preserves your old tests in a manner analogous to backing up your hard drive. This procedure is explained in Copying Tests to a New Location.

The second option is to use the Project from Existing SOAtest or WebKing Test Suites wizard. This causes the original files to appear within your workspace while allowing them to remain in the same location on your file system. This procedure is explained in Leaving Tests in the Original Location.

Copying Tests to a New Location

To create a project that copies existing test suites to a new location on the file system:
1. Choose File> New> Project.
2. In the window that opens, expand General, select Project, then click Next.
3. Specify a name for your project (which will contain multiple .tst files), then click Finish. This creates an empty folder in your workspace.
4. Choose File> Import.
5. In the window that opens, expand General, select File System, then click Next.
6. In the From directory field, navigate to the directory containing your tests.
7. In the Into folder field, select your project folder from Step 3, then click Finish.
8. (Optional, but strongly recommended) Obtain a source control system, and add the entire project, the .project folder, and the .parasoft files into source control. They will be visible in the Navigator view and should be shared by the entire team.
   •	Do NOT add the .metadata folder into source control.

Leaving Tests in the Original Location

To create a project that leaves existing test suites at the same location on the file system:
1. Open the New Project wizard in one of the following ways:
   •	Select File> New> Project from Existing SOAtest or WebKing Test Suites.
   •	Open the pull-down menu for the New toolbar button (top left), then choose Project from Existing SOAtest or WebKing Test Suites.
2. In the wizard that opens, enter a project name, then enter or browse to the root directory for the existing test suites.
3. Click the Finish button. The tests you selected will display in the Test Case Explorer.

Using a Team Project Set File (.psf) to Share a New Project Across the Team

Once one team member creates a project, that team member can create a Team Project Set File (.psf), which can then be shared by other members of the team. Doing so allows every team member to create their Eclipse projects in the same uniform way. This is a necessary step for importing tasks from the automated nightly test process.

To create a Team Project Set File for a project created from CVS:
1. Select File> Export. The Export wizard displays.
2. Within the Export wizard, select Team> Team Project Set, then click the Next button.
3. Select the projects to be included in the Team Project Set File by selecting the corresponding check boxes.
4. Enter the location where the Team Project Set File will be saved and click the Finish button.

To create a project from the Team Project Set File:
1. Select File> Import. The Import wizard displays.
2. Within the Import wizard, select Team> Team Project Set, then click the Next button.
3. Browse to the desired Team Project Set and click the Finish button. The tests that you selected display in the Test Case Explorer.
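For reference, a Team Project Set file is a small XML document, generated by the Export wizard, that records which repository provider and location each shared project came from; you normally never edit it by hand. A sketch of what a CVS-based .psf might contain (the repository location and project names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<psf version="2.0">
  <provider id="org.eclipse.team.cvs.core.cvsnature">
    <!-- reference format: version,repository location,module,project name -->
    <project reference="1.0,:pserver:user@cvs.example.com:/cvsroot,MyTests,MyTests"/>
  </provider>
</psf>
```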

Importing Existing Preferences

Existing SOAtest or WebKing users can import preferences from previous versions of SOAtest or WebKing. Preferences hold settings such as previously-used WSDL URLs, Report Center preferences, System Properties for including additional jar files in a classpath, etc. In previous versions of SOAtest and WebKing, preferences were saved in a binary file with an .xtp or .wkp extension in the installation directory.

To import existing preferences:
1. Select SOAtest> Preferences. The Preferences dialog displays.
2. Select the root SOAtest node within the Preferences dialog and click the Import button.
3. Browse to and select the .xtp or .wkp preference file. The selected preferences are now saved.

Importing Stubs

The Client Tester tool has been renamed the "Message Stub" tool. In all versions of SOAtest, the configuration settings for any deployed stubs are saved in a stubs.xml file. In SOAtest 5.5.x, this file was saved in the "stubs" folder in the SOAtest install location. In the current version of SOAtest, this file is created in a "stubs" project in the SOAtest workspace. This stubs project is the default (and recommended) location for storing each stub’s corresponding .tst file (Client Tester/Message Stub suite).

To migrate stubs from SOAtest 5.5.x:
1. In your SOAtest 6.x workspace, create a project called "stubs" (if it does not already exist).
2. Copy your stubs.xml file and any corresponding stub .tst files to the "stubs" project.
3. In the stubs.xml file, review the file paths to each stub’s .tst file and adjust them as needed.
   •	In SOAtest 5.5.x, relative paths were resolved relative to the SOAtest install directory.
   •	In the current version of SOAtest, the file paths are relative to the stubs.xml file in the "stubs" project.
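To illustrate the path adjustment in step 3, consider a hypothetical sketch; the element and attribute names below are invented for illustration and are not the actual stubs.xml schema, only the path change is the point:

```xml
<!-- Hypothetical sketch; element/attribute names are not the real schema. -->

<!-- SOAtest 5.5.x: a relative path was resolved against the install directory -->
<stub file="stubs/MyMessageStub.tst"/>

<!-- SOAtest 6.x: the same stub, resolved against stubs.xml in the "stubs" project -->
<stub file="MyMessageStub.tst"/>
```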

Familiarizing Yourself with the SOAtest 6.x Interface

Because Parasoft SOAtest is now Eclipse-based, the look and feel is slightly different. However, except for the changes outlined below, the user interface layout, forms, and settings have largely remained unchanged and should be familiar to existing users.

Test Case Explorer

The Test Case Explorer can have multiple Eclipse projects open at the same time, and each project can have multiple Test Suites open at the same time. In previous versions of SOAtest, only one Test Suite could be open at any given time.

Test Case Explorer Menu Buttons

At the top right corner of the Test Case Explorer are the following menu buttons:
•	Refresh: Click to refresh the contents of the Test Case Explorer.
•	Collapse All: Click to collapse all of the nodes within the Test Case Explorer.
•	Search: Click to search for any node (e.g., Test Suites, tests, chained tools) within the Test Case Explorer. After clicking the Search button, the following options display:
   •	Containing: Enter the text or string contained within the test.
   •	Within the whole tree: Select to search for the specified text within the whole tree.
   •	Within the selected node: Select to search for the specified text within the selected node.
   •	Wrap around: Select to wrap the search around to the beginning of the tree when it reaches the end.
   •	Case sensitive: Select to perform a case-sensitive search.
•	Filter: Select to hide specified projects or tests within the Test Case Explorer.
•	Statistics: Select to display statistics (e.g., the number of tests Passed, Failed, Errors, Skipped, and Run) next to each test suite node within the Test Case Explorer.

Editors

Opening Editors with a Double or Single Click

In previous versions, if you wanted to open the configuration panel for a test node (its "Editor"), you would select that node in the Tests tab. In SOAtest 6.x, you double-click an item’s Test Case Explorer node to display its Editor. To change the default double-click behavior to single-click:
1. Select Window> Preferences. The Preferences dialog displays.
2. Within the Preferences dialog, select General on the left, and change the Open mode from Double click to Single click in the right panel.
3. Select General> Editors, enable Close editors automatically, and then click the OK button. You will now be able to open editors with a single click.

Opening Multiple Editors

In previous versions of SOAtest and WebKing, only one Editor could be open at once. In SOAtest 6.x, multiple Editors can be open simultaneously.

Saving Changes in Editors

When an Editor is modified in SOAtest 6.x, an asterisk (*) displays on the Editor tab, denoting that the Editor is now "dirty." Modifications to the Editor must be explicitly saved using the Save toolbar button or the Ctrl-S keyboard shortcut.

Environments

In previous versions of SOAtest, environments were displayed in a separate tab below the Tests tab. Starting with SOAtest 6.x, environments are part of the tree view in the Test Case Explorer.

Running a Test

To run a test, right-click the test’s node and select Test Using ‘Example Configuration’ from the shortcut menu. Alternatively, press F9 on your keyboard, or click the Test toolbar button.

Saving Test Suite (.tst) Files

In previous versions of SOAtest and WebKing, you had to explicitly save test suite (.tst) files. In SOAtest 6.x, user actions in the Test Case Explorer are saved automatically. For instance, adding a new Test to the Test Case Explorer is automatically saved.

SOAtest View and Console View

Failures that occur during test execution now display in the SOAtest view. What was previously displayed in the Messages Log now displays in the Console view.

Source Control Integration

If you have the appropriate source control plugins installed in the Eclipse environment, your Test Suites can now be checked into source control directly:
•	Right-click the Test Suite node and select Team> Commit from the shortcut menu.

To check in new projects:
•	Right-click the Project node and select Team> Share Project from the shortcut menu.

Setting up the Automated Nightly Process using CLI To set up an automated nightly build from the Command Line, complete the following: 1. Start SOAtest on your test machine, then create a workspace containing all the projects and test suites that you wish to run as part of your nightly testing process. For more information, see the Setting Up Projects section above. 2. Configure SOAtest's preferences with any global settings that are required for your tests. To open SOAtest preferences, choose SOAtest> Preferences. If the test suites in your workspace were imported from source control, then you should configure the Source Controls settings. You can set preferences as described in the User Guide section on Importing Existing Preferences. 3. (Optional) Create a test configuration to use for your nightly test run. Test configurations have settings that affect the way in which your tests are executed. SOAtest ships with a test configuration named Example Configuration that you can use if you do not wish to create your own. Your test configurations can be managed by choosing SOAtest> Test Configurations. If the projects in your workspace were created from source control, then you should click the Common tab in your test configuration then enable the Update projects from source control option.



4. (Optional) Create local settings (options) files. These are text files that can be used to control settings for reports, email, Report Center, Team Server, license server, authorship, and source control.

5. Schedule a daily process to invoke SOAtest using the desired command line options. This can be done using a job scheduler such as Windows Task Scheduler or Unix cron. For example, to run all projects in your workspace you can use the following command:

  soatestcli.exe -data "c:\mySOAtestWorkspace" -showdetails -config "user://Example Configuration" -report "c:\mySOAtestReports" -publish -localsettings "c:\mySOAtestWorkspace\mylocalsettings.properties"

Please note that SOAtest's command line interface has been modified and enhanced. For example, the -publish argument will add reports to Team Server which can later be used to import test failures as tasks into your local SOAtest workspace. For a detailed list of changes, see the topic on Command Line Interface (cli) Migration.
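For reference, a minimal local settings file might look like the following. This is an illustrative sketch only: the property names come from the options tables later in this topic, while the values (server name, port, email flags) are placeholders for your environment.

```properties
# Report settings
report.format=html
report.developer_errors=true

# Report Center (GRS) settings -- server and port are placeholders
grs.enabled=true
grs.server=reportcenter.example.com
grs.port=32323

# Email distribution of reports
report.mail.enabled=true
report.mail.attachments=true
```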

Deprecated Features

•

Load Testing: This is now available in a separate installable called Parasoft Load Test. The current version allows you to run your existing SOA and web load tests as well as create new ones. It also allows you to load test complete end-to-end test scenarios, from the web interface, through services, to the database. Every protocol and test type available in Parasoft SOAtest is supported in Parasoft Load Test. Parasoft Load Test includes the full SOAtest product, so if you are interested in both functional testing and load testing, you should install Parasoft Load Test.

•

WebKing Paths: WebKing’s Path view has been replaced by Test Suite-based functional tests using Browser Testing tools. The primary benefit is that Test Suite-based functional tests support much more complex web applications (such as RIA and AJAX applications). Moreover, the new implementation follows a consistent test configuration paradigm that supports end-to-end testing. Paths in existing .wkj files can be executed in SOAtest 6.x, but they cannot be edited or extended.

•

WebKing Publishing: This functionality is not applicable to SOAtest 6.x.

•

Capture HTTP Traffic Tool: This tool is no longer supported. If this functionality is needed, a free tool like Wireshark can be used to save the HTTP trace to a file, then the "Generate tests from traffic" option can be used to create tests from it.

•

Specific XML Validator Options: The XML DTD preferences and validating against DTD options are no longer available.

•

Management Reports: Report enhancements are planned. SOAtest will report all meta-data to Report Center, and Report Center will be able to generate different kinds of reports.

•

CLI commands:

  • -run: This command, which is for running custom Python scripts through SOAtest, is deprecated. Please contact Technical Support for assistance migrating scripts to 6.x.

  • -runtest: This command is replaced with new CLI options. See the Migration Guide topic on Command Line Interface (cli) Migration for details.

  • -wsdl

  • -reportAllTraffic

  • -traffic

Command Line Interface (cli) Migration

This topic explains how to migrate your existing SOAtest or WebKing automated nightly process from the legacy command line interface (cli) to SOAtest 6.x’s soatestcli. Sections include:

• Command Line Invocations

• Command Line Options

Command Line Invocations
The following table shows the differences in command line invocation between previous versions of SOAtest or WebKing and SOAtest 6.x.

OS                Previous WebKing option    Previous SOAtest option    SOAtest 6.x option
Windows           wk.exe -cmd [options]      st.exe -cmd [options]      soatestcli.exe [options]
Linux or Solaris  webking -cmd [options]     soatest -cmd [options]     soatestcli [options]

Examples

Running a Single Test Suite

SOAtest 5.5 and earlier: If your SOAtest command line invocation to run a single test suite and write a report in HTML format to a file was

  st.exe -cmd -runtest <test suite name.tst> -reportHTML -detailed <report file name>

your SOAtest 6.x command line invocation might be

  soatestcli.exe -config <configuration name> -resource <path to test suite name.tst relative to the workspace> -report <report file>

WebKing: If your WebKing command line invocation to run a single test suite and write a report in HTML format to a file was

  wk.exe -cmd -runtest <test suite name.tst> -reportHTML -detailed <report file name>

your SOAtest 6.x command line invocation might be

  soatestcli.exe -config <configuration name> -resource <path to test suite name.tst relative to the workspace> -report <report file>

Running all Test Suites

SOAtest 5.5 and earlier: If your SOAtest command line invocation to run all test suites within a directory was

  st.exe -cmd -runtest -all <directory path>

your SOAtest 6.x command line invocation might be

  soatestcli.exe -config <configuration name> -resource <directory path relative to the workspace> -report <report file>

WebKing: If your WebKing command line invocation to run all test suites within a directory was

  wk.exe -cmd -runtest -all <directory path>

your SOAtest 6.x command line invocation might be

  soatestcli.exe -config <configuration name> -resource <directory path relative to the workspace> -report <report file>
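The legacy-to-6.x argument mapping shown above can be sketched as a small script. This is a hypothetical helper, not part of SOAtest; the default configuration name and report path are placeholders.

```python
# Hypothetical helper illustrating the legacy '-runtest <file>' to
# soatestcli argument mapping described above. Not part of SOAtest;
# configuration name and report path are placeholders.

def migrate_runtest(tst_path, config="user://Example Configuration",
                    report="report.html"):
    """Build a soatestcli argument list for a legacy '-runtest <file>' call.

    tst_path must be given relative to the workspace, since soatestcli
    resolves -resource paths against the workspace root.
    """
    return [
        "soatestcli",
        "-config", config,
        "-resource", tst_path,
        "-report", report,
    ]

args = migrate_runtest("MyProject/store.tst")
print(" ".join(args))
```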

Command Line Options
The following tables show the differences in command line options between previous versions of SOAtest or WebKing and SOAtest 6.x.

Top Level Options

Note that the following options cannot be used together.

Start stub server
  Previous option: -startStubServer
  SOAtest 6.x option: -startStubServer

Run tests
  Previous option: -runTest
  SOAtest 6.x option: no flag is needed

Run static analysis on a WSDL
  Previous option: -runwsdltest
  SOAtest 6.x option: Deprecated, but replaced with equivalent WSDL static analysis options. See the note below for details.

Execute script
  Previous option: -run
  SOAtest 6.x option: Deprecated; please contact Technical Support for assistance migrating scripts to 6.x.

WSDL Static Analysis
There are three ways to statically analyze WSDLs in SOAtest 6.x:

•

Apply a policy file when first generating tests (such as the sample policy in the examples directory). This generates a "WSDL Handler" tool that decomposes a WSDL into its imported WSDL and schema parts. The WSDL parts are then chained to WSDL validation rules and the schemas are chained to schema validation rules.

•

Manually create a Coding Standards tool that takes a WSDL or schema file as an input, then enable the desired schema rules in it.

•

Have the schema file within your workspace project, then right-click its Navigator view node and run a Test Configuration that has static analysis enabled. SOAtest’s built-in Test Configurations include a few sample static analysis configurations, including ones for WSDLs and schemas.



Run Test Options

Run all tests recursively starting at the specified directory
  Previous option: -all <directory>
  SOAtest 6.x option: To run all tests in the workspace, no flag is needed. To run all tests in a particular project, use -resource <directory path relative to the workspace>.

Ignore a test
  Previous option: -ignore <file name>
  SOAtest 6.x option: To ignore/include all tests in a sub-folder, use -exclude <sub-folder> / -include <sub-folder>. To ignore/include a single test from the resources specified with the -resource flag, use -exclude <file name> / -include <file name>. The -include flag allows you to specify a subset of the resources indicated by the -resource flag. (Do not start the resources after the -include/-exclude flags with a '/'.)

Run a specific test file
  Previous option: <file name>
  SOAtest 6.x option: -resource <path to test suite name.tst relative to the workspace>

Search and replace router
  Previous option: -router [matchWhole] <searchURI> <replaceURI>
  SOAtest 6.x option: -router matchWhole <searchURI:URI> <replaceURI:URI>. This feature is now deprecated; please use Environments instead.

Specify test name patterns
  Previous option: -testName [-match] <pattern> [-dataSourceRow <row>] [-dataSourceName <name>]
  SOAtest 6.x option: -testName [match:] <test name> -dataSourceRow <row> -dataSourceName <name>

Specify environment options
  Previous option: -environment <environment_name>
  SOAtest 6.x option: -environment <environment_name>

Run tests with a single data source row
  Previous option: [-dataSourceRow <row>] [-dataSourceName <name>]
  SOAtest 6.x option: -dataSourceRow <row> -dataSourceName <name>

Report test results to HP (Mercury) TestDirector
  Previous option: -testDirector <test file> <report file>
  SOAtest 6.x option: -testDirector <testFile:file> <reportFile:file>

Report test results to Rational TestManager
  Previous option: -testManager [-v]
  SOAtest 6.x option: -testManagerVerbose

Specify browser used for Web functional test playback
  Previous option: -browserType
  SOAtest 6.x option: See the User Guide topic on Configuring Browser Playback Options.

Tool Execution Options

Execute the specified tool
  Previous option: <toolname>
  SOAtest 6.x option: Run a Test Configuration that executes a .tst file that includes the specified tool. Or, run a Test Configuration that applies the specified type of tool (check links, check spelling, etc.) during static analysis.

Reporting Options

Generate reports
  Previous option: -genreport or <report format flag> <report file> (see "Report formats")
  SOAtest 6.x option: -report <report file>

Show traffic for successful tests
  Previous option: -reportAllTraffic
  SOAtest 6.x option: Deprecated

Report formats
  Previous option: -reportHTML, -reportXML, and -reportPDF
  SOAtest 6.x option: Specified in the local properties file using the following option: report.format=html|pdf|custom

Detailed versus summary reports
  Previous option: -detailed and -summary
  SOAtest 6.x option: Specified in the local properties file using the following option: report.developer_errors=true|false

Mailing reports
  Previous option: -mail -attach to:[email protected]
  SOAtest 6.x option: Specified in the local properties file using the following options: report.mail.enabled=true|false, report.mail.attachments=true|false, report.mail.cc=[email_addresses], report.mail.include=[email_addresses]. See the User Guide topic on Testing from the Command Line Interface (soatestcli) for details.

Specify configuration to be used for report generation
  Previous option: -config:<configuration name>
  SOAtest 6.x option: Specified in the local properties file using the available reporting options.

Report Center Options

Enable communication with Report Center
  Previous option: -logger
  SOAtest 6.x option: Specified in the local properties file using the following option: grs.enabled=true|false

Specify the name of the machine that is running Report Center
  Previous option: -J-Dlogger.host=<host>
  SOAtest 6.x option: Specified in the local properties file using the following option: grs.server=[server]

Specify the port to communicate with Report Center
  Previous option: -J-Dlogger.port=<port>
  SOAtest 6.x option: Specified in the local properties file using the following option: grs.port=[port]

Enable Report Center communication and send all traffic to Report Center
  Previous option: -traffic
  SOAtest 6.x option: Deprecated

Specify custom Report Center attributes
  Previous option: -grs <attribute name>=<attribute value>
  SOAtest 6.x option: Specified in the local properties file using the following option: grs.user_defined_attributes=[attributes]; use the format key1:value1; key2:value2
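The grs.user_defined_attributes format shown above ("key1:value1; key2:value2") is simple to work with programmatically. As a hedged illustration (this parser is not part of SOAtest), such attribute strings could be parsed like this:

```python
# Small sketch parsing the documented grs.user_defined_attributes format,
# "key1:value1; key2:value2". Illustrative only; not part of SOAtest.

def parse_grs_attributes(raw):
    attrs = {}
    for pair in raw.split(";"):
        pair = pair.strip()
        if not pair:
            continue  # skip empty fragments, e.g. a trailing ';'
        key, _, value = pair.partition(":")
        attrs[key.strip()] = value.strip()
    return attrs

print(parse_grs_attributes("project:store; branch:nightly"))
```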

Team Server Options

Send reports to Team Server
  Previous option: N/A
  SOAtest 6.x option: -publish. Note that -publish uses the Team Server configuration in the GUI by default. Alternatively, you can specify these settings in the local properties file.

Licensing Options

Specify local license
  Previous option: -password [expiration date] [password]
  SOAtest 6.x option: Specified in the local properties file using the following options: <tool name>.license.local.expiration=[expiration] and <tool name>.license.local.password=[password]

Specify license server
  Previous option: -licenseserver [host]:[port]
  SOAtest 6.x option: Specified in the local properties file using the following options: <tool name>.license.network.host and <tool name>.license.network.port

Other Options

Show help
  Previous option: -dump
  SOAtest 6.x option: -help

Show version
  Previous option: N/A
  SOAtest 6.x option: -version

Installer options
  Previous option: -initjython, -installcertificate, -uninstallcertificate
  SOAtest 6.x option: -initjython, -installcertificate, -uninstallcertificate

Specify classpath entries
  Previous option: -extraClasspath
  SOAtest 6.x option: Specified in the local properties file using the following option: system.properties.classpath

For more details on SOAtest 6.x’s command line interface, see the User Guide topic on Testing from the Command Line Interface (soatestcli).
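The -include/-exclude filtering described in the Run Test Options table above can be pictured with a short sketch. This is an illustrative approximation of the documented behavior (workspace-relative paths, no leading '/'), not SOAtest's actual implementation.

```python
# Illustrative sketch (not SOAtest's actual logic) of how -include/-exclude
# narrow the set of resources selected by -resource. Paths are
# workspace-relative and, per the documentation, must not start with '/'.

def select_resources(resources, include=None, exclude=None):
    selected = list(resources)
    if include:
        # -include keeps only resources at or under the given path
        selected = [r for r in selected
                    if r == include or r.startswith(include + "/")]
    if exclude:
        # -exclude drops resources at or under the given path
        selected = [r for r in selected
                    if not (r == exclude or r.startswith(exclude + "/"))]
    return selected

tests = ["proj/a.tst", "proj/sub/b.tst", "proj/sub/c.tst"]
print(select_resources(tests, include="proj/sub"))  # keeps the sub-folder tests
print(select_resources(tests, exclude="proj/sub"))  # keeps only proj/a.tst
```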


SOAtest Tutorial
In this section:

• About this Tutorial

• Creating Projects and Test (.tst) files

• WSDL Verification

• Functional Testing

• Scenario Testing

• Advanced Strategies

• Creating and Deploying Stubs

• Testing Plain XML Services

• Extending SOAtest with Scripting

• Asynchronous Testing

• WS-Security

• Design and Development Policy Enforcement

• Automation/Iteration (Nightly Process)

• Running Regression Tests in Different Environments

• Web Functional Testing

• Web Static Analysis

About this Tutorial

The topics presented in this tutorial will guide you through how SOAtest addresses key areas of service and Web application testing, and will demonstrate the creation of various test suites. For your convenience, we've provided a sample SOAtest test suite named SOAtestTutorial.tst, located within the SOAtest examples directory. It contains all the tests that will be created through the tutorial, and also includes specific examples referenced throughout this tutorial. Further information about each test is given in the Requirements and Notes tab of each test suite’s configuration panel.

Parasoft SOAtest Best Practices
While reading this document, you will find examples that show the recommended way of creating test cases within SOAtest. When creating tests for your own services and applications, you can use these examples as models. The following list details the best practices for using SOAtest:

•

Using the SOAtest test wizard, create a test suite of WSDL tests that should be run on a nightly basis.

•

Using the SOAtest test wizard, create a test suite of SOAP Client tests for each operation defined within your WSDL. These test clients can then be moved into separate test suites for Functional Tests and Scenario Tests to optimize reusability and organization.

•

Create both positive and negative versions of each test case to maximize the testing coverage of the web service.

•

Create regression tests for both positive and negative test cases. Regression tests alert you to any changes in service functionality over time as the service evolves.

•

For each distinct testing requirement, create a separate Test (.tst) file.

You will learn how to apply these and other best practices throughout the tutorial. As you gain a basic understanding of SOAtest functionality, we strongly recommend reading the SOAtest Best Practices Guide, which is available as SOAtest_Best_Practices.pdf in [SOAtest_install_dir]/manuals.


Creating Projects and Test (.tst) files

A project (an entity created by Eclipse) can contain any number of SOAtest-specific .tst files. Projects can also contain source files you want to analyze with SOAtest, and any other resources that make sense for your environment. Each .tst file can include any number of test suites/scenarios, tools, inputs, and stubs. The organization and structure is up to you. To keep file size down and to improve maintainability, we recommend using one .tst file for each distinct testing requirement.

Creating New Projects
Multiple .tst files can be grouped into a single project. First, we will create a new project based on existing test suites.

1. Choose File> New> Project from Existing SOAtest or WebKing Test Suites. Alternatively, you can choose this command from the pull-down menu for the New toolbar button (top left).

2. Enter Examples in the Project Name field.

3. Specify the location of the project’s test suites by clicking Browse, then navigating to [SOAtest_installation_directory]/examples/tests.



4. Click Finish. The Examples project will be added to the Test Case Explorer. It will contain multiple test (.tst) files.

You can also create new projects by selecting different commands (such as Project from WSDL or Project from Web Browser Recording) from the new project wizard. To see all available New Project options, choose File> New> Other and look under the SOAtest folder.



Opening and Closing Test (.tst) Files
By default, .tst files are closed. All open .tst files are loaded into memory. There are two ways to open a .tst file:

•

Double-click the .tst file’s Test Case Explorer node.

•

Right-click the .tst file’s Test Case Explorer node, then choose Open Test (.tst) File from the shortcut menu.

Closed .tst files display a "closed box" icon; open .tst files display an "open box" icon.



Creating New Test (.tst) Files
We recommend that you create a separate test (.tst) file for each distinct testing requirement. There are two ways to create a new test (.tst) file:

•

Right-click the project node, and select New Test (.tst) File from the shortcut menu.

•

Choose File> New> New Test (.tst) File.

The wizard will guide you through the test case creation process, then add a .tst file containing the generated tests. As you go through the subsequent tutorial lessons, you will learn how to create different kinds of test suites.


WSDL Verification

WSDL verification can be considered the first step in testing Web services. Although WSDLs are generally created automatically by various tools, that does not necessarily mean the WSDLs are correct. When WSDLs are manually altered, WSDL verification becomes even more important. Ensuring correct and compliant WSDLs enables your service consumers to function correctly and avoids vendor lock-in, thus achieving interoperability and realizing the SOA goals of service reuse. SOAtest can automatically generate a test suite of comprehensive WSDL tests to ensure that your WSDL conforms to the schema and passes XML validation tests. Additionally, it performs an interoperability check to verify that your web service will be interoperable with other WS-I compliant services. When you complete this section of the tutorial, your test suite should resemble the test suite entitled "WSDL Tests" in the SOAtestTutorial.tst file.

Creating a WSDL Verification Test Suite
For this example we will create WSDL tests for a book store service with the WSDL located at http://soatest.parasoft.com/store-01.wsdl. To verify a WSDL using SOAtest’s WSDL verification tests, complete the following:

1. Open the pull-down menu for the New toolbar button (top left), then choose Project from WSDL.

2. Enter a name for the project (e.g., Tutorial) in the Project name field, then click the Next button.



3. In the WSDL URL field, enter http://soatest.parasoft.com/store-01.wsdl

4. Clear the Create Functional Tests from the WSDL check box and select the Create tests to validate and enforce policies on the WSDL check box. 5. Click Finish. Because you selected the Create tests to validate and enforce policies on the WSDL check box, four WSDL tests are automatically created in a separate test suite called WSDL Tests. To see this test suite, open the Test Case Explorer tab and expand the tree.

SOAtest automatically creates the following WSDL tests from a WSDL URL:

•

Test 1: Schema Validity: Runs XML validation on the WSDL against WSDL schemas from W3C.

•

Test 2: Semantic Validity: Checks the correctness of the WSDL by parsing and consuming it like an actual service consumer would, but with stricter adherence to standards.

•

Test 3: WS-I Interoperability: Verifies the WSDL against WS-I Basic Profile 1.1.

•

Test 4: WSDL Regression: Creates a regression control for the WSDL so that changes in the WSDL document can be detected.

6. Select the Test 3: WS-I Interoperability Check node and click the Add test or output toolbar button.

This opens the Add Output wizard, which displays a list of available tools. In addition, a description of the selected tool displays in the Tool Description field. 7. In the Add Output wizard, select Conformance Report from the left pane, select All from the Show dropdown menu, select Browse from the right pane, and click the Finish button. This will send a WS-I Conformance report to your internet browser when you run the test.



8. Select the Test Suite: WSDL Tests node and click the Test toolbar button.

If any errors occur, they will display in the Console dialog located at the bottom of the SOAtest GUI. You can double-click the errors in the right GUI panel for additional information and you can also examine the conformance report that was opened in your internet browser. Note: If you are using Firefox 3.0 or above as your default browser, the XML WS-I sheet may not be read. To fix this problem, select the Conformance Report> Browse node, then in the Project Configuration panel, select a different browser and try running the WSDL Tests node again.


Functional Testing

The best way to ensure the correct functionality of your Web service is to start by creating unit tests for each individual operation implemented by your service. Performing unit testing allows you to catch errors at the component level, making development errors easier to identify and fix. The SOAtest test creation wizard will automatically create a test client for each operation defined within your WSDL. These tests can then be moved into separate test suites, creating one test suite for each test case, allowing you to organize and structure your testing environment to maximize readability and reusability. For example, if your WSDL defines five operations, the SOAtest wizard will generate five test clients in a single test suite. These five test clients can then be separated into five separate test suites, each containing a unit test for a single operation. In this example a simple book store service is used. It provides the following operations:

•

getItemById(int): Returns the book entry with the given item id. Currently valid values are 1, 2, 3, 4, 5 and 6.

•

getItemByTitle(String): Returns a list of Book objects that matched your title search query. The item price value returned by this operation increases by $1.00 every 5 invocations. Example keywords: linux, java, C++, program. Leave it blank to get ALL the books in the database.

•

placeOrder(int, int): Takes an item id and a quantity; returns an "Order" object which includes a Book object, a quantity, and a unique order number.

•

getPendingOrders(): Returns a list of orders that have been submitted using placeOrder(int, int) so far.

•

removeOrder(int): Takes an order number, removes it from the pending orders, and returns a string with a result message (success or failure, etc.). As you might expect, the order numbers it accepts are the same as the ones returned by placeOrder(int, int).

•

confirm(): Confirms the currently pending orders. Subsequent calls to getPendingOrders() or removeOrder(int) will return nothing.

•

addNewItem(Book): Enables you to add new book entries into the database (virtually). Feel free to add anything you want; it will not really add entries to the permanent database. New entries only live throughout your session.

When you complete this section of the tutorial, your test suite should resemble the test suite entitled "Unit Tests" in the SOAtestTutorial.tst file.

Creating Test Suites for Unit Tests
For this example, we add a new Test (.tst) file to the project created in the previous lesson.



1. Right-click the project from the previous exercise, then choose New Test (.tst) File from the shortcut menu.

2. Enter a name for the file (e.g., functional test lesson), then click Next. 3. Select SOA> WSDL, then click Next.

4. Select http://soatest.parasoft.com/store-01.wsdl from the WSDL URL field.



5. If it is not already selected, enable the Create functional tests from the WSDL checkbox.

6. Click the Next button four times to proceed to the Layout dialog. 7. Enable the Organize as Positive and Negative Unit Tests checkbox.



8. Click the Finish button. The newly created test suite displays in the Test Case Explorer.

9. Double-click the new Test Suite: Test Suite node.

10. In the test suite configuration panel (on the right side of the GUI), enter Functional Tests in the Name field, then click the Save toolbar button. Within the Functional Tests test suite, there is a Test Suite: ICart node that includes seven other test suites which test each operation of the WSDL.

11. Right-click the Test Suite: ICart node, then choose Expand All. This displays all seven test suites and each test within them.



Each of the seven test suites contains both a positive and a negative test for each operation, since it is important to test situations where we send expected data as well as unexpected data to the server.

12. Double-click the Test Suite: getItemByTitle Positive Test> Test 1: getItemByTitle node.



13. Open the Request tab in the test configuration panel, type Linux in the titleKeyword entry field, then click Save. We will be searching for books with keyword Linux.

14. Select the Test 1: getItemByTitle node and click the Test toolbar button. The getItemByTitle operation is invoked with the parameter Linux.

15. Expand the Test 1: getItemByTitle node and double-click the Traffic Object> Traffic Viewer node underneath.

The HTTP Traffic panel opens and displays the traffic that was logged from the test run. 16. Right-click on the Test Suite: getItemByTitle Positive Test node and select Create/Update Regression Control from the shortcut menu, then choose Create Internal Regression Control in the Response Validation Wizard that opens.

SOAtest automatically runs the test and creates a regression control populated with the value received from the server.



We now have a functional test that tests the getItemByTitle operation of our web service on a single input value. The same sequence of actions can be done to create functional tests for the other operations defined within the WSDL.

Ignoring XPath Values
When creating regression controls, it may be helpful to ignore dynamic values such as timestamps or session variables that can cause your regression test to fail. In the bookstore example, the “price” element is a dynamic value, with the price of the book increasing by $1 every five times the test is run. In this example we will set up an XPath property to globally ignore the “price” element value in all tests.

1. Run Test 1: getItemByTitle a few times. Notice that after a few test runs, the regression test fails and a task is reported in the SOAtest view. This is because the price element has changed. In this case we want to ignore the value of the price element.

2. Right-click the error message in the SOAtest view and select Ignore XPath from the shortcut menu. An Ignored XPaths Settings dialog displays. The XPath of the price element, /Envelope/Body/getItemByTitleResponse/Result/i/price, is automatically populated in the XPath field.



3. Make sure the Recursive, Text Content, and Modify checkboxes are selected and click OK. This will instruct the regression test to recursively ignore any modifications to the text content of the price element. 4. In Test 1: getItemByTitle, double-click the Response SOAP Envelope> Diff control node. 5. Open the Ignored Differences tab in the test configuration panel. The Ignored Differences dialog displays. Notice that the XPath of the price element has been added to the Ignored XPaths List.

All of the price element values with the specified XPath are now being ignored. Run the functional test again and it will succeed.
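The effect of an ignored XPath can be sketched outside SOAtest: two responses that differ only in the ignored element should compare as equal. The following is an illustrative sketch using simplified, namespace-free XML, not SOAtest's actual diffing implementation.

```python
# Sketch of the idea behind ignoring an XPath during regression diffing:
# blank out the matched elements' text in both documents, then compare.
# Simplified, namespace-free XML; not SOAtest's implementation.
import xml.etree.ElementTree as ET

def equal_ignoring(xml_a, xml_b, ignore_path):
    def normalized(xml):
        root = ET.fromstring(xml)
        # Blank out the text content of every element matched by the
        # ignored path before comparing serializations.
        for elem in root.findall(ignore_path):
            elem.text = ""
        return ET.tostring(root)
    return normalized(xml_a) == normalized(xml_b)

old = "<result><i><title>Linux</title><price>45.00</price></i></result>"
new = "<result><i><title>Linux</title><price>46.00</price></i></result>"
print(equal_ignoring(old, new, "./i/price"))  # True: only price differs
```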

Using the XML Assertor
You can use the XML Assertor tool to enforce the correctness of data in an XML message. It is most commonly connected to a SOAP Client or Messaging Client in order to verify the data returned by a service. The XML Assertor supports complex message validation without scripting, and allows you to easily create and maintain validation assertions on your XML messages. To use the XML Assertor, complete the following:



1. Right-click the Test 1: getItemByTitle node from the previous exercise and select Add Output from the shortcut menu.

2. In the Add Output wizard, select Response> SOAP Envelope on the left, select XML Assertor on the right, and click the Finish button.

3. Double-click the Response SOAP Envelope> XML Assertor node that was added underneath the Test 1: getItemByTitle node.



4. In the XML Assertor panel, open the Configuration tab and click Add.

5. In the Select Assertion wizard, expand Value Assertions, select String Comparison Assertion, then click Next.

The String Comparison Assertion dialog displays a tree view of the XML message from which you can select a single value to enforce. 6. Select the title element from the String Comparison Assertion dialog and click the Finish button.



The Configuration tab of the XML Assertor is now populated with a String Comparison Assertion. 7. In the XML Assertor’s Configuration tab, select contain from the Element must drop-down menu, and enter Linux in the Expected Value field of the Configuration tab.

8. Save the changes to the XML Assertor Configuration. 9. Click the Test toolbar button. The test succeeds. You may add additional assertions to apply to the message (such as a Numeric assertion to enforce on the price element) by clicking the Add button in the XML Assertor’s Configuration tab.

Automate Testing Using Data Sources
Now that we have a unit test created that tests a single input value, the next step is to add a data source. Adding a data source will allow you to test multiple input values with a single test case.

1. Right-click the root test suite node Test Suite: Functional Tests and select Add New> Data Source from the shortcut menu.



2. Select Excel from the New Project Data Source wizard and click the Finish button.

3. In the Data Source configuration panel, complete the following:

  a. Enter Books in the Name field.

  b. Click the File System button to navigate to and select the Books.xls file that is included in the SOAtest examples/datasources directory.

  c. Click Save.

  d. Click the Show Columns button to display the column names from the Excel spreadsheet.



4. Go back and double-click the Test 1: getItemByTitle node from the previous exercise. Books should already be selected from the Data Source drop-down menu that is now present in the test configuration panel.

5. For the titleKeyword drop-down menus at the bottom of the test configuration panel, select Parameterized and Keywords and then click the Save toolbar button.



6. Click the Test toolbar button and notice the error messages that appear in the SOAtest view. The test ran one time for each row in the Keywords column, but failed due to the XML Assertor we created previously. Now we need to update our regression control.

7. Right-click the Response SOAP Envelope> XML Assertor node and select Delete from the shortcut menu.

8. Right-click the Test 1: getItemByTitle node and select Create/Update Regression Control.

9. In the Response Validation wizard, expand the Update Regression Controls node, select Update All Controls, and click the Finish button.


10. Select the Test 1: getItemByTitle node and click the Test toolbar button. SOAtest adds new regression controls for each test run. In this case, 4 regression controls are added: one for each row of the data source.
11. Double-click the Traffic Object> Traffic Viewer node beneath the Test 1: getItemByTitle node and notice that the test ran four times, once for each keyword value in the Keywords column.
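Conceptually, parameterizing a test against a data source turns one test into one execution per row. The following sketch illustrates the idea in plain Python; the keyword values and the invoke function are illustrative stand-ins for the Books.xls Keywords column and the getItemByTitle SOAP call, not SOAtest internals:

```python
# Illustrative stand-in for the Keywords column of the data source.
rows = [{"Keywords": k} for k in ("Java", "Linux", "C++", "Perl")]

def invoke_get_item_by_title(title_keyword):
    # Stand-in for the getItemByTitle SOAP invocation; returns a
    # canned response instead of calling the real service.
    return {"titleKeyword": title_keyword, "status": "ok"}

# The test harness iterates the data source and runs the test once per row.
results = [invoke_get_item_by_title(row["Keywords"]) for row in rows]
print(len(results))  # one execution per data source row
```

With four rows in the data source, the single parameterized test produces four executions, which matches the four regression controls created above.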


Separating Tests into Positive and Negative Test Cases

When creating test cases, it is important to test situations where we send expected data as well as unexpected data to the server. It is important that the server sends the correct responses to valid requests, and just as important that it knows how to handle invalid requests. In this example we will examine the Negative Unit Test within the Test Suite: getItemByTitle Unit Tests node.
1. Expand Test Suite: getItemByTitle Unit Tests and Test Suite: getItemByTitle Negative Test, then double-click the Test 1: getItemByTitle node.
2. In the Test 1: getItemByTitle node located in the Negative Tests test suite, parameterize the titleKeyword element using the Bad Keywords column.

In the negative test cases we are sending our service unexpected data and verifying that it returns the correct response or error response.
3. Save your changes to Test 1.
4. Right-click on the test, select Create/Update Regression Control, select Create Regression Control, click Create Multiple Controls, then click Finish. New regression controls are created for each test run. No tasks are reported in the SOAtest view. This is the correct behavior.

Testing Invalid Data

It is useful to test situations in which invalid data is sent to your service; for example, sending a string when your service expects an integer.
1. Select the Test Suite: Functional Tests node and click the Add Test Suite button.


2. Select Empty suite from the Add Test Suite wizard and click the Finish button.

3. Double-click the new Test Suite: Test Suite node that was added to the test suite tree.
4. In the test suite configuration panel (on the right side of the GUI), enter Sending Bad Data in the Name field and click the Save toolbar button.
5. Expand Test Suite: getItemById Unit Tests, then expand Test Suite: getItemById Negative Test, then copy Test 1: getItemById.


6. Paste Test 1: getItemById into the new Sending Bad Data test suite.

7. Double-click the Test 1: getItemById node within the Sending Bad Data test suite.

8. In the test configuration panel’s Form Input view, right-click the id element and disable (uncheck) Enforce Schema Type from the shortcut menu. This tells SOAtest to allow sending data for the id element that does not conform to the schema - in this case, the schema indicates that the id element is an int, but we'll send a string instead.


9. Enter the literal string Bad Data as the Fixed Value for the id element, then click the Save toolbar button.

10. Click the Test toolbar button.
11. After the test completes, view the traffic by expanding the Test Suite: Sending Bad Data> Test 1: getItemById branch, and then double-clicking Traffic Object> Traffic Viewer.

12. Open the Traffic Viewer’s Response tab. Notice that an exception is thrown and displayed in the Response traffic.


13. Right-click the Test 1: getItemById node within the Sending Bad Data test suite and select Create/Update Regression Control from the shortcut menu.

14. In the Response Validation Wizard, select Create Regression Control, then click Finish.


Scenario Testing

After unit tests have been created, they can be leveraged into scenario-based tests without any additional work. Scenario tests allow you to emulate business logic or transactions that may occur during normal usage of the web service. This also allows you to find bugs that may surface only after a certain sequence of events.

NOTE: If at any time during this exercise you receive a session expiration error message from the server, select SOAtest> Preferences. In the Misc tab of the SOAtest Preferences dialog box that opens, click the Reset Cookies button, and then click OK.

The scenario test example given in the test suite “Scenario Test - Search, Place Order, and Remove” represents a typical sequence of operations that a customer may invoke when using a book store web service. In this case it represents a situation where a customer searches for a book, places an order for that book, and then removes the previously placed order.

This scenario test introduces a tool called the XML Data Bank. This tool allows you to extract XML element values and store these values in memory to be used in later tests. In this example you will be storing the book ID returned by the service after searching for a book, and then in the subsequent test, use that ID to purchase the book. You will also store the order number returned after placing an order for the book, and then in the subsequent test, use that order number to remove the order from the system.

When you complete this section of the tutorial, your test suite should resemble the test suite entitled "Scenario Test - Search, Place Order, and Remove" in the SOAtestTutorial.tst file.
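The extract-and-reuse pattern that the XML Data Bank implements can be sketched in plain Python. The XML snippets and element names below are illustrative, not captured book store traffic:

```python
import xml.etree.ElementTree as ET

# Illustrative search response from Test 1; the real service's
# payload will differ.
search_response = (
    "<getItemByTitleResponse><item>"
    "<id>101</id><title>Linux Admin</title>"
    "</item></getItemByTitleResponse>"
)

# Extract the id element value -- this is the value the XML Data Bank
# stores in memory after Test 1 runs.
banked_id = ET.fromstring(search_response).find(".//id").text

# Inject the banked value into the next request (Test 2: placeOrder).
place_order_request = (
    f"<placeOrder><itemId>{banked_id}</itemId>"
    f"<quantity>3</quantity></placeOrder>"
)
print(place_order_request)
```

The same pattern repeats for the order number: extract order_number from the placeOrder response, then inject it into the removeOrder request.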

Creating a Scenario Test Suite

To create this scenario, perform the following:
1. Select the Test Suite: Functional Tests node from the previous exercise and click the Add Test Suite button.

2. Select Empty from the Add Test Suite wizard and click the Finish button.
3. Double-click the new Test Suite: Test Suite node that was added to the test suite tree.
4. In the test suite configuration panel (on the right side of the GUI), enter Scenario Test - Search, Place Order, and Remove into the Name field, and then click the Save toolbar button.
5. Copy the positive getItemByTitle, placeOrder, and removeOrder test nodes from the previously created Functional Tests test suite and paste them into the Scenario Test - Search, Place Order, and Remove test suite. If needed, you can drag and drop to reorder them.


These three tests represent a typical business transaction a customer may invoke and will be the basis for our scenario test.

Configuring an XML Data Bank

To configure the XML Data Bank, complete the following:
1. Double-click the Test 2: placeOrder node in the Scenario Test - Search, Place Order, and Remove test suite.
2. In the test configuration panel, select Books from the Data Source drop-down menu at the top right.

3. Select Parameterized and Use Data Source Wizard from the itemId element drop-down menus.

4. Complete the Parameterize with Value From Existing Test Response dialog as follows so that when this test is run, the value stored from Test 1 will be automatically inserted as the value for the itemId element:
   a. Select Test 1: getItemByTitle from the Test menu at the top of the dialog.
   b. Select the id element from the Expected XML tree and click the Add button. The id element displays in the Selected XPaths list with a Data Source column name corresponding to the selected test.


   c. Click the OK button.

Test 1:id now displays in the right GUI panel as a parameterized value for itemId. You will also notice that a Response SOAP Envelope> XML Data Bank node now appears underneath the Test 1: getItemByTitle node in the Scenario Test - Search, Place Order, and Remove test suite.
5. In the test configuration panel, enter a Fixed value of 3 for the quantity element, then click the Save toolbar button.

6. Double-click the Test 3: removeOrder node.
7. Select Books from the Data Source drop-down menu in the right GUI panel and select Parameterized and Use Data Source Wizard from the orderNumber element drop-down menus.
8. Complete the Parameterize with Value From Existing Test Response dialog as follows so that when this test is run, the order_number element value stored from Test 2 will be automatically inserted as the value for the orderNumber element:
   a. Select Test 2: placeOrder from the Test menu at the top of the dialog.


   b. Select the order_number element from the Expected XML tree and click the Add button. The order_number element displays in the Selected XPaths list with a Data Source column name corresponding to the selected test.
   c. Click the OK button.

Test 2:order_number now displays in the test configuration panel as a parameterized value for orderNumber. You will also notice that a Response SOAP Envelope> XML Data Bank node now appears underneath the Test 2: placeOrder node in the Scenario Test - Search, Place Order, and Remove test suite.

9. Click the Save toolbar button.
10. Select the Scenario: Scenario Test - Search, Place Order, and Remove node and click the Test toolbar button. When this test is run, the order_number element value stored from Test 2 is automatically inserted as the value for the orderNumber element.


11. Explore the traffic by expanding Scenario: Scenario Test - Search, Place Order, and Remove and double-clicking each test’s Traffic Object> Traffic Viewer nodes.
12. Notice that the itemId of the book returned from Test 1 is used as the input for Test 2. Also, the order_number of the order placed in Test 2 is used as the input for Test 3.
13. Right-click the Scenario: Scenario Test - Search, Place Order, and Remove node and select Create/Update Regression Controls.
14. In the Response Validation Wizard, expand the Update Regression Controls node, select Create Regression Controls, and click the Finish button. The tests are run and a Regression Control is added to each SOAP Client test.
15. Select the Scenario: Scenario Test - Search, Place Order, and Remove node and click the Test toolbar button. Notice that all the tests now fail.
16. Examine the error messages that appear in the SOAtest view. These regression failures are due to dynamic content that appears within the response messages. In the following steps we will ignore elements with this type of dynamic data.
17. In the SOAtest view, right-click on the first error reported under each Test Suite node and select Ignore XPath from the shortcut menu. In the Ignore XPath Settings dialog that displays, click the OK button. You should ignore two XPaths in this step.
18. Select the Test Suite: Scenario Test - Search, Place Order, and Remove node and click the Test toolbar button. All the tests should now succeed.
You have now created a fully functional scenario test that tests one possible business transaction that may occur during normal usage of the book store service. For extra practice you can try to create other scenarios that may occur. Negative test cases could also be created for expanded test coverage.


Advanced Strategies

This lesson covers advanced strategies that will help you develop more robust, reusable test suites. Sections include:
• Creating Reusable (Modular) Test Suites
• Looping Until a Test Succeeds or Fails - Using Test Flow Logic

Creating Reusable (Modular) Test Suites

In many cases, you may want to create a test suite that can be reused by other test suites. A common example is a test suite that logs in to a web site. Once such a test suite is created, it can be used by other test suites in various scenarios that require a login. Two SOAtest features are especially helpful for the creation and use of reusable test suites:
• Referenced test suites: Once a reusable module or test suite has been created, it can be referenced by another test suite.
• Test variables: You can parameterize tests with test variables, which can be set to specific values from a central location, extracted from a test’s response (e.g., through a data bank tool), or set from data sources.

This lesson demonstrates how those two features can be used to create and use reusable test suites. For simplicity, we will use the store web service; however, the principles and steps below can be applied to any scenario you create.

To create a reusable test suite:
1. Create an empty test suite (.tst) file called ReusableModule.tst as follows:
   a. Right-click the Examples project node in the Test Case Explorer, then choose New test (.tst) file.
   b. Under File name, enter ReusableModule.
   c. Click Next.
   d. Select Empty, then click Finish.
2. Create a SOAP Client test as follows:
   a. Expand the ReusableModule.tst Test Case Explorer node.
   b. Right-click the Test Suite: Test Suite node, then choose Add New> Test.
   c. In the dialog that opens, select SOAP Client, then click Finish.

3. Configure the SOAP Client test as follows:


   a. In the test configuration panel’s WSDL tab, enter http://soatest.parasoft.com/store-01.wsdl for the WSDL URL.

b. In the Request tab, set Operation to getItemByTitle.

   c. Save the changes to the SOAP Client test.

4. Define a test variable as follows:
   a. Double-click the Test Suite: Test Suite node to open the test suite configuration panel.
   b. In the Test Variables tab, click the Add button.
   c. In the dialog’s Name field, enter title variable.
   d. Change Type to Data Source.
   e. Keep the selection at Use value from parent test suite (if defined). This will allow this variable to use values set in the test suite that references this test suite. If the selection is changed to Use local value, the value of the variable will always be the value specified in the Value field.
   f. Enter store for Data Source Name and enter title for Column Name. This specifies that we expect a test suite that references this test suite to have a data source named store with a column named title.


g. Enter Java for Value. This is the default value that will be used if SOAtest does not find a data source named store with a column named title.

   h. Click OK.
   i. Save the test suite configuration changes.

5. Configure the SOAP Client test to use the specified data source values, if available, as follows:
   a. In the SOAP Client’s test configuration panel, go to the Request tab and change Fixed to Parameterized.
   b. Select title variable in the combo box.
   c. Save the changes to the SOAP Client test.

6. Run the test suite by selecting ReusableModule.tst, then clicking the Run toolbar button.
7. Double-click the SOAP Client test’s Traffic Viewer node to see the traffic. Note that the titleKeyword used was Java. SOAtest used the default variable value because it did not find the specified data source (since we did not create it yet).
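The variable-resolution rule just demonstrated — use the referencing suite's data source value if present, otherwise fall back to the variable's local default — amounts to a simple lookup. This sketch is illustrative; the function and dictionary names are not SOAtest APIs:

```python
def resolve_variable(data_sources, ds_name, column, default):
    """Return the data source value if the named source and column
    exist; otherwise fall back to the variable's default Value."""
    source = data_sources.get(ds_name)
    if source and column in source:
        return source[column]
    return default

# No "store" data source defined yet: the default "Java" is used.
print(resolve_variable({}, "store", "title", "Java"))

# A referencing suite supplies the data source: its value wins.
print(resolve_variable({"store": {"title": "Linux"}}, "store", "title", "Java"))
```

This is why the traffic above shows Java: the store data source does not exist yet, so the default wins.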


8. Create an empty test suite (.tst) file called TestStoreTitles.tst as follows:
   a. Right-click the Examples project node in the Test Case Explorer, then choose New test (.tst) file.
   b. Under File name, enter TestStoreTitles.
   c. Click Next.
   d. Select Empty, then click Finish.
9. Add a data source to that test suite as follows:
   a. Right-click the Test Suite: Test Suite node, then choose Add New> Data Source.
   b. Select Table, then click Finish.
   c. In the data source configuration panel, change the name to store.
   d. Add a column named title.


   e. Add two values to that column: Linux and C++.
   f. Save the data source changes.
10. Configure this test suite to reference the first test suite we created in this exercise as follows:
   a. Right-click Test Suite: Test Suite and choose Add New> Test Suite.
   b. Select Reference Test (.tst) File.
   c. Click Finish.
   d. Select ReusableModule.tst, then click Open.
11. Run the current test suite by selecting TestStoreTitles.tst, then clicking the Run toolbar button.
12. Double-click the Traffic Viewer node to see the traffic.


13. Verify that Linux, then C++ were used as the titleKeyword.

Looping Until a Test Succeeds or Fails - Using Test Flow Logic

In many cases, you may want to have SOAtest repeatedly perform a certain action until a certain condition is met. Test suite flow logic allows you to configure this. SOAtest allows you to choose between two main test flow types:
• While variable: Repeatedly perform a certain action until a test variable condition is met.
• While pass/fail: Repeatedly perform a certain action until a pass/fail condition is met (e.g., one of the tests in the test suite either passes or fails).

To see how while pass/fail logic can make a test suite loop until a specified price value is reached, perform the following:
1. Create an empty test suite (.tst) file called TestFlowLogic.tst as follows:
   a. Right-click the Examples project node in the Test Case Explorer, then choose New test (.tst) file.
   b. Under File name, enter TestFlowLogic.
   c. Click Next.
   d. Select Empty, then click Finish.
2. Open the test suite configuration panel by expanding the test suite, then double-clicking the Test Suite: Test Suite node.

3. Open the Execution Options> Test Flow Logic tab.


4. Set the Flow type to While pass/fail.
5. Set the Maximum number of loops to 20.
6. Leave the Loop until one of the test(s) setting at Succeeds.
7. Save the test suite configuration settings.
8. Create a SOAP Client test as follows:
   a. Right-click the Test Suite: Test Suite node, then choose Add New> Test.
   b. In the dialog that opens, select SOAP Client, then click Finish.
9. Configure the SOAP Client test as follows:
   a. In the test configuration panel’s WSDL tab, enter http://soatest.parasoft.com/store-01.wsdl for the WSDL URL.


   b. In the Request tab, set Operation to getItemByTitle.
   c. Enter Java as the value for titleKeyword.
   d. Save the changes to the SOAP Client test.
10. Create a regression control for that test as follows:
   a. Right-click the SOAP Client test node and select Create/Update Regression Control.
   b. Select Create Regression Control.
   c. Click Finish.
11. Modify the expected price in the regression control as follows:
   a. Double-click the newly-created Diff control node to open the Diff Tool editor.
   b. Modify the price from 76.0 to 78.0. The store service increases the price of the book by $1.00 after several calls to getItemByTitle, so this is the expected value after several test iterations.


12. Run the current test suite by selecting the test suite node, then clicking the Run toolbar button. The Test Suite will succeed because a price of 78.0 will be reached before looping 20 times.
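The while pass/fail flow configured above behaves like a bounded retry loop: re-run the suite, stop as soon as the chosen test passes, and fail if the loop limit is hit first. A minimal sketch, with the price sequence simulated rather than taken from the real service:

```python
def run_until_success(invoke, passes, max_loops=20):
    """Re-run a test up to max_loops times; stop early when it passes.
    Returns the iteration number on success, or None on failure."""
    for attempt in range(1, max_loops + 1):
        if passes(invoke()):
            return attempt  # suite succeeds on this iteration
    return None  # condition never met within max_loops: suite fails

# Simulate the store raising the price toward 78.0 over repeated calls.
prices = iter([76.0, 76.0, 77.0, 77.0, 78.0, 78.0])
attempt = run_until_success(lambda: next(prices), lambda p: p == 78.0)
print(attempt)
```

If the expected price were instead set to a value the service never returns (like 150.0 in the later step), the loop would exhaust its 20 iterations and the suite would fail.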

13. Double-click the previously-created Diff Control node to re-open the Diff Tool editor.
14. Modify the price from 78.0 to 150.0.


15. Run the Test Suite again. The Test Suite will fail because a price of 150.0 is never reached, even after looping 20 times.


Creating and Deploying Stubs

Evolving services in a distributed SOA environment, and across multiple teams, is a complex endeavor due to the interdependencies between the system and business processes. For example, in a system that incorporates multiple endpoints such as credit card processing, billing, and shipping, it may be difficult for one team to test the responses from another team without interrupting normal business transactions. With SOAtest’s stub generation capability, you can test complex and distributed environments by creating stubs in a number of ways:
• Create a functional test that models the scenario that you want emulated (you simply interact with the actual components to be emulated), then have SOAtest automatically generate stubs that emulate the behavior monitored when executing the modeled scenario.
• Emulate services based on real-world historical data (request/reply sets) collected from the runtime environment.
• Create an emulated version of completely unavailable services "from scratch"—for example, you can use spreadsheets or other data sources to define the desired behavior, and then visually correlate request parameter values with desired response values.

Creating Stubs from Functional Tests

In the following exercises, we will create stubs that emulate an existing Web service. The stubs will be created automatically from an existing SOAtest test suite that tests that Web service. We will also deploy the stubs locally and use the stubs for testing. To create a stub from functional tests:
1. Open the Test Suite named StubClient.tst in the examples/tests directory. This is a Test Suite comprised of three tests driven by a data source of three rows. The test suite consumes the actual Book Store web service described by the WSDL at http://soatest.parasoft.com/store-01.wsdl.
2. Select the main Test Suite: StubClient node and click the Test toolbar button. Nine tests will run and succeed. From these tests, we will automatically generate a stub.
3. Right-click the Test Suite: StubClient node and select Create Stub from the shortcut menu.


4. To accept the default file name, click Next. By default, SOAtest will create a stub Test Suite named StubClientStub.tst and save it in a "stubs" project, which will be added to your workspace if it does not already exist.
5. To accept the default deployment settings, click Finish. By default, SOAtest will deploy the stub at the Endpoint http://localhost:9080/servlet/StubEndpoint?stub=StubClient.

SOAtest automatically creates the stub based on the existing Test Suite and saves it as StubClientStub.tst. It then deploys the stub on the local stub server at the default endpoint.

Viewing Stub Deployment Settings

To view the deployment settings for the stubs deployed in the previous exercise, complete the following:
1. Select Window> Show View> Stub Server. The Stub Server view displays in the bottom GUI panel.
2. Expand the Local Machine node and double-click the StubClient node.

The deployment settings for the StubClient stub will display.


Validating the Deployed Stubs

In a real-world situation, you would want to validate that the stubs behave properly before you point your application to the stubs instead of the actual resource that the stubs are emulating. To use SOAtest to validate that the deployed stubs are working as expected, perform the following:
1. Expand the Environments node located in the Test Case Explorer view.
2. Right-click the Stub Environment node and select Set as Active Environment from the shortcut menu. This configures SOAtest to interact with the stubs—so you can ensure that they are working correctly before you configure your application to access them.

3. Click the Test toolbar button. The SOAP Client tests will now exercise the emulated services rather than the actual ones. The tests will run and succeed. If you examine the traffic, it should confirm that the stubs are behaving as expected.
In a real-world situation, you would now configure your application to access the HTTP Endpoint for StubClient: http://localhost:9080/servlet/StubEndpoint?stub=StubClient


Modifying Stub Behavior

To modify the stub’s behavior, complete the following:
1. Locate and expand the Test Case Explorer node for StubClientStub.tst (the test suite that was added when you generated stubs for StubClient.tst). This is located in the "stubs" project.
2. Double-click the Test 1: getItemByTitle node. This is the stub for the getItemByTitle operation.

3. Open the test configuration panel’s Response tab, and select Response 1 within the Message Body tab. Notice the end of the XPath Function: *[local-name(.)="titleKeyword"]/text()="Linux". This stub returns the specified XML if the XPath function succeeds on the request XML (that is, if the titleKeyword is Linux).
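Conceptually, this correlation is a boolean check against the request: find the titleKeyword element regardless of namespace (which is what local-name() buys in the XPath) and compare its text. A rough Python approximation, using the ElementTree `{*}` namespace wildcard (Python 3.8+) in place of the local-name() predicate:

```python
import xml.etree.ElementTree as ET

# Illustrative SOAP-style request; the real envelope carries namespaces.
REQUEST = """<Envelope><Body><getItemByTitle>
  <titleKeyword>Linux</titleKeyword>
</getItemByTitle></Body></Envelope>"""

def matches_title_keyword(request_xml, expected):
    """Approximate the stub's XPath check: locate a titleKeyword
    element in any namespace and compare its text to the expected
    value. Response 1 is served only when this returns True."""
    node = ET.fromstring(request_xml).find(".//{*}titleKeyword")
    return node is not None and node.text.strip() == expected

print(matches_title_keyword(REQUEST, "Linux"))
```

The stub evaluates one such condition per configured response and returns the first response whose condition matches the incoming request.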


4. Open the Message subtab (within the Response tab) and modify the title element to return Linux Hacking Handbook instead of Linux Administration Handbook.

5. Click Save.
6. In the Stub Server tab, right-click the Local Machine node and select Re-Deploy All Stubs from the shortcut menu to re-deploy the modified stub.

7. Click the Test toolbar button. Now when you run the StubClient.tst test suite, the response for getItemByTitle(Linux) will contain the modified title. Again, this validates that the stub is behaving as expected.


Using this iterative modify-validate process, you can customize the behavior of the stubs. You can also modify the XPath functions in a similar fashion.

Creating Stubs From a Traffic Trace

If you can access a message log or a trace of the traffic between clients and servers, then you can create stubs from that data. Specifically, you can have SOAtest automatically create Message Stubs that respond to incoming messages in a way that mimics the behavior captured in the traffic. Such message traces or logs can be captured at the network level using network sniffing tools such as the free Wireshark tool (http://www.wireshark.org/), or obtained by having your application log its traffic. The format that SOAtest expects is fairly loose; it can parse a wide range of formats—as long as it can find HTTP headers and request/response messages in sequence.

To create stubs from a traffic trace:
1. Delete the store stub created in the previous lesson (go to the stubs project, right-click StubClientStub.tst, then choose Delete). This will delete the file and cause the stub to be undeployed (it will be removed from the Local Machine folder in the Stub Server view).
2. Right-click the stubs project and choose New Test (*.tst) File.


3. Ensure that stubs is selected under Enter or select the parent folder, set the file name to StoreStub, then click Next.


4. Select SOA> Other> Traffic, then click Next.

5. For Traffic file, browse to the store.txt file found under [soatest install dir]/examples/traffic. This file was saved by using Wireshark to trace traffic between a client and the Parasoft book store service.


6. Select Generate Server Stubs, then click Next.

7. Give the stub a name (such as StoreStub).
8. For Value of "stub" parameter, specify the value ClientStub (the sample StubClient.tst will invoke the stub when it sees this expected value).


9. Click Next. SOAtest will now display a series of steps: one for each Web service operation that it found in the file.
10. For Test 1: getItemByTitle(), leave the default automatic setting – we know that this operation expects a single parameter in the request, so SOAtest can recognize that single parameter value as the determinant for response messages.

11. Click Next.
12. For Test 2: placeOrder():
   a. Choose Select relevant parameters.
   b. Click New.
   c. Under the SOAP Envelope tree, select itemId (SOAtest will generate the XPath for that request parameter), then click OK.
   d. Click New again.


e. This time, select quantity, then click OK.

You should now have two entries listed under the Request Parameters table. We manually selected these parameters because there are multiple parameters and we want the values of both to determine the response. In other words, both parameters are significant and the value of each one of them should affect the response message that is returned by the stub. Requests will often have many more parameters, but for stubbing purposes, only a few are usually significant for determining the responses from a testing perspective.
13. Click Finish. (If you decided to click Next, you would leave the Test 3: confirm() option on Automatic since this operation does not take any request parameters.) SOAtest will now create StoreStub.tst under the stubs project folder, and deploy it immediately on the local machine. You may test this stub by invoking the StubClient.tst we used in the last lesson—you just need to set it to “Stub Environment” so it invokes the local stub on the local machine instead of the real service on the Parasoft server.
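Selecting itemId and quantity as the relevant parameters effectively keys the recorded responses on that pair of values. A hedged sketch of the idea; the item IDs, order numbers, and function names are illustrative, not taken from the actual store.txt trace:

```python
# Recorded responses keyed by the two significant request parameters.
# Values are illustrative placeholders, not real captured traffic.
responses = {
    ("101", "1"): "<placeOrderResponse><order_number>500</order_number></placeOrderResponse>",
    ("101", "3"): "<placeOrderResponse><order_number>501</order_number></placeOrderResponse>",
}

def stub_place_order(item_id, quantity):
    """Return the recorded response whose (itemId, quantity) pair
    matches the incoming request -- both parameters are significant."""
    return responses[(item_id, quantity)]

print(stub_place_order("101", "3"))
```

Had only itemId been selected, both recorded requests would collapse onto the same key and the stub could not tell them apart, which is why both parameters were marked relevant.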


Notice the traffic in that test. The stub you created responds with the same messages as the real service.

Creating Stubs for an Application that Does Not Exist Yet

When an application is not accessible for the purpose of configuring a stub, you can create a stub from scratch and configure its behavior with data sources. The use of data sources allows for easier maintainability: modifying the behavior or adding additional cases is as simple as changing values in a spreadsheet. To create stubs from scratch:
1. Right-click the stubs project and choose New Test (*.tst) File.
2. Make sure stubs is selected under Enter or select the parent folder, set the file name to DataSourceStoreStub, then click Next.
3. Select SOA> WSDL, then click Next.
4. In the WSDL URL, specify the store WSDL location http://soatest.parasoft.com/store-01.wsdl


5. Select Generate Server Stubs and click Finish.

SOAtest will create 7 Message Stub tests: one for each of the operations defined in the WSDL.
6. Delete all the tests except Test 4: getItemByTitle Service.
7. Double-click the getItemByTitle Service test to open its editor.
8. Open the Response tab.


9. Right-click getItemByTitleResponse, then choose Populate.

10. In Number of sequence (array) items, specify 1, then click OK.

This will add the optional items elements so we can generate a data source table with columns for all the elements.
11. Save the test.


12. Right-click getItemByTitleResponse, then choose Generate CSV Data Source.

13. For the CSV File Destination, click Workspace, then select the stubs project.
14. Click OK.

A data source will be generated and configured for the test suite.


15. Save the test. Notice how all the parameters are now parameterized and mapped to data source columns.

16. Switch to the Navigator view, double-click getItemByTitleResponse.csv, then edit the file and add values to it (you can edit it with Excel—or you can replace it with the sample file included under [soatest install dir]/examples/datasources).

17. Go to the Data Source Correlation tab.


18. Under the Request XML Message section, click Add.

19. Click Edit (to the right of the XPath field), select the titleKeyword element, then click OK.


20. Under Column Name, select title, then click OK and save the test.

You have now created a stub for the getItemByTitle service and defined it so that the titleKeyword value is matched with the title column in the data source. In other words, the stub will search the title column until it finds a match for the incoming value, then use that row to populate the response for that message. You can try invoking the stub (like in the previous exercise) with one of the titles you provided in the data source. The stub will respond with the book details associated with that title.
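The correlation just configured is essentially a row lookup: scan the title column for the incoming titleKeyword and build the response from the matching row. A minimal sketch; the row values and response shape are illustrative, not the contents of the generated CSV:

```python
# Illustrative rows standing in for getItemByTitleResponse.csv.
rows = [
    {"title": "Linux in a Nutshell", "price": "34.95"},
    {"title": "Java Cookbook", "price": "44.95"},
]

def respond(title_keyword):
    """Search the title column for the incoming keyword and use the
    matching row to populate the stub's response message."""
    for row in rows:
        if title_keyword in row["title"]:
            return (f"<item><title>{row['title']}</title>"
                    f"<price>{row['price']}</price></item>")
    return "<fault>no match</fault>"

print(respond("Java"))
```

Adding a new book to the stub's behavior is then just adding a row to the spreadsheet, which is the maintainability benefit the data source approach buys.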

Creating Stubs for REST Services

You can also emulate REST services with SOAtest as follows:
1. Create a new empty .tst file under the stubs project, name it StockQuote, and add a new Message Stub tool to it.
2. Under the Test Correlation tab, select Transport and clear Enable Correlation (since we want to create a REST stub, SOAPAction HTTP headers are not applicable).

3. Under the Message Stub tool’s Response tab, switch the view to Multiple Responses.


4. Click New.

5. Select Always match under XPath function, and clear Always match under HTTP URL Parameters. This will allow the response message we are configuring to be correlated to values that come in the URL parameters (instead of values in an XML request, as we did in previous exercises).


6. Click Add.

7. Specify the parameter name symbol and the value GOOG, then click OK.


8. Open the Message tab for Response 1.

9. Provide the XML that you want returned for this request; for example:

   <quote>
     <company>Google Inc</company>
     <lastTrade>621.50</lastTrade>
   </quote>

10. Click New to add another response. Repeat the general steps above, but this time specify the value AAPL for the symbol and the desired response:

   <quote>
     <company>Apple Inc</company>
     <lastTrade>244.75</lastTrade>
   </quote>


11. Save the test.

You now have a stub that is deployed and listening on your local machine at the URL http://localhost:9080/servlet/StubEndpoint?stub=StockQuote. You can test this stub service using [soatest install dir]/examples/tests/StockQuoteClient.tst: try changing the symbol from GOOG to AAPL and to some other value, then look at the resulting traffic.
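The stub you just built correlates on a URL parameter rather than on the request body. Its selection logic resembles this sketch (the canned responses are copied from the steps above; the dispatch code is illustrative, and it assumes the client passes symbol as a query parameter):

```python
from urllib.parse import urlparse, parse_qs

# Canned responses keyed by the "symbol" URL parameter, as configured above.
RESPONSES = {
    "GOOG": "<quote><company>Google Inc</company><lastTrade>621.50</lastTrade></quote>",
    "AAPL": "<quote><company>Apple Inc</company><lastTrade>244.75</lastTrade></quote>",
}

def handle(url):
    """Pick a response by matching URL parameters instead of XPath values."""
    params = parse_qs(urlparse(url).query)
    symbol = params.get("symbol", [""])[0]
    return RESPONSES.get(symbol, "<fault>no response matched</fault>")

print(handle("http://localhost:9080/servlet/StubEndpoint?stub=StockQuote&symbol=GOOG"))
```

This is why clearing Always match under HTTP URL Parameters matters: each configured response only fires when its parameter values match the incoming request.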

Monitoring Stub Traffic and Failures

So far, we have covered a few different ways to generate and configure stubs that emulate service behavior. When you are testing an application that invokes a SOAtest stub, it is often helpful to have visibility into what requests the application is sending to the stub and whether there are any errors or request message validation failures (if you added any request validation tools to the request of the Message Stub tool).

To monitor the traffic in the Store stub created in the first or second exercise:

1. Open StubClient.tst, which we used to invoke the stub.

2. Set Stub Environment as the active environment.

3. Select the Scenario: Book Store test suite.

4. Click Add test or output.


5. Select Event Monitor, then click Finish.

6. Switch the platform in the Event Monitor to SOAtest Stub Server.

7. Save the test.

8. In the Event Monitor panel, open the Event Viewer tab. Do not close this panel.

9. Select and run the Scenario: Book Store test suite.


Notice the events that appear in the event viewer. They represent the request message received by the stub, the response it returned, and the validation results.


Testing Plain XML Services

Parasoft SOAtest can be used to test POX (Plain Old XML) services that are not necessarily SOAP Web services. Many legacy system integration initiatives have relied on plain XML messaging, and plain XML is sometimes preferred over SOAP Web services for performance reasons or to reduce complexity. If a schema for the XML messages is available, SOAtest can generate tests automatically, without the need to provide sample XML messages. Parasoft SOAtest support for plain XML services includes emulating a client that sends XML over one of the supported protocols and APIs (e.g. HTTP, JMS, etc.), or emulating a server that responds with XML over HTTP.

To generate a set of new tests using a schema:

1. Select the Test Suite: Functional Tests node and click the Add Test Suite toolbar button.

2. In the Add Test Suite wizard, expand the New Project node, select XML Schema, and click the Next button.

3. In the XML Schema dialog, enter http://soatest.parasoft.com/schema.xsd in the Schema Location field, or Browse to a schema on your machine.

4. Select Generate Messaging Clients to send plain XML messages.

5. Enter http://ws1.parasoft.com:8080/examples/servlets/Echo in the Endpoint field. This specifies where XML messages are sent. This field can be left blank if another protocol is desired or if the URL is to be provided later.


6. Click the Next button. A list of elements that are defined in the schema (directly, as well as indirectly via imports) displays. You may select one or more of these elements, and a Messaging Client test will be generated for each selection.

7. Select all elements by pressing CTRL while clicking, or by pressing CTRL+A.

8. Click the Finish button. Three tests are created.
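The element list the wizard shows corresponds to the global element declarations in the schema. A rough equivalent of that enumeration can be done with ElementTree (the inline schema here is a stand-in with three elements, not schema.xsd itself):

```python
import xml.etree.ElementTree as ET

XSD = "http://www.w3.org/2001/XMLSchema"

# Stand-in schema with three global elements, mirroring the three tests created.
schema = ET.fromstring("""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="order" type="xs:string"/>
  <xs:element name="invoice" type="xs:string"/>
  <xs:element name="receipt" type="xs:string"/>
</xs:schema>""")

# Global elements are the direct xs:element children of xs:schema.
names = [e.get("name") for e in schema.findall("{%s}element" % XSD)]
print(names)  # one Messaging Client test would be generated per name
```

Elements pulled in indirectly via imports would require resolving each imported schema the same way, which the wizard does for you.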


Extending SOAtest with Scripting

In the ever-changing world of web services, there may be situations in which you need to add custom functionality or logic to your test cases. Due to the flexible nature of SOAtest, you can easily integrate custom scripts into your testing environment. Using SOAtest's XML Assertor, you can integrate custom scripts written in Jython (Java-enabled Python; SOAtest ships with Jython 2.2.1), Java, or JavaScript into SOAtest. This means that almost any testing situation can be handled with ease, even if the situation is not directly supported by SOAtest's current tool set.

In this example you will create a scenario test using the book store service used in previous examples. In this scenario you will search for a book by its title, then validate that the price of the book is an even number. When you complete this section of the tutorial, your test suite should resemble the test suite entitled "Custom Scripting" in the SOAtestTutorial.tst file.

1. Select the Test Suite: Functional Tests node and click the Add Test Suite toolbar button.

2. In the Add Test Suite wizard, click Empty, then click Finish.

3. Double-click the new Test Suite: Test Suite node added to the test suite tree, enter Custom Scripting in the Name field in the test configuration panel, then click Save.

4. Select the Test Suite: Custom Scripting node and click the Add test or output button.

5. In the Add Test wizard, select SOAP Client in the right pane, then click Finish. A SOAP Client tool is added to the test suite.

6. Double-click the Test 1: SOAP Client node underneath the Test Suite: Custom Scripting node and enter Validate Price Value in the Name field in the right GUI panel.

7. In the WSDL tab of the test configuration panel, enter http://soatest.parasoft.com/store-01.wsdl in the WSDL URI field.

8. Open the Request tab, then select getItemByTitle from the Operation drop-down box.


9. Enter Linux as the Fixed value in the titleKeyword element entry box, and then click the Save toolbar button.

10. Right-click the Test 1: Validate Price Value node and select Add Output.

11. In the Add Output wizard, select Response> SOAP Envelope on the left and XML Assertor on the right, and click Finish. This tells SOAtest to chain an XML Assertor to the XML Response output of the SOAP Client.

12. Open the Configuration tab within the XML Assertor test configuration panel, then click the Add button.

13. In the Select Assertion dialog, expand the Value Assertion node, select Custom Assertion, and click the Next button.

The Custom Assertion dialog then displays a tree view of the XML message from which you can select a single value to enforce.


14. Select the price element in the XML tree view and click the Finish button.

The test configuration tab will now be populated with a Custom Assertion.

15. Enter the following script, which ensures that the price value is even, in the Text field of the test configuration tab:

    def checkPrice(input, context):
        price = float(input)
        if price % 2 == 0:
            return 1
        else:
            return 0
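Outside SOAtest, you can sanity-check the assertion logic with plain Python before pasting it into the tool (the context argument is unused by this script, so None stands in for it here):

```python
def checkPrice(input, context):
    price = float(input)
    if price % 2 == 0:
        return 1
    else:
        return 0

# An even price passes the assertion; an odd one fails it.
print(checkPrice("30.00", None))  # 1
print(checkPrice("23.99", None))  # 0
```

The assertion passes when the method returns a true value (1) and fails when it returns a false value (0), which is what produces the pass/fail results in the next steps.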


16. Select Python from the Language drop-down menu.

17. Select checkPrice() from the Method drop-down menu.

18. Click the Save toolbar button.

19. Select the Test 1 node and click the Test button. Notice that the test fails. If you double-click the Traffic node, you will see that the price of the Linux book is an odd number, causing the test to fail.


20. Double-click the Test 1 node and enter Java as the Fixed value in the titleKeyword entry box.

21. Click Save.

22. Click the Test button. The test succeeds because the price of the Java book is even.



Asynchronous Testing

In this age of flexible, high-performance web services, asynchronous communication is often used to exchange data, allowing the client to continue with other processing rather than blocking until a response is received. SOAtest comes packaged with a server that runs in the background and manages the asynchronous call back messages received. SOAtest supports the major asynchronous communication protocols, including Parlay, SCP, and WS-Addressing.

In this example we will use a simple web service which takes a string as input and then echoes this string back to the client in an asynchronous message exchange. This web service uses the WS-Addressing protocol. We will need to send a Message ID, which SOAtest uses to identify the message when the call back message is received, and a Call Back URL, so that the service knows where to send the call back message.

Note: It is likely that you will not be able to run the scenario in this exercise because of firewall restrictions. In order to successfully invoke this service, your machine would need to be accessible over the Internet by the Parasoft machine that sends the asynchronous response (HTTP post to SOAtest).

When you complete this section of the tutorial, your test suite should resemble the test suite entitled "Asynchronous Testing" in the SOAtestTutorial.tst file.

1. Create a new test (.tst) file as follows:

a. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.

b. Enter a name for the file (e.g. Asynchronous Testing), then click Next.

c. Select SOA> WSDL, then click Next.

d. In the WSDL wizard page, enter the following into the WSDL URL field: http://soatest.parasoft.com/echo.wsdl

e. Make sure the Create functional tests from the WSDL checkbox is selected and Generate Web Service Clients is the selected option. Create tests to validate and enforce policies on the WSDL should NOT be selected.

f. Click Next several times until you advance to the Layout page.

g. In the Layout page, choose the Asynchronous radio button and the WS-Addressing radio button, then click Finish. A new Test Suite: Test Suite folder is created which contains automatically configured asynchronous test cases for each operation defined within the WSDL.



Notice that many tests have been created under the Test Suite: echo folder. You can delete all but the last one, Scenario: echoString(string).

2. Configure the test suite as follows:

a. Double-click the new Scenario: echoString(string) node.

b. Enter Asynchronous Testing in the Name field in the right GUI panel.


c. Open the Execution Options tab and select the Tests run concurrently radio button.

3. Click Save.

4. Looking back at the Test Case Explorer, note that the Asynchronous Testing folder contains two tests:

• The first test is a SOAP Client test, which will send an initial request to the asynchronous service.

• The second is a tool called the Call Back tool. Using the Call Back tool, SOAtest is able to listen for call back messages that are sent in an asynchronous messaging exchange. A local stub server has been integrated into SOAtest, allowing the Call Back tool to listen for these incoming messages. For this reason, it is important that the server is running before executing these examples.

5. Configure and start the local server as follows:

a. Choose Window> Show View> Stub Server to open the Stub Server tab, which should appear at the bottom of the GUI.

b. If the local server is not already running (if it has a gray light rather than a green light), right-click the root Server node and select Start Server. The light next to the node should turn green, indicating that the server has been started.

c. We need to deploy a stub using the stub server to emulate the asynchronous service. Under the Stub Server node, right-click the Local machine folder and choose Add Stub.

d. Click the Workspace button next to the Message Stub Tester Suite field, browse to the AsynchronousTestingStub.tst shipped with the SOAtest examples, then click Next.


The new stub is now ready to use.

6. Configure the test to send a Message ID (which SOAtest uses to identify the message when the call back message is received) and a Call Back URL (so that the service knows where to send the call back message) as follows:

a. In the Test Case Explorer, double-click the Test 1: echoString async node in the Asynchronous Testing test suite.

b. Open the Request tab and enter the fixed value Hello World as the arg0 input parameter to this operation.

c. Open the SOAP Header tab and notice that the SOAP Headers defined within the WSDL have been automatically created and added to this test case. Select the WS-Addressing header, click Modify, and open the MessageID/ReplyTo tab. Note the value of the dropdown under wsa:MessageID.


By default, the SOAP Client tool will generate a unique messageID, which will be sent to your server. When receiving messages, the Call Back Tool will then check this messageID so that asynchronous responses can be correlated to the proper requests. In this case, though, we need to specify a messageID because we are using a stub instead of a live service.

d. Change the dropdown under wsa:MessageID from Unique to Fixed, and enter the following into the field: uuid:3799a6eb-cf84-4141-b8b4-09e1f7090734

Then click OK to close that dialog.

e. Open the Transport tab and set the endpoint to Custom with the value http://localhost:9080/servlet/StubEndpoint?stub=AsynchronousTesting

f. Click Save.

g. Double-click the Test 2: echo call back node in the Asynchronous Testing test suite. In the test configuration panel, notice that the Call Back Tool has been automatically configured to use the WS-Addressing protocol. By default, the Call Back Tool will listen for incoming messages with the same MessageID that was generated in Test 1: echoString async.

h. Because we are using a stub to emulate the service, we need to use a specific messageID for correlation. Double-click the MessageID entry in the table. In the popup dialog, change the dropdown to Fixed, enter urn:message-1 into the box, then click OK.

i. Click Save.

7. Select the Test Suite: Asynchronous Testing node and click the Test toolbar button. All the tests should succeed.

8. To see the traffic, expand both the Test 1: echoString async and Test 2: echoString call back nodes on the left, then double-click the Traffic Viewers attached to them. The Traffic Viewer attached to the SOAP Client tool will show the request message, and the Traffic Viewer attached to the Call Back Tool will show the response.
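The MessageID correlation performed by the Call Back tool can be pictured as a small table of pending requests: WS-Addressing pairs each asynchronous reply with its originating request via the reply's RelatesTo header, which must echo the request's MessageID. A sketch of that bookkeeping (the class and method names are illustrative, not SOAtest's):

```python
class CallbackCorrelator:
    """Pair asynchronous replies with requests via WS-Addressing IDs."""

    def __init__(self):
        self.pending = {}  # MessageID -> request payload

    def send(self, message_id, payload):
        # Record the outgoing request under its wsa:MessageID.
        self.pending[message_id] = payload

    def on_callback(self, relates_to, response):
        # A reply's wsa:RelatesTo must echo some request's wsa:MessageID.
        request = self.pending.pop(relates_to, None)
        if request is None:
            raise KeyError("uncorrelated callback: %s" % relates_to)
        return request, response

c = CallbackCorrelator()
c.send("urn:message-1", "Hello World")
print(c.on_callback("urn:message-1", "Hello World"))
```

This is why the stub-based exercise needs fixed IDs on both sides: with a stub, the reply's correlation ID is canned, so the listener must be told which fixed value to expect.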


WS-Security

To help you ensure that your security measures work flawlessly in terms of authentication, encryption, and access control, SOAtest contains a vast array of security tools and options that fully support the industry-standard WS-Security specification. The WS-Security test suite includes examples of encryption/decryption, digital signatures, and the addition of SOAP Headers.

The following are key security tools and options that SOAtest supports:

• XML Encryption Tool: The XML Encryption tool allows you to encrypt and decrypt entire messages or parts of messages using Triple DES, AES 128, AES 192, or AES 256. In WS-Security mode, Binary Security Tokens, X509IssuerSerial, and Key Identifiers are supported.

• XML Signer Tool: The XML Signer tool allows you to digitally sign an entire message or parts of a message depending on your specific needs. In some cases it may be important to digitally sign parts of a document while encrypting other parts.

• XML Signature Verifier Tool: The XML Signature Verifier tool allows for the verification of digitally signed documents using a public/private key pair stored within a key store file.

• Key Stores: The use of key stores in SOAtest allows you to encrypt/decrypt and digitally sign documents using public/private key pairs stored in a key store. Key stores in JKS, PKCS12, BKS, and UBER format can be used.

• Username Tokens, SAML Tokens, X509 Tokens, or Custom Headers: SOAtest supports sending custom SOAP Headers and includes templates for Username Tokens and SAML Tokens.

When you complete this section of the tutorial, your test suite should resemble the test suite entitled "WS-Security" in the SOAtestTutorial.tst file.

Unlimited Strength Java Cryptography Extension

Important: In order to perform security tests using the XML Signature Verifier, XML Signer, or XML Encryption tools, or if using Key Stores, you will need to download and install the Unlimited Strength Java Cryptography Extension. To do so, go to http://java.sun.com/javase/downloads/index_jdk5.jsp and download the JCE Unlimited Strength Jurisdiction Policy Files. The downloaded files should be installed into the following directory on your machine:

[SOAtest install dir]\[SOAtest version_number]\plugins\com.parasoft.xtest.jre.eclipse.core.[platform]_[jre version]\jre\lib\security

Be sure to replace the existing local_policy.jar and US_export_policy.jar files with the new ones that you downloaded.


Message Layer Security with SOAP Headers

In this example we will use a book store web service, which requires a Username and Password to be submitted within the SOAP Header element according to the WS-Security specification. SOAtest provides the ability to add Custom Headers and also provides pre-defined templates for creating Username Tokens and SAML Tokens. The following example uses a Username Token.

1. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.

2. Enter a name for the file, then click Next.

3. Select Empty and click Finish. An empty test suite folder is created.

4. Double-click the new Test Suite: Test Suite node that was added.

5. Type WS-Security into the Name field in the configuration panel on the right.

6. Click the Save button to save the WS-Security test suite.

7. Copy the Excel: Books data source that you added in the Functional Test lesson and paste it into this test suite.

8. Select the Test Suite: WS-Security node and click the Add Test Suite button.

9. Select Empty and click Finish. An empty test suite folder is created.

10. Type Username Tokens into the Name field in the tool configuration panel on the right.

11. Click the Save button to save the Username Tokens test suite.

12. Select the Test Suite: Username Tokens node and click the Add Test or Output button.

13. In the Add Test wizard, select Standard Test from the left pane and SOAP Client from the right pane, and click Finish. A SOAP Client tool is added to the test suite.

14. Double-click the Test 1: SOAP Client node beneath the Test Suite: Username Tokens node.


15. Complete the SOAP Client tool's configuration panel as follows:

a. Enter SOAP Client – getItemByTitle operation in the Name field.

b. Open the WSDL tab and enter the following in the WSDL URI field: http://soatest.parasoft.com/store-wss-01.wsdl

c. Open the Request tab and select getItemByTitle(string) from the Operation drop-down menu.

d. For the title element, select Keywords as its Parameterized value.

16. Click the Save toolbar button to save the modified test.

17. Run the test by clicking the Test toolbar button.

Notice that the test fails because it did not have the required Security Header. To add the required SOAP Header:

1. Double-click the Test 1: SOAP Client node.

2. Open the SOAP Header tab in the tool's configuration panel, then click the Add button. An Add New SOAP Header dialog opens.


3. Select WS-Security, then click OK.

4. Double-click the new entry added to the SOAP Header table. A dialog will open.

5. In the Timestamp tab, clear the Send Timestamp checkbox.

6. Open the Username Token tab and complete the following:

a. Enter soatest in the wsse:Username field.

b. Enter soatest in the wsse:Password field.

7. Click OK.

8. Click the Save toolbar button to save the modified test.

9. Run the test by clicking the Test toolbar button.

The test now succeeds. Double-click the Traffic Viewer node to view the SOAP Header sent in the request and verify that the service returned information about the specified books.

10. To create a regression control that will alert you to any changes in the server response in the future, right-click Test 1: SOAP Client – getItemByTitle operation and choose Create/Update Regression Control from the shortcut menu.


11. In the Response Validation Wizard, select Create Regression Control> Create Single Control, then click Finish.

If you run the test a few more times you will notice that it fails because the price element has changed. Follow the steps from previous exercises to ignore the dynamically changing price value.
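On the wire, the Username Token header configured above follows the OASIS WSS 1.0 layout. This sketch assembles an equivalent header with ElementTree (the namespace URI is the standard wsse one; the exact serialization SOAtest emits may differ in prefixes and attributes, and the password here is plain text, matching the tutorial's settings):

```python
import xml.etree.ElementTree as ET

WSSE = "http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd"
ET.register_namespace("wsse", WSSE)

# Build wsse:Security > wsse:UsernameToken > Username/Password.
security = ET.Element("{%s}Security" % WSSE)
token = ET.SubElement(security, "{%s}UsernameToken" % WSSE)
ET.SubElement(token, "{%s}Username" % WSSE).text = "soatest"
ET.SubElement(token, "{%s}Password" % WSSE).text = "soatest"  # plain-text password

print(ET.tostring(security, encoding="unicode"))
```

Seeing this structure in the Traffic Viewer is a quick way to confirm the header was actually attached to the request.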

Using the XML Encryption Tool

In this example, we will use a book store service similar to the service used in previous examples, except that:

• Request bodies must be encrypted using the key store soatest.pfx, which is located in the examples\keystores directory.

• Responses are encrypted as well and can be decrypted using the same key store.

First you will need to set up the key store:

1. Select the Test Suite: WS-Security node and click the Add Property toolbar button.


2. In the Global Property Type Selection dialog, select Global Key Store and click Finish.

3. Complete the Key Store configuration panel as follows:

a. Enter PKCS12 Keystore in the Name field in the GUI panel.

b. Make sure the Use same key store for private key checkbox is selected.

c. Click the Browse button and navigate to the location of the key store soatest.pfx:

• For Windows: C:\Program Files\Parasoft\SOAtest\[SOAtest version number]\examples\keystores

• For Linux: [SOAtest installation directory]/examples/keystores

d. Enter security in the Key Store Password field and select the Save check box. This will enable SOAtest to remember the keystore password the next time the test suite is opened.

e. Select PKCS12 from the Key Store Type drop-down menu.

f. Click the Load button.


The list of available certificate aliases within the keystore is populated into the Certificate Alias drop-down menu.

g. Select soatest in the Certificate Alias field.

h. Open the Private Key tab at the top of the Key Store configuration panel.

i. Select soatest for the Private Key Alias and enter security for the Private Key Password.

j. Select the Save key store password check box.

4. Click the Save toolbar button.

Now we are ready to set up a test using the XML Encryption tool. To better organize our security tests, we will create a new folder for the encryption test.

1. Select the Test Suite: WS-Security node and click the Add Test Suite button.

2. Select Empty and click Finish. An empty test suite folder is created.

3. Type Encryption/Decryption into the Name field in the right GUI panel.

4. Click the Save toolbar button.

5. Select the Test Suite: Encryption/Decryption node and click the Add Test or Output button.

6. Select Standard Test from the left pane, select SOAP Client from the right pane, and click Finish. A SOAP Client tool is added to the test suite.

7. Complete the SOAP Client tool's configuration panel as follows:

a. Enter SOAP Client – getItemByID operation in the Name field.

b. Open the WSDL tab and enter the following in the WSDL URI field: http://soatest.parasoft.com/store-wss-03.wsdl

c. Open the Request tab.

d. Select getItemById from the Operation drop-down menu.


e. For the id element, select ID as its Parameterized value.

8. Click the Save toolbar button.

9. Right-click the Test 1: SOAP Client - getItemByID operation node and select Add Output from the shortcut menu. The Add Output wizard displays.

10. Select Request> SOAP Envelope from the left pane, select XML Encryption from the right pane, and click the Finish button.


An XML Encryption tool is chained to the SOAP Client.

11. Complete the Request SOAP Envelope -> XML Encryption tool's configuration panel as follows:

a. Ensure that the Encrypt radio button is selected.

b. Ensure that the WS-Security Mode box is checked.

c. Select AES 256 from the Symmetric (Block Encryption) drop-down menu.

d. Open the WS-Security page and ensure that X509BinarySecurityToken is selected in the Form box.


e. Open the Target Elements page and verify that the SOAP Body/entire document checkbox is selected. This will encrypt the XML Body element. The XML Request is now set up to be encrypted when the request is sent to the service.

f. Click the Save toolbar button to save the modified test.

Now you can add an XML Encryption tool to the XML Response of the SOAP Client test to enable decryption of the XML response.

1. Right-click the Test 1: SOAP Client - getItemByID operation node and select Add Output from the shortcut menu. The Add Output wizard displays.

2. Select Response> SOAP Envelope from the left pane, select XML Encryption from the right pane, and click the Finish button.


An XML Encryption tool is chained to the SOAP Client.

3. Complete the Response SOAP Envelope -> XML Encryption tool's configuration panel as follows:

a. Select the Decrypt radio button.


b. Select the PKCS12 Keystore from the Key Store drop down menu.

4. Click the Save toolbar button to save the modified test.

5. Run the test by clicking the Test toolbar button.

6. Double-click the Traffic Viewer node to view the encrypted data.

7. Right-click the Test 1: SOAP Client - getItemByID operation node and select Create/Update Regression Control.

8. In the dialog that opens, select Create Regression Control> Create Multiple Controls, then click Finish.


Regression controls are created and automatically chained to the Response SOAP Envelope -> XML Encryption. Notice that the decrypted responses are shown in the Regression Control.

Finally, you want to ignore dynamic values from the XML Response so that the Regression Control does not fail each time:

1. Double-click the XML Document -> Diff node and, in the right GUI panel, set the Diff Mode to XML.

2. Select the Form XML view. When the Form XML tab is selected, a popup will appear asking whether to override with values from the Literal XML view. Click Yes.

a. Right-click the price element and select Setup Ignored XPath from the shortcut menu. An Ignore XPath Setting dialog appears. Click OK to ignore modifications to the text content of the price element.

b. Repeat the previous step for the CipherValue element.

c. Right-click the DataReference element and select Setup Ignored XPath. An Ignore XPath Setting dialog appears. Select the Attribute check box to ignore changes to the attributes of the DataReference element. Click OK.

d. Select the Literal XML button to switch back to Literal XML view.

3. Click the Save toolbar button to save the modified test.

4. Run the test by clicking the Test toolbar button.

Using the XML Signer Tool

In the next example, we will use a book store service which requires request bodies to be signed with the certificate in the key store soatest.pfx. Responses from this service are signed as well and can be verified using the same key store. We will use the same key store settings from the previous example.

1. Select the Test Suite: WS-Security node and click the Add Test Suite button.

2. Select Empty and click Finish. An empty test suite folder is created.

3. Type Sign/Verify into the Name field in the right GUI panel, then click the Save toolbar button.


4. Select the Test Suite: Sign/Verify node and click the Add Test or Output button.

5. Select Standard Test from the left panel and SOAP Client from the right panel, and click Finish. A SOAP Client tool is added to the test suite.

6. Complete the SOAP Client tool's configuration panel as follows:

a. Enter SOAP Client – placeOrder operation in the Name field.

b. Open the WSDL tab and enter the following in the WSDL URI field: http://soatest.parasoft.com/store-wss-02.wsdl

c. Open the Request tab.

d. Select placeOrder(int, int) from the Operation drop-down menu.

e. Select the itemId parameter and select ID as its Parameterized value.

f. Select the count parameter and enter 1 as its Fixed value.

7. Click Save to save the modified test.

8. Right-click the Test 1: SOAP Client - placeOrder operation node and select Add Output from the shortcut menu. The Add Output wizard displays.

9. Select Request> SOAP Envelope from the left panel, select XML Signer from the right panel, and click Finish. An XML Signer tool is chained to the SOAP Client.

10. Complete the XML Signer tool's configuration panel as follows:

a. Select PKCS12 Keystore from the Key Store drop-down menu.

b. Select RSAwithSHA1 (PKCS1) – http://www.w3.org/2000/09/xmldsig#rsa-sha1 from the Algorithm drop-down menu.


c. Open the WS-Security page and select X509IssuerSerial from the Form box.

d. Open the Target Elements page and verify that the SOAP Body/entire document checkbox is selected. The XML Request is now set up to be signed when the request is sent to the service.

e. Click Save to save the modified test.

Now you can add an XML Signature Verifier tool to the XML Response of the SOAP Client test to enable signature verification of the XML response:

1. Right-click the Test 1: SOAP Client - placeOrder operation node and select Add Output from the shortcut menu. The Add Output wizard displays.

2. Select Response> SOAP Envelope from the left pane, select XML Signature Verifier from the right pane, and click Finish. An XML Signature Verifier tool is chained to the Test 1: SOAP Client - placeOrder operation node.

3. Complete the XML Signature Verifier tool's configuration panel as follows:

a. Select the Use Key Store checkbox and choose PKCS12 Keystore from the drop-down menu.

b. Ensure that the WS-Security Mode check box is checked.

4. Click the Save toolbar button to save the modified test.

5. Run the test by clicking the Test toolbar button.


6. Double-click the Traffic Viewer node to view the signed data. Because the test succeeds, we know that the server accepted our signed request and that the server's signed response was successfully verified.

XML Encryption and Signature Combined

In this example, we will create a more complex test using a book store service which combines the security requirements of the previous two exercises. This service requires request bodies to be signed and encrypted using the key store soatest.pfx. The responses from this service are signed and encrypted as well, and can be decrypted and verified using the same key store.

1. Select the Test Suite: WS-Security node and click the Add Test Suite button.

2. Select Empty and click Finish. An empty test suite folder is created.

3. Type Encryption and Signature Combined into the Name field in the right GUI panel, then click the Save toolbar button.

4. Select the Test Suite: Encryption and Signature Combined node and click the Add Test or Output button.

5. In the Add Test wizard, select Standard Test from the left pane and SOAP Client from the right pane, and click Finish. A SOAP Client tool is added to the test suite.

6. Complete the SOAP Client tool's configuration panel as follows:

a. Enter SOAP Client – getItemByTitle operation in the Name field.

b. Open the WSDL tab and enter the following in the WSDL URI field: http://soatest.parasoft.com/store-wss-04.wsdl

c. Open the Request tab.

d. Select getItemByTitle from the Operation drop-down menu.


e. Select the title parameter and enter Linux as its Fixed value.

7. Right-click the Test 1: SOAP Client - getItemByTitle operation node and select Add Output from the shortcut menu. The Add Output wizard displays.

8. Select Request> SOAP Envelope from the left pane, select XML Signer from the right pane, and click the Finish button. An XML Signer tool is chained to the SOAP Client.

9. Complete the XML Signer tool's configuration panel as follows:

a. Select RSA from the Algorithm drop-down menu.

b. Select PKCS12 Keystore from the Key Store drop-down menu.

c.

Open the WS-Security page and choose X509BinarySecurityToken from the drop down menu.

d. Open the Target Elements page and ensure that SOAP Body/entire document is checked. The XML Request is now set up to be signed when the request is sent to the service. Next you can add an XML Encryption Tool to the XML Response of the XML Signer Tool to encrypt the signed document. 1. Right-click the Request SOAP Envelope> XML Signer node and select Add Output from the shortcut menu. The Add Output wizard displays. 2. Select XML Encryption and click the Finish button. An XML Encryption Tool is chained to the XML Response of the XML Signer Tool. 3. Complete the XML Encryption Tool tool configuration panel as follows:


a. Ensure that the Encrypt radio button is selected.

b. Choose PKCS12 Keystore from the Key Store drop-down menu.

c. Select AES 256 from the Symmetric drop-down menu.

d. Open the WS-Security page and select X509BinarySecurityToken from the Form box.

e. Open the Target Elements page and verify that the SOAP Body/entire document checkbox is selected. The XML request is now set up to be signed and then encrypted when it is sent to the service.

f. Click Save to save the modified test.

4. Run the test by clicking the Test toolbar button.

5. Double-click the Traffic Viewer node to view the server response.
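To help interpret what you see in the Traffic Viewer, the following is a rough sketch of the general shape of a SOAP request secured with both a signature and encryption. Element names follow the WS-Security, XML-Signature, and XML-Encryption standards; the exact structure, namespaces, and attribute values that SOAtest emits may differ in detail, and the base64 content shown is a placeholder.

```xml
<!-- Hedged sketch only: a WS-Security header carrying a binary security
     token and a signature, plus an encrypted SOAP Body. Values such as
     "MIIC..." are placeholders, not real token or cipher data. -->
<soapenv:Header>
  <wsse:Security>
    <wsse:BinarySecurityToken ValueType="...#X509v3">MIIC...</wsse:BinarySecurityToken>
    <ds:Signature>
      <ds:SignedInfo>
        <ds:Reference URI="#Body"/>  <!-- signs the SOAP Body -->
      </ds:SignedInfo>
      <ds:SignatureValue>dGhpcy...</ds:SignatureValue>
    </ds:Signature>
  </wsse:Security>
</soapenv:Header>
<soapenv:Body wsu:Id="Body">
  <xenc:EncryptedData>
    <xenc:EncryptionMethod Algorithm="...#aes256-cbc"/>
    <xenc:CipherData><xenc:CipherValue>9s7H...</xenc:CipherValue></xenc:CipherData>
  </xenc:EncryptedData>
</soapenv:Body>
```

In the Traffic Viewer, the request body should appear as cipher text of roughly this form, while the decrypted response is shown after verification succeeds.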

Automatically Generating WS-Security Tests with WS-SecurityPolicy

Parasoft enables automatic test creation to enforce runtime security policies. This helps you automatically generate the correct tests with the correct settings so the services can be invoked instantly. Furthermore, by managing the policies at the project test level, you can more easily create and manage various policy variations in order to test the services properly, with both positive and negative cases. SOAtest recognizes WS-SecurityPolicy assertions in the WSDL when using the WS-PolicyAttachment standard. To automatically generate tests from a WSDL with WS-SecurityPolicy assertions, complete the following:

1. Select the Test Suite: WS-Security node and click the Add Property toolbar button.

2. In the Global Property Type Selection dialog, select WS-Policy Bank and click Finish. A WS-Policy Banks node is added to the Test Case Explorer.

3. In the WSDL Policies configuration panel on the right side of the GUI, enter http://soatest.parasoft.com/store-wss-04.wsdl in the WSDL URL field and click the Refresh from WSDL button. The Global Policies are populated.

Notice that the policy nodes include the WS-SecurityPolicy configuration corresponding to the WS-SecurityPolicy assertions in the WSDL, and that the generated tests are automatically configured with the signer and encryption tools on the request, because the policy dictates so. Since a keystore has already been added to the test suite, the tests are ready to run. If you have not added a keystore, one needs to be configured. For more information on adding a keystore, see “Using the XML Encryption Tool”, page 144.
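For reference, the following sketch shows how a WS-SecurityPolicy assertion might be attached to a WSDL binding via WS-PolicyAttachment. The assertion and attachment elements come from the WS-Policy and WS-SecurityPolicy specifications; a real WSDL would also declare the namespace prefixes, and the policy in the tutorial WSDL may be structured differently.

```xml
<!-- Hedged sketch: a policy requiring the request body to be signed and
     encrypted, referenced from a binding. Names like "StoreBinding" are
     illustrative placeholders, not taken from the tutorial WSDL. -->
<wsp:Policy wsu:Id="SignEncryptBodyPolicy">
  <wsp:All>
    <sp:SignedParts>
      <sp:Body/>   <!-- the request body must be signed -->
    </sp:SignedParts>
    <sp:EncryptedParts>
      <sp:Body/>   <!-- the request body must also be encrypted -->
    </sp:EncryptedParts>
  </wsp:All>
</wsp:Policy>

<wsdl:binding name="StoreBinding" type="tns:StorePortType">
  <wsp:PolicyReference URI="#SignEncryptBodyPolicy"/>
  ...
</wsdl:binding>
```

When SOAtest reads assertions of this kind from the WSDL, it can configure the corresponding signer and encryption tools on the generated clients automatically.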



Design and Development Policy Enforcement

As a greater number of Service Oriented Architectures (SOA) are deployed throughout the industry, the need arises to enforce policies and best practices on all components of the SOA. Policy enforcement over these components helps to ensure interoperability, consistency, and re-usability throughout the life cycle of the SOA. SOAtest provides SOA architects the ability to create and manage design-time SOA policies. A SOAtest “policy” combines static analysis policy configurations for XML artifacts (WSDLs, schemas, and SOAP) with semantic and schema validation tests. SOAtest allows an architect to create a policy configuration which combines Coding Standard tool rule assertions with test assertions such as schema validity and WS-I interoperability. The SOA policy configuration interface is very similar to rule configurations in Parasoft's language products (Jtest for Java, C++test for C and C++, .TEST for .NET languages). SOAtest saves and loads policies in an XML format which extends WS-Policy.

When you complete this section of the tutorial, your test suite should resemble the test suite entitled "Design and Development" in the SOAtestTutorial.tst file.

Enforcing Design-Time SOA Policies

For this example we will create policy enforcement tests for a book store service with the WSDL located at http://soatest.parasoft.com/store-01.wsdl.

1. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.

2. Enter a name for the file (e.g., Policy Enforcement), then click Next.

3. Select SOA> WSDL, and click Next to advance to the WSDL dialog.

4. Select http://soatest.parasoft.com/store-01.wsdl from the WSDL URL field.


5. Check the Create tests to validate and enforce policies on the WSDL check box and make sure the Create functional tests from the WSDL check box is also checked.

6. Click Next until you advance to the Policy Enforcement dialog.

• Select the Apply Policy Configuration check box. This will create WSDL and functional tests that will enforce the assertions defined in the specified policy configuration.


The default policy configuration, soa.policy, is a collection of industry-wide best practices. To use a custom policy configuration, you can either use the Browse button to select a policy configuration or enter the policy configuration's path in the text field. For details on policy enforcement, see “SOA Policy Enforcement: Overview”, page 570.

7. Click the Finish button.

8. Double-click the new Test Suite: Test Suite node added to the test case tree, enter Policy Configuration in the Name field in the test configuration panel, and click the Save toolbar button.

9. Expand Test Suite: Policy Configuration, then Test Suite: WSDL Tests. Notice that Test 4: Policy Enforcement has been added to Test Suite: WSDL Tests.


10. Expand the Test 4: Policy Enforcement test to view its chained tools. You will see two Coding Standards tools, one for enforcing rules on the WSDLs and one for enforcing rules on the schemas.

• The first tool, WSDL> WSDL Policy Enforcer, is chained to the WSDL Output of the Test 4: Policy Enforcement test and thus is passed the base WSDL and all imported WSDLs for rule enforcement.

• The second Coding Standards tool, titled Schema> Schema Enforcer, is chained to Test 4: Policy Enforcement's Schema Output and thus is passed all schema files referenced in the WSDL for rule enforcement.

11. Expand one of the tests in the Test Suite: ICart node and notice that a referenced Coding Standards tool titled Response SOAP Envelope> SOAP Policy Enforcer has been chained to the test. This tool will apply its contained policy configuration to the messages received by this test client. The tool is a reference to a Global Tool in the Tools Test Suite under the root Test Suite. For more information on Global Tools, see “Global Tools”, page 341.

12. Select the Test 4: Policy Enforcement test and click the Test toolbar button. This will run policy enforcement tests on the WSDL and schema files. If any errors occur, they will be reported in the SOAtest view.


Defining Custom SOA Policies

In the previous exercise, we enforced policies using a default policy configuration. For this example, we will define a custom SOA policy.

1. Open the pull-down menu for the New toolbar button (top left), then choose SOA Policy Configuration File.

2. Enter a name for the policy in the Policy name field, then click the Finish button. The Policy Configuration panel displays in the right GUI pane of SOAtest and lists assertions that correspond to policy enforcement rules and WSDL tests.

3. From the Policy Configuration panel, you can:

• Enable/disable individual assertions by selecting or unselecting the corresponding check boxes.

• Access help documentation for assertions by right-clicking and selecting View Rule Documentation from the shortcut menu.

• Import custom rules designed using SOAtest’s RuleWizard feature by clicking Add.


4. Click Save to save the custom policy to the default SOAtest rules folder. The policy configuration you define can be used later to automatically create tests to enforce policies.


Automation/Iteration (Nightly Process)

This lesson teaches you how to run tests from the command line, which allows you to configure SOAtest to automatically check the complete project at a specified time each night (or at another interval). This ensures that testing occurs consistently without being disruptive or obtrusive. SOAtest’s command line mode allows you to perform tests from Windows or UNIX command line shells and to run SOAtest from automated build utilities such as Ant, Maven, and CruiseControl. The following exercises are designed to demonstrate the basics of using soatestcli.

Important: A command-line license is required to use soatestcli. This license is provided with SOAtest Server Edition.

Running a Test Suite From the Command Line

In this example, we will run SOAtestTutorial.tst (found in the examples directory) from the command line.

1. Close SOAtest and open a command line window.

2. Switch to the directory where SOAtest is installed.

3. From the command line window, type the following command:

• On Windows: soatestcli.exe -config <configuration name> -resource "C:\Location Of SOAtestTutorial.tst" -report MySampleReport

• On UNIX (where /Location Of SOAtestTutorial.tst represents the location of SOAtestTutorial.tst on disk): soatestcli -config <configuration name> -resource "/Location Of SOAtestTutorial.tst" -report MySampleReport

Running all Projects in a Workspace

soatestcli.exe -data "c:\mySOAtestWorkspace" -showdetails -config "user://Example Configuration" -report "c:\mySOAtestReports"

The -data option specifies the Eclipse workspace location. The -showdetails option prints detailed test progress information. The -config option specifies the test configuration. The -report option generates an HTML report.

Running an Individual Project in a Workspace

soatestcli.exe -data "c:\mySOAtestWorkspace" -resource "/MyProject" -exclude "**/somebadtesttoskip.tst" -showdetails -config "user://Example Configuration" -report "c:\mySOAtestReports"

To run an individual project in a workspace, you must specify the project to be tested with the -resource option. The -exclude option specifies files to be excluded during testing.
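For unattended nightly runs, invocations like the ones above are typically wrapped in a small script that a scheduler (cron, Task Scheduler, or a CI job) executes. The following is a minimal sketch, assuming soatestcli is on the PATH; the workspace, configuration, and report paths are placeholder values, and the script only echoes the assembled command so it can be logged before running.

```shell
#!/bin/sh
# Hypothetical nightly wrapper around soatestcli. All paths and the
# configuration name below are placeholders for this illustration.
WORKSPACE="/home/build/mySOAtestWorkspace"
CONFIG="user://Example Configuration"
REPORT_DIR="/home/build/mySOAtestReports"

# Assemble the command as a string so it can be logged before execution.
CMD="soatestcli -data \"$WORKSPACE\" -showdetails -config \"$CONFIG\" -report \"$REPORT_DIR\""
echo "$CMD"

# Uncomment to actually execute the tests:
# eval "$CMD"
```

Scheduling this script nightly gives the consistent, non-disruptive testing cadence described at the start of this lesson.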


Using a localsettings File

Local settings files can control report settings, Report Center settings, error authorship settings, and Team Server settings. You can create different local settings files for different projects, then use the -localsettings option to indicate which file should be used for the current command line test. Each local settings file must be a simple text file. There are no name or location requirements. Each setting should be entered on a single line. If a parameter is specified in this file, it will override the related parameter specified from the GUI. If a parameter is not specified in this file, SOAtest will use the parameter specified in the GUI.

soatestcli.exe -data "c:\mySOAtestWorkspace" -showdetails -config "user://Example Configuration" -report "c:\mySOAtestReports" -publish -localsettings "c:\mylocalsettings.properties"

Example localsettings file:

grs.enabled=true
grs.server=grs.server.com
grs.port=32323
grs.log_as_nightly=true
tcm.server.enabled=true
tcm.server.name=tcm.server.com
tcm.server.port=18888
report.mail.enabled=true
report.mail.server=smtp.server.com
report.mail.domain=server.com
report.mail.subject=My Nightly Tests
[email protected]
report.mail.exclude.developers=false
scope.sourcecontrol=true
scope.local=false
soatest.license.use_network=true
soatest.license.network.host=ls.server.com
soatest.license.network.port=2002
soatest.license.network.edition=server_edition
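Because the settings are plain key=value lines, a localsettings file can be generated by a script as part of the nightly job. A sketch, using a subset of the property names listed above with placeholder host names:

```shell
#!/bin/sh
# Generate a minimal localsettings file for a nightly run.
# Host names are placeholders; property names are from the example above.
cat > mylocalsettings.properties <<'EOF'
grs.enabled=true
grs.server=grs.server.com
grs.port=32323
report.mail.enabled=true
report.mail.server=smtp.server.com
EOF

# SOAtest would consume the file via:
#   soatestcli -config "user://Example Configuration" -localsettings mylocalsettings.properties
grep -c '=' mylocalsettings.properties
# prints 5 (one key=value pair per line)
```

Generating the file per project keeps report and license settings versionable alongside the test assets.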


Running Regression Tests in Different Environments

It is common for Web service applications to be developed and maintained by different teams in different environments. For example, a developer may start with tests on a server deployed on his or her local machine; as the server is deployed to a development build server, the same tests would need to be executed against that server; then QA and testing teams would need to run the same regression tests on their own integration server. Because reusing and sharing test assets is critical for achieving a highly efficient process, Parasoft SOAtest includes an “Environments” management feature which makes such tasks easy. The New Test Suite wizard includes an option to create an environment configuration for the tests that are generated automatically. To create a new test suite with preconfigured environment variables:

1. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.

2. Enter a name for the file, then click Next.

3. Select SOA> WSDL and click Next.

4. Enter http://soatest.parasoft.com/calculator.wsdl in the WSDL URL field.


5. Select the Create functional tests from the WSDL check box and select the Generate Web service clients radio button.

6. Click the Next button twice to advance to the Create Environment page.

7. Review the settings and options, then click Finish. A new Test Suite: Test Suite node displays in the Test Case Explorer tab. New environment variables are added to the Default Calculator Environment node.

8. Double-click the Default Calculator Environment node.


Notice how the environment configuration now includes variables for the HOST, PORT, and PATH to the service defined in the environment. The same variables are referenced by name in each of the automatically generated SOAP Client tests (look under the Transport tab). To create a new environment configuration, complete the following:

1. Right-click the Environments node and select New Environment. A New Environment node appears.

2. Double-click the New Environment node and enter Echo Environment in the Name field in the test configuration panel.

3. Click the Add button and enter the following values for the corresponding variable names:

• CALCULATOR_HOST: ws1.parasoft.com

• CALCULATOR_PORT: 8080

• CALCULATOR_PATH: examples/servlets/Echo

4. Click Save to save the new environment.

5. Right-click the new Echo Environment node and select Set as Active Environment from the shortcut menu. This will set the new environment as the active configuration for the test project. Running the tests again will cause the SOAP messages to be sent to the Echo servlet on ws1.parasoft.com instead of the original calculator service.

Environment configurations can be exported to and imported from external XML files, as well as uploaded to and referenced from the Parasoft Team Server. Environment variables can be referenced from most of the fields in the test settings GUI, not just URL fields.
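Conceptually, switching the active environment just changes the values substituted into the endpoint fields. The following rough illustration mimics that substitution with shell parameter expansion; the variable names are the ones from this exercise, but the substitution itself is something SOAtest performs internally, not a step you run.

```shell
#!/bin/sh
# The Echo Environment values from this exercise.
CALCULATOR_HOST="ws1.parasoft.com"
CALCULATOR_PORT="8080"
CALCULATOR_PATH="examples/servlets/Echo"

# A SOAP Client endpoint that references the variables by name resolves to:
ENDPOINT="http://${CALCULATOR_HOST}:${CALCULATOR_PORT}/${CALCULATOR_PATH}"
echo "$ENDPOINT"
# prints http://ws1.parasoft.com:8080/examples/servlets/Echo
```

Switching back to the Default Calculator Environment would resolve the same endpoint expression to the original calculator service instead.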

Applying an Environment Configuration to a Regression Test from the Command Line

The greatest benefit of environments is the ability to rerun the same regression suites from the command line without the need to open the SOAtest GUI and modify host or URL settings. From the command line, run a command like:

soatestcli.exe -config <configuration name> -resource <path to test suite name.tst relative to the workspace> -environment "Default Calculator Environment"

Then try:

soatestcli.exe -config <configuration name> -resource <path to test suite name.tst relative to the workspace> -environment "Echo Environment"

This will run the same suite with the second environment applied to it.


Web Functional Testing

Introduction

Web interface testing is difficult to automate. Teams often abandon automated testing in favor of manual testing due to too many false positives or too much effort required to maintain the test suites. SOAtest facilitates the creation of automated test suites that are reliable and dependable. Its ability to isolate testing to specific elements of the Web interface eliminates noise and provides accurate results. SOAtest isolates and tests individual application components for correct functionality across multiple browsers without requiring scripts. Dynamic data can be stubbed out with constant data to reduce test case noise. Validations can be performed at the page object level as well as the HTTP message level. SOAtest also verifies the client-side JavaScript engine under expected and unexpected conditions through asynchronous HTTP message stubbing.

The following exercises demonstrate how to use SOAtest to perform functional testing on the Web interface. They cover:

• Recording in a Browser

• Adding a Data Source

• Parameterizing a Form Input

• Configuring Validation on a Page Element

• Configuring a Browser Stub

• Playing a Recorded Scenario in a Browser

• Performing Static Analysis During Functional Testing

Recording in a Browser

To record in a browser:

1. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.

2. Enter a name for the file, then click Next.

3. Select Web> Record web functional tests and click Next.

4. In the first Record Web Functional Tests wizard page, ensure that Record new functional test is selected, then click Next.

5. Complete the next Record Web Functional Tests wizard page as follows:


a. Enter Web Functional Testing in the Test Suite Name field.

b. Enter www.endless.com in the Start Recording From field.

c. Ensure that the following options are checked, and the others are not:

• Generate Functional Tests

• Generate Asynchronous Request Tests

d. Click the Finish button. The test will begin, and a browser window will open.

6. Within the browser window that opens, perform the following actions:


a. Click the Men's Shoes link toward the top of the page.

b. Type leather into the Search field to the right of the drop-down menu showing Men's Shoes, then click the Go button immediately to the right of this field.

c. Select the Sale & Clearance radio button from the Show options on the right of the screen.

d. Close the browser to end recording. In the Test Case Explorer view, SOAtest creates a new .tst file and a Web Functional Testing test suite which contains the generated tests for the scenario that you just recorded.


7. Expand the new test suite node in the Test Case Explorer to view the tests created for each user action taken during the recording.

Adding a Data Source

To add a data source:

1. Right-click the Scenario: Web Functional Testing node, then choose Add New> Data Source from the shortcut menu.


2. In the New Project Data Source wizard, select Table and click Finish.

A new Data Sources node is added to the Scenario: Web Functional Testing branch.

3. In the New Datasource table configuration panel that opens on the right side of the GUI, make sure the First row specifies column names checkbox is checked.


4. Enter Material in the top cell in column A. In the same column, enter suede in row 1 and canvas in row 2.

5. Click Save to save the changes.

Parameterizing a Form Input

To parameterize a form input:

1. Expand the Scenario: Form: keywordSearchForm branch to view the recorded actions related to the search form.

2. Double-click the Test 1: Type "leather" node to open the test configuration panel.

3. Note that the Pre-Action Browser Contents tab shows what the page looked like before the test action (typing leather) was performed. It also uses a blue border to highlight the user action for this test.

4. In the test configuration panel, open the User Action tab.

5. Near the top of that tab, change the Text Input Value drop-down menu from Fixed to Parameterized.

6. Select Material from the menu that appears to the immediate right.

7. Click Save to save the changes.

Configuring Validation on a Page Element

To configure a validation on a page element:

1. Fully expand the Scenario: Form: shippingOptionFilterForm branch.


2. Double-click the Browser Contents Viewer node under Test 1.

The Browser Contents Viewer tool configuration panel will open.

3. In the Browser Contents Viewer configuration panel, right-click some element on the page (for example, the number of results shown at the top of the page) and select Extract Value for <Element>... from the shortcut menu.

4. In the dialog that opens, ensure that the text property is selected in the Property name box.

5. Click Next two times (until you reach the Validate or Store Data page).

6. Ensure that Validate the value is selected, and that the expected value matches the results number displayed on the rendered page.


7. Click Finish. There is now a Browser Validation tool added to this test that is set up to check that the element you selected remains the same as the application evolves.

8. Double-click the added Browser Validation Tool node.

9. In the Browser Validation tool’s test configuration panel, notice that the validation is displayed. If you wanted to reconfigure the validation settings, you could do so here.

Configuring a Browser Stub

A stub is static data that SOAtest saves when recording a functional test scenario through a web site. Stubbing helps to verify that any changes to the client-side code do not affect the final resulting HTML page by feeding unchanging data to the client in place of the actual server response. To configure a message stub:

1. Expand the Scenario: Form: shippingOptionFilterForm branch.

2. Right-click the test labeled Test 1: Click "shippingOptionFilter" and select Add Output from the shortcut menu.

3. In the Add Output wizard that opens, choose HTTP Traffic, then click Next.

4. Select 1. http://www.endless.com/searchrequest/ref=sr_nr_onsale and click Next.

5. In the left panel, select Both> Stub Request/Response.

6. In the right panel, select any Browser Stub.


7. Click Finish.

8. In the Response Body section of the Browser Stub configuration view (opened on the right side of the GUI), change the number for numPrimaryResults from its existing value to 1,234.

• The easiest way to do this is to copy the entire text in the Response Body into a text editor (e.g. Notepad), use its search capability to find 'numPrimaryResults', modify the value, and then copy all the text from the editor and replace the current contents in the Response Body.

• This response body will be provided in place of the actual server response when the recorded scenario is run later in this example.

9. Click Save to save the changes.

Playing a Recorded Scenario in a Browser

To play back the recorded scenario in a browser:

1. In the Test Case Explorer, select the Scenario: Web Functional Testing node.

2. Click the Test toolbar button. The recorded scenario will now be played back in your browser, once for each search value that we parameterized in a previous section of this example. Please wait for each action to be played out in your browser.

Note: If you have been following the complete tutorial, errors will be reported due to the Browser Stub tool that was previously added. This is expected.

Performing Static Analysis During Functional Testing

To configure SOAtest to perform static analysis on the Web pages that the browser downloads as Web functional tests execute:

1. In the Test Case Explorer, select the Scenario: Web Functional Testing node.

2. Open the Test toolbar button’s pull-down menu, then choose one of the available static analysis configurations.


When SOAtest is finished performing static analysis, static analysis errors are shown in the SOAtest view. To facilitate review of these results, open the SOAtest view’s Layout menu and choose SOAtest Static Analysis for Functional Tests Layout.


Web Static Analysis

Introduction

While functional testing finds problems by simulating how click paths would operate in a browser, static analysis finds problems by inspecting pages’ source code. Static analysis is just like a code review or code inspection. It reads and analyzes your source files, then it lets you know if it finds any coding errors that could cause functionality or presentation problems. It also pinpoints broken links and other navigational problems it finds. In addition, it can alert you to code that might not work correctly when people with disabilities try to use your site with adaptive devices, such as machines that read screen content out loud or convert content into Braille. Furthermore, static analysis can verify that custom design and content requirements, such as corporate branding, are met.

SOAtest facilitates static analysis by automating complex analyses that would otherwise take days. Static analysis can be customized to involve tools that help expose and prevent problems such as:

• Coding constructs that make code more error-prone and difficult to update.

• Navigational problems such as broken links, actions that invoke designated error pages, anchor problems, non-clickable links, and so forth.

• HTML, CSS, JavaScript, and VBScript coding problems that affect presentation, execution, dynamic content, performance, transformation, display in non-traditional browsers, etc.

• XML problems that affect transformations and data retrieval.

• Code and content that violates Web accessibility guidelines.

• Content that contains misspellings and typos.

• Code that violates application-specific requirements and business rules.

• Code that violates project-specific or organization-specific branding, design, or content guidelines.

The following exercises will demonstrate how to use SOAtest's Static Analysis feature to create a project, load an initial site into that project, and then quickly and easily analyze that project's files for any coding errors. We will not use the projects created from the previous exercises.

Performing Static Analysis

To perform static analysis, complete the following:

1. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.


2. Enter a name for the file, then click Next.

3. Select Web> Scan web application and click Next.

4. Complete the Scan HTTP/FTP/Local Resources page as follows:

a. Enter parabank.parasoft.com in the Start URL field. SOAtest will add this URL to the Allowable/Restricted URLs table.

b. Check the Limit loading depth to checkbox and enter 2 in the field to the immediate right. This depth setting determines the number of clicks (links) deep to load the site. For example, a loading depth of 2 tells SOAtest to load only pages that can be reached in two clicks from the start URL (note that a redirect also counts as one level of depth).

c. In the Form Options section at the bottom of the panel, set the options as follows:

• Scan Forms With Default Inputs: Unchecked.

• Fill Active Inputs Manually: Checked.

d. Click Finish. A new .tst file, and a test suite with one test, will be added to the Test Case Explorer.


5. Right-click the Test 1: Scanning Tool node and select Test Configurations from the shortcut menu.

6. In the Test Configuration dialog that appears, select User-defined in the left panel and click the New button under the panel.

7. In the new configuration panel that appears to the right, change the Name field to ScanTesting Configuration.

8. Open the Static tab, and check the Enable Static Analysis checkbox.

9. In the list of rules below in the same tab, check the following options:

• Accessibility - WCAG 2.0 [ACC-WCAG2]

• Coding Convention [CC]

• Check Links [CLINK]


10. Click the Apply button to save the configuration, then click Close.

11. Select the scan testing test suite in the Test Case Explorer.

12. Open the pull-down menu for the Test toolbar button, then choose Test Using> User-defined> ScanTesting Configuration. While scanning the specified site, SOAtest will open a Form dialog box each time it loads a form that allows user input. In this example, a Form input dialog box will open for the form1 form.

13. Complete the Form: form1 dialog as follows:

a. Select Fixed from the Text: "username" drop-down menu, then enter john in the corresponding text field.

b. Select Fixed from the Password: "password" drop-down menu, then enter demo in the corresponding text field. This tells SOAtest how to populate this particular form's input elements. The page returned after these inputs are submitted will be tested and included in the Project tree.


c. Click the Add button. SOAtest will then reopen the same form dialog box so you can enter additional inputs if desired.

d. Click the Skip All button to indicate that you do not want to enter any more inputs for forms in this site.

When SOAtest is finished performing static analysis, static analysis errors are shown in the SOAtest view. To facilitate review of these results, open the SOAtest view’s Layout menu and choose SOAtest Static Analysis for Functional Tests Layout.

Additional Options

You can customize static analysis tests by:

• Enabling or disabling static analysis rules.

• Customizing the parameters of the various rules applied during static analysis.

• Designing rules that verify compliance with unique team or project requirements, then configuring SOAtest to apply those rules during static analysis.

• Configuring SOAtest to suppress error messages that are not relevant to your project.


Team-Wide Deployment

In this section:

• Team-Wide Deployment - Configuration Overview

• Team-Wide Deployment - Usage Overview


Team-Wide Deployment - Configuration Overview

In this section:

• Configuring a Team Deployment: Introduction

• Connecting All SOAtest Installations to Your Source Control System

• Connecting All SOAtest Installations to Team Server

• Connecting SOAtest Server to Report Center

• Connecting All SOAtest Installations to Parasoft Project Center

• Configuring Team Test Configurations and Rules

• Configuring Task Goals

• Configuring Task Assignment

• Sharing Project and Test Assets

• Configuring Automated Nightly Testing


Configuring a Team Deployment: Introduction

The recommended way to configure a team-wide SOAtest deployment is to perform the following tasks in the specified order:

1. Installing and licensing SOAtest Professional Edition on all team machines, SOAtest Architect Edition on the architect’s or lead developer or tester’s machine, and SOAtest Server Edition on a team server machine.

2. Connecting All SOAtest Installations to Your Source Control System

3. Connecting All SOAtest Installations to Team Server

4. Connecting SOAtest Server to Report Center

5. Connecting All SOAtest Installations to Parasoft Project Center

6. Configuring Team Test Configurations and Rules

7. Configuring Task Goals

8. Configuring Task Assignment

9. Sharing Project and Test Assets

10. Configuring Automated Nightly Testing


Connecting All SOAtest Installations to Your Source Control System

This topic explains how to connect SOAtest to your source control system. This connection enables SOAtest to use file revision history data in order to automatically assign test failures or policy violations to the responsible team members. It is required if you will be using SOAtest to automate peer review of artifacts such as configuration files, Web code, WSDLs, etc. Sections include:

• About SOAtest's Source Control Support

• Enabling Source Control Support

• AccuRev Configuration

• ClearCase Configuration

• CM Synergy Configuration

• CVS Configuration

• Perforce Configuration

• Serena Dimensions Configuration

• StarTeam Configuration

• Subversion Configuration

• Visual SourceSafe Configuration

• Specifying Source Control Definitions from the Command Line Interface (cli)

About SOAtest's Source Control Support

SOAtest currently supports the following source control systems:

• AccuRev 4.6
• ClearCase 2003.06.00
• CM Synergy 6.4
• CVS
• Perforce 2006.2
• Serena Dimensions 9.x and 10.x
• StarTeam 2005 and 2008
• Subversion (SVN) 1.2.x, 1.3.x, or 1.4.x
• Visual SourceSafe 6.0, 2005


Subclipse Support Notes

• Each Subclipse plugin version is compatible only with specific Subversion versions. Ensure that your Subclipse plugin is compatible with a version of Subversion that SOAtest supports. For example, you should NOT pair Subversion 1.3 with Subclipse plugin 1.2, which uses Subversion 1.4.
• Due to changes introduced in Subversion 1.4, Subversion clients earlier than 1.4 cannot work with working copies produced by Subversion 1.4. If you are using Subclipse plugin 1.2 (which includes Subversion 1.4), you might receive the following error message:

  svn: This client is too old to work with working copy '.'; please get a newer Subversion client

  This means that SOAtest is using a command-line client that is version 1.3 or older. The solution is to update your command-line SVN client to version 1.4. The client version can be verified by executing svn --version.

If your team is using one of these source control systems and performs any necessary configurations (as described later in this topic), SOAtest can:

• Use file revision history data to automatically assign test failures or policy violations to the responsible team members. See "Configuring Task Assignment", page 215 for details.
• Update projects from source control before testing. See "Defining Common Options that Affect Multiple Analysis Types (Common Tab)", page 249 for details (the related setting is Source Control> Update projects).
• Automate the peer review process for the various quality artifacts involved in delivering secure, reliable, compliant SOA. See "Code Review", page 619 for details.

Why do dialogs open when I try to modify files?

Some source control systems (including ClearCase, Perforce, Synergy, and Visual SourceSafe) require users to mark (lock) sources before editing them. If you are using one of these source control systems and you prompt SOAtest to perform an operation that involves editing a "read-only" file in source control, SOAtest will first open a dialog asking you whether you want to make the file writeable and lock it. Click OK, then provide your source control username and password in the next dialog that opens; this allows SOAtest to access the source control system and set the lock.

Enabling Source Control Support

To enable support for any of the supported source control systems:

1. Make sure that the command line client for the given source control system is on the system %PATH%/$PATH and is available when SOAtest is launched.
   • For example, if you have Subversion, it is not sufficient (or even required) to install the Subclipse plugin for Eclipse (the SVN Eclipse plugin). Instead, you should have the plain command line svn.exe Subversion client.
2. Choose SOAtest> Preferences. The Preferences dialog will open.
3. Select SOAtest> Scope and Authorship in the Preferences dialog.
4. Check Use source control to compute scope.


5. Select SOAtest> Source Control in the Preferences dialog.
6. Enable the check box for the source control system you want to use.
7. If the source control executable is not already on your system path, specify the path to it in the text field to the right of the source control system's name.
8. Specify the source control properties required for the selected type of source control system by clicking New in the Defined Source Controls table, completing the Create Source Control Description dialog's fields as appropriate for your system, then clicking OK.
   • The fields in the Create Source Control Description dialog are described below.
9. Click OK to close the Source Control Description dialog.
10. Click Apply in the Preferences dialog.
11. Click OK to save your settings.

To test the integration:

1. In the SOAtest environment, open a project that is checked out from the repository.
2. Open a file in the editor.
3. Right-click the source code, and choose SOAtest> Show Author at Line. If the correct author is shown, the integration was successful.

Debugging Tip

To troubleshoot problems with source control integration, start SOAtest with the following arguments:

-consolelog -J-Dcom.parasoft.xtest.logging.config.jar.file=/com/parasoft/xtest/logging/log4j/config/logging.on.xml

This results in detailed log information being printed to the console. To include messages from the source control system that may contain fragments of user source code, use an additional flag: -Dscontrol.log=true

AccuRev Configuration

When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

• Server: Enter the hostname of the server where AccuRev is running (required).
• Port: Enter the port of the server where AccuRev is running (required).
• Username: Enter the AccuRev username/login (required).
• Password: Enter the AccuRev password (if needed).

ClearCase Configuration

To use ClearCase with SOAtest:

• Check whether a file is controlled by ClearCase by calling the cleartool describe -fmt %Vn <file_path> command. No output means that the file is not controlled by ClearCase.
• Ensure that the VOB root directory contains a lost+found directory.
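The first check above can be run directly from a terminal. In this sketch the file path is purely illustrative, and the guard for a missing cleartool executable is an addition for machines where ClearCase is not installed:

```shell
# Ask ClearCase for the version number of a file (path is illustrative).
# Empty output means the file is NOT controlled by ClearCase.
if command -v cleartool >/dev/null 2>&1; then
  cleartool describe -fmt '%Vn' /vobs/myvob/src/config.xml
else
  echo "cleartool not found on PATH"
fi
```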

When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

• VOB location: Enter the dynamic or snapshot VOB access path. Use the full VOB path (e.g., /vobs/myvob for a Linux dynamic view, or M:\\my_dynamic_view\myvob for a Windows VOB path). Note that when you enter a VOB location, the Vob tag field will automatically display the vob tag. If the location is not a proper vob path, a warning message is displayed.

CM Synergy Configuration

When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

• Database path: Enter the absolute Synergy database path.
• Engine host: Enter the Synergy server's machine name or IP address.
• User: Enter the user name under which you want to connect to the repository.
• Password: Enter the password for the above user name.
• Use remote client (UNIX systems only): Enable this option if you want to start CCM as a remote client session.
• Local database (remote client): Enter the path to the location where the database information is copied when you are running a remote client session.

CVS Configuration

To use CVS with SOAtest, ensure that the .cvspass file is in one of the following locations:

• The directory indicated by the user.home system property
• The directory indicated by the HOME environment variable
• (Windows) The combination of HOMEDRIVE and HOMEPATH (example: "C:" + "\home")
• The current working directory

When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

General tab

• Connection type: Enter the authentication protocol of the CVS server.
• User: Enter the user name under which you want to connect to the repository.
• Password: Enter the password for the above user name.
• Repository path: Enter the path to the repository on the server.
• Server: Enter the CVS server's machine name or IP address.
• Port: Enter the CVS server's port.

Custom SSH/CVS_RSH tab

• CVS_SERVER value: If connecting to a CVS server in EXT mode, this specifies which CVS application to start on the server side.
• Use custom authentication properties for ext/server method: Enable this option if you want to use custom authentication for the ext/server method.
• Remote shell login: Enter your SSH login.
• Remote shell password: Enter the password for the above SSH login.
• Private key file: Enter the private key file.




• Passphrase for private key file: Enter the passphrase for the above private key file.
• Use command-line program to establish connection: Enables you to run an external program to establish an EXT connection. Use this option only for non-standard and legacy protocol connections (telnet, rsh). Linux/Unix/Cygwin ssh prompts for passwords/passphrases/security word sequences are not currently supported.
• CVS_RSH path: Specifies the full path to the executable used to establish EXT connections.
• CVS_RSH parameters: Specifies the parameters for the executable. The following macro-definitions (case sensitive) can be used to expand values into command line parameters:
  • {host} - host parameter
  • {port} - port parameter
  • {user} - user parameter from primary page
  • {password} - user password from primary page
  • {extuser} - user parameter from EXT/CVS_RSH page
  • {extpassword} - password parameter from EXT/CVS_RSH page
  • {keyfile} - path to key file
  • {passphrase} - passphrase for the key file
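As an illustration, a team tunneling EXT connections through PuTTY's plink.exe might complete the two fields as follows. The plink path and flags are an assumption about one particular setup, not a recommendation; only the {extuser}, {extpassword}, and {host} macros come from the list above:

```
CVS_RSH path:       C:\Program Files\PuTTY\plink.exe
CVS_RSH parameters: -l {extuser} -pw {extpassword} {host}
```

At connection time, the macros are expanded with the values entered on the configuration pages before the executable is invoked.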

Perforce Configuration

When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

• Server: Enter the Perforce server's machine name or IP address.
• Port: Enter the Perforce server's port.
• User: Enter the user name under which you want to connect to the repository.
• Password: Enter the password for the above user name.
• Client: Enter the client workspace name, as specified in the P4CLIENT environment variable or its equivalent.

Serena Dimensions Configuration

Linux and Solaris Configuration Note

To use Serena Dimensions with SOAtest, Linux and Solaris users should run SOAtest in an environment prepared for using Serena programs such as 'dmcli':

• LD_LIBRARY_PATH should contain the path to <SERENA Install Dir>/libs.
• DM_HOME should be specified.

Since many Solaris users commonly set the required Serena variables by running the Serena dmgvars.sh file, it is also necessary to modify the LD_LIBRARY_PATH variable. To use Serena Dimensions with SOAtest, LD_LIBRARY_PATH needs to include the following items (paths can differ on client machines):

• SSL/Crypto library - /usr/local/ssl/lib
• STDC++ library - /usr/local/lib

When you are enabling source control support, specify the following repository properties:

• Host: Enter the Serena Dimensions server host name.
• Database name: Enter the name of the database for the product you are working with.
• Database connection: Enter the connection string for that database.
• Login: Enter the login name.
• Password: Enter the password.
• Mapping: Enter an expression that maps workspace resources to Serena Dimensions repository paths.
  • Example 1: If you use scontrol.rep.serena.mapping_1=${project_loc\:MyProject};PRODUCT1\:WORKSET1;src\\MyProject, then the project 'MyProject' will be mapped to the Serena workset PRODUCT1:WORKSET1 with the workset-relative path src\\MyProject.
  • Example 2: If you use scontrol.rep.serena.mapping_2=${workspace_loc};PRODUCT1\:WORKSET1, then the complete workspace will be mapped to the Serena workset PRODUCT1:WORKSET1.

StarTeam Configuration

To use StarTeam with SOAtest, ensure that you have the Borland StarTeam SDK installed. This can be downloaded for free from the Borland web site.

When you are enabling source control support, specify the following repository properties:

• Server: Enter the StarTeam server's machine name or IP address.
• Port: Enter the StarTeam server's port.
• User: Enter the user name under which you want to connect to the repository.
• Password: Enter the password for the above user name.


Subversion Configuration

SOAtest's Subversion support is based on the command line client 'svn'. To use Subversion with SOAtest, ensure that:

• The Subversion 1.2.x, 1.3.x, or 1.4.x client is installed.
• The client certificate is stored in the Subversion configuration area. The Subversion client has a built-in system for caching authentication credentials on disk. By default, whenever the command-line client successfully authenticates itself to a server, it saves the credentials in the user's private runtime configuration area: ~/.subversion/auth/ on Unix-like systems or %APPDATA%/Subversion/auth/ on Windows.

When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

• URL: Enter the URL for the SVN server. The URL should specify the protocol, server name, port, and starting repository path (for example, svn://buildmachine.foobar.com/home/svn).
• User: Enter the user name under which you want to connect to the repository.
• Password: Enter the password (not encoded) for the above user name.

Visual SourceSafe Configuration

When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

• VSS Database Path: Enter the database path (the location of SRCSAFE.INI).
• User: Enter the user name under which you want to connect to the repository.
• Password: Enter the password for the above user name.
• Project root in repository: Enter the project root. This begins with $/; for example, $/nightly_test.

Specifying Source Control Definitions from the Command Line Interface (cli)

Source control definitions can be specified from the command line, using local settings files (described in "Local Settings (Options) Files", page 266). The fastest and easiest way to add source control settings to a local settings file is to export them directly into a new or existing file. To do this:

1. On a SOAtest installation with your source control repositories defined, choose SOAtest> Preferences. The Preferences dialog will open.
2. Select SOAtest> Source Control in the Preferences dialog.
3. Click the Export to local settings file button, then specify the file where you want the settings saved.
   • If you select an existing file, the source control settings will be appended to that file. Otherwise, a new file will be created.
   • Exported passwords will be encrypted.
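For reference, an exported local settings file contains simple key=value lines. The fragment below is only a hypothetical sketch of what a Subversion definition might look like; the scontrol.* key names shown are assumptions, so always generate the real file with the Export to local settings file button rather than writing it by hand:

```
# Hypothetical Subversion repository definition in a local settings file.
# Key names are illustrative; export from the GUI for the exact syntax.
scontrol.rep1.type=svn
scontrol.rep1.svn.url=svn://buildmachine.foobar.com/home/svn
scontrol.rep1.svn.login=builder
# Exported passwords are encrypted; this placeholder stands in for that value.
scontrol.rep1.svn.password=ENCRYPTED_VALUE
```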


Connecting All SOAtest Installations to Team Server

This topic describes how to connect all team SOAtest installations to Parasoft Team Server, which supports centralized administration and application of test practices. Sections include:

• About Team Server
• Prerequisites
• Connecting SOAtest to Team Server
• Extending the Team Server Timeout Period
• Exporting Team Data

About Team Server

Parasoft Team Server is the software that manages the team-wide distribution and sharing of Test Configurations, rules, rule mappings, suppressions, skipped resources, code review tasks, and test results. All team SOAtest machines should be connected to Team Server to enable centralized administration and application of test practices. The Team Server module ensures that all team members have access to the appropriate team Test Configurations, suppressions, rule files, and test case files.

Team Server is available and licensed separately. This version of SOAtest works with Team Server 2.0 and higher, which is distributed as part of Parasoft Server Tools.

After Team Server is installed and deployed as a Web service, the team architect or manager can configure the appropriate team settings and files on one SOAtest installation, then tell Team Server where to access the settings and related test files. Team members can then point their machines to the Team Server location, and Team Server will ensure that all team machines access the appropriate settings and files. When the master version of a file is modified, added, or removed, Team Server makes the appropriate updates on all of the team's SOAtest installations.

Before you can use Team Server to share SOAtest files, Team Server must be installed and deployed on one of your team's machines. For information on obtaining, installing, and deploying Team Server, refer to the Team Server documentation or contact your Parasoft representative.

Prerequisites

Before you proceed with the team deployment, ensure that Team Server is successfully installed and deployed on one of your organization's machines. If you need information on obtaining, installing, or deploying Team Server, contact your Parasoft representative.

Connecting SOAtest to Team Server

After Team Server is installed and deployed, you need to connect all team machines to that Team Server. If a SOAtest installation is not connected to Team Server, Team Server will not provide file/configuration/task sharing and management for that installation. To connect the team's SOAtest installations to Team Server, perform the following procedure on every SOAtest installation used by the team:

1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the SOAtest> Team category in the left pane.
3. Enable the Enable Team Server option in the SOAtest Team preferences page.
4. If the appropriate Team Server is not already set, select it from the Autodetected servers list and click Set. Or, manually enter your Team Server's host (either a name or an IP address) in the Host name field, then enter its port in the Port number field.
5. If you want to minimize the number of operations on Team Server by reusing cached data, check Enable cache mode.
   • This can improve performance, but there is a small risk that outdated rules or Test Configurations could be distributed (if a file was updated since the caching, which is set to occur every 8 hours by default). If a file has been updated since the caching, users can force a refresh by clicking Refresh.
6. If your team requires users to log in to Team Server, check Enable account login and then enter your Team Server username and password in the appropriate fields. Depending on how your Team Server was configured, each team member might have a unique Team Server username and password, or all developers might share a single "generic" account.
7. Click Test Connection to verify the connection to Team Server.
8. Click Apply to apply your settings.
9. Click OK to set and save your settings.

Extending the Team Server Timeout Period

By default, SOAtest waits 60 seconds for a response from Team Server; if a response is not received within this time, it times out. If you want SOAtest to wait longer for a response from Team Server before timing out, you can extend the timeout as follows:

• For the standalone: Start the tool using the argument -J-Dparasoft.tcm.timeout=[timeout_in_seconds]
• For the plugin: Start the tool using the argument -vmargs -Dparasoft.tcm.timeout=[timeout_in_seconds]
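For example, to raise the timeout to 120 seconds (the launcher names below are illustrative; use whatever command normally starts your SOAtest standalone or your Eclipse with the SOAtest plugin):

```
soatest -J-Dparasoft.tcm.timeout=120
eclipse -vmargs -Dparasoft.tcm.timeout=120
```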

Exporting Team Data

You may occasionally want to export team data. You can copy:

• All data from one Team Server account to another (with or without transforming the paths to use a new location).
• Suppressions and resource data from one location to another within the same Team Server account.

To export team data:

1. Open the Team Server page in the Preferences panel.
2. Click Export Team Data.
3. Use the available controls to specify what data you want exported, where you want it exported, and whether you want paths to be transformed during export.

Exporting Team Server data may be especially useful when...

• Renaming IDE projects: To ensure that resource data settings and suppressions are still available after renaming a project, you can use the export wizard to copy data with path relocation.
• Creating a new version of the general project connected to a new Team Server user: When a new version of a project is created in source control (a branch), it is recommended that you also create a new Team Server user to control configurations, rules, suppressions, and other data for the given project version. Initially, the new area on Team Server should be filled from the current project. After creating the Team Server user, you can use the wizard to copy all data from the current user to the new one. This configures Team Server to support two separate areas for two versions of the product. From this point forward, any changes in configurations, rules, or suppressions in one version will not affect settings in the other version.




• Modifying the project/solution layout: For example, assume that your team decides to store artifacts in separate folders: you have all artifacts in /My Project/src/... but want to have them in /My Project/... To make this move without losing the data on Team Server, you can copy data from /My Project/src to /My Project.


Connecting SOAtest Server to Report Center

This topic describes how to connect the SOAtest Server Edition installation to Parasoft Report Center, which provides dashboards, reports, and metrics that help team members evaluate the project's overall quality and readiness. Sections include:

• About Report Center
• Prerequisites
• Configuring SOAtest to Send Results to Report Center
• Configuring Report Center Attributes
• Accessing Report Center Reports

About Report Center

Parasoft Report Center is a decision support system that provides teams the ongoing visibility and measurement of the software development process necessary to help keep software projects on track. Collecting and consolidating metrics generated during the SDLC, Report Center turns these data points into meaningful statistics and dashboards that give development managers and team members the ability to continuously and objectively assess the quality and readiness of the project, the status of the coding process, and the effectiveness of the team.

With Report Center, teams can more readily identify, respond to, and manage risks that threaten project schedules and quality. Report Center provides the metrics by which management can more effectively assess and direct resources; set and monitor development targets; communicate, guide, and measure conformance to development policies; and ensure successful project outcomes.

Once SOAtest is configured to send information to Report Center, developers, testers, architects, and managers can use the Report Center dashboard to access role-based reports on quality, progress, and productivity.

Prerequisites

If your team is using Report Center, at least one SOAtest installation (the SOAtest Server Edition) should be connected to Report Center. Before you connect the SOAtest Server Edition to Report Center, ensure that Report Center is successfully installed and deployed on one of your organization's machines. If you need information on obtaining, installing, or deploying Report Center, contact your Parasoft representative.

Configuring SOAtest to Send Results to Report Center

Before a SOAtest installation can send results to Report Center, you must connect it to Report Center. Most teams connect only their SOAtest Server installation to Report Center because they do not want their Report Center reports and statistics to contain data from tests performed on developer desktops. However, SOAtest Professional and/or SOAtest Architect installations can also be configured to send data to Report Center. To connect a machine to Report Center:


1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the SOAtest> Report Center / Project Center category in the left pane.
3. (Optional) If your team wants to run tests on different code/IDE projects that are components of a larger "global project," share quality and code review tasks across these project components, import results for them all, and report everything to Concerto Report Center, specify the name of the general project you want the data collected under in the General Project field. You can enter a new project name, or click Find to locate an existing one (one already defined in Concerto).
   • For example, you might have a general project for "MyProduct" that includes projects such as product_engine, product_gui, and product_utils. Results for all of these projects will then be sent to Concerto and reported under "MyProduct."
4. Check the Send results to Report Center and Project Center option.
5. (Optional) If you want Report Center to log results from this SOAtest installation as the results from the nightly tests, check the Log as nightly option.
   • If you enable this option, results will be logged at the group level in Report Center.
6. (Optional) If you want this machine to send only a brief summary report to Report Center, check the Send summary only option.
   • This is recommended for SOAtest Professional or Architect installations that are configured to send data to Report Center.
   • If you enable this option, only results summaries from static analysis and test execution (not individual violations) will be sent to Report Center.
7. Enter your team's Report Center Server host (either a name or an IP address) in the Server host name field.
8. Enter your team's Report Center Data Collector port in the Data Collector port field.
9. Click Test Connection to verify this connection.
10. Enter your team's Report Center Report Server port in the Report Server port field.
11. Click Test Connection to verify this connection.
12. (Optional) Add Report Center attributes as described in Configuring Report Center Attributes below.
13. Click Apply.
14. Click OK to set and save your settings.

Configuring Report Center Attributes

Report Center attributes help you mark results in ways that are meaningful to your organization. They also determine how results are grouped in Report Center and how you can filter results in Report Center. By default, the following information is sent to Report Center:

• user
• machine
• date
• product
• product version
• project name

Additional Report Center attributes can be specified in two ways:

• Through the SOAtest GUI.
• Through a SOAtest command-line interface local settings file.

To set general attributes (attributes that apply to all tests) through the GUI:

1. Do one of the following:
   • If you want to set attributes that will be applied to all tests run by this SOAtest installation, choose SOAtest> Preferences to open the Preferences dialog, then select the SOAtest> Report Center / Project Center category in the left pane.
   • If you want to set attributes that will be applied only to tests of a specific project or file, right-click the project that you want to set Report Center attributes for, choose Properties from the shortcut menu, then select SOAtest> Report Center Attributes in the left pane.
2. Use the available controls in the Attributes section to add or import Report Center attribute settings. Each attribute contains two components: a general attribute category name and a specific identification value.
   • Example 1: If your group wants to label results by project names that are more specific than those used in the SOAtest project, you might use the attribute name PROJECT_NAME and the attribute value projname1. For the next project, you could specify an attribute with the attribute name PROJECT_NAME and the attribute value projname2.
   • Example 2: If your organization wants to label results by division, you might use the attribute name DIVISION and the attribute value division1 for all tests performed by your division. Another division could specify an attribute with the attribute name DIVISION and the attribute value division2.
   • Example 3: If your group wants to label results by project versions, you might use the attribute name VERSION and the attribute value 1.1. For the next project, you could specify an attribute with the attribute name VERSION and the attribute value 1.2.

For details on setting attributes through a SOAtest command-line interface local settings file, see "Local Settings (Options) Files", page 266.
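Combining the three examples, one installation's attribute set might look like the following name/value pairs. The values are taken from the examples above; the exact storage syntax in a local settings file is covered on the referenced page:

```
PROJECT_NAME = projname1
DIVISION     = division1
VERSION      = 1.1
```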

Accessing Report Center Reports

To access Report Center reports based on information from SOAtest tests and other sources, choose SOAtest> Explore> Report Center Reports, or open the reports as described in your Report Center User's Guide.


Connecting All SOAtest Installations to Parasoft Project Center

This topic explains how to work with Parasoft Project Center from the SOAtest environment. You can have Mylyn drive the process of connecting to Project Center, identifying tasks, and activating/deactivating tasks, or you can perform these actions manually (via the Task Assistant) if Mylyn is not installed (or if you prefer not to use it). Sections include:

• Working with Project Center through Mylyn
• Working with Project Center without Using Mylyn
• Configuring Task Assistant Preferences

Working with Project Center through Mylyn

This section covers:

• Prerequisites
• Understanding the Interface
• Defining a Task Repository
• Defining a Task Query
• Working with Project Center Tasks

Prerequisites

Before you start interacting with Parasoft Project Center in Mylyn-driven mode, you must have Mylyn installed in your Eclipse IDE. For SOAtest standalone, Mylyn is included. For the SOAtest Eclipse plugin, you need Mylyn 3.0 or higher and Eclipse 3.3 or 3.4. If you already have Eclipse 3.3 or 3.4 but not Mylyn, you can obtain Mylyn at:

• For Eclipse 3.3: http://download.eclipse.org/tools/mylyn/update/e3.3
• For Eclipse 3.4: http://download.eclipse.org/tools/mylyn/update/e3.4

Understanding the Interface

There are three views for interacting with Project Center in Mylyn-driven mode:

• Mylyn> Task Repositories: Enables you to define Mylyn's task repositories.
• Mylyn> Task List: Main working area for task management. Enables you to define task queries, open tasks, edit tasks, and close tasks.
• SOAtest> Task Assistant: Tracks data related to the Project Center task you are working on.

To activate each view, use the Window> Show View> Other option. Mylyn views are available in the Mylyn category, while the Task Assistant view is available in the SOAtest category.


Defining a Task Repository

Before you can start working on tasks, a task repository that points to the Project Center server needs to be defined. A task repository can be defined as follows:

1. Open the Mylyn> Task Repositories view.
2. Right-click in that view and choose Add Task Repository (or click the Add Task Repository view toolbar button).
3. In the Add Task Repository dialog, choose Project Center, then click Next.
4. Complete the Project Center Repository form as follows:
   • Server: The URL of your Project Center server; for example, http://concerto.example.com:8080
   • Label: Any string.
   • User ID: Your Project Center login.
   • Password: Your Project Center password.
5. Click Finish. After you add the Project Center task repository, it should be displayed in the Task Repositories list under the label you selected.

Defining a Task Query

Before you can start working on tasks, a task query for each developer needs to be created. Once the query is defined, all tasks that are assigned to the developer and that have a status of "open" or "in progress" will be imported from Project Center into Eclipse. To define a task query:

1. Open the Mylyn> Task List view.
2. Choose New> Query... . The New Repository Query dialog is displayed.
3. Select the Project Center repository. The Edit Repository Query form is displayed; it contains the same fields as the Task Search page in Project Center.
4. Fill in one or more fields to query for Project Center tasks that meet your criteria. For example, if you would like to view your open tasks, select Open in the Status field, and then enter your name in the Name field.
5. Click Finish. The query is displayed in the Task List view with tasks matching the specified search criteria.

Note: If the query does not appear in the view, clear Focus on Work Week on the Task List toolbar (or via the right-click menu). It could be that no tasks in the newly-defined query are scheduled for the current work week.

Eclipse connects to Project Center to import tasks for the specified developer. You can refresh to view the updated list. Tasks displayed are sorted based on the priority defined in Project Center.

Working with Project Center Tasks

Viewing Tasks

Project Center tasks are prioritized based on either Priority or Planned Start Date. Tasks assigned to you are sorted by work priority. Set your view based on how you prioritize your tasks:
• Categorized view: If your tasks are prioritized based on Priority, stay in this view.
• Scheduled view: If your tasks are prioritized based on Planned Start Date, switch to this view.

Activating Tasks

To work on a task:

1. Double-click the appropriate task. The Edit Task page is displayed in Mylyn.
2. Activate the task using Mylyn. In the Edit Task page, change the task status to "in progress" and apply the change by clicking the Submit button.
3. (Optional) Open the task in Project Center to verify that the status changed to In progress.

Tracking Task Implementation with the Task Assistant

From this point forward, you can work on your task development in Eclipse/SOAtest. As you do so, the Task Assistant monitors any changes made to the project files. You can see the modified task files and tests (if test collection is enabled) in the Task Assistant view.

This view lists the resources related to the current task and provides information about revisions stored in source control.
• Use the Expand All and Collapse All toolbar buttons to quickly display and hide details.
• Use the right-click menu commands to:
   • Open an editor for the selected resource.
   • Expand part of the tree.
   • Indicate that a resource is not related to the current task.

Updating Working Time Estimates

When you start Eclipse each day, the Tasks Management dialog is displayed. If needed, update the number of days that you need to finish the active tasks based on your own estimate, then click OK. The Estimated Remaining Working Time field for your task will be updated accordingly, and managers will be able to review the number of days left until the task is finished.

Closing Tasks

To close a task:

1. In the Task Editor, add a comment in the New Comment field.
2. If the task was successfully accomplished, select Success (under Actions), then click Submit. Alternatively, you can cancel the task (Canceled option) or un-assign it (Open option).
3. Deactivate the task in Mylyn. The Deactivate wizard will open and list the files that you changed while working on this task.
4. If you want to see a text summary report of the file modifications related to this task (so you can copy/paste it), enable Show me a dialog with reported revisions in text format.
5. Click the OK button to send information about the modified files to Project Center.

Configuring Task Assistant Preferences

For details on configuring Task Assistant preferences, see “Configuring Task Assistant Preferences”, page 213.

Working with Project Center without Using Mylyn

If you do not have Mylyn installed (or you do not want to use it), you can work with Project Center tasks exclusively through the Task Assistant. This section covers:
• Connecting to the Project Center Server
• Opening the Task Assistant
• Identifying Tasks
• Working with Tasks

Connecting to the Project Center Server

Before you can start working on tasks, the location of the Project Center server needs to be defined as described in “Connecting SOAtest Server to Report Center”, page 203.

Opening the Task Assistant


To open the Task Assistant, choose Window> Show View> Other, then choose SOAtest> Task Assistant.

Identifying Tasks

To identify tasks, search for them using the Project Center web interface; you can click the My Tasks toolbar button in the Task Assistant to open this interface. The interface shows your assigned tasks and lets you search the entire repository of tasks.

When you identify the task you are going to work on, note the task ID. You will need to enter it in the Task Assistant view.

Working with Tasks

You can use the Task Assistant to activate tasks, track the files related to those tasks, and close tasks.

Activating Tasks

To activate a task:

1. Ensure that the project related to this task is available in the IDE.
2. In the Task Assistant, enter the task ID and press Enter. The program will connect to Project Center and retrieve data about this task. The task name will be displayed in the Task Assistant.
3. Click the Activate button in the Task Assistant toolbar (the button on the far right). The Activate wizard will open and allow you to easily switch the task to the "in progress" state, as well as enter the estimated time for this task.


Reviewing or Modifying Task Details

To review or modify the details of an active task, click the Edit Task toolbar button.

Closing Tasks

To close a task that you have finished working on:

1. Click the Task Assistant’s Deactivate toolbar button. The Deactivate wizard will open and list the files that you changed while working on this task.
2. In the Deactivate wizard, describe your changes in the New Comment field.
3. If the task was successfully accomplished, select Success. Alternatively, you can cancel the task (Canceled option) or un-assign it (Open option).
4. If you want 1) information about the modified files to be sent to Project Center and 2) all the listed file modifications to be associated with the task, enable Notify Project Center about the following modifications.
5. If you want to see a text summary report of the file modifications related to this task (so you can copy/paste it), enable Show me a dialog with reported revisions in text format.

Configuring Task Assistant Preferences


You can configure Task Assistant preferences in the SOAtest preference panel’s Report Center/Project Center> Task Assistant page (to open the preference panel, choose SOAtest> Preferences).

In the General tab, you can configure the following options:
• Source control support: Determines the Task Assistant’s level of interaction with the specified source control system.
   • Recommended: Allows the Task Assistant to collect information about changed files and the data required for Project Center reporting. With this option, source control is used to validate whether a file was modified, what its current version is, whether a change was reverted, and so on. If your source control connection is slow, this option may also work slowly; in that case, we recommend the Minimal option.
   • Minimal: Allows the Task Assistant to collect only the bare minimum of information required for Project Center reporting (e.g., only a few calls to get data regarding file revisions).
   • None: Does not allow the Task Assistant to interact with source control. Project Center reporting is not meaningful in this mode, and is thus disabled.
• Open related defect or enhancement in browser when changing task status: Determines whether the details for the defect/enhancement you are working on are shown when you change the task status.
• Test collecting: Determines whether the tests related to a task are tracked and shown in the Task Assistant.
• Automatically mark tests by task identifier: Determines whether new and modified tests are automatically marked by task ID.

In the Filters tab, you can specify patterns that match the types of files and tests you do NOT want the Task Assistant to track. For instance, you might not want to track modifications to binary files such as files in the bin directory, class files, or jar files.


Configuring Task Assignment

This topic explains how to configure SOAtest to assign quality tasks to the various team members building and testing your SOA. Sections include:
• About SOAtest’s Task Assignment
• Understanding How SOAtest Assigns Tasks
• Specifying How Tasks are Assigned
• Directly Specifying Task Owners for Specific Resources
• Specifying Team Members’ Email Addresses
• Handling Authorship When Using Multiple Source Control Systems

About SOAtest’s Task Assignment

There are typically many different team members working on an SOA throughout the SDLC: from developers of Web services, to developers of the Web interface, to QA testers verifying end-to-end business processes. Using technologies ranging from peer review workflow automation, to static analysis, to SOA policy enforcement, to functional testing of individual SOA components as well as end-to-end test scenarios, SOAtest generates quality tasks for the various team members to perform. For example:
• If the organization has a policy that the Web interface must comply with WCAG 2.0 accessibility guidelines, any Web developer who contributes non-compliant code to the project would receive a static analysis task to fix the non-compliant code.
• If the team has a policy that certain critical artifacts must be peer reviewed, the assigned reviewer will automatically be notified with a review task whenever such an artifact is added or changed.
• If a tester added a functional test to the test suite, and that test later fails as the system evolves, that tester will be assigned a "test review" task to determine whether the test needs to be updated to remain in sync with intentional application changes, or whether the failure indicates a problem with the application functionality.

Understanding How SOAtest Assigns Tasks

SOAtest can assign tasks based on source control data in a number of ways:
• If you are going to be statically analyzing source files or executing functional tests whose test files are stored in your source control system, you can set up SOAtest to use data from your source control system in order to assign tasks. Static analysis tasks are assigned to the person who introduced the violation. Test failures are assigned to the person who last worked on the related test.
• If you are using SOAtest for functional testing, you can directly specify task owners for specific test suites or other resources.
• If you will be using SOAtest to automate peer review, review task assignment is defined through the Code Review interface (described in the Code Review section).


Tip

If you have added source files to a project in the SOAtest environment, you can see the team member assigned to a particular line of a source file, as well as information about when it was last modified:

1. Open an editor for the appropriate file.
2. Right-click the line whose author you want to view, then choose SOAtest> Show Author at Line from the shortcut menu.

Note that if you are not using source control data to calculate task ownership, the message will show modification information for the file (rather than for the specific line selected).

Specifying How Tasks are Assigned

Task assignment settings can be specified:
• Through the Scope and Authorship preferences page in the SOAtest GUI.
• Through a command-line interface local settings file.

To use GUI controls to change task assignment settings:

1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the SOAtest> Scope and Authorship category in the left pane.
3. Use the available controls to indicate how you want SOAtest to assign tasks:
   • Use source control (modification author) to compute scope: Source control data will be used to assign tasks related to the source files you are analyzing with SOAtest.
   • Use file system (xml map) to compute scope: You directly specify how you want tasks assigned for particular files or sets of files (for example, you want tester1 to be responsible for one set of .tst files, tester2 to be responsible for another set, and so on). See “Directly Specifying Task Owners for Specific Resources”, page 216 for details.
   • Use file system (current user) to compute scope: The local user name will be used to compute authorship.
4. Click OK to set and save your settings.

For details on setting attributes through a command-line interface local settings file, see “Local Settings (Options) Files”, page 266.

Directly Specifying Task Owners for Specific Resources

To directly specify how you want tasks for particular files or sets of files (for example, .tst files) assigned:

1. Indicate that you will be entering mappings directly as follows:
   a. Choose SOAtest> Preferences to open the Preferences dialog.
   b. Select the Scope and Authorship category in the left pane.
   c. Select Use file system (xml map) to compute scope.

2. Enter file-to-author mapping details as follows:
   a. Select the Scope and Authorship> Authorship Mapping category in the Preferences dialog.
   b. Specify your mappings in the authorship mapping table. Note that wildcards are allowed; for example:
      • ?oo/tests/SomeTest.tst - assigns all files whose name starts with any single character (except /) and ends with "oo/tests/SomeTest.tst"
      • **.tst - assigns all *.tst files in any directory
      • **/tests/** - assigns every file whose path has a folder named "tests"
      • tests/** - assigns all files located in the directory "tests"
      • tests/**/Test* - assigns all files in the directory "tests" whose name starts with "Test" (e.g., "tests/some/other/dir/TestFile.tst")

Mapping Order Matters

Place the most general paths at the end of the mapping. For example, the /ATM/** path is last here because it is the most general:

/ATM/unittests/**   user1
/ATM/other/**       user2
/ATM/**             user3

This assigns unittests files to user1, other files to user2, and all other ATM project files to user3. If you reversed the order, all ATM files would be assigned to user3.

   c. Click Save Changes.

3. Click Export to export the mappings as an XML file, then have team members import the mapping file.
   • If you are already sharing preferences across the team, this step is not necessary.

Alternatively, you can direct SOAtest to use the exported XML file by either:
• Specifying the path to that file in the Preferences dialog’s Scope and Authorship> Authorship Mapping category (using the Shared File option).
• Specifying the path to that file from the command line, using soatestcli -mapping (unknown). Note that this will override any authorship settings specified in the GUI. For details on setting attributes through a SOAtest command-line interface local settings file, see “Testing from the Command Line Interface (soatestcli)”, page 257.

A sample XML authorship mapping file follows:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE authorship (View Source for full doctype...)>
<authorship>
  <!-- assigns all files named "foo/tests/SomeTest.tst" to "tester1" -->
  <file author="tester1" path="foo/tests/SomeTest.tst" />
  <!-- assigns all files whose name starts with any single character (except /) and ends with "oo/tests/SomeTest.tst" to "tester2" -->
  <file author="tester2" path="?oo/tests/SomeTest.tst" />
  <!-- assigns all *.tst files in any directory to "tester3" -->
  <file author="tester3" path="**.tst" />
  <!-- assigns every file whose path has a folder named "tests" to "tester4" -->
  <file author="tester4" path="**/tests/**" />
  <!-- assigns all files located in directory "tests" to "tester5" -->
  <file author="tester5" path="tests/**" />
  <!-- assigns all files in directory "tests" whose name starts with "Test" (e.g., "tests/some/other/dir/TestFile.tst") to "tester6" -->
  <file author="tester6" path="tests/**/Test*" />
</authorship>
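The wildcard semantics and first-match ordering described above can be sketched in Python. This is an illustrative model only, not SOAtest's actual implementation; the exact matching rules used by the product may differ:

```python
import re

def pattern_to_regex(pattern):
    """Translate a mapping pattern to a regex: '**' crosses directory
    separators, while '*' and '?' match within a single path segment."""
    out, i = [], 0
    while i < len(pattern):
        if pattern.startswith("**", i):
            out.append(".*")
            i += 2
        elif pattern[i] == "*":
            out.append("[^/]*")
            i += 1
        elif pattern[i] == "?":
            out.append("[^/]")
            i += 1
        else:
            out.append(re.escape(pattern[i]))
            i += 1
    return re.compile("".join(out))

def owner(path, mappings):
    """Return the author for 'path' using first-match-wins ordering,
    which is why the most general patterns must be listed last."""
    for pattern, author in mappings:
        if pattern_to_regex(pattern).fullmatch(path):
            return author
    return None

mappings = [
    ("/ATM/unittests/**", "user1"),
    ("/ATM/other/**", "user2"),
    ("/ATM/**", "user3"),  # most general pattern last
]
print(owner("/ATM/unittests/Foo.tst", mappings))  # user1
print(owner("/ATM/src/Bar.tst", mappings))        # user3
```

Note that if the /ATM/** entry were moved to the top of the list, it would shadow the two more specific entries, which is exactly the ordering pitfall the tip above warns about.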

Specifying Team Members’ Email Addresses

By default, SOAtest assumes that each username value it detects is the team member’s username, and that the related team member’s email is [username]@[mail_domain]. However, in some cases, you might want to map the detected username to a different username and/or email address. For example:
• If you want to reassign all of one team member’s tasks to another team member (for instance, if User1 has left the group and you want User2 to take care of all the tasks that would be assigned to User1). We will refer to this type of mapping as an "author to author" mapping.
• If a team member’s username does not match his or her email address (for instance, the detected username is john but the appropriate email is [email protected]). We will refer to this type of mapping as an "author to email" mapping.
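The resolution order implied above (default address, overridden by the two mapping types) can be sketched as follows. This is an illustrative model of the behavior, not SOAtest code, and the usernames and domain shown are hypothetical:

```python
def resolve_email(detected_user, author_map, email_map, mail_domain):
    """Resolve the notification address for a detected username.
    The default is [username]@[mail_domain]; an "author to author"
    entry reassigns the tasks first, then an "author to email" entry
    (if any) overrides the constructed address."""
    user = author_map.get(detected_user, detected_user)
    return user, email_map.get(user, f"{user}@{mail_domain}")

# User1 has left the group: reassign to User2, who uses the default address.
print(resolve_email("User1", {"User1": "User2"}, {}, "example.com"))

# The detected name "john" maps to a mailbox with a different local part.
print(resolve_email("john", {}, {"john": "john.smith@example.com"}, "example.com"))
```

Applying the author-to-author map before the author-to-email map means a reassigned user inherits the target user's email mapping, which matches the intent of handing over all of User1's tasks to User2.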

To map the default user value detected to a different username and/or email address:

1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
2. Open the Authors tab of the Browsing dialog.
3. If you want to specify "author to author" mappings:
   a. Select the Reassign tasks from this author... To this author row.
   b. Click Add.
   c. Specify the desired mapping.
4. If you want to specify "author to email" mappings:
   a. Select the Send emails for this author... To this email address row.
   b. Click Add.
   c. Specify the desired mapping.


5. Click the Done button to close the Browsing dialog.

Tip - Importing and Exporting to Share Authorship and Email Mappings

If you want to move mappings from one Team Server to another, use the Export button to export them from the first server, then use the Import button to import them on the other server.

Handling Authorship When Using Multiple Source Control Systems

If you and/or your team members work with multiple source control systems, but use the same login for all of them, start SOAtest as follows to ensure accurate authorship computation from source control:
• Standalone: SOAtest -J-Duser.name=your_username ...
• Plugin: eclipse ... -vmargs -Duser.name=your_username


Configuring Team Test Configurations and Rules

This topic explains how to share Test Configurations (and any rule files or rule mapping files that they depend on) across the team. Sections include:
• About Team Test Configurations
• Sharing Team Test Configurations
• Modifying a Team Test Configuration
• Setting the Team Favorite Test Configuration
• Sharing Rule Mappings
• Sharing Custom Rules

About Team Test Configurations

Team Test Configurations are the Test Configurations that apply the team’s designated test settings (for instance, the static analysis rules that your team has decided to follow, browser playback options for web functional tests, etc.). When all team members use the designated Test Configurations, tests will be run consistently, and the team’s quality and style guidelines will be applied consistently across the project. Once a team Test Configuration is added to Team Server, it will be accessible on all connected team SOAtest installations. If a Test Configuration uses custom rules and/or rule mappings, they can be added to Team Server, then automatically accessed by all connected team SOAtest installations.

Sharing Team Test Configurations

To share a Team Test Configuration team-wide, the architect or manager performs the following procedure on a SOAtest installation (Architect or Server Edition) that is already connected to Team Server:

1. If you have not already done so, create a user-defined Test Configuration that applies the designated team settings. See “Creating Custom Test Configurations”, page 244 for instructions.
2. Upload that configuration to Team Server as follows:
   a. Open the Test Configurations dialog by choosing SOAtest> Test Configurations.
   b. Right-click the Test Configurations category that represents the Test Configuration you want to upload.
   c. Choose Upload to Team Server from the shortcut menu.

You can configure multiple team Test Configurations (for instance, one for static analysis, one for regression testing, etc.).

Tip: If your Team Test Configuration uses custom rules or rule mappings, the related files can be shared as described later in this topic.


Modifying a Team Test Configuration

Team Test Configurations can be edited directly from SOAtest Architect Edition or Server Edition. To directly modify a Team Test Configuration from an Architect or Server Edition:

1. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. In the left pane, select Team> [your_team_Test_Configuration].
3. Modify the settings as needed.
4. Click either Apply or Close to commit the modified settings. The settings will then be updated on Team Server, and the updated settings will be shared across the team.

Alternate Update Method: You can also update a Team Test Configuration by modifying the user-defined Test Configuration it was based on, then repeating the Sharing Team Test Configurations procedure to re-upload the modified Test Configuration.

Setting the Team Favorite Test Configuration

The Team Favorite Test Configuration is the Test Configuration that SOAtest uses when any Team Server-connected team member starts a test without specifically indicating which Test Configuration to use (for example, when a team member starts a test by clicking the Test Using button). To set the Team Favorite Test Configuration, perform the following steps from a SOAtest Architect or Server Edition:

1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
2. Open the Configurations tab of the Browsing dialog.
3. Select the Test Configuration that you want to serve as the Team Favorite Test Configuration.
4. Click the Set as Team Favorite button.

Sharing Rule Mappings

Rule mapping is a key part of configuring SOAtest to enforce your team’s or organization’s coding policy (e.g., by customizing the built-in rule names, severities, and categories to match the ones defined in your policy). You can use Team Server to ensure that all team members can access the rulemap.xml file (described in “Modifying Rule Categories, IDs, Names, and Severity Levels”, page 616) you have created to customize SOAtest rule categories and severity levels.

To upload a rulemap.xml file to Team Server, perform the following steps from SOAtest Architect Edition or Server Edition:

1. Launch SOAtest on a machine from which you can access the rulemap.xml file that you want to share.
2. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
3. Open the Rules tab of the Browsing dialog.


4. Click the Upload button. A file chooser will open.
5. Select the rulemap.xml file that you created, then click Open. The rulemap.xml file that you just uploaded should now be listed in the Browsing dialog’s Rules tab. The rule configurations specified in this file will be available on all SOAtest installations connected to Team Server.
6. Click Done, click Apply, and then close the SOAtest Preferences dialog.
7. Restart SOAtest. You do not have to stop the server first.
8. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
9. Select any Test Configuration and open the Static tab. The new rule settings should be applied.

Tip: If you later modify the master rulemap.xml file, you must repeat the Sharing Rule Mappings procedure to upload the modified file; if the modified file is not uploaded, the modifications will not be shared.

Sharing Custom Rules

You can use Team Server to ensure that all team members can access and check custom static analysis rules you have designed with the RuleWizard module. When Team Server manages a rule, all SOAtest installations connected to Team Server will automatically have access to the most recent version of the rule. If a rule changes and the modified rule is uploaded to Team Server, the version on all team SOAtest installations will be updated automatically.

The architect (or other designated team member) performs the following procedure on one SOAtest Architect or Server Edition that is already connected to Team Server:

1. Create one or more custom rules in RuleWizard.
2. Save each rule and assign it a .rule extension. You can save the rule in any location.
3. If any new rules should belong to a new SOAtest category, create a new category as follows:
   a. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
   b. Select any Test Configurations category.
   c. Open the Static> Rules Tree tab.
   d. Click the Edit Rulemap button.
   e. Open the Categories tab.
   f. Click New. A new entry will be added to the category table.
   g. Enter a category ID and category description in the new entry. For instance, an organization might choose to use ACME as the category ID and ACME INTERNAL RULES as the description.
   h. Note the location of the rulemap file, which is listed at the top of this dialog. You will need this information in step 9.
   i. Click OK to save the new category.
4. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.


5. Open the Rules tab of the Browsing dialog.
6. Click the Upload button. A file chooser will open.
7. Select one or more of the new .rule files that you created, then click Open. The .rule files that you just uploaded should now be listed in the Browsing dialog’s Rules tab. All rules represented in this tab will be available on all SOAtest installations connected to Team Server.
8. Add additional team rules by repeating the previous two steps.
9. If you added any new rule categories or made any other changes to the rule mappings, click Upload, select the edited rulemap file, then click Open. The file that you just uploaded should now be listed in the Browsing dialog’s Rules tab and will be available on all SOAtest installations connected to Team Server. This file tells SOAtest how to categorize the team rules.

Tip - Rules for the Coding Standards Tool: If the rule is going to be added to a Coding Standards test suite tool (as opposed to being run through a static analysis Test Configuration), you add it to the tree in the Coding Standards tool’s configuration panel, not the Test Configuration panel as described below.

10. Open the Test Configurations dialog by choosing SOAtest> Test Configurations.
11. Select any Test Configuration and open the Static> Rules Tree tab.
12. Click Reload. The new rule should be available in all available Test Configurations and classified under the Team category. The rule will be disabled by default.
13. If you want a Team Test Configuration to check these rules:
   a. Configure a new or existing Test Configuration to check these rules. The added rules will be disabled by default, so you will need to enable any rules that you want checked.
   b. Ensure that the modified Test Configuration is available to the team as described in “Sharing Team Test Configurations”, page 220. You must follow this procedure even if you are modifying a Test Configuration that is already shared.
14. Click either Apply or Close to commit the modified settings.

Tips
• If your custom rule is visible in a Test Configuration’s rules tree (for instance, if you imported it via the rules tree’s Import button), you can upload it to Team Server by simply right-clicking the rule, then choosing Upload to Team Server from the shortcut menu.
• If you later modify a team rule, you must repeat the Sharing Custom Rules procedure to upload the modified rule file; if the modified .rule file is not uploaded, the rule modifications will not be shared.

Removing Rules From Team Server

To remove a rule from Team Server, the architect (or other designated team member) performs the following procedure from SOAtest Architect Edition or Server Edition:

1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.


2. Open the Rules tab of the Browsing dialog.
3. Select the rule you want to remove.
4. Click Delete.
5. Click OK.


Configuring Task Goals

This topic explains how managers can set goals for task reporting and task resolution, then have SOAtest apply these goals across the team’s SOAtest installations. Sections include:
• About Global Goals
• Configuring Global Goals
• Importing Tasks for Specified Goals

About Global Goals

Global team goals are goals that may span multiple Test Configurations. For instance, a global goal might be:
• Test execution failures from any project should be fixed each day.
• Certain projects must have all static analysis tasks related to a given policy completed by a certain date.

To configure global goals, your SOAtest installation must have a SOAtest Server license. Global goals will be applied across the team if all team SOAtest installations are connected to Team Server as described in “Connecting All SOAtest Installations to Team Server”, page 199. Users without a Server license can review goals, but not configure them. When global goals are enabled, the Test Configuration manager’s Goals tab will be disabled.

Configuring Global Goals

To configure global goals:

1. Choose SOAtest> Preferences to open the Preferences panel.
2. Choose Tasks> Goals on the left.
3. Check the Enable global management option.
4. Click New.
5. Configure the new goal that is added to the table. You can configure global goal options such as:
   • The project(s) that the goal applies to.
   • The deadline for achieving the goal.
   • What the goal requires.

Importing Tasks for Specified Goals

When importing tasks from Team Server, team members can choose to import only the tasks related to a specified goal. To do this:

1. Choose SOAtest> Import> Custom Tasks or choose Custom Tasks from the Import My Recommended Tasks pull-down toolbar menu.
2. Select Import from Team Server.
3. Select Filtered.
4. Select for goals.


5. Choose the appropriate goal from the goals box.

6. Click OK.


Sharing Project and Test Assets

This topic explains how to share your project and test assets across the team.

Sharing Test Assets

To ensure that your tests are accessible to other team members, as well as to the nightly test machine, share test assets such as the following through source control:
• .tst files
• data sources
• custom scripts
• external regression controls

Using a Team Project Set File (.psf) to Share the Project Across the Team

Once one team member creates a project, that team member can create a Team Project Set File (.psf), which can then be shared by the other members of the team. Doing so allows every team member to create their Eclipse projects in the same uniform way. This is a necessary step for importing tasks from the automated nightly test process.

To create a Team Project Set File:

1. Select File> Export. The Export wizard is displayed.
2. Within the Export wizard, select Team> Team Project Set, then click the Next button.
3. Select the projects to be included in the Team Project Set File by selecting the corresponding check boxes.
4. Enter the location where the Team Project Set File will be saved and click the Finish button.

To create a project from a Team Project Set File:

1. Select File> Import. The Import wizard is displayed.
2. Within the Import wizard, select Team> Team Project Set, then click the Next button.
3. Browse to the desired Team Project Set and click the Finish button. The tests you selected are displayed in the Test Case Explorer.
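For reference, the .psf file produced by the Export wizard is a small XML document. The following is an illustrative sketch only; the provider id shown (Eclipse's CVS plugin) and the repository reference string are hypothetical examples, and both depend on your source control plugin and repository:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<psf version="2.0">
  <!-- One provider block per source control plugin; the id and the
       project reference string below are illustrative examples only. -->
  <provider id="org.eclipse.team.cvs.core.cvsnature">
    <project reference="1.0,:pserver:cvs.example.com:/repo,MyTests,MyTests"/>
  </provider>
</psf>
```

Because the file records repository references rather than file contents, checking the .psf itself into source control (or posting it on a shared server) is usually enough for every team member to recreate the same projects.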

Sharing Preferences Across a Team

To share preferences, you can export them (ideally, to source control, a file server, a wiki, etc.), then import them into the other installations. When the desired preferences change, they will need to be manually exported and imported again. To export and then import preferences:

1. In the workspace where they are set, export them by choosing File> Export, selecting General> Preferences, then completing the wizard.
2. In the new workspace, import them by choosing File> Import, selecting General> Preferences, then completing the wizard.





Configuring Automated Nightly Testing

You can run SOAtest from the command line to speed up your testing/development process and integrate SOAtest into your automated nightly builds/deployments. The following example scenario describes how you might set up SOAtest to run automatically as part of a nightly testing and build process, including automatic test execution and reporting. We will assume that you have your SOAtest project files checked into a revision control system and that you have an existing automated job scheduler, such as the Windows Task Scheduler or Unix cron. The goal is to have SOAtest run all project files that are checked into source control and to generate reports which will be visible on a web server.

1. Create directories on your test system for storing project files and reports.
2. Schedule a process in the task scheduler on the test system to check out the latest version of your SOAtest project and test files from your revision control system. This process should be scheduled to run daily.
3. Schedule another daily process to invoke SOAtest using the desired command line options.

• For details and examples, see the tutorial lesson “Automation/Iteration (Nightly Process)”, page 165, as well as the topic “Testing from the Command Line Interface (soatestcli)”, page 257.
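The steps above can be sketched as a cron-driven script. Everything in this sketch is illustrative: the repository URL, directory paths, and Test Configuration name are hypothetical, and the soatestcli options shown should be checked against “Testing from the Command Line Interface (soatestcli)”, page 257 for your installation before use.

```shell
#!/bin/sh
# Hypothetical nightly wrapper script -- adjust all paths and names
# to match your environment.
PROJECT_DIR=/opt/nightly/workspace        # step 1: project file storage
REPORT_DIR=/var/www/html/soatest-reports  # step 1: web-visible report directory

# Step 2: check out the latest project and test files (Subversion shown
# as an example; use your own revision control client).
svn checkout https://svn.example.com/repos/soatest-tests "$PROJECT_DIR/tests"

# Step 3: invoke SOAtest from the command line and write an HTML report
# to the web server's document root.
soatestcli -data "$PROJECT_DIR" \
           -config "user://Nightly Regression" \
           -report "$REPORT_DIR/nightly.html"
```

A crontab entry such as `0 2 * * * /opt/nightly/nightly_soatest.sh` would then run the script at 2:00 a.m. every day.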


Team-Wide Deployment - Usage Overview

In this section:

• Using a Team Deployment: Daily Usage Introduction
• Creating Tests and Analyzing Source Files from the GUI
• Reviewing and Responding to Tasks Generated During Nightly Tests
• Accessing Results and Reports
• Reassigning Tasks to Other Team Members
• Monitoring Project-Wide Test Results



Using a Team Deployment: Daily Usage Introduction

Once a team deployment is configured as described in “Configuring a Team Deployment: Introduction”, page 191, daily usage should involve:

• Developers/QA - Creating Tests and Analyzing Source Files from the GUI
• Developers/QA - Reviewing and Responding to Tasks Generated During Nightly Tests
• Managers/Architects - Monitoring Project-Wide Test Results



Creating Tests and Analyzing Source Files from the GUI

Team members who are testing SOA systems typically use their local installations of SOAtest to develop and run tests, then check their test artifacts (.tst files, etc.) into source control so that they can be integrated into the nightly regression test suite. Team members who are working on source files typically use their local installations of SOAtest to ensure that each new or modified source file complies with the organization’s policy, address the reported violations, and then add the compliant files to source control.



Reviewing and Responding to Tasks Generated During Nightly Tests

This topic explains the recommended procedure for team members to follow each morning to review the results from the team's SOAtest Server (soatestcli) tests. soatestcli should be configured to run the team’s designated functional test suite and policy enforcement/static analysis tests nightly to identify any quality tasks that require the team’s attention (failed test cases, policy violations, etc.).

To access these tasks, each team member performs the following procedure each morning:

1. (Optional) Review the emailed report of test results to determine if any new tasks were assigned to you.
   • See “Understanding Reports”, page 300 for report details.
2. Import results by choosing SOAtest> Import [preferred category of tasks].
   • See “Accessing Results and Reports”, page 234 for other import options.
3. Review and respond to results.
   • See “Viewing Results”, page 290 for details.
   • If a test failure is reported, determine if the failure was the result of an expected change (for instance, an application change made in response to a modified requirement) or a problem with the application. For expected changes, the test case needs to be modified. For unexpected changes, the application needs to be corrected.
   • If a policy violation is reported, either correct the violation, or use code review to discuss whether it is a good candidate for a suppression.
4. Reassign tasks if needed.
   • See “Reassigning Tasks to Other Team Members”, page 238 for details.
5. Add any modified code and/or test assets (test cases, data sources, etc.) to the designated source control location.



Accessing Results and Reports

This topic explains how to access test results available on Team Server. If SOAtest results (e.g., from a nightly command line test) are sent to Parasoft Team Server, team developers can import test results into the SOAtest GUI. This way, project-wide/team-wide tests can be run automatically each night on a central server machine. Each team member can import his or her assigned tasks (failed tests to review, policy violations to address, etc.) into the GUI and then review and respond to them in the SOAtest view. In addition, team members can download and/or open those reports from any SOAtest installation connected to Team Server or from any machine that can access the Team Server Web server. Sections include:

• Importing Results From Team Server into the SOAtest GUI
• Accessing Team Server Reports through the GUI
• Accessing Team Server Reports through a Web Browser
• Accessing Results through Report Center

Terminology

• Your tasks: The subset of all of your team's testing tasks that you are responsible for (based on SOAtest’s task assignments, which are discussed in “Configuring Task Assignment”, page 215).
• Recommended tasks: The subset of all of the testing tasks that SOAtest has selected for you to review and address today (based on the maximum number of tasks per developer per day that your team has configured SOAtest to report, as described in “Defining Task Reporting and Resolution Targets (Goals Tab)”, page 250).
• Your recommended tasks: The subset of all of the testing tasks that 1) you are responsible for (based on SOAtest’s task assignments, which are discussed in “Configuring Task Assignment”, page 215) and 2) SOAtest has selected for you to review and address today (based on the maximum number of tasks per person per day that your team has configured SOAtest to report, as described in “Defining Task Reporting and Resolution Targets (Goals Tab)”, page 250).

Importing Results From Team Server into the SOAtest GUI

Any team member whose SOAtest installation is connected to Team Server will be able to import the test results stored on Team Server. When results are imported, test results are shown in the GUI as if the test were run in the GUI. After the import, you can drill down into results in the normal manner. You can import a specific category of tasks, tasks for specific resources you have worked on or are responsible for correcting, or all tasks. You can only import results for projects that are currently in your workbench. If the tested project files were modified in your workbench since the test was run, the results will not be reported because they might not correspond to your modified version of the project files.



Tip: Importing Tasks if Tests Are Not Performed Frequently

By default, SOAtest is configured to import tasks from tests performed within the past 2 days. If your team doesn’t run tests frequently and you try to import tasks more than 2 days after the test has run, nothing will be imported—unless you change the default settings. To change the default test import settings:

1. Choose SOAtest> Preferences. The Preferences dialog will open.
2. In the left pane, select SOAtest> Tasks.
3. Modify the setting for Import only tasks reported for tests ran in the last n days.

Importing Your Recommended Tasks

To import your recommended tasks reported for all test runs that were performed in the previous 24 hours and whose results were sent to Team Server:

• Choose SOAtest> Import> My Recommended Tasks or click the Import My Recommended Tasks toolbar button.

Importing All Your Tasks

To import your testing tasks reported for all test runs that were performed in the previous 24 hours and whose results were sent to Team Server:

• Choose SOAtest> Import> All My Tasks or choose All My Tasks from the Import My Recommended Tasks pull-down toolbar menu.

Importing a Custom Set of Tasks

To import a custom set of tasks:

1. Choose SOAtest> Import> Custom Tasks or choose Custom Tasks from the Import My Recommended Tasks pull-down toolbar menu.
2. Specify where you want to import tasks from. Available options are:
   • Import from Team Server server: Imports tasks that were uploaded to Team Server (for example, after a batch mode test).
   • Import from local file(s): Imports tasks from a results file that is accessible from your local file system.
3. Specify which tasks you want to import. Available options are:
   • All/Filtered: Specifies whether you want to import all tasks on Team Server, or only a subset of tasks (tasks that satisfy the criteria specified in the subsequent options).
   • recommended tasks: Imports only recommended tasks.
   • for selected resources: Imports only tasks for the selected resources.
   • for single user: Imports only tasks for the specified user.
4. Click OK.



For example, if you were tasked with cleaning up a particular file and wanted to import all of the tasks reported for that file, you would first select that resource in the project tree, then you would open the Custom Tasks dialog and select the following:

• Import from Team Server server
• Filtered
• for selected resources

Importing All Results from an XML File

Here’s an alternative way to import all tasks:

1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
2. Open the Reports tab of the Browsing dialog.
3. Select the XML report whose results you want to import, then click the Import Results button.

Accessing Team Server Reports through the GUI

Any team member whose SOAtest installation is connected to Team Server will be able to view and download the report files that are available on Team Server. To download a report file:

1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
2. Open the Reports tab of the Browsing dialog. Reports will be organized according to the date when they were generated.
3. Do one of the following:
   • To view a report, select the report that you want to view, then click the View button. The report will open in a Web browser.
   • To download a report, select the report that you want to download, then click the Download button. A file chooser will open. Specify a location for the downloaded report, then click Save. The report file will then be downloaded to the specified location.

Removing Reports from Team Server

If you want to delete reports stored on Team Server (for example, if you want to remove all old reports from Team Server or remove the report for a failed test run):

1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
2. Open the Reports tab of the Browsing dialog.
3. Select the XML report whose results you want to delete, then click the Delete button.
   • If you want to keep the related test data on Team Server (e.g., if you are cleaning out old reports, but you still want data from these tests used for graphs that show historical trends), clear the Keep summary data for report graphs check box.



Accessing Team Server Reports through a Web Browser

Any team member who can access the Team Server’s Web server can directly browse to the HTML and XML report files that are available on Team Server. This allows team members to access reports outside of the SOAtest GUI. Moreover, in the reports available on Team Server, all links (for instance, links to Category) are active; links are not active in emailed reports. To directly access a report available on Team Server:

• Choose SOAtest> Explore> Team Server Reports in the SOAtest GUI.

Accessing Results through Report Center

To access Report Center reports based on information from SOAtest tests and other sources:

• Choose SOAtest> Explore> Report Center Reports or open the reports as described in your Report Center User’s Guide.



Reassigning Tasks to Other Team Members

This topic explains how to reassign a reported task (for example, to fix a static analysis/policy violation, review a test failure, etc.) to another team member. SOAtest automatically assigns tasks as described in “Configuring Task Assignment”, page 215. You can override these computations and reassign tasks as needed. To reassign a task:

1. In the SOAtest view, right-click the task you want to reassign, then choose Reassign Task To from the shortcut menu. The Reassign Task To dialog will open.
2. Choose the option that indicates how you want the task reassigned.
   • To reassign the task to a specific user, choose Username, then enter the username of the person to whom you want that task assigned.
   • To reassign the task to the user who last modified the file, choose Last Author.
3. (Optional) If you want to remove the reassigned task from your SOAtest view, enable the Delete task if reassigning option.

Note

If you are using Parasoft Team Server, then task assignment is managed by the server and shared across the team; otherwise, it is managed locally by your installation of SOAtest. For example, if you are using Team Server and you reassign tasks to johndoe, those tasks will only be shown on johndoe’s machine; if you and other team members have set the Check only file/lines authored by [your_username] Scope option, those tasks will not appear on your machine or other team members’ machines. If you are not using Team Server and you reassign tasks to johndoe, those tasks will not be shown on your machine, but may be shown on other team members’ machines.



Monitoring Project-Wide Test Results

This topic explains the recommended procedure for team leaders (managers, architects, etc.) to follow in order to monitor quality, evaluate project readiness, and spot emerging trends and problems. To monitor project quality and status, managers should review results on a regular basis. Results can be accessed in several ways:

• From the emailed report of test results.
  • See “Understanding Reports”, page 300 for report details.
• From the report of test results that was sent to Team Server (by choosing SOAtest> Explore> Team Server Reports).
  • See “Accessing Results and Reports”, page 234 for details.
• From the Report Center dashboard (by choosing SOAtest> Explore> Report Center Reports).
• From the GUI (by choosing SOAtest> Import> All Tasks).


Test and Analysis Basics

In this section:

• Running Tests and Analysis
• Reviewing Results


Customizing Settings and Configurations

In this section:

• Modifying General SOAtest Preferences
• Creating Custom Test Configurations



Modifying General SOAtest Preferences

This topic explains how to modify SOAtest’s general settings (settings that are not specific to any particular Test Configuration, project, or test). Sections include:

• Customizing Preferences
• Configuring Double-Click vs. Single-Click Options
• Sharing Preferences Across a Team
• Transferring Preferences Across Workspaces

Customizing Preferences

To customize general SOAtest preferences:

1. Choose SOAtest> Preferences. The Preferences dialog will open.
2. In the left pane, select the category that represents the SOAtest settings you want to change.
   • See “Preference Settings”, page 747 for details on the available categories and settings.
3. Modify the settings as needed.
4. Click either Apply or OK to commit the modified settings.

Configuring Double-Click vs. Single-Click Options

With the default settings, you need to double-click a Test Case Explorer or Navigator node to open the related file or configuration panel. For instance, if you want to configure a tool, you need to double-click the related Test Case Explorer node to open that tool’s configuration panel. You can change the default double-click behavior to single-click by completing the following:

1. Select Window> Preferences.
2. Within the Preferences dialog, select General on the left, and change the Open mode from Double click to Single click within the right GUI panel.
3. Select General> Editors, enable Close editors automatically, and then click the OK button.

You will now be able to open editors based on a single click.

Sharing Preferences Across a Team

If you share an entire workspace with team members, your preferences (SOAtest-specific preferences as well as general Eclipse preferences) will be shared along with the project. If you do not share an entire workspace, but want to share preferences, you will need to export them (ideally, to source control, a file server, a wiki, etc.), then import them onto the other installations. When the desired preferences change, they will need to be manually exported and imported again. To export and then import preferences:

1. In the workspace where they are set, export them by choosing File> Export, selecting General> Preferences, then completing the wizard.



2. In the new workspace, import them by choosing File> Import, selecting General> Preferences, then completing the wizard.

Transferring Preferences Across Workspaces

Eclipse resets preferences to the defaults when a new workspace is opened. If you want to transfer your preferences (SOAtest-specific preferences as well as general Eclipse preferences) to a new workspace:

1. In the workspace where they are set, export them by choosing File> Export, selecting General> Preferences, then completing the wizard.
2. In the new workspace, import them by choosing File> Import, selecting General> Preferences, then completing the wizard.
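The preference export wizard writes an Eclipse preference file (conventionally with an .epf extension), which is a plain key/value properties file that can be checked into source control like any other text file. The sketch below shows the general shape only; the individual keys are illustrative (the Parasoft key name in particular is hypothetical), and hand-editing an exported file is not recommended unless you know what each key controls.

```properties
# Typical header written by the Eclipse export wizard:
#Sat Jun 10 09:00:00 PDT 2017
file_export_version=3.0
# Keys are scoped by plugin namespace; these entries are illustrative only:
/instance/org.eclipse.ui/showIntro=false
# Hypothetical example of a Parasoft-namespaced preference key:
/instance/com.parasoft.soatest/tasks.import.days=5
```

Importing such a file through File> Import> General> Preferences applies the recorded values to the new workspace.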



Creating Custom Test Configurations

This topic explains why and how to create custom Test Configurations that define the test scenarios you plan to use, as well as how to export/import Test Configurations. Sections include:

• About Custom Test Configurations
• Creating a Custom Test Configuration
• Configuring Test Settings
  • Defining What Code is Tested (Scope Tab)
  • Defining How Static Analysis is Performed (Static Tab)
  • Defining How Test Cases are Executed (Execution Tab)
  • Defining Common Options that Affect Multiple Analysis Types (Common Tab)
  • Defining Peer Review Options (Code Review Tab)
  • Defining Task Reporting and Resolution Targets (Goals Tab)
• Changing the Favorite Test Configuration
• Importing/Exporting Test Configurations
• Comparing Test Configurations
• Specifying Test Configuration Inheritance

About Custom Test Configurations

Every SOAtest test—whether it is performed in the GUI or from the command line interface—is based on a Test Configuration, which defines a test scenario and sets all related test parameters for static analysis and test execution. To change how a test is performed, you modify the settings for the Test Configuration you plan to use.

SOAtest provides built-in Test Configurations that are based on a variety of popular test scenarios. However, because development projects and team priorities differ, some SOAtest users prefer to create custom Test Configurations. The default Test Configurations, which are in the Built-in category, cannot be modified. The recommended way to create a custom Test Configuration is to copy a built-in Test Configuration to the User-defined category, then modify the copied Test Configuration to suit your preferences and environment. Alternatively, you could create a new Test Configuration "from scratch", then modify it as needed.

The Favorite Configuration should be set to the custom Test Configuration that you plan to use most frequently. By setting your preferred Test Configuration as the Favorite Configuration, you can easily run it from the SOAtest menu, the Test Using toolbar button, or from the command line interface.

Creating a Custom Test Configuration

To create a custom Test Configuration:

1. Open the Test Configurations panel by choosing SOAtest> Test Configurations.
2. Review the available Test Configurations to determine which (if any) you want to base your custom Test Configuration on.
   • Built-in Test Configurations are described in “Built-in Test Configurations”, page 741.



3. Do one of the following:
   • If you want to base a custom Test Configuration on a built-in Test Configuration, right-click that Test Configuration, then choose Duplicate.
   • If you want to create a custom Test Configuration from scratch, click New.
4. Select the new Test Configuration, which will be added to the User-defined category.
5. Modify the settings as needed.
   • Scope tab settings determine what code is tested; this tab allows you to restrict tests by author, by timestamp, and so on. For details, see “Defining What Code is Tested (Scope Tab)”, page 245.
   • Static tab settings determine how static analysis is performed and what rules it checks. For details, see “Defining How Static Analysis is Performed (Static Tab)”, page 247.
   • Execution tab settings determine if test cases are executed. For details, see “Defining How Test Cases are Executed (Execution Tab)”, page 248.
   • Common tab settings cover various actions that can affect multiple types of analysis. For details, see “Defining Common Options that Affect Multiple Analysis Types (Common Tab)”, page 249.
   • Goals tab settings allow the team to specify goals for task reporting and task resolution. For details, see “Defining Task Reporting and Resolution Targets (Goals Tab)”, page 250.
6. (Optional) Set the Test Configuration as the Favorite Test Configuration by right-clicking it, then choosing Set as Favorite from the shortcut menu. The configuration will then be set as the Favorite Configuration; the "favorite" icon will be added to that configuration in the Test Configurations tree.
7. Click Apply, then Close.

"Grayed-Out" Test Configurations = Incompatible Test Configurations If a Test Configuration is "grayed out," this indicated that it was created with an incompatible version of SOAtest, and cannot be edited or run with the current version.

Tip - Importing and Exporting to Share Test Configurations

If you are not using Parasoft Team Server to share test settings across your team, you can share custom Test Configurations with team members by exporting each Test Configuration you want to share, then having your team members import it. See “Importing/Exporting Test Configurations”, page 251 for details.

Configuring Test Settings

Defining What Code is Tested (Scope Tab)

For source code static analysis only.



During a test, SOAtest will perform the specified action(s) on all code in the selected resource that satisfies the scope criteria for the selected Test Configuration. By default, SOAtest checks all code in the selected resource. However, you can use the Scope tab to configure restrictions such as:

• Test only files or lines added or modified after a given date.
• Test only files or lines added or modified on the local machine.
• Test only files modified by a specific user.

You can restrict the scope of tests by specifying file filters or line filters that define what files or code you want tested (for example, only files or lines added or modified since a cutoff date, only files or lines added or modified locally, only files or lines last modified by a specific user). Some file filters and line filters are only applicable if you are working with projects that are under supported source control systems. The Scope tab has the following settings:

• File Filters: Restricts SOAtest from testing files that do not meet the specified timestamp and/or author criteria.
  • Time options: Restricts SOAtest from testing files that do not meet the specified timestamp criteria. Available time options include:
    • No time filters: Does not filter out any files based on their last modification date.
    • Test only files added or modified since the cutoff date: Filters out files that were not added or modified since the cutoff date.
    • Test only files added or modified in the last n days: Filters out files that were not added or modified in the specified time period.
    • Test only files added or modified locally: Filters out files that were not added or modified on the local machine. This feature only applies to files that are under supported source control systems.
  • Author options: Restricts SOAtest from testing files that do not meet the specified author criteria. Available author filter options include:
    • No author filters: Does not filter out any files based on their author.
    • Test only files authored by preferred user: Filters out any files that were not authored by the specified user (i.e., filters out any files that were authored by another user).
• Line filters: Restricts the lines of code that SOAtest operates on. The file filter is applied first, so code that reaches the line filter must have already passed the file filter. Available line filter options include:
  • Time options: Restricts SOAtest from testing lines of code that do not meet the specified timestamp criteria. Available time options include:
    • No time filters: Does not filter out any lines of code based on their last modification date.
    • Test only lines added or modified since the cutoff date: Filters out lines of code that were not added or modified since the cutoff date. This feature only applies to files that are under supported source control systems.
    • Test only lines added or modified in the last n days: Filters out lines of code that were not added or modified in the specified time period.
    • Test only lines added or modified locally: Filters out lines of code that were not added or modified on the local machine. This feature only applies to files that are under supported source control systems.
  • Author options: Restricts SOAtest from testing lines of code that do not meet the specified author criteria. Available author filter options include:
    • No author filters: Does not filter out any lines of code based on their author.
    • Test only lines authored by preferred user: Filters out any lines of code that were not authored by the specified user (i.e., filters out any lines of code that were authored by another user).

Note

• Code authorship information and last modified date is determined in the manner set in the Scope and Authorship preferences page; for details about available settings, see “Configuring Task Assignment”, page 215.
• Setting file or line scope filters could prevent SOAtest from reporting some of the violations that occur within tested files. See “”, page 607 for details.

Defining How Static Analysis is Performed (Static Tab)

During a test, SOAtest will perform static analysis based on the parameters defined in the Test Configuration used for that test. The Static tab has the following settings:

• Enable Static Analysis: Determines whether SOAtest performs static analysis, which involves checking whether the selected resources follow the rules that are enabled for this Test Configuration.
• Limit maximum number of tasks reported per rule to: Determines whether SOAtest limits the number of violations (tasks) reported for each rule, and—if so—the maximum number of violations per rule that should be reported during a single test. For instance, if you want to see no more than five violations of each rule, set this parameter to 5. The default setting is 1,000.
• Do not apply suppressions: Determines whether SOAtest applies the specified suppressions. If suppressions are not applied, SOAtest will report all violations found.
• Rules tree: Determines which rules are checked during static analysis. Use the rules tree and related controls to indicate which rules and rule categories you want checked during static analysis.
  • To view a description of a rule, right-click the node that represents that rule, then choose View Rule Documentation from the shortcut menu.
  • To view a description of a rule category, right-click the node that represents that rule category, then choose View Category Documentation from the shortcut menu.
  • To enable or disable all rules in a specific rule category or certain types of rules within a specific rule category, right-click the category node, then choose Enable Rules> [desired option] or Disable Rules> [desired option].
  • To search for a rule, click the Find button, then use that dialog to search for the rule.
  • To hide the rules that are not enabled, click the Hide Disabled button. If you later want all rules displayed, click Show All.

Tips

• The number next to each rule ID indicates the rule’s severity level. The severity level indicates the chance that a violation of the rule will cause a serious construction defect (a coding construct that causes application problems such as slow performance, security vulnerabilities, and so on). Possible severity levels (listed from most severe to least severe) are:
  • Severe Violation (SV) - Level 1
  • Possible Severe Violation (PSV) - Level 2
  • Violation (V) - Level 3
  • Possible Violation (PV) - Level 4
  • Informational (I) - Level 5
• To learn about the rules that are included with SOAtest, choose Help> Help Contents, open the SOAtest Static Analysis Rules book, then browse the available rule description files.
• To generate a printable list of all rules that a given Test Configuration is configured to check:
  a. Open the Test Configurations panel by choosing SOAtest> Test Configurations.
  b. Select the Test Configurations category that represents the user-defined Test Configuration you want to modify.
  c. Open the Static tab.
  d. Click the Printable Docs button.

Defining How Test Cases are Executed (Execution Tab)

During a test, SOAtest executes test cases based on the parameters defined in the selected Test Configuration’s Execution tab. The Execution> Functional tab has the following settings:

• Enable Test Execution: Determines whether SOAtest executes available tests. If this option is not checked, all other test execution parameters are irrelevant.
• Execute in load test mode: Determines whether SOAtest executes available tests in load testing mode and alerts you to any outstanding issues that might impact your load testing (for example, incorrectly configured HTTP requests). See “Validating Tests”, page 566 for details.
• Auto-configure tests in preparation for load testing: Determines whether SOAtest configures browser-based web functional tests to run in a browser-less load test environment. See “Configuring Tests”, page 562 for details.
• Execute only opened Test Suite (.tst) Files (always false in command-line mode): Determines whether SOAtest executes Test Suites that are not currently active (i.e., tests that you are not currently working on).
• Override default Environment during Test Execution: Configures SOAtest to always use the specified environment for tests run with this Test Configuration, regardless of what environment is active in the Test Case Explorer.


Creating Custom Test Configurations

For example, if your project defines several environments, including one named "staging server", you could set the Test Configuration to always use the "staging server" environment.



• Use browser: Allows you to override a test’s browser playback settings at the time of test execution. See “Modifying Browser Playback Settings”, page 447 and “Specifying the Browser at the Time of Test Execution”, page 447 for details.
• Apply static analysis to: If a Test Configuration runs both static analysis and test execution (e.g., for performing static analysis on a Web functional test), this setting determines whether static analysis is performed on the HTTP responses or the browser contents.
  • HTTP Responses refers to the individual HTTP messages the browser received while constructing its data model, i.e., the content returned by the server as is (before any browser processing).
  • Browser-Constructed HTML refers to the real-time data model that the browser constructed from all of the HTML, JS, CSS, and other files it loaded.

The Execution> Security tab allows you to configure penetration testing, which is described in “Penetration Testing”, page 484. The Execution> Runtime Error Detection tab allows you to configure runtime error detection, which is described in “Performing Runtime Error Detection”, page 490.

Defining Common Options that Affect Multiple Analysis Types (Common Tab)

The Test Configuration’s Common tab controls test settings for actions that affect multiple analysis types. The Common tab has the following settings:

• Before Testing> Refresh projects: Determines whether projects are refreshed before they are tested. When a project is refreshed, SOAtest checks whether external tools have changed the project in the local file system, and then applies any detected changes. Note that when you test from the command line, projects are always refreshed before testing.
• Before Testing> Update projects from source control: Determines whether projects are updated from source control (if you are using a supported source control system) before they are tested.
• Build: Determines if and when projects are built before they are tested. Note that this setting applies to GUI tests, not command-line tests. Available options include:
  • Full (rebuild all files): Specifies that all project files should always be rebuilt.
  • Incremental (build files changed since last build): Specifies that only the project files that have changed since the previous build should be rebuilt.
• Stop testing on build errors: Specifies that testing should stop when build errors are reported.
• Commit added/modified files to source control if no tasks were reported: Allows you to combine your testing and your source control check-ins into a single step. For example, you would enable this if you want to run static analysis on files, then have SOAtest automatically check in the modified files if no static analysis tasks are reported. In the context of functional testing, it tells SOAtest that if you run modified tests and they pass, it should check the modified tests into source control.

Defining Peer Review Options (Code Review Tab)

This tab contains settings for automating the preparation, notification, and tracking of the peer review process, which can be used to evaluate critical SDLC artifacts (source files, tests, etc.) in the context of the organization’s defined quality policies. For details, see “Code Review”, page 619.

Defining Task Reporting and Resolution Targets (Goals Tab)

To ensure that SOAtest does not report an overwhelming number of tasks, the team manager can specify a reporting limit (such as "Do not report more than 25 static analysis tasks per developer per day") and/or a quality goal (such as "All static analysis violations should be fixed in 2 months"). SOAtest will then use the specified criteria to select a subset of testing tasks for each developer to perform each day. These goals are specified in the Goals tab.

Alternatively, you can set global team goals (goals that may span multiple Test Configurations) as described in “Configuring Task Goals”, page 225. This requires Team Server and a SOAtest Server license. If goals are set globally, the Goals tab in the Test Configuration panel will be disabled.

The Goals tab has the following settings:

Static tab

• Perform all tasks: Specifies that you want SOAtest to report all static analysis tasks it recommends, and the team should perform all static analysis tasks immediately.
• Don’t perform tasks: Specifies that you want SOAtest to report all static analysis tasks it recommends, but the team is not required to perform them immediately. This is useful, for instance, if you want to see all recommended static analysis tasks, but you want the team to focus on fixing test failures before addressing static analysis violations.
• No more than n tasks per developer by date: Specifies that you want each developer to have only n static analysis tasks by the specified date.
• Max tasks to recommend: Limits the number of static analysis tasks reported for each developer on any test run. The tasks shown are selected randomly so that different tasks are shown after each run. For example, if you set the parameter to 50, the first task shown after each run is selected at random, and the following 49 tasks shown are the ones that would follow that first task in a complete report.

Execution tab

• Perform all tasks: Specifies that you want SOAtest to report all functional testing tasks, and the team should perform the specified tasks immediately.
• Don’t perform tasks: Specifies that you want SOAtest to report all functional testing tasks, but the team is not required to perform them immediately. This is useful, for instance, if you want to see a list of all necessary functional testing tasks, but you want the team to focus on fixing static analysis tasks before addressing functional test failures.
• No more than n tasks per developer by date: Specifies that you want each developer to have only n functional testing tasks by the specified date.
• Max tasks to recommend: Limits the number of functional testing tasks reported for each developer on any test run. The tasks shown are selected randomly so that different tasks are shown after each run. For example, if you set the parameter to 50, the first task shown after each run is selected at random, and the following 49 tasks shown are the ones that would follow that first task in a complete report.
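The "Max tasks to recommend" selection described above (a randomly chosen first task, followed by the tasks that would come after it in the complete report) can be sketched as follows. This is an illustration only; the wrap-around behavior for starting points near the end of the report is an assumption, as the guide does not specify it:

```python
import random

def recommend_tasks(all_tasks, max_tasks):
    """Pick a random first task, then return it plus the tasks that
    would follow it in the complete report, up to max_tasks entries."""
    if len(all_tasks) <= max_tasks:
        return list(all_tasks)
    start = random.randrange(len(all_tasks))
    window = all_tasks[start:start + max_tasks]
    # Assumption: wrap around to the start of the report when the random
    # starting point falls near the end (the guide does not specify this).
    if len(window) < max_tasks:
        window += all_tasks[:max_tasks - len(window)]
    return window
```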

Changing the Favorite Test Configuration

The Favorite Configuration defines the test scenario that SOAtest uses when you start a test without specifically indicating which Test Configuration you want to use. For example, if you start a test by clicking the Test Using button, SOAtest will run that test based on the parameters defined in the Favorite Configuration.

To indicate which Test Configuration you want set as the Favorite Configuration:
1. Open the Test Configurations panel by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. Right-click the Test Configurations category that represents the Test Configuration you want set as the Favorite Configuration, then choose Set As Favorite from the shortcut menu.

The configuration will then be set as the Favorite Configuration; the "favorite" icon will be added to that configuration in the Test Configurations tree.

Importing/Exporting Test Configurations

If you have created a Test Configuration that you want to share with team members or use in an upgraded version of SOAtest, you can export the Test Configuration into a properties file. That Test Configuration can then be added by importing the related properties file.

Exporting

To export a Test Configuration:
1. Open the Test Configurations panel by choosing SOAtest> Test Configurations.
2. Right-click the Test Configuration you want to export, choose Export from the shortcut menu, then use the file chooser to indicate where you want to save the properties file that will be created for this Test Configuration.

A properties file will then be saved in the designated location. A dialog box will open to confirm the location of the newly-created properties file.


Importing

To import a Test Configuration that was previously exported into a properties file:
1. Open the Test Configurations panel by choosing SOAtest> Test Configurations.
2. Right-click the User-defined category, choose Import from the shortcut menu, then use the file chooser to select the appropriate properties file.

Comparing Test Configurations

To compare two Test Configurations:
1. Open the Test Configurations panel by choosing SOAtest> Test Configurations.
2. Right-click one of the Test Configurations you want to compare, choose Export from the shortcut menu, then use the file chooser to indicate where you want to save the .properties file (choose a folder that is in your workspace and available in the Package Explorer).
3. Repeat the above step for the other Test Configuration you want to compare.
4. Select the two .properties files in the Navigator, right-click the selection, then choose Compare with> Each other.

Specifying Test Configuration Inheritance

If you want multiple Test Configurations to share some of the same parameter settings (for example, if you want multiple Test Configurations to have the same rules enabled), you can create new child Test Configurations referring to one parent Test Configuration. A child Test Configuration inherits the parent’s settings; the value of each preference in the parent Test Configuration is used whenever the corresponding preference in the child Test Configuration is not present. Inheritance is recursive; in other words, you could have the MyConfig2 Test Configuration inherit the settings from MyConfig1, and have MyConfig3 inherit the settings from MyConfig2. MyConfig3 will thus inherit some MyConfig1 settings as it inherits MyConfig2 settings.

You can create a child Test Configuration from a Test Configuration shown in the Test Configurations panel, or by specifying a Test Configuration URL (for Test Configurations available via HTTP).

To create a child from a Test Configuration shown in the Test Configurations panel:
1. Open the Test Configurations panel.
2. Right-click the desired parent Test Configuration, then choose New Child from the shortcut menu.

To create a child from a Test Configuration available via HTTP:
1. Open the Test Configurations panel.
2. Right-click the User-Defined node, then choose New Child from the shortcut menu.
3. In the dialog that opens, enter the URL for the desired parent Test Configuration (http://config_address). For example: http://SOAtest.acme.com/configs/static.properties

To disconnect a child from its parent:
1. Open the Test Configurations panel.
2. Click the Disconnect button to the right of the Parent field.
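The inheritance rule described above (a preference set on the child wins; otherwise the parent's value is used, recursively) behaves like a chained lookup. A Python sketch with made-up preference keys:

```python
class TestConfiguration:
    """Minimal model of Test Configuration inheritance: each preference
    falls back to the parent when not set locally (recursively)."""
    def __init__(self, name, prefs, parent=None):
        self.name = name
        self.prefs = prefs      # preferences set directly on this config
        self.parent = parent    # at most one parent; no multiple inheritance

    def get(self, key):
        if key in self.prefs:
            return self.prefs[key]
        if self.parent is not None:
            return self.parent.get(key)
        return None

# MyConfig3 inherits from MyConfig2, which inherits from MyConfig1.
# The preference keys here are illustrative, not real SOAtest settings.
cfg1 = TestConfiguration("MyConfig1", {"rules.enabled": "true", "report.format": "html"})
cfg2 = TestConfiguration("MyConfig2", {"report.format": "pdf"}, parent=cfg1)
cfg3 = TestConfiguration("MyConfig3", {}, parent=cfg2)
print(cfg3.get("rules.enabled"))  # "true", inherited from MyConfig1
print(cfg3.get("report.format"))  # "pdf", inherited from MyConfig2
```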


Important Notes

• It is not possible to change the parent of a Test Configuration. Test Configurations that inherit from a parent must be created that way from the start using the "New Child" action.
• Once a child Test Configuration is disconnected from its parent, it cannot be reconnected. All the inherited settings are applied directly in the child when it is disconnected.
• Each Test Configuration may have at most one parent configuration. Multiple inheritance is not supported.


Running Tests and Analysis

In this section:
• Testing from the GUI
• Testing from the Command Line Interface (soatestcli)
• Testing from the Web Service Interface


Testing from the GUI

This topic explains the general procedure for running tests from the SOAtest GUI. Sections include:
• Running a Test
• Reviewing Results
• Fine-Tuning Test Settings

Running a Test

SOAtest can perform a variety of tasks, from static analysis, to functional testing, to regression testing. To start using SOAtest to achieve your goals, you run a test based on a default or custom test scenario, which defines the precise nature and scope of SOAtest’s analysis. All preconfigured Test Configurations are described in “Built-in Test Configurations”, page 741. The procedure for creating a custom Test Configuration is described in “Creating Custom Test Configurations”, page 244.

The general procedure for testing from the GUI is as follows:

1. In the Test Case Explorer or Navigator view, select the resources you want to analyze or the tests you want to execute. You can use Ctrl + click or Shift + click to select multiple resources.
2. Start the test in one of the following ways:
• To run the Favorite Configuration (which executes the selected tests), perform one of the following actions:
  • Click the Test Using button in the toolbar.
  • Choose SOAtest> Test Using [Favorite Configuration] from the menu bar.
  • Right-click the resource, then choose SOAtest> Test Using [Favorite Configuration] from the shortcut menu.
• To run another Test Configuration, perform one of the following actions:
  • Choose the appropriate Test Configuration from the Test Using section of the Test Using button’s pull-down menu.
  • Choose the appropriate Test Configuration from the SOAtest> Test Using menu in the menu bar.
  • Choose the appropriate Test Configuration from the SOAtest> Test History menu in the menu bar. Note that this menu contains only the most recently-run Test Configurations.
  • Right-click the selection, then choose the appropriate Test Configuration from the SOAtest> Test Using shortcut menu.
  • Right-click the selection, then choose the appropriate Test Configuration from the SOAtest> Test History shortcut menu.

"Grayed-Out" Test Configurations = Incompatible Test Configurations

If a Test Configuration is "grayed out," this indicates that it was created with an incompatible version of SOAtest and cannot be applied with the current version.


SOAtest will then run the test scenario defined by the selected Test Configuration.

Reviewing Results

Test progress and results summaries will be reported in the Testing panel that SOAtest opens when it starts the test. Detailed results will be reported in the SOAtest view. Drill down to see details about the test findings. For details on reviewing results, see “Viewing Results”, page 290. For details on producing a report for the test, see “Generating Reports”, page 295.

Fine-Tuning Test Settings

To change test settings (such as which rules are checked), edit an existing Test Configuration or create a new one, then run a test using the modified/new Test Configuration. Test Configurations and all related parameters can be viewed, edited, and modified in the Test Configurations dialog. To open this dialog, choose SOAtest> Test Configurations from the menu bar. For information on configuring test parameters, see “Creating Custom Test Configurations”, page 244.


Testing from the Command Line Interface (soatestcli)

This topic explains how to run a test from the SOAtest command line interface (soatestcli). Sections include:
• Introduction
• Running a Test
• cli Overview
• cli Options
• Local Settings (Options) Files
• Using Variables in Local Settings (Options) Files
• cli Exit Codes

Migrating Your Automated Nightly Process from an Earlier Version of SOAtest or WebKing

For help migrating your existing SOAtest automated nightly process from earlier versions of SOAtest or WebKing, see “Command Line Interface (cli) Migration”, page 51.

Introduction

Prerequisites:
• The command line mode requires a command line interface license (available with SOAtest Server Edition).
• Before you can run a test from the command line, you need to set up a project, .tst file, and test suites. See “Adding Projects, .tst files, and Test Suites”, page 308 for details.

About the cli

SOAtest’s command line interface (soatestcli) allows you to perform static analysis and execute tests as part of an automated nightly process. Command line mode is available for the Server Edition of SOAtest. soatestcli can send results to the Parasoft Report Center, send comprehensive reports to the team manager and to the Parasoft Team Server, and send focused reports to each team developer and tester working on the SOA project. Reports can be generated in a number of formats. Details such as reporting preferences (who reports should be sent to, how those reports should be labelled, what mail server and domain should be used, etc.), Team Server settings, Report Center settings, email settings, and license settings can be controlled by local settings files.
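Local settings files are plain key=value properties files. A Python sketch that generates one; report.mail.on.error.only is the only key documented later in this topic, and the other key name is a placeholder, not necessarily a real SOAtest setting:

```python
def write_localsettings(path, settings):
    """Write a Java-style properties file: one key=value pair per line."""
    with open(path, "w") as fh:
        for key, value in settings.items():
            fh.write(f"{key}={value}\n")

# report.mail.on.error.only is documented in this topic; the other key
# name below is an illustrative placeholder only.
write_localsettings("localsettings.properties", {
    "report.mail.on.error.only": "true",
    "example.team.server.host": "teamserver.example.com",
})
```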

The optimal team configuration is to have one SOAtest (Server Edition) on the team build machine, SOAtest (Professional Edition) on every developer/tester workstation, one SOAtest (Architect Edition) on the architect’s machine, and one installation of Team Server on the team build machine or another team machine.


Engineers use their local installations of SOAtest to develop and run tests, then check their work in to source control. Every night, soatestcli runs on a team machine. Depending on the configuration, it may execute the available tests, monitor policy adherence, and/or perform the specified static analysis tests. After the test completes, engineers can import test results into the SOAtest GUI to facilitate task review and resolution. Additionally, SOAtest sends results to the Parasoft Report Center, emails each team member a report that contains only the tasks assigned to him or her, emails group managers a report that lists all team/project quality tasks and identifies which team member is responsible for each task, and uploads reports and results to Team Server. Throughout the process, Team Server manages the sharing and updating of test settings and test files; this standardizes tests across the team and helps team members leverage one another’s work. The standardized test settings and custom team rules are configured and maintained by the team architect, who uses SOAtest Architect Edition.

Running a Test

The general procedure for testing from the command line is to use the soatestcli utility, with appropriate options, to launch analysis in command-line mode. A complete list of options is provided in “cli Options”, page 260. Key options are:

• -data: Specifies the Eclipse workspace location.
• -config: Specifies the test configuration.
• -resource: Specifies the test suite(s) to run. To run a single test suite, specify the path to <test suite name.tst> relative to the workspace. To run all test suites within a directory, specify the directory path relative to the workspace.
• -publish: Publishes test results to Team Server.
• -report: Generates a report.
• -localsettings: Passes advanced settings for Team Server/Report Center/mail reporting. Options are described in “Local Settings (Options) Files”, page 266.

If the SOAtest installation is not on your path, launch soatestcli with the full path to the executable.

cli Overview

The general form of invocation for soatestcli is:

• Windows: soatestcli.exe [OPTIONS]
• UNIX: soatestcli [OPTIONS]

Typically, invocations follow this pattern:

• Windows: soatestcli.exe -data %WORKSPACE_DIR% -resource resource_to_test | local_settings_file -config %CONFIG_URL% -report %REPORT_FILE%
• UNIX: soatestcli -data %WORKSPACE_DIR% -resource resource_to_test | local_settings_file -config %CONFIG_URL% -report %REPORT_FILE%
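When wiring soatestcli into a build script, the typical invocation above can be assembled programmatically. A Python sketch; the workspace path, resource, and configuration names in the example are placeholders:

```python
def build_soatestcli_args(workspace, resource, config_url, report_file,
                          localsettings=None, exe="soatestcli"):
    """Assemble the argument list for a typical soatestcli invocation,
    following the pattern shown above."""
    args = [exe, "-data", workspace, "-resource", resource,
            "-config", config_url, "-report", report_file]
    if localsettings:
        args += ["-localsettings", localsettings]
    return args

# Example with placeholder values; pass the list to subprocess.run(...)
# to actually execute it.
print(build_soatestcli_args("c:/myWorkspace", "tests",
                            "user://Example Configuration",
                            "c:/reports/Report1"))
```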


Examples

soatestcli.exe -config "user://Example Configuration"
Runs all tests in the default workspace (for standalone installations, usually c:\documents and settings\<username>\soatest\workspace) with the user-defined Test Configuration named "Example Configuration."
Note: A user-defined configuration is local to the specified workspace, and the configuration "Example Configuration" is automatically created and set to be the default configuration by SOAtest when the workspace is first created.

soatestcli.exe -import c:\myProject
Imports a project from c:\myProject into the default workspace. Once this is done, the project does not need to be imported into the same workspace again.

soatestcli.exe -config "user://Example Configuration" -data "c:\myWorkspace"
Runs all tests in the workspace at c:\myWorkspace. Tests in projects that have not been imported will not be run. The "Example Configuration" Test Configuration is used.

soatestcli.exe -config "user://Example Configuration" -resource "tests/myTest.tst"
Runs the test suite file 'myTest.tst' located in the 'tests' project of the default workspace. The "Example Configuration" Test Configuration is used. The project 'tests' must have already been imported into the workspace.

soatestcli.exe -config "user://Example Configuration" -resource "tests" -report c:\reports\Report1
Runs all tests in the 'tests' project folder of the default workspace, and saves the report at c:\reports\Report1. The project 'tests' must have already been imported into the workspace.

Using -data to Specify Your Eclipse Workspace

If you are not in the same directory as the Eclipse workspace that you want to test, you need to use soatestcli with the -data option. For example, this Windows command tests the SOAtest Example project by applying the "My Configuration" Test Configuration, generates a report of results, and saves that report in the c:\reports\Report1 directory:

soatestcli -data "c:\Documents and Settings\cynthia\Application Data\Parasoft\SOAtest\workspace" -resource "SOAtest Example" -config user://"My Configuration" -report c:\reports\Report1

If you are in the same directory as the workspace that you want to test, you can call soatestcli without the -data option. For example, this Windows command performs the same test:

soatestcli -resource "SOAtest Example" -config user://"My Configuration" -report c:\reports\Report1


cli Options

Available soatestcli options are listed in the following tables.

Double Quotes vs. Single Quotes: Use "double quotes" (not 'single quotes') to specify options. For example: -config team://"Our Configuration"

Option: -data %WORKSPACE_DIR%
Purpose: Specifies the location of the Eclipse workspace directory to use.
Notes: Defaults to the current user’s dependent directory.

Option: -import %PROJECT%
Purpose: Imports the specified projects into the Eclipse workspace.
Notes: For example, the command can be used as follows to import the project at location "C:\Documents and Settings\user\My Test Project" into the current workspace:
soatestcli.exe -import "C:\Documents and Settings\user\My Test Project"
The -import command-line argument supports project paths that are relative to the current working directory. Typically, the current working directory is the current directory on the command line from which the command is given.


Option: -resource %RESOURCE%
Purpose: Specifies the test suite(s) to run.
Notes: To run a single test suite, specify the path to <test suite name.tst> relative to the workspace. To run all test suites within a directory, specify the directory path relative to the workspace. Use multiple times to specify multiple resources. Use quotes when the resource path contains spaces or other non-alphanumeric characters. If %RESOURCE% is a .properties file, the value corresponding to com.parasoft.xtest.checkers.resources will be interpreted as a colon(:)-separated list of resources; only one properties file can be specified in this way. If no resources are specified on the command line, the complete workspace will be tested. Paths (even absolute ones) are relative to the workspace specified by the -data parameter. Examples:
-resource "Acme Project"
-resource "/MyProject/tests/acme"
-resource testedprojects.properties

Option: -config %CONFIG_URL%
Purpose: Specifies that you want to run the Test Configuration available at %CONFIG_URL%.
Notes: This parameter is required. %CONFIG_URL% is interpreted as a URL, the name of a Test Configuration, or the path to a local file. Examples:
• By filename: -config "mylocalconfig.properties"
• By URL: -config "http://intranet.acme.com/SOAtest/team_config.properties"
• Built-in configurations: -config "builtin://Demo Configuration" or -config "Demo Configuration"
• User-defined configurations: -config "user://My First Configuration"
• Team configurations: -config "team://Team Configuration" or -config "team://teamconfig.properties"

Option: -localsettings %LOCALSETTINGS_FILE%
Purpose: Reads the local settings file %LOCALSETTINGS_FILE% for global preferences. These settings specify details such as Report Center settings, email settings, and Team Server settings.
Notes: The local settings file is a properties file. These files can control reporting preferences (who reports should be sent to, how those reports should be labelled, what mail server and domain should be used, etc.), Team Server settings, Report Center settings, email settings, and more. For details on creating local settings files, see “Local Settings (Options) Files”, page 266.


Option: -publish
Purpose: Publishes the reports to the Team Server.
Notes: The Team Server location can be specified in the GUI or in the local settings file (described in the -localsettings %LOCALSETTINGS_FILE% entry).

Option: -report %REPORT_FILE%
Purpose: Generates an XML report to the given file %REPORT_FILE% and adds an HTML (or PDF or custom format, if specified using the report.format option) report with the same name, but a different extension, in the same directory.
Notes: All of the following commands will produce an HTML report filename.html and an XML report filename.xml:
-report filename.xml
-report filename.htm
-report filename.html
If the specified path ends with an ".html"/".htm"/".xml" extension, it will be treated as a path to the report file to generate. Otherwise, it will be treated as a path to a directory where reports should be generated. If the file name is explicitly specified in the command and a file with this name already exists in the specified location, the previous report will be overwritten. If your command doesn’t explicitly specify a file name, the existing report file will not be overwritten; the new file will be named repXXXX.html, where XXXX is a random number. If the -report option is not specified, reports will be generated with the default names report.xml/html in the current directory.

Option: -startStubServer
Purpose: Starts the stub server.

Option: -router matchWhole <searchURI:URI> <replaceURI:URI>
Purpose: Specifies search and replace arguments. Starts the stub server and readies stubs for use as endpoints by a project.
Notes: For example:
-router searchURI:host1.adobe.com replaceURI:host2.adobe.com
OR
-router searchURI:* replaceURI:http://host2.adobe.com/service
This feature is now deprecated. Please use Environments instead.
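The -report path handling described in the table (a recognized .html/.htm/.xml suffix names the report file; any other path is treated as a directory, with a repXXXX.html name generated to avoid overwriting) can be mimicked as follows. The number of digits in XXXX is an assumption:

```python
import os
import random

REPORT_EXTENSIONS = (".html", ".htm", ".xml")

def resolve_report_path(path):
    """Mimic the documented -report behavior: a recognized extension is
    treated as an explicit report file (may overwrite an existing one);
    any other path is treated as a directory, and a randomized
    repXXXX.html name is generated to avoid overwriting."""
    if path.lower().endswith(REPORT_EXTENSIONS):
        return path  # explicit file name; an existing report is overwritten
    name = "rep%04d.html" % random.randrange(10000)  # digit count assumed
    return os.path.join(path, name)

print(resolve_report_path("results/report.xml"))  # used as-is
```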


Option: -testName [match:] <test name>
Purpose: Specifies test name patterns.
Notes: Allows you to specify the name of the test in the test suite to run. SOAtest will find tests that contain the specific string specified, but it does not perform actual pattern matching (such as wildcards or regular expressions). For example, -testName match:something will run all tests whose names contain the word something. To run multiple tests, use -testName name1 -testName name2, where name1 and name2 correspond to the names of the desired tests. Note that you can surround the value with quotes in order to allow spaces in the name. For example, -testName match:"hello world" will search for a test with the exact string hello world in its name.

Option: -environment <environment_name>
Purpose: Specifies environment options.
Notes: When running functional tests from the command line, you can override the active environment specified in a project with one specified from the command line. Note that if the specified environment is not found in the project, the default active environment will be used instead.

Option: -dataSourceRow <row> -dataSourceName <name>
Purpose: Runs tests with a single data source row.
Notes: The -dataSourceName <name> argument is optional. For example:
-dataSourceRow 5 will cause any test that is using a data source to run with row 5.
-dataSourceRow 5 -dataSourceName "Data" will cause any test that is using a data source named "Data" to run with row 5.

Option: -Centrasite
Purpose: Reports test results to the Software AG CentraSite Active SOA registry.
Notes: Allows you to send results back to the Software AG CentraSite Active SOA registry. For details, see “Using Software AG CentraSite Active SOA with SOAtest”, page 682.

Option: -qualityCenter, -qualityCenterReportAllTraffic
Purpose: Reports test results to HP Quality Center.
Notes: Allows you to send results back to HP Quality Center. For details, see “Using HP with SOAtest”, page 659.

Option: -testManager, -testManagerVerbose
Purpose: Reports test results to Rational TestManager. Verbose mode provides more information, such as request and response traffic.
Notes: Allows you to send results back to Rational TestManager. For details, see “Using IBM/Rational with SOAtest”, page 672.
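As the table notes, -testName matching is plain substring containment rather than wildcard or regular-expression matching. That selection rule can be sketched as:

```python
def select_tests(test_names, patterns):
    """Return tests whose names contain any of the given strings.
    Mirrors the documented -testName behavior: substring containment
    only; no wildcard or regular-expression matching."""
    return [name for name in test_names
            if any(p in name for p in patterns)]

tests = ["login something test", "checkout test", "hello world test"]
print(select_tests(tests, ["something"]))    # ['login something test']
print(select_tests(tests, ["hello world"]))  # ['hello world test']
```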


Option: -visualStudio
Purpose: Reports test results to Microsoft Visual Studio Team System.
Notes: Allows you to send results back to Microsoft Visual Studio Team System. For details, see “Using Microsoft with SOAtest”, page 666.

Option: -include %PATTERN%, -exclude %PATTERN%
Purpose: Specifies files to be included/excluded during testing.
Notes: You must specify a file name or path after this option. Patterns specify file names, with the wildcards * and ? accepted, and the special wildcard ** used to specify one or more path name segments. Syntax for the patterns is similar to that of Ant filesets. Examples:
-include **/Bank.xml (test Bank.xml files)
-include **/ATM/Bank/*.xml (test all .xml files in the folder ATM/Bank)
-include c:/ATM/Bank/Bank.xml (test only the c:/ATM/Bank/Bank.xml file)
-exclude **/internal/** (test everything except classes whose path contains the folder "internal")
-exclude **/*Test.xml (test everything except files that end with Test.xml)
Additionally, if a pattern is a file with a .lst extension, it is treated as a file with a list of patterns. For example, if you use -include c:/include.lst and include.lst contains the following (each line is treated as a single pattern):
**/Bank.xml
**/ATM/Bank/*.xml
c:/ATM/Bank/Bank.xml
then it has the same effect as specifying:
-include **/Bank.xml -include **/ATM/Bank/*.xml -include c:/ATM/Bank/Bank.xml

Option: -encodepass <plain password>
Purpose: Generates an encoded version of a given password.
Notes: Prints the message 'Encrypted password: <encpass>' and terminates the cli app. Must be used along with -config <url>.

Option: -showdetails
Purpose: Prints detailed test progress information.
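The -include/-exclude wildcards (* and ? within one path segment, ** spanning segments) follow Ant fileset conventions. A rough Python translation to regular expressions; this sketch simplifies some edge cases of the real Ant rules:

```python
import re

def ant_pattern_to_regex(pattern):
    """Translate an Ant-style fileset pattern to a compiled regex
    (simplified): '**/' matches any number of leading path segments,
    '*' and '?' match within a single segment."""
    out = []
    i = 0
    while i < len(pattern):
        if pattern.startswith("**/", i):
            out.append("(?:.*/)?")  # zero or more whole path segments
            i += 3
        elif pattern.startswith("**", i):
            out.append(".*")        # rest of the path, slashes included
            i += 2
        elif pattern[i] == "*":
            out.append("[^/]*")     # within one segment only
            i += 1
        elif pattern[i] == "?":
            out.append("[^/]")
            i += 1
        else:
            out.append(re.escape(pattern[i]))
            i += 1
    return re.compile("^" + "".join(out) + "$")

def matches(pattern, path):
    return bool(ant_pattern_to_regex(pattern).match(path))

print(matches("**/Bank.xml", "ATM/Bank/Bank.xml"))  # True
```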


Option

Purpose

Notes

-J

Specifies additional JVM options, which in turn get passed to the Eclipse executable via the -vmargs option.

The Eclipse -vmargs argument is used to customize the operation of the Java VM to use to run Eclipse. If specified, this option must come at the end of the command line. Even if not specified on the executable command line, the executable will automatically add the relevant arguments (including the class being launched) to the command line passed into Java using the -vmargs argument. Java Main then stores this value in eclipse.vmargs. Usage is -vmargs [vmargs*] (Executable, Main)

-prefs %PREFS_URL%

Reads the %PREFS_URL% preference URL to import Eclipse workspace preferences.

%PREFS_URL% is interpreted as a URL or the path to a local Eclipse workspace preferences file. The best way to create a workspace preferences file is to use the Export wizard. To do this:

1. Choose File> Export.
2. In the Export Wizard, select Preferences, then click Next.
3. Do one of the following:
   • To add all of the preferences to the file, select Export all.
   • To add only specified preferences to the file, select Choose specific preferences to export, then check the preferences you want to export.
4. Click Browse..., then indicate where you want the preferences file saved.
5. Click Finish.

We recommend that you delete non-applicable properties and keep only critical properties, such as the classpath property. We also recommend that you replace machine/user-specific locations with variables by using the $(VAR) notation. These variables will be replaced with the corresponding Java properties, which can be set at runtime by running soatestcli with -J-D options (for example, soatestcli -J-DHOME=/home/user). Examples:

-prefs "http://intranet.acme.com/SOAtest/workspace.properties"
-prefs "workspace.properties"

-help

Displays help information. Does not run testing.


-version

Displays version number.

Does not run testing.

-initjython, installcertificate, uninstallcertificate

Installer options

N/A

Notes

• To see a list of valid command line options, run soatestcli -help.
• soatestcli automatically emails designated group managers and architects a report that lists all team/project tasks and identifies which team member is responsible for each task. If no tasks are reported, reports will still be sent unless the local settings file contains the report.mail.on.error.only=true option.
• If the appropriate prerequisites are met, soatestcli automatically emails each team member a report that contains only the tasks assigned to him or her. If no tasks are assigned to a particular team member, he or she will not be emailed a report.
• For more details about options that are inherited from Eclipse, see the Eclipse documentation.

Local Settings (Options) Files

Local settings files can control report settings, Report Center settings, task assignment settings, and Team Server settings. You can create different local settings files for different projects, then use the -localsettings option to indicate which file should be used for the current command line test.

Each local settings file must be a simple text file. There are no name or location requirements. Each setting should be entered on a single line. If a parameter is specified in this file, it will override the related parameter specified from the GUI. If a parameter is not specified in this file, SOAtest will use the parameter specified in the GUI.
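As a sketch of this precedence, the following hypothetical Python helpers parse a local settings file (one key=value setting per line, '#' comments ignored) and overlay it on GUI values. The function names are illustrative, not part of SOAtest:

```python
def parse_local_settings(text):
    """Parse a local settings file: one key=value setting per line."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blank lines and comments
        key, sep, value = line.partition('=')
        if sep:
            settings[key.strip()] = value.strip()
    return settings

def effective_settings(gui_settings, local_settings):
    """File settings override GUI settings; unspecified keys fall back to the GUI."""
    merged = dict(gui_settings)
    merged.update(local_settings)
    return merged
```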

Creating a Local Settings File by Exporting Your GUI Preferences

The fastest and easiest way to create a local settings file is to export your preferences from the GUI:

1. Choose SOAtest> Preferences. The Preferences dialog will open.
2. Select SOAtest (the root element in the left tree) in the Preferences dialog.
3. Specify which preferences you want to export.
4. Click the Export button, then specify the file where you want the settings saved.
   • If you select an existing file, the source control settings will be appended to that file. Otherwise, a new file will be created.
   • Exported passwords will be encrypted.

Local settings files can determine the following settings:

• Reporting Settings
• Report Center Settings
• Team Server Settings
• Licensing Settings
• Technical Support Settings
• Authorship Settings
• Source Control Settings
• Miscellaneous Settings

Notes

• Each setting should be entered on a single line.
• If your local settings file contains any invalid settings, details will be reported in the command line output.

Reporting Settings

Setting

Purpose

report.authors_details

Determines whether the report includes an overview of the number and type of tasks assigned to each team member. The default is true.

report.contexts_details

Determines whether the report includes an overview of the files that were checked or executed during testing. The default is false.

report.custom.extension report.custom.xsl.file

Specifies the location and extension of the XSL file for a custom format. Used with report.format=custom. See “Support for Custom Report Formats”, page 299 for details and examples.

report.developer_errors=true|false

Determines whether manager reports include details about team member tasks.

report.developer_reports=true|false

Determines whether the system generates detailed reports for all team members (in addition to a summary report for managers).

report.format=html|pdf|custom

Specifies the report format.


report.generate_htmls=true|false

Determines whether HTML reports are generated and saved on the local filesystem. XML reports are generated and saved regardless of this setting’s value. The default setting is true.

report.graph.cs_start_date=[MM/dd/yy]

Determines the start date for trend graphs that track static analysis tasks over a period of time. See “Understanding Reports”, page 300 for more details on these reports.

report.graph.ue_start_date=[MM/dd/yy]

Determines the start date for trend graphs that track test execution results over a period of time. See “Understanding Reports”, page 300 for more details on these reports.

report.mail.attachments=true|false

Determines whether reports are sent as attachments. All components are included as attachments; before you can view an HTML report with images, all attachments must be saved to disk. The default setting is false.

report.mail.cc=[email_addresses]

Specifies where to mail comprehensive manager reports. This setting must be followed by a semicolon-separated list of email addresses. It is typically used to send reports to managers or architects. It can also be used to send comprehensive reports to team members if such reports are not sent automatically (for example, because authorship is not being determined by SOAtest).

report.mail.domain=[domain]

Specifies the mail domain used to send reports.


report.mail.enabled=true|false

Determines whether reports are emailed to team members and to the additional recipients specified with the cc setting. Remember that each team member with assigned tasks will automatically be sent a report that contains only the assigned tasks.

report.mail.exclude=[email_addresses]

Specifies any email addresses you do not want to receive reports. This setting is used to prevent SOAtest from automatically sending reports to someone who worked on the tests or source code, but should not be receiving reports.

report.mail.exclude.developers=true|false

Specifies whether reports should be mailed to any team member whose email is not explicitly listed in the report.mail.cc property. This setting is used to prevent reports from being mailed to individual team members.

report.mail.format=html|ascii

Specifies the email format.

report.mail.from=[email_address OR user_name_of_the_same_domain]

Specifies the "from" line of the emails sent.

report.mail.include=[email_addresses]

Specifies the email addresses of team members that you want to receive individual reports. This setting must be followed by a semicolon-separated list of email addresses. This setting is typically used when such reports are not sent automatically (for example, because the team is not using SOAtest to assign tasks). It overrides the settings specified in the 'exclude' list.

report.mail.on.error.only=true|false

Determines whether reports are sent to the manager only if a task is generated or a fatal exception occurs. Team member emails are not affected by this setting; individual emails are sent only to team members who are responsible for reported tasks. The default setting is false.

report.mail.server=[server]

Specifies the mail server used to send reports.


report.mail.subject=My New Subject

Specifies the subject line of the emails sent. The default subject line is "SOAtest Report". For example, if you want to change the subject line to "SOAtest Report for Project A", you would use report.mail.subject=SOAtest Report for Project A

report.mail.time_delay=[server]

Specifies a time delay between emailing reports (to avoid bulk email restrictions).

report.mail.unknown=[email_address OR user_name_of_the_same_domain]

Specifies where to mail reports for tasks assigned to "unknown".

report.mail.username=[username] report.mail.password=[password] report.mail.realm=[realm]

Specifies the settings for SMTP server authentication. The realm value is required only for those servers that authenticate using SASL realm.

report.active_rules=true|false

Determines if reports contain a list of the rules that were enabled for the test.

report.suppressed_msgs=true|false

Determines whether HTML reports include suppressed messages. The default setting is false.

report.tag=[name]

Specifies the name of the group that is responsible for the project. This value is used for uploading summary results to Team Server. The tag is an identifier of the module checked during the analysis process. Reports for different modules should be marked with different tags.

report.test_params=true|false

Determines whether HTML reports include test parameter details. The default setting is false.

Report Center Settings


Setting

Purpose

grs.enabled=true|false

Determines whether the current SOAtest installation is connected to Report Center. This setting is not needed if you want to use the value specified in the GUI.

grs.server=[server]

Specifies the host name of the Report Center server. This setting is not needed if this information is specified in the GUI.

grs.port=[port]

Specifies the port number of the Report Center data collector. This setting is not needed if you want to use the value specified in the GUI.

grs.user_defined_attributes=[attributes]

Specifies the user-defined attributes for Report Center. Use the format key1:value1;key2:value2. For more details on attributes, see “Configuring Report Center Attributes”, page 204. This setting is not needed if you want to use the value specified in the GUI.

grs.log_as_nightly=true|false

Determines whether the results sent to Report Center are marked as being from a nightly build.

grs.use_resource_attributes=true|false

Determines whether Report Center attributes specified in the GUI at the project level should be used. This allows you to disable project-level Report Center attributes.
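The key1:value1;key2:value2 attribute format used by grs.user_defined_attributes can be illustrated with a small Python sketch. The helper name is hypothetical, not SOAtest code:

```python
def parse_attributes(raw):
    """Split a 'key1:value1;key2:value2' attribute string into a dict."""
    attrs = {}
    for pair in raw.split(';'):
        pair = pair.strip()
        if not pair:
            continue  # tolerate trailing or doubled semicolons
        key, _, value = pair.partition(':')
        attrs[key.strip()] = value.strip()
    return attrs
```

For example, `parse_attributes('Type:Nightly;Project:Project1')` yields the two attribute pairs Type/Nightly and Project/Project1.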

Team Server Settings

Setting

Purpose

tcm.server.enabled=true|false

Determines whether the current SOAtest installation is connected to the Team Server. This setting is not needed if you want to use the value specified in the GUI.


tcm.server.name=[name]

Specifies the machine name or IP address of the machine running Team Server. This setting is not needed if you want to use the value specified in the GUI.

tcm.server.port=[port]

Specifies the Team Server port number. This setting is not needed if you want to use the value specified in the GUI.

report.tag=[name]

Specifies the name of the group that is responsible for the project. This value is used for uploading summary results to Team Server.

tcm.server.accountLogin=true|false tcm.server.username=[username] tcm.server.password=[password]

Determines whether username and password are submitted to connect to Team Server. Usernames/passwords are not always needed; it depends on your team’s setup. If the first setting is true, the second and third settings specify the username and password. Note that Team Server must have the username and password setting already enabled before these settings can be used.

Licensing Settings

Setting

Purpose

soatest.license.use_network=true|false

Determines whether the current SOAtest installation retrieves its license from LicenseServer. This setting is not needed if you want to use the value specified in the GUI. Example: soatest.license.use_network=true


soatest.license.network.host=[host]

Specifies the machine name or IP address of the machine running LicenseServer Configuration Manager. This setting is not needed if you want to use the value specified in the GUI. Example: soatest.license.network.host=10.9.1.63

soatest.license.network.port=[port]

Specifies the LicenseServer port number. This setting is not needed if you want to use the value specified in the GUI. Example: soatest.license.network.port=2002

soatest.license.network.edition=[edition_name]

Specifies the type of license that you want this SOAtest installation to retrieve from LicenseServer. This setting is not needed if you want to use the value specified in the GUI. [edition_name] can be professional_edition, architect_edition, or server_edition. To use a custom edition, do not set anything after the "="; simply leave the value empty. Examples:

soatest.license.network.edition=architect_edition
soatest.license.network.edition=server_edition
soatest.license.network.edition=professional_edition
soatest.license.network.edition=

soatest.license.autoconf.timeout=[seconds]

Specifies the maximum number of seconds SOAtest will wait for the license to be automatically configured from LicenseServer. Default is 10.

soatest.license.local.expiration=[expiration]

Specifies the local license that you want this SOAtest installation to use. This setting is not needed if you want to use the value specified in the GUI.

soatest.license.local.password=[password]

Specifies the local password that you want this SOAtest installation to use. This setting is not needed if you want to use the value specified in the GUI.


Technical Support Settings

Setting

Purpose

techsupport.auto_creation=true|false

Determines whether archives are automatically prepared when testing problems occur.

techsupport.send_email=true|false

Determines whether prepared archives are emailed to Parasoft support. If you enable this, be sure to specify email settings from the GUI or with the options in Reporting Settings.

techsupport.archive_location=[directory]

Specifies where archives are stored.

techsupport.verbose=true|false

Determines whether verbose logs are included in the archive. Note that this option cannot be enabled if the logging system has custom configurations.

• Verbose logs are stored in the xtest.log file within the user-home temporary location (on Windows, this is <drive>:\Documents and Settings\<user>\Local Settings\Temp\parasoft\xtest).
• Verbose logging state is cross-session persistent (restored on application startup).
• The log file is a rolling file: it won't grow over a certain size, and each time it reaches the maximum size, a backup is created.

techsupport.verbose.scontrol=true|false

Determines whether verbose logs include output from source control commands. Note that the output could include fragments of your source code.

techsupport.item.general=true|false

Determines whether general application logs are included.


techsupport.item.environment=true|false

Determines whether environment variables, JVM system properties, platform details, and additional properties (memory, other) are included in the archive.

techsupport.advanced=true|false

Specifies if advanced options will be sent.

techsupport.advanced.options=[option]

Specifies any advanced options that the support team asked you to enter.

Authorship Settings

Setting

Purpose

scope.sourcecontrol=true|false

Determines whether SOAtest computes task assignment based on data from a supported source control system. This setting is not needed if you want to use the value specified in the GUI.

scope.local=true|false

Determines whether SOAtest computes task assignment based on the local user. This setting is not needed if you want to use the value specified in the GUI.

scope.xmlmap=true|false

Specifies whether SOAtest computes task assignment based on XML files that define how you want tasks assigned for particular files or sets of files (these mappings can be specified in the GUI then saved in an XML file).

scope.xmlmap.file=[file]

Specifies the name of the XML file that defines how you want tasks assigned for particular files or sets of files.

Source Control Settings


Defining multiple repositories of the same type

Indexes (numbered from 1 to n) must be added to the prefix if you want to define more than one repository of the same type. For example:

scontrol.rep1.type=ccase
scontrol.rep1.ccase.vob=/vobs/myvob1
scontrol.rep2.type=ccase
scontrol.rep2.ccase.vob=/vobs/myvob2

If you are defining only one repository, you do not need to use an index. For example:

scontrol.rep.type=ccase
scontrol.rep.ccase.vob=/vobs/myvob1
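The indexing convention above can be sketched as follows. This hypothetical Python helper groups scontrol.rep / scontrol.repN properties by repository index and is purely illustrative:

```python
import re

def group_repositories(settings):
    """Group scontrol.rep.* and scontrol.repN.* properties by repository index."""
    repos = {}
    for key, value in settings.items():
        m = re.match(r'scontrol\.rep(\d*)\.(.+)$', key)
        if m:
            index = m.group(1) or '1'   # treat the un-indexed 'rep' prefix as the first repository
            repos.setdefault(index, {})[m.group(2)] = value
    return repos
```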

AccuRev Repository Definition Properties

Property

Description

scontrol.rep.type=accurev

AccuRev repository type identifier.

scontrol.rep.accurev.host=

AccuRev server host.

scontrol.rep.accurev.port=

AccuRev server port. Default port is 1666.

scontrol.rep.accurev.login=

AccuRev user name.

scontrol.rep.accurev.password=

AccuRev password.

ClearCase Repository Definition Properties

Property

Description

scontrol.ccase.exec=

Path to external client executable (cleartool).

scontrol.rep.type=ccase

ClearCase repository type name.

scontrol.rep.ccase.vob=

Path inside the VOB. The ccase.vob value + File.separator must be a valid path to a ClearCase-controlled directory.

CVS Repository Definition Properties


Property

Description

scontrol.rep.type=cvs

CVS repository type identifier.

scontrol.rep.cvs.root=

Full CVSROOT value.

scontrol.rep.cvs.pass=

Plain or encoded password. The encoded password should be the same as in the .cvspass file. When you first log in to the CVS repository from the command line using "cvs login", the password is saved in the registry. To retrieve it, open the registry (using regedit) and look for the value under HKEY_CURRENT_USER -> CVSNT -> cvspass. The entry shows the full login name (for example, :pserver:[email protected]:/exampleC) together with the encrypted password value.

scontrol.rep.cvs.useCustomSSHCredentials=

Determines whether the cvs login and password should be used for EXT/SSH connections. Allowed values are true and false. It is disabled by default.

scontrol.rep.cvs.ext.server

If connecting to a CVS server in EXT mode, this specifies which CVS application to start on the server side. Has the same meaning as the CVS_SERVER variable. cvs is the default value.

scontrol.rep.cvs.ssh.loginname=

Specifies the login for SSH connections (if an external program can be used to provide the login).

scontrol.rep.cvs.ssh.password=

Specifies the password for SSH connection.

scontrol.rep.cvs.ssh.keyfile=

Specifies the private key file to establish an SSH connection with key authentication.

scontrol.rep.cvs.ssh.passphrase=

Specifies the passphrase for SSH connections with the key authentication mechanism.

scontrol.rep.cvs.useShell=

Enables an external program (CVS_RSH) to establish a connection to the CVS repository. Allowed values are true and false. It is disabled by default.

scontrol.rep.cvs.ext.shell=

Specifies the path to the executable to be used as the CVS_RSH program. Command line parameters should be specified in the cvs.ext.params property.


scontrol.rep.cvs.ext.params=

Specifies the parameters to be passed to the external program. The following case-sensitive macro definitions can be used to expand values into command line parameters:

• {host} - repository host
• {port} - port
• {user} - cvs user
• {password} - cvs password
• {extuser} - the cvs.ssh.loginname parameter
• {extpassword} - the cvs.ssh.password parameter
• {keyfile} - the cvs.ssh.keyfile parameter
• {passphrase} - the cvs.ssh.passphrase parameter

Perforce Repository Definition Properties

Property

Description

scontrol.perforce.exec=

Path to external client executable (p4).

scontrol.rep.type=perforce

Perforce repository type identifier.

scontrol.rep.perforce.host=

Perforce server host.

scontrol.rep.perforce.port=

Perforce server port. Default port is 1666.

scontrol.rep.perforce.login=

Perforce user name.

scontrol.rep.perforce.password=

Password.

scontrol.rep.perforce.client=

The client workspace name, as specified in the P4CLIENT environment variable or its equivalents. The workspace's root dir should be configured for local path (so that files can be downloaded).

Serena Dimensions Repository Definition Properties


Linux and Solaris Configuration Note

To use Serena Dimensions with SOAtest, Linux and Solaris users should run SOAtest in an environment prepared for using Serena programs such as 'dmcli':

• LD_LIBRARY_PATH should contain the path to <SERENA Install Dir>/libs.
• DM_HOME should be specified.

Since many Solaris users commonly set the required Serena variables by running the Serena dmgvars.sh file, it is also necessary to modify the LD_LIBRARY_PATH variable. To use Serena Dimensions with SOAtest, LD_LIBRARY_PATH needs to include the following items (paths can differ on client machines):

• SSL/Crypto library - /usr/local/ssl/lib
• STDC++ library - /usr/local/lib

Property

Description

scontrol.rep.type=serena

Serena Dimensions repository type identifier.

scontrol.rep.serena.host=

Serena Dimensions server host name.

scontrol.rep.serena.dbname=

Name of the database for the product you are working with.

scontrol.rep.serena.dbconn=

Connection string for that database.

scontrol.rep.serena.login=

Login name.

scontrol.rep.serena.password=

Password.

scontrol.rep.serena.mapping=

Maps workspace resources to Serena Dimensions repository paths.

• Example 1: If you use scontrol.rep.serena.mapping_1=${project_loc\:MyProject};PRODUCT1\:WORKSET1;src\\MyProject, then the project 'MyProject' will be mapped to the Serena workset PRODUCT1:WORKSET1 with the workset-relative path src\\MyProject.

• Example 2: If you use scontrol.rep.serena.mapping_2=${workspace_loc};PRODUCT1\:WORKSET1, then the complete workspace will be mapped to the Serena workset PRODUCT1:WORKSET1.


StarTeam Repository Definition Properties

Property

Description

scontrol.rep.type=starteam

StarTeam repository type identifier.

scontrol.rep.starteam.host=

StarTeam server host.

scontrol.rep.starteam.port=

StarTeam server port. Default port is 49201.

scontrol.rep.starteam.login=

Login name.

scontrol.rep.starteam.password=

Password (not encoded).

Subversion Repository Definition Properties

Property

Description

scontrol.rep.type=svn

Subversion repository type identifier.

scontrol.rep.svn.url=

Subversion URL specifies protocol, server name, port and starting repository path (e.g., svn://buildmachine.foobar.com/home/svn).

scontrol.rep.svn.login=

Login name.

scontrol.rep.svn.password=

Password (not encoded).

scontrol.svn.exec=

Path to external client executable (svn).

CM Synergy Repository Definition Properties

Property

Description

scontrol.rep.type=synergy

Synergy/CM repository type identifier.

scontrol.rep.synergy.host=

Computer on which synergy/cm engine runs. Local host is used when missing.

scontrol.rep.synergy.dbpath=

Absolute Synergy database path, e.g., \\host\db\name (backslash symbols '\' in UNC/Windows paths must be doubled).

scontrol.rep.synergy.projspec=

Synergy project spec, which contains the project name and its version, e.g., name-version.

scontrol.rep.synergy.login=

Synergy user name.

scontrol.rep.synergy.password=

Synergy password (not encoded).

scontrol.rep.synergy.port=

Synergy port.


Property

Description

scontrol.rep.synergy.remote_client=

(UNIX only) Specifies that you want to start ccm as a remote client. Default value is false. Optional.

scontrol.rep.synergy.local_dbpath=

Specifies the path name to which your database information is copied when you are running a remote client session. If null, then the default location will be used.

scontrol.synergy.exec=

Path to external client executable (ccm)

Microsoft Visual Source Safe Repository Definition Properties

Property

Description

scontrol.rep.type=vss

Visual SourceSafe repository type identifier.

scontrol.rep.vss.ssdir=

Path of repository database (backslash symbols '\' in UNC/Windows paths must be doubled).

scontrol.rep.vss.projpath=

VSS project path.

scontrol.rep.vss.login=

VSS login.

scontrol.rep.vss.password=

VSS password.

scontrol.vss.exec=

Path to external client executable (ss).

scontrol.vss.lookup=

Determines whether a full VSS database search is performed to find associations between local paths and repository paths. True or false.

Important Notes

• The scontrol.rep(n).vss.ssdir property should contain a UNC value even if the repository database resides locally.

• Be aware of VSS naming syntax, conventions, and limitations. Any character can be used for names or labels, except the following:
  • Dollar sign ($)
  • At sign (@)
  • Angle brackets (< >), brackets ([ ]), braces ({ }), and parentheses (( ))
  • Colon (:) and semicolon (;)
  • Equal sign (=)
  • Caret sign (^)
  • Exclamation point (!)
  • Percent sign (%)
  • Question mark (?)
  • Comma (,)
  • Quotation mark (single or double) (' ")

• VSS 6.0 (build 8163), which is deployed with Visual Studio 6, does not work properly with projects whose names start with a dot (.) symbol. If such a project name is used, subprojects cannot be added.

• Do not use custom working directories for sub-projects (example: project $/SomeProject has the working directory C:\TEMP\VSS\SomeProject and its subproject $/SomeProject/SomeSubProject has the working directory D:\SomeSubProject).

Miscellaneous Settings

Setting

Purpose

report.rules=[url_path_to_rules_directory]

Specifies the directory for rules html files. The default setting is none.

tasks.clear=true|false

Clears existing tasks upon startup in cli mode. This prevents excessive time being spent "loading existing results." The default is true.

classpath.[variable]=[value]

Specifies classpath variables. For example:

classpath.ECLIPSE_HOME=$(ECLIPSE_HOME)
classpath.ECLIPSE_LIB=$(HOME)/dv/ThirdParty/eclipse/2.1.3/$(PS_ARCH)
classpath.ECLIPSE3_LIB=$(HOME)/dv/ThirdParty/eclipse/3.0.0/$(PS_ARCH)
classpath.THIRD_PARTY=$(HOME)/dv/ThirdParty
classpath.SOAtest_CLASSES=$(HOME)/dv/plugins/com.parasoft.eclipse.api.$(os)/SOAtest/bin/
classpath.JUNIT_JAR=$(HOME)/dv/ThirdParty/junit.jar
classpath.SOAtest_ZIP=$(HOME)/dv/plugins/com.parasoft.eclipse.SOAtestplugin/resources/webking.jar
classpath.JUNIT_HOME=$(HOME)/dv/ThirdParty

system.properties.classpath=[path1];[path2];[path3] ...

Specifies which jar files and class folders are in the classpath. For example:

system.properties.classpath=C\:\\myjars\\myLib1.jar;C\:\\myjars\\myLib2.jar


Setting

Purpose

startup.server=true|false

Determines whether the embedded stub server is started.

Here is one sample local settings file named local.properties:

# Team Server settings
tcm.server.enabled=true
tcm.server.name=tcm.mycompany.com
tcm.server.port=18888
tcm.server.accountLogin=true
tcm.server.username=tcm_user
tcm.server.password=tcm_pass
# Report Center settings
grs.enabled=true
grs.server=grs.mycompany.com
grs.port=32323
grs.log_as_nightly=true
# Mail settings
report.mail.enabled=true
report.mail.server=mail.mycompany.com
report.mail.domain=mycompany.com
report.mail.cc=project_manager
report.mail.subject=Coding Standards

Using Variables in Local Settings (Options) Files The following variables can be used in reports, e-mail, Report Center, Team Server, and license settings. Note that report tag value can't contain any ':' characters. env_var example: ${env_var:HOME} Outputs the value of the environmental variable specified after the colon. project_name example: ${project_name} Outputs the name of the tested project. If more than one project is provided as an input, it first outputs the tested project name, then "..." workspace_name example: ${workspace_name} Outputs an empty string in Eclipse. config_name

283

Testing from the Command Line Interface (soatestcli)

$ example: ${config_name} Outputs the name of executed test configuration; applies only to Reports and Email settings. analysis_type $ example: ${analysis_type} Outputs a comma separated list of enabled analysis types (for example: Static, Execution); applies only to Reports and Email settings. tool_name $ example: ${tool_name} Outputs the tool name (for example: SOAtest). Example localsettings file # REPORTS #Determines whether reports are emailed to team members and to the additional recipients specified with the cc setting. #Remember that if the team is using CVS for source control and each team member’s email address matches his or her CVS username + the mail domain, each team member that worked on project code will automatically be sent a report that contains only the tasks/results related to his or her work. report.mail.enabled=true #Exclude team member emails (true/false) report.mail.exclude.developers=false # Append team member tasks to manager emails (true/false) report.developer_errors=true # Send reports to team members (true|false) report.developer_reports=true # Append suppressed messages (true|false) report.suppressed_msgs=false #Determines where to mail complete test reports. #This setting is typically used to send reports to managers or architects. #It can also be used to send reports to team members if team member reports #are not sent automatically (for example, because the team is not using CVS). [email protected]; ${env_var:USERNAME} @domain.com # mail target for unknown team member tasks [email protected] #Specifies the mail server used to send reports. report.mail.server=mail_server.domain.com #Specifies the mail domain used to send reports. report.mail.domain=domain.com #Specify mali from report.mail.from=nightly #Specifies any email addresses you do not want to receive reports. #This setting is used to prevent from automatically sending reports to someone that worked on the code, but should not be receiving reports. 
This setting is only applicable if the team is

284

Testing from the Command Line Interface (soatestcli)

using CVS for source control and team member reports are being sent automatically. report.mail.exclude=developer1;developer2 # Specifies the subject line of the emails sent. report.mail.subject= ${tool_name} Report - ${config_name} # Report test params inculde (true|false) report.test_params=true # Team Server #Determines whether the current installation is connected to the Team Server. tcm.server.enabled=true #Specifies the machine name or IP address of the machine running Team Server. tcm.server.name=tcm_server.domain.com #Specifies the Team Server port number. tcm.server.port=18888 tcm.server.accountLogin=true tcm.server.username=user tcm.server.password=password report.tag= ${config_name} # Report Center #Determines the current installation is connected to Report Center. grs.enabled=true #Specifies the host name of the Report Center server. grs.server=grs_server.domain.com # Specifies the port number of the Report Center report collector. grs.port=32323 # Specifies user-defined attributes for Report Center. #Use the format key1:value1; key2:value2 #Attributes help you mark results in ways that are meaningful to your organization. #They also determine how results are grouped in Report Center and how you can filter results in Report Center. #For example, you might want to label results by project name and/or by project component name. #Each attribute contains two components: a general attribute category name #and a specific identification value. For example, assume your organization wants to classify results by project. #You might then use the attribute project:projname1. For the next project, you could use a different #local settings file that specified an attribute such as project:projname2. grs.user_defined_attributes=Type:Nightly;Project:Project1 # Determines whether the results sent to Report Center are marked as being from a nightly build. 
grs.log_as_nightly=true

# SCOPE
# Task assignment based on CVS
scope.sourcecontrol=true
# Task assignment based on author tag
scope.author=false
# Task assignment based on local user
scope.local=false


# LICENSE
# Override license settings
#soatest.license.autoconf.timeout=40
soatest.license.use_network=true
soatest.license.network.host=license_server.domain.com
soatest.license.network.port=2002
soatest.license.network.edition=server_edition

# SOURCE CONTROL
scontrol.rep1.type=cvs
scontrol.rep1.cvs.root=:pserver:user@cvs_server.domain.com:/home/cvs/
scontrol.rep1.cvs.pass=mypassword
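The grs.user_defined_attributes value uses a simple "key1:value1; key2:value2" format. As an illustrative sketch (the parse_attributes helper below is not part of SOAtest), a build script might parse such a value like this:

```python
# Sketch (not a SOAtest API): parse a Report Center attribute string of the
# form "key1:value1; key2:value2" (as used by grs.user_defined_attributes)
# into a dict of category -> value.
def parse_attributes(raw):
    attrs = {}
    for pair in raw.split(";"):
        pair = pair.strip()
        if not pair:
            continue  # tolerate trailing or doubled semicolons
        key, _, value = pair.partition(":")
        attrs[key.strip()] = value.strip()
    return attrs
```

For example, parse_attributes("Type:Nightly;Project:Project1") yields a dict with the Type and Project categories, matching the example setting above.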

soatestcli Exit Codes

soatestcli uses the following exit codes when it encounters a problem:

Code    Meaning
130     Bad command line (the command line is malformed or refers to a resource that does not exist)
131     Eclipse is already running in the same workspace
133     Parser configuration error (cannot find/instantiate the XML parser)
134     The command line is not licensed
135     The test process exited with an exception; check the error log
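Automation scripts can map these exit codes to readable messages. The following Python sketch is illustrative only (the wrapper function, mapping, and the assumption that 0 means success are not part of SOAtest; soatestcli is assumed to be on the PATH):

```python
import subprocess

# Documented soatestcli exit codes.
EXIT_MEANINGS = {
    130: "bad command line (malformed or refers to a resource that does not exist)",
    131: "Eclipse already running in the same workspace",
    133: "parser configuration error (cannot find/instantiate XML parser)",
    134: "command line is not licensed",
    135: "test process exited with an exception; check the error log",
}

def describe_exit(code):
    """Map a soatestcli exit code to its documented meaning."""
    if code == 0:
        return "success"  # assumption: 0 indicates a clean run
    return EXIT_MEANINGS.get(code, "unexpected exit code {}".format(code))

def run_soatestcli(args):
    """Run soatestcli with the given arguments and report its exit status."""
    proc = subprocess.run(["soatestcli"] + list(args))
    return proc.returncode, describe_exit(proc.returncode)
```

A nightly build could then log describe_exit(code) instead of a bare number when a run fails.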


Testing from the Web Service Interface

This topic explains how to run a test from the SOAtest web service API. Sections include:

•

Prerequisites

•

About the Web Service API

•

Operations

Prerequisites

•

The web service API requires SOAtest Server Edition.

•

Before you can run a test from the web service API, you need to set up a project, .tst file, and test suites. See “Adding Projects, .tst files, and Test Suites”, page 308 for details.

About the Web Service API

SOAtest's web service API enables web service clients to run tests on a remote machine running SOAtest. Web service mode is available for the Server Edition of SOAtest. SOAtest can be started in web service mode as follows:

soatestcli -startStubServer -data <workspace_dir> -localsettings <localsettings_file>

The -data and -localsettings arguments are optional.

•

-data specifies the Eclipse workspace location containing your test cases (.tst files).

•

-localsettings specifies a properties file used to control certain global settings such as license password and Team Server settings.

For more information about these command line arguments, see “Testing from the Command Line Interface (soatestcli)”, page 257.

The SOAtest web service is described by a WSDL document. When SOAtest is running in server mode, this WSDL can be found at http://localhost:9080/axis2/services/SOAtestService?wsdl. The WSDL can be used to generate web service clients. Most web service platforms can generate web service clients from a WSDL document.

Operations

The web service has three operations:

•

startTestExecution: This operation initiates a test run. It returns a "pid" identifier that can be used in subsequent web service calls to check the test execution status and get the test results. The XML schema for the request message corresponds to the command line interface used by Parasoft products (e.g., SOAtest, Jtest, C++test, .TEST). For example, the request must include a "config" element, which maps to the "-config" argument from the command line interface. SOAtest-specific command line extensions, such as "-environment", are also represented in the XML schema for the request message.

•

getExecutionStatus: This operation gets the current execution status of a test run given a "pid". The response message indicates whether the test run is in progress and the percent completed.

•

getResult: This operation returns a test execution summary given the "pid" of a completed test run. The response message can also contain the XML and HTML reports, which can be used to get detailed information about the test run (including the specific test failures).
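Because the request schema mirrors the command line interface, a raw SOAP call to startTestExecution can be sketched as below. The envelope layout, unqualified element names, and missing namespace are assumptions for illustration only; in practice, generate a client from the service WSDL:

```python
import urllib.request

# Default endpoint when SOAtest runs in server mode (see WSDL URL above).
SERVICE_URL = "http://localhost:9080/axis2/services/SOAtestService"

def build_start_request(config):
    """Build a hypothetical startTestExecution envelope.

    Only the "config" element (mapping to the -config CLI argument) is
    documented; everything else here is a placeholder sketch.
    """
    return (
        '<soapenv:Envelope xmlns:soapenv='
        '"http://schemas.xmlsoap.org/soap/envelope/">'
        "<soapenv:Body>"
        "<startTestExecution>"
        "<config>" + config + "</config>"
        "</startTestExecution>"
        "</soapenv:Body>"
        "</soapenv:Envelope>"
    )

def call(envelope):
    """POST the envelope to the service and return the raw response XML."""
    req = urllib.request.Request(
        SERVICE_URL,
        data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

The "pid" returned by startTestExecution would then be passed to getExecutionStatus and getResult in the same manner.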


Reviewing Results

In this section:

•

Viewing Results

•

Generating Reports

•

Configuring Reporting Settings

•

Understanding Reports


Viewing Results

This topic explains how to view results and customize SOAtest’s results display. Sections include:

•

Accessing Results

•

Reviewing the Results

•

Customizing the Results Display

•

Clearing Messages

Accessing Results SOAtest results can be accessed from a variety of locations in the GUI, as well as from command-line reports.

Results from Tests Run in the GUI

Test Progress View

The Test Progress view reports test progress and status.

Note that:

•

When a test runs, the view label changes from "Test Progress" to "Testing [Test Configuration name]".

•

Clicking the Review tasks button displays the results in the SOAtest view.

•

A results summary for each analysis category is available in expandable sections.




The toolbar buttons in the upper right corner of the Test Progress view allow you to generate reports or show/hide details. Generating a report opens the Report dialog, from which you can configure report preferences.

SOAtest View

If SOAtest detects that a quality task needs to be performed (e.g., review a test failure, resolve a policy violation, etc.), it reports a task in the SOAtest view. If this view is not available, choose SOAtest> Show View> SOAtest to open it. To see additional details, drill down into the SOAtest view tree. To toggle through the items reported in the SOAtest view, use the arrow buttons in the SOAtest view toolbar.

Source Code Markers

For static analysis tests that were run on source files, results are also reported at the source code level. For details, see “Accessing Results”, page 605.

Console View

To see testing details, open the Console view during test execution. Testing details are reported here while a test is in process, and remain there until they are cleared or until another test is run.

Test Case Explorer View

The Test Case Explorer indicates the status (pass/fail/not yet executed) of all available functional test cases. For details, see “Reviewing Functional Test Results”, page 379. To view any tasks related to a test listed in the Test Case Explorer, right-click that test’s Test Case Explorer node, then choose Show in Tasks. Any tasks related to that test will then be shown in the SOAtest view.


Tips

•

Many SOAtest tree nodes report the line number at which an error or possible problem occurred. To view the related code, double-click the node that shows the line number, or right-click that node and choose Go to from the shortcut menu. The related editor will then open and highlight the designated line of code.

•

You can use Ctrl + C to copy findings in the SOAtest view, then paste them into another document.

Results from Tests Run from the Command Line

For tests run from the command line, results are recorded in the generated report. If results were sent to Team Server, they can be imported into the GUI as described in “Accessing Results and Reports”, page 234. You can then review the results as if the test had been performed in the GUI. Static analysis results are reported under the Static Analysis category. Functional test results are reported under the Test Execution category.

Reviewing the Results

In the SOAtest view, the results are presented as a task list that helps you determine how to proceed in order to ensure the quality of your system.

Functional Testing

Functional test results are organized by test suite. See “Reviewing Functional Test Results”, page 379 for details.

Static Analysis

Static analysis results should be reviewed in one of the layouts designed especially for static analysis. For details on enabling these layouts and reviewing static analysis results, see “Reviewing Static Analysis Results”, page 605.

How do source code changes affect reported findings?

If you enable Decorate code markers when tasks become out of date in the Configurations page of the SOAtest preferences panel (accessed by choosing SOAtest> Preferences), then any time an element in a results message is outdated (i.e., the current source code does not match the analyzed code), the element, as well as the violation node, will be marked with a special "out of date" icon and tool tip.

Even if this option is not enabled, double-clicking an outdated element to see the related code is disabled, because the related source code is no longer available.


Filtering Results

By default, the SOAtest view shows cumulative results for all tested resources. For example, if you imported results from Team Server, then ran two tests from the GUI, the SOAtest view would show all imported tasks, plus all results from the subsequent two tests. If you prefer to see only results from the last test session or from selected resources, you can filter results. To filter results:

1. Click the filters button in the SOAtest view’s toolbar.

2. Set the desired filter options in the dialog that opens. You can configure it to show only last-session tasks, or only tasks for designated resources.

Customizing the Results Display

Changing the Display Format

There are three available layout templates:

•

Default Layout: For functional testing.

•

Static Analysis Layout: For running static analysis against source code (e.g., from the Scanning perspective).

•

Static Analysis for Functional Tests Layout: For running static analysis by executing a test suite (e.g., a test suite that contains a Browser Testing tool or a Scanning tool).

You can choose the format by opening the pull-down menu on the top right of the SOAtest view, then choosing one of the available formats from the Layout shortcut menu that opens.

Changing Categories

To hide or display a category of problem information (project, category, subcategory, location, user, etc.) in the SOAtest view, click the pull-down menu on the top right of this view, then choose the appropriate command in the Layout> Advanced menu.


Or, right-click that category’s node in the SOAtest view, then choose Hide [category] from the shortcut menu.

Clearing Messages

You might want to clear messages from the SOAtest view to help you focus on the findings that you are most interested in. For example, if you are fixing reported errors, you might want to clear each error message as you fix the related error. That way, the SOAtest view only displays the error messages for the errors that still need to be fixed. Messages that you clear are only removed temporarily. If the same findings occur during subsequent tests, the messages will be reported again. You can clear individual messages, categories of messages represented in the SOAtest view, or all reported messages.

Clearing Selected Messages

To clear selected messages shown in the SOAtest view:

1. Select the message(s) or category of messages you want to delete. You can select multiple messages using Shift + left click or Ctrl + left click.

2. Right-click the message(s) you want to delete, then choose Delete. The selected messages will be removed from the SOAtest view.

Clearing All Messages

To clear all messages found, click the Delete All icon (a red X) at the top of the SOAtest view.


Generating Reports

This topic explains how to generate HTML, PDF, or custom XSL reports for tests that you run from the GUI or command line. Sections include:

•

From the GUI

•

From the Command Line

From the GUI

Generating the Report

To generate a report immediately after a test completes:

1. After the test has completed, click the Report button that is available in the Test Progress view (at the bottom of the GUI).

2. Complete the Report dialog that opens. The Report dialog allows you to specify:

•

Report preferences (by clicking the Preferences button and specifying settings as explained in “Configuring Reporting Settings”, page 297).

•

Any options files that specify reporting settings you want to use (these will override settings specified in the GUI’s Preferences panel).

•

The report format (HTML, PDF, or custom XSL).

•

The location of the report file.

•

Whether the report is deleted upon exit.

•

Whether the report should be uploaded to Team Server (Server Edition only; requires Team Server).

•

Whether code review tasks/results should be uploaded to Team Server (any edition; requires Team Server).

3. Click OK. SOAtest will then open an HTML report in a browser window. This report is similar to the manager HTML reports that are generated from command line tests.

On-Demand Report Generation

You can generate a report for the previous test session at any time, regardless of whether the Testing dialog is available. To generate a report of the previous test session, click the pull-down menu on the top right of this view, then choose Last Session> Report.


Uploading the Report to Team Server

To upload the report to Team Server (Server Edition only), follow the above procedure, but be sure to enable Publish: Reports before clicking OK.

For information about this report, see “Understanding Reports”, page 300.

Tip

To customize report settings, create a local settings file (as described in “Testing from the Command Line Interface (soatestcli)”, page 257), then enter the location of this file in the Report Options field of the Report dialog.

From the Command Line

To generate an HTML report of command line test results (Server Edition only), use the -report %REPORT_FILE% option with your soatestcli command. To upload the report to Team Server, also use the -publish option with your soatestcli command. Two types of HTML reports can be produced from the command line interface: manager reports and individual reports. For information about these reports, see “Understanding Reports”, page 300.


Configuring Reporting Settings

This topic explains how to configure reporting settings. Reporting settings can be configured in the UI or from the command line interface (using a local settings file). Sections include:

•

From the GUI

•

From the Command Line

•

Support for Custom Report Formats

From the GUI

The available GUI controls can be used to specify reporting settings for any test, whether it is run from the command line interface or the UI. Before configuring report settings, you should review the settings on the following preference pages to ensure that task authorship is being calculated correctly, results are being sent to the proper Team Server and Report Center server, the correct email host is used, and so on:

•

E-mail

•

Group Reporting

•

License

•

Scope and Authorship

•

Source Control

•

Team

The settings specified in the UI can be fully or partially overwritten by those specified in a local settings file. To specify reporting settings from the GUI:

1. Choose SOAtest> Preferences. The Preferences dialog will open.

2. Select SOAtest> Reports.

3. Specify the appropriate report content and format settings. Available settings include:

Report contents:

•

Detailed report for developers: Determines whether customized, detailed reports are generated for each developer (in addition to a summary report for managers). These reports contain only the tasks assigned to that developer.

•

Overview of tasks by authors: Determines whether the report includes an overview of the number and type of tasks assigned to each developer.

•

Tasks details: Determines whether the report includes details about all of the reported tasks.

•

Only top-level test suites: Determines whether the Test Suite Summary report section only lists the .tst files (with this option enabled) or displays a tree-like view of the individual tests in each .tst file (with this option disabled).

•

Active static analysis rules: Determines whether the report lists the static analysis rules that were enabled for the test.








•

Generate formatted reports in command-line mode: Determines whether formatted reports are generated for tests run in command line mode.

•

Overview of checked files and executed tests: Specifies whether the report provides details about all checked files and executed tests.

For static analysis, this results in a list of all the files that were checked. For each file, it lists the number of rule violations and the number of suppressed violations. If the file has a violation, it also lists the line number, rule name, and rule ID for that violation.

For test execution, this results in a list of all executed test cases and their outcomes (pass or fail). For each test suite, it lists the total number of test cases and the number of passed test cases. If a task is reported for a test case, additional details (stack trace, outcome, etc.) are presented.

•

Suppressions details: Specifies whether the report lists suppressed messages.

•

Only tests that failed: Specifies whether the report lists only failed tests.

•

Cutoff date for graphs: Specifies the start date for trend graphs that track different categories of tasks over a period of time.

Report format:

•

Format: Specifies the desired report format (HTML, PDF, or custom XSL).

•

XSL file: If you chose custom XSL as the report format, specify the path to the XSL file that defines your custom format. See “Support for Custom Report Formats”, page 299 for details and examples.

•

Report file extension: If you want to use a file extension other than the default .html extension, specify that extension here.

Advanced settings:

•

Add absolute file paths to XML data: Specifies whether absolute file paths are added to XML data.

•

Report tag: Specifies the name of the group that is responsible for the project. This value is used for uploading summary results to Team Server. The tag is an identifier of the module checked during the analysis process. Reports for different modules should be marked with different tags. The variables detailed in “Using Variables in Preference Settings”, page 761 can be used here.

4. If you have not already configured e-mail settings (sender address, host name, etc.) in either the GUI or from the command line, do so now in SOAtest> E-mail Settings.

5. Select SOAtest> Reports> E-mail Notifications.

6. Specify the appropriate e-mail notification settings. Available settings include:

•

Send reports by e-mail: Specifies whether reports are sent via e-mail.

•

E-mail subject: Specifies the subject line of the emails sent. The default subject line is "SOAtest Report." For example, if you want to change the subject line to "SOAtest Report for Project A", you would enter SOAtest Report for Project A.

•

Send manager reports to: Specifies where to send manager reports.

•

Send reports without tasks: Specifies whether reports are sent when zero tasks are reported.

•

Send developer reports to: Specifies where to send developer reports.

•

Send ’Unknown’ developer reports to: Specifies where to send developer reports for tasks assigned to "unknown" (tasks that could not be traced back to a specific developer).

From the Command Line

Reporting settings can also be specified in local settings files. See “Reporting Settings”, page 267 for details. Note that the settings specified in the UI can be fully or partially overwritten by those specified in a local settings file. If a parameter is specified in this file, it will override the related parameter specified from the GUI. If a parameter is not specified in this file, SOAtest will use the parameter specified in the GUI.

Support for Custom Report Formats

A custom XSL transformer was added to facilitate the use of custom XSL formats. To specify a custom report format, enter the XSL file and the report file extension. In the Reports preference page, you can specify this information in the Custom report format area of the page. In the options file, you can specify this information using the following options:

(results.)report.custom.extension
(results.)report.custom.xsl.file

For additional guidance, see the following files (available in the manuals directory):

•

XML Schema: reports.xsd

•

Sample XML with a variety of results: rep_example.xml (note that this report is used by all of the transformations below)

•

Simple CSV plain text file transformation: XSL file csv.xsl; result csv.txt

•

Simple HTML table with violations list: XSL file html_table.xsl; result html_table.html

•

Simple HTML table with author/violations statistics: XSL file stats_table.xsl; result stats_table.html


Understanding Reports

This topic provides a general introduction to the reports that SOAtest produces for GUI and CLI tests. Report details will vary based on report settings, the Test Configuration used, and the errors found. Sections include:

•

Report Types

•

Report Contents

Report Types

Two types of reports can be produced from the command line interface:

•

Comprehensive reports: Reports that contain all tasks generated for a single test run.

•

Individual reports: Reports that contain only tasks assigned to the specified team member.

For example, if a test generated 5 tasks for Tom and 10 tasks for Joe, the comprehensive report would contain all 15 tasks, Tom’s report would contain 5 tasks, and Joe’s report would contain 10 tasks.

Report Contents

Reports may contain the following sections.

Header/Navigation Bar

The top left cell of the header/navigation bar shows the time and date of the test. The remaining cells (Static Analysis, Test Execution) each link to the named report section.

Static Analysis Section

The Static Analysis section includes several items:

•

The Static Analysis trends graph tracks how the total number of lines of code in the project, the lines of project code that were checked during static analysis, and the total number of reported static analysis tasks vary over time. This graph is created only for tests that are run from the command line and that use the -publish option.

•

The Overview table shows a basic summary of all static analysis tasks for the tested project(s). It reports the total number of static analysis tasks, the number of files checked during static analysis, the total number of project files, the number of lines of code checked during static analysis, and the total number of lines of code in the project. It also reports the total time spent performing static analysis.

•

The All Tasks by Category table shows the total number of tasks reported for each static analysis rule category and rule. Tasks can be sorted by rule category or rule severity; click the appropriate link in the table header to change the table sorting.

•

The Tasks per Author table shows the number of static analysis tasks assigned to each team member. To see details about the static analysis tasks assigned to a particular team member, click the related username. If a team member’s name is listed in green, it means that there are no "recommended static analysis tasks" reported for that team member ("recommended tasks" are the subset of all reported tasks that has been selected for that team member to review and address today, based on the maximum number of tasks per team member that your team has configured to report, as described in “Defining Task Reporting and Resolution Targets (Goals Tab)”, page 250, and task assignment settings, as described in “Configuring Task Assignment”, page 215).

•

The Task Details section provides details about each reported task.

•

The Checked Files (Details) section lists all the files that were checked. For each file, it lists the number of rule violations and the number of suppressed violations. If the file has a violation, it also lists the line number, rule name, and rule ID for that violation.

•

The Active Rules section lists the names and IDs of all rules that were enabled for the test.

Test Execution Section

The Test Execution section includes several items:

•

The Test Execution Tasks trends graph tracks how the number of functional test failures changes over time. This graph is created only for tests that are run from the command line interface and that use the -publish option.

•

The Test Suite Summary provides a breakdown of failed tests, successful tests, total tests, and the success percentage. Test suites with failed tests are highlighted in pink.

•

The Tasks per Author table shows the number of testing tasks assigned to each team member. To see details about the testing tasks assigned to a particular team member, click the related username. If a team member’s name is listed in green, it means that there are no "recommended testing tasks" reported for that team member ("recommended tasks" are the subset of all reported tasks that has been selected for that team member to review and address today, based on the maximum number of tasks per team member that your team has configured to report, as described in “Defining Task Reporting and Resolution Targets (Goals Tab)”, page 250, and task assignment settings, as described in “Configuring Task Assignment”, page 215).

•

The Task Details section provides details about each reported task.

•

The Executed Tests (Details) section lists all executed test cases and their outcomes (pass or fail). For each test suite, it lists the total number of test cases and the number of passed test cases. If a task is reported for a test case, additional details (stack trace, outcome, etc.) are presented.


Team Server Report Link

This link allows you to browse directly to this and other report files available on Team Server. In the reports available on Team Server, all links (for instance, links to Category) are active. Links are not active in emailed reports. Thus, if you want to explore an emailed report in more detail, we recommend that you follow this link and access the report on Team Server.


Functional/Integration Testing

In this section:

•

End-to-End Test Scenarios

•

Web Functional Tests

•

SOA Functional Tests


End-to-End Test Scenarios

In this section:

•

Configuring End-to-End Test Scenarios: Overview

•

Adding Projects, .tst files, and Test Suites

•

Working with Projects and .tst files

•

Reusing/Modularizing Test Suites

•

Configuring Test Suite Properties (Test Flow Logic, Variables, etc.)

•

Adding Standard Tests

•

Adding Set-Up and Tear-Down Tests

•

Adding Test Outputs

•

Adding Global Test Suite Properties

•

Reusing and Reordering Tests

•

Parameterizing Tests (with Data Sources or Values from Other Tests)

•

Configuring Testing in Different Environments

•

Validating the Database Layer

•

Validating EJBs

•

Validating Java Application-Layer Functionality

•

Monitoring and Validating Messages and Events Inside ESBs and Other Systems

•

Executing Functional Tests

•

Reviewing Functional Test Results

•

Creating a Report of the Test Suite Structure

•

Managing the Test Suite


Configuring End-to-End Test Scenarios: Overview

SOAtest provides full support for testing both Web interfaces and services over multiple protocols. This establishes an integrated framework for "end-to-end" testing: a single test suite can verify operations that cross the messaging layer, the Web interface, the database, and EJBs. Moreover, "unit tests" can be created as soon as a single "unit of work" is completed; this test suite can then be incrementally enhanced to verify additional components as they are added, test the integration of components within an SOA, and audit end-to-end business processes that span composite applications.

SOAtest provides a flexible test suite infrastructure that lets you add, organize, and run specialized test scenarios. Each test in a test suite contains a main test tool and any number or combination of outputs (other tools, or special output options). You can run individual tests, or the complete test suite. In addition, you can attach regression controls at the test or test suite level so that you are immediately alerted to unexpected changes.

After individual functional tests have been created, they can be leveraged into scenario-based tests without any additional work. Scenario tests allow you to emulate business logic or transactions that may occur during normal usage of the application or service. This also allows you to find bugs that may surface only after a certain sequence of events.

Recommended Workflow

The most common workflow for developing end-to-end test scenarios is as follows. As each "unit of work" (service or other logic component) is completed:

1. Use a wizard to automatically generate a project, test suite, and initial test cases.

2. Add additional outputs and tests as needed to cover the functionality that you want to test.

3. Execute the test and verify that it is producing the expected result.

4. If there are problems with the application, diagnose and correct them, then rerun the tests.

5. If the test is not working as expected, adjust test and tool settings as needed, then rerun the tests.

6. Add regression controls or validations once the tested functionality is behaving as expected.

As your test suites grow, combine tests into test scenarios. For example, you can:

•

Copy and paste tests from different test suites into a logical order.

•

Configure execution options such as test sequence, test relationship, and test flow logic.

•

Parameterize tests to use values from data sources or values extracted from other tests.

•

Use stubs and environments to configure a predictable and accessible test bed.


Adding Projects, .tst files, and Test Suites

This topic provides a general guide on how to add projects, .tst files, and test suites using SOAtest’s various test creation wizards. Sections include:

•

Projects, .tst files, and Test Suites

•

Test Suites and Scenarios

•

Adding a New Project and .tst File

•

Creating an Empty Project

•

Adding a New .tst File to an Existing Project

•

Adding a New Test Suite

Projects, .tst files, and Test Suites

A project (an entity created by Eclipse) can contain any number of SOAtest-specific .tst files. It can also contain source files you want to analyze with SOAtest, and any other resources that make sense for your environment. Each .tst file can include any number of test suites/scenarios, tools, inputs, and stubs. The organization and structure are up to you. To keep file size down and to improve maintainability, we recommend using one .tst file for each distinct testing requirement.

Test Suites and Scenarios

A test suite is any collection of tests that are individually runnable; this is controlled by a setting in the test suite configuration panel.

A scenario is any collection of tests that are not individually runnable because they have dependencies. One example of a scenario is when a series of SOA tests extracts a value from one test’s response and uses it as part of a subsequent test message. Another example is a sequence of Web functional tests recorded from a browser.

Adding a New Project and .tst File


Projects provide you with a central place to access and operate on different .tst files and the test suites they contain. You can either create a test suite manually, or you can use the SOAtest test creation wizards to automatically create a test suite from various platforms or artifacts. SOAtest provides a wizard to guide you through creating a new project and adding an initial .tst file to it. There are two ways to access this wizard:

•

Choose File> New, then either select your desired test creation option (for example, Project from WSDL, Project from Web Browser Recording, etc.) if it is listed, or choose Other to open a full list of test creation options.

•

Choose this command from the pull-down menu for the New toolbar button (top left).

The wizard will guide you through the test case creation process, then create a project and .tst file containing the generated tests. For help selecting and completing the available test creation wizards, see:

•

“Automatic Creation of Test Suites for SOA: Overview”, page 384

•

“Recording Tests from a Browser”, page 431

•

“Configuring SOAtest to Scan a Web Application”, page 583

Creating an Empty Project

You can create an empty project as follows:
1. Open the pull-down menu for the New toolbar button (top left), then choose Project.
2. Choose SOAtest> Empty Project, then click Next.
3. Enter a name for the project, change the destination if needed, then click Finish.

Adding a New .tst File to an Existing Project We recommend that you create a separate test (.tst) file for each distinct testing requirement. To add a new test (.tst) file to an existing project:
1. Do one of the following:
• Right-click the project node, and select New Test (.tst) File from the shortcut menu.
• Choose File > New > New Test (.tst) File.


2. In the New Test (.tst) File wizard that opens, select the project that you want to contain the .tst file, enter a name for the .tst file, then click Next.

You can then complete the wizard to specify what type of tests you want to create and how you want them created. For help selecting and completing the available test creation wizards, see:

• “Automatic Creation of Test Suites for SOA: Overview”, page 384
• “Recording Tests from a Browser”, page 431
• “Configuring SOAtest to Scan a Web Application”, page 583

Adding a New Test Suite To create a new test suite: 1. Do one of the following:




• Select the Test Case Explorer node for the existing test suite into which you want to add a new test suite, then click the Add Test Suite button.
• Right-click the Test Case Explorer node for the existing test suite into which you want to add a new test suite, then choose Add New> Test Suite from the shortcut menu.


Working with Projects and .tst files This topic explains how to work with projects and .tst files. Sections include:

• Working with Projects
  • Saving a Project File
  • Closing a Project File
  • Opening a Project File
• Working with Test (.tst) Files
  • Opening .tst Files
  • Understanding a .tst File’s XML Format

Working with Projects Saving a Project File Any changes that you make to project properties, test suites, and so on are automatically saved to the project. Projects will remain open until you explicitly close them.

Closing a Project File When you close a project file, SOAtest will close all related trees and settings. Closed projects are shown in the Navigator, but not in the Test Case Explorer. To close the current project and all related settings:

• In the Navigator, right-click the project, and choose Close Project from the shortcut menu.

Opening a Project File When you open a project file, all associated trees, settings, and reports will be restored. To open a project file:

• In the Navigator, right-click the project, and choose Open Project from the shortcut menu.

Working with Test (.tst) Files Opening .tst Files By default, .tst files are closed. All open .tst files are loaded into memory. There are two ways to open a .tst file:

• Double-click the .tst file’s Test Case Explorer node.
• Right-click the .tst file’s Test Case Explorer node, then choose Open Test (.tst) File from the shortcut menu.

Closed .tst files are shown with a "closed box" icon.


Open .tst files are shown with an "open box" icon.

Closing .tst Files There are two ways to close a .tst file:

• Double-click the .tst file’s Test Case Explorer node.
• Right-click the .tst file’s Test Case Explorer node, then choose Close Test (.tst) File from the shortcut menu.

Understanding a .tst File’s XML Format SOAtest .tst files are saved in XML format, and thus can be parsed to get test suite and test information into a custom framework. The following table describes how some of the most commonly-used tools are represented:

Artifact                    Element Name               Parent Element
Root element                SOAtestProject             None
Test Suite                  TestSuite                  TestSuite (if nested under other Test Suites)
Test or a Test Suite name   name                       any of the other listed elements
Environments                EnvironmentConfiguration   TestSuite
Data Sources                DataSource                 TestSuite/SOAPRPCToolTest
Messaging Client            HTTPClient                 TestSuite/HTTPClientToolTest
Browser Testing Tool        BrowserTestingTool         TestSuite/ToolTest
DB Tool                     DbTool                     TestSuite/ToolTest
Extension Tool              MethodTool                 TestSuite/ToolTest
Call Back Tool              CallBackTool               TestSuite/CallBackToolTest
Message Stub Tool           ClientTester               TestSuite/ClientTesterTest

Moreover, the following list describes common, specially named elements that will appear in the XML project file. The fields are named so that you can exclude parts of a name to make a search more general. For example, to search for any WSDL, you could search and replace for "_WSDLLocation>http://mywsdl</"; for just SOAP Client WSDLs, you could search for "<SOAPClient_WSDLLocation>http://mywsdl</SOAPClient_WSDLLocation>".

WSDL fields:
• SOAPClient_WSDLLocation
• ClientTester_WSDLLocation
• WSITool_WSDLLocation
• XMLValidator_WSDLLocation

Schema fields:
• SOAPClient_SchemaLocation
• ClientTester_SchemaLocation
• MessagingClient_SchemaLocation

Endpoint fields:
• SOAPClient_CustomEndpoint
• SOAPClient_UDDIServiceKey
• MessagingClient_Endpoint

Literal (XML) fields:
• SOAPClient_LiteralMessage
• ClientTester_LiteralMessage
• MessagingClient_LiteralMessage

XPath fields:
• XMLDatabank_ExtractXPath
• XMLDatabank_AlterXPath
• XMLTransformer_ExtractXPath
• XMLTransformer_AlterXPath
• Assertion_XPath

Diff Tool regression controls:
• DiffTool_RegressionControl

BrowserTestingTool fields:
• BrowserTestingTool_NavigateURL - the url field for a Navigate action
• BrowserTestingTool_WindowName - the window name field for any action
• BrowserTestingTool_LocatorAttributeValue - the attribute value field for any action set to an Element Properties locator
• BrowserTestingTool_LocatorXPath - the xpath field for any action set to an XPath locator
• BrowserTestingTool_TypeValue - the value field for a type action
• BrowserTestingTool_OtherValue - the value field for an "other" action
• BrowserTestingTool_NewBrowserURL - the url field for a NewBrowser action

BrowserDataBank fields:
• BrowserDataBank_LocatorAttributeValue - the attribute value field for any extraction set to an Element Properties locator
• BrowserDataBank_LocatorXPath - the xpath field for any extraction set to an XPath locator
• BrowserDataBank_WindowName - the window name field for any extraction

BrowserValidationTool fields:
• BrowserValidationTool_LocatorAttributeValue - the attribute value field for any validation set to an Element Properties locator
• BrowserValidationTool_LocatorXPath - the xpath field for any validation set to an XPath locator
• BrowserValidationTool_WindowName - the window name field for any validation
• BrowserValidationTool_ExpectedValue - the expected value field for any validation
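Because .tst files are plain XML, the element and field names above can be used to pull test information into a custom framework. Below is a minimal sketch using Python's standard library; the sample document is hypothetical and only mirrors the naming conventions described in this section, not the full structure of a real .tst file.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment mirroring the element names described above;
# a real .tst file contains many more elements and attributes.
sample = """
<SOAtestProject>
  <TestSuite>
    <name>Scenario 1</name>
    <TestSuite>
      <name>Login</name>
      <SOAPClient_WSDLLocation>http://example.com/service?wsdl</SOAPClient_WSDLLocation>
    </TestSuite>
  </TestSuite>
</SOAtestProject>
"""

root = ET.fromstring(sample)

# Collect every test suite name (the "name" element under each TestSuite).
suite_names = [suite.findtext("name") for suite in root.iter("TestSuite")]

# Find any WSDL location field, regardless of which tool owns it, by
# matching the "_WSDLLocation" suffix mentioned in the text.
wsdl_locations = [el.text for el in root.iter() if el.tag.endswith("_WSDLLocation")]

print(suite_names)     # ['Scenario 1', 'Login']
print(wsdl_locations)  # ['http://example.com/service?wsdl']
```

The same suffix-matching idea applies to the other field families above (for example, matching on "_SchemaLocation" or "_ExtractXPath" instead).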


Reusing/Modularizing Test Suites This topic explains how to make test suites reusable. Sections include:

• Introduction
• Using Test Suite References
• Using Test Variables
• Tutorial

Introduction In many cases, you may want to create a test suite that can be reused by other test suites. A common example is a test suite that logs in to a web site. Once such a test suite is created, it can be used by other test suites in various scenarios that require a login. Two SOAtest features are especially helpful for the creation and use of reusable test suites:

• Referenced test suites: Once a reusable module or test suite has been created, it can be referenced by another test suite.

• Test variables: You can parameterize tests with test variables, which can be set to specific values from a central location, set from data sources, or set from a data bank tool or Extension tool.

Using Test Suite References Adding an existing test suite as a test suite reference is especially useful if you have a test suite that you would like your team members to reuse across multiple parent test suites. For example, you may have a single Authentication test suite that different team members want to use within different root test suites. In this situation, the team members can add a reference to the defined Authentication test suite within their specific test suites.

For another example, consider a web application that requires a user to log in. The sequence of steps to log into the application could be saved in one SOAtest test suite, and every functional test for that web application could then reference the test suite containing the login steps. Setting up the tests in this manner makes it much easier to manage an evolving web application: if an extra step is added to the login process, only the "login" test suite needs to be modified to include that extra step, and all other tests that reference the "login" test suite will automatically pick up the change.

To reference an existing test suite in another test suite:
1. Right-click the Test Suite tree node where you want the test suite referenced, then select Add New> Test Suite from the shortcut menu.
2. Select Reference Test (.tst) File and click the Finish button.
3. Select the appropriate .tst file from the file chooser that opens.

After you add a test suite reference, it will be referenced by the current test suite. If the referenced test suite is modified (from its original location in the Test Case Explorer), those changes will be propagated to the parent test suite.

Using Test Variables

For details on using test variables, see “Defining Test Variables”, page 326.

Tutorial For a step-by-step demonstration of how to construct and use a reusable test suite, see “Creating Reusable (Modular) Test Suites”, page 92.


Configuring Test Suite Properties (Test Flow Logic, Variables, etc.) This topic explains how to customize a test suite’s properties (such as its name and how individual test cases are run). Topics include:

• Accessing the Test Suite Configuration Panel
• Correlating Tests to Requirements, Project Center Tasks, Bug Reports, and Feature Requests
• Specifying Execution Options (Test Flow Logic, Regression Options, etc.)
• Defining Test Variables
• Specifying SOAP Client Options
• Specifying Browser Playback Options
• Specifying Reporting Options

Accessing the Test Suite Configuration Panel To customize a test suite’s properties, double-click its Test Case Explorer node and use the controls that open in the configuration panel (on the right side of the GUI).

Correlating Tests to Requirements, Project Center Tasks, Bug Reports, and Feature Requests The Requirements and Notes tab of the test suite configuration panel allows you to identify requirements for each test in the test suite. The requirements you define will appear in Structure reports (and also in Report Center for Report Center users), allowing managers and reviewers to determine whether the specified test requirements were accomplished. For more information on Structure Reports, see “Creating a Report of the Test Suite Structure”, page 380. To configure requirements for tracking, complete the following from the Requirements and Notes tab: 1. Select a node from the test suite tree within the Requirements and Notes tab.


2. Click the Add button.
3. In the Type box, select a requirement type. Parasoft Concerto will use this information to associate the test suite’s test cases with the specified element type. For instance, if a test case is associated with a specific bug, information about its status will be considered for Report Center’s bugs graphs. Available task types are:
• @pr: for bugs.
• @fr: for feature requests.
• @req: for requirements.
• @task: for tasks.
4. Enter an ID and a URL for the requirement and click OK. The requirement you defined will display in the Requirements table within the Requirements and Notes tab, and will correspond to the test suite node you selected and all of its child nodes.
5. (Optional) If you want to enter notes for the test suite, enter a description in the Notes field. This is useful for viewing a quick description of the test suite’s purpose.

Specifying Execution Options (Test Flow Logic, Regression Options, etc.) Configurable execution options allow you to control factors such as whether:

• Tests run sequentially or concurrently.
• Tests can be run independently, or should be run in groups.
• One test depends on the result of another test.
• The entire test suite should loop until a certain condition is met.
• Regression controls are created for specific tests, and how regression controls map to data sources.

These options are configured in the Execution Options tab, which has three sub-tabs: Test Execution, Test Flow Logic, and Regression Options.

Test Execution You can customize the following options in the Test Execution sub-tab of the Execution Options tab:

• Execution Mode: These options determine the concurrency of test runs.






  • Tests run Sequentially: Choose this option to tell SOAtest to run each test, and child test suite of this test suite, separately from the others. Tests will run one at a time.
  • Tests run Concurrently: Choose this option to tell SOAtest to run all tests and child test suites of this test suite at the same time. Tests will run simultaneously.

• Test Relationship: These options determine how SOAtest will iterate through the rows of your data sources.
  • Tests are individually runnable: (Default) SOAtest iterates through all data source rows for each test. When an individual test executes, it will use every row of the data source before the next test or test suite is executed. When a child test suite is executed, SOAtest will wait for all of its children to finish before the next test or test suite is executed.
  • Abort Scenario on Fatal Error: To stop running tests if the previous test resulted in a fatal error, check the Abort Scenario on Fatal Error checkbox. This option is only available when Tests run Sequentially is selected and Tests are individually runnable is not selected. This case occurs when a set of tests in a test suite are dependent on each other, cannot be run apart from each other, and must run sequentially. If the option is enabled, and a test within the scenario being run has a fatal error, the rest of the tests in the scenario will not be run. If it is disabled, the remaining tests in the scenario will run even if a fatal error occurs.
  • Tests run as group: (Default for scenarios) SOAtest runs all tests for each row of the data source. In this case, a data source row is chosen, and each test and child test suite is executed for that row. Once all children have executed, a new row is chosen and the process repeats.
  • Tests run all sub-groups as part of this group: SOAtest treats all tests contained in this test suite like direct children of this test suite. SOAtest will then iterate through them as a group. For example, consider the arrangement in the following figure:

In this case, assume that Test Suite 2 and Test Suite 3 are both set to “Tests are individually runnable” and that the Table data source has 2 rows of data. The table below shows the order in which tests would run for each Test Relationship choice on Test Suite 1.


Run individually       Run as group           Run all sub-groups as part of this group
SOAP Client 1 row 1    SOAP Client 1 row 1    SOAP Client 1 row 1
SOAP Client 1 row 2    SOAP Client 2 row 1    SOAP Client 2 row 1
SOAP Client 2 row 1    SOAP Client 2 row 2    SOAP Client 3 row 1
SOAP Client 2 row 2    SOAP Client 3 row 1    SOAP Client 1 row 2
SOAP Client 3 row 1    SOAP Client 3 row 2    SOAP Client 2 row 2
SOAP Client 3 row 2    SOAP Client 1 row 2    SOAP Client 3 row 2
                       SOAP Client 2 row 1
                       SOAP Client 2 row 2
                       SOAP Client 3 row 1
                       SOAP Client 3 row 2
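The three orderings above can be sketched in Python. This is an illustrative model only, assuming the arrangement described (Test Suite 1 holding SOAP Client 1 plus two child suites that each hold one SOAP Client and are set to run individually); it is not SOAtest's actual execution engine.

```python
rows = ["row 1", "row 2"]
# First child is a plain test; the other two stand in for child test suites.
direct_children = [["SOAP Client 1"], ["SOAP Client 2"], ["SOAP Client 3"]]

def run_individually():
    # Each child iterates through every data source row before the next starts.
    return [(t, r) for child in direct_children for t in child for r in rows]

def run_as_group():
    # A row is chosen for the suite's own tests; child suites still run
    # individually, so they consume every row each time they execute.
    order = []
    for r in rows:
        order.append(("SOAP Client 1", r))       # direct test uses the chosen row
        for child in direct_children[1:]:        # child suites iterate all rows
            for t in child:
                order.extend((t, cr) for cr in rows)
    return order

def run_subgroups_as_part_of_group():
    # All nested tests are treated as direct children and iterated per row.
    flat = [t for child in direct_children for t in child]
    return [(t, r) for r in rows for t in flat]
```

Calling each function reproduces the corresponding column of the table above.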

Test Flow Logic SOAtest allows you to create tests that are dependent on the success or failure of previous tests, set-up tests, or tear-down tests, thereby creating an efficient workflow within the test suite. In addition, you can influence test suite logic by creating while loops and if/else statements that depend on the value of a test variable. Options can be set at the test suite level (options that apply to all tests in the test suite) or for specific tests.

Test Suite Logic Options In many cases, you may want SOAtest to repeatedly perform a certain action until a certain condition is met. Test suite flow logic allows you to configure this.

Understanding the Options To help you automate testing for such scenarios, SOAtest allows you to choose between two main test flow types:

• While variable: Repeatedly perform a certain action until a test variable condition is met. This requires test variables, described in “Defining Test Variables”, page 326, to be set.

• While pass/fail: Repeatedly perform a certain action until a pass/fail condition is met (e.g., one of the tests in the test suite either passes or fails).

For example:

• A user submits some data to a web service, and that submission results in other data being inserted into a database at a later time. The time at which the data is inserted into the database varies. To check this in SOAtest, you could define a DB tool with a chained assertor that fails while the data is not present. The test would then need to loop on this DB tool until it succeeds.

• In a web application, the user enters some data and clicks a "Submit Query" button. If the data is not available, the application just shows a "data loading" message. The user repeatedly clicks the button until some data appears. To check this in SOAtest, you could set up a Browser Testing tool that performs a click action on the button, then chain to it a Browser Validation tool that validates whether some element is present. The test would need to loop until the element appears.

• In a web application, search results are often presented in a "paged" format, meaning that the results are distributed over multiple pages. If the result that you are looking for is not on the currently displayed page, you need to click the "Next" link until it appears. To check this in SOAtest, you could configure a Browser Testing tool that performs a click action on the "Next" link, with a Browser Validation tool that validates whether the desired result is present. The test would then need to loop until the result appears.

Setting the Options To configure test flow logic options that apply across the test suite:
1. Open the Execution Options> Test Flow Logic tab, then select the top-level node.

2. Select the desired flow type. You can choose while variable or while pass/fail loop flow (see above for an explanation of the different types), or none (if you do not want execution flow to depend on a condition being met).
3. (Optional) Customize the Maximum number of loops setting, which determines the maximum number of loops to run if the specified condition is never met.
4. If you chose while pass/fail flow, specify the loop conditions: go to Loop until one of the test(s) and choose succeeds or fails, depending on which outcome you want to occur before the test suite proceeds.
5. If you chose while variable flow, set the while and do conditions as follows:
• while: Select the desired variable from the drop-down list. The items in this list depend on the variables you added to the Test Variables tab.
  • If the variable you select was defined as a boolean value, you will be able to select from either true or false radio buttons.
  • If the variable you select was defined as an integer, a second drop-down menu displays with == (equals), != (not equal), < (less than), > (greater than), <= (less than or equal to), >= (greater than or equal to). In addition, a text field is available to enter an integer.

• do: Allows you to determine the action for the variable in the while loop. The following options are available:
  • Nothing: If the variable condition is met, do nothing.
  • Increment: (For integer values only) If the variable condition is met, increment the variable.
  • Decrement: (For integer values only) If the variable condition is met, decrement the variable.
  • Negate: (For boolean values only) If the variable condition is met, negate the variable.
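As a rough mental model, the while variable flow behaves like a bounded loop: the suite body runs while the variable condition holds, the do action updates the variable after each pass, and Maximum number of loops caps the iteration count. A hypothetical sketch (not SOAtest code; the function and parameter names are invented for illustration):

```python
def run_with_while_variable(x, condition, action, run_body, max_loops):
    """Illustrative model of 'while variable' test flow logic: loop while
    the condition on the test variable holds, apply the 'do' action after
    each pass, and never exceed the configured maximum number of loops."""
    loops = 0
    while condition(x) and loops < max_loops:
        run_body()        # execute the test suite's tests once
        x = action(x)     # e.g. Increment, Decrement, or Negate
        loops += 1
    return loops

# Example: while x < 3, do Increment, so the suite body runs 3 times.
count = run_with_while_variable(
    x=0,
    condition=lambda v: v < 3,
    action=lambda v: v + 1,       # "Increment"
    run_body=lambda: None,
    max_loops=10,
)
print(count)  # 3
```

If the condition never becomes false, the max_loops cap stops the loop, mirroring the Maximum number of loops setting.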

Test Flow Logic Tutorial For a step-by-step demonstration of how to apply while pass/fail test flow logic, see “Looping Until a Test Succeeds or Fails - Using Test Flow Logic”, page 97.

Test-Specific Logic Options The following options are available for specific tests:

• Test Result Dependency: If the current (selected) test should run only if another test succeeds, fails, or is skipped, specify the name of that test here. For example, if Test 4 depends on the results of Test 1, select Test 4 in the left panel, then choose Test 1 from the drop-down menu. Then, specify the condition under which the current test should run. Options are:
  • Success: Select if the subsequent test case should run only when the test case selected in the Test drop-down menu succeeds. If that test case does not succeed, the subsequent test case will not run.
  • Failure: Select if the subsequent test case should run only when the test case selected in the Test drop-down menu fails. If that test case does not fail, the subsequent test case will not run.
  • Skipped: Select if the subsequent test case should run only when the test case selected in the Test drop-down menu was skipped. If that test case is not skipped, the subsequent test case will not run.


Set-Up and Tear-Down Tests If any set-up or tear-down tests are available, they display in the left GUI panel and you will be able to configure test logic as follows:

• The execution of a test can be dependent on a set-up test.
• A set-up test can be dependent on a previous set-up test.
• A tear-down test can be dependent on a regular test, a set-up test, or a previous tear-down test.

This functionality allows you to stop a test (or run a test) if a set-up test fails.



Variable Condition: Allows you to determine whether or not a test is run depending on variables added to the Test Variable table (for more information on adding test variables, see “Defining Test Variables”, page 326). If no variables were added, then the Variable Condition options are not available. The following options are available if variables were defined: •

Variable Condition drop-down: Select the desired variable from the drop-down list. The items in this list depend on the variables you added to the Test Variable table. •

If the variable you select was defined as an integer, a second drop-down menu displays with == (equals), != (not equal), < (less than), > (greater than), <= (less than or equal to), >= (greater than or equal to). In addition, a text field is available to enter an integer. For example:

If x != 13 (x does not equal 13), the test will run, however, if x does equal 13, the test will not be run. •

If the variable you select was defined as a boolean value, you will be able to select from either true or false radio buttons. For example:

If variable x1 is false, the test will run, however, if x1 is true, the test will not be run. •

Delay in milliseconds: Lets you set a delay before and/or after test execution.

Regression Options The Regression Options controls allow you to customize how data sources are used in regression tests and which test suites have regression controls. Note that this tab is not applicable for Web functional tests. Available options are:

• Use data source row numbers: (Default) Select to have all the Diff regression controls within the test suite associate data source row numbers with the data generated by the Diff control. For example, Row N in a data source will be associated with the Row N control in the Diff tool, regardless of what data source values are used. When this option is selected, there is no dependency between the data source values and the corresponding Diff regression control (in the context of multiple regressions). Note: If a new row is inserted into or deleted from the data source, all multiple regression controls associated with that data source must be updated.

• Use data source column names and values: Select to have all the Diff regression controls within the test suite associate the data source column names and values with the data generated by the Diff control. For example, the request which used A=1, B=2 in a SOAP Client will be associated with the control that has been labelled "A=1, B=2", and so on. When this option is selected, you can add and remove data source rows as you wish, and the Diff will map the content to the correct control as long as the column names and values are unchanged. For more information on using data sources, see “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345.

• Regression Controls Logic: This table allows you to configure which tests in a test suite SOAtest should create regression controls for. For each test entered in the table, you can select Always or Never. Regression controls will be updated accordingly the next time you update the regression controls for the test suite.

Defining Test Variables The Test Variables tab allows you to configure variables that can be used to simplify test definition and create flexible, reusable test suites. After a test variable is added, tests can parameterize against that test variable.

Understanding Test Variables You can set a test variable to a specific value, then use that test variable throughout the current test suite to reference that value. This way, you don’t need to enter the same value multiple times, and if you want to modify the value, you only need to change it in one place. As an alternative to manually setting a test variable to a specific value, you can have a data bank tool (e.g., XML Data Bank) or Extension tool set the value of that test variable "on-the-fly." Moreover, if you have a referenced test suite (a test suite that is referenced by a parent test suite; see “Using Test Suite References”, page 317 for details), test variables can be used to access data sources from the parent test suite.

Adding Test Variables You can add a new variable as follows:
1. Click the Add button.
2. Enter a new variable name in the Name field.
3. Select Integer, Boolean, String, or Data Source from the Type box.
4. Specify whether you want to use a local value or a value from a parent test suite.
• Use value from parent test suite (if defined): Choose this option if the current test suite is a "referenced" test suite and you want it to use a value from a data source in the parent test suite. See “Using Test Suite References”, page 317 for details on parent test suites.
• Use local value: Choose this option if you always want to use the specified value, even if the current test suite has a parent test suite whose tests set this test variable. Note that if you reset the value from a data bank tool or Extension tool, that new value will take precedence over the one specified here.
5. (For the Data Source type only) Specify the name of the data source and column where the appropriate values are stored. The data source should be in the parent test suite (the test suite that references the current test suite).
6. Enter the variable value in the Value field. If you chose Use local value, the variable will always be set to the specified value (unless it is reset from a data bank tool or Extension tool). If you chose Use value from parent test suite, the value specified here will be used only if a corresponding value is not found in the parent test suite.
7. Click OK.

Using Test Variables Once added, variables can be:

• Used via the "parameterized" option in test fields. For instance, to set a SOAP Client request element to use the value from the title test variable, you would choose the parameterized option for that element and select the title variable.
• Reset from a data bank tool (e.g., an XML Data Bank, as described in “Configuring XML Data Bank Using a Wizard”, page 863).
• Reset from an Extension tool (as described below in “Setting Test Variables and Logic Through Scripting”, page 328).
• Used to define a test logic condition, as described below in “Test Flow Logic”, page 322.


Setting Test Variables and Logic Through Scripting Very often, test suite logic and variables will depend on responses from the service itself. Using an Extension tool, you can set a test suite variable in order to influence test flow execution. For example, if Test 1 returns a variable of x=3, then Test 2 will run. Via the TestSuiteVariable API, located in <SOAtest_62_Installation_Directory>/plugins/com.parasoft.xtest.web_<soatest_version>/help/api, you can access a test suite variable and either set it to a value, or get a value from it. Using this value, you can configure test flow logic. For example, you can enter the following into an Extension tool to set a variable:

    from com.parasoft.api import Application

    def setVar(input, context):
        context.setValue("x", input.toString())

To get a value from a TestSuiteVariable object x:

    varValue = context.getValue("x")

where varValue will be returned as a string. For instance, you can add an XML Transformer tool to a test and extract a certain value from that test. Then, you can add an Extension output to the XML Transformer and enter a script to get the value from the Transformer. Finally, you can set up a second test to run only if the correct value is returned from the first test.

Monitoring Test Variable Usage To configure SOAtest to show what variables are actually used at runtime, set Console preferences (SOAtest> Preferences> Console) to use normal or high verbosity levels. After each test, the Console view (Show View> SOAtest> Console) will then display the test variables used at runtime. For example:

Scenario: ICalculator
Test 1: first add - success
    get x=0
    set x=10.0
    set Test 1: type=xsd:float
Test 2: second add - success
    get x=10
    set x=20.0
Test 3: third add - success
    get x=20
    set x=30.0
Test 1: first add - success
    get x=30
    set x=50.0
    set Test 1: type=xsd:float
Test 2: second add - success
    get x=50
    set x=70.0
Test 3: third add - success
    get x=70
    set x=90.0

Viewing such variables is useful for diagnosing the cause of any issues that occur.

Tutorial For a step-by-step demonstration of how to use test variables, see “Creating Reusable (Modular) Test Suites”, page 92.


Specifying SOAP Client Options

You can customize the following options in the SOAP Client Options tab of the test suite configuration panel:

• Endpoint: If you would like to specify an endpoint for all tests within the test suite, enter an endpoint and click the Apply Endpoint to All Tests button.

• Timeout after (milliseconds): If you do not want to use the default, select Custom from the drop-down menu and enter the desired time. The default value is 30000.

• Attachment Encapsulation Format: Select Custom from the drop-down menu, then select MIME, DIME, MTOM Always, or MTOM Optional. The default value is MIME.

• SOAP Version: Select Custom from the drop-down menu, then select either SOAP 1.1 or SOAP 1.2. The default value is SOAP 1.1.

• Outgoing Message Encoding: Allows you to choose the encoding for outgoing messages. You can choose any character encoding you wish from the SOAtest Preferences panel to read and write files, but the Outgoing Message Encoding setting provides additional flexibility: it lets you set a charset encoding for the SOAP request that differs from the global setting.
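The practical effect of the outgoing encoding setting can be seen by serializing the same message under two charsets (a minimal illustration; the envelope text is made up):

```python
# The same SOAP envelope produces different byte streams under different
# charsets, which is why a per-test-suite outgoing encoding can matter.
envelope = '<?xml version="1.0"?><soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"/>'

utf8_bytes = envelope.encode("utf-8")
utf16_bytes = envelope.encode("utf-16")

assert utf8_bytes != utf16_bytes                 # different wire bytes...
assert utf16_bytes.decode("utf-16") == envelope  # ...same logical message
```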

Specifying Browser Playback Options

The Browser Playback Options tab is divided into four sections:

• Browser Playback: Specifies the browser type to use in the test. The panel contains a drop-down menu and three radio buttons (Firefox, Internet Explorer, Both). If the drop-down menu is set to Default, all three radio buttons are disabled, and this test suite chooses which browsers to replay the test in based on its parent's choice. If there is no parent (i.e., the root test suite), the default is to play the test back in both Firefox and Internet Explorer. If Custom is selected from the drop-down menu, the selected test suite, as well as any of its child test suites with Default selected in the same drop-down box, will play back the test in the selected browser.

	• Firefox: If this option is selected, SOAtest will play back the test in the Firefox browser using the appropriate Firefox executable (see Firefox executable path below). SOAtest supports Mozilla Firefox 1.5 and higher.

	• Internet Explorer: If this option is selected, SOAtest will play back the test in the Internet Explorer browser (Windows machines only). SOAtest supports Internet Explorer 6 and higher.

	• Run in specified browser only: Enable this option if you want to ensure that this test is never played in an alternate browser (e.g., because the web page structure is significantly different on other browsers and the scenario would need to be constructed differently there).

	• Both: If this option is selected, SOAtest will play back the test in both Firefox and Internet Explorer.

• Firefox executable path: This value is inherited from the parent if the Default option is selected for Browser type. You may select which version of Firefox to use by selecting it from the drop-down menu, or by clicking the Browse to executable button and selecting the appropriate version. Note: On Windows machines, SOAtest will attempt to detect a Firefox installation automatically. Linux users will have to browse to the Firefox executable.




• Visibility: Describes the visibility of the tests as they play back. This option is inherited from the parent if Default is selected. You may choose Headless or Visible if Custom is selected.

	• In Headless mode, you will not be able to see the tests as they run (i.e., the browser will not be visible while the test is running). Headless mode is supported as follows:

		• Windows: Fully supported.

		• Linux: Supported on Linux 2.4.21-27.0.2 kernel builds and later (tested on Red Hat, Debian, and Mandrake architectures).

		• Solaris: Supported on Solaris 9 and 10.

	• In Visible mode, you can watch in the browser as the test runs and visually verify that the test ran correctly.

• Authentication: Allows you to specify a Username and Password for Basic and NTLM authentication of your web application. If you enter a username/password during recording, this section will be configured automatically in the recorded scenario. However, you can later go back and modify the settings. The settings in this section are also inheritable from parent test suites; therefore, if you have many functional test scenarios that require authentication, you can specify the settings in a high-level test suite that contains all the functional scenarios.

	• The Perform Authentication checkbox specifies whether to send authentication credentials to the web application. If it is checked, credentials will be sent.
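For Basic authentication, the credentials you enter travel as a base64-encoded request header. A short sketch of what any HTTP client sends on the wire for Basic authentication (the username and password are made up):

```python
import base64

user, password = "tester", "secret"  # illustrative credentials only

# Basic authentication: "user:password", base64-encoded, in an Authorization header.
token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
header = f"Authorization: Basic {token}"
```

NTLM, by contrast, is a challenge-response exchange rather than a single static header, which is why SOAtest handles both schemes for you behind the Username and Password fields.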

Specifying Reporting Options

The Reporting Options tab allows you to customize meta-attributes that SOAtest sends to different reporting tools while running the current test suite.

To specify reporting attributes for CentraSite Active SOA:

1. Select CentraSite from the Reporting options drop-down menu.
2. Specify the UDDI Service Key.

To specify test-suite-specific reporting attributes for Parasoft Report Center:

1. Select Report Center Attributes from the Reporting options drop-down menu.
2. Click Add, then specify the reporting attributes.

• For details on attributes, see “Configuring Report Center Attributes”, page 204.


Adding Standard Tests

This topic explains how to add standard tests: tests that are added to a test suite and executed in the order in which they are listed.

To add a standard test case to a test suite:

1. Select the Test Case Explorer tree node that represents the test suite you want to extend.
2. Click the Add Test or Output toolbar button. The Add Test wizard opens and displays a list of available tools. To learn about a particular tool, select it and review the description that displays.
3. In the left pane, select Standard Test.
4. In the right pane, select the tool you want to use.
5. Click Finish.
6. Double-click the node added for that tool, then review and modify settings as needed in the tool configuration panel that opens on the right. Most settings vary from tool to tool. If an Input tab is shown in the tool configuration panel, you need to specify what text or file you want the tool to operate on. To do this, complete the tab as follows:

	• Text: Use this option if you want to type or copy the document into the UI. Select the appropriate MIME type, then enter the text in the text field below the Text radio button.

	• File: Use this option if you want to use an existing file. Click the Browse button to choose a file. Check the Persist as Relative Path option if you want the path to this file to be saved relative to the current configuration file; enabling this option makes it easier to share tests across multiple machines. If this option is not enabled, the test suite will save the path to this file as an absolute path.


Adding Set-Up and Tear-Down Tests

This topic explains how to add Set-Up and Tear-Down tests: tests that are executed before or after the rest of the test suite. SOAtest's Set-Up and Tear-Down tests mirror the setUp and tearDown operations in JUnit test cases.

To add a Set-Up or Tear-Down test:

1. Select the Test Case Explorer tree node that represents the test suite you want to extend.
2. Click the Add Test or Output toolbar button. The Add Test wizard opens and displays a list of available tools. To learn about a particular tool, select it and review the description that displays.
3. In the left pane, select Set-Up Test (if you want the tool to run before the test suite executes) or Tear-Down Test (if you want the tool to run after the test suite executes).
4. In the right pane, select the tool you want to use.
5. Click Finish.
6. Double-click the node added for that tool, then review and modify settings as needed in the tool configuration panel that opens on the right. Most settings vary from tool to tool. If an Input tab is shown in the tool configuration panel, you need to specify what text or file you want the tool to operate on. To do this, complete the tab as follows:

	• Text: Use this option if you want to type or copy the document into the UI. Select the appropriate MIME type, then enter the text in the text field below the Text radio button.

	• File: Use this option if you want to use an existing file. Click the Browse button to choose a file. Check the Persist as Relative Path option if you want the path to this file to be saved relative to the current configuration file; enabling this option makes it easier to share tests across multiple machines. If this option is not enabled, the test suite will save the path to this file as an absolute path.


Adding Test Outputs

This topic explains how you can access and manipulate test case output. Sections include:

• Understanding Outputs
• Adding an Output
• SOAP Client Output Options
• Messaging Client Output Options
• General Tool Output Options

Understanding Outputs

You can have multiple tools perform operations on the result of one tool by adding multiple outputs to the appropriate tool node. Or, you can have one output perform an operation on the result of another output by adding an additional output to an existing output node.

Test suite tests based on a SOAP Client tool typically use outputs to operate on the messages that the SOAP Client tool returns. For example, if you wanted to test whether a certain SOAP remote procedure call always returned the same response for a given input, you might create a SOAP Client tool that sent the specified input, then use a Diff tool to verify the response. You could also use outputs to send the HTTP traffic from this test to the Results window so you could view the traffic. Or, if you wanted to test whether a Web service returned values in a correct format, you might create a SOAP Client tool, then use a Coding Standards output to apply a set of rules that checks whether the output matches a certain required pattern. If it does, you could attach a regression control to the Coding Standards output; SOAtest would then alert you if the Web service failed to match that required pattern in subsequent tests. Or, if you have an Extension tool that retrieves XML-format data, you could customize that tool so that it always sends its output to a RuleWizard rule that verifies whether the data is correct.

If you want to apply a transformation tool such as XSLT and would like the transformed source saved in a file, you need to chain a File Writer tool to the original tool. You can also use outputs to save the results of file transformations. For example, you could use the XSLT tool to transform a set of files, then send the resulting files to a Write File output so that they would be saved.

Adding an Output

To add an output:

1. Select the Test Case Explorer tree node that represents the test that you want to add an output for.
2. Click the Add Test or Output toolbar button. The Add Test wizard opens and displays a list of available tools. To learn about a particular tool, select it and review the description that displays.
3. (Web tests only) Specify whether you want to use data from the browser contents (rendered HTML) or the original HTTP traffic, then click Next.






	• Browser contents refers to the real-time data model that the browser constructed from all of the HTML, JavaScript, CSS, and other files it loaded. Choose Browser contents if you want to validate values that appear in the browser (i.e., use the final DOM that the browser constructs from the server code). This option allows you to:

		• Add a Browser Validation tool to validate values that appear in the browser.

		• Add an Extension tool for complex validations that require significant scripting.

		• Add a Browser Data Bank that extracts a value that appears in the browser.

	• HTTP traffic refers to the individual HTTP messages that the browser made in order to construct its data model. Choose HTTP traffic if you want to use the content returned by the server as-is (before any browser processing). This option allows you to validate individual HTTP requests/responses.

4. In the left pane of the Add Output wizard, select the node that specifies the output type you want to use as a tool input.
5. In the right pane, select the tool you want to use.
6. Click Finish.
7. Double-click the node added for that tool, then review and modify settings as needed in the tool configuration panel that opens on the right.

SOAP Client Output Options

In the left pane of the Add Output wizard are three sub-menus: Response, Request, and Both. In the right pane of the Add Output wizard are the New or Existing Output tools that you may select.

• Response: Sends the SOAP response to one of the following:

	• Transport Header: Allows you to add the desired New/Existing Output tool to the HTTP headers or JMS properties specified as the transport of the SOAP Client tool.

	• SOAP Envelope: Allows you to add the desired New/Existing Output tool to the SOAP envelope.

	• Attachment: Allows you to add the Attachment Handler tool as the output.

• Request: Sends the SOAP request to one of the following:

	• Transport Header: Allows you to add the desired New/Existing Output tool to the HTTP headers or JMS properties specified as the transport of the SOAP Client tool.

	• SOAP Envelope: Allows you to add the desired New/Existing Output tool to the SOAP envelope.

• Both: Sends both the SOAP response and SOAP request to one of the following:

	• Traffic Object: Allows you to add an Extension, Traffic Viewer, or WS-I tool as an output.

	• Traffic Stream: Sends the HTTP traffic (SOAP request and SOAP response) to the selected location. Available options include:

		• File: Sends the output to a file. If you choose this option, be sure to select the new FileStreamWriter output node, then specify where you want the file written in the control panel that opens. Moreover, if you want to ensure that this file's path is always relative to your test file, enable the Persist as Relative Path to Test option in that same panel.

		• Results: Sends output to the GUI Message panel.

		• stderr: Sends standard error to a console window.

		• stdout: Sends standard output to a console window.

Messaging Client Output Options

In the left pane of the Add Output wizard are three sub-menus: Response, Request, and Both. In the right pane of the Add Output wizard are the New or Existing Output tools that you may select.

• Response: Sends the HTTP response to one of the following:

	• Transport Header: Allows you to add the desired New/Existing Output tool to the HTTP headers or JMS properties specified as the transport of the Messaging Client tool.

	• HTTP Response: Allows you to add the desired New/Existing Output tool.

• Request: Sends the HTTP request to one of the following:

	• Transport Header: Allows you to add the desired New/Existing Output tool to the HTTP headers or JMS properties specified as the transport of the Messaging Client tool.

	• SOAP Envelope: Allows you to add the desired New/Existing Output tool to the SOAP envelope.

• Both: Sends both the HTTP response and HTTP request to one of the following:

	• Traffic Object: Allows you to add an Extension, Traffic Viewer, or WS-I tool as an output.

	• Traffic Stream: Sends the HTTP traffic (HTTP request and HTTP response) to the selected location. Available options include:

		• File: Sends the output to a file. If you choose this option, be sure to select the new FileStreamWriter output node, then specify where you want the file written in the control panel that opens. Moreover, if you want to ensure that this file's path is always relative to your test file, enable the Persist as Relative Path to Test option in that same panel.

		• Results: Sends output to the GUI Message panel.

		• stderr: Sends standard error to a console window.

		• stdout: Sends standard output to a console window.

General Tool Output Options

You can typically send the output of most tools (other than the SOAP Client tool) directly to a tool of your choice. In addition, you can send SOAP Client output to tools if you first choose the XML Request output option.

Some tools (such as Coding Standards) have two outputs:

• Messages about the test.

• The transformed source produced during the test.

In these cases, you will see the following options:

• Messages: Sends the previous tool's result messages to the specified tool.

• Transformed Source: Sends the source code created/modified by the previous tool to the next specified tool. For example, you would use this option if you wanted to send the file that results from an XSLT tool to the Validate XML tool.


Adding Global Test Suite Properties

This topic explains how to create JMS, XPath, SOAP Header, Database, Key Store, Tool, and WS-Policy properties that can be shared and referenced globally across a test suite. Sections include:

• Global JMS Connection Properties
• Global Ignored XPath Properties
• Global SOAP Header Properties
• Global Database Account Properties
• Global Key Stores
• Global Tools
• Global WS-Policy Banks

Global JMS Connection Properties

When creating a large test suite with multiple tools, there may be instances where some tools (i.e., the SOAP Client, Messaging Client, and Call Back tools) use the same JMS connection properties. Rather than manually entering the same information into each tool or copying and pasting settings between tools, it may be easier to create JMS settings that each tool can reference. In this case, you can create global JMS Connection Properties at the test suite level.

To create global JMS Connection Properties, complete the following:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Property> JMS Connection Properties from the Add Global wizard and click Finish. A Properties node displays in the Test Case Explorer tree and the JMS Connection Properties panel displays in the right side of the GUI.
3. Specify the settings in the JMS Connection Properties panel as follows:
   a. If you want to change the default name, enter the new name in the Name field. This will be the name that appears in the SOAP Client, Messaging Client, and Call Back tools from which you will reference these properties. Because you can create more than one global reference for JMS Connection Properties, the name you enter should make its use intuitive.
   b. Click the Add Property to All Tests button (if you don't click this button, the global properties you add will be ignored by the tests in the test suite). Depending on what you select from the drop-down list, one of the following will occur:
      • If Use Shared Property Only is selected from the drop-down list, the corresponding tests in the test suite will be able to use only the global property you added.
      • If Use Local and Shared Properties is selected from the drop-down list, the corresponding tests in the test suite will be able to use the global property you added as well as any properties configured within the individual test itself.
   c. In the Provider URL field, specify the location of the JMS administered objects.
   d. In the Initial Context field, specify the Java class that contains all the JMS property mappings.
   e. In the Connection Factory field, specify the key used to look up the MOM-specific factory from the initial context. This can be either a Queue Connection Factory or a Topic Connection Factory.
   f. In the Authentication area, select the Perform Authentication checkbox and enter the Username and Password to authenticate the request. If the correct username and password are not used, the request will not be authenticated.

Note: Only the SOAP Client, Messaging Client, and Call Back tools can reference global JMS Connection Properties. After specifying the global JMS Connection Properties, you can share these properties across multiple instances of these SOAtest tools.
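As a concrete illustration, a test suite pointing at a local Apache ActiveMQ broker might use values like the following (the URL and broker choice are examples only; consult your messaging provider's documentation for its JNDI settings):

```text
Provider URL:        tcp://localhost:61616
Initial Context:     org.apache.activemq.jndi.ActiveMQInitialContextFactory
Connection Factory:  ConnectionFactory
```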

Global Ignored XPath Properties

As with global JMS properties, there may be instances when you have multiple Diff tools that use the same XPath settings. Rather than manually entering the same information into each Diff tool or copying and pasting settings between Diff tools, it may be easier to create XPath settings that each Diff tool can reference. In this case, you can create global XPath properties at the test suite level.

To create a global list of ignored XPaths, complete the following:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Property> Ignored XPaths from the Add Global wizard and click Finish. A Properties node displays in the Test Case Explorer tree and the Ignored XPaths panel displays in the right side of the GUI.
3. Specify the settings in the Ignored XPaths panel as follows:
   a. If you want to change the default name, enter the new name in the Name field. This will be the name that appears in the Diff tools from which you will reference these XPaths. Because you can create more than one global reference list for XPaths, the name you enter should make its use intuitive.
   b. Click the Add Property to All Tests button (if you don't click this button, the global properties you add will be ignored by the tests in the test suite). Depending on what you select from the drop-down list, one of the following will occur:
      • If Use Shared Property Only is selected from the drop-down list, the corresponding tests in the test suite will be able to use only the global property you added.
      • If Use Local and Shared Properties is selected from the drop-down list, the corresponding tests in the test suite will be able to use the global property you added as well as any properties configured within the individual test itself.
   c. Click the Add button. An empty field displays in the XPath column of the Ignored XPaths List. By default, the Settings column is automatically filled in with all XPath operations specified, meaning that the entire XPath you add will be ignored.
   d. Using the Ignored XPath settings dialog that opens when you double-click in the XPath column, specify an XPath position. The XPath you enter can be shared by multiple Diff tools within the test suite.
      • If you want to ignore more than one attribute at an element's XPath location, leave the attribute name empty or use the wildcard * (e.g., myAttribute*).
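To illustrate what ignoring an XPath does during a diff (this is a conceptual sketch in plain Python, not SOAtest's implementation): nodes at the ignored location are disregarded before the two payloads are compared, so a response that differs only there still passes.

```python
import xml.etree.ElementTree as ET

def without_ignored(xml_text, ignored_tag):
    """Drop every child element named ignored_tag, then reserialize."""
    root = ET.fromstring(xml_text)
    for parent in list(root.iter()):
        for child in parent.findall(ignored_tag):
            parent.remove(child)
    return ET.tostring(root)

# Two responses that differ only in a timestamp element:
golden = "<resp><id>1</id><timestamp>2009-01-01</timestamp></resp>"
actual = "<resp><id>1</id><timestamp>2009-06-15</timestamp></resp>"

# With the timestamp location ignored, the payloads compare as equal.
payloads_match = without_ignored(golden, "timestamp") == without_ignored(actual, "timestamp")
```

In SOAtest, the same idea is expressed declaratively: each entry in the Ignored XPaths list names a location that every referencing Diff tool excludes from comparison.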


Global SOAP Header Properties

When creating a large test suite with multiple tools, there may be instances where SOAP Client tests use the same SOAP header properties. Rather than manually entering the same information into each test or copying and pasting settings between tests, it may be easier to create SOAP headers that each test can reference. In this case, you can create global SOAP Header properties at the test suite level.

To create a global SOAP Header, complete the following:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Property> SOAP Headers from the Add Global wizard and click Finish. A Properties node displays in the Test Case Explorer tree and the SOAP Headers panel displays in the right side of the GUI.
3. Specify the settings in the SOAP Headers panel as follows:
   a. If you want to change the default name, enter the new name in the Name field.
   b. Click the Add Property to All Tests button (if you don't click this button, the global properties you add will be ignored by the tests in the test suite). Depending on what you select from the drop-down list, one of the following will occur:
      • If Use Shared Property Only is selected from the drop-down list, the corresponding tests in the test suite will be able to use only the global property you added.
      • If Use Local and Shared Properties is selected from the drop-down list, the corresponding tests in the test suite will be able to use the global property you added as well as any properties configured within the individual test itself.
   c. Click the Add button. The Choose Header Type dialog displays.
   d. Select a SOAP Header type from the Available Header types list and click OK.
   e. Configure the SOAP Header parameters as needed. For more information on each SOAP Header, see “Adding SOAP Headers”, page 811.


Global Database Account Properties

When creating a large test suite with multiple tools, there may be instances where DB tools use the same database properties. Rather than manually entering the same information into each tool or copying and pasting settings between tools, it may be easier to create a database account that each tool can reference. In this case, you can create global Database Account properties at the test suite level.

To create a global Database Account, complete the following:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Property> Database Account from the Add Global wizard and click Finish. A Properties node displays in the Test Case Explorer tree and the Database Account panel displays in the right side of the GUI.
3. Specify the settings in the Database Account panel as follows:
   a. If you want to change the default name, enter the new name in the Name field.
   b. Click the Add Property to All Tests button (if you don't click this button, the global properties you add will be ignored by the tests in the test suite). Depending on what you select from the drop-down list, one of the following will occur:
      • If Use Shared Property Only is selected from the drop-down list, the corresponding tests in the test suite will be able to use only the global property you added.
      • If Use Local and Shared Properties is selected from the drop-down list, the corresponding tests in the test suite will be able to use the global property you added as well as any properties configured within the individual test itself.
   c. Configure the rest of the Database Account settings as needed.
      • If the account settings are stored in a file, select File, then specify the path to that file. To refresh/reload the file (e.g., if you edited it outside of SOAtest), click Refresh Configuration Settings.
      • If you want to specify the settings in this panel, select Local, then specify the settings in the Driver, URL, Username, and Password fields. To export these values to a file, click Export Configuration Settings. Once the values are exported to a file, you can import the file through the File> Input File control (described above). This way, you won't have to re-type the same values if you want to add this same account to a different test suite.
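For example, a Local account for a hypothetical MySQL database could look like the following (the database name and credentials are invented; the driver class shown is the MySQL Connector/J driver, which must be available to SOAtest):

```text
Driver:   com.mysql.jdbc.Driver
URL:      jdbc:mysql://localhost:3306/testdb
Username: dbuser
Password: ******
```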

Global Key Stores

Key Stores contain the certificates and private keys needed to secure Web service communication through means such as server/client authentication, XML encryption, and XML digital signatures. The values you specify in a Key Store will be available to use with the SOAP Client, XML Encryption, and XML Signer tools. The SOAP Client tool can use a Key Store certificate to complete the handshake with a server. The XML Encryption tool can use a Key Store certificate to encrypt XML documents, and the XML Signer tool can use a Key Store certificate and private key to sign and verify your identity in an XML document.


To add a key store:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Key Store from the Add Global wizard and click Finish. A Properties node displays in the Test Case Explorer tree and the Key Store panel displays in the right side of the GUI.
3. If you want to change the default name, enter the new name in the Name field.
4. Specify the settings in the Key Store panel's Certificate tab as follows:
   a. Select Use same key store for private key if the Key Store contains private keys for the certificate.
   b. In the Key Store File field, specify the key store file by clicking the Browse button and using the file chooser that opens. If you want the path saved as a relative path (for example, to facilitate project sharing), check the Persist as Relative Path option.
   c. In the Key Store Password field, specify the Key Store password and select the Save option if you want to save the password on future runs of the test.
   d. In the Key Store Type box, select the type of Key Store being used (e.g., JKS, PKCS12, BKS, UBER).
   e. Click Load to populate the aliases with the available certificates/keys (if the path, type, and key store password are valid), then choose the certificate alias in the Certificate Alias box.
5. Specify the settings in the Key Store panel's Private Key tab as follows (the Key Store File, Key Store Password, and Key Store Type fields here are available only if the option described in step 4a is unselected in the Certificate tab):
   a. In Key Store File, specify the key store file by clicking the Browse button and using the file chooser that opens. If you want the path saved as a relative path (for example, to facilitate project sharing), check the Persist as Relative Path option.
   b. In Key Store Password, specify the Key Store password and select the Save option to remember the password on future runs of the test.
   c. In Key Store Type, specify the type of Key Store being used (e.g., JKS, PKCS12, BKS, UBER).
   d. Click Load to populate the aliases with the available certificates/keys (if the path, type, and key store password are valid), then choose the private key alias in the Private Key Alias box.
   e. In Private Key Password, specify the private key password and select the Save option to remember the password on future runs of the test.

Global Tools

If you expect to use certain specialized tools only within the context of the current test suite (for example, a particular XSLT tool, or a “chained” tool that operates on a request or response, then sends its output to additional tools, which can send their output to additional tools, and so on), you can add them to the test suite tool repository, then add them to the test suite without recreating them each time. (If you plan to use a specialized tool for multiple test suites, you should instead add it to the program via the Tools panel available when you choose Tools> Customize.)

To add a tool to a test suite's tools repository:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Tool> [Tool_name] from the Add Global wizard and click Finish. A new tool node displays in the Test Case Explorer tree (under the Tools branch, which will be added if it did not already exist) and a tool configuration panel displays in the right side of the GUI.
3. Customize that tool's settings in the tool configuration panel that opens.
4. If desired, chain additional tools to that tool as described in “Adding an Output”, page 333.

To use a repository tool in a test, select it from the Existing Tools list that is available when you add a test or output.

Global WS-Policy Banks One of the biggest aspects of Web services is interoperability. Web services rely on a standardized interface to declare what requirements must be met in order for a service consumer to interact with a service provider. The basic WSDL specification does not have the capacity to declare complex clientside requirements. To accommodate for this, WSDL is extended with WS-Policy and WS-PolicyAttachment, allowing a service provider to define additional requirements within the WSDL. WS-Policy leaves it up to other WS-* specifications to define their own set of policies. One such specification is WSSecurityPolicy which defines policies related to WS-Security. When reading a WSDL with SecurityPolicy extensions, SOAtest automatically generates the test cases with all the necessary policy related configurations. There are some attributes of the test case that still require manual configuration, but SOAtest will automatically set up the foundation. Note: WS-Policy is a lightweight specification. It delegates policy design to the WS-* specifications; in addition, there is a large set of proprietary policies. Since the WS-* space is large, SOAtest only supports WS-SecurityPolicy assertions but will continue to extend the processor to handle other common assertion sets. To add a global WS-Policy Bank, complete the following:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.


2. Select WS-Policy Bank from the Add Global wizard and click Finish. A new WSDL Policies node displays in the Test Case Explorer tree (under the WS-Policy Banks branch, which will be added if it did not already exist), a WSDL Policies configuration panel displays in the right side of the GUI, and various WS-Security tests will be chained to the SOAP Client tools.

3. Specify the settings in the WSDL Policies panel as follows:
   a. If you want to change the default name, enter the new name in the Name field.
   b. In WSDL URI, specify the WSDL URI where this Web service can be accessed. You can either enter a WSDL or click the Browse button.
   c. Click Refresh from WSDL to refresh the WSDL from the given location URL and reparse it.

In the Global Policies area, review the policy definitions in non-XML format, as well as the policy alternatives implied by your WSDL. Each section in the left-hand tree represents a global policy element in the WSDL.


Reusing and Reordering Tests

This topic explains how to reorder tests as well as reuse tests and other test assets.

You can drag and drop tests to reorder them. In addition, you might want to cut/copy and paste tests in the following situations:
• You want to create new tests that are very similar to existing tests.
• You want to change the order of test execution (tests are executed in the order in which they are listed in the Test Case Explorer).
• You want to copy tests or test suites from one project to another.

To copy or cut a test or test suite:
1. Right-click the Test node or Test Suite node that represents the test or test suite that you want to cut/copy.
2. Choose Cut or Copy from the shortcut menu.

To paste a test or test suite at the end of a test suite:
1. Right-click the Test Suite node.
2. Choose Paste.

To paste a test or test suite in a specific position in a test suite:
1. Right-click the Test node above which you want the test case or test suite pasted.
2. Choose Paste.

Copying/Pasting Other Test Assets

You can also copy/paste other test assets—such as chained tools, data sources, transport settings, etc.—as needed.


Parameterizing Tests (with Data Sources or Values from Other Tests)

This topic explains how you can quickly extend the scope and comprehensiveness of your functional testing by parameterizing tests with values that are stored in data sources or extracted from other tests. Parameterization can be applied to test case inputs as well as data validation. Sections include:
• Parameterizing Tests with Values Extracted from Another Test
• Understanding How SOAtest Performs Functional Testing Using Data Sources
• Adding a Data Source
  • Adding a CSV File Data Source
  • Adding a Database Data Source
  • Adding an Excel Spreadsheet Data Source
  • Specifying Data Source Values in a Table
  • Combining Multiple Data Sources into an Aggregate Data Source
  • Adding a File Data Source
  • Adding a Writable Data Source
  • Setting Up a "One-to-Many" Data Source Mapping
  • Generating a Data Source Template for Populating Message Elements
  • Cutting, Copying, and Pasting Data Sources
• Performing Functional Tests Using Data Sources
  • Understanding Data Source Iteration
  • Configuring the SOAP Client and Diff Tools to Use Data Sources
  • Parameterizing Arrays of Varying Size
• Using Interpreted Data Sources
  • Generating a Data Source from the fields of a Java Bean
  • Interpreted Data Source Table Format
• Populating and Parameterizing Elements with Data Source Values

Parameterizing Tests with Values Extracted from Another Test

You can parameterize tests by extracting values from one test and then making them available in another test. This is accomplished with tools such as:
• XML Data Bank
• Browser Data Bank
• Header Data Bank
• JSON Data Bank
• Object Data Bank
• Text Data Bank


Understanding How SOAtest Performs Functional Testing Using Data Sources

Another way to parameterize tests is with data source values. For example, you can configure SOAtest to send data source values as part of a request to a server. The values that SOAtest receives in response can then be compared to another data source value to check if the response received is correct. SOAtest will check each available combination of data source rows. This behavior is particularly useful if you want SOAtest to perform functional testing on a number of different inputs stored in a data source. For more information, see “Performing Functional Tests Using Data Sources”, page 354.

SOAtest can perform functional testing using values from any of the following types of data sources:
• CSV files
• Databases
• Excel spreadsheets
• Tables created in (or copied into) the internal table editor
• File
• Writable
• Aggregate

Tip: Generating a Data Source Template for Populating Message Elements

Manually creating a data source for parameterizing large, complex XML messages can be time-consuming and tedious. For a fast way to accomplish this, have SOAtest automatically generate a CSV data source template based on the structure of the request or response message that you want to parameterize. Columns in the generated data source are automatically mapped to the appropriate elements in the request or response message. The only thing you need to do is add values to the generated data source template. For details, see “Generating a Data Source Template for Populating Message Elements”, page 353.

Adding a Data Source

Data sources are added at the test suite level and saved in .tst files. You can specify any number of data sources for a test suite, and you can use any specified data sources throughout a test suite’s tests.

In addition, you can create an aggregate data source in which you can combine the values of other available data sources into a single data source. This is especially useful if you would like to perform a functional test that needs to draw values from multiple data sources. For example, in sending a request to a server, you may want to send values from a data source that contains user information such as a first and last name, and you may also want to send values from a separate data source that contains the user’s login and password information. By combining the two data sources into a single aggregate data source, you can create a single test instead of having to create separate tests for each data source.

The procedure for adding a data source depends on the type of data source you want to add. The following topics explain how to add each type of data source:
• Adding a CSV File Data Source
• Adding an Excel Spreadsheet Data Source
• Specifying Data Source Values in a Table
• Adding a Database Data Source
• Combining Multiple Data Sources into an Aggregate Data Source
• Adding a File Data Source
• Adding a Writable Data Source

Once a data source is added, it will be represented in the Data Sources branch of the Project tree. SOAtest will add one node for each available data source. To view or change data source settings, select that node, then view or modify the options listed in the right GUI panel.

If you right-click a data source node that is not a table, the Create Table option will be available in the shortcut menu. Selecting this option creates a new table data source that contains the same data and settings as the original data source. The newly created data source will be added as a node to the Data Sources branch of the project tree. For more information on table data sources, see “Specifying Data Source Values in a Table”, page 349.

Adding a CSV File Data Source

To add a CSV file data source:
1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select CSV and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.
   • If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in both the From and To fields.
5. Specify the path to the CSV file in the File Path field.
6. Specify the type of separator and quotes that the file uses.
7. If you want to see a list of the columns from that data source, click Show Columns.
   • SOAtest assumes that the first row of values represents your column titles. If it does not, you might have trouble identifying and selecting your data source columns in SOAtest. If you want SOAtest to use different column titles, update the first row of your data source, then click the Show Columns button.
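As a rough illustration of how the separator and quote settings interact with the first-row column titles, here is a sketch using Python’s csv module. This is not SOAtest code, and the sample data and column names are made up; it only shows why a semicolon-separated file with quoted fields parses into the columns SOAtest would list.

```python
# Illustrative sketch (not SOAtest code): parsing a CSV data source whose
# separator is ";" and whose quote character is '"'. The first row of
# values supplies the column titles.
import csv
import io

data = 'name;"zip code"\n"Smith, John";91016\nDoe;10001\n'

reader = csv.reader(io.StringIO(data), delimiter=";", quotechar='"')
rows = list(reader)

columns = rows[0]   # first row supplies the column titles
records = rows[1:]  # remaining rows are data source values

print(columns)      # ['name', 'zip code']
print(records[0])   # ['Smith, John', '91016']
```

Note that the comma inside "Smith, John" survives intact because the separator is a semicolon and the field is quoted.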

Adding a Database Data Source

To add a database data source:
1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Database and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.
   • If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in both the From and To fields.
5. Specify the Database Configuration parameters. For more information, see the following section.

Database Configuration Parameters

To configure the settings in the Database Configuration panel:
1. In the Driver class box, select the type of driver to use. The following are download links for common drivers:
   • Oracle: http://www.oracle.com/technology/software/tech/java/sqlj_jdbc/index.html
   • MySQL: http://dev.mysql.com/downloads/connector/j/
   • Windows System DSN: This driver is included with Java.
   • SQLServer: http://msdn.microsoft.com/data/jdbc/
   • Sybase: http://www.sybase.com/products/allproductsa-z/softwaredeveloperkit/jconnect
   • DB2: http://www-306.ibm.com/software/data/db2/java/
   • Other: http://developers.sun.com/product/jdbc/drivers
2. Specify the settings for that particular type of driver. Settings will vary from driver to driver. Common settings include:
   • Driver class: Type the path to the appropriate JDBC driver class, including the package name. For example, you might enter the following if you were using an Oracle database: oracle.jdbc.driver.OracleDriver
     The driver that you enter must be available on your CLASSPATH; if it is not, SOAtest will not be able to access your database. To add the driver jars (or zip files) to the CLASSPATH, go to SOAtest> Preferences, then add the file(s) under the JDBC Drivers section.
   • URL: Type the appropriate URL. For example, you might enter the following if you were using an Oracle database: jdbc:oracle:thin:@bear:1521:mydb

Here are some examples of drivers and URLs you might use for different databases:
• Oracle: Driver: oracle.jdbc.driver.OracleDriver; URL: jdbc:oracle:thin:@host:port:dbName
• MySQL: Driver: com.mysql.jdbc.Driver; URL: jdbc:mysql://host:port/dbName
• Windows System DSN: Driver: sun.jdbc.odbc.JdbcOdbcDriver; URL: jdbc:odbc:DATABASE_NAME (where DATABASE_NAME is the database name from your System DSN settings)
• SQLServer: Driver: com.microsoft.sqlserver.jdbc.SQLServerDriver; URL: jdbc:sqlserver://host:port;DatabaseName=DATABASE_NAME (note that older drivers may have different settings)
• Sybase: Driver: com.sybase.jdbc2.jdbc.SybConnectionPoolDataSource; URL: jdbc:sybase:Tds:host:port/dbName
• DB2: Driver: COM.ibm.db2.jdbc.app.DB2Driver (included in db2java.zip, which comes with the DB2 run-time client); URL: jdbc:db2://host/dbName

Other common settings include:
• Username: Type a valid username for this database (if the database requires passwords).
• Password: Type the password for the given username (if the database requires passwords).
• SQL Query: Type or copy the SQL query that expresses which data you want to use.

If you want to check what column names SOAtest is using, click the Show Columns button. If you want SOAtest to use different column titles for the existing columns, update your database column names, then click the Show Columns button. If you want SOAtest to use different columns, update your SQL query so that it retrieves the appropriate columns, then click the Show Columns button.
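The URL patterns listed above follow per-vendor templates built from a host, a port, and a database name. The following Python sketch (illustrative only; the host and database names are made up, and this is not SOAtest code) shows how those templates expand:

```python
# Illustrative sketch (not SOAtest code): the JDBC URL formats listed above,
# built from a host, port, and database name. Example values are made up.
def oracle_url(host, port, db):
    return f"jdbc:oracle:thin:@{host}:{port}:{db}"

def mysql_url(host, port, db):
    return f"jdbc:mysql://{host}:{port}/{db}"

def sqlserver_url(host, port, db):
    return f"jdbc:sqlserver://{host}:{port};DatabaseName={db}"

# Matches the Oracle example from the Database Configuration section.
print(oracle_url("bear", 1521, "mydb"))  # jdbc:oracle:thin:@bear:1521:mydb
```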

Adding an Excel Spreadsheet Data Source

To add an Excel spreadsheet data source:
1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Excel and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.
   • If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in both the From and To fields.
5. Specify the path to the Excel file in the File Path field.
6. Select the sheet of the specified Excel file you would like to use from the Sheet menu.

Important: SOAtest assumes that the first row of values represents your column titles. If it does not, you might have trouble identifying and selecting your data source columns in SOAtest. If you want SOAtest to use different column titles, update the first row of your data source, then click the Show Columns button.

Specifying Data Source Values in a Table

To specify data source values by entering or pasting them into an internal table editor:
1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Table and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.
   • If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in both the From and To fields.
5. If you want to specify column names (rather than use the default A, B, C, D, etc.), check First row specifies column names.
6. Add the data values by typing or pasting them into the table. You can copy from popular spreadsheets such as Excel. Note that the table editor contains standard copy/cut/paste editing commands (when a cell is selected) as well as commands to insert rows or tables. To add more rows, use the downward arrow key or the downward arrow scrollbar button. To add more columns, use the right arrow key or the right arrow scrollbar button.

Note: You can add columns to a table data source by right-clicking a column header and selecting Insert column or Insert multiple columns from the shortcut menu.

Combining Multiple Data Sources into an Aggregate Data Source

You can create an aggregate data source in which you can combine the values of other available data sources into a single data source. This is especially useful if you would like to perform a functional test that needs to draw values from multiple data sources. For example, in sending a request to a server, you may want to send values from a data source that contains user information such as a first and last name, and you may also want to send values from a separate data source that contains the user’s login and password information. By combining the two data sources into a single aggregate data source, you can create a single test instead of having to create separate tests for each data source.

To combine multiple data sources into an aggregate data source:
1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Aggregate and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.
   • If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in both the From and To fields.
5. Choose the desired data sources from the Available box and click the Add button to add them to the Selected box.
   • The Available box contains all of the data sources added to the test suite. After selecting and adding the desired data sources to the Selected box, the column names contained in the added data sources display in the Columns box.
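One way to picture the user-information and login example above is as two row sets whose columns are exposed side by side. The following Python sketch is purely illustrative (it is not SOAtest code, the row-pairing shown is an assumption for the example, and all names and passwords are made up):

```python
# Illustrative sketch (not SOAtest code): combining the columns of two
# data sources so a single test can draw from both. All values are made up.
names = [
    {"first": "John", "last": "Smith"},
    {"first": "Jane", "last": "Doe"},
]
logins = [
    {"user": "jsmith", "pw": "secret1"},
    {"user": "jdoe", "pw": "secret2"},
]

# Merge each pair of rows into one row exposing all four columns.
aggregate = [{**a, **b} for a, b in zip(names, logins)]

print(aggregate[0]["last"], aggregate[0]["user"])  # Smith jsmith
```

With the combined rows, one parameterized test can reference first/last name and login columns together instead of needing a separate test per data source.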

Adding a File Data Source

To add a File data source:
1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select File and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.
   • If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in both the From and To fields.
5. Specify the file or directory to import files from. All of the files available in the specified location will display in the table. Right-click options allow you to cut, copy, and paste values in as well.
   • To filter which files are used by the File Data Source, enter a string in the File Filter field. For example:
     • * = wild card for any string
     • *.* = all files (this is the default)
     • *.txt = all text files
     • data* = all files whose file names begin with "data"
     • data*.txt = all text files whose file names begin with "data"
     • *data* = all files whose file names contain the string "data" somewhere

At runtime, SOAtest will use the contents of each file as a data source value.
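The File Filter patterns above are shell-style wildcards. The following Python sketch (not SOAtest code; the file names are made up) shows how the example patterns select files, using the standard fnmatch module, which implements the same wildcard semantics:

```python
# Illustrative sketch (not SOAtest code): shell-style wildcard matching
# as used by the File Filter field. File names below are made up.
from fnmatch import fnmatch

files = ["data1.txt", "data2.csv", "report.txt", "mydata.log"]

def matching(pattern, names):
    """Return the file names that a given File Filter pattern would select."""
    return [n for n in names if fnmatch(n, pattern)]

print(matching("*.txt", files))   # ['data1.txt', 'report.txt']
print(matching("data*", files))   # ['data1.txt', 'data2.csv']
print(matching("*data*", files))  # ['data1.txt', 'data2.csv', 'mydata.log']
```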

Adding a Writable Data Source

To add a writable data source that dynamically generates values:
1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Other from the Add Data Source wizard and click Next. The Other dialog of the Add Data Source wizard displays.
3. Select Writable and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
4. (Optional) Change the data source label in the Name field of the Data Source configuration options.
5. Use the Rows controls to indicate the range of rows you want to use.
   • If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in both the From and To fields.
6. Specify your preferred writing mode for setup tests and standard tests.
   • Append continuously appends to the Writable Data Source without clearing previously appended values, as long as the Writable Data Source is being written to by a single test case. When another test case writes to the Writable Data Source, it clears the data source and appends a new value.
7. If you want to specify column names (rather than use the default A, B, C, D, etc.), check First row specifies column names.
8. (Optional) Specify a column name for each non-empty column by double-clicking the header for that column, then entering a column name in the dialog box that opens. If you do not change the column headers, SOAtest will refer to each column by its default name (a, b, c, etc.).
9. To populate the Writable data source, complete the following:
   a. Add a SOAP Client tool as a Set-Up Test to the test suite. For more information on Set-Up Tests, see “Adding Set-Up and Tear-Down Tests”, page 332.
   b. Add an XML Data Bank tool as an output to the SOAP Client Set-Up Test.
   c. Run the SOAP Client Set-Up Test to populate the XML Data Bank.
   d. Add a node to the Selected XPaths in the XML Data Bank GUI.
   e. Double-click the entry row underneath the Data Source column name column in the XML Data Bank GUI. A Modify dialog displays.
   f. Select Writable Data Source Column in the Modify dialog and click OK. Now when the Set-Up Test runs, the Writable Data Source will be populated. The Writable Data Source will automatically be reset every time its parent Test Suite is run.

Note: You can add rows and columns to a writable data source by right-clicking and selecting Insert rows | columns.

Setting Up a "One-to-Many" Data Source Mapping

You can also set up SOAtest to use values from a single row of one data source (e.g., a global data source that contains login information) with multiple rows from another data source as follows:
1. Add a Writable Data Source to your project as described in “Adding a Writable Data Source”, page 351. The Writable Data Source lets SOAtest iterate independently of the other data sources.
2. If you have more than one "global" parameter in your data source, right-click the single Writable Data Source column and select Insert Columns. Rename the columns to something matching your original data source columns.
3. Add an Extension tool as a Set-Up Test (see “Adding Set-Up and Tear-Down Tests”, page 332 for details). This will act as an interface to your "global" data source.
4. Configure the Extension tool as described in “Extension (Custom Scripting)”, page 960. Assuming that the column names you want to access are named "username" and "password" from the data source "Credentials", you would select the Credentials data source in the tool’s configuration panel, check Use data source, then add the following code:

   from soaptest.api import *

   def getCredentials(input, context):
       username = context.getValue("Credentials", "username")
       password = context.getValue("Credentials", "password")
       return SOAPUtil.getXMLFromString([username, password])

5. Chain an XML Data Bank to the output of the Extension tool by right-clicking the Extension tool, choosing Add Output, and then selecting the XML Data Bank tool option.
6. Run the test once to populate the XML Data Bank.
7. Double-click the XML Data Bank tool to open its configuration panel.
8. Select the element that corresponds to the first parameter, and click Add.
9. Click Modify and select DataSource column name.
10. Select Writable Data Source Column and select the corresponding name.
11. Repeat steps 8-10 for the element corresponding to the second parameter.

Generating a Data Source Template for Populating Message Elements

Manually creating a data source for parameterizing large, complex XML messages can be time-consuming and tedious. For a fast way to accomplish this, have SOAtest automatically generate a CSV data source template based on the structure of the request or response message that you want to parameterize.

To generate and use a data source template:
1. If the Form Input view does not display all of the elements that you want the generated data source to reference, manually add them or automatically populate the form as described in “Populating a Set of Elements with Automatically-Generated Values”, page 809. Automated population will add nodes for optional elements.
2. In the appropriate messaging tool (SOAP Client, Messaging Client, Message Stub), go to the Form Input view, right-click the root tree element, then choose Generate CSV Data Source.


3. In the dialog that opens, specify the following settings:
   • File Name: The name of the CSV file that will be generated.
   • CSV File Destination: The workspace location where you want to save the generated file.
   • File separator: The delimiter for the CSV file.
4. Click OK. A data source will be added to the test suite and each message element will be parameterized with a column from the data source.
5. Open the generated data source template and add values as needed. These values will be passed to the associated elements during test execution.
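Conceptually, the generated template takes its column headers from the parameterizable elements of the message. The following Python sketch is similar in spirit (it is not SOAtest’s actual implementation, and the tiny XML message is made up): it derives CSV column names from the leaf elements of a message.

```python
# Illustrative sketch (not SOAtest code): deriving CSV template columns
# from the leaf elements of an XML message. The message is made up.
import xml.etree.ElementTree as ET

xml = "<Person><Name>GrandPa</Name><Age>85</Age></Person>"
root = ET.fromstring(xml)

# Leaf elements (no children) are the parameterizable values.
columns = [elem.tag for elem in root.iter() if len(elem) == 0]
header = ",".join(columns)

print(header)  # Name,Age
```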

Cutting, Copying, and Pasting Data Sources

To cut, copy, or paste a data source, right-click the desired data source node and select Cut, Copy, or Paste from the shortcut menu. This feature is useful if you already have a data source in which you would like to modify only a few values. You can then cut or copy the data source, paste it in the same tree branch, then modify the new data source as needed.

Performing Functional Tests Using Data Sources

After adding a data source to a test suite, the data source values can be used in conjunction with SOAtest tools to further extend the functionality of the test suite. Data sources can be used to parameterize values in tools such as SOAP Client, Messaging Client, Browser Testing Tool, DB Tool, Diff Tool, and so on.

The SOAP Client and Messaging Client tools can be configured to send data source values as part of a request to the server. The Diff tool can then be configured to compare the responses to another set of values in the data source. For example, consider the functional testing of a service that receives the names of U.S. capital cities and returns the names of corresponding states. In this case, a data source with two columns of differing values—the first column containing cities as values, and the second column containing states as values—would be added to the test suite. The SOAP Client tool could be configured to send requests that draw inputs from the first column. The Diff tool would then be configured to compare the actual responses to the inputs in the second column. Using the SOAP Client and Diff tools in this way is a very powerful option in functional testing since each value in each row of the data source would be cycled through and checked.

Understanding Data Source Iteration

It is important to understand how the test Execution Options affect data source iteration. For more information on Test Execution settings, see “Specifying Execution Options (Test Flow Logic, Regression Options, etc.)”, page 320. The following Execution Options affect data source iteration:
• Individually Runnable: When Individually Runnable is selected in the Execution Options of a test suite, each test in the test suite will iterate through every row of the data source before moving on to the next test.
• Scenario: When Individually Runnable is not selected, every test in the test suite will execute before iterating to the next data source row.

For example, if a test suite contains Test A and Test B that both use a data source with three rows, the following execution patterns would occur:
• Individually Runnable: Test A: Row 1, Test A: Row 2, Test A: Row 3, Test B: Row 1, Test B: Row 2, Test B: Row 3
• Scenario (Non-Individually Runnable): Test A: Row 1, Test B: Row 1, Test A: Row 2, Test B: Row 2, Test A: Row 3, Test B: Row 3

If an XML Data Bank is used to pass values between test cases in a Test Suite, this will automatically put the Test Suite in "Scenario Mode" regardless of whether Individually Runnable is selected. The Data Source iteration will behave as if Individually Runnable is not selected. For more information on using an XML Data Bank see “XML Data Bank”, page 863.
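The two iteration orders above amount to swapping the nesting of the test loop and the row loop. This Python sketch (illustrative only, not SOAtest code) reproduces the Test A / Test B example:

```python
# Illustrative sketch (not SOAtest code): the two data source iteration
# orders for tests A and B over a three-row data source.
def individually_runnable(tests, rows):
    # Each test iterates through every row before the next test starts.
    return [(t, r) for t in tests for r in rows]

def scenario(tests, rows):
    # Every test executes once per row before advancing to the next row.
    return [(t, r) for r in rows for t in tests]

tests, rows = ["A", "B"], [1, 2, 3]
print(individually_runnable(tests, rows))
# [('A', 1), ('A', 2), ('A', 3), ('B', 1), ('B', 2), ('B', 3)]
print(scenario(tests, rows))
# [('A', 1), ('B', 1), ('A', 2), ('B', 2), ('A', 3), ('B', 3)]
```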

Configuring the SOAP Client and Diff Tools to Use Data Sources

To configure the SOAP Client and Diff tools to send and compare data source values:
1. If you have not already done so, add the data sources you want to use as described in “Adding a Data Source”, page 346.
2. Double-click the desired SOAP Client node in the Test Case Explorer and choose the appropriate data source from the Data Source combo box in the right GUI panel.
   • The Data Source combo box will not display unless a data source was previously added to the test suite. If there is only one data source available, the SOAP Client tool will default to that data source. If there is more than one data source available, the SOAP Client tool will default to the first data source listed in the Tests tab.
3. Complete the rest of the fields in the Project Configuration panel for the SOAP Client node as explained in “SOAP Client”, page 777.
4. Select the SOAP Client node and click the Add Test or Output toolbar button. An Add Output wizard displays.
5. From the Add Output wizard, select Request> SOAP Envelope from the left pane, select Diff from the right pane, and click Finish. A Diff node is added to the SOAP Client node.
6. Double-click the Diff node to open the tool configuration panel.
7. Choose the appropriate data source from the Data Source box. The data source you choose must be the same data source specified for the SOAP Client.
8. In the Regression control source box, choose Data Source.
9. In the Data Source Column box, choose the column that you would like the Diff tool to compare responses to.
10. Complete the rest of the Diff tool configuration settings as explained in “Diff”, page 899.

You can now perform a functional test by selecting the SOAP Client node and clicking the Test toolbar button. The results will display in the right GUI panel.

Parameterizing Arrays of Varying Size

SOAtest also allows you to map hierarchical data structures in data sources. Suppose that you have a web service that collects information to construct family trees. The information sent looks like the following:

<ns1:People xmlns:ns1="http://www.example.org/ParentChildGrandChild">
  <ns1:Person>
    <ns1:Name>GrandPa</ns1:Name>
    <ns1:Age>85</ns1:Age>
    <ns1:Child>
      <ns1:Name>Daddy</ns1:Name>
      <ns1:Age>55</ns1:Age>
      <ns1:Child>
        <ns1:Name>FirstSon</ns1:Name>
        <ns1:Age>25</ns1:Age>
      </ns1:Child>
      <ns1:Child>
        <ns1:Name>SecondSon</ns1:Name>
        <ns1:Age>22</ns1:Age>
      </ns1:Child>
    </ns1:Child>
  </ns1:Person>
</ns1:People>

It represents a family tree as follows:

• GrandPa
  • Daddy
    • FirstSon
    • SecondSon

How do you set up an “Array Data Source” that can be used to vary the number of children and grandchildren in the XML? First, you set up the data source; then, you use it to parameterize the form input as described in the following sections.

Setting up the Data Source

Suppose we want to send family tree information for the following two families:

• GrandPa
  • Daddy
    • FirstSon
    • SecondSon
• GrandMa
  • Mommy
    • FirstDaughter
    • SecondDaughter
  • FirstAunt
  • SecondAunt
    • FirstCousin
    • SecondCousin

To set up the data source for this scenario:


1. Create an Excel Spreadsheet with 3 sheets named: FirstGeneration, SecondGeneration, ThirdGeneration.

2. Fill out the FirstGeneration sheet.

Notice the SecondGeneration dsref* column. This is how we denote that the children of the FirstGeneration will be from the SecondGeneration sheet. (dsref* denotes Data Source REFerence.)


3. Fill out the SecondGeneration sheet.

Notice the ThirdGeneration dsref* column. Notice also the ParentIndex column. The value of the ParentIndex column indicates which FirstGeneration row the SecondGeneration row is related to. For example, Daddy is related to the first FirstGeneration row (GrandPa). Mommy, FirstAunt, and SecondAunt are related to the second FirstGeneration row (GrandMa).


4. Fill out the ThirdGeneration Sheet.

Notice again the ParentIndex column. Notice that there is no ParentIndex 3 because FirstAunt does not have any children.
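The spreadsheet screenshots are not reproduced here, but the ParentIndex wiring they describe can be sketched in code. The rows below are inferred from the family trees above (only the Name and ParentIndex columns are shown; the real sheets also carry Age and dsref* columns), and the helper function is purely illustrative — SOAtest resolves these relationships itself:

```python
# Rows inferred from the family trees above; ParentIndex is 1-based and
# points at a row in the previous generation's sheet.
second_generation = [
    {"Name": "Daddy",      "ParentIndex": 1},  # child of GrandPa (row 1)
    {"Name": "Mommy",      "ParentIndex": 2},  # child of GrandMa (row 2)
    {"Name": "FirstAunt",  "ParentIndex": 2},
    {"Name": "SecondAunt", "ParentIndex": 2},
]
third_generation = [
    {"Name": "FirstSon",       "ParentIndex": 1},
    {"Name": "SecondSon",      "ParentIndex": 1},
    {"Name": "FirstDaughter",  "ParentIndex": 2},
    {"Name": "SecondDaughter", "ParentIndex": 2},
    # no ParentIndex 3: FirstAunt (row 3) has no children
    {"Name": "FirstCousin",    "ParentIndex": 4},
    {"Name": "SecondCousin",   "ParentIndex": 4},
]

def children_of(parent_row_number, child_sheet):
    """Names in child_sheet whose ParentIndex points at parent_row_number."""
    return [r["Name"] for r in child_sheet if r["ParentIndex"] == parent_row_number]

print(children_of(1, second_generation))  # GrandPa's children
print(children_of(4, third_generation))   # SecondAunt's children
```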

Parameterizing Form Input

The next step to varying the number of children and grandchildren in the XML is to parameterize the form input as follows:

1. Create a new .tst file called ArrayDataSource.tst.
2. Add an Excel Data Source that points to the Excel spreadsheet created in previous steps. Select FirstGeneration as the sheet.
3. Create a new Messaging Client.
4. Configure the Messaging Client as follows:
   • Set Schema URL to http://soatest.parasoft.com/ParentChildGrandChild.xsd
   • Set RouterEndpoint to http://ws1.parasoft.com:8080/examples/servlets/Echo
5. Set up the Form Input as follows:
   People
   • Person
     • Name – Parameterized to: Name
     • Age – Parameterized to: Age
     • Child
       • Name – Parameterized to: SecondGeneration:Name (Name column in SecondGeneration sheet)
       • Age – Parameterized to: SecondGeneration:Age (Age column in SecondGeneration sheet)
       • Child
         • Name – Parameterized to: SecondGeneration:ThirdGeneration:Name (Name column in ThirdGeneration sheet)
         • Age – Parameterized to: SecondGeneration:ThirdGeneration:Age (Age column in ThirdGeneration sheet)
6. Run the test. In the Traffic Viewer, the XML should reflect the family information in the data source.


Using Interpreted Data Sources

An interpreted data source is a tabular data source that SOAtest regards as a relational representation of a Java object graph. An interpreted data source can be used to facilitate creation of multiple Java objects and object graphs that can be used by the EJB Client tool and other SOAtest tools as test parameter inputs. For more information on the EJB Client tool, see “EJB Client”, page 841.

Generating a Data Source from the fields of a Java Bean

To generate a data source from the fields of a Java bean:

1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Bean Wizard and click Next. The Bean Wizard dialog displays.

3. Complete the following options in the Bean Wizard dialog:
   • Destination Type: Select the type of table template you would like to create from the drop-down menu.
   • Destination Directory: Specify where the tables will be written.
   • Write Over Existing Files: Specify whether you want to overwrite existing files.
   • Java Class: Specify the class for which you would like to create a tabular representation.
   • Trace Dependencies: Select either Yes or No. Selecting Yes prompts SOAtest to generate tables for types of class member variables “reachable” from the “root” class that you specified in the Java Class field.
4. Click the Next button. The Dependencies dialog displays.
5. Select the desired type dependencies from the Generate Table for Classes list.


6. Click the Finish button to generate the tables. The tables will be created in the designated directory and a data source for each file will be added to the test suite you selected.

Interpreted Data Source Table Format

The following are the main concepts of the relational-to-object mapping used by SOAtest:

• An object is a row in a table.
• There are two types of tables:
  • Class tables
    • A column for each class member variable
    • A row for each instance
  • Collection tables
    • A single column holding the table and row of each referenced object
    • A row for each instance

Object References

The first column in each table is an identifier for object instances. An object may be referenced by the table name, followed by a space, followed by the object identifier. The identifier values need not be row numbers, so long as they are unique within each table.

Values and References

Field values of non-primitive classes are generally specified through object references as described above. However, that extra level of indirection is unnecessary and cumbersome for values of primitive types. To accommodate an abbreviated form, built-in support is included for inlining commonly used types with well-defined string representations. An empty cell in a reference column is interpreted as the null value. An empty cell in a value column is interpreted as an empty string.

Collections

To support variable-sized collections, a collection table is introduced. A collection table has a single reference column. An object in the collection is referenced by the table name, followed by a space, followed by the object identifier.

Example

As an example, let us consider an object graph with CreditCardDO as a root. CreditCardDO contains instances of PersonDO and AddressDO and a Vector of ActivityDO type objects.

public class CreditCardDO extends PaymentMethodDO implements Serializable {
    protected String ccNumber;
    protected Date expirationDate;
    protected PersonDO ccHolder;
    protected AddressDO billingAddress;
    protected Vector recentActivity = new Vector();
    // set…()/get…() methods omitted
}

public class PaymentMethodDO implements Serializable {
    protected String bankName;
    // set…()/get…() methods omitted
}

public class AddressDO implements Serializable {
    protected String streetAddress;
    protected String city;
    protected int zipCode;
    protected String state;
    // set…()/get…() methods omitted
}

public class PersonDO implements Serializable {
    protected String firstName;
    protected String lastName;
    // set…()/get…() methods omitted
}

public class ActivityDO implements Serializable {
    private float amount;
    private String description;
    // set…()/get…() methods omitted
}

The following tables illustrate how the above object graph can be represented in tabular format:

Table CreditCardDO.csv

soatest.examples.CreditCardDO | bankName   | billingAddress ref | ccHolder ref | ccNumber         | expirationDate | recentActivity ref
1                             | SampleBank | AddressDO 1        | PersonDO 1   | 1234123412341234 | 8/31/2005      | RecentActivities-1

Table AddressDO.csv

soatest.examples.AddressDO | city        | state | streetAddress         | zipCode
1                          | Los Angeles | CA    | 101 E. Huntington Dr. | 91016

Table PersonDO.csv

soatest.examples.PersonDO | firstName | lastName
1                         | Donald    | Duck

Table ActivityDO.csv

soatest.examples.ActivityDO | amount | description
1                           | 10     | Charge-10
2                           | 20     | Charge-20
3                           | 30     | Charge-30
4                           | 40     | Charge-40
5                           | 50     | Charge-50
6                           | 60     | Charge-60
7                           | 70     | Charge-70
8                           | 80     | Charge-80
9                           | 90     | Charge-90
10                          | 100    | Charge-100

Table RecentActivities-1.csv

ActivityDO ref
ActivityDO 1
ActivityDO 2
ActivityDO 3
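Taken together, the tables above encode one object graph. The following sketch uses plain Python dictionaries as stand-ins for the CSV rows (an illustration of the reference convention, not SOAtest code): a reference cell such as "AddressDO 1" splits into a table name and a row identifier, and a collection table is just a single column of such references.

```python
# In-memory stand-ins for the CSV tables above: {identifier: row}.
tables = {
    "AddressDO": {"1": {"city": "Los Angeles", "state": "CA",
                        "streetAddress": "101 E. Huntington Dr.", "zipCode": "91016"}},
    "PersonDO": {"1": {"firstName": "Donald", "lastName": "Duck"}},
    "ActivityDO": {str(n): {"amount": str(n * 10), "description": f"Charge-{n * 10}"}
                   for n in range(1, 11)},
    # Collection table: a single column of object references.
    "RecentActivities-1": ["ActivityDO 1", "ActivityDO 2", "ActivityDO 3"],
}

def deref(reference):
    """Resolve 'TableName identifier' into the referenced row."""
    table, identifier = reference.split(" ", 1)
    return tables[table][identifier]

# The CreditCardDO row, with its reference cells resolved.
card = {"bankName": "SampleBank",
        "billingAddress": deref("AddressDO 1"),
        "ccHolder": deref("PersonDO 1"),
        "ccNumber": "1234123412341234",
        "recentActivity": [deref(ref) for ref in tables["RecentActivities-1"]]}

print(card["ccHolder"]["firstName"])  # Donald
print(len(card["recentActivity"]))    # 3
```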

Populating and Parameterizing Elements with Data Source Values

Why Populate and Parameterize Elements with Data Source Values?

Let's say you have a complex request message that looks something like this:

<?xml version="1.0" encoding="UTF-8"?>
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <SOAP-ENV:Body>
    <addNewItems xmlns="http://www.parasoft.com/wsdl/store-01/">
      <book>
        <id xmlns="">0</id>
        <title xmlns=""></title>
        <price xmlns="">0.0</price>
        <authors xmlns="">
          <i></i>
          <i></i>
        </authors>
        <publisher xmlns="">
          <id myAtt="attVal">1</id>
        </publisher>
      </book>
    </addNewItems>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>

You will want to parameterize most of the above elements—and even some arrays that may vary in the number of elements they contain (such as the list of authors in the above example). This is data-driven testing. However, manually creating an Excel spreadsheet with data and parameterizing each individual element can be time-consuming and tedious. If you already have an existing data source, you can use the Populate feature (described below) to automatically map data source values to message elements.


If you do not already have a data source with these values, you can use the Populate feature (described below) to automatically generate simple values for a set of form fields. You can also automatically generate and then complete a data source template as described in “Generating a Data Source Template for Populating Message Elements”, page 353.

Data Source Mapping and Naming Conventions

When parameterizing element or attribute values, there are three possibilities:

• Specifying a value
• Specifying that an element should be Nil or Null
• Specifying that an element should be excluded entirely

Specifying Values

Matching data source columns to request elements is accomplished via certain naming conventions applied to the data source column names. In the example XML above, notice that there are two elements named "id". To distinguish them, we can use "book/id" as one column name and "book/publisher/id" as the other. This naming convention mimics a file directory structure or an XPath. Attributes are similarly identified, with the additional specification of an "@" symbol. In the example above, "book/publisher/id@myAtt" refers to the attribute "myAtt" of the publisher id.

Next, consider the case where several elements may have the same name, as demonstrated by the authors element above. The "book/authors/i" identifier applies to both elements, so we need a way to distinguish them. In this case, we can append array numbers within parentheses "()" to repeated elements of the same name. Hence, "book/authors/i(1)" identifies the first element, and "book/authors/i(2)" identifies the second.
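The convention just described can be sketched with a small, hypothetical parser (not part of SOAtest): the column name is split on "/", a trailing "@name" selects an attribute, and a trailing "(n)" selects the n-th repeated element.

```python
import re

def parse_column_name(column):
    """Split a data source column name written in the XPath-like convention
    described above into (element path, attribute, index). The rules here
    are inferred from the documented examples, e.g.:
      book/publisher/id@myAtt -> attribute 'myAtt' of book/publisher/id
      book/authors/i(2)       -> second repeated 'i' under authors
    """
    attribute = None
    index = None
    m = re.match(r"^(.*)@(\w+)$", column)
    if m:
        column, attribute = m.groups()
    m = re.match(r"^(.*)\((\d+)\)$", column)
    if m:
        column, index = m.group(1), int(m.group(2))
    return column.split("/"), attribute, index

print(parse_column_name("book/id"))
print(parse_column_name("book/authors/i(2)"))
print(parse_column_name("book/publisher/id@myAtt"))
```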

Specifying Nil and Exclude

In some cases, you will want to specify that an element appear as Nil or that the element should not appear in the request message. By default, appending "XL" for exclude and "NIL" for nil values will accomplish this. For example, a column named "book/authors/iXL(2)" will allow you to indicate that the second child of the authors tag should not be sent, such as when there is only one author.
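Under the same assumptions, the default postfix convention can be sketched as follows; the classify_column helper is hypothetical and only illustrates how a postfix such as XL rides on the element name, before any "(n)" index:

```python
# Minimal sketch of the XL/NIL postfix convention described above.
# The postfix is appended to the element name, before any "(n)" index:
#   book/authors/iXL(2) -> exclude marker for the 2nd 'i' under authors
def classify_column(column, exclude_postfix="XL", nil_postfix="NIL"):
    """Return (base_column, kind), where kind is 'value', 'exclude', or 'nil'."""
    name, _, index = column.partition("(")
    suffix = "(" + index if index else ""
    if name.endswith(exclude_postfix):
        return name[:-len(exclude_postfix)] + suffix, "exclude"
    if name.endswith(nil_postfix):
        return name[:-len(nil_postfix)] + suffix, "nil"
    return column, "value"

print(classify_column("book/authors/iXL(2)"))  # ('book/authors/i(2)', 'exclude')
print(classify_column("titleNIL"))             # ('title', 'nil')
print(classify_column("book/title"))           # ('book/title', 'value')
```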

How to Populate and Parameterize Elements with Data Source Values

In the case of large, complex XML message requests, the process of configuring each element and attribute item in the request XML, and then parameterizing it with the correct data source column, can be time-consuming. Therefore, you can expedite this process with the automatic populate and parameterize feature. When this feature is used with the associated data source naming conventions, it can provide huge productivity gains, especially when dealing with messages containing hundreds of nested, complex elements. It also allows you to focus on designing use cases in a data-driven manner (using data sources) instead of on the error-prone method of individually configuring each message parameter.


Note for SOAtest 5.x Users

In SOAtest 5.x, the populate feature was available for Form XML and Literal XML views. It populated Form Input, and then overrode the values in Form XML or Literal XML with values from Form Input. In SOAtest 6.x, the populate feature has been removed from Form XML and Literal XML. You should populate in Form Input, then switch views to achieve the same effect.

To automatically populate and parameterize elements with data source values, complete the following:

1. In a SOAP Client test configured with a WSDL or a schema, open the Request> SOAP Body tab, right-click in a blank area of the Form Input tree (not on a specific element), and select Populate from the shortcut menu.
2. Enable Map parameter values to data source columns to tell SOAtest to automatically set each form input parameter to Parameterized.
3. Select the data source column with a name that matches the parameter name. For example, if the data source has a column named "title" and one of the form input elements has the same name "title", then the "title" element will be mapped to the data source column "title", and so on.
4. Customize the remaining options as needed. See “Populate Wizard Options”, page 366 for details.
5. Click OK. A Populate Results dialog displays Summary and Detail information.

Populate Wizard Options


Map parameter values to data source columns
  (Only enabled when a data source is present.) Indicates whether to automatically set each form input parameter to Parameterized and select the data source column with a name that matches the parameter name. For example, if the data source has a column named "title" and one of the form input elements has the same name "title", then the "title" element will be mapped to the data source column "title", and so on.

Element Exclusion
  Indicates whether to also map the property Use data source: Exclude [element name] with empty string to a data source column. For more details about the exclude with empty string option, see “Using Data Source Values to Determine if Optional Elements Are Sent”, page 810. The following options are available from the Element Exclusion dropdown menu:
  • Always Include: Instructs SOAtest to always add new elements that are optional (schema type attribute minOccurs='0') as part of the populate process. The number of elements added is determined by the Number of sequence (array) items field; by default, that is set to 2.
  • Leave Unchanged: Leaves each sequence element in its current state. No new elements are added and no exclusion properties are modified.
  • Use Data Source: Instructs the populate process to map the Use data source: Exclude [element name] with empty string property of each Form Input element to the data source column with the same name, postfixed with the value XL. If the value in the specified data source column is an empty string, the optional element will not be included in the message. If it is an actual value, that value will be sent as part of the message.
  The postfix XL is specified in the Exclude column name postfix field; XL is the default value. For example, if the request XML message includes an element named "title" and that element type in the schema is defined with the attribute minOccurs='0', the Use data source: Exclude title with empty string option becomes available in the Form Input view when right-clicking the parent node of title. The populate feature would map the exclude property to the data source column named "titleXL", assuming that "XL" is the postfix specified.

Nillable Elements
  Similar to Element Exclusion, except that Nillable Elements affects the Use data source: set Nil with empty string property. This Form Input property is available on elements whose schema type is set with nillable="true". If the data source cell contains an empty string (e.g., ""), the request element will have no value and will include the xsi:nil="true" attribute. If a value is specified, the request element will include that value and the nil attribute will not be sent. For more information, see “Using Data Source Values to Configure if Nill Attributes Are Used”, page 810. The Nillable column name postfix field of the dialog specifies the postfix.

Attribute Exclusion
  Indicates whether optional attributes are automatically added by the populate process.

Data Source Mapping and Naming Conventions

For both sequence (array) and nested types: The value mapping and exclude/nillable mappings are based on name-matching conventions between the element name and the column name. However, there are cases where the same element name is reused within the XML message, so mapping collisions need to be avoided if a one-to-one mapping between each element and data source column is to be maintained.

For nested complex types: XPath-like data source column names can be used to disambiguate. For example, instead of using the data column name "title", you may use "book/title" as the column name; it would therefore be mapped to any "title" elements falling under "book". If that can still lead to ambiguity, you may also use a column name such as "addNewItem/book/title" to further identify which element it is supposed to be associated with.

For sequence types (arrays where items with the same name are repeated): Item index numbers can be used to disambiguate. For example, in the Parasoft store service, the book type has an authors element, which in turn can have many "i" elements indicating a list of authors. Using only the data source column name "i" would result in that column being mapped to all occurrences of element "i". Using the column name "i[2]" results in that column being mapped to all occurrences of "i" that are the second item in a sequence. (The index numbers start from 1, not 0, as per the XPath specification.) If the column name "authors/i[2]" is used, it will be mapped only to the second "i" element under "authors". Note: if there happen to be multiple "authors" elements in the XML message, all of them would be mapped accordingly, unless the column names are disambiguated with enough XPath parent nodes to make the mapping one-to-one.

Parentheses can be used as the numeric index syntax, so "authors/i(2)" will map to the same elements as "authors/i[2]". The () syntax is inconsistent with the XPath specification, but it is helpful when database data sources are used, where [] is not a valid SQL character.

Exclude and nillable mapping: Follows the same XPath and indexing conventions as values. For example, to exclude/include the "i" element based on a data source column, you may use the column name "authors/iXL[2]" to indicate specifically which elements it is intended for.
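Since the "(n)" and "[n]" index forms map to the same elements, a mapping layer only needs to normalize one to the other. A hypothetical helper (not a SOAtest API) sketching this:

```python
import re

# Normalize "(n)" to "[n]" so a database-backed data source (where "[]"
# is not a valid SQL character) can still feed the same mapping logic.
def normalize_index_syntax(column):
    return re.sub(r"\((\d+)\)", r"[\1]", column)

print(normalize_index_syntax("authors/i(2)"))    # authors/i[2]
print(normalize_index_syntax("authors/iXL(2)"))  # authors/iXL[2]
```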


Configuring Testing in Different Environments

This topic explains how to work with SOAtest environments. A SOAtest Environment is a collection of variables that can be referenced within fields of your test configuration. By switching which environment is the "active" environment, you can dynamically switch the values of your test configurations at runtime. Sections include:

• Understanding Environments
• Manually Defining an Environment
• Using Environment Variables in Tests
• Changing Environments from the GUI
• Changing Environments from the Command Line
• Sample Usage
• Exporting, Importing, and Referencing Environments

Understanding Environments

You may want to run the same test suite against other target systems (different testing environments, production environments, etc.). Rather than editing server configuration-related settings within your SOAtest project, you can use SOAtest Environments to decouple configuration settings from your test data. Once you have configured environments for your projects, running tests against another system is as easy as clicking a button.

An environment is a collection of variables that can be referenced in your SOAtest project configuration. When running a test, SOAtest will substitute the names of variables in your project configurations with the values of those variables in the active environment. By changing which environment is active, you can quickly and easily change which values SOAtest uses.

Environments are defined when you automatically generate tests (e.g., from a WSDL, by recording from a browser, etc.). In addition, you can manually define one as described below.

Manually Defining an Environment

Creating and switching environments is done through the Environments branch of the test suite’s Test Case Explorer node.

The Environments branch is created by default when a new test suite is created. To add a new environment:


1. Right-click the Environments node, then choose New Environment.
2. In the configuration panel that opens on the right, use the available controls to define environment variables.

Using Environment Variables in Tests

Environment variables can be accessed in test configuration fields using a special syntax. To reference a variable, enclose the variable name in the character sequence ${}. For example, if you have a variable named HOST, you would reference it in a field by typing ${HOST}. Variable references may appear anywhere within a field. For example, say your environment contains the variables HOST = localhost and PORT = 8080, and you have an Endpoint field in a SOAP Client containing "http://${HOST}:${PORT}/Service". Upon running the test, the value used for the endpoint will be "http://localhost:8080/Service".

You can access environment variable values from a SOAtest Extension tool/script through the Extensibility API. MethodToolContext now has a method called "getEnvironmentVariableValue(String)" which will look up and return the current value of an environment variable. This allows you to use the value within your SOAtest scripts.

Note: If your test case requires the character sequence ${}, you can escape the sequence by adding a backslash. For example, if SOAtest encounters the value "\${HOST}" it will use the value "${HOST}" and will not try to resolve the variable. Also note that environment variable names are case sensitive.
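The substitution rules above (${NAME} references, case-sensitive names, backslash escaping) can be sketched as follows. This is a minimal illustration of the behavior described in the text, not SOAtest's implementation:

```python
import re

def resolve(text, env):
    """Replace ${NAME} with env[NAME]; a leading backslash escapes the
    sequence so \${HOST} is emitted literally as ${HOST}."""
    def sub(match):
        if match.group(0).startswith("\\"):
            return match.group(0)[1:]   # escaped: drop the backslash, keep ${...}
        return env[match.group(1)]      # case-sensitive lookup; KeyError if undefined
    return re.sub(r"\\?\$\{(\w+)\}", sub, text)

env = {"HOST": "localhost", "PORT": "8080"}
print(resolve("http://${HOST}:${PORT}/Service", env))  # http://localhost:8080/Service
print(resolve(r"\${HOST}", env))                        # ${HOST}
```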

Changing Environments from the GUI

To change which environment is active, right-click the node representing the environment you want to make active, then choose Set as Active Environment.

Changing Environments from the Command Line

In addition to selecting the active environment from the GUI, you can also switch the active environment from the command line using the -environment option. See “Testing from the Command Line Interface (soatestcli)”, page 257 for details.


Overriding Environments Using Test Configurations

To set a Test Configuration to always use a specific environment for test execution (regardless of which environment is active in the Test Case Explorer), set the Test Configuration’s Override default Environment during Test Execution option in the Execution tab. See “Defining How Test Cases are Executed (Execution Tab)”, page 248 for details.

Sample Usage

Let's say you are developing a service on your local machine. Once the code works locally, you commit the code to source control and start a build process targeted to a staging server. You want to create a suite of tests that will test against both your local machine and the staging server with minimal modification. This is the perfect case for using SOAtest Environments.

Begin by creating a localhost environment for your new project. The SOAtest New Project Wizard will get you started with a few basic environment variables. If your WSDL varies across the two machines, you can decompose the WSDL URI into variables. If the WSDL location is constant across the machines, you can alternatively decompose the endpoint URI.

The next step is to identify other machine-specific settings in your projects and to create environment variables for them. For example, let's say you have a JMS step in your test suite and you need to send the message to different queues depending on whether it's a localhost test or a staging test. First, create a new environment variable; let's call it "JMS_REPLY_QUEUE". Next, revisit your SOAP Client, go to the JMS settings, and enter ${JMS_REPLY_QUEUE} as the value for the JMSReplyTo field.

Once you have created your localhost environment, you can copy and paste the existing environment and rename the new environment to "Staging Environment". Then, simply modify the values of each of the variables so that they reflect the settings of your staging server.

Once your environments are set up, targeting your tests against the various environments is simply a matter of selecting the active environment. If you want to test against your local machine, set your active environment to "Localhost Environment". This will run your tests using the values defined in the localhost environment. To test against the staging server, set the "Staging Environment" environment to active and its values will be used.

Exporting, Importing, and Referencing Environments

You may find that many configuration settings, such as server names and ports, are common across multiple projects. Rather than duplicating these settings, you can export environment settings to an external file and import or reference the values in other projects.

Exporting Environments

To export an environment:

1. Right-click the node representing the environment you want to export, then choose Export Environment.
2. In the file chooser that opens, specify a location for the exported environments file.

The environments configuration will be written to an XML-based text file. If one environment is selected, a *.env file will be created, containing a single environment. If multiple environments are selected, a *.envs (Environment Set) file will be created containing all of the selected environments.

Importing Environments


When you import environments, you are bringing a copy of the values from the external environment file into your project. Further modification to the XML file will not be reflected in your project. To import an environment:

1. Right-click the Environments node, then choose Import Environment.
2. In the file chooser that opens, specify the location of the environments file that you want to import.

Referencing Environments

Referencing environments is the most efficient way to share a single environment configuration across multiple projects. Using environment references, you can easily modify the configurations of multiple projects from a single location. To reference an environment:

1. Right-click the Environments node, then choose Reference Environment.
2. In the configuration panel that opens on the right, specify the location of the environments file that you want to reference.

Note that when an environment configuration is referenced, you cannot edit the environment variables directly. However, your project will always use the values reflected in the referenced *.env file. Modifying the *.env file will propagate changes to all projects that reference it.


Validating the Database Layer

Using SOAtest’s DB tool (described in “DB”, page 911), you can validate the database layer. For example, assume you are testing a Web service invocation that is supposed to add a record to the database. You can not only ensure that the service returned the expected response, but also query the database to verify whether the data was added correctly. In addition, you can perform database setup or cleanup actions as needed to support your testing efforts.


Validating EJBs

SOAtest’s EJB Client tool (described in “EJB Client”, page 841) can test the remote interface of deployed EJBs. This allows testing of EJBs through their remote interfaces—without having to go through a Web or Web service interface. Additionally, you can use Tracer (described in “Jtest Tracer Client”, page 966) to generate functional JUnit test cases for specified components as you run your use cases on the working application.

Using Parasoft Jtest, you can perform additional levels of testing for EJBs and other code written for the Java EE framework. For example, as code is being written, a special rule library checks compliance with EJB and other Java EE best practices; these rules can be applied along with industry-standard rules and rules that enforce your organization's specific policies. Upon component completion, unit tests can be generated and executed to test the code outside and/or inside the container.


Validating Java Application-Layer Functionality

SOAtest’s Jtest Tracer Client (described in “Jtest Tracer Client”, page 966) can be used to identify, isolate, and then reproduce bugs in a multi-layered system. Tracer allows you to rapidly create realistic functional JUnit test cases that capture the functionality covered by your SOAtest test cases. Using Tracer, you can trace the execution of Java applications at the JVM level (without needing to change any code or recompile), in the context of a larger integrated system. As your SOAtest test cases execute, Tracer monitors all the objects that are created and all the data that comes in and goes out. The trace results are then used to generate contextual JUnit test cases that replay the same actions in isolation, on the developer's desktop, without the need to access all the application dependencies. This means that you can use a single machine to reproduce the behavior of a complicated system during your verification procedure.

Since the generated unit tests directly correlate tests to source code, this improves error identification and diagnosis, and allows developers to run these test cases without having to depend on access to the production environment or set up a complex staging environment. This facilitates collaboration between QA and Development: QA can provide developers traced test sessions with code-level test results, and these tests help developers identify, understand, and resolve the problematic code.


Monitoring and Validating Messages and Events Inside ESBs and Other Systems

Parasoft SOAtest can visualize and trace the intra-process events that occur as part of the transactions triggered by the tests and then dissect them for validation. This enables test engineers to identify problem causes and validate multi-endpoint, integrated transaction systems—actions that are traditionally handled only by specialized development teams. For details, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.


Executing Functional Tests

This topic explains how to execute functional tests individually or with the complete test suite, and then view the HTTP traffic. Sections include:

• Running the Entire Test Suite
• Running Specific Test Cases
• Viewing HTTP Traffic

Running the Entire Test Suite

To run all test cases in your test suite:
1. Select the Test Case Explorer node that represents the test suite you want to run.
2. Click the Test toolbar button.

SOAtest will run all available test cases, then report the outcome of each test and the test suite’s overall success rate. Green bubbles mark tests that succeeded. Red bubbles mark tests that failed. Yellow bubbles mark tests that encountered errors and were not completed. The results from all tests will be collected in the SOAtest view, which is typically positioned at the bottom of the GUI. For more information about results, see “Viewing Results”, page 290. In addition, you can access a report that contains a results summary, as well as result details. This is described in “Generating Reports”, page 295.

Running Specific Test Cases

To run one or more selected test cases from a test suite that was marked as “individually runnable”:
1. Select the Test Case Explorer nodes that represent the test cases you want to run.
2. Click the Test toolbar button.

SOAtest will run the selected test cases, then report each test outcome. Green bubbles mark tests that succeeded. Red bubbles mark tests that failed. Yellow bubbles mark tests that encountered errors and were not completed. The results from all tests will be collected in the SOAtest view, which is typically positioned at the bottom of the GUI. For more information about results, see “Viewing Results”, page 290. In addition, you can access a report that contains a results summary, as well as result details. This is described in “Generating Reports”, page 295.

Viewing HTTP Traffic

If you would like to view the HTTP traffic for each individual test in a test suite, double-click the Traffic Viewer node of the desired test after the test has completed. The traffic displays in the Traffic Viewer tab on the right side of the GUI.


For SOA tests, the HTTP traffic viewer shows the SOAP requests and SOAP responses. The Request and Response bodies display in Literal form by default. If you find that you have to scroll from left to right to view the HTTP traffic, you can click in the Literal view and press CTRL + B to beautify the XML. After pressing CTRL + B, all well-formed XML fragments will be reformatted with line breaks and indentation, alleviating the need to scroll from left to right. For more information on the Traffic Viewer tool, see “Traffic Viewer”, page 888.
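The beautify action amounts to re-serializing the XML with indentation. As a rough illustration (this is not SOAtest's actual implementation), the standard JDK transformer performs the same kind of reformatting; the indent-amount property key is an Apache Xalan extension that the JDK's bundled transformer honors:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class Beautify {
    // Re-serialize a well-formed XML fragment with line breaks and indentation
    static String beautify(String xml) throws Exception {
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.INDENT, "yes");
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        // Xalan-specific indent width, honored by the JDK's built-in transformer
        t.setOutputProperty("{http://xml.apache.org/xslt}indent-amount", "2");
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // A single-line SOAP-style fragment becomes multi-line, so no horizontal scrolling
        System.out.println(beautify("<Envelope><Body><Add><x>1</x><y>2</y></Add></Body></Envelope>"));
    }
}
```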


Reviewing Functional Test Results

In addition to the general results review actions presented in “Viewing Results”, page 290, the following options are available for reviewing functional test results.

In the SOAtest View

If a test fails, the SOAtest view reports a task to alert you that the test failure requires review. These tasks are organized by test suite and test.
• To open the test case related to a reported failure, right-click that failure, then choose Open Editor for Test.
• To locate the Test Case Explorer node related to a reported failure, right-click that failure, then choose Show in Test Case Explorer.
• To see the related traffic (when applicable), right-click an error message, then choose View Associated Traffic.

In the Test Case Explorer

The Test Case Explorer indicates the status (pass/fail/not yet executed) of all available test cases.
• A green check mark indicates that the test passed.
• A red X mark indicates that the test failed.
• An unmarked test indicates that the test has not yet been executed.


Creating a Report of the Test Suite Structure

SOAtest provides a design-time structure report that exports test structure details to an XML or HTML document. The structure report provides details about the test setup, which allows managers and reviewers to determine whether specified test requirements were accomplished. To view a structure report, right-click the test suite’s .tst node in the Test Case Explorer, then choose View Structure Report> Structure.

The Structure Report will display in the right side of the GUI. The Structure Report contains the following information:
• Project Structure: Displays the available test suites and test cases for the selected project.
• WSDLs and Operations Tested: Displays the WSDLs, and the operations for each WSDL, for the selected project.
• Endpoints Tested: Displays all endpoints that were tested for the selected project.
• Requirements Tested: Displays all the defined requirements that were configured in the Requirements and Notes tab of the root test suite. In the Requirements Tracking subsection, you may add IDs and URLs to relate your tests to the requirements/bug fixes that are tested.
• Data Sources Used: Displays the data sources that were configured for the selected project.
• Key Stores Used: Displays the key stores that were configured for the selected project.

You can configure which of the above items to display in the Structure Report via the Reports configuration panel in the SOAtest Preferences. To access the Reports configuration panel, select SOAtest> Preferences, then select SOAtest> Reports> Structure Reports from the SOAtest Preferences dialog. For details on available structure report options, see “Reports> Structure Reports”, page 753.


Managing the Test Suite

This topic explains how to manage the test suite. Sections include:
• Deleting Test Cases
• Disabling/Enabling a Test or Test Suite
• Disabling/Enabling a Tool
• Saving and Restoring a Test Suite
• Exporting a Test Suite
• Importing a Test Suite

Deleting Test Cases

To delete a test case, right-click the related Test Suite tree node, then choose Delete from the shortcut menu.

Disabling/Enabling a Test or Test Suite

You can temporarily disable tests or test suites that you want to save as part of your project, but do not want to run at the current time.
• To disable a test or test suite: Right-click the Tests tree node that represents the test or test suite you want to disable, then choose Disable from the shortcut menu.
• To enable a test or test suite that you previously disabled: Right-click the Tests tree node that represents the test or test suite you want to enable, then choose Enable from the shortcut menu.
• To disable multiple tests or test suites: Select the Tests tree nodes that represent the tests or test suites you want to disable, then choose Disable from the shortcut menu.
• To enable multiple tests or test suites that you previously disabled: Select the Tests tree nodes that represent the tests or test suites you want to enable, then choose Enable from the shortcut menu.

Disabling/Enabling a Tool

If you plan to use your test suite for load testing and you anticipate a large load, disabling heavy chained tools (e.g., Diff tools or Check XML) may be useful; it allows you to generate more load without having to delete the tools or create a new test suite. To enable or disable all tools of the same type from the test suite level:
1. Right-click the main test suite tree node.
2. Choose Search and Replace> Enable/Disable> [Tool Type] from the shortcut menu. Only tools that are used in the test suite display in the shortcut menu.
Whatever tool you select will be enabled or disabled throughout the entire test suite, wherever it is used. Disabled tools turn gray to indicate that they are disabled, and the test suite functions as if the tools were not there.

381

Managing the Test Suite

Saving and Restoring a Test Suite

Test suites are saved when you save a project file and are restored whenever you open the related project file.

Exporting a Test Suite

You may find that many configuration settings are common across multiple tests. Rather than duplicating these settings, you can export test settings to an external file and import or reference the values in other tests. To export a test suite, complete the following:
1. Right-click the Test Suite tree node that you want to export, then select Export from the shortcut menu.
2. Select the appropriate .tst file from the file chooser that opens.

Importing a Test Suite

SOAtest lets you import previously saved test suites so that you can easily share tests with fellow team members and integrate test suites as needed. When a test suite is imported, it can be edited to the specific needs of the team members. To import a previously saved test suite into an existing test suite:
1. Right-click the Test Suite tree node where you want the test suite integrated, then select Add New> Test Suite from the shortcut menu.
2. Select Import Test (.tst) File and click the Finish button.
3. Select the appropriate .tst file from the file chooser that opens.
After you import a test suite, it will be integrated into the current test suite.


SOA Functional Tests

In this section:
• Automatic Creation of Test Suites for SOA: Overview
• Creating Tests From a WSDL
• Creating Tests From XML Schema
• Creating Tests From AmberPoint Management System
• Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository
• Creating Tests From BPEL Files
• Creating Tests From Software AG CentraSite Active SOA
• Creating Tests From JMS System Transactions
• Creating Tests From Sonic ESB Transactions
• Creating Tests From TIBCO EMS Transactions
• Creating Tests From Traffic
• Creating Tests From a UDDI
• Creating Tests From a WSIL
• Creating Asynchronous Tests
• Testing RESTful Services
• Sending MTOM/XOP Messages
• Sending and Receiving Attachments
• Accessing Web Services Deployed with HTTPS
• Configuring Regression Testing
• Validating the Value of an Individual Response Element
• Validating the Structure of the XML Response Message


Automatic Creation of Test Suites for SOA: Overview

With SOAtest’s test creation wizard, you can easily and automatically create a series of test cases based on a variety of artifacts and platforms. SOAtest provides a flexible test suite infrastructure that lets you add, organize, and run your Web service test cases. Each test in a test suite contains a main test tool (usually, a SOAP Client tool) and any number or combination of outputs (other tools, or special output options). You can run individual tests or the complete test suite. In addition, you can attach regression controls at the test or test suite level so that you are immediately alerted to unexpected changes.

Understanding the Test Creation Wizard

SOAtest automatically generates a suite of SOAP Client test cases from a variety of platforms and artifacts. Rather than creating each of the required tests by hand and adding them to a test suite one at a time, you can point SOAtest to the appropriate resources, and it will automatically generate a suite of test cases that covers every object associated with the corresponding data. In addition, when automatically creating test suites from WSDL or WSIL documents, you can organize tests into positive and negative unit tests, and create asynchronous test suites. The wizard can be used for:
• Creating Tests From a WSDL
• Creating Tests From XML Schema
• Creating Tests From AmberPoint Management System
• Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository
• Creating Tests From BPEL Files
• Creating Tests From Software AG CentraSite Active SOA
• Creating Tests From JMS System Transactions
• Creating Tests From Sonic ESB Transactions
• Creating Tests From TIBCO EMS Transactions
• Creating Tests From Traffic
• Creating Tests From a UDDI
• Creating Tests From a WSIL


Creating Tests From a WSDL

To automatically create a test suite from a valid WSDL document, complete the following:
1. Choose the WSDL option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite
2. In the wizard’s WSDL page, enter a valid WSDL URL in the WSDL URL field, or click the Browse button to locate a WSDL file on the local file system.
Note: The remaining steps are optional. Once you enter a valid WSDL URL, you can go ahead and click the Finish button, and SOAtest will generate a suite of test cases that test every object associated with the WSDL you entered. If you would like to configure the test suite further, continue to the next step.
3. Select the Create Functional Tests from the WSDL checkbox and choose the Generate Web Service Clients radio button. To create server stubs and perform client testing, see “Creating Stubs from Functional Test Traffic”, page 535.
4. To create a separate test suite that generates a series of tests to verify every aspect of the WSDL, select the Create tests to validate and enforce policies on the WSDL checkbox.
5. Click Next. The Interoperability dialog opens.


6. Select whether you would like to create SOAtest (Java) Clients or .NET WCF Clients.
7. Click Next. The Create Environment dialog opens.

8. Select the Create a new environment for your project checkbox and enter an Environment Name and Variable Prefix, then select whether you want to create environment variables for WSDL URI Fields, Client Endpoints, or Both. For more information on environments, see “Configuring Testing in Different Environments”, page 369.


9. Click Next. The Policy Enforcement dialog opens.

10. Select the Apply Policy Configuration check box. This will create WSDL and functional tests that enforce the assertions defined in the specified policy configuration.
• The default policy configuration, soa.policy, is a collection of industry-wide best practices. To use a custom policy configuration, either use the Browse button to select a policy configuration, or enter the policy configuration's path in the text field. For details on policy enforcement, see “SOA Policy Enforcement: Overview”, page 570.

11. Click the Next button to advance to the Layout dialog.


12. (Optional) Select the Organize as Positive and Negative Unit Tests checkbox to create both positive and negative tests for each operation; it is important to test situations where expected data as well as unexpected data is sent to the server. The default value is Sort Tests Alphabetically.
13. (Optional) Select the Asynchronous radio button and choose Parlay, Parlay X, SCP, or WSAddressing to create asynchronous test suites. For more information on asynchronous testing, see “Creating Asynchronous Tests”, page 419.
14. Click the Finish button. SOAtest will generate a suite of test cases that test every object associated with the WSDL you entered.
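For reference, the wizard consumes a standard WSDL 1.1 document and generates one client test per operation declared in the service's port type and binding. A minimal skeleton has roughly the following shape (the Calculator names and example.com URLs below are illustrative, not taken from SOAtest):

```xml
<definitions name="Calculator"
             targetNamespace="http://example.com/calc"
             xmlns="http://schemas.xmlsoap.org/wsdl/"
             xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/"
             xmlns:tns="http://example.com/calc">
  <message name="addRequest"/>
  <message name="addResponse"/>
  <portType name="CalcPortType">
    <!-- each operation here becomes a generated SOAP Client test -->
    <operation name="add">
      <input message="tns:addRequest"/>
      <output message="tns:addResponse"/>
    </operation>
  </portType>
  <binding name="CalcBinding" type="tns:CalcPortType">
    <soap:binding transport="http://schemas.xmlsoap.org/soap/http"/>
    <operation name="add"><soap:operation soapAction=""/></operation>
  </binding>
  <service name="CalcService">
    <port name="CalcPort" binding="tns:CalcBinding">
      <!-- the endpoint here is what environment variables can parameterize -->
      <soap:address location="http://example.com/calc"/>
    </port>
  </service>
</definitions>
```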


Creating Tests From XML Schema

To automatically create a test suite from XML schema, complete the following:
1. Choose the XML Schema option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite
2. In the XML Schema wizard page, specify the location of the schema from which you want to generate tests.
3. Select the type of functional test you’d like to create:
• Generate Messaging Client (non-SOAP XML messages). For more information, see “Messaging Client”, page 782.
• Generate SOAP Clients (SOAP messages). For more information, see “SOAP Client”, page 777.
• Generate Server Stubs. For more information, see “Message Stub”, page 784.
4. Enter an Endpoint and click the Next button.


5. In the Elements page, select one or more elements from which to generate your tests and click the Finish button. SOAtest only recognizes element definitions defined at the top level (i.e., as a child of the root schema element).

A new test suite is created based on the XML Schema and functional test type you selected.
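The top-level restriction can be seen in the following fragment (the schema content here is an illustrative example, not from SOAtest): only order would be offered in the Elements page, because item is nested inside a complex type rather than declared as a child of the root schema element.

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- top-level element: selectable in the wizard's Elements page -->
  <xs:element name="order">
    <xs:complexType>
      <xs:sequence>
        <!-- nested element: NOT offered by the wizard -->
        <xs:element name="item" type="xs:string" maxOccurs="unbounded"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```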


Creating Tests From AmberPoint Management System

If your team uses AmberPoint Management System, you can export your runtime message sets or runtime validation baselines in the production environment, then provide this information to Parasoft SOAtest in order to create tests that can replay the SOAP messages. You can also establish the captured response messages as the regression control within the generated tests. To generate tests from an exported AmberPoint baseline or message set, complete the following:
1. Choose the AmberPoint Management System option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite
2. In the AmberPoint Management System wizard page, browse to the location of the AmberPoint file.


3. If you would like to create a regression test from the captured response messages, select the Create Regression Controls checkbox.
4. Click the Finish button. SOAtest generates a test suite from the exported baseline or message set file. If regression controls were created, a Diff tool is attached to each test.


Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository

SOAtest can create tests that enforce policies applied to Web service assets declared in an Oracle/BEA repository. You can select a Web service asset and choose the desired policies to enforce. To enforce Oracle/BEA AquaLogic policies, complete the following:
1. Choose the BEA AquaLogic Enterprise Repository option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite
2. In the BEA AquaLogic Enterprise Repository wizard page, enter the location of the repository in the Repository URL field, then enter a Username and Password. To save these settings, click the Save to Preferences button.
3. Click the Next button. A list of Available Web Service Assets corresponding to the selected repository displays.


4. Select the desired Web service asset from the list and click the Next button. A list of Policies applied to the assets you selected displays.


5. Select the desired policies and click the Finish button. SOAtest creates a test suite with the selected policies and tests whether these policies are enforced against the Web service assets.


Creating Tests From BPEL Files

SOAtest can automatically create test cases from vendor-specific BPEL deployment artifacts. You can then arrange these test cases into suites that reflect different aspects of testing a BPEL process. SOAtest can create the following types of tests from BPEL files:

BPEL Semantics Tests

BPEL Semantics tests include the BPEL Semantics Validator, a static analysis tool that verifies syntactic correctness through schema validation, which verifies that the elements and attributes conform to the XML Schema. Beyond this, the validator explicitly verifies constraints imposed by the BPEL specification that are not enforced by the XML Schema. The validator finds errors such as:
• Unresolved references to BPEL, WSDL, and XML Schema types.
• Violations of constraints on start activities, correlations, scopes, variables, links, and partner links.
• Incompatible types in assignments.
• Errors in embedded XPath expressions.

WSDL Tests

BPEL depends on Web Services Description Language (WSDL) to define both outgoing and incoming messages. The SOAtest BPEL Wizard examines the BPEL process deployment artifacts for WSDL references. For each referenced WSDL file, the Wizard will create tests that verify WSDL schema validity, semantic validity, and WS-I interoperability, and will create a regression control.

BPEL Process Tests

BPEL Process tests emulate external business partners accessing the deployed BPEL process. The SOAtest BPEL Wizard examines the business process deployment artifacts, including BPEL and WSDL files. The Wizard maps the process's partner link description to a WSDL port type and a protocol binding through which the process can be externally invoked. The BPEL Wizard then creates a test for each operation of the port type of the business process.

BPEL Partner Tests

The correct functioning of a BPEL process directly depends on the correct functioning of its business partners. A change in the behavior of a business partner may cause the BPEL process to fail, and finding the cause of such failures can be time consuming. By including BPEL partner tests in the BPEL process test suite, the SOAtest BPEL Wizard allows users to test BPEL partners as components of the BPEL process and detect business partner errors and unexpected behavior early in the development lifecycle. The SOAtest BPEL Wizard examines the business process deployment artifacts, including BPEL and WSDL files. The Wizard then maps partner link descriptions to WSDL port types and protocol bindings through which business partners can be externally invoked. The Wizard will then create a test suite for each business partner; within each test suite will be a test for every operation declared in the partner's port type.

Automatically Creating Test Suites from BPEL Process Deployment Artifacts

To automatically create test suites from BPEL process deployment artifacts, complete the following:
1. Choose the BPEL option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. In the BPEL wizard page, go to the BPEL Engine drop-down menu and select the BPEL engine type where the BPEL process you would like to test is deployed.
• If BPEL Maestro is selected from the BPEL Engine drop-down menu, complete the following:
• Enter the URL of the BPEL file in the BPEL URL field.
• Enter the public WSDL URL of the BPEL process in the Public WSDL URL field.
• If Active BPEL is selected from the BPEL Engine drop-down menu, complete the following:
• Enter the location of the BPEL process Deployment Descriptor File (.pdd file).
• Enter the location of the BPEL file.
• Enter the Engine URL where the Active BPEL 2.0 engine is deployed. For example, if you installed Active BPEL 2.0 to run in a Tomcat servlet container with the address http://mybpelhost:8080, then your Active BPEL 2.0 Engine URL will be http://mybpelhost:8080/active-bpel. Verify that this is the right URL by opening it in a browser; you should see the Administrative Servlets panel.




• If Generic BPEL is selected from the BPEL Engine drop-down menu, complete the following:
• Enter the URL of the BPEL file in the BPEL URL field.
• Enter the public WSDL URL of the BPEL process in the Public WSDL URL field.
For Generic BPEL engines, it is likely that the BPEL partner link to WSDL port mappings, and the ports and port types of your BPEL process business partners, are declared in WSDL files other than the public WSDL of the BPEL process. If this is the case, you should declare those dependency WSDLs in the Optional Parameters dialog. To invoke this dialog, click the Optional Parameters Configure button, click Add, and enter the dependency WSDL URL in the newly created table row. Add as many dependency WSDLs as needed.
3. Select the test categories that you would like the BPEL Wizard to create:
• Create BPEL Semantics Tests: Verifies semantic and schema validity of BPEL files.
• Create WSDL Tests: Checks WSDL files referenced in the BPEL deployment for schema validity, semantic validity, WS-I interoperability, and regression.
• Create BPEL Process Tests: Emulates external business partners accessing the deployed BPEL process.
• Create BPEL Partner Tests: Allows direct testing of BPEL process business partners.
4. Click the Finish button. SOAtest will examine the BPEL process deployment artifacts and automatically create test suites for the BPEL process you selected.


Creating Tests From Software AG CentraSite Active SOA

SOAtest can create tests that enforce policies applied to Web service assets that are declared in a Software AG CentraSite Active SOA repository. You can select a service asset and choose the desired policies to enforce. To enforce CentraSite Active SOA policies, complete the following:
1. Choose the CentraSite repository option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite
2. In the CentraSite wizard page, enter the location of the repository in the URL field, then enter a Username and Password. To save these settings, click the Save to Preferences button.
3. Click the Next button and enter a service name query in the Name Query field. This allows you to query for services that are registered in CentraSite Active SOA by name.


4. Click the Next button. A list of Available Web Service Assets corresponding to the entered query displays.
5. Select the desired service asset from the list and click the Finish button. SOAtest creates a test suite with the selected policies and tests whether these policies are enforced against the service assets.

Reporting Test Execution Results to CentraSite Active SOA

After running the test suite with the selected CentraSite Active SOA policies, you have instant access to quality data associated with the assets in CentraSite Active SOA. For details on how to report results to CentraSite Active SOA, see “Using Software AG CentraSite Active SOA with SOAtest”, page 682.


Creating Tests From JMS System Transactions

This topic explains how you can use SOAtest to monitor transactions that pass through any JMS system, then generate functional test cases that check the monitored messages. Sections include:
• Overview
• Prerequisites
• Generating Tests from JMS Transactions

Overview

SOAtest can monitor transactions that pass through a JMS system, then generate functional test cases that check the monitored messages. In addition to providing visibility into the system's messages, this allows you to replay transactions directly from SOAtest and verify that the monitored functionality continues to work as expected. To achieve this, you tell SOAtest how to connect to your JMS system and what destination (topic or queue) messages you want it to monitor, then you prompt it to start monitoring. SOAtest will generate a test suite of Messaging Client tests for each JMS message captured at the specified destination, or for all messages within the process flow (if a process tracking topic was used). These tests are preconfigured with the connection parameters, requests, and destination information so that SOAtest can replay the same messages.

Prerequisites

See “JMS Prerequisites”, page 694.

Generating Tests from JMS Transactions

To generate tests:
1. Choose the Java Message Service (JMS) option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. In the JMS wizard page, complete the following:
a. In the Connection area, specify your JMS connection settings.
b. In the Initial Context field, specify a fully-qualified class name string, passed to the JNDI javax.naming.InitialContext constructor as the value of the property named javax.naming.Context.INITIAL_CONTEXT_FACTORY.
c. In the Connection Factory field, specify the JNDI name for the factory. This is passed to the lookup() method in javax.naming.InitialContext to create a javax.jms.QueueConnectionFactory or a javax.jms.TopicConnectionFactory instance.
d. In the Destination Name field, specify the topic or queue that you want to monitor. You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special process tracking topic.
e. In the Destination Type field, specify whether the tracking destination is a topic or a queue.
f. (Optional) In the Message Selector field, enter a value to act as a message filter. See “Using Message Selector Filters”, page 704 for tips.
3. Click Next. SOAtest will start monitoring the messages that match the settings specified in the previous wizard page. If you run another application that sends messages to the bus, those messages will be noted in this panel.


4. When you are ready to stop monitoring, click Finish. SOAtest will then create test cases based on the verified messages.
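Under the hood, the Initial Context and Connection Factory fields correspond to standard JNDI/JMS lookup parameters. The sketch below shows how those values map onto the JNDI API; the ActiveMQ factory class and URL are assumed example values, not SOAtest defaults, and an actual lookup would additionally require your JMS provider's client jars on the classpath:

```java
import java.util.Hashtable;
import javax.naming.Context;

public class JmsWizardFields {
    public static Hashtable<String, String> buildEnv(String factoryClass, String providerUrl) {
        Hashtable<String, String> env = new Hashtable<>();
        // "Initial Context" wizard field -> javax.naming.Context.INITIAL_CONTEXT_FACTORY
        env.put(Context.INITIAL_CONTEXT_FACTORY, factoryClass);
        env.put(Context.PROVIDER_URL, providerUrl);
        return env;
    }

    public static void main(String[] args) {
        // Example values only; substitute your provider's factory class and broker URL
        Hashtable<String, String> env = buildEnv(
                "org.apache.activemq.jndi.ActiveMQInitialContextFactory",
                "tcp://localhost:61616");
        System.out.println(env.get(Context.INITIAL_CONTEXT_FACTORY));
        // With the provider jars available, the remaining wizard fields map to:
        //   new InitialContext(env).lookup("ConnectionFactory")        // "Connection Factory" field
        //   session.createConsumer(destination, "JMSType = 'order'")   // "Message Selector" field
    }
}
```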

Monitoring Intermediary Messages

In addition to automatically generating functional tests from monitoring the transaction messages that touch JMS endpoints in ESBs or middleware systems, you can also visualize and trace the intra-process JMS messages that take place as part of the transactions that are triggered by the tests, and then dissect them for validation. For details on how to do this, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.


Creating Tests From Sonic ESB Transactions

This topic explains how SOAtest can monitor transactions that pass through a Sonic ESB system, then generate functional test cases that check the monitored messages. In addition to providing visibility into the system's messages, this allows you to replay transactions directly from SOAtest and verify that the monitored functionality continues to work as expected. Sections include:
• Overview
• Prerequisites
• Generating Tests from Sonic ESB Transactions

Overview

SOAtest can monitor transactions that pass through Sonic ESB, then generate functional test cases that check the monitored messages. In addition to providing visibility into the system's messages, this allows you to replay transactions directly from SOAtest and verify that the monitored functionality continues to work as expected. To achieve this, you tell SOAtest how to connect to your Sonic ESB and what destination (topic or queue) messages you want it to monitor, then you prompt it to start monitoring. SOAtest will generate a test suite of Messaging Client tests for each JMS message captured at the specified destination, or for all messages within the process flow (if a process tracking topic was used). These tests are preconfigured with the connection parameters, requests, and destination information so that SOAtest can replay the same messages.

Prerequisites

The following jar files must be added to your classpath (via SOAtest> Preferences> System Properties):
• broker.jar
• mfcontext.jar
• sonic_Client.jar

Generating Tests from Sonic ESB Transactions

To generate tests:
1. Choose the Sonic Enterprise Service Bus option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. Complete the first page of the Sonic ESB wizard as follows:

a. In the Connection area, specify your Sonic ESB connection settings.

b. In the Destination Name field, specify the topic or queue that you want to monitor.

• You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special "dev.Tracking" tracking endpoint. For instance, if you want to track all events that occur as part of the process flow, specify the dev.Tracking endpoint, and set the process to Tracking Level 4 in the ESB.

c. In the Destination Type field, specify whether the tracking destination is a topic or a queue.

d. (Optional) In the Message Selector field, enter a value to act as a message filter. See “Using Message Selector Filters”, page 704 for tips.

3. Click Next. SOAtest will start monitoring the messages that match the settings specified in the previous wizard page. If you run another application that sends messages to the bus, those messages will be noted in this panel.

4. When you are ready to stop monitoring, click Finish. SOAtest will then create test cases based on the monitored messages.
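SOAtest passes the message selector to the JMS provider, which evaluates it against each message's headers and properties; the syntax is the SQL-92 subset defined by JMS. As a rough illustration of the semantics only (this is not SOAtest or broker code), a selector such as region = 'WEST' AND priority > 4 behaves like:

```python
import re

def selector_matches(selector: str, props: dict) -> bool:
    """Illustrative sketch of JMS message-selector semantics:
    translate a simple selector into a Python expression and evaluate
    it against the message's header/property values. Real brokers use
    a proper SQL-92 parser; this handles only basic comparisons."""
    expr = selector.replace(" AND ", " and ").replace(" OR ", " or ")
    expr = expr.replace("<>", "!=")
    # a lone '=' (not part of >=, <=, !=) means equality in selectors
    expr = re.sub(r"(?<![<>!=])=(?!=)", "==", expr)
    return bool(eval(expr, {"__builtins__": {}}, dict(props)))

print(selector_matches("region = 'WEST' AND priority > 4",
                       {"region": "WEST", "priority": 5}))   # True
print(selector_matches("region = 'WEST' AND priority > 4",
                       {"region": "EAST", "priority": 5}))   # False
```

Only messages whose properties satisfy the selector are delivered to the monitoring consumer; all others are skipped.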


Monitoring Intermediary Messages

In addition to automatically generating functional tests from monitoring the transaction messages that touch JMS endpoints in Sonic ESB, you can also visualize and trace the intra-process events that take place as part of the transactions that are triggered by the tests, and then dissect them for validation. For details on how to do this, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.


Creating Tests From TIBCO EMS Transactions

This topic explains how you can use SOAtest to monitor transactions that pass through a TIBCO EMS system, then generate functional test cases that check the monitored messages. Sections include:

• Overview
• Prerequisites
• Generating Tests from TIBCO EMS Transactions

Overview

SOAtest can monitor transactions that pass through TIBCO EMS, then generate functional test cases that check the monitored messages. In addition to providing visibility into the system’s messages, this allows you to replay transactions directly from SOAtest and verify that the monitored functionality continues to work as expected. To achieve this, you tell SOAtest how to connect to your TIBCO EMS and what destination (topic or queue) messages you want it to monitor, then you prompt it to start monitoring. SOAtest will generate a test suite of Messaging Client tests for each JMS message captured at the specified destination or for all messages within the process flow (if a process tracking topic was used). These tests are preconfigured with the connection parameters, requests, and destination information so that SOAtest can replay the same messages.

Prerequisites

The tibjms.jar file must be added to your classpath (via SOAtest> Preferences> System Properties).

Generating Tests from TIBCO EMS Transactions

To generate tests:

1. Choose the TIBCO Enterprise Messaging Service option in one of the available test creation wizards. For details on accessing the wizards, see:

• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. Complete the first page of the TIBCO EMS wizard as follows:

a. In the Connection area, specify your TIBCO EMS connection settings.

b. In the Destination Name field, specify the topic or queue that you want to monitor.

• You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special process tracking topic. For instance, to track any JMS message that gets transmitted through TIBCO EMS, use $sys.monitor.Q.r.> For details on specifying tracking topics for TIBCO EMS, see "Chapter 13: Monitoring Server Activity" and "Appendix B: Monitor Messages" in the TIBCO EMS User’s Guide.

c. In the Destination Type field, specify whether the tracking destination is a topic or a queue.

d. (Optional) In the Message Selector field, enter a value to act as a message filter. See “Using Message Selector Filters”, page 704 for tips.


3. Click Next. SOAtest will start monitoring the messages that match the settings specified in the previous wizard page. If you run another application that sends messages to the bus, those messages will be noted in this panel.

4. When you are ready to stop monitoring, click Finish. SOAtest will then create test cases based on the monitored messages.

Monitoring Intermediary Messages

In addition to automatically generating functional tests from monitoring the transaction messages that touch JMS endpoints in TIBCO EMS, you can also visualize and trace messages that take place through EMS as part of the transactions that are triggered by the tests, and then dissect them for validation. For details on how to do this, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.


Creating Tests From Traffic

Creating test suites from HTTP traffic is useful for replaying messages in a plain text traffic log/trace file. For example, you can log the traffic in a server and save it to a file, then provide that file to SOAtest in order to construct a SOAP Client for each SOAP request found in that file. You can optionally create a regression control with each response to validate whether each request continues to have the expected response (the response captured in the file) when the messages are replayed from SOAtest. In addition to generating SOAP Client tools for replaying the logged SOAP requests, you can also create stubs to virtualize/replace the represented servers in your testing environment (e.g., if they are not available/accessible for testing). Message traces or logs for test case or stub creation can be captured at the network level using network sniffing tools such as the freely available Wireshark tool (http://www.wireshark.org/), or obtained by having your application log its traffic.
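The traffic file is simply a plain-text log of the raw request and response messages. A minimal illustrative fragment is shown below; the host, endpoint, and operation names are hypothetical, and the exact trace formats SOAtest accepts are covered elsewhere in this guide:

```
POST /services/StockQuote HTTP/1.1
Host: services.example.com
Content-Type: text/xml; charset=UTF-8
SOAPAction: "getQuote"

<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <SOAP-ENV:Body><getQuote><symbol>PRFT</symbol></getQuote></SOAP-ENV:Body>
</SOAP-ENV:Envelope>

HTTP/1.1 200 OK
Content-Type: text/xml; charset=UTF-8

<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <SOAP-ENV:Body><getQuoteResponse><price>42.50</price></getQuoteResponse></SOAP-ENV:Body>
</SOAP-ENV:Envelope>
```

A Follow TCP Stream capture saved from Wireshark has essentially this shape: request headers, request body, response headers, response body.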

Wireshark Tip: Once the trace is captured, highlight one of the relevant TCP packets and select Analyze> Follow TCP Stream. Then, save it by clicking Save As.

To automatically create a test suite from HTTP traffic, complete the following:

1. Choose the Traffic option in one of the available test creation wizards. For details on accessing the wizards, see:

• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. In the Traffic wizard page, specify the location of the traffic file from which you want to create tests.

3. Specify what type of tests you want to generate. Available options include:

• Generate Server Stubs: Creates stubs for servers that you want to stub, or virtualize, during testing. For example, you could use this to emulate servers that are not available for testing—instead of accessing the actual servers from the test environment, you interact with the virtualized ones. SOAtest will construct a Message Stub tool for each group of structurally similar messages and configure the various messages in each group as Multiple Responses within the Message Stub. For details on SOAtest’s stubbing/virtualization capabilities, see “Service Virtualization: Creating and Deploying Stubs”, page 531.

• Generate Web Service Clients: Creates SOAP Client tools that replay the messages represented in the file. SOAtest will construct a SOAP Client for each SOAP request found in that file.

4. (For Web service clients only) Customize generation options as needed. Available options are:

• Group similar sequential messages into tests parameterized with file data sources: Consolidates structurally similar messages (messages that differ only in text content changes in elements and attributes) into a single SOAP Client that is parameterized with a file data source. This is particularly useful when creating tests from large traffic files: it relieves you from having to create a test for every request message in the file. It also facilitates test maintenance and management, since fewer tests are created and they are better organized.

• Create Regression Control: Creates regression controls for each test. This allows you to validate whether each request continues to have the expected response (the response captured in the file) when the messages are replayed from SOAtest.
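The exact heuristic SOAtest uses to decide that two messages are "structurally similar" is not spelled out here, but the idea (same element structure, different text values) can be sketched as follows; this is an illustrative approximation, not SOAtest's implementation:

```python
import xml.etree.ElementTree as ET

def structure_key(xml_text: str):
    """Build a key from element tags and attribute names only,
    ignoring text content, so messages that differ only in values
    map to the same group."""
    def walk(elem):
        return (elem.tag, tuple(sorted(elem.attrib)),
                tuple(walk(child) for child in elem))
    return walk(ET.fromstring(xml_text))

def group_similar(messages):
    """Group messages whose element structure is identical."""
    groups = {}
    for message in messages:
        groups.setdefault(structure_key(message), []).append(message)
    return list(groups.values())

msgs = [
    "<getQuote><symbol>AAA</symbol></getQuote>",
    "<getQuote><symbol>BBB</symbol></getQuote>",
    "<login><user>u</user><pass>p</pass></login>",
]
print(len(group_similar(msgs)))  # 2 structural groups
```

The two getQuote requests collapse into one group (one parameterized SOAP Client, with the symbol values becoming data source rows), while the structurally different login request forms its own group.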

5. Do one of the following:

• For Web Service Clients: Click Finish. SOAtest will import tests from the captured HTTP traffic and create a SOAP Client for each SOAP request/response pair.

• For Server Stubs: Click Next, customize stub deployment settings if desired, then click Finish. SOAtest will then generate and deploy stubs that emulate the recorded traffic.


Creating Tests From a UDDI

To automatically create a test suite from a UDDI registry, complete the following:

1. Choose the UDDI option in one of the available test creation wizards. For details on accessing the wizards, see:

• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. In the UDDI wizard page, enter a new endpoint in the UDDI Inquiry Endpoint field or select a previous endpoint from the drop-down menu.

3. Select Business, Service, or TModel for the service Type in the UDDI.

4. Enter a search keyword for the UDDI service in the Query field.

5. Enter an integer in the Maximum Results field to limit the number of results from the query.
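Behind the scenes, a UDDI inquiry is itself a SOAP call. For a Service type search, the wizard settings correspond roughly to a UDDI find_service request like the following (an illustrative UDDI v2 sketch; the name keyword mirrors the Query field and maxRows mirrors Maximum Results):

```xml
<find_service generic="2.0" maxRows="10" xmlns="urn:uddi-org:api_v2">
  <name>Quote</name>
</find_service>
```

The registry answers with a serviceList of matching entries, from which SOAtest resolves the associated WSDLs.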


6. Click Next. The Policy Enforcement dialog opens.

7. Select the Apply Policy Configuration check box. This will create WSDL and functional tests that enforce the assertions defined in the specified policy configuration.

• The default policy configuration, soa.policy, is a collection of industry-wide best practices. To use a custom policy configuration, either click the Browse button to select a policy configuration or enter the policy configuration's path in the text field. For details on policy enforcement, see “SOA Policy Enforcement: Overview”, page 570.


8. Click the Next button to advance to the Layout dialog.

9. (Optional) Select the Organize as Positive and Negative Unit Tests checkbox to create both positive and negative tests for each operation; it is important to test situations where expected as well as unexpected data is sent to the server. The default value is Sort Tests Alphabetically.

10. (Optional) Select the Asynchronous radio button and choose Parlay, Parlay X, SCP, or WS-Addressing to create asynchronous test suites. For more information on asynchronous testing, see “Creating Asynchronous Tests”, page 419.

11. Click the Finish button. SOAtest will generate a suite of test cases that test every object associated with the WSDLs found in the UDDI registry.


Creating Tests From a WSIL

To automatically create a test suite from a valid WSIL document, complete the following:

1. Choose the WSIL option in one of the available test creation wizards. For details on accessing the wizards, see:

• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. In the WSIL wizard page, enter a valid WSIL URL in the WSIL URL field, or click the Browse button to locate a WSIL URL.

• Note: The remaining steps are optional. Once you enter a valid WSIL URL, you can click the Finish button and SOAtest will generate a suite of test cases that test every object associated with the WSDLs within the WSIL you entered. If you would like to configure the test suite further, continue to the next step.
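For reference, a WSIL (WS-Inspection) document is an XML file that points to one or more WSDLs. A minimal illustrative example (the location URL below is hypothetical):

```xml
<?xml version="1.0"?>
<inspection xmlns="http://schemas.xmlsoap.org/ws/2001/10/inspection/">
  <service>
    <description referencedNamespace="http://schemas.xmlsoap.org/wsdl/"
                 location="http://services.example.com/StockQuote?wsdl"/>
  </service>
</inspection>
```

SOAtest walks each description entry, retrieves the referenced WSDL, and generates tests from it.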


3. Select the Create Functional Tests from the WSDL checkbox and choose the Generate Web Service Clients radio button. To create server stubs and perform client testing, see “Creating Stubs from Functional Test Traffic”, page 535.

4. To create a separate test suite that generates a series of tests to verify every aspect of the WSDL, select the Create tests to validate and enforce policies on the WSDL checkbox.

5. Click Next. The Policy Enforcement dialog opens.

6. Select the Apply Policy Configuration check box. This will create WSDL and functional tests that enforce the assertions defined in the specified policy configuration.

• The default policy configuration, soa.policy, is a collection of industry-wide best practices. To use a custom policy configuration, either click the Browse button to select a policy configuration or enter the policy configuration's path in the text field. For details on policy enforcement, see “SOA Policy Enforcement: Overview”, page 570.

7. Click the Next button to advance to the Layout dialog.


8. (Optional) Select the Organize as Positive and Negative Unit Tests checkbox to create both positive and negative tests for each operation; it is important to test situations where expected as well as unexpected data is sent to the server. The default value is Sort Tests Alphabetically.

9. (Optional) Select the Asynchronous radio button and choose Parlay, Parlay X, SCP, or WS-Addressing to create asynchronous test suites. For more information on asynchronous testing, see “Creating Asynchronous Tests”, page 419.

10. Click the Finish button. SOAtest will generate a suite of test cases that test every object associated with the WSDLs within the WSIL you entered.


Creating Asynchronous Tests

In this age of flexible, high-performance web services, asynchronous communication is often used to exchange data, allowing the client to continue with other processing rather than blocking until a response is received. SOAtest comes packaged with a server that runs in the background and manages the asynchronous call back messages received. When creating a test suite from WSDL or WSIL documents, you can use the Layout dialog in the test creation wizard to create asynchronous tests. SOAtest supports the major asynchronous communication protocols, including Parlay, Parlay X, SCP, and WS-Addressing. After you select the Asynchronous option in the test creation wizard, a test suite folder is created that contains automatically configured asynchronous test cases for each operation defined within the WSDL or WSIL you entered.

Notice that two asynchronous tests are created for each WSDL in the test suite. The first test is a SOAP Client test that sends an initial request to the asynchronous service. The second is a tool called the Call Back tool. Using the Call Back tool, SOAtest is able to listen for call back messages that are sent in an asynchronous messaging exchange. For more information on the Call Back tool, see “Call Back”, page 847. A local server has been integrated into SOAtest, allowing the Call Back tool to listen for these incoming messages. For this reason, it is important that the stub server is running before executing these asynchronous tests. To do this, complete the following:

1. Choose Window> Show View> Stub Server.

2. Right-click the Server node in the Stub Server tab and select Start Server from the shortcut menu. A green light next to the node indicates that the server has been successfully started.
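The Call Back tool's listener is internal to SOAtest, but the underlying pattern (an initial request, then a locally hosted endpoint that waits for the service's call back) can be sketched as follows; the port, path, and payload here are illustrative, and the "service" is simulated by a local POST:

```python
import threading, queue
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen, Request

received = queue.Queue()

class CallBackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Capture the call back payload and hand it to the waiting test.
        body = self.rfile.read(int(self.headers["Content-Length"]))
        received.put(body)
        self.send_response(200)
        self.end_headers()
    def log_message(self, *args):
        pass  # keep the console quiet

# Start the listener on an ephemeral port, like SOAtest's stub server.
server = HTTPServer(("127.0.0.1", 0), CallBackHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# A real asynchronous service would POST here later; we simulate it.
urlopen(Request(f"http://127.0.0.1:{port}/callback",
                data=b"<callBack>done</callBack>")).read()

message = received.get(timeout=5)
print(message.decode())  # the call back payload the listener captured
server.shutdown()
```

This mirrors why the stub server must be running first: if no listener is bound when the service sends its call back, the message is lost and the test cannot complete.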


Testing RESTful Services

SOAtest fully supports the testing of RESTful services via the REST Client tool, which was specifically designed for sending messages to RESTful services. Messages can be sent with HTTP GET, POST, PUT, DELETE, HEAD, OPTIONS, or custom methods. For more details, see “REST Client”, page 829.


Sending MTOM/XOP Messages

SOAtest enables the testing of Web services leveraging the MTOM (Message Transmission Optimization Mechanism) and XOP (XML-binary Optimized Packaging) technologies. SOAtest enables users to select the binary content to include in the transmission, as well as to send, receive, and validate optimized messages. MTOM-optimized messages can be sent using the SOAP Client in Form Input view. To enable the sending of optimized messages, complete the following:

1. In the Misc tab of a SOAP Client, select Custom from the Attachment Encapsulation Format drop-down menu and choose either MTOM Always or MTOM Optional.

• MTOM Always: SOAtest will always send the request in an XOP package (i.e., with MIME boundaries), even when there is no optimized content in the request.

• MTOM Optional: SOAtest will only send the request in an XOP package (i.e., with MIME boundaries) when there is optimized content in the request. In the absence of optimized content, SOAtest will send a normal SOAP request. Note that MTOM Always or MTOM Optional can also be selected at the Test Suite level in the SOAP Client Options tab or in the SOAP Client page of the SOAtest Preferences panel.

2. Select the Request tab and make sure Form Input is selected from the Views menu. The Form Input view is a schema-aware view. In this view, SOAtest will recognize the xsd:base64Binary schema datatype and allow you to reference the content that you want to optimize. When you click on a base64Binary type, the following options are available:

• Reference to file: This is the recommended option. It allows you to select a file to send as optimized content. SOAtest reads the contents from the file as they are being sent on the wire; this way, the contents of the file do not need to be stored in memory.

• Persist As Relative Path: It is recommended that the path to the file be kept relative to the Test Suite to allow for easier sharing and collaboration with the rest of the organization.

• Import from file: This option (not recommended) reads the contents in from the file. It is not recommended for big files because the contents of the file will be loaded into memory.

Additionally, the file to be sent can be driven by data sources by selecting Parameterized from the drop-down box and selecting a File Data Source. For more information about File Data Sources, see “Adding a File Data Source”, page 351.
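What ultimately travels in a non-optimized xsd:base64Binary element is the file's bytes, base64-encoded; with MTOM, that inline text is instead replaced on the wire by a binary MIME part referenced via xop:Include. A small illustrative sketch (the file contents and the imageData element name are hypothetical):

```python
import base64, os, tempfile

# Write some stand-in binary content to a temporary file.
payload = b"\x89PNG\r\n...binary attachment bytes..."
with tempfile.NamedTemporaryFile(delete=False, suffix=".png") as f:
    f.write(payload)
    path = f.name

# Inline (non-MTOM) form: the element carries base64 text.
with open(path, "rb") as f:
    encoded = base64.b64encode(f.read()).decode("ascii")
element = f"<imageData>{encoded}</imageData>"

# The encoding round-trips losslessly back to the original bytes.
assert base64.b64decode(encoded) == payload
os.remove(path)
print(element[:24])
```

Because base64 inflates the payload by roughly a third and must be held as text, streaming the raw bytes as an MTOM attachment (as the Reference to file option does) is preferable for large files.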


Sending and Receiving Attachments

Many Web services employ attachments to send and receive data that is not well represented with XML messages, such as multimedia binary data. SOAtest can be used to send and receive attachments along with SOAP messages. Received attachments can then be processed and validated for correctness. The SOAP Client tool can be configured to send MIME, DIME, and MTOM attachments along with the request SOAP message. For more information, see “Misc Options”, page 780. The Attachment Handler tool can be used in conjunction with a SOAP Client to extract and validate incoming attachments from a response SOAP message. For more information on configuring the Attachment Handler tool, see “Attachment Handler”, page 962.


Accessing Web Services Deployed with HTTPS

To configure SOAtest to work with Web services deployed using HTTPS (HTTP over SSL), you need to identify the certificate being used for the HTTPS connection from the server, then register that certificate with SOAtest. You do this as follows:

1. Close SOAtest if it is currently open.

2. Identify the location of the server certificate used for the HTTPS connection.

3. Ensure that this certificate’s COMMON NAME parameter contains both the server’s machine name and the subdomain (for example, machine.company.com).

4. Copy the certificate to <soatest_install_dir>/plugins/com.parasoft.xtest.libs.web_<soatest_version_number>/root/lib. This directory should contain a cacerts file in which SOAtest stores trusted certificates.

5. Execute a command of the following format:

keytool -import -alias <certificate_alias> -file <certificate_file> -keystore cacerts

For example, if your certificate file is named test.cert and your SOAtest installation directory is C:\Program Files\Parasoft\SOAtest\6.2, you would execute the following command from the C:\Program Files\Parasoft\SOAtest\6.2\lib prompt:

keytool -import -alias serverTrustCert -file test.cert -keystore cacerts

This will import the certificate into the cacerts file with the alias "serverTrustCert".

6. When prompted to enter a keystore password, enter changeit.

7. When asked whether you want to trust this certificate, enter yes. You will then see a message indicating that the certificate has been added to the keystore.

8. (Optional) Verify that the certificate has been added to the keystore by entering the following command, then checking its output:

keytool -list -keystore cacerts

9. Launch SOAtest and try to access the service again.

If SOAtest still does not work with services deployed using HTTPS, ensure that:

1. Your server is running.

2. You used the full name of the machine when trying to communicate over HTTPS.

3. The server certificate was created with the full name.

4. The name on the certificate is identical to the name the client used to access it.

If you cannot satisfy the above requirements (for example, if you don’t have the necessary permissions):

1. Choose SOAtest> Preferences to open the SOAtest Preferences dialog.

2. Select Security from the left pane of the SOAtest Preferences dialog, then select the Trust all certificates option in the right pane.

3. Click OK or Apply to apply this change. SOAtest will then try to access any WSDL you specify, regardless of any problems with the certificate. However, SOAtest will still try to use the certificate while sending SOAP messages because it is required to do so.

Note: You must add certificates to cacerts files on load test slave machines as well as on the master machine. Otherwise, SSL connections will not work when running a load test with slave machines.
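The common-name requirement above boils down to: the host name the client uses must match the certificate's CN. A rough, illustrative sketch of that check (real SSL stacks do more, e.g., subjectAltName handling and restricting wildcards to a single label):

```python
import fnmatch

def cn_matches(common_name: str, requested_host: str) -> bool:
    """Rough sketch of the hostname check an SSL client performs:
    the CN must equal the fully qualified name the client used.
    Note: fnmatch lets '*' span dots, which is looser than real
    TLS wildcard rules; this is illustration only."""
    return fnmatch.fnmatchcase(requested_host.lower(), common_name.lower())

assert cn_matches("machine.company.com", "machine.company.com")
assert not cn_matches("machine.company.com", "machine")  # short name fails
assert cn_matches("*.company.com", "machine.company.com")
```

This is why accessing the service as https://machine/... fails when the certificate was issued to machine.company.com: always use the fully qualified name.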


If none of these procedures solve your problem, contact Parasoft in one of the ways described in “Contacting Parasoft Technical Support”, page 16.

Debugging SSL Issues

Parasoft SOAtest runs on a standard JVM. To show the SSL/TLS handshake details and help identify causes of SSL connection problems, enable JVM network and SSL debugging as follows:

1. Open a command line console and navigate to the SOAtest installation directory.

2. Start the SOAtest executable with the arguments:

-J-Dssl.debug=true -J-Djavax.net.debug=all -consolelog

SOAtest will start as usual, but whenever SSL connections are made, debugging output will be printed on the console. If you wish to save the trace output to a file (for example, output.txt), you may append the following to the end of the command:

> output.txt

For more information about managing keys and certificates using the Java keytool, refer to:

• Windows: http://java.sun.com/javase/6/docs/technotes/tools/windows/keytool.html
• Linux and Solaris: http://java.sun.com/javase/6/docs/technotes/tools/solaris/keytool.html

JMS SSL

See “JMS SSL”, page 703.


Configuring Regression Testing

This topic explains how to configure your functional test suite for regression testing, which helps you determine whether functionality unexpectedly changes over time. Sections include:

• Understanding Regression Testing
• Creating Regression Tests
• Modifying Regression Options

Understanding Regression Testing

The purpose of regression testing is to detect unexpected faults—especially those faults that occur because a developer did not fully understand the internal code correlations when he or she modified or extended code that previously functioned correctly. Regression testing is the only reliable way to ensure that modifications did not introduce new errors into code or to check whether modifications successfully eliminated existing errors. During regression testing, SOAtest runs specified test cases, compares the current outcomes with the previously-recorded outcomes, then alerts you to differences between the current responses and control responses. Subsequent regression test runs will remind you of this discrepancy until the application returns the expected result. We recommend that you perform functional testing to verify that your service functions correctly, then begin regression testing to ensure that system changes do not introduce errors or unexpected changes.

Creating Regression Tests

You can automatically create regression controls for an entire test suite, or you can record regression controls for individual test cases. If you want to run regression testing on all or most test cases, we recommend that you automatically add regression controls to the complete test suite, then remove the controls you do not want to use. If you only want to perform regression testing on a limited number of test cases, it is more efficient to add the controls individually.

Automatically Creating Regression Controls

To have SOAtest automatically create regression controls for an entire test suite or for specific test cases:

1. Right-click the appropriate test suite or test nodes, then choose Create/Update Regression Control from the shortcut menu.

2. In the Response Validation wizard that opens, expand Create Regression Control, then specify what type of control you want to create and click Finish. A Diff control will be added to the selected test(s).

• Single vs. Multiple Regression Controls: This option is only available for tests that use a parameterized value from a data source. It determines whether SOAtest creates regression controls using only one row from the data source, or all rows from the data source.

• Internal vs. External Regression Controls: External regression controls allow you to manage the regression controls outside of SOAtest. If you are working with large messages, external controls are recommended because 1) they reduce the size of the .tst file and 2) they enable you to use the ExamXML Diff option, which can process very large files. Regression content in external files can also be searched, updated, scripted, and managed, since the content is readable text.

Each subsequent time SOAtest runs a test case with a Diff control, it will compare the actual outcomes with the outcome specified in the Diff control. If a single regression control was created, a single file is saved in the same directory as the current project file. This file will contain the regression content. If multiple regression controls were created, the response associated with each data source row will be saved into a separate file. A directory with a random name will also be created; it will contain multiple files with names such as DS_Row_001.xml. The number in this file name correlates to the row number. For example, if the data source has five rows, and the test ran five times when creating regression controls, five files will be generated.
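The per-row file naming described above (one external control file per data source row) follows a simple zero-padded scheme, sketched here for illustration:

```python
def control_file_names(row_count):
    """One external regression-control file per data source row,
    named DS_Row_001.xml, DS_Row_002.xml, ... as described above."""
    return [f"DS_Row_{row:03d}.xml" for row in range(1, row_count + 1)]

# A five-row data source yields five control files.
print(control_file_names(5))
```

Because each file maps to one row, you can regenerate or hand-edit the expected response for a single row without disturbing the controls for the rest of the data source.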

Manually Creating a Regression Control

If you want to perform regression testing on a test case, but do not want to use the current outcome as the comparison value, we recommend that you define your own Diff control for that test case. To define your own Diff control for a test case:

1. Use the procedure described in “Adding Test Outputs”, page 333 to add a Diff tool.

2. Configure the Diff tool as described in “Diff”, page 899.

Each subsequent time SOAtest runs a test case with a Diff control, it will compare the actual outcomes with the outcome specified in the Diff control.

Modifying Regression Options

To automatically update a regression control:

• Right-click the appropriate test suite or test nodes, then choose Create/Update Regression Control from the shortcut menu.

To manually update a regression control:

• Double-click the Diff tool and modify the available options.

To customize the logic and data source usage for all regression controls in a test suite:

• Double-click the test suite’s Test Case Explorer node and modify the available options, which are described in “Regression Options”, page 325.


Validating the Value of an Individual Response Element

This topic explains how to use SOAtest’s Response Validation wizard to validate the value of an individual response element. To configure SOAtest to validate a specific element from the service’s response message:

1. Right-click the appropriate test suite or test nodes, then choose Create/Update Regression Control from the shortcut menu.

2. In the Response Validation wizard that opens, select Create Value Assertion, then click Next. The next page will show a tree representation of the expected response message from the server.

3. Select the tree node that corresponds to the element you want to validate, then click Finish. An XML Assertor tool will then be configured and chained to the test. For details on this tool, see “XML Assertor”, page 917.


Validating the Structure of the XML Response Message

This topic explains how to use SOAtest’s Response Validation wizard to quickly add an XML Validator to verify conformance of the incoming response to any XML Schemas that it is bound to. To configure SOAtest to validate the structure of an incoming response message:

1. Right-click the appropriate test suite or test nodes, then choose Create/Update Regression Control from the shortcut menu.

2. In the Response Validation wizard that opens, select Validate XML Response, then click Finish. An XML Validator tool will then be configured and chained to the test. For details on this tool, see “XML Validator”, page 862.


Web Functional Tests

In this section:

• Web Functional Testing: Overview
• Recording Tests from a Browser
• Generating JUnit Tests from a Web Browser Recording
• Configuring Browser Playback Options
• Configuring User Actions (Navigation, Delays, etc.)
• Configuring Wait Conditions
• Validating or Storing Values
• Stubbing Test Requests/Responses
• Running Web Scenarios in Headless Mode
• Running Static Analysis as Functional Tests Execute
• Customizing Recording Options
• Creating Custom Locators for Validations and Extractions
• Understanding Web Functional Test Errors
• Creating a Report of Test Suite Maintainability


Web Functional Testing: Overview

This topic provides an overview of SOAtest’s web functional testing capabilities. Web interface testing is difficult to automate. Teams often abandon automated testing in favor of manual testing due to too many false positives or too much effort required to maintain the test suites. SOAtest facilitates the creation of automated test suites that are reliable and dependable. Its ability to isolate testing to specific elements of the Web interface eliminates noise and provides accurate results. SOAtest isolates and tests individual application components for correct functionality across multiple browsers without requiring scripts. Dynamic data can be stubbed out with constant data to reduce test case noise. Validations can be performed at the page object level as well as the HTTP message level. SOAtest also verifies the client-side JavaScript engine under expected and unexpected conditions through asynchronous HTTP message stubbing.


Recording Tests from a Browser

This topic explains how to record web functional tests from a browser. Sections include:

• Recording New Tests
• Extending an Existing Test Scenario
• Exploring the Functional Tests Generated
• Reviewing Pre-Action and Post-Action Browser Contents
• Exploring the Asynchronous Test Requests Generated
• Handling Internet Explorer Security Settings
• Tutorial

Recording New Tests

Want to extend an existing test scenario (instead of recording a new one)? See “Extending an Existing Test Scenario”, page 433 for details on how to add new steps to an existing test scenario.

To create a new functional test for a web application, complete the following:

1. Do one of the following:

• To add a test suite to a new project, open the pull-down menu for the New toolbar button (top left), then choose Project from Web Browser Recording. In the Create a SOAtest Project wizard that opens, enter a project name in the Project name field, then click Next.

• To add a new .tst file to an existing project, select the project node where you want the new .tst file added, choose File> New> New Test (.tst) file, enter a .tst file name and location, click Next, choose Web> Record functional tests, then click Next again.

• To add a test suite to an existing project and .tst file, select the project’s Test Suite node where you want the new test suite added, then click Add Recorded Tests.

431

Recording Tests from a Browser

2. In the first Record Web Functional Tests wizard page, indicate whether you want to create the new functional test from scratch or use an existing test scenario as a starting point.
•	Record starting from a referenced test allows you to start recording a new functional test scenario that builds upon an existing (reusable) functional test scenario. For example, you can record one test scenario that captures the steps to log into your application, then reference it when creating new test scenarios. This way, you don’t need to record the login steps every time you create a test scenario that requires a login. If the login steps change, you only need to update the one login test scenario; all related test scenarios will automatically use the updated information.

3. Complete the next Record Web Functional Tests wizard page as follows:
•	Test Suite Name: Enter a test suite name.
•	Start Recording From: Enter the URL of the site on which you would like to perform functional testing. To record applications that "live" on the same machine as SOAtest, do not use localhost. Instead, use the machine name (e.g., mymachine.parasoft.com) or the IP address (e.g., 10.10.11.11).
•	Generate Functional Tests: Select this option if you want SOAtest to record user actions on the page and generate a test suite that will allow you to replay the entire scenario in Firefox and/or Internet Explorer.
•	Auto Generate Response Stubs: Select this option if you want SOAtest to automatically generate stub outputs for functional tests that have asynchronous responses.
•	Generate Asynchronous Request Tests: While you navigate a web site in Firefox, the site may use the XMLHttpRequest object or hidden IFrame calls to asynchronously request data from a server. Selecting Generate Asynchronous Request Tests prompts SOAtest to capture those requests and their responses, and then generate and auto-configure tests to validate these requests. For more information, see “Exploring the Asynchronous Test Requests Generated”, page 437.
•	Record with: Specifies the browser in which you want to record the test.
•	Firefox executable path: If you are running Windows, SOAtest will automatically attempt to locate a version of Firefox on your machine. If SOAtest cannot locate Firefox, or if you are running Linux, you will need to click Browse to executable. This opens a file chooser from which you can browse to the location of Firefox on your machine. SOAtest will display the version of Firefox that you have selected just below the path field. Note: SOAtest supports Mozilla Firefox 1.5 and higher.
•	Generate Test Maintainability Report: Specifies whether you want SOAtest to generate a report that helps you gauge the maintainability of a test suite. See “Creating a Report of Test Suite Maintainability”, page 476 for details.

4. Click the Next button.

5. (Optional) Complete the Create Environment page.
•	The controls in this page allow you to specify whether environment variables automatically get added to recorded tests. For functional tests, they are used in the URL of the first navigate test. For asynchronous request tests, they are used in the endpoint and the HTTP "Referer" header of each Messaging Client that is generated for each asynchronous request.



•	These variables are generated by default. If you do not want these variables generated into the recorded tests, disable the Add url variable to your existing environment option.
•	Name specifies the name that will be used for the environment if a new environment is created. (A new environment is created if the test suite that the tests are being generated into does not already contain at least one environment.)
•	Prefix specifies the prefix that will be used on the environment variables that are generated into the environment and referenced by the tests. The text below the prefix field demonstrates what the environment variable name will look like based on the specified prefix.

6. Click the Finish button. The designated start page will open in the selected browser.
•	If you configured SOAtest to record starting from a referenced test, that test will be played back in the browser before you can start adding the new test steps.

7. Perform the actions that you want to capture in the browser(s). You can click links, complete and submit forms, use the navigation bar to enter or open URLs, access shortcuts, go forward or back, and so on.
•	To ensure that recording works properly, wait until each page has fully loaded before performing an action.

Tip - Completing Forms
To complete forms, enter the values directly into the GUI controls as if you were actually navigating the site. For instance, type in your user name and password, select radio buttons, and check or clear check boxes. As you record sessions, please note:
•	Password recall and auto-completion (enabled in Internet Explorer’s Internet Options Advanced settings) are not supported during recording.
•	Google Toolbar's Auto Fill feature is not supported.
•	A "type" test may not be recorded if you type the beginning of a term into a field, but then click on a suggestion from a drop-down.

8. Close the browser window(s). A new Test Suite will appear in the Test Case Explorer. This new Test Suite will contain different tests depending on your selections from the wizard’s Test Type field. For more information, see the following subsections.

Extending an Existing Test Scenario

To add new steps to an existing test scenario (for instance, if you want to add more steps to the middle of a use case scenario that you already recorded):

1. Select the point in the scenario where you want to start recording from.
2. Click Add Recorded Tests.
3. In the first Record Web Functional Tests wizard page, choose Continue recording from <selected test>, then click Next.
4. Continue recording the test in the usual manner.

Exploring the Functional Tests Generated

If you selected the Generate Functional Tests option in the Record Web Functional Tests wizard, a Test Suite: Functional Tests folder is added to the Test Case Explorer. For each recording session completed, a test scenario is added. Each test within this scenario is a Browser Testing tool that was automatically added and configured to represent one action taken in the browser. Actions taken inside forms are placed inside a nested test scenario.

Each test’s configuration is controlled by the settings available in the tool configuration panel, which can be opened by double-clicking the test node. This panel contains the following tabs:
•	The Pre-Action Browser Contents tab shows browser contents before the user action (for the first test in a scenario, there are no contents to display). The contents are from the last run in which the previous browser test was successful (the post-action contents of the previous test are the pre-action contents for this test). If there are multiple windows, the one in which the test's user action occurs is displayed.
•	The User Action and Wait Conditions tabs correspond to settings that are used, in order, by each test. First, the test executes the User Action; second, it waits based on the conditions set in the Wait Conditions.

Two additional tools are chained to each Browser Testing tool that is added:
•	A Traffic Viewer that shows the HTTP traffic.
•	A Browser Contents Viewer that stores and displays the browser contents from the last test run—whether that test succeeded or failed.

Reviewing Pre-Action and Post-Action Browser Contents

Pre-Action
To see the page before the test action occurred, use the Browser Testing tool’s Pre-Action Browser Contents tab. For example, for a Click "Athletic" test—a test that mimics a user narrowing search results to include only athletic shoes—this tab shows what the web page looked like before the click.

You can see the HTML for a given element by right-clicking that element, then choosing Inspect <Element> from the shortcut menu.

Post-Action
To see the page after the test action occurred, use the Browser Contents Viewer tool. For the same example, this tool shows what the web page looked like after the Click "Athletic" test.

Colored Borders
Note that various colored borders are used to highlight elements that are the source of operations such as extractions, validations, and user actions.

The following table explains what each colored border indicates.

Color	Marks the source of
Blue	User actions
Red	Validations
Gray	Data bank extractions
Green	Wait conditions
Purple	Validations + data bank extractions

Rendering Pages on Linux
SOAtest uses the XULRunner runtime (version 1.9.0.1) to render web pages in SOAtest (for example, the rendered pages in the Browser Contents Viewer or the Browser Functional Test). The system requirements for XULRunner are the same as the system requirements for Firefox 3. On Windows, there are no notable system requirements. On Linux, additional software is required. If these software requirements are not met, you may encounter errors (including SOAtest terminating) when you try to open an editor with a rendered page.

Exploring the Asynchronous Test Requests Generated

If you select the Generate Asynchronous Request Tests option in the Generation Options panel of the Record Web Functional Tests wizard, an Asynchronous Request Tests folder is added to the Test Case Explorer.

SOAtest detects XMLHttpRequest calls and hidden IFrames and uses them to auto-configure asynchronous request tests. IFrames are considered “hidden” if they meet any of the following criteria:
•	The IFrame has a width and height of either 0 or 1 pixels.
•	The IFrame’s visibility attribute is set to hidden.
•	The IFrame’s display attribute is set to none.
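The criteria above can be expressed as a simple predicate. The sketch below is illustrative only, not SOAtest's actual detection code; in particular, it interprets "a width and height of either 0 or 1 pixels" as requiring both dimensions to be that small.

```java
public class HiddenIframeSketch {
    // Returns true when an IFrame meets any of the documented "hidden" criteria:
    // width and height both 0 or 1 pixels, visibility set to "hidden",
    // or display set to "none". (Treating "width and height of 0 or 1" as
    // applying to both dimensions is an interpretation, not a documented rule.)
    static boolean isHidden(int widthPx, int heightPx, String visibility, String display) {
        boolean tiny = (widthPx == 0 || widthPx == 1) && (heightPx == 0 || heightPx == 1);
        return tiny || "hidden".equals(visibility) || "none".equals(display);
    }

    public static void main(String[] args) {
        System.out.println(isHidden(1, 1, "visible", "block"));    // true: 1x1 pixel
        System.out.println(isHidden(300, 200, "hidden", "block")); // true: visibility hidden
        System.out.println(isHidden(300, 200, "visible", "block")); // false: ordinary IFrame
    }
}
```

An IFrame matching any one criterion is enough; the predicate does not require all three.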

If you double-click the Asynchronous Request Tests folder, test suite options display in the configuration panel on the right.

Attached to each test is a Response Traffic> Recorded Response tool that compares the server’s response to what was recorded during test creation. If the responses are different, the test fails.

There is also a Traffic Viewer tool attached to each test that lets you view the request that was sent and the server’s response.

Handling Internet Explorer Security Settings

Internet Explorer’s Enhanced Security Configuration and Protected Mode cannot be enabled while you are working with SOAtest.

Enhanced Security Configuration
On Windows Server 2003 and 2008, Internet Explorer Enhanced Security Configuration is enabled by default. The Enhanced Security Configuration will prevent SOAtest from playing back browser tests in Internet Explorer on those machines. When SOAtest starts Internet Explorer, a security warning dialog will appear, and the tests will not complete.

To run tests on these machines, you should disable the Enhanced Security Configuration as described in http://www.visualwin.com/IE-enhanced-security/. For more information about the Enhanced Security Configuration, see http://blogs.msdn.com/askie/archive/2009/03/24/using-internet-explorer-enhanced-security-configuration-on-terminal-servers.aspx. Alternatively, you can prevent this dialog from displaying by configuring the Enhanced Security Configuration so that sites are "trusted" as described in http://support.microsoft.com/kb/933991. This will allow SOAtest to play back the tests. However, the recommended approach is to completely disable the Enhanced Security Configuration.

Protected Mode
If Internet Explorer is running in Protected Mode (e.g., on Windows Vista), SOAtest automatically disables Protected Mode for the Internet, Local intranet, and Trusted sites zones. It does not disable it for Restricted sites. To re-enable Protected Mode:

1. Open Internet Explorer's Internet Options dialog.
2. Open the Security tab.
3. Select a Web content zone.
4. Check the Enable Protected Mode check box.

To verify that Protected Mode has been successfully re-enabled, look for the words "Protected Mode: On" next to the Web content zone displayed in Internet Explorer's status bar.

Tutorial
For a step-by-step tutorial on how to use SOAtest to perform functional testing on a web interface, see “Web Functional Testing”, page 171.

Generating JUnit Tests from a Web Browser Recording

This topic explains how to generate JUnit tests that represent the actions recorded in a web browser. Sections include:
•	About JUnit Test Generation
•	Prerequisite
•	Generating JUnit Tests from a Scenario as You Record It
•	Generating JUnit Tests from a Previously-Recorded Scenario
•	Configuring License Information for JUnit Test Execution
•	Adding Assertion Statements to the Generated Tests

About JUnit Test Generation

SOAtest can generate JUnit tests that represent a new or previously-recorded web functional test scenario. You can set up a functional test against a web application, then use the generated JUnit test cases to validate test results using the JUnit framework. This gives you the flexibility of a test script and the easy-to-use interface of SOAtest—without having to learn a new test scripting language.

The resulting class files are JUnit based and depend on the following jar files available from the SOAtest installation directory:
•	bcprov.jar
•	commons-httpclient-3.0.jar
•	commons-logging.jar
•	FESI.jar
•	grtlogger.jar
•	junit.jar
•	mail.jar
•	parasoft.jar
•	webking.jar
•	wizard.jar
•	xercesImpl.jar
•	xml-apis.jar

You also need junit.jar, which is available within the JUnit 3.x release that can be downloaded from http://www.junit.org/.

Note: If the JRE used to run the JUnit test classes is 1.4.2 or lower, xml-apis.jar needs to be prepended to the boot classpath via the -Xbootclasspath option during execution. For example, the command should look like "java -Xbootclasspath/p:xml-apis.jar TestCase".

Prerequisite

Before you start generating JUnit tests, perform this one-time configuration:

1. Switch to the Java perspective (choose Window> Open Perspective> Other> Java).
2. Create a new project in the workspace and name it MyJUnitTest.
3. In the Package Explorer, right-click the new MyJUnitTest project, then choose Properties from the shortcut menu.
4. Select Java Build Path, then go to the Libraries tab and click Add External JARs.
5. Browse to <SOAtest install root>/plugins/com.parasoft.xtest.libs.web_<version>/root/, then select and add the following jars:
•	bcprov.jar
•	commons-httpclient-3.0.jar
•	commons-logging.jar
•	FESI.jar
•	grtlogger.jar
•	mail.jar
•	parasoft.jar
•	webking.jar
•	wizard.jar
•	xercesImpl.jar
•	xml-apis.jar
6. Browse to the location of junit.jar, then select and add it. If you do not already have it, you can get junit.jar within the JUnit 3.x release that can be downloaded from http://www.junit.org/.

Generating JUnit Tests from a Scenario as You Record It

To generate JUnit tests from a scenario as you record it:

1. Return to the SOAtest perspective.
2. Open the pull-down menu for the New toolbar button (top left), choose Other, select SOAtest> Web> JUnit test from Web Browser Recording, then click Next.
3. Complete the Record and Generate JUnit Test wizard page as follows:
•	Start Recording From: Enter the URL of the site on which you would like to perform functional testing. To record applications that "live" on the same machine as SOAtest, do not use localhost. Instead, use the machine name (e.g., mymachine.parasoft.com) or the IP address (e.g., 10.10.11.11).
•	Record with: Specifies the browser in which you want to record the test.
•	Firefox executable path: If you are running Windows, SOAtest will automatically attempt to locate a version of Firefox on your machine. If SOAtest cannot locate Firefox, or if you are running Linux, you will need to click Browse to executable. This opens a file chooser from which you can browse to the location of Firefox on your machine. SOAtest will display the version of Firefox that you have selected just below the path field. Note: SOAtest supports Mozilla Firefox 1.5 and higher.
•	Class Name: Enter a class name for the generated JUnit test class (for example, MyJUnit).
•	Package Name: This value is optional, but we recommend that you select the project ${project_loc:MyJUnitTest}/src for the output location. If the package name does not correspond to the folder structure dictated by the output location, SOAtest will generate the necessary sub-folders.
•	Generate into Output Location: Specifies the destination folder for the generated test class file.
4. Click the Finish button. The designated start page will open in the selected browser.
5. Perform the actions that you want to capture in the browser(s). You can click links, complete and submit forms, use the navigation bar to enter or open URLs, access shortcuts, go forward or back, and so on. Note: To ensure that recording works properly, you must wait until the page has fully loaded before performing an action. You must also wait each time the page (or some part of the page) is reloaded before performing an action.

Tip - Completing Forms
To complete forms, enter the values directly into the GUI controls as if you were actually navigating the site. For instance, type in your user name and password, select radio buttons, and check or clear check boxes. As you record sessions, please note:
•	Password recall and auto-completion (enabled in Internet Explorer’s Internet Options Advanced settings) are not supported during recording.
•	Google Toolbar's Auto Fill feature is not supported.
•	A "type" test may not be recorded if you type the beginning of a term into a field, but then click on a suggestion from a drop-down.

6. Close the browser window(s). A JUnit test class will be added to the specified output location. A new project will not be created or added to the Test Case Explorer.

Generating JUnit Tests from a Previously-Recorded Scenario

To generate JUnit tests that represent a previously-recorded test scenario:

1. Return to the SOAtest perspective.
2. In the Test Case Explorer, right-click the web functional test scenario that you want to generate JUnit tests for, then choose Generate JUnit Tests from the shortcut menu.
3. Complete the Generation Options dialog, then click Finish. Available options are:
•	Class Name: Enter a class name for the generated JUnit test class (for example, MyJUnit).
•	Package Name: This value is optional, but we recommend that you select the project ${project_loc:MyJUnitTest}/src for the output location. If the package name does not correspond to the folder structure dictated by the output location, SOAtest will generate the necessary sub-folders.
•	Generate into Output Location: Specifies the destination folder for the generated test class file.

Tip - Managing Multiple Tests
Many users find it convenient to put all tests into the same project. However, you may create multiple projects if you prefer.

Executing the Generated Tests

To execute the generated tests:

1. Go to the Java perspective.
2. Refresh the MyJUnitTest project. You should see a node representing the generated test.
3. Right-click MyJUnit.java and choose Run As> JUnit Test.

You can also execute these tests from the command line as described in the JUnit documentation.

Configuring License Information for JUnit Test Execution

License information is required to run JUnit tests generated by SOAtest. The license information can be set in two ways:
•	If you want to use the same license as the local SOAtest installation, simply verify that the license information is configured properly under SOAtest> Preferences> License. The license information will then be detected by WebBrowser from the installation root passed to the constructor, i.e., <SOAtest install root>/plugins/com.parasoft.xtest.libs.web_<version>/root/.
•	If you want to run the tests from a machine that does not have a local installation of SOAtest—or if you want to use different license information than that used by the local SOAtest installation—you can control the licensing information without having to open SOAtest and modify the preferences from the UI. To do this, pass the license information using the following constructor:

WebBrowser( String installRoot, int browserType, String ffExePath, String licenseServerLocation, int licenseServerPort, int licenseServerTimeout )

Adding Assertion Statements to the Generated Tests

Each JUnit class generated by SOAtest consists of one test function that mimics the test sequence of a SOAtest test. Whenever the server returns a response, the test will make an assignment to the WebResponse object declared within the test function. You should insert assertion statements after these assignments to validate that the response from the server is expected. We have created comment blocks where we recommend assertion placements within the test function. For example:

public void testA() throws Exception {
    WebConversation wc = new WebConversation();
    WebRequest req = new GetMethodWebRequest("http://mole/tests/");
    WebResponse resp = wc.getResponse(req);
    //Begin assertions
    //End assertions
    WebLink link = resp.getLinkWith("popup.html");
    link.click();
    resp = wc.getCurrentPage();
    //Begin assertions
    //End assertions
    WebForm form = resp.getFormWithName("childrenForm");
    resp = form.submit();
    //Begin assertions
    //End assertions
}

In the above JUnit test function, the comment blocks appear each time the WebResponse object is assigned a new value.
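The sketch below illustrates the kind of check you might place between the //Begin assertions and //End assertions markers. To keep it self-contained and runnable, FakeResponse is a minimal stand-in for the generated test's WebResponse, and the page content being checked is hypothetical; in a real generated test you would call the WebResponse methods directly.

```java
public class AssertionPlacementSketch {
    // Minimal stand-in for the generated test's WebResponse (hypothetical,
    // defined here only so this sketch runs without the SOAtest jars).
    static class FakeResponse {
        private final String text;
        FakeResponse(String text) { this.text = text; }
        String getText() { return text; }
    }

    // The body of this method is what you would insert into the generated
    // test, between the //Begin assertions and //End assertions comments.
    static void checkLandingPage(FakeResponse resp) {
        // Begin assertions
        if (!resp.getText().contains("<title>Home</title>")) {
            throw new AssertionError("unexpected landing page: " + resp.getText());
        }
        // End assertions
    }

    public static void main(String[] args) {
        checkLandingPage(new FakeResponse("<html><title>Home</title></html>"));
        System.out.println("landing page validated");
    }
}
```

With JUnit 3.x on the classpath, the same check would normally be written as an assertTrue call inside the generated test method.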

Configuring Browser Playback Options

This topic explains how to determine which browser is used to play back previously-recorded tests. Sections include:
•	About SOAtest’s Browser Playback Options
•	Modifying Browser Playback Settings
•	Specifying the Browser at the Time of Test Execution
•	Specifying the Browser from the Command Line

About SOAtest’s Browser Playback Options

By default, SOAtest configures a web functional test to be played back using the browser in which it was recorded. You can change the test to use a different browser, or both available browsers, by default. SOAtest also provides Test Configurations that allow you to specify "on the fly" which browser to use for playback. This allows you to have your test played back in the browser you recorded in by default, as well as play it back in a different browser (or both browsers) by simply running the appropriate Test Configuration.

If you want to ensure that a test is played only in the browser with which it was recorded (e.g., because the web page structure is significantly different on other browsers and the scenario would need to be constructed differently for another browser), you can configure the test to be played only in the specified browser.

Modifying Browser Playback Settings

To modify the test’s browser playback settings (the settings used during playback unless another option is explicitly selected, as described in "Specifying the Browser at the Time of Test Execution" below):

1. Double-click the scenario’s Test Case Explorer node.
2. Open the Browser Playback Options tab.
3. At the top of the tab, specify the browser you want the test case played in.
•	If you want to ensure that this test is never played in an alternate browser (e.g., because the web page structure is significantly different on other browsers and the scenario would need to be constructed differently for another browser), enable Run in specified browser only.
•	With the Run in specified browser only option disabled, each test scenario can have a different browser playback setting, and each test can run in a different browser depending on its test setting. This allows you to have your test played back in the browser you recorded in by default, as well as play it back in a different browser or both browsers by simply running the appropriate Test Configuration.

Specifying the Browser at the Time of Test Execution

If the test does not have Run in specified browser only enabled, you can override the test’s browser playback settings at the time of test execution as follows:

1. Select the test scenario’s Test Case Explorer node.
2. Choose the desired Test Configuration from SOAtest> Test Using> Built-In> Functional Testing:
•	Run Web Functional Tests in Both Browsers: Executes each test in both Firefox and Internet Explorer.
•	Run Web Functional Tests in Browser Specified by Tests: Executes each test using the browser playback settings configured in the test scenario’s Browser Playback Options tab. If you have multiple scenarios, each with different browser playback settings, this Test Configuration runs all the scenarios in their designated browser(s).
•	Run Web Functional Tests in Firefox: Executes each test in Firefox. If a test was configured to run in Internet Explorer, this does not perform any testing.
•	Run Web Functional Tests in Internet Explorer: Executes each test in Internet Explorer. If a test was configured to run in Firefox, this does not perform any testing.

Specifying the Browser from the Command Line

To specify the browser to be used from the command line, set com.parasoft.xtest.execution.api.web.use_browser to one of the following values:
•	Both
•	Firefox
•	Internet Explorer
•	Specified in test

For example: com.parasoft.xtest.execution.api.web.use_browser=Internet Explorer

This is useful if you are creating your own custom Test Configuration that has different settings than the ones in the built-in Test Configurations.
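If this key is passed to the JVM as a system property (an assumption; it may instead belong in a SOAtest settings file), it can be inspected programmatically. The sketch below only demonstrates setting and reading the documented property name; it does not invoke SOAtest itself.

```java
public class UseBrowserProperty {
    public static void main(String[] args) {
        // The property name comes from the SOAtest docs; treating it as a JVM
        // system property here is an assumption -- on a real run you would
        // typically supply it via -D on the command line or a settings file.
        String key = "com.parasoft.xtest.execution.api.web.use_browser";
        System.setProperty(key, "Internet Explorer");
        System.out.println(System.getProperty(key)); // prints Internet Explorer
    }
}
```

The equivalent command-line form would be -Dcom.parasoft.xtest.execution.api.web.use_browser="Internet Explorer".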

Configuring User Actions (Navigation, Delays, etc.)

This topic explains how to modify the user actions simulated by a web functional test. Sections include:
•	Configuring Actions
•	Understanding Preset Actions
•	Specifying Other Actions

Configuring Actions

To view and modify the action taken by a specific functional test:

1. Double-click the test’s Test Case Explorer node.
2. In the configuration panel that opens in the right side of the GUI, open the User Action tab.
3. Review the existing actions (initially, the ones captured during test creation) and modify the settings as needed to specify the actions you want performed. You can choose from the available preset actions, or define a custom one.

Identifying Elements Associated with User Actions
The element that is the source of a user action will be highlighted with a solid blue border in the test’s Pre-Action Browser Contents tab.

Changing the Target of a User Action
To quickly change the target of a user action, right-click the related element in the Pre-Action Browser Contents tab, then choose the appropriate Modify command.

If the user action that you want to change is not associated with a specific element (for instance, a "close" or "navigate" action), you can right-click anywhere in the Pre-Action Browser Contents tab, then choose Change User Action. This opens the User Action tab, which allows you to modify the target.

Inspecting the HTML for Elements
As you create and modify user actions for page elements, you may want to inspect the HTML to determine if you are adding actions to the appropriate elements. To see the HTML for a given element, right-click that element, then choose Inspect <Element> from the shortcut menu.

Understanding Preset Actions

Navigate
Select the navigate action if you want the browser to navigate to the provided URL as though it was entered in the URL bar of the browser. If you choose this action, you can specify the following settings:
•	URL: You can enter a Fixed, Parameterized (if a data source is available), or Scripted URL. To enter a scripted URL, select Scripted, then click the Edit Script button to enter a script method that returns the URL that should be navigated to in the selected test.
•	Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.

Click Select the click action if you want the browser to click the specified element. If you choose this action, you can specify the following settings: •

Element Locator •

Use Element Properties: Select to identify an element by the following properties: •

Element: Specifies the element name (for example, "img", "div", or "a") that the action should apply to. To allow any element, enter "Any" into this field.



•	Attribute Name: Specifies the attribute name to identify the element (for example, "title", "id", or "name").

•	Attribute Value: Specifies the expected value for the attribute supplied by the Attribute Name field. You can configure this value using one of the following mechanisms:

	•	If you want to specify a fixed value, select the Fixed option, then specify the desired value in the text box.

	•	If you want to use values defined in a data source, select the Parameterized option, then specify the data source column that contains the values you want to use. Note that this option is only available if the project contains at least one data source.

	•	If you want to use the return value of a Java/JavaScript/Python method, select the Script option. Click the Edit button to create or edit the method(s) and choose the desired method from the Method drop-down menu in the popup dialog. If there are two or more methods, you can also select a different method from the drop-down menu in the form panel.

•	Index: Specifies the element that matches the previous criteria. Entering "0" means that the first element that matches the "Element," "Attribute Name," and "Attribute Value" criteria will be used. Entering "1" means that the second matching element will be used, and so on. This value can be configured with the same Fixed, Parameterized, and Script mechanisms described above.

•	Use XPath: Enter an XPath to be used as an identifier.

•	Use Script: Enter a script that defines the desired click action.

•	Key Modifiers: Specifies whether to mimic the user pressing the Alt, Ctrl, or Shift keys during the click.


Configuring User Actions (Navigation, Delays, etc.)



Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.
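The Element Properties fields above can be thought of as shorthand for an XPath query. The helper below is a hypothetical sketch (not a SOAtest API) of how the Element name, Attribute Name, Attribute Value, and Index fields map onto an equivalent XPath; note that the Index field is 0-based while XPath positions are 1-based.

```javascript
// Hypothetical sketch only: build an XPath equivalent to the
// Element Properties fields described above. Not a SOAtest API.
function toXPath(elementName, attrName, attrValue, index) {
  // "Any" in the Element name field matches any tag, like "*" in XPath
  var tag = elementName === "Any" ? "*" : elementName;
  // The Index field is 0-based; XPath positions are 1-based
  return "(//" + tag + "[@" + attrName + "='" + attrValue + "'])[" +
         (index + 1) + "]";
}
```

For example, an "a" element whose "id" attribute is "login" with Index 0 corresponds to the XPath (//a[@id='login'])[1].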

Type

Select the type action if you want the browser to type the specified text into the specified element. If you choose this action, you can specify the following settings:

•	Value: You can enter a Fixed, Parameterized (if a data source is available), or Scripted value. To enter a scripted value, select Scripted, then click the Edit Script button to enter a script method that returns the value to type.

•	Element Locator:

	•	Use Element Properties: Select to identify an element by the following properties:

		•	Element name: Specifies the element name (for example, "img", "div", or "a") that the action should apply to. To allow any element, enter "Any" into this field.



		•	Attribute Name: Specifies the attribute name to identify the element (for example, "title", "id", or "name").

		•	Attribute Value: Specifies the expected value for the attribute supplied by the Attribute Name field. You can configure this value using one of the following mechanisms:

			•	If you want to specify a fixed value, select the Fixed option, then specify the desired value in the text box.

			•	If you want to use values defined in a data source, select the Parameterized option, then specify the data source column that contains the values you want to use. Note that this option is only available if the project contains at least one data source.

			•	If you want to use the return value of a Java/JavaScript/Python method, select the Script option. Click the Edit button to create or edit the method(s) and choose the desired method from the Method drop-down menu in the popup dialog. If there are two or more methods, you can also select a different method from the drop-down menu in the form panel.

		•	Index: Specifies the element that matches the previous criteria. Entering "0" means that the first element that matches the "Element," "Attribute Name," and "Attribute Value" criteria will be used. Entering "1" means that the second matching element will be used, and so on. This value can be configured with the same Fixed, Parameterized, and Script mechanisms described above.

	•	Use XPath: Enter an XPath to be used as an identifier.

	•	Use Script: Enter a script that defines the desired action.

•	Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.

Wait

Select the wait action if you want the browser to wait the specified number of milliseconds before continuing to the next step in the functional test. If you choose this action, you can specify the following settings:

•	Milliseconds: You can enter a Fixed, Parameterized (if a data source is available), or Scripted value. To enter a scripted value, select Scripted, then click the Edit Script button to enter a script method that returns the number of milliseconds to wait.

•	Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.

Close

Select the close action if you want the browser to close the specified window. If you choose this action, you can specify the following setting:

•	Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.

New Browser

Select the new browser action if you want to open a new browser based on the specified start URL. If you choose this action, you can specify the following setting:

•	Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.

Specifying Other Actions

You can use the "other" action to specify common commands such as:

•	goback - Equivalent to the user pressing the Back button in the browser.

•	mousedown - Equivalent to the user pressing the mouse on an element.

•	mouseup - Equivalent to the user releasing the mouse over an element.

•	mouseover - Equivalent to the user moving the mouse over an element.

•	select - Equivalent to the user choosing an option in a single selection combo box.

•	addselection - Equivalent to the user choosing an option in a multiple selection combo box.

•	removeselection - Equivalent to the user deselecting an option in a multiple selection combo box.

The action field is optional, depending on which command is used: goback, mousedown, mouseup, and mouseover do not need it. For select, addselection, and removeselection, you need to specify the name of the option that is being selected or deselected.
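The rule above can be summarized in a small lookup table (illustrative only, not SOAtest code): true means the command requires the name of the option being selected or deselected.

```javascript
// Illustrative summary of the "other" commands described above:
// true means the command requires an option name as its value.
var needsOptionValue = {
  goback: false,
  mousedown: false,
  mouseup: false,
  mouseover: false,
  select: true,
  addselection: true,
  removeselection: true
};
```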


Configuring Wait Conditions

This topic explains how to customize wait conditions for web functional tests. Sections include:

•	Understanding Wait Conditions

•	Specifying Wait Conditions

•	Configuring the Order of Wait Conditions

•	Upgrading WebKing Projects (pre-6.0.5) to Current SOAtest Wait Options

Understanding Wait Conditions

You can customize how long SOAtest waits after performing a user action before moving on to the validations/extractions step of the current test, and then to the next test in the scenario.

When a user interacts with a web page, the page responds to the user's action. For example, when the user clicks a link, the page is reloaded with new content; when the user chooses a menu item, some other part of the page might refresh itself. The user instinctively waits until the page is done updating before continuing. In fact, in most cases the user HAS TO wait for the update in order for the page element that the user will interact with next to be present.

With automation, however, there is no human to decide when the page is done updating; SOAtest must make this decision automatically. SOAtest must wait long enough that it does not try to continue with the testing process before the page is ready, but at the same time it must run quickly enough to preserve one of the benefits of automation: speed.

SOAtest automatically configures the wait conditions while it is recording. However, you may want to manually adjust or modify the wait conditions in order to get the tests to perform as desired. In many cases, multiple wait conditions will be used for a single test.

Specifying Wait Conditions


The wait conditions captured during test creation can be viewed and modified in the Wait Conditions tab. Available wait conditions include:

•	Wait for Page Load: This wait condition waits until at least one page load has occurred. However, it will wait until all page loads that happen within one second of each other have finished. Once one second has passed without any new page loads starting, the wait is finished. A page load can mean either that the entire page is reloading or that a single frame is reloading. A Page Load wait condition is added to a test during recording if SOAtest detects that a page load occurs after the particular user action that causes that test to be recorded, and before the user action for the next test that is recorded.

•	Wait for Asynchronous Requests: This wait condition waits until at least one asynchronous request has been made and a response is received for it. If any other asynchronous requests were begun while the first was in progress, then it waits until all asynchronous requests have completed. For this wait condition, an asynchronous request is defined as a request that is made while a page load is not occurring, and whose response is text-based. An Asynchronous Request wait condition is added to a test during recording if SOAtest detects that one or more asynchronous requests occur, outside the context of a page load, after the user action that caused that test to be recorded and before the user action for the next test is recorded.

•	Wait for Element: This wait condition waits until a specified page element meets a specific condition.

	•	Page Element: The page element can be specified in two ways:

		•	Element for next user action: This option automatically determines which element to wait for by looking at the next browser test in the scenario and using the element that is configured in that test’s User Action tab. If the current test is the last browser test in a certain scenario, then this wait condition will not perform a wait.

		•	Element specified: This option allows you to manually choose which element to wait for. You can use Element Properties, XPath, or Script to choose the element.

	•	Condition: You can set the following element conditions to wait for:

		•	Present: This condition waits until the element is present on the page. The element may or may not be visible to the user. This is preferred over the next condition, Visible, in cases where the element does not become visible until a user mouses over some other element. This is the default condition used for element waits that are automatically added while recording.

		•	Visible: This condition waits until the element is visible on the page.

		•	Not visible: This condition waits until the element is either not present on the page, or is on the page but is not visible.

		•	Editable: This condition waits until an input element is editable. If this condition is used on an element that is not an input, it will eventually time out because elements that are not inputs are by definition not editable.

		•	Has value: This condition waits until an element has an attribute with the specified value. The attribute can be any attribute supported by the element, or “text” for the text content of the element.

An Element wait condition (wait for element present) is added during recording as the last wait condition for all tests except those that have a Script Dialog wait condition added to them. Element wait conditions are inactive (meaning that they don’t wait for anything) in the following cases:

•	There are no browser tests in the scenario after the current test

•	There are no enabled browser tests in the scenario after the current test

•	The next browser test does not use a page element in its user action

•	The next browser test is configured to use test suite logic. Element wait conditions are not used in this case, since the logic can cause the next test to not be run



•	Wait for Script Dialog: This wait condition waits until one of the following script dialogs is detected: alert, confirm, or prompt. A Script Dialog wait condition is added during recording as the last wait condition for all tests that cause a script dialog to appear.

•	Wait for Specified Time: This wait condition simply waits for the specified number of milliseconds. This wait condition is not added automatically during recording.

•	Wait For Time Interval Without HTTP Traffic: This wait condition waits until a specified number of seconds have passed without any traffic between the browser and the server. For example, if the specified time is 1 second, it finishes waiting once there has not been any traffic between the browser and the server for 1 second. This wait condition is added during recording only in conjunction with an Asynchronous Request wait condition, in cases where SOAtest detects that an asynchronous request causes other non-asynchronous requests to occur. Note: Prior to WebKing 6.0.5, this was the default wait condition. It came in two flavors: a Request Wait Time and a UI Wait Time. The individual times could be customized manually, but the defaults were 4000 ms for Request Wait Time and 100 ms for UI Wait Time.
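Conceptually, each of these wait conditions behaves like a poll-until-timeout loop. The sketch below is illustrative only (SOAtest's actual implementation is internal): it polls a condition until the condition holds or the timeout elapses, matching the behavior described in this topic where a timed-out condition reports an error but the test continues.

```javascript
// Illustrative sketch only (not SOAtest code): poll a condition until it
// holds or the timeout elapses. A real implementation would sleep between
// polls rather than busy-wait.
function waitUntil(condition, timeoutMs) {
  var deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    if (condition()) {
      return true; // condition met within the timeout
    }
  }
  // Timed out: SOAtest would report an error here but continue the test
  return false;
}
```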

In addition to manually adding new wait conditions from this tab, you can also add them automatically from the Browser Contents Viewer tool.

Adding a Wait Condition from the Browser Contents Viewer

The Browser Contents Viewer allows you to specify a wait condition graphically, from a view of the rendered Web page. To add a new wait condition from the Browser Contents Viewer tool:

1. Right-click the element for which you want to specify a wait condition.

2. Choose Add Wait for Element from the shortcut menu.

3. In the Add Wait Condition dialog, specify the details for the new wait condition, then click OK.

The wait condition added will automatically be configured to wait until the clicked-on element is present.

Identifying Elements Associated with Wait Conditions

The element that is the source of a wait condition will be highlighted with a solid green border in the Browser Contents viewer.

Configuring the Order of Wait Conditions

The wait conditions appear in order of execution in the Wait Conditions tab. You can Add, Remove, and change the order of conditions by clicking the appropriate buttons in that tab.

The order of the wait conditions is important. SOAtest will execute all wait conditions in the order that they appear, regardless of whether any of the other wait conditions succeed or fail. If a wait condition fails (meaning the condition is not satisfied before the timeout for that condition), then an error message is generated.

For example, for most web applications, page loads typically happen before asynchronous requests are made. Therefore, the wait conditions for a test that has both a page load and asynchronous requests typically should have a Page Load wait condition appearing before an Asynchronous Request wait condition. If the order of the conditions were switched, then the Page Load condition would fail: the Asynchronous Request wait condition would wait for asynchronous requests, which happen after any page loads occur, and the Page Load wait condition would then execute but end up timing out, because there would be no more page loads.

Each wait condition, other than the Wait for Specified Time condition, has a timeout. If the wait condition is not met within the timeout, an error message is reported so you know to adjust the wait conditions. However, the test will continue even if wait conditions fail. The timeout can be set to use the default timeout that is set in the preferences, or it can be customized for that individual wait condition.

Upgrading WebKing Projects (pre-6.0.5) to Current SOAtest Wait Options

By default, WebKing projects from WebKing versions prior to 6.0.5 that are opened in the current version of SOAtest will retain the old wait conditions (and thus will not benefit from the added speed and control that the new wait conditions provide). However, there is a way to automatically transform them to use the new wait conditions.

To help you update wait conditions, SOAtest ships with an updateWaitConditions.py file, which is located at <SOAtest_62_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/startup. To automatically update tests to the new wait conditions, modify that script to pass true (the value “1” in Jython) to the updateWaitConditions method. SOAtest must be restarted after you make this change in order for it to take effect. Once restarted, any projects that are opened that were saved with the old wait conditions will be automatically updated as follows:

•	Any navigate tests are converted to have a page load wait.

•	Any test that causes a script dialog to appear has a script dialog wait added to it. Any other test, including navigate tests, gets an element wait added to it.

Once this conversion is complete, however, it is quite likely that page load waits will need to be added for any test that causes an entire page or frame to reload. The following error messages may appear to alert you to the fact that you need to add a page load wait (these error messages signal the need for a page load wait even when you are not dealing with old projects):

•	"Unable to extract value because the previous page did not finish loading": add a page load wait to the same test this error gets reported on.

•	"Unable to perform user action because the previous page did not finish loading": add a page load wait to the browser test immediately before the test this error gets reported on.

•	"The request has an attached output, but it is no longer being requested by the browser": add a page load wait to the same test this error gets reported on.

In addition, any time a test gets a finished status (of either success or failure) before the page appears to have completely finished loading, a page load wait may be needed on the test that finishes too quickly.

Wait Options Q&A

How might I know that I need to update the wait conditions, and how should I update them?

•	SOAtest shows that a test executes successfully, but the action that the test is supposed to trigger does not actually happen in the browser.

•	SOAtest shows an error message that starts with “[Internet Explorer Script Error]”, and it appears that the test that shows this message gets executed before the element it is executing the action on is visible.

Both symptoms can occur when an element is present on the page, but not yet visible and not yet ready for user interaction. Since SOAtest’s default element wait condition is to wait for the element to be present, SOAtest may trigger the action on the element before it is ready to process the action. In either case, you should change the wait condition to wait for the element to be visible instead.


Validating or Storing Values

This topic explains how to extract a value for validation or store it for use in another test. Sections include:

•	Understanding Extractions

•	Validating or Storing a Value

•	Specialized Extractions/Validations

•	Setting up Validations from within the Browser

•	Viewing Stored Variables Used During Test Execution

•	Configuring Custom Validations with Scripting

Understanding Extractions

In the Browser Contents Viewer tool that is automatically added to each test recorded from the browser, you can click page elements within a rendered view and automatically set up functional tests on those elements. If a validation is not satisfied in a subsequent test run, the associated test will fail. In addition, you can "extract" and store data from those elements, then use the extracted values in additional tests (for instance, to populate form fields or to validate data). This allows you to easily set up functional tests on applications where dynamic data validation is important. Extracted data can be used in both Web tests and SOA tests.

NOTE: You can also extract path elements from within your Web browser WHILE you are recording a path. To do so, right-click an element from within your browser and select Configure Validations from the shortcut menu. The Validation Options dialog displays. The options within this dialog are the same as the options in the Browser Contents Viewer tool that appears AFTER a path has been recorded and replayed at least once.

Validating or Storing a Value

To validate or store a value represented in the rendered page, complete the following from the Browser Contents Viewer’s tool configuration panel (accessible by double-clicking the tool’s Test Case Explorer node):

1. Right-click the page element for which you would like to create a test (for example, right-click a link), and choose Extract Value from <element> Element… from the shortcut menu.

2. In the wizard that opens, ensure that the desired element is selected in the Property name box.

3. If you want to "zoom in" on the value to extract, complete the isolate partial value wizard page. Sometimes you may only want to validate or send a portion of a property value to a data source. If this is the case, you can isolate the part of the property value to use by selecting the Isolate Partial Value using Text Boundaries checkbox. You then enter Left-hand and Right-hand text that serve as boundaries for the value. The preview pane shows you what value will be used based on the boundary values that you have entered. For example, assume that the property value is "Click here to log in”:




•	To isolate the value “Click”, leave the left boundary blank and enter “ here” (including the space) in the right boundary.

•	To isolate the value “here”, enter “Click “ in the left boundary and “ to” in the right boundary (again including spaces).

•	To isolate the value “in”, enter “log “ (including the space) as the left boundary and leave the right boundary blank.

4. If you want to validate a value:

	a. Select Validate the value.

	b. Choose from the following expected value options:

		•	equals: Validates that the property value exactly matches the expected value.

		•	contains: Validates that the property value contains the expected value somewhere within it.

		•	starts with: Validates that the property value starts with the expected value.

		•	ends with: Validates that the property value ends with the expected value.

		•	is not present: Validates that the specified property does NOT appear on the page, and reports an error if it does. This is useful for cases when a web application is showing an error message that it should not be.

	c. Choose Fixed, Parameterized, or Scripted, then specify a value. If the Parameterized option is chosen, you can specify a column name from that data source. When the test is run, the expected value will be taken from the appropriate row and column in the data source. Column names will only be shown for one data source, so if you have multiple data sources in your project, you will need to go to the chained Browser Validation tool and modify the data source being used at the top of that panel. If other extracted column names are available because they were set up by extracting from a different HTML page, they will also be in the list of available column names, even if the project does not define a data source.

5. If you want to send the value of the selected property to a data source (so that the value can be used later in a functional test):

	a. Select Extract the value to a data bank.

	b. Enter a Column Name by which you will reference this value later in the test. When the test is run, the property value will be extracted from the page and placed into a temporary data source within a column with the specified name. When later parts of the test reference the column name, the value stored in the temporary data source will be used for those parts of the test.

	You can both validate and send a property value to a data source at the same time if desired.

6. Click Finish. The value will be validated or stored when the test is executed.
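The text-boundary isolation described in step 3 can be sketched as follows. This is an illustrative reimplementation, not SOAtest's own code: an empty boundary means "from the start" or "to the end" of the property value.

```javascript
// Illustrative sketch of "Isolate Partial Value using Text Boundaries"
// (not SOAtest code). Returns the text between the left and right
// boundary strings; an empty boundary extends to the start/end.
function isolatePartialValue(value, leftBoundary, rightBoundary) {
  var start = leftBoundary
    ? value.indexOf(leftBoundary) + leftBoundary.length
    : 0;
  var end = rightBoundary ? value.indexOf(rightBoundary, start) : value.length;
  return value.substring(start, end);
}
```

With the property value "Click here to log in", the three boundary examples above yield "Click", "here", and "in", respectively.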

What if I don’t see the value I want to validate or extract?

If the Browser Contents Viewer tool does not display the value you want to extract or validate—for example, because the related test failed or because the item is not visible in the rendered page (e.g., it is a title)—you can manually add a Browser Validation tool or Browser Data Bank tool as described in “Adding Test Outputs”, page 333.


If you configured a validation... A Browser Validation tool will be chained to the test. This tool will perform the validation. If you later want to modify the validation, you can do so by modifying this tool’s settings. The element that is the source of a validation will be highlighted with a solid red border in the Browser Contents viewer, and in the Post-Action Browser Contents tab of the Browser Validation tool.

If you configured an extraction... A Browser Data Bank tool will be chained to the test. This tool will store the extracted value. The extracted value can be used wherever parameterized values are allowed, such as the value to type into an input in a subsequent test. If you later want to modify the stored value, you can do so by modifying this tool’s settings. The element that is the source of an extraction will be highlighted with a solid gray border in the Browser Contents viewer, and in the Post-Action Browser Contents tab of the Browser Data Bank tool.

If you configured both... A Browser Validation tool and a Browser Data Bank tool will be chained to the test as described above. In addition, a dotted purple border will be used to highlight the source element.

Specialized Extractions/Validations

Validating or Extracting Text

To validate text that appears on a page (or to extract it to a browser data bank), complete the following:

1. Select the text you want to validate or extract.

2. Do one of the following:

	•	To configure a validation for that text, right-click the selection and choose Validate Selected Text from the shortcut menu.

	•	To configure an extraction for that text, right-click the selection and choose Extract Selected Text into Data Bank from the shortcut menu.

3. Ensure that the desired validation/extraction settings appear in the dialog that opens.

4. Click Finish.

Validating Color Elements

To create a test that validates a color on a page, complete the following:

1. Right-click the page element for which you would like to create a test (for example, right-click a link), and choose Extract Value from <element> Element… from the shortcut menu.

2. In the wizard that opens, ensure that style_color is selected in the Property name box, then click Next twice.

3. In the Validate or Store Value wizard page, select matches color from the Expected Value drop-down menu and enter a color in the text field (e.g., "red"). The matches color option validates color values corresponding to names of colors specified in the validation colors mapping file. These mappings are either in hex notation or in RGB notation, e.g., rgb(0, 255, 0). For more information, see “Validation Colors Mapping File”, page 462.

4. Click Finish. The color validation will be performed when the test is executed.

Validation Colors Mapping File

There is a file in the product installation called the Validation Colors Mapping file. This file defines how SOAtest validates colors by name. It is located in <SOAtest_62_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/validation/validationColors.txt.

Each line of the file defines a color by name, along with ranges for each component of the RGB color model. If a validated color falls within the ranges for each of the components of the RGB color model, then it matches the color defined by those ranges. For example, a line in this file may look like the following:

red, b0-ff, 00-30, 00-30

This line defines the valid ranges for the color “red”. The ranges are specified using hex notation. In the above example, the valid R range for red is between the hex values b0 and ff, and the valid G and B ranges are between the hex values 00 and 30. In other words, if an element has a hex value of #c80000, it will be considered red, since the R value (c8) falls between b0 and ff, and each of the G and B values (both 00) falls between 00 and 30. However, if a validation is set up on an element that is expected to be red, but the element’s color has a hex value of #909090, then SOAtest will display a message that the element has the incorrect color.

The mapping file has a few standard colors already defined. If you would like to specify additional colors, you can simply modify the file; there must be only one color defined per line. You can also modify the mapping file to change the valid RGB ranges for a defined color. Ranges can be specified with a hyphen (b0-ff, as already described), or they can be a single value (ff), in which case the range includes only that one value. As mentioned before, the ranges must be in hex notation. SOAtest must be restarted in order for changes to this file to take effect.

Validating Style Properties

To validate a style property, open the Browser Validation tool’s configuration panel, then set the validation’s Element Property to the value "style_" + <the JavaScript name of the property>. For example, to validate the text-decoration style property, you would specify style_textDecoration in the Element Property field (textDecoration is the JavaScript way to specify the style property text-decoration), and specify the desired value of the property using the Expected Value controls. In the text-decoration case, the expected value might be equal to line-through or underline.
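The CSS-to-JavaScript name conversion follows the standard camelCase rule. A small hypothetical helper (not a SOAtest API) illustrates how an Element Property value could be derived from a CSS property name:

```javascript
// Hypothetical helper (not a SOAtest API): convert a CSS property name to
// the "style_" + camelCase form expected by the Element Property field.
function toStyleProperty(cssName) {
  // "text-decoration" -> "textDecoration"
  var camel = cssName.replace(/-([a-z])/g, function (match, letter) {
    return letter.toUpperCase();
  });
  return "style_" + camel;
}
```

For example, toStyleProperty("text-decoration") returns "style_textDecoration", and toStyleProperty("color") returns "style_color".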

Validation Styles List File

If you want a certain style property to display as an available property in the validation wizard, you can add that style to the Validation Styles List. The Validation Styles List file, located in <SOAtest_62_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/validation/validationStylesList.txt, specifies runtime style properties that can be validated. The format of this file is one property per line. By default, the color property is specified in this file; however, you can add any valid style properties that you would like to validate. SOAtest must be restarted in order for changes to this file to take effect. Once it is restarted, you will see the properties specified in this file in the validation dialog when right-clicking an element (in the Firefox browser during recording, or in the rendered view after recording). The properties will have “style_” prepended to each of the properties to tell you (and SOAtest) that these refer to the runtime values of individual style properties.

A validation set up using one of these properties validates the runtime value of that style property: the value after all inline styles and styles defined in CSS files have been applied. Because of this, the value may differ from what is defined inline in the element. For example, these runtime style validations allow you to validate the actual color that is seen by a user after all styles have been processed by the browser.

Setting up Validations from within the Browser

If you are recording with a Firefox browser, it is possible to set up validations on elements while you are recording. To do so, begin recording a functional test scenario. When the browser is open, right-click the element you’d like to create a validation for and select Configure Validations from the shortcut menu. The Validation Options dialog displays. This dialog is nearly identical in functionality to the dialog opened by right-clicking within the Browser Contents Viewer tool from within SOAtest, with one major difference: for the validation to be created, the Validate Value check box (shown when selecting Validate Value from the list), the Extract to Data Source check box (shown when selecting Extract to Data Source from the list), or both must be checked. It is possible to set up validations on multiple properties of the same element with the same dialog.

Viewing Stored Variables Used During Test Execution

You can configure the Console view (Window> Show View> Console) to display the stored data bank variables used during test execution. For details, see "Console view", page 35.

Configuring Custom Validations with Scripting

If you want to perform complex validations that cannot be properly represented using the GUI controls, you can express them using scripting. For example, suppose that you want to validate all of the rows in a table, and that table could be of variable length. You can attach an Extension tool to a browser functional test and pull values from the document provided by input.getDocument(). Here's a sample JavaScript script that accomplishes that:

var Application = Packages.com.parasoft.api.Application;
var WebBrowserTableUtil = Packages.webking.api.browser2.WebBrowserTableUtil;
var WebKingUtil = Packages.webking.api.WebKingUtil;

// Verify that all values in a table column are equal to a previously
// extracted value. For example, we searched for all places in which
// widget 11 is sold, and we want to make sure that all results are
// for widget 11.
function validateTable(input, context) {
    // Column we want to validate.
    var columnIndex = 0;
    // We extracted the value we want to use for comparison through a Data Bank
    // extraction in a previous step.
    // Must prepend extracted column name with "Extracted: ".
    var expectedValue = context.getValue("dsExtracted", "Extracted: testValue");
    var document = input.getDocument();
    // Table should have some unique identifying attribute (e.g., id).
    var table = WebBrowserTableUtil.getTable("id", "mytable", document);
    // If the first row of the table contained column headers, we could
    // use getCellValuesForColumn(String, Element).
    var values = WebBrowserTableUtil.getCellValuesForColumn(columnIndex, table);
    if (values.length == 0) {
        context.report("No rows found!");
    }
    for (var i = 0; i < values.length; ++i) {
        if (values[i] != expectedValue) {
            var errorMessage = "Table column '" + columnIndex + "': " +
                "Expected value '" + expectedValue + "', but found '" +
                values[i] + "'.";
            context.report(errorMessage);
        }
    }
}


Stubbing Test Requests/Responses

This topic explains how to stub test requests/responses for web functional tests. Sections include:
• Understanding Stubs for Web Functional Tests
• Creating Stubs
• Configuring the Browser Stub Tool

Understanding Stubs for Web Functional Tests

Testing applications with dynamic data can cause many false positives, creating extra overhead for the developers and QA staff who have to determine which failures are real and which are "noise." To solve this, SOAtest can "stub" the data that is sent back to the client. Stubbing helps you verify that changes to the client-side code do not affect the final resulting HTML page. A stub is static data that SOAtest saves when recording a functional test scenario through a web site. Since the data fed to the client is unchanging, any new errors that occur while processing the data can be attributed to changes in the client-side JavaScript that processes the data.

Creating Stubs

While recording a functional test, SOAtest keeps track of each request made by the client, as well as the response. To create a stub, complete the following:
1. Right-click the test from which you would like to return the static data and select Add Output from the shortcut menu.
2. In the wizard that opens, choose HTTP traffic, then click Next.
3. In the next page, choose the browser request that you want to stub.
4. In the left panel, select Both - Stub Request/Response.
5. In the right panel, select any Browser Stub.
6. Click Finish.

Configuring the Browser Stub Tool

To configure a Browser Stub tool that has been added to a functional test:
1. Double-click the Stub Request/Response for <URL> node that was added to the test.
2. In the Browser Stub test configuration panel, you can modify the following options:




• Name: Specifies the name for the Browser Stub.
• Response Header View: Customizes the returned request/response headers. Select one of the following from the drop-down menu:
  • Literal: This view allows you to modify the raw text returned from the request.
  • Form: This view provides the following sub-options:
    • General: Selecting this option from the list allows you to modify the version of HTTP and the response code that is returned.
    • Request/Response Headers: This panel lets you add/remove/modify the list of headers that are sent to the server, or received by the client. To add a new header, click the Add button. This adds a new entry to the list, which can be edited by clicking the Modify button. Clicking Modify opens a dialog with two text fields: the top field is for the header name, and the bottom field is for the header value. Click OK to make the changes permanent. To remove a header, select it and click the Remove button.
    • URL Parameters: This option, only available when configuring a request, allows you to add/remove/modify any arguments to be placed into the URL when the browser makes its request to the server.
  • Parameterized: This option allows you to return values stored in data sources created in the project. For more information on setting up data sources, see "Parameterizing Tests (with Data Sources or Values from Other Tests)", page 345.
  • Scripted: This option allows you to use a custom script to return the correct values. This option is identical to using an Extension tool in SOAtest. For more information, see "Extension (Custom Scripting)", page 960.
• Response Body View: Customizes the returned request/response body. Select one of the following from the drop-down menu:
  • Literal: This view allows you to modify the raw text returned from the request.
  • Parameterized: This option allows you to return values stored in data sources created in the project. For more information on setting up data sources, see "Parameterizing Tests (with Data Sources or Values from Other Tests)", page 345.
  • Scripted: This option allows you to use a custom script to return the correct values. This option is identical to using an Extension tool in SOAtest. For more information, see "Extension (Custom Scripting)", page 960.
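For reference, the raw text edited in the Literal views is ordinary HTTP text. A stubbed response might look like the following; the status line, header values, and body shown here are hypothetical, not output captured from SOAtest:

```text
HTTP/1.1 200 OK
Content-Type: application/json; charset=UTF-8
Cache-Control: no-cache

{"widgetId": 11, "inStock": true}
```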


Running Web Scenarios in Headless Mode

This topic explains how to run web scenarios without opening a browser, and discusses special configuration steps for this mode on Linux and Solaris. Sections include:
• Running in Headless Mode
• Linux and Solaris Configuration

Running in Headless Mode

SOAtest can run web scenarios in "headless mode", where the browser is not shown. In command line mode (using soatestcli), SOAtest always runs web scenarios in headless mode. You can also configure scenarios executed from the SOAtest GUI to run in headless mode. To do this:
1. Open the scenario's configuration panel.
2. In the Browser Playback Options tab, choose the Headless option.

By running tests in headless mode, you can work without the distraction of browser windows opening and closing.

Linux and Solaris Configuration

To run web scenarios in headless mode on Linux and Solaris, SOAtest creates its own hidden X display. SOAtest uses the X server Xvfb to create a virtual framebuffer that requires no display hardware. The SOAtest installation includes a copy of Xvfb for each of Linux and Solaris (the files Xvfb_Linux and Xvfb_Solaris, respectively). SOAtest will use a system-installed Xvfb if one exists, searching the following paths in order:

/usr/bin/Xvfb
/usr/X11R6/bin/Xvfb
/usr/X/bin/Xvfb
/usr/openwin/bin/Xvfb

If your distribution does not provide Xvfb, SOAtest will use its own copy of Xvfb. This may require some configuration in the form of command line arguments. The following describes how to configure the Xvfb supplied by SOAtest to start with the appropriate arguments. If SOAtest cannot start Xvfb, then it will simply run the browser in the display specified by the system environment variable $DISPLAY. In other words, if you are starting tests from the SOAtest GUI, the browser will appear in the same display as the GUI and SOAtest will output any startup error message from Xvfb to the Console view. If you are running tests from the command line and there is no available display, then SOAtest will not be able to start Firefox; any web scenarios will therefore fail.
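The lookup order can be sketched as a small shell function. This is illustrative only; the find_xvfb name is ours, not part of SOAtest:

```shell
#!/bin/sh
# Sketch of the search order described above: print the first system
# Xvfb found, or "bundled" to indicate that SOAtest would fall back
# to the copy it ships with (Xvfb_Linux / Xvfb_Solaris).
find_xvfb() {
  for p in /usr/bin/Xvfb /usr/X11R6/bin/Xvfb /usr/X/bin/Xvfb /usr/openwin/bin/Xvfb
  do
    if [ -x "$p" ]; then
      echo "$p"
      return 0
    fi
  done
  echo "bundled"
}

find_xvfb
```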

Getting Xvfb working independently of SOAtest

To get Xvfb to work when invoked from SOAtest, first get Xvfb working independently of SOAtest. This how-to assumes that you need to configure Xvfb_Linux; however, the process is the same for configuring Xvfb_Solaris.

(1) Start Xvfb on display :20. Use a different display argument if :20 is not available.


$ cd /path/to/soatest/plugins/com.parasoft.xtest.libs.web_[version].0/root
$ ./Xvfb_Linux :20 -ac

If you get any error messages, you will need to specify command line arguments for the location of various X-related files, whose location will depend on your distribution. When trying Xvfb_Linux on Fedora Core, this was the error message:

Couldn't open RGB_DB '/usr/X11R6/lib/X11/rgb'
error opening security policy file /usr/X11R6/lib/X11/xserver/SecurityPolicy
Could not init font path element /usr/X11R6/lib/X11/fonts/misc/, removing from list!
Could not init font path element /usr/X11R6/lib/X11/fonts/Speedo/, removing from list!
Could not init font path element /usr/X11R6/lib/X11/fonts/Type1/, removing from list!
Could not init font path element /usr/X11R6/lib/X11/fonts/CID/, removing from list!
Could not init font path element /usr/X11R6/lib/X11/fonts/75dpi/, removing from list!
Could not init font path element /usr/X11R6/lib/X11/fonts/100dpi/, removing from list!
Fatal server error: could not open default font 'fixed'

(2) Find the necessary X server files and add command line arguments to successfully start Xvfb. This is what worked on Fedora Core:

$ ./Xvfb_Linux -ac -sp /usr/lib/xserver/SecurityPolicy -fp /usr/share/X11/fonts/misc -co /usr/share/X11/rgb -screen 0 1024x768x24 :20

A virtual frame buffer is now running on display :20.

(3) Try running a simple X application on the display you have created.

$ xclock -display :20 &

Verify that the clock is visible in display :20 by dumping an image of display :20 and viewing the image.

$ xwd -display :20 -root | xwud

You should see a clock. See the respective man pages for more information on xwd(1) and xwud(1).

(4) Run Firefox on the display you have created.

$ firefox --display :20

Use the same xwd/xwud command to verify that Firefox is running in the virtual frame buffer.

$ xwd -display :20 -root | xwud

If you can create an X display using Xvfb but Firefox fails to run on this virtual framebuffer, you may need to update various libraries external to SOAtest. For example, an outdated Cairo library (used for 2D graphics) provided by the distribution has been known to cause problems on both Linux and Solaris. If you have problems with Cairo, first make sure the virtual framebuffer uses a display depth greater than 8 bits; this resolves a common problem. If troubles persist, you may need to update the library itself.

Getting Xvfb working with SOAtest

Once you determine the arguments that you need to pass to Xvfb, put these arguments to use in SOAtest. To do so, create a shell script named Xvfb_Linux that invokes Xvfb with the necessary arguments. SOAtest will then run the shell script when trying to start Xvfb.

(1) In /path/to/soatest/plugins/com.parasoft.xtest.libs.web_[version]/root, rename Xvfb_Linux to Xvfb_Linux.bin.

$ mv Xvfb_Linux Xvfb_Linux.bin


(2) Create a shell script Xvfb_Linux.sh that will run Xvfb_Linux.bin with the appropriate arguments. This is what worked on Fedora Core:

-----
#!/bin/sh
#
# Use "$@" to pass along any argument specified by SOAtest.
# The "$@" will include the display on which to run the server.
# Currently this is always display :32. You can override this by
# adding another display number as the last argument.
#
# There can be problems with graphics libraries if you do
# not set the display depth to greater than 8 using the
# -screen argument.
# Make sure to run Xvfb with 'exec' so that SOAtest will
# kill the correct process.
XVFB_DIR=`dirname $0`
exec ${XVFB_DIR}/Xvfb_Linux.bin \
    -ac \
    -sp /usr/lib/xserver/SecurityPolicy \
    -fp /usr/share/X11/fonts/misc \
    -co /usr/share/X11/rgb \
    -screen 0 1024x768x24 \
    "$@"
-----

(3) Create a symbolic link to Xvfb_Linux.sh.

$ ln -s Xvfb_Linux.sh Xvfb_Linux

Then, when SOAtest runs Xvfb_Linux, it will run the script that passes the appropriate arguments to Xvfb_Linux.bin. Alternatively, you could name the script Xvfb_Linux itself. However, using the symbolic link makes it easier to differentiate between what SOAtest installed and what you created to run the installed Xvfb.

(4) Run a web scenario in headless mode. While SOAtest is running the scenario, use ps(1) to check whether Xvfb_Linux.bin is running, and then use xwd/xwud again to verify that Firefox is running. (SOAtest will close Firefox as soon as it completes the test run.)

$ ps -ef | grep Xvfb
$ xwd -display :32 -root | xwud

If you configured Xvfb to run on a different display, use that when invoking xwd(1).


Running Static Analysis as Functional Tests Execute

SOAtest can perform static analysis (check links, HTML well-formedness, spelling, accessibility, etc.) on the web pages that the browser downloads as web functional tests execute. For details, see "Performing Static Analysis", page 578.


Customizing Recording Options

This topic explains how to exercise greater control over the way that SOAtest records web functional tests. SOAtest allows you to customize the clickable elements that can be recorded during functional test creation. The scripts to modify are located in the following directories:
• For Internet Explorer: <SOAtest_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/browsers/ie/UserCustomizableOptions.js
• For Firefox: <SOAtest_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/browsers/ff/extensions/[email protected]/chrome/content/UserCustomizableOptions.js

SOAtest uses the array variables defined in UserCustomizableOptions.js during recording. The following variables are currently available in this script:
• Recorder.clickableAttributes: Defines the attributes of HTML elements that SOAtest looks for when determining whether to record a click functional test on an element. For example, onclick is used to initiate script execution and is a good candidate for this array.
• Recorder.clickableTags: Defines the HTML tags that SOAtest will consider when recording click functional tests. In an AJAX web application, there are cases where clicking tags such as span or li causes certain functionality to execute on the client side. This variable is used to define such tags.
• Recorder.clickableInputTypes: Defines the form input types that SOAtest will consider when recording click functional tests. Types such as text and textarea are not considered clickable by default because the user usually clicks them just to gain focus and enter text.
• Recorder.disallowedTags: Contains a list of tags that will never be recorded, even if they satisfy other recording criteria.
• LocatorBuilders.order: Defines the order in which SOAtest creates the locator for a functional test. A locator is created to identify the HTML element on the page where the user action should take place. In a functional test, a locator is necessary during playback to repeat the user action. The order is constructed such that visual attributes in an element are favored when creating the locator.
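As a sketch of what these arrays look like, here is an illustrative UserCustomizableOptions.js fragment. All values shown are examples we chose for illustration, not the defaults actually shipped with SOAtest; inspect your installed copy before editing it:

```javascript
// Illustrative sketch only -- the shipped defaults may differ.
var Recorder = {};

// Attributes whose presence suggests an element reacts to clicks.
Recorder.clickableAttributes = ["onclick", "onmousedown", "onmouseup"];

// Tags considered when recording click tests; AJAX pages often attach
// click handlers to tags like span or li.
Recorder.clickableTags = ["a", "button", "input", "span", "li"];

// Input types treated as clickable; "text" is left out because clicking
// a text field usually just gives it focus for typing.
Recorder.clickableInputTypes = ["submit", "button", "checkbox", "radio"];

// Tags that are never recorded, even if they match the settings above.
Recorder.disallowedTags = ["html", "body"];

var LocatorBuilders = {};
// Locator strategies tried in order; attribute-based locators first,
// positional XPath as the last resort.
LocatorBuilders.order = ["id", "name", "attributes", "positionXPath"];
```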


Creating Custom Locators for Validations and Extractions

This topic explains how the CreateXPath hook allows you to create custom locators that can be used to identify elements when recording functional tests. These custom locators can then be used for validations and extractions. Sections include:
• About Hooks
• About the CreateXPath Hook

About Hooks

Customized hooks can be used to record or modify the values passed at specific points in SOAtest execution. Hooks are defined and customized in scripts using Python, JavaScript, or Java methods. The same file can define multiple hooks. If you add more than one method to a hook, all methods defined for that hook will be executed when the hook is called. You can create, apply, and invoke scripts that define hooks in the same way that you create, apply, and invoke any other script in SOAtest: upon startup (only for JavaScript and Python scripts), by creating and applying an Extension tool, and by adding scripts to a specific path node. You can invoke hooks at different times to elicit the desired functionality. For example, if you want to use a script's hook functionality for all SOAtest projects and sites, you could add the JavaScript or Python script that defines and uses that hook to the <soatest_install_dir>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/startup directory; then, any time the program calls the hook, the associated user-defined methods will be executed. The methods will be executed until you call clear() on the hook. For a complete description of general hook options, see the SOAtest Extensibility API (available by choosing SOAtest> Help> Extensibility API).

About the CreateXPath Hook

When recording a functional test, SOAtest identifies the elements that you interact with by certain attributes of those elements. For example, SOAtest will use the ID of an element, or the text content of an element, to locate that element. If you need to extend the attributes used to identify elements during functional testing, you can use the CreateXPath hook. The CreateXPath hook takes two arguments. The first is an instance of org.w3c.dom.Node, and is the node that the locator will be built for. The second is an instance of org.w3c.dom.Document, and is the document where the node was found. The CreateXPath function should return an XPath that identifies the node in the document, or null if no XPath can be created. For example, if you want to identify elements by their "class" attribute, you could use the CreateXPath hook to create a custom identifier that checks elements for a "class" attribute and, if it finds one, returns an XPath that identifies the element by the value of the "class" attribute. When selecting an element to validate, or clicking an element during recording, SOAtest would then use this custom hook. The element would be passed to the custom hook, which would then check the element for a "class" attribute. If it found one, it would return an XPath using the "class" attribute value to identify the element. Here is a sample script that identifies elements using the "class" attribute and an index (for when more than one node has the same "class").


app = Packages.com.parasoft.api.Application;
settings = Packages.webtool.browsertest.XPathCreator;

// establish the locator builder priority order
// we use our custom locator after using the id, name and attribute locators
// and before using the position xpath locator
settings.locatorBuildersOrder[0] = settings.ID_LOCATOR;
settings.locatorBuildersOrder[1] = settings.NAME_LOCATOR;
settings.locatorBuildersOrder[2] = settings.ATTRIBUTE_XPATH_LOCATOR;
settings.locatorBuildersOrder[3] = settings.HOOK_LOCATOR;
settings.locatorBuildersOrder[4] = settings.POSITION_XPATH_LOCATOR;

// gets the index of the node out of a list of nodes returned using xpath.
function getIndex(xpath, node, doc) {
    index = -1;
    if (doc) {
        try {
            list = Packages.org.apache.xpath.XPathAPI.selectNodeList(doc, xpath);
            count = 0;
            while (index == -1 && count < list.getLength()) {
                candidate = list.item(count);
                if (candidate.equals(node)) {
                    index = count;
                } else {
                    ++count;
                }
            }
        } catch (e) {}
    }
    return index;
}

// the hook function to link to the CreateXPath hook.
// looks for the class attribute and uses it (along with an index) to create
// a locator for the given node.
function createXPathHook(node, doc) {
    nodeName = node.getNodeName();
    attributes = node.getAttributes();
    classValue = attributes.getNamedItem("class");
    if (classValue) {
        xpath = "//" + nodeName + "[@" + classValue + "]";
        index = getIndex(xpath, node, doc);
        if (index != -1) {
            return xpath + "[" + index + "]";
        }
    }
    return null;
}

// sets up the hook
function setHook() {
    hook = app.getHook("CreateXPath");
    hook.set(createXPathHook);
}


Understanding Web Functional Test Errors

This topic introduces some common web functional test errors. There are a few types of errors, of varying severity, that you may encounter when running these particular types of tests:
• Unable to perform user action; Element <identifier> not found: The element on which the specific action was to take place was not located on the page at all. This error is fatal if it relates to an action (click, type, etc.), and will stop execution of the test suite. If this error occurs during an extraction, the test suite will continue to run after reporting the error.
• Firefox browser not installed: This error occurs either because SOAtest is not able to locate a version of Firefox via the registry (Windows only), or because the supplied path does not point to a viable Firefox executable. SOAtest supports Mozilla Firefox 1.5 and higher.
• Your current version of <browser> is unsupported by SOAtest: This error is the result of attempting to run a version of Firefox or Internet Explorer that is unsupported by SOAtest. SOAtest supports Mozilla Firefox 1.5 and higher and Internet Explorer 6.0 and higher.


Creating a Report of Test Suite Maintainability

This topic explains the purpose of SOAtest's Test Maintainability report, and describes how to access it. Sections include:
• Understanding the Test Maintainability Report
• Generating a Test Maintainability Report

Understanding the Test Maintainability Report

How element locators get created is one of the greatest barriers to creating maintainable web functional tests that withstand changes to a web application over time. If page elements are created with unique attributes such as an id, then a test can be generated that uses that id to locate the element. No matter where the element moves on the page, it can still be located because the id is unique. However, these kinds of unique identifiers are not often used in web applications. XPaths are commonly used instead, but tests that use XPaths are difficult to maintain: if page elements get rearranged, the XPaths can quickly become invalid. To help you gauge the maintainability of a test suite, SOAtest provides a Test Maintainability report that shows a summary of web functional test suites. A warning is generated for each web functional browser test with low maintainability, in other words, a test that uses one or more XPaths, which can make test maintenance difficult as the web application evolves.

Clicking on a warning opens a dialog that provides more detail.
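To make the maintainability contrast concrete, here are two hypothetical locators for the same link; both strings are invented for illustration and are not taken from a SOAtest report:

```javascript
// A positional XPath records where the element happened to sit in the
// page structure; inserting one sibling element above it invalidates it.
var positionalLocator = "/html/body/div[2]/table/tr[3]/td[1]/a";

// An id-based locator keeps working no matter where the element moves,
// as long as the id stays unique on the page.
var idLocator = "//a[@id='checkout']";
```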


Generating a Test Maintainability Report

A Test Maintainability report can be generated in two ways:
• Right-click a root .tst file in the Test Case Explorer, then choose View Test Maintainability Report from the shortcut menu.
• When creating a new web functional test from browser recording (as described in "Recording Tests from a Browser", page 431), select the Generate Test Maintainability Report checkbox in the browser recording wizard. The report will open automatically when recording is completed.


Security Testing

In this section:
• Security Testing: Introduction
• Authentication, Encryption, and Access Control
• Penetration Testing


Security Testing: Introduction

This topic provides an overview of how SOAtest facilitates two types of security testing:
• Authentication, encryption, and access control (i.e., runtime security policy validation).
• Hybrid security analysis, which integrates penetration testing with runtime error detection.

Sections include:
• Authentication, Encryption, and Access Control
• Hybrid Security Analysis

Authentication, Encryption, and Access Control

SOAtest assists with runtime security policy validation by enabling execution of complex authentication, encryption, and access control test scenarios. SOAtest includes security support for testing services with security layers. At the transport level, SOAtest supports SSL (both server and client authentication) as well as Basic, Digest, and Kerberos authentication. At the message level, SOAtest supports WS-Security, including X509, SAML, and Username security tokens, XML Encryption, and XML Digital Signature. It allows for security token validation as well as negative tests that ensure proper enforcement of message integrity and authentication.

Learning More

For details on how to perform this validation, see "Authentication, Encryption, and Access Control", page 479.

Hybrid Security Analysis

SOAtest's hybrid security analysis takes the functional tests that you and your team have already defined and uses them to perform a fully-automated assessment of where security attacks actually penetrate the application. This hybrid analysis:
1. Uses penetration testing to automatically generate and run penetration attack scenarios against your existing web or service functional test scenarios.
2. Uses runtime error detection to monitor the back-end of the application during test execution in order to determine whether security is actually compromised (and whether other runtime errors occur as well).
3. Correlates each reported runtime error with the functional test that was being run when the error was detected, allowing you to trace each reported error to the specific use case related to the problem.

The two key components of hybrid analysis, penetration testing and runtime error detection, can also be used independently of one another.


Penetration Testing

SOAtest's penetration testing can generate and run a variety of attack scenarios (such as Parameter Fuzzing, SQL and XPath injections, Cross Site Scripting, XML Bombs, and more) against your functional test suites. For instance, if you are not able or ready to configure your application server for runtime error detection, you can still use penetration testing to generate and run attack scenarios, then use alternative strategies to determine if the attacks caused security breaches.

Runtime Error Detection

SOAtest's runtime error detection monitors the application from the back-end as SOAtest tests execute and alerts you if security breaches or other runtime defects (such as race conditions, exceptions, or resource leaks) actually occur. You may want to perform runtime error detection both with and without penetration testing. This way, you can ensure that error detection covers both the exact use case functionality captured in your test cases as well as the simulated attacks based on this functionality.

Learning More

For details on how to perform these analyses, see "Penetration Testing", page 484 and "Performing Runtime Error Detection", page 490.


Authentication, Encryption, and Access Control

This topic explains how SOAtest assists with runtime security policy validation by enabling execution of complex authentication, encryption, and access control test scenarios. Sections include:
• Overview
• Tutorial
• Related Topics
• Testing Oracle/BEA WebLogic Services with WS-Security

Overview

To help you ensure that your security measures work flawlessly, SOAtest contains a vast array of security tools and options that help you construct and execute complex authentication, encryption, and access control test scenarios. For example:
• XML Encryption tool: The XML Encryption tool allows you to encrypt and decrypt entire messages or parts of messages using Triple DES, AES 128, AES 192, or AES 256. In WS-Security mode, Binary Security Tokens, X509IssuerSerial, and Key Identifiers are supported.
• XML Signer tool: The XML Signer tool allows you to digitally sign an entire message or parts of a message, depending on your specific needs. In some cases it may be important to digitally sign parts of a document while encrypting other parts.
• XML Signature Verifier tool: The XML Signature Verifier tool allows for the verification of digitally signed documents using a public/private key pair stored within a key store file.
• Key Stores: The use of key stores in SOAtest allows you to encrypt/decrypt and digitally sign documents using public/private key pairs stored in a key store. Key stores in JKS, PKCS12, BKS, and UBER format can be used.
• Username Tokens, SAML Tokens, X509 Tokens, or Custom Headers: SOAtest supports sending custom SOAP Headers and includes templates for Username Tokens and SAML tokens.

Tutorial

For a step-by-step demonstration of how to apply SOAtest for validating authentication, encryption, and access control, see "WS-Security", page 140. This tutorial covers encryption/decryption, digital signature, and the addition of SOAP Headers. Lessons include:
• "Message Layer Security with SOAP Headers", page 141
• "Using the XML Encryption Tool", page 144
• "Using the XML Signer Tool", page 152
• "XML Encryption and Signature Combined", page 155
• "Automatically Generating WS-Security Tests with WS-SecurityPolicy", page 157


Related Topics

For more details on how to use SOAtest's tools to support your specific authentication, encryption, and access control validation needs, see the following sections.

• WS-Security Policy: "SOA Quality Governance and Policy Enforcement", page 569; "Global WS-Policy Banks", page 342
• Custom Headers: "Adding SOAP Headers", page 811; "Adding SAML Headers", page 815; "Global SOAP Header Properties", page 339
• Tools: "XML Encryption", page 873; "XML Signer", page 878; "XML Signature Verifier", page 882
• General Security Settings (Authentication, Keystores, etc.): "Security Settings", page 754; "Using HTTP 1.0", page 689; "Using HTTP 1.1", page 691; "Global Key Stores", page 340
• HTTPS and SSL: "Accessing Web Services Deployed with HTTPS", page 423
• SAML: "Adding SAML Headers", page 815; "SAML 1.1 Assertion Options", page 814; "SAML 2.0 Assertion Options", page 814

Testing Oracle/BEA WebLogic Services with WSSecurity If your services are configured with WS-Security XML security policies, then you can configure SOAtest with the necessary settings in order to interoperate with WebLogic. To help you configure these settings, a sample SOAtest project WebLogicWSS.tst is included under [SOAtest install dir]/examples/tests. WebLogicWSS.tst is not an executable test; it intended to serve as a reference, allowing you to compare a working configuration that has been verified by Parasoft against your own. This example configuration has been tested to work with WebLogic 9.2 and later. This example assumes that default sign, encrypt and UsernameToken (ut) policies are being used by your WebLogic application. It also assumes that the wss_client certificate (the client public key) has been imported to WebLogic's DemoTrust keystore. Please note how:

• The signature and encryption tests in WebLogicWSS.tst include a WS-Security header with an X509 token configured.
• The various encryption and signature tools are set up for various WS-Security scenarios.
• Two certificate aliases are used by the various operations.

If you are using the default policies or policies that are built off of the defaults, configure your test settings to match this example in terms of the options selected. For more information about WebLogic security policies, please refer to Oracle's e-docs sites. Here are some references that we found to be useful:

• Enabling WS-Security debugging on WebLogic: http://e-docs.bea.com/wls/docs103/webserv_sec/message.html#wp288375. Note that these properties can be added to the startWebLogic.sh script within the WebLogic domain you are using.
• WebLogic DemoIdentity and DemoTrust keystores: http://kingsfleet.blogspot.com/2008/11/using-demoidentity-and-demotrust.html
• Oracle Web Services Security Policy Assertion Reference: http://edocs.bea.com/wls/docs103/webserv_ref/sec_assert.html


Penetration Testing

SOAtest’s penetration testing can generate and run a variety of attack scenarios against your functional test suites. Sections include:

• Configuring Penetration Testing Attacks
• Configuring Runtime Error Detection for Hybrid Security Analysis
• Executing Tests
• Reviewing and Validating Results
• Configuring Attackable Parameters

Configuring Penetration Testing Attacks

To configure SOAtest to simulate attacks against your functional test scenarios:

1. Choose SOAtest> Test Configurations to open the Test Configuration manager.
2. Click New to create a new Test Configuration.
3. Give the new Test Configuration a meaningful name.
4. Open that Test Configuration’s Execution> Security tab.
5. Enable Perform penetration testing.
6. Use the rule tree to indicate which attacks you want to run.


Available Attacks

SOAtest can simulate the following attacks:

Parameter Fuzzing: When input parameters to a Web service are not properly validated, it can lead to vulnerabilities in the underlying system. In native applications, buffer overflow attacks can occur when input parameter data sizes go unchecked. These vulnerabilities could cause system crashes or could even lead to unauthorized information being returned to the client application.

SQL Injections: When SQL statements are dynamically created as software executes, there is an opportunity for a security breach: crafted inputs can be passed in so that they become part of the SQL statement itself. This could allow an attacker to gain access to privileged data, log in to password-protected areas without a proper login, remove database tables, add new entries to the database, or even log in to an application with admin privileges.
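The core problem can be seen in a short sketch. This is an illustrative, hypothetical Java snippet (not SOAtest code or part of any service under test), showing how string concatenation lets input become part of the SQL statement; real code should use java.sql.PreparedStatement with '?' placeholders instead.

```java
public class SqlInjectionDemo {
    // UNSAFE: the query is assembled by string concatenation, so user
    // input becomes part of the SQL statement itself (illustrative only;
    // production code should use PreparedStatement parameters).
    public static String unsafeQuery(String username) {
        return "SELECT * FROM users WHERE name = '" + username + "'";
    }

    public static void main(String[] args) {
        // A crafted input turns the WHERE clause into a tautology that
        // matches every row, bypassing the intended name check.
        System.out.println(unsafeQuery("admin' OR '1'='1"));
    }
}
```

SOAtest's SQL injection attack strings work on this principle: each attack value is substituted into a parameter to see whether it alters the statement the service ultimately executes.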

Username Harvesting: A request that includes a wrong username or password should not be met with a response that indicates whether the username is valid. Such a response would make it easier for an attacker to identify valid usernames, then use them to guess the passwords.
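The defense is to keep failure responses indistinguishable. The following is a minimal, hypothetical Java sketch of the principle (the class and method names are invented for illustration, not taken from SOAtest or any framework):

```java
public class LoginResponses {
    // LEAKY: the response reveals whether the username exists,
    // enabling username harvesting.
    public static String leaky(boolean userExists, boolean passwordOk) {
        if (!userExists) return "Unknown username";
        return passwordOk ? "Welcome" : "Wrong password";
    }

    // SAFER: both failure modes return an identical message, so an
    // attacker cannot distinguish a bad username from a bad password.
    public static String safe(boolean userExists, boolean passwordOk) {
        return (userExists && passwordOk)
            ? "Welcome" : "Invalid username or password";
    }
}
```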

XPath Injections: XPath injections are similar to SQL injections in that they are both specific forms of code injection attacks. XPaths enable you to query XML documents for nodes that match certain criteria. If such a query is constructed dynamically in the application code (with string concatenation) using unvalidated inputs, then an attacker could inject XPath queries to retrieve unauthorized data.
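The mechanics mirror SQL injection. Here is an illustrative, hypothetical Java sketch (names invented for this example) of an XPath predicate built by concatenation:

```java
public class XPathInjectionDemo {
    // UNSAFE: an XPath query assembled by string concatenation from
    // unvalidated input (illustrative only).
    public static String loginXPath(String user, String pass) {
        return "//user[name/text()='" + user
             + "' and password/text()='" + pass + "']";
    }

    public static void main(String[] args) {
        // The injected quote and "or" clause make the predicate always
        // true, so the query matches a user without a valid password.
        System.out.println(loginXPath("admin", "' or '1'='1"));
    }
}
```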

Cross-Site Scripting: Cross-site scripting problems occur when user-modifiable data is output verbatim to HTML. Subsequently, an attacker can submit script tags with malicious code, which is then executed on the client browser. This allows an attacker to deface a site, steal credentials of legitimate users, and gain access to private data.
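The standard mitigation is output encoding before user data reaches HTML. The following is a minimal, hypothetical Java sketch of that idea; real applications should use a vetted encoding library rather than a hand-rolled helper like this one:

```java
public class HtmlEscaper {
    // Minimal sketch of HTML output encoding: the five characters that
    // can change HTML structure are replaced with character entities,
    // so submitted script tags render as inert text.
    public static String escape(String s) {
        StringBuilder out = new StringBuilder();
        for (char c : s.toCharArray()) {
            switch (c) {
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '&':  out.append("&amp;");  break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#39;");  break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }
}
```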

XML Bombs: When using a DTD (Document Type Definition) within an XML document, a Denial of Service attack can be executed by defining a recursive entity declaration that, when parsed, can quickly explode exponentially into a large number of XML elements. This can consume the XML parser's resources, causing a denial of service.

External Entities: XML has the ability to build data dynamically by pointing to a URI where the actual data is located, so an attacker may be able to replace the data that is being collected with malicious data. The URI can be pointed at local XML files on the Web service's file system to make the XML parser read large amounts of data or to steal confidential information, or at other servers to launch DoS attacks in which the compromised system appears to be the attacker.
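Both of these DTD-based attacks (XML bombs and external entities) can be blocked at the parser level on a Java backend. A minimal sketch, assuming a JAXP parser that supports the Xerces disallow-doctype-decl feature (the JDK's built-in parser does); the class and helper names are invented for illustration:

```java
import javax.xml.parsers.DocumentBuilderFactory;

public class SafeParser {
    // Hypothetical helper (not a SOAtest API): builds a JAXP parser
    // factory hardened against DTD-based attacks. Rejecting DOCTYPE
    // declarations blocks both entity-expansion bombs and external
    // entity resolution.
    public static DocumentBuilderFactory hardenedFactory() throws Exception {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        f.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        return f;
    }

    // Returns true if the document parses under the hardened configuration.
    public static boolean parses(String xml) {
        try {
            hardenedFactory().newDocumentBuilder()
                .parse(new java.io.ByteArrayInputStream(xml.getBytes("UTF-8")));
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(parses("<foo/>"));  // plain XML is accepted
        System.out.println(parses("<!DOCTYPE foo [<!ENTITY a 'x'>]><foo>&a;</foo>"));  // any DOCTYPE is rejected
    }
}
```

SOAtest's XML bomb and external entity attacks probe whether the service's parser accepts such documents at all; a configuration like this one makes those attacks fail fast.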

Schema Invalid XML: A well-formed document is not necessarily a valid document. Without referencing either a DTD or a schema, there is no way to verify whether the XML document is valid or not. Therefore, measures must be taken to ensure that XML documents do, in fact, reference a DTD or schema.


Large XML Documents: Large payloads can be used to attack a Web service in two ways. First, a Web service can be clogged by sending a huge XML payload in the SOAP request, especially if the request is a well-formed SOAP request and it validates against the schema. Second, large payloads can also be induced by sending certain request queries that result in large responses.

Malformed XML: XML elements with malformed, unacceptable, or unexpected contents can cause the service to fail.

Attack String Customization

To customize the attack strings used for various attacks, modify the .csv files in [SOAtest_install_dir]/plugins/com.parasoft.xtest.libs.web_[version]/root/security.

Configuring Runtime Error Detection for Hybrid Security Analysis

If your application has a Java backend and you want to apply runtime error detection in order to determine whether these attacks actually cause security breaches or other runtime defects, you should also configure runtime error detection as described in “Performing Runtime Error Detection”, page 490. Be sure to configure both:

• The server.
• The Test Configuration that you will use to perform penetration testing (see “Configuring Penetration Testing Attacks”, page 484).

Executing Tests

To run the penetration tests:

1. Select the test suite that you want to attack.
2. Run the Test Configuration that you designed for penetration testing (see “Configuring Penetration Testing Attacks”, page 484).

Reviewing and Validating Results

Results will be reported in the SOAtest tab and in any reports generated.

With Runtime Error Detection Enabled

If you performed hybrid analysis (penetration testing + runtime error detection), errors detected will be reported as follows:


Note that SOAtest correlates each reported error with the functional test that was being run when the error was detected. This correlation between violations and functional tests allows you to trace each reported error to particular use cases against your application.

Without Runtime Error Detection Enabled

Additional validation strategies can help you determine whether the generated attacks succeeded. For example, you can chain Coding Standards, Search, or XML Validator tools to the test suite, inspect server logs manually, or run a script to parse these logs.

Viewing Attack Traffic

The Traffic Viewer for each test allows you to view attack traffic. Using the available Attacks and Iteration controls, you can display traffic for all attacks or for specific attack types, as well as focus on traffic for specific attack values.


Configuring Attackable Parameters

By default, SOAtest will try to attack all of the available parameters represented in a selected test suite’s SOAP Client, REST Client, Messaging Client, and Browser Testing tools. To customize which parameters may be attacked:

1. Double-click the top-level test suite node for the functional tests you want to attack.
2. Open the Security Options tab (on the far right).
3. Use the Penetration Test Parameter tree to indicate which parameters can be attacked.
4. Save the test suite configuration changes.


Runtime Error Detection

In this section:

• Performing Runtime Error Detection

Performing Runtime Error Detection

This topic explains how to use SOAtest to perform runtime error detection on Java applications. Sections include:

• Runtime Error Detection Overview
• Preparing the Server
• Customizing the Test Configuration
• Detecting Errors

Runtime Error Detection Overview

SOAtest can perform runtime error detection as functional tests or penetration tests execute. This runtime error detection analyzes the executing application, applying a configurable set of dynamic "runtime rules" that verify the runtime behavior of the application. The rule violations reported indicate runtime errors that actually occurred during execution. SOAtest correlates each reported error with the functional test that was being run when the error was detected. This correlation between violations and functional tests allows you to trace each reported error to particular use cases against your application. Categories of errors detected include:

• Application crashes
• Eclipse development
• Exceptions
• Functional Errors
• File I/O
• Graphical User Interface
• Database
• Network
• Optimization
• Portability
• Security
• Servlets
• Threads & Synchronization

Preparing the Server

Before you can monitor an application, you must copy the appropriate jar files to the machine that is running your application server, then configure the application server to use the jars. This is done as follows:

1. Copy insure.jar and insureimpl.jar from [SOAtest install dir]/eventmonitor to a directory on the server with the application you want to check.
2. If the server is running, stop it.
3. In your startup script, add the -javaagent command to the existing Java arguments.




• For details, see “javaagent Command Details”, page 491.
• Be sure to add the optional trace_to_xml parameter if you also want to monitor the application’s internal behavior as your functional tests execute. See the box below for details.

4. Restart the server. The server will start as usual, but with the specified package classes instrumented.

Gain visibility into the application’s internal behavior during test execution

As you configure the application for runtime error detection, you are also preparing it for application monitoring, which provides visibility into the application’s internal behavior as functional tests execute. This allows you to better identify the causes of test failures as well as enrich the use case validation criteria in your regression tests. For example, you can validate that an EJB method call or a remote call to another system is taking place with the expected parameters. To perform this monitoring:

• Use the required trace and recommended trace_to_xml parameters in your server startup script.
• Add a properly-configured Event Monitor tool to the start of the test scenario you want to monitor.

See “Monitoring Java Applications”, page 514 for details.

javaagent Command Details

Basics

The following invocation-time parameters are required:

• soatest: Required for configuring runtime error detection.
• port=[port_number]: Specifies which port should be used to communicate with the monitored program. Use 5050 to 5099.
• instrument=[class_name.method_name(jni_sig)]: Specifies which packages/classes/methods to check. Use ':' to separate multiple prefixes. You can provide specific class names, or use wildcards to monitor any class in the specified package.

For Applications Running from Eclipse or Application Servers

Applications that define their own class loaders (e.g. Eclipse, JBoss, and Tomcat) need insure.jar added to the boot classpath. To monitor those applications, add the following to the launch VM arguments:

-javaagent:"<path_to_jar>\insure.jar=soatest,port=<port>",instrument=<my.package.prefix> -Xbootclasspath/a:<path_to_jar>\insure.jar

For instance, you may use:


-javaagent:"/home/user/parasoft/insure.jar=soatest,port=5060",instrument=com.mycompany.onlinestore -Xbootclasspath/a:/home/user/parasoft/insure.jar

For Other (Standalone) Java Applications

For other (standalone) Java applications, you do NOT need to add insure.jar to the boot classpath. For instance, you may use:

java -javaagent:"C:\Program Files\Parasoft\insure.jar=soatest,port=5050",instrument=com.mycompany.myapp

Customizing the Test Configuration

Before you can perform runtime error detection, you need to customize a Test Configuration to specify which application you want to check and what errors you want to look for. To do this:

1. Create a new Test Configuration as described in “Creating Custom Test Configurations”, page 244.
2. In the Execution> Runtime Error Detection tab, do the following:
   a. Enable Perform runtime error detection.
   b. Specify the host and port of the server running the application you want to check.
   c. Enable/disable rules to specify what types of errors you want detected.

Detecting Errors

To perform runtime error detection on your application:

1. Select the node for the functional tests you want to execute with runtime error detection.
2. Run the custom Test Configuration (described in the previous section).

Errors detected will be reported as follows:


Note that SOAtest correlates each reported error with the functional test that was being run when the error was detected. This correlation between violations and functional tests allows you to trace each reported error to particular use cases against your application.


Event Monitoring (ESBs, Java Apps, Databases, and other Systems)

In this section:

• Monitoring Intra-Process Events: Overview
• Using SOAtest’s Event Monitor
• Monitoring IBM WebSphere ESB
• Monitoring Oracle or BEA AquaLogic Service Bus
• Monitoring Software AG webMethods Broker
• Monitoring Sonic ESB
• Monitoring TIBCO EMS
• Monitoring Other JMS Systems
• Monitoring Java Applications
• Monitoring Databases
• Monitoring Stub Events
• Monitoring a Custom API-Based Events Source
• Extensibility API Patterns
• Generating Tests from Monitored Transactions

Monitoring Intra-Process Events: Overview

This topic provides an overview of why and how SOAtest monitors events. SOAtest can visualize and trace the intra-process events that occur as part of the transactions triggered by the tests and then dissect them for validation. This enables test engineers to identify problem causes and validate multi-endpoint, integrated transaction systems—actions that are traditionally handled only by specialized development teams. Sections include:

• Why Monitor Events?
• Monitoring Events with SOAtest

Why Monitor Events?

In a test environment, a lot of things can go wrong: the message could be routed to the wrong system, it could be transformed incorrectly, or the target system might not perform a required action. If a test succeeds, how do you know that every step was executed correctly and the appropriate updates were performed? If it fails, how do you know what went wrong, where, and why? Once you identify the cause of an error, how can you isolate it, design the right tests around the isolated parts so the problem can be resolved, and then keep those tests so that the problem can be detected if it recurs in the future? Just sending a message into a system and validating the response coming out is not sufficient to address these challenges. Having visibility and control inside these systems—especially ESBs, which serve as the heart of today's business transactions—is critical to success.

Monitoring Events with SOAtest

Parasoft SOAtest can monitor messages and events inside ESBs, Java applications, databases, and other systems in order to provide validation and visibility into test transactions. SOAtest also provides a framework and an API to provide this level of internal visibility within almost any system.

In addition to sending an initial message and then validating the response (and possibly validating various changes that take place as a result of the transaction), SOAtest can also monitor the intermediate messages and events. For example, it may monitor the initial messages, the message after it is transformed, messages where a service calls another service and a response comes back, and the message that reaches the destination system—then also monitor the steps through the entire route that the response messages take back. If the transaction executes correctly, you can easily define assertions to automate validation of these intermediate messages/steps in all subsequent test runs. If it does not execute correctly, you can determine what went wrong and where. You can then apply the available tool set to validate whether messages satisfy functional expectations. Using this functionality, you can ensure that tests aren’t marked as successful unless the test transaction executes exactly as intended.


Using SOAtest’s Event Monitor

This topic explains how to configure and apply the Event Monitor tool, which traces the internal events within systems such as ESBs, Java applications, databases, and other business applications—and allows you to make them part of SOAtest’s end-to-end test scenarios. Sections include:

• Understanding the Event Monitor
• Tool Configuration
• Test Suite Execution
• Stand-Alone/Ad-Hoc Execution
• Viewing Monitored Events
• Using Data Sources
• Retrieving Events from Other Platforms
• Validating Monitored Messages

Understanding the Event Monitor

The Event Monitor tool traces the internal events within systems such as ESBs, Java applications, and business applications—providing visibility into intermediary messages without requiring direct access to the messaging system’s interface. Additionally, you can apply the available tool set to validate whether messages satisfy functional expectations.

SOAtest provides built-in support for monitoring TIBCO Enterprise Messaging System, Sonic Enterprise Service Bus systems, Oracle/BEA AquaLogic Service Bus, Software AG webMethods Broker, IBM WebSphere ESB, any other JMS-based systems, any Java application, and any relational database. In addition, it can be configured to monitor any custom API-based events source (for example, non-JMS systems from other vendors, custom logging frameworks, etc.).

The Event Monitor should always be positioned as the first test in the test suite. To configure it, you tell SOAtest how to connect to your system and what you want it to monitor. When you run the test suite, SOAtest will start the event monitor and then keep it running as the test suite’s tests execute. SOAtest monitors and reports the specified events/messages as they occur. Additional tools can then be chained to validate or further process the monitored events/messages.

Tool Configuration

The configuration procedure depends on what type of system you’re monitoring. This tool can be used for:

• Monitoring Oracle or BEA AquaLogic Service Bus
• Monitoring Software AG webMethods Broker
• Monitoring Sonic ESB
• Monitoring TIBCO EMS
• Monitoring Other JMS Systems
• Monitoring Java Applications
• Monitoring Databases
• Monitoring a Custom API-Based Events Source


Test Suite Execution

Test suite execution is the typical and recommended usage of the Event Monitor tool. Add an Event Monitor tool to your test suite and make it the first test. When you select and run the parent test suite, the Event Monitor will start automatically and continue monitoring while the rest of the tests in the test suite execute. It will stop when the first of the following two events takes place:

1. The last test in the test suite (or the last row in the data source, if the test suite’s tests are iterating over a data source) has completed execution, the last event has been retrieved, and the monitoring connection has been destroyed.
2. The maximum monitor execution duration (configured in the tool’s Options tab) has been reached.

You do not need to set the Event Monitor's parent test suite to Tests run concurrently. SOAtest will automatically recognize the Event Monitor as a special tool and manage its concurrent execution accordingly.

Stand-Alone/Ad-Hoc Execution

You may configure the connection settings of an Event Monitor and execute it on its own while exercising your target systems outside of SOAtest. For example, you may wish to monitor JMS messages in a back-end middleware system while you manually browse the Web interface of your application. To watch the events being logged in real time, open the Event Viewer tab. The Event Monitor tool will run for the duration that is set under the Options tab’s Maximum monitor execution duration (milliseconds) setting. Such ad-hoc execution is not applicable when the Custom events source option is selected with the Poll after each test execution pattern.

Viewing Monitored Events

Events will be reported and visualized in the Event Viewer tab within the Event Monitor tool configuration panel. To view monitored events in real time, keep the Event Viewer tab (in the Event Monitor tool’s configuration panel) open during test execution.

There are two tabs available: the graphical view and the text view.


In the graphical view, click on an event to see message details.

Using Data Sources

The Event Monitor itself is not parameterizable with a data source. However, if the tests inside the test suite that includes the Event Monitor tool are parameterizable, it will start monitoring before the first actual test (following the Event Monitor test) and first data source row executes, and it will stop monitoring when all data source rows have been used. This ensures that you will obtain a single, seamless log of events regardless of how the tests iterate inside the test suite.

Retrieving Events from Other Platforms

The Event Monitor tool has extensibility hooks that allow it to obtain events from a variety of sources besides the built-in platforms and systems that it supports. See “Extensibility API Patterns”, page 525 for details.

Validating Monitored Messages

If the transaction executes correctly, you can add regression controls for monitored messages as described in “Configuring Regression Testing”, page 425. In addition, you can add validations as described in “Adding Test Outputs”, page 333. The available validation tools are listed in “Validation Tools”, page 898.


You can also monitor the system while the transactions execute (for example, trigger the transaction from the Web interface), and then generate tests automatically from these messages. This includes tests for transaction entry/exit points, as well as tests for intermediate parts of the transaction. In this way, you can reproduce issues quickly, create tests with real values rapidly, then isolate the system and create regression tests around the pieces and components of the transaction. For details on how to do this, see:

• “Creating Tests From Sonic ESB Transactions”, page 404
• “Creating Tests From TIBCO EMS Transactions”, page 407
• “Creating Tests From JMS System Transactions”, page 401


Monitoring IBM WebSphere ESB

This topic explains how to configure monitoring for IBM WebSphere ESB. Sections include:

• WebSphere Configuration
• SOAtest Configuration
• Viewing Monitored Events

WebSphere Configuration

IBM WebSphere ESB includes monitoring capabilities that build upon its underlying WebSphere Application Server. Parasoft SOAtest can subscribe to Common Base Events that are fired at points in the processing of service components, and which are managed by the WebSphere Common Event Infrastructure (CEI). For information about monitoring service component events in the WebSphere ESB and enabling the monitoring using the WebSphere administrative console, see http://publib.boulder.ibm.com/infocenter/dmndhelp/v6r2mx/topic/com.ibm.websphere.wesb620.doc/doc/cmon_businessevents.html.

To configure WebSphere for event monitoring:

1. Enable the CEI service in the ESB.
2. Choose the level of logging for the service components you are interested in. The steps for performing this task on the ESB can be found at http://publib.boulder.ibm.com/infocenter/dmndhelp/v6r2mx/topic/com.ibm.websphere.wesb620.doc/doc/tmon_configcei.html.

• In order to get the full event details in SOAtest, we recommend that you select the "ALL MESSAGES AND TRACES" option and the "FINEST" logging level for the components you are interested in, which results in the business messages being included in the CEI events. To enable that for all business integration components, the log level string in the WebSphere administrative console would look like this: *=info: WBILocationMonitor.CEI.SCA.com.*=finest

SOAtest Configuration

Adding Required Jar Files to the SOAtest Classpath

The following jar files need to be added to the SOAtest classpath:

• com.ibm.ws.ejb.thinclient_7.0.0.jar
• com.ibm.ws.orb_7.0.0.jar
• com.ibm.ws.sib.client.thin.jms_7.0.0.jar
• com.ibm.ws.emf_2.1.0.jar

The jar files can be found under [WAS installation dir]/runtimes. To add these jar files to SOAtest’s classpath:

1. Choose SOAtest> Preferences.
2. Open the System Properties page.
3. Click the Add JARS button and select the necessary JAR files to be added.

Configuring the Event Monitor Tool


To configure the Event Monitor tool to monitor messages that pass through WebSphere ESB:

1. Double-click the Event Monitor tool to open the tool configuration panel.
2. In the Event Source tab, select IBM WebSphere Enterprise Service Bus as the platform, then configure the following options:
   a. In the Connection area, specify your ESB connection settings.
      • The username and password are the credentials that were configured in the WebSphere ESB (under Security, Business Integration Security on the WebSphere administrative console for the Common Event Infrastructure).
      • The credentials you provide are used by SOAtest to create the JNDI InitialContext of the events JMS topic and to create the JMS connection.
   b. In the Monitoring Source field, specify the topic or queue that you want to monitor.
      • You can leave the default destination name as jms/cei/notification/AllEventsTopic, which is the CEI topic that reports all CEI events.
      • The connection URL is essentially the JNDI InitialContext URL for the WebSphere Default JMS provider.
      • The port number is the WebSphere bootstrap port. You can check the correct port number for your WebSphere ESB using the administrative console: under the Servers section, choose WebSphere Application Server, then click or expand the "Ports" link under the "Communication" section. The port number to use in SOAtest is the BOOTSTRAP_ADDRESS value.

3. In the Options tab, modify settings as needed.
   • Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or whether they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.
   • Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.
     If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e. the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath, since the message technically becomes just string content for the parent element.
     Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.
   • Maximum monitor execution duration specifies the point at which the test should time out—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to the system).




• Event polling delay after each test finishes execution (milliseconds) is not applicable here.

Viewing Monitored Events

After the test runs, the Event Monitor will show the XML representation of the Common Base Events it receives from WebSphere, including the event's raw business data if it is present.


Monitoring Oracle or BEA AquaLogic Service Bus

This topic explains how to configure monitoring for events that are transmitted through Oracle Service Bus (OSB) or BEA AquaLogic Service Bus (ALSB). Sections include:

• Service Bus Configuration
• SOAtest Configuration

Service Bus Configuration

1. Ensure that "Message Reporting" is enabled. This is required so SOAtest can draw message events.
   • For details on how to globally enable message reporting in the bus, refer to the OSB Console Guide at http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/consolehelp/configuration.html#wp1080858.
2. Add Message Reporting actions to the desired message workflow components as described in http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/userguide/modelingmessageflow.html#wp1080496.
   • For details on how to accomplish this, see http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/consolehelp/proxyactions.html#wp1309439.

SOAtest Configuration

1. Configure the SOAtest classpath. By default, OSB is configured to use the built-in PointBase relational database for "Message Reporting" purposes. Parasoft SOAtest uses the OSB message reporting framework to obtain and visualize events (intermediate messages) from the bus by executing SQL queries on the reporting database.
   • If you have a default OSB configuration, then you need to add the PointBase JDBC driver to the SOAtest classpath. This is in a single jar found in your OSB/WebLogic installation directory: $[BEA HOME]/wlserver_10.*/common/eval/pointbase/lib/pbclient5*.jar
     You need to use pbclient51.jar for ALSB 3.0 and pbclient57.jar for OSB 10gR3 (each ships with its own jar).
   • If your OSB is configured to use a different database, then you need to provide the database JDBC drivers to the SOAtest classpath. If it is Oracle, you do not need to add any drivers (since SOAtest ships with Oracle drivers).
2. Set the Event Monitor URL based on the database. For more information, see “Database Configuration Parameters”, page 348 and the SOAtest forum.
3. (Recommended) Open the Options tab and set the delay amount (Event polling delay after each test finishes execution) to 3 seconds or more. By making the event monitor wait for a few seconds before obtaining the events, you can ensure that the events have been logged to the database before the query is executed.

Note that you can click Export Configuration Settings to export these configuration settings to a file; other team members can then reference the settings by selecting the File button and specifying the path to this file.


Monitoring Software AG webMethods Broker

This topic explains how to configure monitoring for events that are transmitted through Software AG webMethods Broker. This monitoring requires admin client group privileges. Sections include:
• Adding Required Jar Files to the SOAtest Classpath
• Configuring Event Monitor
• Notes

Adding Required Jar Files to the SOAtest Classpath

The following jar files need to be added to the SOAtest classpath:
• wmbrokerclient.jar
• g11nutils.jar

The jar files can be found under [webmethods install dir]/Broker/lib/. For more details, please refer to webMethods Broker Client Java API Programmer's Guide> Getting Started> Using the webMethods Broker Java API.

To add these jar files to SOAtest's classpath:
1. Choose SOAtest> Preferences.
2. Open the System Properties page.
3. Click the Add JARS button and select the necessary JAR files.

Configuring Event Monitor

To configure the Event Monitor tool to monitor webMethods Broker:
1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Software AG webMethods Broker as the platform. The configuration fields will be populated with default values.
3. Adjust the configuration field values as needed. The fields are the same as those used in the webMethods tool, and are described in "webMethods", page 833.


Note - Using Filters and Wildcards

The Event Monitor uses BrokerAdminClient to monitor events. By default, it subscribes to the "admin" client group. Thus, if you wish to filter events based on their content, you should change the client group value to one that allows for subscription to regular event types (instead of Trace). Why? Because when using Broker::Trace::* events, the filter string will be applied to the fields in the trace BrokerEvents, not the original events that they represent.

When completing the Event Type field, remember that wildcards are not allowed for Broker::Trace or Broker::Activity event types, according to the webMethods Broker Client Java API Programmer's Guide. If you wish to monitor a set of event types, specify a client group name that allows subscription access to the desired event types (possibly other than the default "admin" client group setting), then provide event types with wildcards. For example, you can use Sample::*.

You may also use $[data bank column name] variables in your string. SOAtest will replace each variable with the data bank value, so filter strings can have dynamic values based on the output of other tests.

For more details on filters and using wildcards in event type names, please refer to the webMethods Broker Client Java API Programmer's Guide.
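The $[data bank column name] substitution described above can be sketched as a simple pattern replacement over the filter string. The class and method names below are illustrative, not part of the SOAtest API; in this sketch, unresolved variables are left in place.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: replace $[name] placeholders in a filter string with values
// captured in a data bank. Names are illustrative, not the SOAtest API.
class FilterSubstitution {
    private static final Pattern VAR = Pattern.compile("\\$\\[([^\\]]+)\\]");

    static String substitute(String filter, Map<String, String> dataBank) {
        Matcher m = VAR.matcher(filter);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            // Fall back to the literal placeholder if no value was captured.
            String value = dataBank.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }
}
```

With a data bank mapping "order id" to "123", the filter "orderId = $[order id]" becomes "orderId = 123" at test time.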

4. In the Options tab, modify settings as needed.
• Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.
• Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events' message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.
• If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e., the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.
• Note that the Diff tool's XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.
• Maximum monitor execution duration specifies the point at which the test should time out (in case another test in the test suite hangs, or if no other tests are being run; e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to the system).
• Event polling delay after each test finishes execution (milliseconds) is not applicable here.
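The CDATA wrapping behavior described above can be sketched in plain Java. This is an illustration of the escaping idea, not SOAtest's actual implementation; note that a literal "]]>" inside the content must be split across two CDATA sections, since it would otherwise terminate the section early.

```java
// Sketch: embed arbitrary (possibly non-XML) message content in a CDATA
// section so the surrounding event XML stays well-formed. Illustrative only.
class CdataWrap {
    static String wrap(String messageContent) {
        // "]]>" cannot appear verbatim inside a CDATA section; split it
        // across two adjacent sections.
        String safe = messageContent.replace("]]>", "]]]]><![CDATA[>");
        return "<![CDATA[" + safe + "]]>";
    }
}
```

This is also why the wrapped content is no longer addressable by XPath: the parser sees it as one opaque text node of the parent element.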


Notes

With admin client group privileges, the Event Monitor tool can subscribe to the following event types:
• Broker::Ping
• Adapter::ack
• Adapter::adapter
• Adapter::error
• Adapter::errorNotify
• Adapter::refresh
• Broker::Trace::Publish
• Broker::Trace::Enqueue
• Broker::Trace::Drop
• Broker::Trace::Receive
• Broker::Trace::PublishRemote
• Broker::Trace::EnqueueRemote
• Broker::Trace::ReceiveRemote
• Broker::Activity::TerritoryChange
• Broker::Activity::ClientChange
• Broker::Activity::ClientGroupChange
• Broker::Activity::EventTypeChange
• Broker::Trace::Insert
• Broker::Trace::Delete
• Broker::Trace::Peek
• Broker::Trace::DropRemote
• Broker::Trace::Modify
• Broker::Activity::ClientSubscriptionChange
• Broker::Activity::RemoteSubscriptionChange

Note that you can also configure the webMethods tool (described in "webMethods", page 833) to subscribe to the desired event type; the only difference is that with the webMethods tool you need to provide the specific event type name.


Monitoring Sonic ESB

This topic explains how to configure monitoring for Sonic ESB.

Prerequisites

The following jar files must be added to your classpath (via SOAtest> Preferences> System Properties):
• broker.jar
• mfcontext.jar
• sonic_Client.jar

Configuration

To configure the Event Monitor tool to monitor messages that pass through Sonic ESB:
1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Sonic Enterprise Service Bus as the platform, then configure the following options:
a. In the Connection area, specify your Sonic ESB connection settings.
b. In the Destination Name field, specify the topic or queue that you want to monitor.
• You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special "dev.Tracking" tracking endpoint.
• For instance, if you want to track all events that occur as part of the process flow, specify the dev.Tracking endpoint, and have the process set to Tracking Level of 4 in the ESB.
c. In the Destination Type field, specify whether the tracking destination is a topic or a queue.
d. (Optional) In the Message Selector field, enter a value to act as a message filter. See "Using Message Selector Filters", page 704 for tips.
3. In the Options tab, modify settings as needed.
• Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.
• Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events' message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.
• If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e., the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.
• Note that the Diff tool's XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.
• Maximum monitor execution duration specifies the point at which the test should time out (in case another test in the test suite hangs, or if no other tests are being run; e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to the system).
• Event polling delay after each test finishes execution (milliseconds) is not applicable here.


Monitoring TIBCO EMS

This topic explains how to configure monitoring for TIBCO EMS.

Prerequisites

The tibjms.jar file must be added to your classpath (via SOAtest> Preferences> System Properties).

Configuration

To configure the Event Monitor tool to monitor messages that pass through TIBCO EMS:
1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select TIBCO Enterprise Message Service as the platform, then configure the following options:
a. In the Connection area, specify your TIBCO EMS connection settings.
b. In the Destination Name field, specify the topic or queue that you want to monitor.
• You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special process tracking topic.
• For instance, to track any JMS message that gets transmitted through TIBCO EMS, use $sys.monitor.Q.r.>
• For details on specifying tracking topics for TIBCO EMS, see "Chapter 13: Monitoring Server Activity" and "Appendix B: Monitor Messages" in the TIBCO EMS User's Guide.
c. In the Destination Type field, specify whether the tracking destination is a topic or a queue.
d. (Optional) In the Message Selector field, enter a value to act as a message filter. See "Using Message Selector Filters", page 704 for tips.
3. In the Options tab, modify settings as needed.
• Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.
• Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events' message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.
• If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e., the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.
• Note that the Diff tool's XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.
• Maximum monitor execution duration specifies the point at which the test should time out (in case another test in the test suite hangs, or if no other tests are being run; e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to the system).
• Event polling delay after each test finishes execution (milliseconds) is not applicable here.


Monitoring Other JMS Systems

This topic explains how to configure monitoring for any JMS system.

To configure the Event Monitor tool to monitor any JMS system:
1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Generic JMS system as the platform, then configure the following options:
a. In the Connection area, specify your connection settings.
b. In the Initial Context field, specify a fully-qualified class name string, passed to the JNDI javax.naming.InitialContext constructor as a string value for the property named javax.naming.Context.INITIAL_CONTEXT_FACTORY.
c. In the Connection Factory field, specify the JNDI name for the factory. This is passed to the lookup() method in javax.naming.InitialContext to create a javax.jms.QueueConnectionFactory or a javax.jms.TopicConnectionFactory instance.
d. In the Destination Name field, specify the topic or queue that you want to monitor.
• You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special process tracking topic.
e. In the Destination Type field, specify whether the tracking destination is a topic or a queue.
f. (Optional) In the Message Selector field, enter a value to act as a message filter. See "Using Message Selector Filters", page 704 for tips.
3. In the Options tab, modify settings as needed.
• Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.
• Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events' message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.
• If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e., the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.
• Note that the Diff tool's XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.
• Maximum monitor execution duration specifies the point at which the test should time out (in case another test in the test suite hangs, or if no other tests are being run; e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to the system).
• Event polling delay after each test finishes execution (milliseconds) is not applicable here.
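The Initial Context and Connection Factory fields map onto a standard JNDI lookup. The sketch below builds the environment an InitialContext would be created with; the factory class and provider URL used in the usage note are placeholder examples, and the lookup itself (which requires a running JMS provider) is shown only in comments.

```java
import java.util.Hashtable;
import javax.naming.Context;

// Sketch: the JNDI environment corresponding to the Initial Context field
// described above. Illustrative names; adapt to your JMS provider.
class JndiEnv {
    static Hashtable<String, String> env(String initialContextFactory, String providerUrl) {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, initialContextFactory);
        env.put(Context.PROVIDER_URL, providerUrl);
        // A monitor would then create the context and resolve the factory:
        //   Context ctx = new javax.naming.InitialContext(env);
        //   Object factory = ctx.lookup(connectionFactoryJndiName);
        // and cast it to QueueConnectionFactory or TopicConnectionFactory
        // before creating a consumer for the configured destination.
        return env;
    }
}
```

For example, `env("org.example.CtxFactory", "tcp://broker:7222")` (both values hypothetical) produces the property map the InitialContext constructor expects.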


Monitoring Java Applications

This topic explains how to configure monitoring for any Java application. When a properly-configured Event Monitor tool is placed at the beginning of a test suite that includes tests which invoke that Java application (directly or indirectly), it will receive and visualize the Java events that take place. Sections include:
• Why Monitor Java Applications?
• Application Configuration
• SOAtest Configuration

Why Monitor Java Applications?

By monitoring instrumented Java applications, you can gain visibility into the application's internal behavior as functional tests execute. This allows you to better identify the causes of test failures as well as enrich the use case validation criteria in your regression tests. In addition to validating the messages returned by the system and the intermediate messages/steps monitored via the ESB, you can also validate events within the Java application being invoked. For example, you can validate that an EJB method call or a remote call to another system is taking place with the expected parameters.

Application Configuration

To configure the application for monitoring, you need to instrument it with Parasoft's monitoring agent. To do this:
1. Copy insure.jar and insureimpl.jar from [SOAtest install dir]/eventmonitor to a directory on the server with the application you wish to instrument.
2. If the server is running, stop it.
3. In your startup script, add the -javaagent command to the existing Java arguments. For details, see "javaagent Command Details", page 514.
4. Restart the server. The server will start as usual, but with the specified package classes instrumented.

Now, whenever instances of objects are created or methods within the specified package prefixes are invoked, SOAtest (which can be running from another developer/QA desktop machine) will be able to receive event notifications in Event Monitor.

javaagent Command Details

Basics

The following invocation-time parameters are required in all situations:

soatest
    Required for configuring monitoring.

port=[port_number]
    Specifies which port should be used to communicate with the monitored program. Use 5050 to 5099.

instrument=[class_name.method_name(jni_sig)]
    Specifies the prefixes of fully-qualified methods to check. For instance, given com.abc.util.IOUtil.method, it will instrument all methods in IOUtil.java that start with "method". If given com.abc., it will also instrument those methods and all methods of classes whose fully qualified name starts with com.abc. Note that wildcards are not permitted. See the note below for more details.

trace=[class_name.method_name(jni_sig)]
    Specifies the filter for method calls to trace. See the note below for more details.
Note - instrument and trace

Instrumentation of a class applies to all of its method bodies; it provides visibility, for example, into what methods that class calls and what values those methods return. Tracing is implemented by also instrumenting the caller of the code you want visibility into. The called code is not instrumented. For example, assume you want to trace third-party methods called from your code, but not third-party methods called from other third-party code. In this case, you would instrument your own code and trace the calls to the third-party code.

More specifically: instrument=com.acme.util configures instrumentation for all classes matching com.acme.util. All methods of those classes are instrumented. In the following code, the writeData method will be instrumented:

package com.acme.util;
class IOUtil {
    void writeData(DataOutputStream dos, Data data) {
        dos.write(data._int);
        dos.write(data._float);
    }
}

instrument=com.acme.util,trace=java.io provides visibility into the java.io calls made by com.acme.util code. Instrumentation adds calls to the writeData() method in order to check which calls to java.io are made by that monitored code.
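The prefix semantics described in this note can be sketched as follows: an instrument (or trace) value is a comma-separated list of prefixes, and a fully qualified method name is covered if it starts with any of them. This mirrors the documented behavior only; the real agent's matching may differ in detail.

```java
// Sketch of prefix matching for instrument/trace patterns. Illustrative
// names; not the actual agent implementation.
class AgentPatterns {
    static boolean matchesAny(String prefixes, String qualifiedMethod) {
        for (String p : prefixes.split(",")) {
            String prefix = p.trim();
            if (!prefix.isEmpty() && qualifiedMethod.startsWith(prefix)) {
                return true;
            }
        }
        return false;
    }
}
```

So "com.acme.util" covers com.acme.util.IOUtil.writeData, while "com.abc." covers every method of every class under com.abc.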

The following parameters are optional:

trace_to_xml
    Tells the monitor to serialize complex Java objects into an XML representation. If this option is omitted, only primitive and toString() values of objects will be returned. This parameter is strongly recommended for use with Event Monitor. It is not applicable if you are performing only runtime error detection.

xmlsizelimit=[integer_value]
    Determines the maximum XML size limit in bytes. The default size is 20000 if this option is not specified. Applies only when trace_to_xml is used.

xmlcolsizelimit=[integer_value]
    When generating an XML representation of Java objects, determines the maximum number of elements shown for collections/maps/arrays. The first 100 elements are shown by default. Applies only when trace_to_xml is used.

xmldeeplimit=[integer_value]
    When generating an XML representation of Java objects, determines the maximum field depth included for data structures. Fields up to a depth of 8 are included by default. Applies only when trace_to_xml is used.

xmlexcl=[classes_or_fields]
    ':'-separated classes or fields to exclude from XML serialization (e.g., xmlexcl=com.acme.A:com.acme.B._field). When the pattern matches a class name, 1) fields of that class type or a derived type are excluded from serialization, and 2) method arguments and return values of that type or derived ones will not be serialized to XML. By default, the monitor excludes classes of the following types:
    - org.apache.log4j.Logger
    - java.util.logging.Logger
    - java.util.Timer
    - java.io.Writer
    - java.io.Reader
    - java.io.InputStream
    - java.io.OutputStream
    Applies only when trace_to_xml is used.

xmlinc=[classes_or_fields]
    ':'-separated classes or fields to always include in XML serialization (e.g., xmlinc=com.acme.A:com.acme.B._field). Matches for xmlinc take preference over matches for xmlexcl: if something matches xmlinc, it will always be shown, even if xmlexcl also matches it. Applies only when trace_to_xml is used.

xmlsecondlimit=[seconds]
    By default, if you are using trace_to_xml, the monitoring will spend only up to 10 seconds converting a monitored complex Java object to an XML representation. This is to prevent significant slow-downs in the monitoring agent when monitoring very large objects. If that limit is reached, the SOAtest event will show the following message instead of the XML representation of an object: SKIPPED: converting to XML takes too long: 10 seconds. If you wish to change that 10 second threshold, use the xmlsecondslimit flag (for example: xmlsecondslimit=20). For large objects, the recommended approach is to avoid reaching the threshold in the first place: reduce the XML size by excluding the fields you are not interested in (using the xmlexcl option).

trace_exceptions[=exception_class_prefix]
    Shows a trace of events related to an exception that was created, thrown, or caught. _details can be added to get more detail of the events (i.e., the stack trace where the event happens).

terse
    Configures terse output to the console (stack traces have only 1 element).

For Applications Running from Eclipse or Application Servers

Applications that define their own class loaders (e.g., Eclipse, JBoss, and Tomcat) need insure.jar added to the boot classpath. To monitor those applications, add the following to the launch VM arguments:

-javaagent:"<path_to_jar>\insure.jar=alias,port=<port>",trace_to_xml,instrument=<my.package.prefix>,trace=<my.package.prefix> -Xbootclasspath/a:<path_to_jar>\insure.jar

For instance, you may use:

-javaagent:"/home/user/parasoft/insure.jar=alias,port=5060",trace_to_xml,instrument=com.mycompany.onlinestore,trace=com.mycompany.onlinestore -Xbootclasspath/a:/home/user/parasoft/insure.jar

For Other (Standalone) Java Applications

For other (standalone) Java applications, you do NOT need to add insure.jar to the boot classpath. For instance, you may use:

java -javaagent:"C:\Program Files\Parasoft\insure.jar=alias,port=5050",trace_to_xml,instrument=com.mycompany.myapp,trace=com.mycompany.myapp
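Assembling the -javaagent argument from its parts can be sketched as below, following the pattern of the standalone example. The jar path and package prefixes are placeholders to adapt to your installation; this helper is illustrative, not part of SOAtest.

```java
// Sketch: build the -javaagent VM argument string from its components,
// mirroring the command-line examples above. Illustrative only.
class JavaAgentArg {
    static String build(String jarPath, int port, String instrument, String trace) {
        return "-javaagent:\"" + jarPath + "=alias,port=" + port + "\""
                + ",trace_to_xml"
                + ",instrument=" + instrument
                + ",trace=" + trace;
    }
}
```

For example, `build("/opt/parasoft/insure.jar", 5050, "com.mycompany.myapp", "com.mycompany.myapp")` reproduces the shape of the standalone example above (with an illustrative jar path).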

SOAtest Configuration

1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Instrumented Java Application as the platform, then specify the hostname where the server resides and the port number for the Parasoft agent runtime (5050 by default).


Monitoring Databases

This topic explains how to configure monitoring for databases. Sections include:
• Why Monitor Databases?
• Configuring Event Monitor to Monitor a Database
• Notes

Why Monitor Databases?

The Event Monitor allows you to execute a SQL query on a database of your choice after each test executes within the test suite. Although the DB Tool can be used for similar purposes, the Event Monitor Database mode is better suited for retrieving database rows when events occur as a result of your application logging messages into a database. The Event Monitor differs from the DB Tool in a number of ways:
• A single Event Monitor in your test suite can execute database queries automatically after each test execution. This relieves you from having to add a DB Tool directly after each test.
• It allows delayed query execution (see the "Event polling delay after each test finishes execution" option under the Options tab). This is important for many logging databases because the logged entries may not reach the database in real time.
• It can consolidate the database entries into a single flow of events within a test suite, while the DB Tool gives you the flexibility to execute isolated and different queries at the desired points of your use case scenario.
• It helps you isolate the entries that were added to the database during test execution.

Configuring Event Monitor to Monitor a Database

1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Database as the platform.
3. With Local selected, enter the Driver, URL, Username, and Password for the database you want to query. For details on completing these fields, see "Database Configuration Parameters", page 348 and the SOAtest forum.
4. (Optional) In the Constraint SQL Query field, enter a value that identifies the last logged value from the database before a test executes. Event Monitor expects that query to return a single value. Typically this would be a table key, an entry number, or a Timestamp.
• Using a constraint query is useful when your logging database is cumulative and is not cleaned/restored after each scenario executes. By executing this query before the test suite tests execute, the Event Monitor can distinguish the pre-existing entries from the new entries that will be logged into the database during test execution.
• Such queries would typically use the SQL MAX function. For example, the query select max(MESSAGE_TIMESTAMP) from MESSAGE_LOG assumes that you have a table named MESSAGE_LOG that contains a column named MESSAGE_TIMESTAMP of type Timestamp. It will return a single value representing the latest message entry currently present in that database. Event Monitor will execute that query first and keep that timestamp value.


5. In the Event SQL Query field, specify the SQL for retrieving the log or event entry from the database. For example, such a query might look like: select * from MESSAGE_LOG where MESSAGE_TIMESTAMP > $[CONSTRAINT] order by MESSAGE_TIMESTAMP DESC
• Note that $[CONSTRAINT] is a special SOAtest variable. It tells the Event Monitor to use the value it received from the first constraint query (described in the previous step) and automatically provide it in the event query. The event query executes after test suite execution completes (and after the delay specified in the Event Monitor's Options tab). It retrieves the rows that were added to the database after the test executed.
• Use of $[CONSTRAINT] in event queries is not required.

6. In the Options tab, modify settings as needed.
• Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.
• Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events' message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.
• If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e., the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.
• Note that the Diff tool's XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.
• Maximum monitor execution duration specifies the point at which the test should time out (in case another test in the test suite hangs, or if no other tests are being run; e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to the system).
• Event polling delay after each test finishes execution (milliseconds) specifies how long Event Monitor waits between the time the test ends and the time it retrieves the events.
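The constraint/event query pattern described in the steps above can be sketched in plain Java, treating the log table as a list of timestamps: before the tests run, record the latest timestamp (the constraint query); afterwards, report only rows logged after it (the event query with $[CONSTRAINT]). Types and names are illustrative; in SOAtest the actual work is done by the SQL queries you configure.

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the constraint/event query pattern. Illustrative only.
class ConstraintPattern {
    // Analogue of: select max(MESSAGE_TIMESTAMP) from MESSAGE_LOG
    static long constraint(List<Long> timestamps) {
        return timestamps.stream().mapToLong(Long::longValue).max().orElse(Long.MIN_VALUE);
    }

    // Analogue of: select * from MESSAGE_LOG where MESSAGE_TIMESTAMP > $[CONSTRAINT]
    static List<Long> newEvents(List<Long> timestamps, long constraint) {
        return timestamps.stream().filter(t -> t > constraint).collect(Collectors.toList());
    }
}
```

With pre-existing entries at timestamps 10 and 20, the constraint is 20, and only rows logged later (say 30 and 40) are reported as events.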

Notes

• The Event Viewer tab will display each row retrieved by the Event SQL query as a box in the event sequence flow. Double-clicking the box opens a dialog with data details.
• You can chain XML tools to a database Event Monitor. Since the database rows are output in XML, they can be diffed, validated with XML Assertor, etc.


Monitoring Stub Events

This topic explains how to configure monitoring for events that occur within stubs you have deployed using SOAtest (to virtualize services or other resources that are unavailable for testing or that you do not want to access during testing). Sections include:
• Why Monitor Stub Events?
• Prerequisites
• Configuration
• Using An Alternative JMS System for Stub Event Monitoring

Note: For details on SOAtest's stubbing (virtualization) functionality, see "Service Virtualization: Creating and Deploying Stubs", page 531.

Why Monitor Stub Events?

Visibility into what messages are sent to and from stubs, and what validations and errors occur at the stub level, enables you to:
• Validate the messages that the application sends to your stubs.
• See what errors occur based on how your application interacts with the stubbed services.

In a common test situation, SOAtest will send a message to a system, such as an available service or a web browser, which will then send a message to another service or system that you have stubbed (e.g., because it is not yet available or not accessible for testing).

[Figure: message flow among the SOAP Client tool (with chained client tester tools and an Event Monitor), System A, and the stub. Messages numbered 1 through 4 are exchanged, with validation occurring at the stub.]

By adding a stub Event Monitor tool to the test suite, you gain insight into messages 2 and 3 as well as messages 1 and 4. You also see the result of any validation tools you may have attached to the stub (for instance, XML Assertor tools) and receive details on any stub errors that may have occurred (for instance, because the stub was not properly configured to process valid messages, or because invalid messages were sent).


Prerequisites

Stub event monitoring applies specifically to "hosted stubs": stubs that are available for continuous access. It is not relevant to Message Stub tools that are integrated into end-to-end test suites as described in "Hosted Deployment vs. Scenario-Mode Integration", page 542.

Note that firewalls running where SOAtest is running can sometimes block communication between the Event Monitor and a remote Stub Server. If you are using the Windows Vista firewall, it needs to be disabled prior to using Event Monitor with a remote Stub Server.

Configuration

1. Add an Event Monitor tool to the test suite that drives the interaction with the stub you want to monitor.

2. In the Event Source tab of the Event Monitor tool’s configuration panel, select SOAtest Stub Server as the platform.

3. Under Event Reporting Provider, ensure that SOAtest Builtin Provider is selected.
• If you want to use a different JMS system as the stub server (e.g., to scale for load testing), see “Using An Alternative JMS System for Stub Event Monitoring”, page 522.

4. If you want to monitor stub events on a remote SOAtest server, select Remote SOAtest Server and specify that server’s URL. Otherwise, the local SOAtest server will be used.

5. Under Stub Event Subscriptions, specify what type of stub events you want to monitor. Available options are:
• Request messages: The message sent to the stub. For instance, in the image shown in Why Monitor Stub Events?, this would be message #2.
• Response messages: The message that the stub returns. For instance, in the image shown in Why Monitor Stub Events?, this would be message #3.
• Message validation results: The result of any validation tools you may have attached to the stub’s Message Stub tools—such as XML Assertor tools. For instance, in the image shown in Why Monitor Stub Events?, this would be whatever tool is represented by the "validation" marker.
• Stub errors: Any stub errors that may have occurred (e.g., because Message Stub tools were not properly configured to process valid messages, or because invalid messages were sent). For instance, in the image shown in Why Monitor Stub Events?, if the stub was not configured to route message #2 to a specific Message Stub tool, this would be reported as an error.

6. Under Test Failure Criteria, specify the test failure criteria. Note that if a validation tool is chained to the current Event Monitor tool, the outcome of that validation will determine this test’s success or failure—and the following settings are not applicable. Available options include:
• Stub error events: The test will fail if any stub errors occur (e.g., because Message Stub tools were not properly configured to process valid messages, or because invalid messages were sent).
• Stub validation failure events: The test will fail if a failure is reported by any validation tool (such as an XML Assertor tool) you may have attached to the stub’s Message Stub tool.
• No events received: The test will fail if the stub does not receive any events before the current test suite (the one that includes the Event Monitor tool) completes execution.


7. In the Options tab, modify settings as needed.
• Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.
• Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.
  • If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e., the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.
  • Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.
• Maximum monitor execution duration specifies the point at which the test should time out—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to the system).
• Event polling delay after each test finishes execution (milliseconds) is not applicable here.

Notes
• The Event Viewer tab will display details about the stub events received. It indicates the stub name, the name of the Message Stub tool that responded to that message, the response message, results of validation tools (if available), and any stub errors that occurred. Double-clicking an item opens a dialog with additional details.
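The practical effect of the CDATA/escaping option can be sketched with standard XML tooling (this is an illustration, not SOAtest code; the event element name is invented):

```python
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape

msg = "<book><title>SOA</title></book>"  # a monitored message that happens to be XML

# Option disabled: the message is embedded as real XML, so XPath can reach inside it
embedded = ET.fromstring("<event>" + msg + "</event>")
print(embedded.find("./book/title").text)  # the title element is reachable via XPath

# Option enabled: the message is escaped into plain string content, so XPath cannot
# reach inside it, but the event document stays well-formed even for non-XML payloads
root = ET.fromstring("<event>" + escape(msg) + "</event>")
print(root.find("./book/title"))           # None: the payload is just text
print(root.text)                           # the raw message, recoverable as a string
```

This mirrors the trade-off described above: escaped content is safe but opaque to XPath, while embedded content is XPath-addressable only when it is well-formed XML.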

Using An Alternative JMS System for Stub Event Monitoring

By default, stubs are monitored using SOAtest’s built-in JMS-based provider. Alternatively, you can use another JMS system that you have—for instance, if you want to scale for load testing. To configure this:

1. Open the Stub Server view (if it is not available, choose Window> Show View> Stub Server).
2. Double-click the node for the machine (local or remote) you want to configure to use the stub event reporting provider.
3. Open the Event Reporting tab in the machine configuration panel.
4. In the Event Reporting Provider field, select your preferred server. If you want to use a JMS server that is not specifically listed, choose Other JMS Provider.


5. Specify the connection settings. The available fields are described in:
• Monitoring IBM WebSphere ESB
• Monitoring Sonic ESB
• Monitoring TIBCO EMS
• Monitoring Other JMS Systems

Event Reporting Destination - Configuration Needed
Note that a default event reporting destination and type are specified in the available controls. You need to either:
• Configure your JMS system to use this default destination, or
• Change the SOAtest settings to another destination that is available on your system.

6. In the Event Monitor tool configuration panel, open the Event Source tab, select the appropriate Event Reporting Provider, and specify the settings required to connect to your JMS. Again, the available fields are described in:
• Monitoring IBM WebSphere ESB
• Monitoring Sonic ESB
• Monitoring TIBCO EMS
• Monitoring Other JMS Systems


Monitoring a Custom API-Based Events Source

This topic explains how to configure monitoring for a custom API-based event source. To configure the Event Monitor tool to monitor a custom API-based event source:

1. Double-click the Event Monitor tool to open the tool configuration panel.
2. In the Event Source tab, select Custom API-Based Events Source as the platform, then configure the following options:
a. In the Connection area, specify your connection settings.
b. In the Event Retrieval area, specify the event retrieval pattern you want to use (polling at a specified time interval, polling after each test execution, or subscribing to an event producer).
c. In the User Code area, specify the location of your custom event monitoring application or scripts.
• See “Extensibility API Patterns”, page 525 for details.

3. In the Options tab, modify settings as needed.
• Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.
• Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.
  • If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e., the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.
  • Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.
• Maximum monitor execution duration specifies the point at which the test should time out—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to the system).
• Event polling delay after each test finishes execution (milliseconds) specifies how long Event Monitor waits between the time the test ends and the time it retrieves the events.


Extensibility API Patterns

This topic describes the Extensibility API, which allows for capturing events by polling at specified time intervals, polling after test execution, and subscribing to an event producer. Sections include:
• Available Patterns
• Using “Poll at specified time intervals” and “Poll after each test execution” Patterns
• Using the “Subscribe to an event producer” pattern

Where are the Javadocs?
The Javadocs for the Monitor Tool API can be accessed by choosing SOAtest> Help> Extensibility API. The resources that directly concern the Event Monitor tool are com.parasoft.api.IEvent, its default implementation adapter com.parasoft.api.Event, and com.parasoft.api.EventSubscriber.

Available Patterns

The Extensibility API allows for capturing events using one of three different patterns:
• Poll at Specified Time Intervals
• Poll after Each Test Execution
• Subscribe to an Event Producer

Poll at Specified Time Intervals

This pattern is useful when the events in the system you are trying to monitor are not synchronized with your test execution. For example, you may be interested in pulling data from a logging framework which logs events at specified time intervals, and therefore you wish to capture the logged events in SOAtest at the same intervals in order to ensure that the events are actually retrieved. An analogy to this is somebody being a fan of a particular monthly magazine, but rather than subscribing to it in order to receive every issue that is published, the reader visits a book store every month and obtains the latest issue. A point of interest with this pattern is that if you poll for events before new ones have been made available, then you may get the same events as the last poll or no events at all—depending on how your target framework behaves. This is just like our example reader possibly going to the store and finding only last month's issue since the current month's issue has not been released yet.

With this pattern, you specify a time amount (in milliseconds) which serves as the wait period between executions of your script. For example, if you specify the interval to be 1000 milliseconds, the Event Monitor tool will execute the code you provide every 1 second. This will continue until one of two events takes place:
1. If the Event Monitor is being executed as part of a test suite with other tests, the periodic user code execution will stop as soon as the last test in the test suite (or the last row in the data source, if the test suite tests are iterating over a data source) has finished execution.
2. The maximum monitor execution duration (the value is configured under the Options tab) has been reached.
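The interval-polling behavior can be sketched in plain Python (a generic illustration, not the SOAtest API; poll_events and fetch_latest are invented names):

```python
import time

def poll_events(fetch_latest, interval_ms, max_duration_ms):
    """Call fetch_latest() every interval_ms until max_duration_ms elapses.
    Like the magazine reader arriving before a new release, a poll may return
    the same event as last time, so consecutive duplicates are dropped."""
    events = []
    deadline = time.monotonic() + max_duration_ms / 1000.0
    while time.monotonic() < deadline:
        event = fetch_latest()                     # may repeat the previous event
        if event is not None and (not events or events[-1] != event):
            events.append(event)                   # keep only newly observed events
        time.sleep(interval_ms / 1000.0)
    return events

# Simulated source that re-serves the old "issue" before the new one appears
issues = iter(["January", "January", "February"])
seen = poll_events(lambda: next(issues, "February"), interval_ms=1, max_duration_ms=100)
print(seen)                                        # ['January', 'February']
```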


Poll after Each Test Execution

This pattern is useful when the events in the system you are trying to monitor are generally in sync with your test suite's test execution events. For example, your system's logging framework is triggered immediately after events occur in the system, and the events are made available for you to obtain. With this pattern, the code you provide will be executed immediately after each test in the test suite executes. This pattern is not applicable if you run the Event Monitor tool independently (apart from executing the entire test suite which contains it). The Event Monitor will stop once:
1. The last test in the test suite (or the last row in the data source, if the test suite tests are iterating over a data source) has finished execution.
2. The maximum monitor execution duration (the value is configured under the Options tab) has been reached.

Subscribe to an Event Producer

This is probably the most common pattern, and Parasoft recommends its use whenever possible. This pattern is useful when the framework you are trying to monitor can allow a subscriber to be triggered immediately upon the occurrence of an event—in other words, it can perform a “call back” function. For example, the JMS publish/subscribe message pattern is an example of this pattern, and it is the pattern used to drive the Sonic, TIBCO, and other built-in platforms supported in the Event Monitor tool. In our magazine reader example, this is similar to subscribing to the publication so the latest issue gets delivered to the reader as soon as it is published.
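The callback idea can be sketched generically (illustrative class names, not the com.parasoft.api types):

```python
class EventProducer:
    """Stand-in for a framework with publish/subscribe callbacks (e.g., JMS topics)."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, event):
        for callback in self._subscribers:
            callback(event)                  # each subscriber is triggered immediately

class MonitorSubscriber:
    """Mirrors the start()/stop() lifecycle described for EventSubscriber."""
    def __init__(self, producer):
        self._producer = producer
        self.received = []

    def start(self):
        self._producer.subscribe(self.received.append)   # connect and subscribe
        return True

    def stop(self):
        return True                          # a real system would unsubscribe here

producer = EventProducer()
subscriber = MonitorSubscriber(producer)
subscriber.start()
producer.publish("order-created")            # delivered with no polling delay
print(subscriber.received)                   # ['order-created']
```

Unlike the two polling patterns, no events can be missed or double-counted here, which is why this pattern is recommended when the target framework supports callbacks.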

Using “Poll at specified time intervals” and “Poll after each test execution” Patterns

When using these two patterns, the method you select in the User Code section will be executed in accordance with the respective pattern. You may give the method any name you wish (be sure it is selected in the Method menu). It must have the following signature:

IEvent getEvent(String url, String username, String password, Object connection, ScriptingContext context)

• url (String): the value that is provided in the URL field of the Event Monitor Connection section.
• username (String): the value that is provided in the username field of the Event Monitor Connection section.
• password (String): the value that is provided in the password field of the Event Monitor Connection section.
• connection (Object): this object can optionally be provided in order to maintain and reuse the same connection over multiple script executions of the Event Monitor. See “Maintaining Connections”, page 527 for details.
• context (com.parasoft.api.Context): standard SOAtest scripting context that allows for accessing variables and data source values, and for setting/getting objects shared across test executions.

Example (Jython)

from com.parasoft.api import *
def getEvent(url, username, password, connection, context):
    return Event("Hello!")


The code in the getEvent() method handles event retrieval from the system you wish to monitor and returns an implementation of com.parasoft.api.IEvent. In this example, we return com.parasoft.api.Event, an adapter implementation of that interface, constructed with the simple String object “Hello!”.

Maintaining Connections

In practice, it is often useful to create a connection to the remote system you want to monitor and reuse that connection for retrieving events (instead of creating a new connection on each User Code invocation). Therefore, in addition to having a method for retrieving an IEvent object as described above, you may choose to add two optional methods:

Object createConnection(String url, String username, String password, com.parasoft.api.Context context)

and

Object destroyConnection(Object connection, com.parasoft.api.Context context)

createConnection would create and return an object handle to the monitoring connection, while destroyConnection takes that same object and allows you to provide code for a graceful disconnection. The Event Monitor looks for the presence of these two optional methods. If you want to add them, be sure to use the exact method signatures. createConnection() is invoked once at the beginning of the Event Monitor execution, and destroyConnection() is invoked once at the end. The event retrieval method—for example, getEvent() above—is potentially invoked multiple times during the Event Monitor test run, in accordance with the selected pattern. The connection object you create with the createConnection method is passed to the event retrieval method so you can use that connection to return an event.

Example

from com.parasoft.api import *
def getEvent(url, username, password, connection, context):
    return connection.getEvent()
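The invocation order just described can be sketched as a plain-Python driver (an illustration of the lifecycle, not SOAtest's internal code; all names are invented):

```python
def run_monitor(create_connection, get_event, destroy_connection, polls):
    """createConnection once, the event-retrieval method potentially many times
    (always receiving the same connection object), destroyConnection once."""
    connection = create_connection()
    events = []
    try:
        for _ in range(polls):
            event = get_event(connection)    # the connection is reused every poll
            if event is not None:
                events.append(event)
    finally:
        destroy_connection(connection)       # graceful disconnect, even on error
    return events

log = []
events = run_monitor(
    create_connection=lambda: (log.append("connect"), {"queue": ["e1", "e2"]})[1],
    get_event=lambda conn: conn["queue"].pop(0) if conn["queue"] else None,
    destroy_connection=lambda conn: log.append("disconnect"),
    polls=3,
)
print(events, log)                           # ['e1', 'e2'] ['connect', 'disconnect']
```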

Using the “Subscribe to an event producer” pattern With this pattern, Event Monitor expects a single method (with any name you wish—as long as the name is selected in the Method drop down menu) and with the following signature:

EventSubscriber getEventSubscriber(String url, String username, String password, Context context)

The arguments descriptions are provided in the previous patterns. In this case, you need to provide a Java implementation of an EventSubscriber (by inheriting from com.parasoft.api.EventSubscriber). The methods to implement are: public boolean start() Throws Exception

527

Extensibility API Patterns

and public boolean stop() Throws Exception

The start method will be invoked automatically by Event Monitor when the monitoring begins, and the stop method will be invoked automatically when the Event Monitor Execution finishes. The assumption under this pattern is that your EventSubscriber implementation would take care of connecting to your target system and subscribe to its event producing framework in start() and then unsubscribe and disconnect in stop(). An example implementation for subscribing to TIBCO EMS message monitoring topics is provided below. This actually mirrors the pattern used by the built-in TIBCO EMS platform of the Event Monitor.

Example

This example is also available under the examples scripting directory that ships with SOAtest. It is included as an Eclipse project in a zip archive that can be imported into your Eclipse workspace. It requires tibjms.jar from TIBCO EMS and parasoft.jar to be added to the classpath in order to build and run. parasoft.jar is available at [SOAtest install directory]/plugins/com.parasoft.xtest.libs.web_1.0.0/root

import com.parasoft.api.*;
import com.tibco.tibjms.*;
import javax.jms.*;

public class TIBCOEventSubscriber extends EventSubscriber {
    private ConsumerRunnable _consumerRunnable;
    private Connection _connection;
    private String _dest;
    private boolean _started = false;

    public TIBCOEventSubscriber(String url, String username, String password, String destination) throws JMSException {
        _dest = destination;
        QueueConnectionFactory factory = new TibjmsQueueConnectionFactory(url);
        _connection = factory.createQueueConnection(username, password);
    }

    public boolean start() throws Exception {
        Session session = _connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Destination destination = session.createTopic(_dest);
        MessageConsumer msgConsumer = session.createConsumer(destination);
        _connection.start();
        _started = true;
        _consumerRunnable = new ConsumerRunnable(msgConsumer);
        Thread thread = new Thread(_consumerRunnable);
        thread.start();
        Application.showMessage("Monitoring started on " + _dest);
        return true;
    }

    public boolean stop() throws Exception {
        _started = false;
        Thread.sleep(1000);
        if (_connection != null) {
            _connection.close();
            Application.showMessage("Monitoring connection closed");
        }
        if (_consumerRunnable != null && _consumerRunnable.getException() != null) {
            throw _consumerRunnable.getException();
        }
        return _started;
    }

    private class ConsumerRunnable implements Runnable {
        private MessageConsumer msgConsumer;
        private Exception t;

        public ConsumerRunnable(MessageConsumer msgConsumer) {
            this.msgConsumer = msgConsumer;
        }

        public void run() {
            while (_started) {
                try {
                    MapMessage msg = (MapMessage) msgConsumer.receive(500);
                    if (msg == null) {
                        continue;
                    }
                    Message actualMessage = null;
                    try {
                        byte[] bytes = msg.getBytes("message_bytes");
                        if (bytes != null && bytes.length > 0) {
                            actualMessage = Tibjms.createFromBytes(bytes);
                        }
                    } catch (JMSException e1) {
                    }
                    // you can provide your own extension of Event in order to customize
                    // the event getLabel(), toString() and toXML() outputs
                    IEvent event;
                    if (actualMessage == null) {
                        event = new Event(msg);
                    } else {
                        event = new Event(actualMessage);
                    }
                    event.setTimestamp(msg.getJMSTimestamp());
                    onEvent(event);
                } catch (JMSException e) {
                    Application.showMessage(e.getClass().toString() + ": " + e.getMessage());
                    try {
                        msgConsumer.close();
                    } catch (JMSException e1) {
                        Application.showMessage(e1.getClass().toString() + ": " + e1.getMessage());
                    }
                    t = e;
                    _started = false;
                }
            }
        }

        public Exception getException() {
            return t;
        }
    }
}


Generating Tests from Monitored Transactions

In addition to providing visibility into your test transactions, SOAtest can monitor the system while real transactions execute, and then generate tests automatically from these messages. This includes tests for transaction entry/exit points, as well as tests for intermediate parts of the transaction. Benefits include:
• Accelerate test case creation by running real transactions through the system, then generating tests that can replay them.
• Consume real messages with real data into tests from monitoring.
• Debug problems and reduce complexity by isolating components in an integrated system and replaying messages that trigger only certain parts of the transaction.

For details, see:
• “Creating Tests From Sonic ESB Transactions”, page 404
• “Creating Tests From TIBCO EMS Transactions”, page 407
• “Creating Tests From JMS System Transactions”, page 401


Service Virtualization: Creating and Deploying Stubs

In this section:
• Understanding Stubs
• Creating Stubs from Functional Test Traffic
• Creating Stubs from Recorded HTTP Traffic
• Creating Stubs from WSDLs or Manually
• Working with Stubs
• Configuring Stub Server Deployment Settings


Understanding Stubs

Evolving services in a distributed SOA environment, and across multiple teams, is a complex endeavor due to the interdependencies between the system and business processes. For example, in a system that incorporates multiple endpoints such as credit card processing, billing, shipping, etc., it may be difficult for one team to test the responses from another team without interrupting normal business transactions.

With SOAtest’s stub generation capability, you can test complex and distributed environments by automatically generating server stubs based on existing test suites. SOAtest can quickly and easily automate server emulation across multiple environments, thereby streamlining the collaborative development and testing activities of multiple teams and ultimately speeding the SDLC. SOAtest can:

1. Replace the various endpoints in the system architecture (services, applications, mainframes, stored procedures, etc.) as well as emulate the application behavior at the unit level.
2. Add stubs to the test environment to replace the behavior of various components that would otherwise inhibit your team’s ability to effectively exercise and validate the end-to-end business process.

Constructing Stubs

Stubs can be constructed in several main ways:
• Emulation Based on Historical Data
• Contextual Emulation
• Using Data Sources (for unavailable services)

Emulation Based on Historical Data

You can automatically emulate services based on real-world historical data (request/reply sets) collected from the runtime environment. Once you have captured a set of messages from the actual system (via monitoring, tracing, log files, etc.), SOAtest can create an emulated version of the service based on those same messages. For example, you can obtain runtime message sets from a runtime monitoring system such as the AmberPoint Management System, and SOAtest can then automatically generate stubs that represent the monitored behavior.

Contextual Emulation

In some cases, it’s necessary to have a better understanding of the context around these messages in order to produce these emulated services intelligently. When this contextual understanding is important, you can create a functional test that models the scenario that you want emulated (you simply interact with the actual components to be emulated), then have Parasoft automatically generate stubs that emulate the behavior monitored when executing the modeled scenario. This significantly reduces the resources required to intelligently mirror the real-world behavior of complex, distributed, and heterogeneous environments.

For example, assume you are interacting with Amazon Web Services (AWS). In order to emulate them, you can model a scenario that interacts with the actual Amazon services, then automatically “flip” that scenario into an emulated version. With the emulated version deployed locally, you gain control over its behavior for your testing environment (instead of relying on Amazon). This way, you can easily emulate error conditions, realistic delays, and so forth.


Using Data Sources to Define Behavior for Unavailable Services

Even if a system is completely unavailable, you can rapidly create an emulated version of the necessary services from scratch. For example, you can automatically create a stub skeleton from a WSDL, use spreadsheets or other data sources to define the desired behavior, and then visually correlate request parameter values with desired response values.
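The spreadsheet-driven correlation can be sketched like this (the column names and values are invented for illustration):

```python
import csv
import io

# A spreadsheet-like data source correlating a request value with a response value
data_source = io.StringIO(
    "accountType,creditLimit\n"
    "gold,10000\n"
    "standard,2000\n"
)
table = {row["accountType"]: row["creditLimit"] for row in csv.DictReader(data_source)}

def stub_response(account_type):
    """Return the response a from-scratch stub would send for this request value."""
    limit = table.get(account_type, "0")     # fall back for unknown request values
    return "<response><creditLimit>%s</creditLimit></response>" % limit

print(stub_response("gold"))   # <response><creditLimit>10000</creditLimit></response>
```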

Extending Beyond Services

SOAtest’s stub generation is not limited to service emulation. The solution’s extreme extensibility and easy customization allow you to stub any component or protocol that is creating a dependency problem in the test environment. For example, you can use SOAtest to:
• Monitor what happens inside an ESB or another system as it is executed and create stubs that emulate the monitored behavior.
• Emulate the other components that the bus is talking to—for instance, a CRM application, or a partner’s service which in turn calls a Software as a Service application.
• Use unit-level stubbing to emulate the application logic’s calls to other applications, legacy systems, etc.
• Configure the test scenario to set up and clean up test data on the actual databases so that the actual data set is not impacted by test transactions.
• Simulate the behavior of a user interacting with the system via a browser.

Customizing Stubs

To ensure that the emulated assets are flexible and robust enough to represent even the most complex test scenarios, Parasoft provides easy ways to configure stubs. For example, you can use the tool configuration panel to customize stubs with different request/response use cases, error conditions, delays, and so forth. To quickly configure responses with a wide variety of values, you can set the stub to use automatically-generated inputs for specified operations, or feed it a range of values that are stored in a data source. In addition, if you have scripts or code that represents a custom response, you can integrate it directly into the emulated environment. This means that you can extend the stub to mimic any level of processing—no matter how complex or sophisticated.

The configured stubs can be deployed locally or made available as a service so that different teams or business partners can collaborate on evolving their components within the distributed architecture. If the emulated asset changes as the application evolves (for example, the WSDL for an emulated service is extended to include a new operation), the associated stub does not need to be re-created; it can be updated.

Stubs can be made available for continuous access (i.e., be deployed as "hosted stubs") or they can be integrated into an end-to-end test suite. For instance, to validate a loan approval service that executes a business process workflow with multiple steps (including calling a manager approval service and an external credit rating service), you could construct a scenario with the following tests:
• Test 1: Send a request to a loan approval service to initiate the process.
• Test 2: Act as a stub to listen for the incoming credit rating service request over HTTP and respond with the desired rating for the scenario (emulate the credit rating service response).
• Test 3: Act as a stub to consume the manager approval message over a JMS queue and respond with approval, denial, etc.




• Test 4: Get an asynchronous response from the loan process with the final loan result and validate it.
• Test 5: Execute a query against a relational database to check if the loan application was audited in the database properly.
• Test 6: Remove the application data from the database in order to restore it to the original state and make the test scenario repeatable.

Creating Stubs Specific to the Travel Industry

Certain travel industry XML services are built around plain XML messaging where the XML request is URL-encoded under a parameter named xmlRequest. SOAtest stubs such services, responding based on values within the URL-encoded XML request, without requiring any additional configuration. SOAtest stubs recognize incoming requests with Content-Type application/x-www-form-urlencoded and expect to read the request XML from the HTTP form parameter "xmlRequest". Multiple responses mode can be used so that you can build XPaths to correlate the desired response message with values in the XML request.
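The mechanism can be sketched with standard library tooling; the xmlRequest parameter name and Content-Type come from the description above, while the element names and response table are invented:

```python
import urllib.parse
import xml.etree.ElementTree as ET

# Client side: the XML request is URL-encoded under the "xmlRequest" form parameter
request_xml = "<AvailRequest><city>LAX</city></AvailRequest>"
body = urllib.parse.urlencode({"xmlRequest": request_xml})

# Stub side: read the form parameter, parse the XML, and correlate a response
# by XPath into the request (the "multiple responses" correlation)
params = urllib.parse.parse_qs(body)
incoming = ET.fromstring(params["xmlRequest"][0])
city = incoming.find("./city").text

responses = {
    "LAX": "<AvailResponse><hotels>12</hotels></AvailResponse>",
    "JFK": "<AvailResponse><hotels>7</hotels></AvailResponse>",
}
print(responses[city])   # <AvailResponse><hotels>12</hotels></AvailResponse>
```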


Creating Stubs from Functional Test Traffic

This topic explains how to create stubs from existing functional test suite traffic (from SOAP Client tests that use HTTP, JMS, or MQ, as well as Messaging Client tests). If the service you want to emulate is available, this is typically the most effective way to create stubs. Sections include:
• Creating Stubs
• Using the Stubs

Creating Stubs

To create stubs from existing test suite traffic:
1. Design a functional test suite that represents the traffic that you want the stub to emulate. For instance, you can create a test suite based on historical data (request/reply sets from AmberPoint, traffic, etc.). Or, you can design a functional test suite that sets up a sequence of events representing the contextual application behavior you want emulated.
2. Run the test suite from which you want to create stubs.
3. Select the test suite’s Test Case Explorer node, then choose Create Stub from the shortcut menu.
4. In the wizard that opens, modify the stub file name if desired, then click Next.


5. Modify the stub name and path if desired, then click Next.

6. (Optional) Specify which parameter values in the request message should be used to determine the response messages of the corresponding stub (i.e., which request message parameters you want SOAtest to evaluate when determining which response to send). In the Message Stub tool configuration, you will be able to specify the response that the stub should return when a message matching one of these "relevant" values is received.
• Choose Automatic if you want SOAtest to automatically select the parameters that vary from request to request.
• If you have a complex service and/or want to manually specify which parameters are relevant, choose Select Relevant Parameters, then use the available controls to indicate which parameters you want to use.
For instance, assume that the request message has 3 parameters: x, y, z. If you want SOAtest to consider the values of y and z in determining what response the stub should send, you would specify y and z here. The selection of y and z would then be reflected in the generated Message Stub tool’s Response settings. This is where you would specify what responses you want the stub to return.

7. Click the Finish button. SOAtest automatically creates a stub based on the existing test suite. Stubs are implemented as a set of Message Stub tools and are grouped in a .tst file with the specified name and in the specified location. The resulting .tst file of Message Stub tools is added to a "stubs" project in the SOAtest workspace and deployed to the local stub server.


You can customize stubs with different request/response use cases, error conditions, delays, and so forth as described in “Message Stub”, page 784.

MQ Note: If you are working with IBM WebSphere MQ, you also need to provide the MQ connection settings in the stub configuration editor as described in “Configuring MQ for Stubs”, page 556.

Creating Stubs from a Messaging Client

The Messaging Client’s response messages or requests do not have to be XML. However, if you want to use request/response correlation (i.e., have SOAtest automatically discover the changing parameters and configure the XPaths for multiple responses), the request needs to be XML. If the request is not XML, the Message Stub will be configured with a single response. If more than one test uses JMS, the first test with a JMS configuration will be used to configure JMS for the stub.

Using the Stubs

For details on how to customize, deploy, and exercise stubs, see “Working with Stubs”, page 541.


Creating Stubs from Recorded HTTP Traffic

If you can access a message log or a trace of the traffic between clients and servers, you can create stubs from that data. Specifically, you can have SOAtest automatically create Message Stubs that respond to incoming messages in a way that mimics the behavior captured in the traffic. Such message traces or logs can be captured at the network level using network sniffing tools such as the free Wireshark tool (http://www.wireshark.org/), or obtained by having your application log its traffic. The format that SOAtest expects is fairly loose. It can parse a wide range of formats—as long as it can find HTTP headers and request/response messages in sequence. For details on how to create stubs from messages in a plain text traffic log/trace file, see “Creating Tests From Traffic”, page 410.
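As an illustration of the kind of log this covers, the sketch below splits a minimal, invented plain-text trace into request/response messages by locating HTTP request and status lines in sequence. SOAtest’s actual parser is far more tolerant of format variations; this only shows the general idea:

```python
import re

# Invented minimal traffic log: one request followed by its response.
SAMPLE_LOG = """\
POST /services/quote HTTP/1.1
Content-Type: text/xml

<getQuote><symbol>PRS</symbol></getQuote>

HTTP/1.1 200 OK
Content-Type: text/xml

<quote>42.00</quote>
"""

def split_messages(log):
    """Split a log into HTTP messages, each starting at a request or status line."""
    starts = [m.start() for m in re.finditer(r"(?m)^(?:GET|POST|PUT|DELETE|HTTP/)", log)]
    return [log[a:b].strip() for a, b in zip(starts, starts[1:] + [len(log)])]

messages = split_messages(SAMPLE_LOG)
# Consecutive messages form the request/response pair the stub would mimic.
request, response = messages[0], messages[1]
```

A stub generated from such a pair would answer the recorded request with the recorded response.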


Creating Stubs from WSDLs or Manually

If the service you want to emulate is not yet complete or accessible, you will not have any existing traffic—and thus cannot use the stub creation method described in “Creating Stubs from Functional Test Traffic”, page 535. However, you can create stubs from the WSDL test creation wizard. Alternatively, you can define stubs "from scratch" by adding Message Stub tools to the test suite and configuring them manually. This topic explains how to create stubs even if you do not have functional tests that exercise the services you want to stub. Sections include:
• Adding Stubs
• Using the Stubs

Adding Stubs

If you have access to the WSDL, you can have SOAtest automatically add and configure Message Stub tools based on the operations defined in the WSDL. This is accomplished using the WSDL test creation wizard (described in detail in “Creating Tests From a WSDL”, page 385). If you do not have access to the WSDL, if the services you want to stub are REST services, or if you simply want more control over the stub configuration process, you can manually add Message Stub tools to the test suite.

Generating Stubs from a WSDL

To generate stubs from a WSDL, complete the following:

1. Start completing the WSDL test creation wizard as normal.

2. In the WSDL page, select the Create functional tests from the WSDL checkbox and choose the Generate Server Stubs button.

3. Continue completing the wizard.


After you click Finish, a test suite will be created containing stubs for the methods declared in the WSDL that you entered. Each test in the test suite is composed of a Message Stub tool.

Defining Stubs Manually

To define stubs manually, add Message Stub tools to the test suite as described in “Adding Standard Tests”, page 331. Typically, each Message Stub tool represents an operation in the service.

Using the Stubs

For details on how to customize, deploy, and exercise stubs, see “Working with Stubs”, page 541.


Working with Stubs

This topic explains how to customize, deploy, and exercise stubs. Sections include:
• Locating and Storing Stubs
• Customizing Stub Behavior
• Using Data Sources with Stubs
• Understanding Deployment Options
  • Hosted Deployment vs. Scenario-Mode Integration
  • Dedicated (Remote) Stub Servers vs. The Local Stub Server
  • Recommended Workflow
• Configuring the Local Stub Server
• Configuring a Dedicated Stub Server
• Manually Deploying Stubs
• Refreshing Stubs
• Validating Stubs
• Interacting with Stubs
• Gaining Visibility into Stub Behavior
• Tutorial
• Stub Server View - GUI Reference

Locating and Storing Stubs

The "stubs" project is the default and recommended location for stub .tst files. It is added to the SOAtest workspace when a stub is deployed (either from the Stub Server view or when creating a stub from traffic). Any .tst files added to this project will be automatically deployed to the local stub server. This project contains a stubs.xml file that stores the deployment configuration for each stub—including the location of each stub's .tst file, name and HTTP endpoint path, global reporting settings, and JMS and WebSphere MQ transport settings. This file is automatically saved when SOAtest exits.

The Stub Server view is where you manage and interact with the local stub server as well as dedicated stub servers running on remote machines. To open it, choose Window> Show View> Stub Server. For an overview of the available buttons and commands, see “Stub Server View - GUI Reference”, page 552. If you want to save local stub settings before exiting SOAtest, right-click the Local machine node in the Stub Server view and choose Save Stub Deployment Changes.


Customizing Stub Behavior

You can customize the behavior of the emulated assets—with different request/response use cases specified manually or via data sources, error conditions, delays, and so forth—by customizing the automatically-generated (or manually added) Message Stub tools as described in “Message Stub”, page 784.

Using Data Sources with Stubs

Specifying response values in a data source is a very efficient way to add a significant volume of request/response pairs. For details on how to use an existing data source or rapidly create a new one, see “Using Existing Data Sources or Rapidly Creating Data Sources for Responses”, page 789.

Understanding Deployment Options

Hosted Deployment vs. Scenario-Mode Integration

Hosted Deployment

The typical usage model for stubs is "hosted deployment": the stub for the Message Stub tools’ test suite is continuously available in the background as a "hosted" stub. In this mode, the stub will wait for an appropriate message, then respond in the specified manner whenever the designated stub endpoint is contacted. This can continue as long as the stub server is running. These stubs can be deployed on a local stub server or a remote stub server. See “Dedicated (Remote) Stub Servers vs. The Local Stub Server”, page 543 for a discussion of these options.

Integrating Message Stub Tools into an End-to-End Test Scenario

An alternate usage model is to have a Message Stub tool invoked as part of an end-to-end test scenario. With this configuration, SOAtest’s local stub server will start listening for a message when that specific Message Stub test step is called as part of the test sequence. It will consume the message, then react in the manner defined in the Message Stub tool’s configuration. After this completes, the next test in the test scenario will execute. You can cut/copy and paste automatically-generated or manually-defined Message Stub tools into an end-to-end test scenario so that they are invoked at the desired point in the test suite execution flow and can trigger additional test actions. For instance, to validate a loan approval service that executes a business process workflow with multiple steps (including calling a manager approval service and an external credit rating service), you could construct a scenario with the following tests:
• Test 1: Send a request to a loan approval service to initiate the process.
• Test 2: Act as a stub to listen for the incoming credit rating request over HTTP and respond with the desired rating for the scenario (emulate the credit rating service response).
• Test 3: Act as a stub to consume the manager approval message over a JMS queue and respond with approval, denial, etc.
• Test 4: Get an asynchronous response from the loan process with the final loan result and validate it.
• Test 5: Execute a query against a relational database to check if the loan application was audited in the database properly.
• Test 6: Remove the application data from the database in order to restore it to the original state and make the test scenario repeatable.

These stubs can be parameterized with data from a data source. For details on constructing end-to-end test scenarios, see “End-to-End Test Scenarios”, page 306.
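The flow above can be sketched conceptually: in scenario mode, a Message Stub step only listens while it is the active step, consumes one message, responds, and then execution moves on. All names, messages, and replies below are invented; this is not how SOAtest executes scenarios internally:

```python
# Hypothetical sketch of scenario-mode sequencing with embedded stub steps.
def run_scenario(steps, incoming):
    """Run steps in order; stub steps each consume one inbound message."""
    log = []
    for step in steps:
        if step["type"] == "stub":
            msg = incoming.pop(0)  # listen only now, consume one message
            log.append(f"{step['name']} consumed {msg} -> {step['reply']}")
        else:
            log.append(f"{step['name']} executed")
    return log

steps = [
    {"type": "client", "name": "Test 1: loan request"},
    {"type": "stub", "name": "Test 2: credit rating stub", "reply": "AAA"},
    {"type": "stub", "name": "Test 3: manager approval stub", "reply": "approved"},
    {"type": "client", "name": "Test 4: validate result"},
]
print(run_scenario(steps, ["rating request", "approval request"]))
```

The key property modeled here is ordering: each stub step blocks the scenario until its message arrives, so downstream tests only run after the emulated dependency has responded.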

Dedicated (Remote) Stub Servers vs. The Local Stub Server

SOAtest allows you to configure dedicated stub servers—always-running machines that host the specified stubs in order to provide all team members and project stakeholders continuous, stable access to virtualized resources. With such a server, the team gains centralized stub access and management. Such stub servers can be accessed and managed remotely from your team’s various SOAtest installations. SOAtest also provides a local stub server that is ideal for quickly deploying stubs. This creates an environment for easy experimentation and validation. Hosted stubs can be deployed on either a remote stub server or a local stub server. Scenario-mode stubs are always deployed on the local stub server.

Recommended Workflow

The recommended workflow for hosted stubs is to deploy a newly-created stub to the local server in order to validate that it works as expected and to fine-tune its behavior. This deployment is automated in many circumstances; when it is not fully automated, the stub can be deployed by simply dragging the related .tst file to the appropriate Stub Server node. Then, once the stub is operating properly, you can move it to a dedicated stub server for centralized, team-wide access. This re-deployment can be done by simply dragging the related .tst file from the local server to the remote one.

Configuring the Local Stub Server

The local stub server can be started and stopped from the Stub Server view or from the command line.

From the GUI

Starting the Server

To start the local stub server:

1. Open the Stub Server view (if it is not available, choose Window> Show View> Stub Server).

2. If the Server node does not have a green ball icon to the left of it, start the local stub server in one of the following ways:
• Right-click the Server node and choose Start Server.
• Select the Server node, then click Start Server in the Stub Server view’s toolbar.

Stopping the Server

You can stop the local stub server (making any stubs on the local stub server inaccessible) in any of the following ways:
• Right-click the Server node and select Stop Server.
• Select the Server node, then click Stop Server in the Stub Server view’s toolbar.

From the Command Line

To start the local stub server from the command line, use a command such as the following on the local machine:

soatestcli -startStubServer -data <workspace_dir> -localsettings <localsettings_file>

For details on using SOAtest in command line mode, see “Testing from the Command Line Interface (soatestcli)”, page 257.

Deploying Stubs to the Local Stub Server

For instructions on how to deploy stubs to the local stub server, see “Manually Deploying Stubs”, page 546.

Configuring a Dedicated Stub Server

To work with a dedicated stub server, you start SOAtest in server mode from the designated server machine, then you interact with it from the various desktop SOAtest installations that your team uses for testing.

Starting SOAtest in Server Mode

To set up a dedicated stub server:

1. Install SOAtest Server on the machine that you want to act as a dedicated stub server.


2. On that same machine, start SOAtest in stub server mode by using a command such as: soatestcli -startStubServer -data <workspace_dir> -localsettings <localsettings_file>

For details on using SOAtest in command line mode, see “Testing from the Command Line Interface (soatestcli)”, page 257. The stub server is controlled by a web service with the URL http://localhost:9080/axis2/services/StubService?wsdl.

Stopping a Dedicated Server

To stop a dedicated stub server, invoke the "shutdown" operation from the stubs web service.

Interacting with a Remote Stub Server

To configure a desktop SOAtest installation to interact with a remote stub server (e.g., so you can view and add stubs):

1. Open the desktop SOAtest installation’s Stub Server view (choose Window> Show View> Stub Server).

2. Do one of the following:
• Right-click the Server node, then choose Add Server.
• Select the Server node, then click Add Server.


3. In the wizard that opens, specify the server’s host name, protocol, and port.

The server will then be added to the list of servers—allowing you to add stubs and configure stubs that run on this server.

Deploying Stubs to a Remote Stub Server

For instructions on how to deploy stubs to a remote stub server, see “Manually Deploying Stubs”, page 546 below. Note that when a stub is deployed to a remote stub server, that stub’s .tst file is written to the "stubs" project of the workspace being used by the remote stub server.

Manually Deploying Stubs

There are several ways to manually deploy stubs:
• Drag and drop (or copy/paste) already-deployed stubs from one stub server to another.
• Drag and drop (or copy/paste) .tst files to the Stub Server node representing the desired stub server.
• Drag and drop (or copy/paste) .tst files to the "stubs" project.
• Use the Add Stubs wizard.

More specifically, here is an overview of the deployment options available for the local stub server and for dedicated (remote) stub servers:

Local Stub Server
• Right-click the Local machine node, then choose Add Stub.
• Create stubs from existing traffic as described in “Creating Stubs from Functional Test Traffic”, page 535 and “Creating Tests From Traffic”, page 410.
• Drag and drop (or copy) a .tst file to the stubs project in the local workspace.
• Drag and drop a .tst file to the Local machine node in the Stub Server view.
• Copy and paste a stub from a remote stub server node to the Local machine node.
• Drag and drop a stub from a remote stub server node to the Local machine node.

Remote Stub Server
• Right-click the remote stub server node and choose Add Stub.
• Drag and drop a .tst file from the Test Case Explorer or Navigator view to the remote stub server node in the Stub Server view.
• Copy and paste a stub from the Local machine node to the remote stub server node in the Stub Server view.
• Drag and drop a stub from the Local machine node to the remote stub server node.

Detailed instructions are provided in the following sections.

You Can Skip Manual Deployment If...

No manual deployment steps are needed if:
• You created stubs from functional test traffic (as described in “Creating Stubs from Functional Test Traffic”, page 535) and you want to deploy stubs to the local stub server... OR
• You created stubs from recorded traffic (as described in “Creating Tests From Traffic”, page 410) and you want to deploy stubs to the local stub server... OR
• You are invoking stubs by running Message Stub tools as part of an end-to-end test scenario (as described in “Hosted Deployment vs. Scenario-Mode Integration”, page 542).

Using Drag and Drop or Copy/Paste

The fastest way to deploy a stub to a local or remote server or to move stubs from one server to another is as follows:

1. In the Stub Server tab, find the node representing the local or remote server to which you want to deploy the stubs.

2. Drag (or copy/paste) the stub to that node. You can drag or copy stubs from other servers, or .tst files from the Test Case Explorer or Navigator. Additionally, you can drag or copy related test assets (such as a .csv or .xls data source used by the stub) from the Test Case Explorer or Navigator.

You can use this procedure for a variety of purposes, including:




• To deploy a newly-created stub to the local server in order to validate and fine-tune its operation.
• To move a properly-functioning stub from the local server to a remote server for team-wide use.
• To move a stub from the remote server to a local server for editing, then re-deploy the modified stub to the remote server.
• To update the .tst file used by any already-deployed stub.

Alternatively, you can deploy stubs to the local server by adding the related .tst file to the "stubs" project (through drag and drop, copy/paste, or a source control update).

Using the Add Stub Wizard (for Local Server Deployment)

If you want additional control over the deployment process (e.g., if you want to modify the endpoint), you can deploy stubs as follows:

1. In the Stub Server tab, right-click the node representing the machine to which you want to deploy the stub, then choose Add Stub.
• To deploy the stub to a remote stub server, right-click that machine’s node. For more on using remote servers, see “Dedicated (Remote) Stub Servers vs. The Local Stub Server”, page 543 and “Configuring a Dedicated Stub Server”, page 544.
• To deploy stubs to the local machine, right-click the Local machine node.


2. Specify the path to the test suite that contains your Message Stub tools, then click Next.

3. Modify the endpoint if desired, then click Finish.

About "Hosted Stub" Deployment

If the stub for the Message Stub tools’ test suite is continuously available in the background (as a "hosted" stub), the stub will wait for an appropriate message, then respond in the specified manner whenever the designated stub endpoint is contacted. This can continue as long as the stub server is running.

About Deployment of Message Stub Tools Integrated into an End-to-End Test Scenario


If a Message Stub tool is invoked as part of an end-to-end test scenario, SOAtest will start listening for a message when that specific Message Stub test step is called as part of the test sequence. It will consume the message, then react in the manner defined in the Message Stub tool’s configuration. After this completes, the following test in the test scenario will execute.

Customizing Stub Deployment

For details on how to customize advanced options for stub deployment, see “Configuring Stub Server Deployment Settings”, page 554.

Re-Deploying Modified Stubs

If you modify the stubs, be sure to re-deploy them: in the Stub Server tab, right-click the appropriate machine node, then choose Re-Deploy All Stubs from the shortcut menu.

Refreshing Stubs

Refreshing stubs ensures that the Stub Server tree is in sync with the deployed stubs. To refresh the entire Stub Server view, do one of the following:
• Right-click the Server node, then choose Refresh from the shortcut menu.
• Select the Server node, then click Refresh.

To refresh a particular stub server (e.g., to display stubs recently added by a team member), right-click the related node in the Stub Server tree, then choose Refresh from the shortcut menu.

Validating Stubs

Before you configure your application to interact with your stubs, you might want to validate that the stubs behave as expected. To do this, you can create a new environment that uses the stubs instead of the actual server, then run your tests against this environment to ensure that the stubs are demonstrating the expected behavior.


For a demonstration of how to use an environment to validate stub behavior, see “Creating and Deploying Stubs”, page 102. For a general discussion of environments, see “Configuring Testing in Different Environments”, page 369.

Interacting with Stubs

For Message Stub Tools Integrated Into an End-to-End Test Scenario

When you want to have your application interact with a stub instead of an actual resource, configure your application to access the stub, which will be deployed at http://localhost:9080/servlet/MessageHandler. Note that every stub created and deployed in this manner has the same URL. This is possible because the Message Stub tools will be invoked one at a time, according to the logic of the test suite.

For "Hosted Stubs" that are Continuously Deployed

To have your application interact with a stub instead of an actual resource, configure your application to access the stub’s HTTP endpoint (for example, http://shuttle45:9080/servlet/StubEndpoint?stub=MyStub). For stubbed REST services, use the HTTP endpoint, plus the desired parameters. Note that each stub has a unique URL because you may have multiple stubs deployed at once. To determine the stub’s HTTP endpoint:

1. Open the Stub Server view (if it is not available, choose Window> Show View> Stub Server).

2. Expand the appropriate server’s branch and double-click the node named after the test suite from which you created stubs.


3. Review the HTTP endpoint value in the configuration panel that opens. This is the location to which the stub is deployed.
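Conceptually, pointing an application at a hosted stub is just an endpoint swap. The sketch below builds such an endpoint; the host shuttle45, stub name MyStub, and REST parameters are invented examples following the endpoint pattern shown above:

```python
from urllib.parse import urlencode

def stub_endpoint(host, stub_name, port=9080, params=None):
    """Build the HTTP endpoint for a hosted stub, plus any REST parameters."""
    query = {"stub": stub_name, **(params or {})}
    return f"http://{host}:{port}/servlet/StubEndpoint?{urlencode(query)}"

# Instead of the real service URL, the application under test would be
# configured to call:
print(stub_endpoint("shuttle45", "MyStub"))
# http://shuttle45:9080/servlet/StubEndpoint?stub=MyStub

# A stubbed REST call simply appends the desired parameters:
print(stub_endpoint("shuttle45", "MyStub", params={"accountId": "5"}))
```

Because each hosted stub has its own stub name in the query string, multiple stubs can be deployed at once without their URLs colliding.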

Gaining Visibility into Stub Behavior

Visibility into what messages are sent to and from stubs—and what validations and errors occur at the stub level—enables you to:
• Validate the messages that the application sends to your stubs.
• See what errors occur based on how your application interacts with the stubbed services.

When Message Stub tools are integrated into end-to-end test suites, you can gain visibility into their behavior through the Traffic Viewer tools that are attached to them. When stubs are deployed as continuously available "hosted stubs", you can use the Event Monitor to gain visibility into the stubs. For details on how to configure this, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.

Tutorial

For a tutorial on using stubs, see “Creating and Deploying Stubs”, page 102.

Stub Server View - GUI Reference

Toolbar Buttons

The Stub Server view’s toolbar provides the following buttons:
• Start Server: Starts the local server.
• Stop Server: Stops the local server.
• Refresh: Refreshes all servers in the tree.
• Add Server: Allows you to add a remote server to the Stub Server tree.

Shortcut (Right-click) Commands

The following shortcut (right-click) commands are available within the Stub Server view:

From the Server node:
• Start Server: Starts the local server.
• Stop Server: Stops the local server.
• Refresh: Refreshes all servers in the tree.
• Add Server: Allows you to add a remote server to the Stub Server tree.

From the Local machine node:
• Open: Opens a panel that allows you to configure advanced settings for the local stub server. See “Configuring Stub Server Deployment Settings”, page 554 for details.
• Refresh: Refreshes the local stub server.
• Add Stub: Allows you to add a stub to the local stub server.
• Re-deploy All Stubs: Re-deploys stubs so that modifications are "live."
• Save Stub Deployment Changes: Forces SOAtest to save the local stub modifications to stubs.xml. Otherwise, changes will be saved upon exiting SOAtest.

From a remote server node:
• Open: Opens a panel that allows you to configure advanced settings for the given stub server. See “Configuring Stub Server Deployment Settings”, page 554 for details.
• Refresh: Refreshes the given server (e.g., to keep it in sync with stubs added or removed by other team members).
• Add Stub: Allows you to add a stub to the given server.
• Re-deploy All Stubs: Re-deploys stubs so that modifications are "live."
• Remove Server: Removes a remote server from the Stub Server tree.

From a specific stub node (local machine or remote server):
• Open: Opens a panel that allows you to configure advanced deployment settings for the given stub. See “Configuring Stub Server Deployment Settings”, page 554 for details.
• Copy: Allows you to copy a stub so you can paste it from one server to another.
• Paste: Allows you to paste a copied stub from one server to another.
• Delete: Deletes the stub from the given server.
• Unprocessed Messages: Shows details on messages that were sent to that stub, but not processed by it.


Configuring Stub Server Deployment Settings

This topic explains how to configure advanced deployment settings for local or remote stub servers—and for the stubs deployed upon them. Via the Stub Server tab, various settings can be configured globally—or individually for each stub (when stubs are created automatically from a functional test suite as described in “Creating Stubs from Functional Test Traffic”, page 535). When a global setting is configured, the setting applies to all deployed stubs on the given stub server. The global setting can be overridden by configuring the individual setting for the stub. Sections include:
• Global Stub Deployment Settings
• Individual Stub Deployment Settings
• Configuring JMS for Stubs
• Configuring MQ for Stubs

Note: To customize stub behavior (e.g., with different request/response use cases, error conditions, delays, and so forth), customize the related Message Stub tools as described in “Message Stub”, page 784.

Global Stub Deployment Settings

Global settings can be configured by right-clicking any listed stub server (remote or local) in the Stub Server tab, then choosing Open.

From the configuration panel that opens, you can configure settings for:
• Emulating services deployed on JMS - See “Configuring JMS for Stubs”, page 556
• Emulating services deployed on IBM WebSphere MQ - See “Configuring MQ for Stubs”, page 556
• Gaining visibility into stub behavior - See “Monitoring Stub Events”, page 520

Individual Stub Deployment Settings

Individual settings can be configured by right-clicking a specific stub listed in the Stub Server view, then choosing Open.


You can then configure general options as well as JMS and MQ settings.

Configuring General Stub Deployment Options

In the General tab, you can specify:
• Test Suite: Specifies the test suite in which Message Stubs are configured. To change the related test suite, drag or copy the related .tst file to the appropriate server node in the Stub Server tree.
• "stub" parameter: If you specify a value here, the SOAtest stub will only consume messages that include a string property with a value matching that field value.
• HTTP Endpoint: If you are using HTTP (not JMS or MQ), this is where the stub can be accessed. To exercise the stub, you can configure your application to use this URL instead of the URL for the actual resource. Any machine that can access this endpoint can access and use your stub.

Note: Before deploying stubs over JMS and/or MQ, please add the appropriate jar files to the SOAtest classpath. For details on how to do this, see “System Properties Settings”, page 758.


Configuring JMS for Stubs

A stub can be configured to receive messages from, and send messages to, a queue or a topic. To configure global JMS settings that apply across a specific stub server, double-click the appropriate server machine node in the Stub Server view. To configure JMS settings for a specific stub, double-click the appropriate stub node in the Stub Server view. The following JMS options are available in the JMS Settings tab:
• providerURL: Specifies the value of the property named javax.naming.Context.PROVIDER_URL passed to the JNDI javax.naming.InitialContext constructor.
• initialContext: Specifies a fully qualified class name string, passed to the JNDI javax.naming.InitialContext constructor as a string value for the property named javax.naming.Context.INITIAL_CONTEXT_FACTORY.
• Messaging Model: Specifies how messages are sent between applications. Select either Point to Point or Publish and Subscribe, then specify the settings in the appropriate area (Point-to-Point Settings or Publish-and-Subscribe Settings).
• Message Selector Expression: (optional) When the same queue is being used by multiple services, it is helpful to specify a message selector expression. For example, if the message selector expression is "product = 'soatest'", then the stubs will only select messages in the queues/topics that have a JMS header "product: soatest". See “Using Message Selector Filters”, page 704 for tips.
• username/password: Enter if needed.

Behavior of Stubs Deployed Over JMS

The JMSMessageID of the request message will be sent as the JMSCorrelationID of the response message. SOAtest stubs deployed over JMS can be invoked simply by having the application send or publish the messages to the specified destination as usual. SOAtest will consume messages on that destination. If a value is specified in the Message Selector Expression field, it will consume any message that matches the specified expression. Optionally, you can also specify a value in the "stub" parameter field. In this case, the SOAtest stub will only consume messages that include a string property with a value matching that field value.
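The selection rules described above can be sketched as follows. This is a simplified model that handles only single-equality selectors (real JMS selector syntax is a richer SQL-like grammar), and the property names and values are invented:

```python
# Hypothetical sketch of stub message-selection behavior on a shared queue.
def matches_selector(properties, selector=None, stub_param=None):
    """Return True if the stub would consume a message with these properties."""
    if selector:
        # Only simple "name = 'value'" equality is modeled here.
        name, _, value = selector.partition("=")
        if properties.get(name.strip()) != value.strip().strip("'"):
            return False
    if stub_param and stub_param not in properties.values():
        # The "stub" parameter requires a matching string property.
        return False
    return True

msg = {"product": "soatest", "stub": "MyStub"}
print(matches_selector(msg, "product = 'soatest'"))           # True
print(matches_selector(msg, "product = 'soatest'", "Other"))  # False
```

This models why selectors matter on shared queues: without them, every stub listening on the queue would compete for every message.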

Configuring MQ for Stubs

SOAtest stubs can emulate services deployed on IBM WebSphere MQ by configuring the necessary MQ Settings. To configure global MQ settings that apply across a specific stub server, double-click the appropriate server machine node in the Stub Server tab. To configure MQ settings for a specific stub, double-click the appropriate stub node in the Stub Server tab. The following MQ options are available in the MQ Settings tab:
• mq_host: Specifies the name of the host running IBM MQ.
• mq_port: Specifies the port number for IBM MQ.
• queueManager: Specifies the name of the Queue Manager.
• channel: Specifies the name of the server-defined channel.
• putQueue: Specifies the queue that SOAtest sends the SOAP message to.
• getQueue: Specifies the queue that SOAtest retrieves the SOAP message from.
• Message Selector ID: (optional) When the same queue is being used by multiple services, it is helpful to specify a message selector expression. For example, if the message selector expression is "product = 'soatest'", then the stubs will only select messages in the queues that have a header "product: soatest". See “Using Message Selector Filters”, page 704 for tips.

Behavior of Stubs Deployed over IBM WebSphere MQ

SOAtest stubs deployed over MQ can be invoked simply by having the application send or publish the messages to the specified destination as usual. SOAtest will consume messages on that destination. If a value is specified in the Message Selector Expression field, it will consume any message that matches the specified expression. Optionally, you can also specify a value in the "stub" parameter field. In this case, the SOAtest stub will only consume messages that include a string property with a value matching that field value.

Note on IBM WebSphere MQ Clients: In order to use the MQMD.putApplicationName field, the client must ensure that MQMD.putApplicationName matches the "stub" parameter field in the stub configuration editor.


Load Testing

In this section:

• Load Testing your Functional Tests: Introduction
• Load Test Documentation and Tutorial
• Preparing Web Functional Tests for Load Testing


Load Testing your Functional Tests: Introduction

Load testing is performed in Parasoft Load Test—a load testing platform that features:

• Centrally-managed load test configuration/execution with seamless integration into Parasoft SOAtest. This is aligned with how teams and roles are typically structured within an organization.

• The ability to load test complete end-to-end test scenarios—from the web interface, through services, to the database. Every protocol and test type available in Parasoft SOAtest is supported in Parasoft Load Test.

• Support for load testing non-Parasoft components such as JUnits or lightweight socket-based components. This provides an integrated solution for your various load testing needs.

Important Notes

• Obtaining Parasoft Load Test: The Parasoft SOAtest installer installs both Parasoft SOAtest and Parasoft Load Test.

• Web load testing: If you want to use your browser-based functional tests for browser-less web load testing, use SOAtest to configure them for this application. For details, see “Preparing Web Functional Tests for Load Testing”, page 561.


Load Test Documentation and Tutorial

Detailed load testing documentation is provided with the Load Test product. Documentation is available in a fully-searchable online help system, as well as a PDF. A load testing tutorial is included as part of the documentation.


Preparing Web Functional Tests for Load Testing

SOAtest web functional tests (including automatically-generated tests added when you record from a browser, as well as manually-added Browser Testing Tool tests) are designed to be run in a browser. Since load tests don’t run in a browser, some configuration is necessary to reuse functional web tests in a load testing environment—where web tests are conducted by sending requests to the server. Parasoft SOAtest automatically configures your browser-based functional tests for load testing. It also validates them by executing them in a simulated load testing environment. This significantly reduces the setup required to create meaningful web load tests, and helps you identify and resolve any potential load test issues before the load testing efforts actually begin.

This topic explains how to prepare your web functional tests for load testing. Sections include:

• Recommended Preparation Procedure
• Accessing and Understanding the Load Test Perspective
• Configuring Tests
• Validating Tests
• Notes

Recommended Preparation Procedure

The recommended procedure is to configure your tests for load testing as described in “Configuring Tests”, page 562, then validate that they will work properly as described in “Validating Tests”, page 566. However, if you do not want to run the configuration step (e.g., because you have already configured the tests and do not want to overwrite any manual configurations you added), configuration is not required as long as the validation step passes.

Accessing and Understanding the Load Test Perspective

The Load Test perspective is designed to help you prepare your web functional tests for load testing. To open the Load Test perspective:

• Choose Window> Open Perspective> Other> Load Test.

This perspective is similar to the SOAtest perspective, but it also provides the following features:

• Two toolbar buttons (Configure for Load Test and Validate for Load Test), which allow you to run automated test configuration and validation.

• A Load Test Explorer, which lists the available web functional tests. Note that any web functional test components that are not relevant to load testing—for example, browser-based validations or data banks—will not be shown in this view.

• Load Test Explorer right-click menus for running automated test configuration and validation (the same commands available in the toolbar buttons).

• Specialized test configuration panels, which are accessed by double-clicking a test in the Load Test Explorer.


Configuring Tests

Why Do I Need to Configure Tests?

Load tests take the set of requests that the browser test would use and send those requests outside of the browser context. Sometimes, browser requests become invalid when they are re-submitted outside of the browser—for instance, because a request contains session-dependent information such as a session ID. In these cases, configuration is required. To facilitate configuration, SOAtest identifies such issues and automatically configures the requests to run in the browser-less load test environment. In the configure mode, SOAtest:

1. Runs the test twice to identify dynamic parameters (e.g., session IDs).
2. Sets up a Text Data Bank to extract a valid value for each dynamic request parameter (e.g., using a value extracted from a previous test, or an earlier response in the current test). For more details about this tool, see “Text Data Bank”, page 952.
3. Configures the test to use the appropriate extracted value for each dynamic parameter.

These requests are saved with the appropriate tests, and can be accessed as described in “How Can I Review and Modify the Requests that SOAtest Configured?” below. This configuration is required when either:

• Load test validation does not succeed.

• Your application has evolved to the point that your existing load test configurations no longer match the application functionality.

How Do I Configure Tests?

Warning: Configuration will re-create all the existing load testing requests based on the application’s existing state. As a result, any existing load test configurations you have set up in SOAtest (for example, if you manually configured the URL or Post Data to be set using parameterized or scripted values) will be overwritten.

Run the automated configuration as follows:

1. In the Load Test Explorer, select the test suite that you want to configure.
2. Either click the Configure for Load Test toolbar button, or right-click the test suite and choose Configure for Load Test from the shortcut menu.

Next, validate tests as described in “Validating Tests”, page 566.

How Can I Review and Modify the Requests that SOAtest Configured?

If you double-click a Browser Testing tool in the Load Test Explorer (available in the Load Test perspective), you will see a special configuration panel that displays a list of the requests that the test is supposed to make. It shows both the URL and post data, and allows you to modify these if desired. You can also add and remove requests using the controls on the right of the configuration panel.

How Do I Parameterize or Script Request Values?

If you want to use a dynamic value for any part of the request, you can parameterize requests with values from a data source or values extracted from another test—or with values resulting from custom scripting. To do this:

1. Double-click the test in the Load Test Explorer (available in the Load Test perspective) to open its configuration panel.
2. Select the specific request whose values you want to parameterize.
3. In the URL or Post Data tab (depending on what part of the request you wish to parameterize), highlight the text you want to parameterize.
4. Click Parameterize Selected Text.
5. In the dialog that opens, specify a name for the parameterized value. The actual value in the URL or Post Data tab will be replaced with a reference to a variable, and an entry for that variable will be added to the Parameterized Values area at the bottom of the test configuration panel.
6. To configure the variable to use a value that is stored in a data source or that is extracted from another test, choose Parameterized in the Value field, then select the desired data source column in the box to the right of Parameterize.

• See “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345 for more details about parameterizing tests.

7. To configure the variable to use the result of a custom script, select Scripted in the Value field, then click Edit Script and specify the script details.

• See “Extensibility (Scripting) Basics”, page 764 for more details about using custom scripts.

Note that if your functional test is already configured to use parameterized values, the configuration step will set up the tests so that the parameterized values will also be used for load testing.

What Happens During Test Configuration?

When running the “Configure for Load Test” step, SOAtest executes the test suite twice and performs three kinds of automated configuration:

1. HTTP requests that were previously set to use data source values for the functional test scenario are automatically configured to use the same data source values for the load test configuration. Here is an example where SOAtest parameterized the username and password parameters in the HTTP request with the same data source values that the functional test was already configured to use:


Note that the “password” parameter is configured to use the “Password” column from the “Login Values” data source.

2. Dynamic parameter values (for example, session IDs) in the HTTP requests are configured to use the updated values from the current session. SOAtest does this by creating a Text Data Bank tool on an HTTP response that contains the dynamic value. This data bank value is then used in the appropriate HTTP request. For instance, in the example shown below, Test 2 had a Text Data Bank added to it.

Note that a value is being extracted into a column named “qid”. Left and right text boundaries have been auto-configured.
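Boundary-based extraction of this kind can be sketched as follows. This is an illustrative Python snippet, not SOAtest's Text Data Bank implementation; the response content and boundary strings are hypothetical.

```python
# Illustrative sketch (not SOAtest internals): extracting a dynamic value from
# an HTTP response using left/right text boundaries, as a Text Data Bank does.

def extract_between(response_text, left, right):
    """Return the text between the first occurrence of `left` and the
    following occurrence of `right`, or None if a boundary is absent."""
    start = response_text.find(left)
    if start == -1:
        return None
    start += len(left)
    end = response_text.find(right, start)
    if end == -1:
        return None
    return response_text[start:end]

# Hypothetical response containing a session-dependent "qid" parameter:
response = '<input type="hidden" name="qid" value="A1B2C3"/>'
qid = extract_between(response, 'name="qid" value="', '"')
```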


Below, you can see that one of the HTTP requests has been configured to use the extracted value:

For more details about this tool, see “Text Data Bank”, page 952.

3. The scenario is configured to extract the same data bank values that were extracted for functional testing. This configuration makes the values available for other tools in mixed web/SOA load test scenarios. For example, assume that Test 2: Click “Google Search” has a Browser Data Bank that extracted a column named “Extracted: bookName”. SOAtest found the HTTP response that contained the same value, and created a Text Data Bank that extracts the value into a column with the same name (“Extracted: bookName”). This value is later used in a SOAP client, as shown below:


Validating Tests

Why Validate Tests?

When you validate tests, SOAtest will run them in load testing mode and alert you to any outstanding issues that might impact your load testing—for example, incorrectly configured HTTP requests. This way, you can resolve these issues before the actual load testing begins.

How Do I Validate Tests?

Run the automated validation as follows:

1. In the Load Test Explorer, select the test suite that you want to validate.
2. Either click the Validate for Load Test toolbar button, or right-click the test suite and choose Validate for Load Test from the shortcut menu.

If the validation succeeds, the Validate for Load Test tab will report that the test suite is ready to be used in Load Test. If a potential issue is detected, it will report that n issues need to be resolved before this test suite can be used in Load Test. You can then review the reported issues in the SOAtest view.

How Can I See a Test Step Rendered in a Browser?

To better determine what is occurring at each test step, you can have SOAtest display what happens when the load test requests are rendered in a browser. To do this, double-click the Browser Contents Viewer added to the related test. This is especially helpful if you want to visualize why the test is not producing the expected results. For example, the rendered page might reveal that the login is not occurring properly. Using this tool, along with examining error messages, helps you identify and resolve the cause of problems.

What Is Validation Looking For?

During validation, SOAtest determines if any configuration needs to be done on the scenario—either automated configuration (by SOAtest) or manual configuration. If validation does not succeed, this indicates that you need to run the configuration step or—if you have already run the configure step—that you need to manually configure parameters.

What if Problems Are Reported?

If the dynamic parameters could not be auto-configured by “Configure for Load Test”, one or both of the following will happen:

1. Errors will be reported by “Validate for Load Test”. Here are the kinds of errors you might see and what they could mean:

a. HTTP error codes (e.g., 404 – Not Found or 401 – Unauthorized). This means that the HTTP requests have incorrect dynamic parameter values or are otherwise wrongly configured.

b. Functional test errors such as “Unable to perform user action” or “Unable to validate or extract …”. These errors occur because the specified page elements for the failing test could not be found, which is typically the result of the HTTP responses containing unexpected data. Again, this is usually the result of the HTTP requests having incorrect dynamic parameter values or being otherwise wrongly configured.


2. The Browser Contents Viewer will show an incorrect or unexpected page at the point where the incorrect dynamic parameter was used.

If such issues occur, run “Configure for Load Test”. If “Configure for Load Test” has already been run and these errors are still occurring, you may need to manually configure the HTTP requests and/or parameters causing the problem.

When Do I Need to Manually Configure Parameters?

There is one class of dynamic parameter values that SOAtest cannot configure automatically: values that are normally constructed or transformed by JavaScript in the browser. Since the (transformed) parameter values do not exist in any of the HTTP responses, SOAtest cannot extract them to be used where necessary in any HTTP requests. These parameters need to be configured manually. Validation will alert you when these kinds of dynamic parameters are present and are required by the web application to be updated for each session.

How Do I Manually Configure Parameters?

Use the procedure described in “How Do I Parameterize or Script Request Values?”, page 562. Here is an example of a parameter that passes the current time to the server. This is a dynamic parameter, constructed by JavaScript, that is not present in any of the previous HTTP responses. It has been manually configured to be parameterized using a script that calculates and returns the current time.
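Such a current-time script might look like the following sketch. This is illustrative Python only; SOAtest custom scripts have their own method signatures (see “Extensibility (Scripting) Basics”), and the function name and optional context argument below are assumptions for demonstration.

```python
# Illustrative sketch only (not the SOAtest scripting API): a scripted value
# that returns the current time in milliseconds, standing in for a dynamic
# request parameter that JavaScript would normally compute in the browser.
import time

def get_current_time(context=None):
    """Return the current time in milliseconds as a string."""
    return str(int(time.time() * 1000))
```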


Notes

• Web load testing focuses on requests that result in text responses. It does not transfer static files such as images, Flash files, JavaScript, CSS, etc. This allows you to simulate a mode where everything is cached on the user’s machine—providing response times that are accurate for a repeat visitor/existing user.

• The requests for web load testing are configured to simulate the browser specified in the test suite’s Browser Playback Options tab. Browser type is simulated by sending the appropriate header content (User-Agent and Accept).

• If the application requires basic or NTLM authentication, the settings used in the test suite’s Browser Playback Options tab will be applied to web load testing as well.
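The header-based browser simulation mentioned above can be sketched as follows. This is an illustrative Python snippet; the header values are generic examples, not the exact strings SOAtest sends.

```python
# Illustrative sketch: simulating a browser at the HTTP level by sending the
# headers a real browser would send. The User-Agent/Accept values below are
# generic examples for demonstration, not SOAtest's actual values.
def browser_headers(browser="firefox"):
    """Return request headers that identify the simulated browser type."""
    profiles = {
        "firefox": {
            "User-Agent": "Mozilla/5.0 (Windows NT 10.0; rv:102.0) "
                          "Gecko/20100101 Firefox/102.0",
            "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
        },
        "chrome": {
            "User-Agent": "Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 "
                          "(KHTML, like Gecko) Chrome/114.0 Safari/537.36",
            "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
        },
    }
    return profiles[browser]
```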


SOA Quality Governance and Policy Enforcement

In this section:

• SOA Policy Enforcement: Overview
• Defining the Policy
• Enforcing Policies on WSDLs, Schemas, and SOAP Messages


SOA Policy Enforcement: Overview

This topic provides an overview of SOAtest’s quality policy enforcement capabilities. Sections include:

• Policy Enforcement Details
• Recommended Workflow
• Tutorial

Policy Enforcement Details

SOAtest provides a complete SOA policy enforcement solution, enforcing policies with executable rules that can be applied to WSDLs, schemas, SOAP messages, and any other XML artifact or SOA metadata component. Once an organization has defined its policies to guide its SOA deployments, SOAtest can be used to enforce them throughout the development and QA process. For example, SOAtest verifies schema and semantic validity for W3C and OASIS standards compliance, validates Basic Profile 1.1 for WS-I interoperability compliance, and implements rules to enforce various other endorsed WS-* standards. In addition, SOAtest can be used to enforce compliance to best practices such as customized company guidelines, security, maintainability, and reusability.

Registry-Based Policy Management

SOAtest provides native support for multiple commercial registries. This integration enables teams to automatically execute a quality workflow and correlate quality data in the context of an SOA Governance initiative. Teams can automatically extract the information needed to create tests for design and development policies (such as standards compliance, security, and best practices) for Web services assets as they are defined in a registry. They can also select a service asset and verify the associated policies, thereby ensuring interoperability and consistency.

SOAtest is capable of querying any UDDI registry from vendors such as IBM, HP, and Microsoft. Furthermore, Parasoft offers even tighter integration with Oracle/BEA's AquaLogic Enterprise Repository (ALER) and Software AG's CentraSite. We automatically generate tests at the time the services are published to the registry, including functional test cases and WSDL verification tests that ensure WSDLs are compliant with best practices and organizational policies. Policy compliance results are then reported back to the registry and updated in real time. This provides continuous visibility into a service's quality throughout its lifecycle.

Registry-Based Test Generation

• To learn how to create tests that enforce policies applied to Web service assets that are declared in a BEA repository, see “Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository”, page 393.

• To learn how to create tests that enforce policies applied to Web service assets that are declared in a Software AG CentraSite repository, see “Creating Tests From Software AG CentraSite Active SOA”, page 399.


WSDL, Schema, and Semantic Verification

WSDL verification can be considered the first step in testing Web services. Although WSDLs are generally created automatically by various tools, that doesn't necessarily mean the WSDLs are correct. When WSDLs are manually altered, WSDL verification becomes even more important. Ensuring correct and compliant WSDLs enables your service consumers to function correctly and avoids vendor lock-in, thus achieving interoperability and realizing the SOA goals of service reuse. SOAtest can automatically generate a test suite of comprehensive WSDL tests to ensure that your WSDL conforms to the schema and passes XML validation tests. Additionally, it performs an interoperability check to verify that your Web service is WS-I compliant.
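The most basic of these checks can be sketched as follows. This is an illustrative Python snippet, not SOAtest's WSDL test suite: it only confirms that a document parses as XML and that its root element is a WSDL 1.1 definitions element.

```python
# Illustrative sketch (not SOAtest's WSDL tests): the most basic WSDL check—
# confirm the document is well-formed XML with a WSDL 1.1 root element.
import xml.etree.ElementTree as ET

WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"

def is_wsdl_document(xml_text):
    """Return True if xml_text parses and its root is a WSDL definitions element."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    return root.tag == "{%s}definitions" % WSDL_NS

# A hypothetical, minimal (empty) WSDL document:
minimal_wsdl = '<definitions xmlns="http://schemas.xmlsoap.org/wsdl/"/>'
```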

WS-* Standards Validation

SOAtest enforces policies with executable rules that can be applied to WSDLs, schemas, SOAP messages, and any other XML artifact or SOA metadata component. For example, we verify schema and semantic validity for W3C and OASIS standards compliance, validate Basic Profile 1.1 for WS-I interoperability compliance, and implement rules to enforce various other endorsed WS-* standards. In addition, we enforce compliance to best practices such as customized company guidelines, security, maintainability, and reusability.

Interoperability Testing

SOAtest verifies the WSDL and SOAP traffic for conformance to Basic Profile 1.1 using the WS-I Testing Tools. Functioning as both a traffic monitor and analyzer, SOAtest enhances the usability of the WS-I Testing Tools by eliminating the need to set up a man-in-the-middle Monitor and configuration files for the Analyzer. The only required input is the WSDL URL.

Recommended Workflow

SOAtest’s policy enforcement component enables the SOA architect to define policy rules. Using SOAtest, Web service developers, QA, and test engineers can then verify service compliance against the architect-defined rules early in the Web service lifecycle. The architect then monitors compliance to these rules, resulting in a visible and controlled development process.


In addition, the rules defined by the architect can be applied across the entire organization via Parasoft’s Team Server, a server component that enables the sharing of rules and test policies among Parasoft’s products.

Tutorial

For a step-by-step tutorial on how to monitor compliance to SOA policies, see “Design and Development Policy Enforcement”, page 159.


Defining the Policy

This topic explains how to define and share a policy for your group. Sections include:

• Defining a Group Policy
• Sharing Policy Files across the Group

Defining a Group Policy

In SOAtest, a policy consists of a set of assertions, or rules. SOAtest enforces a policy by creating tests that check those rules. When creating tests from a WSDL with a policy file, SOAtest creates "policy enforcer" tests. SOAtest ships with a default policy file that addresses the key concerns for a web service in an SOA, such as interoperability and compliance to industry standards, as well as maintainability and best practices. We strongly recommend that the team’s architect customize this policy to suit the team’s specific needs.

To define a custom SOA policy:

1. Select File> New> Policy Configuration.
2. Specify a name and location for the policy, then click Finish. The Policy Configuration panel displays in the right GUI pane of SOAtest and lists assertions that correspond to policy enforcement rules and WSDL tests.


3. From the Policy Configuration panel, you can:

• Enable/disable individual rules or groups of rules by selecting or clearing the available check boxes.

• Customize a parameterizable rule by right-clicking it, choosing View/Change Rule Parameters from the shortcut menu, then modifying settings as needed. Parameterized rules are marked with a special icon (a wizard hat with a radio button).

• Search for a rule by clicking the Find button, then using that dialog to search for the rule.

• Hide the rules that are not enabled by clicking the Hide Disabled button. If you later want all rules displayed, click Show All.

• Define custom rules in RuleWizard by clicking New, then using the RuleWizard graphical editor or automated generator to create new rules. Once custom rules are defined, add them to the rule tree by clicking Add, then enable them. For details, open the RuleWizard User Guide by clicking New in this panel, then choosing Help> Documentation from within the RuleWizard GUI.

• View a description of a rule by right-clicking the node that represents that rule, then choosing View Rule Documentation from the shortcut menu.

4. Click Save to save the custom policy to the location you previously specified. The policy configuration you define can be used later to automatically create tests to enforce policies. For details, see “Enforcing Policies on WSDLs, Schemas, and SOAP Messages”, page 575.

Sharing Policy Files across the Group

Once you have created your policy configuration files and custom rules, they can be shared so that all team members will have access to them in their own testing environments. Policy configuration files can be uploaded to Team Server through the Team Server web interface, as described in the Team Server user’s guide. Rules can also be added to Team Server by choosing SOAtest> Explore> Team Server, then opening the Rules tab and uploading rules. You can also share policy configuration files via source control.


Enforcing Policies on WSDLs, Schemas, and SOAP Messages

This topic explains how to create, run, and review the results of policy enforcement tests. Sections include:

• Creating Policy Enforcement Tests
• Running Policy Enforcement Tests
• Reviewing Results/Reports

Creating Policy Enforcement Tests

Once the policy has been defined, policy enforcement tests can be created within SOAtest to reference the related rules, or assertions. From the SOAtest test creation wizard, you can create the following types of policy enforcement tests:

• WSDL: Used to enforce policies and standards on the content of your WSDL documents, including all imported WSDLs.

• Schema: Used to enforce policies and standards on schemas referenced within your WSDL file, including all imported schemas.

• SOAP: Used to enforce policies and standards on SOAP messages sent from your web service.

To reference a policy configuration when creating tests from the WSDL test creation wizard (described in detail in “Creating Tests From a WSDL”, page 385):

1. Start completing the wizard as normal.
2. If you would like to create WSDL/Schema policy enforcement tests, select the Create tests to validate and enforce policies on the WSDL checkbox in the first wizard page.
3. When you reach the Policy Enforcement wizard page, select the Apply Policy Configuration check box. This will create WSDL and functional tests that will enforce the assertions defined in the specified policy configuration.


4. Enter or browse to the desired policy configuration file.

• You can reference a local file or a file on Team Server.

• If you do not have a custom policy configuration, you can use the default policy enforcement tests that reference the default rule sets included with SOAtest. The default policy configuration, soa.policy, is a collection of industry-wide best practices.

5. Click the Finish button.

Running Policy Enforcement Tests

You can now execute the test by clicking the Test toolbar button.

Reviewing Results/Reports

Any policy violations detected will be reported as tasks in the SOAtest view, as described in “Reviewing Results”, page 289. Alternatively, you can run the test from the command line, then import results into the SOAtest GUI as described in “Accessing Results and Reports”, page 234. Once the policy enforcement tests have been run, a report can be generated in HTML or XML format which will contain all the error messages for each policy enforcement violation. To view a report, right-click the desired error message and select View Report> Detailed/Summary/Structure from the shortcut menu. The report will display in a Web browser.


Static Analysis

In this section:

• Performing Static Analysis
• Configuring SOAtest to Scan a Web Application
• Reviewing Static Analysis Results
• Suppressing the Reporting of Acceptable Violations
• Customizing Static Analysis: Overview
• Creating Custom Static Analysis Rules
• Modifying Rule Categories, IDs, Names, and Severity Levels


Performing Static Analysis

This topic explains how you can perform static analysis to identify code that does not comply with a preconfigured or customized set of static analysis rules. Sections include:

• About Static Analysis
• Analyzing Resources From Your Project Workspace
• Reviewing and Retesting Scanned Resources
• Analyzing Files Outside of Your Project Workspace
• Tutorial

About Static Analysis

Static analysis is one of many technologies used throughout the SDLC to help team members deliver secure, reliable, compliant SOA. Static analysis helps the team ensure that development activities satisfy the organization's expectations, ensuring interoperability and consistency across all SOA layers. SOAtest can perform static analysis on both SOA artifacts and the Web interface.

For SOA, static analysis can be performed on individual artifacts (e.g., WSDL or XML files). It is also used as one of several components in a comprehensive SOA policy enforcement framework, which is discussed in “SOA Quality Governance and Policy Enforcement”, page 569.

For Web interfaces, SOAtest’s static analysis can perform an automated audit of Web interface content and structure, automatically scanning and analyzing a Web asset for accessibility, branding, intranet standards compliance, and consistency. It inspects and exposes issues that present potential risk to the proper functionality, usability, and accessibility of your Web-based applications. It can cover an entire Web application or an individual component or module. Scan results are presented as actionable reports that identify erroneous objects—providing direct linkage to exposed issues for quick analysis and remediation. The assessment analysis covers the following areas:

• Accessibility: Support for Section 508, WAI, and WCAG 2.0 guidelines.

• Branding: Automatically enforces policies related to site layout and "look and feel."

• Intranet Standards: Identifies use of sensitive corporate data.

• Consistency: Prevents broken links, spelling errors, and browser compatibility issues.

More specifically, to facilitate Web accessibility validation (Section 508, WAI, WCAG), SOAtest automatically identifies code that positively or possibly violates Section 508, WAI, and WCAG 2.0 Web accessibility guidelines. During its automated audit, the solution checks whether Web interfaces comply with core accessibility guidelines and helps you identify code and page elements that require further inspection and/or modification. Moreover, Parasoft's pattern-based code analysis monitors whether Web language code follows industry-standard or customized rules for ensuring that code meets uniform expectations around security, reliability, performance, and maintainability. We provide an extensive rule library with hundreds of configurable rules for Web languages (JavaScript, VBScript/ASP, HTML, CSS, XML, and so on), as well as a graphical RuleWizard module that makes it very simple to construct and maintain customized rules.
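A pattern-based accessibility check of this kind can be sketched as follows. This is an illustrative Python snippet in the spirit of Section 508/WCAG rules, not SOAtest's analysis engine; the sample HTML is hypothetical.

```python
# Illustrative sketch (not SOAtest's engine): a pattern-based accessibility
# check that flags <img> tags lacking an alt attribute, a common Section
# 508/WCAG requirement.
import re

IMG_TAG = re.compile(r"<img\b[^>]*>", re.IGNORECASE)

def find_images_missing_alt(html):
    """Return the <img> tags in `html` that have no alt attribute."""
    return [tag for tag in IMG_TAG.findall(html)
            if not re.search(r"\balt\s*=", tag, re.IGNORECASE)]

# Hypothetical page fragment: the first image is compliant, the second is not.
html = '<p><img src="logo.png" alt="Company logo"><img src="banner.png"></p>'
violations = find_images_missing_alt(html)
```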

Analyzing Resources From Your Project Workspace

You can use the following procedure to perform static analysis on:




• Any source files that are available in your project workspace (e.g., HTML, XML, WSDL, and other source files added to your project workspace). For details on linking your source files to a project created by SOAtest, see the Eclipse Workbench User Guide (choose Help> Help Contents).

• Any Web pages that are represented in SOAtest test suites within your project workspace (e.g., the Web pages that are accessed as SOAtest crawls your Web application or the Web pages that the browser downloads as Web functional tests execute).

The general procedure for performing static analysis on one or more files in your project workspace is as follows:

1. Ensure that the files you want to analyze are available within SOAtest. The files must be available as a project in your workspace.

• If the files are not actually part of the project or downloaded as functional Web tests execute, you can have SOAtest "crawl" your Web application and then analyze the accessed pages; for details, see “Configuring SOAtest to Scan a Web Application”, page 583.

2. Select or create a Test Configuration with your preferred static analysis settings.

• For a description of preconfigured Test Configurations, see “Built-in Test Configurations”, page 741.

• For details on how to create a custom Test Configuration, see “Creating Custom Test Configurations”, page 244.

Configuring SOAtest to Run Static Analysis on Web Functional Test Scenarios

• The selected Test Configuration must have Test Execution enabled (this is the default setting for all built-in Test Configurations).

• By default, SOAtest will statically analyze the individual HTTP messages that the browser made in order to construct its data model—the content returned by the server as is (before any browser processing). If you prefer to analyze the browser-constructed HTML (the real-time data model that the browser constructed from all of the HTML, JS, CSS, and other files it loaded), you can change this by modifying the Test Configuration’s Execution settings.

3. Select the resource you want to analyze, then run the appropriate Test Configuration.




• To test the Web pages that are accessed as SOAtest crawls your Web application, select the Test Suite that contains the Scanning Tool, then run the desired Test Configuration.

• To test the Web pages that the browser downloads as Web functional tests execute, select the Test Suite that contains the Web Functional tests, then run the desired Test Configuration.

• To perform static analysis from the command line, use the procedure described in “Testing from the Command Line Interface (soatestcli)”, page 257.

4. Review and respond to the results using the appropriate static analysis results layout option.

• For details, see “Reviewing Static Analysis Results”, page 605.

5. (Optional) Fine-tune static analysis settings as needed.

• For details, see “Customizing Static Analysis: Overview”, page 611.
