
Parasoft SOAtest User’s Guide

Parasoft Corporation
101 E. Huntington Drive, 2nd Floor
Monrovia, CA 91016
Phone: (888) 305-0041
Fax: (626) 305-9048
E-mail: [email protected]
URL: www.parasoft.com

PARASOFT END USER LICENSE AGREEMENT This Agreement has 3 parts. Part I applies if you have not purchased a license to the accompanying software (the "SOFTWARE"). Part II applies if you have purchased a license to the SOFTWARE. Part III applies to all license grants. If you initially acquired a copy of the SOFTWARE without purchasing a license and you wish to purchase a license, contact Parasoft Corporation ("PARASOFT"): (626) 256-3680 (888) 305-0041 (USA only) (626) 256-9048 (Fax) [email protected] http://www.parasoft.com PART I -- TERMS APPLICABLE WHEN LICENSE FEES NOT (YET) PAID GRANT. DISCLAIMER OF WARRANTY. Free of charge SOFTWARE is provided on an "AS IS" basis, without warranty of any kind, including without limitation the warranties of merchantability, fitness for a particular purpose and non-infringement. The entire risk as to the quality and performance of the SOFTWARE is borne by you. Should the SOFTWARE prove defective, you and not PARASOFT assume the entire cost of any service and repair. This disclaimer of warranty constitutes an essential part of the agreement. SOME JURISDICTIONS DO NOT ALLOW EXCLUSIONS OF AN IMPLIED WARRANTY, SO THIS DISCLAIMER MAY NOT APPLY TO YOU AND YOU MAY HAVE OTHER LEGAL RIGHTS THAT VARY BY JURISDICTION. PART II -- TERMS APPLICABLE WHEN LICENSE FEES PAID GRANT OF LICENSE. PARASOFT hereby grants you, and you accept, a limited license to use the enclosed electronic media, user manuals, and any related materials (collectively called the SOFTWARE in this AGREEMENT). You may install the SOFTWARE in only one location on a single disk or in one location on the temporary or permanent replacement of this disk for use by a single user. If you wish to install the SOFTWARE in multiple locations, you must license additional copies of the SOFTWARE from PARASOFT. If you wish to have multiple users access the software you must either license additional copies of the software from Parasoft or request a multi-user license from PARASOFT. You may not transfer or sub-license, either temporarily or permanently, your right to use the SOFTWARE under this AGREEMENT without the prior written consent of PARASOFT. LIMITED WARRANTY. PARASOFT warrants for a period of thirty (30) days from the date of purchase, that under normal use, the material of the electronic media will not prove defective. If, during the thirty (30) day period, the software media shall prove defective, you may return them to PARASOFT for a replacement without charge. THIS IS A LIMITED WARRANTY AND IT IS THE ONLY WARRANTY MADE BY PARASOFT. PARASOFT MAKES NO OTHER EXPRESS WARRANTY AND NO WARRANTY OF NONINFRINGEMENT OF THIRD PARTIES' RIGHTS. THE DURATION OF IMPLIED WARRANTIES, INCLUDING WITHOUT

LIMITATION, WARRANTIES OF MERCHANTABILITY AND OF FITNESS FOR A PARTICULAR PURPOSE, IS LIMITED TO THE ABOVE LIMITED WARRANTY PERIOD; SOME JURISDICTIONS DO NOT ALLOW LIMITATIONS ON HOW LONG AN IMPLIED WARRANTY LASTS, SO LIMITATIONS MAY NOT APPLY TO YOU. NO PARASOFT DEALER, AGENT, OR EMPLOYEE IS AUTHORIZED TO MAKE ANY MODIFICATIONS, EXTENSIONS, OR ADDITIONS TO THIS WARRANTY. If any modifications are made to the SOFTWARE by you during the warranty period; if the media is subjected to accident, abuse, or improper use; or if you violate the terms of this Agreement, then this warranty shall immediately be terminated. This warranty shall not apply if the SOFTWARE is used on or in conjunction with hardware or software other than the unmodified version of hardware and software with which the SOFTWARE was designed to be used as described in the Documentation. THIS WARRANTY GIVES YOU SPECIFIC LEGAL RIGHTS, AND YOU MAY HAVE OTHER LEGAL RIGHTS THAT VARY BY JURISDICTION. YOUR ORIGINAL ELECTRONIC MEDIA/ARCHIVAL COPIES. The electronic media enclosed contain an original PARASOFT label. Use the original electronic media to make "back-up" or "archival" copies for the purpose of running the SOFTWARE program. You should not use the original electronic media in your terminal except to create the archival copy. After recording the archival copies, place the original electronic media in a safe place. Other than these archival copies, you agree that no other copies of the SOFTWARE will be made. TERM. This AGREEMENT is effective from the day you install the SOFTWARE and continues until you return the original SOFTWARE to PARASOFT, in which case you must also certify in writing that you have destroyed any archival copies you may have recorded on any memory system or magnetic, electronic, or optical media and likewise any copies of the written materials. CUSTOMER REGISTRATION. PARASOFT may from time to time revise or update the SOFTWARE. These revisions will be made generally available at PARASOFT's discretion. Revisions or notification of revisions can only be provided to you if you have registered with a PARASOFT representative or on the Parasoft Web site. PARASOFT's customer services are available only to registered users. PART III -- TERMS APPLICABLE TO ALL LICENSE GRANTS SCOPE OF GRANT. DERIVED PRODUCTS. Products developed from the use of the SOFTWARE remain your property. No royalty fees or runtime licenses are required on said products. PARASOFT'S RIGHTS. You acknowledge that the SOFTWARE is the sole and exclusive property of PARASOFT. By accepting this agreement you do not become the owner of the SOFTWARE, but you do have the right to use the SOFTWARE in accordance with this AGREEMENT. You agree to use your best efforts and all reasonable steps to protect the SOFTWARE from use, reproduction, or distribution, except as authorized by this AGREEMENT. You agree not to disassemble, de-compile or otherwise reverse engineer the SOFTWARE.

SUITABILITY. PARASOFT has worked hard to make this a quality product, however PARASOFT makes no warranties as to the suitability, accuracy, or operational characteristics of this SOFTWARE. The SOFTWARE is sold on an "as-is" basis. EXCLUSIONS. PARASOFT shall have no obligation to support SOFTWARE that is not the then current release. TERMINATION OF AGREEMENT. If any of the terms and conditions of this AGREEMENT are broken, this AGREEMENT will terminate automatically. Upon termination, you must return the software to PARASOFT or destroy all copies of the SOFTWARE and Documentation. At that time you must also certify, in writing, that you have not retained any copies of the SOFTWARE. LIMITATION OF LIABILITY. You agree that PARASOFT's liability for any damages to you or to any other party shall not exceed the license fee paid for the SOFTWARE. PARASOFT WILL NOT BE RESPONSIBLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL DAMAGES RESULTING FROM THE USE OF THE SOFTWARE ARISING OUT OF ANY BREACH OF THE WARRANTY, EVEN IF PARASOFT HAS BEEN ADVISED OF SUCH DAMAGES. THIS PRODUCT IS SOLD "AS-IS". SOME STATES DO NOT ALLOW THE LIMITATION OR EXCLUSION OF LIABILITY FOR INCIDENTAL OR CONSEQUENTIAL DAMAGES, SO THE ABOVE LIMITATION OR EXCLUSION MAY NOT APPLY TO YOU. YOU MAY ALSO HAVE OTHER RIGHTS WHICH VARY FROM STATE TO STATE. ENTIRE AGREEMENT. This Agreement represents the complete agreement concerning this license and may be amended only by a writing executed by both parties. THE ACCEPTANCE OF ANY PURCHASE ORDER PLACED BY YOU IS EXPRESSLY MADE CONDITIONAL ON YOUR ASSENT TO THE TERMS SET FORTH HEREIN, AND NOT THOSE IN YOUR PURCHASE ORDER. If any provision of this Agreement is held to be unenforceable, such provision shall be reformed only to the extent necessary to make it enforceable. This Agreement shall be governed by California law (except for conflict of law provisions). All brand and product names are trademarks or registered trademarks of their respective holders. Copyright 1993-2008 Parasoft Corporation 101 E. Huntington Drive., 2nd Floor Monrovia, CA 91016 Printed in the U.S.A, November 16, 2009

Table of Contents Introduction Welcome ....................................................................................................................................... 13 About the Documentation Library - PDFs and Related Resources............................................... 14 Contacting Parasoft Technical Support ........................................................................................ 16

Installation and Licensing Windows Standalone Installation .................................................................................................. 21 Windows Plugin Installation .......................................................................................................... 22 Linux/Solaris Standalone Installation ............................................................................................ 24 Linux/Solaris Plugin Installation .................................................................................................... 26 Service Pack Installation............................................................................................................... 28 Licensing....................................................................................................................................... 29

The SOAtest UI Exploring the SOAtest UI .............................................................................................................. 33

Migrating from SOAtest and WebKing Migration Guide for Existing SOAtest and WebKing Users .......................................................... 41 Command Line Interface (cli) Migration ........................................................................................ 51

SOAtest Tutorial About this Tutorial ......................................................................................................................... 58 Creating Projects and Test (.tst) files............................................................................................ 59 WSDL Verification......................................................................................................................... 63 Functional Testing ........................................................................................................................ 67 Scenario Testing........................................................................................................................... 87 Advanced Strategies..................................................................................................................... 92 Creating and Deploying Stubs ...................................................................................................... 102 Testing Plain XML Services .......................................................................................................... 126 Extending SOAtest with Scripting ................................................................................................. 128 Asynchronous Testing .................................................................................................................. 134 WS-Security .................................................................................................................................. 140 Design and Development Policy Enforcement.............................................................................. 159 Automation/Iteration (Nightly Process) ......................................................................................... 165 Running Regression Tests in Different Environments .................................................................. 167 Web Functional Testing ................................................................................................................ 171 Web Static Analysis ...................................................................................................................... 183

Team-Wide Deployment Team-Wide Deployment - Configuration Overview Configuring a Team Deployment: Introduction ............................................................................. 191 Connecting All SOAtest Installations to Your Source Control System.......................................... 192 Connecting All SOAtest Installations to Team Server................................................................... 199

Connecting SOAtest Server to Report Center ............................................................................. 203 Connecting All SOAtest Installations to Parasoft Project Center .................................................. 206 Configuring Task Assignment ....................................................................................................... 215 Configuring Team Test Configurations and Rules ........................................................................ 220 Configuring Task Goals ................................................................................................................ 225 Sharing Project and Test Assets .................................................................................................. 227 Configuring Automated Nightly Testing ........................................................................................ 229

Team-Wide Deployment - Usage Overview Using a Team Deployment: Daily Usage Introduction .................................................................. 231 Creating Tests and Analyzing Source Files from the GUI ............................................................ 232 Reviewing and Responding to Tasks Generated During Nightly Tests ........................................ 233 Accessing Results and Reports .................................................................................................... 234 Reassigning Tasks to Other Team Members ............................................................................... 238 Monitoring Project-Wide Test Results........................................................................................... 239

Test and Analysis Basics Customizing Settings and Configurations Modifying General SOAtest Preferences ...................................................................................... 242 Creating Custom Test Configurations........................................................................................... 244

Running Tests and Analysis Testing from the GUI .................................................................................................................... 255 Testing from the Command Line Interface (soatestcli) ................................................................. 257 Testing from the Web Service Interface........................................................................................ 287

Reviewing Results Viewing Results ............................................................................................................................ 290 Generating Reports ...................................................................................................................... 295 Configuring Reporting Settings ..................................................................................................... 297 Understanding Reports ................................................................................................................. 300

Functional/Integration Testing End-to-End Test Scenarios Configuring End-to-End Test Scenarios: Overview ...................................................................... 307 Adding Projects, .tst files, and Test Suites ................................................................................... 308 Working with Projects and .tst files ............................................................................................... 313 Reusing/Modularizing Test Suites ................................................................................................ 317 Configuring Test Suite Properties (Test Flow Logic, Variables, etc.)............................................ 319 Adding Standard Tests ................................................................................................................. 331 Adding Set-Up and Tear-Down Tests ........................................................................................... 332 Adding Test Outputs ..................................................................................................................... 333 Adding Global Test Suite Properties............................................................................................. 337 Reusing and Reordering Tests ..................................................................................................... 344 Parameterizing Tests (with Data Sources or Values from Other Tests) ....................................... 345 Configuring Testing in Different Environments ............................................................................. 369 Validating the Database Layer...................................................................................................... 373 Validating EJBs............................................................................................................................. 374 Validating Java Application-Layer Functionality ........................................................................... 375 Monitoring and Validating Messages and Events Inside ESBs and Other Systems..................... 376 Executing Functional Tests........................................................................................................... 377

Reviewing Functional Test Results............................................................................................... 379 Creating a Report of the Test Suite Structure............................................................................... 380 Managing the Test Suite ............................................................................................................... 381

SOA Functional Tests Automatic Creation of Test Suites for SOA: Overview.................................................................. 384 Creating Tests From a WSDL....................................................................................................... 385 Creating Tests From XML Schema............................................................................................... 389 Creating Tests From AmberPoint Management System .............................................................. 391 Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository........................ 393 Creating Tests From BPEL Files .................................................................................................. 396 Creating Tests From Software AG CentraSite Active SOA .......................................................... 399 Creating Tests From JMS System Transactions .......................................................................... 401 Creating Tests From Sonic ESB Transactions ............................................................................. 404 Creating Tests From TIBCO EMS Transactions........................................................................... 407 Creating Tests From Traffic .......................................................................................................... 410 Creating Tests From a UDDI ........................................................................................................ 413 Creating Tests From a WSIL ........................................................................................................ 416 Creating Asynchronous Tests....................................................................................................... 419 Testing RESTful Services ............................................................................................................. 420 Sending MTOM/XOP Messages................................................................................................... 421 Sending and Receiving Attachments ............................................................................................ 422 Accessing Web Services Deployed with HTTPS .......................................................................... 423 Configuring Regression Testing ................................................................................................... 425 Validating the Value of an Individual Response Element ............................................................. 427 Validating the Structure of the XML Response Message ............................................................. 428

Web Functional Tests Web Functional Testing: Overview ............................................................................................... 430 Recording Tests from a Browser .................................................................................................. 431 Generating JUnit Tests from a Web Browser Recording .............................................................. 441 Configuring Browser Playback Options ........................................................................................ 447 Configuring User Actions (Navigation, Delays, etc.) .................................................................... 449 Configuring Wait Conditions ......................................................................................................... 454 Validating or Storing Values ......................................................................................................... 459 Stubbing Test Requests/Responses ............................................................................................ 465 Running Web Scenarios in Headless Mode ................................................................................. 468 Running Static Analysis as Functional Tests Execute .................................................................. 471 Customizing Recording Options ................................................................................................... 472 Creating Custom Locators for Validations and Extractions.......................................................... 473 Understanding Web Functional Test Errors.................................................................................. 475 Creating a Report of Test Suite Maintainability............................................................................. 476

Security Testing Security Testing: Introduction ....................................................................................................... 479 Authentication, Encryption, and Access Control ........................................................................... 481 Penetration Testing....................................................................................................................... 484

Runtime Error Detection Performing Runtime Error Detection............................................................................................. 490

Event Monitoring (ESBs, Java Apps, Databases, and other Systems) Monitoring Intra-Process Events: Overview.................................................................................. 495 Using SOAtest’s Event Monitor .................................................................................................... 496 Monitoring IBM WebSphere ESB ................................................................................................. 500 Monitoring Oracle or BEA AquaLogic Service Bus ....................................................................... 503 Monitoring Software AG webMethods Broker............................................................................... 505 Monitoring Sonic ESB ................................................................................................................... 508 Monitoring TIBCO EMS ................................................................................................................ 510 Monitoring Other JMS Systems .................................................................................................... 512 Monitoring Java Applications ........................................................................................................ 514 Monitoring Databases ................................................................................................................... 518 Monitoring Stub Events................................................................................................................. 520 Monitoring a Custom API-Based Events Source .......................................................................... 524 Extensibility API Patterns.............................................................................................................. 525 Generating Tests from Monitored Transactions............................................................................ 530

Service Virtualization: Creating and Deploying Stubs Understanding Stubs .................................................................................................................... 532 Creating Stubs from Functional Test Traffic ................................................................................. 535 Creating Stubs from Recorded HTTP Traffic ................................................................................ 538 Creating Stubs from WSDLs or Manually ..................................................................................... 539 Working with Stubs ....................................................................................................................... 541 Configuring Stub Server Deployment Settings ............................................................................ 554

Load Testing Load Testing your Functional Tests: Introduction ......................................................................... 559 Load Test Documentation and Tutorial......................................................................................... 560 Preparing Web Functional Tests for Load Testing........................................................................ 561

SOA Quality Governance and Policy Enforcement SOA Policy Enforcement: Overview ............................................................................................. 570 Defining the Policy ........................................................................................................................ 573 Enforcing Policies on WSDLs, Schemas, and SOAP Messages.................................................. 575

Static Analysis Performing Static Analysis ............................................................................................................ 578 Configuring SOAtest to Scan a Web Application .......................................................................... 583 Reviewing Static Analysis Results ................................................................................................ 605 Suppressing the Reporting of Acceptable Violations .................................................................... 608 Customizing Static Analysis: Overview........................................................................................ 611 Creating Custom Static Analysis Rules ........................................................................................ 612 Modifying Rule Categories, IDs, Names, and Severity Levels...................................................... 616

Code Review Code Review Introduction............................................................................................................. 620 General Code Review Configuration ............................................................................................ 622 Configuring and Running Pre-Commit Code Review Scans......................................................... 628 Configuring and Running Post-Commit Code Review Scans ....................................................... 635 Working with the Code Review UI ................................................................................................ 641 Authors - Examining and Responding to Review Comments ....................................................... 647 Reviewers - Reviewing Code Modifications.................................................................................. 650 Monitors - Overseeing the Review Process.................................................................................. 653 Code Review Tips and Tricks ....................................................................................................... 655

Platform Support and Integrations Using AmberPoint Management System with SOAtest ................................................................ 657 Using Oracle/BEA with SOAtest ................................................................................................... 658 Using HP with SOAtest ................................................................................................................. 659 Using JMS with SOAtest............................................................................................................... 665 Using Microsoft with SOAtest ....................................................................................................... 666 Using IBM/Rational with SOAtest ................................................................................................. 672 Using Software AG CentraSite Active SOA with SOAtest ............................................................ 682 Using Software AG webMethods with SOAtest ............................................................................ 685 Using Sonic with SOAtest ............................................................................................................. 686 Using TIBCO with SOAtest ........................................................................................................... 687

Testing Through Different Protocols Using HTTP 1.0 ............................................................................................................................ 689 Using HTTP 1.1 ............................................................................................................................ 691 Using JMS .................................................................................................................................... 694 Using SonicMQ ............................................................................................................................. 706 Using IBM WebSphere MQ .......................................................................................................... 710 Using TIBCO Rendezvous............................................................................................................ 720 Using RMI ..................................................................................................................................... 722 Using SMTP.................................................................................................................................. 723 Testing a CORBA Server.............................................................................................................. 724 Using .NET WCF TCP .................................................................................................................. 726 Using .NET WCF HTTP ................................................................................................................ 730 Using .NET WCF Flowed Transactions ........................................................................................ 734

Reference Built-in Test Configurations........................................................................................................... 741 Built-in Static Analysis Rules ........................................................................................................ 744 Preference Settings ...................................................................................................................... 747 Extensibility (Scripting) Basics ...................................................................................................... 764 Using Eclipse Java Projects in SOAtest ....................................................................................... 772 Available Tools ............................................................................................................................. 773

Messaging Tools SOAP Client.................................................................................................................................. 777 Messaging Client .......................................................................................................................... 782 Message Stub ............................................................................................................................... 784

Common Messaging Options ....................................................................................................... 792 Literal XML View Options ............................................................................................................. 793 Form XML View Options ............................................................................................................... 794 Scripted XML View Options .......................................................................................................... 804 Form Input View Options .............................................................................................................. 805 MapMessage Input Options .......................................................................................................... 828 REST Client .................................................................................................................................. 829 webMethods ................................................................................................................................. 833 EJB Client ..................................................................................................................................... 841 Call Back....................................................................................................................................... 847 UDDI Query .................................................................................................................................. 852 Transmit Tool ................................................................................................................................ 854 ISO 8583....................................................................................................................................... 856

XML Tools XML Validator ............................................................................................................................... 862 XML Data Bank............................................................................................................................. 863 XML Transformer .......................................................................................................................... 869 XSLT ............................................................................................................................................. 872 XML Encryption ............................................................................................................................ 873 XML Signer ................................................................................................................................... 878 XML Signature Verifier.................................................................................................................. 882 XML Encoder ................................................................................................................................ 885 XML Decoder ................................................................................................................................ 886

Viewing Tools Traffic Viewer ................................................................................................................................ 888 Event Monitor................................................................................................................................ 891 Edit................................................................................................................................................ 892 Write File....................................................................................................................................... 893 File Stream Writer ......................................................................................................................... 894 Results Stream Writer................................................................................................................... 895 stderr............................................................................................................................................. 896 stdout ............................................................................................................................................ 897

Validation Tools Diff ................................................................................................................................................ 899 WS-I .............................................................................................................................................. 909 DB ................................................................................................................................................. 911 XML Assertor ................................................................................................................................ 917 WS-BPEL Semantics Validator..................................................................................................... 923

Web Application Tools Browser Testing ............................................................................................................................ 925 Browser Contents Viewer ............................................................................................................. 927 Browser Validation ........................................................................................................................ 929 Browser Data Bank ....................................................................................................................... 931 Browser Stub ................................................................................................................................ 932 Scanning ....................................................................................................................................... 933 Check Links .................................................................................................................................. 934 Spell .............................................................................................................................................. 936 HTML Cleanup.............................................................................................................................. 938 Search .......................................................................................................................................... 944

Other Tools Header Data Bank ........................................................................................................................ 947 JSON Data Bank .......................................................................................................................... 950 Object Data Bank.......................................................................................................................... 951

Text Data Bank ............................................................................................................................. 952 Coding Standards ......................................................................................................................... 954 FTP Client ..................................................................................................................................... 955 External......................................................................................................................................... 956 Aggregate ..................................................................................................................................... 959 Extension (Custom Scripting) ....................................................................................................... 960 Attachment Handler ...................................................................................................................... 962 Decompression ............................................................................................................................. 965 Jtest Tracer Client......................................................................................................................... 966 WSDL Content Handler ................................................................................................................ 969 Browse .......................................................................................................................................... 970 WSDL Semantics Validator .......................................................................................................... 971

Introduction

In this section:

• Welcome
• About the Documentation Library - PDFs and Related Resources
• Contacting Parasoft Technical Support
• Installation and Licensing
• The SOAtest UI


Welcome

Parasoft SOAtest is a full-lifecycle quality platform for ensuring secure, reliable, compliant business processes. It provides enterprises an integrated solution for:

• Quality governance: To continuously measure how each service conforms to the often dynamic expectations defined by both your own organization and your partners.
• Environment management: To reduce the complexity of testing in today’s heterogeneous environments—with limited visibility/control of distributed components or vendor-specific technologies.
• End-to-end testing: To continuously validate all critical aspects of complex transactions, which may extend beyond the message layer through a web interface, ESBs, databases, and everything in between.
• Process visibility and control: To establish a sustainable workflow that helps the entire team efficiently develop, share, and manage the evolution of quality assets throughout the lifecycle.

Getting Started

• Existing SOAtest or WebKing Users: We recommend starting at “Migrating from SOAtest and WebKing”, page 40, then proceeding to “SOAtest Tutorial”, page 57.
• New SOAtest Users: We recommend starting at “SOAtest Tutorial”, page 57.

Open Source Acknowledgements

This product includes software developed by the Eclipse Project (http://www.eclipse.org/). For a list of additional software used, choose Help> About Parasoft SOAtest, then click the SOAtest icon.


About the Documentation Library - PDFs and Related Resources

The SOAtest documentation library includes the following items:

• The SOAtest User’s Guide (the current guide): Explains how to use the SOAtest functionality that is built upon Eclipse (if you have the standalone version of SOAtest) or that is added to Eclipse (if you have the SOAtest plugin). To access this guide from the Eclipse help system, choose Help> Help Contents, then open the SOAtest User’s Guide book. The PDF is available in the manuals directory within the SOAtest installation directory.
• The RuleWizard User’s Guide: Explains how to use RuleWizard to create custom rules. Note that RuleWizard requires a special license. To access this guide, open RuleWizard by choosing SOAtest> Launch RuleWizard, then choose Help> Documentation from within the RuleWizard GUI.
• The SOAtest Static Analysis Rules Guide: Describes all of the coding standards rules included with SOAtest. To access this guide from the Eclipse help system, choose Help> Help Contents, then open the SOAtest Static Analysis Rules book. To generate a custom HTML-format guide with the descriptions for only the rules you have enabled, use the procedure described in “Viewing Rule Descriptions”, page 744.

Additional user guides in the Eclipse help system (for example, Workbench User Guide, etc.) describe native Eclipse functionality and strategies.

Search Tips

The Eclipse help system provides search functionality. However, by default, it searches for the entered term in all available documentation—including Eclipse documentation. If you want to restrict your searches to the SOAtest documentation, perform the following steps:

1. Choose Help> Help Contents.
2. Click the Search Scope link in the help viewer.
3. Click the New button in the Select Search Scope dialog.
4. Enter SOAtest in the New Search List dialog List name field.
5. Check the boxes for the books you want to search (for example, SOAtest User's Guide, SOAtest Static Analysis Rules).
6. Click OK.
7. Select the Search only the following topics button.
8. Click OK.

In addition, the User Guide PDF is fully searchable. Popular PDF readers provide both search and find functionality to help you locate the information you are looking for.


Bookmarking Your Most Commonly-Accessed Topics

You can use the Eclipse help "bookmark" feature to enable easy access to the SOAtest help topics you refer to most frequently. To bookmark a topic, open it in the Eclipse help system, then click the Bookmark Document button in the top right of the help system’s toolbar.

To access bookmarked topics, click the bookmark icon on the bottom left of the Eclipse help system.


Contacting Parasoft Technical Support

This topic explains several ways to contact technical support, as well as how to prepare and send "support archives" that help the technical support team diagnose any problems you are experiencing. Sections include:

• Obtaining Live Online Support (Windows only)
• Using the SOAtest Forum
• Contacting us via Phone or E-mail
• Preventing SOAtest from Running Out of Memory
• Preparing a "Support Archive" and Sending it to Technical Support

Obtaining Live Online Support (Windows only)

SOAtest experts are available online to answer your questions. This live support allows you to chat in real-time with the SOAtest team and perform desktop sharing if needed. To receive live online support, go to http://www.parasoft.com/jsp/pr/live_experts.jsp. This live tech support feature currently supports only the Microsoft Windows operating system.

Using the SOAtest Forum

Parasoft's SOAtest Forum is an active, online meeting place where you can converse with and learn from peers and Parasoft team members. Post your questions and participate in the latest discussions at http://forums.parasoft.com.

Contacting us via Phone or E-mail

USA Headquarters
Tel: (888) 305-0041 or (626) 256-3680
Email: [email protected]

Netherlands
Tel: +31-70-3922000
Email: [email protected]

France
Tel: +33 (0) 64 89 26 00
Email: [email protected]

Germany
Tel: +49 731 880309-0
Email: [email protected]


UK
Tel: +44 (0)1923 858005
Email: [email protected]

Asia
Tel: +886 2 6636-8090
Email: [email protected]

Other Locations
See http://www.parasoft.com/contacts.

Preventing SOAtest from Running Out of Memory

To prevent SOAtest from running out of memory, add memory parameters to the script or shortcut used to start SOAtest. The two parameters are the initial size of the JVM (Xms) and the maximum size of the JVM (Xmx). Typically, both are set to the same size (for instance, 256MB). However, if you have occasional problems but don't want to always allocate a large amount of memory, you can set the parameters to different sizes (for example, 256MB as the initial size and 512MB as the maximum size).

Examples:

• SOAtest standalone: soatest.exe -J-Xms256m -J-Xmx256m
• SOAtest plugin for Eclipse: eclipse.exe -vmargs -Xmx384m

Note that the maximum size you can set depends on your OS and JVM. If you are running the SOAtest Eclipse plugin under Sun Java 1.5 and get a java.lang.OutOfMemoryError: PermGen space error message, start Eclipse with eclipse -vmargs -XX:MaxPermSize=256m
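If you launch the SOAtest plugin from a script rather than a desktop shortcut, the memory settings can be baked into a small wrapper so they are applied on every start. The sketch below is only an illustration, not part of the product: the script name and the ECLIPSE_HOME location are assumptions, and the heap sizes are the example values mentioned above.

    #!/bin/sh
    # start-soatest-eclipse.sh (hypothetical) - launch an Eclipse installation
    # that contains the SOAtest plugin with explicit JVM memory settings.
    ECLIPSE_HOME=/opt/eclipse   # assumed Eclipse install location; adjust as needed

    # Program arguments must come before -vmargs; everything after -vmargs is
    # passed directly to the JVM. -XX:MaxPermSize is only needed if you hit the
    # PermGen error described above when running under Sun Java 1.5.
    "$ECLIPSE_HOME/eclipse" "$@" -vmargs -Xms256m -Xmx512m -XX:MaxPermSize=256m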

Preparing a "Support Archive" and Sending it to Technical Support

If you are experiencing testing problems such as build failures, the best way to remedy the problem is to create a zip archive containing the source file(s) that caused that failure (if applicable), as well as related test information, then send that zip file to Parasoft's support team. To facilitate this process, SOAtest can automatically create an archive when testing problems occur. On average, these archives are about half a megabyte, and are created in about one minute. By default, SOAtest does not create an archive when testing problems occur. You can either manually prepare and send a support archive when needed, or you can modify SOAtest archive creation options so that SOAtest automatically prepares and sends an archive when testing problems occur.

To configure SOAtest to automatically prepare and send archives when testing problems occur:

1. Open the Technical Support panel by choosing SOAtest> Preferences, then selecting the Technical Support category.
2. Check Enable auto-creation of support archives.
3. If you want to send the archive from SOAtest, check Send archives by e-mail.




   • If you enable this option, be sure to set the e-mail options in Preferences> E-mail if you have not already done so.
4. In the Items to include area, check the items you want included. Available options are:
   • Environmental data: Environment variables, JVM system properties, platform details, additional properties (memory, other).
   • General application logs: Various platform/application logs.
5. If you want verbose logs included in the archive, check Enable verbose logging. Note that this option cannot be enabled if the logging system has custom configurations.
   • Verbose logs are stored in the xtest.log file within the user-home temporary location (on Windows, this is <drive>:\Documents and Settings\<user>\Local Settings\Temp\parasoft\xtest).
   • Verbose logging state is cross-session persistent (restored on application startup).
   • The log file is a rolling file: it won't grow over a certain size, and each time it reaches the maximum size, a backup will be created.
6. If you want verbose logs to include output from source control commands, check Enable source control output. Note that the output could include fragments of your source code.
7. If the support team asked you to enter any advanced options, check Advanced options, then enter them here.
8. If you do not want to use the default archive location (listed in the Archives location field), specify a new one in the Archives location field.
9. Click Apply, then OK.

To manually create a support archive:

• Choose SOAtest> Preferences, select the Technical Support category, select the desired archive options, then click Create Archive.

To open the Technical Support Archive Manager, which allows you to review, e-mail, or delete recent support archives:

• Choose SOAtest> Preferences, select the Technical Support category, then click Browse Recent Archives.

When creating a support archive, it is best to ensure that it contains all of the information relevant to the problem and nothing unrelated.

Best Practice: Creating an Archive with the Most Relevant Data

When a technical support archive is created, the complete application logs are included. The logs may contain information from many test runs over a long period of time—but chances are that only a small part of that information is relevant to the problem you are experiencing. To help technical support isolate the cause of the problem, create a technical support archive containing application logs only for the testing session that produces problems. To do this:

1. Clean the application logs by turning on verbose logging. If verbose logging is already enabled, disable it and re-enable it.
2. Run the testing session that causes problems.
3. Prepare a technical support archive.


Installation and Licensing

In this section:

• Windows Standalone Installation
• Windows Plugin Installation
• Linux/Solaris Standalone Installation
• Linux/Solaris Plugin Installation
• Service Pack Installation
• Licensing


Windows Standalone Installation

This topic explains how to install the standalone version of SOAtest (which is built upon the Eclipse framework)—as well as the Parasoft Load Test product—on a Windows system.

System Requirements

• At least 1 GB RAM per processor (2 GB is recommended)
• Windows 2000, 2003, XP (Professional or Server Edition), or Vista

Installation

To install the standalone version of SOAtest on a Windows system:

1. Run the setup executable that you downloaded from the Parasoft Web site.
2. Follow the installation program's onscreen instructions.

After you have completed the installation program, SOAtest will be installed in the specified installation directory. The SOAtest workspace will be installed at %USERPROFILE%\soatest\workspace. For example:

• Windows XP: C:\Documents and Settings\[user]\soatest\workspace
• Windows Vista: C:\Users\[user]\soatest\workspace

Startup

To start SOAtest:

• Double-click the SOAtest desktop icon or choose Programs> Parasoft> SOAtest 6.x> SOAtest from the Windows Start menu.

To start Load Test:

• Double-click the Load Test desktop icon or choose Programs> Parasoft> Load Test> Load Test from the Windows Start menu.

Note: You must install a license before you begin using Load Test or SOAtest.

Licensing

See the Licensing topic for details.


Windows Plugin Installation

This topic explains how to install the SOAtest plugin into a working copy of Eclipse on Windows. Parasoft Load Test will also be installed during this process.

System Requirements

• At least 1 GB RAM per processor (2 GB is recommended)
• Windows 2000, 2003, XP (Professional or Server Edition), or Vista
• Eclipse 3.4, 3.3, 3.2.1 or higher
• Sun Microsystems JRE 1.5 or higher (32-bit)

Known Eclipse Issues

• On Windows platforms, there is a known issue with the Eclipse 3.3 UI not refreshing properly.
• On all platforms, there is a known system updates issue with Eclipse 3.4. This may affect your ability to install SOAtest service packs in the future. If you run into a problem updating SOAtest in the future, please contact SOAtest technical support for assistance. Note that SOAtest standalone ships with a patched Eclipse Ganymede 3.4.1, which does not suffer from the update issue mentioned above.
• If you already have Parasoft Jtest or Parasoft C++test, SOAtest needs to be installed into a separate Eclipse installation.

Installation

To install the SOAtest plugin on a Windows system:

1. In Windows Explorer, locate and double-click the self extracting archive.
2. Click Yes when a dialog asks whether you want to install SOAtest.
3. Click Yes after you have read and agreed with the license information.
4. Click Next after you have read the readme file.
5. Enter the desired destination directory for the SOAtest Extension files, then click Next. The default destination directory is C:\Program Files\Parasoft\SOAtestExtension.
6. Enter your Eclipse installation directory, then click OK.
7. Close Eclipse if it is open, then click OK to close the dialog reminding you to close this program. SOAtest will then start copying files and installing the necessary files into the workbench. A dialog box with a progress indicator will open and indicate installation progress. When the installation is complete, a notification dialog box will open.
8. Click the OK button to close the notification dialog box.

Startup
To start SOAtest:
1. Start Eclipse by double-clicking the appropriate desktop icon or choosing the appropriate menu item from the Windows Start menu.
2. Open the SOAtest perspective by choosing Window> Open Perspective> Other, then choosing SOAtest in the Select Perspective dialog that opens.
3. If the SOAtest menu is not visible in the Eclipse toolbar, choose Window> Reset Perspective. If the SOAtest menu still is not visible, ensure that you have the latest version of SOAtest by choosing Help> Software Updates> Pending Updates and installing any pending updates.
To start Load Test:
• Double-click the Load Test desktop icon or choose Programs> Parasoft> Load Test> Load Test from the Windows Start menu.

Note: You must install a license before you begin using Load Test or SOAtest.

Licensing
See the Licensing topic for details.

WTP and Pydev Plugin Installation
The WTP and Pydev plugins greatly improve the usability of various text editors and must be installed for SOAtest's syntax highlighting to work. These plugins can be downloaded and installed from the following locations:
• WTP download site: http://download.eclipse.org/webtools/downloads/
• Pydev download site: http://pydev.sourceforge.net/download.html


Linux/Solaris Standalone Installation

This topic explains how to install the standalone version of SOAtest (which is built upon the Eclipse framework), as well as the Parasoft Load Test product, on a Linux or Solaris system.

System Requirements
• At least 1 GB RAM per processor (2 GB is recommended)
• Linux or Solaris
• For Linux:
  • GTK+ 2.10 or higher
  • GLib 2.12 or higher
  • Pango 1.14 or higher
  • X.Org 1.0 or higher

Installation
To install the standalone version of SOAtest on a Linux or Solaris system:
1. If you haven't already done so, copy the installation file to the directory where you would like to install SOAtest.
2. Change directories to the directory where you are going to install SOAtest.
3. Extract the necessary files by entering the appropriate command at the prompt:
   • Linux: tar -xzf soatest_6.x_linux.tar.gz
   • Solaris: unzip soatest_6.x_solaris.zip
During extraction, a directory named SOAtest will be created; this directory contains the program files needed to run SOAtest. The SOAtest workspace will be created at <$HOME>/.SOAtest/workspace (for Solaris) or <$HOME>/.SOAtest_linux/workspace (for Linux).
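For example, a complete Linux installation under /opt/parasoft might look like the following; the install location is an arbitrary choice for illustration, not a requirement:

   mkdir -p /opt/parasoft                       # example install location
   cp soatest_6.x_linux.tar.gz /opt/parasoft    # archive downloaded from Parasoft
   cd /opt/parasoft
   tar -xzf soatest_6.x_linux.tar.gz            # creates the SOAtest directory described above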

Startup
To run the SOAtest GUI:
• Change directories to the soatest directory, then enter the following command at the prompt: ./soatest
To run the SOAtest command line interface:
• Change directories to the soatest directory, then execute SOAtest with the desired command line arguments: ./soatestcli
For details on using soatestcli, see the User Guide topic on Testing from the Command Line Interface (soatestcli).
To start Load Test:
• Change directories to the loadtest directory, then enter the following command at the prompt: ./loadtest


Note: You must install a license before you begin using SOAtest or Load Test.

Licensing
See the Licensing topic for details.


Linux/Solaris Plugin Installation

This topic explains how to install the SOAtest plugin into a working copy of Eclipse on a Linux or Solaris system. Parasoft Load Test will also be installed during this process.

Prerequisites
• At least 1 GB RAM per processor (2 GB is recommended)
• Linux or Solaris
• For Linux:
  • GTK+ 2.10 or higher
  • GLib 2.12 or higher
  • Pango 1.14 or higher
  • X.Org 1.0 or higher
• Eclipse 3.4, 3.3, or 3.2.1 or higher
• Sun Microsystems JRE 1.5 or higher (32-bit)

Known Eclipse Issues
• On all platforms, there is a known system updates issue with Eclipse 3.4 that may affect your ability to install SOAtest service packs in the future. If you run into a problem updating SOAtest, please contact SOAtest technical support for assistance. Note that SOAtest standalone ships on a patched Eclipse Ganymede 3.4.1, which does not suffer from this update issue.
• If you already have Parasoft Jtest or Parasoft C++test, SOAtest must be installed into a separate Eclipse installation.

Installation
To install the SOAtest plugin on a UNIX system:
1. If you haven't already done so, move the installation file that you downloaded to the directory where you want to install SOAtest. Typically, this would be a directory different than your Eclipse location so the files can be more easily updated independently in the future.
2. Change directories to the directory where you are going to install SOAtest.
3. Extract the necessary files by entering the appropriate command at the prompt:
   • Linux: tar -xzf soatest_6.x_linux_eclipse_plugin.tar.gz
   • Solaris: unzip soatest_6.x_solaris_eclipse_plugin.zip
4. After you enter this command, a directory named soatest-extension is created within your current directory, and all SOAtest files are extracted into this directory.
5. Change to the soatest-extension directory.
6. Run the install script: ./install
7. Provide the location of your current Eclipse installation directory. For example:

   This script will link an existing Eclipse installation with the
   SOAtest plugins. Please enter the directory that contains the
   installation you want to link to, or Ctrl-C to quit.
   > /home/developer/app/eclipse
   Eclipse installation found in /home/developer/app/eclipse
   Installing...
   Done.

Note: The SOAtest plugin can be uninstalled by deleting the links directory that was created at the top-level of the Eclipse installation directory.

Startup
To run the SOAtest GUI:
• Change directories to your eclipse directory, then start Eclipse as you normally do (using the Eclipse executable): ./eclipse
To run the SOAtest command line interface:
• Change directories to the soatest-extension directory, then execute SOAtest with the desired command line arguments: ./soatestcli
For details on using soatestcli, see the User Guide topic on Testing from the Command Line Interface (soatestcli).
To start Load Test:
• Change directories to the loadtest-extension directory, then enter the following command at the prompt: ./loadtest
Note: You must install a license before you begin using Load Test or SOAtest.

Licensing
See the Licensing topic for details.

WTP and Pydev Plugin Installation
The WTP and Pydev plugins greatly improve the usability of various text editors and must be installed for SOAtest's syntax highlighting to work. These plugins can be downloaded and installed from the following locations:
• WTP download site: http://download.eclipse.org/webtools/downloads/
• Pydev download site: http://pydev.sourceforge.net/download.html


Service Pack Installation

This topic explains how to update your current version of SOAtest or download the latest service pack.

Updating SOAtest
To update SOAtest:
1. From the SOAtest menu, choose Check for Updates. SOAtest will then check whether any updates are available.
2. If updates are reported, select the available updates, click Next, then complete the wizard to install the updates.

Using an Alternative Update Site
Some teams prefer to use an internal update site instead of the public one (for example, so that they can internally standardize their versions). To configure SOAtest to access an alternative update site:
1. Choose SOAtest> Preferences, then select Updates.
2. Enter the desired update site in the Update Site Location field.
3. If you want to configure Eclipse proxy connections (not related to SOAtest proxy configuration/behavior), click the Configure Proxy Settings link.


Licensing

This topic explains how to set licensing information for SOAtest and Load Test.

SOAtest
The following instructions focus on how to license SOAtest from the GUI. On SOAtest installations that are licensed for command line mode, you can define license information in a local settings file, then reference this file when you run SOAtest in command line mode.
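For illustration, a minimal local settings file for a command-line installation that obtains licenses from LicenseServer might contain entries like the following sketch. The soatest. property prefix, the host name, and the file name are assumptions made for this example; the documented property names are listed under Licensing Options in the Command Line Interface (cli) Migration topic, and 2002 is the default LicenseServer port.

   # mylocalsettings.properties (illustrative)
   soatest.license.network.host=licenseserver.mycompany.com
   soatest.license.network.port=2002

You would then point soatestcli at this file, for example:

   soatestcli -config "user://Example Configuration" -localsettings mylocalsettings.properties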

Using a Machine-Locked License
To install a machine-locked license:
1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the License category in the left pane.
3. Contact your Parasoft representative to receive your license. You will need to provide the Machine ID listed in the Local License area.
   • If you have a Server license and you want to obtain the Machine ID without opening the GUI, run soatestcli from the command line. The Machine ID will be reported in the output message.
4. Enter your license password in the Local License section of the License preferences page.
5. Click Apply. The License preferences page will display the features that you are licensed to use and the expiration date for your license.
6. Click OK to set and save your license.

Using Parasoft LicenseServer

Setting the license if all users cannot write to the SOAtest installation directory: The user who has write access to the SOAtest installation directory should configure the license on behalf of all team members. If the license is set by a user who does not have write access to the SOAtest installation directory, the license information will be stored at the workspace level and will need to be re-entered for each new workspace.

To install a license when the Parasoft LicenseServer (available separately) manages SOAtest licensing across your team or organization:
1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the License category in the left pane.
3. Enable the Use LicenseServer option. The LicenseServer section of the License preferences page will become active.
4. If you want to use SOAtest for a short period of time when you will not have access to the LicenseServer (e.g., because you expect to be working from home, you will be travelling, or your team will be upgrading the machine hosting LicenseServer), check borrow and specify how long you need to "borrow" the license.

   • When you borrow a license, one of the available licenses (on the LicenseServer) is locked to your machine for the specified amount of time. You can then disconnect from the network and run SOAtest without accessing the LicenseServer.
   • Licenses can be borrowed from 1 hour to 14 days.
   • License borrowing requires PST 2.6 or higher.

5. If the appropriate LicenseServer is not already set, select it from the Autodetected servers list and click Set. Or, manually enter your organization's LicenseServer host (either a name or an IP address) in the Host name field, then enter your organization's LicenseServer port in the Port number field.
6. Indicate the license type that you want this SOAtest installation to use by selecting the appropriate option in the Edition box. Available options are:
   • Professional Edition: Covers static analysis and functional testing.
   • Architect Edition: Covers static analysis, functional testing, and custom rule creation (RuleWizard).
   • Server Edition: Covers static analysis, functional testing, custom rule creation (RuleWizard), and the command-line interface.
   • Custom Edition: Covers custom licensing needs. If you are using a custom license, select this option, then click the Choose button and specify which of your available license features you want to apply to this installation.
7. Click OK to set and save your LicenseServer settings.

If your organization needs additional licenses or updated licenses, the manager or architect should contact Parasoft to obtain these licenses, then add these licenses to LicenseServer as described in the LicenseServer documentation.

Tip - Deactivating Licenses
If you want to deactivate a SOAtest LicenseServer license, enable the Start deactivated, release automatically when idle option. To reactivate the license, disable the Start deactivated, release automatically when idle option.
When the license is deactivated:
• The SOAtest view is cleared and displays a message indicating that a license is not available.
• All SOAtest operations currently in progress (for instance, static analysis or test case execution) are canceled.
• The SOAtest LicenseServer license is released.
When the license is reactivated:
• The SOAtest view is restored and will display errors (if available).

Load Test

Using a Machine-Locked License
To install a machine-locked license:
1. Launch Load Test.
2. Open the Password window in one of the following ways:
   • If you see a dialog box asking if you would like to install a password, click Yes.
   • If you do not see this dialog box, choose File> View Password.
3. Contact your Parasoft representative to receive your license.
4. In the top portion of the Password Information dialog, enter your expiration date and password.
5. Click OK to set and save your license.

Using Parasoft LicenseServer
To install a license when the Parasoft LicenseServer (available separately) manages Load Test licensing across your team or organization:
1. Launch Load Test.
2. Open the Password window in one of the following ways:
   • If you see a dialog box asking if you would like to install a password, click Yes.
   • If you do not see this dialog box, choose File> View Password.
3. Select the Use license server option.
4. Select a license server from the Autodetect drop-down menu and click Set to populate the Host and Port fields. You can also click Refresh to refresh the list of license servers. If your license server is not found in the Autodetect list, you can manually enter the host name in the Host field and the port number in the Port field (the default port is 2002).
5. Enter the number of seconds after which a timeout should occur in the Timeout field. If the specified amount of time passes before a license is retrieved, Load Test will not run.
6. Click OK to set and save your license server information.


The SOAtest UI
In this section:
• Exploring the SOAtest UI

Exploring the SOAtest UI

This topic describes the SOAtest controls that are added to the Eclipse IDE. Sections include:
• The SOAtest Perspective
• Views
• Toolbar Buttons
• SOAtest Menu Commands
• The Scanning Perspective
• The Load Test Perspective

The SOAtest Perspective
The SOAtest perspective, which is built into the Eclipse workbench, provides a set of capabilities designed to help you configure, run, and review tests. You can open this perspective in any of the following ways:
• Click the SOAtest Perspective button in the shortcut bar (on the top right of the workbench).
• Click the Open Perspective button in the shortcut bar, choose Other, then choose SOAtest in the Select Perspective dialog that opens.
• Choose Window> Open Perspective> Other, then choose SOAtest in the Select Perspective dialog that opens.

The SOAtest perspective provides special views, toolbar buttons, and menu items you will use to configure, run, and review tests.

Views
SOAtest functionality relies on the following views:
• Test Case Explorer
• SOAtest view
• Console view
• Test Progress view
• Editor view
• Stub Server view

For details on other views provided by the Eclipse workbench (such as the Tasks and Problems view), see the Workbench User Guide, which can be accessed from Help> Contents.

Test Case Explorer
The Test Case Explorer displays available SOAtest projects and tests. The Test Case Explorer can have multiple Eclipse projects open at the same time, and each project can have multiple Test Suites open at the same time.

Test Case Explorer Menu Buttons


At the top right corner of the Test Case Explorer are the following menu buttons:
• Refresh: Refreshes the contents of the Test Case Explorer.
• Collapse All: Collapses all of the nodes within the Test Case Explorer.
• Search: Allows you to search for any node (i.e., Test Suites, tests, chained tools, etc.) within the Test Case Explorer. After clicking the Search button, the following options display:
  • Containing: Enter the text or string contained within the test.
  • Within the whole tree: Select to search for the specified text within the whole tree.
  • Within the selected node: Select to search for the specified text within the selected node.
  • Wrap around: Select to perform a wrap-around search.
  • Case sensitive: Select to perform a case-sensitive search.
• Filter: Allows you to configure filters that hide specified projects or tests within the Test Case Explorer.
• Statistics: Shows statistics (i.e., the number of tests Passed, Failed, Errors, Skipped, and Run) next to each test suite node within the Test Case Explorer.

SOAtest view
The SOAtest view is where SOAtest lists its test findings. This view is open by default. If this view is not available, choose SOAtest> Show View> SOAtest to open it. For details on reviewing the results reported in this view, see "Viewing Results", page 290.

Console view
The Console view displays summary information about any executed test, including how many tests were run, how many tests failed, and how many tests were skipped. You can also configure the Console to show test variables as described in "Monitoring Test Variable Usage", page 328.

Test Progress view
This view is where SOAtest reports test progress and status. For details, see "Test Progress View", page 290.

Editor view
The Editor view is the largest panel in the workbench. This is where SOAtest displays tool/test configuration panels or source code, depending on what Test Case Explorer or Navigator node was selected. For instance, if you double-click a SOAP Client tool node in the Test Case Explorer, a SOAP Client tool configuration panel opens in an Editor.

Stub Server view
The Stub Server view allows you to manage and interact with the local stub server as well as dedicated stub servers running on remote machines. For details on how to use this view, see "Working with Stubs", page 541.

Toolbar Buttons
SOAtest adds the following buttons to the toolbar:
• Test
• Import My Recommended Tasks


Test
The Test button allows you to quickly run any available Test Configuration. If you simply click the Test button, SOAtest will run a test based on the Favorite Test Configuration.
If you use the pull-down menu to the right of the Test button, you can start a test using any of the available Test Configurations. The current Favorite Test Configuration will always be listed as the first command in the Test Using pull-down menu, and will be followed by the most recently-run Test Configurations, then by commands that provide access to User-defined Test Configurations, Built-in Test Configurations, and Team Test Configurations.

(Screenshot: the Test pull-down menu, showing the Favorite Test Configuration, recently-run Test Configurations, Test Configurations shared by the team, Test Configurations included with SOAtest, and Test Configurations designed by the user.)

Import My Recommended Tasks
The Import My Recommended Tasks button lets you import a selected category of results that are available on Parasoft Team Server. This allows you to use the GUI to review and analyze the results from command-line tests. If you simply click the Import My Recommended Tasks button, SOAtest will import the subset of all of your testing tasks that 1) you are responsible for (based on SOAtest's task assignments) and 2) SOAtest has selected for you to review and address today (based on the maximum number of tasks per team member per day that your team has configured SOAtest to report).
If you use the pull-down menu to the right of the Import My Recommended Tasks button, you can select which type of results you want to import. For details on the available options, see "Accessing Results and Reports", page 234.

SOAtest Menu Commands
The SOAtest menu provides the following commands:
• Test Using [Favorite Configuration]: Starts a test using the Test Configuration currently set as the Favorite Test Configuration.
• Test History: Starts a test based on the selected Test Configuration. Only the most recently-run Test Configurations are listed here.
• Test Using: Starts a test based on the selected Test Configuration. All available Test Configurations are listed here.
• Test Configurations: Opens the Test Configuration dialog, which lets you view, modify, and create Test Configurations. See "Creating Custom Test Configurations", page 244 for details.
• Launch RuleWizard: Opens RuleWizard, a tool for graphically or automatically creating custom static analysis rules. See "Creating Custom Static Analysis Rules", page 612 for details.
• Explore> Team Server: Opens the Team Server browser dialog, which allows you to access, configure, and update the Test Configurations, rules, rule mapping files, and reports available on Team Server.
• Explore> Team Server Reports: Opens the HTML report files that are available on Team Server. See "Accessing Team Server Reports through the GUI", page 236 for details.
• Explore> Report Center Reports: Opens Report Center reports based on information from SOAtest tests and other sources. See "Accessing Results through Report Center", page 237 for details.
• Import: Imports the selected category of results that are available on Parasoft Team Server. See "Importing Results From Team Server into the SOAtest GUI", page 234 for details.
• Show View: Opens GUI elements (such as the SOAtest view or Suppressions view) which are not currently visible. See "Views", page 33 for details.
• Preferences: Opens the Preferences dialog. See "Modifying General SOAtest Preferences", page 242 for details.
• Support: Provides several ways to contact the support team.
• Help: Opens the User's Guide in the online help system.
• Deactivate | Activate License: Deactivates/activates a SOAtest LicenseServer license. See "Licensing", page 29 for details.

The Scanning Perspective
The Scanning perspective is designed to facilitate reviewing and retesting of resources scanned during static analysis. To open the Scanning perspective:
• Choose Window> Open Perspective> Other> Scanning.
This perspective is similar to the SOAtest perspective, but has two additional features:
• The Quick Test toolbar button for testing a single URL or file. This button can be added to any perspective by choosing Window> Customize Perspective> Commands and clicking the checkbox next to SOAtest Scanning.
• The Scanned Resources view. This view can be added to any perspective by choosing Window> Show View> SOAtest> Scanned Resources Listing.

For details on using this perspective, see “Reviewing and Retesting Scanned Resources”, page 581.

The Load Test Perspective
The Load Test perspective is designed to help you prepare your web functional tests for load testing. To open the Load Test perspective:
• Choose Window> Open Perspective> Other> Load Test.
This perspective is similar to the SOAtest perspective, but it also provides the following features:
• Two toolbar buttons (Configure for Load Test and Validate for Load Test) which allow you to run automated test configuration and validation.
• A Load Test Explorer, which lists the available web functional tests. Note that any web functional test components that are not relevant to load testing (for example, browser-based validations or data banks) will not be shown in this view.
• Load Test Explorer right-click menus for running automated test configuration and validation (the same commands available from the toolbar buttons).
• Specialized test configuration panels, which are accessed by double-clicking a test in the Load Test Explorer.

For details on using this perspective, see “Preparing Web Functional Tests for Load Testing”, page 561.


Migrating from SOAtest and WebKing
In this section:
• Migration Guide for Existing SOAtest and WebKing Users
• Command Line Interface (cli) Migration

Migration Guide for Existing SOAtest and WebKing Users

This topic is a general migration guide for existing SOAtest and WebKing users. Sections include:
• About This Migration Guide
• What's New?
• Setting Up Projects
  • Choosing an Appropriate Project Setup Strategy
  • Creating Projects from Tests Under Source Control
  • Creating Projects from Tests NOT Under Source Control
  • Using a Team Project Set File (.psf) to Share a New Project Across the Team
• Importing Existing Preferences
• Importing Stubs
• Familiarizing Yourself with the SOAtest 6.x Interface
  • Test Case Explorer
  • Editors
  • Environments
  • Running a Test
  • Saving Test Suite (.tst) Files
  • SOAtest View and Console View
• Source Control Integration
• Setting up the Automated Nightly Process using CLI
• Deprecated Features

About This Migration Guide
This guide is designed to help existing SOAtest or WebKing users get up and running in SOAtest 6.x as rapidly as possible. This Migration Guide is meant to be used by users already familiar with SOAtest or WebKing. New users should review the SOAtest Tutorial first.

What's New?
For details on what's new, see http://www.parasoft.com/soatest6.

Setting Up Projects
Since the SOAtest 6.x interface was integrated into the Eclipse framework, it now follows the Eclipse framework hierarchy for managing your test assets. You no longer need to open .tst files one at a time. Instead, you can manage all your .tst files in projects within a workspace.




• A workspace corresponds to a directory on the local machine. SOAtest will ask you for the desired location of the workspace at startup and will remember that location in future runs. When you start SOAtest 6.x, an Eclipse workspace is automatically created in:
  • Windows: %USERPROFILE%\soatest\workspace
  • Linux: $HOME/.soatest_linux/workspace
  • Solaris: $HOME/.soatest/workspace
  For example:
  • Windows XP: C:\Documents and Settings\[user]\soatest\workspace
  • Windows Vista: C:\Users\[user]\soatest\workspace
  • Linux: /home/[user]/.soatest_linux/workspace
• A workspace can contain multiple projects, each of which correlates to a directory inside the workspace on the local machine. A project can contain multiple .tst files along with any related files and artifacts such as data source Excel spreadsheets, keystores, etc.
• The .tst file inside a project serves the same function as what was called a "project file" in previous releases.

Choosing an Appropriate Project Setup Strategy
In the following sections, we will go through several ways of creating new projects that may be useful to existing SOAtest or WebKing users. Other ways of creating new projects (e.g., from a WSDL) can be found in the Tutorial. To ensure that tests can easily be shared across your organization, a designated team member (usually a team lead or manager) must decide which project setup strategy to use. The entire team should then adopt the same strategy.
• If your tests are stored in a source control system, use Creating Projects from Tests Under Source Control.
• If your tests are NOT stored in a source control system and you want to copy your old tests into a new location on your file system, use Copying Tests to a New Location.
• If your tests are NOT stored in a source control system and you want the existing files to remain in the same location on your file system, use Leaving Tests in the Original Location.

Note: Once a file is opened within SOAtest 6.0 or later, it is automatically saved in a new format that cannot be opened in earlier versions of SOAtest or WebKing.

Creating Projects from Tests Under Source Control


By default, SOAtest 6.x ships with CVS source control support. Support for additional source control systems can be added by providing the appropriate plug-ins to Eclipse. To create a project consisting of test suites that are checked into source control:
1. Choose File> Import.
2. In the window that opens, expand the folder that corresponds to your source control system (e.g., SVN or CVS).
3. Select Project(s) from <name of source control>, then click Next.
4. Enter the necessary Repository Location Information for the source control folder containing your tests, then click Finish.
5. After the project is available in your workspace, add the .project and .parasoft files into source control. They will be visible in the Navigator view and should be shared by the entire team.
   • Do NOT add the .metadata folder into source control.

Creating Projects from Tests NOT Under Source Control
The first (and strongly recommended) option for tests not stored in a source control system is to copy the old tests into your new workspace. Doing so preserves your old tests in a manner analogous to backing up your hard drive. This procedure is explained in Copying Tests to a New Location. The second option for tests not stored in a source control system is to use the Project from Existing SOAtest or WebKing Test Suites wizard. This causes the original files to appear within your workspace while allowing them to remain in the same location on your file system. This procedure is explained in Leaving Tests in the Original Location.

Copying Tests to a New Location
To create a project that copies existing test suites to a new location on the file system:
1. Choose File> New> Project.
2. In the window that opens, expand General, select Project, then click Next.
3. Specify a name for your project (which will contain multiple .tst files), then click Finish. This creates an empty folder in your workspace.
4. Choose File> Import.
5. In the window that opens, expand General, select File System, then click Next.
6. In the From directory field, navigate to the directory containing your tests.
7. In the Into folder field, select your project folder from Step 3, then click Finish.
8. (Optional, strongly recommended) Obtain a source control system, and add the entire project, the .project folder, and the .parasoft files into source control. They will be visible in the Navigator view and should be shared by the entire team.
   • Do NOT add the .metadata folder into source control.

Leaving Tests in the Original Location
To create a project that leaves existing test suites at the same location on the file system:
1. Open the New Project wizard in one of the following ways:
   • Select File> New> Project from Existing SOAtest or WebKing Test Suites.
   • Open the pull-down menu for the New toolbar button (top left), then choose Project from Existing SOAtest or WebKing Test Suites.
2. In the wizard that opens, enter a project name, then enter or browse to the root directory for the existing test suites.
3. Click the Finish button. The tests you selected will display in the Test Case Explorer.

Using a Team Project Set File (.psf) to Share a New Project Across the Team
Once one team member creates a project, that team member can create a Team Project Set File (.psf) which can then be shared by other members of the team. Doing so allows every team member to create their Eclipse projects in the same uniform way. This is a necessary step for importing tasks from the automated nightly test process.
To create a Team Project Set File for a project created from CVS:
1. Select File> Export. The Export Wizard displays.
2. Within the Export Wizard, select Team> Team Project Set, then click the Next button.
3. Select the projects to be included in the Team Project Set File by selecting the corresponding check boxes.
4. Enter the location where the Team Project Set File will be saved and click the Finish button.
To create a project from the Team Project Set File:
1. Select File> Import. The Import Wizard displays.
2. Within the Import Wizard, select Team> Team Project Set, then click the Next button.
3. Browse to the desired Team Project Set and click the Finish button. The tests that you selected display in the Test Case Explorer.

Importing Existing Preferences
Existing SOAtest or WebKing users can import preferences from previous versions of SOAtest or WebKing. Preferences hold settings such as previously-used WSDL URLs, Report Center preferences, System Properties for including additional jar files in a classpath, etc. Preferences in previous versions of SOAtest or WebKing were saved in a binary file with a .xtp or .wkp extension in the installation directory of SOAtest or WebKing. To import existing preferences:
1. Select SOAtest> Preferences. The Preferences dialog displays.
2. Select the root SOAtest node within the Preferences dialog and click the Import button.
3. Browse to and select the .xtp or .wkp preference file. The selected preferences are now saved.

Importing Stubs
The Client Tester tool has been renamed to the "Message Stub" tool. In all versions of SOAtest, the configuration settings for any deployed stubs are saved in a stubs.xml file. In SOAtest 5.5.x, this file was saved in the "stubs" folder in the SOAtest install location. In the current version of SOAtest, this file is created in a "stubs" project in the SOAtest workspace. This stubs project is the default (and recommended) location for storing each stub's corresponding .tst file (Client Tester/Message Stub suite).
To migrate stubs from SOAtest 5.5.x:
1. In your SOAtest 6.x workspace, create a project called "stubs" (if it does not already exist).
2. Copy your stubs.xml file and any corresponding stub .tst files to the "stubs" project.
3. In the stubs.xml file, review the file paths to each stub's .tst file and adjust them as needed.
   • In SOAtest 5.5.x, relative paths were resolved relative to the SOAtest install directory.
   • In the current version of SOAtest, the file paths are relative to the stubs.xml file in the "stubs" project.

Familiarizing Yourself with the SOAtest 6.x Interface
Because Parasoft SOAtest is now Eclipse-based, the look and feel is slightly different. However, except for the changes outlined below, the user interface layout, forms, and settings have largely remained unchanged and should remain familiar to existing users.

Test Case Explorer
The Test Case Explorer can have multiple Eclipse projects open at the same time, and each project can have multiple Test Suites open at the same time. In previous versions of SOAtest, only one Test Suite could be open at any given time.


Test Case Explorer Menu Buttons
At the top right corner of the Test Case Explorer are the following menu buttons:
• Refresh: Click to refresh the contents of the Test Case Explorer.
• Collapse All: Click to collapse all of the nodes within the Test Case Explorer.
• Search: Click to perform a search for any node (i.e., Test Suites, tests, chained tools, etc.) within the Test Case Explorer. After clicking the Search button, the following options display:
  • Containing: Enter the text or string contained within the test.
  • Within the whole tree: Select to search for the specified text within the whole tree.
  • Within the selected node: Select to search for the specified text within the selected node.
  • Wrap around: Select to perform a wrap-around search.
  • Case sensitive: Select to perform a case-sensitive search.
• Filter: Select to hide specified projects or tests within the Test Case Explorer.
• Statistics: Select to display statistics (i.e., the number of tests Passed, Failed, Errors, Skipped, and Run) next to each test suite node within the Test Case Explorer.

Editors

Opening Editors with a Double or Single Click
In previous versions, if you wanted to open the configuration panel for a test node (i.e., an "Editor"), you would select that node in the Tests tab. With SOAtest 6.x, you double-click an item's Test Case Explorer node to display its Editor. If you want to change the default double-click behavior to single-click:
1. Select Window> Preferences. The Preferences dialog displays.
2. Within the Preferences dialog, select General on the left, and change the Open mode from Double click to Single click in the right panel.
3. Select General> Editors, enable Close editors automatically, then click the OK button. You will now be able to open editors with a single click.

Opening Multiple Editors
In previous versions of SOAtest and WebKing, only one Editor could be open at once. In SOAtest 6.x, multiple Editors can be open simultaneously.

Saving Changes in Editors
When an Editor is modified in SOAtest 6.x, an asterisk "*" displays on the Editor tab, denoting that the Editor is now "dirty." Modifications to the Editor must be explicitly saved using the Save toolbar button or the Ctrl-S keyboard shortcut.

Environments
In previous versions of SOAtest, environments were displayed in a separate tab below the Tests tab. Starting with SOAtest 6.x, environments are part of the tree view in the Test Case Explorer.


Running a Test
To run a test, right-click the test's node and select Test Using 'Example Configuration' from the shortcut menu. Alternatively, you can press F9 on your keyboard or click the Test toolbar button.

Saving Test Suite (.tst) Files
In previous versions of SOAtest and WebKing, you had to explicitly save test suite (.tst) files. In SOAtest 6.x, user actions in the Test Case Explorer are saved automatically. For instance, adding a new Test to the Test Case Explorer is automatically saved.

SOAtest View and Console View
Failures that occur during test execution now display in the SOAtest view. What was previously displayed in the Messages Log now displays in the Console view.


Source Control Integration
If you have the appropriate source control plugins installed into the Eclipse environment, your Test Suites can be checked into source control directly as follows:
• Right-click the Test Suite node and select Team> Commit from the shortcut menu.
To check in new projects:
• Right-click the Project node and select Team> Share Project from the shortcut menu.

Setting up the Automated Nightly Process using CLI
To set up an automated nightly build from the command line:
1. Start SOAtest on your test machine, then create a workspace containing all the projects and test suites that you wish to run as part of your nightly testing process. For more information, see the Setting Up Projects section above.
2. Configure SOAtest's preferences with any global settings that are required for your tests. To open SOAtest preferences, choose SOAtest> Preferences. If the test suites in your workspace were imported from source control, then you should configure the Source Controls settings. You can set preferences as described in the User Guide section on Importing Existing Preferences.
3. (Optional) Create a test configuration to use for your nightly test run. Test configurations have settings that affect the way in which your tests are executed. SOAtest ships with a test configuration named Example Configuration that you can use if you do not wish to create your own. Your test configurations can be managed by choosing SOAtest> Test Configurations. If the projects in your workspace were created from source control, then you should click the Common tab in your test configuration and enable the Update projects from source control option.
4. (Optional) Create Local Settings (Options) files. These are text files that can be used to control settings for reports, email, Report Center, Team Server, license server, authorship, and source control.
5. Schedule a daily process to invoke SOAtest using the desired command line options. This can be done using a job scheduler mechanism such as Windows Task Scheduler or Unix cron. For example, to run all projects in your workspace you could use the following command:

   soatestcli.exe -data "c:\mySOAtestWorkspace" -showdetails -config "user://Example Configuration" -report "c:\mySOAtestReports" -publish -localsettings "c:\mySOAtestWorkspace\mylocalsettings.properties"
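As a further illustration, a Unix cron entry that launches this kind of run every night at 2:00 AM might look like the following; the soatestcli path, workspace, and report locations are placeholders chosen for the example:

   # crontab entry: nightly SOAtest run at 2:00 AM (paths are placeholders)
   0 2 * * * /opt/parasoft/SOAtest/soatestcli -data "/home/tester/soatest/workspace" -config "user://Example Configuration" -report "/home/tester/soatest/reports" -publish -localsettings "/home/tester/soatest/mylocalsettings.properties"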

Please note that SOAtest's command line interface has been modified and enhanced. For example, the -publish argument will add reports to Team Server which can later be used to import test failures as tasks into your local SOAtest workspace. For a detailed list of changes, see the topic on Command Line Interface (cli) Migration.

Deprecated Features
• Load Testing: This is now available in a separate installable called Parasoft Load Test. The current version will allow you to run your existing SOA and web load tests as well as create new ones. It also allows you to load test complete end-to-end test scenarios, from the web interface, through services, to the database. Every protocol and test type available in Parasoft SOAtest is supported in Parasoft Load Test.
  • Parasoft Load Test includes the full SOAtest product, so if you are interested in both functional testing and load testing, you should install Parasoft Load Test.
• WebKing Paths: WebKing's Path view has been replaced by Test Suite-based functional tests using Browser Testing tools. The primary benefit is that Test Suite-based functional tests support much more complex web applications (such as RIA and AJAX applications). Moreover, the new implementation follows a consistent test configuration paradigm that supports end-to-end testing. Paths in existing .wkj files can be executed in SOAtest 6.x, but they cannot be edited or extended.
• WebKing Publishing: This functionality is not applicable to SOAtest 6.x.
• Capture HTTP Traffic Tool: This tool is no longer supported. If this functionality is needed, a free tool like Wireshark can be used to save the HTTP trace to a file, then the "Generate tests from traffic" option can be used to create tests from it.
• Specific XML Validator Options: The XML DTD preferences and validating against DTD options are no longer available.
• Management Reports: Report enhancements are planned. SOAtest will report all meta-data to Report Center, and Report Center will be able to generate different kinds of reports.
• CLI commands:
  • -run: This command, which is for running custom Python scripts through SOAtest, is deprecated. Please contact Technical Support for assistance migrating scripts to 6.x.
  • -runtest: This command is replaced with new CLI options. See the Migration Guide topic on Command Line Interface (cli) Migration for details.
  • -wsdl
  • -reportAllTraffic
  • -traffic


Command Line Interface (cli) Migration

This topic explains how to migrate your existing SOAtest or WebKing automated nightly process from the legacy command line interface (cli) to SOAtest 6.x's soatestcli. Sections include:
• Command Line Invocations
• Command Line Options

Command Line Invocations
The following list shows the differences in command line invocation between previous versions of SOAtest or WebKing and SOAtest 6.x:
• Windows: wk.exe -cmd [options] (WebKing) or st.exe -cmd [options] (SOAtest) is now soatestcli.exe [options]
• Linux or Solaris: webking -cmd [options] (WebKing) or soatest -cmd [options] (SOAtest) is now soatestcli [options]

Examples

Running a Single Test Suite
SOAtest 5.5 and earlier: If your SOAtest command line invocation to run a single test suite and write a report in HTML format to a file was

   st.exe -cmd -runtest <test suite name.tst> -reportHTML -detailed <report file name>

your SOAtest 6.x command line invocation might be

   soatestcli.exe -config <configuration name> -resource <path to test suite name.tst relative to the workspace> -report <report file>

WebKing: If your WebKing command line invocation to run a single test suite and write a report in HTML format to a file was

   wk.exe -cmd -runtest <test suite name.tst> -reportHTML -detailed <report file name>

your SOAtest 6.x command line invocation might be

   soatestcli.exe -config <configuration name> -resource <path to test suite name.tst relative to the workspace> -report <report file>

Running all Test Suites
SOAtest 5.5 and earlier: If your SOAtest command line invocation to run all test suites within a directory was

   st.exe -cmd -runtest -all <directory path>

your SOAtest 6.x command line invocation might be

   soatestcli.exe -config <configuration name> -resource <directory path relative to the workspace> -report <report file>

WebKing: If your WebKing command line invocation to run all test suites within a directory was

   wk.exe -cmd -runtest -all <directory path>

your SOAtest 6.x command line invocation might be

   soatestcli.exe -config <configuration name> -resource <directory path relative to the workspace> -report <report file>
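To make the mapping concrete, here is an illustrative invocation with the placeholders filled in; the workspace path, project layout, configuration name, and report location are examples only, not values from this guide:

   soatestcli.exe -data "c:\mySOAtestWorkspace" -config "user://Example Configuration" -resource "MyProject/tests/BookStore.tst" -report "c:\mySOAtestReports\BookStore"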

Command Line Options
The following lists show the differences in command line options between previous versions of SOAtest or WebKing and SOAtest 6.x.

Top Level Options
Note that the following options cannot be used together.

• Start stub server
  Previous SOAtest/WebKing option: -startStubServer
  SOAtest 6.x option: -startStubServer
• Run tests
  Previous SOAtest/WebKing option: -runTest
  SOAtest 6.x option: no flag is needed
• Run static analysis on a WSDL
  Previous SOAtest/WebKing option: -runwsdltest
  SOAtest 6.x option: Deprecated, but replaced with equivalent WSDL static analysis options. See the note below for details.
• Execute script
  Previous SOAtest/WebKing option: -run
  SOAtest 6.x option: Deprecated. Please contact Technical Support for assistance migrating scripts to 6.x.
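For example, a stub server might be started headlessly with an invocation along the following lines; this is only a sketch, and the exact combination of flags your stub setup requires (for instance, which resources to load) may differ:

   soatestcli -data "/home/tester/soatest/workspace" -resource "stubs" -startStubServer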

WSDL Static Analysis
There are three ways to statically analyze WSDLs in SOAtest 6.x:
• Apply a policy file when first generating tests (such as the sample policy in the examples directory). This generates a "WSDL Handler" tool that decomposes a WSDL into its imported WSDL and schema parts. The WSDL parts are then chained to WSDL validation rules, and the schemas are chained to schema validation rules.
• Manually create a Coding Standards tool that takes a WSDL or schema file as an input, then enable the desired schema rules in it.
• Have the schema file within your workspace project, then right-click its Navigator view node and run a Test Configuration that has static analysis enabled. SOAtest's built-in Test Configurations include a few sample static analysis configurations, including ones for WSDLs and schemas.


Run Test Options
• Run all tests recursively starting at the specified directory
  Previous SOAtest/WebKing option: -all <directory>
  SOAtest 6.x option: To run all tests in the workspace, no flag is needed. To run all tests in a particular project: -resource <directory path relative to the workspace>
• Ignore a test
  Previous SOAtest/WebKing option: -ignore <file name>
  SOAtest 6.x option: To ignore or include all tests in a sub-folder: -exclude <sub-folder> or -include <sub-folder>. To ignore or include a test from the resources specified with the -resource flag: -exclude <file name> or -include <file name>. The -include flag lets you specify a subset of the resources indicated by the -resource flag. Do NOT start the resources given after the -include/-exclude flags with a '/'.
• Run a specific test file
  Previous SOAtest/WebKing option: <file name>
  SOAtest 6.x option: -resource <path to test suite name.tst relative to the workspace>
• Search and replace router
  Previous SOAtest/WebKing option: -router [matchWhole] <searchURI> <replaceURI>
  SOAtest 6.x option: -router matchWhole <searchURI:URI> <replaceURI:URI>. This feature is now deprecated; please use Environments instead.
• Specify test name patterns
  Previous SOAtest/WebKing option: -testName [-match] <pattern> [-dataSourceRow <row>] [-dataSourceName <name>]
  SOAtest 6.x option: -testName [match:] <test name> -dataSourceRow <row> -dataSourceName <name>
• Specify environment options
  Previous SOAtest/WebKing option: -environment <environment_name>
  SOAtest 6.x option: -environment <environment_name>
• Run tests with a single data source row
  Previous SOAtest/WebKing option: [-dataSourceRow <row>] [-dataSourceName <name>]
  SOAtest 6.x option: -dataSourceRow <row> -dataSourceName <name>
• Report test results to HP (Mercury) TestDirector
  Previous SOAtest/WebKing option: -testDirector <test file> <report file>
  SOAtest 6.x option: -testDirector <testFile:file> <reportFile:file>
• Report test results to Rational TestManager
  Previous SOAtest/WebKing option: -testManager [-v]
  SOAtest 6.x option: -testManagerVerbose
• Specify the browser used for web functional test playback
  Previous SOAtest/WebKing option: -browserType
  SOAtest 6.x option: See the User Guide topic on Configuring Browser Playback Options.
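Putting several of these options together, an illustrative invocation might look like the following; the workspace path, project, excluded file, and environment name are made-up examples:

   soatestcli -data "/home/tester/soatest/workspace" -config "user://Example Configuration" -resource "MyProject" -exclude "MyProject/tests/WorkInProgress.tst" -environment "Staging" -report "/home/tester/soatest/reports"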

Tool Execution Options
• Execute the specified tool
  Previous SOAtest/WebKing option: <toolname>
  SOAtest 6.x option: Run a Test Configuration that executes a .tst file that includes the specified tool. Or, run a Test Configuration that applies the specified type of tool (check links, check spelling, etc.) during static analysis.

Reporting Options
• Generate reports
  Previous SOAtest/WebKing option: -genreport or <report format flag> <report file> (see "Report formats")
  SOAtest 6.x option: -report <report file>
• Show traffic for successful tests
  Previous SOAtest/WebKing option: -reportAllTraffic
  SOAtest 6.x option: Deprecated
• Report formats
  Previous SOAtest/WebKing option: -reportHTML, -reportXML, and -reportPDF
  SOAtest 6.x option: Specified in the local properties file using the following option: report.format=html|pdf|custom
• Detailed versus summary reports
  Previous SOAtest/WebKing option: -detailed and -summary
  SOAtest 6.x option: Specified in the local properties file using the following option: report.developer_errors=true|false
• Mailing reports
  Previous SOAtest/WebKing option: -mail -attach to:[email protected]
  SOAtest 6.x option: Specified in the local properties file using the following options: report.mail.enabled=true|false, report.mail.attachments=true|false, report.mail.cc=[email_addresses], report.mail.include=[email_addresses]. See the User Guide topic on Testing from the Command Line Interface (soatestcli) for details.
• Specify configuration to be used for report generation
  Previous SOAtest/WebKing option: -config:<configuration name>
  SOAtest 6.x option: Specified in the local properties file using the available reporting options.
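For example, local settings entries that request a detailed PDF report and have it mailed might look like the following sketch; the email address is a placeholder:

   report.format=pdf
   report.developer_errors=true
   report.mail.enabled=true
   report.mail.attachments=true
   report.mail.cc=qa-team@example.com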

Report Center Options
• Enable communication with Report Center
  Previous SOAtest/WebKing option: -logger
  SOAtest 6.x option: Specified in the local properties file using the following option: grs.enabled=true|false
• Specify the name of the machine that is running Report Center
  Previous SOAtest/WebKing option: -J-Dlogger.host=<host>
  SOAtest 6.x option: Specified in the local properties file using the following option: grs.server=[server]
• Specify the port to communicate with Report Center
  Previous SOAtest/WebKing option: -J-Dlogger.port=<port>
  SOAtest 6.x option: Specified in the local properties file using the following option: grs.port=[port]
• Enable Report Center communication and send all traffic to Report Center
  Previous SOAtest/WebKing option: -traffic
  SOAtest 6.x option: Deprecated
• Specify custom Report Center attributes
  Previous SOAtest/WebKing option: -grs <attribute name>=<attribute value>
  SOAtest 6.x option: Specified in the local properties file using the following option: grs.user_defined_attributes=[attributes]; use the format key1:value1; key2:value2
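For example, local settings entries for Report Center might follow this pattern; the server name, port, and attribute values shown here are placeholders:

   grs.enabled=true
   grs.server=reportcenter.mycompany.com
   grs.port=[port]
   grs.user_defined_attributes=project:BookStore; run_type:nightly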

Team Server Options
• Send reports to Team Server
  Previous SOAtest/WebKing option: N/A
  SOAtest 6.x option: -publish. Note that -publish uses the Team Server configuration in the GUI by default. Alternatively, you can specify these settings in the local properties file.

Licensing Options
• Specify local license
  Previous SOAtest/WebKing option: -password [expiration date] [password]
  SOAtest 6.x option: Specified in the local properties file using the following options: <tool name>.license.local.expiration=[expiration] and <tool name>.license.local.password=[password]
• Specify license server
  Previous SOAtest/WebKing option: -licenseserver [host]:[port]
  SOAtest 6.x option: Specified in the local properties file using the following options: <tool name>.license.network.host and <tool name>.license.network.port
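For example, assuming the tool-name prefix is soatest (an assumption for illustration; substitute the prefix appropriate for your installation), machine-locked license entries would follow this pattern:

   soatest.license.local.expiration=[expiration]
   soatest.license.local.password=[password]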

Other Options
• Show help
  Previous SOAtest/WebKing option: -dump
  SOAtest 6.x option: -help
• Show version
  Previous SOAtest/WebKing option: N/A
  SOAtest 6.x option: -version
• Installer options
  Previous SOAtest/WebKing option: -initjython, -installcertificate, -uninstallcertificate
  SOAtest 6.x option: -initjython, -installcertificate, -uninstallcertificate
• Specify classpath entries
  Previous SOAtest/WebKing option: -extraClasspath
  SOAtest 6.x option: Specified in the local properties file using the following option: system.properties.classpath
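For example, to check which version is installed or to list the supported options:

   soatestcli -version
   soatestcli -help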

For more details on SOAtest 6.x’s command line interface, see the User Guide topic on Testing from the Command Line Interface (soatestcli).


SOAtest Tutorial
In this section:
• About this Tutorial
• Creating Projects and Test (.tst) files
• WSDL Verification
• Functional Testing
• Scenario Testing
• Advanced Strategies
• Creating and Deploying Stubs
• Testing Plain XML Services
• Extending SOAtest with Scripting
• Asynchronous Testing
• WS-Security
• Design and Development Policy Enforcement
• Automation/Iteration (Nightly Process)
• Running Regression Tests in Different Environments
• Web Functional Testing
• Web Static Analysis


About this Tutorial

The topics presented in this tutorial will guide you through how SOAtest addresses key areas of service and Web application testing. The tutorial demonstrates the creation of various test suites. For your convenience, we've provided a sample SOAtest test suite named SOAtestTutorial.tst, located in the SOAtest examples directory. It contains all the tests that will be created through the tutorial, as well as specific examples referenced throughout. Further information about each test is given in the Requirements and Notes tab of each test suite's configuration panel.

Parasoft SOAtest Best Practices
While reading this document, you will find examples that show the recommended way of creating test cases within SOAtest. When creating tests for your own services and applications, you can use these examples to create your tests in a similar fashion. The following list details the best practices for using SOAtest:
• Using the SOAtest test wizard, create a test suite of WSDL tests that should be run on a nightly basis.
• Using the SOAtest test wizard, create a test suite of SOAP Client tests for each operation defined within your WSDL. These test clients can then be moved into separate test suites for Functional Tests and Scenario Tests to optimize reusability and organization.
• Create positive and negative test cases for each test case to fully maximize the testing coverage of the web service.
• Create regression tests for both positive and negative test cases. Regression tests alert you to any changes in service functionality over time as the service evolves.
• For each distinct testing requirement, create a separate test (.tst) file.
You will learn how to apply these and other best practices throughout the tutorial. As you gain a basic understanding of SOAtest functionality, we strongly recommend reading the SOAtest Best Practices Guide, which is available as SOAtest_Best_Practices.pdf in [SOAtest_install_dir]/manuals.


Creating Projects and Test (.tst) files

A project (an entity created by Eclipse) can contain any number of SOAtest-specific .tst files. Projects can also contain source files you want to analyze with SOAtest, and any other resources that make sense for your environment. Each .tst file can include any number of test suites/scenarios, tools, inputs, and stubs. The organization and structure is up to you. To keep file size down and to improve maintainability, we recommend using one .tst file for each distinct testing requirement.

Creating New Projects
Multiple .tst files can be grouped into a single project. First, we will create a new project based on existing test suites.
1. Choose File> New> Project from Existing SOAtest or WebKing Test Suites.
   • Alternatively, you can choose this command from the pull-down menu for the New toolbar button (top left).
2. Enter Examples in the Project Name field.
3. Specify the location of the project's test suites by clicking Browse, then navigating to [SOAtest_installation_directory]/examples/tests.


4. Click Finish.

The Examples project will be added to the Test Case Explorer. It will contain multiple test (.tst) files. You can also create new projects by selecting different commands (such as Project from WSDL or Project from Web Browser Recording) from the new project wizard. To see all available New Project options, choose File> New> Other and look under the SOAtest folder.


Opening and Closing Test (.tst) Files
By default, .tst files are closed. All open .tst files are loaded into memory. There are two ways to open a .tst file:
• Double-click the .tst file's Test Case Explorer node.
• Right-click the .tst file's Test Case Explorer node, then choose Open Test (.tst) File from the shortcut menu.

Closed .tst files have the following "closed box" icon:

Open .tst files have the following "open box" icon:

Creating New Test (.tst) Files
We recommend that you create a separate test (.tst) file for each distinct testing requirement. There are two ways to create a new test (.tst) file:
• Right-click the project node, and select New Test (.tst) File from the shortcut menu.
• Choose File> New> New Test (.tst) File.

The wizard will guide you through the test case creation process, then add a .tst file containing the generated tests. As you go through the subsequent tutorial lessons, you will learn how to create different kinds of Test Suites.

WSDL Verification
WSDL verification can be considered the first step in testing Web Services. Although WSDLs are generally created automatically by various tools, that does not guarantee that the WSDLs are correct. When WSDLs are manually altered, WSDL verification becomes even more important. Ensuring correct and compliant WSDLs enables your service consumers to function correctly and avoids vendor lock-in, thus achieving interoperability and realizing the SOA goals of service reuse. SOAtest can automatically generate a test suite of comprehensive WSDL tests to ensure that your WSDL conforms to the schema and passes XML validation tests. Additionally, it performs an interoperability check to verify that your web service will be interoperable with other WS-I compliant services. When you complete this section of the tutorial, your test suite should resemble the test suite entitled "WSDL Tests" in the SOAtestTutorial.tst file.

Creating a WSDL Verification Test Suite
For this example we will create WSDL tests for a book store service with the WSDL located at http://soatest.parasoft.com/store-01.wsdl. To verify a WSDL using SOAtest’s WSDL Verification Tests, complete the following: 1. Open the pull-down menu for the New toolbar button (top left), then choose Project from WSDL.

2. Enter a name for the project (e.g., Tutorial) in the Project name field, then click the Next button.

3. In the WSDL URL field, enter http://soatest.parasoft.com/store-01.wsdl

4. Clear the Create Functional Tests from the WSDL check box and select the Create tests to validate and enforce policies on the WSDL check box. 5. Click Finish. Because you selected the Create tests to validate and enforce policies on the WSDL check box, four WSDL tests are automatically created in a separate test suite called WSDL Tests. To see this test suite, open the Test Case Explorer tab and expand the tree.

SOAtest automatically creates the following WSDL tests from a WSDL URL:
• Test 1: Schema Validity: Runs XML validation on the WSDL against the WSDL schemas from W3C.



• Test 2: Semantic Validity: Checks the correctness of the WSDL by parsing and consuming it like an actual service consumer would, but with stricter adherence to standards.
• Test 3: WS-I Interoperability: Verifies the WSDL against WS-I Basic Profile 1.1.
• Test 4: WSDL Regression: Creates a regression control for the WSDL so that changes in the WSDL document can be detected.

6. Select the Test 3: WS-I Interoperability Check node and click the Add test or output toolbar button.

This opens the Add Output wizard, which displays a list of available tools. In addition, a description of the selected tool displays in the Tool Description field. 7. In the Add Output wizard, select Conformance Report from the left pane, select All from the Show dropdown menu, select Browse from the right pane, and click the Finish button. This will send a WS-I Conformance report to your internet browser when you run the test.

8. Select the Test Suite: WSDL Tests node and click the Test toolbar button.

If any errors occur, they will display in the Console dialog located at the bottom of the SOAtest GUI. You can double-click the errors in the right GUI panel for additional information and you can also examine the conformance report that was opened in your internet browser. Note: If you are using Firefox 3.0 or above as your default browser, the XML WS-I sheet may not be read. To fix this problem, select the Conformance Report> Browse node, then in the Project Configuration panel, select a different browser and try running the WSDL Tests node again.
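For a rough idea of what Test 1 (Schema Validity) checks, the short Python sketch below validates a WSDL document against an XML schema using lxml. The local file names are hypothetical, and SOAtest performs this check internally against the W3C WSDL schemas; this is only an illustration.

from lxml import etree

# Hypothetical local copies; SOAtest bundles the W3C WSDL schemas itself.
wsdl_doc = etree.parse("store-01.wsdl")
wsdl_schema = etree.XMLSchema(etree.parse("wsdl-1.1.xsd"))

if not wsdl_schema.validate(wsdl_doc):
    for error in wsdl_schema.error_log:
        print(error.message)   # each schema violation found in the WSDL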

Functional Testing
The best way to ensure the correct functionality of your Web service is to start by creating unit tests for each individual operation implemented by your service. Performing unit testing allows you to catch errors at the component level, making development errors easier to identify and fix. The SOAtest test creation wizard will automatically create a test client for each operation defined within your WSDL. These tests can then be moved into separate test suites, creating one test suite for each test case, allowing you to organize and structure your testing environment to maximize readability and reusability. For example, if your WSDL defines five operations, the SOAtest wizard will generate five test clients in a single test suite. These five test clients can then be separated into five separate test suites, each containing a unit test for a single operation. In this example a simple book store service is used. It provides the following operations:
• getItemById(int): Returns the book entry with the given item id. Currently valid values are 1, 2, 3, 4, 5 and 6.
• getItemByTitle(String): Returns a list of Book objects that matched your title search query. The item price value returned by this operation increases by $1.00 every 5 invocations. Example keywords: linux, java, C++, program. Leave it blank to get ALL the books in the database.
• placeOrder(int, int): Takes an item id and a quantity, and returns an "Order" object which includes a Book object, a quantity, and a unique order number.
• getPendingOrders(): Returns a list of orders that have been submitted using placeOrder(int, int) so far.
• removeOrder(int): Takes an order number, removes it from the pending orders, and returns a string with a result message (success or failure, etc.). As you might expect, the order numbers it accepts are the same as the ones returned by placeOrder(int, int).
• confirm(): Confirms the currently pending orders. Subsequent calls to getPendingOrders() or removeOrder(int) will return nothing.
• addNewItem(Book): Enables you to add new book entries into the database (virtually). Feel free to add anything you want; it will not really add them to the permanent database. New entries only live throughout your session.

When you complete this section of the tutorial, your test suite should resemble the test suite entitled "Unit Tests" in the SOAtestTutorial.tst file.

Creating Test Suites for Unit Tests
For this example, we add a new Test (.tst) file to the project created in the previous lesson.

1. Right-click the project from the previous exercise, then choose New Test (.tst) File from the shortcut menu.

2. Enter a name for the file (e.g., functional test lesson), then click Next. 3. Select SOA> WSDL, then click Next.

4. Select http://soatest.parasoft.com/store-01.wsdl from the WSDL URL field.

5. If it is not already selected, enable the Create functional tests from the WSDL checkbox.

6. Click the Next button four times to proceed to the Layout dialog. 7. Enable the Organize as Positive and Negative Unit Tests checkbox.

8. Click the Finish button. The newly created test suite displays in the Test Case Explorer. 9. Double-click the new Test Suite: Test Suite node.

10. In the test suite configuration panel (on the right side of the GUI), enter Functional Tests in the Name field, then click the Save toolbar button. Within the Functional Tests test suite, there is a Test Suite: ICart node that includes seven other test suites which test each operation of the WSDL. 11. Right-click the Test Suite: ICart node, then choose Expand All. All seven test suites become visible, along with each test within each test suite.

Each of the seven test suites contains both a positive and a negative test for its operation, since it is important to test situations where we send expected data as well as unexpected data to the server. 12. Double-click the Test Suite: getItemByTitle Positive Test> Test 1: getItemByTitle node.

13. Open the Request tab in the test configuration panel, type Linux in the titleKeyword entry field, then click Save. We will be searching for books with keyword Linux.

14. Select the Test 1: getItemByTitle node and click the Test tool bar button. The getItemByTitle operation is invoked with the parameter Linux. 15. Expand the Test 1: getItemByTitle node and double-click the Traffic Object> Traffic Viewer node underneath.

The HTTP Traffic panel opens and displays the traffic that was logged from the test run. 16. Right-click on the Test Suite: getItemByTitle Positive Test node and select Create/Update Regression Control from the shortcut menu, then choose Create Internal Regression Control in the Response Validation Wizard that opens.

SOAtest automatically runs the test and creates a regression control populated with the value received from the server.

We now have a functional test that tests the getItemByTitle operation of our web service on a single input value. The same sequence of actions can be done to create functional tests for the other operations defined within the WSDL.

Ignoring XPath Values
When creating regression controls, it may be helpful to ignore dynamic values such as timestamps or session variables that can cause your regression test to fail. In the bookstore example, the “price” element is a dynamic value, with the price of the book increasing by $1 every five times the test is run. In this example we will set up an XPath Property to globally ignore the “price” element value in all tests. 1. Run Test 1: getItemByTitle a few times. Notice that after a few test runs, the regression test fails and a task is reported in the Example Configuration view. This is because the price element has changed. In this case we want to ignore the value of the price element. 2. Right-click on the error message in the SOAtest view and select Ignore XPath from the shortcut menu.

An Ignored XPaths Settings dialog displays. The XPath of the price element, /Envelope/Body/getItemByTitleResponse/Result/i/price, is automatically populated in the XPath field.

3. Make sure the Recursive, Text Content, and Modify checkboxes are selected and click OK. This will instruct the regression test to recursively ignore any modifications to the text content of the price element. 4. In Test 1: getItemByTitle, double-click the Response SOAP Envelope> Diff control node. 5. Open the Ignored Differences tab in the test configuration panel. The Ignored Differences dialog displays. Notice that the XPath of the price element has been added to the Ignored XPaths List.

All of the price element values with the specified XPath are now being ignored. Run the functional test again and it will succeed.
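For illustration only, the sketch below shows the idea behind ignoring an XPath: neutralize the matched nodes in both the regression control and the actual response before comparing them. The namespace-agnostic XPath and the use of lxml are assumptions for this example; SOAtest applies the Ignored XPaths List inside its own Diff tool.

from lxml import etree

IGNORED_XPATH = "//*[local-name()='price']"   # stands in for the price XPath selected above

def blank_ignored(xml_bytes):
    doc = etree.fromstring(xml_bytes)
    for node in doc.xpath(IGNORED_XPATH):
        node.text = ""                        # drop the dynamic value before diffing
    return etree.tostring(doc)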

Using the XML Assertor
You can use the XML Assertor tool to enforce the correctness of data in an XML message. It is most commonly connected to a SOAP Client or Messaging Client in order to verify the data returned by a service. The XML Assertor provides support for complex message validation needs without the need for scripting, and allows you to easily create and maintain validation assertions on your XML messages. To use the XML Assertor, complete the following:

1. Right-click the Test 1: getItemByTitle node from the previous exercise and select Add Output from the shortcut menu.

2. In the Add Output wizard, select Response> SOAP Envelope on the left, select XML Assertor on the right, and click the Finish button.

3. Double-click the Response SOAP Envelope> XML Assertor node that was added underneath the Test 1: getItemByTitle node.

4. In the XML Assertor panel, open the Configuration tab and click Add.

5. In the Select Assertion wizard, expand Value Assertions, select String Comparison Assertion, then click Next.

The String Comparison Assertion dialog displays a tree view of the XML message from which you can select a single value to enforce. 6. Select the title element from the String Comparison Assertion dialog and click the Finish button.

The Configuration tab of the XML Assertor is now populated with a String Comparison Assertion. 7. In the XML Assertor’s Configuration tab, select contain from the Element must drop-down menu, and enter Linux in the Expected Value field of the Configuration tab.

8. Save the changes to the XML Assertor Configuration. 9. Click the Test toolbar button. The test succeeds. You may add additional assertions to apply to the message (such as a Numeric Assertion on the price element) by clicking the Add button in the XML Assertor’s Configuration tab.
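As a rough, standalone equivalent of the assertions described above (a string containment check on title, plus the optional numeric check on price), consider the following sketch. The lxml-based extraction is an assumption for illustration only, not how the XML Assertor is implemented.

from lxml import etree

def assert_get_item_response(xml_bytes):
    doc = etree.fromstring(xml_bytes)
    title = doc.xpath("string(//*[local-name()='title'])")
    price = doc.xpath("string(//*[local-name()='price'])")
    assert "Linux" in title, "title element must contain 'Linux'"
    assert float(price) > 0, "price element must be a positive number"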

Automate Testing Using Data Sources
Now that we have created a unit test that exercises a single input value, the next step is to add a data source. Adding a data source allows you to test multiple input values with a single test case. 1. Right-click on the root test suite node Test Suite: Functional Tests and select Add New> Data Source from the shortcut menu.

2. Select Excel from the New Project Data Source wizard and click the Finish button.

3. In the Data Source configuration panel, complete the following: a. Enter Books in the Name field. b. Click the File System button to navigate to and select the Books.xls file that is included in the SOAtest examples/datasources directory. c. Click Save.
d. Click the Show Columns button to display the column names from the Excel Spreadsheet.

4. Go back and double-click the Test 1: getItemByTitle node from the previous exercise. Books should already be selected from the Data Source drop-down menu that is now present in the test configuration panel.

5. For the titleKeyword drop-down menus at the bottom of the test configuration panel, select Parameterized and Keywords and then click the Save toolbar button.

6. Click the Test toolbar button and notice the error messages that appear in the SOAtest view. The test ran one time for each row in the Keywords column, but failed due to the XML Assertor we created previously. We need to remove that assertion and update our regression control. 7. Right-click the Response SOAP Envelope> XML Assertor node and select Delete from the shortcut menu. 8. Right-click the Test 1: getItemByTitle node and select Create/Update Regression Control. 9. In the Response Validation wizard, expand the Update Regression Controls node, select Update All Controls, and click the Finish button.

10. Select the Test 1: getItemByTitle node and click the Test toolbar button. SOAtest adds new regression controls for each test run. In this case, four regression controls are added: one for each row of the data source. 11. Double-click the Traffic Object> Traffic Viewer node beneath the Test 1: getItemByTitle node and notice that the test ran four times, once for each value in the Keywords column.
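Conceptually, parameterizing a test with a data source simply repeats the same request once per row. The sketch below assumes the Keywords column has been exported to a hypothetical CSV file; it is an illustration of the iteration, not of SOAtest internals.

import csv

def keywords(path="Books.csv"):                    # hypothetical CSV export of Books.xls
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield row["Keywords"]

for keyword in keywords():
    # SOAtest would invoke getItemByTitle(keyword) here and check its regression control
    print("row value:", keyword)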

Separating Tests into Positive and Negative Test Cases
When creating test cases, it is important to test situations where we send expected data as well as unexpected data to the server. The server must send the correct responses to valid requests, and it is just as important that it knows how to handle invalid requests. In this example we will examine the Negative Unit Test within the Test Suite: getItemByTitle Unit Tests node. 1. Expand Test Suite: getItemByTitle Unit Tests and Test Suite: getItemByTitle Negative Test, then double-click the Test 1: getItemByTitle node. 2. In the Test 1: getItemByTitle node located in the Negative Tests test suite, parameterize the titleKeyword element using the Bad Keywords column.
In the negative test cases we are sending our service unexpected data and verifying that it returns the correct response or error response. 3. Save your changes to Test 1. 4. Right-click on the test, select Create/Update Regression Control, select Create Regression Control, click Create Multiple Controls, then click Finish. New regression controls are created for each test run. No tasks are reported in the SOAtest view. This is the correct behavior.

Testing Invalid Data
It is useful to test situations in which invalid data is sent to your service, for example, sending a string when your service expects an integer. 1. Select the Test Suite: Functional Tests node and click the Add Test Suite button.

2. Select Empty suite from the Add Test Suite wizard and click the Finish button.

3. Double-click the new Test Suite: Test Suite node that was added to the test suite tree. 4. In the test suite configuration panel (on the right side of the GUI), enter Sending Bad Data in the Name field and click the Save toolbar button. 5. Expand Test Suite: getItemById Unit Tests, then expand Test Suite: getItemById Negative Test, then copy Test 1: getItemById.

6. Paste Test 1: getItemById into the new Sending Bad Data test suite.

7. Double-click the Test 1: getItemById node within the Sending Bad Data test suite.

8. In the test configuration panel’s Form Input view, right-click the id element and disable (uncheck) Enforce Schema Type from the shortcut menu. This tells SOAtest to allow sending data for the id element that does not conform to the schema; in this case, the schema indicates that the id element is an int, but we'll send a string instead.

9. Enter the literal string Bad Data as the Fixed Value for the id element, then click the Save toolbar button.

10. Click the Test toolbar button. 11. After the test completes, view the traffic by expanding the Test Suite: Sending Bad Data> Test 1: getItemById branch, and then double-clicking Traffic Object> Traffic Viewer.

12. Open the Traffic Viewer’s Response tab. Notice that an exception is thrown and displayed in the Response traffic.

13. Right-click the Test 1: getItemById node within the Sending Bad Data test suite and select Create/Update Regression Control from the shortcut menu.

14. In the Response Validation Wizard, select Create Regression Control, then click Finish.

Scenario Testing
After unit tests have been created, they can be leveraged into scenario-based tests without any additional work. Scenario tests allow you to emulate business logic or transactions that may occur during normal usage of the web service. This also allows you to find bugs that may surface only after a certain sequence of events.
NOTE: If at any time during this exercise you receive a session expiration error message from the server, select SOAtest> Preferences. In the Misc tab of the SOAtest Preferences dialog box that opens, click the Reset Cookies button, and then click OK.
The scenario test example given in the test suite “Scenario Test - Search, Place Order, and Remove” represents a typical sequence of operations that a customer may invoke when using a book store web service. In this case it represents a situation where a customer searches for a book, places an order for that book, and then removes the previously placed order. This scenario test introduces a tool called the XML Data Bank. This tool allows you to extract XML element values and store these values in memory to be used in later tests. In this example you will be storing the book ID returned by the service after searching for a book, and then in the subsequent test, use that ID to purchase the book. You will also store the order number returned after placing an order for the book, and then in the subsequent test, use that order number to remove the order from the system. When you complete this section of the tutorial, your test suite should resemble the test suite entitled "Scenario Test - Search, Place Order, and Remove" in the SOAtestTutorial.tst file.

Creating a Scenario Test Suite
To create this scenario, perform the following: 1. Select the Test Suite: Functional Tests node from the previous exercise and click the Add Test Suite button.

2. Select Empty from the Add Test Suite wizard and click the Finish button. 3. Double-click the new Test Suite: Test Suite node that was added to the test suite tree. 4. In the test suite configuration panel (on the right side of the GUI), enter Scenario Test - Search, Place Order, and Remove into the Name field, and then click the Save toolbar button. 5. Copy the positive getItemByTitle, placeOrder, and removeOrder test nodes from the previously created Functional Tests test suite and paste them into the Scenario Test - Search, Place Order, and Remove test suite. If needed, you can drag and drop to reorder them.

These three tests represent a typical business transaction a customer may invoke and will be the basis for our scenario test.

Configuring an XML Data Bank
To configure the XML Data Bank, complete the following: 1. Double-click the Test 2: placeOrder node in the Scenario Test - Search, Place Order, and Remove test suite. 2. In the test configuration panel, select Books from the Data Source drop-down menu at the top right.

3. Select Parameterized and Use Data Source Wizard from the itemId element drop-down menus.

4. Complete the Parameterize with Value From Existing Test Response dialog as follows so that when this test is run, the value stored from Test 1 will be automatically inserted as the value for the itemId element: a. Select Test 1: getItemByTitle from the Test menu at the top of the dialog. b. Select the id element from the Expected XML tree and click the Add button. The id element displays in the Selected XPaths list with a Data Source column name corresponding to the selected test.

c. Click the OK button.

Test 1:id now displays in the right GUI panel as a parameterized value for itemId. You will also notice that a Response SOAP Envelope> XML Data Bank node now appears underneath the Test 1: getItemByTitle node in the Scenario Test - Search, Place Order, and Remove test suite. 5. In the test configuration panel, enter a Fixed value of 3 for the quantity element, then click the Save toolbar button.

6. Double-click the Test 3: removeOrder node. 7. Select Books from the Data Source drop-down menu in the right GUI panel and select Parameterized and Use Data Source Wizard from the orderNumber element drop-down menus. 8. Complete the Parameterize with Value From Existing Test Response dialog as follows so that when this test is run, the order_number element value stored from Test 2 will be automatically inserted as the value for the orderNumber element: a. Select Test 2: placeOrder from the Test menu at the top of the dialog.

b. Select the order_number element from the Expected XML tree and click the Add button. The order_number element displays in the Selected XPaths list with a Data Source column name corresponding to the selected test.
c. Click the OK button.

Test 2:order_number now displays in the test configuration panel as a parameterized value for orderNumber. You will also notice that a Response SOAP Envelope> XML Data Bank node now appears underneath the Test 2: placeOrder node in the Scenario Test - Search, Place Order, and Remove test suite.

9. Click the Save toolbar button. 10. Select the Scenario: Scenario Test - Search, Place Order, and Remove node and click the Test toolbar button. When this test is run, the order_number element value stored from Test 2 will be automatically inserted as the value for the orderNumber element.

11. Explore the traffic by expanding Scenario: Scenario Test - Search, Place Order, and Remove and double-clicking each test’s Traffic Object> Traffic Viewer nodes. 12. Notice that the itemId of the book returned from Test 1 is used as the input for Test 2. Also, the order_number of the order placed in Test 2 is used as the input for Test 3. 13. Right-click the Scenario: Scenario Test - Search, Place Order, and Remove node, then select Create/Update Regression Controls. 14. In the Response Validation Wizard, expand the Update Regression Controls node, select Create Regression Controls, and click the Finish button. The tests are run and a Regression Control is added to each SOAP Client test. 15. Select the Scenario: Scenario Test - Search, Place Order, and Remove node and click the Test toolbar button. Notice that all the tests now fail. 16. Examine the error messages that appear in the SOAtest view. These regression failures are due to dynamic content that appears within the response messages. In the following steps we will ignore elements with this type of dynamic data. 17. In the SOAtest view, right-click on the first error reported under each Test Suite node and select Ignore XPath from the shortcut menu. In the Ignore XPath Settings dialog that displays, click the OK button. You should ignore two XPaths in this step. 18. Select the Test Suite: Scenario Test - Search, Place Order, and Remove node and click the Test toolbar button. All the tests should now succeed. You have now created a fully functional scenario test that tests one possible business transaction that may occur during normal usage of the book store service. For extra practice you can try to create other scenarios that may occur. Negative test cases could also be created for expanded test coverage.
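Conceptually, the XML Data Bank chaining configured above extracts a value from one response and injects it into the next request. The sketch below illustrates that flow; the XPaths, helper names, and service calls are assumptions for illustration, not SOAtest internals.

from lxml import etree

def extract(xml_bytes, local_name):
    # pull the text of the first element whose local name matches
    return etree.fromstring(xml_bytes).xpath(
        "string(//*[local-name()='%s'])" % local_name)

# item_id  = extract(get_item_by_title_response, "id")        # stored by the first XML Data Bank
# order    = place_order(item_id, quantity=3)                 # value injected into Test 2
# order_no = extract(order, "order_number")                   # stored by the second XML Data Bank
# remove_order(order_no)                                      # value injected into Test 3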

Advanced Strategies
This lesson covers advanced strategies that will help you develop more robust, reusable test suites. Sections include:
• Creating Reusable (Modular) Test Suites
• Looping Until a Test Succeeds or Fails - Using Test Flow Logic

Creating Reusable (Modular) Test Suites
In many cases, you may want to create a test suite that can be reused by other test suites. A common example is a test suite that logs in to a web site. Once such a test suite is created, it can be used by other test suites in various scenarios that require a login. Two SOAtest features are especially helpful for the creation and use of reusable test suites:
• Referenced test suites: Once a reusable module or test suite has been created, it can be referenced by another test suite.
• Test variables: You can parameterize tests with test variables, which can be set to specific values from a central location, extracted from a test (e.g., through a data bank tool), or set from data sources.

This lesson demonstrates how those two features can be used to create and use reusable test suites. For simplicity, we will use the store web service; however, the principles and steps below can then be applied to any scenario you create. To create a reusable test suite: 1. Create an empty test suite (.tst) file called ReusableModule.tst as follows: a. Right-click the Examples project node in the Test Case Explorer, then choose New test (.tst) file. b. Under File name, enter ReusableModule. c. Click Next.

d. Select Empty, then click Finish. 2. Create a SOAP Client test as follows: a. Expand the ReusableModule.tst Test Case Explorer node. b. Right-click the Test Suite: Test Suite node, then choose Add New> Test. c. In the dialog that opens, select SOAP Client, then click Finish.

3. Configure the SOAP Client test as follows:

a. In the test configuration panel’s WSDL tab, enter http://soatest.parasoft.com/store-01.wsdl for the WSDL URL.
b. In the Request tab, set Operation to getItemByTitle.
c. Save the changes to the SOAP Client test.

4. Define a test variable as follows: a. Double-click the Test Suite: Test Suite node to open the test suite configuration panel. b. In the Test Variables tab, click the Add button. c. In the dialog’s Name field, enter title variable.
d. Change Type to Data Source. e. Keep the selection at Use value from parent test suite (if defined). This will allow this variable to use values set in the test suite that references this test suite. If the selection is changed to Use local value, the value of the variable will always be the value specified in the Value field. f. Enter store for Data Source Name and enter title for Column Name. This specifies that we expect a test suite that references this test suite to have a data source named store with a column named title.

g. Enter Java for Value. This is the default value that will be used if SOAtest does not find a data source named store with a column named title.

h. Click OK.
i. Save the test suite configuration changes.

5. Configure the SOAP Client test to use the specified data source values, if available, as follows: a. In the SOAP Client’s test configuration panel, go to the Request tab and change Fixed to Parameterized. b. Select title variable in the combo box.

c. Save the changes to the SOAP Client test.

6. Run the test suite by selecting ReusableModule.tst, then clicking the Run toolbar button. 7. Double-click the SOAP Client test’s Traffic Viewer node to see the traffic. Note that the titleKeyword used was Java. SOAtest used the default variable value because it did not find the specified data source (since we have not created it yet).

8. Create an empty test suite (.tst) file called TestStoreTitles.tst as follows: a. Right-click the Examples project node in the Test Case Explorer, then choose New test (.tst) file. b. Under File name, enter TestStoreTitles. c. Click Next.

d. Select Empty, then click Finish. 9. Add a data source to that test suite as follows: a. Right-click the Test Suite: Test Suite node, then choose Add New> Data Source. b. Select Table, then click Finish. c. In the data source configuration panel, change the name to store.
d. Add a column named title.

e. Add two values to that column: Linux and C++.

f. Save the data source changes.

10. Configure this test suite to reference the first test suite we created in this exercise as follows: a. Right-click Test Suite: Test Suite and choose Add New> Test Suite. b. Select Reference Test (.tst) File. c. Click Finish.

d. Select ReusableModule.tst, then click Open. 11. Run the current test suite by selecting TestStoreTitles.tst, then clicking the Run toolbar button. 12. Double-click the Traffic Viewer node to see the traffic.

13. Verify that Linux, then C++ were used as the titleKeyword.
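The variable resolution used in this exercise can be summarized as: prefer the value supplied by the referencing suite's data source, and fall back to the local default otherwise. The sketch below is only an illustration of that rule; the dictionary shape is an assumption.

def resolve_title_variable(parent_data_sources, default="Java"):
    # parent_data_sources maps data source names to {column: value} rows
    store_row = parent_data_sources.get("store", {})
    return store_row.get("title", default)

# resolve_title_variable({})                              returns "Java" (default used)
# resolve_title_variable({"store": {"title": "Linux"}})   returns "Linux"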

Looping Until a Test Succeeds or Fails - Using Test Flow Logic
In many cases, you may want to have SOAtest repeatedly perform a certain action until a certain condition is met. Test suite flow logic allows you to configure this. SOAtest allows you to choose between two main test flow types:
• While variable: Repeatedly perform a certain action until a test variable condition is met.
• While pass/fail: Repeatedly perform a certain action until a pass/fail condition is met (e.g., one of the tests in the test suite either passes or fails).

To see how while pass/fail logic allows you to have a test suite loop until a specified price value is reached, complete the following: 1. Create an empty test suite (.tst) file called TestFlowLogic.tst as follows: a. Right-click the Examples project node in the Test Case Explorer, then choose New test (.tst) file. b. Under File name, enter TestFlowLogic. c. Click Next.

d. Select Empty, then click Finish. 2. Open the test suite configuration panel by expanding the test suite, then double-clicking the Test Suite: Test Suite node.

3. Open the Execution Options> Test Flow Logic tab.

4. Set the Flow type to While pass/fail. 5. Set the Maximum number of loops to 20. 6. Leave the Loop until one of the test(s) setting at Succeeds.

7. Save the test suite configuration settings. 8. Create a SOAP Client test as follows. a. Right-click the Test Suite: Test Suite node, then choose Add New> Test. b. In the dialog that opens, select SOAP Client, then click Finish. 9. Configure the SOAP Client test as follows: a. In the test configuration panel’s WSDL tab, enter http://soatest.parasoft.com/ store-01.wsdl for the WSDL URL.

b. In the Request tab, set Operation to getItemByTitle.

c. Enter Java as the value for titleKeyword.

d. Save the changes to the SOAP Client test. 10. Create a regression control for that test as follows: a. Right-click the SOAP Client test node and select Create/Update Regression Control. b. Select Create Regression Control. c. Click Finish.

11. Modify the expected price in the regression control as follows: a. Double-click the newly-created Diff control node to open the Diff Tool editor.

b. Modify the price from 76.0 to 78.0. The store service increases the price of the book by $1.00 after several calls to getItemByTitle, so this is the expected value after several test iterations.

12. Run the current test suite by selecting the test suite node, then clicking the Run toolbar button. The Test Suite will succeed because a price of 78.0 will be reached before looping 20 times.

13. Double click on the previously-created Diff Control node to re-open the Diff Tool editor. 14. Modify the price from 78.0 to 150.0.

15. Run the Test Suite again. The Test Suite will fail because a price of 150.0 is never reached, even after looping 20 times.
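The While pass/fail behavior exercised above can be summarized by the following sketch: rerun the suite until the watched test succeeds or the loop limit is hit. This describes the semantics only and is not SOAtest code.

def run_until_success(run_suite, max_loops=20):
    for _ in range(max_loops):
        if run_suite():        # True once the response matches the regression control
            return True
    return False               # e.g., the 150.0 price above is never reached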

Creating and Deploying Stubs
Evolving services in a distributed SOA environment, and across multiple teams, is a complex endeavor due to the interdependencies between the system and business processes. For example, in a system that incorporates multiple endpoints such as credit card processing, billing, shipping, etc., it may be difficult for one team to test the responses from another team without interrupting normal business transactions. With SOAtest’s stub generation capability, you can test complex and distributed environments by creating stubs in a number of ways:
• Create a functional test that models the scenario that you want emulated (you simply interact with the actual components to be emulated), then have SOAtest automatically generate stubs that emulate the behavior monitored when executing the modeled scenario.
• Emulate services based on real-world historical data (request/reply sets) collected from the runtime environment.
• Create an emulated version of completely unavailable services "from scratch"; for example, you can use spreadsheets or other data sources to define the desired behavior, and then visually correlate request parameter values with desired response values.

Creating Stubs from Functional Tests
In the following exercises, we will create stubs that emulate an existing Web service. The stubs will be created automatically from an existing SOAtest test suite that tests that service. We will also deploy the stubs locally and use them for testing. To create a stub from functional tests: 1. Open the Test Suite named StubClient.tst in the examples/tests directory. This is a Test Suite composed of three tests driven by a data source of three rows. The test suite consumes the actual Book Store web service described by the WSDL at http://soatest.parasoft.com/store-01.wsdl. 2. Select the main Test Suite: StubClient node and click the Test toolbar button. Nine tests will run and succeed. From these tests, we will automatically generate a stub. 3. Right-click the Test Suite: StubClient node and select Create Stub from the shortcut menu.

4. To accept the default file name, click Next.



• By default, SOAtest will create a stub Test Suite named StubClientStub.tst and save it in a "stubs" project, which will be added to your workspace if it does not already exist.

5. To accept the default deployment settings, click Finish.
• By default, SOAtest will deploy the stub at the Endpoint http://localhost:9080/servlet/StubEndpoint?stub=StubClient.

SOAtest automatically creates the stub based on the existing Test Suite and saves it as StubClientStub.tst. It then deploys the stub on the local stub server at the default endpoint.

Viewing Stub Deployment Settings
To view the deployment settings for the stubs deployed in the previous exercise, complete the following: 1. Select Window> Show View> Stub Server. The Stub Server view displays in the bottom GUI panel. 2. Expand the Local Machine node and double-click the StubClient node.

The deployment settings for the StubClient stub will display.

Validating the Deployed Stubs
In a real-world situation, you would want to validate that the stubs behave properly before you point your application to the stubs instead of the actual resource that the stubs are emulating. To use SOAtest to validate that the deployed stubs are working as expected, perform the following: 1. Expand the Environments node located in the Test Case Explorer view. 2. Right-click the Stub Environment node and select Set as Active Environment from the shortcut menu. This configures SOAtest to interact with the stubs, so you can ensure that they are working correctly before you configure your application to access them.

3. Click the Test toolbar button. The SOAP Client tests will now exercise the emulated services rather than the actual ones. The tests will run and succeed. If you examine the traffic, it should confirm that the stubs are behaving as expected. In a real-world situation, you would now configure your application to access the HTTP Endpoint for StubClient: http://<localhost>:9080/servlet/StubEndpoint?stub=StubClient

Modifying Stub Behavior
To modify the stub behavior, complete the following: 1. Locate and expand the Test Case Explorer node for StubClientStub.tst (the test suite that was added when you generated stubs for StubClient.tst). This is located in the "stubs" project. 2. Double-click the Test 1: getItemByTitle node. This is the stub for the getItemByTitle operation.

3. Open the test configuration panel’s Response tab, and select Response 1 within the Message Body tab. Notice the end of the XPath Function: *[local-name(.)="titleKeyword"]/text()="Linux". This stub returns the specified XML if the XPath function succeeds on the request XML (that is, if the titleKeyword is Linux); a sketch of this correlation appears at the end of this section.

4. Open the Message subtab (within the Response tab) and modify the title element to return Linux Hacking Handbook instead of Linux Administration Handbook.

5. Click Save. 6. In the Stub Server tab, right-click the Local Machine node and select Re-Deploy All Stubs from the shortcut menu to re-deploy the modified stub.

7. Click the Test toolbar button. Now when you run the StubClient.tst test suite, the response for getItemByTitle(Linux) will contain the modified title. Again, this validates that the stub is behaving as expected.

Using this iterative modify-validate process, you can customize the behavior of the stubs. You can also modify the XPath functions in a similar fashion.
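For illustration, the sketch below evaluates an XPath function like the one shown in step 3 against an incoming request to decide whether a canned response applies. The absolute form of the path and the lxml evaluation are assumptions; the stub server performs this matching itself.

from lxml import etree

CORRELATION = '//*[local-name(.)="titleKeyword"]/text()="Linux"'

def request_matches(request_xml_bytes):
    # True when the request's titleKeyword element is exactly "Linux"
    return bool(etree.fromstring(request_xml_bytes).xpath(CORRELATION))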

Creating Stubs From a Traffic Trace
If you can access a message log or a trace of the traffic between clients and servers, then you can create stubs from that data. Specifically, you can have SOAtest automatically create Message Stubs that respond to incoming messages in a way that mimics the behavior captured in the traffic. Such message traces or logs can be captured at the network level using network sniffing tools such as the free WireShark tool (http://www.wireshark.org/), or obtained by having your application log its traffic. The format that SOAtest expects is fairly loose. It can parse a wide range of formats, as long as it can find HTTP headers and request/response messages in sequence. To create stubs from a traffic trace: 1. Delete the store stub created in the previous lesson (go to the stubs project, right-click StubClientStub.tst, then choose Delete). This will delete the file and cause the stub to be undeployed (it will be removed from the Local Machine folder in the Stub Server view). 2. Right-click the stubs project and choose New Test (*.tst) File.

3. Ensure that stubs is selected under Enter or select the parent folder, set the file name to StoreStub, then click Next.

4. Select SOA> Other> Traffic, then click Next.

5. For Traffic file, browse to the store.txt file found under [soatest install dir]/examples/traffic. This file was saved by using WireShark to trace traffic between a client and the Parasoft book store service.

6. Select Generate Server Stubs, then click Next.

7. Give the stub a name (such as StoreStub). 8. For Value of "stub" parameter, specify the value ClientStub (the sample StubClient.tst will invoke the stub when it sees this expected value).

9. Click Next. SOAtest will now display a series of steps: one for each Web service operation that it found in the file. 10. For Test 1: getItemByTitle() , leave the default automatic setting – we know that this operation expects a single parameter in the request, so SOAtest can recognize that single parameter value as the determinant for response messages.

11. Click Next. 12. For Test 2: placeOrder(): a. Choose Select relevant parameters. b. Click New. c. Under the SOAP Envelope tree, select itemId (SOAtest will generate the XPath for that request parameter), then click OK.

d. Click New again.

e. This time, select quantity, then click OK.

You should now have two entries listed under the Request Parameters table. We manually selected these parameters because there are multiple parameters and we want the values of both to determine the response. In other words, both parameters are significant and the value of each one of them should affect the response message that is returned by the stub. Requests will often have many more parameters, but for stubbing purposes, only a few are usually significant for determining the responses from a testing perspective. 13. Click Finish. (If you clicked Next instead, you would leave the Test 3: confirm() option on Automatic, since this operation does not take any request parameters.) SOAtest will now create StoreStub.tst under the stubs project folder, and deploy it immediately on the local machine. You may test this stub by invoking the StubClient.tst we used in the last lesson; you just need to set it to “Stub Environment” so it invokes the local stub on the local machine instead of the real service on the Parasoft server.

Notice the traffic in that test. The stub you created responds with the same messages as the real service.

Creating Stubs for an Application that Does Not Exist Yet
When an application is not accessible for the purpose of configuring a stub, you can create a stub from scratch and configure its behavior with data sources. The use of data sources allows for easier maintainability. With data sources, modifying the behavior or adding additional cases is as simple as changing values in a spreadsheet. To create stubs from scratch: 1. Right-click the stubs project and choose New Test (*.tst) File. 2. Make sure stubs is selected under Enter or select the parent folder, set the file name to DataSourceStoreStub, then click Next. 3. Select SOA> WSDL, then click Next. 4. In the WSDL URL, specify the store WSDL location http://soatest.parasoft.com/store-01.wsdl

5. Select Generate Server Stubs and click Finish.

SOAtest will create 7 Message Stub tests: one for each of the operations defined in the WSDL. 6. Delete all the tests except Test 4: getItemByTitle Service. 7. Double-click the getItemByTitle Service test to open its editor. 8. Open the Response tab.

9. Right-click getItemByTitleResponse, then choose Populate.

10. In Number of sequence (array) items, specify 1, then click OK.

This will add the optional items elements so we can generate a data source table with columns for all the elements. 11. Save the test.

12. Right-click getItemByTitleResponse, then choose Generate CSV Data Source.

13. For the CSV File Destination, click Workspace, then select the stubs project. 14. Click OK.

A data source will be generated and configured for the test suite.

15. Save the test. Notice how all the parameters are now parameterized and mapped to data source columns.

16. Switch to the Navigator view, double-click getItemByTitleResponse.csv, then edit the file and add values to it (you can edit it with Excel, or you can replace it with the sample file included under [soatest install dir]/examples/datasources).

17. Go to the Data Source Correlation tab.

18. Under the Request XML Message section, click Add.

19. Click Edit (to the right of the XPath field), select the titleKeyword element, then click OK.

20. Under Column Name, select title, then click OK and save the test.

You have now created a stub for the getItemByTitle service and defined it so that the titleKeyword value is matched with the title column in the data source. In other words, the stub will search the title column until it finds a match for the incoming value, then use that row to populate the response for that message. You can try invoking the stub (like in the previous exercise) with one of the titles you provided in the data source. The stub will respond with the book details associated with that title.
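The correlation just configured behaves like a simple lookup: find the row whose title column equals the incoming titleKeyword and build the response from it. The sketch below assumes the generated CSV file name; SOAtest performs the lookup itself.

import csv

def find_response_row(title_keyword, path="getItemByTitleResponse.csv"):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row.get("title") == title_keyword:
                return row          # this row populates the stubbed response
    return None                     # no match: the stub has no data for this title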

Creating Stubs for REST Services
You can also emulate REST services with SOAtest as follows: 1. Create a new empty .tst file under the stubs project, name it StockQuote and add a new Message Stub tool to it. 2. Under the Test Correlation tab, select Transport and clear Enable Correlation (since we want to create a REST stub, SOAPAction HTTP headers are not applicable).

3. Under the Message Stub tool’s Response tab, switch the view to Multiple Responses.

4. Click New.

5. Select Always match under XPath function, and clear Always match under HTTP URL Parameters. This will allow the response message we are configuring to be correlated to values that come in the URL parameters (instead of values in an XML request, as we did in previous exercises).

6. Click Add.

7. Specify the parameter name symbol and the value GOOG, then click OK.

8. Open the Message tab for Response 1.

9. Provide the XML that you want to be returned for this request; for example:
<quote>
  <company>Google Inc</company>
  <lastTrade>621.50</lastTrade>
</quote>

10. Click New to add another response. Repeat the general steps above, but this time, specify the value AAPL for the symbol and the desired response:
<quote>
  <company>Apple Inc</company>
  <lastTrade>244.75</lastTrade>
</quote>

11. Save the test. You now have a stub that is deployed and listening on your local machine at the URL:
http://localhost:9080/servlet/StubEndpoint?stub=StockQuote
You can test this stub service using [soatest install dir]/examples/tests/StockQuoteClient.tst: try changing the symbol from GOOG to AAPL or some other value, then look at the resulting traffic.
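If you want to exercise the deployed stub outside SOAtest, something like the following should work, assuming the default local endpoint and that the symbol is passed as an additional URL parameter alongside the stub parameter.

import urllib.request

url = "http://localhost:9080/servlet/StubEndpoint?stub=StockQuote&symbol=GOOG"
with urllib.request.urlopen(url) as response:
    print(response.read().decode())   # expect the <quote> body configured for GOOG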

Monitoring Stub Traffic and Failures
So far, we have covered a few different ways to generate and configure stubs to emulate service behavior. When you are testing an application that is invoking the SOAtest stub, it is often helpful to have visibility into what requests the application is sending to the stub and whether there are any errors or request message validation failures (if you added any request validation tools to the request of the Message Stub tool). To monitor the traffic in the Store Stub created in the first or second exercise: 1. Open StubClient.tst, which we used to invoke the stub. 2. Set Stub Environment as the active environment. 3. Select the Scenario: Book Store test suite.

4. Click Add test or output.

5. Select Event Monitor, then click Finish.

6. Switch the platform in the Event Monitor to SOAtest Stub Server.

7. Save the test. 8. In the Event Monitor panel, open the Event Viewer tab. Do not close this panel.

9. Select and run the Scenario: Book Store test suite.

Notice the events that appear in the event viewer. They represent the request message that was received by the stub, the response it returned, and the validation result.

Testing Plain XML Services
Parasoft SOAtest can be used to test POX (Plain Old XML) services that are not necessarily SOAP Web services. Many legacy system integration initiatives have relied on plain XML messaging, and sometimes plain XML is preferred over SOAP Web services for performance reasons or to reduce complexity. If a schema for the XML messages is available, tests can be generated automatically by SOAtest, without the need to provide sample XML messages. Parasoft SOAtest support for plain XML services includes emulating a client that sends XML over one of the supported protocols and APIs (e.g. HTTP, JMS, etc.), or emulating a server that responds with XML over HTTP. To generate a set of new tests using a schema: 1. Select the Test Suite: Functional Tests node and click the Add Test Suite toolbar button.

2. In the Add Test Suite wizard, expand the New Project node, select XML Schema, and click the Next button. 3. In the XML Schema dialog, enter http://soatest.parasoft.com/schema.xsd in the Schema Location field, or Browse to a Schema on your machine. 4. Select Generate Messaging Clients to send plain XML messages. 5. Enter http://ws1.parasoft.com:8080/examples/servlets/Echo in the Endpoint field. This specifies where XML messages are sent to. This field can be left blank if another protocol is desired or if the URL is to be provided later.

6. Click the Next button. A list of elements that are defined in the schema (directly, as well as indirectly via imports) displays. You may select one or more of these elements, and a Messaging Client test will be generated for each selection. 7. Select all elements by pressing CTRL while clicking or pressing CTRL+A. 8. Click the Finish button. Three tests are created.
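At runtime, a generated Messaging Client essentially performs an HTTP POST of a plain XML payload. The sketch below posts a hypothetical payload to the Echo endpoint used above; the payload shape is an assumption, since the real tests build it from the schema.

import urllib.request

payload = b"<message>hello</message>"   # hypothetical; the generated tests derive the XML from the schema
request = urllib.request.Request(
    "http://ws1.parasoft.com:8080/examples/servlets/Echo",
    data=payload,
    headers={"Content-Type": "text/xml"},
)
with urllib.request.urlopen(request) as response:
    print(response.read().decode())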

Extending SOAtest with Scripting
In the ever-changing world of web services, there may be situations in which you have a testing requirement that requires you to add custom functionality or logic to your test cases. Due to the flexible nature of SOAtest, you can easily integrate custom scripts into your testing environment. Using SOAtest’s XML Assertor, you can integrate custom scripts written in Jython (Java-enabled Python; SOAtest ships with Jython 2.2.1), Java, or JavaScript into SOAtest. This means that almost any testing situation can be handled with ease, even if the situation is not directly supported by SOAtest’s current tool set. In this example you will create a Scenario Test using the book store service used in previous examples. In this Scenario you will search for a book by its title, then validate that the price of the book is an even integer. When you complete this section of the tutorial, your test suite should resemble the test suite entitled "Custom Scripting" in the SOAtestTutorial.tst file. 1. Select the Test Suite: Functional Tests node and click the Add Test Suite toolbar button.

2. In the Add Test Suite wizard, click Empty, then click Finish. 3. Double-click the new Test Suite: Test Suite node added to the test suite tree, enter Custom Scripting in the Name field in the test configuration panel, then click Save. 4. Select the Test Suite: Custom Scripting node and click the Add test or output button.

5. In the Add Test Wizard, select SOAP Client on the right, then click Finish. A SOAP Client tool is added to the test suite. 6. Double-click the Test 1: SOAP Client node underneath the Test Suite: Custom Scripting node and enter Validate Price Value in the Name field in the right GUI panel. 7. In the WSDL tab of the test configuration panel, enter http://soatest.parasoft.com/store-01.wsdl in the WSDL URI field. 8. Open the Request tab, then select getItemByTitle from the Operation drop-down box.

9. Enter Linux as the Fixed value in the titleKeyword element entry box, and then click the Save toolbar button.

10. Right-click the Test 1: Validate Price Value node and select Add Output. 11. In the Add Output wizard, select Response> SOAP Envelope on the left, and XML Assertor on the right, and click Finish. This tells SOAtest to chain an XML Assertor to the XML Response output of the SOAP Client. 12. Open the Configuration tab within the XML Assertor test configuration panel, then click the Add button. 13. In the Select Assertion dialog, expand the Value Assertion node, select Custom Assertion, and click the Next button.

The Custom Assertion dialog then displays a tree view of the XML message from which you can select a single value to enforce.

14. Select the price element in the XML tree view and click the Finish button.

The test configuration tab will now be populated with a Custom Assertion. 15. Enter the following script, which ensures that the price value is even, in the Text field of the test configuration tab:
def checkPrice(input, context):
    price = float(input)
    if price % 2 == 0:
        return 1
    else:
        return 0

16. Select Python from the Language drop-down menu. 17. Select checkPrice() from the Method drop-down menu. 18. Click the Save toolbar button. 19. Select the Test 1 node and click the Test button. Notice that the test fails. If you double-click the Traffic node, you will see that the price of the Linux book is an odd number, causing the test to fail.

20. Double-click the Test 1 node and enter Java as the Fixed value in the titleKeyword entry box. 21. Click Save. 22. Click the Test button. The test succeeds because the price of the Java book is even.

Asynchronous Testing
In this age of flexible, high-performance web services, asynchronous communication is often used to exchange data, allowing the client to continue with other processing rather than blocking until a response is received. SOAtest comes packaged with a server that runs in the background and manages the asynchronous Call Back messages received. SOAtest supports the major asynchronous communication protocols including Parlay, SCP, and WS-Addressing.
In this example we will use a simple web service which takes a string as input and then echoes this string back to the client in an asynchronous message exchange. This web service uses the WS-Addressing protocol. We will need to send a Message ID, which is used by SOAtest to identify the message when the Call Back Message is received, and a Call Back URL, so that the service knows where to send the Call Back Message.
Note: It is likely that you will not be able to run the scenario in this exercise because of firewall restrictions. In order to successfully invoke this service, your machine would need to be accessible over the Internet by the Parasoft machine which sends the asynchronous response (HTTP post to SOAtest).
When you complete this section of the tutorial, your test suite should resemble the test suite entitled "Asynchronous Testing" in the SOAtestTutorial.tst file. 1. Create a new test (.tst) file as follows: a. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.

b. Enter a name for the file (e.g. Asynchronous Testing), then click Next. c. Select SOA> WSDL, then click Next.

d. In the WSDL wizard page, enter the following into the WSDL URL field: http://soatest.parasoft.com/echo.wsdl

e. Make sure the Create functional tests from the WSDL checkbox is selected and Generate Web Service Clients is the selected option. Create tests to validate and enforce policies on the WSDL should NOT be selected.

f.

Click Next several times until you advance to the Layout page.

g. In the Layout page, choose the Asynchronous radio button and the WS-Addressing radio button and click Finish. •

A new Test Suite: Test Suite folder is created which contains automatically configured asynchronous test cases for each operation defined within the WSDL.



Notice that many tests have been created under the Test Suite: echo folder. You can delete all but the last one, Scenario: echoString(string).

2. Configure the test suite as follows:
   a. Double-click the new Scenario: echoString(string) node.
   b. Enter Asynchronous Testing in the Name field in the right GUI panel.
   c. Open the Execution Options tab and select the Tests run concurrently radio button.

3. Click Save.
4. Looking back at the Test Case Explorer, note that the Asynchronous Testing folder contains two tests.
   • The first test is a SOAP Client test which will send an initial request to the asynchronous service.
   • The second is a tool called the Call Back tool. Using the Call Back tool, SOAtest is able to listen for call back messages that are sent in an asynchronous messaging exchange. A local stub server has been integrated into SOAtest, allowing the Call Back tool to listen for these incoming messages. For this reason, it is important that the server is running before executing these examples.
5. Configure and start the local server as follows:
   a. Choose Window> Show View> Stub Server to open the Stub Server tab, which should appear at the bottom of the GUI.
   b. If the local server is not already running (it has a gray light rather than a green light), right-click the root Server node and select Start Server. The light next to the node should turn green, indicating that the server has been started.
   c. We need to deploy a stub to emulate the asynchronous service. Under the Stub Server node, right-click the Local machine folder and choose Add Stub.
   d. Click the Workspace button next to the Message Stub Tester Suite field, browse to the AsynchronousTestingStub.tst shipped with the SOAtest examples, then click Next.

The new stub is now ready to use.
6. Configure the test to send a Message ID (used by SOAtest to identify the message when the call back message is received) and a Call Back URL (so that the service knows where to send the call back message) as follows:
   a. In the Test Case Explorer, double-click the Test 1: echoString async node in the Asynchronous Testing test suite.
   b. Open the Request tab and enter the fixed value Hello World as the arg0 input parameter to this operation.
   c. Open the SOAP Header tab and notice that the SOAP Headers defined within the WSDL have been automatically created and added to this test case. Select the WS Addressing header, click Modify, and open the MessageID/ReplyTo tab. Note the value of the dropdown under wsa:MessageID. By default, the SOAP Client Tool will generate a unique messageID, which will be sent to your server. When receiving messages, the Call Back Tool will then check this messageID so that asynchronous responses can be correlated to the proper requests. In this case, though, we need to specify a messageID because we are using a stub instead of a live service.
   d. Change the dropdown under wsa:MessageID from Unique to Fixed, enter uuid:3799a6eb-cf84-4141-b8b4-09e1f7090734 into the field, then click OK to close the dialog.
   e. Open the Transport tab and set the endpoint to Custom with the value http://localhost:9080/servlet/StubEndpoint?stub=AsynchronousTesting

   f. Click Save.
   g. Double-click the Test 2: echo call back node in the Asynchronous Testing test suite. In the test configuration panel, notice that the Call Back Tool has been automatically configured to use the WS-Addressing protocol. By default, the Call Back Tool will listen for incoming messages with the same MessageID that was generated in Test 1: echoString async.
   h. Because we are using a stub to emulate the service, we need to use a specific messageID for correlation. Double-click the MessageID entry in the table. In the popup dialog, change the dropdown to Fixed, enter urn:message-1 into the box, then click OK.
   i. Click Save.
7. Select the Test Suite: Asynchronous Testing node and click the Test toolbar button. All the tests should succeed.
8. To see the traffic, expand both the Test 1: echoString async and Test 2: echoString call back nodes on the left, then double-click the Traffic Viewers attached to them. The Traffic Viewer attached to the SOAP Client tool will show the request message, and the Traffic Viewer attached to the Call Back Tool will show the response.


WS-Security

To help you ensure that your security measures work flawlessly in terms of authentication, encryption, and access control, SOAtest contains a vast array of security tools and options that fully support the industry-standard WS-Security specification. The WS-Security test suite demonstrates encryption/decryption, digital signatures, and the addition of SOAP Headers. The key security tools and options that SOAtest supports are:
• XML Encryption Tool: The XML Encryption tool allows you to encrypt and decrypt entire messages or parts of messages using Triple DES, AES 128, AES 192, or AES 256. In WS-Security mode, Binary Security Tokens, X509IssuerSerial, and Key Identifiers are supported.
• XML Signer Tool: The XML Signer tool allows you to digitally sign an entire message or parts of a message depending on your specific needs. In some cases it may be important to digitally sign parts of a document while encrypting other parts.
• XML Signature Verifier Tool: The XML Signature Verifier tool allows for the verification of digitally signed documents using a public/private key pair stored within a key store file.
• Key Stores: The use of key stores in SOAtest allows you to encrypt/decrypt and digitally sign documents using public/private key pairs stored in a key store. Key stores in JKS, PKCS12, BKS, and UBER format can be used.
• Username Tokens, SAML Tokens, X509 Tokens, or Custom Headers: SOAtest supports sending custom SOAP Headers and includes templates for Username Tokens and SAML Tokens.

When you complete this section of the tutorial, your test suite should resemble the test suite entitled "WS-Security" in the SOAtestTutorial.tst file.

Unlimited Strength Java Cryptography Extension
Important: In order to perform security tests using the XML Signature Verifier, XML Signer, or XML Encryption tools, or if using Key Stores, you will need to download and install the Unlimited Strength Java Cryptography Extension. To do so, go to http://java.sun.com/javase/downloads/index_jdk5.jsp and download the JCE Unlimited Strength Jurisdiction Policy Files. The downloaded files should be installed into the following directory on your machine:
[SOAtest install dir]\[SOAtest version_number]\plugins\com.parasoft.xtest.jre.eclipse.core.[platform]_[jre version]\jre\lib\security

Be sure to replace the existing local_policy.jar and US_export_policy.jar files with the new ones that you downloaded.

Message Layer Security with SOAP Headers
In this example we will use a book store web service that requires a Username and Password to be submitted within the SOAP Header element according to the WS-Security specification. SOAtest provides the ability to add Custom Headers and also provides pre-defined templates for creating Username Tokens and SAML Tokens. The following example uses a Username Token.
1. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.
2. Enter a name for the file, then click Next.
3. Select Empty and click Finish. An empty test suite folder is created.
4. Double-click the new Test Suite: Test Suite node that was added.
5. Type WS-Security into the Name field in the configuration panel on the right.
6. Click the Save button to save the WS-Security test suite.
7. Copy the Excel: Books data source that you added in the Functional Test lesson and paste it into this test suite.
8. Select the Test Suite: WS-Security node and click the Add Test Suite button.
9. Select Empty and click Finish. An empty test suite folder is created.
10. Type Username Tokens into the Name field in the tool configuration panel on the right.
11. Click the Save button to save the Username Tokens test suite.
12. Select the Test Suite: Username Tokens node and click the Add Test or Output button.
13. In the Add Test wizard, select Standard Test from the left pane, select SOAP Client from the right pane, and click Finish. A SOAP Client tool is added to the test suite.
14. Double-click the Test 1: SOAP Client node beneath the Test Suite: Username Tokens node.

15. Complete the SOAP Client tool's configuration panel as follows:
    a. Enter SOAP Client – getItemByTitle operation in the Name field.
    b. Open the WSDL tab and enter the following in the WSDL URI field: http://soatest.parasoft.com/store-wss-01.wsdl
    c. Open the Request tab and select getItemByTitle(string) from the Operation drop-down menu.
    d. For the title element, select Keywords as its Parameterized value.

16. Click the Save toolbar button to save the modified test.
17. Run the test by clicking the Test toolbar button.

Notice that the test fails because it did not have the required Security Header. To add the required SOAP Header:
1. Double-click the Test 1: SOAP Client node.
2. Open the SOAP Header tab in the tool's configuration panel, then click the Add button. An Add New SOAP Header dialog opens.


3. Select WS-Security then click OK.

4. Double-click the new entry added to the SOAP Header table. A dialog will open.
5. In the Timestamp tab, clear the Send Timestamp checkbox.
6. Open the Username Token tab and complete the following:
   a. Enter soatest in the wsse:Username field.
   b. Enter soatest in the wsse:Password field.
7. Click OK.
8. Click the Save toolbar button to save the modified test.
9. Run the test by clicking the Test toolbar button.

The test now succeeds. Double-click the Traffic Viewer node to view the SOAP Header sent in the request and verify that the service returned information about the specified books.
10. To create a regression control that will alert you to any changes in the server response in the future, right-click Test 1: SOAP Client – getItemByTitle operation and choose Create/Update Regression Control from the shortcut menu.


11. In the Response Validation Wizard, select Create Regression Control> Create Single Control, then click Finish.

If you run the test a few more times you will notice that it fails because the price element has changed. Follow the steps from previous exercises to ignore the dynamically changing price value.

Using the XML Encryption Tool
In this example, we will use a book store service similar to the service used in previous examples, except that:
• Request bodies must be encrypted using the key store soatest.pfx, which is located in the examples\keystores directory.
• Responses are encrypted as well and can be decrypted using the same key store.

First you will need to set up the key store:
1. Select the Test Suite: WS-Security node and click the Add Property toolbar button.

2. In the Global Property Type Selection dialog, select Global Key Store and click Finish.

3. Complete the Key Store configuration panel as follows:
   a. Enter PKCS12 Keystore in the Name field in the GUI panel.
   b. Make sure the Use same key store for private key checkbox is selected.
   c. Click the Browse button and navigate to the location of the key store soatest.pfx.
      • For Windows: C:\Program Files\Parasoft\SOAtest\[SOAtest version number]\examples\keystores
      • For Linux: [SOAtest installation directory]/examples/keystores
   d. Enter security in the Key Store Password field and select the Save check box. This enables SOAtest to remember the keystore password the next time the test suite is opened.
   e. Select PKCS12 from the Key Store Type drop-down menu.
   f. Click the Load button.

The list of available certificate aliases within the keystore is populated into the Certificate Alias drop-down menu.
   g. Select soatest in the Certificate Alias field.
   h. Open the Private Key tab at the top of the Key Store configuration panel.
   i. Select soatest for the Private Key Alias and enter security for the Private Key Password.
   j. Select the Save key store password check box.
4. Click the Save toolbar button.

Now we are ready to set up a test using the XML Encryption tool. To better organize our security tests, we will create a new folder for the encryption test.
1. Select the Test Suite: WS-Security node and click the Add Test Suite button.

2. Select Empty and click Finish. An empty test suite folder is created.
3. Type Encryption/Decryption into the Name field in the right GUI panel.
4. Click the Save toolbar button.
5. Select the Test Suite: Encryption/Decryption node and click the Add Test or Output button.
6. Select Standard Test from the left pane, select SOAP Client from the right pane, and click Finish. A SOAP Client tool is added to the test suite.
7. Complete the SOAP Client tool's configuration panel as follows:
   a. Enter SOAP Client – getItemByID operation in the Name field.
   b. Open the WSDL tab and enter the following in the WSDL URI field: http://soatest.parasoft.com/store-wss-03.wsdl
   c. Open the Request tab.
   d. Select getItemById from the Operation drop-down menu.

e. For the id element, select ID as its Parameterized value.

8. Click the Save toolbar button.
9. Right-click the Test 1: SOAP Client - getItemByID operation node and select Add Output from the shortcut menu. The Add Output wizard displays.
10. Select Request> SOAP Envelope from the left pane, select XML Encryption from the right pane, and click the Finish button. An Encryption Tool is chained to the SOAP Client.
11. Complete the Request SOAP Envelope -> XML Encryption tool's configuration panel as follows:
    a. Ensure that the Encrypt radio button is selected.
    b. Ensure that the WS-Security Mode box is checked.
    c. Select AES 256 from the Symmetric (Block Encryption) drop-down menu.
    d. Open the WS-Security page and ensure that X509BinarySecurityToken is selected in the Form box.


    e. Open the Target Elements page and verify that the SOAP Body/entire document checkbox is selected. This will encrypt the XML Body element. The XML Request is now set up to be encrypted when the request is sent to the service.
    f. Click the Save toolbar button to save the modified test.

Now you can add an XML Encryption tool to the XML Response of the SOAP Client test to enable decryption of the XML response:
1. Right-click the Test 1: SOAP Client - getItemByID operation node and select Add Output from the shortcut menu. The Add Output wizard displays.
2. Select Response> SOAP Envelope from the left pane, select XML Encryption from the right pane, and click the Finish button. An Encryption Tool is chained to the SOAP Client.
3. Complete the Response SOAP Envelope -> XML Encryption tool's configuration panel as follows:
   a. Select the Decrypt radio button.
   b. Select PKCS12 Keystore from the Key Store drop-down menu.
4. Click the Save toolbar button to save the modified test.
5. Run the test by clicking the Test toolbar button.
6. Double-click the Traffic Viewer node to view the encrypted data.
7. Right-click the Test 1: SOAP Client - getItemByID operation node and select Create/Update Regression Control.
8. In the dialog that opens, select Create Regression Control> Create Multiple Controls, then click Finish.


Regression controls are created and automatically chained to the Response SOAP Envelope -> XML Encryption tool. Notice that the decrypted responses are shown in the Regression Control. Finally, you want to ignore dynamic values in the XML Response so that the Regression Control does not fail each time:
1. Double-click the XML Document -> Diff node and complete the following in the right GUI panel:
   a. Set the Diff Mode to XML.
2. Select Form XML as the Diff Mode. When the Form XML tab is selected, a popup will appear asking whether to override with values from the Literal XML view. Click Yes.
   a. Right-click the price element and select Setup Ignored XPath from the shortcut menu. An Ignore XPath Setting dialog appears. Click OK to ignore modifications to the text content of the price element.
   b. Repeat the previous step for the CipherValue element.
   c. Right-click the DataReference element and select Setup Ignored XPath. An Ignore XPath Setting dialog appears. Select the Attribute check box to ignore changes to the attributes of the DataReference element. Click OK.
   d. Select the Literal XML button to switch back to the Literal XML view.
3. Click the Save toolbar button to save the modified test.
4. Run the test by clicking the Test toolbar button.

Using the XML Signer Tool
In the next example, we will use a book store service that requires request bodies to be signed with the certificate in the key store soatest.pfx. Responses from this service are signed as well and can be verified using the same key store. We will use the same key store settings from the previous example.
1. Select the Test Suite: WS-Security node and click the Add Test Suite button.
2. Select Empty and click Finish. An empty test suite folder is created.
3. Type Sign/Verify into the Name field in the right GUI panel, then click the Save toolbar button.

4. Select the Test Suite: Sign/Verify node and click the Add Test or Output button.

5. Select Standard Test from the left panel and SOAP Client from the right panel, then click Finish. A SOAP Client tool is added to the test suite.
6. Complete the SOAP Client tool's configuration panel as follows:
   a. Enter SOAP Client – placeOrder operation in the Name field.
   b. Open the WSDL tab and enter the following in the WSDL URI field: http://soatest.parasoft.com/store-wss-02.wsdl
   c. Open the Request tab.
   d. Select placeOrder(int, int) from the Operation drop-down menu.
   e. Select the itemId parameter and select ID as its Parameterized value.
   f. Select the count parameter and enter 1 as its Fixed value.
7. Click Save to save the modified test.
8. Right-click the Test 1: SOAP Client - placeOrder operation node and select Add Output from the shortcut menu. The Add Output wizard displays.
9. Select Request> SOAP Envelope from the left panel, select XML Signer from the right panel, and click Finish. An XML Signer Tool is chained to the SOAP Client.
10. Complete the XML Signer tool's configuration panel as follows:
    a. Select PKCS12 Keystore from the Key Store drop-down menu.
    b. Select RSAwithSHA1 (PKCS1) – http://www.w3.org/2000/09/xmldsig#rsa-sha1 from the Algorithm drop-down menu.

    c. Open the WS-Security page and select X509IssuerSerial from the Form box.
    d. Open the Target Elements page and verify that the SOAP Body/entire document checkbox is selected. The XML Request is now set up to be signed when the request is sent to the service.
    e. Click Save to save the modified test.

Now you can add an XML Signature Verifier Tool to the XML Response of the SOAP Client test to enable signature verification of the XML response:
1. Right-click the Test 1: SOAP Client - placeOrder operation node and select Add Output from the shortcut menu. The Add Output wizard displays.
2. Select Response> SOAP Envelope from the left pane, select XML Signature Verifier from the right pane, and click Finish. An XML Signature Verifier Tool is chained to the Test 1: SOAP Client - placeOrder operation node.
3. Complete the XML Signature Verifier tool's configuration panel as follows:
   a. Select the Use Key Store checkbox and choose PKCS12 Keystore from the drop-down menu.
   b. Ensure that the WS-Security Mode check box is checked.
4. Click the Save toolbar button to save the modified test.
5. Run the test by clicking the Test toolbar button.

6. Double-click the Traffic Viewer node to view the signed data. Since the test succeeds, this tells us that the server accepted our signed request and the server’s signed response was successfully verified.

XML Encryption and Signature Combined
In this example, we will create a more complex test using a book store service that combines the security requirements of the previous two exercises. This service requires request bodies to be signed and encrypted using the key store soatest.pfx. The responses from this service are signed and encrypted as well and can be decrypted and verified using the same key store.
1. Select the Test Suite: WS-Security node and click the Add Test Suite button.
2. Select Empty and click Finish. An empty test suite folder is created.
3. Type Encryption and Signature Combined into the Name field in the right GUI panel, then click the Save toolbar button.
4. Select the Test Suite: Encryption and Signature Combined node and click the Add Test or Output button.
5. In the Add Test wizard, select Standard Test from the left pane, select SOAP Client from the right pane, and click Finish. A SOAP Client tool is added to the test suite.
6. Complete the SOAP Client tool's configuration panel as follows:
   a. Enter SOAP Client – getItemByTitle operation in the Name field.
   b. Open the WSDL tab and enter the following in the WSDL URI field: http://soatest.parasoft.com/store-wss-04.wsdl
   c. Open the Request tab.
   d. Select getItemByTitle from the Operation drop-down menu.

   e. Select the title parameter and enter Linux as its Fixed value.
7. Right-click the Test 1: SOAP Client - getItemByTitle operation node and select Add Output from the shortcut menu. The Add Output wizard displays.
8. Select Request> SOAP Envelope from the left pane, select XML Signer from the right pane, and click the Finish button. An XML Signer Tool is chained to the SOAP Client.
9. Complete the XML Signer tool's configuration panel as follows:
   a. Select RSA from the Algorithm drop-down menu.
   b. Select PKCS12 Keystore from the Key Store drop-down menu.
   c. Open the WS-Security page and choose X509BinarySecurityToken from the drop-down menu.
   d. Open the Target Elements page and ensure that SOAP Body/entire document is checked. The XML Request is now set up to be signed when the request is sent to the service.

Next you can add an XML Encryption Tool to the XML Response of the XML Signer Tool to encrypt the signed document:
1. Right-click the Request SOAP Envelope> XML Signer node and select Add Output from the shortcut menu. The Add Output wizard displays.
2. Select XML Encryption and click the Finish button. An XML Encryption Tool is chained to the XML Response of the XML Signer Tool.
3. Complete the XML Encryption tool's configuration panel as follows:

   a. Ensure that the Encrypt radio button is selected.
   b. Choose PKCS12 Keystore from the Key Store drop-down menu.
   c. Select AES 256 from the Symmetric drop-down menu.
   d. Open the WS-Security page and select X509BinarySecurityToken from the Form box.
   e. Open the Target Elements page and verify that the SOAP Body/entire document checkbox is selected. The XML Request is now set up to be signed and then encrypted when the request is sent to the service.
   f. Click Save to save the modified test.
4. Run the test by clicking the Test toolbar button.
5. Double-click the Traffic Viewer node to view the server response.

Automatically Generating WS-Security Tests with WS-SecurityPolicy
Parasoft enables automatic test creation to enforce runtime security policies. This helps you automatically generate the correct tests with the correct settings so the services can be invoked immediately. Furthermore, by managing the policies at the project test level, you can more easily create and manage policy variations in order to test the services properly, both positive and negative. SOAtest recognizes WS-SecurityPolicy assertions in the WSDL when using the WS-PolicyAttachment standard. To automatically generate tests from a WSDL with WS-SecurityPolicy assertions, complete the following:
1. Select the Test Suite: WS-Security node and click the Add Property toolbar button.

2. In the Global Property Type Selection dialog, select WS-Policy Bank and click Finish. A WS-Policy Banks node is added to the Test Case Explorer.
3. In the WSDL Policies configuration panel on the right side of the GUI, enter http://soatest.parasoft.com/store-wss-04.wsdl in the WSDL URL field and click the Refresh from WSDL button. The Global Policies are populated.

Notice that there are policy nodes containing the WS-SecurityPolicy configuration that corresponds to the WS-SecurityPolicy assertions in the WSDL, and that the generated tests are automatically configured with the signer and encryption tools on the request, because the policy dictates so. Since a keystore has already been added to the test suite, the tests are ready to run. If you have not added a keystore, one needs to be configured; for more information, see "Using the XML Encryption Tool", page 144.

Design and Development Policy Enforcement

As more Service Oriented Architectures (SOA) are deployed throughout the industry, the need arises to enforce policies and best practices on all components of the SOA. Policy enforcement over these components helps ensure interoperability, consistency, and re-usability throughout the life cycle of the SOA.

SOAtest gives SOA architects the ability to create and manage design-time SOA policies. A SOAtest "policy" combines static analysis policy configurations for XML artifacts (WSDLs, schemas, and SOAP) with semantic and schema validation tests. SOAtest allows an architect to create a policy configuration that combines Coding Standards tool rule assertions with test assertions such as schema validity and WS-I interoperability. The SOA policy configuration interface is very similar to rule configurations in Parasoft's language products (Jtest for Java, C++test for C and C++, .TEST for .NET languages). SOAtest saves and loads policies in an XML format that extends WS-Policy.

When you complete this section of the tutorial, your test suite should resemble the test suite entitled "Design and Development" in the SOAtestTutorial.tst file.

Enforcing Design-Time SOA Policies
For this example we will create policy enforcement tests for a book store service with the WSDL located at http://soatest.parasoft.com/store-01.wsdl.
1. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.
2. Enter a name for the file (e.g., Policy Enforcement), then click Next.
3. Select SOA> WSDL, and click Next to advance to the WSDL dialog.
4. Select http://soatest.parasoft.com/store-01.wsdl from the WSDL URL field.

5. Check the Create tests to validate and enforce policies on the WSDL check box and make sure the Create functional tests from the WSDL check box is also checked.

6. Click Next until you advance to the Policy Enforcement dialog.
   • Select the Apply Policy Configuration check box. This will create WSDL and functional tests that enforce the assertions defined in the specified policy configuration.

The default policy configuration, soa.policy, is a collection of industry-wide best practices. To use a custom policy configuration, you can either use the Browse button to select a policy configuration or enter the policy configuration's path in the text field. For details on policy enforcement, see "SOA Policy Enforcement: Overview", page 570.
7. Click the Finish button.
8. Double-click the new Test Suite: Test Suite node added to the test case tree, enter Policy Configuration in the Name field in the test configuration panel, and click the Save toolbar button.
9. Expand Test Suite: Policy Configuration, then Test Suite: WSDL Tests. Notice that Test 4: Policy Enforcement has been added to Test Suite: WSDL Tests.


10. Expand the Test 4: Policy Enforcement test to view its chained tools. You will see two Coding Standards tools, one for enforcing rules on the WSDLs and one for enforcing rules on the schemas.
    • The first tool, WSDL> WSDL Policy Enforcer, is chained to the WSDL Output of the Test 4: Policy Enforcement test and thus is passed the base WSDL and all imported WSDLs for rule enforcement.
    • The second Coding Standards tool, titled Schema> Schema Enforcer, is chained to Test 4: Policy Enforcement's Schema Output and thus is passed all schema files referenced in the WSDL for rule enforcement.
11. Expand one of the tests in the Test Suite: ICart node and notice that a referenced Coding Standards tool titled Response SOAP Envelope> SOAP Policy Enforcer has been chained to the test. This tool will apply its contained policy configuration to the messages received by this test client. The tool is a reference to a Global Tool in the Tools Test Suite under the root Test Suite. For more information on Global Tools, see "Global Tools", page 341.
12. Select the Test 4: Policy Enforcement test and click the Test toolbar button. This will run policy enforcement tests on the WSDL and schema files. If any errors occur, they will be reported in the SOAtest view.

Defining Custom SOA Policies
In the previous exercise, we enforced policies using a default policy configuration. For this example, we will define a custom SOA policy.
1. Open the pull-down menu for the New toolbar button (top left), then choose SOA Policy Configuration File.
2. Enter a name for the policy in the Policy name field, then click the Finish button. The Policy Configuration panel displays in the right GUI pane of SOAtest and lists assertions that correspond to policy enforcement rules and WSDL tests.
3. From the Policy Configuration panel, you can:
   • Enable/disable individual assertions by selecting or unselecting the corresponding check boxes.
   • Access help documentation for assertions by right-clicking and selecting View Rule Documentation from the shortcut menu.
   • Import custom rules designed using SOAtest's RuleWizard feature by clicking Add.

4. Click Save to save the custom policy to the default SOAtest rules folder. The policy configuration you define can be used later to automatically create tests to enforce policies.


Automation/Iteration (Nightly Process)

This lesson teaches you how to run tests from the command line, which allows you to configure SOAtest to automatically check the complete project at a specified time each night (or at another interval). This ensures that testing occurs consistently without being disruptive or obtrusive. SOAtest's command line mode allows you to perform tests from Windows or UNIX command line shells and to run SOAtest from automated build utilities such as Ant, Maven, and CruiseControl. The following exercises demonstrate the basics of using soatestcli.

Important: A command-line license is required to use soatestcli. This license is provided with SOAtest Server Edition.

Running a Test Suite From the Command Line
In this example, we will run SOAtestTutorial.tst (found in the examples directory) from the command line.
1. Close SOAtest and open a command line window.
2. Switch to the directory where SOAtest is installed.
3. From the command line window, type the following command:
   • On Windows:
     soatestcli.exe -config <configuration name> -resource "C:\Location Of SOAtestTutorial.tst" -report MySampleReport
   • On UNIX (where Location Of SOAtestTutorial.tst represents the location of SOAtestTutorial.tst on disk):
     soatestcli -config <configuration name> -resource "/Location Of SOAtestTutorial.tst" -report MySampleReport

Running all Projects in a Workspace
soatestcli.exe -data "c:\mySOAtestWorkspace" -showdetails -config "user://Example Configuration" -report "c:\mySOAtestReports"

The -data option specifies the Eclipse workspace location. The -showdetails option prints detailed test progress information. The -config option specifies the test configuration. The -report option generates an HTML report.
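For a true nightly process, the same command can be scheduled with any standard scheduler, such as cron on UNIX or Task Scheduler on Windows. The following crontab entry is only a sketch; the installation path, workspace, and report locations are assumptions that you would replace with your own:

# Run the complete SOAtest workspace every night at 2:00 AM and generate an HTML report
0 2 * * * /opt/parasoft/soatest/soatestcli -data "/home/tester/mySOAtestWorkspace" -config "user://Example Configuration" -report "/home/tester/mySOAtestReports"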

Running an Individual Project in a Workspace
soatestcli.exe -data "c:\mySOAtestWorkspace" -resource "/MyProject" -exclude "**/somebadtesttoskip.tst" -showdetails -config "user://Example Configuration" -report "c:\mySOAtestReports"

To run an individual project in a workspace, you must specify the project to be tested with the -resource option. The -exclude option specifies files to be excluded during testing.


Using a localsettings File
Local settings files can control report settings, Report Center settings, error authorship settings, and Team Server settings. You can create different local settings files for different projects, then use the -localsettings option to indicate which file should be used for the current command line test. Each local settings file must be a simple text file; there are no name or location requirements. Each setting should be entered on a single line. If a parameter is specified in this file, it will override the related parameter specified from the GUI. If a parameter is not specified in this file, SOAtest will use the parameter specified in the GUI.

soatestcli.exe -data "c:\mySOAtestWorkspace" -showdetails -config "user://Example Configuration" -report "c:\mySOAtestReports" -publish -localsettings "c:\mylocalsettings.properties"

Example localsettings file:
grs.enabled=true
grs.server=grs.server.com
grs.port=32323
grs.log_as_nightly=true
tcm.server.enabled=true
tcm.server.name=tcm.server.com
tcm.server.port=18888
report.mail.enabled=true
report.mail.server=smtp.server.com
report.mail.domain=server.com
report.mail.subject=My Nightly Tests
[email protected]
report.mail.exclude.developers=false
scope.sourcecontrol=true
scope.local=false
soatest.license.use_network=true
soatest.license.network.host=ls.server.com
soatest.license.network.port=2002
soatest.license.network.edition=server_edition


Running Regression Tests in Different Environments

It is common for Web service applications to be developed and maintained by different teams in different environments. For example, a developer may start with tests against a server deployed on his or her local machine; as the application is deployed to a development build server, the same tests need to be executed against that server; then QA and testing teams need to run the same regression tests on their own integration server. Because reusing and sharing test assets is critical for a highly efficient process, Parasoft SOAtest includes an "Environments" management feature that makes such tasks easy.

The New Test Suite wizard includes an option to automatically create an environment configuration for the generated tests. To create a new test suite with preconfigured environment variables:
1. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.

2. Enter a name for the file, then click Next.
3. Select SOA> WSDL and click Next.
4. Enter http://soatest.parasoft.com/calculator.wsdl in the WSDL URL field.


5. Select the Create functional tests from the WSDL check box and the Generate Web service clients radio button.
6. Click the Next button twice to advance to the Create Environment page.
7. Review the settings and options, then click Finish. A new Test Suite: Test Suite node displays in the Test Case Explorer tab. New environment variables are added to the Default Calculator Environment node.
8. Double-click the Default Calculator Environment node.

Notice how the environment configuration now includes variables for the HOST, PORT, and PATH to the service defined in the environment. The same variables are referenced by name in each of the automatically generated SOAP Client tests (look under the Transport tab).

To create a new environment configuration, complete the following:
1. Right-click the Environments node and select New Environment. A New Environment node appears.
2. Double-click the New Environment node and enter Echo Environment in the Name field in the test configuration panel.
3. Click the Add button and enter the following values for the corresponding variable names:
   • CALCULATOR_HOST: ws1.parasoft.com
   • CALCULATOR_PORT: 8080
   • CALCULATOR_PATH: examples/servlets/Echo
4. Click Save to save the new environment.
5. Right-click the new Echo Environment node and select Set as Active Environment from the shortcut menu. This sets the new environment as the active configuration for the test project. Running the tests again will cause the SOAP messages to be sent to the Echo servlet instead of the original calculator service.

Environment configurations can be exported to and imported from external XML files, as well as uploaded to and referenced from the Parasoft Team Server. Environment variables can be referenced from most of the fields in the test settings GUI, not just URL fields.

Applying an Environment Configuration to a Regression Test from the Command Line
The greatest benefit of environments is the ability to rerun the same regression suites from the command line without the need to open the SOAtest GUI and modify host or URL settings. From the command line, run a command like:
soatestcli.exe -config <configuration name> -resource <path to test suite name.tst relative to the workspace> -environment "Default Calculator Environment"

Then try:
soatestcli.exe -config <configuration name> -resource <path to test suite name.tst relative to the workspace> -environment "Echo Environment"

This will run the same suite with the second environment applied to it.
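The -environment option can also be combined with the other soatestcli options described in the previous lesson. For example, a nightly job might run the same project against each environment and write a separate report for each run; the workspace, project, and report paths below are placeholders, not values from this tutorial:

soatestcli -data "/home/tester/mySOAtestWorkspace" -resource "/MyProject" -config "user://Example Configuration" -environment "Default Calculator Environment" -report "/home/tester/mySOAtestReports/calculator"
soatestcli -data "/home/tester/mySOAtestWorkspace" -resource "/MyProject" -config "user://Example Configuration" -environment "Echo Environment" -report "/home/tester/mySOAtestReports/echo"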


Web Functional Testing

Introduction
Web interface testing is difficult to automate. Teams often abandon automated testing in favor of manual testing because of too many false positives or too much effort required to maintain the test suites. SOAtest facilitates the creation of automated test suites that are reliable and dependable. Its ability to isolate testing to specific elements of the Web interface eliminates noise and provides accurate results. SOAtest isolates and tests individual application components for correct functionality across multiple browsers without requiring scripts. Dynamic data can be stubbed out with constant data to reduce test case noise. Validations can be performed at the page object level as well as the HTTP message level. SOAtest also verifies the client-side JavaScript engine under expected and unexpected conditions through asynchronous HTTP message stubbing.

The following exercises demonstrate how to use SOAtest to perform functional testing of the Web interface. They cover:
• Recording in a Browser
• Adding a Data Source
• Parameterizing a Form Input
• Configuring Validation on a Page Element
• Configuring a Browser Stub
• Playing a Recorded Scenario in a Browser
• Performing Static Analysis During Functional Testing

Recording in a Browser
To record in a browser:
1. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.
2. Enter a name for the file, then click Next.
3. Select Web> Record web functional tests and click Next.
4. In the first Record Web Functional Tests wizard page, ensure that Record new functional test is selected, then click Next.
5. Complete the next Record Web Functional Tests wizard page as follows:

   a. Enter Web Functional Testing in the Test Suite Name field.
   b. Enter www.endless.com in the Start Recording From field.
   c. Ensure that the following options are checked, and the others are not:
      • Generate Functional Tests
      • Generate Asynchronous Request Tests
   d. Click the Finish button. The test will begin, and a browser window will open.
6. Within the browser window that opens, perform the following actions:

a. Click on the Men's Shoes link toward the top of the page.

b. Type leather into the Search field to the right of the drop-down menu showing Men's Shoes, then click the Go button immediately to the right of this field.

   c. Select the Sale & Clearance radio button from the Show options on the right of the screen.
   d. Close the browser to end recording. In the Test Case Explorer view, SOAtest creates a new .tst file and a Web Functional Testing test suite which contains the generated tests for the scenario that you just recorded.

7. Expand the new test suite node in the Test Case Explorer to view the tests created for each user action taken during the recording.

Adding a Data Source
To add a data source:
1. Right-click the Scenario: Web Functional Testing node, then choose Add New> Data Source from the shortcut menu.

2. In the New Project Data Source wizard, select Table and click Finish.

A new Data Sources node is added to the Scenario: Web Functional Testing branch.
3. In the New Datasource table configuration panel that opens on the right side of the GUI, make sure the First row specifies column names checkbox is checked.

4. Enter Material in the top cell in column A. In the same column, enter suede in row 1 and canvas in row 2.

5. Click Save to save the changes.

Parameterizing a Form Input
To parameterize a form input:
1. Expand the Scenario: Form: keywordSearchForm branch to view the recorded actions related to the search form.
2. Double-click the Test 1: Type "leather" node to open the test configuration panel.
3. Note that the Pre-Action Browser Contents tab shows what the page looked like before the test action (typing leather) was performed. It also uses a blue border to highlight the user action for this test.

4. In the test configuration panel, open the User Action tab.
5. Near the top of that tab, change the Text Input Value drop-down menu from Fixed to Parameterized.
6. Select Material from the menu that appears to the immediate right.

7. Click Save to save the changes.

Configuring Validation on a Page Element
To configure a validation on a page element:
1. Fully expand the Scenario: Form: shippingOptionFilterForm branch.

2. Double-click the Browser Contents Viewer node under Test 1.

The Browser Contents Viewer tool configuration panel will open.
3. In the Browser Contents Viewer configuration panel, right-click an element on the page (for example, the number of results shown at the top of the page) and select Extract Value for <Element>... from the shortcut menu.
4. In the dialog that opens, ensure that the text property is selected in the Property name box.
5. Click Next two times (until you reach the Validate or Store Data page).
6. Ensure that Validate the value is selected, and that the expected value matches the results number displayed on the rendered page.

7. Click Finish. A Browser Validation Tool is now added to this test; it is set up to check that the element you selected remains the same as the application evolves.
8. Double-click the added Browser Validation Tool node.

9. In the Browser Validation tool’s test configuration panel, notice that the validation is displayed. If you wanted to reconfigure the validation settings, you could do so here.

Configuring a Browser Stub
A stub is static data that SOAtest saves when recording a functional test scenario against a web site. Stubbing helps to verify that changes to the client-side code do not affect the final resulting HTML page, by feeding unchanging data to the client in place of the actual server response.

To configure a message stub:
1. Expand the Scenario: Form: shippingOptionFilterForm branch.
2. Right-click the test labeled Test 1: Click "shippingOptionFilter" and select Add Output from the shortcut menu.
3. In the Add Output wizard that opens, choose HTTP Traffic, then click Next.
4. Select 1. http://www.endless.com/searchrequest/ref=sr_nr_onsale and click Next.
5. In the left panel, select Both> Stub Request/Response.
6. In the right panel, select Browser Stub.

7. Click Finish.

8. In the Response Body section of the Browser Stub configuration view (opened on the right side of the GUI), change the number for numPrimaryResults from its existing value to 1,234.
   • The easiest way to do this is to copy the entire text in the Response Body into a text editor (e.g., Notepad), use its search capability to find 'numPrimaryResults', modify the value, and then copy all the text from the editor back over the current contents of the Response Body.
   • This response body will be provided in place of the actual server response when the recorded scenario is run later in this example.

9. Click Save to save the changes.

Playing a Recorded Scenario in a Browser
To play back the recorded scenario in a browser:
1. In the Test Case Explorer, select the Scenario: Web Functional Testing node.
2. Click the Test toolbar button. The recorded scenario is played back in your browser, once for each search value that we parameterized in a previous section of this example. Wait for each action to be played out in your browser.

Note: If you have been following the complete tutorial, errors will be reported due to the Browser Stub tool that was previously added. This is expected.

Performing Static Analysis During Functional Testing
To configure SOAtest to perform static analysis on the Web pages that the browser downloads as Web functional tests execute:
1. In the Test Case Explorer, select the Scenario: Web Functional Testing node.
2. Open the Test toolbar button's pull-down menu, then choose one of the available static analysis configurations.

When SOAtest is finished performing static analysis, static analysis errors are shown in the SOAtest view. To facilitate review of these results, open the SOAtest view’s Layout menu and choose SOAtest Static Analysis for Functional Tests Layout.


Web Static Analysis

Introduction
While functional testing finds problems by simulating how click paths would operate in a browser, static analysis finds problems by inspecting pages' source code. Static analysis is just like a code review or code inspection. It reads and analyzes your source files, then it lets you know if it finds any coding errors that could cause functionality or presentation problems. It also pinpoints broken links and other navigational problems it finds. In addition, it can alert you to code that might not work correctly when people with disabilities try to use your site with adaptive devices, such as machines that read screen content out loud or convert content into Braille. Furthermore, static analysis can verify that custom design and content requirements, such as corporate branding, are met.

SOAtest facilitates static analysis by automating complex analyses that would otherwise take days. Static analysis can be customized to involve tools that help expose and prevent problems such as:
• Coding constructs that make code more error-prone and difficult to update.
• Navigational problems such as broken links, actions that invoke designated error pages, anchor problems, non-clickable links, and so forth.
• HTML, CSS, JavaScript, and VBScript coding problems that affect presentation, execution, dynamic content, performance, transformation, display in non-traditional browsers, etc.
• XML problems that affect transformations and data retrieval.
• Code and content that violates Web accessibility guidelines.
• Content that contains misspellings and typos.
• Code that violates application-specific requirements and business rules.
• Code that violates project-specific or organization-specific branding, design, or content guidelines.

The following exercises demonstrate how to use SOAtest's Static Analysis feature to create a project, load an initial site into that project, and then quickly and easily analyze that project's files for coding errors. We will not use the projects created in the previous exercises.

Performing Static Analysis
To perform static analysis, complete the following:
1. Right-click the project from the previous exercises, then choose New Test (.tst) File from the shortcut menu.

2. Enter a name for the file, then click Next.
3. Select Web> Scan web application and click Next.
4. Complete the Scan HTTP/FTP/Local Resources page as follows:
   a. Enter parabank.parasoft.com in the Start URL field. SOAtest will add this URL to the Allowable/Restricted URLs table.
   b. Check the Limit loading depth to checkbox and enter 2 in the field to the immediate right. This depth setting determines the number of clicks (links) deep to load the site. For example, a loading depth of 2 tells SOAtest to load only pages that can be reached in two clicks from the start URL. (Note: a redirect is also counted as one level of depth.)
   c. In the Form Options section at the bottom of the panel, set the options as follows:
      • Scan Forms With Default Inputs: Unchecked.
      • Fill Active Inputs Manually: Checked.
   d. Click Finish. A new .tst file, and a test suite with one test, will be added to the Test Case Explorer.

5. Right-click the Test 1: Scanning Tool node and select Test Configurations from the shortcut menu.

6. In the Test Configuration dialog that appears, select User-defined in the left panel and click the New button under the panel.
7. In the new configuration panel that appears to the right, change the Name field to ScanTesting Configuration.
8. Open the Static tab and check the Enable Static Analysis checkbox.
9. In the list of rules below in the same tab, check the following options:
   • Accessibility - WCAG 2.0 [ACC-WCAG2]
   • Coding Convention [CC]
   • Check Links [CLINK]

10. Click the Apply button to save the configuration, then click Close.

11. Select the scan testing test suite in the Test Case Explorer.
12. Open the pull-down menu for the Test toolbar button, then choose Test Using> User-defined> ScanTesting Configuration.

While scanning the specified site, SOAtest will open a Form dialog box each time it loads a form that allows user input. In this example, a Form input dialog box will open for the form1 form.
13. Complete the Form:Form1 dialog as follows:
    a. Select Fixed from the Text: "username" drop-down menu, then enter john in the corresponding text field.
    b. Select Fixed from the Password: "password" drop-down menu, then enter demo in the corresponding text field. This tells SOAtest how to populate this particular form's input elements. The page returned after these inputs are submitted will be tested and included in the Project tree.

    c. Click the Add button. SOAtest will then reopen the same form dialog box so you can enter additional inputs if desired.
    d. Click the Skip All button to indicate that you do not want to enter any more inputs for forms in this site.

When SOAtest is finished performing static analysis, static analysis errors are shown in the SOAtest view. To facilitate review of these results, open the SOAtest view's Layout menu and choose SOAtest Static Analysis for Functional Tests Layout.

Additional Options
You can customize static analysis tests by:
• Enabling or disabling static analysis rules.
• Customizing the parameters of the various rules applied during static analysis.
• Designing rules that verify compliance with unique team or project requirements, then configuring SOAtest to apply those rules during static analysis.
• Configuring SOAtest to suppress error messages that are not relevant to your project.

Team-Wide Deployment
In this section:
• Team-Wide Deployment - Configuration Overview
• Team-Wide Deployment - Usage Overview

Team-Wide Deployment - Configuration Overview
In this section:
• Configuring a Team Deployment: Introduction
• Connecting All SOAtest Installations to Your Source Control System
• Connecting All SOAtest Installations to Team Server
• Connecting SOAtest Server to Report Center
• Connecting All SOAtest Installations to Parasoft Project Center
• Configuring Team Test Configurations and Rules
• Configuring Task Goals
• Configuring Task Assignment
• Sharing Project and Test Assets
• Configuring Automated Nightly Testing

Configuring a Team Deployment: Introduction

The recommended way to configure a team-wide SOAtest deployment is to perform the following tasks in the specified order:
1. Installing and licensing SOAtest Professional Edition on all team machines, SOAtest Architect Edition on the architect's or lead developer or tester's machine, and SOAtest Server Edition on a team server machine.
2. Connecting All SOAtest Installations to Your Source Control System
3. Connecting All SOAtest Installations to Team Server
4. Connecting SOAtest Server to Report Center
5. Connecting All SOAtest Installations to Parasoft Project Center
6. Configuring Team Test Configurations and Rules
7. Configuring Task Goals
8. Configuring Task Assignment
9. Sharing Project and Test Assets
10. Configuring Automated Nightly Testing

Connecting All SOAtest Installations to Your Source Control System

This topic explains how to connect SOAtest to your source control system. This connection enables SOAtest to use file revision history data in order to automatically assign test failures or policy violations to the responsible team members. It is required if you will be using SOAtest to automate peer review of artifacts such as configuration files, Web code, WSDLs, etc. Sections include:
• About SOAtest's Source Control Support
• Enabling Source Control Support
• AccuRev Configuration
• ClearCase Configuration
• CM Synergy Configuration
• CVS Configuration
• Perforce Configuration
• Serena Dimensions Configuration
• StarTeam Configuration
• Subversion Configuration
• Visual SourceSafe Configuration
• Specifying Source Control Definitions from the Command Line Interface (cli)

About SOAtest's Source Control Support
SOAtest currently supports the following source control systems:
• AccuRev 4.6
• ClearCase 2003.06.00
• CM Synergy 6.4
• CVS
• Perforce 2006.2
• Serena Dimensions 9.x and 10.x
• StarTeam 2005 and 2008
• Subversion (SVN) 1.2.x, 1.3.x, or 1.4.x
• Visual SourceSafe 6.0, 2005

Subclipse Support Notes
• Each Subclipse plugin version is compatible only with specific Subversion versions. Ensure that your Subclipse plugin is compatible with a version of Subversion that SOAtest supports. For example, you should NOT install Subversion 1.3 alongside Subclipse plugin 1.2, which uses Subversion 1.4.
• Due to changes introduced in Subversion 1.4, Subversion clients earlier than 1.4 are not able to work with working copies produced by Subversion 1.4. If you are using Subclipse plugin 1.2 (which includes Subversion 1.4), you might receive the following error message:
  svn: This client is too old to work with working copy '.'; please get a newer Subversion client
  This means that SOAtest is using a command-line client that is version 1.3 or older. The solution is to update your command-line SVN client to version 1.4. The client version can be verified by executing svn --version.

If your team is using one of these source control systems and performs any necessary configurations (as described later in this topic), SOAtest can:
• Use file revision history data in order to automatically assign test failures or policy violations to the responsible team members. See "Configuring Task Assignment", page 215 for details.
• Update projects from source control before testing. See "Defining Common Options that Affect Multiple Analysis Types (Common Tab)", page 249 for details (the related setting is Source Control> Update projects).
• Automate the peer review process for the various quality artifacts involved in delivering secure, reliable, compliant SOA. See "Code Review", page 619 for details.

Why do dialogs open when I try to modify files?
Some source controls (including ClearCase, Perforce, Synergy, and Visual SourceSafe) require users to mark (lock) sources before editing them. If you are using one of these source control systems and you prompt SOAtest to perform an operation that involves editing a "read-only" file in source control, SOAtest will first open a dialog asking you whether you want to make the file writeable and lock it. Click OK, then provide your source control username and password in the next dialog that opens; this allows SOAtest to access the source control system and set the lock.

Enabling Source Control Support
To enable support for any of the supported source control systems:
1. Make sure that the command line client for the given source control is on the system %PATH%/$PATH and is available when SOAtest is launched.

For example, if you have Subversion, it is neither sufficient nor required to install the Subclipse plugin for Eclipse (the SVN Eclipse plugin). Instead, you need the plain command line svn.exe Subversion client.

2. Choose SOAtest> Preferences. The Preferences dialog will open.
3. Select SOAtest> Scope and Authorship in the Preferences dialog.
4. Check Use source control to compute scope.


5. Select SOAtest> Source Control in the Preferences dialog.
6. Enable the check box for the source control system you want to use.
7. If the source control executable is not already on your system path, specify the path to it in the text field to the right of the source control system’s name.
8. Specify the source control properties required for the selected type of source control system by clicking New in the Defined Source Controls table, completing the Create Source Control Description dialog’s fields as appropriate for your system, then clicking OK.

The fields in the Create Source Control Description dialog are described below.

9. Click OK to close the Source Control Description dialog.
10. Click Apply in the Preferences dialog.
11. Click Apply, then OK.
To test the integration:
1. In the SOAtest environment, open a project that is checked out from the repository.
2. Open a file in the editor.
3. Right-click the source code, and choose SOAtest> Show Author at Line. If the correct author is shown, the integration was successful.

Debugging Tip
To troubleshoot problems with source control integration, start SOAtest with the arguments -consolelog -J-Dcom.parasoft.xtest.logging.config.jar.file=/com/parasoft/xtest/logging/log4j/config/logging.on.xml. This should result in detailed log information being printed to the console.

To include messages from the source control system that may contain fragments of user source code, use an additional flag: -Dscontrol.log=true
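For example, a standalone launch with both logging options enabled might look like the following (the -J-D prefix for the scontrol.log property is an assumption based on the other standalone examples in this guide; Eclipse plugin users would pass the properties after -vmargs instead):

SOAtest -consolelog -J-Dcom.parasoft.xtest.logging.config.jar.file=/com/parasoft/xtest/logging/log4j/config/logging.on.xml -J-Dscontrol.log=true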

AccuRev Configuration
When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

Server: Enter the hostname of the server where AccuRev is running (required).



Port: Enter the port of the server where AccuRev is running (required).



Username: Enter the AccuRev username/login (required).



Password: Enter the AccuRev password (if needed).

ClearCase Configuration
To use ClearCase with SOAtest:

Check whether a file is controlled by ClearCase by calling the cleartool describe -fmt %Vn <file_path> command. No output means that the file is not controlled by ClearCase.



Ensure that the VOB root directory contains a lost+found directory.

When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:




VOB location: Enter the dynamic or snapshot VOB access path. Use the full VOB path (e.g., /vobs/myvob for a Linux dynamic view, or M:\\my_dynamic_view\myvob for a Windows VOB path). Note that when you enter a VOB location, the VOB tag field will automatically display the VOB tag. If the location is not a proper VOB path, a warning message is displayed.

CM Synergy Configuration
When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

Database path: Enter the absolute Synergy database path.



Engine host: Enter the Synergy server’s machine name or IP address.



User: Enter the user name under which you want to connect to the repository.



Password: Enter the password for the above user name.



Use remote client (UNIX systems only): Enable this option if you want to start CCM as a remote client session.



Local database (remote client): Enter the path to the location where the database information is copied when you are running a remote client session.

CVS Configuration
To use CVS with SOAtest, ensure that the .cvspass file is in one of the following locations:

user.home system property



HOME env variable



(Windows) combination of HOMEDRIVE and HOMEPATH (example: "C:" + "\home")



current working directory

When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:
General tab

Connection type: Enter the authentication protocol of the CVS server.



User: Enter the user name under which you want to connect to the repository.



Password: Enter the password for the above user name.



Repository path: Enter the path to the repository on the server.



Server: Enter the CVS server’s machine name or IP address.



Port: Enter the CVS server’s port.

Custom SSH/CVS_RSH tab

CVS_SERVER value: If connecting to a CVS server in EXT mode, this specifies which CVS application to start on the server side.



Use custom authentication properties for ext/server method: Enable this option if you want to use custom authentication for ext/server method.



Remote shell login: Enter your SSH login.



Remote shell password: Enter the password for the above SSH login.



Private key file: Enter the private key file.




Passphrase for private key file: Enter the passphrase for the above private key file.



Use command-line program to establish connection: Enables you to run an external program to establish an EXT connection. Use this option only for non-standard and legacy protocol connections (telnet, rsh). Linux/Unix/Cygwin ssh prompts for passwords/passphrases/security word sequences are not currently supported.



CVS_RSH path: Specifies the full path to the executable used to establish EXT connections.



CVS_RSH parameters: Specifies the parameters for the executable (see the example after this list). The following macro-definitions (case sensitive) can be used to expand values into command line parameters:

{host} - host parameter



{port} - port parameter



{user} - user parameter from primary page



{password} - user password from primary page



{extuser} - user parameter from EXT/CVS_RSH page



{extpassword} - password parameter from EXT/CVS_RSH page



{keyfile} - path to key file



{passphrase} - password to key file
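For example, if you establish EXT connections through PuTTY's plink.exe, the two fields might be filled in along these lines (the installation path is an assumption, and the exact options depend on the SSH client you use):

CVS_RSH path: C:\Program Files\PuTTY\plink.exe
CVS_RSH parameters: -l {extuser} -pw {extpassword} -P {port} {host}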

Perforce Configuration
When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

Server: Enter the Perforce server’s machine name or IP address.



Port: Enter the Perforce server’s port.



User: Enter the user name under which you want to connect to the repository.



Password: Enter the password for the above user name.



Client: Enter the client workspace name, as specified in the P4CLIENT environment variable or its equivalent.

Serena Dimensions Configuration


Linux and Solaris Configuration Note
To use Serena Dimensions with SOAtest, Linux and Solaris users should run SOAtest in an environment prepared for using Serena programs such as 'dmcli':

LD_LIBRARY_PATH should contain the path to <SERENA Install Dir>/libs.



DM_HOME should be specified.

Since many Solaris users commonly set the required Serena variables by running the Serena dmgvars.sh file, it is also necessary to modify the LD_LIBRARY_PATH variable. To use Serena Dimensions with SOAtest, LD_LIBRARY_PATH needs to include the following items (paths can be different on client machines):

SSL/Crypto library - /usr/local/ssl/lib



STDC++ library - /usr/local/lib

When you are enabling source control support, specify the following repository properties:

Host: Enter the Serena Dimensions server host name.



Database name: Enter the name of the database for the product you are working with.



Database connection: Enter the connection string for that database.



Login: Enter the login name.



Password: Enter the password.



Mapping: Enter an expression that maps workspace resources to Serena Dimensions repository paths.

Example 1: If you use scontrol.rep.serena.mapping_1=${project_loc\:MyProject};PRODUCT1\:WORKSET1;src\\MyProject, then Project 'MyProject' will be mapped to the Serena workset PRODUCT1:WORKSET1 and workset relative path: src\\MyProject



Example 2: If you use scontrol.rep.serena.mapping_2=${workspace_loc};PRODUCT1\:WORKSET1 then the complete workspace will be mapped to the Serena workset PRODUCT1:WORKSET1.

StarTeam Configuration
To use StarTeam with SOAtest:

Ensure that you have the Borland StarTeam SDK installed. This can be downloaded for free from the Borland web site.

When you are enabling source control support, specify the following repository properties:

Server: Enter the StarTeam server’s machine name or IP address.



Port: Enter the StarTeam server’s port.



User: Enter the user name under which you want to connect to the repository.



Password: Enter the password for the above user name.


Subversion Configuration
SOAtest’s Subversion support is based on the command line client 'svn'. To use Subversion with SOAtest, ensure that:

The Subversion 1.2.x, 1.3.x, or 1.4.x client is installed.



The client certificate is stored in the Subversion configuration area. The Subversion client has a built-in system for caching authentication credentials on disk. By default, whenever the command-line client successfully authenticates itself to a server, it saves the credentials in the user's private runtime configuration area—in ~/.subversion/auth/ on Unix-like systems or %APPDATA%/Subversion/auth/ on Windows.
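For example, you can populate this cache by running any Subversion command once, from the same account that will run SOAtest, against your repository and letting the client store the credentials (the repository URL matches the example shown below; the username is hypothetical):

svn list --username jsmith svn://buildmachine.foobar.com/home/svn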

When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

URL: Enter the URL for the SVN server. The URL should specify the protocol, server name, port and starting repository path (for example, svn://buildmachine.foobar.com/home/svn).



User: Enter the user name under which you want to connect to the repository.



Password: Enter the password (not encoded) for the above user name.

Visual SourceSafe Configuration
When you are enabling source control support, specify the following repository properties in the Create Source Control Description dialog:

VSS Database Path: Enter the database path (the location of SRCSAFE.INI).



User: Enter the user name under which you want to connect to the repository.



Password: Enter the password for the above user name.



Project root in repository: Enter the project root. This begins with $/; for example, $/nightly_test.

Specifying Source Control Definitions from the Command Line Interface (cli)
Source control definitions can be specified from the command line, using local settings files (described in “Local Settings (Options) Files”, page 266). The fastest and easiest way to add source control settings to a local settings file is to export them directly into a new or existing file; an example invocation that uses such a file appears after the steps below. To do this:
1. On a SOAtest installation with your source control repositories defined, choose SOAtest> Preferences. The Preferences dialog will open.
2. Select SOAtest> Source Control in the Preferences dialog.
3. Click the Export to local settings file button, then specify the file where you want the settings saved.

If you select an existing file, the source control settings will be appended to that file. Otherwise, a new file will be created.



Exported passwords will be encrypted.
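The exported file can then be passed to the command-line interface when tests are run; a sketch of such an invocation follows (the -localsettings option name and the file name are assumptions here; see the local settings documentation referenced above for the exact syntax):

soatestcli -localsettings scontrol.properties ...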


Connecting All SOAtest Installations to Team Server

This topic describes how to connect all team SOAtest installations to Parasoft Team Server, which supports centralized administration and application of test practices. Sections include:

About Team Server



Prerequisites



Connecting SOAtest to Team Server



Extending the Team Server Timeout Period



Exporting Team Data

About Team Server
Parasoft Team Server is the software that manages the team-wide distribution and sharing of Test Configurations, rules, rule mappings, suppressions, skipped resources, code review tasks, and test results. All team SOAtest machines should be connected to Team Server to enable centralized administration and application of test practices. The Team Server module ensures that all team members have access to the appropriate team Test Configurations, suppressions, rule files, and test case files. Team Server is available and licensed separately. This version of SOAtest works with Team Server 2.0 and higher, which is distributed as part of Parasoft Server Tools.
After Team Server is installed and deployed as a Web service, the team architect or manager can configure the appropriate team settings and files on one SOAtest installation, then tell Team Server where to access the settings and related test files. Team members can then point their machines to the Team Server location, and Team Server will ensure that all team machines access the appropriate settings and files. When the master version of a file is modified, added, or removed, Team Server makes the appropriate updates on all of the team’s SOAtest installations.
Before you can use Team Server to share SOAtest files, Team Server must be installed and deployed on one of your team’s machines. For information on obtaining, installing, and deploying Team Server, refer to the Team Server documentation or contact your Parasoft representative.

Prerequisites
Before you proceed with the team deployment, ensure that Team Server is successfully installed and deployed on one of your organization’s machines. If you need information on obtaining, installing, or deploying Team Server, contact your Parasoft representative.

Connecting SOAtest to Team Server
After Team Server is installed and deployed, you need to connect all team machines to that Team Server. If a SOAtest installation is not connected to Team Server, Team Server will not provide file/configuration/task sharing and management for that installation. To connect the team’s SOAtest installations to Team Server, perform the following procedure on every SOAtest installation used by the team:
1. Choose SOAtest> Preferences to open the Preferences dialog.


2. Select the SOAtest> Team category in the left pane.
3. Enable the Enable Team Server option in the SOAtest Team preferences page.
4. If the appropriate Team Server is not already set, select it from the Autodetected servers list and click Set. Or, manually enter your Team Server’s host (either a name or an IP address) in the Host name field, then enter its port in the Port number field.
5. If you want to minimize the number of operations on Team Server by reusing cached data, check Enable cache mode.

This can improve performance, but there is a small risk that outdated rules or Test Configurations could be distributed (if the file was updated since the caching, which is set to occur every 8 hours by default). If a file has been updated since the caching, users can force a refresh by clicking Refresh.

6. If your team requires users to log in to Team Server, check Enable account login and then enter your Team Server username and password in the appropriate fields. Depending on how your Team Server was configured, each team member might have a unique Team Server username and password, or all developers might share a single "generic" account.
7. Click Test Connection to verify the connection to Team Server.
8. Click Apply to apply your settings.
9. Click OK to set and save your settings.

Extending the Team Server Timeout Period
By default, SOAtest waits 60 seconds for a response from Team Server; if a response is not received within this time, it times out. If you want SOAtest to wait longer for a response from Team Server before timing out, you can extend the timeout as follows:

For the standalone: Start the tool using the argument -J-Dparasoft.tcm.timeout=[timeout_in_seconds]



For the plugin: Start the tool using the argument -vmargs -Dparasoft.tcm.timeout=[timeout_in_seconds]
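For example, to make SOAtest wait two minutes for Team Server responses (the executable names follow the conventions used elsewhere in this guide):

SOAtest -J-Dparasoft.tcm.timeout=120
eclipse -vmargs -Dparasoft.tcm.timeout=120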

Exporting Team Data
You may occasionally want to export team data. You can copy:

All data from one Team Server account to another (with or without transforming the paths to use a new location).



Suppressions and resource data from one location to another within the same Team Server account.

To export team data:
1. Open the Team Server page in the Preferences panel.
2. Click Export Team Data.
3. Use the available controls to specify what data you want exported, where you want it exported, and whether you want paths to be transformed during export.
Exporting Team Server data may be especially useful when...

Renaming IDE projects: To ensure that resource data settings and suppressions are still available after renaming a project, you can use the export wizard to copy data with path relocation.



Creating a new version of the general project connected to a new Team Server user: When a new version of a project is created in source control (a branch), it is recommended that you also create a new Team Server user that will control configurations, rules, suppressions and other data for the given project version. Initially, the new area on Team Server should be filled from the current project. After creating the Team Server user, you can use the wizard to copy all data from the current user to the new one. This configures Team Server to support two separate areas for two versions of the product. From this point forward, any changes in configurations, rules, or suppressions in one version will not affect settings in the other version.




Modifying the project/solution layout: For example, assume that your team decides to add artifacts in separate folders: you have all artifacts in /My Project/src/... but want to have them in /My Project/... To make this move without losing the data on Team Server, you can copy data from /My Project/src to /My Project.


Connecting SOAtest Server to Report Center

This topic describes how to connect the SOAtest Server Edition installation to Parasoft Report Center, which provides dashboards, reports, and metrics that help team members evaluate the project’s overall quality and readiness. Sections include:

About Report Center



Prerequisites



Configuring SOAtest to Send Results to Report Center



Configuring Report Center Attributes



Accessing Report Center Reports

About Report Center
Parasoft Report Center is a decision support system that provides teams the on-going visibility and measurement of the software development process necessary to help keep software projects on track. Collecting and consolidating metrics generated during the SDLC, Report Center turns these data points into meaningful statistics and dashboards that provide development managers and team members with the ability to continuously and objectively assess the quality and readiness of the project, the status of the coding process, and the effectiveness of the team.
With Report Center, teams can more readily identify, respond to, and manage risks that threaten project schedules and quality. Report Center provides the metrics by which management can more effectively assess and direct resources, set and monitor development targets, communicate, guide and measure conformance to development policies, and ensure successful project outcomes.
Once SOAtest is configured to send information to Report Center, developers, testers, architects, and managers can use the Report Center dashboard to access role-based reports on quality, progress, and productivity.

Prerequisites
If your team is using Report Center, at least one SOAtest installation—the SOAtest Server Edition—should be connected to Report Center. Before you connect the SOAtest Server Edition to Report Center, ensure that Report Center is successfully installed and deployed on one of your organization's machines. If you need information on obtaining, installing, or deploying Report Center, contact your Parasoft representative.

Configuring SOAtest to Send Results to Report Center
Before a SOAtest installation can send results to Report Center, you must connect it to Report Center. Most teams connect only their SOAtest Server installation to Report Center because they do not want their Report Center reports and statistics to contain data from tests performed on developer desktops. However, SOAtest Professional and/or SOAtest Architect installations can also be configured to send data to Report Center. To connect a machine to Report Center:


1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the SOAtest> Report Center / Project Center category in the left pane.
3. (Optional) If your team wants to run tests on different code/IDE projects that are components of a larger "global project," share quality and code review tasks across these project components, import results for them all, and report everything to Concerto Report Center, specify the name of the general project you want the data collected under in the General Project field. You can enter a new project name, or click Find to locate an existing one (one already defined in Concerto).

For example, you might have a general project for "MyProduct" that includes projects such as product_engine, product_gui, product_utils. Results for all of these projects will then be sent to Concerto and reported under "MyProduct."

4. Check the Send results to Report Center and Project Center option.
5. (Optional) If you want Report Center to log results from this SOAtest installation as the results from the nightly tests, check the Log as nightly option.

If you enable this option, results will be logged at the group level in Report Center.

6. (Optional) If you want this machine to send only a brief summary report to Report Center, check the Send summary only option.

This is recommended for SOAtest Professional or Architect installations that are configured to send data to Report Center.



If you enable this option, only results summaries from static analysis and test execution (not individual violations) will be sent to Report Center.

7. Enter your team’s Report Center Server host (either a name or an IP address) in the Server host name field.
8. Enter your team’s Report Center Data Collector port in the Data Collector port field.
9. Click Test Connection to verify this connection.
10. Enter your team’s Report Center Report Server port in the Report Server port field.
11. Click Test Connection to verify this connection.
12. (Optional) Add Report Center attributes as described in Configuring Report Center Attributes below.
13. Click Apply.
14. Click OK to set and save your settings.

Configuring Report Center Attributes
Report Center attributes help you mark results in ways that are meaningful to your organization. They also determine how results are grouped in Report Center and how you can filter results in Report Center. By default, the following information is sent to Report Center:

user



machine



date



product



product version




project name

Additional Report Center attributes can be specified in two ways:

Through the SOAtest GUI.



Through a SOAtest command-line interface local settings file.

To set general attributes (attributes that apply to all tests) through the GUI:
1. Do one of the following:

If you want to set attributes that will be applied to all tests run by this SOAtest installation, choose SOAtest> Preferences to open the Preferences dialog, then select the SOAtest> Report Center / Project Center category in the left pane.



If you want to set attributes that will be applied only to tests of a specific project or file, right-click the project that you want to set Report Center attributes for, choose Properties from the shortcut menu, then select SOAtest> Report Center Attributes in the left pane.

2. Use the available controls in the Attributes section to add or import Report Center attribute settings. Each attribute contains two components: a general attribute category name and a specific identification value.

Example 1: If your group wants to label results by project names that are more specific than those used in the SOAtest project, you might use the attribute name PROJECT_NAME and the attribute value projname1. For the next project, you could specify an attribute with the attribute name PROJECT_NAME and the attribute value projname2.



Example 2: If your organization wants to label results by division, you might use the attribute name DIVISION and the attribute value division1 for all tests performed by your division. Another division could specify an attribute with the attribute name DIVISION and the attribute value division2.



Example 3: If your group wants to label results by project versions, you might use the attribute name VERSION and the attribute value 1.1. For the next project, you could specify an attribute with the attribute name VERSION and the attribute value 1.2.

For details on setting attributes through a SOAtest command-line interface local settings file, see “Local Settings (Options) Files”, page 266.

Accessing Report Center Reports
To access Report Center reports based on information from SOAtest tests and other sources:

Choose SOAtest> Explore> Report Center Reports, or open the reports as described in your Report Center User’s Guide.


Connecting All SOAtest Installations to Parasoft Project Center

This topic explains how to work with Parasoft Project Center from the SOAtest environment. You can have Mylyn drive the process of connecting to Project Center, identifying tasks, and activating/deactivating tasks—or you can perform these actions manually (via the Task Assistant) if Mylyn is not installed (or if you prefer not to use it). Sections include:

Working with Project Center through Mylyn



Working with Project Center without Using Mylyn



Configuring Task Assistant Preferences

Working with Project Center through Mylyn
This section covers:

Prerequisites



Understanding the Interface



Defining a Task Repository



Defining a Task Query



Working with Project Center Tasks

Prerequisites
Before you start interacting with Parasoft Project Center in Mylyn-driven mode, you must have Mylyn installed in your Eclipse IDE. For SOAtest standalone, Mylyn is included. For the SOAtest Eclipse plugin, you need Mylyn 3.0 or higher and Eclipse 3.3 or 3.4. If you already have Eclipse 3.3 or 3.4 but not Mylyn, you can obtain Mylyn at:

For Eclipse 3.3: http://download.eclipse.org/tools/mylyn/update/e3.3



For Eclipse 3.4: http://download.eclipse.org/tools/mylyn/update/e3.4

Understanding the Interface
There are three views for interacting with Project Center in Mylyn-driven mode:

Mylyn> Task Repositories: Enables you to define Mylyn's task repositories.



Mylyn> Task List: Main working area for task management. Enables you to define task queries, open tasks, edit tasks, and close tasks.



SOAtest> Task Assistant: Tracks data related to the Project Center task you are working on.

To activate each view, use the Window> Show View> Other option. Mylyn views are available in the Mylyn category, while the Task Assistant view is available in the SOAtest category.


Defining a Task Repository
Before you can start working on tasks, a task repository that points to the Project Center Server needs to be defined. A task repository can be defined as follows:
1. Open the Mylyn> Task Repositories view.
2. Right-click in that view and choose Add Task Repository (or click the Add Task Repository view toolbar button).
3. In the Add Task Repository dialog, choose Project Center, then click Next.
4. Complete the Project Center Repository form as follows:

Server: URL of your Project Center server, for example, http://concerto.example.com:8080



Label: Any string.



User ID: Your Project Center login.



Password: Your Project Center password.

5. Click Finish.
After adding the Project Center Task Repository, it should be displayed in the Task Repositories list under the label you selected.

Defining a Task Query


Before you can start working on tasks, a task query for each developer needs to be created. Once the query is defined, all tasks that are assigned to the developer and that have a status of "open" and "in progress" will be imported from Project Center to Eclipse. To define a task query:
1. Open the Mylyn> Task List view.
2. Choose New> Query... . The New Repository Query dialog is displayed.
3. Select the Project Center repository. The Edit Repository Query form is displayed and contains the same fields as the Task Search page in Project Center.
4. Fill in one or more fields to query for Project Center tasks that meet your criteria. For example, if you would like to view your open tasks, select Open in the Status field, and then enter your name in the Name field.

5. Click Finish. The query is displayed in the Task List view with tasks matching the specified search criteria.
Note: If the query does not appear in the view, clear Focus on Work Week on the Task List toolbar (or via the right-click menu). It could be that no tasks in the newly-defined query are scheduled for the current work week.
Eclipse connects to Project Center to import tasks for the specified developer. You can refresh to view the updated list. Tasks displayed are sorted based on the priority defined in Project Center.

Working with Project Center Tasks
Viewing Tasks
Project Center tasks are prioritized based on either Priority or Planned Start Date. Tasks assigned to you are sorted by work priority. You should set your view based on how you prioritize your tasks:

Categorized view: If your tasks are prioritized based on priority, stay in this view.




Scheduled view: If your tasks are prioritized based on Planned Start Date, switch to this view.

Activating Tasks
To work on tasks:
1. Double-click the appropriate task. The Edit Task page is displayed in Mylyn.

2. Activate the task using Mylyn. In the Edit Task page, change task status to "in progress" and apply the changes by clicking the Submit button.
3. (Optional) Open the task in Project Center to verify that the status changed to In progress.

Tracking Task Implementation with the Task Assistant
From this point forward, you can work on your task development in Eclipse/SOAtest. As you do so, the Task Assistant monitors any changes made to the project files. You can see the modified task files and tests (if test collection is enabled) in the Task Assistant view.

This view lists the resources related to the current task, and provides information about revisions stored in source control.

Use the Expand All and Collapse All toolbar buttons to quickly display and hide details.



Use the right-click menu commands to:

Open an editor for the selected resource.



Expand part of the tree.



Indicate that a resource is not related to the current task.

Updating Working Time Estimates
When you start Eclipse each day, the Tasks Management dialog is displayed.


If needed, update the number of days that you need to finish the active tasks based on your own estimation, then click OK. The Estimated Remaining Working Time field for your task will be updated accordingly, and managers will be able to review the number of days left until this task is finished.

Closing Tasks
To close a task:
1. In the Task Editor, add a comment in the New Comment field.
2. If the task was successfully accomplished, select Success (under Actions), and then click Submit. Alternatively, you can cancel the task (Canceled option) or un-assign it (Open option).
3. Deactivate the task in Mylyn. The Deactivate wizard will open and list the files that you changed while working on this task.
4. If you want to see a text summary report of the file modifications related to this task (so you can copy/paste it), enable Show me a dialog with reported revisions in text format.
5. Click the OK button to send information about the modified files to Project Center.

Configuring Task Assistant Preferences
For details on configuring Task Assistant preferences, see “Configuring Task Assistant Preferences”, page 213.

Working with Project Center without Using Mylyn
If you do not have Mylyn installed (or you do not want to use it), you can work with Project Center tasks exclusively through the Task Assistant. This section covers:

Connecting to the Project Center Server



Opening the Task Assistant



Identifying Tasks



Working with Tasks

Connecting to the Project Center Server
Before you can start working on tasks, the location of the Project Center server needs to be defined as described in “Connecting SOAtest Server to Report Center”, page 203.

Opening the Task Assistant


To open the Task Assistant, choose Window> Show View> Other, then choose SOAtest> Task Assistant.

Identifying Tasks
To identify tasks, you search for them using the Project Center web interface. You can click the My Tasks button on the Task Assistant toolbar to open this interface. The interface will show your assigned tasks, and allow you to search through the entire repository of tasks.

When you identify the task you are going to work on, note the task id. You will need to enter this in the Task Assistant view.

Working with Tasks
You can use the Task Assistant to activate tasks, track the files related to those tasks, and close tasks.

Activating Tasks
To activate a task:
1. Ensure that the project related to this task is available in the IDE.
2. In the Task Assistant, enter the task ID and press Enter. The program will connect to Project Center and retrieve data about this task. The task name will display in the Task Assistant.
3. Click the Activate button in the Task Assistant toolbar (this is the button on the far right). The Activate wizard will open and allow you to easily switch the task to the "in progress" state—as well as enter the estimated time for this task.


Reviewing or Modifying Task Details
If you want to review details of an active task or modify task details, click the Edit Task toolbar button.

Closing Tasks
To close a task that you have finished working on:


1. Click the Task Assistant’s Deactivate toolbar button. The Deactivate wizard will open and list the files that you changed while working on this task.

2. In the Deactivate wizard, describe your changes in the New Comment field.
3. If the task was successfully accomplished, select Success. Alternatively, you can cancel the task (Canceled option) or un-assign it (Open option).
4. If you want 1) information about the modified files to be sent to Project Center and 2) all the listed file modifications to be associated with the task, then enable Notify Project Center about the following modifications.
5. If you want to see a text summary report of the file modifications related to this task (so you can copy/paste it), enable Show me a dialog with reported revisions in text format.

Configuring Task Assistant Preferences


You can configure Task Assistant preferences in the SOAtest preference panel’s Report Center/Project Center> Task Assistant page (to open the preference panel, choose SOAtest> Preferences). In the General tab, you can configure the following options:

Source control support: Determines Task Assistant’s level of interaction with the specified source control system.

Recommended: Allows Task Assistant to collect information about changed files and data required for Project Center reporting. With this option, source control is used to validate if a file was modified, what its current version is, whether a change was reverted, and so on. If your source control connection is slow, this option may also work slowly; in that case, we recommend the Minimal option.



Minimal: Allows Task Assistant to collect only the bare minimum of information required for Project Center reporting (e.g., only a few calls to get data regarding file revisions).



None: Does not allow Task Assistant to interact with source control. Project Center reporting is not meaningful in this mode, and is thus disabled.



Open related defect or enhancement in browser when changing task status: Determines whether the details for the defect/enhancement you are working on are shown when you change the task status.



Test collecting: Determines whether the tests related to a task are tracked and shown in the Task Assistant.



Automatically mark tests by task identifier: Determines if new and modified tests are automatically marked by task id.

In the Filters tab, you can specify patterns that match the types of files and tests you do NOT want the Task Assistant to track. For instance, you might not want to track the modifications to binary files such as files in the bin directory, class files, or jar files.
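For example, to exclude typical binary artifacts you might add patterns along the following lines (shown using the wildcard style used elsewhere in this guide; confirm the exact syntax that the Filters tab accepts):

bin/**
**.class
**.jar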


Configuring Task Assignment

This topic explains how to configure SOAtest to assign quality tasks to the various team members building and testing your SOA. Sections include:

About SOAtest’s Task Assignment



Understanding How SOAtest Assigns Tasks



Specifying How Tasks are Assigned



Directly Specifying Task Owners for Specific Resources



Specifying Team Member’s Email Addresses



Handling Authorship When Using Multiple Source Control Systems

About SOAtest’s Task Assignment
There are typically many different team members working on a SOA throughout the SDLC—from developers of Web services, to developers of the Web interface, to QA testers verifying end-to-end business processes. Using technologies ranging from peer review workflow automation, to static analysis, to SOA policy enforcement, to functional testing of individual SOA components as well as end-to-end test scenarios, SOAtest generates quality tasks for the various team members to perform. For example:

If the organization has a policy that the Web interface must comply with WCAG 2.0 accessibility guidelines, any Web developer who contributes non-compliant code to the project would receive a static analysis task to fix the noncompliant code.



If the team has a policy that certain critical artifacts must be peer reviewed, the assigned reviewer will be automatically notified with a review task if such an artifact is added or changed.



If a tester added a functional test to the test suite, then that test later fails as the system evolves, that tester will be assigned a "test review" task to determine if the test needs to be updated to remain in sync with intentional application changes, or if that test failure indicates a problem with the application functionality.

Understanding How SOAtest Assigns Tasks
SOAtest can assign tasks based on source control data in a number of ways.

If you are going to be statically analyzing source files or executing functional tests whose test files are stored in your source control system, you can set up SOAtest to use data from your source control system in order to assign tasks. Static analysis tasks are assigned to the person who introduced them. Test failures are assigned to the person who last worked on the related test.



If you are using SOAtest for functional testing, you can directly specify task owners for specific test suites or other resources.



If you will be using SOAtest to automate peer review, review task assignment is defined through the Code Review interface (described in the Code Review section).


Tip
If you have added source files to a project in the SOAtest environment, you can see the team member assigned to a particular line of that source file, as well as information about when it was last modified:
1. Open an editor for the appropriate file.
2. Right-click the line whose author you want to view, then choose SOAtest> Show Author at Line from the shortcut menu.
Note that if you are not using source control data to calculate task ownership, the message will show modification information for the file (rather than for the specific line selected).

Specifying How Tasks are Assigned
Task assignment settings can be specified:

Through the Scope and Authorship preferences page in the SOAtest GUI.



Through a command-line interface local settings file.

To use GUI controls to change task assignment settings:
1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the SOAtest> Scope and Authorship category in the left pane.
3. Use the available controls to indicate how you want SOAtest to assign tasks.

Use source control (modification author) to compute scope: Source control data will be used to assign tasks related to the source files you are analyzing with SOAtest.



Use file system (xml map) to compute scope: You will directly specify how you want tasks assigned for particular files or sets of files (for example, you want tester1 to be responsible for one set of .tst files, tester2 to be responsible for another set of .tst files, and so on). See “Directly Specifying Task Owners for Specific Resources”, page 216 for details.



Use file system (current user) to compute scope: The local user name will be used to compute authorship.

4. Click OK to set and save your settings.
For details on specifying these settings through a command-line interface local settings file, see “Local Settings (Options) Files”, page 266.

Directly Specifying Task Owners for Specific Resources
To directly specify how you want tasks for particular files or sets of files (for example, for .tst files) assigned:
1. Indicate that you will be entering mappings directly as follows:
a. Choose SOAtest> Preferences to open the Preferences dialog.


b. Select the Scope and Authorship category in the left pane.
c. Select Use file system (xml map) to compute scope.

2. Enter file-to-author mapping details as follows:
a. Select the Scope and Authorship> Authorship Mapping category in the Preferences dialog.
b. Specify your mappings in the authorship mapping table. Note that wildcards are allowed; for example:

?oo/tests/SomeTest.tst - assigns all files whose name starts with any character (except /) and ends with "oo/tests/SomeTest.tst"



**.tst - assigns all *.tst files in any directory



**/tests/** - assigns every file whose path has a folder named "tests"



tests/** - assigns all files located in directory "tests"



tests/**/Test* - assigns all files in directory "tests" whose name starts with "Test" (e.g., "tests/some/other/dir/TestFile.tst")

Mapping Order Matters
Place the most general paths at the end of the mapping. For example, the /ATM/** path is last here because it is the most general:

/ATM/unittests/**   user 1
/ATM/other/**       user 2
/ATM/**             user 3

This assigns unittests files to user 1, other files to user 2, and all other ATM project files to user 3. If you reversed the order, all ATM files would be assigned to user 3.

c. Click Save Changes.

3. Click Export to export the mappings as an XML file, then have team members import the mapping file.

If you are already sharing preferences across the team, this step is not necessary.

Alternatively, you can direct SOAtest to use the exported XML file by either:

Specifying the path to that file in the Preferences dialog’s Scope and Authorship> Authorship Mapping category (using the Shared File option).



Specifying the path to that file from the command line, using soatestcli -mapping {filename}. Note that this will override any authorship settings specified in the GUI. For details on setting attributes through a SOAtest command-line interface local settings file, see “Testing from the Command Line Interface (soatestcli)”, page 257.

A sample XML authorship mapping file follows:
<?xml version="1.0" encoding="UTF-8" ?>
<authorship>
  <!-- assigns all files named "foo/tests/SomeTest.tst" to "tester1" -->
  <file author="tester1" path="foo/tests/SomeTest.tst" />
  <!-- assigns all files whose name starts with any character (except /) and ends with "oo/tests/SomeTest.tst" to "tester2" -->
  <file author="tester2" path="?oo/tests/SomeTest.tst" />
  <!-- assigns all *.tst files in any directory to "tester3" -->
  <file author="tester3" path="**.tst" />
  <!-- assigns every file whose path has a folder named "tests" to "tester4" -->
  <file author="tester4" path="**/tests/**" />
  <!-- assigns all files located in directory "tests" to "tester5" -->
  <file author="tester5" path="tests/**" />
  <!-- assigns all files in directory "tests" whose name starts with "Test", i.e. "tests/some/other/dir/TestFile.tst", to "tester6" -->
  <file author="tester6" path="tests/**/Test*" />
</authorship>
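Once the file is saved (for example as authorship.xml, a hypothetical name), it can be referenced when running tests from the command line, as noted above:

soatestcli -mapping authorship.xml ...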

Specifying Team Member’s Email Addresses
By default, SOAtest assumes that each username value it detects is the team member’s username, and that the related team member’s email is [username]@[mail_domain]. However, in some cases, you might want to map the detected username to a different username and/or email address. For example:

If you want to reassign all of one team member’s tasks to another team member (for instance, if User1 has left the group and you want User2 to take care of all the tasks that would be assigned to User1). We will refer to this type of mapping as an "author to author mapping."



If a team member’s username does not match his email address (for instance, the detected username is john but the appropriate email is [email protected]). We will refer to this type of mapping as an "author to email mapping."

To map the default user value detected to a different username and/or email address:
1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
2. Open the Authors tab of the Browsing dialog.
3. If you want to specify "author to author" mappings:
a. Select the Reassign tasks from this author... To this author row.
b. Click Add.
c. Specify the desired mapping.

4. If you want to specify "author to email" mappings:
a. Select the Send emails for this author... To this email address row.
b. Click Add.
c. Specify the desired mapping.


5. Click the Done button to close the Browsing dialog.

Tip - Importing and Exporting to Share Authorship and Email Mappings
If you want to move mappings from one Team Server to another, use the Export button to export them from the first server, then use the Import button to import them on the other server.

Handling Authorship When Using Multiple Source Control Systems
If you and/or your team members work with multiple source control systems, but use the same login for all source control systems, start SOAtest as follows to ensure accurate authorship computation from source control:

Standalone: SOAtest -J-Duser.name=your_username ...



Plugin: eclipse .... -vmargs -Duser.name=your_username


Configuring Team Test Configurations and Rules

This topic explains how to share Test Configurations (and any rule files or rule mapping files that they depend on) across the team. Sections include:

About Team Test Configurations



Sharing Team Test Configurations



Modifying a Team Test Configuration



Setting the Team Favorite Test Configuration



Sharing Rule Mappings



Sharing Custom Rules

About Team Test Configurations
Team Test Configurations are the Test Configurations that apply the team’s designated test settings (for instance, the static analysis rules that your team has decided to follow, browser playback options for web functional tests, etc.). When all team members use the designated Test Configurations, tests will be run consistently, and the team’s quality and style guidelines will be applied consistently across the project. Once a team Test Configuration is added to Team Server, it will be accessible on all connected team SOAtest installations. If a Test Configuration uses custom rules and/or rule mappings, they can be added to Team Server, then automatically accessed by all connected team SOAtest installations.

Sharing Team Test Configurations
To share a Team Test Configuration team-wide, the architect or manager performs the following procedure on a SOAtest installation (Architect or Server Edition) that is already connected to Team Server:
1. If you have not already done so, create a user-defined Test Configuration that applies the designated team settings.

See “Creating Custom Test Configurations”, page 244 for instructions.

2. Upload that configuration to Team Server as follows:
a. Open the Test Configurations dialog by choosing SOAtest> Test Configurations.
b. Right-click the Test Configurations category that represents the Test Configuration you want to upload.
c. Choose Upload to Team Server from the shortcut menu.

You can configure multiple team Test Configurations (for instance, one for static analysis, one for regression testing, etc.).

Tip

If your Team Test Configuration uses custom rules or rule mappings, the related files can be shared as described later in this topic.


Modifying a Team Test Configuration
Team Test Configurations can be directly edited from SOAtest Architect Edition or Server Edition. To directly modify a Team Test Configuration from an Architect or Server Edition:
1. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. In the left pane, select Team> [your_team_Test_Configuration].
3. Modify the settings as needed.
4. Click either Apply or Close to commit the modified settings. The settings will then be updated on Team Server, and the updated settings will be shared across the team.

Alternate Update Method
You can update a Team Test Configuration by modifying the User-Defined Test Configuration it was based on, then repeating the Sharing Team Test Configurations procedure to re-upload the modified Test Configuration.

Setting the Team Favorite Test Configuration
The Team Favorite Test Configuration is the test scenario that SOAtest uses when any Team Server-connected team member starts a test without specifically indicating which Test Configuration to use (for example, when a team member starts a test by clicking the Test Using button). To set the Team Favorite Test Configuration, perform the following steps from a SOAtest Architect or Server edition:
1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
2. Open the Configurations tab of the Browsing dialog.
3. Select the Test Configuration that you want to serve as the Team Favorite Test Configuration.
4. Click the Set as Team Favorite button.

Sharing Rule Mappings
Rule mapping is a key part of configuring SOAtest to enforce your team’s or organization’s coding policy (e.g. by customizing the built-in rule names, severities, and categories to match the ones defined in your policy). You can use Team Server to ensure that all team members can access the rulemap.xml file (described in “Modifying Rule Categories, IDs, Names, and Severity Levels”, page 616) you have created to customize SOAtest rule categories and severity levels. To upload a rulemap.xml file to the Team Server, perform the following steps from SOAtest Architect Edition or Server Edition:
1. Launch SOAtest on a machine from which you can access the rulemap.xml file that you want to share.
2. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
3. Open the Rules tab of the Browsing dialog.


4. Click the Upload button. A file chooser will open.
5. Select the rulemap.xml file that you created, then click Open. The rulemap.xml file that you just uploaded should now be listed in the Browsing dialog’s Rules tab. The rule configurations specified in this file will be available on all SOAtest installations connected to Team Server.
6. Click Done, click Apply, and then close the SOAtest Preferences dialog.
7. Restart SOAtest. You do not have to stop the server first.
8. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
9. Select any Test Configuration and open the Static tab. The new rule settings should be applied.

Tip

If you later modify the master rulemap.xml file, you must repeat the Sharing Rule Mappings procedure to upload the modified file; if the modified file is not uploaded, the modifications will not be shared.

Sharing Custom Rules
You can use Team Server to ensure that all team members can access and check custom static analysis rules you have designed with the RuleWizard module. When Team Server manages a rule, all SOAtest installations connected to Team Server will automatically have access to the most recent version of the rule. If a rule changes and the modified rule is uploaded to Team Server, the version on all team SOAtest installations will be updated automatically.
The architect (or other designated team member) performs the following procedure on one SOAtest Architect or Server Edition that is already connected to Team Server:
1. Create one or more custom rules in RuleWizard.
2. Save each rule and assign it a .rule extension. You can save the rule in any location.
3. If any new rules should belong to a new SOAtest category, create a new category as follows:
a. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing SOAtest> Test Configurations in the drop-down menu on the Test Using toolbar button.
b. Select any Test Configurations category.
c. Open the Static> Rules Tree tab.

d. Click the Edit Rulemap button.
e. Open the Categories tab.
f. Click New. A new entry will be added to the category table.

g. Enter a category ID and category description in the new entry. For instance, an organization might choose to use ACME as the category ID and ACME INTERNAL RULES as the description.
h. Note the location of the rulemap file, which is listed at the top of this dialog. You will need this information in step 9.
i. Click OK to save the new category.

4. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.


5. Open the Rules tab of the Browsing dialog.
6. Click the Upload button. A file chooser will open.
7. Select one or more of the new .rule files that you created, then click Open. The .rule files that you just uploaded should now be listed in the Browsing dialog’s Rules tab. All rules represented in this tab will be available on all SOAtest installations connected to Team Server.
8. Add additional team rules by repeating the previous two steps.
9. If you added any new rule categories or made any other changes to the rule mappings, click Upload, select the edited rulemap file, then click Open. The file that you just uploaded should now be listed in the Browsing dialog’s Rules tab. This file will be available on all SOAtest installations connected to Team Server. This file tells SOAtest how to categorize the team rules.

Tip - Rules for the Coding Standards Tool
If the rule is going to be added to a Coding Standards test suite tool (as opposed to being run through a static analysis Test Configuration), you add it to the tree in the Coding Standards tool's configuration panel—not the Test Configuration panel described below.

10. Open the Test Configurations dialog by choosing SOAtest> Test Configurations.
11. Select any Test Configuration and open the Static> Rules Tree tab.
12. Click Reload. The new rule should be available in all available Test Configurations and classified under the Team category. The rule will be disabled by default.
13. If you want a Team Test Configuration to check these rules:
   a. Configure a new or existing Test Configuration to check these rules. The added rules will be disabled by default, so you will need to enable any rules that you want checked.
   b. Ensure that the modified Test Configuration is available to the team as described in "Sharing Team Test Configurations", page 220. You must follow this procedure even if you are modifying a Test Configuration that is already shared.
14. Click either Apply or Close to commit the modified settings.

Tips
• If your custom rule is visible in the Test Configuration rules tree (for instance, if you imported it via the rules tree Import button), you can upload it to Team Server by simply right-clicking the rule, then choosing Upload to Team Server from the shortcut menu.
• If you later modify a team rule, you must repeat the Sharing Custom Rules procedure to upload the modified rule file; if the modified .rule file is not uploaded, the rule modifications will not be shared.

Removing Rules From Team Server

To remove a rule from Team Server, the architect (or other designated team member) performs the following procedure from SOAtest Architect Edition or Server Edition:
1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.


2. Open the Rules tab of the Browsing dialog.
3. Select the rule you want to remove.
4. Click Delete.
5. Click OK.


Configuring Task Goals

This topic explains how managers can set goals for task reporting and task resolution, then have SOAtest apply these goals across the team's SOAtest installations. Sections include:
• About Global Goals
• Configuring Global Goals
• Importing Tasks for Specified Goals

About Global Goals

Global team goals are goals that may span multiple Test Configurations. For instance, a global goal might be:
• Test execution failures from any project should be fixed each day.
• Certain projects must have all static analysis tasks related to a given policy completed by a certain date.

To configure global goals, your SOAtest installation must have a SOAtest Server license. Global goals will be applied across the team if all team SOAtest installations are connected to Team Server as described in “Connecting All SOAtest Installations to Team Server”, page 199. Users without a Server license can review goals, but not configure them. When global goals are enabled, the Test Configuration manager’s Goals tab will be disabled.

Configuring Global Goals

To configure global goals:
1. Choose SOAtest> Preferences to open the Preferences panel.
2. Choose Tasks> Goals on the left.
3. Check the Enable global management button.
4. Click New.
5. Configure the new goal that is added to the table. You can configure global goal options such as:
   • The project(s) that the goal applies to.
   • The deadline for achieving the goal.
   • What the goal requires.

Importing Tasks for Specified Goals

When importing tasks from Team Server, team members can choose to import only the tasks related to a specified goal. To do this:
1. Choose SOAtest> Import> Custom Tasks or choose Custom Tasks from the Import My Recommended Tasks pull-down toolbar menu.
2. Select Import from Team Server server.
3. Select Filtered.
4. Select for goals.


5. Choose the appropriate goal from the goals box.

6. Click OK.


Sharing Project and Test Assets

This topic explains how to share your project and test assets across the team.

Sharing Test Assets

To ensure that your tests are accessible to other team members, as well as to the nightly test machine, share test assets such as the following through source control (a brief example follows the list):
• .tst files
• data sources
• custom scripts
• external regression controls
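For instance, assuming the team keeps these assets in Subversion (the repository layout and file names below are hypothetical and not part of SOAtest), a tester who creates a new test scenario might check it in along with its data source:

svn add tests/MyService.tst data/accounts.csv
svn commit -m "Add MyService regression tests and account data source" tests/MyService.tst data/accounts.csv

Once committed, the nightly test machine and other team members pick up the new assets with a normal source control update.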

Using a Team Project Set File (.psf) to Share the Project Across the Team

Once one team member creates a project, that team member can create a Team Project Set File (.psf) which can then be shared by other members of the team. Doing so allows every team member to create their Eclipse projects in the same uniform way. This is a necessary step for importing tasks from the automated nightly test process.

To create a Team Project Set File:
1. Select File> Export. The Export Wizard displays.
2. Within the Export Wizard, select Team> Team Project Set, and then click the Next button.
3. Select the projects to be included in the Team Project Set File by selecting the corresponding check boxes.
4. Enter the location where the Team Project Set File will be saved and click the Finish button.

To create a project from the Team Project Set File:
1. Select File> Import. The Import Wizard displays.
2. Within the Import Wizard, select Team> Team Project Set, and then click the Next button.
3. Browse to the desired Team Project Set and click the Finish button. The tests you selected display in the Test Case Explorer.

Sharing Preferences Across a Team

To share preferences, you can export them (ideally, to source control, a file server, twiki, etc.), then import them onto the other installations. When the desired preferences change, they will need to be manually exported and imported again.

To export and then import preferences:
1. In the workspace where they are set, export them by choosing File> Export, selecting General> Preferences, then completing the wizard.
2. In the new workspace, import them by choosing File> Import, selecting General> Preferences, then completing the wizard.



Configuring Automated Nightly Testing

You can run SOAtest from the command line to speed up your testing/development process and integrate SOAtest into your automatic nightly builds/deployments. The following example scenario describes how you might set up SOAtest to run automatically as part of a nightly testing and build process, including automatic test execution and reporting. We will assume that you have your SOAtest project files checked into a revision control system and that you have an existing automated job scheduler, such as the Windows Task Scheduler or Unix cron jobs. The goal is to have SOAtest run all project files that are checked into source control and to generate reports which will be visible on a web server.
1. Create directories on your test system for storing project files and reports.
2. Schedule a process in the task scheduler on the test system to check out the latest version of your SOAtest project and test files from your revision control system. This process should be scheduled to run daily.
3. Schedule another daily process to invoke SOAtest using the desired command line options; a sample script is sketched after this list.
• For details and examples, see the tutorial lesson "Automation/Iteration (Nightly Process)", page 165, as well as the topic "Testing from the Command Line Interface (soatestcli)", page 257.
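As a concrete illustration, here is a minimal sketch of such a nightly script for a Unix machine. It assumes Subversion as the revision control system, that soatestcli is on the PATH, and that the workspace path, project name ("MyProject"), Test Configuration name ("Nightly Configuration"), and report directory are placeholders you would replace with your own values; none of these names come from SOAtest itself.

#!/bin/sh
# Nightly SOAtest run (sketch). Refreshes the test assets, runs them from the
# command line, and writes a report into a web-served directory.

WORKSPACE=/home/nightly/workspace          # Eclipse workspace used by soatestcli
REPORT_DIR=/var/www/html/soatest-reports   # directory published by the web server

# Refresh the test assets from source control (Subversion shown as an example).
svn update "$WORKSPACE/MyProject"

# Run every test in the project with the desired Test Configuration and
# name the report by date.
soatestcli -data "$WORKSPACE" \
           -resource "MyProject" \
           -config "user://Nightly Configuration" \
           -report "$REPORT_DIR/report-$(date +%Y%m%d)"

A crontab entry along the lines of 0 2 * * * /home/nightly/soatest_nightly.sh would then run the script at 2:00 a.m. each night; on Windows, an equivalent .bat file can be scheduled with the Task Scheduler.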


Team-Wide Deployment - Usage Overview

In this section:
• Using a Team Deployment: Daily Usage Introduction
• Creating Tests and Analyzing Source Files from the GUI
• Reviewing and Responding to Tasks Generated During Nightly Tests
• Accessing Results and Reports
• Reassigning Tasks to Other Team Members
• Monitoring Project-Wide Test Results


Using a Team Deployment: Daily Usage Introduction

Once a team deployment is configured as described in "Configuring a Team Deployment: Introduction", page 191, daily usage should involve:
• Developers/QA - Creating Tests and Analyzing Source Files from the GUI
• Developers/QA - Reviewing and Responding to Tasks Generated During Nightly Tests
• Managers/Architects - Monitoring Project-Wide Test Results


Creating Tests and Analyzing Source Files from the GUI

Team members who are testing SOA systems typically use their local installations of SOAtest to develop and run tests, then check their test artifacts (.tst files, etc.) into source control so that they can be integrated into the nightly regression test suite. Team members who are working on source files typically use their local installations of SOAtest to ensure that each new or modified source file complies with the organization's policy, address the reported violations, then add the compliant files to source control.


Reviewing and Responding to Tasks Generated During Nightly Tests

This topic explains the recommended procedure for team members to follow each morning to review the results from the team's SOAtest Server (soatestcli) tests. soatestcli should be configured to run the team's designated functional test suite and policy enforcement/static analysis tests nightly to identify any quality tasks that require the team's attention (failed test cases, policy violations, etc.).

To access these tasks, each team member performs the following procedure each morning:
1. (Optional) Review the emailed report of test results to determine if any new tasks were assigned to you. See "Understanding Reports", page 300 for report details.
2. Import results by choosing SOAtest> Import> [preferred category of tasks]. See "Accessing Results and Reports", page 234 for other import options.
3. Review and respond to results. See "Viewing Results", page 290 for details.
   • If a test failure is reported, determine if the failure was the result of an expected change (for instance, an application change made in response to a modified requirement) or a problem with the application. For expected changes, the test case needs to be modified. For unexpected changes, the application needs to be corrected.
   • If a policy violation is reported, either correct the violation, or use code review to discuss whether it is a good candidate for a suppression.
4. Reassign tasks if needed. See "Reassigning Tasks to Other Team Members", page 238 for details.
5. Add any modified code and/or test assets (test cases, data sources, etc.) to the designated source control location.


Accessing Results and Reports

This topic explains how to access test results available on Team Server. If SOAtest results (e.g., from a nightly command line test) are sent to Parasoft Team Server, team developers can import test results into the SOAtest GUI. This way, project-wide/team-wide tests can be run automatically each night on a central server machine. Each team member can import his or her assigned tasks (failed tests to review, policy violations to address, etc.) into the GUI and then review and respond to them in the SOAtest view. In addition, team members can download and/or open those reports from any SOAtest installation connected to Team Server or from any machine that can access the Team Server Web server. Sections include:
• Importing Results From Team Server into the SOAtest GUI
• Accessing Team Server Reports through the GUI
• Accessing Team Server Reports through a Web Browser
• Accessing Results through Report Center

Terminology
• Your tasks: The subset of all of your testing tasks that you are responsible for (based on SOAtest's task assignments, which are discussed in "Configuring Task Assignment", page 215).
• Recommended tasks: The subset of all of your testing tasks that SOAtest has selected for you to review and address today (based on the maximum number of tasks per developer per day that your team has configured SOAtest to report, as described in "Defining Task Reporting and Resolution Targets (Goals Tab)", page 250).
• Your recommended tasks: The subset of all of your testing tasks that 1) you are responsible for (based on SOAtest's task assignments, which are discussed in "Configuring Task Assignment", page 215) and 2) SOAtest has selected for you to review and address today (based on the maximum number of tasks per person per day that your team has configured SOAtest to report, as described in "Defining Task Reporting and Resolution Targets (Goals Tab)", page 250).

Importing Results From Team Server into the SOAtest GUI

Any team member whose SOAtest installation is connected to Team Server will be able to import the test results stored on Team Server. When results are imported, test results are shown in the GUI as if the test were run in the GUI. After the import, you can drill down into results in the normal manner. You can import a specific category of tasks, tasks for specific resources you have worked on or are responsible for correcting, or all tasks. You can only import results for projects that are currently in your workbench. If the tested project files were modified in your workbench since the test was run, the results will not be reported because they might not correspond to your modified version of the project files.


Tip: Importing Tasks if Tests Are Not Performed Frequently

By default, SOAtest is configured to import tasks from tests performed within the past 2 days. If your team doesn't run tests frequently and you try to import tasks more than 2 days after the test has run, nothing will be imported—unless you change the default settings. To change the default test import settings:
1. Choose SOAtest> Preferences. The Preferences dialog will open.
2. In the left pane, select SOAtest> Tasks.
3. Modify the setting for Import only tasks reported for tests ran in the last n days.

Importing Your Recommended Tasks

To import your recommended tasks reported for all test runs that were performed in the previous 24 hours and whose results were sent to Team Server:
• Choose SOAtest> Import> My Recommended Tasks or click the Import My Recommended Tasks toolbar button.

Importing All Your Tasks

To import your testing tasks reported for all test runs that were performed in the previous 24 hours and whose results were sent to Team Server:
• Choose SOAtest> Import> All My Tasks or choose All My Tasks from the Import My Recommended Tasks pull-down toolbar menu.

Importing a Custom Set of Tasks

To import a custom set of tasks:
1. Choose SOAtest> Import> Custom Tasks or choose Custom Tasks from the Import My Recommended Tasks pull-down toolbar menu.
2. Specify where you want to import tasks from. Available options are:
   • Import from Team Server server: Imports tasks that were uploaded to Team Server (for example, after a batch mode test).
   • Import from local file(s): Imports tasks from a results file that is accessible from your local file system.
3. Specify which tasks you want to import. Available options are:
   • All/Filtered: Specifies whether you want to import all tasks on Team Server, or only a subset of tasks (tasks that satisfy the criteria specified in the subsequent options).
   • recommended tasks: Imports only recommended tasks.
   • for selected resources: Imports only tasks for the selected resources.
   • for single user: Imports only tasks for the specified user.
4. Click OK.


For example, if you were tasked with cleaning up a particular file and wanted to import all of the tasks reported for that file, you would first select that resource in the project tree, then you would open the Custom Tasks dialog and select the following:
• Team Server
• Filtered
• for selected resources

Importing all results from an XML file

Here's an alternative way to import all tasks:
1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
2. Open the Reports tab of the Browsing dialog.
3. Select the XML report whose results you want to import, then click the Import Results button.

Accessing Team Server Reports through the GUI

Any team member whose SOAtest installation is connected to Team Server will be able to view and download the report files that are available on Team Server. To download a report file:
1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
2. Open the Reports tab of the Browsing dialog. Reports will be organized according to the date when they were generated.
3. Do one of the following:
   • To view a report, select the report that you want to view, then click the View button. The report will open in a Web browser.
   • To download a report, select the report that you want to download, then click the Download button. A file chooser will open. Specify a location for the downloaded report, then click Save. The report file will then be downloaded to the specified location.

Removing Reports from Team Server

If you want to delete reports stored on Team Server (for example, if you want to remove all old reports from Team Server or remove the report for a failed test run):
1. Choose SOAtest> Explore> Team Server. The Browsing dialog will open.
2. Open the Reports tab of the Browsing dialog.
3. Select the XML report whose results you want to delete, then click the Delete button.
   • If you want to keep the related test data on Team Server (e.g., if you are cleaning out old reports, but you still want data from these tests used for graphs that show historical trends), clear the Keep summary data for report graphs check box.


Accessing Team Server Reports through a Web Browser

Any team member who can access the Team Server's Web server can directly browse to the HTML and XML report files that are available on Team Server. This allows team members to access reports outside of the SOAtest GUI. Moreover, in the reports available on Team Server, all links (for instance, links to Category) are active; in emailed reports, they are not. To directly access a report available on Team Server:
• Choose SOAtest> Explore> Team Server Reports in the SOAtest GUI.

Accessing Results through Report Center

To access Report Center reports based on information from SOAtest tests and other sources:
• Choose SOAtest> Explore> Report Center Reports or open the reports as described in your Report Center User's Guide.


Reassigning Tasks to Other Team Members

This topic explains how to reassign a reported task (for example, to fix a static analysis/policy violation, review a test failure, etc.) to another team member. SOAtest automatically assigns tasks as described in "Configuring Task Assignment", page 215. You can override these computations and reassign tasks as needed. To reassign a task:
1. In the SOAtest view, right-click the task you want to reassign, then choose Reassign Task To from the shortcut menu. The Reassign Task To dialog will open.
2. Choose the option that indicates how you want the task reassigned.
   • To reassign the task to a specific user, choose Username, then enter the username of the person to whom you want that task assigned.
   • To reassign the task to the user who last modified the file, choose Last Author.
   • (Optional) If you want to remove the reassigned task from your SOAtest view, enable the Delete task if reassigning option.

Note
If you are using Parasoft Team Server, then task assignment is managed by the server and shared across the team; otherwise, it is managed locally by your installation of SOAtest. For example, if you are using Team Server and you reassign tasks to johndoe, those tasks will only be shown on johndoe's machine; if you and other team members have set the Check only file/lines authored by [your_username] Scope option, those tasks will not appear on your machine or other team members' machines. If you are not using Team Server and you reassign tasks to johndoe, those tasks will not be shown on your machine, but may be shown on other team members' machines.


Monitoring Project-Wide Test Results

This topic explains the recommended procedure for team leaders (managers, architects, etc.) to follow in order to monitor quality, evaluate project readiness, and spot emerging trends and problems. To monitor project quality and status, managers should review results on a regular basis. Results can be accessed in several ways:
• From the emailed report of test results. See "Understanding Reports", page 300 for report details.
• From the report of test results that was sent to Team Server (by choosing SOAtest> Explore> Team Server Reports). See "Accessing Results and Reports", page 234 for details.
• From the Report Center dashboard (by choosing SOAtest> Explore> Report Center Reports).
• From the GUI (by choosing SOAtest> Import> All Tasks).


Test and Analysis Basics

In this section:
• Running Tests and Analysis
• Reviewing Results


Customizing Settings and Configurations

In this section:
• Modifying General SOAtest Preferences
• Creating Custom Test Configurations


Modifying General SOAtest Preferences

This topic explains how to modify SOAtest's general settings (settings that are not specific to any particular Test Configuration, project, or test). Sections include:
• Customizing Preferences
• Configuring Double-Click vs. Single-Click Options
• Sharing Preferences Across a Team
• Transferring Preferences Across Workspaces

Customizing Preferences

To customize general SOAtest preferences:
1. Choose SOAtest> Preferences. The Preferences dialog will open.
2. In the left pane, select the category that represents the SOAtest settings you want to change. See "Preference Settings", page 747 for details on the available categories and settings.
3. Modify the settings as needed.
4. Click either Apply or OK to commit the modified settings.

Configuring Double-Click vs. Single-Click Options

With the default settings, you need to double-click a Test Case Explorer or Navigator node to open the related file or configuration panel. For instance, if you want to configure a tool, you need to double-click the related Test Case Explorer node to open that tool's configuration panel. You can change the default double-click behavior to single-click by completing the following:
1. Select Window> Preferences.
2. Within the Preferences dialog, select General on the left, and change the Open mode from Double click to Single click within the right GUI panel.
3. Select General> Editors, enable Close editors automatically, and then click the OK button.
You will now be able to open editors based on a single click.

Sharing Preferences Across a Team

If you share an entire workspace with team members, your preferences (SOAtest-specific preferences as well as general Eclipse preferences) will be shared along with the project. If you do not share an entire workspace, but want to share preferences, you will need to export them (ideally, to source control, a file server, twiki, etc.), then import them onto the other installations. When the desired preferences change, they will need to be manually exported and imported again.

To export and then import preferences:
1. In the workspace where they are set, export them by choosing File> Export, selecting General> Preferences, then completing the wizard.


2. In the new workspace, import them by choosing File> Import, selecting General> Preferences, then completing the wizard.

Transferring Preferences Across Workspaces

Eclipse resets preferences to the defaults when a new workspace is opened. If you want to transfer your preferences (SOAtest-specific preferences as well as general Eclipse preferences) to a new workspace:
1. In the workspace where they are set, export them by choosing File> Export, selecting General> Preferences, then completing the wizard.
2. In the new workspace, import them by choosing File> Import, selecting General> Preferences, then completing the wizard.


Creating Custom Test Configurations

This topic explains why and how to create custom Test Configurations that define the test scenarios you plan to use, as well as how to export/import Test Configurations. Sections include:
• About Custom Test Configurations
• Creating a Custom Test Configuration
• Configuring Test Settings
   • Defining What Code is Tested (Scope Tab)
   • Defining How Static Analysis is Performed (Static Tab)
   • Defining How Test Cases are Executed (Execution Tab)
   • Defining Common Options that Affect Multiple Analysis Types (Common Tab)
   • Defining Peer Review Options (Code Review Tab)
   • Defining Task Reporting and Resolution Targets (Goals Tab)
• Changing the Favorite Test Configuration
• Importing/Exporting Test Configurations
• Comparing Test Configurations
• Specifying Test Configuration Inheritance

About Custom Test Configurations

Every SOAtest test—whether it is performed in the GUI or from the command line interface—is based on a Test Configuration, which defines a test scenario and sets all related test parameters for static analysis and test execution. To change how a test is performed, you modify the settings for the Test Configuration you plan to use.

SOAtest provides built-in Test Configurations that are based on a variety of popular test scenarios. However, because development projects and team priorities differ, some SOAtest users prefer to create custom Test Configurations. The default Test Configurations, which are in the Built-in category, cannot be modified. The recommended way to create a custom Test Configuration is to copy a Built-in Test Configuration to the User-defined category, then modify the copied Test Configuration to suit your preferences and environment. Alternatively, you could create a new Test Configuration "from scratch", then modify it as needed.

The Favorite Configuration should be set to the custom Test Configuration that you plan to use most frequently. By setting your preferred Test Configuration as the Favorite Configuration, you can easily run it from the SOAtest menu, the Test Using toolbar button, or from the command line interface.

Creating a Custom Test Configuration

To create a custom Test Configuration:
1. Open the Test Configurations panel by choosing SOAtest> Test Configurations.
2. Review the available Test Configurations to determine which (if any) you want to base your custom Test Configuration on. Built-in Test Configurations are described in "Built-in Test Configurations", page 741.


3. Do one of the following:
   • If you want to base a custom Test Configuration on a built-in Test Configuration, right-click that Test Configuration, then choose Duplicate.
   • If you want to create a custom Test Configuration from scratch, click New.
4. Select the new Test Configuration, which will be added to the User-defined category.
5. Modify the settings as needed.
   • Scope tab settings determine what code is tested; they allow you to restrict tests by author, by timestamp, and so on. For details, see "Defining What Code is Tested (Scope Tab)", page 245.
   • Static tab settings determine how static analysis is performed and what rules it checks. For details, see "Defining How Static Analysis is Performed (Static Tab)", page 247.
   • Execution tab settings determine if test cases are executed. For details, see "Defining How Test Cases are Executed (Execution Tab)", page 248.
   • Common tab settings cover various actions that can affect multiple types of analysis. For details, see "Defining Common Options that Affect Multiple Analysis Types (Common Tab)", page 249.
   • Goals tab settings allow the team to specify goals for task reporting and task resolution. For details, see "Defining Task Reporting and Resolution Targets (Goals Tab)", page 250.
6. (Optional) Set the Test Configuration as the Favorite Test Configuration by right-clicking it, then choosing Set as Favorite from the shortcut menu. The configuration will then be set as the Favorite Configuration; the "favorite" icon will be added to that configuration in the Test Configurations tree.
7. Click Apply, then Close.

"Grayed-Out" Test Configurations = Incompatible Test Configurations If a Test Configuration is "grayed out," this indicated that it was created with an incompatible version of SOAtest, and cannot be edited or run with the current version.

Tip - Importing and Exporting to Share Test Configurations

If you are not using Parasoft Team Server to share test settings across your team, you can share custom Test Configurations with team members by exporting each Test Configuration you want to share, then having your team members import it. See "Importing/Exporting Test Configurations", page 251 for details.

Configuring Test Settings

Defining What Code is Tested (Scope Tab)

For source code static analysis only.


During a test, SOAtest will perform the specified action(s) on all code in the selected resource that satisfies the scope criteria for the selected Test Configuration. By default, SOAtest checks all code in the selected resource. However, you can use the Scope tab to configure restrictions such as:
• Test only files or lines added or modified after a given date.
• Test only files or lines added or modified on the local machine.
• Test only files modified by a specific user.

You can restrict the scope of tests by specifying file filters or line filters that define what files or code you want tested (for example, only files or lines added or modified since a cutoff date, only files or lines added or modified locally, only files or lines last modified by a specific user). Some file filters and line filters are only applicable if you are working with projects that are under supported source control systems. The Scope tab has the following settings:
• File Filters: Restricts SOAtest from testing files that do not meet the specified timestamp and/or author criteria.
   • Time options: Restricts SOAtest from testing files that do not meet the specified timestamp criteria. Available time options include:
      • No time filters: Does not filter out any files based on their last modification date.
      • Test only files added or modified since the cutoff date: Filters out files that were not added or modified since the cutoff date.
      • Test only files added or modified in the last n days: Filters out files that were not added or modified in the specified time period.
      • Test only files added or modified locally: Filters out files that were not added or modified on the local machine. This feature only applies to files that are under supported source control systems.
   • Author options: Restricts SOAtest from testing files that do not meet the specified author criteria. Available author filter options include:
      • No author filters: Does not filter out any files based on their author.
      • Test only files authored by preferred user: Filters out any files that were not authored by the specified user (i.e., filters out any files that were authored by another user).
• Line filters: Restricts the lines of code that SOAtest operates on. The file filter is applied first, so code that reaches the line filter must have already passed the file filter. Available line filter options include:
   • Time options: Restricts SOAtest from testing lines of code that do not meet the specified timestamp criteria. Available time options include:
      • No time filters: Does not filter out any lines of code based on their last modification date.
      • Test only lines added or modified since the cutoff date: Filters out lines of code that were not added or modified since the cutoff date. This feature only applies to files that are under supported source control systems.
      • Test only lines added or modified in the last n days: Filters out lines of code that were not added or modified in the specified time period.
      • Test only lines added or modified locally: Filters out lines of code that were not added or modified on the local machine. This feature only applies to files that are under supported source control systems.
   • Author options: Restricts SOAtest from testing lines of code that do not meet the specified author criteria. Available author filter options include:
      • No author filters: Does not filter out any lines of code based on their author.
      • Test only lines authored by preferred user: Filters out any lines of code that were not authored by the specified user (i.e., filters out any lines of code that were authored by another user).

Note
• Code authorship information and last modified date are determined in the manner set in the Scope and Authorship preferences page; for details about available settings, see "Configuring Task Assignment", page 215.
• Setting file or line scope filters could prevent SOAtest from reporting some of the violations that occur within tested files. See page 607 for details.

Defining How Static Analysis is Performed (Static Tab)

During a test, SOAtest will perform static analysis based on the parameters defined in the Test Configuration used for that test. The Static tab has the following settings:
• Enable Static Analysis: Determines whether SOAtest performs static analysis, which involves checking whether the selected resources follow the rules that are enabled for this Test Configuration.
• Limit maximum number of tasks reported per rule to: Determines whether SOAtest limits the number of violations (tasks) reported for each rule, and—if so—the maximum number of violations per rule that should be reported during a single test. For instance, if you want to see no more than five violations of each rule, set this parameter to 5. The default setting is 1,000.
• Do not apply suppressions: Determines whether SOAtest applies the specified suppressions. If suppressions are not applied, SOAtest will report all violations found.
• Rules tree: Determines which rules are checked during static analysis. Use the rules tree and related controls to indicate which rules and rule categories you want checked during static analysis.
   • To view a description of a rule, right-click the node that represents that rule, then choose View Rule Documentation from the shortcut menu.
   • To view a description of a rule category, right-click the node that represents that rule category, then choose View Category Documentation from the shortcut menu.
   • To enable or disable all rules in a specific rule category or certain types of rules within a specific rule category, right-click the category node, then choose Enable Rules> [desired option] or Disable Rules> [desired option].
   • To search for a rule, click the Find button, then use that dialog to search for the rule.
   • To hide the rules that are not enabled, click the Hide Disabled button. If you later want all rules displayed, click Show All.

Tips
• The number next to each rule ID indicates the rule's severity level. The severity level indicates the chance that a violation of the rule will cause a serious construction defect (a coding construct that causes application problems such as slow performance, security vulnerabilities, and so on). Possible severity levels (listed from most severe to least severe) are:
   • Severe Violation (SV) - Level 1
   • Possible Severe Violation (PSV) - Level 2
   • Violation (V) - Level 3
   • Possible Violation (PV) - Level 4
   • Informational (I) - Level 5
• To learn about the rules that are included with SOAtest, choose Help> Help Contents, open the SOAtest Static Analysis Rules book, then browse the available rule description files.
• To generate a printable list of all rules that a given Test Configuration is configured to check:
   a. Open the Test Configurations panel by choosing SOAtest> Test Configurations.
   b. Select the Test Configurations category that represents the user-defined Test Configuration you want to modify.
   c. Open the Static tab.
   d. Click the Printable Docs button.

Defining How Test Cases are Executed (Execution Tab)

During a test, SOAtest will execute test cases based on the parameters defined in the selected Test Configuration's Execution tab. The Execution> Functional tab has the following settings:
• Enable Test Execution: Determines whether SOAtest executes available tests. If this option is not checked, all other test execution parameters are irrelevant.
• Execute in load test mode: Determines whether SOAtest executes available tests in load testing mode and alerts you to any outstanding issues that might impact your load testing—for example, incorrectly configured HTTP requests. See "Validating Tests", page 566 for details.
   • Auto-configure tests in preparation for load testing: Determines whether SOAtest configures browser-based web functional tests to run in a browser-less load test environment. See "Configuring Tests", page 562 for details.
• Execute only opened Test Suite (.tst) Files (always false in command-line mode): Determines whether SOAtest executes Test Suites that are not currently active (i.e., tests that you are not currently working on).
• Override default Environment during Test Execution: Configures SOAtest to always use the specified environment for tests run with this Test Configuration—regardless of what environment is active in the Test Case Explorer. For example, if your project defines several environments, you could set the Test Configuration to always use the "staging server" environment.
• Use browser: Allows you to override a test's browser playback settings at the time of test execution. See "Modifying Browser Playback Settings", page 447 and "Specifying the Browser at the Time of Test Execution", page 447 for details.
• Apply static analysis to: If a Test Configuration runs both static analysis and test execution (e.g., for performing static analysis on a Web functional test), this setting determines whether static analysis is performed on the HTTP responses or the browser contents.
   • HTTP Responses refers to the individual HTTP messages that the browser made in order to construct its data model—the content returned by the server as is (before any browser processing).
   • Browser-Constructed HTML refers to the real-time data model that the browser constructed from all of the HTML, JS, CSS, and other files it loaded.

The Execution> Security tab allows you to configure penetration testing, which is described in "Penetration Testing", page 484. The Execution> Runtime Error Detection tab allows you to configure runtime error detection, which is described in "Performing Runtime Error Detection", page 490.

Defining Common Options that Affect Multiple Analysis Types (Common Tab)

The Test Configuration's Common tab controls test settings for actions that affect multiple analysis types. The Common tab has the following settings:
• Before Testing> Refresh projects: Determines whether projects are refreshed before they are tested. When a project is refreshed, SOAtest checks whether external tools have changed the project in the local file system, and then applies any detected changes. Note that when you test from the command line, projects are always refreshed before testing.
• Before Testing> Update projects from source control: Determines whether projects are updated from source control (if you are using a supported source control system) before they are tested.
• Build: Determines if and when projects are built before they are tested. Note that this setting applies to GUI tests, not command-line tests. Available options include:
   • Full (rebuild all files): Specifies that all project files should always be rebuilt.
   • Incremental (build files changed since last build): Specifies that only the project files that have changed since the previous build should be rebuilt.
   • Stop testing on build errors: Specifies that testing should stop when build errors are reported.
• Commit added/modified files to source control if no tasks were reported: Allows you to combine your testing and your source control check-ins into a single step. For example, you would enable this if you want to run static analysis on files, then have SOAtest automatically check in the modified files if no static analysis tasks are reported. In the context of functional testing, it tells SOAtest that if you run modified tests—and they pass—it should check the modified tests into source control.

Defining Peer Review Options (Code Review Tab)

This tab contains settings for automating preparation, notification, and tracking of the peer review process, which can be used to evaluate critical SDLC artifacts (source files, tests, etc.) in the context of the organization's defined quality policies. For details, see "Code Review", page 619.

Defining Task Reporting and Resolution Targets (Goals Tab)

To ensure that SOAtest does not report an overwhelming number of tasks, the team manager can specify a reporting limit (such as "Do not report more than 25 static analysis tasks per developer per day") and/or a quality goal (such as "All static analysis violations should be fixed in 2 months"). SOAtest will then use the specified criteria to select a subset of testing tasks for each developer to perform each day. These goals are specified in the Goals tab. Alternatively, you can set global team goals—goals that may span multiple Test Configurations—as described in "Configuring Task Goals", page 225. This requires Team Server and a SOAtest Server license. If goals are set globally, the Goals tab in the Test Configuration panel will be disabled.

The Goals tab has the following settings:

Static tab
• Perform all tasks: Specifies that you want SOAtest to report all static analysis tasks it recommends, and the team should perform all static analysis tasks immediately.
• Don't perform tasks: Specifies that you want SOAtest to report all static analysis tasks it recommends, but the team is not required to perform all static analysis tasks immediately. This is useful, for instance, if you want to see all recommended static analysis tasks, but you want the team to focus on fixing test failures before addressing static analysis violations.
• No more than n tasks per developer by date: Specifies that you want each developer to have only n static analysis tasks by the specified date.
• Max tasks to recommend: Limits the number of static analysis tasks reported for each developer on any test run. The tasks shown are selected randomly so that different tasks are shown after each run. For example, if you set the parameter to 50, the first task shown after each run is selected at random, and the following 49 tasks shown are the ones that would follow that first task in a complete report.

Execution tab
• Perform all tasks: Specifies that you want SOAtest to report all functional testing tasks, and the team should perform the specified tasks immediately.
• Don't perform tasks: Specifies that you want SOAtest to report all functional testing tasks, but the team is not required to perform the specified tasks immediately. This is useful, for instance, if you want to see a list of all necessary functional testing tasks, but you want the team to focus on fixing static analysis tasks before addressing functional test failures.
• No more than n tasks per developer by date: Specifies that you want each developer to have only n functional testing tasks by the specified date.
• Max tasks to recommend: Limits the number of functional testing tasks reported for each developer on any test run. The tasks shown are selected randomly so that different tasks are shown after each run. For example, if you set the parameter to 50, the first task shown after each run is selected at random, and the following 49 tasks shown are the ones that would follow that first task in a complete report.

Changing the Favorite Test Configuration

The Favorite Configuration defines the test scenario that SOAtest uses when you start a test without specifically indicating which Test Configuration you want to use. For example, if you start a test by clicking the Test Using button, SOAtest will run that test based on the parameters defined in the Favorite Configuration. To indicate which Test Configuration you want set as the Favorite Configuration:
1. Open the Test Configurations panel by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. Right-click the Test Configurations category that represents the Test Configuration you want set as the Favorite Configuration, then choose Set As Favorite from the shortcut menu. The configuration will then be set as the Favorite Configuration; the "favorite" icon will be added to that configuration in the Test Configurations tree.

Importing/Exporting Test Configurations

If you have created a Test Configuration that you want to share with team members or use in an upgraded version of SOAtest, you can export the Test Configuration into a properties file. That Test Configuration can then be added by importing the related properties file.

Exporting

To export a Test Configuration:
1. Open the Test Configurations panel by choosing SOAtest> Test Configurations.
2. Right-click the Test Configuration you want to export, choose Export from the shortcut menu, then use the file chooser to indicate where you want to save the properties file that will be created for this Test Configuration. A properties file will then be saved in the designated location. A dialog box will open to confirm the location of the newly-created properties file.


Importing

To import a Test Configuration that was previously exported into a properties file:
1. Open the Test Configurations panel by choosing SOAtest> Test Configurations.
2. Right-click the User-defined category, choose Import from the shortcut menu, then use the file chooser to select the appropriate properties file.

Comparing Test Configurations

To compare two Test Configurations:
1. Open the Test Configurations panel by choosing SOAtest> Test Configurations.
2. Right-click one of the Test Configurations you want to compare, choose Export from the shortcut menu, then use the file chooser to indicate where you want to save the .properties file (choose a folder that is in your workspace and available in the Package Explorer).
3. Repeat the above step for the other Test Configuration you want to compare.
4. Select the two .properties files in the Navigator, right-click the selection, then choose Compare with> Each other.

Specifying Test Configuration Inheritance

If you want multiple Test Configurations to share some of the same parameter settings (for example, if you want multiple Test Configurations to have the same rules enabled), you can create new child Test Configurations referring to one parent Test Configuration. A child Test Configuration will inherit the parent's settings; the value of each preference in the parent Test Configuration is used whenever the corresponding preference in the child Test Configuration is not present. Inheritance is recursive; in other words, you could have the MyConfig2 Test Configuration inherit the settings from MyConfig1, and have MyConfig3 inherit the settings from MyConfig2. MyConfig3 will thus inherit some MyConfig1 settings as it inherits MyConfig2 settings.

You can create a child Test Configuration from a Test Configuration shown in the Test Configurations panel, or by specifying a Test Configuration URL (for Test Configurations available via HTTP).

To create a child from a Test Configuration shown in the Test Configurations panel:
1. Open the Test Configurations panel.
2. Right-click the desired parent Test Configuration, then choose New Child from the shortcut menu.

To create a child from a Test Configuration available via HTTP:
1. Open the Test Configurations panel.
2. Right-click the User-Defined node, then choose New Child from the shortcut menu.
3. In the dialog that opens, enter the URL for the desired parent Test Configuration (http://config_address). For example: http://SOAtest.acme.com/configs/static.properties

To disconnect a child from its parent:
1. Open the Test Configurations panel.
2. Click the Disconnect button to the right of the Parent field.


Important Notes
• It is not possible to change the parent of a Test Configuration. Test Configurations that inherit from a parent must be created that way from the start using the "New Child" action.
• Once a child Test Configuration is disconnected from its parent, it cannot be reconnected. All the inherited settings are applied directly in the child when it is disconnected.
• Each Test Configuration may have at most one parent configuration. Multiple inheritance is not supported.


Running Tests and Analysis

In this section:
• Testing from the GUI
• Testing from the Command Line Interface (soatestcli)
• Testing from the Web Service Interface


Testing from the GUI

This topic explains the general procedure for running tests from the SOAtest GUI. Sections include:
• Running a Test
• Reviewing Results
• Fine-Tuning Test Settings

Running a Test

SOAtest can perform a variety of tasks, from static analysis, to functional testing, to regression testing. To start using SOAtest to achieve your goals, you run a test based on a default or custom test scenario, which defines the precise nature and scope of SOAtest's analysis. All preconfigured Test Configurations are described in "Built-in Test Configurations", page 741. The procedure for creating a custom Test Configuration is described in "Creating Custom Test Configurations", page 244.

The general procedure for testing from the GUI is as follows:
1. In the Test Case Explorer or Navigator view, select the resources you want to analyze or the tests you want to execute. You can use Ctrl + click or Shift + click to select multiple resources.
2. Start the test in one of the following ways:
   • To run the Favorite Configuration (which executes the selected tests), perform one of the following actions:
      • Click the Test Using button in the toolbar.
      • Choose SOAtest> Test Using [Favorite Configuration] from the menu bar.
      • Right-click the resource, then choose SOAtest Test Using [Favorite Configuration] from the shortcut menu.
   • To run another Test Configuration, perform one of the following actions:
      • Choose the appropriate Test Configuration from the Test Using section of the Test Using button's pull-down menu.
      • Choose the appropriate Test Configuration from the SOAtest> Test Using menu in the menu bar.
      • Choose the appropriate Test Configuration from the SOAtest> Test History menu in the menu bar. Note that this menu contains only the most recently-run Test Configurations.
      • Right-click the selection, then choose the appropriate Test Configuration from the SOAtest> Test Using shortcut menu.
      • Right-click the selection, then choose the appropriate Test Configuration from the SOAtest> Test History shortcut menu.

"Grayed-Out" Test Configurations = Incompatible Test Configurations If a Test Configuration is "grayed out," this indicated that it was created with an incompatible version of SOAtest, and cannot be applied with the current version.


SOAtest will then run the test scenario defined by the selected Test Configuration.

Reviewing Results

Test progress and results summaries will be reported in the Testing panel that SOAtest opens when it starts the test. Detailed results will be reported in the SOAtest view. Drill down to see details about the test findings. For details on reviewing results, see "Viewing Results", page 290. For details on producing a report for the test, see "Generating Reports", page 295.

Fine-Tuning Test Settings

To change test settings—such as what rules are checked—edit an existing Test Configuration or create a new one, then run a test using the modified/new Test Configuration. Test Configurations and all related parameters can be viewed, edited, and modified in the Test Configurations dialog. To open this dialog, choose SOAtest> Test Configurations from the menu bar. For information on configuring test parameters, see "Creating Custom Test Configurations", page 244.


Testing from the Command Line Interface (soatestcli)

This topic explains how to run a test from the SOAtest command line interface (soatestcli). Sections include:
• Introduction
• Running a Test
• cli Overview
• cli Options
• Local Settings (Options) Files
• Using Variables in Local Settings (Options) Files
• cli Exit Codes

Migrating Your Automated Nightly Process from an Earlier Version of SOAtest or WebKing
For help migrating your existing SOAtest automated nightly process from earlier versions of SOAtest or WebKing, see "Command Line Interface (cli) Migration", page 51.

Introduction

Prerequisites
• The command line mode requires a command line interface license (available with SOAtest Server Edition).
• Before you can run a test from the command line, you need to set up a project, .tst file, and test suites. See "Adding Projects, .tst files, and Test Suites", page 308 for details.

About the cli

SOAtest's command line interface (soatestcli) allows you to perform static analysis and execute tests as part of an automated nightly process. Command line mode is available for the Server Edition of SOAtest. soatestcli can send results to Parasoft Report Center, send comprehensive reports to the team manager and to Parasoft Team Server, and send focused reports to each team developer and tester working on the SOA project. Reports can be generated in a number of formats. Details such as reporting preferences (who reports should be sent to, how those reports should be labelled, what mail server and domain should be used, etc.), Team Server settings, Report Center settings, email settings, and license settings can be controlled by local settings files.
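As a rough illustration only, a local settings file for such a nightly run might look like the following. The property keys shown are assumptions based on common Parasoft local settings, and the server names and recipients are placeholders; confirm the exact keys against "Local Settings (Options) Files", page 266, before using them.

# Local settings sketch for a nightly soatestcli run.
# Key names are assumptions; verify them against "Local Settings (Options) Files".

# Email reporting (mail server, domain, and recipients are placeholders)
report.mail.enabled=true
report.mail.server=mail.mycompany.com
report.mail.domain=mycompany.com
report.mail.cc=soa_manager

# Team Server connection used when results are published
tcm.server.enabled=true
tcm.server.name=teamserver.mycompany.com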

The optimal team configuration is to have one SOAtest (Server Edition) on the team build machine, SOAtest (Professional Edition) on every developer/tester workstation, one SOAtest (Architect Edition) on the architect’s machine, and one installation of Team Server on the team build machine or another team machine.


Engineers use their local installations of SOAtest to develop and run tests, then check their work into source control. Every night, soatestcli runs on a team machine. Depending on the configuration, it may execute the available tests, monitor policy adherence, and/or perform the specified static analysis tests. After the test completes, engineers can import test results into the SOAtest GUI to facilitate task review and resolution. Additionally, SOAtest sends results to Parasoft Report Center, emails each team member a report that contains only the tasks assigned to him or her, emails group managers a report that lists all team/project quality tasks and identifies which team member is responsible for each task, and uploads reports and results to Team Server. Throughout the process, Team Server manages the sharing and updating of test settings and test files; this standardizes tests across the team and helps team members leverage one another's work. The standardized test settings and custom team rules are configured and maintained by the team architect, who is using SOAtest Architect Edition.

Running a Test
The general procedure for testing from the command line is as follows:
• Use the soatestcli utility, with appropriate options, to launch analysis in command-line mode. A complete list of options is provided in "cli Options", page 260. Key options are:
  • -data: Specifies the Eclipse workspace location.
  • -config: Specifies the Test Configuration.
  • -resource: Specifies the test suite(s) to run. To run a single test suite, specify the path to <test suite name>.tst relative to the workspace. To run all test suites within a directory, specify the directory path relative to the workspace.
  • -publish: Publishes test results to Team Server.
  • -report: Generates a report.
  • -localsettings: Passes advanced settings for Team Server/Report Center/mail reporting. Options are described in "Local Settings (Options) Files", page 266.

If the SOAtest installation is not on your path, launch soatestcli with the full path to the executable.
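For example, on a Windows machine the full-path invocation might look something like the following sketch; the installation directory shown is only a placeholder for your actual SOAtest installation location:

"C:\Program Files\Parasoft\SOAtest\soatestcli.exe" -config "user://Example Configuration"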

cli Overview
The general form of invocation for soatestcli is:
• Windows: soatestcli.exe [OPTIONS]
• UNIX: soatestcli [OPTIONS]

Typically, invocations follow this pattern:
• Windows: soatestcli.exe -data %WORKSPACE_DIR% -resource resource_to_test | local_settings_file -config %CONFIG_URL% -report %REPORT_FILE%
• UNIX: soatestcli -data %WORKSPACE_DIR% -resource resource_to_test | local_settings_file -config %CONFIG_URL% -report %REPORT_FILE%


Examples

soatestcli.exe -config "user://Example Configuration"
  Runs all tests in the default workspace (for standalone installations, usually c:\documents and settings\<username>\soatest\workspace) with the user-defined Test Configuration named "Example Configuration."
  Note: A user-defined configuration is local to the specified workspace, and the configuration "Example Configuration" is automatically created and set to be the default configuration by SOAtest when the workspace is first created.

soatestcli.exe -import c:\myProject
  Imports a project from c:\myProject into the default workspace. Once this is done, the project does not need to be imported into the same workspace again.

soatestcli.exe -config "user://Example Configuration" -data "c:\myWorkspace"
  Runs all tests in the workspace at c:\myWorkspace. Tests in projects that have not been imported will not be run. The "Example Configuration" Test Configuration is used.

soatestcli.exe -config "user://Example Configuration" -resource "tests/myTest.tst"
  Runs the test suite file 'myTest.tst' located in the 'tests' project of the default workspace. The "Example Configuration" Test Configuration is used. The project 'tests' must have already been imported into the workspace.

soatestcli.exe -config "user://Example Configuration" -resource "tests" -report c:\reports\Report1
  Runs all tests in the 'tests' project folder of the default workspace, and saves the report at c:\reports\Report1. The project 'tests' must have already been imported into the workspace.

Using -data to Specify Your Eclipse Workspace
If you are not in the same directory as the Eclipse workspace that you want to test, you need to use soatestcli with the -data option. For example, this Windows command tests the SOAtest Example project by applying the "My Configuration" Test Configuration, generates a report of results, and saves that report in the c:\reports\Report1 directory:

soatestcli -data "c:\Documents and Settings\cynthia\Application Data\Parasoft\SOAtest\workspace" -resource "SOAtest Example" -config user://"My Configuration" -report c:\reports\Report1

If you are in the same directory as the workspace that you want to test, you can call soatestcli without the -data option. For example, this Windows command tests the SOAtest Example project by applying the My Configuration Test Configuration, generates a report of results, and saves that report in the c:\reports\Report1 directory:

soatestcli -resource "SOAtest Example" -config user://"My Configuration" -report c:\reports\Report1
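The individual options described above can be combined into a single nightly invocation. The following command is only an illustrative sketch; the workspace path, resource name, Test Configuration name, local settings file, and report location are placeholders to be replaced with values from your own environment:

soatestcli -data "c:\nightly\workspace" -resource "SOAtest Example" -config "user://Example Configuration" -localsettings local.properties -report c:\reports\nightly_report.xml -publish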


cli Options
Available soatestcli options are listed below.

Double Quotes vs. Single Quotes
Use "double quotes" (not 'single quotes') to specify options. For example: -config team://"Our Configuration"

-data %WORKSPACE_DIR%
  Specifies the location of the Eclipse workspace directory to use. Defaults to the current user's dependent directory.

-import %PROJECT%
  Imports the specified project into the Eclipse workspace. For example, the following command imports the project at location "C:\Documents and Settings\user\My Test Project" into the current workspace:
  soatestcli.exe -import "C:\Documents and Settings\user\My Test Project"
  The -import command-line argument supports project paths that are relative to the current working directory. Typically, the current working directory is the current directory on the command line from which the command is given.


-resource %RESOURCE%
  Specifies the test suite(s) to run. To run a single test suite, specify the path to <test suite name>.tst relative to the workspace. To run all test suites within a directory, specify the directory path relative to the workspace. Use the option multiple times to specify multiple resources. Use quotes when the resource path contains spaces or other non-alphanumeric characters. If %RESOURCE% is a .properties file, the value corresponding to com.parasoft.xtest.checkers.resources will be interpreted as a colon (:) separated list of resources; only one properties file can be specified in this way. If no resources are specified on the command line, the complete workspace will be tested. Paths (even absolute ones) are relative to the workspace specified by the -data parameter.
  Examples:
  -resource "Acme Project"
  -resource "/MyProject/tests/acme"
  -resource testedprojects.properties

-config %CONFIG_URL%
  Specifies that you want to run the Test Configuration available at %CONFIG_URL%. This parameter is required. %CONFIG_URL% is interpreted as a URL, the name of a Test Configuration, or the path to a local file. Examples:
  • By filename: -config "mylocalconfig.properties"
  • By URL: -config "http://intranet.acme.com/SOAtest/team_config.properties"
  • Built-in configurations: -config "builtin://Demo Configuration" or -config "Demo Configuration"
  • User-defined configurations: -config "user://My First Configuration"
  • Team configurations: -config "team://Team Configuration" or -config "team://teamconfig.properties"

-localsettings %LOCALSETTINGS_FILE%
  Reads the local settings file %LOCALSETTINGS_FILE% for global preferences. These settings specify details such as Report Center settings, email settings, and Team Server settings. The local settings file is a properties file. These files can control reporting preferences (who reports should be sent to, how those reports should be labelled, what mail server and domain should be used, and so on), Team Server settings, Report Center settings, email settings, and more. For details on creating local settings files, see "Local Settings (Options) Files", page 266.

-publish
  Publishes the reports to Team Server. The Team Server location can be specified in the GUI or in the local settings file (described in the -localsettings %LOCALSETTINGS_FILE% entry).

-report %REPORT_FILE%
  Generates an XML report to the given file %REPORT_FILE% and adds an HTML (or PDF or custom format, if specified using the report.format option) report with the same name and a different extension in the same directory. All of the following commands produce an HTML report filename.html and an XML report filename.xml:
  -report filename.xml
  -report filename.htm
  -report filename.html
  If the specified path ends with an ".html"/".htm"/".xml" extension, it will be treated as a path to the report file to generate. Otherwise, it will be treated as a path to a directory where reports should be generated. If the file name is explicitly specified in the command and a file with this name already exists in the specified location, the previous report will be overwritten. If your command doesn't explicitly specify a file name, the existing report file will not be overwritten; the new file will be named repXXXX.html, where XXXX is a random number. If the -report option is not specified, reports will be generated with the default names "report.xml/html" in the current directory.

-startStubServer
  Starts the stub server.

-router matchWhole <searchURI:URI> <replaceURI:URI>
  Specifies search and replace arguments. Starts the stub server and readies stubs for use as endpoints by a project. For example:
  -router searchURI:host1.adobe.com replaceURI:host2.adobe.com
  OR
  -router searchURI:* replaceURI:http://host2.adobe.com/service
  This feature is now deprecated. Please use Environments instead.

-testName [match:] <test name>
  Specifies test name patterns. Allows you to specify the name of the test in the test suite to run. SOAtest will find tests that contain the specified string, but it does not perform actual pattern matching (such as wildcards or regular expressions). For example, -testName match: something will run all tests whose names contain the word something. To run multiple tests, use -testName name1 -testName name2, where name1 and name2 correspond to the names of the desired tests. Note that you can surround the value with quotes in order to allow spaces in the name. For example, -testName match: "hello world" will search for a test with the exact string hello world in its name.

-environment <environment_name>
  Specifies environment options. When running functional tests from the command line, you can override the active environment specified in a project with one specified from the command line. Note that if the specified environment is not found in the project, the default active environment will be used instead.

-dataSourceRow <row> -dataSourceName <name>
  Runs tests with a single data source row. The -dataSourceName <name> argument is optional. For example:
  • -dataSourceRow 5 will cause any test that is using a data source to run with row 5.
  • -dataSourceRow 5 -dataSourceName "Data" will cause any test that is using a data source named "Data" to run with row 5.

-Centrasite
  Reports test results to the Software AG CentraSite Active SOA registry. Allows you to send results back to the Software AG CentraSite Active SOA registry. For details, see "Using Software AG CentraSite Active SOA with SOAtest", page 682.

-qualityCenter, -qualityCenterReportAllTraffic
  Report test results to HP Quality Center. Allows you to send results back to HP Quality Center. For details, see "Using HP with SOAtest", page 659.

-testManager, -testManagerVerbose
  Report test results to Rational TestManager. Allows you to send results back to Rational TestManager; verbose mode provides more information, such as request and response traffic. For details, see "Using IBM/Rational with SOAtest", page 672.

-visualStudio
  Report test results to Microsoft Visual Studio Team System. Allows you to send results back to Microsoft Visual Studio Team System. For details, see "Using Microsoft with SOAtest", page 666.

-include %PATTERN%, -exclude %PATTERN%
  Specifies files to be included/excluded during testing. You must specify a file name or path after this option. Patterns specify file names, with the wildcards * and ? accepted, and the special wildcard ** used to specify one or more path name segments. Syntax for the patterns is similar to that of Ant filesets. Examples:
  -include **/Bank.xml (test Bank.xml files)
  -include **/ATM/Bank/*.xml (test all .xml files in folder ATM/Bank)
  -include c:/ATM/Bank/Bank.xml (test only the c:/ATM/Bank/Bank.xml file)
  -exclude **/internal/** (test everything except classes whose path contains the folder "internal")
  -exclude **/*Test.xml (test everything but files that end with Test.xml)
  Additionally, if a pattern is a file with a .lst extension, it is treated as a file with a list of patterns. For example, if you use -include c:/include.lst and include.lst contains the following (each line is treated as a single pattern):
  **/Bank.xml
  **/ATM/Bank/*.xml
  c:/ATM/Bank/Bank.xml
  then it has the same effect as specifying:
  -include **/Bank.xml -include **/ATM/Bank/*.xml -include c:/ATM/Bank/Bank.xml

-encodepass <plain password>
  Generates an encoded version of a given password. Prints the message 'Encrypted password: <encpass>' and terminates the cli application. Must be used along with -config <url>.

-showdetails
  Prints detailed test progress information.

-J
  Specifies additional JVM options, which in turn get passed to the Eclipse executable via the -vmargs option. The Eclipse -vmargs argument is used to customize the operation of the Java VM used to run Eclipse. If specified, this option must come at the end of the command line. Even if not specified on the executable command line, the executable will automatically add the relevant arguments (including the class being launched) to the command line passed into Java using the -vmargs argument. Java Main then stores this value in eclipse.vmargs. Usage is -vmargs [vmargs*] (Executable, Main).

-prefs %PREFS_URL%
  Reads the %PREFS_URL% preference URL to import Eclipse workspace preferences. %PREFS_URL% is interpreted as a URL or the path to a local Eclipse workspace preferences file. The best way to create a workspace preferences file is to use the Export wizard. To do this:
  1. Choose File> Export.
  2. In the Export Wizard, select Preferences, then click Next.
  3. Do one of the following:
     • To add all of the preferences to the file, select Export all.
     • To add only specified preferences to the file, select Choose specific preferences to export, then check the preferences you want to export.
  4. Click Browse..., then indicate where you want the preferences file saved.
  5. Click Finish.
  We recommend that you delete non-applicable properties and keep only critical properties, such as the classpath property. We also recommend that you replace machine/user-specific locations with variables by using the $(VAR) notation. These variables will be replaced with the corresponding Java properties, which can be set at runtime by running soatestcli with -J-D options (for example, soatestcli -J-DHOME=/home/user).
  Examples:
  -prefs "http://intranet.acme.com/SOAtest/workspace.properties"
  -prefs "workspace.properties"

-help
  Displays help information. Does not run testing.

-version
  Displays version number. Does not run testing.

-initjython, installcertificate, uninstallcertificate
  Installer options.

Notes
• To see a list of valid command line options, enter soatestcli -help.
• soatestcli automatically emails designated group managers and architects a report that lists all team/project tasks and identifies which team member is responsible for each task. If no tasks are reported, reports will be sent unless the local settings file contains the report.mail.on.error.only=true option.
• If the appropriate prerequisites are met, soatestcli automatically emails each team member a report that contains only the tasks assigned to him or her. If no tasks are assigned to a particular team member, he or she will not be emailed a report.
• For more details about options that are inherited from Eclipse, see the Eclipse documentation.
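As an illustration of how -encodepass is typically used together with a local settings file, the commands below form a sketch only; the plain-text password is a placeholder, and the actual encoded string is whatever soatestcli prints in place of <encpass>:

soatestcli -encodepass MySecretPassword -config "user://Example Configuration"
Encrypted password: <encpass>

The printed encoded value can then be pasted into a password setting in a local settings file, for example:

report.mail.password=<encpass>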

Local Settings (Options) Files
Local settings files can control report settings, Report Center settings, task assignment settings, and Team Server settings. You can create different local settings files for different projects, then use the -localsettings option to indicate which file should be used for the current command line test. Each local settings file must be a simple text file. There are no name or location requirements. Each setting should be entered on a single line. If a parameter is specified in this file, it will override the related parameter specified from the GUI. If a parameter is not specified in this file, SOAtest will use the parameter specified in the GUI.

Creating a Local Settings File by Exporting Your GUI Preferences
The fastest and easiest way to create a local settings file is to export your Preferences from the GUI.
1. Choose SOAtest> Preferences. The Preferences dialog will open.
2. Select SOAtest (the root element in the left tree) in the Preferences dialog.
3. Specify which preferences you want to export.
4. Click the Export button, then specify the file where you want the settings saved.
   • If you select an existing file, the source control settings will be appended to that file. Otherwise, a new file will be created.
   • Exported passwords will be encrypted.

Local settings files can determine the following settings:
• Reporting Settings
• Report Center Settings
• Team Server Settings
• Licensing Settings
• Technical Support Settings
• Authorship Settings
• Source Control Settings
• Miscellaneous Settings

Notes
• Each setting should be entered on a single line.
• If your local settings file contains any invalid settings, details will be reported in the command line output.

Reporting Settings

report.authors_details=true|false
  Determines whether the report includes an overview of the number and type of tasks assigned to each team member. The default is true.

report.contexts_details=true|false
  Determines whether the report includes an overview of the files that were checked or executed during testing. The default is false.

report.custom.extension, report.custom.xsl.file
  Specifies the location and extension of the XSL file for a custom format. Used with report.format=custom. See "Support for Custom Report Formats", page 299 for details and examples.

report.developer_errors=true|false
  Determines whether manager reports include details about team member tasks.

report.developer_reports=true|false
  Determines whether the system generates detailed reports for all team members (in addition to a summary report for managers).

report.format=html|pdf|custom
  Specifies the report format.

report.generate_htmls=true|false
  Determines whether HTML reports are generated and saved on the local filesystem. XML reports are generated and saved regardless of this setting's value. The default setting is true.

report.graph.cs_start_date=[MM/dd/yy]
  Determines the start date for trend graphs that track static analysis tasks over a period of time. See "Understanding Reports", page 300 for more details on these reports.

report.graph.ue_start_date=[MM/dd/yy]
  Determines the start date for trend graphs that track test execution results over a period of time. See "Understanding Reports", page 300 for more details on these reports.

report.mail.attachments=true|false
  Determines whether reports are sent as attachments. All components are included as attachments; before you can view an HTML report with images, all attachments must be saved to the disk. The default setting is false.

report.mail.cc=[email_addresses]
  Specifies where to mail comprehensive manager reports. This setting must be followed by a semicolon-separated list of email addresses. This setting is typically used to send reports to managers or architects. It can also be used to send comprehensive reports to team members if such reports are not sent automatically (for example, because authorship is not being determined by SOAtest).

report.mail.domain=[domain]
  Specifies the mail domain used to send reports.

report.mail.enabled=true|false
  Determines whether reports are emailed to team members and to the additional recipients specified with the cc setting. Remember that each team member with assigned tasks will automatically be sent a report that contains only the assigned tasks.

report.mail.exclude=[email_addresses]
  Specifies any email addresses you do not want to receive reports. This setting is used to prevent SOAtest from automatically sending reports to someone who worked on the tests or source code, but should not be receiving reports.

report.mail.exclude.developers=true|false
  Specifies whether reports should be mailed to any team member whose email is not explicitly listed in the report.mail.cc property. This setting is used to prevent reports from being mailed to individual team members.

report.mail.format=html|ascii
  Specifies the email format.

report.mail.from=[email_address OR user_name_of_the_same_domain]
  Specifies the "from" line of the emails sent.

report.mail.include=[email_addresses]
  Specifies the email addresses of team members that you want to receive individual reports. This setting must be followed by a semicolon-separated list of email addresses. This setting is typically used when such reports are not sent automatically (for example, because the team is not using SOAtest to assign tasks). It overrides the settings specified in the 'exclude' list.

report.mail.on.error.only=true|false
  Determines whether reports are sent to the manager only if a task is generated or a fatal exception occurs. Team member emails are not affected by this setting; individual emails are sent only to team members who are responsible for reported tasks. The default setting is false.

report.mail.server=[server]
  Specifies the mail server used to send reports.

report.mail.subject=My New Subject
  Specifies the subject line of the emails sent. The default subject line is "SOAtest Report." For example, if you want to change the subject line to "SOAtest Report for Project A", you would use report.mail.subject=SOAtest Report for Project A

report.mail.time_delay=[time]
  Specifies a time delay between emailing reports (to avoid bulk email restrictions).

report.mail.unknown=[email_address OR user_name_of_the_same_domain]
  Specifies where to mail reports for tasks assigned to "unknown".

report.mail.username=[username], report.mail.password=[password], report.mail.realm=[realm]
  Specifies the settings for SMTP server authentication. The realm value is required only for those servers that authenticate using SASL realm.

report.active_rules=true|false
  Determines if reports contain a list of the rules that were enabled for the test.

report.suppressed_msgs=true|false
  Determines whether HTML reports include suppressed messages. The default setting is false.

report.tag=[name]
  Specifies the name of the group that is responsible for the project. This value is used for uploading summary results to Team Server. The tag is an identifier of the module checked during the analysis process. Reports for different modules should be marked with different tags.

report.test_params=true|false
  Determines whether HTML reports include test parameter details. The default setting is false.

Report Center Settings

grs.enabled=true|false
  Determines whether the current SOAtest installation is connected to Report Center. This setting is not needed if you want to use the value specified in the GUI.

grs.server=[server]
  Specifies the host name of the Report Center server. This setting is not needed if this information is specified in the GUI.

grs.port=[port]
  Specifies the port number of the Report Center data collector. This setting is not needed if you want to use the value specified in the GUI.

grs.user_defined_attributes=[attributes]
  Specifies the user-defined attributes for Report Center. Use the format key1:value1;key2:value2. For more details on attributes, see "Configuring Report Center Attributes", page 204. This setting is not needed if you want to use the value specified in the GUI.

grs.log_as_nightly=true|false
  Determines whether the results sent to Report Center are marked as being from a nightly build.

grs.use_resource_attributes=true|false
  Determines whether Report Center attributes specified in the GUI at the project level should be used. This allows you to disable project-level Report Center attributes.

Team Server Settings

tcm.server.enabled=true|false
  Determines whether the current SOAtest installation is connected to the Team Server. This setting is not needed if you want to use the value specified in the GUI.

tcm.server.name=[name]
  Specifies the machine name or IP address of the machine running Team Server. This setting is not needed if you want to use the value specified in the GUI.

tcm.server.port=[port]
  Specifies the Team Server port number. This setting is not needed if you want to use the value specified in the GUI.

report.tag=[name]
  Specifies the name of the group that is responsible for the project. This value is used for uploading summary results to Team Server.

tcm.server.accountLogin=true|false, tcm.server.username=[username], tcm.server.password=[password]
  Determines whether a username and password are submitted to connect to Team Server. Usernames/passwords are not always needed; it depends on your team's setup. If the first setting is true, the second and third settings specify the username and password. Note that Team Server must have the username and password setting already enabled before these settings can be used.

Licensing Settings

soatest.license.use_network=true|false
  Determines whether the current SOAtest installation retrieves its license from LicenseServer. This setting is not needed if you want to use the value specified in the GUI.
  Example: soatest.license.use_network=true

soatest.license.network.host=[host]
  Specifies the machine name or IP address of the machine running LicenseServer Configuration Manager. This setting is not needed if you want to use the value specified in the GUI.
  Example: soatest.license.network.host=10.9.1.63

soatest.license.network.port=[port]
  Specifies the LicenseServer port number. This setting is not needed if you want to use the value specified in the GUI.
  Example: soatest.license.network.port=2002

soatest.license.network.edition=[edition_name]
  Specifies the type of license that you want this SOAtest installation to retrieve from LicenseServer. This setting is not needed if you want to use the value specified in the GUI. [edition_name] can be professional_edition, architect_edition, or server_edition. To use a custom edition, do not set anything after the "="; simply leave the value empty.
  Examples:
  soatest.license.network.edition=architect_edition
  soatest.license.network.edition=server_edition
  soatest.license.network.edition=professional_edition
  soatest.license.network.edition=

soatest.license.autoconf.timeout=[seconds]
  Specifies the maximum number of seconds SOAtest will wait for the license to be automatically configured from LicenseServer. Default is 10.

soatest.license.local.expiration=[expiration]
  Specifies the local license that you want this SOAtest installation to use. This setting is not needed if you want to use the value specified in the GUI.

soatest.license.local.password=[password]
  Specifies the local password that you want this SOAtest installation to use. This setting is not needed if you want to use the value specified in the GUI.


Technical Support Settings

techsupport.auto_creation=true|false
  Determines whether archives are automatically prepared when testing problems occur.

techsupport.send_email=true|false
  Determines whether prepared archives are emailed to Parasoft support. If you enable this, be sure to specify email settings from the GUI or with the options in Reporting Settings.

techsupport.archive_location=[directory]
  Specifies where archives are stored.

techsupport.verbose=true|false
  Determines whether verbose logs are included in the archive. Note that this option cannot be enabled if the logging system has custom configurations.
  • Verbose logs are stored in the xtest.log file within the user-home temporary location (on Windows, this is <drive>:\Documents and Settings\<user>\Local Settings\Temp\parasoft\xtest).
  • Verbose logging state is cross-session persistent (restored on application startup).
  • The log file is a rolling file: it won't grow over a certain size, and each time it reaches the maximum size, a backup will be created.

techsupport.verbose.scontrol=true|false
  Determines whether verbose logs include output from source control commands. Note that the output could include fragments of your source code.

techsupport.item.general=true|false
  Determines whether general application logs are included.

techsupport.item.environment=true|false
  Determines whether environment variables, JVM system properties, platform details, and additional properties (memory, other) are included in the archive.

techsupport.advanced=true|false
  Specifies whether advanced options will be sent.

techsupport.advanced.options=[option]
  Specifies any advanced options that the support team asked you to enter.
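Putting these settings together, a minimal technical support block in a local settings file might look like the following sketch; the archive directory is a hypothetical path and the values shown are only illustrative:

# Technical support archive settings (illustrative values)
techsupport.auto_creation=true
techsupport.send_email=false
techsupport.archive_location=C:/temp/soatest-archives
techsupport.verbose=true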

Authorship Settings

scope.sourcecontrol=true|false
  Determines whether SOAtest computes task assignment based on data from a supported source control system. This setting is not needed if you want to use the value specified in the GUI.

scope.local=true|false
  Determines whether SOAtest computes task assignment based on the local user. This setting is not needed if you want to use the value specified in the GUI.

scope.xmlmap=true|false
  Specifies whether SOAtest computes task assignment based on XML files that define how you want tasks assigned for particular files or sets of files (these mappings can be specified in the GUI, then saved in an XML file).

scope.xmlmap.file=[file]
  Specifies the name of the XML file that defines how you want tasks assigned for particular files or sets of files.
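For example, a local settings file that assigns tasks from an XML mapping file rather than from source control might contain a block along these lines; this is a sketch only, and the mapping file name is a placeholder:

# Authorship based on an XML mapping file (illustrative values)
scope.sourcecontrol=false
scope.local=false
scope.xmlmap=true
scope.xmlmap.file=authorship-map.xml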

Source Control Settings

Defining multiple repositories of the same type
Indexes (numbered from 1 to n) must be added to the prefix if you want to define more than one repository of the same type. For example:
scontrol.rep1.type=ccase
scontrol.rep1.ccase.vob=/vobs/myvob1
scontrol.rep2.type=ccase
scontrol.rep2.ccase.vob=/vobs/myvob2
If you are defining only one repository, you do not need to use an index. For example:
scontrol.rep.type=ccase
scontrol.rep.ccase.vob=/vobs/myvob1

AccuRev Repository Definition Properties

scontrol.rep.type=accurev
  AccuRev repository type identifier.
scontrol.rep.accurev.host=
  AccuRev server host.
scontrol.rep.accurev.port=
  AccuRev server port. Default port is 1666.
scontrol.rep.accurev.login=
  AccuRev user name.
scontrol.rep.accurev.password=
  AccuRev password.

ClearCase Repository Definition Properties

scontrol.ccase.exec=
  Path to external client executable (cleartool).
scontrol.rep.type=ccase
  ClearCase repository type name.
scontrol.rep.ccase.vob=
  Path inside VOB. The ccase.vob value + File.separator must be the valid path to a ClearCase controlled directory.

CVS Repository Definition Properties

scontrol.rep.type=cvs
  CVS repository type identifier.
scontrol.rep.cvs.root=
  Full CVSROOT value.
scontrol.rep.cvs.pass=
  Plain or encoded password. The encoded password should be the same as in the .cvspass file. When you first log in to the CVS repository from the command line using "cvs login", the password is saved in the registry. To retrieve it, go to the registry (using regedit) and look for the value under HKEY_CURRENT_USER> CVSNT> cvspass. This should display your entire login name (:pserver:exampleA@exampleB:/exampleC) and encrypted password value.
scontrol.rep.cvs.useCustomSSHCredentials=
  Determines whether the cvs login and password should be used for EXT/SSH connections. Allowed values are true and false. It is disabled by default.
scontrol.rep.cvs.ext.server
  If connecting to a CVS server in EXT mode, this specifies which CVS application to start on the server side. Has the same meaning as the CVS_SERVER variable. cvs is the default value.
scontrol.rep.cvs.ssh.loginname=
  Specifies the login for SSH connections (if an external program can be used to provide the login).
scontrol.rep.cvs.ssh.password=
  Specifies the password for SSH connections.
scontrol.rep.cvs.ssh.keyfile=
  Specifies the private key file used to establish an SSH connection with key authentication.
scontrol.rep.cvs.ssh.passphrase=
  Specifies the passphrase for SSH connections with the key authentication mechanism.
scontrol.rep.cvs.useShell=
  Enables an external program (CVS_RSH) to establish a connection to the CVS repository. Allowed values are true and false. It is disabled by default.
scontrol.rep.cvs.ext.shell=
  Specifies the path to the executable to be used as the CVS_RSH program. Command line parameters should be specified in the cvs.ext.params property.
scontrol.rep.cvs.ext.params=
  Specifies the parameters to be passed to an external program. The following case-sensitive macro definitions can be used to expand values into command line parameters:
  • {host} - repository host
  • {port} - port
  • {user} - cvs user
  • {password} - cvs password
  • {extuser} - parameter cvs.ssh.loginname
  • {extpassword} - parameter cvs.ssh.password
  • {keyfile} - parameter cvs.ssh.keyfile
  • {passphrase} - parameter cvs.ssh.passphrase

Perforce Repository Definition Properties

scontrol.perforce.exec=
  Path to external client executable (p4).
scontrol.rep.type=perforce
  Perforce repository type identifier.
scontrol.rep.perforce.host=
  Perforce server host.
scontrol.rep.perforce.port=
  Perforce server port. Default port is 1666.
scontrol.rep.perforce.login=
  Perforce user name.
scontrol.rep.perforce.password=
  Password.
scontrol.rep.perforce.client=
  The client workspace name, as specified in the P4CLIENT environment variable or its equivalents. The workspace's root directory should be configured for a local path (so that files can be downloaded).

Serena Dimensions Repository Definition Properties

Linux and Solaris Configuration Note
To use Serena Dimensions with SOAtest, Linux and Solaris users should run SOAtest in an environment prepared for using Serena programs, such as 'dmcli':
• LD_LIBRARY_PATH should contain the path to <SERENA Install Dir>/libs.
• DM_HOME should be specified.
Since many Solaris users commonly set the required Serena variables by running the Serena dmgvars.sh file, it is also necessary to modify the LD_LIBRARY_PATH variable. To use Serena Dimensions with SOAtest, LD_LIBRARY_PATH needs to include the following items (paths can be different on client machines):
• SSL/Crypto library - /usr/local/ssl/lib
• STDC++ library - /usr/local/lib

scontrol.rep.type=serena
  Serena Dimensions repository type identifier.
scontrol.rep.serena.host=
  Serena Dimensions server host name.
scontrol.rep.serena.dbname=
  Name of the database for the product you are working with.
scontrol.rep.serena.dbconn=
  Connection string for that database.
scontrol.rep.serena.login=
  Login name.
scontrol.rep.serena.password=
  Password.
scontrol.rep.serena.mapping=
  Maps workspace resources to Serena Dimensions repository paths.
  • Example 1: If you use scontrol.rep.serena.mapping_1=${project_loc\:MyProject};PRODUCT1\:WORKSET1;src\\MyProject, then the project 'MyProject' will be mapped to the Serena workset PRODUCT1:WORKSET1 and the workset-relative path src\\MyProject.
  • Example 2: If you use scontrol.rep.serena.mapping_2=${workspace_loc};PRODUCT1\:WORKSET1, then the complete workspace will be mapped to the Serena workset PRODUCT1:WORKSET1.

StarTeam Repository Definition Properties

scontrol.rep.type=starteam
  StarTeam repository type identifier.
scontrol.rep.starteam.host=
  StarTeam server host.
scontrol.rep.starteam.port=
  StarTeam server port. Default port is 49201.
scontrol.rep.starteam.login=
  Login name.
scontrol.rep.starteam.password=
  Password (not encoded).

Subversion Repository Definition Properties

scontrol.rep.type=svn
  Subversion repository type identifier.
scontrol.rep.svn.url=
  Subversion URL, which specifies the protocol, server name, port, and starting repository path (e.g., svn://buildmachine.foobar.com/home/svn).
scontrol.rep.svn.login=
  Login name.
scontrol.rep.svn.password=
  Password (not encoded).
scontrol.svn.exec=
  Path to external client executable (svn).
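Combining these properties, a single Subversion repository definition in a local settings file might look like the following sketch. The URL reuses the example above; the login, password, and client path are placeholders for your own environment:

# Illustrative Subversion repository definition
scontrol.rep.type=svn
scontrol.rep.svn.url=svn://buildmachine.foobar.com/home/svn
scontrol.rep.svn.login=nightly_build
scontrol.rep.svn.password=secret
scontrol.svn.exec=C:/Program Files/Subversion/bin/svn.exe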

CM Synergy Repository Definition Properties

scontrol.rep.type=synergy
  Synergy/CM repository type identifier.
scontrol.rep.synergy.host=
  Computer on which the Synergy/CM engine runs. The local host is used when missing.
scontrol.rep.synergy.dbpath=
  Absolute Synergy database path, e.g. \\host\db\name (backslash symbols '\' in UNC/Windows paths must be doubled).
scontrol.rep.synergy.projspec=
  Synergy project spec, which contains the project name and its version, e.g. name-version.
scontrol.rep.synergy.login=
  Synergy user name.
scontrol.rep.synergy.password=
  Synergy password (not encoded).
scontrol.rep.synergy.port=
  Synergy port.
scontrol.rep.synergy.remote_client=
  (UNIX only) Specifies that you want to start ccm as a remote client. Default value is false. Optional.
scontrol.rep.synergy.local_dbpath=
  Specifies the path name to which your database information is copied when you are running a remote client session. If null, the default location will be used.
scontrol.synergy.exec=
  Path to external client executable (ccm).

Microsoft Visual SourceSafe Repository Definition Properties

scontrol.rep.type=vss
  Visual SourceSafe repository type identifier.
scontrol.rep.vss.ssdir=
  Path of the repository database (backslash symbols '\' in UNC/Windows paths must be doubled).
scontrol.rep.vss.projpath=
  VSS project path.
scontrol.rep.vss.login=
  VSS login.
scontrol.rep.vss.password=
  VSS password.
scontrol.vss.exec=
  Path to external client executable (ss).
scontrol.vss.lookup=
  Determines whether a full VSS database search is performed to find associations between local paths and repository paths. True or false.

Important Notes
• The repository(n).vss.ssdir property should contain a UNC value even if the repository database resides locally.
• Be aware of VSS naming syntax, conventions, and limitations. Any character can be used for names or labels, except the following:
  • Dollar sign ($)
  • At sign (@)
  • Angle brackets (< >), brackets ([ ]), braces ({ }), and parentheses (( ))
  • Colon (:) and semicolon (;)
  • Equal sign (=)
  • Caret sign (^)
  • Exclamation point (!)
  • Percent sign (%)
  • Question mark (?)
  • Comma (,)
  • Quotation mark (single or double) (' ")
• VSS 6.0 (build 8163), which is deployed with Visual Studio 6, does not work properly with projects whose names start with a dot (.) symbol. If such a project name is used, subprojects cannot be added.
• Do not use custom working directories for sub-projects (example: project $/SomeProject has the working directory C:\TEMP\VSS\SomeProject and its subproject $/SomeProject/SomeSubProject has the working directory D:\SomeSubProject).

Miscellaneous Settings

report.rules=[url_path_to_rules_directory]
  Specifies the directory for rules html files. The default setting is none.

tasks.clear=true|false
  Clears existing tasks upon startup in cli mode. This prevents excessive time being spent "loading existing results." The default is true.

classpath.[variable]=[value]
  Specifies classpath variables. For example:
  classpath.ECLIPSE_HOME=$(ECLIPSE_HOME)
  classpath.ECLIPSE_LIB=$(HOME)/dv/ThirdParty/eclipse/2.1.3/$(PS_ARCH)
  classpath.ECLIPSE3_LIB=$(HOME)/dv/ThirdParty/eclipse/3.0.0/$(PS_ARCH)
  classpath.THIRD_PARTY=$(HOME)/dv/ThirdParty
  classpath.SOAtest_CLASSES=$(HOME)/dv/plugins/com.parasoft.eclipse.api.$(os)/SOAtest/bin/
  classpath.JUNIT_JAR=$(HOME)/dv/ThirdParty/junit.jar
  classpath.SOAtest_ZIP=$(HOME)/dv/plugins/com.parasoft.eclipse.SOAtestplugin/resources/webking.jar
  classpath.JUNIT_HOME=$(HOME)/dv/ThirdParty

system.properties.classpath=[path1];[path2];[path3] ...
  Specifies which jar files and class folders are in the classpath. For example:
  system.properties.classpath=C\:\\myjars\\myLib1.jar;C\:\\myjars\\myLib2.jar

startup.server=true|false
  Determines whether the embedded stub server is started.

Here is one sample local settings file named local.properties:

# Team Server settings
tcm.server.enabled=true
tcm.server.name=tcm.mycompany.com
tcm.server.port=18888
tcm.server.accountLogin=true
tcm.server.username=tcm_user
tcm.server.password=tcm_pass
# Report Center settings
grs.enabled=true
grs.server=grs.mycompany.com
grs.port=32323
# Mail settings
report.mail.enabled=true
report.mail.server=mail.mycompany.com
report.mail.domain=mycompany.com
report.mail.cc=project_manager
report.mail.subject=Coding Standards
grs.log_as_nightly=true

Using Variables in Local Settings (Options) Files
The following variables can be used in reports, e-mail, Report Center, Team Server, and license settings. Note that the report tag value can't contain any ':' characters.

env_var
  Example: ${env_var:HOME}
  Outputs the value of the environment variable specified after the colon.

project_name
  Example: ${project_name}
  Outputs the name of the tested project. If more than one project is provided as an input, it first outputs the tested project name, then "..."

workspace_name
  Example: ${workspace_name}
  Outputs an empty string in Eclipse.

config_name
  Example: ${config_name}
  Outputs the name of the executed test configuration; applies only to Reports and Email settings.

analysis_type
  Example: ${analysis_type}
  Outputs a comma-separated list of enabled analysis types (for example: Static, Execution); applies only to Reports and Email settings.

tool_name
  Example: ${tool_name}
  Outputs the tool name (for example: SOAtest).

Example localsettings file

# REPORTS
#Determines whether reports are emailed to team members and to the additional recipients specified with the cc setting.
#Remember that if the team is using CVS for source control and each team member's email address matches his or her CVS username + the mail domain, each team member that worked on project code will automatically be sent a report that contains only the tasks/results related to his or her work.
report.mail.enabled=true
#Exclude team member emails (true/false)
report.mail.exclude.developers=false
# Append team member tasks to manager emails (true/false)
report.developer_errors=true
# Send reports to team members (true|false)
report.developer_reports=true
# Append suppressed messages (true|false)
report.suppressed_msgs=false
#Determines where to mail complete test reports.
#This setting is typically used to send reports to managers or architects.
#It can also be used to send reports to team members if team member reports
#are not sent automatically (for example, because the team is not using CVS).
[email protected];${env_var:USERNAME}@domain.com
# mail target for unknown team member tasks
[email protected]
#Specifies the mail server used to send reports.
report.mail.server=mail_server.domain.com
#Specifies the mail domain used to send reports.
report.mail.domain=domain.com
#Specify mail from
report.mail.from=nightly
#Specifies any email addresses you do not want to receive reports.
#This setting is used to prevent automatically sending reports to someone who worked on the code, but should not be receiving reports. This setting is only applicable if the team is using CVS for source control and team member reports are being sent automatically.
report.mail.exclude=developer1;developer2
# Specifies the subject line of the emails sent.
report.mail.subject=${tool_name} Report - ${config_name}
# Report test params include (true|false)
report.test_params=true

# Team Server
#Determines whether the current installation is connected to the Team Server.
tcm.server.enabled=true
#Specifies the machine name or IP address of the machine running Team Server.
tcm.server.name=tcm_server.domain.com
#Specifies the Team Server port number.
tcm.server.port=18888
tcm.server.accountLogin=true
tcm.server.username=user
tcm.server.password=password
report.tag=${config_name}

# Report Center
#Determines whether the current installation is connected to Report Center.
grs.enabled=true
#Specifies the host name of the Report Center server.
grs.server=grs_server.domain.com
# Specifies the port number of the Report Center report collector.
grs.port=32323
# Specifies user-defined attributes for Report Center.
#Use the format key1:value1;key2:value2
#Attributes help you mark results in ways that are meaningful to your organization.
#They also determine how results are grouped in Report Center and how you can filter results in Report Center.
#For example, you might want to label results by project name and/or by project component name.
#Each attribute contains two components: a general attribute category name
#and a specific identification value. For example, assume your organization wants to classify results by project.
#You might then use the attribute project:projname1. For the next project, you could use a different
#local settings file that specified an attribute such as project:projname2.
grs.user_defined_attributes=Type:Nightly;Project:Project1
# Determines whether the results sent to Report Center are marked as being from a nightly build.
grs.log_as_nightly=true

# SCOPE
#task assignment based on CVS
scope.sourcecontrol=true
#task assignment based on author tag
scope.author=false
#task assignment based on local user
scope.local=false

# LICENSE
#override license settings
#soatest.license.autoconf.timeout=40
soatest.license.use_network=true
soatest.license.network.host=license_server.domain.com
soatest.license.network.port=2002
soatest.license.network.edition=server_edition

# SOURCE CONTROL
scontrol.rep1.type=cvs
scontrol.rep1.cvs.root=:pserver:developer@cvs_server.domain.com:/home/cvs/
scontrol.rep1.cvs.pass=mypassword

cli Exit Codes
soatestcli uses the following exit codes when it encounters a problem:

130 - bad command line (the command line is malformed or refers to a resource that does not exist)
131 - Eclipse already running in the same workspace
133 - parser configuration error (cannot find/instantiate XML parser)
134 - command line is not licensed
135 - test process exited with an exception; check the error log
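These exit codes can be checked by the script that drives a nightly run. The following Windows batch fragment is only a sketch; the workspace path, Test Configuration, and report location are placeholders, and it simply shows one way to fail the build when soatestcli reports a problem:

rem Run the nightly tests and fail the build on a non-zero exit code
soatestcli.exe -data "c:\nightly\workspace" -config "user://Example Configuration" -report c:\reports\nightly -publish
IF NOT "%ERRORLEVEL%"=="0" (
    ECHO soatestcli failed with exit code %ERRORLEVEL%
    EXIT /B %ERRORLEVEL%
)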


Testing from the Web Service Interface
This topic explains how to run a test from the SOAtest web service API. Sections include:
• Prerequisites
• About the Web Service API
• Operations

Prerequisites
• The web service API requires SOAtest Server Edition.
• Before you can run a test from the web service API, you need to set up a project, .tst file, and test suites. See "Adding Projects, .tst files, and Test Suites", page 308 for details.

About the Web Service API
SOAtest's web service API enables web service clients to run tests on a remote machine running SOAtest. Web service mode is available for the Server Edition of SOAtest. SOAtest can be started in web service mode as follows:

soatestcli -startStubServer -data <workspace_dir> -localsettings <localsettings_file>

The -data and -localsettings arguments are optional.
• -data specifies the Eclipse workspace location containing your test cases (.tst files).
• -localsettings specifies a properties file used to control certain global settings such as license password and Team Server settings. For more information about these command line arguments, see "Testing from the Command Line Interface (soatestcli)", page 257.

The SOAtest web service is described by a WSDL document. When SOAtest is running in server mode, this WSDL can be found at http://localhost:9080/axis2/services/SOAtestService?wsdl. The WSDL can be used to generate web service clients. Most web service platforms can generate web service clients from a WSDL document.
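For example, a server might be started with a command along the following lines before clients are generated from the WSDL; the workspace path and settings file name are placeholders for values from your own environment:

soatestcli -startStubServer -data "c:\nightly\workspace" -localsettings local.properties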

Operations
The web service has three operations:
• startTestExecution: This operation initiates a test run. It returns a "pid" identifier that can be used in subsequent web service calls to check the test execution status and get the test results. The XML schema for the request message corresponds to the command line interface used by Parasoft products (e.g., SOAtest, Jtest, C++test, .TEST). For example, the request must include a "config" element, which maps to the "-config" argument from the command line interface. SOAtest-specific command line extensions, such as "-environment", are also represented in the XML schema for the request message.
• getExecutionStatus: This operation gets the current execution status of a test run given a "pid". The response message indicates whether the test run is in progress and the percent completed.
• getResult: This operation returns a test execution summary given the "pid" of a completed test run. The response message can also contain the XML and HTML reports. These reports can be used to get detailed information about the test run (including the specific test failures).


Reviewing Results
In this section:
• Viewing Results
• Generating Reports
• Configuring Reporting Settings
• Understanding Reports

Viewing Results
This topic explains how to view results and customize SOAtest's results display. Sections include:
• Accessing Results
• Reviewing the Results
• Customizing the Results Display
• Clearing Messages

Accessing Results
SOAtest results can be accessed from a variety of locations in the GUI, as well as from command-line reports.

Results from Tests Run in the GUI

Test Progress View
The Test Progress view reports test progress and status. Note that:
• When a test runs, the view label changes from "Test Progress" to "Testing [Test Configuration name]".
• Clicking the Review tasks button displays the results in the SOAtest view.
• A results summary for each analysis category is available in expandable sections.
• The toolbar buttons in the upper right corner of the Test Progress view allow you to generate reports or show/hide details. This opens the Report dialog, from which you can configure Report Preferences.

SOAtest View
If SOAtest detects that a quality task needs to be performed (e.g., review a test failure, resolve a policy violation, etc.), it reports a task in the SOAtest view. If this view is not available, choose SOAtest> Show View> SOAtest to open it. To see additional details, drill down into the SOAtest view tree. To toggle through the items reported in the SOAtest view, use the arrow buttons in the SOAtest view toolbar.

Source Code Markers
For static analysis tests that were run on source files, results are also reported at the source code level. For details, see "Accessing Results", page 605.

Console View
To see testing details, open the Console view during test execution. Testing details are reported here when a test is in process, and remain there until they are cleared or until another test is run.

Test Case Explorer View
The Test Case Explorer indicates the status (pass/fail/not yet executed) of all available functional test cases. For details, see "Reviewing Functional Test Results", page 379. To view any tasks related to a test listed in the Test Case Explorer, right-click that test's Test Case Explorer node, then choose Show in Tasks. Any tasks related to that test will then be shown in the SOAtest view.


Tips
• Many SOAtest tree nodes report the line number at which an error or possible problem occurred. To view the related code, double-click the node that shows the line number, or right-click that node and choose Go to from the shortcut menu. The related editor will then open and highlight the designated line of code.
• You can use Ctrl + C to copy findings in the SOAtest view, then paste them into another document.

Results from Tests Run from the Command Line
For tests run from the command line, results are recorded in the generated report. If results were sent to Team Server, results can be imported into the GUI as described in "Accessing Results and Reports", page 234. You can then review the results as if the test had been performed in the GUI. Static analysis results are reported under the Static Analysis category. Functional test results are reported under the Test Execution category.

Reviewing the Results
In the SOAtest view, the results are presented as a task list that helps you determine how to proceed in order to ensure the quality of your system.

Functional Testing
Functional test results are organized by test suite. See "Reviewing Functional Test Results", page 379 for details.

Static Analysis
Static analysis results should be reviewed in one of the layouts designed especially for static analysis. For details on enabling these layouts and reviewing static analysis results, see "Reviewing Static Analysis Results", page 605.

How do source code changes affect reported findings?
If you enable Decorate code markers when tasks become out of date on the Configurations page of the SOAtest preferences panel (accessed by choosing SOAtest> Preferences), then any time an element in a results message is outdated (i.e., the current source code does not match the analyzed code), the element, plus the violation node, will be marked with a special "out of date" icon and tool tip. Even if this option is not enabled, the option to double-click an outdated element to see the related code will be disabled, because the related source code is no longer available.


Filtering Results
By default, the SOAtest view shows cumulative results for all tested resources. For example, if you imported results from Team Server, then ran two tests from the GUI, the SOAtest view would show all imported tasks, plus all results from the subsequent two tests. If you prefer to see only results from the last test session or from selected resources, you can filter results. To filter results:
1. Click the filters button in the SOAtest view's toolbar.
2. Set the desired filter options in the dialog that opens. You can configure it to show only last-session tasks, or only tasks for designated resources.

Customizing the Results Display

Changing the Display Format
There are three available layout templates:
• Default Layout: For functional testing.
• Static Analysis Layout: For running static analysis against source code (e.g., from the Scanning perspective).
• Static Analysis for Functional Tests Layout: For running static analysis by executing a test suite (e.g., a test suite that contains a Browser Testing tool or a Scanning tool).
You can choose the format by opening the pull-down menu on the top right of the SOAtest view, then choosing one of the available formats from the Layout shortcut menu that opens.

Changing Categories
To hide or display a category of problem information (project, category, subcategory, location, user, etc.) in the SOAtest view, click the pull-down menu on the top right of this view, then choose the appropriate command in the Layout> Advanced menu.


Or, right-click that category’s node in the SOAtest view, then choose Hide [category] from the shortcut menu.

Clearing Messages
You might want to clear messages from the SOAtest view to help you focus on the findings that you are most interested in. For example, if you are fixing reported errors, you might want to clear each error message as you fix the related error. That way, the SOAtest view only displays the error messages for the errors that still need to be fixed. Messages that you clear will only be removed temporarily. If the same findings are achieved during subsequent tests, the messages will be reported again. You can clear individual messages, categories of messages represented in the SOAtest view, or all reported messages.

Clearing Selected Messages
To clear selected messages shown in the SOAtest view:
1. Select the message(s) or category of messages you want to delete. You can select multiple messages using Shift + left click or Ctrl + left click.
2. Right-click the message(s) you want to delete, then choose Delete. The selected messages will be removed from the SOAtest view.

Clearing All Messages
To clear all messages found:
• Click the Delete All icon (a red X) at the top of the SOAtest view.


Generating Reports
This topic explains how to generate HTML, PDF, or custom XSL reports for tests that you run from the GUI or command line. Sections include: •

From the GUI



From the Command Line

From the GUI
Generating the Report
To generate a report immediately after a test completes: 1. After the test has completed, click the Report button that is available in the Test Progress view (at the bottom of the GUI).

2. Complete the Report dialog that opens. The Report dialog allows you to specify: •

Report preferences (by clicking the Preferences button and specifying settings as explained in “Configuring Reporting Settings”, page 297).



Any options files that specify reporting settings you want to use (these will override settings specified in the GUI’s Preferences panel).



The report format (HTML, PDF, or custom XSL).



The location of the report file.



Whether the report is deleted upon exit.



Whether the report should be uploaded to the Team Server (Server Edition only; requires Team Server).



Whether code review tasks/results should be uploaded to the Team Server (any edition; requires Team Server).

3. Click OK. SOAtest will then open an HTML report in a browser window. This report is similar to the manager HTML reports that are generated from command line tests.

On-Demand Report Generation You can generate a report for the previous test session at any time—regardless of whether the Testing dialog is available. To generate a report of the previous test session: •

Click the pull-down menu on the top right of this view, then choose Last Session> Report.


Uploading the Report to Team Server To upload the report to Team Server (Server Edition only): •

Follow the above procedure, but be sure to enable Publish: Reports before clicking OK.

For information about this report, see “Understanding Reports”, page 300.

Tip
To customize report settings, create a local settings file (as described in “Testing from the Command Line Interface (soatestcli)”, page 257), then enter the location of this file in the Report Options field of the Report dialog.

From the Command Line
To generate an HTML report of command line test results (Server Edition only), use the -report %REPORT_FILE% option with your soatestcli command. To upload the report to Team Server, also use the -publish option with your soatestcli command. Two types of HTML reports can be produced from the command line interface: manager reports and individual reports. For information about these reports, see “Understanding Reports”, page 300.
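For example, a minimal invocation might look like the following; the report path is hypothetical, and the other options your run requires (such as the test configuration and workspace settings described in the command line chapter) are omitted here:

soatestcli ... -report /home/build/reports/mytests.html -publish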


Configuring Reporting Settings
This topic explains how to configure reporting settings. Reporting settings can be configured in the UI or from the command line interface (using a local settings file). Sections include: •

From the GUI



From the Command Line



Support for Custom Report Formats

From the GUI The available GUI controls can be used to specify reporting settings for any test—whether it is run from the command-line interface or the UI. Before configuring report settings, you should review the settings on the following preference pages to ensure that the task authorship is being calculated correctly, results are being sent to the proper Team Server and Report Center server, the correct email host is used, and so on: •

E-mail



Group Reporting



License



Scope and Authorship



Source Control



Team

The settings specified in the UI can be fully or partially overwritten by those specified in a local settings file. To specify reporting settings from the GUI: 1. Choose SOAtest> Preferences. The Preferences dialog will open. 2. Select SOAtest> Reports. 3. Specify the appropriate report content and format settings. Available settings include: •

Report contents: •

Detailed report for developers: Determines whether customized, detailed reports are generated for each developer (in addition to a summary report for managers). These reports contain only the tasks assigned to that developer.



Overview of tasks by authors: Determines whether the report includes an overview of the number and type of tasks assigned to each developer.



Tasks details: Determines whether the report includes details about all of the reported tasks.



Only top-level test suites: Determines whether the Test Suite Summary report section only lists the .tst files (with this option enabled) or displays a tree-like view of the individual tests in each .tst file (with this option disabled).



Active static analysis rules: Determines whether the report lists the static analysis rules that were enabled for the test.








Generate formatted reports in command-line mode: Determines whether formatted reports are generated for tests run in command line mode.



Overview of checked files and executed tests: Specifies whether the report provides details about all checked files and executed tests. •

For static analysis, this results in a list of all the files that were checked. For each file, it lists the number of rule violations and the number of suppressed violations. If the file has a violation, it also lists the line number, rule name, and rule ID for that violation.



For test execution, this results in a list of all executed test cases and their outcomes (pass or fail). For each test suite, it lists the total number of test cases and the number of passed test cases. If a task is reported for a test case, additional details (stack trace, outcome, etc.) are presented.



Suppressions details: Specifies whether the report lists suppressed messages.



Only tests that failed: Specifies whether the report lists only failed tests.



Cutoff date for graphs: Specifies the start date for trend graphs that track different categories of tasks over a period of time.

Report format: •

Format: Specifies the desired report format (HTML, PDF, or custom XSL).



XSL file: If you chose custom XSL as the report format, specify the path to the XSL file that defines your custom format. See “Support for Custom Report Formats”, page 299 for details and examples.



Report file extension: If you want to use a file extension other than the default .html extension, specify that extension here.

Advanced settings: •

Add absolute file paths to XML data: Specifies whether absolute file paths are added to XML data.



Report tag: Specifies the name of the group that is responsible for the project. This value is used for uploading summary results to Team Server. The tag is an identifier of the module checked during the analysis process. Reports for different modules should be marked with different tags. The variables detailed in “Using Variables in Preference Settings”, page 761 can be used here.

4. If you have not already configured e-mail settings (sender address, host name, etc.) in either the GUI or from the command line, do so now in SOAtest> E-mail Settings. 5. Select SOAtest> Reports> E-mail Notifications. 6. Specify the appropriate e-mail notification settings. Available settings include: •

Send reports by e-mail: Specifies whether reports are sent via e-mail.



E-mail subject: Specifies the subject line of the e-mails containing reports. The default subject line is "SOAtest Report." For example, if you want to change the subject line to "SOAtest Report for Project A", you would enter SOAtest Report for Project A.



Send manager reports to: Specifies where to send manager reports.



Send reports without tasks: Specifies whether reports are sent when zero tasks are reported.




Send developer reports to: Specifies where to send developer reports.



Send ’Unknown’ developer reports to: Specifies where to send developer reports for tasks assigned to "unknown" (tasks that could not be traced back to a specific developer).

From the Command Line
Reporting settings can also be specified in local settings files. See “Reporting Settings”, page 267 for details. Note that the settings specified in the UI can be fully or partially overwritten by those specified in a local settings file. If a parameter is specified in this file, it will override the related parameter specified from the GUI. If a parameter is not specified in this file, SOAtest will use the parameter specified in the GUI.

Support for Custom Report Formats
A custom XSL transformer was added to facilitate the use of custom XSL formats. You specify a custom report format by entering the XSL file and report file extension. In the Reports preference page, you can specify this information in the Custom report format area of the page. In the options file, you can specify this information using the following options (a settings sketch follows the sample file list below):
(results.)report.custom.extension
(results.)report.custom.xsl.file
For additional guidance, see the following files (available in the manuals directory): •

XML Schema: reports.xsd



Sample XML with a variety of results: rep_example.xml •







Note that this report is used by all transformations below

Simple CSV plain text file transformation •

XSL file: csv.xsl



result: csv.txt

Simple HTML table with violations list •

XSL file: html_table.xsl



result: html_table.html

Simple HTML table with author/violations statistics •

XSL file: stats_table.xsl



result: stats_table.html
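As a minimal sketch, the corresponding lines in a local settings (options) file might look like the following when using the csv.xsl sample; the path is hypothetical, and the optional results. prefix may be added to either option:

report.custom.xsl.file=C:/Parasoft/SOAtest/manuals/csv.xsl
report.custom.extension=txt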


Understanding Reports
This topic provides a general introduction to the reports that SOAtest produces for GUI and command-line tests. Report details will vary based on report settings, the Test Configuration used, and the errors found. Sections include: •

Report Types



Report Contents

Report Types Two types of reports can be produced from the command line interface: •

Comprehensive reports: Reports that contain all tasks generated for a single test run.



Individual reports: Reports that contain only tasks assigned to the specified team member.

For example, if a test generated 5 tasks for Tom and 10 tasks for Joe, the comprehensive report would contain all 15 tasks, Tom’s report would contain 5 tasks, and Joe’s report would contain 10 tasks.

Report Contents Reports may contain the following sections:

Header/Navigation Bar The top left cell of the header/navigation bar shows the time and date of the test. The remaining cells (Static Analysis, Test Execution) each link to the named report section.

Static Analysis Section The Static Analysis section includes several items: •

The Static Analysis trends graph tracks how the total number of lines of code in the project, the lines of project code that were checked during static analysis, and the total number of reported static analysis tasks vary over time. This graph is created only for tests that are run from the command line and that use the -publish command.




The Overview table shows a basic summary of all static analysis tasks for the tested project(s). It reports the total number of static analysis tasks, the number of files checked during static analysis, the total number of project files, the number of lines of code checked during static analysis, and the total number of lines of code in the project. It also reports the total time spent performing static analysis.



The All Tasks by Category table shows the total number of tasks reported for each static analysis rule category and rule. Tasks can be sorted by rule category or rule severity; click the appropriate link in the table header to change the table sorting.



The Tasks per Author table shows the number of static analysis tasks assigned to each team member. To see details about the static analysis tasks assigned to a particular team member, click the related username. •

If a team member’s name is listed in green, it means that there are no "recommended static analysis tasks" reported for that team member ("recommended tasks" are the subset of all reported tasks that SOAtest has selected for that team member to review and address today [based on the maximum number of tasks per team member that your team has configured to report, as described in “Defining Task Reporting and Resolution Targets (Goals Tab)”, page 250, and task assignment settings, as described in “Configuring Task Assignment”, page 215]).



The Task Details section provides details about each reported task.



The Checked Files (Details) section lists all the files that were checked. For each file, it lists the number of rule violations and the number of suppressed violations. If the file has a violation, it also lists the line number, rule name, and rule ID for that violation.



The Active Rules section lists the names and IDs of all rules that were enabled for the test.

Test Execution Section The Test Execution section includes several items: •

The Test Execution Tasks trends graph tracks how the number of functional test failures change over time. This graph is created only for tests that are run from the command line interface and that use the -publish command.




The Test Suite Summary provides a breakdown of failed tests, successful tests, total tests, and the success percentage. Test suites with failed tests are highlighted in pink.




The Tasks per Author table shows the number of testing tasks assigned to each team member. To see details about the testing tasks assigned to a particular team member, click the related username. •

If a team member’s name is listed in green, it means that there are no "recommended testing tasks" reported for that team member ("recommended tasks" are the subset of all reported tasks that SOAtest has selected for that team member to review and address today [based on the maximum number of tasks per team member that your team has configured to report, as described in “Defining Task Reporting and Resolution Targets (Goals Tab)”, page 250, and task assignment settings, as described in “Configuring Task Assignment”, page 215]).

Tasks Per Author

Tasks Details



The Task Details section provides details about each reported task.



The Executed Tests (Details) section lists all executed test cases and their outcomes (pass or fail). For each test suite, it lists the total number of test cases and the number of passed test cases. If a task is reported for a test case, additional details (stack trace, outcome, etc.) are presented.


Team Server Report Link
This link allows you to directly browse to this and other report files available on Team Server. In the reports available on Team Server, all links (for instance, links to Category) are active. Links are not active in emailed reports. Thus, if you want to explore an emailed report in more detail, we recommend that you follow this link and access the report on Team Server.


Functional/Integration Testing
In this section: •

End-to-End Test Scenarios



Web Functional Tests



SOA Functional Tests


End-to-End Test Scenarios
In this section: •

Configuring End-to-End Test Scenarios: Overview



Adding Projects, .tst files, and Test Suites



Working with Projects and .tst files



Reusing/Modularizing Test Suites



Configuring Test Suite Properties (Test Flow Logic, Variables, etc.)



Adding Standard Tests



Adding Set-Up and Tear-Down Tests



Adding Test Outputs



Adding Global Test Suite Properties



Reusing and Reordering Tests



Parameterizing Tests (with Data Sources or Values from Other Tests)



Configuring Testing in Different Environments



Validating the Database Layer



Validating EJBs



Validating Java Application-Layer Functionality



Monitoring and Validating Messages and Events Inside ESBs and Other Systems



Executing Functional Tests



Reviewing Functional Test Results



Creating a Report of the Test Suite Structure



Managing the Test Suite


Configuring End-to-End Test Scenarios: Overview
SOAtest provides full support for the testing of both Web interfaces and services over multiple protocols. This establishes an integrated framework for "end-to-end" testing; a single test suite can verify operations that cross the messaging layer, the Web interface, the database, and EJBs. Moreover, "unit tests" can be created as soon as a single "unit of work" is completed; this test suite can then be incrementally enhanced to verify additional components as they are added, test the integration of components within an SOA, and audit end-to-end business processes that span composite applications. SOAtest provides a flexible test suite infrastructure that lets you add, organize, and run specialized test scenarios. Each test in a test suite contains a main test tool and any number or combination of outputs (other tools, or special output options). You can run individual tests, or the complete test suite. In addition, you can attach regression controls at the test or test suite level so that you are immediately alerted to unexpected changes. After individual functional tests have been created, they can be leveraged into scenario-based tests without any additional work. Scenario tests allow you to emulate business logic or transactions that may occur during normal usage of the application or service. This also allows you to find bugs that may surface only after a certain sequence of events.

Recommended Workflow
The most common workflow for developing end-to-end test scenarios is as follows. As each "unit of work" (service or other logic component) is completed: 1. Use a wizard to automatically generate a project, test suite, and initial test cases. 2. Add additional outputs and tests as needed to cover the functionality that you want to test. 3. Execute the test and verify that it is producing the expected result. 4. If there are problems with the application, diagnose and correct them, then rerun the tests. 5. If the test is not working as expected, adjust test and tool settings as needed, then rerun the tests. 6. Add regression controls or validations once the tested functionality is behaving as expected. As your test suites grow, combine tests into test scenarios; for example, you can: •

Copy and paste tests from different test suites into a logical order.



Configure execution options such as test sequence, test relationship, and test flow logic.



Parameterize tests to use values from data sources or values extracted from other tests.



Use stubs and environments to configure a predictable and accessible test bed.


Adding Projects, .tst files, and Test Suites
This topic provides a general guide on how to add projects, .tst files and test suites using SOAtest’s various test creation wizards. Sections include: •

Projects, .tst files, and Test Suites



Test Suites and Scenarios



Adding a New Project and .tst File



Creating an Empty Project



Adding a New .tst File to an Existing Project



Adding a New Test Suite

Projects, .tst files, and Test Suites
A project (an entity created by Eclipse) can contain any number of SOAtest-specific .tst files. Projects can also contain source files you want to analyze with SOAtest, and any other resources that make sense for your environment. Each .tst file can include any number of test suites/scenarios, tools, inputs, and stubs. The organization and structure are up to you. To keep file size down and to improve maintainability, we recommend using one .tst file for each distinct testing requirement.

Test Suites and Scenarios
A test suite is any collection of tests that are individually runnable; this is controlled by the corresponding setting in the test suite configuration panel.

A scenario is any collection of tests that are not individually runnable because they have dependencies. One example of a scenario is when a series of SOA tests extracts a value from one test’s response and uses it as part of a subsequent test message. Another example is a sequence of Web functional tests recorded from a browser.

Adding a New Project and .tst File


Projects provide you with a central place to access and operate on different .tst files and the test suites they contain. You can either create a test suite manually, or you can use the SOAtest test creation wizards to automatically create a test suite from various platforms or artifacts. SOAtest provides a wizard to guide you through the creation of a new project and adding an initial .tst file to it. There are two ways to access this wizard: •

Choose File> New, then either select your desired test creation option (for example, Project from WSDL, Project from Web Browser Recording, etc.) if it is listed, or choose Other to open a full list of test creation options.



Choose this command from the pull-down menu for the New toolbar button (top left).

The wizard will guide you through the test case creation process, then create a project and .tst file containing the generated tests. For help selecting and completing the available test creation wizards, see: •

“Automatic Creation of Test Suites for SOA: Overview”, page 384.



“Recording Tests from a Browser”, page 431



“Configuring SOAtest to Scan a Web Application”, page 583

Creating an Empty Project

You can create an empty project as follows: 1. Open the pull-down menu for the New toolbar button (top left) then choose Project.

2. Choose SOAtest> Empty Project, then click Next. 3. Enter a name for the project, change the destination if needed, then click Finish.

Adding a New .tst File to an Existing Project
We recommend that you create a separate test (.tst file) for each distinct testing requirement. To add a new test (.tst) file to an existing project: 1. Do one of the following: •

Right-click the project node, and select New Test (.tst) File from the shortcut menu.



Choose File > New > New Test (.tst) File.


2. In the New Test (.tst) File wizard that opens, select the project that you want to contain the .tst file, enter a name for the .tst file, then click Next.

You can then complete the wizard to specify what type of tests you want to create and how you want them created. For help selecting and completing the available test creation wizards, see: •

“Automatic Creation of Test Suites for SOA: Overview”, page 384.



“Recording Tests from a Browser”, page 431



“Configuring SOAtest to Scan a Web Application”, page 583

Adding a New Test Suite To create a new test suite: 1. Do one of the following:




Select the Test Case Explorer node for the existing test suite into which you want to add a new test suite, then click the Add Test Suite button:



Right-click the Test Case Explorer node for the existing test suite into which you want to add a new test suite, then choose Add New> Test Suite from the shortcut menu.


Working with Projects and .tst files
This topic explains how to work with projects and .tst files. Sections include: •



Working with Projects •

Saving a Project File



Closing a Project File



Opening a Project File

Working with Test (.tst) Files •

Opening .tst Files



Understanding a .tst File’s XML Format

Working with Projects
Saving a Project File
Any changes that you make to project properties, test suites, and so on are automatically saved to the project. Projects will remain open until you explicitly close them.

Closing a Project File
When you close a project file, SOAtest will close all related trees and settings. Closed projects are shown in the Navigator, but not in the Test Case Explorer. To close the current project and all related settings: •

In the Navigator, right-click the project, and choose Close Project from the shortcut menu.

Opening a Project File
When you open a project file, all associated trees, settings, and reports will be restored. To open a project file: •

In the Navigator, right-click the project, and choose Open Project from the shortcut menu.

Working with Test (.tst) Files
Opening .tst Files
By default, .tst files are closed. All open .tst files are loaded into memory. There are two ways to open a .tst file: •

Double click the .tst file’s Test Case Explorer node



Right-click the .tst file’s Test Case Explorer node, then choose Open Test (.tst) File from the shortcut menu.

Closed .tst files have the following "closed box" icon:


Open .tst files have the following "open box" icon:

Closing .tst Files
There are two ways to close a .tst file: •

Double click the .tst file’s Test Case Explorer node



Right-click the .tst file’s Test Case Explorer node, then choose Close Test (.tst) File from the shortcut menu.

Understanding a .tst File’s XML Format
SOAtest .tst files are saved in XML format, and thus can be parsed to get test suite and test information into a custom framework (a parsing sketch follows the table below). The following table describes how some of the most commonly-used tools are represented:

Artifact                   Element Name               Parent Element
Root element               SOAtestProject             None
Test Suite                 TestSuite                  TestSuite (if nested under other Test Suites)
Test or a Test Suite name  name                       any of the other listed elements
Environments               EnvironmentConfiguration   TestSuite
Data Sources               DataSource                 TestSuite/SOAPRPCToolTest
Messaging Client           HTTPClient                 TestSuite/HTTPClientToolTest
Browser Testing Tool       BrowserTestingTool         TestSuite/ToolTest
DB Tool                    DbTool                     TestSuite/ToolTest
Extension Tool             MethodTool                 TestSuite/ToolTest
Call Back Tool             CallBackTool               TestSuite/CallBackToolTest
Message Stub Tool          ClientTester               TestSuite/ClientTesterTest
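For teams that parse .tst files into a custom framework, the following is a minimal sketch in Python (not part of SOAtest). It assumes the element names shown in the table above, and that a test suite’s name is stored in a child name element; adjust the lookups if your files are structured differently.

import xml.etree.ElementTree as ET

def list_test_suites(tst_path):
    # .tst files are plain XML; the root element is expected to be SOAtestProject
    tree = ET.parse(tst_path)
    root = tree.getroot()
    for suite in root.iter("TestSuite"):
        name = suite.find("name")  # assumes the name is a direct child element
        print(name.text if name is not None else "(unnamed test suite)")

list_test_suites("MyTests.tst")  # hypothetical file name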

Moreover, the following list describes common, specially named elements that appear in the XML project file. The fields are named so that you can omit parts of a name to make a search more general. For example, to search for any WSDL, you could search and replace for "_WSDLLocation>http://mywsdl</"; for just SOAP Client WSDLs, you could search for "<SOAPClient_WSDLLocation>http://mywsdl</SOAPClient_WSDLLocation>". (A scripted example of this kind of search appears after the field lists below.)
WSDL fields: •

SOAPClient_WSDLLocation



ClientTester_WSDLLocation




WSITool_WSDLLocation



XMLValidator_WSDLLocation

Schema fields: •

SOAPClient_SchemaLocation



ClientTester_SchemaLocation



MessagingClient_SchemaLocation

Endpoint fields: •

SOAPClient_CustomEndpoint



SOAPClient_UDDIServiceKey



MessagingClient_Endpoint

Literal (XML) fields: •

SOAPClient_LiteralMessage



ClientTester_LiteralMessage



MessagingClient_LiteralMessage

XPath Fields: •

XMLDatabank_ExtractXPath



XMLDatabank_AlterXPath



XMLTransformer_ExtractXPath



XMLTransformer_AlterXPath



Assertion_XPath

Diff Tool Regression Controls: •

DiffTool_RegressionControl

BrowserTestingTool fields: •

BrowserTestingTool_NavigateURL - the url field for a Navigate action



BrowserTestingTool_WindowName - the window name field for any action



BrowserTestingTool_LocatorAttributeValue - the attribute value field for any action set to an Element Properties locator



BrowserTestingTool_LocatorXPath - the xpath field for any action set to an XPath locator



BrowserTestingTool_TypeValue - the value field for a type action



BrowserTestingTool_OtherValue - the value field for an "other" action



BrowserTestingTool_NewBrowserURL - the url field for a NewBrowser action


BrowserDataBank: •

BrowserDataBank_LocatorAttributeValue - the attribute value field for any extraction set to an Element Properties locator



BrowserDataBank_LocatorXPath - the xpath field for any extraction set to an XPath locator



BrowserDataBank_WindowName - the window name field for any extraction

BrowserValidationTool: •

BrowserValidationTool_LocatorAttributeValue - the attribute value field for any validation set to an Element Properties locator



BrowserValidationTool_LocatorXPath - the xpath field for any validation set to an XPath locator



BrowserValidationTool_WindowName - the window name field for any validation



BrowserValidationTool_ExpectedValue - the expected value field for any validation
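The kind of search and replace described above can also be scripted. The following is a minimal Python sketch (not a SOAtest feature) that swaps one SOAP Client WSDL location for another by treating the .tst file as text; the file name and URLs are hypothetical.

def replace_wsdl_location(tst_path, old_url, new_url):
    # Read the .tst file as plain text
    with open(tst_path, "r", encoding="utf-8") as f:
        content = f.read()
    # Restricting the match to SOAPClient_WSDLLocation leaves the other
    # _WSDLLocation fields (ClientTester, WSITool, XMLValidator) untouched
    old = "<SOAPClient_WSDLLocation>" + old_url + "</SOAPClient_WSDLLocation>"
    new = "<SOAPClient_WSDLLocation>" + new_url + "</SOAPClient_WSDLLocation>"
    with open(tst_path, "w", encoding="utf-8") as f:
        f.write(content.replace(old, new))

replace_wsdl_location("MyTests.tst", "http://mywsdl", "http://staging/mywsdl")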


Reusing/Modularizing Test Suites
This topic explains how to make test suites reusable. Sections include: •

Introduction



Using Test Suite References



Using Test Variables



Tutorial

Introduction In many cases, you may want to create a test suite that can be reused by other test suites. A common example is a test suite that logs in to a web site. Once such a test suite is created, it can be used by other test suites in various scenarios that require a login. Two SOAtest features are especially helpful for the creation and use of reusable test suites: •

Referenced test suites: Once a reusable module or test suite has been created, it can be referenced by another test suite.



Test variables: You can parameterize tests with test variables, which can be set to specific values from a central location, set from data sources, or set from a data bank tool or Extension tool.

Using Test Suite References Adding an existing test suite as a test suite reference is especially useful if you have a test suite that you would like your team members to reuse across multiple parent test suites. For example, you may have a single Authentication test suite that different team members want to use within different root test suites. In this situation, the team members can add a reference to the defined Authentication test within their specific test suite. For another example, consider a web application that requires a user to log in. The sequence of steps to log into the application could be saved in one SOAtest test suite, and then every functional test for that web application could reference the test suite containing the login information. Setting up the tests in this manner makes it much easier to manage an evolving web application. If an extra step is added to the login process for the web application, then only the "login" test suite needs to be modified to include that extra step, and all other tests that reference the "login" test suite will automatically be updated with the change. To reference an existing test suite in another test suite: 1. Right-click the Test Suite tree node where you want the test suite referenced, then select Add New> Test Suite from the shortcut menu. 2. Select Reference Test (.tst) File and click the Finish button. 3. Select the appropriate .tst file from the file chooser that opens. After you add a test suite reference, it will be referenced by the current test suite. If the referenced test suite is modified (from its original location in the Test Case Explorer) , those changes will be propagated to the parent test.

Using Test Variables

For details on using test variables, see “Defining Test Variables”, page 326.

Tutorial For a step-by-step demonstration of how to construct and use a reusable test suite, see “Creating Reusable (Modular) Test Suites”, page 92.


Configuring Test Suite Properties (Test Flow Logic, Variables, etc.)
This topic explains how to customize a test suite’s properties (such as its name and how individual test cases are run). Topics include: •

Accessing the Test Suite Configuration Panel



Correlating Tests to Requirements, Project Center Tasks, Bug Reports, and Feature Requests



Specifying Execution Options (Test Flow Logic, Regression Options, etc.)



Defining Test Variables



Specifying SOAP Client Options



Specifying Browser Playback Options



Specifying Reporting Options

Accessing the Test Suite Configuration Panel To customize test suite properties, double-click its Test Case Explorer node and use the controls that open in the configuration panel (on the right side of the GUI).

Correlating Tests to Requirements, Project Center Tasks, Bug Reports, and Feature Requests The Requirements and Notes tab of the test suite configuration panel allows you to identify requirements for each test in the test suite. The requirements you define will appear in Structure reports (and also in Report Center for Report Center users), allowing managers and reviewers to determine whether the specified test requirements were accomplished. For more information on Structure Reports, see “Creating a Report of the Test Suite Structure”, page 380. To configure requirements for tracking, complete the following from the Requirements and Notes tab: 1. Select a node from the test suite tree within the Requirements and Notes tab.


2. Click the Add button. 3. In the Type box, select a requirement type. Parasoft Concerto will use this information to associate the test suite’s test cases to the specified element type. For instance, if it is associated with a specific bug, information about the test case’s status will be considered for Report Center’s bugs graphs. Available task types are: •

@pr: for bugs.



@fr: for feature requests.



@req: for requirements.



@task: for tasks.

4. Enter an ID and a URL for the requirement and click OK.

The requirement you defined will display in the Requirements table within the Requirements and Notes tab and correspond to the test suite node you selected and all of its child nodes. 5. (Optional) If you want to enter notes for the test suite, enter a description in the Notes field. This option is useful in that it enables you to view a quick description of the test suite purpose.

Specifying Execution Options (Test Flow Logic, Regression Options, etc.) Configurable execution options allow you to control factors such as whether: •

Tests run sequentially or concurrently.



Tests can be run independently, or should be run in groups.



One test depends on the result of another test



The entire test suite should loop until a certain condition is met.



Regression controls are created for specific tests, and how regression controls map to data sources.

These options are configured in the Execution Options tab, which has three sub-tabs: Test Execution, Test Flow Logic, and Regression Options.

Test Execution You can customize the following options in the Test Execution sub-tab of the Execution Options: •

Execution Mode: These options determine the concurrency of test runs






Tests run Sequentially: Choose this option to tell SOAtest to run each test, and child test suite of this test suite, separately from the others. Tests will run one at a time.



Tests run Concurrently: Choose this option to tell SOAtest to run all tests and child test suites of this test suite at the same time. Tests will run simultaneously.

Test Relationship: These options determine how SOAtest will iterate through the rows of your data sources •

Tests are individually runnable: (Default) SOAtest iterates through all data source rows for each test. When an individual test executes, it will use every row of the data source before the next test or test suite is executed. When a child test suite is executed, SOAtest will wait for all of its children to finish before the next test or test suite is executed.



Abort Scenario on Fatal Error: To stop running tests if the previous test resulted in a fatal error, check the Abort Scenario on Fatal Error checkbox. •

This option is only available when both Tests run Sequentially is selected and Tests are individually runnable is not selected. This case occurs when a set of tests in a test suite are dependent on each other, cannot be run apart from each other, and must run sequentially. If the option is enabled, and if a test within the scenario being run has a fatal error, the rest of the tests in the scenario will not be run. If it is disabled, even if a fatal error occurs, the remaining tests in the scenario will be run.



Tests run as group: (Default for scenarios) SOAtest runs all tests for each row of the data source. In this case, a data source row is chosen, and each test and child test suite is executed for that row. Once all children have executed, a new row is chosen and the process repeats.



Tests run all sub-groups as part of this group: SOAtest treats all tests contained in this test suite like direct children of this test suite. SOAtest will then iterate through them as a group. For example, consider the arrangement in the following figure:

In this case, we assume that Test Suite 2 and Test Suite 3 are both set to “Tests are individually runnable,” and that the Table data source has 2 rows of data. The order in which tests would run for each choice of Test Relationship option on Test Suite 1 is shown below (with Test Suite 2 and Test Suite 3 remaining set to run individually).


Run individually:
1. SOAP Client 1 row 1
2. SOAP Client 1 row 2
3. SOAP Client 2 row 1
4. SOAP Client 2 row 2
5. SOAP Client 3 row 1
6. SOAP Client 3 row 2

Run as group:
1. SOAP Client 1 row 1
2. SOAP Client 2 row 1
3. SOAP Client 2 row 2
4. SOAP Client 3 row 1
5. SOAP Client 3 row 2
6. SOAP Client 1 row 2
7. SOAP Client 2 row 1
8. SOAP Client 2 row 2
9. SOAP Client 3 row 1
10. SOAP Client 3 row 2

Run all subgroups as part of this group:
1. SOAP Client 1 row 1
2. SOAP Client 2 row 1
3. SOAP Client 3 row 1
4. SOAP Client 1 row 2
5. SOAP Client 2 row 2
6. SOAP Client 3 row 2

Test Flow Logic SOAtest allows you to create tests that are dependent on the success or failure of previous tests, setup tests, or tear-down tests, thereby creating an efficient workflow within the test suite. In addition, you can also influence test suite logic by creating while loops and if/else statements that depend on the value of a test variable. Options can be set at the test suite level (options that apply to all tests in the test suite), or for specific tests.

Test Suite Logic Options
In many cases, you may want to have SOAtest repeatedly perform a certain action until a certain condition is met. Test suite flow logic allows you to configure this.
Understanding the Options
To help you automate testing for such scenarios, SOAtest allows you to choose between two main test flow types: •

While variable: Repeatedly perform a certain action until a test variable condition is met. This requires test variables, described in “Defining Test Variables”, page 326, to be set.



While pass/fail: Repeatedly perform a certain action until a pass/fail condition is met (e.g., one of the tests in the test suite either passes or fails).

For example: •

A user submits some data to a web service, and then that submission results in other data being inserted into a database at a later time. The time at which the data is inserted into the database varies. To check this in SOAtest, you could define a DB tool with a chained assertor that fails while the data is not present. The test would then need to loop on this DB tool until it succeeds.



In a web application, the user enters some data and clicks a "Submit Query" button. If the data is not available, the application just shows a "data loading" message. The user repeatedly


clicks the button until some data appears. To check this in SOAtest, you could setup a Browser Testing Tool that performs a click action on the button, then chain to it a Browser Validation Tool that validates whether some element is present. The test would need to loop until the element appears. •

In a web application, search results are often present in a "paged" format, meaning that the results are distributed over multiple pages. If the result that you are looking for is not on the currently displayed page, you need to click the "Next" link until it appears. To check this in SOAtest, you could configure a Browser Testing tool that performs a click action on the "Next" link, with a Browser Validation tool that validates if the desired result is present. The test would then need to loop until the result appears.

Setting the Options
To configure test flow logic options that apply across the test suite: 1. Open the Execution Options> Test Flow Logic tab, then select the top-level node.

2. Select the desired flow type. •

You can choose from while variable or while pass/fail loop flow (see above for an explanation of the different types) or none (if you do not want execution flow to depend on a condition being met).

3. (Optional) Customize the Maximum number of loops setting, which determines the maximum number of loops to run if the specified condition is never met. 4. If you chose while pass/fail flow, specify the loop conditions by going to Loop until one of the test(s) and choosing succeeds or fails, depending on which outcome you want to occur before the test suite proceeds. 5. If you chose while variable flow, set the while and do conditions as follows: •



while: Select the desired variable from the drop-down list. The items in this list depend on the variables you added to the Test Variables tab. •

If the variable you select was defined as a boolean value, you will be able to select from either true or false radio buttons.



If the variable you select was defined as an integer, a second drop-down menu displays with == (equals), != (not equal), < (less than), > (greater than), <= (less than or equal to), >= (greater than or equal to). In addition, a text field is available to enter an integer.

do: Allows you to determine the action for the variable in the while loop. The following options are available: •

Nothing: If the variable condition is met, do nothing.



Increment: (For integer values only) If the variable condition is met, increment the variable.




Decrement: (For integer values only) If the variable condition is met, decrement the variable.



Negate: (For boolean values only) If the variable condition is met, negate the variable.

Test Flow Logic Tutorial For a step-by-step demonstration of how to apply while pass/fail test flow logic, see “Looping Until a Test Succeeds or Fails - Using Test Flow Logic”, page 97.

Test-Specific Logic Options The following options are available for specific tests:



Test Result Dependency: If the current (selected) test should run only if another test succeeds, fails, or is skipped, then specify the test that it depends on here. For example, if Test 4 depends on the results of Test 1, select Test 4 in the left panel, then choose Test 1 from the drop-down menu. Then, specify the condition under which the current test should run. Options are: •

Success: Select if the subsequent test case should be run according to the success of the test case selected in the Test drop-down menu. If the test case selected in the Test drop-down menu does not succeed, the subsequent test case will not run.



Failure: Select if the subsequent test case should be run according to the failure of the test case selected in the Test drop-down menu. If the test case selected in the Test drop-down menu does not fail, the subsequent test case will not run.



Skipped: Select if the subsequent test case should be run if the test case selected in the Test drop-down menu was skipped. If the test case selected in the Test drop-down menu is not skipped, the subsequent test case will not run.


Set-Up and Tear-Down Tests If any set-up or tear-down tests are available, they display in the left GUI panel and you will be able to configure test logic as follows: •

The execution of a test can be dependent on a set-up test.



A set-up test can be dependent on a previous set-up test.



A tear-down test can be dependent on a regular test, set-up test, or previous tear-down test.

This functionality allows you to stop a test (or run a test) if a setup test fails.



Variable Condition: Allows you to determine whether or not a test is run depending on variables added to the Test Variable table (for more information on adding test variables, see “Defining Test Variables”, page 326). If no variables were added, then the Variable Condition options are not available. The following options are available if variables were defined: •

Variable Condition drop-down: Select the desired variable from the drop-down list. The items in this list depend on the variables you added to the Test Variable table. •

If the variable you select was defined as an integer, a second drop-down menu displays with == (equals), != (not equal), < (less than), > (greater than), <= (less than or equal to), >= (greater than or equal to). In addition, a text field is available to enter an integer. For example:

If x != 13 (x does not equal 13), the test will run, however, if x does equal 13, the test will not be run. •

If the variable you select was defined as a boolean value, you will be able to select from either true or false radio buttons. For example:

If variable x1 is false, the test will run, however, if x1 is true, the test will not be run. •

Delay in milliseconds: Lets you set a delay before and/or after test execution.

Regression Options
The Regression Options controls allow you to customize how data sources are used in regression tests and which test suites have regression controls. Note that this tab is not applicable for Web functional tests. Available options are:




Use data source row numbers: (Default value) Select to enable all the Diff regression controls within the test suite to associate data source row numbers to the data generated by the Diff control. For example, Row N in a data source will be associated with the Row N control in the diff tool, regardless of what data source values are used. When this option is selected, there is no dependency between the data source values and the corresponding diff regression control (in the context of multiple regressions). •

Note: If a new row is inserted into or deleted from the data source, all multiple regression controls associated with that data source must be updated.



Use data source column names and values: Select to enable all the Diff regression controls within the test suite to associate the data source column names and values to the data generated by the Diff control. For example, the request which used A=1, B=2 in a SOAP Client will be associated with the control that has been labelled "A=1, B=2" and so on. When this option is selected, you can add and remove data source rows as you wish and the Diff will map the content to the correct control as long as the column names and values are unchanged. For more information on using data sources, see “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345.



Regression Controls Logic: This table allows you to configure which tests in a test suite SOAtest should create regression controls for. For each test entered in the table, you can select Always or Never. Regression controls will be updated accordingly the next time you update the regression controls for the test suite.

Defining Test Variables The Test Variables tab allows you to configure variables that can be used to simplify test definition and create flexible, reusable test suites. After a test variable is added, tests can parameterize against that test variable.

Understanding Test Variables
You can set a test variable to a specific value, then use that test variable throughout the current test suite to reference that value. This way, you don’t need to enter the same value multiple times—and if you want to modify the value, you only need to change it in one place. As an alternative to manually setting a test variable to a specific value, you can have a data bank tool (e.g., XML Data Bank) or Extension tool set the value of that test variable "on-the-fly." Moreover, if you have a referenced test suite (a test suite that is referenced by a parent test suite—see “Using Test Suite References”, page 317 for details), test variables can be used to access data sources from the parent test suite.

Adding Test Variables You can add a new variable as follows: 1. Click the Add button. 2. Enter a new variable name in the Name field. 3. Select either Integer, Boolean, String, or Data Source from the Type box. 4. Specify whether you want to use a local value or use a value from a parent test suite. •

Use value from parent test suite (if defined) - Choose this option if the current test suite is a "referenced" test suite and you want it to use a value from a data source in the parent test suite. See “Using Test Suite References”, page 317 for details on parent test suites.




Use local value - Choose this option if you always want to use the specified value— even if the current test suite has a parent test suite whose tests set this test variable. Note that if you reset the value from a data bank tool or Extension tool, that new value will take precedence over the one specified here.

5. (For data source type only) Specify the name of the data source and column where the appropriate variables are stored. The data source should be in the parent test suite (the test suite that references the current test suite). 6. Enter the variable value in the Value field. If you chose Use local value, the variable will always be set to the specified value (unless it is reset from a data bank tool or Extension tool). If you chose Use value from parent test suite, the value specified here will be used only if a corresponding value is not found in the parent test suite.

7. Click OK.

Using Test Variables Once added, variables can be... •

Used via the "parameterized" option in test fields. For instance, if you wanted to set a SOAP Client request element to use the value from the title test variable, you would configure it as follows:



Reset from a data bank tool (e.g., an XML Data Bank as described in “Configuring XML Data Bank Using a Wizard”, page 863).



Reset from an Extension tool (as described below in “Setting Test Variables and Logic Through Scripting”, page 328).



Used to define a test logic condition as described below in “Test Flow Logic”, page 322.


Setting Test Variables and Logic Through Scripting
Very often, test suite logic and variables will depend on responses from the service itself. Using an Extension tool, you can set a test suite variable in order to influence test flow execution. For example, if Test 1 returns a variable of x=3, then Test 2 will run. Via the TestSuiteVariable API, located in <SOAtest_62_Installation_Directory>/plugins/com.parasoft.xtest.web_<soatest_version>/help/api, you can access a test suite variable and either set it to a value, or get a value from it. Using this value, you can configure test flow logic. For example, you can enter the following into an Extension tool to set a variable:

from com.parasoft.api import Application

def setVar(input, context):
    context.setValue("x", input.toString())

To get a value from a TestSuiteVariable object x:

varValue = context.getValue("x")

Where varValue will be returned as a string. For instance, you can add an XML Transformer tool to a test and extract a certain value from that test. Then, you can add an Extension output to the XML Transformer and enter a script to get the value from the Transformer. Finally, you can set up a second test to run only if the correct value is returned from the first test.
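For example, a minimal Extension tool sketch that reads the test variable set above and increments it might look like the following. The variable name x follows the earlier example, and treating its value as an integer is an assumption about how the variable was defined:

def incrementVar(input, context):
    # getValue() returns the current value as a string (see above)
    current = int(context.getValue("x"))
    # store the incremented value so that later tests and flow logic can use it
    context.setValue("x", str(current + 1))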

Monitoring Test Variable Usage
To configure SOAtest to show what variables are actually used at runtime, set Console preferences (SOAtest> Preferences> Console) to use normal or high verbosity levels. After each test, the Console view (Show View> SOAtest> Console) will then display test variables used at runtime. For example:

Scenario: ICalculator
Test 1: first add - success
  get x=0
  set x=10.0
  set Test 1: type=xsd:float
Test 2: second add - success
  get x=10
  set x=20.0
Test 3: third add - success
  get x=20
  set x=30.0
Test 1: first add - success
  get x=30
  set x=50.0
  set Test 1: type=xsd:float
Test 2: second add - success
  get x=50
  set x=70.0
Test 3: third add - success
  get x=70
  set x=90.0

Viewing such variables is useful for diagnosing the cause of any issues that occur.

Tutorial For a step-by-step demonstration of how to use test variables, see “Creating Reusable (Modular) Test Suites”, page 92.


Specifying SOAP Client Options You can customize the following options in the SOAP Client Options tab of the test suite configuration panel: •

Endpoint: If you would like to specify an endpoint for all tests within the test suite, enter an endpoint and click the Apply Endpoint to All Tests button.



Timeout after (milliseconds): If you do not want to use the default, select Custom from the drop-down menu and enter the desired time. The default value is 30000.



Attachment Encapsulation Format: Select Custom from the drop-down menu and select either MIME, DIME, MTOM Always, or MTOM Optional. The default value is MIME.



SOAP Version: Select Custom from the drop-down menu and select either SOAP 1.1 or SOAP 1.2. The default value is SOAP 1.1.



Outgoing Message Encoding: Allows you to choose the encoding for outgoing messages. You can choose any Character Encoding you wish from the SOAtest Preferences panel to read and write files, but the Outgoing Message Encoding provides additional flexibility so you can set a different charset encoding for the SOAP request from the global setting.

Specifying Browser Playback Options The Browser Playback Options tab is divided into four sections: •

Browser Playback: Describes the browser type to use in the test. The panel contains a drop-down menu and three radio buttons (Firefox, Internet Explorer, Both). If the drop-down menu is set to Default, then all three radio buttons will be disabled, and this test suite will choose which browsers to replay the test in based on its parent’s choice. If there is no parent (i.e., the root test suite), then the default option is to run the test back in both Firefox and Internet Explorer. If Custom is selected from the drop-down menu, then the selected test suite, and any of its child test suites with Default selected in the same drop-down box, will play back the test in the selected browser. •

Firefox: If this option is selected, SOAtest will play back the test in the Firefox browser using the appropriate Firefox executable (see part 2 below). SOAtest supports Mozilla Firefox 1.5 and higher.



Internet Explorer: If this option is selected, SOAtest will play back the test in the Internet Explorer browser (Windows machines only). SOAtest supports Internet Explorer 6 and higher. •

Run in specified browser only: Enable this option if you want to ensure that this test is never played in an alternate browser (e.g., because the web page structure is significantly different on other browsers and the scenario would need to be constructed differently on another browser).



Both: If this option is selected, SOAtest will play back the test in both Firefox and Internet Explorer.



Firefox executable path: This value is inherited from its parent if the Default option is selected for Browser type. You may select which version of Firefox to use by selecting the option from the drop-down menu, or by clicking the Browse to executable button and selecting the appropriate version. Note: On Windows machines, SOAtest will attempt to detect a Firefox installation automatically. Linux users will have to browse to the Firefox executable.




Visibility: Describes the visibility of the tests as they play back. This option is inherited from its parent if Default is selected. You may choose Headless or Visible if Custom is selected. •


In Headless mode, you will not be able to see the tests as they run (i.e. the browser will not be visible while the test is running). The following support is available for Headless mode: •

Windows: Fully supported



Linux: Supported on Linux 2.4.21-27.0.2 kernel builds and later (tested on Red Hat, Debian, and Mandrake Architectures)



Solaris: Supported on Solaris 9 and 10

In Visible mode, you will be able to watch in the browser as the test runs and be able to visually verify that the test ran correctly.

Authentication: Allows you to specify a Username and Password for Basic and NTLM authentication of your web application. If you enter a username/password during recording, this section will be automatically configured in the recorded scenario. However, you can later go back and modify the settings. The settings in this section are also inheritable from parent test suites. Therefore, if you have many functional test scenarios that require authentication, you can specify the settings in a high-level test suite that contains all the functional scenarios.

The Perform Authentication checkbox specifies whether to send authentication credentials to the web application. If it is checked, credentials will be sent.

Specifying Reporting Options
The Reporting Options tab allows you to customize meta-attributes that SOAtest sends to different reporting tools while running the current test suite.
To specify reporting attributes for CentraSite Active SOA:
1. Select CentraSite from the Reporting options drop-down menu.
2. Specify the UDDI Service Key.
To specify test-suite-specific reporting attributes for Parasoft Report Center:
1. Select Report Center Attributes from the Reporting options drop-down menu.
2. Click Add, then specify reporting attributes.

For details on attributes, see “Configuring Report Center Attributes”, page 204.


Adding Standard Tests

This topic explains how to add standard tests, that is, tests that are added to a test suite and executed in the order in which they are listed.
To add a standard test case to a test suite:
1. Select the Test Case Explorer tree node that represents the Test Suite you want to extend.

2. Click the Add Test or Output toolbar button.

The Add Test wizard opens and displays a list of available tools.



To learn about a particular tool, select it and review the description that displays.

3. In the left pane, select Standard Test.
4. In the right pane, select the tool you want to use.
5. Click Finish.
6. Double-click the node added for that tool, then review and modify settings as needed in the tool configuration panel that opens on the right.

Most settings vary from tool to tool.



If an Input Tab is shown in the tool configuration panel, you need to specify what text or file you want the tool to operate on. To do this, complete the tab as follows:

Text: Use this option if you want to type or copy the document into the UI. Select the appropriate MIME type, then enter the text in the text field below the Text radio button.



File: Use this option if you want to use an existing file. Click the Browse button to choose a file. Check the Persist as Relative Path option if you want the path to this file to be saved as a path that is relative to the current configuration file. Enabling this option makes it easier to share tests across multiple machines. If this option is not enabled, the test suite will save the path to this file as an absolute path.


Adding Set-Up and Tear-Down Tests

This topic explains how to add Set-Up and Tear-Down tests, that is, tests that are executed before or after the rest of the test suite. SOAtest’s Set-Up and Tear-Down tests mirror the setUp and tearDown operations in JUnit test cases.
To add a Set-Up or Tear-Down test:
1. Select the Test Case Explorer tree node that represents the Test Suite you want to extend.

2. Click the Add Test or Output toolbar button.

The Add Test wizard opens and displays a list of available tools.



To learn about a particular tool, select it and review the description that displays.

3. In the left pane, select Set-Up Test (if you want the tool to run before the test suite executes) or Tear-Down Test (if you want the tool to run after the test suite executes).
4. In the right pane, select the tool you want to use.
5. Click Finish.
6. Double-click the node added for that tool, then review and modify settings as needed in the tool configuration panel that opens on the right.

Most settings vary from tool to tool.



If an Input Tab is shown in the tool configuration panel, you need to specify what text or file you want the tool to operate on. To do this, complete the tab as follows:

Text: Use this option if you want to type or copy the document into the UI. Select the appropriate MIME type, then enter the text in the text field below the Text radio button.



File: Use this option if you want to use an existing file. Click the Browse button to choose a file. Check the Persist as Relative Path option if you want the path to this file to be saved as a path that is relative to the current configuration file. Enabling this option makes it easier to share tests across multiple machines. If this option is not enabled, the test suite will save the path to this file as an absolute path.


Adding Test Outputs

This topic explains how you can access and manipulate test case output. Sections include:

Understanding Outputs



Adding an Output



SOAP Client Output Options



Messaging Client Output Options



General Tool Output Options

Understanding Outputs
You can have multiple tools perform operations on the result of one tool by adding multiple outputs to the appropriate tool node. Or, you can have one output perform an operation on the result of another output by adding an additional output to an existing output node. To specify one or more outputs for a tool added to a test suite, follow the procedure in “Adding an Output” below.
Test suite tests based on a SOAP Client tool typically use outputs to operate on the messages that the SOAP Client tool returns. For example, if you wanted to test whether a certain SOAP remote procedure call always returned the same response for a given input, you might create a SOAP Client tool that sent the specified input, then use a Diff tool to verify the response. You could also use outputs to send the HTTP traffic from this test to the Results window so you could view the traffic. Or, if you wanted to test whether a Web service returned values in a correct format, you might create a SOAP Client tool, then use a Coding Standards output to apply a set of rules that checked whether the output matched a certain required pattern. If it did, you could attach a regression control to the Coding Standards output; SOAtest would then alert you if the Web service failed to match that required pattern in subsequent tests. Or, if you have an Extension tool that retrieves XML-format data, you could customize that tool so that it always sends its output to a RuleWizard rule that verifies whether the data is correct.
If you want to apply a transformation tool such as XSLT and would like the transformed source saved in a file, you need to chain a File Writer tool to the original tool. You can also use outputs to save the results of file transformations. For example, you could use the XSLT tool to transform a set of files, then send the resulting files to a Write File output so that they would be saved.
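For example, one way such a chain might appear in the Test Case Explorer is sketched below. The arrangement is only an illustration; the tool names follow the output options described later in this topic.

Test 1: SOAP Client
    Response > SOAP Envelope > Diff                  (verifies the response payload)
    Both > Traffic Object > Traffic Viewer           (captures the request/response traffic)
    Both > Traffic Stream > File (FileStreamWriter)  (saves the raw traffic to disk)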

Adding an Output
To add an output:
1. Select the Test Case Explorer tree node that represents the test that you want to add an output for.

2. Click the Add Test or Output toolbar button.

The Add Test wizard opens and displays a list of available tools.



To learn about a particular tool, select it and review the description that displays.

3. (Web tests only) Specify whether you want to use data from the browser contents (rendered HTML) or the original HTTP Traffic, then click Next.







Browser contents refers to the real-time data model that the browser constructed from all of the HTML, JS, CSS, and other files it loaded. Choose Browser contents if you want to validate values that appear in the browser (i.e., use the final DOM that the browsers construct from the server code). This option allows you to:

Add a Browser Validation Tool to validate values that appear in the browser.



Add an Extension Tool for complex validations that require significant scripting.



Add a Browser Data Bank that extracts a value that appears in the browser.

HTTP traffic refers to the individual HTTP messages that the browser made in order to construct its data model. Choose HTTP traffic if you want to use the content returned by the server as is (before any browser processing). This option allows you to validate individual HTTP requests/responses.

4. In the left pane of the Add Output wizard, select the node that specifies the output type you want to use as a tool input.
5. In the right pane, select the tool you want to use.
6. Click Finish.
7. Double-click the node added for that tool, then review and modify settings as needed in the tool configuration panel that opens on the right.

SOAP Client Output Options
In the left pane of the Add Output wizard are three sub-menus: Response, Request, and Both. In the right pane of the Add Output wizard are the New or Existing Output tools that you may select.





Response: Sends the SOAP response to the following choices:

Transport Header: Allows you to add the desired New/Existing Output tool to the HTTP Header or JMS properties specified as the Transport of the SOAP Client tool.



SOAP Envelope: Allows you to add the desired New/Existing Output tool to the SOAP envelope.



Attachment: Allows you to add the Attachment Handler tool as the output.

Request: Sends the SOAP request to the following choices:

Transport Header: Allows you to add the desired New/Existing Output tool to the HTTP Header or JMS properties specified as the Transport of the SOAP Client tool.



SOAP Envelope: Allows you to add the desired New/Existing Output tool to the SOAP envelope.

Both: Sends both the SOAP response and SOAP request to the following choices:

Traffic Object: Allows you to add an Extension, Traffic Viewer, or WS-I tool as an output.



Traffic Stream: Sends the HTTP traffic (SOAP request and SOAP response) to the selected location. Available options include:

File: Sends the output to a file. If you choose this option, be sure to select the new FileStreamWriter output node, then specify where you want the file written in the control panel that opens. Moreover, if you want to ensure that this file’s path is always relative to your test file, enable the Persist as Relative Path to Test option in that same panel.



Results: Sends output to the GUI Message panel.





stderr: Sends standard error to a console window.



stdout: Sends standard output to a console window.

Messaging Client Output Options
In the left pane of the Add Output wizard are three sub-menus: Response, Request, and Both. In the right pane of the Add Output wizard are the New or Existing Output tools that you may select.





Response: Sends the HTTP response to the following choices:

Transport Header: Allows you to add the desired New/Existing Output tool to the HTTP Header or JMS properties specified as the Transport of the Messaging Client tool.



HTTP Response: Allows you to add the desired New/Existing Output tool.

Request: Sends the HTTP request to the following choices:

Transport Header: Allows you to add the desired New/Existing Output tool to the HTTP Header or JMS properties specified as the Transport of the Messaging Client tool.



SOAP Envelope: Allows you to add the desired New/Existing Output tool to the SOAP envelope.

Both: Sends both the HTTP response and HTTP request to the following choices:

Traffic Object: Allows you to add an Extension, Traffic Viewer, or WS-I tool as an output.



Traffic Stream: Sends the HTTP traffic (HTTP request and HTTP response) to the selected location. Available options include:

File: Sends the output to a file. If you choose this option, be sure to select the new FileStreamWriter output node, then specify where you want the file written in the control panel that opens. Moreover, if you want to ensure that this file’s path is always relative to your test file, enable the Persist as Relative Path to Test option in that same panel.



Results: Sends output to the GUI Message panel.



stderr: Sends standard error to a console window.



stdout: Sends standard output to a console window.

General Tool Output Options
You can typically send the output of most tools (other than the SOAP Client tool) directly to a tool of your choice. In addition, you can send SOAP Client output to tools if you first choose the XML Request output option. Some tools (such as Coding Standards) have two outputs:

Messages about the test.



The transformed source produced during the test.

In these cases, you will see the following options:

Messages: Sends the previous tool’s result messages to the specified tool.





Transformed Source: Sends the source code created/modified by the previous tool to the next specified tool. For example, you would use this option if you wanted to send the file that results from an XSLT tool to the Validate XML tool.


Adding Global Test Suite Properties

This topic explains how to create JMS, XPath, SOAP Header, Database, Keystore, Tool, and Input properties that can be shared and referenced globally across a test suite. Sections include:

Global JMS Connection Properties



Global Ignored XPath Properties



Global SOAP Header Properties



Global Database Account Properties



Global Key Stores



Global Tools



Global WS-Policy Banks

Global JMS Connection Properties
When creating a large test suite with multiple tools, there may be instances where some tools (i.e., the SOAP Client, Messaging Client, and Call Back tools) use the same JMS Connection Properties. Rather than manually entering the same information into each tool or copying and pasting settings between tools, it may be easier to create JMS settings that each tool can reference. In this case, you can create global JMS Connection Properties at the test suite level.
To create global JMS Connection Properties, complete the following:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Property> JMS Connection Properties from the Add Global wizard and click Finish. A Properties node displays in the Test Case Explorer tree and the JMS Connection Properties panel displays in the right side of the GUI.
3. Specify the settings in the JMS Connection Properties panel as follows:
a. If you want to change the default name, enter the new name in the Name field. This will be the name that appears in the SOAP Client, Messaging Client, and Call Back tools from which you will reference these properties. Because you can create more than one global reference for JMS Connection Properties, the name you enter should be intuitive to its use.
b. Click the Add Property to All Tests button (if you don’t click this button, the global properties you add will be ignored by the tests in the test suite).
c. Depending on what you select from the drop-down list, one of the following will occur:
• If Use Shared Property Only is selected from the drop-down list, the corresponding tests in the test suite will be able to only use the global property you added.
• If Use Local and Shared Properties is selected from the drop-down list, the corresponding tests in the test suite will be able to use the global property you added and any properties configured within the individual test itself.
d. In the Provider URL field, specify the location of the JMS Administered Objects.
e. In the Initial Context field, specify the Java class that contains all the JMS properties mappings.



f. In the Connection Factory field, specify the key used to look up the MOM-specific factory from the initial context. This can be either a Queue Connection Factory or a Topic Connection Factory.

g. In the Authentication area, select the Perform Authentication checkbox and enter the Username and Password to authenticate the request. If the correct username and password are not used, the request will not be authenticated.
Note: Only the SOAP Client, Messaging Client, and Call Back tools can reference global JMS Connection Properties. After specifying the global JMS Connection Properties, you can share these properties across multiple instances of these SOAtest tools.
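For example, a global JMS Connection Properties entry that points at a local Apache ActiveMQ broker might use values along these lines (the broker location and JNDI names are illustrative and depend on your own MOM configuration):

Provider URL: tcp://localhost:61616
Initial Context: org.apache.activemq.jndi.ActiveMQInitialContextFactory
Connection Factory: ConnectionFactory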

Global Ignored XPath Properties
As with global JMS properties, there may be instances when you have multiple Diff tools that use the same XPath settings. Rather than manually entering the same information into each Diff tool or copying and pasting settings between Diff tools, it may be easier to create XPath settings that each Diff tool can reference. In this case, you can create global XPath Properties at the test suite level.
To create a global list of ignored XPaths, complete the following:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Property> Ignored XPaths from the Add Global wizard and click Finish. A Properties node displays in the Test Case Explorer tree and the Ignored XPaths panel displays in the right side of the GUI.
3. Specify the settings in the Ignored XPaths panel as follows:
a. If you want to change the default name, enter the new name in the Name field. This will be the name that appears in the Diff tools from which you will reference these XPaths. Because you can create more than one global reference list for XPaths, the name you enter should be intuitive to its use.
b. Click the Add Property to All Tests button (if you don’t click this button, the global properties you add will be ignored by the tests in the test suite). Depending on what you select from the drop-down list, one of the following will occur:
• If Use Shared Property Only is selected from the drop-down list, the corresponding tests in the test suite will be able to only use the global property you added.
• If Use Local and Shared Properties is selected from the drop-down list, the corresponding tests in the test suite will be able to use the global property you added and any properties configured within the individual test itself.
c. Click the Add button. An empty field displays in the XPath column of the Ignored XPaths List. By default, the Settings column is automatically filled in with all XPath operations specified, meaning that the entire XPath you add will be ignored.
d. Using the Ignored XPath setting dialog that opens when you double-click in the XPath column, specify an XPath position. The XPath you enter can be shared by multiple Diff tools within the test suite.

• If you want to ignore more than one attribute at an element’s XPath location, leave the attribute name empty or use the wildcard * (e.g., myAttribute*).
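For example, if the responses you are diffing contain a dynamically generated timestamp element, you might add an ignored XPath along the lines of the following. The element names here are illustrative; use the exact form produced by the Ignored XPath setting dialog for your own messages.

/Envelope/Body/getQuoteResponse/timestamp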



Global SOAP Header Properties
When creating a large test suite with multiple tools, there may be instances where SOAP Client tests will use the same SOAP Header Properties. Rather than manually entering the same information into each test or copying and pasting settings between tests, it may be easier to create SOAP Headers that each test can reference. In this case, you can create global SOAP Header Properties at the test suite level.
To create a global SOAP Header, complete the following:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Property> SOAP Headers from the Add Global wizard and click Finish. A Properties node displays in the Test Case Explorer tree and the SOAP Headers panel displays in the right side of the GUI.
3. Specify the settings in the SOAP Headers panel as follows:
a. If you want to change the default name, enter the new name in the Name field.
b. Click the Add Property to All Tests button (if you don’t click this button, the global properties you add will be ignored by the tests in the test suite). Depending on what you select from the drop-down list, one of the following will occur:
• If Use Shared Property Only is selected from the drop-down list, the corresponding tests in the test suite will be able to only use the global property you added.
• If Use Local and Shared Properties is selected from the drop-down list, the corresponding tests in the test suite will be able to use the global property you added and any properties configured within the individual test itself.
c. Click the Add button. The Choose Header Type dialog displays.
d. Select a SOAP Header type from the Available Header types list and click OK.
e. Configure the SOAP Header parameters as needed. For more information on each SOAP Header, see “Adding SOAP Headers”, page 811.



Global Database Account Properties
When creating a large test suite with multiple tools, there may be instances where DB tools will use the same Database Properties. Rather than manually entering the same information into each tool or copying and pasting settings between tools, it may be easier to create a Database Account that each tool can reference. In this case, you can create global Database Account Properties at the test suite level.
To create a global Database Account, complete the following:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Property> Database Account from the Add Global wizard and click Finish. A Properties node displays in the Test Case Explorer tree and the Database Account panel displays in the right side of the GUI.
3. Specify the settings in the Database Account panel as follows:
a. If you want to change the default name, enter the new name in the Name field.
b. Click the Add Property to All Tests button (if you don’t click this button, the global properties you add will be ignored by the tests in the test suite). Depending on what you select from the drop-down list, one of the following will occur:
• If Use Shared Property Only is selected from the drop-down list, the corresponding tests in the test suite will be able to only use the global property you added.
• If Use Local and Shared Properties is selected from the drop-down list, the corresponding tests in the test suite will be able to use the global property you added and any properties configured within the individual test itself.
c. Configure the rest of the Database Account settings as needed:

• If the account settings are stored in a file, select File, then specify the path to that file.



To refresh/reload the file (e.g., if you edited it outside of SOAtest), click Refresh Configuration Settings.

• If you want to specify the settings in this panel, select Local, then specify the settings in the Driver, URL, Username, and Password fields.

To export these values to a file, click Export Configuration Settings. Once the values are exported to a file, you can import the file through the File> Input File control (described above). This way, you won’t have to re-type the same values if you want to add this same account to a different test suite.

Global Key Stores
Key Stores contain the certificates and private keys needed to secure Web services through means such as server/client authentication, XML encryption, and XML digital signatures. The values you specify in a Key Store will be available to use with the SOAP Client, XML Encryption, and XML Signer tools. The SOAP Client tool can use a Key Store certificate to complete the handshake with a server. The XML Encryption tool can use a Key Store certificate to encrypt XML documents, and the XML Signer tool can use a Key Store certificate and private key to sign and verify your identity in an XML document.



To add a key store:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Key Store from the Add Global wizard and click Finish. A Properties node displays in the Test Case Explorer tree and the Key Store panel displays in the right side of the GUI.
3. If you want to change the default name, enter the new name in the Name field.
4. Specify the settings in the Key Store panel’s Certificate tab as follows:
a. Select Use same key store for private key if the Key Store contains private keys for the certificate.
b. In the Key Store File field, specify the key store file by clicking the Browse button and using the file chooser that opens. If you want the path saved as a relative path (for example, to facilitate project sharing), check the Persist as Relative Path option.
c. In the Key Store Password field, specify the Key Store password and select the Save option if you want to save the password on future runs of the test.

d. In the Key Store Type box, select the type of Key Store being used (e.g., JKS, PKCS12, BKS, UBER).
e. Click Load to populate the aliases with the available certificates/keys (if the path, type, and key store password are valid), then choose the certificate alias in the Certificate Alias box.
5. Specify the settings in the Key Store panel’s Private Key tab as follows:
a. In Key Store File, specify the key store file by clicking the Browse button and using the file chooser that opens. If you want the path saved as a relative path (for example, to facilitate project sharing), check the Persist as Relative Path option.

This field is only available if the Key Store Contains Keys option is unselected in the Certificate tab.

b. In Key Store Password, specify the Key Store password and select the Save option to remember the password on future runs of the test.

This field is only available if the Key Store Contains Keys option is unselected in the Certificate tab.

c. In Key Store Type, specify the type of Key Store being used (e.g., JKS, PKCS12, BKS, UBER).

This field is only available if the Key Store Contains Keys option is unselected in the Certificate tab.

d. Click Load to populate the aliases with the available certificates/keys (if the path, type, and key store password are valid), then choose the private key alias in the Private Key Alias box.
e. In Private Key Password, specify the private key password and select the Save option to remember the password on future runs of the test.

Global Tools
If you expect to use certain specialized tools (for example, a particular XSLT tool, or a "chained" tool that operates on a request or response, then sends its output to additional tools, which can send their output to additional tools, and so on) only within the context of the current test suite, you can add them to the test suite tool repository, then add them to the test suite without recreating them each time. (If you plan to use a specialized tool for multiple test suites, you should add it to the program via the Tools panel available when you choose Tools> Customize.)
To add a tool to a test suite’s tools repository:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.
2. Select Global Tool> [Tool_name] from the Add Global wizard and click Finish. A new tool node displays in the Test Case Explorer tree (under the Tools branch, which will be added if it did not already exist) and a tool configuration panel displays in the right side of the GUI.

3. Customize that tool’s settings in the tool configuration panel that opens.
4. You can chain additional tools to that tool as described in “Adding an Output”, page 333.
To use a repository tool in a test, select it from the Existing Tools list that is available when you add a test or output.

Global WS-Policy Banks
One of the most important aspects of Web services is interoperability. Web services rely on a standardized interface to declare what requirements must be met in order for a service consumer to interact with a service provider. The basic WSDL specification does not have the capacity to declare complex client-side requirements. To accommodate this, WSDL is extended with WS-Policy and WS-PolicyAttachment, allowing a service provider to define additional requirements within the WSDL. WS-Policy leaves it up to other WS-* specifications to define their own sets of policies. One such specification is WS-SecurityPolicy, which defines policies related to WS-Security. When reading a WSDL with WS-SecurityPolicy extensions, SOAtest automatically generates the test cases with all the necessary policy-related configurations. Some attributes of the test case still require manual configuration, but SOAtest will automatically set up the foundation.
Note: WS-Policy is a lightweight specification. It delegates policy design to the WS-* specifications; in addition, there is a large set of proprietary policies. Since the WS-* space is large, SOAtest only supports WS-SecurityPolicy assertions, but will continue to extend the processor to handle other common assertion sets.
To add a global WS-Policy Bank, complete the following:

1. Select the desired test suite node and click the Add Property button. The Add Global wizard displays.



2. Select WS-Policy Bank from the Add Global wizard and click Finish. A new WSDL Policies node displays in the Test Case Explorer tree (under the WS-Policy Banks branch, which will be added if it did not already exist), a WSDL Policies configuration panel displays in the right side of the GUI, and various WS-Security tests are chained to the SOAP Client tools.

3. Specify the settings in the WSDL Policies panel as follows:
a. If you want to change the default name, enter the new name in the Name field.
b. In WSDL URI, specify the WSDL URI where this Web service can be accessed. You can either enter a WSDL or click the Browse button.
c. Click Refresh from WSDL to refresh the WSDL from the given location URL and reparse it.
In the Global Policies area, review the policy definitions in non-XML format, as well as the policy alternatives implied by your WSDL. Each section in the left-hand tree represents a global policy element in the WSDL.


Reusing and Reordering Tests

This topic explains how to reorder tests as well as reuse tests and other test assets. You can drag and drop tests to reorder them. In addition, you might want to cut/copy and paste tests in the following situations:

You want to create new tests that are very similar to existing tests.



You want to change the order of test execution (tests are executed in the order in which they are listed in the Test Case Explorer).



You want to copy tests or test suites from one project to another.

To copy or cut a test or test suite:
1. Right-click the Test node or Test Suite node that represents the test or test suite that you want to cut/copy.
2. Choose Cut or Copy from the shortcut menu.
To paste a test or test suite at the end of a test suite:
1. Right-click the Test Suite node.
2. Choose Paste.
To paste a test or test suite in a specific position in a test suite:
1. Right-click the Test node above which you want the test case or test suite pasted.
2. Choose Paste.

Copying/Pasting Other Test Assets
You can also copy/paste other test assets (such as chained tools, data sources, and transport settings) as needed.


Parameterizing Tests (with Data Sources or Values from Other Tests)

This topic explains how you can quickly extend the scope and comprehensiveness of your functional testing by parameterizing tests with values that are stored in data sources or extracted from other tests. Parameterization can be applied to test case inputs as well as data validation. Sections include:

Parameterizing Tests with Values Extracted from Another Test



Understanding How SOAtest Performs Functional Testing Using Data Sources



Adding a Data Source

Adding a CSV File Data Source



Adding a Database Data Source



Adding an Excel Spreadsheet Data Source



Specifying Data Source Values in a Table



Combining Multiple Data Sources into an Aggregate Data Source



Adding a File Data Source



Adding a Writable Data Source



Setting Up a "One-to-Many" Data Source Mapping



Generating a Data Source Template for Populating Message Elements



Cutting, Copying, and Pasting Data Sources



Performing Functional Tests Using Data Sources







Understanding Data Source Iteration



Configuring the SOAP Client and Diff Tools to Use Data Sources



Parameterizing Arrays of Varying Size

Using Interpreted Data Sources

Generating a Data Source from the fields of a Java Bean



Interpreted Data Source Table Format

Populating and Parameterizing Elements with Data Source Values

Parameterizing Tests with Values Extracted from Another Test
You can parameterize tests by extracting values from one test and then making them available in another test. This is accomplished with tools such as:

XML Data Bank



Browser Data Bank



Header Data Bank



JSON Data Bank



Object Data Bank



Text Data Bank



Understanding How SOAtest Performs Functional Testing Using Data Sources
Another way to parameterize tests is with data source values. For example, you can configure SOAtest to send data source values as part of a request to a server. The values that SOAtest receives in response can then be compared to another data source value to check if the response received is correct. SOAtest will check each available combination of data source rows. This behavior is particularly useful if you want SOAtest to perform functional testing on a number of different inputs stored in a data source. For more information, see “Performing Functional Tests Using Data Sources”, page 354.
SOAtest can perform functional testing using values from any of the following types of data sources:

CSV files



Databases



Excel spreadsheets



Tables created in (or copied into) the internal table editor.



File



Writable



Aggregate

Tip: Generating a Data Source Template for Populating Message Elements
Manually creating a data source for parameterizing large, complex XML messages can be time-consuming and tedious. For a fast way to accomplish this, have SOAtest automatically generate a CSV data source template based on the structure of the request or response message that you want to parameterize. Columns in the generated data source are automatically mapped to the appropriate elements in the request or response message. The only thing you need to do is add values to the generated data source template. For details, see “Generating a Data Source Template for Populating Message Elements”, page 353.

Adding a Data Source
Data sources are added at the test suite level and saved in .tst files. You can specify any number of data sources for a test suite, and you can use any specified data sources throughout a test suite’s tests. In addition, you can create an aggregate data source in which you can combine the values of other available data sources into a single data source. This is especially useful if you would like to perform a functional test that needs to draw values from multiple data sources. For example, in sending a request to a server, you may want to send values from a data source that contains user information such as a first and last name, and you may also want to send values from a separate data source that contains the user’s login and password information. By combining the two data sources into a single aggregate data source, you can create a single test instead of having to create separate tests for each data source.
The procedure for adding a data source depends on the type of data source you want to add. The following topics explain how to add the available types of data sources:

Adding a CSV File Data Source





Adding an Excel Spreadsheet Data Source



Specifying Data Source Values in a Table



Adding a Database Data Source



Combining Multiple Data Sources into an Aggregate Data Source



Adding a File Data Source



Adding a Writable Data Source

Once a data source is added, it will be represented in the Data Sources branch of the Project tree. SOAtest will add one node for each available data source. To view or change data source settings, select that node, then view or modify the options listed in the right GUI panel. If you right-click a data source node that is not a table, the Create Table option will be available in the shortcut menu. Selecting this option creates a new table data source that contains the same data and settings as the original data source that was right-clicked. This newly created data source will be added as a node to the Data Sources branch of the project tree. For more information on table data sources, see “Specifying Data Source Values in a Table”, page 349.

Adding a CSV File Data Source
To add a CSV file data source:

1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select CSV and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.

If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in the From field and 5 in the To field.

5. Specify the path to the CSV file in the File Path field.
6. Specify the type of separator and quotes that the file uses.
7. If you want to see a list of the columns from that data source, click Show Columns.

SOAtest assumes that the first row of values represents your column titles. If they do not, you might have trouble identifying and selecting your data source columns in SOAtest. If you want SOAtest to use different column titles, update the first row of your data source, then click the Show Columns button.
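For example, a minimal CSV data source whose first row supplies the column titles might look like the following (the column names and values are purely illustrative):

firstName,lastName,zipCode
John,Doe,91016
Jane,Smith,10001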

Adding a Database Data Source
To add a database data source:

1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.



2. Select Database and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.

If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in the From field and 5 in the To field.

5. Specify the Database Configuration parameters. For more information, see the following section.

Database Configuration Parameters
To configure the settings in the Database Configuration panel:
1. In the Driver class box, select the type of driver to use. The following are download links for common drivers:

Oracle: http://www.oracle.com/technology/software/tech/java/sqlj_jdbc/index.html



MySQL: http://dev.mysql.com/downloads/connector/j/



Windows System DSN: This driver is included with Java.



SQLServer: http://msdn.microsoft.com/data/jdbc/



Sybase: http://www.sybase.com/products/allproductsa-z/softwaredeveloperkit/jconnect



DB2: http://www-306.ibm.com/software/data/db2/java/



Other: http://developers.sun.com/product/jdbc/drivers

2. Specify the settings for that particular type of driver. Settings will vary from driver to driver. Common settings include:

Driver class: Type the path to the appropriate JDBC driver class, including the package name. For example, you might enter the following path if you were using an Oracle database: oracle.jdbc.driver.OracleDriver.

The driver that you enter must be available on your CLASSPATH; if it is not, SOAtest will not be able to access your database. To add the driver jars (or zip files) to the CLASSPATH, go to SOAtest> Preferences, then add the file(s) under the JDBC Drivers section.

URL: Type the appropriate URL. For example, you might enter the following if you were using an Oracle database: jdbc:oracle:thin:@bear:1521:mydb

Here are some examples of drivers and URLs you might use for different databases:

Oracle:
Driver: oracle.jdbc.driver.OracleDriver
URL: jdbc:oracle:thin:@host:port:dbName



MySQL:
Driver: com.mysql.jdbc.Driver
URL: jdbc:mysql://host:port/dbName





Windows System DSN:
Driver: sun.jdbc.odbc.JdbcOdbcDriver
URL: jdbc:odbc:DATABASE_NAME (where DATABASE_NAME is the database name from your System DSN settings)



SQLServer:
Driver: com.microsoft.sqlserver.jdbc.SQLServerDriver
URL: jdbc:sqlserver://host:port;DatabaseName=DATABASE_NAME

Note that older drivers may have different settings.



Sybase:
Driver: com.sybase.jdbc2.jdbc.SybConnectionPoolDataSource
URL: jdbc:sybase:Tds:host:port/dbName



DB2:
Driver: COM.ibm.db2.jdbc.app.DB2Driver (included in db2java.zip, which comes with the DB2 run-time client)
URL: jdbc:db2://host/dbName



Username: Type a valid username for this database (if the database requires passwords).



Password: Type the password for the given username (if the database requires passwords).



SQL Query: Type or copy the SQL query that expresses which data you want to use.

If you want to check what column names SOAtest is using, click the Show Columns button. If you want SOAtest to use different column titles for the existing columns, update your database column names, then click the Show Columns button. If you want SOAtest to use different columns, update your SQL query so that it retrieves the appropriate columns, then click the Show Columns button.
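For example, a database data source that reads rows from a hypothetical MySQL database named customers might be configured with values such as the following (all names here are illustrative):

Driver class: com.mysql.jdbc.Driver
URL: jdbc:mysql://localhost:3306/customers
Username: soatest_user
Password: ********
SQL Query: SELECT first_name, last_name, zip_code FROM customer_records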

Adding an Excel Spreadsheet Data Source
To add an Excel spreadsheet data source:

1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Excel and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.

If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in the From field and 5 in the To field.

5. Specify the path to the Excel file in the File Path field.
6. Select the sheet of the specified Excel file you would like to use from the Sheet menu.
Important: SOAtest assumes that the first row of values represents your column titles. If they do not, you might have trouble identifying and selecting your data source columns in SOAtest. If you want SOAtest to use different column titles, update the first row of your data source, then click the Show Columns button.

Specifying Data Source Values in a Table


To specify data source values by entering or pasting them into an internal table editor:

1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Table and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.

If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in the From field and 5 in the To field.

5. If you want to specify column names (rather than use the default A, B, C, D, etc.), check First row specifies column names.
6. Add the data values by typing or pasting them into the table. You can copy from popular spreadsheets such as Excel. Note that the table editor contains standard copy/cut/paste editing commands (when a cell is selected) as well as commands to insert rows or tables. To add more rows, use the downward arrow key or the downward arrow scrollbar button. To add more columns, use the right arrow key or the right arrow scrollbar button.
Note: You can add columns to a table data source by right-clicking on a column header and selecting Insert column or Insert multiple columns from the shortcut menu.

Combining Multiple Data Sources into an Aggregate Data Source
You can create an aggregate data source in which you can combine the values of other available data sources into a single data source. This is especially useful if you would like to perform a functional test that needs to draw values from multiple data sources. For example, in sending a request to a server, you may want to send values from a data source that contains user information such as a first and last name, and you may also want to send values from a separate data source that contains the user’s login and password information. By combining the two data sources into a single aggregate data source, you can create a single test instead of having to create separate tests for each data source.
To combine multiple data sources into an aggregate data source:

1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Aggregate and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.

If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in the From field and 5 in the To field.



5. Choose the desired data sources from the Available box and click the Add button to add them to the Selected box.

The Available box contains all of the data sources added to the test suite. After selecting and adding the desired data sources to the Selected box, the column names contained in the added data sources display in the Columns box.

Adding a File Data Source
To add a File data source:

1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select File and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.
3. (Optional) Change the data source label in the Name field of the Data Source configuration options.
4. Use the Rows controls to indicate the range of rows you want to use.

If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in the From field and 5 in the To field.

5. Specify the file or directory to import files from. All of the files available in the specified location will display in the table. Right-click options allow you to cut, copy, and paste values in as well.

To filter which files are used by the File Data Source, enter a string in the File Filter field. For example:

* = wild card for any string



*.* = all files (this is the default)



*.txt = all text files



data* = all files whose files names begin with "data"



data*.txt = all text files whose files names begin with "data"



*data* = all files whose file names contain the string "data" somewhere

At runtime, SOAtest will use the contents of each file as a data source value.

Adding a Writable Data Source
To add a writable data source that dynamically generates values:

1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Other from the Add Data Source wizard and click Next. The Other dialog of the Add Data Source wizard displays.
3. Select Writable and click Finish. The Data Source configuration options display in the right GUI panel of SOAtest.



4. (Optional) Change the data source label in the Name field of the Data Source configuration options.
5. Use the Rows controls to indicate the range of rows you want to use.

If you only want to use selected rows, click the Range button, then enter the desired range (assuming a one-based index) by typing values into the From and To fields. For example, to use only the first 10 rows, enter 1 in the From field and 10 in the To field. To use only the fifth row, enter 5 in the From field and 5 in the To field.

6. Specify your preferred writing mode for setup tests and standard tests.

Append continuously appends to the Writable Data Source without clearing previously appended values as long as the Writable Data Source is being written to by a single test case. When another test case writes to the Writable Data Source, it will clear the data source and append a new value.

7. If you want to specify column names (rather than use the default A, B, C, D, etc.), check First row specifies column names.
8. (Optional) Specify a column name for each non-empty column by double-clicking the header for that column, then entering a column name in the dialog box that opens. If you do not change the column headers, SOAtest will refer to each column by its default name (a, b, c, etc.).
9. To populate the Writable data source, complete the following:
a. Add a SOAP Client tool as a Set-Up Test to the test suite. For more information on Set-Up Tests, see “Adding Set-Up and Tear-Down Tests”, page 332.
b. Add an XML Data Bank tool as an output to the SOAP Client Set-Up Test.
c. Run the SOAP Client Set-Up Test to populate the XML Data Bank.

d. Add a node to the Selected XPaths in the XML Data Bank GUI.
e. Double-click the entry row underneath the Data Source column name column in the XML Data Bank GUI. A Modify dialog displays.
f. Select Writable Data Source Column in the Modify dialog and click OK. Now when the Set-Up Test runs, the Writable Data Source will be populated. The Writable Data Source will automatically be reset every time its parent Test Suite is run.

Note: You can add rows and columns to a writable data source by right-clicking and selecting Insert rows | columns.

Setting Up a "One-to-Many" Data Source Mapping
You can also set up SOAtest to use values from a single row of one data source (e.g., a global data source that contains login information) with multiple rows from another data source as follows:
1. Add a Writable Data Source to your project as described in “Adding a Writable Data Source”, page 351. The Writable Data Source lets SOAtest iterate independently of the other Data Sources.
2. If you have more than one "global" parameter in your Data Source, right-click the single Writable Data Source column and select Insert Columns. Rename the columns to something matching your original Data Source columns.
3. Add an Extension tool as a Set-Up Test (see “Adding Set-Up and Tear-Down Tests”, page 332 for details). This will act as an interface to your "global" Data Source.
4. Configure the Extension tool as described in “Extension (Custom Scripting)”, page 960. Assuming that the column names you want to access are named "username" and "password" from the Data Source "Credentials", you would select the Credentials data source in the tool’s configuration panel, check Use data source, then add the following code:

from soaptest.api import *

def getCredentials(input, context):
    username = context.getValue("Credentials", "username")
    password = context.getValue("Credentials", "password")
    return SOAPUtil.getXMLFromString([username, password])

5. Chain an XML Data Bank to the output of the Extension tool by right-clicking the Extension tool, choosing Add Output, and then selecting the XML Data Bank tool option.
6. Run the test once to populate the XML Data Bank.
7. Double-click the XML Data Bank tool to open its configuration panel.
8. Select the element that corresponds to the first parameter, and click Add.
9. Click Modify and select DataSource column name.
10. Select Writable Data Source Column and select the corresponding name.
11. Repeat steps 8-10 for the element corresponding to the second parameter.
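A note on the script in step 4: context.getValue("Credentials", "username") reads the value of the named column for the "Credentials" row currently being used, and the values are returned as a small XML document (via SOAPUtil.getXMLFromString) so that the chained XML Data Bank can extract each value and write it into the corresponding Writable Data Source column.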

Generating a Data Source Template for Populating Message Elements
Manually creating a data source for parameterizing large, complex XML messages can be time-consuming and tedious. For a fast way to accomplish this, have SOAtest automatically generate a CSV data source template based on the structure of the request or response message that you want to parameterize.
To generate and use a data source template:
1. If the Form Input view does not display all of the elements that you want the generated data source to reference, manually add them or automatically populate the form as described in “Populating a Set of Elements with Automatically-Generated Values”, page 809. Automated population will add nodes for optional elements.
2. In the appropriate messaging tool (SOAP Client, Messaging Client, Message Stub), go to the Form Input view, right-click the root tree element, then choose Generate CSV Data Source.



3. In the dialog that opens, specify the following settings:

File Name: The name of the CSV file that will be generated.



CSV File Destination: The workspace location where you want to save the generated file.



File separator: The delimiter for the CSV file.

4. Click OK. A data source will be added to the test suite and each message element will be parameterized with a column from the data source.
5. Open the generated data source template and add values as needed. These values will be passed to the associated elements during test execution.

Cutting, Copying, and Pasting Data Sources
To cut, copy, or paste a data source, right-click the desired data source node and select Cut, Copy, or Paste from the shortcut menu. This feature is useful if you already have a data source in which you would like to modify only a few values. You can then just cut or copy the data source, paste it in the same tree branch, then modify the new data source as needed.

Performing Functional Tests Using Data Sources
After adding a data source to a test suite, the data source values can be used in conjunction with SOAtest tools to further extend the functionality of the test suite. Data sources can be used to parameterize values in tools such as SOAP Client, Messaging Client, Browser Testing Tool, DB Tool, Diff Tool, and so on. The SOAP Client and Messaging Client tools can be configured to send data source values as part of a request to the server. The Diff tool can then be configured to compare the responses to another set of values in the data source. For example, consider the functional testing of a service that receives the names of U.S. capital cities and returns the names of corresponding states. In this case, a data source with two columns would be added to the test suite: the first column containing cities as values, and the second column containing states as values. The SOAP Client tool could be configured to send requests that draw inputs from the first column. The Diff tool would then be configured to compare the actual responses to the values in the second column. Using the SOAP Client and Diff tools in this way is a very powerful option in functional testing since each value in each row of the data source would be cycled through and checked.
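For instance, the data source for the capital-city service described above might contain rows such as the following (the column names are illustrative):

City, State
Sacramento, California
Austin, Texas
Albany, New York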

Understanding Data Source Iteration

It is important to understand how the test Execution Options will affect data source iteration. For more information on Test Execution settings, see “Specifying Execution Options (Test Flow Logic, Regression Options, etc.)”, page 320. The following Execution Options will affect data source iteration:
• Individually Runnable: When Individually Runnable is selected in the Execution Options of a test suite, each test in the test suite will iterate through every row of the data source before moving on to the next test.
• Scenario: When Individually Runnable is not selected, every test in the test suite will execute before iterating to the next data source row.

For example, if a Test Suite contains Test A and Test B that both use a Data Source with three rows, the following execution patterns would occur:
• Individually Runnable: Test A: Row 1, Test A: Row 2, Test A: Row 3, Test B: Row 1, Test B: Row 2, Test B: Row 3
• Scenario (Non-Individually Runnable): Test A: Row 1, Test B: Row 1, Test A: Row 2, Test B: Row 2, Test A: Row 3, Test B: Row 3

If an XML Data Bank is used to pass values between test cases in a Test Suite, this will automatically put the Test Suite in "Scenario Mode" regardless of whether Individually Runnable is selected. The Data Source iteration will behave as if Individually Runnable is not selected. For more information on using an XML Data Bank see “XML Data Bank”, page 863.

Configuring the SOAP Client and Diff Tools to Use Data Sources

To configure the SOAP Client and Diff tools to send and compare data source values:
1. If you have not already done so, add the data sources you want to use as described in “Adding a Data Source”, page 346.
2. Double-click the desired SOAP Client node in the Test Case Explorer and choose the appropriate data source from the Data Source combo box in the right GUI panel.
• The Data Source combo box will not display unless a data source was previously added to the test suite. If there is only one data source available, the SOAP Client tool will default to that data source. If there is more than one data source available, the SOAP Client tool will default to the first data source listed in the Tests tab.
3. Complete the rest of the fields in the Project Configuration panel for the SOAP Client node as explained in “SOAP Client”, page 777.
4. Select the SOAP Client node and click the Add Test or Output toolbar button. An Add Output wizard displays.
5. From the Add Output wizard, select Request> SOAP Envelope from the left pane, select Diff from the right pane, and click Finish. A Diff node is added to the SOAP Client node.
6. Double-click the Diff node to open the tool configuration panel.
7. Choose the appropriate data source from the Data Source box. The data source you choose must be the same data source specified for the SOAP Client.
8. In the Regression control source box, choose Data Source.
9. In the Data Source Column box, choose the column from the combo box that you would like the Diff tool to compare responses to.
10. Complete the rest of the Diff tool configuration settings as explained in “Diff”, page 899.
You can now perform a functional test by selecting the SOAP Client node and clicking the Test toolbar button. The results will display in the right GUI panel.

Parameterizing Arrays of Varying Size

SOAtest also allows you to map hierarchical data structures in data sources. Suppose that you have a web service that collects information to construct family trees. The information sent looks like the following:

<ns1:People xmlns:ns1="http://www.example.org/ParentChildGrandChild">
  <ns1:Person>
    <ns1:Name>GrandPa</ns1:Name>
    <ns1:Age>85</ns1:Age>
    <ns1:Child>
      <ns1:Name>Daddy</ns1:Name>
      <ns1:Age>55</ns1:Age>
      <ns1:Child>
        <ns1:Name>FirstSon</ns1:Name>
        <ns1:Age>25</ns1:Age>
      </ns1:Child>
      <ns1:Child>
        <ns1:Name>SecondSon</ns1:Name>
        <ns1:Age>22</ns1:Age>
      </ns1:Child>
    </ns1:Child>
  </ns1:Person>
</ns1:People>

It represents a family tree as follows:
• GrandPa
  • Daddy
    • FirstSon
    • SecondSon

How do you set up an “Array Data Source” that can be used to vary the number of children and grandchildren in the XML? First, you set up the data source, then you use it to parameterize the form input as described in the following sections.

Setting up the Data Source

Suppose we want to send family tree information for the following 2 families:
• GrandPa
  • Daddy
    • FirstSon
    • SecondSon
• GrandMa
  • Mommy
    • FirstDaughter
    • SecondDaughter
  • FirstAunt
  • SecondAunt
    • FirstCousin
    • SecondCousin

To set up the data source for this scenario:

1. Create an Excel Spreadsheet with 3 sheets named: FirstGeneration, SecondGeneration, ThirdGeneration.

2. Fill out the FirstGeneration sheet.

Notice the SecondGeneration dsref* column. This is how we denote that the children of the FirstGeneration will be from the SecondGeneration sheet. (dsref* denotes Data Source REFerence.)


3. Fill out the SecondGeneration sheet.

Notice the ThirdGeneration dsref* column. Notice also the ParentIndex column. The value of the ParentIndex column indicates which FirstGeneration the SecondGeneration is related to. For example, Daddy is related to the first FirstGeneration or GrandPa. Mommy, FirstAunt, and SecondAunt are related to the second FirstGeneration or GrandMa.


4. Fill out the ThirdGeneration Sheet.

Notice again the ParentIndex column. Notice that there is no ParentIndex 3 because FirstAunt does not have any children.
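Since the spreadsheet screenshots are not reproduced here, the following sketch shows one possible layout of the three sheets. The names and ParentIndex values follow the relationships described above; the second family's ages, the column order, and the exact contents of the dsref* columns are illustrative assumptions rather than values taken from this guide.

FirstGeneration sheet (columns: Name, Age, SecondGeneration dsref*)
GrandPa | 85
GrandMa | 80

SecondGeneration sheet (columns: ParentIndex, Name, Age, ThirdGeneration dsref*)
1 | Daddy | 55
2 | Mommy | 50
2 | FirstAunt | 45
2 | SecondAunt | 40

ThirdGeneration sheet (columns: ParentIndex, Name, Age)
1 | FirstSon | 25
1 | SecondSon | 22
2 | FirstDaughter | 20
2 | SecondDaughter | 18
4 | FirstCousin | 15
4 | SecondCousin | 12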

Parameterizing Form Input

The next step to varying the number of children and grandchildren in the XML is to parameterize the form input as follows:
1. Create a new .tst file called ArrayDataSource.tst.
2. Add an Excel Data Source that points to the Excel spreadsheet created in previous steps. Select FirstGeneration as the sheet.
3. Create a new Messaging Client.
4. Configure the Messaging Client as follows:
• Set Schema URL to http://soatest.parasoft.com/ParentChildGrandChild.xsd
• Set RouterEndpoint to http://ws1.parasoft.com:8080/examples/servlets/Echo

5. Set up the Form Input as follows:
• People
  • Person
    • Name - Parameterized to: Name
    • Age - Parameterized to: Age
    • Child
      • Name - Parameterized to: SecondGeneration:Name (Name column in SecondGeneration sheet)
      • Age - Parameterized to: SecondGeneration:Age (Age column in SecondGeneration sheet)
      • Child
        • Name - Parameterized to: SecondGeneration:ThirdGeneration:Name (Name column in ThirdGeneration sheet)
        • Age - Parameterized to: SecondGeneration:ThirdGeneration:Age (Age column in ThirdGeneration sheet)

6. Run the test. In the Traffic Viewer, the XML should reflect the family information in the data source.


Using Interpreted Data Sources

An interpreted data source is a tabular data source that is regarded by SOAtest as a relational representation of a Java object graph. An interpreted data source can be used to facilitate creation of multiple Java objects and object graphs that can be used by the EJB Client Tool and other SOAtest tools as test parameter inputs. For more information on the EJB Client tool, see “EJB Client”, page 841.

Generating a Data Source from the fields of a Java Bean

To generate a Data Source from the fields of a Java Bean:
1. Select the desired test suite node and click the Add Database toolbar button. The New Data Source wizard displays.
2. Select Bean Wizard and click Next. The Bean Wizard dialog displays.
3. Complete the following options in the Bean Wizard dialog:
• Destination Type: Select the type of table template you would like to create from the drop-down menu.
• Destination Directory: Specify where the tables will be written.
• Write Over Existing Files: Specify whether you want to overwrite existing files.
• Java Class: Specify the class for which you would like to create a tabular representation.
• Trace Dependencies: Select either Yes or No for trace dependencies. Selecting Yes prompts SOAtest to generate tables for types of class member variables “reachable” from the “root” class that you specified in the Java Class field.
4. Click the Next button. The Dependencies dialog displays.
5. Select the desired type dependencies from the Generate Table for Classes list.


6. Click the Finish button to generate the tables. The tables will be created in the designated directory and a data source for each file will be added to the test suite you selected.

Interpreted Data Source Table Format

The following are the main concepts of the relational to object mapping used by SOAtest:
• An object is a row in a table.
• There are two types of tables:
  • Class Tables
    • Column for each class member variable
    • Row for each instance
  • Collection Tables
    • Column for table and row of actual object
    • Row for each instance

Object References

The first column in each table is an identifier for object instances. An object may be referenced by the table name followed by a space, followed by the object identifier. The identifier column values need not be row numbers, so long as the identifier values are unique within each table.

Values and References

Field values of non-primitive classes are generally specified through object references as described above. However, the extra level of indirection is unnecessary and cumbersome for values of primitive types. To accommodate an abbreviated form, built-in support is included for inlining commonly used types with well-defined string representations. An empty cell in a reference column is interpreted as the null value. An empty cell in a value column is interpreted as an empty string.

Collections

In order to support variable-sized collections, a collection table is introduced. A collection table has a single reference column. An object in the collection is referenced by the table name followed by a space, followed by the object identifier.

Example

As an example, let us consider an object graph with CreditCardDO as a root. CreditCardDO contains instances of PersonDO and AddressDO and a Vector of ActivityDO type objects.

public class CreditCardDO extends PaymentMethodDO implements Serializable {
    protected String ccNumber;
    protected Date expirationDate;
    protected PersonDO ccHolder;
    protected AddressDO billingAddress;
    protected Vector recentActivity = new Vector();
    // set…()/get…() methods omitted
}

public class PaymentMethodDO implements Serializable {
    protected String bankName;
    // set…()/get…() methods omitted
}

public class AddressDO implements Serializable {
    protected String streetAddress;
    protected String city;
    protected int zipCode;
    protected String state;
    // set…()/get…() methods omitted
}

public class PersonDO implements Serializable {
    protected String firstName;
    protected String lastName;
    // set…()/get…() methods omitted
}

public class ActivityDO implements Serializable {
    private float amount;
    private String description;
    // set…()/get…() methods omitted
}

The following tables illustrate how the above object graph example can be represented in tabular format:

Table CreditCardDO.csv
soatest.examples.CreditCardDO | bankName | billingAddress ref | ccHolder ref | ccNumber | expirationDate | recentActivity ref
1 | SampleBank | AddressDO 1 | PersonDO 1 | 1234123412341234 | 8/31/2005 | RecentActivities-1

Table AddressDO.csv
soatest.examples.AddressDO | city | state | streetAddress | zipCode
1 | Los Angeles | CA | 101 E. Huntington Dr. | 91016

Table PersonDO.csv
soatest.examples.PersonDO | firstName | lastName
1 | Donald | Duck

Table ActivityDO.csv
soatest.examples.ActivityDO | amount | description
1 | 10 | Charge-10
2 | 20 | Charge-20
3 | 30 | Charge-30
4 | 40 | Charge-40
5 | 50 | Charge-50
6 | 60 | Charge-60
7 | 70 | Charge-70
8 | 80 | Charge-80
9 | 90 | Charge-90
10 | 100 | Charge-100

Table RecentActivities-1.csv
ActivityDO ref
ActivityDO 1
ActivityDO 2
ActivityDO 3

Populating and Parameterizing Elements with Data Source Values

Why Populate and Parameterize Elements with Data Source Values?

Let’s say you have a complex request message that looks something like this:

<?xml version="1.0" encoding="UTF-8"?>
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <SOAP-ENV:Body>
    <addNewItems xmlns="http://www.parasoft.com/wsdl/store-01/">
      <book>
        <id xmlns="">0</id>
        <title xmlns=""></title>
        <price xmlns="">0.0</price>
        <authors xmlns="">
          <i></i>
          <i></i>
        </authors>
        <publisher xmlns="">
          <id myAtt="attVal">1</id>
        </publisher>
      </book>
    </addNewItems>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>

You will want to parameterize most of the above elements—and even some arrays that may vary in the number of elements they contain (such as the list of authors in the above example). This is data-driven testing. However, manually creating an Excel spreadsheet with data and parameterizing each individual element can be time-consuming and tedious. If you already have an existing data source, you can use the Populate feature (described below) to automatically map data source values to message elements.


If you do not already have a data source with these values, you can use the Populate feature (described below) to automatically generate simple values for a set of form fields. You can also automatically generate and then complete a data source template as described in “Generating a Data Source Template for Populating Message Elements”, page 353.

Data Source Mapping and Naming Conventions

When parameterizing element or attribute values, there are three possibilities:
• Specifying a value
• Specifying that an element should be Nil or Null
• Specifying that an element should be excluded entirely

Specifying Values

Matching data source columns to request elements is accomplished via certain naming conventions applied to the data source column names. In the example XML above, notice that there are 2 elements named "id." To distinguish them, we can use "book/id" as one column name and "book/publisher/id" as the other. This naming convention mimics a file directory structure or an XPath. Attributes are similarly identified, with the additional specification of an "@" symbol. In the example above, "book/publisher/id@myAtt" refers to the attribute "myAtt" of the publisher id. Next, consider the case where several elements may have the same name, as demonstrated in the authors element above. The "book/authors/i" identifier applies to both elements, so we need a way to distinguish them. In this case, we can append array numbers within parentheses "()" to repeated elements of the same name. Hence, "book/authors/i(1)" identifies the first element, and "book/authors/i(2)" identifies the second.

Specifying Nil and Exclude

In some cases, you will want to specify that an element appear as Nil or that the element should not appear in the request message. By default, appending "XL" for exclude and "NIL" for nil values will accomplish this goal. For example, a column named "book/authors/iXL(2)" will allow you to indicate that the second child of the authors tag should not be sent, such as for the case where there is only one author.
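Putting these conventions together, a data source for the request message shown above might use column names such as the following (an illustrative sketch based on the conventions just described; the second row contains sample values only):

book/id,book/title,book/price,book/authors/i(1),book/authors/i(2),book/publisher/id,book/publisher/id@myAtt
1,Example Title,19.99,First Author,Second Author,1,attVal

A column such as book/authors/iXL(2) could be added alongside these to control, row by row, whether the second authors child is sent at all.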

How to Populate and Parameterize Elements with Data Source Values

In cases of large, complex XML message requests, the process of configuring each element and attribute item in the request XML, and then parameterizing it with the correct data source column, can be time-consuming. Therefore, it is possible to expedite this process with the automatic populate and parameterize feature. Ultimately, when this feature is used with the associated data source naming conventions, it can provide huge productivity gains, especially when dealing with messages containing hundreds of nested, complex elements. It also allows you to focus on designing use cases in a data-driven manner (using data sources) instead of focusing on the error-prone method of individually configuring each message parameter.


Note for SOAtest 5.x Users

In SOAtest 5.x, the populate feature was available for Form XML and Literal XML views. It populated Form Input, and then overrode the values in Form XML or Literal XML with values from Form Input. With SOAtest 6.x, the populate option has been removed from Form XML and Literal XML. You should populate in Form Input, then switch views to achieve the same effect.

To automatically populate and parameterize elements with data source values, complete the following:
1. In a SOAP Client test configured with a WSDL or a schema, open the Request> SOAP Body tab, right-click in a blank area of the Form Input tree (not on a specific element), and select Populate from the shortcut menu.
2. Enable Map parameter values to data source columns to tell SOAtest to automatically set each form input parameter to Parameterized.
3. Select the data source column with a name that matches the parameter name. For example, if the data source has a column name "title" and one of the form input elements has the same name "title", then the "title" element will be mapped to the data source column "title", and so on.
4. Customize the remaining options as needed. See “Populate Wizard Options”, page 366 for details.
5. Click OK. A Populate Results dialog displays Summary and Detail information.

Populate Wizard Options

• Map parameter values to data source columns: (Only enabled when a data source is present.) Indicates whether to automatically set each form input parameter to Parameterized and select the data source column with a name that matches the parameter name. For example, if the data source has a column name "title" and one of the form input elements has the same name "title", then the "title" element will be mapped to the data source column "title", and so on.
• Element Exclusion: Indicates whether to also map the property Use data source: Exclude [element name] with empty string with a data source column. For more details about the exclude with empty string option, see “Using Data Source Values to Determine if Optional Elements Are Sent”, page 810. The following options are available from the Element Exclusion dropdown menu:
  • Always Include: Instructs SOAtest to always add new elements that are optional (schema type attribute minOccurs='0') as part of the populate process. The number of elements added is determined by the Number of sequence (array) items field; by default, that is set to 2.
  • Leave Unchanged: Leaves each sequence element in its current state. No new elements are added and no exclusion properties are modified.
  • Use Data Source: Instructs the populate process to map the Use data source: Exclude [element name] with empty string property of each Form Input element to the data source column with the same name, but postfixed with the value XL. If the value in the specified data source column is an empty string, the optional element will not be included in the message. If it is an actual value, that value will be sent as part of the message.
  The postfix XL is specified in the Exclude column name postfix field; XL is the default value. For example, if the request XML message includes an element named "title" and that element type in the schema is defined with the attribute minOccurs='0', the Use data source: Exclude title with empty string option becomes available in the Form Input view when right-clicking the parent node of title. The populate feature would map the exclude property to the data source column named "titleXL", assuming that "XL" is the postfix specified.
• Nillable Elements: Similar to Element Exclusion, except that Nillable Elements affects the Use data source set Nill with empty string property. This Form Input property is available on elements that have their schema type set with nillable="true". If a value is specified in the data source, the request element will include the specified value and the nil attribute will not be sent. If the data source has an empty string (e.g., ""), then the request element will have no value and will include the xsi:nil="true" attribute. For more information, see “Using Data Source Values to Configure if Nill Attributes Are Used”, page 810. The Nillable column name postfix field of the dialog specifies the postfix.
• Attribute Exclusion: Indicates whether optional attributes are automatically added by the populate process.

Data Source Mapping and Naming Conventions

For both sequence (array) and nested types, the value mapping and exclude/nillable mappings are based on name-matching conventions between the element name and the column name. However, there are cases where the same element name is reused within the XML message, so mapping collisions need to be avoided if one-to-one mapping between each element and data source column is to be maintained.

For nested complex types: XPath-like data source column names can be used to disambiguate. For example, instead of using the data column name "title", you may use "book/title" as the column name; it would therefore be mapped to any "title" elements falling under "book". If that can lead to ambiguity, you may also use a column name such as "addNewItem/book/title" to further identify which element it is supposed to be associated with.

For sequence types (arrays where items with the same name are repeated): Item index numbers can be used to disambiguate. For example, in the Parasoft store service the book type has an authors element, which in turn can have many "i" elements indicating a list of authors. Using only the data source column name "i" would result in that data source column being mapped to all occurrences of element "i". Using the data source column name "i[2]" results in that column being mapped to all occurrences of "i" that are the second item in the sequence. (The index numbers start from 1, not 0, as per the XPath specification.) If the column name "authors/i[2]" is used, then it will be mapped only to the second "i" item under "authors". Note: if there happen to be multiple "authors" elements in the XML message, then all of them would be mapped accordingly, unless the column names are disambiguated with enough XPath parent nodes to make the mapping one-to-one. Parentheses can also be used as the numeric index syntax, so "authors/i(2)" will map to the same elements as "authors/i[2]". The () syntax is inconsistent with the XPath specification, but it is helpful when database data sources are used, where [] is not a valid SQL character.

Exclude and Nillable mapping: Follows the same XPath and indexing conventions as values. For example, to exclude/include the "i" element based on a data source column, you may use the column name "authors/iXL[2]" to indicate specifically which elements it is intended for.


Configuring Testing in Different Environments

This topic explains how to work with SOAtest environments. A SOAtest Environment is a collection of variables that can be referenced within fields of your test configuration. By switching which environment is the "active" environment, you can dynamically switch the values of your test configurations at runtime. Sections include:
• Understanding Environments
• Manually Defining an Environment
• Using Environment Variables in Tests
• Changing Environments from the GUI
• Changing Environments from the Command Line
• Sample Usage
• Exporting, Importing, and Referencing Environments

Understanding Environments

You may want to run the same test suite against other target systems (different testing environments, production environments, etc.). Rather than editing server configuration-related settings within your SOAtest project, you can instead use SOAtest Environments to decouple configuration settings from your test data. Once you have configured environments for your projects, running tests against another system is as easy as clicking a button. An environment is a collection of variables that can be referenced in your SOAtest project configuration. When running a test, SOAtest will substitute the names of variables in your project configurations with the values of those variables in the active environment. By changing which environment is active, you can quickly and easily change which values SOAtest uses. Environments are defined when you automatically generate tests (e.g., from a WSDL, by recording from a browser, etc.). In addition, you can manually define one as described below.

Manually Defining an Environment

Creating and switching environments is done through the Environments branch of the test suite’s Test Case Explorer node.

The Environments branch is created by default when a new test suite is created. To add a new environment:


1. Right-click the Environments node, then choose New Environment.
2. In the configuration panel that opens on the right, use the available controls to define environment variables.

Using Environment Variables in Tests

Environment variables can be accessed in test configuration fields using a special syntax. To reference a variable, enclose the variable name in the following character sequence: ${}. For example, if you have a variable named HOST, you would reference the variable in a field by typing: ${HOST}. Variable references may appear anywhere within a field. For example, let’s say your environment contains the variables HOST = localhost and PORT = 8080, and you have an Endpoint field in a SOAP Client containing: "http://${HOST}:${PORT}/Service". Upon running the test, the value used for the endpoint will be "http://localhost:8080/Service". You can also access environment variable values from a SOAtest Extension tool/script through the Extensibility API. MethodToolContext now has a method called "getEnvironmentVariableValue(String)" which will look up and return the current value of an environment variable. This allows you to use the value within your SOAtest scripts. Note: If your test case requires the character sequence ${}, you can escape the sequence by adding a backslash. For example, if SOAtest encounters the value "\${HOST}" it will use the value "${HOST}" and will not try to resolve the variable. Also note that environment variable names are case sensitive.
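The following fragment is a rough sketch of that API call (hypothetical helper code; only the getEnvironmentVariableValue(String) method name comes from this guide, while the surrounding method, its wiring into an Extension tool, and any required imports are assumptions that depend on how your script is set up):

// Hypothetical helper assumed to be invoked from an Extension tool script or class;
// "context" is assumed to be the MethodToolContext instance provided to the tool.
public String buildServiceEndpoint(MethodToolContext context) {
    String host = context.getEnvironmentVariableValue("HOST"); // e.g., "localhost"
    String port = context.getEnvironmentVariableValue("PORT"); // e.g., "8080"
    return "http://" + host + ":" + port + "/Service";
}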

Changing Environments from the GUI

To change which environment is active, right-click the node representing the environment you want to make active, then choose Set as Active Environment.

Changing Environments from the Command Line

In addition to being able to select the active environment from the GUI, you can also switch the active environment from the command line, using the -environment option. See “Testing from the Command Line Interface (soatestcli)”, page 257 for details.
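For example, a command line along the following lines would run a project with the "Staging Environment" active (the workspace, resource, and configuration values shown here are placeholders; see the CLI section referenced above for the full set of options required in your setup):

soatestcli -data /path/to/workspace -resource MyTestProject -config "user://Example Configuration" -environment "Staging Environment"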


Overriding Environments Using Test Configurations

To set a Test Configuration to always use a specific environment for test execution (regardless of which environment is active in the Test Case Explorer), set the Test Configuration’s Override default Environment during Test Execution option in the Execution tab. See “Defining How Test Cases are Executed (Execution Tab)”, page 248 for details.

Sample Usage

Let’s say you are developing a service on your local machine. Once the code works locally, you commit the code to source control and start a build process targeted to a staging server. You want to create a suite of tests that will test against both your local machine and the staging server with minimal modification. This is the perfect case for using SOAtest Environments. Begin by creating a localhost environment for your new project. The SOAtest New Project Wizard will get you started with a few basic environment variables. If your WSDL varies across the two machines, you can decompose the WSDL URI into variables. If the WSDL location is constant across the machines, you can alternatively decompose the endpoint URI. The next step is to identify other machine-specific settings in your projects and to create environment variables for them. For example, let’s say you have a JMS step in your test suite and you need to send the message to different queues depending on whether it’s a localhost test or a staging test. First, create a new environment variable; let’s call it "JMS_REPLY_QUEUE". Next, revisit your SOAP Client, go to the JMS settings, and enter ${JMS_REPLY_QUEUE} as the value for the JMSReplyTo field. Once you have created your localhost environment, you can copy and paste the existing environment and rename the new environment to "Staging Environment". Then, simply modify the values of each of the variables so that they reflect the settings of your staging server. Once your environments are set up, targeting your tests against the various environments is simply a matter of selecting the active environment. If you want to test against your local machine, you would set your active environment to be "Localhost Environment". This will run your tests using the values defined in the localhost environment. To test against the staging server, set the "Staging Environment" to active and its values will be used.
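The resulting pair of environments might then look something like this (all values are illustrative):
• Localhost Environment: HOST = localhost, PORT = 8080, JMS_REPLY_QUEUE = localReplyQueue
• Staging Environment: HOST = staging.example.com, PORT = 8080, JMS_REPLY_QUEUE = stagingReplyQueue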

Exporting, Importing, and Referencing Environments

You may find that many configuration settings, such as server names and ports, will be common across multiple projects. Rather than duplicating these settings, you can export environment settings to an external file and import or reference the values in other projects.

Exporting Environments

To export an environment:
1. Right-click the node representing the environment you want to export, then choose Export Environment.
2. In the file chooser that opens, specify a location for the exported environments file.
The environments configuration will be written in an XML-based text file. If one environment is selected, a *.env file will be created, containing a single environment. If multiple environments are selected, a *.envs, or Environment Set, file will be created containing all of the selected environments.

Importing Environments

When you import environments, you are bringing a copy of the values from the external environment file into your project. Further modification to the XML file will not be reflected in your project. To import an environment:
1. Right-click the Environments node, then choose Import Environment.
2. In the file chooser that opens, specify the location of the environments file that you want to import.

Referencing Environments

Referencing environments is the most efficient way to share a single environment configuration across multiple projects. Using environment references, you can easily modify the configurations of multiple projects from a single location. To reference an environment:
1. Right-click the Environments node, then choose Reference Environment.
2. In the configuration panel that opens on the right, specify the location of the environments file that you want to reference.
Note that when an environment configuration is referenced, you cannot edit the environment variables in the environment directly. However, your project will always use the values reflected in the referenced *.env file. Modifying the *.env file will propagate changes to all projects that reference it.


Validating the Database Layer

Using SOAtest’s DB tool (described in “DB”, page 911), you can validate the database layer. For example, assume you are testing a Web service invocation that is supposed to add a record to the database. You can not only ensure that the service returned the expected response, but also query the database to verify whether the data was added to the database correctly. In addition, you can perform database setup or cleanup actions as needed to support your testing efforts.


Validating EJBs

SOAtest’s EJB Client tool (described in “EJB Client”, page 841) can test the remote interface of deployed EJBs. This allows testing of EJBs through their remote interfaces—without having to go through a Web or Web service interface. Additionally, you can use Tracer (described in “Jtest Tracer Client”, page 966) to generate functional JUnit test cases for specified components as you run your use cases on the working application. Using Parasoft Jtest, you can perform additional levels of testing for EJBs and other code written for the Java EE framework. For example, as code is being written, a special rule library checks compliance with EJB and other Java EE best practices; these rules can be applied along with industry-standard rules and rules that enforce your organization's specific policies. Upon component completion, unit tests can be generated and executed to test the code outside and/or inside the container.


Validating Java Application-Layer Functionality

SOAtest’s Jtest Tracer Client (described in “Jtest Tracer Client”, page 966) can be used to identify, isolate, and then reproduce bugs in a multi-layered system. Tracer allows you to rapidly create realistic functional JUnit test cases that capture the functionality covered by your SOAtest test cases. Using Tracer, you can trace the execution of Java applications at the JVM level (without a need to change any code or to recompile), and in the context of a larger integrated system. As your SOAtest test cases execute, Tracer monitors all the objects that are created and all the data that comes in and goes out. The trace results are then used to generate contextual JUnit test cases that replay the same actions in isolation, on the developer's desktop, without the need to access all the application dependencies. This means that you can use a single machine to reproduce the behavior of a complicated system during your verification procedure. Since the generated unit tests directly correlate tests to source code, this improves error identification and diagnosis, and allows developers to run these test cases without having to depend on access to the production environment or set up a complex staging environment. This facilitates collaboration between QA and Development: QA can provide developers traced test sessions with code-level test results, and these tests help developers identify, understand, and resolve the problematic code.


Monitoring and Validating Messages and Events Inside ESBs and Other Systems

Parasoft SOAtest can visualize and trace the intra-process events that occur as part of the transactions triggered by the tests, and then dissect them for validation. This enables test engineers to identify problem causes and validate multi-endpoint, integrated transaction systems—actions that are traditionally handled only by specialized development teams. For details, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.


Executing Functional Tests

This topic explains how to execute functional tests individually or with the complete test suite, and then view the HTTP traffic. Sections include:
• Running the Entire Test Suite
• Running Specific Test Cases
• Viewing HTTP Traffic

Running the Entire Test Suite

To run all test cases in your test suite:
1. Select the Test Case Explorer node that represents the test suite you want to run.
2. Click the Test toolbar button.

SOAtest will run all available test cases, then report the outcome of each test and the test suite’s overall success rate. Green bubbles mark tests that succeeded. Red bubbles mark tests that failed. Yellow bubbles mark tests that encountered errors and were not completed. The results from all tests will be collected in the SOAtest view, which is typically positioned at the bottom of the GUI. For more information about results, see “Viewing Results”, page 290. In addition, you can access a report that contains a results summary, as well as result details. This is described in “Generating Reports”, page 295.

Running Specific Test Cases

To run one or more selected test cases from a test suite that was marked as “individually runnable”:
1. Select the Test Case Explorer nodes that represent the test cases you want to run.
2. Click the Test toolbar button.

SOAtest will run the selected test case, then report the test outcome. Green bubbles mark tests that succeeded. Red bubbles mark tests that failed. Yellow bubbles mark tests that encountered errors and were not completed. The results from all tests will be collected in the SOAtest view, which is typically positioned at the bottom of the GUI. For more information about results, see “Viewing Results”, page 290. In addition, you can access a report that contains a results summary, as well as result details. This is described in “Generating Reports”, page 295.

Viewing HTTP Traffic

If you would like to view the HTTP traffic for each individual test in a test suite, double-click the Traffic Viewer node of the desired test after the test has been completed. The traffic will display in the Traffic Viewer tab on the right side of the GUI.


For SOA tests, the HTTP traffic viewer shows the SOAP requests and SOAP responses. The Response and Request bodies display in Literal form by default. If you find that you have to scroll from left to right to view the HTTP traffic, you can click in the Literal view and press CTRL + B to beautify the XML. After pressing CTRL + B, all well-formed XML fragments will be beautified, alleviating the need to scroll from left to right. For more information on the Traffic Viewer tool, see “Traffic Viewer”, page 888.


Reviewing Functional Test Results

In addition to the general results review actions presented in “Viewing Results”, page 290, the following options are available for reviewing functional test results.

In the SOAtest View

If a test fails, the SOAtest view reports a task to alert you that the test failure requires review. These tasks are organized by test suite and test.
• To open the test case related to a reported failure, right-click that failure, then choose Open Editor for Test.
• To locate the Test Case Explorer node related to a reported failure, right-click that failure, then choose Show in Test Case Explorer.
• To see the related traffic (when applicable), right-click an error message, then choose View Associated Traffic.

In the Test Case Explorer

The Test Case Explorer indicates the status (pass/fail/not yet executed) of all available test cases.
• A green check mark indicates that the test passed.
• A red X mark indicates that the test failed.
• An unmarked test indicates that the test was not yet executed.


Creating a Report of the Test Suite Structure

SOAtest provides a design-time structure report that exports test structure details to an XML or HTML document. The structure report provides details about the test setup, which allows managers and reviewers to determine whether specified test requirements were accomplished. To view a structure report, right-click the test suite’s .tst node in the Test Case Explorer, then choose View Structure Report> Structure.

The Structure Report will display in the right side of the GUI. The Structure Report contains the following information:
• Project Structure: Displays the available test suites and test cases for the selected project.
• WSDLs and Operations Tested: Displays the WSDLs and the operations for each WSDL for the selected project.
• Endpoints Tested: Displays all endpoints that were tested for the selected project.
• Requirements Tested: Displays all the defined requirements that were configured in the Requirements and Notes tab of the root test suite. In the Requirements Tracking sub-section, you may add IDs and URLs to relate your tests with the requirements/bug fixes that are tested.
• Data Sources Used: Displays the data sources that were configured for the selected project.
• Key Stores Used: Displays the key stores that were configured for the selected project.

You can configure which of the above items to display in the Structure Report via the Reports configuration panel in the SOAtest Preferences. To access the Reports configuration panel, select SOAtest> Preferences, then select SOAtest> Reports> Structure Reports from the SOAtest Preferences dialog. For details on available structure report options, see “Reports> Structure Reports”, page 753.


Managing the Test Suite

This topic explains how to manage the test suite. Sections include:
• Deleting Test Cases
• Disabling/Enabling a Test or Test Suite
• Disabling/Enabling a Tool
• Saving and Restoring a Test Suite
• Exporting a Test Suite
• Importing a Test Suite

Deleting Test Cases

To delete a test case, right-click the related Test Suite tree node, then choose Delete from the shortcut menu.

Disabling/Enabling a Test or Test Suite

You can temporarily disable tests or test suites that you want to save as part of your project, but do not want to run at the current time.
• To disable a test or test suite: Right-click the Tests tree node that represents the test or test suite you want to disable, then choose Disable from the shortcut menu.
• To enable a test or test suite that you previously disabled: Right-click the Tests tree node that represents the test or test suite you want to enable, then choose Enable from the shortcut menu.
• To disable multiple tests or test suites: Select multiple tests from the Tests tree node that represents the tests or test suites you want to disable, then choose Disable from the shortcut menu.
• To enable multiple tests or test suites that you previously disabled: Select multiple tests from the Tests tree node that represents the tests or test suites you want to enable, then choose Enable from the shortcut menu.

Disabling/Enabling a Tool

If you plan to use your test suite for load testing and you anticipate load testing with a large load, disabling heavy chained tools (e.g., Diff tools or Check XML) may be useful and allow you to generate more load, rather than having to delete the tools or create a new test suite. To enable or disable all tools of the same type from the test suite level:
1. Right-click the main test suite tree node.
2. Choose Search and Replace> Enable/Disable> [Tool Type] from the shortcut menu. Only tools that are used in the test suite display in the shortcut menu.
Whatever tool you select will be enabled or disabled in the entire test suite wherever it is used. Disabled tools will turn gray, indicating that they are disabled, and the test suite will function as if the tools are not there.


Saving and Restoring a Test Suite

Test suites are saved when you save a project file and are restored whenever you open the related project file.

Exporting a Test Suite

You may find that many configuration settings will be common across multiple tests. Rather than duplicating these settings, you can export test settings to an external file and import or reference the values in other tests. To export a test suite, complete the following:
1. Right-click the Test Suite tree node that you want to export, then select Export from the shortcut menu.
2. Select the appropriate .tst file from the file chooser that opens.

Importing a Test Suite

SOAtest lets you import previously-saved test suites so that you can easily share tests with fellow team members and integrate test suites as needed. When a test suite is imported, it can be edited to the specific needs of the team members. To import a previously-saved test suite into an existing test suite:
1. Right-click the Test Suite tree node where you want the test suite integrated, then select Add New> Test Suite from the shortcut menu.
2. Select Import Test (.tst) File and click the Finish button.
3. Select the appropriate .tst file from the file chooser that opens.
After you import a test suite, it will be integrated into the current test suite.


SOA Functional Tests

In this section:
• Automatic Creation of Test Suites for SOA: Overview
• Creating Tests From a WSDL
• Creating Tests From XML Schema
• Creating Tests From AmberPoint Management System
• Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository
• Creating Tests From BPEL Files
• Creating Tests From Software AG CentraSite Active SOA
• Creating Tests From JMS System Transactions
• Creating Tests From Sonic ESB Transactions
• Creating Tests From TIBCO EMS Transactions
• Creating Tests From Traffic
• Creating Tests From a UDDI
• Creating Tests From a WSIL
• Creating Asynchronous Tests
• Testing RESTful Services
• Sending MTOM/XOP Messages
• Sending and Receiving Attachments
• Accessing Web Services Deployed with HTTPS
• Configuring Regression Testing
• Validating the Value of an Individual Response Element
• Validating the Structure of the XML Response Message


Automatic Creation of Test Suites for SOA: Overview

With SOAtest’s test creation wizard, you can easily and automatically create a series of test cases based on a variety of artifacts and platforms. SOAtest provides a flexible test suite infrastructure that lets you add, organize, and run your Web service test cases. Each test in a test suite contains a main test tool (usually a SOAP Client tool) and any number or combination of outputs (other tools, or special output options). You can run individual tests, or the complete test suite. In addition, you can attach regression controls at the test or test suite level so that you are immediately alerted to unexpected changes.

Understanding the Test Creation Wizard

SOAtest automatically generates a suite of SOAP Client test cases from a variety of platforms and artifacts. Rather than creating each of the required tests by hand and adding them to a test suite one at a time, you can point SOAtest to the appropriate resources, and it will automatically generate a suite of test cases that covers every object associated with the corresponding data. In addition, when automatically creating test suites from WSDL or WSIL documents, you can organize tests into positive and negative unit tests, and create asynchronous test suites. The wizard can be used for:
• Creating Tests From a WSDL
• Creating Tests From XML Schema
• Creating Tests From AmberPoint Management System
• Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository
• Creating Tests From BPEL Files
• Creating Tests From Software AG CentraSite Active SOA
• Creating Tests From JMS System Transactions
• Creating Tests From Sonic ESB Transactions
• Creating Tests From TIBCO EMS Transactions
• Creating Tests From Traffic
• Creating Tests From a UDDI
• Creating Tests From a WSIL


Creating Tests From a WSDL

To automatically create a test suite from a valid WSDL document, complete the following:
1. Choose the WSDL option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. In the wizard’s WSDL page, enter a valid WSDL URL in the WSDL URL field, or click the Browse button to locate a WSDL file on the local file system.

Note: The remaining steps are optional. Once you enter a valid WSDL URL, you can go ahead and click the Finish button and SOAtest will generate a suite of test cases that test every object associated with the WSDL you entered. If you would like to configure the test suite further, continue to the next step.
3. Select the Create Functional Tests from the WSDL checkbox and choose the Generate Web Service Clients radio button. To create server stubs and perform client testing, see “Creating Stubs from Functional Test Traffic”, page 535.
4. To create a separate test suite that generates a series of tests to verify every aspect of the WSDL, select the Create tests to validate and enforce policies on the WSDL checkbox.
5. Click Next. The Interoperability dialog opens.


6. Select whether you would like to create SOAtest (Java) Clients or .NET WCF Clients.
7. Click Next. The Create Environment dialog opens.

8. Select the Create a new environment for your project checkbox and enter an Environment Name and Variable Prefix, then select whether you want to create environment variables for WSDL URI Fields, Client Endpoints, or Both. For more information on environments, see “Configuring Testing in Different Environments”, page 369.


9. Click Next. The Policy Enforcement dialog opens.

10. Select the Apply Policy Configuration check box. This will create WSDL and functional tests that will enforce the assertions defined in the specified policy configuration. The default policy configuration, soa.policy, is a collection of industry-wide best practices. To use a custom policy configuration, you can either use the Browse button to select a policy configuration or enter the policy configuration's path in the text field. For details on policy enforcement, see “SOA Policy Enforcement: Overview”, page 570.
11. Click the Next button to advance to the Layout dialog.


12. (Optional) Select the Organize as Positive and Negative Unit Tests checkbox to create both positive and negative tests for each operation; it is important to test situations where expected data as well as unexpected data is sent to the server. The default value is Sort Tests Alphabetically.
13. (Optional) Select the Asynchronous radio button and choose Parlay, Parlay X, SCP, or WSAddressing to create asynchronous test suites. For more information on asynchronous testing, see “Creating Asynchronous Tests”, page 419.
14. Click the Finish button. SOAtest will generate a suite of test cases that test every object associated with the WSDL you entered.


Creating Tests From XML Schema

To automatically create a test suite from XML schema, complete the following:
1. Choose the XML Schema option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. In the XML Schema wizard page, specify the location of the schema from which you want to generate tests.
3. Select the type of functional test you’d like to create:
• Generate Messaging Client (non-SOAP XML messages). For more information, see “Messaging Client”, page 782.
• Generate SOAP Clients (SOAP Messages). For more information, see “SOAP Client”, page 777.
• Generate Server Stubs: For more information, see “Message Stub”, page 784.
4. Enter an Endpoint and click the Next button.


5. In the Elements page, select one or more elements from which to generate your tests and click the Finish button. SOAtest only recognizes element definitions defined at the top level (i.e. as the child of the root schema element).

A new test suite is created based on the XML Schema and functional test type you selected.


Creating Tests From AmberPoint Management System

If your team uses AmberPoint Management System, you can export your runtime message sets or runtime validation baselines in the production environment, then provide this information to Parasoft SOAtest in order to create tests that can replay the SOAP messages. You can also establish the captured response messages as the regression control within the generated tests. To generate tests from an exported AmberPoint baseline or message set, complete the following:
1. Choose the AmberPoint Management System option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. In the AmberPoint Management System wizard page, browse to the location of the AmberPoint File.


3. If you would like to create a regression test from the captured response messages, select the Create Regression Controls checkbox.
4. Click the Finish button. SOAtest generates a test suite from the exported baseline or message set file. If regression controls were created, a Diff tool is attached to each test.


Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository

SOAtest can create tests that enforce policies applied to Web service assets declared in an Oracle/BEA repository. You can select a Web service asset and choose the desired policies to enforce. To enforce Oracle/BEA AquaLogic policies, complete the following:
1. Choose the BEA AquaLogic Enterprise Repository option in one of the available test creation wizards. For details on accessing the wizards, see:
• Adding a New Project and .tst File
• Adding a New .tst File to an Existing Project
• Adding a New Test Suite

2. In the BEA AquaLogic Enterprise Repository wizard page, enter the location of the repository in the Repository URL field, and enter a Username and Password. To save these settings, click the Save to Preferences button.
3. Click the Next button. A list of Available Web Service Assets corresponding to the selected repository displays.


4. Select the desired Web service asset from the list and click the Next button. A list of Policies applied to the assets you selected displays.


5. Select the desired policies and click the Finish button. SOAtest creates a test suite with the selected policies and tests whether these policies are enforced against the Web service assets.


Creating Tests From BPEL Files

SOAtest can automatically create test cases from vendor-specific BPEL deployment artifacts. You can then arrange these test cases into suites that reflect different aspects of testing of a BPEL process. SOAtest can create the following types of tests from BPEL files:

BPEL Semantics Tests

BPEL Semantics tests include the BPEL Semantics Validator – a static analysis tool that verifies syntactic correctness through schema validation, which verifies that the elements and attributes conform to the XML Schema. Beyond this, the validator explicitly verifies constraints imposed by the BPEL specification that are not enforced by the XML Schema. The validator finds errors such as:
• Unresolved references to BPEL, WSDL, and XML Schema types.
• Violations of constraints on start activities, correlations, scopes, variables, links, and partner links.
• Incompatible types in assignments.
• Errors in embedded XPath expressions.

WSDL Tests

BPEL depends on Web Services Description Language (WSDL) to define both outgoing and incoming messages. The SOAtest BPEL Wizard examines the BPEL process deployment artifacts for WSDL references. For each referenced WSDL file, the Wizard will create tests that verify WSDL schema validity, semantic validity, and WS-I interoperability, and will create a regression control.

BPEL Process Tests

BPEL Process tests emulate external business partners accessing the deployed BPEL process. The SOAtest BPEL Wizard examines the business process deployment artifacts, including BPEL and WSDL files. The Wizard maps the process's partner link description to a WSDL port type and a protocol binding through which the process can be externally invoked. The BPEL Wizard then creates a test for each operation of the port type of the business process.

BPEL Partner Tests

The correct functioning of a BPEL process directly depends on the correct functioning of its business partners. A change in the behavior of a business partner may cause the BPEL process to fail. Finding the cause of such failures can be time consuming. By including BPEL partner tests in the BPEL process test suite, the SOAtest BPEL Wizard allows users to test BPEL partners as components of the BPEL process and detect business partner errors and unexpected behavior early in the development lifecycle. The SOAtest BPEL Wizard examines the business process deployment artifacts, including BPEL and WSDL files. The Wizard then maps partner link descriptions to WSDL port types and protocol bindings through which business partners can be externally invoked. The Wizard will then create a test suite for each business partner; within each test suite will be a test for every operation declared in the partner's port type.

Automatically Creating Test Suites from BPEL Process Deployment Artifacts
To automatically create test suites from BPEL process deployment artifacts, complete the following:
1. Choose the BPEL option in one of the available test creation wizards. For details on accessing the wizards, see:

Adding a New Project and .tst File



Adding a New .tst File to an Existing Project




Adding a New Test Suite

2. In the BPEL wizard page, go to the BPEL Engine drop-down menu and select the BPEL engine type where the BPEL process you would like to test is deployed.
•	If BPEL Maestro is selected from the BPEL Engine drop-down menu, complete the following:
	•	Enter the URL of the BPEL file in the BPEL URL field.
	•	Enter the public WSDL URL of the BPEL process in the Public WSDL URL field.
•	If Active BPEL is selected from the BPEL Engine drop-down menu, complete the following:
	•	Enter the location of the BPEL process Deployment Descriptor File (.pdd file).
	•	Enter the location of the BPEL file.
	•	Enter the Engine URL where the Active BPEL 2.0 engine is deployed. For example, if you installed Active BPEL 2.0 to run in a Tomcat servlet container with the address http://mybpelhost:8080, then your Active BPEL 2.0 Engine URL will be http://mybpelhost:8080/active-bpel. Verify that this is the right URL by opening it in a browser; you should see the Administrative Servlets panel.




•	If Generic BPEL is selected from the BPEL Engine drop-down menu, complete the following:
	•	Enter the URL of the BPEL file in the BPEL URL field.
	•	Enter the public WSDL URL of the BPEL process in the Public WSDL URL field.
For Generic BPEL engines, it is likely that the BPEL partner link to WSDL port mappings, and the ports and port types of your BPEL process business partners, are declared in WSDL files other than the public WSDL of the BPEL process. If this is the case, you should declare those dependency WSDLs in the Optional Parameters dialog. To open this dialog, click the Optional Parameters Configure button, click Add, and enter the dependency WSDL URL in the newly created table row. Add as many dependency WSDLs as needed.
3. Select the test categories that you would like the BPEL Wizard to create:

Create BPEL Semantics Tests: Verifies semantic and schema validity of BPEL files.



Create WSDL Tests: Checks WSDL files referenced in the BPEL deployment for schema validity, semantic validity, WS-I interoperability, and regression.



Create BPEL Process Tests: Emulates external business partners accessing the deployed BPEL process.



Create BPEL Partner Tests: Allows direct testing of BPEL process business partners.

4. Click the Finish button. SOAtest will examine the BPEL process deployment artifacts and automatically create test suites for the BPEL process you selected.


Creating Tests From Software AG CentraSite Active SOA

SOAtest can create tests that enforce policies applied to Web service assets that are declared in a Software AG CentraSite Active SOA repository. You can select a service asset and choose the desired policies to enforce. To enforce CentraSite Active SOA policies, complete the following:
1. Choose the CentraSite repository option in one of the available test creation wizards. For details on accessing the wizards, see:

Adding a New Project and .tst File



Adding a New .tst File to an Existing Project



Adding a New Test Suite

2. In the CentraSite wizard page, enter the location of the repository in the URL field, then enter a Username and Password. To save these settings, click the Save to Preferences button.
3. Click the Next button and enter a service name query in the Name Query field. This allows you to query by name for services that are registered in CentraSite Active SOA.


4. Click the Next button. A list of Available Web Service Assets corresponding to the entered query displays.

5. Select the desired service asset from the list and click the Finish button. SOAtest creates a test suite with the selected policies and tests whether these policies are enforced against the service assets.

Reporting Test Execution Results to CentraSite Active SOA After running the test suite with the selected CentraSite Active SOA policies, you have instant access to quality data associated with the assets in CentraSite Active SOA. For details on how to report results to CentraSite Active SOA, see “Using Software AG CentraSite Active SOA with SOAtest”, page 682.


Creating Tests From JMS System Transactions

This topic explains how you can use SOAtest to monitor transactions that pass through any JMS system, then generate functional test cases that check the monitored messages. Sections include:

Overview



Prerequisites



Generating Tests from JMS Transactions

Overview
SOAtest can monitor transactions that pass through a JMS system, then generate functional test cases that check the monitored messages. In addition to providing visibility into the system's messages, this allows you to replay transactions directly from SOAtest and verify that the monitored functionality continues to work as expected. To achieve this, you tell SOAtest how to connect to your JMS system and which destination (topic or queue) messages you want it to monitor, then you prompt it to start monitoring. SOAtest will generate a test suite of Messaging Client tests for each JMS message captured at the specified destination, or for all messages within the process flow (if a process tracking topic was used). These tests are preconfigured with the connection parameters, requests, and destination information so that SOAtest can replay the same messages.

Prerequisites See “JMS Prerequisites”, page 694.

Generating Tests from JMS Transactions To generate tests: 1. Choose the Java Message Service (JMS) option in one of the available test creation wizards. For details on accessing the wizards, see: •

Adding a New Project and .tst File



Adding a New .tst File to an Existing Project




Adding a New Test Suite

2. In the JMS wizard page, complete the following:
a. In the Connection area, specify your JMS connection settings.
b. In the Initial Context field, specify a fully-qualified class name, passed to the JNDI javax.naming.InitialContext constructor as the string value of the property named javax.naming.Context.INITIAL_CONTEXT_FACTORY.
c. In the Connection Factory field, specify the JNDI name of the factory. This name is passed to the lookup() method of javax.naming.InitialContext to create a javax.jms.QueueConnectionFactory or a javax.jms.TopicConnectionFactory instance.
d. In the Destination Name field, specify the topic or queue that you want to monitor. You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special process tracking topic.
e. In the Destination Type field, specify whether the tracking destination is a topic or a queue.
f. (Optional) In the Message Selector field, enter a value to act as a message filter. See “Using Message Selector Filters”, page 704 for tips.
(A sketch of the plain JMS code that these connection fields correspond to appears after this procedure.)

3. Click Next. SOAtest will start monitoring the messages that match the settings specified in the previous wizard page. If you run another application that sends messages to the bus, those messages will be noted in this panel.


4. When you are ready to stop monitoring, click Finish. SOAtest will then create test cases based on the verified messages.
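For reference, the Initial Context, Connection Factory, Destination, and Message Selector fields in step 2 mirror what a plain JMS client does in code. The following is a minimal sketch under assumed names (the provider context factory class, provider URL, JNDI names, and selector string are hypothetical placeholders); it is not code that SOAtest generates, just an illustration of how those settings are normally used:

import java.util.Hashtable;
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Destination;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.Session;
import javax.naming.Context;
import javax.naming.InitialContext;

public class JmsMonitorSketch {
    public static void main(String[] args) throws Exception {
        // "Initial Context" field: the fully-qualified initial context factory class name.
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.example.jndi.ProviderContextFactory"); // hypothetical
        env.put(Context.PROVIDER_URL, "tcp://broker.example.com:61616");                     // hypothetical
        Context jndi = new InitialContext(env);

        // "Connection Factory" field: a JNDI name resolved via lookup().
        ConnectionFactory factory = (ConnectionFactory) jndi.lookup("jms/MyConnectionFactory");

        // "Destination Name" / "Destination Type" fields: the topic or queue to monitor.
        Destination destination = (Destination) jndi.lookup("jms/OrdersQueue");

        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

        // "Message Selector" field: an optional SQL-92-style filter over message headers and properties.
        MessageConsumer consumer = session.createConsumer(destination, "JMSType = 'Order'");

        connection.start();
        Message message = consumer.receive(5000); // wait up to five seconds for one message
        System.out.println("Received: " + message);
        connection.close();
    }
}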

Monitoring Intermediary Messages In addition to automatically generating functional tests from monitoring the transaction messages that touch JMS endpoints in ESBs or middleware systems, you can also visualize and trace the intra-process JMS messages that take place as part of the transactions that are triggered by the tests, and then dissect them for validation. For details on how to do this, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.


Creating Tests From Sonic ESB Transactions

This topic explains how SOAtest can monitor transactions that pass through a Sonic ESB system, then generate functional test cases that check the monitored messages. In addition to providing visibility into the system's messages, this allows you to replay transactions directly from SOAtest and verify that the monitored functionality continues to work as expected. Sections include:

Overview



Prerequisites



Generating Tests from Sonic ESB Transactions

Overview
SOAtest can monitor transactions that pass through Sonic ESB, then generate functional test cases that check the monitored messages. In addition to providing visibility into the system's messages, this allows you to replay transactions directly from SOAtest and verify that the monitored functionality continues to work as expected. To achieve this, you tell SOAtest how to connect to your Sonic ESB and which destination (topic or queue) messages you want it to monitor, then you prompt it to start monitoring. SOAtest will generate a test suite of Messaging Client tests for each JMS message captured at the specified destination, or for all messages within the process flow (if a process tracking topic was used). These tests are preconfigured with the connection parameters, requests, and destination information so that SOAtest can replay the same messages.

Prerequisites The following jar files must be added to your classpath (via SOAtest> Preferences> System Properties): •

broker.jar



mfcontext.jar



sonic_Client.jar

Generating Tests from Sonic ESB Transactions To generate tests: 1. Choose the Sonic Enterprise Service Bus option in one of the available test creation wizards. For details on accessing the wizards, see: •

Adding a New Project and .tst File



Adding a New .tst File to an Existing Project




Adding a New Test Suite

2. Complete the first page of the Sonic ESB wizard as follows:
a. In the Connection area, specify your Sonic ESB connection settings.
b. In the Destination Name field, specify the topic or queue that you want to monitor. You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special "dev.Tracking" tracking endpoint. For instance, if you want to track all events that occur as part of the process flow, specify the dev.Tracking endpoint and set the process to Tracking Level 4 in the ESB.
c. In the Destination Type field, specify whether the tracking destination is a topic or a queue.
d. (Optional) In the Message Selector field, enter a value to act as a message filter. See “Using Message Selector Filters”, page 704 for tips.
3. Click Next. SOAtest will start monitoring the messages that match the settings specified in the previous wizard page. If you run another application that sends messages to the bus, those messages will be noted in this panel.
4. When you are ready to stop monitoring, click Finish. SOAtest will then create test cases based on the verified messages.


Monitoring Intermediary Messages In addition to automatically generating functional tests from monitoring the transaction messages that touch JMS endpoints in Sonic ESB, you can also visualize and trace the intra-process events that take place as part of the transactions that are triggered by the tests, and then dissect them for validation. For details on how to do this, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.


Creating Tests From TIBCO EMS Transactions

This topic explains how you can use SOAtest to monitor transactions that pass through a TIBCO EMS system, then generate functional test cases that check the monitored messages. Sections include:

Overview



Prerequisites



Generating Tests from TIBCO EMS Transactions

Overview
SOAtest can monitor transactions that pass through TIBCO EMS, then generate functional test cases that check the monitored messages. In addition to providing visibility into the system’s messages, this allows you to replay transactions directly from SOAtest and verify that the monitored functionality continues to work as expected. To achieve this, you tell SOAtest how to connect to your TIBCO EMS and which destination (topic or queue) messages you want it to monitor, then you prompt it to start monitoring. SOAtest will generate a test suite of Messaging Client tests for each JMS message captured at the specified destination, or for all messages within the process flow (if a process tracking topic was used). These tests are preconfigured with the connection parameters, requests, and destination information so that SOAtest can replay the same messages.

Prerequisites The tibjms.jar file must be added to your classpath (via SOAtest> Preferences> System Properties).

Generating Tests from TIBCO EMS Transactions To generate tests: 1. Choose the TIBCO Enterprise Messaging Service option in one of the available test creation wizards. For details on accessing the wizards, see: •

Adding a New Project and .tst File



Adding a New .tst File to an Existing Project




Adding a New Test Suite

2. Complete the first page of the TIBCO EMS wizard as follows:
a. In the Connection area, specify your TIBCO EMS connection settings.
b. In the Destination Name field, specify the topic or queue that you want to monitor. You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special process tracking topic. For instance, to track any JMS message that gets transmitted through TIBCO EMS, use $sys.monitor.Q.r.> For details on specifying tracking topics for TIBCO EMS, see "Chapter 13: Monitoring Server Activity" and "Appendix B: Monitor Messages" in the TIBCO EMS User’s Guide.
c. In the Destination Type field, specify whether the tracking destination is a topic or a queue.
d. (Optional) In the Message Selector field, enter a value to act as a message filter. See “Using Message Selector Filters”, page 704 for tips.


3. Click Next. SOAtest will start monitoring the messages that match the settings specified in the previous wizard page. If you run another application that sends messages to the bus, those messages will be noted in this panel. 4. When you are ready to stop monitoring, click Finish. SOAtest will then create test cases based on the verified messages.

Monitoring Intermediary Messages In addition to automatically generating functional tests from monitoring the transaction messages that touch JMS endpoints in TIBCO EMS, you can also visualize and trace messages that take place through EMS as part of the transactions that are triggered by the tests, and then dissect them for validation. For details on how to do this, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.


Creating Tests From Traffic

Creating test suites from HTTP traffic is useful for replaying messages in a plain text traffic log/trace file. For example, you can log the traffic in a server and save it to a file, then provide that file to SOAtest in order to construct a SOAP Client for each SOAP request found in that file. You can optionally create a regression control with each response to validate whether each request continues to have the expected response (the response captured in the file) when the messages are replayed from SOAtest. In addition to generating SOAP Client tools for replaying the logged SOAP requests, you can also create stubs to virtualize/replace the represented servers in your testing environment (e.g., if they are not available/accessible for testing). Message traces or logs for test case or stub creation can be captured at the network level using network sniffing tools such as the freely available Wireshark tool (http://www.wireshark.org/), or obtained by having your application log its traffic (a minimal sketch of a simple pass-through traffic logger appears below).
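If you cannot run a network sniffer, one simple way to obtain such a plain text log is to place a small pass-through logger between your client and the server. The following is a minimal sketch, not a SOAtest feature: the port, host, and file name are placeholder assumptions, the logger simply tees the raw bytes of each exchange into a file, and sockets are not cleaned up, so it is only suitable for capturing simple request/response traffic during testing.

import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class TrafficLogger {
    static final int LISTEN_PORT = 9080;                      // point your client at localhost:9080
    static final String TARGET_HOST = "service.example.com";  // hypothetical real server
    static final int TARGET_PORT = 80;

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(LISTEN_PORT)) {
            while (true) {
                Socket client = server.accept();
                Socket target = new Socket(TARGET_HOST, TARGET_PORT);
                OutputStream log = new FileOutputStream("traffic.txt", true);
                // Copy request bytes client -> server and response bytes server -> client,
                // appending both directions to the log file.
                pump(client.getInputStream(), target.getOutputStream(), log);
                pump(target.getInputStream(), client.getOutputStream(), log);
            }
        }
    }

    static void pump(final InputStream in, final OutputStream out, final OutputStream log) {
        new Thread(() -> {
            try {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                    out.flush();
                    synchronized (log) {   // keep concurrent writes from corrupting the file
                        log.write(buf, 0, n);
                        log.flush();
                    }
                }
            } catch (Exception ignored) {
                // connection closed; nothing more to do in this sketch
            }
        }).start();
    }
}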

Wireshark Tip
Once the trace is captured, highlight one of the relevant TCP packets and select Analyze> Follow TCP stream. Then, save it by clicking Save As.
To automatically create a test suite from HTTP traffic, complete the following:
1. Choose the Traffic option in one of the available test creation wizards. For details on accessing the wizards, see:

Adding a New Project and .tst File



Adding a New .tst File to an Existing Project




Adding a New Test Suite

2. In the Traffic wizard page, specify the location of the traffic file from which you want to create tests. 3. Specify what type of tests you want to generate. Available options include: •

Generate Server Stubs: Creates stubs for servers that you want to stub, or virtualize, during testing. For example, you could use this to emulate servers that are not available for testing—instead of accessing the actual servers from the test environment, you interact with the virtualized ones. SOAtest will construct a Message Stub tool for each group of structurally similar messages and configure the various messages in each group as Multiple Responses within the Message Stub. •



For details on SOAtest’s stubbing/virtualization capabilities, see “Service Virtualization: Creating and Deploying Stubs”, page 531.

Generate Web Service Clients: Creates SOAP Client tools that replay the messages represented in the file. SOAtest will construct a SOAP Client for each SOAP request found in that file.

4. (For Web service clients only) Customize generation options as needed. Available options are: •

Group similar sequential messages into tests parameterized with file data sources: Consolidates structurally similar messages (messages that differ only in text content changes in elements and attributes) into a single SOAP Client that is parameterized with a file data source. This is particularly useful when creating tests from large traffic files: it relieves you from having to create a test for every request message in the file. It also facilitates test maintenance and management, since fewer tests are created and they are better organized.

Create Regression Control: Creates regression controls for each test. This allows you to validate whether each request continues to have the expected response (the response captured in the file) when the messages are replayed from SOAtest.

5. Do one of the following: •

For Web Service Clients: Click Finish. SOAtest will import tests from the captured HTTP traffic and create a SOAP Client for each SOAP request/response pair.



For Server Stubs: Click Next, customize stub deployment settings if desired, then click Finish. SOAtest will then generate and deploy stubs that emulate the recorded traffic.


Creating Tests From a UDDI

To automatically create a test suite from a UDDI registry, complete the following:
1. Choose the UDDI option in one of the available test creation wizards. For details on accessing the wizards, see:

Adding a New Project and .tst File



Adding a New .tst File to an Existing Project



Adding a New Test Suite

2. In the UDDI wizard page, enter a new endpoint in the UDDI Inquiry Endpoint field or select a previous endpoint from the drop-down menu.
3. Select Business, Service, or TModel for the service Type in the UDDI.
4. Enter a search keyword for the UDDI service in the Query field.
5. Enter an integer in the Maximum Results field to limit the number of results from the query.


6. Click Next. The Policy Enforcement dialog opens.

7. Select the Apply Policy Configuration check box. This will create WSDL and functional tests that will enforce the assertions defined in the specified policy configuration. •

The default policy configuration, soa.policy, is a collection of industry-wide best practices. To use a custom policy configuration, you can either use the Browse button to select a policy configuration or enter the policy configuration's path in the text field. For details on policy enforcement, see “SOA Policy Enforcement: Overview”, page 570.


8. Click the Next button to advance to the Layout dialog.

9. (Optional) Select the Organize as Positive and Negative Unit Tests checkbox to create both positive and negative tests for each operation; it is important to test how the server handles unexpected data as well as expected data. The default value is Sort Tests Alphabetically.
10. (Optional) Select the Asynchronous radio button and choose Parlay, Parlay X, SCP, or WSAddressing to create asynchronous test suites. For more information on asynchronous testing, see “Creating Asynchronous Tests”, page 419.
11. Click the Finish button. SOAtest will generate a suite of test cases that test every object associated with the WSDL you entered.


Creating Tests From a WSIL

To automatically create a test suite from a valid WSIL document, complete the following:
1. Choose the WSIL option in one of the available test creation wizards. For details on accessing the wizards, see:

Adding a New Project and .tst File



Adding a New .tst File to an Existing Project



Adding a New Test Suite

2. In the WSIL wizard page, enter a valid WSIL URL in the WSIL URL field, or click the Browse button to locate a WSIL URL.

Note: The remaining steps are optional. Once you enter a valid WSIL URL, you can go ahead and click the Finish button and SOAtest will generate a suite of test cases that test every object associated with the WSDLs within the WSIL you entered. If you would like to configure the test suite further, continue to the next step.


3. Select the Create Functional Tests from the WSDL checkbox and choose the Generate Web Service Clients radio button. To create server stubs and perform client testing, see “Creating Stubs from Functional Test Traffic”, page 535. 4. To create a separate test suite that generates a series of tests to verify every aspect of the WSDL, select the Create tests to validate and enforce policies on the WSDL checkbox. 5. Click Next. The Policy Enforcement dialog opens.

6. Select the Apply Policy Configuration check box. This will create WSDL and functional tests that will enforce the assertions defined in the specified policy configuration. •

The default policy configuration, soa.policy, is a collection of industry-wide best practices. To use a custom policy configuration, you can either use the Browse button to select a policy configuration or enter the policy configuration's path in the text field. For details on policy enforcement, see “SOA Policy Enforcement: Overview”, page 570.

7. Click the Next button to advance to the Layout dialog.


8. (Optional) Select the Organize as Positive and Negative Unit Tests checkbox to create both positive and negative tests for each operation; it is important to test how the server handles unexpected data as well as expected data. The default value is Sort Tests Alphabetically.
9. (Optional) Select the Asynchronous radio button and choose Parlay, Parlay X, SCP, or WSAddressing to create asynchronous test suites. For more information on asynchronous testing, see “Creating Asynchronous Tests”, page 419.
10. Click the Finish button. SOAtest will generate a suite of test cases that test every object associated with the WSDLs within the WSIL you entered.


Creating Asynchronous Tests

In this age of flexible, high-performance web services, asynchronous communication is often used to exchange data, allowing the client to continue with other processing rather than blocking until a response is received. SOAtest comes packaged with a server that runs in the background and manages the asynchronous Call Back messages received. When creating a test suite from WSDL or WSIL documents, you can use the Layout dialog via the test creation wizard to create asynchronous tests. SOAtest supports the major asynchronous communication protocols, including Parlay, Parlay X, SCP, and WS-Addressing. After selecting the Asynchronous option from the test creation wizard, a test suite folder is created which contains automatically configured asynchronous test cases for each operation defined within the WSDL or WSIL you entered.

Notice that two asynchronous tests are created for each WSDL in the test suite. The first test is a SOAP Client test which will send an initial request to the asynchronous service. The second is a tool called the Call Back tool. Using the Call Back tool, SOAtest is able to listen for call back messages that are sent in an asynchronous messaging exchange. For more information on the Call Back tool, see “Call Back”, page 847. A local server has been integrated into SOAtest, allowing the Call Back tool to listen for these incoming messages. For this reason, it is important that the stub server is running before executing these asynchronous tests. To do this, complete the following:
1. Choose Window> Show View> Stub Server.
2. Right-click the Server node in the Stub Server tab and select Start Server from the shortcut menu. A green light next to the node indicates that the server has been successfully started.


Testing RESTful Services

SOAtest fully supports the testing of RESTful services via the REST Client tool, which was specifically designed for sending messages to RESTful services. Messages can be sent with HTTP GET, POST, PUT, DELETE, HEAD, OPTIONS, or custom methods. For more details, see “REST Client”, page 829.


Sending MTOM/XOP Messages

SOAtest enables the testing of Web services that leverage the MTOM (Message Transmission Optimization Mechanism) and XOP (XML-binary Optimized Packaging) technologies. SOAtest enables users to select the binary content to include in the transmission, as well as to send, receive, and validate optimized messages. MTOM-optimized messages can be sent using the SOAP Client in Form Input view. To enable the sending of optimized messages, complete the following:
1. In the Misc tab of a SOAP Client, select Custom from the Attachment Encapsulation Format drop-down menu and choose either MTOM Always or MTOM Optional (a sketch of the equivalent setting in a plain JAX-WS client appears after this procedure):

MTOM Always: SOAtest will always send the request in an XOP package, i.e., with MIME boundaries, even when there is no optimized content in the request.



MTOM Optional: SOAtest will only send the request in an XOP package, i.e., with MIME boundaries, when there is optimized content in the request. In the absence of optimized content, SOAtest will send a normal SOAP request. Note that MTOM Always or MTOM Optional can be selected at the Test Suite level in the SOAP Client Options tab or in the SOAP Client page of the SOAtest Preferences panel.

2. Select the Request tab and make sure Form Input is selected from the Views menu. The Form Input view is a schema-aware view. In this view, SOAtest will recognize the xsd:base64Binary schema datatype and allow you to reference the content that you want to optimize. When you click on a base64Binary type, the following options are available: •

Reference to file: This is the recommended option. This option allows you to select a file to send as optimized content. SOAtest reads the contents from the file as the contents are being sent on the wire. This way, the contents of the file do not need to be stored in memory.



Persist As Relative Path: It is always recommended that the path to the file be kept as a relative path to the Test Suite to allow for easier sharing and collaboration with the rest of the organization.



Import from file: This option reads the contents in from the file. It is not recommended, especially for big files, because the contents of the file will be loaded into memory.



Additionally, the file to be sent can be driven by data sources by selecting Parameterized from the drop down box and selecting a File Data Source. For more information about File Data Sources, see “Adding a File Data Source”, page 351.
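For context, the MTOM behavior configured above corresponds to what a plain JAX-WS client enables with the MTOMFeature. The following is a minimal sketch under assumed names (the WSDL URL, namespace, service name, and the OrderPort interface are hypothetical placeholders); it is not SOAtest code, only an illustration of how xsd:base64Binary content ends up in a XOP package when MTOM is enabled.

import java.net.URL;
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.namespace.QName;
import javax.xml.ws.Service;
import javax.xml.ws.soap.MTOMFeature;

// Hypothetical service endpoint interface; a real one would be generated from the WSDL.
@WebService(targetNamespace = "http://example.com/orders")
interface OrderPort {
    @WebMethod
    void uploadInvoice(byte[] document); // maps to an xsd:base64Binary element
}

public class MtomClientSketch {
    public static void main(String[] args) throws Exception {
        URL wsdl = new URL("http://service.example.com/orders?wsdl");               // hypothetical
        QName serviceName = new QName("http://example.com/orders", "OrderService"); // hypothetical

        Service service = Service.create(wsdl, serviceName);

        // Enabling MTOM here plays the same role as MTOM Always/MTOM Optional in the Misc tab:
        // base64Binary content is sent as a XOP attachment with MIME boundaries rather than
        // being inlined as base64 text.
        OrderPort port = service.getPort(OrderPort.class, new MTOMFeature());

        byte[] invoice = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get("invoice.pdf"));
        port.uploadInvoice(invoice);
    }
}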


Sending and Receiving Attachments

Many Web services employ attachments to send and receive data that is not well represented with XML messages, such as multimedia binary data. SOAtest can be used to send and receive attachments along with SOAP messages. Received attachments can then be processed and validated for correctness. The SOAP Client tool can be configured to send MIME, DIME, and MTOM attachments along with the request SOAP message. For more information, see “Misc Options”, page 780. The Attachment Handler tool can be used in conjunction with a SOAP Client to extract and validate incoming attachments from a response SOAP message. For more information on configuring the Attachment Handler tool, see “Attachment Handler”, page 962.


Accessing Web Services Deployed with HTTPS

To configure SOAtest to work with Web services deployed using HTTPS (HTTP over SSL), you need to identify the certificate being used for the HTTPS connection from the server, then register that certificate with SOAtest. You do this as follows:
1. Close SOAtest if it is currently open.
2. Identify the location of the server certificate used for the HTTPS connection.
3. Ensure that this certificate’s COMMON NAME parameter contains both the server’s machine name and the subdomain (for example, machine.company.com).
4. Copy the certificate to <soatest_install_dir>/plugins/com.parasoft.xtest.libs.web_<soatest_version_number>/root/lib. This directory should contain a cacerts file in which SOAtest stores trusted certificates.
5. Execute a command of the following format: keytool -import -alias <certificate_alias> -file <certificate_file> -keystore cacerts

For example, if your certificate file is named test.cert and your SOAtest installation directory is C:\Program Files\Parasoft\SOAtest\6.2, you would execute the following command from the C:\Program Files\Parasoft\SOAtest\6.2\lib prompt: keytool -import -alias serverTrustCert -file test.cert -keystore cacerts

This will import the certificate into the cacerts file with the alias "serverTrustCert".
6. When prompted to enter a keystore password, enter changeit.
7. When asked whether you want to trust this certificate, enter yes. You will then see a message indicating that the certificate has been added to the keystore.
8. (Optional) Verify that the certificate has been added to the keystore by entering the following command and reviewing the listing that it prints: keytool -list -keystore cacerts (A small Java sketch that performs the same check programmatically appears after this procedure.)

9. Launch SOAtest and try to access the service again.
If SOAtest still does not work with services deployed using HTTPS, ensure that:
1. Your server is running.
2. You used the full name of the machine when trying to communicate with this HTTPS service.
3. The server certificate was created with the full name.
4. The name on the certificate is identical to the name the client used to access it.
If you cannot satisfy the above requirements (for example, if you don’t have the necessary permissions):
1. Choose SOAtest> Preferences to open the SOAtest Preferences dialog.
2. Select Security from the left pane of the SOAtest Preferences dialog, then select the Trust all certificates option in the right pane.
3. Click OK or Apply to apply this change. SOAtest will then try to access any WSDL you specify, regardless of any problems with the certificate. However, SOAtest will still try to use the certificate while sending SOAP messages because it is required to do so.
Note: You must add certificates to the cacerts files on load test slave machines as well as on the master machine. Otherwise, SSL connections will not work when running a load test with slave machines.
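If you prefer to verify the keystore contents programmatically rather than with keytool -list (step 8 above), the standard java.security.KeyStore API can be used. The following is a minimal convenience sketch, assuming the cacerts file described above and its default changeit password; the path is a placeholder to adjust for your installation.

import java.io.FileInputStream;
import java.security.KeyStore;
import java.util.Enumeration;

public class ListTrustedCerts {
    public static void main(String[] args) throws Exception {
        // Path to the cacerts file described above; pass your own location as the first argument.
        String path = args.length > 0 ? args[0] : "cacerts";

        // keytool creates a JKS keystore, so request that type explicitly.
        KeyStore keystore = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream(path)) {
            keystore.load(in, "changeit".toCharArray()); // default keystore password
        }

        // Print every alias; after the import in step 5 you should see "serverTrustCert" here.
        Enumeration<String> aliases = keystore.aliases();
        while (aliases.hasMoreElements()) {
            String alias = aliases.nextElement();
            System.out.println(alias + " -> " + keystore.getCertificate(alias).getType());
        }
    }
}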


If none of these procedures solve your problem, contact Parasoft in one of the ways described in “Contacting Parasoft Technical Support”, page 16.

Debugging SSL Issues Parasoft SOAtest runs on a standard JVM. To show the SSL/TLS handshake details and help identify causes of SSL connection problems, enable JVM network and SSL debugging as follows: 1. Open a command line console and navigate to the SOAtest installation directory. 2. Start the SOAtest executable with the arguments: -J-Dssl.debug=true -J-Djavax.net.debug=all -consolelog

SOAtest will start as usual, but whenever SSL connections are made, debugging output will be printed on the console. If you wish to save the trace output to a file (for example, output.txt), you may append the following to the end of the command : > output.txt

For more information about managing keys and certificates using the Java keytool, refer to: •

Windows: http://java.sun.com/javase/6/docs/technotes/tools/windows/keytool.html



Linux and Solaris: http://java.sun.com/javase/6/docs/technotes/tools/solaris/keytool.html

JMS SSL See “JMS SSL”, page 703.


Configuring Regression Testing

This topic explains how to configure your functional test suite for regression testing, which helps you determine whether functionality unexpectedly changes over time. Sections include:

Understanding Regression Testing



Creating Regression Tests



Modifying Regression Options

Understanding Regression Testing The purpose of regression testing is to detect unexpected faults—especially those faults that occur because a developer did not fully understand the internal code correlations when he or she modified or extended code that previously functioned correctly. Regression testing is the only reliable way to ensure that modifications did not introduce new errors into code or to check whether modifications successfully eliminated existing errors. During regression testing, SOAtest runs specified test cases, compares the current outcomes with the previously-recorded outcomes, then alerts you to differences between the current responses and control responses. Subsequent regression test runs will remind you of this discrepancy until the application returns the expected result. We recommend that you perform functional testing to verify that your service functions correctly, then begin regression testing to ensure that system changes do not introduce errors or unexpected changes.

Creating Regression Tests You can automatically create regression controls for an entire test suite, or you can record regression controls for individual test cases. If you want to run regression testing on all or most test cases, we recommend that you automatically add regression controls to the complete test suite, then remove the controls you do not want to use. If you only want to perform regression testing on a limited number of test cases, it is more efficient to add the controls individually.

Automatically Creating Regression Controls
To have SOAtest automatically create regression controls for an entire test suite or for specific test cases:
1. Right-click the appropriate test suite or test nodes, then choose Create/Update Regression Control from the shortcut menu.
2. In the Response Validation wizard that opens, expand Create Regression Control, then specify what type of control you want to create and click Finish. A Diff control will be added to the selected test(s).

Single vs. Multiple Regression Controls: This option is only available for tests that use a parameterized value from a data source. It determines whether SOAtest creates regression controls using only one row from the data source, or all rows from the data source.



Internal vs. External Regression Controls: External regression controls allow you to manage the regression controls outside of SOAtest. If you are working with large messages, external controls are recommended because 1) they reduce the size of the .tst file and 2) they enable you to use the ExamXML Diff option, which can process very large files. Regression content in external files can also be searched, updated, scripted, and managed, since the content is readable text.
Each subsequent time SOAtest runs a test case with a Diff control, it will compare the actual outcomes with the outcome specified in the Diff controls. If a single regression control was created, a single file is saved in the same directory as the current project file. This file will contain the regression content. If multiple regression controls were created, the response associated with each data source row will be saved into a separate file. A directory with a random name will also be created, and it will contain multiple files with names such as DS_Row_001.xml. The number in this file name correlates to the row number. For example, if the data source has five rows and the test ran five times when creating regression controls, five files will be generated.

Manually Creating a Regression Control If you want to perform regression testing on a test case, but do not want to use the current outcome as the comparison value, we recommend that you define your own Diff control for that test case. To define your own Diff control for a test case: 1. Use the procedure described in “Adding Test Outputs”, page 333 to add a Diff tool. 2. Configure the Diff tool as described in “Diff”, page 899. All subsequent times SOAtest runs a test case with a Diff control, it will compare the actual outcomes with the outcome specified in the Diff controls.

Modifying Regression Options To automatically update a regression control: •

Right-click the appropriate test suite or test nodes, then choose Create/Update Regression Control from the shortcut menu.

To manually update a regression control: •

Double-click the Diff tool and modify the available options.

To customize the logic and data source usage for all regression controls in a test suite: •

Double-click the test suite’s Test Case Explorer node and modify the available options, which are described in “Regression Options”, page 325.


Validating the Value of an Individual Response Element

This topic explains how to use SOAtest’s Response Validation wizard to validate the value of an individual response element. To configure SOAtest to validate a specific element from the service’s response message:
1. Right-click the appropriate test suite or test nodes, then choose Create/Update Regression Control from the shortcut menu.
2. In the Response Validation wizard that opens, select Create Value Assertion, then click Next. The next page will show a tree representation of the expected response message from the server.
3. Select the tree node that corresponds to the element you want to validate, then click Finish. An XML Assertor tool will then be configured and chained to the test. For details on this tool, see “XML Assertor”, page 917.


Validating the Structure of the XML Response Message

This topic explains how to use SOAtest’s Response Validation wizard to quickly add an XML Validator to verify conformance of the incoming response to any XML Schemas that it is bound to. To configure SOAtest to validate the structure of an incoming response message:
1. Right-click the appropriate test suite or test nodes, then choose Create/Update Regression Control from the shortcut menu.
2. In the Response Validation wizard that opens, select Validate XML Response, then click Finish. An XML Validator tool will then be configured and chained to the test. For details on this tool, see “XML Validator”, page 862. (A plain-JAXP sketch of this kind of schema validation appears below.)
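For reference, the check performed by the XML Validator corresponds to standard XML Schema validation. The following is a minimal sketch in plain JAXP, using a hypothetical schema file and a response message saved to disk; it is not SOAtest’s own implementation, only an illustration of validating a response document against the schema it is bound to.

import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class ValidateResponseSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical files: the XML Schema the response is bound to, and a captured response.
        File schemaFile = new File("order-service.xsd");
        File responseFile = new File("response.xml");

        SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(schemaFile);

        Validator validator = schema.newValidator();
        validator.validate(new StreamSource(responseFile)); // throws SAXException on a violation
        System.out.println("Response conforms to the schema.");
    }
}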


Web Functional Tests
In this section:

Web Functional Testing: Overview



Recording Tests from a Browser



Generating JUnit Tests from a Web Browser Recording



Configuring Browser Playback Options



Configuring User Actions (Navigation, Delays, etc.)



Configuring Wait Conditions



Validating or Storing Values



Stubbing Test Requests/Responses



Running Web Scenarios in Headless Mode



Running Static Analysis as Functional Tests Execute



Customizing Recording Options



Creating Custom Locators for Validations and Extractions



Understanding Web Functional Test Errors



Creating a Report of Test Suite Maintainability


Web Functional Testing: Overview

This topic provides an overview of SOAtest’s web functional testing capabilities. Web interface testing is difficult to automate. Teams often abandon automated testing in favor of manual testing due to too many false positives or too much effort required to maintain the test suites. SOAtest facilitates the creation of automated test suites that are reliable and dependable. Its ability to isolate testing to specific elements of the Web interface eliminates noise and provides accurate results. SOAtest isolates and tests individual application components for correct functionality across multiple browsers without requiring scripts. Dynamic data can be stubbed out with constant data to reduce test case noise. Validations can be performed at the page object level as well as the HTTP message level. SOAtest also verifies the client-side JavaScript engine under expected and unexpected conditions through asynchronous HTTP message stubbing.


Recording Tests from a Browser

This topic explains how to create web functional tests by recording from a browser. Sections include:

Recording New Tests



Extending an Existing Test Scenario



Exploring the Functional Tests Generated



Reviewing Pre-Action and Post-Action Browser Contents



Exploring the Asynchronous Test Requests Generated



Handling Internet Explorer Security Settings



Tutorial

Recording New Tests Want to extend an existing Test Scenario (instead of recording a new one)? See “Extending an Existing Test Scenario”, page 433 for details on how to add new steps to an existing test scenario. To create a new functional test for a web application, complete the following: 1. Do one of the following: •

To add a test suite to a new project, open the pull-down menu for the New toolbar button (top left) then choose Project from Web Browser Recording.

In the Create a SOAtest Project wizard that opens, enter a project name in the Project name field, then click Next. •

To add a new .tst file to an existing project, select the project node where you want the new .tst file added, then choose File> New> New Test (.tst) file, enter a .tst file name and location, click Next, then choose Web> Record functional tests, then click Next again.



To add a test suite to an existing project and .tst file, select the project’s Test Suite node where you want the new test suite added, then click Add Recorded Tests.


2. In the first Record Web Functional Tests wizard page, indicate if you want to create the new functional test "from scratch" or if you want to use an existing test scenario as a starting point. •

Record starting from a referenced test allows you start recording a new functional test scenario that builds upon an existing (reusable) functional test scenario. For example, you can record one test scenario that captures the steps to log into your application, and then reference this when creating new test scenarios. This way, you don’t need to record the login steps every time that you want to create a new test scenario that requires a log in. If the log in steps change, you just need to update the one login test scenario. All related test scenarios will automatically use the updated information.

3. Complete the next Record Web Functional Tests wizard page as follows: •

Test Suite Name: Enter a test suite name.



Start Recording From: Enter the URL of the site on which you would like to perform functional testing. To record applications that "live" on the same machine as SOAtest, do not use localhost—instead, use the machine name (e.g., mymachine.parasoft.com) or the IP address (e.g., 10.10.11.11).



Generate Functional Tests: Select this option if you want SOAtest to record user actions on the page, and generate a test suite that will allow you to replay the entire scenario in Firefox and/or Internet Explorer. •

Auto Generate Response Stubs: Select this option if you want SOAtest to automatically generate stub outputs for functional tests that have asynchronous responses.



Generate Asynchronous Request Tests: While navigating a web site in Firefox, the site may use the XMLHttpRequest object, or hidden IFrame calls to asynchronously request data from a server. Selecting Generate Asynchronous Request Tests will prompt SOAtest to capture those requests and their responses, and then generate and auto-configure tests to validate these requests. For more information, see “Exploring the Asynchronous Test Requests Generated”, page 437.



Record with: Specifies which browser you want to use to record the test.



Firefox executable path: If you are running Windows, SOAtest will automatically attempt to locate a version of Firefox on your machine. If SOAtest cannot locate Firefox, or if you are running Linux, you will need to click Browse to executable. This will open a file chooser from which you may browse to the location of Firefox on your machine. SOAtest will display the version of Firefox that you have selected just below the path field. Note: SOAtest supports Mozilla Firefox 1.5 and higher.

Generate Test Maintainability Report: Specifies whether you want SOAtest to generate a report that helps you gauge the maintainability of a test suite. See “Creating a Report of Test Suite Maintainability”, page 476 for details.

4. Click the Next button. 5. (Optional) Complete the Create Environment page. •

The controls in this page allow you to specify whether environment variables automatically get added to recorded tests. For functional tests, they get used in the URL of the first navigate test. For asynchronous request tests, they get used in the endpoint and the HTTP header "Referer" of each Messaging Client that gets generated for each asynchronous request.




These variables are generated by default. If you do not want these variables generated into the recorded tests, disable the Add url variable to your existing environment option.



Name specifies the name that will be used for the environment if a new environment is created. (A new environment is created if the test suite that the tests are being generated into does not already contain at least one environment).



Prefix specifies the prefix that will be used on the environment variables that get generated into the environment and that are referenced by the tests. The text below the prefix field demonstrates what the environment variable name will look like based on the specified prefix.

6. Click the Finish button. The designated start page will open in the selected browser. •

If you configured SOAtest to record starting from a referenced test, that test will be played back in the browser before you can start adding the new test steps.

7. Specify the functionality that you want to capture by following it in the browser(s). You can click links, complete and submit forms, use the navigation bar to enter open URLs, access shortcuts, go forward or back, and so on. •

To ensure that recording works properly, wait until each page has fully loaded before performing an action.

Tip- Completing Forms To complete forms, enter the values directly into the GUI controls as if you were actually navigating the site. For instance, type in your user name and password, select radio buttons, check or clear check boxes, and so on. As you record sessions, please note: •

Password recall and auto-completion in Internet Explorer’s (Internet Options Advanced settings) are not supported during recording.



Google Toolbar's Auto Fill feature is not supported.



A "type" test may not be recorded if you type the beginning of a term into a field, but then click on a suggestion from a drop down.

8. Close the browser window(s). A new Test Suite will appear in the Test Case Explorer. This new Test Suite will contain different tests depending on your selections from the wizard’s Test Type field. For more information, see the following subsections.

Extending an Existing Test Scenario To add new steps to an existing test scenario (for instance, if you want to add more steps to the middle of a use case scenario that you already recorded): 1. Select the point in the scenario where you want to start recording from.


2. Click Add Recorded Tests.

3. In the first Record Web Functional Tests wizard page, choose Continue recording from <selected test>, then click Next. 4. Continue recording the test in the usual manner.

Exploring the Functional Tests Generated If you selected the Generate Functional Tests option in the SOAtest Project’s Record Web Functional Tests wizard, a Test Suite: Functional Tests folder is added to the Test Case Explorer. For each recording session completed, a test scenario will be added. Each test within this scenario is a Browser Testing tool that was automatically added and configured to represent one action taken in the browser. Actions taken inside forms will be placed inside a nested test scenario. Each test’s configuration is controlled by the settings available in the tool configuration panel, which can be opened by double-clicking the test node. This panel contains the following tabs: •

The Pre-Action Browser Contents tab shows browser contents before the user action (for the first test in a scenario, there are no contents to display). The contents are from the last run in which the previous browser test was successful (the post-action contents of the previous test are the pre-action for this test). If there are multiple windows, the one in which the test's user action occurs is displayed.



The User Action and Wait Conditions tabs correspond to settings that are used, in order, by each test. First, the test executes the User Action; second, it waits based on the conditions set in the Wait Conditions.

Two additional tools will be chained to each Browser Testing tool that is added: •

A Traffic Viewer that shows the HTTP traffic.



A Browser Contents Viewer that stores and displays the browser contents from the last test run—whether that test succeeded or failed.

Reviewing Pre-Action and Post-Action Browser Contents Pre-Action To see the page before the test action occurred, you use the Browser Testing tool’s Pre-Action Browser Contents tab. For example, here is the Browser Testing tool showing what the web page looked like before the Click "Athletic" test—a test that mimics a user narrowing search results to include only athletic shoes.


You can see the HTML for a given element by right-clicking that element, then choosing Inspect <Element> from the shortcut menu.


Post-Action To see the page after the test action occurred, you use the Browser Contents Viewer tool. For example, here is the Browser Contents Viewer tool showing what the same web page looked like after the Click "Athletic" test.

Colored Borders Note that various colored borders are used to highlight elements that are the source of operations such as extractions, validations, user actions, etc.


The following table explains what each colored border indicates.

Color     Marks the source of
Blue      User actions
Red       Validations
Gray      Data bank extractions
Green     Wait conditions
Purple    Validations + data bank extractions

Rendering Pages on Linux
SOAtest uses the XULRunner runtime (version 1.9.0.1) to render web pages in SOAtest (for example, the rendered pages in the Browser Contents Viewer or the Browser Functional Test). The system requirements for XULRunner are the same as the system requirements for Firefox 3. On Windows, there are no notable system requirements. On Linux, additional software is required. If these software requirements are not met, you may encounter errors (including SOAtest terminating) when you try to open an editor with a rendered page.

Exploring the Asynchronous Test Requests Generated If you select the Generate Asynchronous Requests Tests option in the Generation Options panel within the Record Web Functional Tests wizard, an Asynchronous Request Tests folder is added to the Test Case Explorer.


SOAtest detects XMLHttpRequest calls and hidden IFrames and uses them to auto-configure asynchronous request tests. IFrames are considered “hidden” if they meet any of the following criteria:

the Iframe has a width and height of either 0 or 1 pixels



the Iframe’s visibility attribute is set to hidden



the Iframes’s display attribute is set to none

If you double-click the Asynchronous Requests Tests folder, test suite options display in the configuration panel on the right.


Attached to each test is a Response Traffic> Recorded Response tool that compares the server’s response to what was recorded during test creation. If the responses are different, the test fails.

There is also a Traffic Viewer tool attached to each test that lets you view the request that was sent and the server’s response.

Handling Internet Explorer Security Settings Internet Explorer’s Enhanced Security Configuration and Protected Mode cannot be enabled while you are working with SOAtest.

Enhanced Security Configuration
On Windows Server 2003 and 2008, Internet Explorer Enhanced Security Configuration is enabled by default. The Enhanced Security Configuration will prevent SOAtest from playing back browser tests in Internet Explorer on those machines. When SOAtest starts Internet Explorer, a warning dialog will appear and the tests will not complete.


To run tests on these machines, you should disable the Enhanced Security Configuration as described in http://www.visualwin.com/IE-enhanced-security/. For more information about the Enhanced Security Configuration, see http://blogs.msdn.com/askie/archive/2009/03/24/using-internet-explorer-enhancedsecurity-configuration-on-terminal-servers.aspx. Alternatively, you can prevent this dialog from displaying by configuring the Enhanced Security Configuration so that sites are "trusted" as described in http://support.microsoft.com/kb/933991. This will allow SOAtest to play back the tests. However, the recommended approach is to completely disable the Enhanced Security Configuration.

Protected Mode If Internet Explorer is running in Protected Mode (i.e., on Windows Vista), SOAtest automatically disables Protected Mode for the Internet, Local intranet, and Trusted sites. It does not disable it for restricted sites. To re-enable Protected Mode: 1. Open Internet Explorer's Internet Options dialog. 2. Open the Security tab. 3. Select a Web content zone. 4. Check the Enable Protected Mode check box. To verify that Protected Mode has been successfully re-enabled, look for the words "Protected Mode: On" next to the Web content zone displayed in Internet Explorer's status bar.

Tutorial For a step-by-step tutorial on how to use SOAtest to perform functional testing on the Web interface, see “Web Functional Testing”, page 171.


Generating JUnit Tests from a Web Browser Recording

This topic explains how to generate JUnit tests that represent the actions recorded in a web browser. Sections include:

About JUnit Test Generation



Prerequisite



Generating JUnit Tests from a Scenario as You Record It



Generating JUnit Tests from a Previously-Recorded Scenario



Configuring License Information for JUnit Test Execution



Adding Assertion Statements to the Generated Tests

About JUnit Test Generation
SOAtest can generate JUnit tests that represent a new or previously-recorded Web functional test scenario. You can set up a functional test against a Web application, then use the generated JUnit test cases to validate test results using the JUnit framework. This gives you the flexibility of a test script and the easy-to-use interface of SOAtest—without having to learn a new test scripting language. The resulting class files are JUnit-based and depend on the following jar files, available from the SOAtest installation directory:

bcprov.jar



commons-httpclient-3.0.jar



commons-logging.jar



FESI.jar



grtlogger.jar



junit.jar



mail.jar



parasoft.jar



webking.jar



wizard.jar



xercesImpl.jar



xml-apis.jar

You also need junit.jar, which is available within the JUnit 3.x release that can be downloaded from http://www.junit.org/. Note: If the JRE used to run the JUnit test classes is 1.4.2 or lower, xml-apis.jar needs to be prepended to the boot classpath during execution using the -Xbootclasspath option. For example, the command should look like "java -Xbootclasspath/p:xml-apis.jar TestCase".

Prerequisite
Before you start generating JUnit tests, perform this one-time configuration:
1. Switch to the Java perspective (choose Window> Open Perspective> Other> Java).


2. Create a new project in the workspace and name it MyJUnitTest.
3. In the Package Explorer, right-click the new MyJUnitTest project, then choose Properties from the shortcut menu.
4. Select Java Build Path, then go to the Libraries tab and click Add External JARs.
5. Browse to <SOAtest install root>/plugins/com.parasoft.xtest.libs.web_<version>/root/, then select and add the following jars:
• bcprov.jar
• commons-httpclient-3.0.jar
• commons-logging.jar
• FESI.jar
• grtlogger.jar
• mail.jar
• parasoft.jar
• webking.jar
• wizard.jar
• xercesImpl.jar
• xml-apis.jar

6. Browse to the location of junit.jar, then select and add it. If you do not already have it, you can get junit.jar within the junit 3.x release that can be downloaded from http://www.junit.org/.

Generating JUnit Tests from a Scenario as You Record It
To generate JUnit tests from a scenario as you record it:
1. Return to the SOAtest perspective.
2. Open the pull-down menu for the New toolbar button (top left), choose Other, select SOAtest> Web> JUnit test from Web Browser Recording, then click Next.


3. Complete the Record and Generate JUnit Test wizard page as follows:
• Start Recording From: Enter the URL of the site on which you would like to perform functional testing.
Note - Recording Apps that "Live" on the Same Machine as SOAtest: To record applications that "live" on the same machine as SOAtest, do not use localhost. Instead, use the machine name (e.g., mymachine.parasoft.com) or the IP address (e.g., 10.10.11.11).
• Record with: Specifies which browser you want to use to record the test.
• Firefox executable path: If you are running Windows, SOAtest will automatically attempt to locate a version of Firefox on your machine. If SOAtest cannot locate Firefox, or if you are running Linux, you will need to click Browse to executable. This opens a file chooser from which you can browse to the location of Firefox on your machine. SOAtest will display the version of Firefox that you have selected just below the path field. Note: SOAtest supports Mozilla Firefox 1.5 and higher.
• Class Name: Enter a class name for the generated JUnit test class. Enter MyJUnit for the class name.
• Package Name: This value is optional, but we recommend that you select the project ${project_loc:MyJUnitTest}/src for the output location. If the package name does not correspond to the folder structure dictated by the output location, SOAtest will generate the necessary sub-folders.
• Generate into Output Location: Specifies the destination folder for the generated test class file.

4. Click the Finish button. The designated start page will open in the selected browser.
5. Specify the functionality that you want to capture by performing it in the browser(s). You can click links, complete and submit forms, use the navigation bar to open URLs, access shortcuts, go forward or back, and so on.
Note: To ensure that recording works properly, you must wait until the page has fully loaded before performing an action. You must wait each time the page, or some part of the page, is reloaded before performing an action.

Tip - Completing Forms
To complete forms, enter the values directly into the GUI controls as if you were actually navigating the site. For instance, type in your user name and password, select radio buttons, check or clear check boxes, and so on. As you record sessions, please note:
• Password recall and auto-completion in Internet Explorer (Internet Options> Advanced settings) are not supported during recording.
• Google Toolbar's Auto Fill feature is not supported.
• A "type" test may not be recorded if you type the beginning of a term into a field, but then click a suggestion from a drop-down.

6. Close the browser window(s). A JUnit test class will be added to the specified output location. A new project will not be created or added to the Test Case Explorer.

Generating JUnit Tests from a Previously-Recorded Scenario
To generate JUnit tests that represent a previously-recorded test scenario:
1. Return to the SOAtest perspective.
2. In the Test Case Explorer, right-click the web functional test scenario that you want to generate JUnit tests for, then choose Generate JUnit Tests from the shortcut menu.
3. Complete the Generation Options dialog, then click Finish. Available options are:
• Class Name: Enter a class name for the generated JUnit test class. Enter MyJUnit for the class name.
• Package Name: This value is optional, but we recommend that you select the project ${project_loc:MyJUnitTest}/src for the output location. If the package name does not correspond to the folder structure dictated by the output location, SOAtest will generate the necessary sub-folders.
• Generate into Output Location: Specifies the destination folder for the generated test class file.

Tip - Managing Multiple Tests
Many users find it convenient to put all tests into the same project. However, you may create multiple projects if you prefer.

Executing the Generated Tests
To execute the generated tests:
1. Go to the Java perspective.
2. Refresh the MyJUnitTest project. You should see a node representing the generated test.
3. Right-click MyJUnit.java and choose Run As> JUnit Test.
You can also execute these tests from the command line as described in the JUnit documentation.
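As a minimal sketch of such a command line invocation (not a definitive recipe), the following assumes a JUnit 3.x text runner, that the generated class MyJUnit is in the default package, and that the jars listed under "About JUnit Test Generation" plus junit.jar have been copied into the current directory. Adjust the paths and the classpath separator (: on Linux/Solaris, ; on Windows) for your environment:

java -cp .:junit.jar:bcprov.jar:commons-httpclient-3.0.jar:commons-logging.jar:FESI.jar:grtlogger.jar:mail.jar:parasoft.jar:webking.jar:wizard.jar:xercesImpl.jar:xml-apis.jar junit.textui.TestRunner MyJUnit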

Configuring License Information for JUnit Test Execution
License information is required to run JUnit tests generated by SOAtest. The license information can be set in two ways:
• If you want to use the same license as the local SOAtest installation, simply verify that the license information is configured properly under SOAtest> Preferences> License. The license information will then be detected by WebBrowser from the installation root passed to the constructor, i.e. <SOAtest install root>/plugins/com.parasoft.xtest.libs.web_<version>/root/.
• If you want to run the tests from a machine that does not have a local installation of SOAtest, or if you want to use different license information than that used by the local SOAtest installation, you can control the licensing information without having to open SOAtest and modify the preferences from the UI. To do this, pass the license information using the following constructor:

WebBrowser( String installRoot, int browserType, String ffExePath, String licenseServerLocation, int licenseServerPort, int licenseServerTimeout )
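For illustration only, adjusting the WebBrowser construction inside the generated test class might look like the following sketch. The install root, Firefox path, license server host, port, and timeout values shown here are hypothetical placeholders; the browser type constant should be copied from the value already present in the generated test.

// Minimal sketch; all literal values below are placeholders for your environment.
String installRoot = "/opt/soatest/plugins/com.parasoft.xtest.libs.web_<version>/root/";
int browserType = 0; // copy the browser type constant from the generated test
String ffExePath = "/usr/bin/firefox";
WebBrowser browser = new WebBrowser(installRoot, browserType, ffExePath,
        "license.mycompany.example", 2002, 30);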

Adding Assertion Statements to the Generated Tests
Each JUnit class generated by SOAtest consists of one test function that mimics the test sequence of a SOAtest test. Whenever the server returns a response, the test will make an assignment to the WebResponse object declared within the test function. You should insert assertion statements after these assignments to validate that the response from the server is as expected. We have created comment blocks where we recommend assertion placements within the test function. For example:

public void testA() throws Exception {
    WebConversation wc = new WebConversation();
    WebRequest req = new GetMethodWebRequest("http://mole/tests/");
    WebResponse resp = wc.getResponse(req);
    //Begin assertions
    //End assertions
    WebLink link = resp.getLinkWith("popup.html");
    link.click();
    resp = wc.getCurrentPage();
    //Begin assertions
    //End assertions
    WebForm form = resp.getFormWithName("childrenForm");
    resp = form.submit();
    //Begin assertions
    //End assertions
}

In the above JUnit test function, the comment blocks appear each time the WebResponse object is assigned a new value.
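As a minimal sketch of what you might place between those markers, the following assertions check the HTTP response code and look for a piece of page text. The expected values are hypothetical, and the example assumes the WebResponse class used by the generated tests provides HttpUnit-style accessors such as getResponseCode() and getText(); if your generated class exposes different accessors, use those instead.

//Begin assertions
// Hypothetical checks; replace the expected values with ones from your application.
assertEquals(200, resp.getResponseCode());
assertTrue("Expected welcome text on the page",
        resp.getText().indexOf("Welcome") != -1);
//End assertions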


Configuring Browser Playback Options
This topic explains how to determine what browser is used to play back previously-recorded tests. Sections include:
• About SOAtest's Browser Playback Options
• Modifying Browser Playback Settings
• Specifying the Browser at the Time of Test Execution
• Specifying the Browser from the Command Line

About SOAtest's Browser Playback Options
By default, SOAtest configures a Web functional test to be played back using the browser in which it was recorded. You can change the test to use a different browser, or both available browsers, by default. SOAtest also provides Test Configurations that allow you to specify "on the fly" which browser to use for playback. This allows you to have your test played back in the browser you recorded in by default, as well as play it back in a different browser (or both browsers) by simply running the appropriate Test Configuration. If you want to ensure that a test is played only in the browser with which it was recorded (e.g., because the web page structure is significantly different on other browsers and the scenario would need to be constructed differently there), you can configure the test to be played only in the specified browser.

Modifying Browser Playback Settings
To modify the test's browser playback settings (the settings used during playback unless another option is explicitly selected as described in Specifying the Browser at the Time of Test Execution below):
1. Double-click the scenario's Test Case Explorer node.
2. Open the Browser Playback Options tab.
3. At the top of the tab, specify the browser you want the test case played in.
• If you want to ensure that this test is never played in an alternate browser (e.g., because the web page structure is significantly different on other browsers and the scenario would need to be constructed differently on another browser), enable Run in specified browser only.
• With the Run in specified browser only option disabled, each test scenario could have a different browser playback setting, and each test could run in a different browser depending on its test setting. This allows you to have your test played back in the browser you recorded in by default, as well as play it back in a different browser or both browsers by simply running the appropriate Test Configuration.

Specifying the Browser at the Time of Test Execution
If the test does not have Run in specified browser only enabled, you can override the test's browser playback settings at the time of test execution as follows:
1. Select the test scenario's Test Case Explorer node.


2. Choose the desired Test Configuration from SOAtest> Test Using> Built-In> Functional Testing:
• Run Web Functional Tests in Both Browsers: Executes each test in both Firefox and Internet Explorer.
• Run Web Functional Tests in Browser Specified by Tests: Executes each test using the browser playback settings configured in the test scenario's Browser Playback Options tab. If you have multiple scenarios, each with different browser playback settings, this Test Configuration would run all the scenarios in the designated browser(s).
• Run Web Functional Tests in Firefox: Executes each test in Firefox. If a test was configured to run only in Internet Explorer, this configuration does not run that test.
• Run Web Functional Tests in Internet Explorer: Executes each test in Internet Explorer. If a test was configured to run only in Firefox, this configuration does not run that test.

Specifying the Browser from the Command Line
To specify the browser to be used from the command line, set com.parasoft.xtest.execution.api.web.use_browser to one of the following values:
• Both
• Firefox
• Internet Explorer
• Specified in test
For example:

com.parasoft.xtest.execution.api.web.use_browser=Internet Explorer

This is useful if you are creating your own custom Test Configuration that has different settings than the ones in the built-in Test Configurations.
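As a minimal sketch of how this might be wired into a command line run, the property could be placed in a properties file and passed to soatestcli. The file name, workspace path, project name, and Test Configuration name below are hypothetical, and the invocation assumes the standard soatestcli options for workspace, resource, Test Configuration, and local settings; confirm the exact option names against the command line interface documentation for your version.

# localsettings.properties (hypothetical file name)
com.parasoft.xtest.execution.api.web.use_browser=Firefox

soatestcli -data /path/to/workspace -resource MyTests -config "user://My Custom Web Configuration" -localsettings localsettings.properties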


Configuring User Actions (Navigation, Delays, etc.)
This topic explains how to modify the user actions simulated by a web functional test. Sections include:
• Configuring Actions
• Understanding Preset Actions
• Specifying Other Actions

Configuring Actions
To view and modify the action taken by a specific functional test:
1. Double-click the test's Test Case Explorer node.
2. In the configuration panel that opens on the right side of the GUI, open the User Action tab.
3. Review the existing actions (initially, the ones captured during test creation) and modify the settings as needed to specify the actions you want performed. You can choose from the available preset actions, or define a custom one.

Identifying Elements Associated with User Actions
The element that is the source of a user action will be highlighted with a solid blue border in the test's Pre-Action Browser Contents tab.

Changing the Target of a User Action
To quickly change the target of a user action, right-click the related element in the Pre-Action Browser Contents tab, then choose the appropriate Modify command.

If the user action that you want to change is not associated with a specific element (for instance, a "close" or "navigate" action), you can right-click anywhere in the Pre-Action Browser Contents tab, then choose Change User Action.


This opens the User Action tab, which allows you to modify the target.

Inspecting the HTML for Elements
As you create and modify user actions for page elements, you may want to inspect the HTML to determine if you are adding actions to the appropriate elements. To see the HTML for a given element, right-click that element, then choose Inspect <Element> from the shortcut menu.

Understanding Preset Actions
Navigate
Select the navigate action if you want the browser to navigate to the provided URL as though it were entered in the URL bar of the browser. If you choose this action, you can specify the following settings:
• URL: You can enter a Fixed, Parameterized (if a data source is available), or Scripted URL.
  • To enter a scripted URL, select Scripted, then click the Edit Script button to enter a script method that returns the URL that should be navigated to in the selected test (a sketch of such a method appears after these settings).




• Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.
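The following is a minimal JavaScript sketch of such a scripted URL method. It assumes the same (input, context) method signature and context.getValue(dataSourceName, columnName) call used by the Extension tool example in "Configuring Custom Validations with Scripting" later in this chapter; the host name and the "Books"/"itemId" data source and column names are hypothetical.

// Minimal sketch; the data source name, column name, and host are placeholders.
function getNavigateUrl(input, context) {
    var itemId = context.getValue("Books", "itemId");
    return "http://mymachine.example.com/store/item?id=" + itemId;
}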

Click
Select the click action if you want the browser to click the specified element. If you choose this action, you can specify the following settings:
• Element Locator
  • Use Element Properties: Select to identify an element by the following properties:
    • Element: Specifies the element name (for example, "img", "div", or "a") that the action should apply to. To allow any element, enter "Any" into this field.
    • Attribute Name: Specifies the attribute name to identify the element (for example, "title", "id", or "name").
    • Attribute Value: Specifies the expected value for the attribute supplied by the Attribute Name field. You can configure this value using one of the following mechanisms:
      • If you want to specify a fixed value, select the Fixed option, then specify the desired value in the text box.
      • If you want to use values defined in a data source, select the Parameterized option, then specify the data source column that contains the values you want to use. Note that this option is only available if the project contains at least one data source.
      • If you want to use the return value of a Java/JavaScript/Python method, select the Script option. Click the Edit button to create or edit the method(s) and choose the desired method for use from the Method drop-down menu in the popup dialog. If there are two or more methods, you can also select a different method for use from the drop-down menu in the form panel.
    • Index: Specifies which of the elements matching the previous criteria should be used. Entering "0" means that the first element that matches the "Element," "Attribute Name," and "Attribute Value" criteria will be used. Entering "1" means that the second element that matches will be used, and so on.
      • If you want to specify a fixed value, select the Fixed option, then specify the desired value in the text box.
      • If you want to use values defined in a data source, select the Parameterized option, then specify the data source column that contains the values you want to use. Note that this option is only available if the project contains at least one data source.
      • If you want to use the return value of a Java/JavaScript/Python method, select the Script option. Click the Edit button to create or edit the method(s) and choose the desired method for use from the Method drop-down menu in the popup dialog. If there are two or more methods, you can also select a different method for use from the drop-down menu in the form panel.
  • Use XPath: Enter an XPath to be used as an identifier.
  • Use Script: Enter a script that defines the desired click action.
• Key Modifiers: Specifies whether you want to mimic the user pressing the Alt, Ctrl, or Shift keys during the click.
• Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.

Type
Select the type action if you want the browser to type the specified text into the specified element. If you choose this action, you can specify the following settings:
• Value: You can enter a Fixed, Parameterized (if a data source is available), or Scripted value.
  • To enter a scripted value, select Scripted, then click the Edit Script button to enter a script method to return the value that should be typed in the selected test.
• Element Locator
  • Use Element Properties: Select to identify an element by the following properties:
    • Element name: Specifies the element name (for example, "img", "div", or "a") that the action should apply to. To allow any element, enter "Any" into this field.
    • Attribute Name: Specifies the attribute name to identify the element (for example, "title", "id", or "name").
    • Attribute Value: Specifies the expected value for the attribute supplied by the Attribute Name field. You can configure this value using one of the following mechanisms:
      • If you want to specify a fixed value, select the Fixed option, then specify the desired value in the text box.
      • If you want to use values defined in a data source, select the Parameterized option, then specify the data source column that contains the values you want to use. Note that this option is only available if the project contains at least one data source.
      • If you want to use the return value of a Java/JavaScript/Python method, select the Script option. Click the Edit button to create or edit the method(s) and choose the desired method for use from the Method drop-down menu in the popup dialog. If there are two or more methods, you can also select a different method for use from the drop-down menu in the form panel.
    • Index: Specifies which of the elements matching the previous criteria should be used. Entering "0" means that the first element that matches the "Element," "Attribute Name," and "Attribute Value" criteria will be used. Entering "1" means that the second element that matches will be used, and so on.
      • If you want to specify a fixed value, select the Fixed option, then specify the desired value in the text box.
      • If you want to use values defined in a data source, select the Parameterized option, then specify the data source column that contains the values you want to use. Note that this option is only available if the project contains at least one data source.
      • If you want to use the return value of a Java/JavaScript/Python method, select the Script option. Click the Edit button to create or edit the method(s) and choose the desired method for use from the Method drop-down menu in the popup dialog. If there are two or more methods, you can also select a different method for use from the drop-down menu in the form panel.
  • Use XPath: Enter an XPath to be used as an identifier.
  • Use Script: Enter a script that defines the desired type action.
• Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.

Wait
Select the wait action if you want the browser to wait the specified number of milliseconds before continuing with the next step in the functional test. If you choose this action, you can specify the following settings:
• Milliseconds: You can enter a Fixed, Parameterized (if a data source is available), or Scripted value.
  • To enter a scripted value, select Scripted, then click the Edit Script button to enter a script method to return the number of milliseconds to wait in the selected test.
• Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.

Close
Select the close action if you want the browser to close the specified window. If you choose this action, you can specify the following setting:
• Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.

New Browser
Select the new browser action if you want to open a new browser based on the specified start URL. If you choose this action, you can specify the following setting:
• Window Name: You may specify the name of the window you would like the action to occur in. Leaving this field blank indicates that SOAtest will use the default window.

Specifying Other Actions
You can use the "other" action to specify common commands such as:
• goback - Equivalent to the user pressing the Back button in the browser.
• mousedown - Equivalent to the user pressing the mouse on an element.
• mouseup - Equivalent to the user releasing the mouse over an element.
• mouseover - Equivalent to the user moving the mouse over an element.
• select - Equivalent to the user choosing an option in a single selection combo box.
• addselection - Equivalent to the user choosing an option in a multiple selection combo box.
• removeselection - Equivalent to the user unselecting an option in a multiple selection combo box.
The action field is optional, depending on which command is used. goback, mousedown, mouseup, and mouseover do not need it. For select, addselection, and removeselection, you need to specify the name of the option that is being selected or deselected.


Configuring Wait Conditions
This topic explains how to customize wait conditions for web functional tests. Sections include:
• Understanding Wait Conditions
• Specifying Wait Conditions
• Configuring the Order of Wait Conditions
• Upgrading WebKing Projects (pre-6.0.5) to Current SOAtest Wait Options

Understanding Wait Conditions
You can customize how long SOAtest waits after performing a user action before it moves on to the validations/extractions step of the current test, and then to the next test in the scenario. When a user interacts with a web page, the web page responds to whatever the user is doing. For example, when the user clicks a link, the page is reloaded with new content. When a user chooses a menu item, some other part of the page might refresh itself. The user instinctively waits until the page is done updating before continuing further use of the page. In fact, in most cases the user HAS TO wait for the update in order for the page element that the user is going to interact with next to be present. Using automation, however, there is no human to decide when the page is done updating. This decision has to be made automatically by SOAtest. SOAtest must wait long enough so that it does not try to continue with the testing process before the page is ready, but at the same time it must also run quickly to achieve one of the benefits of automation: speed. SOAtest automatically configures the wait conditions while it is recording. However, you may want to manually adjust or modify the wait conditions in order to get the tests to perform as desired. In many cases, multiple wait conditions will be used for a single test.

Specifying Wait Conditions


The wait conditions captured during test creation can be viewed and modified in the Wait Conditions tab. Available wait conditions include:
• Wait for Page Load: This wait condition waits until at least one page load has occurred. However, it will wait until all page loads that happen within one second of each other have finished. Once one second has passed without any new page loads starting, the wait is finished. A page load can either mean the entire page is reloading, or it can mean that a single frame is reloading. A Page Load wait condition is added to a test during recording if SOAtest detects that a page load occurs after the particular user action that causes that test to be recorded, and before the user action for the next test that is recorded.
• Wait for Asynchronous Requests: This wait condition waits until at least one asynchronous request has been made and a response is received for it. If any other asynchronous requests were begun while the first was in progress, then it waits until all asynchronous requests have completed. For this wait condition, an asynchronous request is defined as a request that is made while a page load is not occurring, and whose response is text-based. An Asynchronous Request wait condition is added to a test during recording if SOAtest detects that one or more asynchronous requests occur, outside the context of a page load, after the user action that caused that test to be recorded and before the user action for the next test is recorded.
• Wait for Element: This wait condition waits until a specified page element meets a specific condition.
  • Page Element: The page element can be specified in two ways:
    • Element for next user action: This option automatically determines which element to wait for by looking at the next browser test in the scenario and using the element that is configured in that test's User Action tab. If the current test is the last browser test in a certain scenario, then this wait condition will not perform a wait.
    • Element specified: This option allows you to manually choose which element to wait for. You can use Element Properties, XPath, or Script to choose the element.
  • Condition: You can set the following element conditions to wait for:
    • Present: This condition waits until the element is present on the page. The element may or may not be visible to the user. This is preferred over the next condition, Visible, in cases where the element does not become visible until a user mouses over some other element. This is the default condition used for element waits that are automatically added while recording.
    • Visible: This condition waits until the element is visible on the page.
    • Not visible: This condition waits until the element is either not present on the page, or is on the page but is not visible.
    • Editable: This condition waits until an input element is editable. If this condition is used on an element that is not an input, it will eventually time out because elements that are not inputs are by definition not editable.
    • Has value: This condition waits until an element has an attribute with the specified value. The attribute could be any attribute supported by the element, or "text" for the text content of the element.

An Element wait condition (wait for element present) is added during recording as the last wait condition for all tests except those that have a Script Dialog wait condition added to them. Element wait conditions are inactive (meaning that they don't wait for anything) in the following cases:
  • There are no browser tests in the scenario after the current test.
  • There are no enabled browser tests in the scenario after the current test.
  • The next browser test does not use a page element in its user action.
  • The next browser test is configured to use test suite logic. Element wait conditions are not used if the next browser test uses test suite logic, since the logic can cause the next test to not be run.
• Wait for Script Dialog: This wait condition waits until one of the following script dialogs is detected: alert, confirm, or prompt. A Script Dialog wait condition is added during recording as the last wait condition for all tests that cause a script dialog to appear.
• Wait for Specified Time: This wait condition simply waits for the specified number of milliseconds. This wait condition is not added automatically during recording.
• Wait For Time Interval Without HTTP Traffic: This wait condition waits until a specified number of seconds have passed without there being any traffic between the browser and the server. For example, if the specified time is 1 second, it finishes waiting once there has not been any traffic between the browser and the server for 1 second. This wait condition is added during recording only in conjunction with an Asynchronous Request wait condition, in cases where SOAtest detects that an asynchronous request causes other non-asynchronous requests to occur. Note: Prior to WebKing 6.0.5, this was the default wait condition. It came in two flavors – a Request Wait Time and a UI Wait Time. The individual times were able to be customized manually, but the defaults were 4000 ms for Request Wait Time and 100 ms for UI Wait Time.

In addition to manually adding new wait conditions from this tab, you can also add them automatically from the Browser Contents Viewer tool.

Adding a Wait Condition from the Browser Contents Viewer
The Browser Contents Viewer allows you to specify a wait condition graphically, from a view of the rendered Web page. To add a new wait condition from the Browser Contents Viewer tool:
1. Right-click the element for which you want to specify a wait condition.
2. Choose Add Wait for Element from the shortcut menu.
3. In the Add Wait Condition dialog, specify the details for the new wait condition, then click OK.
The wait condition added will automatically be configured to wait until the clicked-on element is present.

Identifying Elements Associated with Wait Conditions
The element that is the source of a wait condition will be highlighted with a solid green border in the Browser Contents viewer.

Configuring the Order of Wait Conditions
The wait conditions appear in order of execution in the Wait Conditions tab. You can Add, Remove, and change the order of conditions by clicking the appropriate buttons in that tab. The order of the wait conditions is important. SOAtest will execute all wait conditions in the order that they appear, regardless of whether any of the other wait conditions succeed or fail. If a wait condition fails (meaning the condition is not satisfied before the timeout for that condition), then an error message is generated. For example, for most web applications page loads typically happen before asynchronous requests are made. Therefore, the wait conditions for a test that has both a page load and asynchronous requests typically should have a Page Load wait condition appearing before an Asynchronous Request wait condition. If the order of the conditions were switched, then the Page Load condition would fail, because the Asynchronous Request wait condition will wait for asynchronous requests, which happen after any page loads occur. Then the Page Load wait condition would execute, but since there will be no more page loads it will end up timing out.
Each wait condition, other than the Wait for Specified Time condition, has a timeout. If the wait condition is not met within the timeout, an error message is reported so you know to adjust the wait conditions. However, the test will continue even if wait conditions fail. The timeout can be set to use the default timeout that is set in the preferences, or it can be customized for the individual wait condition.

Upgrading WebKing Projects (pre-6.0.5) to Current SOAtest Wait Options
By default, WebKing projects from WebKing versions prior to 6.0.5 that are opened in the current version of SOAtest will retain the old wait conditions (and thus will not benefit from the added speed and control that the new wait conditions provide). However, there is a way to automatically transform them to use the new wait conditions. To help you update wait conditions, SOAtest ships with an updateWaitConditions.py file, which is located at <SOAtest_62_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/startup. To automatically update tests to the new wait conditions, modify that script to pass true (the value "1" in jython) to the updateWaitConditions method. SOAtest must be restarted after you make this change in order for it to take effect. Once restarted, any projects that are opened that were saved with the old wait conditions will be automatically updated as follows:

• Any navigate tests are converted to have a page load wait.
• Any test that causes a script dialog to appear has a script dialog wait added to it. Any other test that does not have a script dialog wait added to it, including navigate tests, will get an element wait added to it.

Once this conversion is complete, however, it is quite likely that page load waits will need to be added for any test that causes an entire page or frame to reload. The following error messages may appear to alert you to the fact that you need to add a page load wait (these error messages signal the need for a page load wait even when not dealing with old projects):
• Unable to extract value because the previous page did not finish loading – add a page load wait to the same test this error gets reported on.
• Unable to perform user action because the previous page did not finish loading – add a page load wait to the browser test immediately before the test this error gets reported on.
• The request has an attached output, but it is no longer being requested by the browser – add a page load wait to the same test this error gets reported on.
• Any time a test gets a finished status, of either success or failure, before the page looks like it has completely finished loading, a page load wait may be needed on the test that finishes too quickly.

Wait Options Q&A
How might I know that I need to update the wait conditions, and how should I update them?
• SOAtest shows that a test executes successfully, but the action that the test is supposed to trigger does not actually happen in the browser. This can happen when an element is present on the page, but not yet visible and also not ready for user interaction. Since SOAtest's default element wait condition is to wait for the element to be present, SOAtest may trigger the action on the element before it is ready to process the action. In this case you should change the wait condition to wait for the element to be visible instead.
• SOAtest shows an error message that starts with "[Internet Explorer Script Error]", and it appears that the test that shows this message gets executed before the element it is executing the action on is visible. This can happen when an element is present on the page, but not yet visible and also not ready for user interaction. Since SOAtest's default element wait condition is to wait for the element to be present, SOAtest may trigger the action on the element before it is ready to process the action. In this case you should change the wait condition to wait for the element to be visible instead.


Validating or Storing Values
This topic explains how to extract a value for validation or store it for use in another test. Sections include:
• Understanding Extractions
• Validating or Storing a Value
• Specialized Extractions/Validations
• Setting up Validations from within the Browser
• Viewing Stored Variables Used During Test Execution
• Configuring Custom Validations with Scripting

Understanding Extractions
In the Browser Contents Viewer tool that is automatically added to each test recorded from the browser, you can click page elements within a rendered view and automatically set up functional tests on those elements. If a validation is not satisfied in a subsequent test run, the associated test will fail. In addition, you can "extract" and store data from those elements, then use those extracted values in additional tests (for instance, to populate form fields or to validate data). This allows you to easily set up functional tests on applications where dynamic data validation is important. Extracted data can be used in both Web tests and SOA tests.
NOTE: You can also extract path elements from within your Web browser WHILE you are recording a path. To do so, right-click an element from within your browser and select Configure Validations from the shortcut menu. The Validation Options dialog displays. The options within this dialog will be the same as the options in the Browser Contents Viewer tool that appears AFTER a path has been recorded and replayed at least once.

Validating or Storing a Value
To validate or store a value represented in the rendered page, complete the following from the Browser Contents Viewer's tool configuration panel (accessible by double-clicking the tool's Test Case Explorer node):
1. Right-click the page element for which you would like to create a test (for example, right-click a link), and choose Extract Value from <element> Element… from the shortcut menu.
2. In the wizard that opens, ensure that the desired element is selected in the Property name box.
3. If you want to "zoom in" on the value to extract, complete the isolate partial value wizard page.
• Sometimes you may only want to validate or send a portion of a property value to a data source. If this is the case, you can isolate the part of the property value to use by selecting the Isolate Partial Value using Text Boundaries checkbox. You then enter Left-hand and Right-hand text that serve as boundaries for the value that you want to use. The preview pane shows you what value will be used based on the boundary values that you have entered. For example, assume that the property value is "Click here to log in":




• To isolate the value "Click", leave the left boundary blank and enter " here" (including the space) in the right boundary.
• To isolate the value "here", enter "Click " in the left boundary and " to" in the right boundary (again including spaces).
• To isolate the value "in", enter "log " (including the space) as the left boundary and leave the right boundary blank.

4. If you want to validate a value:
a. Select Validate the value.
b. Choose from the following expected value options:
• equals: Validates that the property value exactly matches the expected value.
• contains: Validates that the property value contains the expected value somewhere within it.
• starts with: Validates that the property value starts with the expected value.
• ends with: Validates that the property value ends with the expected value.
• is not present: Validates that the specified property does NOT appear on the page, and will report an error if it does. This is useful for cases when a web application is showing an error message that it should not be.
c. Choose Fixed, Parameterized, or Scripted, then specify a value.
• If the Parameterized option is chosen, then you can specify a column name from that data source. When the test is run, the expected value will be taken from the appropriate row and column in the data source. Column names will only be shown for one data source, so if you have multiple data sources in your project, you will need to go to the chained Browser Validation tool and modify the data source being used at the top of that panel. If other extracted column names are available because they were set up by extracting from a different HTML page, they will also be in the list of available column names, even if the project does not define a data source.
5. If you want to send the value of the selected property to a data source (so that the value can be used later in a functional test):
a. Select Extract the value to a data bank.
b. Enter a Column Name by which you will reference this value later in the test. When the test is run, the property value will be extracted from the page and placed into a temporary data source within a column with the specified name. When later parts of the test reference the column name, the value stored in the temporary data source will be used for those parts of the test.
You can both validate and send a property value to a data source at the same time if desired.
6. Click Finish. The value will be validated or stored when the test is executed.

What if I don't see the value I want to validate or extract?
If the Browser Contents Viewer tool does not display the value you want to extract or validate (for example, because the related test failed or because the item is not visible in the rendered page, such as a title), you can manually add a Browser Validation tool or Browser Data Bank tool as described in "Adding Test Outputs", page 333.


If you configured a validation...
A Browser Validation tool will be chained to the test. This tool will perform the validation. If you later want to modify the validation, you can do so by modifying this tool's settings. The element that is the source of a validation will be highlighted with a solid red border in the Browser Contents viewer, and in the Post-Action Browser Contents tab of the Browser Validation tool.

If you configured an extraction...
A Browser Data Bank tool will be chained to the test. This tool will store the extracted value. The extracted value can be used wherever parameterized values are allowed, such as the value to type into an input in a subsequent test. If you later want to modify the stored value, you can do so by modifying this tool's settings. The element that is the source of an extraction will be highlighted with a solid gray border in the Browser Contents viewer, and in the Post-Action Browser Contents tab of the Browser Data Bank tool.

If you configured both...
A Browser Validation tool and a Browser Data Bank tool will be chained to the test as described above. In addition, a dotted purple border will be used to highlight the source element.

Specialized Extractions/Validations
Validating or Extracting Text
To validate text that appears on a page (or to extract it to a browser data bank), complete the following:
1. Select the text you want to validate or extract.
2. Do one of the following:
• To configure a validation for that text, right-click the selection and choose Validate Selected Text from the shortcut menu.
• To configure an extraction for that text, right-click the selection and choose Extract Selected Text into Data Bank from the shortcut menu.
3. Ensure that the desired validation/extraction settings appear in the dialog that opens.
4. Click Finish.

Validating Color Elements
To create a test that validates a color on a page, complete the following:
1. Right-click the page element for which you would like to create a test (for example, right-click a link), and choose Extract Value from <element> Element… from the shortcut menu.
2. In the wizard that opens, ensure that style_color is selected in the Property name box, then click Next two times.
3. In the Validate or Store Value wizard page, select matches color from the Expected Value drop-down menu and enter a color in the text field (e.g., "red"). The matches color option validates color values corresponding to names of colors specified in the validation colors mapping file. These mappings are either in hex notation or RGB notation -- rgb(0, 255, 0). For more information, see "Validation Colors Mapping File", page 462.
4. Click Finish. The color validation will be performed when the test is executed.

Validation Colors Mapping File
There is a file in the product installation called the Validation Colors Mapping file. This file defines how SOAtest validates colors by name. It is located in <SOAtest_62_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/validation/validationColors.txt.
Each line of the file defines a color by name, along with ranges for each component of the RGB color model. The specified ranges tell SOAtest that if a color is validated and falls within the ranges for each of the components of the RGB color model, then the color being validated matches the color that is defined within those ranges. For example, a line in this file may look like the following:

red, b0-ff, 00-30, 00-30

This line defines the valid ranges for the color "red". The ranges are specified using hex notation. In the above example, the valid R range for red is between the hex values b0 and ff. The valid G and B ranges for red are between the hex values 00 and 30. In other words, if an element has a hex value of #c80000, then it will be considered to be red, since the R value, which is c8, falls between b0 and ff, and each of the G and B values, which are 00, fall between 00 and 30. However, if a validation is set up on an element that is expected to be red, but the element's color has a hex value of #909090, then SOAtest will display a message that the element has the incorrect color. The mapping file has a few standard colors already defined. However, if you would like to specify additional colors, you can simply modify the file. There must be only one color defined per line. Also, if you want to change the valid RGB ranges for a defined color, you can modify the mapping file. Ranges can be specified with a hyphen (b0-ff, as already described), or they can be a single value (ff). If they are a single value, the range of valid values only includes that one value. As mentioned before, the ranges must also be in hex notation. SOAtest must be restarted in order for changes to this file to take effect.
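As an illustration only, entries you add for additional colors would follow the same name-plus-three-ranges format. The "green" and "blue" lines below are hypothetical examples, not entries shipped with the product; their ranges were chosen by analogy with the "red" line above and should be tuned to your application's palette.

red, b0-ff, 00-30, 00-30
green, 00-30, b0-ff, 00-30
blue, 00-30, 00-30, b0-ff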

Validating Style Properties
To validate a style property, open the Browser Validation tool's configuration panel, then set the validation's Element Property to the value "style_" + <the JavaScript name of the property>. For example, to validate the text-decoration style property, you would specify style_textDecoration (textDecoration is the JavaScript way to specify the style property text-decoration) in the Element Property field, and specify the desired value of the property using the Expected Value controls. In the text-decoration case, the expected value might be equal to line-through or underline.

Validation Styles List File
If you want a certain style property to display as an available property in the validation wizard, you can add that style to the Validation Styles List. The Validation Styles List file, located in <SOAtest_62_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/validation/validationStylesList.txt, specifies runtime style properties that can be validated. The format of this file is to have one property per line. By default, the color property is specified in this file; however, you can add any valid style properties that you would like to validate. SOAtest must be restarted in order for changes to this file to take effect. Once it is restarted, you will see the properties specified in this file in the validation dialog when right-clicking an element (in the Firefox browser during recording, or in the rendered view after recording). The properties will have "style_" prefixed to them to tell you (and SOAtest) that they refer to the runtime values of individual style properties. A validation set up using one of these properties will validate the runtime value of that style property. This is the runtime value of the property after all inline styles and styles defined in CSS files have been applied. Because of this, the value may differ from what is defined inline in the element. For example, these runtime style validations allow you to validate the actual color that is seen by a user after all styles have been processed by the browser.
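For illustration, a modified validationStylesList.txt might look like the following sketch. Only the color entry ships by default; the other property names are hypothetical additions and must use the JavaScript names of the style properties (as described under "Validating Style Properties").

color
backgroundColor
fontSize
textDecoration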

Setting up Validations from within the Browser
If you are recording with a Firefox browser, it is possible to set up validations on elements while you are recording. In order to do so, begin recording a functional test scenario. When the browser is open, right-click the element you'd like to create a validation for and select Configure Validations from the shortcut menu. The Validation Options dialog displays. This dialog is nearly identical in functionality to the dialog that is opened by right-clicking within the Browser Contents Viewer tool from within SOAtest. There is only one major difference: in order for the validation to be created, the Validate Value check box (found when selecting Validate Value from the list), the Extract to Data Source check box (found when selecting Extract to Data Source from the list), or both must be checked. It is possible to set up validations on multiple properties of the same element with the same dialog.

Viewing Stored Variables Used During Test Execution
You can configure the Console view (Window> Show View> Console) to display the stored data bank variables used during test execution. For details, see "Console view", page 35.

Configuring Custom Validations with Scripting
If you want to perform complex validations that cannot be properly represented using the GUI controls, you can express them using scripting. For example, suppose that you want to validate all of the rows in a table. That table could be of variable length. You can attach an Extension tool to a browser functional test and pull values from the document provided by input.getDocument(). Here's a sample JavaScript script that accomplishes that:

var Application = Packages.com.parasoft.api.Application;
var WebBrowserTableUtil = Packages.webking.api.browser2.WebBrowserTableUtil;
var WebKingUtil = Packages.webking.api.WebKingUtil;

// Verify that all values in a table column are equal to a previously
// extracted value. For example, we searched for all places in which
// widget 11 is sold, and we want to make sure that all results are
// for widget 11.
function validateTable(input, context) {
    // Column we want to validate.
    var columnIndex = 0;
    // We extracted the value we want to use for comparison through a Data Bank
    // extraction in a previous step.
    // Must prepend extracted column name with "Extracted: ".
    var expectedValue = context.getValue("dsExtracted", "Extracted: testValue");
    var document = input.getDocument();
    // Table should have some unique identifying attribute (e.g., id).
    var table = WebBrowserTableUtil.getTable("id", "mytable", document);
    // If the first row of the table contained column headers, we could
    // use getCellValuesForColumn(String, Element).
    var values = WebBrowserTableUtil.getCellValuesForColumn(columnIndex, table);
    if (values.length == 0) {
        context.report("No rows found!");
    }
    for (var i = 0; i < values.length; ++i) {
        if (values[i] != expectedValue) {
            var errorMessage = "Table column '" + columnIndex + "': " +
                "Expected value '" + expectedValue + "', but found '" +
                values[i] + "'.";
            context.report(errorMessage);
        }
    }
}


Stubbing Test Requests/Responses
This topic explains how to stub test requests/responses for web functional tests. Sections include:
• Understanding Stubs for Web Functional Tests
• Creating Stubs
• Configuring the Browser Stub Tool

Understanding Stubs for Web Functional Tests
Testing applications with dynamic data can cause many false positives, creating extra overhead for the developers and QA who have to determine which failures are real and which are "noise." To solve this, SOAtest has the ability to "stub" data that is sent back to the client. Stubbing helps you verify that changes to the client-side code do not affect the final resulting HTML page. A stub is static data that SOAtest saves when recording a functional test scenario through a web site. Since the data that will be fed to the client is unchanging, any new errors that occur while processing the data can be attributed to changes in the client-side JavaScript that processes the data.

Creating Stubs
While recording a functional test, SOAtest keeps track of each request made by the client, as well as the response. To create a stub, complete the following:
1. Right-click the test from which you would like to return the static data and select Add Output from the shortcut menu.
2. In the wizard that opens, choose HTTP traffic, then click Next.
3. In the next page, choose the browser request that you want to stub.
4. In the left panel, select Both - Stub Request/Response.
5. In the right panel, select any Browser Stub.


6. Click Finish.

Configuring the Browser Stub Tool
To configure a Browser Stub tool that has been added to a functional test:
1. Double-click the Stub Request/Response for <URL> node that was added to the test.
2. In the Browser Stub test configuration panel, you can modify the following options:




• Name: Specifies the name for the Browser Stub.
• Response Header View: Customizes the returned request/response headers. Select one of the following from the drop-down menu:
  • Literal: This view allows you to modify the raw text returned from the request.
  • Form: This view provides the following sub-options:
    • General: Selecting this option from the list will allow you to modify the version of HTTP and response code that is returned.
    • Request/Response Headers: This panel lets you add/remove/modify the list of headers that are sent to the server, or received by the client. To add a new header, click the Add button. This will add a new entry to the list that can be modified by clicking the Modify button. Clicking Modify will open a dialog with two text fields. The top field is for the header name, and the bottom field is for the header value. Click OK to make the changes permanent. To remove headers, select the header you wish to remove and click the Remove button.
    • URL Parameters: This option, only available if configuring a request, allows you to add/remove/modify any arguments to be placed into the URL when the browser makes its request to the server.
  • Parameterized: This option will allow you to return values stored in data sources created in the project. For more information on setting up data sources, see "Parameterizing Tests (with Data Sources or Values from Other Tests)", page 345.
  • Scripted: This option allows you to use a custom script to return the correct values. This option is identical to using an Extension tool in SOAtest. For more information, see "Extension (Custom Scripting)", page 960.
• Response Body View: Customizes the returned request/response body. Select one of the following from the drop-down menu:
  • Literal: This view allows you to modify the raw text returned from the request.
  • Parameterized: This option will allow you to return values stored in data sources created in the project. For more information on setting up data sources, see "Parameterizing Tests (with Data Sources or Values from Other Tests)", page 345.
  • Scripted: This option allows you to use a custom script to return the correct values. This option is identical to using an Extension tool in SOAtest. For more information, see "Extension (Custom Scripting)", page 960. (A sketch of such a script appears after this list.)
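The following JavaScript fragment is a minimal sketch of a scripted response body, not a definitive recipe. It assumes the same (input, context) method signature used by the Extension tool example in "Configuring Custom Validations with Scripting" and that the method's return value is used as the stubbed body; the JSON payload is a hypothetical example.

// Minimal sketch; returns a static payload to the client instead of live server data.
function getStubbedBody(input, context) {
    // Hypothetical static JSON content; replace with data captured from your application.
    return "{ \"widgetId\": 11, \"available\": true }";
}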


Running Web Scenarios in Headless Mode This topic explains how to run web scenarios without opening a browser, and discusses special configuration steps for configuring this mode on Linux and Solaris. Sections include: •

Running in Headless Mode



Linux and Solaris Configuration

Running in Headless Mode SOAtest can run web scenarios in "headless mode"—where the browser is not shown. In command line mode (using soatestcli), SOAtest always runs web scenarios in headless mode. You can also configure scenarios executed from the SOAtest GUI to run in headless mode. To do this: 1. Open the scenario’s configuration panel. 2. In the Browser Playback Options tab, choose the Headless option. By running tests in headless mode, you can work without the distraction of browser windows opening and closing.
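For example, a headless command-line run might look like the following (an illustrative sketch only; the exact soatestcli options available, as well as the workspace, project, configuration, and report values shown, depend on your installation and are covered in the command-line execution documentation):
$ soatestcli -data /home/user/workspace -resource MyProject/WebScenarios.tst -config "user://Web Regression" -report /home/user/reports/report.xml
Because this is a command-line invocation, the web scenarios in the referenced .tst file run headlessly, as described above.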

Linux and Solaris Configuration To run web scenarios in headless mode in Linux and Solaris, SOAtest creates its own hidden X display. SOAtest uses the X server Xvfb to create a virtual framebuffer that requires no display hardware. The SOAtest installation includes a copy of Xvfb for each of Linux and Solaris (the files Xvfb_Linux and Xvfb_Solaris, respectively). SOAtest will use a system-installed Xvfb if it exists. SOAtest searches the following paths, in order: /usr/bin/Xvfb /usr/X11R6/bin/Xvfb /usr/X/bin/Xvfb /usr/openwin/bin/Xvfb
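To quickly check whether your distribution already provides Xvfb in one of these locations, you can run a small shell check (purely illustrative; it simply tests the paths listed above):
$ for p in /usr/bin/Xvfb /usr/X11R6/bin/Xvfb /usr/X/bin/Xvfb /usr/openwin/bin/Xvfb; do [ -x "$p" ] && echo "found: $p"; done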

If your distribution does not provide Xvfb, SOAtest will use its own copy of Xvfb. This may require some configuration in the form of command line arguments. The following describes how to configure the Xvfb supplied by SOAtest to start with the appropriate arguments. If SOAtest cannot start Xvfb, then it will simply run the browser in the display specified by the system environment variable $DISPLAY. In other words, if you are starting tests from the SOAtest GUI, the browser will appear in the same display as the GUI and SOAtest will output any startup error message from Xvfb to the Console view. If you are running tests from the command line and there is no available display, then SOAtest will not be able to start Firefox; any web scenarios will therefore fail.

Getting Xvfb working independently of SOAtest To get Xvfb to work when invoked from SOAtest, first get Xvfb working independently of SOAtest. This how-to assumes that you need to configure Xvfb_Linux; however, the process is the same for configuring Xvfb_Solaris.

(1) Start Xvfb on display :20. Use a different display argument if :20 is not available.


$ cd /path/to/soatest/plugins/com.parasoft.xtest.libs.web_[version].0/root
$ ./Xvfb_Linux :20 -ac

If you get any error messages, you will need to specify command line arguments for the location of various X-related files whose location will be dependent on your distribution. When trying Xvfb_Linux on Fedora Core, this was the error message:
Couldn't open RGB_DB '/usr/X11R6/lib/X11/rgb'
error opening security policy file /usr/X11R6/lib/X11/xserver/SecurityPolicy
Could not init font path element /usr/X11R6/lib/X11/fonts/misc/, removing from list!
Could not init font path element /usr/X11R6/lib/X11/fonts/Speedo/, removing from list!
Could not init font path element /usr/X11R6/lib/X11/fonts/Type1/, removing from list!
Could not init font path element /usr/X11R6/lib/X11/fonts/CID/, removing from list!
Could not init font path element /usr/X11R6/lib/X11/fonts/75dpi/, removing from list!
Could not init font path element /usr/X11R6/lib/X11/fonts/100dpi/, removing from list!
Fatal server error: could not open default font 'fixed'

(2) Find the necessary X server files and add command line arguments to successfully start Xvfb. This is what worked on Fedora Core:
$ ./Xvfb_Linux -ac -sp /usr/lib/xserver/SecurityPolicy -fp /usr/share/X11/fonts/misc -co /usr/share/X11/rgb -screen 0 1024x768x24 :20
A virtual frame buffer is now running on display :20.

(3) Try running a simple X application on the display you have created. $ xclock -display :20 &

Verify that the clock is visible in display :20 by dumping an image of display :20 and viewing the image. $ xwd -display :20 -root | xwud

You should see a clock. See the respective man pages for more information on xwd(1) and xwud(1).

(4) Run Firefox on the display you have created. $ firefox --display :20

Use the same xwd/xwud command to verify that Firefox is running in the virtual frame buffer. $ xwd -display :20 -root | xwud

If you can create an X display using Xvfb but Firefox fails to run on this virtual framebuffer, you may need to update various libraries external to SOAtest. For example, an outdated Cairo library (for 2D graphics) provided by the distribution has been known to cause problems on both Linux and Solaris. If you have problems with Cairo, make sure the virtual framebuffer uses a display depth greater than 8 bits because this will solve a common problem. If troubles persist, you may need to update the library itself.

Getting Xvfb working with SOAtest Once you determine the arguments that you need to pass to Xvfb, put these arguments to use in SOAtest. To do so, create a shell script named Xvfb_Linux that invokes Xvfb with the necessary arguments. SOAtest will then run the shell script when trying to start Xvfb.

(1) In /path/to/soatest/plugins/com.parasoft.xtest.libs.web_[version]/root rename Xvfb_Linux to Xvfb_Linux.bin. $ mv Xvfb_Linux Xvfb_Linux.bin


(2) Create a shell script Xvfb_Linux.sh that will run Xvfb_Linux.bin with the appropriate arguments. This is what worked on Fedora Core:
-----
#!/bin/sh
#
# Use $@ to pass along any argument specified by SOAtest.
# The $@ will include the display on which to run the
# server. Currently this is always display :32. You can
# override this by adding another display number as the
# last argument.
#
# There can be problems with graphics libraries if you do
# not set the display depth to greater than 8 using the
# -screen argument.
#
# Make sure to run Xvfb with 'exec' so that SOAtest will
# kill the correct process.
XVFB_DIR=`dirname $0`
exec ${XVFB_DIR}/Xvfb_Linux.bin \
  -ac \
  -sp /usr/lib/xserver/SecurityPolicy \
  -fp /usr/share/X11/fonts/misc \
  -co /usr/share/X11/rgb \
  -screen 0 1024x768x24 \
  $@
-----

(3) Create a symbolic link to Xvfb_Linux.sh. $ ln -s Xvfb_Linux.sh Xvfb_Linux

Then, when SOAtest runs Xvfb_Linux, it will run the script that passes the appropriate arguments to Xvfb_Linux.bin. Alternatively, you could name the script Xvfb_Linux. However, using the symbolic link makes it easier to differentiate between what SOAtest installed and what you created to run the installed Xvfb.

(4) Run a web scenario in headless mode. While SOAtest is running the scenario, use ps(1) to check if Xvfb_Linux.bin is running, and then use xwd/xwud again to verify that Firefox is running. (SOAtest will close Firefox as soon as it completes the test run.) $ ps -ef | grep Xvfb $ xwd -display :32 -root | xwud

If you configured Xvfb to run on a different display, use that when invoking xwd(1).


Running Static Analysis as Functional Tests Execute SOAtest can perform static analysis (check links, HTML well-formedness, spelling, accessibility, etc.) on the Web pages that the browser downloads as Web functional tests execute. For details, see “Performing Static Analysis”, page 578.


Customizing Recording Options This topic explains how to exercise greater control over the way that SOAtest records Web functional tests. SOAtest allows you to customize the clickable elements that can be recorded during functional test creation. The scripts to modify are located in the following directories: •

For Internet Explorer: <SOAtest_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/browsers/ie/UserCustomizableOptions.js



For Firefox: <SOAtest_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/browsers/ff/extensions/[email protected]/chrome/content/UserCustomizableOptions.js

SOAtest uses the array variables defined in UserCustomizableOptions.js during recording. The following are the variables currently available in this script: •

Recorder.clickableAttributes: This variable defines the attributes of HTML elements that SOAtest looks for when determining whether it should record a click functional test on an element. For example, the onclick attribute is used to initiate script execution and is a good candidate for this array.



Recorder.clickableTags: This variable defines the HTML tags that SOAtest will consider when recording click functional tests. In an Ajax web application, there are cases where clicking tags such as span or li causes certain functionality to execute on the client side. This variable is used to define such tags.



Recorder.clickableInputTypes: This variable defines the form input types that SOAtest will consider when recording click functional tests. Types such as text and textarea are not considered clickable by default because the user usually clicks them just to gain focus and enter text.



Recorder.disallowedTags: This variable contains a list of tags that will never be recorded, even if they satisfy other recording criteria.



LocatorBuilders.order: This variable defines the order that SOAtest uses to create the locator for a functional test. A locator is created to identify the HTML element on the page on which the user action should take place. In a functional test, a locator is necessary during playback to repeat the user action. The order is constructed so that visual attributes of an element are favored when creating the locator.
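As an illustration only, customizations of these variables might look like the following; the exact contents and initialization style of UserCustomizableOptions.js in your installation may differ, so back up the original file and adapt these hypothetical lines to the arrays you actually find there:
// Hypothetical additions to UserCustomizableOptions.js
Recorder.clickableTags.push("span");              // record clicks on <span> elements
Recorder.clickableTags.push("li");                // record clicks on <li> elements
Recorder.clickableAttributes.push("onmousedown"); // treat elements with onmousedown handlers as clickable
Recorder.disallowedTags.push("body");             // never record clicks directly on <body>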


Creating Custom Locators for Validations and Extractions This topic explains how the CreateXPath hook allows you to create custom locators that can be used to identify elements when recording functional tests. These custom locators can then be used for validations and extractions. Sections include: •

About Hooks



About the CreateXPath Hook

About Hooks Customized hooks can be used to record or modify the values passed at specific points in SOAtest execution. Hooks are defined and customized in scripts using Python, JavaScript, or Java methods. The same file can define multiple hooks. If you add more than one method to a hook, all methods defined for that hook will be executed when the hook is called. You can create, apply, and invoke scripts that define hooks in the same way that you create, apply, and invoke any other script in SOAtest: upon startup (only for JavaScript and Python scripts), by creating and applying an Extension tool, and by adding scripts to a specific path node. You can invoke hooks at different times to elicit the desired functionality. For example, if you want to use a script’s hook functionality for all SOAtest projects and sites, you could add the JavaScript or Python script that defines and uses that hook to the <soatest_install_dir>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/startup directory; then, any time the program calls the hook, the associated user-defined methods will be executed. The methods will be executed until you call clear() on the hook. For a complete description of general hook options, see the SOAtest Extensibility API (available by choosing SOAtest> Help> Extensibility API).
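For illustration, the basic pattern for registering a method on a hook from a JavaScript script is sketched below; it uses the CreateXPath hook described in the next section, and the method body shown is just a placeholder:
// Minimal hook registration sketch (JavaScript)
var app = Packages.com.parasoft.api.Application;

function myHookMethod(node, doc) {
    // custom logic goes here; see the full CreateXPath example below
    return null;
}

var hook = app.getHook("CreateXPath");
hook.set(myHookMethod);  // the method now runs whenever the hook is called
// hook.clear();         // call clear() when the method should no longer be invoked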

About the CreateXPath Hook When recording a functional test, SOAtest identifies the elements that you interact with by certain attributes of those elements. For example, SOAtest will use the ID of an element, or the text content of an element to locate that element. If you need to extend the attributes used to identify elements during functional testing, you can use the CreateXPath hook. The CreateXPath hook takes two arguments. The first is an instance of org.w3c.dom.Node, and is the node that the locator will be built for. The second is an instance of org.w3c.dom.Document, and is the document where the node was found. The CreateXPath function should return an XPath that identifies the node in the document, or null if no XPath can be created. For example, if you want to identify elements by their "class" attribute, you could use the CreateXPath hook to create a custom identifier that checks elements for a "class" attribute and, if it finds one, returns an XPath that identifies the element by the value of the "class" attribute. When selecting an element to validate, or clicking an element during recording, SOAtest would then use this custom hook. The element would be passed to the custom hook, which would then check the element for a "class" attribute. If it found one, it would return an XPath using the "class" attribute value to identify the element. Here is a sample script that identifies elements using the "class" attribute and an index (when more than one node has the same "class").


app = Packages.com.parasoft.api.Application;
settings = Packages.webtool.browsertest.XPathCreator;

// establish the locator builder priority order
// we use our custom locator after using the id, name and attribute locators
// and before using the position xpath locator
settings.locatorBuildersOrder[0] = settings.ID_LOCATOR;
settings.locatorBuildersOrder[1] = settings.NAME_LOCATOR;
settings.locatorBuildersOrder[2] = settings.ATTRIBUTE_XPATH_LOCATOR;
settings.locatorBuildersOrder[3] = settings.HOOK_LOCATOR;
settings.locatorBuildersOrder[4] = settings.POSITION_XPATH_LOCATOR;

// gets the index of the node out of a list of nodes returned using xpath.
function getIndex(xpath, node, doc) {
    index = -1;
    if (doc) {
        try {
            list = Packages.org.apache.xpath.XPathAPI.selectNodeList(doc, xpath);
            count = 0;
            while (index == -1 && count < list.getLength()) {
                candidate = list.item(count);
                if (candidate.equals(node)) {
                    index = count;
                } else {
                    ++count;
                }
            }
        } catch (e) {}
    }
    return index;
}

// the hook function to link to the CreateXPath hook.
// looks for the class attribute and uses it (along with an index) to create
// a locator for the given node.
function createXPathHook(node, doc) {
    nodeName = node.getNodeName();
    attributes = node.getAttributes();
    classValue = attributes.getNamedItem("class");
    if (classValue) {
        xpath = "//" + nodeName + "[@" + classValue + "]";
        index = getIndex(xpath, node, doc);
        if (index != -1) {
            return xpath + "[" + index + "]";
        }
    }
    return null;
}

// sets up the hook
function setHook() {
    hook = app.getHook("CreateXPath");
    hook.set(createXPathHook);
}
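To put a script like this into effect for all recordings, you could save it under a name of your choosing (for example, createClassXPath.js, a hypothetical name) in the root/startup directory mentioned in "About Hooks" above, or apply it through an Extension tool so that it runs as part of a specific test suite.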


Understanding Web Functional Test Errors This topic introduces some common web functional test errors. There are a few types of errors, with varying severity, that you may encounter when running these particular types of tests: •

Unable to perform user action: Element <identifier> not found – The element on which the specified action was to take place was not located on the page at all. This error is fatal if it relates to an action (click, type, etc.), and will stop execution of the test suite. If this error occurs during an extraction, then the test suite will continue to run after reporting the error.



Firefox browser not installed: This error is due to SOAtest either not being able to locate a version of Firefox via the registry (Windows only), or the supplied path is not pointing to a viable Firefox executable. SOAtest supports Mozilla Firefox 1.5 and higher.



Your current version of <browser> is unsupported by SOAtest: This error is the result of attempting to run a version of Firefox or Internet Explorer that is unsupported by SOAtest. SOAtest supports Mozilla Firefox 1.5 and higher and Internet Explorer 6.0 and higher.


Creating a Report of Test Suite Maintainability This topic explains the purpose of SOAtest’s Test Maintainability report, and describes how to access it. Sections include: •

Understanding the Test Maintainability Report



Generating a Test Maintainability Report

Understanding the Test Maintainability Report How element locators get created is one of the greatest barriers to creating maintainable web functional tests that withstand changes to a web application over time. If page elements are created with unique attributes such as an id, then a test can be generated that uses that id to locate the element. No matter where the element moves on the page, it can still be located because the id is unique. However, these kinds of unique identifiers are not often used in web applications. XPaths are commonly used instead, but tests that use XPaths are difficult to maintain. Why? If page elements get rearranged, the XPaths can quickly become invalid. To help you gauge the maintainability of a test suite, SOAtest provides a Test Maintainability report that shows a summary of web functional test suites. A warning is generated for each web functional browser test with low maintainability—in other words, a test that uses one or more XPaths, which can make test maintenance difficult as the web application evolves.
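For illustration (with hypothetical page elements): a test that locates a button using an id-based locator such as //*[@id='submitOrder'] continues to work even if the button is moved elsewhere on the page, whereas a position-based locator such as /html/body/div[2]/table/tr[3]/td[1]/input breaks as soon as the surrounding layout changes. The report flags tests that depend on the latter style of locator.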

Clicking on a warning opens a dialog that provides more detail.


Generating a Test Maintainability Report A Test Maintainability report can be generated in two ways: •

Right click on a root .tst file in the Test Case Explorer, then choose View Test Maintainability Report from the shortcut menu.



When creating a new web functional test from browser recording (as described in “Recording Tests from a Browser”, page 431), select the Generate Test Maintainability Report checkbox in the browser recording wizard. The report will open automatically when the recording is completed.


Security Testing In this section: •

Security Testing: Introduction



Authentication, Encryption, and Access Control



Penetration Testing


Security Testing: Introduction This topic provides an overview of how SOAtest facilitates two types of security testing: •

Authentication, encryption, and access control (i.e., runtime security policy validation).



Hybrid security analysis, which integrates penetration testing with runtime error detection.

Sections include: •

Authentication, Encryption, and Access Control



Hybrid Security Analysis

Authentication, Encryption, and Access Control SOAtest assists with runtime security policy validation by enabling execution of complex authentication, encryption, and access control test scenarios. SOAtest includes security support for testing services with security layers. At the transport level, SOAtest supports SSL (both server and client authentication), basic, Digest and Kerberos authentication. At the message level, SOAtest supports WS-Security including X509, SAML, Username security tokens, XML Encryption and XML Digital Signature. It allows for security token validation as well as negative tests that ensure proper enforcement of message integrity and authentication.

Learning More For details on how to perform this validation, see “Authentication, Encryption, and Access Control”, page 479.

Hybrid Security Analysis SOAtest’s hybrid security analysis takes the functional tests that you and your team have already defined and uses them to perform a fully-automated assessment of where security attacks actually penetrate the application. This hybrid analysis: 1. Uses penetration testing to automatically generate and run penetration attack scenarios against your existing web or service functional test scenarios. 2. Uses runtime error detection to monitor the back-end of the application during test execution in order to determine whether security is actually compromised (and whether other runtime errors occur as well). 3. Correlates each reported runtime error with the functional test that was being run when the error was detected—allowing you to trace each reported error to the specific use case related to the problem. The two key components of hybrid analysis—penetration testing and runtime error detection—can also be used independently of one another.


Penetration Testing SOAtest’s penetration testing can generate and run a variety of attack scenarios (such as Parameter Fuzzing, SQL and XPath injections, Cross Site Scripting, XML Bombs, and more) against your functional test suites. For instance, if you are not able or ready to configure your application server for runtime error detection, you can still use penetration testing to generate and run attack scenarios, then use alternative strategies to determine if the attacks caused security breaches.

Runtime Error Detection SOAtest’s runtime error detection monitors the application from the back-end as SOAtest tests execute and alerts you if security breaches or other runtime defects (such as race conditions, exceptions, resource leaks) actually occur. You may want to perform runtime error detection both with and without penetration testing. This way, you can ensure that error detection covers both the exact use case functionality captured in your test cases as well as the simulated attacks based on this functionality.

Learning More For details on how to perform these analyses, see “Penetration Testing”, page 484 and “Performing Runtime Error Detection”, page 490.


Authentication, Encryption, and Access Control This topic explains how SOAtest assists with runtime security policy validation by enabling execution of complex authentication, encryption, and access control test scenarios. Sections include: •

Overview



Tutorial



Related Topics



Testing Oracle/BEA WebLogic Services with WS-Security

Overview To help you ensure that your security measures work flawlessly, SOAtest contains a vast array of security tools and options that help you construct and execute complex authentication, encryption, and access control test scenarios. For example: •

XML Encryption tool: The XML Encryption tool allows you to encrypt and decrypt entire messages or parts of messages using Triple DES, AES 128, AES 192, or AES 256. In WS-Security mode, Binary Security Tokens, X509IssuerSerial, and Key Identifiers are supported.



XML Signer tool: The XML Signer tool allows you to digitally sign an entire message or parts of a message depending on your specific needs. In some cases it may be important to digitally sign parts of a document while encrypting other parts.



XML Signature Verifier tool: The XML Signature Verifier tool allows for the verification of digitally signed documents using a public/private key pair stored within a key store file.



Key Stores: The use of key stores in SOAtest allows you to encrypt/decrypt and digitally sign documents using public/private key pairs stored in a key store. Key stores in JKS, PKCS12, BKS, and UBER format can be used.



Username Tokens, SAML Tokens, X509 Tokens, or Custom Headers: SOAtest supports sending custom SOAP Headers and includes templates for Username Tokens and SAML tokens.

Tutorial For a step-by-step demonstration of how to apply SOAtest for validating authentication, encryption, and access control, see “WS-Security”, page 140. This tutorial covers encryption/decryption, digital signature, and the addition of SOAP Headers. Lessons include: •

“Message Layer Security with SOAP Headers”, page 141



“Using the XML Encryption Tool”, page 144



“Using the XML Signer Tool”, page 152



“XML Encryption and Signature Combined”, page 155



“Automatically Generating WS-Security Tests with WS-SecurityPolicy”, page 157


Related Topics For more details on how to use SOAtest’s tools to support your specific authentication, encryption, and access control validation needs, see the following sections.

•
WS-Security Policy: "SOA Quality Governance and Policy Enforcement", page 569; "Global WS-Policy Banks", page 342
•
Custom Headers: "Adding SOAP Headers", page 811; "Adding SAML Headers", page 815; "Global SOAP Header Properties", page 339
•
Tools: "XML Encryption", page 873; "XML Signer", page 878; "XML Signature Verifier", page 882
•
General Security Settings (Authentication, Keystores, etc.): "Security Settings", page 754; "Using HTTP 1.0", page 689; "Using HTTP 1.1", page 691; "Global Key Stores", page 340
•
HTTPS and SSL: "Accessing Web Services Deployed with HTTPS", page 423
•
SAML: "Adding SAML Headers", page 815; "SAML 1.1 Assertion Options", page 814; "SAML 2.0 Assertion Options", page 814

Testing Oracle/BEA WebLogic Services with WS-Security If your services are configured with WS-Security XML security policies, then you can configure SOAtest with the necessary settings to interoperate with WebLogic. To help you configure these settings, a sample SOAtest project WebLogicWSS.tst is included under [SOAtest install dir]/examples/tests. WebLogicWSS.tst is not an executable test; it is intended to serve as a reference, allowing you to compare a working configuration that has been verified by Parasoft against your own. This example configuration has been tested to work with WebLogic 9.2 and later. It assumes that the default sign, encrypt, and UsernameToken (ut) policies are being used by your WebLogic application. It also assumes that the wss_client certificate (the client public key) has been imported into WebLogic's DemoTrust keystore. Please note how:




The signature and encryption tests in WebLogicWSS.tst include a WS-Security header with an X509 token configured.



The various encryption and signature tools are set up for various WS-Security scenarios.



Two certificate aliases are used by the various operations.

If you are using the default policies or policies that are built off of the defaults, configure your test settings to match this example in terms of the options selected. For more information about WebLogic security policies, please refer to Oracle's e-docs sites. Here are some references that we found to be useful: •

Enabling WS-Security debugging on WebLogic: http://e-docs.bea.com/wls/docs103/webserv_sec/message.html#wp288375 •

Note that these properties can be added to the startWebLogic.sh script within the WebLogic domain you are using.



WebLogic DemoIdentity and DemoTrust keystores: http://kingsfleet.blogspot.com/2008/11/using-demoidentity-and-demotrust.html



Oracle Web Services Security Policy Assertion Reference: http://edocs.bea.com/wls/docs103/webserv_ref/sec_assert.html


Penetration Testing SOAtest’s penetration testing can generate and run a variety of attack scenarios against your functional test suites. Sections include: •

Configuring Penetration Testing Attacks



Configuring Runtime Error Detection for Hybrid Security Analysis



Executing Tests



Reviewing and Validating Results



Configuring Attackable Parameters

Configuring Penetration Testing Attacks To configure SOAtest to simulate attacks against your functional test scenarios: 1. Choose SOAtest> Test Configurations to open the Test Configuration manager. 2. Click New to create a new Test Configuration. 3. Give the new Test Configuration a meaningful name. 4. Open that Test Configuration’s Execution> Security tab. 5. Enable Perform penetration testing. 6. Use the rule tree to indicate which attacks you want to run.


Available Attacks SOAtest can simulate the following attacks:

•
Parameter Fuzzing: When input parameters to a Web service are not properly validated, it can lead to vulnerabilities in the underlying system. In native applications, buffer overflow attacks can occur when input parameter data sizes go unchecked. These vulnerabilities could cause system crashes or could even lead to unauthorized information being returned to the client application.
•
SQL Injections: When SQL statements are dynamically created as software executes, there is an opportunity for a security breach by passing fixed inputs into the SQL statement, making them a part of the SQL statement. This could allow an attacker to gain access to privileged data, log in to password-protected areas without a proper login, remove database tables, add new entries to the database, or even log in to an application with admin privileges.
•
Username Harvesting: A request that includes a wrong username or password should not be met with a response that indicates whether the username is valid or not; this would make it easier for an attacker to identify valid usernames, then use them to guess the passwords.
•
XPath Injections: XPath injections are similar to SQL injections in that they are both specific forms of code injection attacks. XPaths enable you to query XML documents for nodes that match certain criteria. If such a query is constructed dynamically in the application code (with string concatenation) using unvalidated inputs, then an attacker could inject XPath queries to retrieve unauthorized data.
•
Cross-Site Scripting: Cross-site scripting problems occur when user-modifiable data is output verbatim to HTML. Subsequently, an attacker can submit script tags with malicious code, which is then executed on the client browser. This allows an attacker to deface a site, steal credentials of legitimate users, and gain access to private data.
•
XML Bombs: When using a DTD (Document Type Definition) within an XML document, a Denial of Service attack can be executed by defining a recursive entity declaration that, when parsed, can quickly explode exponentially to a large number of XML elements. This can consume the XML parser resources, causing a denial of service.
•
External Entities: XML has the ability to build data dynamically by pointing to a URI where the actual data is located. An attacker may be able to replace the data that is being collected with malicious data. This URI can either be pointed to local XML files on the Web service's file system to make the XML parser read large amounts of data, to steal confidential information, or launch DoS attacks on other servers by having the compromised system appear as the attacker by specifying the URLs of the other servers.
•
Schema Invalid XML: A well-formed document is not necessarily a valid document. Without referencing either a DTD or a schema, there is no way to verify whether the XML document is valid or not. Therefore, measures must be taken to ensure that XML documents do, in fact, reference a DTD or schema.
•
Large XML Documents: Large payloads can be used to attack a Web service in two ways. First, a Web service can be clogged by sending a huge XML payload in the SOAP request, especially if the request is a well-formed SOAP request and it validates against the schema. Secondly, large payloads can also be induced by sending certain request queries that result in large responses.
•
Malformed XML: XML elements with malformed, unacceptable, or unexpected contents can cause the service to fail.
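For illustration only, typical attack values in these categories (not necessarily the exact strings SOAtest sends) look like the following:
• SQL injection: ' OR '1'='1' --
• XPath injection: ' or '1'='1
• Cross-site scripting: <script>alert('xss')</script>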

Attack String Customization To customize the attack strings used for various attacks, modify the .csv files in [SOAtest_install_dir]/plugins/com.parasoft.xtest.libs.web_[version]\root\security.

Configuring Runtime Error Detection for Hybrid Security Analysis If your application has a Java backend and you want to apply runtime error detection in order to determine if these attacks actually cause security breaches or other runtime defects, you should also configure runtime error detection as described in “Performing Runtime Error Detection”, page 490. Be sure to configure both: •

The server.



The Test Configuration that you will use to perform penetration testing (see “Configuring Penetration Testing Attacks”, page 484).

Executing Tests To run the penetration tests: 1. Select the test suite that you want to attack. 2. Run the Test Configuration that you designed for penetration testing (see “Configuring Penetration Testing Attacks”, page 484).

Reviewing and Validating Results Results will be reported in the SOAtest tab and in any reports generated.

With Runtime Error Detection Enabled If you performed hybrid analysis (penetration testing + runtime error detection), errors detected will be reported as follows:


Note that SOAtest correlates each reported error with the functional test that was being run when the error was detected. This correlation between violations and functional tests allows you to trace each reported error to particular use cases against your application.

Without Runtime Error Detection Enabled Additional validation strategies can help you determine if the generated attacks succeeded. For example, you can chain Coding Standards, Search, or XML Validator tools to the test suite, inspect server logs manually, or run a script to parse these logs.

Viewing Attack Traffic The Traffic Viewer for each test allows you to view attack traffic. Using the available Attacks and Iteration controls, you can display traffic for all attacks or for specific attack types, as well as focus on traffic for specific attack values.


Configuring Attackable Parameters By default, SOAtest will try to attack all of the available parameters represented in a selected test suite’s SOAP Client, REST Client, Messaging Client, and Browser Testing tools. To customize which parameters may be attacked: 1. Double-click the top-level test suite node for functional tests you want to attack. 2. Open the Security Options tab (on the far right). 3. Use the Penetration Test Parameter tree to indicate which parameters can be attacked.

4. Save the test suite configuration changes.


Runtime Error Detection In this section: •

Performing Runtime Error Detection


Performing Runtime Error Detection This topic explains how to use SOAtest to perform runtime error detection on Java applications. Sections include: •

Runtime Error Detection Overview



Preparing the Server



Customizing the Test Configuration



Detecting Errors

Runtime Error Detection Overview SOAtest can perform runtime error detection as functional tests or penetration tests execute. This runtime error detection analyzes the executing application, applying a configurable set of dynamic "runtime rules" that verify the runtime behavior of the application. The rule violations reported indicate runtime errors that actually occurred during execution. SOAtest correlates each reported error with the functional test that was being run when the error was detected. This correlation between violations and functional tests allows you to trace each reported error to particular use cases against your application. Categories of errors detected include: •

Application crashes



Eclipse development



Exceptions



Functional Errors



File I/O



Graphical User Interface



Database



Network



Optimization



Portability



Security



Servlets



Threads & Synchronization

Preparing the Server Before you can monitor an application, you must copy the appropriate jar files to the machine that is running your application server, then configure the application server to use the jars. This is done as follows: 1. Copy insure.jar and insureimpl.jar from [SOAtest install dir]/eventmonitor to a directory on the server with the application you want to check. 2. If the server is running, stop it. 3. In your startup script, add the -javaagent command to the existing Java arguments.




For details, see “javaagent Command Details”, page 491.



Be sure to add the optional trace_to_xml parameter if you also want to monitor the application’s internal behavior as your functional tests execute. See the box below for details.

4. Restart the server. The server will start as usual, but with the specified package classes instrumented.

Gain visibility into the application’s internal behavior during test execution As you configure the application for runtime error detection, you are also preparing it for application monitoring, which provides visibility into the application’s internal behavior as functional tests execute. This allows you to better identify the causes of test failures as well as enrich the use case validation criteria in your regression tests. For example, you can validate that an EJB method call or a remote call to another system is taking place with the expected parameters. To perform this monitoring: •

Use the required trace and recommended trace_to_xml parameter in your server startup script.



Add a properly-configured Event Monitor tool to the start of the test scenario you want to monitor.

See “Monitoring Java Applications”, page 514 for details.

javaagent Command Details Basics The following invocation-time parameters are required:

•
soatest: Required for configuring runtime error detection.
•
port=[port_number]: Specifies which port should be used to communicate with the monitored program. Use 5050 to 5099.
•
instrument=[class_name.method_name(jni_sig)]: Specifies which packages/classes/methods to check. Use ':' to separate multiple prefixes. You can provide specific class names, or use wildcards to monitor any class in the specified package.

For Applications Running from Eclipse or Application Servers Applications that define their own class loaders (i.e., Eclipse, JBoss, and Tomcat) need insure.jar added to the boot classpath. To monitor those applications, add the following to the launch VM arguments:
-javaagent:"<path_to_jar>\insure.jar=soatest,port=<port>",instrument=<my.package.prefix> -Xbootclasspath/a:<path_to_jar>\insure.jar

For instance, you may use:


-javaagent:"/home/user/parasoft/insure.jar=soatest,port=5060"instrument=com.mycompany.onlinestore -Xbootclasspath/a:/home/user/parasoft/insure.jar

For other (Standalone) Java Applications For other (standalone) Java applications, you do NOT need to add insure.jar to the boot classpath. For instance, you may use: java -javaagent:"C:\Program Files\Parasoft\insure.jar=soatest,port=5050",instrument=com.mycompany.myapp

Customizing the Test Configuration Before you can perform runtime error detection, you need to customize a Test Configuration to specify which application you want to check and what errors you want to look for. To do this: 1. Create a new Test Configuration as described in "Creating Custom Test Configurations", page 244. 2. In the Execution> Runtime Error Detection tab, do the following: a. Enable Perform runtime error detection. b. Specify the host and port of the server running the application you want to check. c. Enable/disable rules to specify what types of errors you want detected.

Detecting Errors To perform runtime error detection on your application: 1. Select the node for the functional tests you want to execute with runtime error detection. 2. Run the custom Test Configuration (described in the previous section). Errors detected will be reported as follows:


Note that SOAtest correlates each reported error with the functional test that was being run when the error was detected. This correlation between violations and functional tests allows you to trace each reported error to particular use cases against your application.


Event Monitoring (ESBs, Java Apps, Databases, and other Systems) In this section: •

Monitoring Intra-Process Events: Overview



Using SOAtest’s Event Monitor



Monitoring IBM WebSphere ESB



Monitoring Oracle or BEA AquaLogic Service Bus



Monitoring Software AG webMethods Broker



Monitoring Sonic ESB



Monitoring TIBCO EMS



Monitoring Other JMS Systems



Monitoring Java Applications



Monitoring Databases



Monitoring Stub Events



Monitoring a Custom API-Based Events Source



Extensibility API Patterns



Generating Tests from Monitored Transactions


Monitoring Intra-Process Events: Overview This topic provides an overview of why and how SOAtest monitors events. SOAtest can visualize and trace the intra-process events that occur as part of the transactions triggered by the tests and then dissect them for validation. This enables test engineers to identify problem causes and validate multi-endpoint, integrated transaction systems—actions that are traditionally handled only by specialized development teams. Sections include: •

Why Monitor Events?



Monitoring Events with SOAtest

Why Monitor Events? In a test environment, a lot of things can go wrong: The message could be routed to the wrong system, it can be transformed incorrectly, or the target system may not perform a required action. If a test succeeds, how do you know that every step was executed correctly and the appropriate updates were performed? If it fails, how do you know what went wrong, where, and why? Once you identify the cause of an error, how can you isolate it and design the right tests around the isolated parts so the problem can be resolved and then keep the test so that the problem can be detected again in the future? Just sending a message into a system and validating the response coming out is not sufficient to address these challenges. Having the visibility and control inside these systems—especially ESBs that serve as the heart of today's business transactions—is critical to success.

Monitoring Events with SOAtest Parasoft SOAtest can monitor messages and events inside ESBs, Java applications, databases, and other systems in order to provide validation and visibility into test transactions. SOAtest also provides a framework and an API to provide this level of internal visibility within almost any system. In addition to sending an initial message and then validating the response (and possibly validating various changes that take place as a result of the transaction), SOAtest can also monitor the intermediate messages and events. For example, it may monitor the initial messages, the message after it is transformed, messages where a service calls another service and a response comes back, and the message that reaches the destination system—then also monitor the steps through the entire route that the response messages take back. If the transaction executes correctly, you can easily define the assertions to automate validation of these intermediate messages/steps in all subsequent test runs. If it does not execute correctly, you can determine what went wrong and where. You can then apply the available tool set to validate whether messages satisfy functional expectations. Using this functionality, you can ensure that tests aren’t marked as successful unless the test transaction executes exactly as intended.


Using SOAtest’s Event Monitor This topic explains how to configure and apply the Event Monitor tool, which traces the internal events within systems such as ESBs, Java applications, databases, and other business applications—and allows you to make them part of SOAtest’s end-to-end test scenarios. Sections include: •

Understanding the Event Monitor



Tool Configuration



Test Suite Execution



Stand-Alone/Ad-Hoc Execution



Viewing Monitored Events



Using Data Sources



Retrieving Events from Other Platforms



Validating Monitored Messages

Understanding the Event Monitor The Event Monitor tool traces the internal events within systems such as ESBs, Java applications, and business applications—providing visibility into intermediary messages without requiring direct access to the messaging system’s interface. Additionally, you can apply the available tool set to validate whether messages satisfy functional expectations. SOAtest provides built-in support for monitoring TIBCO Enterprise Messaging System, Sonic Enterprise Service Bus systems, Oracle/BEA Aqualogic Service Bus, Software AG webMethods Broker, IBM WebSphere ESB, any other JMS-based systems, any Java application, and any relational database. In addition, it can be configured to monitor any custom API-based events source (for example, nonJMS systems from other vendors, custom logging frameworks, etc.). The Event Monitor should always be positioned as the first test in the test suite. To configure it, you tell SOAtest how to connect to your system and what you want it to monitor. When you run the test suite, SOAtest will start the event monitor and then keep it running as the test suite’s tests execute. SOAtest monitors and reports the specified events/messages as they occur. Additional tools can then be chained to validate or further process the monitored events/messages.

Tool Configuration The configuration procedure depends on what type of system you’re monitoring. This tool can be used for: •

Monitoring Oracle or BEA AquaLogic Service Bus



Monitoring Software AG webMethods Broker



Monitoring Sonic ESB



Monitoring TIBCO EMS



Monitoring Other JMS Systems



Monitoring Java Applications



Monitoring Databases



Monitoring a Custom API-Based Events Source


Test Suite Execution Test suite execution is the typical and recommended usage of the Event Monitor tool. Add an Event Monitor Tool to your test suite and make it the first test. When you select and run the parent test suite, the Event Monitor will start automatically and continue monitoring while the rest of the tests in the test suite execute. It will stop when the first of the following two events takes place: 1. The last test in the test suite (or last row in the data source, if the test suite tests are iterating over a data source) has completed execution, the last event has been retrieved, and the monitoring connection has been destroyed. 2. The maximum monitor execution duration (this value is configured in the tool’s Options tab) has been reached. You do not need to set the Event Monitor's parent test suite to Tests run concurrently. SOAtest will automatically recognize the Event Monitor as a special tool and manage its concurrent execution accordingly.

Stand-Alone/Ad-Hoc Execution You may configure the connection settings of an Event Monitor and execute it on its own while exercising your target systems outside of SOAtest. For example, you may wish to monitor JMS messages in a back-end middleware system while you manually browse the Web interface of your application. To watch the events being logged in real time, open the Event Viewer tab. The Event Monitor tool will run for the duration that is set under Options tab’s Maximum monitor execution duration (milliseconds) setting. Such ad-hoc execution is not applicable when the Custom events source option is selected with the Poll after each test execution pattern.

Viewing Monitored Events Events will be reported and visualized in the Event Viewer tab within the Event Monitor tool configuration panel. To view monitored events in real time: •

Keep the Event Viewer tab (in the Event Monitor tool’s configuration panel) open during test execution.

There are two tabs available: the graphical view and the text view.


In the graphical view, click on an event to see message details.

Using Data Sources Event Monitor itself is not parameterizable with a data source. However, if the tests inside the test suite that includes the Event Monitor tool are parameterizable, it will start monitoring before the first actual test (following the Event Monitor test) and the first data source row executes, and it will stop monitoring when all data source rows have been used. This ensures that you will obtain a single, seamless log of events regardless of how the tests iterate inside the test suite.

Retrieving Events from Other Platforms The Event Monitor tool has extensibility hooks that allow it to obtain events from a variety of sources besides the built-in platforms and systems that it supports. See “Extensibility API Patterns”, page 525 for details.

Validating Monitored Messages If the transaction executes correctly, you can add regression controls for monitored messages as described in “Configuring Regression Testing”, page 425. In addition, you can add validations as described in “Adding Test Outputs”, page 333. The available validation tools are listed in “Validation Tools”, page 898.


You can also monitor the system while the transactions execute (for example, trigger the transaction from the Web interface), and then generate tests automatically from these messages. This includes tests for transaction entry/exit points, as well as tests for intermediate parts of the transaction. In this way, you can reproduce issues quickly, create tests with real values rapidly, then isolate the system and create regression tests around the pieces and components of the transaction. For details on how to do this, see: •

“Creating Tests From Sonic ESB Transactions”, page 404



“Creating Tests From TIBCO EMS Transactions”, page 407



“Creating Tests From JMS System Transactions”, page 401


Monitoring IBM WebSphere ESB This topic explains how to configure monitoring for IBM WebSphere ESB. Sections include: •

WebSphere Configuration



SOAtest Configuration



Viewing Monitored Events

WebSphere Configuration IBM WebSphere ESB includes monitoring capabilities that build upon its underlying WebSphere Application Server. Parasoft SOAtest can subscribe to Common Base Events that are fired at points in the processing of service components, and which are managed by WebSphere Common Event Infrastructure (CEI). For information about monitoring service component events in the WebSphere ESB and enabling the monitoring using the WebSphere administrative console, see http://publib.boulder.ibm.com/infocenter/dmndhelp/v6r2mx/topic/com.ibm.websphere.wesb620.doc/doc/cmon_businessevents.html. To configure WebSphere for event monitoring: 1. Enable the CEI service in the ESB. 2. Choose the level of logging for the service components you are interested in. The steps for performing this task on the ESB can be found at http://publib.boulder.ibm.com/infocenter/dmndhelp/v6r2mx/topic/com.ibm.websphere.wesb620.doc/doc/tmon_configcei.html. •

In order to get the full event details in SOAtest, we recommend that you select the "ALL MESSAGES AND TRACES" option and the "FINEST" logging level for the components you are interested in, which results in the business messages being included in the CEI events. To enable that for all business integration components, the log level string in the WebSphere administrative console would look like this: *=info: WBILocationMonitor.CEI.SCA.com.*=finest

SOAtest Configuration Adding Required Jar Files to the SOAtest Classpath The following jar files need to be added to the SOAtest classpath: •

com.ibm.ws.ejb.thinclient_7.0.0.jar



com.ibm.ws.orb_7.0.0.jar



com.ibm.ws.sib.client.thin.jms_7.0.0.jar



com.ibm.ws.emf_2.1.0.jar

The jar files can be found under [WAS installation dir]/runtimes. To add these jar files to SOAtest’s classpath, complete the following: 1. Choose SOAtest> Preferences. 2. Open the System Properties page. 3. Click the Add JARS button and choose and select the necessary JAR files to be added.

Configuring the Event Monitor Tool 500

Monitoring IBM WebSphere ESB

To configure the Event Monitor tool to monitor messages that pass through WebSphere ESB: 1. Double-click the Event Monitor tool to open up the tool configuration panel. 2. In the Event Source tab, select IBM WebSphere Enterprise Service Bus as the platform, then configure the following options: a. In the Connection area, specify your ESB connection settings. •

The username and password are the credentials that were configured in the WebSphere ESB (under Security, Business Integration Security on the WebSphere administrative console for the Common Event Infrastructure).



The credentials you provide are used by SOAtest to create the JNDI InitialContext of the events JMS topic and to create the JMS connection.

b. In the Monitoring Source field, specify the topic or queue that you want to monitor. •

You can leave the default destination name as jms/cei/notification/AllEventsTopic, which is the CEI topic that reports all CEI events.



The connection URL is essentially the JNDI InitialContext URL for the WebSphere Default JMS provider.



The port number is the WebSphere bootstrap port.



You can check the correct port number for your WebSphere ESB using the administrative console under Servers section, WebSphere Application Server, then click or expand the "Ports" link under the "Communication" section. The port number to use in SOAtest is the BOOTSTRAP_ADDRESS value.

3. In the Options tab, modify settings as needed. •

Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.



Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.





If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e. the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.



Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.

Maximum monitor execution duration specifies the point at which the test should timeout—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to system).




Event polling delay after each test finishes execution (milliseconds) is not applicable here.

Viewing Monitored Events After the test runs, the Event Monitor will show the XML representation of the Common Base Events it receives from WebSphere, including the event's raw business data if it is present.


Monitoring Oracle or BEA AquaLogic Service Bus This topic explains how to configure monitoring for events that are transmitted through Oracle Service Bus (OSB) or BEA Aqualogic Service Bus (ALSB). Sections include: •

Service Bus Configuration



SOAtest Configuration

Service Bus Configuration 1. Ensure that "Message Reporting" is enabled. This is required so SOAtest can draw message events. •

For details on how to globally enable message reporting in the bus, refer to the OSB Console Guide at http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/consolehelp/configuration.html#wp1080858.

2. Add Message Reporting actions to the desired message workflow components as described in http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/userguide/modelingmessageflow.html#wp1080496. •

For details on how to accomplish this, see http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/consolehelp/proxyactions.html#wp1309439.

SOAtest Configuration 1. Configure the SOAtest classpath. By default, OSB is configured to use the built in PointBase relational Database for "Message Reporting" purposes. Parasoft SOAtest uses the OSB message reporting framework to obtain and visualize events (intermediate messages) from the bus by executing SQL queries on the reporting database. •

If you have a default OSB configuration, then you need to add the PointBase JDBC driver to the SOAtest classpath. This is in a single jar found in your OSB/WebLogic installation directory: $[BEA HOME]/wlserver_10.*/common/eval/pointbase/lib/pbclient5*.jar

You need to use the pbclient51.jar for ALSB 3.0 and pbclient57.jar for OSB 10gR3 (each ships with its own jar). •

If your OSB is configured to use a different database, then you need to provide the database JDBC drivers to the SOAtest classpath. If it is Oracle, you do not need to add any drivers (since SOAtest ships with Oracle drivers).

2. Set the Event Monitor URL based on the database. For more information, see “Database Configuration Parameters”, page 348 and the SOAtest forum. 3. (Recommended) Open the Options tab and set the delay amount (Event polling delay after each test finishes execution) to 3 seconds or more. By making the event monitor wait for a few seconds before obtaining the events, you can ensure that the events have been logged to the database before the query is executed. Note that you can click Export Configuration Settings to export these configuration settings to a file, then other team members can reference the settings by selecting the File button and specifying the path to this file.
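The URL entered in step 2 depends on which reporting database your OSB installation uses. The shapes below are hypothetical illustrations only (host, port, database name, and SID are placeholders), so confirm the exact values against "Database Configuration Parameters", page 348 and your own installation:

    Default PointBase reporting database (the PointBase server typically listens on port 9092):
        jdbc:pointbase:server://<osb-host>:9092/<reporting-db-name>

    Oracle reporting database (thin driver, which ships with SOAtest):
        jdbc:oracle:thin:@<db-host>:1521:<SID>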


Monitoring Software AG webMethods Broker

This topic explains how to configure monitoring for events that are transmitted through Software AG webMethods Broker. This monitoring requires admin client group privileges. Sections include:
• Adding Required Jar Files to the SOAtest Classpath
• Configuring Event Monitor
• Notes

Adding Required Jar Files to the SOAtest Classpath

The following jar files need to be added to the SOAtest classpath:
• wmbrokerclient.jar
• g11nutils.jar
The jar files can be found under [webmethods install dir]/Broker/lib/. For more details, please refer to webMethods Broker Client Java API Programmer's Guide> Getting Started> Using the webMethods Broker Java API.
To add these jar files to SOAtest's classpath, complete the following:
1. Choose SOAtest> Preferences.
2. Open the System Properties page.
3. Click the Add JARS button and select the necessary JAR files to be added.

Configuring Event Monitor

To configure the Event Monitor tool to monitor webMethods Broker:
1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Software AG webMethods Broker as the platform. The configuration fields will be populated with default values.
3. Adjust the configuration field values as needed. The fields are the same as those used in the webMethods tool, and are described in "webMethods", page 833.


Note - Using Filters and Wildcards

The Event Monitor uses BrokerAdminClient to monitor events. By default, it subscribes to the "admin" client group. Thus, if you wish to filter events based on their content, you should change the client group value to one that allows for subscription to regular event types (instead of Trace). Why? Because when using Broker::Trace::* events, the filter string will be applied to the fields in the trace BrokerEvents—not the original events that they represent.
When completing the Event Type field, remember that wildcards are not allowed for Broker::Trace or Broker::Activity event types, according to the webMethods Broker Client Java API Programmer's Guide. If you wish to monitor a set of event types, specify a client group name that allows subscription access to the desired event types (possibly other than the default "admin" client group setting), then provide event types with wildcards. For example, you can use Sample::*
You may also use $[data bank column name] variables in your string. SOAtest will replace that with the data bank value, so filter strings can have dynamic values based on the output of other tests.
For more details on filters and using wildcards in event type names, please refer to the webMethods Broker Client Java API Programmer's Guide.

4. In the Options tab, modify settings as needed.

Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.



Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.

If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e. the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.



Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.



Maximum monitor execution duration specifies the point at which the test should timeout—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to system).



Event polling delay after each test finishes execution (milliseconds) is not applicable here.


Notes

With admin client group privileges, the Event Monitor tool can subscribe to the following event types:

• Broker::Ping
• Adapter::ack
• Adapter::adapter
• Adapter::error
• Adapter::errorNotify
• Adapter::refresh
• Broker::Trace::Publish
• Broker::Trace::Enqueue
• Broker::Trace::Drop
• Broker::Trace::Receive
• Broker::Trace::PublishRemote
• Broker::Trace::EnqueueRemote
• Broker::Trace::ReceiveRemote
• Broker::Activity::TerritoryChange
• Broker::Activity::ClientChange
• Broker::Activity::ClientGroupChange
• Broker::Activity::EventTypeChange
• Broker::Trace::Insert
• Broker::Trace::Delete
• Broker::Trace::Peek
• Broker::Trace::DropRemote
• Broker::Trace::Modify
• Broker::Activity::ClientSubscriptionChange
• Broker::Activity::RemoteSubscriptionChange

Note that you can also configure the webMethods tool (described in “webMethods”, page 833) to subscribe to the desired event type—the only difference is that with the webMethods tool you need to provide the specific event type name.


Monitoring Sonic ESB

This topic explains how to configure monitoring for Sonic ESB.

Prerequisites

The following jar files must be added to your classpath (via SOAtest> Preferences> System Properties):
• broker.jar
• mfcontext.jar
• sonic_Client.jar

Configuration

To configure the Event Monitor tool to monitor messages that pass through Sonic ESB:
1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Sonic Enterprise Service Bus as the platform, then configure the following options:
   a. In the Connection area, specify your Sonic ESB connection settings.
   b. In the Destination Name field, specify the topic or queue that you want to monitor.
      • You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special "dev.Tracking" tracking endpoint.
      • For instance, if you want to track all events that occur as part of the process flow, specify the dev.Tracking endpoint, and have the process set to Tracking Level of 4 in the ESB.
   c. In the Destination Type field, specify whether the tracking destination is a topic or a queue.
   d. (Optional) In the Message Selector field, enter a value to act as a message filter. See “Using Message Selector Filters”, page 704 for tips.
3. In the Options tab, modify settings as needed.

Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.



Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.

If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e. the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.




Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.



Maximum monitor execution duration specifies the point at which the test should timeout—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to system).



Event polling delay after each test finishes execution (milliseconds) is not applicable here.


Monitoring TIBCO EMS

This topic explains how to configure monitoring for TIBCO EMS.

Prerequisites

The tibjms.jar file must be added to your classpath (via SOAtest> Preferences> System Properties).

Configuration

To configure the Event Monitor tool to monitor messages that pass through TIBCO EMS:
1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select TIBCO Enterprise Message Service as the platform, then configure the following options:
   a. In the Connection area, specify your TIBCO EMS connection settings.
   b. In the Destination Name field, specify the topic or queue that you want to monitor.
      • You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special process tracking topic.
      • For instance, to track any JMS message that gets transmitted through TIBCO EMS, use $sys.monitor.Q.r.>
      • For details on specifying tracking topics for TIBCO EMS, see "Chapter 13: Monitoring Server Activity" and "Appendix B: Monitor Messages" in the TIBCO EMS User’s Guide.
   c. In the Destination Type field, specify whether the tracking destination is a topic or a queue.
   d. (Optional) In the Message Selector field, enter a value to act as a message filter. See “Using Message Selector Filters”, page 704 for tips.
3. In the Options tab, modify settings as needed.

Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.



Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.

If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e. the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.



Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.

Maximum monitor execution duration specifies the point at which the test should timeout—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to system).



Event polling delay after each test finishes execution (milliseconds) is not applicable here.


Monitoring Other JMS Systems

This topic explains how to configure monitoring for any JMS system. To configure the Event Monitor tool to monitor any JMS system:
1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Generic JMS system as the platform, then configure the following options (a sketch of how these fields map to the standard JMS API follows this list):
   a. In the Connection area, specify your connection settings.
   b. In the Initial Context field, specify a fully-qualified class name string, passed to the JNDI javax.naming.InitialContext constructor as a string value for the property named javax.naming.Context.INITIAL_CONTEXT_FACTORY.
   c. In the Connection Factory field, specify the JNDI name for the factory. This is passed to the lookup() method in javax.naming.InitialContext to create a javax.jms.QueueConnectionFactory or a javax.jms.TopicConnectionFactory instance.
   d. In the Destination Name field, specify the topic or queue that you want to monitor.
      • You can specify a regular topic or queue (e.g., the entry or exit of a workflow process), or a special process tracking topic.
   e. In the Destination Type field, specify whether the tracking destination is a topic or a queue.
   f. (Optional) In the Message Selector field, enter a value to act as a message filter. See “Using Message Selector Filters”, page 704 for tips.
3. In the Options tab, modify settings as needed.

Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.



Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.





If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e. the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.



Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.

Maximum monitor execution duration specifies the point at which the test should timeout—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to system).

Event polling delay after each test finishes execution (milliseconds) is not applicable here.
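As a rough illustration of how the Event Source fields in step 2 map onto the standard JMS API, the sketch below performs the same JNDI lookup and subscription by hand. This is not SOAtest code: the initial context factory class, provider URL, JNDI name, destination, selector, and credentials are hypothetical placeholders that you would replace with the values documented by your JMS provider.

import java.util.Hashtable;
import javax.jms.*;
import javax.naming.Context;
import javax.naming.InitialContext;

public class JmsMonitorSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical JNDI environment -- substitute your provider's values.
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY,
                "com.example.jndi.InitialContextFactory");      // "Initial Context" field
        env.put(Context.PROVIDER_URL, "tcp://broker-host:61616"); // connection URL

        Context jndi = new InitialContext(env);

        // "Connection Factory" field: the JNDI name that is looked up.
        TopicConnectionFactory factory =
                (TopicConnectionFactory) jndi.lookup("MyTopicConnectionFactory");

        TopicConnection connection = factory.createTopicConnection("user", "password");
        TopicSession session =
                connection.createTopicSession(false, Session.AUTO_ACKNOWLEDGE);

        // "Destination Name" / "Destination Type" fields.
        Topic topic = session.createTopic("my.process.tracking.topic");

        // Optional "Message Selector" field: standard JMS selector syntax.
        TopicSubscriber subscriber =
                session.createSubscriber(topic, "JMSType = 'order'", false);

        connection.start();
        Message message = subscriber.receive(5000); // each received message becomes an event
        System.out.println(message);
        connection.close();
    }
}

The Message Selector field accepts the same standard JMS selector syntax shown in the createSubscriber() call above.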


Monitoring Java Applications

This topic explains how to configure monitoring for any Java application. When a properly-configured Event Monitor tool is placed at the beginning of a test suite that includes tests which invoke that Java application (directly or indirectly), it will receive and visualize the Java events that take place. Sections include:
• Why Monitor Java Applications?
• Application Configuration
• SOAtest Configuration

Why Monitor Java Applications?

By monitoring instrumented Java applications, you can gain visibility into the application’s internal behavior as functional tests execute. This allows you to better identify the causes of test failures as well as enrich the use case validation criteria in your regression tests. In addition to validating the messages returned by the system and the intermediate messages/steps monitored via the ESB, you can also validate events within the Java application being invoked. For example, you can validate that an EJB method call or a remote call to another system is taking place with the expected parameters.

Application Configuration

To configure the application for monitoring, you need to instrument it with Parasoft’s monitoring agent. To do this:
1. Copy insure.jar and insureimpl.jar from [SOAtest install dir]/eventmonitor to a directory on the server with the application you wish to instrument.
2. If the server is running, stop it.
3. In your startup script, add the -javaagent command to the existing Java arguments.
   • For details, see “javaagent Command Details”, page 514.
4. Restart the server. The server will start as usual, but with the specified package classes instrumented. Now, whenever instances of objects are created or methods within the specified package prefixes are invoked, SOAtest (which can be running from another developer/QA desktop machine) will be able to receive event notifications in Event Monitor.

javaagent Command Details

Basics

The following invocation-time parameters are required in all situations:

soatest
  Required for configuring monitoring.

port=[port_number]
  Specifies which port should be used to communicate with the monitored program. Use 5050 to 5099.

instrument=[class_name.method_name(jni_sig)]
  Specifies the prefixes of fully-qualified methods to check. For instance, given com.abc.util.IOUtil.method, it will instrument all methods in IOUtil.java that start with method. If given com.abc., it will also instrument those methods and all methods of classes whose fully qualified name starts with com.abc. Note that wildcards are not permitted. See the note below for more details.

trace=[class_name.method_name(jni_sig)]
  Specifies the filter for method calls to trace. See the note below for more details.

Note - instrument and trace

Instrumentation of a class applies to all of its method bodies; it provides visibility, for example, into what methods that class calls and what values those methods return. Tracing is implemented by also instrumenting the caller of the code you want visibility into. The called code is not instrumented.
For example, assume you want to trace third-party methods called from your code, but not third-party methods called from other third-party code. In this case, you would instrument your own code and trace the calls to the third-party code.
More specifically, instrument=com.acme.util configures instrumentation for all classes matching com.acme.util. All methods of those classes are instrumented. In the following code, the writeData method will be instrumented:

package com.acme.util;

import java.io.DataOutputStream;
import java.io.IOException;

class IOUtil {
    void writeData(DataOutputStream dos, Data data) throws IOException {
        dos.write(data._int);
        dos.writeFloat(data._float);
    }
}

instrument=com.acme.util,trace=java.io provides visibility into the java.io calls made by com.acme.util code. Instrumentation adds calls to the writeData() method in order to check which calls to java.io are made by that monitored code.

The following parameters are optional:

trace_to_xml
  Tells the monitor to serialize complex Java objects into an XML representation. If this option is omitted, only primitive and "toString()" values of Objects will be returned. This parameter is strongly recommended for use with Event Monitor. It is not applicable if you are performing only runtime error detection.

xmlsizelimit=[integer_value]
  Determines the maximum XML size limit in bytes. The default size is 20000 if this option is not specified. Applies only when trace_to_xml is used.

xmlcolsizelimit=[integer_value]
  When generating an XML representation of Java objects, determines the maximum number of elements shown for collections/maps/arrays. The first 100 elements are shown by default. Applies only when trace_to_xml is used.

xmldeeplimit=[integer_value]
  When generating an XML representation of Java objects, determines the maximum field depth included for data structures. Fields up to a depth of 8 are included by default. Applies only when trace_to_xml is used.

xmlexcl=[classes_or_fields]
  ':'-separated classes or fields to exclude from XML serialization (i.e. xmlexcl=com.acme.A:com.acme.B._field). Applies only when trace_to_xml is used.

xmlinc=[classes_or_fields]
  ':'-separated classes or fields to always include in XML serialization (i.e. xmlinc=com.acme.A:com.acme.B._field). Matches for xmlinc take preference over matches for xmlexcl: if something matches xmlinc, it will always be shown—even if xmlexcl also matches it. When the pattern matches a class name, 1) fields of that class type or a derived type are excluded from serialization and 2) method arguments and return values of that type or derived ones will not be serialized to XML. By default, the monitor excludes classes of the following types:
  - org.apache.log4j.Logger
  - java.util.logging.Logger
  - java.util.Timer
  - java.io.Writer
  - java.io.Reader
  - java.io.InputStream
  - java.io.OutputStream
  Applies only when trace_to_xml is used.

xmlsecondlimit=[seconds]
  By default, if you are using trace_to_xml, the monitoring will spend only up to 10 seconds to convert a monitored complex Java object to an XML representation. This is to prevent significant slow-downs in the monitoring agent when monitoring very large objects. If that limit is reached, then the SOAtest event will show the following message instead of the XML representation of an object: SKIPPED: converting to XML takes too long: 10 seconds. If you wish to change that 10 second threshold, use this flag (example: xmlsecondslimit=20). For large objects, the recommended approach is to avoid reaching the threshold in the first place: reduce the XML size by excluding the fields you are not interested in (using the xmlexcl option).

trace_exceptions[=exception_class_prefix]
  Shows a trace of events related to an exception that was created, thrown, or caught. _details can be added to get more detail of the events (i.e. the stack trace where the event happens).

terse
  Configures terse output to the console (stack traces have only 1 element).

For Applications Running from Eclipse or Application Servers

Applications that define their own class loaders (e.g. Eclipse, JBoss, and Tomcat) need insure.jar added to the boot classpath. To monitor those applications, add the following to the launch VM arguments:

-javaagent:"<path_to_jar>\insure.jar=alias,port=<port>",trace_to_xml,instrument=<my.package.prefix>,trace=<my.package.prefix> -Xbootclasspath/a:<path_to_jar>\insure.jar

For instance, you may use:

-javaagent:"/home/user/parasoft/insure.jar=alias,port=5060",trace_to_xml,instrument=com.mycompany.onlinestore,trace=com.mycompany.onlinestore -Xbootclasspath/a:/home/user/parasoft/insure.jar

For Other (Standalone) Java Applications

For other (standalone) Java applications, you do NOT need to add insure.jar to the boot classpath. For instance, you may use:

java -javaagent:"C:\Program Files\Parasoft\insure.jar=alias,port=5050",trace_to_xml,instrument=com.mycompany.myapp,trace=com.mycompany.myapp

SOAtest Configuration

1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Instrumented Java Application as the platform, then specify the hostname where the server resides and the port number for the Parasoft agent runtime (5050 by default).


Monitoring Databases

This topic explains how to configure monitoring for databases. Sections include:
• Why Monitor Databases?
• Configuring Event Monitor to Monitor a Database
• Notes

Why Monitor Databases?

The Event Monitor allows you to execute a SQL query on a database of your choice after each test executes within the test suite. Although the DB Tool can be used for similar purposes, the Event Monitor Database mode is better suited for retrieving database rows when events occur as a result of your application logging messages into a database. The Event Monitor differs from the DB Tool in a number of ways:
• A single Event Monitor in your test suite can execute database queries automatically after each test execution. This relieves you from having to add a DB Tool directly after each test.
• It allows delayed query execution (see the "Event polling delay after each test finishes execution" option under the Options tab). This is important for many logging databases because the logged entries may not reach the database in real time.
• It can consolidate the database entries into a single flow of events within a test suite, while the DB Tool gives you the flexibility to execute isolated and different queries at the desired points of your use case scenario.
• It helps you isolate the entries that were added to the database during test execution.

Configuring Event Monitor to Monitor a Database

1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Database as the platform.
3. With Local selected, enter the Driver, URL, Username, and Password for the database you want to query. For details on completing these fields, see “Database Configuration Parameters”, page 348 and the SOAtest forum.
4. (Optional) In the Constraint SQL Query field, enter a value that identifies the last logged value from the database before a test executes. Event Monitor expects that query to return a single value. Typically this would be a table key, an entry number, or a Timestamp.
   • Using a constraint query is useful when your logging database is cumulative and it is not cleaned/restored after each scenario executes. By executing this query before the test suite tests execute, the Event Monitor can distinguish the pre-existing entries from the new entries that will be logged into the database during test execution.
   • Such queries would typically use the SQL MAX function. For example, the query select max(MESSAGE_TIMESTAMP) from MESSAGE_LOG assumes that you have a table named MESSAGE_LOG that contains a column named MESSAGE_TIMESTAMP of type Timestamp. It will return a single value representing the latest message entry currently present in that database. Event Monitor will execute that query first and keep that timestamp value.
5. In the Event SQL Query field, specify the SQL for retrieving the log or event entry from the database. For example, such a query might look like: select * from MESSAGE_LOG where MESSAGE_TIMESTAMP > $[CONSTRAINT] order by MESSAGE_TIMESTAMP DESC
   • Note that $[CONSTRAINT] is a special SOAtest variable. It tells the Event Monitor to use the value it received from the first constraint query (described in the previous step) and automatically provide it in the event query. The event query executes after test suite execution completes (and after the delay specified in the Event Monitor’s Options tab). It retrieves the rows that were added to the database after the test executed. (A sketch of this constraint/event query flow follows this list.)
   • Use of $[CONSTRAINT] in event queries is not required.

6. In the Options tab, modify settings as needed.

Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.



Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.

If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e. the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.



Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.



Maximum monitor execution duration specifies the point at which the test should timeout—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to system).



Event polling delay after each test finishes execution (milliseconds) specifies how long Event Monitor waits between the time the test ends and the time it retrieves the events.
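To make the constraint/event query flow from steps 4 and 5 concrete, the JDBC sketch below mimics the same logic by hand. It is an illustration only, not SOAtest's implementation; the driver URL and credentials are hypothetical, and MESSAGE_LOG is the table assumed in the examples above.

import java.sql.*;

public class EventQuerySketch {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details -- use the same Driver/URL/Username/Password
        // values you enter in the Event Monitor's Event Source tab.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password")) {

            // 1. Constraint query: runs once, before the tests execute.
            Timestamp constraint;
            try (Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(
                     "select max(MESSAGE_TIMESTAMP) from MESSAGE_LOG")) {
                rs.next();
                constraint = rs.getTimestamp(1); // may be null if the table is empty
            }

            // ... tests run here; the application logs new rows into MESSAGE_LOG ...

            // 2. Event query: runs after the tests (and the configured polling delay),
            //    with $[CONSTRAINT] replaced by the value captured above.
            try (PreparedStatement ps = con.prepareStatement(
                     "select * from MESSAGE_LOG where MESSAGE_TIMESTAMP > ? " +
                     "order by MESSAGE_TIMESTAMP DESC")) {
                ps.setTimestamp(1, constraint);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        // Each returned row corresponds to one event box in the Event Viewer.
                        System.out.println(rs.getString("MESSAGE_TIMESTAMP"));
                    }
                }
            }
        }
    }
}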

Notes

The Event Viewer tab will display each row retrieved by the Event SQL query as a box in the event sequence flow. Double-clicking the box opens a dialog with data details.



You can chain XML tools to a database Event Monitor. Since the database rows are output in XML, they can be diffed, validated with XML Assertor, etc.


Monitoring Stub Events

This topic explains how to configure monitoring for events that occur within stubs you have deployed using SOAtest (to virtualize services or other resources that are unavailable for testing or that you do not want to access during testing). Sections include:
• Why Monitor Stub Events?
• Prerequisites
• Configuration
• Using An Alternative JMS System for Stub Event Monitoring

Note: For details on SOAtest’s stubbing (virtualization) functionality, see “Service Virtualization: Creating and Deploying Stubs”, page 531.

Why Monitor Stub Events?

Visibility into what messages are sent to and from stubs—and what validations and errors occur at the stub level—enables you to:
• Validate the messages that the application sends to your stubs.
• See what errors occur based on how your application interacts with the stubbed services.
In a common test situation, SOAtest will send a message to a system, such as an available service or a web browser, which will then send a message to another service or system that you have stubbed (e.g., because it is not yet available or not accessible for testing).

[Diagram: a SOAP Client tool sends message 1 to System A; System A sends message 2 to the stub; the stub (with an attached validation tool) returns message 3 to System A; System A returns message 4 to the client tester tools. An Event Monitor observes these exchanges.]

By adding a stub Event Monitor tool to the test suite, you gain insight into messages 2 and 3 as well as messages 1 and 4. You also see the result of any validation tools you may have attached to the stub (for instance, XML Assertor tools) and receive details on any stub errors that may have occurred (for instance, because the stub was not properly configured to process valid messages, or because invalid messages were sent).


Prerequisites

Stub event monitoring applies specifically to "hosted stubs"—stubs that are available for continuous access. It is not relevant to Message Stub tools that are integrated into end-to-end test suites as described in “Hosted Deployment vs. Scenario-Mode Integration”, page 542.
Note that firewalls on the machine where SOAtest is running can sometimes block communication between the Event Monitor and a remote Stub Server. If you are using the Windows Vista firewall, it needs to be disabled prior to using Event Monitor with a remote Stub Server.

Configuration

1. Add an Event Monitor tool to the test suite that drives the interaction with the stub you want to monitor.
2. In the Event Source tab of the Event Monitor tool’s configuration panel, select SOAtest Stub Server as the platform.
3. Under Event Reporting Provider, ensure that SOAtest Builtin Provider is selected.
   • If you want to use a different JMS system as the stub server (e.g., to scale for load testing), see “Using An Alternative JMS System for Stub Event Monitoring”, page 522.
4. If you want to monitor stub events on a remote SOAtest server, select Remote SOAtest Server and specify that server’s URL. Otherwise the local SOAtest server will be used.
5. Under Stub Event Subscriptions, specify what type of stub events you want to monitor. Available options are:
   • Request messages: The message sent to the stub. For instance, in the image shown in Why Monitor Stub Events?, this would be message #2.
   • Response messages: The message that the stub returns. For instance, in the image shown in Why Monitor Stub Events?, this would be message #3.
   • Message validation results: The result of any validation tools you may have attached to the stub’s Message Stub tools—such as XML Assertor tools. For instance, in the image shown in Why Monitor Stub Events?, this would be whatever tool is represented by the "validation" marker.
   • Stub errors: Any stub errors that may have occurred (e.g., because Message Stub tools were not properly configured to process valid messages, or because invalid messages were sent). For instance, in the image shown in Why Monitor Stub Events?, if the stub was not configured to route message #2 to a specific Message Stub tool, this would be reported as an error.
6. Under Test Failure Criteria, specify the test failure criteria. Note that if a validation tool is chained to the current Event Monitor tool, the outcome of that validation will determine this test’s success or failure—and the following settings are not applicable. Available options include:
   • Stub error events: The test will fail if any stub errors occur (e.g., because Message Stub tools were not properly configured to process valid messages, or because invalid messages were sent).
   • Stub validation failure events: The test will fail if a failure is reported by any validation tool (such as an XML Assertor tool) you may have attached to the stub’s Message Stub tool.
   • No events received: The test will fail if the stub does not receive any events before the current test suite (the one that includes the Event Monitor tool) completes execution.


7. In the Options tab, modify settings as needed.

Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.



Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.

If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e. the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.



Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.



Maximum monitor execution duration specifies the point at which the test should timeout—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to system).



Event polling delay after each test finishes execution (milliseconds) is not applicable here.

Notes

• The Event Viewer tab will display details about the stub events received. It indicates the stub name, the name of the Message Stub tool that responded to that message, the response message, results of validation tools (if available), and any stub errors that occurred. Double-clicking an item opens a dialog with additional details.

Using An Alternative JMS System for Stub Event Monitoring

By default, stubs are monitored using SOAtest’s built-in JMS-based provider. Alternatively, you can use another JMS system that you have—for instance, if you want to scale for load testing. To configure this:
1. Open the Stub Server view (if it is not available, choose Window> Show View> Stub Server).
2. Double-click the node for the machine (local or remote) you want to configure to use the stub event reporting provider.
3. Open the Event Reporting tab in the machine configuration panel.
4. In the Event Reporting Provider field, select your preferred server. If you want to use a JMS server that is not specifically listed, choose Other JMS Provider.


5. Specify the connection settings. The available fields are described in:
   • Monitoring IBM WebSphere ESB
   • Monitoring Sonic ESB
   • Monitoring TIBCO EMS
   • Monitoring Other JMS Systems

Event Reporting Destination - Configuration Needed
Note that a default event reporting destination and type are specified in the available controls. You need to either:
• Configure your JMS system to use this default destination, or
• Change the SOAtest settings to another destination that is available on your system.

6. In the Event Monitor tool configuration panel, open the Event Source tab, select the appropriate Event Reporting Provider, and specify the settings required to connect to your JMS. Again, the available fields are described in:
   • Monitoring IBM WebSphere ESB
   • Monitoring Sonic ESB
   • Monitoring TIBCO EMS
   • Monitoring Other JMS Systems


Monitoring a Custom API-Based Events Source

This topic explains how to configure monitoring for a custom API-based events source. To configure the Event Monitor tool to monitor a custom API-based event source:
1. Double-click the Event Monitor tool to open up the tool configuration panel.
2. In the Event Source tab, select Custom API-Based Events Source as the platform, then configure the following options:
   a. In the Connection area, specify your connection settings.
   b. In the Event Retrieval area, specify the event retrieval pattern you want to use (polling at a specified time interval, polling after each test execution, or subscribing to an event producer).
   c. In the User Code area, specify the location of your custom event monitoring application or scripts.
      • See “Extensibility API Patterns”, page 525 for details.
3. In the Options tab, modify settings as needed.

Report test execution events specifies whether the Event Viewer tab and XML output display show only the monitored messages and events, or if they also indicate when each test started and completed. Enabling this option is helpful if you have multiple tests in the test suite and you want to better identify the events and correlate them to your test executions.



Wrap monitored messages with CDATA to ensure well-formedness of the XML event output should be disabled if you expect the monitored events’ message content to be well-formed XML. This will make the messages inside the events accessible via XPaths, allowing the message contents to be extracted by XML Transformer or validated with XML Assertor tools.

If the message contents are not necessarily XML, you should enable this option to ensure that the XML output of the Event Monitor tool (i.e. the XML Event Output for chaining tools to the Event Monitor, not what is shown under the Event Viewer) is well-formed XML by escaping all the message contents. This will make the content of these messages inaccessible by XPath since the message technically becomes just string content for the parent element.



Note that the Diff tool’s XML mode supports string content that is XML. In other words, no matter which option you select here, the Diff tool will still be able to diff the messages as XML, including the ability to use XPaths for ignoring values.



Maximum monitor execution duration specifies the point at which the test should timeout—in case another test in the test suite hangs, or if no other tests are being run (e.g., if you execute the Event Monitor test apart from the test suite, then use a custom application to send messages to system).



Event polling delay after each test finishes execution (milliseconds) specifies how long Event Monitor waits between the time the test ends and the time it retrieves the events.


Extensibility API Patterns

This topic describes the Extensibility API, which allows for capturing events by polling at specified time intervals, polling after test execution, and subscribing to an event producer. Sections include:
• Available Patterns
• Using “Poll at specified time intervals” and “Poll after each test execution” Patterns
• Using the “Subscribe to an event producer” pattern

Where are the Javadocs?
The Javadocs for the Monitor Tool API can be accessed by choosing SOAtest> Help> Extensibility API. The resources that directly concern the Event Monitor tool are com.parasoft.api.IEvent, its default implementation adapter com.parasoft.api.Event, and com.parasoft.api.EventSubscriber.

Available Patterns

The Extensibility API allows for capturing events using one of three different patterns:
• Poll at Specified Time Intervals
• Poll after Each Test Execution
• Subscribe to an Event Producer

Poll at Specified Time Intervals

This pattern is useful when the events in the system you are trying to monitor are not synchronized with your test execution. For example, you may be interested in pulling data from a logging framework which logs events at specified time intervals, and therefore you wish to capture the logged events in SOAtest at the same intervals in order to ensure that the events are actually retrieved. An analogy to this is somebody being a fan of a particular monthly magazine, but rather than subscribing to it in order to receive every issue that is published, the reader visits a book store every month and obtains the latest issue.
A point of interest with this pattern is that if you poll for events before new ones have been made available, you may get the same events as the last poll or no events at all—depending on how your target framework behaves. This is just like our example reader possibly going to the store and finding only last month's issue because the current month's issue has not been released yet.
With this pattern, you can specify a time amount (in milliseconds) which will serve as the wait period in between the executions of your script. For example, if you specify the interval to be 1000 milliseconds, the Event Monitor tool will execute the code you provide every 1 second. This will continue until one of two events takes place:
1. If the Event Monitor is being executed as part of a test suite with other various tests, the periodical user code execution will stop as soon as the last test in the test suite (or last row in the data source, if the test suite tests are iterating over a data source) has finished execution.
2. The maximum monitor execution duration (the value is configured under the Options tab) has been reached.


Poll after Each Test Execution

This pattern is useful when the events in the system you are trying to monitor are generally in sync with your test suite's test execution events. For example, your system's logging framework is triggered immediately after events occur in the system and the events are made available for you to obtain. With this pattern, the code you provide will be executed immediately after each test in the test suite executes. This pattern is not applicable if you run the Event Monitor tool independently (apart from executing the entire test suite which contains it). In this case, the Event Monitor will stop once:
1. The last test in the test suite (or the last row in the data source if the test suite tests are iterating over a data source) has finished execution.
2. The maximum monitor execution duration (value is configured under the Options tab) has been reached.

Subscribe to an Event Producer

This is probably the most common pattern, and Parasoft recommends its use whenever possible. This pattern is useful when the framework you are trying to monitor can allow a subscriber to be triggered immediately upon the occurrence of an event—in other words, it can perform a "call back" function. For example, the JMS publish/subscribe message pattern is an example of this pattern, and it is the pattern used to drive the Sonic, TIBCO, and other built-in platforms supported in the Event Monitor tool. In our magazine reader example, this is similar to subscribing to the publication so the latest issue gets delivered to the reader as soon as it is published.

Using “Poll at specified time intervals” and “Poll after each test execution” Patterns

When using these two patterns, the method you select in the User Code section will be executed in accordance with the respective pattern. The method may have any name you wish (be sure it is selected in the Method menu):

IEvent getEvent(String url, String username, String password, Object connection, ScriptingContext context)

• url (String): the value that is provided in the URL field of the Event Monitor Connection section.
• username (String): the value that is provided in the username field of the Event Monitor Connection section.
• password (String): the value that is provided in the password field of the Event Monitor Connection section.
• connection (Object): this object can be optionally provided in order to maintain and reuse the same connection over the multiple script executions of the Event Monitor. See “Maintaining Connections”, page 527 for details.
• context (com.parasoft.api.Context): standard SOAtest scripting context that allows for accessing variables and data source values and setting/getting objects for sharing across test executions.

Example (Jython)

from com.parasoft.api import *

def getEvent(url, username, password, connection, context):
    return Event("Hello!")


The code in the getEvent() method handles event retrieval from the system you wish to monitor and returns an implementation of com.parasoft.api.IEvent. In this example, we return com.parasoft.api.Event, which is an adapter implementation of that interface and takes a simple String object, "Hello!".

Maintaining Connections

In practice, it is often useful to be able to create a connection to the remote system you want to monitor and reuse that connection for retrieving events (instead of creating a new connection on each User Code invocation). Therefore, in addition to having a method for retrieving an IEvent object as described above, you may choose to add two additional optional methods:

Object createConnection(String url, String username, String password, com.parasoft.api.Context context)

and

Object destroyConnection(Object connection, com.parasoft.api.Context context)

createConnection would create and return an object handle to the monitoring connection, while destroyConnection takes that same object and allows you to provide code for a graceful disconnection. The Event Monitor looks for the presence of these two optional methods. If you want to add them, be sure to use the exact method signatures. createConnection() is invoked once at the beginning of the Event Monitor execution and destroyConnection() is invoked once at the end. The event retrieval method—for example, getEvent() above—is potentially invoked multiple times during the Event Monitor test run in accordance with the selected pattern. The connection object you create with the createConnection method is passed to the event retrieval method so you can potentially use that connection to return an event.

Example

from com.parasoft.api import *

def getEvent(url, username, password, connection, context):
    return connection.getEvent()
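If you prefer to keep this logic in Java rather than Jython, the sketch below shows one way the three methods described above could fit together. Treat it as an assumption-laden outline: MyMonitorConnection is a hypothetical placeholder for your own client API, and the context parameter is typed here as com.parasoft.api.Context per the argument descriptions above (your installation's Javadocs are the authority on the exact signatures SOAtest expects).

import com.parasoft.api.Context;
import com.parasoft.api.Event;
import com.parasoft.api.IEvent;

public class PollingMonitorSketch {

    public Object createConnection(String url, String username, String password,
                                   Context context) {
        // Connect to the system you want to monitor and return the handle.
        return new MyMonitorConnection(url, username, password);
    }

    public IEvent getEvent(String url, String username, String password,
                           Object connection, Context context) {
        // Reuse the connection created in createConnection() to fetch the next entry.
        MyMonitorConnection c = (MyMonitorConnection) connection;
        return new Event(c.nextLogEntry());
    }

    public Object destroyConnection(Object connection, Context context) {
        ((MyMonitorConnection) connection).close();
        return connection;
    }

    // Hypothetical stand-in for whatever client API your monitored system provides.
    private static class MyMonitorConnection {
        MyMonitorConnection(String url, String user, String password) { /* connect */ }
        String nextLogEntry() { return "example log entry"; /* fetch from your system */ }
        void close() { /* disconnect */ }
    }
}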

Using the “Subscribe to an event producer” pattern

With this pattern, Event Monitor expects a single method (with any name you wish—as long as the name is selected in the Method drop-down menu) with the following signature:

EventSubscriber getEventSubscriber(String url, String username, String password, Context context)

The argument descriptions are provided in the previous patterns. In this case, you need to provide a Java implementation of an EventSubscriber (by inheriting from com.parasoft.api.EventSubscriber). The methods to implement are:

public boolean start() throws Exception

and

public boolean stop() throws Exception

The start method will be invoked automatically by Event Monitor when the monitoring begins, and the stop method will be invoked automatically when the Event Monitor execution finishes. The assumption under this pattern is that your EventSubscriber implementation takes care of connecting to your target system and subscribing to its event-producing framework in start(), then unsubscribing and disconnecting in stop(). An example implementation for subscribing to TIBCO EMS message monitoring topics is provided below. This actually mirrors the pattern used by the built-in TIBCO EMS platform of the Event Monitor.
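Before the full TIBCO EMS example, here is a bare-bones skeleton of the same structure. The comments mark where your own connect, subscribe, and disconnect logic (all hypothetical here) would go; each received message is forwarded to the Event Monitor via onEvent(), just as the TIBCO example below does.

import com.parasoft.api.Event;
import com.parasoft.api.EventSubscriber;

public class MyEventSubscriber extends EventSubscriber {

    private volatile boolean started;

    public boolean start() throws Exception {
        started = true;
        // Connect to your event-producing system here and register a callback
        // that forwards each received message to the Event Monitor, e.g.:
        //     onEvent(new Event(message));
        return started;
    }

    public boolean stop() throws Exception {
        started = false;
        // Unsubscribe and close the connection to your system here.
        return started;
    }
}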

Example

This example is also available under the examples scripting directory that ships with SOAtest. It is included as an Eclipse project in a zip archive that can be imported into your Eclipse workspace. It requires tibjms.jar from TIBCO EMS and parasoft.jar to be added to the classpath in order to build and run. parasoft.jar is available at [SOAtest install directory]/plugins/com.parasoft.xtest.libs.web_1.0.0/root

import com.parasoft.api.*;
import com.tibco.tibjms.*;
import javax.jms.*;

public class TIBCOEventSubscriber extends EventSubscriber {

    private ConsumerRunnable _consumerRunnable;
    private Connection _connection;
    private String _dest;
    private boolean _started = false;

    public TIBCOEventSubscriber(String url, String username, String password,
            String destination) throws JMSException {
        _dest = destination;
        QueueConnectionFactory factory = new TibjmsQueueConnectionFactory(url);
        _connection = factory.createQueueConnection(username, password);
    }

    public boolean start() throws Exception {
        Session session = null;
        session = _connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Destination destination = session.createTopic(_dest);
        MessageConsumer msgConsumer = session.createConsumer(destination);
        _connection.start();
        _started = true;
        _consumerRunnable = new ConsumerRunnable(msgConsumer);
        Thread thread = new Thread(_consumerRunnable);
        thread.start();
        Application.showMessage("Monitoring started on " + _dest);
        return true;
    }

    public boolean stop() throws Exception {
        _started = false;
        Thread.sleep(1000);
        if (_connection != null) {
            _connection.close();
            Application.showMessage("Monitoring connection closed");
        }
        if (_consumerRunnable != null && _consumerRunnable.getException() != null) {
            throw _consumerRunnable.getException();
        }
        return _started;
    }

    private class ConsumerRunnable implements Runnable {

        private MessageConsumer msgConsumer;
        private Exception t;

        public ConsumerRunnable(MessageConsumer msgConsumer) {
            this.msgConsumer = msgConsumer;
        }

        public void run() {
            while (_started) {
                try {
                    MapMessage msg = (MapMessage) msgConsumer.receive(500);
                    if (msg == null) {
                        continue;
                    }
                    Message actualMessage = null;
                    try {
                        byte[] bytes = msg.getBytes("message_bytes");
                        if (bytes != null && bytes.length > 0) {
                            actualMessage = Tibjms.createFromBytes(bytes);
                        }
                    } catch (JMSException e1) {
                    }
                    // you can provide your own extension of Event in order to customize
                    // the event getLabel(), toString() and toXML() outputs
                    IEvent event;
                    if (actualMessage == null) {
                        event = new Event(msg);
                    } else {
                        event = new Event(actualMessage);
                    }
                    event.setTimestamp(msg.getJMSTimestamp());
                    onEvent(event);
                } catch (JMSException e) {
                    Application.showMessage(e.getClass().toString() + ": " + e.getMessage());
                    try {
                        msgConsumer.close();
                    } catch (JMSException e1) {
                        Application.showMessage(e1.getClass().toString() + ": " + e1.getMessage());
                    }
                    t = e;
                    _started = false;
                }
            }
        }

        public Exception getException() {
            return t;
        }
    }
}


Generating Tests from Monitored Transactions

In addition to providing visibility into your test transactions, SOAtest can monitor the system while real transactions execute, and then generate tests automatically from these messages. This includes tests for transaction entry/exit points, as well as tests for intermediate parts of the transaction. Benefits of this include:
• Accelerate test case creation by running real transactions through the system, then generating tests that can replay them.
• Consume real messages with real data into tests from monitoring.
• Debug problems and reduce complexity by isolating components in an integrated system, replaying messages that trigger only certain parts of the transaction.
For details, see:
• “Creating Tests From Sonic ESB Transactions”, page 404
• “Creating Tests From TIBCO EMS Transactions”, page 407
• “Creating Tests From JMS System Transactions”, page 401


Service Virtualization: Creating and Deploying Stubs

In this section:
• Understanding Stubs
• Creating Stubs from Functional Test Traffic
• Creating Stubs from Recorded HTTP Traffic
• Creating Stubs from WSDLs or Manually
• Working with Stubs
• Configuring Stub Server Deployment Settings


Understanding Stubs

Evolving services in a distributed SOA environment, and across multiple teams, is a complex endeavor due to the interdependencies between the system and business processes. For example, in a system that incorporates multiple endpoints such as credit card processing, billing, shipping, etc., it may be difficult for one team to test the responses from another team without interrupting normal business transactions.
With SOAtest’s stub generation capability, you can test complex and distributed environments by automatically generating server stubs based on existing test suites. SOAtest can quickly and easily automate server emulation across multiple environments, thereby streamlining the collaborative development and testing activities of multiple teams and ultimately speeding the SDLC. SOAtest can:
1. Replace the various endpoints in the system architecture (services, applications, mainframes, stored procedures, etc.) as well as emulate the application behavior at the unit level.
2. Add stubs to the test environment to replace the behavior of various components that would otherwise inhibit your team’s ability to effectively exercise and validate the end-to-end business process.

Constructing Stubs

Stubs can be constructed in several main ways:
• Emulation Based on Historical Data
• Contextual Emulation
• Using Data Sources (for unavailable services)

Emulation Based on Historical Data

You can automatically emulate services based on real-world historical data (request/reply sets) collected from the runtime environment. Once you have captured a set of messages from the actual system (via monitoring, tracing, log files, etc.), SOAtest can create an emulated version of the service based on the same messages. For example, you can obtain runtime message sets from a runtime monitoring system such as AmberPoint Management System; SOAtest can then automatically generate stubs that represent the monitored behavior.

Contextual Emulation

In some cases, it’s necessary to have a better understanding of the context around these messages in order to produce these emulated services intelligently. When this contextual understanding is important, you can create a functional test that models the scenario that you want emulated (you simply interact with the actual components to be emulated), then have Parasoft automatically generate stubs that emulate the behavior monitored when executing the modeled scenario. This significantly reduces the resources required to intelligently mirror the real-world behavior of complex, distributed, and heterogeneous environments.

For example, assume you are interacting with Amazon Web Services (AWS). In order to emulate them, you can model a scenario that interacts with the actual Amazon services, then automatically “flip” that scenario into an emulated version. With the emulated version deployed locally, you gain control over its behavior for your testing environment (instead of relying on Amazon). This way, you can easily emulate error conditions, realistic delays, and so forth.


Using Data Sources to Define Behavior for Unavailable Services

Even if a system is completely unavailable, you can rapidly create an emulated version of necessary services from scratch. For example, you can automatically create a stub skeleton from a WSDL, use spreadsheets or other data sources to define the desired behavior, and then visually correlate request parameter values with desired response values.
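For illustration only, the data source for a simple stubbed quote-lookup operation might be a spreadsheet along the following lines; the column names and values are hypothetical, and each row pairs an expected request parameter value with the response values the stub should return for it:

  symbol,  companyName,     lastPrice
  ABC,     ABC Industries,  25.10
  XYZ,     XYZ Holdings,    8.75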

Extending Beyond Services

SOAtest’s stub generation is not limited to service emulation. The solution’s extreme extensibility and easy customization allow you to stub any component or protocol that is creating a dependency problem in the test environment. For example, you can use SOAtest to:
• Monitor what happens inside an ESB or another system as it is executed and create stubs that emulate the monitored behavior.
• Emulate the other components that the bus is talking to—for instance, a CRM application, or a partner’s service which in turn calls a Software as a Service application.
• Use unit-level stubbing to emulate the application logic’s calls to other applications, legacy systems, etc.
• Configure the test scenario to set up and clean up test data on the actual databases so that the actual data set is not impacted by test transactions.
• Simulate the behavior of a user interacting with the system via a browser.

Customizing Stubs

To ensure that the emulated assets are flexible and robust enough to represent even the most complex test scenarios, Parasoft provides easy ways to configure stubs. For example, you can use the tool configuration panel to customize stubs with different request/response use cases, error conditions, delays, and so forth. To quickly configure responses with a wide variety of values, you can set the stub to use automatically-generated inputs for specified operations, or feed it a range of values that are stored in a data source. In addition, if you have scripts or code that represents a custom response, you can integrate it directly into the emulated environment. This means that you can extend the stub to mimic any level of processing—no matter how complex or sophisticated.

The configured stubs can be deployed locally or made available as a service so that different teams or business partners can collaborate on evolving their components within the distributed architecture. If the emulated asset changes as the application evolves (for example, the WSDL for an emulated service is extended to include a new operation), the associated stub does not need to be re-created; it can be updated.

Stubs can be made available for continuous access (i.e., be deployed as "hosted stubs") or they can be integrated into an end-to-end test suite. For instance, to validate a loan approval service that executes a business process workflow with multiple steps (including calling a manager approval service and another external credit rating service), you could construct a scenario with the following tests:
• Test 1: Send a request to a loan approval service to initiate the process.
• Test 2: Act as a stub to listen for the incoming credit rating service request over HTTP and respond with the desired rating for the scenario (emulate the credit rating service response).
• Test 3: Act as a stub to consume the manager approval message over a JMS queue and respond with approval, denial, etc.
• Test 4: Get an asynchronous response from the loan process with the final loan result and validate it.
• Test 5: Execute a query against a relational database to check if the loan application was audited in the database properly.
• Test 6: Remove the application data from the database in order to restore it to the original state and make the test scenario repeatable.

Creating Stubs Specific to the Travel Industry

Certain travel industry XML services are built around plain XML messaging where the XML request is URL-encoded under a parameter named xmlRequest. SOAtest can stub services that respond to XML based on values within a URL-encoded XML request without requiring any additional configuration. SOAtest stubs recognize incoming requests with Content-Type application/x-www-form-urlencoded and expect to read the request XML from the HTTP form parameter "xmlRequest". Multiple responses mode can be used so that you can build XPaths to correlate the desired response message based on values in the XML request.
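For illustration, the following sketch shows how a client might send such a URL-encoded xmlRequest message to a deployed stub using only standard Java classes; the stub endpoint URL and the XML payload are hypothetical:

  import java.io.OutputStream;
  import java.net.HttpURLConnection;
  import java.net.URL;
  import java.net.URLEncoder;

  public class XmlRequestClient {
      public static void main(String[] args) throws Exception {
          // Hypothetical stub endpoint; use the HTTP endpoint shown for your deployed stub.
          URL url = new URL("http://localhost:9080/servlet/StubEndpoint?stub=TravelStub");

          // Hypothetical travel-style XML payload.
          String xml = "<AvailabilityRequest><origin>LAX</origin>"
                  + "<destination>JFK</destination></AvailabilityRequest>";

          // URL-encode the XML under the "xmlRequest" form parameter, as the stub expects.
          String body = "xmlRequest=" + URLEncoder.encode(xml, "UTF-8");

          HttpURLConnection conn = (HttpURLConnection) url.openConnection();
          conn.setRequestMethod("POST");
          conn.setDoOutput(true);
          conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
          try (OutputStream out = conn.getOutputStream()) {
              out.write(body.getBytes("UTF-8"));
          }
          System.out.println("Stub responded with HTTP " + conn.getResponseCode());
      }
  }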


Creating Stubs from Functional Test Traffic

This topic explains how to create stubs from existing functional test suite traffic (from SOAP Client tests that use HTTP, JMS, or MQ, as well as from Messaging Client tests). If the service you want to emulate is available, this is typically the most effective way to create stubs. Sections include:
• Creating Stubs
• Using the Stubs

Creating Stubs

To create stubs from existing test suite traffic:
1. Design a functional test suite that represents the traffic that you want the stub to emulate. For instance, you can create a test suite based on historical data (request/reply sets from AmberPoint, traffic, etc.). Or, you can design a functional test suite that sets up a sequence of events representing the contextual application behavior you want emulated.
2. Run the test suite from which you want to create stubs.
3. Select the test suite’s Test Case Explorer node, then choose Create Stub from the shortcut menu.
4. In the wizard that opens, modify the stub file name if desired, then click Next.


5. Modify the stub name and path if desired, then click Next.

6. (Optional) Specify which parameter values in the request message should be used to determine the response messages of the corresponding stub (i.e., which request message parameters you want SOAtest to evaluate when determining which response to send). In the Message Stub tool configuration, you will be able to specify the response that the stub should return when a message matching one of these "relevant" values is received.
• Choose Automatic if you want SOAtest to automatically select the parameters that vary from request to request.
• If you have a complex service and/or want to manually specify which parameters are relevant, choose Select Relevant Parameters, then use the available controls to indicate which parameters you want to use.
For instance, assume that the request message has 3 parameters: x, y, z. If you want SOAtest to consider the values of y and z in determining what response the stub should send, you would specify y and z here. The selection of y and z would then be reflected in the generated Message Stub tool’s Response settings. This is where you would specify what responses you want the stub to return.

7. Click the Finish button. SOAtest automatically creates a stub based on the existing test suite. Stubs are implemented as a set of Message Stub tools, and are grouped in a .tst file with the specified name and in the specified location. The resulting .tst file of Message Stub tools is added to a "stubs" project in the SOAtest workspace and deployed to the local stub server.


You can customize stubs with different request/response use cases, error conditions, delays, and so forth as described in “Message Stub”, page 784.

MQ Note: If you are working with IBM WebSphere MQ, you also need to provide the MQ connection settings in the stub configuration editor as described in “Configuring MQ for Stubs”, page 556.

Creating Stubs from a Messaging Client

The Messaging Client’s response messages or requests do not have to be XML. However, if you want to do request/response correlation (i.e., have SOAtest automatically discover the changing parameters and configure the XPaths for multiple responses), the request needs to be XML. If the request is not XML, then the Message Stub will be configured with a single response. If there is more than one test that uses JMS, the first test with a JMS configuration will be used to configure the JMS settings for the stub.

Using the Stubs

For details on how to customize, deploy, and exercise stubs, see “Working with Stubs”, page 541.


Creating Stubs from Recorded HTTP Traffic

If you can access a message log or a trace from the traffic between clients and servers, then you can create stubs from that data. Specifically, you can have SOAtest automatically create Message Stubs that respond to incoming messages in a way that mimics the behavior captured in the traffic. Such message traces or logs can be captured at the network level using network sniffing tools such as the free Wireshark tool (http://www.wireshark.org/), or obtained by having your application log its traffic. The format that SOAtest expects is fairly loose. It can parse a wide range of formats—as long as it can find HTTP headers and request/response messages in sequence. For details on how to create stubs from messages in a plain text traffic log/trace file, see “Creating Tests From Traffic”, page 410.
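As an illustration, a plain-text log along the following lines (an HTTP request immediately followed by its response) is the kind of traffic that can be turned into a stub; the host, URI, and XML bodies shown here are hypothetical:

  POST /bookstore/cart HTTP/1.1
  Host: store.example.com
  Content-Type: text/xml; charset=UTF-8

  <addToCart><isbn>0321125215</isbn><quantity>1</quantity></addToCart>

  HTTP/1.1 200 OK
  Content-Type: text/xml; charset=UTF-8

  <addToCartResponse><status>OK</status></addToCartResponse>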


Creating Stubs from WSDLs or Manually

If the service you want to emulate is not yet completed or accessible, you will not have any existing traffic—and thus cannot use the stub creation method described in “Creating Stubs from Functional Test Traffic”, page 535. However, you can create stubs from the WSDL test creation wizard. Alternatively, you can define stubs "from scratch" by adding Message Stub tools to the test suite and configuring them manually. This topic explains how to create stubs even if you do not have functional tests that exercise the services you want to stub. Sections include:
• Adding Stubs
• Using the Stubs

Adding Stubs

If you have access to the WSDL, you can have SOAtest automatically add and configure Message Stub tools based on the operations defined in the WSDL. This is accomplished using the WSDL test creation wizard (described in detail in “Creating Tests From a WSDL”, page 385). If you do not have access to the WSDL, if the services you want to stub are REST services, or if you simply want more control over the stub configuration process, you can manually add Message Stub tools to the test suite.

Generating Stubs from a WSDL

To generate stubs from a WSDL, complete the following:
1. Start completing the WSDL test creation wizard as normal.
2. In the WSDL page, select the Create functional tests from the WSDL checkbox and choose the Generate Server Stubs button.
3. Continue completing the wizard.


After you click Finish, a test suite will be created containing stubs with the methods declared in the WSDL that you entered. Each test in the test suite is composed of the Message Stub tool.

Defining Stubs Manually

To define stubs manually, add Message Stub tools to the test suite as described in “Adding Standard Tests”, page 331. Typically, each Message Stub tool represents an operation in the service.

Using the Stubs

For details on how to customize, deploy, and exercise stubs, see “Working with Stubs”, page 541.


Working with Stubs

This topic explains how to customize, deploy, and exercise stubs. Sections include:
• Locating and Storing Stubs
• Customizing Stub Behavior
• Using Data Sources with Stubs
• Understanding Deployment Options
  • Hosted Deployment vs. Scenario-Mode Integration
  • Dedicated (Remote) Stub Servers vs. The Local Stub Server
  • Recommended Workflow
  • Configuring the Local Stub Server
  • Configuring a Dedicated Stub Server
• Manually Deploying Stubs
• Refreshing Stubs
• Validating Stubs
• Interacting with Stubs
• Gaining Visibility into Stub Behavior
• Tutorial
• Stub Server View - GUI Reference

Locating and Storing Stubs

The "stubs" project is the default and recommended location for stub .tst files. It is added to the SOAtest workspace when a stub is deployed (either from the Stub Server view or when creating a stub from traffic). Any .tst files added to this project will be automatically deployed to the local stub server. This project contains a stubs.xml file that stores the deployment configuration for each stub—including the location of each stub's .tst file, name and HTTP endpoint path, global reporting settings, and JMS and WebSphere MQ transport settings. This file is automatically saved when SOAtest exits.

The Stub Server view is where you manage and interact with the local stub server as well as dedicated stub servers running on remote machines. To open it, choose Window> Show View> Stub Server. For an overview of the available buttons and commands, see “Stub Server View - GUI Reference”, page 552. If you want to save local stub settings before exiting SOAtest, right-click the Local machine node in the Stub Server view and choose Save Stub Deployment Changes.


Customizing Stub Behavior

You can customize the behavior of the emulated assets—with different request/response use cases specified manually or via data sources, error conditions, delays, and so forth—by customizing the automatically-generated (or manually added) Message Stub tools as described in “Message Stub”, page 784.

Using Data Sources with Stubs

Specifying response values in a data source is a very efficient way to add a significant volume of request/response pairs. For details on how to use an existing data source or rapidly create a new one, see “Using Existing Data Sources or Rapidly Creating Data Sources for Responses”, page 789.

Understanding Deployment Options

Hosted Deployment vs. Scenario-Mode Integration

Hosted Deployment

The typical usage model for stubs is "hosted deployment": the stub for the Message Stub tools’ test suite is continuously available in the background as a "hosted" stub. In this mode, the stub will wait for an appropriate message, then respond in the specified manner whenever the designated stub endpoint is contacted. This can continue as long as the stub server is running. These stubs can be deployed on a local stub server or a remote stub server. See “Dedicated (Remote) Stub Servers vs. The Local Stub Server”, page 543 for a discussion of these options.

Integrating Message Stub Tools into an End-to-End Test Scenario

An alternate usage model is to have a Message Stub tool invoked as part of an end-to-end test scenario. With this configuration, SOAtest’s local stub server will start listening for a message when that specific Message Stub test step is called as part of the test sequence. It will consume the message, then react in the manner defined in the Message Stub tool’s configuration. After this completes, the next test in the test scenario will execute.

You can cut/copy and paste automatically-generated or manually-defined Message Stub tools into an end-to-end test scenario so that they are invoked at the desired point in the test suite execution flow and can trigger additional test actions. For instance, to validate a loan approval service that executes a business process workflow with multiple steps (including calling a manager approval service and another external credit rating service), you could construct a scenario with the following tests:
• Test 1: Send a request to a loan approval service to initiate the process.
• Test 2: Act as a stub to listen for the incoming credit rating service request over HTTP and respond with the desired rating for the scenario (emulate the credit rating service response).
• Test 3: Act as a stub to consume the manager approval message over a JMS queue and respond with approval, denial, etc.
• Test 4: Get an asynchronous response from the loan process with the final loan result and validate it.
• Test 5: Execute a query against a relational database to check if the loan application was audited in the database properly.
• Test 6: Remove the application data from the database in order to restore it to the original state and make the test scenario repeatable.

These stubs can be parameterized with data from a data source. For details on constructing end-to-end test scenarios, see “End-to-End Test Scenarios”, page 306.

Dedicated (Remote) Stub Servers vs. The Local Stub Server

SOAtest allows you to configure dedicated stub servers—always-running machines that host the specified stubs in order to provide all team members and project stakeholders continuous, stable access to virtualized resources. With such a server, the team gains centralized stub access and management. Such stub servers can be accessed and managed remotely from your team’s various SOAtest installations.

SOAtest also provides a local stub server that is ideal for quickly deploying stubs. This creates an environment for easy experimentation and validation. Hosted stubs can be deployed on either a remote stub server or a local stub server. Scenario-mode stubs are always deployed on the local stub server.

Recommended Workflow

The recommended workflow for hosted stubs is to deploy a newly-created stub to the local server in order to validate that it works as expected and to fine-tune its behavior. This deployment is automated in many circumstances; when it is not fully automated, the stub can be deployed by simply dragging the related .tst file to the appropriate Stub Server node. Then, once the stub is operating properly, you can move it to a dedicated stub server for centralized, team-wide access. This re-deployment can be done by simply dragging the related .tst file from the local server to the remote one.

Configuring the Local Stub Server

The local stub server can be started and stopped from the Stub Server view or from the command line.

From the GUI

Starting the Server

To start the local stub server:
1. Open the Stub Server view (if it is not available, choose Window> Show View> Stub Server).
2. If the Server node does not have a green ball icon to the left of it, start the local stub server in one of the following ways:

• Right-click the Server node and choose Start Server.
• Select the Server node, then click Start Server in the Stub Server view’s toolbar.

Stopping the Server

You can stop the local stub server (making any stubs on the local stub server inaccessible) in any of the following ways:
• Right-click the Server node and select Stop Server.
• Select the Server node, then click Stop Server in the Stub Server view’s toolbar.

From the Command Line

To start the local stub server from the command line:
• On the local machine, use a command such as "soatestcli -startStubServer -data <workspace_dir> -localsettings <localsettings_file>".

For details on using SOAtest in command line mode, see “Testing from the Command Line Interface (soatestcli)”, page 257.

Deploying Stubs to the Local Stub Server

For instructions on how to deploy stubs to the local stub server, see “Manually Deploying Stubs”, page 546.

Configuring a Dedicated Stub Server

To work with a dedicated stub server, you start SOAtest in server mode from the designated server machine, then you interact with it from the various desktop SOAtest installations that your team uses for testing.

Starting SOAtest in Server Mode

To set up a dedicated stub server:
1. Install SOAtest Server on the machine that you want to act as a dedicated stub server.


2. On that same machine, start SOAtest in stub server mode by using a command such as: "soatestcli -startStubServer -data <workspace_dir> -localsettings <localsettings_file>".

For details on using SOAtest in command line mode, see “Testing from the Command Line Interface (soatestcli)”, page 257. The stub server is controlled by a web service with the URL http://localhost:9080/axis2/services/StubService?wsdl.
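For example, on a dedicated Linux server you might launch the stub server with a command along these lines (the workspace and localsettings paths are hypothetical):

  soatestcli -startStubServer -data /home/soatest/stub_workspace -localsettings /home/soatest/stubserver.properties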

Stopping a Dedicated Server

To stop a dedicated stub server:
• Invoke the "shutdown" operation from the stubs web service.

Interacting with a Remote Stub Server

To configure a desktop SOAtest installation to interact with a remote stub server (e.g., so you can view and add stubs):
1. Open the desktop SOAtest installation’s Stub Server view (choose Window> Show View> Stub Server).
2. Do one of the following:
• Right-click the Server node, then choose Add Server.
• Select the Server node, then click Add Server.


3. In the wizard that opens, specify the server’s host name, protocol, and port.

The server will then be added to the list of servers—allowing you to add stubs and configure stubs that run on this server.

Deploying Stubs to a Remote Stub Server

For instructions on how to deploy stubs to a remote stub server, see “Manually Deploying Stubs”, page 546 below. Note that when a stub is deployed to a remote stub server, that stub’s .tst file is written to the "stubs" project of the workspace being used by the remote stub server.

Manually Deploying Stubs

There are several ways to manually deploy stubs:
• Drag and drop (or copy/paste) already-deployed stubs from one stub server to another.
• Drag and drop (or copy/paste) .tst files to the Stub Server node representing the desired stub server.
• Drag and drop (or copy/paste) .tst files to the "stubs" project.
• Use the Add Stubs wizard.

More specifically, here is an overview of the deployment options available for the local stub server and for dedicated (remote) stub servers:


Local Stub Server:
• Right-click the Local machine node, then choose Add Stub.
• Create stubs from existing traffic as described in “Creating Stubs from Functional Test Traffic”, page 535 and “Creating Tests From Traffic”, page 410.
• Drag and drop (or copy) a .tst file to the "stubs" project in the local workspace.
• Drag and drop a .tst file to the Local machine node in the Stub Server view.
• Copy and paste a stub from a remote stub server node to the Local machine node.
• Drag and drop a stub from a remote stub server node in the Stub Server view to the Local machine node.

Remote Stub Server:
• Right-click the remote stub server node and choose Add Stub.
• Drag and drop a .tst file from the Test Case Explorer or Navigator view to the remote stub server node in the Stub Server view.
• Copy and paste a stub from the Local machine node to the remote stub server node in the Stub Server view.
• Drag and drop a stub from the Local machine node to the remote stub server node.

Detailed instructions are provided in the following sections.

You Can Skip Manual Deployment If...

No manual deployment steps are needed if:
• You created stubs from functional test traffic (as described in “Creating Stubs from Functional Test Traffic”, page 535) and you want to deploy stubs to the local stub server... OR
• You created stubs from recorded traffic (as described in “Creating Tests From Traffic”, page 410) and you want to deploy stubs to the local stub server... OR
• You are invoking stubs by running Message Stub tools as part of an end-to-end test scenario (as described in “Hosted Deployment vs. Scenario-Mode Integration”, page 542).

Using Drag and Drop or Copy/Paste

The fastest way to deploy a stub to a local or remote server, or to move stubs from one server to another, is as follows:
1. In the Stub Server tab, find the node representing the local or remote server to which you want to deploy the stubs.
2. Drag (or copy/paste) the stub to that node. You can drag or copy stubs from other servers, or .tst files from the Test Case Explorer or Navigator. Additionally, you can drag or copy related test assets (such as a .csv or .xls data source used by the stub) from the Test Case Explorer or Navigator.

You can use this procedure for a variety of purposes, including:

• To deploy a newly-created stub to the local server in order to validate and fine-tune its operation.
• To move a properly-functioning stub from the local server to a remote server for team-wide use.
• To move a stub from the remote server to a local server for editing, then re-deploy the modified stub to the remote server.
• To update the .tst file used by any already-deployed stub.

Alternatively, you can deploy stubs to the local server by adding the related .tst file to the "stubs" project (through drag and drop, copy/paste, or a source control update).

Using the Add Stub Wizard (for Local Server Deployment)

If you want additional control over the deployment process (e.g., if you want to modify the endpoint), you can deploy stubs to the local server as follows:
1. In the Stub Server tab, right-click the node representing the machine to which you want to deploy the stub, then choose Add Stub.
• To deploy the stub to a remote stub server, right-click that machine’s node. For more on using remote servers, see “Dedicated (Remote) Stub Servers vs. The Local Stub Server”, page 543 and “Configuring a Dedicated Stub Server”, page 544.
• To deploy stubs to the local machine, right-click the Local machine node.


2. Specify the path to the test suite that contains your Message Stub tools, then click Next.

3. Modify the endpoint if desired, then click Finish.

About "Hosted Stub" Deployment If the stub for the Message Stub tools’ test suite is continuously available in the background (as a "hosted" stub), the stub will wait for an appropriate message, then respond in the specified manner whenever the designated stub endpoint is contacted. This can continue as long as the stub server is running.

About Deployment of Message Stub Tools Integrated into an End-to-End Test Scenario


If a Message Stub tool is invoked as part of an end-to-end test scenario, SOAtest will start listening for a message when that specific Message Stub test step is called as part of the test sequence. It will consume the message, then react in the manner defined in the Message Stub tool’s configuration. After this completes, the following test in the test scenario will execute.

Customizing Stub Deployment

For details on how to customize advanced options for stub deployment, see “Configuring Stub Server Deployment Settings”, page 554.

Re-Deploying Modified Stubs

If you modify the stubs, be sure to re-deploy them as follows:
• In the Stub Server tab, right-click the appropriate machine node, then choose Re-Deploy All Stubs from the shortcut menu to re-deploy the modified stubs.

Refreshing Stubs

Refreshing stubs ensures that the Stub Server tree is in sync with the deployed stubs. To refresh the entire Stub Server view, do one of the following:
• Right-click the Server node, then choose Refresh from the shortcut menu.
• Select the Server node, then click Refresh.

To refresh a particular stub server (e.g., to display stubs recently added by a team member):
• Right-click the related node in the Stub Server tree, then choose Refresh from the shortcut menu.

Validating Stubs

Before you configure your application to interact with your stubs, you might want to validate that the stubs behave as expected. To do this, you can create a new environment that uses the stubs instead of the actual server, then run your tests against this environment to ensure that the stubs are demonstrating the expected behavior.


For a demonstration of how to use an environment to validate stub behavior, see “Creating and Deploying Stubs”, page 102. For a general discussion of environments, see “Configuring Testing in Different Environments”, page 369.

Interacting with Stubs

For Message Stub Tools Integrated Into an End-to-End Test Scenario

When you want to have your application interact with a stub instead of an actual resource, configure your application to access the stub, which will be deployed at http://<localhost>:9080/servlet/MessageHandler. Note that every stub created and deployed in this manner has the same URL. This is possible because the Message Stub tools will be invoked one at a time, according to the logic of the test suite.

For "Hosted Stubs" that are Continuously Deployed To have your application interact with a stub instead of an actual resource, configure your application to access the stub’s HTTP endpoint (for example, http://shuttle45:9080/servlet/StubEndpoint?stub=MyStub). For stubbed REST services, use the HTTP endpoint, plus the desired parameters. Note that each stub has a unique URL because you may have multiple stubs deployed at once. To determine the stub’s HTTP endpoint: 1. Open the Stub Server view (if it is not available, choose Window> Show View> Stub Server). 2. Expand the appropriate server’s branch and double-click the node named after the test suite from which you created stubs.


Review the HTTP endpoint value in the configuration panel that opens. This is the location to which the stub is deployed.
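To make this switch easy, applications often resolve the endpoint from configuration rather than hard-coding it. The following sketch is purely illustrative (the property name and the default URL are hypothetical); a test environment can then point the application at the deployed stub simply by overriding the property:

  import java.net.URL;

  public class ServiceEndpointConfig {
      public static URL resolveBillingEndpoint() throws Exception {
          // Hypothetical property; a test environment can set, for example,
          //   -Dbilling.service.url=http://shuttle45:9080/servlet/StubEndpoint?stub=MyStub
          // to route the application to the deployed SOAtest stub instead of the real service.
          String endpoint = System.getProperty(
                  "billing.service.url",
                  "https://billing.example.com/service"); // real endpoint used by default
          return new URL(endpoint);
      }
  }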

Gaining Visibility into Stub Behavior

Visibility into what messages are sent to and from stubs—and what validations and errors occur at the stub level—enables you to:
• Validate the messages that the application sends to your stubs.
• See what errors occur based on how your application interacts with the stubbed services.

When Message Stub tools are integrated into end-to-end test suites, you can gain visibility into their behavior through the Traffic Viewer tools that are attached to them. When stubs are deployed as continuously available "hosted stubs", you can use the Event Monitor to gain visibility into the stubs. For details on how to configure this, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.

Tutorial

For a tutorial on using stubs, see “Creating and Deploying Stubs”, page 102.

Stub Server View - GUI Reference

Toolbar Buttons

The Stub Server view’s toolbar provides the following buttons:
• Start Server: Starts the local server.
• Stop Server: Stops the local server.
• Refresh: Refreshes all servers in the tree.
• Add Server: Allows you to add a remote server to the Stub Server tree.

Shortcut (Right-click) Commands

The following shortcut (right-click) commands are available within the Stub Server view:

From the Server node:
• Start Server: Starts the local server.
• Stop Server: Stops the local server.
• Refresh: Refreshes all servers in the tree.
• Add Server: Allows you to add a remote server to the Stub Server tree.

From the Local machine node:
• Open: Opens a panel that allows you to configure advanced settings for the local stub server. See “Configuring Stub Server Deployment Settings”, page 554 for details.
• Refresh: Refreshes the local stub server.
• Add Stub: Allows you to add a stub to the local stub server.
• Re-deploy All Stubs: Re-deploys stubs so that modifications are "live."
• Save Stub Deployment Changes: Forces SOAtest to save the local stub modifications to stubs.xml. Otherwise, changes will be saved upon exiting SOAtest.

From a remote server node:
• Open: Opens a panel that allows you to configure advanced settings for the given stub server. See “Configuring Stub Server Deployment Settings”, page 554 for details.
• Refresh: Refreshes the given server (e.g., to keep it in sync with stubs added or removed by other team members).
• Add Stub: Allows you to add a stub to the given server.
• Re-deploy All Stubs: Re-deploys stubs so that modifications are "live."
• Remove Server: Removes a remote server from the Stub Server tree.

From a specific stub node (local machine or remote server):
• Open: Opens a panel that allows you to configure advanced deployment settings for the given stub. See “Configuring Stub Server Deployment Settings”, page 554 for details.
• Copy: Allows you to copy a stub so you can paste it from one server to another.
• Paste: Allows you to paste a copied stub from one server to another.
• Delete: Deletes the stub from the given server.
• Unprocessed Messages: Shows details on messages that were sent to that stub, but not processed by that stub.


Configuring Stub Server Deployment Settings

This topic explains how to configure advanced deployment settings for local or remote stub servers—and for the stubs deployed upon them. Via the Stub Server tab, various settings can be configured globally—or individually for each stub (when stubs are created automatically from a functional test suite as described in “Creating Stubs from Functional Test Traffic”, page 535). When a global setting is configured, the setting applies to all deployed stubs on the given stub server. The global setting can be overridden by configuring the individual setting for the stub. Sections include:
• Global Stub Deployment Settings
• Individual Stub Deployment Settings
• Configuring JMS for Stubs
• Configuring MQ for Stubs

Note: For details on customizing stub behavior (e.g., how to customize stubs with different request/response use cases, error conditions, delays, and so forth), customize the related Message Stub tools as described in “Message Stub”, page 784.

Global Stub Deployment Settings

Global settings can be configured by right-clicking any listed stub server (remote or local) in the Stub Server tab, then choosing Open. From the configuration panel that opens, you can configure settings for:
• Emulating services deployed on JMS - See “Configuring JMS for Stubs”, page 556
• Emulating services deployed on IBM WebSphere MQ - See “Configuring MQ for Stubs”, page 556
• Gaining visibility into stub behavior - See “Monitoring Stub Events”, page 520

Individual Stub Deployment Settings

Individual settings can be configured by right-clicking a specific stub listed in the Stub Server view, then choosing Open.


You can then configure general options as well as JMS and MQ settings.

Configuring General Stub Deployment Options

In the General tab, you can specify:
• Test Suite: Specifies the test suite in which Message Stubs are configured. To change the related test suite, drag or copy the related .tst file to the appropriate server node in the Stub Server tree.
• "stub" parameter: If you specify a value here, the SOAtest stub will only consume messages that include a string property with a value matching that field value.
• HTTP Endpoint: If you are using HTTP (not JMS or MQ), this is where the stub can be accessed. To exercise the stub, you can configure your application to use this URL instead of the URL for the actual resource. Any machine that can access this endpoint can access and use your stub.

Note: Before deploying stubs over JMS and/or MQ, please add the appropriate jar files to the SOAtest classpath. For details on how to do this, see “System Properties Settings”, page 758.


Configuring JMS for Stubs

A stub can be configured to receive messages from, and send messages to, a queue or a topic. To configure global JMS settings that apply across a specific stub server, double-click the appropriate server machine node in the Stub Server view. To configure JMS settings for a specific stub, double-click the appropriate stub node in the Stub Server view. The following JMS options are required in the JMS Settings tab:
• providerURL: Specifies the value of the property named javax.naming.Context.PROVIDER_URL passed to the JNDI javax.naming.InitialContext constructor.
• initialContext: Specifies a fully qualified class name string, passed to the JNDI javax.naming.InitialContext constructor as a string value for the property named javax.naming.Context.INITIAL_CONTEXT_FACTORY.
• Messaging Model: Messaging Model options specify how messages are sent between applications. Select either Point to Point or Publish and Subscribe, then specify the settings in the appropriate area (Point-to-Point Settings or Publish-and-Subscribe Settings).
• Message Selector Expression: (optional) When the same queue is being used by multiple services, it is helpful to specify a message selector expression. For example, if the message selector expression is "product = 'soatest'", then the stubs will only select messages in the queues/topics that have a JMS property "product: soatest". See “Using Message Selector Filters”, page 704 for tips.
• username/password: Enter if needed.

Behavior of Stubs Deployed Over JMS

The JMSMessageID of the request message will be sent as the JMSCorrelationID of the response message. SOAtest stubs deployed over JMS can be invoked simply by having the application send or publish the messages to the specified destination as usual. SOAtest will consume messages on that destination. If a value is specified in the Message Selector Expression field, it will consume any message that matches the specified expression. Optionally, you can also specify a value in the "stub" parameter field. In this case, the SOAtest stub will only consume messages that include a string property with a value matching that field value.
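For illustration, here is a minimal sketch of how a client application might send a message that such a stub would consume, assuming the message selector expression "product = 'soatest'" mentioned above. The initial context factory, provider URL, JNDI names, and message payload are hypothetical (ActiveMQ-style values are shown) and would instead match the providerURL and initialContext options configured for the stub:

  import java.util.Properties;
  import javax.jms.Connection;
  import javax.jms.ConnectionFactory;
  import javax.jms.MessageProducer;
  import javax.jms.Queue;
  import javax.jms.Session;
  import javax.jms.TextMessage;
  import javax.naming.Context;
  import javax.naming.InitialContext;

  public class SendToJmsStub {
      public static void main(String[] args) throws Exception {
          // JNDI properties corresponding to the stub's initialContext and providerURL options.
          Properties env = new Properties();
          env.put(Context.INITIAL_CONTEXT_FACTORY,
                  "org.apache.activemq.jndi.ActiveMQInitialContextFactory");
          env.put(Context.PROVIDER_URL, "tcp://localhost:61616");
          InitialContext ctx = new InitialContext(env);

          ConnectionFactory factory = (ConnectionFactory) ctx.lookup("ConnectionFactory");
          Queue queue = (Queue) ctx.lookup("dynamicQueues/stub.request.queue");

          Connection connection = factory.createConnection();
          try {
              Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
              MessageProducer producer = session.createProducer(queue);

              TextMessage message = session.createTextMessage(
                      "<getQuote><symbol>ABC</symbol></getQuote>");
              // Matches a stub whose Message Selector Expression is "product = 'soatest'".
              message.setStringProperty("product", "soatest");
              producer.send(message);
          } finally {
              connection.close();
          }
      }
  }

Because the stub copies the request's JMSMessageID into the response's JMSCorrelationID, a reply consumer can correlate responses by selecting on JMSCorrelationID.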

Configuring MQ for Stubs

SOAtest stubs can emulate services deployed on IBM WebSphere MQ by configuring the necessary MQ Settings. To configure global MQ settings that apply across a specific stub server, double-click the appropriate server machine node in the Stub Server tab. To configure MQ settings for a specific stub, double-click the appropriate stub node in the Stub Server tab. The following MQ options are required in the MQ Settings tab:
• mq_host: Specifies the name of the host running IBM MQ.
• mq_port: Specifies the port number for IBM MQ.
• queueManager: Specifies the name of the Queue Manager.
• channel: Specifies the name of the server-defined channel.
• putQueue: Specifies the queue that SOAtest sends the SOAP message to.
• getQueue: Specifies the queue that SOAtest retrieves the SOAP message from.
• Message Selector ID: (optional) When the same queue is being used by multiple services, it is helpful to specify a message selector expression. For example, if the message selector expression is "product = 'soatest'", then the stubs will only select messages in the queues that have a header "product: soatest". See “Using Message Selector Filters”, page 704 for tips.

Behavior of Stubs Deployed over IBM WebSphere MQ

SOAtest stubs deployed over MQ can be invoked simply by having the application send or publish the messages to the specified destination as usual. SOAtest will consume messages on that destination. If a value is specified in the Message Selector Expression field, it will consume any message that matches the specified expression. Optionally, you can also specify a value in the "stub" parameter field. In this case, the SOAtest stub will only consume messages that include a string property with a value matching that field value.

Note on IBM WebSphere MQ Clients: In order to use the MQMD.putApplicationName field, the client must also ensure that MQMD.putApplicationName matches the "stub" parameter field in the stub configuration editor.
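As an illustration, the following sketch shows how an MQ client application might put a request message for such a stub using the IBM MQ classes for Java, setting MQMD.putApplicationName to match the stub's "stub" parameter. The host, port, channel, queue manager, and queue names are hypothetical and would come from the MQ Settings described above; note that setting MQMD context fields typically requires the appropriate context authority on the queue.

  import com.ibm.mq.MQEnvironment;
  import com.ibm.mq.MQMessage;
  import com.ibm.mq.MQPutMessageOptions;
  import com.ibm.mq.MQQueue;
  import com.ibm.mq.MQQueueManager;
  import com.ibm.mq.constants.CMQC;

  public class SendToMqStub {
      public static void main(String[] args) throws Exception {
          // Hypothetical values matching the stub's mq_host, mq_port, and channel options.
          MQEnvironment.hostname = "mqhost.example.com";
          MQEnvironment.port = 1414;
          MQEnvironment.channel = "SOATEST.SVRCONN";

          MQQueueManager queueManager = new MQQueueManager("QM1"); // queueManager option
          // Put to the queue that the stub reads from (the stub's getQueue option);
          // MQOO_SET_ALL_CONTEXT lets the client set MQMD context fields such as putApplicationName.
          MQQueue requestQueue = queueManager.accessQueue("STUB.REQUEST.QUEUE",
                  CMQC.MQOO_OUTPUT | CMQC.MQOO_SET_ALL_CONTEXT);

          MQMessage message = new MQMessage();
          message.format = CMQC.MQFMT_STRING;
          message.putApplicationName = "MyStub"; // must match the stub's "stub" parameter field
          message.writeString("<getQuote><symbol>ABC</symbol></getQuote>");

          MQPutMessageOptions pmo = new MQPutMessageOptions();
          pmo.options = CMQC.MQPMO_SET_ALL_CONTEXT;
          requestQueue.put(message, pmo);

          requestQueue.close();
          queueManager.disconnect();
      }
  }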

Load Testing

In this section:
• Load Testing your Functional Tests: Introduction
• Load Test Documentation and Tutorial
• Preparing Web Functional Tests for Load Testing


Load Testing your Functional Tests: Introduction

Load testing is performed in Parasoft Load Test—a load testing platform that features:
• Centrally-managed load test configuration/execution with seamless integration into Parasoft SOAtest. This is aligned with how teams and roles are typically structured within an organization.
• The ability to load test complete end-to-end test scenarios—from the web interface, through services, to the database. Every protocol and test type available in Parasoft SOAtest is supported in Parasoft Load Test.
• Support for load testing non-Parasoft components such as JUnits or lightweight socket-based components. This provides an integrated solution for your various load testing needs.

Important Notes
• Obtaining Parasoft Load Test: The Parasoft SOAtest installer installs both Parasoft SOAtest and Parasoft Load Test.
• Web load testing: If you want to use your browser-based functional tests for browser-less web load testing, use SOAtest to configure them for this application. For details, see “Preparing Web Functional Tests for Load Testing”, page 561.


Load Test Documentation and Tutorial

Detailed load testing documentation is provided with the Load Test product. Documentation is available in a fully-searchable online help system, as well as a PDF. A load testing tutorial is included as part of the documentation.


Preparing Web Functional Tests for Load Testing

SOAtest web functional tests (including automatically-generated tests added when you record from a browser as well as manually-added Browser Testing Tool tests) are designed to be run in a browser. Since load tests don’t run in a browser, some configuration is necessary to reuse functional web tests in a load testing environment—where web tests are conducted by sending requests to the server.

Parasoft SOAtest automatically configures your browser-based functional tests for load testing. It also validates them by executing them in a simulated load testing environment. This significantly reduces the setup required to create meaningful web load tests, and helps you to identify and resolve any potential load test issues before the load testing efforts actually begin.

This topic explains how to prepare your web functional tests for load testing. Sections include:
• Recommended Preparation Procedure
• Accessing and Understanding the Load Test Perspective
• Configuring Tests
• Validating Tests
• Notes

Recommended Preparation Procedure

The recommended procedure is to configure your tests for load testing as described in “Configuring Tests”, page 562, then validate that they will work properly as described in “Validating Tests”, page 566. However, if you do not want to run the configuration step (e.g., because you have already configured the tests and do not want to overwrite any manual configurations you added), configuration is not required as long as the validation step passes.

Accessing and Understanding the Load Test Perspective

The Load Test perspective is designed to help you prepare your web functional tests for load testing. To open the Load Test perspective:
• Choose Window> Open Perspective> Other> Load Test.

This perspective is similar to the SOAtest perspective, but it also provides the following features:
• Two toolbar buttons (Configure for Load Test and Validate for Load Test), which allow you to run automated test configuration and validation.
• A Load Test Explorer, which lists the available web functional tests. Note that any web functional test components that are not relevant to load testing—for example, browser-based validations or data banks—will not be shown in this view.
• Load Test Explorer right-click menus for running automated test configuration and validation (the same commands available in the toolbar buttons).
• Specialized test configuration panels, which are accessed by double-clicking a test in the Load Test Explorer.


Configuring Tests

Why Do I Need to Configure Tests?

Load tests take the set of requests that the browser test would use and send those requests outside of the browser context. Sometimes, browser requests become invalid when they are re-submitted outside of the browser—for instance, because a request contains session-dependent information such as a session ID. In these cases, configuration is required. To facilitate configuration, SOAtest identifies such issues and automatically configures the requests to run in the browser-less load test environment. In the configure mode, SOAtest:
1. Runs the test twice to identify dynamic parameters (e.g., session IDs).
2. Sets up a Text Data Bank to extract a valid value for each dynamic request parameter (e.g., using a value extracted from a previous test, or an earlier response in the current test). For more details about this tool, see “Text Data Bank”, page 952.
3. Configures the test to use the appropriate extracted value for each dynamic parameter.

These requests are saved with the appropriate tests, and can be accessed as described in "How Can I Review and Modify the Requests that SOAtest Configured?" below. This configuration is required when either:
• Load test validation does not succeed.
• Your application has evolved to the point that your existing load test configurations no longer match the application functionality.

How Do I Configure Tests?

Warning: Configuration will re-create all the existing load testing requests based on the application’s existing state. As a result, any existing load test configurations you have set up in SOAtest (for example, if you manually configured the URL or Post Data to be set using parameterized or scripted values) will be overwritten.

Run the automated configuration as follows:
1. In the Load Test Explorer, select the test suite that you want to configure.
2. Either click the Configure for Load Test toolbar button, or right-click the test suite and choose Configure for Load Test from the shortcut menu.

Next, validate tests as described in “Validating Tests”, page 566.

How Can I Review and Modify the Requests that SOAtest Configured?

If you double-click on a Browser Testing tool in the Load Test Explorer (available in the Load Test perspective), you will see a special configuration panel that displays a list of the requests that the test is supposed to make. It shows both the URL and post data, and allows you to modify these if desired. You can also add and remove requests using the controls on the right of the configuration panel.

How Do I Parameterize or Script Request Values?

If you want to use a dynamic value for any part of the request, you can parameterize requests with values from a data source or values extracted from another test—or with values resulting from custom scripting. To do this:
1. Double-click the test in the Load Test Explorer (available in the Load Test perspective) to open its configuration panel.
2. Select the specific request whose values you want to parameterize.
3. In the URL or Post Data tab (depending on what part of the request you wish to parameterize), highlight the text you want to parameterize.
4. Click Parameterize Selected Text.
5. In the dialog that opens, specify a name for the parameterized value. The actual value in the URL or Post Data tab will be replaced with a reference to a variable, and an entry for that variable will be added to the Parameterized Values area at the bottom of the test configuration panel.
6. To configure the variable to use a value that is stored in a data source or that is extracted from another test, choose Parameterized in the Value field, then select the desired data source column in the box to the right of Parameterize.
• See “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345 for more details about parameterizing tests.
7. To configure the variable to use the result of a custom script, select Scripted in the Value field, then click Edit Script and specify the script details.
• See “Extensibility (Scripting) Basics”, page 764 for more details about using custom scripts.

Note that if your functional test is already configured to use parameterized values, the configuration step will set up the tests so that the parameterized values will also be used for load testing.

What Happens During Test Configuration?

When running the “Configure for Load Test” step, SOAtest executes the test suite twice and performs three kinds of automated configuration:
1. HTTP requests that were previously set to use data source values for the functional test scenario are automatically configured to use the same data source values for the load test configuration. Here is an example where SOAtest parameterized the username and password parameters in the HTTP request with the same data source values that the functional test was already configured to use:


Note that the “password” parameter is configured to use the “Password” column from the “Login Values” data source.
2. Dynamic parameter values (for example, session IDs) in the HTTP requests are configured to use the updated values from the current session. SOAtest does this by creating a Text Data Bank tool on an HTTP response that contains the dynamic value. This data bank value is then used in the appropriate HTTP request. For instance, in the example shown below, Test 2 had a Text Data Bank added to it. Note that a value is being extracted into a column named “qid”; left and right hand text boundaries have been auto-configured.


Below, you can see that one of the HTTP requests has been configured to use the extracted value:

For more details about this tool, see “Text Data Bank”, page 952.
3. The scenario is configured to extract the same data bank values that were extracted for functional testing. This configuration makes the values available for other tools in mixed web/SOA load test scenarios. For example, assume that Test 2: Click “Google Search” has a Browser Data Bank that extracted a column named “Extracted: bookName”. SOAtest found the HTTP response that contained the same value, and created a Text Data Bank that extracts the value into a column with the same name (“Extracted: bookName”). This value is later used in a SOAP client, as shown below:


Validating Tests

Why Validate Tests?

When you validate tests, SOAtest will run them in load testing mode and alert you to any outstanding issues that might impact your load testing—for example, incorrectly configured HTTP requests. This way, you can resolve these issues before the actual load testing begins.

How Do I Validate Tests?

Run the automated validation as follows:
1. In the Load Test Explorer, select the test suite that you want to validate.
2. Either click the Validate for Load Test toolbar button, or right-click the test suite and choose Validate for Load Test from the shortcut menu.

If the validation succeeds, the Validate for Load Test tab will report that the test suite is ready to be used in Load Test. If a potential issue is detected, it will report that n issues need to be resolved before this test suite can be used in Load Test. You can then review the reported issues in the SOAtest view.

How Can I See a Test Step Rendered in a Browser?

To better determine what is occurring at each test step, you can have SOAtest display what happens when the load test requests are rendered in a browser. To do this, double-click the Browser Contents Viewer added to the related test. This is especially helpful if you want to visualize why the test is not producing the expected results. For example, the rendered page might reveal that the login is not occurring properly. Using this tool, along with examining error messages, helps you identify and resolve the cause of problems.

What Is Validation Looking For?

During validation, SOAtest determines if any configuration needs to be done on the scenario—either automated configuration (by SOAtest) or manual configuration. If validation does not succeed, this indicates that you need to run the configuration step or—if you have already run the configure step—that you need to manually configure parameters.

What if Problems Are Reported?

If the dynamic parameters could not be auto-configured by “Configure for Load Test”, one or both of the following will happen:
1. Errors will be reported by “Validate for Load Test”. Here are the kinds of errors you might see and what they could mean:
a. HTTP error codes (e.g., 404 – Not Found or 401 – Not Authorized). This means that the HTTP requests have incorrect dynamic parameter values or are otherwise wrongly configured.
b. Functional test errors such as “Unable to perform user action” or “Unable to validate or extract …”. These errors occur because the specified page elements for the failing test could not be found. Page elements not being found is typically the result of the HTTP responses containing unexpected data. Again, this is usually the result of the HTTP requests having incorrect dynamic parameter values or being otherwise wrongly configured.


2. The Browser Contents Viewer will show an incorrect or unexpected page at the point where the incorrect dynamic parameter was used.

If such issues occur, run “Configure for Load Test”. If “Configure for Load Test” has already been run and these errors are still occurring, you may need to manually configure the HTTP requests and/or parameters causing the problem.

When Do I Need to Manually Configure Parameters?

There is one class of dynamic parameter values that SOAtest cannot configure automatically: values that are normally constructed or transformed by JavaScript in the browser. Since the (transformed) parameter values do not exist in any of the HTTP responses, SOAtest cannot extract them to be used where necessary in any HTTP requests. These parameters need to be configured manually. Validation will alert you when these kinds of dynamic parameters are present and are required by the web application to be updated for each session.

How Do I Manually Configure Parameters?

Use the procedure described in “How Do I Parameterize or Script Request Values?”, page 562. Here is an example of a parameter that passes the current time to the server. This is a dynamic parameter, constructed by JavaScript, that is not present in any of the previous HTTP responses. It has been manually configured to be parameterized using a script that calculates and returns the current time.
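By way of illustration, the logic behind such a script can be as small as the following helper, which returns the current time in milliseconds as a string. The class and method names here are hypothetical, and the way a scripted value is actually hooked up (and the scripting languages available) is described in “Extensibility (Scripting) Basics”, page 764:

  // Hypothetical helper illustrating a scripted parameter value.
  public class CurrentTimeParameter {
      public static String currentTimeMillis() {
          return Long.toString(System.currentTimeMillis());
      }
  }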


Notes
• Web load testing focuses on requests that result in text responses. It does not transfer binary files (such as images, Flash files, JavaScript, CSS, etc.). This allows you to simulate a mode where everything is cached on the user’s machine—providing response times that are accurate for a repeat visitor/existing user.
• The requests for web load testing are configured to simulate the browser specified in the test suite’s Browser Playback Options tab. Browser type is simulated by sending the appropriate header content (User-Agent and Accept); an example of typical values is sketched below.
• If the application requires basic or NTLM authentication, the settings used in the test suite’s Browser Playback Options tab will be applied to web load testing as well.
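For illustration only, a load test request simulating a desktop Chrome browser might carry header values along these lines; the exact strings depend on the browser selected in the Browser Playback Options tab:

  User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36
  Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8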

SOA Quality Governance and Policy Enforcement

In this section:
• SOA Policy Enforcement: Overview
• Defining the Policy
• Enforcing Policies on WSDLs, Schemas, and SOAP Messages


SOA Policy Enforcement: Overview

This topic provides an overview of SOAtest’s quality policy enforcement capabilities. Sections include:
• Policy Enforcement Details
• Recommended Workflow
• Tutorial

Policy Enforcement Details

SOAtest provides a complete SOA policy enforcement solution, enforcing policies with executable rules that can be applied to WSDLs, schemas, SOAP messages, and any other XML artifact or SOA meta-data component. Once an organization has defined the policies that guide its SOA deployments, SOAtest can be used to enforce them throughout the development and QA process. For example, SOAtest verifies schema and semantic validity for W3C and OASIS standards compliance, validates Basic Profile 1.1 for WS-I Interoperability compliance, and implements rules to enforce various other endorsed WS-* standards. In addition, SOAtest can be used to enforce compliance with best practices such as customized company guidelines, security, and maintainability and reusability.

Registry-Based Policy Management
SOAtest provides native support for multiple commercial registries. This integration enables teams to automatically execute a quality workflow and correlate quality data in the context of an SOA Governance initiative. Teams can automatically extract the information needed to create tests for design and development policies (such as standards, compliance, security, and best practices) for Web service assets as they are defined in a registry. They can also select a service asset and verify the associated policies, thereby ensuring interoperability and consistency.
SOAtest is capable of querying any UDDI registry from vendors such as IBM, HP, and Microsoft. Furthermore, Parasoft offers even tighter integration with Oracle / BEA's AquaLogic Enterprise Repository (ALER) and Software AG's CentraSite. Tests are automatically generated at the time the services are published to the registry, including functional test cases and WSDL verification tests that ensure WSDLs are compliant with best practices and organizational policies. Policy compliance results are then reported back to the registry and updated in real time. This provides continuous visibility into a service's quality throughout its lifecycle.

Registry-Based Test Generation
• To learn how to create tests that enforce policies applied to Web service assets that are declared in a BEA repository, see "Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository", page 393.
• To learn how to create tests that enforce policies applied to Web service assets that are declared in a Software AG CentraSite repository, see "Creating Tests From Software AG CentraSite Active SOA", page 399.


WSDL, Schema, and Semantic Verification
WSDL verification can be considered the first step in testing Web services. Although WSDLs are generally created automatically by various tools, that does not necessarily mean the WSDLs are correct. When WSDLs are manually altered, WSDL verification becomes even more important. Ensuring correct and compliant WSDLs enables your service consumers to function correctly and avoids vendor lock-in, thus achieving interoperability and realizing the SOA goals of service reuse.
SOAtest can automatically generate a test suite of comprehensive WSDL tests to ensure that your WSDL conforms to the schema and passes XML validation tests. Additionally, it performs an interoperability check to verify that your Web service is WS-I compliant.

WS-* Standards Validation
SOAtest enforces policies with executable rules that can be applied to WSDLs, schemas, SOAP messages, and any other XML artifact or SOA metadata component. For example, it verifies schema and semantic validity for W3C and OASIS standards compliance, validates Basic Profile 1.1 for WS-I interoperability compliance, and implements rules to enforce various other endorsed WS-* standards. In addition, it enforces compliance with best practices such as customized company guidelines, security, maintainability, and reusability.

Interoperability Testing
SOAtest verifies the WSDL and SOAP traffic for conformance to Basic Profile 1.1 using the WS-I Testing Tools. Functioning as both a traffic monitor and analyzer, SOAtest enhances the usability of the WS-I Testing Tools by eliminating the need to set up the man-in-the-middle Monitor and the configuration files for the Analyzer. The only required input is the WSDL URL.

Recommended Workflow
SOAtest's policy enforcement component enables the SOA architect to define policy rules. Using SOAtest, Web service developers, QA, and test engineers can then verify service compliance against the architect-defined rules early in the Web service lifecycle. The architect then monitors compliance with these rules, resulting in a visible and controlled development process.


In addition, the rules defined by the architect can be applied across the entire organization via Parasoft’s Team Server, a server component that enables the sharing of rules and test policies among Parasoft’s products.

Tutorial
For a step-by-step tutorial on monitoring compliance with SOA policies, see "Design and Development Policy Enforcement", page 159.


Defining the Policy

This topic explains how to define and share a policy for your group. Sections include:
• Defining a Group Policy
• Sharing Policy Files across the Group

Defining a Group Policy
In SOAtest, a policy consists of a set of assertions, or rules. SOAtest enforces a policy by creating tests that check those rules. When creating tests from a WSDL with a policy file, SOAtest creates "policy enforcer" tests. SOAtest ships with a default policy file that addresses the key concerns for a web service in an SOA, such as interoperability and compliance with industry standards, as well as maintainability and best practices. We strongly recommend that the team's architect customize this policy to suit the team's specific needs.
To define a custom SOA policy:
1. Select File> New> Policy Configuration.
2. Specify a name and location for the policy, then click Finish.
   • The Policy Configuration panel displays in the right GUI pane of SOAtest and lists assertions that correspond to policy enforcement rules and WSDL tests.


3. From the Policy Configuration panel, you can:
   • Enable/disable individual rules or groups of rules by selecting or clearing the available check boxes.
   • Customize a parameterizable rule by right-clicking it, choosing View/Change Rule Parameters from the shortcut menu, then modifying settings as needed. Parameterized rules are marked with a special icon (a wizard hat with a radio button).
   • Search for a rule by clicking the Find button, then using that dialog to search for the rule.
   • Hide the rules that are not enabled by clicking the Hide Disabled button. If you later want all rules displayed, click Show All.
   • Define custom rules in RuleWizard by clicking New, then using the RuleWizard graphical editor or automated generator to create new rules. Once custom rules are defined, add them to the rule tree by clicking Add, then enable them. For details, open the RuleWizard User Guide by clicking New in this panel, then choosing Help> Documentation from within the RuleWizard GUI.
   • View a description of a rule by right-clicking the node that represents that rule, then choosing View Rule Documentation from the shortcut menu.

4. Click Save to save the custom policy to the location you previously specified. The policy configuration you define can be used later to automatically create tests that enforce policies; for details, see "Enforcing Policies on WSDLs, Schemas, and SOAP Messages", page 575.

Sharing Policy Files across the Group
Once you have created your policy configuration files and custom rules, they can be shared so that all team members will have access to them in their own testing environments. Policy configuration files can be uploaded to Team Server through the Team Server web interface as described in the Team Server user's guide. Rules can also be added to Team Server by choosing SOAtest> Explore> Team Server, then opening the Rules tab and uploading rules. You can also share policy configuration files via source control.


Enforcing Policies on WSDLs, Schemas, and SOAP Messages

This topic explains how to create, run, and review the results of policy enforcement tests. Sections include:
• Creating Policy Enforcement Tests
• Running Policy Enforcement Tests
• Reviewing Results/Reports

Creating Policy Enforcement Tests
Once the policy has been defined, policy enforcement tests can be created within SOAtest to reference the related rules, or assertions. From the SOAtest test creation wizard, you can create the following types of policy enforcement tests:
• WSDL: Used to enforce policies and standards on the content of your WSDL documents, including all imported WSDLs.
• Schema: Used to enforce policies and standards on schemas referenced within your WSDL file, including all imported schemas.
• SOAP: Used to enforce policies and standards on SOAP messages sent from your web service.
To reference a policy configuration when creating tests from the WSDL test creation wizard (described in detail in "Creating Tests From a WSDL", page 385):
1. Start completing the wizard as normal.
2. If you would like to create WSDL/Schema policy enforcement tests, select the Create tests to validate and enforce policies on the WSDL checkbox in the first wizard page.
3. When you reach the Policy Enforcement wizard page, select the Apply Policy Configuration check box. This will create WSDL and functional tests that will enforce the assertions defined in the specified policy configuration.


4. Enter or browse to the desired policy configuration file.
   • You can reference a local file or a file on Team Server.
   • If you do not have a custom policy configuration, you can use the default policy enforcement tests that reference the default rule sets included with SOAtest. The default policy configuration, soa.policy, is a collection of industry-wide best practices.
5. Click the Finish button.

Running Policy Enforcement Tests
You can now execute the test by clicking the Test toolbar button.

Reviewing Results/Reports
Any policy violations detected will be reported as tasks in the SOAtest view, as described in "Reviewing Results", page 289. Alternatively, you can run the test from the command line, then import results into the SOAtest GUI as described in "Accessing Results and Reports", page 234.
Once the policy enforcement tests have been run, a report can be generated in HTML or XML format that contains all the error messages for each policy enforcement violation. To view a report, right-click the desired error message and select View Report> Detailed/Summary/Structure from the shortcut menu. The report will display in a Web browser.


Static Analysis

In this section:
• Performing Static Analysis
• Configuring SOAtest to Scan a Web Application
• Reviewing Static Analysis Results
• Suppressing the Reporting of Acceptable Violations
• Customizing Static Analysis: Overview
• Creating Custom Static Analysis Rules
• Modifying Rule Categories, IDs, Names, and Severity Levels


Performing Static Analysis

This topic explains how you can perform static analysis to identify code that does not comply with a preconfigured or customized set of static analysis rules. Sections include:
• About Static Analysis
• Analyzing Resources From Your Project Workspace
• Reviewing and Retesting Scanned Resources
• Analyzing Files Outside of Your Project Workspace
• Tutorial

About Static Analysis
Static analysis is one of many technologies used throughout the SDLC to help team members deliver secure, reliable, compliant SOA. Static analysis helps the team ensure that development activities satisfy the organization's expectations, ensuring interoperability and consistency across all SOA layers.
SOAtest can perform static analysis on both SOA artifacts and the Web interface. For SOA, static analysis can be performed on individual artifacts (e.g., WSDL or XML files). It is also used as one of several components in a comprehensive SOA policy enforcement framework, which is discussed in "SOA Quality Governance and Policy Enforcement", page 569.
For Web interfaces, SOAtest's static analysis can perform an automated audit of Web interface content and structure, automatically scanning and analyzing a Web asset for accessibility, branding, intranet standards compliance, and consistency. It inspects and exposes issues that present potential risk to the proper functionality, usability, and accessibility of your Web-based applications. It can cover an entire Web application or an individual component or module. Scan results are presented as actionable reports that identify erroneous objects, providing direct linkage to exposed issues for quick analysis and remediation. The assessment analysis covers the following areas:
• Accessibility: Support for Section 508, WAI, and WCAG 2.0 guidelines.
• Branding: Automatically enforces policies related to site layout and "look and feel."
• Intranet Standards: Identifies use of sensitive corporate data.
• Consistency: Prevents broken links, spelling errors, and browser compatibility issues.

More specifically, to facilitate Web accessibility validation (Section 508, WAI, WCAG), SOAtest automatically identifies code that positively or possibly violates Section 508, WAI, and WCAG 2.0 Web accessibility guidelines. During its automated audit, the solution checks whether Web interfaces comply with core accessibility guidelines and helps you identify code and page elements that require further inspection and/or modification.
Moreover, Parasoft's pattern-based code analysis monitors whether Web language code follows industry-standard or customized rules for ensuring that code meets uniform expectations around security, reliability, performance, and maintainability. We provide an extensive rule library with hundreds of configurable rules for Web languages (JavaScript, VBScript/ASP, HTML, CSS, XML, and so on), as well as a graphical RuleWizard module that makes it very simple to construct and maintain customized rules.

Analyzing Resources From Your Project Workspace
You can use the following procedure to perform static analysis on:




• Any source files that are available in your project workspace (e.g., HTML, XML, WSDL, and other source files added to your project workspace). For details on linking your source files to a project created by SOAtest, see the Eclipse Workbench User Guide (choose Help> Help Contents).
• Any Web pages that are represented in SOAtest test suites within your project workspace (e.g., the Web pages that are accessed as SOAtest crawls your Web application or the Web pages that the browser downloads as Web functional tests execute).

The general procedure for performing static analysis on one or more files in your project workspace is as follows:
1. Ensure that the files you want to analyze are available within SOAtest. The files must be available as a project in your workspace.
   • If the files are not actually part of the project or downloaded as functional Web tests execute, you can have SOAtest "crawl" your Web application and then analyze the accessed pages; for details, see "Configuring SOAtest to Scan a Web Application", page 583.

2. Select or create a Test Configuration with your preferred static analysis settings.
   • For a description of preconfigured Test Configurations, see "Built-in Test Configurations", page 741.
   • For details on how to create a custom Test Configuration, see "Creating Custom Test Configurations", page 244.

Configuring SOAtest to Run Static Analysis on Web Functional Test Scenarios
• The selected Test Configuration must have Test Execution enabled (this is the default setting for all built-in Test Configurations).
• By default, SOAtest will statically analyze the individual HTTP messages that the browser made in order to construct its data model: the content returned by the server as is (before any browser processing). If you prefer to analyze the browser-constructed HTML (the realtime data model that the browser constructed from all of the HTML, JS, CSS, and other files it loaded), you can change this by modifying the Test Configuration's Execution settings.

3. Select the resource you want to analyze, then run the appropriate Test Configuration.




   • To test the Web pages that are accessed as SOAtest crawls your Web application, select the Test Suite that contains the Scanning Tool, then run the desired Test Configuration.
   • To test the Web pages that the browser downloads as Web functional tests execute, select the Test Suite that contains the Web Functional tests, then run the desired Test Configuration.
   • To perform static analysis from the command line, use the procedure described in "Testing from the Command Line Interface (soatestcli)", page 257.

4. Review and respond to the results using the appropriate static analysis results layout option.
   • For details, see "Reviewing Static Analysis Results", page 605.
5. (Optional) Fine-tune static analysis settings as needed.
   • For details, see "Customizing Static Analysis: Overview", page 611.


Reviewing and Retesting Scanned Resources
The Scanning Perspective is designed to facilitate reviewing and retesting of resources scanned during static analysis.

Opening the Scanning Perspective
To open the Scanning Perspective:
• Choose Window> Open Perspective> Other> Scanning.
This perspective is similar to the SOAtest perspective, but has two additional features:
• The Quick Test toolbar button for testing a single URL or file (as described in "Reviewing and Retesting Scanned Resources", page 581). This button can be added to any perspective by choosing Window> Customize Perspective> Commands and clicking the checkbox next to SOAtest Scanning.
• The Scanned Resources view. The view can be added to any perspective by choosing Window> Show View> SOAtest> Scanned Resources Listing.

Reviewing Scanned Resources
The Scanned Resources view shows the resources scanned by a Scanning tool. After a Scanning Tool has been run, you can select it in the Test Case Explorer, and the Scanned Resources Listing will show all items that have been scanned by that Scanning Tool. You can right-click single files in that view and choose to open them in an editor or a web browser.

Analyzing Files Outside of Your Project Workspace
To quickly access and scan a resource that is NOT available in your workspace:
1. Open the Scanning perspective by choosing Window> Open Perspective> Other> Scanning.
2. Start the test with the Quick Test toolbar button:
   • To run a test with the "Favorite" (default) Test Configuration, simply click that toolbar button.
   • To run a test with another Test Configuration, use that button's pull-down menu.
   • To check only a specific category of rules, choose Built-in> Static Analysis Rules> [category].
3. Specify how to access the resource you want to scan (you can specify a URL or a file path).
   • The file path can be absolute, or it can be an Eclipse workspace path.
   • The specified file can also be a .urls file that contains a list of URLs (see "Using .urls Files", page 584 for details).

Tutorial

For a step-by-step tutorial of how to perform static analysis on a Web application, see “Web Static Analysis”, page 183.


Configuring SOAtest to Scan a Web Application

This topic explains how to prompt SOAtest to scan a web application. Static analysis can then be performed on the accessed pages as described in "Performing Static Analysis", page 578. Sections include:
• Configuring the Scanning
• Handling Special Loading Issues
• Loading/Site Scanning FAQ
• Customizing Scanning with Scripting Hooks

Configuring the Scanning
A Scanning Test is a test that crawls the specified Web application when the test case is executed. To configure SOAtest to scan your Web application, you add a Scanning Test as follows:
1. Do one of the following:
   • To add a "Scanning Test" to a new project: Open the pull-down menu for the New toolbar button (top left), then choose Project. Select SOAtest> Web> Scanning Web application project and click the Next button.
   • To add a "Scanning Test" to a new .tst file: Right-click the project node where you want the new .tst file added, then choose New Test (.tst) file. Enter a name for the new .tst file, click Next, then choose Web> Scan web application.
   • To add a "Scanning Test" to an existing project: Select the test suite node where you want the new test added, and click Add Test or Output. In the Add Test wizard that opens, select Standard Test on the left, Scanning Tool on the right, and click Finish.
2. Specify whether you want to scan from an HTTP, FTP, or local source.
   • If you choose FTP or Local, complete the available fields; your configuration is then complete and you can skip the following steps.


FTP Options
When specifying an FTP source, you have the following options:
• Case Insensitive Server: Indicates that your server is case insensitive.
• Follow Symbolic Links: If selected, this option tells SOAtest to scan symbolic links as though they were actual files or folders on the FTP site. If the links point to a folder, SOAtest will scan the contents of that folder.

If you choose HTTP, continue specifying settings as detailed in the following steps.

3. Select the Required Basic Settings tab of the HTTP Tool Settings and complete the following:
• Start URL: Specify a single URL or .urls file in the field.
   • If the username and password of a URL are entered as '*', SOAtest will first try using the same credential used for the closest URL that precedes this URL and belongs to the same domain. If this credential does not apply to this URL, SOAtest will open a realm password dialog, asking you to input the correct credential.

Using .urls Files
Use a .urls file if you plan to have SOAtest add positive restrictions to the project and start loading all the appropriate sites, starting at the URLs specified in the file, to one default depth specified in the Project Creation panel. If usernames and passwords are specified for protected URLs, the credentials will be automatically added to the project.
The format of a .urls file is as follows: URLs are specified on separate lines. If a credential is specified for a URL, it must be put on the same line as the URL in the order url, username, password, separated by commas. If any of the three items contains commas, that item must be enclosed in double quotes. If any item contains double quotes, each double quote must be escaped with another double quote, and the whole item must be enclosed in double quotes. For example:
http://url1.com/
http://url2.com/, username, password
http://url3.com/, "user,name", ",password,"
http://url4.com/, "username", "pass""word"
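The quoting rules described above are the same as standard CSV quoting, so a .urls file can also be generated programmatically. The following is a minimal, hypothetical Python sketch; the file name and entries are just examples, and it writes values without spaces after the commas.

import csv

entries = [
    ("http://url1.com/",),
    ("http://url2.com/", "username", "password"),
    ("http://url3.com/", "user,name", ",password,"),
    ("http://url4.com/", "username", 'pass"word'),
]

with open("start_pages.urls", "w", newline="") as f:
    # The default csv quoting wraps values containing commas in double quotes
    # and doubles any embedded double quotes, matching the rules above.
    writer = csv.writer(f)
    for entry in entries:
        writer.writerow(entry)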



• Limit loading depth: If you do not want SOAtest to try to load the entire site, use this control to limit the loading depth. The depth setting determines the number of clicks (links) deep to load the site. A loading depth of 0 means that only the initial page is loaded. This option is particularly useful if you want to test only selected paths through the site or if you want SOAtest to automatically construct the site structure and paths based on the traffic recorded in log file entries.




• Restrict scanning to deepest subdirectory of start URL: If you want SOAtest to load only the subdirectories from the initial start URL, select this option. See "Restricting Scanning to the Deepest Subdirectory of the Start URL", page 590 for more details.
• Follow redirects not specified as restrictions in the table: If you want SOAtest to follow redirects to sites that are not specified as allowed URLs, select this option. For more information about redirect handling, see "Understanding and Configuring Redirect Handling", page 590.
• Allowable/Restricted URLs: If accessing certain pages (for example, log out pages) might prevent SOAtest from loading the site correctly and/or completely, prompt SOAtest to avoid these pages by marking them as restricted URLs in this table; see "Preparing SOAtest to Load Sites that Use JavaScript", page 590 for instructions.
• Form Options: The following options are available:
   • Scan Forms With Default Input: Off by default
   • Fill Active Inputs Manually: On by default

4. If you want to enter connection passwords, select the Realm Passwords tab and click Add to specify passwords.
   • To prevent SOAtest from saving the password, disable the Save password check box. By default, any password you enter in one of these dialog boxes will be saved and stored in your project file; this way, you only need to enter passwords once. If you are not confident that your project file is secure, you should not save these passwords.

5. If you want to specify advanced scanning options, select the Advanced Settings tab and complete the following:
   • Default Pages: * (this is the default setting)
   • Obey Robot Restrictions: Off (provided that the site's Webmaster has granted you permission to ignore robot restrictions)
   • Pass Cookies: On
   • Invoke CGI For All Arguments: On
   • Case Insensitive Server: Depends on your server
   • Aliases: Depends on your site. If your site uses aliases, enter them here by clicking the Add button. An alias is a domain that you want SOAtest to consider equivalent to the domain of one of the sites in your project. For example, http://www.parasoft.com might have the aliases parasoft.com and www.parasoft.com.

6. If you are creating a new project, click Finish. Otherwise, click Save.
You can now run any available Test Configuration on the test suite or the individual Scanning tool. When the test executes, SOAtest spiders through static and dynamic pages and loads at least one instance of each page it encounters. If you chose the recommended HTTP Configuration options and your site contains forms, SOAtest allows you to populate forms using form input dialog boxes. If SOAtest opens form input dialog boxes as it loads your site, complete them as described in "Populating Forms", page 592.

Handling Special Loading Issues
This section explains how to handle the following special loading issues:
• Preparing SOAtest to Restrict and/or Allow URLs During Scanning
• Preparing SOAtest to Load Sites that Use JavaScript
• Populating Forms
• Managing Authentication During Loading

Preparing SOAtest to Restrict and/or Allow URLs During Scanning
You can control which site URLs SOAtest loads and ignores by adding restrictions and/or allowances in the Allowed/Restricted URL table.
If a URL is marked as restricted, SOAtest will not load that URL if it is encountered during scanning. If you do not want SOAtest to access certain directories or files, you can mark them as restricted URLs. For example, if you have a logout page, you might want to mark it as restricted so that it does not prevent the loading of other pages or cause other pages to become invalid. SOAtest avoids restricted URLs when creating paths, exercising the site with virtual users, and so on. If your site has a Web server source, you can specify restricted URLs before you load the site (as described in this section).
If a URL is marked as allowed, SOAtest will load that URL if it is encountered during scanning. The deepest directory URL pattern for the URL specified in the Start URL field is automatically added as an allowed URL; this URL is called the Allowed Subdirectories URL. The http(s)://... form is used here so that both the HTTP and HTTPS versions of the specified URLs will be allowed or restricted. For more information, see "URL Restriction Examples", page 587.
You can also specify additional allowed URLs. This is particularly useful for configuring SOAtest to load related sites. A related site is any site that is linked to one of your project's sites, but does not have the same host name as any of the project's sites. For example, if your primary project source is http://www.parasoft.com and that site links to http://forums.parasoft.com and http://www2.parasoft.com, those two sites are related sites for the current project. If you want SOAtest to automatically load a related site when it encounters it in the context of the current project, you need to indicate that it is an Allowed URL.
Restrictions and allowances are both specified in the Scanning Tool's Allowed/Restricted URL table. In the table, allowances are marked with a + sign and restrictions are marked with a - sign. Allowances/restrictions are case sensitive. Restrictions and allowances can be combined to create a sophisticated scanning scheme. Allowances/restrictions are processed in order, starting with the entry at the top of the table and ending with the entry at the bottom of the table.
To add an entry to the Allowed/Restricted URL table:
1. Click Add. A new row will be added to the Allowed/Restricted URL table.
2. In the text field of the newly-added row, enter the URL you want to restrict or allow.
   • The wildcard asterisk (*) can be used anywhere in the URL to match 0 or more characters. "*" means any combination of characters, and it can be used in any combination of ways to construct an Allowed/Restricted URL table entry. See "Wildcard Examples", page 587 for samples.
   • If there is a slash '/' at the end of the URL, the slash will be removed before the comparison.
3. If needed, toggle the +/- sign to indicate whether the specified URL should be allowed or restricted.
   • The + sign is used to mark allowed URLs.
   • The - sign is used to mark restricted URLs.


Note on "Unallowed" JavaScript and CSS Files JavaScript and CSS files that do not match the allowed URL patterns will not be shown in the Project tree. However, if SOAtest needs to access these files to properly process the Web pages, it will still visit these files (even though they are not added to the project).

To change the position of an Allowed/Restricted URL table entry:
• Select the entry that you want to move, then click the upward and downward arrow buttons next to the Allowed/Restricted URL table until the entry is in the desired position.
To remove an Allowed/Restricted URL table entry:
• Select the entry you want to remove, then click Remove.

URL Restriction Examples
The following examples explain specifics about the http(s)://... restriction:
• The default http(s)://... restriction allows URLs to match either http:// or https://. For example, with the default restriction http(s)://www.google.com/*, both http://www.google.com/amazing_page.html and https://www.google.com/amazing_page.html would match.
• The string "http(s)://" has no special meaning in a URL restriction other than at the very start. For example, the restriction *http(s)://www.google.com/* would NOT match http://www.google.com/amazing_page.html.
• The string "http(s)://" is case-sensitive. So HTTP(S)://www.google.com/* would NOT match http://www.google.com/amazing_page.html.
• The special string is "http(s)://" as a whole, not any part of it. For example, the restriction http://www.para(s)oft.com/* would NOT match http://www.parasoft.com/* or http://www.paraoft.com/*.

Wildcard Examples
The following examples show how wildcards can be used to achieve various goals:
• Any protocol: *www.parasoft.com/dir1/dir2/index.html
• HTTP or HTTPS protocol: http(s)://www.parasoft.com/dir1/dir2/index.html
• Any pages within the domain: http://www.parasoft.com/*
• Any pages within the deepest subdirectory: http://www.parasoft.com/dir1/dir2/*
• Any protocols and pages within the domain: *www.parasoft.com/*
• Any page named index.html: *index.html
• Any link with the word 'dir' in it: *dir*
• Any link: *
Additional examples: *.parasoft.com would match www.parasoft.com and articles.parasoft.com. *parasoft* would match www.parasoft.com, www.parasoft.net, parasoft.com, etc. direc*ry/* would match directory/dir2, direcHELLOry/dir2/dir3, and direc12345fry/images.

Understanding the SOAtest Scanning Heuristic
During project scanning and refreshing, each potential URL is matched against the Allowed/Restricted URLs table to determine whether the URL should be visited. The process for determining whether the URL should be visited is as follows:
1. Find the last matching URL in the Allowed/Restricted URLs table.
2. If the sign of the last matching URL is positive, the URL is allowed. If the sign is negative, the URL is restricted.
The Allowed/Restricted URL table (described above in Preparing SOAtest to Restrict and/or Allow URLs During Scanning) allows you to control what portions of a site are scanned. For example, it can allow certain URLs to be specified as "off-limits" within a portion of a site that is otherwise allowed to be scanned. Or, it can allow the scanning of specific URLs that are within an area of the site that otherwise should not be scanned.
Example 1


The URL http://www.dev.parasoft.com/products/dir/page.html will match the following URLs from the table above:
+ http(s)://www.dev.parasoft.com/*
- http(s)://www.dev.parasoft.com/products/*
+ http(s)://www.*.com/products/dir/*
The last matching URL is http(s)://www.*.com/products/dir/*. Because this URL has a positive sign (indicating that it is an allowed URL), the URL http://www.dev.parasoft.com/products/dir/page.html will be allowed.
The URL http://www.dev.parasoft.com/products/soap.html will match the following URLs from the table above:
+ http(s)://www.dev.parasoft.com/*
- http(s)://www.dev.parasoft.com/products/*
The last matching URL is http(s)://www.dev.parasoft.com/products/*. Because this URL has a negative sign (indicating that it is a restricted URL), the URL http://www.dev.parasoft.com/products/soap.html will be restricted.

Example 2
Your site http://www.dev.parasoft.com has information regarding several thousand types of hardware. You want to limit scanning of the site to two types of hardware: routers and disks. You also want to process all other parts of the site, except for the old part of the site.
To start scanning at the URL http://www.dev.parasoft.com/index.html, enter this URL in the Start URL field. This allowed URL will automatically be added to the Allowed/Restricted URLs table:
+ http(s)://www.dev.parasoft.com/*
To scan only the two folders Routers and Disks in the hardware directory, you will need to add the following entries to the table in the following order:
- http(s)://www.dev.parasoft.com/newdir/hardware/*
+ http(s)://www.dev.parasoft.com/newdir/hardware/Routers/*
+ http(s)://www.dev.parasoft.com/newdir/hardware/Disks/*
To restrict the scanning of your old directory, you will need to add the following entry:
- http(s)://www.dev.parasoft.com/olddir/*
This is how the table will look after you are done.
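The allow/restrict matching logic used in these examples can be modeled with a short Python sketch. This is only an illustration of the documented heuristic, not SOAtest code; in particular, the behavior when no table entry matches at all is an assumption here.

import re

def pattern_to_regex(pattern):
    # "http(s)://" at the very start matches both http:// and https://.
    scheme = ""
    if pattern.startswith("http(s)://"):
        scheme = "https?://"
        pattern = pattern[len("http(s)://"):]
    # Escape everything else literally, then turn each '*' into '.*'.
    body = ".*".join(re.escape(part) for part in pattern.split("*"))
    return re.compile("^" + scheme + body + "$")

def is_allowed(url, table):
    # table is an ordered list of (sign, pattern) entries; the last matching
    # entry wins. '+' allows the URL, '-' restricts it.
    url = url.rstrip("/")   # a trailing slash is removed before comparison
    allowed = False         # assumption: a URL matching no entry is not visited
    for sign, pattern in table:
        if pattern_to_regex(pattern.rstrip("/")).match(url):
            allowed = (sign == "+")
    return allowed

table = [("+", "http(s)://www.dev.parasoft.com/*"),
         ("-", "http(s)://www.dev.parasoft.com/products/*"),
         ("+", "http(s)://www.*.com/products/dir/*")]
print(is_allowed("http://www.dev.parasoft.com/products/dir/page.html", table))  # True
print(is_allowed("http://www.dev.parasoft.com/products/soap.html", table))      # False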

Restricting Scanning to the Deepest Subdirectory of the Start URL
When the Restrict scanning to deepest subdirectory of Start URL HTTP Scanning option is selected, SOAtest will automatically take the Start URL, create an allowed URL for the deepest subdirectories, and add it to the first row of the Allowed/Restricted URL table. If you clear this option, SOAtest will automatically add the default Allowed Subdirectories URL to the first row of the Allowed/Restricted URL table. SOAtest also behaves this way when the first redirect is to a URL with the same fully-qualified domain name. For example:
• Start URL http://www.parasoft.com/dir1/dir2/ produces the subdirectories restriction + http(s)://www.parasoft.com/dir1/dir2/*
• Start URL http://www.parasoft.com/dir1/dir2 produces + http(s)://www.parasoft.com/dir1/*
• Start URL http://www.parasoft.com/dir1/page.html produces + http(s)://www.parasoft.com/dir1/*
• Start URL http://www.parasoft.com/dir1/page.html?count=1 produces + http(s)://www.parasoft.com/dir1/*
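Based on the examples above, the derivation of the Allowed Subdirectories URL can be sketched as follows (an illustration only, not SOAtest code):

from urllib.parse import urlparse

def allowed_subdirectories_url(start_url):
    # Keep the directory portion of the path (up to and including the last '/'),
    # drop any file name and query string, and allow both HTTP and HTTPS.
    parsed = urlparse(start_url)
    directory = parsed.path[: parsed.path.rfind("/") + 1] or "/"
    return "+ http(s)://" + parsed.netloc + directory + "*"

print(allowed_subdirectories_url("http://www.parasoft.com/dir1/page.html?count=1"))
# -> + http(s)://www.parasoft.com/dir1/*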

Understanding and Configuring Redirect Handling
If the specified Start URL is a redirect, SOAtest will load that redirect and any continuous subsequent redirects. In addition, SOAtest will automatically add an Allowed Subdirectories URL for each of those redirects. All the added entries for the redirects will be added to the top of the Allowed/Restricted URL table.
If you want SOAtest to follow redirects that are not added as allowed URLs, select the Follow redirects not specified as restrictions in the table option. When this option is selected, if SOAtest encounters a redirect during scanning and this redirect is not specified in the Allowed/Restricted URLs table, it will be loaded. SOAtest will add an Allowed Subdirectories URL for the redirect to the Allowed/Restricted URLs table.

Preparing SOAtest to Load Sites that Use JavaScript


Before you load a site that contains JavaScript, verify that the Load JavaScript option is enabled. (This option is enabled by default.) When this option is enabled, SOAtest will execute JavaScript and load any related links. To verify that this option is enabled:
1. Choose SOAtest> Preferences.
2. Select SOAtest> Scanning.

Customizing JavaScript Event Simulation
If you want to customize how SOAtest simulates JavaScript events (such as opening and closing additional windows, running timers, and so on), you can do so by modifying the JavaScript options in this tab. Use the following settings to determine what options to choose:
• To prompt SOAtest to trigger each handler once, with default arguments: choose single time in the Simulate JavaScript events box.
• To prompt SOAtest to create multiple kinds of events while loading a site (in order to find new links): choose multiple times in the Simulate JavaScript events box.
• To prevent SOAtest from simulating JavaScript events: choose never in the Simulate JavaScript events box.

Working with Alert, Confirm, and Prompt Messages
Each time SOAtest encounters a JavaScript Alert, Confirm, or Prompt message as it scans your site, it will print a message to the results area (in the right GUI panel) and take the default action. By default, SOAtest prints the following messages and performs the following actions:


• JavaScript Alert: SOAtest prints JavaScript Alert: "message" and clicks the OK button.
• JavaScript Prompt: SOAtest prints JavaScript Prompt: "message" and clicks the Cancel button.
• JavaScript Confirm: SOAtest prints JavaScript Confirm: "message" and clicks the Cancel button.

If you want SOAtest to handle these messages differently, you can define custom behavior using scripting. See “Customizing Scanning with Scripting Hooks”, page 599 for details.

Supported Methods and Objects
To see a list of the JavaScript methods and objects that SOAtest supports and implements, choose Help> JavaScript DOM. In the DOM documentation, "Stub" indicates that SOAtest recognizes the related method, but the method has no functionality in the current implementation of SOAtest. "IE" indicates that a method is specific to Internet Explorer. "NN" indicates that a method is specific to Netscape Navigator or Mozilla. A method that is specific to a browser will only work if you have set SOAtest to use the appropriate user agent (global user agent settings are determined in the SOAtest Preferences panel; they can be overridden during load testing if you designate a different user agent in the virtual user profile). Please note that this list is not comprehensive.

Populating Forms
If you selected the Fill Active Inputs Manually HTTP configuration panel check box (enabled by default), SOAtest opens a Form Input dialog box each time it detects a form that requires user input. If you want SOAtest to analyze the pages that are returned after forms are submitted, you need to tell SOAtest how to populate the forms' various input elements (text fields, select boxes, check boxes, radio buttons, etc.). You can do this by specifying fixed inputs in these dialog boxes during the loading process.
You do not need to add test inputs for every form. We recommend that you add only the inputs required to access all major areas of your site; you can skip dialog boxes as needed. Moreover, you do not need to enter any form inputs during the loading process if SOAtest can reach all of your site areas without submitting specific form inputs.
To indicate that you do not want to enter inputs for any forms:
• Click Skip All in the first dialog box.
To enter inputs for some or all forms:
• Use the following actions for each dialog box that opens.

• To label the input submission: Enter a new name in the Form Test Name field.
• To enter inputs for the current form:
   1. Use the Configure Form Input controls to add or modify inputs for each form input element. If you do not add an input for a specific element, the default input (as specified in the code) will be submitted.
   2. (Optional) Change the default form submission method.
      • To mimic a simple JavaScript submission or a situation where the user submits the form by pressing the Enter key, select the Implied Submit option.
      • To mimic the user submitting the form by clicking a submit button, either choose the option representing that submit button (for example, Image: "Anonymous") or click a specific area of the submit image (if available in the Form Input panel).
   3. Click Add when you are ready to add the input. After you add an input, SOAtest reopens the same form dialog box so you can enter additional inputs.
• To indicate that you do not want to enter any more inputs for the current form: Click Skip.
• To indicate that you do not want to enter any more inputs for the current form or any other instances of this form: Click Skip Form.
• To indicate that you do not want to enter any more inputs for forms in this site: Click Skip All.
• To browse the page related to the current form: Click View.

Understanding Form Inputs Purpose and Options
If your site uses forms and you want SOAtest to load and test the pages that are returned after those forms are submitted, you need to tell SOAtest how to populate the forms' various input elements (text fields, select boxes, check boxes, radio buttons, etc.). The more different page instances you want to test, the more different inputs you need to use to populate the forms.
There are several ways to populate a form:
• By entering fixed values that will be used every time SOAtest encounters the related form. (We will refer to these as fixed inputs.)
• By configuring SOAtest to use the default input (as specified in the code). (We will refer to these as default inputs.)
• By configuring SOAtest to extract the value from the page on which the form appears, for instance, if the correct form value is set dynamically by JavaScript. (We will refer to these as extracted inputs.)
• By disabling the input completely.
• By configuring SOAtest to use the returned value of a Java/JavaScript/Python method. (We will refer to these as script inputs.)

Completing Form Input Panels
To add or change fixed inputs in a Form Input panel that SOAtest has opened:
1. (Optional) Modify the input label name in the Form Test Name field.
2. (Optional) Modify the form action in the Form Action field.
   • If you want to use the default form action (as specified in the code), select the Default option. Note that if the default value changes, SOAtest will update the form action automatically; you will not need to manually update the form test.
   • If you want to specify a fixed value, select the Fixed option, then specify the desired form action in the text field.
3. Use the Form Inputs controls to add or modify inputs for each form input element.
   • If you want to use the default input (as specified in the code), select the Default option. Note that if the default value changes, SOAtest will update the form test value automatically; you will not need to manually update the form test.
   • If you want to specify a fixed value, select the Fixed option, then specify the desired value using the available controls.
      • Check boxes, radio buttons, and select inputs have a User-Defined option, which allows you to specify a simple string that will be sent as the value for this input.
      • Radio buttons and select inputs also have an Index option, which allows you to specify the index of a radio button or of select option(s). If an index is selected, then the value of that radio button or option will be sent, regardless of how the option changes. This index is 0-based. For instance, if you want to always select the 2nd option, choose the Index option and choose the index "1". For radio buttons, the current value of the radio button at that index is shown for each index. For select inputs, the current display value of the option is shown along with each index.
      • Select inputs store the display values for the select rather than the values that would be sent to the server. Consequently, when the submit values change, you do not need to modify the test.
   • If you want SOAtest to extract the form input value from the page on which the form appears (for instance, if the correct form value is set dynamically by JavaScript), select Extracted, fill the Left-hand text field with the text string that always appears to the left of the value you want extracted, then fill the Right-hand text field with the text string that always appears to the right of the value you want extracted. For example, to extract the value 123 from the text pre123post, you would enter pre in the Left-hand text field and post in the Right-hand text field.
   • If you want to disable an enabled input, select the Disable option. If you want to enable a disabled input, select the Enable option.
   • If you want to use the return value of a Java/JavaScript/Python method, select the Script option. Click the Edit button to create or edit the method(s) and choose the desired method for use from the Method drop-down menu in the popup dialog (a minimal example script is sketched after this procedure). If there are two or more methods, you can also select a different method for use from the drop-down menu in the form panel.
4. (If a form has an OnSubmit handler) Enable or disable the Process OnSubmit Handler When Submitting Form option, depending on whether you want that handler used during the test.
   • If the form has an OnSubmit handler, this option is set to true by default, unless the form test was created while recording paths with the browser. In that case, it is set to false because when SOAtest records a path from the browser, it sets up the form test in such a way that the OnSubmit handler does not need to be processed (and, in fact, processing this handler in this case could cause problems during path execution).
5. (Optional) Change the default form submission method.
   • If you want to mimic a simple JavaScript submission or a situation where the user submits the form by pressing the Enter key, select the Implied Submit option.
   • If you want to mimic the user submitting the form by clicking a submit button, either choose the option representing that submit button (for example, Image: "Anonymous") or click a specific area of the submit image (if available in the Form Input panel).
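A form-input script selected via the Script option simply needs to return the value to submit. A minimal Python sketch might look like the following; the method name is illustrative, and the exact method signature that the Form Input panel expects may differ.

import random

def uniqueUsername():
    # Return a different-looking value each time the form is populated.
    return "user_%06d" % random.randint(0, 999999)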

Managing Authentication During Loading
If SOAtest reaches a page that requires authentication, it will open a password dialog box in which you can enter a valid username and password.
Enter a valid username and password in the appropriate fields, specify whether you want SOAtest to save the password in its project file, then click OK.
Warning: By default, any password you enter in one of these dialog boxes will be saved and stored in your project file; this way, you only need to enter passwords once. If you are not confident that your project file is secure, you should not save these passwords. To prevent SOAtest from saving these passwords, disable the Save password check box at the bottom of the Connection Password dialog box in which you enter the password.

Loading/Site Scanning FAQ
• How does the Allowed/Restricted URLs table affect my scanning?
• Can I use wildcards for the URL in the Allowed/Restricted URLs table?
• What is an allowed domain URL?
• What if I only want to scan a certain directory in my site?
• How do I scan external sites?
• What happens if my site interlinks with the protocol http & https?
• Are redirects followed?
• SOAtest automatically added URLs into the Allowed/Restricted URLs table. When does this happen?
• My Start URL is a redirect. What will happen?
• My site does not get fully loaded after SOAtest scans a logout page. How do I prevent being automatically logged out?
• What if my pages contain forms?
• What web technologies does SOAtest support while scanning?

How does the Allowed/Restricted URLs table affect my scanning?
The Allowed/Restricted URLs table determines what portions of a website should be scanned. During automatic web project scanning, each potential URL is matched against the Allowed/Restricted URLs table to determine if the URL should be visited. The process for determining whether the URL should be visited is as follows:
1. Find the last matching URL in the Allowed/Restricted URLs table.
2. If the sign of the last matching URL is positive, the URL is allowed. If the sign is negative, the URL is restricted.
This mechanism allows control in determining what portions of a website to scan. For example, it allows certain URLs to be specified as "off-limits" within a portion of the website that is otherwise allowed to be scanned. Or it allows other URLs to be scanned within a portion of the website that otherwise should not be scanned.
Note: The Allowed/Restricted URLs table entries are case sensitive.

Can I use wildcards for the URL in the Allowed/Restricted URLs table?
Yes, the wildcard asterisk (*) can be used anywhere in the URL to match 0 or more characters. The asterisk can be used in a combination of ways to construct an Allowed/Restricted URL for URL matching. Example usage of the asterisk (*):
• To match any protocol: *www.parasoft.com/dir1/dir2/index.html
• To match http or https protocol: http(s)://www.parasoft.com/dir1/dir2/index.html
• To match any pages within the domain: http://www.parasoft.com/*
• To match any pages within the deepest subdirectory: http://www.parasoft.com/dir1/dir2/*
• To match any protocols and pages within the domain: *www.parasoft.com/*
• To match any page named index.html: *index.html
• To match any link with the word 'dir' in it: *dir*
• To match any link: *

What is an allowed domain URL?
An allowed domain URL tells SOAtest to load all pages within that domain. For example, if your site is http://www.parasoft.com, the allowed domain URL would be http(s)://www.parasoft.com/*, where the wildcard asterisk represents any link within this domain. Any link within the domain will be loaded during scanning, such as http://www.parasoft.com/dir1/dir2/page2.html or https://www.parasoft.com/dir/page1.html.
An allowed domain URL gets automatically added to the Allowed/Restricted URLs table when you enter the Start URL.

What if I only want to scan a certain directory in my site?
You will need to add entries to the Allowed/Restricted URLs table to control what does or does not get scanned. For example, if your web site has information regarding several thousand types of hardware, you may want to limit scanning of the site to two types of hardware, Routers and Disks, while processing all other parts of the site except for the old part of the site:
• If you wish to start scanning at the URL http://www.dev.parasoft.com/index.html, enter http://www.dev.parasoft.com/index.html in the Start URL field. This allowed URL will automatically be added to the Allowed/Restricted URLs table:
   + http(s)://www.dev.parasoft.com/*
• To scan only the two folders, Routers and Disks, in the hardware directory, you will need to add the following entries to the table in the following order:
   - http(s)://www.dev.parasoft.com/newdir/hardware/*
   + http(s)://www.dev.parasoft.com/newdir/hardware/Routers/*
   + http(s)://www.dev.parasoft.com/newdir/hardware/Disks/*
• To restrict the scanning of your old directory, you will need to add the following entry:
   - http(s)://www.dev.parasoft.com/olddir/*
After adding the above URLs, the Allowed/Restricted URLs table should contain the following:
   + http(s)://www.dev.parasoft.com/*
   - http(s)://www.dev.parasoft.com/newdir/hardware/*
   + http(s)://www.dev.parasoft.com/newdir/hardware/Routers/*
   + http(s)://www.dev.parasoft.com/newdir/hardware/Disks/*
   - http(s)://www.dev.parasoft.com/olddir/*
See "Preparing SOAtest to Restrict and/or Allow URLs During Scanning", page 586 for more details.


How do I scan external sites?
To allow other websites to be loaded during the scanning of your site, you need to add the external site to the Allowed/Restricted URLs table. You can add an asterisk (*) as an allowed URL to allow all sites to be loaded.

What happens if my site interlinks with the protocol http & https?
When you enter the Start URL, SOAtest will automatically add an allowed domain restriction (e.g., http(s)://www.parasoft.com/*) to the Allowed/Restricted URLs table. The http(s)://... form is used so that both the http and https versions of the specified URLs will be allowed or restricted. You can change http(s):// to http:// or https:// if you want to restrict scanning to either the non-encrypted or the encrypted version of your site.

Are redirects followed?
Redirects are checked against the Allowed/Restricted URLs table to see whether they will be followed or not.

SOAtest automatically added URLs into the Allowed/Restricted URLs table. When does this happen?
There are two cases in which URLs are automatically added to the Allowed/Restricted URLs table:
• If the Start URL is a redirect (see "My Start URL is a redirect. What will happen?", page 598), a URL is automatically added to the table.
• If the Follow redirects not specified as restrictions in the table option is selected, a URL is automatically added to the table. When this option is selected, if SOAtest encounters a redirect during scanning and this redirect is not specified in the Allowed/Restricted URLs table, it will be loaded. SOAtest will also automatically add an allowed URL for the redirect to the Allowed/Restricted URLs table.

My Start URL is a redirect. What will happen?
If the specified Start URL is a redirect, SOAtest will load that redirect and any continuous subsequent redirects. In addition, SOAtest will automatically add an allowed URL for each of those redirects. These allowed URLs will be added to the Allowed/Restricted URL table. See "Preparing SOAtest to Restrict and/or Allow URLs During Scanning", page 586 for details.

My site does not get fully loaded after SOAtest scans a logout page. How do I prevent being automatically logged out?
If your site contains logout pages, this can cause SOAtest to stop scanning: when SOAtest encounters this kind of page, it logs out the user, which makes other pages inaccessible because the user is no longer logged in. You can prevent this from happening during scanning by restricting the logout pages from being loaded. Add a URL restriction for the pages that contain the logout event to the Allowed/Restricted URLs table. For example, if the logout page is http://www.parasoft.com/logout.html, you can add it as a restricted URL so it does not get loaded.

What if my pages contain forms?
SOAtest opens a Form Input dialog box each time it detects a form that requires user input. See "Populating Forms", page 592 for details.

What web technologies does SOAtest support while scanning?
SOAtest can emulate the execution of the following technologies during scanning:
• JavaScript - See "Preparing SOAtest to Load Sites that Use JavaScript", page 590.
• Applets - Only non-GUI Applets are supported. SOAtest is able to detect Applets, but they will not be processed.
• ActiveX - ActiveX is detected by SOAtest, but there is minimal support.
• Macromedia - Macromedia is detected by SOAtest, but there is only minimal support for older versions of Macromedia.

Note that the above list is focused on technologies supported for scanning—it is not the same as the list of technologies supported for static analysis.

Customizing Scanning with Scripting Hooks
You can customize SOAtest's scanning behavior by having it execute custom scripts when the events associated with "hooks" occur within the SOAtest program.

Understanding the Concept of Hooks
Customized hooks can be used to record or modify the values passed at specific points in the Scan tool's execution (e.g., when alert dialog boxes are opened, users are prompted for responses, users are prompted for passwords, and so on). For example, one of SOAtest's hooks is the "Alert" hook. This hook is called whenever SOAtest's Scan tool encounters a JavaScript alert. If you want SOAtest to record this alert information and report it in a SOAtest Message window, you can do so by creating a script that defines this hook, then describes how you want SOAtest to behave when it encounters this hook. After this script is invoked, SOAtest will access it and perform the specified action (recording and reporting alert information) each time it encounters a JavaScript alert.
Hooks are defined and customized in scripts using Python, JavaScript, or Java methods. The same file can define multiple hooks. If you add more than one method to a hook, all methods defined for that hook will be executed when the hook is called. You can create, apply, and invoke scripts that define hooks in the same way that you create, apply, and invoke any other script in SOAtest: upon startup (only for JavaScript and Python scripts), by creating and applying an Extension tool, and by adding scripts to a specific path node.
You can invoke hooks at different times to elicit the desired functionality. For example, if you want to use a script's hook functionality for all SOAtest projects and sites, you could add the JavaScript or Python script that defines and uses that hook to the <soatest_install_dir>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/startup directory; then, any time the program calls the hook, the associated user-defined methods will be executed. The methods will be executed until you call clear() on the hook.
For a complete description of general hook options, see the SOAtest Extensibility API (available by choosing SOAtest> Help> Extensibility API).
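As a minimal illustration of this lifecycle (using the Application API shown in the examples below), a startup script can register a method on a hook and later remove it again with clear():

from com.parasoft.api import Application

def logAlert(msg):
    Application.showMessage("Alert seen during scanning: %s" % str(msg))

hook = Application.getHook("Alert")
hook.set(logAlert)   # logAlert now runs each time a JavaScript alert occurs
# ... later, when the custom behavior is no longer needed:
hook.clear()         # removes the custom method(s) from the hook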

Understanding Available Hooks
SOAtest allows you to define and manipulate the following specific hooks in your scripts:
• Alert
• Confirm and Prompt
• Realm Password
• URL Logging

Alert
When SOAtest encounters a JavaScript alert() method, its default behavior is to click the OK button and print JavaScript Alert: "message" to the SOAtest Console view. You can modify this behavior using the Alert hook.
By default, the Alert hook is called whenever SOAtest encounters a JavaScript alert. By adding methods to this hook, you determine how SOAtest behaves when it encounters a JavaScript alert. When the Alert hook is invoked, SOAtest will print the message JavaScript Alert: "message" to the location specified in the script, or to the SOAtest Console view (if no alternate location is specified).
This hook is commonly used to print the results of alert messages to a special SOAtest Message window. For example, if you wanted SOAtest to report all alert messages in a SOAtest Message window named "Alert Messages," you could create the following Python script and add it to your <soatest_install_dir>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/startup directory:

from com.parasoft.api import Application

def myAlertHook(msg):
    # Print the alert message to SOAtest Messages
    Application.showMessage("JavaScript Alert: %s" % str(msg))
    # Add the contents of the alert message to a Result Window named
    # "Alert Messages". This Result Window can later be used as part of a
    # test suite in a regression test, to make sure the contents of the
    # window, which contains the alert message, are what you expect them to be.
    Application.report("Alert Messages", str(msg))

# Add your method to the Alert hook
Application.getHook("Alert").set(myAlertHook)

The argument passed into the myAlertHook method is the alert message (a string containing the text of the alert).
You could also use the Alert hook to verify that when a user (or SOAtest) submits certain invalid form values, the application opens an alert box warning that invalid data was entered. To implement this test, you would first create and invoke a script (like the one above) that records alert messages into a SOAtest Message window. Then, you would implement a test suite that checks the contents of the Message window using the following test cases:
1. An Extension tool that clears the Message window.
2. A Test Path tool that executes the click path that should cause the alert message.
3. An Extension tool with a script (like the one shown below) that returns the text of the Message window, with an attached regression control that verifies that the correct alert message appears in the Message window.

# Script to return the text of the result window named "Alert Messages"
from soaptest.api import *

def returnText():
    # Note: the method cannot be named 'return' because that is a reserved word in Python.
    return SOAPUtil.getResultWindowText("Alert Messages")

# Script to clear the text of the result window named "Alert Messages"
from soaptest.api import *

def clear():
    SOAPUtil.clearResultWindow("Alert Messages")


Confirm and Prompt
When SOAtest encounters a JavaScript confirm() method, its default behavior is to click the Cancel button, return false, and print JavaScript Confirm: "message" to the SOAtest Console view. When SOAtest encounters a JavaScript prompt() method, its default behavior is to click the Cancel button, return null, and print JavaScript Prompt: "message" to the Console view. You can modify these behaviors by using scripting to define and customize a Confirm hook or Prompt hook (the hook used depends upon the JavaScript method whose return value you want to modify).
Confirm
By customizing the Confirm hook, you determine how SOAtest behaves when it encounters the confirm() method. For example, one way to prompt SOAtest to return a customized (non-default) response whenever it encounters the confirm() method is to perform the following steps:
First, create a Python file (named startup.py) that defines a method which takes a single argument. This argument will have the same value that gets passed to the confirm() method in your JavaScript. Based on whatever logic you need, determine whether you want confirm() to return true or false, and then return that value from the method you defined.
To have SOAtest use the method you just defined, add it to the Confirm hook in the following way, where the argument you pass to set is the name of the method that you defined. In this example, the method defined is named myConfirmHook.

from com.parasoft.api import Application

# msg will have the same value that gets passed to confirm()
def myConfirmHook(msg):
    if msg == "Yes or no?":
        return 1
    return 0

Application.getHook("Confirm").set(myConfirmHook)

Next, add this file to the <soatest_install_dir>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/startup directory. The next time you start SOAtest, the hook method will be executed whenever SOAtest encounters a confirm() method. When the Confirm hook is invoked, SOAtest will print one of the following messages (depending on the confirmation action) to the location specified in the script, or to the Console view (if no alternate location is specified):
• Based on the script, SOAtest clicked the "OK" button
• Based on the script, SOAtest clicked the "CANCEL" button

Prompt
By customizing the Prompt hook, you determine how SOAtest behaves when it encounters the prompt() method. The Prompt hook works very much like the Confirm hook. One difference is that the method you define for the Prompt hook takes two arguments: the two arguments that are passed to the JavaScript prompt() method. The other difference is that the method you define for Confirm returns a boolean, but the method you define for Prompt must return either a string or "null" ("None" in Jython). You can define logic that determines return values based on the arguments that are passed to the prompt() method. Returning "null" is equivalent to clicking Cancel in the prompt box that the browser opens when the prompt() method is called.
You add the method to the appropriate hook in the same way that you added a custom method for confirm(), but now the hook is named "Prompt". An example script that returns "http://www.parasoft.com" as the value for prompt() follows:

from com.parasoft.api import Application

def myPromptHook(msg, defaultMsg):
    return "http://www.parasoft.com"

Application.getHook("Prompt").set(myPromptHook)

When the Prompt hook is invoked, SOAtest will print one of the following messages (depending on the prompt response) to the location specified in the script, or to the Console view (if no alternate location is specified):
• Based on the script, SOAtest entered "[the message string that the script returns]"
• Based on the script, SOAtest clicked the "CANCEL" button

Returning Different Values in Different Situations
You can have SOAtest return different confirm() or prompt() values in different situations by adding the appropriate logic to the method you define for the Confirm or Prompt hook. You determine what value is returned using the text of the first argument to the method you're adding to the Confirm or Prompt hook. For example, you might want SOAtest to return http://www.parasoft.com for the prompt "Enter a URL", but return John for the prompt "Enter your name". This functionality could be implemented as follows:

def myPromptHook(msg, defaultMsg):
    if msg == "Enter a URL":
        return "http://www.parasoft.com"
    elif msg == "Enter your name":
        return "John"
    else:
        return None

Realm Password
The Realm Password hook is called when all of the following conditions are satisfied:
• SOAtest makes a request of a server.
• The server responds with a challenge, asking for either a realm password or an NTLM password (both of which SOAtest handles in the same way).
• You have not yet entered a realm or NTLM password.

If you have defined a custom Realm Password hook before the above conditions are satisfied, SOAtest will execute the custom hook method before prompting you to manually enter a password. If that method adds a password (either to a site in the project, or to a load test virtual user), SOAtest will use that password and will not prompt you for one. If you or another SOAtest user previously entered a password, this hook will not get executed. If you want to ensure that SOAtest uses the password you defined for the hook, clear the site passwords before you want SOAtest to add new passwords from the hook.
This hook is particularly useful when you want to vary usernames/passwords depending on the context (e.g., based on the particular path you are using, or when you want to use different usernames/passwords in the context of a virtual user load test).
The following Java file is a very basic sample implementation of the Realm Password hook. This script adds the same password regardless of the context.

import com.parasoft.api.*;
import java.lang.reflect.*;
import java.net.*;
import soaptest.api.*;

public class RealmPasswordHook {
    static public void returnPassword(String realm, String url, boolean ntlm, Context context)
            throws MalformedURLException {
        SOAPUtil.addSitePassword(realm, "user1", "password1", url, ntlm);
    }

    public void setHook() throws UserMethodException, NoSuchMethodException {
        Hook hook = Application.getHook("RealmPassword");
        Method userMethod = getClass().getMethod("returnPassword",
            new Class[] {String.class, String.class, boolean.class, Context.class});
        hook.set(userMethod);
    }
}

The following Python sample file is similar to the previous Java file, except that it clears all existing passwords when the Realm Password hook is accessed.

from com.parasoft.api import *
from soaptest.api import *

def returnPassword(realm, url, ntlm, context):
    SOAPUtil.addSitePassword(realm, "user1", "password1", url, ntlm)

def setHook():
    hook = Application.getHook("RealmPassword")
    hook.set(returnPassword)
    # Remove all previous passwords so the new one being added above is the one that gets used.
    SOAPUtil.removeSitePasswords()

The following sample JavaScript file removes existing realm passwords when the hook is accessed, then, when a virtual user load test is executed, submits passwords that depend upon the URL currently being accessed.

// This method uses a different password based on the URL.
var Application = Packages.com.parasoft.api.Application

function returnPassword(realm, url, ntlm, context) {
    Packages.soaptest.api.SOAPUtil.removeSitePasswords()
    // Adds the password to a virtual user in a load test
    if (url.indexOf("http://toad.parasoft.com:90") == 0) {
        Packages.soaptest.api.SOAPUtil.addSessionPassword(realm, "user1", "password1", url, ntlm, context)
    } else {
        Packages.soaptest.api.SOAPUtil.addSessionPassword(realm, "user2", "password2", url, ntlm, context)
    }
}

function setHook() {
    var hook = Application.getHook("RealmPassword")
    hook.set(returnPassword)
    Packages.soaptest.api.SOAPUtil.removeSitePasswords()
}


Notes
• When you use addSitePassword() and addSessionPassword(), you should pass ntlm and context from the method you define to the password-adding method that you are calling. We recommend that you avoid changing these values unless you are comfortable manipulating the SOAtest Extensibility API. Also, we recommend that you do not change the realm value; the server gives that realm value to SOAtest, and it is one of the factors that SOAtest uses to determine which password to send for a particular URL. If the URL you pass does not reside within your Project tree, the password will not be added.
• When defining hooks in Python or JavaScript that should be set when the application starts, the script placed in the startup directory must actually call the method that sets the hook, not just define it (see the sketch below).
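As a minimal sketch of this pattern (the hook name and message text are simply reused from the earlier Alert example; adapt them to the hook you actually need), a Python startup script would look like this:

from com.parasoft.api import Application

def myAlertHook(msg):
    Application.showMessage("JavaScript Alert: %s" % str(msg))

def setHook():
    Application.getHook("Alert").set(myAlertHook)

# This call is what actually registers the hook when the startup script runs;
# without it, the methods above are only defined, never attached.
setHook()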

URL Logging
The URL Logging hook is called every time SOAtest visits a URL. It can be used to prompt SOAtest to print each URL visited. The methods you attach to this hook need to have either 3 or 4 arguments. The first argument should be a string specifying the URL visited. The second argument should be any POST data submitted with the URL (this may be null). The third argument is a java.util.Hashtable containing other HTTP properties sent with the request (e.g., referer). The optional fourth argument, if added to the method signature, is a com.parasoft.api.Context. In addition, you need to set URLUtil.enableLogging to true and pass UrlLogging to Application.getHook().

For example, if you wanted SOAtest to print each URL visited to the Console view, you could create the following Python script and add it to your <soatest_install_dir>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/startup directory:

from com.parasoft.api import *
from webtool.site import URLUtil

count = 1

def setHook():
    URLUtil.enableLogging = 1
    hook = Application.getHook("UrlLogging")
    hook.set(loggingHook)

def loggingHook(url, post, props):
    global count
    if post:
        Application.showMessage(str(count) + ". " + url + " [POST=" + post + "]")
    else:
        Application.showMessage(str(count) + ". " + url)
    count = count + 1

setHook()
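If you also want access to the test context, the logging method can declare the optional fourth argument described above. The following is only a minimal sketch (the method name and message text are placeholders); the setup is otherwise the same as in the previous example:

from com.parasoft.api import *
from webtool.site import URLUtil

def loggingHookWithContext(url, post, props, context):
    # 'context' is the optional com.parasoft.api.Context argument;
    # the first three arguments are the same as in the previous example.
    Application.showMessage("Visited: " + url)

def setHook():
    URLUtil.enableLogging = 1
    Application.getHook("UrlLogging").set(loggingHookWithContext)

setHook()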


Reviewing Static Analysis Results

This topic covers how to analyze and correct static analysis violations. Sections include:
• Accessing Results
• Responding to Results
• Learning More About Violated Rules
• Reviewing and Correcting the Responsible Source Code
• Suppressing the Reporting of "Allowed" Violations

Accessing Results
For tests run in the GUI, results are reported in the SOAtest view. In addition, for static analysis tests that were run on source files, results are also reported at the source code level: if you open the editor for a tested source file, markers will be placed next to the source code responsible for the problems found. Markers are placed next to the line of code responsible for the violation. To learn what problem a particular marker indicates, place your mouse over the marker and review the information in the popup window. Or, to go directly to the related SOAtest view message, right-click the source code responsible for the problem, then choose SOAtest> Show in Tasks.
For tests run from the command line interface, results are reported in the Static Analysis section of the report. If results were sent to Team Server, they can be imported into the GUI as described in "Accessing Results and Reports", page 234. They will then be available in the SOAtest view.
Note that static analysis errors are not a criterion for functional test failure. For example, assume that you are performing static analysis on a Web functional test suite that includes a Browser Testing tool or a Scanning tool. If the tests execute successfully but static analysis errors are found in the referenced pages, the test will be reported as passing, and the static analysis errors will be reported in the SOAtest view.

Modifying the Results Layout
The default SOAtest view layout is designed for displaying functional test results. To facilitate the review of static analysis results, SOAtest provides two additional layouts:
• SOAtest Static Analysis Layout: This option is recommended if you are running static analysis against source code (e.g., from the Scanning perspective).
• SOAtest Static Analysis for Functional Tests Layout: This option is recommended if you are running static analysis by executing a test suite (e.g., a test suite that contains a Browser Testing tool or a Scanning tool). With this layout, SOAtest will report the test that uncovered each static analysis error, as well as the name of the file where the static analysis error occurs. If the error occurs in browser-constructed HTML, SOAtest will report "Browser-Constructed HTML (<Browser name and version>)" instead of a specific file name.

To choose one of the static analysis layouts:
• Open the pull-down menu in the top right of the SOAtest view, then choose one of the available formats from the Layout shortcut menu that opens.


Responding to Results
For each violation reported, we recommend that you and your team review the rule description and the related code, then decide whether:
• The violation is valid and significant (and the violation should be corrected)
• The rule does not apply in that particular context (and the violation should be suppressed)
• The rule is not well-suited to your projects or priorities (and the related rule should be disabled)

Many teams like to review SOAtest’s static analysis violations during code reviews. Developers check their code using the rules selected by the team’s architect and/or manager. If a developer thinks that it makes sense to ignore a particular rule violation, that developer discusses this at the code review. The team then decides whether the violation should be suppressed, the rule should be disabled, or the violation should be corrected.

Learning More About Violated Rules
The SOAtest rule descriptions can help you determine which rules your team wants to follow, understand how reported violations can impact application reliability, security, maintainability, etc., and learn how to correct reported violations.
To view a rule description file, right-click the static analysis violation message in the SOAtest view, then choose View Documentation from the shortcut menu. A yellow "Yield" sign marks the node that you should right-click.


Tips
To learn about the static analysis rules that are included with SOAtest, choose Help> Help Contents, then open the SOAtest Static Analysis Rules book and browse the available rule description files.
To view a list of all static analysis rules that a given Test Configuration is configured to check, as well as descriptions of all enabled rules:
1. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. Select the Test Configurations category for which you want a rule list.
3. Open the Static tab.
4. Click Printable Docs.
If you want to print this list of rules and all related rule descriptions, enable your browser's Print all linked documents printer option before you print the main list of rules (the index.html page). If you want to generate a PDF, create a PDF from the index.html page; be sure to configure the PDF generator to include linked pages (in Acrobat, by enabling the Get entire site option).
If you would rather check whether your code follows a single coding standard, a predefined category of coding standards, or all available coding standards, you can check coding standards "on the fly." To check coding standards in this way:
1. In the Navigator for the SOAtest perspective, select the resource that you want to test.
2. Do one of the following:
• Choose Test Using> Built-in> Static Analysis> [desired rule or category of rules] from the Test Using pull-down menu.
• Choose SOAtest> Test Using> Built-in> Static Analysis> [desired rule or category of rules].

Reviewing and Correcting the Responsible Source Code
To view the source code responsible for the rule violation, double-click the node that shows the line number, or right-click that node and choose Go to from the shortcut menu. The editor will then open and highlight the designated line of code. You can make the necessary modifications, then save the modified file.

Suppressing the Reporting of "Allowed" Violations See “Suppressing the Reporting of Acceptable Violations”, page 608.


Suppressing the Reporting of Acceptable Violations

This topic explains how to prevent SOAtest from reporting specific static analysis violations (e.g., when you generally follow a rule, but decide to ignore that rule in an isolated number of exceptional situations). Suppression schemes can be entered in the GUI or defined directly in the source code. Sections include:
• About Suppressions
• Defining Suppressions
• Viewing Suppressions
• Clearing Suppressions

About Suppressions
Suppressions are used to prevent SOAtest from reporting additional occurrences of a specific static analysis task (multiple tasks might be reported for a single rule). Suppressed messages will be sent to a special Suppressions view instead of the SOAtest view; this allows you to monitor those violations as needed while keeping your main results areas focused on the other errors.
You use suppressions for situations when you generally want to follow a rule, but decide to ignore that rule in an isolated number of exceptional situations. With suppressions, you can continue checking whether your code follows that rule without receiving repeated messages about your intentional rule violations. If you do not want to receive error messages for any violations of a specific rule, we recommend that you modify your Test Configurations so that they no longer check that rule.
Note that suppression settings are independent of Test Configurations. To avoid confusion, keep in mind that:
• The Test Configuration defines the set of rules that are checked during static analysis.
• Suppressions define which static analysis results should be visible in the SOAtest view and reports.
This means that rules selected in the Test Configuration will be checked during analysis, but results that match the suppression criteria will not be displayed.

Tip
Suppressions are message-based and not rule-based. Suppressions prevent the reporting of a specific static analysis task (e.g., fix the violation of rule X that occurs in line Y); they do not prevent the reporting of all violations of a rule.

Defining Suppressions
To suppress a static analysis task that is shown in the SOAtest view:
1. Right-click the SOAtest view item that represents the task you want to suppress, then choose Suppress Task from the shortcut menu.
• To suppress all tasks in a group (a rule category, a specific rule, a file, etc.), right-click the node that represents that group, then choose Suppress All Tasks.
2. Enter the reason for the suppression in the dialog box that opens.
The task will then be "suppressed" and removed from the SOAtest view. A suppression entry will be added to the Suppressions view. If the same static analysis violation is found in subsequent tests of this project, it will be reported in the Suppressions view, but not in the SOAtest view.

Viewing Suppressions
To view the suppressed messages that were reported in a subsequent test run:
• Open the Suppressions view. If this view is not available, choose SOAtest> Show View> Suppressions to open it.
The Suppressions view will display the following information:
• The static analysis violation that was suppressed.
• The reason why the task was suppressed.
• The resource (file) to which the suppression applies.
• The folder that contains the resource.
• The name of the person who suppressed the task.
• The date on which the suppression was first applied.

To sort the Suppressions view contents by one of the column headings, click that column heading.

Tip
• You can edit a suppression's message or reason by right-clicking the suppression in the Suppressions view, choosing Edit Message or Edit Reason from the shortcut menu that opens, then modifying the message or reason in the dialog that opens.

Using the Suppression Filter to Restrict the Suppressions Shown
You can restrict which suppressions are shown in the Suppressions view by using the available suppressions filter. To filter the suppressions shown in the Suppressions view:
1. Click the Filter button in the Suppressions view toolbar. The Filters dialog will open.
2. Check the Enabled check box to enable the filter.
3. Use the dialog's controls to specify your filtering criteria. Available options include:
• Limit visible items to: Shows no more than the specified number of suppressions.
• On any resources: Shows all suppressions for all projects.
• On any resource in same project: Shows all suppressions for the currently-selected project.
• On selected resource only: Shows only the suppressions entered for the currently-selected resource.
• On selected resource only and its children: Shows only the suppressions entered for the currently-selected resource, and that resource's children.


Clearing Suppressions
To unsuppress a task:
• Select the Suppressions view item that represents the task you want to unsuppress, then click the Red X Remove Suppression icon in the top right of the view.

If this same static analysis violation is found in subsequent tests of this project, the task will be reported in the SOAtest view.


Customizing Static Analysis: Overview

Static analysis can be highly customized. Use the following list as a reference to determine how to achieve the customization effect you are seeking:
• To determine exactly what rules are checked during coding standard analysis: change the rules settings in the Static tab of the Test Configuration(s) you want to apply. See "Defining How Static Analysis is Performed (Static Tab)", page 247.
• To modify static analysis settings (such as the number of static analysis tasks reported per rule): change the analysis settings in the Static tab of the Test Configuration(s) you want to apply. See "Defining How Static Analysis is Performed (Static Tab)", page 247.
• To prevent the reporting of additional occurrences of a specific static analysis task: suppress the task. See "Suppressing the Reporting of Acceptable Violations", page 608.
• To create custom rules for checking application-specific, project-specific, and team-specific requirements: use RuleWizard to create custom coding standards. See "Creating Custom Static Analysis Rules", page 612.
• To customize the built-in rules: edit rule parameters in the Static tab of the Test Configurations dialog, or use RuleWizard to modify static analysis rules. See "Creating Custom Static Analysis Rules", page 612.
• To change rule categories, IDs, headers, and/or severities (e.g., to match your team's or organization's coding policy): define the changes with rule mappings. See "Modifying Rule Categories, IDs, Names, and Severity Levels", page 616.
• To create new rule categories: define the changes with rule mappings. See "Modifying Rule Categories, IDs, Names, and Severity Levels", page 616.
• To change the default directory where SOAtest looks for user-defined rules: set the desired User rules location in the SOAtest> Configurations page of the SOAtest Preferences dialog. See "Modifying General SOAtest Preferences", page 242.


Creating Custom Static Analysis Rules

This topic explains how to check custom requirements or tailor existing rules to your unique needs by either modifying built-in static analysis rules or creating your own static analysis rules. Sections include:
• Customizing Parameterized Rules
• About RuleWizard
• Customizing Built-In Rules with RuleWizard
• Creating New Rules
• Using Custom Rules (For Teams Using Team Server)
• Using Custom Rules (For Teams Not Using Team Server)

Customizing Parameterized Rules
Many rules are parameterized, meaning that you can customize the nature of the rules by modifying the available rule parameters. Many naming convention rules are parameterized so that you can specify the naming convention that you want to check. Other rules are parameterized so that you can control rule options such as the scope of checking, or choose among different interpretations of the rule. Parameterized rules are marked with a special icon (a wizard hat with a radio button) in the Test Configurations dialog's Static> Rules Tree tab.
If a rule is parameterized, its parameters are described in the rule's description. To view a rule's description, right-click the node that represents that rule, then choose View Rule Documentation from the shortcut menu.
To edit a parameterized rule:
1. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. Open the Static> Rules Tree tab for any Test Configuration. The modified rule parameters will be applied to all Test Configurations, so it does not matter which Test Configuration you select in this step.
3. Expand the rule's category branch.
4. Right-click the parameterized rule that you want to modify, then choose View/Change Rule Parameters from the shortcut menu.
5. Modify the rule parameters in the dialog that opens.
6. Click OK to save your changes.

About RuleWizard
RuleWizard (available in the Architect and Server editions only) allows you to create custom static analysis rules. SOAtest can automatically enforce any valid rule created in RuleWizard. By creating and checking custom rules, teams can verify unique project and organizational requirements, as well as prevent their most common errors from recurring.


With RuleWizard, rules can be created graphically (by creating a flow-chart-like representation of the rule) or automatically (by providing code that demonstrates a sample rule violation). No coding or knowledge of the parser is required to write or modify a rule.

There are two ways to open RuleWizard:
• Choose SOAtest> Launch RuleWizard.
• Click the New button in the Test Configurations panel's Static tab.

The RuleWizard GUI will then open. The RuleWizard User's Guide (accessible by choosing SOAtest> Help in the SOAtest GUI or Eclipse workbench, then opening the SOAtest RuleWizard User’s Guide book) contains information on how to modify, create, and enable custom rules.

Customizing Built-In Rules with RuleWizard
With SOAtest Architect Edition and Server Edition, you can use RuleWizard to customize any rule marked with the wizard hat + wizard wand icon in the Test Configuration panel's rules tree.
We strongly recommend that you leave the SOAtest built-in rules intact; rather than modify the built-in rules, duplicate them, then modify the duplicates.
To customize a built-in rule in RuleWizard:
1. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. Open the Static> Rules Tree tab for any Test Configuration.
3. Right-click the rule you want to modify, then choose Duplicate from the shortcut menu. A duplicate rule node (with a file icon) will be added to the rules tree.
4. Right-click the duplicate rule, then choose Edit Rule in RuleWizard from the shortcut menu.
The RuleWizard GUI will then open. The RuleWizard User's Guide (accessible by choosing Help> Documentation in the RuleWizard GUI) contains information on how to modify and save custom rules. Be sure to save the rule after you modify it, then enable it as described in "Sharing Custom Rules", page 222 (if your team will be sharing custom rules with Team Server) or "Using Custom Rules (For Teams Not Using Team Server)", page 614 (if you will not be sharing custom rules with Team Server).

Creating New Rules
You can easily create your own static analysis rules (or modify built-in rules) using the SOAtest RuleWizard module, a graphical rule creation and customization tool available in SOAtest (Architect edition) and SOAtest (Server edition). With RuleWizard, rules can be created graphically (by creating a flow-chart-like representation of the rule) or automatically (by providing code that demonstrates a sample rule violation). No coding or knowledge of the parser is required to write or modify a rule.
To open RuleWizard:
• Choose SOAtest> Launch RuleWizard.

The RuleWizard GUI will then open. The RuleWizard User's Guide (accessible by choosing Help> Documentation in the RuleWizard GUI) contains information on how to modify, create, and save custom rules. Before you can check custom rules, you must configure SOAtest to import and check them. For details on how to configure SOAtest to recognize and check those rules, see “Sharing Custom Rules”, page 222 (if your team will be sharing custom rules with Team Server) or “Using Custom Rules (For Teams Not Using Team Server)”, page 614 (if you will not be sharing custom rules with Team Server).

Note on Rule IDs
Each rule that you import into the tool must have a unique rule ID. You should not import multiple rules that have the same rule ID.

Using Custom Rules (For Teams Using Team Server) See “Sharing Custom Rules”, page 222.

Using Custom Rules (For Teams Not Using Team Server)
Before you can check custom coding rules that were designed in RuleWizard, you need to configure SOAtest to access and check those rules.
Note
The following procedure describes how to enable custom rules if you are not using Parasoft Team Server to share rules across the team. If you are using Team Server, follow the instructions in "Sharing Custom Rules", page 222.
To configure SOAtest to import and check custom rules if you are not using Team Server:
1. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. Select any Test Configurations category. The new rule(s) will be available in all available Test Configurations.
3. Open the Static> Rules Tree tab.
4. If any new rules should belong to a new category, create a new category as follows:
a. Click the Edit Rulemap button.
b. Open the Categories tab.
c. Click New. A new entry will be added to the category table.
d. Enter a category ID and category description in the new entry. For instance, an organization might choose to use ACME as the category ID and ACME INTERNAL RULES as the description.
e. Click OK to save the new category.
5. Click the Import button to the right of the rules tree. The Import RuleWizard rule dialog will open.
6. Use the Import RuleWizard rule dialog to specify which rule(s) you want to import, and whether you want to overwrite existing rule files (if an imported rule file has the same name as an existing rule file).
7. Click OK. The rule will be displayed under the assigned category and will be disabled by default.
8. Enable the new rule(s) you want checked.
9. Click either Apply or Close to commit the modified settings.


Modifying Rule Categories, IDs, Names, and Severity Levels

This topic explains how you can create new rule categories, as well as modify rule categories, rule IDs, names, and severity levels using rule mappings. This is especially helpful if you want to configure SOAtest to enforce your team's or organization's coding policy (e.g., by customizing the built-in rule names, severities, and categories to match the ones defined in your policy). Sections include:
• Specifying Rule Mappings
• Sharing Rule Mappings

Specifying Rule Mappings
To specify rule mappings:
1. Open the Test Configurations dialog by choosing SOAtest> Test Configurations.
2. Select any Test Configurations category.
3. Open the Static> Rules Tree tab.
4. Click Edit Rulemap.
5. Enter your rule mapping in the Edit Rulemap File dialog that opens.
• To change the rule ID, name, and/or severity level: Add an entry to the Rules tab. A map entry defines a rule map which can change the specified built-in rule's category ID, name, and/or severity level. Enter the current rule ID (for instance, SECURITY.RULENAME) in the Original ID column, then enter the desired changes in the appropriate column.
  • If you want to change the rule ID, enter the desired rule ID in the Mapped ID column.
  • If you want to change the severity, select it from the Severity column drop-down menu.
  • If you want to change the rule name, enter the desired name in the Header column.
• To copy a rule to another rule ID and/or severity level: Add an entry to the Clones tab. Cloning rules is useful for mapping a single rule instance to multiple rule IDs. Enter the current rule ID (for instance, SECURITY.RULENAME) in the Original ID column, then enter the desired clone rule ID (for instance, SECURITY.RULE1) in the Clone ID column and/or select the desired clone severity level from the Severity column drop-down menu.
• To add a new rule category: Add an entry to the Categories tab. Enter a new category ID in the Category ID column, then enter a brief category description in the Category Description column.


After you enter rule mappings in the SOAtest UI, they are saved in a simple text file named rulemap.xml in your SOAtest installation directory.

Sharing Rule Mappings
Through Team Server
If you are using Team Server and you want team members to share the rule changes, you should upload the rulemap.xml file to the Team Server. To upload the file to Team Server:
• See "Sharing Rule Mappings", page 221.

If a machine has access to both a local rulemap.xml file and a team rulemap.xml file, the team file will take precedence over the local file.

Through Export/Import
Even if you do not have Team Server, you can share rule mappings by exporting/importing them.
To export rule mappings:
1. Open the Rulemap dialog by clicking the Edit Rulemap button in the Static> Rule Tree tab of the Test Configuration panel.
2. Click the Export button, then use the file chooser to indicate where you want to save the rulemap file.
To import rule mappings:
1. Open the Rulemap dialog by clicking the Edit Rulemap button in the Static> Rule Tree tab of the Test Configuration panel.
2. Click the Import button, then use the file chooser to select the appropriate rulemap file.


Code Review
In this section:
• Code Review Introduction
• General Code Review Configuration
• Configuring and Running Pre-Commit Code Review Scans
• Configuring and Running Post-Commit Code Review Scans
• Working with the Code Review UI
• Authors - Examining and Responding to Review Comments
• Reviewers - Reviewing Code Modifications
• Monitors - Overseeing the Review Process
• Code Review Tips and Tricks


Code Review Introduction

Given the complexity of developing and maintaining secure, reliable, and compliant SOA, it is often critical to manually review specific SDLC artifacts in order to ensure consistency and correctness. In conjunction with automated policy enforcement, peer review allows peers or managers to evaluate critical SDLC artifacts in the context of the organization's defined quality policies. For example, a team may have a policy that test suites for certain core functionality must be peer reviewed every time they are created or modified.
Parasoft's Code Review module is designed to make peer reviews more practical and productive by automating preparation, notification, and tracking. It automatically identifies updated files, matches the files with designated reviewers, and tracks the progress of each review item until closure. This allows teams to establish a bulletproof review process that ensures the designated files are reviewed and all identified issues are resolved.
Parasoft provides built-in support for the following typical code review flows:
• Post-commit: This mode is based on automatic identification of file changes in a source repository via custom source control interfaces. Review tasks are created based on a preconfigured mapping of changed code to reviewers.
• Pre-commit: Users can initiate a code review from the desktop by selecting a set of files to distribute for the review, or by automatically identifying all locally changed source code.

Code Review requires Team Server and at least one SOAtest Server Edition.

Workflow Overview
There are two key workflow variations: default vs. restricted and pre-commit vs. post-commit.

Default vs. Restricted
In a restricted workflow, the reviewer must accept each issue. If the author does not agree with the reviewer's suggestion, he needs to discuss this with the reviewer. The default workflow is more open: when the author receives comments from a reviewer, he can either apply them to the code or leave the code as is.
If you want to use a restricted workflow, add the 'workflow' property to your Team Server: use the property key path /usr/{user}/codereview/workflow and the property value restrict. If you later want to revert to the default workflow policy, change the property value to default.

Pre-Commit vs. Post-Commit
Pre-commit code reviews are for teams who want to review code before adding it to source control. When developers are ready to have a piece of new/modified code reviewed, they run a Code Review Test Configuration from the SOAtest UI; the reviewer is then automatically notified about the required review. In a pre-commit process, a code review package is created for code that needs to be accepted before it is committed to source control. The author should always see that packages are finally accepted; at that point, he can commit the files and then close the package.
Post-commit code reviews are for teams who want to review code after it is committed to source control. A Code Review Test Configuration is typically scheduled to run automatically on a regular basis (for example, every 24 hours). It scans the designated source control repository to identify code that requires review, then sends this information to Team Server, which distributes it to the designated reviewers. In this scenario, code authors do not need to perform any special actions to have their code reviewed; simply committing code to source control is sufficient. After the Code Review Test Configuration runs, the designated reviewers are automatically notified about the required reviews. In a post-commit process, a code review package is created for code that has already been committed to source control; after the package is accepted, it is no longer needed (since the code is already in source control).

Workflow Details
The following diagrams illustrate the available workflows (diagrams not reproduced here): Pre-Commit Restricted Workflow, Pre-Commit Default Workflow, Post-Commit Restricted Workflow, and Post-Commit Default Workflow. The diagrams involve the Author and Reviewer roles and the To Review, To Fix, Accepted, and Done states.

General Code Review Configuration

This section explains general code review configuration options (options that apply to both pre-commit and post-commit processes). Sections include:
• Configuring Preferences
• Exempting Specific Pieces of Code from Code Review Scans
• Troubleshooting Unpublished or Skipped Reviews
• Understanding How Code Review Packages are Created
• Importing a Scanner Configuration Previously-Defined in a scanner.properties File

Configuring Preferences
To configure Code Review preferences on every team SOAtest installation:
1. Choose SOAtest> Preferences to open the Preferences dialog.
2. If you have not already done so, configure source control preferences as follows:
a. Select the Scope and Authorship category in the left pane of the Preferences panel.
b. Enable Use source control (modification author) to compute scope.
c. Select the Source Control category in the left pane of the Preferences panel.
d. Specify your team's source control repository.
How do I get an encoded password?
For CVS, use the value in .cvspass from within the user's home directory. For CVSNT, use the value stored in the registry under HKEY_CURRENT_USER\Software\Cvsnt\cvspass.

3. If you have not already done so, configure Report Center or Team Server preferences (depending on where code review data will be stored—we recommend using Report Center when it is available, but also support Team Server for backwards compatibility) in the appropriate area of the Preferences panel (Report Center/Project Center or Team Server).

Team Server Configuration Note
If your Parasoft solution uses Team Server Named Accounts, ensure that Team Server user accounts can access the Team Server 'Code Review' directory. For details on opening the appropriate path permissions, see the 'Named Accounts' section of the PST Admin Guide.
4. (Desktop installations only) Select the Code Review category in the left pane, then complete the Code Review settings as follows:










General Settings
• User name: Enter a unique Code Review name for the current user. The same user name specified here should also be specified in the Code Review Test Configuration.
• Show user assistant during scanner run: This enables a user assistant, which allows you to specify a task identifier, enter notes, specify review priority, and enter a specific reviewer or monitor for each Code Review run. This is intended for pre-commit code reviews.
• Notify me about new or updated reviews every ___ minutes: Select this if you want to be alerted when you need to review code and/or when code that you authored has been reviewed. This option is recommended.
• Show completed tasks by: If you want the code review tasks tree to show completed tasks as well as active ones, enable this option and specify the range of completed tasks you want shown. For example, if you want to see all tasks completed within the past week, choose 7 days.
Team Settings
• Storage: Indicate whether you want code review data stored on Report Center or Team Server. We recommend using Report Center when it is available, but also support Team Server for backwards compatibility.
• Workflow: Indicate whether you want to use the default workflow or the restricted workflow. For an overview of these options, see "Workflow Overview", page 620.
Compare Editor
• Reuse compare editor: If you want Code Review to open each revision in the same editor, enable this option.
• Close compare editors on commit review action: If you want the compare editor to be closed when a revision is committed, enable this option.
• Show structural changes: If you want the compare editor to show structural changes, enable this option.
• Show suppressed parts: If you want the compare editor to show details from code excluded from analysis (as described in "Exempting Specific Pieces of Code from Code Review Scans", page 624), enable this option.
Opening Local Sources in Existing Projects
• Use source control to recognize local sources: If you want to allow SOAtest to use source control data in order to better recognize the sources found, enable this option.
• Always open without asking when path of a local source is different than remote path (single source matches only): If you want to force SOAtest to always apply a local source path (if it is different than the remote one), enable this option.
• Show warning when the file has been changed since issue was created: If you want to receive warnings if the Code Review task becomes outdated (i.e., because the current source code has been modified since the Code Review task was created), enable this option.
• Label decorations: Allows you to customize which label decorations are used in the code review task tree and determine whether tasks for multiple revisions of a file are merged.
• Import old scanner properties: For importing settings from an existing scanner.properties file (from previous releases). See "Importing a Scanner Configuration Previously-Defined in a scanner.properties File", page 626 for details.
5. Click Apply to apply your settings.
6. Click OK to set and save your settings.

Configuring Test Configurations for Scanning
Test Configurations control how code reviews are prepared, and the appropriate configuration varies depending on whether your team has a pre-commit or post-commit code review process.
• For details on configuring Test Configurations for pre-commit code review processes, see "Configuring and Running Pre-Commit Code Review Scans", page 628.
• For details on configuring Test Configurations for post-commit code review processes, see "Configuring and Running Post-Commit Code Review Scans", page 635.

Exempting Specific Pieces of Code from Code Review Scans
If you do not want specific pieces of code scanned by the code review scanner:
1. Add the codereview-begin-suppress comment at the point in the code where you want to stop scanning.
2. Add the codereview-end-suppress comment at the point in the code where you want to start scanning again.
By default, suppressed code will not be shown in diffs for related code review tasks (for example, a diff highlighting a code change immediately following or preceding suppressed code). If you want the suppressed content to be shown in diffs:
1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the Code Review category in the left pane.
3. Enable the Show suppressed parts option.
Troubleshooting Unpublished or Skipped Reviews
Unpublished Reviews
If the Test Progress tab reports that reviews were not published, this is a sign that your configuration has undefined authors. For example, assume that your code review results are as follows:

Troubleshooting Unpublished or Skipped Reviews Unpublished Reviews If the Test Progress tab reports that reviews were not published, this is a sign that your configuration has undefined authors. For example, assume that your code review results are as follows:


In this case, 2 files have been scanned and 2 revisions have been accepted, but there is no code review package to publish. Why? Because the scanned code had modifications made by "pietrk", and the Test Configuration does not list that author in the Code Review> Authors tab. To correct this problem:
1. Add the author to the Test Configuration's Code Review> Authors tab.
2. Either assign a reviewer to review all code by this author, or assign a reviewer to the area of source code where this author's revisions occur.
3. Run the Test Configuration again.

Skipped Reviews
If the results show that reviews were skipped, this is a sign that the modified code is not properly assigned to a reviewer. For example, assume that you have the following results

and that your report indicates the following:


You can resolve the problem by modifying the Test Configuration to either assign that area of code to a reviewer, or to assign a reviewer to all code modified by that author.

Understanding How Code Review Packages are Created
Each code review package is defined based on the author's changes (commits).
• In post-commit code review, changes are collected when the automated scan runs each day.
• In pre-commit code review, the code author specifies which changes should be included in the package.

Any reviewers and monitors mapped to that author are then assigned to the package. If the package contains code from the project areas that other reviewers or monitors are set to review, those reviewers/monitors are also assigned to this package. If a package is created and no reviewers or monitors are set to review the related author or project area, then no review tasks will be generated for that package. Separate packages are not created for different paths (regions). As a result, reviewers assigned to different paths may be working on the same package. They should decide for themselves which paths they want to review. This maintains the integrity of tasks that may spread across multiple reviewers' assigned areas. If a reviewer wants to see which files in a package were matched to him because they belong to "his" assigned region, this information is available in the Code Review view.

Importing a Scanner Configuration Previously-Defined in a scanner.properties File
In previous versions of SOAtest, the scanner configuration was defined in a scanner.properties file. Now, it is defined in the UI, via Test Configurations. If you have a scanner.properties file, import it into the current version of SOAtest as follows:
1. Choose SOAtest> Preferences to open the Preferences dialog.
2. Select the Code Review category in the left pane, then click the Import old scanner properties button.
3. Complete the dialog that opens as follows:
• Scanner properties to load: Enter or browse to your scanner.properties file.




• New test configuration name: Enter a name for the Test Configuration that will be created based on the imported scanner configuration.
• Import options: Specify 1) whether you want new repositories that are defined in the scanner.properties file to be imported into the source control preferences and 2) whether you want non-standard email addresses (i.e., addresses whose email username does not match the Code Review username) imported to Team Server author mapping.
  • Author mapping links the username to the appropriate email address, ensuring that notification emails are sent to the appropriate email address (rather than the default of username@domain).
  • If the Code Review username matches the email username, then no mapping is required.
  • For details on Team Server author mapping, see "Configuring Task Assignment", page 215.

4. Follow the instructions below to fine-tune your Code Review Test Configuration and add additional repositories if needed.


Configuring and Running Pre-Commit Code Review Scans

This topic explains how to set up and run a pre-commit code review process, where developers submit code for review before it is committed to source control. In such a process, each time a developer is ready to have a piece of new/modified code reviewed, he runs a Code Review Test Configuration from the SOAtest UI. This Test Configuration detects code changes and prepares them for review. Sections include:
• Configuration Overview
• Configuring a Test Configuration for Pre-Commit Code Reviews
• Submitting Code for Review - Interactive Desktop Execution
• Adding New Files to the Code Review Package
• Defining Reviewers, Authors, and Monitors with a Properties File
• Adding Post-Commit Scans To Your Pre-Commit Process

Related Topics
• For details on configuring post-commit code review, which is for teams who want to review code after it is committed to source control, see "Configuring and Running Post-Commit Code Review Scans", page 635.
• For details on pre-commit vs. post-commit, see "Workflow Overview", page 620.
• For details on general code review configuration options (options that apply to both pre-commit and post-commit processes), see "General Code Review Configuration", page 622.

Configuration Overview
Configuring a pre-commit code review requires:
• Setting up the appropriate preferences on all team SOAtest installations, as described in "General Code Review Configuration", page 622.
• Configuring a Code Review Test Configuration on one machine and sharing it across the team, as described below.
• Ensuring that the projects containing the reviewed code are available within SOAtest. These projects should be checked out from source control.

Configuring a Test Configuration for Pre-Commit Code Reviews
When a developer is ready to have a piece of new/modified code reviewed, he runs a Code Review Test Configuration from the SOAtest UI. This Test Configuration detects code changes and prepares them for review. We recommend that you configure this Test Configuration from one SOAtest installation, then share it across the team SOAtest installations using Team Server. This streamlines configuration and updating.


You can create a Test Configuration that is dedicated to Code Review, or have a Test Configuration that checks for Code Reviews as well as performs other analyses.

Important The default Code Review Test Configuration settings must be reviewed and customized before you run Code Review.

To configure a Test Configuration that detects code changes and prepares them for review:
1. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. Duplicate an existing Test Configuration (such as Built-in> Code Review> Pre-Commit) or create a new one.
3. In the Scope tab, choose Test only files added or modified locally.
4. At the top of the Code Review tab:
a. Check Enable Code Review Scanner.
b. Check Use unique [user@host] pattern. This ensures proper sorting and recognition of code review tasks.
c. Check Generate comprehensive report (includes all scanners) if you want the report to include code review results from all available team scanners. If this is not enabled, the report will include only results for the scanner ID specified in the current Test Configuration. This option is not typically used for pre-commit code reviews.
d. Check Auto publish reviews if you want review tasks to be "published" (uploaded) automatically after this Test Configuration is run. This is recommended for pre-commit code reviews.
5. In the Authors, Reviewers, Monitors, and Filters tabs, define how you want your code reviews assigned. Reviewers and monitors can be assigned to specific authors, or to specific project areas.

Tip - Streamlining Configuration for a Large Number of Developers
If you don't want to define every developer, you can 1) enable the Accept all (also undefined) authors for reviewed paths option, and then 2) define which reviewers should review different parts of the code.

• In the Authors tab, define the list of developers who are writing code that you want reviewed. For each author, specify an author name and a source control login (if the author's source control login is different from the author's name).
   • Your list of authors can include all of your developers, or only your junior developers.
   • If the developer who commits a code change is not defined in this tab, the change will be marked as coming from an 'undefined author'.
   • You do not need to map authors to reviewers or monitors here. These fields are provided for backwards compatibility with earlier releases.
• In the Reviewers and Monitors tabs, specify which authors and/or project areas you want each reviewer or monitor to cover.
   • Reviewers examine, comment on, and approve code changes. Monitors supervise the entire process to ensure that revisions are being reviewed and then corrected in a timely manner. They do not need to perform any reviews, but can add comments to the revisions or reviews. This role is optional.
   • Paths are defined in logical (workspace) path convention. Wildcards are allowed. For example, /com.parasoft.xtest.results.api/** assigns someone to review changes in the result API module, and **/*ja.properties assigns someone to review changes in files with Japanese resources.
   • You can define reviewers and monitors without mapping them to any particular path or author. Such users will not be assigned to any package automatically, but they will be included in the report and authors will be able to select them in the Code Review Assistant dialog.
• In the Filters tab, check Include files added or modified locally, then use the available fields to specify which files you want included or excluded.

Filter Tips and Examples

Tips

• Perl-style expressions can be used.
• The following wildcards are supported:
   • * matches 0 or more characters except '/'.
   • ? matches any single character except '/'.
   • ** matches 0 or more characters, including '/'. This allows you to include path elements.
• The following sample elements are added by default to the Code Review configuration:
   • **/bin/**/*.properties is added to the sample list of rejected wildcards.
   • (.*?/(bin|obj)(/x86|/x64){0,1}/(Debug|Release)/.*?\\.(dll|exe|pdb))$ is added to the sample list of rejected regexps.

Examples

• A basic file mask might be: *.xml, *.properties
• To include every file whose path has a folder named "bank" or "customer", use: **/bank/**, **/customer/**
• To include every file whose path has a folder with a name that either starts with "bank", includes "customer", or ends with "invoice", use: **/bank*/**, **/*customer*/**, **/*invoice/**
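Similarly, if you want to keep build output or generated sources out of review packages, you might add an exclusion mask along these lines to the rejected wildcards list (the folder and file names below are illustrative examples, not defaults):

**/generated/**, **/target/**, **/*.min.js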

6. Click Apply to commit the new Test Configuration.
7. Share the Test Configuration by right-clicking it, then choosing Upload to Team Server from the shortcut menu.

Submitting Code for Review - Interactive Desktop Execution

Each developer should perform the following steps after completing new/modified code that is ready for review:

1. In the project tree, select the project that contains the changes you want reviewed.
2. Run the Code Review Test Configuration. This should be in the Team category.
3. If you enabled the Show user assistant during scanner run option in the Code Review Preferences panel (strongly recommended), SOAtest will display the Code Review assistant. You can use this dialog to:
   • Provide information about the task you were working on. You can select a previous task that you recorded here, or enter a new one. This information is used to determine if you are still working on an existing task, or if you have started a new one. Your changes will be grouped by task in the code review packages that are prepared.
   • Assign a specific reviewer or monitor to the submitted revision when you are submitting a new task for review. This allows you to override the default reviewer/monitor assignment, or to specify a reviewer if one is not already assigned. The box on the right shows the list of already-defined reviewers (when your cursor is in the Reviewers field) or monitors (when your cursor is in the Monitors field). You can either select any user who is listed in that box, or type in a new name.
   • Provide the reviewer or monitor additional information about the current changes.
   • Indicate the priority of the review.
4. If your Code Review Test Configuration does not have Auto publish reviews enabled, upload the results to Team Server as follows:
   a. After the Test Configuration has run, click the Report button that is available in the Testing dialog.
   b. In the Report dialog that opens, check Publish: Code Reviews, then click OK.

The designated reviewer will then be alerted that code is ready for review. The reviewer can perform the review as described in “Reviewers - Reviewing Code Modifications”, page 650. After the review is completed, the author can respond as described in “Authors - Examining and Responding to Review Comments”, page 647.

Adding New Files to the Code Review Package

To add selected files to an existing code review package:


1. In the Code Review tab, right-click the package to which you want to add files, then choose Add Sources from the shortcut menu.

2. From the file chooser that opens, select the folders or specific files you want to add. Note that folders will not be processed recursively. The files will be added the next time the Code Review Test Configuration is run.

Defining Reviewers, Authors, and Monitors with a Properties File

In addition to specifying code review users directly in the Test Configuration and adding reviewers "on-the-fly" in the Code Review Assistant dialog, you can also define code review users in a .properties file. This can be useful if your review process involves a large number of team members.

To define reviewers with a properties file:

1. Prepare an empty properties file config.properties.
2. Add the following initial required data to it:

com.parasoft.xtest.checkers.api.config.name=Base Code Review
com.parasoft.xtest.checkers.api.config.tool=3
com.parasoft.xtest.codereview.scanner.checksum=fixed

3. Begin defining the list of users, either manually or using a custom script that pulls the appropriate information automatically. Each user must have an index, and then properties such as name, roles, and monitored.locations (for monitors only). Available roles are 'a' (author), 'm' (monitor), and 'r' (reviewer). For example:

com.parasoft.xtest.codereview.scanner.crusers.0.name=tom
com.parasoft.xtest.codereview.scanner.crusers.0.roles=r
com.parasoft.xtest.codereview.scanner.crusers.1.name=bob
com.parasoft.xtest.codereview.scanner.crusers.1.roles=r
com.parasoft.xtest.codereview.scanner.crusers.2.monitored.locations=**/src/**, **/include/**
com.parasoft.xtest.codereview.scanner.crusers.2.name=joe
com.parasoft.xtest.codereview.scanner.crusers.2.roles=am
com.parasoft.xtest.codereview.scanner.crusers.3.name=bill
com.parasoft.xtest.codereview.scanner.crusers.3.roles=a

4. Publish the prepared file on a dedicated HTTP server. If you will be automatically regenerating this file based on a user database, use a file location that allows such re-generation.
5. Open the Test Configuration dialog and create a new configuration by duplicating the "Pre-Commit" Test Configuration.
6. Export the Test Configuration to the file system.
7. Open the exported file in an editor and set the parent configuration property:

com.parasoft.xtest.checkers.api.config.parent=http\://server.com/config.properties

8. Import the modified configuration. It should now have a parent config from the HTTP server and the defined reviewers on the list.
9. Refresh the Test Configuration manager and verify that any added monitors are displayed.
10. Publish it on Team Server.

Here is a sample properties file:

com.parasoft.xtest.codereview.scanner.crusers.13.roles=arm
com.parasoft.xtest.codereview.scanner.crusers.13.name=dave
com.parasoft.xtest.codereview.scanner.crusers.13.reviewers=mark_A, mark_B
com.parasoft.xtest.codereview.scanner.crusers.13.monitors=mike
com.parasoft.xtest.codereview.scanner.crusers.13.reviewed.locations=**/test/*.txt, project_A/**
com.parasoft.xtest.codereview.scanner.crusers.13.monitored.locations=/**
com.parasoft.xtest.codereview.scanner.checksum=fixed

Note that:

• The number 13 is the unique ID of this user in the configuration.
• The roles "arm" stand for author/reviewer/monitor. The presence of a particular letter indicates that the user has the corresponding role. For example, the roles "rm" mean that the user should be included in the list of suggested reviewers/monitors, but is not an author (no reviews will be created for his revisions).
• reviewers/monitors is a list of users that should be automatically assigned to packages authored by this user (requires the author role).
• reviewed.locations/monitored.locations is a list of regions to which this user should be assigned as reviewer/monitor (requires the reviewer/monitor role).
• The reviewer/monitor role for 'dave' is also required if any other user has 'dave' specified in his reviewers/monitors list.
• Only the unique ID and user name are required, but each user has to have at least one of the author/reviewer/monitor roles active to be meaningful.
• checksum=fixed is a single key required to prevent this config from being overwritten by older versions of the application.
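As an illustration of the last few points, here is a sketch of a second user entry that lists dave as its reviewer (the index 14 and the name anna are hypothetical values for this example). Because dave appears in another user's reviewers list, dave's own entry above must include the reviewer ('r') role:

com.parasoft.xtest.codereview.scanner.crusers.14.name=anna
com.parasoft.xtest.codereview.scanner.crusers.14.roles=a
com.parasoft.xtest.codereview.scanner.crusers.14.reviewers=dave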

Adding Post-Commit Scans To Your Pre-Commit Process

Some teams who submit code for review via a pre-commit process also like to perform a post-commit nightly scan to:

• Generate emails notifying authors and reviewers about their assigned code review tasks.
• Identify any code changes that were committed to source control, but were not submitted for review using the pre-commit procedure.

For details on how to accomplish this, see “Running Post-Commit Scans with a Pre-Commit Process”, page 639.

Configuring and Running Post-Commit Code Review Scans

This topic explains how to set up and run a post-commit code review scan that scans the source control system, identifies new/modified code that has been checked in, and matches the code with designated reviewers. Sections include:

• Configuration Overview
• Configuring a Test Configuration for Post-Commit Code Reviews
• Preparing Code for Review - Automated Scanner Execution
• Running Post-Commit Scans with a Pre-Commit Process

Related Topics

• For details on pre-commit code review, which is for teams who want to review code before it is committed to source control, see “Configuring and Running Pre-Commit Code Review Scans”, page 628.
• For details on pre-commit vs. post-commit, see “Workflow Overview”, page 620.
• For details on general code review configuration options (options that apply to both pre-commit and post-commit processes), see “General Code Review Configuration”, page 622.

Configuration Overview

Configuring a post-commit code review requires:

• On the SOAtest server installation:
   • Configuring a Code Review Test Configuration as described below.
   • Configuring the command-line execution of this Test Configuration as described below.
   • Scheduling automated scans as described below.
• On all team SOAtest installations, including desktop and server installations:
   • Setting up the appropriate preferences on all team SOAtest installations as described in “General Code Review Configuration”, page 622.
   • Ensuring that the projects containing the reviewed code are available within SOAtest. These projects should be checked out from source control.

Configuring a Test Configuration for Post-Commit Code Reviews

For post-commit processes, a code review Test Configuration runs on the server each night to scan the source control system, identify new/modified code that has been checked in, and match the code with designated reviewers.


Important: The default Code Review Test Configuration settings must be reviewed and customized before you run Code Review.

To configure a Test Configuration that detects code changes and prepares them for review:

1. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. Duplicate an existing Test Configuration (such as Built-in> Code Review> Post-Commit) or create a new one.
3. In the Scope tab, choose Test files added or modified in the last ___ days. This determines what files are prepared for code review. For example, if the Code Review scanner will run daily, enter 1 in the days field. If it will run weekly, enter 7.
   • Alternatively, you can configure it to prepare all files modified within a specified time period. Instead of choosing Test files added or modified in the last ___ days, choose the Test only files added or modified since the cutoff date option and the and added or modified before option, then specify the desired time range.
4. At the top of the Code Review tab:
   a. Check Enable Code Review Scanner.
   b. Enter a unique identifier for the scanner in the Identifier field. If your team uses multiple scanners, each must have a unique ID.
   c. Check Generate comprehensive report (includes all scanners) if you want the report to include code review results from all available team scanners. If this is not enabled, the report will include only results for the scanner ID specified in the current Test Configuration.
   d. Enable Auto publish reviews if you want review tasks to be "published" (uploaded) automatically after this Test Configuration is run. If you use the -publish option with a nightly run, tasks will be "published" regardless of this setting.
5. In the Authors, Reviewers, Monitors, and Filters tabs, define how you want your code reviews assigned. Reviewers and monitors can be assigned to specific authors, or to specific project areas.

Tip - Streamlining Configuration for a Large Number of Developers
If you don't want to define every developer, you can 1) enable the Accept all (also undefined) authors for reviewed paths option, and then 2) define which reviewers should review different parts of the code.

• In the Authors tab, define the list of developers who are writing code that you want reviewed. For each author, specify an author name and a source control login (if the author's source control login is different from the author's name).
   • Your list of authors can include all of your developers, or only your junior developers.
   • If the developer who commits a code change is not defined in this tab, the change will be marked as coming from an 'undefined author'.
   • You do not need to map authors to reviewers or monitors here. These fields are provided for backwards compatibility with earlier releases.
• In the Reviewers and Monitors tabs, specify which authors and/or project areas you want each reviewer or monitor to cover.
   • Reviewers examine, comment on, and approve code changes. Monitors supervise the entire process to ensure that revisions are being reviewed and then corrected in a timely manner. They do not need to perform any reviews, but can add comments to the revisions or reviews. This role is optional.
   • Paths are defined in logical (workspace) path convention. Wildcards are allowed. For example, /com.parasoft.xtest.results.api/** assigns someone to review changes in the result API module, and **/*ja.properties assigns someone to review changes in files with Japanese resources.
   • You can define reviewers and monitors without mapping them to any particular path or author. Such users will not be assigned to any package automatically, but they will be included in the report and authors will be able to select them in the Code Review Assistant dialog.
• In the Filters tab, use the available fields to specify which files you want included or excluded.

6. In the Common tab, check Update projects from source control.

Important: The project should be fully updated from the source control repository before the code review scan is performed. If you have an external system that updates resources before the code review scan, you do not need to enable the Update projects from source control option. If you do not have another system performing updates, we strongly recommend that you enable this option.

7. Click Apply to commit the new Test Configuration.
8. Test the Test Configuration by selecting one of your projects in the project tree, then running the Test Configuration.

Preparing Code for Review - Automated Scanner Execution

For post-commit code reviews, SOAtest is typically scheduled to run in command-line mode at a regularly scheduled interval (for example, daily). These runs execute the team's Code Review Test Configuration. Each time a run occurs, the Code Review scanner scans the designated source control repositories for new/modified code, then prepares any detected changes for review.

CLI Setup

1. If you have not already done so, set up your projects in the SOAtest UI.
2. In a local settings file:
   • Set your license
   • Enable source control scope computation
   • Provide information about your source control repositories
   • Configure access to Team Server
   • Configure mail server and mail options

Here is a sample local settings file:

# REPORTS
report.mail.enabled=true
report.mail.exclude.developers=false
report.developer_errors=false
report.developer_reports=true
report.suppressed_msgs=false
[email protected]
[email protected]
report.mail.server=mail.parasoft.com
report.mail.domain=parasoft.com
report.mail.from=nightly
report.mail.exclude=
report.mail.subject=SOAtest report for ${project_name} - codereview
report.test_params=true

# Team Server
tcm.server.enabled=true
tcm.server.name=main1.parasoft.com
tcm.server.port=18888
tcm.server.accountLogin=true
tcm.server.username=devel
tcm.server.password=mypass
report.tag=${project_name}-codereview

# SCOPE
scope.sourcecontrol=true
scope.author=false
scope.local=false

# LICENSE
soatest.license.use_network=true
soatest.license.network.host=main1ic.parasoft.com
soatest.license.network.port=2002
soatest.license.network.edition=server_edition

# SOURCE CONTROL
scontrol.rep1.type=cvs
scontrol.rep1.cvs.root=:pserver:[email protected]:/home/devel/cvs
scontrol.rep1.cvs.pass=password
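If your reviewed code lives in more than one repository, additional repositories are defined with additional numbered scontrol.repN entries. The following is an illustrative sketch only; the second repository's root path and password are placeholders, not defaults:

# Additional repository (illustrative values)
scontrol.rep2.type=cvs
scontrol.rep2.cvs.root=:pserver:[email protected]:/home/devel/cvs2
scontrol.rep2.cvs.pass=password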

CLI Execution

Here is a sample command for running the Code Review scanner in command-line mode:

soatestcli -publish -config "team://xtest-codereview.properties" -resource "my_resource" -localsettings C:\tmp\localsettings.properties
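To schedule the scan, you can use whatever scheduler is available on the server. For example, on a Unix-like server a cron entry along the following lines could run the scan nightly; the schedule, install path, and settings path are illustrative assumptions, not defaults:

# Illustrative crontab entry: run the post-commit Code Review scan every night at 1:00 AM
0 1 * * * /opt/parasoft/soatest/soatestcli -publish -config "team://xtest-codereview.properties" -resource "my_resource" -localsettings /home/devel/localsettings.properties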

After each Code Review scanner execution, the designated reviewers will be alerted that code is ready for review. The reviewers can perform the review as described in “Reviewers - Reviewing Code Modifications”, page 650. After the review is completed, the authors can respond as described in “Authors - Examining and Responding to Review Comments”, page 647.


Running Post-Commit Scans with a Pre-Commit Process

Some teams who submit code for review via a pre-commit process also like to perform a post-commit nightly scan to:

• Generate emails notifying authors and reviewers about their assigned code review tasks.
• Identify any code changes that were committed to source control, but were not submitted for review using the pre-commit procedure.

If your team chooses to use post-commit scans with a pre-commit process, you can:

• Run a post-commit scan on an empty workspace: This does not perform any additional scanning or report any additional tasks; it only generates emails related to the submitted "pre-commit" reviews (if the appropriate reporting options are set).
• Set up a workspace, then run a post-commit scan using -include to "filter in" and rescan only the critical parts of the project that you want scanned: This finds code changes in critical project areas that were not submitted for review in the pre-commit code review process and generates new tasks for them. Emails will be generated if you have the appropriate reporting options set. If you have the Generate comprehensive report (includes all scanners) option enabled in the Test Configuration's Code Review tab, then the report will include tasks from all code review scans (pre-commit and post-commit). Otherwise, it will include only tasks from the current post-commit scan.
• Set up a full workspace, and run a post-commit scan on all the code: This finds any code changes that were not submitted for review in the pre-commit code review process and generates new tasks for them. Emails will be generated if you have the appropriate reporting options set. If you have the Generate comprehensive report (includes all scanners) option enabled in the Test Configuration's Code Review tab, then the report will include tasks from all code review scans (pre-commit and post-commit). Otherwise, it will include only tasks from the current post-commit scan.

Also be aware that in "mixed" code review processes, the scanner will compare changes reported for pre-commit code review with changes committed to source control.

• If the changes are the same, the file was submitted for pre-commit code review and then committed to source control. In this case, the pre-commit process is being followed.
• If the changes are not the same, the file was modified but not submitted for code review. In this case, the pre-commit process is not being followed.

However, sometimes minor changes that do not require review are introduced into source control—for instance, CVS headers like:

/*
 * $RCSfile: MyFile.txt,v $
 * $Revision: 1.13 $
 *
 * Comments:
 * (C) Copyright Parasoft Corporation 1996. All rights reserved.
 * THIS IS UNPUBLISHED PROPRIETARY SOURCE CODE OF Parasoft
 * The copyright notice above does not evidence any actual or intended
 * publication of such source code.
 *
 * Revision 1.2  2006/02/03 10:07:28  dan
 * class repackaged
 *
 * Revision 1.1  2005/09/18 09:26:24  mark
 * new file
 */

In such cases, the Test Configuration for the nightly post-commit scan should specify a regular expression that describes the automatically-generated code. This is entered in the Post to Pre-Commit matching option in the Test Configuration’s Code Review> Filters tab. For example, to ignore changes in the above CVS header, you would enter (^ \* .*)|(^ \*$)

You can also set the Test Configuration to match (and then ignore) other changes that do not need to be reviewed. For example, the team might allow documentation changes to be committed directly to source control (without requiring review).
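For instance, a pattern along the following lines (an illustrative sketch, not a built-in default) would also treat comment-only lines, such as block-comment or // comment text typical of documentation-only edits, as changes that do not need review:

(^\s*\*.*)|(^\s*//.*)|(^\s*/\*.*)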


Working with the Code Review UI

This topic introduces how the Code Review UI presents code review tasks, provides an overview of how you can take action on your assigned tasks, and explains how to customize the UI to meet your specific preferences. Sections include:

• Introducing the Code Review UI
• Importing Code Review Tasks into the UI
• Understanding Task Status Indicators
• Taking Action on a Task or Set of Tasks
• Customizing the Code Review Tasks Tree

Introducing the Code Review UI

You can work with Code Review from the SOAtest perspective or the Code Review perspective. To open the Code Review perspective, perform one of the following actions:

• Click the Open Perspective button in the shortcut bar, choose Other, then choose Code Review in the Select Perspective dialog that opens.
• Choose Window> Open Perspective> Other, then choose Code Review in the Select Perspective dialog that opens.
• Click the Code Review button in the shortcut bar (on the top right of the workbench) if the perspective has been previously opened.

SOAtest provides the following views to facilitate code reviews:

• SOAtest view – In Code Review mode, this view displays the code review task tree. It presents two types of tasks: reviewers see "to review" tasks for code revisions they need to review, and authors see "to fix" tasks for responding to reviewer comments. This view should be open by default; to put it in Code Review mode, click the Code Review Tasks button in the view's toolbar.
• Code Review Issues – Allows team members to add and review code comments about the submitted code modifications. This allows the author and reviewer to have a conversation about how to revise a submitted revision. To access this view, choose SOAtest> Show View> Code Review Issues.
• Compare Editor – Highlights the differences between the most recent version of the file and the previous version that was stored under source control. This will open when you double-click a specific revision in the code review task tree.


Importing Code Review Tasks into the UI

Depending on your import settings, your code review tasks may be imported automatically when you import your quality tasks (as described in “Accessing Results and Reports”, page 234). You can always import your assigned code review tasks into the SOAtest view by choosing SOAtest> Import> [desired_import_option] or clicking the Import My Recommended Tasks toolbar button. If you choose to import a custom set of tasks, be sure to enable the Code Review option.

Understanding Task Status Indicators

Depending on how you configure the code review tasks tree, each code review task may be marked with a status indicator, as well as the name of the file, the revision version number, and the time when the latest revision was committed into source control.

The following table describes the various status indicators used:

• To Review – Indicates that a review should be performed on the revision package.
• To Fix – Indicates that some improvements should be made within the files included in the revision package.
• Monitor – Indicates that the designated monitor should review the status of files included in the revision packages.
• Waiting – Indicates that your revision package is waiting for someone's action.
• Done – Indicates that the task is completed and no additional action is needed. If you want such tasks to be shown, you need to set the Show completed tasks by option in the Preferences panel (see “Configuring Preferences”, page 622).

Taking Action on a Task or Set of Tasks

The main ways to take action on a task are to double-click a code review task tree node, or right-click it and choose the appropriate shortcut menu command. You can take action on a set of tasks (such as all code review tasks for a code review package), a single code review task (such as a single modification to review or reviewer comment to address), or anything in between.

Available Actions

Different shortcut menu commands are available depending on what code review task tree item you right-click and your specific role. Role-relevant commands are discussed in the topics for authors and reviewers. Generally, shortcut menus can be used to perform actions such as:

• Reassign a task to another team member.
• Remove a task.
• Navigate from one task to the next (or previous) task.
• Expand or collapse the tree.
• Change the tree layout.
• Mark a task as done, cancel tasks, or reject tasks.
• Open a compare editor that highlights the differences between the most recent version of the file and the previous version that was stored under source control.
• Add a new reviewer.
• Open comments that a reviewer added.

For example, a reviewer with such a task might choose Compare with Previous, review the code change in the compare editor, then add an issue in the Code Review Issues area. The author might then open that comment and respond to it in the Code Review Issues area.

Applying an Action to Multiple Tasks

If you right-click a code review task tree node that represents a group of items (for instance, all reviews for a specific file), you can use a single command to perform the same action on all appropriate items in that group. For example, you could mark all active code review tasks in a package as "done" with a single command on the package node.


Customizing the Code Review Tasks Tree

There are numerous ways to configure the code review tasks tree to suit your needs and preferences.

Layout Modifications

The main way is to use the Layout menu to configure specific elements to show or hide. This allows you to control not only which elements are shown, but also how they are organized. For instance, if you want a project-focused view, you can show the project node so that the tree is organized by project; reviewers might instead prefer to organize the tree by code author.

Label Decorations

To fine-tune what data is displayed in the various tree nodes that you choose to display, you can use the Preferences panel's Code Review controls to configure which labels are displayed.

Filters

Additionally, you can filter the content shown by clicking the Filter button in the SOAtest view, then specifying the desired filter conditions.


Authors - Examining and Responding to Review Comments

This topic explains how authors use Code Review to examine and respond to comments that a reviewer has added. Sections include:

• Typical Workflow
• Cancelling a Review
• Closing All Issues
• Adding a Reviewer

For details on how a developer prepares code for review in a "pre-commit" code review workflow, see “Configuring and Running Pre-Commit Code Review Scans”, page 628.

Typical Workflow

Authors use Code Review to review and respond to reviewer comments as follows:

1. Choose SOAtest> Show View> Code Review to open the Code Review view.
2. Refresh the Code Review view by clicking the Refresh button.
3. Double-click the node for the revision you want to review. All files belonging to it are listed within that node.
4. Choose SOAtest> Show View> Code Review Issues to open the Code Review Issues view.
5. Read the available reviewer comments (select the comment's table row to display additional details).
6. Modify the source code as needed in your editor.
7. Add your comments on the issue (your response to the reviewer).
   • To add a new comment on a specific line of code, right-click that line of code in the Compare panel, choose Add Code Review Issue, then add your comment. Code Review automatically adds the file path to your comments.
   • To add a new general comment regarding that set of revisions, right-click the table in the Code Review Issues view, choose Add General Thread, then add your comment.
   • To add an additional comment for the selected issue, enter it in the Add New Comment to Current Issue text field.


8. Click the appropriate action button in the toolbar:
   • To Review – Choose this action to have your comments and revision reviewed. The revision is then reassigned to the designated reviewer.
   • Done – Choose this action if no additional action is required (for example, you have implemented a reviewer's suggestion, and the reviewer does not want to review the modification). The issue will be considered "closed" after this action is committed. If your Code Review is set up for a restricted workflow, this option will not be available.

Warning: Ensure that you did not overlook any review comments associated with the selected revision node. After you change the status, the revision node will be removed from the repository, and you will not be able to add any additional comments.

9. If any past issues were accepted by the reviewer, close them by clicking the Close button.
10. After you have reviewed all comments, click the Commit Review button at the top of the Code Review Issues view. A dialog box will open and ask if you want to mark the review DONE. Click OK.
11. Click the Refresh button in the upper-right corner of the repository panel. The revision node disappears from the repository tree or changes its status to 'waiting'.

Cancelling a Review


If you want to cancel a review (e.g., if you mistakenly committed a package for review or want to make additional changes prior to the review), right-click the package in the Code Review view, then choose Cancel from the shortcut menu.

Closing All Issues

If you want to close all issues after reviewers have commented on them, choose Close All from the menu in the upper right corner of the Code Review Issues view.

Adding a Reviewer

If you want to add another reviewer (e.g., for a second opinion), right-click the package in the Code Review view, then choose Add Reviewer from the shortcut menu.


Reviewers - Reviewing Code Modifications

This topic explains how reviewers use Code Review to examine authors' code, then either accept it or request additional changes. Sections include:

• Typical Workflow
• Rejecting a Review
• Adding or Changing Assigned Reviewers

Typical Workflow

Reviewers use Code Review as follows:

1. Choose SOAtest> Show View> Code Review to open the Code Review view.
2. Refresh the Code Review view by clicking the Refresh button.
3. Double-click the To Review node for the revision you want to review. This node includes new and modified files by the specified author.
4. Double-click any file node to open the selected revision in the Compare panel. The source of the most recent file revision is displayed on the left side of the Compare panel. The previous revision of that file is displayed on the right. Code Review automatically compares those two revisions and highlights the recent changes. The indicators in the right gutter show how many changes were made to the selected file and allow you to easily skip from one change to another.
   • If your source control repository was not previously specified, a configuration dialog will open after you double-click the file node. You must complete this before the code can be opened in the Compare panel.
5. Examine the source file.
6. Choose SOAtest> Show View> Code Review Issues to open the Code Review Issues view.
7. Add your comments on the code (your response to the author).
   • To add a comment on a specific line of code, right-click that line of code in the Compare panel, choose Add Code Review Issue from the shortcut menu, then add your comment. Code Review automatically adds the file path to your comments.
   • To add a general comment regarding that set of revisions, right-click the table in the Code Review Issues view, choose Add General Thread, then add your comment.
   • To add an additional comment for an existing (selected) issue, enter it in the Add New Comment to Current Issue text field.
8. Choose the category that best describes your comment.
9. Indicate the severity of the issue you are raising.


10. Click the appropriate action button in the toolbar:
   • To Fix – Choose this action if the revision needs further modifications. The request for modification is then assigned to the author.
   • Accept – Choose this action if the revision does not require further action. If your team is following the default workflow policy, the issue will be considered "closed" after this action is committed. Otherwise, it will be sent to the author, who can then close the revision.

Warning: Ensure that you have verified and commented on all file subnodes included in the selected revision node. After you commit the revision, the revision node will be removed from the repository (or its status will change to "waiting"), and you will not be able to add any additional comments or verify any files associated with that revision node.

11. After you have verified all files included in the selected revision and commented on them all, click the Commit Review button at the top of the Code Review Issues view. A dialog box will open and ask if you want to mark the review DONE. Click OK.
12. Click the Refresh button in the upper-right corner of the repository panel. The revision node disappears from the repository tree or changes its status to 'waiting'.

Rejecting a Review

If you want to cancel a review (e.g., if you want the code for the submitted change to be reverted to its original state), right-click the package in the Code Review view, then choose Reject from the shortcut menu.


Adding or Changing Assigned Reviewers

If you want to add another reviewer (e.g., for a second opinion), right-click the package in the Code Review view, then choose Add Reviewer from the shortcut menu. If you want to reassign the review to another reviewer, right-click the package in the Code Review view, then choose Reassign from the shortcut menu.


Monitors - Overseeing the Review Process

This topic explains how the monitor (an optional role) oversees the code review process. Sections include:

• Typical Workflow
• Closing an Issue
• Adding or Changing Assigned Reviewers

Typical Workflow

The monitor's task is similar to the reviewer's, except that the monitor doesn't have to comment on revisions. The monitor is responsible for supervising the revision process. However, the monitor has the ability to comment on a revision (for example, to give developers additional tips). The monitor may also close all issues within the revision package. Although the author is responsible for closing his issues, the monitor may do so if needed (for instance, if the author is absent or the revision is obsolete). Once an issue is closed, the file will not be scanned by the scanner and reported by Code Review—unless new lines are added to the source code of the file.

Closing an Issue

If you want to close one or more issues, select the issues you want to close in the Code Review view, right-click, then choose Close from the shortcut menu.

Adding or Changing Assigned Reviewers

If you want to add another reviewer (e.g., for a second opinion), right-click the package in the Code Review view, then choose Add Reviewer from the shortcut menu.


If you want to reassign the review to another reviewer, right-click the package in the Code Review view, then choose Reassign from the shortcut menu.


Code Review Tips and Tricks

This topic provides tips on using Code Review effectively.

Setup Tips

• If your Parasoft solution uses Team Server Named Accounts, ensure that Team Server user accounts can access the Team Server 'Code Review' directory. For details on opening the appropriate path permissions, see the 'Named Accounts' section of the PST Admin Guide.
• Ensure that reviewing responsibilities are distributed across team members. If one person is responsible for reviewing a large amount of code, the code review may become overwhelming.

Author Tips

• For post-commit code reviews - When you check in code to the source control repository, enter comments that will help the reviewer quickly determine the purpose and nature of the change.

Reviewer Tips

• Review code daily to prevent the reviews from accumulating. If reviews are not addressed regularly, code review will become an overwhelming task.
• For post-commit code reviews - Read the developer's check-in comments (available in a tool tip over the revision branch) before reviewing each revision, then let this inform how deeply you review the code. For instance, you might want to dedicate more time to inspecting a revision for a re-implemented algorithm than for inspecting a revision for a minor cosmetic change.

Platform Support and Integrations

In this section:

• Using AmberPoint Management System with SOAtest
• Using Oracle/BEA with SOAtest
• Using HP with SOAtest
• Using JMS with SOAtest
• Using Microsoft with SOAtest
• Using IBM/Rational with SOAtest
• Using Software AG CentraSite Active SOA with SOAtest
• Using Software AG webMethods with SOAtest
• Using Sonic with SOAtest
• Using TIBCO with SOAtest


Using AmberPoint Management System with SOAtest

One of the greatest challenges of SOA quality is how to validate realistic transactions involving distributed services that may not be available for testing. Consider the example of a services-based account provisioning system that's losing orders. To solve this problem, the development team has to test the fixed application in a safe sandbox that replicates the transaction flows of the production environment, or risk breaking the system upon redeploying the application. Parasoft and AmberPoint deliver an integrated solution that helps teams overcome this challenge by automatically emulating, or virtualizing, services based on real-world historical data collected from the runtime environment. With this baseline established, teams can exercise distributed services in context, without impacting partners' normal business transactions. Users of AmberPoint Management System can export their runtime message sets or runtime validation baselines in the production environment, then provide this information to Parasoft SOAtest in order to create tests that can replay the SOAP messages. Users also have the option of establishing the captured response messages as the regression control within the generated tests. For details, see “Creating Tests From AmberPoint Management System”, page 391.


Using Oracle/BEA with SOAtest

SOAtest associates SOA artifacts with the OER (Oracle Enterprise Repository), previously known as AquaLogic Enterprise Repository (ALER). This integration enables teams to automatically execute a quality workflow and correlate quality data in the context of an SOA Governance initiative. SOAtest can automatically generate tests at the time the services are published to the OER registry—including functional test cases and WSDL verification tests to ensure WSDLs are compliant with best practices and organizational policies. Policy compliance results are then reported back to the registry and updated in real time. This provides continuous visibility into a service's quality throughout its lifecycle. For details on this functionality, see “Creating Tests From Oracle Enterprise Repository / BEA AquaLogic Repository”, page 393.

SOAtest tests JMS, EJB, and Web services on WebLogic (including WebLogic-based WS-Security policies), as well as JMS, EJB, and Web services on Oracle Fusion Middleware and other Oracle packaged application solutions.

For database validation, SOAtest supports Oracle databases, including PL/SQL. This allows data to be set, reset, and updated in a database as part of end-to-end scenario test executions—making scenario tests repeatable for automated execution, and allowing SQL queries to be executed in order to validate database content at select steps of your business transactions. Data may also be leveraged to parameterize and drive your tests. See “DB”, page 911 for details.


Using HP with SOAtest

SOAtest test cases can be mapped to specific requirements in HP (Mercury) Quality Center. Once a test case is mapped to a requirement, the testing of the requirement becomes as simple as clicking a button inside Quality Center. Parasoft test cases can then be run from inside Quality Center and the results will be reported back to Quality Center. Furthermore, HP SiteScope can complement the Parasoft SOA Quality Solution: Parasoft helps create and generate load tests while SiteScope monitors system internals for deeper performance analysis and issue resolution.

Using SOAtest with Quality Center

The following sections describe how to integrate Parasoft SOAtest with HP Quality Center. Sections include:

• Installing Microsoft SOAP Toolkit SDK
• Preparing SOAtest for use with Quality Center
• Creating a Test in the Test Plan Module
• Modifying the Script
• Running a Test in the Test Lab Module
• Understanding Test Results
• Configuring Quality Center Integration Features

Installing Microsoft SOAP Toolkit SDK

The Microsoft SOAP Toolkit SDK must be installed in order to use SOAtest with Quality Center. Quality Center needs this SDK in order for test scripts to be able to invoke web services. The installer can be found on the Quality Center CD-ROM or can be downloaded from the Microsoft Download Center (http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=c943c0dd-ceec-4088-9753-86f052ec8450).

Preparing SOAtest for use with Quality Center

Quality Center can run SOAtest test suites using the SOAtest web service interface. This means that SOAtest does not necessarily have to be installed on the same machine as Quality Center. The SOAtest web service is enabled by starting SOAtest in web service mode. The machine running SOAtest in web service mode must have access to a workspace containing the SOAtest test suites (.tst files) that you wish to run using Quality Center. Web service mode is available for the Server Edition of SOAtest. For details about running SOAtest in web service mode, see “Testing from the Web Service Interface”, page 287.

Note: Quality Center can also be configured to run tests using the SOAtest command line interface. This is the default integration method used in older versions of SOAtest. This integration method is still available for legacy reasons, but is now deprecated.

Creating a Test in the Test Plan Module

1. After logging in to Quality Center, select the Test Plan module. The Test Plan module allows you to set up your automated test and custom scripts for Quality Center.
2. In the Test Plan Tree, select Subject and click the New Folder toolbar button.

3. In the new dialog box enter SOAtest Demo for the Folder Name.

4. Select the newly created folder and click the New Test icon from the above toolbar.


5. In the Create New Test dialog box, select VAPI-XP-TEST in the Test Type drop-down box, enter SOA Demo for the Test Name, then click OK.

6. In the HP VAPI-XP Wizard, select JavaScript as the Script Language, enter SOAtest for Script Name, then click Finish.
7. Select the Test Script tab. Notice that a template has been created from the HP VAPI-XP wizard.
8. Open the customized SOAtest script HPQualityCenter.txt located in examples/scripts within the SOAtest install directory. Note that the script assumes the default install location is c:/Progra~1/Parasoft/SOAtest/[SOAtest version number]/. The script is configured to run a test suite called HPQualityCenter.tst within the examples/tests directory.
9. Copy the contents of HPQualityCenter.txt and use them to replace the template.
10. Save the script.

Modifying the Script

The script assumes the following:

• useWebService: true
• SOAtestServerHost: localhost
• testSuite: "HPQualityCenter/HPQualityCenter.tst" inside a project called HPQualityCenter in the workspace
• xmlReport: "C:/Progra~1/Parasoft/SOAtest/6.x/examples/reports/HPQualityCenter.xml"
• htmlReport: ""
• detailedReporting: true

SOAtestServerHost and testSuite should be modified as needed. SOAtestServerHost should be set to the hostname or IP address of the machine running SOAtest in web service mode. testSuite is the relative path to your SOAtest project file from the workspace that the SOAtest web service is using.
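For example, the variable block at the top of the script might be adjusted along these lines; the host name, project path, and report path below are illustrative placeholders, not defaults:

var useWebService = true;
var SOAtestServerHost = "soatest-server.example.com";  // machine running SOAtest in web service mode (placeholder)
var testSuite = "MyProject/MyTests.tst";                // workspace-relative path to your .tst file (placeholder)
var xmlReport = "C:/reports/MyTests.xml";               // XML results file (placeholder)
var htmlReport = "";                                    // leave empty to skip the HTML report
var detailedReporting = true;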

Running a Test in the Test Lab Module

1. Select the Test Lab module.
2. Create a new folder, SOAtest Demo, from the Root node.
3. Add a new test set, SOAtest, to the SOAtest Demo folder.
4. Find the test SOA Demo that was created from Test Plan, on the right hand side under Test Plan Tree, and add it to the new test set.
5. Once the test is added, select and run the test. This assumes that SOAtest is installed locally. If unchecked, check the Run All Tests Locally checkbox and click the Run All or Run button.


Understanding Test Results

Each test run will have a status column indicating the overall success of your SOAtest test suite. Each test run will also have a list of run steps which correspond to each test that was executed within the SOAtest test suite. The status column for the run step will indicate if that test passed or failed. After a test run is completed, the Design Steps for the test are also updated to reflect the test cases that were run. The design steps can be viewed from the test's Design Steps tab in the Test Plan module.

Detailed information about a test can be found by looking at the run step's Description field, which contains the following:

• Test Information: Includes test name, start time, stop time, test duration, and status. If it is a SOAP Client test, the asset (WSDL URI) and operation name will also be listed.
• Traffic: Only available for traffic-generating tests. Will show response time, the request message, and response message.
• Error Source, Error Summary, Error Detail: These fields will only appear if the test failed. The Error Source will include the name of the tool that reported the error and the corresponding data source column, if applicable. The Error Summary and Error Detail sections show the brief and detailed messages corresponding to the error.


Each run step also has Expected and Actual boxes that are used for storing the expected test result and actual test result. If a test in your SOAtest project failed because of a diff or regression failure, these boxes will be populated with the expected input and actual input received by the Diff tool that reported the error.

Configuring Quality Center Integration Features

The integration script HPQualityCenter.txt (located in examples/scripts within the SOAtest install directory) is used in your test's Test Script tab in order to tell SOAtest to run your project file and report the test results back to Quality Center. Several variables located near the top of the script are used to configure the script's behavior. There are slight differences for these variables in SOAtest 5.5.x and SOAtest 6.x.

SOAtest 5.5.x Variables

• SOAtestExe: Full path to soatestcli.exe in your SOAtest installation location. By default, this variable is set to the default install location. For most users this variable is correct and would not need to be modified.
• testSuite: Full path to your SOAtest project file. By default, this variable is set to HPQualityCenter.tst (located in examples/tests within the SOAtest install directory). You will need to set this variable to the full path of the SOAtest project that you want SOAtest to run.
• xmlReport: Full path to the XML report file which will store the test results. This file will be created by SOAtest after each test run.
• htmlReport: Full path to the HTML report file (optional). Similar to the xmlReport variable, this file will be created by SOAtest after each test run. If you do not wish for SOAtest to create an HTML report, set this variable to the empty string "" (default). If created, the HTML report will be attached to the test run and can be accessed directly from within Quality Center.
• detailedReporting: Boolean variable to control the level of reporting. If set to true (default), SOAtest will report detailed information for all tests that ran in the SOAtest project. If false, only test failures will be reported.
• additionalArgs: Additional command line arguments sent to SOAtest. For example, set this to "-testName Custom" to only run the test named Custom. Other advanced command line features can be used in a similar way.

SOAtest 6.x Variables

Variable: useWebService
Description: Determines if tests will be run using SOAtest's web service interface. If false, tests will be run using the SOAtest command line interface.
Example: var useWebService = true;

Variable: SOAtestServerHost
Description: The host name or IP address of the machine running SOAtest in server mode.
Example: var SOAtestServerHost = "localhost";

Variable: testSuite
Description: Relative path to your SOAtest project file from the workspace.
Example: var testSuite = "HPQualityCenter/HPQualityCenter.tst"; (assumes that the test suite is inside a project called HPQualityCenter in the workspace)

Variable: xmlReport
Description: Full path to the XML report file which will store the test results. This file is created after each test run.
Example: var xmlReport = "C:/Progra~1/Parasoft/SOAtest/6.x/examples/reports/HPQualityCenter.xml";

Variable: htmlReport
Description: Full path for the optional HTML report file. It cannot have the same name as the XML file. This can be disabled by leaving it as an empty string: var htmlReport = "";
Example: var htmlReport = "c:/Progra~1/Parasoft/SOAtest/6.x/examples/reports/HPQualityCenter2.html";

Variable: detailedReporting
Description: Controls the level of reporting. If true, test results will show detailed information about each test run. If false, only information about test failures will be reported.
Example: var detailedReporting = true;

Variable: SOAtestExe
Description: Full path to soatestcli.exe in your SOAtest installation location. This variable is deprecated and only applicable when useWebService is false.
Example: var SOAtestExe = "C:/Progra~1/Parasoft/SOAtest/6.x/soatestcli.exe";

Variable: workspaceLocation
Description: Full path to the workspace. Use "" to specify the default workspace location. This variable is deprecated and only applicable when useWebService is false.
Example: var workspaceLocation = "C:/Users/name/soatest/workspace"; or var workspaceLocation = "C:/Docume~1/soatest-workspace/";

Variable: additionalArgs
Description: Passes additional command line arguments to SOAtest. For example, set this to "-testName Custom" to only run the test named Custom. To run the entire test suite, leave this as the empty string: var additionalArgs = "";. This variable is deprecated and only applicable when useWebService is false.
Example: var additionalArgs = "-testName Custom";


Using JMS with SOAtest

SOAtest provides extensive support for generating and consuming JMS messages, and simulating various patterns, including point-to-point and publish-and-subscribe patterns. This allows for end-to-end testing and validation of messaging systems. See “Using JMS”, page 694 for details.

Functional tests can be automatically generated from monitoring the transaction messages that touch JMS endpoints in ESBs or middleware systems. This functionality is described in “Creating Tests From JMS System Transactions”, page 401.

You can also visualize and trace the intra-process JMS messages that take place as part of the transactions that are triggered by the tests, and then dissect them for validation. In addition to providing visibility into the system's intermediate messages, this allows you to replay transactions directly from SOAtest and verify that the monitored functionality continues to work as expected. As a result, test engineers gain the ability to identify problem causes and validate multi-endpoint, integrated transaction systems that have been traditionally handled only by specialized development teams. For details, see “Monitoring Other JMS Systems”, page 512.


Using Microsoft with SOAtest

Microsoft .NET WCF (Windows Communication Foundation) allows for the creation of rich Web services. However, due to proprietary service bindings (protocols) and complexities within the WS-* standards that it supports, .NET WCF poses testing challenges. Given this, the Microsoft environment truly necessitates a "SOA-Aware" and "WCF-Aware" testing solution such as the Parasoft SOA Quality Solution. For details, see “Using .NET WCF HTTP”, page 730, “Using .NET WCF TCP”, page 726, and “Using .NET WCF Flowed Transactions”, page 734.

In addition, the Parasoft SOA Quality Solution integrates with Microsoft Visual Studio Team System (VSTS) Edition for Testers. Integration with MS VSTS allows for the management and execution of test projects, and obtaining results directly within Visual Studio, thereby streamlining SOA testing for users of VSTS within their native Visual Studio environment. Microsoft Visual Studio 2005 Team Edition for Software Testers is part of the Visual Studio Team System (VSTS) platform. It allows for creating, managing, sharing, and executing tests within Visual Studio. Parasoft SOAtest includes the capability to integrate with MS VSTS so that SOAtest functional tests can be managed and executed within the Visual Studio IDE and the results of execution are displayed directly within Visual Studio. This enables teams that use the VSTS platform to integrate SOAtest into their process in order to streamline their joint Microsoft and Parasoft solutions, and to further the return on investments made in the existing infrastructure.

Using SOAtest with Microsoft Visual Studio

The following sections describe how to integrate Parasoft SOAtest with MS VSTS for Software Testers. These instructions were written for VSTS 2005. Sections include:
• Creating a Test
• Executing Tests
• Viewing the Results

Creating a Test

This section describes the process of creating a new VSTS test case within Visual Studio and associating it with an existing Parasoft SOAtest project (.tst) file. To create a new VSTS test case:

1. Create a new Visual Studio test project or open an existing one. To create a new test project in Visual Studio:
   a. Choose File> New> Test Project...
   b. Select Test Projects under the Project types tree on the left.


2. Create a new test by selecting Test> New Test...


3. Select Generic Test from the Templates field and then enter a name in the Test Name field.

4. Configure the test with an existing Parasoft SOAtest project file (.tst file):
   a. In the existing program field at the top, specify the path to the SOAtest executable. For example:
      C:\Program Files\Parasoft\SOAtest\[soatest version number]\soatest.exe
   b. Under Run settings, in the command line arguments field, specify the following arguments:
      -config "user://Example Configuration" -visualStudio xmlReport -resource testSuite -report htmlReport
      For example:
      -config "user://Example Configuration" -visualStudio C:\report\calculator.xml -resource calculator_project\calculator.tst -report C:\report\calculator.xml.details.html
      The example above works for calculator.tst in a project called calculator_project in the workspace. The HTML file name should be the XML file name plus .details.html.
   c. In the Additional files to deploy with this generic test: field, browse to and add your SOAtest project file.
   d. Under the Results Settings section, select the Summary results file option and specify the same XML results file name used in the argument added in step b above.


e. Save the test configuration (press CTRL+S, or select File> Save).

Executing Tests

Now that the Visual Studio generic test has been associated with a SOAtest project file, you can run it. You can do so by running the entire test project from the toolbar, or by right-clicking that particular test in the Test View and selecting Run Selection.

Viewing the Results

Once execution of one or more SOAtest tests finishes, a summary of results is shown in the Test Results window within Visual Studio.


Notice the results count (2/4 in the figure above). This means that 2 of the 4 SOAtest test cases have passed. A test case could be a SOAP Client test, or another tool test within a SOAtest test suite. To view details, double-click the row for the test (the test name in the figure above). The Inner Test Results area displays pass/fail results for each SOAtest test case along with the error message. The Summary File area shows an HTML-rendered view of the execution report, similar to the report that SOAtest can generate in a stand-alone execution.


Reporting Results from the Command Line

You can also report results to VSTS using the -visualStudio CLI option. For example:

soatestcli.exe -config "team://MyConfiguration" -visualStudio -resource CLI_project\secondtest.tst


Using IBM/Rational with SOAtest

SOAtest test cases can be mapped to specific requirements or test plans in Rational TestManager. Once a test case is mapped to a test plan, testing the requirement becomes as simple as pressing a button inside TestManager. SOAtest test cases can be run from inside Rational TestManager, and the results are reported back to TestManager.

Additionally, the Parasoft SOA Quality Solution provides native support for the IBM WebSphere MQ API; tests JMS, EJB, and Web services on WebSphere; and provides WebSphere JMX performance monitoring, which exposes the metrics typically available on WebSphere Tivoli. For WebSphere ESB, you can also visualize and trace the intra-process events that take place as part of the transactions triggered by the tests, and then dissect them for validation. In addition to providing visibility into the system’s intermediate messages, this allows you to replay transactions directly from the solution and verify that the monitored functionality continues to work as expected. As a result, test engineers gain the ability to identify problem causes and validate multi-endpoint, integrated transaction systems that have traditionally been handled only by specialized development teams. For details, see “Monitoring IBM WebSphere ESB”, page 500.

Using SOAtest with Rational TestManager

The following sections describe how to integrate SOAtest into Rational TestManager. First, SOAtest must be configured to have access to various TestManager jar files. Second, a reusable test script is created. Third, test plans are created using the reusable test scripts. Sections include:
• Configuring the SOAtest Classpath
• Configuring a Reusable SOAtest Test Script in TestManager
  • Creating a new script using the command line execution adapter
  • Creating a test plan using an existing test script

Configuring the SOAtest Classpath

Before you start, make sure to add all required JAR files to the classpath. Required JAR files include rtjavatestserver.jar, rttssjava.jar, rttseajava.jar, and rational_ct.jar. You can find the JAR files in the following default locations:
E:\Program Files\Rational\Rational Test\rtjavatestserver.jar
E:\Program Files\Rational\Rational Test\rttssjava.jar
E:\Program Files\Rational\Rational Test\tsea\rttseajava.jar
E:\Program Files\Rational\Rational Test\QualityArchitect\rational_ct.jar

There are two ways you can add JAR files:
• Add JAR files to SOAtest’s classpath (recommended): For more information on adding JAR files to SOAtest’s classpath, see “System Properties Settings”, page 758.
• Add JAR files to the system classpath:
  • For Windows:
    1. Right-click My Computer and select Properties from the shortcut menu.
    2. Select the Advanced tab and click the Environment Variables button.




      • If a CLASSPATH variable is not yet created, click the New button (either for User variables or System variables) and enter CLASSPATH for the Variable name and ;<path to the jar files> for the Variable value.
      • If a CLASSPATH variable is already created, double-click the CLASSPATH variable and append ;<path to the jar files> to its value.
    3. Click OK.
  • For UNIX: Refer to http://java.sun.com/j2se/1.3/docs/tooldocs/solaris/classpath.html for instructions on adding the JAR files to the classpath.

Configuring a Reusable SOAtest Test Script in TestManager

Creating a new script using the command line execution adapter

1. In TestManager, select Tools> Manage> Test Script Types.
2. Click New to create a new Test Script Type, then enter a Name.
3. In the Execute Adapter Type tab, choose the Use command line execution adapter radio button and manually enter the following in the Execution command line field:
   <path to SOAtest's executable> -config configName -resource relPathToTest [testname {testname}] -testManagerVerbose
   For example:
   "C:\Program Files\Parasoft\SOAtest\[SOAtest version number]\soatestcli.exe" -config "user://Example Configuration" -resource myProject/myTest.tst -testManagerVerbose



For additional command line options, see “Testing from the Command Line Interface (soatestcli)”, page 257.

4. Select the Sources tab in the Test Script Type Properties dialog and click the Insert button.


5. After clicking Insert, a New Test Script Source dialog opens. Enter a Name in the General tab for the new Test Script Source, then select the Connection Data tab.

6. In the Connection Data tab, select the data path to a folder that contains the SOAtest test suites (.tst files), then click OK.


7. After clicking OK in the Connection Data tab, the new source appears in the Test Script Type Properties dialog. Click OK to finish.

Creating a test plan using an existing test script

1. In the left GUI panel, select the Planning tab, right-click the Test Plans node, and select New Test Plan from the shortcut menu. The New Test Plan dialog displays.


2. In the New Test Plan dialog, enter a Name for the test plan in the General tab and click OK. The newly created test plan displays in the left GUI panel under the Test Plans node.

3. Double-click the newly created test plan to open it. Right-click the test plan and select Insert Test Case Folder from the shortcut menu. A New Test Case Folder dialog displays.


4. In the New Test Case Folder dialog, enter a Name and click OK. The new test case folder displays in the test plan.

5. Right-click the new test case folder and select Insert Test Case from the shortcut menu.

6. In the Implementation tab of the New Test Case dialog, click the Select button for automated implementation. Choose the desired test script from the drop-down menu (in this case, the test script created previously in “Creating a new script using the command line execution adapter”, page 673).


7. TestManager now prompts you to select a script file; browse to and select any SOAtest project file (.tst) you would like to test.

8. Click the Test Script Options button.


9. Type testname as the Option Name. Type the name of the test you want to run as the Option Value. In the following example, the variable 'testname' was mapped to the value 'Method'. If you included the -testname {testname} option in the command line execution adapter, SOAtest will only run the test named 'Method'.


10. In the Test Plan dialog, right-click the new test case and choose Run from the shortcut menu. A Run Test Cases dialog displays.

11. Click OK. TestManager will now run the test script with the project file selected in step 7. If the Test Suite succeeds, you should see the results in the Test Log.


12. Click the Details tab in the Test Log window and expand the nodes to view the properties of the User Defined node. SOAtest reports the results as a user-defined event.


Using Software AG CentraSite Active SOA with SOAtest

SOAtest associates SOA artifacts with the Software AG CentraSite Active SOA registry. This integration enables teams to automatically execute a quality workflow and correlate quality data in the context of an SOA governance initiative. SOAtest can automatically generate tests at the time services are published to the Software AG CentraSite Active SOA registry, including functional test cases and WSDL verification tests that ensure WSDLs are compliant with best practices and organizational policies. Policy compliance and functional test results are then reported back to the registry and updated in real time. This provides continuous visibility into a service's quality throughout its lifecycle.

Test Creation

SOAtest enables users to create tests that enforce policies applied to Web service assets that are declared in a CentraSite Active SOA repository. Users can select a Web service asset and choose the desired policies to enforce. For details, see “Creating Tests From Software AG CentraSite Active SOA”, page 399.

Reporting Test Execution Results to CentraSite Active SOA

After running the test suite with the selected CentraSite policies, you can have instant access to quality data associated with the assets in CentraSite. To configure SOAtest to send results:

1. Configure the SOA registry settings as follows:
   a. Choose SOAtest> Preferences.
   b. Open the SOA Registry page.
   c. Indicate the CentraSite URL and username/password.

2. Enable reporting as follows:
   • In the GUI: In the SOAtest> Preferences> SOA Registry page, enable Send results to CentraSite.
   • From the command line: Add the -CentraSite CLI option to your SOAtest command line invocation. For example:
     ./soatestcli -data "C:\My Workspace location" -config "user://Example Configuration" -resource MyProject/mytest.tst -report "C:\directory to save HTML report" -CentraSite

3. Configure test suite reporting options as follows:
   a. Double-click the root node of the test suite for which you want to configure CentraSite reporting.
   b. In the Test Suite configuration panel that opens, open the Reporting Options tab.
   c. Choose CentraSite from the Reporting Options box.
   d. Enter the UDDI service key so that SOAtest knows which service this .tst file is associated with (so it can correlate execution results with the appropriate service). This value is the key value found in the Technical Details tab of the CentraSite Control Web UI for that particular service.


With this configuration, SOAtest will automatically report the results to CentraSite after test execution is finished. If you run an individual test or a test suite that is not the top-most root test suite, then the results will not be sent to the registry (because SOAtest assumes you are in the process of configuring the tests). SOAtest sends only full .tst execution results in order to avoid sending partial or incomplete results.

To view the data that is reported from SOAtest to CentraSite:

1. Open CentraSite.
2. Within the CentraSite interface, click the Service node in the left pane labeled My CentraSite, double-click the name of the desired service in the right pane, and then select the SOA Test Status tab. A summary of the tests run displays.


3. Click the Detailed Report link at the bottom of the SOA Test Status page to view test execution details.


Using Software AG webMethods with SOAtest

SOAtest facilitates the testing of Web services built on the webMethods platform. Our service emulation capability can automatically generate and deploy stubs that replace specific application components that you do not want to test (or cannot test). This enables you to verify specific components in isolation, as well as reduce the complexity of the test environment. Moreover, SOAtest’s broad protocol support provides teams an integrated environment for performing comprehensive testing of heterogeneous composite applications.

In addition to supporting JMS and SOAP/XML messaging, functional tests can easily be configured to publish and send BrokerEvent objects to Software AG webMethods Broker, as well as subscribe to events and invoke native webMethods Integration Server services. SOAtest introspects Broker and Integration Server to automatically configure tests, and visualizes the incoming and outgoing content. This allows otherwise developer-intensive test activities to be performed visually and without scripting on the webMethods APIs, and with rich validation and data parameterization capabilities.

You can also visualize and trace the Broker events that take place as part of the transactions triggered by the webMethods tests, and then dissect them for validation. As a result, test engineers gain the ability to identify problem causes and validate multi-endpoint, integrated transaction systems that have traditionally been handled only by specialized development teams.

For details, see:
• “webMethods”, page 833
• “Creating Tests From JMS System Transactions”, page 401
• “Monitoring Software AG webMethods Broker”, page 505
• “Monitoring Other JMS Systems”, page 512


Using Sonic with SOAtest

SOAtest provides native support for SonicMQ as well as JMS and SOAP/XML messaging. See “Using SonicMQ”, page 706 for details. Functional tests can be automatically generated by monitoring the transaction messages that touch JMS endpoints in Sonic ESB. This functionality is described in “Creating Tests From Sonic ESB Transactions”, page 404. You can also visualize and trace the intra-process events that take place as part of the transactions triggered by the tests, and then dissect them for validation. In addition to providing visibility into the system’s intermediate messages, this allows you to replay transactions directly from the solution and verify that the monitored functionality continues to work as expected. As a result, test engineers gain the ability to identify problem causes and validate multi-endpoint, integrated transaction systems that have traditionally been handled only by specialized development teams. For details, see “Monitoring Sonic ESB”, page 508.

JMS Messaging without JNDI

JMS messaging without JNDI is supported for Sonic. For details, see “JMS Messaging without JNDI”, page 704.


Using TIBCO with SOAtest

The Parasoft SOA Quality Solution provides native support for TIBCO Rendezvous as well as JMS and SOAP/XML messaging. This is described in “Using TIBCO Rendezvous”, page 720. Functional tests can be automatically generated by monitoring the transaction messages that touch JMS endpoints in TIBCO EMS. This functionality is described in “Creating Tests From TIBCO EMS Transactions”, page 407. You can also visualize and trace the messages that pass through EMS as part of the transactions triggered by the tests, and then dissect them for validation. In addition to providing visibility into the system’s intermediate messages, this allows you to replay transactions directly from the solution and verify that the monitored functionality continues to work as expected. As a result, test engineers gain the ability to identify problem causes and validate multi-endpoint, integrated transaction systems that have traditionally been handled only by specialized development teams. For details, see “Monitoring TIBCO EMS”, page 510.

JMS Messaging without JNDI

JMS messaging without JNDI is supported for TIBCO. For details, see “JMS Messaging without JNDI”, page 704.


Testing Through Different Protocols

In this section:
• Using HTTP 1.0
• Using HTTP 1.1
• Using JMS
• Using IBM WebSphere MQ
• Using TIBCO Rendezvous
• Using RMI
• Using SMTP
• Testing a CORBA Server
• Using .NET WCF TCP
• Using .NET WCF HTTP
• Using .NET WCF Flowed Transactions


Using HTTP 1.0

When selecting HTTP 1.0 as the transport protocol in SOAtest, you can specify whether you want the client’s requests to use Keep-Alive connections; if so, the connection will also be reused within a single invocation of a test suite from the GUI or the command line. You can add, modify, and remove custom HTTP headers for the SOAP request from within the Transport tab of a SOAP Client or Messaging Client tool.

Configuring SOAtest to use HTTP 1.0

After selecting HTTP 1.0 from the Transport drop-down menu within the Transport tab of a SOAP Client or Messaging Client tool, the following options display in the left pane of the Transport tab:
• General
• URL Parameters (Messaging Client Only)
• Security
• HTTP Headers
• Cookies

General

When selecting General from the left pane of the HTTP 1.0 Transport panel, the following options are available:
• Endpoint: The endpoint is the URL of the web service endpoint. By default, SOAP Client endpoints are set to the endpoint defined in the WSDL. Besides WSDL, there are three other endpoint options:
  • Default: When this option is selected, the endpoint will be the endpoint defined in the Test Suite that contains the SOAP Client test. To see the GUI for the endpoint defined in the Test Suite, click the Test Suite node and then click the SOAP Client Options tab.
  • Custom: Allows you to set any custom endpoint.
  • UDDI serviceKey: Specifies which UDDI serviceKey is used to reference this server endpoint in the UDDI registry specified in the Preferences panel’s WSDL/UDDI tab.
• SOAP Action: Specifies how the server processes the request. This field is disabled if the Constrain to WSDL checkbox is selected.
• Message Exchange Pattern: Specifies whether or not to Expect Synchronous Response.
• Connection Settings: Specifies Keep-Alive or Close connections for the selected transport protocol.
  • Keep-Alive connection: Adds a "Connection: Keep-Alive" header to request a keep-alive connection if the server supports it.
  • Close connection (default): Outputs no additional HTTP headers and performs a regular HTTP 1.0 exchange. This is the default behavior for HTTP 1.0.


URL Parameters

When selecting URL Parameters from the left pane of the HTTP 1.0 Transport panel (only available in the Messaging Client tool), the following options are available:
• URL Parameters: Allows you to add parameters to the URL for a GET request. After clicking the Add button, you can specify Parameter/Value pairs in the dialog that opens. If a data source is available, you can parameterize the values as well.

Security

When selecting Security from the left pane of the HTTP 1.0 Transport panel, the following options are available:
• Use Client Key Store: Specifies the key store used to complete the handshake with the server.
• HTTP Authentication: Select the Perform Authentication checkbox and select Basic, NTLM, Kerberos, or Digest from the Type drop-down list. For Basic, NTLM, or Digest, enter the Username and Password to authenticate the request. For Kerberos, enter the Service Principal to authenticate the request. If the correct username and password, or the correct service principal, are not used, the request will not be authenticated. Alternatively, you can select Use Global Preferences if you have set Global HTTP Authentication Properties within the Security Preferences of SOAtest. For more information, see “Security Settings”, page 754.

HTTP Headers

When selecting HTTP Headers from the left pane of the HTTP 1.0 Transport panel, the following options are available:
• Add: Click to add a custom HTTP header.
• Modify: Click to modify the selected HTTP header. A dialog box will display, allowing you to modify the Name and Value of the header. If the SOAP Client is using a data source, values for the header can be accessed from the data source.
• Remove: Click to delete the selected HTTP header.

Cookies

When selecting Cookies from the left pane of the HTTP 1.0 Transport panel, the following options are available:
• Reset existing cookies before sending request: Allows you to clear the current cookies so that subsequent HTTP invocations start a new session.


Using HTTP 1.1

When selecting HTTP 1.1 as the transport protocol in SOAtest, you can specify whether you want the client’s requests to use Keep-Alive connections; if so, the connection will also be reused within a single invocation of a test suite from the GUI or the command line. You can add, modify, and remove custom HTTP headers for the SOAP request from the Transport tab of a SOAP Client or Messaging Client tool. In addition, you can specify HTTP Chunking, which allows HTTP messages to be broken up into several parts. Chunking is most often used by the server for responses, but clients can also chunk large requests.

The following subsections are available:
• Configuring SOAtest to use HTTP 1.1
• Error Handling

Configuring SOAtest to use HTTP 1.1

After selecting HTTP 1.1 from the Transport drop-down menu within the Transport tab of a SOAP Client or Messaging Client tool, the following options display in the left pane of the Transport tab:
• General
• URL Parameters
• Security
• HTTP Headers
• Cookies

General

When selecting General from the left pane of the HTTP 1.1 Transport panel, the following options are available:
• Endpoint: The endpoint is the URL of the web service endpoint. By default, SOAP Client endpoints are set to the endpoint defined in the WSDL. Besides WSDL, there are three other endpoint options:
  • Default: When this option is selected, the endpoint will be the endpoint defined in the Test Suite that contains the SOAP Client test. To see the GUI for the endpoint defined in the Test Suite, click the Test Suite node and then click the SOAP Client Options tab. Clicking the Apply Endpoint to All Tests button will set the endpoints of all SOAP Clients in the Test Suite to the endpoint defined in the GUI.
  • Custom: Allows you to set any custom endpoint.
  • UDDI serviceKey: Specifies which UDDI serviceKey is used to reference this server endpoint in the UDDI registry specified in the Preferences panel’s WSDL/UDDI tab.




• SOAPAction: Specifies how the server processes the request. This field is disabled if the Constrain to WSDL checkbox is selected.
• Message Exchange Pattern: Specifies whether or not to Expect Synchronous Response.
• Connection Settings: Specifies Keep-Alive or Close connections for the selected transport protocol.
  • Keep-Alive connection: Adds a "Connection: Keep-Alive" header to request a keep-alive connection if the server supports it. For more information, see “Error Handling”, page 693.
  • Close connection (default): Outputs no additional HTTP headers and performs a regular HTTP 1.0 exchange. This is the default behavior for HTTP 1.0.
• HTTP Chunking: Sends HTTP messages in chunks. Rather than sending the entire message from memory, SOAtest reads the message a block at a time, and then sends the block. The size of each block is sent before the block so that the receiving end knows how many bytes to expect.

URL Parameters

When selecting URL Parameters from the left pane of the HTTP 1.1 Transport panel (only available in the Messaging Client tool), the following options are available:
• URL Parameters: Allows you to add parameters to the URL for a GET request. After clicking the Add button, you can specify Parameter/Value pairs in the dialog that opens. If a data source is available, you can parameterize the values as well.

Security

When selecting Security from the left pane of the HTTP 1.1 Transport panel, the following options are available:
• Use Client Key Store: Specifies the key store used to complete the handshake with the server.
• HTTP Authentication: Select the Perform Authentication checkbox and select Basic, NTLM, Kerberos, or Digest from the Type drop-down list. For Basic, NTLM, or Digest, enter the Username and Password to authenticate the request. For Kerberos, enter the Service Principal to authenticate the request. If the correct username and password, or the correct service principal, are not used, the request will not be authenticated. Alternatively, you can select Use Global Preferences if you have set Global HTTP Authentication Properties within the Security Preferences of SOAtest. For more information, see “Security Settings”, page 754.

HTTP Headers

When selecting HTTP Headers from the left pane of the HTTP 1.1 Transport panel, the following options are available:
• Add: Click to add a custom HTTP header.
• Modify: Click to modify the selected HTTP header. A dialog box will display, allowing you to modify the Name and Value of the header. If the SOAP Client is using a data source, values for the header can be accessed from the data source.
• Remove: Click to delete the selected HTTP header.

Cookies


When selecting Cookies from the left pane of the HTTP 1.1 Transport panel, the following options are available:
• Reset existing cookies before sending request: Allows you to clear the current cookies so that subsequent HTTP invocations start a new session.

Error Handling

Under normal conditions, SOAtest test cases using HTTP 1.1 Keep-Alive will reuse a single connection for the duration of a scenario. When a SOAtest test case using HTTP 1.1 Keep-Alive times out while attempting to send or receive data, the client will issue a graceful close on the transport connection. The next test in the scenario will start a new connection and test execution will continue normally.


Using JMS

This topic explains how to use the JMS transport in SOAtest. It includes the following sections:
• JMS Prerequisites
• Configuring JMS Options
• Message Object Outputs for Clients Using JMS
• Using Sun JMS
• Configuration for Popular JMS Providers
• JMS SSL
• Responding to a Temporary JMSReplyTo Queue
• JMS Messaging without JNDI

JMS Prerequisites

When SOAtest is used as the JMS client, we recommend that it consult a JNDI provider to make connections to the JMS middleware. For this to happen, a JNDI provider needs to be set up and all necessary jar files (i.e., ones containing the Initial Context) need to be added to the SOAtest classpath. (For more information on how to add jars to the SOAtest classpath, see “System Properties Settings”, page 758.) You will also need to supply the names of the connection factory, destination, and reply-to queue that the JNDI provider will look up.

If your JMS setup does not have a JNDI provider that SOAtest can query for an instance of a ConnectionFactory, follow the instructions in “JMS Messaging without JNDI”, page 704. Alternatively, you can set up a simple file system JNDI provider. The jars and documentation for such a provider are available from Sun's page at http://java.sun.com/products/jndi/downloads/index.html. Setting up a file system provider is quite easy, and the documentation is included in the download. Once the provider is ready, use the simple Java code described in “Using Sun JMS”, page 698 to create an instance of the ConnectionFactory that connects to the JMS server using the host and port as arguments. The same should be done for the Topics and Queues used by SOAtest.

Configuring JMS Options

After selecting JMS from the Transport drop-down menu within the Transport tab of a SOAP Client or Messaging Client tool, the following options display in the left pane of the Transport tab:
• Connection Settings
• Queue/Topic
• Messaging Model
• Message Exchange Pattern
• Message Type
• Request Message Properties
• Response Message Correlation

Connection Settings

Connection Settings contains Settings and Properties tabs for the JNDI Initial Context.


The Properties tab is optional; it allows users to specify additional properties to be passed to the JNDI javax.naming.InitialContext constructor (in addition to the Provider URL and Initial Context factory properties that are specified in the Settings tab). Property values, which can be added by clicking Add and completing the Add JMS Property dialog, can be set to a fixed value, a parameterized value, a scripted value, or a unique value (an automatically-generated random unique value; no two test invocations will use the same value).

The Settings tab contains the following:
• If you created a Shared Property for JMS Connections, a drop-down menu will be available from which you can choose Use Local Settings or Use Shared Property.
  • If you select Use Shared Property, a second drop-down menu displays from which you select the desired global JMS settings that the SOAP Client tool will use. For more information on global JMS settings, see “Global JMS Connection Properties”, page 337.
  • If you select Use Local Settings, or if no shared property is specified, you can configure the rest of the options for Connection Settings.
• Provider URL: Specifies the value of the property named javax.naming.Context.PROVIDER_URL passed to the JNDI javax.naming.InitialContext constructor.
• Initial Context: Specifies a fully qualified class name string, passed to the JNDI javax.naming.InitialContext constructor as a string value for the property named javax.naming.Context.INITIAL_CONTEXT_FACTORY.
• Connection Factory: Specifies the JNDI name for the factory. This is passed to the lookup() method in javax.naming.InitialContext to create a javax.jms.QueueConnectionFactory or a javax.jms.TopicConnectionFactory instance.

In addition to the Settings tab, the Connection Settings also include:
• Queue Connection Authentication: Allows users to provide a username and password to create a queue connection. Select the Perform Authentication checkbox and enter the Username and Password to authenticate the request. If the correct username and password are not used, the request will not be authenticated. The username and password provided here are passed to the createQueueConnection() method in the javax.jms.QueueConnectionFactory class in order to get an instance of javax.jms.QueueConnection.
• Keep-Alive Connection: Select to notify the test whether to share or close the current connection. Shared connections are returned to the connection pool to be used across the test suite. The life cycle of a connection pool is as follows:
  • For a single test, it is destroyed at the end of the test execution.
  • For a test suite, it is destroyed at the end of the test suite execution.
  • For a load test, it is destroyed at the end of the load testing.

Queue/Topic

The Queue/Topic settings contain the following options:
• JMS Destination: Specifies the queue name (if point-to-point is used) or topic name (if publish-and-subscribe is used) to which the message will be sent.
• JMS ReplyTo: Specifies the queue name (if point-to-point is used) or topic name (if publish-and-subscribe is used) from which to get a response message. This can be a temporary queue if Temporary is selected instead of Form.

Messaging Model


Messaging Model options specify how messages are sent between applications. Select either Point to Point or Publish and Subscribe.

Message Exchange Pattern

Message Exchange Pattern options specify whether or not SOAtest receives a response. If Get Response is selected, SOAtest sends a message and receives a response. If Get Response is not selected, SOAtest sends a one-way message and does not receive a response.

If Get Response is selected, you can also enable Create consumer on the JMSReplyTo destination before sending the message. If the response is expected to become available very quickly on the JMSReplyTo topic, this option should be enabled to ensure that SOAtest has subscribed to the reply topic before the response message is published. This option cannot be combined with Match response JMSCorrelationID with request JMSMessageID because the JMS specification requires vendors to generate the JMSMessageID after the message is sent. As a result, there is no way to create the consumer on the response destination with that correlation (selector) set until after the message has been sent and the JMSMessageID becomes available (see the sketch below).
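To make the ordering constraint concrete, the following is a minimal sketch in plain JMS API terms (not SOAtest source); the session, producer, reply queue, request message, and 30-second timeout are illustrative assumptions:

import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;

public class MessageIdCorrelationSketch {
    // The provider assigns the JMSMessageID during send(), so a selector that matches
    // it can only be built, and the consumer created, after the message has been sent.
    public static Message requestReply(Session session, MessageProducer producer,
                                       Queue replyQueue, Message request) throws JMSException {
        producer.send(request);                                        // JMSMessageID assigned here
        String selector = "JMSCorrelationID = '" + request.getJMSMessageID() + "'";
        MessageConsumer consumer = session.createConsumer(replyQueue, selector);
        try {
            return consumer.receive(30000);                            // illustrative timeout
        } finally {
            consumer.close();
        }
    }
}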

Message Type

Message Type options allow you to select the message type from the drop-down menu. A JMS Message is a Java object that contains the data being transferred between JMS clients. The following Message Types are available:
• TextMessage: Used to send a message containing a java.lang.String. It is useful if you want only to send a simple text document. If you are doing SOAP over JMS, the String you enter into the TextMessage should be the SOAP Envelope.
• BytesMessage: Used to send a message containing a stream of uninterpreted bytes. The receiver of the message interprets the bytes as it sees fit. If you are doing SOAP over JMS, the bytes you enter into the BytesMessage should compose the SOAP Envelope. Data is sent in its most basic representation. It may also be useful when the JMS node is only interested in forwarding/routing data, so the contents of the data aren't important to it. It is one of the most commonly used types, along with TextMessage.
  • When the encoding in the SOAtest preferences is UTF-8 (the default), you can use the Method options to control how SOAtest extracts a string from response BytesMessages. The two methods in the drop-down menu correspond to the two methods available in the BytesMessage JMS API (see http://java.sun.com/products/jms/javadoc-102a/javax/jms/BytesMessage.html).
  • If a different encoding is selected in the preferences, SOAtest will always invoke BytesMessage.readBytes() on the response messages in order to account for different character encodings.
• StreamMessage: Used to send a stream of primitive values. If you are doing SOAP over JMS, the stream you enter into the StreamMessage should compose the SOAP Envelope.
• ObjectMessage: Used to send a Java Serializable. You should use the Scripting input view to return a Java Serializable; SOAtest will insert this object into the ObjectMessage at runtime.
• MapMessage: Used to send a set of name-value pairs. The values can only be Java primitives or their respective wrappers, Strings, or byte arrays. One intricacy of MapMessage objects is that even though you inserted a value as a String (using setString()), if the value can be coerced into an integer you can call getInt() and get a value of type int, as the sketch below illustrates.
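As a point of reference, the coercion described above can be reproduced with the standard JMS API. This is a minimal sketch only; the Session argument is assumed to come from an existing JMS connection:

import javax.jms.JMSException;
import javax.jms.MapMessage;
import javax.jms.Session;

public class MapMessageCoercionSketch {
    public static MapMessage buildMessage(Session session) throws JMSException {
        MapMessage message = session.createMapMessage();
        message.setString("quantity", "42");       // value inserted as a String
        int quantity = message.getInt("quantity"); // legally read back as an int (42)
        System.out.println("quantity as int: " + quantity);
        return message;
    }
}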


Request Message Properties

The Request Message Properties are optional and allow any miscellaneous property values to be set on the javax.jms.Message object before it is sent to a queue or published to a topic. These include predefined properties that are set on the outgoing request message using one of the corresponding "set" methods in javax.jms.Message, or any custom property provided via the setStringProperty() method. Property values, which can be added by clicking Add and completing the Add JMS Property dialog, can be set to a fixed value, a parameterized value, a scripted value, or a unique value (an automatically-generated random unique value; no two test invocations will use the same value).

Response Message Correlation

The Response Message Correlation settings contain the following options:
• Match response JMSCorrelationID with request JMSMessageID: If selected, the term JMSCorrelationID = '[msgId]' will be appended to the selector expression, where msgId is dynamically generated from the outgoing (request) javax.jms.Message (using the getJMSMessageID() method). Effectively, this results in the test blocking until a message with the specified correlation ID becomes available in the queue (or topic), and it will only retrieve that particular message, rather than retrieving any message in the queue (or topic). The test will time out after the timeout amount elapses if there is no message that matches the selector criteria.
• Match response JMSCorrelationID with request JMSCorrelationID: If selected, the term JMSCorrelationID = '[correlationId]' will be appended to the selector expression, where correlationId is retrieved from the JMSCorrelationID property in the Message Properties section. The option becomes enabled only if such a property is added to the Message Properties section. Effectively, this results in the test blocking until a message with the specified correlation ID becomes available in the queue (or topic), and it will only retrieve that particular message, rather than retrieving any message in the queue (or topic). The test will time out after the timeout amount elapses if there is no message that matches the selector criteria.
• Additional Selector Expression Terms: (Optional) Enter a value to act as a message filter. For example, by entering username = 'John', only messages that contain "John" as a username will be delivered. If this field is left blank, then any message can be received from the queue. This expression is passed to the createReceiver() method of the javax.jms.QueueSession class in point-to-point messaging, or the createSubscriber() method of the javax.jms.TopicSession class in publish-and-subscribe messaging. For tips on specifying a selector, see “Using Message Selector Filters”, page 704.
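For readers who want to relate these JMS transport options to raw code, the sketch below shows roughly how the settings map onto standard JNDI/JMS calls. It is illustrative only (not SOAtest source); the JNDI names, URL, factory class, credentials, and timeout are placeholder assumptions:

import java.util.Hashtable;
import javax.jms.Message;
import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.jms.QueueReceiver;
import javax.jms.QueueSender;
import javax.jms.QueueSession;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.Context;
import javax.naming.InitialContext;

public class PointToPointSketch {
    public static void main(String[] args) throws Exception {
        // Connection Settings: Provider URL and Initial Context are passed to InitialContext
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.PROVIDER_URL, "tcp://yourhostname:7222");                          // illustrative
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.example.YourInitialContextFactory"); // illustrative
        Context jndi = new InitialContext(env);

        // Connection Factory name is resolved with lookup(); Queue Connection Authentication
        // maps to createQueueConnection(username, password)
        QueueConnectionFactory factory = (QueueConnectionFactory) jndi.lookup("ConnectionFactory");
        QueueConnection connection = factory.createQueueConnection("user", "password");
        QueueSession session = connection.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);

        // JMS Destination and JMS ReplyTo correspond to the queue (or topic) names
        Queue destination = (Queue) jndi.lookup("requestQueue");
        Queue replyTo = (Queue) jndi.lookup("replyQueue");

        TextMessage request = session.createTextMessage("<soapenv:Envelope/>");
        request.setJMSReplyTo(replyTo);

        QueueSender sender = session.createSender(destination);
        connection.start();
        sender.send(request);

        // Response Message Correlation: a selector ties the reply to this request
        String selector = "JMSCorrelationID = '" + request.getJMSMessageID() + "'";
        QueueReceiver receiver = session.createReceiver(replyTo, selector);
        Message response = receiver.receive(30000);
        System.out.println(response);

        connection.close();
        jndi.close();
    }
}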

Message Object Outputs for Clients Using JMS

You can add message object outputs to SOAP Clients and Messaging Clients that use the JMS transport. For example, an Extension tool chained to a Messaging Client that uses JMS will have access to the response JMS Message. In the ObjectMessage case, you can use getter and equals() methods to validate the response, thereby creating a regression control. In addition, you can chain a Diff tool to the Response Traffic; if the response is an ObjectMessage, SOAtest will convert the inserted serializable object to XML format and perform an XML diff. By doing this you can use data bank values, ignore XPath differences, and so on.

To do this, complete the following:

1. Right-click the SOAP Client or Messaging Client node for which you would like to add the output and select Add Output from the shortcut menu. The Add Output wizard displays.


2. Select Response> Message Object in the left pane of the Add Output wizard, then choose a New Output or Existing Output and the desired tool (e.g., an Extension tool) in the right pane.

3. Click the Finish button. SOAtest adds the new output to the selected client node.

Using Sun JMS

SOAtest bundles a Sun JNDI implementation which stores JNDI bindings in directories and files on the local hard drive, using com.sun.jndi.fscontext.RefFSContextFactory as the Initial Context and a user-defined Provider URL (directory) of “C:\JNDIRoot” or something similar. When using the Sun implementation, it is necessary to populate the “C:\JNDIRoot” directory with a .binding file so that fscontext can successfully look up the ConnectionFactory and Queue or Topic objects. The SOAtest team has put together an example that can be found in the SOAtest installation directory (/examples/jms/JndiFileProviderTest.java), which creates a .binding file. If you require more assistance setting up SOAtest for JMS testing, contact the SOAtest support team.
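The bundled JndiFileProviderTest.java example is not reproduced here, but the general shape of such a program is sketched below. This is an assumption-laden illustration: it presumes the Sun file-system JNDI jars are on the classpath and, purely as an example of a factory with a URL constructor, binds the Sonic connection factory class mentioned later in this topic; substitute your own provider's factory, URL, and JNDI names.

import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.InitialContext;

public class JndiFileProviderSketch {
    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.fscontext.RefFSContextFactory");
        env.put(Context.PROVIDER_URL, "file:/C:/JNDIRoot");  // directory that will hold the bindings file

        Context ctx = new InitialContext(env);
        // Bind a vendor connection factory under the JNDI name that SOAtest will look up.
        // Sonic is used here only as an example; any factory with a URL constructor can be bound the same way.
        ctx.rebind("ConnectionFactory",
                   new progress.message.jclient.ConnectionFactory("tcp://yourhostname:2506"));
        // Queues and Topics used by SOAtest should be bound in the same manner.
        ctx.close();
    }
}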

Configuration for Popular JMS Providers


IBM WebSphere Application Server (WAS)

For: The WAS Default JMS provider. Parasoft recommends the use of IBM's JMS thin client that is provided by WAS 7.0 or later, and which can interoperate with WAS 6.0.2 and later.

Minimum Required jars *: Found under [WAS installation dir]/runtimes
- com.ibm.ws.ejb.thinclient_7.0.0.jar
- com.ibm.ws.orb_7.0.0.jar
- com.ibm.ws.sib.client.thin.jms_7.0.0.jar

Typical Provider URL Pattern: iiop://yourhostname:2809/

Factory class: JNDI Initial Context factory class
- com.ibm.websphere.naming.WsnInitialContextFactory

Learn more at: http://publib.boulder.ibm.com/infocenter/wasinfo/v7r0/topic/com.ibm.websphere.pmc.soafep.multiplatform.doc/tasks/tjj_jmsthcli_connf.html

IBM WebSphere MQ (MQ Series)

For: The WebSphere MQ JMS provider

Minimum Required jars *: Found under [WebSphere MQ installation directory]/java/lib
- com.ibm.mq.jar
- com.ibm.mqjms.jar
- com.ibm.mq.commonservices.jar
- com.ibm.mq.headers.jar
- com.ibm.mq.jmqi.jar
- connector.jar
- dhbcore.jar
- jta.jar
MQ 6.0 and earlier need to download (this jar ships with MQ 7.0):
- MS0B: WebSphere MQ Java classes for PCF (http://www-01.ibm.com/support/docview.wss?uid=swg24000668): com.ibm.mq.pcf.jar
Need to download:
- ME01: WebSphere MQ - Initial Context Factory (http://www-01.ibm.com/support/docview.wss?uid=swg24004684): mqcontext.jar

Typical Provider URL Pattern: yourhostname:1414/SYSTEM.DEF.SVRCONN

Factory class: WebSphere MQ JNDI Initial Context factory class
- com.ibm.mq.jms.context.WMQInitialContextFactory


JMS messaging without JNDI: To have SOAtest do JMS messaging over MQ without JNDI, leave the "Initial Context" field empty, and provide the hostname in the Provider URL field. The Connection Factory field should have the fully qualified WebSphere MQ connection factory class name; typically, this should be com.ibm.mq.jms.MQConnectionFactory. Under the Properties tab, define the following properties (Name: Value):
- provider: WebSphere MQ
- queue.manager: [your queue manager name]
- channel: [your channel name]
- port: [port number; the default WebSphere MQ port is 1414]
With this configuration, SOAtest will create the connection factory using the provided parameters as follows:

import com.ibm.mq.jms.*;
...
MQConnectionFactory cf = new MQConnectionFactory();
cf.setHostName(hostname);
cf.setPort(port);
cf.setQueueManager(queuemanager);
cf.setChannel(channel);
cf.setTransportType(JMSC.MQJMS_TP_CLIENT_MQ_TCPIP);
cf.setFailIfQuiesce(JMSC.MQJMS_FIQ_YES);
cf.setUseConnectionPooling(true);

Note: IBM's JNDI provider authenticates itself with the WebSphere MQ server by sending the user's login name. This is typically the user name that was provided when logging into the workstation before starting SOAtest. If the MQ server does not recognize the user name, then your test will fail with the error "javax.jms.JMSSecurityException: MQJMS2013: invalid security authentication supplied for MQQueueManager". This error can be resolved by adding a Windows user account on the WebSphere MQ server machine. This account should 1) have the same user name as the account on the local machine where SOAtest is running and 2) be a member of the "mqm" IBM WebSphere MQ Administration Group. Alternatively, use a different user name by changing the value of the user.name Java system property. This system property can be configured by starting SOAtest using soatest.exe -J-Duser.name=username, where username is the user name you want to use. In some configurations, using an empty username via soatest.exe -J-Duser.name= will work. Using SYSTEM for the user.name property may also work in some configurations. It is also possible to modify Java system properties during test suite execution by using an Extension tool to call the java.lang.System.setProperty() method from Sun's Java API. For more details on this error, see http://www.mqseries.net/phpBB/viewtopic.php?t=40640.


Learn more at:
- http://www-01.ibm.com/software/integration/wmq/library/
- http://publib.boulder.ibm.com/infocenter/wmqv7/v7r0/topic/com.ibm.mq.csqzaw.doc/jm10320_.htm

Oracle BEA WebLogic

For: Oracle BEA WebLogic

Minimum Required jars (Thin Client) *: Found under [weblogic home]/wlserver_x/server/lib
- wljmsclient.jar
- wlclient.jar
Note that additional jars may be needed. We recommend using a full client; this relieves you from having to find and add many jars.

Minimum Required jars (Full Client) *: Build the single wlfullclient5.jar as described in the instructions for "Creating a wlfullclient5.jar for JDK 1.5 client applications." These instructions are available on Oracle's site at http://edocs.bea.com/wls/docs103/client/jarbuilder.html#wp1078122.

Typical Provider URL Pattern:
- Thin Client: iiop://yourhostname:7001
- Full Client: t3://yourhostname:7001

Factory class: JNDI Initial Context factory class
- weblogic.jndi.WLInitialContextFactory

Learn more at:
- http://edocs.bea.com/wls/docs103/client/basics.html#wp1068455
- http://edocs.bea.com/wls/docs103/client/jms_thin_client.html

TIBCO EMS

For: TIBCO EMS

Minimum Required jars *: Found under [TIBCO install dir]/ems/clients/java
- tibjms.jar

Typical Provider URL Pattern: tcp://yourhostname:7222

Factory class: JNDI Initial Context factory class
- com.tibco.tibjms.naming.TibjmsInitialContextFactory


Learn more at: Refer to the TIBCO Enterprise Message Service User's Guide, section 9: Developing an EMS Client Application> Programmer Checklist

Progress Sonic MQ/ESB

For: Progress Sonic MQ/ESB

Minimum Required jars *: Found under [sonic install dir]/MQ7.x/lib
- mfcontext.jar
- broker.jar
- sonic_Client.jar

Typical Provider URL Pattern: tcp://yourhostname:2506

Factory class: JNDI Initial Context factory class
- com.sonicsw.jndi.mfcontext.MFContextFactory

Learn more at: Refer to the SonicMQ Application Programming Guide, Appendix A (Using the Sonic JNDI SPI), which also refers to other relevant sections.

JBoss JMS

For: JBoss JMS

Minimum Required jars *: Found under [JBoss install dir]/client
- jboss-javaee.jar
- jnp-client.jar
- jboss-logging-spi.jar
- jboss-messaging-client.jar
- jboss-aop-client.jar
- jboss-remoting.jar
- jboss-common-core.jar
- trove.jar
- javassist.jar
- jboss-mdr.jar
- concurrent.jar
- log4j.jar
- jboss-serialization.jar

Typical Provider URL Pattern: yourhostname

Factory class: JNDI Initial Context factory class
- org.jnp.interfaces.NamingContextFactory


Other JMS Providers

For other JMS providers, please refer to your vendor guides on how to configure a JMS client to communicate with your system.

* Adding Required jar Files

To add required jar files to SOAtest’s classpath, complete the following:

1. Choose SOAtest> Preferences.
2. Open the System Properties page.
3. Click the Add JARS button and select the necessary JAR files to be added.

JMS SSL

The steps to configure a JMS SSL connection depend on which JMS server is being used. Consult your server documentation for more information on configuring JMS SSL connections. If you are configuring JMS SSL on a TIBCO EMS server, you will need to specify the appropriate SSL settings in the initial context .bindings file; "Trust All Certificates" will also need to be UNCHECKED in the SOAtest Security Preferences (Alt+F2). WebSphere JMS clients can achieve SSL connections by setting the appropriate properties in the initial context. Both WebSphere MQ and WebSphere JMS clients set the same properties for SSL connectivity.

Responding to a Temporary JMSReplyTo Queue

To respond to a JMS message with a temporary queue set in the JMSReplyTo field:

1. Create a Call Back Tool and configure it to receive the JMS message.
2. Right-click the Call Back Tool and choose Add Output> Incoming JMS Message> Extension tool.
3. Write a custom script that places the incoming message (input object) into the scripting context with a predetermined key.
   • This key is defined within the SOAtest Extensibility API under SOAPUtil.JMS_MSG_KEY. To access the SOAtest Extensibility API, choose Help> Extensibility API.
   • Here is a Python example:

     from soaptest.api import *

     def getJMSReplyTo(input, context):
         context.put(SOAPUtil.JMS_MSG_KEY, input)

4. Create and configure the tool that will send the JMS response message.
   • SOAPUtil.JMS_CONTEXT_QUEUE, which is another keyword defined in the SOAtest Extensibility API, should be used as the JMSDestination of this tool.


SOAtest will then use the temporary queue of the received message as the destination in place of the keyword entered as the destination.

Using Message Selector Filters

In various tools, including the Event Monitor, Messaging Client, SOAP Client, Call Back Tool, Message Stub tool, and the test creation wizard for JMS, Sonic, and TIBCO, you can specify a value to act as a message filter. This is specified in a field labelled Message Selector or Additional Selector Expression Terms. For example, by entering username = 'John', only messages that contain "John" as a username will be delivered. If this field is left blank, then any messages can be received.

Here are some tips for working with message selector filters:
• If you'd like to filter event messages based on content, you can chain a tool such as XML Transformer to its XML Event Output.
• The selector field links to a JMS feature called "message selectors." SOAtest passes the specified expression to the javax.jms.Session object when creating a javax.jms.MessageConsumer. The selector expression, as defined by the JMS specification, can be applied to JMS message headers and properties, but not to message body contents.
• In point-to-point messaging, this expression is passed to the createReceiver() method of the javax.jms.QueueSession class. In publish-and-subscribe messaging, it is passed to the createSubscriber() method of the javax.jms.TopicSession class (see the sketch below).
• The expression syntax is a subset of SQL92. For example, the expression fruit = 'apple' or JMSCorrelationID = '123456' will result in SOAtest picking up only JMS messages that have the property fruit defined with the value apple or the JMS header JMSCorrelationID set to the value 123456.
• For more information about this JMS feature, refer to the Java Message Service Specification at http://java.sun.com/products/jms/docs.html.
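To illustrate where the selector ends up, here is a minimal sketch using the point-to-point call named above; the QueueSession and Queue are assumed to already exist, and the expression is the example from this section:

import javax.jms.JMSException;
import javax.jms.Queue;
import javax.jms.QueueReceiver;
import javax.jms.QueueSession;

public class SelectorSketch {
    public static QueueReceiver createFilteredReceiver(QueueSession session, Queue queue)
            throws JMSException {
        // Only messages with the property fruit = 'apple', or with the
        // JMSCorrelationID header '123456', will be delivered to this receiver.
        return session.createReceiver(queue, "fruit = 'apple' OR JMSCorrelationID = '123456'");
    }
}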

JMS Messaging without JNDI

Using JNDI to obtain JMS connection factory and Destination instances is highly recommended from an architectural perspective because it decouples JMS consumer code from vendor-specific dependencies. In test or staging environments, however, a JMS system occasionally does not have a JNDI provider configured yet, or the JNDI directory does not yet include the desired connection factories. Furthermore, it might be helpful to bypass JNDI during testing in order to debug issues or to isolate system performance characteristics with and without JNDI. For these reasons, SOAtest's JMS-aware tools (SOAP Client, Messaging Client, and Call Back Tool) allow JMS messages to be sent and received using vendor connection factories directly, without going through JNDI, as long as your JMS provider permits this. This capability supports certain JMS implementations that are designed to allow JMS connections to be established without JNDI in the first place, and which provide connection factory classes with a constructor that takes a single string argument as the connection URL.

Supported JMS Implementations

Since this support for JMS messaging without JNDI is not based on the standard JMS API, it is not guaranteed to be portable across different JMS implementations. This capability has been tested with Sonic and TIBCO JMS. It is also supported for WebSphere MQ, with the configuration described in “IBM WebSphere MQ (MQ Series)”, page 699.


At the time of this writing, the direct creation of connection factories for Oracle/BEA WebLogic, JBoss, or WebSphere Default JMS provider is not supported by SOAtest, and in most of these cases it is not documented or encouraged by these vendors.

Configuration

To configure one of the JMS messaging tools in SOAtest to send/receive messages without JNDI:
• Keep the connection URL in the JNDI connection settings Provider URL field.
• Leave the Initial Context field empty.
• Specify a fully-qualified MOM connection factory class name in the Connection Factory field. That class name should represent a class that implements javax.jms.ConnectionFactory.
  • In the cases of Sonic and TIBCO, you can follow the same patterns as the JNDI provider URLs (see “Progress Sonic MQ/ESB”, page 702 and “TIBCO EMS”, page 701 for details).

SOAtest interprets the empty Initial Context field as a signal to instantiate the connection factory object directly, without JNDI. It will attempt to do so with a constructor that takes a single string argument, passing in the connection URL that you specified in the Provider URL field; a rough sketch of this behavior is shown below. The connection authentication settings will still be used as usual. If no such constructor exists in your provider's connection factory API, then JNDI is needed in order to instantiate the connection factory class. For example, in the case of Sonic JMS, the Connection Factory class name to be provided would be progress.message.jclient.ConnectionFactory. For TIBCO JMS, it would be com.tibco.tibjms.TibjmsConnectionFactory. On the other hand, JBoss and OpenJMS do not have a connection factory that takes a single connection URL argument; in these cases, there must be a JNDI provider in order to exchange JMS messages with these systems using SOAtest.

Regardless of the JNDI settings configuration, SOAtest will always attempt to resolve the destination (queue or topic) name from JNDI if JNDI exists and if the name exists in the directory. If it cannot find it via JNDI, then it will attempt to create the destination directly from the JMS Session instance, assuming that the user-provided name is the physical destination name and not a JNDI name.
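The sketch below approximates that behavior in plain Java. It is illustrative only, not SOAtest source; the two class names and URL patterns are the Sonic and TIBCO examples given in this topic:

import javax.jms.ConnectionFactory;

public class DirectFactorySketch {
    // Roughly what an empty Initial Context field implies: instantiate the class named in the
    // Connection Factory field, passing the Provider URL as the single String constructor argument.
    public static ConnectionFactory create(String factoryClassName, String providerUrl) throws Exception {
        return (ConnectionFactory) Class.forName(factoryClassName)
                .getConstructor(String.class)
                .newInstance(providerUrl);
    }

    public static void main(String[] args) throws Exception {
        ConnectionFactory sonic = create("progress.message.jclient.ConnectionFactory", "tcp://yourhostname:2506");
        ConnectionFactory tibco = create("com.tibco.tibjms.TibjmsConnectionFactory", "tcp://yourhostname:7222");
        System.out.println("Created factories: " + sonic.getClass() + ", " + tibco.getClass());
    }
}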


Using SonicMQ

The SonicMQ Transport is implemented as an extension to the JMS Transport. In addition to the JMS Message types, SOAtest supports MultiPartMessage for the SonicMQ Transport. For JMS Message types, you need to use the JMS Transport; for more information on the JMS Transport, see “Using JMS”, page 694. For outbound messaging, a MultiPartMessage is constructed with a single part that wraps the request message as "text/xml" type. Users can specify the 'Content-ID' header field of the part via the Part Content ID field. For inbound messaging, both the JMS and SonicMQ Transports can parse a MultiPartMessage with multiple parts.

Creating New Tests for SonicMQ

To configure SOAtest to access SonicMQ, complete the following:

1. Complete the WSDL test creation wizard as normal (see “Creating Tests From a WSDL”, page 385 for details).
2. Double-click the test node for the test that will be using SonicMQ.
3. In the right GUI panel, open the Transport tab and select SonicMQ from the Transport drop-down menu. Various options will display underneath the Transport drop-down menu:
   • Connection Settings
   • Queue/Topic
   • Messaging Model
   • Message Exchange Pattern
   • Message Type
   • Request Message Properties
   • Response Message Correlation
4. Configure the desired options as described in the following sections.

Configuring SonicMQ Options

After selecting SonicMQ from the Transport drop-down menu within the Transport tab of a SOAP Client or Messaging Client tool, the following options display in the left pane of the Transport tab:
• Connection Settings
• Queue/Topic
• Messaging Model
• Message Exchange Pattern
• Message Type
• Part Content ID
• Request Message Properties
• Response Message Correlation

Connection Settings

Connection Settings contains the Settings and Properties tabs. The Properties tab is optional; it allows users to specify additional properties to be passed to the JNDI javax.naming.InitialContext constructor, in addition to the Provider URL and Initial Context factory properties that are specified in the Settings tab. The Settings tab contains:

• If you created a Shared Property for SonicMQ Connections, a drop-down menu will be available from which you can choose Use Local Settings or Use Shared Property.
  • If you select Use Shared Property, a second drop-down menu displays from which you select the desired global SonicMQ settings that the SOAP Client tool will use. For more information on global SonicMQ settings, see “Global JMS Connection Properties”, page 337.
  • If you select Use Local Settings, or if no shared property is specified, you can configure the rest of the options for Connection Settings.
• Provider URL: Specifies the value of the property named javax.naming.Context.PROVIDER_URL passed to the JNDI javax.naming.InitialContext constructor.
• Initial Context: Specifies a fully qualified class name string, passed to the JNDI javax.naming.InitialContext constructor as a string value for the property named javax.naming.Context.INITIAL_CONTEXT_FACTORY.
• Connection Factory: Passed to the lookup() method in javax.naming.InitialContext to create a javax.jms.QueueConnectionFactory or a javax.jms.TopicConnectionFactory instance. (For a sketch of how these settings map onto the JNDI API, see the example at the end of this section.)

In addition to the Settings tab, the Connection Settings also include:
• Queue Connection Authentication: Allows users to provide a username and password to create a queue connection. Select the Perform Authentication checkbox and enter the Username and Password to authenticate the request. If the correct username and password are not used, the request will not be authenticated. The username and password provided here are passed to the createQueueConnection() method in the javax.jms.QueueConnectionFactory class in order to get an instance of javax.jms.QueueConnection.
• Keep-Alive Connection: Select to notify the test whether to share or close the current connection. Shared connections are returned to the connection pool to be used across the test suite. The life cycle of the connection pool is as follows:
  • For a single test, it is destroyed at the end of the test execution.
  • For a test suite, it is destroyed at the end of the test suite execution.
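As a rough illustration (a sketch with placeholder values, not SOAtest's actual implementation), the Provider URL, Initial Context, Connection Factory, and authentication settings correspond to the following standard JNDI/JMS calls:

import java.util.Hashtable;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.naming.Context;
import javax.naming.InitialContext;

public class JndiConnectionExample {
    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<String, String>();
        // Initial Context field -> INITIAL_CONTEXT_FACTORY (example Sonic JNDI factory)
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sonicsw.jndi.mfcontext.MFContextFactory");
        // Provider URL field -> PROVIDER_URL (placeholder broker address)
        env.put(Context.PROVIDER_URL, "tcp://localhost:2506");
        Context ctx = new InitialContext(env);

        // Connection Factory field -> name passed to lookup()
        QueueConnectionFactory factory =
            (QueueConnectionFactory) ctx.lookup("SampleConnectionFactory");

        // Queue Connection Authentication -> createQueueConnection(username, password)
        QueueConnection connection = factory.createQueueConnection("user", "password");
        connection.close();
    }
}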

Queue/Topic

The Queue/Topic settings contain the following options:
• JMS Destination: Specifies the queue name (if point to point is used) or topic name (if publish and subscribe is used) to which the message will be sent.
• JMS ReplyTo: Specifies the queue name (if point to point is used) or topic name (if publish and subscribe is used) from which to get a response message. This can be a temporary queue if Temporary is selected instead of Form.

Messaging Model

Messaging Model options specify how messages are sent between applications. Select either Point to Point or Publish and Subscribe.

Message Exchange Pattern

Message Exchange Pattern options specify whether or not SOAtest receives a response. If Get Response is selected, SOAtest sends a message and receives a response. If Get Response is not selected, SOAtest sends a one-way message and does not receive a response.

If Get Response is selected, you can also enable Create consumer on the JMSReplyTo destination before sending the message. If the response is expected to become available very quickly on the JMSReplyTo topic, this option should be enabled to ensure that SOAtest has subscribed to the reply topic before the response message is published. This option cannot be combined with Match response JMSCorrelationID with request JMSMessageID because the JMS specification requires vendors to generate the JMSMessageID after the message is sent. As a result, there is no way to create the consumer on the response destination with that correlation (selector) set until after the message has been sent and the JMSMessageID becomes available.

Message Type

Message Type options allow you to select the message type from the drop-down menu. A SonicMQ Message is a Java object that contains the data being transferred between SonicMQ clients. The following Message Types are available:
• progress.message.jclient.MultipartMessage

Part Content ID

A Sonic MultipartMessage can have multiple parts. Each part has its own name (ID) and content. SOAtest supports sending MultipartMessages with a single part in SOAP Client and Messaging Client; this field specifies the part name, and the content is defined by the Request area (Form Input, Literal XML, etc.). SOAtest supports receiving MultipartMessages with multiple parts and outputs all the part contents as XML to the Response Output of the tool.

Request Message Properties

The Request Message Properties are optional and allow any miscellaneous property values to be set on the javax.jms.Message object before it gets sent to a queue or published to a topic. These include predefined properties that are set on the outgoing request message using one of the corresponding "set" methods in javax.jms.Message, or any custom property provided with the setStringProperty() method.

Response Message Correlation

The Response Message Correlation settings contain the following options:
• Match response JMSCorrelationID with request JMSMessageID: If selected, the term JMSCorrelationID = '[msgId]' will be appended to the selector expression, where msgId is dynamically generated from the outgoing (request) javax.jms.Message (using the getJMSMessageID() method). Effectively, this results in the test blocking until a message with the specified correlation ID becomes available in the queue (or topic); only that particular message is retrieved, rather than any message in the queue (or topic). The test will time out after the timeout amount elapses if there is no message that matches the selector criteria.
• Match response JMSCorrelationID with request JMSCorrelationID: If selected, the term JMSCorrelationID = '[correlationId]' will be appended to the selector expression, where correlationId is retrieved from the JMSCorrelationID property in the Message Properties section. The option becomes enabled only if such a property is added to the Message Properties section. Effectively, this results in the test blocking until a message with the specified correlation ID becomes available in the queue (or topic); only that particular message is retrieved, rather than any message in the queue (or topic). The test will time out after the timeout amount elapses if there is no message that matches the selector criteria.
• Additional Selector Expression Terms: (Optional) Enter a value to act as a message filter. For tips on specifying a selector, see “Using Message Selector Filters”, page 704.
A sketch of how this kind of correlation selector is used with the JMS API follows.
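To make the correlation behavior concrete, the following minimal sketch (placeholder names and timeout, not SOAtest code) shows how a JMS client typically builds such a selector and blocks for the matching reply:

import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;

public class CorrelationExample {
    // Assumes an open Session and existing request/reply queues.
    static Message sendAndCorrelate(Session session, Queue requestQueue,
                                    Queue replyQueue, String payload) throws Exception {
        MessageProducer producer = session.createProducer(requestQueue);
        TextMessage request = session.createTextMessage(payload);
        producer.send(request);

        // "Match response JMSCorrelationID with request JMSMessageID":
        // the JMSMessageID exists only after send(), so a consumer with this
        // selector can only be created afterwards.
        String selector = "JMSCorrelationID = '" + request.getJMSMessageID() + "'";
        MessageConsumer consumer = session.createConsumer(replyQueue, selector);

        // Blocks until a matching reply arrives or the timeout elapses.
        return consumer.receive(30000);
    }
}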

Message Object Outputs for Clients Using SonicMQ

You can add message object outputs to SOAP Clients and Messaging Clients that utilize the SonicMQ transport. For example, an Extension tool chained to a Messaging Client that uses SonicMQ will have access to the response SonicMQ Message. In the ObjectMessage case, you can use getter and equals() methods to validate the response, thereby creating a regression control. In addition, you can chain a Diff tool to the Response Traffic; if the response is an ObjectMessage, SOAtest will convert the inserted serializable object to XML format and perform an XML diff. By doing this you can use data bank values, ignore XPath differences, and so on. To do this, complete the following:
1. Right-click the SOAP Client or Messaging Client node for which you would like to add the output and select Add Output from the shortcut menu. The Add Output wizard displays.
2. Select Response> Message Object in the left pane of the Add Output wizard and then choose a New Output or Existing Output and the desired tool (e.g. an Extension tool) from the right pane.
3. Click the Finish button. SOAtest adds the new output to the selected Client node.

Using IBM WebSphere MQ

This topic refers to testing services over the IBM WebSphere MQ Java API, not the standard JMS API. If you intend to send and receive messages from IBM WebSphere MQ using the JMS API, select the JMS transport option and refer to “Using JMS”, page 694. The following sections describe the steps required to set up SOAtest to access IBM MQ.

Adding Necessary Jar Files to the SOAtest Classpath

In order to use the MQ option, you must add the WebSphere MQ client jar files (such as com.ibm.mq.jar and connector.jar) to SOAtest's classpath. These jar files can be obtained from an MQ client installation or from a WebSphere MQ server installation. For information on obtaining a license or downloading MQ Client, visit http://www-01.ibm.com/software/integration/wmq/library/ For more information about WebSphere MQ requirements for Java clients such as SOAtest, refer to: http://publib.boulder.ibm.com/infocenter/wmqv7/v7r0/topic/com.ibm.mq.csqzaw.doc/ja10310_.htm

To add these jar files to SOAtest's classpath, complete the following:
1. Choose SOAtest> Preferences.
2. Open the System Properties page.
3. Click the Add JARS button and select the necessary JAR files to be added.
• For WebSphere MQ 6 and earlier: the required jars are com.ibm.mq.jar and connector.jar. These jars can be found at [WebSphere MQ Installation directory]/java/lib. For details, see http://publib.boulder.ibm.com/infocenter/wmqv6/v6r0/topic/com.ibm.mq.csqzaw.doc/uj10310_.htm.
• For WebSphere MQ 7: the required jars are com.ibm.mq.jar, com.ibm.mq.jmqi.jar, com.ibm.mq.headers.jar, com.ibm.mq.pcf.jar, com.ibm.mq.commonservices.jar, and connector.jar. These jars can be found at [WebSphere MQ Installation directory]/java/lib.

Creating New Tests for WebSphere MQ

To configure SOAtest to access IBM WebSphere MQ, complete the following:
1. Complete the WSDL test creation wizard as normal (see “Creating Tests From a WSDL”, page 385 for details).
2. Double-click the test node for the test that will be using WebSphere MQ. Note: The SOAP Client and Messaging Client tools both support MQ and have the MQ option. Use SOAP Client if the messages that are being transmitted are SOAP messages, and Messaging Client if the messages are generic XML or plain text messages.
3. In the right GUI panel, open the Transport tab and select WebSphere MQ from the Transport drop-down menu. The following options display underneath the Transport drop-down menu: MQ Address, Message Exchange Pattern, Put Messages, Get Messages, Queue Manager Options, SSL, and Scripting Hook.
4. Configure the desired options as described in the following sections.
• Note: Any options set in the Scripting Hook tab override and take precedence over any options set in the other tabs.
• For more information on each of these options, please refer to the IBM WebSphere MQ Information Center at: http://publib.boulder.ibm.com/infocenter/wmqv6/v6r0/index.jsp?topic=/com.ibm.mq.branding.doc/help_home_wmq.htm.
• The full reference to the IBM MQ Java API that is used by SOAtest can be found at: http://publib.boulder.ibm.com/infocenter/wmqv6/v6r0/index.jsp?topic=/com.ibm.mq.csqzaw.doc/jdmq.htm.

MQ Address Options

The following options are available for MQ Address:

Message Exchange Pattern: Specifies whether or not SOAtest receives a response. If Get Response is selected, SOAtest sends a message and receives a response. If Get Response is not selected, SOAtest sends a one-way message and does not receive a response.



Host: Specifies the name of the host running IBM MQ.



Port: Specifies the port number for IBM MQ (default is 1414).



Queue Manager: Specifies the name of the Queue Manager.



Channel: Specifies the name of the server-defined channel.



Put Queue: Specifies the queue that SOAtest sends the SOAP message to.



Get Queue: Specifies the queue that SOAtest retrieves the SOAP message from.



Authentication: Select the Perform Authentication checkbox and enter the Username and Password to authenticate the request. If the correct username and password are not used, the request will not be authenticated.

Message Exchange Pattern

The following options are available for Message Exchange Pattern:

Expect Synchronous Response: Specifies whether or not SOAtest receives a response. If Expect Synchronous Response is selected, SOAtest sends a message and receives a response. If Expect Synchronous Response is not selected, SOAtest sends a one-way message and does not receive a response.

Put Messages Options

The following options are available for Put Messages (a sketch showing how these options are applied through the MQ Java classes appears at the end of this section):
• MQPutMessageOptions.options: Controls the action of MQQueue.put(). Any or none of the following values can be specified.

MQPMO_SYNCPOINT: The request is to operate within the normal unit-of-work protocols. The message is not visible outside the unit of work until the unit of work is committed. If the unit of work is backed out, the message is deleted.



MQPMO_NO_SYNCPOINT: The request is to operate outside the normal unit-of-work protocols. The message is available immediately, and it cannot be deleted by backing out a unit of work. If neither this option nor MQPMO_SYNCPOINT is specified, the inclusion of the put request in unit-of-work protocols is determined by the environment, see MQPMO_SYNCPOINT. Do not specify MQPMO_NO_SYNCPOINT with MQPMO_SYNCPOINT.





MQPMO_NO_CONTEXT: Both identity and origin context are set to indicate no context. This means that the context fields in MQMD are set to: •

Blank, for character fields



Null, for byte fields



Zero, for numeric fields



MQPMO_DEFAULT_CONTEXT: The message is to have default context information associated with it, for both identity and origin.



MQPMO_SET_IDENTITY_CONTEXT: The message is to have context information associated with it. The application specifies the identity context in the MQMD structure. Origin context information is generated by the queue manager in the same way that it is for MQPMO_DEFAULT_CONTEXT.



MQPMO_SET_ALL_CONTEXT: The message is to have context information associated with it. The application specifies the identity and origin context in the MQMD structure.



MQPMO_FAIL_IF_QUIESCING: This option forces the MQPUT or MQPUT1 call to fail if the queue manager is in the quiescing state.



MQPMO_NEW_MSG_ID: The queue manager replaces the contents of the MsgId field in MQMD with a new message identifier. This message identifier is sent with the message, and returned to the application on output from the MQPUT or MQPUT1 call.



MQPMO_NEW_CORREL_ID: The queue manager replaces the contents of the CorrelId field in MQMD with a new correlation identifier. This correlation identifier is sent with the message, and returned to the application on output from the MQPUT or MQPUT1 call.



MQPMO_LOGICAL_ORDER: This option tells the queue manager how the application puts messages in groups and segments of logical messages. It can be specified only on the MQPUT call; it is not valid on the MQPUT1 call. See WebSphere MQ Application Programming Reference for more information on this option.



MQPMO_ALTERNATE_USER_AUTHORITY: This indicates that the AlternateUserId field in the ObjDesc parameter of the MQPUT1 call contains a user identifier that is to be used to validate authority to put messages on the queue. The call can succeed only if this AlternateUserId is authorized to open the queue with the specified options, regardless of whether the user identifier under which the application is running is authorized to do so.



MQPMO_RESOLVE_LOCAL_Q: Use this option to fill ResolvedQName in the MQPMO structure with the name of the local queue to which the message is put, and ResolvedQMgrName with the name of the local queue manager that hosts the local queue.

Put Queue open options for MQQueue.access():

MQOO_OUTPUT: Open the queue to put messages. The queue is opened for use with subsequent MQPUT calls.



MQOO_PASS_IDENTITY_CONTEXT: This allows the MQPMO_PASS_IDENTITY_CONTEXT option to be specified in the PutMsgOpts parameter when a message is put on a queue. This gives the message the identity context information from an input queue that was opened with the MQOO_SAVE_ALL_CONTEXT option. The MQOO_OUTPUT option must be specified.





MQOO_PASS_ALL_CONTEXT: This allows the MQPMO_PASS_ALL_CONTEXT option to be specified in the PutMsgOpts parameter when a message is put on a queue. This gives the message the identity and origin context information from an input queue that was opened with the MQOO_SAVE_ALL_CONTEXT option. This option implies MQOO_PASS_IDENTITY_CONTEXT, which need not therefore be specified. The MQOO_OUTPUT option must be specified.



MQOO_SET_ALL_CONTEXT: This allows the MQPMO_SET_ALL_CONTEXT option to be specified in the PutMsgOpts parameter when a message is put on a queue. This gives the message the identity and origin context information contained in the MsgDesc parameter specified on the MQPUT or MQPUT1() call. The MQOO_OUTPUT option must be specified.



MQOO_SET_IDENTITY_CONTEXT: This allows the MQPMO_SET_IDENTITY_CONTEXT option to be specified in the PutMsgOpts parameter when a message is put on a queue. This gives the message the identity context information contained in the MsgDesc parameter specified on the MQPUT() or MQPUT1 call. This option implies MQOO_PASS_IDENTITY_CONTEXT, which need not therefore be specified. The MQOO_OUTPUT option must be specified.



MQOO_ALTERNATE_USER_AUTHORITY: The AlternateUserId field in the ObjDesc parameter contains a user identifier to use to validate this MQOPEN call. The call can succeed only if this AlternateUserId is authorized to open the object with the specified access options, regardless of whether the user identifier under which the application is running is authorized to do so.



MQOO_FAIL_IF_QUIESCING: The MQOPEN call fails if the queue manager is in quiescing state. This option is valid for all types of object.

MQMD.report: A report message is a message about another message. This field enables SOAtest, when sending the original message, to specify which report or response messages are required, whether the application message data is to be included in them, and how the message and correlation IDs in the report or reply are to be set. It comprises one or more constants from the MQC class. You may select one type from each of the following:



Exception: •

MQRO_EXCEPTION: A message channel agent generates this type of report when a message is sent to another queue manager and the message cannot be delivered to the specified destination queue. For example, the destination queue or an intermediate transmission queue might be full, or the message might be too big for the queue.



MQRO_EXCEPTION_WITH_DATA: This is the same as MQRO_EXCEPTION, except that the first 100 bytes of the application message data from the original message are included in the report message. If the original message contains one or more MQ header structures, they are included in the report message, in addition to the 100 bytes of application data.



MQC.MQRO_EXCEPTION_WITH_FULL_DATA: Exception reports with full data required. This is the same as MQRO_EXCEPTION, except that all the application message data from the original message is included in the report message.

Expiration: •

MQRO_EXPIRATION: This type of report is generated by the queue manager if the message is discarded before delivery to an application because its expiry time has passed. If this option is not set, no report message is generated if a message is discarded for this reason.







MQRO_EXPIRATION_WITH_DATA: This is the same as MQRO_EXPIRATION, except that the first 100 bytes of the application message data from the original message are included in the report message. If the original message contains one or more MQ header structures, they are included in the report message, in addition to the 100 bytes of application data.



MQRO_EXPIRATION_WITH_FULL_DATA: This is the same as MQRO_EXPIRATION, except that all the application message data from the original message is included in the report message.

Confirm on arrival: •

MQRO_COA: This type of report is generated by the queue manager that owns the destination queue when the message is placed on the destination queue. Message data from the original message is not included with the report message. If the message is put as part of a unit of work, and the destination queue is a local queue, the COA report message generated by the queue manager can be retrieved only if the unit of work is committed.



MQRO_COA_WITH_DATA: This is the same as MQRO_COA, except that the first 100 bytes of the application message data from the original message are included in the report message. If the original message contains one or more MQ header structures, they are included in the report message, in addition to the 100 bytes of application data.



MQRO_COA_WITH_FULL_DATA: This is the same as MQRO_COA, except that all the application message data from the original message is included in the report message.

Confirm on delivery: •

MQRO_COD: This type of report is generated by the queue manager when an application retrieves the message from the destination queue in a way that deletes the message from the queue. Message data from the original message is not included with the report message. If the message is retrieved as part of a unit of work, the report message is generated within the same unit of work, so that the report is not available until the unit of work is committed. If the unit of work is backed out, the report is not sent.



MQRO_COD_WITH_DATA: This is the same as MQRO_COD, except that the first 100 bytes of the application message data from the original message are included in the report message. If the original message contains one or more MQ header structures, they are included in the report message, in addition to the 100 bytes of application data.



MQRO_COD_WITH_FULL_DATA: This is the same as MQRO_COD, except that all the application message data from the original message is included in the report message.

You can specify how the message ID is generated for the report or reply message: •

MQRO_NEW_MSG_ID: This is the default action, and indicates that if a report or reply is generated as a result of this message, a new MsgId is generated for the report or reply message.



MQRO_PASS_MSG_ID: If a report or reply is generated as a result of this message, the MsgId of this message is copied to the MsgId of the report or reply message.

You can specify one of the following to control how to set the correlation ID of the report or reply message: •

MQRO_COPY_MSG_ID_TO_CORREL_ID: This is the default action, and indicates that if a report or reply is generated as a result of this message, the MsgId of this message is copied to the CorrelId of the report or reply message.



MQRO_PASS_CORREL_ID: If a report or reply is generated as a result of this message, the CorrelId of this message is copied to the CorrelId of the report or reply message.

You can specify one of the following to control the disposition of the original message when it cannot be delivered to the destination queue: •

MQRO_DEAD_LETTER_Q: This is the default action, and places the message on the dead-letter queue if the message cannot be delivered to the destination queue. An exception report message is generated, if one was requested by the sender.



MQRO_DISCARD_MSG: This discards the message if it cannot be delivered to the destination queue. An exception report message is generated, if one was requested by the sender.



MQRO_PASS_DISCARD_AND_EXPIRY: If this option is set on a message, and a report or reply is generated because of it, the message descriptor of the report inherits: •

MQRO_DISCARD_MSG if it was set.



The remaining expiry time of the message (if this is not an expiry report). If this is an expiry report the expiry time is set to 60 seconds.



Expiry: Enter the expiry time (in tenths of a second). It is set by the application which puts the message. After a message's expiry time has elapsed, it is eligible to be discarded by the queue manager. If the message specified one of the MQC.MQRO_EXPIRATION flags, then a report is generated when the message is discarded.



Correlation ID: Enter the correlation ID to use for the correlationID field in the message



Message sequence number: Enter the sequence number of logical message within group.



Reply queue manager name: Enter the name of the queue manager to which reply or report (response) messages can be sent.



Reply queue name: Enter the name of the queue to which a reply can be sent.



Put application name: Enter the name of the application that put the message.



Originating application data: Enter data about the originating application. This can be used by SOAtest to provide additional information about the origin of the message.



User ID: Enter the user ID. It is part of the identity of the message and identifies which user originated it.
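The following sketch (placeholder host, channel, queue, and queue manager names; not SOAtest's internal code) illustrates how the MQ Address fields and the Put Messages options above are typically applied through the WebSphere MQ classes for Java:

import com.ibm.mq.MQC;
import com.ibm.mq.MQEnvironment;
import com.ibm.mq.MQMessage;
import com.ibm.mq.MQPutMessageOptions;
import com.ibm.mq.MQQueue;
import com.ibm.mq.MQQueueManager;

public class MqPutExample {
    public static void main(String[] args) throws Exception {
        // MQ Address fields: Host, Port, Channel.
        MQEnvironment.hostname = "mqhost.example.com";
        MQEnvironment.port = 1414;
        MQEnvironment.channel = "SYSTEM.DEF.SVRCONN";

        MQQueueManager queueManager = new MQQueueManager("QM1");        // Queue Manager field
        int openOptions = MQC.MQOO_OUTPUT | MQC.MQOO_FAIL_IF_QUIESCING; // Put Queue open options
        MQQueue putQueue = queueManager.accessQueue("REQUEST.QUEUE", openOptions); // Put Queue field

        MQMessage message = new MQMessage();
        message.report = MQC.MQRO_EXCEPTION | MQC.MQRO_COPY_MSG_ID_TO_CORREL_ID; // MQMD.report
        message.expiry = 999;                       // Expiry (tenths of a second)
        message.replyToQueueName = "REPLY.QUEUE";   // Reply queue name
        message.writeString("<soap:Envelope/>");    // request payload

        MQPutMessageOptions pmo = new MQPutMessageOptions();
        pmo.options = MQC.MQPMO_NO_SYNCPOINT | MQC.MQPMO_NEW_MSG_ID;    // MQPutMessageOptions.options

        putQueue.put(message, pmo);
        putQueue.close();
        queueManager.disconnect();
    }
}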

Get Messages Options

The following options are available for Get Messages (a sketch showing how these options are applied through the MQ Java classes appears at the end of this section):
• MQGetMessageOptions.options: Options that control the action of the MQQueue.get() call that is internally invoked by SOAtest. Any or none of the following values can be specified.

MQGMO_WAIT: SOAtest waits until a suitable message arrives. The maximum time that SOAtest waits is specified in Wait Interval.



MQGMO_NO_WAIT: SOAtest does not wait if no suitable message is available. This is the opposite of the MQGMO_WAIT option, and is defined to aid program documentation.



MQGMO_SYNCPOINT: The request is to operate within the normal unit-of-work protocols. The message is marked as being unavailable to other applications, but it is deleted from the queue only when the unit of work is committed. The message is made available again if the unit of work is backed out.



MQGMO_NO_SYNCPOINT: The request is to operate outside the normal unit-of-work protocols. The message is deleted from the queue immediately (unless this is a browse request). The message cannot be made available again by backing out the unit of work.



MQGMO_BROWSE_FIRST: When a queue is opened with the MQOO_BROWSE option, a browse cursor is established, positioned logically before the first message on the queue. You can then use MQGET calls specifying the MQGMO_BROWSE_FIRST, MQGMO_BROWSE_NEXT, or MQGMO_BROWSE_MSG_UNDER_CURSOR option to retrieve messages from the queue nondestructively. The browse cursor marks the position, within the messages on the queue, from which the next MQGET call with MQGMO_BROWSE_NEXT searches for a suitable message.



MQGMO_BROWSE_NEXT: Advance the browse cursor to the next message on the queue that satisfies the selection criteria specified on the MQGET call. The message is returned to SOAtest, but remains on the queue. After a queue has been opened for browse, the first browse call using the handle has the same effect whether it specifies the MQGMO_BROWSE_FIRST or MQGMO_BROWSE_NEXT option.



MQGMO_BROWSE_MSG_UNDER_CURSOR: Retrieve the message pointed to by the browse cursor nondestructively, regardless of the MQMO_* options specified in the MatchOptions field in MQGMO. The message pointed to by the browse cursor is the one that was last retrieved using either the MQGMO_BROWSE_FIRST or the MQGMO_BROWSE_NEXT option. The call fails if neither of these calls have been issued for this queue since it was opened, or if the message that was under the browse cursor has since been retrieved destructively. The position of the browse cursor is not changed by this call.



MQGMO_MSG_UNDER_CURSOR: Retrieve the message pointed to by the browse cursor, regardless of the MQMO_* options specified in the MatchOptions field in MQGMO. The message is removed from the queue. The message pointed to by the browse cursor is the one that was last retrieved using either the MQGMO_BROWSE_FIRST or the MQGMO_BROWSE_NEXT option.



MQGMO_LOCK: Lock the message that is browsed, so that the message becomes invisible to any other handle open for the queue.



MQGMO_UNLOCK: Unlock a message. The message to be unlocked must have been previously locked by an MQGET call with the MQGMO_LOCK option. If there is no message locked for this handle, the call completes with MQRC_NO_MSG_LOCKED.



MQGMO_ACCEPT_TRUNCATED_MSG: If the message buffer is too small to hold the complete message, allow the MQGET call to fill the buffer with as much of the message as the buffer can hold.



MQGMO_FAIL_IF_QUIESCING: Force the MQGET call to fail if the queue manager is in the quiescing state. On z/OS, this option also forces the MQGET call to fail if the connection (for a CICS or IMS application) is in the quiescing state.



MQGMO_CONVERT: Requests that the message data be converted to conform to the characterSet and encoding attributes of MQMessage before the data is copied into the message buffer.





MQGetMessageOptions.matchOptions: Selection criteria which determine which message is retrieved. The following match options can be set:

MQMO_MATCH_CORREL_ID: The message to be retrieved must have a correlation identifier that matches the value of the CorrelId field in the MsgDesc parameter of the MQGET call. This match is in addition to any other matches that might apply (for example, the message identifier).



MQMO_MATCH_GROUP_ID: The message to be retrieved must have a group identifier that matches the value of the GroupId field in the MsgDesc parameter of the MQGET call. This match is in addition to any other matches that might apply (for example, the correlation identifier).



MQMO_MATCH_MSG_ID: The message to be retrieved must have a message identifier that matches the value of the MsgId field in the MsgDesc parameter of the MQGET call. This match is in addition to any other matches that might apply (for example, the correlation identifier).



MQMO_MATCH_MSG_SEQ_NUMBER: The message to be retrieved must have a message sequence number that matches the value of the MsgSeqNumber field in the MsgDesc parameter of the MQGET call. This match is in addition to any other matches that might apply (for example, the group identifier).



MQMO_NONE: Do not use matches in selecting the message to be returned. All messages on the queue are eligible for retrieval. MQMO_NONE aids program documentation. It is not intended that this option be used with any other MQMO_* option, but as its value is zero, such use cannot be detected.

Get Queue open options for MQQueue.access():

MQOO_BROWSE: Open the queue to browse messages. The queue is opened for use with subsequent MQGET calls with one of the following options: MQGMO_BROWSE_FIRST, MQGMO_BROWSE_NEXT and MQGMO_BROWSE_MSG_UNDER_CURSOR. This is allowed even if the queue is currently open for MQOO_INPUT_EXCLUSIVE. An MQOPEN call with the MQOO_BROWSE option establishes a browse cursor, and positions it logically before the first message on the queue.



MQOO_INPUT_AS_Q_DEF: Open the queue to get messages using the queuedefined default. The queue is opened for use with subsequent MQGET calls. The type of access is either shared or exclusive, depending on the value of the DefInputOpenOption queue attribute.



MQOO_INPUT_SHARED: Open the queue to get messages with shared access. The queue is opened for use with subsequent MQGET calls. The call can succeed if the queue is currently open by this or another application with MQOO_INPUT_SHARED, but fails with reason code MQRC_OBJECT_IN_USE if the queue is currently open with MQOO_INPUT_EXCLUSIVE.



MQOO_INPUT_EXCLUSIVE: Open the queue to get messages with exclusive access. The queue is opened for use with subsequent MQGET calls. The call fails with reason code MQRC_OBJECT_IN_USE if the queue is currently open by this or another application for input of any type.



MQOO_ALTERNATE_USER_AUTHORITY: The AlternateUserId field in the ObjDesc parameter contains a user identifier to use to validate this MQOPEN call. The call can succeed only if this AlternateUserId is authorized to open the object with the specified access options, regardless of whether the user identifier under which SOAtest is running is authorized to do so.


MQOO_FAIL_IF_QUIESCING: The MQOPEN call fails if the queue manager is in quiescing state. This option is valid for all types of object.

Wait Interval: Enter the maximum time (in milliseconds) that an MQQueue.get() call waits for a suitable message to arrive. It is used in conjunction with MQC.MQGMO_WAIT and has no effect if MQGMO_WAIT is not selected. The value -1 is equivalent to having MQGMO_NO_WAIT selected.
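A corresponding sketch for the Get Messages side (again with placeholder names, not SOAtest code) shows how the get options, match options, and wait interval come together in the MQ Java API:

import com.ibm.mq.MQC;
import com.ibm.mq.MQGetMessageOptions;
import com.ibm.mq.MQMessage;
import com.ibm.mq.MQQueue;
import com.ibm.mq.MQQueueManager;

public class MqGetExample {
    // Retrieves the reply whose CorrelId matches the request's MsgId.
    static String getReply(MQQueueManager queueManager, byte[] requestMessageId) throws Exception {
        int openOptions = MQC.MQOO_INPUT_AS_Q_DEF | MQC.MQOO_FAIL_IF_QUIESCING; // Get Queue open options
        MQQueue getQueue = queueManager.accessQueue("REPLY.QUEUE", openOptions); // Get Queue field

        MQGetMessageOptions gmo = new MQGetMessageOptions();
        gmo.options = MQC.MQGMO_WAIT | MQC.MQGMO_CONVERT;    // MQGetMessageOptions.options
        gmo.matchOptions = MQC.MQMO_MATCH_CORREL_ID;         // MQGetMessageOptions.matchOptions
        gmo.waitInterval = 30000;                            // Wait Interval (milliseconds)

        MQMessage reply = new MQMessage();
        reply.correlationId = requestMessageId;              // value to match against

        getQueue.get(reply, gmo);                            // blocks up to waitInterval
        String body = reply.readString(reply.getDataLength());
        getQueue.close();
        return body;
    }
}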

Queue Manager Options

The following options are available for Queue Manager Options:
• MQQueueManager binding options: Creates a connection to the named queue manager specifying binding options.
  • MQCNO_FASTPATH_BINDING: This option causes SOAtest and the local-queue-manager agent to be part of the same unit of execution. This is in contrast to the normal method of binding, where SOAtest and the local-queue-manager agent run in separate units of execution.
  • MQCNO_STANDARD_BINDING: This connection option causes SOAtest and the local-queue-manager agent (the component that manages queuing operations) to run in separate units of execution (generally, in separate processes). This arrangement maintains the integrity of the queue manager; that is, it protects the queue manager from errant programs.
  • MQCNO_SHARED_BINDING: This connection option causes SOAtest and the local-queue-manager agent (the component that manages queuing operations) to run in separate units of execution (generally, in separate processes). This arrangement maintains the integrity of the queue manager; that is, it protects the queue manager from errant programs. However, some resources are shared between SOAtest and the local-queue-manager agent.
  • MQCNO_ISOLATED_BINDING: This option causes SOAtest and the local-queue-manager agent (the component that manages queuing operations) to run in separate units of execution (generally, in separate processes). This arrangement maintains the integrity of the queue manager; that is, it protects the queue manager from errant programs. The SOAtest process and the local-queue-manager agent are isolated from each other in that they do not share resources.

SSL

The following options are available for SSL:
• CipherSuite: Specifies the CipherSuite to use for the SSL connection on the specified MQ Channel. To determine which CipherSuite to select based on the CipherSpec, refer to the following: http://publib.boulder.ibm.com/infocenter/wmqv6/v6r0/index.jsp?topic=/com.ibm.mq.csqzaw.doc/jsslcs.htm



Trust Store: Specifies the Trust Store to be used for Server-Side SSL (authentication of the Queue Manager by the client).



Key Store: Specifies the Key Store to be used for Client-Side SSL (authentication of the client by the Queue Manager).



Key Store Password: Specifies the Key Store password.



Peer Name: (Optional) Verifies that the certificate presented by the Queue Manager matches the criteria specified with the Peer Name parameter. A server certificate will match this parameter with the Distinguished Name (DN) of the certificate presented by the Queue Manager. For more information on DN, refer to the following: http://publibfp.boulder.ibm.com/epubs/html/csqzas01/csqzas0134.htm

If the MQ Channel does not require the client to authenticate itself, then the Key Store and Key Store Password do not need to be provided. Once an SSL connection is attempted by specifying a CipherSuite and running the test, the trust store, key store, and key store passwords cannot be changed. If they are changed, SOAtest needs to be restarted before changes will take effect. A sketch of the corresponding client-side SSL settings follows.
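For reference, a client-side SSL setup along these lines (a sketch with placeholder values; the exact CipherSuite, store paths, and peer name depend on your environment) is typically expressed as follows before the queue manager connection is created:

import com.ibm.mq.MQEnvironment;

public class MqSslSetupExample {
    static void configureSsl() {
        // Trust Store / Key Store / Key Store Password map to the standard JSSE properties.
        System.setProperty("javax.net.ssl.trustStore", "/path/to/truststore.jks");
        System.setProperty("javax.net.ssl.keyStore", "/path/to/keystore.jks");
        System.setProperty("javax.net.ssl.keyStorePassword", "changeit");

        // CipherSuite and Peer Name map to the MQEnvironment SSL settings.
        MQEnvironment.sslCipherSuite = "SSL_RSA_WITH_3DES_EDE_CBC_SHA";
        MQEnvironment.sslPeerName = "CN=QMGR.*, OU=IBM, C=GB";
    }
}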

Scripting Hook Options

The Scripting Hook options allow you to customize MQ properties using a scripting language such as Python, Java, or JavaScript. If you need more information on using SOAtest's scripting utility, please refer to the Scripting section of the tutorial. For a list of scripting APIs, go to Help> Extensibility API in the SOAtest GUI. The following are the scripting access keys:
• QueueManager – mqManager
• GetQueue – mqGetQue
• PutQueue – mqPutQue
• PutMessage – mqPutMessage
• GetMessage – mqGetMessage
• PutMessageOptions – mqPutMessageOptions
• GetMessageOptions – mqGetMessageOptions

For example, if you would like to change the Expiry time for the put message to 999:

from com.ibm.mq import *

def changeExpiry(context):
    putMessage = context.get("mqPutMessage")
    putMessage.expiry = 999

Once you are done running the test with the above script, note that the Expiry field in the Traffic header has changed to 999. Note: Any options set in the Add MQ Hook tab override and take precedence over any options set in the other tabs.

Interpreting WebSphere MQ Error Messages

When a failure occurs, MQ returns a reason code for the failure. SOAtest error messages report these same reason codes so that users can interpret them. For a list of MQ reason codes and their meanings, please refer to the IBM WebSphere Web site at: http://publib.boulder.ibm.com/infocenter/wmqv6/v6r0/topic/com.ibm.mq.amqzao.doc/csq05rea3.htm

Using TIBCO Rendezvous

TIBCO Rendezvous technology provides a message bus that simplifies the process of exchanging data among large scale networks. It offers an efficient, yet simple way for applications to communicate with one another. With its increasing popularity, more and more corporations are planning to integrate TIBRV as their main communication link among applications. SOAtest enables users to test Web services that use TIBCO Rendezvous; functional tests are generated automatically from the WSDL, and users have the option to choose from a variety of available transports. The following documentation provides step-by-step instructions on how to configure SOAtest to run with TIBRV.

Configuring TIBCO for Use in SOAtest

Before configuring TIBCO protocol properties, you must have TIBRV installed on your machine. Once TIBRV is installed, you must add the TIBRV jar file to SOAtest's classpath through SOAtest's System Properties. For information on obtaining a license or downloading TIBCO Rendezvous, please visit http://www.tibco.com/software/messaging/rendezvous/default.jsp.
1. Run the installation file.
• On the component selection screen, make sure the Run Time Component is selected.
• There are two protocols available (TRDP and PGM); be sure to select the one that matches the protocol that the daemon uses. If you are unsure which protocol to select, please consult your system administrator.
2. Once TIBCO Rendezvous is installed on your machine, add the appropriate JAR files to SOAtest's classpath. To do so, complete the following:
a. Choose SOAtest> Preferences.
b. Open the System Properties page.
c. Click the Add JARS button and select the necessary JAR files to be added. In this case, add <TIBRV>\lib\tibrvj.jar to the classpath.

Configuring SOAtest to Use TIBCO Rendezvous

To configure SOAtest to use TIBCO Rendezvous, complete the following:
1. Complete the WSDL test creation wizard as normal (see “Creating Tests From a WSDL”, page 385 for details).
2. Double-click the node for the test that will be using TIBCO.
3. In the right GUI panel, select the Transport tab and select TIBCO from the Transport drop-down menu.
4. Configure the following options:

Message Exchange Pattern: Specifies whether or not SOAtest receives a response. If Get Response is selected, SOAtest sends a message and receives a response. If Get Response is not selected, SOAtest sends a one-way message and does not receive a response.



Daemon: Specifies the server name or the server’s IP followed by a colon (:) and the port number (e.g. 10.10.32.34:7500 or host_name:7500).



Network: Specifies the network where the transport objects are located. The network parameter consists of up to three parts separated by a semicolon (;) in the form of network; multicast groups; send address (e.g. lan0; 224.1.1.1; 244.1.1.6). For more information, please refer to the Network Selection section of the TIBCO Rendezvous documentation.



Service: Specifies TIBCO’s service name.



Send Subject: Specifies the subject name that the TIBCO daemon listens for.



Send Field Name: Specifies the field name from which the SOAP message is extracted.



Message Delivery: Indicates what type of messages SOAtest should look for on the bus. This should correspond to the delivery type established by the message sender.



Time limit: Applies in Certified delivery mode only; it indicates to the outgoing TIBCO message the maximum time allowed before the receiver gets the message. SOAtest provides this to the TIBCO API using com.tibco.tibrv.TibrvCmMsg.setTimeLimit(double).



Create inbox for unicast (point-to-point) delivery: When this option is selected, a random subject name is created and SOAtest automatically listens for a response message on that subject. This is a point-to-point messaging pattern in TIBCO RV (synchronous request/response). When this option is unselected, you must specify a subject name.



Reply Subject: Specifies the subject name that the TIBCO daemon listens for.



Reply Field Name: Specifies the field name from which the SOAP message is extracted.

Notice that when point-to-point is selected for receiving, it automatically constrains the delivery mode in SOAtest to the same mode as the one set in the send area, because in that pattern you send and receive messages using the same delivery mode. However, when the point-to-point check box is not selected, you can select how the Reply message should be delivered; in that case the pattern is asynchronous, and a response message could be expected to arrive in certified or reliable mode, depending on how your TIBCO application is set up. Once you are done setting up the TIBCO properties, you should be able to send and receive messages using SOAtest.
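As a rough illustration of what these settings correspond to in the TIBCO Rendezvous Java API (a sketch with placeholder subject, field, and daemon values; not SOAtest's internal code):

import com.tibco.tibrv.Tibrv;
import com.tibco.tibrv.TibrvMsg;
import com.tibco.tibrv.TibrvRvdTransport;

public class TibrvRequestExample {
    public static void main(String[] args) throws Exception {
        Tibrv.open(Tibrv.IMPL_NATIVE);
        // Service, Network, and Daemon fields map to the transport parameters.
        TibrvRvdTransport transport =
            new TibrvRvdTransport("7500", null, "tcp:host_name:7500");

        TibrvMsg request = new TibrvMsg();
        request.setSendSubject("SOATEST.REQUEST");      // Send Subject
        request.add("SOAP_FIELD", "<soap:Envelope/>");  // Send Field Name

        // Point-to-point (inbox) request/response with a 30-second timeout;
        // a null reply means the timeout elapsed.
        TibrvMsg reply = transport.sendRequest(request, 30.0);
        System.out.println(reply != null ? reply.toString() : "no reply");

        transport.destroy();
        Tibrv.close();
    }
}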

Using RMI

SOAtest allows users to perform Remote Method Invocation as long as the following conditions apply:
• The machine that is to invoke the RMI has all required interfaces as well as the necessary class files.
• All required jar files or class files are included in SOAtest's classpath:
  a. Choose SOAtest> Preferences.
  b. Open the System Properties page.
  c. Click the Add JARS button and select the necessary JAR files to be added, or click the Add Class Folder button and select the folder where the class files are located.
• The remote method returns a string (unless it is a one-way message).
  • If you are using the RMI transport within a SOAP Client, the returned string is expected to be a SOAP message. There are no restrictions on the returned string for a Messaging Client.
  • To specify a one-way message, make sure the Get Response checkbox is unchecked within the RMI properties.
• The remote interface consists of a method that takes one, and only one, string parameter (which will be the SOAP request); see the sketch below.
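For illustration only (the interface name and method name are hypothetical), a remote interface that satisfies these conditions might look like the following; the client-side stub and this interface must be on SOAtest's classpath:

import java.rmi.Remote;
import java.rmi.RemoteException;

// A remote interface SOAtest can invoke: exactly one String parameter in, a String back.
public interface SoapInvoker extends Remote {
    // Receives the SOAP (or plain XML/text) request and returns the response.
    String invokeSOAP(String request) throws RemoteException;
}

With such an interface registered at, for example, //goldfish:1717/Soatest, the Host, Port, Binding, Remote Interface, and Method Name fields described below identify the object and method to call.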

Configuring SOAtest to Use RMI

To configure SOAtest to use RMI, complete the following:
1. Complete the WSDL test creation wizard as normal (see “Creating Tests From a WSDL”, page 385 for details).
2. Double-click the node for the test that will be using RMI.
3. In the right GUI panel, select the Transport tab and select RMI from the Transport drop-down menu.
4. Configure the following options:

Host: Specifies the machine name or IP that hosts the RMI service.



Port: Specifies the port number for the RMI service.



Binding: Specifies the RMI service binding. Binding in this case is the RMI registry name under which an application has registered its remote object. For example, a lookup registry may resemble the following: //goldfish:1717/Soatest, where host = goldfish, port = 1717, and binding/registry = Soatest.

Remote Interface: Specifies the client-side remote interface (the remote interface should be included in the classpath).



Method Name: Specifies the method within the remote interface that invokes the RMI. Enter the name only (e.g. invokeSOAP). It is assumed that the method takes a String SOAP request and returns a String SOAP response if you are using the SOAP Client.

Using SMTP

In the cases where you may want to send SOAP messages to a specific email account (for example, to archive SOAP messages or to troubleshoot errors), SMTP is the best transport protocol for the task.

Configuring SOAtest to Use SMTP

To configure SOAtest to use the SMTP protocol, complete the following:
1. Complete the WSDL test creation wizard as normal (see “Creating Tests From a WSDL”, page 385 for details).
2. Double-click the node for the test that will be using SMTP.
3. In the right GUI panel, select the Transport tab and select SMTP from the Transport drop-down menu.
4. Configure the following options:

From: Specifies the email address from which the message is sent.



To: Specifies the email address to which the message is sent. For multiple receivers, separate each email address with a comma (e.g. [email protected], [email protected]).



Subject: Specifies the subject of the email being sent.



SMTP Headers: You can configure SMTP headers by clicking the Add button and then entering the Name and Value for the header.

Once the test is set up, the email is automatically sent every time the test is run. It can also be helpful to have SOAtest alert users via email of specific test failures by combining test flow logic with a Messaging Client that uses SMTP. To do so, complete the following:
1. Create a Messaging Client by right-clicking the Test Suite node and selecting Add Test> Standard Test> New Tool> Messaging Client.
2. In the Messaging Client tool, select SMTP as the transport and fill in the receiver's email information within the Transport Properties.
3. In the text area of the Messaging Client, enter a message (to alert the users).
4. Configure the test flow logic by selecting the test suite node and clicking the Test Flow Logic button.
5. Create a test dependency for the Messaging Client (e.g. run the Messaging Client only when Test 2 succeeds).

Testing a CORBA Server

This topic covers basic steps and operations to test a CORBA (Common Object Request Broker Architecture) server. There are several ways to ensure the correct functionality of a CORBA server; the following are a few examples and simple exercises to help you better understand how SOAtest can simplify the process of server testing. In this section, different scenarios show how SOAtest can be incorporated into the testing of non-SOAP servers.

Scenario 1: CORBA Client Has Not Yet Been Implemented

Note: Continue to Scenario 2 if you already have a Java client created.

To use the interfaces/IDL offered by the server, you need to generate the Java stubs on the client side. In this section we cover simple IDL to Java conversions. Under the CORBA folder in Parasoft/SOAtest/[SOAtest version number]/build/examples, a sample Calculator.idl file is included for the following exercise. In order to use IDLJ, make sure you have the J2SDK installed and set the PATH variable so you can access the J2SDK's executables from any directory.

To convert IDL to Java using IDLJ, complete the following:
1. In a command prompt, change the current directory to the folder that contains Calculator.idl (in this example, C:\Program Files\Parasoft\SOAtest\[SOAtest version number]\build\examples\CORBA).
2. Type "idlj -pkgTranslate Persistent examples.CORBA -fall Calculator.idl" to automatically generate packages with correct paths.
3. Compile the Java files by typing: javac /examples/CORBA/*.java

Now you have the necessary class files needed to communicate with the server. Please continue to Scenario 2 to interface SOAtest with an existing Java client. For more information on IDLJ, please visit Sun's Java website: http://java.sun.com/j2se/1.5.0/docs/guide/rmi-iiop/toJavaPortableUG.html

Scenario 2: Interfacing SOAtest with an Existing Java Client

In this section we demonstrate how to invoke Java services from a CORBA server by using SOAtest's Extension tool.
1. Create an Extension tool by right-clicking the test suite and selecting Add Test> Standard Test> New Tool> Extension.
2. Select the Extension tool node. In the right GUI panel, select Python, JavaScript, or Java from the Language drop-down menu to access your CORBA Java client. For example, for Python you can enter something similar to the following in the Text field:

# In our example, examples.CORBA.PersistentClient is our CORBA Java Client
from examples.CORBA import *
from java.lang import *

def foo(input, context):
    # Initialize the client by providing the location of the server,
    # the port number, and the service name
    client = PersistentClient("goldfish.parasoft.com", 2222, "GoldfishCorbaServer")
    # Make the actual method invocation on the service "add(x, y)"
    return client.add(3, 5)

3. Right-click within the Text field and select Evaluate from the shortcut menu to make sure the syntax is correct. If the syntax is correct, the name of the function should be auto-populated into the Method drop-down menu: foo().
4. Right-click the Extension tool node and select Add Return Value Output> Existing Output> Edit to show the returned values after execution of the test.
5. Run the test. If the test succeeds, the return values should appear in the right GUI panel.
6. If the test fails and returns a NullPointerException in the Edit view, check the CORBA server and make sure the server is listening on the designated port and that the service is up and running.

Scenario 3: Interfacing SOAtest with an Existing non-Java Client

In this section we demonstrate how to invoke non-Java services from a CORBA server by using SOAtest's External tool.
1. Create an External tool by right-clicking the test suite and selecting Add Test> Standard Test> New Tool> External Tool.
2. Select the External tool node and change its name to CORBA Client.
3. Click the Browse button and select the path to the CORBA client executable.
4. If the CORBA client takes parameters, add each argument by clicking the Add button. A new line is generated, allowing you to input a flag and argument associated with the executable.
5. Double-click the generated line to enter the flag and argument. A new dialog box will pop up; change the name and argument accordingly.
6. If you wish to use a parameterized value, select Parameterized in the Value drop-down menu, select the variable name in the Variable drop-down menu, and then click OK.
7. In the right GUI panel, select the Keep output checkbox to keep the returned values after each test run.
8. Right-click the External tool node and select Add Return Value Output> Existing Output> Edit to show the returned values after execution of the test.
9. Run the test. If the test succeeds, the return values should appear in the right GUI panel.
10. If the test fails and returns a NullPointerException in the Edit view, check the CORBA server and make sure the server is listening on the designated port and that the service is up and running.

Using .NET WCF TCP

The .NET WCF TCP transport option allows the SOAP Client tool to invoke Windows Communication Foundation (.NET 3.0 or .NET 3.5) Web services that use the TCP transport. SOAtest is able to understand WCF's system-provided NetTcpBinding and custom bindings that use the TcpTransportBindingElement. Microsoft describes the NetTcpBinding as "a secure and optimized binding suitable for cross-machine communication between WCF applications." For more information, see the following:
• System-provided bindings
• Custom bindings
• NetTcpBinding
• TcpTransportBindingElement

Configuring .NET WCF TCP for Use in SOAtest

After selecting .NET WCF TCP from the Transport drop-down menu within the Transport tab of a SOAP Client tool, the following options display in the left pane of the Transport tab:
• Endpoint
• Configuration File
• Security

Endpoint Options

The endpoint is the URL of the web service endpoint. By default, SOAP Client endpoints are set to the endpoint defined in the WSDL. The endpoint URL for WCF Web services using the TCP transport begins with "net.tcp://" as opposed to "http://". Besides WSDL, there are three other endpoint options:
• Default: When this option is selected, the endpoint will be the endpoint defined in the Test Suite that contains the SOAP Client test. To see the GUI for the endpoint defined in the Test Suite, double-click the Test Suite node and open the SOAP Client Options tab.
• Custom: Allows you to set any custom endpoint.
• UDDI serviceKey: Specifies the UDDI serviceKey used to reference this server endpoint in the UDDI registry specified in the Preferences panel's WSDL/UDDI tab.

Configuration File Options

A WCF configuration file is an XML file that is used to define configuration settings for both WCF clients and services. The SOAP Client acts as a WCF client and can be configured using the settings from a WCF configuration file. If the Web service being tested has a WSDL document or metadata endpoint, a WCF client configuration file can be automatically generated by the Microsoft Service Model Metadata Tool. The Microsoft Service Configuration Editor can also be used to create and edit WCF configuration files using a graphical user interface. These tools are widely available as part of the Microsoft Windows SDK for Windows Vista.

The SOAP Client is capable of automatically determining endpoint binding information based on the policies defined in the WSDL, or based on the endpoint binding configuration defined in a WCF client configuration file. When the Constrain to WSDL SOAP Client option is selected, the endpoint's binding is automatically determined based on the policies defined in the WSDL. When Constrain to WSDL is unchecked, SOAtest will look in the specified WCF client configuration file for the endpoint's binding settings. The SOAP Client can also use the endpoint behavior information defined in the WCF client configuration file. An endpoint behavior section containing a clientCredentials element must be present if the service will be negotiating service credentials using certificates.

The following Configuration File options are available:

WCF Client Configuration File: Click the Browse button to select the desired configuration file.



Persist As Relative Path: Check this option if you want the path to this file to be saved as a path that is relative to the current configuration file. Enabling this option makes it easier to share tests across multiple machines. If this option is not enabled, the test suite will save the path to this file as an absolute path.



Open Service Configuration Editor: This option is only displayed if Microsoft's Service Configuration Editor is installed on your machine. This feature allows you to quickly open the editor to create a new service configuration file, or to view or edit the currently selected service configuration file.

Security Options

The Security options tab allows you to enter username and password information for various security credentials. In addition, the Open Certificate Manager option is available. Click Open Certificate Manager to open the Windows Certificate Manager, which allows you to manage any certificates needed for authentication when invoking your Web service. The WCF configuration file may have references to certificates shown by the Windows Certificate Manager.

Configuring the Request

When selecting the Request tab within the Project Configuration panel of the SOAP Client, you can configure the SOAP envelope for the request message. This SOAP envelope is configured before any transport and message transformations, such as encryption. .NET WCF will automatically apply the message and transport security when you run the test. The XML Signer and XML Encryption tools are not necessary and should not be chained to the SOAP Client test. Also, no security headers need to be added to the SOAP request. The only SOAP header that should be present is a WS-Addressing header, which SOAtest automatically adds if your test was created from a WSDL.

Configuring Transaction Support
SOAtest supports flowed transactions via the WS-Atomic Transactions protocol and MS OLE Transactions protocol for WCF web services. For more information, see “Using .NET WCF Flowed Transactions”, page 734.

Configuring Global .NET System Properties in SOAtest
Global properties in the .NET System.Net.ServicePointManager class can be configured by setting Java system properties on the SOAtest command line. The ServicePointManager.DefaultConnectionLimit property limits the number of HTTP keep-alive connections between SOAtest and a remote host. The default value for the DefaultConnectionLimit property is 2. If you are performing load testing and want more than two HTTP keep-alive connections, we recommend increasing this property.


For example, to set the DefaultConnectionLimit property to 50, you can pass "-J-DSystem.Net.ServicePointManager.DefaultConnectionLimit=50" as a command line argument when starting SOAtest. The supported ServicePointManager properties that can be configured on the SOAtest command line are listed below:

• System.Net.ServicePointManager.CheckCertificateRevocationList
• System.Net.ServicePointManager.DefaultConnectionLimit
• System.Net.ServicePointManager.DnsRefreshTimeout
• System.Net.ServicePointManager.EnableDnsRoundRobin
• System.Net.ServicePointManager.Expect100Continue
• System.Net.ServicePointManager.MaxServicePointIdleTime
• System.Net.ServicePointManager.MaxServicePoints
• System.Net.ServicePointManager.UseNagleAlgorithm

For more information about these properties, please see http://msdn.microsoft.com/en-us/library/system.net.servicepointmanager_properties.aspx.
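If you need to confirm that a property passed with the -J-D flag was actually picked up by the SOAtest JVM, a minimal Jython Extension tool sketch along the following lines can read it back (illustrative only; it reuses the DefaultConnectionLimit example from above and the SOAPUtil helper shown in the scripting examples later in this guide):

from java.lang import System
from soaptest.api import *

def checkProperty(input, context):
    # Returns the raw value of the Java system property set via -J-D on the
    # command line, or None if the property was not set.
    value = System.getProperty("System.Net.ServicePointManager.DefaultConnectionLimit")
    return SOAPUtil.getXMLFromString([str(value)])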


Using .NET WCF HTTP
The .NET WCF HTTP transport option allows the SOAP Client tool to invoke Windows Communication Foundation (.NET 3.0 or .NET 3.5) Web services that use the HTTP transport. SOAtest is able to understand WCF's system-provided BasicHttpBinding, WSHttpBinding, and custom bindings that use the HttpTransportBindingElement. The .NET WCF HTTP transport can also be used to invoke Java-based HTTP web services for testing interoperability with .NET WCF clients. For more information, see the following:

• System-provided bindings
• Custom bindings
• BasicHttpBinding
• WSHttpBinding
• HttpTransportBindingElement

Configuring .NET WCF HTTP for Use in SOAtest
After selecting .NET WCF HTTP from the Transport drop-down menu within the Transport tab of a SOAP Client tool, the following options display in the left pane of the Transport tab:

• Endpoint
• Configuration File
• Security

Endpoint Options
The endpoint is the URL of the web service endpoint. By default, SOAP Client endpoints are set to the endpoint defined in the WSDL. Besides WSDL, there are three other endpoint options:

• Default: When this option is selected, the endpoint will be the endpoint defined in the Test Suite that contains the SOAP Client test. To see the GUI for the endpoint defined in the Test Suite, click the Test Suite node and open the SOAP Client Options tab.
• Custom: Allows you to set any custom endpoint.
• UDDI serviceKey: Specifies the UDDI serviceKey used to reference this server endpoint in the UDDI registry specified in the Preferences panel’s WSDL/UDDI tab.

Configuration File Options
A WCF configuration file is an XML file that is used to define configuration settings for both WCF clients and services. The SOAP Client acts as a WCF client and can be configured using the settings from a WCF configuration file. If the Web service being tested has a WSDL document or metadata endpoint, a WCF client configuration file can be automatically generated by the Microsoft Service Model Metadata Tool. The Microsoft Service Configuration Editor can also be used to create and edit WCF configuration files using a graphical user interface. These tools are widely available as part of the Microsoft Windows SDK for Windows Vista.
The SOAP Client is capable of automatically determining endpoint binding information based on the policies defined in the WSDL, or based on the endpoint binding configuration defined in a WCF client configuration file. When the Constrain to WSDL SOAP Client option is selected, the endpoint's binding is automatically determined based on the policies defined in the WSDL. When Constrain to WSDL is unchecked, SOAtest looks in the specified WCF client configuration file for the endpoint's binding settings. The SOAP Client can also use the endpoint behavior information defined in the WCF client configuration file. An endpoint behavior section containing a clientCredentials element must be present if the service will be negotiating service credentials using certificates.
The following Configuration File options are available:

• WCF Client Configuration File: Click the Browse button to select the desired configuration file.
• Persist As Relative Path: Check this option if you want the path to this file to be saved as a path that is relative to the current configuration file. Enabling this option makes it easier to share tests across multiple machines. If this option is not enabled, the test suite will save the path to this file as an absolute path.
• Open Service Configuration Editor: This option is only displayed if Microsoft's Service Configuration Editor is installed on your machine. It allows you to quickly open the editor to create a new service configuration file, or to view or edit the currently selected service configuration file.

Security Options
The Security options tab allows you to enter username and password information for various security credentials. In addition, the Open Certificate Manager option is available. Click Open Certificate Manager to open the Windows Certificate Manager, which allows you to manage any certificates needed for authentication when invoking your Web service. The WCF configuration file may have references to certificates shown by the Windows Certificate Manager.


Configuring the Request
When selecting the Request tab within the Project Configuration panel of the SOAP Client, you can configure the SOAP envelope for the request message. This SOAP envelope is configured before any transport and message transformations, such as encryption. .NET WCF will automatically apply the message and transport security when you run the test. The XML Signer and XML Encryption tools are not necessary and should not be chained to the SOAP Client test. Also, no security headers need to be added to the SOAP request. The only SOAP header that should be present is a WS-Addressing header, which SOAtest automatically adds if your test was created from a WSDL.

Configuring Transaction Support
SOAtest supports flowed transactions via the WS-Atomic Transactions protocol and MS OLE Transactions protocol for WCF web services. For more information, see “Using .NET WCF Flowed Transactions”, page 734.

Configuring Global .NET System Properties in SOAtest
Global properties in the .NET System.Net.ServicePointManager class can be configured by setting Java system properties on the SOAtest command line. The ServicePointManager.DefaultConnectionLimit property limits the number of HTTP keep-alive connections between SOAtest and a remote host. The default value for the DefaultConnectionLimit property is 2. If you are performing load testing and want more than two HTTP keep-alive connections, we recommend increasing this property. For example, to set the DefaultConnectionLimit property to 50, you can pass "-J-DSystem.Net.ServicePointManager.DefaultConnectionLimit=50" as a command line argument when starting SOAtest. The supported ServicePointManager properties that can be configured on the SOAtest command line are listed below:

• System.Net.ServicePointManager.CheckCertificateRevocationList
• System.Net.ServicePointManager.DefaultConnectionLimit
• System.Net.ServicePointManager.DnsRefreshTimeout
• System.Net.ServicePointManager.EnableDnsRoundRobin
• System.Net.ServicePointManager.Expect100Continue
• System.Net.ServicePointManager.MaxServicePointIdleTime
• System.Net.ServicePointManager.MaxServicePoints
• System.Net.ServicePointManager.UseNagleAlgorithm

For more information about these properties, please see http://msdn.microsoft.com/en-us/library/system.net.servicepointmanager_properties.aspx.


Using .NET WCF Flowed Transactions
SOAtest is capable of initiating flowed transactions using WS-Atomic Transaction or the Microsoft OleTransactions protocol. Flowed transactions can be used when the service endpoint's binding configuration has transactionFlow="true" and either transactionProtocol="OleTransactions" or transactionProtocol="WSAtomicTransactionOctober2004". If the operation contract for the Web service uses the TransactionFlow attribute with the TransactionFlowOption.Allowed property, then enabling transactions in SOAtest is optional and does not require any extra test configuration. However, if the operation contract is using the TransactionFlowOption.Mandatory property, then transactions are required and the test suite must be configured to initiate a flowed transaction. For general information about WCF flowed transactions, see the following:

• WS Transaction Flow: http://msdn.microsoft.com/en-us/library/ms752261.aspx
• TransactionFlowAttribute: http://msdn.microsoft.com/en-us/library/system.servicemodel.transactionflowattribute.aspx
• TransactionFlowOption: http://msdn.microsoft.com/en-us/library/system.servicemodel.transactionflowoption.aspx

The above WS Transaction Flow MSDN article explains what is required to configure a WCF web service client to initiate a flowed transaction. The steps necessary to configure transactions in SOAtest are analogous to what is described in that article. We highly recommend reading the WS Transaction Flow article first in order to understand the basic concepts behind transactions in WCF.

Configuring MSDTC
The Microsoft Distributed Transaction Coordinator (MSDTC) must be configured correctly on the machine running SOAtest, as well as on the machine hosting the Web service. The MSDN WS Transaction Flow article has instructions for configuring MSDTC and for configuring the Windows firewall to allow DTC. Here are a few other articles that may be useful when configuring MSDTC:

• How to enable network DTC access in Windows Server 2003: http://support.microsoft.com/kb/817064/en-us
• How to troubleshoot MS DTC firewall issues: http://support.microsoft.com/kb/306843/en-us

Configuring the Test Suite
Invoking several Web service operations within the scope of a transaction requires configuring a TransactionScope object. Take the following C# code snippet:

CalculatorClient client = new CalculatorClient("transactionEnabledEndpoint");
using (TransactionScope tx = new TransactionScope(TransactionScopeOption.Required))
{
    client.add(1, 2);
    client.add(2, 3);
    tx.Complete();
}
client.Close();

In SOAtest, an equivalent test suite would be structured as follows:
• Extension Tool 1: Create Transaction Scope
  • XML Data Bank: propagation token
• SOAP Client 1: add(1,2)
• SOAP Client 2: add(2,3)
• Extension Tool 2: Complete Transaction

In the test suite above, Extension Tool 1 is needed to create the TransactionScope object and to get the propagation token for the current transaction. The propagation token is then put into an XML Data Bank because it will be needed later by the SOAP Client tests (described below). Here is sample Python code that can be used for Extension Tool 1:

from webtool.soap.wcf import *
from soaptest.api import *

def createTransaction(input, context):
    # Create a TransactionScope (Required scope option, Serializable isolation,
    # 60000 ms timeout) and remember its ID so Extension Tool 2 can complete it later.
    id = WcfUtil.createTransactionScope(WcfUtil.TRANSACTION_SCOPE_OPTION_REQUIRED, WcfUtil.ISOLATION_LEVEL_SERIALIZABLE, 60000)
    context.put("MY_TRANSACTION_ID", id)
    # Return the propagation token as XML so the chained XML Data Bank can capture it.
    token = WcfUtil.getCurrentPropagationToken()
    return SOAPUtil.getXMLFromString([token])

Extension Tool 2 in the test suite is used to complete the transaction; it does the equivalent of tx.Complete() in the above C# code snippet. Here is the Python code for Extension Tool 2:

from webtool.soap.wcf import *

def endTransaction(input, context):
    # Look up the TransactionScope created by Extension Tool 1 and mark it complete.
    id = context.get("MY_TRANSACTION_ID")
    WcfUtil.completeTransactionScope(id)

The WcfUtil methods and corresponding .NET API are described below:

int webtool.soap.wcf.WcfUtil.createTransactionScope(int scopeOption, int isolationLevel, int scopeTimeout)
Parameters
scopeOption
• See TransactionScopeOption Enumeration: http://msdn.microsoft.com/en-us/library/system.transactions.transactionscopeoption.aspx
• Can be one of the following:
  • WcfUtil.TRANSACTION_SCOPE_OPTION_REQUIRED
  • WcfUtil.TRANSACTION_SCOPE_OPTION_REQUIRES_NEW
  • WcfUtil.TRANSACTION_SCOPE_OPTION_SUPPRESS
isolationLevel
• See IsolationLevel Enumeration: http://msdn.microsoft.com/en-us/library/system.transactions.isolationlevel.aspx
• Can be one of the following:
  • WcfUtil.ISOLATION_LEVEL_SERIALIZABLE
  • WcfUtil.ISOLATION_LEVEL_REPEATABLE_READ
  • WcfUtil.ISOLATION_LEVEL_READ_COMMITTED
  • WcfUtil.ISOLATION_LEVEL_READ_UNCOMMITTED
  • WcfUtil.ISOLATION_LEVEL_SNAPSHOT
  • WcfUtil.ISOLATION_LEVEL_CHAOS
  • WcfUtil.ISOLATION_LEVEL_UNSPECIFIED
scopeTimeout
• The timeout of the TransactionScope, in milliseconds.
Returns
An ID that can be used to reference the TransactionScope.

String WcfUtil.getCurrentPropagationToken()
Returns
The Base64 encoding of the propagation token for the current transaction.

WcfUtil.completeTransactionScope(int transactionScopeRef)
Parameters
transactionScopeRef
• The ID returned by WcfUtil.createTransactionScope()
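For reference, the following minimal Jython sketch pulls these calls together with a different set of parameter values (the scope option, isolation level, and 30-second timeout shown here are arbitrary, illustrative choices; in a real test suite the create and complete calls live in separate Extension tools, as shown earlier):

from webtool.soap.wcf import *
from soaptest.api import *

def demoTransactionScope(input, context):
    # Always start a new transaction, with read-committed isolation and a 30000 ms timeout.
    id = WcfUtil.createTransactionScope(WcfUtil.TRANSACTION_SCOPE_OPTION_REQUIRES_NEW,
                                        WcfUtil.ISOLATION_LEVEL_READ_COMMITTED,
                                        30000)
    # Base64-encoded propagation token for the SOAP headers described in the next section.
    token = WcfUtil.getCurrentPropagationToken()
    # In a real suite this call belongs in a later Extension tool, after the SOAP Clients
    # participating in the transaction have executed.
    WcfUtil.completeTransactionScope(id)
    return SOAPUtil.getXMLFromString([token])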

Configuring Transaction Headers
The SOAP Client tests each need to have a SOAP header that contains the propagation token. This propagation token comes from the XML Data Bank chained to the first Extension Tool test that was described earlier. The type of SOAP header that is needed depends on the transaction protocol.

OleTransaction Headers
For OleTransactions, an OleTxTransaction SOAP header is needed. For convenience, here is a sample OleTxTransaction SOAP header:

<OleTxTransaction b:Expires="59904" s:mustUnderstand="1"
    xmlns="http://schemas.microsoft.com/ws/2006/02/tx/oletx"
    xmlns:b="http://schemas.xmlsoap.org/ws/2004/10/wscoor"
    xmlns:s="http://www.w3.org/2003/05/soap-envelope">
  <PropagationToken></PropagationToken>
</OleTxTransaction>

To add the above header to SOAP Client 1, complete the following: 1. Select the SOAP Header tab from within a SOAP Client’s Request tab.


2. Click the Add button within the SOAP Header tab. The Add New SOAP Header dialog displays.

3. Select Custom from the Add New SOAP Header dialog and click the OK button. A new row is added to the SOAP Header tab.
4. Double-click the new row in the SOAP Header tab. An Edit dialog displays.


5. Paste the following sample OleTxTransaction SOAP header into the Literal/XML text field of the Edit dialog:

<OleTxTransaction b:Expires="59904" s:mustUnderstand="1"
    xmlns="http://schemas.microsoft.com/ws/2006/02/tx/oletx"
    xmlns:b="http://schemas.xmlsoap.org/ws/2004/10/wscoor"
    xmlns:s="http://www.w3.org/2003/05/soap-envelope">
  <PropagationToken></PropagationToken>
</OleTxTransaction>

6. Select Form XML from the Views drop-down box of the Edit dialog, and click Yes to override Form XML values with Literal XML values.
7. In the Form XML view, select the PropagationToken element, then select the Parameterized option for the element's value. This will allow you to choose the XML Data Bank column containing the propagation token.
8. Click the OK button.

WS-Atomic Transaction Headers
For WS-Atomic Transaction, a CoordinationContext SOAP header is needed instead. Similar to the OleTxTransaction header, the PropagationToken element must be parameterized to include the propagation token from the XML Data Bank. For convenience, here is a sample CoordinationContext SOAP header:

<CoordinationContext s:mustUnderstand="1"
    xmlns="http://schemas.xmlsoap.org/ws/2004/10/wscoor"
    xmlns:a="http://www.w3.org/2005/08/addressing"
    xmlns:mstx="http://schemas.microsoft.com/ws/2006/02/transactions"
    xmlns:s="http://www.w3.org/2003/05/soap-envelope">
  <wscoor:Identifier xmlns:wscoor="http://schemas.xmlsoap.org/ws/2004/10/wscoor"></wscoor:Identifier>
  <Expires>59904</Expires>
  <CoordinationType>http://schemas.xmlsoap.org/ws/2004/10/wsat</CoordinationType>
  <RegistrationService>
    <Address xmlns="http://schemas.xmlsoap.org/ws/2004/08/addressing">https://hostname/WsatService/Registration/Coordinator/Disabled/</Address>
    <ReferenceParameters xmlns="http://schemas.xmlsoap.org/ws/2004/08/addressing">
      <mstx:RegisterInfo>
        <mstx:LocalTransactionId></mstx:LocalTransactionId>
      </mstx:RegisterInfo>
    </ReferenceParameters>
  </RegistrationService>
  <mstx:IsolationLevel>0</mstx:IsolationLevel>
  <mstx:LocalTransactionId></mstx:LocalTransactionId>
  <PropagationToken xmlns="http://schemas.microsoft.com/ws/2006/02/tx/oletx"></PropagationToken>
</CoordinationContext>

To add the above header to SOAP Client 2, follow the same procedure as outlined for SOAP Client 1, but instead paste the above CoordinationContext SOAP header into the Edit dialog.


Reference
In this section:
• Available Tools
• Built-in Test Configurations
• Built-in Static Analysis Rules
• Preference Settings
• Extensibility (Scripting) Basics
• Using Eclipse Java Projects in SOAtest



Available Tools


Built-in Test Configurations
This topic describes the preconfigured "built-in" Test Configurations that are included with SOAtest.

Code Review Category
• Post-Commit (scope: all project files): Template code review configuration for teams who want to review code after it is committed to source control.
• Pre-Commit (scope: only files added or modified locally): Template code review configuration for teams who want to review code before it is committed to source control.

Functional Testing Category
• Run Web Functional Tests in Both Browsers: Executes each test in both Firefox and Internet Explorer.
• Run Web Functional Tests in Browser Specified by Tests: Executes each test using the browser playback settings configured in the test scenario’s Browser Playback Options tab. If you have multiple scenarios, each with different browser playback settings, this Test Configuration runs all the scenarios in the designated browser(s).
• Run Web Functional Tests in Firefox: Executes each test in Firefox. If a test was configured to run in Internet Explorer, this does not perform any testing.
• Run Web Functional Tests in Internet Explorer: Executes each test in Internet Explorer. If a test was configured to run in Firefox, this does not perform any testing.

Load Testing Category
• Configure for Load Test: Configures browser-based web functional tests to run in a browser-less load test environment. See “Configuring Tests”, page 562 for details.
• Validate for Load Test: Executes tests in load testing mode and alerts you to any outstanding issues that might impact your load testing (for example, incorrectly configured HTTP requests). See “Validating Tests”, page 566 for details.


Security Category
• Run Hybrid Analysis Penetration Tests for Web Applications: Executes penetration testing (using attacks designed for web applications) with runtime error detection. See “Penetration Testing”, page 484 for details.
• Run Hybrid Analysis Penetration Tests for Web Services: Executes penetration testing (using attacks designed for web services) with runtime error detection. See “Penetration Testing”, page 484 for details.
• Run Penetration Tests for Web Applications: Executes penetration testing (using attacks designed for web applications). See “Penetration Testing”, page 484 for details.
• Run Penetration Tests for Web Services: Executes penetration testing (using attacks designed for web services). See “Penetration Testing”, page 484 for details.

Static Analysis Category
• Check HTML Well-Formedness: Reports missing end tags. Reports if default values need to be added for attributes (i.e., those that are "true" by default). Reports attribute values that do not have quotes around them. Reports if attributes that require numerical values have non-numerical values. Reports orphaned end tags. Reports missing required tags: documents require <HTML> <HEAD> <TITLE> </TITLE> </HEAD> <BODY> </BODY> </HTML>, and framesets require <HTML> <HEAD> <TITLE> </TITLE> </HEAD> <FRAMESET> </FRAMESET> </HTML>.
• Check Links: Detects broken links, program failures, and other critical site problems.
• Check Spelling: Checks for spelling errors in HTML and XML files.
• Check Web Standards: Verifies whether code follows coding standards for HTML, JavaScript, CSS, and ASP/VBScript.
• Check XML Wellformedness: Checks whether XML files are well-formed and (optionally) validates them.
• Schema Best Practices: Checks compliance to best practices for ensuring schema interoperability, maintainability, and security.
• WSDL Best Practices: Checks compliance to best practices for ensuring WSDL interoperability, maintainability, and security.


Accessibility Sub-Category
• Recommended Section 508: Monitors compliance to critical and recommended Section 508 accessibility guidelines.
• Recommended WCAG 2.0: Monitors compliance to recommended WCAG 2.0 accessibility guidelines.
• Section 508: Monitors compliance to the most critical Section 508 accessibility guidelines.
• WCAG 1.0 - Level 1: Monitors compliance to Level 1 WCAG 1.0 accessibility guidelines. Level 1 success criteria: (1) achieve a minimum level of accessibility through markup, scripting, or other technologies that interact with or enable access through user agents, including assistive technologies; (2) can reasonably be applied to all Web resources.
• WCAG 1.0 - Level 1 and 2: Monitors compliance to Level 1 and Level 2 WCAG 1.0 accessibility guidelines. Level 2 success criteria: (1) achieve an enhanced level of accessibility through one or both of the following: (a) markup, scripting, or other technologies that interact with or enable access through user agents, including assistive technologies; (b) the design of the content and presentation; (2) can reasonably be applied to all Web resources.
• WCAG 1.0 - Level 1, 2, and 3: Monitors compliance to Level 1, Level 2, and Level 3 WCAG 1.0 accessibility guidelines. Level 3 success criteria: (1) achieve additional accessibility enhancements for people with disabilities; (2) are not applicable to all Web resources.
• WCAG 2.0 - Level A: Monitors compliance to Level A WCAG 2.0 accessibility guidelines.
• WCAG 2.0 - Level A and AA: Monitors compliance to Level A and AA WCAG 2.0 accessibility guidelines.
• WCAG 2.0 - Level A, AA, and AAA: Monitors compliance to Level A, AA, and AAA WCAG 2.0 accessibility guidelines.

See “Creating Custom Test Configurations”, page 244 to learn how to develop custom Test Configurations that are tailored to your projects and team priorities.


Built-in Static Analysis Rules
This topic describes the preconfigured "built-in" static analysis rules that are included with SOAtest. Sections include:

• Understanding Rule Categories
• Viewing Rule Descriptions
• Severity Levels
• Custom Rules

Understanding Rule Categories
SOAtest includes more than 500 rules that check whether expectations for security, reliability, and compliance are met. Rules are organized into thematic categories such as:

• Accessibility - WCAG 1.0 / Section 508
• Accessibility - WCAG 2.0 / ACC - WCAG2
• Browser Compatibility
• Coding Conventions
• Check HTML Well-Formedness
• Check Links
• Check Spelling
• Forms
• Interoperability
• Invalid Code
• Localization
• Maintainability
• Navigation
• Performance
• Presentation
• Security
• Utility
• Unused Code
• Validate XML

Viewing Rule Descriptions
To view descriptions of all the static analysis rules that are included with SOAtest, choose SOAtest> Help, open the SOAtest Static Analysis Rules book, then browse the available rule description files. To view a list of all static analysis rules that a given Test Configuration is configured to check, as well as descriptions of these rules:


1. Open the Test Configurations dialog by choosing SOAtest> Test Configurations or by choosing Test Configurations in the drop-down menu on the Test Using toolbar button.
2. Select the Test Configuration that you want a rule list for.
3. Open the Static tab.
4. Click Printable Docs.
If you want to print the list of rules and all related rule descriptions, enable your browser’s Print all linked documents printer option before you print the main list of rules.

Notes
• Rules are listed in the following format: Rule description (RULE ID - RULE SEVERITY LEVEL)
  • RULE ID is the string used to identify this rule in the Test Configurations panel and to identify violations of this rule in reports and results.
  • SEVERITY LEVEL is a number from 1 to 5 that indicates the chance that a violation of the rule will cause a serious error. Possible severity levels (listed from most severe to least severe) are Level 1, Level 2, Level 3, Level 4, and Level 5.
• Special wizard hat icons are used to indicate rule properties as follows:
  • A new rule (a rule added since the previous SOAtest release) has an icon that contains the text "NEW."
  • A parameterizable rule (a rule that you can customize by modifying the available rule parameters) has an icon with a radio button.

Severity Levels
Each rule is assigned a severity level. The severity level indicates the chance that a violation of the rule will cause a serious construction defect (a coding construct that causes application problems such as slow performance, memory leaks, security vulnerabilities, and so on). Possible severity levels (listed from most severe to least severe) are:
• Severe Violation (SV) - Level 1
• Possible Severe Violation (PSV) - Level 2
• Violation (V) - Level 3
• Possible Violation (PV) - Level 4
• Informational (I) - Level 5

Custom Rules
SOAtest can also check any number of custom rules that you design with the RuleWizard module. With RuleWizard, rules are created graphically (by creating a flow-chart-like representation of the rule) or automatically (by providing code that demonstrates a sample rule violation). By creating and checking custom rules, teams can verify unique project and organizational requirements, as well as prevent their most common errors from recurring. RuleWizard can be used to modify built-in coding standards and to add new ones. For more information about creating custom coding rules, see “Creating Custom Static Analysis Rules”, page 612.


Preference Settings
This topic provides details about SOAtest’s general settings (settings that are not specific to any particular Test Configuration, project, or test). For details on how to access and share these settings, see “Modifying General SOAtest Preferences”, page 242. Sections include:

• Browser Testing Settings
• Code Review
• Configurations Settings
• Console Settings
• Dictionary Settings
• E-Mail Settings
• JDBC Drivers Settings
• License Settings
• MIME Type Settings
• Miscellaneous Settings
• Proxy Settings
• Report Center / Project Center Settings
• Report Settings
• Scanning Settings
• Scope and Authorship Settings
• Scripting Settings
• Security Settings
• SOA Registry Settings
• SOAP Client Settings
• Start-Up Settings
• System Properties Settings
• Tasks Settings
• Team Server Settings
• UDDI Settings
• Updates Settings
• WSDL Settings
• XML Schema History Settings
• XML Schema Locations Settings
• Using Variables in Preference Settings

Using Variables in Preference Settings
Variables can be used in preference settings. For details, see “Using Variables in Preference Settings”, page 761.


Browser Testing Settings
The Browser Testing panel lets you set options related to Web functional testing. Available settings include:

Firefox Executable Path: Specifies the path to the Firefox communication executable.



Proxy Port: Specifies the proxy port. See Proxy Configuration Details below for more information and tips.



Firefox Communication Port: Specifies the Firefox communication port.



Browser Timeout Settings: Specifies the length of delay (in milliseconds) after which SOAtest should stop waiting for browser startup or a user action and consider it to be "timed out."



Wait Condition Default Timeout Settings: Specifies the length of delay (in seconds) after which SOAtest should stop waiting for the activity specified in the wait condition to occur and consider it to be "timed out."



Print debugging info: During recording of a browser functional test scenario, it is possible that an action taken is not recorded by SOAtest. Enabling this option will allow messages to be printed to the message console during recording, with information about what events SOAtest handled, any locators that may have been generated, and if applicable, any exceptions that took place during recording.



Allow Binary Files in Traffic Viewer and Outputs: Allows binary files with the specified extensions or MIME types to be used in the Traffic Viewer and output. By default, only text files will be allowed.

Proxy Configuration Details
When you use SOAtest to record or run functional tests in a browser, the proxy settings in that browser are set to an internal proxy maintained by SOAtest. All communication to and from the browser during recording and playback goes through this SOAtest proxy, which is an intermediary used to capture traffic and otherwise facilitate the execution of functional tests. During recording and playback, SOAtest temporarily creates this proxy on localhost using the port specified by the Browser Testing settings’ Proxy Port option. The default proxy host and port used by SOAtest is localhost:55555. If your machine is already using this port for something else, you must change the port to one that your machine is not using. This is done using the Proxy Port field referenced above. Do not change this from within the browser. If you have configured your machine to use your own proxy, you should configure SOAtest to point to that proxy. SOAtest will then configure its internal proxy to forward all traffic to the specified proxy. This is configured in “Proxy Settings”, page 752.

Internet Explorer Notes
Note that SOAtest modifies the global registry settings prior to starting its instance of the browser. However, if there was an existing instance of a browser running on the machine before you launched SOAtest (not a recommended practice), the existing browser instance does not pick up those settings. In that case, if you look at the Internet Options panel in the existing browser instance while a test is running in SOAtest, the settings may point at SOAtest’s proxy. However, it will not use those settings unless you click OK in the Internet Options panel. If you click OK, the proxy settings are updated in the existing browser instance. If you click Cancel, or do not go to the Internet Options panel, then the existing browser instance never picks up the SOAtest proxy settings and should continue to navigate fine.


Proxy settings may not be reset properly if the browser exited abnormally, if there is a hanging browser process, etc. Such issues can affect new browser instances (or other programs that connect to the internet). If this happens, you can resolve it by resetting your machine’s proxy settings to the appropriate settings or killing any hanging browser processes. For a more detailed discussion of how to prevent SOAtest from changing Internet Explorer settings, see the related topic in the Parasoft forum.

Code Review
Contains settings for automating preparation, notification, and tracking of the peer review process, which can be used to evaluate critical SDLC artifacts (source files, tests, etc.) in the context of the organization’s defined quality policies. For details, see “General Code Review Configuration”, page 622.

Configurations Settings
The Configurations panel allows you to determine the number of Test Configurations available in the SOAtest> Test History menu, the location where user rules are saved and searched for, and the location where user-defined Test Configurations are saved and searched for.

Console Settings
The Console panel allows you to determine the amount of information that is reported to the Console view and whether it is automatically activated when it contains messages. The following information is reported at each verbosity level:
• Basic info about the current step’s name and status (done, failed, up-to-date): reported at High, Normal, and Low verbosity.
• Errors: reported at High, Normal, and Low verbosity.
• Warnings: reported at High verbosity only.
• Command lines: reported at High and Normal verbosity.
• Violations printed out during static analysis and test execution: reported in full format at High verbosity, in short format at Normal verbosity, and not reported at Low verbosity.

Dictionary Settings
The Dictionary panel allows you to customize the dictionary that the Spell Tool uses to identify misspelled words.

Adding Words
To add words to the dictionary:




To add a new word, click Add, then enter it in the dialog that opens.



To import a set of words from a text file, click Import, then specify the file that contains the set of words you want to import.



To remove a word, select it in the list of words, then click Delete. You can select more than one word, then delete all selected words with one click.



To export a list of words into a text file (for example, to export your User Added Words list so that your team members can import them), click Export, then indicate what file you want to contain the exported words.

Tip: Adding Words from the SOAtest View
You can also add reported misspelled words to the dictionary from the SOAtest view. Just right-click the reported misspelled word, then choose Add to Dictionary.

Adding Dictionaries
You can expand SOAtest’s built-in dictionary by extending it with additional sets of ispell-format dictionaries (such as dictionaries for languages other than English, dictionaries of industry-specific terms, etc.). Each dictionary set has a name and one or more dictionaries. To add an additional dictionary set:
1. Save the dictionaries in the SOAtest installation directory.
2. Click the Add button, then use the file chooser to select the set of dictionaries you want to add.

Adding Non-Text Characters or Words Containing Non-Text to the Dictionary
By default, SOAtest treats non-text characters as white space and does not allow you to add dictionary words that contain non-text characters. If you want SOAtest to consider a designated non-text character as a valid character within a word (rather than as one unit of white space), you need to add that character to the list of allowable non-text characters. This allows you to identify spelling errors in words that contain allowed non-text characters and to add dictionary words that contain allowed non-text characters. To add non-text characters to the list of allowable non-text characters:

Enter them in the Allowable non-text characters field. If you want to allow multiple non-text characters, list them one after the other—do not separate them with a space character, comma, or other delimiter.

E-Mail Settings
The E-Mail panel allows you to configure email settings used for report notifications and for sending files to Parasoft Technical Support.

JDBC Drivers Settings
The JDBC Drivers panel lets you add jar files for the JDBC Drivers (e.g., oracle.jar, mysql.jar).

License Settings
The License panel lets you configure license settings. See “Licensing”, page 29 for details.

MIME Type Settings
The MIME Types panel lets you add and remove MIME types. In addition, it lets you specify the location of your preferred text and XML editors and lets you specify what editor you want to use to edit all files that have a particular MIME type. To add, edit, or remove a MIME type:

To add a MIME type, click Add MIME Type, enter the new MIME type in the dialog box that opens, then enter the file extensions that you want to assign to this MIME type, and (optionally) indicate the implied MIME type by checking the appropriate check boxes. If you enter multiple extensions for a MIME type, separate the extensions with one space character.



To edit a MIME type’s settings, select the MIME type whose settings you want to edit, then modify the settings as needed.



To remove a MIME type, select the MIME type that you want to remove, then click Delete MIME Type.

Miscellaneous Settings
The Misc panel allows you to set the following miscellaneous settings:

Show Tool Descriptions: Tells SOAtest whether or not to display tool descriptions in the Add Test Wizard and Add Output Wizard.



XML Tree Settings: Tells SOAtest how to expand XML trees. •

Expand all elements by default: Tells SOAtest whether XML Trees should be fully expanded when first displayed, or expanded up to the value specified in the following option.




Maximum depth of elements to expand: Specifies the depth to which any XML tree should be expanded.



Auto Beautify: Tells SOAtest to automatically beautify XML messages in the selected tool or tools (Traffic Viewer, Diff, Editor) if the message is under the specified size (10 KB is the default setting).



Character Encoding: Tells SOAtest to use the default character set for the particular system being used. •

System Default: Tells SOAtest to encode characters using CP1252.



Custom: Tells SOAtest to encode characters from the list of encodings available on the JVM being used.



Default timeout (milliseconds): Allows you to enter the length of delay (in milliseconds) after which SOAtest should consider your FTP, telnet, or HTTP requests to be “timed out.”



Report each duplicate error that occurs on the same line: Tells SOAtest to show only the first instance of duplicate errors that occur for the same line of code.



Reset Cookies: Allows you to clear the current global cookies so that next HTTP invocations start a new session.



Automatically backup project files: Tells SOAtest to automatically back up and save the project files you are working on.



Warn on file size larger than (MB): Specifies the point at which SOAtest will warn you that the .tst file is too large. You can then reduce the size (and prevent performance problems) by dividing it into smaller .tst files.



Project file format: Allows you to specify whether project files are saved as XML or binary. We strongly recommend using XML format. Binary format is provided to support project files from earlier versions of SOAtest and WebKing.

Proxy Settings
The Proxy panel controls how SOAtest works with proxy servers. This is only relevant for browser functional tests (for details, see “Proxy Configuration Details”, page 748).

If Windows and IE (which use the same settings) are configured to properly use the proxy to access the relevant websites, select Use system proxy configuration.



Otherwise, select Enable proxy and manually enter the correct settings. These settings should be equivalent to what you would use in the browser outside of SOAtest. •

To use an automatic configuration script, select Use automatic configuration script, then enter the proxy address in the Address field.



If you want to use the same proxy server for all protocols, check the Same proxy server for all protocols check box, then enter the address and port of the proxy server you want to use in the Proxy Address and Proxy Port fields.



If you want to use different proxy servers for different protocols, clear the Same proxy server for all protocols check box, then enter the address and port of each proxy server you want to use in the Proxy Address and Proxy Port fields.



If your proxy server requires authentication, check the Enable proxy authentication check box, then enter a valid user name and password in the Username and Password fields.




If you want to allow Web traffic from designated IP addresses to pass through directly (avoiding the proxy), enter those IP addresses in the Proxy Exceptions text field. If you enter multiple addresses, use a semicolon (;) to separate the entries.



If you want to specify new proxy settings from the command line, use the -proxy [username:password@]host:port option after the -cmd flag. The syntax mimics an HTTP basic auth URL.

Report Center / Project Center Settings
The Report Center/Project Center panel lets you configure how results are sent to Parasoft Report Center and Parasoft Project Center. See “Connecting SOAtest Server to Report Center”, page 203 for details.

Report Center> Task Assistant
This panel lets you specify settings for the Task Assistant, which helps you work with Parasoft Project Center from the SOAtest environment. See “Connecting All SOAtest Installations to Parasoft Project Center”, page 206 for details.

Report Settings
The Reports panel allows you to configure reporting content and format settings. See “Configuring Reporting Settings”, page 297 for details.

Reports> E-Mail Notifications
This panel allows you to configure e-mail notification settings. See “Configuring Reporting Settings”, page 297 for details.

Reports> Structure Reports
This panel allows you to customize the settings related to structure reports, which are discussed in “Creating a Report of the Test Suite Structure”, page 380. The following options are available:

Report Name: Select from the drop-down list to choose the type of report to configure. You can also Add, Remove, or Copy reports in this list by clicking the appropriate button.



Test Structure: Select the elements that you want displayed in the report.

Scanning Settings
The Scanning panel specifies settings related to how SOAtest scans Web applications. Available options include:

Agent Name: Determines the user agent that SOAtest uses to identify itself.



FTP Log: Determines whether (and how) to create a log of FTP connections made to scan Web resources from SOAtest.



Script options: •

Script Extensions: Determines what files SOAtest considers "scripts".



Limit the number of script items per page to: Determines the maximum number of script items per page that SOAtest will consider. If a page has more script items than the number that you allow, SOAtest will place a "red flag" icon next to the related Project tree node.

Load JavaScript: Determines whether or not SOAtest loads JavaScript.



Simulate JavaScript events: Determines how SOAtest simulates JavaScript events (such as opening and closing additional windows, running timers, and so on). If you choose single time, SOAtest triggers each handler once, with default arguments. If you choose multiple times, SOAtest tries to create multiple kinds of events while loading a site (in order to find new links).

Scope and Authorship Settings
The Scope and Authorship panel allows you to specify how SOAtest assigns tasks to different team members. See “Configuring Task Assignment”, page 215 for details.

Scripting Settings
The Scripting panel allows you to specify Python and Java properties used for custom scripts.



Java: For Java, you can specify the Java home directory and the path to the javac compiler. You need to specify these parameters if you want to compile Java methods within SOAtest’s Editor. Note: The javac compiler is not included in the SOAtest installation. •

Java Home: Specifies the Java installation directory.



Javac Classpath: Specifies the Java classpath.

Java Script: If you create scripts using Python or JavaScript, you can specify a script template in the Script Template field. •



Script Template: Whatever code is specified in this field will be used as the default code for inlined scripting in the language with which the field is associated. This is primarily useful for setting default inputs and common global variables. Script templates apply to scripts used in Extension tools; they do not apply to JavaScript tests that run in a browser context.

Python: For Python, you can specify the python.home and python.path variables. Both variables are used to locate python modules, and Python code that does not import any python modules can use the python scripting support without setting either variable. If you set the python.home and python.path variables, you need to restart SOAtest before the changes will take effect. •

Python Home: Specifies the Python installation directory. python.home must be a single directory.



Python Path: Used to add to your path modules that are not in your python.home/Lib directory. Multiple paths can be listed in python.path



Script Template: Python code that does not import any python modules can use the python scripting support without setting either the python.home or python.path.
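For example, a Python script template along the following lines could be entered in the Script Template field to give every new inlined script a common entry point and shared defaults (a minimal, illustrative sketch; the DEFAULT_TIMEOUT_MS global and the main function name are not required by SOAtest):

from soaptest.api import *

# Shared default available to every script created from this template (illustrative).
DEFAULT_TIMEOUT_MS = 30000

def main(input, context):
    # "input" is the output of the tool this script is attached to; "context"
    # exposes test suite variables via context.get() and context.put().
    return SOAPUtil.getXMLFromString([str(input)])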

Security Settings
The Security panel allows you to set the following Security settings:

Global HTTP Authentication Properties: Allows you to specify global HTTP authentication properties that can be used when configuring HTTP protocols within a SOAP Client or Messaging Client tool. Select the Perform Authentication checkbox, enter the Username and Password to authenticate the request, and select either Basic, NTLM, or Digest from the Authentication Type drop-down list. For Kerberos, enter the Service Principal to authenticate the request. If the correct username and password, or the correct service principal, are not used, the request will not be authenticated.

Kerberos Realm: Specify the Kerberos realm associated with your network. By convention, this is typically your domain name in all caps (e.g. PARASOFT.COM).



KDC Server: Specify the hostname of your Key Distribution Center (e.g. kdc.parasoft.com).



Check Ticket: This will execute a simple test to locate a cached Kerberos TGT (Ticket Granting Ticket) to grant access to the service. SOAtest will not be able to communicate with the service if it cannot first locate a valid TGT. For more information about Kerberos, see “Configuring Kerberos Authentication in SOAtest”, page 756.



Trust all certificates: Tells SOAtest that you want it to accept any certificate. This is useful if you want to load pages whose certificates are not "trusted."



Use default Java cacerts: Tells SOAtest that you want it to accept only certificates from the standard list of Java trusted certificate vendors.



Use Client Keystore: Enables you to specify settings for both the server side and client side certificates for SSL through the Client Keystore options. If this option is selected, the following options are available in the Certificate and Private Key tabs: •



Certificate tab options: •

Use same key store for private key: Select if the Key Store contains private keys for the certificate.



Key Store File: Specify the key store file by clicking the Browse button and using the file chooser that opens. If you want the path saved as a relative path (for example, to facilitate project sharing), check the Persist as Relative Path option.



Key Store Password: Specify the Key Store password and select the Save key store password option to save the password on future runs of the test.



Key Store Type: Specify the type of Key Store being used (e.g. JKS, PKCS12, BKS, UBER)



Load: Click to populate the aliases with the available certificates/keys if the path, type, and key store password are valid.



Certificate Alias: Specify the certificate alias.

Private Key tab options: •

Key Store File: (Only available if the Key Store Contains Keys option is unselected in the Certificate tab) Specify the key store file by clicking the Browse button and using the file chooser that opens. If you want the path saved as a relative path (for example, to facilitate project sharing), check the Persist as Relative Path option.



Key Store Password: (Only available if the Key Store Contains Keys option is unselected in the Certificate tab) Specify the Key Store password and select the Remember key store password option to remember the password on future runs of the test.



Key Store Type: (Only available if the Use same key store for private key option is unselected in the Certificate tab) Specify the type of Key Store being used (e.g. JKS, PKCS12, BKS, UBER)




Load: Click to populate the aliases with the available certificates/keys if the path, type, and key store password are valid.



Private Key Alias: Specify the private key alias.



Private Key Password: Specify the private key password and select the Remember private key password option to remember the password on future runs of the test.

About Kerberos Authentication
Kerberos authentication is known as a trusted third-party authentication mechanism. A client requests access to a service not directly, but from another service: the Key Distribution Center, which manages network-wide authorization. This mechanism facilitates Single Sign-On (SSO) so that a client need only provide authorization credentials once in a given time period (usually 8-10 hours). The authorization is granted in the form of a ticket which can then be cached and reused throughout the granted time period without re-authenticating. Entities in a Kerberos-protected network, such as clients and servers, are known as principals. The network-space that Kerberos protects is known as a realm. Microsoft's IIS (Internet Information Services) Server provides HTTP-based services with Kerberos through the Negotiate protocol. Other server vendors provide their own implementations of Microsoft's Negotiate protocol. The ticket that is received upon initial authentication is known as a Ticket Granting Ticket, or TGT. For example, in a Windows environment, the TGT is generated when first logging on to the workstation in the morning. SOAtest authorizes itself to use a Kerberos-protected service by retrieving a user's TGT from the system cache.

Avoiding Common Kerberos Errors
For tips on common Kerberos errors and how to solve them, see http://java.sun.com/j2se/1.5.0/docs/guide/security/jgss/tutorials/Troubleshooting.html.

Configuring Kerberos Authentication in SOAtest
To begin setting up Kerberos authentication in SOAtest, you must first place a file called kerberos.config in the SOAtest installation directory (the Parasoft/SOAtest/[version number] directory). To set up your SOAP and HTTP Client tools to use Kerberos authentication, complete the following:
1. Configure the following options in the Security preferences panel:

Kerberos Realm: Specify the Kerberos realm associated with your network. By convention, this is typically your domain name in all caps (e.g. PARASOFT.COM).



KDC Server: Specify the hostname of your Key Distribution Center (e.g. kdc.parasoft.com).



Check Ticket: Click to execute a simple test to locate a cached Kerberos TGT (Ticket Granting Ticket) to grant access to the service. SOAtest will not be able to communicate with the service if it cannot first locate a valid TGT.

2. Select the SOAP Client or HTTP Client tool for which you intend to use Kerberos authentication.
3. Select the Transport tab and select Security from the left pane of the Transport tab.


4. Configure the following options from the security panel of the Transport tab: •

Perform Authentication: Select this option to activate authentication.



Use Global Preferences: Select this option if you have authentication properties setup in Security Preferences.



Type: Select Kerberos to perform Kerberos Authentication.



Service Principal: Specify the name of the service/server as defined in the Kerberos database (e.g. HTTP/soatest.parasoft.com).

5. Now when you invoke your test, the required Negotiate token will automatically be generated and sent as an HTTP header.
Note: Kerberos provides a mechanism to prevent so-called "replay" attacks, where a user tries to provide captured duplicate credentials for a service in order to gain access to it. When performing a load test, where multiple virtual users provide the same user credentials, the KDC will respond as if a replay attack is occurring and errors will be thrown. This is expected behavior, and it is uncertain at this point whether there is a workaround.

SOA Registry Settings
SOAtest can create tests that enforce policies applied to Web service assets that are declared in a BEA AquaLogic Enterprise Repository or Software AG Centrasite repository, as described in “Using Oracle/BEA with SOAtest”, page 658 and “Using Software AG CentraSite Active SOA with SOAtest”, page 682. The SOA Registry panel allows you to specify settings that SOAtest will use by default in forms that reference such repositories. For instance, if you specify settings for BEA ALER here, SOAtest will use these values by default in the wizard for creating tests from BEA ALER.

SOAP Client Settings
The SOAP Client panel allows you to specify the following settings:

Default Transport: Allows you to set the default transport protocol for SOAP Clients.



Attachment Encapsulation Format: Allows you to choose MIME, DIME, or MTOM, for the default attachment encapsulation.



SOAP Version: Allows you to select SOAP 1.1 or SOAP 1.2.



Outgoing Message Encoding: Allows you to choose the encoding for outgoing messages. You can choose any Character Encoding you wish to read and write files, but the Outgoing Message Encoding provides additional flexibility so you can set a different charset encoding for the SOAP request.


Customizing SOAP Serialization Settings
You can also customize how SOAtest serializes the SOAP objects it transmits and deserializes the SOAP messages it receives, but you cannot do so within the Preferences panel. SOAP messages are deserialized from XML into some native format, and objects are serialized into XML format so that they can be sent as responses. To add a serializer/deserializer pair, you add a line to the register.py file in the <SOAtest_62_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/startup directory. You must programmatically use Jython to register Apache Axis-compliant serializers. For Axis, you can retrieve the TypeMappingRegistry that SOAtest uses by calling soatest.api.SOAPUtil.getDefaultAxisRegistry(). After you retrieve that registry, you can use the Axis API to register the serializer as needed.
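For illustration only, a registration added to register.py might look like the following Jython sketch. The Invoice bean, its package, and the XML type QName are hypothetical placeholders; the SOAPUtil import follows the soaptest.api package used in the scripting examples earlier in this guide; and the Apache Axis BeanSerializerFactory/BeanDeserializerFactory pairing is one common way to register a bean type:

from soaptest.api import SOAPUtil
from javax.xml.namespace import QName
from org.apache.axis.encoding.ser import BeanSerializerFactory, BeanDeserializerFactory
from com.example.model import Invoice   # hypothetical bean class available on the SOAtest classpath

qname = QName("http://example.com/types", "Invoice")   # hypothetical XML type
mapping = SOAPUtil.getDefaultAxisRegistry().getDefaultTypeMapping()
mapping.register(Invoice, qname,
                 BeanSerializerFactory(Invoice, qname),
                 BeanDeserializerFactory(Invoice, qname))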

Source Control Settings
The Source Controls panel configures what source control systems SOAtest connects to. See “Connecting All SOAtest Installations to Your Source Control System”, page 192 for more details.

Start-Up Settings
The Start-Up panel lets you tell SOAtest to automatically start the stub server upon SOAtest start-up.

System Properties Settings
The System Properties panel lets you add JAR files, class folders, and Java projects to the classpath if needed. Use the available controls to add or remove JAR files, class folders, and Java projects. The specified JAR files, classpaths, and Java projects will be added to the system's classpath and the corresponding classes will be loaded into the JVM after SOAtest is restarted. To force SOAtest to reload classes from the classpath entries, click the Reload button. If you want SOAtest to attempt to reload classes from your Eclipse project after they are modified or recompiled, enable the Automatically reload classes option.
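For example, once a JAR has been added here and SOAtest has been restarted, its classes can typically be imported directly from a Jython Extension tool. A minimal sketch (the com.example.util.Checksum class and its compute method are hypothetical):

from soaptest.api import *
from com.example.util import Checksum   # hypothetical class from a JAR added in this panel

def computeChecksum(input, context):
    # Delegate to the custom Java class loaded from the added classpath entry.
    return SOAPUtil.getXMLFromString([Checksum.compute(str(input))])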

Tasks Settings
The Tasks panel allows you to specify how tasks are reported in SOAtest. Available settings include:

Clear existing tasks on startup: Determines whether tasks from previous tests are cleared upon startup.



Automatically expand tasks tree during testing: Specifies whether the task tree is automatically expanded during testing.



Decorate code markers when tasks become out of date: Specifies whether markers are placed next to tasks that are now outdated (i.e., because the current source code does not match the analyzed code).



Import only tasks reported for tests ran in the last n days: Specifies the time range for importing Team Server tasks. For example, if your team tests only twice a week, you might want to import tasks from tests performed within the past 4 days, rather than the past 2 days, to ensure that the most recent test results are imported.

Import out of date tasks for modified resource: Allows SOAtest to import violations generated for files that had been locally changed since the time the task was generated. If these tasks are imported, they will be marked as "out-of-date" if the Decorate code markers when tasks become out of date option (described above) is also enabled.



Relocate tasks during import: Allows SOAtest to import tasks into a project with different logical layout (compared to the original project analyzed - e.g., if the project on your desktop system is different than the one on the machine running soatestcli). Requirements and limitations are: •





Server set-up: Generated reports have to contain information about physical file paths. •

Go to SOAtest> Preferences> Reports.



Enable the Add absolute file paths to XML data option in the Advanced Settings section

Desktop set-up: Mapping between physical paths on the server and desktop has to be defined: •

Go to SOAtest> Preferences> Tasks.



Enable the Relocate tasks during import option.



Enter mapping details into the table. Each mapping entry is defined by "Original location" (the physical location on the server) and "Destination location" (the equivalent physical location on the desktop). Entries are processed top to bottom; the first matching entry will be used.

Limitations: Relocation can only be used for importing tasks. Other elements of the team work flow, like GUI Suppressions or Authorship Mapping, will not be relocated automatically.
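For example (both paths below are hypothetical), a single mapping entry might pair a server path with its desktop equivalent:

Original location: /home/build/workspace/MyProject
Destination location: C:\Users\tester\workspace\MyProject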


Team Server Settings The Team Server panel specifies connections to the Team Server module, which ensures that all team members have access to the appropriate projects, rules, and policies. Team Server is available and licensed separately. For details on connecting to Team Server, see “Connecting All SOAtest Installations to Team Server”, page 199.

UDDI Settings The UDDI panel lets you set the UDDI inquiry endpoint, which is the endpoint that you want SOAtest to reference when performing dynamic router resolution. If you specify a UDDI registry here, the SOAP Client tool can search for a service by querying that registry using the UDDI serviceKey specified in the SOAP Client parameters. If you do not specify a UDDI registry here, you have to configure your SOAP Client tool so that the server endpoint is hard-coded as a router value.

Updates Settings

The Updates panel lets you configure an internal update site instead of the public one (for example, so that your team can internally standardize its versions). For details, see “Service Pack Installation”, page 28.

WSDL Settings

The WSDL panel lets you review or modify the WSDLs that have been used in SOAP Clients and Projects. These WSDLs will be available for selection in relevant drop-down boxes, so if you need to specify the same WSDL multiple times, you don’t need to retype it each time. Available settings include:
• Save WSDLs used in SOAP Clients and Projects: If you do not want SOAtest to save test suites’ WSDL URIs, clear this check box.
• WSDL URI: Lists the WSDL URIs that will be available in the SOAP Client tool’s WSDL URI drop-down menu. By default, all WSDL URIs used in SOAP Client tests are added to this list.
• Refresh WSDL: If you would like to refresh the WSDL from the given location URL and reparse it, click the Refresh WSDL button.
• WSDL Parsing> Honor All Schema Locations: If this option is enabled, SOAtest will check all schema locations in order to locate components belonging to a given target namespace. If it is disabled, SOAtest may only use the first schema location encountered to resolve components for a given target namespace.

XML Schema History Settings

The XML Schema History panel lets you review or modify the XML Schemas that have been used in Messaging Clients and Projects. These Schemas will be available for selection in relevant drop-down boxes, so if you need to specify the same Schema multiple times, you don’t need to retype it each time.

XML Schema Locations Settings

The XML Schema Locations panel lets you view, add, and remove schema locations. The XML Validator tool needs to know where to find the schema that it should use to validate the document of concern. In most cases, this is a URI supplied within the document being validated. If the URI for the schema is not supplied, or if you want to use a different location, disable the Use namespace as location URI for Schemas option for the XML Validator tool. For more information on the XML Validator tool, see “XML Validator”, page 862. When the test is run with this box unchecked, SOAtest will use the schema location(s) indicated in this panel.

To add a new schema location:
1. Click the Add button beneath the Namespace and Location columns.
2. In the dialog that opens, specify the Namespace and Schema Location.
3. Click OK after you have added all of the necessary locations.

To specify namespaces to skip:
1. Click the Add button beneath the List of namespaces to skip during XML Validation table.
2. In the dialog that opens, specify the namespace you want to skip.
3. Click OK.

To add OASIS XML Catalog Locations:
1. Click the Add button beneath the OASIS XML Catalog Locations section of the Schema Locations tab. The Location dialog box displays.
2. Type in the OASIS XML Catalog Location or browse to it by clicking the Browse button.
3. Click OK after you have added all of the necessary locations.

Using Variables in Preference Settings

The following variables can be used in reports, e-mail, Report Center, Team Server, and license settings specified in the Preferences panel or in the Local Settings (Options) files used with soatestcli. Note that the report tag value cannot contain any ':' characters.


Using Variable Assist

For help specifying variables, you can use the variable assist feature, which automatically proposes possible variables when you type $.

The supported variables are:
• env_var (example: ${env_var:HOME}): Outputs the value of the environment variable specified after the colon.
• project_name (example: ${project_name}): Outputs the name of the tested project. If more than one project is provided as an input, it first outputs the tested project name, then "...".
• workspace_name (example: ${workspace_name}): Outputs an empty string in Eclipse.
• config_name (example: ${config_name}): Outputs the name of the executed test configuration; applies only to Reports and Email settings.
• analysis_type (example: ${analysis_type}): Outputs a comma-separated list of enabled analysis types (for example: Static, Execution); applies only to Reports and Email settings.
• tool_name (example: ${tool_name}): Outputs the tool name (for example: SOAtest).
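As an illustration (the path below is hypothetical), a report location preference could combine several of these variables:

C:\reports\${tool_name}\${config_name}_${analysis_type}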


Extensibility (Scripting) Basics

This topic provides an overview of issues related to SOAtest’s extensibility (scripting) capabilities and their various applications. Sections include:
• Understanding the Extensibility Feature
• Setting Your Environment for Scripting
• Specifying the Script
• Using Script Templates
• Interacting with SOAtest
• Accessing Required Jar Files
• SOAtest Script Variables
• Configuring Jython Preferences
• Accessing Data Sources from Scripts
• Extensibility Examples
• Language-Specific Tips
• Additional Extensibility Resources

Understanding the Extensibility Feature The SOAtest extensibility feature allows you to apply a custom Java, JavaScript, or Python script to perform any specific function that would be helpful to you while testing. This feature gives you the ability to customize SOAtest to your specific needs without learning an application-specific language. You can also use scripting to customize rules you create with RuleWizard. For details on this feature, see the RuleWizard User’s Guide.

Setting Your Environment for Scripting

If you are using Python scripts, you might need to specify your python.home and python.path variables in the Scripting tab of the SOAtest Preferences panel. Both variables are used to locate Python modules; Python code that does not import any Python modules can use the Python scripting support without setting either variable. python.home specifies the Python installation directory. python.path is used to add to your path modules that are not in your python.home/Lib directory. Multiple paths can be listed in python.path, while python.home must be a single directory. If you set the python.home and python.path variables, you need to restart SOAtest before the changes take effect.

If you are using Java for scripting and want to use SOAtest to recompile modified Java files, see “Using Eclipse Java Projects in SOAtest”, page 772 for details on how to set up your environment for scripting.
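As an illustration of the python.home and python.path settings described above (the directories shown are hypothetical; use the locations from your own installation), the Scripting preferences might contain values such as the following, with multiple python.path entries separated by your platform's path separator:

python.home: C:\jython2.2.1
python.path: C:\jython2.2.1\Lib;C:\scripts\custom-modules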

Specifying the Script

To define a script from SOAtest GUI controls that provide a scripting option:
1. If your Java class is from a Java project in your Eclipse workspace, add the Java project to SOAtest's classpath. See “Using an Existing Java Project”, page 772 for details.
2. From the Language box, select Java, JavaScript, or Python to indicate the language that your method is or will be written in.
3. Define the script to be implemented in the large text field.
   • For Java methods, specify the appropriate class in the Class field. Note that the class you choose must be on your classpath (you can click the Modify Classpath link, then specify it in the displayed Preferences page). Click Reload Class if you want to reload the class after modifying and compiling the Java file.
   • For JavaScript and Python scripts, you can use an existing file as the source of code for your method or you can create the method within SOAtest.
     • To use an existing file, select the File radio button and click Browse. Select the file from the file chooser that opens, then click OK to complete the selection.
     • To create the method from scratch from within SOAtest, select the Text radio button and type or paste your code in the associated text window.
   • To check that the specified script is legal and runnable, right-click the File or Text field (whichever one you used to specify your script), then choose Evaluate from the shortcut menu. SOAtest will report any problems found.

Using Script Templates If you create scripts using Python or JavaScript, you can specify a script template in the SOAtest Preference panel’s Script Template field (in the Scripting page). Whatever code is specified in this field will be used as the default code for inlined scripting in the language with which the field is associated. This is primarily useful for setting default inputs and common global variables. Script templates apply to scripts used in Extension tools, SOAP services, and scripted SOAP inputs.
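For example, a Jython script template (purely illustrative; the imports and method name are not required by SOAtest) might predefine commonly used imports and a default entry point that each new inlined script starts from:

from soaptest.api import *
from com.parasoft.api import Application

def process(input, context):
    # default entry point; replace with tool-specific logic
    return input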

Interacting with SOAtest

If you need your scripts to interact with the SOAtest program, you can do so using the SOAtest Extensibility API. For example, you can use the SOAtest Extensibility API to send the results of a script to a SOAtest Message window or to pass specific values to specific SOAtest tools. To access the SOAtest Extensibility API, choose Help> Extensibility API.
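For instance, the following Jython snippet uses the Application class from the Extensibility API (the same class shown in the JavaScript tip later in this topic) to write a message to a SOAtest message window; the method name and message text are arbitrary:

from com.parasoft.api import Application

def reportStatus(input, context):
    # write a status line to the named message window
    Application.report("Script completed", "Result Window")
    return input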

Accessing Required Jar Files

If you are developing Java code that uses the SOAtest Extensibility API and want to compile that code within the SOAtest environment, use the SOAtest Java Project wizard. This wizard will create a new Eclipse Java project that has access to the SOAtest Extensibility API. For details, see “Using Eclipse Java Projects in SOAtest”, page 772.

If you are developing Java code that uses the SOAtest Extensibility API and want to compile that code outside of the SOAtest environment, you will need the following jar files:
• webking.jar
• parasoft.jar

These jar files are available at [SOAtest_install_directory]/plugins/com.parasoft.xtest.libs.web_1.0.0/root. Additionally, if your scripts need functionality beyond these API classes, you may need to add other (third-party or open source) jars to your classpath. For example:
• For XPath processing, you would also need the class org.apache.xpath.XPathAPI, which is located in xalan.jar.
• For ISO 8583-related scripting, you would also need classes from org.jpos.iso.* such as ISOMsg, BaseChannel, ISOChannel, and ISOPackager. These are located in jpos.jar.

SOAtest Script Variables

You can declare and use variables in SOAtest scripts. A variable declaration must start with the keyword "var", followed by the variable name, an equals sign "=", and the variable value. Each variable must be declared on a separate line. Variables can be declared anywhere in the script, and a variable can be used immediately after its declaration. To assign an existing variable a new value, "re-declare" the variable. A variable can be referenced in a script as follows: ${varName}.

Script example:
-----------------------------
var TestHome=C:\functional-tests\src\test\soatest\tests\LoadTests
var ReportHome=C:\LoadTestReports
var minutes = 1
var scenario = "Linear Increase"
var test=echo

open ${TestHome}\${test}.tst
loadtest -minutes ${minutes} -report ${ReportHome}/%d/${test}/${test}.rpt -xml ${ReportHome}/%d/${test}/${test}.xml -html ${ReportHome}/%d/${test} ${scenario}
-----------------------------

Configuring Jython Preferences Python syntax coloring for the Extension tool is enabled using the PyDev plugin (http://pydev.sourceforge.net). This plugin must have its jython interpreter configured for advanced functionality such as code auto completion. For more details, see: http://www.fabioz.com/pydev/manual_101_interpreter.html

Accessing Data Sources from Scripts

From an Extension Tool

For example, suppose you have a test suite that contains a table data source, an XML Data Bank, and a Writable Data Source, where:
• The table data source is named Data Source Name and has a column labeled Column Name.
• The XML Data Bank contains a column labeled Test 1: Result.
• The XML Data Bank and Writable Data Source belong to a data source labeled Generated Data Source.

To make one of these data sources available to your script, select the data source at the top of the Extension tool’s configuration panel, then check Use Data Source.

Then, within any other method in your script, store a value from the data source into a variable "x" with the line:

x = context.getValue("Data Source Name", "Column Name")

Outside of the Extension Tool

Outside of the Extension tool (e.g., in a Scripted XML response for a Message Stub tool), you would need to add the following method to your script in order to make a data source available:

def addDataSources(context):
    return "Data Source Name"

Then, within any other method in your script, you could store a value from the data source into a variable "x" with the line:

x = context.getValue("Data Source Name", "Column Name")

Notes


When opening older files (created prior to SOAtest 6.1), note that Browser Data Bank Tool column names are automatically transformed to "Extracted: xyz", where "xyz" is the column name that you specified. This provides support for legacy scripts that reference "Extracted: xyz". You can change the column name in the Browser Data Bank back to just "xyz" or to any other name. For more information on scripting within SOAtest, see Help> Extensibility API or check out the SOAtest forums at http://forums.parasoft.com.

Extensibility Examples

This section contains examples of scripts which are often created within SOAtest. To view a project file containing these examples, please reference SOAtestInstallRoot/examples/tests/ScriptingExamples.tst.

One typical usage of a method tool is accessing data from a data source, manipulating it, and then storing it dynamically in an XML Data Bank or Writable Data Source. In this scripting example, we define a method called getKeywords that accesses data from a data source titled "Books" with a data source column titled "keywords". We then return an XML representation of this string so that we can send the output from this script to an XML Data Bank. Note that additional configuration is required to access Data Sources from an Extension Tool. See the previous section for details.

In these samples, the methods take two arguments as input:
• Object input: Represents input passed to the method from a previous test case. For example, if we chain a method tool to the SOAP Response of a SOAP Client tool, the input Object will be a string representing the SOAP Response.
• MethodToolContext context: The getValue method gives access to Data Sources from an Extension Tool. This method can be used along with other methods exposed by the Context interface for sharing Data Source values between scripts and other tools.

Python Example

from soaptest.api import *

def getKeywords(input, context):
    title = context.getValue("Books", "keywords")
    return SOAPUtil.getXMLFromString([title])

Jython Example

The method tool can directly invoke only one method within your script. For Jython scripts, the method invoked must conform to one of the following forms:

1) def methodName():
       # code

2) def methodName(input):
       # code

3) def methodName(input, context):
       # code

In the above examples, "input" refers to either:
• The data passed from the output of the tool to which this Extension Tool is chained (for example, the Response SOAP Envelope of a SOAP Client), OR
• The data that appears in the Input area of the Extension Tool. This is found at the bottom of the Extension Tool interface.

In the vast majority of cases, you can assume that the value passed to the "input" variable will be a primitive Jython string. Generally, the "context" variable is determined dynamically by SOAtest and will be an instance of the Context class, found in the Extensibility API. In the case of an Extension Tool, the context will be an instance of MethodToolContext for this specific tool (as compared to other Extension Tools in your Test Suite).

JavaScript Example

var SOAPUtil = Packages.soaptest.api.SOAPUtil

function getKeywords(input, context) {
    title = context.getValue("Books", "keywords")
    return SOAPUtil.getXMLFromString([title])
}

Java Example

package examples;

import soaptest.api.*;
import com.parasoft.api.*;

public class Keyword {
    public Object getKeywords(Object input, MethodToolContext context)
            throws com.parasoft.data.DataSourceException {
        String[] titles = new String[1];
        titles[0] = context.getValue("Books", "keywords");
        return SOAPUtil.getXMLFromString(titles);
    }
}

Note: Java code such as this example must be compiled outside of SOAtest. You will need to make sure that you have the SOAtest jar files on the classpath for your Java compiler in order to access packages from the SOAtest Extensibility API, in particular webking.jar and parasoft.jar. For this Keyword example, the source code and compiled class file are available at SOAtestInstallRoot/build/examples. The SOAtestInstallRoot/build folder is already on the classpath for SOAtest, so you can use the Keyword example in a test suite. You could also put your own Java class files here to use the Java class in SOAtest. An alternative to using the SOAtestInstallRoot/build folder is to add the class file using the System Properties tab in the SOAtest Preferences.


Language-Specific Tips

Java
• When you specify a Java class in any of the scripting Class fields, you must specify a compiled class that is on your classpath. You can click the Modify Classpath link, then specify it in the displayed Preferences page.
• If the class you are using is part of a package, you need to specify the complete package name as well as the class name (for example, java.lang.String).
• If one of your scripts uses a class that has been changed and recompiled within SOAtest, SOAtest will reload the class and use the most recent class for object construction when you invoke the method. SOAtest does not function this way for any class whose package name starts with one of the following prefixes:
  • sun.
  • com.sun.
  • org.omg.
  • javax.
  • sunw.
  • java.
  • com.parasoft.
  • webtool.
  • wizard.

To manually prompt SOAtest to reload a class that has been modified and recompiled, click Reload Class.

JavaScript
• You can call Java classes and methods that are on your classpath from inside JavaScript methods or JavaScript tests. For instance, if you want to call Application.report() (from the SOAtest Extensibility API) from inside JavaScript, you need to reference it as Packages.com.parasoft.api.Application.report(). You could also reference it by prepending the name with Packages and the name of the package where the Java class lives, as follows:

  var Application = Packages.com.parasoft.api.Application
  Application.report("Message", "Result Window")

• The SOAtest JavaScript emulation is based on FESI. For details on FESI, visit http://home.worldcom.ch/~jmlugrin/fesi/
• To check that the specified script is legal and runnable (or to add method entries to the Method box), right-click the File or Text text field (click whichever one you used to specify your script), then choose Evaluate from the shortcut menu.

Python/Jython
For details on Jython (an implementation of Python that is integrated with Java), including information on how to write Jython and how to invoke Java classes from inside Jython, visit http://www.jython.org. Note that SOAtest ships with Jython 2.2.1.




• If you are using Python scripts, you might need to specify your python.home and python.path variables in the Scripting tab of the SOAtest Preferences panel. Both variables are used to locate Python modules, and Python code that does not import any Python modules can use the Python scripting support without setting either variable. python.home specifies the Python installation directory. python.path is used to add to your path modules that are not in your python.home/Lib directory. Multiple paths can be listed in python.path, while python.home must be a single directory. If you set the python.home and python.path variables, you need to restart SOAtest before the changes will take effect.
• To check that the specified script is legal and runnable (or to add method entries to the Method box), right-click the File or Text text field (click whichever one you used to specify your script), then choose Evaluate from the shortcut menu.

Additional Extensibility Resources

For a step-by-step demonstration of how to extend SOAtest with scripting, see “Extending SOAtest with Scripting”, page 128. For details on how to create and apply an Extension tool that executes a custom Java, JavaScript, or Python script you have written, see “Extension (Custom Scripting)”, page 960.


Using Eclipse Java Projects in SOAtest

This topic explains how to configure SOAtest scripts and Extension tools to invoke code from Eclipse Java projects in your workspace. Sections include:
• Creating a New SOAtest Java Project
• Using an Existing Java Project

Creating a New SOAtest Java Project

SOAtest allows you to create a new Eclipse Java project that has access to SOAtest's Extensibility API, then configure SOAtest scripts and Extension tools to invoke classes from the new Java project. To create a new SOAtest Java project:
1. Choose File> New> Project.
2. Select SOAtest> Custom Development> SOAtest Java Project, then click Next.
3. Complete this wizard, which has the same options as Eclipse’s Java Project wizard.
4. Click Finish.

Your new Java project will be shown in the Package Explorer view in the Eclipse Java development perspective. The project's build path will automatically have the jar files needed in order to use SOAtest's Extensibility API. Any Java classes added to your project can be accessed by Extension tools in your SOAtest test suite. For an example of how to do this, see “Java Example”, page 769.

Using an Existing Java Project

To use an existing Java project from your workspace, you must first add that Java project to SOAtest's classpath as follows:
1. Choose SOAtest> Preferences.
2. Open the System Properties page.
3. Click the Add Java Project button and choose the appropriate project. The selected Java Project's build output folder and build path entries will be added to the classpath table.

If the Automatically reload classes option is enabled, then SOAtest will attempt to reload classes from your Eclipse project after being modified or recompiled. The Reload button can also be used to force SOAtest to reload classes from the classpath entries.


Available Tools

This topic introduces SOAtest’s tools. Sections include:
• Introducing SOAtest Tools
• Customizing Tool Settings
• Combining Tools

Introducing SOAtest Tools

SOAtest includes a variety of tools that help you test Web services. You can also extend SOAtest’s capabilities by scripting your own tools or integrating third-party tools into the SOAtest environment. SOAtest can apply the following tools:

Messaging Tools
• SOAP Client
• Messaging Client
• Message Stub
• Common Messaging Options
• webMethods
• EJB Client
• Call Back
• UDDI Query
• Transmit Tool
• ISO 8583

XML Tools
• XML Validator
• XML Data Bank
• XML Transformer
• XSLT
• XML Encryption
• XML Signer
• XML Signature Verifier
• XML Encoder
• XML Decoder

Viewing Tools
• Traffic Viewer
• Event Monitor
• Edit
• Write File
• File Stream Writer
• stderr
• stdout

Validation Tools
• Diff
• WS-I
• DB
• XML Assertor
• WS-BPEL Semantics Validator

Web Application Tools
• Browser Testing
• Browser Contents Viewer
• Browser Validation
• Browser Data Bank
• Browser Stub
• Scanning
• Check Links
• Spell
• HTML Cleanup
• Search
• Browse

Other Tools
• Header Data Bank
• JSON Data Bank
• Object Data Bank
• Coding Standards
• FTP Client
• External
• Aggregate
• Extension (Custom Scripting)
• Attachment Handler
• Decompression
• Jtest Tracer Client
• WSDL Content Handler
• WSDL Semantics Validator

Most test suite test cases are based on a SOAP Client tool and use other tools as “outputs” that operate on the messages that the SOAP Client tool returns. For example, if you wanted to test whether a certain SOAP remote procedure call always returned the same output for a given input, you might create a SOAP Client tool, then use a Diff control as its output. Or, if you wanted to test whether a Web service returned values in a correct format, you might create a SOAP Client tool, then use a Coding Standards output to apply a set of rules that checked if the output matched a certain required pattern. If it did, you could attach a regression control to the Coding Standards output; SOAtest would then alert you if the Web service failed to match that required pattern in subsequent tests.

Customizing Tool Settings You can customize settings related to the tools in the SOAtest toolbar by modifying options in the tool’s configuration panel. To access this panel, double-click the tool’s node in the Test Case Explorer.

Combining Tools It is possible to combine multiple tools into a single, more complex tool. For details on how this is done, see “Aggregate”, page 959.


Messaging Tools

In this section:
• SOAP Client
• Messaging Client
• Message Stub
• Common Messaging Options
• REST Client
• webMethods
• EJB Client
• Call Back
• UDDI Query
• Transmit Tool
• ISO 8583


SOAP Client

The SOAP Client tool sends messages to SOAP servers. It can be used to test a Web service, test the communication between the client and server, and check the content of the SOAP messages. You can use the SOAP Client tool to test services with or without a WSDL. This tool is the foundation of most test cases.

If you have access to a WSDL, you might want to use the New Test Suite Wizard to automatically generate SOAP Client tests for the service you want to test. For information on this functionality, see “Automatic Creation of Test Suites for SOA: Overview”, page 384. In addition, you can configure SOAtest to report or manipulate the results of any SOAP Client test by adding an appropriate “output” tool. For information on this functionality, see “Adding Test Outputs”, page 333.

To send a message using the SOAP Client tool, you need to tell SOAtest what message to send and how to send it. This is done by specifying the parameters in the Project Configuration panel (this can be accessed by double-clicking the Test: SOAP Client node in your Test Suite tree). The SOAP Client Project Configuration panel is divided into two main sections. The upper section contains the general options that are available for all types of SOAP envelopes (i.e. Literal XML, Form XML, Scripted XML, Form Input). The bottom section contains tabs in which you can configure Request, Transport, Attachment, WS-Policy, and Miscellaneous options.

Information on how to configure the different options of the SOAP Client tool can be found in the following subsections:
• General Options (For All Types of SOAP Envelopes)
• WSDL Options
• Request Options
• Transport Options
• Attachment Options
• WS-Policy Options
• Misc Options

General Options (For All Types of SOAP Envelopes) The SOAP Client tool shares general options with the Messaging Client and Message Stub tools. For more information on these shared options, see “General Options (For All Types of SOAP Envelopes)”, page 792.

WSDL Options

Specifying options in the WSDL tab allows SOAtest to populate the Request tab with items that make it easier for you to specify the request message. The WSDL tab allows you to specify the following settings:
• Resource Mode: Specifies WSDL URI or Schema URL mode.
• WSDL URI: Describes the WSDL URI where this Web service can be accessed. You can either enter a value or click the Browse button. If you do not have a WSDL, you can leave this field empty.
• Schema URL: Describes the Schema URL where this Web service can be accessed. You can either enter a value or click the Browse button. If you do not have a schema, you can leave this field empty.
• Constrain to WSDL: (Constrain to Schema if Schema resource mode is selected) Determines whether certain parameters of the messaging tool obtain their values from the WSDL (or Schema) rather than from manual entry. If this option is enabled, certain parameters (e.g. router endpoint, SOAP action, SOAP body and header parameters) are disabled and get their values from the WSDL. If this option is disabled, the Refresh WSDL (Refresh Schema) button will also be disabled.
• WSDL Documentation: (Automatically completed if available) Describes the Web service at the given WSDL URI.

Request Options The Request tab allows you to configure the request that you want the tool to send. From the Request tab of the SOAP Client tool, you can select input modes from the Views drop-down list. The SOAP Client tool shares these options with the Messaging Client and Message Stub tools. For more information on these shared options, see “Request/Message Body Options”, page 792.

Transport Options The Transport options allow you to determine whether the client sends requests using HTTP 1.0, HTTP 1.1, JMS, SonicMQ, WebSphere MQ, RMI, SMTP, TIBCO, .NET WCF HTTP, or .NET WCF TCP protocols. To configure the properties of each protocol, select the appropriate protocol from the Transport drop-down list within the Transport tab of the SOAP Client tool.

Tip - Copying and Pasting Transport Settings Across Tests To simplify test creation, you can copy transport settings from one test to another. Just select the left-pane setting categories that you want to copy, then use right-click copy and paste commands.

For more information, see the following sections:
• “Using HTTP 1.0”, page 689
• “Using HTTP 1.1”, page 691
• “Using JMS”, page 694
• “Using IBM WebSphere MQ”, page 710
• “Using RMI”, page 722
• “Using SMTP”, page 723
• “Using TIBCO Rendezvous”, page 720
• “Using .NET WCF TCP”, page 726

Attachment Options

The Attachment tab allows you to send either Binary or XML attachments without scripting. To send an attachment, perform the following from the Attachment tab:
1. Click the Add button. An XML Attachment entry displays in the Attachment table.
2. Double-click the XML Attachment entry. An Edit Attachments dialog displays.
3. In the Edit Attachments dialog, select either XML or Binary from the Mode drop-down menu.

The following option is available for XML Mode:
• Views: Select the desired view from the drop-down menu and configure accordingly. Options are Literal XML, Form XML, Scripted XML, and Form Input.

The following options are available for Binary Mode:
• Base 64 Encoding: Enables Base 64 Encoding to encode the binary value.
• Data Source Column: Select to send values from a data source column.
• File: Select to send values from a file. Choose the desired file by clicking the Browse button. Check the Persist as Relative Path option if you want the path to this file to be saved as a path that is relative to the current configuration file.
• Text: Select to send a text value.
• Content type: Specify the content type. Click the Edit Headers button if you want to add, modify, or delete attachment headers.

WS-Policy Options

The WS-Policy tab is used to keep track of which policy the test case is using. It also allows you to switch between policies and update the attached tools based on that policy. The following options are available in the WS-Policy tab of the SOAP Client tool:
• Constrain to policies defined in WSDL: If this option is selected, you will be able to update policy configurations.
• Update policy configuration: Clicking this button will remove the current policy configuration and add the newly chosen policy configuration. Policies in a WSDL can be attached to the Endpoint, Operation, or Message subjects. The result is the union of all the alternatives chosen for each subject.

Misc Options

The following options are available in the Misc tab of the SOAP Client tool:
• Notes: (optional) Records additional information about the test that you enter in this field.
• Timeout after (seconds): Specifies the length of delay (in seconds) after which SOAtest should consider your FTP, telnet, or HTTP requests to be timed out. The Default setting corresponds to the timeout set in the Preferences panel. The Custom setting allows you to enter a timeout. A non-positive timeout value can be entered to specify an infinite timeout.
  • Fail the test on Timeout: Select this option to fail the test on the specified timeout.
  • Pass the test only if a Timeout occurred: Select this option to pass the test if the specified timeout occurred (i.e. the test did not finish execution within the specified time).
• Attachment Encapsulation Format: Specifies whether to use the Default or Custom encapsulation format. The Default option specifies whatever is chosen as the Attachment Encapsulation Format in the Misc tab of the system preferences (for more information, see “SOAP Client Settings”, page 757). The Custom option allows you to choose MIME, DIME, MTOM Always, or MTOM Optional.
• SOAP Version: Select Custom from the drop-down menu and select either SOAP 1.1 or SOAP 1.2. The default value is SOAP 1.1.
• Outgoing Message Encoding: Allows you to choose the encoding for outgoing messages. You can choose any Character Encoding you wish from the SOAtest Preferences panel to read and write files, but the Outgoing Message Encoding provides additional flexibility so you can set a different charset encoding for the SOAP request from the global setting.
• Style/Use: (Options are disabled if the Constrain request to WSDL checkbox is selected in the WSDL tab) These options allow you to select the body style and encoding of the SOAP message:
  • Body Style: Select either document or rpc.
  • Use: Select either encoded or literal.
  • Encoding Style URI: (Automatically completed if available) Lists the encoding style URI used to send requests.
  • Target Object URI: Specifies the target object URI.


Messaging Client

The Messaging Client tool is similar to the SOAP Client tool in that it sends payloads over HTTP to the server. However, this tool is very useful for non-SOAP clients, such as XML servlets and proprietary Web services that have their own specifications. It can be used to test a Web service, test the communication between the client and server, and check the content of the HTTP messages. In addition, the Messaging Client can be used to test REST-style web services. For more information, see “Testing RESTful Services”, page 420.

To send a message using the Messaging Client tool, you need to tell SOAtest what message to send and how to send it. This is done by specifying parameters in the Project Configuration panel (this can be accessed by double-clicking the Test: Messaging Client node in your Test Suite tree). This topic explains how to configure and apply the Messaging Client tool. Sections include:
• General Options
• Request Options
• Transport Options
• Success Criteria Options

General Options The Messaging Client tool shares general options with the SOAP Client and Message Stub tools. For more information on these shared options, see “General Options (For All Types of SOAP Envelopes)”, page 792.

Request Options From the Request tab of the Messaging Client tool, you can select input modes from the Input Mode drop-down list. The Messaging Client tool shares Input Mode options with the SOAP Client and Message Stub tools. For more information on these shared options, see “Request/Message Body Options”, page 792.

Transport Options

The Transport options allow you to determine whether the client sends requests using HTTP 1.0, HTTP 1.1, JMS, SonicMQ, WebSphere MQ, RMI, SMTP, or TIBCO protocols, or a custom protocol (to select a custom method, choose CUSTOM, then enter the name of your custom method in the Value field that displays). To configure the properties of each protocol, select the appropriate protocol from the Transport drop-down list. For more information, see the following sections:
• “Using HTTP 1.0”, page 689
• “Using HTTP 1.1”, page 691
• “Using JMS”, page 694
• “Using IBM WebSphere MQ”, page 710
• “Using RMI”, page 722
• “Using SMTP”, page 723
• “Using TIBCO Rendezvous”, page 720


Success Criteria Options

The following options are available in the Success Criteria tab of the Messaging Client tool:
• Valid HTTP Response Codes: Allows you to enter a range of valid HTTP response codes that will allow the test to succeed. You can enter ranges as comma (or semicolon) separated values, with each value being either a single code or a range of codes separated by hyphens. For example, if you enter "302, 400-499", a 302 code or any code in the 4xx range will be accepted.
• Timeout after (milliseconds): Specifies the length of delay (in milliseconds) after which SOAtest should consider your FTP, telnet, or HTTP requests to be timed out. The Default setting corresponds to the timeout set in the Preferences panel. The Custom setting allows you to enter a timeout. A non-positive timeout value can be entered to specify an infinite timeout.
  • Fail the test on timeout: Select this option to fail the test on the specified timeout.
  • Pass the test only if a timeout occurred: Select this option to pass the test if the specified timeout occurred (i.e. the test did not finish execution within the specified time).


Message Stub

The Message Stub (formerly called "Client Tester") is used to emulate a service. This tool is used to stub out a service and can automatically generate stubs for operations defined in a WSDL. It supports HTTP, JMS, and REST, and can be automatically generated from SOAP Client tests and Messaging Client tests as described in “Creating Stubs from WSDLs or Manually”, page 539.

Information on how to configure the different options of the Message Stub tool can be found in the following subsections:
• General Options (For All Types of SOAP Envelopes)
• WSDL Options
• Response Options
• Test Correlation Options
• Data Source Correlation Options
• Attachment Options
• Service Options
• Using Existing Data Sources or Rapidly Creating Data Sources for Responses
• Using Scripted Logic
• Adding Attachment Handlers to the Message Stub

Tip - More about Stubs and Virtualization
For a broader look at how to use SOAtest to emulate the behavior of resources that you cannot (or do not want to) access during testing, see “Service Virtualization: Creating and Deploying Stubs”, page 531.

General Options (For All Types of SOAP Envelopes) The Message Stub control panel is divided into two main sections. The upper section contains the general options that are available for all types of SOAP envelopes (i.e. Literal XML, Form XML, Scripted XML, Form Input). The Message Stub tool shares general options with the SOAP Client and Messaging Client tools. For more information on these shared options, see “General Options (For All Types of SOAP Envelopes)”, page 792. The bottom section consists of tabs used to configure the SOAP envelope.

WSDL Options

Specifying options in the WSDL tab allows SOAtest to populate the Response tab with items that make it easier for you to specify the response message. You can specify the following WSDL settings:
• Resource Mode: Specifies WSDL URI or Schema URL mode.
• WSDL URI: Describes the WSDL URI where this Web service can be accessed. You can either enter a value or click the Browse button. If you do not have a WSDL, you can leave this field empty.
• Schema URL: Describes the Schema URL where this Web service can be accessed. You can either enter a value or click the Browse button. If you do not have a schema, you can leave this field empty.
• Constrain to WSDL: Determines whether certain tool parameters obtain their values from the WSDL rather than from manual entry. If this option is enabled, certain parameters (e.g. router endpoint, SOAP action, SOAP body and header parameters) are disabled and get their values from the WSDL. If this option is disabled, the Refresh WSDL button will also be disabled.
• WSDL Documentation: (Automatically completed if available) Describes the Web service at the given WSDL URI.

Response Options

The Response tab allows you to configure the response values that you want the emulated asset to deliver when specified requests are received. The options available vary depending on what option is selected in the Views menu.

Literal XML, Form XML, Scripted XML, and Form Input Views

The available options for Literal XML, Form XML, Scripted XML, or Form Input are the same as those available with the SOAP Client tool. For more information on these options, see:
• “Common Messaging Options”, page 792
• “Literal XML View Options”, page 793
• “Form XML View Options”, page 794
• “Scripted XML View Options”, page 804
• “Form Input View Options”, page 805

Multiple Responses View

The Multiple Responses view allows you to specify which responses to use for specific requests. In the Conditions tab, you can specify the request conditions that must be met in order for the correlated response to be sent. You can specify the condition using XPath functions and/or URL parameters for RESTful services (built-in support is available for GET and POST). The Message tab specifies what response is sent if the conditions are met.

Tip - Using 'Always Match' with Multiple Responses
Enable the Always Match option if:
• You want to specify a single response that the stub should always return, regardless of the content of the incoming message, or
• You have multiple responses and you want to specify the final response that the stub should return. In this case, it serves as a catch-all, last-resort response if none of the previous conditions matched. For example, you could specify an "always match" final response that returns a SOAP Fault indicating that the service failed to process the request and return something based on the incoming values.

If more than one XPath or URL parameter matches a response, SOAtest will return the first matching response in the list (use the Up and Down buttons to specify the desired order of responses). If the XPath or URL parameter provided for each response results in a unique match, the order of the responses is irrelevant.

Scripted XML View The Scripted XML view allows you to specify very complex logic. For details, see “Using Scripted Logic”, page 790.

Test Correlation Options

The Test Correlation tab allows you to specify which messages this Message Stub tool/test accepts and processes. Various messages sent to the stub URL are routed to specific Message Stub tools (each of which handles different operations) based on the settings here. For example, one Message Stub tool might respond to customer registration messages, another might respond to payment messages, and another might function as a default "catch all" that is used when none of the others match. You can specify which messages a Message Stub accepts by entering values in the Transport or Message correlation tab:
• Transport: Allows you to specify HTTP Headers or JMS message properties within the SOAP message that will determine whether or not the message is processed.
• XML Message: Allows you to specify XPaths within the SOAP message that will determine whether or not the message is processed.
• URL Parameters (for RESTful services): Allows you to specify URL parameters that will determine whether or not the message is processed. URL parameters can be repeated (you can have the same parameter set to different values).

You can configure one, both, or neither of the Transport Correlation and Message Correlation settings. If neither correlation is configured, every incoming message will be processed.

Transport Correlation

To configure the transport correlation:
1. Select the Enable correlation check box.
2. Click the Add button. A new entry row displays.
3. Enter the Name and Value of the message correlation you would like to specify.

Message Correlation

To configure the message correlation:
1. Select the Enable correlation check box.
2. Click the Edit button. An Edit XPath Function dialog displays.
3. Select an element from the Element tree, select a function from the Function drop-down menu, and click OK.

Data Source Correlation Options The Data Source Correlation tab allows you to specify which data source row values to use in stub responses.


Benefits

One way to configure stubs with Multiple Responses mode is to manually configure the Message Stub tool to send different response messages based on the nature of the incoming message. Another way to configure stubs to dynamically respond with the desired message is to use data sources. You can easily fill out data source tables (Excel, etc.) where each row contains the values in the incoming message (typically, just the leaf node values) that you want the stub to respond to, plus another column that specifies how you want the stub to respond when the specified conditions are met (see “Using Existing Data Sources or Rapidly Creating Data Sources for Responses”, page 789 for details). After that, you can configure the mapping between the request and response message values and the columns within the data source. This allows request/response use cases to be configured in one easy-to-edit table (data source), allows them to be managed there as they scale to more and more messages, and also provides more flexibility with response messages (since Form Input allows you to have some values fixed, others parameterized, and some automatic or even scripted).

Configuration

To parameterize stub responses from a data source:
1. Ensure that you have a data source configured in SOAtest and available for the Message Stub.
2. Click the Add button in the appropriate part of the Data Source Correlation tab, provide an XPath and/or URL parameter, and select a column name.
   • If a WSDL was provided, the XPath can be generated using the Edit button. That dialog validates the XPath expression and the column name in real time.
   • SOAtest validates the XPath syntax while you type/edit these XPath expressions, if you choose to customize them instead of using the visual Edit option.
3. Parameterize the Response area of the Message Stub by referencing other data source columns within that same data source (i.e., Form Input, Form XML, etc.). You can add several XPath or URL parameter to column name mappings.

Processing Details When the Message Stub executes in a test suite, or as a hosted stub, the incoming messages are evaluated through these XPath expressions/URL parameters. The values are then matched against the corresponding data source values (each with its respective column) to find a row that matches the values. If no match was found, then an error is returned. If a row was found, then that row will be used for any parameterized values in the Message Stub Response message. This way, the stub can respond with the desired message values based on values in the incoming messages. The value matching supports the wild cards * and ?. For example, if you want an incoming value named "title keyword" to match a certain row as long as it contains the word Linux, then you can have the data source value "*Linux*". * matches zero or more characters, ? matches a single character.
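To illustrate (the column names other than "title keyword" and all values below are hypothetical), a correlation data source might pair a request-matching column, which is matched against an XPath in the incoming message and supports the * and ? wildcards, with a response column that parameterizes the stub's Response message:

title keyword | book count
*Linux*       | 12
*Java*        | 8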

Attachment Options

The Attachment tab allows you to send either Binary or XML attachments without scripting. To send an attachment, perform the following from the Attachment tab:
1. Click the Add button. An XML Attachment entry displays in the Attachment table.
2. Double-click the XML Attachment entry. An Edit Attachments dialog displays.
3. In the Edit Attachments dialog, select either XML or Binary from the Mode drop-down menu.

The following option is available for XML Mode:
• Views: Select the desired view from the drop-down menu and configure accordingly. Options are Literal XML, Form XML, Scripted XML, and Form Input.

The following options are available for Binary Mode:
• Base 64 Encoding: Enables Base 64 Encoding to encode the binary value.
• Data Source Column: Select to send values from a data source column.
• File: Select to send values from a file. Choose the desired file by clicking the Browse button. Check the Persist as Relative Path option if you want the path to this file to be saved as a path that is relative to the current configuration file.
• Text: Select to send a text value.
• Content type: Specify the content type. Click the Edit Headers button if you want to add, modify, or delete attachment headers.

Service Options

The Service Options tab allows you to configure how the SOAP message is processed. The following options are available:
• Message Exchange Pattern: Select Solicit Response or Notification Only.
• Time Options: Allows you to set the following options related to the execution time (note that the Execution Time of the Message Stub is calculated from the time that the servlet finishes receiving the request to the time that the sending of the response begins):
  • Timeout after (milliseconds): Specifies the length of delay (in milliseconds) after which SOAtest should consider the SOAP message to be timed out. The Default setting is 30000. The Custom setting allows you to enter a timeout. A non-positive timeout value can be entered to specify an infinite timeout.
    • Fail the test on Timeout: Select this option to fail the test on the specified timeout.
    • Pass the test only if a Timeout occurred: Select this option to pass the test if the specified timeout occurred (i.e. the test did not finish execution within the specified time).
  • Think Time: Enter the amount of time (in milliseconds) for the message delay you want to emulate. This can be used to emulate a slow service. If a data source is available, you may also parameterize this value. A Think Time value larger than the Timeout value will not force a timeout. Timeouts will only occur if a message is not correlated successfully and the Message Stub is sitting idle, unable to complete.
• Return Status: Allows you to specify how the message is returned, for instance to emulate working or faulty services. To use the default value of 200 OK, select the Use Default Return Status check box. If this option is not selected, the following options are available:
  • Return Status: Enter the custom return status value. If a data source is available, you can also parameterize this value.
  • Return Message: Enter the custom return message. If a data source is available, you can also parameterize this value.
• Run Mode: Select Run Indefinitely to enable the Message Stub tool to run indefinitely when run as a functional test. This option is intended for backwards compatibility and for complex scenarios where the Message Stub is run as part of a functional test scenario.
• Style/Use: These options allow you to select the body style and encoding of the SOAP message:
  • Body Style: Select either document or rpc.
  • Use: Select either encoded or literal.
  • Encoding Style URI: (Automatically completed if available) Lists the encoding style URI used to send requests.
  • Target Object URI: Specifies the target object URI.
• Attachment Encapsulation Format: Specifies whether to use the Default or Custom encapsulation format. The Default option specifies whatever is chosen as the Attachment Encapsulation Format in the SOAP Client tab of the Preferences panel (for more information, see “SOAP Client Settings”, page 757). The Custom option allows you to choose MIME, DIME, MTOM Always, or MTOM Optional.
• Headers: Allows you to add custom headers in the SOAP response.

Adding Custom Headers

To add custom headers:
1. Click the Add button in the Service Options tab. A dialog opens.
2. Enter the Name and Value of the header you would like to specify.
3. Click OK.

Using Existing Data Sources or Rapidly Creating Data Sources for Responses

Specifying response values in a data source is a very efficient way to add a significant volume of request/response pairs.

Using Existing Data Sources

If you already have a data source that specifies values for request parameters and the desired corresponding response parameters, you can use those values as follows:
1. Add the data source to SOAtest (as described in “Adding a Data Source”, page 346).
2. Configure the Message Stub’s data source mapping with the appropriate request columns (as described in “Data Source Correlation Options”, page 786).

Rapidly Creating Data Sources

If you do not already have such a data source but want a fast way to specify multiple request/response sets:
1. Use the procedure described in “Generating a Data Source Template for Populating Message Elements”, page 353 to create a CSV file from the Form Input view of the Message Stub tool. A data source will be generated and added to the test suite. This generated data source will contain columns for responses. The Message Stub tool’s Form Input view will be parameterized automatically.
2. Add a new data source column (e.g., using Excel, OpenOffice, or a similar spreadsheet application) for each request parameter that should be used to determine the response.
3. Configure the Message Stub’s data source mapping with the new request columns you added. See “Data Source Correlation Options”, page 786 for details.
4. Add new rows to the data source (e.g., using Excel, OpenOffice, or a similar spreadsheet application) in order to specify values for request parameters and the desired corresponding response parameters.

Using Scripted Logic You can script response values based on the incoming request. This allows for more complex logic for your stubs. In addition, the Message Stub allows you to access data source values through the script. Access to these values is similar to how you would access them through the Extension Tool. To use scripted logic within the Message Stub tool: 1. Enter a WSDL in the WSDL URL field of the Message Stub tool. 2. Right-click the Message Stub node and choose Add Output. The Add Output wizard displays. 3. From the Add Output wizard, select Incoming Request from the left pane and XML Data Bank from the right pane and click Finish. 4. Double-click the Incoming Request> XML Data Bank node beneath the Message Stub node. The XML Data Bank configuration panel displays in the right GUI panel. 5. In the right GUI panel, add the XPath of the value you want to access in your script.

6. Double-click the Message Stub node. The Message Stub configuration panel displays in the right GUI panel.
7. In the Response tab, choose the Scripted XML view.
8. Specify your logic. The following is the basic template for accessing data source and data bank values (a complete example follows step 9):

def customLogic(context):
    # Retrieve the data source value. "Data Source Name" should be replaced
    # with the name of your data source and "Column Name" should be the column
    # that your value is coming from. You can access many columns from the same
    # data source within this script. For Data Bank values, the table is always
    # named "Generated Data Source", so you only need to replace "Data Bank Column Name".
    dataSourceValue = context.getValue("Data Source Name", "Column Name")
    dataBankValue = context.getValue("Generated Data Source", "Data Bank Column Name")
    # add custom logic that uses the values from the data source and data bank

# The following method tells SOAtest which data source you will be using in this script.
# "Data Source Name" should be replaced with the name of your data source.
def addDataSources(context):
    return "Data Source Name"


9. Select the correct method from the Method combo box. The method you select should be your entry point. In the above example, the method would be customLogic().
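For example, the following Jython sketch (an illustration, not product code: it assumes the selected entry-point method returns the response payload as a string, that the XML Data Bank column added in step 5 is named "Data Bank Column Name", and that a data source named "Accounts" with a "defaultStatus" column has been added to the test suite) returns a different payload depending on the incoming request value:

def customLogic(context):
    # Value extracted from the incoming request by the chained XML Data Bank.
    requestId = context.getValue("Generated Data Source", "Data Bank Column Name")
    # Value taken from the data source declared in addDataSources() below.
    defaultStatus = context.getValue("Accounts", "defaultStatus")
    if requestId == "0":
        # Return a canned "not found" payload for an unknown id.
        return "<getAccountResponse><status>NOT_FOUND</status></getAccountResponse>"
    # Otherwise echo the id back together with the status from the data source.
    return ("<getAccountResponse><id>" + requestId + "</id>" +
            "<status>" + defaultStatus + "</status></getAccountResponse>")

def addDataSources(context):
    return "Accounts"

In this sketch, customLogic is the method you would select in the Method combo box.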

Adding Attachment Handlers to the Message Stub It may be useful to add an Attachment Handler to the Message Stub in order to test your attachment handling when sending message attachments. To add an Attachment Handler to a Message Stub, complete the following: 1. Select the Message Stub node and click the Add test or output button. The Add Output wizard displays. 2. From the Add Output wizard, select Incoming Attachment from the left pane and Attachment Handler from the right pane and click Finish. 3. Double-click the Attachment Handler node, then configure the tool in the tool configuration panel. 4. (Optional) Select the Attachment Handler node beneath the Message Stub node and click the Add Test/Add Output button. The Add Output wizard displays from which you can add a Write File tool to write the attachment out as a binary file.


Common Messaging Options The SOAP Client, Messaging Client, and Message Stub tools share common options. This section explains these options.

General Options (For All Types of SOAP Envelopes) The following are the general options available for all types of SOAP envelopes for the SOAP Client, Messaging Client, and Message Stub tools:

Data Source: Specifies the Data Source to be used for providing values sent to the server. This menu is only available if a Data Source was added to the project.



WSDL URI: (Schema URL for Messaging Client) Describes the WSDL URI (or Schema URL) where this Web service can be accessed. You can either enter a value or click the Browse button. If you do not have a WSDL (or schema), you can leave this field empty.



Constrain to WSDL: (Constrain to Schema for Messaging Client) Determines whether certain parameters of the messaging tool obtain their values from the WSDL (or Schema) rather than from manual entry. If this option is enabled, certain parameters (e.g. router endpoint, SOAP action, SOAP body and header parameters) are disabled and get their values from the WSDL. If this option is disabled, the Refresh WSDL (Refresh Schema) button will also be disabled.



Refresh WSDL: (Refresh Schema for Messaging Client) Refreshes the WSDL (or schema) from the given location URL and reparses it.

Request/Message Body Options From the Request tab of the SOAP Client and Messaging Client tools, or from the Messaging Body tab of the Message Stub tool, you can select SOAP envelopes from the Views drop-down list (or Input Mode drop-down list for the Messaging Client). The Views/Input Mode drop-down list specifies how you want to enter the SOAP envelope (header and body). Note: If using the Message Stub tool, Single Response must be selected from the Mode drop-down list in order for the Views drop-down list to display. You can choose from the following options in the Views/Input Mode drop-down list:

• Literal XML: Use actual XML. For details, see "Literal XML View Options".
• Form XML: Configure XML messages via a tree view of the XML requests and responses. For details, see "Form XML View Options".
• Scripted XML: Dynamically generate XML from a script. For details, see "Scripted XML View Options".
• Form Input: Enter parameters in the UI form fields. For details, see "Form Input View Options".
• MapMessage Input: Specify input in a Messaging Client tool when JMS is selected as the Transport type and javax.jms.MapMessage is chosen as the MessageType. For details, see "MapMessage Input Options".


Literal XML View Options This topic covers literal XML view options, which allow you to use XML to specify the request/message body. The Literal XML View contains the following options: •

Text



File



Data Source

Text Use this option if you want to type or copy the XML that specifies the SOAP envelope into the UI. Select the appropriate MIME type, then enter the XML in the text field below the Text radio button. •

You can beautify the XML by right-clicking within the Text field and selecting Beautify from the shortcut menu.



You can choose elements from a schema for both the body and the header by right-clicking within the Text field and selecting Import Schema Element. After selecting this option, a dialog appears from which you can load declared elements from a schema location. After loading elements, you can select multiple elements for the SOAP header. Once you click OK, a SOAP Envelope will be created based on the chosen element definitions.

File If you already have an XML file that specifies the SOAP envelope, use this option to indicate the location of that file. •

Check the Recursively Process Input option if you want the input processed recursively.



Check the Persist as Relative Path option if you want the path to this file to be saved as a path that is relative to the current configuration file. Enabling this option makes it easier to share tests across multiple machines. If this option is not enabled, the test suite will save the path to this file as an absolute path.

Data Source If you already have an XML file containing the various requests that you want to send, set up a File data source for that file (as described in “Adding a File Data Source”, page 351), then use the Data source name column box to indicate which column of data you want to use.


Form XML View Options This topic covers the Form XML view, which provides a tree view of the XML request and XML response, allowing for XML manipulation through the SOAtest GUI. Sections include: •

Overview



Manipulating the XML View Tree



Manipulating the XML Configuration Tabs



Using Data Sources with Form XML: Parameterized Values



Using Data Sources with Form XML: Excluding Elements and Attributes

Overview The literal XML is graphically represented in the Form XML mode, giving you a flexible alternative in configuring XML messages. You can add, remove, and rename XML components of the XML message. In addition, if any data sources are available, data source values can be used in configuring XML messages. Any changes made in the Form XML mode will also be made in the Literal XML mode and vice versa. If form XML is selected as the SOAP Envelope, the lower part of the panel consists of an XML View Tree and XML Configuration Tabs.

Manipulating the XML View Tree The XML View Tree (located in the lower left corner of the right GUI panel) displays the literal XML as a tree, with each tree node representing an element. By right-clicking any of the nodes in the XML View Tree, a shortcut menu appears.


The XML View Tree shortcut menu contains the following options:

Option

Description

Cut/Copy/Paste

Select to cut, copy, or paste an XML node in Form XML. You may use keyboard shortcuts to accomplish this as well (CTRL+X for cut, CTRL+C for copy, CTRL+V for paste). When attempting to paste an element within the root element, a dialog will display asking you whether to replace the entire content, or add as a child. You can perform cut, copy, and paste operations across multiple XML View Trees within Diff tools, SOAP Clients, HTTP Traffics, etc. These operations persist parameterization and any other exclude/encode settings.

Insert New

Select to insert a new Element, Attribute, or Namespace Declaration. Depending on your selection, the corresponding tab will display parameter options. You can also insert new components by pressing the Insert key on your keyboard.

Move Up

Select to move a node up within the XML View Tree. You can also move a node up by selecting the node with your mouse and pressing the u key on your keyboard.


Move Down

Select to move a node down within the XML View Tree. You can also move a node down by selecting the node with your mouse and pressing the d key on your keyboard.

Remove

Select to remove a node from the tree. You can also remove a node from the tree by selecting the node with your mouse and pressing the Delete key on your keyboard.

Encode Child Elements

Select to encode a subtree of elements as text content for its parent element. If this option is selected for a particular node, all child elements of that node will display as italicized in the Form XML tree, and will be encoded as text within the parent element, rather than as XML. If this option is not selected, child elements will not be encoded and will remain as sub-elements.

Import Schema Element

Select to choose elements for both the body and the header. After selecting this option, a dialog appears from which you can load declared elements from a schema location. After loading elements, you can select multiple elements for the SOAP header. Once you click OK, a SOAP Envelope will be created based on the chosen element definitions.

Populate

Fills SOAP array and element parameters. This also sets any element nils to false and expands them out. This command is only available if the test is created from a WSDL.

Expand All

Select to expand all nodes in the XML View Tree. You can also expand all the nodes in the XML View Tree by clicking in any area of the XML View Tree and pressing the e key on your keyboard.

Collapse All

Select to collapse all nodes in the XML View Tree. You can also collapse all the nodes in the XML View Tree by clicking in any area of the XML View Tree and pressing the c key on your keyboard.

Show Namespaces

Select to view the namespaces within the XML View Tree.

Show Attributes

Select to view the attributes within the XML View Tree.

Beautify

Select to beautify all well-formed XML fragments.

Manipulating the XML Configuration Tabs The XML Configuration Tabs (Elements, Attributes, Namespace Declarations) allow you to add, remove, and rename XML components. To add or modify an XML component, select the appropriate node from the XML View Tree, and then click on the appropriate XML configuration tab. The following tabs are available: •

Element tab



Attributes Tab



Namespace Declarations Tab

Element tab Displays options to modify Element parameters of the selected node from the XML View Tree.


To add a new Element, right-click the desired node from the XML View Tree and select Insert New> Element from the shortcut menu. A NewElement node will appear underneath the node you right-clicked. Select the NewElement node to configure it.

The following options are available in the Element tab of the Form XML mode:

Option

Description

Encode Value

(Only available for nodes that contain no child nodes) Select to encode an escaped XML fragment value as text. For example, if you are sending values such as <UserName>, you can encode the less than (<) and greater than (>) characters as &lt; and &gt;. This way, the XML parser will not misconstrue the <UserName> value as an XML tag, but rather treat it as text. This feature also works with parameterized values stored in a data source.

Use Data Source: Exclude with empty string

(Only available if a data source is specified in the SOAP Client tool) Allows you to control whether elements and attributes are sent depending on the values in a data source. For more information, see “Using Data Sources with Form XML: Excluding Elements and Attributes”, page 801.


Value

(Only available if the Element has no children elements within it) Specifies the content value of the Element. If a data source is available, you will have the option to choose either a Fixed or a Parameterized value from a drop-down menu. For more information, see “Using Data Sources with Form XML: Parameterized Values”, page 800.

Prefix

Specifies the namespace prefix for the Element. Depending on the namespace declarations of the selected element and its ancestors, the options in the Prefix drop-down menu will vary.

Local name

Specifies the local name for the Element. If a data source is available, you will have the option to choose either a Fixed or a Parameterized local name from a drop-down menu. For more information, see “Using Data Sources with Form XML: Parameterized Values”, page 800.

Attributes Tab Displays options to add and/or modify Attribute parameters of the selected node from the XML View Tree.

The following options are available in the Attributes tab of the Form XML mode:

Option

Description

Attributes

Displays a list of the current attributes of the selected node.


Remove

Click to remove the selected attribute from the Attributes list.

Add New Attribute

Click to add a new attribute to the Attributes list.

Use Data Source: Exclude with empty string

(Only available if a data source is specified in the SOAP Client tool) Allows you to control whether elements and attributes are sent depending on the values in a data source. For more information, see “Using Data Sources with Form XML: Excluding Elements and Attributes”, page 801.

Value

Specifies the content value of the Attribute. If a data source is available, you will have the option to choose either a Fixed or a Parameterized value from a drop-down menu. For more information, see “Using Data Sources with Form XML: Parameterized Values”, page 800.

Prefix

Specifies the namespace prefix for the Attribute. Depending on the namespace declarations of the selected element and its ancestors, the options in the Prefix drop-down menu will vary.

Local name

Specifies the local name for the Attribute. If a data source is available, you will have the option to choose either a Fixed or a Parameterized local name from a drop-down menu. For more information, see “Using Data Sources with Form XML: Parameterized Values”, page 800.

Namespace Declarations Tab Displays options to add and/or modify Namespace Declaration parameters of the selected node from the XML View Tree:


The following options are available in the Namespace Declarations tab of the Form XML mode:

Option

Description

Namespaces

Displays a list of the current namespace declarations of the selected node.

Remove

Click to remove the namespace from the Namespaces list.

Add New Namespace

Click to add a new namespace declaration to the Namespaces list

URI

Specifies the URI of the namespace declaration. If a data source is available, you will have the option to choose either a Fixed or a Parameterized URI from a drop-down menu. For more information, see “Using Data Sources with Form XML: Parameterized Values”, page 800.

Prefix

Specifies the prefix of the namespace declaration. If a data source is available, you will have the option to choose either a Fixed or a Parameterized Prefix from a drop-down menu. For more information, see “Using Data Sources with Form XML: Parameterized Values”, page 800.

Using Data Sources with Form XML: Parameterized Values If a data source is specified in the upper right corner of the SOAP Client tool, you will be able to use the values in that data source as parameterized values in the XML components of the Form XML mode. For example, you can use all the values in a data source column as a content value for an Element. If a data source is specified in the SOAP Client tool, you will have a choice to select either a Fixed or a Parameterized value. If a data source is not available, then only a Fixed value can be entered. •

Fixed values are the literal values that you specify by entering an input into the available text field.



Parameterized values are the values from a data source column, or from the Data Source wizard (which allows you to parameterize the current test with values from other tests). When configuring Parameterized values, a drop-down box containing column names from a data source will become available.

Using Values Stored in an Existing Data Source Column The column names in the drop-down menu to the right of the Parameterized field correspond to the data source columns specified in the SOAP Client tool. All values in the data source column that you select will be sent as the literal value by the SOAP Client tool.

Using Values from Other Tests To use a value from another test, choose Use Data Source Wizard from the box that also displays column names. See “XML Data Bank”, page 863 for more details on completing this wizard. Once a value is extracted, it will appear in the list of available columns.

Using Data Sources with Form XML: Excluding Elements and Attributes You can create a data source that controls whether or not the SOAP Client tool includes or excludes particular elements and attributes as part of the SOAP message. If the Use Data Source: Exclude with empty string check box is selected in either the Element or Attribute Form XML Configuration Tabs, the SOAP Client tool will use an empty string (a string of length 0) of the specified data source as a condition that controls whether or not to include or exclude elements and attributes as part of the SOAP message. For example, you can create a data source that contains values for users’ names and passwords. By entering empty strings that correspond to the password values, you can exclude these password values from the SOAP message. To exclude an element or attribute from a SOAP message: 1. Create a data source with columns that contain the values you would like to send as part of a SOAP message, and columns that contain the empty and non-empty strings that signify the inclusion or exclusion of an element. In the following data source example, the Value column contains values of name and password, and the Exclude Password column has a value of Don’t Exclude, as well as an empty string.


For more information on configuring data sources, see “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345. 2. Ensure that the correct data source is selected from the Data Source menu in the SOAP Client tool panel. 3. Check the Use Data Source: Exclude with empty string check box in either the Element or Attribute Form XML Configuration Tab. 4. In the drop-down box next to the Use Data Source: Exclude with empty string check box, select the appropriate column that contains the empty string.


5. In the Value drop-down boxes, specify the content value of the Element/Attribute by selecting Parameterized and the appropriate column that contains the data source values to be sent as part of the SOAP message.

When the test is run, the password value from the data source will not be sent as part of the SOAP message because it was excluded by the empty string in the Exclude Password column that was chosen from the Use Data Source: Exclude with empty string drop-down box. To send all of the parameterized values from a data source, do not select the Use Data Source: Exclude with empty string check box.
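As a concrete sketch of the example above (the values shown are hypothetical), suppose the password element's content is parameterized on the Value column and the Use Data Source: Exclude with empty string check box is bound to the Exclude Password column:

Value, Exclude Password
secret123, Don't Exclude
secret456,

With this data source, the first row sends <password>secret123</password> as usual, while the second row omits the password element entirely because its Exclude Password cell is an empty string.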


Scripted XML View Options This topic covers the Scripted XML view, which can be used to dynamically generate XML from a script. The Scripted XML View contains the following options:

Option

Description

Language

Select the scripting language in which your script is written.

File

(For JavaScript and Python) If you already have a script that dynamically generates the XML for the SOAP envelope, use this option to specify the location of that script.

Text

(For JavaScript and Python) Use this option to directly enter in the UI the script that specifies how to dynamically generate the SOAP envelope.

Class (For Java)

Use this option to specify the appropriate class in the Class field. Note that the class you choose must be on your classpath.

Method

Use this box to select the method you want to use as the entry point. If no methods are listed for a JavaScript or Python script, right-click the File or Text field (whichever one you used to specify your script), then choose Evaluate from the shortcut menu. If no methods are listed for a Java class, check that the specified class file is on your classpath. For general guidelines on adding methods, see “Extensibility (Scripting) Basics”, page 764.
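For example, the following Jython sketch (illustrative only: the element names are placeholders, and it assumes the selected entry-point method returns the SOAP envelope as a string; the context argument is assumed to be available as in the other scripting examples in this guide and is not used here) dynamically builds a simple envelope:

def generateEnvelope(context):
    # Build the request body; addItem/itemName are placeholder element names.
    body = "<addItem><itemName>widget</itemName></addItem>"
    # Wrap the body in a standard SOAP 1.1 envelope and return it as a string.
    return ("<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            "<soapenv:Header/>"
            "<soapenv:Body>" + body + "</soapenv:Body>"
            "</soapenv:Envelope>")

After entering the script, choose Evaluate from the right-click menu so that generateEnvelope appears in the Method combo box, then select it.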


Form Input View Options This topic covers the Form Input view, which allows you to enter parameters in the UI form fields. Sections include: •

General Options



Parameter Options (For RPC-Style Services)



Body Options (For Document-Style Services)



Literal Value Options (For RPC-Style and Document-Style Services)



Advanced Options



Adding SOAP Headers

General Options The Form Input View contains the following options:

Option

Description

Operation

(Automatically completed if available) Lists the available methods/ operations you can call and test. The operation list is created automatically when a valid WSDL URI is entered. You can select which operation you want to test by selecting an operation from the list.

Element

(Messaging Client only, automatically completed if available) Lists the available elements you can call and test. The element list is created automatically when a valid schema is entered. You can select which element you want to test by selecting an element from the list. This is the only option available in the Form Input View of the Messaging Client.

There are two sub-tabs beneath the Operation drop-down menu: the SOAP Body and the SOAP Header tabs. For information on the Header sub-tab, see “Adding SOAP Headers”, page 811. The remaining options in the SOAP Body tab depend on whether the service under test is an RPC-style service or a document-style service (the Body Style you choose from the Misc. tab).

Parameter Options (For RPC-Style Services) The following options are only available for RPC-Style services:

• Parameters: (Automatically completed if available) Lists the parameters available for the selected method.
• Name: (Automatically completed if available) Lists the name of the selected parameter.
• Value: Lets you choose how to specify the parameter value. Options include:
  • Fixed: Use this option if you want to enter a literal value (such as a string) by completing form fields.
  • Parameterized: Use this option to use values specified in a data source. (You must first add the appropriate data source to the test suite.)
  • Auto: Use this option to have SOAtest automatically generate parameter values. Automatic parameter generation is particularly useful when you want to assess the service’s constraints and determine what type of data will fail.
  • Script: Use this option if you want to have a script generate parameters at runtime.
  • Input: Use this option if this SOAP Client tool is chained to another SOAP Client tool and you want the result from the first tool to be used as the parameter for the second tool.
• Value Area (below the Value option): Lets you determine the parameter values.
  • If you selected Auto in the Value box, you do not need to enter anything in this area. Values will be generated automatically.
  • If you selected Fixed in the Value box, enter the literal value for your test parameters in the control that opens. The nature of the control will vary depending on the method. For more information, see “Literal Value Options (For RPC-Style and Document-Style Services)”, page 807.
  • If you selected Script in the Value box, click Edit Script, then enter the method’s location (or the method itself) in the dialog box that opens. The method should be a static Java method that returns an object (or a Java class that contains a default constructor) and that meets the signature the Web service is expecting. SOAtest will transform the method into a SOAP parameter and send it as a SOAP call when the test case is executed. For general guidelines on adding methods, see “Extensibility (Scripting) Basics”, page 764.

Body Options (For Document-Style Services) For document-style services, the Value option lets you specify inputs if you chose Document as the Body Style. Use the Value field to specify whether you want to use a literal value (i.e., a string), an automatically-generated value, or a custom method (i.e., if you want to programmatically specify the value to pass as a parameter).

If you selected Auto in the Value box, you do not need to enter anything in this area. Values will be generated automatically.




If you selected Fixed in the Value box, enter the literal value for your test parameters in the control that opens. The nature of the control will vary depending on the method. For more information, see “Literal Value Options (For RPC-Style and Document-Style Services)”, page 807.



If you selected Script in the Value box, click Edit Script, then enter the method’s location (or the method itself) in the dialog box that opens. The method should be a static Java method that returns an object (or a Java class that contains a default constructor) and that meets the signature the Web service is expecting. SOAtest will transform the method into a SOAP parameter and send it as a SOAP call when the test case is executed. For general guidelines on adding methods, see “Extensibility (Scripting) Basics”, page 764.

Literal Value Options (For RPC-Style and Document-Style Services) The SOAP Client tool can send a literal value as part of the SOAP message if Fixed was chosen in the Value box of either the Parameters options (RPC-style services) or Body options (document-style services) of the SOAP Client Panel. Depending on the method, a number of different options may be available. You can either configure options for sending simple parameters as the literal value, or you may configure options for sending nested parameters within complex types of objects (such as an array). These same options are also available in the SOAP Response mode of the Diff tool and are configured in the same way.

Configuring Simple Parameters When configuring a simple parameter to send from the SOAP Client tool, the options will vary depending on whether a data source was specified in the SOAP Client tool, and if the XML schema of the WSDL contains any nillable elements. The following options may be available when configuring values for simple parameters: •

Nill: Determines whether the nill attribute is sent by the SOAP Client tool. If this option is enabled, the nill attribute will be sent by the SOAP Client tool. If this option is not enabled, the nill attribute will not be sent.



Fixed/Parameterized Value: (Only available if the Nill check box is not selected) Determines the value to be sent by the SOAP Client tool. If a data source is specified in the SOAP Client tool, you will have a choice to select either a Fixed or a Parameterized value. If a data source is not available, then only a Fixed value can be entered. •

Fixed values are the literal values that you specify by entering an input into the available text field.



Parameterized values are the values from a data source column. When configuring Parameterized values, a drop-down box containing column names from a data source will become available. These column names correspond to the data source specified in the SOAP Client tool. All values in the data source column that you select will be sent as the literal value by the SOAP Client tool.

Configuring Nested Parameters If the WSDL specified in the SOAP Client tool contains an array of objects, the GUI beneath the Value box displays an Index column and a Value column. You will be able to nest parameter values that the SOAP Client tool will send as part of the SOAP message.

Advanced Options The following advanced options can be performed in the Form Input view: •

Schema Type Enforcement



minoccurs/maxoccurs Enforcement



Base 64 Encoding



Adding Multiple Values to an Array



Replacing Specific Elements with Data Source Values



Populating a Set of Elements from Data Source Values



Populating a Set of Elements with Automatically-Generated Values



Substituting Elements



Using Data Source Values to Determine if Optional Elements Are Sent



Using Data Source Values to Configure if Nill Attributes Are Used

Schema Type Enforcement By default, SOAtest checks to ensure that the parameter value entered conforms to its specified schema type (shown when you hover your cursor over the element). If you want to use a value that does not conform to the schema type, disable SOAtest’s schema type enforcement by right-clicking the element, then choosing Enforce Schema Type. To disable schema enforcement for child elements, right-click the parent element, then choose Ignore Schema Type of Children. If you later want to re-enable schema type enforcement, choose the appropriate right-click option.


minoccurs/maxoccurs Enforcement By default, SOAtest checks to ensure that the parameter value entered conforms to minoccurs and maxoccurs constraints. If you want to use a value that is beyond these limits, right-click that element, then choose Enforce Occurrences. If you later want to re-enable occurrence enforcement, repeat the same action.

Base 64 Encoding Base 64 encoding is not enabled by default. To use base 64 encoding, right-click the element you want encoded, then choose Base 64 Encoding from the shortcut menu. If you later want to disable base 64 encoding, repeat the same action.

Adding Multiple Values to an Array To quickly add multiple values to an array, right-click the related element, then choose Insert Multiple. In the dialog that opens, use the available controls to specify the values that you want to use.

Replacing Specific Elements with Data Source Values You can configure SOAtest to replace an entire element (or just its content) with a value from a data source or XPath when the test is run. To do this, right-click that element, then choose Replace with Data Source Value. Then, in the Replacement Settings dialog that opens, specify the replacement mode (entire element or content only) and whether you want to use a data source value or an XPath, and click Next. The content of the next wizard page allows you to specify the appropriate data source column or XPath. Complete that page, then click Finish. If you later want to stop using the data source or XPath value, right-click the element and choose Remove Replacement Setting.

Populating a Set of Elements from Data Source Values SOAtest’s Populate feature allows you to automatically fill a set of form fields using values stored in an existing data source (as opposed to specifying values manually). For example, you can create a data source with one column per element, then have SOAtest automatically add the corresponding values to all of the available elements. To populate an element’s fields using data source values, right-click that field in the Form Input view, then choose Populate. Leave Map parameter values to data source columns enabled, then specify exclusion and nillable settings if applicable. For additional details, see “Populating and Parameterizing Elements with Data Source Values”, page 364.

Populating a Set of Elements with Automatically-Generated Values SOAtest’s Populate feature can also automatically generate simple values for a set of form fields. To populate an element’s fields using simple automatically-generated values, right-click that field in the Form Input view, then choose Populate. Clear Map parameter values to data source columns, then specify exclusion and nillable settings if applicable. For additional details on the options available in the Populate wizard (Element Exclusion, Nillable Elements, Attribute Exclusion), see “Populate Wizard Options”, page 366.

Substituting Elements


If you want to replace the original element with another element defined in a schema (for example, to use a foreign-language equivalent of the original element name, or to use a different variation of the original element), you can use the Substitute Element feature. Substituted elements need to have the same type as the original—unless the original element’s type is anyType. In that case, you can replace it with any element that is a valid substitution (any element that belongs to the 'substitution group' defined by the abstract element that's being substituted/replaced). To substitute an element, right-click the original element, then choose Substitute Element from the shortcut menu. In the dialog that opens, specify the schema that contains the new element, then add the desired element to the substitution list.

Using Data Source Values to Determine if Optional Elements Are Sent (Only available if a data source is included in the test suite and the WSDL or schema specifies that minOccurs=0 for that parameter) To have values stored in a data source dictate whether a value is sent for an optional element, right-click the related tree node, then choose Use Data Source> Exclude with Empty String. When the value in the specified data source column is an empty string, the optional element will not be included in the message. If it is an actual value, that value will be sent as part of the message.

Using Data Source Values to Configure if Nill Attributes Are Used (Only available if a data source is included in the test suite and the Nill check box for the given element is not selected) To have values stored in a data source dictate whether a nill attribute or an actual value is used for various elements, right-click the related tree node, then choose Configure Nill Attribute> Use Data Source: Set Nill with Empty string [element]. When the data source has an empty string, the nill attribute will be sent as part of the message. If a value is specified, the nill attribute will not be sent; instead, the specified value will be used. For instance, assume you have the following data source:

Element Value       Nil with Empty String
value               value
value               (empty string)

Nil with Empty String takes precedence over Element Value. Consequently, whenever the Nil with Empty String column has an empty string, SOAtest sends xsi:nil="true" regardless of the value in the Element Value column.
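For example (a sketch using a hypothetical middleName element), assume the element’s value is parameterized on the Element Value column and its nil attribute is bound to the Nil with Empty String column. A row whose Nil with Empty String cell is an empty string produces

<middleName xsi:nil="true"/>

while a row with values in both columns produces

<middleName>value</middleName>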

Generating a Data Source Template for Populating Message Elements Manually creating a data source for parameterizing large, complex XML messages can be time-consuming and tedious. For a fast way to accomplish this, have SOAtest automatically generate a CSV data source template based on the structure of the request or response message that you want to parameterize. Details on how to do this are provided in “Generating a Data Source Template for Populating Message Elements”, page 353.

Adding SOAP Headers The SOAP Header sub-tab in the Form Input view allows you to specify header parameters. To add a header, perform the following from the Header sub-tab of the Form Input view: 1. Click the Add button. The Add New SOAP Header dialog displays.

2. Select Custom, WS-Security, SAML 1.1 Assertion, SAML 2.0 Assertion, WS Addressing, WS ReliableMessaging, or Import Schema Element as Header from the Available Header types and click OK. For details about the available options, see: •

WS-Security Options



WS Addressing Options



SAML 1.1 Assertion Options



SAML 2.0 Assertion Options



WS Reliable Messaging Options



Custom Options



Import Schema Element as Header Options

WS-Security Options If you selected WS-Security from the Available Header Types, the following options are available via the Timestamp, Username Tokens, BinarySecurityToken, and Options tabs in the right GUI panel:

Timestamp Tab The following options are available in the Timestamp tab:


Option

Description

Send Timestamp

Select to add a timestamp value.

Dynamically generate timestamp

Select to generate a timestamp value each time a message is generated.

wsu:Created

(Only available if Dynamically generate timestamp is not selected) Manually enter a timestamp value. This value will be the same for each message generated.

Send Expires

Select to add an expiration value.

Time Interval between Created and Expires

(Only available if Send Expires is selected) Manually enter a time interval to take place between the created and expires timestamps.

Username Tokens Tab The following options are available in the Username Tokens tab:

Option

Description

Send Username Token

Select to add a username value.

wsse: Username

Enter a username.

wsse: Password

Enter a password.

Add Nonce

Select to add a nonce value.

Dynamically generate Nonce

Select to randomly generate a nonce value each time a message is generated.

wsse:Nonce

(Only available if Dynamically generate Nonce is not selected) Manually enter a Nonce value. This value will be the same for each message generated.

Add Timestamp

Select to add a timestamp value.

Dynamically generate timestamp

Select to generate a timestamp value each time a message is generated.

wsu:Created:

(Only available if Dynamically generate timestamp is not selected) Manually enter a timestamp value. This value will be the same for each message generated.

BinarySecurityToken Tab The following options are available in the BinarySecurityToken tab:


Option

Description

Tokens

Select a BinarySecurityToken from the Tokens drop-down list.

Add

Click to add a new BinarySecurityToken. In the Add BinarySecurityToken dialog that displays, specify the wsuID of the token.

Remove

Click to remove a BinarySecurityToken.

Options Tab The following options are available in the Options tab:
• WS-Security URI: Enter or select from the drop-down menu the namespace of the Username token header. The default corresponds to the latest WS-Security spec from OASIS. Note: Selecting a WS-Security URI also changes the WS-Security Utility URI correspondingly. However, you can change the WS-Security Utility URI so that it does not correspond to the WS-Security URI.
• WS-Security Utility URI: Enter or select from the drop-down menu the utility namespace of the Username token header. The default corresponds to the selection in the WS-Security URI menu.

WS Addressing Options If you selected WS Addressing from the Available Header Types, the following four Header types are sent by default: Action, To, MessageID, and ReplyTo. The following headers are not sent by default: RelatesTo, From, FaultTo. These Header types can be configured via the Action/To, MessageID/ReplyTo, and RelatesTo/From/FaultTo tabs in the right GUI panel. Also by default, the 2004/08 namespace is used (i.e., http://schemas.xmlsoap.org/ws/2004/08/addressing).

Action/To Tab The following options are available in the Action/To tab:

Option

Description

wsa:Action

Specifies the WS Addressing Action value.

wsa:To

Specifies the WS Addressing To value.

WS-Addressing URI

Enter or select from the drop-down menu, the WS-Addressing URI.

MessageID/ReplyTo Tab The following options are available in the MessageID/ReplyTo tab:


Option

Description

wsa:MessageID

Specifies the WS Addressing MessageID. The default automatically generates a unique value.

wsa:ReplyTo

Specifies the WS Addressing ReplyTo endpoint reference to which the return message will be sent. •

wsa:Address: You can configure the wsa:Address as Fixed, Automatic, Script, Anonymous URL, or Call Back URL.



wsa:ReferenceParameters and wsa:ReferenceProperties: Allows you to modify the given parameters and properties.

RelatesTo/From/FaultTo Tab The following options are available in the RelatesTo/From/FaultTo tab:

• Send wsa:RelatesTo: Specifies the WS Addressing RelatesTo value.
• Send wsa:From: Specifies the WS Addressing From endpoint reference from which the return message is sent.
  • wsa:Address: You can configure the wsa:Address as Fixed, Automatic, Script, Anonymous URL, or Call Back URL.
  • wsa:ReferenceParameters and wsa:ReferenceProperties: Allows you to modify the given parameters and properties.
• wsa:FaultTo: Specifies the WS Addressing FaultTo endpoint reference to which the fault message will be sent.
  • wsa:Address: You can configure the wsa:Address as Fixed, Automatic, Script, Anonymous URL, or Call Back URL.
  • wsa:ReferenceParameters and wsa:ReferenceProperties: Allows you to modify the given parameters and properties.

SAML 1.1 Assertion Options If you selected SAML 1.1 Assertion from the Available Header Types, see “Adding SAML Headers”, page 815.

SAML 2.0 Assertion Options If you selected SAML 2.0 Assertion from the Available Header Types, various options will be available that correspond to the OASIS SAML Token Profile. For more information, see http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=security

WS Reliable Messaging Options


If you selected WS ReliableMessaging from the Available Header Types, the following options are available:

Option

Description

wsrm:Sequence

Specifies the wsrm:Sequence parameters such as the Identifier and MessageNumber.

wsrm:AckRequested

Specifies the wsrm:AckRequested parameters such as the Identifier and MaxMessageNumberUsed. These options are only available if the Send AckRequested checkbox is selected.

Custom Options If you selected Custom from the Available Header Types, the following options are available:

Option

Description

Views

Select the desired view from the drop-down menu and configure accordingly. Options are Literal XML, Form XML, Scripted XML, and Form Input.

Populate

Fills SOAP array and element parameters. Clicking this button also sets any element nils to false and expands them out. This button is only enabled if the test is created from a WSDL.

Import Schema Element as Header Options If you selected Import Schema Element as Header from the Available Header Types, a dialog appears from which you can load declared elements from a schema location. After loading elements, you can select multiple elements for the SOAP header. Once you click OK, a new header with the chosen element's structure will be added to the SOAP Client.

Adding SAML Headers
The Header sub-tab in the Form Input view allows you to specify header parameters. When adding SAML 1.1 headers, you can choose from four types of SAML confirmation methods:
• Sender-Vouches: Unsigned: Contains a minimal sender-vouches SAML assertion with no optional elements included in the request message creation. No signatures or certificates are required for the SAML assertion. See “Sender-Vouches: Unsigned”, page 816 for details.
• Sender-Vouches: Unsigned: SSL: Contains a minimal sender-vouches SAML assertion; a signature is not required, but the transport is over SSL, so certificates will be required in order to support SSL at the transport layer. See “Sender-Vouches: Unsigned: SSL”, page 818 for details.
• Sender-Vouches: Signed: Contains a sender-vouches SAML assertion in which both the assertion and the body elements are signed; a reference to the certificate used to verify the signature is provided in the header. See “Sender-Vouches: Signed”, page 821 for details.
• Holder-of-Key: Contains a holder-of-key SAML assertion in which an enveloped signature is placed over the Assertion element using the issuer key store; the certificate used to verify the issuer signature is contained within the assertion signature. The body is then signed with the user certificate, which is contained within the assertion’s SubjectConfirmation element. See “Holder-of-Key”, page 824 for details.

2. Select SAML 1.1 Assertion and click OK. The SAML Assertion wizard displays.


3. Select Sender-Vouches: Unsigned and click the Next button.


4. Select the desired SAML statement type and click Next.

5. Complete the necessary fields for the Authentication Statement and click Finish. For more information on Authentication Statements, see “Adding and Modifying SAML Statements”, page 826.

Sender-Vouches: Unsigned: SSL To add a Sender-Vouches: Unsigned: SSL confirmation method, perform the following from the Header sub-tab of the Form Input view: 1. Click the Add button. The Add New SOAP Header dialog displays.

2. Select SAML 1.1 Assertion and click OK. The SAML Assertion wizard displays. 3. Select Sender-Vouches: Unsigned: SSL and click the Next button.


4. Select Use Conditions, if needed. •

If Use Conditions was selected, then choose the desired Condition type or None.



If Audience Restriction was selected and a value greater than 0 was entered, then click Next to be prompted by the Audience Restriction control and fill each field in accordingly.

5. Click the Next button.


6. Select the desired SAML statement type and click Next.

7. Complete the necessary fields for the Authentication Statement and click Finish. For more information on Authentication Statements, see “Adding and Modifying SAML Statements”, page 826.


Sender-Vouches: Signed To add a Sender-Vouches: Signed confirmation method, perform the following from the Header sub-tab of the Form Input view: 1. Click the Add button. The Add New SOAP Header dialog displays.

2. Select SAML 1.1 Assertion and click OK. The SAML Assertion wizard displays.

3. Select Sender-Vouches: Signed and click the Next button.


4. Select the desired Signature Options and click the Next button.

5. Select Use Conditions, if needed. •

If Use Conditions was selected, then choose the desired Condition type or None.



If Audience Restriction was selected and a value greater than 0 was entered, then click Next to be prompted by the Audience Restriction control and fill each field in accordingly.

6. Click the Next button.


7. Select the key store to be used for the issuer signature on the assertion and the body from the Issuer Key Store drop down menu and click Next. 8. Select the desired SAML statement type and click Next.

9. Complete the necessary fields for the Authentication Statement and click Finish. For more information on Authentication Statements, see “Adding and Modifying SAML Statements”, page 826.


Holder-of-Key To add a Holder-of-Key confirmation method, perform the following from the Header sub-tab of the Form Input view: 1. Click the Add button. The Add New SOAP Header dialog displays.

2. Select SAML 1.1 Assertion and click OK. The SAML Assertion wizard displays.

3. Select Holder-of-Key and click the Next button.


4. Select Use Conditions, if needed. •

If Use Conditions was selected, then choose the desired Condition type or None.



If Audience Restriction was selected and a value greater than 0 was entered, then click Next to be prompted by the Audience Restriction control and fill each field in accordingly.

5. Click the Next button.

6. Select the key store used for the issuer enveloped signature over the Assertion element, and the key store used by the user to sign the body element, and click Next. 7. Select the desired SAML statement type and click Next.


8. Complete the necessary fields for the Authentication Statement and click Finish. For more information on Authentication Statements, see “Adding and Modifying SAML Statements”, page 826.

Adding and Modifying SAML Statements SAML Statements must be added during the creation of an assertion. The following choices are available: •

Authentication Statement



Authorization Decision Statement



Attribute Statement

SAML Statements can be added and modified after an assertion has been created in order to extend or customize the assertion. While in the SAML Header view, simply click the Add button to add a SAML Statement. To modify an existing SAML Statement, select any statement in the SAML Statements list and click the Edit button. The SAML Statement Wizard displays.


When adding or modifying a statement, be sure to fill out each of the enabled fields. These are the minimum requirements for the Statements. If desired, enable fields that are disabled by default and fill out the fields for them as well.


MapMessage Input Options The MapMessage Input Mode is only available in a Messaging Client tool when JMS is selected as the Transport type, and javax.jms.MapMessage is chosen as the MessageType. The MapMessage Input Mode allows you to Add, Modify, and Remove MapMessages by clicking the appropriate buttons. MapMessages can also be Diffed by right-clicking the Messaging Client node and selecting Create Regression Controls from the shortcut menu.


REST Client This topic explains how to configure and apply the REST Client tool, which sends messages to RESTful services. Messages can be sent with HTTP GET, POST, PUT, DELETE, HEAD, OPTIONS, or custom methods. Sections include: •

Header Options



Resources Options



Payload Tab



HTTP Options



Success Criteria Options

Header Options The upper section of the REST Client control panel lets you: •

Specify the method you want to use (get, post, put, delete, etc.). To select a custom method, choose CUSTOM then enter the name of your custom method in the Value field that displays.



Preview the actual URL that will be used. The URL is instantly updated as you change the tool’s configuration. If you are configuring the tool using environment variables, parameterized values, etc., the URL shown in the Resource field will show you how they resolve into an actual URL.

Resources Options In the Resources tab, you specify the resource to which you want to send messages. You can use the following modes to specify input: •

Form: Lets you specify the various URL components in a form. You can specify host, port, and path values by entering them in fields. You can also enter additional URL parameters via the table. •

You can specify fixed or parameterized inputs for each individual component of the URL. For instance, you can parameterize just the port, or both the port and the path. For details about parameterizing values, see “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345.



You can use the environment variables specified in the test suite’s environment’s configuration. For details about environments, see “Configuring Testing in Different Environments”, page 369.






Literal: Lets you specify the URL as a literal string, which can be either a fixed value or a single parameterized value. To parameterize individual components, use Form Input mode. •

For details about parameterizing values, see “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345.



You can use the environment variables specified in the test suite’s environment’s configuration. For details about environments, see “Configuring Testing in Different Environments”, page 369.

Scripted: Lets you set the field values using custom methods. See “Extensibility (Scripting) Basics”, page 764 for details.
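For example (an illustrative sketch: the host variable, port, and path are hypothetical, and ${...} is assumed to denote an environment variable defined in the active environment), a resource specified in Literal mode might look like:

http://${BANK_HOST}:8080/parabank/services/bank/customers/12212

The Resource field at the top of the tool panel then previews how the variable resolves into the actual URL that will be used.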

Payload Tab If the method you are using sends data (e.g., PUT, POST), the Payload tab allows you to specify the payload for the message that will be sent. From the Payload tab, you can select input modes from the Input Mode drop-down list. The REST Client tool shares Input Mode options with the SOAP Client and Message Stub tools. For more information on these shared options, see “Request/Message Body Options”, page 792.


HTTP Options The HTTP options allow you to determine which protocol (HTTP 1.0 or 1.1) is used to send the request, as well as various options related to the protocol (security, headers, cookies, etc.). Select the appropriate protocol from the Transport drop-down list, then configure its properties, which are described in the following sections: •

“Using HTTP 1.0”, page 689



“Using HTTP 1.1”, page 691

Success Criteria Options The following options are available in the Success Criteria tab of the REST Client tool:

Valid HTTP Response Codes: Allows you to enter a range of valid HTTP response codes that will allow the test to succeed. You can enter ranges as comma (or semicolon) separated values, with each value being either a single code or a range of codes separated by hyphens. For example, if you enter "302, 400-499", a 302 code or any code in the 4xx range will be accepted.

Timeout after (milliseconds): Specifies the length of delay (in milliseconds) after which SOAtest should consider your FTP, telnet, or HTTP requests to be timed out. The Default setting corresponds to the timeout set in the SOAtest Preferences panel. The Custom setting allows you to enter a timeout. A non-positive timeout value can be entered to specify an infinite timeout. •

Fail the test on timeout: Select this option if you want the test to fail on the specified timeout.



Pass the test only if a timeout occurred: Select this option to have the test pass if the specified timeout occurs (i.e. the test does not finish execution within the specified time).


webMethods This topic explains how to configure and apply the webMethods tool, which allows you to publish and send BrokerEvent objects to Software AG webMethods Broker, as well as subscribe to events and invoke native webMethods IS (Integration Server) services. Sections include: •

Working with Broker



Working with Integration Server

Working with Broker The configuration steps vary according to the execution mode you want to use (publish, subscribe, request/reply). This section covers: •

Adding Required Jar Files to the SOAtest Classpath



Using Publish Execution Mode



Using Subscribe Execution Mode



Using Request/Reply Execution Mode



Specifying Scripted Inputs

Adding Required Jar Files to the SOAtest Classpath The following jar files need to be added to the SOAtest classpath: •

wmbrokerclient.jar



g11nutils.jar

The jar files can be found under [webmethods install dir]/Broker/lib/. For more details, please refer to webMethods Broker Client Java API Programmer's Guide> Getting Started> Using the webMethods Broker Java API. To add these jar files to SOAtest’s classpath, complete the following: 1. Choose SOAtest> Preferences. 2. Open the System Properties page. 3. Click the Add JARS button and select the necessary JAR files to be added.

Using Publish Execution Mode To configure the webMethods tool to publish BrokerEvent objects: 1. Double-click the webMethods Test Case Explorer node to open up the tool configuration panel. 2. Open the Tool Settings tab. 3. Specify tool settings as follows: a. From webMethods, select Broker. b. From Execution mode, select Publish. c. In the Host, Broker Name, Client Group, and Client ID fields, specify the settings for connecting to Broker.


d. In the AppName field, enter any name that you want to use to identify the application (SOAtest). e. Click Refresh. The Event Type menu will then be populated with the available event types on the broker. f. (Optional) In Subscription Filter, specify a filter string in order to retrieve only the events with desired field values. For example, you may use expressions such as: my_String_field="some value" and my_int_value < 5

For more details on the syntax of filter strings, please refer to the WebMethods Broker Client Java API Programmer's Guide, "Using Event Filters" section. g. In Event Type, select the event type to which you want to publish. h. In the Timeout area, customize the timeout period and timeout conditions if desired. •

If the Pass the test only if timeout occurred option is selected, then the tool will fail if a Broker Event is received at the specified event type. This is typically useful for subscribing to error event types and ensuring that no error event types are published as part of a test scenario.

4. Open the Inputs tab.
5. Use the available controls to specify what BrokerEvent object you want published.
• The available options (Form XML, Literal XML, Scripted, Form Input) and the related controls are the same as the ones available for the SOAP Client tool and other messaging tools. See “Request/Message Body Options”, page 792 for details.
• If you are using Form XML, Literal XML or Form Input and want to generate a default input template, click Refresh.
• In scripted mode, you can directly manipulate the object (e.g., if you want to set the field values by scripting using the webMethods API instead of the GUI or XML representations). See “Specifying Scripted Inputs”, page 836 and “Extensibility (Scripting) Basics”, page 764 for details.
• For more information about BrokerEvent objects, please refer to the COM.activesw.api.client.BrokerEvent documentation in the webMethods Broker Client Java API Programmer’s Guide.

During test execution, SOAtest publishes the event by invoking the broker client Java API and providing it with a BrokerEvent object. Any applications that are subscribed to that event type will receive the event that SOAtest published. To view an XML representation of the events that are published, use the Traffic Viewer tool, which is automatically chained to the webMethods tool. Note that even though actual BrokerEvent objects are sent and received, the viewer uses an XML representation. This makes it easier to read and allows you to chain tools that validate or process the content (e.g., the Diff tool or XML Assertor tool).

Using Subscribe Execution Mode
To configure the webMethods tool to subscribe to BrokerEvent objects:
1. Double-click the webMethods Test Case Explorer node to open up the tool configuration panel.
2. Open the Tool Settings tab.
3. Specify tool settings as follows:
a. From webMethods, select Broker.
b. From Execution mode, select Subscribe.
c. In the Host, Broker Name, Client Group, and Client ID fields, specify the settings for connecting to Broker.
d. In the AppName field, enter any name that you want to use to identify the application (SOAtest).
e. Click Refresh. The Event Type menu will then be populated with the available event types on the broker.
f. In Event Type, select the event type to which you want to subscribe.

During test execution, SOAtest starts listening for events that are published to the specified event type. To view an XML representation of the events that are received, use the Traffic Viewer tool, which is automatically chained to the webMethods tool. Note that even though actual BrokerEvent objects are sent and received, the viewer uses an XML representation. This makes it easier to read and allows you to chain tools that validate or process the content (e.g., the Diff tool or XML Assertor tool).

Using Request/Reply Execution Mode
To configure the webMethods tool to send a BrokerEvent object and wait for a reply before sending another:
1. Double-click the webMethods Test Case Explorer node to open up the tool configuration panel.
2. Open the Tool Settings tab.
3. Specify tool settings as follows:
a. From webMethods, select Broker.
b. From Execution mode, select Request/Reply.
c. In the Host, Broker Name, Client Group, and Client ID fields, specify the settings for connecting to Broker.
d. In the AppName field, enter any name that you want to use to identify the application (SOAtest).
e. Click Refresh. The Event Type menu will then be populated with the available event types on the broker.
f. In Event Type, select the event type to which you want to send the event.

4. Open the Inputs tab.
5. Use the available controls to specify what BrokerEvent object you want sent.
• The available options (Form XML, Literal XML, Scripted, Form Input) and the related controls are the same as the ones available for the SOAP Client tool and other messaging tools. See “Request/Message Body Options”, page 792 for details.
• If you are using Form XML, Literal XML or Form Input and want to generate a default input template, click Refresh.
• In scripted mode, you can directly manipulate the object (e.g., if you want to set the field values by scripting using the webMethods API instead of the GUI or XML representations). See “Specifying Scripted Inputs”, page 836 and “Extensibility (Scripting) Basics”, page 764 for details.
• For more information about BrokerEvent objects, please refer to the COM.activesw.api.client.BrokerEvent documentation in the webMethods Broker Client Java API Programmer’s Guide.

During test execution, SOAtest sends a request by invoking the broker client Java API and providing it with a broker event object, then it waits for a reply. To view an XML representation of the request and reply events, use the Traffic Viewer tool, which is automatically chained to the webMethods tool. Note that even though actual BrokerEvent objects are sent and received, the viewer uses an XML representation. This makes it easier to read and allows you to chain tools that validate or process the content (e.g., the Diff tool or XML Assertor tool).

Specifying Scripted Inputs
When publishing to a Broker or doing a request/reply using scripted input, two arguments will be passed into your custom method. The first argument is a BrokerEvent object for the type name you specified in the tool settings. You need to populate this object with the desired input values (see the webMethods Broker Client Java API Programmer's Guide for details on how to accomplish this), then return it. The second argument is a test tool com.parasoft.api.Context object. Here is a sample Python script:

from com.parasoft.api import *
from COM.activesw.api.client import *

def configureBrokerEvent(event, context):
    event.setUCStringField("s", "something")
    event.setIntegerField("i", 1)
    event.setUCStringSeqField("sl", 0, BrokerEvent.ENTIRE_SEQUENCE, BrokerEvent.AUTO_SIZE, ["one", "two", "three"])
    return event

Working with Integration Server
This section covers:
• Adding Required Jar Files to the SOAtest Classpath
• Using SOAtest with Integration Server
• Specifying Scripted Inputs

Adding Required Jar Files to the SOAtest Classpath
The following jar file needs to be added to the SOAtest classpath:
• wm-isclient.jar
It can be found under [webmethods install dir]/common/lib/wm-isclient.jar. To add this jar file to SOAtest's classpath, complete the following:
1. Choose SOAtest> Preferences.
2. Open the System Properties page.
3. Click the Add JARS button and select the necessary JAR file.

Using SOAtest with Integration Server
To configure SOAtest to invoke webMethods IS (Integration Server) services and receive responses:
1. Double-click the webMethods Test Case Explorer node to open up the tool configuration panel.
2. Open the Tool Settings tab.
3. Specify tool settings as follows:
a. From webMethods, select Integration Server.
b. In the Host, User, and Password fields, specify the settings for connecting to Integration Server.
c. Click Refresh. The Package menu will then be populated with the packages available on Integration Server.
d. In Package, choose the package that contains the service that you want to invoke.
e. In Service, select the service that you want to invoke.

4. Open the Input tab.
5. Use the available controls to specify what iData object will be used to invoke that service.
• The available options (Form XML, Literal XML, Scripted, Form Input) and the related controls are the same as the ones available for the SOAP Client tool and other messaging tools. See “Request/Message Body Options”, page 792 for details.
• If you are using Form XML, Literal XML or Form Input and want to generate a default input template, click Refresh.
• In scripted mode, you can directly manipulate the object (e.g., if you want to set the field values by scripting using the webMethods API instead of the GUI or XML representations). See “Specifying Scripted Inputs”, page 840 and “Extensibility (Scripting) Basics”, page 764 for details.
• For more information about iData objects, please refer to the com.wm.data.iData documentation in the webMethods Integration Server Developer Guide.

During test execution, SOAtest sends iData objects to invoke Integration Server services and receives responses. To view an XML representation of the objects that are sent and received, use the Traffic Viewer tool, which is automatically chained to the webMethods tool. This XML representation uses the iData XML coder format. Note that even though actual iData objects are sent and received, the viewer uses an XML representation. This makes it easier to read and allows you to chain tools that validate or process the content (e.g., the Diff tool or XML Assertor tool).


Specifying Scripted Inputs
When using an Integration Server service with scripted input, your custom method will be passed two arguments. The first is an IData object that you need to populate with the desired data and then return. The second argument is a com.parasoft.api.Context object. Here is a sample Python script:

from com.wm.app.b2b.client import *
from com.wm.data import *

def buildIData(idata, context):
    IDataUtil.put(idata.getCursor(), "in1", "12")
    IDataUtil.put(idata.getCursor(), "in2", "13")
    return idata


EJB Client
This topic explains how to configure and apply the EJB Client tool that invokes methods of EJB remote objects deployed on a J2EE application server. Sections include:
• Understanding the EJB Client Tool
• Configuring the EJB Client Tool
• Using Regression Controls
• Testing EJBs deployed on IBM WebSphere Application Server (WAS)
• Using XML Encoder/Decoder with the EJB Client

Understanding the EJB Client Tool
The EJB Client obtains an EJB Remote Object via a JNDI query or from the Object Data Bank. It can then be used to invoke methods on that object. The return values of those methods can then be passed to a chained tool, such as a Diff or an Object Data Bank. In order to invoke an EJB Client tool on a certain method of an EJB remote object, you must tell the tool the source of the remote object. An EJB remote object can either be obtained from a remote directory via a JNDI query, or from SOAtest’s local Object Data Bank. In the latter case, the EJB remote object should be put into the Object Data Bank as a return value from a previous EJB Client tool invocation.

Configuring the EJB Client Tool
The options of the EJB Client tool will vary depending on where the EJB remote object is obtained from (from a JNDI Query or from an Object Data Bank).

Obtaining EJB Remote Object from JNDI Query
In order to obtain an EJB Remote Object from a JNDI Query, select the JNDI Query radio button from the EJB Remote Object Source area and configure the following parameters:




• JNDI Properties: Allows you to configure the properties for the remote directory from a JNDI Query. The following options are available:
• Initial Context Factory: Specifies the context factory, such as org.jnp.interfaces.NamingContextFactory.
• Provider URL: Specifies the location of the JNDI query, such as cheetah.parasoft.com. Also specify the User and Password.
• Object Name: Specifies the name of the object in the JNDI directory, such as ejb/CartHomeRemote. This object should be an instance of the class that you enter in the Class field of the EJB Remote Object panel. You can find the JNDI names of the EJB objects bound to the JNDI directory in your J2EE server configuration.
Note: Make sure the Initial Context Factory class is included in SOAtest’s classpath. For more information on adding class files to SOAtest, see “System Properties Settings”, page 758.

• EJB Remote Object: Allows you to configure the properties for invoking a remote method for the EJB Remote Object. The following options are available:
• Class: Specifies the class of the Object that is expected to be returned from either the JNDI query or the Object Data Bank. For instance: com.parasoft.soatest.bookstore.cart.CartRemote. If the class is in SOAtest's classpath, the Method selection box will be automatically populated with public methods of this class.
• Method: Specifies the method you would like to invoke. If the class specified in the Class field is found on SOAtest’s classpath, the Method combo box will be automatically filled with available methods.
• Method Arguments: If the selected method takes arguments, the Method Arguments sub panel will prompt you to specify the Input Type and Parameter Input for each of the arguments of the remote method.






• For Java primitive types, as well as java.lang.String and java.util.Date, the following input types are available:
• Literal: Specifies the input value as a string. For example, if the parameter type is the Java primitive int, valid inputs include 10 or 12345; for the Java float type, 7.62 or 105.3.
• Parameterized: This input will be available if there is at least one Data Source “visible” to this test. Select the desired data source column as input.
• Scripted: Specifies the parameter as a return value of a Python, JavaScript, or Java method (a sketch follows this list).
• For the rest of the Java types, the following input types are available:
• Interpreted: Allows use of tabular input (CSV, Excel, etc.) to instantiate a Java Object that will be used as a remote method parameter. For more information, see “Using Interpreted Data Sources”, page 361.
• Scripted: Specifies the parameter as a return value of a Python, JavaScript, or Java method.
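The following is a minimal sketch of a scripted parameter, assuming a Jython method that simply returns the value to use for the argument; the method name and the optional context argument shown here are illustrative assumptions, not taken from this guide:

def getQuantity(context):
    # context is assumed to be the com.parasoft.api.Context that SOAtest passes to extension methods
    # return the literal value to use for an int remote-method argument
    return 5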

Obtaining An EJB Remote Object from an Object Data Bank
Before you can obtain an EJB Remote Object from an Object Data Bank (so it can be used in subsequent tests), you must first store a return value in an Object Data Bank. To do this:
1. Right-click the EJB Client Tool node and select Add Output> Object Output> New Output> Object Data Bank.
2. In the Object Data Bank control panel, specify a unique Column Name by which this value can be later queried.
Once a value is stored in an Object Data Bank, select the Object Data Bank radio button from the EJB Remote Object Source sub panel of the EJB Client tool and configure the following parameters:




• Object Data Bank Properties: Allows you to configure the properties for the remote directory from an Object Data Bank. The following option is available:
• Column Name: Select the appropriate column name for the value stored in the Object Data Bank.
• EJB Remote Object: Allows you to configure the properties for invoking a remote method for the EJB Remote Object. The following options are available:
• Class: Specifies the class of the Object that is expected to be returned from either the JNDI query or the Object Data Bank. For instance: com.parasoft.soatest.bookstore.cart.CartRemote. If the class is in SOAtest's classpath, the Method selection box will be automatically populated with public methods of this class.
• Method: Specifies the method you would like to invoke. If the class specified in the Class field is found on SOAtest’s classpath, the Method combo box will be automatically filled with available methods.
• Method Arguments: If the selected method takes arguments, the Method Arguments sub panel will prompt you to specify the Input Type and Parameter Input for each of the arguments of the remote method.
• For Java primitive types, as well as java.lang.String and java.util.Date, the following input types are available:
• Literal: Specifies the input value as a string. For example, if the parameter type is the Java primitive int, valid inputs include 10 or 12345; for the Java float type, 7.62 or 105.3.
• Parameterized: This input will be available if there is at least one Data Source “visible” to this test. Select the desired data source column as input.
• Scripted: Specifies the parameter as a return value of a Python, JavaScript, or Java method.
• For the rest of the Java types, the following input types are available:
• Interpreted: Allows use of tabular input (CSV, Excel, etc.) to instantiate a Java Object that will be used as a remote method parameter.
• Scripted: Specifies the parameter as a return value of a Python, JavaScript, or Java method.

Viewing the Data Bank Variables Used During Test Execution
You can configure the Console view (Window> Show View> Console) to display the data bank variables used during test execution. For details, see “Console view”, page 35.

Using Regression Controls
To create or update regression controls for a set of EJB Client tool tests, right-click the test suite encompassing the tests and select Update or Create Regression Controls from the shortcut menu. Select Multiple Controls if you are using data sources.

Testing EJBs deployed on IBM WebSphere Application Server (WAS)
To test an EJB deployed on the WebSphere Application Server (WAS) using the Sun JRE shipped with SOAtest, some special configuration is necessary. All other configuration settings explained in this document apply. To configure SOAtest, complete the following:
1. Use the Sun Initial Context Factory (com.sun.jndi.cosnaming.CNCtxFactory): This setting should be configured in the EJB Client's JNDI properties. This setting requires that Sun's j2ee.jar is on the SOAtest classpath. This .jar file is available from Sun's page at http://java.sun.com/products/jndi/downloads/index.html. For more information on adding class files to SOAtest, see “System Properties Settings”, page 758.
2. In the EJB Client's JNDI properties, use a provider URL in the form of corbaloc:iiop:WAS:2809 where WAS is the location of your WebSphere Application Server (for example, corbaloc:iiop:bison:2809).
3. In the EJB Client's JNDI properties, specify the Object Name in the following format: cell/nodes/[node]/servers/[server]/YOUR_EJB_OBJECT_NAME (for example, cell/nodes/bisonNode01/servers/server1/ejb/BasicCalculator).

Using XML Encoder/Decoder with the EJB Client
The XML Encoder and XML Decoder are complementary tools which allow you to transfer a Java Bean object graph into an XML representation (XML Encoder) or transfer a compatible XML output to a Java Bean (XML Decoder). The XML Encoder and XML Decoder tools can be particularly useful when creating EJB Tests. An XML Encoder can be used to transform the Java Bean Object Output of an EJB Tool to an XML format. This output can be further manipulated by other SOAtest tools, such as XML Transformer, to extract or modify portions of the XML document that are of interest to you. The modified XML output can then be transformed back to Java Bean format with the help of the XML Decoder. The result can be chained to the Object Data Bank—thus making it possible for the subsequent tests to use the modified Java Bean object as a Tool input. Please keep in mind that the EJB Tool conveniently allows you to add chained Tools to its XML or Object outputs. The XML Output of the EJB Tool is an Object Output transformed by an XML Encoder.


Call Back
This topic explains how to configure and apply the Call Back tool that supports asynchronous HTTP testing by listening for the asynchronous server response. Sections include:
• Understanding the Call Back Tool
• Configuring the Call Back Tool
• Configuring TIBCO options for the Call Back Tool
• Configuring JMS options for the Call Back Tool

Understanding the Call Back Tool
SOAtest supports asynchronous HTTP testing, including Parlay, Parlay-X, SCP (SOAP Conversation Protocol), WS-Addressing, custom protocols, JMS, TIBCO, WebSphere MQ and SonicMQ. After you configure the Call Back tool, SOAtest sets up a server to manage the Call Back messages. It uses the keys you specify in the tool configuration to recognize incoming messages.

Configuring the Call Back Tool
To configure the Call Back tool, complete the following:
1. Ensure that the local stub server is running. The server can be started through the Stub Server view. If the Stub Server view is not visible in your workspace, select SOAtest> Show View> Stub Server.
• If the Server node does not have a green ball, right-click that node and choose Start Server.
2. Ensure that there is a SOAP invocation from a SOAP Client tool.
3. Select the main test suite node and click the Add test or output button. The Add Test wizard displays.
4. Select Standard Test> Call Back Tool from the Add Test wizard, and then click the Finish button. A Call Back Tool test node displays in the test suite.
5. Double-click the Call Back Tool test node. The following options display in the Call Back tool workspace:




Data Source: (Only available if a Data Source was added to the test suite) Select the data source that has the desired parameters to be used as call back values.



Call Back URL: Displays the server location of the Call Back tool. For HTTP protocols only.



Timeout (sec): Specifies the length of delay (in seconds) after which SOAtest should consider your requests to be timed out.



Protocol: Specifies the asynchronous protocol to be used.





Parlay: For the Parlay protocol.



Parlay X: For the Parlay X protocol.



SCP: For the SCP protocol.



WS-Addressing: For the WS-Addressing protocol.



Custom: For a custom protocol. Specify the desired protocol using a custom XPath and value.



JMS: For the JMS protocol. If JMS is selected as the protocol, additional options will be made available. For more information, see “Configuring JMS options for the Call Back Tool”, page 850



TIBCO: For the TIBCO protocol. For more information, see “Configuring TIBCO options for the Call Back Tool”, page 849.



WebSphere MQ: For the WebSphere MQ protocol.



SonicMQ: For the SonicMQ protocol.



No correlation: Select this to use the first message in http://localhost:9080/servlet/NoCorrelationCallBackHandler. If this "protocol" is selected and there is no message at that location, the test will fail once it times out.

Key Set: Displays Keys and Values used in the Call Back message. These options are not available if JMS was selected as the Protocol. •

Key: The Keys will vary depending on the Protocol selected.




For Parlay, the keys used to determine a unique Call Back message are sessionID and requestNumber. For more information on Parlay specifications/API, see http://www.parlay.org.



For Parlay X, the key used to determine a unique Call Back message is correlator.



For SCP, the key used to determine a unique Call Back message is conversationID. For more information on SCP specifications/API, see http://dev2dev.bea.com/technologies/webservices/SOAPConversation.jsp.



For WS-Addressing, the key used to determine a unique Call Back message is MessageID. For more information on WS-Addressing specifications, see http://www-106.ibm.com/developerworks/library/specification/ws-add/



XPath: (Only available for Custom) Double-click to enter an XPath.



Value: (Not available for JMS) Double-click to enter either a Fixed value or a Parameterized value (if a Data Source is available).

6. Right-click the Call Back Tool test node and select Add Output. The Add Output wizard displays.
7. Select the desired tool from the Add Output wizard to create an output for the Call Back message from the server. For example, you can select the Diff tool to create a regression control on the Call Back message from the server.

Configuring TIBCO options for the Call Back Tool
If TIBCO is selected as the Protocol for the Call Back tool, a TIBCO Properties panel displays at the bottom of the tool configuration panel. SOAtest can listen on a local TIBCO daemon or a remote one. That is, the bus daemon could be running on the local machine or somewhere else. The following options are available:
• The Timeout field represents the amount of time SOAtest should block (wait) until a message becomes available on the specified Reply Subject value.



Daemon: Specifies the server name or the server’s IP followed by a colon (:) and the port number (e.g. 10.10.32.34:7500 or host_name:7500).



Network: Specifies the network where the transport objects are located. The network parameter consists of up to three parts separated by a semicolon (;) in the form of network; multicast groups; send address (e.g. lan0; 224.1.1.1; 244.1.1.6). For more information, please refer to the Network Selection section of the TIBCO Rendezvous documentation.



Service: Specifies TIBCO’s service name.

TIBCO messages, when generated and accessed via one of the programming language APIs, can put content under named fields. Content (SOAP messages, XML, text, etc.) must be placed under a TIBCO message "field". To retrieve desired content from the message, the Reply Field Name must be provided. The field name is determined by the application that generates a TIBCO message, so to determine what that field is, one would need to know the field name that is used by the application (if unknown, the source code of the application that sends the TIBCO message should include the field name). The Message Delivery field indicates what type of messages the Call Back tool should look for on the bus. This should correspond to the delivery type established by the message sender.


Additional information on the Field Name can be found in the TIBCO Rendezvous docs under TIBCO Rendezvous Concepts> Fundamentals> Messages and Data> Fields. For more information on the Reply subject field and its meaning in TIBCO, refer to TIBCO docs under TIBCO Rendezvous Concepts> Fundamentals> Names and Subject-Based Addressing.

Configuring JMS options for the Call Back Tool
If JMS is selected as the Protocol for the Call Back tool, a JMS Properties panel displays at the bottom of the Call Back tool workspace. The following options are available:

Connection Settings
Connection Settings contains Settings and Properties tabs for JNDI Initial Context. The Properties tab is optional and allows you to specify additional properties to be passed to the JNDI javax.naming.InitialContext constructor (in addition to the Provider URL and Initial Context factory properties that are specified in the Settings tab). Property values, which can be added by clicking Add and completing the Add JMS Property dialog, can be set to a fixed value, a parameterized value, a scripted value, or a unique value (an automatically-generated random unique value—no two test invocations will use the same value). The Settings tab contains the following:

If you created a Shared Property for JMS Connections, a drop-down menu will be available from which you can choose Use Local Settings or Use Shared Property. •

If you select Use Shared Property, a second drop-down menu displays from which you select the desired global JMS settings that the SOAP Client tool will use. For more information on global JMS settings, see “Global JMS Connection Properties”, page 337.



If you select Use Local Settings, or if no shared property is specified, you can configure the rest of the options for Connection Settings.



Provider URL: Specifies the value of the property named javax.naming.Context.PROVIDER_URL passed to the JNDI javax.naming.InitialContext constructor.



Initial Context: Specifies a fully qualified class name string, passed to the JNDI javax.naming.InitialContext constructor as a string value for the property named javax.naming.Context.INITIAL_CONTEXT_FACTORY.



Connection Factory: Passed to the lookup() method in javax.naming.InitialContext to create a javax.jms.QueueConnectionFactory or a javax.jms.TopicConnectionFactory instance.

In addition to the Settings tab, the Connection Settings also include: •

Queue or Topic Connection Authentication: Allows you to provide a username and password to create a queue connection. Select the Perform Authentication checkbox and enter the Username and Password to authenticate the request. If the correct username and password are not used, the request will not be authenticated. The username and password provided here is passed to the createQueueConnection() method in the javax.jms.QueueConnectionFactory class in order to get an instance of javax.jms.QueueConnection.

Queue/Topic
The Queue/Topic settings contain the following options:
• JMS Address: Specifies the queue name (if point to point is used) or topic name (if publish and subscribe is used) to which the message will be sent.


Messaging Model
Messaging Model options specify how messages are sent between applications. Select either Point to Point or Publish and Subscribe.

Incoming Message Correlation
The Incoming Message Correlation settings contain the following options:
• When Match incoming JMSCorrelationID with is selected, the term JMSCorrelationID = '[correlationId]' will be appended to the selector expression, where correlationId is retrieved from the JMSCorrelationID property in the Message Properties section. The option becomes enabled only if such a property is added to the Message Properties section. Effectively, this results in the test blocking until a message with the specified correlation ID becomes available in the queue (or topic); it will only retrieve that particular message, rather than retrieving any message in the queue (or topic). The test will time out after the timeout amount elapses if there is no message that matches the selector criteria.
• The criteria can be specified with a fixed value, a parameterized value (from a data source), or a scripted value.
• Additional Selector Expression Terms: (Optional) Enter a value to act as a message filter. For tips on specifying a selector, see “Using Message Selector Filters”, page 704.
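As an illustration only (the correlation ID value and the extra term below are hypothetical, and the exact way the terms are combined is not spelled out in this guide), if the JMSCorrelationID message property is set to ORDER-123 and the additional selector term is JMSPriority > 4, the resulting JMS message selector would be along the lines of: JMSCorrelationID = 'ORDER-123' AND JMSPriority > 4.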


UDDI Query
This topic explains how to configure and apply the UDDI Query tool that allows you to query UDDI registries for verification and validation. Sections include:
• Understanding the UDDI Query Tool
• Configuring the UDDI Query Tool
• Usage Notes

Understanding the UDDI Query Tool
The UDDI Query tool allows you to send inquiries to a UDDI registry for verification and validation. Queries may be done by keyword or with literal XML queries as per the UDDI specification. This tool sends an inquiry to the UDDI registry you specify, and the UDDI XML response from the service is returned.

Configuring the UDDI Query Tool
To configure the UDDI Query tool, complete the following:
1. Select the main test suite node and click the Add test or output button. The Add Test wizard displays.
2. Select Standard Test> UDDI Query from the Add Test wizard, and then click the Finish button. A UDDI Query test node displays in the test suite.
3. Double-click the UDDI Query test node. The following options display in the UDDI Query tool workspace:



• UDDI Inquiry Endpoint: Specifies the UDDI endpoint to which the inquiry is sent.
• UDDI Inquiry Type: Specifies the UDDI endpoint type: name query, key lookup, or XML query.
• Type: Allows you to specify Business, Service, or TModel types.
• Name: Specifies the search keyword.
• Maximum Results: Limits the number of results from the query.
• Sort by: Sorts the results of the query according to ascending/descending name or date.

Usage Notes
To view the XML response returned from the UDDI registry, you can add an Edit tool or Browse tool to the UDDI Query tool by clicking the Add Test/Add Output button and adding the Edit or Browse tool through the Add Output wizard. After running the test, the XML will appear in the chained output tool. The UDDI Query tool has two types of output, Query Results and WSDL URI:
• Query Results: If a tool is chained to this output, the tool will receive the XML response from the UDDI registry. For example, an XML Data Bank tool chained to this output could extract and store an endpoint used to invoke a service in a later test.
• WSDL URI: This output will drill down through the query results and find any WSDLs associated with the results. This can be useful to enforce policies and validation requirements on all (or a subset) of WSDLs contained in the registry.


Transmit Tool
This topic explains how to configure and apply the Transmit tool that lets you transmit and receive over a socket connection. Sections include:
• Understanding Transmit/Listen
• Transmitting
• Listening
• Output Options

Understanding Transmit/Listen
The Transmit tool lets you transmit and receive over a socket connection. It is used for communicating directly with a specific port of a host. For example, you can use it to send a direct request to the server, then verify the response. If you are expecting something sent to a specific port on your machine, you can use this tool to listen at a particular port. You can use this tool to perform a lower level of testing than the SOAP tool allows because it looks at the socket connection, which is handled by HTTP and sits one layer below the SOAP call. It is also useful if you have applications communicating through XML over a socket connection; it can be used to perform functional and regression testing of components that communicate using socket-based protocols.
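To make the socket-level exchange concrete, here is a minimal sketch (standard Python, outside of SOAtest, with a hypothetical host, port, and payload) of the kind of raw transmit-and-receive interaction the tool automates, reading the response until the other side closes the connection:

import socket

HOST, PORT = "example.com", 9000        # hypothetical endpoint
payload = "<request>ping</request>"     # hypothetical XML payload

sock = socket.create_connection((HOST, PORT), timeout=30)
try:
    sock.sendall(payload.encode("utf-8"))
    chunks = []
    while True:
        data = sock.recv(4096)
        if not data:                    # the other side closed the connection
            break
        chunks.append(data)
finally:
    sock.close()
print(b"".join(chunks).decode("utf-8", "replace"))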

Transmitting
This tool can be used as part of a test suite, or it can be applied to a file or set of files available in the File tree. If you are using this tool as a test output, you need to specify the following information:
• The host machine.
• The port number that you want to use.
• Whether or not you want it to wait for a response until the other side closes the connection.
You can specify this information in the Transmit Parameters panel. To open this panel, double-click the Transmit node in your Test Suite tree.
If you are using this tool from the tool bar or as a test suite test tool, you need to specify the following information:
• The file or text it should transmit.
• The host machine.
• The port number that you want to use.
• Whether or not you want it to wait for a response until the other side closes the connection.
You can specify this information in the Transmit Parameters panel. To open this panel, double-click the Transmit node in your Test Suite tree. To learn about special options available for this tool’s output, see the Output Options section below.


Listening
To listen using the Transmit/Listen tool:
1. Right-click the Transmit Test Suite tree node.
2. Choose either Start Listening> Single Session or Start Listening> Sequential Sessions from the shortcut menu that opens.
• In Single Session mode, the tool will listen for a client to open a connection to the specified port, receive the client's transmission after the connection has been opened, then stop listening to the port after the client closes the connection.
• In Sequential Session mode, the tool will resume listening (waiting for a new connection) after the current connection has been closed by the transmitting side; this allows you to receive transmissions from multiple clients, or multiple transmissions of a single client.
Note: You do not need a host specified in the Transmit parameters panel if you want to listen; the host parameter is only required if you want to transmit. If you stop listening but do not close the connection to the host, the tool will ignore the incoming data from the connection, but you will still be able to send data over the connection.
To stop listening:
1. Right-click the Transmit Test Suite tree node.
2. Choose Stop Listening from the shortcut menu that opens.
When the connection is closed, you cannot send or receive any data. To close the connection to the host:
1. Right-click the Transmit Test Suite tree node or tool bar button.
2. Choose Close Connection from the shortcut menu that opens.
For details on special options for this tool’s output, see the Output Options section below.

Output Options
The Transmit tool contains the following output options:
• stdout
• stderr
• Result Window
• Packetize by New Lines
• Packetize by Documents
• File
For additional information on these output options, see “Adding Test Outputs”, page 333.


ISO 8583
This topic covers the ISO 8583 tool, which allows for sending and receiving ISO 8583 messages over a variety of channels and message packaging configurations. Sections include:
• Understanding the ISO 8583 Tool
• Configuring the ISO 8583 Tool
• Viewing Traffic
• Extracting Values for Reuse Across Tests and Creating Regression Controls
• Scripting Header and ISO 8583 Message Fields

Understanding the ISO 8583 Tool
ISO 8583 is a standard for systems that exchange electronic financial transactions made by cardholders using payment cards. However, certain challenges arise when testing the ISO 8583 standard:
• Consistency: There are many variations and local implementations of ISO 8583.
• Maintainability: Existing business-critical, payment-related systems are evolving and need to be maintained and tested.

Parasoft SOAtest provides a comprehensive testing framework that alleviates these challenges and allows you to establish a consistent, modern way to manage quality for ISO 8583-based systems, making the standard a part of the overall SOA/IT quality governance initiatives. You can also leverage SOAtest's productivity framework (such as data sources, test suites, rich data validation, etc.), and build a continuous regression testing infrastructure to evolve your electronic payment systems safely. The Parasoft SOAtest ISO 8583 tool provides an easy-to-use GUI for working with an otherwise obscure, binary message format. You can use the ISO 8583 tool to:
• Simulate clients (e.g., card acquirers) for sending and receiving ISO 8583 messages
• Configure messages using a visual interface for message field configuration
• Provide customizable message headers
• Create regression tests for ISO 8583-based systems with rich and flexible data validation
• Chain ISO 8583 tests into multi-step scenarios to exercise complex financial transactions
• Easily configure, send and receive any structured, fixed length binary messages that are not necessarily based on ISO 8583

Configuring the ISO 8583 Tool
Tool options can be configured in the ISO 8583 tool configuration panel, which can be accessed by double-clicking the tool node in the Test Case Explorer.

Tool Settings tab
In the Tool Settings tab, you can configure the following basic tool settings:

• Name: Specifies the name of the tool.
• Host and Port: Specifies the target system to which the messages are sent.
• Keep Alive: When selected, will leave the TCP connection opened by the current test so it can be reused by subsequent ISO 8583 Tool tests configured with the same host and port. For example, the subsequent test would reuse the same socket connection. If Keep Alive is not selected, then the subsequent test will open a new TCP connection (new socket).

• Message Fields: Specifies ISO 8583 message field packaging. The available options are based on configurations provided by jPOS (www.jpos.org). Each configuration establishes the field IDs that are allowed in the message, and what data types and lengths they should have.
• Selecting Custom from the drop-down menu will enable you to browse to a packager file that is designed to describe your specific message configuration. Custom packagers are jPOS-based XML files. For examples of such packager files, please refer to www.jpos.org. At the time of this writing, various XML packager configurations can be found at http://jpos.svn.sourceforge.net/viewvc/jpos/trunk/jpos6/modules/jpos/cfg/packager/.

• Transport Channel: Determines the protocol configuration of how messages are sent and received. These options influence how ISO 8583 message headers and trailers are handled, how overall message length values are sent and received, and possibly other factors. There are several channel options available to choose from based on channels that are available in jPOS (www.jpos.org) and other common configurations.
• Selecting Custom from the drop-down menu allows you to provide your own channel implementation. Selecting Custom enables the text field next to the drop-down menu, where a fully qualified class name can be specified. A custom channel class needs to implement the interface org.jpos.ISOChannel or extend org.jpos.BaseChannel. Examples of channel implementations can be found at http://jpos.svn.sourceforge.net/viewvc/jpos/trunk/jpos6/modules/jpos/src/org/jpos/iso/channel/.
• Once such a custom channel class is written, it needs to be added to the SOAtest classpath. For more information on how to add jars to the SOAtest classpath, see “System Properties Settings”, page 758.
• Scripting Hook: Allows for custom manipulation of the ISO 8583 message before it is sent. Methods used in the scripting hook expect three parameters in the following order: ISOMsg, ISOChannel and Context. Scripting hook scripts execute after the ISOMsg object has been initialized with the content provided in the Input area of the ISO 8583 tool, and before the message is sent. For Javadocs and related jPOS classes, please refer to http://jpos.org/doc/javadoc/index.html. A sketch is shown below.
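For illustration only (the method name and the fields adjusted here are assumptions, not taken from this guide), a Jython scripting hook along these lines could modify the outgoing message in place:

def beforeSend(isoMsg, channel, context):
    # isoMsg: the jPOS ISOMsg about to be sent
    # channel: the jPOS ISOChannel the message will be sent over
    # context: the com.parasoft.api.Context for the current test
    # Hypothetical adjustments: field 7 (transmission date/time) and field 11 (system trace audit number)
    isoMsg.set(7, "0701125959")
    isoMsg.set(11, "000123")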

Input Type tab
In the Input Type tab, you can configure the ISO 8583 message (in the ISO 8583 Message sub-tab), as well as any optional headers (in the Headers tab).

ISO 8583 Message Sub-Tab
The following options can be configured in the ISO 8583 Message tab:

• Input Mode: The ISO 8583 Form view is an alternative (and the preferred) GUI for configuring the request messages (Literal XML and Form XML views are available as well). In the ISO 8583 Form view, new fields can be added with the Add button. Select one or more fields (hold CTRL while selecting multiple fields) and click Remove in order to remove the selected fields. When a single field is selected, it can be dragged as well. The order of fields when declared in any of the ISO 8583 Message views does not influence how the fields are actually sent. Fields are always sent in ascending order based on the numeric field ID. This is consistent with the ISO 8583 message specification. However, order can be a factor when scripting is used, in terms of what values are initialized first. For more details, please see “Scripting Header and ISO 8583 Message Fields”, page 859. The following options are available in the ISO 8583 Form view:





Field ID: Enter a numeric value. Only values that have been declared by the message field configuration (or packager file) are allowed. However, it is possible to declare more fields in the packager than the ones actually used.



Value: When clicking on a specific field value area, you can select between Fixed or Script. Parameterized is also an available option when a data source is visible from the test.



Type: Allows the choice between String and Binary. When it is set as Binary, it indicates that the value contains the literal hexadecimal representation of the content where every two hex digits represent a single byte. String indicates that the value should be interpreted based on the type specified in the message packager for that particular field (the Message Field Configuration).



Description: (Optional) Allows you to specify a description of the message for clarity. This option does not affect the sent message in any way.



Use Data Source: Exclude with empty string: (Only visible when a data source is available). When the check box in this cell is selected, you may map the value to a data source column. As the test executes and iterates over the data source rows, it will either include or exclude the specified ISO field based on whether the data source row at that column is empty. Whenever the data source value is empty, the field will be excluded. This feature is very useful in creating regression tests that iterate over many test cases using a data source, and where some test cases have certain ISO fields while others don't. So this effectively makes the inclusion of fields dynamic based on the data source.

Literal XML and Form XML views allow for ISO 8583 messages to be provided in an XML format. The actual message on the wire is sent in a binary format (not XML) unless the jPOS XML Channel and XML packager are selected. The XML representation of the message is defined as follows:

<isomsg>
  <header>{Hexadecimal representation of binary content}</header>
  <field id="{INTEGER}" value="{STRING}" [type="binary"]/>+
</isomsg>

So an example message can look similar to:

<isomsg>
  <header>16380c18601860a186b01868fff486e0bb21</header>
  <field id="0" value="0100"/>
  <field id="2" value="5048993400009931"/>
  <field id="3" value="031000"/>
  ...
  <field id="128" value="0D0F030D040C0602" type="binary"/>
</isomsg>

The type attribute is optional; when its value is set to "binary", it indicates that the value attribute contains the literal hexadecimal representation of the content, whereby every two hex digits represent a single byte. Omitting the type attribute results in the value being interpreted based on the type specified in the message packager for that particular field (the Message Field Configuration). Form XML allows for data source parameterization when a data source is present.

Header Sub-Tab
The Header tab can be used to configure a custom binary header for the ISO 8583 message. Header fields can be rearranged (with drag and drop), and new fields can be added and removed with the corresponding buttons. Select multiple fields while holding CTRL in order to remove several fields at once. The overall size shown next to the buttons is the sum of all field sizes. The following options can be configured in the Headers tab:

Name: Specifies the name of the header.



Value: When clicking on a specific field value area, you can select between Fixed or Script. Parameterized is also an available option when a data source is visible from the test.



Type: Specifies the data type for the header field. The available types influence how the content in the value cell is to be sent over the wire. The Binary (hex) option indicates that the value cell has a hex representation of the raw binary content and should be sent as is. The hex representation assumes two hexadecimal digits for every byte.



Size: Indicates the size of the field.



Exclusion: (Only visible when a data source is available). When the check box in this cell is selected, you may map the value to a data source column. As the test executes and iterates over the data source rows, it will either include or exclude the specified ISO field based on whether the data source row at that column is empty. Whenever the data source value is empty, the field will be excluded. This feature is very useful in creating regression tests that iterate over many test cases using a data source, and where some test cases have certain ISO fields while others don't. So this effectively makes the inclusion of fields dynamic based on the data source.

Viewing Traffic
When an ISO 8583 test is executed, the traffic viewer will display an XML representation of the messages sent and received. This representation is intended for making analysis easier, but does not reflect the actual byte stream on the wire. In order to view the raw bytes, select Show Row Traffic from the Traffic Viewer and you will see a hexadecimal dump of the message exactly as it was captured at the socket level.

Extracting Values for Reuse Across Tests and Creating Regression Controls
Once ISO 8583 messaging scenarios are set up, regression controls or various value validation features can be applied to the response messages. To apply a regression control, complete the following:
• Right-click the ISO 8583 test and select Create/Update Regression Control from the shortcut menu. This results in a Diff tool being attached and capturing the ISO 8583 response in XML representation. You can also use the Add Output/Add Test tool bar button to chain other XML-based validation tools, or XML Data Bank to extract a value and use it in a subsequent test.

Scripting Header and ISO 8583 Message Fields
When Script is selected in a value cell, you may write Java, Jython or JavaScript code to generate the field value. Examples of such usage are encrypting content or generating MAC values. The script methods will accept either zero or a single argument. When a single argument is declared, it references the ISOMsg object representation of the current request ISO 8583 message. The ISOMsg object will have its fields initialized up to the current field. For example, if you are scripting the tenth field, then the ISOMsg object will have all prior 9 fields (if any) set with those field values. This is where rearranging ISO fields can make a difference, despite the fact that rearranging order does not influence the actual field order of the sent messages.
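For illustration only (the field numbers and the derivation rule below are assumptions, not part of this guide), a Jython scripted field value that reads a previously initialized field from the ISOMsg argument might look like this:

def lastFourOfPan(isoMsg):
    # isoMsg has the fields declared before this one already initialized
    pan = isoMsg.getString(2)      # field 2: primary account number (assumed to be set earlier in the form)
    if pan and len(pan) >= 4:
        return pan[-4:]            # derive this field's value from the PAN
    return "0000"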


XML Tools
In this section:
• XML Validator
• XML Data Bank
• XML Transformer
• XSLT
• XML Encryption
• XML Signer
• XML Signature Verifier
• XML Encoder
• XML Decoder


XML Validator
This topic introduces the XML Validator tool, which checks whether XML is well-formed and can also validate XML against schema definitions. Sections include:
• Understanding XML Validator
• Customizing XML Validator

Understanding XML Validator
This tool checks whether XML is well-formed and can also be configured to validate XML against schema definitions. It is primarily chained to the response output of a SOAP or Messaging Client. A well-formed document conforms to the following guidelines:
• All elements that require both start and end tags have both start and end tags.
• Elements nest properly within each other.
• Attribute values must be quoted.
A valid XML document contains a proper document type declaration and obeys the constraints of that declaration (correct elements, nesting, attributes, attribute values, etc.). As a test suite tool, it allows you to check XML as part of your functional test scenario. To check XML during static analysis, use the "Check XML Well-formedness" rule, which is in the Validate XML category. This rule has the same customization options as the XML Validator tool.
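Outside of SOAtest, the same well-formedness rules can be illustrated with standard Python; the snippet below only demonstrates what "well-formed" means and is not how the XML Validator tool is implemented:

from xml.dom.minidom import parseString

parseString('<order id="1"><item>book</item></order>')   # well-formed; parses cleanly

try:
    parseString('<order id=1><item>book</order>')         # unquoted attribute value, mismatched tags
except Exception as err:
    print("Not well-formed: %s" % err)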

Customizing XML Validator
You can customize the following options:

Validate Against Schema: Select if you would like to validate XML files against schema.



Check well-formedness only: Select if you would like to check the well-formedness of XML files.



Use namespace as location URI for Schemas: (Only available if Validate Against Schema is selected) When validating XML that uses schemas this option is used to indicate to SOAtest where to find the schema. If it is checked, then it will be assumed that the namespace is synonymous with the location. If it is not checked, then each time the Validator encounters a namespace that it does not recognize, it will prompt you to supply the location. SOAtest will only recognize those schemas that have been added to the Preference panel’s Schema Locations tab. For more information on adding schema locations, see “XML Schema Locations Settings”, page 760.



Validate against schemas referenced in WSDL or Schema: This option tells SOAtest to look for imported schemas in the specified WSDL or schema file and to validate the XML against those schemas.



List of namespaces mapped to schema locations: Allows you to map namespaces to locations (if the namespace is not synonymous with the location).



List of namespaces to skip during XML Validation: Specifies namespaces to skip.


XML Data Bank
This topic explains how to configure and apply the XML Data Bank tool that extracts values from a test so that those values can be used to populate the value of an element in the request of another test. Data can also be sent to a Writable Data Source and accessed in the Extension Tool, or it can be sent to test variables for easy reuse across the test suite. Sections include:
• Understanding XML Data Bank
• Configuring XML Data Bank Using a Wizard
• Configuring the XML Data Bank Manually
• Viewing the Data Bank Variables Used During Test Execution

Understanding XML Data Bank
The XML Data Bank tool enables you to extract certain parameters from one test in a test suite, and input those parameters into another test. In other words, you can use a parameter from the SOAP response of Test 1 as a parameter in the SOAP request of Test 2. The XML Data Bank tool can be chained to any other SOAtest tool that outputs XML. It is able to extract any information from the XML and makes that information available for later use. For example, you can configure a test suite that tests a bank’s Web service transactions. Test 1 of that test suite can log on to the service using a User ID, then the SOAP response would return a session ID back to Test 1. Test 2 of that test suite can be configured to use the session ID from Test 1 to perform transactions. You can configure any of the tests in a test suite to use SOAP response parameters as SOAP request parameters.
Users typically configure an XML Data Bank by accessing a wizard while parameterizing a value in a tool such as the SOAP Client or Messaging Client tool. This provides a quick, intuitive, and largely automated way to extract data from one test and use it in another. You simply go to the test where you want to insert extracted data, then use a wizard to specify what data (e.g., from what test) you want to extract. The extractions and parameterizations will all be set up automatically. This is the usage model demonstrated in the Configuring an XML Data Bank tutorial. This same method can be used to extract data that is used to set a test variable.
Alternatively, you can manually configure an XML Data Bank tool to extract data from one test, then manually configure other tests to use the extracted values.

Configuring XML Data Bank Using a Wizard
To use a wizard to configure an XML Data Bank:
1. Ensure that you have a test suite with at least two test cases.
2. Double-click the test node in which you want to use the stored test value (you can select any node except for the first test node) and select Form Input from the Views drop-down menu in the right GUI panel.
3. Select the desired value you would like to use (for example, the id value) from the Operation drop-down menu.
4. In the element view (for example, the id value), select Parameterized and Use Data Source Wizard from the available drop-down menus.


5. In the wizard that opens:
a. Select the test you would like to use parameters from. The Test drop-down menu will contain all tests in the test suite that occur before the current test you are configuring. For example, if you are configuring Test 4, tests 1, 2, and 3 will display in this menu along with any data sources that may be available.
b. In the Expected XML tree, select the desired response element, then click Add to add it to the Selected XPaths area, which specifies the value you want to be saved and made available as a parameterized value. The value you added displays with a Data Source Column Name corresponding to the Test the value was chosen from and the value name.

Tip - Right-Click Options for Adding XPaths
You can also right-click a value from the Expected XML list and select Add XPath from the shortcut menu. If you right-click a value that has multiple occurrences, you can choose to add XPaths for the following:
• Add XPath for this occurrence only: Adds only the selected XPath.
• Add XPath for all occurrences within row: Adds all occurrences of the selected XPath.
c. (Optional) If you want to specify additional options (e.g., if you want the value saved to a data source or test variable—or if you want to modify advanced XPath settings), select the appropriate element in the Selected XPaths area, configure the options as needed, then click OK. Available options are described below:


Available Options
XPath options are:
• XPath: Displays the selected XPath. If you are looking for a more general XPath, you can easily type a different number into the list indices. For example, [1] can be changed to [2] if you are interested in the 2nd occurrence only. After editing the XPath text, click the Validate button to validate the XPath format, then click OK.



Extract: Allows you to extract either the Entire Element, or the Content Only.





Selecting Entire Element will output the entire XPath. For example, XPath/Parent will output <parent>VALUE</parent>. You can also modify the Index to extract to specify which element is extracted if the element occurs more than once.



Selecting Content Only will output only the value. For example, XPath/Parent will output VALUE. Text Content extracts the text content of the element selected, and All Child Nodes extracts all child nodes of the element selected.

XPath Evaluation: Clicking the Evaluate button displays the result of applying the XPath expression against the expected XML.
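As a quick sketch of the difference between the two extraction modes (element names and values here are hypothetical): if the expected XML contains <accounts><account>1001</account><account>1002</account></accounts>, the XPath /accounts[1]/account[1] with Entire Element selected would store <account>1001</account>, while Content Only would store just 1001. Changing the index to /accounts[1]/account[2] would target the second occurrence instead.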

Data source options are:
• Custom column name: Tells SOAtest to store the value in a data source column with the specified name.
• Writable data source: Tells SOAtest to store the value in a writable data source (see “Adding a Writable Data Source”, page 351 for details). This allows you to store an array of values. Other tests can then iterate over the stored values.
• Test variable: Tells SOAtest to save the value in the specified test variable so it can be reused across the current test suite. The test variable must already be added to the current test suite as described in “Defining Test Variables”, page 326. Any values set in this manner will override any local test variable values specified in the test suite properties panel.

Configuring the XML Data Bank Manually
You can also manually chain the XML Data Bank tool to a test within the test suite. To configure the XML Data Bank as a chained tool, complete the following:
1. Ensure that you have a test suite available with at least two test cases.
2. Right-click the Test 1 node and select Add Output. The Add Output wizard displays.
3. In the Add Output wizard, select Response> SOAP Envelope> XML Data Bank and click the Finish button. An XML Data Bank node displays in the Test 1 branch.
4. Double-click the XML Data Bank node. The following XML Data Bank options display in the right GUI panel:

Name: Specifies the name of the XML Data Bank tool.



Expected XML Views: Specifies the expected XML response. A Literal tab, a Tree tab, and an Element tab are available for viewing the expected XML in literal, tree, or simple element form.



Selected XPaths: Specifies the XPaths you would like to add as a parameterized value.




Add XPath: To add an XPath, select a value from the Expected XML list and click the Add XPath button. The value you added displays in the Selected XPaths list with a Data Source Column Name corresponding to the Test the value was chosen from and the value name.



Remove: To remove an XPath, select a value from the Selected XPaths list and click the Remove button.



Modify: To modify an XPath, select a value from the Selected XPaths list and click the Modify button. The Modify dialog displays and contains the following options:
XPath: Displays the selected XPath. If you are looking for a more general XPath, you can type a different number into the list indices. For example, [1] can be changed to [2] if you are interested in the 2nd occurrence only. After editing the XPath text, click the Validate button to validate the XPath format, then click OK.



Extract: Allows you to extract either the Entire Element or the Content Only.
Entire Element: Selecting Entire Element will output the entire element selected by the XPath. For example, XPath/Parent will output <parent>VALUE</parent>. You can configure the following Entire Element option:
Index to extract: Default value is 1. This controls which element is extracted in the case where the element occurs more than once.
Content Only: Selecting Content Only will output only the value. For example, XPath/Parent will output VALUE. You can configure Text Content (extracts the text content of the selected element) or All Child Nodes (extracts all child nodes of the selected element).



Save Expected XML: Specifies whether or not to save the expected XML.



Canonicalize XML Output: Specifies whether or not an extracted element is canonicalized. If this option is selected, and if an entire element is extracted, any necessary namespace declarations are added to the extracted element if the element contains prefixes referencing namespaces that are not declared in the same element.



Allow Alteration: Specifies whether to allow the alteration of an XPath. When this option is selected, an Extract tab and an Alter tab display beneath the Selected XPaths list. To alter an XPath, select the Allow Alteration checkbox, select the Alter tab, add an XPath by clicking the Add button, and then modify the XPath by clicking the Modify button. The Modify dialog displays and contains the following options:
XPath: Displays the selected XPath. To edit and validate a selected XPath, edit the XPath text, click the Validate button to validate the XPath format, then click OK.



Alteration Type: Allows you to select how the Value you enter alters the XML. Selecting Append will add the altered value to the end of the XPath. Selecting Prepend will add the altered value to the beginning of the XPath. Selecting Replace With will replace the entire XPath with the altered value you specify.


Alteration Value: Allows you to enter the value you would like to alter the XPath with.

Extract Empty Elements As: Specifies whether or not empty XML elements will be extracted. When selected, specifying a text string in the adjacent field will tell the XML Data Bank to replace the empty content of the extracted elements with the specified string.
5. After adding and/or modifying XPaths, double-click the Test 2 node and choose Generated Data Source from the Data Source drop-down menu in the right GUI panel. The XPaths you added to the XML Data Bank in Test 1 are now available to be used as parameterized values for the SOAP Client in Test 2 when using Form Input or Form XML settings for the SOAP Envelope.

Viewing the Data Bank Variables Used During Test Execution
You can configure the Console view (Window> Show View> Console) to display the data bank variables used during test execution. For details, see “Monitoring Test Variable Usage”, page 328.


XML Transformer

This topic explains how to configure and apply the XML Transformer tool. Sections include:
• Understanding XML Transformer
• Configuring XML Transformer

Understanding XML Transformer
Oftentimes, a SOAP message contains a large XML payload, but you may be interested in only a small portion of that payload and wish to create regression controls using only a few elements of the SOAP response or request. Ignoring all of the other elements one at a time can become very tedious. In this case, it is more efficient to use the XML Transformer tool. The XML Transformer gives you XSLT-like functionality to transform any XML. This is very useful if you would like to create a regression control using only a few elements of the SOAP response or request.
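For example (the element names below are hypothetical), if a response returns a large <customerRecord> element but you only care about its <status> child, adding the XPath /customerRecord[1]/status[1] to the XML Transformer would reduce the output to just:

    <status>ACTIVE</status>

which can then serve as a small, stable regression control instead of the full payload.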

Configuring XML Transformer
To configure the XML Transformer, complete the following:
1. Select the main test suite node and click the Add test or output button. The Add Test wizard displays.
2. Select Standard Test> XML Transformer from the Add Test wizard, and then click the Finish button. An XML Transformer test node displays in the test suite.
3. Double-click the XML Transformer node. The following options display in the XML Transformer tool workspace:

Name: Specifies the name of the Transformer tool.



Tree/Literal/Element: Specifies the expected XML response. A Literal tab, a Tree tab, and an Element tab are available for viewing the expected XML in literal, tree, or simple element form.



Selected XPaths: Specifies the XPaths you would like to add as a parameterized value.



Add XPath: To add an XPath, select a value from the Expected XML list and click the Add XPath button. The value you added displays in the Selected XPaths list with a Data Source Column Name corresponding to the Test the value was chosen from and the value name.



Remove: To remove an XPath, select a value from the Selected XPaths list and click the Remove button.



Modify: To modify an XPath, select a value from the Selected XPaths list and click the Modify button. The Modify dialog displays and contains the following options:
XPath: Displays the selected XPath. If you are looking for a more general XPath, you can type a different number into the list indices. For example, [1] can be changed to [2] if you are interested in the 2nd occurrence only. After editing the XPath text, click the Validate button to validate the XPath format, then click OK.
Options: Allows you to extract either the Entire Element or the Content Only.






Entire Element: Selecting Entire Element will output the entire element selected by the XPath. For example, XPath/Parent will output <parent>VALUE</parent>. You can configure the following Entire Element option:
Index to extract: Default value is 1. This controls which element is extracted in the case where the element occurs more than once.
Content Only: Selecting Content Only will output only the value. For example, XPath/Parent will output VALUE. You can configure Text Content (extracts the text content of the selected element) or All Child Nodes (extracts all child nodes of the selected element).

XPath Evaluation: Clicking the Evaluate button displays the result of applying the XPath expression against the expected XML.



Save Expected XML: Specifies whether or not to save the expected XML.



Canonicalize XML Output: Specifies whether or not an extracted element is canonicalized. If this option is selected, and if an entire element is extracted, any necessary namespace declarations are added to the extracted element if the element contains prefixes referencing namespaces that are not declared in the same element.



Allow Alteration: Specifies whether to allow the alteration of an XPath. When this option is selected, an Extract tab and an Alter tab display beneath the Selected XPaths list. To alter an XPath, select the Allow Alteration checkbox, select the Alter tab, add an XPath by clicking the Add button, and then modify the XPath by clicking the Modify button. The Modify dialog displays and contains the following options:
XPath: Displays the selected XPath. To edit and validate a selected XPath, edit the XPath text, click the Validate button to validate the XPath format, then click OK.



Alteration Type: Allows you to select how the Value you enter alters the XML.



Selecting Append will add the altered value to the end of the XPath.



Selecting Prepend will add the altered value to the beginning of the XPath.



Selecting Replace With will replace the entire XPath with the altered value you specify.



Alteration Value: Allows you to enter the value you would like to alter the XPath with.



Extract Empty Elements As: Specifies whether or not empty XML elements will be extracted. When selected, specifying a text string in the adjacent field will tell the XML Transformer to replace the empty content of the extracted elements with the specified string.



Wrap Output in XML: Specifies whether the output of the XML Transformer tool will be well-formed XML. If this option is selected, the content will be output as well-formed XML. For example, the XML-encoded value &lt;foo&gt; would be output as:
    <root> <ElementName>&lt;foo&gt;</ElementName> </root>

Usage Notes
You can add a regression control using the XPath values you extracted or altered. Right-click the SOAP Client node that contains an XML Transformer and select Create Regression Control from the shortcut menu. A Transformed XML> Diff node appears underneath the XML Transformer node. If you select this node, you will see the XPaths you added to the Selected XPaths list in the XML Transformer tool.


XSLT

This topic explains how to apply and customize the XSLT tool that transforms XML files. Sections include:
• Understanding the XSLT Tool
• Transforming XML
• Configuring the XSLT Tool

Understanding the XSLT Tool
The XSLT tool transforms XML files using the style described in the specified XSL file.

Transforming XML
We recommend that you specify the .xsl file you want the XSLT tool to use prior to applying the tool. For instructions on specifying the .xsl file, see “Configuring the XSLT Tool”, page 872. This tool can be used as part of a test case, or it can be applied to a file or set of files opened in the File tree.
If you are using this tool as a test suite output, you need to specify what XSL file it should use to transform the XML it receives. You can specify this in the XSLT Parameters panel. To open this panel, select the XSLT node in your Test Suite tree. In this panel, you can also specify whether you want this tool to store information using relative paths (instead of absolute paths) and the MIME type of the transformed document.
If you are using this tool as a test suite tool, you need to specify what XML you want to transform as well as what XSL file it should use to transform the specified XML. You can specify both of these parameters in the XSLT Parameters panel. To open this panel, select the Test: XSLT node in your Test Suite tree. In the XSLT Parameters panel, specify the file or text you want to transform in the Input portion of the panel, then specify the XSL file that you want to use in the Tool portion of the panel. In the Tool portion of the tab, you can also specify whether you want this tool to store information using relative paths (instead of absolute paths) and the MIME type of the transformed document.
Important: An Edit output is added to this test suite tool by default. If you want the transformed output saved, you need to add the Write File tool as an output for this tool.
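As a minimal sketch of the kind of .xsl file the tool can apply (the summary and status element names are placeholders, not part of any SOAtest sample), a stylesheet that reduces a response to a single element might look like:

    <?xml version="1.0"?>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
       <xsl:output method="xml" indent="yes"/>
       <!-- Copy only the value of the first status element into a small summary document -->
       <xsl:template match="/">
          <summary>
             <xsl:value-of select="//status"/>
          </summary>
       </xsl:template>
    </xsl:stylesheet>

Chaining a Write File output to the XSLT tool would then save the resulting summary document after each run.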

Configuring the XSLT Tool
If you want an XSLT tool to always use the same XSL file for transformations, you can permanently associate a file with the tool. You can also determine the MIME type of the XSLT tool’s output. You can set the XSLT tool’s XSL file and output type in the Configure XSLT panel. You can reach the Configure XSLT panel by double-clicking the XSLT Test Suite node. You can then use the Mime type of transformed document box to indicate which MIME type you want to use for transformed files, and use the XSL File field to indicate which (if any) file you want to use as a permanent XSL file.


XML Encryption

This topic explains how to configure and apply the XML Encryption tool that encrypts XML documents for security purposes. Sections include:
• Understanding XML Encryption
• Configuring the XML Encryption Tool
• Usage Notes

Understanding XML Encryption
In order to securely send data across the Internet during a Web service transaction, security standards must be put in place to ensure that outside parties cannot view or read any of the private transaction data. The XML Encryption standard recommended by the W3C defines a process that allows any data in XML documents to be encrypted and decrypted. It specifies the data to be encrypted, as well as giving information about the cipher or key used to encrypt the data. SOAtest’s XML Encryption Tool supports the W3C XML Encryption standard and the WS-Security standard from OASIS. The XML Encryption Tool allows you to encrypt and decrypt data to be sent as Web service transactions. The XML Encryption tool also allows you to encrypt individual elements of the XML document, or the entire document itself. This feature is especially useful for Web service transactions that are performed between multiple partners or endpoints. For example, a credit card transaction can be encrypted so that the user’s name and address are visible, but the user’s credit card number is encrypted.
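For reference, an element encrypted according to the W3C XML Encryption standard is replaced by an xenc:EncryptedData structure along the lines of the sketch below; the algorithm shown and the truncated cipher value are illustrative only, and the exact structure SOAtest produces depends on the options you select:

    <xenc:EncryptedData xmlns:xenc="http://www.w3.org/2001/04/xmlenc#"
          Type="http://www.w3.org/2001/04/xmlenc#Content">
       <xenc:EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#aes128-cbc"/>
       <xenc:CipherData>
          <xenc:CipherValue>k7Jf3...</xenc:CipherValue>
       </xenc:CipherData>
    </xenc:EncryptedData>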

Configuring the XML Encryption Tool
The XML Encryption Tool allows you to either Encrypt or Decrypt data. Depending on the mode you select, the available Encrypt or Decrypt options will vary.
Note: Before using the XML Encryption Tool, you must download the Unlimited Strength Java Cryptography Extension. For more information, see “Unlimited Strength Java Cryptography Extension”, page 877.

Tool Settings
The following options display in the left pane of the Tool Settings tab:
• General
• WS-Security
• Target Elements
• Emulation Options
• Encryption Options
• Decryption Options


General
When selecting General from the left pane of the Tool Settings tab, the following options are available:

Encrypt or Decrypt: Select the appropriate radio button to encrypt or decrypt data.



WS-Security Mode: Tells SOAtest to use OASIS WS-Security 1.0 to encrypt SOAP messages. If this option is selected together with the Encrypt SOAP Body (WS-Security Mode) or Entire Document (non WS-Security mode) checkbox, the content of the Body element will be automatically encrypted. WS-Security Mode uses asymmetric encryption but does not allow use of key stores or explicit key values.



Asymmetric (non-WS-Security): Tells SOAtest to use a symmetric key to encrypt the data and an asymmetric key to encrypt the block encryption key. You may also choose to use a key store or an explicit key value.



Symmetric (non-WS-Security): Tells SOAtest to use only one secret, shared, symmetric key to encrypt the data directly.



Key Transport (Key Encryption): Specifies the unique algorithm used to encrypt the key.



Key Store: Specifies the key store for the key-encryption key. The Key Stores available in this menu are dependent on the Key Stores you added at the test suite level. For more information on adding Key Stores, see “Global Key Stores”, page 340.



Symmetric (Block Encryption): Specifies the unique algorithm used to encrypt the data.




Automatic Key: Tells SOAtest to automatically generate the block encryption key.



Use Key Store: Specifies the key store used for the Symmetric (Block Encryption) key. The Key Stores available in this menu are dependent on the Key Stores you added at the test suite level. For more information on adding Key Stores, see “Global Key Stores”, page 340.



Explicit Key Value: Enter an explicit key, depending on the Algorithm you selected from the Symmetric (Block Encryption) menu and choose Save to remember the key value in the project (.tst) file.

WS-Security
When selecting WS-Security from the left pane of the Tool Settings tab, the following options are available:

Form: Choose the appropriate form to specify the key and certificate used for encryption.



Actor: Enter a value to specify a SOAP actor.



Add mustUnderstand="1" attribute: Specifies whether or not the receiver must recognize and decrypt the message. If this option is enabled, a SOAP fault will be sent back if the receiver does not know how to decrypt and deserialize the message.



Add Timestamp: Select to add a timestamp to the message. When this option is enabled, the following are available:



Sign the Timestamp: Select to provide a digital signature with the timestamp.



Add expiration: Select to enter an expiration value in the Time to Live field.

Target Elements
When selecting Target Elements from the left pane of the Tool Settings tab, the following options are available:
• SOAP body/entire document: Select to encrypt the entire SOAP body or entire XML document.
• Click the Add button (only available if SOAP body/entire document is unselected) to specify an XPath and encrypt a specific element within the XML document. After clicking the Add button, a row will appear in the Element Selection list. The Element Selection list consists of the following two columns:
• XPath Expression: Allows you to enter the desired XPath you would like encrypted.
• Target: Allows you to choose either Entire Element or Content Only. Selecting Entire Element will encrypt the entire element selected by the XPath. Selecting Content Only will encrypt only the text content.
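For instance, to encrypt only the credit card number from the earlier example while leaving the name and address readable, you might add an XPath expression such as //*[local-name()='CreditCardNumber'] (a hypothetical element name) with Target set to Content Only.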

Emulation Options
When selecting Emulation Options from the left pane of the Tool Settings tab, the following options are available. Note: These options are available only if WS-Security Mode is selected in the General tab.

Emulate: Select the application server you are using to automatically configure the emulation options. You can also select the appropriate version number of your application server from the Version drop-down menu. •

To manually configure the emulation options, select Custom from the Emulate dropdown menu. The following options will be available for you to manually configure: •

wsse URI: Select the namespace URI of the WS-Security specification used.




wsu URI: Select the utility namespace URI of the WS-Security specification used.



Qualify signed element ID attribute: Select to qualify signed element ID attribute.



Qualify BinarySecurity Token attributes: Select to qualify binary security token attributes with the wsse namespace.



Prefix BinarySecurity Token attribute values: Select to prefix binary security token attributes with the wsse URI.

Encryption Options
When selecting Encryption Options from the left pane of the Tool Settings tab, the following options are available:

Security Header Layout: This property indicates which layout rules to apply when adding items to the security header. The following options are available: •

Lax: Items are added to the security header in any order that conforms to WSS: SOAP Message Security



LaxTimestampFirst: Same as Lax, except that the first item in the security header MUST be a wsse:Timestamp



LaxTimestampLast: Same as Lax, except that the last item in the security header MUST be a wsse:Timestamp



Strict: Items are added to the security header according to a general principle of 'declare before use', following the strict layout rules defined in the WS-SecurityPolicy specification.

Decryption Options
When selecting Decryption Options from the left pane of the Tool Settings tab, the following options are available. Note: These options are available only if the Decrypt radio button is selected in the General tab.

Key Store: Specifies the key store used for the key-encryption key. The Key Stores available in this menu are dependent on the Key Stores you added at the test suite level. For more information on adding Key Stores, see “Global Key Stores”, page 340.



Decrypt all known WSS headers: Select to decrypt WSS header formats.



Decrypt specific header formats: Select to decrypt specific header formats.

Input Type Tab The Input Type tab is only available if the XML Encryption tool is added as a stand-alone tool and not chained to another tool. The following options are available from the Input Type tab: •

Text: Use this option if you want to type or copy the XML document into the UI. Select the appropriate MIME type, enter the XML in the text field below the Text radio button.



File: Use this option if you want to use an existing file. Click the Browse button to choose a file. •

Check the Persist as Relative Path option if you want the path to this file to be saved as a path that is relative to the current configuration file. Enabling this option makes it easier to share tests across multiple machines. If this option is not enabled, the test suite will save the path to this file as an absolute path.


Usage Notes
You can use the XML Encryption tool as a stand-alone tool at the test case level by right-clicking the main test suite node and selecting Add New> Test from the shortcut menu and then selecting XML Encryption from the dialog that opens. You may also chain the XML Encryption tool to a SOAP Client tool by right-clicking the desired SOAP Client node and selecting Add Output from the shortcut menu and then selecting XML Encryption from the dialog that opens. The SOAP Client tool will use the transformed XML in the invocation of the Web service call. You can chain the XML Encryption tool and the XML Signer tool to a SOAP Client to perform both encryption and XML signature on a SOAP message. For more information on the XML Signer tool, see “XML Signer”, page 878. You can also chain any tool, such as an Edit or Browse tool, to the XML Encryption tool by right-clicking the desired XML Encryption tool node, selecting Add Output from the shortcut menu, and then selecting the desired tool from the dialog that opens.

Unlimited Strength Java Cryptography Extension Important: In order to perform security tests using the XML Signature Verifier, XML Signer, or XML Encryption tools, or if using Key Stores, you will need to download and install the Unlimited Strength Java Cryptography Extension. To do so, go to http://java.sun.com/javase/downloads/index_jdk5.jsp and download the JCE Unlimited Strength Jurisdiction Policy Files. The files downloaded should be installed into the following directory on your machine: [SOAtest install dir]\[SOAtest version_number]\plugins\com.parasoft.xtest.jre.eclipse.core.[platform]_[jre version]\jre\lib\security

Be sure to replace the existing local_policy.jar and US_export_policy.jar files with the new ones that you downloaded.


XML Signer

This topic explains how to configure and apply the XML Signer tool that signs XML documents for security purposes. Sections include:
• Understanding XML Signature
• Configuring the XML Signer Tool
• Usage Notes

Understanding XML Signature
In order to securely send data across the Internet during a Web service transaction, security standards must be put in place to ensure that the different users taking part in the transaction can identify each other. The XML Signature standard recommended by the W3C defines a process that allows any data in XML documents to be digitally signed. With XML Signature, Web service users can verify the identity of others involved in a transaction and can be assured that the data has not been altered since the document was signed. OASIS leverages this standard so that it can be used in SOAP.
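For reference, a digital signature produced according to the W3C XML Signature standard appears in the message as a ds:Signature element roughly like the sketch below; the algorithms, the #Body reference ID, and the truncated values are illustrative only, and the exact structure depends on the signing options selected:

    <ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
       <ds:SignedInfo>
          <ds:CanonicalizationMethod Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/>
          <ds:SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1"/>
          <ds:Reference URI="#Body">
             <ds:DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1"/>
             <ds:DigestValue>q8j2...</ds:DigestValue>
          </ds:Reference>
       </ds:SignedInfo>
       <ds:SignatureValue>Xy91...</ds:SignatureValue>
       <ds:KeyInfo>...</ds:KeyInfo>
    </ds:Signature>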

Configuring the XML Signer Tool
SOAtest’s XML Signer Tool supports the W3C XML Signature standard and allows you to digitally sign data to be sent as Web service transactions. The XML Signer Tool also allows you to sign individual elements of the XML document, or the entire document itself. This feature is especially useful for Web service transactions that are performed between multiple partners or endpoints. For example, a transaction for the purchase of a car may take place through a Web service. In this instance, the buyer would have to sign certain parts of the document, the loan officer may have to sign certain parts of the document, and the seller would have to sign certain parts of the document.

Tool Settings
The following options display in the left pane of the Tool Settings tab:
• General
• WS-Security
• Target Elements
• Emulation Options


General
When selecting General from the left pane of the Tool Settings tab, the following options are available:

WS-Security Mode: Tells SOAtest to use OASIS WS-Security 1.0 to sign SOAP messages. If this option is selected together with the Sign SOAP Body (WS-Security Mode) or Entire Document (non WS-Security mode) checkbox, the content of the Body element will be automatically signed.



Perform SAML signature: Selecting this option will result in using OpenSAML to sign the assertion. When this option is unselected, PoX signing will be used.



SAML Key Store: The SAML Key Store option is specifically tied to applying a SAML signature, namely for a certain use case of Sender-Vouches: Signed. When performing an enveloped signature under this confirmation method, it is actually emulating a Holder-of-Key confirmation method. In essence, performing an enveloped signature for a Sender-Vouches confirmation method allows you to sign the Body message of the SOAP payload as well as include its certificate in the SAML Assertion. This option is turned on by default for the Sender-Vouches: Signed > Perform an enveloped signature option, and turned off for any other SAML confirmation method and its respective signature option use case.



Key Store: Select the Key Store used to verify your identity and to sign the XML data from the Key Store drop-down menu. The Key Stores available in this menu are dependent on the Key Stores you added at the test suite level. For more information on adding Key Stores, see “Global Key Stores”, page 340.



Algorithm: Specifies the unique algorithm used for defining the certificate keys.

WS-Security
When selecting WS-Security from the left pane of the Tool Settings tab, the following options are available:




Form: Choose the appropriate form to specify the key and certificate used for encryption.



Actor: Enter a value to specify a SOAP actor.



Canonical Form: Specifies the algorithm used to create a canonicalized form of the information being signed.



Add mustUnderstand="1" attribute: Specifies whether or not the receiver must recognize and verify the signature of the message. If this option is enabled, a SOAP fault will be sent back if the receiver does not know how to process the signature header.



Add Timestamp: Select to add a timestamp to the message. When this option is enabled, the following are available: •

Sign the Timestamp: Select to provide a digital signature with the timestamp.



Add expiration: Select to enter an expiration value.

Target Elements
When selecting Target Elements from the left pane of the Tool Settings tab, the following options are available:
• SOAP body/entire document: Select to sign the entire SOAP body or entire XML document.
• Click the Add button (only available if SOAP body/entire document is unselected) to specify an XPath and sign a specific element within the XML document. After clicking the Add button, a row will appear in the Element Selection list. The Element Selection list consists of the following two columns:
• XPath Expression: Allows you to enter the desired XPath you would like signed.
• Target: Allows you to choose either Entire Element or Content Only. Selecting Entire Element will sign the entire element selected by the XPath. Selecting Content Only will sign only the text content.

Emulation Options
When selecting Emulation Options from the left pane of the Tool Settings tab, the following options are available. Note: These options are available only if WS-Security Mode is selected in the General tab.

Emulate: Select the application server you are using to automatically configure the emulation options. You can also select the appropriate version number of your application server from the Version drop-down menu. •

To manually configure the emulation options, select Custom from the Emulate dropdown menu. The following options will be available for you to manually configure: •

wsse URI: Select the namespace URI of the WS-Security specification used.



wsu URI: Select the utility namespace URI of the WS-Security specification used.



Qualify signed element ID attribute: Select to qualify signed element ID attribute.



Qualify BinarySecurity Token attributes: Select to qualify binary security token attributes with the wsse namespace.



Prefix BinarySecurity Token attribute values: Select to prefix binary security token attributes with the wsse URI.


Input Type Tab The Input Type tab is only available if the XML Signer tool is added as a stand-alone tool and not chained to another tool. The following options are available from the Input Type tab: •

Text: Use this option if you want to type or copy the XML document into the UI. Select the appropriate MIME type, enter the XML in the text field below the Text radio button.



File: Use this option if you want to use an existing file. Click the Browse button to choose a file. •

Check the Persist as Relative Path option if you want the path to this file to be saved as a path that is relative to the current configuration file. Enabling this option makes it easier to share tests across multiple machines. If this option is not enabled, the test suite will save the path to this file as an absolute path.

Usage Notes
You can use the XML Signer tool as a stand-alone tool at the test case level by right-clicking the main test suite node and selecting Add New> Test from the shortcut menu and then selecting XML Signer from the dialog that opens. You may also chain the XML Signer tool to a SOAP Client tool by right-clicking the desired SOAP Client node and selecting Add Output from the shortcut menu and then selecting XML Signer from the dialog that opens. The SOAP Client tool will use the transformed XML in the invocation of the Web service call. You can chain the XML Signer tool and the XML Encryption tool to a SOAP Client to perform both encryption and XML signature on a SOAP message. For more information on the XML Encryption tool, see “XML Encryption”, page 873. You can also chain any tool, such as an Edit or Browse tool, to the XML Signer tool by right-clicking the desired XML Signer tool node, selecting Add Output from the shortcut menu, and then selecting the desired tool from the dialog that opens.

Unlimited Strength Java Cryptography Extension Important: In order to perform security tests using the XML Signature Verifier, XML Signer, or XML Encryption tools, or if using Key Stores, you will need to download and install the Unlimited Strength Java Cryptography Extension. To do so, go to http://java.sun.com/javase/downloads/index_jdk5.jsp and download the JCE Unlimited Strength Jurisdiction Policy Files. The files downloaded should be installed into the following directory on your machine: [SOAtest install dir]\[SOAtest version_number]\plugins\com.parasoft.xtest.jre.eclipse.core.[platform]_[jre version]\jre\lib\security

Be sure to replace the existing local_policy.jar and US_export_policy.jar files with the new ones that you downloaded.


XML Signature Verifier

This topic explains how to configure and apply the XML Signature Verifier tool that verifies XML documents for security purposes. Sections include:
• Understanding XML Signature Verifier
• Configuring the XML Signature Verifier Tool
• Usage Notes

Understanding XML Signature Verifier The XML Signature standard recommended by the W3C defines a process that allows any data in XML documents to be digitally signed. OASIS specifies how this can be used with SOAP. With XML Signature, Web service users can verify the identity of others involved in a transaction and can be ensured that the data has not been altered since the document was signed.

Configuring the XML Signature Verifier Tool SOAtest’s XML Signature Verifier Tool allows you to verify the authenticity of digital signatures sent with Web service transactions.

Tool Settings
The following options display in the left pane of the Tool Settings tab:
• General
• Emulation Options


General
When selecting General from the left pane of the Tool Settings tab, the following options are available:

WS-Security Mode: Select to use OASIS WS-Security 1.0 to verify SOAP messages.



Actor: Enter a value to specify a SOAP actor.



SAML signature verification: Select to use OpenSAML to perform signature verification. When WS-Security mode is selected, proper verifications will be performed by WSS4J on SAML and/or other tokens.



Plain XML Mode: Select to use plain XML to perform signature verification.



Use Key Store: Select the Key Store used from the Key Store drop-down menu. The Key Stores available in this menu are dependent on the Key Stores you added at the test suite level. Depending on whether your service includes the certificate, you may or may not need this option. For more information on adding Key Stores, see “Global Key Stores”, page 340.



Verify all known WSS headers: Select to have SOAtest attempt to process all known Web service security header formats.



Verify specific header formats: Select to have SOAtest process specific headers. Selecting this option enables the Emulation Options.

Emulation Options
When selecting Emulation Options from the left pane of the Tool Settings tab, the following options are available. Note: These options are available only if WS-Security Mode and Verify specific header formats are selected in the General tab.




Emulate: Select the application server you are using to automatically configure the emulation options. You can also select the appropriate version number of your application server from the Version drop-down menu. •

To manually configure the emulation options, select Custom from the Emulate dropdown menu. The following options will be available for you to manually configure: •

wsse URI: Select the namespace URI of the WS-Security specification used.



wsu URI: Select the utility namespace URI of the WS-Security specification used.



Qualify signed element ID attribute: Select to qualify signed element ID attribute.



Qualify BinarySecurity Token attributes: Select to qualify binary security token attributes with the wsse namespace.



Prefix BinarySecurity Token attribute values: Select to prefix binary security token attributes with the wsse URI.

Input Type Tab
The Input Type tab is only available if the XML Signature Verifier tool is added as a stand-alone tool and not chained to another tool. The following options are available from the Input Type tab:

Text: Use this option if you want to type or copy the XML document into the UI. Select the appropriate MIME type, enter the XML in the text field below the Text radio button.



File: Use this option if you want to use an existing file. Click the Browse button to choose a file. •

Check the Persist as Relative Path option if you want the path to this file to be saved as a path that is relative to the current configuration file. Enabling this option makes it easier to share tests across multiple machines. If this option is not enabled, the test suite will save the path to this file as an absolute path.

Usage Notes You can use the XML Signature Verifier tool as a stand-alone tool at the test case level by right-clicking the main test suite node and selecting Add New> Test from the shortcut menu and then selecting XML Signature Verifier from the dialog that opens.

Unlimited Strength Java Cryptography Extension Important: In order to perform security tests using the XML Signature Verifier, XML Signer, or XML Encryption tools, or if using Key Stores, you will need to download and install the Unlimited Strength Java Cryptography Extension. To do so, go to http://java.sun.com/javase/downloads/index_jdk5.jsp and download the JCE Unlimited Strength Jurisdiction Policy Files. The files downloaded should be installed into the following directory on your machine: [SOAtest install dir]\[SOAtest version_number]\plugins\com.parasoft.xtest.jre.eclipse.core.[platform]_[jre version]\jre\lib\security

Be sure to replace the existing local_policy.jar and US_export_policy.jar files with the new ones that you downloaded.


XML Encoder

XML Encoder The XML Encoder tool is used to output the XML representation of a Java Bean for a captured Java object. This and the XML Decoder are especially useful for EJB testing in conjunction with the EJB Client. For details on using this tool, see “EJB Client”, page 841.
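If the tool uses the standard java.beans.XMLEncoder representation (an assumption for illustration; the class and property names below are hypothetical), the XML form of a simple bean would look roughly like:

    <java version="1.6.0" class="java.beans.XMLDecoder">
       <object class="com.example.Account">
          <void property="owner">
             <string>Jane Doe</string>
          </void>
          <void property="balance">
             <double>150.0</double>
          </void>
       </object>
    </java>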


XML Decoder

XML Decoder The XML Decoder tool is used to convert an XML body representing a Java Bean into its corresponding Java object. This and the XML Encoder are especially useful for EJB testing in conjunction with the EJB Client. For details on using this tool, see “EJB Client”, page 841.


Viewing Tools
In this section:
• Traffic Viewer
• Event Monitor
• Edit
• Write File
• File Stream Writer
• Results Stream Writer
• stderr
• stdout


Traffic Viewer

This topic explains how to configure and apply the Traffic Viewer tool that displays the HTTP traffic of the SOAP Request and SOAP Response. Sections include:
• Understanding the Traffic Viewer
• Configuring the Traffic Viewer Tool

Understanding the Traffic Viewer
Each time you run a test suite, SOAtest will automatically record the HTTP traffic between the client and server for each test case within the test suite. The Traffic Viewer tool helps you to manage and visualize the SOAP Requests and SOAP Responses for each test case. The Traffic Viewer can record multiple instances of HTTP traffic that correspond to each test run.

Configuring the Traffic Viewer Tool After a test is run, double-click the Traffic Object node beneath the test to view that test’s traffic. You can configure the following options for the Traffic Viewer tool: •

Name: Specifies the name of the Traffic Viewer tool



Test Run: Displays the timestamp of the selected test case.



Remove: Click to delete the selected Traffic instance from the Test Run drop-down menu.



Clear All: Click to delete all the Traffic instances from the Test Run drop-down menu.



Save Traffic: Select to save traffic data (http requests and responses) to be available after saving the project. If this option is not selected, http traffic will be lost after the project is closed.



Data Source Row: Displays data source row values sent within the HTTP traffic. Select the appropriate row to display the corresponding SOAP Request and SOAP Header. (The Data Source Row menu only displays if a data source was used for the corresponding test case).



Response: Displays the Response HTTP Header and Body that corresponds to the selected test from the Test Run drop-down menu and also (if applicable) to the selected Data Source Row.


You can view the Response body via three different tabs. Each of these tabs provides you with a different graphical representation of the XML message. The tabs are as follows:





Literal: By default, the Response body displays in the Literal tab. The Literal tab allows you to directly edit the text of the Response body. If you find that you have to scroll from left to right to view the HTTP traffic, you can click in the Literal view and press CTRL + B to beautify the XML. After pressing CTRL + B, all well-formed XML fragments will be beautified, alleviating the need to scroll from left to right.



Tree: The Tree tab allows you to view the Response body via a tree view.



Element: The Element tab allows you to view a simplified version of the XML in an easy to read format.

Request: Displays the Request HTTP Header and Body that corresponds to the selected test from the Test Run drop-down menu and also (if applicable) to the selected Data Source Row.


You can view the Request body via three different tabs. Each of these tabs provides you with a different graphical representation of the XML message. The tabs are as follows:
Literal: By default, the Request body displays in the Literal tab. The Literal tab allows you to directly edit the text of the Request body. If you find that you have to scroll from left to right to view the HTTP traffic, you can click in the Literal view and press CTRL + B to beautify the XML. After pressing CTRL + B, all well-formed XML fragments will be beautified, alleviating the need to scroll from left to right.
Tree: The Tree tab allows you to view the Request body via a tree view.



Element: The Element tab allows you to view a simplified version of the XML in an easy to read format.

Tip: You can toggle between Response and Request views by clicking the Up or Down arrows located to the left of the Response body and Request header views.


Event Monitor

Event Monitor The Event Monitor tool traces the internal events within systems such as ESBs and business applications and allows you to make them part of SOAtest’s end-to-end test scenarios. For details, see “Event Monitoring (ESBs, Java Apps, Databases, and other Systems)”, page 494.


Edit

This topic explains how to use and configure the Edit tool, which sends data to an internal SOAtest text editor. Sections include:
• Understanding Edit
• Using Edit

Understanding Edit
The Edit tool sends data to a SOAtest text editor window. This tool is primarily used to view the output of another tool, such as traffic from a messaging tool or the output of an XML Transformer.

Using Edit
To use Edit, add it as a test suite output or chain it to the tool whose results you want it to display.


Write File

This topic explains how to configure and apply the Write File tool, which saves the output of test cases or tools. Sections include:
• Understanding Write File
• Customizing Write File

Understanding Write File
The Write File tool can convert output data to files. The Edit tool is often used in conjunction with this tool because it provides a means to view the files that the Write File tool creates. This tool is primarily used to save the files that result from transformations (XSLT, etc.). It is typically added as an output to an existing tool as described in “Adding Test Outputs”, page 333.

Customizing Write File
You can customize the following options for a Write File tool:
• Target Name: Determines how this tool names the file it creates. You can enter a specific name for a file, or use wildcards where appropriate (see the example after this list). Acceptable wildcards include:
• %b: Base name of original file (no extension).
• %f: Original file name (with extension).
• %e: Extension of original file name.
• %u: Time based on unique field ID.
• %d: Current date.
• %t: Current time.
• %n: Test name.
• Target Directory: Determines where this tool places the file it creates. You can choose to place the file within a File System or within a Workspace.
• Create directories: Determines whether the tool creates directories.
• Override directory from input: Determines whether this tool always saves files in the location specified in the Target Directory field. If this option is enabled, the tool will always save the file in the location specified in the Target Directory field. If this option is not enabled, the tool will try to save the file in the source file’s directory; if the tool cannot write to that directory, it will save the file in the location specified in the Target Directory field.
• Backup file before overwriting: Determines whether backup files are created before the modified file is saved over the previous version.
• Use UTF-8 encoding: Determines whether the tool writes files using UTF-8 encoding.
• Append: Determines whether the tool appends new content to the existing file instead of overwriting it.
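For example, a Target Name such as response_%n_%d.xml might produce a file named along the lines of response_Test 1_2017-06-01.xml; the exact formats used for the %d, %t, and %u wildcards may differ, so run the tool once and inspect the generated file name before relying on it in scripts.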


File Stream Writer

File Stream Writer The File Stream Writer is similar to the Write File tool. Whereas the Write File tool saves the output of a test case or tool, the File Stream Writer tool saves a traffic stream. For details on available options, see “Write File”, page 893.


Results Stream Writer

Results Stream Writer When the associated test is run, this tool will open a new tab that displays the request/response traffic, including HTTP headers. It is useful for debugging to see the actual traffic on the wire.


stderr

stderr When the associated test is run, this tool writes the traffic stream to the standard error stream.


stdout

stdout When the associated test is run, this tool writes the traffic stream to the standard output stream.


Validation Tools
In this section:
• Diff
• WS-I
• DB
• XML Assertor
• WS-BPEL Semantics Validator


Diff

This topic covers the Diff tool, which compares saved data with incoming data and reports differences. Sections include:
• Understanding Diff
• Binary Diff Mode Options
• Text Diff Mode Options
• XML Diff Mode Options
• JSON Diff Mode Options
• Understanding XPaths
• Comparing Large XML Messages

Understanding Diff
The Diff tool compares two files or outcomes and reports how they differ. This tool is primarily used in test suites. To open the Diff Project Configuration panel, double-click the Diff node in your Test Suite tree. The panel is divided into two main sections: the upper section contains general options, and the bottom section contains the Regression Control, Ignored Differences, and Options tabs in which you can configure the options for Text, Binary, and XML comparisons. Information on how to configure the different options of the Diff tool can be found in the following subsections:
• Binary Diff Mode Options
• Text Diff Mode Options
• XML Diff Mode Options
• JSON Diff Mode Options
• Understanding XPaths
• Comparing Large XML Messages

Binary Diff Mode Options When Binary is selected from the Diff Mode drop-down menu of the Diff tool, the following options are available: •

Name: Specifies the name of the Diff tool.



Data Source: Specifies the Data Source to be used for providing control values. This menu is only available if a Data Source was added to the project. For more information on Data Sources, see “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345.



Regression Control Source: Determines what data source values, file, or text SOAtest uses as the "control" value (the value against which it will compare all subsequent results). •

Editor: Tells SOAtest to use the text entered in the related text field as a regression control.



File: Tells SOAtest to use the specified file as a regression control. If you want to ensure that this file's path is always relative to your project file, enable the Persist as Relative Path option.




Data Source Column: Tells SOAtest to use values from the designated data source column as regression controls. This option is only available if your project includes a data source.

Text Diff Mode Options When Text is selected from the Diff Mode drop-down menu of the Diff tool, the Regression Controls, Ignored Differences, and Options tabs are available:

Text Regression Controls The following options are available in the Regression Control tab of the Diff tool for Text mode. •

Name: Specifies the name of the Diff tool.



Data Source: Specifies the Data Source to be used for providing control values. This menu is only available if a Data Source was added to the project. For more information on Data Sources, see “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345.



Regression Control Source: Determines what data source values, file, or text SOAtest uses as the "control" value (the value against which it will compare all subsequent results). •

Editor: Tells SOAtest to use the text entered in the related text field as a regression control.



File: Tells SOAtest to use the specified file as a regression control. If you want to ensure that this file's path is always relative to your project file, enable the Persist as Relative Path option.



Data Source Column: Tells SOAtest to use values from the designated data source column as regression controls. This option is only available if your project includes a data source.

Text Ignored Differences From the Ignored Differences tab of the Diff tool for Text mode, you can delete any ignored text differences by clicking the Delete button.

Text Options
The following options are available in the Options tab of the Diff tool for Text mode:
• Regular Expression: Determines whether the expected value will be parsed as a regular expression. If this box is unselected, the control value will not be parsed as a regular expression. If it is selected, the control value will be parsed as a regular expression. For example, the following regular expression may be entered:
    Java[a-zA-Z ]+\Q(\E[4-9]+th Edition\Q)\E
where the actual string extracted may be:
    Java How to Program (4th Edition)
The regular expression checks that "Java" appears as the first substring, followed by one or more occurrences (indicated by the + operator) of characters limited to the letters a-z and A-Z and whitespace. Notice that ( and ) are escaped by surrounding them with \Q and \E; because parentheses have special meaning in a regular expression, they must be escaped when they are meant to match literal substrings. The [4-9]+ after the first open parenthesis specifies at least one digit between 4 and 9, which must be followed by "th Edition" and the escaped closing parenthesis. (Matching and non-matching examples are shown after this list.)
• Ignore Whitespace: Determines whether empty lines and whitespaces at the end and beginning of input lines and diff control lines are ignored. If this box is unselected, empty lines and leading/trailing whitespaces will cause the regression test to fail. If it is selected, empty lines and leading/trailing whitespaces will be ignored. A whitespace is any of the following: horizontal tabulation, new line, form feed, carriage return, space. An empty line is a line that contains nothing but these whitespace characters.
• Output results as UNIX-style diff: Determines the diff output format. If this box is unselected, the output will display in table form. If it is selected, the output will display in UNIX-style diff format.
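For example, the regular expression shown above would match sample strings such as (these titles are illustrative only):

    Java How to Program (4th Edition)
    Java Web Services (9th Edition)

but would not match Java How to Program (2nd Edition), because 2 is outside the [4-9] range and "nd Edition" does not match the literal "th Edition".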

XML Diff Mode Options
When XML is selected from the Diff Mode drop-down menu of the Diff tool, the Regression Control, Ignored Differences, and Options tabs are available. XML mode parses XML files, then compares them element by element and attribute by attribute (ignoring “ignorable” whitespace). The result of the comparison is expressed as an XML document. If you want to use this mode, make sure that both inputs are well-formed XML documents which have the same type of document element. The XML mode provides three different graphical representations to configure XML messages:
• Literal XML
• Form XML
• SOAP Response

Literal XML Regression Controls The following options are available in the Regression Control tab of the Diff tool for Literal XML mode. •

Name: Specifies the name of the Diff tool.



Data Source: Specifies the Data Source to be used for providing control values. This menu is only available if a Data Source was added to the project. For more information on Data Sources, see “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345.



Regression Control Source: Determines what data source values, file, or text SOAtest uses as the "control" value (the value against which it will compare all subsequent results).





Editor: Tells SOAtest to use the text entered in the related text field as a regression control.



File: Tells SOAtest to use the specified file as a regression control. If you want to ensure that this file's path is always relative to your project file, enable the Persist as Relative Path option.



Data Source Column: Tells SOAtest to use values from the designated data source column as regression controls. This option is only available if your project includes a data source.

Set From WSDL: Initializes the Form XML content with the expected response based on the WSDL. This button is only available if a WSDL document exists for the particular SOAP client.


Form XML Regression Controls When Form XML is selected as the Diff mode, the options in the Regression Control tab are divided into two sections: the XML View Tree and the XML Configuration Tabs. The XML View Tree displays the literal XML as a tree, with each tree node representing an element. The options for the XML View Tree in the Diff panel can be configured in the same fashion as the Form XML SOAP Envelope options of the SOAP Client tool. For more information, see “Manipulating the XML View Tree”, page 794. The XML Configuration Tabs allow you to add, remove, and rename XML components. The options for the XML Configuration Tabs can be configured in the same fashion as the Form XML SOAP Envelope options of the SOAP Client tool. For more information, see “Manipulating the XML Configuration Tabs”, page 796.

SOAP Response Regression Controls The SOAP Response mode compares values that you define to the actual outcome of the SOAP response. It allows you to compare multiple elements and attributes of the most complex XML structures. Depending on the method, a number of different options may be available. The options for the SOAP Response mode can be configured in the same fashion as the Literal Value options of the SOAP Client tool. For more information, see “Literal Value Options (For RPC-Style and Document-Style Services)”, page 807.

XML Ignored Differences From the Ignored Differences tab of the Diff tool for XML mode, you can Add and Modify XPath settings by clicking the appropriate buttons. For more information on configuring Ignored Differences, see “Understanding XPaths”, page 903.

XML Options
The following options are available in the Options tab of the Diff tool for XML mode. •

Output Results as XML: Determines the diff output format. If this box is unselected, the output will display in table form. If this box is selected, the output will display in XML. Differences are only reflected when another tool is chained to the message output of the Diff tool.



SOAP Mode: If this box is selected, the following actions are enabled:





SOAP multi-references are resolved before diffing and are not reported as errors. For example, some services (such as Axis) rearrange XML responses in an unpredictable way when using SOAP multi-refs (i.e., they assign different id numbers to the references). When such responses are diffed in Text mode, failures occur even though the SOAP messages are logically equivalent; only their reference ids differ (see the example fragment after this list).



Namespace prefix changes in type and arrayType attributes are ignored. This is needed when the control is generated automatically from the WSDL because prefixes cannot be determined in advance.



Numerical values are compared as numbers. For example, the difference between 1 and 1.0 would not be reported as an error.
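To illustrate the multi-reference case, consider the following two hypothetical Axis-style SOAP 1.1 fragments (the element names and values are invented for illustration). They are logically equivalent, but the generated reference ids differ:

<ns1:getItemResponse>
  <result href="#id0"/>
</ns1:getItemResponse>
<multiRef id="id0" xsi:type="xsd:string">blue</multiRef>

versus

<ns1:getItemResponse>
  <result href="#id1"/>
</ns1:getItemResponse>
<multiRef id="id1" xsi:type="xsd:string">blue</multiRef>

With SOAP Mode enabled, the references are resolved before the comparison, so no difference is reported; in Text mode, the differing id values alone would cause the regression test to fail.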

Ignore Element Order: If this box is selected, the reordering of sibling element sequences is not reported as a difference. For example, when Ignore Element Order is selected,
<root> <a/> <b/> <c/> </root>
would match:
<root> <c/> <b/> <a/> </root>
If Ignore Element Order is not selected, reordered XML contents will not match.

JSON Diff Mode Options
When JSON is selected from the Diff Mode drop-down menu of the Diff tool, the Regression Controls and Ignored Differences tabs are available.

JSON Regression Controls
The following options are available in the Regression Control tab of the Diff tool for JSON mode. •

Name: Specifies the name of the Diff tool.



Data Source: Specifies the Data Source to be used for providing control values. This menu is only available if a Data Source was added to the project. For more information on Data Sources, see “Parameterizing Tests (with Data Sources or Values from Other Tests)”, page 345.



Regression Control Source: Determines what data source values, file, or text SOAtest uses as the "control" value (the value against which it will compare all subsequent results). •

Editor: Tells SOAtest to use the text entered in the related text field as a regression control.



File: Tells SOAtest to use the specified file as a regression control. If you want to ensure that this file's path is always relative to your project file, enable the Persist as Relative Path option.



Data Source Column: Tells SOAtest to use values from the designated data source column as regression controls. This option is only available if your project includes a data source.

JSON Ignored Differences
From the Ignored Differences tab of the Diff tool for JSON mode, you can Add and Delete XPath settings by clicking the appropriate buttons.

Understanding XPaths
When the Diff tool is configured in XML mode, any differences found between the actual value and the expected value are expressed in XPath. An XPath represents the position of an XML element and specifies where a difference occurred, the type of difference that occurred, and whether that difference was caused by a modification, an insertion, or a deletion. You can determine which XPaths to ignore when running the Diff tool. You can choose to either ignore an entire XPath, or ignore a specific XPath operation such as a Content: Insert operation. Ignoring XPaths is useful for ignoring transient values that would normally cause the regression test to fail. For example, the actual outcome may contain a time stamp value that is constantly changing.


Since this value is never the same, it will more than likely not match the value you configure in the Diff tool, causing the test to fail. Therefore, you can specify the XPath of this value to be ignored so that the test will not fail. There are three different ways you can set XPaths to Ignore: •

Ignore XPaths from the SOAtest View: This is the easiest way to configure the Diff tool to ignore XPaths. After a failed regression test, you can simply right-click the SOAtest view node to select an XPath to ignore.



Manually entering an XPath in the Diff GUI: You can manually enter an XPath by clicking the Configure Ignored Differences button in the Diff GUI.



Ignore XPaths from the Form XML tree: You can right-click on an element node from the Form XML tree and configure an XPath to ignore based on the selected element.

Ignoring XPaths from the SOAtest View
When entering XPaths to ignore in the Diff tool, it is easiest to right-click in the SOAtest view rather than manually entering the XPath position into the Diff tool GUI. XPath positions are displayed in tree form in the Test Results panel after a failed regression test.
To ignore an XPath from the SOAtest view after a failed regression test:
1. Right-click the error, then choose Ignore XPath from the shortcut menu. An Ignored XPath Settings dialog displays with the selected XPath automatically entered.

2. Select the appropriate checkboxes of the XPath operations you would like to ignore. The following options are available: •

XPath: Specifies the XPath position that you selected.



Recursive: Select to apply the Ignored XPath settings to child elements.



Text Content (Modify/Insert/Delete): Select the content operation you want to ignore.



Element/Subtree (Insert/Delete): Select the element or subtree operation you want to ignore.



Attribute (Modify/Insert/Delete): Select the attribute operation you want to ignore. If this field is selected, SOAtest will only ignore the specified attribute name within the XPath. To ignore a specific attribute, enter the attribute name in the field next to the Attribute checkbox. If you want to ignore more than one attribute at an element’s XPath location, leave the attribute name empty or use the wild card * (e.g. myAttribute*). •

Element Name and Namespace (Modify/Insert/Delete): Select the element name operation you want to ignore.

3. Click OK. The XPath you specified will now be ignored for any future test runs. In addition, the XPath you specified now appears in the Ignored Differences tab of the XML Mode in the Diff tool. To modify the XPath, see “Modifying XPath Settings”, page 906.

Manually Entering XPaths to Ignore
You can also manually type or paste an XPath to ignore into the Diff tool GUI. To manually enter an XPath:
1. Select the Ignored Differences tab within the Text, XML, or JSON mode of the Diff tool.
2. Click the Add button. An empty field displays in the XPath column of the Ignored XPaths list. By default, the Settings column is automatically filled in with all XPath operations specified, meaning that the entire XPath you add will be ignored. To specify a single XPath operation to ignore, see “Ignoring XPaths from the Form XML Tree”, page 905.
3. Enter an XPath position in the empty XPath field.
4. To add additional XPaths, repeat steps 1 through 3.
5. Click the OK button. All XPaths added will be ignored in future runs of the modified regression test.
For information on modifying ignored XPaths, see “Modifying XPath Settings”, page 906.
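For example, suppose a response contains a constantly changing timestamp element (a hypothetical case; the exact XPath depends on your message structure and on the form in which SOAtest reports the difference). You might add an ignored XPath such as:

/*[local-name()="Envelope"]/*[local-name()="Body"]/*[local-name()="getQuoteResponse"]/*[local-name()="timestamp"]

Leaving all operations selected in the Settings column causes any modification, insertion, or deletion at that location to be ignored during the diff.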

Ignoring XPaths from the Form XML Tree
You may also ignore XPaths directly from the Form XML tree within the Form XML tab by completing the following:
1. Right-click an element node from the tree and choose Setup Ignored XPaths from the shortcut menu. •

If the element you selected from the Form XML tree is not a repeated element, the Ignored XPaths Settings dialog box displays with the selected XPath automatically entered.



If the element you selected from the Form XML tree is a repeated element that has siblings with the same local name, a Repeated XPath Settings dialog displays.


You can either select Ignore All of the repeated elements or Ignore Selected Element Only. After making your selection and clicking OK, the Ignored XPaths Settings dialog box displays with the selected XPath automatically entered.
2. Select the appropriate checkboxes of the XPath operations you would like to ignore. The following options are available: •

XPath: Specifies the XPath position that you selected.



Recursive: Select to apply the Ignored XPath settings to child elements.



Text Content (Modify/Insert/Delete): Select the content operation you want to ignore.



Element/Subtree (Insert/Delete): Select the element or subtree operation you want to ignore.



Attribute (Modify/Insert/Delete): Select the attribute operation you want to ignore. If this field is selected, SOAtest will only ignore the specified attribute name within the XPath. To ignore a specific attribute, enter the attribute name in the field next to the Attribute checkbox. If you want to ignore more than one attribute at an element’s XPath location, leave the attribute name empty or use the wild card * (e.g. myAttribute*).



Element Name and Namespace (Modify/Insert/Delete): Select the element name operation you want to ignore.

3. Click OK. The selected element node now displays in gray in the Form XML tree. All XPath operations specified will be ignored in future runs of the modified regression test. To reset and no longer Ignore an XPath, right-click the Ignored XPath node and select Remove Ignored XPath from the shortcut menu. The node will then display in black and will no longer be ignored.

Creating a Shared Ignored XPaths List
Diff tools can either have their own local, exclusive Ignored XPath settings, or they can reference a list of XPaths that can be used by other Diff tools. To create a shared list of ignored XPaths, see “Global Ignored XPath Properties”, page 338.

Modifying XPath Settings
By default, when you add an XPath to the Ignored Differences tab, all of the possible operations for that XPath display in the Settings column. Since all of the possible operations for the XPath are specified by default, all of these operations will be ignored the next time the regression test is run. However, you can specify which operations are ignored, rather than having all of them ignored, by modifying the XPath settings.
To modify XPath settings:
1. In the Ignored Differences tab, select the XPath you would like to modify and click the Modify button. The Ignored XPaths Settings dialog box displays.

2. Select the appropriate checkboxes of the XPath operations you would like to ignore. The following options are available: •

Recursive: Select to apply the Ignored XPath settings to child elements.



XPath: Specifies the XPath position that you selected.



Text Content (Modify/Insert/Delete): Select the content operation you want to ignore.



Element/Subtree (Insert/Delete): Select the element or subtree operation you want to ignore.



Attribute (Modify/Insert/Delete): Select the attribute operation you want to ignore. If this field is selected, SOAtest will only ignore the specified attribute name within the XPath. To ignore a specific attribute, enter the attribute name in the field next to the Attribute checkbox. If you want to ignore more than one attribute at an element’s XPath location, leave the attribute name empty or use the wild card * (e.g. myAttribute*).



Element Name and Namespace (Modify/Insert/Delete): Select the element name operation you want to ignore.

3. Click OK. All XPath operations specified will be ignored in future runs of the modified regression test.

Comparing Large XML Messages
Parasoft SOAtest can use ExamXML's command line utility MDCXML.exe as the XML comparison engine within the Diff tool. MDCXML is highly scalable, and it can ignore the order of elements at any level, in addition to optionally ignoring specific element values. ExamXML is a third-party tool and is licensed and purchased separately from A7Soft. If you prefer to use the ExamXML engine instead of the Parasoft default (VMToolsAlgorithm):


1. When in XML mode, open the Options tab and choose ExamXML MDCXML for XML Differencing Algorithm.

2. Download MDCXML.exe from http://www.a7soft.com/mdcxml.html. •

The ExamXML command line utility mdcxml.exe is not included with Parasoft SOAtest. ExamXML and its command line utility mdcxml.exe are provided and supported by A7Soft.

3. In the MDCXML Executable field, enter or browse to the location of MDCXML.exe.
4. In the Options File field, enter or browse to the location of the MDCXML.exe options file. •

The MDCXML.exe options file contains various settings for mdcxml, including a list of elements to ignore. An example of an options file is provided on the A7Soft Web site at http://www.a7soft.com/mdcxml.html.



The options file needs to contain the following minimum settings (SOAtest does not support the use of different values for the following parameters): •

ValidationErrorAsFatal=0



InsertPI=1



InsertTags=0



InsertComments=0
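Taken together, a minimal options file therefore consists of just the four required settings shown below (any element-ignore entries would be added using the format shown in A7Soft's sample options file; the lines here are only the settings SOAtest requires):

ValidationErrorAsFatal=0
InsertPI=1
InsertTags=0
InsertComments=0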

In order to use the ExamXML mode, the Diff tool’s Control content must be provided in a file. It is possible to create external regression controls automatically (save the expected content into files instead of embedding it in the Diff tool) as described in “Configuring Regression Testing”, page 425.


WS-I
This topic explains how to configure and apply the WS-I tool that analyzes WSDLs and SOAP messages, then validates their conformity to WS-I assertions. Sections include: •

Understanding WS-I



Analyzing SOAP Messages for WS-I Conformance



Customizing WS-I



Conformance Report Output

Understanding WS-I
The WS-I tool tests the Web Services Description Language (WSDL) for conformance to Basic Profile 1.1 and Simple SOAP Binding Profile (SSBP) 1.0 using the Test Tools developed by WS-I. Verifying conformance to WS-I Basic Profile 1.1 is especially useful in ensuring the interoperability of your Web service. By restricting Web service development to the technologies specified in Basic Profile 1.1, you can increase the odds that your Web service will interoperate with others’ systems. In the case of nonconformance, you can pinpoint exactly what needs to be changed to ensure interoperability. The tool parses the WSDL, passes it to the Test Tools, and produces a WS-I conformance report that can be viewed in a web browser or in SOAtest’s internal editor. In addition to analyzing WSDLs, it can also analyze SOAP messages and validate their conformity to WS-I assertions. As a test suite tool, it allows you to check WSDLs as part of your functional test scenario. As an output to a test suite’s SOAP Client tool, it can validate SOAP messages for conformance to the BP 1.1 & SSBP 1.0 profiles. To check interoperability during static analysis, use the "Check WS-I Interoperability" rule, which is in the Interoperability category. This rule has the same customization options as the WS-I tool.

Analyzing SOAP Messages for WS-I Conformance
The WS-I tool can also be added to the Object Output of a SOAP Client for the purpose of validating SOAP messages for conformance to the BP (Basic Profile) 1.1 and SSBP (Simple SOAP Binding Profile) 1.0 profiles.

Customizing WS-I
You can customize the following options: •

wsdlURI: Specifies the WSDL you would like to test for conformance to Basic Profile 1.1. After entering a valid WSDL, you can hit Enter on your keyboard to automatically populate the wsdlElement fields. If the WSDL you enter is not valid, an error message displays in the Messages panel.



wsdlElement: The wsdlElement fields (parent, name, namespace, type) automatically populate after entering a valid WSDL in the wsdlURI field and pressing Enter on your keyboard.



assertionResults: Specifies the type of assertion results that should appear in the conformance report. The valid options available in the assertionResults drop-down menu are as follows: •

all: Lists the results from all test assertions.



notPassed: Lists all of the assertion test results except the ones that have a result of passed.




onlyFailed: Lists only the test assertion results which have a result of failed.



failureMessage: Specifies whether or not the pre-defined Basic Profile 1.1 error messages for each test assertion are included in the conformance report. If this box is unselected, the predefined error messages will not display in the conformance report. If this box is selected, the pre-defined error messages will display in the conformance report.



failureDetail: Specifies whether or not error detail messages that are specific to your WSDL or SOAP artifacts are included in the conformance report. If this box is unselected, the error detail messages will not display in the conformance report. If this box is selected, the error detail messages will display in the conformance report.

Conformance Report Output
The WS-I tool produces one output file. This output file contains the conformance report for the Web service corresponding to the WSDL you entered. This conformance report displays the test results for the assertions that have been processed. Also, the conformance report displays the conformance level for each test assertion that was processed, and may list detailed information for any errors that were encountered. The report also contains a summary of the test assertion results that indicates whether the Web service passed or failed the conformance test. After you have customized the WS-I tool, you can chain an Edit or Browse tool to display the output of the WS-I conformance report. To do so, complete any of the following: •

Right-click the WS-I test node in the Test Case Explorer tab and select Add Output from the shortcut menu. The Add Output dialog displays. Select Conformance Report from the left pane and Browse from the right pane and click Finish. After you run the WS-I test, a browser window will open and display a WS-I Profile Conformance Report.



Right-click the WS-I test node in the Test Case Explorer tab and select Add Output from the shortcut menu. The Add Output dialog displays. Select Conformance Report from the left pane and Edit from the right pane and click Finish. After you run the WS-I test, the WS-I Profile Conformance Report displays in the right GUI panel.

For more information on the WS-I tool specifications, or for more information on WS-I conformance reports, see http://www.ws-i.org.


DB
This topic explains how to configure and apply the DB tool, which allows you to query databases and validate SQL statements. Sections include: •

Understanding the DB Tool



Configuring the DB Tool



Oracle Type Extensions



Support for PL/SQL and Stored Procedures



Calling Methods on JDBC Objects

Understanding the DB Tool
This tool sends a query to the database you specify and returns the results. SOAtest formats the results into XML; this allows XML-based tools (XML Data Bank, XSLT Tool, etc.) to be chained to the output. The DB tool can be used to call stored procedures (e.g., using PL/SQL) as described in “Support for PL/SQL and Stored Procedures”, page 914. A Traffic Viewer is automatically attached to each DB tool. It lets you see the end result of the query that was sent to the server (taking into consideration parameterizations, variables, and so forth). In the response, JDBC-related objects are shown. You can chain tools to these objects as described in “Calling Methods on JDBC Objects”, page 916.

Configuring the DB Tool
The following options can be configured in the DB tool:

General •

Data Source: Specifies the Data Source to be used for providing values. This menu is only available if a Data Source was added to the test suite.

Connection Tab •



Use Local Settings/Use Shared Property: Specifies whether the test should use only settings within the corresponding DB tool, or if the test should use settings from a shared global database account. This drop-down list is only available if a Database Account shared property was added to the test suite. For more information on shared properties, see “Global Database Account Properties”, page 340. •

Use Local Settings: Select from the drop-down list if you would like to only use the settings that you configure within this test.



Use Shared Property: Select from the drop-down list if you would like to only use the global database settings that you added to the test suite as a Global Property.

Database Connection Settings: Allows you to specify the Driver, URL, Username, and Password of the desired database to be queried. For more information, see “Database Configuration Parameters”, page 348 and the SOAtest forum.
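As a point of reference, a connection to an Oracle database via the Oracle thin JDBC driver might be configured roughly as follows (the host, port, SID, and credentials here are hypothetical; consult your JDBC driver documentation for the exact driver class name and URL format):

Driver:   oracle.jdbc.OracleDriver
URL:      jdbc:oracle:thin:@dbhost:1521:ORCL
Username: scott
Password: (your database password)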

SQL Query Tab




SQL Query: Allows you to specify either a Fixed or Parameterized SQL statement. •

If Fixed is selected, you can do finer-grain parameterization by entering the special string $[column] into the text box and selecting a data source from the Data Source drop-down menu. During runtime, the tool will look up the column in the specified data source. Any string value can be parameterized. For example, if a data source is created and has three rows that contain CA, NY, and WA, you can enter the following into the Fixed text box: SELECT * FROM People WHERE state = ’$[States]’

When run, the DB tool will create the following queries:
SELECT * FROM People WHERE state = "CA"
SELECT * FROM People WHERE state = "NY"
SELECT * FROM People WHERE state = "WA"

To access values from a Writable Data Source, use $[<Data Source Name>: <Column Name>]. To access values from an XML Data Bank, use $[<Data Source Column Name mapped to Selected XPath>].



JDBC OUT parameter types for stored procedures: Allows you to invoke stored procedures and functions that take OUT parameters or return a value. You can leave this field empty if you are executing a regular SQL query or a stored procedure that does not have any OUT parameters. See JDBC OUT Parameter Types for Stored Procedures below for details.

After executing an SQL Query, you can view the results in tabular form. To do this, right-click the DB tool node and choose Show Result from the shortcut menu.

Options Tab •

Fail on SQL Exception: Specifies whether or not the test should fail when a SQL Exception is encountered. Note that if a validation tool is chained as an output to this DB tool, the outcome of that validation will determine this test’s success or failure, and this setting is not applicable.



Separate column names from values in the XML output: Determines whether SOAtest should separate column names from values. This may result in output that is more difficult for a human to read, so it is disabled by default.



Encode XML Characters in Result: Specifies whether XML characters are encoded in the tool’s results. Enable this option if you want to ensure that the XML output of the tool is well-formed and that the values do not break the XML. For example, if a String value of a parameter contains the character <, it would break the XML syntax unless it is escaped/replaced with &lt;, which XML parsers decode back to <.


JDBC OUT Parameter Types for Stored Procedures
The JDBC OUT parameter types for stored procedures field allows you to invoke stored procedures and functions that take OUT parameters or return a value. You can leave this field empty if you are executing a regular SQL query or a stored procedure that does not have any OUT parameters. The types you list in that field can be separated with spaces or commas, and the type names correspond to the field names of the java.sql.Types class (http://java.sun.com/j2se/1.5.0/docs/api/java/sql/Types.html) or the oracle.jdbc.OracleTypes class (http://download.oracle.com/docs/cd/A97329_03/web.902/q20224/oracle/jdbc/OracleTypes.html). SOAtest will look for the type you provide in java.sql.Types first. If it is not found there, it will look for it in oracle.jdbc.OracleTypes. The type name you provide is passed to the JDBC CallableStatement.registerOutParameter() method.

Examples •

To invoke a stored procedure that takes 4 IN parameters, use CALL MYPROC(XMLTYPE('<hello>Some XML</hello>'), 55, 'character large object', 'some text')



To invoke a stored procedure that takes 4 parameters (Oracle XMLType, INTEGER, CLOB, and VARCHAR, respectively), where the 2nd (INTEGER) and 4th (VARCHAR) parameters are OUT parameters, the SQL query content you provide can look like
CALL MYPROC(XMLTYPE('<hello>some XML</hello>'), ?, 'character large object', ?)

and the JDBC OUT parameter types field would have the content INTEGER, VARCHAR



To invoke a function that returns a VARCHAR and takes several IN parameters plus 5 OUT parameters of types INTEGER, Oracle XMLTYPE, Oracle CLOB, VARCHAR, and DATE, the SQL query would look like
? = CALL MYFUNCTION(XMLTYPE('<hello>Some XML</hello>'), 'character large object', ?, 33, ?, ?, ?, ?)

and the JDBC OUT parameter types field would have the content VARCHAR, INTEGER, SYS.XMLTYPE, CLOB, VARCHAR, DATE



To invoke the same function as above, but using data source values for some of the IN parameters, use
? = CALL MYFUNCTION(XMLTYPE('$[My XML Content Column Name]'), 'character large object', ?, $[Column Name 2], ?, ?, ?, ?)

The DB tool will serialize the function return value and the OUT parameters in XML, similar to how it does with regular SQL queries. This allows you to see the execution results and chain XML validation/extraction tools such as XML Assertor and XML Data Bank. If the return value or any of the OUT parameters is a CURSOR type, SOAtest will handle it as a JDBC ResultSet and serialize it into XML nested within the overall result XML, the same way that it operates on regular query results. •

To invoke a function that takes an IN integer parameter and returns a CURSOR, you would use


? = CALL MYFUNC(15000)

and the parameter types field would contain CURSOR

In the case of CURSOR return values or OUT parameters, the DB tool returns them in a ResultSet XML format embedded within the overall XML output of the DB tool, which includes the column names associated with the CURSOR. Note that if you receive a message like SQLException "Bigger type length than maximum," you are probably encountering a bug in the Oracle driver. If this occurs, please try a newer Oracle JDBC driver. Parasoft has verified that ojdbc14.jar version 10.2.0.4 does not suffer from this issue.

Oracle Type Extensions
Oracle databases support various proprietary data types. If your database utilizes these data types, you must add the appropriate jar to SOAtest’s classpath. For example, Oracle supports oracle.xdb.XMLType. This is available with xdb.jar, which is shipped with specific Oracle applications.

Support for PL/SQL and Stored Procedures
In today's systems, business logic often resides inside stored procedures, so they need validation just like any other component in the system. SOAtest can store procedures (also known as functions, procedures, packages, and triggers) in an Oracle database. It can also call those procedures repeatedly, which is especially useful for complex queries that need to be executed frequently. PL/SQL (Procedural Language/Structured Query Language), Oracle's proprietary procedural extension to the SQL database language, is used in the Oracle database. Other SQL database management systems offer similar extensions to the SQL language. The following image shows an example of how to configure the DB tool to create or replace a stored procedure:


The following image shows an example of how to configure the DB tool to call a stored procedure:


Calling Methods on JDBC Objects
You can chain tools to the traffic objects that result from DB tool execution. For instance, if you want to call methods on JDBC objects directly (instead of just using the XML output), you can attach an Extension tool to the traffic object and get the JDBC ResultSet, Connection, or Statement. For example, you could use the following for the Extension tool:

def checkJdbcObjects(input, context):
    rs = input.get("ResultSet")
    c = input.get("Connection")
    s = input.get("Statement")
    # your own code calling methods on rs, c, s

You can then have your own code calling methods on rs, c, and s.
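Building on that, the following is a minimal sketch of an Extension tool script that walks the rows of the ResultSet. It is illustrative only: it assumes the ResultSet is still open when the chained tool runs and that consuming its rows here is acceptable for your scenario; the calls used (getMetaData, getColumnName, next, getString) are standard java.sql.ResultSet and ResultSetMetaData methods.

def listFirstColumn(input, context):
    # "ResultSet" is the same traffic-object entry used in the example above
    rs = input.get("ResultSet")
    if rs is None:
        return
    meta = rs.getMetaData()
    name = meta.getColumnName(1)          # name of the first column
    values = []
    while rs.next():                      # advance through the remaining rows
        values.append(str(rs.getString(1)))
    # replace this with your own checks or logging
    print name + ": " + ", ".join(values)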


XML Assertor
This topic covers the XML Assertor tool, which enforces the correctness of data in an XML message. Sections include: •

Understanding XML Assertor



Configuring the XML Assertor

Understanding XML Assertor
This tool is most commonly connected to a SOAP Client or Messaging Client in order to verify the data returned by a service. The XML Assertor provides support for complex message validation needs without the need for scripting, and allows you to easily create and maintain validation assertions on your XML messages. When verifying results from a Web service, you may want to apply specific rules conformant to your business requirements. It may not be sufficient to verify that changes occurred; you may also want to define complex rules defining what constitutes an acceptable XML message. The XML Assertor is designed to provide a range of assertions which can enforce more logical and business-appropriate rules for message validity.

Configuring the XML Assertor
The XML Assertor consists of three main tabs: •

Summary: This tab contains a table showing the details of the XML Assertions that have been configured.



Configuration: This tab is used to create and configure XML Assertions.



Expected XML: Specifies the expected XML response. The expected XML response is used as a template to choose elements from when creating assertions. If the XML Assertor is attached to a SOAP Client, this panel will be populated automatically. If not, or if you would like to configure the expected response manually, you can use the Literal or Tree tabs to populate the expected XML manually.

To use the XML Assertor, complete the following:
1. Right-click a test node and choose Add Output from the shortcut menu. The Add Output wizard displays.


2. In the Add Output wizard, select Response> SOAP Envelope from the left GUI pane, select XML Assertor from the right GUI pane, and click the Finish button. A Response SOAP Envelope> XML Assertor node is added underneath the selected test node.
3. Click the Add button in the Configuration tab. The Select Assertion wizard displays.


4. Select an assertion type. The following is a brief summary of the available types of Assertions. •



Value Assertions: The following value assertions are available: •

Value Assertion: Enforce the value of a particular element.



Value Occurrence Assertion: Enforce the occurrence of a value for an element.



Numeric Assertion: Enforce the numeric value of an element.



String Comparison Assertion: Enforce the value of the text content of a given element.



Regular Expression Assertion: Enforce that an element matches a regular expression.



Expression Assertion: Enforce the value of an expression composed of elements.



Custom Assertion: Enforce custom assertion on an element.

Structure Assertions: The following structure assertions are available: •

Occurrence Assertion: Enforce the number of occurrences of an element.






Has Content Assertion: Enforce that an element has text content.

Compound Assertions: The following compound assertions are available: •

AND Assertion: Group multiple assertions that all must succeed.



OR Assertion: Group multiple assertions where at least one must succeed.



Conditional Assertion: Enforce an assertion only if a condition is met.

Difference Assertions: The following difference assertions are available: •

Numeric Difference Assertion: Enforce a numeric difference on a value of a particular element. Assert that the numeric value of an element differs from a user-specified base value by a user-specified value. For example, in order to assert that a value in degrees Fahrenheit is 3 degrees below freezing, you would set the Base Value to 32 and the Difference Value to -3.



Date Difference Assertion: Enforce a date difference on a value of a particular element. Assert that an element with a date value differs from a user-specified date by a user-specified number of years, months, and days. For example, in order to assert that a date in the response message is the day before March 1st 2004, you would specify 2004-03-01 as the Base Value with a Difference Configuration of 0 Years, 0 Months, and -1 Days. Because 2004 is a leap year, this resolves to February 29, 2004; the Date Difference Assertion automatically accounts for leap years.



DateTime Difference Assertion: Enforce a date time difference on a value of a particular element. Assert that an element with a datetime value differs from a user-specified datetime—in the format of yyyy-MM-dd'T'HH:mm:ss—by a user-specified number of years, months, days, hours, minutes, seconds, and milliseconds.

Range Assertions: The following range assertions are available: •

Numeric Range Assertion: Enforce a numeric range on a value of a particular element.



Date Range Assertion: Enforce a date range on a value of a particular element.



DateTime Range Assertion: Enforce a date time range on a value of a particular element.

5. Click the Next button. A tree view displays.


6. Select an element from the tree view and click the Finish button. The Configuration tab is now populated with the selected assertion.


You may add additional assertions to apply to the message (such as a Numeric Assertion to enforce on the price element) by clicking the Add button in the Configuration tab. If you later want to modify the element referenced by an assertion, click Change Element, which is at the bottom right of the Configuration tab. This opens a dialog that lets you graphically or manually edit the given element. Click the Evaluate XPath button to see the result of applying the XPath expression against the expected XML.


WS-BPEL Semantics Validator
This topic covers the WS-BPEL Semantics Validator tool, which verifies a BPEL file’s syntactic correctness. Sections include: •

Understanding WS-BPEL Semantics Validator



Configuring the WS-BPEL Semantics Validator

Understanding WS-BPEL Semantics Validator
The BPEL Semantics Validator is a static analysis tool that verifies syntactic correctness through schema validation, which verifies that the elements and attributes conform to the XML Schema. Beyond this, the validator explicitly verifies constraints imposed by the BPEL specification that are not enforced by the XML Schema. The validator finds errors such as: •

Unresolved references to BPEL, WSDL, and XML Schema types.



Violations to constraints on start activities, correlations, scopes, variables, links, and partner links.



Incompatible types in assignments.



Errors in embedded XPath expressions.

Configuring the WS-BPEL Semantics Validator
You can configure the following settings for the WS-BPEL Validator: •

BPEL URL: Specifies the URL of the BPEL file to validate.



Imported documents: Allows you to import BPEL files to validate.


Web Application Tools
In this section: •

Browser Testing



Browser Contents Viewer



Browser Validation



Browser Data Bank



Browser Stub



Scanning



Check Links



Spell



HTML Cleanup



Search



Browse


Browser Testing
The Browser Testing tool plays back a user action in a web browser on the current page (for example, clicking on a link or typing). It can also be used to validate or extract data from the web page.

Configuring Browser Testing
The Browser Testing tool is the basis for all Web tests recorded through the browser. •

For details on how to create these tests, see “Web Functional Testing: Overview”, page 430 and “Recording Tests from a Browser”, page 431.



For details on how to configure and customize these tests, see: •

“Configuring User Actions (Navigation, Delays, etc.)”, page 449



“Specifying Wait Conditions”, page 454



“Validating or Storing Values”, page 459



“Stubbing Test Requests/Responses”, page 465



“Customizing Recording Options”, page 472

The tool’s configuration panel has the following tabs: •

Pre-Action Browser Contents: A rendered view of what was in the browser before the test executed. For an explanation of the colored borders used to mark notable elements, see “Colored Borders”, page 436.



User action: Allows you to view/modify the user actions simulated by this test. For details, see “Configuring User Actions (Navigation, Delays, etc.)”, page 449).



Wait condition: Allows you to view/modify the wait conditions for this test. For details, see “Configuring Wait Conditions”, page 454).



Pre-Action HTML: A code-level view of what was in the browser before the test executed.


Note that you can attach an output to an HTTP request response as well as add an output directly attached to the Browser Testing Tool. The following tools can be attached as outputs: •

Browser Contents Viewer



Browser Data Bank



Browser Validation



Extension

These tools directly access the final browser contents for the test (that is, what you see in the post-action rendered view). Only one Browser Validation tool and one Browser Data Bank tool can be added as outputs to each Browser Testing tool.


Browser Contents Viewer
The Browser Contents Viewer tool stores the browser contents from the last test run, regardless of success. It is automatically added as an "output" for all Web tests recorded through the browser. From this tool, you can easily validate or store data (as described in “Validating or Storing Values”, page 459) as well as add wait conditions (as described in “Adding a Wait Condition from the Browser Contents Viewer”, page 456).

Configuring Browser Contents Viewer
For details on how to create web functional tests (which include Browser Contents Viewers), see “Web Functional Testing: Overview”, page 430 and “Recording Tests from a Browser”, page 431. For details on how to use the Browser Contents Viewer to define validations, extractions, and wait conditions, see “Specifying Wait Conditions”, page 454 and “Validating or Storing Values”, page 459. The tool’s configuration panel has the following tabs: •

Post-Action Browser Contents: A rendered view of what was in the browser after the test executed. For an explanation of the colored borders used to mark notable elements, see “Colored Borders”, page 436. If the window/frame was modified via JavaScript (including AJAX) during testing (from the time the test starts until the last wait condition is satisfied), the contents shown here will reflect that change.



Post-Action HTML: A code-level view of what was in the browser after the test executed. If the window/frame was modified via JavaScript (including AJAX) during testing, this HTML will reflect that change; it will be different from the original HTML for the window/frame that appears in the Traffic Viewer.

It also has a Save Browser Contents option, which configures SOAtest to store browser contents (so they persist across SOAtest sessions).


Browser Validation
The Browser Validation tool can be chained to a Browser Testing tool to verify values within a browser page.

Configuring Browser Validation
The Browser Validation tool is automatically added when you configure a validation as described in “Validating or Storing Values”, page 459. It can also be added as a tool output as described in “Adding Test Outputs”, page 333.

The tool’s configuration panel has the following tabs: •

Validations: Allows you to view/modify the value that is extracted and how it is validated. See “Validating or Storing Values”, page 459 for details on the available options.



Post-Action Browser Contents: A rendered view of what was in the browser after the test executed. For an explanation of the colored borders used to mark notable elements, see “Colored Borders”, page 436. If the window/frame was modified via JavaScript (including AJAX) during testing (from the time the test starts until the last wait condition is satisfied), the contents shown here will reflect that change.




Post-Action HTML: A code-level view of what was in the browser after the test executed. If the window/frame was modified via JavaScript (including AJAX) during testing, this HTML will reflect that change; it will be different from the original HTML for the window/frame that appears in the Traffic Viewer.


Browser Data Bank
The Browser Data Bank can be chained to a Browser Testing tool to extract values from a browser page into a data source. The extracted value can be used wherever parameterized values are allowed, such as the value to type into an input in a subsequent test.

Configuring Browser Data Bank
The Browser Data Bank is automatically added when you extract values as described in “Validating or Storing Values”, page 459. It can also be added as a tool output as described in “Adding Test Outputs”, page 333.

The tool’s configuration panel has the following tabs: •

Extractions: Allows you to view and modify the value that is extracted. See “Validating or Storing Values”, page 459 for details on the available options.



Post-Action Browser Contents: A rendered view of what was in the browser after the test executed. For an explanation of the colored borders used to mark notable elements, see “Colored Borders”, page 436. If the window/frame was modified via JavaScript (including AJAX) during testing (from the time the test starts until the last wait condition is satisfied), the contents shown here will reflect that change.



Post-Action HTML: A code-level view of what was in the browser after the test executed. If the window/frame was modified via JavaScript (including AJAX) during testing, this HTML will reflect that change; it will be different from the original HTML for the window/frame that appears in the Traffic Viewer.


Browser Stub
This topic introduces the Browser Stub tool, which is used to simulate a server response or request. This can be useful for a variety of reasons. Stubbing a response can be used to simplify testing by stabilizing dynamic data, or it can be used to simulate unexpected server conditions. Stubbing a request can help isolate server-side versus client-side code errors.

Configuring Browser Stub
The Browser Stub tool is automatically added when you create a stub for a Web functional test as described in “Stubbing Test Requests/Responses”, page 465. It can also be added as a tool output as described in “Adding Test Outputs”, page 333. Configuration details are provided in “Configuring the Browser Stub Tool”, page 466.


Scanning
The Scanning tool scans a web application and performs static analysis. It provides a list of the resources that were scanned.

“Configuring SOAtest to Scan a Web Application”, page 583 discusses how to configure and use the Scanning tool. “Web Static Analysis”, page 183 provides a step-by-step tutorial of how to perform static analysis on a Web application.


Check Links
This topic introduces the Check Links tool, which identifies broken links. Sections include: •

Understanding Check Links



Customizing Check Links

Understanding Check Links
The Check Links tool identifies links in the input that are broken due to missing pages, invalid email addresses, malformed URLs, or server connection problems. It can also identify duplicate anchors and links that point to pages beyond the root of the site. As a test suite tool, it allows you to add a link checking test to your functional test scenario. To check links during static analysis, use the "Include only Valid Links" rule, which is in the Check Links category. This rule has the same customization options as the Check Links tool.

Customizing Check Links
You can customize the following options: •

Check external links: Determines whether SOAtest tests external links (links to pages that are outside of the root directories of the project's sites). This setting is only applicable if you are testing a page available in the Project tree or Paths tree.



Check mailto addresses: Determines whether SOAtest tries to verify the validity of mailto addresses in your files. Note that many mail servers refuse to validate email addresses for security reasons. •

If you identify yourself by entering your own email address (for example, [email protected] or [email protected]) in the Check email as field, mail servers are more likely to return correct information.



Verify missing pages: Determines whether SOAtest uses an HTTP connection to verify if pages that appear to be missing (based on the current Project tree contents) are actually missing. When Check Links sees a link to a page that is available in the current Project tree, it knows that the link will work without having to check it. When Check Links sees a link to a page that is not available in the current Project tree, it will not check that link unless you have enabled this option. This setting is only applicable if you are testing a page available in the Project tree or Paths tree.



Follow meta refresh and JavaScript redirects: Determines whether the Check Links tool checks pages redirected to by meta refresh links and JavaScript redirects. You must enable this option if you want Check Links to report errors a) when meta refreshes and JavaScript redirects are not working correctly (for instance, a redirect leads to a missing page instead of the intended page) and b) when navigating to a broken link causes the server to respond by returning a page that contains a meta refresh link or JavaScript redirect to a designated error page that you listed in the Check Links tool Report links to these URLs as errors list. •

For example, assume that your server is configured so that when a user navigates to a broken link, the server should respond by returning a page (http://www.parasoft.com/redirect.htm) that includes a meta refresh link or JavaScript redirect to a designated error page (http://www.parasoft.com/oops.htm), which the client then shows to the user. Also assume that you have indicated that links to http://www.parasoft.com/oops.htm should be considered to be errors and that you have enabled the Follow meta refresh and JavaScript redirects option. If your home page has a broken link to your products page, attempting to navigate to that page will cause the server to return the page http://www.parasoft.com/redirect.htm, which redirects to http://www.parasoft.com/oops.htm. SOAtest will report the link to the products page (<A HREF="products.htm">Products</A>) as a broken link. Your home page, http://www.parasoft.com/index.htm, will be considered the source of this broken link.







For another example, assume that your server is configured so that users who click a "purchase" link after viewing a special presentation should be sent a page with a meta refresh link or JavaScript redirect to the purchase page (this configuration helps you track how many users decided to purchase after viewing the presentation). Also assume that you have enabled the Follow meta refresh and JavaScript redirects option. If users who click that purchase link are sent a page (http://www.parasoft.com/redirect.htm) that contains a meta refresh link or JavaScript redirect to the purchase page (http://www.parasoft.com/purchase.htm), but the redirect/link is broken, then SOAtest will report the link to the purchase page (<A HREF="purchase.htm">Purchase</A>) as a broken link, and indicate that the page at http://www.parasoft.com/redirect.htm is the source of this broken link.



SOAtest will only follow meta refreshes whose delay is less than or equal to the delay specified in the Maximum refresh delay field. Delay settings are not applicable to JavaScript redirects.
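For reference, a meta refresh redirect of the kind described in the examples above typically appears in the returned page's HTML as something like the following (a generic, hypothetical snippet; the number before the url is the refresh delay in seconds that is compared against the Maximum refresh delay setting):

<meta http-equiv="refresh" content="5; url=http://www.parasoft.com/oops.htm">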

Check the following categories of errors: Determines what types of errors Check Links tests for. Available error categories are: •

Malformed URL: A link to a URL that uses non-standard URL format, a URL that contains a new line character (i.e., the URL is split between two lines), or a URL that uses the incorrect type of slash



Bad Anchors: A link to a duplicate or missing anchor.



Broken Link: A link that does not lead to the specified location or links to pages specified in the Report links to these URLs as errors area.



Unreachable Host: A link that leads to a Web server which has been physically disconnected from the network.



Missing Host: A link to a machine whose domain name (DNS name) has never been registered with network authorities (and is therefore invalid).



Page Beyond Root of Site: A link to a page that appears to be located beyond the current site’s root—for instance, if http://www.parasoft.com/index.htm contains a relative link such as <A HREF="../../../noaccess.html”>, noaccess.html would be reported as a page beyond root of site.



Server Connection Error: A link that leads to a Web server which is either inactive or not working properly. This occurs when the Web server machine has a valid domain name (DNS name) and is physically connected to the network, but has problems with its Web server software.

Report links to these URLs as errors: Lets you record which URLs indicate that an error has occurred. Any time SOAtest finds a link that points to a URL in this list, it will report a broken link error so you can easily determine when errors occur. You can use the Add button to add individual URLs, or you can use the Import button to import a list of URLs from a text file.


Spell
This topic introduces the Spell tool, which reports misspelled words. Sections include: •

Understanding the Spell Tool



Customizing the Spell Tool



Customizing the SOAtest Dictionary

Understanding the Spell Tool
The Spell tool reports misspelled words found in the input. It helps you quickly identify spelling errors within HTML and XML files. Since it understands HTML and XML tags, it does not erroneously report tags and words adjacent to tags as misspelled. It is even able to spot spelling errors that appear only when the code is rendered (for example, “an<B>error</B>”). As a test suite tool, it allows you to add a spell checking test to a functional test scenario. To check spelling during static analysis, use the "Spell Words Correctly" rule, which is in the Check Spelling category. This rule has the same customization options as the Spell tool. Both the tool and the rule use the same dictionary.

Customizing the Spell Tool
You can customize the following options: •

Case sensitive: Determines whether spell checks are case-sensitive.



Always allow ’s: Determines whether SOAtest allows the possessive form of words in its dictionary.



Check HTML: Determines whether SOAtest spell checks HTML files.



Check Alt attribute: Determines whether SOAtest spell checks the words in an IMG tag’s ALT attribute.



Check XML: Determines whether SOAtest spell checks XML files.

Customizing the SOAtest Dictionary
The dictionary can be customized in two ways:




Right-clicking a reported misspelled word in the SOAtest view, then choosing Add to Dictionary.



Directly adding acceptable words to the dictionary via the SOAtest preferences panel. See “Dictionary Settings”, page 749.


HTML Cleanup
This topic explains how to apply and configure the HTML Cleanup tool, which identifies and reports HTML/XHTML coding and structural problems in the input and can optionally transform the code by fixing these problems and converting it into XHTML. Sections include: •

Understanding HTML Cleanup



Configuring HTML Cleanup

Understanding HTML Cleanup
The HTML Cleanup tool identifies and reports HTML/XHTML coding and structural problems in the input. Additionally, the tool can be used to transform the code by fixing these problems and optionally converting the code into XHTML. The transformed code is returned as an output. By default, the HTML Cleanup tool is configured to clean HTML files. If you want the tool to convert code to XHTML, operate on HTML fragments, or operate on ASP, JSP, and PHP files, change the configuration settings as described in “Configuring HTML Cleanup”, page 938. It can be chained to another test to operate on the browser requests that occur as Web functional tests execute. Or, it can be used as a standalone test that operates on the file/text specified in the tool configuration panel’s Input tab. HTML Cleanup supports HTML 4.01 character entities. As a test suite tool, it allows you to identify and clean HTML problems as part of your functional test scenario. To identify HTML problems during static analysis, use the "Check HTML Well-Formedness" rule, which is in the Cleanup HTML category. This rule has the same customization options as the HTML Cleanup tool. Note that the static analysis option does not also allow you to transform the code by fixing these problems and optionally converting the code into XHTML.

Configuring HTML Cleanup
You can customize the following options: •

Show informational messages: With the default tool configuration, this option determines whether SOAtest reports the changes made during cleanup. If you changed the HTML Cleanup configuration so that it no longer sends a "transformed source" output to the Edit tool or another tool, this option determines whether SOAtest reports the changes that are required to make the designated transformation. Messages will be reported in a results window, and can be accessed via the Window menu.



Process ASP, JSP, PHP files: Determines whether SOAtest attempts to perform the specified target action on available ASP, JSP, and PHP files. Note that SOAtest will ignore ASP <% ... %>, JSP <% ... %>, and PHP <? ... ?> tags in those files—as well as ignore JSP scriptlets and custom actions that dynamically generate HTML attributes or attribute values—when this option is enabled.



Keep embedded scripts and styles: Determines whether SOAtest attempts to extract all scripts and styles (including those that contain special characters) to external files.



Target Document Type: Configures the type and level of cleanup performed. For more information on the available options, see “Customizing Target Document Type”, page 939.




Add XML Declaration: Determines whether SOAtest adds an XML declaration (<?xml version="1.0"?>) at the beginning of the transformed source. This option is only available in XHTML (DTD) mode.



Update IDs in DOCTYPE declaration: Determines whether SOAtest replaces the IDs of the DOCTYPE declaration (if the document already contains a DOCTYPE declaration) or adds the DOCTYPE declaration with IDs (if the document does not yet contain a DOCTYPE declaration). This option is only available in XHTML (DTD) mode.

Customizing Target Document Type
You can configure the type and level of cleanup performed by changing the options listed in the HTML Cleanup configuration panel’s Target Document Type field. The available modes are described below:

HTML Fragment (the default mode)
Cleans HTML fragments, but does not convert them to XHTML. In this mode, SOAtest:
• Adds missing end tags and reports if a missing end tag was added for an unknown tag.
• Sets default values for attributes (i.e., those that are "true" by default).
• Adds quotes around attribute values.
• Checks for non-numerical values in attributes that require numerical values.
• Removes orphaned end tags.
SOAtest does not address the general structural issues in this mode.
Example: <html> hello world <table WIDTH=20>
is transformed into <html> hello world <table WIDTH="20"></table></html>

HTML Document
Cleans complete HTML documents, but does not convert them to XHTML. In this mode, SOAtest:
• Performs all HTML Fragment mode actions.
• Fixes problems with the overall document structure by ensuring that the file satisfies normal HTML requirements. Documents require <HTML> <HEAD> <TITLE> </TITLE> </HEAD> <BODY> </BODY> </HTML>; framesets require <HTML> <HEAD> <TITLE> </TITLE> </HEAD> <FRAMESET> </FRAMESET> </HTML>.
Example: <html> hello world <table WIDTH=20>
is transformed into <html><head><title></title></head><body> hello world <table WIDTH="20"></table></body></html>

XHTML Fragment
Cleans HTML fragments and converts them to XHTML. In this mode, SOAtest:
• Performs all HTML Fragment mode actions.
• Moves embedded scripts and style sheets to external files when necessary.
• Adds missing attributes for various tags (for example, it adds a missing src attribute for IMG tags).
• Ensures that all attributes are lower case.
Example: <html> hello world <table WIDTH=20>
is transformed into <html> hello world <table width="20"> </table></html>

XHTML (DTD)
Cleans HTML documents and converts them to XHTML. In this mode, SOAtest:
• Performs all XHTML Fragment and HTML Document mode actions.
• Attempts to convert the document to XHTML that conforms to either the default DTD (the xhtml-transitional DTD from the W3C) or the DTD you specify in the DTD Public ID and System ID fields.
• Adds a DOCTYPE declaration.
• Adds an XML declaration (<?xml version="1.0"?>) at the beginning of the transformed source (if the Add XML Declaration option is enabled).
Example: <html> hello world <table WIDTH=20>
is transformed into <?xml version="1.0"?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html><head><title /> </head><body> hello world <table width="20"> </table></body></html>

Configuring SOAtest to Save the Transformed Files
If you want SOAtest to save the files transformed by the HTML Cleanup tool, add a Write File tool output as follows:
1. Right-click the HTML Cleanup tool node in the Test Case Explorer tab and select Add Output from the shortcut menu. The Add Output dialog displays.
2. Select Transformed Source from the left pane and Write File from the right pane of the Add Output dialog and click Finish. A Transformed Source> Write File node is added to the HTML Cleanup node.
3. (Optional) Customize the Write File tool as described in “Write File”, page 893.

Configuring SOAtest to Check the Transformed Files
If you want to verify whether the transformed source contains XHTML structure problems, add a Check XML tool output as follows:
1. Right-click the HTML Cleanup tool node in the Test Case Explorer tab and select Add Output from the shortcut menu. The Add Output dialog displays.
2. Select Transformed Source from the left pane and Check XML from the right pane of the Add Output dialog and click Finish. A Transformed Source> Check XML node is added to the HTML Cleanup node.
3. (Optional) Customize the Check XML tool as described in the Check XML topic of this guide.
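The Check XML output verifies, among other things, that the transformed source is well-formed. As a rough illustration of that idea only (not how SOAtest implements it), the following Python sketch parses a transformed file as XML and reports whether it is well-formed; the file name is hypothetical.

    import xml.etree.ElementTree as ET

    def is_well_formed(path):
        """Return (True, None) if the file parses as XML, otherwise (False, error text)."""
        try:
            ET.parse(path)
            return True, None
        except ET.ParseError as err:
            return False, str(err)

    ok, error = is_well_formed("transformed.html")  # hypothetical output of the HTML Cleanup tool
    print("well-formed" if ok else "not well-formed: " + error)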


Search

This topic explains how to apply and configure the Search tool, which searches text for user-defined keywords and reports failures if the specified keywords are present or absent (depending on the tool settings). Sections include:
• Understanding Search
• Customizing Search

Understanding Search
The Search tool searches text for user-defined keywords and reports failures if the specified keywords are present or absent (depending on the tool settings). It is useful when you need to look for simple strings on a page. Since the purpose of the tool is to search for the presence or absence of user-defined keywords, you must customize it (to specify search terms) before using it.

Customizing Search
The Search tool is highly customizable to allow precise searches. You can add keywords, remove keywords, or clear the list of keywords. Additionally, you can specify whether the presence or absence of a keyword should be reported, and what message is reported if the specified pattern is detected. You can also control whether searches use regular expressions and whether searches are case sensitive.
You can customize the following options for the Search tool:
• Search Terms: Specifies the terms that the tool searches for.
  • To add a search term, type it in the New Search Term field, then click Add. Note that when the Search tool searches HTML and XML pages, it does not parse the code; consequently, if you are looking for a string that is broken up by markup tags, you must include those tags in your search keyword.
  • To remove selected search terms, select the terms in the Search Terms list, then click Remove.
  • To remove all search terms, click Clear.
  • To import search terms, click Import. You can import multiple messages/terms from a CSV file.
  • To export search terms, click Export. All terms will be exported to a file in the same format.
• Custom Output Message: Specifies the message that displays in the SOAtest view when a failure is found. If a data source is available, you can parameterize the search terms rather than entering them manually. When using a Search tool in the context of a project (for example, in a test suite), you can also parameterize the messages that the tool outputs if a search term is found (or not found, depending on the settings). This allows you to use a list of output messages from a data source, which is not possible any other way. Two special wildcards are allowed in messages; SOAtest replaces these wildcards with appropriate values (a conceptual sketch of this substitution follows this list):
  • %0: SOAtest replaces %0 with the keyword that was found (or not found). In regular expression mode, this is the actual string on the page that matched the regular expression.
  • %1: In regular expression mode, SOAtest replaces %1 with the regular expression pattern that was being searched for.
• Apply current message to all terms: Select to apply the current message to all terms.
• Treat term as word: Select to treat the entered term as a word.
• Use Regular Expressions: Determines whether the keywords should be treated as regular expressions or as normal text.
• Ignore Case: Determines whether to ignore case while matching keywords.
• Display Message if Search Term: Select Found to specify that the message should display if a search term is found. Select Not Found to specify that the message should display if a search term is not found.
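The following Python sketch is only a conceptual illustration of the %0/%1 substitution described above; the page text, search term, and message are hypothetical, and the actual substitution is performed internally by SOAtest.

    import re

    page_text = "Order status: ERROR 500 while processing request"   # hypothetical text being searched
    pattern = r"ERROR \d+"                                           # hypothetical search term (regular expression mode)
    message = "Found %0 while searching for %1"                      # hypothetical custom output message

    for match in re.finditer(pattern, page_text, re.IGNORECASE):     # Ignore Case enabled
        # %0 -> the actual string that matched; %1 -> the regular expression pattern
        print(message.replace("%0", match.group(0)).replace("%1", pattern))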


Other Tools
In this section:
• Header Data Bank
• JSON Data Bank
• Object Data Bank
• Text Data Bank
• Coding Standards
• FTP Client
• External
• Aggregate
• Extension (Custom Scripting)
• Attachment Handler
• Decompression
• Jtest Tracer Client
• WSDL Content Handler
• WSDL Semantics Validator


Header Data Bank

This topic explains how to configure and apply the Header Data Bank tool, which extracts headers from the SOAP response and makes those headers available as parameters in the SOAP request. Sections include:
• Understanding Header Data Bank
• Configuring Header Data Bank
• Viewing the Data Bank Variables Used During Test Execution

Understanding Header Data Bank
The Header Data Bank tool enables you to extract values from HTTP and JMS headers in one test in a test suite and input those values into another test in the same test suite. In other words, you can use a header value from the SOAP response of Test 1 as a parameter in the SOAP request of Test 2. The Header Data Bank tool can be chained to any other SOAtest tool that outputs HTTP or JMS headers. It extracts values from the HTTP or JMS header and makes that information available for later use.

Configuring Header Data Bank
To configure the Header Data Bank tool:
1. Ensure that you have a test suite available with at least two test cases.
2. Right-click the Test 1 node and select Add Output from the shortcut menu. The Add Output wizard displays.
3. In the Add Output wizard, select Response> Transport Header from the left pane, Header Data Bank from the right pane, and click the Finish button. A Response Transport Header> Header Data Bank node displays in the Test 1 branch.
4. Double-click the Response Transport Header> Header Data Bank node. The following Header Data Bank options display in the right GUI panel:
• Name: Specifies the name of the Data Bank tool.
• Available Headers: Specifies the headers available in the HTTP or JMS traffic. This field is blank when the Header Data Bank node is first added to the test; it is filled in automatically after the test suite is run.
• Extract/Alter tabs: Specify the headers you would like to extract or alter for use in the SOAP message of another test within the test suite. You can extract a header through the Extract tab and alter a header through the Alter tab.
5. Right-click the main test suite tree node and select Test Using ’Example Configuration’ from the shortcut menu to initialize the Header Data Bank and verify the available headers.
6. Click the Response Transport Header> Header Data Bank node. The Available Headers text area now contains the available headers.


7. To extract a header, select the Extract tab and complete one of the following:
• To extract a header from the Available Headers, select a header in the Available Headers view and click the Add Header button. The header is added to the Selected Headers list in the Extract tab. The Selected Headers list consists of the following columns:
  • Header: Displays the selected header name. To edit a selected header, double-click the desired header in the Header column and edit the header text.
  • Data Source column name: Data source column names of selected headers display as parameterized values in the SOAP Client panel, meaning that you will be able to use these selected headers in another test case to send as part of the SOAP request.
• To extract a header that is not in the Available Headers, click the New button. A new header will be added to the Selected Headers list in the Extract tab.
8. To alter a header, select the Alter tab and complete one of the following:
• To alter a header from the Available Headers, select a node in the Available Headers view and click the Add Selected Nodes button. The header is added to the Selected Headers list in the Alter tab, which consists of the same Header and Data Source column name columns described above.
• To alter a header that is not in the Available Headers, click the New button. A new header will be added to the Selected Headers list in the Alter tab.
9. Select the Test 2 node and choose Generated Data Source from the Data Source drop-down menu in the right GUI panel. The headers you added to the Header Data Bank in Test 1 are now available to be used as parameterized values for the SOAP Client in Test 2 when using Form Input or Form XML settings for the SOAP Envelope. You can also parameterize the HTTP or JMS headers of Test 2 by clicking the Transport Properties button in the right GUI panel of Test 2.
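Outside of SOAtest, the same pattern (capture a transport header from one response and reuse it in the next request) might look like the Python sketch below. The URLs and the X-Session-Id header name are hypothetical; this is only meant to illustrate why header extraction is useful, not how the Header Data Bank is implemented.

    import urllib.request

    # "Test 1": call a service and capture a header value from its response
    first = urllib.request.urlopen("http://example.com/service/login")
    session_id = first.headers.get("X-Session-Id")          # value stored by the "data bank"

    # "Test 2": reuse the captured value as a request header (a parameterized value)
    second = urllib.request.Request(
        "http://example.com/service/query",
        headers={"X-Session-Id": session_id or ""},
    )
    print(urllib.request.urlopen(second).status)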

Viewing the Data Bank Variables Used During Test Execution
You can configure the Console view (Window> Show View> Console) to display the data bank variables used during test execution. For details, see “Console view”, page 35.


JSON Data Bank

This topic covers the JSON Data Bank, which extracts content from JSON for test parameterization. Sections include:
• Understanding JSON Data Bank
• Configuring the JSON Data Bank
• Viewing the Data Bank Variables Used During Test Execution

Understanding JSON Data Bank
When the message sent by the server is a JSON object, you may want to extract property values from that object and use those values in subsequent tests. The JSON Data Bank gives you a way to easily visualize the structure of the JSON object and select the properties you wish to store for later use.

Configuring the JSON Data Bank
To configure the JSON Data Bank, complete the following:
1. Right-click the Messaging Client that will be receiving a JSON object and select Add Output.
2. In the Add Output wizard that displays, select Response> Traffic, choose JSON Data Bank from the All Tools list, and click the Finish button. A Response Traffic> JSON Data Bank node displays beneath the selected Messaging Client.
3. Double-click the Response Traffic> JSON Data Bank node. When you first chain the JSON Data Bank, there will be no JSON object stored within the tool. To initialize it with a JSON object, you can either go to the Literal tab and paste in the JSON text, or run your test so that the response object automatically shows up in the Expected JSON area.
4. Once the JSON Data Bank holds a JSON object, you can go to the JSON tab in the Expected JSON panel, select any portion of the object, and click the Add button in the center to extract that value during runtime. The keyword "this" in the JSON tab represents the object itself. The extracted properties display in the right-hand table. When you double-click an entry or click Modify, a dialog displays that allows for more extraction options:
• Extraction: Allows you to select a different property to extract.
• Data Source column name: Allows you to change the column name that this value will be mapped to. You can also choose to write the value to a Writable Data Source or test variable.
Once you have everything set up in the JSON Data Bank, you can use the extracted value by switching a parameterizable field to Parameterized mode and selecting the column you want.
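Conceptually, selecting a property in the JSON tab is like the lookup in the following Python sketch (the response body and property names are hypothetical; SOAtest performs the extraction internally at runtime).

    import json

    response_body = '{"order": {"id": "A-1001", "total": 25.5}}'   # hypothetical JSON response

    this = json.loads(response_body)       # "this" in the JSON tab represents the parsed object itself
    extracted_id = this["order"]["id"]     # value that would be stored under a data source column name
    print(extracted_id)                    # -> A-1001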

Viewing the Data Bank Variables Used During Test Execution
You can configure the Console view (Window> Show View> Console) to display the data bank variables used during test execution. For details, see “Console view”, page 35.


Object Data Bank

The Object Data Bank tool is used to store remote EJB objects in order to invoke their methods in subsequent tests. It is primarily chained to the EJB Client tool. For details on using the Object Data Bank tool, see “EJB Client”, page 841.


Text Data Bank

This topic explains how to configure and apply the Text Data Bank tool, which extracts values from any text content (including plain text, HTML, XML, etc.) and makes those values available as parameters in other SOAtest tests and configurations. Sections include:
• Understanding Text Data Bank
• Configuring Text Data Bank
• Viewing the Data Bank Variables Used During Test Execution

Understanding Text Data Bank
The Text Data Bank can extract values from any text content (including plain text, HTML, XML, etc.) by specifying left-hand and right-hand boundaries that define the value to be extracted. It is useful when you want to extract plain text (e.g., for use in another test or configuration panel) and cannot use other extraction tools such as the Browser Data Bank, XML Data Bank, JSON Data Bank, and so forth. This tool is also used in configuring functional web tests for load testing, as described in “Preparing Web Functional Tests for Load Testing”, page 561.
The Text Data Bank tool is typically configured as an output of another tool (most commonly, a Browser Testing tool) that delivers text output. In this case, the Text Data Bank’s Text Content area will be populated with the text output of the chained tool after test execution completes. Alternatively, you can create a "standalone" test using the Text Data Bank tool, then specify a file that contains the desired text content (for instance, a log file or other file that is dynamically updated or created during every test run).
You can extract any value that appears between specified text boundaries. This allows you to extract values that vary from test run to test run. For example, assume you want to extract a session ID that changes from session to session and always appears between "leftboundary" and "rightboundary". You can configure a Text Data Bank tool to extract whatever value appears between "leftboundary" and "rightboundary". Every time this test is run, SOAtest will extract whatever value appears in the specified location (between the given boundaries). This way, if the session ID between those boundaries changes from session to session, the value extracted will also change from session to session. A conceptual sketch of this boundary-based extraction follows.
Extracted values are added to the column you specify, and can be used in any tool configuration fields that allow parameterized values. If the specified boundaries change (as a result of application changes), the configured extractions need to be updated to specify the new boundary values.
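The Python sketch below mirrors the session ID example above; the text and boundary strings are hypothetical, and SOAtest performs the equivalent lookup internally during each test run.

    text = "...page content... leftboundary ABC123 rightboundary ...more content..."

    def extract_between(content, left, right):
        """Return the substring between the first occurrence of left and the next occurrence of right."""
        start = content.find(left)
        if start < 0:
            return None
        start += len(left)
        end = content.find(right, start)
        return content[start:end] if end >= 0 else None

    session_id = extract_between(text, "leftboundary ", " rightboundary")
    print(session_id)   # prints ABC123; a later run would yield whatever value sits between the boundaries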

Configuring Text Data Bank
To configure a Text Data Bank tool to extract text during a test run:
1. Add the tool as an output of a tool that provides text output, or add it as a standalone test (if you want to extract data from a file that is being dynamically updated or created for every test run).
2. Do one of the following:
• If you added the tool as a test output, run the test.
• If you added the tool as a standalone tool, open the tool configuration panel and use the Input tab to specify the file from which you want to extract content.
3. In the tool configuration panel, go to the Text Content area and select the area of text that you want to extract.
• Note that if this value changes during subsequent test runs, SOAtest will not always extract this specific value. Instead, it will extract whatever value appears in this location. A new extraction will occur each time this test is run.
• You can also define extractions manually by clicking the Add button and specifying extraction details.
4. Click Create Extraction.
5. Specify the name of the column you want to contain the extracted data. The extraction will be added to the table, and can be modified or removed using the available controls.

Viewing the Data Bank Variables Used During Test Execution
You can configure the Console view (Window> Show View> Console) to display the data bank variables used during test execution. For details, see “Console view”, page 35.


Coding Standards

This topic explains how to create and customize a Coding Standards tool that verifies whether code follows a user-defined set of custom rules and/or built-in rules. Sections include:
• Understanding Coding Standards
• Configuring Coding Standards

Understanding Coding Standards
The Coding Standards tool verifies whether your code follows rules that check anything from W3C guidelines for a specific language, to team naming standards, to project-specific design requirements, to the proper usage of custom XML tags. When you create a new Coding Standards instance, you can configure it to check any number or combination of custom rules and built-in rules.
Custom rules can verify application-specific design and content requirements, enforce custom coding standards, enforce naming conventions, identify text (such as an exception message) that signals a problem, query XML data, or even perform custom file transformations. You create custom rules in RuleWizard, which can automatically generate rules or help you create them graphically. For details on designing custom rules, see the RuleWizard User’s Guide (choose Help> RuleWizard Documentation from the RuleWizard GUI).
Typically, users create a custom Coding Standards tool to check a logical group of custom rules. For example, if a team implemented rules that check project-specific design guidelines, rules that check team naming guidelines, and rules that check for each team member’s most common coding mistakes, they might want to create three separate Coding Standards tools—one for each logical group of rules.

Configuring Coding Standards
You configure the Coding Standards tool by specifying which rules you want it to check. In the configuration panel, you can:
• Enable/disable individual rules or groups of rules by selecting or clearing the available check boxes.
• Customize a parameterizable rule by right-clicking it and choosing View/Change Rule Parameters from the shortcut menu. Parameterized rules are marked with a special icon (a wizard hat with a radio button).
• Search for a rule by clicking the Find button, then using that dialog to search for the rule.
• Hide the rules that are not enabled by clicking the Hide Disabled button. If you later want all rules displayed, click Show All.
• View a description of a rule by right-clicking the node that represents that rule, then choosing View Rule Documentation from the shortcut menu.


FTP Client

This topic explains how to create an FTP Client tool, which is used to connect to an FTP server in order to put or get files. Sections include:
• Understanding FTP Client
• Configuring FTP Client
• Viewing Traffic

Understanding FTP Client
The FTP Client is designed to provide basic FTP functionality as a SOAtest tool. Both FTP and SFTP (Secure FTP) are supported.

Configuring FTP Client
You can customize the following options:
• Protocol: Choose the desired protocol (either FTP or SFTP). In SFTP mode, all transfers are in Binary format.
• Host, Port, Username, Password: Specify the connection settings. These can be parameterized (using values from a data source added to the test suite), or you can use environment variables in Literal Mode.
• Local Folder: Specify the path on your local machine where downloaded files should be saved and uploaded files can be found.
• Transfer Mode: Specify the transfer mode to use. If binary files will be transferred, use Binary. Otherwise, use ASCII (the default).
• Timeout: Specify the number of seconds that SOAtest will wait after attempting to send or receive data before terminating the connection.
• Commands table: Specify the commands that you want the FTP Client to perform, in the order in which you want them performed (a conceptual sketch of a similar command sequence follows this list). Available commands include:
  • pwd - Print working directory
  • ls - List directory
  • cd - Change directory
  • mkdir - Make a directory
  • rmdir - Delete a directory
  • rm - Delete a file
  • put - Upload a file
  • get - Download a file
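For orientation only, the following Python ftplib sketch runs a command sequence similar to what the Commands table drives; the host, credentials, and file names are hypothetical, and SFTP would require a different library.

    from ftplib import FTP

    with FTP("ftp.example.com") as ftp:                  # hypothetical FTP host
        ftp.login("user", "password")                    # hypothetical credentials
        print(ftp.pwd())                                 # pwd - print working directory
        ftp.cwd("/uploads")                              # cd - change directory
        ftp.mkd("reports")                               # mkdir - make a directory
        with open("report.csv", "rb") as upload:
            ftp.storbinary("STOR reports/report.csv", upload)          # put - upload a file
        with open("copy.csv", "wb") as download:
            ftp.retrbinary("RETR reports/report.csv", download.write)  # get - download a file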

Viewing Traffic
To view the traffic log from the FTP server, attach tools to the FTP Client. For example, adding an Edit tool will let you view the traffic.


External

This topic explains how to integrate and execute third-party tools in the SOAtest environment. Sections include:
• Understanding SOAtest’s Definition of an External Tool
• Integrating an External (Third-Party) Tool

Understanding SOAtest’s Definition of an External Tool
SOAtest’s extensible architecture makes it easy for you to integrate all of your SOAP/Web service-related tools into a single environment, much like a traditional development IDE such as Microsoft’s Visual Studio. You can integrate any third-party (external) tool into SOAtest. Once you integrate a tool, you can access it by simply clicking the tool bar button that SOAtest created for that tool, or apply it using any of the available testing techniques.

Integrating an External (Third-Party) Tool
You can configure an External tool as follows:
1. Double-click the tool node.
2. Complete the fields as follows:
a. In the Tool tab’s Name field, enter the tool’s name.
b. Select a data source from the Data Source drop-down menu. This menu is only available if a data source was added to the test suite. The values from the data source you choose can be used as parameterized values in the Arguments column.
c. In the Executable field, enter (or browse to) the name and path of the executable file associated with this tool.
d. If you want to pass any arguments to the tool, enter them in the Arguments field by clicking the Add button. To modify the Flag and Arguments values, click the Modify button and specify the Flag and Argument values in the dialog that opens. If you select a Parameterized value from a data source, each value from the specified data source column will be used as an argument. If no flags need to be specified, the Flag column can be left empty. An argument is required if you want this tool to operate on a selected item (file, Project tree item, Path tree item, and so on) or if you want to use this tool as a boolean filter. Several available % arguments are discussed below. If you do not specify any % arguments, the external tool will start when it is invoked, but it will not operate on any selected files, browser items, etc.
• %F: This argument passes the filename and path of the selected item. It allows the tool to operate on the selected item. If you invoke a tool using this argument, the item for which it was invoked will be “ghosted” and assigned a temporary filename. You can avoid this ghosting by using the %u argument, if applicable. %F is the most commonly used argument.
• %f: This argument passes the filename, but not the path, of the selected item.
• %u: This argument passes the URL of the selected item. It works on simple URLs (URLs for pages without form submissions) as long as the associated tool can work with URLs.
• %l: This argument passes any relevant line number information.
e. If an exit value for this tool indicates the tool’s success, select the Exit value indicates success check box. If an exit value indicates failure, leave this option off. You can only use a tool as a gate (boolean filter) if it has a return code that marks the successful completion of a test. If you are using a tool as a gate (boolean filter), you should specify arguments as described in the above step.
f. If you want this tool’s output to be displayed in the Results panel, check Keep output. If you choose to keep output, you can tell SOAtest how to interpret the output’s format and what the output means by setting the Output Pattern and Pattern Keys options. If you use both the output pattern and pattern keys to tell SOAtest how to interpret the file name (and line number, if provided), then each time you double-click a tool message reported in the results panel, SOAtest will open the correct file and go to the correct line number (if your output contains file names and line numbers). A conceptual sketch of this parsing appears after this procedure.
• Output Pattern: Tells SOAtest how to interpret the output’s format. When used in conjunction with pattern keys, it tells SOAtest what the output means.
• Pattern Keys: Tells SOAtest which pieces of the output pattern are the line number and file name (in the terms of your selected source editor).
• Example 1: Assume the sample tool’s output is 1:magic one.htm (where 1 is the line number, ":magic " including the trailing space is a fixed expression, and one.htm is the file name). In this case, you would enter the following expression in the Output Pattern field: (.*)(:magic )(.*). This expression tells SOAtest to break the output into three pieces (each pair of parentheses represents one piece). The first piece includes everything up until ":magic ". The second piece is ":magic " itself. The third piece includes everything after ":magic ". The Pattern Keys setting tells SOAtest how to interpret each piece. Here, it should be lsF: l represents the line number, s is used as a placeholder, and F represents the file name.
• Example 2: Assume the sample tool’s output is c:/home/gecko/files/a.html: (anything) and the tool does not report line numbers. You would enter (.*)(:)(.*) in the Output Pattern field and Fss in the Pattern Keys field.
g. In the MIME Types field, specify which types of files this tool can work with.
h. In the MIME Type of Output field, specify the type of output you want this tool to deliver.
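The Python sketch below is only a conceptual illustration of how an output pattern and pattern keys split a tool output line into a line number and a file name; the output line is hypothetical, and SOAtest performs this parsing internally.

    import re

    output_line = "12:magic one.htm"          # hypothetical external tool output
    output_pattern = r"(.*)(:magic )(.*)"     # Output Pattern from Example 1
    pattern_keys = "lsF"                      # l = line number, s = placeholder, F = file name

    match = re.match(output_pattern, output_line)
    if match:
        parsed = dict(zip(pattern_keys, match.groups()))
        print("file:", parsed["F"], "line:", parsed["l"])   # -> file: one.htm line: 12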


Aggregate

This topic introduces the Aggregate tool, which allows you to create a single tool that runs a user-defined collection of tools. Sections include:
• Understanding the Aggregate Tool
• Configuring the Aggregate Tool

Understanding the Aggregate Tool
The Aggregate tool allows you to apply a custom set of SOAtest tools (including External tools and Extension/Custom Script tools) with a single click or command. It can aggregate any tools that are available as global test suite tools (see “Global Tools”, page 341).

Configuring the Aggregate Tool
To configure an Aggregate tool:
1. Double-click the tool node.
2. Assign your tool a name in the Name field in the right GUI panel.
3. Check the check boxes that represent the tools you want the Aggregate tool to run.


Extension (Custom Scripting)

This topic explains how to create and apply an Extension tool that executes a custom Java, JavaScript, or Python script you have written. Sections include:
• Understanding the Scripting/Extension Tool
• Creating a Custom Script/Extension Tool
• Additional Scripting Resources

Understanding the Scripting/Extension Tool
With SOAtest’s Extension tool, you can test any part of your enterprise system and seamlessly integrate it into your testing solution. The Extension tool allows you to implement a Python, Java, or JavaScript method independently or in conjunction with your existing test suite. This additional functionality gives you the ability to customize SOAtest to your specific needs and is limited only by the capabilities of the scripting language that you are working with.
The scripts that you write or incorporate into SOAtest may accept zero, one, or two arguments. The source of the input can be a client request, a server response, the output of another tool, an element of a rule, or it can be user-defined beforehand. This is useful if you want to perform application-specific checks that cannot be expressed with a rule (for example, if you want to check whether a given output matches a record in a database). You can also design the script to perform any specific function that would be helpful to you while testing.
If your method accepts zero arguments, it does not matter what file is selected when it is run. If the method accepts one argument, the input will be taken from the selected file or the test to which the tool is attached. If the method accepts two arguments, the first will be taken from the file or preceding test and the second will carry contextual information about the file. For more information on the types of context arguments that can be applied, see the Scripting API by choosing SOAtest> Help> Scripting API. This API also describes how you can interact with the SOAtest program.
An additional option is to create scripts that SOAtest executes each time SOAtest is started. To do this, create a Python or JavaScript script, then add it to <SOAtest_62_Installation_Directory>/plugins/com.parasoft.xtest.libs.web_<soatest_version>/root/startup
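As a minimal sketch only: Python Extension methods commonly follow the convention of a function that receives the chained output as its first argument and a context object as its second (check SOAtest> Help> Scripting API for the authoritative signatures before relying on this). The method name and the check it performs are hypothetical.

    def checkOutput(input, context):
        # `input` is the text produced by the preceding test or chained tool;
        # `context` carries contextual information about the run (see the Scripting API).
        text = str(input)
        if "ERROR" in text:
            return False   # with "Return value indicates success" enabled, this fails the test
        return True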

Creating a Custom Script/Extension Tool
The following procedure describes how to add an Extension tool to the tool bar. You can also create custom methods in a test suite by adding an Extension tool as a test tool, then completing the same parameters in the Extension parameters panel.
To add a custom script/method tool to SOAtest:
1. Add an Extension tool using one of the techniques described in “End-to-End Test Scenarios”, page 306.
2. Double-click the Extension tool’s Test Case Explorer node. The tool configuration panel will open on the right.
3. Give your method a name in the Name field.
4. If a return value for this tool indicates the tool’s success, select the Return value indicates success check box. If this check box is not selected, SOAtest ignores the return value of the method regardless of whether the test succeeded or failed.


5. From the Language box, select Java, JavaScript, or Python to indicate the language that your method is or will be written in.
6. Define the script to be implemented in the large text field.
• For Java methods, specify the appropriate class in the Class field. Note that the class you choose must be on your classpath (you can click the Modify Classpath link, then specify it in the displayed Preferences page). Click Reload Class if you want to reload the class after modifying and compiling the Java file.
• For JavaScript and Python scripts, you can use an existing file as the source of code for your method, or you can create the method within SOAtest.
  • To use an existing file, select the File radio button and click Browse. Select the file from the file chooser that opens, then click OK to complete the selection.
  • To create the method from scratch from within SOAtest, select the Text radio button and type or paste your code in the associated text window.
  • To check that the specified script is legal and runnable, right-click the File or Text field (whichever one you used to specify your script), then choose Evaluate from the shortcut menu. SOAtest will report any problems found.
7. Select the appropriate method from the Method box at the bottom of the panel. This list is composed of the definitions contained in your script. Since a script can contain multiple definitions, you can select the one that you want to use.

Additional Scripting Resources
For a step-by-step demonstration of how to apply custom scripting, see “Extending SOAtest with Scripting”, page 128. For an overview of issues related to SOAtest’s scripting feature and its various applications, see “Extensibility (Scripting) Basics”, page 764.


Attachment Handler

This topic explains how to configure and apply the Attachment Handler tool, which manages response MIME attachments. Sections include:
• Understanding Attachment Handler
• Configuring Attachment Handler
• Attaching Outputs to the Attachment Handler

Understanding Attachment Handler
The Attachment Handler tool manages all of the MIME attachments that are received as part of a SOAP response. Because many MIME attachments can be returned in a SOAP response, it is useful to be able to keep track of each attachment. To use the Attachment Handler tool, you must chain it to a SOAP Client tool that receives MIME attachments as part of the response. The MIME attachments that are returned will be managed by the Attachment Handler.

Configuring Attachment Handler
To configure the Attachment Handler tool:
1. Add the Attachment Handler tool to the SOAP Client by selecting the SOAP Client node and clicking the Add Test or Output button. The Add Output wizard displays.
2. In the Add Output wizard, select Response> Attachment from the left pane, Attachment Handler from the right pane, and click the Finish button. A Response Attachment> Attachment Handler node displays beneath the selected SOAP Client node.
3. Double-click the Response Attachment> Attachment Handler node. The following Attachment Handler options display in the right GUI panel:
• Attachment sorting options: Specifies how attachments are sorted.
• Attachments from previous response: Displays available MIME attachments by Index, Content-ID, and Base 64 Decode (a short Base64 decoding sketch follows this list).
• Output Headers: Select to output MIME attachment headers. If this option is selected, both the MIME response header and body can be output. By default, only the MIME response body is output.
• Selected Attachment: Displays the MIME attachments returned in the XML response in text form.
• Expected number of attachments: Specifies the expected number of attachments.
• Fail test on mismatches: Select to make the test fail if the specified expected number of attachments does not match the actual number of attachments received by the Attachment Handler tool.
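The Base 64 Decode display simply decodes the attachment body; conceptually it is the same as the following Python snippet (the encoded string is hypothetical).

    import base64

    encoded_body = b"SGVsbG8gYXR0YWNobWVudA=="             # hypothetical Base64-encoded MIME part body
    print(base64.b64decode(encoded_body).decode("utf-8"))  # -> Hello attachment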


Attaching Outputs to the Attachment Handler
You can also add outputs, such as Diff tools or XML Data Banks, to the Attachment Handler tool. To do so, complete the following:
1. Right-click the desired Attachment Handler node and select Add Output from the shortcut menu (you can also select the Attachment Handler node and click the Add Test or Output toolbar button). The Add Output wizard displays.


2. From the Add Output wizard, complete one of the following:
• Select All Attachments from the left pane and the desired output tool from the right pane, and click Finish. The selected SOAtest tool will be applied to all response MIME attachments specified in the Attachment Handler tool.
• Select Attachment # from the left pane and the desired output tool from the right pane, and click Finish. The selected SOAtest tool will be applied to the specific attachment corresponding to the chosen Index number in the Available Attachments list.


Decompression

This topic describes the Decompression tool, which decompresses data that has been compressed with the gzip, zip, or deflate compression methods. Sections include:
• Understanding Decompression
• Configuring Decompression

Understanding Decompression
If the server response for a web functional test is in a compressed format, SOAtest automatically detects the compression settings and attaches a Decompression tool to the response traffic before it is passed to the Diff tool. The Decompression tool is automatically added to Asynchronous Request tests created by recording from a browser if the content from the server is compressed using the gzip, zip, or deflate compression methods. The Decompression tool then decompresses the data before sending the response to the regression control for that test. The Decompression tool can also be used by other tools (Messaging or SOAP clients, for example) in cases where compressed content needs to be inflated.

Configuring Decompression
The Decompression tool has only one option:
• Decompression Method: Controls how the data is decompressed when it is sent to the tool. The Decompression tool can decompress gzip, zip, and deflate (zlib) data.
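For illustration only, the Python sketch below shows what inflating gzip and deflate (zlib) content means; the payload is hypothetical, and zip archives would be handled with the zipfile module in the same spirit.

    import gzip
    import zlib

    original = b"<html>hello world</html>"          # hypothetical uncompressed response body

    gzip_data = gzip.compress(original)             # what a server sends with Content-Encoding: gzip
    deflate_data = zlib.compress(original)          # what a server sends with Content-Encoding: deflate

    assert gzip.decompress(gzip_data) == original       # gzip method
    assert zlib.decompress(deflate_data) == original    # deflate (zlib) method
    print("both payloads inflated back to the original content")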


Jtest Tracer Client

This topic covers the Jtest Tracer Client tool, which is used to generate JUnit tests based on the test actions it monitors. Sections include:
• Understanding Jtest Tracer Client
• Prerequisites
• Configuring Jtest Tracer Client
• Generating Test Cases
• Troubleshooting

Understanding Jtest Tracer Client
If you have the Jtest Tracer module (available through Parasoft Jtest) listening to your Java application or web service, the Jtest Tracer Client tool in SOAtest can be used to control when to start and stop recording method invocations, which classes' or packages' methods will be monitored, and where the recorded data gets routed.
Tracer provides a fast and easy way to create realistic functional JUnit test cases that capture the functionality covered by your SOAtest test cases. Using Tracer, you can trace the execution of Java applications at the JVM level (without a need to change any code or to recompile), and in the context of a larger integrated system. As your SOAtest test cases execute, Tracer monitors all the objects that are created and all the data that comes in and goes out. The trace results are then used to generate contextual JUnit test cases that replay the same actions in isolation, on the developer's desktop, without the need to access all the application dependencies. This means that you can use a single machine to reproduce the behavior of a complicated system during your verification procedure.
Since the generated unit tests directly correlate tests to source code, this improves error identification and diagnosis, and allows developers to run these test cases without having to depend on access to the production environment or set up a complex staging environment. This facilitates collaboration between QA and Development: QA can provide developers traced test sessions with code-level test results, and these tests help developers identify, understand, and resolve the problematic code.

Prerequisites
• You must have Parasoft Jtest (with a Tracer license) available on a system that your team can access.

Configuring Jtest Tracer Client
Before you can "trace" an application, you must perform the following one-time configuration steps:
1. Ensure that the Tracer libraries are available on the machine that is running the application you want to trace. See the Jtest documentation for details.
2. Set that system’s path to reference the Tracer libraries. See the Jtest documentation for details.
3. Start the server with the appropriate JVM arguments for tracing. See the Jtest documentation for details.
4. Specify which test(s) you want to trace by adding Jtest Tracer Client tools to the test suite. See the following section for details.

Specifying which Test(s) to Trace
You specify which test(s) to trace by adding two test tools to your test suite:
• A Jtest Tracer Client - Start Trace tool as a set-up test before the first test action you want traced.
• A Jtest Tracer Client - Stop Trace tool as a tear-down test after the last test action that you want to trace.
To configure this:
1. Right-click the Test Case Explorer node for the test suite or test where you want to start tracing and select Add New> Test from the shortcut menu.
2. In the Add Test wizard, select Set-Up from the left pane, Jtest Tracer Client from the right pane, and click the Finish button. A Set-Up: Jtest Tracer Client node displays within the selected test suite.
3. Configure the set-up test as follows:
a. Double-click the Jtest Tracer Client node to open the test configuration panel.
b. Specify the host and port that you will connect to. 6543 is the default port.
c. Ensure that the Start trace button is selected.
d. Specify where you want to save the result file (the file that Jtest will use to generate test cases). If you want to save it to a local system, select Local file and browse to (or enter) the local file path. If you want to save it to a remote system (e.g., the system where Jtest is running or the machine where the application is running), select Remote file and then enter a path to the desired location.
e. Specify which classes/packages you want to monitor in the Monitored Classes/Packages area.
  • If you want to monitor all classes and packages that were specified in the JVM argument, leave All (the default) selected.
  • If you want to monitor only a subset of the classes or packages that were specified in the JVM argument, choose Custom and then specify the classes/packages in the table that opens. This is useful for focusing on specific use cases that only pertain to a certain subset of classes. All classes and packages must be fully qualified, and packages must end with the ".*" character sequence.
f. Click Save.
4. Right-click the Test Case Explorer node for the test suite or test where you want to stop tracing and select Add New> Test from the shortcut menu.
5. In the Add Test wizard, select Tear-Down from the left pane, Jtest Tracer Client from the right pane, and click the Finish button. A Tear-Down: Jtest Tracer Client node displays within the selected test suite.
6. Configure the tear-down test exactly as you configured the set-up test, with one exception: select the Stop trace button instead of the Start trace button.

Generating Test Cases
To generate test cases, you need to:
1. Produce the tracer output file by executing the test suite or specific tests that you configured for tracing (as described above).
2. Use Parasoft Jtest to generate JUnit test cases from the tracer output file. See the Jtest documentation for details.

Troubleshooting
Object states are reproduced by using Jtest’s object repository feature in cases where the object was constructed prior to the start request. Due to memory constraints, there is a limit to how large such objects can be. If an object exceeds the set limit, the restored object may not contain the necessary information to recreate the method invocation accurately. You can see which objects have not been created correctly by looking at the standard output of the application/service or by checking whether the generated test case looks incorrect. However, if the invocations do not interact with the unrecorded portions of the object, then your test case will be recreated perfectly. You may want to consider removing that large class from the monitor argument so that it is not recorded. If you believe an adequate amount of memory is available to capture larger objects, contact Jtest Support for instructions on how to increase the recorded object size limit. You may also increase the amount of memory that your application or web service uses. We suggest you always allocate as much memory as possible for your application.
Java compilation also has size limitations, and you may run into them when generating tests with Jtest. When recreating tests, Tracer ensures that the calling sequence is true to the one that was actually enacted. Therefore, in order to recreate the calling sequence exactly, it sometimes adds a large number of method invocations, and the compiler may not be able to handle such large methods. Tracer already parses out method calls that do not mutate any data. If you see this problem, you can modify your monitor argument so that this class is not recorded, or you can use the Jtest Tracer Client tool’s start and stop actions to create a more focused subsection of calls.


WSDL Content Handler

This topic covers the WSDL Content Handler tool, which is used to pass a WSDL and its directly/indirectly imported WSDLs and schemas to chained tools. Sections include:
• Understanding WSDL Content Handler
• Configuring WSDL Content Handler

Understanding WSDL Content Handler
This tool is rarely added to a test suite manually. More often, it is automatically generated from the New Project wizard when applying Policy Configurations to enforce various coding standards with the Coding Standards tool.

Configuring WSDL Content Handler
The following option is available:
• WSDL URI: Specifies the WSDL to pass to chained tools. Click the Browse button to navigate to the appropriate WSDL file.


Browse

This topic introduces the Browse tool, which sends a file or tool output to a Web browser, and explains how to customize this tool. Sections include:
• Understanding Browse
• Customizing Browse

Understanding Browse
The Browse tool’s primary use is to send the designated file or tool output to a browser. You can use this tool as a stand-alone tool or as a tool output. In addition, the Browse tool is used when you prompt SOAtest to:
• Open a selected file in a browser (by choosing the Browse shortcut menu item in the Project tree or in the results area).
• Display recreated paths (by choosing the View Recreated Path shortcut menu item in the Paths tree).
• Display an HTML-format report of results (by choosing the View Report shortcut menu item in the results area).

Customizing Browse
You can customize the following options for this tool:
• Browser: Determines the browser used. Select the radio button associated with the browser to which you would like output to be directed.
• Command: Determines the browser command used. If you selected Automatic for your browser, you do not need to supply information here. If you selected a specific browser (Mozilla Firefox or Internet Explorer), this information is filled in for you, but you may edit it. If you selected Other, you must supply the Executable as well as the Arguments associated with the browser that you wish to use.
• Use DDE: Determines whether to use Dynamic Data Exchange (DDE), which lets programs share information. If you select Use DDE, the changes you make to files as you are using SOAtest will automatically be applied to other programs that use those same files. Use DDE is selected by default and may not be disabled for the Automatic option in the Browser field. It is selected by default but may be disabled for Mozilla Firefox or Internet Explorer. When Other is selected in the Browser field, Use DDE is disabled by default and may not be enabled.


WSDL Semantics Validator

This section introduces the WSDL Semantics Validator, which checks for errors in a WSDL. Sections include:
• Understanding WSDL Semantics Validator
• Configuring WSDL Semantics Validator

Understanding WSDL Semantics Validator
The WSDL Semantics Validator checks for basic coding errors in a WSDL that are relevant to the meanings and definitions of the elements found within it. This tool is rarely added to a test suite manually. More often, it is automatically generated from the New Project wizard when you prompt SOAtest to create WSDL tests from a test wizard. As a test suite tool, it allows you to check a WSDL as part of your functional test scenario.
To check a WSDL during static analysis, use the "Check WSDL Semantics" rule, which is in the Validate XML category. There are no customization options for the static analysis rule; it applies its default settings to the specified WSDL.

Configuring WSDL Semantics Validator
You can customize the following option:
• WSDL URI: Specifies the WSDL to be checked for errors. Click the Browse button to navigate to the appropriate WSDL file.
