Web Service Implementation Methodology
Public Review Draft, 6 July 2005
Document identifier: fwsi-im-1.0-guidelines-doc-wd-01b.doc
Location:
http://www.oasis-open.org/committees/documents.php?wg_abbrev=fwsi
Editors:
Eng Wah LEE, Singapore Institute of Manufacturing Technology
Contributors:
Marc HAINES, individual <mhaines@uwm.edu>
Lai Peng CHAN, Singapore Institute of Manufacturing Technology
<lpchan@SIMTech.a-star.edu.sg>
Chai Hong ANG, Singapore Institute of Manufacturing Technology
Puay Siew TAN, Singapore Institute of Manufacturing Technology
Han Boon LEE, Singapore Institute of Manufacturing Technology
Yushi CHENG, Singapore Institute of Manufacturing Technology
<ycheng@SIMTech.a-star.edu.sg>
Xingjian XU, Singapore Institute of Manufacturing Technology
Zunliang YIN, Singapore Institute of Manufacturing Technology
Abstract:
This document specifies Web Service specific activities in a Web Service Implementation Methodology and illustrates the approach to incorporate these activities into an existing agile software development methodology.
Status:
This document is updated periodically on no particular schedule. Send comments to the editor.
Committee members should send comments on this specification to the fwsi-imsc@lists.oasis-open.org list. Others should subscribe to and send comments to the fwsi-comment@lists.oasis-open.org list. To subscribe, send an email message to fwsi-comment-request@lists.oasis-open.org with the word "subscribe" as the body of the message.
For information on whether any patents have been disclosed that may be essential to implementing this specification, and any offers of patent licensing terms, please refer to the Intellectual Property Rights section of the FWSI TC web page (http://www.oasis-open.org/committees/fwsi/).
Table of Contents
2 Implementation Methodology Overview
2.2 Web Service Implementation Lifecycle
3 Web Service Implementation Methodology
3.2.1 Activity: Determine the need for Web Service
3.2.2 Activity: Elicit Web Service requirements
3.2.3 Activity: Manage the Web Service requirements
3.2.4 Activity: Model the usage scenarios
3.2.5 Activity: Prepare Test Cases for User Acceptance Test (UAT) and System Test
3.3.1 Activity: Select a technology platform as implementation framework
3.3.2 Activity: Define a candidate architecture for the Web Service
3.3.3 Activity: Decide on the granularity of the Web Service
3.3.4 Activity: Identify reusable Web Services
3.3.5 Activity: Identify service interface for new Web Services
3.3.6 Activity: Prepare Test Cases for Performance Test
3.3.7 Activity: Prepare Test Cases for Integration / Interoperability Test
3.3.8 Activity: Prepare Test Cases for Functional Test
3.3.9 Activity: Testbed preparation
3.4.1 Activity: Transform signatures of reusable Web Services
3.4.2 Activity: Refine service interface of the new Web Service
3.4.3 Activity: Design Web Service
3.4.4 Activity: Refine Test Cases for Functional Test
3.5.1 Activity: Construct Web Service code
3.5.2 Activity: Construct Web Service client code
3.5.3 Activity: Unit Test Web Service
3.6.1 Activity: Test functionality of Web Service
3.6.2 Activity: Integration Test on the Web Service
3.6.3 Activity: System Test on the Web Service
3.6.4 Activity: User Acceptance Test on the Web Service
3.7.1 Activity: Prepare deployment environment
3.7.2 Activity: Deploy Web Service
3.7.3 Activity: Test deployment
3.7.4 Activity: Create end user support material
3.7.5 Activity: Publish Web Service
List of Figures
Figure 1: Web Service Implementation Lifecycle
Figure 2: The “V” Model incorporates the Web Services specific Interoperability test
Figure 3: Relationship between phase, activities, tasks, roles and artifacts
List of Tables
Table 1: Mapping between phases and roles assigned
Table 2: Overview of activities, tasks, roles and artifacts in the Requirements Phase
Table 3: Overview of activities, tasks, roles and artifacts in the Analysis Phase
Table 4: Overview of activities, tasks, roles and artifacts in the Design Phase
Table 5: Overview of activities, tasks, roles and artifacts in the Coding Phase
Table 6: Overview of activities, tasks, roles and artifacts in the Test Phase
Table 7: Overview of activities, tasks, roles and artifacts in the Deployment Phase
The purpose of this document is to define a practical and extensible Web Service Implementation Methodology that can be used as a reference for Web Services development and deployment.
This document consolidates best practices from Web Services practitioners and aims to improve the Web Services implementation process through the formalization of a Web Service implementation lifecycle and the definition of Web Service specific activities and artifacts.
This document should be used in conjunction with the Functional Elements[1] specifications to govern the approach by which the Functional Elements are implemented.
The target audiences include:
· Project Managers
This document provides a formal methodology for Web Services implementation, which can be used for management and control.
· Software Architects/Designers/Developers/Testers
This document identifies activities that are repeatable and can be adhered to, so as to ensure the quality of the software produced.
This document focuses on Web Service specific activities, artifacts, roles and responsibilities that can be incorporated into an existing agile software development methodology (e.g. RUP, Extreme Programming, Feature Driven Development). For a few common agile methodologies, the technical committee is preparing examples that show in detail how the generic activities, artifacts, roles and responsibilities described in this document can be incorporated and used in a given methodology. These case examples are provided in separate documents that will be published along with this document when they become available. Currently the technical committee is preparing cases for RUP and Extreme Programming (XP).
This document does not define yet another novel software development methodology. Instead, the Web Service Implementation Methodology highlights important features in the context of Web Services. Its elements are based on an existing agile software methodology and extend it by incorporating Web Service specific activities.
It is also not in the scope of this document to specifically address how each of these software development methodologies should be tailored to incorporate the Web Service specific parts. Examples are provided only to illustrate one possible way of tailoring a specific agile development methodology for Web Service implementation.
This document also does not provide a detailed description or explanation of any existing agile software development methodology, nor does it recommend one particular agile software development methodology over another.
The Web Service Implementation Methodology defines a systematic approach to Web Service development by leveraging an agile software development methodology and extending it with Web Service specific activities and the corresponding roles and work-products that are produced in the process.
This methodology defines a set of common practices that create a method-independent framework, which can be applied by most software teams for developing Web Service applications.
A Web Service Implementation Lifecycle refers to the phases for developing Web Services from requirement to deployment.
The Web Service implementation lifecycle typically includes the following phases: requirements, analysis, design, coding, test and deployment.
The transitions through these phases need not be a single-pass sequential process. On the contrary, the process tends to be iterative and incremental in nature and should be agile enough to accommodate revisions in situations where the scope cannot be completely defined up front.
A Phase, when used in the context of a Web Service implementation lifecycle, refers to the period of time during which a set of related software implementation activities is carried out.
In general, the phases detailed in the sub-sections are identified to be pertinent in a Web Service implementation lifecycle. These phases may overlap with each other in the course of the implementation process as shown in Figure 1.
Figure 1: Web Service Implementation Lifecycle
The objective of the requirements phase is to understand the business requirements and translate them into Web Service requirements in terms of the features, the functional and non-functional requirements, and the constraints within which the Web Service must operate.
Requirements elicitation should be done by the requirements analyst and should involve the project stakeholders such as the project champion, customers, end users, etc. The analyst should then interpret, consolidate and communicate these requirements to the development team.
If possible, requirements should be aggregated in a centralized repository where they can be viewed, prioritized, and “mined” for iterative features. In all cases, the primary function of the repository is to enable the team to easily capture, search, prioritize and elaborate requirements as necessary.
In the analysis phase, the requirements of the Web Service are further refined and translated into conceptual models that the technical development team can understand. It is also in this phase that an architecture analysis is done to define the high-level structure and identify the Web Service interface contracts. This process should be performed by both the requirements analyst and the architect and communicated to the design and development teams.
The detailed design of the Web Service is done in this phase. The designers should define the Web Service interface contract that was identified in the analysis phase. The defined Web Service interface contract should identify the elements and the corresponding data types (possibly using an XML schema), as well as the mode of interaction between the Web Service and the client, for example synchronous or asynchronous, and RPC or Document style.
The coding and debugging phase for a Web Service is essentially quite similar to that of other component-based software. The main difference lies in the creation of additional Web Service interface wrappers (to expose the components’ public APIs), the generation of WSDL, and the generation of client stubs. In addition, Web Services have to be deployed to a Web server/application server before test clients can consume them.
The component developer and/or the tester should perform these activities.
For the testing of Web Services, besides testing for functional correctness and completeness, testers should also perform interoperability testing between different platforms and client programs. Furthermore, performance testing has to be conducted to ensure that the Web Services are able to withstand the maximum load and stress specified in the non-functional requirements. Other tasks like profiling of the Web Service application and inspection of SOAP messages should also be done in this phase.
The purpose of the deployment phase is to ensure that the Web Service is properly deployed; the phase is executed after the service has been tested. The deployment of a Web Service is platform specific. The service endpoints, which specify where the service is deployed, need to be identified and configured accordingly. The deployer's primary tasks are to ensure that the Web Service has been properly configured and managed (e.g. version controlled, configuration files preset, packaged and loaded in the correct location) and to run post-deployment tests to ensure that the Web Service is indeed ready for use. Other optional tasks, like registering the Web Service with a UDDI registry, may also be performed in this phase.
Table 1 summarizes each phase and its respective assigned roles.
Phases | Primary Roles
Requirements | Requirements Analysts
Analysis | Requirements Analysts; Architects
Design | Designers
Coding | Developers; Testers
Test | Testers
Deployment | Deployers

Table 1: Mapping between phases and roles assigned
Commonly defined roles in software development methodologies include the following:
Roles | Responsibilities
Requirements Analyst | Responsible for eliciting and interpreting the stakeholders’ needs, and communicating those needs to the entire team.
Architect | Responsible for the software architecture, which includes the key technical decisions that constrain the overall design and implementation for the project.
Designer | Responsible for designing a part of the system, within the constraints of the requirements, architecture, and development process for the project.
Developer | Responsible for developing and unit testing the components, in accordance with the project's adopted standards.
Deployer | Responsible for planning the product's transition to the user community, ensuring those plans are enacted appropriately, managing issues and monitoring progress.
Stakeholder | Responsible for providing the domain expertise and specifying the system requirements. Stakeholders usually include the project champion and the end users.
Project Manager | Responsible for managing and monitoring the project, including the project scope, schedule and staffing of the project team.
Test Manager | Responsible for the total test effort, including quality and test advocacy, resource planning, management of the testing schedule, and resolution of issues that impede the test effort.
Test Designer | Responsible for defining the test approach and ensuring its successful implementation. The role involves identifying the appropriate techniques, tools and guidelines to implement the required tests, and giving guidance on the corresponding resource requirements for the test effort. The role also involves monitoring detailed testing progress and results in each test cycle and evaluating the overall quality as a result of testing activities.
Tester | Responsible for the core activities of the test effort, which involve conducting the necessary tests and logging the outcomes of that testing.
System Administrator | Responsible for planning, installing and maintaining the hardware and software of the different environments, e.g. development, test and live environments.
Activity | An Activity refers to a unit of work a role may be assigned to perform. Activities are performed within each of the phases of the Web Service implementation lifecycle.
Artifact | An Artifact refers to a work-product that is used or produced as a result of performing an activity. Examples of artifacts include models, source files, scripts, and binary executable files.
Role | A Role refers to the responsibilities that a person or a team has been assigned.
The term Web Service describes a specialized type of software, which is designed to support a standardized way of providing and consuming services over the Web, through compliance with open standards such as eXtensible Markup Language (XML), SOAP, Web Services Description Language (WSDL) and Universal Description, Discovery and Integration (UDDI).
Web Services, unlike traditional client/server systems, such as browser/Web server systems, are not meant for direct end-user consumption. Rather, Web Services are pieces of business logic, which have programmatic interfaces and it is through these interfaces that developers can create new application systems.
The motivation behind Web Services is to enable businesses to interact and integrate with other businesses and clients without having to go through lengthy integration design and/or expose confidential internal application details unnecessarily. This is made possible by leveraging XML, which is platform and programming-language independent, to describe the data exchanged between businesses or between a business and its clients; using WSDL to specify what the service provides; using UDDI to publish and locate who provides the service; and typically using SOAP over HTTP to transfer the messages across the Internet[2].
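The SOAP message exchange described above can be sketched as a minimal SOAP 1.1 envelope. The `getQuote` operation, its `urn:example:quotes` namespace and the `ACME` payload below are illustrative assumptions, not part of this specification; the sketch only shows how a client request names the invoked operation inside the SOAP Body.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class SoapEnvelopeDemo {
    // A hypothetical request for a "getQuote" operation; only the envelope
    // element and namespace URI follow the SOAP 1.1 convention.
    static final String ENVELOPE =
          "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
        + "<soap:Body>"
        + "<q:getQuote xmlns:q=\"urn:example:quotes\">"
        + "<q:symbol>ACME</q:symbol>"
        + "</q:getQuote>"
        + "</soap:Body>"
        + "</soap:Envelope>";

    // Parse the envelope and return the local name of the operation element
    // inside the Body -- the piece a service dispatcher looks at first.
    public static String operationName() {
        try {
            DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
            f.setNamespaceAware(true);
            Document d = f.newDocumentBuilder().parse(
                new ByteArrayInputStream(ENVELOPE.getBytes(StandardCharsets.UTF_8)));
            return d.getElementsByTagNameNS("urn:example:quotes", "getQuote")
                    .item(0).getLocalName();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("Operation: " + operationName());
    }
}
```

Over HTTP, such an envelope would travel as the body of a POST request to the service endpoint.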
A Web Service is, naturally, a software element, but because of its specialized interface and mechanisms for interoperating with others, prevalent generic software development methodologies need to be tailored to handle the unique features of Web Services. This translates into the identification of Web Service specific requirements (e.g. conformance to Web Services standards), analysis of the specific implications of the Web Service on the overall system, design of the Web Service interface and XML message structure, and the coding, testing, deployment and execution of the Web Service.
The Web Service Implementation Methodology defined here promotes a systematic approach to Web Service development. Rather than defining a new software development methodology and expecting software practitioners to abandon a familiar and established methodology to re-learn another, the better alternative is to leverage what is already available and customize that methodology to incorporate the specifics of Web Services.
The candidate software development methodology should, ideally, be agile and able to accommodate refinement throughout the development cycle in an iterative and incremental approach. The methodology should consist of phases that cover everything from the conception of the need for the Web Service, through the construction of the Web Service, to its deployment for use by the eventual client application. In this document, these phases are identified as requirements, analysis, design, code, test and deployment.
The Web Service Implementation Methodology leverages any of the candidate agile software development methodologies and extends it by specifying additional and/or customized Web Service specific activities and their corresponding roles and work-products.
The Web Service Implementation Methodology is iterative and incremental. In each iteration, the Web Service would go through all the phases (i.e. requirements, analysis, design, code, testing and finally deployment), thereby developing and refining the Web Services throughout the project lifecycle.
In addition, for Web Service testing, a multitude of tests have to be conducted to ensure that the Web Service is developed according to its functional as well as non-functional requirements. Figure 2 illustrates using the “V” Model to perform these tests.
Figure 2: The “V” Model incorporates the Web Services specific Interoperability test
The specifications produced in each of the development phases are sources of input to derive the test scenarios and test cases. From these test cases, test scripts and test data are compiled, which will be used in unit testing, functional testing, integration/interoperability testing, system/performance testing and the final user acceptance testing.
The Web Service Implementation Lifecycle describes the phases a typical Web Service undergoes, from the identification of the need for the Web Service to its final deployment and usage by the end-users. The phases identified as relevant in the Web Service Implementation Lifecycle are: requirements, analysis, design, code, test and deployment. In each of these phases, Web Service specific activities are carried out. These activities, as well as the roles, responsibilities and artifacts, are elaborated in the subsequent sub-sections. Figure 3 illustrates the above-mentioned relationship between phase, activities and their respective tasks, roles and artifacts.
Figure 3: Relationship between phase, activities, tasks, roles and artifacts
Stakeholders would usually include the end users, project champion, project manager, etc.
Understand the stakeholders’ need for Web Services.
Based on the currently available technology, identify needs that can specifically be addressed by Web Services.
Architect, Requirements Analyst, Stakeholders, Project Manager
The results should be recorded in Business Requirement Specifications.
Identify the departments, end users, domain experts, etc. who would be impacted by the introduction of Web Services.
Non-functional requirements are requirements pertaining to Usability, Reliability, Performance, Scalability, Supportability and other design considerations.
Requirements Analyst, Architect, Test Manager
The results should be recorded in Requirement Specifications.
Traceability matrices help track which requirements are addressed by the identified Web Services.
Requirements Analyst, Architect, Test Manager
The results should be recorded in Requirement Specifications.
This highlights how the Web Services involved are used; in particular, the message exchange scenarios should be captured.
Requirements Analyst, Architect, Test Manager
The results should be recorded in Requirement Specifications.
Test case(s) can be derived from requirements. This is also a way to verify the requirements when they are implemented.
The requirement validation matrix will include the requirements and a reference to the test case(s) that will validate the requirement.
Requirements Analyst, Test Manager, Test Designer
The results should be recorded in Test Plan – UAT and System Test.
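As a sketch of such a requirement validation matrix, the fragment below records which test case(s) validate each requirement and flags requirements that no test case covers. The requirement and test-case identifiers are purely hypothetical.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ValidationMatrix {
    // Requirement IDs mapped to the test case(s) that validate them.
    // All identifiers here are illustrative, not from the specification.
    static final Map<String, List<String>> MATRIX = new LinkedHashMap<>();
    static {
        MATRIX.put("REQ-001", List.of("UAT-01"));           // business scenario test
        MATRIX.put("REQ-002", List.of("SYS-03", "SYS-04")); // system tests
    }

    // A requirement with no referenced test case cannot be validated,
    // so the matrix doubles as a coverage check.
    public static List<String> uncovered(List<String> requirements) {
        return requirements.stream()
                .filter(r -> MATRIX.getOrDefault(r, List.of()).isEmpty())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(uncovered(List.of("REQ-001", "REQ-002", "REQ-003")));
    }
}
```

In practice the matrix would live in the requirements repository alongside the Test Plan rather than in code; the sketch only illustrates the cross-referencing idea.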
Table 2 summarizes the activities and the corresponding tasks, roles and artifacts.
Activities | Tasks | Roles | Artifacts
Determine needs | Identify stakeholders; Understand the inadequacies/problems to address; Identify need for WS technology; Determine positioning of WS within the boundaries of the problem identified; Define features of WS based on needs; Identify limitations | Architect; Requirements Analyst; Stakeholders; Project Manager | Business Requirement Specifications
Elicit requirements | Identify sources for requirements gathering; Gather information; Identify functional requirements; Identify non-functional requirements | Architect; Requirements Analyst; Test Manager | Requirement Specifications
Manage requirements | Identify WS and establish dependencies and priorities; Create traceability matrices; Manage changes to requirements | Architect; Requirements Analyst; Test Manager | Requirement Specifications
Model usage scenarios | Translate functional requirements into conceptual usage models; Specify major interaction scenarios with WS clients | Architect; Requirements Analyst; Test Manager | Requirement Specifications
Prepare test cases for UAT and System Test | Write business scenario test cases; Build requirement validation matrix; Manage changes to test cases | Requirements Analyst; Test Manager; Test Designer | Test Plan – UAT and System Test

Notes: WS stands for Web Services
Table 2: Overview of activities, tasks, roles and artifacts in the Requirements Phase
Identify the Web Service standards based on the requirements and implementation constraints. Consider issues like standards compatibility, the version of the standards, standards adoption in the industry sector, and the organization approving the standards.
Choose a technology platform that is suitable for implementation, e.g. the .NET or Java platform.
Based on implementation constraints and considerations for standards support and interoperability requirements, choose the appropriate hosting platform for the Web Services.
Available options include commercial vendors’ IDE tools and open source IDE tools. Normally, the selection of the IDE is tied to the implementation platform.
Architect
The results should be recorded in Software Architecture Specifications.
It is necessary to identify the architectural components that implement the wrapping of functionality as Web Services and implement the message exchanges in the high level architecture.
Identify and specify the first-cut definition of the messages exchanged with Web Service clients. The definition includes the data elements, data types and formats.
Architect
The results should be recorded in Software Architecture Specifications.
Establish criteria for the coarseness of the Web Service operations; the appropriate granularity depends on the usage scenarios and requirements.
Based on the requirements and the criteria mentioned above, identify the functions to be grouped into Web Services.
If there is a need to compose individual Web Services, decide on the mechanism to implement the composition.
Architect
The results should be recorded in Software Architecture Specifications.
If the functionality of an architectural component can be fulfilled by existing Web Services (internal or third party), that component should be identified so that it can make use of these existing Web Services.
Identify and gather information about the providers of the existing Web Services.
Identify the functions that will be used and define the invocation interface.
Architect
The results should be recorded in Software Architecture Specifications.
Based on the usage models and analysis models, identify the operations and their signatures.
If message exchanges are involved, the XML schema that guides the structure of the message should be defined.
Architect, Designer
Web Service Signature Specifications, XML schema.
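A first-cut service interface of this kind can be sketched as a plain language-level interface. The `OrderService` name, operations and types below are hypothetical examples; in the eventual WSDL description each method would map to an operation of the portType.

```java
import java.lang.reflect.Method;
import java.util.Arrays;

public class ServiceInterfaceSketch {
    // A first-cut service interface derived from the usage models.
    // Names and types are illustrative assumptions only.
    public interface OrderService {
        // Synchronous request/response operation.
        String submitOrder(String customerId, int quantity);
        // Query operation returning the status of an order.
        String getOrderStatus(String orderId);
    }

    // Count the candidate operations that would appear in the contract.
    public static int operationCount() {
        return OrderService.class.getDeclaredMethods().length;
    }

    public static void main(String[] args) {
        // List the operation names that the interface contract would expose.
        Arrays.stream(OrderService.class.getDeclaredMethods())
              .map(Method::getName)
              .forEach(System.out::println);
    }
}
```

Keeping the contract as a language-neutral signature list at this stage leaves the XML binding details (element names, namespaces) to the design phase.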
Test case(s) can be derived from the Architectural Design Specifications.
Test System Administrator, Test Designer
The results should be recorded in Test Plan – Performance Test.
Test case(s) can be derived from the Architectural Design Specifications.
Test Designer, Tester
The results should be recorded in Test Plan – Integration / Interoperability Test.
Test case(s) can be derived from the Architectural Design Specifications.
Test Designer, Tester
The results should be recorded in Test Plan - Functional Test.
Test System Administrator, Test Designer
The results should be recorded in Test Plan – Testbed.
Table 3 summarizes the activities and the corresponding tasks, roles and artifacts.
Activities | Tasks | Roles | Artifacts
Select a technology platform as implementation framework | Specify implementation standards; Decide technology platform for implementation; Decide technology platform for hosting; Decide IDE tools used for development | Architect | Software Architecture Specifications
Define candidate architecture | Define high-level architecture; Identify architectural components that expose functionality as WS; Specify major information exchange with WS clients | Architect | Software Architecture Specifications
Decide granularity | Decide on coarseness of the operations to be exposed; Identify and group functionality; Decide on mechanisms to compose or aggregate functionality | Architect | Software Architecture Specifications
Identify reusable WS | Identify architectural components that can be realized by existing WS; Identify WS providers for reusable WS; Define major invocation scenarios of re-use | Architect | Software Architecture Specifications
Identify service interface | Define new WS operation signatures; Define XML schema for message exchange | Architect; Designer | WS Signature Specifications; XML Schema
Prepare test cases for Performance Test | Write performance test cases | Test System Administrator; Test Designer | Test Plan – Performance Test
Prepare test cases for Integration / Interoperability Test | Write integration / interoperability test cases | Test Designer; Tester | Test Plan – Integration / Interoperability Test
Prepare test cases for Functional Test | Write functional test cases | Test Designer; Tester | Test Plan – Functional Test
Testbed preparation | Set up testing environment | Test System Administrator; Test Designer | Test Plan – Testbed

Notes: WS stands for Web Services
Table 3: Overview of activities, tasks, roles and artifacts in the Analysis Phase
If the type of a parameter of the reusable service is not directly supported by the identified platform, data type mapping should be performed.
Certain design patterns could be used to reuse existing Web Service(s), such as adapter pattern, façade pattern etc. Adapter pattern could be used to expose a new interface of an existing Web Service. The façade pattern could be used to encapsulate the complexity of existing Web Services and provide a coarse-grained Web Service.
Designer
The results should be recorded in Design Specifications.
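The adapter pattern mentioned above can be sketched as follows. The weather-service interfaces and the Fahrenheit-to-Celsius mapping are hypothetical stand-ins for a reusable Web Service whose signature does not match the desired one; the point is the shape of the delegation and the data-type/unit mapping in between.

```java
public class WsAdapterDemo {
    // Interface of an existing, reusable service (hypothetical):
    // it reports temperature in Fahrenheit.
    interface LegacyWeatherService {
        double temperatureF(String city);
    }

    // The interface our architecture expects: Celsius readings.
    interface WeatherService {
        double temperatureC(String city);
    }

    // Adapter pattern: expose the desired interface while delegating to
    // the reusable service, performing the unit mapping in between.
    static class WeatherAdapter implements WeatherService {
        private final LegacyWeatherService legacy;

        WeatherAdapter(LegacyWeatherService legacy) {
            this.legacy = legacy;
        }

        @Override
        public double temperatureC(String city) {
            return (legacy.temperatureF(city) - 32.0) * 5.0 / 9.0;
        }
    }

    // Demonstrate the adapter against a stubbed reusable service.
    public static double demo() {
        LegacyWeatherService legacy = city -> 212.0; // canned value
        return new WeatherAdapter(legacy).temperatureC("Anytown");
    }

    public static void main(String[] args) {
        System.out.println(demo()); // 100.0
    }
}
```

A façade would follow the same delegation shape but aggregate several fine-grained services behind one coarse-grained interface.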
In the detailed design stage, the signature may be refined further. Care must be taken to ensure that design decisions do not affect the interoperability of the service.
The XML schema may be refined to further expand on the data structure, data types, namespaces etc.
Designer
The results should be recorded in Design Specifications.
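A refined XML schema can be checked against sample messages with a standard validator. In the sketch below, the `order` message, its `urn:example:orders` namespace and its element types are illustrative assumptions; the code validates a candidate message against an inline schema using the JDK's javax.xml.validation API.

```java
import java.io.StringReader;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class SchemaRefinementDemo {
    // A refined message schema; names, namespace and types are hypothetical.
    static final String XSD =
          "<xs:schema xmlns:xs=\"http://www.w3.org/2001/XMLSchema\""
        + " targetNamespace=\"urn:example:orders\""
        + " elementFormDefault=\"qualified\">"
        + "<xs:element name=\"order\">"
        + "<xs:complexType><xs:sequence>"
        + "<xs:element name=\"symbol\" type=\"xs:string\"/>"
        + "<xs:element name=\"quantity\" type=\"xs:positiveInteger\"/>"
        + "</xs:sequence></xs:complexType>"
        + "</xs:element>"
        + "</xs:schema>";

    // Validate a candidate message against the refined schema.
    public static boolean isValid(String xml) {
        try {
            Validator v = SchemaFactory
                .newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                .newSchema(new StreamSource(new StringReader(XSD)))
                .newValidator();
            v.validate(new StreamSource(new StringReader(xml)));
            return true;   // no exception: message conforms to the schema
        } catch (Exception e) {
            return false;  // structural or type violation
        }
    }

    public static void main(String[] args) {
        String ok = "<o:order xmlns:o=\"urn:example:orders\">"
                  + "<o:symbol>ACME</o:symbol><o:quantity>5</o:quantity></o:order>";
        System.out.println(isValid(ok));
    }
}
```

Running such checks as the schema is refined catches interoperability-breaking changes (renamed elements, tightened types) before they reach clients.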
The design of the internal structure needs to consider the receiving and pre-processing of requests, the delegation and processing of requests, and the sending of responses. Existing modeling techniques such as UML and design patterns could be applied to the design.
Designer
The results should be recorded in Design Specifications.
Test case(s) can be refined based on the Design Specifications.
Test Designer, Tester
The results should be recorded in Test Plan – Functional Test.
Table 4 summarizes the activities and the corresponding tasks, roles and artifacts.
Activities | Tasks | Roles | Artifacts
Transform signatures of reusable WS | Identify data type mapping if required; Identify design patterns for mapping the re-used WS interface to the desired one | Designer | Design Specifications
Refine service interface of new WS | Refine WS interface signature; Refine XML schema for message exchange | Designer | Design Specifications
Design WS | Use modeling techniques to describe internal structure of WS; Consider non-functional requirements and design constraints | Designer | Design Specifications
Refine test cases for Functional Test | Refine functional test cases | Test Designer; Tester | Test Plan – Functional Test

Notes: WS stands for Web Services
Table 4: Overview of activities, tasks, roles and artifacts in the Design Phase
· Based on the implementation language choice, code the Web Service according to the design
Consider other constraints that are imposed by the specific implementation language itself. For example, consider the language dependent data types and the need to map these data types to the ones specified by the Web Service interface.
· Expose public APIs as Web Service interface
For example, in Java, create the interface class to expose the class method as a Web Service operation; in .NET, annotate the method with [WebMethod].
· Generate WSDL for client to consume
Most IDEs can auto-generate the WSDL from the interface code.
Developer
Web Service Implementation Codes.
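As a minimal sketch of the first two tasks above, the class below represents a business component whose public API is a candidate Web Service operation. The `QuoteService` name and behaviour are hypothetical, and the platform-specific markers (Java's @WebMethod, .NET's [WebMethod]) are shown only in comments so that the sketch compiles without a Web Service runtime on the classpath.

```java
public class WsEndpointSketch {
    // The business component to be exposed. In JAX-WS the class would carry
    // @WebService and the method @WebMethod; in .NET the method would be
    // annotated [WebMethod]. Annotations are omitted here on purpose.
    public static class QuoteService {
        // Candidate Web Service operation. Language-level types (double,
        // String) must map onto the XSD types declared in the interface
        // contract (e.g. xsd:double, xsd:string).
        public double getQuote(String symbol) {
            // Canned lookup standing in for real business logic.
            return "ACME".equals(symbol) ? 42.5 : 0.0;
        }
    }

    public static void main(String[] args) {
        System.out.println(new QuoteService().getQuote("ACME"));
    }
}
```

From such a class, the platform's tooling would generate the WSDL (the third task) that clients consume.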
· Decide on the Web Service Client programming model
The three available models are:
a) Static Stub
The client invokes the Web Service operation through a stub, which is generated at compile time, typically by the IDE or toolkit.
b) Dynamic Proxy
As the name implies, a dynamic proxy is generated when the client application is executed. Because the proxy is generated at runtime, Web Service invocation through it carries more overhead than invocation through a static stub.
c) DII (Dynamic Invocation Interface)
It is the most flexible approach among the three programming models. The client does not even need to know the signature of the Web Service operation until runtime. The Web Service invocation can be dynamically constructed.
Hence, identify and decide on a suitable client programming model by weighing flexibility against performance requirements.
· Write client code to consume the Web Service
Use the WSDL to generate client stubs, which can be used in the client code to invoke the methods provided by the Web Service.
Developer
Web Service Client Codes.
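The flexibility of the DII model can be sketched with plain Java reflection. This is not a real SOAP invocation: WeatherService is a hypothetical local stand-in for a remote endpoint, and a real DII client would construct a SOAP call instead, but the sketch shows how an operation whose signature is unknown until runtime can be resolved and invoked dynamically.

```java
import java.lang.reflect.Method;

// Sketch of DII-style invocation: the operation name and parameters are
// supplied at runtime. WeatherService is a hypothetical local stand-in
// for a remote Web Service endpoint.
public class DynamicClient {
    public static class WeatherService {          // stand-in for the remote service
        public String getForecast(String city) {  // stand-in for a WS operation
            return "Sunny in " + city;
        }
    }

    public static Object invoke(Object service, String operation, Object... args) {
        try {
            Class<?>[] types = new Class<?>[args.length];
            for (int i = 0; i < args.length; i++) {
                types[i] = args[i].getClass();
            }
            // Signature is resolved only at runtime, as with DII.
            Method m = service.getClass().getMethod(operation, types);
            return m.invoke(service, args);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // The operation name could come from configuration or user input.
        System.out.println(invoke(new WeatherService(), "getForecast", "Singapore"));
    }
}
```

The extra reflection (or, in a real DII client, the runtime construction of the call) is the source of the performance cost noted above.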
· Deploy Web Service in local test environment and perform functional unit testing
The emphasis is on the correctness of the functionality and the exception handling.
Developer
Unit Test Scripts.
Table 5 summarizes the activities and the corresponding tasks, roles and artifacts under each activity.
Activities | Tasks | Roles | Artifacts |
§ Construct WS code | § Code based on implementation language chosen § Expose public APIs as interface § Generate WSDL for client | § Developer | § Implementation codes |
§ Construct WS client code | § Decide client programming model § Write client codes | § Developer | § Client codes |
§ Unit test | § Deploy in local test environment and perform functional unit testing | § Developer | § Unit test scripts |
Notes: WS stands for Web Services
Table 5: Overview of activities, tasks, roles and artifacts in the Coding Phase
For Web Services, additional tests may be conducted to ensure that the Web Services are interoperable, secure and scalable.
Interoperability is an issue in Web Services because the standards governing Web Services are still evolving. Furthermore, different vendors that implement these specifications may interpret and comply with them differently. Currently there is an effort by the Web Services Interoperability Organization (WS-I) to recommend basic profiles to minimise these incompatibilities. The aim of conducting interoperability tests is to ensure that these recommendations are followed and that the Web Service developed will interoperate with other Web Services and products without problems.
Network
congestion created by Web Services is the major contributor to Web Services’
slow performance. Not only is the
messaging between requesters and Web Services impacted by network latency, but
also the service discovery and description protocols that precede those message
exchanges. The cumulative effect of
these delays can seriously degrade the performance of Web Services. Therefore it is necessary to do a performance
test on the Web Services before they are deployed for operation, and then to
monitor the Web Services to determine if they can meet the service level
agreements.
Web Services introduce special security issues e.g.
in privacy, message integrity, authentication and authorization. Tests have to be conducted to ensure that
these security requirements have been fulfilled. However, security schemes
could complicate the process of testing and debugging Web Service basic
functionality. For example,
non-intrusive monitors are often used in functional testing but encrypted
traffic presents an obvious complication to this approach to testing.
The Web Service should respond correctly to requests from its clients. The format of the SOAP messages should comply with the SOAP specifications. WSDL files, which contain metadata about the Web Service's interfaces, should comply with the WSDL specifications published by W3C. Perform fault checking to see how the service handles unexpected input. The test scripts and data prepared in the earlier phases are executed in this activity. The test results should be recorded, and bugs found should be reported to the code owners and fixed by them.
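A minimal sketch of such a compliance check, using only the DOM parser shipped with the JDK, might verify that a response is a well-formed SOAP 1.1 envelope: an Envelope root element in the SOAP namespace containing exactly one Body. The class name and the sample message are illustrative; a full functional test would also validate the payload against the service's XML schema.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Functional-test sketch: structural check of a SOAP 1.1 envelope.
public class SoapCheck {
    static final String SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/";

    public static boolean isSoapEnvelope(String xml) {
        try {
            DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
            f.setNamespaceAware(true);
            Document doc = f.newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            Element root = doc.getDocumentElement();
            // Root must be Envelope in the SOAP namespace, with one Body child.
            boolean envelopeOk = SOAP_NS.equals(root.getNamespaceURI())
                    && "Envelope".equals(root.getLocalName());
            boolean bodyOk = doc.getElementsByTagNameNS(SOAP_NS, "Body").getLength() == 1;
            return envelopeOk && bodyOk;
        } catch (Exception e) {
            return false;  // unparseable input fails the check
        }
    }

    public static void main(String[] args) {
        String msg = "<soap:Envelope xmlns:soap=\"" + SOAP_NS + "\">"
                   + "<soap:Body><reply>ok</reply></soap:Body></soap:Envelope>";
        System.out.println(isSoapEnvelope(msg)); // true
    }
}
```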
If a service requires a certain level of privacy, or if it requires that messages be authenticated in a certain way, then specific tests are needed to ensure that these security requirements are met. The test scripts and test data prepared in the earlier phases should be executed in this activity. Any inadequacies that may lead to possible security breaches should be reported and resolved by the code owner, designer or architect.
If a service is registered to a registry server, first publish the Web Service to the registry, then write test clients that find and bind to the Web Service on the registry, and then use the registry data to actually invoke the service. Test results from the test scripts and data should be recorded and bugs should be fixed by the code owners.
A SOAP message may have one or more intermediaries along the message route that take actions based on the instructions provided to them in the header of the SOAP message. Web Service SOAP intermediary testing must verify the proper functionality of these intermediaries. Test results from the test scripts and data should be recorded and bugs should be fixed by the code owners.
Tester, Test Designer
The results should be recorded in Client
Test Code, Test Scripts and Test Results.
This is to highlight the interoperability issues of Web Services implementation. Refer to the Interoperability Guideline for the interoperability testing scenarios.
Based on the test cases prepared in the Analysis Phase, the test scripts and test data prepared earlier are executed and analyzed in this activity.
Tester, Test Designer, Test System
Administrator
The results should be recorded in Client
Test Code, Test Scripts and Test Results.
The test cases that are prepared in the earlier phases are executed in this activity. The load increases can be sudden surges or gradual ramp-ups. The test results should be analyzed to determine potential bottlenecks and if the system is scalable.
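A gradual ramp-up can be sketched as follows. Here callService() is a local stub standing in for a real Web Service request; a real load test would invoke the deployed endpoint and record response times at each load level.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of a gradual ramp-up load test: each step submits more
// concurrent "client" requests than the previous one.
public class RampUpLoadTest {
    static final AtomicInteger completed = new AtomicInteger();

    static void callService() {                 // stand-in for a Web Service request
        completed.incrementAndGet();
    }

    public static int run(int steps, int clientsPerStep) {
        completed.set(0);
        ExecutorService pool = Executors.newCachedThreadPool();
        try {
            for (int step = 1; step <= steps; step++) {
                int load = step * clientsPerStep;   // gradual ramp-up
                for (int i = 0; i < load; i++) {
                    pool.execute(() -> callService());
                }
                // A real test would sample response times and error rates here.
            }
            pool.shutdown();
            pool.awaitTermination(10, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
        return completed.get();
    }

    public static void main(String[] args) {
        // 3 steps of 10, 20, 30 clients = 60 requests in total.
        System.out.println(run(3, 10) + " requests completed");
    }
}
```

Plotting response time against each step's load level reveals the point at which the system stops meeting its service level agreements.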
The
results from the test execution should be analyzed to determine if the system
can still render the expected quality of service as specified in the
non-functional requirement specifications.
Tester, Test Designer, Test System
Administrator
The results should be recorded in Client
Test Code, Test Scripts and Test Results.
The test cases prepared in the Requirement Phase are used in this activity to validate the correctness and completeness of the Web Service system. Any bugs found should be reported and fixed by the code owners.
User, Test Manager, Test System
Administrator
The results should be recorded in Client
Test Code, Test Scripts and Test Results.
Table 6 summarizes the activities and the corresponding tasks, roles and artifacts under each activity.
Activities | Tasks | Roles | Artifacts |
§ Test functionality | § Test basic WS functionality § Test for security § Test UDDI functionality § Test for SOAP intermediary capability | § Tester § Test Designer | § Client test code § Test scripts § Test results |
§ Integration test | § Test for conformance to WS-I § Perform interoperability test based on various scenarios § Perform integration test based on various scenarios | § Tester § Test Designer § Test System Administrator | § Client test code § Test scripts § Test results |
§ System test | § Check system functionality and response time under different degrees of load increases § Check functionality and response time under different combinations of valid and invalid requests | § Tester § Test Designer § Test System Administrator | § Client test code § Test scripts § Test results |
§ User acceptance test | § Run UAT test cases | § User § Test Manager § Test System Administrator | § Client test code § Test scripts § Test results |
Notes: WS stands for Web Services
Table 6: Overview of activities, tasks, roles and artifacts in the Test Phase
The software may include an application server, a database, etc. The application server should have a SOAP listener to support Web Services. Some Web Services may need the SOAP handler to be configured.
System Engineer
Release Notes.
The Web Service URL is unique; it identifies the Web Service and where it is located.
The deployment script determines the steps of deployment. Although the script differs from one application server to another, most scripts include creating directories, copying files, and shutting down and restarting the server.
Execute the prepared deployment script.
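Such a script might be sketched as below. The directory layout, artifact name and restart step are assumptions for illustration; real deployment scripts are application-server specific.

```shell
#!/bin/sh
# Deployment-script sketch -- paths and restart command are assumptions,
# not taken from any particular application server.
APP_DIR="${APP_DIR:-/tmp/ws-deploy-demo/webapps/service/WEB-INF}"
ARTIFACT="${ARTIFACT:-/tmp/ws-deploy-demo-service.jar}"

# Demo stub artifact so the sketch runs standalone; a real script would
# require the packaged Web Service to exist already.
[ -f "$ARTIFACT" ] || touch "$ARTIFACT"

mkdir -p "$APP_DIR/lib"                   # 1. create the service directory
cp "$ARTIFACT" "$APP_DIR/lib/"            # 2. copy the implementation archive
echo "restart application server here"    # 3. server-specific stop/start commands
echo "deployed $ARTIFACT to $APP_DIR/lib"
```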
After the Web Service is successfully deployed, a WSDL file is needed to describe the functions it provides. The WSDL can be created manually or by most application servers, which automatically generate the WSDL file after deployment.
Developer
WSDL File,
Deployment Script.
The Web Service client code should have been created by the developer during the coding phase.
Because the functionality of the Web Service has already been tested, there is no need to test all the operations. To confirm that the Web Service is properly deployed and configured, the best operations to invoke are those that exercise the database connection, the SOAP handler configuration, or any other special features of the application server.
Tester
Web Service Client Codes.
Support material is needed to help users understand and use the Web Service, for example an interoperability guide for the Web Service.
Developer
Interoperability Guide, User Guide, On-line Help, Tutorials and Training Material.
Based on the requirements, decide whether a private or public UDDI registry is needed and which version of the UDDI Business Registry specifications to follow.
The information may include key words for searching, description of Web Service, URL of WSDL file, etc.
Normally, the UDDI registry supports publishing via a browser.
Search for the Web Service through the browser interface provided by the UDDI registry or through tools provided by other vendors.
Developer
None.
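For illustration, the information published to a UDDI registry might be represented as a businessService entry similar to the following fragment. The keys, names, description and access point URL are invented for this sketch:

```xml
<!-- Hypothetical UDDI businessService entry; all values are illustrative -->
<businessService serviceKey="uuid-0000" businessKey="uuid-0001">
  <name>Order Status Service</name>
  <description>Returns the current status of a customer order</description>
  <bindingTemplates>
    <bindingTemplate bindingKey="uuid-0002">
      <accessPoint URLType="http">http://example.org/services/OrderStatus</accessPoint>
    </bindingTemplate>
  </bindingTemplates>
</businessService>
```

Consumers then locate the service by searching on the name or description key words and follow the access point to invoke it.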
Table 7 summarizes the activities and the corresponding tasks, roles and artifacts under each activity.
Activities | Tasks | Roles | Artifacts |
§ Prepare deployment environment | § Set up and configure hardware § Set up and configure software | § System Engineer | § Release Notes |
§ Deploy WS | § Determine service URL § Prepare deployment script § Deploy WS § Generate WSDL | § Developer | § WSDL file § Deployment script |
§ Test deployment | § Create (reuse) client code § Consume WS with client code | § Tester | § Client codes |
§ Create end user support material | § Create support material | § Developer | § Interoperability guide § User guide § On-line help § Tutorials § Training material |
§ Publish WS | § Identify UDDI registry for publishing § Prepare information for publishing § Publish in UDDI registry § Search by key words after publishing | § Developer | § -- |
Notes: WS stands for Web Services
Table 7: Overview of activities, tasks, roles and artifacts in the Deployment Phase
1. “Rational Unified Process”, Version 2003.06.00.65, IBM-Rational Software.
2. “Rational Unified Process for Developing Web Services”, Version 1.0, Java Smart Services Laboratory and Rational Software Pte. Ltd., Aug 2003.
The following individuals were members of the committee during the development of this documentation:
· Ravi Shankar, CrimsonLogic Pte. Ltd.
· Jagdip Talla, CrimsonLogic Pte. Ltd.
· Andy Tan, Individual
· Roberto Pascual, The Infocomm Development Authority of
Rev | Date | By Whom | What |
wd-01 | 2004-09-30 | Lai Peng CHAN, Chai Hong ANG | Initial version |
- | 2004-12-23 | Chai Hong ANG | Split the document into two |
wd-01a | 2005-05-24 | Chai Hong ANG, Puay Siew TAN, Han Boon LEE | § Removed Section 2.1 Terminology, Section 2.2 Concepts and Section 2.2.1 Web Service and combined them as Section 2.1 Objective § Renumbered Section 2.2.2 to Section 2.2 § Renumbered Section 2.2.3 to 2.3; the rest of the sub-sections are renumbered accordingly § Added Table 1 as a summary of each phase and its assigned roles § Moved Section 2.2.4 Activity and Section 2.2.6 Artifact into Section 2.5 Glossary § Renumbered Section 2.2.5 to 2.5 and put the terms into table format § Renamed Section 3 as Web Service Implementation Methodology § Removed Section 3.1 § Renumbered Section 3.1.1 to 3.1 § Renumbered Section 3.1.2 to 3.2; the rest of the sub-sections are renumbered accordingly § Added summary tables to each phase § Removed Normative and Non-Normative from Section 4 |
wd-01b | 2005-06-02 | Chai Hong ANG, Prof. Marc Haines | § Edited based on Prof. Haines' comments |
OASIS takes no
position regarding the validity or scope of any intellectual property or other
rights that might be claimed to pertain to the implementation or use of the
technology described in this document or the extent to which any license under
such rights might or might not be available; neither does it represent that it
has made any effort to identify any such rights. Information on OASIS's
procedures with respect to rights in OASIS specifications can be found at the
OASIS website. Copies of claims of rights made available for publication and
any assurances of licenses to be made available, or the result of an attempt
made to obtain a general license or permission for the use of such proprietary
rights by implementors or users of this specification, can be obtained from the
OASIS Executive Director.
OASIS invites any
interested party to bring to its attention any copyrights, patents or patent
applications, or other proprietary rights which may cover technology that may
be required to implement this specification. Please address the information to
the OASIS Executive Director.
Copyright © OASIS
Open 2005. All Rights
Reserved.
This document and
translations of it may be copied and furnished to others, and derivative works
that comment on or otherwise explain it or assist in its implementation may be
prepared, copied, published and distributed, in whole or in part, without
restriction of any kind, provided that the above copyright notice and this
paragraph are included on all such copies and derivative works. However, this
document itself may not be modified in any way, such as by removing the
copyright notice or references to OASIS, except as needed for the purpose of
developing OASIS specifications, in which case the procedures for copyrights
defined in the OASIS Intellectual Property Rights document must be followed, or
as required to translate it into languages other than English.
The limited
permissions granted above are perpetual and will not be revoked by OASIS or its
successors or assigns.
This document and the
information contained herein is provided on an “AS IS” basis and OASIS
DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY
WARRANTY THAT THE USE OF THE INFORMATION HEREIN WILL NOT INFRINGE ANY RIGHTS OR
ANY IMPLIED WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.
[1] The Functional Elements are to be specified as components, which are to be exposed as Web Services where appropriate.
[2] SOAP is transport agnostic. Therefore, other Internet transports (e.g. SMTP) or non-Internet transports (e.g. IBM MQ Series) may be used. From a practical perspective, however, SOAP over HTTP appears to be the typical scenario.