
Working with REST Adapter: A REST to SOAP Scenario in SAP PI/PO


Introduction: With the introduction of the REST Adapter in SAP PI/PO, consuming or exposing a REST service in PI becomes much easier, because the SAP PI/PO ESB can now handle and process REST-based messages (XML, JSON) natively. Previously, implementing or consuming a REST-based service had to be done via third-party adapters (e.g. Advantco) or by manipulating the Axis framework within SAP PI, which is often a tough task. In this blog, my aim is to highlight the simple steps needed to expose a REST-based service in SAP PI/PO. An end-to-end implementation of a simple REST to SOAP scenario is described.

Prerequisites: SAP NetWeaver 7.31 SP14 or SAP NetWeaver 7.40 SP09 and above, plus knowledge of creating objects in the Enterprise Services Repository and the Integration Builder.

Scenario: The REST to SOAP scenario is implemented such that an XML-based or JSON-based request is triggered from SOAP UI or the Chrome REST client. It is received by the configured REST adapter, processed and passed to the receiver SOAP adapter (configured to connect to the currency converter web service from www.webservicex.com).


Figure: Exposing REST-based Service via SAP PI as shown in REST-to-SOAP Scenario diagram

ESR and Configuration Objects: For this scenario, the necessary ESR object creation steps are only listed below; the emphasis is on the configuration.

ESR Objects: Create the following ESR objects

1) Download the WSDL from: http://www.webservicex.com/ws/WSDetails.aspx?CATID=2&WSID=10

2) Import the WSDL as an external definition in the sender and receiver namespaces

3) Create the service interfaces for sender and receiver (use the same XSD on both sides)

4) Create the message mappings and operation mappings

Directory Objects: The necessary configuration steps for each object are described below.

Sender REST Adapter: Create the sender communication channel by selecting the “REST Adapter” option. Configure the REST adapter as shown below:

  • Input message format: XML
  • Quality of Service: Best Effort
  • Output message format: XML


Figure: Sender REST Adapter Configuration with input message as XML

Navigate to “Channel Selection” and specify a URL path. This is optional, but useful to distinguish the different scenarios exposing REST-based services in SAP PI, as this path will be added to the URL used by the sender application (in this case: SOAP UI).


Figure: Sender REST Adapter Configuration Endpoint

Receiver SOAP Adapter: Create the receiver communication channel and provide the necessary connection details to the currency converter webservice.


Figure: Receiver SOAP Adapter Configuration

 

Integrated Configuration: Configure the integrated configuration (ICO) using the sender and the receiver channels created earlier.

Test the Scenario with XML: To test the scenario, SOAP UI is used as the sender application. The endpoint URL to send the message to is constructed as shown below:

http://<server>:<port>/RESTAdapter/<pathspecifiedinthecommunicationchannel>

e.g. http://xxxxxxx:50000/RESTAdapter/Rojo/CurrencyConverter/201503

Obtain a sample XML message from the test tab of the “Message Mapping” object in the ESR and use it as the XML request sent from SOAP UI.
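If you prefer not to use SOAP UI, a small Java client can fire the same request. The sketch below is illustrative only: host, port, URL path and payload element names are assumptions based on this scenario (the ConversionRate request of the webservicex currency converter) and must be adjusted to your system; add authentication if your PI system requires it.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;

// Minimal test client for the exposed REST endpoint. Host, port, URL path and
// payload element names are assumptions for this scenario - adjust as needed.
public class RestXmlTestClient {
    public static void main(String[] args) throws Exception {
        String endpoint = "http://xxxxxxx:50000/RESTAdapter/Rojo/CurrencyConverter/201503";
        String xml = "<ns0:ConversionRate xmlns:ns0=\"http://www.webserviceX.NET/\">"
                   + "<FromCurrency>USD</FromCurrency>"
                   + "<ToCurrency>EUR</ToCurrency>"
                   + "</ns0:ConversionRate>";

        HttpURLConnection con = (HttpURLConnection) new URL(endpoint).openConnection();
        con.setRequestMethod("POST");
        con.setRequestProperty("Content-Type", "application/xml");
        con.setDoOutput(true);
        OutputStream out = con.getOutputStream();
        out.write(xml.getBytes("UTF-8"));
        out.close();

        // Print the HTTP status and the response returned by PI
        System.out.println("HTTP " + con.getResponseCode());
        Scanner sc = new Scanner(con.getInputStream(), "UTF-8");
        while (sc.hasNextLine()) {
            System.out.println(sc.nextLine());
        }
        sc.close();
    }
}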


Figure: XML Request message from SOAP UI


Change The Sender Adapter Configuration (for JSON to XML conversion)

In addition to the above test, change the sender adapter configuration from XML to JSON, so that SAP PI receives a JSON request message from SOAP UI (a sketch of the resulting conversion follows the list below):

  • Input message format: JSON, with the following additional configuration:
      • Convert to XML
      • Add wrapper element
      • Element name (name of the request message as in the XSD in the ESR)
      • Element namespace (namespace of the request as in the XSD in the ESR)
  • Quality of Service: Best Effort
  • Output message format: XML
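For orientation, the sketch below approximates what this JSON-to-XML conversion with a wrapper element produces for this scenario. The wrapper element name and namespace are assumptions taken from the currency converter request in the ESR, not fixed adapter defaults, and the parsing is deliberately naive.

// Illustrative only: a toy conversion mimicking the adapter's "Convert to XML"
// option with a wrapper element. Element/namespace names are assumptions.
public class JsonToXmlWrapperSketch {
    public static void main(String[] args) {
        // JSON request body as sent from SOAP UI or the Chrome REST client
        String json = "{\"FromCurrency\":\"USD\",\"ToCurrency\":\"EUR\"}";

        StringBuilder xml = new StringBuilder(
            "<ns0:ConversionRate xmlns:ns0=\"http://www.webserviceX.NET/\">");
        // naive parsing - good enough for a flat key/value JSON object
        String body = json.substring(1, json.length() - 1);
        for (String pair : body.split(",")) {
            String[] kv = pair.split(":");
            String key = kv[0].replace("\"", "").trim();
            String value = kv[1].replace("\"", "").trim();
            xml.append("<").append(key).append(">").append(value)
               .append("</").append(key).append(">");
        }
        xml.append("</ns0:ConversionRate>");

        System.out.println(xml);
        // -> <ns0:ConversionRate xmlns:ns0="http://www.webserviceX.NET/">
        //      <FromCurrency>USD</FromCurrency><ToCurrency>EUR</ToCurrency>
        //    </ns0:ConversionRate>
    }
}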


Figure: Sender REST Adapter Configuration with input message as JSON

Test the Scenario with JSON: To test the scenario, SOAP UI is used as the sender application. The endpoint URL to send the message to is the same as in the previous test. The JSON request and response messages from SOAP UI and the Chrome REST client are shown below:


Figure: JSON Request from SOAP UI


Figure: JSON Request from Chrome REST Client

The results of the calls to consume the REST service exposed in SAP PI via the REST adapter are also shown above for both test cases.

Conclusion: With this blog, I have been able to provide an insight into how to expose a REST service in SAP PI/PO using the REST adapter as the sender communication channel. Additional settings of the adapter can be used as well, e.g. sending a JSON message as the input or output message.


Step by Step Simple iFlow to BPM to iFlow and Calling Operation Mapping in BPM


This post shows a simple example of connecting a source iFlow to BPM, calling an Operation Mapping in BPM, and then sending the output to a target iFlow.

simplebpm-iflow-bpm-iflow

 

The flow:
a. The sender (BC_S1) sends a text file containing the message FirstNameLastName via the sender file adapter to the first iFlow (PI to BPM).
b. The first iFlow (PI to BPM) routes the message FirstNameLastName via the receiver SOAP adapter to BPM (BC_BPM).
c. BPM (BC_BPM) calls an Operation Mapping to combine the message FirstNameLastName into FullName.
d. BPM (BC_BPM) routes the message FullName via the sender SOAP adapter to the second iFlow (BPM to PI).
e. The second iFlow (BPM to PI) saves a text file containing the message FullName via the receiver file adapter to the receiver (BC_R1).

 

The steps:
1. Create and activate the interface objects in the ESR
2. Create, activate and deploy the source iFlow (PI to BPM) using NWDS
3. Create, activate and deploy the target iFlow (BPM to PI) using NWDS
4. Create, activate and deploy the BPM process using NWDS
5. Test-run it.

 

All sub steps:
1.1 Create data types DT_FirstNameLastName, DT_FullName
simplebpm-dt-firstnamelastname
simplebpm-dt-fullname

 

1.2 Create message types MT_FirstNameLastName, MT_FullName
simplebpm-mt-firstnamelastname
simplebpm-mt-fullname

 

1.3 Create the following service interfaces:
SI_FirstNameLastName_OUT, outbound interface from Sender(S1).
simplebpm-si-firstnamelastname-out

 

SI_FirstNameLastName_IN_XI30, inbound interface to BPM(BC_BPM). Stateless XI30 compatible.
simplebpm-si-firstnamelastname-in-xi30

 

SI_FullName_OUT_XI30, outbound interface from BPM(BC_BPM). Stateless XI30 compatible.
simplebpm-si-fullname-out-xi30

 

SI_FullName_IN, inbound interface to Receiver(R1).
simplebpm-si-fullname-in

 

1.4 Create the message mapping MM_FirstNameLastName_to_FullName to combine the message FirstNameLastName into FullName.
simplebpm-mm-firstnamelastname-to-fullname
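The mapping itself is a plain concatenation of the two input fields. If you prefer a user-defined function over the standard concat function, a minimal UDF would look like the sketch below (the signature is the one generated by the ESR mapping editor, which also provides the Container and StreamTransformationException types; the space separator is an assumption):

// Hypothetical UDF equivalent of the graphical mapping: concatenates
// FirstName and LastName into FullName. The separator is an assumption.
public String buildFullName(String firstName, String lastName, Container container)
        throws StreamTransformationException {
    return firstName + " " + lastName;
}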

 

1.5 Create operation mapping OM_FirstNameLastName_to_FullName, for use in BPM.
simplebpm-om-firstnamelastname-to-fullname

 

2.1 Create iFlow EIP_simplebpm_IF_01_FILE2BPM
2.2 Assign sender BC_S1, receiver BC_BPM
2.3 Assign sender interface SI_FirstNameLastName_OUT, receiver interface SI_FirstNameLastName_IN_XI30
simplebpm-iflow01-graphical

 

2.4 Configure sender file adapter CC_SND_FILE_S1, receiver SOAP adapter CC_RCV_SOAP_BPM.
The receiver SOAP adapter uses the URL “http://<hostname>:<port>/MessagingSystem/receive/JPR/XI” to send to BPM.
simplebpm-iflow01-cc-snd-file
simplebpm-iflow01-cc-rcv-soap-general
simplebpm-iflow01-cc-rcv-soap-adapter-specific

 

2.5 Activate and deploy. ICO generated.
simplebpm-iflow01-ico

 

3.1 Create iFlow EIP_simplebpm_IF_02_BPM2FILE
3.2 Assign sender BC_BPM, receiver BC_R1
3.3 Assign sender interface SI_FullName_OUT_XI30, receiver interface SI_FullName_IN
simplebpm-iflow02-graphical

 

3.4 Configure sender SOAP adapter CC_SND_SOAP_BPM, receiver file adapter CC_RCV_FILE_R1.
simplebpm-iflow02-cc-snd-soap
simplebpm-iflow02-cc-rcv-file

 

3.5 Activate and deploy. ICO generated.
simplebpm-iflow02-ico

 

4.1 Create a new project eip_simplebpm in NWDS using the “Process Development” perspective.
4.2 Import the WSDL for SI_FirstNameLastName_IN_XI30
4.3 Import the WSDL for SI_FullName_OUT_XI30
4.4 Import the WSDL for OM_FirstNameLastName_to_FullName
simplebpm-project-explorer

 

4.5 Create a new process pro_simple. The completed process looks like this:
simplebpm-pro-all

4.6 Create data objects that act as variables to temporarily hold values before they are mapped to the next steps.
simplebpm-pro-data-object

 

4.7 At the Start step, under Event trigger, add StartMsg.
simplebpm-pro-start-icon
simplebpm-pro-start-event-trigger
simplebpm-pro-start-output-mapping
simplebpm-pro-start-message

 

4.8 Add an automated activity for OM_FirstNameLastName_to_FullName to the BPM process.
simplebpm-pro-mm-icon
simplebpm-pro-mm-interface
simplebpm-pro-mm-input-mapping
simplebpm-pro-mm-output-mapping

 

4.9 Add an automated activity for SI_FullName_OUT_XI30 to the BPM process.
simplebpm-pro-fullname-icon
simplebpm-pro-fullname-interface
simplebpm-pro-fullname-input-mapping
simplebpm-pro-fullname-output-mapping

 

4.10 Ensure the service references are correct:
For SI_FullName_OUT_XI30, the type is “XI”; maintain the sender component “BC_BPM”.
simplebpm-pro-service-references-xi

 

For the operation mapping, the type is “WS”.
simplebpm-pro-service-references-localhost

 

5.1 Load the source file; it will go through the source iFlow to BPM to the target iFlow and finally create the target file with the full name.
simplebpm-testing

 

5.2 Communication Channel Monitoring:
simplebpm-pimon-cc

 

5.3 Go to the process repository and view its details:
simplebpm-pimon-process-repository
simplebpm-pimon-process-repository-detail

 

5.4 Go to Manage Processes and view the completed process instances.
simplebpm-pimon-manage-process
simplebpm-pimon-manage-process-process-instances

 

Reference:
http://scn.sap.com/community/process-orchestration/blog/2012/06/11/using-netweaver-pi-s-mapping-in-netweaver-bpm
Using Operation Mappings (PI) in a BPM Process
Configuring Message Processing from BPM to PI
Integrating PI Operation Mapping with SAP BPM

 

Thanks for viewing.

PO REST to REST iFlow Scenario


In this blog I am going to show you how to achieve RESTfulness with the PI REST adapter. Use this blog as a hands-on starting point for your RESTful journey. I followed the blog series at http://scn.sap.com/docs/DOC-60854 and it is really full of information. With this blog I am sharing the example I prepared to practice the REST adapter, so as to give you a head start.

 

Scenario

I am going to use the REST service http://www.thomas-bayer.com/sqlrest. This service gives information about Customer, Invoice, Item and Product. The iFlows that I created are REST to REST (just to keep it simple and show both sender and receiver configurations).

 

Concepts

The REST call in my example can go up to three levels deep, so I created three different iFlows representing three different entities.

Level 0 (IFLOW: Test_R2R): http://HOST:PORT/RESTAdapter/sqlrest/

Level 1 (IFLOW: Test_R2R_Level1): http://HOST:PORT/RESTAdapter/sqlrest/CUSTOMER/

Level 2 (IFLOW: Test_R2R_Level2): http://HOST:PORT/RESTAdapter/sqlrest/CUSTOMER/11/


iFlow

Please import the attached iFlows to run the scenario on your PO system (SP8 and above).

 

Some important REST adapter configurations

The sender REST endpoint is the same for all three iFlows:

  Img1.png

 

 

Since Level 0 is the default, use “/” as the URL pattern in the sender REST adapter.

Img2.png

 

The Level 0 receiver REST adapter is to be configured as follows:

Img3.png

 

Level 1 represents Customer, Item or Product, therefore in this configuration we use a URL pattern to capture the Level 1 detail as “resource”. The Level 1 sender REST adapter needs the following configuration.

Img4.png

 

In the Level 1 receiver REST adapter the “resource” value is read from the dynamic configuration and is used to generate the right URL.

Img5.png

 

For the Level 2 sender REST adapter we now configure two parameters, “resource” and “id”.

Img6.png

 

In the Level 2 receiver REST adapter the values of “resource” and “id” are read from the dynamic configuration to generate the right URL.

Img7.png
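To make the placeholder handling concrete, the sketch below mimics how the values read from dynamic configuration are substituted into a URL pattern. The pattern and variable names are assumptions matching this example; in the real scenario the substitution is performed by the REST receiver channel itself.

import java.util.HashMap;
import java.util.Map;

// Illustrative only: resolves {resource} and {id} placeholders the way the
// receiver REST channel does, with values taken from dynamic configuration.
public class UrlPatternSketch {
    static String resolve(String pattern, Map<String, String> vars) {
        String url = pattern;
        for (Map.Entry<String, String> e : vars.entrySet()) {
            url = url.replace("{" + e.getKey() + "}", e.getValue());
        }
        return url;
    }

    public static void main(String[] args) {
        Map<String, String> vars = new HashMap<String, String>();
        vars.put("resource", "CUSTOMER");
        vars.put("id", "11");
        System.out.println(resolve("http://www.thomas-bayer.com/sqlrest/{resource}/{id}/", vars));
        // -> http://www.thomas-bayer.com/sqlrest/CUSTOMER/11/
    }
}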

 

Good Luck!!

What’s new in SAP Process Orchestration 7.31 SP15 / 7.4 SP10


The latest SP of SAP Process Orchestration 7.31 / 7.4 has been shipped in March this year. So let me summarize what new features and enhancements it brought to you.

 

Substitution profiles

As a BPM task owner you need to assign one or more substitutes when you are on leave or otherwise unavailable. So far, only generic substitution rules were supported, e.g., Anna substitutes John from the 3rd of August until the 11th of August for all tasks assigned to John. With the new substitution profiles you can categorize your tasks and hence define more specific task forwarding based on the kind of task. For instance, you may define a substitution rule for John that names Anna as substitute for all HR-related tasks during John's absence, excluding functional tasks. Substitution profiles are currently supported in the Unified Inbox and My Inbox. Furthermore, a public Java API is provided to manage substitution profiles. For more details, refer to the blog Maintaining Substitution Profiles for SAP BPM Tasks in Unified Inbox.

SubstitutionRules.png

 

Further REST adapter capabilities

For provisioning and consumption of RESTful services, we recently shipped a new, so-called REST adapter. With the latest SP, this adapter has been enhanced to support the following new features:

  • Custom error handling: You can maintain rules defining how message processing should behave in certain error situations. For instance, you may want to ignore particular error codes, or reply with a custom message based on message content, etc.

REST - error handling.png

 

  • Logging of raw JSON content before and after conversion: If your RESTful service uses JSON format, the original JSON payload is logged before it is converted into XML. This may be required for audit reasons.

REST - log version.png

  • Authentication with OAuth 2.0 SAML Bearer Token profile.
  • JSON to XML conversion enhancements to escape invalid XML names and XML name start characters: If the JSON request contains keys that would lead to invalid XML names, you can escape the invalid characters with a specified sequence, e.g., a blank, an ampersand, a number at the beginning, etc. (see the small sketch after this list).

REST - json conversion with example.png

  • Setting of custom HTTP header elements in the REST receiver adapter: if you want to consume a RESTful service that requires specific HTTP header elements added to the HTTP request, you can specify those header elements in the REST receiver channel. By using variables you can access payload as well as adapter-specific attributes when defining them.
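Coming back to the JSON-to-XML escaping item above, the sketch below illustrates the kind of problem being solved: JSON keys are not always valid XML element names. The replacement string used here is an assumption; in the adapter the escape sequence is configurable in the channel.

// Illustrative only: XML element names must not start with a digit and must not
// contain characters such as '&' or blanks, so a JSON key like "1st name&id"
// cannot be used as-is. Invalid characters are replaced with a configurable
// sequence ("_" is assumed here).
public class XmlNameEscapeSketch {
    static String escape(String jsonKey, String replacement) {
        StringBuilder name = new StringBuilder();
        for (int i = 0; i < jsonKey.length(); i++) {
            char c = jsonKey.charAt(i);
            boolean validStart = Character.isLetter(c) || c == '_';
            boolean valid = Character.isLetterOrDigit(c) || c == '_' || c == '-' || c == '.';
            if ((i == 0 && !validStart) || (i > 0 && !valid)) {
                name.append(replacement);
            } else {
                name.append(c);
            }
        }
        return name.toString();
    }

    public static void main(String[] args) {
        System.out.println(escape("1st name&id", "_"));  // -> _st_name_id
    }
}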

If you like to learn more about the REST adapter capabilities, check out the following blog series PI REST Adapter - Blog Overview. We will soon add further blogs describing the new features in more detail.

 

Monitoring enhancements

Some of the monitors provided within the Configuration and Monitoring Home page (alias pimon) were still pointing to the Runtime Workbench (RWB). The issue with the Runtime Workbench is that its technology does not meet the accessibility product standards. Over time we have replaced RWB monitors with monitors running in the NetWeaver Administrator (NWA), which does support those standards. The following new NWA monitors, replacing the corresponding RWB monitors, have been shipped with the current SP:

  • Message status overview for the Integration Server
  • CPA cache history
  • Cache connectivity test

 

In the message monitor of the RWB it was possible to select a runtime component within your PI domain. Now, this is also supported in the NWA message monitor. This means that you are able to centrally monitor all messages within your PI domain including non-central adapter engines and SAP backend systems connected via ABAP proxies.

Monitoring - Message Overview.png

Furthermore, in the send test message UI you can now upload the payload from a file as an alternative to the payload editor, which is more convenient than copy & paste, especially when testing your scenarios with large files.

Monitoring - Send Test Message.png

 

Copy and resend of successful messages from archive

In previous SPs we introduced a new feature that allows you to copy and re-send already successfully processed messages. The use case is to recover lost messages in your receiving backend system in case a recovery is not otherwise possible. With the previous shipment, it was only possible to copy messages from the data store; however, in most cases those messages would already have been archived. With the current enhancement, copy from archive has been added. Copy from archive behaves like copy from data store, i.e., both modes are supported: copy and immediately send, and copy only.

ResendFromArchive.png

 

Enhanced message flow search for rules in compound flows

As a prerequisite for using the Message Flow Monitor in SAP Solution Manager, the integration scenarios deployed on your landscape, i.e., the so called message flows, are automatically discovered. This discovery is part of the Integration Visibility core component running on your PI system. In case of integration scenarios running through two PI runtime components, e.g., a b2b scenario that connects an SAP Process Orchestration system with a non-central adapter engine running in the DMZ, two message flows are combined in a so called compound flow. In order to be able to discover such compound flows, you need to maintain rules that define which message flow is the predecessor and which is the successor message flow. So far, when defining those rules, the only information available to pick the right message flows was their flow names. Two message flows can actually have the same name and hence it was sometimes hard to find the right message flows. With the current enhancement you have five additional attributes which are displayed in the rules editor: flow id, sender component, sender interface, receiver component, and receiver interface. To be able to see those additional columns you need to switch to the Expert mode.

MFM - Compound flows.png

 

Large file handling for SFTP adapter

The SFTP adapter is part of the PI Secure Connectivity add-on 1.0. We have recently released a new patch for SP4 of the add-on, adding content conversion and large file handling capabilities to the SFTP adapter. The content conversion capabilities are the same as in the file adapter, i.e., supporting conversion from flat file to XML and vice versa. For large file handling we support two approaches: chunking, and file transfer bypassing the PI runtime. In the former mode, incoming large messages are divided into chunks of configurable size and aggregated at PI outbound. In the latter, the large file is moved to a target directory without entering the PI runtime; only the metadata of the file transfer, such as file location, file name and file size, is put into the PI runtime. For more details, see SFTP Adapter - Handling Large File.

LargeFileHandling - Bypassing RT.png

 

For more details of all new 7.31 SP15 / 7.4 SP10 features, refer to the release notes.

Service Interface XML Namespace Swap


I was recently working on a project and was surprised late in the project (I did not read the fine print in the documentation) that the namespace of the XML document in the contract (WSDL) changes when you move from Dev to QA to Prod.

 

The challenge I had was that the WSDL (bottom-up design approach) was used to generate JAXB code for a Java mapping, so I was tightly coupled to the contract (and it would have been too much effort to maintain code per landscape).

 

Possible Options :

 

1. Re-import the WSDL from the third-party system (QA, Prod, ...) and update the references in the service interface and mappings (effort, against governance, will not work for my scenario, plus the additional effort of managing a Java mapping code base per landscape).

 

2. Change the namespace at runtime in an adapter module.

 

 

Option 2 looked good for my problem and resulted in me building this custom adapter module. It is generic and can be used for any interface; it is a configurable component, so you can transport your ESR objects unchanged and use the adapter module to change the namespace.


* Is it good practice? Sure, it falls under message transformation.

 

Attached is the adapter module Java class (saved as .txt) that you can copy and paste into an adapter module project.
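For orientation, here is a minimal sketch of what such a namespace-swapping module can look like. This is not the attached JLIBS-based class: it simply does a text replacement on the main payload, and the module parameter names (sourceNamespace, targetNamespace) are illustrative.

import com.sap.aii.af.lib.mp.module.Module;
import com.sap.aii.af.lib.mp.module.ModuleContext;
import com.sap.aii.af.lib.mp.module.ModuleData;
import com.sap.aii.af.lib.mp.module.ModuleException;
import com.sap.engine.interfaces.messaging.api.Message;
import com.sap.engine.interfaces.messaging.api.XMLPayload;

// Minimal sketch only - not the attached JLIBS-based implementation.
// Reads two channel parameters and replaces one namespace URI with another
// in the main payload. Error handling and EJB plumbing are omitted.
public class NamespaceSwapModuleSketch implements Module {

    public ModuleData process(ModuleContext context, ModuleData inputModuleData)
            throws ModuleException {
        try {
            // Module parameters maintained in the communication channel (illustrative names)
            String sourceNs = context.getContextData("sourceNamespace");
            String targetNs = context.getContextData("targetNamespace");

            Message message = (Message) inputModuleData.getPrincipalData();
            XMLPayload payload = message.getDocument();

            // Simple text-based swap of the namespace URI
            String content = new String(payload.getContent(), "UTF-8");
            payload.setContent(content.replace(sourceNs, targetNs).getBytes("UTF-8"));

            inputModuleData.setPrincipalData(message);
            return inputModuleData;
        } catch (Exception e) {
            throw new ModuleException(e.getMessage(), e);
        }
    }
}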


The adapter project/code depends on the open source JLIBS jar files.

 

The following jar files are required and can be downloaded from the link below:

 

Jar files: jlibs-core.jar, jlibs-xml.jar, and jlibs-xmldog.jar

 

Downloads - jlibs - Common Utilities for Java - Google Project Hosting

 

In addition to the above, you require the standard PI jar files for adapter module development (follow the link below for adapter module development).

 

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c0b39e65-981e-2b10-1c9c-fc3f8e6747fa?overridelayout=t…

 

 

 

Attached are the adapter module configuration steps (communication channel).

 

Adapter module configuration sample

image002.png

Concur - PO: Integration centric process for Standard Accounting Extract (SAE) posting


Update 22 April 2015: Corrected the statement to indicate that OAuth 2.0 was not available during the project timeline but is now available.

 

Introduction

Everyone knows that cloud applications are the hot topic these days. In the area of business travel, Concur is the market leader for cloud-based travel & expense solutions. It is no wonder that SAP acquired Concur recently in SAP's march towards the cloud era.

 

In the area of integration, Concur provides a rich and mature set of RESTful APIs. The online API documentation is very extensive, and it also provides a web-based UI to test those APIs without any external tool or creating custom programs.

 

One of the key areas of integration for Concur is the generation of the Standard Accounting Extract (SAE). Once business expenses are approved in Concur, an extract of the approved expenses can be generated to be fed into a payment system. In SAP's case, this would typically be integrated into the Finance module on ECC as GL/AP postings. At this point in time, there is no prepackaged PI/PO content for integration with Concur. Hopefully with the acquisition, SAP will deliver prepackaged content similar to that available for SuccessFactors and Ariba.

 

In this blog, I will share my experience with the integration-centric process developed on PO's BPM to extract the SAE file from Concur. Consumption of Concur's RESTful API is achieved via Advantco's REST adapter, as it requires the OAuth 2.0 authentication method, which was not yet available on SAP's own REST adapter during the project timeline (it has recently been announced in the PO 7.4 SP10 release). The focus of the blog will be on a high-level overview of the solution and will not go into too much detail on each step of the implementation.

 

 

Standard Accounting Extract Process Flow

In order to extract the SAE file from Concur, 4 different APIs need to be executed in sequence. As such, it cannot be achieved by a single stateless integration scenario in PI. It requires a stateful process, which could theoretically be achieved by an ABAP program; however, it is well suited to being modeled in BPM as there is no business logic involved.

 

Following are the sequence of steps required to extract the SAE file via Concur's Extract Web Service:

  1. Get Extract Definition
  2. Post Extract Job Initiation Request
  3. Get Extract Job Status - Repeat until Status = Completed
  4. Get Extract File

 

Note: Step 1 can be skipped if the Extract Definition is always the same, therefore the same Definition ID is always used for Step 2.

 

 

Challenges

Whilst the sequence of steps is relatively straightforward to implement in BPM (sequentially go through each step), there are various challenges in order to achieve a robust solution.

 

1. Retry mechanism

Step 3 has to be repeated until the status returned by the synchronous call is Completed. Depending on the load and resource availability of the Concur system, this could be as fast as 1 minute or as long as 3-4 hours! As such, a multi-tiered retry interval was implemented so that the step is not repeated every minute, unnecessarily consuming resources on the PO system.

retry.png

 

 

2. BPM Process Starting Trigger

In addition to the RESTful APIs listed above, I added an additional call to the ReportDigests API as the first step that triggers the start of the BPM. This API checks if there are any approved expenses in Concur. If there are no approved expenses, the BPM is terminated normally without further execution of the Extract APIs.

check.png

This first step is triggered by a sender REST communication channel in polling mode with a polling frequency of every hour.

 

 

3. Exception Handling for Step 2 (Post Extract Job Initiation)

Once an extract job for a particular definition has already been submitted to the Concur system, it will be queued in Concur while waiting for available resources. Concur does not allow more than one job for the same definition to be queued at the same time. Therefore if there is a long running job already submitted, a subsequent call of Step 2 will return an application error. A boundary event is defined for the fault message associated to this step to catch the application error. If there is an error, the BPM will be routed to the exception path to trigger an email notification and terminate normally.

term.png

 

 

4. Optimize BPM process context by reducing payload size

After completion of Step 4 to retrieve the generated extract file, the response of the web service call will be returned to the BPM process. It is possible that the response contains a large payload and thus increases the memory usage of the BPM process context. One approach to avoid loading the large response payload into the BPM process is by using the Claim Check Integration Pattern as described in the blog below. However, my system is on a lower SP so it was not available.

Claim Check enhancements in SAP Process Orchestration

 

An alternative approach was used whereby the response payload is zipped and encoded into a Base64 string. The Base64 content is stored in the field shown below within an XML payload, and it is this XML payload that is returned to the BPM process. This conversion is performed via a Java mapping on the response flow of Step 4.

base64.png
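A minimal sketch of such a Java mapping is shown below, assuming the standard AbstractTransformation API. It compresses the response with GZIP and stores the Base64 string in a single wrapper element; the element name and namespace are illustrative assumptions, not the actual field used in the project.

import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;
import javax.xml.bind.DatatypeConverter;

import com.sap.aii.mapping.api.AbstractTransformation;
import com.sap.aii.mapping.api.StreamTransformationException;
import com.sap.aii.mapping.api.TransformationInput;
import com.sap.aii.mapping.api.TransformationOutput;

// Sketch only: zips the response payload and wraps the Base64 string in a
// simple XML envelope. Element name/namespace are illustrative assumptions.
public class ZipBase64ResponseMapping extends AbstractTransformation {

    public void transform(TransformationInput in, TransformationOutput out)
            throws StreamTransformationException {
        try {
            // read the original (large) response payload
            InputStream is = in.getInputPayload().getInputStream();
            ByteArrayOutputStream raw = new ByteArrayOutputStream();
            byte[] buffer = new byte[8192];
            int len;
            while ((len = is.read(buffer)) != -1) {
                raw.write(buffer, 0, len);
            }

            // compress it with GZIP
            ByteArrayOutputStream zipped = new ByteArrayOutputStream();
            GZIPOutputStream gzip = new GZIPOutputStream(zipped);
            gzip.write(raw.toByteArray());
            gzip.close();

            // encode as Base64 and wrap in a small XML payload for the BPM process
            String base64 = DatatypeConverter.printBase64Binary(zipped.toByteArray());
            String xml = "<ns:ExtractFileResponse xmlns:ns=\"urn:example:concur:sae\">"
                       + "<Content>" + base64 + "</Content>"
                       + "</ns:ExtractFileResponse>";

            OutputStream os = out.getOutputPayload().getOutputStream();
            os.write(xml.getBytes("UTF-8"));
        } catch (Exception e) {
            throw new StreamTransformationException(e.getMessage(), e);
        }
    }
}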

 

With this, the size of the payload loaded into the BPM was about 10% of the original payload size - a 90% reduction!

 

 

5. Multiple files from Extraction job

Depending on the extract definition configured in Concur, it is possible to have multiple files as the output of the extraction job. If there are multiple files, the response payload of Step 4 will be a Zip file containing multiple pipe delimited files.

 

In order to handle multiple files in the extraction, a custom Java mapping was developed to unzip the file and create additional attachments from each file in the Zip archive. The following document covers that solution.

Java Mapping: Unzip file and create additional attachments

 

 

References

As this was my first development on NW BPM, after previous ccBPM developments on a PI dual-stack system, the following SCN articles came in very handy to shorten the learning curve for developing in NW BPM as well as for implementing advanced error handling.

NetWeaver BPM for System-to-System Message Orchestration (Including Migration of ccBPM to NW BPM)

Integration Centric Process - Advanced Exception Handling in BPM

 

 

Appendix

Below is the full diagram of the BPM process flow.

Process_GetSAEFile.jpg

13 Reasons to Migrate from SAP PI to SAP PO (and Intelligent Business Operations) – Chalk and Cheese – Key for S/4 HANA


Let’s face it, integrating IT systems and automating processes (often referred to as middleware) isn’t considered as eye-grabbing or interesting as mobile user interfaces.

 

So to grab some attention and prove that UI has a place in middleware, there is a screenshot below from our SAP Process Orchestration server, exposing the available processes using the Fiori Launchpad.

 

POLaunchPad.png

 

Like security, middleware is a specialised area of IT, which if done well, enables the real-time data exchange and governance that is key for the business models demanded by Digital Transformation. If you want to understand this in more detail, I can recommend this OpenSAP Course – it is critical to get the middleware right first, in order to get the most from S/4HANA.

 

For example – do you think Uber batches up your taxi requests? Your customers expect the same service level – competition is a click/tap/swipe away.

 

As I have discussed in this blog, SAP’s middleware portfolio has grown massively in features and maturity since it was introduced as XI in the early 2000s, and I now regularly see SAP middleware winning against pure-play middleware vendors.

 

The purpose of this blog isn’t to demonstrate the long list of newer features, such as: real-time KPI monitoring with OpInt, complex event processing on high-volume data streams using ESP/SDS, monetization of APIs outside your organization with SAP API Management by APIGEE, or cloud integration using HANA Cloud Integration.

 

I want to focus on how you can build the case to move from SAP Process Integration - PI (or any other Middleware solution) to SAP Process Orchestration - PO.


What is SAP Process Orchestration

 

First let’s make sure we all understand what these two products are:

 

SAP Process Integration – This is an integration broker which uses BOTH ABAP and Java stacks to enable message-based integration between IT systems, using a number of adaptors to get messages into and out of these systems.

 

SAP Process Orchestration – This is the combination of SAP Process Integration (as above), SAP Business Process Management (used to create workflows between people and systems) and SAP Business Rules Management (used to enable user-defined business logic required in any business process, e.g. sign-off limits). It also has lots of helpful technical components like the Composite Application Framework (for creating data persistency), Enterprise Content Management (used for document management) and a lightweight version of the SAP Portal capabilities (used for role-based access to content), all running only on the SAP Java stack.

 

Simply put, SAP Process Orchestration encompasses all the tools you need to create the business logic, applications and integration required to plug any gaps between your IT systems – an on-premise extension platform.

License Conversion

 

If you have PI, BPM and BRM licenses, you can get a credit for the full value from SAP to put towards your PO license.

You have to discuss with SAP about how this is turned into a license credit and how much PO will cost you. Also as the move from PI to PO is a migration (can’t do an in place upgrade) you might want to do this over a period of time (to reduce risk), so again you need to discuss with SAP about running both in parallel for the period of the migration.

 

Note to SAP: A clear policy on parallel running during migration would be really helpful.


Migration

 

Many customers I speak to are disappointed that they have to migrate from PI to PO. In my experience, PI systems are often seen as something you don’t touch as long as they work, and customer-side PI skills can often be low because the integration was done during an implementation project by the SI engaged at the time – the customer team is often in support rather than development mode.

 

I would encourage you to look at the migration as an opportunity to spring-clean your middleware landscape – re-implementing problem interfaces using state-of-the-art techniques. You can look at this blog to see the huge variety of integration patterns that are supported by PO.

Migration Tool

 

SAP provide a migration tool that moves/converts the PI content that can be migrated from the PI system to the PO system. The key items that can’t be moved are ccBPM logic (which is replaced by BPM in PO) and ABAP mapping logic (as you don’t have an ABAP stack!).

 

Once migrated you can take advantage of the 13 reasons I have listed below – You can easily build your business case from these.

 

13 Reasons to Migrate from PI (or other middleware product) to PO

 

  1. B2B add-on – With PO you can use the SAP B2B adaptors and trading partner management and remove the need for 3rd party EDI solutions or costly 3rd party adaptors (this alone can justify the business case).
  2. Simplified IT landscape – By running on only the SAP Java stack, your IT landscape is simpler to manage.
  3. More throughput – The move to a Java-only architecture brings a massive boost in performance throughput.
  4. Human to system / workflow – With PO we get access to SAP BPM, which allows for great human (screen) integration and workflows. This means we automate integrations where possible and involve human steps where review or additional data is needed (e.g. turn mapping errors into a business task instead of an IT one). See the screenshot above for the type of processes that can be enabled.
  5. Native business rules – Business rules allow you to empower business users to “flex” the behaviour of integrations / processes. These can be maintained in production, so changes can happen at the speed of business, not the speed of IT. Examples might include lookup values, sign-off limits, % change etc.
  6. Modern user experience – As can be seen above, Fiori has made its way into PO, so where human interaction is required it can be done using HTML5 user interfaces that work on desktops / tablets / phones and watches. The best example is the My Inbox Fiori application.
  7. Easier to develop – Having all the PO tools in one box makes it much easier to develop complete solutions.
  8. Easier to configure / monitor / tune – Because the tools all run on one technical platform, it is much easier to configure, monitor and tune a PO system vs PI. See this great server overview available in PO.
Monitor.png
  9. Value help – The value help framework allows easy integration with Business Suite systems for reference data (e.g. company codes, plants, customer groups etc.) which you might want to use in your applications.
  10. Java Gateway / REST adaptor – These technical tools allow for fast integration with SAP Fiori services and other modern RESTful architectures.
  11. Ready for cloud – The iFlows used in the PO system use artefacts that can also be used in the HANA Cloud Integration product – so if/when you choose to use that tool for cloud-to-cloud or cloud-to-on-premise integration, you can re-use the work you have put into interface definitions and mappings.
  12. Ready for HANA – With PO you can run your middleware on HANA (PI will not be released for HANA), which means you can get real-time insight from the system tables, plug Operational Process Intelligence directly into the process / integration events and use HANA Smart Data Streaming for high-volume IoT scenarios.
OPINT.png
  13. SAP are not developing PI dual-stack anymore – By not being on PO you are locked out of new features. So PI isn’t a burning platform (perhaps smouldering a bit), but you will be locked in the past when it comes to new features.

 

I hope that using the above list you can find the value required to go through the hassle of negotiations with SAP and the migration. With state of the art SAP middleware (including PO) you can be sure that you will not hit any bottlenecks getting data into and out of your S/4 HANA landscape.

 

One of my longer-term PI colleagues goes white when I tell her she has to work on “just PI” now that she is used to PO.


Global Survey for SAP Process Integration 2015


IFG global Survey 2015 for SAP Process Integration

 

As in the previous years, the Global Special Interest Group for Integration conducts its annual survey for SAP Process Integration / Process Orchestration.

 

The focus of the survey is to collect the latest state of the positioning, organization and use of SAP PI/PO and SAP integration in companies around the globe. The target group is employees and consultants from companies using SAP PI or SAP PO as an integration platform.

 

The survey results serve the IFG as a strong basis for the discussion with SAP and for the active influencing of SAP's product and release strategy. In 2013 the survey set a new record with over 350 participants.

 

The survey contains 27 general questions and 11 questions about this year’s special topic 'RESTful web services and cloud integration'. The general questions stay the same each year, so that the survey can identify changes and developments in the use and positioning of PI over the years.

 

The survey for 2015 can be found at the following URL:

International Focus Group for Integration - 2015 Global Survey

 

Time Schedule

  • May 4th 2015: Official start of survey
  • August 21st 2015: End of survey
  • End of September 2015: Publication of results within the SAP user groups and on SAP SCN

 

Recent Blogs

An extract of the results of the surveys can be found in the following blogs:

 


For more background about our interest group see our central blog International Focus Group for Integration.

 

Questions and feedback are welcome.

Consuming Your First Synchronous RESTful Service - Step-by-Step - Introduction


Introduction

 

This is a mini-series of 5 blogs describing how to build a basic SAP-to-REST synchronous interface scenario.

 

Motivation

 

I recently had to use PI to communicate with REST services. As my PI experience was almost nothing, I had to learn some basic PI principles quickly. After struggling with the many different (and often confusing) elements and objects, I managed to call a simple REST service and receive the response. I found many resources that talked about part of what I needed; however, it was a struggle to find everything in one place. So, I'm hoping writing this blog will help others kick-start their "life of PI" and will also help me recall what I did and why. Many thanks to my colleague Jon Viter for his time and help.

 

System Versions

The SAP ECC system I used for this blog is a 7.31 system (SAP_ABA that is; support pack 2).

The SAP PI system I used for this blog is a 7.31 system as well (support pack 15).

Different system versions could look or act slightly differently. Please accommodate those differences in your versions.

 

Scenario

 

This blog is a step-by-step guide for building a simple synchronous REST interface from an SAP ECC system. The key word here is "simple." There are many ways such an interface could be built; there could even be simpler ways that I am not aware of. Again, I can't claim to be an expert in PI, and my goal here really is to show the basic steps to get it working. Once it is working, additional enhancements or modifications for more complicated scenarios are easily done. I based my scenario on this very valuable document: PI REST Adapter - Consuming synchronous RESTful service. The SAP ECC system will send a zip code to the REST service and expects to get the resulting XML message back. Further parsing of the XML message could be done in the SAP ECC system; PI could also be used, but that is not covered in this blog. The interface we are building can be simply represented by this diagram:

https://lh6.googleusercontent.com/-Fgu2NWRvmEY/VV7I7jZ5wCI/AAAAAAAAAqo/Z8VDG4zxnVA/w846-h222-no/00.10%2B-%2BScenario.jpg

 

Prerequisites

 

Before starting to implement this scenario, you need an SAP ECC system and a PI system that can talk to each other. A simple way to test is to check whether the PI ESR is visible from the SAP ECC system. You can use transaction SPROXY or SE80. Either transaction should look similar to this screenshot. If you don't see the "Enterprise Services Browser" selection button in SE80, go to menu option: Utilities -> Settings. Then, in the "Workbench (General)" tab, make sure the "Enterprise Service Browser" checkbox is selected.

https://lh4.googleusercontent.com/-067TBfPE4HI/VV7I7q8jA8I/AAAAAAAAApI/zwYEonOid_I/w774-h938-no/00.20%2B-%2BSPROXY.jpg

 

Another pre-requisite is that you are able and authorized to use the different Process Integration tools. Those are mainly in transaction SXMB_IFR in the PI system. This transaction will open a web page that looks like this.

https://lh6.googleusercontent.com/-t71owgz_7aQ/VV7I7j5VGKI/AAAAAAAAAqs/TFMczBmSyRY/w1534-h610-no/00.30%2B-%2BNW%2BProcess%2BIntegration.jpg

 

The sending SAP ECC system must be defined in the PI system's System Landscape Directory as a business system. If not, you need to work with your PI administrator to add it there. Click the "System Landscape Directory" link and navigate to the "Business Systems" application where you should be able to see it there.

https://lh3.googleusercontent.com/-RzxE3n1IqX4/VV-Kt9Ai77I/AAAAAAAAAxQ/yl4uLA-C34E/w1762-h902-no/00.35%2B-%2BBusiness%2BSystem.jpg

 

Also, you will need to have a Java run time environment installed on your system so that the Enterprise Service Builder and Integration Builder applications can be started from the web page above. Alternatively, you could use the NW Developer Studio. This blog uses the web start applications though.

 

Building Blocks

 

PI has many components and pieces that go together. On a high level, you need to build objects in the Enterprise Service Builder of the Enterprise Services Repository (I will call them ESR objects) and in the Integration Builder of the Integration Directory (I will call them IB objects). The ESR objects are usually transportable, while IB objects are system-specific, so they will need to be created for each PI system (development, QA, production, etc.). In an ESR object, you say your sender system is "an" SAP system. In an IB object, you must specify which specific system that is, providing system details from the SLD.

https://lh6.googleusercontent.com/-4xzPp3zHLFE/VV7I8P1Aq-I/AAAAAAAAApU/YO8NFh527DU/w1436-h606-no/00.40%2B-%2BESR%2Bvs%2BIB.jpg

 

Here is a diagram of the different ESR objects that we need to create:

https://lh6.googleusercontent.com/-GTE805f5Ro4/VV9crsP8aqI/AAAAAAAAAtM/tDLiUhFAEOg/w2756-h1036-no/00.50%2B-%2BESR%2BObjects.jpg

 

  • Data Type: This is the basic structure in which data will be formatted. It could be simple or complex structures. Data types could include fields of other data types (nested structures).
  • Message Type: This is the container that will carry the data.
  • Error Message Type (optional): This is the container to transport errors. The structure is auto-generated and can be enhanced by additional fields if desired.
  • Service Interface: This is a collection of request and response message types. An outbound service interface is the interface from the sender system to the PI system. An inbound service interface is the interface from the PI system to the receiver system. The direction is with regard to the sending system and NOT with regard to the PI system.
  • Message Mapping: This object is how one message type is mapped to another. This involves mapping fields and writing transformation or formatting rules if necessary such as value mapping or date formatting.
  • Operation Mapping: This object is used to map interfaces by assigning different message mappings to requests and responses.

 

Here is a diagram of the different IB objects that we need to create:

https://lh3.googleusercontent.com/-D1jUvJbifaw/VV7I8ZVIQvI/AAAAAAAAAqg/HXXLg98mqJM/w2760-h992-no/00.60%2B-%2BIB%2BObjects.jpg

 

  • Business System: This is the specific SAP system of the specific integration scenario we are building.
  • Business Component: This is the third-party system of the specific integration scenario we are building.
  • Communication Channel: This is an object that indicates how PI communicates with the business system or business component for a given service interface (whether outbound or inbound).
  • Sender Agreement: This is an object to see whether a sender on a specific communication channel is actually eligible to participate in the scenario (eligible to send a message of a defined type).
  • Receiver Determination: This is an object that checks who the sender is and what message is being sent and accordingly determines who should receive it.
  • Interface Determination: This is an object that checks who the receiver is for a given message and accordingly determines what receiver interface to be used and what operation mapping to use.
  • Receiver Agreement: This is an object that checks who the sender is and who the receiver is and what interface they used to determine what communication channel to forward the message to.

 

Let's get to work

 

Now, after the brief overview above, we can start working on building the different objects. There are four major steps. Follow each link for details on each step.

  1. Prepare a software component version (SWCV) and a namespace
  2. Build ESR objects
  3. Build IB objects
  4. Generate a proxy class in the SAP ECC system and build a simple ECC application to call the interface

 

See you shortly in the second part.


Consuming Your First Synchronous RESTful Service - Step-by-Step - Preparation


Introduction

 

This is the second part of a mini-series blog about how to build a basic SAP-to-REST synchronous interface scenario. The blog's first part can be found here:  Consuming Your First Synchronous RESTful Service - Step-by-Step - Introduction. This part is to prepare for building our design-time and integration objects.

 

Let's get to work

 

In this part, a software component version (SWCV) and a namespace will be created. We didn't talk about this earlier. It is mainly intended for organizing and releasing objects. For more information about this topic, please refer to this document (PI Best Practices Naming Conventions). For now, we will just create a new software component version that we will be using.

    • Go to SLD

https://lh6.googleusercontent.com/-Rl4SsMLCg70/VV7I8a61CtI/AAAAAAAAApg/yU0hxJK9_Gs/w952-h604-no/01.10%2B-%2BSLD.jpg

    • Click the "Products" link in the "Software Catalog"

https://lh4.googleusercontent.com/-enWrQdV1XY0/VV7I82v9pUI/AAAAAAAAAqc/w09X3SrIN2c/w1138-h780-no/01.20%2B-%2BSLD%2BProducts.jpg

    • Click the "New" button

https://lh3.googleusercontent.com/-8WB3GgY5VfM/VV7I89MP7nI/AAAAAAAAApw/rdUZA-RvB2g/w718-h810-no/01.30%2B-%2BSLD%2BProducts%2BNew.jpg

    • Select the "Create a new product and version" radiobutton then click "Next"

https://lh4.googleusercontent.com/--ZKT0T7sZC0/VV7I9cxIqaI/AAAAAAAAAqQ/H4rkfPJTxig/w1102-h596-no/01.40%2B-%2BSLD%2BProducts%2BNew%2BRadiobutton.jpg

    • In the product details screen, enter a product name, vendor, and a product version then click "Next"

https://lh6.googleusercontent.com/-3ZKXuY5uItg/VV7I9alXH3I/AAAAAAAAAp8/kyDpzrcqS8w/w746-h656-no/01.50%2B-%2BSLD%2BProduct%2BDetails.jpg

    • In the product instance screen, enter an instance name then click "Next"

https://lh5.googleusercontent.com/-YDkkea4TDN4/VV7I9vYqtwI/AAAAAAAAAqM/4iN93bwwqDY/w852-h660-no/01.60%2B-%2BSLD%2BProduct%2BInstance.jpg

    • In the software component version details screen, enter a software component name and version then click "Finish"

https://lh5.googleusercontent.com/-UtVzITS3Fhk/VV7I90wLbuI/AAAAAAAAAqI/sc8-W5OpgPM/w1234-h614-no/01.70%2B-%2BSLD%2BProduct%2BSWCV.jpg

    • Now, if you search for the product you just created, you should be able to find it.

https://lh3.googleusercontent.com/-2JJyMzrzM4w/VV7I-LshLDI/AAAAAAAAAqU/l5a0D7VetqM/w1644-h1142-no/01.80%2B-%2BSLD%2BProduct%2BCreated.jpg

 

This is it. See you in the third part.

Consuming Your First Synchronous RESTful Service - Step-by-Step - ESR Objects


Introduction

 

 

This is the third part of a mini-series blog about how to build a basic SAP-to-REST synchronous interface scenario. The blog's first part can be found here:  Consuming Your First Synchronous RESTful Service - Step-by-Step - Introduction. This part is to build all Enterprise Service Repository objects (design-time objects). The objects we will be building are shown in the PI dashed box in this diagram:

https://lh3.googleusercontent.com/-GTE805f5Ro4/VV9crsP8aqI/AAAAAAAAAtM/tDLiUhFAEOg/w2234-h840-no/00.50%2B-%2BESR%2BObjects.jpg

 

Let's get to work


Creating ESR Objects:

    • The objects will be created in the following order: Data Types, Message Types, Service Interfaces, Message Mappings, then Operation Mappings.
    • Start the "Enterprise Service Builder" by clicking the link on the NW Process Integration page shown above (transaction SXMB_IFR).
    • Once the tool starts, you need to import the SWCV we created above. This needs to be done only once.
      • Click the "Create Object" button or select it from menu option: Object -> Create. Note that this is how you start the creation of any object in ESR.

https://lh6.googleusercontent.com/-rNzJBzRM3R8/VV7I-AOU0EI/AAAAAAAAAqY/tGdrFM1VtCo/w668-h460-no/02.10%2B-%2BESR%2BImport%2BSWCV.jpg

    • In the "Create Object" pop-up screen, expand the "Work Areas" node and select object type Software Component Version. Then, in the details section, select the "Import from SLD" radiobutton then click the "Display" button.
    • Search for the SWCV created earlier, select it, and then click the "Import" button.
    • Once the SWCV is imported, verify the information and click the "Create" button.

https://lh6.googleusercontent.com/-Kr6b5E1P7mw/VV7JAKn8mNI/AAAAAAAAAss/XlwRqG9PmHc/w2184-h1338-no/02.20%2B-%2BESR%2BImport%2BSWCV%2BDetails.jpg

    • In the SWCV details screen, you need to do two things: select a language and create a namespace. The PI Best Practices Naming Conventions document talks in detail about how many namespaces you should create and the reasoning behind that. For the purposes of this blog, we will create only one namespace that will be used for all ESR objects.

https://lh6.googleusercontent.com/-txX6_XoATfA/VV7JAJYQiFI/AAAAAAAAArE/cYAYe6BWQCU/w1316-h1132-no/02.30%2B-%2BESR%2BSWCV%2BDetails.jpghttps://lh5.googleusercontent.com/-TnZxPkz8ZOE/VV7JAHyndZI/AAAAAAAAAso/DQiisnX5cWw/w1504-h476-no/02.40%2B-%2BESR%2BNamespace.jpg

    • Now, you should see the SWCV in the Design Objects menu. From there, all other objects will be created.

https://lh3.googleusercontent.com/-0odEBLqusRw/VV7JApFgK1I/AAAAAAAAAsk/qGoZvEfyiBM/w1256-h546-no/02.50%2B-%2BESR%2BSWCV%2BDesign%2BObjects.jpg

    • Now, we need to create data types that will "describe" what the data looks like.
      • We need to know what types of data we will be sending from the SAP ECC system and what types of data we will be receiving back. For this blog, the SAP ECC system will send a zip code and will want the response of the RESTful service sent back. Calling Google's Maps API REST service http://maps.googleapis.com/maps/api/geocode/json?address=10001 returns a JSON message that looks like the screenshot below. You need to notice three things: the zip code in the REST service parameter list, the "results" tag, and the "status" tag. The first is the service request, while the other two make up the response we will be reading back. Usually, you create two data types for a request (DT1 and DT2 from the ESR Objects diagram above): the first one is for the data that goes from the sender to the PI system and the other is for the data that goes from the PI system to the receiver system. Similarly, you usually create two response data types (DT3 and DT4 from the ESR Objects diagram above). Optionally, you can create error data types (two of them as well) for transferring error details.

https://lh3.googleusercontent.com/-VQxEJY-PAdM/VV7JAr-dWsI/AAAAAAAAArY/38S78BU7f-4/w1176-h1856-no/03.10%2B-%2BData%2BTypes%2BAPI.jpg

    • To create a data type, you can click the "Create Object" button (just like you did when you imported the SWCV) or you can simply right-click the newly created namespace then select the "New" option.

https://lh5.googleusercontent.com/-yvFRUq9aKN8/VV7JA4mZMUI/AAAAAAAAArQ/mEFiYNYowSw/w934-h632-no/03.20%2B-%2BData%2BTypes%2BCreate%2BButton.jpg

    • Create a data type for the zip code field by selecting the "Data Type" option in the "Interface Objects" node. Enter the data type name and namespace (namespace will be defaulted if you created the object by right-clicking the namespace). Then, click the "Create" button.

https://lh6.googleusercontent.com/-LqG3Teq6Vko/VV7JBa7ivqI/AAAAAAAAAsc/-LIZORkD4mw/w1836-h844-no/03.30%2B-%2BData%2BTypes%2BCreate%2BScreen.jpg

    • In the data type details screen, select the first line. Then, click the "Insert New Lines" button and select the "Insert Subelement" option.

https://lh4.googleusercontent.com/-2oQ-5MeBby0/VV7JBflNE1I/AAAAAAAAAsg/YrPJRPHhw8Y/w1044-h654-no/03.40%2B-%2BData%2BTypes%2BCreate%2BDetails.jpg

    • Type "zipcode" in the name field and type "xsd:string" in the type field. You can also select the XSD type from the field's value help list. Then, click the "Save" button.

https://lh5.googleusercontent.com/-Lk5Mo13PbYY/VV7JCB7EBfI/AAAAAAAAArs/Ah9_LJsVgA4/w1126-h676-no/03.50%2B-%2BData%2BTypes%2BCreate%2BDetails%2B2.jpg

    • Now, we need to create a data type for the response of the REST service.

https://lh5.googleusercontent.com/-FIq7XdWWvN8/VV7JCcNd4-I/AAAAAAAAAsY/wppZpNaPlog/w1452-h452-no/03.60%2B-%2BData%2BTypes%2BOutput%2BDetails%2B1.jpg

    • In the data type details screen, insert two new subelements: "results" & "status." Both are of type "xsd:string." Notice that you need to use the names of the JSON/XML tags that you want to read. Hint: if you want to use pure XML mapping in PI, you need to build the exact XML structure with the appropriate nesting in the data type that will be used for transferring data from the REST service to the PI system. A rough XSD sketch of both data types follows the screenshot below.

https://lh4.googleusercontent.com/-0DxARgwVeLw/VV7JCVfqJ8I/AAAAAAAAAsU/rOmfYiJt91E/w1796-h740-no/03.70%2B-%2BData%2BTypes%2BCompleted.jpg
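For reference, the two data types correspond roughly to the following XSD fragments. This is only a sketch: the data type names (DT_ZipCodeRequest, DT_GeoLocationResponse) are assumptions, so substitute whatever names you chose in your own SWCV.

<!-- Sketch only: illustrative names, not copied from the screenshots -->
<xsd:complexType name="DT_ZipCodeRequest">
  <xsd:sequence>
    <xsd:element name="zipcode" type="xsd:string"/>
  </xsd:sequence>
</xsd:complexType>

<xsd:complexType name="DT_GeoLocationResponse">
  <xsd:sequence>
    <xsd:element name="results" type="xsd:string"/>
    <xsd:element name="status" type="xsd:string"/>
  </xsd:sequence>
</xsd:complexType>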

    • Great! All the data types that we need are created now as you can see on the left side of the above screenshot.
    • The next step is to create message types that will carry the data described by the above data types.
      • Again, just like with data types, you usually create two request message types (from the sender to PI and from PI to the receiver: MT1 and MT2 from the ESR Objects diagram above) and two response message types (from the receiver to PI and from PI to the sender: MT3 and MT4 from the ESR Objects diagram above). However, for simplicity, we will use one message type for the request and one message type for the response.
      • Right-click the namespace to select the "New" option. Then, expand the "Interface Objects" node to select the "Message Type" object. Type a message type name for the request zip code then click the "Create" button.

https://lh4.googleusercontent.com/-gYJ43Gq6CQE/VV7JCm2sa0I/AAAAAAAAAsQ/F4PLE-BUC8s/w1392-h414-no/04.10%2BMessage%2BType%2BCreate.jpg

    • In the message type details screen, type the name of the zip code data type (or select it from the value help menu). You will notice the data type fields will be displayed automatically in the message type structure section.

https://lh5.googleusercontent.com/-UZCGCTaZuss/VV7JCzPfCXI/AAAAAAAAAsI/8FFZyL_x_W0/w1492-h788-no/04.20%2BMessage%2BType%2BDetails.jpg

    • Similarly, create a message type for the response location details.

https://lh4.googleusercontent.com/-T-zxT52ZqAE/VV7JDG02yHI/AAAAAAAAAsA/vzwKgvGBHGA/w1516-h430-no/04.30%2BMessage%2BType%2BResponse%2BCreate.jpghttps://lh5.googleusercontent.com/-DcTD6DEqauY/VV7JDKmVIYI/AAAAAAAAAsE/mgLGqS1e0QA/w1258-h832-no/04.40%2BMessage%2BType%2BResponse%2BDetails.jpg

    • Save the message type and verify the message types exist under the namespace.

https://lh5.googleusercontent.com/-UMvP0FMbymQ/VV7JDfkXVzI/AAAAAAAAAsM/HwFGo_QIeZI/w680-h496-no/04.50%2B-%2BMessage%2BTypes%2BCreated%2BSuccessfully.jpg

    • Now, we need to create the outbound (sender -> PI) and inbound (PI -> receiver) interfaces (OBI and IBI from the ESR Objects diagram above).
      • Click the "Create New Object" button or right-click the namespace then select the "New" option. In the pop-up screen, expand node "Interface Objects" and select the "Service Interface" object type. Type the outbound service interface name and click the "Create" button.

https://lh4.googleusercontent.com/-G2GARu7Oj3A/VV9lJrt4E0I/AAAAAAAAAt0/ahcU_NZJ5pA/w2062-h750-no/05.10%2B-%2BOB%2BService%2BInterface%2BCreate.jpg

    • In the details screen, make sure the interface category is "Outbound" and the mode is "Synchronous." Then, type the request and response message types (that we created earlier) in the appropriate fields. You can also select them from the field's value help menu. Then, click the "Save" button. You can enter a "fault" message type, but we will ignore that for this blog.

https://lh6.googleusercontent.com/-kRfLPErBGu0/VV98StiWIzI/AAAAAAAAAvc/9WEZjqytcVc/w1958-h1286-no/05.20%2B-%2BOB%2BService%2BInterface%2BDetails.jpg

    • Similarly, start the creation of the inbound service interface.

https://lh5.googleusercontent.com/-pFl-i4hrP5g/VV9lJksfixI/AAAAAAAAAto/JRZdRp10YEs/w1616-h616-no/05.30%2B-%2BIB%2BService%2BInterface%2BCreate.jpg

    • In the details screen, make sure the interface category is "Inbound" and the mode is "Synchronous." Then, type or select the request and response message types (just like in the outbound service interface). Then, click the "Save" button.

https://lh6.googleusercontent.com/-0Dj1irQgD54/VV9lKaOpOkI/AAAAAAAAAuA/3Y7N-sSqvfU/w2206-h1298-no/05.40%2B-%2BIB%2BService%2BInterface%2BDetails.jpg

    • Verify the service interfaces are now created in the namespace.

https://lh4.googleusercontent.com/-H5Og7X9fUhU/VV9lKUleXDI/AAAAAAAAAt4/T-OhCBaZZyQ/w644-h620-no/05.50%2B-%2BService%2BInterfaces%2BCreated.jpg

    • Now, it is time to create message mappings for the request and response (MM1 and MM2 from the ESR Objects diagram above).
      • Click the "Create New Object" button or right-click the namespace then select the "New" option. In the pop-up screen, expand node "Mapping Objects" and select "Message Mapping." Type in a message mapping name then click the "Create" button.

https://lh3.googleusercontent.com/-DtoX6E_yUaA/VV92bpF8HLI/AAAAAAAAAuk/g8CKvue2l94/w2142-h756-no/06.10%2B-%2BMsg%2BMapping%2BCreate.jpg

    • In the message mapping details, click the "Select Mapping" button in both the "source" and "target" sections. Select the same message type as shown in the screenshot.

https://lh3.googleusercontent.com/-aTs6KYbjN4A/VV92bngW5oI/AAAAAAAAAvA/y6m-cpclFOM/w2256-h882-no/06.20%2B-%2BMsg%2BMapping%2BDetails.jpg

    • Once both message types are visible in the grid area, you need to map the source field (zipcode) to the target field (zipcode). You can either drag-and-drop the source field onto the target field or add both fields to the mapping area (by double-clicking) then map the fields there. You should have something that looks like the screenshot below.

https://lh6.googleusercontent.com/-2YPqllY4UFA/VV92bRrOiyI/AAAAAAAAAvE/Ac6ai2gegL8/w2118-h992-no/06.30%2B-%2BMsg%2BMapping%2BFields.jpg

    • Similarly, create a message mapping for the response.

https://lh6.googleusercontent.com/-KGVWCmNPuks/VV92b4vmPcI/AAAAAAAAAu0/pT4PAvL7oS0/w1514-h478-no/06.40%2B-%2BMsg%2BMapping%2BResponse%2BCreate.jpg

    • Similarly, select the response message types.

https://lh4.googleusercontent.com/-JupKoDcpunY/VV92cCL2eMI/AAAAAAAAAuw/FNFW3QVlmMg/w2238-h708-no/06.50%2B-%2BMsg%2BMapping%2BDetails.jpg

    • Map the "results" and "status" fields to the "results" and "status" fields.

https://lh3.googleusercontent.com/-8kXt8TYdUIE/VV92cbx8xAI/AAAAAAAAAu8/27uO1u-4tIo/w2100-h1188-no/06.60%2B-%2BMsg%2BMapping%2BFields.jpg

    • Drag and drop the "results" field into the lower mapping area. Then, right-click the "results" icon and select the "Return as XML" option (which should show now as checked).

https://lh6.googleusercontent.com/-8LWoAJEDnSg/VWLHxTL4a7I/AAAAAAAAA5g/yfeX6-3XUrQ/w716-h536-no/06.63%2B-%2BMsg%2BMapping%2BXML.jpg

    • Then, in the lower toolbar, select "Text" from the function drop down list.

https://lh4.googleusercontent.com/-yzvQEwJGwUg/VWLHxU7aWpI/AAAAAAAAA5k/JQ87GiyD-dc/w2544-h678-no/06.65%2B-%2BMsg%2BMapping%2BTrim%2BFunction.jpg

    • Now, drag and drop the "trim" function to the mapping area. Map the source "results" to the "trim" function. Then, the output of the "trim" function should be mapped to the target "results" field. A short sketch of what these two settings accomplish follows the screenshot below.

https://lh6.googleusercontent.com/-udppPww5a2A/VWLHxSJ_n8I/AAAAAAAAA5o/0HQAjk30kV0/w746-h456-no/06.68%2B-%2BMsg%2BMapping%2BTrim.jpg
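To see why "Return as XML" and "trim" are used, keep in mind that after the JSON response is converted to XML, "results" is not a simple value but a whole subtree. A rough, simplified sketch (the element names below are illustrative; the real converted payload contains more fields):

<!-- Simplified sketch of the converted source "results" node -->
<results>
  <formatted_address>New York, NY 10001, USA</formatted_address>
  <geometry>...</geometry>
</results>
<!-- With "Return as XML" checked, this whole fragment is passed to the mapping as a single text value;
     "trim" then removes the surrounding whitespace before it is written to the target "results" string field -->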

    • Finally, verify the message mapping objects are created in the namespace.

https://lh6.googleusercontent.com/-lwG81tf0wSA/VV92cQoBCkI/AAAAAAAAAu4/a9itAX8faqw/w686-h790-no/06.70%2B-%2BMsg%2BMapping%2BObjects%2BCreated.jpg

    • The last object to be created in the ESR is the Operation Mapping object (object OM from the ESR Objects diagram above).
      • Click the "Create New Object" button or right-click the namespace then select the "New" option. Expand the "Mapping Objects" node and select the "Operation Mapping" object type. Type an operation mapping name and click the "Create" button.

https://lh3.googleusercontent.com/-OJvgNKerglI/VV99NEqWz6I/AAAAAAAAAwE/lsxYrOENBP0/w2108-h842-no/07.10%2B-%2BOperation%2BCreate.jpg

    • Select the outbound service interface in the source operation section and the inbound service interface in the target operation section. Then, click the "Read Operations" button.

https://lh5.googleusercontent.com/-GsurJLg4hkY/VV99NIS8__I/AAAAAAAAAv8/hwnlyderr2Y/w2132-h834-no/07.20%2B-%2BOperation%2BAdd%2BOperations.jpg

    • In the "Request" tab, select the request message mapping.

https://lh5.googleusercontent.com/-I9t2WyGGvJw/VV99M2fGOHI/AAAAAAAAAwM/jkSaPBuRQL0/w2102-h402-no/07.30%2B-%2BOperation%2BRequest.jpg

    • In the "Response" tab, select the response message mapping.

https://lh5.googleusercontent.com/-3MkmHG4Sdqo/VV99NT1Kg7I/AAAAAAAAAwA/FE-mE9xpw4w/w2172-h380-no/07.40%2B-%2BOperation%2BResponse.jpg

    • Finally, verify that the operation is now created in the namespace.

https://lh3.googleusercontent.com/-Gdw7aA7lfa4/VV99NqBCzII/AAAAAAAAAwI/fjwzAxJNMNc/w636-h404-no/07.50%2B-%2BOperation%2BCreated.jpg

    • Now that all of our ESR objects are created, we need to activate them.
      • Go to the "Change List" tab and expand the tree all the way.

https://lh5.googleusercontent.com/-bJ81f1gGkyI/VV-G5Pr4YaI/AAAAAAAAAww/-QmvmfTRNUQ/w856-h752-no/08.10%2B-%2BESR%2BActivation.jpg

    • Right-click the "Standard Change List" and select the "Activate" option.

https://lh3.googleusercontent.com/-6TV6SJDMgCs/VV-G5A2OG-I/AAAAAAAAAw0/x226G4ZhGNk/w876-h284-no/08.20%2B-%2BESR%2BActivation%2BButton.jpg

    • In the confirmation pop-up screen, select all the objects we just created and then click the "Activate" button.

https://lh3.googleusercontent.com/-djFhRJBy8GE/VV-G5Ikyw4I/AAAAAAAAAw4/h0kFByPWXns/w998-h896-no/08.30%2B-%2BESR%2BActivation%2BConfirmation.jpg

    • Once the list is confirmed, your "Change List" should look empty now.

https://lh6.googleusercontent.com/-wI4l_JTjHlQ/VV-G5re8rdI/AAAAAAAAAxA/EGINUkJbP-A/w736-h288-no/08.40%2B-%2BESR%2BEmpty%2BChange%2BList.jpg

 

Now, it is time to build integration builder objects. See you in the fourth part.

Consuming Your First Synchronous RESTful Service - Step-by-Step - Integration Builder Objects

$
0
0

Introduction

 

 

This is the fourth part of a mini-series blog about how to build a basic SAP-to-REST synchronous interface scenario. The blog's first part can be found here:  Consuming Your First Synchronous RESTful Service - Step-by-Step - Introduction. This part is to build all Integration Builder objects (integration system-specific objects). The objects we will be building are shown as all the colored objects in this diagram:


https://lh5.googleusercontent.com/-D1jUvJbifaw/VV7I8ZVIQvI/AAAAAAAAAqg/HXXLg98mqJM/w2236-h804-no/00.60%2B-%2BIB%2BObjects.jpg


Let's get to work


Creating IB Objects:

    • The objects will be created in the following order: Business System, Business Component, Communication Channels, Sender Agreement, Receiver Determination, Interface Determination, then Receiver Agreement.
    • Start the "Integration Builder" by clicking the link on the NW Process Integration page shown above (transaction SXMB_IFR).
    • Once the tool starts, it will look very similar to the ESR tool. We will need to create a configuration scenario to group all of our objects together.
      • Click the "Create Object" button. Expand node "Administration" and select object type "Configuration Scenario." Type a configuration scenario name then click the "Create" button.

https://lh5.googleusercontent.com/-WDHoPnvXl84/VV-KvQSXNPI/AAAAAAAAAxo/-Rnaca4s1q0/w1824-h1210-no/09.10%2B-%2BConfig%2BScenario.jpg

    • In the "Configuration Scenario" details screen, click the "Save" button. You can use this screen to create other objects if you like.

https://lh4.googleusercontent.com/-aLgikOu3lj0/VV-KvGhUPrI/AAAAAAAAAxg/qZxzfCvipOo/w1226-h546-no/09.20%2B-%2BConfig%2BScenario%2BDetails.jpg

    • You can see the configuration scenario created successfully.

https://lh6.googleusercontent.com/-R6XMUfR9khI/VV-KvZR4ojI/AAAAAAAAAxk/gSl4Gq0NaWQ/w664-h774-no/09.30%2B-%2BConfig%2BScenario%2BCreated.jpg

    • To get a view similar to the one above, you need to change the "Object View" to "Configuration Scenario View".

https://lh5.googleusercontent.com/-IlswPeQWX44/VV-i_9yXGLI/AAAAAAAAAys/dbbLi8giADA/w546-h318-no/09.40%2B-%2BConfig%2BScenario%2BView.jpg

    • Now, we need to add the SAP ECC Business System to the configuration scenario.
      • Expand the node of the configuration scenario you just created, then right-click the "Business System" object type and select the "Assign Business System" option.

https://lh6.googleusercontent.com/-WNnxjVw1WZU/VV-OsAS6FbI/AAAAAAAAAyM/CGJdsgSr7TY/w678-h542-no/10.10%2B-%2BBusiness%2BSystem%2BCreate.jpg

    • You will be presented with a wizard-like page on which you need to click "Continue." When asked for a "Party" or to "Assign Partner", you can leave the field blank. After that, you need to select the sending SAP ECC system from the list of business systems. This system must have already been defined in the PI SLD as mentioned in the pre-requisite section. Once you select the business system, you need to click the "Finish" button.

https://lh3.googleusercontent.com/-zmKDyiXVE7s/VV-OsFS6PZI/AAAAAAAAAyQ/jWzVd8kBWOI/w1296-h1336-no/10.20%2B-%2BBusiness%2BSystem.jpg

      • Note that if the business system is not available in this list (and you are sure it is available in the SLD), it might have already been assigned to another configuration scenario. That is OK. What you need to do is find which configuration scenario it is defined in. Then, right-click the business system there and select the "Assign Configuration Scenario" option.

https://lh3.googleusercontent.com/GeHVHi1vplTa_1JSZz3tV-GgQ_K2U4Qq1W3xDul5brQ=w371-h373-no

      • In the pop-up screen, select the new configuration scenario you want to add the business system to. The configuration scenarios that are currently assigned to the business system will be already checked.

https://lh3.googleusercontent.com/2HtXuaT-QDdl9JdmPNzX2Q6j4A02RjJKltuodWOduRk=w494-h551-no

    • Verify that the business system is added to the configuration scenario.

https://lh5.googleusercontent.com/-CeUCa-DrKNI/VV-OsN2VWMI/AAAAAAAAAyU/Df2_g10bfVM/w1724-h698-no/10.30%2B-%2BBusiness%2BSystem%2BCreated.jpg

    • The next IB object is a Business Component to represent the third-party system that hosts the RESTful service.
      • Right-click the "Business Component" object type or just click the "Create Object" button. Expand the "Collaboration Profile" node in the pop-up screen and select "Business Component" object type. Then, click the "Create" button.

https://lh5.googleusercontent.com/-jQTY9YXUxYU/VV-m6au5y0I/AAAAAAAAAzU/FfqjrUVzoPs/w1908-h810-no/11.10%2B-%2BBiz%2BComp%2BCreate.jpg

    • In the business component details screen, make sure you are on the "Receiver" tab to select the inbound interface we created earlier in the ESR using the value help menu. Then, click the "Save" button.

https://lh3.googleusercontent.com/-9sGdiLBzlS8/VV-m6EH8-uI/AAAAAAAAAzI/jHENX-JuiUs/w1958-h1030-no/11.20%2B-%2BBiz%2BComp%2BReciever%2BSelection.jpg

    • Verify the business component is now created in the configuration scenario.

https://lh6.googleusercontent.com/-EBnygAYu-uU/VV-m6GHmxaI/AAAAAAAAAzM/-Gh4sryHetI/w1612-h930-no/11.30%2B-%2BBiz%2BComp%2BCreated.jpg

    • If you can't find it, it might be in the "Unassigned Objects" folder. You can right-click an object in that list and select the "Assign Configuration Scenario" option to move it to your configuration scenario.

https://lh6.googleusercontent.com/-HmIV9DU_l0Y/VV-m6s2ORbI/AAAAAAAAAzQ/ypgU_iT_3hY/w472-h344-no/11.40%2B-%2BAssign%2Bto%2BConfig%2BScenario.jpg

    • Next, we need to create a communication channel for the sender.
      • Right-click the sender Business System and select the "New" option (or click the "Create Object" button). Expand the "Collaboration Profile" node and select "Communication Channel" object type. Type a name and make sure the correct communication component and scenario names are selected. Then, click the "Create" button.

https://lh4.googleusercontent.com/-cgTMChSuhPE/VV-qxOyaRbI/AAAAAAAAAz4/80tlOMl8J44/w1788-h774-no/12.10%2B-%2BComm%2BChannel%2BSender%2BCreate.jpg

    • In the communication channel details screen, select the SOAP adapter type from the value help menu.

https://lh3.googleusercontent.com/-DHCpbTbRl90/VV-uTLzEekI/AAAAAAAAA0Y/zaYFRVYoYFI/w2240-h1480-no/12.20%2B-%2BComm%2BChannel%2BSender%2BAdapter%2BType.jpg

    • Change the message protocol to "XI 3.0" then click the "Save" button.

https://lh5.googleusercontent.com/-4q4DIRQDWzU/VV-qw_bxbsI/AAAAAAAAAz0/etAOikCRA04/w1186-h760-no/12.30%2B-%2BComm%2BChannel%2BSender%2BMsg%2BProtocol.jpg

    • Verify the communication channel has been created in the configuration scenario.

https://lh5.googleusercontent.com/-iHBvF5AQACg/VV-qxWtTHUI/AAAAAAAAAz8/BBb67lPY5SE/w1792-h592-no/12.40%2B-%2BComm%2BChannel%2BSender%2BCreated.jpg

    • Now, we need to create a communication channel for the receiver.
      • Right-click the receiver Business Component and select the "New" option (or click the "Create Object" button). Expand the "Collaboration Profile" node and select "Communication Channel" object type. Type a name and make sure the correct communication component and scenario names are selected. Then, click the "Create" button.

https://lh4.googleusercontent.com/-7gcmfz1hv_c/VV-5FmvNA_I/AAAAAAAAA1I/p3ZqYCyoSsE/w1754-h874-no/13.10%2B-%2BComm%2BChannel%2BReceiver%2BCreate.jpg

    • In the communication channel details screen, select the REST adapter type from the value help menu.

https://lh6.googleusercontent.com/-i-YNyGEZknc/VV-6BE7W2xI/AAAAAAAAA2A/jjC1p5Kra5k/w2094-h1394-no/13.20%2B-%2BComm%2BChannel%2BReceiver%2BAdapter.jpg

    • Change the communication channel to a "receiver" as shown below.

https://lh5.googleusercontent.com/-TZDw7vjejMI/VV-5Fr_4CaI/AAAAAAAAA04/vMPFh1EcJG4/w852-h514-no/13.30%2B-%2BComm%2BChannel%2BReceiver%2BType.jpg

https://lh4.googleusercontent.com/-LjaulC-Vd-4/VV-5GOzxGLI/AAAAAAAAA1M/sc-tHyQJjMM/w1380-h374-no/13.40%2B-%2BComm%2BChannel%2BReceiver%2BURL.jpg

    • In the same tab, change the variable source to "XPath Expression." Then, in the pattern element name, type the name of the variable defined in the URL. The other field is the "XPath Expression," which is basically the XML tag of the field in the interface message; this is exactly the name of the field in the data type of the message that goes from the PI system to the receiver system (DT2 in the ESR Objects diagram). An illustrative payload and URL pattern follow the screenshot below.

https://lh3.googleusercontent.com/-4ZOkYF4NHJ4/VV-5GZVg_hI/AAAAAAAAA1g/kkcAsaDMCvY/w2878-h1414-no/13.50%2B-%2BComm%2BChannel%2BReceiver%2BURL%2BParam.jpg
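As a concrete illustration (a sketch only; the message type name, namespace, and exact XPath are assumptions based on the objects built in the previous part), the payload arriving at the receiver channel and the matching URL configuration could look like this:

<!-- Request payload from PI to the receiver channel (DT2/MT2); names are illustrative -->
<ns0:MT_ZipCodeRequest xmlns:ns0="http://yourcompany.com/pi/rest_demo">
  <zipcode>10001</zipcode>
</ns0:MT_ZipCodeRequest>
<!-- Channel URL pattern:   http://maps.googleapis.com/maps/api/geocode/json?address={zipcode}
     Pattern variable name: zipcode
     Value source:          XPath Expression, e.g. //zipcode -->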

    • Switch now to the "REST Operation" tab and select the "GET" operation in the "Static Value" field. The "Value Source" field should already be set to "Manual Value" and if not, you should change it to that value.

https://lh5.googleusercontent.com/-C9x-lYp07hc/VV-5GdaYrxI/AAAAAAAAA1Y/YHheg110bqU/w928-h394-no/13.60%2B-%2BComm%2BChannel%2BReceiver%2BREST%2BOperation.jpg

    • Switch now to the "Data Format" tab. In the "Request Format" section, change the data format to JSON and check the "Convert XML Payload to JSON" checkbox.
    • In the "Response Format" section, change the data format to JSON and check the "Convert to XML" checkbox. Also, check the "Add Wrapper Element" checkbox which will enable two fields that will be used to send response data. In the "Element Name" field enter the message type of the response (DT4 in the ESR Objects diagram above) and enter the namespace in the "Element Namespace" field.

https://lh4.googleusercontent.com/-3WbRnF0mNU8/VV-5G4o4e6I/AAAAAAAAA1o/uw99GR95xJM/w2532-h1434-no/13.70%2B-%2BComm%2BChannel%2BReceiver%2BData%2BFormat.jpg
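The effect of these settings is that the JSON response is converted back to XML and wrapped in the element you configured, so that the response mapping can consume it. A rough sketch of the payload PI then sees (the element name and namespace below are assumptions; use the values you entered in the channel):

<ns0:MT_GeoLocationResponse xmlns:ns0="http://yourcompany.com/pi/rest_demo">
  <results>...content converted from the JSON "results" field...</results>
  <status>OK</status>
</ns0:MT_GeoLocationResponse>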

    • Now, click the "Save" button and verify the communication channel has been added to the configuration scenario.

https://lh4.googleusercontent.com/-qBYCrF77wgM/VV-5G7BCVLI/AAAAAAAAA1k/NdrM5ZvKWFw/w1818-h666-no/13.80%2B-%2BComm%2BChannel%2BReceiver%2BCreated.jpg

    • The next object to be created is the "Sender Agreement."
      • Click the "Create Object" button then expand the "Collaboration Agreement" node and select object type "Sender Agreement." Select the sender business system. Then, select the outbound interface from the value help menu. Note that you will most probably have to change the value of the "Business System" in the pop-up to "- Any -" in order to be able to find the outbound interface created earlier. Make sure to click the refresh button so that the interface list is updated.

https://lh4.googleusercontent.com/-48nK8rowA7I/VV_A5JiqebI/AAAAAAAAA2o/PtPBtIWkw4U/w1852-h1472-no/14.10%2B-%2BSender%2BAgreement%2BCreate.jpg

    • Once the communication component, interface, and namespace fields are populated, click the "Create" button.

https://lh4.googleusercontent.com/-1hUsf8L8NAk/VV_A41EX8pI/AAAAAAAAA2Y/M9SKWrY_fOw/w920-h320-no/14.20%2B-%2BSender%2BAgreement%2BCreate%2B2.jpg

    • In the sender agreement details screen, select the communication channel we created earlier and then click the "Save" button.

https://lh4.googleusercontent.com/-pRJ154bQM60/VV_A48f01gI/AAAAAAAAA28/R9dFAHLPjCY/w1678-h932-no/14.30%2B-%2BSender%2BAgreement%2BDetails.jpg

    • Verify the sender agreement has been created and added to the configuration scenario.

https://lh4.googleusercontent.com/-nW4vA5bKT-w/VV_A5QjM_8I/AAAAAAAAA20/xaQEt5vkQEg/w1938-h654-no/14.40%2B-%2BSender%2BAgreement%2BCreated.jpg

    • It is now the "Receiver Determination" turn to be created.
      • Click the "Create Object" button then expand the "Configuration Objects" node and select object type "Receiver Determination." Select the sender business system. Then, select the outbound interface from the value help menu and click the "Create" button.

https://lh5.googleusercontent.com/-ip-hyUXPAQQ/VWAlVO02OII/AAAAAAAAA3g/1JMF_oWMBxc/w2064-h896-no/15.10%2B-%2BReceiver%2BDetermination%2BCreate.jpg

    • In the receiver determination details screen, select the receiver's business component then click the "Save" button.

https://lh4.googleusercontent.com/-zR8-CU6_Fu4/VWAlVIUBfAI/AAAAAAAAA3o/6r50rJIdCQ8/w1714-h940-no/15.20%2B-%2BReceiver%2BDetermination%2BDetails.jpg

    • Verify that the receiver determination object has been created successfully.

https://lh6.googleusercontent.com/-Lc0j0BMYr8I/VWAlVFNdgoI/AAAAAAAAA3k/hhtHHscvoZA/w1964-h666-no/15.30%2B-%2BReceiver%2BDetermination%2BCreated.jpg

    • Now, we need to create the "Interface Determination" object.
      • Click the "Create Object" button then expand the "Configuration Objects" node and select object type "Interface Determination." Select the sender business system. Then, select the outbound interface from the value help menu. Finally, select the receiver business component and click the "Create" button.

https://lh6.googleusercontent.com/-CFZHdGCPh5s/VV_A5emFVyI/AAAAAAAAA24/e6i8UA2b7-0/w1766-h1276-no/15.10%2B-%2BInterface%2BDetermination%2BCreate.jpg

    • In the interface determination details screen, make sure the software component version is selected correctly. Also, select the operation mapping created in the ESR from the value help menu. Then, click the "Save" button.

https://lh6.googleusercontent.com/-18z1bUy06SM/VV_A5lxuG_I/AAAAAAAAA2s/STfQOH7o_IM/w1956-h852-no/15.20%2B-%2BInterface%2BDetermination%2BDetails.jpg

    • Verify the interface determination object has been created and added to the configuration scenario.

https://lh4.googleusercontent.com/-NMtDgPOoHng/VV_A54h2JGI/AAAAAAAAA3A/X_0CjdMpxgk/w1874-h598-no/15.30%2B-%2BInterface%2BDetermination%2BCreated.jpg

    • The last IB object to be created is the "Receiver Agreement."
      • Click the "Create Object" button then expand the "Collaboration Agreement" node and select object type "Receiver Agreement." Select the sender business system. Then, select the receiver business component and the inbound interface from the value help menu then click the "Create" button.

https://lh4.googleusercontent.com/-D0kU9q0ZCSk/VWApABCe4II/AAAAAAAAA4A/151TcyIlWmw/w1360-h758-no/17.10%2B-%2BReceiver%2BAgreement%2BCreate.jpg

    • In the receiver agreement details screen, select the receiver communication channel then click the "Save" button.

https://lh4.googleusercontent.com/-Xiabwqawf1w/VWApAKsh6tI/AAAAAAAAA4E/P6reljzTOjg/w1212-h1058-no/17.20%2B-%2BReceiver%2BAgreement%2BDetails.jpg

    • Verify that the receiver agreement is created.

https://lh6.googleusercontent.com/-TkiA9z0n7No/VWApAAdwFcI/AAAAAAAAA4I/j-O7ZgzDitc/w1898-h716-no/17.30%2B-%2BReceiver%2BAgreement%2BCreated.jpg

    • Now that we have created all the necessary IB objects, we need to activate them. Go to the "Change List" tab. Expand the change list all the way, right-click the "Standard Change List", then select the "Activate" option.

https://lh4.googleusercontent.com/-td13QXUjUjY/VWAqStbI2SI/AAAAAAAAA4Y/NxdhZNJkyx8/w828-h574-no/17.40%2B-%2BIB%2BActivation.jpghttps://lh4.googleusercontent.com/-wDAWWf77umU/VWArXA3BUWI/AAAAAAAAA4w/OvF9Ok4EhOA/w564-h238-no/17.50%2B-%2BIB%2BActivation%2B2.jpghttps://lh6.googleusercontent.com/-bnFRUlrmsQ4/VWArXPC0drI/AAAAAAAAA40/OnsIv5kH0-w/w1002-h902-no/17.60%2B-%2BIB%2BActivation%2B3.jpg


Finally, let's create a proxy and test our work. See you in the fifth part.

Consuming Your First Synchronous RESTful Service - Step-by-Step - Proxy Class and Testing

$
0
0

Introduction

 

This is the last part of a mini-series blog about how to build a basic SAP-to-REST synchronous interface scenario. The blog's first part can be found here:  Consuming Your First Synchronous RESTful Service - Step-by-Step - Introduction. This part is to build a proxy class in the sending SAP ECC system and to test our integration scenario.


Let's get to work


Generating a proxy class and creating a test program.

    • Generating the proxy:
      • Log on to the sender SAP ECC system. Go to either transaction SPROXY or transaction SE80. In SE80, select the "Enterprise Services Browser" object type from the left navigation pane.
    • If the software component version that we created earlier is not visible, click the "Refresh" button in order to re-read the ESR catalog.
    • Once the software component version is visible, expand the nodes of the service interface until you see the "outbound" service interface we created earlier.

https://lh5.googleusercontent.com/-4imaT0b1K1I/VWLKmEcBrmI/AAAAAAAAA6Y/bgfm3WcxhRw/w680-h652-no/18.10%2B-%2BProxy%2B1.jpg

    • Now, right-click the outbound service interface and select the "Generate Proxy" option.

https://lh5.googleusercontent.com/-dxekTzdBShk/VWLKmA9g3aI/AAAAAAAAA6o/DICI2UGXp7o/w572-h188-no/18.20%2B-%2BProxy%2B2.jpg

    • You will be presented with a wizard screen on which you can either check the "local object" flag or select an existing change request. Make sure to add a prefix in your customer namespace (starting with Y or Z). Then, click "Finish."

https://lh3.googleusercontent.com/-T5vLp1t3LLQ/VWLKmA7BzWI/AAAAAAAAA6c/zJ4KJ4qMu-o/w1394-h1106-no/18.30%2B-%2BProxy%2B3.jpg

    • After a short time, you will be presented with the details of the generated proxy. Click the "Activate" button in the top toolbar.

https://lh4.googleusercontent.com/-NX4gREdoI-w/VWLKmgmk0TI/AAAAAAAAA6Q/ydZ5fDeSBZw/w1796-h1156-no/18.40%2B-%2BProxy%2B4.jpg

    • Copy the proxy class name and go to transaction SE24. Paste the class name there and click the "Display" button. Alternatively, you can just double-click the class name in the proxy details screen above.

https://lh6.googleusercontent.com/-seJQs33LrAo/VWLKmnDwkNI/AAAAAAAAA6U/cApXfXXh6ng/w882-h422-no/18.50%2B-%2BProxy%2B5.jpg

    • Find the method with the same name as the outbound service interface. Place the cursor on the method name and click the "Parameters" button.

https://lh5.googleusercontent.com/-_mD7UyaqE00/VWLKmi6qTyI/AAAAAAAAA6k/_j6jIUPDYyY/w1044-h536-no/18.60%2B-%2BProxy%2B6.jpg

    • Take note of the types of the "output" and "input" parameter fields. These will be needed when creating the test program.

https://lh4.googleusercontent.com/-HfwD1k2GVYQ/VWLKm9UCzMI/AAAAAAAAA6g/NcivH_Jo_CQ/w1256-h466-no/18.70%2B-%2BProxy%2B7.jpg

    • Test program:
      • Now, go to transaction SE38 to create the test program. Type a test program name then click the "Create" button.

https://lh4.googleusercontent.com/-5kxTp89YYXo/VWLNFT5LQSI/AAAAAAAAA7k/U2-oOSGKWGI/w1040-h436-no/19.10%2B-%2BProgram%2B1.jpg

    • In the "Attributes" screen, enter a short description, select the "Executable" type, and then click the "Save" button.

https://lh5.googleusercontent.com/-uIhkzRZcog8/VWLNFYklZ5I/AAAAAAAAA7U/Or3vuDihOII/w1248-h1050-no/19.20%2B-%2BProgram%2B2.jpg

    • On the next screen, pick a package or assign the program as a local object. Then, click the "Save" button.

https://lh4.googleusercontent.com/-JTg4_e_JTS0/VWLNFSF9o5I/AAAAAAAAA7Q/FckT3q2AsKk/w914-h602-no/19.30%2B-%2BProgram%2B3.jpg

    • Write the following program. Note the structure types that we copied from the proxy class. The basic logic of this program is to display an input zip code field, call the proxy method, and display the two returned string fields (status and results) on the output screen. Field "results" contains an XML message in plain-text format; in a more elaborate scenario, you would need to parse the returned XML text to read specific fields.

https://lh3.googleusercontent.com/-fmxZoJO_3i4/VWLNF2I04KI/AAAAAAAAA7c/yysNgPi6Dzk/w1170-h1100-no/19.40%2B-%2BProgram%2B4.jpg

    • After saving and activating the program, click the "Test" button. Type a zip code then click the "Execute" button.

https://lh3.googleusercontent.com/-0dD0XCz-h7w/VWLNF8J8jSI/AAAAAAAAA7Y/8iRaWPKfCXU/w886-h410-no/19.50%2B-%2BProgram%2B5.jpg

    • You can see that the status and results fields were read successfully.

https://lh6.googleusercontent.com/-lqzBOBuKyfs/VWLNGCIfxsI/AAAAAAAAA7g/lXLzXpE5AHs/w3128-h664-no/19.60%2B-%2BProgram%2B6.jpg


This concludes this short SAP-to-REST mini-series. Please do let me know if this was helpful or if I missed something.

New SAP Press book about SAP Process Orchestration

$
0
0

You already run SAP Process Orchestration? You want to step into SAP Process Orchestration?

 

Then the book SAP Process Orchestration - The Comprehensive Guide by John Mutumba Bilay and Roberto Viana Blanco, both experienced consultants in the area of SAP Process Integration (SAP PI) and SAP Process Orchestration (SAP PRO), with a foreword by Michal Krawczyk, SAP mentor and author of various SAP PI books himself, is a must-read to get on track with SAP PRO.

 

9781493210183_267.png

As the title suggests, it gives you a comprehensive overview of SAP PRO, covering all aspects that help you tackle an integration project from planning and design through implementation and operation: chapters about landscape architecture, installation options, system setup, designing and implementing interfaces and processes, proxy development, configuration, administration, monitoring, etc.

 

For those of you who plan to migrate from an SAP PI dual-stack system to SAP PRO, two chapters are dedicated to the migration topic: migrating interfaces and migrating ccBPM to BPM.

 

Besides this, the authors spend a large portion of the book sharing their experiences, providing best practices and recommendations as well as step-by-step examples.

 

Finally, the book closes with an outlook into other integration topics such as SAP API Management, SAP HANA Cloud Integration (HCI), and Integration Advisor. By the way, John announced that he has already started to work on a new book on HCI together with two colleagues of mine from SAP. So, stay tuned.

 

Hope this made you curious. So, here's the link where you can order the book:

SAP Process Orchestration. The Comprehensive Guide of John Mutumba Bilay, Roberto Viana Blanco - by SAP PRESS

 

Before you ask: no, I don't get any credit for advertising the book, not a single dime. I'm more than happy that John and Roberto took the time to complete the book, and I would assume that they spent a lot of their spare time on it. So, many thanks for dedicating so much time and effort to this project. If you ask me, it has been well spent.


Assigning users to iFlows

$
0
0

Occasionally, there is a need to limit the usage of an iFlow to certain runtime users. The following NWDS screenshots specify the required changes that need to be applied to achieve this.

 

By default, any authorized user can send messages in the integration flow from the sender system. On adding users, you are specifying that only the added users can send messages from the sender system and thus, restricting other users.


http://help.sap.com/saphelp_nw73ehp1/helpdata/en/7e/f0af872e6e42fd9e176c53ca8510de/content.htm

 

Step 1: Allocate users to the Business System/Component.

 

BusinessComponent.png

Step 2: Add user to the iFlow by clicking on the Sending Business System/Component

 

iFlowUser.png

 

Note: This feature is available in Integration Directory under integrated configuration > Assigned user. Please take care when assigning users to an iFlow configuration in Integration Directory as this will be lost when an iFlow is redeployed.

List Of Standard Adapters Available in SAP Process Orchestration/Process Integration 7.4 SP10

$
0
0

Recently we upgraded our SAP PO 7.4 system from SP5 to SP10 and found great new and useful features. I wanted to share my experience with the community (of course, many blogs are already available) so that people who are struggling to understand the new functionalities/features within SAP PO/PI have better visibility.

 

The main focus of this blog is on the adapters that are pre-delivered as part of the SAP PO/PI installation, the adapters that can be deployed at a later stage, and the adapters that become available after an SAP PO/PI SP upgrade (in this case, SP10).

 

Once the setup (installation and post-installation) of the SAP PO/PI box is completed, you can see the below list of standard adapters available from SAP.

Adaps.jpg

 

However, there are still a few adapters missing in the Integration Directory that are part of the SAP B2B Security Add-on. To confirm this, check the system information for the relevant components deployed on the Java engine. If the components are deployed, you will see them as per the below screen.
screen1.jpg


If the components are not deployed, download the relevant SCA and TPZ files from the Service Marketplace, deploy the SCA files to the Java engine, and import the TPZ files into the ESR. Now you can see the additional SAP SFTP adapter in the available adapter list as per the below screen.

SFTPadp.jpg

 

SAP released the OData receiver adapter and the SFSF adapter to consume OData services and to integrate with SuccessFactors applications. Check the system information for the relevant components deployed on the Java engine. If the components are deployed, you will see them as per the below screenshots.

SFSF.jpg

ODATA.jpg


If the components are not deployed, download the deployable SCA files (two different ones) and the TPZ files from the Service Marketplace and proceed as explained above. Now you can see the additional SAP SFSF and OData adapters in the available adapter list as per the below screens.

ESR.jpg

ODandSFSF.jpg

To make use of the above SAP standard adapters, you can just download them from the Service Marketplace and deploy them to the Java engine; no SP upgrade of the PO/PI system is required.

 

SAP also released the new standard adapters REST and WS_AAE, but these are only available as part of an SP upgrade and not as separately downloadable components from the Service Marketplace like SFTP, OData, and SFSF. After the SP upgrade, import the latest SAP BASIS content into the ESR to view the REST and WS_AAE adapter metadata as per the below screen.

Rest.jpg


The SAP B2B Add-on is now part of SAP Process Orchestration (refer to blogs on SDN for licensing of SAP PO and the B2B Add-on). Deploy the B2B-related SCA files and import the TPZ file into the ESR. Now we can see the list of SAP B2B adapters and their metadata.

B2B.jpg

B2BEsr.jpg

B2BID.jpg

 

Below is the list of all standard SAP adapters available to date in SAP PO/PI 7.4.

all.jpg

Note :

Deploying the additional adapters as an add-on in SAP PI/PO does not require testing the business-critical interfaces before upgrading the live PO/PI system. However, if you do a complete system upgrade (for the sake of the REST adapter, etc., and new features in upcoming releases), there might be a need to test the interfaces. I think SAP should provide new adapters like REST and WS_AAE as an add-on instead of delivering them as part of an SP upgrade (though I believe there is definitely a reason for this).


The above note and content are based on my views and observations ONLY. Please feel free to comment and share your inputs.

Developing Job Roles and Authorizations for PO

$
0
0

It is important that the correct 'roles' are specified before assigning the necessary authorizations. Rather than saying 'what access do I need', why not approach it from the opposite direction and say 'what do I need to do'. It is important that each person has the access they need and no more. It is all too easy to give in and assign the equivalent of SAP_ALL (which I have seen suggested in numerous blog postings) and leave it at that. You then have no idea who is doing what, and the auditors will come down on you like a ton of bricks.


We recently installed an SAP NW PO 7.31 single-stack landscape and I worked with my company's authorization team to perfect the roles and identify the necessary authorizations. This is an ongoing process as we are not using BPM/BRM at the moment, so I know we have more work to do. Anyway, we identified four specific roles

 

  • Administrator - System administration, governance, transports, support and also development
  • Developer - The name says it all!!
  • Display - Used by process specialists and support teams providing very limited access
  • Service Account - Used by the integration processes to authenticate.

 

I have intentionally left out our technical teams - security and basis. Those roles were developed separately. The next challenge was identifying the authorizations. The process has been a little hit and miss at times and required testing with the various teams. In the end, what we came up with is shown below. I have to say that the SAP documentation was very good (once I found it) and I have included all the relevant links below

 

DEVELOPER

  • SAP_SLD_GUEST - Read-only access to the SLD.
  • SAP_XI_DEVELOPER_J2EE - Developer role.
  • SAP_XI_APPL_SERV_USER - Service user role. Added as one of our third-party adapters uses this role and it is useful for testing.
  • SAP_JAVA_WSNAVIGATOR - Allows access to the SAP Web Services Navigator to test developments.
  • SERVICES_REGISTRY_READ_WRITE - Access to the Services Registry to read and write entries.
  • SAP_PI_B2B_TPM_ADMIN - Allows the end user to maintain Trading Partner Profiles in the B2B Integration Cockpit (B2B IC).
  • SAP_PI_B2B_NRO_ADMIN - Allows the user to maintain number ranges.
  • XiDir_**_Dir_Profile / XiRep_***_Rep_Profile - Roles to control access to objects in the Repository and Directory. Not used in development, but in all other systems they restrict access to existing ID objects only, to allow post-transport configurations only.

ADMINISTRATOR

  • SAP_SLD_ADMINISTRATOR - Full access to the SLD.
  • SAP_XI_DEVELOPER_J2EE, SAP_XI_CONFIGURATOR_J2EE, SAP_XI_CONTENT_ORGANIZER_J2EE, SAP_XI_ADMINISTRATOR_J2EE - Developer roles.
  • SAP_XI_APPL_SERV_USER - Service user role. Added as one of our third-party adapters uses this role and it is useful for testing.
  • SAP_XI_API_DEVELOP_J2EE - Allows the consumption of the PO APIs.
  • SAP_JAVA_WSNAVIGATOR - Allows access to the SAP Web Services Navigator to test developments.
  • SERVICES_REGISTRY_TECHNICAL_ADMINISTRATOR - Full access to the Services Registry.
  • SAP_PI_B2B_SUPERADMIN_J2EE, SAP_PI_B2B_TPM_ADMIN - Access to all B2B functions.
  • NWA_SUPERADMIN - Access to NWA for administration purposes.
  • XiDir_**_Dir_Profile / XiRep_***_Rep_Profile - Roles to control access to objects in the Repository and Directory. Not used in development, but in all other systems they restrict access to existing ID objects only, to allow post-transport configurations only.

SUPPORT ACCOUNT

  • SAP_XI_DISPLAY_USER_J2EE - Access to the Java stack, display purposes only.
  • SAP_XI_MONITOR_J2EE - View messages in the Message Monitor.

SERVICE ACCOUNT

  • SAP_XI_APPL_SERV_USER - Service user role.

 

 

Reference Material

SAP Netweaver Security Guide

User Management and Authorization Concepts (AEX)

Role-Based Authorizations in ES Repository and Integration Directory

Business Rules Management Security Guide

Business Process Management Security Guide

Integrating process repositories with SAP BPM

$
0
0

From SAP BPM 7.31 SP16, we provision the integration of any external process repositories with BPM. It brings in the flavor of seamless integration across products. For example, you can use Power Designer to model the process, save it in the repository and then using Process Composer of BPM, the process modeled in power designer can be implemented. This feature is extremely useful especially with the availability of Intelligent Business Operations (IBO) bundle which includes multiple products like Power Designer, SAP Process Orchestration and Operational Process Intelligence. In addition to that, the scope of this feature also goes beyond IBO and any process repository can be integrated with BPM.

This blog focuses on the implementation details in the BPM side on how this integration can be achieved. If you are looking for details on "how to" use the feature, please have a look at the documentation on SAP help portal.

Majorly, there are two steps:

  1. Availability of APIs/OData services to access the process definition-related data from the repository. This includes the list of process definitions and their BPMN content from the process repository. (If these are not readily available, they should be implemented; those implementation details are beyond the scope of this article.)
  2. Implement extensions of Search Console in SAP Netweaver Developer Studio (NWDS)

Prerequisites

The development workspace is set up and you have already decided on the Eclipse plugin (new or existing) in NWDS in which these extensions will be provided. Ensure that dependencies to the following plugins are added:

com.sap.ide.discovery.runtime.objecttype

com.sap.ide.discovery.runtime.destinationcategory

com.sap.ide.discovery.runtime.destinationsprovider

com.sap.ide.discovery.runtime.searchprovider

com.sap.ide.discovery.integration.searchexplorer

Implement extensions of Search Console in NWDS

Before we see the technical details, the question of why we use the search console might be on your mind. Think about the use case in question: from SAP BPM, the user must be able to "Search, Discover and Configure" the process definition(s) residing in an external repository. The search console in NWDS provides a very sophisticated way to achieve the search and discovery of artifacts (in this case, process definitions); the configuration will be done using SAP BPM's Process Composer.

For more details on the search console, please refer to this link

Let's see in detail the various extension points that need to be implemented:

Step 1. Define a new object type

Object type is used to define a new option in the "Search for" drop down of Search console view. First make sure that nobody else has already defined such an object type.

Extension point id: com.sap.ide.discovery.runtime.objecttype

Defined in plugin: com.sap.ide.discovery.integration

Example:

<extension point="com.sap.ide.discovery.runtime.objecttype">  <objecttype displayname="<<Name to be displayed>>" id="<<ID of the object type>>" /></extension>

Step 2. Define a destination category

As the name implies, destination category is used to combine destinations under a category. This extension point is used to define categories in the "Search in" option of Search console view. First make sure that nobody else has already defined such a destination category.

Extension point id: com.sap.ide.discovery.runtime.destinationcategory

Defined in plugin: com.sap.ide.discovery.runtime

Example:

<extension point="com.sap.ide.discovery.runtime.destinationcategory">  <destinationcategory destinationclass="<<Class referring the destination>>" displayname="<<Name of the destination category>>"
id="<<ID of the destination category>>">  </destinationcategory></extension>

where "destinationclass" refers to the fully qualified name of the class defined to represent a destination in which the process definition will be searched  and "displayname" represents the name of the category.

Step 3. Create a destinations provider

Destinations provider is responsible for retrieving all the destinations defined in the given destination category.

Extension point id: com.sap.ide.discovery.runtime.destinationsprovider

Defined in plugin: com.sap.ide.discovery.runtime

Example:

<extension point="com.sap.ide.discovery.runtime.destinationsprovider">  <destinationsprovider destcategoryid="<<ID of the destination category as described in the above step>>" id="<<ID of the destinations provider>>"  provider="<<Class referring the destinations provider>>">  </destinationsprovider></extension>

where "provider" represents the fully qualified name of the class defined to combine destinations to the destination category defined above.

Step 4. Create a search provider

This is the extension point through which the actual destination will be contacted and the requested artifacts will be retrieved.

Extension point ID: com.sap.ide.discovery.runtime.searchprovider

Defined in plugin: com.sap.ide.discovery.runtime

Example:

<extension point="com.sap.ide.discovery.runtime.searchprovider">  <searchprovider id="<<ID of the search provider>>" instance="<<Class referring the search provider>>"
objecttypeid="<<ID of the object type as defined in step 1>>" supportstextsearch="true">  <category categoryid="<<ID of the destination category as defined in step 2>>"></category>  </searchprovider></extension>

where "instance" refers to the class using which the search operation will be performed.

Note that the resultant object returned from the search must have implemented the interface com.sap.glx.ide.bpmn.util.IBPMN2ContentProvider defined in the plugin: bpem.ide/com.sap.glx.ide in order for the integration to work.

Step 5. Customize/Create search results UI

As a final step, the search results should be displayed in the UI. This extension point is used to contribute customization to the generic search result UI.

Extension point id: com.sap.ide.discovery.integration.searchexplorer

Defined in plugin: com.sap.ide.discovery.integration

One can contribute the following artifacts via view customizations:

  • Content provider - provides content to be displayed in the search view for a given search result
  • Label provider - provides text and images for the displayed result entries
  • Tooltip provider - provides tool tips for the displayed result entries

Example:

<extension point="com.sap.ide.discovery.integration.searchexplorer">  <viewcustomization customizationimpl="<<Class referring the UI>>" id="<<ID of the view customization>>">  </viewcustomization></extension>

where "customizationimpl" refers the fully qualified name of the class which implements the UI.

If you are interested in creating your own UI to display search results from your search provider, then contribute to the following extension point:

Extension point id: com.sap.ide.discovery.ui.customresultui

Defined in plugin: com.sap.ide.discovery.ui

Example:

<extension point="com.sap.ide.discovery.ui.customresultui">  <contributor creator="<<Class referring the custom UI>>" id="<<ID of the custom UI>>"
searchproviderid="<<ID of the search provider defined above>>">  </contributor></extension>

Using BPM OData services to automate process model testing

$
0
0


What would be the next step after implementing a process model? Of course, it is testing all the different flows in the process to ensure that things work properly before it can be deployed on the production system. Testing a process model manually may become a tedious task as it involves multiple steps which require human and system intervention. In the previous post, Venu has explained how to automate process model testing using the RESTful services of BPM public APIs.

These RESTful services are no longer available because, as of 7.31 SP10, OData services are introduced for the BPM public APIs so that customers get the required services from BPM out of the box. This blog focuses on writing JUnit tests to automate process model testing using the various OData services for creating a process, completing a human task, sending a message to BPM, and checking the status of a process.


Prerequisites

  1. SAP BPM 7.31 SP10 or higher (Note that the availability of a particular OData service depends on the version of BPM that is being used.)
  2. JUnit 4.10 from eclipse
  3. Restfuse (1.2) open-source library from Eclipse http://developer.eclipsesource.com/restfuse/ . Download the binaries from http://download.eclipsesource.com/technology/restfuse/restfuse-1.2.0.zip
    Restfuse is a JUnit extension to test RESTful services. A Restfuse based test uses annotations to define RESTful service URL, request type, authentication, content and its type. It is not required to deploy the tests to the BPM server. The tests run like any other JUnit test in eclipse.
    Note: We need the Restfuse library as OData is a type of RESTful API.

Writing JUnit Tests for process model

Let us consider the following process model for writing JUnit tests. The goal here is to test the process in isolation from other systems (like Process Integration, or PI in short) in order to clear the air in the BPM space. That means the automated activities are configured with mocked service interfaces and do not interact with external systems. Note that the mocked service interfaces are used here to illustrate the various types of tests that you can write. Mocking service interfaces is not a mandatory step to test the process. You can also test your process in an environment with real service interfaces, in which case the types of tests that you write may vary.

Let us see how to write JUnit tests for the above process model. The following annotations are required at the JUnit class level.


@RunWith(HttpJUnitRunner.class)
public class TestBPMProcessModel {

    // PollState is used to check/poll the system every few seconds if an expected event has happened
    @Context
    private PollState pollState;

    /** The destination object holds the details about the server on which the process is deployed
        and needs to be tested. **/
    @Rule
    public Destination destination = getDestination();

    // The Response object will be injected after every request sent to the BPM system.
    @Context
    private Response response;

    // The X-CSRF-Token is mandatory to make a POST request to the OData service
    private String xcsrfTokenValue;

    // Cookies are necessary to maintain the session between different requests/tests
    private List<String> cookiesList = new ArrayList<String>();

    // IDs shared between the tests (set by the start-process and task-retrieval tests below)
    private String processInstanceID;
    private String taskInstanceID;

    /** In order to provide the headers or request parameters to the destination URL,
        you have to define a function which will be called before executing each test. For example: **/
    public Destination getDestination() {
        Destination destination = new Destination(this, "dsf:22");
        addHeaders(destination);
        addToPathSegment(destination);
        return destination;
    }

    // Adding headers
    private void addHeaders(Destination destination) {
        RequestContext context = destination.getRequestContext();
        context.addHeader("Accept", "application/json");
        context.addHeader("Content-Type", "application/json");
        context.addHeader("x-csrf-token", xcsrfTokenValue);
        StringBuilder cookiesStringBuilder = new StringBuilder();
        for (String cookie : cookiesList) {
            cookiesStringBuilder.append(cookie.split(";")[0]);
            cookiesStringBuilder.append(",");
        }
        context.addHeader("Cookie", cookiesStringBuilder.toString());
    }

    // Adding a request parameter to the destination
    private void addToPathSegment(Destination destination) {
        RequestContext context = destination.getRequestContext();
        context.addPathSegment("process-id", processInstanceID);
    }
}

 

Step 1: Create a process instance

The first step is to create an instance of the process. As the OData service to start a process is a POST request, the x-csrf-token must be retrieved first. This can be done by specifying the header "X-CSRF-TOKEN" value as "Fetch". The "order" parameter specifies the order in which the tests will be executed.

Test 1: Get the X-CSRF-TOKEN

The OData service to get the start data of the process takes the following inputs: the vendor name, the DC (development component) name and the process name. You can also use the link http://host:port to derive the OData URL for a particular action. The resulting URL pattern is shown below.
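For orientation, the start-data URL follows this pattern; the concrete vendor, DC and process names below are the ones used in the tests of this blog, and your own values will differ:

http://<host>:<port>/bpmodata/startprocess.svc/<vendor>/<dc-name>/<process-name>/StartData

e.g. /bpmodata/startprocess.svc/sap.com/demo~mdqp.bpm.businessprocess/mdqp_business_process/StartData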


@HttpTest(method = Method.GET,
    headers = { @Header(name = "Accept", value = "application/json"),
                @Header(name = "X-CSRF-TOKEN", value = "Fetch") },
    authentications = @Authentication(type = AuthenticationType.BASIC, user = "userid", password = "@@@@"),
    path = "/bpmodata/startprocess.svc/sap.com/demo~mdqp.bpm.businessprocess/mdqp_business_process/StartData",
    order = 1)
public void getStartData() {
    com.eclipsesource.restfuse.Assert.assertOk(response);
    Map<String, List<String>> headers = response.getHeaders();
    xcsrfTokenValue = headers.get("x-csrf-token").get(0);
    // Keep the session cookies so that they can be sent with the subsequent requests
    cookiesList = headers.get("set-cookie");
}

Test 2: Start the process

The next test starts the process with the necessary payload. The start event is defined with a custom event trigger whose service operation has the following input data structure:

<complexType name="Customer">
  <sequence>
    <element maxOccurs="1" minOccurs="0" name="customerId" type="string"/>
    <element maxOccurs="1" minOccurs="1" name="firstName" type="string"/>
    <element maxOccurs="1" minOccurs="1" name="lastName" type="string"/>
    <element maxOccurs="1" minOccurs="1" name="street" type="string"/>
    <element maxOccurs="1" minOccurs="1" name="city" type="string"/>
    <element maxOccurs="1" minOccurs="1" name="zipCode" type="string"/>
    <element maxOccurs="1" minOccurs="1" name="country" type="string"/>
    <element maxOccurs="1" minOccurs="1" name="creditLimit" type="double"/>
    <element maxOccurs="1" minOccurs="1" name="currency" type="string"/>
    <element maxOccurs="1" minOccurs="0" name="priority" type="string"/>
  </sequence>
</complexType>

This test makes use of the same OData service as above, except that it is a POST request.


@HttpTest(method = Method.POST,
    type = MediaType.APPLICATION_JSON,
    content = "{\"ProcessStartEvent\":{\"Customer\": {\"firstName\": \"Lavanya\",\"lastName\": \"Mothilal\",\"street\": \"BEML\",\"city\": \"Bangalore\",\"zipCode\": \"560066\",\"country\": \"India\", \"creditLimit\": \"10000.0\", \"currency\": \"USD\"}}}",
    path = "/bpmodata/startprocess.svc/sap.com/demo~mdqp.bpm.businessprocess/mdqp_business_process/StartData",
    order = 2)
public void startMDQPProcess() {
    com.eclipsesource.restfuse.Assert.assertCreated(response);
    String s = response.getBody();
    HashMap obj = (HashMap) JSON.parse(s);
    processInstanceID = (String) ((HashMap) obj.get("d")).get("processInstanceId");
}

Step 2: Complete the human task

Test 3: Get the task instance ID

The first step is to retrieve the task instance ID. As there is no direct way to get the ID of the newly created task, the OData service that fetches all non-completed tasks is used. For the test to pick up the correct task instance ID, please ensure that there are no other running processes or tasks in the system.

The @Poll annotation retries the method twice with an interval of 5 seconds to ensure that the task has been created.


@HttpTest(method = Method.GET,
    headers = { @Header(name = "Accept", value = "application/json") },
    path = "/bpmodata/tasks.svc/TaskCollection?$orderby=CreatedOn%20desc&$filter=Status%20ne%20%27COMPLETED%27",
    order = 3)
@Poll(times = 2, interval = 5000)
public void getAddCreditLimitTaskInstance() throws Exception {
    if (pollState.getTimes() == 2) {
        com.eclipsesource.restfuse.Assert.assertOk(response);
        String s = response.getBody();
        HashMap obj = (HashMap) JSON.parse(s);
        Object[] tasks = (Object[]) ((HashMap) obj.get("d")).get("results");
        HashMap latestTask = (HashMap) tasks[0];
        taskInstanceID = (String) latestTask.get("InstanceID");
    }
}

Test 4: Claim the task

The task needs to be claimed before it can be completed. For simplicity, the same user that started the process is also used to claim and complete the task. If you want to perform these actions with a different user, a separate test needs to be added that fetches the X-CSRF-TOKEN with the corresponding authentication annotation (as shown in test 1); a sketch of such a test is shown below.
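A minimal sketch of such an additional test method for the same test class, assuming a hypothetical user "taskprocessor" exists in the system; the order values of the subsequent tests would need to shift accordingly, and the path merely serves to trigger a GET against an OData service so that a token and session cookies are returned:

// Sketch only: fetch a fresh X-CSRF-TOKEN and session cookies for the task processor user.
// The user "taskprocessor" and its password are placeholders, not part of the original scenario.
@HttpTest(method = Method.GET,
    headers = { @Header(name = "Accept", value = "application/json"),
                @Header(name = "X-CSRF-TOKEN", value = "Fetch") },
    authentications = @Authentication(type = AuthenticationType.BASIC, user = "taskprocessor", password = "@@@@"),
    path = "/bpmodata/tasks.svc/TaskCollection", order = 4)
public void getTokenForTaskProcessor() {
    com.eclipsesource.restfuse.Assert.assertOk(response);
    Map<String, List<String>> headers = response.getHeaders();
    xcsrfTokenValue = headers.get("x-csrf-token").get(0);
    cookiesList = headers.get("set-cookie");
}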

As there is no content to be passed for claiming the task, it is passed as empty.

@HttpTest(method = Method.POST,
    content = "{empty}",
    path = "/bpmodata/tasks.svc/Claim?InstanceID='{taskInstanceID}'&$format=json",
    order = 4)
public void claimAddCreditLimitTask() {
    com.eclipsesource.restfuse.Assert.assertOk(response);
}

Test 5: Complete the task

The task can be completed by passing the necessary payload.

@HttpTest(method = Method.POST,
    type = MediaType.APPLICATION_JSON,
    content = "{\"DataStewardTaskOutput\":{\"creditLimit\": \"200.0\"}}",
    path = "/bpmodata/taskdata.svc/{taskInstanceID}/OutputData",
    order = 5)
public void completeAddCreditLimitTask() {
    com.eclipsesource.restfuse.Assert.assertCreated(response);
}

Step 3: Send message to IME

After the task has been completed, the process waits at the Intermediate Message Event (IME). As there is no external system involved here, a message must be sent explicitly from the test. The OData service to send a message to BPM is available as of SAP NetWeaver 7.31 SP16.

The IME in this process is modeled with an event trigger named "BackendCallbackTrigger", and the type of its service interface looks as shown below:

<xsd:complexType name="BackendCreateCustomerCallbackComplexType">
  <xsd:sequence>
    <xsd:element name="customerId" type="xsd:string"/>
  </xsd:sequence>
</xsd:complexType>

Test 6: Send message


@HttpTest(method = Method.POST,
    type = MediaType.APPLICATION_JSON,
    content = "{\"message\":{\"BackendCreateCustomerCallbackComplexType\":{\"customerId\":123}}}",
    path = "/bpmodata/messages.svc/sap.com/demo~mdqp.bpm.businessprocess/BackendCallbackTrigger/EventTrigger",
    order = 6)
public void sendBackendCallback() {
    com.eclipsesource.restfuse.Assert.assertCreated(response);
}

The responsibility of this OData service is only to submit the message to the BPM system. Any process with an event (either a start event or an IME) that is waiting for a message and has a matching correlation condition may consume the message.


Step 4: Process Completion

The final step is to ensure that the process has completed successfully.

Test 7: Verify the completion of the process


@HttpTest(method = Method.GET,
    headers = { @Header(name = "Accept", value = "application/json") },
    path = "/bpmodata/processes.svc/ProcessInstance('{process-id}')",
    order = 7)
@Poll(times = 3, interval = 10000)
public void checkProcessCompletion() {
    if (pollState.getTimes() == 3) {
        com.eclipsesource.restfuse.Assert.assertOk(response);
        String s = response.getBody();
        HashMap obj = (HashMap) JSON.parse(s);
        String status = (String) ((HashMap) obj.get("d")).get("Status");
        Assert.assertEquals("COMPLETED", status);
    }
}