SAP Process Orchestration

SAP and Integration: The Dark Clouds May Have Just Passed!


Two years back, when I wrote the blog SAP - A Comprehensive Integration Solution?, I wrote it with a bit of frustration, having worked in Integration and numerous SAP-based landscapes for quite a while. Today is a good opportunity to pause, look back, and take stock of things as they stand now.

 

Let me start off by saying that it is indeed very exciting to see the focus SAP has given to comprehensive integration capabilities in the last 24 months. If I am to represent where things stand as of today, the figure below should capture the essence in its entirety:

 

cis.JPG

 

Bridging the Gap

 

1. With HANA Cloud Integration, SAP has done its bit to introduce a cloud integration capability. The adoption of HCI and the way it is being positioned seems to be evolving.

 

2. The B2B Add-On has undergone multiple SP upgrades. This is a great sign, since the dependency on third-party solutions for this capability has now been reduced drastically. The end-to-end monitoring features are extensive, as the Add-On is well integrated into the overall product stack. SAP does seem to keep innovating in this area. Overall, great momentum.

 

3. SAP Process Orchestration itself seems to be at the center of many new innovations at SAP. Operational Intelligence and SAP 360° solutions are focused and centered around Process Orchestration capabilities.

 

4. SAP Gateway continues to evolve and is core to most, if not all, of the Mobile and UI/UX innovations at SAP. With the promise of the Integration Gateway capability being made available in SAP PI, I believe the future holds the option of having Gateway-like capabilities in the Process Orchestration suite. Personally, I believe this is a major step, as it provides customers with an option of centrally managing services and consolidating skills around technologies. SAP is making a bold statement - OData is here to stay!

 

Note: RESTful service provisioning and consumption on PI is still very much dependent on third-party adapters. This remains a major gap and continues to add implementation and run-the-business costs.

 

5. SAP doesn't deliver a specialized MFT capability, and most customers still rely on third-party vendors like Globalscape or IBM Sterling to bridge this gap. The Secure Connectivity Add-On does seem to be a small step from SAP towards providing MFT-like capabilities.

 

6. The recent partnership with Apigee to deliver the SAP API Management platform is a big stride in the area of APIs. API Management comes with the option of being deployed both on Cloud and On-Premise and is a boost for Web and Mobile use cases. Early previews of the product do seem impressive, and this is another area that is sure to undergo aggressive innovation.

 

7. The single-IDE vision seems to be intact and on track. NWDS is central to development in SAP PO, with PI, BPM and BRM capabilities, including modelling, now available for developers within the development environment.

 

Note: The integration into NWDS/Eclipse is still incomplete. Many features are still unavailable; for example, debugging of message mappings seems to be a gap. Developers would still want to connect to the Swing tool rather than NWDS.

 

8. SAP recently simplified the license model. This again is a very welcome measure.

 

Overall, the dark clouds seem to be clearing, and the latest Gartner report positioning SAP as a Leader in the Magic Quadrant should come as a relief for many and be encouraging for all others. I for one have been keenly following SAP's journey in the area of Integration, and so far SAP has kept me engaged.

 

PS: I will be attending the SAP TechEd && d-code event in Berlin (Nov 11-13 2014). Please do stop to say 'Guten Morgen/Guten Tag/Guten Abend' or just a 'hello/hallo' as we cross paths.


New adapters for running hybrid scenarios with SAP Process Integration


Would you like to complement your on-premise applications with services provided in the cloud, i.e., run so-called hybrid scenarios?

 

A natural choice for integrating cloud applications would be SAP HANA Cloud Integration; especially for cloud-to-cloud integration it would definitely be the preferred option.

 

When it comes to hybrid scenarios, you can actually choose between two integration platforms provided by SAP, SAP HANA Cloud Integration and SAP Process Integration / SAP Process Orchestration. Which one to use may depend on various reasons, e.g., customer preference, policy, available pre-defined content, etc.

 

For more details about the SAP HANA Cloud Integration offering, refer to the comprehensive Get started with SAP HANA Cloud Integration document on SCN.

 

If you like to learn more about hybrid integration scenarios that SAP provides, take a look at the new Hybrid Scenarios web page on SAP Service Marketplace.

 

In this blog I would like to focus on recent enhancements in terms of connectivity for SAP Process Integration / SAP Process Orchestration that facilitate support for hybrid scenarios:

  • We have shipped a SuccessFactors adapter as part of the SAP NetWeaver Process Integration Connectivity add-on, see documentation on the SAP Help Portal.
  • We have shipped an OData adapter as part of the SAP NetWeaver Process Integration Connectivity add-on.
  • We support an SFTP adapter as part of the SAP NetWeaver Process Integration Secure Connectivity add-on, see documentation on the SAP Help Portal.
  • Itelligence, one of our preferred partners in the area of application integration, has just released a new Salesforce.com adapter running on the adapter engine of SAP Process Integration / SAP Process Orchestration. The adapter is SAP-certified. For more details, refer to the Salesforce.com-Adapter announcement at itelligence.
  • Furthermore, we plan to ship our own REST adapter by the end of this month. I will publish a separate blog once it is available. So, stay tuned.

 

If you happen to be at SAP TechEd && d-code in Berlin next week, you may like to join me in the hands-on session INT264 - Use SAP Process Orchestration for On-Premise to Cloud Integration. Here, you will set up integration scenarios towards SuccessFactors applications.

SAP WF to BPMN


There is a public web service to convert your SAP WF definition (exported in BPML from your Workflow Builder) into an editable BPMN model.

Easy-to-use tool gives you the opportunity to design changes to your workflow processes, discuss them with key users or just document your workflows in a business-user understandable way.

 

You may want to check it out if you have ever found yourself needing to work with an old and forgotten WF, one you did not create, or wanting to design in some new processes, and had problems getting your bearings.

Integration && Orchestration 2014 - A Report Card from Berlin


I must admit that I wrote the blog SAP and Integration: The Dark Clouds May Have Just Passed! with mild optimism. I was optimistic about SAP's strategy around integration, given that we are today in a world moving rapidly to hybrid environments, with the new integration challenges centered around the following:

 

1. On-Premise to On-Premise - OP2OP (A2A)

2. Business to Business (B2B)

3. On-Premise to Cloud (OP2C)

4. Cloud to Cloud (C2C)

5. User to Cloud (U2C)

6. User to On-Premise (U2OP)

 

At SAP TechEd && d-code this year in Berlin, SAP took every effort to reiterate and showcase its integration and orchestration portfolio. The highlights would be as follows:

 

1. Continuous improvements in SAP Process Orchestration, strengthening the On-Premise Integration platform.

a. Effective tooling with more and more features being continuously added to Eclipse/NWDS

b. Added functionalities to the Connectivity Add-On (SFSF adapter etc)

 

2. HANA Cloud Integration (HCI) released in Standard and Premium Editions, evolving itself to be the Cloud Integration platform.

a. Pre-packaged integration content

b. Openness - Partner innovation to develop connectors, content etc

c. Interoperability between HCI and SAP PO (ex. mappings)

 

3. SAP now has its own API Management solution

a. Delivered via partnership with Apigee

b. Simplicity of modelling stands out

c. Comes power packed with features

d. Available to be deployed both on-premise and on cloud

e. Core conceptual or rather the technical model includes the API Platform, API Analytics and Developer Services

 

4. OData takes centre stage

a. Gateway (ABAP) feature enhancements continue on (ex. server side and client side caching). If you want OData out of a SAP system, Gateway is the thing!

b. Integration Gateway (Java) on the SAP Mobile platform provides option to OData enable non-SAP systems (restricted to JDBC, SOAP, JPA and SAP Gateway data sources)

c. OData now also comes to SAP PO via Integration Gateway ported onto PI

 

5. Processes get more intelligent and Smart

a. SAP Operational Process Intelligence running on HANA delivers an innovative solution around predictive analytics and collaboration

b. Real-time operational intelligence across big processes with visibility, insight and action

c. Spanning multiple people or org units, across multiple systems (SAP and Non-SAP), now process can be modelled with KPIs and workflow integration

d. All of this is then generated into a dashboard (UI5 based) - very minimal code via the magic 'Generate' button


6. B2B is high on adrenaline

a. Since SAP released the B2B add-on, there have been constant patches released with added features and functionality

b. A license consolidation means the B2B add-on is now included as part of the SAP PO licence at no additional fee

c. A Trading Partner Management solution is embedded into the overall B2B solution

d. A one-screen B2B dashboard makes supportability, tracking and overall monitoring easy

e. Automatic certificate rollover feature added

f. B2B content reduces the overall TCD. SAP also offers an RDS for B2B

g. One incredible innovation is the Integration Advisor. It promises a feature that will significantly reduce the overall message implementation effort (EDI message mappings etc.) via message definition and mapping proposals and predictions.



So a simple representation of the various integration and orchestration capabilities that SAP provides can be summarized in the below diagram:

Estate.jpg


Some key notes on the product inventory:


1. The strategic On-Premise integration solution will be SAP Process Orchestration. The B2B add-on will be deployed on SAP PO; thus PO can be your organization's B2B hub.

2. The strategic Cloud integration solution is HCI

3. SAP Operational Intelligence is not an integration platform but a solution that utilizes the varied integration solutions (esp. BPM) to deliver the promise of a smart process


That leaves us with Gateway, API Management and Integration Gateway. These solutions are key when it comes to mobility and web solutions. On closer inspection, they deliver almost the same solution while retaining unique features of their own. My understanding is that the Gateway (ABAP) solution will continue to stay, and this will be an option to expose SAP business functionality to the outside world. Meanwhile, SAP wants to push the OData agenda aggressively; thus we see a desperate attempt at enabling OData not only for SAP applications but for non-SAP applications too.


Note: API management has its own strong use-case. So as long as the product integrates well into the existing technology stacks of SAP, it should retain its leading edge.


Integration Gateway in SAP PO, in its current form, is almost a useless solution. As of today, the only feature available is for PO (via Integration Gateway) to connect to a Gateway (ABAP) system and consume an existing OData service, which adds almost NO value. What would be really worthwhile is a solution that can both consume and provision OData and, additionally, expose any data source as OData.


But SAP promises to develop Integration Gateway in SAP PO further, and we might soon see Integration Gateway in SAP PO become the Java counterpart of Gateway (ABAP), hopefully providing all of the existing Gateway (ABAP) features, the important aspect being that this time it would be able to use not only an ABAP backend as a data source but any backend. This would then leave customers:


1. To decide when to use which technology, based on skills and overall EA strategy

2. With an opportunity to consolidate the integration estate, i.e., SAP PO for OP2OP, OP2C, B2B and U2OP use cases.


We will have to wait and watch this space, for at this point we can merely speculate, and I am sure we can expect a diplomatic silence from SAP.


Last but not least, one open question remains: in a hybrid landscape, how will the SAP PO vs. HCI debate be addressed? Though there are no simple answers, a high-level approach could be based on the proximity of the core business application: if the core business application(s) is on-premise, use SAP PO; if the core business application(s) resides in the cloud, use HCI.


I hope this blog provides the reader with a detailed summary and helps them keep abreast of the technological advances in the area of Integration and Orchestration.


PS: Some of the most common questions across many of the integration sessions were around strategic direction, when to use what, and similar scenarios. I hope some of those queries have now been answered via this blog. But a note of caution: these are just my individual views and in no way do I represent SAP.


Integration Centric Process - Exception Handling in BPM


Many architects across organizations using PI (dual stack) tend to be very discouraging of using ccBPM in integration scenarios. They are happy to go to any extent in recommending workarounds so that ccBPM-based designs are avoided. This skepticism prompted me to write the blog 'Why are most skeptic about ccBPM and its usage in PI?'.

 

One of the points emphasized in that blog is proper exception handling. Exception handling is key to any program, application or even a process. So as one moves from the traditional BPEL-based ccBPM integration scenario to BPMN-based integration centric processes, exception handling will continue to be key. In this blog, we will focus on these aspects, which will hopefully ease adoption for consultants who are migrating from PI dual stack to Process Orchestration.

 

Note: SAP Process Orchestration Details - Release NW731 SP 10

 

The Scenario:

 

For simplicity's sake, the scenario we will use follows the below processing logic:

 

1. Read an XML file from an FTP server which has the input fields FName, LName and Age.

2. If Age is Greater than 70, then perform a simple mapping (modeled inside BPM itself) of setting the FName as the FullName of the output file.

3. If Age is Less than 70, then call a PI mapping (modeled inside ESR) which will concatenate FName and LName into FullName of the output file.

 

The Input File Structure;

 

dt_in_18nov.JPG

 

 

The Output File Structure;

 

dt_out_18nov.JPG

 

The scenario will be modeled as below;

 

exep1.JPG

 

Now let's make things interesting.

 

The automated activity PI_Mapping in the above Process Model is making a call from the BPM engine into PI and executing a mapping that was created in ESR.

The mapping is a very simple mapping as described in the below screenshot;

 

mapping_18nov.JPG

 

Now, in a real-world scenario, one of the common exceptions to be handled is the mapping failing on execution. To make our Process Model robust, let's introduce an exception handler so that, in case the message mapping fails in PI, we can take appropriate action. In our case, we will catch the exception and then write the exception to an Exception Output File.


Let's assume that the Exception file has the below format:


dt_outexp_18nov.JPG

 

How to introduce Exception Handling?

 

In our scenario, we want to handle exceptions for the PI mapping step. So in the Process Model, go to the Properties of the PI_Mapping automated activity. In the tab Boundary Events, add an event of the type TechnicalError.

 

addboundaryevent.JPG

 

Now we will perform an Output mapping for the newly created Boundary Event. You will find a standard data object 'parameters' that holds runtime data related to an exception. We will use two of the fields from this object (logId and rootErrormessage) and map them to the Data object of the exception file.

 

addboundaryevent2.JPG

 

We will now create a final automated activity to write the details of the exception to the Output Exception File to an FTP server. The final Process Model will be as below;

 

addboundaryeventfinal.JPG

 

Build and Deploy the Process.

 

Now, to ensure that the exception handling is working, let's try to simulate a mapping exception. The easiest way would be to modify the message mapping in ESR to force a failure.

 

Change the mapping for the Age field and introduce a substring function so that we can mock a String index out of range exception.
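For reference, the failure being mocked here is an ordinary Java string index error. The minimal, self-contained sketch below (the value "45" is just an illustrative Age, not taken from the actual test file) reproduces the same kind of exception outside of PI:

public class SubstringFailureDemo {
    public static void main(String[] args) {
        String age = "45"; // a typical two-digit Age value
        // substring(0, 5) exceeds the length of "45" and throws
        // java.lang.StringIndexOutOfBoundsException, the same class of error
        // that the modified message mapping produces at runtime.
        System.out.println(age.substring(0, 5));
    }
}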

 

mappingfail_18nov.JPG

 

Test the mapping in ESR to ensure that we do get a mapping failure;

 

mappingfailerror_18nov.JPG

 

 

Now, if we test the scenario end to end, we should find that the BPM process has triggered the boundary event and that the exception details have been captured in the Exception Output file.
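For illustration, the exception file then carries the two mapped fields. A hypothetical instance could look like the fragment below; the element names and values are assumptions based on the data object, not the exact content from the screenshot:

<ExceptionDetails>
   <LogId>0AB1C2D3E4F5</LogId>
   <RootErrorMessage>StringIndexOutOfBoundsException: String index out of range: 5</RootErrorMessage>
</ExceptionDetails>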

 

exceptionfile.JPG

 

Hope this acts as a starting point to help you factor in appropriate Exception Handling measures in your integration centric processes.

 

PS:

One can extend the error management strategy by introducing TechnicalError boundary events not only on individual steps but also on Sub-Processes. You can also model the process to implement retries on particular steps (a good scenario is when a web service call fails due to the web server being down) so that, instead of immediately acting on a failed step, the step can be re-triggered (as appropriate). I recommend the TechEd && d-code session INT361, which provides further details on error resolution strategies.

 

Reference Links:

 

1. Using NetWeaver PI’s Mapping in NetWeaver BPM

2. Modeling Exceptions and Exceptions Handling

3. INT361 : Error Handling in BPM-Based Processes Using SAP Process Orchestration

Impact of Integration changes - How to “talk” to non-SAP (and SAP) systems in 2014+


This is the third blog in a series that will look at the following 5 perspectives that are changing in the SAP world:-

 

 

A new world of integration

 

Once upon a time selecting the way to integrate to and from SAP was simple – the rule was use Exchange Infrastructure (XI) that quickly became Process Integration (PI), which was a combined ABAP and Java solution.

 

Now the world has moved on and the SAP solution set supporting process and data integration has grown massively; it now includes:-

 

  • PowerDesigner – A tool to model your Enterprise Architecture including processes and integrations (the stuff you might have used ARIS for in the past).
  • Process Orchestration – SAP Business Process Management (BPM), SAP Business Rules Management (BRM) and SAP Process Integration (PI) – but Java only now with 10x the throughput.
  • B2B Add-on for Process Orchestration – A set of specific B2B capabilities from a Trading Partner Management portal to EDI-specific adapters.
  • HANA Cloud Integration – A cloud based integration solution that comes pre-loaded with content to connect from on-premise to the cloud.
  • Data Services – Extract, Transform and Load technology.
  • Operational Process Intelligence – Real-time process analytics across SAP and non-SAP systems.
  • Event Stream Processor – Real-time analysis and correlation of complex events.
  • SAP Gateway – Expose and manage SAP services using OData for User Interface consumption.
  • SAP API Management – Publish and manage APIs both internally and externally.
  • Integration Advisor – Accelerated mapping between SAP and non-SAP systems – to be launched next year.

 

So plenty of options and you can’t apply the old rules…

Integration.png

So where do you start? Read on to find out.

 

First Step:  Understand where your problems are

 

I see lots of SAP implementations where the integration between systems is, quite frankly, locked in the last century. Some do not even use PI ("preferring" to dump and ship flat files), and those that do use PI often use it as a point-to-point file shipping service.

 

Anyway I will not dwell on the integration horror stories I have seen, but focus more on how you can start to take advantage of the long list of capabilities above.

 

The first thing to do is to analyze your integration issues / requirements. This means stepping back from your landscape and figuring out what isn’t happening in real-time, and once you have completed step 2 below, you can map the solutions to the issues / requirements.

 

So why is real-time so important? The answer in 4 words is “The Internet of Things”, another answer in 3 words is “The Networked Economy”, another in 2 words is “Omni-channel” and another in 1 word is “Amazon / Competitors”. If your systems are not updated in real-time and working as a cohesive unit, you will give the wrong answers to your customers, place the wrong orders with your suppliers, fail to meet your service commitments and produce the wrong products.

 

Whilst one SAP HANA system with all data in one place might be your target architecture, this isn’t going to happen overnight AND even when you achieve it, you will still need your “HANA Island” to talk to those of your customers, suppliers and partners.

 

As a starter for 10, look for these types of issues:-

 

  1. Where is data delivered between systems in “batch”?
  2. Where is data captured “off system” and entered once it is complete?
  3. Where are processes running, but you have no idea if they are running “right”?
  4. Where is data summarised prior to loading into systems because the data volumes are “too high” or the systems “can’t cope”?

 

Second Step:  Understand the options

 

The second step, which can be run in parallel with the first one, is to understand what the options are. If you are an SAP customer, http://service.sap.com/roadmap is a good place to start, with the product roadmaps and current capabilities spelt out in the well-known “Today – Planned Innovations – Future Direction” format - see supporting information below to jump to the key roadmaps. Some key new features to highlight would be:-

 

  • Simple Landscape
  • Up to 10x Performance – Java only landscape can deliver increased throughput by 10x on the same hardware platform for some scenarios.
  • Easier to Configure
  • Straight Forward B2B
  • Run on HANA
  • User Interface Focus
  • End to End Modelling
  • Dual Stack PI is no longer having new features added so you should migrate and not upgrade

 

Third Step:  Hardware and Software

 

A common first step when you look at the above products will be to decide that you need to migrate from dual-stack PI to Java-only PO. It is a migration, as the change in architecture required to simplify the landscape and increase performance meant some of the technical debt had to be left behind. The migration also means that the transfer does not need to be big bang; integrations can be moved across group by group, reducing the business risk.

 

SAP has delivered migration tools to move what is good from your PI landscape, and often it is an opportunity to re-work interfaces or re-implement them in tools that are better suited to the job (e.g. SAP Gateway, SAP API Management and SAP Data Services).

 

As with all areas of the SAP product set, deciding to run your new landscape on HANA will bring more features and benefits, such as being able to run PO, OpInt and ESP all on one landscape with zero duplication of data.

 

Fourth Step: Create an Implementation Programme

 

Finally the rubber needs to hit the road and you need to create an implementation plan. Usually this will see the issues / requirements listed above put into a timeline with associated costs / benefits driving how quickly the transition from a flat-file/batch organization to real-time will take.

 

Conclusion

 

If you go through the above process, I predict two things:

 

  1. You’ll find features in the modern middleware solutions from SAP that will add value to your business – some of which you will already be licensed to use. You will also realise that you must migrate from PI to PO to get these benefits.
  2. You will remove the “black box” techie view of interfaces and deliver a solution that truly integrates your systems and those of your partners.

 

Supporting Information

 

The link below gets you to all the product roadmaps from SAP (you need an S-user).

 

This is the link to the SAP PO roadmap, and this is a webinar from May 2014 by the product managers

 

This is the link to the SAP HCI roadmap, and this is a webinar from July 2014 by the product managers

 

This is the link to the SAP OpInt roadmap

 

This is the link for the SAP Gateway roadmap

 

This is the link for the SAP Gateway for Microsoft

 

This is the link to SAP API Management

Usage of Enterprise Integration Patterns with use cases


Dear SCN Friends,

 

I would like to share my knowledge on IFlow patterns, taking examples to explain them.


Enterprise Integration Patterns are based on BPMN 2.0 notation and are used as models to create integration flows. SAP provides the built-in patterns below. You can select any of the patterns and add activity blocks to the layout.

Patterns.jpg

 

Point-to-Point Channel: We select this pattern for direct communication between the sender and receiver.

 

Use case: We are supposed to send Employee details from SYSA to SYSB, and PI needs to send the acknowledgement back after receiving the request from SYSA.

P2P.jpg

Here the source and target structures are the same, so we used the same Data Type and Message Type and opted for a point-to-point connection.

 

Recipient List: We select this pattern when we need to route the message to two or more receivers based on conditions.


When we can determine the receiver using XPath conditions in the IFlow, we opt for this option. By clicking on the Edit button in the below window, a screen appears where you can put your XPath condition to determine the receiver.
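For example, a hypothetical pair of conditions (the element path and system names are purely illustrative, not taken from this scenario) could look like:

/EmployeeDetails/Region = 'EMEA'   -->   receiver SYSB
/EmployeeDetails/Region = 'APJ'    -->   receiver SYSC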

FixRec1.jpg

 

Recipient List (Dynamic Conditions): We select this pattern when we need to dynamically route the messages to the receivers.


Use Case: We are supposed to send Customer Address details from SYSA to either SYSB or SYSC based on some complex conditions, which demands a separate mapping to determine the receiver.


Below is the sample mapping for your understanding.


Dynamic Rec.jpg
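For orientation, such an enhanced receiver determination mapping targets the standard Receivers structure from the http://sap.com/xi/XI/System namespace. A hypothetical result that routes the message to SYSB could look like this (the service name is an assumption):

<ns1:Receivers xmlns:ns1="http://sap.com/xi/XI/System">
   <Receiver>
      <Service>SYSB</Service>
   </Receiver>
</ns1:Receivers>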

We opted for the Recipient List (Dynamic Conditions) pattern; now we need to select the above mapping, which determines the receiver.

DyneRec.jpg

If you want to take some action when no receiver is selected based on the condition, follow the below steps.

DC.jpg

The below window opens by clicking on the above link.

DRec.jpg

If you want to configure parameters for staging and logging, click on Runtime Configuration.

Staging.jpg

Here I'm setting the configurations globally in NWA. If you want to set them locally, then select the option "Use scenario specific configurations".

 

Message Translator: We select this pattern when the sender and receiver systems have different message formats.

 

Use Case: Our source and target structures are different, so we need a transformation (mapping) to convert them. In this case we select this pattern.

Mapping.jpg

In the above window, by clicking on the Browse button you can select your mapping.

 

NOTE: To any of these patterns you can add or delete flowstep elements, such as recipient list, interface split or mapping, from the graphical representation.

 

  • If you are using a Communication Channel for lookup, you can select the channel by clicking on Lookup Channels and selecting your channel.
  • If you are using a parameterised mapping, then under the Parameters tab your parameter name will appear; you just need to provide a value for it. The interface will take this value at runtime.

 

 

I hope you are now clear on Integration Patterns and have got some idea of enhanced receiver determination, parameterised mapping and channel lookups. Enjoy!

 

Regards

Venkat

New adapter in SAP Process Integration for consumption and provisioning of REST based services


For those of you who were following our roadmap sessions at TechEd and other events, you were probably eagerly awaiting this announcement. I am more than happy to tell you that we have finally shipped our own REST (Representational State Transfer) adapter with the latest SP of SAP Process Integration / SAP Process Orchestration. The REST adapter is supported as of 7.31 SP14 / 7.4 SP09.


In a nutshell, the following is supported:

  • Consumption and provisioning of REST based services
  • Default and custom URL patterns of the HTTP endpoint for provisioning
  • Basic and client-certificate based authentication
  • Mapping of REST operations to service interface operations
  • Support for XML and JSON message formats
  • JSON to XML conversion
  • Determine channel based on JSON element
  • Limit channel to particular operation
  • Extracting information from URL and HTTP operations

 

For more details about the adapter concept, its capabilities, and how it is configured, refer to the release notes.
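To get a feel for the consumption direction, an endpoint exposed via a REST sender channel behaves like any other HTTP/JSON service. The plain-Java sketch below is only an illustration; the URL path, credentials and JSON body are assumptions, not values from a real configuration:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RestCallSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint exposed by a REST sender channel on PI/PO
        URL url = new URL("https://po-host:50001/RESTAdapter/demo/orders");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setRequestMethod("POST");
        con.setRequestProperty("Content-Type", "application/json");
        // Basic authentication (user and password are placeholders)
        String auth = Base64.getEncoder()
                .encodeToString("piuser:secret".getBytes(StandardCharsets.UTF_8));
        con.setRequestProperty("Authorization", "Basic " + auth);
        con.setDoOutput(true);
        String json = "{\"orderId\":\"4711\",\"status\":\"NEW\"}";
        try (OutputStream os = con.getOutputStream()) {
            os.write(json.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP status: " + con.getResponseCode());
    }
}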

 

We will soon publish a couple of blogs where we will describe the concept and configuration along sample use cases, so stay tuned.


Integration Centric Process - Advanced Exception Handling in BPM


The blog Integration Centric Process - Exception Handling in BPM would have helped set the context around handling exceptions within your integration centric process. In this blog, we will take exception handling to the next level in an attempt to model a comprehensive solution and an integration centric process that is highly supportable.

 

Note: To maintain continuity, we will extend the Process Model that was defined in the earlier blog.

 

Adding a Notification

 

We can introduce a simple notification step within our process. This is helpful for triggering email notifications to the business process owner, Process Admin or the Application Maintenance team so that they can take the required follow-up action.

 

On the introduction of the Notification step, the process will look as below;

 

notificationonly24nov.JPG

 

You can configure the properties of the notification step to create an email. There is a concept of 'variables' that you can define. These variables can then be referenced as part of the Mail Subject and/or Body.

 

In our case, I decided to use the fields from the input file (FName, LName and Age), the Process instance id along with the actual exception details, to be embedded into the Email.

 

In the below screenshot, you can find the properties of the Notification step (Notify by Email). The variables, once added, can be assigned an Expression, and I have chosen the corresponding context objects from the expression editor. Once defined, these variables can be referenced in the email subject/body using curly braces, i.e. {}, as shown below:

 

notpropertiesfirstnov24.JPG.
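As a rough illustration, a subject and body assembled from such variables could look like the lines below; the variable names are hypothetical, not the exact ones from the screenshot:

Subject: BPM exception in process {ProcessInstanceID}
Body:    The PI mapping failed for {FName} {LName} (Age {Age}). Error details: {ErrorText}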

 

Upon the execution of the process (with an erroneous file), you will find an 'alert' email in your inbox as below;

 

notemailfirst24nov.JPG

 

The highlighted (orange boxes) are values of the variables at runtime, populated during the execution of the process.

 

Implement Retries for a failed Step

 

Even though notifications improve the overall maintainability of the process, what would be even better for some business processes is the ability to retry some of their failed steps.

 

Take the example of an automated activity that makes a web service call to an external application. It is quite possible that, due to network traffic at the time the call is executed, the web service could time out. Instead of ending the process then and there, implementing a retry logic would help the process continue and give it a better chance of successful completion.

 

Let us try to extend the above process model and implement a retry step for our automated activity.

 

Note: The automated activity in our process makes a mapping call into PI, so this might not be an ideal use case for a retry. But the idea here is to demonstrate the strategy so that you can implement it in other processes where it finds a use.

 

The process is extended further as below;

 

advBPMfull.JPG

 

 

To describe it briefly, what we are trying to achieve is a retry feature based on checking the value of a data object (Retry_Num). The data object is initialized to 0 at the start of the process, and when an exception occurs we increment it by 1. The exclusive choice then checks the value of Retry_Num. If it is less than 4, we route the flow back to the automated activity. A notification is also triggered informing that an exception has occurred and that the step is being re-triggered.

 

So, in short, in case of exceptions the automated step PI_Mapping, instead of proceeding to end the process, now attempts retries (a maximum of 3 times in our particular process).
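To make the control flow explicit, the retry pattern the model implements corresponds roughly to the plain-Java sketch below. This is a conceptual illustration only, not generated BPM code, and the method name is an assumption:

public class RetrySketch {

    // Stands in for the PI_Mapping automated activity; throws on failure.
    static void callPiMapping() throws Exception {
        throw new Exception("mapping failed"); // always fails in this illustration
    }

    public static void main(String[] args) {
        int retryNum = 0; // data object Retry_Num, initialized to 0 at process start
        while (true) {
            try {
                callPiMapping();
                break; // success: continue with the normal flow
            } catch (Exception e) {
                retryNum++; // the output mapping increments Retry_Num by 1
                if (retryNum < 4) {
                    // exclusive choice: fewer than 4 attempts, so notify and re-trigger
                    System.out.println("Retry " + retryNum + " after error: " + e.getMessage());
                    // (the intermediate timer introduces a wait before the retry)
                } else {
                    // retries exhausted: handle the exception (exception file, notification)
                    System.out.println("Retries exhausted, handling exception");
                    break;
                }
            }
        }
    }
}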

 

Properties of Intermediate Timer;

 

retrycheck.JPG

 

waiting24nov14.JPG

 

Properties of Exclusive choice step;

 

retrycheck.JPG

 

Mapping to increment the Data object;

 

outputmapincrement24nov14.JPG

 

Properties of Notification Step;

 

notretry24nov14.JPG

 

On testing with an erroneous file, you will find emails in the below order informing of the exception, the retry events and the final exception handling.

 

Mail for the first retry attempt;

 

exp1nov24.JPG

 

Mail for the second retry attempt;

 

exp2nov24.JPG

 

Mail for the third and final retry attempt;

 

exp3nov24.JPG

 

Exception handled and the notification is triggered;

 

exp4nov24.JPG

 

 

With this, hope you now have options to make your Integration Centric Processes much more supportable.

 

Reference:

 

INT361 : Error Handling in BPM-Based Processes Using SAP Process Orchestration

Step by Step Document for Simple BRM Rule set


Dear SCN Friends,

 

SAP NetWeaver Business Rules Management (BRM) provides a powerful toolset covering all aspects of rule handling in business applications. Message processing in SAP NetWeaver Process integration (PI) can take advantage of those business rules in multiple ways.

 

Below are the most common use cases in PO where we can make use of BRM.

 

1) Within routings, business rules may provide additional data

      Ex: Extended Receiver Determinations and Enhanced Interface Determinations

 

2) Within a mapping step, business rules information may enrich the original data, providing lookup and value mapping functionality. A BRM lookup is efficient compared to other lookups (calling external systems) when BRM is installed along with the PO server in a single instance.

 

3) Within NW BPM, rules may also be used to enrich process data or to provide decision criteria for condition steps.

 

I would like to share my knowledge on BRM ruleset creation by taking a simple use case to explain.

 

Use Case: We need to calculate an Employee's Hike based on his/her Region and Grade.

 

 

Step 1: In the PO server, create a Data Type (HikeCalc) and a Message Type (HikeCalc) with the below fields.

BRM_DT.jpg

 

Step 2: In the XSD tab of the Message Type, copy or export the XSD to a local file.

BRM_MT_Export.jpg

Step 3: Now open NWDS and follow the below navigation to open the BRM Rules Composer.

 

Window --> Open Perspective --> Other --> Rules Composer.

 

<-----------------------------------------------------------------BRM Project Creation----------------------------------------------------------------->


Step 4: Create a New Project in the BRM Rules Composer.

 

File --> New --> Project.

Project.jpg

Select Development Component and Click on Next

DC.jpg

Select the Rules Composer and Click on Next

RC.jpg

Select MyComponent under LocalDevelopment and click on Next.

SWC.jpg

Name the project as per your functionality and click on Finish.

Proj.jpg

Step 5: Import the XSD into the Project.

WSDL_Import.jpg

Select the XSD option and click on Next.

XSDImport.jpg

Now browse for your XSD file on your local machine and click on Finish. We can now see our XSD imported into our Project.

WSDL.jpg

Step 6: Add the XSD by double-clicking on “Project Resources”; select the “Alias” tab, click on “Add” and select “XSD Element”:

 

AddXSD.jpg

Select the imported XSD and click on “Finish”.

Select the XSD elements which will be used in the development of the rule or decision table. Expand the XSD structure and check the following elements to be used:

  • HikeCalc/Grade
  • HikeCalc/Region
  • HikeCalc/Hike = {String}

 

Those elements designated as “= {something}”, such as Hike, are used for output. Those without it are used as input.

BRMRules1.jpg

Step 7: Create a ruleset by right-clicking on “Rules Modeling” and selecting “New Ruleset”:

RulesSet.jpg

Give a name to the ruleset and click on OK.

NewRules.jpg

Step 8: Now create a Decision Table. This will help to determine the employee hike. This table has Grade and Region as inputs, and Hike as the output.

 

Create a new decision table by right-clicking on “Decision Table” and selecting “New Decision Table…”:

DT.jpg

Give a name for the Decision Table and click on Next.

DTName.jpg

Select Grade and Region as the conditions of the table (these are the inputs) and click on the down arrow button.

COND1.jpg

  Click on Next

COND2.jpg

Select the Action HikeCalc/Hike and click on the down arrow button; the below screen will now appear.

COND3.jpg

Click on Finish.

Now, for adding conditions, right-click on DefaultValue under the HikeCalc/Grade column and click on "Add Condition Values by Pressing Ctrl++".

Table.jpg

Give the values (one value per line) and click on OK.

Val.jpg

Repeat the same for HikeCalc/Region.

 

Now enter the values for HikeCalc/Hike by double-clicking and entering the values manually.

Val2.jpg

 

Step 9: Now create a Rule by double-clicking on the Rules folder --> New Rule.

NewRules.jpg

Give a name for the Rule and click on OK.

RulesCreation.jpg

Double-click on the below highlighted rule.

Rules.jpg

In the Rule development screen, click “+” under the “If” condition. Replace the left-hand side with “Boolean.TRUE”. Repeat the same for the right-hand side of the Equals.

Rules_CC.jpg

Now assign the Decision Table by clicking on “+” under the “Then” clause, and select your Decision Table “EmpHikeCalcTable”.

DecTable.jpg

Now the below screen will appear.

DTable.jpg
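Conceptually, the ruleset now always fires and delegates the lookup to the decision table. The plain-Java sketch below mirrors that behavior; the Grade/Region values and hike percentages are hypothetical, not the ones maintained in the screenshots:

import java.util.HashMap;
import java.util.Map;

public class EmpHikeCalcSketch {

    // The decision table: (Grade, Region) -> Hike, with made-up example rows
    private static final Map<String, String> DECISION_TABLE = new HashMap<>();
    static {
        DECISION_TABLE.put("A|US", "10%");
        DECISION_TABLE.put("A|INDIA", "12%");
        DECISION_TABLE.put("B|US", "8%");
        DECISION_TABLE.put("B|INDIA", "9%");
    }

    // The rule: if (Boolean.TRUE equals Boolean.TRUE) then evaluate the decision table
    static String calculateHike(String grade, String region) {
        return DECISION_TABLE.getOrDefault(grade + "|" + region, "no hike defined");
    }

    public static void main(String[] args) {
        System.out.println(calculateHike("A", "INDIA")); // prints 12%
    }
}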

Step 10: Now create a web service from the Rule. Use the below navigation to create it.

WS.jpg

Just navigate through a couple of screens and click on the Finish button.

 

<-----------------------------------------------------------------Building the BRM Project----------------------------------------------------------------->

 

Step 11: Now we need to build the project by right-clicking on the project --> Development Component --> Build --> Click OK.

DC_Build.jpg

 

<-----------------------------------------------------------------Deploying the BRM Project----------------------------------------------------------------->


Step 12: Now we need to deploy the project by right-clicking on the project --> Development Component --> Deploy --> Click OK.

 

Deploy.jpg

<-----------------------------------------------------------------Testing----------------------------------------------------------------->


Step 13: You can test your BRM rule using the WS Navigator.

 

Go to NWA --> SOA --> Application and Scenario Communication --> Single Service Administration.

Testing.jpg

Give a filter relevant to your project, say "*empHike*" --> Click on Go --> Select your WSDL port type name --> Click on WSDLs.

Testing1.jpg

You can see two buttons, "ZIP Download" and "Test". First click on "Test"; it will navigate to the WS Navigator page.

 

Select "invokeRules" and click on Next.

WSNav.jpg

Give the Input Values and click on next

TestInp.jpg

Below is the desired output.

Output.jpg

 

Now we are done with the BRM part.

 

In a separate blog, I cover calling the above BRM rules from an SAP PO mapping (using SOAP lookup) and calling them directly from BPM, with a use case.

The prerequisite for this is downloading the WSDL. In the above "Single Service Administration", click on the ZIP Download button and save the WSDL on your local machine.

 

Regards

Venkat

Exposing Gateway Services from SAP Process Orchestration


When it comes to exposing data in the OData format from your SAP backend systems, SAP Gateway is your go-to solution. There are a number of different deployment options for how you can set up your system landscape for this scenario.

 

Since the release of SAP Process Orchestration 7.31 SP13 or SAP Process Orchestration 7.40 SP08 you have the additional option to expose SAP Gateway services from your SAP backend systems as OData services directly from your Process Orchestration server.

 

This has the benefit that it can replace the need to install a separate Gateway Hub server in many cases, thus lowering the TCO of your IT landscape. Also, it gives you the possibility to consolidate the exposure of your different web services in one location - the SAP Process Orchestration server.

 

Additionally, there are some technical upsides to this approach: for example, you might create a BPM process with some generated TaskUI screens, running directly on your SAP Process Orchestration server. If you now want to add some SAP backend data that is not a part of your process context, you can expose this data as an OData service from the same server, thus avoiding cross-origin requests that would otherwise trigger security warnings.

 

At the moment, there are some scenarios in which you can't use this feature:

  • A SAP Process Orchestration Java Single Stack installation is a pre-requisite
  • SAP Fiori app deployment is not yet supported
  • Unified Inbox is not yet supported

 

The OData Provisioning feature is a part of the usage type AEX and therefore comes pre-installed with SAP Process Orchestration. The only thing left for you to do, is to assign the proper authorizations GW_ADMIN and GW_USER to the respective users.

 

You can find the official documentation for the feature Exposing Back-End Data as OData Service here:

 

I now want to show you how easy it is to expose a Gateway service on SAP Process Orchestration:

 

Adding a new destination to your backend system

First you want to add a new destination in your NetWeaver administrator console pointing to the IWBEP service in your SAP backend system. This is the "catalogue" from where all the Gateway services are exposed.

      1. Go to your NetWeaver Administrator interface:
        http://<SAP Process Orchestration server>:<port>/nwa
      2. Go to Configuration --> Destinations
        Picture1.png
      3. In the Destination List, click on the Create button:
        Picture2.png
      4. In the Destination Wizard, select your Hosting System, assign a Destination Name and select the Destination Type "HTTP":
        Picture3.png
      5. In the Connection and Transport Security Settings step enter the connection information to your SAP Gateway IW_BEP service and enter the Additional Settings for SAP Systems:
        Picture4.png
      6. If you use a HTTPS connection, select the Trusted Servers Certificates Keystore, where the (root) certificate of the backend server is stored:
        Picture5.png
      7. In the Logon step enter the logon information to authenticate against the IW_BEP service in the SAP backend:
        Picture6.png
      8. To test your destination, select the new entry and click on the Ping Destination button; you should receive a success message if everything is configured correctly:
        Picture7.png

Register Gateway Services in the OData Provisioning Administration interface

      1. Go to Integration Gateway OData Provisioning Administration:
        http://<SAP Process Orchestration Server>:<Port>/
      2. Click on Register a New Service
        Picture8.png
      3. Select the backend destination you created in the NetWeaver Administrator
        Picture9.png
      4. Perform a search for the service you would like to register from SAP Gateway
        Picture10.png
      5. Select the service and click on Register
        Picture11.png
      6. Open the Service Document
        Picture12.png
      7. View and work with your OData Service published on the Process Orchestration Server
        Picture13.png

This is all you need to do to expose an SAP Gateway service on your SAP Process Orchestration server. Apart from the unsupported scenarios mentioned above, you can do everything with this service that you could do with a regular Gateway service.
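Beyond the browser, the published service document can also be fetched with any HTTP client. The small Java sketch below simply reads it with basic authentication; the host, port, service path and credentials are placeholders for illustration, not values from an actual landscape:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ODataServiceDocumentCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder URL of an OData service exposed on the Process Orchestration server
        URL url = new URL("https://po-host:50001/gateway/odata/SAP/ZDEMO_SRV/");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        String auth = Base64.getEncoder()
                .encodeToString("gwuser:secret".getBytes(StandardCharsets.UTF_8));
        con.setRequestProperty("Authorization", "Basic " + auth);
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(con.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // prints the OData service document (XML)
            }
        }
    }
}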

Results of the global NetWeaver PI Survey 2014 (Part 1 of 2)


As in previous years, the International Focus Group for Integration (IFG for Integration) performed its annual survey for SAP NetWeaver PI. The survey ran from May till August 2014 and was open to all users and companies using SAP NetWeaver PI, PO or XI as an integration platform. The survey contained 23 general questions and 10 questions about this year's focus topic, BPM.

 

Many thanks to all of the 362 participants from over 20 different user groups that replied to our survey!

 

The results are published in two separate blogs. This first one covers the general questions which have remained the same as in the previous years. The second blog with the results of the focus topic BPM will be published in the following weeks. The complete survey results are forwarded to all participants who have included their email address in the survey registration.

 

Survey Results

The following section shows an extract of the results.

 


Number of participants: 362

 

Participants per user group:

ParticipantsPerUserGroup.PNG

 

Used productive release (XI, PI, PO):

PIReleases.png

An interesting development is that the latest releases 7.31 and 7.40 have already gained a very good adoption rate, while PI 7.11 is still the most used PI release.

 

Number of interfaces (counting point-to-point connections):

NumberOfInterfaces.PNG


Evaluation of system performance:

SystemPerformance.PNG

 

Indication whether monitoring is still a major pain point:

 

MajorPainPoints.jpg

 

Plans to use central PI monitoring on SAP Solution Manager:


SolManPIMonitoring.PNG


SAP has addressed the monitoring pain point with improvements to the local PI monitoring as well as the central PI monitoring via SAP Solution Manager. One problem of the Solution Manager based monitoring is still its low adoption rate. When analysing the survey results in detail, one interesting and important finding was that even for the 58% of companies already using the central PI monitoring of SAP Solution Manager, monitoring is still a major pain point.



Weighting of pain points over the last 4 years:


GeneralPainPoints.PNG


The general question regarding pain points confirms that performance and stability are not critical issues for most of the companies, while end-to-end monitoring remains at a constantly high level.

 

Stay tuned for part two of the survey results to be published in December.

Using Idocs in NWBPM


SAP has provided the below link, which explains how to import IDocs in NWBPM.


http://help.sap.com/saphelp_nw73ehp1/helpdata/en/f1/24e6e6f548480b85197bde372d13c9/frameset.htm

 

As explained in the above link, I followed the same steps, but when I tried to import the IDoc in NWBPM, I got the below error.

 

"An internal error occurred during: "Importing MOIN models for XSDs and WSDLs..."

 

java.lang.OutOfMemoryError: Java heap space

 

At first I thought this error was related to an NWDS issue, but later analysis showed it is related to the IDoc service interface.

 

Then, after a few trials and errors, I finally imported the IDoc service interface in NWBPM. In this blog I would like to share how we can import the IDoc service interface in NWBPM.

 

First we need to follow the steps mentioned in the below link.

 

http://help.sap.com/saphelp_nw73ehp1/helpdata/en/f1/24e6e6f548480b85197bde372d13c9/frameset.htm

 

 

For IDocs, some of the fields/segments will have the occurrence "999999999".

 

1.jpg

 

2.jpg

The tricky thing is to replace "999999999" with "unbounded".
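For example, an affected element in the exported XSD changes from something like the first line below to the second; the segment name is illustrative only:

<xsd:element name="E1EDP01" minOccurs="0" maxOccurs="999999999"/>

<xsd:element name="E1EDP01" minOccurs="0" maxOccurs="unbounded"/>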

 

3.jpg

and use this updated XSD in the service interface which will be used in NWBPM.

 

After importing this service interface in BPM, you can see the idoc namespace under "Data Types".

4.jpg

 

I hope this blog helps in importing IDocs into NWBPM.

What’s new in SAP Process Orchestration 7.31 SP13 / 7.4 SP08


As we are close to the holiday season, we cannot resist providing you with a little gift: the new features and enhancements in SAP Process Orchestration 7.31 SP13 / 7.4 SP08. After reading this blog you will surely want to use the time now to upgrade to these latest SPs and leverage all the new enhancements.

 

Gateway Java

One particular highlight of this version is a new deployment option for Gateway. Now SAP Process Orchestration can serve as a Gateway hub via the embedded Gateway (Java) component. Basically, we took the Gateway ABAP and ported it to Java. As only one middleware system is now needed to do application integration and expose services from the backend for UI consumption, this simplifies the landscape (though the backend component within the backend system is still needed). The backend component IW_BEP is already part of NetWeaver 7.40; for older releases it can be deployed as an add-on. Via the Gateway Java component the backend data can be exposed via the OData channel (ODC); multi-destination, i.e. the aggregation of data from multiple backend systems, is also supported. While this is great news, there are currently also some restrictions: only connections to ABAP backend systems are supported; Fiori apps cannot be run; the Unified Inbox cannot be run…but it is planned to work on this, it’s on our roadmap!

1.png

Sounds good to you!? Then please take a look at this blog to get further details:

Exposing Gateway Services from SAP Process Orchestration

 

 

New Process Integration features

For all the PI fans out there, let’s take a look at the new PI features:

  • Display flat file content on receiver side
  • Integration Directory programming interface enhancements
  • Support of receiver rules in integrated configuration
  • Copy content of successful message to new message and resend it
  • WS adapter supporting WS-RM, and SAML sender vouches
  • Support for receiver determinations without Sender Agreement in the message flow discovery
  • Disable field data type validation during IDoc-XML parsing (see SAP note 2036878)

 

Customer Connection Requirements

We have now closed the customer connection project and shipped the last missing feature: flat file display in PI monitoring for the sender-side File adapter. So now you can store the original or converted payload, e.g. for auditing reasons, or also test the file conversion. With the new parameter messageLog the new log version is added.

2.png

Support of receiver rules in Integrated Configuration object

We have now closed a gap in the Integrated Configuration object on the Java side. You are able to maintain a receiver determination condition once and reuse it in different configuration scenarios. The receiver rules specify the conditions under which messages are forwarded to particular receivers. These rules can be reused across receiver determinations and Integrated Configuration objects.

3.png

For the next SP it is planned to provide this also for Integration Flows.

 

Copy content of successful message to new message and resend it

For recovery of the receiving backend system, successfully processed messages can be sent again, or to be more precise copied messages will be sent in this case. Here, we do support two flavors, either send immediately or copy only and schedule later.

4.png

New Web services adapter on Adapter Engine

The Web service adapter also supports WS-RM (1.0 or 1.1) with SOAP versions 1.1 and 1.2. Furthermore, basic authentication and authentication via client certificate, WS-Addressing, HTTP proxy, WS-Security signatures and SAML sender vouches are supported.

5.png

6.png

 

 

Business Process Management features

There is also some news on the BPM side of the house, for the developer (Dev), for the administrator (Admin) as well as for the end user (User). Let’s see what’s in.

 

Enhanced OData Service: Get Process Instance Details (Dev)

We have further enhanced the OData services which come out of the box with an additional service to get information about a process instance. Basically, you provide the process instance ID and get several attributes, e.g. name, subject, date etc. Exactly the same functionality is already available in the Java API.

7.png

8.png

 

Task UI on external system (Dev)

The integration of custom UIs has been enhanced. Now you can use placeholders to refer to the HTTP destination instead of deploying the task UI on the same system where BPM is running and using a relative URL. The placeholder is replaced at runtime with the destination URL, supporting root and sub-paths. In this case SSO is very much recommended, as you don’t want to disturb your users with an additional logon screen, right?!

9.png

 

Generate SAPUI5 task UI (Dev)

We know many of you have waited for this… and now it’s there! With this version you can now generate SAPUI5 task UIs, as you are used to with Visual Composer or Web Dynpro UIs. For sure you will get the full source code of the SAPUI5 application, which can be adapted as required. Cool, isn’t it!? Currently only simple types are supported, but there is more to come.

10.png

 

Re-import service interface from ESR (Dev)

Once a service interface has been changed, you can now simply reimport this into Process Composer while preserving existing mappings.

11.png


Delete process instances without archiving (Admin)

There is an option available (job “bpm_delete_only”) to not write data to archive store (e.g. for testing) and just delete the selected process instances.

12.png

 

Save custom filters as preset in NWA Manage Processes (Admin)

The administrator will now also find usability improvements in the monitoring area, such as saving custom filters.

13.png

 

Show total count of process instances (Admin)

A small one; but in case you have a lot of process instances running (you surely have) and your list gets cut off… you will get the total number of active processes with one click.

14.png


Completing task with comment (user)

The end user is now able to leave an optional comment when the task is completed directly in the inbox with custom actions (e.g. approve or reject).

15.png

 

That’s all…

Hope you have enjoyed the information about these new features – so upgrade, make use of them and gain the benefits right now.

We wish you happy holidays and all the best for 2015, and we look forward to sharing the news with you next year – stay tuned.

 

Additional information:

Claim Check enhancements in SAP Process Orchestration



Introduction


The Claim Check pattern can be implemented with SAP NetWeaver Process Orchestration EhP1 for SAP NetWeaver 7.3 SP5 and higher. This article explains the enhancements to the Claim Check pattern, available with SAP NetWeaver 7.3 EhP1 SP14 or higher.

 

It is easier to understand this with an example.  Consider a scenario where an applicant needs to apply for a renewal of his passport, as his passport is about to expire. This application needs certain documents to be submitted.  Examples of these documents are the proof of the applicant's address, age, identity etc.  The electronic copies of these documents are typically large, of the order of a few MB.  For the passport renewal, the applicant only needs to enter minimal details and the rest of the details would be looked up by the Passport processing application.

SAP Process Orchestration Solution

Let's see how to implement a solution for this scenario using Process Orchestration.  The application for passport renewal can be submitted only after all the relevant documents have been uploaded.  An integration centric process can be modeled in SAP Business Process Management with a Claim Check pattern so that the entire message containing the large payload(documents in this case) is not loaded into the BPM context, but only the minimal required data can be sent to BPM.  The process can collect all the required documents and once we have the full set of documents to proceed, the application can be submitted.

Infrastructure delivered with SAP NetWeaver Process Orchestration for Claim check

Dedicated Message Store

  • Stores the whole message externally, outside of BPM
    (available with SAP NetWeaver Process Orchestration EhP1 for SAP NetWeaver 7.3 SP5 or higher)

SCPMappingBean

  • Extracts the large message and stores it in the Message Store
  • The message ID is used as the key and is handed over to the BPM process

    (available with SAP NetWeaver Process Orchestration EhP1 for SAP NetWeaver 7.3 SP5 or higher)

 

 

SCPMergingBean

  • Retrieves the large message from the message store, given the message ID.
  • Transforms the messages based on the defined mapping.

     (New, available with SAP NetWeaver Process Orchestration EhP1 for SAP NetWeaver 7.3 SP14 or higher)

What's new?

The enhancements for the Claim Check pattern are briefly summarized here:

  • Large payloads are not loaded into the BPM process context
    With SAP NetWeaver Process Orchestration EhP1 for SAP NetWeaver 7.3 SP5, the aggregated large message was still retrieved into the BPM process context itself. This step is now eliminated by using the new SCPMergingBean in the receiver channel of the IFlow, so that the large payload is retrieved only just before the message is delivered to the receiver.
  • Support of data enrichment in the BPM process context
    With SAP NetWeaver Process Orchestration EhP1 for SAP NetWeaver 7.3 SP5, data enrichment done to the small message in the BPM context would be lost, as only the message IDs were collected in the BPM context. This issue has also been addressed with the enhancements.

Steps to implement the Claim Check pattern

Step 1: Interfaces

1)  Define an interface in the Enterprise Service Repository(ESR), with the required fields as shown below. This will be referred to as the "Large" interface in this article.

           


  • Note: The 'PassportAttachment'  data type has an element BinaryData that stores the large files.


2)  Define an interface in ESR which has only the minimal fields required for processing in the BPM context.  This will be referred to as the "Lean" interface in this article.  The interface will have one additional field to store the messageID, to be used in subsequent steps.   

Step 2: Mappings

1) Define a message mapping to transform the Large message to the Lean (small) message as shown below. Here, messageId is a mandatory field with its value set to 0.

2) Define a multi mapping to transform the small messages into an aggregated message as shown. Here, the source side of the mapping has both the small message and the corresponding large message, and the target is the aggregated message (in this scenario the aggregated and the large message are the same).

In case data is enriched in the BPM context, the mapping can be defined in such a way that the enriched content from the small message is mapped to the target, while the other fields are mapped from the large message, as shown here.


Note: For both of the above message mappings, we need to define Operation mappings to be used in subsequent steps.

Step 3: Configuring the SCPMappingBean

  The SCPMappingBean is used in the receiver channel of IFLOW1.


     

Parameters of the SCPMappingBean (an illustrative module configuration follows the list):

    •     mappingid:  Object ID of the operation mapping in the ESR. Here we need to provide the operation mapping defined for the Large-to-Lean message transformation.
    •     persistuntil:  Time period in days for which the message is kept in the message store at a minimum.
    •     xpath:  XPath expression pointing to the field of the Lean interface in which the bean will place the message ID.
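Putting this together, the module tab of the receiver channel of IFLOW1 could look roughly like the sketch below. The module key and the parameter values are illustrative placeholders only (use the Object ID of your own operation mapping and an XPath that matches your Lean interface); the parameter names are the ones listed above.

    Processing Sequence (receiver channel of IFLOW1):
      Number  Module Name     Type                   Module Key
      1       SCPMappingBean  Local Enterprise Bean  claimcheck
      2       <standard module of the receiver adapter>

    Module Configuration:
      Module Key  Parameter Name  Parameter Value
      claimcheck  mappingid       <Object ID of the Large-to-Lean operation mapping in the ESR>
      claimcheck  persistuntil    30
      claimcheck  xpath           /ns0:PassportApplication_Lean/MessageID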

Step 4: Configuring the SCPMergingBean

The SCPMergingBean is used in the receiver channel of IFLOW2.


Parameters of the SCPMergingBean (an illustrative module configuration follows the list):

  • mappingid:  The Object ID of the operation mapping in the ESR. Here we need to provide the operation mapping defined for the multi mapping which transforms the small messages into the aggregated message.
  • messageidxpath: XPath expression pointing to the field of the Lean interface from which the bean reads the message ID used to retrieve the corresponding large message.
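Analogously, an illustrative module configuration for the receiver channel of IFLOW2 might look as follows; again, the module key, mapping ID and XPath are placeholders to be replaced with the values from your own ESR objects and Lean interface.

    Processing Sequence (receiver channel of IFLOW2):
      Number  Module Name     Type                   Module Key
      1       SCPMergingBean  Local Enterprise Bean  merge
      2       <standard module of the receiver adapter>

    Module Configuration:
      Module Key  Parameter Name  Parameter Value
      merge       mappingid       <Object ID of the multi mapping (small + large to aggregated) in the ESR>
      merge       messageidxpath  /ns0:PassportApplication_Lean/MessageID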

Step 5: BPM Process model changes

Steps required in BPM process model:

  • Import the Lean interface to be used in the Start and Intermediate message events.
  • The output of the Start and Intermediate message events is mapped to a data object to store the small messages. The Intermediate message event uses a correlation condition on the Passport Application Number so that subsequent documents reach the same process instance (an illustrative condition follows this list).
  • The automated activity 'Get Applicant details' is assigned to a custom web service which returns the applicant details (email ID, place of birth, date of birth) for a given Passport Application Number. The content of the small message is enriched with this web service call.
  • Once all the required data for submitting the passport renewal application has been collected, the data is sent to the Passport System using the automated activity "Submit Passport data". The automated activity makes an outbound call (XI 3.0) to send the small messages to the receiver.
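For illustration, the correlation condition on the Intermediate message event could be expressed as below; the element and data object names are assumptions based on the scenario described in this article.

    PassportApplicationNumber (incoming Lean message)  =  PassportApplicationNumber (process data object)

With this condition in place, every subsequent document submitted for the same application number is routed to the already running process instance instead of starting a new one.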

Step 6: Solution

The solution is briefly explained here before we get into further details.

1) The passport application, along with one or more attached documents, is sent to IFLOW1. This message gets transformed into a small message (with very few fields).

2) The small message will start a BPM process, and the subsequent messages (containing the other documents) whose correlation key (Passport Application Number) matches will be consumed by the Intermediate message event.

3) Once all the required data has been collected by the process, the messages are sent to the Passport Application System using IFLOW2, which takes care of transforming these messages into one aggregated message. The aggregated message contains the passport application with all the required documents.


Results of the global NetWeaver PI Survey 2014 (Part 2 of 2)


While the first blog Results of the global NetWeaver PI Survey 2014 (Part 1 of 2) focused on the general questions of the annual PI survey of the International Focus Group for Integration (IFG for PI), this second one gives an insight into last year's focus topic, BPM.

 

Survey Results of Focus Topic BPM

 

Notation used for documenting and modeling business processes, including tools/notations for upfront discussions with the business

ModelingNotation.PNG

 

Used solution / tool for the documentation of business processes

BPMDocumentation.PNG

 

 

Is SAP Process Orchestration a strategic tool in your company?

(-- no interest at all, - minor priority, 0 neutral, + important, ++ very important)

POStrategicTool.PNG

 

Current and planned processes based on SAP NetWeaver PO

POScenarios.PNG

 

System strategy for human-centric processes with SAP NetWeaver BPM

 

SystemStrategy.PNG

 

Thanks again to all participants of the 2014 survey. The survey for 2015 is scheduled from May till August and will be announced on SCN at the beginning of May.

What’s new in SAP Process Orchestration 7.31 SP14 / 7.4 SP09


The latest SP of SAP Process Orchestration 7.31 / 7.4 has been available since the end of last year. So it's about time to show you what features and enhancements we have shipped. I'm sure you will like them.

 

New CTC template for SLD self registration

There might be situations where you need to re-run the SLD self registration of your SAP Process Orchestration system, e.g., your current registration became inconsistent, or the configuration requires changes. With the new CTC template in the NetWeaver Administrator, you can re-register all components of an Advanced Adapter Engine Extended (AEX) or SAP Process Orchestration (PO) in the SLD. You can also use this wizard to connect the AEX/PO to another (central) SLD as part of the post-installation process. See also SAP note 2034226.

01 CTC.png

Receiver Rules in Integration Flows

Having shipped the support for Receiver Rules in Integrated Configuration objects with 7.31 SP13 / 7.4 SP08, we now support Receiver Rules within Integration Flows as well, with one SP delay. Receiver Rules are re-usable XPath conditions that determine the recipients of incoming messages. In the Process Integration Designer perspective of the SAP NetWeaver Developer Studio, you can create, read, change, and delete Receiver Rules. You can also access all rules that have been previously created within the Integration Directory. In order to use the rules in Integration Flows, a new routing technique has been introduced: on the Routing Behaviour tab, choose Use Receiver Rules and add one or more rules.

02 Receiver Rules.png
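To give you an idea of what such a rule contains (the namespace, element and receiver names below are made up), a Receiver Rule is essentially a reusable condition/receiver pair such as:

    Condition: /ns0:PurchaseOrder/Header/Country = 'DE'   ->   Receiver: ERP_Germany
    Condition: /ns0:PurchaseOrder/Header/Country = 'US'   ->   Receiver: ERP_US

Because the rule is stored as a separate object, the same country-based routing can be referenced from several Integration Flows instead of repeating the XPath condition in each of them.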

Supporting multiple mail attachments for sender mail adapter

So far, when reading mails containing multiple attachments from a mail server, the sender mail adapter only supported the creation of one single PI message, to which the mail attachments were attached. By adding the module XIMailAdapterModule to the module chain of the sender mail adapter, you can process each attachment as a separate PI message. Depending on the chosen settings, you can either create PI messages for all attachments plus the actual payload of the mail, or for the attachments only, ignoring the payload. See also SAP note 2040884.

04 mail.png

Extended EOIO handling

For messages that need to be processed in a specific order, e.g., a purchase order needs to be processed before a change order, the Quality of Service (QoS) Exactly Once in Order (EOIO) is guaranteed by putting all messages on hold as long as the respective predecessor hasn't reached a final status. EOIO messages that ran into an error need to be handled manually to resolve the error and to resume the queue processing. There might be situations where you would rather accept a violation of the EOIO delivery in order to avoid blocked queues, e.g., for messages that repeatedly fail during message delivery. For this case, we have introduced a new feature that allows you to automatically remove erroneous messages from the EOIO queues so that the sequences can be continued. The exception handling can be configured individually per serialization context. You have two options: either move the failed message to an error queue, or remove it from the queue, hence changing the QoS from EOIO to EO.

05 EOIO.png

New adapter for REST based services

A new adapter has been delivered for the provisioning and consumption of REST (Representational State Transfer) based services. It runs on the Adapter Engine and hence is supported on both the PI dual stack and Process Orchestration. We have recently published a series of blogs explaining the concepts of the adapter as well as selected features along sample scenarios, so I will keep it short here and refer you to the blog series overview page. If you would like to learn what is planned beyond the first shipment, see also my announcement from last year.
06 REST.png

Monitoring enhancements

The Message Flow Monitor in SAP Solution Manager allows you to track messages end-to-end. So far, only asynchronous scenarios were supported. We have enhanced the data collector in SAP Process Orchestration to support the monitoring of synchronous scenarios as well. The prerequisite is that you have switched on the logging of the respective synchronous messages. Besides this, we have introduced a new status, Application Error, for messages that were processed successfully on the Adapter Engine but ran into an application error in the backend. You can search for messages that resulted in an application error, and the new status is displayed in the local message monitoring as well as in the Message Flow Monitor. Furthermore, you can trigger alerts in case of application errors.

07 MFM.png

Enhancements of the copy and resend of successful messages feature

In the previous What's new blog for 7.31 SP13 / 7.4 SP08, we introduced a feature that allows you to copy and re-send already successfully processed messages. This is required for recovering lost messages in your receiving backend system in case a recovery is not otherwise possible. With the previous shipment, only copy and immediate send was supported. With this enhancement, a copy-only option is added. It allows you to create a copy of the successfully processed message so that you have the chance to edit the copied message before sending it. Note that this feature needs to be handled cautiously since it potentially violates the Exactly Once delivery. So, I would like to stress that performing this action requires a specific user role.

08 Copy.png

Further BPM OData services

The BPM OData services were initially introduced with 7.31 SP09 / 7.4 SP04 and have been continuously enhanced since then. The OData services provide simplified access to BPM processes and process instances so that you can build your own custom interfaces. The current enhancements target the administrator role, helping to build custom UIs for process administration and test automation, by adding support for querying a collection of process instances and for cancelling, suspending, and resuming process instances. You can explore the supported OData services on your Process Orchestration system by calling http://<host>:<port>/bpmodata. For more details about the BPM OData services, refer to the Custom UIs with the BPM OData service blog on SCN.

10 BPMOData.png
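If you want to take a quick look at what your system exposes, the minimal Java sketch below simply performs an authenticated GET against the /bpmodata root mentioned above and prints the response. Host, port, and credentials are placeholders, and the concrete service and entity set names (for querying, cancelling, suspending, or resuming process instances) are best taken from that service listing or from the referenced blog.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    public class BpmODataExplorer {

        public static void main(String[] args) throws Exception {
            // Placeholder host, port, and credentials - replace with the values of your PO system.
            String serviceRoot = "http://pohost:50000/bpmodata";
            String credentials = Base64.getEncoder()
                    .encodeToString("user:password".getBytes(StandardCharsets.UTF_8));

            HttpURLConnection connection = (HttpURLConnection) new URL(serviceRoot).openConnection();
            connection.setRequestMethod("GET");
            connection.setRequestProperty("Authorization", "Basic " + credentials);
            connection.setRequestProperty("Accept", "application/json");

            // Print the service document, i.e. the list of BPM OData services exposed by the system.
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(connection.getInputStream(), StandardCharsets.UTF_8))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }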

Improvements of the Claim Check pattern

The Claim Check pattern allows you to handle large message sizes within your BPM processes. You do not load the complete large payload into the BPM process context; instead, you store the large payload outside of BPM and keep a reference to the data in the context. This ensures that the size of the process context is kept to a minimum. The Claim Check pattern is often used together with the Aggregation pattern, where a number of incoming messages are collected and aggregated into a bulk message. In the Claim Check implementation so far, only the references were collected and kept within the process context; however, the very last step within your BPM process required the complete large payload to be retrieved from the staging area and put into the process context. This has changed with the new, improved implementation. Now, the data retrieval and the actual aggregation happen outside of the BPM process context, leading to improved runtime performance. Besides this, the enrichment of data in the BPM process context is supported.

11 ClaimCheckOldVsNew.png

 

Hope this made you curious. For more details of all new 7.31 SP14 / 7.4 SP09 features, check out the release notes.

Post Upgrade/Installation Developer Check


We are currently operating an SAP NW PO 7.31 SP8 system and will be upgrading to SP14 in the not too distant future. I wanted to put together a handover and acceptance checklist for my support team. The task was to perform a full system health check as quickly as possible whilst ensuring all relevant components are checked. Now, we could go on for hours checking everything in minute detail, but I don't believe that adds value. The following checklist is based upon my own experience and refers to the SAP document 'Readiness Check V1.4'.

 

I would be interested to see if anyone finds this useful, but even more importantly whether there are any omissions.

 

 

For each system
Full system restart completed
Java Component Check
Go to http://<host>:<port>/dir/start/index.jsp
Select NWA
Logon
Operations Management, Systems, Start/Stop (All processes, nodes running)
Check individual applications server nodes (Release and Production Only)
Go to http://<host>:<port>/dir/start/index.jsp (specific addresses as above)
Select NWA
Logon
Operations Management, Systems, Start/Stop (All processes, nodes running)
Adapter Engine Test
Go to http://<host>:<port>/dir/start/index.jsp
Select configuration and monitoring home
Logon
Select Monitoring - Adapter Engine, then Component Monitor
Ensure all components are active
Cache
Go to http://<host>:<port>/dir/start/index.jsp
Select configuration and monitoring home
Select Testing, then the Cache Connectivity Test
Start Test, wait then hit refresh
CPA Cache Full Refresh
Go to http://<host>:<port>/CPACache/refresh
Choose mode Full, hit send
Ensure test successful
Check Background Jobs are scheduled
Go to http://<host>:<port>/dir/start/index.jsp
Select NWA
Select SOA, Monitoring, Background Jobs Processing Monitor
Default Delete and Default Recover Job should be successful and active
Check Java jobs are scheduled
Go to http://<host>:<port>/dir/start/index.jsp
Select NWA
Select Operations, Jobs, Java Scheduler
Click on Tasks Tab
AlertConsumerJob
docDBTableCleanup
SLD Checks
Go to http://<host>:<port>/dir/start/index.jsp
Logon to the SLD
Administration, check server running
View Technical Systems
View Business Systems
View Products
View Software Component Systems
Create new Technical System
Create new Business System
Create new Product
Create New Software Component Version
Networking & Communications
RFC Destinations (check/test)
Go to http://<host>:<port>/dir/start/index.jsp
Select NWA
Logon
Configurations, Connectivity, Destinations
(Refer to document xxxxxxx for details of connected systems)
Ping Selected Communications Channels
Go to http://<host>:<port>/dir/start/index.jsp
Select configuration and monitoring home, Adapter Engine, Communication Channel Monitor
Find: Comms Channel Name
Ping Selected Communications Channels
Repeat for several comms channels (sender/receiver, different adapters)
Developer Tools
ESR
Import SWCV
New, Work Areas, Software Component Version, Import from SLD
Confirm able to create/edit objects
Create new Data Type, Save, Activate
Check Cache update. Menu-Environment, Cache Status Overview.
Publish to Service Registry
ID
Tools, Assign Business System, click continue (List of Business Systems from SLD)
Confirm able to create/edit objects
Create new Data Type, Save, Activate
Check Cache update. Menu-Environment, Cache Status Overview.
Publish to Service Registry
Eclipse
Confirm the correct version is installed and starts
Confirm able to login
Confirm able to create/edit objects
Services Registry
Logon to Service Registry - Check published services
Complete dummy interface test
Complete dummy interface check and sign off
Check Reverse Proxy Routing
Transport System
Create, release and transport across landscape (ESR & ID Components)
Confirm transport is successful
Confirm objects appear in target system
Test components in Target system
Delete Components in all systems once complete
Alerting Test
Configure alert for Dummy Interface
Send Invalid Payload and monitor alerts
Check Alert in PI and SOLMAN
System available for Productive Use

APIs? API management? - Should We be really bothered?


APIs have been around for over a decade now. Early APIs were message 'contracts' exchanged between programs. With the advent of the Internet, in the early 2000s, there was a gradual explosion of APIs that were web-enabled. SOA then helped further shape the API paradigm.

 

Today the term API is predominantly synonymous with a service that is RESTful in nature and used extensively in areas that concern the modern rich web and mobile. With the mass explosion of APIs, there has been a flood of technologies and tools that propose to deliver API management.

 

The first question that should probably come to one's mind is, 'What is so special about APIs that we need new technologies/tools for delivering and managing them?'.

 

To answer this, it is important that we understand what APIs really mean today and what managing them comprises.

 

What is an API in the context of this article?

 

An API is;

 

1. A piece of code that allows two software programs (applications) to communicate with each other.

2. This communication is based on web protocols (HTTP/HTTPS) and is usually based on REST

3. An API will help connect business processes, services and data to both external and internal consumers (devices, applications etc)

4. Abstracts the consumer (e.g. a web or mobile developer) from the back-end application

 

Yes, you have guessed it right. This sounds so much like a service (or, if you prefer, an interface) one would develop on any integration platform like SAP Process Orchestration, Oracle Fusion, webMethods, etc. So why am I reinventing the wheel?

 

Now that we have a fair idea of what an API means, let's look at some examples of where APIs are being used. Interestingly, in many ways we are already interacting with a world of APIs. Every time you open Facebook or Google, there is an API at work. When you place an order or search for items on Amazon, there are APIs being invoked. And if you have one of those smart watches or fitness devices (wearables) like the NikeFuel, APIs are core to them.

 

Note: Have you read about the famous Jeff Bezos' mandate?

 

So now that we have established that APIs are actually very much integral to our day-to-day life, let us look at what API management means.

 

An API Management platform will have the following key characteristics (not limited to the below);

 

apim_1_30032015.JPG

 

1. Tools for developing APIs (obviously!)

2. Discovery of APIs i.e. cataloguing, searching etc

3. Version and life cycle management

4. Developer Engagement - Controlled access for developers to discover and consume APIs, collaboration opportunities, etc.

5. Security - Support for standards such as OAuth, SSL, SAML and IAM, plus threat protection, encryption, etc.

6. Orchestration - API mashing, branching etc (Orchestration is a big word but try not to confuse this with heavy process based orchestration)

7. API Analytics

8. Caching and Optimization - ex. caching of frequently requested data, compression of messages etc

9. Monetization - A method for monitoring and recording the consumption and traffic of APIs, thus incentivising (or billing) their use.

 

 

Q: Do we still need a dedicated platform for APIs? Can't investments in existing enterprise-grade integration/BPM technologies realize this for us?


It is important that we answer this question right away because only then is it possible for us to understand the value proposition that API management will deliver.

 

The core requirements/use cases APIs will serve are around mobile, web and the IoT space. This means we are looking at real-time service calls that require low latency throughout. APIs usually do not accommodate heavy logic or orchestration and are philosophically KISS (Keep It Simple, Stupid!) in nature.

 

The enterprise service bus, SOA and business process management platforms are sophisticated tools that manage a wide variety of integration and orchestration patterns, but what makes API management distinct from traditional A2A and B2B integration is functionality like developer engagement, caching, incentivisation and support for standards such as OAuth. Analytics and reporting around APIs are also a key aspect.

 

More on Analytics & Incentivisation of APIs:


Some examples of what analytics, reporting and metrics mean in the case of API management:

a. Analysis of traffic - This could be number of transactions, the source of the traffic etc

b. API trends - Availability, top trending API, least used API, error categories, error rates etc

c. Developer statistics - Number of Developers engaged, top developers, APIs by developers etc


Incentivisation is all about how to monetize APIs, and organizations might adopt different models. One may be a 'pay as you go' or unit-based model where costs are proportioned according to the consumption of the APIs; another could be a revenue-sharing model where the organization pays the developer, or say a partner, based on the revenue generated by the APIs they developed; some organizations might even simply choose to give APIs away for free so as to drive market adoption.


By now, it should have become clear that enterprise integration and API management technologies overlap but are distinct in the use cases they serve. Thus, for an organization that is stepping into significant mobile adoption and the consumerization of IT, or one with predominant B2C interactions alongside B2E and B2B, setting up APIs and the technology required to manage them becomes an exciting value proposition.


If your organization answers 'Yes' to the majority of the below questions, then you should potentially start evaluating the benefits of an API management solution;


1. Does your organisation have a strategy or vision around the adoption of mobility?

2. Are you looking to extend the collaboration and development of APIs to external parties, vendors or developers?

3. Will you be exposing your organization's business data for public consumption?

4. Do you want to analyse trends of data consumption and utilize sentiment analysis of your social channels?

5. Is your business primarily front-ended by the web, or is the web the core channel through which your customers engage with you?

6. Does the word 'User Experience' keep getting mentioned in your strategic meetings?


Thus, in the modern enterprise landscape, the core integration and orchestration technologies (ESB, BPM, MFT, etc.) will continue to hold their place, but will now find themselves accompanied by a new stack, i.e. API management, with each serving the purposes it is best capable of delivering.


apim_4_30032015.JPG



SAP has also realized the technological gaps it has in its product suite. The recent partnership with the industry leader in API management, Apigee, is a sign of SAP desperately trying to bridge gaps and strengthen its overall solution stack.


Note: Refer to this blog for insights into the Integration and Orchestration offerings from SAP.


So now I come back to the title of this blog and would like to ask the reader, 'Should we really be bothered about API management?'. Do post your thoughts in the comment section of this blog; I hope we can have an engaging discussion.


Java Development within the SAP Process Orchestration Platform



Introduction: The stability, robustness and capabilities of SAP Process Orchestration make it increasingly a platform of choice as an ESB, BPM, BRM and composite service platform. This is made possible by the toolset and infrastructure provided by SAP, which include the Java-only installation for PI, the NetWeaver Development Infrastructure (NWDI), the NetWeaver Developer Studio (NWDS) and the JEE application server. In this blog, I will highlight some custom Java development use cases that underscore the power of the SAP Process Orchestration platform. Subsequent blogs will provide details on how to realize these use cases within SAP PO. The custom Java development use cases to be considered include:

  1. Process Orchestration with Java proxies
  2. Custom mapping functions in BPM with EJBs
  3. Java Mappings in PO operation mappings
  4. Creating Adapter Modules with EJBs

Tools and Infrastructure Needed: Certain essential tools within a specific infrastructure are needed to implement some of the use cases mentioned above; they include:

  1. SAP Process Orchestration Java Only installation
  2. NetWeaver Developer Infrastructure (NWDI)-optional
  3. NetWeaver Developer Studio

Case 1: Process Orchestration with Java Proxies: Java development with Java proxies on the SAP PO platform comes in handy in use cases where system-to-system orchestration needs to be implemented, especially for synchronous calls. When human interaction is part of the orchestration, however, SAP BPM becomes the implementation method of choice. The use of Java proxies has its pros and cons:

Pros: Better performance for system-to-system orchestration; leverages the AS Java infrastructure provided by SAP; comprehensive integration logic can be built in Java.

Cons: Developers/architects with good Java knowledge are essential; troubleshooting can be a difficult task.

There are two flavors of Java proxies that can be developed in NWDS for building orchestration logic: the client Java proxy and the server Java proxy.

a) Java Client Proxy: The Java client proxy is developed as the client application that consumes the Integrated Configuration (ICO) created in the Integration Directory. To develop a client proxy, a number of steps need to be implemented; at a high level these are:

  1. Create a development component of type EJB (from the Development Infrastructure perspective).
  2. Create a development component of type EAR (from the Development Infrastructure perspective) and add the EJB DC created earlier as a dependency.
  3. Add the necessary SAP standard components (engine.jee5.facade, tc/esi/esp/wsrm/facade) as dependencies.
  4. Import a WSDL (from the file system) or a modelled outbound service interface (from the ESR).
  5. Generate a web service client on the WSDL/service.

ClientProxy.gif

  6. Create a wrapper class that uses the classes generated in the previous steps. This is done by creating a session bean with a local interface; the session bean implementation class contains the logic to call the port and pass the request into the ICO (see the sketch after this list).
  7. Build the EAR and EJB DCs, then deploy the EAR file on the AS Java.
  8. Go to the NetWeaver Administrator and configure the client proxy to consume the ICO: under NWA > SOA > Application and Scenario Communication > Single Service Administration > Consumer Proxies, search for the name of the WSDL/service interface on which the client proxy classes were created. Navigate to the Configuration tab and, with the option New, follow the wizard to configure consumption of the ICO; use the 'import from WSDL URL' option to import the WSDL of the ICO from the Integration Directory as well as the endpoint. Follow the wizard and complete the configuration.
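A minimal sketch of such a wrapper session bean is shown below. The generated service and port classes (OrderOut_Service, OrderOut) and the request/response types are placeholders for whatever your own WSDL generation produces; the endpoint itself is supplied by the consumer proxy configuration done in NWA in step 8.

    import javax.ejb.Local;
    import javax.ejb.Stateless;
    import javax.xml.ws.WebServiceRef;

    // Local interface of the wrapper bean so that other beans (e.g. a server proxy) can call it.
    @Local
    interface OrderSenderLocal {
        OrderResponse send(OrderRequest request);
    }

    @Stateless
    public class OrderSenderBean implements OrderSenderLocal {

        // Injects the generated client service; the NWA consumer proxy configuration
        // provides the endpoint of the Integrated Configuration (ICO).
        @WebServiceRef
        private OrderOut_Service service;

        @Override
        public OrderResponse send(OrderRequest request) {
            // Obtain the port and pass the request into the ICO via the SOAP sender channel.
            OrderOut port = service.getPort(OrderOut.class);
            return port.orderOut(request);
        }
    }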

 

b) Java Server Proxy: The Java server proxy is developed as the service provider in which the integration logic that drives the orchestration is implemented. This logic can, for example, call the client proxy EJBs or transform (map) the request and response data. To develop a server proxy, the following high-level steps are implemented:

  1. Create a development component of type EJB (from the Development Infrastructure perspective).
  2. Create a development component of type EAR (from the Development Infrastructure perspective) and add the EJB DC created earlier as a dependency.
  3. Add the necessary SAP standard components (engine.jee5.facade, tc/bl/logging/api, tc/je/webservices/api) and the client proxy DCs as dependencies.
  4. Import a WSDL (from the file system) or a modelled inbound service interface (from the ESR).
  5. Generate a web service Java Bean Skeleton on the WSDL/service.

ServerProxy.gif

  6. Develop the logic within the generated implementation bean class (see the sketch after this list). To be able to create annotations on the local interfaces of the client EJBs, the client classes must be exposed in the public parts.
  7. Build the EAR and EJB DCs, then deploy the EAR file on the AS Java system.
  8. Go to the NetWeaver Administrator and configure the service: under NWA > SOA > Application and Scenario Communication > Single Service Administration > Service Definitions, search for the name of the WSDL/service interface on which the server proxy classes were created. Navigate to the Configuration tab and, with the option New, follow the wizard to generate a new WSDL that will be used as the entry point into this service. Follow the wizard and complete the configuration. The WSDL URL can be used in the receiver channel configuration in the Integration Directory.
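A corresponding sketch of a server proxy implementation bean could look like this. The interface, operation and type names are placeholders; the actual annotations (endpointInterface, targetNamespace, etc.) are produced by the Java Bean Skeleton generation in NWDS, so treat this only as an outline of where the integration logic goes.

    import javax.ejb.EJB;
    import javax.ejb.Stateless;
    import javax.jws.WebMethod;
    import javax.jws.WebService;

    @Stateless
    @WebService(serviceName = "OrderIn_Service", targetNamespace = "http://example.com/orders")
    public class OrderInBean {

        // Re-use the client proxy wrapper from Case 1a to call another ICO if required.
        @EJB
        private OrderSenderLocal orderSender;

        @WebMethod
        public OrderConfirmation orderIn(OrderRequest request) {
            // Integration logic goes here: transform, enrich, or forward the incoming request.
            OrderConfirmation confirmation = new OrderConfirmation();
            confirmation.setOrderId(request.getOrderId());
            confirmation.setStatus("RECEIVED");
            return confirmation;
        }
    }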

 

N.B. The sender channel of the ICO consumed by the client proxy is of type SOAP, and the receiver channel used in the ICO that calls the server proxy is also of type SOAP. For asynchronous calls, however, a SOAP sender channel with the XI 3.0 message protocol can be used together with WS-RM; this also has to be stated during the configuration of the client proxy in NWA.

Case 2: Custom mapping functions in BPM with EJBs: Often, complex data transformations between two steps in the process flow of a BPM process become essential. In situations where the basic mapping capabilities of the expression editor are not enough and an operation mapping from the ESR cannot be imported for use, creating an EJB function becomes an alternative. To create an EJB for mapping use within BPM, perform the following steps:

  1. Create a development component of type EJB (from the Development Infrastructure perspective).
  2. Create a development component of type EAR (from the Development Infrastructure perspective) and add the EJB DC created earlier as a dependency.
  3. Add the necessary SAP standard components (engine.jee5.facade, tc/bl/logging/api, tc/bpem/mapping/facade, /api, tc/je/sdo21/api) as dependencies.
  4. Create a session bean with a local interface; the local interface extends the SAP standard "function" and "Sdo" classes. Implement the session bean class to access the data object, perform the necessary mapping transformations and set the output back into the process.
  5. Deploy the EAR file of the EJB to the AS Java.
  6. Navigate to the JNDI tree browser and search for the deployed EAR; there you can see the JNDI name via which this object can be located at runtime.
  7. Create a mapping function artifact within the process development DC and configure it with input and output parameters as well as a reference to the JNDI name of the deployed EAR file.
  8. Use the custom mapping from the expression editor.

 

Case 3: Java Mappings in PO operation mappings: SAP PI/PO provides different mapping options to achieve the transformation from the source message structure to the target message structure. Graphical mapping is often the first choice, but when graphical mapping cannot achieve the required functionality and XSLT mapping is not suitable either, implementing a Java mapping within PO might be the best option. The steps necessary to implement a Java mapping are highlighted here:

  1. Create a Java project in Eclipse or NWDS, then create the rest of the necessary artifacts (package and class).
  2. Add the necessary SAP JAR files as dependencies to the project.
  3. Implement the logic for the mapping within the Java class (a minimal example follows this list).
  4. Export the Java class as a JAR file to a local directory.
  5. Create an Imported Archive object in the ESR.
  6. Import the JAR file into the Imported Archive object.
  7. Select the Java mapping from the imported archive within the Operation Mapping.
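As a minimal sketch of step 3, the class below extends the PI mapping API (com.sap.aii.mapping.api) and simply copies the source payload to the target while writing a trace entry; a real mapping would parse and transform the XML in between. Check the mapping API of your release before relying on the exact class names.

    import java.io.InputStream;
    import java.io.OutputStream;

    import com.sap.aii.mapping.api.AbstractTransformation;
    import com.sap.aii.mapping.api.StreamTransformationException;
    import com.sap.aii.mapping.api.TransformationInput;
    import com.sap.aii.mapping.api.TransformationOutput;

    public class PassThroughMapping extends AbstractTransformation {

        @Override
        public void transform(TransformationInput in, TransformationOutput out)
                throws StreamTransformationException {
            try {
                getTrace().addInfo("PassThroughMapping started");

                InputStream source = in.getInputPayload().getInputStream();
                OutputStream target = out.getOutputPayload().getOutputStream();

                // Copy the source payload unchanged to the target payload.
                byte[] buffer = new byte[8192];
                int read;
                while ((read = source.read(buffer)) != -1) {
                    target.write(buffer, 0, read);
                }
            } catch (Exception e) {
                throw new StreamTransformationException("Mapping failed: " + e.getMessage(), e);
            }
        }
    }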

 

Case 4: Creating Adapter Modules with EJBs: In order to extend the functionality of the SAP standard adapters (file, JMS, SOAP) during message processing, it is possible to create adapter modules with Java classes. To create an adapter module, the following high-level steps should be implemented:

  1. Create a development component of type EJB (from the Development Infrastructure perspective).
  2. Create a development component of type EAR (from the Development Infrastructure perspective) and add the EJB DC created earlier as a dependency.
  3. Add the necessary SAP standard component (engine.jee5.facade) and, if required, the client proxy DCs as dependencies.
  4. Create a session bean and implement the necessary logic for the adapter module (a minimal sketch follows this list).
  5. Define the JNDI name within the ejb-j2ee-engine.xml file as well as the EJB name within the ejb-jar.xml file. These names must be unique in order to avoid naming conflicts and the class loader loading a different bean than expected.
  6. Build and deploy the EAR file, then configure the communication channel with the adapter module based on the JNDI name.
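The sketch below outlines such an adapter module bean, assuming the Module interface of the adapter framework library (com.sap.aii.af.lib.mp.module); the package names, the illustrative 'fileName' parameter and the dynamic header are assumptions to be verified against your release, and the additional local/remote interface declarations or descriptor entries mentioned in step 5 are omitted for brevity.

    import javax.ejb.Stateless;

    import com.sap.aii.af.lib.mp.module.Module;
    import com.sap.aii.af.lib.mp.module.ModuleContext;
    import com.sap.aii.af.lib.mp.module.ModuleData;
    import com.sap.aii.af.lib.mp.module.ModuleException;
    import com.sap.engine.interfaces.messaging.api.Message;
    import com.sap.engine.interfaces.messaging.api.MessagePropertyKey;

    @Stateless
    public class SetFileNameModuleBean implements Module {

        @Override
        public ModuleData process(ModuleContext moduleContext, ModuleData inputModuleData)
                throws ModuleException {
            try {
                // The principal data carries the PI message currently processed by the channel.
                Message message = (Message) inputModuleData.getPrincipalData();

                // Read a module parameter from the channel configuration (parameter name is illustrative).
                String fileName = moduleContext.getContextData("fileName");
                if (fileName != null) {
                    // Set a dynamic configuration attribute that, for example, the file adapter can evaluate.
                    message.setMessageProperty(
                            new MessagePropertyKey("FileName", "http://sap.com/xi/XI/System/File"),
                            fileName);
                }

                inputModuleData.setPrincipalData(message);
                return inputModuleData;
            } catch (Exception e) {
                throw new ModuleException("SetFileNameModuleBean failed: " + e.getMessage(), e);
            }
        }
    }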

 

Conclusion: In this first blog, I have highlighted a number of custom Java developments that can be achieved by leveraging the SAP PO platform. This blog series aims to put the essential possibilities into perspective for SAP PO consultants, developers, solution architects and enterprise integration consultants, who can adopt them to realize certain process integration/orchestration requirements within projects. It also aims to help tick the checklist box of essential skill sets to consider when working with the SAP PO platform. Detailed step-by-step guides on how to realize these use cases will follow in my subsequent blogs.
