MuleSoft : Create CloudHub Notification



We need different types of notifications when creating an integration application using Mule or any other technology. We could of course use email over SMTP or some other specific API, but that ties the code tightly to the implementation. A better option is to use Mule CloudHub's Notifications, which can generate different types of alerts with a lot of detail and can also send email if required.

There are of course 2 parts to the solution –

A) Configuration and Setup in CloudHub

Follow the instructions here to set up an alert in CloudHub. These instructions work well, so I won't repeat them.

B) Configuration and coding in Anypoint Studio

      1. Add the CloudHub Connector to your application. If you do not find it, download it from Exchange inside your Studio.
      2. In your error handler flow, add a CloudHub Create Notification component.
      3. You may need to define a CloudHub Configuration. Use your CloudHub username, password and environment details to do this.
      4. Open the flow in the XML code view. You can do this using the visual UI as well, but I prefer working in XML.
      5. Create a subflow to handle the actual notifications. You can add any number of custom properties and then refer to them when creating your notification in CloudHub.
      6. Call the subflow from your error handler section.
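As a sketch, such a subflow might look like this in the XML view. This is illustrative only – element and attribute names vary by CloudHub connector version, so treat the names below (configuration name, message expression, priority) as assumptions and check them against the connector you installed:

```xml
<cloudhub:config name="CloudHub__Configuration"
                 username="${cloudhub.user}" password="${cloudhub.password}"
                 doc:name="CloudHub: Configuration"/>

<sub-flow name="create-error-notification">
    <!-- Publish a CloudHub notification; custom properties can carry
         the flow name, a correlation id, etc. -->
    <cloudhub:create-notification config-ref="CloudHub__Configuration"
                                  message="#[exception.message]"
                                  priority="ERROR"
                                  doc:name="CloudHub Create Notification"/>
</sub-flow>
```

The error handler then simply invokes the subflow with `<flow-ref name="create-error-notification"/>`.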



Create docker secret for Amazon ECR in Kubernetes using Java Client



It’s quite easy to use Docker Hub with Kubernetes deployments. Unfortunately, that’s not the case if you want to use Amazon ECR as your container registry.

Amazon ECR has a few quirks you must deal with before you can use it in your Kubernetes platform. Amazon ECR does not allow you to use your API keys directly to push or pull images. You need to generate an authorization token to use with ECR, and the token is valid only for 12 hours.
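The token handling can be sketched in plain Java. An ECR authorization token decodes to base64("AWS:&lt;temporary-password&gt;"), so the docker username is always AWS; the token value below is made up for illustration:

```java
import java.util.Base64;

public class EcrTokenDemo {
    // Split an ECR authorization token into docker credentials.
    // ECR tokens decode to "AWS:<temporary-password>".
    static String[] toDockerCredentials(String authorizationToken) {
        String userPassword = new String(Base64.getDecoder().decode(authorizationToken));
        int sep = userPassword.indexOf(':');
        return new String[] { userPassword.substring(0, sep), userPassword.substring(sep + 1) };
    }

    public static void main(String[] args) {
        // A made-up token, encoded the way ECR would return it
        String token = Base64.getEncoder().encodeToString("AWS:temp-password-123".getBytes());
        String[] creds = toDockerCredentials(token);
        System.out.println(creds[0] + " / " + creds[1]);
    }
}
```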

Before you begin, please add the dependencies below to your Maven project.
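The original dependency list did not survive here; the code in this post relies on roughly the following three libraries. The group/artifact ids are the public Maven coordinates, but the versions are placeholders – pick current releases:

```xml
<dependencies>
    <!-- AWS SDK for Java (v1) ECR module -->
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk-ecr</artifactId>
        <version>1.11.x</version>
    </dependency>
    <!-- docker-java client -->
    <dependency>
        <groupId>com.github.docker-java</groupId>
        <artifactId>docker-java</artifactId>
        <version>3.x.x</version>
    </dependency>
    <!-- Official Kubernetes Java client -->
    <dependency>
        <groupId>io.kubernetes</groupId>
        <artifactId>client-java</artifactId>
        <version>x.y.z</version>
    </dependency>
</dependencies>
```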




There are 2 main parts in using Amazon ECR with Docker & Kubernetes –
I) Push Docker Image to Amazon ECR

  1. Create Amazon ECR Authorization Token

        private String accessKey;
        private String accessSecret;
        private String region;

        public AuthorizationData getECRAuthorizationData(String repositoryName) {
            //Create AWS Credentials using Access Key
            AWSCredentials awsCredentials = new BasicAWSCredentials(accessKey, accessSecret);
            //Create AWS ECR Client
            AmazonECR amazonECR = AmazonECRClientBuilder.standard()
                    .withCredentials(new AWSStaticCredentialsProvider(awsCredentials))
                    .withRegion(region)
                    .build();
            Repository repository;
            //Describe repos. Check if a repo exists for the given repository name
            try {
                DescribeRepositoriesRequest describeRepositoriesRequest = new DescribeRepositoriesRequest()
                        .withRepositoryNames(repositoryName);
                DescribeRepositoriesResult describeRepositoriesResult = amazonECR.describeRepositories(describeRepositoriesRequest);
                List<Repository> repositories = describeRepositoriesResult.getRepositories();
                repository = repositories.get(0);
            } catch (Exception e) {
                System.out.println("Error fetching repo. Error is : " + e.getMessage());
                System.out.println("Creating repo....");
                //Create the repository if required
                CreateRepositoryRequest createRepositoryRequest = new CreateRepositoryRequest().withRepositoryName(repositoryName);
                CreateRepositoryResult createRepositoryResult = amazonECR.createRepository(createRepositoryRequest);
                System.out.println("Created new repository : " + createRepositoryResult.getRepository().getRegistryId());
                repository = createRepositoryResult.getRepository();
            }
            //Get an auth token for the repository using its registry id
            GetAuthorizationTokenResult authorizationToken = amazonECR
                    .getAuthorizationToken(new GetAuthorizationTokenRequest().withRegistryIds(repository.getRegistryId()));
            List<AuthorizationData> authorizationData = authorizationToken.getAuthorizationData();
            return authorizationData.get(0);
        }
  2. Use token in docker java client

            String userPassword = StringUtils.newStringUtf8(Base64.decodeBase64(authData.getAuthorizationToken()));
            String user = userPassword.substring(0, userPassword.indexOf(":"));
            String password = userPassword.substring(userPassword.indexOf(":") + 1);
            System.out.println("ECR Endpoint : " + authData.getProxyEndpoint());
            //Create Docker config pointing at the ECR registry
            DockerClientConfig config = DefaultDockerClientConfig.createDefaultConfigBuilder()
                    .withRegistryUrl(authData.getProxyEndpoint())
                    .withRegistryUsername(user)
                    .withRegistryPassword(password)
                    .build();
            DockerClient docker = DockerClientBuilder.getInstance(config).build();
  3. Build Docker image

            String imageId = docker.buildImageCmd()
                    .withDockerfile(new File(params.getFilePath() + "\\Dockerfile"))
                    .exec(new BuildImageResultCallback())
                    .awaitImageId();
  4. Tag Docker image

            String tag = "latest";
            String repository = authData.getProxyEndpoint().replaceFirst("https://", "") + "/" + params.getApplicationName();
            docker.tagImageCmd(imageId, repository, tag).exec();
  5. Push Docker image to Amazon ECR

            docker.pushImageCmd(repository)
                    .withTag(tag)
                    .exec(new PushImageResultCallback())
                    .awaitCompletion(600, TimeUnit.SECONDS);

II) Pull Docker Image from ECR into Kubernetes

  1. Create Secret in Kubernetes

     First we need to create the Kubernetes client configuration using the Kubernetes API server URL and a token for an admin account, and then decode the ECR authorization token into a user and password.

    ApiClient client = Config.fromToken(master, token,false);
    String userPassword = org.apache.commons.codec.binary.StringUtils.newStringUtf8(Base64.decodeBase64(ecrAuthorizationData.getAuthorizationToken()));
    String user = userPassword.substring(0, userPassword.indexOf(":"));
    String password = userPassword.substring(userPassword.indexOf(":") + 1);

    This is one important difference between a normal Kubernetes secret and the special docker ECR secret – check the type, “kubernetes.io/dockerconfigjson”.

    V1Secret newSecret = new V1SecretBuilder()
            .withNewMetadata()
                .withName(params.getSecretName()) // secret name taken from your params object
                .withNamespace(params.getNamespace())
            .endMetadata()
            .withType("kubernetes.io/dockerconfigjson")
            .build();

    The content for the Kubernetes docker secret needs to be created as specifically formatted JSON, with the data set as below. This is then set as byte data in the V1Secret. Note that the ECR authorization token itself is already base64(user:password), which is exactly what the "auth" field expects.

    String dockerCfg = String.format("{\"auths\": {\"%s\": {\"username\": \"%s\", \"password\": \"%s\", \"email\": \"%s\", \"auth\": \"%s\"}}}",
            ecrAuthorizationData.getProxyEndpoint(), user, password, "none", ecrAuthorizationData.getAuthorizationToken());
    Map<String, byte[]> data = new HashMap<>();
    data.put(".dockerconfigjson", dockerCfg.getBytes());
    newSecret.setData(data);
    V1Secret namespacedSecret = api.createNamespacedSecret(params.getNamespace(), newSecret, true, params.getPretty(), params.getDryRun());
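Once the secret exists, Kubernetes can pull from ECR if the pod spec references it via imagePullSecrets. The names and registry URL below are illustrative:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: my-app
spec:
  containers:
    - name: my-app
      image: 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
  imagePullSecrets:
    - name: ecr-registry-secret   # the secret created above
```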

Zapier – Beyond the basics – Part II : POST data to an external API


In the last blog, I explained how it is possible to make custom calls to external APIs to fetch data. In this blog, I’ll show how you can post data to an external API via a POST action.

First we add a “Code by Zapier” step as usual and then map the required inputs. Refer to my last blog for details.

Next we go to the actual code section and add something like this:

var line_amount = parseFloat(inputData.line_amount);
var tax_amount = parseFloat(inputData.tax_amount);
var customer_id = Number(inputData.customer_id);

console.log("line_amount : " + line_amount + ", tax_amount : " + tax_amount + " , customer_id : " + customer_id);

var invoice = {
  "Line": [{
    "Description": inputData.description,
    "Amount": line_amount,
    "DetailType": "SalesItemLineDetail",
    "SalesItemLineDetail": {
      "ItemRef": {
        "value": "3",
        "name": "Product"
      },
      "UnitPrice": line_amount,
      "Qty": 1,
      "TaxCodeRef": {
        "value": inputData.tax_code
      }
    }
  }],
  "TxnTaxDetail": {
    "TxnTaxCodeRef": {
      "value": "3"
    },
    "TotalTax": tax_amount,
    "TaxLine": [{
      "Amount": tax_amount,
      "DetailType": "TaxLineDetail",
      "TaxLineDetail": {
        "TaxRateRef": {
          "value": "3"
        },
        "NetAmountTaxable": line_amount
      }
    }]
  },
  "CustomerRef": {
    "value": customer_id,
    "name": inputData.customer_name
  }
};

console.log("Invoice :" + JSON.stringify(invoice));

var options = {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Accept": "application/json",
    "Authorization": "Bearer " + inputData.access_token
  },
  body: JSON.stringify(invoice)
};

// url is the endpoint you are posting to (your target API's invoice resource)
fetch(url, options)
  .then(function(res) {
    console.log("response : " + res);
    return res.json();
  })
  .then(function(json) {
    var output = [{error: json.Fault, invoice: invoice, response: json}];
    callback(null, output);
  })
  .catch(callback);

The important part to focus on is the “options” object passed to the fetch command, where we specify the POST method, the headers, and JSON.stringify for the body payload when the body is JSON, as in the case above.

Lastly, Zapier uses fetch, which has a promise-based design for processing results – hence the fetch → then → then chain of code.


Zapier – Beyond the basics – Part I : Fetching data from external API



Zapier is an excellent tool for Integration for Small and Medium Business. It fills 80% of the needs for most of the scenarios.

However, there are a few scenarios which require you to go outside the box that Zapier provides. One such scenario is interacting with an API which is not available in Zapier, or which exposes only limited operations in its official Zapier app.

Thankfully, Zapier provides the tools to do so. In this series of posts, I’ll be covering some common use-cases for Zapier.

The primary tool or component for interacting with an external API is the “Code by Zapier” component.

You can add this as an Action step.


I typically choose “Javascript”, which is essentially Node.js.


Any data from other steps that you want to pass in can be added here, and it’ll be available under the “inputData” object. So if you add a property “first_name”, it’ll be available in code as “inputData.first_name”.


And under the code section, you can include your actual code, like:

  fetch(url) // url: the API endpoint you want to call
    .then(function(res) {
      return res.json();
    })
    .then(function(json) {
      var output = {id: 1234, response: json};
      callback(null, output);
    })
    .catch(callback);

Zapier uses the node-fetch module to get data. It is a promise-based library and requires some then/callback logic.

This is a simple example of calling a “GET” operation on any API. If you need to pass some headers, do it as:

var options = {
  headers: {
    "Authorization": "Basic Abcdesjjfjfj="
  }
};

fetch(url, options) // url: the API endpoint you want to call
    .then(function(res) {
        return res.json();
    })
    .then(function(json) {
        var html_content = json.archive_html;
        var output = {"content": html_content};
        callback(null, output);
    })
    .catch(callback);

In the next posts, we’ll go into some more complex scenarios, including the use of OAuth-based APIs and POSTing data using Zapier.

Which SOA Suite is correct for me? – Part 1



This will be a series of multiple posts to answer the question. I’ll review both commercial as well as open-source SOA/ESBs to answer it.
I’ll cover a simple, typical scenario – yes, the done-to-death PO approval 🙂

SOA Suites –
1) Oracle SOA Suite 12c
2) MuleESB
3) Apache (Camel, Drools, ActiveMQ)
4) FuseESB

Scenario –
1) Read PO data from a file (CSV)
2) Transform & call a WS with a canonical interface
3) Run business rules to check if approval is required
4) Send by FTP

Features –
1) Development effort & IDE
2) Deployment
3) Monitoring & Alerts
4) SOA Governance

Importance of understanding your database

Recently I was working on an integration project where the database used was MS SQL Server and the integration platform was Oracle SOA Suite.

The service itself was quite simple – fetch some data from SQL Server and enqueue it on a JMS queue. We used the DB Adapter with the Delete polling strategy, and the service was distributed across a cluster.

Once the ids were enqueued, a separate EJB-based web service queried the same database to create the canonical message. We used JPA entity beans for ORM. One particular query fetches some extra information from a table which has no foreign-key relation. The query used a single parameter – a string.

However, we observed a huge performance problem on the SQL Server, as well as on the website hosted on the same database – 99% CPU usage.

It was our SQL DBA who found the issue. The column in the database was varchar, and the index was built on it. However, the query parameter sent to the database was typed as nvarchar. This caused a full table scan and completely skipped the index.

The solution was to use the “sendStringParametersAsUnicode” property of the WebLogic SQL Server driver. By default everything gets sent as Unicode from the JDBC driver; with “sendStringParametersAsUnicode=false”, the driver sends the parameter as varchar, and we immediately saw the difference – CPU usage was down to 1%.
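For example, with Microsoft's JDBC driver the property can go straight into the connection URL (the host and database name below are illustrative); with the WebLogic-branded driver you would set it as a connection property on the data source instead:

```
jdbc:sqlserver://dbhost:1433;databaseName=orders;sendStringParametersAsUnicode=false
```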

This underscores the point that frameworks and engines abstract away a lot of detail, but it is still necessary to understand your database to make optimal use of it.


Generic Architecture for Cloud Apps Integration : Integrate multiple apps using JMS Publish – Subscribe

With the advent of cloud, the trend is to use cloud-based best of breed software for different needs. Although this provides very good and deep functionality, it also opens up a lot of issues related to data management and integrity.

A very common scenario nowadays is to use a SaaS application like Salesforce or SugarCRM for CRM, together with a SaaS ERP like NetSuite or an on-premise ERP. There could also be QuickBooks for accounting, and besides these there are the helpdesk apps.

All of these applications have their own internal databases, schemas and representations of your data. A change in one place needs to be reflected in the other apps as well, which makes integrating the apps difficult. Conventional star topologies or app-to-app integrations fall short.

This is where a Message-Oriented Middleware (MOM) solution such as an ESB is very useful. I am presenting a generic architecture for integrating multiple apps.

Brief explanation of the components in the proposed architecture –

  • Purchase Order – This is the input document that comes into the system. We generally need to update multiple systems based on the document. The format could be cXML or any custom XML schema. It could be transformed into a standard or canonical format accepted internally.
  • ActiveMQ JMS – The document is published on a Topic of the messaging application. I have assumed ActiveMQ here, but it could be any MQ system that supports the publish-subscribe model.
  • Transformers – We have 3 subscribers to this Topic, however each of them accepts a different format. To compensate for this, we have a transformer for each subscriber. E.g., the PO-to-CRM transformer could transform the message from cXML format to the CRM's schema.
  • Subscribers – All 3 subscribers receive the message document and update their respective system with the data.

Enterprise Integration Patterns (EIP) with OpenESB Part 2 : Dynamic Router

This is the second post in the series covering Enterprise Integration Patterns (EIP) with OpenESB.

According to Enterprise Integration Patterns,

The Dynamic Router is a Router that can self-configure based on special configuration messages from participating destinations.

Dynamic Router

We can realize this pattern in OpenESB using Dynamic Addressing, which uses WS-Addressing to determine the service location at runtime.

To check this in action we need 3 Projects –

  • EJB module implementing the web service – you can use any WSDL-based web service; it is not necessary to use EJB.
  • BPEL – This will contain the actual code that determines the endpoint at runtime.
  • JBI / CASA – This is a composite application necessary to deploy the BPEL.

Step 1 – We create a web service with 2 operations – addition and subtraction. Both operations take 2 numbers as input and return a result.
Here’s the xsd –

<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">

    <xsd:element name="NumbersAdditionRequest" type="tns:AdditionType"/>
    <xsd:element name="NumbersAdditionResponse" type="tns:OperationResultType"/>
    <xsd:element name="NumbersSubtractionRequest" type="tns:SubtractionType"/>
    <xsd:element name="NumbersSubtractionResponse" type="tns:OperationResultType"/>

    <xsd:complexType name="AdditionType">
        <xsd:sequence>
            <xsd:element name="number1" type="xsd:int"/>
            <xsd:element name="number2" type="xsd:int"/>
        </xsd:sequence>
    </xsd:complexType>

    <xsd:complexType name="SubtractionType">
        <xsd:sequence>
            <xsd:element name="number1" type="xsd:int"/>
            <xsd:element name="number2" type="xsd:int"/>
        </xsd:sequence>
    </xsd:complexType>

    <xsd:complexType name="OperationResultType">
        <xsd:sequence>
            <xsd:element name="result" type="xsd:int"/>
            <xsd:element name="processor" type="xsd:string"/>
        </xsd:sequence>
    </xsd:complexType>

</xsd:schema>

and here’s the wsdl –

<?xml version="1.0" encoding="UTF-8"?>
<definitions name="NumbersOperations">
    <types>
        <xsd:schema targetNamespace="">
            <xsd:import namespace="" schemaLocation="numberBase.xsd"/>
        </xsd:schema>
    </types>
    <message name="additionRequest">
        <part name="part1" element="ns:NumbersAdditionRequest"/>
    </message>
    <message name="additionResponse">
        <part name="part1" element="ns:NumbersAdditionResponse"/>
    </message>
    <message name="subtractionRequest">
        <part name="part1" element="ns:NumbersSubtractionRequest"/>
    </message>
    <message name="subtractionResponse">
        <part name="part1" element="ns:NumbersSubtractionResponse"/>
    </message>
    <portType name="NumbersOperationsPortType">
        <operation name="addition">
            <input name="input1" message="tns:additionRequest"/>
            <output name="output1" message="tns:additionResponse"/>
        </operation>
        <operation name="subtraction">
            <input name="input2" message="tns:subtractionRequest"/>
            <output name="output2" message="tns:subtractionResponse"/>
        </operation>
    </portType>
    <binding name="NumbersOperationsBinding" type="tns:NumbersOperationsPortType">
        <soap:binding style="document" transport="http://schemas.xmlsoap.org/soap/http"/>
        <operation name="addition">
            <input name="input1">
                <soap:body use="literal"/>
            </input>
            <output name="output1">
                <soap:body use="literal"/>
            </output>
        </operation>
        <operation name="subtraction">
            <input name="input2">
                <soap:body use="literal"/>
            </input>
            <output name="output2">
                <soap:body use="literal"/>
            </output>
        </operation>
    </binding>
    <service name="NumbersOperationsService">
        <port name="NumbersOperationsPort" binding="tns:NumbersOperationsBinding">
            <soap:address location="http://localhost:8080/NumbersOperationsService/NumberService"/>
        </port>
    </service>
    <plnk:partnerLinkType name="NumbersOperations">
        <!-- A partner link type is automatically generated when a new port type is added. Partner link types are used by BPEL processes.
In a BPEL process, a partner link represents the interaction between the BPEL process and a partner service. Each partner link is associated with a partner link type.
A partner link type characterizes the conversational relationship between two services. The partner link type can have one or two roles.-->
        <plnk:role name="NumbersOperationsPortTypeRole" portType="tns:NumbersOperationsPortType"/>
    </plnk:partnerLinkType>
</definitions>

Here is the implementation of the webservice operations –

@WebService(serviceName = "NumbersOperationsService", portName = "NumbersOperationsPort", endpointInterface = "com.aurorite.websvcs.wsdl.numberoperationsvcs.numbersoperations.NumbersOperationsPortType", targetNamespace = "", wsdlLocation = "META-INF/wsdl/NumberService/NumbersOperations.wsdl")
public class NumberService {

    Logger logger = Logger.getLogger("");

    public com.aurorite.xml.schema.numberbase.OperationResultType addition(com.aurorite.xml.schema.numberbase.AdditionType part1) throws UnknownHostException {
        logger.info("Running this service at " + InetAddress.getLocalHost().getHostName() + " on port 8080");
        com.aurorite.xml.schema.numberbase.OperationResultType result = new com.aurorite.xml.schema.numberbase.OperationResultType();
        result.setProcessor(InetAddress.getLocalHost().getHostName());
        try {
            result.setResult(part1.getNumber1() + part1.getNumber2());
        } catch (Exception e) {
            logger.severe(e.getMessage());
        }
        return result;
    }

    public com.aurorite.xml.schema.numberbase.OperationResultType subtraction(com.aurorite.xml.schema.numberbase.SubtractionType part1) throws UnknownHostException {
        logger.info("Running this service at " + InetAddress.getLocalHost().getHostName() + " on port 8080");
        com.aurorite.xml.schema.numberbase.OperationResultType result = new com.aurorite.xml.schema.numberbase.OperationResultType();
        result.setProcessor(InetAddress.getLocalHost().getHostName());
        try {
            result.setResult(part1.getNumber1() - part1.getNumber2());
        } catch (Exception e) {
            logger.severe(e.getMessage());
        }
        return result;
    }
}

Note that I’m returning the machine name in the response. This will help us determine who actually processed a particular operation.

Now to the BPEL. We’ll create a simple process looking like this –
Dynamic Router BPEL

In the BPEL, we add these 2 namespace references –

    <import namespace="" location="ws-bpel_serviceref.xsd" importType="http://www.w3.org/2001/XMLSchema"/>
    <import namespace="" location="addressing.xsd" importType="http://www.w3.org/2001/XMLSchema"/>

And now the most important part – Adding the Service Endpoint Reference

<assign name="assign_endpoint">
    <copy>
        <from>
            <!--<from>ns2:doXslTransform('urn:stylesheets:wrap2serviceref.xsl', $EPRVariable.eprValue)</from>-->
            <literal>
                <sref:service-ref>
                    <wsa:EndpointReference>
                        <wsa:Address>http://localhost:8080/NumbersOperationsService/NumberService</wsa:Address>
                        <wsa:ServiceName PortName="NumbersOperationsPort">ns:NumbersOperationsService</wsa:ServiceName>
                    </wsa:EndpointReference>
                </sref:service-ref>
            </literal>
        </from>
        <to partnerLink="NumberService"/>
    </copy>
</assign>

The port name and service name must match the ones in the target service.

Enterprise Integration Patterns (EIP) with OpenESB Part 1 : Content-Based Router

This is the first post in the series covering Enterprise Integration Patterns (EIP) with OpenESB.

According to Enterprise Integration Patterns,

The Content-Based Router examines the message content and routes the message onto a different channel based on data contained in the message. The routing can be based on a number of criteria such as existence of fields, specific field values etc. When implementing a Content-Based Router, special caution should be taken to make the routing function easy to maintain as the router can become a point of frequent maintenance. In more sophisticated integration scenarios, the Content-Based Router can take on the form of a configurable rules engine that computes the destination channel based on a set of configurable rules.

Content Based Router

It can be achieved in OpenESB using the “If” component.

In this example, it simply checks the value of the “num:operation” element in the incoming message.
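A sketch of what the routing looks like inside the BPEL process. The partner link, variable and element names are illustrative, and the XPath condition depends on your message structure:

```xml
<if name="RouteByOperation">
    <condition>$InputVar.part1/num:operation = 'addition'</condition>
    <!-- route to the addition service -->
    <invoke name="InvokeAddition" partnerLink="AdditionService"
            operation="addition" inputVariable="InputVar" outputVariable="OutputVar"/>
    <else>
        <!-- everything else goes to the subtraction service -->
        <invoke name="InvokeSubtraction" partnerLink="SubtractionService"
                operation="subtraction" inputVariable="InputVar" outputVariable="OutputVar"/>
    </else>
</if>
```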

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                  xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:num="">
    <soapenv:Body>
        <!-- request payload; the router branches on the value of num:operation -->
    </soapenv:Body>
</soapenv:Envelope>

Can a SOA be designed with REST?



Recently I answered a question asking whether a SOA can be designed with REST. I’m cross-posting the answer here.


At a high level, the answer is yes – however, not completely.

SOA requires thinking about the system in terms of

  • Services (well-defined business functionality)
  • Components (discrete pieces of code and/or data structures)
  • Processes (Service orchestrations. Generally using BPEL)

Being able to compose new higher-level services or business processes is a basic feature of a good SOA. XML, SOAP-based web services and related standards are a good fit for realizing SOA.

Also SOA has a few accepted principles –

  • Standardized service contract – Services adhere to a communications agreement, as defined collectively by one or more service-description documents.
  • Service Loose Coupling – Services maintain a relationship that minimizes dependencies and only requires that they maintain an awareness of each other.
  • Service Abstraction – Beyond descriptions in the service contract, services hide logic from the outside world.
  • Service reusability – Logic is divided into services with the intention of promoting reuse.
  • Service autonomy – Services have control over the logic they encapsulate.
  • Service granularity – A design consideration to provide optimal scope and right granular level of the business functionality in a service operation.
  • Service statelessness – Services minimize resource consumption by deferring the management of state information when necessary.
  • Service discoverability – Services are supplemented with communicative meta data by which they can be effectively discovered and interpreted.
  • Service composability – Services are effective composition participants, regardless of the size and complexity of the composition.

A SOA-based architecture is expected to have service definitions. Since RESTful web services lack a definitive service definition (comparable to WSDL), it is difficult for a REST-based system to fulfill most of the above principles.

To achieve the same using REST, you’d need RESTful web services + orchestration (possible using a lightweight ESB like MuleESB or Camel).

Please also see this resource – From SOA to REST

Adding this part as clarification for below comment –

Orchestration is required to compose processes. That’s what provides the main benefit of SOA.

Say you have a order processing application with operations like

  • addItem
  • addTax
  • calculateTotal
  • placeOrder

Initially you created a process (using BPEL) which uses these operations in sequence, and you have clients who use this composed service. After a few months, a new client arrives who has a tax exemption; instead of writing a new service, you can just create a new process that skips the addTax operation. Thus you achieve faster realization of business functionality simply by re-using existing services. In practice there are multiple such services.
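The idea above can be sketched in plain Java, with each operation as a function and a "process" as an ordered composition of them (in BPEL the composition would be the process definition; the amounts, in cents, are purely illustrative):

```java
import java.util.List;
import java.util.function.UnaryOperator;

public class OrderProcessDemo {
    // A stand-in "order" with a running total in cents.
    static class Order { long totalCents; }

    // These methods stand in for the services addItem, addTax, placeOrder.
    static Order addItem(Order o)    { o.totalCents += 10000; return o; }              // add a 100.00 item
    static Order addTax(Order o)     { o.totalCents += o.totalCents / 10; return o; }  // 10% tax
    static Order placeOrder(Order o) { return o; }                                     // submit the order

    // A "process" is simply an ordered composition of service calls.
    static long run(List<UnaryOperator<Order>> process) {
        Order o = new Order();
        for (UnaryOperator<Order> step : process) {
            o = step.apply(o);
        }
        return o.totalCents;
    }

    public static void main(String[] args) {
        List<UnaryOperator<Order>> standard = List.of(
                OrderProcessDemo::addItem, OrderProcessDemo::addTax, OrderProcessDemo::placeOrder);
        List<UnaryOperator<Order>> taxExempt = List.of(
                OrderProcessDemo::addItem, OrderProcessDemo::placeOrder);
        System.out.println(run(standard));   // 11000
        System.out.println(run(taxExempt));  // 10000
    }
}
```

The tax-exempt client gets a new process simply by re-composing existing operations – no new service code is written.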

Thus BPEL or a similar (ESB or routing) technology is essential for SOA. Without this kind of business re-use, a SOA is not really a SOA.