The Leading Text Messaging & Marketing Service for Your Business

Challenge

Textellent offers a better way to engage and retain customers through text message marketing. To serve tax preparation businesses, Textellent had to build integrations with a range of tax software, including Drake, Intuit ProSeries, TaxWise Desktop & Online, CrossLink Professional Tax Software, TaxWare, TaxSlayer Pro, Lacerte Tax, Rig Tax and others.

These integrations had to run at speed and scale to handle client-info uploads from Textellent customers using the various tax software; each import file had to be validated and then processed.

What Complere did

Textellent is a leader in business texting solutions and was named to one of the top spots by Software Advice (a Gartner company) in the 2020 year-end FrontRunners Report for SMS Marketing Software. Textellent provides a robust, flexible platform with rich out-of-the-box functionality and strong integration capabilities with tax software.

Complere's approach was to build the real-time integrations as microservices using Talend ESB, one for each tax software, splitting the problem into a master, branches and slave concept. The master service routes each validation or processing request to the branch for the requested software; the branches validate or process the client info; and the slave job, common to all branches, loads the transformed data into the database.

The scope of the engagement included the following initiatives:

  • Build the transformations from scratch, specific to each tax software.
  • Build a robust REST framework to take requests from the Textellent portal and pass them on for validation and processing.
  • Build a REST validation framework that validates the client-info uploads from Textellent customers and returns errors and their descriptions in real time (a hypothetical request/response sketch follows this list).
  • Build a standard slave process to consume the processed files produced by the branches.
  • Build an error-logging and reject-logging mechanism to identify and troubleshoot failures.
  • Build a standardised process to manage the codebase in Git and its deployments.
  • Build a process to support both real-time client-info validation and schedule-based client-info processing.
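
For illustration only, a real-time validation round-trip could look like the sketch below; the endpoint, payload and response fields here are hypothetical placeholders, not Textellent's actual API:

curl -X POST https://api.example.com/tax-import/validate \
  -H "Content-Type: application/json" \
  -d '{"software": "drake", "file": "clients_2020.csv"}'

# Hypothetical real-time response, flagging the row and field in error:
# {"status": "rejected", "errors": [{"row": 12, "field": "phone", "description": "Invalid phone number"}]}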

Complere's solution benefited Textellent in the following ways:

  • The generic framework requires minimal customization, which saves development effort and significantly reduces the cost of future implementations.
  • The Talend ESB based microservices reduced development effort by 70%.
  • The master, branches and slave concept cut development costs by more than 60%.
  • The real-time data validation solution increased the customer satisfaction score by catching mistakes in client-info uploads as they happen.

Complere provided Textellent with a simple, easy-to-manage, high-performance data transformation solution. With the new solution, Textellent can let its customers upload their client info and manage the transformed data in a centralized location. The improved rate and quality of transformations will not only help them quickly realize a high ROI on their investment but also enable them to acquire major customers with a large business base in the long run.

About 

Complere Infosystem is a global technology service firm serving as a trusted technology partner for our customers. We work with some of the world’s most innovative enterprises and independent software vendors, helping them leverage technology and outsourcing in our specific areas of expertise. Our core philosophy of “Happy to comply with your request” communicates our belief in lavishing care and attention on our customers and employees.


Atlassian Bitbucket To AWS CodeCommit Using Bitbucket Pipelines

Hello everyone,

Today we are going to learn how to sync data from Bitbucket to AWS CodeCommit. There can be cases where we have been using Bitbucket for a long time and now have to move to CodeCommit, so we need to replicate all the commits made to date in Bitbucket and also keep the repositories in sync on a regular basis. Bitbucket Pipelines will help us achieve this. Let's see how.

Short description of the steps we will follow

  • Create a new, empty CodeCommit repository to which we will sync the data of the Bitbucket repository
  • Create an IAM group with the access permissions that allow committing changes to the CodeCommit repository
  • Create an IAM user through which we will commit the changes from Bitbucket to CodeCommit
  • Create SSH keys and add them to the user's security credentials
  • Configure Bitbucket Pipelines to replicate the Bitbucket repository to CodeCommit and keep the two in sync on a regular basis

Procedure

  • Creation of CodeCommit Repository: First we will create an empty repository, selecting the region where we want the CodeCommit repository to live (a CLI alternative follows this list). The steps to create a new repository are:
    • Create an empty repository to commit the changes from Bitbucket into
    • Open AWS CodeCommit and select your region
    • Once you've created the repository, select it, click the "Connect" button, and choose the SSH option, which we'll be using later on. This is where you'll find your connection information and some instructions you can refer back to later.
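
If you prefer the command line, the same empty repository can be created with the AWS CLI (the repository name here is a placeholder):

aws codecommit create-repository --repository-name my-sync-repo --region us-east-1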

Creation of IAM Group

Here we will grant the permissions that allow the user to commit changes to CodeCommit:

  • Create a new IAM group named CodeCommit-Contributor
  • Assign the AWSCodeCommitPowerUser policy to this group
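
Equivalently, the group can be created and the managed policy attached from the AWS CLI:

aws iam create-group --group-name CodeCommit-Contributor
aws iam attach-group-policy --group-name CodeCommit-Contributor --policy-arn arn:aws:iam::aws:policy/AWSCodeCommitPowerUser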

Creation of IAM User

We will create a new user that will be used to push the data from Bitbucket to CodeCommit:

  • Create a new IAM user with a login of Bitbucket-User
  • Assign the CodeCommit-Contributor group to it
  • After creation, we will add the SSH public key to the user, which we will do below
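
The CLI equivalent of the first two steps:

aws iam create-user --user-name Bitbucket-User
aws iam add-user-to-group --user-name Bitbucket-User --group-name CodeCommit-Contributor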

Creation of SSH Keys and adding them to the user's Security Credentials

Access to CodeCommit repositories is provided by associating credentials or keys. In this case, we’re going to use SSH and generate public and private keys for use with the IAM user and Bitbucket Pipeline service.

To generate a new private and public key (Windows users, YMMV), we'll open a terminal and execute the following. We're not going to provide a passphrase here; just hit return when it asks.

ssh-keygen -f ~/.ssh/codecommit_rsa

  • This will generate two files: ~/.ssh/codecommit_rsa, which is the private key, and ~/.ssh/codecommit_rsa.pub, which is the public key. Copy your public key to your clipboard:
pbcopy < ~/.ssh/codecommit_rsa.pub
or print it and copy the contents manually:
cat ~/.ssh/codecommit_rsa.pub
  • Open your IAM Bitbucket-User and, under "Security credentials", click Upload SSH Key under "SSH keys for AWS CodeCommit", then paste in your public key (a CLI alternative follows this list).
  • Once your public key is uploaded, an SSH key ID will be associated with it.
  • This SSH key ID will be used as your CodeCommit username when accessing repositories.
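
The key can also be uploaded from the terminal; the response includes the SSHPublicKeyId that becomes your CodeCommit username:

aws iam upload-ssh-public-key --user-name Bitbucket-User --ssh-public-key-body file://$HOME/.ssh/codecommit_rsa.pub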

Set up Git and validate your connection

Let's test the connection at this point to confirm that you've correctly associated your new key with the user, and to validate that the user has the correct privileges through the CodeCommit policy assigned to the group. We're going to use this same configuration later on with Bitbucket Pipelines, so keep it handy.

  • Create your ~/.ssh/config and associate your IAM user's SSH key ID and new private key with the CodeCommit hosts. Write the details below into the config file:
Host git-codecommit.*.amazonaws.com
  User Your-IAM-SSH-Key-ID-Here
  IdentityFile ~/.ssh/codecommit_rsa
(The User value is the SSH key ID created under Security credentials when we uploaded the SSH key to the IAM user.)
  • Now we will initialize the connection as below
ssh git-codecommit.us-east-1.amazonaws.com

The authenticity of host 'git-codecommit.us-east-1.amazonaws.com (72.21.203.185)' can't be established.

RSA key fingerprint is SHA256:XXX/XXXXXX.
Are you sure you want to continue connecting (yes/no)? yes

Warning: Permanently added 'git-codecommit.us-east-1.amazonaws.com,72.21.203.185' (RSA) to the list of known hosts.

We should get the following response:

  • You have successfully authenticated over SSH. You can use Git to interact with AWS CodeCommit. Interactive shells are not supported. Connection to git-codecommit.us-east-1.amazonaws.com closed by remote host.

Configure Bitbucket Pipelines

  • In order to use Bitbucket Pipelines, it first needs to be enabled for the repository. Under your repository settings, choose Pipelines and enable it.
  • Now that Pipelines is enabled, and before configuring the bitbucket-pipelines.yml file, let's initialize some Pipelines environment variables.
  • Under your repository settings, choose Repository Variables under Pipelines. We're going to create the five environment variables below.

Following are the variables we will assign

  • CodeCommitConfig: The base64-encoded version of the SSH config we added to ~/.ssh/config earlier, which specifies the Host, User and IdentityFile.
    • We can create the base64 encoding like this:
cat ~/.ssh/config | base64 -w 0
  • CodeCommitHost: The host and region of your CodeCommit instance, e.g. git-codecommit.us-east-1.amazonaws.com
  • CodeCommitKey: The base64-encoded version of the SSH private key we generated (note that it should be hidden and encrypted by ticking Secured when you create the variable; make sure you do this). We can create the base64 encoding the same way:
cat ~/.ssh/codecommit_rsa | base64 -w 0
  • CodeCommitRepo: The host, region and repository path of your repository, e.g. git-codecommit.us-east-1.amazonaws.com/v1/repos/my-sync-repo
  • CodeCommitUser: The SSH key ID associated with the public key on your AWS IAM user (the SSH key ID shown under Security credentials in IAM).
  • Let's create that bitbucket-pipelines.yml file; either add it using your favourite editor, or click "Configure bitbucket-pipelines.yml" and edit it directly on bitbucket.org.
pipelines:
  default:
    - step:
        script:
          - mkdir -p ~/.ssh  # ensure ~/.ssh exists in the build container
          - echo $CodeCommitKey > ~/.ssh/codecommit_rsa.tmp
          - base64 -d ~/.ssh/codecommit_rsa.tmp > ~/.ssh/codecommit_rsa
          - chmod 400 ~/.ssh/codecommit_rsa
          - echo $CodeCommitConfig > ~/.ssh/config.tmp
          - base64 -d ~/.ssh/config.tmp > ~/.ssh/config
          - set +e
          - ssh -o StrictHostKeyChecking=no $CodeCommitHost
          - set -e
          - git remote add codecommit ssh://$CodeCommitRepo
          - git push codecommit $BITBUCKET_BRANCH
  • Below are the details of the pipeline script we created:
  • It creates temporary files for $CodeCommitKey and $CodeCommitConfig, then decodes them into place.
  • It adjusts the permissions on your private key (some SSH clients require more secure permissions on this file).
  • It initializes the SSH connection to the CodeCommit host. It's worth noting that this command will "appear to fail", so we disable error checking (set +e), let it fail silently, and then re-enable error checking (set -e). -o StrictHostKeyChecking=no prevents having to manually accept the remote host.
  • It adds the CodeCommit repository as a remote and pushes the current branch ($BITBUCKET_BRANCH).
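
To verify the sync end to end, push any commit to the Bitbucket repository and watch the pipeline run; an empty commit is enough to trigger it:

git commit --allow-empty -m "trigger CodeCommit sync"
git push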

Notes

  • The target CodeCommit repository is required to be empty whenever a new sync is set up.
  • To keep the repositories in sync on a regular basis, a scheduled pipeline can be configured under the repository's Pipelines settings (Schedules).

Serverless Lambda Function For Talend Jobs

Learning about Talend and AWS is always fun, and the way they interact is fun as well. In this post we will see how to run a Talend job by deploying it as a Lambda function and running it using the Serverless Framework.

Let's get started with the prerequisites, which are essential for this topic:

Prerequisites

Installation of Node JS:

Let's download the Node JS binaries from the https://nodejs.org/en/ site and install them.
The Node JS installation installs both the Node JS runtime and npm (node package manager).
npm is used to install packages.

For Linux we can use the following commands:

yum install -y gcc-c++ make
curl -sL https://rpm.nodesource.com/setup_6.x | sudo -E bash -
yum install -y nodejs
node -v

For Windows, we can install it by following this guide: https://nodesource.com/blog/installing-nodejs-tutorial-windows/

Installation of Apache Maven:

Now that we are done with the Node JS installation, we will move on to Apache Maven.

Following are the ways we can install Apache Maven:

In Windows:

  • Go to https://maven.apache.org/download.cgi, unzip the archive, and add the bin path to the environment variables, e.g.:
    • PATH=C:\apache-maven-3.6.0-bin\apache-maven-3.6.0\bin
    • We can verify the installation by running mvn -version, which prints the version of Maven installed.

In Linux:

  • Run the wget command from the directory you want to extract Maven into:
    wget http://mirror.olnevhost.net/pub/apache/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.tar.gz
  • Run the following to extract the tar:
    tar xvf apache-maven-3.0.5-bin.tar.gz
  • Move Maven to /usr/local/apache-maven:
    mv apache-maven-3.0.5 /usr/local/apache-maven
  • Next, add the environment variables to your ~/.bashrc file:
    export M2_HOME=/usr/local/apache-maven
    export M2=$M2_HOME/bin
    export PATH=$M2:$PATH
  • Reload the file:
    source ~/.bashrc
  • Verify everything is working with the following command:
    mvn -version

Installation of Oracle JDK
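
Assuming the Talend build and the aws-java-maven template target Java 8 (typical for Talend jobs of this generation), install the Oracle JDK and verify it is on the PATH:

java -version
javac -version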

Serverless configuration

We will need to install Serverless and add our credentials to its config, which will help us connect with AWS.

  1. Run the commands below to install Serverless globally and then the project dependencies via npm (serverless-stack-output is a plugin; aws-sdk is used to call Batch jobs):
npm install -g serverless
npm install
  2. Configure the Serverless credentials:
sudo serverless config credentials --provider aws --key <ACCESS_KEY> --secret <SECRET_KEY>

Serverless installation of the AWS Lambda function

This will create an AWS Lambda function for the Talend job using Serverless.

  • First we will make an empty directory named serverless-lambda-talendjobname
  • Now, in Visual Studio Code, we will browse to the folder: cd serverless-lambda-talendjobname
  • Now we will create a Serverless Java Maven template:
     serverless create --template aws-java-maven
  • Now we have to install the Talend-related dependencies; for that we have a zip attached as Supporting_Talend_Job_For_Serverless
  • This zip will help us install all the lib files and will also generate the pom.xml, which we can use for the serverless_project
  • Following is the generic folder structure we will follow:
    serverless_project/
    ├── code/
    ├── lib/
    ├── Supporting_Talend_Job_For_Serverless/
    └── TalendProject/
  • Following are the details of all the folders and zips present in the structure above:
    • serverless_project: We can give a generic name to the serverless_project; it is the parent directory where we place all the related folders and zips
    • code: This will contain the Serverless code which we will install and deploy
    • lib: This will contain all the libs which need to be installed via mvn install
    • Supporting_Talend_Job_For_Serverless: This is the attached zip which is used to install the libs of the Talend projects
    • TalendProject: This is the Talend project for which we have to create the Lambda function
  • Working procedure of Supporting_Talend_Job_For_Serverless (an illustrative mvn install:install-file command follows this list):
    • First we have to unzip Supporting_Talend_Job_For_Serverless and also unzip the TalendProject
    • The TalendProject has jars in two places:
      1. TalendProject\lib
      2. TalendProject\TalendProject\lib
      3. We will place these jars in the lib folder shown in the structure above
    • Then we will run the bat file present at D:\serverless_project\Supporting_Talend_Job_For_Serverless_0.1\Supporting_Talend_Job_For_Serverless\Supporting_Talend_Job_For_Serverless_run.bat, which will install all the jars and also generate a pom.xml file named pom_generated_by_talend.xml
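
For reference, installing one of the Talend jars into the local Maven repository looks roughly like the command below; the groupId, artifactId and version are illustrative, since the bat file derives the real values from the jar names:

mvn install:install-file -Dfile=lib/routines.jar -DgroupId=org.talend.libraries -DartifactId=routines -Dversion=6.3.0 -Dpackaging=jar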
  • We will now replace the pom.xml file in the code directory (created from the template) with the one generated by Talend
  • We will change the Handler.java as per the code below
public ApiGatewayResponse handleRequest(Map<String, Object> input, Context context) {
        LOG.info("received: {}", input);
        // Instantiate the Talend-generated job class and run it with an empty context
        testproject.newjob_0_1.newjob t2 = new testproject.newjob_0_1.newjob();
        String[] context2 = {};
        t2.runJob(context2);
        Response responseBody = new Response("Success", input);
        return ApiGatewayResponse.builder()
            .setStatusCode(200)
            .setObjectBody(responseBody)
            .setHeaders(Collections.singletonMap("X-Powered-By", "AWS Lambda & serverless"))
            .build();
}

Now we will install the dependencies:

mvn clean install

When the build succeeds, we will deploy with Serverless:

sls deploy

We can now check the function by invoking it:

serverless invoke --function <functionname> -l

And we are done. If we get any errors, we can recheck the steps above.


ETL With Airflow

Airflow is a platform to programmatically author, schedule and monitor workflows.

We use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. We had heard about Airflow but never knew how to get it working for Talend jobs; it turns out to be very easy, and we get a UI for scheduling and monitoring the Talend jobs' workflow.

Let's check our to-do list for achieving this goal:

  1. Launching an EC2 instance: We will launch an Ubuntu server for the installation of Airflow and for copying the Talend jobs onto the server
  2. Installing Airflow on the EC2 instance: We will follow the steps for installing Airflow and get the Airflow webserver working
  3. Adding the Talend job and creating the DAG file

Launching an EC2 instance in AWS

We will launch an Ubuntu 16.04 instance for Airflow.

Adding Airflow to the Ubuntu Server
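
A minimal installation sketch, assuming the Airflow 1.x CLI of the time (in Airflow 2.0 the initdb command became airflow db init):

sudo apt-get update
sudo apt-get install -y python3-pip
export AIRFLOW_HOME=~/airflow
pip3 install apache-airflow
airflow initdb
airflow webserver -p 8080
airflow scheduler

The webserver UI is then reachable at http://<ec2-public-ip>:8080 (open port 8080 in the instance's security group), and the scheduler should run in a separate shell or as a service.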
