In a previous article, we discussed what Composable Commerce is and why businesses must adopt it. In short, brands that prioritize infusing digital throughout their entire strategy require a commerce framework that allows them to build, deploy, and optimize experiences. Composable Commerce is a new approach that gives digitally-driven brands back control so that they can keep up with customer requirements and outpace competitors.
In this blog post, we will explore the factors to consider when building and deploying a commerce solution, and examine the strengths and weaknesses involved in composing solutions using a rigid legacy system, a MACH-based architecture, and a true Composable Commerce framework.
To illustrate how composition works, we will use search integration as an example. For a business to increase conversion rates, it must make certain that customers can easily find what they're searching for. To deliver search capabilities, you can use one of the many search providers that match your business requirements, such as Algolia, Swiftype, Coveo, Elasticsearch, and many others. For illustration purposes, we will use Elasticsearch, a well-known player in the search engine market.
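To make the example concrete, the sketch below shows the kind of full-text query a storefront might send to Elasticsearch's search API. The index name (products) and the field names (name, description) are illustrative assumptions, not part of any specific catalogue schema.

```javascript
// Build a full-text product query for Elasticsearch. The body would be
// POSTed to <es-host>/products/_search; the index and field names are
// assumptions for illustration.
function buildProductSearchQuery(term) {
  return {
    query: {
      multi_match: {
        query: term,
        fields: ["name^2", "description"], // boost matches on the product name
        fuzziness: "AUTO",                 // tolerate small typos
      },
    },
    size: 20, // return at most 20 hits
  };
}

console.log(JSON.stringify(buildProductSearchQuery("red sneakers")));
```

Whichever search provider you choose, the composition question is the same: how product data gets into that index, and how changes flow through.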
Below is the high-level outline of the use-case we are going to be discussing.
Composing a commerce solution
Now we will go in detail through all of the activities required to compose a commerce solution and look at the questions that have to be asked at each step. After that, we'll go over the composition process in the context of legacy commerce software, MACH commerce applications, and Elastic Path. And we will do this with real-life examples and code.
Software installation and configuration
This step includes the setup of both the search and commerce applications, and there are several important factors that need to be taken into account:
- Can the software be consumed on-demand from the cloud, or do you need to set it up on-premises/in a private cloud?
- Does the software work out of the box, or does it require an installation procedure? How long does it take to configure the software before you can start using it?
- How long does it take to import enough data so that the software can be evaluated?
The next step, developing the integration, can take a lot of effort and time, depending on multiple factors:
- What are the integration mechanisms available for each software application?
- Is documentation readily available, and does it include everything necessary to develop the integration?
- What are the integration use-cases, and what should the integration architecture look like?
- How long does it take to implement the integration (e.g., writing all the necessary code for scripts, lambda functions, etc.)? Are there any tools available to speed up the process?
The last step, testing and deploying the solution, is critical as it helps ensure that the whole solution operates as planned and provides business value.
- What are the test use-cases, such as fast tests, slow tests, and performance tests? Do you need tests for the individual software applications as well as end-to-end tests?
- How quickly can you deploy the integration, including the associated lambda functions, other custom code, and required test data?
- How are you going to run all of the necessary unit and end-to-end tests to be sure the composed solution works as intended? Can you do it automatically?
- How are you going to promote the integration into production?
The more interaction points between the two applications, the more complex the integration process will be.
The overall amount of effort needed to compose a commerce solution depends on its capabilities, architecture, extensibility design, and many other aspects.
Composing a best-of-breed solution with a legacy platform can be a challenging task due to several factors:
- Legacy platforms lack modularity because they were designed as one solution, meant to be deployed and consumed as a whole.
- Extensions are often developed as an afterthought and rely on proprietary extension mechanisms that are unique to a specific platform.
While the steps necessary to compose an end-to-end solution with a legacy commerce platform remain the same, they usually take far more effort and time:
- When a specific piece of commerce functionality is used in many places in a legacy platform, it needs to be changed in a variety of areas when extended. Because of this, something as simple as integrating a new payment system into the platform may take a significant amount of time.
- Since the legacy platform is meant to be deployed as a whole, you must package several changes together to minimize downtime, or settle for numerous downtime windows as the functionality is deployed incrementally.
- Testing also requires more effort, as you are more likely to need to run end-to-end tests every time you make changes in many areas throughout the codebase.
If you are integrating two legacy software applications, the challenges listed above have a multiplicative effect. On top of this, over time, custom integrations reduce the maintainability of the platform. Moreover, solution upgrades can become an issue requiring a full implementation project.
There is a newer approach to commerce software that's based on MACH architecture: Microservices, API-first, Cloud-native, and Headless. One example of MACH technology is commercetools. MACH architecture allows us to avoid many of the challenges associated with legacy architecture and to accelerate speed-to-market. In this section of the blog post, we will take a look at the steps required to carry out the use-case outlined above with commercetools.
- Access to the commercetools Merchant Center
- Instance of Elasticsearch
- Create a new project in the Merchant Center.
- Go to Settings > Developer Settings and click on Create API Client
- Save the details (CTP_PROJECT_KEY, CTP_CLIENT_ID, CTP_CLIENT_SECRET, CTP_API_URL, CTP_AUTH_URL) because they are only shown once.
- Download the commercetools sample data project from GitHub.
- npm install
- In the root of the project create a file called .env containing your API client credentials:
CTP_PROJECT_KEY=myproject
CTP_CLIENT_ID=C0FFEEC0FFEEC0FFEEC0FFEE
CTP_CLIENT_SECRET=133T133T133T133T133T
CTP_API_URL=https://api.undefined.undefined.commercetools.com
CTP_AUTH_URL=https://auth.undefined.undefined.commercetools.com
- npm run import:data
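For reference, the API client credentials from .env are used to obtain an access token through the OAuth2 client-credentials flow before any commercetools API call is made. The sketch below only constructs that token request and sends nothing; the host in the usage example is a placeholder, and the scope value is an illustrative assumption.

```javascript
// Build (but do not send) the OAuth2 client-credentials token request
// that commercetools expects at <auth-url>/oauth/token. The scope shown
// here is an illustrative assumption.
function buildTokenRequest(env) {
  const basic = Buffer.from(
    `${env.CTP_CLIENT_ID}:${env.CTP_CLIENT_SECRET}`
  ).toString("base64");
  return {
    url: `${env.CTP_AUTH_URL}/oauth/token`,
    method: "POST",
    headers: {
      Authorization: `Basic ${basic}`, // client id/secret as HTTP basic auth
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: `grant_type=client_credentials&scope=manage_project:${env.CTP_PROJECT_KEY}`,
  };
}

const req = buildTokenRequest({
  CTP_PROJECT_KEY: "myproject",
  CTP_CLIENT_ID: "C0FFEEC0FFEEC0FFEEC0FFEE",
  CTP_CLIENT_SECRET: "133T133T133T133T133T",
  CTP_AUTH_URL: "https://auth.example.commercetools.com", // placeholder host
});
console.log(req.url);
```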
Using AWS in this case:
- In AWS SQS, create a new standard queue.
- Create a Lambda function to process CT notifications.
- Create an IAM role for your Lambda function that includes permissions to read from SQS.
- Configure the Lambda function to be triggered by messages coming from your SQS queue.
Your Lambda function should do the following:
- Process batched records coming from SQS.
- Authenticate against commercetools.
- Call the Product API to retrieve product information.
- Update the Elasticsearch index accordingly.
Determine the public URL of your queue, then configure commercetools to send product events to that queue.
You can integrate commercetools with Elasticsearch without significant issues, but it will still require time to write all of the necessary scripts and functions. More may be required, depending on the integration complexity and the nature of the integrated applications.
Now we'll take a look at how to implement the same use-case with Elastic Path using a Composable Commerce framework.
Prerequisites:
- Access to Elastic Path environment
- Instance of Elasticsearch
- NPM package manager
- The build script, which can be found on GitHub.
To create a composable solution, we will use Elastic Path webhooks, which are part of the Extensions Hub. Webhooks make it possible for applications to integrate with other systems by sending business events over the HTTP protocol.
Every time a new product is configured in a product catalogue or a product is changed, Elastic Path generates a business event. This event is sent to the webhook and processed by the Lambda function, which then passes the product information to Elasticsearch so that the product can be found in searches later on.
How to compose a solution:
The steps to compose a solution with Elastic Path and Elasticsearch are simple and are executed using the following scripts:
- npm run test. Tests the lambda function locally.
- npm run build. Packages the lambda function and generates the configuration necessary for deployment.
- npm run deploy. Deploys the API gateway and lambda function and configures EPCC with the integration webhook.
- npm run integration-test. Performs an end-to-end test to ensure that the integration is working:
- Searches for a randomized test product using Elasticsearch and doesn't expect to find it.
- Creates the test product in Elastic Path.
- Searches for the test product in Elasticsearch and expects to find it.
- Deletes the test product from Elastic Path.
- Searches for the test product in Elasticsearch and doesn't expect to find it.
- npm run undeploy. Removes the integration webhook from Elastic Path and undeploys the API gateway and Lambda function.
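The end-to-end flow of the integration test can be sketched as follows, with the Elasticsearch and Elastic Path clients injected as functions. The in-memory stand-ins below only make the control flow visible; they are not the real clients.

```javascript
// Control-flow sketch of the integration test: search (expect miss),
// create, search (expect hit), delete, search (expect miss).
async function runIntegrationTest({ search, createProduct, deleteProduct }) {
  const name = `test-product-${Date.now()}`;      // randomized test product
  if (await search(name)) throw new Error("unexpected hit before creation");
  await createProduct(name);
  if (!(await search(name))) throw new Error("product not indexed");
  await deleteProduct(name);
  if (await search(name)) throw new Error("product not removed from index");
  return "ok";
}

// In-memory stand-ins for the real Elasticsearch / Elastic Path clients:
const index = new Set();
runIntegrationTest({
  search: async (n) => index.has(n),
  createProduct: async (n) => index.add(n),    // webhook would index it
  deleteProduct: async (n) => index.delete(n), // webhook would remove it
}).then((result) => console.log(result)); // prints "ok"
```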
A truly Composable Commerce provider, such as Elastic Path, provides you with all the benefits associated with MACH architecture and extends beyond it by bringing:
- Well-defined Packaged Business Capabilities, which can be easily combined with 3rd party or homegrown capabilities
- Accelerator Packages containing all essential artifacts such as scripts, lambda functions, configuration settings, extensions, and integration code
- Run-time composability that eliminates commerce platform downtime
While the overall integration process is similar for Elastic Path and commercetools, there are specific differences in the implementation of the use-case, which are highlighted in the table below. The major difference comes from the fact that Elastic Path provides you with more than simply microservices that can be consumed via APIs, as commercetools does; it also bundles functionality into Packaged Business Capabilities and Accelerator Packages that include all resources required for development.
This saves you valuable time by eliminating the need to "re-invent the wheel", simplifying development, and accelerating deployment and future optimization of a composable solution.