Epos Now provides point of sale and payment solutions to over 40,000 hospitality and retail businesses across 71 countries. Their mission is to help businesses of all sizes reach their full potential through the power of cloud technology, with solutions that are affordable, efficient, and accessible. Their solutions allow businesses to leverage actionable insights, manage their business from anywhere, and reach customers both in-store and online.
Epos Now currently provides real-time and near-real-time reports and dashboards to their merchants on top of their operational database (Microsoft SQL Server). With a growing customer base and new data needs, the team started to see some issues with the existing platform.
First, they saw performance degradation when serving the reporting requirements from the same OLTP database with the existing data model. A few metrics that needed to be delivered in real time (seconds after a transaction was complete) and a few metrics that needed to be reflected in the dashboard in near-real time (minutes) took several attempts to load in the dashboard.
This started to cause operational issues for their merchants, because the end users of the reports couldn't access the dashboards in a timely manner.
Cost and scalability also became a major problem, because a single database instance was trying to serve many different use cases.
Epos Now needed a strategic solution to address these issues. Additionally, they didn't have a dedicated data platform for machine learning and advanced analytics use cases, so they chose two parallel strategies to resolve their data problems and better serve merchants:
- The first was to rearchitect the near-real-time reporting feature by moving it to a dedicated Amazon Aurora PostgreSQL-Compatible Edition database, with a specific reporting data model to serve to end users. This would improve performance and uptime, and reduce cost.
- The second was to build out a new data platform for reporting, dashboards, and advanced analytics. This would enable use cases for internal data analysts and data scientists to experiment and create multiple data products, ultimately exposing these insights to end customers.
In this post, we discuss how Epos Now designed the overall solution with support from the AWS Data Lab. Having developed a strong strategic relationship with AWS over the last 3 years, Epos Now opted to use the AWS Data Lab program to speed up the process of building a reliable, performant, and cost-effective data platform. The AWS Data Lab program offers accelerated, joint-engineering engagements between customers and AWS technical resources to create tangible deliverables that accelerate data and analytics modernization initiatives.
Working with an AWS Data Lab Architect, Epos Now started weekly cadence calls to come up with a high-level architecture. After the objective, success criteria, and stretch goals were clearly defined, the final step was to draft a detailed task list for the upcoming 3-day build phase.
Overview of solution
As part of the 3-day build exercise, Epos Now built the following solution with the ongoing support of their AWS Data Lab Architect.
The platform consists of an end-to-end data pipeline with three main components:
- Data lake – As a central source of truth
- Data warehouse – For analytics and reporting needs
- Fast access layer – To serve near-real-time reports to merchants
We chose three different storage solutions:
- Amazon Simple Storage Service (Amazon S3) for the raw data landing zone and a curated data layer to build the foundation of the data lake
- Amazon Redshift to create a federated data warehouse with conformed dimensions and star schemas for consumption by Microsoft Power BI, running on AWS
- Aurora PostgreSQL to store all the data for near-real-time reporting as a fast access layer
In the following sections, we go into each component and its supporting services in more detail.
Data lake
The first component of the data pipeline involved ingesting the data from an Amazon Managed Streaming for Apache Kafka (Amazon MSK) topic using Amazon MSK Connect to land the data in an S3 bucket (the landing zone). The Epos Now team used the Confluent Amazon S3 sink connector to sink the data to Amazon S3. To make the sink process more resilient, Epos Now added the required configuration for dead-letter queues to redirect bad messages to another topic. The following code is a sample configuration for a dead-letter queue in Amazon MSK Connect:
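This is a minimal illustration using the standard Kafka Connect dead-letter queue properties; the topic name and replication factor shown here are placeholders rather than Epos Now's actual values.

```
# Tolerate record-level failures instead of stopping the connector
errors.tolerance=all
# Redirect failed records to a dedicated dead-letter topic (placeholder name)
errors.deadletterqueue.topic.name=s3-sink-dlq
errors.deadletterqueue.topic.replication.factor=3
# Add failure context (exception, original topic/partition/offset) to record headers
errors.deadletterqueue.context.headers.enable=true
```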
Because Epos Now was ingesting from multiple data sources, they used Airbyte to transfer the data to the landing zone in batches. A subsequent AWS Glue job reads the data from the landing bucket, performs data transformations, and moves the data to a curated zone of Amazon S3 in an optimal format and layout. This curated layer then became the source of truth for all other use cases. Epos Now then used an AWS Glue crawler to update the AWS Glue Data Catalog. This was augmented by the use of Amazon Athena for data analysis. To optimize for cost, Epos Now defined a data retention policy on the different layers of the data lake to save money as well as keep the dataset relevant.
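As an illustration of that step, the following is a minimal sketch of a landing-to-curated Glue job. The bucket names, the `amount` field cast, the dropped field, and the `transaction_date` partition key are assumptions for the example; the source doesn't show the actual job code.

```python
import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from awsglue.job import Job

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw JSON records from the landing zone (bucket/prefix are placeholders)
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-landing-zone/transactions/"]},
    format="json",
)

# Example transformation: resolve ambiguous types and drop fields not needed downstream
curated = raw.resolveChoice(specs=[("amount", "cast:double")]).drop_fields(["_metadata"])

# Write partitioned Parquet to the curated zone for efficient querying with Athena and Redshift Spectrum
glue_context.write_dynamic_frame.from_options(
    frame=curated,
    connection_type="s3",
    connection_options={
        "path": "s3://example-curated-zone/transactions/",
        "partitionKeys": ["transaction_date"],
    },
    format="parquet",
)

job.commit()
```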
Data warehouse
After the data lake foundation was established, Epos Now used a subsequent AWS Glue job to load the data from the S3 curated layer into Amazon Redshift. We used Amazon Redshift to make the data queryable both as Amazon Redshift internal tables and through Amazon Redshift Spectrum. The team then used dbt as an extract, load, and transform (ELT) engine to create the target data model and store it in target tables and views for internal business intelligence reporting. The Epos Now team wanted to use their SQL knowledge for all ELT operations in Amazon Redshift, so they chose dbt to perform the joins, aggregations, and other transformations after the data was loaded into the staging tables in Amazon Redshift. Epos Now is currently using Power BI for reporting, which was migrated to the AWS Cloud and connected to the Amazon Redshift clusters running within Epos Now's VPC.
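A minimal sketch of that curated-to-Redshift load is shown below, assuming a Glue catalog connection named `redshift-connection` and a `staging.transactions` target table (both placeholder names); dbt then performs the transformations inside Redshift.

```python
import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())

# Read the curated Parquet data produced by the earlier Glue job (placeholder path)
curated = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-curated-zone/transactions/"]},
    format="parquet",
)

# Load into a Redshift staging table via a Glue catalog connection (placeholder names)
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=curated,
    catalog_connection="redshift-connection",
    connection_options={"dbtable": "staging.transactions", "database": "analytics"},
    redshift_tmp_dir="s3://example-glue-temp/redshift/",
)
```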
Fast access layer
To build the fast access layer that delivers the metrics to Epos Now's retail and hospitality merchants in near-real time, we decided to create a separate pipeline. This required developing a microservice running a Kafka consumer job that subscribes to the same Kafka topic, deployed in an Amazon Elastic Kubernetes Service (Amazon EKS) cluster. The microservice received the messages, performed the transformations, and wrote the data to a target data model hosted on Aurora PostgreSQL. This data was delivered to the UI layer through an API, also hosted on Amazon EKS and exposed through Amazon API Gateway.
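The following is an illustrative sketch of such a consumer in Python (using confluent-kafka and psycopg2); the topic name, table, columns, and connection details are placeholders, and the actual microservice's language and schema aren't described here.

```python
import json
import psycopg2
from confluent_kafka import Consumer

# Subscribe to the same MSK topic used by the data lake ingestion (placeholder names)
consumer = Consumer({
    "bootstrap.servers": "<msk-bootstrap-brokers>",
    "group.id": "fast-access-layer",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["transactions"])

# Connection to the Aurora PostgreSQL fast access data model (placeholder credentials)
conn = psycopg2.connect(host="<aurora-endpoint>", dbname="reporting",
                        user="app", password="<password>")

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())

    # Upsert a per-merchant running total so dashboards reflect the metric within seconds or minutes
    with conn.cursor() as cur:
        cur.execute(
            """
            INSERT INTO merchant_sales (merchant_id, sales_total)
            VALUES (%s, %s)
            ON CONFLICT (merchant_id)
            DO UPDATE SET sales_total = merchant_sales.sales_total + EXCLUDED.sales_total
            """,
            (event["merchant_id"], event["amount"]),
        )
    conn.commit()
```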
Outcome
The Epos Now team is currently building both the fast access layer and a centralized, lakehouse architecture-based data platform on Amazon S3 and Amazon Redshift for advanced analytics use cases. The new data platform is well positioned to address scalability issues and support new use cases. The Epos Now team has also started offloading some of the real-time reporting requirements to the new target data model hosted in Aurora. The team has a clear strategy around the choice of storage solutions for the respective access patterns: Amazon S3 stores all the raw data, and Aurora hosts all the metrics that serve the real-time and near-real-time reporting requirements. The Epos Now team will also enhance the overall solution by applying data retention policies in the different layers of the data platform. This will address platform cost without losing any historical datasets. The data model and structure (data partitioning, columnar file format) we designed greatly improved query performance and overall platform stability.
Conclusion
Epos Now revolutionized their data analytics capabilities, taking advantage of the breadth and depth of the AWS Cloud. They're now able to serve insights to internal business users and scale their data platform in a reliable, performant, and cost-effective manner.
The AWS Data Lab engagement enabled Epos Now to move from idea to proof of concept in 3 days using several previously unfamiliar AWS analytics services, including AWS Glue, Amazon MSK, Amazon Redshift, and Amazon API Gateway.
Epos Now is currently in the process of implementing the full data lake architecture, with a rollout to customers planned for late 2022. Once live, they will deliver on their strategic goal of providing real-time transactional data and putting insights directly in the hands of their merchants.
About the Authors
Jason Downing is VP of Data and Insights at Epos Now. He is responsible for the Epos Now data platform and product direction. He specializes in product management across a range of industries, including POS systems, mobile money, payments, and eWallets.
Debadatta Mohapatra is an AWS Data Lab Architect. He has extensive experience across big data, data science, and IoT, in both consulting and industry. He is an advocate of cloud-native data platforms and the value they can drive for customers across industries.