Muruga Subramanian


Accelerate App Delivery with Cognizant's Next Gen Continuous Integrator

The phrase “digital transformation” is ubiquitous these days. But what does it actually mean? Often, the heart of a successful digital transformation lies in a company’s ability to migrate data from legacy systems to the cloud for easier access, while also updating relevant applications quickly and seamlessly to deliver an optimal experience to customers and end users. To accomplish this modernization journey, software teams have embraced agile practices founded on the pillars of collaboration, flexibility, and continuous improvement and delivery. This is where continuous integration and continuous delivery (CI/CD) comes in. CI/CD is an agile practice that enables developers to implement small changes on a frequent basis. Conducted in an automated and secure environment that maintains version control, CI/CD not only accelerates app delivery, it also reduces infrastructure costs, minimizes security risks, and allows for zero-downtime deployments and easier rollbacks. The only problem? Many organizations lack the in-house expertise and resources needed to develop an agile CI/CD solution.

How Cognizant's Next Generation Integrator can help

The “Next Generation Continuous Integrator” is a UI-based, multi-component migration tool that uses a template-driven approach for migrating apps from existing CI/CD systems, as well as onboarding new apps, to Concourse. The technology is built on two pillars: templates and containers. A template-driven approach is environment-agnostic (i.e., it can run on private, public, or hybrid clouds, or outside the cloud entirely), which is especially valuable when handling numerous multi-tiered, multi-technology apps. A container is a standard unit of software that packages up code and all its dependencies and can be published to a Central Hub (public cloud) or a Customer Specific Hub (private). Containers can be used by various teams and applications to run applications reliably from one computing environment to another.

How it works

Based on the various needs of a project, including the services to be migrated or onboarded, the Next Gen Continuous Integrator’s usage lifecycle is as follows:

1. Reusable, shared infrastructure templates are built in association with the source CI/CD DevOps team. These form the core driving artefacts for the tool.
2. The templates are pushed to a GitHub repo and then to MongoDB.
3. The templates are exposed in the tool’s UI so the end user can choose the job configurations they need for their target pipelines.
4. The services are then onboarded or migrated from the source CI/CD system (Jenkins, GoCD, etc.) to Concourse according to the chosen configuration.
5. The migration/onboarding reports are generated and pushed to MongoDB.

A minimal sketch of this template-driven flow appears after the benefits list below.

Benefits

- Easier and quicker migration — templates allow for system-agnostic migration across a broad spectrum of CI/CD tools
- Quicker turnaround time — easily accommodates change requests at any phase
- Improved quality — standardizes processes faster. For instance, Next Gen Continuous Integrator designs a shared services layer for centralized management of multiple environments and accounts
- Increased savings — projected cost savings of >70%, as automating the infrastructure frees up internal resources who would otherwise be engaged in infrastructure management
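To make the template-driven workflow above more concrete, here is a minimal, hypothetical sketch of storing an infrastructure template in MongoDB and looking it up for a chosen job configuration. The connection, collection, field, and template names are illustrative assumptions, not the tool’s actual schema.

```python
from pymongo import MongoClient

# Hypothetical connection, database, and collection names.
client = MongoClient("mongodb://localhost:27017")
templates = client["nextgen_integrator"]["pipeline_templates"]

# A reusable, environment-agnostic template pushed from the GitHub repo.
templates.insert_one({
    "name": "java-maven-service",
    "source_ci": "jenkins",
    "target": "concourse",
    "jobs": [
        {"stage": "build", "tasks": ["mvn clean package"]},
        {"stage": "test", "tasks": ["mvn verify"]},
        {"stage": "deploy", "params": {"env": "private-cloud"}},
    ],
})

# The UI can later list templates that match the user's chosen configuration.
selected = templates.find_one({"source_ci": "jenkins", "target": "concourse"})
print(selected["jobs"])
```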
Why MongoDB for the Next Gen Continuous Integrator tool?

To understand why MongoDB is best positioned for the Next Gen Continuous Integrator tool, we first need to walk through the significant challenges of using relational databases (RDBMS) with it. Challenges include:

- “Infrastructure-as-Code” templates are unstructured, as they vary across teams and tools
- The reports that the tool generates are dynamic in nature, but reporting schema changes are difficult to make in an inflexible RDBMS
- Migrated source services are interdependent and carry multiple responsibilities, which makes onboarding them to Concourse time-consuming
- The monolithic architecture of an RDBMS affects management, scalability, and continuous deployment

MongoDB, on the other hand, offers significant advantages:

- Flexible schema — able to handle different data types and store different migration templates
- Ease of development — the design supports greater developer productivity, and developers find it intuitive to use
- Rich JSON document and query language — able to query templates at a deep hierarchical level (see the sketch at the end of this post)
- Portability — for either the container- or template-driven approach, MongoDB can run on any platform

There is no one-size-fits-all solution when it comes to workload migrations or onboarding applications for modern businesses. Next Generation Continuous Integrator can help clients develop a template-driven, containerized migration solution that fits the organization’s specific needs with ease and minimal operations support.

Roadmap and contact details

Based on the scale of adoption and performance analysis of the Next Gen Continuous Integrator tool across multiple tenants, we may move from on-premises MongoDB to cloud-based MongoDB Atlas.

Contact partner-presales@10gen.com for more information. Contact CognizantQEandA@cognizant.com for more information.
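As a small illustration of the “deep hierarchical” querying advantage mentioned above, the nested job definitions inside a stored template can be reached directly with dot notation. This is a generic pymongo sketch with hypothetical field names, not the tool’s actual queries.

```python
from pymongo import MongoClient

templates = MongoClient("mongodb://localhost:27017")["nextgen_integrator"]["pipeline_templates"]

# Dot notation reaches into arrays of embedded documents without joins:
# find templates whose nested job definitions deploy to a private cloud,
# returning only the template name and the stage names.
for doc in templates.find(
    {"jobs.params.env": "private-cloud"},
    {"name": 1, "jobs.stage": 1, "_id": 0},
):
    print(doc)
```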

September 29, 2021

Mainframe Data Modernization with MongoDB Powered by Wipro's "ModerniZ" Tool

This post will highlight a practical and incremental approach to modernizing mainframe data and workloads into cloud-hosted microservices using MongoDB’s modern, general purpose database platform. Enterprises modernize their mainframes for a number of reasons: to increase agility, lower the risk of aging legacy skills, and reduce total cost of ownership (TCO). But the greatest underlying benefit of modernization lies in a company’s ability to access and make sense of its own data more quickly. Gleaning valuable business insights through the use of real-time data and AI/ML models is at the heart of today’s most successful and innovative companies. Consider the following business processes and their reliance on real-time insights:

- Real-time fraud detection, KYC, and score calculation
- Supporting new API requirements – PSD2, Open Banking, Fintech APIs, etc.
- Payment pattern analysis
- Moving away from hundreds of canned reports to template-based, self-service configurable reporting
- Real-time management reporting

With the continued emergence of mobile and web apps, enterprises are looking to render content even faster, as well as scale up and down on demand. However, mainframes often serve as the true system of record (SoR) and maintain the golden copy of core data. In a typical enterprise, inquiry transactions on the mainframe account for over 80% of the overall transaction volume—in some cases up to 95%. The goal for organizations is to increase the throughput of these inquiry transactions with improved response time. However, during real-time transaction processing, middleware must orchestrate multiple services, transform service responses, and aggregate core mainframe and non-core applications. This architecture prevents the legacy services from seamlessly scaling on demand with improved response time. At the same time, CIOs have significant concerns around the risks of big bang mainframe exit strategies, especially in the financial services, insurance, and retail sectors, where these complex applications serve the core business capabilities. Wipro and MongoDB’s joint offering, “ModerniZ,” can help alleviate these risks significantly by introducing a practical, incremental approach to mainframe modernization.

An incremental solution: Offloading inquiry services data with ModerniZ

To keep up with changing trends in an agile manner, and to bridge the gap between legacy monoliths and digital systems of engagement, a tactical modernization approach is required. While complete mainframe modernization is a strategic initiative, offloading inquiry services data and making it available off the mainframe is a popular approach adopted by several enterprises. Below are a few business requirements driving mainframe data offload:

- Seamlessly scale volume handling by up to 5X
- Improve response time by up to 2X
- Reduce mainframe TCO by up to 25%
- Direct API enablement for B2B and partners (change the perception of being antiquated)
- Improve time to market on new enhancements by up to 3X
- Provision a separate security mechanism for inquiry-only services
- Access a single-view data store for intra-day reporting, analytics, and events handling

These business requirements, as well as the challenges in current mainframe environments, warrant an offloaded data environment with aggregated and pre-enriched data from different sources, in a format that can be directly consumed by channels and front-end systems.
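As a rough illustration, an offloaded, pre-enriched “inquiry ready” document of the kind described above might look like the following. All field names and values are hypothetical.

```python
# Hypothetical pre-enriched, inquiry-ready document that channels and
# front-end systems could consume directly, without further joins.
customer_inquiry_view = {
    "customerId": "C-1029384",
    "demographics": {"name": "A. Sample", "segment": "retail", "country": "DE"},
    "accounts": [
        {"accountId": "ACC-001", "type": "checking", "balance": 2450.75, "currency": "EUR"},
        {"accountId": "ACC-002", "type": "savings", "balance": 15200.00, "currency": "EUR"},
    ],
    "recentTransactions": [
        {"ts": "2021-09-01T10:15:00Z", "amount": -42.10, "channel": "card"},
    ],
    "lastSyncedFromMainframe": "2021-09-01T10:16:05Z",
}
```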
This is where our joint offering for mainframe modernization with MongoDB and Wipro’s ModerniZ tool comes into play. ModerniZ is Wipro’s IP platform specially focused on modernizing the UI, services, and data layer of System Z (mainframe). ModerniZ has multiple in-house tools and utilities to accelerate every phase of the legacy modernization journey. Let’s dig deeper into the solution elements.

“CQRS with ModerniZ” for transactional and operational agility

Command Query Responsibility Segregation (CQRS) is an architectural principle that classifies an operation as either a “command,” which performs an action, or a “query,” which returns data to the requestor—but not both. CQRS achieves this by separating the read data model from the write data model. Separating these two operations in a business process helps optimize performance, reduce the cost associated with inquiry transactions, and create a new model that can grow vertically and horizontally. MongoDB, with its document model and extensive set of capabilities, is best suited to house this offloaded “read data model.” MongoDB’s JSON/BSON document model helps in pre-enriching the data and storing it in an “inquiry ready” format, which reduces the overhead for front-end consumers.

Enterprise use cases for CQRS:

- Customer demographic information – e.g., monetary and nonmonetary transaction inquiries in banking and financial services
- Payer view – healthcare
- Single view across policy administration engines and pricing engines
- Consolidated participant view in benefit administration platforms
- Single view of manufacturing production control systems across plants or countries

The process below indicates the step-by-step approach to enabling CQRS by offloading the data and transactional volumes into a modernized landscape, and continuously syncing the data (based on business criticality) across platforms.

Figure 1. Mainframe data modernization process

The visual below indicates the conceptual target architecture, where the service delivery platform (API/middleware layer) identifies and routes each transaction to the respective system. Any update that happens in the legacy system will be cascaded to the target MongoDB based on the business criticality of the fields.

Figure 2. Post data modernization process view

Mainframe services will continue to be exposed as domain APIs via zCEE for the command services (update transactions), and the newly built microservices will serve the inquiry transactions by fetching data from MongoDB. Any data updates in the mainframe will be pushed to MongoDB. The table below indicates how different fields can be synced between the mainframe and MongoDB, as well as their corresponding sync intervals. Java programs or Spark jobs consume the JSON documents from the Kafka cluster, merge them into MongoDB, and create new documents (a minimal sketch of this merge step follows the table).

Sync Type | Field Classification | Sync Strategy
Type-1 | Near real-time sync for critical fields | Using Kafka queues / CICS event triggers / DB2 / 3rd-party CDC replicators / queue replicators
Type-2 | Scheduled batch polling sync for less critical fields | Using mini intra-day batch / replicators / ELT
Type-3 | End-of-day (EoD) batch sync for non-critical fields | Batch CDC sync / update / ELT
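The article describes Java/Spark consumers performing this merge; the following is a minimal Python sketch of the same Type-1 (near real-time) step. The topic name, connection strings, and field names are assumptions made for illustration.

```python
import json

from kafka import KafkaConsumer   # pip install kafka-python
from pymongo import MongoClient   # pip install pymongo

# Assumed topic, connection strings, and field names.
consumer = KafkaConsumer(
    "mainframe.customer.cdc",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
inquiry_views = MongoClient("mongodb://localhost:27017")["offload"]["customer_inquiry_view"]

for message in consumer:
    change = message.value
    # Merge the changed critical fields into the read-model document,
    # creating the document if it does not exist yet (upsert).
    inquiry_views.update_one(
        {"customerId": change["customerId"]},
        {"$set": change["updatedFields"]},
        upsert=True,
    )
```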
How MongoDB and Wipro's ModerniZ helps

Wipro’s ModerniZ platform provides multiple tools to accelerate the modernization journey across phases, from impact analysis to design to build to deployment. For data modernization, ModerniZ has five tool sets, such as PAN (Portfolio Analyzer), SQL Converter, and automated data migration, all of which can be leveraged to yield a minimum committed productivity gain of 20%.

Figure 3. Mainframe to cloud transformation using ModerniZ

Why MongoDB for mainframe modernization?

MongoDB is built for modern application developers and for the cloud era. As a general purpose, document-based, distributed database, it facilitates high productivity and can handle huge volumes of data. The document database stores data in JSON-like documents and is built on a scale-out architecture that suits any developer building scalable applications through agile methodologies. Ultimately, MongoDB fosters business agility, scalability, and innovation. Some key benefits include:

- Deploys across clouds in nearly all regions
- Provides a document model that is flexible and maps to how developers think and code
- Costs a fraction of the price of other offloading solutions built on relational or other NoSQL databases; even a large, sharded MongoDB environment brings magnitudes of savings compared to the traditional mainframe MIPS-based licensing model
- Allows complex queries to be run against data via an extremely powerful aggregation framework. On top of that, MongoDB provides a BI connector for dedicated reporting/business intelligence tools, as well as specific connectors for Hadoop and Spark to run sophisticated analytical workloads
- Offers enterprise-grade management through its Enterprise Advanced (EA) product and includes security features covering all areas of security: authentication, authorization, encryption, and auditing. Competitors often only offer a subset of those capabilities
- Provides a unified interface to work with any data generated by modern applications
- Includes in-place, real-time analytics with workload isolation and native data visualization
- Maintains distributed multi-document transactions that are fully ACID compliant
- MongoDB has successfully implemented mainframe offloading solutions before, and customers have publicly discussed their success (e.g., a top financial services enterprise in the EU)

Roadmap and contact details

Work is in progress on offloading mainframe read workloads to GCP-native services and MongoDB. Contact partner-presales@mongodb.com for more information.

September 16, 2021

Infosys Media Platform & MongoDB: Metadata Management and Workflow Orchestration Across the Media Supply Chain

Capitalize on current and innovative technologies in the media supply chain with the Infosys Media Platform (IMP). As a part of the cloud-based Infosys Cobalt™ portfolio, Infosys’ unifying framework, built on MongoDB Atlas and MongoDB Enterprise Advanced, helps you facilitate creative collaboration, enable productions on an industrial scale, and monetize customer relationships. How? By integrating the various ecosystems involved and providing a common platform that connects you to services and technology solutions for the media content value chain.

Infosys’ intelligently woven media and metadata management framework, leveraging MongoDB’s document model, enables smart workflows and incorporates ML/AI to create, manage, and moderate content metadata. This allows for the orchestration of workflows across different business functions. Additionally, the platform delivers the benefits of productivity, scalability, and agility via the cloud, and streamlines collaboration among the ecosystem of partners and technology solutions.

Why Infosys Media Platform (IMP)?

The Infosys Media Platform consists of various modules that serve critical business functions, such as:

- The Curation & Digitization Module -- provides the master workflow and ingests content from internal archives and multiple sources, using AI/ML to create a composite index of frame-level, time-coded metadata of recognized elements (such as celebrities/known personalities, objects, brands, text, and images). It also enables intelligent ad spot identification and includes functions like automated QC, editing, review and approval, and censor editing
- The Custom AI Module -- ensures that newly introduced elements such as celebrities, brands, etc., can be continuously trained on and recognized. This can also be used to custom-train the AI models to recognize specific content per the customer’s needs
- The Localization Module -- enables collaboration across multiple locations and global vendors through automatic generation of closed captions and subtitles in multiple languages
- The Metadata Management and Distribution Module -- enables global distribution to digital platforms at scale through standard workflow models and a state-of-the-art dashboard, by orchestrating the accumulation of asset-level and descriptive time-based metadata from production to delivery

With the above modules, the Infosys Media Platform provides the following capabilities:

- Content Enrichment -- leverages AI models to process video files and generate time-coded metadata for the post-production and distribution process
- Closed Captioning, Subtitling and Localization -- processes audio (dialogs from a video, lyrics from a song, speech from a podcast) and converts it into closed captions and subtitles
- Content Moderation -- recognizes the presence of mature content (profanity, violence, gore, etc.) using the platform’s video content and speech detection capabilities
- Image Processing -- identifies various attributes of an image file, similar to the capabilities on video/music content
- Metadata Packaging & Distribution -- manages the end-to-end supply chain of digital metadata creation, updates, packaging, and distribution
- NLP-Based Analytics -- using the natural language processing capabilities of the platform, users can review any string of text (dialogs, lyrics, conversations) to determine the context of the conversation as well as the sentiment
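As a simple illustration of how the frame-level, time-coded metadata produced by the Curation & Digitization Module could be represented in MongoDB, here is a hypothetical document sketch. The collection name and fields are assumptions, not IMP’s actual schema.

```python
from pymongo import MongoClient

# Hypothetical collection and fields; not IMP's actual schema.
assets = MongoClient("mongodb://localhost:27017")["imp"]["media_assets"]

assets.insert_one({
    "assetId": "EP-2021-0042",
    "title": "Sample Episode 42",
    "language": "en",
    "segments": [
        {"start": "00:01:12.040", "end": "00:01:18.500",
         "labels": ["celebrity:Jane Doe", "object:guitar"], "confidence": 0.94},
        {"start": "00:14:03.000", "end": "00:14:07.250",
         "labels": ["brand:AcmeCola", "ad-spot-candidate"], "confidence": 0.88},
    ],
    "moderation": {"profanity": False, "violence": False},
})

# Time-coded metadata can then be queried directly, e.g. counting all
# assets that contain a potential ad spot.
print(assets.count_documents({"segments.labels": "ad-spot-candidate"}))
```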
Why MongoDB for Infosys Media Platform?

What company wouldn’t want a database platform that increases developer productivity and data-driven operational efficiency? MongoDB offers both. With MongoDB Atlas, you can reduce the time developers spend managing data and databases, so they can focus on value-added tasks like developing new apps. MongoDB’s document model and query language provide easier access to data, allowing developers to work quickly and efficiently to support new data structures and data types, as well as leverage database-supported roll-ups for analysis. Additionally, MongoDB Atlas provides 100+ metrics and monitoring capabilities as part of a complete data platform built to improve operational productivity, so you can work smarter, not harder.

A key feature of the Infosys Media Platform is its cloud-agnostic nature; as a cloud-agnostic, multi-cloud data platform that runs seamlessly across the globe, MongoDB is the only data platform that satisfies this requirement for IMP. With its mixed workload of real-time and transactional analytics, IMP also has a roadmap for analytics, text search, and data visualization capabilities--and MongoDB provides all of these features.

How MongoDB powers the data platform for Infosys Media Platform (IMP)

- Data for all the modules previously described is powered by the MongoDB data platform
- Details like profile data, accounts, ratings, translations, and country/region details are stored in MongoDB
- Audit transactional data currently runs on SQL, and there is a roadmap for moving it to the MongoDB data platform

In the future, MongoDB’s core capabilities will further enhance the Infosys Media Platform and the customer experience. Our roadmap includes utilizing MongoDB’s ACID transactional capabilities to store audit details, as well as using MongoDB functions and triggers to create cloud-agnostic serverless functions. Additionally, MongoDB Atlas may be leveraged for full-text search capabilities applied to media data (a hypothetical query sketch appears at the end of this post), and to create charts for dashboarding and real-time analytics of user subscriber data.

Download the Modernization Guide
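If the full-text search capability on the roadmap above is adopted, a query against an Atlas Search index over media metadata could look roughly like this. The index name, connection string, and fields are assumptions for illustration only.

```python
from pymongo import MongoClient

# Assumes an Atlas cluster and an Atlas Search index named "default"
# on the hypothetical media_assets collection.
assets = MongoClient("<atlas-connection-string>")["imp"]["media_assets"]

pipeline = [
    {"$search": {"index": "default",
                 "text": {"query": "guitar solo", "path": "segments.labels"}}},
    {"$limit": 5},
    {"$project": {"title": 1, "segments.start": 1, "_id": 0}},
]
for doc in assets.aggregate(pipeline):
    print(doc)
```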

May 19, 2021

Appy Pie & MongoDB’s Seamless, No-Code Business Solutions for Mobile & Web Apps

>> Announcement: Some features mentioned below will be deprecated on Sep. 30, 2025. Learn more.

The tech industry’s ceaseless and exponential growth is no longer a surprise. As long as clients and end users remain interested in faster, more efficient services, tech companies will continue to improve business processes to meet the demand. Simultaneously, these improvements will reduce costs and maximize revenue. It’s a win-win--if, of course, it’s done correctly.

So, what’s behind most success stories? How do some companies launch and maintain applications at such rapid, expansive scale? Often, the key to success lies in fostering core business processes that are driven by automation. For many tech-based organizations, Appy Pie Connect has been the go-to seamless integration platform that helps them get started. And now, with MongoDB Realm, it’s about to get even easier. Users build best-in-class apps across Android, iOS, and the web with MongoDB Realm’s mobile database, sync solution, and application development services.

Why use Appy Pie & MongoDB Realm?

Together, Appy Pie and MongoDB are driving seismic operational change. Originally, Appy Pie’s AppMakr product moved to MongoDB Realm for local storage. But after experiencing the immense ease and advantages offered by Realm--specifically, its offline-first database that supports cross-platform app development--we decided to extend its benefits to the customers of Appy Pie Connect.

As an automation platform, Appy Pie Connect helps businesses automate manual tasks through smart integrations, allowing for intuitive, instant sharing between less commonly connected apps, such as Mailchimp and LinkedIn or Stripe and Gmail. By integrating MongoDB and MongoDB Realm with Appy Pie Connect, customers can easily store or retrieve data across multiple database sources. This enables the storage of flexible schemas and maintains consistency and integration. This unique “no code” technology allows organizations to extract and work with data from MongoDB and then apply that data to desired software through trigger-based actions. For example, users can set up a trigger for every event on their Google Calendar so that their Slack status is updated accordingly at the start and end of each meeting. This way, data concurrency is maintained without any manual effort. Realm is a particularly great choice for the customers of Appy Pie Connect because of its effortless data syncing. View some of the common use cases below.

Example 1: In the Meter Billing example, cost is calculated in real time based on usage (e.g., video viewing time), resulting in a transparent “pay as you go” model.

Example 2: With real-time data sync, any updates or changes to the application are immediately reflected without requiring users to update or reinstall.

Example 3: All API failure logs are conveniently displayed to the admin on the Appy Pie dashboard so that immediate troubleshooting actions can be taken.
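Conceptually, the trigger-action pattern described above amounts to watching for an event in one system and invoking an action in another. The sketch below shows that idea with a MongoDB change stream posting to a chat webhook; it is a generic illustration, not Appy Pie Connect’s implementation, and the collection name, webhook URL, and fields are placeholders.

```python
import requests                  # pip install requests
from pymongo import MongoClient  # pip install pymongo

# Placeholder names: collection, webhook URL, and fields are illustrative.
# Change streams require a replica set or an Atlas cluster.
orders = MongoClient("mongodb://localhost:27017")["shop"]["orders"]
SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

# Trigger: a new document is inserted. Action: post a message to Slack.
with orders.watch([{"$match": {"operationType": "insert"}}]) as stream:
    for event in stream:
        doc = event["fullDocument"]
        requests.post(SLACK_WEBHOOK, json={
            "text": f"New order {doc['_id']} from {doc.get('customer', 'unknown')}"
        })
```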
Benefits of MongoDB Realm and MongoDB Atlas

Appy Pie Connect already uses MongoDB Atlas, so moving to Realm -- a MongoDB product offered through Atlas -- was a natural choice. Realm allows mobile users to sync data quickly and seamlessly between mobile devices and backend systems, even if they go offline (sync occurs once they reconnect) -- and Atlas enables it all.

Some benefits of using Atlas are as follows:

- Scalability
- Flexible data schema
- Document-oriented storage
- Ad hoc queries, indexing, and real-time aggregation
- Powerful tools for data analysis
- Serverless functions

Ultimately, Appy Pie Connect helps businesses convert MongoDB into a central data store by pulling in and replicating data from all its sources. This allows customers to create new MongoDB documents automatically from new Typeform entries, new files on Dropbox, new posts on WordPress, or other resources. Similarly, Appy Pie Connect can also send MongoDB data to other third-party apps, including WordPress, Salesforce, Slack, Mailchimp, Google Drive, and many more. This makes enterprise-wide communication and collaboration much more efficient.

When data is pulled from MongoDB through automation, it can help streamline other areas of the business. For example, when you pull MongoDB data into Mailchimp, you can automatically add new subscribers. This ensures that your lists grow automatically, as fast as your business does.

Example: Appy Pie Connect seamlessly sends MongoDB data to third-party apps

Use cases

- Send data from MongoDB to LinkedIn (without any code!) to quickly post accurate job-related content
- Extract retail product data into Google Sheets to record valuable data in an organized manner in one place
- Post enterprise-related content from MongoDB to Twitter to streamline social media presence

How it works

Appy Pie Connect employs a trigger-action-based function that allows you, as a platform user, to choose the two apps you want to connect. The process is very straightforward. Once you choose the apps you wish to integrate, you will be presented with multiple options to connect them. Simply click on the “Connect” button. To integrate with the selected applications’ accounts, simply allow API access for Appy Pie Connect. Next, design the workflows by mapping all of your data synced from the applications you are connecting. Once complete, you are ready to test your brand new Connect with your Trigger and Action apps. And that’s it! It is time to experience the magic of Appy Pie Connect at work. Let the automation workflows take over the mundane, repetitive tasks, and move on to more innovative, exciting work.

As the efficiency of an organization improves through automation, one of the most direct advantages is a marked reduction in cost. These integrations help save hundreds of hours of manual effort, thereby freeing up talented resources to focus their energy and intellect on more critical, innovative issues. With Appy Pie Connect and MongoDB Realm, businesses can ensure that their workforce is not only optimized but also inspired, a key factor in employee satisfaction and overall company success.

Watch this demo to learn how to integrate MongoDB with Google Sheets using Appy Pie Connect and automate data exchange between the two with ease. Click here to learn more about MongoDB Realm.

February 2, 2021

Accelerating Mainframe Offload to MongoDB with TCS MasterCraft™

Tata Consultancy Services (TCS), a leading multinational information technology services and consulting company, leverages its IP-based solutions to accelerate and optimize service delivery. TCS MasterCraft™ TransformPlus uses intelligent automation to modernize and migrate enterprise-level mainframe applications to new, leading-edge architectures and databases like MongoDB. In this blog, we’ll review the reasons why organizations choose to modernize and how TCS has made the process easy and relatively risk-free.

Background

Legacy Modernization

Legacy modernization is a strategic initiative that enables you to refresh your existing database and applications portfolio by applying the latest innovations in development methodologies, architectural patterns, and technologies.

- At the current churn rate, about half of today’s S&P 500 firms will be replaced over the next 10 years
- $100T of economic value is ready to be unlocked over the next decade via digital transformation (Source)

Legacy System Challenges

Legacy technology platforms of the past, particularly monolithic mainframe systems, have always been challenged by the pace of disruptive digitalization. Neither the storage nor the accessibility of these rigid systems is agile enough to meet the increasing demands of volume, speed, and data diversity generated by modern digital applications. The result is noise between the legacy system of record and digital systems of engagement. This noise puts companies at a competitive disadvantage. It often manifests as a gap between customer service and user experience, impeding the delivery of new features and offerings and constraining the business from responding nimbly to changing trends.

Operational costs of mainframe and other legacy systems have also skyrocketed. With each million instructions per second (MIPS) costing up to $4,000 per year, these older systems can create the equivalent of nearly 40% of an organization’s IT budget in technical debt, significantly increasing the overall annual run cost. And as qualified staff age and retire over the years, it’s becoming harder to find and hire people with the required mainframe skills.

To manage MIPS consumption, a large number of our customers are offloading commonly accessed mainframe data to an independent operational data layer (ODL), to which queries are redirected from consuming applications. IT experts understand both the risk and the critical need to explore modernization options like encapsulation, rehosting, replatforming, refactoring, re-architecting, or rebuilding to replace these legacy systems. The key considerations when choosing an approach are familiar: risk of business disruption, cost, timelines, productivity, and the availability of the necessary skills.

MongoDB + TCS MasterCraft™ TransformPlus = Transformation Catalyst

To stay competitive, businesses need their engineering and IT teams to do these three things, among others:

- Build innovative digital apps fast
- Use data as a competitive moat to protect and grow their business
- Lower cost and risk while improving customer experience

Some customers use a “lift and shift” approach to move workloads off the mainframe to the cloud for immediate savings, but that process can’t unlock the value that comes with microservice architectures and document databases. Others gain that value by re-architecting and rewriting their applications, but this approach can be time-consuming, expensive, and risky.
More and more, customers are using a tools-driven refactoring approach to intelligently automate code conversion.

What TCS MasterCraft™ TransformPlus Brings to the Table

TCS MasterCraft™ TransformPlus automates the migration of legacy applications and databases to modern architectures like MongoDB. It extracts business logic from decades-old legacy mainframe systems as a convertible, NoSQL document data model for deployment. This makes extraction faster, easier, and more economical, and reduces the risk that comes with rewriting legacy applications. With more than 25 years of experience, TCS’s track record includes:

- 60+ modernization projects successfully delivered
- 500M+ lines of COBOL code analyzed
- 25M+ lines of COBOL code converted to Java
- 50M+ new lines of Java code auto-generated

What MongoDB Brings to the Table

MongoDB’s document data model platform can help make development cycles up to 5 times faster. Businesses can drive innovation faster, cut costs by 70% or more, and reduce their risk at the same time. As a developer, MongoDB gives you:

- The best way to work with data
- The ability to put data where you need it
- The freedom to run anywhere

Why is TCS collaborating with MongoDB for Mainframe Offload?

- Cost. Redirecting queries away from the mainframe to the ODL significantly reduces costs. Even cutting just 20%-30% of MIPS consumption can save millions of dollars in mainframe operating costs.
- Agility. As an ODL built on a modern data platform, MongoDB helps developers build new apps and digital experiences 3-5 times faster than is possible on a mainframe.
- User Experience. MongoDB meets demands for exploding data volumes and user populations by scaling out on commodity hardware, with self-healing replicas that maintain 24x7 service.

More details can be found here.

How TCS MasterCraft™ Accelerates Mainframe Offload to MongoDB

Data Migration

- Configures the target document schema corresponding to the source relational schema
- Automatically transforms relational data from mainframe sources into MongoDB documents (a simplified sketch of this row-to-document mapping appears at the end of this post)
- Loads data into MongoDB Atlas with the latest connector support

Application Migration

- Facilitates a cognitive code analysis-based application knowledge repository
- Ensures complete, comprehensive application knowledge extraction
- Automates conversion of application logic from COBOL to Java, with the data access layer reading data from MongoDB
- Splits monolithic code into multiple microservices
- Automates migration of mainframe screens to an AngularJS-based UI

Together, TCS MasterCraft™ TransformPlus and MongoDB can simplify and accelerate your journey to the cloud, streamlining and protecting your data while laying the foundation for digital success. Download the Modernization Guide to learn more.
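To give a feel for the relational-to-document transformation described under Data Migration, here is a generic, simplified sketch; it is not MasterCraft™-generated code, and the table, column, and collection names are assumptions.

```python
import sqlite3                   # stand-in for a relational extract taken off the mainframe
from pymongo import MongoClient

# Assumed source tables CUSTOMER and ACCOUNT, and an assumed target collection.
src = sqlite3.connect("legacy_extract.db")
customers = MongoClient("mongodb://localhost:27017")["modernized"]["customers"]

for cust_id, name, city in src.execute("SELECT ID, NAME, CITY FROM CUSTOMER"):
    accounts = [
        {"accountId": acc_id, "type": acc_type, "balance": balance}
        for acc_id, acc_type, balance in src.execute(
            "SELECT ACC_ID, ACC_TYPE, BALANCE FROM ACCOUNT WHERE CUST_ID = ?", (cust_id,)
        )
    ]
    # One document per customer, with the 1-to-many ACCOUNT rows embedded.
    customers.replace_one(
        {"_id": cust_id},
        {"_id": cust_id, "name": name, "city": city, "accounts": accounts},
        upsert=True,
    )
```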

October 28, 2020

1Data - PeerIslands Data Sync Accelerator

Today’s enterprises are in the midst of digital transformation, but they’re hampered by monolithic, on-prem legacy applications that don’t have the speed, agility, and responsiveness required for digital applications. To make the transition, enterprises are migrating to the cloud. MongoDB has partnered with PeerIslands to develop 1Data, a reference architecture and solution accelerator that helps users with their cloud modernization. This post details the challenges enterprises face with legacy systems and walks through how working with 1Data helps organizations expedite cloud adoption.

Modernization Trends

As legacy systems become unwieldy, enterprises are breaking them down into microservices and adopting cloud-native application development. Monolith-to-microservices migration is complex, but provides value across multiple dimensions. These include:

- Development velocity
- Scalability
- Cost-of-change reduction
- Ability to build multiple microservice databases concurrently

One common approach for teams adopting and building out microservices is to use domain-driven design to break down the overall business domain into bounded contexts first. They also often use the Strangler Fig pattern to reduce the overall risk, migrate incrementally, and then decommission the monolith once all required functionality is migrated. While most teams find this approach works well for the application code, it’s particularly challenging to break down monolithic databases into databases that meet the specific needs of each microservice. There are several factors to consider during the transition:

- Duration. How long will the transition to microservices take?
- Data synchronization. How much and what types of data need to be synchronized between monolith and microservice databases?
- Data translation in a heterogeneous schema environment. How are the same data elements processed and stored differently?
- Synchronization cadence. How much data needs syncing, and how often (real-time, nightly, etc.)?
- Data anti-corruption layer. How do you ensure the integrity of transaction data, and prevent the new data from corrupting the old?

Simplifying Migration to the Cloud

Created by PeerIslands and MongoDB, 1Data helps enterprises address the challenges detailed above. Migrate and synchronize your data with confidence with 1Data:

- Schema migration tool. Convert legacy DB schema and related components automatically to your target MongoDB instance. Use the GUI-based data mapper to track errors.
- Real-time data sync pipeline. Sync data between monolith and microservice databases nearly in real time with enterprise-grade components.
- Conditional data sync. Define how to slice the data you’re planning to sync.
- Data cleansing. Translate data as it’s moved.
- DSLs for data transformation. Apply domain-specific business rules for the MongoDB documents you want to create from your various aggregated source system tables. This layer also acts as an anti-corruption layer.
- Data auditing. Independently verify data sync between your source and target systems.
- Go beyond the database. Synchronize data from APIs, webhooks, and events.
- Bidirectional data sync. Replicate key microservice database updates back to the monolithic database as needed.

Get Started with Real-Time Data Synchronization

With the initial version of 1Data, PeerIslands addresses the core functionality of real-time data sync between source and target systems. Here’s a view of the logical architecture:
- Source System. The source system can be a relational database like Oracle, where we’ll rely on CDC, or other sources like events, APIs, or webhooks.
- Data Capture & Streaming. Captures the required data from the source system and converts it into data streams, using either off-the-shelf DB connectors or custom connectors depending on the source type. 1Data implements data sharding and throttling, which enable data synchronization at scale, in this phase.
- Data Transformation. The core of the accelerator, where we convert the source data streams into target MongoDB document schemas. We use a LISP-based domain-specific language to enable simple, rule-based data transformation, including user-defined rules (a rough sketch of this idea appears at the end of this post).
- Data Sink & Streaming. Captures the data streams that need to be updated into the MongoDB database through stream consumers. The actual update into the target DB is done through sink connectors.
- Target System. The MongoDB database used by the microservices.
- Auditing. Most data that gets migrated is enterprise-critical; 1Data audits the entire data synchronization process for missed data and incorrect updates.
- Two-way sync. The logical architecture enables data synchronization from the MongoDB database back to the source database.

We used MongoDB, Confluent Kafka, and Debezium to implement this initial version of 1Data. The technical architecture is cloud-agnostic and can be deployed on-prem as well. We’ll be customizing it for key cloud platforms, as well as fleshing out specific architectures to adopt for common data sync scenarios.

Conclusion

The 1Data solution accelerator lends itself to multiple use cases, from single view to legacy modernization. Please reach out to us for technical details and implementation assistance, and watch this space as we develop the 1Data accelerator further.
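As a rough sketch of the rule-based transformation and auditing ideas above, the following maps a flat change event onto a nested MongoDB document using a simple mapping table. This is not 1Data’s actual DSL; the mapping rules, names, and connection string are invented for illustration.

```python
from pymongo import MongoClient

# Invented mapping rules: source column -> target document path.
FIELD_MAP = {
    "CUST_ID": "customerId",
    "CUST_NM": "profile.name",
    "CITY_CD": "profile.address.city",
}

def transform(cdc_row: dict) -> dict:
    """Apply simple, rule-based mapping from a flat change event to a nested document."""
    doc: dict = {}
    for src_col, target_path in FIELD_MAP.items():
        node = doc
        *parents, leaf = target_path.split(".")
        for key in parents:
            node = node.setdefault(key, {})
        node[leaf] = cdc_row[src_col]
    return doc

target = MongoClient("mongodb://localhost:27017")["microservice"]["customers"]
row = {"CUST_ID": "C-1", "CUST_NM": "A. Sample", "CITY_CD": "BLR"}
target.replace_one({"customerId": "C-1"}, transform(row), upsert=True)

# Minimal audit step: verify the transformed document landed in the target store.
assert target.count_documents({"customerId": "C-1"}) == 1
```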

October 15, 2020