Use Cases · Published March 1, 2022
Data Logistics In Symplat
techManta Architecture Team
The following is a joint compilation by multiple members of the techManta organization.
Porting data from one ERP (Enterprise Resource Planning) system to another is not an easy task, especially when the processes are mapped quite differently between the two. When one of our customers wanted to onboard onto a new ERP from their old one, we provided “Data Logistics” as part of Business Continuity as a Service.
Data logistics had to run simultaneously as our customer's processes were being mapped onto the new ERP: dataflows could change based on optimisations and inputs coming from the process managers, the assembly line managers and the implementation specialist. We were happy to oblige with these course corrections.
These are some real-world problems we love solving at techManta.
Shipping data for Data Logistics is just one of the applications of Symplat.
Do not forget to check out our featured section on Symplat and the robust, expansive set of features it supports OOTB (out of the box):
Symplat (Low Code): Symplat
Business Continuity
Businesses cannot start with a clean slate when they adopt a new technology or platform; business continuity is paramount for companies of any scale, from small firms to multi-geographic enterprises. It is also the one thing that holds businesses back from trying out newer technology stacks: the problem of dealing with existing transactional and master data.
As part of our Strategic Consulting we provide Business Continuity as a Service, and ‘Data Logistics’ is one part of this service. During such projects we also encourage companies to look at the cloud as a means of saving costs in the long term and scaling on demand. Scaling should be a non-event, a default; not an afterthought.
For the ERP Data Logistics project mentioned at the start of this blog, we had to port data such as Inventory, Suppliers, Sub-Contractors, Bills of Materials, Items and the Chart of Accounts from an existing ERP to a cloud-based, web-based ERP.
Please keep reading to understand where we excel compared to other technology service providers.
Data Flow Stages
For this specific case we were dealing with two different relational ERP data models. We explained to our customer the stages and complexity of data logistics and how we were going to simplify it:
Step 1: Reverse Engineer - reconstruct the relational data model using the ‘Advanced Reverse Engineering Module’ of our platform. We reverse engineered both the source and the destination model and kept them in the same workspace for quick look-ups and mapping.
Step 2: Mapping - data-flow attributes between the legacy model and the new ERP were mapped with auto-completed suggestions on our platform. Some entities on the new ERP had to be mapped to many entities from the legacy database with business logic applied, and in other cases meaningful defaults had to be provided.
Step 3: Generate View - once the data had been mapped, a flat CSV file with the mapped data had to be generated for validation and loading into the new ERP. For us this is just another “view” in Symplat, because data can be marked up using HTML, rendered using Markdown, or serialised as JSON, or even CSV in this case.
Step 4: Load - for this last step we had to write a specific data pipe to complete the data flow and actually have the data persisted in the new ERP.
Reverse Engineer
Reverse engineering is the process through which the conceptual schema of a database is reconstructed. On Symplat, our Low-Code Platform, it takes only a few mouse clicks, with fine-grained control to import a specific subset or a logical grouping of entities and concentrate on one module at a time. Our Reverse Engineering Module takes care of pulling in all the associations.
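Symplat's module is point-and-click, but a rough, hand-written sketch of the information it has to recover may help make the idea concrete. The example below uses SQLAlchemy's schema inspector; the connection URLs and the plain-dict output shape are illustrative assumptions, not Symplat internals.

```python
# A minimal sketch of relational reverse engineering, assuming a SQLAlchemy
# connection URL for the source ERP database. It only illustrates the kind of
# information recovered: tables, columns and the foreign-key associations.
from sqlalchemy import create_engine, inspect

def reverse_engineer(db_url: str) -> dict:
    """Return a plain-dict model: {table: {"columns": [...], "foreign_keys": [...]}}."""
    engine = create_engine(db_url)
    inspector = inspect(engine)
    model = {}
    for table in inspector.get_table_names():
        model[table] = {
            # Column name and type for each attribute of the entity.
            "columns": [
                {"name": col["name"], "type": str(col["type"])}
                for col in inspector.get_columns(table)
            ],
            # Associations (foreign keys) pulled in alongside the entity.
            "foreign_keys": [
                {
                    "columns": fk["constrained_columns"],
                    "referred_table": fk["referred_table"],
                    "referred_columns": fk["referred_columns"],
                }
                for fk in inspector.get_foreign_keys(table)
            ],
        }
    return model

# Example: reverse engineer source and destination into one "workspace".
# workspace = {"legacy": reverse_engineer("postgresql://.../legacy_erp"),
#              "new":    reverse_engineer("postgresql://.../new_erp")}
```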
Mapping With Auto-Completed Suggestions
Once we had the Physical Model reverse engineered (databases are Physical Models in Symplat), we began the process of mapping the ‘Conceptual Model’ or ‘Logical Model’ to this Physical Model.
This is where our Low-Code platform really shines. Type mapping is OOTB (out of the box) with auto-completed suggestions; we also support complex expressions in place and added hooks to apply data transformation logic, or “Business Rules”.
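To make the idea concrete, here is a hand-rolled sketch of such a mapping specification with defaults and transformation hooks. The entity and field names (a legacy vendor record mapped to a new-ERP supplier) are hypothetical, and the code is not Symplat's mapping engine, just an illustration of what it configures for us.

```python
# A sketch of a mapping spec: each target attribute is either copied from a
# legacy attribute, computed by a transformation hook ("business rule"), or
# given a meaningful default. All field names below are hypothetical.
from dataclasses import dataclass
from typing import Any, Callable, Optional

@dataclass
class FieldMapping:
    target: str                                       # attribute on the new ERP entity
    source: Optional[str] = None                      # attribute on the legacy entity
    transform: Optional[Callable[[Any], Any]] = None  # optional business rule
    default: Any = None                               # meaningful default when source is absent

SUPPLIER_MAPPINGS = [
    FieldMapping(target="supplier_name", source="VENDOR_NM"),
    FieldMapping(target="tax_id", source="TAXNO", transform=lambda v: v.strip().upper()),
    FieldMapping(target="payment_terms_days", source="PAY_TERMS",
                 transform=lambda v: int(v or 0)),
    FieldMapping(target="is_active", default=True),   # no legacy equivalent
]

def map_record(legacy_row: dict, mappings: list[FieldMapping]) -> dict:
    """Apply the mapping spec to one legacy row and return a new-ERP row."""
    out = {}
    for m in mappings:
        value = legacy_row.get(m.source) if m.source else None
        if m.transform is not None and value is not None:
            value = m.transform(value)
        out[m.target] = value if value is not None else m.default
    return out
```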
Generate View
We treat data as a ‘first-class citizen’ in ‘Data Logistics’ projects. Once data has been sanitised, vetted and transformed in the prior steps, it is a matter of presentation. Presentation can range from markup using HTML, to a tabular format using Markdown, to JSON or YAML, or, in this specific project, a CSV to be imported into the new ERP. We built this flexibility into our platform so our customer could choose the format of data needed by their choice of ERP.
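As an illustration of why this is “just another view”, the sketch below renders the same mapped rows as either a flat CSV or JSON using only the Python standard library; the file names and column ordering are assumptions for the example.

```python
# A minimal sketch of the "view" step: the same mapped rows rendered either as
# the flat CSV the new ERP expects or as JSON.
import csv
import json

def write_csv_view(rows: list[dict], path: str) -> None:
    """Serialise mapped rows as a flat CSV for validation and loading."""
    if not rows:
        return
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)

def write_json_view(rows: list[dict], path: str) -> None:
    """The same data serialised as JSON, had the target ERP preferred it."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2, default=str)

# Example usage with rows produced by map_record() above:
# write_csv_view(mapped_suppliers, "suppliers_v1.csv")
```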
Data Loading
Once the data had been verified by the ERP implementation specialist and the department heads, we had to switch on two ‘Data Taps’: one tap for the QA environment for regression testing, and another tap for the Staging environment for the final porting.
A tap handles opening and closing the data flow and determines the chunk size or batch size, which in turn determines how reliably transactional data is ported.
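The sketch below is a simplified, hypothetical stand-in for such a tap: it pushes rows towards one environment in a configurable chunk size and retries failed batches. The load_chunk callable is a placeholder for the actual data pipe into the ERP (a bulk insert or API call) and is an assumption, not a Symplat interface.

```python
# A simplified "data tap": open the flow, push rows in batches, report results.
from typing import Callable, Iterable, Iterator

def chunked(rows: Iterable[dict], chunk_size: int) -> Iterator[list[dict]]:
    """Yield rows in batches of `chunk_size`."""
    batch: list[dict] = []
    for row in rows:
        batch.append(row)
        if len(batch) == chunk_size:
            yield batch
            batch = []
    if batch:
        yield batch

def run_tap(rows: Iterable[dict],
            load_chunk: Callable[[list[dict]], None],
            chunk_size: int = 500,
            max_retries: int = 3) -> tuple[int, int]:
    """Push every chunk through `load_chunk`, retrying on failure; return (ok, failed)."""
    ok = failed = 0
    for batch in chunked(rows, chunk_size):
        for attempt in range(1, max_retries + 1):
            try:
                load_chunk(batch)   # placeholder for the ERP data pipe
                ok += 1
                break
            except Exception:
                if attempt == max_retries:
                    failed += 1
    return ok, failed
```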
We also versioned the CSV files generated from the data-mapping stage and logically chained them, which allowed our customer to go back and forth during this project to make further optimisations and account for regression issues.
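One possible way to version and chain the generated CSVs is sketched below: each new version records its parent in a small JSON manifest, so a regression can be traced back to the exact mapping output that introduced it. The manifest layout and file naming are assumptions for illustration, not how Symplat stores versions.

```python
# A minimal sketch of versioning and logically chaining generated CSV files.
import json
import os
import shutil
from datetime import datetime, timezone

MANIFEST = "versions.json"

def publish_version(csv_path: str, out_dir: str, note: str = "") -> str:
    """Copy the CSV into the version store and chain it to the previous version."""
    os.makedirs(out_dir, exist_ok=True)
    manifest_path = os.path.join(out_dir, MANIFEST)
    versions = []
    if os.path.exists(manifest_path):
        with open(manifest_path, encoding="utf-8") as f:
            versions = json.load(f)

    version_id = f"v{len(versions) + 1}"
    target = os.path.join(out_dir, f"{version_id}_{os.path.basename(csv_path)}")
    shutil.copyfile(csv_path, target)

    versions.append({
        "id": version_id,
        "file": os.path.basename(target),
        "parent": versions[-1]["id"] if versions else None,  # the logical chain
        "created": datetime.now(timezone.utc).isoformat(),
        "note": note,
    })
    with open(manifest_path, "w", encoding="utf-8") as f:
        json.dump(versions, f, indent=2)
    return version_id
```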
If this is something which interests you, please contact us:
Contact Us: Contact Us