Why is a structured data modelling exercise essential for software quality?
RDBMSs have dominated, and continue to dominate, the data world. Yet software architects, often in a hurry, skip the structured data modelling exercise to save time. Is that the right approach?
In software development, the cost of fixing a bug rises dramatically the longer a defect goes undetected.
For example, a bug that escapes the requirements stage and is caught only at deployment is far more expensive to fix than one caught at the design stage itself.
I must have designed tens of thousands of tables across numerous database systems: RDBMS, NDBMS, NoSQL, and so on. In all of them I applied what I studied in Information Engineering: the Yourdon-DeMarco method of normalization and Entity Relationship Diagrams, following the Gane & Sarson method of system analysis with Context Diagrams, Data Flow Diagrams, and so on. Initially it was a tedious procedure, but I later realized that my very thought pattern had become aligned to these scientific methods, yielding numerous advantages. The top ones are:
- I realized that, without even noticing, I was catering to many requirements my users would eventually need, most of them met with plain SQL statements alone.
- Data integrity followed seamlessly from the design.
- A thorough data dictionary helped organize huge volumes of data in a way that scaled up the success of business analytics.
- The data model automatically explained the business logic to developers by connecting user stories to the schema, drastically reducing errors.
- Greater reusability of data.
- Lower maintenance cost.
- Ultimately a robust design.
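The integrity point above can be made concrete. Below is a minimal sketch, assuming a hypothetical `customer`/`customer_order` schema (invented for illustration), using Python's built-in sqlite3 module. A normalized design with declared keys and constraints lets the database itself reject inconsistent data, with no application-level validation code:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled

# Customers and orders split into separate tables: customer details live in
# one place, and each order merely references its customer by key.
conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        email       TEXT NOT NULL UNIQUE
    )""")
conn.execute("""
    CREATE TABLE customer_order (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        amount      REAL NOT NULL CHECK (amount > 0)
    )""")

conn.execute("INSERT INTO customer VALUES (1, 'Asha', 'asha@example.com')")
conn.execute("INSERT INTO customer_order VALUES (100, 1, 250.0)")

# An order for a non-existent customer is rejected by the schema itself.
try:
    conn.execute("INSERT INTO customer_order VALUES (101, 99, 50.0)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

The design choice here is exactly the one the list describes: once the model is normalized and its constraints declared, integrity is a property of the schema rather than of every application that writes to it.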
Faster, Cheaper, Better: that is the mantra I follow, and I credit data modelling principles for making it possible.