DATA MODELLING

What is Data Modelling?

Data modeling (also spelled data modelling) is the process of creating a data model for the data to be stored in a database. A data model is a conceptual representation of data objects, the associations between them, and the rules that govern them. Data modeling supports the visual representation of data and enforces business rules, regulatory compliance, and government policies on the data. Data models ensure consistency in naming conventions, default values, semantics, and security, while also ensuring the quality of the data.

Why use Data Model?

The primary goals of using a data model are:

  • It ensures that all data objects required by the database are accurately represented. Omitting data leads to faulty reports and incorrect results.
  • A data model helps design the database at the conceptual, logical, and physical levels.
  • The data model structure helps define the relational tables, primary and foreign keys, and stored procedures.
  • It provides a clear picture of the base data and can be used by database developers to create a physical database.
  • It also helps identify missing and redundant data.
  • Though the initial creation of a data model is labor- and time-intensive, in the long run it makes upgrading and maintaining your IT infrastructure cheaper and faster.
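As a small illustration of the points above, the sketch below shows how a modelled structure translates into relational tables with primary and foreign keys, and how the database then rejects data that violates the model's rules. It uses Python's standard-library sqlite3 module; the Customer/Order schema is a hypothetical example, not taken from this article.

```python
import sqlite3

# In-memory database; the schema is a hypothetical illustration.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity

# A modelled "Customer" entity becomes a table with a primary key.
conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    )""")

# A modelled "Order" entity references Customer via a foreign key,
# capturing the one-to-many association from the data model.
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        total       REAL NOT NULL DEFAULT 0.0
    )""")

conn.execute("INSERT INTO customer VALUES (1, 'Alice')")
conn.execute("INSERT INTO orders VALUES (100, 1, 25.50)")

# An order pointing at a non-existent customer violates the model's rules,
# so the database refuses it.
try:
    conn.execute("INSERT INTO orders VALUES (101, 99, 10.0)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

The foreign-key check is exactly the kind of rule a data model makes explicit before any code is written.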

Types of Data Models

There are three main types of data models: conceptual, logical, and physical, each with a specific purpose. Data models are used to represent the data, how it is stored in the database, and the relationships between data items.

  1. Conceptual Data Model: Defines WHAT the system contains. It is typically created by business stakeholders and data architects. Its purpose is to organize, scope, and define business concepts and rules.
  2. Logical Data Model: Defines HOW the system should be implemented, regardless of the DBMS. It is typically created by data architects and business analysts. Its purpose is to develop a technical map of rules and data structures.
  3. Physical Data Model: Describes HOW the system will be implemented using a specific DBMS. It is typically created by DBAs and developers. Its purpose is the actual implementation of the database.
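To make the three levels concrete, here is a minimal sketch in plain Python. The entity and attribute names are invented for illustration: the conceptual level names only the entities and their relationship, the logical level adds attributes, keys, and types independent of any DBMS, and the physical level commits to DBMS-specific DDL (PostgreSQL-flavoured here).

```python
# Conceptual: WHAT the system contains -- entities and relationships only.
conceptual = {
    "entities": ["Customer", "Order"],
    "relationships": [("Customer", "places", "Order")],
}

# Logical: HOW, independent of any DBMS -- attributes, keys, abstract types.
logical = {
    "Customer": {
        "attributes": {"customer_id": "integer", "name": "string"},
        "primary_key": "customer_id",
    },
    "Order": {
        "attributes": {"order_id": "integer", "customer_id": "integer"},
        "primary_key": "order_id",
        "foreign_keys": {"customer_id": "Customer.customer_id"},
    },
}

# Physical: HOW on a specific DBMS -- concrete DDL with vendor types.
physical = """
CREATE TABLE customer (
    customer_id SERIAL PRIMARY KEY,
    name        VARCHAR(100) NOT NULL
);
CREATE TABLE customer_order (
    order_id    SERIAL PRIMARY KEY,
    customer_id INTEGER REFERENCES customer (customer_id)
);
"""

print(conceptual["relationships"][0])
```

Notice how each level refines the previous one: nothing at the physical level contradicts the conceptual model, it only adds implementation detail.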

Data Modeler: Job Description & Career Requirements

Career Definition for a Data Modeler

Data modeling is the process through which a mass of data is organized into a structure that computers can process and that is useful to a business or large institution. Often working on a team of data architects, data modelers are systems analysts who translate business requirements into conceptual, logical, and physical data models. They may focus on issues such as reducing data redundancy within an existing system or improving the way data moves from one system to another.

Required Education

Many jobs in data modeling require a bachelor's degree with an emphasis on computer science, information science, or applied mathematics; some employers seek candidates with graduate or postgraduate coursework in business or information systems management. Courses recommended by the Association for Computing Machinery (ACM) include digital logic and data representation; computer architecture and organization; memory architecture; and directions in computing. Some jobs can be obtained through a combination of practical experience and college courses in computer science.

Skills Required

Data modelers are enthusiastic learners, dedicated to customer service and quality control. They also have excellent problem-solving and time management skills.

Career and Economic Outlook

The median salary for all computer systems analysts, of which data modelers are a part, was $90,920 in May 2019, according to the U.S. Bureau of Labor Statistics (BLS). Jobs for computer systems analysts overall are projected to grow by 7% from 2019-2029, a rate faster than the national average, according to the BLS.
