BIG DATA ANALYTICS

Big data describes large sets of diverse data—structured, unstructured, and semi-structured—that are continuously generated at high speed and in high volumes. Big data is typically measured in terabytes or petabytes. One petabyte is equal to 1,000,000 gigabytes. To put this in perspective, consider that a single HD movie contains around 4 gigabytes of data; one petabyte is therefore the equivalent of about 250,000 films. Large datasets can range from hundreds to thousands, or even millions, of petabytes.
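
The scale arithmetic above can be checked in a couple of lines (using the decimal convention 1 PB = 10^6 GB, as the text does):

```python
# Scale arithmetic behind the petabyte comparison above.
GB_PER_PB = 1_000_000        # decimal convention: 1 PB = 10^6 GB
GB_PER_HD_MOVIE = 4          # approximate size of one HD movie

movies_per_petabyte = GB_PER_PB // GB_PER_HD_MOVIE
print(movies_per_petabyte)   # 250000
```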

Big data analytics is the process of finding patterns, trends, and relationships in massive datasets. These complex analytics require specific tools and technologies, computational power, and data storage that support the scale.

Data analysis

This is the step in which raw data is converted to actionable insights. The following are four types of data analytics:

1. Descriptive analytics

Descriptive analytics examines data to understand what happened or what is happening in the data environment. It is characterized by data visualization such as pie charts, bar charts, line graphs, tables, or generated narratives.
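
As a minimal sketch of descriptive analytics, summary statistics answer "what happened" for a dataset. The `daily_sales` figures below are invented purely for illustration:

```python
from statistics import mean, median

# Hypothetical daily sales figures, used only for illustration.
daily_sales = [120, 135, 128, 160, 142, 155, 149]

# Descriptive analytics: summarize what happened.
print(f"total:  {sum(daily_sales)}")
print(f"mean:   {mean(daily_sales):.1f}")
print(f"median: {median(daily_sales)}")
print(f"range:  {min(daily_sales)}-{max(daily_sales)}")
```

In practice these summaries would feed the charts and tables mentioned above rather than be printed to a console.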

2. Diagnostic analytics

Diagnostic analytics is a deep-dive or detailed data analytics process to understand why something happened. It is characterized by techniques such as drill-down, data discovery, data mining, and correlations. In each of these techniques, multiple data operations and transformations are used for analyzing raw data.
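
One of the simplest diagnostic techniques named above is correlation: checking whether two quantities moved together. A minimal sketch, with invented numbers, computing the Pearson correlation coefficient directly from its definition:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from its definition."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: did marketing spend move with sales?
ad_spend = [10, 12, 9, 15, 14, 11]
sales    = [100, 118, 95, 140, 133, 108]

print(round(pearson(ad_spend, sales), 3))
```

A coefficient near 1 suggests the two series moved together and is a cue to drill down further; correlation alone does not establish that one caused the other.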

3. Predictive analytics

Predictive analytics uses historical data to forecast future trends. It is characterized by techniques such as machine learning, forecasting, pattern matching, and predictive modeling. In each of these techniques, computers learn statistical relationships in historical data and extrapolate them forward—capturing correlation, not proven causation.
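
The simplest form of the forecasting mentioned above is fitting a trend line to history and extrapolating one step ahead. A sketch with an invented revenue series, using an ordinary least-squares fit:

```python
def fit_trend(ys):
    """Ordinary least-squares fit of y = a + b*t for t = 0, 1, 2, ..."""
    n = len(ys)
    ts = range(n)
    mt, my = sum(ts) / n, sum(ys) / n
    b = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
         / sum((t - mt) ** 2 for t in ts))
    a = my - b * mt
    return a, b

# Hypothetical monthly revenue history.
history = [200, 210, 222, 231, 239, 252]
a, b = fit_trend(history)

# Predict the next (7th) month by extrapolating the fitted line.
forecast_next = a + b * len(history)
print(round(forecast_next, 1))
```

Real predictive models are richer (seasonality, multiple features, machine-learned nonlinearities), but the core idea—learn from history, extrapolate forward—is the same.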

4. Prescriptive analytics

Prescriptive analytics takes predictive data to the next level. It not only predicts what is likely to happen but also suggests an optimum response to that outcome. It can analyze the potential implications of different choices and recommend the best course of action. It is characterized by graph analysis, simulation, complex event processing, neural networks, and recommendation engines.
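
A minimal sketch of that idea: take a demand figure (as if it came from a predictive model), simulate the implications of several candidate actions, and recommend the best one. All numbers and the toy demand model below are invented for illustration:

```python
# Prescriptive analytics sketch: evaluate candidate actions against a
# predicted demand and recommend the one with the highest expected profit.

predicted_demand = 500  # units, e.g. output of a predictive model

def expected_profit(price, demand):
    # Toy demand model: each $1 of price above $10 loses 5% of demand.
    units_sold = demand * max(0.0, 1 - 0.05 * (price - 10))
    unit_cost = 6
    return units_sold * (price - unit_cost)

candidate_prices = [10, 12, 14, 16, 18]
best = max(candidate_prices,
           key=lambda p: expected_profit(p, predicted_demand))
print(best, round(expected_profit(best, predicted_demand), 2))
```

Production systems replace the toy profit function with simulation, graph analysis, or learned models, but the structure—score the alternatives, recommend the optimum—is the prescriptive step.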

