How to convert zipped binary data to String format?

Here, the source is a Kafka consumer component, which receives all the zipped binary data.

Its output type is byte[], so we can consume all the data from the configured Kafka topic as raw bytes.
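
For reference, a plain-Java equivalent of this consumer (outside Talend) would use Kafka's ByteArrayDeserializer so that each record value arrives untouched as byte[]. This is a minimal sketch, not the job's actual configuration; the broker address, group id, and topic name are placeholders:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ZippedPayloadConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder broker
        props.put("group.id", "zipped-payload-group");     // placeholder group id
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // ByteArrayDeserializer keeps the GZIP bytes intact instead of decoding them as text
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");

        try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my_topic")); // placeholder topic
            ConsumerRecords<String, byte[]> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, byte[]> record : records) {
                byte[] payload = record.value(); // zipped binary data, like input_row.payload below
                System.out.println("Received " + payload.length + " compressed bytes");
            }
        }
    }
}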

[Image: Kafka configuration]

After the Kafka component, we place a tJavaRow to hold the code that converts the zipped binary data to string format.

Below are the dependencies that we need to import for the Java code:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.zip.GZIPInputStream;

byte[] inputData = input_row.payload;
try {
    // Record the starting time so we can log how long decompression takes
    log.info("UncompressMessage Activity START");
    long startTime = System.currentTimeMillis();

    // Create a GZIPInputStream to decompress the data
    ByteArrayInputStream byteStream = new ByteArrayInputStream(inputData);
    GZIPInputStream gzipStream = new GZIPInputStream(byteStream);

    // Read the decompressed data into a byte array
    ByteArrayOutputStream outputByteStream = new ByteArrayOutputStream();
    byte[] buffer = new byte[1024];
    int len;
    while ((len = gzipStream.read(buffer)) > 0) {
        outputByteStream.write(buffer, 0, len);
    }
    gzipStream.close();

    // Convert the decompressed data to a string
    String outputData = outputByteStream.toString("UTF-8");

    // Set the output string data
    output_row.payload = outputData;

    log.info("UncompressMessage Activity END in "
            + (System.currentTimeMillis() - startTime) + " ms");
} catch (Exception e) {
    e.printStackTrace();
}
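
To check the decompression logic outside the Talend job, you can run a quick round trip in plain Java: compress a sample string with GZIPOutputStream, then feed the bytes through the same loop used in the tJavaRow above. This is a standalone sketch; the class name and sample string are illustrative only:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTripTest {
    public static void main(String[] args) throws Exception {
        String original = "Hello from the Kafka payload"; // sample message

        // Compress: stands in for the producer side that zipped the message
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        try (GZIPOutputStream gzipOut = new GZIPOutputStream(compressed)) {
            gzipOut.write(original.getBytes(StandardCharsets.UTF_8));
        }
        byte[] inputData = compressed.toByteArray();

        // Decompress: the same logic as the tJavaRow code above
        ByteArrayInputStream byteStream = new ByteArrayInputStream(inputData);
        GZIPInputStream gzipStream = new GZIPInputStream(byteStream);
        ByteArrayOutputStream outputByteStream = new ByteArrayOutputStream();
        byte[] buffer = new byte[1024];
        int len;
        while ((len = gzipStream.read(buffer)) > 0) {
            outputByteStream.write(buffer, 0, len);
        }
        gzipStream.close();

        String outputData = outputByteStream.toString("UTF-8");
        System.out.println(outputData); // prints the original string
    }
}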

The code above converts the zipped binary data to string format. Now we can use a tLogRow component to print the data.