Position Title: Sr. Data Engineer (AWS, Python, Snowflake) - Location: Houston, TX (Need Locals for Hybrid 3 days onsite)

Interview Process: Virtual, with an in-person interview if required; candidates must be comfortable with both.

Duration: 6+ Months

Two manager references required.

Job Description:

The ideal candidate will be responsible for developing Python modules using NumPy, Pandas, and dynamic programming in AWS and Snowflake. They will expand and optimize our ETL and data pipeline architecture and data flow across our business portfolio. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our data product owners, developers, data architects, data analysts, and data scientists on BI and analytics initiatives and will ensure that an optimal data delivery architecture is applied consistently across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing, or even redesigning, our company's data architecture to support our next generation of products and data initiatives.

The candidate will also assist with issue resolution, job orchestration, automation, and continuous improvement of our data integration processes.
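As a rough illustration of the extract/transform/load work described above, here is a minimal Pandas/NumPy sketch. All table and column names are hypothetical, and the in-memory sample stands in for real sources such as S3 or Snowflake:

```python
import numpy as np
import pandas as pd


def extract() -> pd.DataFrame:
    # A real pipeline would read from S3, Snowflake, or an API;
    # an in-memory sample is used here for illustration.
    return pd.DataFrame({
        "order_id": [1, 2, 3],
        "amount": [120.0, 85.5, 210.0],
        "region": ["TX", "tx", "CA"],
    })


def transform(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["region"] = out["region"].str.upper()    # normalize region codes
    out["log_amount"] = np.log(out["amount"])    # derived feature via NumPy
    return out


def load(df: pd.DataFrame) -> None:
    # A real pipeline would write to a warehouse table (e.g. via the
    # Snowflake connector); printing stands in for that step here.
    print(df.to_string(index=False))


if __name__ == "__main__":
    load(transform(extract()))
```

In practice each stage would be orchestrated as a separate job step, with data validation checks between transform and load.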

Key Responsibilities

· Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using cloud integration ETL tools, a cloud data warehouse, SQL, and AWS.

· Design and develop ELT, ETL, and event-driven data integration architecture solutions.

· Work with data analysts, data architects, BI architects, data scientists, and data product owners to establish an understanding of source data and determine data transformation/integration requirements.

· Troubleshoot and tune complex SQL queries.

· Use on-premises and cloud-based ETL platforms, a cloud data warehouse, AWS, GitHub, various scripting languages, SQL, querying tools, data quality tools, and metadata management tools.

· Develop data validation processes to ensure data quality.

· Work effectively both individually and as part of a collaborative team.


Preferred Skills & Qualifications
· Bachelor’s degree (or foreign equivalent) in Computer Science, Computer Engineering, or a related field.

· 8+ years of experience with data engineering, ETL, data warehouse, data mart, and data lake development.

· Advanced SQL knowledge, including experience with relational databases and query authoring, and familiarity with a variety of database systems.

· Experience with cloud data warehouses such as Snowflake, Google BigQuery, or Amazon Redshift.

· Experience with AWS cloud services: EC2, S3, Lambda, SQS, SNS, etc.

· Experience with cloud integration tools such as Matillion, Dell Boomi, Informatica Cloud, Talend, or AWS Glue.

· Experience with GitHub and its integration with ETL tools for version control.

· Familiarity with modern data management tools and platforms, including Spark, Hadoop/Hive, NoSQL, APIs, streaming, and other analytic data platforms.

· Experience with object-oriented/functional scripting languages (Python, Java, Scala, etc.) is a plus.

· Experience with Agile/Scrum methodologies is valuable.

Apply here: https://c2cjobs.talenthubsolutions.com/

