Data Engineer - ETL / Python / Hadoop / Snowflake

Pittsburgh, Pennsylvania


Employer: Compunnel
Industry: 
Salary: Competitive
Job type: Part-Time

Job Description: 

Responsibilities

Translate business needs into ETL/ELT logical models and ensure data structures are designed flexibly to support scalable business solutions

Design and implement data pipelines using AWS Glue, AWS Lambda, Spark, and Python (see the sketch after this list)

Define and deliver reusable components for the ETL/ELT framework

Define optimal data flow for system integration and data migration

Integrate new data management technologies and software engineering tools into the existing architecture
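
A minimal sketch of the kind of pipeline work described above, assuming a PySpark-based job (an AWS Glue job would follow a similar structure); the bucket paths, column names, and function name are illustrative only:

    # Illustrative PySpark ETL job: read raw records, apply a simple
    # transformation, and write partitioned Parquet output.
    # All paths and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    def run_pipeline(input_path: str, output_path: str) -> None:
        spark = SparkSession.builder.appName("example-etl").getOrCreate()

        # Extract: load raw CSV data.
        raw = spark.read.option("header", "true").csv(input_path)

        # Transform: rename and cast the amount column, drop invalid rows.
        cleaned = (
            raw.withColumnRenamed("txn_amt", "transaction_amount")
               .withColumn("transaction_amount", F.col("transaction_amount").cast("double"))
               .filter(F.col("transaction_amount").isNotNull())
        )

        # Load: write partitioned Parquet for downstream consumers.
        cleaned.write.mode("overwrite").partitionBy("transaction_date").parquet(output_path)

        spark.stop()

    if __name__ == "__main__":
        run_pipeline(
            "s3://example-bucket/raw/transactions/",
            "s3://example-bucket/curated/transactions/",
        )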

Qualifications

Experienced in the design, development, and implementation of large-scale projects in the financial industry using data warehousing ETL tools (Pentaho)

Experience creating and scheduling ETL transformations and jobs using the Pentaho Kettle (Spoon) designer and Pentaho Data Integration

Strong knowledge of and experience with SQL, Snowflake, Python, and Spark (a brief sketch follows this list)

Experience with Big Data/distributed frameworks such as Spark, Kubernetes, Hadoop, and Hive

Ability to design ETL/ELT solutions based on user reporting and archival requirements

Strong sense of customer service to consistently and effectively address client needs

Self-motivated; comfortable working independently under general direction
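
As a brief illustration of how Spark, Python, and Snowflake typically come together in this kind of role, the sketch below writes a Spark DataFrame into Snowflake via the Snowflake Spark connector (assumed to be available on the cluster); the account, credentials, and table names are placeholders, not details from this posting:

    # Illustrative load of a curated Spark DataFrame into Snowflake using
    # the Snowflake Spark connector. Connection options and table names are
    # hypothetical; credentials would normally come from a secrets manager.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("snowflake-load").getOrCreate()

    # Curated output produced by an upstream transformation step.
    df = spark.read.parquet("s3://example-bucket/curated/transactions/")

    sf_options = {
        "sfURL": "example_account.snowflakecomputing.com",
        "sfUser": "ETL_USER",
        "sfPassword": "********",   # placeholder only
        "sfDatabase": "ANALYTICS",
        "sfSchema": "PUBLIC",
        "sfWarehouse": "ETL_WH",
    }

    (
        df.write
          .format("net.snowflake.spark.snowflake")
          .options(**sf_options)
          .option("dbtable", "TRANSACTIONS")
          .mode("append")
          .save()
    )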

Primary Skills:
Spark
Python
Snowflake
Airflow
Hadoop

Education: Bachelor's degree

Additional client information:

Created: 2024-06-26
Reference: PATDC4959162
Country: United States
State: Pennsylvania
City: Pittsburgh
ZIP: 15216

