Certified Cloud Engineer

[city, WI], Not Specified


Employer: Saxon Global
Industry: 
Salary: Competitive
Job type: Full-Time

  • Role - Certified Cloud Engineer
  • Hybrid Work Schedule
  • LOCAL CANDIDATES ONLY
  • 9+ Month Contract
  • End Client - State of WI
  • Rate: $120/hr on C2C
  • Interview: Remote via MS Teams
  • Years of experience: 15-20+

Job Description:

We are seeking a highly skilled Data Engineer to join our dynamic team. In this role, you will collaborate closely with our data engineering and development teams to design, develop, test, and maintain robust, scalable ELT/ETL pipelines. Your work will involve SQL scripts, Redshift stored procedures, and a range of AWS tools and services. You will architect, implement, and manage end-to-end data pipelines, with an emphasis on data accuracy, reliability, quality, performance, and timeliness.
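
For context, the sketch below shows one way a pipeline step like this might be driven from Python: calling a Redshift stored procedure with psycopg2. The cluster endpoint, credentials, and the etl.load_daily_sales procedure are hypothetical placeholders, not details from this posting.

    # Minimal sketch: invoking a Redshift stored procedure from Python.
    # All connection details and the procedure name are hypothetical.
    import psycopg2

    conn = psycopg2.connect(
        host="my-cluster.example.us-east-1.redshift.amazonaws.com",  # hypothetical endpoint
        port=5439,
        dbname="analytics",
        user="etl_user",
        password="...",  # in practice, pull from Secrets Manager or use IAM auth
    )
    try:
        with conn.cursor() as cur:
            # Redshift stored procedures are invoked with CALL.
            cur.execute("CALL etl.load_daily_sales(%s)", ("2024-05-09",))
        conn.commit()
    finally:
        conn.close()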

Key Responsibilities:
  • Collaborate with cross-functional teams to translate business requirements into effective data solutions.
  • Design and implement ETL processes, including change data capture (CDC) and slowly changing dimension (SCD) logic, to integrate data seamlessly from diverse source systems.
  • Provide expertise in Redshift database optimization, performance tuning, and query optimization.
  • Design and implement efficient orchestration workflows using Airflow (see the DAG sketch after this list).
  • Integrate Redshift with AWS services such as AWS DMS, AWS Glue, AWS Lambda, Amazon S3, and more to build end-to-end data pipelines.
  • Perform data profiling and analysis to troubleshoot data-related challenges and develop solutions.
  • Proactively identify opportunities to automate tasks and create reusable frameworks.
  • Work closely with the version control team to maintain a well-organized and documented code repository using Git.
  • Provide technical guidance and mentorship to fellow developers, sharing best practices and optimizing Redshift-based data solutions.
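
As a concrete illustration of the orchestration bullet above, here is a minimal Airflow DAG sketch with two dependent tasks; the DAG id, task names, and load logic are hypothetical examples rather than the client's actual pipeline.

    # Minimal Airflow 2.x DAG sketch: extract step followed by a Redshift load.
    # All identifiers here are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_from_s3(**context):
        # Placeholder: a real task might list and stage S3 objects for the batch date.
        print("extracting batch for", context["ds"])

    def load_to_redshift(**context):
        # Placeholder: a real task might run a COPY or CALL a stored procedure.
        print("loading batch for", context["ds"])

    with DAG(
        dag_id="daily_redshift_load",      # hypothetical
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_from_s3", python_callable=extract_from_s3)
        load = PythonOperator(task_id="load_to_redshift", python_callable=load_to_redshift)
        extract >> load                    # load runs only after extract succeeds

The >> operator declares the dependency that Airflow's scheduler enforces between the two tasks.
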
Required Skills:
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 15-20+ years of hands-on experience designing, developing, and maintaining data pipelines and ETL processes on AWS Redshift, including data lakes and data warehouses.
  • Proficiency in SQL programming and Redshift stored procedures.
  • Hands-on experience with AWS services like AWS DMS, Amazon S3, AWS Glue, Redshift, and Airflow.
  • Strong understanding of ETL best practices, data integration, data modeling, and data transformation.
  • Experience with complex ETL scenarios, including CDC and SCD logic (a minimal SCD Type 2 sketch follows this list).
  • Demonstrated expertise in AWS DMS for seamless data ingestion.
  • Proficiency in Python programming for Airflow DAG development.
  • Familiarity with converting Oracle scripts and stored procedures to Redshift equivalents.
  • Knowledge of version control systems, particularly Git.
  • Ability to identify and resolve performance bottlenecks and fine-tune Redshift queries.
  • Strong coding and problem-solving skills with attention to data quality and accuracy.
  • Effective communication of technical concepts to non-technical stakeholders.
  • Proven track record of delivering high-quality data solutions within deadlines.
  • Experience working with large-scale, high-volume data environments.
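
To make the CDC/SCD requirement concrete, the sketch below shows a minimal slowly changing dimension Type 2 load run against Redshift from Python: current rows whose tracked attributes changed are expired, and fresh current versions are inserted. The stg_customer and dim_customer tables, their columns, and the connection string are hypothetical.

    # Minimal SCD Type 2 sketch; table and column names are hypothetical.
    import psycopg2

    EXPIRE_CHANGED = """
        UPDATE dim_customer d
        SET    end_date = CURRENT_DATE, is_current = FALSE
        FROM   stg_customer s
        WHERE  d.customer_id = s.customer_id
          AND  d.is_current
          AND  d.address <> s.address;
    """

    INSERT_CURRENT = """
        INSERT INTO dim_customer (customer_id, address, start_date, end_date, is_current)
        SELECT s.customer_id, s.address, CURRENT_DATE, NULL, TRUE
        FROM   stg_customer s
        LEFT JOIN dim_customer d
               ON d.customer_id = s.customer_id AND d.is_current
        WHERE  d.customer_id IS NULL;   -- new customers and just-expired ones
    """

    conn = psycopg2.connect("dbname=analytics host=... user=...")  # hypothetical DSN
    with conn:                     # commits both statements as one transaction
        with conn.cursor() as cur:
            cur.execute(EXPIRE_CHANGED)
            cur.execute(INSERT_CURRENT)
    conn.close()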

Desired Skills:
  • AWS certifications related to data engineering or databases.

Created: 2024-05-09
Reference: SG - 89379
Country: United States
State: Not Specified
City: [city, WI]