Sr. Data Engineer

Pikesville, Maryland


Employer: System One Holdings, LLC
Industry: 
Salary: Competitive
Job type: Full-Time

Sr. Data Engineer
Onsite - Troy, Michigan - Open to C2C


Duties and Responsibilities:

  • The Sr. Data Engineer is responsible for understanding and supporting the business through the design, development, and execution of Extract, Transform, and Load (ELT/ETL), data integration, and data analytics processes across the enterprise.
  • You will stay on top of technology trends, experiment with and learn new technologies, contribute to the growth of the data organization, participate in internal and external technology communities, and mentor other members of the team.
  • Provide technical leadership at every stage of the data engineering lifecycle, from designing data platforms, data pipelines, and data stores to gathering, importing, wrangling, querying, and analyzing data.
  • The Sr. Data Engineer will work closely with various customers, including their immediate project teams, business domain experts, and other technical staff members.
  • Work daily within a project team environment, taking direction from project management and technical leaders.
  • Responsible for design, development, administration, support, and maintenance of the Snowflake Platform and Oracle Platform.
  • Participates in the full systems life cycle and in cloud data lake/data warehouse design and build, including recommendations on code development, integration with a data marketplace, reuse, and buy-versus-build solutions.

Technical Leadership:
  • Lead data integration across the enterprise through the design, build, and implementation of large-scale, high-volume, high-performance data pipelines for both on-prem and cloud data lakes and data warehouses.
  • Lead the development and documentation of technical best practices for ELT/ETL activities.
  • Oversee program inception to build a new product, if needed.

Solution Design:
  • Lead the design of technical solutions, including code, scripts, data pipelines, and processes/procedures, for integrating data lake and data warehouse solutions into an operational IT environment.

Code Development:
  • Ensures data engineering activities are aligned with scope, schedule, priority, and business objectives.
  • Oversees code development, unit and performance testing activities.
  • Responsible for coding and leading the team to implement the solution.

Testing:
  • Leads validation efforts by verifying the data at intermediate stages between source and destination and by assisting others in validating that the solution performs as expected.
  • Meets or exceeds all operational readiness requirements (e.g., operations engineering, performance, and risk management).
  • Ensure compliance with applicable federal, state, and local laws and regulations.
  • Complete all required compliance training.
  • Maintain knowledge of and adhere to client's internal compliance policies and procedures.
  • Take responsibility for keeping up to date with changing regulations and policies.


Qualifications:

Education and Years of Experience:
  • High School Diploma, GED, or foreign equivalent required.
  • Bachelor's degree in Computer Science, Mathematics, or a related field plus 7 years of development experience preferred, or 10 years of comparable work experience required.
  • 10 years of experience designing, developing, testing, and implementing Extract, Transform, and Load (ELT/ETL) solutions using enterprise ELT/ETL tools, or 15 years of comparable work experience.
  • 10 years of experience developing and implementing data integration, data lake and data warehouse solutions in an on-premise and cloud environment.
  • 5 years of experience working with Business Intelligence tools (IBM Cognos preferred), Power BI, and Alteryx.
  • 7 years of experience working with APIs, data as a service, data marketplaces, and data mesh.
  • 10 years of experience with various Software Development Life Cycle methods such as Agile, SCRUM, Waterfall, etc.
  • 3 years of experience in a 100+ TB data environment.


Required and Desired Skills/Certifications:
  • Proven experience developing and maintaining data pipelines and ETL jobs using IBM DataStage, Informatica, Matillion, Fivetran, Talend, or dbt.
  • Knowledge of AWS cloud services such as S3, EMR, Lambda, Glue, SageMaker, Redshift, and Athena, and/or Snowflake.
  • Experienced in data modeling for self-service business intelligence, advanced analytics, and user applications.
  • Experience with data science, including AI/ML engineering, ML framework/pipeline builds, and predictive/prescriptive analytics on AWS SageMaker.
  • Experience with migrating, architecting, designing, building, and implementing cloud data lakes, data warehouses (cloud/on-prem), data mesh, data as a service, and cloud data marketplaces.
  • Ability to communicate complex technical concepts by adjusting messaging to the audience: business partners, IT peers, external stakeholders, etc.
  • Proven ability to design and build technical solutions using applicable technologies; ability to demonstrate exceptional data engineering skills.
  • Ability to prioritize work by dividing time, attention, and effort between the current project workload and ongoing day-to-day activities.
  • Demonstrates strength in adapting to changes in processes, procedures, and priorities.
  • Proven ability to establish a high level of trust and confidence in both the business and IT communities.
  • Strong teamwork and interpersonal skills at all management levels.
  • Proven ability to manage to a project budget.
  • Experience applying agile practices to solution delivery.
  • Must be team-oriented and have excellent oral and written communication skills.
  • Strong analytic and problem-solving skills.
  • Good organizational and time-management skills.
  • Experience in Strategic Thinking and Solutioning.
  • Must be a self-starter who can identify existing bottlenecks and come up with innovative solutions.
  • Demonstrated ability to work with key stakeholders outside the project to understand requirements/resolve issues.
  • Experience with data model design and writing complex SQL queries, with a good understanding of BI/DWH principles.
  • Expertise in Relational Database Management System, Data Mart and Data Warehouse design.
  • Expert-level SQL development skills in a multi-tier environment.
  • Expertise in flat file formats, XML within PL/SQL, and file format conversion.
  • Strong understanding of SDLC and Agile Methodologies.
  • Strong understanding of model driven development.
  • Strong understanding of ETL best practices.
  • Proven strength in interpreting customer business needs and translating them into application and operational requirements.
  • Strong problem-solving skills and analytic skills with proven strength in applying root cause analysis.

Created: 2024-05-10
Reference: 319502
Country: United States
State: Maryland
City: Pikesville
