Data Engineer

Work location: United States
Work arrangement: Remote
Salary: US$170,000 per annum
Skills:
Data Science
Fundamentals of Statistics and Probability
Data Wrangling
Exploratory Data Analysis
Data Storytelling

Position Overview: We are seeking a versatile and motivated Data Engineer to join our team. The ideal candidate will design, implement, and maintain robust data pipelines; build and support ETL (Extract, Transform, Load) processes; and manage cloud infrastructure. This role requires a hands-on approach and the ability to work collaboratively in a fast-paced environment.

This role is open exclusively to candidates based in the USA.

Key Responsibilities:

Data Pipeline Development:

  • Identify and implement technical solutions to support the data architecture.

  • Design, develop, and manage scalable data pipelines to process and analyze large volumes of data.

  • Optimize and troubleshoot data pipelines for performance and reliability.

  • Implement data integration solutions to consolidate data from multiple sources.

ETL Support:

  • Design and develop ETL processes to extract data from various sources, transform it to meet mission requirements, and load it into the appropriate systems/databases.

  • Maintain and enhance existing ETL workflows and scripts.

  • Ensure data quality and integrity throughout the ETL lifecycle.

  • Collaborate with data analysts and stakeholders to understand data requirements and deliver solutions.

Data Management and Governance:

  • Implement and manage data governance policies and procedures to ensure data accuracy and consistency.

  • Develop and maintain data documentation, including data dictionaries and derived metadata.

  • Ensure compliance with data security and privacy regulations.

  • Implement data monitoring and alerting solutions to proactively identify and address data quality issues.

Database Management:

  • Design and manage databases and data lakes to support mission needs.

  • Optimize database performance and storage.

  • Implement database backup, recovery, and archiving procedures.

  • Ensure data security and access controls.

Cloud Engineering:

  • Design, implement, and manage cloud infrastructure using platforms such as AWS and Databricks.

  • Develop and maintain cloud-based solutions to support the organization’s data engineering needs.

  • Ensure the scalability, reliability, and security of cloud environments.

  • Implement cloud automation and orchestration using tools like Terraform, Ansible, or CloudFormation.

  • Monitor and optimize cloud resource utilization and performance.

  • Implement and maintain continuous integration/continuous deployment (CI/CD) pipelines.

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field. Demonstrable experience is acceptable in lieu of a formal degree.

  • Minimum of 5 years of experience in data engineering, cloud engineering, ETL development, and/or system engineering.

  • Experience with programming languages such as Python, Java, or Scala.

  • Proficiency in SQL and experience with database management systems such as MySQL, PostgreSQL, or SQL Server.

  • Experience with ETL tools and processes.

  • Knowledge of data warehousing concepts and technologies.

  • Familiarity with cloud platforms such as AWS or Databricks.

Preferred Skills:

  • Experience with big data technologies.

  • Experience with data modeling and schema design.

  • Certification in cloud platforms (AWS Certified Data Analytics, Microsoft Certified: Azure Data Engineer, etc.) is a plus.

Healthcare, retirement, and PTO are just a few of the benefits ANDECO has to offer.

ANDECO Institute

https://andeco.org/
This application includes an assessment as the first step.