Data Engineer

Work location: India
Work arrangement: Remote
Salary range: INR 2,500,000 - INR 3,000,000
Skills:
Data Extraction, Transformation, and Loading (ETL)
Apache Spark for Data Engineers
SQLite (Coding): Intermediate-Level Querying
Data Modeling Concepts
Snowflake

Job Summary:

We are seeking a skilled Data Engineer to lead the migration of our legacy database to a new system. This role involves designing the new database schema, developing ETL processes for the data migration, and building pipelines to ensure ongoing data flow between the legacy and new databases. The ideal candidate will have expertise in database management, data modeling, and cloud platforms, along with strong problem-solving skills and the ability to work both independently and as part of a team.

Key Responsibilities:

  • Analyze the legacy data model and design the schema for the new database.

  • Develop and execute data migration plans (ETL); see the illustrative sketch after this list.

  • Build and maintain data pipelines for real-time and batch data migration.

  • Ensure data integrity, security, and performance during the migration.

  • Collaborate with stakeholders and document processes.
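
To give a sense of the kind of batch migration work this role involves, below is a minimal sketch, assuming a legacy SQLite source and a hypothetical target table named customer; all table and column names are placeholders chosen for illustration, not our actual systems.

```python
# Illustrative batch ETL sketch: extract rows from a legacy SQLite database,
# reshape them to a hypothetical new schema, and load them into the target.
# Table and column names are placeholders only.
import sqlite3

def migrate_customers(legacy_path: str, target_path: str) -> int:
    """Copy customer rows from the legacy database into the new schema."""
    legacy = sqlite3.connect(legacy_path)
    target = sqlite3.connect(target_path)
    try:
        # Extract: read the legacy customer records.
        rows = legacy.execute(
            "SELECT id, full_name, email FROM customers"
        ).fetchall()

        # Transform: split the legacy full_name field into first/last name.
        transformed = []
        for cust_id, full_name, email in rows:
            first, _, last = (full_name or "").partition(" ")
            transformed.append((cust_id, first, last, email))

        # Load: insert into the new, normalized customer table.
        target.executemany(
            "INSERT OR REPLACE INTO customer (id, first_name, last_name, email) "
            "VALUES (?, ?, ?, ?)",
            transformed,
        )
        target.commit()
        return len(transformed)
    finally:
        legacy.close()
        target.close()
```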

Basic Skills & Tools:

  • Strong SQL skills and experience with both relational and NoSQL databases.

  • Proficiency in ETL tools (e.g., Apache NiFi, AWS Glue).

  • Programming in Python, Java, or Scala.

  • Experience with cloud data services (AWS, Google Cloud, Azure).

  • Familiarity with data pipeline frameworks (Apache Airflow, Kafka); see the sketch below.
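
As an example of the orchestration this role would own, here is a minimal Apache Airflow DAG sketch for a daily batch migration pipeline. The DAG id, task names, and callables are hypothetical and stand in for real pipeline logic.

```python
# Illustrative Airflow DAG: a daily extract -> transform -> load pipeline
# orchestrating the legacy-to-new migration. Names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Pull the day's changed rows from the legacy database."""
    ...

def transform():
    """Map legacy records onto the new schema."""
    ...

def load():
    """Write transformed records into the new database."""
    ...

with DAG(
    dag_id="legacy_to_new_migration",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Enforce ordering: each step runs only after the previous one succeeds.
    t_extract >> t_transform >> t_load
```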