
Data Engineer

Suvoda
Department: Data Engineering
Type: Remote
Region: EU
Location: Iaşi, Romania
Experience: Mid-Senior level
Estimated Salary: €60,000 - €90,000
Skills: AWS, S3, Glue, Lake Formation, Athena, Redshift, PySpark, Python, SQL, ETL, ELT, GraphQL, DMS, Aurora, PostgreSQL, Data Mesh, Data Lake, CI/CD, Airflow, Terraform, CloudFormation, Kafka, Kinesis

Job Description

Posted on: October 15, 2025

Data Engineer (Remote - Romania)

Department: Product Development

Reports to: Manager, Data Engineering

Suvoda is seeking a skilled and driven Data Engineer to help evolve our data platform toward a data mesh architecture. In this role, you’ll design and build domain-oriented data products and support near real-time reporting. You’ll build and optimize ETL/ELT pipelines using AWS Glue and PySpark, ensuring scalable, high-performance data processing across our platform.

Responsibilities:

  • Contribute to the design and implementation of a data mesh architecture using GraphQL APIs to expose domain-owned data products.
  • Build and maintain a modern AWS-based data lake using S3, Glue, Lake Formation, Athena, and Redshift.
  • Develop and optimize ETL/ELT pipelines using AWS Glue and PySpark to support batch and streaming data workloads.
  • Implement AWS DMS pipelines to replicate data into Aurora PostgreSQL for near real-time analytics and reporting.
  • Support data governance, quality, observability, and API design best practices.
  • Collaborate with product, engineering, and analytics teams to deliver robust, reusable data solutions.
  • Contribute to automation and CI/CD practices for data infrastructure and pipelines.
  • Stay current with emerging technologies and industry trends to help evolve the platform.

Requirements:

  • Bachelor’s degree in a technical field such as Computer Science or Mathematics.
  • At least 4 years of experience in data engineering, with demonstrated ownership of complex data systems.
  • Solid experience with AWS data lake technologies (S3, Glue, Lake Formation, Athena, Redshift).
  • Understanding of data mesh principles and decentralized data architecture.
  • Proficiency in Python and SQL.
  • Experience with data modeling, orchestration tools (e.g., Airflow), and CI/CD pipelines.
  • Strong communication and collaboration skills.

Preferred Qualifications:

  • Master’s degree, especially with a focus on data engineering, distributed systems, or cloud architecture.
  • Hands-on experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
  • Expertise in AWS Glue and PySpark for scalable ETL/ELT development.
  • Experience with event-driven architectures (e.g., Kafka, Kinesis).
  • Familiarity with data cataloging and metadata management tools.
  • Knowledge of data privacy and compliance standards (e.g., GDPR, HIPAA).
  • Background in agile development and DevOps practices.
Originally posted on LinkedIn

Apply now

