Data Engineer
Build modern data pipelines, lakehouse architectures, and real-time streaming infrastructure for enterprise clients.
Data · Remote (US) · Full-time
About the Role
We're looking for a Data Engineer to help our clients build the data infrastructure that everything else depends on. You'll design and implement pipelines, warehouses, and governance frameworks that turn fragmented enterprise data into a trustworthy, accessible foundation.
What You'll Do
- Design and build modern data pipelines using batch and real-time streaming patterns
- Implement lakehouse architectures on cloud platforms (Snowflake, Databricks, BigQuery)
- Develop data quality frameworks, lineage tracking, and governance automation
- Migrate legacy ETL processes to modern, cloud-native architectures
- Collaborate with data scientists and analysts to ensure infrastructure meets analytical needs
What We're Looking For
- 4+ years of data engineering experience
- Proficiency in SQL and Python; experience with Spark, dbt, or Airflow
- Hands-on experience with cloud data platforms (Snowflake, Databricks, BigQuery, or Redshift)
- Understanding of data modeling, warehousing patterns, and streaming architectures (Kafka, Kinesis)
- Experience with infrastructure as code and CI/CD for data pipelines
- Strong problem-solving skills and attention to data quality
Nice to Have
- Experience with data governance tools (Collibra, Alation, or similar)
- Background in building data platforms for regulated industries
- Familiarity with ML feature stores and serving infrastructure
