Senior Data Engineer at Ruzora | Torre

Senior Data Engineer

You'll architect modern data infrastructure that powers AI-native startups and drives business intelligence.
Full-time

Legal agreement: Contractor

Currency exchange and taxes to be paid by: Candidate

Compensation
USD 4k–6k/month
Negotiable
Remote (for Mexico residents)
Remote (for Argentina residents)
Remote (for Costa Rica residents)
Remote (for Colombia residents)
Posted 14 days ago

Requirements and responsibilities


About the role:
We are looking for a Senior Data Engineer to join our partner companies building modern data infrastructure for AI-native U.S. startups. You will design and build the data pipelines, warehouses, and analytics layer that power business intelligence and machine learning workflows. Work for innovative companies that treat data as a first-class product surface.

Responsibilities:
- Design and build scalable ETL/ELT pipelines using modern tools (Airflow, dbt, Dagster).
- Architect data warehouses on Snowflake, BigQuery, or Redshift.
- Build data models for analytics, ML features, and product dashboards.
- Implement data quality, observability, and lineage tooling.
- Optimize query performance and warehouse costs.
- Collaborate with analytics and ML teams on data contracts and schema design.

Requirements:
- 5+ years of professional data engineering experience.
- Strong Python and advanced SQL skills (query optimization, window functions, CTEs).
- 3+ years with modern data warehouses (Snowflake, BigQuery, or Redshift).
- Hands-on experience with dbt and orchestration tools (Airflow, Dagster, or Prefect).
- Familiarity with both streaming (Kafka, Kinesis) and batch processing patterns.
- Excellent English communication skills (B2+).

Nice to have:
- Experience with Spark, Databricks, or similar big-data frameworks.
- Knowledge of feature stores (Feast, Tecton) for ML pipelines.
- Background with data observability tools (Monte Carlo, Great Expectations).
- Experience designing event tracking schemas and product analytics pipelines.

Tech stack:
- Python
- SQL
- dbt
- Airflow
- Snowflake
- BigQuery
- AWS

Benefits:
- Competitive USD salary.
- 100% remote work.
- Flexible working hours.
- Professional development budget.
- Health insurance stipend.
- Equipment allowance.

Location:
- Remote (LATAM): open to candidates across Latin America.
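The role asks for strong Python plus advanced SQL, specifically CTEs and window functions. As a minimal, hypothetical sketch of what that combination looks like in practice (the table and column names are invented for illustration, not taken from this posting), the snippet below runs a CTE-plus-window-function query against an in-memory SQLite database:

```python
import sqlite3

# Illustrative data only; schema and values are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer_id INTEGER, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, '2024-01-05', 100.0),
  (1, '2024-02-10',  50.0),
  (2, '2024-01-20', 200.0);
""")

# The CTE computes per-customer totals; the window function
# RANK() then orders customers by total spend.
query = """
WITH customer_totals AS (
    SELECT customer_id, SUM(amount) AS total
    FROM orders
    GROUP BY customer_id
)
SELECT customer_id,
       total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank
FROM customer_totals
ORDER BY spend_rank;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [(2, 200.0, 1), (1, 150.0, 2)]
```

The same pattern scales directly to warehouse dialects such as Snowflake or BigQuery, where CTEs and window functions are the standard building blocks of dbt models.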