Data Engineer at Tambourine | Torre

Data Engineer

You will revolutionize e-commerce by building scalable data solutions and empowering data-driven decisions.
Full-time
Provide your expected compensation while applying

+ Health insurance

Edificio Box XI, Calle 96, Bogotá, Colombia
Posted 3 months ago

Requirements and responsibilities


Tambourine is one of the fastest-growing hospitality and tourism marketing firms. Combining best-in-class technology with creative design, we revolutionize e-commerce for hotels, resorts, and destinations. We are looking for a Bilingual Data Engineer to join our Analytics team in Bogotá.

The Data Engineer II is a key member of the analytics team, responsible for designing, building, and owning robust, scalable data solutions on Google Cloud Platform. This role involves developing complex data pipelines, implementing efficient data models, and managing infrastructure as code. The ideal candidate will take full ownership of projects, solve ambiguous problems independently, and collaborate closely with stakeholders to turn data into a critical asset for the organization. This is an on-site position at our Bogotá office.

Requirements:
* Bachelor's degree.
* 3+ years of professional experience as a Data Engineer.
* Full English proficiency.
* Proven expertise in Google BigQuery, including performance tuning, cost optimization, and writing complex queries.
* Hands-on experience building and deploying data pipelines using GCP services (Cloud Functions, Cloud Run, Dataflow) and orchestration with Apache Airflow / Cloud Composer.
* Strong experience in data modeling for analytical use cases.
* Demonstrated ability to productionize the ingestion and transformation of GA4 data.
* Familiarity with Infrastructure as Code (IaC) tools, preferably Terraform.
* Understanding of how data architecture impacts BI tools such as Looker.
* Ownership of and accountability for the reliability and quality of data pipelines.
* Independent problem-solving skills and the ability to tackle ambiguous challenges.
* Experience in stakeholder management and translating business needs into technical solutions.
* A collaborative approach and an interest in mentoring junior team members.
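To give candidates a concrete sense of the GA4 requirement above: GA4's BigQuery export stores event parameters as a repeated key/value record, which transformation pipelines typically flatten into one column per parameter. The following is a minimal, illustrative Python sketch of that step (the record shape follows the public GA4 export schema; the helper name `flatten_event` is invented for this example and is not from any Tambourine codebase):

```python
def _param_value(value: dict):
    """Return whichever typed field of a GA4 event_params value is populated."""
    for key in ("string_value", "int_value", "float_value", "double_value"):
        if value.get(key) is not None:
            return value[key]
    return None


def flatten_event(event: dict) -> dict:
    """Flatten one GA4 export row: promote event_params entries to top-level columns."""
    row = {
        "event_date": event.get("event_date"),
        "event_name": event.get("event_name"),
        "user_pseudo_id": event.get("user_pseudo_id"),
    }
    for param in event.get("event_params", []):
        row[param["key"]] = _param_value(param["value"])
    return row


# Example event shaped like a row of the GA4 BigQuery export
raw = {
    "event_date": "20240501",
    "event_name": "page_view",
    "user_pseudo_id": "123.456",
    "event_params": [
        {"key": "page_location", "value": {"string_value": "https://example.com"}},
        {"key": "engagement_time_msec", "value": {"int_value": 1200}},
    ],
}
flat = flatten_event(raw)
```

A production pipeline would run this kind of transformation inside Dataflow or a scheduled BigQuery job rather than in plain Python, but the flattening logic is the same.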
Responsibilities:

Advanced Data Pipeline Architecture & Development:
* Design, build, and deploy robust, scalable data pipelines using GCP services such as Cloud Functions, Cloud Run, and Dataflow.
* Orchestrate complex workflows, manage dependencies, and ensure data SLAs using Apache Airflow (Cloud Composer).
* Write, debug, and optimize complex SQL queries in BigQuery, implementing features such as partitioning, clustering, and materialized views.
* Provision and manage GCP infrastructure programmatically using IaC principles with tools such as Terraform.

Data Modeling & Analytics Enablement:
* Design and implement efficient, scalable data models (e.g., star schemas, dimensional models) within BigQuery to support analytical workloads and BI performance.
* Productionize end-to-end ingestion and transformation of GA4 data streams, handling event-based data structures to create reliable datasets.
* Collaborate with data analysts to understand the impact of BigQuery data models on LookML development and dashboard performance in Looker.
* Translate business requirements from stakeholders into technical specifications for data pipelines and models.

Team Contribution & Ownership:
* Take full ownership of data projects and pipelines, from design through deployment, monitoring, and maintenance.
* Perform root-cause analysis on complex data issues and implement durable solutions with minimal supervision.
* Mentor junior engineers through code reviews, technical guidance, and best practices.
* Manage priorities and timelines for multiple concurrent projects.

Nice to Have:
* Google Cloud Professional Data Engineer certification.
* Experience with streaming data technologies (e.g., Pub/Sub, Dataflow).
* Proficiency in Python testing frameworks (e.g., Pytest) and package management.
* Hands-on experience developing LookML models in Looker.
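The dimensional-modeling responsibility above — splitting denormalized records into dimension and fact tables joined on surrogate keys — can be sketched in plain Python. All table and field names here (hotels, bookings, revenue) are invented for illustration and are not taken from the posting:

```python
def build_star_schema(raw_rows: list[dict]) -> tuple[dict, list[dict]]:
    """Split denormalized booking rows into a hotel dimension and a bookings fact table.

    Returns (dim_hotel keyed by surrogate key, fact_bookings referencing those keys).
    """
    dim_hotel: dict[int, dict] = {}
    hotel_keys: dict[str, int] = {}  # natural key (hotel name) -> surrogate key
    fact_bookings: list[dict] = []

    for row in raw_rows:
        name = row["hotel_name"]
        if name not in hotel_keys:
            sk = len(hotel_keys) + 1  # simple incrementing surrogate key
            hotel_keys[name] = sk
            dim_hotel[sk] = {"hotel_sk": sk, "hotel_name": name, "city": row["city"]}
        fact_bookings.append({
            "hotel_sk": hotel_keys[name],
            "booking_date": row["booking_date"],
            "revenue": row["revenue"],
        })
    return dim_hotel, fact_bookings


raw = [
    {"hotel_name": "Seaside Resort", "city": "Miami", "booking_date": "2024-05-01", "revenue": 420.0},
    {"hotel_name": "Seaside Resort", "city": "Miami", "booking_date": "2024-05-02", "revenue": 310.0},
    {"hotel_name": "Andes Lodge", "city": "Bogotá", "booking_date": "2024-05-01", "revenue": 150.0},
]
dims, facts = build_star_schema(raw)
```

In BigQuery this same split is expressed in SQL, with the fact table typically partitioned by booking date and clustered on the dimension key to control scan costs.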