Senior Data Engineer / Solutions Architect at Level60 Consulting | Torre

You’ll lead data integration projects and shape engineering standards with cutting-edge cloud tools.
Full-time

Legal agreement: Employment

Compensation
USD 2,000–3,000/month
Negotiable

Engagement length: Open ended

Remote (for Colombia residents)
Visa sponsorship: No
Posted 9 months ago

Requirements and responsibilities


Senior Data Engineer - Data Integration & ETL Development

Position Overview:
We are seeking an experienced Senior Data Engineer specializing in data integration and ETL/ELT development to join our team. This role focuses on building robust data pipelines, managing complex data transformations, and implementing scalable data solutions across multiple platforms and technologies.

Key Responsibilities:

Data Integration & ETL Development:
• Design, develop, and maintain ETL/ELT processes using modern tools including Azure Data Factory, Apache Airflow, dbt, SSIS, and Databricks.
• Build and optimize data transformation workflows to ensure efficient processing and high data quality.
• Implement data integration solutions across varied source systems and target platforms.
• Develop automated data pipelines that support business intelligence and analytics initiatives.

Database Management & Development:
• Work with multiple database platforms including SQL Server, Oracle, PostgreSQL, MySQL, SQLite, and MongoDB.
• Design and implement stored procedures, functions, and complex queries for data processing.
• Perform database optimization and performance tuning.
• Manage Snowflake environments, including querying, data masking, and Cortex functionality.

Team Leadership & Project Management:
• Lead data integration development teams and mentor junior developers.
• Manage end-to-end data integration projects from requirements gathering to deployment.
• Define and implement company-wide standards for ETL development and data management.
• Create comprehensive technical and functional requirements documentation.

Data Governance & Quality:
• Implement Master Data Management (MDM) solutions.
• Establish data quality standards and validation processes.
• Ensure data security and compliance with organizational policies.
• Design and implement data masking and privacy protection measures.
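To illustrate the kind of pipeline work described above, here is a minimal extract-transform-load sketch using only the Python standard library (csv and sqlite3), with a basic data-quality rule applied during the transform step. It is illustrative only: production pipelines in this role would run on Azure Data Factory, Airflow, or dbt, and the table and column names below are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; a real pipeline would read from a source system.
RAW = """order_id,amount,currency
1001,250.00,USD
1002,,USD
1003,99.50,usd
"""

def extract(text):
    """Extract: parse CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: enforce simple data-quality rules and standardize values."""
    clean = []
    for row in rows:
        if not row["amount"]:  # data-quality rule: drop rows with null amounts
            continue
        clean.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"].upper(),  # standardize casing
        })
    return clean

def load(rows, conn):
    """Load: write clean rows into the target table."""
    conn.execute(
        "CREATE TABLE orders (order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (:order_id, :amount, :currency)", rows
    )

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
# → (2, 349.5)
```

The null-amount row is filtered out and the lowercase currency code is normalized, so only validated records reach the target table.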
Reporting & Analytics:
• Build reports and dashboards using Excel, Power BI, Hyperion, and Cognos.
• Support business intelligence initiatives with reliable data delivery.
• Collaborate with stakeholders to understand reporting requirements.

Required Technical Skills:

Programming & Scripting:
• Python: advanced proficiency with pandas, NumPy, LangChain, Selenium, and BeautifulSoup.
• SQL: expert-level SQL development and optimization.
• Jinja: template engine for dynamic SQL generation.
• Shell scripting for automation tasks.

Data Platforms & Tools:
• Cloud platforms: Azure (Data Factory, DevOps), Databricks, Snowflake.
• ETL tools: SSIS, Apache Airflow, dbt (data build tool).
• Databases: SQL Server, Oracle, PostgreSQL, MySQL, MongoDB.
• Data formats: JSON, CSV, XML, YAML, Parquet, Excel/TXT files.

DevOps & Version Control:
• Git and GitHub for source code management.
• Azure DevOps/TFS for project management and CI/CD.
• Master Data Services (MDS).
• Linux and Windows operating systems.

Required Experience:
• 5+ years of experience in data engineering and ETL development.
• 3+ years of hands-on experience with cloud-based data platforms (Azure, Snowflake, Databricks).
• 2+ years of team leadership experience in data integration projects.
• Proven track record of managing complex data integration initiatives.
• Experience with Master Data Management implementation.

Preferred Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• Experience in financial services, insurance, or similar regulated industries.
• Certification in Azure Data Engineering or Snowflake.
• Experience with modern data architecture patterns (data mesh, lakehouse, etc.).
• Knowledge of data governance frameworks and best practices.

What We Offer:
• Opportunity to work with cutting-edge data technologies.
• Leadership role in defining data engineering standards.
• Remote work flexibility.
• Professional development opportunities.
• Collaborative team environment focused on innovation.

Location:
Remote work is available, with a preference for candidates in Americas time zones for team collaboration.
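The Jinja skill listed under Programming & Scripting refers to templated SQL generation, the pattern popularized by dbt. The sketch below shows the idea using the jinja2 library (assumed installed); the schema, table, and column names are hypothetical.

```python
from jinja2 import Template  # third-party: pip install jinja2

# Illustrative dynamic SQL generation: one template, many concrete queries.
# All identifiers (analytics, orders, load_date) are hypothetical examples.
TEMPLATE = Template(
    "SELECT {{ columns | join(', ') }}\n"
    "FROM {{ schema }}.{{ table }}\n"
    "{% if load_date %}WHERE load_date = '{{ load_date }}'{% endif %}"
)

sql = TEMPLATE.render(
    columns=["order_id", "amount"],
    schema="analytics",
    table="orders",
    load_date="2024-06-01",
)
print(sql)
```

Rendering the same template with different parameters (or omitting `load_date` to drop the filter) yields different concrete queries, which is what makes templated SQL useful for parameterized, reusable ETL jobs.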