
Saurabh Tiwari

About


Hyderabad, Telangana, India

Contact Saurabh regarding:
  • Full-time jobs, starting at INR 2.5M/year (~USD 26.1k/year)


Résumé


Jobs
  • Zoetis
    Data Engineer
    Jul 2025 - Current (11 months)
    • Built data quality checks, validation rules, and monitoring frameworks to ensure the accuracy and reliability of production datasets.
    • Optimized PySpark transformations, improving job performance and reducing compute cost.
    • Developed curated reporting tables and Power BI datasets using Azure Databricks and ADF to support dashboards and analytics use cases.
  • MAQ Software
    Data Engineer
    Sep 2023 - Jul 2025 (1 year 11 months)
    • Built and maintained scalable ETL pipelines using SQL, Spark, and Python across multiple data sources.
    • Optimized data pipelines for 5+ sources, reducing ETL execution time by 50% through Spark and query optimization techniques.
    • Developed automated data ingestion and transformation workflows using Azure Data Factory and Databricks.
    • Implemented CI/CD pipelines for automated deployment of data pipelines, improving release efficiency and reducing errors.
    • Migrated SAP BO reports and an SSIS cube to Power BI by building semantic data models and enabling scalable dashboarding solutions.
Education
  • Lovely Professional University
    Bachelor of Technology - Computer Science and Engineering
    Jun 2020 - May 2024 (4 years)
    Percentage: 80%
  • Microsoft Fabric Analytics Engineer Associate
    Nov 2024 - Current (1 year 7 months)
Projects (professional or personal)
  • E-commerce Data Pipeline Optimization
    Apr 2023 - Sep 2023 (6 months)
    • Developed an end-to-end data engineering pipeline to process e-commerce data, including product details, transactions, and customer data, using Azure Data Factory, Databricks, and SQL.
    • Designed and deployed a data pipeline that ingested and processed over 10 GB of e-commerce data daily, ensuring timely availability of data for analysis.
    • Fine-tuned data transformation jobs, reducing execution time by 40% and ensuring high data quality for downstream analytics.