
Swathi Kodoori

Big Data Hadoop Developer
Résumé


Jobs
  • ETL Hadoop/Big Data Developer
    VERIZON COMMUNICATIONS INC.
    Oct 2019 - Nov 2023 (4 years 2 months)
    • Involved in requirement gathering, project documentation, design documents, production deployment, and support activities in collaboration with the Yahoo team.
    • Worked on Google Cloud Platform (GCP) and Hadoop VGrid for Yahoo.
    • Implemented Spark Scala applications performing ETL with Spark Core and Spark SQL for both batch processing and streaming; configured Kafka with Spark Streaming to collect data from Kafka for interactive analysis.
    • Worked on a migration project moving the platform from NDL to VCG Yahoo, utilizing the VGrid environment.
    • Experienced in writing Spark applications in Scala for data validation, cleansing, transformations, and custom aggregations.
    • Involved in converting Hive/SQL queries into Sp…
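    As an illustration only (not Verizon's actual code), the Kafka-with-Spark-Streaming ETL pattern described above could be sketched in Scala roughly as follows. The broker address, topic name, and checkpoint path are hypothetical, and running it requires a Spark runtime with the Kafka connector on the classpath:

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    // Minimal sketch: read a stream from Kafka, apply a simple transformation,
    // and write windowed counts out — the shape of a Spark Structured Streaming ETL job.
    object KafkaEtlSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("kafka-etl-sketch").getOrCreate()
        import spark.implicits._

        val raw = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092") // hypothetical broker
          .option("subscribe", "events")                    // hypothetical topic
          .load()

        // Kafka delivers key/value as binary; keep the event timestamp for windowing
        val lines = raw.select($"timestamp", $"value".cast("string").as("line"))

        // Example aggregation: event counts per 5-minute window
        val counts = lines.groupBy(window($"timestamp", "5 minutes")).count()

        counts.writeStream
          .outputMode("complete")
          .format("console")
          .option("checkpointLocation", "/tmp/kafka-etl-ckpt") // hypothetical path
          .start()
          .awaitTermination()
      }
    }
    ```

    The same `readStream` source can feed both streaming aggregations and batch-style Spark SQL queries, which is what makes this shape reusable for batch and streaming ETL alike.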
  • ETL Hadoop/Big Data Developer
    METLIFE, INC.
    Nov 2017 - Oct 2019 (2 years)
    • Responsible for handling business requirements, technical requirements, and all necessary details from source teams, such as the nature of the data and the ingestion type, to enable smooth ingestion of source systems.
    • Took part in design reviews, sprint planning, and daily project scrums.
    • Involved in creating a framework for handling all types of data, both structured and unstructured (XML, CSV, FAC, JSON).
    • Used SQL and PL/SQL to develop procedures, functions, packages, and triggers for ETL data loads.
    • Designed, enhanced, and scheduled SSIS packages for transferring data from multiple data sources to SQL Server 2014.
    • Worked on a POC for Informatica connectors with Hortonworks big data Hadoop.
    • C…
  • ETL Hadoop/Big Data Developer
    CareFirst BlueCross BlueShield
    Jun 2016 - Mar 2017 (10 months)
    • Involved in all phases of the SDLC: requirement gathering, design, development, testing, production, user training, and support of the production environment.
    • Gained an understanding of the Hadoop architecture and its components, such as HDFS, Job Tracker, Task Tracker, Name Node, and Data Node, and its programming paradigm.
    • Involved in importing data from MySQL into HDFS using Sqoop.
    • Implemented workflows using the Apache Oozie framework to automate tasks.
    • Created Hive tables, loaded them with data, and wrote Hive queries that run on Spark.
    • Implemented Spark using Scala, utilizing DataFrames and the Spark SQL API for faster data processing.
    • Initiated the use of data warehouse ETL software during the conversion of data to Oracle DB.
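    A rough sketch of the Hive-tables-queried-through-Spark step described above (the table name and HDFS path are hypothetical; it assumes a Spark build with Hive support and data already landed in HDFS, e.g. by a prior Sqoop import):

    ```scala
    import org.apache.spark.sql.SparkSession

    // Minimal sketch: create a Hive table, load previously imported text data
    // into it, and query it through the Spark SQL / DataFrame API.
    object HiveOnSparkSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hive-etl-sketch")
          .enableHiveSupport() // requires Spark built with Hive support
          .getOrCreate()

        spark.sql(
          """CREATE TABLE IF NOT EXISTS customers (id INT, name STRING)
            |ROW FORMAT DELIMITED FIELDS TERMINATED BY ','""".stripMargin)

        // Hypothetical HDFS directory written by a prior Sqoop import
        spark.sql("LOAD DATA INPATH '/user/etl/customers' INTO TABLE customers")

        // Query the Hive table with the DataFrame API
        spark.table("customers").where("id IS NOT NULL").show()

        spark.stop()
      }
    }
    ```

    Registering the data as a Hive table this way lets the same tables be queried from both Hive and Spark, which is the usual reason for running Hive queries on Spark rather than MapReduce.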
  • ETL Informatica Developer
    Cielo Talent | BHI Energy
    Feb 2016 - Jun 2016 (5 months)
    • Worked with data architects, business analysts, and the independent testing team.
    • Responsible for analyzing, programming, and implementing modifications to existing systems required by changes in the business environment, using Informatica PowerCenter.
    • Converted business specification documents into technical specification documentation for developing source and target mappings.
    • Worked on data warehousing techniques for slowly changing dimensions, surrogate key assignment, normalization and de-normalization, cleansing, and performance optimization, along with change data capture (CDC).
    • Involved in creating Oracle tables, views, materialized views, and PL/SQL stored procedures and functions.
    • Wrote SQL queries and PL/SQL s…
  • ETL Informatica Developer
    Benjamin Moore & Co.
    Jul 2015 - Nov 2015 (5 months)
    • Formulated and defined best-practice programming standards that meet regulatory compliance requirements for implementation, support, and upgrade projects.
    • Worked closely with functional leads and business analysts to determine optimal programming solutions to functional requirements in the O2C, R2R, and P2P process areas.
    • Worked on all stages of developing, configuring, and fine-tuning ETL workflows; reviewed code, fixed bugs, and handled testing and deployment.
    • Uploaded the performance test plan, test scripts, scenarios, and final reports to Quality Center for every application.
    • Prepared traceability matrices to track requirements against test cases and ensure none were missed.
    • Involved in unit testing; interacted with the QA team f…
  • Informatica Developer
    Nationwide Insurance
    Oct 2014 - Feb 2015 (5 months)
    • Worked with business analysts and the DBA on requirements gathering, business analysis, and design of the data marts.
    • Prepared technical specification documents for developing Informatica extraction, transformation, and loading (ETL) mappings to load data into various data mart tables, and defined ETL standards using Informatica PowerCenter 9.5.1.
    • Developed processes to generate daily, weekly, and monthly data extracts; the data files were sent to downstream applications.
    • Created and scheduled sessions and jobs to run on demand, on a schedule, or only once.
    • Monitored workflows and sessions using Workflow Monitor.
    • Performed unit testing, integration testing, and system testing of Informatica…
Education
  • Master of Science
    The University of Texas at Arlington
    Aug 2012 - Dec 2013 (1 year 5 months)