SARA CHOUDHURY
About me

Aspiring data engineer with a passion for solving complex problems. Proficient in foundational skills such as SQL and Python, and eager to learn and apply new technologies. Committed to contributing to innovative data solutions.

Technical Skills
Languages:

Python

Databases:

MySQL, PostgreSQL, MongoDB

Frameworks/Libraries:

PySpark, Pandas

AWS:

Glue (ETL), Athena, EC2, Lambda, SNS, Kinesis, IAM, S3, QuickSight

Others:

Git, GitHub, Postman, KNIME

Soft Skills

Active Learning & Continuous Improvement

Adaptability & Flexibility

Communication & Presentation Skills

Achievement

5-star rating in Java on HackerRank

Certificates
Academic Background

B.Tech in Computer Science

AKTU
2021 – 2024 | Ghaziabad, India

CGPA: 8.5 (current)

Diploma in Mechanical Engineering

BTE Lucknow
2018 – 2021 | Ghaziabad, India

Percentage: 78%

High School

CBSE
2018 | Delhi, India

Percentage: 70%

Industrial Training

Rapipay Fintech Private Limited

At Rapipay, I used AWS services such as Athena, QuickSight, and Glue (including Glue crawlers) alongside SQL and PySpark to build robust data pipelines. By optimizing real-time processing, storing data efficiently in S3, and using crawlers for data discovery, I enabled informed decision-making and improved operational efficiency.

Vigility Technology

As a Software Engineering intern, I improved my coding, debugging, and problem-solving skills while working on various projects. I quickly grasped new concepts and gained hands-on experience with AWS services such as IAM, Lambda, EC2, and S3.

Lepto Software (via Internshala)

As a Data Engineering intern, I used Athena to clean and verify data, and promptly reported errors identified through queries via Jira. This ensured accurate data and streamlined communication for swift issue resolution.

AWS Cloud Foundation

Proficient in data processing and storage; eager to apply these skills to building scalable, efficient data pipelines.

Projects

ETL Pipeline

  • Built an ETL pipeline with Python and SQL: extracted data, transformed it, and loaded it into a database. Used AWS S3 for storage and Glue for data cataloging, with the goal of mastering ETL processes, refining data-transformation skills, and automating workflows.
Automated Data Pipeline with QuickSight Integration

  • Implemented AWS Data Pipeline to automate data transfer, integrating QuickSight for visualization and analysis. This streamlined workflows and ensured data integrity through error monitoring and automatic retries.
Racing Game Using Python

  • Engineered a dynamic car-racing game in Python, showcasing proficiency in game development and algorithmic design. Implemented interactive features and optimized performance for an engaging user experience.
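The ETL pipeline project above could be sketched as follows. This is an illustrative example only, not the project's actual code: it reads from a local CSV string and loads into SQLite, whereas the real pipeline used S3 for storage and Glue for cataloging; the table name and columns are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical raw input; in the real pipeline this would be fetched from S3.
RAW_CSV = """order_id,amount,currency
1,19.99,usd
2,5.00,USD
3,12.50,usd
"""

def extract(text):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalize types and currency casing."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["currency"].upper())
        for r in rows
    ]

def load(rows, conn):
    """Load: create the target table and insert the transformed rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))
```

The same extract/transform/load split carries over directly when the source becomes S3 objects and the sink a managed database.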