Profile

Python backend engineer with experience building APIs, automation tools, and modular services. Strong focus on clean architecture, testing, and efficient data handling. Comfortable with GitHub workflows, CI/CD, Docker, and asynchronous collaboration.

Professional Experience
Oct 2024 – Present
Continental, Data Platform Engineer
  • Developed backend services in Python (FastAPI) for dataset management, analytics access, and automation.
  • Built internal tools and automation scripts in Python/Bash to streamline deployments and workflows.
  • Designed and maintained containerized services using Docker and AWS ECR.
  • Implemented CI/CD pipelines and observability using CloudWatch metrics and Grafana.
  • Collaborated via GitHub (PRs, code reviews, issue tracking).
Nov 2023 – Nov 2024, Porto, Portugal
Critical Techworks | BMW Group, Python/DataOps Engineer
  • Designed and maintained Python microservices using clean architecture and OOP principles.
  • Built event-driven pipelines and backend components (Lambda, SQS, Step Functions).
  • Created CI/CD pipelines using GitHub Actions (tests, linting, deployments).
  • Implemented robust testing, code quality standards, and modular code patterns.
Nov 2021 – Nov 2023, Porto, Portugal
Infineon Technologies, Data Analytics Developer
  • Developed internal Python tooling and reusable modules for analytics and automation.
  • Built reliable pipelines using Python, SQL, and dashboard integrations.
Mar 2021 – Nov 2021, Porto, Portugal
Hotel Black Tulip, Business Intelligence Analyst
  • Developed Python tooling to automate reporting and business analytics.
  • Built reporting pipelines and dashboard integrations using Python and SQL.
Projects

Real-Time Data Pipeline

    Real-time data pipeline with Kafka, Flink, Iceberg, Trino, MinIO, and Superset.

    Batch Data Pipeline

Medallion-architecture batch data pipeline with Airflow, DuckDB, Delta Lake, Trino, MinIO, and Metabase, including full observability and data quality checks.

    Event-Driven DMS

Async Python microservices with CDC, real-time WebSocket updates, and gRPC for high-performance document management.

    MarketPipe

Dockerized, configurable Airflow data pipeline for collecting and storing stock and cryptocurrency market data.

    DataFlow

ETL pipeline using Pulumi for infrastructure as code, integrating AWS services and Snowflake to automate data flows.

    FinStockDash

Python web application built with Streamlit for analyzing historical stock data.

Certificates

Open Source and Community

pandas, Contributor

Contributed improvements to pandas' data manipulation and reporting functionality, supporting efficient data pipelines and scalable analytics workflows.