/resume

Resume of my professional experience

MIT License

Daniele Esposti

Contacts

E-mail address: daniele.esposti@gmail.com

Twitter: @expobrain

Github: https://github.com/expobrain

Blog: https://medium.com/@daniele.esposti

Introduction

I'm passionate about technology that supports the growth of the business: making existing systems stable and scalable, and exploring new technologies. I never stop learning, I enjoy working as part of a team, and I invest time in helping people grow.

Experience

Plum

Core Platform, Principal Backend Engineer - from 02/2022 until now

As a Principal Engineer:

  • Revolutionising code quality and developer experience through tooling, patterns, and processes
  • Facilitating collaboration between Backend, Mobile, and Data teams to streamline workflows and maximise productivity
  • Communicating technical milestones and progress to management to ensure seamless integration with business objectives
  • Transformed infrastructure management for Google Workspace setup and Data Team resources using Terraform, resulting in improved reproducibility, security, and cost-effectiveness
  • Reduced the cost of running the Data Team's data pipeline by 15% and cut its weekly errors by migrating to dbt and introducing a new testing framework
  • Slashed backend CI cost by 66% without sacrificing performance or service level
  • Implemented standard signing tools to replace YubiKeys, saving hardware costs and reducing setup time for developers
  • Boosted developer productivity and quality by decreasing the bootstrap time of backend tests by 120x
  • Provided a scalable backend platform to support mobile end-to-end testing, enhancing product quality and robustness
  • Eliminated silos between backend engineering and data engineering to encourage collaboration and simplify data pipeline delivery for data-driven features
  • Drove the design and implementation of a new money movement feed service, which slashed response times from up to 30s to a constant 250ms, with improved cycle time, higher deployment rate, and lower development friction
  • Deployed an internal package repository, ported common internal libraries to reusable packages, and released boilerplates for fast bootstrapping of services, libraries, and tools
  • Automated the documentation of events published to the message broker for easy discovery by developers, plus automated generation of parsers in downstream services, reducing the time needed to consume events and drastically improving the developer experience through code generation
  • Reduced the cost of event ingestion from Backend to the Data team by 10x while improving the performance and observability of the pipeline

Technologies: Python, Terraform, Kubernetes, GCP, Processes, System Architecture, Mentoring

Bulb

Smart Energy Experience, Staff Engineer - from 07/2020 to 01/2022

Contributed to the development of an intelligent and cost-efficient EV charging system aimed at optimising electricity usage:

  • Oversaw technical direction and facilitated integration with internal and external systems at Bulb.
  • Built a scalable, event-driven microservices architecture with asynchronous code.
  • Provided guidance and mentorship to team members to support their professional development.
  • Collaborated closely with Product Managers to execute the implementation of the smart energy product.

Accomplishments:

  • Initiated and built the team and project for the Smart Energy Platform from scratch
  • Instituted documentation of systems and tools to automate diagnostic tasks, streamlining processes
  • Pioneered the utilisation of modern asynchronous programming in Python to enhance efficiency and productivity.

Technologies: Python, Typescript, Kubernetes, GCP

Revolut

Data Infrastructure, Lead Engineer - from 10/2019 to 07/2020

As the Team Leader and Product Owner of the Data Infrastructure team, I spearheaded various initiatives to enhance the team's capabilities and improve its overall efficiency.

  • Improving documentation of the services and tools provided by the team, leading to increased clarity and understanding of our infrastructure among stakeholders.
  • Automating manual processes to streamline workflows, resulting in significant time and cost savings for the organisation.
  • Enhancing security and audit capabilities on data lakes access, ensuring compliance with regulatory requirements and reducing the risk of data breaches.
  • Providing infrastructure support to over 170 Engineers and Data Scientists, enabling them to perform their roles with greater efficiency and effectiveness.

Accomplishments:

  • Successfully migrating projects to use Hashicorp Vault for secrets management, improving security and access control for sensitive data.
  • Promoting the creation of a tool to automate internal tasks, further increasing team productivity and freeing up time for higher-value activities.

Technologies: Python, Terraform, GCP

Reporting Team, Lead Engineer - from 02/2019 to 09/2019

Team leader and Product Owner of the Reporting Team:

  • Building a prototype of a platform to automate the generation and delivery of reports at scale
  • Managing the weekly sprints and the roadmap
  • Building the team from the ground up
  • Reporting to the CFO and the founder

Achievements:

  • Reduced the cost and complexity of delivering reports to external financial entities
  • Designed a scalable system usable by non-developers
  • Used the right technologies to build smarter and more effective tools

Technologies: Python, Java, Docker, Airflow

Badoo

Data Platform, Lead Engineer - from 11/2017 to 01/2019

Team leader of the Hotpanel Data Platform Team:

  • Communicating with other teams to help them use our analytics and tools
  • Collecting requirements for improvements or new tools
  • Ensuring that the data pipeline, processing 200-250Gb/hour, was always up and running
  • Caring about the team members by giving them ownership of projects and space for experiments

Achievements:

  • Reduced pipeline failures by 90%, from a couple of major failures per week to only one in the last six months
  • Increased processing capability from up to 80Gb/h to an average of 200Gb/h, with peaks of 250Gb/h
  • Enabled the team to meet its quarterly objectives
  • Increased the stability of the code base
  • Introduced C to build analytical tools that need high performance on huge amounts of data
  • Enabled monitoring of all the components in the pipeline, with SMS and call alerts for the critical ones

Data Platform, Senior Engineer - from 08/2015 to 11/2017

As a Senior Software Engineer in the Hotpanel Data Platform Team, my responsibilities were broad:

  • Ensure the deployability of our code and tools
  • Maintain the code of the data pipeline
  • Build new tools in JavaScript and Python for the backend and frontend

Achievements:

  • Unit tests for the whole codebase with Jest and component snapshots
  • Moved services to Docker for better deployability and management
  • Introduced Flow, a type system for JavaScript

Technologies: JavaScript ES.next, Python, Java, Shell scripting, Docker, React, Flow

Drugdev

Senior Software Engineer - from 06/2015 to 08/2015

Briefly worked in the Data Solution team as a Senior Developer, maintaining the API of their Shared Investigator Platform.

Technologies: Python

Plentific

Senior Software Engineer and Team Lead - from 02/2014 to 05/2015

Started as a Senior Software Engineer and was then promoted to Team Lead:

  • Developed and released the features required for the platform
  • Organised weekly work for the team members
  • Delivered the first major feature, FindAPro, on time and with better performance than required
  • Used CI with Jenkins to ensure code quality and to release as soon as possible

Technologies: JavaScript, Python, AWS, Jenkins, Angular.js, Django

Base79

Senior Software Engineer - from 09/2013 to 01/2014

As a full-stack developer I was responsible for maintaining and extending the Partners portal.

Technologies: JavaScript, Python, Django, Backbone.js

Marine Software

Software Engineer - from 06/2010 to 08/2013

My task as a Software Engineer was to port the company's application from FoxPro to Python:

  • Rewrote the skeleton of the application with an industry-standard authentication and permissions layer
  • Ported the main feature of the application
  • Unit tested and documented all the code as standard practice
  • Implemented an automatic data audit system
  • Designed and developed a system to work with an offline, geographically distributed database with sync

Technologies: Python, C/C++, Qt

Freelance - from 06/2000 to 05/2010

I spent the first 10 years of my career as a freelancer in Italy:

  • Developed the driver for gesture-driven games
  • Developed the Alfa Romeo Giulietta configurator website
  • Integrated the RFID-enabled mirror for JCPenney US, from the catalog interaction to the payment system
  • Ported a CAD system for 1D simulation from Visual Basic to Python
  • Implemented custom office automation solutions for small/medium businesses
  • IT management for small/medium businesses

Technologies: Python, ActionScript3, wxPython, MySQL, RFID