wlopezm-unal
As a developer, I have skills in Python, SQL, FastAPI, Airflow, LangChain, and Apache Spark. My background includes implementing ETL processes and machine learning.
Colombia
Pinned Repositories
chatbot_langchain
Chatbot using Langchain and Gemini-pro
Chatbot_Langchain_MultiQueryRetriever
Chatbot that lets you interact with your PDF files. When you upload a document, it generates text splits and a summary, which are stored in separate collections within the Qdrant vector database. When you ask a question, two queries are created to search for relevant information in the summary and the splits, optimizing token usage in the LLM.
CNN_tensorFlow
Notebook on using the VGG19 Convolutional Neural Network (CNN) model for image classification of natural scenes. The framework is TensorFlow.
COBRO_BANCARIO
Backend built with FastAPI using a CRUD approach. The project receives data from users, checks bank loan statuses, and tracks the amounts of money requested and approved. It also has a notification option that sends an email to the user when the loan cancellation date has passed.
Machine-learning
Machine learning models. A collection of notebooks covering machine learning models, data exploration, and data cleaning.
mlbookcamp-code
The code from the Machine Learning Bookcamp book and a free course based on the book
portafolio.github.io
portafolio_data_analytics.github.io
reddit_project_airflow_aws
This project focuses on implementing an ETL pipeline using Apache Airflow to efficiently extract data from Reddit, transform it as needed, and load it into an AWS S3 bucket. The use of Airflow allows for robust orchestration of the data workflow, ensuring that each step of the ETL process is executed in a reliable and repeatable manner.
Titanic_ship-streamlit
Machine learning model that uses Titanic passenger data to predict whether a passenger survived or died. It applies Random Forest, GaussianNB, and Logistic Regression. In addition, Streamlit is used together with FastAPI to display the prediction results.
wlopezm-unal's Repositories
wlopezm-unal/chatbot_langchain
Chatbot using Langchain and Gemini-pro
wlopezm-unal/Chatbot_Langchain_MultiQueryRetriever
Chatbot that lets you interact with your PDF files. When you upload a document, it generates text splits and a summary, which are stored in separate collections within the Qdrant vector database. When you ask a question, two queries are created to search for relevant information in the summary and the splits, optimizing token usage in the LLM.
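A minimal sketch of the two-collection idea described above, assuming the langchain_community Qdrant integration and Gemini through langchain_google_genai (import paths vary across LangChain versions); the in-memory storage, collection names, and summarization prompt are illustrative, not taken from the repository.

```python
# Sketch: store PDF splits and a summary in separate Qdrant collections,
# then query both at question time to keep the LLM context small.
from langchain_community.document_loaders import PyPDFLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_community.vectorstores import Qdrant
from langchain_google_genai import ChatGoogleGenerativeAI, GoogleGenerativeAIEmbeddings

llm = ChatGoogleGenerativeAI(model="gemini-pro")
embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")

docs = PyPDFLoader("my_file.pdf").load()
splits = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

# Summary produced with the LLM (illustrative prompt), stored as its own collection.
summary = llm.invoke("Summarize this document:\n" + docs[0].page_content[:4000]).content

splits_db = Qdrant.from_documents(splits, embeddings, location=":memory:", collection_name="splits")
summary_db = Qdrant.from_texts([summary], embeddings, location=":memory:", collection_name="summary")

def answer(question: str) -> str:
    # Two retrievals: one against the summary, one against the splits.
    context = [d.page_content for d in summary_db.similarity_search(question, k=1)]
    context += [d.page_content for d in splits_db.similarity_search(question, k=3)]
    prompt = "Answer using only this context:\n" + "\n---\n".join(context) + f"\n\nQuestion: {question}"
    return llm.invoke(prompt).content
```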
wlopezm-unal/CNN_tensorFlow
Notebook on using the VGG19 Convolutional Neural Network (CNN) model for image classification of natural scenes. The framework is TensorFlow.
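A short sketch of what VGG19 transfer learning for scene classification can look like in TensorFlow/Keras; the six-class head, 150x150 input size, dataset path, and hyperparameters are assumptions, not the notebook's actual settings.

```python
# Sketch: VGG19 as a frozen feature extractor with a small classification head.
import tensorflow as tf

base = tf.keras.applications.VGG19(weights="imagenet", include_top=False,
                                   input_shape=(150, 150, 3))
base.trainable = False  # keep the pretrained convolutional weights fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(6, activation="softmax"),  # e.g. 6 natural-scene classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Illustrative dataset layout: one sub-folder per class under data/train.
# In practice, apply tf.keras.applications.vgg19.preprocess_input to the images.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(150, 150), batch_size=32)
model.fit(train_ds, epochs=5)
```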
wlopezm-unal/COBRO_BANCARIO
Backend built with FastAPI using a CRUD approach. The project receives data from users, checks bank loan statuses, and tracks the amounts of money requested and approved. It also has a notification option that sends an email to the user when the loan cancellation date has passed.
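A minimal FastAPI sketch of the kind of endpoints described; the Loan model, the in-memory store, and the overdue check are illustrative stand-ins for the project's real database and email notification logic.

```python
# Sketch: FastAPI CRUD-style endpoints for loan requests and status checks.
from datetime import date
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()
loans: dict = {}  # illustrative in-memory store instead of a real database

class Loan(BaseModel):
    id: int
    user_email: str
    amount_requested: float
    amount_approved: float = 0.0
    status: str = "pending"
    cancellation_date: date | None = None

@app.post("/loans")
def create_loan(loan: Loan) -> Loan:
    loans[loan.id] = loan
    return loan

@app.get("/loans/{loan_id}")
def get_loan(loan_id: int) -> Loan:
    if loan_id not in loans:
        raise HTTPException(status_code=404, detail="Loan not found")
    return loans[loan_id]

@app.get("/loans/{loan_id}/overdue")
def is_overdue(loan_id: int) -> dict:
    loan = get_loan(loan_id)
    overdue = loan.cancellation_date is not None and loan.cancellation_date < date.today()
    # In the real project an email notification would be sent here when overdue.
    return {"loan_id": loan_id, "overdue": overdue}
```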
wlopezm-unal/Machine-learning
Machine learning models. A collection of notebooks covering machine learning models, data exploration, and data cleaning.
wlopezm-unal/mlbookcamp-code
The code from the Machine Learning Bookcamp book and a free course based on the book
wlopezm-unal/portafolio.github.io
wlopezm-unal/portafolio_data_analytics.github.io
wlopezm-unal/Practice_frontend
Frontend code for a page that displays data about Zelda: Breath of the Wild.
wlopezm-unal/Project-airflow-AWSGlue
In this project we run an ETL job in AWS Glue, orchestrated with Airflow. A Docker Compose setup brings up the Airflow, Redis, and PostgreSQL services; PostgreSQL is used to store Airflow's metadata.
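A hedged sketch of triggering a Glue job from an Airflow DAG, assuming Airflow 2.x with the apache-airflow-providers-amazon package; the job name, script location, and IAM role are placeholders.

```python
# Sketch: an Airflow DAG that triggers an AWS Glue job via the Amazon provider.
from datetime import datetime
from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="glue_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_glue_job = GlueJobOperator(
        task_id="run_glue_job",
        job_name="my_glue_etl_job",                       # placeholder Glue job name
        script_location="s3://my-bucket/scripts/etl.py",  # placeholder script path
        region_name="us-east-1",
        iam_role_name="my-glue-role",                     # placeholder IAM role
    )
```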
wlopezm-unal/reddit_project_airflow_aws
This project focuses on implementing an ETL pipeline using Apache Airflow to efficiently extract data from Reddit, transform it as needed, and load it into an AWS S3 bucket. The use of Airflow allows for robust orchestration of the data workflow, ensuring that each step of the ETL process is executed in a reliable and repeatable manner.
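A compact sketch of the extract-transform-load shape described above, using Airflow's TaskFlow API (Airflow 2.x); the PRAW credentials, subreddit, filter, and bucket name are placeholders, and the real DAG may be organized differently.

```python
# Sketch: Reddit -> transform -> S3 pipeline using Airflow's TaskFlow API.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def reddit_etl():
    @task
    def extract() -> list:
        import praw  # placeholder credentials below
        reddit = praw.Reddit(client_id="...", client_secret="...", user_agent="etl")
        return [{"title": p.title, "score": p.score}
                for p in reddit.subreddit("dataengineering").hot(limit=50)]

    @task
    def transform(posts: list) -> list:
        return [p for p in posts if p["score"] > 10]  # simple illustrative filter

    @task
    def load(posts: list) -> None:
        import json, boto3
        s3 = boto3.client("s3")
        s3.put_object(Bucket="my-reddit-bucket",  # placeholder bucket
                      Key=f"reddit/{datetime.utcnow():%Y-%m-%d}.json",
                      Body=json.dumps(posts))

    load(transform(extract()))

reddit_etl()
```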
wlopezm-unal/RPA_Coronavirus
RPA that automates the search for and download of charts for different countries on confirmed COVID-19 cases and confirmed deaths, from 2020 to the present.
wlopezm-unal/Titanic_ship-streamlit
Machine learning model that uses Titanic passenger data to predict whether a passenger survived or died. It applies Random Forest, GaussianNB, and Logistic Regression. In addition, Streamlit is used together with FastAPI to display the prediction results.
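A rough sketch of how a Streamlit front end can call a FastAPI prediction service, assuming the API runs locally on port 8000 and exposes a hypothetical /predict route; the feature names follow the usual Titanic columns.

```python
# Sketch: Streamlit front end posting passenger features to a FastAPI /predict endpoint.
import requests
import streamlit as st

st.title("Titanic survival prediction")

pclass = st.selectbox("Passenger class", [1, 2, 3])
sex = st.selectbox("Sex", ["male", "female"])
age = st.number_input("Age", min_value=0.0, max_value=100.0, value=30.0)
fare = st.number_input("Fare", min_value=0.0, value=32.0)

if st.button("Predict"):
    payload = {"pclass": pclass, "sex": sex, "age": age, "fare": fare}
    # Hypothetical endpoint served by the FastAPI backend.
    resp = requests.post("http://localhost:8000/predict", json=payload, timeout=10)
    st.write("Survived" if resp.json().get("survived") else "Did not survive")
```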
wlopezm-unal/web_scraping-with-AWS
A project in which the AWS infrastructure (VPC, EC2, S3, RDS, and security groups) is created with Terraform, in order to run web scraping code on EC2 that extracts data about the 2023-2024 English soccer league and saves that information to S3.
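The scraping-and-upload step could look roughly like the Python below (the Terraform infrastructure itself is defined separately); the source URL, HTML selectors, and bucket name are assumptions.

```python
# Sketch: scrape a league table with requests/BeautifulSoup and upload the result to S3.
import csv, io
import boto3
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/premier-league-2023-2024"  # placeholder source page
html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

rows = []
for tr in soup.select("table tr")[1:]:  # assumed table layout, first row is the header
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if cells:
        rows.append(cells)

buffer = io.StringIO()
csv.writer(buffer).writerows(rows)

boto3.client("s3").put_object(
    Bucket="my-scraping-bucket",  # placeholder bucket
    Key="premier_league_2023_2024.csv",
    Body=buffer.getvalue(),
)
```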
wlopezm-unal/proyecto-tic
RPA process using an AWS client. In the final step, the data is saved to an S3 bucket.
wlopezm-unal/WordpressMysqlKubernete
Deploying MySQL and WordPress on Kubernetes: learn how to efficiently deploy and manage your database and website on this container orchestration platform.
wlopezm-unal/World-population-1950-2100
This repository is an RPA that, through web scraping with the Selenium library, extracts world population information for different countries from 1950 to 2100 from the website 'https://world-statistics.org'.
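A small Selenium sketch in the spirit of this repository, assuming Chrome and a tabular layout on the page; the CSS selectors are guesses rather than the repository's actual ones.

```python
# Sketch: open the statistics site with Selenium and read table rows.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # assumes chromedriver is available on PATH
try:
    driver.get("https://world-statistics.org")
    # Assumed selector: rows of the first data table on the page.
    for row in driver.find_elements(By.CSS_SELECTOR, "table tr"):
        cells = [c.text for c in row.find_elements(By.TAG_NAME, "td")]
        if cells:
            print(cells)
finally:
    driver.quit()
```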
wlopezm-unal/zoomcamps
Documentation related to the zoomcamp courses.