llm_pentest

Pentesting POC using LLMs

LLM Pentest

In this project, we explore the use of LLMs in penetration testing. We propose an agent-based system that utilizes LLMs to generate reports and tasks and to identify vulnerabilities. This AI-assisted approach can be used for either fully automated or semi-automated penetration testing. In the latter case, combining human expertise with AI's rapid data processing can enhance safety and leverage the strengths of both human and AI insights in cybersecurity.
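To make the idea concrete, the loop below is a hedged sketch of such an agent: the LLM proposes the next task, each result is appended to the context, and a report is assembled at the end. `query_llm`, `run_agent`, and the canned responses are placeholders for illustration, not the project's actual interfaces; a real implementation would call the GPT-4 API.

```python
def query_llm(prompt):
    # Placeholder for a real GPT-4 call; returns canned text so the
    # sketch is runnable without API access.
    canned = {
        "plan": "Enumerate open ports on the target host.",
        "report": "Finding: port 22 exposed; recommend key-only SSH auth.",
    }
    return canned["report" if "Summarize" in prompt else "plan"]

def run_agent(target, steps=1):
    # Accumulate a shared context: target, then one proposed task per step.
    context = [f"Target: {target}"]
    for _ in range(steps):
        task = query_llm("Propose the next pentest task.\n" + "\n".join(context))
        context.append(f"Task: {task}")
    # Final call turns the accumulated context into a report.
    return query_llm("Summarize the findings as a report.\n" + "\n".join(context))
```

In the semi-automated mode described above, a human operator would review each proposed task before it is executed.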

Installation

Use the package manager Poetry to install the dependencies.

  1. Install Poetry:
pip install poetry
  2. Create a virtual environment:
poetry shell
  3. Install the dependencies:
poetry install

Before you start

All secret keys are stored in a .env file. You can find a .env.example file in the root directory. Copy it and rename it to .env. Then fill in the values.
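As an illustration of what loading that file involves, here is a minimal standard-library parser for `.env`-style `KEY=value` lines. This is a sketch only; the project itself likely relies on a dedicated library such as python-dotenv rather than hand-rolled parsing.

```python
def load_env(path=".env"):
    """Parse a .env file into a dict, skipping blanks and # comments."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Strip surrounding quotes, if any.
            env[key.strip()] = value.strip().strip('"')
    return env
```

The resulting dict can then be merged into `os.environ` so the application picks the keys up as ordinary environment variables.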

You will need access to the GPT-4 API. Learn more about it here.

Usage

The easiest way to test the pentesting assistant is through the Chainlit application. To do so, run the following command:

chainlit run src/chat/app.py

Then, open your browser and go to http://localhost:8000.

Authors

Disclaimer

This project was developed as coursework for the course "AI in Cybersecurity" at the University of Deggendorf. It is a proof of concept only and is not intended for production use.