perfi: personal DeFi portfolio analytics
An open-source (AGPLv3), local/private tool to help track your DeFi degen escapades.
The current perfi functionality is focused on generating a "close enough" estimate of cost basis and loss/gain/income calculations from certain EVM-compatible chain transactions. We are generating a (US IRS) 8949-style output that can be used as a starting point for tax estimation purposes.
We built this because we couldn't find any tool that could remotely understand DeFi protocols/primitives and how they would map into disposals/income calculations. As far as we know, this is the most advanced DeFi accounting analysis tool in existence and has been tested against real wallets with tens of thousands of transactions interacting with hundreds of contracts.
This software is currently an alpha release. Features may be missing, broken, or confusing. There is no friendly UI yet (outside of a CLI and a generated spreadsheet output), the documentation is sparse, and anyone using this should be familiar with Python.
This software should not be used in lieu of accounting / tax review. While we believe that the results generated by this software are useful, they are almost guaranteed to be incorrect without manual adjustments. Also, for your convenience, here's the Disclaimer of Warranty from our LICENSE:
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
You should be familiar with Python, and have Python 3.8 and Poetry installed. We won't be providing any support for setting up your software environment. In the future, we'll be building more accessible packaged installer releases.
perfi currently depends on several third-party API providers (no configuration is required by default):
- DeBank OpenAPI - provides a helpful list of transaction history per chain
- CoinGecko API - provides day-resolution coin prices. No API key is required, but requests will be rate-limited. perfi caches and retries, so your initial fetches will be slow, but they should eventually complete
- Paid API plans are supported and can be entered in the initial setup
- ECB Euro foreign exchange reference rates - provides daily fiat conversion rates, cached for use by CurrencyConverter
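The cache-and-retry behavior described above can be sketched roughly like this (a simplified illustration; the names and backoff policy here are assumptions, not perfi's actual implementation):

```python
import time

_cache = {}  # stand-in for perfi's on-disk cache.db

class RateLimited(Exception):
    """Illustrative stand-in for an HTTP 429 response."""
    pass

def cached_fetch(key, fetch_fn, max_retries=5, base_delay=0.01):
    """Return a cached value if present; otherwise call fetch_fn with
    exponential backoff on rate-limit errors, then cache the result."""
    if key in _cache:
        return _cache[key]
    delay = base_delay
    for _attempt in range(max_retries):
        try:
            value = fetch_fn()
            _cache[key] = value  # cached, so re-runs are fast
            return value
        except RateLimited:
            time.sleep(delay)  # back off and try again
            delay *= 2
    raise RuntimeError("gave up after {} attempts: {}".format(max_retries, key))
```

This is why a first run against CoinGecko's free tier is slow but subsequent runs are quick: every priced day ends up in the cache.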
Here's how to install:
```shell
git clone https://github.com/AUGMXNT/perfi
cd perfi
poetry install
```
And how to run:
```shell
# Interactive initial perfi setup (entity, accounts, API keys)
poetry run python bin/cli.py setup

# NOTE: Many commands below require an entity name to operate on.
# For the rest of these examples, we assume you have an entity named 'peepo'

# OPTIONAL: You can also add entities or addresses manually
#
# Add an entity 'peepo'
# > poetry run python bin/cli.py entity create peepo
#
# Add an ethereum-style account named 'degen wallet'
# > poetry run python bin/cli.py entity add_address peepo 'degen wallet' 'ethereum' '0x0000000000000000000000000000000000000000'
#
# You can add as many wallets as you want to an entity

# Update the Coingecko price token list
poetry run python bin/update_coingecko_pricelist.py

# OPTIONAL: Import data from exchanges (more docs below)
# If you have data from Coinbase, Coinbase Pro, Kraken, Gemini, or Bitcoin.tax, you can import it into perfi as well
# > poetry run python bin/import_from_exchange.py --entity_name peepo --file peepo-coinbase-2021-rawtx.csv --exchange coinbase --exchange_account_id peepo

# Import on-chain transactions
poetry run python bin/import_chain_txs.py peepo

# Generate tx/price asset mappings - this step is key for matching like-kind assets and producing sensible output
poetry run python bin/map_assets.py

# Turn raw exchange/on-chain txs into grouped logical/ledger txs
poetry run python bin/group_transactions.py peepo

### Generally you should not need to re-run anything above this line again ###

# OPTIONAL: Set the timezone used for reporting (defaults to US/Pacific)
# Get a list of valid time zone names with:
# > poetry run python bin/cli.py setting get_timezone_names
# And set your reporting timezone with:
# > poetry run python bin/cli.py setting set_reporting_timezone 'Europe/Lisbon'

# Calculate costbasis lots, disposals, and income
poetry run python bin/calculate_costbasis.py peepo

# Generate 8949 xlsx file
poetry run python bin/generate_8949.py peepo
```
Today, perfi supports importing trade data from Bitcoin.tax, Coinbase, Coinbase Pro, Gemini, and Kraken.
To import data from an exchange, run:

```shell
poetry run python bin/import_from_exchange.py --entity_name <peepo> --file <path/to/export/file> --exchange <coinbase|coinbasepro|gemini|kraken|bitcointax> --exchange_account_id <anything_eg_default>
```
- for the `--exchange_account_id` parameter, use anything you want; it's just used to help differentiate multiple accounts from the same exchange
- supported exchange names for the `--exchange` parameter are: `coinbase`, `coinbasepro`, `gemini`, `kraken`, `bitcointax`
- for the `--file` parameter, see below for which file you need to provide for a given exchange
- Coinbase
- Reports Section → Transaction History → Generate Report → All time, all assets, All transactions. Format should be CSV.
- Coinbase Pro
- Statements → Generate → Account Statement → Select date range and 'All Accounts'. Format should be CSV.
- Gemini
- Account → Settings → Statements and History → Transaction History → Exchange Transaction History → Click the download icon next to this label. Format should be XLSX.
- Kraken
- History → Export → Select 'Ledgers' and pick date range. Format should be CSV.
- Bitcoin.Tax
- Opening → Download. Format should be CSV.
`bin/cli.py` should let you do what you need for updating logical and ledger transactions (updating prices, transaction types). We try to be smart about updating downstream results, although if things look wonky, you may need to re-run `bin/group_transactions.py` on down.

We've included `--help` for the options in the various CLI apps, as there's some functionality not in this document yet.
Disposals in generated 8949 xlsx sheets internally link to their associated lots with transaction IDs and hashes. LibreOffice and Google Sheets have been tested and play nicely with our output file.
While the costbasis disposal/income/lot results may be wrong/incomplete, we've included debug data that links to where the disposals draw down from, as well as both ledger and on-chain hashes. We've included a collated "Ledger TXs" sheet as well that includes USD price and asset assignments, so it should be helpful even if you're putting together your disposals manually or using a different tool for accounting (e.g., if the current tax treatments or lot matching are not to your liking).
If you run into weird problems, you could try nuking `data/perfi.db` and doing a "clean" run (keeping `cache/cache.db` should be fine and will make runs a lot faster). Also, you can take a look in `logs/` to see if there's more useful info there.

We also recommend DB Browser for SQLite for spelunking around in `data/perfi.db`.
Tax treatment is currently hard-coded. Here's a summary:
- We do specific-ID lot matching, generally HIFO (although we use lowest cost basis for loan repayments). We don't do long-term/short-term optimizations, but the HIFO approach should generally still give near-optimal (minimized) tax exposure for US tax rates
- Transfers (including CEX transfers and bridging) are assumed to be neither disposals nor income. If they are, you should use `bin/cli.py` to manually change the TX type
- Wrapping/unwrapping is also considered non-disposal
- We will create zero-cost-basis lots (with a flag) if necessary
- Single-staking or single-asset deposits/withdrawals are treated as deposits, not disposals
- Swaps/LP are like trades and are treated as disposals
- We try to do a good job accounting for income from yield or interest (see Tests); however, we can't currently track it if the protocol doesn't generate a deposit receipt
- We do a best effort for pricing of LP tokens and try to track receipt tokens properly
- For more details atm, take a look at `perfi/costbasis.py:process()` and `perfi/models.py:TxLogical.refresh_type()`
- Current tx types: borrow, repay, deposit, withdraw, disposal, lp, swap, yield
- We have flags for: receipt, ownership_change, disposal, and income
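As an illustration of the specific-ID HIFO matching described above, here's a minimal sketch (the `Lot` shape and `dispose_hifo` function are hypothetical; the real logic lives in `perfi/costbasis.py:process()` and handles loan repayments, flags, and much more):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Lot:
    amount: float          # units of the asset remaining in this lot
    basis_per_unit: float  # USD cost basis per unit

def dispose_hifo(lots: List[Lot], amount: float, sale_price: float) -> float:
    """Draw down `amount` units from the highest-basis lots first and
    return the realized gain (negative = loss). Mutates lots in place."""
    gain = 0.0
    for lot in sorted(lots, key=lambda l: l.basis_per_unit, reverse=True):
        if amount <= 0:
            break
        take = min(lot.amount, amount)
        gain += take * (sale_price - lot.basis_per_unit)
        lot.amount -= take
        amount -= take
    if amount > 0:
        # perfi would instead create a flagged zero-cost-basis lot here
        gain += amount * sale_price
    return gain
```

Selling 1.5 units at $200 against lots bought at $300 and $100 consumes the $300-basis lot first, which is what minimizes the realized gain.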
You may also want to check out BittyTax, a set of tools to help with UK taxes, or CoinTaxMan for German taxes.
- perfi runs as much on your local system as possible, and while we use some third party APIs to make our job easier, our goal is to minimize any PII stored/leaked remotely. We appreciate our privacy and we think others do too.
- This software is licensed under the AGPLv3, a strong copyleft license. This is meant to make sure that end-users of the software will always have control and be able to modify it to their liking, while as devs we can maintain optionality/minimize free-riding
- If there's demand/we don't get bored, we may add some (privacy-first) freemium services (higher resolution price/other metadata), access to archival nodes, etc, in the future. In the meantime, if you find this useful or want to support development, donations are gratefully accepted (see below)
- For the exchange imports we've implemented, our HIFO lot matching algorithm almost exactly matches the best-in-class services we've tested (Bitcoin.tax) and beats others like Koinly or CoinTracking, which fall down/have bugs (aggressively zeroing the cost basis of transfers, not understanding foreign-currency fiat transactions, etc.)
- Turns out this stuff is sort of complex, though. We will be publishing some Architectural Decision Records in due time discussing how we handle transactions and cost basis and more
- We think it'll be useful to build a DDL and shared repository to allow community members to extend perfi's ability to understand arbitrary DeFi protocols and strategies
- While taxes are the focus of our initial release, they're not actually so interesting. Future plans for perfi focus more on the perf part:
- Tracking actual performance of DeFi investment strategies (performance vs benchmarks/hodling, accounting for transaction costs, tax efficiency)
- Farming/claim calculations and helpers
- You can see a preview of some of that sort of thing here: https://github.com/AUGMXNT/frax-analysis/
- Some types of DeFi transactions still aren't handled well (Balancer/Curve-style multi-asset LP entries; some more exotic hedging/margin strategies; DSA/EOA-style operations like Instadapp; UniV3 multicalls; GMX/GLP) and are ignored (but should be logged)
- We don't support NFTs very well atm, sorry
- Tax treatments are hard coded and may not match your tax regime/preferences. In future versions we plan on making this easier to personalize/configure
- Only supports some EVM chains atm
- Doesn't account for fees atm (this is high on the priority list)
This is definitely NOT tax advice, but in the US, if you file an extension, and don't make an adequate prepayment, you will owe a Failure to Pay Penalty of 0.5%/mo (6% APR). Interesting, right?
- Note, the Failure to File Penalty is 5%/mo if you don't get things sorted by the time your extension period ends
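A quick sanity check on that arithmetic (an illustrative helper, not part of perfi):

```python
def failure_to_pay_penalty(tax_owed, months, monthly_rate=0.005):
    """Simple (non-compounding) monthly accrual: 0.5%/mo adds up to 6% over a year."""
    return tax_owed * monthly_rate * months

# Owing $10,000 for 12 months accrues roughly $600 in penalties.
```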
The GUI is a single-page JS app written in TypeScript. We use Vue.js for routing/rendering components, Pinia for some state management of record collections, Quasar for the UI components, and Vite for the build system. The GUI reads and writes data by talking to an API server running on the same host.
Frontend
The Frontend app is built with Vue.js. Vue is similar to React; it allows you to compose pages out of components which selectively re-render to update the UI when a component's backing data changes. The frontend files live in `frontend/`.
More information on getting the frontend and backend running for dev purposes can be found in `frontend/README.md`.
One common pattern in data-driven Single-Page Apps is to have stores which are responsible for loading data from an API server, keeping that data up-to-date in memory, and providing the data for components to use when they are rendered. Vue recommends Pinia for data store management, so we use that. Our stores are very straightforward and all follow a similar pattern: each has an async method to load an initial collection of data from the API server into memory, plus methods to update that data in memory in response to events triggered from the rendered components. See `frontend/stores/entities.ts` for an example.
We use Quasar for its grid layout framework and for many of its pre-built components to help deliver a UI with a consistent look and feel. Quasar has a large collection of Vue components, each with extensive documentation. See https://quasar.dev/vue-components for more info.
Backend
The backend API server is written in Python and uses FastAPI. The API file lives in `perfi/api.py`.
If you know how FastAPI works, you’ll understand most of how the file is written. There are just a few notable things that might be unusual.
CORS: We need to set proper CORS headers on API responses so that browsers are OK to talk to the API. Since CORS origin values require a host and a port, and since the API server port (and the Web asset server port) may be dynamic (set via ENV vars), we have a set of hardcoded default origins which we extend with values that are present in the ENV vars. Look for `API_PORT` and `FRONTEND_PORT` env var usage in the file for more info.
Our EnsureRecord FastAPI helper class: Many API endpoints require a record ID (for example `/entities/1` or `/entities/1/addresses` or `/addresses/1`, etc.). FastAPI has a dependency injection system to provide values to your routes, and we leverage this here with an `EnsureRecord` class which gives us an easy way to query a particular data store (e.g. `EntitiesStore` or `AddressesStore`, etc.) for a given record. For example:

```python
@app.get("/entities/{id}")
def list_addresses_for_entity(entity: Entity = Depends(EnsureRecord("entity"))):
    return entity
```

Here, you can see we define a GET to `/entities/{id}` and we load that entity by querying the entity store for a record with that ID via the `EnsureRecord` dependency.
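A minimal sketch of what a dependency class in this style might look like (the store lookup, the `stores` dict, and the error type here are assumptions; the real class lives in `perfi/api.py` and raises an HTTP 404 via FastAPI's `HTTPException`):

```python
class RecordNotFound(Exception):
    """Stand-in for FastAPI's HTTPException(status_code=404)."""
    pass

class EnsureRecordSketch:
    """Callable class dependency: constructed once with the record kind,
    then invoked by the framework per request with the path parameter."""

    def __init__(self, kind, stores):
        self.kind = kind           # e.g. "entity" or "address"
        self.store = stores[kind]  # e.g. an EntitiesStore-like object

    def __call__(self, id: int):
        record = self.store.get(id)  # assumed store interface
        if record is None:
            raise RecordNotFound("{} {} not found".format(self.kind, id))
        return record
```

With FastAPI, `Depends(EnsureRecord("entity"))` makes the framework call the instance for each request, passing in the resolved `{id}` path parameter.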
Running CLI bins: Some of the API endpoints delegate their work to simply executing one of the existing CLI bin commands. This is just so we don't have to duplicate logic that's already defined in the command handlers for the CLI:
```python
from bin.cli import (
    ledger_update_logical_type,
    ledger_update_ledger_type,
    ledger_update_price,
    ledger_flag_logical,
    ledger_remove_flag_logical,
    ledger_move_tx_ledger,
)
```
We use Electron to bootstrap the app (start the API and Web asset servers, then open a chrome-less browser window to show the Single-Page App GUI). The Electron files live in `electron/`.
In order to start the API and Web Asset servers, we need to find free ports on the host system and start Python processes for the servers, passing those ports in appropriately (via ENV vars). This happens inside the Electron app's entry point file `electron/src/index.js`; see `createWindow()` for the details.
Packaging the Electron app is accomplished using a tool called `electron-builder`, which simply looks at some configuration data describing which OS/package formats to build for and then builds them. However, since all Electron knows how to do is run Node.js code and open a web view, we need a way to take our API server and package it up with all of its Python dependencies so it can run on the target host. For this, we use PyInstaller.
From PyInstaller's website: "PyInstaller bundles a Python application and all its dependencies into a single package. The user can run the packaged app without installing a Python interpreter or any modules. PyInstaller supports Python 3.7 and newer, and correctly bundles many major Python packages such as numpy, matplotlib, PyQt, wxPython, and others."

After examining a Python file for packaging, PyInstaller produces a `.spec` file (which is executable Python code) outlining exactly what it will package, and how. Our entry point for PyInstaller is `app_main.py` and the corresponding spec file is `app_main.spec`.

`app_main.py`'s job is simply to spin up the backend API server (for the Frontend JS app to use) and a simple HTTP static file server (to serve the Frontend JS app itself). It takes in `API_PORT` and `FRONTEND_PORT` env vars and defaults to `5000` and `50001` when those envs are not present.
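That env-var-with-defaults handling can be sketched as (the helper name is illustrative; see `app_main.py` for the real entry point):

```python
import os

def get_ports(env=None):
    """Return (api_port, frontend_port), preferring the API_PORT /
    FRONTEND_PORT env vars and falling back to the documented defaults."""
    env = os.environ if env is None else env
    api_port = int(env.get("API_PORT", 5000))
    frontend_port = int(env.get("FRONTEND_PORT", 50001))
    return api_port, frontend_port
```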
Sometimes, it can be useful to know whether Python code is executing inside a PyInstaller-packaged app or not. This snippet helps here: `IS_PYINSTALLER = getattr(sys, "frozen", False) and hasattr(sys, "_MEIPASS")`
We use this technique inside `perfi/constants/paths.py` in order to set the parent dir in the `DATA_DIR` and `CACHE_DIR` variables. For local development, we proceed as we did before, simply looking for the git root dir. But for the PyInstaller version, we use OS-specific sensible locations. See `get_user_data_dir()` for more info.
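A sketch of that pattern (the specific directories below are illustrative assumptions, not necessarily the ones `get_user_data_dir()` picks):

```python
import os
import sys
from pathlib import Path

# True only when running inside a PyInstaller-packaged app
IS_PYINSTALLER = getattr(sys, "frozen", False) and hasattr(sys, "_MEIPASS")

def sketch_user_data_dir(app_name="perfi"):
    """Illustrative OS-specific data dir selection for a packaged app;
    in local dev, perfi uses the git root dir instead."""
    if sys.platform == "win32":
        base = Path(os.environ.get("APPDATA", Path.home()))
    elif sys.platform == "darwin":
        base = Path.home() / "Library" / "Application Support"
    else:
        base = Path(os.environ.get("XDG_DATA_HOME", Path.home() / ".local/share"))
    return base / app_name
```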
An important detail about PyInstaller is that it is not a cross-compiler; it can only successfully package up python for the OS that you run the command inside. So, we use GitHub Actions to build out our final application, since we can run build steps on Windows, Ubuntu, and Mac hosts there.
The packaged app's version comes from `electron/package.json`. The best way to update the version for new releases (and trigger the GitHub Actions) is to use `npm version <major|minor|patch>` in the `electron` folder. This will bump the package.json version AND create a commit with that version update AND create a new git tag.

SETUP NOTE: in order for `npm version` to work correctly, it needs to think that it's in a git repo. You will need to create an empty `.git` folder inside `electron` on your local repo. git will ignore this.
To trigger the build, you will need to push the tag to GitHub:

```shell
git push --tags
```
Inside `.github/workflows/build_releases.yml` you'll find the GitHub Actions workflow file that controls how our app is built. The workflow runs whenever a new tag matching the pattern `v*` is pushed to GitHub (e.g. `v1.0.0` or `vFoo`).

The workflow will build out the app and put the final product into a new GitHub release named after the `version` value inside `electron/package.json` (not the tag name you use).
If your build fails at the publishing step, it could be because a release already exists for the configured current version number. Also note that all releases are created in Draft form and must be manually changed to Public before people can view them at https://github.com/AUGMXNT/perfi/releases
If you've found this software useful, feel free to zap us some coins/tokens. We promise to farm the shit out of it:
0x0bcee2cd8564b2c61caec20113f1f87a16e10cb0
- DeBank - this is by far the best DeFi portfolio viewer and we use their API extensively. If you are doing DeFi, you should definitely be using this tool.
- rotki - this app shares some goals with perfi, and would have saved a few hundred hours (and counting) of dev time if it supported the chains/protocols we needed. A small dedicated team has been working on it for years - worth a look if it'll do what you need. They have a GUI, real documentation, and like users and stuff
- bitcoin.tax - if all you have is centralized exchange transactions, crypto taxes are a solved problem. We checked out a dozen services and bitcoin.tax did the best job of any of them (but most of them are probably good enough).
- staketaxcsv - MIT-licensed python code for exporting CSVs from multiple blockchains, including Algorand, Solana, and various Cosmos/IBC chains. Published/used by https://stake.tax/