Civil is a decentralized and censorship-resistant ecosystem for online journalism. Read more in our whitepaper.
This repository contains open-source code to capture and handle Civil-specific smart contract event log data. It is written in golang. It currently captures events related to the Civil TCR and Civil Newsroom contracts, but can be expanded to capture additional events.
For Civil's main open-source tools and packages, check out http://github.com/joincivil/Civil.
Civil's ecosystem is free and open-source; we're all part of it, and you're encouraged to take part with us. We are looking to evolve this into something the community will find helpful and effortless to use.
If you're itching to delve deeper inside, the help wanted and good first issue labels are good places to get started and learn the architecture.
This project uses make to run setup, builds, tests, etc.
Ensure that your $GOPATH and $GOROOT are set up properly in your shell configuration and that this repo is cloned into the appropriate place in the $GOPATH, i.e. $GOPATH/src/github.com/joincivil/civil-events-crawler/.
To set up the necessary requirements:
make setup
Relies on dep (https://golang.github.io/dep/) for dependency management, which updates the /vendor/ directory in the project. When adding or removing imports, make sure to run dep ensure. Adding or removing a dependency requires committing the resulting updates to Gopkg.lock and /vendor/ to the repository.
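For example, pulling in a new dependency might look like this (the package path is purely illustrative):

dep ensure -add github.com/pkg/errors
git add Gopkg.lock vendor/
git commit -m "Add github.com/pkg/errors dependency"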
There are a few places where code/artifacts need to be moved or generated before the project can be built, tested, and/or linted. This is likely a place that can be streamlined and improved as time goes on.
The latest generated code is checked into the repository, except for the contract .abi/.bin files. The .abi/.bin files are only needed to generate the Solidity wrappers.
This project relies on artifacts from the main Civil repository http://github.com/joincivil/Civil. Please clone the Civil repository into a directory accessible by this repository.
To build and copy the Civil contract .abi/.bin files, run the scripts/abi_retrieval.sh script:
scripts/abi_retrieval.sh /full/path/to/main/civil/repo /full/path/to/civil-events-crawler/abi
The destination directory is generally the civil-events-crawler/abi directory in this repository. This will produce the .abi and .bin files from the artifacts in the Civil repository.
There are Solidity wrappers that are created by abigen from the go-ethereum package. Generating them requires the .abi/.bin files from the step above. The wrappers go into the pkg/generated/contract directory and are generated by running:
make generate-contracts
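For a sense of what the generated wrappers provide, here is a minimal sketch of instantiating one. The constructor name NewCivilTCRContract and the import path are assumptions based on standard abigen output, not the exact generated names; the node URL and address are placeholders.

package main

import (
	"log"

	"github.com/ethereum/go-ethereum/common"
	"github.com/ethereum/go-ethereum/ethclient"

	// Assumed import path for the generated bindings in this repo.
	"github.com/joincivil/civil-events-crawler/pkg/generated/contract"
)

func main() {
	// Connect to an Ethereum node (URL is a placeholder).
	client, err := ethclient.Dial("http://localhost:8545")
	if err != nil {
		log.Fatal(err)
	}

	// Hypothetical abigen-generated constructor; the address is a placeholder.
	tcr, err := contract.NewCivilTCRContract(
		common.HexToAddress("0x1111111111111111111111111111111111111111"), client)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("bound contract wrapper: %T", tcr)
}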
There are a number of Watch* methods on each Civil Solidity contract wrapper that allow us to listen for and stream contract events. The wrappers around these Watch* methods are generated using the cmd/eventhandlergen command and are placed into the pkg/generated/watchers directory.
make generate-watchers
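For context, abigen-generated Watch* methods follow a standard pattern: they push decoded events into a channel and return an event.Subscription. The sketch below shows that underlying pattern with hypothetical names (NewCivilTCRContract, WatchApplication, CivilTCRContractApplication) and an assumed import path; the generated wrappers in pkg/generated/watchers build on methods like these. Real Watch* methods may also take extra arguments for filtering on indexed event fields.

package main

import (
	"log"

	"github.com/ethereum/go-ethereum/accounts/abi/bind"
	"github.com/ethereum/go-ethereum/common"
	"github.com/ethereum/go-ethereum/ethclient"

	// Assumed import path for the generated bindings in this repo.
	"github.com/joincivil/civil-events-crawler/pkg/generated/contract"
)

func main() {
	// Watching events requires a websocket RPC endpoint (URL is a placeholder).
	client, err := ethclient.Dial("ws://localhost:8545")
	if err != nil {
		log.Fatal(err)
	}

	// Hypothetical abigen-generated constructor; the address is a placeholder.
	tcr, err := contract.NewCivilTCRContract(
		common.HexToAddress("0x1111111111111111111111111111111111111111"), client)
	if err != nil {
		log.Fatal(err)
	}

	// Each Watch* method streams one event type into a channel.
	sink := make(chan *contract.CivilTCRContractApplication)
	sub, err := tcr.WatchApplication(&bind.WatchOpts{}, sink) // hypothetical method
	if err != nil {
		log.Fatal(err)
	}
	defer sub.Unsubscribe()

	for {
		select {
		case ev := <-sink:
			log.Printf("Application event: %+v", ev)
		case err := <-sub.Err():
			log.Fatal(err)
		}
	}
}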
There are a number of Filter* methods on each Civil Solidity contract wrapper that allow us to collect existing contract events. The wrappers around these Filter* methods are generated using the cmd/eventhandlergen command and are placed into the pkg/generated/filterers directory.
make generate-filterers
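Similarly, abigen-generated Filter* methods take a bind.FilterOpts block range and return an iterator over historical events. A minimal sketch of that underlying pattern, with the same hypothetical names as above:

package main

import (
	"log"

	"github.com/ethereum/go-ethereum/accounts/abi/bind"
	"github.com/ethereum/go-ethereum/common"
	"github.com/ethereum/go-ethereum/ethclient"

	// Assumed import path for the generated bindings in this repo.
	"github.com/joincivil/civil-events-crawler/pkg/generated/contract"
)

func main() {
	client, err := ethclient.Dial("http://localhost:8545") // placeholder URL
	if err != nil {
		log.Fatal(err)
	}

	// Hypothetical abigen-generated constructor; the address is a placeholder.
	tcr, err := contract.NewCivilTCRContract(
		common.HexToAddress("0x1111111111111111111111111111111111111111"), client)
	if err != nil {
		log.Fatal(err)
	}

	// Collect all Application events from block 0 through the latest block.
	opts := &bind.FilterOpts{Start: 0, End: nil}
	iter, err := tcr.FilterApplication(opts) // hypothetical generated method
	if err != nil {
		log.Fatal(err)
	}
	defer iter.Close()

	for iter.Next() {
		log.Printf("Application event: %+v", iter.Event)
	}
	if err := iter.Error(); err != nil {
		log.Fatal(err)
	}
}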
This creates wrapper functions around each contract's set of filterers and watchers. A map of contract names to their smart contract addresses is passed into these functions and determines which set of filterers/watchers needs to be started in the crawler. These are generated using the cmd/handlerlistgen command and are placed into the pkg/generated/handlerlist directory.
make generate-handler-lists
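A minimal sketch of the kind of contract-name-to-address map these functions consume (the short names match the CRAWL_CONTRACT_ADDRESSES names below; the addresses are placeholders, and no actual generated function is shown here):

package main

import (
	"log"

	"github.com/ethereum/go-ethereum/common"
)

func main() {
	// Map contract short names to their deployed addresses (placeholders).
	contracts := map[string]common.Address{
		"civiltcr": common.HexToAddress("0x1111111111111111111111111111111111111111"),
		"newsroom": common.HexToAddress("0x2222222222222222222222222222222222222222"),
	}
	// The generated handler-list functions take a map like this and start the
	// matching filterers/watchers; see pkg/generated/handlerlist for the
	// actual generated functions.
	for name, addr := range contracts {
		log.Printf("would start filterers/watchers for %v at %v", name, addr.Hex())
	}
}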
To generate all the things:
make generate
Check all the packages for linting errors using a variety of linters via gometalinter. Check the Makefile for the up-to-date list of linters.
make lint
make build
Runs the tests and checks code coverage across the project. Produces a coverage.txt file for use later.
make test
Runs make test and launches the HTML code coverage tool.
make cover
The crawler relies on environment vars for configuration. At the root of the project, run:
CRAWL_ETH_API_URL=<RPC API URL>
CRAWL_CONTRACT_ADDRESSES="<contract short name>:<contract address>,..."
CRAWL_PERSISTER_TYPE_NAME=<persister type>
go run cmd/crawler/main.go
Valid contract short names for CRAWL_CONTRACT_ADDRESSES are civiltcr and newsroom. Valid persister types for CRAWL_PERSISTER_TYPE_NAME are none and postgresql.
Add -logtostderr=true -stderrthreshold=INFO -v=2 as arguments for the main.go command.
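For example, a run against a local node with no persistence might look like this (the addresses are placeholders):

CRAWL_ETH_API_URL=ws://localhost:8545 \
CRAWL_CONTRACT_ADDRESSES="civiltcr:0x1111111111111111111111111111111111111111,newsroom:0x2222222222222222222222222222222222222222" \
CRAWL_PERSISTER_TYPE_NAME=none \
go run cmd/crawler/main.go -logtostderr=true -stderrthreshold=INFO -v=2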
The crawler is built to accept an implementation of the persistence interfaces defined in pkg/model/persisttypes.go. These interfaces allow the crawler to store specific data related to its operation as well as the events to be collected. The initial reference implementation will be written for storing the data to PostgreSQL. However, the hope is to add additional implementations as needed and as the community sees fit.
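As a rough sketch of the shape such an interface can take (the type and method names below are illustrative, not the actual definitions in pkg/model/persisttypes.go):

package model

// CivilEvent is an illustrative stand-in for the crawler's event model.
type CivilEvent struct {
	EventType       string
	ContractName    string
	ContractAddress string
	Payload         map[string]interface{}
	Timestamp       int64
}

// EventPersister is an illustrative persistence interface; the actual
// interfaces live in pkg/model/persisttypes.go.
type EventPersister interface {
	// SaveEvents stores a batch of collected contract events.
	SaveEvents(events []*CivilEvent) error

	// LastBlockProcessed returns the last block number handled for a given
	// event type and contract address, so crawling can resume where it left off.
	LastBlockProcessed(eventType string, contractAddress string) (uint64, error)
}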