Merge branch 'safe-global:master' into master
subasshrestha authored Oct 11, 2023
2 parents fe7db6a + ec32be0 commit a8e99e4
Showing 96 changed files with 2,956 additions and 1,928 deletions.
1 change: 1 addition & 0 deletions .env.test
@@ -11,3 +11,4 @@ ETHEREUM_NODE_URL=http://localhost:8545
ETHEREUM_TRACING_NODE_URL=http://localhost:8545
ETH_HASH_BACKEND=pysha3
ENABLE_ANALYTICS=True
EVENTS_QUEUE_URL=amqp://guest:guest@localhost:5672/
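
The test environment now expects a RabbitMQ broker at this AMQP URL (a matching `rabbitmq` service is added to the CI workflow and `docker-compose.yml` below). A quick way to check that the broker is reachable before running the tests (a sketch, assuming the `pika` client is available locally):

```python
import pika

# Fail fast if the broker behind EVENTS_QUEUE_URL is not reachable
params = pika.URLParameters("amqp://guest:guest@localhost:5672/")
connection = pika.BlockingConnection(params)  # raises AMQPConnectionError if the broker is down
print("RabbitMQ reachable:", connection.is_open)
connection.close()
```
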
2 changes: 1 addition & 1 deletion .github/workflows/cla.yml
@@ -12,7 +12,7 @@ jobs:
- name: "CLA Assistant"
if: (github.event.comment.body == 'recheck' || github.event.comment.body == 'I have read the CLA Document and I hereby sign the CLA') || github.event_name == 'pull_request_target'
# Beta Release
uses: cla-assistant/github-action@v2.3.0
uses: cla-assistant/github-action@v2.3.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# the below token should have repo scope and must be manually added by you in the repository's secret
41 changes: 24 additions & 17 deletions .github/workflows/python.yml
@@ -16,7 +16,7 @@ jobs:
python-version: ["3.10"]

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
@@ -42,7 +42,7 @@ jobs:
ports:
- 6379:6379
postgres:
image: postgres:13
image: postgres:14
env:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
@@ -53,11 +53,20 @@
--health-retries 5
ports:
- 5432:5432
rabbitmq:
image: rabbitmq:alpine
options: >-
--health-cmd "rabbitmqctl await_startup"
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- "5672:5672"
steps:
- name: Setup and run ganache
run: |
docker run --detach --publish 8545:8545 --network-alias ganache -e DOCKER=true trufflesuite/ganache:latest --defaultBalanceEther 10000 --gasLimit 10000000 -a 30 --chain.chainId 1337 --chain.networkId 1337 -d
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
@@ -67,7 +76,7 @@ jobs:
- name: Install dependencies
run: |
pip install wheel
pip install -r requirements-test.txt coveralls
pip install -r requirements-test.txt
env:
PIP_USE_MIRRORS: true
- name: Run tests and coverage
@@ -86,31 +95,29 @@ jobs:
ETHEREUM_TRACING_NODE_URL: http://localhost:8545
ETH_HASH_BACKEND: pysha3
REDIS_URL: redis://localhost:6379/0
- name: Send results to coveralls
continue-on-error: true # Ignore coveralls problems
run: coveralls --service=github
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Required for coveralls
EVENTS_QUEUE_URL: amqp://guest:guest@localhost:5672/
- name: Coveralls
uses: coverallsapp/github-action@v2
docker-deploy:
runs-on: ubuntu-latest
needs:
- linting
- test-app
if: github.ref == 'refs/heads/master' || github.ref == 'refs/heads/develop' || (github.event_name == 'release' && github.event.action == 'released')
steps:
- uses: actions/checkout@v3
- uses: docker/setup-qemu-action@v2
- uses: actions/checkout@v4
- uses: docker/setup-qemu-action@v3
with:
platforms: arm64
- uses: docker/setup-buildx-action@v2
- uses: docker/setup-buildx-action@v3
- name: Dockerhub login
uses: docker/login-action@v2
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USER }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Deploy Master
if: github.ref == 'refs/heads/master'
uses: docker/build-push-action@v4
uses: docker/build-push-action@v5
with:
context: .
file: docker/web/Dockerfile
@@ -123,7 +130,7 @@ jobs:
cache-to: type=gha,mode=max
- name: Deploy Develop
if: github.ref == 'refs/heads/develop'
uses: docker/build-push-action@v4
uses: docker/build-push-action@v5
with:
context: .
file: docker/web/Dockerfile
@@ -136,7 +143,7 @@ jobs:
cache-to: type=gha,mode=max
- name: Deploy Tag
if: (github.event_name == 'release' && github.event.action == 'released')
uses: docker/build-push-action@v4
uses: docker/build-push-action@v5
with:
context: .
file: docker/web/Dockerfile
@@ -154,7 +161,7 @@ jobs:
needs: [docker-deploy]
if: github.ref == 'refs/heads/master' || github.ref == 'refs/heads/develop'
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Deploy Staging
if: github.ref == 'refs/heads/master'
run: bash scripts/autodeploy.sh
36 changes: 29 additions & 7 deletions README.md
@@ -20,7 +20,7 @@ a transaction that is pending to be sent to the blockchain.

## Index of contents

- [Docs](https://docs.gnosis-safe.io/backend/service-architecture)
- [Docs](https://docs.safe.global/safe-core-api/service-architecture)
- [Deploying the service](https://github.com/safe-global/safe-infrastructure)

## Setup for development
@@ -142,11 +142,27 @@ docker exec -it safe-transaction-service-web-1 python manage.py createsuperuser
- [v1.3.0 L2](https://github.com/safe-global/safe-deployments/blob/main/src/assets/v1.3.0/gnosis_safe_l2.json)
- [Other related contracts and previous Safe versions](https://github.com/safe-global/safe-deployments/blob/main/src/assets)

## Troubleshooting
## Service maintenance

### Issues installing grpc on a Mac M1
The service can run into some issues when running in production:

If you face issues installing the `grpc` dependency locally (required by this project) on a M1 chip, set `GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1` and `GRPC_PYTHON_BUILD_SYSTEM_ZLIB=1` and then try to install the dependency again.
### Indexing issues
You can tell there are indexing issues if:
- Executed transactions are missing from the API (`all-transactions`, `multisig-transactions`, `module-transactions`... endpoints). If you use the [Safe{Wallet} Web client](https://github.com/safe-global/safe-wallet-web), also check the current state of the Safe Client Gateway cache, as it might be serving outdated data.
- Asset transfers (ERC20/721) are missing from `all-transactions` or `transfers` endpoints.
- You see error logs such as "Cannot remove owner" or similar consistency errors when `worker-indexer` is processing decoded data.

There are several ways to address this. Connect to either a `web` or a `worker` instance; running the commands inside `tmux`
(installed by default) is recommended (a Django-shell variant is sketched after this list):
- `python manage.py check_index_problems`: tries to automatically fix missing transactions.
  Token-related transactions (ERC20/721) are not fixed by this method, and it can take a while, as it compares
  database data with blockchain data for every Safe.
- `python manage.py reindex_master_copies --from-block-number X --addresses 0x111 0x222`: if you know the first problematic block,
  triggering a manual reindex is faster. The `--addresses` argument is optional, but if you know the problematic Safes,
  providing them makes reindexing **way** faster, as only those Safes are reindexed (instead of the entire collection).

If you see ERC20/ERC721 transfers missing:
- `python manage.py reindex_erc20 --from-block-number X --addresses 0x111 0x222`: same logic as with `reindex_master_copies`.
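
For scripting several runs, the same commands can also be invoked from a Django shell (`python manage.py shell`) on the instance. A rough sketch, assuming the option names mirror the CLI flags above and using placeholder block numbers and addresses:

```python
from django.core.management import call_command

# Detect and try to fix missing transactions (token transfers are not covered)
call_command("check_index_problems")

# Reindex master copies from a known-good block; addresses are optional but
# restrict the work to the listed Safes (placeholder values shown here)
call_command(
    "reindex_master_copies",
    from_block_number=17_000_000,  # replace with the first problematic block
    addresses=["0x1111111111111111111111111111111111111111"],
)

# Same idea for missing ERC20/721 transfers
call_command(
    "reindex_erc20",
    from_block_number=17_000_000,
    addresses=["0x1111111111111111111111111111111111111111"],
)
```
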

## FAQ
### Why does the `/v1/safes/{address}` endpoint show a nonce indicating that a transaction was executed, while the transaction is not shown or marked as executed in the other endpoints?
@@ -162,19 +178,25 @@

### If I add my chain to [safe-eth-py](https://github.com/safe-global/safe-eth-py/blob/master/gnosis/safe/addresses.py) will you support it?
No, for a chain to be supported we need to set up dedicated infrastructure for that network
and [have a proper RPC](https://docs.safe.global/learn/infrastructure/rpc-requirements)
and [have a proper RPC](https://docs.safe.global/safe-core-api/rpc-requirements)

### How can I interact with the service?
Aside from using standard HTTP requests:
- [Safe API Kit](https://github.com/safe-global/safe-core-sdk/tree/main/packages/safe-service-client)
- [Safe{Core} API Kit](https://github.com/safe-global/safe-core-sdk/tree/main/packages/api-kit)
- [Safe-eth-py](https://github.com/safe-global/safe-eth-py)
- [Safe CLI](https://github.com/5afe/safe-cli): It has a `tx-service` mode to gather offchain signatures.

### What chains do you officially support?
https://docs.safe.global/learn/safe-core/safe-core-api/available-services
https://docs.safe.global/safe-core-api/available-services

### What does the `banned` field in the `SafeContract` model mean?
The `banned` field in the `SafeContract` model is used to prevent indexing of certain Safes that have an unsupported `MasterCopy` or unverified proxies that cause issues during indexing. Banning does not delete the Safe, and indexing can be resumed once the issue has been resolved.

## Troubleshooting

### Issues installing grpc on a Mac M1

If you face issues installing the `grpc` dependency locally (required by this project) on an M1 chip, set `GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1` and `GRPC_PYTHON_BUILD_SYSTEM_ZLIB=1` and then try to install the dependency again.

## Contributors
[See contributors](https://github.com/safe-global/safe-transaction-service/graphs/contributors)
8 changes: 8 additions & 0 deletions config/gunicorn.py
@@ -0,0 +1,8 @@
"""
Store gunicorn variables in this file, so they can be read by Django
"""
import os

gunicorn_request_timeout = os.environ.get("WEB_WORKER_TIMEOUT", 60)
gunicorn_worker_connections = os.environ.get("WEB_WORKER_CONNECTIONS", 1000)
gunicorn_workers = os.environ.get("WEB_CONCURRENCY", 2)
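
The Django settings below re-export these values as `GUNICORN_*` constants; presumably a gunicorn configuration file reads the same environment variables. A minimal sketch of what such a `gunicorn.conf.py` could look like (an assumption, not part of this commit; note the explicit `int()` casts, since environment values arrive as strings):

```python
# gunicorn.conf.py (hypothetical) - mirrors the env-driven values above
import os

bind = "0.0.0.0:8888"                                                     # assumed listen address
timeout = int(os.environ.get("WEB_WORKER_TIMEOUT", 60))                   # per-request timeout, seconds
worker_connections = int(os.environ.get("WEB_WORKER_CONNECTIONS", 1000))  # only relevant for async workers (e.g. gevent)
workers = int(os.environ.get("WEB_CONCURRENCY", 2))
```

It would typically be started with something like `gunicorn -c gunicorn.conf.py config.wsgi` (module path assumed).
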
60 changes: 51 additions & 9 deletions config/settings/base.py
@@ -7,6 +7,12 @@
import environ
from corsheaders.defaults import default_headers as default_cors_headers

from ..gunicorn import (
gunicorn_request_timeout,
gunicorn_worker_connections,
gunicorn_workers,
)

ROOT_DIR = Path(__file__).resolve(strict=True).parent.parent.parent
APPS_DIR = ROOT_DIR / "safe_transaction_service"

@@ -47,9 +53,15 @@
# Enable analytics endpoints
ENABLE_ANALYTICS = env("ENABLE_ANALYTICS", default=False)

# GUNICORN
GUNICORN_REQUEST_TIMEOUT = gunicorn_request_timeout
GUNICORN_WORKER_CONNECTIONS = gunicorn_worker_connections
GUNICORN_WORKERS = gunicorn_workers

# DATABASES
# ------------------------------------------------------------------------------
# https://docs.djangoproject.com/en/dev/ref/settings/#databases
DB_STATEMENT_TIMEOUT = env.int("DB_STATEMENT_TIMEOUT", 60_000)
DATABASES = {
"default": env.db("DATABASE_URL"),
}
@@ -61,6 +73,7 @@
# https://github.com/jneight/django-db-geventpool#settings
"MAX_CONNS": DB_MAX_CONNS,
"REUSE_CONNS": env.int("DB_REUSE_CONNS", default=DB_MAX_CONNS),
"options": f"-c statement_timeout={DB_STATEMENT_TIMEOUT}",
}

DEFAULT_AUTO_FIELD = "django.db.models.BigAutoField"
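
The new `options` entry passes `-c statement_timeout=60000` (milliseconds) to every PostgreSQL connection, so a single query running longer than one minute is cancelled server-side. To confirm the setting is applied from a Django shell using this settings module (a quick sketch, not part of this commit):

```python
from django.db import connection

with connection.cursor() as cursor:
    cursor.execute("SHOW statement_timeout;")
    print(cursor.fetchone())  # expected to print something like ('1min',) for the 60_000 ms default
```
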
@@ -99,6 +112,7 @@
"safe_transaction_service.notifications.apps.NotificationsConfig",
"safe_transaction_service.safe_messages.apps.SafeMessagesConfig",
"safe_transaction_service.tokens.apps.TokensConfig",
"safe_transaction_service.events.apps.EventsConfig",
]
# https://docs.djangoproject.com/en/dev/ref/settings/#installed-apps
INSTALLED_APPS = DJANGO_APPS + THIRD_PARTY_APPS + LOCAL_APPS
@@ -209,8 +223,18 @@
CELERY_BROKER_URL = env("CELERY_BROKER_URL", default="django://")
# https://docs.celeryproject.org/en/stable/userguide/optimizing.html#broker-connection-pools
# https://docs.celeryq.dev/en/latest/userguide/optimizing.html#broker-connection-pools
CELERY_BROKER_POOL_LIMIT = env(
"CELERY_BROKER_POOL_LIMIT", default=env("CELERYD_CONCURRENCY", default=1000)
# Configured to 0 due to connection issues https://github.com/celery/celery/issues/4355
CELERY_BROKER_POOL_LIMIT = env.int("CELERY_BROKER_POOL_LIMIT", default=0)
# https://docs.celeryq.dev/en/stable/userguide/configuration.html#broker-heartbeat
CELERY_BROKER_HEARTBEAT = env.int("CELERY_BROKER_HEARTBEAT", default=0)

# https://docs.celeryq.dev/en/stable/userguide/configuration.html#std-setting-broker_connection_max_retries
CELERY_BROKER_CONNECTION_MAX_RETRIES = env.int(
"CELERY_BROKER_CONNECTION_MAX_RETRIES", default=0
)
# https://docs.celeryq.dev/en/stable/userguide/configuration.html#broker-channel-error-retry
CELERY_BROKER_CHANNEL_ERROR_RETRY = env.bool(
"CELERY_BROKER_CHANNEL_ERROR_RETRY", default=True
)
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#std:setting-result_backend
CELERY_RESULT_BACKEND = env("CELERY_RESULT_BACKEND", default="redis://")
@@ -231,6 +255,7 @@
CELERY_TASK_QUEUE_MAX_PRIORITY = 10
# https://docs.celeryproject.org/en/latest/userguide/configuration.html#broker-transport-options
CELERY_BROKER_TRANSPORT_OPTIONS = {}

# https://docs.celeryq.dev/en/stable/userguide/configuration.html#std-setting-task_routes
CELERY_ROUTES = (
[
@@ -242,6 +267,10 @@
"safe_transaction_service.history.tasks.send_webhook_task",
{"queue": "webhooks", "delivery_mode": "transient"},
),
(
"safe_transaction_service.events.tasks.send_event_to_queue_task",
{"queue": "webhooks", "delivery_mode": "transient"},
),
(
"safe_transaction_service.history.tasks.reindex_mastercopies_last_hours_task",
{"queue": "indexing"},
@@ -408,7 +437,7 @@
"ETH_INTERNAL_TXS_BLOCK_PROCESS_LIMIT", default=10_000
)
ETH_INTERNAL_TXS_BLOCKS_TO_REINDEX_AGAIN = env.int(
"ETH_INTERNAL_TXS_BLOCKS_TO_REINDEX_AGAIN", default=6
"ETH_INTERNAL_TXS_BLOCKS_TO_REINDEX_AGAIN", default=10
)
ETH_INTERNAL_TXS_NUMBER_TRACE_BLOCKS = env.int(
"ETH_INTERNAL_TXS_NUMBER_TRACE_BLOCKS", default=10
@@ -435,7 +464,7 @@
"ETH_EVENTS_BLOCK_PROCESS_LIMIT_MAX", default=0
) # Maximum number of blocks to process together when searching for events. 0 == no limit.
ETH_EVENTS_BLOCKS_TO_REINDEX_AGAIN = env.int(
"ETH_EVENTS_BLOCKS_TO_REINDEX_AGAIN", default=10
"ETH_EVENTS_BLOCKS_TO_REINDEX_AGAIN", default=20
) # Blocks to reindex again every indexer run when service is synced. Useful for RPCs not reliable
ETH_EVENTS_GET_LOGS_CONCURRENCY = env.int(
"ETH_EVENTS_GET_LOGS_CONCURRENCY", default=20
@@ -446,8 +475,11 @@
ETH_EVENTS_UPDATED_BLOCK_BEHIND = env.int(
"ETH_EVENTS_UPDATED_BLOCK_BEHIND", default=24 * 60 * 60 // 15
) # Number of blocks to consider an address 'almost updated'.
ETH_REORG_BLOCKS_BATCH = env.int(
"ETH_REORG_BLOCKS_BATCH", default=250
) # Number of blocks to be checked in the same batch for reorgs
ETH_REORG_BLOCKS = env.int(
"ETH_REORG_BLOCKS", default=100 if ETH_L2_NETWORK else 10
"ETH_REORG_BLOCKS", default=200 if ETH_L2_NETWORK else 10
) # Number of blocks from the current block number needed to consider a block valid/stable

# Tokens
@@ -464,6 +496,10 @@
"TOKENS_ERC20_GET_BALANCES_BATCH", default=2_000
) # Number of tokens to get balances from in the same request. From 2_500 some nodes raise HTTP 413

TOKEN_ETH_PRICE_TTL = env.int(
"TOKEN_ETH_PRICE_TTL", default=60 * 30 # 30 minutes
) # Expiration time for token eth price

# Notifications
# ------------------------------------------------------------------------------
SLACK_API_WEBHOOK = env("SLACK_API_WEBHOOK", default=None)
@@ -485,10 +521,16 @@
)
)

ALERT_OUT_OF_SYNC_EVENTS_THRESHOLD = env.float(
"ALERT_OUT_OF_SYNC_EVENTS_THRESHOLD", default=0.1
) # Percentage of Safes allowed to be out of sync without alerting. By default 10%

# Events
# ------------------------------------------------------------------------------
EVENTS_QUEUE_URL = env("EVENTS_QUEUE_URL", default=None)
EVENTS_QUEUE_ASYNC_CONNECTION = env("EVENTS_QUEUE_ASYNC_CONNECTION", default=False)
EVENTS_QUEUE_EXCHANGE_NAME = env("EVENTS_QUEUE_EXCHANGE_NAME", default="amq.fanout")

# Cache
CACHE_ALL_TXS_VIEW = env.int(
"CACHE_ALL_TXS_VIEW", default=10 * 60
) # 10 minutes. 0 is disabled

# AWS S3 https://github.com/etianen/django-s3-storage
# ------------------------------------------------------------------------------
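
On the new `EVENTS_QUEUE_*` settings above: `amq.fanout` is RabbitMQ's built-in fanout exchange, so every queue bound to it receives a copy of each published event. A throwaway consumer for local inspection could look roughly like this (an illustration, assuming `pika`; the service's own publisher lives in the new `events` app):

```python
import pika

# Bind an exclusive, broker-named queue to the fanout exchange and print incoming events
connection = pika.BlockingConnection(
    pika.URLParameters("amqp://guest:guest@localhost:5672/")  # EVENTS_QUEUE_URL
)
channel = connection.channel()
result = channel.queue_declare(queue="", exclusive=True)  # auto-deleted when the consumer disconnects
channel.queue_bind(exchange="amq.fanout", queue=result.method.queue)

def on_event(ch, method, properties, body):
    print("event:", body.decode())

channel.basic_consume(queue=result.method.queue, on_message_callback=on_event, auto_ack=True)
channel.start_consuming()
```
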
2 changes: 2 additions & 0 deletions config/settings/test.py
@@ -47,3 +47,5 @@
"level": "DEBUG",
}
}

EVENTS_QUEUE_ASYNC_CONNECTION = False
1 change: 0 additions & 1 deletion config/urls.py
@@ -14,7 +14,6 @@
title="Safe Transaction Service API",
default_version="v1",
description="API to keep track of transactions sent via Gnosis Safe smart contracts",
contact=openapi.Contact(email="[email protected]"),
license=openapi.License(name="MIT License"),
),
url=settings.BACKEND_URL,
4 changes: 3 additions & 1 deletion docker-compose.yml
@@ -24,9 +24,11 @@ services:

rabbitmq:
image: rabbitmq:alpine
ports:
- "5672:5672"

db:
image: postgres:13-alpine
image: postgres:14-alpine
ports:
- "5432:5432"
environment:
4 changes: 3 additions & 1 deletion docker/web/celery/scheduler/run.sh
@@ -13,4 +13,6 @@ fi
sleep 10

echo "==> $(date +%H:%M:%S) ==> Running Celery beat <=="
exec celery -C -A config.celery_app beat -S django_celery_beat.schedulers:DatabaseScheduler --loglevel $log_level
exec celery -C -A config.celery_app beat \
-S django_celery_beat.schedulers:DatabaseScheduler \
--loglevel $log_level
3 changes: 2 additions & 1 deletion docker/web/celery/worker/run.sh
@@ -33,4 +33,5 @@ exec celery -C -A config.celery_app worker \
--concurrency=${TASK_CONCURRENCY} \
--max-memory-per-child=${MAX_MEMORY_PER_CHILD} \
--max-tasks-per-child=${MAX_TASKS_PER_CHILD} \
-Q "$WORKER_QUEUES"
--without-heartbeat --without-gossip \
--without-mingle -Q "$WORKER_QUEUES"