Merge pull request #1902 from LiskHQ/1900-lisk-service-performance-issues

Lisk service performance issues
sameersubudhi authored Oct 30, 2023
2 parents 04d0de9 + 3898bbc commit 5bdcfb1
Showing 77 changed files with 566 additions and 517 deletions.
3 changes: 2 additions & 1 deletion .github/workflows/semgrep.yml
@@ -21,7 +21,8 @@ jobs:

container:
# A Docker image with Semgrep installed. Do not change this.
image: returntocorp/semgrep
# TODO: Remove version pinning after https://github.com/returntocorp/semgrep/issues/9091 is resolved
image: returntocorp/semgrep:1.45.0

# Skip any PR created by dependabot to avoid permission issues:
if: (github.actor != 'dependabot[bot]')
4 changes: 3 additions & 1 deletion Makefile
@@ -70,7 +70,9 @@ logs-live-%:
print-config:
$(compose) config

build: build-app-registry build-connector build-indexer build-coordinator build-statistics build-fees build-market build-export build-gateway
build: build-local build-images

build-images: build-app-registry build-connector build-indexer build-coordinator build-statistics build-fees build-market build-export build-gateway

build-all: build build-template build-tests

2 changes: 1 addition & 1 deletion README.md
@@ -95,7 +95,7 @@ cd lisk-service
If you wish to build the local version of Lisk Service execute the following command below:

```bash
make build
make build-images
```

> This step is only necessary if you wish to build a custom or pre-release version of Lisk Service that does not have a pre-built Docker image published on the Docker Hub. The installation script chooses the last available stable version on Docker Hub, **unless** there is no local image. If you are unsure about any local builds, use the `make clean` command to remove all locally built docker images.
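
For example, a typical flow when switching between locally built and published images might look like the following sketch, assuming the Makefile targets behave as documented above:

```bash
# Remove locally built Docker images so the installation script falls back
# to the latest stable images from Docker Hub:
make clean

# Or, for a custom/pre-release version, rebuild the images locally:
make build-images
```
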
2 changes: 2 additions & 0 deletions docker-compose.yml
@@ -196,12 +196,14 @@ services:
- SERVICE_LOG_FILE=${SERVICE_LOG_FILE}
- SERVICE_LOG_LEVEL=${SERVICE_LOG_LEVEL}
- ENABLE_APPLY_SNAPSHOT=${ENABLE_APPLY_SNAPSHOT}
- DURABILITY_VERIFY_FREQUENCY=${DURABILITY_VERIFY_FREQUENCY}
- INDEX_SNAPSHOT_URL=${INDEX_SNAPSHOT_URL}
- ENABLE_SNAPSHOT_ALLOW_INSECURE_HTTP=${ENABLE_SNAPSHOT_ALLOW_INSECURE_HTTP}
- DOCKER_HOST=${DOCKER_HOST}
- MAINCHAIN_SERVICE_URL=${MAINCHAIN_SERVICE_URL}
- LISK_STATIC=${LISK_STATIC}
- DEVNET_MAINCHAIN_URL=${DEVNET_MAINCHAIN_URL}
- ACCOUNT_BALANCE_UPDATE_BATCH_SIZE=${ACCOUNT_BALANCE_UPDATE_BATCH_SIZE}
- JOB_INTERVAL_DELETE_SERIALIZED_EVENTS=${JOB_INTERVAL_DELETE_SERIALIZED_EVENTS}
- JOB_SCHEDULE_DELETE_SERIALIZED_EVENTS=${JOB_SCHEDULE_DELETE_SERIALIZED_EVENTS}
- JOB_INTERVAL_REFRESH_VALIDATORS=${JOB_INTERVAL_REFRESH_VALIDATORS}
2 changes: 2 additions & 0 deletions docker/example.env
@@ -55,8 +55,10 @@ ENABLE_PERSIST_EVENTS=false
# DEVNET_MAINCHAIN_URL='http://devnet-service.liskdev.net:9901'
# ESTIMATES_BUFFER_BYTES_LENGTH=0
# ENABLE_APPLY_SNAPSHOT=false
# DURABILITY_VERIFY_FREQUENCY=20
# INDEX_SNAPSHOT_URL= ''
# ENABLE_SNAPSHOT_ALLOW_INSECURE_HTTP=false
# ACCOUNT_BALANCE_UPDATE_BATCH_SIZE=1000

# Moleculer jobs configuration
# JOB_INTERVAL_DELETE_SERIALIZED_EVENTS=0
19 changes: 16 additions & 3 deletions docs/antora/modules/ROOT/pages/configuration/index.adoc
@@ -391,9 +391,9 @@ Only to be used when the genesis block is large enough to be transmitted over AP
| `ENABLE_BLOCK_CACHING`
| boolean
| Boolean flag to enable block caching.
Enabled by default.
To disable it, set it to `false`.
| true
Disabled by default.
To enable it, set it to `true`.
| false

| `EXPIRY_IN_HOURS`
| number
@@ -698,6 +698,12 @@ By default, it is set to run every 15 minutes.
By default, it is set to `0`.
| 0

| `ACCOUNT_BALANCE_UPDATE_BATCH_SIZE`
| number
| Number of accounts for which the balance index is updated at a time.
By default, it is set to `1000`.
| 1000

| `MAINCHAIN_SERVICE_URL`
| string
| Mainchain service URL for custom deployments.
@@ -722,6 +728,11 @@ To accelerate the indexing process, the blockchain-indexer microservice also sup
|Enable or disable auto-apply snapshot feature. By default, the value is false.
|false

|`DURABILITY_VERIFY_FREQUENCY`
|number
|Frequency, in milliseconds, at which to verify whether a block has been successfully indexed or rolled back. By default, it is set to 20.
|20

|`INDEX_SNAPSHOT_URL`
|string
|Custom snapshot download URL (expected to end with sql.gz).
@@ -756,6 +767,7 @@ module.exports = {
ENABLE_INDEXING_MODE: 'true',
ENABLE_PERSIST_EVENTS: 'false',
// ENABLE_APPLY_SNAPSHOT: 'false',
// DURABILITY_VERIFY_FREQUENCY: 20,
// INDEX_SNAPSHOT_URL: '',
// ENABLE_SNAPSHOT_ALLOW_INSECURE_HTTP: 'true',
// SERVICE_INDEXER_MYSQL_READ_REPLICA: 'mysql://lisk:[email protected]:3306/lisk',
@@ -770,6 +782,7 @@ module.exports = {
// LISK_STATIC: 'https://static-data.lisk.com',
// DEVNET_MAINCHAIN_URL: 'http://devnet-service.liskdev.net:9901',
// ESTIMATES_BUFFER_BYTES_LENGTH: 0,
// ACCOUNT_BALANCE_UPDATE_BATCH_SIZE: 1000,
// JOB_INTERVAL_DELETE_SERIALIZED_EVENTS: 0,
// JOB_SCHEDULE_DELETE_SERIALIZED_EVENTS: '*/5 * * * *',
// JOB_INTERVAL_REFRESH_VALIDATORS: 0,
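
For Docker-based deployments, the two new indexer settings documented above would typically be set through the environment file; a sketch with the documented defaults follows.

```bash
# docker/example.env (documented defaults for the new blockchain-indexer settings)
DURABILITY_VERIFY_FREQUENCY=20            # ms between block indexing/rollback durability checks
ACCOUNT_BALANCE_UPDATE_BATCH_SIZE=1000    # accounts per balance-index update batch
```
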
2 changes: 1 addition & 1 deletion docs/antora/modules/ROOT/pages/index.adoc
@@ -220,7 +220,7 @@ The Gateway service provides the following APIs, which all users of Lisk Service

// [source,bash]
// ----
// make build
// make build-images
// ----
// Please note, this step is only necessary if you wish to build a custom or pre-release version of Lisk Service that does not have a pre-built Docker image published on the Docker Hub.
// The installation script chooses the last available stable version on the Docker Hub, *unless* there is no local image.
4 changes: 2 additions & 2 deletions docs/antora/modules/ROOT/pages/management/docker.adoc
@@ -18,7 +18,7 @@ Mona Bärenfänger <[email protected]>

.Inside the lisk-service root folder
----
make build
make build-images
----
This creates the necessary Docker images to start Lisk Service in the containers.

@@ -348,7 +348,7 @@ git checkout v0.7.0
. Build the required updated Docker images
+
----
make build
make build-images
----
. Start Lisk Service in the containers
4 changes: 2 additions & 2 deletions docs/antora/modules/ROOT/pages/setup/docker.adoc
@@ -172,12 +172,12 @@ To build Lisk Service from local files, first, navigate to the `lisk-service` re

.Working directory: lisk-service/
----
make build
make build-images
----

Lisk Service is now ready to use on your machine.

TIP: If you skipped the step to configure Docker to run without `*sudo*` rights, you need to prepend `*sudo*` with aforementioned command: `*sudo make build*`
TIP: If you skipped the step to configure Docker to run without `*sudo*` rights, you need to prefix the aforementioned command with `*sudo*`: `*sudo make build-images*`

== Connecting Lisk Service to a blockchain node

2 changes: 2 additions & 0 deletions docs/api/version3.md
@@ -169,6 +169,7 @@ _Supports pagination._
"transactionRoot": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
"assetsRoot": "6e904b2f678eb3b6c3042acb188a607d903d441d61508d047fe36b3c982995c8",
"stateRoot": "95d9b1773b78034b8df9ac741c903b881da761d8ba002a939de28a4b86982c04",
"eventRoot": "7dee8ae1899582aabb0c4b967ceda6874329dba57b5eb23d7c62890917a55cbd",
"maxHeightGenerated": 559421,
"maxHeightPrevoted": 559434,
"validatorsHash": "ad0076aa444f6cda608bb163c3bd77d9bf172f1d2803d53095bc0f277db6bcb3",
@@ -5826,6 +5827,7 @@ Proxy request to directly invoke application endpoint. Returns endpoint response
"transactionRoot": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
"assetsRoot": "6e904b2f678eb3b6c3042acb188a607d903d441d61508d047fe36b3c982995c8",
"stateRoot": "95d9b1773b78034b8df9ac741c903b881da761d8ba002a939de28a4b86982c04",
"eventRoot": "7dee8ae1899582aabb0c4b967ceda6874329dba57b5eb23d7c62890917a55cbd",
"maxHeightGenerated": 559421,
"maxHeightPrevoted": 559434,
"validatorsHash": "ad0076aa444f6cda608bb163c3bd77d9bf172f1d2803d53095bc0f277db6bcb3",
2 changes: 2 additions & 0 deletions docs/api/websocket_subscribe_api.md
@@ -96,6 +96,7 @@ Updates about a newly generated block.
"transactionRoot": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
"assetsRoot": "6e904b2f678eb3b6c3042acb188a607d903d441d61508d047fe36b3c982995c8",
"stateRoot": "95d9b1773b78034b8df9ac741c903b881da761d8ba002a939de28a4b86982c04",
"eventRoot": "7dee8ae1899582aabb0c4b967ceda6874329dba57b5eb23d7c62890917a55cbd",
"maxHeightGenerated": 559421,
"maxHeightPrevoted": 559434,
"validatorsHash": "ad0076aa444f6cda608bb163c3bd77d9bf172f1d2803d53095bc0f277db6bcb3",
@@ -146,6 +147,7 @@ Updates about a deleted block. This usually happens when the chain switches fork
"transactionRoot": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
"assetsRoot": "6e904b2f678eb3b6c3042acb188a607d903d441d61508d047fe36b3c982995c8",
"stateRoot": "95d9b1773b78034b8df9ac741c903b881da761d8ba002a939de28a4b86982c04",
"eventRoot": "7dee8ae1899582aabb0c4b967ceda6874329dba57b5eb23d7c62890917a55cbd",
"maxHeightGenerated": 559421,
"maxHeightPrevoted": 559434,
"validatorsHash": "ad0076aa444f6cda608bb163c3bd77d9bf172f1d2803d53095bc0f277db6bcb3",
2 changes: 1 addition & 1 deletion docs/build_from_source.md
@@ -185,4 +185,4 @@ yarn run test

It is now possible to use your preferred editor to make changes in the source files. Take a look at the [template](../services/template) project in order to find some suitable examples.

Once completed, it is also possible to build Docker images with `make build` and run them using the method from the main [README](../README.md).
Once completed, it is also possible to build Docker images with `make build-images` and run them using the method from the main [README](../README.md).
2 changes: 1 addition & 1 deletion docs/prerequisites_docker_debian.md
@@ -34,6 +34,6 @@ Follow the official [documentation](https://docs.docker.com/compose/install/) to

## Next steps

If you have all dependencies installed properly, it is possible to run pre-built Docker images with Lisk Service. It is also possible to build those images locally by using the `make build` command.
If you have all dependencies installed properly, it is possible to run pre-built Docker images with Lisk Service. It is also possible to build those images locally by using the `make build-images` command.

Refer to the main [README](../README.md) file regarding the next steps.
2 changes: 1 addition & 1 deletion docs/prerequisites_docker_macos.md
@@ -39,6 +39,6 @@ The most recent version of Docker Desktop already contains docker-compose tool.

## Next steps

If you have all dependencies installed properly, it is possible to run pre-built Docker images with Lisk Service. It is also possible to build those images locally by using the `make build` command.
If you have all dependencies installed properly, it is possible to run pre-built Docker images with Lisk Service. It is also possible to build those images locally by using the `make build-images` command.

Refer to the main [README](../README.md) file regarding the next steps.
2 changes: 1 addition & 1 deletion docs/prerequisites_docker_ubuntu.md
@@ -34,6 +34,6 @@ Please follow the official [documentation](https://docs.docker.com/compose/insta

## Next steps

When all of the dependencies are correctly installed, it will then be possible to run pre-built Docker images with Lisk Service. It is also possible to build those images locally by executing the `make build` command from within the `lisk-service` directory.
When all of the dependencies are correctly installed, it will then be possible to run pre-built Docker images with Lisk Service. It is also possible to build those images locally by executing the `make build-images` command from within the `lisk-service` directory.

Refer to the main [README](../README.md) file regarding the next steps.
2 changes: 2 additions & 0 deletions ecosystem.config.js
@@ -156,6 +156,7 @@ module.exports = {
ENABLE_INDEXING_MODE: true,
ENABLE_PERSIST_EVENTS: false,
// ENABLE_APPLY_SNAPSHOT: false,
// DURABILITY_VERIFY_FREQUENCY: 20,
// INDEX_SNAPSHOT_URL: '',
// ENABLE_SNAPSHOT_ALLOW_INSECURE_HTTP: true,
// SERVICE_INDEXER_MYSQL_READ_REPLICA: 'mysql://lisk:[email protected]:3306/lisk',
@@ -170,6 +171,7 @@ module.exports = {
// LISK_STATIC: 'https://static-data.lisk.com',
// DEVNET_MAINCHAIN_URL: 'http://devnet-service.liskdev.net:9901',
// ESTIMATES_BUFFER_BYTES_LENGTH: 0,
// ACCOUNT_BALANCE_UPDATE_BATCH_SIZE: 1000,
// JOB_INTERVAL_DELETE_SERIALIZED_EVENTS: 0,
// JOB_SCHEDULE_DELETE_SERIALIZED_EVENTS: '*/5 * * * *',
// JOB_INTERVAL_REFRESH_VALIDATORS: 0,
Binary file added framework/dist/lisk-service-framework-1.6.4.tgz
Binary file not shown.
Binary file added framework/dist/lisk-service-framework-1.6.5.tgz
Binary file not shown.
2 changes: 1 addition & 1 deletion framework/index.js
@@ -33,12 +33,12 @@ module.exports = {
sqlite3: require('./src/database/sqlite3'),
},
Utils: {
delay: require('./src/delay'),
requireAllJs: require('./src/requireAllJs'),
waitForIt: require('./src/waitForIt'),
Data: require('./src/data'),
fs: require('./src/fs'),
...require('./src/data'),
delay: require('./src/delay'),
},
Constants: {
...require('./constants/ErrorCodes'),
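
A hedged consumer-side sketch of the reshuffled `Utils` exports follows. The data-helper name is illustrative only; the point is that the helpers from `./src/data` are now spread directly onto `Utils` rather than nested under a `Utils.Data` namespace.

```javascript
// Sketch only: assumes the package is consumed as `lisk-service-framework`
// (per framework/package.json) and that `delay(ms)` resolves after the given
// number of milliseconds. `isObject` is an invented helper name.
const { Utils } = require('lisk-service-framework');

const run = async () => {
  await Utils.delay(100); // still exposed directly on Utils

  // Data helpers are now spread onto Utils itself:
  //   Utils.isObject(someValue);
  // instead of being reached through the former nested namespace:
  //   Utils.Data.isObject(someValue);
};

run();
```
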
2 changes: 1 addition & 1 deletion framework/package.json
@@ -1,6 +1,6 @@
{
"name": "lisk-service-framework",
"version": "1.6.3",
"version": "1.6.5",
"description": "Lisk Service Framework",
"keywords": [
"lisk",
14 changes: 7 additions & 7 deletions framework/src/database/sqlite3.js
@@ -70,24 +70,24 @@ const createDBConnection = async (dbDataDir, tableName) => {
return knex;
};

const getDBConnection = async (tableName, dbDataDir = DB_DATA_DIR) => {
if (!connectionPool[dbDataDir]) {
const getDBConnection = async (tableName, dbDataDir) => {
if (!connectionPool[tableName]) {
if (!(await exists(dbDataDir))) {
await mkdir(dbDataDir, { recursive: true });
}
connectionPool[dbDataDir] = await createDBConnection(dbDataDir, tableName);
connectionPool[tableName] = await createDBConnection(dbDataDir, tableName);
}

const knex = connectionPool[dbDataDir];
const knex = connectionPool[tableName];
return knex;
};

const createTableIfNotExists = async (tableConfig, dbDataDir = DB_DATA_DIR) => {
const createTableIfNotExists = async tableConfig => {
const { tableName } = tableConfig;

if (!tablePool[tableName]) {
logger.info(`Creating schema for ${tableName}`);
const knex = await getDBConnection(tableName, dbDataDir);
const knex = await getDBConnection(tableName);
await util.loadSchema(knex, tableName, tableConfig);
tablePool[tableName] = true;
}
@@ -100,7 +100,7 @@ const getTableInstance = async (tableConfig, dbDataDir = DB_DATA_DIR) => {

const createDefaultTransaction = async connection => util.startDBTransaction(connection);

await createTableIfNotExists(tableConfig, dbDataDir);
await createTableIfNotExists(tableConfig);

const dbOperations = util.getTableInstance(tableConfig, knex);

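
A rough caller-side sketch of the reworked pooling follows. The table definition is invented, the import path is an assumption, and the `upsert` helper is assumed to match the operations exposed elsewhere in the framework; the relevant point is that connections are now pooled per table name rather than per data directory.

```javascript
// Illustrative only: schema and values are made up.
const sqlite3 = require('lisk-service-framework/src/database/sqlite3'); // assumed path

const keyValueTableSchema = {
  tableName: 'key_value_store', // the pool entry is now registered under this name
  primaryKey: 'key',
  schema: {
    key: { type: 'string' },
    value: { type: 'text' },
  },
};

const run = async () => {
  // First call creates the connection and loads the schema for this table.
  const keyValueTable = await sqlite3.getTableInstance(keyValueTableSchema);

  // Later calls with the same tableName reuse the pooled connection.
  await keyValueTable.upsert([{ key: 'lastIndexedHeight', value: '559434' }]); // assumed helper
};

run();
```
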
2 changes: 1 addition & 1 deletion framework/src/queue.js
@@ -67,7 +67,7 @@ const queueInstance = (

queue.on('failed', (job, err) => {
logger.warn(`${job.name} job failed`, err.message);
logger.warn(`${job.name} job failed`, err.stack);
logger.debug(`${job.name} job failed`, err.stack);
});

setInterval(async () => {
7 changes: 5 additions & 2 deletions framework/src/waitForIt.js
@@ -17,14 +17,17 @@ const Logger = require('./logger').get;

const logger = Logger();

const waitForIt = (fn, intervalMs = 1000) =>
const waitForIt = (fn, intervalMs = 1000, resolveUndefined = false) =>
// eslint-disable-next-line implicit-arrow-linebreak
new Promise(resolve => {
// eslint-disable-next-line consistent-return
const timeout = setInterval(async () => {
try {
const result = await fn();
clearInterval(timeout);
resolve(result);
if (resolveUndefined || result !== undefined) {
return resolve(result);
}
} catch (err) {
logger.debug(`Waiting ${intervalMs}...`);
}
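
A small sketch of the adjusted polling semantics follows. The polled function is simulated, and the import relies on the `Utils.waitForIt` export shown in `framework/index.js` above; the behaviour described matches the added `resolveUndefined || result !== undefined` condition.

```javascript
// Sketch only: demonstrates the new default (keep polling until fn() returns a
// defined value) and the resolveUndefined flag that restores the old behaviour.
const { Utils } = require('lisk-service-framework');
const { waitForIt } = Utils;

let attempts = 0;

// Simulated readiness check: undefined for the first two calls, then a value.
const checkReady = async () => {
  attempts += 1;
  return attempts < 3 ? undefined : { ready: true, attempts };
};

const run = async () => {
  // New default: resolves only once checkReady() returns a defined value.
  const status = await waitForIt(checkReady, 200);
  console.log(status); // { ready: true, attempts: 3 }

  // With resolveUndefined = true, the promise resolves on the first call that
  // does not throw, even if it returned undefined (the previous behaviour).
  await waitForIt(async () => undefined, 200, true);
};

run();
```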