
Caspian Migration #48

Merged 15 commits on Feb 13, 2024
39 changes: 17 additions & 22 deletions CONTRIBUTING.md
@@ -30,43 +30,37 @@ Environment variables:

- `XRP_AMOUNT`: The number of XRP to fund new accounts with. On the Testnet operated by Ripple, the current funding amount is 1,000 Testnet XRP.

## Google BigQuery Integration
## Caspian Integration

This application logs and analyzes data using Google BigQuery. To use this feature, you need to provide the necessary BigQuery credentials through environment variables.
This application logs and analyzes data using Caspian. To use this feature, you need to provide the necessary Caspian credentials through environment variables.

### Run the server with BigQuery (Optional):
### Run the server with Caspian (Optional):

Please replace `BIGQUERY_PROJECT_ID`, `BIGQUERY_CLIENT_EMAIL`, and `BIGQUERY_PRIVATE_KEY` with your actual project ID, client email, and private key.
Please replace `CASPIAN_ENDPOINT`, `CASPIAN_API_KEY`, `CASPIAN_PRODUCER_NAME`, `CASPIAN_ENTITY_NAME`, `CASPIAN_SCHEMA_TYPE`, and `CASPIAN_SCHEMA_VERSION` with your actual Caspian configuration values.

### BigQuery Environment Variables:
### Caspian Environment Variables:

- `BIGQUERY_PROJECT_ID`: The ID of your Google Cloud project.
- `BIGQUERY_CLIENT_EMAIL`: The email address of your service account.
- `BIGQUERY_PRIVATE_KEY`: The private key from your service account JSON key file. Be sure to include the full private key, including the header and footer.
- `CASPIAN_ENDPOINT`: The endpoint for your Caspian integration.
- `CASPIAN_API_KEY`: Your Caspian API key for authentication.
- `CASPIAN_PRODUCER_NAME`: The name of your data producer.
- `CASPIAN_ENTITY_NAME`: The entity name for logging purposes.
- `CASPIAN_SCHEMA_TYPE`: The schema type of your data.
- `CASPIAN_SCHEMA_VERSION`: The version of your data schema.

Remember to properly secure your environment variables, especially the `CASPIAN_API_KEY`, to prevent unauthorized access to your Caspian account.

In case you are running this application in a trusted environment (like Google Cloud Platform), you don't need to provide the `BIGQUERY_CLIENT_EMAIL` and `BIGQUERY_PRIVATE_KEY`. The application will use Application Default Credentials (ADC) provided by the environment.
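
As an illustration, here is a minimal sketch of a startup check for the Caspian variables listed above. The variable names come from the list, but the validation logic is hypothetical; the application actually loads its settings through `src/config.ts`.

```
// Hypothetical startup check -- the real application loads these values
// through src/config.ts; this only illustrates the expected variables.
const requiredVars = [
  "CASPIAN_ENDPOINT",
  "CASPIAN_API_KEY",
  "CASPIAN_PRODUCER_NAME",
  "CASPIAN_ENTITY_NAME",
  "CASPIAN_SCHEMA_TYPE",
  "CASPIAN_SCHEMA_VERSION",
];

const missing = requiredVars.filter((name) => !process.env[name]);
if (missing.length > 0) {
  // Fail fast so a misconfigured deployment doesn't silently skip logging.
  throw new Error(`Missing Caspian environment variables: ${missing.join(", ")}`);
}

// CASPIAN_SCHEMA_VERSION is numeric in src/config.ts, so coerce and verify it.
const schemaVersion = Number(process.env.CASPIAN_SCHEMA_VERSION);
if (Number.isNaN(schemaVersion)) {
  throw new Error("CASPIAN_SCHEMA_VERSION must be a number");
}
```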
### How to run tests on a standalone node
1. Creating a custom standalone rippled instance

Create a folder called `config`.

Create a config file like the one xrpl.js uses in its CI for making standalone rippled instances: https://github.com/XRPLF/xrpl.js/blob/main/.ci-config/rippled.cfg

If you want to change something, like increasing the network_id, you can search the example config for the field name, then add it anywhere. For example:

[network_id]
1234


In the command line, navigate to the directory just above the `config` folder.

Use the `config` folder to start a Docker container, using a command like the one in the xrpl.js tests. (The xrpl.js repository documents the original, unmodified command and explains each piece.)
@@ -85,8 +79,8 @@ You should now have a running docker container with your custom config!

(If you were trying the network_id change, you should see it show up in the docker logs on startup!)
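
To confirm the container is reachable, a quick connectivity check can help. This is a sketch: the `ws://localhost:6006` URL assumes the WebSocket port from the example rippled.cfg and a matching Docker port mapping, so adjust it to your setup.

```
import { Client } from "xrpl";

// Sketch: verify the standalone container answers on its WebSocket port.
// The URL is an assumption based on the example rippled.cfg; adjust it to
// match your config and Docker port mapping.
async function checkStandalone(): Promise<void> {
  const client = new Client("ws://localhost:6006");
  await client.connect();
  const response = await client.request({ command: "server_info" });
  console.log(response.result.info.build_version);
  await client.disconnect();
}

checkStandalone().catch(console.error);
```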

In `ticket-queue.ts` and `account.ts`, import `sendLedgerAccept()` and `delayedLedgerAccept()` from `utils.ts`:
```
async function sendLedgerAccept(client: Client): Promise<unknown> {
  return client.connection.request({ command: "ledger_accept" });
}

async function delayedLedgerAccept(): Promise<unknown> {
  // Body collapsed in the diff view; conceptually it waits briefly,
  // then sends a ledger_accept request.
}
```

Use `delayedLedgerAccept()` before `client.submitAndWait()` and `await sendLedgerAccept()` after `client.submit()` in order to close the ledger on a standalone node.
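
For example, here is a minimal sketch of that ordering. It assumes a connected `client` pointed at the standalone node and a funded `wallet`; the `payment` object itself is illustrative.

```
import { Client, Payment, Wallet } from "xrpl";
import { delayedLedgerAccept, sendLedgerAccept } from "./utils";

async function submitOnStandalone(
  client: Client,
  wallet: Wallet,
  payment: Payment
): Promise<void> {
  // submitAndWait() blocks until validation, so schedule the delayed
  // ledger close before submitting.
  const pendingClose = delayedLedgerAccept();
  await client.submitAndWait(payment, { wallet });
  await pendingClose;

  // submit() returns immediately, so close the ledger right after.
  await client.submit(payment, { wallet });
  await sendLedgerAccept(client);
}
```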
26 changes: 12 additions & 14 deletions README.md
@@ -30,25 +30,23 @@ Environment variables:

- `XRP_AMOUNT`: The number of XRP to fund new accounts with. On the Testnet operated by Ripple, the current funding amount is 1,000 Testnet XRP.

## Google BigQuery Integration
## Caspian Integration

This application logs and analyzes data using Google BigQuery. To use this feature, you need to provide the necessary BigQuery credentials through environment variables.
This application logs and analyzes data using Caspian. To use this feature, you need to provide the necessary Caspian credentials through environment variables.

### Run the server with BigQuery (Optional):
### Run the server with Caspian (Optional):

Please replace `BIGQUERY_PROJECT_ID`, `BIGQUERY_CLIENT_EMAIL`, and `BIGQUERY_PRIVATE_KEY` with your actual project ID, client email, and private key.
Please replace `CASPIAN_ENDPOINT`, `CASPIAN_API_KEY`, `CASPIAN_PRODUCER_NAME`, `CASPIAN_ENTITY_NAME`, `CASPIAN_SCHEMA_TYPE`, and `CASPIAN_SCHEMA_VERSION` with your actual Caspian configuration values.

### BigQuery Environment Variables:
### Caspian Environment Variables:

- `BIGQUERY_PROJECT_ID`: The ID of your Google Cloud project.
- `BIGQUERY_CLIENT_EMAIL`: The email address of your service account.
- `BIGQUERY_PRIVATE_KEY`: The private key from your service account JSON key file. Be sure to include the full private key, including the header and footer.

In case you are running this application in a trusted environment (like Google Cloud Platform), you don't need to provide the `BIGQUERY_CLIENT_EMAIL` and `BIGQUERY_PRIVATE_KEY`. The application will use Application Default Credentials (ADC) provided by the environment.
- `CASPIAN_ENDPOINT`: The endpoint for your Caspian integration.
- `CASPIAN_API_KEY`: Your Caspian API key for authentication.
- `CASPIAN_PRODUCER_NAME`: The name of your data producer.
- `CASPIAN_ENTITY_NAME`: The entity name for logging purposes.
- `CASPIAN_SCHEMA_TYPE`: The schema type of your data.
- `CASPIAN_SCHEMA_VERSION`: The version of your data schema.

Remember to properly secure your environment variables, especially the `CASPIAN_API_KEY`, to prevent unauthorized access to your Caspian account.
12 changes: 6 additions & 6 deletions src/config.ts
@@ -11,12 +11,12 @@ export interface ConfigFile {
MIN_TICKET_COUNT?: number;
MAX_TICKET_COUNT?: number;

// Optional - See "defaults" for default values
BIGQUERY_DATASET_ID?: string;
BIGQUERY_TABLE_ID?: string;
BIGQUERY_CLIENT_EMAIL?: string;
BIGQUERY_PROJECT_ID?: string;
BIGQUERY_PRIVATE_KEY?: string;
CASPIAN_API_KEY?: string;
CASPIAN_ENDPOINT?: string;
CASPIAN_ENTITY_NAME?: string;
CASPIAN_PRODUCER_NAME?: string;
CASPIAN_SCHEMA_TYPE?: string;
CASPIAN_SCHEMA_VERSION?: number;
}

export interface Config extends Required<ConfigFile> {}
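
As a hypothetical illustration of the `Required<ConfigFile>` pattern above, defaults can be spread under the user-supplied file so every optional field ends up populated. The default values below are invented for the example, not the application's real defaults.

```
// Invented defaults, purely to illustrate how optional ConfigFile fields
// become the Required<ConfigFile> that Config demands.
const caspianDefaults = {
  CASPIAN_API_KEY: "",
  CASPIAN_ENDPOINT: "",
  CASPIAN_ENTITY_NAME: "faucet",
  CASPIAN_PRODUCER_NAME: "xrpl-faucet",
  CASPIAN_SCHEMA_TYPE: "faucet_request",
  CASPIAN_SCHEMA_VERSION: 1,
};

function resolveConfig(file: ConfigFile): Config {
  // Later spreads win, so explicit values in the file override defaults.
  return { ...caspianDefaults, ...file } as Config;
}
```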
126 changes: 77 additions & 49 deletions src/routes/accounts.ts
@@ -9,6 +9,7 @@ import { config } from "../config";
import { getTicket } from "../ticket-queue";
import rTracer from "cls-rtracer";
import { incrementTxRequestCount, incrementTxCount } from "../index";
import https from "https";
import { IncomingMessage } from "http";

export default async function (req: Request, res: Response) {
incrementTxRequestCount();
@@ -90,7 +91,7 @@ export default async function (req: Request, res: Response) {
amount: Number(amount),
};

if (wallet.seed) {
if (wallet && wallet.seed) {
response.seed = wallet.seed;
}

@@ -101,12 +102,12 @@ export default async function (req: Request, res: Response) {
} with ${amount} XRP (${status})`
);

if (config.BIGQUERY_PROJECT_ID) {
if (config.CASPIAN_API_KEY) {
try {
await insertIntoBigQuery(account, amount, req.body);
console.log("inserted big query");
await insertIntoCaspian(account, Number(amount), req.body, client);
console.log("Data sent to Caspian successfully");
} catch (error) {
console.warn(`Failed to insert into BigQuery: ${error}`);
console.warn("Caspian Insertion Error:", error);
}
}
incrementTxCount();
@@ -120,50 +121,6 @@
});
}
}
async function insertIntoBigQuery(
account: Account,
amount: string,
reqBody: any
): Promise<void> {
const { userAgent = "", usageContext = "" } = reqBody;
const memos = reqBody.memos
? reqBody.memos.map((memo: any) => ({ memo }))
: [];
const rows = [
{
user_agent: userAgent,
usage_context: usageContext,
memos: memos,
account: account.xAddress,
amount: amount,
},
];
const bigquery = new BigQuery({
projectId: config.BIGQUERY_PROJECT_ID,
credentials: {
client_email: config.BIGQUERY_CLIENT_EMAIL,
private_key: config.BIGQUERY_PRIVATE_KEY,
},
});

return new Promise((resolve, reject) => {
bigquery
.dataset(config.BIGQUERY_DATASET_ID)
.table(config.BIGQUERY_TABLE_ID)
.insert(rows, (error) => {
if (error) {
console.warn(
"WARNING: Failed to insert into BigQuery",
JSON.stringify(error, null, 2)
);
reject(error);
} else {
console.log(`Inserted ${rows.length} rows`);
resolve();
}
});
});
}

async function submitPaymentWithTicket(
payment: Payment,
@@ -190,3 +147,74 @@

return result;
}

async function insertIntoCaspian(
account: Account,
amount: number,
reqBody: any,
client: Client
) {
const dataPayload = [
{
user_agent: reqBody.userAgent || "",
usage_context: reqBody.usageContext || "",
memos: reqBody.memos || [],
account: account.xAddress,
amount: amount,
network: client.networkID,
},
];

const postData = JSON.stringify({
producerName: config.CASPIAN_PRODUCER_NAME,
entityName: config.CASPIAN_ENTITY_NAME,
schemaType: config.CASPIAN_SCHEMA_TYPE,
schemaVersion: Math.round(config.CASPIAN_SCHEMA_VERSION),
data: dataPayload,
timestamp: Date.now(),
});
console.log(postData);

const options = {
method: "POST",
headers: {
"Content-Type": "application/json",
"x-api-key": config.CASPIAN_API_KEY,
},
};

return new Promise((resolve, reject) => {
const req = https.request(
config.CASPIAN_ENDPOINT,
options,
      // Node's https callback receives an IncomingMessage, not Express's Response.
      (res: IncomingMessage) => {
let data = "";

res.on("data", (chunk) => {
data += chunk;
});

res.on("end", () => {
if (res.statusCode === 200 || res.statusCode === 201) {
resolve(data);
} else {
          reject(new Error(`Failed to send data to Caspian: ${data}`));
}
});
}
);

req.on("error", (error: any) => {
console.error("Request Error:", error);
reject({
message: `Failed to send data to Caspian: ${
error.message || "Unknown error"
}`,
attemptedData: postData,
});
});

req.write(postData);
req.end();
});
}
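
For reference, here is an illustrative client call exercising this route. The `userAgent`, `usageContext`, and `memos` fields mirror what the handler reads from `req.body`, while the URL, port, and path are assumptions rather than values taken from this diff.

```
// Hypothetical request to the faucet route. The body fields match what the
// handler destructures from req.body; the URL and path are assumptions.
const response = await fetch("http://localhost:3000/accounts", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    userAgent: "example-script",
    usageContext: "integration-test",
    memos: ["example memo"],
  }),
});
console.log(await response.json());
```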