- `<GCSFUSE_VERSION>`: A Git tag (e.g., `v1.0.0`), branch name (e.g., `main`), or a commit ID on the GCSFuse master branch.
- `<PROJECT_ID>`: The Google Cloud project ID in which the VM and buckets will be created.
- `<REGION>`: The GCP region where the VM and GCS buckets will be created (e.g., `us-south1`).
- `<MACHINE_TYPE>`: The GCE machine type for the benchmark VM (e.g., `n2-standard-96`). This script supports attaching 16 local NVMe SSDs (375GB each) on LSSD-supported machine types.
  - **Note:** If your machine type supports LSSDs but is not included in the `LSSD_SUPPORTED_MACHINES` array in the `run-benchmarks.sh` script, you may need to add it manually to ensure LSSDs are attached.
- `<IMAGE_FAMILY>`: The image family for the VM (e.g., `ubuntu-2504-amd64`).
- `<IMAGE_PROJECT>`: The image project for the VM (e.g., `ubuntu-os-cloud`).
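
As a concrete example, the placeholders above might be filled in as shell variables like this. The values are illustrative only and are not taken from `run-benchmarks.sh`; adapt them to your own project:

```shell
# Illustrative placeholder values only; adapt to your own project.
GCSFUSE_VERSION="v1.0.0"          # Git tag, branch name, or commit ID
PROJECT_ID="my-gcp-project"       # your Google Cloud project ID
REGION="us-south1"                # region for the VM and buckets
MACHINE_TYPE="n2-standard-96"     # an LSSD-supported machine type
IMAGE_FAMILY="ubuntu-2504-amd64"  # VM image family
IMAGE_PROJECT="ubuntu-os-cloud"   # VM image project

echo "benchmarking gcsfuse ${GCSFUSE_VERSION} on ${MACHINE_TYPE} in ${REGION}"
```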
   A unique ID is generated from the timestamp and a random suffix to name the VM and related GCS buckets.

2. **GCS Bucket Creation**:

   A GCS bucket `gcsfuse-release-benchmark-data-<UNIQUE_ID>` is created in the specified region to store FIO test data.

3. **FIO Job File Upload**:

   All `.fio` job files from the local `fio-job-files/` directory are uploaded to the results bucket.

4. **Data Transfer**:

   A Storage Transfer Service job copies read data from `gs://gcsfuse-release-benchmark-fio-data` to the newly created test data bucket.

5. **VM Creation**:

   - A GCE VM is created with the specified machine type.
   - Boot disk size: 1000GB.

6. **`starter-script.sh` Execution**:

   This script runs on the VM after creation. It:

   - Installs common dependencies (e.g., git, fio, python3-pip).
   - Builds GCSFuse from the specified version.
   - Sets up local SSDs if enabled.
   - Downloads FIO job files.
   - Mounts the GCS bucket using the built GCSFuse binary.
   - Monitors GCSFuse CPU and memory usage during FIO runs.
   - Executes each FIO job and saves the JSON output.
   - Uploads the FIO results and monitoring logs to GCS.
   - Calls `upload_fio_output_to_bigquery.py` to push results to BigQuery.

7. **Cleanup**:

   A cleanup function is trapped to run on exit, ensuring the VM and the created GCS test data bucket are deleted.
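
The driver flow above (unique-ID generation, bucket naming, and the cleanup trap) can be sketched roughly as follows. This is a minimal sketch, not the actual contents of `run-benchmarks.sh`: the ID format and resource names are assumptions, and the `gcloud` calls are `echo`ed so the sketch runs without cloud credentials.

```shell
set -eu

# 1. Unique ID from the timestamp plus a random suffix (exact format is an assumption).
UNIQUE_ID="$(date +%Y%m%d%H%M%S)-${RANDOM:-$$}"
DATA_BUCKET="gcsfuse-release-benchmark-data-${UNIQUE_ID}"

# 7. Cleanup is trapped to run on exit, so the VM and the test-data
#    bucket are deleted even if a benchmark step fails.
cleanup() {
  echo "gcloud compute instances delete benchmark-vm-${UNIQUE_ID} --quiet"
  echo "gcloud storage rm --recursive gs://${DATA_BUCKET}"
}
trap cleanup EXIT

# 2. Create the test-data bucket ('echo' stands in for the real call).
echo "gcloud storage buckets create gs://${DATA_BUCKET} --location=us-south1"

# 5. Create the benchmark VM with a 1000GB boot disk.
echo "gcloud compute instances create benchmark-vm-${UNIQUE_ID} --boot-disk-size=1000GB"
```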
---

## Output

### BigQuery

FIO benchmark results, including I/O statistics, latencies, and system resource usage (CPU/memory), are uploaded to a BigQuery table with:

- **Project ID**: `gcs-fuse-test-ml`
- **Dataset ID**: `gke_test_tool_outputs`
- **Table ID**: `fio_outputs`

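
The fully qualified table name composed from those IDs can be queried with the `bq` CLI. The command below is echoed rather than executed, since running it requires credentials and access to the project; the `SELECT *` query is illustrative only (the table's column names are not documented here):

```shell
# Fully qualified BigQuery table that receives the FIO results.
TABLE="gcs-fuse-test-ml.gke_test_tool_outputs.fio_outputs"

# Example query, echoed so this sketch runs without bq/gcloud installed.
echo "bq query --use_legacy_sql=false 'SELECT * FROM \`${TABLE}\` LIMIT 10'"
```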
---

### Google Cloud Storage

- **FIO Test Data:** The FIO test data (copied from `gs://gcsfuse-release-benchmark-fio-data`) is uploaded to a newly created bucket named `gcsfuse-release-benchmark-data-<UNIQUE_ID>`.
- **Benchmark Results and FIO Job Files:** FIO JSON output files, benchmark logs, and FIO job files are uploaded to the `gs://gcsfuse-release-benchmarks-results` bucket, under the path `gs://gcsfuse-release-benchmarks-results/<GCSFUSE_VERSION>-<UNIQUE_ID>/`.
- A `success.txt` file is uploaded to GCS upon successful completion of all benchmarks.
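
The results prefix above can be composed from the run's version and unique ID. The values below are illustrative stand-ins for a real run, the listing command is echoed rather than executed, and placing `success.txt` under the results prefix is an assumption (the text above only says it is uploaded to GCS):

```shell
# Illustrative values; the real ones come from the benchmark run.
GCSFUSE_VERSION="v1.0.0"
UNIQUE_ID="20240101120000-1234"

RESULTS_PREFIX="gs://gcsfuse-release-benchmarks-results/${GCSFUSE_VERSION}-${UNIQUE_ID}/"

# Check for the success marker (echoed; actually running this requires gcloud).
echo "gcloud storage ls ${RESULTS_PREFIX}success.txt"
```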