The ASSAS Data Hub is a web application to store and visualize ASTEC simulation data on the Large Scale Data Facility (LSDF) at KIT. Its database contains the ASTEC archives in binary raw format and offers conversion into other data formats. At the moment, only conversion into the hdf5 data format is supported.
The ASSAS Data Hub is a Flask web application and requires a number of additional software packages.
The entry point of the application is wsgi.py (Python Web Server Gateway Interface). It can be started with:
$ python wsgi.py
The application starts as a custom Flask app. A test version is available at http://assas.scc.kit.edu:5000/assas_app/home, running on a virtual machine inside the KIT infrastructure.
The application connects to the database via CONNECTIONSTRING = r'mongodb://localhost:27017/'.
Restart the NoSQL database with:
$ service mongod restart
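Before starting the app it can be useful to verify that the database behind the connection string above is actually reachable. The following sketch uses only the Python standard library; the helper name `mongo_reachable` is illustrative and not part of the application:

```python
import socket
from urllib.parse import urlparse

CONNECTIONSTRING = r'mongodb://localhost:27017/'

def mongo_reachable(connection_string: str, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to the MongoDB host/port succeeds."""
    parsed = urlparse(connection_string)
    host = parsed.hostname or 'localhost'
    port = parsed.port or 27017
    try:
        # Only probes TCP connectivity; it does not authenticate or run queries.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print('MongoDB reachable:', mongo_reachable(CONNECTIONSTRING))
```

If this prints `False`, restart mongod as shown above before launching the application.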
The following command mounts the LSDF on the server system for the user USER:
$ sudo mount -t cifs -o vers=2.0,username='USER',uid=$(id -u),gid=$(id -g) //os.lsdf.kit.edu/kit/scc/projects/ASSAS /mnt/ASSAS
The upload of ASTEC data is supported through an upload application under tools/assas_data_uploader.py.
The use of the upload application requires the following:
- Request of a Partner- and Guest-KIT Account (https://www.scc.kit.edu/en/services/gup.php)
- Access to the LSDF with this Account (https://www.lsdf.kit.edu/)
- Configuration of a password-less ssh login to the login server of the LSDF (https://www.lsdf.kit.edu/docs/ssh/#using-ssh-on-linux-or-mac-os). The password-less configuration is mandatory; the upload application will not start without it.
Create a new ssh key pair with the following command:

$ ssh-keygen

For further usage it is recommended to just press enter twice to keep the standard key name and no passphrase. The generated key pair is then placed at the standard location ~/.ssh/id_rsa (private key) and ~/.ssh/id_rsa.pub (public key). The generated key pair has to be used for the next commands; please check that the path to the keys is correct.

Transfer the public key to the login server of the LSDF with the following command:

$ ssh-copy-id -i ~/.ssh/id_rsa.pub <USERNAME>@os-login.lsdf.kit.edu

Add the private key as identity on the local machine by executing the following command:

$ ssh-add ~/.ssh/id_rsa

Note: Depending on your system, it might be the case that your authentication agent is not started. In that case ssh-add fails with a message like:

Could not open a connection to your authentication agent.

You can then start the ssh-agent with the following command:

$ eval `ssh-agent -s`

Depending on your operating system, you can also start your authentication agent with:

$ ssh-agent /bin/sh

Once the authentication agent is running, the command

$ ssh-add ~/.ssh/id_rsa

can be executed again. Please test the password-less configuration before continuing with the next steps by executing:

$ ssh <USERNAME>@os-login.lsdf.kit.edu

This command should open a terminal on the LSDF login server without asking for a password.
- Installation of Python 3.10+ and rsync on the local machine (https://www.python.org/downloads/ and https://wiki.ubuntuusers.de/rsync/)
- Definition of the upload parameters of the ASTEC archive according to the command-line interface described in the next section
The command-line interface of the upload application requires the following parameters:
- --user (-u): KIT internal account which has access to the LSDF
- --source (-s): Absolute path to the directory tree which has to be uploaded (ASTEC Project directory)
- --name (-n): Corresponding name of the archive visible in the database
- --description (-d): Corresponding description of the archive visible in the database
- --archives (-a): Sub path to the actual ASTEC archive inside the directory tree, or a list of sub paths for multiple archives
The command-line interface of the upload application has the following optional parameters:
- --uuid (-i): Upload identifier of an upload process which was already started
- --debug (-l): Enable debug logging of the application
The parameter --uuid can be used to resume an interrupted or failed upload. The upload uuid can be taken from the standard output of the upload application or from its log file.
The upload application can be executed via command-line as follows:
$ python tools/assas_data_uploader.py -u my_user -s my_source_path -n my_name -d my_description -a my_archive_path
If there is a project tree with several ASTEC runs, one can define a list of archive paths:
$ python tools/assas_data_uploader.py -u my_user -s my_source_path -n my_name -d my_description -a my_archive_path_1, my_archive_path_2, ....
The application produces a log file for each execution. The name of the logfile starts with the upload uuid.
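For scripted or repeated uploads, the command line above can be assembled programmatically. A minimal sketch of such a wrapper, using only the parameters documented above; `build_upload_command` is a hypothetical helper, not part of the tool itself:

```python
from typing import Optional, Sequence

def build_upload_command(user: str, source: str, name: str, description: str,
                         archives: Sequence[str], uuid: Optional[str] = None,
                         debug: bool = False) -> list:
    """Assemble the argument vector for tools/assas_data_uploader.py."""
    cmd = ['python', 'tools/assas_data_uploader.py',
           '-u', user, '-s', source, '-n', name, '-d', description,
           '-a', ', '.join(archives)]      # multiple archives as comma-separated list
    if uuid is not None:
        cmd += ['-i', uuid]                # resume a previously started upload
    if debug:
        cmd.append('-l')                   # enable debug logging
    return cmd

print(build_upload_command('my_user', 'my_source_path', 'my_name',
                           'my_description', ['my_archive_path']))
```

The resulting list can be passed to subprocess.run; supplying uuid resumes an interrupted upload as described above.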
The database view displays a list of all available datasets and provides the following parameters:

- Index: Unique index of the dataset
- Size: Size of the ASTEC binary archive
- Size hdf5: Size of the hdf5 file after conversion
- Date: Date and time of the upload
- User: User who uploaded the dataset
- Status: Status of the uploaded dataset
- Name: Given name of the uploaded dataset
By clicking the column cell File, the user can download the hdf5 file.
By clicking the parameter Name, the user reaches a detailed view with the following meta information about the dataset.
A database entry can have the following states:

- UPLOADED: Directly after the upload, the database entry is in this state.
- CONVERTING: After the upload, the conversion and post-processing are started automatically.
- VALID: If the conversion and the post-processing were successful, the database entry is in a valid state.
- INVALID: If the conversion or the post-processing failed, the database entry is in an invalid state.
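The lifecycle described above can be sketched as a small state model. This is an illustration only; the enum and helper names are assumptions, not the actual implementation:

```python
from enum import Enum

class DatasetStatus(Enum):
    UPLOADED = 'Uploaded'      # directly after upload
    CONVERTING = 'Converting'  # conversion and post-processing running
    VALID = 'Valid'            # conversion and post-processing succeeded
    INVALID = 'Invalid'        # conversion or post-processing failed

def after_conversion(success: bool) -> DatasetStatus:
    """A dataset moves from CONVERTING to VALID or INVALID depending on the outcome."""
    return DatasetStatus.VALID if success else DatasetStatus.INVALID

print(after_conversion(True).name)
```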
- Name (same as on the database view)
- Description
The following meta information is extracted during the upload and conversion process:
- Variables: List of extracted variables
- Channels: Number of extracted channels
- Meshes: Number of extracted meshes
- Samples: Number of extracted samples
The ASSAS Data Hub provides a RESTful API to query training data in an automated way.
The test version of the v1 API is available under the /test/assas_app/ path. All endpoints return JSON.
Some endpoints require authentication. Use your username and password to obtain a session or token as described in the login section of the web interface.
GET /test/assas_app/datasets
Query parameters:
- status (optional): Filter datasets by status (e.g., valid)
- limit (optional): Limit the number of results
Example:
curl https://assas.scc.kit.edu/test/assas_app/datasets?status=valid
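Query URLs like the one above can also be built from Python with the standard library. A minimal sketch, assuming the endpoint and query parameters exactly as listed; `datasets_url` is an illustrative helper:

```python
from urllib.parse import urlencode

BASE_URL = 'https://assas.scc.kit.edu/test/assas_app'

def datasets_url(status=None, limit=None):
    """Build the /datasets query URL with optional status and limit filters."""
    params = {}
    if status is not None:
        params['status'] = status
    if limit is not None:
        params['limit'] = limit
    query = urlencode(params)  # e.g. 'status=valid&limit=10'
    return f'{BASE_URL}/datasets' + (f'?{query}' if query else '')

print(datasets_url(status='valid', limit=10))
```

The returned URL can then be fetched with any HTTP client (curl, requests, urllib).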
GET /test/assas_app/datasets/<uuid>
Example:
curl https://assas.scc.kit.edu/test/assas_app/datasets/123e4567-e89b-12d3-a456-426614174000
GET /test/assas_app/datasets/<uuid>/data/<variable_name>
Query parameters:
- include_stats (optional): If true, include statistics about the variable
Example:
curl "https://assas.scc.kit.edu/test/assas_app/datasets/123e4567-e89b-12d3-a456-426614174000/data/vessel_rupture?include_stats=true"
GET /test/assas_app/files/archive/<uuid>
Example:
curl -O https://assas.scc.kit.edu/test/assas_app/files/archive/123e4567-e89b-12d3-a456-426614174000
All API responses are JSON objects with at least the following fields:
- success: true or false
- data: The requested data (if successful)
- message or error: Error message (if not successful)
{
"success": true,
"data": {
"uuid": "123e4567-e89b-12d3-a456-426614174000",
"name": "Test Dataset",
"status": "valid",
"variables": ["vessel_rupture", "pressure", "temperature"]
}
}
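Clients can handle this response envelope uniformly. A minimal sketch using the example response above; the `unwrap` helper is illustrative, not part of the API:

```python
import json

def unwrap(response_text: str):
    """Return the 'data' payload of a successful response, or raise with the reported error."""
    payload = json.loads(response_text)
    if payload.get('success'):
        return payload.get('data')
    # On failure the API reports the problem in an 'error' or 'message' field.
    raise RuntimeError(payload.get('error') or payload.get('message') or 'unknown API error')

example = '''{"success": true,
              "data": {"uuid": "123e4567-e89b-12d3-a456-426614174000",
                       "name": "Test Dataset", "status": "valid",
                       "variables": ["vessel_rupture", "pressure", "temperature"]}}'''
print(unwrap(example)['name'])
```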
If a request fails, the API returns a JSON object with success: false and an error or message field describing the problem.