Commit b843657

Update OAS (#394)

fivetran-aleksandrboldyrev authored Jan 29, 2025
1 parent ada58dd commit b843657
Showing 12 changed files with 39,057 additions and 28,916 deletions.
96 changes: 96 additions & 0 deletions CHANGELOG.md
@@ -33,6 +33,102 @@ The deprecated resources, datasources and fields removed:
- Field `fivetran_connector.local_processing_agent_id`.
- Field `fivetran_destination.local_processing_agent_id`.

New destination services supported:
- Supported service: `milvus`

New connection services supported:
- Supported service: `acculynx`
- Supported service: `acumatica`
- Supported service: `adyen`
- Supported service: `akeneo`
- Supported service: `amazon_dsp`
- Supported service: `ashby`
- Supported service: `aveva_pi`
- Supported service: `backbone_plm`
- Supported service: `bigin_by_zoho_crm`
- Supported service: `bigmarker`
- Supported service: `bing_webmaster_tools`
- Supported service: `canvas_data_2_by_instructure`
- Supported service: `clazar`
- Supported service: `clockify`
- Supported service: `clockodo`
- Supported service: `cloudflare_analytics`
- Supported service: `cloudtalk`
- Supported service: `compliance_checkpoint`
- Supported service: `connector_sdk`
- Supported service: `constant_contact`
- Supported service: `cornerstone`
- Supported service: `criteo_retail_media`
- Supported service: `deposco`
- Supported service: `dialpad`
- Supported service: `ehr`
- Supported service: `everflow`
- Supported service: `expensein`
- Supported service: `fillout`
- Supported service: `flywheel_digital`
- Supported service: `formstack`
- Supported service: `google_classroom`
- Supported service: `healthie`
- Supported service: `hilti_ontrack`
- Supported service: `jama_software`
- Supported service: `jibble`
- Supported service: `jobnimbus`
- Supported service: `khoros_communities`
- Supported service: `khoros_marketing`
- Supported service: `leap_crm`
- Supported service: `line_ads`
- Supported service: `lucca`
- Supported service: `maileon`
- Supported service: `mailjet`
- Supported service: `malomo`
- Supported service: `matomo`
- Supported service: `microsoft_power_bi`
- Supported service: `nice`
- Supported service: `okendo`
- Supported service: `oncehub`
- Supported service: `packiyo`
- Supported service: `pandadoc`
- Supported service: `phoenix_ads`
- Supported service: `pigment`
- Supported service: `placerai`
- Supported service: `planhat`
- Supported service: `podio`
- Supported service: `procore`
- Supported service: `prosperstack`
- Supported service: `qmatic_data_connect`
- Supported service: `reviewsai`
- Supported service: `rokt`
- Supported service: `ruddr`
- Supported service: `safebase`
- Supported service: `sana`
- Supported service: `sentry`
- Supported service: `shareasale`
- Supported service: `shipnetwork`
- Supported service: `singlestore_source`
- Supported service: `skimlinks`
- Supported service: `stickyio`
- Supported service: `sugarcrm`
- Supported service: `the_movie_database`
- Supported service: `tive`
- Supported service: `tremendous`
- Supported service: `triple_whale`
- Supported service: `venminder`
- Supported service: `vimeo`
- Supported service: `visma`
- Supported service: `wicked_reports`
- Supported service: `workleap_officevibe`
- Supported service: `xactly`
- Supported service: `zip`
- Supported service: `zonka_feedback`

New connection config fields supported:
- Added field `fivetran_connector.config.aws_secret_access_key` for services: `new_s3_datalake`.
- Added field `fivetran_connector.config.lakehouse_guid` for services: `onelake`.
- Added field `fivetran_connector.config.workspace_guid` for services: `onelake`.
- Added field `fivetran_connector.config.should_maintain_tables_in_databricks` for services: `adls`, `new_s3_datalake`, `onelake`.
- Added field `fivetran_connector.config.databricks_connection_type` for services: `adls`, `new_s3_datalake`, `onelake`.
- Added field `fivetran_connector.config.aws_access_key_id` for services: `new_s3_datalake`.
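
The new data-lake fields above could be wired up roughly as follows. This is a hedged sketch only: the group ID and GUIDs are placeholders, and the exact `config` block syntax and nesting should be verified against the provider documentation for your provider version.

```hcl
# Hypothetical sketch of the new OneLake config fields.
# All IDs and GUIDs below are placeholders, not real values.
resource "fivetran_connector" "onelake_example" {
  group_id = "group_id_placeholder"
  service  = "onelake"

  config {
    workspace_guid = "00000000-0000-0000-0000-000000000000" # new field
    lakehouse_guid = "00000000-0000-0000-0000-000000000000" # new field

    # New Databricks-related fields, also listed for adls and new_s3_datalake
    should_maintain_tables_in_databricks = true
    databricks_connection_type           = "Directly"
  }
}
```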

## [1.4.2](https://github.com/fivetran/terraform-provider-fivetran/compare/v1.4.1...v1.4.2)

## Added
336 changes: 313 additions & 23 deletions docs/data-sources/connector.md


56 changes: 48 additions & 8 deletions docs/data-sources/destination.md
@@ -61,17 +61,27 @@ Read-Only:
- `auth` (String) Field usage depends on `service` value:
- Service `snowflake`: Password-based or key-based authentication type
- `auth_type` (String) Field usage depends on `service` value:
- Service `adls`: Authentication type
- Service `databricks`: Authentication type
- Service `new_s3_datalake`: Authentication type
- Service `onelake`: Authentication type
- Service `redshift`: Authentication type. Default value: `PASSWORD`.
- `aws_access_key_id` (String) Field usage depends on `service` value:
- Service `new_s3_datalake`: AWS access key to access the S3 bucket and AWS Glue
- `aws_secret_access_key` (String, Sensitive) Field usage depends on `service` value:
- Service `new_s3_datalake`: AWS secret access key to access the S3 bucket and AWS Glue
- `bootstrap_servers` (Set of String) Field usage depends on `service` value:
- Service `confluent_cloud_wh`: Comma-separated list of Confluent Cloud servers in the `server:port` format.
- `bucket` (String) Field usage depends on `service` value:
- Service `big_query`: Customer bucket. If specified, your GCS bucket will be used to process the data instead of a Fivetran-managed bucket. The bucket must be present in the same location as the dataset location.
- Service `big_query_dts`: Customer bucket. If specified, your GCS bucket will be used to process the data instead of a Fivetran-managed bucket. The bucket must be present in the same location as the dataset location.
- Service `managed_big_query`: Customer bucket. If specified, your GCS bucket will be used to process the data instead of a Fivetran-managed bucket. The bucket must be present in the same location as the dataset location.
- Service `new_s3_datalake`: The name of the bucket to be used as destination
- Service `new_s3_datalake`: (Immutable) The name of the bucket to be used as destination
- `catalog` (String) Field usage depends on `service` value:
- Service `adls`: Catalog name
- Service `databricks`: Catalog name
- Service `new_s3_datalake`: Catalog name
- Service `onelake`: Catalog name
- `client_id` (String) Field usage depends on `service` value:
- Service `adls`: Client id of service principal
- Service `onelake`: Client ID of service principal
@@ -109,7 +119,7 @@ Read-Only:
- Service `sql_server_rds_warehouse`: Connection method. Default value: `Directly`.
- Service `sql_server_warehouse`: Connection method. Default value: `Directly`.
- `container_name` (String) Field usage depends on `service` value:
- Service `adls`: Container to store delta table files
- Service `adls`: (Immutable) Container to store delta table files
- Service `onelake`: Workspace name to store delta table files
- `controller_id` (String)
- `create_external_tables` (Boolean) Field usage depends on `service` value:
@@ -140,6 +150,10 @@ Read-Only:
- Service `snowflake`: Database name
- Service `sql_server_rds_warehouse`: Database name
- Service `sql_server_warehouse`: Database name
- `databricks_connection_type` (String) Field usage depends on `service` value:
- Service `adls`: Databricks Connection method. Default value: `Directly`.
- Service `new_s3_datalake`: Databricks Connection method. Default value: `Directly`.
- Service `onelake`: Databricks Connection method. Default value: `Directly`.
- `enable_remote_execution` (Boolean)
- `external_id` (String) Field usage depends on `service` value:
- Service `aws_msk_wh`: Fivetran generated External ID
@@ -173,22 +187,33 @@ Read-Only:
- Service `sql_server_rds_warehouse`: Server name
- Service `sql_server_warehouse`: Server name
- `http_path` (String) Field usage depends on `service` value:
- Service `adls`: HTTP path
- Service `databricks`: HTTP path
- Service `new_s3_datalake`: HTTP path
- Service `onelake`: HTTP path
- `is_private_key_encrypted` (Boolean) Field usage depends on `service` value:
- Service `snowflake`: Indicates that a private key is encrypted. The default value: `false`. The field can be specified if authentication type is `KEY_PAIR`.
- `is_private_link_required` (Boolean) Field usage depends on `service` value:
- Service `new_s3_datalake`: We use PrivateLink by default if your S3 bucket is in the same region as Fivetran. Turning on this toggle ensures that Fivetran always connects to the S3 bucket over PrivateLink. Learn more in our [PrivateLink documentation](https://fivetran.com/docs/connectors/databases/connection-options#awsprivatelinkbeta).
- `is_redshift_serverless` (Boolean) Field usage depends on `service` value:
- Service `redshift`: Is your destination Redshift Serverless
- `lakehouse_guid` (String) Field usage depends on `service` value:
- Service `onelake`: (Immutable) OneLake lakehouse GUID
- `lakehouse_name` (String) Field usage depends on `service` value:
- Service `onelake`: Name of your lakehouse
- Service `onelake`: (Immutable) Name of your lakehouse
- `msk_sts_region` (String)
- `num_of_partitions` (Number) Field usage depends on `service` value:
- Service `confluent_cloud_wh`: Number of partitions per topic.
- `oauth2_client_id` (String) Field usage depends on `service` value:
- Service `adls`: OAuth 2.0 client ID
- Service `databricks`: OAuth 2.0 client ID
- Service `new_s3_datalake`: OAuth 2.0 client ID
- Service `onelake`: OAuth 2.0 client ID
- `oauth2_secret` (String, Sensitive) Field usage depends on `service` value:
- Service `adls`: OAuth 2.0 secret
- Service `databricks`: OAuth 2.0 secret
- Service `new_s3_datalake`: OAuth 2.0 secret
- Service `onelake`: OAuth 2.0 secret
- `passphrase` (String, Sensitive) Field usage depends on `service` value:
- Service `snowflake`: In case private key is encrypted, you are required to enter passphrase that was used to encrypt the private key. The field can be specified if authentication type is `KEY_PAIR`.
- `password` (String, Sensitive) Field usage depends on `service` value:
@@ -212,8 +237,12 @@
- Service `sql_server_rds_warehouse`: Database user password
- Service `sql_server_warehouse`: Database user password
- `personal_access_token` (String, Sensitive) Field usage depends on `service` value:
- Service `adls`: Personal access token
- Service `databricks`: Personal access token
- Service `new_s3_datalake`: Personal access token
- Service `onelake`: Personal access token
- `port` (Number) Field usage depends on `service` value:
- Service `adls`: Server port number
- Service `aurora_postgres_warehouse`: Server port number
- Service `aurora_warehouse`: Server port number
- Service `azure_postgres_warehouse`: Server port number
@@ -225,6 +254,8 @@
- Service `maria_warehouse`: Server port number
- Service `mysql_rds_warehouse`: Server port number
- Service `mysql_warehouse`: Server port number
- Service `new_s3_datalake`: Server port number
- Service `onelake`: Server port number
- Service `panoply`: Server port number
- Service `periscope_warehouse`: Server port number
- Service `postgres_gcp_warehouse`: Server port number
@@ -235,9 +266,9 @@
- Service `sql_server_rds_warehouse`: Server port number
- Service `sql_server_warehouse`: Server port number
- `prefix_path` (String) Field usage depends on `service` value:
- Service `adls`: path/to/data within the container
- Service `new_s3_datalake`: Prefix path of the bucket for which you have configured access policy. It is not required if access has been granted to entire Bucket in the access policy
- Service `onelake`: path/to/data within your lakehouse inside the Files directory
- Service `adls`: (Immutable) path/to/data within the container
- Service `new_s3_datalake`: (Immutable) Prefix path of the bucket for which you have configured the access policy. It is not required if access has been granted to the entire bucket in the access policy
- Service `onelake`: (Immutable) path/to/data within your lakehouse inside the Files directory
- `private_key` (String, Sensitive) Field usage depends on `service` value:
- Service `snowflake`: Private access key. The field should be specified if authentication type is `KEY_PAIR`.
- `project_id` (String) Field usage depends on `service` value:
@@ -317,16 +348,23 @@ Read-Only:
- `security_protocol` (String) Field usage depends on `service` value:
- Service `confluent_cloud_wh`: Security protocol for Confluent Cloud interaction.
- `server_host_name` (String) Field usage depends on `service` value:
- Service `adls`: Server Host name
- Service `databricks`: Server name
- Service `new_s3_datalake`: Server host name
- Service `onelake`: Server Host name
- `should_maintain_tables_in_databricks` (Boolean) Field usage depends on `service` value:
- Service `adls`: Should maintain tables in Databricks
- Service `new_s3_datalake`: Should maintain tables in Databricks
- Service `onelake`: Should maintain tables in Databricks
- `snapshot_retention_period` (String) Field usage depends on `service` value:
- Service `adls`: Snapshots older than the retention period are deleted every week. Default value: `ONE_WEEK`.
- Service `new_s3_datalake`: Snapshots older than the retention period are deleted every week. Default value: `ONE_WEEK`.
- Service `onelake`: Snapshots older than the retention period are deleted every week. Default value: `ONE_WEEK`.
- `snowflake_cloud` (String)
- `snowflake_region` (String)
- `storage_account_name` (String) Field usage depends on `service` value:
- Service `adls`: Storage account for Azure Data Lake Storage Gen2 name
- Service `onelake`: Storage account for Azure Data Lake Storage Gen2 name
- Service `adls`: (Immutable) Storage account for Azure Data Lake Storage Gen2 name
- Service `onelake`: (Immutable) Storage account for Azure Data Lake Storage Gen2 name
- `table_format` (String) Field usage depends on `service` value:
- Service `new_s3_datalake`: (Immutable) The table format in which you want to sync your tables. Valid values are `ICEBERG` and `DELTA_LAKE`
- `tenant_id` (String) Field usage depends on `service` value:
@@ -409,5 +447,7 @@ Read-Only:
- Service `snowflake`: Database user name
- Service `sql_server_rds_warehouse`: Database user name
- Service `sql_server_warehouse`: Database user name
- `workspace_guid` (String) Field usage depends on `service` value:
- Service `onelake`: (Immutable) OneLake workspace GUID
- `workspace_name` (String) Field usage depends on `service` value:
- Service `onelake`: OneLake workspace name
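
To show how these read-only attributes surface in practice, here is a minimal sketch of reading the destination data source. The destination ID is a placeholder, and the exact attribute path (`config.workspace_guid`) is an assumption that should be checked against the provider schema.

```hcl
# Hypothetical sketch: reading the new read-only OneLake fields
# from the fivetran_destination data source.
data "fivetran_destination" "example" {
  id = "destination_id_placeholder"
}

output "workspace_guid" {
  value = data.fivetran_destination.example.config.workspace_guid
}
```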