
[PP-2196] Update API doc for OAuthM2M Support
jackyhu-db committed Oct 11, 2024
1 parent 14daf98 commit 49c83c8
Showing 10 changed files with 232 additions and 102 deletions.
8 changes: 3 additions & 5 deletions ApiSpecifications.md
@@ -68,7 +68,7 @@ The Connect API is used to sign-in or sign-up a user with a partner with Databri
The order of events when connecting Databricks to a partner is as follows:

1. The user clicks the Partner tile.
- 2. The user confirms the Databricks resources that will be provisioned for the connection (e.g. the Service Principal, the PAT, the SQL Warehouse).
+ 2. The user confirms the Databricks resources that will be provisioned for the connection (e.g. the Service Principal, the PAT or service principal OAuth secret, and the SQL Warehouse).
3. The user clicks Connect.
    1. Databricks calls the partner's **Connect API** with all of the Databricks data that the partner needs.
    2. The partner provisions any accounts and resources needed. (e.g. persisting the Databricks workspace\_id, provisioning a Databricks output node).
@@ -138,9 +138,6 @@ POST <partners/databricks/v1/connect>: [example, can be customized]
"is_connection_established" : true|false
"auth": { [Only present if is_connection_established is false]
"personal_access_token": "dapi..."
- // or
- "oauth_token": ..., [optional, reserved for future use]
- "oauth_scope": ... [optional, reserved for future use]
}
}
"hostname": "organization.cloud.databricks.com",
@@ -162,7 +159,8 @@ POST <partners/databricks/v1/connect>: [example, can be customized]
"is_sql_endpoint" : true|false, [optional: same value as is_sql_warehouse]
"is_sql_warehouse": true|false, [optional: set if cluster_id is set. Determines whether cluster_id refers to Interactive Cluster or SQL Warehouse]
"data_source_connector": "Oracle", [optional, unused and reserved for future use: for data connector tools, the name of the data source that the user should be referred to in their tool]
"service_principal_id": "a2a25a05-3d59-4515-a73b-b8bc5ab79e31" [optional, the UUID (username) of the service principal identity]
"service_principal_id": "a2a25a05-3d59-4515-a73b-b8bc5ab79e31", [optional, the UUID (username) of the service principal identity]
"service_principal_oauth_secret": "dose..." [optional, the OAuth secret of the service principal identity, it will be passed only when partner config includes OAuth M2M auth option]
}
```
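
To make the new field concrete, here is a minimal, non-normative sketch of how a partner backend might pick a credential out of this payload. It assumes the simplified shape shown above (with `auth` at the top level) and prefers OAuth M2M when the secret is present; the helper name is hypothetical, not part of the API contract.

```python
from typing import Any


def extract_databricks_credential(body: dict[str, Any]) -> dict[str, str]:
    """Choose a credential from a Connect call, preferring OAuth M2M over a PAT."""
    client_id = body.get("service_principal_id")
    client_secret = body.get("service_principal_oauth_secret")
    if client_id and client_secret:
        # Sent only when the partner config's auth_options includes AUTH_OAUTH_M2M.
        return {"type": "oauth_m2m", "client_id": client_id, "client_secret": client_secret}

    pat = (body.get("auth") or {}).get("personal_access_token")
    if pat:
        # Present when AUTH_PAT is configured (the default) and is_connection_established is false.
        return {"type": "pat", "token": pat}

    raise ValueError("Connect call carried no usable Databricks credential")
```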

13 changes: 8 additions & 5 deletions OnboardingDoc.md
@@ -15,7 +15,7 @@ Partner Connect is a destination inside a Databricks workspace that allows Datab

We made Partner Connect for 2 reasons:

- 1. We want to give our customers access to the value provided by the best data products in the market. Partner Connect removes the complexity from connecting products to Databricks by automatically configuring resources such as SQL Warehouses, clusters, PAT tokens, service principals, and connection files. It can also initiate a free trial of partner products.
+ 1. We want to give our customers access to the value provided by the best data products in the market. Partner Connect removes the complexity from connecting products to Databricks by automatically configuring resources such as SQL Warehouses, clusters, PAT tokens, service principals, OAuth secrets, and connection files. It can also initiate a free trial of partner products.
2. We want to help our partners build their businesses and incentivize them to create the best possible product experience for Databricks customers. For more on this topic, see [this blog post](https://databricks.com/blog/2021/11/18/build-your-business-on-databricks-with-partner-connect.html).

### Sample marketing materials and user experience demo
@@ -31,6 +31,9 @@ The following phrases will help you understand the Databricks product and this d
- **Persona Switcher:** The component on the upper left of the UI that allows the user to choose the active Databricks product. This controls which features are available in the UI, and not all users have access to all 3 options. Partner Connect is available to all 3 personas.
- **Personal Access Token (PAT):** A token that a partner product can use to authenticate with Databricks
- **Service Principal:** An account that a partner product can use when calling Databricks APIs. Service Principals have access controls associated with them.
+ - **OAuth M2M:** Machine-to-machine OAuth, also known as 2-legged OAuth or the OAuth client credentials flow. It uses service principals to authenticate with Databricks: a partner product presents the service principal UUID (client_id) and OAuth secret (client_secret), as sketched below.
+ - **Service Principal OAuth Secret:** The service principal's secret that a partner product uses, together with the service principal UUID, to authenticate with Databricks.

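As a rough sketch of that M2M handshake (assuming the third-party `requests` package and the workspace OAuth token endpoint documented for Databricks OAuth M2M; adapt to your own client code):

```python
import requests


def fetch_m2m_token(hostname: str, client_id: str, client_secret: str) -> str:
    """Exchange a service principal UUID + OAuth secret for a short-lived access token."""
    resp = requests.post(
        f"https://{hostname}/oidc/v1/token",
        auth=(client_id, client_secret),  # HTTP basic auth with the SP credentials
        data={"grant_type": "client_credentials", "scope": "all-apis"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


# Hypothetical values, mirroring the fields Partner Connect provisions:
# token = fetch_m2m_token("organization.cloud.databricks.com",
#                         "a2a25a05-3d59-4515-a73b-b8bc5ab79e31", "dose...")
```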

![](img/persona.png)

@@ -83,10 +86,10 @@ While there's some customization available, most partners have one of the follo

| Integration | Description |
|------------- | -------------|
- | Read Partner | This is used by partners that purely need to read (select) data from the Lakehouse. In Partner Connect, the user selects which data to grant access to your product. Databricks provides the partner a SQL Warehouse and PAT with permissions to query that data. This is often used by **Business Intelligence and Data Quality partners**.
- | Write Partner | This is used by partners that purely need to write (ingest) data into the Lakehouse. In Partner Connect, the user selects which catalog to grant write access to your product. Databricks provides the partner a SQL Warehouse and PAT with permissions to create schemas and tables in that catalog. This is often used by **Ingestion partners**.
- | Read-Write Partner | This is used by partners that both need to read from and write to the Lakehouse. In Partner Connect, the user selects which catalog to grant write access and which schemas to grant read access for your product. Databricks provides the partner a SQL Warehouse and PAT with permissions to create schemas and tables in that catalog, as well as query the selected data. This is often used by **Data Preparation partners**.
- | Notebook Partner | This is used by partners that want to demonstrate their integration with Databricks using a Databricks Notebook. Databricks provides the partner an Interactive Cluster and PAT with permissions. The partner can use the PAT to publish a Databricks Notebook and configure the Interactive Cluster.
+ | Read Partner | This is used by partners that purely need to read (select) data from the Lakehouse. In Partner Connect, the user selects which data to grant access to your product. Databricks provides the partner a SQL Warehouse and either a PAT or a service principal OAuth secret with permissions to query that data (see the sketch after this table). This is often used by **Business Intelligence and Data Quality partners**.
+ | Write Partner | This is used by partners that purely need to write (ingest) data into the Lakehouse. In Partner Connect, the user selects which catalog to grant write access to your product. Databricks provides the partner a SQL Warehouse and either a PAT or a service principal OAuth secret with permissions to create schemas and tables in that catalog. This is often used by **Ingestion partners**.
+ | Read-Write Partner | This is used by partners that both need to read from and write to the Lakehouse. In Partner Connect, the user selects which catalog to grant write access and which schemas to grant read access for your product. Databricks provides the partner a SQL Warehouse and either a PAT or a service principal OAuth secret with permissions to create schemas and tables in that catalog, as well as query the selected data. This is often used by **Data Preparation partners**.
+ | Notebook Partner | This is used by partners that want to demonstrate their integration with Databricks using a Databricks Notebook. Databricks provides the partner an Interactive Cluster and either a PAT or a service principal OAuth secret with permissions. The partner can use the PAT or OAuth secret to publish a Databricks Notebook and configure the Interactive Cluster.
| Desktop application Partner | This is used by partners that have a Desktop application (as opposed to a SaaS offering). In this integration, the user selects either an Interactive Cluster or SQL Warehouse and downloads a connection file to the partner product. This is often used by **Partners with Desktop applications**.<br /><br />N.B. For this type of integration, there is no need for the partner to implement the SaaS APIs mentioned elsewhere throughout this documentation.

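For example, a Read partner could smoke-test its access by querying the provisioned SQL Warehouse. A sketch using the `databricks-sql-connector` package, under the assumption that the warehouse accepts either the PAT or an OAuth M2M access token as `access_token`, and with a placeholder table name:

```python
from databricks import sql  # pip install databricks-sql-connector


def preview_rows(hostname: str, http_path: str, token: str) -> list:
    """Run a small read query using the credential issued by Partner Connect."""
    with sql.connect(
        server_hostname=hostname,  # e.g. the hostname from the Connect call
        http_path=http_path,       # the SQL Warehouse's HTTP path
        access_token=token,        # PAT ("dapi...") or OAuth M2M access token
    ) as conn:
        with conn.cursor() as cursor:
            cursor.execute("SELECT * FROM samples.nyctaxi.trips LIMIT 5")
            return cursor.fetchall()
```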
## Changelog
30 changes: 0 additions & 30 deletions api-doc/Apis/ConnectionApi.md
@@ -69,33 +69,3 @@ All URIs are relative to *https://domainnameofpartner*
- **Content-Type**: application/json
- **Accept**: application/json

- <a name="testConnection"></a>
- # **testConnection**
- > ConnectionTestResult testConnection(User-Agent, ConnectionInfo, Accept-Language, Content-Type)

- Test the connection created by calling connect endpoint. This api is currently only used in automated tests. In the future it may be included in the partner connect experience.

- ### Parameters

- |Name | Type | Description | Notes |
- |------------- | ------------- | ------------- | -------------|
- | **User-Agent** | **String**| The user agent making the call. This will be set to databricks. | [default to databricks] [enum: databricks] |
- | **ConnectionInfo** | [**ConnectionInfo**](../Models/ConnectionInfo.md)| | |
- | **Accept-Language** | **String**| Preferred language | [optional] [default to en-US] |
- | **Content-Type** | **String**| Content type | [optional] [default to application/json; charset=utf-8] |

- ### Return type

- [**ConnectionTestResult**](../Models/ConnectionTestResult.md)

- ### Authorization

- [basicAuth](../README.md#basicAuth)

- ### HTTP request headers

- - **Content-Type**: application/json
- - **Accept**: application/json

2 changes: 1 addition & 1 deletion api-doc/Models/Auth.md
@@ -3,7 +3,7 @@

| Name | Type | Description | Notes |
|------------ | ------------- | ------------- | -------------|
- | **personal\_access\_token** | **String** | Personal Access Token created for the Service Principal or the user | [default to null] |
+ | **personal\_access\_token** | **String** | Personal Access Token created for the Service Principal or the user. Note: this will be null if the `auth_options` in PartnerConfig is not null and does not contain the value `AUTH_PAT`. | [default to null] |
| **oauth\_token** | **String** | Oauth token. For future use. | [optional] [default to null] |
| **oauth\_scope** | **String** | Oauth scope. For future use. | [optional] [default to null] |

1 change: 1 addition & 0 deletions api-doc/Models/ConnectRequest.md
@@ -24,6 +24,7 @@
| **is\_sql\_warehouse** | **Boolean** | Determines whether cluster_id refers to Interactive Cluster or SQL warehouse. | [optional] [default to null] |
| **data\_source\_connector** | **String** | For data connector tools, the name of the data source that the user should be referred to in their tool. Unused today. | [optional] [default to null] |
| **service\_principal\_id** | **String** | The UUID (username) of the service principal identity that a partner product can use to call Databricks APIs. Note the format is different from the databricks_user_id field in user_info. If empty, no service principal was created | [optional] [default to null] |
+ | **service\_principal\_oauth\_secret** | **String** | The OAuth secret of the service principal identity that a partner product can use to call Databricks APIs (see [OAuth M2M](https://docs.databricks.com/en/dev-tools/auth/oauth-m2m.html)). It will be set only when the `auth_options` in the [PartnerConfig](PartnerConfig.md) contains the value `AUTH_OAUTH_M2M`. | [optional] [default to null] |
| **connection\_scope** | **String** | The scope of users that can use this connection. Workspace means all users in the same workspace. User means only the user creating it. | [optional] [default to null] |

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
1 change: 1 addition & 0 deletions api-doc/Models/PartnerConfig.md
@@ -23,6 +23,7 @@
| **require\_manual\_signup** | **Boolean** | True if the partner requires a manual signup after connect api is called. When set to true, connect api call with is_connection_established (sign in) is expected to return 404 account_not_found or connection_not_found until the user completes the manual signup step. | [optional] [default to null] |
| **trial\_type** | **String** | Enum describing the type of trials the partner support. Partners can chose to support trial account expiration at the individual user or account level. If trial level is user, expiring one user connection should not expire another user in the same account. | [optional] [default to null] |
| **supports\_demo** | **Boolean** | True if partner supports the demo flag in the connect api call. | [optional] [default to null] |
+ | **auth\_options** | **List** | The available authentication methods that a partner can use to authenticate with Databricks. If it is not specified, `AUTH_PAT` will be used. The allowed options are <ul><li><b>AUTH_PAT</b></li><li><b>AUTH_OAUTH_M2M</b></li></ul> | [optional] [default to null] |
| **test\_workspace\_detail** | [**PartnerConfig_test_workspace_detail**](PartnerConfig_test_workspace_detail.md) | | [optional] [default to null] |

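Read together with the `Auth` and `ConnectRequest` notes above, `auth_options` determines which credential fields a Connect call can carry. A small, non-normative sketch of that mapping (the helper and config literal are hypothetical):

```python
def expected_credential_fields(partner_config: dict) -> set[str]:
    """Map a partner's auth_options to the credential fields Connect calls may carry."""
    options = partner_config.get("auth_options") or ["AUTH_PAT"]  # unspecified => AUTH_PAT
    fields = set()
    if "AUTH_PAT" in options:
        fields.add("auth.personal_access_token")
    if "AUTH_OAUTH_M2M" in options:
        fields.add("service_principal_oauth_secret")
    return fields


# e.g. expected_credential_fields({"auth_options": ["AUTH_OAUTH_M2M"]})
# == {"service_principal_oauth_secret"}  (and personal_access_token will be null)
```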
[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
10 changes: 9 additions & 1 deletion openapi/partner-connect-2.0.yaml
@@ -333,7 +333,9 @@ components:
properties:
personal_access_token:
type: string
- description: Personal Access Token created for the Service Principal or the user
+ description: |
+   Personal Access Token created for the Service Principal or the user.
+   It will be null if the auth_options in PartnerConfig is not null and does not contain the value AUTH_PAT.
example: "token"
oauth_token:
type: string
@@ -480,6 +482,12 @@ components:
type: string
description: The UUID (username) of the service principal identity that a partner product can use to call Databricks APIs. Note the format is different from the databricks_user_id field in user_info. If empty, no service principal was created
example: "a2a25a05-3d59-4515-a73b-b8bc5ab79e31"
+ service_principal_oauth_secret:
+   type: string
+   description: |
+     The secret of the service principal identity that a partner product can use to call Databricks APIs.
+     It will be set only when the auth_options in PartnerConfig contains the value AUTH_OAUTH_M2M.
+   example: "secret"
connection_scope:
type: string
description: The scope of users that can use this connection. Workspace means all users in the same workspace. User means only the user creating it.
55 changes: 2 additions & 53 deletions src/main/scala/com/databricks/partnerconnect/example/Server.scala
@@ -12,11 +12,8 @@ import akka.stream.{ActorMaterializer, Materializer}
import com.databricks.partnerconnect.example.formatters.JsonFormatters._
import com.databricks.partnerconnect.example.handlers.{
  ConnectHandler,
-  DeleteAccountHandler,
  DeleteConnectionHandler,
-  ExpireAccountHandler,
-  GetConnectorHandler,
-  TestConnectionHandler
+  GetConnectorHandler
}
import com.databricks.partnerconnect.example.service.{
  AccountService,
@@ -27,9 +24,7 @@ import com.databricks.partnerconnect.example.util.ServiceLogger.withRequestLoggi
import com.databricks.partnerconnect.example.validators.{
  ConnectValidator,
  ConnectionInfoValidator,
-  DeleteAccountValidator,
-  DeleteConnectionRequestValidator,
-  ExpireAccountValidator
+  DeleteConnectionRequestValidator
}
import com.typesafe.scalalogging.Logger
import org.openapitools.client.model.{
@@ -60,17 +55,10 @@ case class Server(config: PartnerConfig) {
  val connectionHandler =
    new ConnectHandler(accountService, connectionService, config)
  val deleteConnectionHandler = new DeleteConnectionHandler(connectionService)
-  val deleteAccountHandler =
-    new DeleteAccountHandler(accountService, connectionService)
-  val expireAccountHandler =
-    new ExpireAccountHandler(accountService)
  val connectValidator = new ConnectValidator(config)
  val connectionInfoValidator = new ConnectionInfoValidator()
  val deleteConnectionRequestValidator = new DeleteConnectionRequestValidator()
-  val deleteAccountValidator = new DeleteAccountValidator()
-  val testConnectionHandler = new TestConnectionHandler(connectionService)
  val getConnectorHandler = new GetConnectorHandler(config)
-  val expireAccountValidator = new ExpireAccountValidator()

  def startServer(): Unit = {
    val route = handleRejections(rejectionHandler) {
@@ -111,42 +99,6 @@
      }
    }

-    val deleteAccountRoute: Route = path("delete-account") {
-      delete {
-        entity(as[AccountInfo]) { account =>
-          val validation =
-            deleteAccountValidator.validate(account)
-          validate(validation.valid, validation.toString) {
-            deleteAccountHandler.handle(account)
-          }
-        }
-      }
-    }

-    val expireAccountRoute: Route = path("expire-account") {
-      put {
-        entity(as[AccountUserInfo]) { accountUser =>
-          val validation =
-            expireAccountValidator.validate(accountUser)
-          validate(validation.valid, validation.toString) {
-            expireAccountHandler.handle(accountUser)
-          }
-        }
-      }
-    }

-    val testConnectionRoute: Route = path("test-connection") {
-      post {
-        entity(as[ConnectionInfo]) { connectionInfo =>
-          val validation =
-            connectionInfoValidator.validate(connectionInfo)
-          validate(validation.valid, validation.toString) {
-            testConnectionHandler.handle(connectionInfo)
-          }
-        }
-      }
-    }

    val getConnectorsRoute: Route = path("connectors") {
      get {
        parameter("pagination_token") { p =>
@@ -185,9 +137,6 @@
      concat(
        connectRoute,
        deleteConnectionRoute,
-        expireAccountRoute,
-        deleteAccountRoute,
-        testConnectionRoute,
        getConnectorsRoute
      )
    }