OAuth Integrations

OAuth integrations let content published to Connect Cloud access external data sources on behalf of the viewer, using each viewer’s own credentials. Connect Cloud handles acquiring and refreshing the time-limited access tokens, so content can read live data without storing or managing credentials itself.

With a Viewer integration, each viewer is prompted to log in to the data provider the first time they open the content. The content then sees only the data that viewer is authorized to access in the provider — permissions stay governed by the data system, not by Connect Cloud.

Adding a Data Source

Account Admins can configure OAuth integrations for their account.

  1. Go to the Admin > Integrations tab.
  2. Click Add Data Source.
  3. Enter a Name and Description for the integration.
  4. Select a Data Provider (Databricks or Snowflake).
  5. Provide the provider-specific connection details:
    • Client ID
    • Client Secret
    • Host URL (labeled Workspace Host for Databricks; Account URL for Snowflake)
    • Scopes
  6. Save the integration.

Once saved, the integration is available to publishers in the account, who can attach it to their deployed content.

Databricks

Before adding a Databricks data source in Connect Cloud, a Databricks account administrator must register an OAuth application. This is what produces the Client ID and Client Secret that Connect Cloud uses to authenticate viewers.

In Databricks, register a Confidential custom app integration (recommended over Public), supplying https://api.connect.posit.cloud/v1/integrations/callback as the redirect URI and the scopes you want to make available. A confidential app issues a Client Secret; a public app does not. Once registered, Databricks displays the Client ID and Client Secret to enter into Connect Cloud.
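If you prefer to script the registration rather than use the Databricks account console, the request body for the account-level custom app integration API looks roughly like the following. This is a sketch assuming the `/api/2.0/accounts/{account_id}/oauth2/custom-app-integrations` endpoint; the redirect URL and scopes below come from this page's defaults, but verify field names against the current Databricks API reference:

```json
{
  "name": "Posit Connect Cloud",
  "confidential": true,
  "redirect_urls": ["https://api.connect.posit.cloud/v1/integrations/callback"],
  "scopes": ["all-apis", "sql", "offline_access", "openid", "profile", "email"]
}
```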

The Workspace Host is your Databricks workspace hostname — for example, cust-success.cloud.databricks.com.

Connect Cloud defaults the Scopes to all-apis sql offline_access openid profile email. Admins can update this if their Databricks configuration requires different scopes.

Code examples

In deployed content, Connect Cloud sets the Posit-Connect-User-Session-Token header on each viewer request. Use the connectapi (R) or posit-sdk (Python) package to exchange that token for the viewer’s Databricks access token, then pass it to your Databricks client.
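When content runs outside Connect Cloud (for example, during local development), the session-token header is absent and the exchange cannot happen. A minimal sketch of a fallback pattern, assuming a hypothetical DATABRICKS_TOKEN environment variable holding the developer's own personal access token (the function name and that variable are illustrative, not part of the SDK):

```python
import os


def resolve_access_token(headers):
    """Return a Databricks access token for the current viewer.

    On Connect Cloud, exchange the session token via the posit-sdk;
    locally, fall back to a personal token from the environment.
    """
    session_token = headers.get("Posit-Connect-User-Session-Token")
    if session_token:
        # Only import the SDK on the deployed path.
        from posit import connect

        client = connect.Client()
        credentials = client.oauth.get_credentials(session_token)
        return credentials["access_token"]
    # Hypothetical local fallback; set DATABRICKS_TOKEN yourself.
    return os.environ["DATABRICKS_TOKEN"]
```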

The examples below assume the following environment variables are set on the deployed content:

  • DATABRICKS_HOST
  • DATABRICKS_HTTP_PATH

R (Shiny)

library(shiny)
library(DBI)
library(odbc)
library(connectapi)
library(DT)

# Query Databricks with the viewer's OAuth access token via ODBC.
fetch_trips <- function(access_token) {
  # The ODBC driver expects a bare hostname, not a full URL.
  host <- sub("https://", "", Sys.getenv("DATABRICKS_HOST"), fixed = TRUE)

  conn <- dbConnect(
    odbc::odbc(),
    driver = "Databricks",
    Host = host,
    Port = 443,
    HTTPPath = Sys.getenv("DATABRICKS_HTTP_PATH"),
    SSL = 1,
    ThriftTransport = 2,
    AuthMech = 11,    # OAuth 2.0 access token
    Auth_Flow = 0,    # token pass-through
    Auth_AccessToken = access_token
  )
  on.exit(dbDisconnect(conn))
  dbGetQuery(conn, "SELECT * FROM samples.nyctaxi.trips LIMIT 100")
}

ui <- fluidPage(
  actionButton("load", "Load data", class = "btn-primary"),
  DT::DTOutput("table")
)

server <- function(input, output, session) {
  data <- reactiveVal(NULL)

  observeEvent(input$load, {
    # Exchange the viewer's session token for their Databricks access token.
    session_token <- session$request$HTTP_POSIT_CONNECT_USER_SESSION_TOKEN
    client <- connectapi::connect()
    credentials <- connectapi::get_oauth_credentials(client, session_token)
    data(fetch_trips(credentials$access_token))
  })

  output$table <- DT::renderDT({
    df <- data()
    if (is.null(df)) DT::datatable(data.frame()) else DT::datatable(df)
  })
}

shinyApp(ui, server)

Python (Shiny)

import os

import pandas as pd
from databricks import sql as dbsql
from posit import connect
from shiny import App, Inputs, Outputs, Session, reactive, render, ui

app_ui = ui.page_fluid(
    ui.input_action_button("load", "Load data", class_="btn-primary"),
    ui.output_data_frame("table"),
)


def server(input: Inputs, output: Outputs, session: Session):
    data = reactive.Value(None)

    @reactive.effect
    @reactive.event(input.load)
    def _():
        # Exchange the viewer's session token for their Databricks access token.
        session_token = session.http_conn.headers.get(
            "Posit-Connect-User-Session-Token"
        )
        client = connect.Client()
        credentials = client.oauth.get_credentials(session_token)

        conn = dbsql.connect(
            server_hostname=os.environ["DATABRICKS_HOST"],
            http_path=os.environ["DATABRICKS_HTTP_PATH"],
            access_token=credentials["access_token"],
        )
        try:
            with conn.cursor() as cur:
                cur.execute("SELECT * FROM samples.nyctaxi.trips LIMIT 100")
                rows = cur.fetchall()
                cols = [d[0] for d in cur.description]
            df = pd.DataFrame(rows, columns=cols)
        finally:
            conn.close()
        data.set(df)

    @render.data_frame
    def table():
        return data() if data() is not None else pd.DataFrame()


app = App(app_ui, server)

Snowflake

Before adding a Snowflake data source in Connect Cloud, a Snowflake account administrator must create an OAuth security integration. This is what produces the Client ID and Client Secret that Connect Cloud uses to authenticate viewers.

In Snowflake, create a custom confidential OAuth security integration, for example:

CREATE SECURITY INTEGRATION POSIT_CONNECT_CLOUD
  TYPE = OAUTH
  ENABLED = TRUE
  OAUTH_CLIENT = CUSTOM
  OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
  OAUTH_REDIRECT_URI = 'https://api.connect.posit.cloud/v1/integrations/callback'
  OAUTH_ISSUE_REFRESH_TOKENS = TRUE;

Then retrieve the Client ID and Client Secret to enter into Connect Cloud:

SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('POSIT_CONNECT_CLOUD');
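To review the integration's settings, including the OAuth authorization and token endpoints Snowflake generated for it, you can also run:

```sql
DESC SECURITY INTEGRATION POSIT_CONNECT_CLOUD;
```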

The Account URL is your Snowflake account URL — for example, https://myorg-account_xyz.snowflakecomputing.com.
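Note that the code examples below pass a SNOWFLAKE_ACCOUNT environment variable holding the account identifier (e.g., myorg-account_xyz) rather than the full Account URL. A small helper sketch for deriving one from the other (the function name is illustrative):

```python
from urllib.parse import urlparse


def account_identifier(account_url):
    """Derive the account identifier the Snowflake clients expect
    from a full Snowflake account URL."""
    host = urlparse(account_url).hostname or account_url
    return host.removesuffix(".snowflakecomputing.com")
```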

Connect Cloud defaults the Scopes to refresh_token. Admins can update this if their Snowflake configuration requires different scopes.

Code examples

In deployed content, Connect Cloud sets the Posit-Connect-User-Session-Token header on each viewer request. Use the connectapi (R) or posit-sdk (Python) package to exchange that token for the viewer’s Snowflake access token, then pass it to your Snowflake client.

The examples below assume the following environment variables are set on the deployed content:

  • SNOWFLAKE_ACCOUNT
  • SNOWFLAKE_DATABASE
  • SNOWFLAKE_SCHEMA
  • SNOWFLAKE_WAREHOUSE

R (Shiny)

library(shiny)
library(DBI)
library(odbc)
library(connectapi)
library(DT)

# Query Snowflake with the viewer's OAuth access token via ODBC.
fetch_sales <- function(access_token) {
  conn <- dbConnect(
    odbc::odbc(),
    driver = "Snowflake",
    Server = paste0(Sys.getenv("SNOWFLAKE_ACCOUNT"), ".snowflakecomputing.com"),
    Database = Sys.getenv("SNOWFLAKE_DATABASE"),
    Schema = Sys.getenv("SNOWFLAKE_SCHEMA"),
    Warehouse = Sys.getenv("SNOWFLAKE_WAREHOUSE"),
    Authenticator = "oauth",
    Token = access_token
  )
  on.exit(dbDisconnect(conn))
  dbGetQuery(conn, "SELECT * FROM SALES")
}

ui <- fluidPage(
  actionButton("load", "Load data", class = "btn-primary"),
  DT::DTOutput("table")
)

server <- function(input, output, session) {
  data <- reactiveVal(NULL)

  observeEvent(input$load, {
    # Exchange the viewer's session token for their Snowflake access token.
    session_token <- session$request$HTTP_POSIT_CONNECT_USER_SESSION_TOKEN
    client <- connectapi::connect()
    credentials <- connectapi::get_oauth_credentials(client, session_token)
    data(fetch_sales(credentials$access_token))
  })

  output$table <- DT::renderDT({
    df <- data()
    if (is.null(df)) DT::datatable(data.frame()) else DT::datatable(df)
  })
}

shinyApp(ui, server)

Python (Shiny)

import os

import pandas as pd
import snowflake.connector
from posit import connect
from shiny import App, Inputs, Outputs, Session, reactive, render, ui

app_ui = ui.page_fluid(
    ui.input_action_button("load", "Load data", class_="btn-primary"),
    ui.output_data_frame("table"),
)


def server(input: Inputs, output: Outputs, session: Session):
    data = reactive.Value(None)

    @reactive.effect
    @reactive.event(input.load)
    def _():
        # Exchange the viewer's session token for their Snowflake access token.
        session_token = session.http_conn.headers.get(
            "Posit-Connect-User-Session-Token"
        )
        client = connect.Client()
        credentials = client.oauth.get_credentials(session_token)

        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            authenticator="oauth",
            token=credentials["access_token"],
            warehouse=os.environ["SNOWFLAKE_WAREHOUSE"],
            database=os.environ["SNOWFLAKE_DATABASE"],
            schema=os.environ["SNOWFLAKE_SCHEMA"],
        )
        try:
            df = pd.read_sql("SELECT * FROM SALES", conn)
        finally:
            conn.close()
        data.set(df)

    @render.data_frame
    def table():
        return data() if data() is not None else pd.DataFrame()


app = App(app_ui, server)