You use the Kafka source connector in Tiger Cloud to stream events from Kafka into your service. Tiger Cloud connects to your Confluent Cloud Kafka cluster and Schema Registry using SASL/SCRAM authentication and service account–based API keys. Only the Avro format is currently supported.
This page explains how to connect Tiger Cloud to your Confluent Cloud Kafka cluster.
To follow the steps on this page, you need:

- A target Tiger Cloud service with real-time analytics enabled.
- Your connection details.
Take the following steps to prepare your Kafka cluster for connection to Tiger Cloud:
Create a service account

If you already have a service account for Tiger Cloud, you can reuse it. To create a new service account:

1. Log in to Confluent Cloud.
2. Click the burger menu at the top right of the pane, then click **Access control** > **Service accounts** > **Add service account**.
3. Enter the following details:
   - **Name**: `tigercloud-access`
   - **Description**: Service account for the Tiger Cloud source connector
4. Add the service account owner role, then click **Next**.
5. Select a role assignment, then click **Add**.
6. Click **Next**, then click **Create service account**.
Create API keys

1. In Confluent Cloud, click **Home** > **Environments**, select your environment, then select your cluster.
2. Under **Cluster overview** in the left sidebar, select **API Keys**.
3. Click **Add key**, choose **Service Account**, and click **Next**.
4. Select `tigercloud-access`, then click **Next**.
5. For your cluster, choose the **Operation** and select the following **Permissions**, then click **Next**:
   - **Resource type**: Cluster
   - **Operation**: DESCRIBE
   - **Permission**: ALLOW
6. Click **Download and continue**, then securely store the API key.
7. Use the same procedure to add the following ACLs:
   - ACL 2: Topic access
     - **Resource type**: Topic
     - **Topic name**: select the topics that Tiger Cloud should read
     - **Pattern type**: LITERAL
     - **Operation**: READ
     - **Permission**: ALLOW
   - ACL 3: Consumer group access
     - **Resource type**: Consumer group
     - **Consumer group ID**: `tigercloud-kafka/<tiger_cloud_project_id>`. See Find your connection details for where to find your project ID.
     - **Pattern type**: PREFIXED
     - **Operation**: READ
     - **Permission**: ALLOW

You need these API keys to configure your Kafka source connector in Tiger Cloud.
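To make the pieces above concrete, here is a minimal sketch of the client configuration that the downloaded API key and the PREFIXED consumer-group ACL correspond to. This is illustrative only: the bootstrap server, key, secret, and project ID below are placeholders, and the property names follow the common librdkafka-style convention, which Tiger Cloud's internal client is assumed (not confirmed) to resemble. The SCRAM variant (`SCRAM-SHA-256`) is likewise an assumption; check your cluster's supported mechanisms.

```python
# Hypothetical helper: builds the kind of configuration a Kafka client would
# use to authenticate against Confluent Cloud with SASL/SCRAM, matching the
# ACLs created above. All values shown are placeholders.
def kafka_client_config(bootstrap_server, api_key, api_secret, project_id):
    return {
        "bootstrap.servers": bootstrap_server,
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "SCRAM-SHA-256",  # assumption: SCRAM-SHA-512 may apply instead
        "sasl.username": api_key,            # the API key from api-key-*.txt
        "sasl.password": api_secret,         # the API secret from api-key-*.txt
        # The group ID must fall under the PREFIXED consumer-group ACL,
        # i.e. start with tigercloud-kafka/<tiger_cloud_project_id>.
        "group.id": f"tigercloud-kafka/{project_id}",
    }

config = kafka_client_config(
    "pkc-xxxxx.region.provider.confluent.cloud:9092",  # placeholder bootstrap server
    "MY_API_KEY",
    "MY_API_SECRET",
    "my-project-id",
)
```

The important invariant is the last line: the consumer group ID must begin with the `tigercloud-kafka/` prefix you granted READ on, or the connector's reads will be denied.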
Tiger Cloud requires access to the Schema Registry to fetch schemas for Kafka topics. To configure the Schema Registry:
Navigate to Schema Registry

1. In Confluent Cloud, click **Environments** and select your environment, then click **Stream Governance**.

Create a Schema Registry API key

1. Click **API Keys**, then click **Add API Key**.
2. Choose **Service Account**, select `tigercloud-access`, then click **Next**.
3. Under **Resource scope**, choose **Schema Registry**, select the `default` environment, then click **Next**.
4. In **Create API Key**, add the following, then click **Create API Key**:
   - **Name**: `tigercloud-schema-registry-access`
   - **Description**: API key for Tiger Cloud schema registry access
5. Click **Download API Key** and securely store the API key and secret, then click **Complete**.
Assign roles for Schema Registry

1. Click the burger menu at the top right of the pane, then click **Access control** > **Accounts & access** > **Service accounts**.
2. Select the `tigercloud-access` service account.
3. In the **Access** tab, add the following role assignments:
   - **ResourceOwner** on the service account.
   - **DeveloperRead** on schema subjects. Choose **All schema subjects** or restrict to specific subjects as required.
4. Save the role assignments.

Your Confluent Cloud Schema Registry is now accessible to Tiger Cloud using the API key and secret.
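As with the cluster API key, the Schema Registry credentials reduce to an HTTPS endpoint plus a basic-auth `key:secret` pair. The sketch below is a hypothetical illustration of how such credentials are conventionally assembled for registry clients; the URL and credentials are placeholders, and the property name follows the common `basic.auth.user.info` convention rather than anything Tiger Cloud documents.

```python
# Hypothetical sketch: how Schema Registry credentials are conventionally
# packaged for a registry client. All values are placeholders.
def schema_registry_config(sr_url, sr_api_key, sr_api_secret):
    return {
        "url": sr_url,
        # Common convention: basic-auth credentials joined as "key:secret".
        "basic.auth.user.info": f"{sr_api_key}:{sr_api_secret}",
    }

sr_config = schema_registry_config(
    "https://psrc-xxxxx.region.provider.confluent.cloud",  # placeholder registry URL
    "SR_API_KEY",
    "SR_API_SECRET",
)
```

When you configure the connector in Tiger Cloud Console, you supply the same three values (registry URL, key, secret) through the UI rather than a config file.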
Take the following steps to create a Kafka source connector in Tiger Cloud Console:

1. In Console, select your service.
2. Go to **Connectors** > **Source connectors**. Click **New Connector**, then select **Kafka**.
3. Click the pencil icon, then set the connector name.
4. Set up Kafka authentication: enter the name of your cluster in Confluent Cloud and the information from the first `api-key-*.txt` file that you downloaded, then click **Authenticate**.
5. Set up the Schema Registry: enter the service account ID and the information from the second `api-key-*.txt` file that you downloaded, then click **Authenticate**.
6. Select the topics to sync, add the schema and table, map the columns in the table, and click **Create connector**.

Your Kafka connector is configured and ready to stream events.
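Because the connector only supports the Avro format, each topic you sync needs an Avro schema registered in the Schema Registry. The record below is a purely illustrative example of what such a schema could look like for a hypothetical `sensor_reading` event; the field names and types are invented for this sketch, not prescribed by Tiger Cloud.

```python
import json

# Illustrative Avro record schema for a hypothetical "sensor_reading" event.
# Field names and types are examples only; use whatever your events contain.
sensor_reading_schema = json.dumps({
    "type": "record",
    "name": "sensor_reading",
    "fields": [
        {"name": "sensor_id", "type": "string"},
        # timestamp-millis is Avro's standard logical type for epoch milliseconds
        {"name": "ts", "type": {"type": "long", "logicalType": "timestamp-millis"}},
        {"name": "value", "type": "double"},
    ],
})

# The connector maps fields like these to columns in the target table.
parsed = json.loads(sensor_reading_schema)
```

Events produced to the topic must be encoded against a schema like this one so that Tiger Cloud can fetch it from the registry and map the fields to the columns you configured in the connector.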