You use the Kafka source connector in Tiger Cloud to stream events from Kafka into your service. Tiger Cloud connects to your Confluent Cloud Kafka cluster and Schema Registry using SASL/SCRAM authentication and service account–based API keys. Only the Avro format is currently supported, with some limitations.
This page explains how to connect Tiger Cloud to your Confluent Cloud Kafka cluster.
Early access: the Kafka source connector is not yet supported for production use.

To follow the steps on this page:

- Create a target Tiger Cloud service with real-time analytics enabled.
- Have your connection details ready.
Note
This feature is currently not supported for Tiger Cloud on Microsoft Azure.
Take the following steps to prepare your Kafka cluster for connection to Tiger Cloud:
Create a service account
If you already have a service account for Tiger Cloud, you can reuse it. To create a new service account:
1. Log in to Confluent Cloud.
2. Click the burger menu at the top right of the pane, then click Access control > Service accounts > Add service account.
3. Enter the following details:
   - Name: `tigerdata-access`
   - Description: `Service account for the Tiger Cloud source connector`
4. Add the service account owner role, then click Next.
5. Select a role assignment, then click Add.
6. Click Next, then click Create service account.
Create API keys
1. In Confluent Cloud, click Home > Environments, select your environment, then select your cluster.
2. Under Cluster overview in the left sidebar, select API Keys.
3. Click Add key, choose Service Account, and click Next.
4. Select `tigerdata-access`, then click Next.
5. For your cluster, choose the Operation and select the following Permissions, then click Next:
   - Resource type: Cluster
   - Operation: DESCRIBE
   - Permission: ALLOW
6. Click Download and continue, then securely store the API key.
7. Use the same procedure to add the following ACLs:
   - ACL 2: Topic access
     - Resource type: Topic
     - Topic name: select the topics that Tiger Cloud should read
     - Pattern type: LITERAL
     - Operation: READ
     - Permission: ALLOW
   - ACL 3: Consumer group access
     - Resource type: Consumer group
     - Consumer group ID: `tigerdata-kafka/<tiger_cloud_project_id>`. See Find your connection details for where to find your project ID.
     - Pattern type: PREFIXED
     - Operation: READ
     - Permission: ALLOW

You need these keys to configure your Kafka source connector in Tiger Cloud.
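For reference, the service account ends up with three ACLs. The JSON below is an illustrative summary of that access, not a Confluent API payload; the field names are for illustration only:

```json
[
  {"resource_type": "Cluster", "operation": "DESCRIBE", "pattern_type": "LITERAL", "permission": "ALLOW"},
  {"resource_type": "Topic", "resource_name": "<your_topic>", "operation": "READ", "pattern_type": "LITERAL", "permission": "ALLOW"},
  {"resource_type": "Consumer group", "resource_name": "tigerdata-kafka/<tiger_cloud_project_id>", "operation": "READ", "pattern_type": "PREFIXED", "permission": "ALLOW"}
]
```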
Tiger Cloud requires access to the Schema Registry to fetch schemas for Kafka topics. To configure the Schema Registry:
Navigate to Schema Registry

1. In Confluent Cloud, click Environments, select your environment, then click Stream Governance.

Create a Schema Registry API key

1. Click API Keys, then click Add API Key.
2. Choose Service Account, select `tigerdata-access`, then click Next.
3. Under Resource scope, choose Schema Registry, select the `default` environment, then click Next.
4. In Create API Key, add the following, then click Create API Key:
   - Name: `tigerdata-schema-registry-access`
   - Description: `API key for Tiger Cloud schema registry access`
5. Click Download API Key and securely store the API key and secret, then click Complete.
Assign roles for Schema Registry
1. Click the burger menu at the top right of the pane, then click Access control > Accounts & access > Service accounts.
2. Select the `tigerdata-access` service account.
3. In the Access tab, add the following role assignments:
   - ResourceOwner on the service account.
   - DeveloperRead on schema subjects. Choose All schema subjects or restrict to specific subjects as required.
4. Save the role assignments.
Your Confluent Cloud Schema Registry is now accessible to Tiger Cloud using the API key and secret.
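Once the key and roles are in place, you can see what the connector will fetch: Confluent Schema Registry serves schemas over REST, and with the default TopicNameStrategy the value schema for a topic such as `orders` is registered under the subject `orders-value`. The response below is an illustrative example of the registry's standard latest-version lookup; all values are hypothetical:

```json
{
  "subject": "orders-value",
  "version": 1,
  "id": 100042,
  "schema": "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[{\"name\":\"order_id\",\"type\":\"long\"}]}"
}
```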
Take the following steps to create a Kafka source connector in Tiger Cloud Console.
1. In Console, select your service.
2. Go to Connectors > Source connectors. Click New Connector, then select Kafka. Click the pencil icon, then set the connector name.
3. Set up Kafka authentication: enter the name of your cluster in Confluent Cloud and the information from the first `api-key-*.txt` file that you downloaded, then click Authenticate.
4. Set up the Schema Registry: enter the service account ID and the information from the second `api-key-*.txt` file that you downloaded, then click Authenticate.
5. Select topics to sync: add the schema and table, map the columns in the table, and click Create connector.
Your Kafka connector is configured and ready to stream events.
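As a sketch of how topics map to tables, a topic whose value schema is the hypothetical record below would surface `order_id`, `status`, and `created_at` as candidate columns during the mapping step:

```json
{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "order_id", "type": "long"},
    {"name": "status", "type": "string"},
    {"name": "created_at", "type": {"type": "long", "logicalType": "timestamp-millis"}}
  ]
}
```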
The following Avro schema types are not supported:
Multi-type non-nullable unions are blocked. Examples:

Multiple type union:

```json
{
  "type": "record",
  "name": "Message",
  "fields": [
    {"name": "content", "type": ["string", "bytes", "null"]}
  ]
}
```

Union as root schema:

```json
["null", "string"]
```
Referencing a previously defined named type by name, instead of inline, is not supported.
Examples:
Named type definition:

```json
{
  "type": "record",
  "name": "Address",
  "fields": [
    {"name": "street", "type": "string"},
    {"name": "city", "type": "string"}
  ]
}
```

Failing reference:

```json
{
  "type": "record",
  "name": "Person",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "address", "type": "Address"}
  ]
}
```
Only logical types on a fixed supported list are accepted:

- `decimal`, `date`, `time-millis`, `time-micros`
- `timestamp-millis`, `timestamp-micros`, `timestamp-nanos`
- `local-timestamp-millis`, `local-timestamp-micros`, `local-timestamp-nanos`
- `uuid`, `duration`
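For example, a hypothetical schema that uses only listed logical types on their standard Avro base types should be accepted:

```json
{
  "type": "record",
  "name": "Reading",
  "fields": [
    {"name": "id", "type": {"type": "string", "logicalType": "uuid"}},
    {"name": "price", "type": {"type": "bytes", "logicalType": "decimal", "precision": 10, "scale": 2}},
    {"name": "recorded_at", "type": {"type": "long", "logicalType": "timestamp-micros"}}
  ]
}
```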
Unsupported examples:
{"type": "int","logicalType": "date-time"}{"type": "string","logicalType": "json"}{"type": "bytes","logicalType": "custom-type"}