Note
The Kafka Connector is available in open beta. If your Samsara account is on the Early Adopter or Continuous Release track, you can opt in to test Kafka streaming in Feature Management.
Automatically stream Samsara live operations data into a Kafka cluster to power real-time applications and populate your data warehouse.
Samsara's Kafka producer supports the following use cases:
- GPS locations of vehicles, trailers, equipment, and asset tags
- Vehicle diagnostics, including odometer, engine hours, fault codes, fuel levels, and more
- Events, including alert incidents, severe speeding, geofence entry and exit, document submitted, and more
For the message payload of each data entity, see the Kafka Connector Reference.
Requirements:
- Kafka: Kafka 0.10.1.0 or a later release.
- Samsara: Data streaming access through your Platform license.
Some entities require licenses in Telematics and Safety Data for the data to be available to stream. For example, an AG license is required to stream Equipment GPS.
Your IT team may need approval from your internal Security team to allow an external producer such as Samsara to write to your Kafka cluster. In addition, be aware of the following:
- The Kafka Connector requires a publicly reachable endpoint for every broker in your Kafka cluster.
- Samsara requires only a minimal set of permissions to produce to a Kafka topic. We recommend that you grant the authentication credentials only WRITE and DESCRIBE permissions on the specific topic. Samsara encrypts the secrets it uses to connect to your Kafka cluster.
- The Samsara Kafka Connector sends requests from a static set of IP addresses. If you restrict access to your Kafka cluster by source IP, you must add Samsara's IP addresses to your allow list. To obtain the list of IP addresses, sign in to your Samsara dashboard and navigate to Settings > Webhooks.
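If you administer your own cluster rather than a managed service, the minimal grant described above can be expressed with Kafka's stock ACL tool. This is only a sketch of the idea, not Samsara-prescribed tooling: the principal name samsara-producer, the topic samsara-events, the broker address, and the admin.properties file are all placeholders to substitute with your own values.

```shell
# Grant only WRITE and DESCRIBE on the destination topic to the
# credentials Samsara will use (all names below are placeholders).
kafka-acls.sh --bootstrap-server broker.example.com:9092 \
  --command-config admin.properties \
  --add \
  --allow-principal User:samsara-producer \
  --operation Write --operation Describe \
  --topic samsara-events
```

Scoping the ACL to a single literal topic, rather than a prefix or wildcard, keeps the external producer's blast radius as small as possible.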
Samsara built the Kafka Connector and will assist with deployments as needed. Follow the steps below to get started. If there are any issues, please contact Samsara Support.
- Set up the Kafka cluster.
If you do not already have a Kafka cluster, select a cloud platform of your choice to deploy one. Refer to our Data Connector guide to view a list of options and learn more.
The remaining steps to connect the Samsara Kafka producer to your Kafka cluster are illustrated using the Confluent cloud platform.
- Create a topic in your Confluent Kafka cluster to receive data from Samsara.
If your Kafka cluster is inaccessible from outside of your network, ensure that you:
- Allow inbound connections to all brokers for the Samsara Kafka Connector. To obtain the list of IP addresses, sign in to your Samsara dashboard and navigate to Settings > Webhooks.
- Configure advertised listeners with externally resolvable addresses or hostnames. For more information, see the Confluent documentation about Kafka Listeners.
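On a self-managed broker, advertised listeners are set in server.properties; Confluent Cloud handles this for you. The fragment below is a sketch only, and the listener names, hostnames, and ports are placeholders:

```properties
# server.properties (per broker) -- hostnames and ports are placeholders
listeners=INTERNAL://0.0.0.0:9092,EXTERNAL://0.0.0.0:9093
advertised.listeners=INTERNAL://broker1.internal:9092,EXTERNAL://broker1.example.com:9093
listener.security.protocol.map=INTERNAL:PLAINTEXT,EXTERNAL:SASL_SSL
inter.broker.listener.name=INTERNAL
```

The EXTERNAL listener is the address the Samsara producer must be able to resolve and reach from the public internet.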
- Configure a SASL/PLAIN user on the broker. See steps 1-3 in the configuration documentation about Kafka Listeners. You must configure SASL_SSL, not SASL_PLAINTEXT.
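For a self-managed broker, enabling SASL/PLAIN over TLS on the external listener might look like the following server.properties sketch; Confluent Cloud instead provides this through API keys. The listener name, username, secret, and keystore paths are placeholders:

```properties
# Require TLS for the SASL listener: SASL_SSL, never SASL_PLAINTEXT
listener.name.external.sasl.enabled.mechanisms=PLAIN
listener.name.external.plain.sasl.jaas.config=\
  org.apache.kafka.common.security.plain.PlainLoginModule required \
  user_samsara="<placeholder-secret>";
ssl.keystore.location=/etc/kafka/broker.keystore.jks
ssl.keystore.password=<placeholder>
```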
- Grant the user ACLs with write access to the new topic. For your specific Topic name, grant the same Permission, Operation, and Pattern Types as shown, and then click Next.
- Record the key and secret for future use in the Samsara dashboard.
- From the Samsara dashboard, set up the Kafka Connector.
- Navigate to Settings > Data Streaming.
- Select Connected Clusters.
- Click + Create Connection.
- Enter the following information generated from your Kafka cluster:
- URL: You can find the URL in the Cluster Settings under Bootstrap server. This location may differ if you are not using Confluent.
- SASL Mechanism: Select SASL/PLAIN based on your Confluent setup.
- Key: The Key (Kafka SASL Username) that Confluent generated when you set up topic access in the previous step.
- Secret: The Secret (Kafka SASL Password) that Confluent generated when you set up topic access in the previous step.
- Cluster Name: Provide a name to easily identify your authenticated cluster when setting up data streams.
- Click + Save when finished.
If the configuration is successful, Samsara displays a Successfully created Kafka subscription notification. The connector stream initialization takes a few minutes to complete.
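The connection fields above map one-to-one onto standard Kafka client settings, so you can sanity-check the same values outside the dashboard by building a client configuration from them. The sketch below uses kafka-python-style option names; the hostname, key, secret, and topic are placeholders, not values from your account.

```python
def samsara_cluster_config(bootstrap_url: str, key: str, secret: str) -> dict:
    """Mirror the dashboard connection fields as Kafka client settings
    (kafka-python option names; all argument values are placeholders)."""
    return {
        "bootstrap_servers": bootstrap_url,  # URL: Bootstrap server in Cluster Settings
        "security_protocol": "SASL_SSL",     # SASL over TLS, never plaintext
        "sasl_mechanism": "PLAIN",           # SASL Mechanism field
        "sasl_plain_username": key,          # Key (Kafka SASL Username)
        "sasl_plain_password": secret,       # Secret (Kafka SASL Password)
    }

config = samsara_cluster_config(
    "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092", "MY_KEY", "MY_SECRET")

# A producer built from this config should be able to write a test record:
# from kafka import KafkaProducer
# KafkaProducer(**config).send("your-topic", b"connectivity check").get(timeout=30)
print(config["security_protocol"])  # prints SASL_SSL
```

If a client with these settings cannot produce to the topic, the dashboard connection will fail the same way, which makes this a useful way to isolate credential or network problems.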
- Select Streams.
- Click + Create Stream and enter the following information:
- Connected Cluster: Select a connected Kafka cluster with the intended topic destination.
- Topic Name: Match the Topic Name in your connected cluster for the destination of this subscription stream.
- Entities: Select one or more data entities to stream to a connected cluster. Each selected entity will be a unique stream.
- Click + Create Streams when finished.
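After the streams are created, you can confirm data is arriving by consuming from the destination topic. The sketch below assumes the record values are UTF-8 JSON; consult the Kafka Connector Reference for each entity's actual payload schema. The topic name and the sample record are purely illustrative, not real Samsara payloads.

```python
import json

def decode_record_value(value: bytes) -> dict:
    """Decode one Kafka record value, assuming a UTF-8 JSON payload.
    See the Kafka Connector Reference for each entity's actual schema."""
    return json.loads(value.decode("utf-8"))

# With a client library such as kafka-python (names are placeholders;
# `config` is the SASL_SSL client configuration for your cluster):
# from kafka import KafkaConsumer
# consumer = KafkaConsumer("your-topic", auto_offset_reset="earliest", **config)
# for record in consumer:
#     print(decode_record_value(record.value))

# Illustrative record value only -- not an actual Samsara payload:
sample = b'{"entity": "vehicle-location", "latitude": 37.7749}'
print(decode_record_value(sample)["entity"])  # prints vehicle-location
```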
For issues related to the integration, contact Samsara Support.