Transform Oracle database transactions into business events with SharePlex and Kafka

Oracle and other relational databases are perfect for storing “transactions”, the results of interactions between people or systems. For example, when a customer purchases a widget from our store, we may need to perform the following steps in a single “unit of work” (sketched in code after this list):

  • Update the number of widgets “in-stock” at the store
  • Update various financial records to reflect the collection of payment from the customer
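
As a rough sketch of that unit of work, here is what it might look like in Python using the python-oracledb driver. The connection details, table names, and column names are assumptions made up for illustration, not a real schema:

```python
import oracledb  # python-oracledb, Oracle's Python driver

# Connection details and table/column names are illustrative assumptions.
conn = oracledb.connect(user="store_app", password="secret",
                        dsn="dbhost/ORCLPDB1")

try:
    with conn.cursor() as cur:
        # Step 1: decrement the in-stock count for the purchased widget.
        cur.execute(
            "UPDATE widget_inventory SET in_stock = in_stock - 1 "
            "WHERE store_id = :store_id AND widget_id = :widget_id",
            store_id=101, widget_id=5001,
        )
        # Step 2: record the payment collected from the customer.
        cur.execute(
            "INSERT INTO payments (customer_id, amount) VALUES (:cust, :amt)",
            cust=42, amt=19.99,
        )
    conn.commit()    # both changes succeed together...
except Exception:
    conn.rollback()  # ...or neither is applied
    raise
finally:
    conn.close()
```

The key point is the single commit: either both statements take effect or neither does, which is exactly what relational databases are built to guarantee.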

What relational databases are not so good at, without additional coding, is keeping track of changes in state, or historical data. This includes things like how many widgets a particular customer has purchased over an arbitrary time period, how fast a particular store’s stock of widgets is changing, or the time between purchases. These “change events” can feed machine learning processes that predict future events and provide deep insight into customer behavior.

Of course, we can write programs that capture this information and generate reports, but that requires knowing what we’re looking for before we start coding. That code also takes time to develop and test.

We need a tool that captures the change data without requiring significant resources to configure and manage. Quest SharePlex fits this need, and we can use Kafka to move the data SharePlex provides from our Oracle database to the ultimate “consumer”.

SharePlex Change Data Capture

SharePlex Change Data Capture can write change data to a Kafka broker, a file, or a relational database. The user controls what is written: either just the changed columns or full before-and-after images of the entire row. SharePlex can also write metadata, including the SCN (System Change Number) and the time the data was changed. The output can be an XML record, a JSON string, or a row inserted into a relational database.
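
To make this concrete, here is a minimal sketch of turning one such change record into a “business event”. The record layout below (field names like scn, time, before, and after) is an illustrative assumption, not the exact SharePlex schema; consult the SharePlex documentation for the actual format:

```python
import json

# Hypothetical JSON change record, loosely modeled on what a CDC tool
# like SharePlex might emit. Field names here are illustrative only.
sample_record = """
{
  "op": "upd",
  "table": "STORE.WIDGET_INVENTORY",
  "scn": "48291042",
  "time": "2023-06-01T14:32:07",
  "before": {"STORE_ID": 101, "IN_STOCK": 42},
  "after":  {"STORE_ID": 101, "IN_STOCK": 41}
}
"""

event = json.loads(sample_record)

# Derive a simple business event from the raw before/after images.
delta = event["after"]["IN_STOCK"] - event["before"]["IN_STOCK"]
print(f'{event["table"]} changed by {delta} at SCN {event["scn"]} ({event["time"]})')
```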

Kafka

Once the data is sent to Kafka, it can be consumed by any Kafka consumer, using Kafka connectors or streaming services. If your ultimate destination is anywhere in the Azure ecosystem, Microsoft makes things even easier with Event Hubs for Kafka. Since Event Hubs can store the JSON strings persistently in an Azure Storage container, they can be accessed directly by Azure Synapse, Azure analytics services, or any other Azure subsystem. The events SharePlex writes can also be sent to a Confluent Kafka broker, either on-premises or in the cloud.
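
As a sketch of the consumer side, the snippet below reads change records from a Kafka topic using the confluent-kafka Python client. The broker address, topic name, and group id are assumptions for illustration; an Event Hubs endpoint would additionally require SASL_SSL settings per Microsoft’s “Event Hubs for Kafka” documentation:

```python
import json
from confluent_kafka import Consumer

# Connection details below are illustrative assumptions. For Azure Event
# Hubs, point bootstrap.servers at the Event Hubs Kafka endpoint (port 9093)
# and add the required SASL_SSL credentials.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "shareplex-demo",
    "auto.offset.reset": "earliest",
})

consumer.subscribe(["shareplex_changes"])  # hypothetical topic name

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # Each message value is one JSON change record.
        event = json.loads(msg.value().decode("utf-8"))
        print(event)
finally:
    consumer.close()
```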

Summary

I hope this sparks some ideas about how SharePlex and Kafka can help you make sense of the ever-increasing volume of data coming into your systems.

You can learn the specifics of setting up SharePlex to move data to Azure Event Hubs in this blog, or, if you're using Confluent, view this blog.
