Services in distributed systems depend on each other for data. Often a single piece of data is used across multiple services and can be time-sensitive. We'll look at two ways this data can be distributed.
Messages - Updates that are directly sent from one service to another.
Events - Updates that occur at a specific time and are not tied to any specific recipient or client.
Both of these ways of distributing information have the same net effect: the data is processed, manipulated, or stored. But in which situations should each be applied? To understand this, let's look at the distribution patterns in more detail.
What are messages?
A message is used when we want to pass a fixed data packet from one service to another. Messages are ephemeral: once consumed, they are dropped from their queue, and any further processing is left to the consuming service. Many queues offer a fan-out pattern that routes copies of a message to multiple consumers.
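To make the fan-out pattern and the ephemeral nature of messages concrete, here is a minimal in-memory sketch. The `FanOutQueue` class and its method names are illustrative assumptions, not the API of any real broker:

```python
from collections import deque

class FanOutQueue:
    """Minimal in-memory broker: each published message is copied
    into every subscriber's queue (fan-out)."""

    def __init__(self):
        self.consumer_queues = {}  # consumer name -> deque of messages

    def subscribe(self, consumer: str) -> None:
        self.consumer_queues[consumer] = deque()

    def publish(self, message: dict) -> None:
        # Fan-out: every consumer gets its own copy of the message.
        for q in self.consumer_queues.values():
            q.append(dict(message))

    def consume(self, consumer: str):
        # Messages are ephemeral: consuming removes them from the queue.
        q = self.consumer_queues[consumer]
        return q.popleft() if q else None

broker = FanOutQueue()
broker.subscribe("inventory")
broker.subscribe("billing")
broker.publish({"order_id": 42, "sku": "A-100"})
print(broker.consume("inventory"))  # {'order_id': 42, 'sku': 'A-100'}
print(broker.consume("inventory"))  # None -- the copy was dropped once consumed
```

Note that "billing" still holds its own copy: the fan-out gives each consumer an independent queue, but within each queue a message can be consumed only once.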
In more traditional message-driven systems, data and system updates are sent as messages directly from the producer to the consumer. The producer waits for acknowledgement that the consumer received the message before it continues with its processes, which can block the producer and degrade the efficiency of the system. In systems where data is shared across multiple consumers, we also introduce complexity into the producing service, as it now needs to be aware of its clients.
What are events?
An event contains the data of an action that occurred at a specific time, for example the purchase of a product in an online store. The purchase event is defined by product, payment, and delivery as well as the time that the purchase occurred. The purchase, or event, happened in the purchasing service of the system, but has an impact on many other components, such as inventory, payment processing, and shipping.
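A purchase event like the one described above can be modeled as an immutable record. This is a hypothetical sketch; the field names are assumptions for illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class PurchaseEvent:
    """One immutable fact: what was bought, how it was paid for,
    where it ships, and when the purchase occurred."""
    product_id: str
    payment_method: str
    delivery_address: str
    occurred_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

event = PurchaseEvent("sku-123", "credit_card", "221B Baker St")
```

Freezing the dataclass reflects the nature of an event: it records something that already happened, so downstream components read it but never mutate it.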
In event-driven architecture, all actions in the system are defined and packaged as events to precisely identify the individual actions and how they are processed throughout the system.
Instead of processing updates one after the other in a serial fashion and limiting the system's performance to that of its slowest component, event-driven architecture allows the components to process events at their own pace. This architecture helps developers to build systems that are fast and scalable.
Some components of the distributed system produce events as a result of a specific action that is done in that component. These components are referred to as “producers”. When producers send these events and the events are read or stored in sequence, these events represent a replayable log of changes in the system, also called a “stream”.
A single event includes information that is required by one or many other components in the system, also known as "consumers", to effect additional changes. The consumers can store, process, or react to these events. Consumers often also run processes that produce events for other components in the system, so being a producer and being a consumer are not mutually exclusive.
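The replayable log and the per-consumer pace described above can be sketched as an append-only list with one read offset per consumer. The `EventLog` class is an illustrative assumption, not a real library:

```python
class EventLog:
    """Append-only event log. Each consumer tracks its own offset,
    so the stream is replayable and each consumer reads at its own pace."""

    def __init__(self):
        self._events = []   # the replayable log of changes
        self._offsets = {}  # consumer name -> next index to read

    def append(self, event: dict) -> None:
        self._events.append(event)

    def read(self, consumer: str) -> list:
        # Return everything this consumer has not yet seen.
        start = self._offsets.get(consumer, 0)
        batch = self._events[start:]
        self._offsets[consumer] = len(self._events)
        return batch

log = EventLog()
log.append({"type": "purchase", "product": "sku-123"})
log.append({"type": "purchase", "product": "sku-456"})

print(log.read("shipping"))  # shipping sees both purchase events
log.append({"type": "refund", "product": "sku-123"})
print(log.read("shipping"))  # only the new refund event
print(log.read("billing"))   # billing starts from the beginning: all three
```

Because events are stored rather than dropped on consumption, a new consumer (or a restarted one) can replay the whole history independently of every other consumer.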
| Message-driven | Event-driven |
| --- | --- |
| If similar information is required by multiple recipients, the sender must send a message designed for each recipient independently. | The producer sends individual events that are designed for consumption by multiple consumers. |
| If the recipient is delayed in acknowledging receipt, the producer can't complete its process until it receives the acknowledgement. | The producer sends the events to an event processing system that can acknowledge receipt and guarantee delivery to the consumers. |
| If there is a break in the connection between a producer and a recipient, the producer doesn't know if the message was processed or if it needs to resend the message. | The event processing system can track the communication between producers and consumers in the event of a broken connection. |
| In clustered deployments, each producer node has to send messages to each recipient node. | Each producer node sends events to the event processing system and each consumer node retrieves the events from that same system. |
Turning data into product
Much of what event-processing systems do can also be achieved with simple pub/sub messaging paired with a database that collects events and makes them available via an API. But event processing unlocks much more value.
When we decouple our systems and set up event-processing services to collect and route events, we can transform an event stream from a series of individual actions into a warehouse of information. This allows us to share data in real time with external clients, build reporting products from streams, and create a richer data environment.
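As a small illustration of turning a stream into a reporting product, the stream can be aggregated into a summary that an API could serve. The event shapes and the `build_sales_report` helper are hypothetical:

```python
from collections import Counter

def build_sales_report(events) -> Counter:
    # Aggregate the raw stream into a data product:
    # purchase counts per product.
    return Counter(
        e["product"] for e in events if e["type"] == "purchase"
    )

stream = [
    {"type": "purchase", "product": "sku-123"},
    {"type": "purchase", "product": "sku-123"},
    {"type": "view", "product": "sku-456"},
]
print(build_sales_report(stream))  # Counter({'sku-123': 2})
```

The same stream can feed many such aggregations at once, which is what makes a replayable event stream more valuable than the individual messages it is built from.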
Events allow us to turn our streams into data products
Every event that occurs in your system can be analyzed, mined, and transformed to give you insight into your business and the power to build new capabilities.