
AZ-204 - Develop Event-Based Solutions

Azure Event Grid

Azure Event Grid is a fully managed event routing service that enables event-driven architectures. It uses a publish-subscribe model to connect event sources to event handlers with near real-time delivery.

Core Concepts

Events

The smallest unit of information describing something that happened. Each event follows the CloudEvents 1.0 schema or Event Grid schema, containing fields like id, source, type, time, and data.
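An event using the CloudEvents 1.0 field names above can be sketched as a plain dictionary. This is a hypothetical blob-created event; the id, subscription path, and URLs are made-up placeholders:

```python
import json
from datetime import datetime, timezone

# Hypothetical blob-created event in the CloudEvents 1.0 schema.
# The id, source path, and data values are placeholders, not real resources.
event = {
    "specversion": "1.0",
    "id": "9aeb0fdf-c01e-0131-0922-9eb54906e209",
    "source": "/subscriptions/<sub-id>/resourceGroups/rg/providers/Microsoft.Storage/storageAccounts/demoacct",
    "type": "Microsoft.Storage.BlobCreated",
    "time": datetime.now(timezone.utc).isoformat(),
    "subject": "/blobServices/default/containers/images/blobs/photo.jpg",
    "data": {
        "contentLength": 2048,
        "url": "https://demoacct.blob.core.windows.net/images/photo.jpg",
    },
}

print(json.dumps(event, indent=2))
```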

Topics

An endpoint where events are sent. System topics are built-in for Azure services (e.g., Blob Storage, Resource Groups). Custom topics are user-defined for application events.

Event Subscriptions

A configuration that routes events from a topic to a handler. Supports filtering by event type or subject path to deliver only relevant events.

Event Handlers

The destination that processes events. Supported handlers include Azure Functions, Logic Apps, Event Hubs, Storage Queues, Service Bus, and Webhooks.

Event Filtering

Filter Type         | Description                  | Example
Event Type          | Filter by event type         | Microsoft.Storage.BlobCreated
Subject Begins With | Filter by subject prefix     | /blobServices/default/containers/images
Subject Ends With   | Filter by subject suffix     | .jpg
Advanced            | Filter on data field values  | data.contentLength > 1024
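The event-type, prefix, and suffix filters amount to simple string checks against the event envelope. A minimal sketch of that matching logic (the `matches` helper is hypothetical, using Event Grid schema field names):

```python
def matches(event, event_types=None, subject_begins_with=None, subject_ends_with=None):
    """Illustrative subscription filter: an event is delivered only if it
    passes every filter that is set (unset filters match everything)."""
    if event_types is not None and event["eventType"] not in event_types:
        return False
    subject = event.get("subject", "")
    if subject_begins_with is not None and not subject.startswith(subject_begins_with):
        return False
    if subject_ends_with is not None and not subject.endswith(subject_ends_with):
        return False
    return True

evt = {
    "eventType": "Microsoft.Storage.BlobCreated",
    "subject": "/blobServices/default/containers/images/blobs/cat.jpg",
}

print(matches(evt, subject_begins_with="/blobServices/default/containers/images"))  # True
print(matches(evt, subject_ends_with=".png"))                                       # False
```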

Retry & Dead-Lettering

Event Grid retries delivery using an exponential backoff schedule. Default retry: 30 attempts over 24 hours. Events that cannot be delivered are optionally sent to a dead-letter storage container for later analysis.
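To make the "30 attempts over 24 hours" shape concrete, here is a generic capped exponential backoff under the same constraints (delays double up to a one-hour cap, total wait bounded by 24 hours). This is an illustrative model, not Event Grid's actual published retry schedule:

```python
def backoff_schedule(base=10, cap=3600, max_total=24 * 3600):
    """Capped exponential backoff: delays (in seconds) double from `base`
    up to `cap`, stopping once the cumulative wait would exceed `max_total`."""
    delay, total, schedule = base, 0, []
    while total + delay <= max_total:
        schedule.append(delay)
        total += delay
        delay = min(delay * 2, cap)
    return schedule

s = backoff_schedule()
print(len(s), sum(s))  # 31 delays totalling just under 24 hours
```

Note how the bound on total wait, not the attempt count, is what terminates the schedule, mirroring the 24-hour delivery window.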

Azure Event Hubs

Azure Event Hubs is a big data streaming platform and event ingestion service capable of receiving and processing millions of events per second. It is designed for high-throughput, low-latency data streaming.

Architecture

Namespace

A management container for one or more event hubs. Provides DNS integration and access control. Comparable to a Kafka cluster.

Partitions

An ordered sequence of events within an event hub. Partitions enable parallel processing. The number of partitions is set at creation time (1-32 for Standard, up to 2000 for Premium/Dedicated).
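Partition-key routing can be pictured as a stable hash over the key, so all events with the same key land on the same partition and keep their order. This sketch is illustrative only; the service's actual hash function is internal:

```python
import hashlib

def partition_for(key: str, partition_count: int) -> int:
    """Map a partition key to a partition index via a stable hash
    (illustrative; not the real service's hashing algorithm)."""
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % partition_count

# Same key -> same partition, so per-device event order is preserved.
print(partition_for("device-42", 4) == partition_for("device-42", 4))  # True

# Different keys spread across the available partitions.
keys = [f"device-{i}" for i in range(8)]
print({k: partition_for(k, 4) for k in keys})
```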

Consumer Groups

A view (state, position, or offset) of the entire event hub. Each consumer group can read the stream independently. Default consumer group: $Default.
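The key point is that each consumer group checkpoints its own offset, so one group reading ahead never affects another. A toy sketch over a single partition (the `read` helper and group names other than $Default are hypothetical):

```python
# Two consumer groups read the same partition independently:
# each group keeps its own offset (checkpoint).
partition = ["e0", "e1", "e2", "e3", "e4"]
offsets = {"$Default": 0, "analytics": 0}

def read(group, count):
    """Return the next `count` events for `group` and advance only that group's offset."""
    start = offsets[group]
    batch = partition[start:start + count]
    offsets[group] = start + len(batch)
    return batch

read("$Default", 3)          # $Default is now at offset 3...
print(read("analytics", 2))  # ...but analytics still starts at 0 -> ['e0', 'e1']
```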

Throughput Units (TUs)

Pre-purchased capacity units that control throughput. 1 TU = 1 MB/s ingress and 2 MB/s egress. Auto-inflate can scale TUs automatically.
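With those per-TU figures, sizing is a simple ceiling calculation over each limit. The sketch below also applies a 1,000-ingress-events-per-second-per-TU limit; treat all three figures as assumptions to verify against current service quotas:

```python
import math

def required_tus(ingress_mb_s: float, egress_mb_s: float, ingress_events_s: int) -> int:
    """Smallest TU count covering all three assumed limits:
    1 TU = 1 MB/s ingress, 2 MB/s egress, 1000 ingress events/s."""
    return max(
        math.ceil(ingress_mb_s / 1.0),
        math.ceil(egress_mb_s / 2.0),
        math.ceil(ingress_events_s / 1000),
    )

# 5 MB/s in, 6 MB/s out, 4000 events/s -> ingress bandwidth dominates.
print(required_tus(5, 6, 4000))  # 5
```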

Event Hubs Capture

Automatically delivers streaming data to Azure Blob Storage or Azure Data Lake Storage in Avro format, without additional code. Useful for data archival and batch processing.

Event Grid vs Event Hubs

Feature    | Event Grid                       | Event Hubs
Pattern    | Reactive event routing (pub/sub) | High-throughput event streaming
Throughput | Per-event, reactive              | Millions of events per second
Retention  | No retention (push delivery)     | 1-90 days (configurable)
Use Case   | State changes, notifications     | Telemetry, logging, analytics pipelines
Protocol   | HTTP push                        | AMQP, Kafka, HTTPS

Key Terms

Term           | Definition
Event Grid     | A fully managed event routing service using pub/sub for near real-time event delivery to handlers.
System Topic   | A built-in Event Grid topic for Azure service events (e.g., Blob Storage, Resource Group changes).
Custom Topic   | A user-defined Event Grid topic for application-specific events.
Event Hubs     | A big data streaming platform for ingesting millions of events per second with low latency.
Partition      | An ordered sequence of events in Event Hubs that enables parallel processing by consumers.
Consumer Group | An independent view of the event stream, allowing multiple readers to process events at their own pace.
Capture        | An Event Hubs feature that automatically archives streaming data to Blob Storage or Data Lake in Avro format.
Exam Tips:
  • Event Grid = reactive event routing (state changes, notifications). Event Hubs = high-throughput streaming (telemetry, analytics).
  • Event Grid supports system topics (Azure services) and custom topics (your apps).
  • Event Grid retries up to 30 times over 24 hours by default. Failed events go to dead-letter storage.
  • Event Hubs partitions enable parallelism. More partitions = higher throughput.
  • Consumer groups allow multiple independent readers of the same event stream.
  • Event Hubs Capture stores data in Avro format -- know this for the exam.
  • The $Default consumer group exists by default in every event hub.

Practice Questions

Q1. A developer needs to react to blob creation events in Azure Storage and trigger an Azure Function. Which service should they use?

  • Azure Event Hubs
  • Azure Event Grid
  • Azure Service Bus
  • Azure Queue Storage

Answer: B (Azure Event Grid)

Event Grid is designed for reactive event routing. It has a built-in system topic for Azure Storage that publishes BlobCreated events, which can trigger an Azure Function via an event subscription.

Q2. What format does Event Hubs Capture use to store archived events?

  • JSON
  • Parquet
  • Avro
  • CSV

Answer: C (Avro)

Event Hubs Capture automatically archives events in Apache Avro format to Azure Blob Storage or Azure Data Lake Storage.

Q3. What is the purpose of partitions in Azure Event Hubs?

  • Encrypt events at rest
  • Enable parallel processing of events
  • Route events to different topics
  • Compress event payloads

Answer: B (Enable parallel processing of events)

Partitions are ordered sequences of events that enable parallel processing. Multiple consumers can read different partitions simultaneously, increasing throughput.

Q4. Which Event Grid filter allows you to receive only events for .png files in a storage container?

  • Event Type filter
  • Subject Begins With filter
  • Subject Ends With filter
  • Advanced filter

Answer: C (Subject Ends With filter)

The Subject Ends With filter matches the suffix of the event subject. Setting it to ".png" would deliver only events for PNG files.

Q5. An application needs to process IoT sensor data at a rate of 500,000 events per second. Which service is best suited?

  • Azure Event Grid
  • Azure Event Hubs
  • Azure Queue Storage
  • Azure Logic Apps

Answer: B (Azure Event Hubs)

Azure Event Hubs is a big data streaming platform designed for high-throughput ingestion of millions of events per second, making it ideal for IoT telemetry scenarios.
