Alerting
Overview
The purpose of alerting is to bring attention to significant events and issues that arise during the execution of a pipeline by sending messages via email, Slack, etc. To simplify incorporating alerting, pre-constructed patterns have been developed and can be included in a Solution Baseline project, so only a few steps are necessary to incorporate the generated alerting code. This page explains the generated components that are included when alerting is enabled and describes where to modify and customize them to suit a specific implementation.
What Gets Generated
Alerting is enabled by default for projects that have a pre-fab data delivery pipeline.
Note: Alerting is currently only available for Spark Data Delivery Pipelines; it will be available for PySpark Data Delivery and Machine Learning Pipelines in a future version.
Default Method for Sending Alerts
When alerting is enabled, a few methods (outlined below) are generated in the base class of each step. These methods are called automatically when a step completes, whether successfully or with an exception, to send an alert. All of these methods have default logic but can be customized by overriding them in the step implementation class.
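The override pattern can be sketched as follows. The class and method names here are hypothetical, illustrative stand-ins, not the actual generated aiSSEMBLE API; they only show how a step implementation class replaces the default alert logic inherited from its generated base class.

```java
// Hypothetical stand-in for a generated step base class. The real base class
// and its alert-message methods are produced by the Solution Baseline; names
// here are illustrative only.
abstract class GeneratedStepBase {
    // Default alert message logic provided by the base class.
    public String getSuccessAlertMessage() {
        return "Step completed successfully";
    }
}

// A step implementation class overrides the default to customize the alert.
public class IngestStep extends GeneratedStepBase {
    @Override
    public String getSuccessAlertMessage() {
        return "Ingest step finished loading records";
    }

    public static void main(String[] args) {
        System.out.println(new IngestStep().getSuccessAlertMessage());
    }
}
```

The same pattern applies to the failure-path method: override it in the step implementation class to tailor the message sent when the step ends with an exception.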
Configuring Your Alerting Service
The Solution Baseline provides several integration options for alerting purposes.
Alerting with Slack
The default alerting implementation is Slack. To use Slack Alerting, follow the steps below:
- Add the aiSSEMBLE™ Slack alerting dependency `extensions-alerting-slack` to the pipeline POM:

```xml
<dependencies>
    ...
    <dependency>
        <groupId>com.boozallen.aissemble</groupId>
        <artifactId>extensions-alerting-slack</artifactId>
    </dependency>
    ...
</dependencies>
```
- Add the SlackConsumer bean to the pipeline within the PipelinesCdiContext.java file:

```java
public List<Class<?>> getCdiClasses() {
    // Add any custom CDI classes here
    ...
    customBeans.add(SlackConsumer.class);
    return customBeans;
}
```
- Create the slack-integration.properties file at the following path:

```
<project>-docker/<project>-spark-worker-docker/src/main/resources/krausening/base/slack-integration.properties
```
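As a sketch, this file would carry the Slack connection settings for the SlackConsumer. The property names below are illustrative placeholders only, not necessarily the keys the SlackConsumer actually reads; verify the actual key names against the SlackConsumer implementation or the Solution Baseline documentation.

```properties
# Hypothetical example values; confirm the real property names before use.
slack.webhook.url=https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXX
slack.channel=#pipeline-alerts
```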
Messaging Integration
The default alerting implementation can be extended to publish the alerts to a Messaging topic. Adding a
microprofile-config.properties file with the following configurations will enable the Messaging integration for the
default Alert Producer:
```properties
kafka.bootstrap.servers=kafka-cluster:9093 (1)
mp.messaging.outgoing.alerts.connector=smallrye-kafka
mp.messaging.outgoing.alerts.topic=kafka-alert-topic-name (2)
mp.messaging.outgoing.alerts.key.serializer=org.apache.kafka.common.serialization.StringSerializer
mp.messaging.outgoing.alerts.value.serializer=org.apache.kafka.common.serialization.StringSerializer
```
| 1 | The hostname and port of the Messaging server to connect to. |
| 2 | The name of the Messaging topic to publish the alerts to. |
Please see the SmallRye documentation on the Kafka connector for more configuration details.
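For reference, a downstream service could subscribe to the same alerts with the mirror-image incoming configuration, following the standard MicroProfile Reactive Messaging property scheme. The channel name `alerts-in` is illustrative; any channel name works as long as the topic matches the one the producer publishes to.

```properties
# Illustrative consumer-side configuration for a service reading the alerts topic.
mp.messaging.incoming.alerts-in.connector=smallrye-kafka
mp.messaging.incoming.alerts-in.topic=kafka-alert-topic-name
mp.messaging.incoming.alerts-in.key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
mp.messaging.incoming.alerts-in.value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```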