# events-v2-migration-script

**Repository Path**: mirrors_DataDog/events-v2-migration-script

## Basic Information

- **Project Name**: events-v2-migration-script
- **License**: Apache-2.0
- **Default Branch**: main
- **Created**: 2021-10-22
- **Last Updated**: 2025-10-04

## README

# Events V2 Migration Script

## Overview

This repository contains a script for migrating to events v2 in Datadog. It migrates monitors and dashboard widgets (event stream, event timeline, and event overlays).

```bash
virtualenv venv3 -p python3
source venv3/bin/activate
pip install -r requirements.txt

export DD_API_KEY=xxx
export DD_APP_KEY=xxx
export DD_HOST=app.datadoghq.com
```

## Monitors

### Plan: creates an execution plan for migrating event monitors

For every event monitor, it displays the query of the event-v2 monitor that the apply command will create, along with all references to the event-v1 monitor, including SLOs and downtimes.
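The query rewrite visible in the plan output can be sketched as follows. This is an illustration of the patterns shown in the examples (drop the implicit `priority:all` filter, rename the `sources:` facet to `source:`, switch to double quotes), not the actual implementation in `main.py`; the `SOURCE_RENAMES` mapping is an assumption inferred from the sample output.

```python
import re

# Assumed source-name mapping, inferred from the "sources:rds" ->
# "source:amazon_rds" example further down; unknown names pass through.
SOURCE_RENAMES = {"rds": "amazon_rds"}

def migrate_event_query(v1_query: str) -> str:
    """Rewrite an event-v1 monitor query into the event-v2 form."""
    # events('<search>')<suffix> -> extract the single-quoted search string
    match = re.match(r"events\('(.*?)'\)(.*)", v1_query)
    if not match:
        return v1_query
    search, suffix = match.groups()
    # events v2 has no priority facet, so the implicit filter is dropped
    search = search.replace("priority:all", "").strip()
    # sources:<name> -> source:<mapped-name>
    search = re.sub(
        r"sources:(\S+)",
        lambda m: "source:" + SOURCE_RENAMES.get(m.group(1), m.group(1)),
        search,
    )
    # double-quote the search string and the rollup/last arguments
    return f'events("{search}"){suffix}'.replace("'", '"')

print(migrate_event_query(
    "events('priority:all Service down').rollup('count').last('5m') >= 1"
))
# -> events("Service down").rollup("count").last("5m") >= 1
```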
```
python main.py monitors plan

************************************************************
Service down: https://app.datadoghq.com/monitors/12259383
event alert: events('priority:all Service down').rollup('count').last('5m') >= 1
event-v2 alert: events("Service down").rollup("count").last("5m") >= 1
************************************************************
OOM: https://app.datadoghq.com/monitors/20177025
event alert: events('priority:all sources:docker OOM').rollup('count').last('5m') >= 1
event-v2 alert: events("source:docker OOM").rollup("count").last("5m") >= 1
```

Monitors can be filtered by monitor id:

```
python main.py monitors plan --monitor_id 12259383

************************************************************
Service down: https://app.datadoghq.com/monitors/12259383
event alert: events('priority:all Service down').rollup('count').last('5m') >= 1
event-v2 alert: events("Service down").rollup("count").last("5m") >= 1
```

### Apply: creates event-v2 monitors with downtimes

The apply command executes the actions proposed in the plan. It creates the event-v2 monitor with a downtime (to avoid sending notifications from both monitors) and gives you its ID.

```
python main.py monitors apply --monitor_id 12259383

Service down: https://app.datadoghq.com/monitors/12259383
event alert: events('priority:all Service down').rollup('count').last('5m') >= 1
event-v2 alert: events("Service down").rollup("count").last("5m") >= 1
Apply change?
y/n y
event-v1 monitor (https://app.datadoghq.com/monitors/12259383) -> event-v2 monitor (https://app.datadoghq.com/monitors/47325625)
A downtime has been set for the new monitor: https://app.datadoghq.com/monitors#/downtime?id=1465323109
Don't forget to migrate the references (downtimes, slos) manually, or automatically with:
"python main.py monitors apply --monitor_id_v1 190739 --monitor_id_v2 47325937"
```

### Apply: migrates event-v1 monitor dependencies

The following command will:

- Update SLOs if monitor-based SLOs are used with the event-v1 monitor.
- Delete the downtime previously created when creating the event-v2 monitor.
- Update downtimes if downtimes are attached to the event-v1 monitor.

```
python main.py monitors apply --monitor_id_v1 190739 --monitor_id_v2 47325937
```

Then, delete the event-v1 monitor manually.

## Dashboards

### Plan: creates an execution plan for migrating event widgets

For all dashboards, it displays the old and new event query for each event widget that will be migrated by the apply command.
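What the plan step does for one dashboard can be sketched as a walk over the widget tree. The widget types and field names below follow the Datadog dashboard JSON layout (group widgets nest their children under `definition["widgets"]`); treat them as assumptions about the script's behavior, not its actual implementation.

```python
# Event widget types mentioned in the overview; overlays are handled
# separately in the real script, so this sketch only covers these two.
EVENT_WIDGET_TYPES = {"event_stream", "event_timeline"}

def iter_event_widgets(widgets):
    """Yield every event widget in a dashboard, recursing into groups."""
    for widget in widgets:
        definition = widget.get("definition", {})
        if definition.get("type") in EVENT_WIDGET_TYPES:
            yield widget
        # group widgets nest their children under definition["widgets"]
        yield from iter_event_widgets(definition.get("widgets", []))

dashboard = {
    "widgets": [
        {"definition": {"type": "event_stream",
                        "title": "ASN Input",
                        "query": "sources:docker OOM"}},
        {"definition": {"type": "group", "widgets": [
            {"definition": {"type": "event_timeline",
                            "query": "sources:rds"}},
        ]}},
    ]
}

for w in iter_event_widgets(dashboard["widgets"]):
    print(w["definition"].get("title", ""), w["definition"]["query"])
```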
```
python main.py dashboards plan

Found 23979 dashboards
dashboard id: k2t-5qt-yi2
dashboard title: Service OOM
dashboard URL: https://app.datadoghq.com/dashboard/k2t-5qt-yi2
----------------------------------------------------------------
widget_title: ASN Input
old query: "sources:docker OOM"
new query: "source:docker OOM"
```

Dashboards can be filtered by dashboard id:

```
python main.py dashboards plan --dashboard_id k2t-5qt-yi2

Found 1 dashboards
dashboard id: k2t-5qt-yi2
dashboard title: Service OOM
dashboard URL: https://app.datadoghq.com/dashboard/k2t-5qt-yi2
----------------------------------------------------------------
widget_title: ASN Input
old query: "sources:docker OOM"
new query: "source:docker OOM"
================================================================
```

### Apply: migrates event widgets

```
python main.py dashboards apply --dashboard_id fqc-p4x-q6c

Found 1 dashboards
dashboard id: fqc-p4x-q6c
dashboard title: Michael's Dashboard Wed, May 26, 4:39:52 pm
dashboard URL: https://app.datadoghq.com/dashboard/fqc-p4x-q6c
----------------------------------------------------------------
widget_title:
old query: "sources:rds"
new query: "source:amazon_rds"
widget_title:
old query: "sources:rds"
new query: "source:amazon_rds"
================================================================
Apply change? y/n/q
```

## Terraform

For Terraform users, we strongly suggest using [terraformer](https://docs.datadoghq.com/integrations/faq/how-to-import-datadog-resources-into-terraform/), which allows you to import a resource as both state and HCL configuration.
Given the following CSV file, where:

- `event_monitor_id`: event-v1 monitor id
- `event_v2_monitor_id`: event-v2 monitor id
- `downtimes`: downtimes configured for the event-v1 monitor

```csv
event_monitor_id,event_v2_monitor_id,downtimes
40363636,46883411,
43008951,46846940,
43008957,46846852,1427306316
43123218,46847531,1385726181
43123221,46847490,1387831992
43212957,46847427,1387844809
43212961,46847080,1396441435
```

the script below generates the HCL configuration, imports the state, and applies the same migration:

```shell
#!/bin/bash
export DD_API_KEY=xxx
export DD_APP_KEY=xxx

file_name=$1

# Build colon-separated lists of event-v2 monitor ids and downtime ids,
# skipping the CSV header row
event_v2=($(awk -F',' 'NR > 1 {print $2}' "$file_name"))
event_v2=$(printf "%s:" "${event_v2[@]}")
downtimes_to_migrate=($(awk -F',' 'NR > 1 {print $3}' "$file_name"))
downtimes_to_migrate=$(printf "%s:" "${downtimes_to_migrate[@]}")

# Import the monitors and downtimes as both state and HCL configuration
terraformer import datadog --resources=downtime,monitor \
  --filter=monitor=$event_v2,downtime=$downtimes_to_migrate \
  --api-key $DD_API_KEY --app-key $DD_APP_KEY

# Clean up the generated configuration (BSD sed syntax)
sed -i '' -E "/silenced = \{/{N;N;d;}" generated/datadog/monitor/monitor.tf
sed -i '' '/monitor_id.*=.*"0"/d' generated/datadog/downtime/downtime.tf
sed -i '' '/active.*=.*/d' generated/datadog/downtime/downtime.tf
sed -i '' '/disabled.*=.*/d' generated/datadog/downtime/downtime.tf

# Point the imported downtimes at the event-v2 monitors
tail -n +2 "$file_name" | while IFS=',' read -r monitor_event_v1 monitor_event_v2 _; do
  sed -i '' "s/$monitor_event_v1/$monitor_event_v2/g" generated/datadog/downtime/downtime.tf
done
```
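The id-gathering part of the shell script above can equivalently be sketched in Python. This version skips the header row and empty downtime cells, and returns the value passed to terraformer's `--filter` flag; `build_terraformer_filter` is a hypothetical helper, not part of the repository.

```python
import csv

def build_terraformer_filter(lines):
    """Read the migration CSV and build terraformer's --filter value."""
    monitors, downtimes = [], []
    for row in csv.DictReader(lines):
        monitors.append(row["event_v2_monitor_id"])
        if row["downtimes"]:  # skip monitors without a downtime
            downtimes.append(row["downtimes"])
    # terraformer separates multiple resource ids with colons
    return "monitor={},downtime={}".format(":".join(monitors), ":".join(downtimes))

sample = [
    "event_monitor_id,event_v2_monitor_id,downtimes",
    "40363636,46883411,",
    "43008957,46846852,1427306316",
]
print(build_terraformer_filter(sample))
# -> monitor=46883411:46846852,downtime=1427306316
```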