
· 6 min read
Hasan Gural

Welcome back, Friend!

In the previous article, we covered setting up the necessary resources for the AI Agent for Azure Policy Governance Assistant. Now, we will proceed with the final steps to complete and test the AI Agent:

  • Create the AI Agent using the Azure AI Foundry UI or PowerShell.
  • Develop and provide an example KQL query for retrieving compliance data (a rough sketch follows after this list).
  • Execute the KQL query and upload the results to the AI Agent’s Knowledge.
  • Test the AI Agent to ensure it accurately responds to policy compliance queries.
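
For a rough idea of what the KQL step could look like, here is a minimal sketch that pulls non-compliant policy states through Azure Resource Graph with the Az.ResourceGraph module and writes them to a JSON file ready for upload. The query shape and the output file name are illustrative assumptions, not necessarily what this article itself uses.

```powershell
# Minimal sketch: pull non-compliant policy states via Azure Resource Graph
# (illustrative query and output path; requires the Az.ResourceGraph module)
Import-Module Az.ResourceGraph

$kql = @"
policyresources
| where type == 'microsoft.policyinsights/policystates'
| where properties.complianceState == 'NonCompliant'
| project resourceId       = tostring(properties.resourceId),
          policyAssignment = tostring(properties.policyAssignmentName),
          policyDefinition = tostring(properties.policyDefinitionName)
"@

# Run the query and save the results so they can be uploaded to the agent's Knowledge
$results = Search-AzGraph -Query $kql -First 1000
$results | ConvertTo-Json -Depth 5 | Set-Content -Path './policy-compliance.json'
```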

· 5 min read
Hasan Gural

Hello Friend,

Today, I’m going to build a quick-win solution called the AI Agent for Azure Policy Governance Assistant. This AI Agent is meant to help you rapidly identify non-compliant policies and improve governance across your Azure subscriptions or resource groups. Imagine being able to ask the AI Agent questions like, "Which resources are non-compliant?" or "Which exemptions are about to expire within my subscription scope?" In this article, I’ll walk you through setting up the necessary resources using PowerShell scripts, and then show you how to integrate them with an AI Agent for actionable governance insights.

Overview

· 5 min read
Hasan Gural

Bicep parameter files allow you to define values in an individual file that are then passed to your main Bicep template file. The parameter file exposes values that may change across subscriptions, environments, or regions. Leveraging a parameter file drives consistency in your IaC deployments while providing flexibility. For example, an organization can use these files to right-size nonproduction environments to save costs while maintaining the same core infrastructure across all deployments.

Reading Bicep Parameter Files with PowerShell

In addition, these parameter files streamline the CI/CD deployment process. Because each parameter file is kept under source control and passed into the appropriate automated deployment steps, deployments stay consistent and repeatable. In this article, we will explore how to create, read, and use a Bicep parameters file via PowerShell.
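
As a quick sketch of the reading part, one option is to let the Bicep CLI compile the `.bicepparam` file into a classic JSON parameters document and then parse that from PowerShell. This assumes a recent Bicep CLI (with the build-params command) is on your PATH, and the file names below are placeholders.

```powershell
# Minimal sketch: read values out of a .bicepparam file with PowerShell
# (assumes a recent Bicep CLI is installed; file names are placeholders)
$paramFile = './main.dev.bicepparam'

# Compile the Bicep parameters file into a classic JSON parameters document
bicep build-params $paramFile --outfile ./main.dev.parameters.json

# Parse the JSON and list each parameter name and value
$params = Get-Content -Path ./main.dev.parameters.json -Raw | ConvertFrom-Json
$params.parameters.PSObject.Properties |
    ForEach-Object { '{0} = {1}' -f $_.Name, ($_.Value.value | ConvertTo-Json -Compress) }
```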

· 6 min read
Hasan Gural

In Part 1, we covered the benefits, considerations, and real-world use cases for Azure VNet Service Endpoints. Now, let's take a closer look at how to track and manage service endpoint usage in your network, focusing on writing complex KQL queries to accurately identify service endpoint configurations. Once identified, you can start assessing your environment to understand the impact and optimize your network setup.

⚡ Querying Service Endpoint Usage with KQL

KQL is one of the most effective tools in Azure for querying large datasets, especially within Azure Resource Graph. It allows you to monitor and identify where service endpoints are being used in your environment and provides visibility into traffic patterns. I have tons of examples in my blog about KQL and Azure Resource Graph. You can check them out for more details.
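
To give a flavour of the kind of query we will build on, here is a minimal Resource Graph sketch that surfaces subnets with service endpoints configured. It is an illustrative query, not necessarily the exact one used later in this article.

```powershell
# Minimal sketch: find subnets that have service endpoints configured
# (illustrative query; run with the Az.ResourceGraph module)
$kql = @"
resources
| where type == 'microsoft.network/virtualnetworks'
| mv-expand subnet = properties.subnets
| where array_length(subnet.properties.serviceEndpoints) > 0
| project vnetName = name,
          subnetName = tostring(subnet.name),
          serviceEndpoints = subnet.properties.serviceEndpoints,
          resourceGroup,
          subscriptionId
"@

Search-AzGraph -Query $kql | Format-Table -AutoSize
```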

· 4 min read
Hasan Gural

In this two-part article series, I’ll explore a common Azure scenario involving Virtual Network (VNet) Service Endpoints, specifically in the context of Azure Storage. Microsoft already provides extensive documentation on this topic, so you won’t see much of that detail repeated here; it’s all available in the MS Docs. Instead, this article will focus on practical considerations, real-world use cases, and the network impacts when using service endpoints in your environment.

📚 Understanding Azure Service Endpoints

Service Endpoints offer secure, private connectivity between virtual networks and Azure PaaS services, such as Azure Storage, by utilizing the Azure backbone network. Microsoft highlights that service endpoints help optimize both performance and security by keeping traffic within Azure’s infrastructure, thus eliminating the need for routing traffic over the public internet.
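
For context, turning a service endpoint on is a subnet-level setting. A minimal sketch with the Az.Network cmdlets could look like the following; the resource names and address prefix are placeholders for illustration only.

```powershell
# Minimal sketch: enable the Microsoft.Storage service endpoint on an existing subnet
# (resource names and address prefix are placeholders)
$vnet = Get-AzVirtualNetwork -ResourceGroupName 'rg-network' -Name 'vnet-hub'

Set-AzVirtualNetworkSubnetConfig -VirtualNetwork $vnet `
    -Name 'snet-app' `
    -AddressPrefix '10.0.1.0/24' `
    -ServiceEndpoint 'Microsoft.Storage'

# Persist the updated subnet configuration back to Azure
$vnet | Set-AzVirtualNetwork
```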

· 4 min read
Hasan Gural

In Part 1 of this series, we built a PowerShell script that automates the process of pulling a public Docker image from Docker Hub and pushing it to your Azure Container Registry (ACR). In this second part, we'll focus on integrating that script into a CI/CD pipeline using GitHub Actions. This logic will ensure that our image management process is fully automated and runs on a scheduled basis.

With GitHub Actions, you can schedule the execution of your script, monitor its output, and ensure that your ACR is always updated with the latest image—without any manual intervention. Let’s walk through how to configure your GitHub Actions workflow.

· 6 min read
Hasan Gural

Keeping your container images up to date is a critical part of managing modern deployments. In this article, I'll explain how you can automate the process of pulling a public Docker image from Docker Hub and pushing it to your Azure Container Registry (ACR) using a PowerShell script. This approach is especially useful for overcoming Docker Hub’s rate limits by storing the image in your ACR.

Automated Image Management: From Docker Hub to Azure Container Registry

This article is part of a two-part series where I demonstrate how to build an automated process that leverages a Service Principal with Federated Credentials and strict Role-Based Access Control (RBAC) assignments. With RBAC in place, only the necessary permissions are granted, ensuring that the Service Principal has the least privileges required for this operation. In this example, we'll use the Cloudflare image as our reference, but you can adapt the process for any public image.
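
As a rough sketch of the pull-and-push flow (not the exact script built in this series), the core of it can be as small as the snippet below. The registry and image names are placeholders, and it assumes Docker, the Az.ContainerRegistry module, and an already-authenticated Azure session, for example the federated Service Principal.

```powershell
# Minimal sketch: mirror a public Docker Hub image into ACR using the signed-in identity
# (registry and image names are placeholders; assumes Docker and Az.ContainerRegistry)
$registryName = 'myacr'
$sourceImage  = 'cloudflare/cloudflared:latest'   # placeholder public image
$targetImage  = "$registryName.azurecr.io/cloudflared:latest"

# docker login against ACR with the current Azure credentials (RBAC-scoped, no admin user)
Connect-AzContainerRegistry -Name $registryName

# Pull the public image, retag it for ACR, and push
docker pull $sourceImage
docker tag  $sourceImage $targetImage
docker push $targetImage
```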

· 5 min read
Hasan Gural

In our previous article, Identifying Workflows Inside Azure Logic Apps, we explored how to identify workflows running within Azure Logic Apps across different instances in an Azure tenant. That script allowed us to automate the process of retrieving workflow details such as names, helping with resource management and assessment. By leveraging PowerShell, we eliminated the need for manual tracking through the Azure portal and streamlined the reporting process. By the way, as of now there is no easy way to get this information from the Azure portal.

That approach was particularly useful for Logic Apps Standard, which can host multiple workflows within a single instance. By automating the data retrieval, we generated structured reports showing which workflows were running on which Logic Apps, making it easier to audit and optimize resources.
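
For Logic Apps Standard specifically, a minimal sketch of pulling the workflow list for a single instance could call the hostruntime management endpoint via Invoke-AzRestMethod. The API path, resource group, and site name below are assumptions for illustration, not the exact call from the original script, and the response shape may need a small tweak (for example unwrapping a `value` property).

```powershell
# Minimal sketch: list workflows hosted in one Logic App Standard instance
# (the hostruntime path, resource group, and site name are illustrative assumptions)
$subscriptionId = (Get-AzContext).Subscription.Id
$path = "/subscriptions/$subscriptionId/resourceGroups/rg-logic/providers/Microsoft.Web" +
        "/sites/logicapp-std-01/hostruntime/runtime/webhooks/workflow/api/management" +
        "/workflows?api-version=2018-11-01"

$response = Invoke-AzRestMethod -Path $path -Method GET

# Each item in the returned list carries a 'name' property identifying the workflow
($response.Content | ConvertFrom-Json) | Select-Object -ExpandProperty name
```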

· 5 min read
Hasan Gural

Hey friends,

In this post, we will dive into generating a report that tracks which workflows are running on which Logic App instances within your tenant. If you are using Logic Apps, especially Consumption or Standard-based ones, keeping track of these workflows across different instances can be a challenge. The good news is that, with the help of a PowerShell script I have developed, we can create a report that highlights which Logic App instances are running which workflows. This report can be crucial for troubleshooting, performance monitoring, and resource management.
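
As a minimal sketch of the Consumption side of that report (the full script in this post goes further and covers Standard instances too), an inventory of Logic App resources in the current subscription could look like this; the output path is a placeholder.

```powershell
# Minimal sketch: inventory Consumption Logic Apps (one workflow each) in the subscription
# (output path is a placeholder; the article's report script covers more detail)
Get-AzResource -ResourceType 'Microsoft.Logic/workflows' |
    Select-Object Name, ResourceGroupName, Location |
    Export-Csv -Path './logicapp-report.csv' -NoTypeInformation
```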

· One min read
Hasan Gural

Following an amazing session at Global Azure 2024 - Istanbul, I’m happy to share the recording of my talk, "Deployment stack with Bicep – Insights and Experiences". Whether you attended live or couldn’t make it, you can now watch the full session at your convenience.

GlobalBootCamp2024

🎥 Watch the Recording Here: