88 posts tagged with "PowerShell"
Building an AI Agent for Azure Policy Assistant - Part 2
Welcome back, Friend!
In the previous article, we covered setting up the necessary resources for the AI Agent for Azure Policy Governance Assistant. Now, we will proceed with the final steps to complete and test the AI Agent:
- Create the AI Agent using the Azure AI Foundry UI or PowerShell.
- Develop and provide an example KQL query for retrieving compliance data.
- Execute the KQL query and upload the results to the AI Agent’s Knowledge (a sketch of these steps follows the list).
- Test the AI Agent to ensure it accurately responds to policy compliance queries.
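As a taste of the middle steps, here is a minimal PowerShell sketch, assuming the Az.ResourceGraph module and an authenticated Az session; the output file name is illustrative. It pulls non-compliant policy states from Azure Resource Graph and saves them as JSON, ready for upload:

# Minimal sketch: query non-compliant policy states via Azure Resource Graph
# and save them for upload to the agent's knowledge. Requires Az.ResourceGraph
# and an authenticated Az session; the output path is illustrative.
$query = @"
policyresources
| where type == 'microsoft.policyinsights/policystates'
| where properties.complianceState == 'NonCompliant'
| project resourceId = tostring(properties.resourceId),
          policyAssignment = tostring(properties.policyAssignmentName),
          policyDefinition = tostring(properties.policyDefinitionName)
"@

$results = Search-AzGraph -Query $query -First 1000
$results | ConvertTo-Json -Depth 5 | Out-File -FilePath ./policy-compliance.json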
Building an AI Agent for Azure Policy Assistant - Part 1
Hello Friend,
Today, I’m going to build a quick-win solution called the AI Agent for Azure Policy Governance Assistant. This AI Agent is meant to help you rapidly identify non-compliant policies and improve governance across your Azure subscriptions or resource groups. Imagine being able to ask the AI Agent questions like, "Which resources are non-compliant?" or "Which exemptions are about to expire within my subscription scope?" In this article, I’ll walk you through setting up the necessary resources using PowerShell scripts, and then show you how to integrate them with an AI Agent for actionable governance insights.
Reading Bicep parameter files with PowerShell
Bicep parameter files allow you to define values in an individual file that are then passed to your main Bicep template file. A parameter file exposes the values that vary across subscriptions, environments, or regions. Leveraging a parameter file drives consistency in your IaC deployments while providing flexibility. For example, an organization can use these files to right-size nonproduction environments to save costs while maintaining the same core infrastructure across all deployments.
In addition, these parameter files streamline the CI/CD deployment process. Since each parameter file is under source control and passed into the appropriate automated deployment steps, they ensure a consistent and repeatable deployment experience. In this article, we will explore how to create, read, and use a Bicep parameters file via PowerShell.
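To give a flavor of the approach, here is a minimal sketch, assuming the Bicep CLI is installed and a main.bicepparam file exists (both names are illustrative): compile the .bicepparam file to JSON, then walk the parameters with ConvertFrom-Json:

# Compile the .bicepparam file to a JSON parameters file (Bicep CLI required).
bicep build-params ./main.bicepparam --outfile ./main.parameters.json

# Read the compiled file and list each parameter name and value.
$parameters = (Get-Content -Path ./main.parameters.json -Raw | ConvertFrom-Json).parameters
foreach ($name in $parameters.PSObject.Properties.Name) {
    '{0} = {1}' -f $name, $parameters.$name.value
}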
Understanding Service Endpoint Usage for Storage Accounts - Part 2
In Part 1, we covered the benefits, considerations, and real-world use cases for Azure VNet Service Endpoints. Now, let's take a closer look at how to track and manage service endpoint usage in your network, focusing on writing complex KQL queries to accurately identify service endpoint configurations. Once identified, you can start assessing your environment to understand the impact and optimize your network setup.
⚡ Querying Service Endpoint Usage with KQL
KQL is one of the most effective tools in Azure for querying large datasets, especially within Azure Resource Graph. It allows you to monitor and identify where service endpoints are being used in your environment and provides visibility into traffic patterns. I have tons of examples in my blog about KQL and Azure Resource Graph. You can check them out for more details.
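To illustrate the kind of query Part 2 works toward, here is a minimal sketch, assuming the Az.ResourceGraph module; it lists subnets with a Microsoft.Storage service endpoint configured and is a starting point rather than the article's full query:

# Minimal sketch: find subnets that have a Microsoft.Storage service endpoint.
# Requires Az.ResourceGraph; a starting point, not the full query.
$query = @"
resources
| where type == 'microsoft.network/virtualnetworks'
| mv-expand subnet = properties.subnets
| mv-expand endpoint = subnet.properties.serviceEndpoints
| where endpoint.service == 'Microsoft.Storage'
| project vnetName = name, subnetName = tostring(subnet.name), location
"@
Search-AzGraph -Query $query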
Understanding Service Endpoint Usage for Storage Accounts - Part 1
In this two-part article series, I’ll explore a common Azure scenario involving Virtual Network (VNet) Service Endpoints, specifically in the context of Azure Storage. Microsoft already provides extensive documentation on this topic, so you won’t see much detail on that here; it's all available in the MS Docs. Instead, this series will focus on practical considerations, real-world use cases, and the network impacts of using service endpoints in your environment.
📚 Understanding Azure Service Endpoints
Service Endpoints provide secure, direct connectivity from virtual networks to Azure PaaS services, such as Azure Storage, over the Azure backbone network. Microsoft highlights that service endpoints help optimize both performance and security by keeping traffic within Azure’s infrastructure, eliminating the need to route it over the public internet.
Leveraging Bicep deployer for Automated RBAC Assignments
Hello Folks,
Today, I'll go through a topic that I believe is a real time-saver—one that keeps automation running smoothly and ensures it's effectively integrated into Bicep templates. In this article, I'll share my experience with the Bicep deployer() function and explain how it streamlines the process of provisioning resources like Azure Key Vaults while automating RBAC-based access.
What Is the deployer Function?
The deployer() function in Bicep returns details about the identity executing the deployment. Essentially, it tells you which service principal or managed identity is running your deployment. I find this incredibly useful because it allows me to reference the deployer’s identity directly in my templates—ensuring that the correct permissions are automatically applied without hardcoding any object IDs for the deployments.
Example output of the deployer() function:
{
"objectId": "12345678-1234-1234-1234-123456789abc",
"tenantId": "87654321-4321-4321-4321-cba987654321"
}
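As an illustration, here is a minimal Bicep sketch (the vault name is a placeholder) that grants the deploying identity the Key Vault Secrets Officer role on an existing vault by referencing deployer().objectId instead of a hardcoded object ID:

// Minimal sketch: grant the deploying identity Key Vault Secrets Officer
// on an existing vault; the vault name is a placeholder.
param keyVaultName string = 'kv-demo'

resource keyVault 'Microsoft.KeyVault/vaults@2023-07-01' existing = {
  name: keyVaultName
}

resource kvRoleAssignment 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(keyVault.id, deployer().objectId, 'kv-secrets-officer')
  scope: keyVault
  properties: {
    principalId: deployer().objectId
    // Built-in role definition ID for Key Vault Secrets Officer
    roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', 'b86a8fe4-44ce-4948-aee5-eccb2c155cd7')
  }
}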
Image Management from Docker Hub to Azure Container Registry - Part 2
In Part 1 of this series, we built a PowerShell script that automates the process of pulling a public Docker image from Docker Hub and pushing it to your Azure Container Registry (ACR). In this second part, we'll focus on integrating that script into a CI/CD pipeline using GitHub Actions. This logic will ensure that our image management process is fully automated and runs on a scheduled basis.
With GitHub Actions, you can schedule the execution of your script, monitor its output, and ensure that your ACR is always updated with the latest image—without any manual intervention. Let’s walk through how to configure your GitHub Actions workflow.
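The workflow's skeleton looks roughly like the sketch below; the workflow name, cron schedule, secret names, and script path are all illustrative. It combines a schedule trigger with a job that performs a federated (OIDC) Azure login and runs the PowerShell script:

# Minimal sketch of the scheduled workflow; names, cron, and paths are illustrative.
name: sync-image-to-acr
on:
  schedule:
    - cron: '0 3 * * *'   # run daily at 03:00 UTC
  workflow_dispatch: {}    # allow manual runs

permissions:
  id-token: write          # needed for federated (OIDC) login
  contents: read

jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: azure/login@v2
        with:
          client-id: ${{ secrets.AZURE_CLIENT_ID }}
          tenant-id: ${{ secrets.AZURE_TENANT_ID }}
          subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
      - name: Run image sync script
        shell: pwsh
        run: ./scripts/Sync-DockerImage.ps1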
Image Management from Docker Hub to Azure Container Registry - Part 1
Keeping your container images up to date is a critical part of managing modern deployments. In this article, I'll explain how you can automate the process of pulling a public Docker image from Docker Hub and pushing it to your Azure Container Registry (ACR) using a PowerShell script. This approach is especially useful for overcoming Docker Hub’s rate limits by storing the image in your ACR.
This article is part of a two-part series where I demonstrate how to build an automated process that leverages a Service Principal with Federated Credentials and strict Role-Based Access Control (RBAC) assignments. With RBAC in place, only the necessary permissions are granted, ensuring that the Service Principal has the least privileges required for this operation. In this example, we'll use the Cloudflare image as our reference, but you can adapt the process for any public image.
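At its core, the script boils down to something like this sketch; the registry name is a placeholder, and it assumes Docker plus the Azure CLI are installed and the signed-in identity holds an AcrPush role assignment:

# Minimal sketch: pull a public image from Docker Hub and push it to ACR.
# Registry and image names are placeholders.
$registryName = 'myregistry'                # placeholder ACR name
$registry     = "$registryName.azurecr.io"
$image        = 'cloudflare/cloudflared'    # placeholder public image
$tag          = 'latest'

docker pull "${image}:${tag}"
az acr login --name $registryName
docker tag "${image}:${tag}" "${registry}/${image}:${tag}"
docker push "${registry}/${image}:${tag}"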
Identifying Functions Inside Function Apps
In our previous article, Identifying Workflows Inside Azure Logic Apps, we explored how to identify workflows running within Azure Logic Apps across different instances in an Azure tenant. That script allowed us to automate the retrieval of workflow details, such as the workflow name, helping with resource management and assessment. By leveraging PowerShell, we eliminated the need for manual tracking through the Azure portal and streamlined the reporting process. As of now, by the way, there is still no easy way to get this information from the Azure portal.
That approach was particularly useful for Logic Apps Standard, which can host multiple workflows within a single instance. By automating the data retrieval, we generated structured reports showing which workflows were running on which Logic Apps, making it easier to audit and optimize resources.
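Applying the same idea to Function Apps, a minimal sketch might look like this, assuming the Az.Accounts and Az.Functions modules; the api-version is an assumption and may need adjusting:

# Minimal sketch: enumerate functions inside every Function App by calling
# the ARM 'functions' child endpoint. Assumes Az.Accounts and Az.Functions;
# the api-version is an assumption.
$report = foreach ($app in Get-AzFunctionApp) {
    $response = Invoke-AzRestMethod -Method GET `
        -Path "$($app.Id)/functions?api-version=2022-03-01"
    foreach ($fn in ($response.Content | ConvertFrom-Json).value) {
        [pscustomobject]@{
            FunctionApp = $app.Name
            Function    = $fn.properties.name
        }
    }
}
$report | Format-Table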