95 posts tagged with "PowerShell"

· 5 min read
Hasan Gural

Bicep parameter files allow you to define values in an individual file that are then passed to your main Bicep template file. The parameter file exposes values that may change across subscriptions, environments, or regions. Leveraging a parameter file drives consistency in your IaC deployments while providing flexibility. For example, an organization can use these files to right-size nonproduction environments to save costs while maintaining the same core infrastructure across all deployments.

Reading Bicep Parameter Files with PowerShell

In addition, these parameter files streamline the CI/CD deployment process. Because they are kept under source control and passed into the appropriate automated deployment steps, they ensure a consistent and repeatable deployment experience. In this article, we will explore how to create, read, and use a Bicep parameters file via PowerShell.
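As a minimal sketch of the reading step, a JSON-format parameters file can be loaded and inspected with built-in cmdlets (the file path and parameter names below are placeholders for illustration):

```powershell
# Read an ARM/Bicep JSON parameters file and list each parameter's value.
$paramFile = Get-Content -Path './main.parameters.dev.json' -Raw | ConvertFrom-Json

foreach ($param in $paramFile.parameters.PSObject.Properties) {
    # Each property holds an object with a 'value' member.
    '{0} = {1}' -f $param.Name, $param.Value.value
}
```

The same approach works for any environment-specific file (dev, test, prod) as long as it follows the standard parameters-file schema.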

· 6 min read
Hasan Gural

In Part 1, we covered the benefits, considerations, and real-world use cases for Azure VNet Service Endpoints. Now, let's take a closer look at how to track and manage service endpoint usage in your network, focusing on writing complex KQL queries to accurately identify service endpoint configurations. Once identified, you can start assessing your environment to understand the impact and optimize your network setup.

⚡ Querying Service Endpoint Usage with KQL

KQL is one of the most effective tools in Azure for querying large datasets, especially within Azure Resource Graph. It allows you to monitor and identify where service endpoints are being used in your environment and provides visibility into traffic patterns. I have tons of examples in my blog about KQL and Azure Resource Graph. You can check them out for more details.
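As a starting point, a query along these lines (a sketch against the Resource Graph schema; verify property names in your environment) surfaces subnets that have at least one service endpoint configured:

```kql
// List subnets with one or more service endpoints enabled.
resources
| where type == 'microsoft.network/virtualnetworks'
| mv-expand subnet = properties.subnets
| where array_length(subnet.properties.serviceEndpoints) > 0
| project vnetName = name,
          subnetName = tostring(subnet.name),
          endpoints = subnet.properties.serviceEndpoints
```

You can run this from PowerShell with `Search-AzGraph` from the Az.ResourceGraph module.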

· 4 min read
Hasan Gural

In this two-part article series, I’ll explore a common Azure scenario involving Virtual Network (VNet) Service Endpoints, specifically in the context of Azure Storage. Microsoft already provides extensive documentation on this topic, so you won’t see much of that detail repeated in this series—it's all available in the MS Docs. Instead, these articles will focus on practical considerations, real-world use cases, and the network impacts of using service endpoints in your environment.

📚 Understanding Azure Service Endpoints

Service Endpoints offer secure, private connectivity between virtual networks and Azure PaaS services, such as Azure Storage, by utilizing the Azure backbone network. Microsoft highlights that service endpoints help optimize both performance and security by keeping traffic within Azure’s infrastructure, thus eliminating the need for routing traffic over the public internet.
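To make this concrete, here is a hedged sketch of enabling a storage service endpoint on an existing subnet with the Az PowerShell module (resource names and the address prefix are placeholders):

```powershell
# Fetch the VNet, add a Microsoft.Storage service endpoint to a subnet,
# then persist the change back to Azure.
$vnet = Get-AzVirtualNetwork -Name 'vnet-prod' -ResourceGroupName 'rg-network'

Set-AzVirtualNetworkSubnetConfig -VirtualNetwork $vnet -Name 'snet-app' `
    -AddressPrefix '10.0.1.0/24' -ServiceEndpoint 'Microsoft.Storage' |
    Set-AzVirtualNetwork
```

Note that enabling the endpoint on the subnet is only half the picture: the storage account's network rules must also allow that subnet for traffic to be admitted.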

· 4 min read
Hasan Gural

Hello Folks,

Today, I'll go through a topic that I believe is a real time-saver—one that keeps automation running smoothly and ensures it's effectively integrated into Bicep templates. In this article, I'll share my experience with the Bicep deployer() function. I’ll explain how it streamlines the process of provisioning resources like Azure Key Vaults while automating RBAC-based access.

What Is the deployer Function?

The deployer() function in Bicep returns details about the identity executing the deployment. Essentially, it tells you which service principal or managed identity is running your deployment. I find this incredibly useful because it allows me to reference the deployer’s identity directly in my templates—ensuring that the correct permissions are automatically applied without hardcoding any object IDs for the deployments.

Example output of the deployer() function looks like this:

{
  "objectId": "12345678-1234-1234-1234-123456789abc",
  "tenantId": "87654321-4321-4321-4321-cba987654321"
}
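For instance, a role assignment can reference the deployer's identity directly. The sketch below (vault name and assignment seed are illustrative; the GUID is the built-in Key Vault Secrets Officer role, which you should verify for your scenario) grants the deploying identity access to an existing vault:

```bicep
resource keyVault 'Microsoft.KeyVault/vaults@2023-07-01' existing = {
  name: 'kv-demo'
}

resource roleAssignment 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(keyVault.id, deployer().objectId, 'secrets-officer')
  scope: keyVault
  properties: {
    principalId: deployer().objectId
    principalType: 'ServicePrincipal'
    roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', 'b86a8fe4-44ce-4948-aee5-eccb2c155cd7')
  }
}
```

Because the object ID comes from the function rather than a parameter, the same template works unchanged no matter which pipeline identity runs the deployment.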

· 4 min read
Hasan Gural

In Part 1 of this series, we built a PowerShell script that automates the process of pulling a public Docker image from Docker Hub and pushing it to your Azure Container Registry (ACR). In this second part, we'll focus on integrating that script into a CI/CD pipeline using GitHub Actions. This logic will ensure that our image management process is fully automated and runs on a scheduled basis.

With GitHub Actions, you can schedule the execution of your script, monitor its output, and ensure that your ACR is always updated with the latest image—without any manual intervention. Let’s walk through how to configure your GitHub Actions workflow.
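A workflow skeleton along these lines (the script path, secret names, and schedule are placeholders) covers the scheduling and OIDC login pieces:

```yaml
name: sync-acr-image
on:
  schedule:
    - cron: '0 2 * * *'   # nightly at 02:00 UTC
  workflow_dispatch:       # allow manual runs

permissions:
  id-token: write          # required for OIDC federated login
  contents: read

jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: azure/login@v2
        with:
          client-id: ${{ secrets.AZURE_CLIENT_ID }}
          tenant-id: ${{ secrets.AZURE_TENANT_ID }}
          subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
      - name: Run sync script
        shell: pwsh
        run: ./scripts/Sync-AcrImage.ps1
```

The `id-token: write` permission is what lets the federated credential flow work without storing a client secret.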

· 6 min read
Hasan Gural

Keeping your container images up to date is a critical part of managing modern deployments. In this article, I'll explain how you can automate the process of pulling a public Docker image from Docker Hub and pushing it to your Azure Container Registry (ACR) using a PowerShell script. This approach is especially useful for overcoming Docker Hub’s rate limits by storing the image in your ACR.

Automated Image Management: From Docker Hub to Azure Container Registry

This article is part of a two-part series where I demonstrate how to build an automated process that leverages a Service Principal with Federated Credentials and strict Role-Based Access Control (RBAC) assignment. With RBAC in place, only the necessary permissions are granted, ensuring that the Service Principal has the least privileges required for this operation. In this example, we'll use the Cloudflare image as our reference, but you can adapt the process for any public image.
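As an aside, the Az.ContainerRegistry module can also perform the copy server-side, without a local docker pull/push. A minimal sketch (registry and image names are placeholders, not the article's actual values):

```powershell
# Import a public Docker Hub image directly into ACR.
Import-AzContainerRegistryImage `
    -ResourceGroupName 'rg-registry' `
    -RegistryName 'myacr' `
    -SourceRegistryUri 'docker.io' `
    -SourceImage 'cloudflare/cloudflared:latest' `
    -TargetTag 'cloudflared:latest'
```

This variant avoids needing a Docker daemon on the build agent, though the pull/push script covered in the series gives you more control over the intermediate steps.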

· 5 min read
Hasan Gural

In our previous article Identifying Workflows Inside Azure Logic Apps, we explored how to identify workflows running within Azure Logic Apps across different instances in an Azure tenant. That script allowed us to automate the process of retrieving workflow details such as name, helping with resource management and assessment. By leveraging PowerShell, we eliminated the need for manual tracking through the Azure portal and streamlined the reporting process. As of now, there is still no easy way to get this information from the Azure portal.

That approach was particularly useful for Logic Apps Standard, which can host multiple workflows within a single instance. By automating the data retrieval, we generated structured reports showing which workflows were running on which Logic Apps, making it easier to audit and optimize resources.
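The core enumeration step for a Standard instance can be sketched with a call to the site's management API (the site name, resource group, and API version below are assumptions; check them against your environment):

```powershell
# List workflow names hosted on a Logic Apps Standard instance.
$site = Get-AzWebApp -ResourceGroupName 'rg-apps' -Name 'logic-std-01'
$path = "$($site.Id)/hostruntime/runtime/webhooks/workflow/api/management/workflows?api-version=2018-11-01"

$resp = Invoke-AzRestMethod -Path $path -Method GET
($resp.Content | ConvertFrom-Json) | ForEach-Object { $_.name }
```

Looping this over every `workflowapp`-kind site in the tenant yields the instance-to-workflow mapping the report is built from.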

· 5 min read
Hasan Gural

Hey friends,

In this post, we will dive into generating a report that tracks which workflows are running on which Logic App instances within your tenant. If you are using Logic Apps, especially on the Consumption or Standard plans, keeping track of these workflows across different instances can be a challenge. The good news is that, with a PowerShell script I have developed, we can create a report that highlights which Logic App instances are running which workflows. This report can be crucial for troubleshooting, performance monitoring, and resource management.
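For the Consumption side, where each Logic App hosts exactly one workflow, the inventory step can be sketched as a loop over your subscriptions (this assumes the Az.LogicApp module and sufficient reader permissions):

```powershell
# Build a tenant-wide list of Consumption Logic Apps per subscription.
Get-AzSubscription | ForEach-Object {
    Set-AzContext -SubscriptionId $_.Id | Out-Null
    Get-AzLogicApp | Select-Object Name, Location, State
}
```

Standard instances need a different approach, since a single instance can host many workflows; the full script in this post handles both cases.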

· One min read
Hasan Gural

Following an amazing session at Global Azure 2024 - Istanbul, I’m happy to share the recording of my talk, "Deployment stack with Bicep – Insights and Experiences". Whether you attended live or couldn’t make it, you can now watch the full session at your convenience.

GlobalBootCamp2024

🎥 Watch the Recording Here:

· One min read
Hasan Gural

I'm excited to announce that, continuing my journey since 2017, I will be speaking at the Azure Global Bootcamp 2024 on April 18-20. This year, I’m eager to share more insights and learnings with the Azure community. Join me as we dive into the latest Azure advancements and tackle current tech challenges together.

GlobalBootCamp2024