Azure SQL Database Export to Localhost using SqlPackage
Exporting an Azure SQL Server database to a local machine is a common task for developers and administrators who need to create backups, test changes, or migrate databases to other environments. If your SQL server is configured with Entra ID (formerly Azure AD) authentication, you might face an issue connecting to the database with a username and password from a command-line tool like SqlPackage. In this guide, I’ll walk you through the process of exporting an Azure SQL Server database to your local Windows machine using PowerShell, the SqlPackage tool, and an Azure SQL server with Entra ID authentication.
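To give a feel for the end result, here is a minimal sketch of an export command that uses Entra ID interactive authentication. The server, database, and output file names are placeholders rather than values from the article, and the same flags work whether SqlPackage is invoked from bash or from PowerShell on Windows:

```bash
# Minimal sketch with placeholder names (myserver, mydb, mydb.bacpac).
# "Authentication=Active Directory Interactive" opens a browser sign-in for Entra ID
# instead of relying on a SQL username and password.
SqlPackage /Action:Export \
  /TargetFile:"./mydb.bacpac" \
  /SourceConnectionString:"Server=tcp:myserver.database.windows.net,1433;Initial Catalog=mydb;Authentication=Active Directory Interactive;Encrypt=True;"
```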
Is Azure Resource Group just a Logical Container?
I always thought of an Azure Resource Group as a logical container that helps in managing and organizing resources in Azure. But it appears that there are some limitations I was not aware of.
Azure DevOps Terraform Pipeline
I was recently working on a project that required me to create a Terraform pipeline in Azure DevOps. I had never done this before, so I had to do some research to figure out how to set it up. In this article, I will share the final pipeline that I created, as well as some of the resources that I found helpful along the way.
Azure Function Encountered an Error ServiceUnavailable From Host Runtime
I recently had an issue deploying a Python 3.10 Azure Function on an App Service Plan. The error message was:

```
9:38:43 AM myfunc: Deployment successful. deployer = ms-azuretools-vscode deploymentPath = Functions App ZipDeploy. Extract zip. Remote build.
9:38:56 AM myfunc: Syncing triggers...
9:39:50 AM myfunc: Syncing triggers (Attempt 2/6)...
9:40:51 AM myfunc: Syncing triggers (Attempt 3/6)...
9:42:01 AM myfunc: Syncing triggers (Attempt 4/6)...
9:43:01 AM myfunc: Syncing triggers (Attempt 5/6)...
9:45:12 AM myfunc: Syncing triggers (Attempt 6/6)...
9:45:33 AM: Error: Encountered an error (ServiceUnavailable) from host runtime.
```
Here is my investigation and solution.
Rethinking Code Comments in the AI Era
In the realm of software development, the utility of comments in code has long been a subject of debate. Traditionally, many have held the belief that high-quality code should speak for itself, rendering comments unnecessary. This viewpoint advocates for self-explanatory code through well-named variables, functions, and classes, and a logical assembly of the code structure. However, the advent of AI assistants like GitHub Copilot is challenging this notion, ushering in a new perspective on the role of comments in coding. In this blog post, we’ll explore how AI is reshaping our approach to commenting, transforming it from a tedious task to an integral part of the coding process.
Azure API Management to Azure Cosmosdb
In my previous posts I showed you how to connect Azure API Management to Azure Service Bus and Azure API Management to an Azure Storage account. In this post I will show you how to connect Azure API Management to Azure Cosmos DB.
Host React frontend app on Azure Storage Account
Running a React frontend web application on an Azure Storage account is a great way to host your app at low cost and with high scalability. In this post, we’ll go through the steps of setting up a new Azure Storage account and deploying your React app to it.
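As a rough sketch of the flow (the storage account name and build folder below are placeholders), enabling static website hosting and uploading a React build with the Azure CLI can look like this:

```bash
# Placeholder names: mystaticsite (storage account), ./build (React production build output).
# Enable static website hosting on the storage account.
az storage blob service-properties update \
  --account-name mystaticsite \
  --static-website \
  --index-document index.html \
  --404-document index.html

# Upload the production build to the special $web container.
az storage blob upload-batch \
  --account-name mystaticsite \
  --destination '$web' \
  --source ./build
```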
Azure API Management to Azure Service Bus
In my previous post I showed how to use Azure API Management to expose an Azure Storage Account. In this post I will show how to use Azure API Management to expose an Azure Service Bus. This combination is useful when you have a fire-and-forget HTTP endpoint and you expect irregular traffic. For example, you are designing a mobile application crash reporting system. You want to send the crash report to the server and forget about it. You don’t want to wait for the response. You don’t want to block the user interface. You don’t want to retry if the server is not available. It might happen that a new mobile app version has a significant bug and you get a lot of crash reports. In this case, it is reasonable to use Azure Service Bus to queue the crash reports during peak times and process them later. As in my previous post, I will use a scenario with two brands (Adidas and Nike) to show you the flexibility of the Azure API Management plus Azure Service Bus combination. I will also define everything in Terraform so the solution deployment is fully automated.
Azure API Management to Azure Storage Account
There can be a case when you need to upload a file or metadata to an Azure Storage Account from an application which is outside of your cloud infrastructure. You should not expose your storage account to the internet for that purpose. There are many reasons for that: security, flexibility, monitoring, etc. The natural solution to achieve the goal is to use something in between your Azure Storage Account and the public internet: some kind of HTTP proxy that will manage user authentication, do some simple validations and then pass the request on to the Storage Account to save the request body data as a blob. Azure API Management is a perfect candidate for that role. It is a fully managed service, which means you don’t need to create any custom application. It is highly available and scalable, and it has a lot of features that can be used for your needs. In this article I will show how to configure Azure API Management to upload a file to an Azure Storage Account.
Breaking Through Barriers: Simplifying CD with Automated Azure SQL Schema Changes!
Have you ever tried to deploy a Microsoft SQL database schema to Azure SQL using a Linux-based CI/CD pipeline? If you have, you probably know that it is not a trivial task. The reason is that the Microsoft.Data.Tools.Msbuild package, which contains the MSBuild targets and properties used to build and deploy database projects, is part of SQL Server Data Tools (SSDT) and is not available for Linux. So if you want to deploy a database schema to Azure SQL from a Linux-based CI/CD pipeline, you need to use a different approach. In this article we will see how to deploy a Microsoft SQL database schema to Azure SQL using any Linux-based CI/CD pipeline (GitLab, GitHub, Azure DevOps, etc.).
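One Linux-friendly option, shown here only as an illustration and not necessarily the approach the article lands on, is to build a DACPAC and publish it with the cross-platform SqlPackage dotnet tool. The file, server, and credential names below are placeholders:

```bash
# Illustrative only; placeholder names (MyDb.dacpac, myserver, mydb, ciuser, $SQL_PASSWORD).
# Install the cross-platform SqlPackage as a dotnet global tool.
dotnet tool install --global microsoft.sqlpackage

# Publish the compiled DACPAC to Azure SQL from any Linux CI/CD agent.
sqlpackage /Action:Publish \
  /SourceFile:"./MyDb.dacpac" \
  /TargetConnectionString:"Server=tcp:myserver.database.windows.net,1433;Initial Catalog=mydb;User ID=ciuser;Password=$SQL_PASSWORD;Encrypt=True;"
```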
Azure Function Serverless Deployment Python
The Consumption plan is the cheapest way to run your Azure Function. However, it has some limitations. For example, you cannot use Web Deploy, Docker Container, Source Control, FTP, Cloud sync or Local Git. You can use External package URL or Zip deploy instead. In this article I will show you how to deploy a Python Azure Function (programming model v2) app to a Linux Consumption Azure Function resource using Zip deployment.
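As a hedged sketch of the zip deployment step itself (the resource group, app name, and archive path are placeholders; packaging the v2 programming model project is covered in the article):

```bash
# Placeholder names: my-rg (resource group), my-func-app (function app), app.zip (zipped project).
# Ask the platform to run the remote build so Python dependencies are resolved server-side.
az functionapp config appsettings set \
  --resource-group my-rg \
  --name my-func-app \
  --settings SCM_DO_BUILD_DURING_DEPLOYMENT=true

# Push the zipped project to the Linux Consumption function app.
az functionapp deployment source config-zip \
  --resource-group my-rg \
  --name my-func-app \
  --src app.zip
```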
Azure Function Serverless Deployment Dotnet
The Consumption plan is the cheapest way to run your Azure Function. However, it has some limitations. For example, you cannot use Web Deploy, Docker Container, Source Control, FTP, Cloud sync or Local Git. You can use External package URL or Zip deploy instead. In this article I will show you how to deploy a .NET isolated Azure Function app to a Linux Consumption Azure Function resource using Zip deployment.
Dapper queries synchronized with MSSQL database schema
Dapper is a micro-ORM that allows you to control the SQL queries you are executing and removes the pain of mapping the result sets back to your domain model. The thing is that when you write SQL queries you have to make sure they are valid against the current database schema. One solution is to use stored procedures… the other one is this…
AWS Certificate Manager TLS Certificate import
When you are going to use a TLS certificate with AWS CloudFront, AWS Elastic Load Balancing, AWS API Gateway or other integrated services, you can do so by importing the certificate data into AWS Certificate Manager.
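For illustration, importing an existing certificate with the AWS CLI looks roughly like this (the file names are placeholders; note that certificates used with CloudFront must be imported in us-east-1):

```bash
# Placeholder file names; the certificate, key and chain are PEM-encoded files on disk.
aws acm import-certificate \
  --certificate fileb://certificate.pem \
  --private-key fileb://private-key.pem \
  --certificate-chain fileb://chain.pem \
  --region us-east-1
```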
Deploying dotnet6 app on AWS ElasticBeanstalk Graviton2 with HTTPS termination on EC2 instance
On one of my projects there was a requirement to reduce the Elastic Beanstalk hosting costs to a minimum for a dotnet application. The app is a kind of dashboard with a limited audience and nothing heavy inside. I decided to use Graviton2 instances (ARM processors on board) in order to achieve the best speed/cost ratio.
My current command to format hard drive for linux NAS
Putting it here just so I don’t forget =) Here is the command I use to format my hard drives for the NAS. I came to it when I bought two 10TB hard drives for Chia coin mining.

```bash
# -T largefile4 tunes the inode ratio for very large files; -m 0 reserves no blocks for root.
sudo mkfs.ext4 /dev/sde1 -T largefile4 -m 0
```
Azure Api Management Automated Revisions
PowerShell script to deploy a new WebAPI and release a new API Management revision with zero downtime.
Future for business analysts
Software products can be much better if somebody starts asking questions about who will use them. We need user researchers called Interaction Designers (IxD) who will prove that we are creating something that can be used by people. Where to find them? Business analysts are the best place to start looking.
3 steps to create integration tests for your ASP.NET MVC 5 application
When you develop an ASP.NET MVC application, you should test it one way or another. You can cover different parts of your application logic with unit tests, or you can create tests that look like user interaction scenarios. These tests have several advantages over unit tests:
- These tests are independent of the implementation, so you can’t break a test by refactoring (the only scenario in which you should modify a test is a functional requirement change).
- These tests are good documentation for your application.
- These tests can give you an idea of what your users need and how they are supposed to use a concrete feature.
Merge SQL databases
Imagine you have 2 databases with identical schemas. These databases have been serving different application instances for some time, and now you need to merge them together for some reason. You can use different tools to achieve this goal, for example dbForge Data Compare or SQL Data Compare. But these tools cost money, and if you don’t merge databases every day they are probably not an option for you. Also, these tools do not know the full specifics of your database structure, including unique indexes, check constraints and triggers. Another big deal is identity columns that are used as primary keys: in the two databases these keys can be the same but represent different entities. In my practice this is the second time I have faced a database merge task, and here is how I handle it.
RabbitMQ poisoned messages handling
You never know what will come from a remote system. Even if you keep everything under control, there can always be something unexpected. In a microservice architecture, each component must be ready for anything.
Custom domain name for Azure VM
If you are running a VM on Azure, you know that your default domain name looks like this: yourservicename.cloudapp.net. Also, the public IP address is not static, which means you can’t use an ‘A’ DNS record for it.