Use Azure automation to start and stop Virtual Machines

If you have virtual machines in your Azure subscription that don’t require 24×7 uptime, this is the blog post for you.

This blog post, where I hope to provide a detailed step-by-step guide, is based on the official Microsoft article “Start/Stop VMs during off-hours solution in Azure Automation”. I highly recommend that you read that article, since this post is more focused on the execution and not necessarily on a detailed explanation of every component.

So what’s the goal here? To be able to, without human interaction, start and stop virtual machines in your Azure subscription, daily. Cool, right?

Before I continue: in case all you need is to stop virtual machines, you can leverage a much simpler process by configuring the “Auto-shutdown” option, under the “Operations” section of a virtual machine’s settings.


But if you need to do both, shut down and boot up machines, then continue reading.

What are the prerequisites to configure this solution?

To be able to configure the solution to start and stop virtual machines, you need the following:

  • An Azure subscription
  • An Azure Automation account
  • A Log Analytics workspace

You can create those resources when you’re enabling the solution, or separately by adding new resources in the “All Resources” tab.

How do I configure the solution?

You have two easy ways of configuring the solution.

Add a new resource in the “All resources” tab

This is the ideal option, especially if you haven’t created the Automation account yet.


On the left-hand side menu, browse to “All Resources”, click “Add” and type “Start/Stop”. The solution will pop up for selection. Click “Create”.

Via your Azure Automation account

If you already have an automation account created, use it to access the Start/Stop VM solution.


Browse to the automation account and, under “Related Resources”, click “Start/Stop VM”. Then click “Learn more about and enable the solution”.

You will end up in the same creation page as shown in the option above.

Configure the solution

The step-by-step configuration of the solution is actually very simple. As noted in the beginning of this post, all you need is to select an automation account and a Log Analytics workspace, and configure the solution details.


First, you start by selecting an existing Log Analytics workspace or configuring a new one. If you create a new one, all you have to do is give it a name, associate it with a resource group (new or existing), select a location and keep “Per GB” as the pricing tier.


In the second step, you can select an existing automation account or create a new one. If you create a new one, you just have to choose the name. The resource group and corresponding location will be locked to the one where the solution is being deployed. Also, the automation account will be created as a “Run As” account.

If you’re creating an automation account separately and you can’t see it for selection here, it might be because of several things, such as the account not being created as “Run As” (mandatory), or it being in a resource group or location that makes it unavailable.


Finally, you can configure the most important part, which is the solution parameters. Those include the following:

  • Target resource group – Enter the name of the resource group(s) that you want to target. Names are case-sensitive. If you want to target all groups, enter “*”. If you want to target multiple groups, use a comma as the separator between group names.
  • VM Exclude List – Use this field to exclude any VMs in your resource group(s) that you don’t want the solution to affect. It’s important to understand that this solution will, by default, target the entire resource group, unless you exclude VMs here.
  • Daily Start and Stop Time – Select the times at which you want your VMs to boot up and shut down, every day.
  • Email functionality – If you want to receive an email notification each time an action is taken on a VM (e.g. shutdown), select “Yes” and enter the email address you want the notification sent to (multiple addresses separated by commas).
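To make those targeting rules concrete, here’s a small Python sketch (purely illustrative; my own helper, not the solution’s actual code) of how the target resource group setting and the VM exclude list combine:

```python
def select_target_vms(all_vms, target_groups_setting, exclude_list_setting):
    """Illustrative sketch of the Start/Stop targeting rules.

    all_vms: list of (resource_group, vm_name) tuples.
    target_groups_setting: "*" for all groups, or comma-separated group names.
    exclude_list_setting: comma-separated VM names to leave untouched.
    """
    excluded = {n.strip() for n in exclude_list_setting.split(",") if n.strip()}

    if target_groups_setting.strip() == "*":
        candidates = all_vms  # "*" targets every resource group
    else:
        # Group names are case-sensitive, so no lower()/casefold() here
        groups = {g.strip() for g in target_groups_setting.split(",")}
        candidates = [(rg, vm) for (rg, vm) in all_vms if rg in groups]

    # Everything in a targeted group is affected unless explicitly excluded
    return [(rg, vm) for (rg, vm) in candidates if vm not in excluded]


vms = [("Prod-RG", "web01"), ("Prod-RG", "db01"), ("Dev-RG", "test01")]
print(select_target_vms(vms, "Prod-RG", "db01"))
# [('Prod-RG', 'web01')]
```

Note how everything inside a targeted group is in scope by default; the exclude list is your only opt-out, which is why it matters so much in large resource groups.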

How do I check if it worked?

Browse to your Automation account and go to “Process Automation > Jobs”.


Click on the latest job to see more details.


You can browse between the tabs to check the details of the job execution. Pay special attention to the “All Logs” tab, where you can see the actions executed, the number of errors and the number of warnings.

The bottom line

Personally, I love this solution. It’s easy to deploy and saves me a ton of my Azure monthly credit.

You can go beyond what I showed you in this post and manually edit the job details, to do things like set an end date for the job. But this turnkey Azure solution, although not extremely flexible (i.e. it targets entire resource groups, it’s tricky to specify exceptions in resource groups with a large number of VMs, it’s designed for daily boot up and shut down actions, etc.), is very useful. 5 stars!!

Use it and form your own opinion. As always, if you have any questions, let me know.

Manage and forecast your Microsoft Azure spending

“Keep our Azure spending under control” is something that IT administrators and IT consultants hear very often. So how important is it to forecast and control that spending?

In my opinion, it’s extremely important, and apparently Microsoft shares that opinion, which is probably one of the several good reasons that led them to acquire the Israeli cloud startup Cloudyn.

Since the acquisition, in June 2017, Microsoft has had the Cloudyn service available through the Azure portal, but the end goal seems to be to fully replace Cloudyn with Azure Cost Management, by integrating all of its features and functionality.

Basically, Microsoft is moving all Cloudyn cost management features from the Cloudyn portal into the Azure portal. Below you have an outline of what to use when, which you can and should read in the “What is the Cloudyn service?” article.


As you can see above, Microsoft recommends Azure Cost Management for most offers and features.

It’s important to note that today (this blog post was written in May 2019), Microsoft CSP subscriptions are still in the process of being moved from Cloudyn to Azure Cost Management. That also means that, for now, Azure Cost Management only supports Enterprise Agreement, pay-as-you-go and MSDN subscriptions.

It’s also important to note that today you can only register with Cloudyn if you’re in the Microsoft CSP program.


Now that you have some context on the ongoing transition, let’s talk about my favorite Azure cost management feature: forecasting.

For those who still have access to Cloudyn, the “Forecast future spending” tutorial is a great read and will allow you to build your own reports.

If you want to leverage an API directly, to do things like gather forecast information into your own portal, you can use the Forecast API that Microsoft has available.
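To make the forecasting idea concrete, here’s a toy Python sketch (this is not the Forecast API; the cost numbers are made up) that projects future daily spend by fitting a simple linear trend to historical daily costs:

```python
def forecast_spend(daily_costs, days_ahead):
    """Project future daily spend with a least-squares linear trend.

    daily_costs: historical cost per day, oldest first.
    days_ahead: how many future days to project.
    Returns the projected cost for each future day.
    """
    n = len(daily_costs)
    xs = range(n)
    # Least-squares fit of cost = slope * day + intercept
    mean_x = sum(xs) / n
    mean_y = sum(daily_costs) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, daily_costs))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return [slope * (n + d) + intercept for d in range(days_ahead)]


# Hypothetical numbers: spend creeping up by roughly $1/day
history = [10.0, 11.0, 12.0, 13.0, 14.0]
print(forecast_spend(history, 3))  # [15.0, 16.0, 17.0]
```

Real forecasting tools do far more (seasonality, usage breakdowns, reserved-instance modeling), but the core idea is the same: learn a trend from past spend and extrapolate it forward.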

Finally, if you haven’t already, go through this learning module that will teach you how to Predict Costs and optimize spending for Azure.

There are so many different things that you can do in terms of cost analysis, forecasting and cost management in Azure. Hopefully this post gives you a high-level overview and some resources to start from. Stay tuned for more information in future blog posts.

Azure Friday – the weekly videos you should not miss

If you’re ramping up your Azure skills, are an experienced Azure consultant, administer Azure daily, or simply like to learn more about Microsoft Azure technology, then you should dedicate a few minutes per week to watching the Microsoft Azure Friday videos.

Videos usually run from around 10 to 15 minutes and include demos and/or very detailed explanations of new or existing services, as well as insight from the Azure product group.

It’s very common to see Microsoft publishing multiple videos per week; some will be extremely detailed and complex, and some more high level.

You can subscribe to the series or add it to your calendar to make sure you don’t miss it.

Apply best practices with the Microsoft Azure Advisor service

Microsoft Azure has a free, personalized recommendations service to help you apply Azure best practices, called Azure Advisor. If you haven’t heard about it or used it before, you should start now.

Microsoft describes the Azure Advisor as a “…personalized cloud consultant that helps you follow best practices to optimize your Azure deployments…”, in this excellent article, where you can read all about it.

The Advisor will give you recommendations for 4 categories:

  • High Availability
  • Security
  • Performance
  • Cost

Let’s take a quick look at how you can implement those recommendations. Below you can see the main Advisor page of my Azure subscription, which you can access from the “Advisor” option in the bottom-left menu.


In my case, it’s showing recommendations for both High Availability and Security. If I click on the security recommendations, you’ll see that one of them concerns Azure Storage accounts.


Two of my three storage accounts have non-recommended security settings regarding secure transfer.


And finally I can see how exactly those settings should be adjusted.

Other very useful articles to learn more about the Azure Advisor:

There’s no additional cost to take advantage of the Advisor recommendations. So, as I said before, just get started!!

Azure Resource Manager PowerShell: How to change between subscriptions

Today’s post is a very simple one. For those of you who, like me, have multiple subscriptions in your Azure account and automate a lot of your Azure work via PowerShell, you might need to switch between subscriptions, in the same PowerShell session, to execute multiple tasks.

This can be done with one of the two following cmdlets: “Set-AzureRMContext” or “Select-AzureRMSubscription”.

And here is where the confusion comes. What’s the difference between the two cmdlets and which one should you use?

Well, the answer is that the cmdlets do the exact same thing, and you should use the “Set-AzureRMContext” cmdlet, especially if you put it into scripts, since it seems to be the replacement for the “Select-AzureRMSubscription” cmdlet.

In fact, this is what you get when you run “Get-Help Select-AzureRMSubscription”:


As you can see above all references point to the new cmdlet.

Now a quick note on how the cmdlet works.

To list all of your subscriptions:

Get-AzureRMSubscription

To change the context to a different subscription:

Set-AzureRMContext -subscription <SubscriptionID or SubscriptionName>
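For example, to switch to a different subscription by name and confirm where you landed (the subscription name below is made up):

Set-AzureRMContext -Subscription "Contoso-Dev"
Get-AzureRMContext

Running “Get-AzureRMContext” afterwards shows which subscription the session is now pointing at, which is a nice sanity check before executing anything against the wrong subscription.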

I hope the above is helpful. Happy scripting!

Azure: “CurrentStorageAccountName is not accessible” error when creating a VM via the SDK

I just recently faced an error when doing an Azure lab, that I thought I should blog about, since the resolution is very simple.

Here’s what happened: I was creating a VM via the SDK, using the service model (classic – see note below) and the following cmdlet:

New-AzureQuickVM -Windows -ServiceName "AV-AutoSVC" -Name "AV-AutoVM" -ImageName $image -Password $password -Location "East US" -InstanceSize "Basic_A0" -AdminUsername avargasadmin

Note: You can create virtual machines both with the service model (classic deployment) or with the Resource manager (new portal). Both methods are available via SDK but the way to connect is different. You use Add-AzureAccount to login to the service model and Add-AzureRMAccount to login to the resource manager. See differences here.

And I got the following error:


New-AzureQuickVM: CurrentStorageAccountName is not accessible. Ensure the current storage account is accessible and in the same location or affinity group as the cloud service.

Now, after digging a little bit more into my Azure tenant and what might be the cause of the problem, I confirmed that I did have the storage account, and even with the -Location parameter in the cmdlet above forcing it to be “East US” (where my storage account is), I was still getting the same error.

Then I decided to run the following cmdlet:



I then realized that I had no valid storage account associated with my subscription, and the solution to my problem was to run:

Set-AzureSubscription -SubscriptionName <YourSubscriptionName> -CurrentStorageAccount <YourStorageAccountName>

If you don’t know your storage account name run:

Get-AzureStorageAccount | fl StorageAccountName, Location

Once you do this, re-run your New-AzureQuickVM cmdlet (note: this error can also happen when you’re running the New-AzureVM cmdlet) and the error should be gone.