Apply file permissions in a SharePoint Online document library using PowerShell

Hi all, this is a follow-up to the post I published yesterday about applying permissions to folders in a SharePoint Online document library using PowerShell.

In this post we will look at how to apply those permissions to files instead of folders. We will also take a different approach in the code: the script I am sharing applies permissions to all files within the top-level folders of the SharePoint library.

So what PowerShell module should you use?

Let me start by saying that there are multiple ways to programmatically apply permissions to a SharePoint library. In this case I am using the SharePoint PnP PowerShell module.

In the link above you can learn a bit more about the SharePoint Patterns and Practices module, as well as follow the steps to install it. Be aware that the PnP commands use CSOM, so you might get throttled at some point if you execute too many.

Now let's look at the code in detail

I will try to break down the script so you understand everything it does and can adapt it to your needs.

Configuration hard-coded values

It's always best not to hard-code any values in your script, so you don't have to edit it each time you want to run it for a different scenario, but I wanted to keep this one simple, so here it goes:

#This value is for your SharePoint Online Team site URL. In my case my team name is "Test1". Change yours accordingly
$SiteURL = "https://yourtenant.sharepoint.com/sites/Test1"
#This is your list name. If you run a Get-PnPList you'll see that Documents is for the Shared Documents library. You will need this for the cmdlet that sets the permissions
$ListName = "Documents"
#This is the user account you want to give permissions to
$UserAccount = "user1@domain.com"
#Role that you want to add (see permissions section for more information)
$Role = "Contribute"
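
If you'd rather avoid hard-coding entirely, here is a minimal sketch of a parameterized version (the script name and defaults are hypothetical):

#Hedged sketch: accept the values as script parameters instead of hard-coding them.
param(
    [Parameter(Mandatory=$true)][string]$SiteURL,
    [string]$ListName = "Documents",
    [Parameter(Mandatory=$true)][string]$UserAccount,
    [string]$Role = "Contribute"
)

Save it as, for example, Set-FilePermissions.ps1 and run it with .\Set-FilePermissions.ps1 -SiteURL "https://yourtenant.sharepoint.com/sites/Test1" -UserAccount "user1@domain.com".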

Connect to the SharePoint PnP Online PowerShell

#Connect to PnP Online. You will get prompted for credentials.
Connect-PnPOnline -Url $SiteURL -Credentials (Get-Credential)

Grab all Folders

#I created a small try/catch to exit the script if we can't grab the folders
Try{
    $AllFolders = Get-PnPFolderItem -FolderSiteRelativeUrl "/Shared Documents" -ItemType Folder -ErrorAction Stop
}
Catch{
    Write-Host "Failed to list the folders" -ForegroundColor Red
    Exit
}

Create a loop to process each folder, grabbing all files and applying the permissions

#And finally the code for the loop to go folder by folder, grab the files and apply the permissions
Foreach ($Folder in $AllFolders){
    $FolderName = $Folder.Name
    $FolderRelativeURL = "/Shared Documents/" + $FolderName
    Try{
        $AllFiles = Get-PnPFolderItem -FolderSiteRelativeUrl $FolderRelativeURL -ItemType File -ErrorAction Stop
    }
    Catch{
        Write-Host "Failed to list the files for '$($FolderName)'" -ForegroundColor Red
        $AllFiles = @() #make sure we don't reuse the file list from the previous folder
    }
    if ($AllFiles.Count -ne 0){
        Foreach ($File in $AllFiles){
            try{
                Set-PnPListItemPermission -List $ListName -Identity $File.ListItemAllFields -User $UserAccount -AddRole $Role -ErrorAction Stop
                Write-Host "Folder $($FolderName): File $($File.Name) processed with success" -ForegroundColor Green
            }
            Catch{
                Write-Host "Folder $($FolderName): Failed to apply permissions to file $($File.Name). Error: $($_.Exception.Message)" -ForegroundColor Red
            }
        }
    }
    Else{
        Write-Host "'$($FolderName)' does not have any files" -ForegroundColor Yellow
    }
}

Now the entire script for you to copy

#Config Variables
$SiteURL = "https://yourtenant.sharepoint.com/sites/Test1"
$ListName = "Documents"
$UserAccount = "user1@yourtenant.onmicrosoft.com"
$Role = "Contribute"

#Connect to PnP Online
Connect-PnPOnline -Url $SiteURL -Credentials (Get-Credential)

Try{
    $AllFolders = Get-PnPFolderItem -FolderSiteRelativeUrl "/Shared Documents" -ItemType Folder -ErrorAction Stop
}
Catch{
    Write-Host "Failed to list the folders" -ForegroundColor Red
    Exit
}

Foreach ($Folder in $AllFolders){
    $FolderName = $Folder.Name
    $FolderRelativeURL = "/Shared Documents/" + $FolderName
    Try{
        $AllFiles = Get-PnPFolderItem -FolderSiteRelativeUrl $FolderRelativeURL -ItemType File -ErrorAction Stop
    }
    Catch{
        Write-Host "Failed to list the files for '$($FolderName)'" -ForegroundColor Red
        $AllFiles = @() #make sure we don't reuse the file list from the previous folder
    }
    if ($AllFiles.Count -ne 0){
        Foreach ($File in $AllFiles){
            try{
                Set-PnPListItemPermission -List $ListName -Identity $File.ListItemAllFields -User $UserAccount -AddRole $Role -ErrorAction Stop
                Write-Host "Folder $($FolderName): File $($File.Name) processed with success" -ForegroundColor Green
            }
            Catch{
                Write-Host "Folder $($FolderName): Failed to apply permissions to file $($File.Name). Error: $($_.Exception.Message)" -ForegroundColor Red
            }
        }
    }
    Else{
        Write-Host "'$($FolderName)' does not have any files" -ForegroundColor Yellow
    }
}

What if I want to target a different folder?

The code above applies permissions to the files in the top-level folders within the Shared Documents library of your Team site. If you want to target a different folder, edit the following lines in the code:

  • Line 4 – $FolderRelativeUrl: Add the folder structure here (i.e. "/sites/Test1/Shared Documents/FolderA/SubFolderB/")
  • Line 13 – FolderSiteRelativeURL parameter: Add the folder structure here as well (i.e. "/Shared Documents/FolderA/SubFolderB")

How about subfolders?

This script does not process files inside subfolders. I.e. if you have a top-level folder "FolderA", the script will add permissions to all files inside that folder, but it won't add them to files in the subfolder "FolderA\SubFolderA". It's much more complex to create an iteration that analyzes and processes the depth of the folder structure, and I wanted to keep this simple.

You can process subfolders separately by targeting them individually, following the steps in the section above.
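
That said, if you do need recursion, here is a minimal, hedged sketch of the idea, assuming the same variables as the script above and that Get-PnPFolderItem returns the folders one level down from the URL you pass it:

#Hedged sketch: recursively walk the folder tree and apply permissions to every file.
#Assumes $ListName, $UserAccount and $Role are already defined as above.
Function Set-FilePermissionsRecursive {
    param([string]$FolderSiteRelativeUrl)

    #Process the files at this level
    $Files = Get-PnPFolderItem -FolderSiteRelativeUrl $FolderSiteRelativeUrl -ItemType File
    Foreach ($File in $Files){
        Set-PnPListItemPermission -List $ListName -Identity $File.ListItemAllFields -User $UserAccount -AddRole $Role
    }

    #Recurse into each subfolder (you may need to skip system folders such as "Forms")
    $SubFolders = Get-PnPFolderItem -FolderSiteRelativeUrl $FolderSiteRelativeUrl -ItemType Folder
    Foreach ($SubFolder in $SubFolders){
        Set-FilePermissionsRecursive -FolderSiteRelativeUrl ($FolderSiteRelativeUrl + "/" + $SubFolder.Name)
    }
}

Set-FilePermissionsRecursive -FolderSiteRelativeUrl "/Shared Documents"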

How about permissions?

The example above applies the "Contribute" role to the files, for the user defined. If you want to know more details about which role to apply, please go to this excellent article to understand permission levels in SharePoint.

Final notes

If you haven't read my previous post on SharePoint permissions, do it to learn more about folder-level permissions and how to apply them via PowerShell. The two posts complement each other really well.

There are ways of making this script more complex and capable of doing more (like processing subfolders, processing multiple users, etc.), but just like in the other post, the code shared here gives you a good baseline and solid basic features.

I hope it’s useful!



Apply folder permissions in a SharePoint Online document library using PowerShell

Being a consultant with a primarily messaging background, it’s always interesting for me to blog about SharePoint and be out of my comfort zone.

What I am going to show you today is how to apply permissions to folders in a SharePoint Online document library, using PowerShell.

So what PowerShell module should you use?

Let me start by saying that there are multiple ways to programmatically apply permissions to a SharePoint library. In this case I am using the SharePoint PnP PowerShell module.

In the link above you can learn a bit more about the SharePoint Patterns and Practices module, as well as follow the steps to install it. Be aware that the PnP commands use CSOM, so you might get throttled at some point if you execute too many.

Now let's look at the code in detail

I will try to break down the script so you understand everything it does and can adapt it to your needs.

Configuration hard-coded values

It's always best not to hard-code any values in your script, so you don't have to edit it each time you want to run it for a different scenario, but I wanted to keep this one simple, so here it goes:

#This value is for your SharePoint Online Team site URL. In my case my team name is "Test1". Change yours accordingly
$SiteURL = "https://yourtenant.sharepoint.com/sites/Test1"
#This is your list name. If you run a Get-PnPList you'll see that Documents is for the Shared Documents library. You will need this for the cmdlet that sets the permissions
$ListName = "Documents"
#This is the user account you want to give permissions to
$UserAccount = "user1@domain.com"
#Role that you want to add (see permissions section for more information)
$Role = "Contribute"
#Relative URL of the parent folder for all folders you are applying permissions to (see the different folder section below for more information on how to change this to target another folder)
$FolderRelativeURL = "/sites/Test1/Shared Documents/"

Connect to the SharePoint PnP Online PowerShell

#Connect to PnP Online. You will get prompted for credentials.
Connect-PnPOnline -Url $SiteURL -Credentials (Get-Credential)

Grab all Folders to apply permissions to

#I created a small try/catch to exit the script if we can't grab the folders
Try{
    $AllFolders = Get-PnPFolderItem -FolderSiteRelativeUrl "/Shared Documents" -ItemType Folder -ErrorAction Stop
}
Catch{
    Write-Host "Failed to list the folders" -ForegroundColor Red
    Exit
}


Apply the permissions

#And finally the code for the loop to go folder by folder and apply the permissions
Foreach ($Folder in $AllFolders){
    $RelativeURL = $FolderRelativeURL + $Folder.Name
    Write-Host $RelativeURL
    $FolderItem = Get-PnPFolder -Url $RelativeURL
    Set-PnPListItemPermission -List $ListName -Identity $FolderItem.ListItemAllFields -User $UserAccount -AddRole $Role
}

Now the entire script for you to copy

#Config Variables
$SiteURL = "https://yourtenant.sharepoint.com/sites/Test1"
$ListName = "Documents"
$FolderRelativeURL = "/sites/Test1/Shared Documents/"
$UserAccount = "user1@yourtenant.onmicrosoft.com"
$Role = "Contribute"

#Connect to PnP Online
Connect-PnPOnline -Url $SiteURL -Credentials (Get-Credential)

Try{
    $AllFolders = Get-PnPFolderItem -FolderSiteRelativeUrl "/Shared Documents" -ItemType Folder -ErrorAction Stop
}
Catch{
    Write-Host "Failed to list the folders" -ForegroundColor Red
    Exit
}

Foreach ($Folder in $AllFolders){
    $RelativeURL = $FolderRelativeURL + $Folder.Name
    Write-Host $RelativeURL
    $FolderItem = Get-PnPFolder -Url $RelativeURL
    Set-PnPListItemPermission -List $ListName -Identity $FolderItem.ListItemAllFields -User $UserAccount -AddRole $Role
}

What if I want to target a different folder?

The code above applies permissions to the top-level folders within the Shared Documents library of your Team site. If you want to target a different folder, edit the following lines in the code:

  • Line 4 – $FolderRelativeUrl: Add the folder structure here (i.e. "/sites/Test1/Shared Documents/FolderA/SubFolderB/")
  • Line 13 – FolderSiteRelativeURL parameter: Add the folder structure here as well (i.e. "/Shared Documents/FolderA/SubFolderB")

How about permissions?

The example above applies the "Contribute" role to the folders, for the user defined. If you want to know more details about which role to apply, please go to this excellent article to understand permission levels in SharePoint.

Final notes

I hope this post is helpful. As I stated initially, there are easy ways to make the script more complex but easier to manage, such as removing the hard-coded values or creating a loop to add permissions for multiple users. Using the code above as a reference will surely save you some time or give you that quick win you need.

Microsoft Teams PowerShell – A simple use case to get you started

Not long ago I blogged about the new Microsoft Teams PowerShell module. Today I want to give you a quick example of how you can leverage it to automate and make your work more efficient: I'll show you how to list all Team Channels in your organization.

Connect to your Microsoft Teams Organization using PowerShell

The first thing you need to do is load the Teams PowerShell module and authenticate to your Office 365 tenant.

  • If you don't have the Microsoft Teams PowerShell module installed, click the link in this article and install it
  • Once it's installed, the Connect-MicrosoftTeams cmdlet becomes available. It's as easy as running it and using the authentication prompt to pass your credentials, but you can also pass basic credentials via the -Credential parameter, as shown in the example below the screenshot

[Screenshot: TeamsPS01]
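
If you want to skip the interactive prompt, here's a minimal hedged example (note that stored basic credentials only work for accounts without MFA):

#Hedged example: pass basic credentials instead of using the interactive prompt.
#Only works for accounts that don't require MFA.
$creds = Get-Credential
Connect-MicrosoftTeams -Credential $creds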

List all Teams in your Microsoft Teams organization

To list all Teams in your organization, you can use the Get-Team cmdlet. By default the cmdlet outputs the GroupId, DisplayName, Visibility, Archived, MailNickName and Description.

[Screenshot: TeamsPS02]

You can format the output to include any relevant Team attribute. Run "Get-Team | fl" to list them all.
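
For instance, a small hedged example that selects a few of the attributes listed above:

#Hedged example: pick specific Team attributes for a cleaner output.
Get-Team | Select-Object DisplayName, Visibility, Archived | Sort-Object DisplayName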

List all Team Channels in your organization

Now, finally, let's get to the use case of this post. To list all Team Channels in your organization, you can leverage the Get-TeamChannel cmdlet.

This cmdlet has a mandatory -GroupId parameter, which is basically the ID of each Team. That said, you have two options:

Option 1: you run “Get-TeamChannel -groupid <TeamGroupID>”

[Screenshot: TeamsPS03]

You can use the Get-Team cmdlet to get the GroupId value for each team.

Option 2: you grab all Teams into an array and process each Team to list its channels, using the code snippet below.

$AllTeams = Get-Team
Foreach ($team in $AllTeams) {Get-TeamChannel -GroupId $team.GroupId | Ft $team.DisplayName, DisplayName}

[Screenshot: TeamsPS04]

What I did above was change the output of the command, to list in a readable way which Team each Channel belongs to. There are more organized ways to format the output, both to the console and to an output file; the sketch below can guide you in that direction, and if you need any help let me know.
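
For example, here is a hedged sketch that collects Team/Channel pairs into objects and exports them to a CSV (the file name is hypothetical):

#Hedged sketch: build Team/Channel objects and export them to CSV.
$AllTeams = Get-Team
$Report = Foreach ($team in $AllTeams) {
    Get-TeamChannel -GroupId $team.GroupId | ForEach-Object {
        [PSCustomObject]@{
            Team    = $team.DisplayName
            Channel = $_.DisplayName
        }
    }
}
$Report | Export-Csv .\TeamChannels.csv -NoTypeInformation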

And that's it. I can and will blog much more about Teams PowerShell. If you haven't used it yet, you should.

Happy coding!


Manage your Exchange Online from the Azure Cloud Shell

The Microsoft Exchange product group recently announced that you can now manage your Exchange Online via the Azure Cloud Shell.

If you're not familiar with the Azure Cloud Shell, it's basically a browser-based shell experience in the cloud, fully maintained by Microsoft, that you can leverage to manage your Azure subscriptions and now also Exchange Online.

It's a step towards running automation from anywhere with minimal to no effort, which to me is one of the next big things coming for us as IT consultants.

I wrote a blog article recently on how to use a multi-factor authentication account to connect to Exchange Online, and what Microsoft just did was provide, by default, the Exchange Online Remote PowerShell module in the Azure Cloud Shell. Smart idea, and I bet an awesome quick win for them.

So are there any gotchas?

The quick answer is no major ones, but I still want to point out a few things. The first is that you need an Azure subscription; otherwise you'll see the message below.

[Screenshot: ExShell01]

Although many organizations embracing the Microsoft cloud are already using Office 365 AND Azure, some are not. Some just use Office 365, and it's good to point out that if you want to leverage this new feature, it's time to create that Azure subscription. The only cost of using the Azure Cloud Shell is the (mandatory) Azure Storage cost, which is almost insignificant.

Another smaller thing worth pointing out is MFA (Multi-Factor Authentication), as Microsoft expects you to have MFA enabled across all accounts. I guess that's directly related to the fact that the module you're leveraging is meant for login with MFA-enabled admin accounts.

Finally, Microsoft also points out that they will reclaim sessions that are inactive for more than 20 minutes. Keep that in mind when you build your automation, or just when you have your session open for daily work. This is expected behavior for this type of cloud- and container-based shell.

What else should you know?

I am not going to transcribe the article I pointed you to at the top of this post, but I just want to highlight the main takeaways:

  • You can access the Azure Cloud Shell in several different ways, but via the Azure portal (portal.azure.com) or directly via the Shell portal (shell.azure.com) are the two main ones
  • All Exchange Online cmdlets should be available
  • RBAC fidelity is intact
  • This does not mean there are plans to decommission the current Exchange PowerShell module (yet?) :)
  • You'll use the Connect-EXOPSSession cmdlet to start the session
  • Microsoft provides SSO (Single Sign-On), just so you don't have to log in twice (i.e. Azure portal and Exchange Online). Yay!!!

And that’s it, enjoy!!!


Use a Multi Factor Authentication enabled account to connect to Exchange Online PowerShell

Security is a big theme for any online application, and Multi-Factor Authentication is used more and more every day. Those security standards easily extend to PowerShell or any other admin session used to execute tasks in your tenant. It will be more and more common to see organizations that no longer allow their IT administrators to connect to Exchange Online using basic auth.

If you’re a heavy PowerShell user like me, this is for you. Microsoft has an excellent article on how to leverage an MFA account to connect to Exchange Online PowerShell.

You should read the article, as these things tend to get updated and that way you'll be sure to have the latest steps, but in essence what you need to do is:

  • Make sure you have all the prerequisites (i.e. .NET Framework), especially on older versions of Windows
  • Install the Exchange Online Remote PowerShell module (from the Exchange Online EAC)
  • Make sure that WinRM allows basic authentication

Finally you can run the following command to connect to Office 365 commercial:

Connect-EXOPSSession -UserPrincipalName chris@contoso.com

And if you are connecting to a different instance, such as GCC High or Office 365 Germany, you need to specify the ConnectionUri and AzureADAuthorizationEndPointUri parameters (see the official article for parameter configurations).

Connect-EXOPSSession -UserPrincipalName <UPN> [-ConnectionUri <ConnectionUri> -AzureADAuthorizationEndPointUri <AzureADUri>]

Here's what the PowerShell session looks like after you install the module.

[Screenshot: MFAPS01]

And the authentication process.

[Screenshot: MFAPS02]

And that’s it. Happy scripting!!!


The Microsoft Teams PowerShell module GA is here

Great news for those who, like me, are heavy users of PowerShell and its modules for the different Microsoft workloads: the Microsoft Teams PowerShell module is now generally available, after being in beta since last year.

There are several new functionalities, but the one I want to highlight is the new -TeamsEnvironmentName switch on the Connect-MicrosoftTeams cmdlet, which allows you to connect to environments other than Office 365 commercial, such as Government tenants.

If you haven't already, read Microsoft's Teams PowerShell Overview documentation.

I will be posting about Teams PowerShell scripting soon. Enjoy!

The value of the BitTitan SDK to your Enterprise migration project

For those of you with consulting experience, especially in Enterprise projects, currently working in the migration business as a consultant or in other roles, you'll know that there's a huge difference between small/medium-size projects and enterprise projects, especially in the way the project is executed.

It's usually very easy to use a tool's user interface to perform actions for 10, 20 or 50 users, but when you have tens or hundreds of thousands of users to migrate, that quickly becomes a challenge. In this post, we'll discuss some key areas where leveraging the BitTitan PowerShell module as a key element for task execution will bring huge value to your project.

Automate recurring tasks

In enterprise migration projects you'll have hourly, daily or weekly tasks to perform, such as:

  • Start/restart migrations
  • Create reports
  • Move users between migration stages
  • Retry failed migrations
  • Add new batches of users to the migration project
  • Schedule the Outlook profile reconfiguration for a batch of users

During your project, most if not all of the tasks described above will have to be executed multiple times a week or even multiple times a day, depending on the task.

So why should you automate those tasks? The answer depends on the task, but the reasons include saving man hours, making sure tasks get executed on time, linking task executions (i.e. moving a user to a different project and starting another migration pass), and so on. The sketch below illustrates the idea.
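
As a hedged illustration of one such recurring task (starting a new migration pass for every user in a project), the sketch below uses the same BitTitan SDK cmdlets that appear in the scripts later in this post; the $mwTicket and $connector objects are assumed to have been obtained as shown there:

#Hedged sketch: start a new migration pass for every mailbox in a project.
#Assumes $mwTicket and $connector were obtained as in the scripts further down.
$mailboxes = Get-MW_Mailbox -Ticket $mwTicket -ConnectorId $connector.Id -RetrieveAll
Foreach ($mailbox in $mailboxes){
    #-Type Full is an assumed pass type; adjust to your scenario
    Add-MW_MailboxMigration -Ticket $mwTicket -MailboxId $mailbox.Id -ConnectorId $connector.Id -Type Full -UserId $mwTicket.UserId
}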

Create your own reports or integrate with an external reporting system

The BitTitan user interface provides a whole variety of out-of-the-box reports, such as user migration statistics, user device details, etc. But in my experience there's always one report (or several) that you or your customer needs and that there's no easy way of extracting from any user interface. With PowerShell you can extract any information and combine it into custom reports, built to cover the exact needs you have.

As an example, if you want the total number of users across all projects with migrations completed, running, failed, completed with errors or queued, the best way to get that is via the BitTitan PowerShell. You can filter in the user interface, and you can also have the project statistics CSV sent to your email, but that is done per project and it doesn't scale, especially if you have a large number of projects.

Another example is to get item counts and sizes of successfully migrated items for all users in all projects. Again, in this case you can use the project statistics CSV sent via email, but that does not scale if you have 50 projects and need that report twice a day, does it?

You can also leverage the information provided by PowerShell to feed a reporting portal, from where you can monitor all major aspects of the migration (like the ones mentioned in the examples above), and that PowerShell can update at short intervals (i.e. every 5 minutes). That gives you the ability to synthesize the information you consider most relevant and make it available to the entire team executing the project, keeping it up to date without human interaction, instead of, for example, having to compile and send it via email multiple times a day. A hedged sketch of the first example follows.
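
#Hedged sketch: group every user's latest migration pass by status, across all projects.
#Reuses only cmdlets that appear in the scripts later in this post; assumes $mwTicket exists.
$connectors = Get-MW_MailboxConnector -Ticket $mwTicket -RetrieveAll
$latestPasses = Foreach ($connector in $connectors){
    Foreach ($mailbox in @(Get-MW_Mailbox -Ticket $mwTicket -ConnectorId $connector.Id -RetrieveAll)){
        #Latest pass for this user, same pattern as the retry-errors script below
        Get-MW_MailboxMigration -Ticket $mwTicket -MailboxId $mailbox.Id | Sort-Object -Property StartDate -Descending | Select-Object -First 1
    }
}
$latestPasses | Group-Object Status | Select-Object Name, Count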

Multi platform task execution

In the scope of an Enterprise migration project, the "BitTitan tasks" are just part of the equation. There's much more to do than just move the data and reconfigure the Outlook clients. That being said, it's very common for consultants to build and use complex scripting for Enterprise-level migrations, leveraging multiple SDKs and PowerShell modules. Some examples:

  • Leverage the MSOnline PowerShell module to create users and assign licenses in Office 365 and, after that task is verified as complete, leverage the BitTitan PowerShell to start the pre-stage migration for those users.
  • Stamp forwarding addresses, either via the Exchange Online or Exchange on-premises PowerShell modules, before triggering the full migration pass with the BitTitan PowerShell.
  • Verify that the BitTitan migration is completed, via the BitTitan PowerShell, and convert mailboxes to mail users or remote mailboxes once the completion status is marked as successful.

The three examples above highlight some of the most common tasks that require multi-platform execution, but it can get much more complex than that. You can automate task precedence and execute many more than just two tasks, leveraging all available PowerShell modules.

Trying to link those tasks via the user interfaces, with human interaction, is very challenging and time consuming. It requires access to multiple user interfaces, and during the execution you can't automate the task dependencies, which makes the process prone to error. The sketch below shows what the first bullet above can look like in code.
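
A hedged sketch of that first bullet (license the user in Office 365, then kick off the pre-stage pass). It assumes the MSOnline module is already connected, that the license SKU shown is a placeholder, that $mwTicket/$connector exist as in the scripts later in this post, and that limiting the pass with -ItemEndDate is your pre-stage convention:

#Hedged sketch: assign an Office 365 license, then start a pre-stage pass for that user.
$upn = "user1@yourtenant.onmicrosoft.com"   #hypothetical user
Set-MsolUserLicense -UserPrincipalName $upn -AddLicenses "yourtenant:ENTERPRISEPACK"   #placeholder SKU
$mailbox = Get-MW_Mailbox -Ticket $mwTicket -ConnectorId $connector.Id -ExportEmailAddress $upn
#-ItemEndDate restricting the pass to older items is an assumed pre-stage convention
Add-MW_MailboxMigration -Ticket $mwTicket -MailboxId $mailbox.Id -ConnectorId $connector.Id -Type Full -UserId $mwTicket.UserId -ItemEndDate (Get-Date).AddDays(-30)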

Optimize execution of complex or lengthy tasks

We talked about the benefits of automating recurring tasks and about multi-platform task execution, but it's also very relevant to mention task optimization for very complex or lengthy tasks. This doesn't need to be a recurring task, nor does it need to be cross-platform; all you need to consider when you plan to code a task in PowerShell is: "Can I remove a lot of complexity and execution time if I run this task via PowerShell?". If the answer is yes, then code it.

Removing the complexity will allow you to have literally anyone execute that task, and not just a senior resource in your project.

Removing execution time is self-explanatory: it saves you money and helps keep you within the estimated project timelines, since it's easier to predict a task's execution time when it's scripted than when it's executed by a human.

Now to put this into context, let me give you some examples of complex or lengthy BitTitan tasks:

  • Retry errors for all eligible users in a project with 10k users
  • Schedule DeploymentPro for users across 5 different vanity domains
  • Start a migration for a list of users provided via CSV
  • Create 25 MigrationWiz projects, one per user batch
  • Create 10 MigrationWiz endpoints for multiple admin accounts
  • Add 5k recipient mappings to a MigrationWiz project

To be honest, all of the tasks described above are more lengthy than complex, since in my opinion there's not a lot of complexity in the BitTitan tools; nevertheless, you might not want lower-level resources executing them.

On the other hand, all of the tasks above take a significant amount of time to complete and scripting them will save you a lot of precious hours.

Proactive monitoring and task execution

The BitTitan user interface already has some monitoring features, such as sending an email to the administrator when a migration fails, but with PowerShell you can take monitoring and proactive task execution to the next level.

Instead of just notifying the project executor that a migration failed, you can trigger another migration retry, and you can go to the detail of only doing it if the failure reason is not something like bad admin credentials. You can make it as clever as you want.

You can also choose not to check your email to see if migrations failed, and instead create a script that, every 30 minutes, checks for failed migrations, checks the reason for the failure and, if it's worth retrying, starts another migration pass.

The sentence above explains why I think proactive monitoring and proactive task execution are tied together. Let that sink in: imagine a project with 75 thousand users, with around 5 thousand concurrent migrations running at any given point in time, and think about how proactive monitoring and task execution can not only save you hundreds of work hours, but also keep your project timelines on schedule. The last thing you want, in a large and complex migration project, is to lose hours of migration time, where nothing is being migrated for anyone, because an error occurred and the resolution time is high. A minimal sketch of such a watchdog follows.
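
A minimal, hedged sketch of that 30-minute watchdog, reusing cmdlets from the scripts later in this post; the FailureMessage property and its filter are a hypothetical example of excluding non-retryable failures:

#Hedged sketch: every 30 minutes, retry failed migrations that look retryable.
#Assumes $mwTicket and $connector were obtained as in the scripts further down.
while ($true){
    Foreach ($mailbox in @(Get-MW_Mailbox -Ticket $mwTicket -ConnectorId $connector.Id -RetrieveAll)){
        $last = Get-MW_MailboxMigration -Ticket $mwTicket -MailboxId $mailbox.Id | Sort-Object -Property StartDate -Descending | Select-Object -First 1
        #Hypothetical filter: skip failures caused by bad admin credentials
        if ($last.Status -eq "Failed" -and $last.FailureMessage -notlike "*credentials*"){
            Add-MW_MailboxMigration -Ticket $mwTicket -MailboxId $mailbox.Id -ConnectorId $connector.Id -Type Full -UserId $mwTicket.UserId
        }
    }
    Start-Sleep -Seconds 1800
}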

Summary

So, based on everything you read above, how do you plan and execute an Enterprise migration project with automation?

Use a tool that you can automate with:

When you’re choosing a tool for your Enterprise project, you should heavily consider one that has a powerful PowerShell module, like the BitTitan tools do. You can check the BitTitan PowerShell documentation here.

Prepare and test all of your automation:

Make sure you have all your scripts ready and tested. If you need help coding with the BitTitan SDK, please reach out to me directly or to BitTitan support, who will forward you to the appropriate department to provide the help you need.

And that's it. I hope this post has been helpful, and please reach out if you have any questions!


BitTitan SDK: Color code your MigrationWiz project users

The BitTitan SDK is a key feature for all Enterprise migration projects. Some tasks in large migration projects are better off automated; that will save you hundreds of hours of repetitive work.

Just recently I was asked by a partner for a way to easily execute actions on batches of users within the same MigrationWiz project. Sometimes the best option is to divide those users into separate projects and execute the actions against all users in each project, but that's not always the case.

Now imagine this scenario: you have a project with 10,000 users and you need to start a migration for just 800 of them. What should you do? Color code those users, filter by that color and execute the action.

(More information about color coding here on the BitTitan help center)

So how can you categorize 800 users in a project with 10,000? Using the BitTitan SDK, of course.

The script below, which you can also find here, can be used to automatically color code your MigrationWiz users, based on a CSV file.

Below is a sample of the CSV file. It needs two columns:

  • Source Email – the MigrationWiz source email address
  • Flags – A number between 1 and 6. Each number maps to a different color.

[Screenshot: CSVCategories]
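
In case the screenshot doesn't load, a hypothetical CSV file would look like this:

Source Email,Flags
user1@domain.com,1
user2@domain.com,3
user3@domain.com,6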

The execution is as follows:

  1. Prompt to authenticate with BitTitan credentials
  2. Prompt you to select the BitTitan workgroup where the MigrationWiz project is
  3. Prompt you to select the BitTitan Customer where the MigrationWiz project is
  4. Prompt you to select the MigrationWiz project
  5. Enter the full path of the CSV file (i.e C:\scripts\MyUsers.csv)
<#
.DESCRIPTION
    This script will color code (categorize) the users in a MigrationWiz project, based on a CSV file.

.NOTES
    Author          Antonio Vargas
    Date            Jan/2019
    Disclaimer:     This script is provided 'AS IS'. No warranty is provided either expressed or implied.
    Version: 1.1
#>

### Function to create the working and log directories
Function Create-Working-Directory {
    [CmdletBinding()]
    param
    (
        [parameter(Mandatory=$true)] [string]$workingDir,
        [parameter(Mandatory=$true)] [string]$logDir
    )
    if ( !(Test-Path -Path $workingDir)) {
        try {
            $suppressOutput = New-Item -ItemType Directory -Path $workingDir -Force -ErrorAction Stop
            $msg = "SUCCESS: Folder '$($workingDir)' for CSV files has been created."
            Write-Host -ForegroundColor Green $msg
        }
        catch {
            $msg = "ERROR: Failed to create '$workingDir'. Script will abort."
            Write-Host -ForegroundColor Red $msg
            Exit
        }
    }
    if ( !(Test-Path -Path $logDir)) {
        try {
            $suppressOutput = New-Item -ItemType Directory -Path $logDir -Force -ErrorAction Stop
            $msg = "SUCCESS: Folder '$($logDir)' for log files has been created."
            Write-Host -ForegroundColor Green $msg
        }
        catch {
            $msg = "ERROR: Failed to create log directory '$($logDir)'. Script will abort."
            Write-Host -ForegroundColor Red $msg
            Exit
        }
    }
}

### Function to write information to the Log File
Function Log-Write
{
    param
    (
        [Parameter(Mandatory=$true)] [string]$Message
    )
    $lineItem = "[$(Get-Date -Format "dd-MMM-yyyy HH:mm:ss") | PID:$($pid) | $($env:username) ] " + $Message
    Add-Content -Path $logFile -Value $lineItem
}

### Function to display the workgroups created by the user
Function Select-MSPC_Workgroup {
    #######################################
    # Display all mailbox workgroups
    #######################################
    $workgroupPageSize = 100
    $workgroupOffSet = 0
    $workgroups = $null
    Write-Host
    Write-Host -Object "INFO: Retrieving MSPC workgroups ..."
    do
    {
        $workgroupsPage = @(Get-BT_Workgroup -PageOffset $workgroupOffSet -PageSize $workgroupPageSize)
        if($workgroupsPage) {
            $workgroups += @($workgroupsPage)
            foreach($Workgroup in $workgroupsPage) {
                Write-Progress -Activity ("Retrieving workgroups (" + $workgroups.Length + ")") -Status $Workgroup.Id
            }
            $workgroupOffSet += $workgroupPageSize
        }
    } while($workgroupsPage)
    if($workgroups -ne $null -and $workgroups.Length -ge 1) {
        Write-Host -ForegroundColor Green -Object ("SUCCESS: " + $workgroups.Length.ToString() + " Workgroup(s) found.")
    }
    else {
        Write-Host -ForegroundColor Red -Object "INFO: No workgroups found."
        Exit
    }
    #######################################
    # Prompt for the mailbox Workgroup
    #######################################
    if($workgroups -ne $null)
    {
        Write-Host -ForegroundColor Yellow -Object "ACTION: Select a Workgroup:"
        Write-Host -ForegroundColor Gray -Object "INFO: your default workgroup has no name, only Id."
        for ($i = 0; $i -lt $workgroups.Length; $i++)
        {
            $Workgroup = $workgroups[$i]
            if($Workgroup.Name -eq $null) {
                Write-Host -Object $i,"-",$Workgroup.Id
            }
            else {
                Write-Host -Object $i,"-",$Workgroup.Name
            }
        }
        Write-Host -Object "x - Exit"
        Write-Host
        do
        {
            if($workgroups.count -eq 1) {
                $result = Read-Host -Prompt ("Select 0 or x")
            }
            else {
                $result = Read-Host -Prompt ("Select 0-" + ($workgroups.Length - 1) + ", or x")
            }
            if($result -eq "x")
            {
                Exit
            }
            if(($result -match "^\d+$") -and ([int]$result -ge 0) -and ([int]$result -lt $workgroups.Length))
            {
                $Workgroup = $workgroups[$result]
                Return $Workgroup.Id
            }
        }
        while($true)
    }
}

### Function to display all customers
Function Select-MSPC_Customer {
    param
    (
        [parameter(Mandatory=$true)] [String]$WorkgroupId
    )
    #######################################
    # Display all mailbox customers
    #######################################
    $customerPageSize = 100
    $customerOffSet = 0
    $customers = $null
    Write-Host
    Write-Host -Object "INFO: Retrieving MSPC customers ..."
    do
    {
        $customersPage = @(Get-BT_Customer -WorkgroupId $WorkgroupId -IsDeleted $false -IsArchived $false -PageOffset $customerOffSet -PageSize $customerPageSize)
        if($customersPage) {
            $customers += @($customersPage)
            foreach($customer in $customersPage) {
                Write-Progress -Activity ("Retrieving customers (" + $customers.Length + ")") -Status $customer.CompanyName
            }
            $customerOffSet += $customerPageSize
        }
    } while($customersPage)
    if($customers -ne $null -and $customers.Length -ge 1) {
        Write-Host -ForegroundColor Green -Object ("SUCCESS: " + $customers.Length.ToString() + " customer(s) found.")
    }
    else {
        Write-Host -ForegroundColor Red -Object "INFO: No customers found."
        Exit
    }
    #######################################
    # Prompt for the mailbox customer
    #######################################
    if($customers -ne $null)
    {
        Write-Host -ForegroundColor Yellow -Object "ACTION: Select a customer:"
        for ($i = 0; $i -lt $customers.Length; $i++)
        {
            $customer = $customers[$i]
            Write-Host -Object $i,"-",$customer.CompanyName
        }
        Write-Host -Object "x - Exit"
        Write-Host
        do
        {
            if($customers.count -eq 1) {
                $result = Read-Host -Prompt ("Select 0 or x")
            }
            else {
                $result = Read-Host -Prompt ("Select 0-" + ($customers.Length - 1) + ", or x")
            }
            if($result -eq "x")
            {
                Exit
            }
            if(($result -match "^\d+$") -and ([int]$result -ge 0) -and ([int]$result -lt $customers.Length))
            {
                $customer = $customers[$result]
                Return $customer.OrganizationId
            }
        }
        while($true)
    }
}

### Function to display all mailbox connectors
Function Select-MW_Connector {
    param
    (
        [parameter(Mandatory=$true)] [guid]$customerId
    )
    #######################################
    # Display all mailbox connectors
    #######################################
    $connectorPageSize = 100
    $connectorOffSet = 0
    $connectors = $null
    Write-Host
    Write-Host -Object "INFO: Retrieving mailbox connectors ..."
    do
    {
        $connectorsPage = @(Get-MW_MailboxConnector -Ticket $global:mwTicket -OrganizationId $customerId -PageOffset $connectorOffSet -PageSize $connectorPageSize)
        if($connectorsPage) {
            $connectors += @($connectorsPage)
            foreach($connector in $connectorsPage) {
                Write-Progress -Activity ("Retrieving connectors (" + $connectors.Length + ")") -Status $connector.Name
            }
            $connectorOffSet += $connectorPageSize
        }
    } while($connectorsPage)
    if($connectors -ne $null -and $connectors.Length -ge 1) {
        Write-Host -ForegroundColor Green -Object ("SUCCESS: " + $connectors.Length.ToString() + " mailbox connector(s) found.")
    }
    else {
        Write-Host -ForegroundColor Red -Object "INFO: No mailbox connectors found."
        Exit
    }
    #######################################
    # Prompt for the mailbox connector
    #######################################
    if($connectors -ne $null)
    {
        for ($i = 0; $i -lt $connectors.Length; $i++)
        {
            $connector = $connectors[$i]
            Write-Host -Object $i,"-",$connector.Name
        }
        Write-Host -Object "x - Exit"
        Write-Host
        Write-Host -ForegroundColor Yellow -Object "ACTION: Select the source mailbox connector:"
        do
        {
            $result = Read-Host -Prompt ("Select 0-" + ($connectors.Length - 1) + " or x")
            if($result -eq "x")
            {
                Exit
            }
            if(($result -match "^\d+$") -and ([int]$result -ge 0) -and ([int]$result -lt $connectors.Length))
            {
                $global:connector = $connectors[$result]
                Break
            }
        }
        while($true)
    }
}

### Function to apply the categories from the CSV file
Function Add-MW_Category {
    param
    (
        [parameter(Mandatory=$true)] [Object]$Connector
    )
    # apply categories to the migration items in a MigrationWiz project
    $count = 0
    Write-Host
    Write-Host -Object ("Applying categories to migration item(s) in the MigrationWiz project " + $connector.Name)
    $importFilename = (Read-Host -Prompt "Enter the full path to CSV import file")
    # read csv file
    $users = Import-Csv -Path $importFilename
    foreach($user in $users)
    {
        $sourceEmail = $user.'Source Email'
        $flags = $user.'Flags'
        if($sourceEmail -ne $null -and $sourceEmail -ne "" -and $flags -in 1..6)
        {
            $count++
            Write-Progress -Activity ("Applying category to migration item (" + $count + ")") -Status $sourceEmail
            $mbx = Get-MW_Mailbox -Ticket $mwTicket -ExportEmailAddress $sourceEmail
            if ($mbx)
            {
                $Category = ";tag-" + $flags + ";"
                $result = Set-MW_Mailbox -Ticket $mwTicket -ConnectorId $connector.Id -mailbox $mbx -Categories $Category
            }
            else
            {
                Write-Host "Cannot find MigrationWiz line item with source address: '$($sourceEmail)'" -ForegroundColor Yellow
            }
        }
        else {
            Write-Host "The line item with the address '$($sourceEmail)' and the flag '$($flags)' is not valid." -ForegroundColor Yellow
        }
    }
    if($count -eq 1)
    {
        Write-Host -Object "1 mailbox has been categorized in",$connector.Name -ForegroundColor Green
    }
    if($count -ge 2)
    {
        Write-Host -Object $count,"mailboxes have been categorized in",$connector.Name -ForegroundColor Green
    }
}

#######################################################################################################################
# MAIN PROGRAM
#######################################################################################################################

#Working Directory
$workingDir = "C:\scripts"
#Logs directory
$logDirName = "LOGS"
$logDir = "$workingDir\$logDirName"
#Log file
$logFileName = "$(Get-Date -Format yyyyMMdd)_Move-MW_Mailboxes.log"
$logFile = "$logDir\$logFileName"

Create-Working-Directory -workingDir $workingDir -logDir $logDir

$msg = "++++++++++++++++++++++++++++++++++++++++ SCRIPT STARTED ++++++++++++++++++++++++++++++++++++++++"
Log-Write -Message $msg

# Authenticate
$creds = Get-Credential -Message "Enter BitTitan credentials"
try {
    # Get a ticket and set it as default
    $ticket = Get-BT_Ticket -Credentials $creds -ServiceType BitTitan -SetDefault
    # Get a MW ticket
    $global:mwTicket = Get-MW_Ticket -Credentials $creds
} catch {
    $msg = "ERROR: Failed to create ticket."
    Write-Host -ForegroundColor Red $msg
    Log-Write -Message $msg
    Write-Host -ForegroundColor Red $_.Exception.Message
    Log-Write -Message $_.Exception.Message
    Exit
}

#Select workgroup
$WorkgroupId = Select-MSPC_WorkGroup
#Select customer
$customerId = Select-MSPC_Customer -WorkgroupId $WorkgroupId
#Select connector
Select-MW_Connector -customerId $customerId

$result = Add-MW_Category -Connector $connector

$msg = "++++++++++++++++++++++++++++++++++++++++ SCRIPT FINISHED ++++++++++++++++++++++++++++++++++++++++`n"
Log-Write -Message $msg

##END SCRIPT
This is the link for my GitHub Gist, where all comments regarding the code are welcome. Use the comment section here as well if you want something changed.
You can find all my BitTitan SDK scripts in my GitHub repository.

BitTitan SDK: Retry individual errors for all users in your MigrationWiz document migration project

The BitTitan SDK is a key feature for all Enterprise migration projects. Some tasks in large migration projects are better off automated; that will save you hundreds of hours of repetitive work.

The script below, which you can also find here, can be used to automatically retry errors for all users in your MigrationWiz project.

For a user's errors to be retried, the user needs to:

  • Be in a "Completed" state
  • Have at least one item error

The execution is as follows:

  1. Prompt to authenticate with BitTitan credentials
  2. Prompt you to select your MigrationWiz document project
  3. Identifies the number of users eligible for a retry errors pass
  4. Exports to a CSV, created in the same folder from where the script was executed, a list of all successfully initiated retry errors passes
<#
.DESCRIPTION
    This script needs to be run on the BitTitan Command Shell

.NOTES
    Version         1.0
    Author          Antonio Vargas
    Date            Feb/13/2019
    Disclaimer:     This script is provided 'AS IS'. No warranty is provided either expressed or implied.
    Change Log
#>

######################################################################################################################################################
# Main Program
######################################################################################################################################################

$connectors = $null
#Working Directory
$global:workingDir = [environment]::getfolderpath("desktop")

#######################################
# Authenticate to MigrationWiz
#######################################
$creds = $host.ui.PromptForCredential("BitTitan Credentials", "Enter your BitTitan user name and password", "", "")
try {
    $mwTicket = Get-MW_Ticket -Credentials $creds
} catch {
    Write-Host "Error: Cannot create MigrationWiz Ticket. Error details: $($Error[0].Exception.Message)" -ForegroundColor Red
}

#######################################
# Display all document connectors
#######################################
Write-Host
Write-Host -Object "Retrieving Document connectors ..."
Try{
    $connectors = Get-MW_MailboxConnector -Ticket $mwTicket -RetrieveAll -ProjectType Storage -ErrorAction Stop
}
Catch{
    Write-Host -ForegroundColor Red -Object "ERROR: Cannot retrieve document projects."
    Exit
}
if($connectors -ne $null -and $connectors.Length -ge 1) {
    Write-Host -ForegroundColor Green -Object ("SUCCESS: " + $connectors.Length.ToString() + " document project(s) found.")
}
else {
    Write-Host -ForegroundColor Red -Object "ERROR: No document projects found."
    Exit
}

#######################################
# Prompt for the document connector
#######################################
if($connectors -ne $null)
{
    Write-Host -ForegroundColor Yellow -Object "Select a document project:"
    for ($i = 0; $i -lt $connectors.Length; $i++)
    {
        $connector = $connectors[$i]
        Write-Host -Object $i,"-",$connector.Name,"-",$connector.ProjectType
    }
    Write-Host -Object "x - Exit"
    Write-Host
    do
    {
        $result = Read-Host -Prompt ("Select 0-" + ($connectors.Length - 1) + " or x")
        if($result -eq "x")
        {
            Exit
        }
        if(($result -match "^\d+$") -and ([int]$result -ge 0) -and ([int]$result -lt $connectors.Length))
        {
            $connector = $connectors[$result]
            Break
        }
    }
    while($true)

    #######################################
    # Get mailboxes
    #######################################
    $mailboxes = $null
    $MailboxesWithErrors = @()
    $MailboxErrorCount = 0
    $ExportMailboxList = @()
    Write-Host
    Write-Host -Object ("Retrieving mailboxes for '$($connector.Name)':")
    Try{
        $mailboxes = @(Get-MW_Mailbox -Ticket $mwTicket -ConnectorId $connector.Id -RetrieveAll -ErrorAction Stop)
    }
    Catch{
        Write-Host -ForegroundColor Red "ERROR: Failed to query users in project '$($connector.Name)'"
        Exit
    }
    Foreach ($mailbox in $mailboxes){
        $LastMigration = Get-MW_MailboxMigration -Ticket $mwTicket -MailboxID $mailbox.id | ? {$_.Type -ne "Verification"} | Sort-Object -Property Startdate -Descending | Select-Object -First 1
        if ($LastMigration.Status -eq "Completed"){
            $MailboxErrors = $null #make sure we don't reuse errors from the previous mailbox
            try{
                $MailboxErrors = Get-MW_MailboxError -Ticket $mwTicket -MailboxId $mailbox.id -Severity Error -ErrorAction Stop
            }
            Catch{
                Write-Host -ForegroundColor Yellow "WARNING: Cannot find errors for mailbox '$($mailbox.ExportEmailAddress)'"
            }
            if (-not ([string]::IsNullOrEmpty($MailboxErrors))){
                $MailboxesWithErrors += $mailbox
                $MailboxErrorCount = $MailboxErrorCount + $MailboxErrors.count
            }
        }
    }
    if($MailboxesWithErrors -ne $null -and $MailboxesWithErrors.Length -ge 1)
    {
        Write-Host -ForegroundColor Green -Object ("SUCCESS: " + $MailboxesWithErrors.Length.ToString() + " mailbox(es) eligible to retry errors found")
        Write-Host -ForegroundColor Green -Object ("SUCCESS: '$($MailboxErrorCount)' individual errors found that will be retried")
        $RetryMigrationsSuccess = 0
        Foreach ($mailboxwitherrors in $MailboxesWithErrors){
            try{
                $RecountErrors = Get-MW_MailboxError -Ticket $mwTicket -MailboxId $mailboxwitherrors.id -Severity Error -ErrorAction Stop
                $result = Add-MW_MailboxMigration -Ticket $mwTicket -MailboxId $mailboxwitherrors.id -Type Repair -ConnectorId $connector.id -UserId $mwTicket.userid -ErrorAction Stop
                Write-Host -ForegroundColor Green "INFO: Processing $($mailboxwitherrors.ExportEmailAddress) with $($RecountErrors.count) errors"
                $ErrorLine = New-Object PSCustomObject
                $ErrorLine | Add-Member -Type NoteProperty -Name MailboxID -Value $mailboxwitherrors.id
                $ErrorLine | Add-Member -Type NoteProperty -Name "Source Address" -Value $mailboxwitherrors.ExportEmailAddress
                $ErrorLine | Add-Member -Type NoteProperty -Name "Destination Address" -Value $mailboxwitherrors.ImportEmailAddress
                $ErrorLine | Add-Member -Type NoteProperty -Name "Error Count" -Value $RecountErrors.count
                $ExportMailboxList += $ErrorLine
                $RetryMigrationsSuccess = $RetryMigrationsSuccess + 1
            }
            Catch{
                Write-Host -ForegroundColor Red "ERROR: Failed to process $($mailboxwitherrors.ExportEmailAddress). Error details: $($Error[0].Exception.Message)"
            }
        }
        if ($RetryMigrationsSuccess -ge 1){
            Write-Host -ForegroundColor Yellow "INFO: $($RetryMigrationsSuccess) retry migrations executed. Exporting List to CSV."
            $ExportMailboxList | Export-CSV .\List-UsersWithErrors.csv -NoTypeInformation
        }
        Else{
            Write-Host -ForegroundColor Yellow "INFO: No retry migration passes were executed with success."
        }
    }
    else
    {
        Write-Host -ForegroundColor Yellow "INFO: no users in project '$($connector.Name)' qualify for a retry errors pass. Make sure the users are in a completed state and have individual item errors logged."
        Exit
    }
}
This is the link for my GitHub Gist, where all comments regarding the code are welcome. Use the comment section here as well if you want something changed.
You can find all my BitTitan SDK scripts in my GitHub repository.

Azure Tip: Use PowerShell to check all blob spaced used in a Storage Account

Just recently, I had the need to be able to know the exact volume of all blob container data, within a specific Azure Storage Account.

This was part of a migration project, which in this case meant that I needed to report that data volume multiple times per day. Data was constantly being copied to and deleted from that Storage Account, and the same applies to blob containers being created, filled with data and deleted afterwards. So my only constant was the Storage Account, and I needed to know, every 2 hours, the volume of blob container data in that account.

After a bit of quick research, I found this outstanding Microsoft article on how to leverage the Azure PowerShell module (yes, PowerShell to save the day again!!) to calculate the size of a Blob Storage container.

The only limitation with the script in the article above was that it calculates the size of a single blob container, and I needed the combined size of all blob containers in my Storage Account.

So I had to adapt that script to my scenario, and I turned it into the following script:

# Connect to Azure
Connect-AzureRmAccount

# Static Values for Resource Group and Storage Account Names
$resourceGroup = "ChangeToYourResourceGroupName"
$storageAccountName = "changetoyourstorageaccountname"

# Get a reference to the storage account and the context
$storageAccount = Get-AzureRmStorageAccount `
-ResourceGroupName $resourceGroup `
-Name $storageAccountName
$ctx = $storageAccount.Context

# Get All Blob Containers
$AllContainers = Get-AzureStorageContainer -Context $ctx
$AllContainersCount = $AllContainers.Count
Write-Host "We found '$($AllContainersCount)' containers. Processing size for each one"

# Zero counters
$TotalLength = 0
$TotalContainers = 0

# Loop to go over each container and calculate size
Foreach ($Container in $AllContainers){
$TotalContainers = $TotalContainers + 1
Write-Host "Processing Container '$($TotalContainers)'/'$($AllContainersCount)'"
$listOfBLobs = Get-AzureStorageBlob -Container $Container.Name -Context $ctx

# zero out our total
$length = 0

# this loops through the list of blobs and retrieves the length for each blob and adds it to the total
$listOfBlobs | ForEach-Object {$length = $length + $_.Length}
$TotalLength = $TotalLength + $length
}
# end container loop

#Convert length to GB
$TotalLengthGB = $TotalLength /1024 /1024 /1024

# Result output
Write-Host "Total Length = " $TotalLengthGB "GB"


The script above will output to the console the total volume, in GB, that you have in the specified storage account.

To execute the script, follow the steps below:

  • Copy the entire code above to a notepad
  • Change the values of the $resourceGroup and $storageAccountName variables to the correct names of your Azure resource group and your Azure storage account
  • Save the file as .ps1
  • Open a PowerShell window and execute the "script.ps1" file you just saved (see screenshot below)
  • Authenticate with your Azure username and password, when prompted

[Screenshot: ScriptAllBlobs1]

Execute the script as shown above.

[Screenshot: AzureAuth]

When prompted, authenticate.

[Screenshot: endresult]

And this is what the end result should look like.

Before I end this blog post, I'd just like to point out that this script was written in a very simplistic way, to address an urgent need I had. With a couple more hours of work, you could make this script even easier to use and add all sorts of features to it, such as:

  • Error handling
  • Remove the hard-coded values and list all available storage accounts and resource groups for selection
  • Change the output format (i.e. to CSV) and list sizes per blob container
  • Allow you to select between multiple Azure subscriptions under the same account

The above are just some ideas on how to improve the script; the sketch below shows one of them. I haven't done the rest because I had no need for it, but by all means let me know if you want/need an improved version. This one works just fine if all you want is the total volume of blob data in a specific storage account.
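
As an example of the third bullet, here's a hedged sketch of a per-container CSV report, reusing the $ctx context and the same cmdlets from the script above (the output file name is hypothetical):

# Hedged sketch: export the size of each blob container to a CSV file.
$report = Foreach ($Container in (Get-AzureStorageContainer -Context $ctx)){
    $length = 0
    Get-AzureStorageBlob -Container $Container.Name -Context $ctx | ForEach-Object {$length = $length + $_.Length}
    [PSCustomObject]@{
        Container = $Container.Name
        SizeGB    = [math]::Round($length / 1GB, 2)
    }
}
$report | Export-Csv .\BlobContainerSizes.csv -NoTypeInformation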

Happy New Year!!!