AppVolumes 4.x Application Replication

A few years ago I ran into the problem of replicating applications from AppVolumes at one site to another. I spent some time figuring out the 2.x API to make this work, but had to rewrite the configuration to support 4.x replication. As most of you are aware, there is no published API document for AppVolumes. A bunch of people told me, "You can't do this," and that more or less drove me to get it done. So thank you for pushing me!

Problem: You have multiple sites using 4.x applications via VMware AppVolumes and you want to replicate the Applications, Packages, Entitlements, Lifecycle, and Stage info from one site to another, or from one site to many. We were looking to use a hub and spoke method, with a central AppVolumes environment as the single source of truth that replicates its data to the other sites. The idea was that I did not want to package applications at each site, and did not want to manually copy data from one to the other.

A hub and spoke replication looks something like below:

Requirements: For this script to work you must have a Pure Storage array at both the source and target site, with async replication configured and a "Replication" LUN replicating from one site to the other via scheduled snapshots. Something similar to:

So, did you ever want to replicate AppVolumes 4.x Applications, Packages, Assignments, and lifecycle status from one site to another with the help of Pure Storage?

Solution: I have created a script that runs through the steps below, letting you start building your hub and spoke configuration. By adding a few ForEach loops you can make it replicate from one source site to many, so you only have to make changes at the source site.


  • Connect to Target vCenter
  • Connect to target Pure storage array
  • Copy Replicated Snapshot to Replication LUN
  • Scan Storage
  • Mount Replication LUN
  • Resignature LUN
  • Scan Storage
  • Disconnect from vCenter
  • Connect to Source AppVolumes Server via API
  • Connect to Target AppVolumes Server via API
  • Force Rescan of AppVolumes Storage LUNs to discover Replication LUN on Target AppVolumes Server.
  • Mark Replication LUN as Unmountable on Target AppVolumes Server.
  • Find Storage Group for Replication LUN on Target AppVolumes Server.
  • Force Rescan of Storage Group on Target AppVolumes Server.
  • Force Replication of Storage Group on Target AppVolumes Server.
  • Force Import of Packages on Target AppVolumes Server.
  • Collect Source and Target Assignments
  • Collect Source and Target Packages
  • Collect Source and Target Products
  • Collect Source and Target Lifecycle Status
  • Unassign all Target Assignments not set in Source
  • Assign Source Assignments to Target, including Lifecycle status.
  • Remove the replication LUN from Target vCenter
  • Disconnect from vCenter
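To go one-to-many, the whole step list above just gets wrapped in a loop over your target sites. A rough sketch (Invoke-SiteReplication is a placeholder for the body of the script, not a real cmdlet, and the site names are made up):

```powershell
# Hypothetical wrapper: run the replication steps against many spoke sites
$SourceAppVolServer = "appvol.source.corp.local"
$TargetSites = @(
    @{ vCenter = "vcsa.site1.corp.local"; Pure = "pure.site1.corp.local"; AppVol = "appvol.site1.corp.local" },
    @{ vCenter = "vcsa.site2.corp.local"; Pure = "pure.site2.corp.local"; AppVol = "appvol.site2.corp.local" }
)

foreach ($Site in $TargetSites) {
    # Each iteration runs the full step list above against one spoke site
    Invoke-SiteReplication -SourceAppVol $SourceAppVolServer `
        -TargetvCenter $Site.vCenter `
        -TargetPure $Site.Pure `
        -TargetAppVol $Site.AppVol
}
```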

This script is located on my GitHub Repo:

I have been using this in a 2.x fashion for a few years now, and have updated and improved it to publish this 4.x replication script. The core of the script is the same as the 2.x process; it's just a change in API calls for AppVolumes.

Posted in API, App Volumes, CLI and Powershell, DevOps, Horizon, Random Crap, vCenter, VDI

AppVolumes 4.x API with Google Chrome and Powershell

Understanding the VMware AppVolumes 4.x API using Google Chrome or Microsoft Edge (Credge) Developer Tools, and using PowerShell to run the API calls.

Over the last few years I have spent a ton of time with the VMware AppVolumes APIs in the 2.x and 4.x builds. An amazing article was written by Chris Halstead (follow him on Twitter at @chrisdhalstead) way back in 2015, and that is where I started working with the APIs. By no means do I think I'm some Jedi API person. It took me many FAILURES to get an understanding of things, and I still fail a ton working through this. I fail more than I succeed, but when it works, I don't touch it! Keep in mind, pretty much all I have done is come up with a problem (AKA: needing to automate AppVolumes stuff) and figure it out. I am the type of person that learns a ton more if you point me in the right direction, give me some examples, and let me go. As I said, I failed a ton, but I stuck with it and managed to make some amazing things work. Along the way I did have to reach out to Chris Halstead and a few others for help, and he was able to point me in the right direction, so many thanks for the help, Chris. Here is my attempt at translating my learning into a blog post.

Now off to the VMware AppVolumes 4.x APIs. As I said, Chris did an amazing job of covering 2.x, but 4.x is a bit of a hidden black bag of tricks you have to dig through. So here are the results of many hours of trial and error, a bunch of failures, and eventual success. Most of the credit flows from Chris Halstead's hard work; I am taking his layout and adjusting it with PowerShell commands and the Chrome / Edge / Credge interface.

Assumptions: You have an understanding of what APIs are, specifically REST (you can find more info here). You also understand what Chrome or Credge (Microsoft Edge) is and how to use it; if not, this blog post will not help you much. You have an understanding of, and access to, a VMware AppVolumes environment. And you understand how to use PowerShell.

For me, Google Chrome is the easiest thing to work with. Chris Halstead used Postman; maybe I just don't understand it well enough to feel comfortable with it, but I prefer Chrome. Either works as long as you get the results you want. Chrome lets me watch the process as I do it in the web GUI, decode it, and then work backwards to figure out how to do it in PowerShell or any REST call.

Let's get started! Open Chrome and browse to the AppVolumes server, then click on the three dots in the upper right, go to More Tools, and then Developer Tools. Or press Ctrl+Shift+I.

Once you have Developer Tools open, find the Network tab. This is where I spend a ton of my time. It shows all the web calls of the page and lets you decode how things are happening and learn the process. It should look like below:

The red button on the left is the Record button, and the one next to it is the Clear button. They will become your best friends. The other important control is the Preserve Log check box. I like to have that on so I can see the whole chain; when things switch from frame to frame or page to page, the log clears and you have a tendency to miss the thing you wanted to see.

Now, with Developer Tools open and Preserve Log enabled, type your creds on the AppVolumes logon screen and hit Log On. You will see all kinds of things. This does not really help you with the logon process itself, but it shows you some of what happens.

A few things we should cover are the REST API methods, or verbs. There are 5 verbs, but I have found we only use 3 of them in AppVolumes. Here are all 5:

POST = Create

GET = Read Data

PUT = Update or replace

PATCH = Partial Update or Modify Data

DELETE = Voodoo / Delete (Don't use unless you are sure!)

More info on REST methods can be found here.

Launching Session With PowerShell

Going back to 2015, Chris Halstead posted that in order to start a session with the AppVolumes Manager via the API you need to do a few things:

In PowerShell you would use something like this:

Invoke-RestMethod -SessionVariable AppVolSession -Method Post -Uri "https://(AppVolumesServerFQDN)/cv_api/sessions" -Body $AppVolRestCreds

Let's break this command up a bit. Invoke-RestMethod is a built-in function of PowerShell introduced in 3.0. It allows you to run REST calls directly from PowerShell. More info here!

-SessionVariable = This is the parameter we use to save the session cookie. Notice we are not passing an existing variable; the command creates a variable called "AppVolSession" for use going forward. In future commands you pass that $AppVolSession variable to the -WebSession parameter.

As we dig into the network viewer in Chrome you will see things displayed a bit differently in the URL, as it won't include cv_api. That is because by default the URL for sessions is https://(AppVolumesServerFQDN)/sessions, but that will do you no good for doing anything useful with the data, as it is being returned as HTML.

But if you add "cv_api" to it, the output changes from HTML to JSON/text, allowing you to read and parse the data for other automations. So the same URL with "cv_api" in it, https://(AppVolumesServerFQDN)/cv_api/sessions, gets you JSON data.

URL = https://(AppVolumesServerFQDN)/cv_api/sessions. Enter your AppVolumes server FQDN and let's see how well it goes.

-Body = Oh wait, I forgot to tell you about the creds. I found this out the hard way after some trial and error: you must use a clear-text username and password (if someone knows another way, please let me know), so this is what I typically do.

# Get Credentials
$Credentials = Get-Credential
$RESTAPIUser = $Credentials.UserName
$RESTAPIPassword = $Credentials.GetNetworkCredential().password
$AppVolRestCreds = @{
    username = $RESTAPIUser
    password = $RESTAPIPassword
}
This will form the credentials portion of the body correctly and allow you to authenticate the session.
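Putting the two pieces together, an end-to-end logon looks something like this (a sketch; swap in your own Manager FQDN):

```powershell
# Build the credentials body
$Credentials = Get-Credential
$AppVolRestCreds = @{
    username = $Credentials.UserName
    password = $Credentials.GetNetworkCredential().password
}

# Authenticate and save the session cookie into $AppVolSession
$AppVolServer = "appvolumes.corp.local"   # your AppVolumes Manager FQDN
Invoke-RestMethod -SessionVariable AppVolSession -Method Post `
    -Uri "https://$AppVolServer/cv_api/sessions" -Body $AppVolRestCreds
```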

Now you can authenticate. Once you establish a session cookie, it will look something like below:



From this point forward you can start doing other things in the AppVolumes Manager, as you have an authenticated session cookie saved. If you check the session cookie by typing $AppVolSession (the variable we saved it in), it will look something like below:

Getting the AppVolumes version with the API and PowerShell

Here is my method. As explained, I like Invoke-RestMethod, so getting the version data in PowerShell looks something like this:

Invoke-RestMethod -WebSession $AppVolSession -Method Get -Uri https://(AppVolumesServerFQDN)/cv_api/version

Notice this time we passed the variable $AppVolSession, as we want to reuse the session cookie, and we did not include the credentials body because we already have an authenticated session. Last, notice we switched the Method to "Get", as we are reading data from the API. For more info on REST verbs, look here! Your output should look like below:

Most of the stuff Chris called out in his blog still works today in 4.x; the core calls and the 2.x functions work, but in 4.x they decided to change a ton of this stuff. Working with PowerShell, you can just substitute the URL from his blog into the Invoke-RestMethod command we used for getting version data. For example, Get AD Settings:

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri https://(AppVolumesServerFQDN)/cv_api/ad_settings

But as you see below, the output is JSON, so an easy way to deal with it is to save the contents to a variable.

I saved the contents to the variable $ADsettings, and now you can see the data below:
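In script form, what the screenshot shows is roughly this (the ad_settings property name is what my environment returns; check it against the JSON in your own raw output):

```powershell
# Save the JSON response into a variable; PowerShell converts it to objects
$ADsettings = Invoke-RestMethod -WebSession $AppVolSession -Method Get `
    -Uri "https://$AppVolServer/cv_api/ad_settings"

# Browse the converted object
$ADsettings | Get-Member
$ADsettings.ad_settings
```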

We are just changing the last portion of the URL to reach different things.

Directory Tab

# Get Online Entities
Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/online_entities"

You could take my word that these records exist, but that is not what this post exists for. Open your AppVolumes URL, go to Directory and then Online, and open your Developer tab. Then click on the Online tab to refresh the page. It should look something like below:

From here, on the Developer tab, on the right side, click on "online_entities", like below:

From here you can see the web details behind the rendering of the page. Hmmm... that URL looks a bit like what we are using for our API calls, except we are adding "/cv_api" to ours.

By paying attention to this, you can use Get calls to retrieve the data from all the tabs in the same fashion: just go to the web GUI, find the URL, and add "/cv_api" in front of the path.

Below is the data saved in my variable:

Now, with the short learning out of the way, let's figure out what all the others are. Below is a good list, though not all, of the API calls.

Get Users

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/users"

Get Computers

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/computers"

Get Groups

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/groups"

Get OUs

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/org_units"

Infrastructure Tab

Get Storage

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/storages"

Get Storage Groups

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/storage_groups"

Get Managed Machines

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/machines"

Inventory Tab

Get AppStack Applications (Products)

Invoke-RestMethod -WebSession $AppVolSession -Method get -Uri "https://(AppVolumesServerFQDN)/cv_api/app_volumes/app_products"

Get Packages

Invoke-RestMethod -WebSession $AppVolSession -Method get -Uri "https://(AppVolumesServerFQDN)/cv_api/packages"

Get Packages with all the data

Invoke-RestMethod -WebSession $AppVolSession -Method get -Uri "https://(AppVolumesServerFQDN)/app_packages?include=app_markers%2Clifecycle_stage%2Cbase_app_package%2Capp_product"

Get Lifecycle Data

Invoke-RestMethod -WebSession $AppVolSession -Method get -Uri "https://(AppVolumesServerFQDN)/cv_api/app_volumes/lifecycle_stages"

Get All 4.x Programs

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/programs"

Get App Assignments

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/app_assignments"

Get App Assignments All the details

Invoke-RestMethod -WebSession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/app_volumes/app_assignments?include=entities,filters,app_package,app_marker&"

Get App Attachments

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/app_attachments"

Get Writable Volumes

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/writeables"

Config Tab

Get License Usage

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/license"

Get Ad Domains

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/domains"

Get AD Domain More info

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/ldap_domains/$DomainID"

Get Administrators

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/administrators"

Get Machine Managers

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/configuration/hypervisor"

Get Storage

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/configuration/storage"

Get Managers

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/configuration/manager_services"

Get Settings

Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/configuration/settings"

Okay, let's see if anyone ran into issues. I bet some of you hit an error like below:

If you look at the URL in the command, you see that "Users" is capitalized. That is the issue: these calls are case sensitive. I think they are all lower case, so when in doubt, make everything lower case. It will work much better for you. I made this mistake a ton!
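If you want a belt-and-suspenders fix, just lower-case the URI before sending it (the FQDN part of a URL is case-insensitive anyway, so this is safe):

```powershell
# Force the whole URI to lower case before calling the API
$Uri = "https://$AppVolServer/cv_api/Users"
Invoke-RestMethod -WebSession $AppVolSession -Method Get -Uri $Uri.ToLower()
```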

Dive into Put and Post

Now that we have covered all the boring Get stuff, let's go off the deep end and start causing damage. Off to the Put and Post world, where things begin to get interesting.

The variable $GroupID is the ID number of the storage group you want to work with. If you use the command from above to get the storage groups and put the result into a variable, you can parse it, using something like below:

$StorageGroups = Invoke-RestMethod -Websession $AppVolSession -Method Get -Uri "https://(AppVolumesServerFQDN)/cv_api/storage_groups"

From there you can type $StorageGroups.Storage_Groups and it will show you all the storage groups with full details, and $StorageGroups.Storage_Groups.ID will list all the storage group IDs. Once you know the group IDs you can do things like Import, Replicate, and Rescan.

You can see the name and ID below. AppVolumes makes all these changes via the group ID. As you can see, my group ID is "1".

For the examples below $GroupID is the ID of the storage group you want to do something with.

Import from Storage Group

Invoke-RestMethod -Websession $AppVolSession -Method Post -Uri "https://(AppVolumesServerFQDN)/cv_api/storage_groups/$GroupID/import"

Replicate Storage Group

Invoke-RestMethod -Websession $AppVolSession -Method Post -Uri "https://(AppVolumesServerFQDN)/storage_groups/$GroupID/replicate"


Rescan Storage Group

Invoke-RestMethod -Websession $AppVolSession -Method Post -Uri "https://(AppVolumesServerFQDN)/cv_api/storage_groups/$GroupID/rescan"

And if you want to do something with all the storage groups, you can throw in my favorite thing, the ForEach loop. For example:

foreach($GroupID in $StorageGroups.Storage_Groups.ID){

    Invoke-RestMethod -WebSession $AppVolSession -Method Post -Uri "https://(AppVolumesServerFQDN)/cv_api/storage_groups/$GroupID/import"

    Invoke-RestMethod -WebSession $AppVolSession -Method Post -Uri "https://(AppVolumesServerFQDN)/cv_api/storage_groups/$GroupID/replicate"

    Invoke-RestMethod -WebSession $AppVolSession -Method Post -Uri "https://(AppVolumesServerFQDN)/cv_api/datastores/rescan"
}

This will Import, Replicate and Rescan each of the storage groups.

Working with applications: you need to gather a bunch of data before you can do things with them. Using what you learned above, you need to get things like the "Package ID" and "Lifecycle ID".

Get Package Data from Source and Target

$Packages = (Invoke-RestMethod -WebSession $AppVolSession -Method get -Uri "https://(AppVolumesServerFQDN)/app_volumes/app_packages?include=app_markers%2Clifecycle_stage%2Cbase_app_package%2Capp_product").data

Notice I put "().data" around the command. That gets you right inside the data subset. If you look at the variable $Packages, it holds all the data for your packages. It will look something like below; notice my package ID is "2".
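Rather than eyeballing the output for the ID, you can filter the variable down to a single package by name (a sketch; I am assuming the name and id properties you can see in the screenshot):

```powershell
# Pull the 7zip package object out of the collection
$Package = $Packages | Where-Object { $_.name -like "*7zip*" }
$Package.id    # the package ID we will use below
```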

The next thing you will need is the lifecycle data. This is really the lifecycle master DB. Yes, it changes between versions and is growing.

$Lifecycle = (Invoke-RestMethod -WebSession $AppVolSession -Method get -Uri "https://(AppVolumesServerFQDN)/app_volumes/lifecycle_stages").data

Again going right into the data and saving to a variable.

Below is the base command to set the Lifecycle of any Package.

Set Lifecycle Data

Invoke-RestMethod -WebSession $AppVolSession -Method put -Uri "https://(AppVolumesServerFQDN)/app_volumes/app_packages/$($$($"

But let's apply this to my example. I want to set my package "2" to Published. We can do that with variables or with just the numbers. Below is the example with variables. Variables work nicely when you want to do many of these at once, and if you want to do a huge set, just throw it in a ForEach loop.

Below we are setting the package "7zip", ID "2", to "Published", which equals "3" in the AppVolumes DB.

$Package = '2'
$LifeCyclePublished = '3'

Invoke-RestMethod -WebSession $AppVolSession -Method Put -Uri "https://(AppVolumesServerFQDN)/app_volumes/app_packages/$($Package)?data%5Blifecycle_stage_id%5D=$($LifeCyclePublished)"

Now pull the packages again, and they will show the updated Lifecycle Stage ID, like below:

Now let's take the same package and set it as CURRENT, since it's our production package. The command below does that.

Set Source Current Status

Invoke-RestMethod -WebSession $AppVolSession -Method put -Uri "https://(AppVolumesServerFQDN)/app_volumes/app_products/$($Package.app_product_id)/app_markers/CURRENT?data%5Bapp_package_id%5D=$($"

Well, congrats, you just set the package as CURRENT. Now if you Get the package data again you will see something like below (the CURRENT marker is hidden under "app_markers"):

Below, "App Markers" is broken out.

Now let's work on the fun mess of assigning users. This one threw me for a loop for a few hours, and then it just clicked. It did take some help from Chris; I think the email said something like "I'm so confused on assigning a user and can't make it work. HELP! Can you give me a hint on how to make this work, not the answer!" Something to that extent, as I can't find the email anymore. For assigning users there is some specific data you need. The key parts are the following:

  • App Product ID
  • Entities
  • Path
  • Entity Type
  • App Package ID
  • App Marker ID

-App Product ID – This comes from the Products. Below is the command to get Products.

Get Product Data from Source and Target

$Products = (Invoke-RestMethod -WebSession $AppVolSession -Method get -Uri "https://$AppVolServer/app_volumes/app_products").data

You will need a ForEach loop to get down to a single product, or you can just use the $Products.ID values. Below is an excerpt of my $Products variable.
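For example, to land on a single product without a loop (again assuming the name property shown in my excerpt):

```powershell
# Filter the product list down to one product
$Product = $Products | Where-Object { $_.name -eq "7zip" }
$Product.id
```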

-Entities (who you are assigning; it has to match a specific form). This is a fun one: in order to assign a user, it HAS to be in this format:

CN=AD User,CN=Users,DC=corp,DC=local

Now if you have OUs, you need to add all that fun stuff in there too.

An easy way to see this is to do the assignment with the Developer tab open. Once you hit complete, find the call named "app_assignments", click on it, scroll all the way to the bottom, and expand everything under "Request Payload".

It will look something like this:

-Path – The full distinguished name of the user or group you are assigning, in the format shown above.

-Entity Type – This is either “User” or “Group”

-App Package ID – You can use "null" if you are not assigning to a specific package ID and are just using CURRENT. For 90% of the people out there, you are not assigning groups or users directly to a package; you are assigning to the package marked as "Current". If that is your case, use "null".

-App Marker ID – This is the application marker ID. It comes from the packages data; we are just using the value $Package.app_markers.id.

Below we are building the body for the assignment. This is the part that messed me up the whole time, until I realized the Developer tab gave me the answer. If you look at the payload breakdown, it shows you exactly what to send; you just need to translate it a bit.

From there you build this:

# Assign User to AppStack
$AssignUserOrGroup = "CN=AD User,CN=Users,DC=corp,DC=local"
$EntityType = "Group"
$AssignmentJsonBody = "{""data"":[{""app_product_id"":$($Product.id),""entities"":[{""path"":""$AssignUserOrGroup"",""entity_type"":""$EntityType""}],""app_package_id"":null,""app_marker_id"":$($Package.app_markers.id)}]}"
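If the escaped-quote string makes your eyes water, the same JSON can be built from a hashtable (this just produces the identical body; $Product and $Package are the objects pulled earlier, and the property names are from my environment, so verify them against your own output):

```powershell
# Build the same assignment body with ConvertTo-Json instead of a raw string
$AssignmentBody = @{
    data = @(
        @{
            app_product_id = $Product.id
            entities       = @(
                @{ path = $AssignUserOrGroup; entity_type = $EntityType }
            )
            app_package_id = $null
            app_marker_id  = $Package.app_markers.id
        }
    )
}
# Default depth is too shallow for nested arrays, so set it explicitly
$AssignmentJsonBody = $AssignmentBody | ConvertTo-Json -Depth 5
```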

Now that you have the body built you can assign the user! That part is simple.

Invoke-RestMethod  -WebSession $AppVolSession -Method Post -Uri "https://(AppVolumesServerFQDN)/app_volumes/app_assignments" -Body $AssignmentJsonBody -ContentType 'application/json'

Then to unassign, you just find the assignment ID from the commands we covered earlier, and boom, the user is unassigned.

Un-assign a user from an AppStack

Invoke-RestMethod  -WebSession $AppVolSession -Method Post -Uri "https://(AppVolumesServerFQDN)/app_volumes/app_assignments/delete_batch?ids%5B%5D=$($"
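To find that assignment ID, pull the assignments and filter on the entity you assigned (a sketch; the entities.path property is what my environment returns, so verify against your own output):

```powershell
# Look up the assignment for our group, then delete it by ID
$Assignments = (Invoke-RestMethod -WebSession $AppVolSession -Method Get `
    -Uri "https://$AppVolServer/app_volumes/app_assignments?include=entities,filters,app_package,app_marker").data
$Assignment = $Assignments | Where-Object { $_.entities.path -eq $AssignUserOrGroup }

Invoke-RestMethod -WebSession $AppVolSession -Method Post `
    -Uri "https://$AppVolServer/app_volumes/app_assignments/delete_batch?ids%5B%5D=$($Assignment.id)"
```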

There are many more things that can be accomplished with this, but this is just the start. I hope this helps you understand how I figured out the API, and how the little things help you going forward. Look for more blog posts on this.

Posted in API, App Volumes, CLI and Powershell, Random Crap, VDI

Linked Mode VCSA Replication Checker

A coworker of mine and I were chatting about VCSA replication agreements, and we agreed there should be a fling for mapping them. That night I thought about it: really, there was nothing stopping me from doing this myself. I pulled out the computer, and in a few lines of code I had it working. Over the next few nights I made it more robust, getting it to accept a seed node, find all of its VCSA replication partners, and report back replication status. It turns out the seeding loop was a little more than I had bargained for. I reached out to the community and talked with my friend Joe Houges (@jhoughes), who worked through the looping issues with me and also reminded me about the SSH code stream.

Below is an example of the report the script will return. It shows the source vCenter, replication partners, and replication sequence numbers, and validates that the numbers are not off.

Same as above, just as an image, since it seemed easier to see.

The script will log into the VCSA, enable bash, pull replication status, clean up the data, report the info, export it to a CSV, and then disable bash. The code lives here:
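The core of the status pull is just an SSH call per node. A rough sketch using the community Posh-SSH module (the vdcrepadmin path is the standard VCSA location, but the exact flags and the bash-shell handling may differ by build, so treat this as a starting point rather than the script itself):

```powershell
# Requires: Install-Module Posh-SSH
$VCSACreds = Get-Credential     # VCSA root credentials
$SSOPassword = Read-Host "SSO administrator password"
$Session = New-SSHSession -ComputerName "vcsa01.corp.local" -Credential $VCSACreds

# Ask vmdir for this node's replication partner status
$Cmd = "/usr/lib/vmware-vmdir/bin/vdcrepadmin -f showpartnerstatus -h localhost -u administrator -w `"$SSOPassword`""
$Result = Invoke-SSHCommand -SSHSession $Session -Command $Cmd
$Result.Output    # raw partner status lines to clean up and export to CSV

Remove-SSHSession -SSHSession $Session | Out-Null
```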

Posted in Blogtober, CLI and Powershell, DevOps, Random Crap, vCenter, Virtulization

AppVolumes Monitoring with HTTP Health Monitor

Over the past two-plus years we have transformed our business over to AppVolumes for publishing applications in our VDI deployments. But there are a few things you take at face value.

Like when you read the VMware docs here, which tell you that you are supposed to use the URL https://AppVolServerFQDN/health_check for gathering health check info, and that if that health check URL shows anything other than a 200 OK status, something is broken. Well, this is where I break open the fact that it does not actually work as you were made to believe.

Below is a screenshot of a common issue: an AppVolumes server after a patch cycle starts up with no connectivity because the DB is down. No big deal, restart the AppVolumes service and you are good to go. But you would hope your health monitor would tell you this, since you have your monitoring solution send alerts when the status is something other than 200. Well, you are wrong; it won't send you anything.

As you see above, the status shows 304. According to the HTTP standards, a 304 is labeled "304 Not Modified": "If a 304 response indicates an entity not currently cached, then the cache MUST disregard the response and repeat the request without the conditional." That also means a 300-level status code is not seen as something being broken. But as you also see above, the AppVolumes server is indeed BROKEN. Pull the same HTTP status code via PowerShell and you get this:

As you see here, it's returning 200 OK. Hmm. That is expected, because 300-level status codes are pretty much treated as just another 200. But again, it's broken while saying it's not.

Below is another example where the AppVolumes server is really broken, but the status code still shows 200 OK.

And again it's showing 200 OK, and it's no doubt broken. The only difference here is that it has not gotten to the point where the AppVolumes service has loaded the cert, so you will get a cert error, but it still reports 200 OK. Below is the same thing with PowerShell used to make the web request.

Now I bet you are thinking, well, just monitor the AppVolumes service on the OS. Yes, you should, but that will not help in these two cases: the service is in a running state, so it does no good.

I have in fact opened a VMware case on this and am still waiting on a plan of action, but I'm sure it will be the next build update before it's fixed. In the meantime, I have dug through every page to try to figure out a way to monitor these services so we can show true up and down status, and I have come up with this solution. Monitor the status code of this URL:


If it reports back 404, things are good. If it reports anything other than 404, something is broken. This does not fit all use cases, as if the service is not started it will respond as a 404, so it's a double-edged sword. For me, we are still monitoring the same health check URL https://AppVolServerFQDN/health_check, but also monitoring this one. With the combination of both, you can see whether it's up or down, just not with one URL like you hoped.
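Until there is a proper fix, the combined check can be scripted along these lines (a sketch: $SecondUrl is the extra URL discussed above, which you fill in yourself, and Invoke-WebRequest throws on a 404, so the status code is read out of the exception):

```powershell
# Combined AppVolumes health probe: health_check should be 200,
# the second URL should be 404. Anything else means trouble.
$AppVolServer = "appvolserver.corp.local"
$SecondUrl = "https://$AppVolServer/..."    # fill in the URL from above

try {
    $Health = (Invoke-WebRequest -Uri "https://$AppVolServer/health_check" -UseBasicParsing).StatusCode
} catch { $Health = $_.Exception.Response.StatusCode.value__ }

try {
    $Second = (Invoke-WebRequest -Uri $SecondUrl -UseBasicParsing).StatusCode
} catch { $Second = $_.Exception.Response.StatusCode.value__ }

if (($Health -eq 200) -and ($Second -eq 404)) { "AppVolumes looks healthy" }
else { "AppVolumes may be broken: health_check=$Health second=$Second" }
```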

Best of luck to everyone on monitoring these till we get a full solution. I will update the blog once I receive feedback on the case status.

Posted in App Volumes, Blogtober, Horizon, Monitoring, Random Crap, VDI

Using PowerShell to automate file adds to Teams and SharePoint online

As a bunch of us start converting our environments over to SharePoint Online and Teams, we are having to adjust how we do things, now that Teams uses SharePoint Online as its back-end file structure.

Problem: Need to find a way to add files to a teams site from an automated script.

Doing some research, I found there was a simple way to do this. The simplest way to add files to a Teams site is just to send an email to the team's email address. Odds are you are already emailing the report or task results to yourself or a team, so why not just add the Teams address? How do you find the email address? Inside your Teams site, on the channel you want to send to, click on the three dots. It will look something like below.

Click on the Get email address button and you will now have the email. This works if you are just sending email results to Teams. But what if you have an attachment? That also works, but it shows up as an email attachment.

What if I just want to send a file on its own with no email? This is where things got fun, as Teams has no real API you can use for stuff like this. (I may be wrong, but I could not find one.) The only way I could figure out was to add the files directly to the back-end SharePoint site.

This requires you to install the PowerShell module SharePointPnPPowerShellOnline. It gives you functions to connect to the SharePoint site and upload files directly where you want them to go.

If you are trying to drop a single file into a Teams file share, you can run the commands below:

$SharepointURL = ""
$OutPath = "C:\Temp\ToSharepoint\Test.txt"
Connect to Sharepoint
Connect-PnPOnline $SharepointURL -Credentials $(Get-Credential)
Send File to Teams Sharepoint site
Add-PnPFile -Folder "Shared Documents/Reports/FolderName" -Path $OutPath

If you want to add the contents of a folder, you can just add a loop and copy all files from that folder to the Teams site.

$SharepointURL = ""
$OutPath = "C:\Temp\ToSharepoint"
Connect to Sharepoint
Connect-PnPOnline $SharepointURL -Credentials $(Get-Credential)
Send Files to Teams Sharepoint site
$Files = Get-ChildItem "$OutPath"
foreach($File in $Files){
Add-PnPFile -Folder "Shared Documents/Reports/FolderName" -Path $File.FullName

Add one of the two snippets above to any of your reporting scripts, and you will have reports dumped directly into your Teams site's "Files" directory.

Posted in CLI and Powershell, Office 365, Random Crap

Using Windows Credential Manager with Powershell.

I have been having trouble for a while figuring out how to store credentials when using them in scripts. I have done the whole Get-Credential-and-enter-it-every-time thing, which is great for one-off scripts but not for scheduled tasks. In the past I had just been using Import-Clixml and importing the creds saved to a txt file. This works well, but now you have to deal with actual txt files. While reading an article on something else, I remember someone mentioning saving credentials to the Windows Credential Manager. After some research, digging, and reading, I found this gem of a PowerShell module. The CredentialManager module is easy to use and simplistic, with only 4 commands.


With these 4 commands you can now save credentials to, and call credentials from, the Credential Manager. This is a huge win for me. No more dealing with cred files, trying to remember which account created the txt file, and fighting that mess.
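Once the module is installed (Install-Module CredentialManager), you can list the commands yourself:

```powershell
# List everything the CredentialManager module exposes
Get-Command -Module CredentialManager | Select-Object -ExpandProperty Name
```

The four you get back are the ones covered below: New-StoredCredential, Get-StoredCredential, Remove-StoredCredential, and Get-StrongPassword.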

Creating Stored Credentials

New-StoredCredential -Comment 'Test_Creds' -Credentials $(Get-Credential) -Target 'TestCreds'
Showing the Credentials in the Credential Manager.

Using the stored Credentials

Get-StoredCredential -Target 'TestCreds'

Running this on its own will show the output below, but that alone does not help you.

If you store this in a variable, you can use that variable for your credentials as you normally would.

$TestCreds = Get-StoredCredential -Target 'TestCreds'
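As a quick sketch of why this matters for scheduled tasks, the stored credential drops straight into any cmdlet that takes -Credential (the server name here is made up):

```powershell
# Pull the saved credential out of Windows Credential Manager
$TestCreds = Get-StoredCredential -Target 'TestCreds'

# Use it anywhere a PSCredential is accepted
Invoke-Command -ComputerName 'Server01' -Credential $TestCreds -ScriptBlock {
    Get-Service -Name 'Spooler'
}
```

No prompt, no txt file on disk, and the credential stays tied to the Windows account that saved it.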

Removing Stored Credentials

Cleaning up old credentials is always great housekeeping.

Remove-StoredCredential -Target 'TestCreds'

Using Strong Passwords

Get-StrongPassword -Length 20 -NumberOfSpecialCharacters 4

This command will get you a password that is 20 characters long with 4 special characters. It is a quick way to generate a password for whatever you need.

This little bit of info has saved me a huge amount of time. I am not claiming that Credential Manager is the most secure method, but it is way better than saving passwords in clear text in the script, and much more manageable than dealing with txt files.

Posted in CLI and Powershell, DevOps, Random Crap, Security | Comments Off on Using Windows Credential Manager with Powershell.

Horizon Logon Monitor reporting

Updated 3 December 2019

If you have set up your Logon Monitor, that is great. But it is lacking a ton. How do you look at this on a holistic basis? How do you start to look at trends? Well, there is nothing out of the box for you; you pretty much have to build the solution on your own. Well, you are in luck. I had some time on a flight to Dallas to throw something together pretty quick.

What I built was a tool that will query the remote Logon Monitor folder, look through each of the log files and collect the following:

  • Logon Date
  • Logon Time Stamp
  • Session Users
  • Session FQDN
  • Logon Total Time
  • Logon Start Hive
  • Logon Class Hive
  • Profile Sync Time
  • Windows Folder Redirection
  • Shell Load Time
  • Total Logon Script
  • User Policy Apply Time
  • Machine Policy Apply Time
  • Group Policy Software Install Time
  • Free Disk Space Avail

I pull these stats from each of the log files, put them in a table view, and export to a CSV. Yes, this is nothing too fancy, but from here you can publish the results to a SQL database instead, create a web front end to show fancy graphs, and if you are lucky you can put it behind Microsoft's Power BI.
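As a rough sketch of the parsing approach (the share path, log naming, and the exact "Logon Time:" wording are assumptions here; check your own Logon Monitor logs for the real strings), pulling one stat out of each log looks something like this:

```powershell
# Hypothetical remote Logon Monitor share and log naming
$LogPath = '\\FileServer\LogonMonitor\Logs'

$Results = foreach ($Log in Get-ChildItem $LogPath -Filter '*.txt') {
    # Grab the total logon time line from each log file
    $Match = Select-String -Path $Log.FullName -Pattern 'Logon Time:\s+([\d\.]+)' |
        Select-Object -First 1
    if ($Match) {
        [PSCustomObject]@{
            LogFile        = $Log.Name
            LogonTotalTime = [double]$Match.Matches[0].Groups[1].Value
        }
    }
}

# Dump the table view out to CSV
$Results | Export-Csv 'C:\Temp\LogonStats.csv' -NoTypeInformation
```

The real script just repeats that Select-String pattern for each of the stats in the list above.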

To use this you need to follow my previous post to set up Horizon Logon Monitor and configure the remote Logon Monitor path.


Once you get this set up, you can run the script as a scheduled task to collect log data. This script is set up more as a framework, and I will continue to add to it as I have the time.

You can access the script here. Or can just be found on my GitHub site.

Download it, fill in the remote log path and where and what you want to name the CSV, and when you run the script you will get a CSV like below.


I have completed some major updates to this script. I have added the ability to turn features on and off, and also added the ability to clean up old log files so you are not filling up drives.

I have incorporated an email function that will attach the day's CSV file with the performance stats, and it will also include a bar graph with the average logon times of the last 14 days, organized by day. The chart will look like below. It highlights the lowest time in green and the highest in red. The email will also have a breakdown of the averages for the day.


I also added SQL functions so you can export the data to a SQL database you have already stood up; as the script runs, it will export the data to a SQL table. Inside the Git repo is the SQL script to create the table, and also a script to run for de-duplication of the data. You should not run into duplicate records, but in my testing I ran into a ton.
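The SQL export boils down to an INSERT per row. A minimal sketch, assuming the SqlServer module is installed and using made-up server, database, and column names (the real schema comes from the SQL script in the Git repo):

```powershell
# Read back the day's stats and push them into SQL
$Results = Import-Csv 'C:\Temp\LogonStats.csv'

foreach ($Row in $Results) {
    $Query = "INSERT INTO LogonStats (LogFile, LogonTotalTime)
              VALUES ('$($Row.LogFile)', '$($Row.LogonTotalTime)')"
    Invoke-Sqlcmd -ServerInstance 'SqlServer01' -Database 'LogonMonitor' -Query $Query
}
```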

Now from here the possibilities are pretty limitless. You can build a PowerBI site, you could build your own Webpage graphing the stats, or many other options.


Posted in Blogtober, CLI and Powershell, Horizon, VDI, Virtulization | 1 Comment

Service Now Time Format

If you have started working with Service Now and the API, and started creating changes with times and dates in them, have you noticed the times are off in your changes even though the time is right in the script? This was a bit of a head scratcher for a few minutes, until it dawned on me to think about time zones, since for me every script was 6 hours off. So that made me think to write this.

Service Now Date Formats

Field         Full Form              Short Form
Year          yyyy (4 digits)        yy (2 digits), y (2 or 4 digits)
Month         MMM (name or abbr.)    MM (2 digits), M (1 or 2 digits)
Day of Month  dd (2 digits)          d (1 or 2 digits)


Service Now Time Formats

Field         Full Form       Short Form
Hour (1-12)   hh (2 digits)   h (1 or 2 digits)
Hour (0-23)   HH (2 digits)   H (1 or 2 digits)
Minute        mm (2 digits)   m (1 or 2 digits)
Second        ss (2 digits)   s (1 or 2 digits)

By default Service Now uses the time format of:

yyyy-MM-dd HH:mm:ss

The time field only accepts a string, so you must convert the time to a string.

When using PowerShell, you would typically do something like this:

(Get-Date).ToString('yyyy-MM-dd HH:mm:ss')


But since we are 6 hours off, we have to think a bit outside of the box. Our environment is set to UTC time for scripting, but from my understanding you can change this.

Here is my work around for this issue.

(Get-Date).AddHours(6).ToString('yyyy-MM-dd HH:mm:ss')

You can use AddHours to adjust the time as needed, or you can use a variable to adjust as needed. Or the best option is to convert the time to UTC using the ToUniversalTime() method:

(Get-Date).ToUniversalTime().ToString('yyyy-MM-dd HH:mm:ss')

By using this you don't have to mess with adjusting for daylight saving time BS.
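To put the formatted timestamp to work, here is a sketch of creating a change through the Table API. The instance URL is a placeholder, and your instance may require more fields than these:

```powershell
# Placeholder instance URL
$Instance = 'https://yourinstance.service-now.com'

# Format both times as UTC strings in Service Now's default format
$Start = (Get-Date).ToUniversalTime().ToString('yyyy-MM-dd HH:mm:ss')
$End   = (Get-Date).ToUniversalTime().AddHours(2).ToString('yyyy-MM-dd HH:mm:ss')

$Body = @{
    short_description = 'Scripted change'
    start_date        = $Start
    end_date          = $End
} | ConvertTo-Json

# Create the change record via the Table API
Invoke-RestMethod -Method Post -Uri "$Instance/api/now/table/change_request" `
    -Credential (Get-Credential) -ContentType 'application/json' -Body $Body
```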


Hope this helps going forward.



Posted in CLI and Powershell, DevOps, Service Now | Comments Off on Service Now Time Format

Horizon View Instant Clone Pools Upgrade from 6.5 to 6.7

For those of you still yet to take the plunge from 6.5 to 6.7, there is one huge gotcha with it. It appears a bunch of people miss a single step. So here is my warning.

If you are using VMware Horizon and using Instant Clones you need to plan your migration accordingly.



  1. Take a snapshot of the parent VM on which you upgrade Horizon Agent to Horizon 7 version 7.5 or later. This snapshot is the master image for instant clones.
  2. Set the Storage Distributed Resource Scheduler (DRS) migration threshold to 3 in the cluster.
  3. Disable the instant-clone desktop pools.
  4. Upgrade vCenter Server to vSphere 6.7.
  5. To put the hosts that you plan to upgrade into maintenance mode, choose one of the following options.
    • Put the host directly into maintenance mode from vSphere Web Client then upgrade the host to vSphere 6.7. After the upgrade completes, use vSphere Web Client to exit maintenance mode.

    • Use the icmaint.cmd utility to mark a host for maintenance with the ON option. Marking a host for maintenance deletes the master images (the parent VMs in vCenter Server) from the ESXi host. Put the host into maintenance mode and upgrade it to vSphere 6.7 ESXi. After the upgrade completes, take the host out of maintenance mode. Then use icmaint.cmd to unmark the host for maintenance with the OFF option.

  6. Enable the instant-clone desktop pools.
  7. Perform a push-image operation for each instant-clone desktop pool that uses the new snapshot.

    Only the hosts that are upgraded to vSphere 6.7 ESXi are used for provisioning. The instant clones created during the push-image operation might be migrated to other hosts that are not yet on vSphere 6.7.

  8. Verify that all hosts in the cluster are upgraded to vSphere 6.7.
  9. If you upgrade the parent VM from a previous version to be compatible with ESXi 6.7 and later (VM version 14), then upgrade VMware Tools on the parent VM. You must take a new snapshot of the parent VM, which is the master image for instant clones and perform a push-image operation on all the instant-clone desktop pools that used the previous version of this master image.
  10. If the Virtual Distributed Switch (vDS) is upgraded, power on the parent VM to verify that there are no network issues. Following a vDS upgrade, you must take a new snapshot of the parent VM and perform a push-image operation on all the instant-clone desktop pools.

Pulled from the VMware Doc
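For step 5, the straight maintenance-mode path can be scripted with PowerCLI. This is just a sketch with placeholder vCenter and host names; the icmaint.cmd route still has to be handled separately on the host itself:

```powershell
# Connect to the (already upgraded) vCenter
Connect-VIServer -Server 'vcenter.domain.local'

# Drain the host and drop it into maintenance mode
Set-VMHost -VMHost (Get-VMHost 'esxi01.domain.local') -State Maintenance

# ...upgrade the host to ESXi 6.7 here...

# Bring the host back once the upgrade is done
Set-VMHost -VMHost (Get-VMHost 'esxi01.domain.local') -State Connected
```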


If you decide not to do this you will get something like this……

WARN  (1008-1878) <CacheRefreshThread-https://vcenter:443/sdk> [ObjectStore] Host: HostName not available for provisioning. ConnectionState=connected, PowerState=poweredOn, MaintenanceMode=false, AdminRequestedMaintenance=0, MarkedAsFailed=false, Host Version: 6.5.0, Host API Version: 6.5

Just a few of the little gotchas…

Posted in Horizon, vCenter, VDI, Virtulization | Comments Off on Horizon View Instant Clone Pools Upgrade from 6.5 to 6.7

Public Speaking around the world.

Last year I set out to put a huge effort into refining my public speaking skills and try to step it up a notch. I reached out to many in the community for feedback on my speaking and my overall content, and I received some amazing feedback. With this feedback I have been able to shape my presentations and adapt my snarky and humorous presentation style. Because of this I have been given an amazing opportunity to spread the word of VDI and automation around the world.

I have been able to present at places like:

Indianapolis, Sydney, Melbourne, St Louis, VMworld in San Francisco, Dallas, Rochester, Phoenix and Portland, to name most.

Through the year I have been able to make continuous improvements and adapt more and more as I adopt new things and add new content. Looking back, my presentation was focused on automation in VDI, with the purpose of showing my Recompose and Refresh automation script. But as I have had time, and as I have grown my skills as a presenter and as a script writer, I have been able to create huge new content, take on monstrous projects like the AsBuilt Report for Horizon, and add these things to my presentation, making it a jam-packed presentation that is extremely hard to fit into the 40-minute time slot. By doing this I have been able to grow new friendships with some amazing people because of my travels and the things I have done in the community. I cannot thank enough people for where I have made it, and if I start naming people, I will just leave someone out.

Also, doing this has really made me realize a bunch of things along the way: how important the community is, and how we all help each other out at the blink of an eye. How important it is to have a great place to work that is willing to let you go out, speak, and give back to the community. And how important it is to build a work and life balance. Without that you will go insane, I think.

So, to everyone, thank you for allowing me to come out and speak and do my part in helping the community, and for allowing me to meet some amazing people along the way. Not to mention a huge thank you to the whole community for your support and for allowing me to grow as a speaker, scripter, and community advocate.

Here is to an amazing 2019 with a few months to go. Hopefully I will be able to make the rounds to some more amazing places next year, visit more countries, and bring the user back to the UserCon.


Posted in Blogtober, Public Speaking, Random Crap, VDI, VMworld | Comments Off on Public Speaking around the world.