powershell – Page 4 – A Geeks World

Updating SCVMM DHCP Server Agent for Update Rollup 3 with Powershell

I’ve been to a couple of customers in the past month who have applied Update Rollup 3 for System Center 2012 R2 Virtual Machine Manager through WSUS, but didn’t read the fine print.


So I wrote a quick script to locate all Hyper-V Hosts with the old/incorrect version.
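A rough sketch of how such a check can look (the agent display name and the expected UR3 version number, 3.2.7672.0, are assumptions, so verify them against the UR3 release notes, and run it on, or connected to, the VMM server):

Import-Module VirtualMachineManager
$expectedVersion = [version]'3.2.7672.0'   # assumed UR3 agent version, check the KB for the exact value
$agentName       = 'Microsoft System Center Virtual Machine Manager DHCP Server (x64)'

Get-SCVMHost | ForEach-Object {
    $hostName = $_.Name
    Invoke-Command -ComputerName $hostName -ScriptBlock {
        param($agentName)
        # Read the installed agent version from the MSI uninstall registry keys
        Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*' |
            Where-Object { $_.DisplayName -eq $agentName } |
            Select-Object DisplayName, DisplayVersion
    } -ArgumentList $agentName |
    Where-Object { [version]$_.DisplayVersion -lt $expectedVersion } |
    Select-Object @{ n = 'Host'; e = { $hostName } }, DisplayName, DisplayVersion
}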

And the next step was obviously, how to update the agent on all the Hyper-V hosts remotely and automatically!
There are a couple of different ways to do this, let me go through a couple of them.

One of the easiest ways is to use Sysinternals PsExec: just run psexec against those servers and execute an uninstall of the old agent and an installation of the new one. In my humble opinion, it’s too much manual work to do it this way with a lot of hosts, so I’d rather use PowerShell.

Looking at the above PowerShell example, you almost have a full script for doing the rest.
Have a look at this;
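A sketch along those lines (the UR3 agent version number, the agent display name and the path to the new agent MSI on the VMM server are assumptions, so adjust them for your environment):

Import-Module VirtualMachineManager
$expectedVersion = '3.2.7672.0'   # assumed UR3 agent version
$agentName       = 'Microsoft System Center Virtual Machine Manager DHCP Server (x64)'
$installSource   = '\\VMM01\C$\Program Files\Microsoft System Center 2012 R2\Virtual Machine Manager\SwExtn\DHCPExtn.msi'   # assumption

Get-SCVMHost | ForEach-Object {
    Invoke-Command -ComputerName $_.Name -ScriptBlock {
        param($agentName, $installSource, $expectedVersion)

        $installed = Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*' |
                     Where-Object { $_.DisplayName -eq $agentName }

        if ($installed -and [version]$installed.DisplayVersion -lt [version]$expectedVersion) {
            # Safeguard: only touch the agent if the new MSI is reachable from this host
            if (Test-Path $installSource) {
                # Uninstall the old agent (PSChildName is the MSI product code), then install the new one silently
                Start-Process msiexec.exe -ArgumentList "/x $($installed.PSChildName) /qn /norestart" -Wait
                Start-Process msiexec.exe -ArgumentList "/i `"$installSource`" /qn /norestart" -Wait
            }
            else {
                Write-Warning "$env:COMPUTERNAME cannot reach $installSource, skipping (double hop?)"
            }
        }
    } -ArgumentList $agentName, $installSource, $expectedVersion
}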

Word of warning, the above script should be considered a “proof of concept” or give you a rough idea of how to do it. I’ve run it once, and it did work, so it will hopefully work for you too.

There is a minor problem with the above solution. That script will do something called a “double hop”: you run something on Computer A, which gets executed on Computer B, which in turn tries to connect to Computer C using the credentials provided on A. Two hops, aka a double hop.
In the above script, that happens when it accesses the install files from a remote share.
To solve that problem you have to enable something you might have heard about, called Kerberos Constrained Delegation, on all Hyper-V hosts (or other servers you want to double hop via).
In most environments KCD is not enabled, so the above script would not work 100%. In fact, the uninstall would work but not the installation, so you would end up with a server that’s missing the DHCP Agent.
In case you ran the script without reading this part, or before adding KCD, I added a small safeguard against that by doing a Test-Path before uninstalling the agent, which probably told you it failed.
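If you do want to go the KCD route, a minimal sketch of configuring it could look like this (server names are assumptions, and for WinRM sessions you also need protocol transition, i.e. “use any authentication protocol”):

Import-Module ActiveDirectory
Import-Module VirtualMachineManager
$vmmFqdn = 'VMM01.contoso.com'   # assumption: the server holding the install share

Get-SCVMHost | ForEach-Object {
    $adComputer = Get-ADComputer -Identity (($_.Name -split '\.')[0])
    # Allow the host to delegate to the CIFS service on the VMM server
    $adComputer | Set-ADComputer -Add @{ 'msDS-AllowedToDelegateTo' = @("cifs/$vmmFqdn", "cifs/$($vmmFqdn.Split('.')[0])") }
    # Protocol transition is what makes the second hop work from a remoting session
    $adComputer | Set-ADAccountControl -TrustedToAuthForDelegation $true
}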

My good friend and colleague Mikael Nyström recently wrote a great blog post here on how to use CredSSP instead of KCD for tasks like this.

And here is a slightly modified script using CredSSP instead of KCD.
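A sketch of that approach, with the same assumptions as before about the agent name, version number and MSI path; the flow matches the description below:

Import-Module VirtualMachineManager
$credential      = Get-Credential                                   # account with admin rights on the hosts
$domain          = (Get-WmiObject Win32_ComputerSystem).Domain
$expectedVersion = '3.2.7672.0'                                     # assumed UR3 agent version
$agentName       = 'Microsoft System Center Virtual Machine Manager DHCP Server (x64)'
$installSource   = '\\VMM01\C$\Program Files\Microsoft System Center 2012 R2\Virtual Machine Manager\SwExtn\DHCPExtn.msi'   # assumption

# Allow this machine to delegate fresh credentials to any server in the domain
Enable-WSManCredSSP -Role Client -DelegateComputer "*.$domain" -Force | Out-Null

foreach ($vmHost in Get-SCVMHost) {
    # Let the host act as a credential receiver
    Invoke-Command -ComputerName $vmHost.Name -ScriptBlock { Enable-WSManCredSSP -Role Server -Force | Out-Null }

    # Connect again, now with CredSSP, so the host can reach the VMM share (no double-hop problem)
    Invoke-Command -ComputerName $vmHost.Name -Authentication Credssp -Credential $credential -ScriptBlock {
        param($agentName, $installSource, $expectedVersion)
        $installed = Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*' |
                     Where-Object { $_.DisplayName -eq $agentName }
        if ($installed -and [version]$installed.DisplayVersion -lt [version]$expectedVersion) {
            # Test-Path safeguard: only uninstall if the new MSI is reachable
            if (Test-Path $installSource) {
                Start-Process msiexec.exe -ArgumentList "/x $($installed.PSChildName) /qn /norestart" -Wait
                Start-Process msiexec.exe -ArgumentList "/i `"$installSource`" /qn /norestart" -Wait
            }
            else {
                Write-Warning "$env:COMPUTERNAME cannot reach $installSource"
            }
        }
    } -ArgumentList $agentName, $installSource, $expectedVersion
}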

Word of warning, the above script should be considered a “proof of concept” or give you a rough idea of how to do it. I’ve run it once and it didn’t break anything in that environment, so it might work for you too.

Basically, the script will enable CredSSP on the computer you run it on, and allow the credentials to be used on all remote servers that are part of your domain. It will then connect to all Hyper-V hosts known by SCVMM and enable those as Credential Receivers.
Following that, it will once again connect to those servers and check if the SCVMM DHCP Agent is outdated and if it’s able to connect to the install location (the SCVMM server’s C$ share).
I made sure it verifies that it can connect to the install location before uninstalling the Agent, because if it can’t connect to the SCVMM server, I would rather have an old DHCP Agent than no agent at all.
And finally, it uninstalls the old agent and installs the new one.
Done!

It’s also possible to use SCVMM’s Job function to schedule a job to be executed on all Hosts. But I’ll cover that in some future post.

Azure Pack: Add a new user to a plan automatically

Update: It looks like SMA is not executing the script when a new Tenant is created, but rather when a subscription is added to the user.
I’m trying to get confirmation from Microsoft on whether that is a bug introduced in one of the latest updates. See the comments for more details.

Problem: When a new employee at TrueSec (our company) logs into Azure Pack, he has to be added to the “Tenants – TrueSec Employees” plan manually.

Solution: One way is to add a “signup code” to the plan and tell new employees to manually join the plan with that specific code. It could work, but it does not feel like the optimal solution.

The desired way would be if all new employees could be added to that plan automatically. Is that possible?
– Of course it is, with the help of SMA! Let me show one way to do this.

Prerequisites: Connection Asset, SMA Runbook, link the Runbook to a task.

In my case, I’m using the MgmtSvcAdmin asset, which looks like this. But you can also create other types of Connections with working credentials. Just note that you have to enter the name of the Admin Site server in the Asset, as the script will use that info. And the user account specified obviously needs access to the Admin Site (to modify the subscriptions).

Add a new Runbook with the script below. In my case, I’m using ADFS to connect to the Admin Site, so the script has to generate an ADFS token first.
If you are not using ADFS, you will have to modify the script to use normal Windows authentication. That is the most common way to authenticate, so there shouldn’t be any problem finding example code for it.

Though, please note that the script currently matches the new user’s e-mail address against (in our case) @truesec.com or @truesec.se. If you don’t use ADFS, it’s possible for a user to type any name they want during registration and then possibly get added to a plan they should not have access to.

And finally, add a new Automation Task. You do that under Clouds -> Automation.
Object: SPF Tenant
Action: Create
Runbook: New-Tenant

The script:
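Treat the following as a hedged sketch rather than a drop-in solution: the SPF parameter name ($resourceObject) and its properties, the field names in the MgmtSvcAdmin connection asset, the plan ID placeholder, the ADFS address and the Add-MgmtSvcSubscription call are all assumptions you need to verify against your environment, and Get-AdfsToken is the helper function mentioned above (not included here).

workflow New-Tenant
{
    param (
        [object]$resourceObject      # passed in by SPF when the trigger fires (assumption)
    )

    # Connection asset holding the Admin Site server name and admin credentials
    $conn = Get-AutomationConnection -Name 'MgmtSvcAdmin'

    inlinescript {
        $conn           = $Using:conn
        $resourceObject = $Using:resourceObject

        $userName = $resourceObject.Name                 # the tenant's sign-up address (assumption)
        $planId   = '<your plan id>'                     # placeholder, use the real plan ID

        # Only auto-subscribe users from our own domains
        if ($userName -notmatch '@truesec\.(com|se)$') { return }

        $securePassword = ConvertTo-SecureString $conn.Password -AsPlainText -Force
        $credential     = New-Object System.Management.Automation.PSCredential ($conn.Username, $securePassword)

        $adminUri = "https://$($conn.ComputerName):30004"                                            # WAP Admin API, default port
        $token    = Get-AdfsToken -AdfsAddress 'https://adfs.truesec.com' -Credential $credential    # assumption: helper available in the runbook

        # Add a subscription to the plan for the new tenant
        Add-MgmtSvcSubscription -AdminUri $adminUri -Token $token -DisableCertificateValidation `
            -AccountAdminLiveEmailId $userName -AccountAdminLivePuid $userName -PlanId $planId
    }
}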

I hope this helps you automate things in your environment. If you can think of any other great uses for SMA, or need help automating something, please leave a comment; maybe I’ll be able to assist.

Azure Pack: SMA Script to set a Static MAC Address for New Virtual NICs

When a user is using Azure Pack to add additional Virtual Network Adapters to a Virtual Machine, they end up with a Dynamic MAC Address. This is regardless of the settings in the VM Template that was used to create the VM. The NIC(s) created at deployment of the VM will honor the setting in the Template; it’s only when additional NICs are added that this happens.


We have had some issues with VMs using Dynamic MAC Addresses, where they got a new MAC Address after migrating to another host, resulting in Linux machines being unhappy and some other servers getting new DHCP addresses.

I figured that this could be an excellent task to get more familiar with SMA and use that cool feature of Azure Pack. So I made a script which executes when a new Network Adapter is added to a VM through Azure Pack, sets the MAC Address to a Static entry and lets SCVMM pick one from the pool.
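A hedged sketch of such a runbook; the SPF parameter ($resourceObject and its VMId property) and the VmmConnection field names are assumptions, and the all-zero static MAC is the trick that makes VMM hand out an address from its pool:

workflow New-NetworkAdapter
{
    param (
        [object]$resourceObject      # the SPF NetworkAdapter object that triggered the runbook (assumption)
    )

    $conn = Get-AutomationConnection -Name 'VmmConnection'

    inlinescript {
        $conn           = $Using:conn
        $resourceObject = $Using:resourceObject

        $securePassword = ConvertTo-SecureString $conn.Password -AsPlainText -Force
        $credential     = New-Object System.Management.Automation.PSCredential ($conn.Username, $securePassword)

        # Connect to the VMM server named in the connection asset
        Get-SCVMMServer -ComputerName $conn.ComputerName -Credential $credential | Out-Null

        # Find the VM the new adapter belongs to, then every dynamic adapter on it
        $vm = Get-SCVirtualMachine -ID $resourceObject.VMId          # property name is an assumption
        Get-SCVirtualNetworkAdapter -VM $vm |
            Where-Object { $_.MACAddressType -eq 'Dynamic' } |
            ForEach-Object {
                # Setting a static all-zero MAC makes VMM assign one from its MAC address pool
                Set-SCVirtualNetworkAdapter -VirtualNetworkAdapter $_ `
                    -MACAddressType Static -MACAddress '00:00:00:00:00:00'
            }
    }
}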

You will need to create a new Runbook called New-NetworkAdapter with tag SPF, and paste the above code into that runbook.

Also add an SMA Connection Asset, with credentials for connecting to SCVMM.
Name the connection VmmConnection. The script will look for a connection object called VmmConnection and use its Username + Password to connect to the SCVMM Server specified in the same connection object.

And finally, create an Automated Task with this information.

Please let me know if you find this useful, if you have any issues or suggestions on how to improve my script.

List all VM’s with a Dynamic MAC Address

Short, simple script to list all VMs which have NICs with a Dynamic MAC Address set.
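A minimal sketch, assuming the VirtualMachineManager module is loaded and you are connected to the VMM server:

Import-Module VirtualMachineManager

# VMs with at least one adapter that still has a dynamic MAC address
$dynamicVMs = Get-SCVirtualMachine | Where-Object {
    $_.VirtualNetworkAdapters | Where-Object { $_.MACAddressType -eq 'Dynamic' }
}

$dynamicVMs | Select-Object Name, VMHost
"Total: $($dynamicVMs.Count) VMs with at least one NIC using a dynamic MAC address"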

It will give a list of all VMs and the number of VMs in that list.
Small, simple and efficient.

Controlling my intrusion detection system (alarm) via Powershell!

I’ve recently invested in an alarm for my house. After quite a lot of research I finally went for the Siemens SPC 5320 for all the cool features; it feels like one of the most modern alarms out there, as it can be controlled from the web, smartphone applications for iPhone and Android, USB, Ethernet (IP), etc.

It’s more of a high-risk enterprise alarm than a residence alarm, as it has a Grade 3 classification (and can have up to Grade 5), making it usable for banks and other high-risk objects, or to protect my geek lair (man-cave), my son’s hideout, my wife’s gym, the bedroom, the kitchen and other areas, also known as our home.

Another reason I bought that specific alarm is that a Swedish company called Lundix has recently released a Gateway that can talk to the alarm and connect it to other systems, or just execute things when specific triggers happen in the alarm. Like sending a mail (to specific persons) when the alarm goes off, or notifying the parents when a specific person arms or disarms the alarm, i.e. leaves for and gets home from school.
It’s even possible to get notified if that’s done outside normal hours, for example if personnel in a store are late disarming the alarm, or arm it too early…

Summary: The SPC Web Gateway provides a generic, open web interface to Siemens SPC panels. The interface will simplify SPC integration with third-party applications and products such as Home and Building Automation Systems, Smartphone Apps and Web applications. The Web API uses HTTP and REST principles (RESTful) for requests to the SPC panel and WebSocket for reporting events from the SPC panel.

And as the Gateway talks REST and WebSocket, it’s possible to use PowerShell! Look how easy, cool and smooth it is;
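Something like this minimal sketch, assuming the gateway answers on the host name and port below and exposes the area resources under /spc/area (check Lundix’s API documentation for the exact paths):

$gateway = 'http://spcgateway.local:8088'   # assumption: your gateway's address and port

# Read the status of all areas (partitions) in the panel; the JSON reply is converted to objects
$areas = Invoke-RestMethod -Uri "$gateway/spc/area" -Method Get
$areas

# Arm ("set") area 1, assuming commands are exposed as simple PUTs
Invoke-RestMethod -Uri "$gateway/spc/area/1/set" -Method Put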


Next step is to add it to the new-user-creation workflow. So when a new user is created in AD, it will also create a user in the Alarm, generate a random PIN code and include that information in the Welcome Mail and SMS sent to new users.

Or possibly initiate a company-wide erase of confidential data in case of an alarm. Can anyone see that going wrong and causing problems? Especially as I saw some figures stating that more than 90% of all alarms are false, due to user error or indicator faults.

Well, I’ve just started playing around with it, and if there is interest I’ll keep you posted on my progress and the different automation scenarios I set up.

 

List and Remove Corrupted files reported by Data Deduplication with Powershell

I’ve been copying 7TB of data in about 100.000 files from an old fileserver to the new one, but I just noticed that some of the files are corrupted! Gahhh…

Chkdsk found some issues, but didn’t solve the problem. As this server is running Windows Server 2012 R2 with Data Deduplication, I decided to have a look at that.

Yeah, unfortunately there were a lot of corrupted files with EventID 12800.

So Data Deduplication is reporting a lot of corrupted files, and this error message didn’t really make me any happier.

Hopefully the quick and dirty PowerShell script that I just wrote can help you too.
As I still had the old fileserver with working copies of the files available, I decided to simply delete all corrupted files with this script.
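A quick-and-dirty sketch of the idea: read the corruption events from the Deduplication Scrubbing log and delete the files they point at. The log name and the message parsing are assumptions, so check one event manually before trusting the regex, and only do this if you have a known-good copy of the files elsewhere!

$events = Get-WinEvent -FilterHashtable @{
    LogName = 'Microsoft-Windows-Deduplication/Scrubbing'   # assumption: scrubbing log name
    Id      = 12800
}

$corruptFiles = foreach ($event in $events) {
    # Pull the file path out of the event message (format is an assumption)
    if ($event.Message -match '(?m)^File name:\s*(?<path>.+)$') { $Matches['path'].Trim() }
}

$corruptFiles | Sort-Object -Unique | ForEach-Object {
    if (Test-Path -LiteralPath $_) {
        Write-Output "Deleting $_"
        Remove-Item -LiteralPath $_ -Force
    }
}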

And then I ran a robocopy command to recopy everything (it will skip any files that already exist, making it a quite fast process).
robocopy /mir /copyall /r:1 /w:1 \\source\path \\destination\path

Updated 2014-05-22 16:22:  Added a full delete and copy script, which is a bit better written;

 

 

Azure Pack: Failed to load virtual machine templates for subscription …

Problem: When a user logs in to Azure Pack, they sometimes get the error message: “Failed to load virtual machine templates for subscription <subscription ID>”.
And if the user tries to deploy a Virtual Machine, there are no templates to choose from.

Cause: I’m not sure what the real cause is, but it seems to be a bug where Azure Pack forgets that information. The template information is there, it’s just Azure Pack that does not read it.

Workaround: Until this is solved by Microsoft in a hotfix or the next update, you will have to handle this yourself.
As an Administrator you can touch the Plans so they are re-synced, and it will immediately start working again. Or you can schedule a PowerShell script to run regularly, touching the plans.

Here is the PowerShell command I’ve set up for our environment.

First of all, notice that it’s using a file for the password, to make this automatic.
Use this command once to create the password.txt file:
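Something like this will do (the path is just an example, and the resulting file can only be decrypted by the same user on the same machine):

Read-Host -Prompt 'Password' -AsSecureString | ConvertFrom-SecureString | Out-File C:\Scripts\password.txt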

And it’s using the Get-AdfsToken function to get an ADFS Token from our ADFS Server (more info: Get-AdfsToken Function), but you can modify the above script to use a normal Windows Token too, if you’d rather use the Windows Authentication site than ADFS. Then use this command (replace line 17 in the script above with this line):
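Something along these lines; the account, server name and URL are placeholders, and 30072 is the default Admin Authentication Site port:

$password   = Get-Content C:\Scripts\password.txt | ConvertTo-SecureString
$credential = New-Object System.Management.Automation.PSCredential ('CONTOSO\svc-wapadmin', $password)
$token      = Get-MgmtSvcToken -Type Windows -AuthenticationSite 'https://wapadmin.contoso.com:30072' `
                               -ClientRealm 'http://azureservices/AdminSite' -User $credential -DisableCertificateValidation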

 

Get-MgmtSvcToken to get a ADFS Token is broken

Last week I spent hours trying to get Get-MgmtSvcToken to fetch an Admin Token from our ADFS server, without succeeding.

Get-MgmtSvcToken

Creates an identity token.
Syntax

I tried every possible combination with both “-type WindowsADFS” and “-type ADFS” together with various URLs that should have worked, but didn’t.

With the help of @vNiklas and @_marcvaneijk on Twitter, I was pointed to TechNet, where there is a documented bug/error/problem with the Get-MgmtSvcToken command.
By writing this blog post, I hope someone will find it through a search and save themselves some time, as that TechNet article never showed up when I was searching.

Technet Article: Why can’t I get a token with the Get-MgmtSvcToken cmdlet?

And the solution is to use your own function instead like this;
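A hedged sketch of what such a function can look like: it posts a raw WS-Trust 1.3 request to the ADFS usernamemixed endpoint and extracts the JWT. The endpoint path, realm, token type and response parsing are assumptions based on the standard WS-Trust pattern, so compare it with the TechNet article above before relying on it.

function Get-AdfsToken
{
    param (
        [Parameter(Mandatory)][string]$AdfsAddress,              # e.g. https://adfs.contoso.com
        [Parameter(Mandatory)][pscredential]$Credential,
        [string]$AppliesTo = 'http://azureservices/AdminSite'    # relying party realm for the WAP Admin API
    )

    # WS-Trust 1.3 username/password endpoint on the ADFS server
    $endpoint = "$AdfsAddress/adfs/services/trust/13/usernamemixed"
    $username = $Credential.UserName
    $password = $Credential.GetNetworkCredential().Password      # note: embedded as-is, escape XML special characters if needed
    $created  = [DateTime]::UtcNow.ToString('yyyy-MM-ddTHH:mm:ss.fffZ')
    $expires  = [DateTime]::UtcNow.AddMinutes(5).ToString('yyyy-MM-ddTHH:mm:ss.fffZ')

    # RequestSecurityToken asking ADFS to issue a bearer JWT for the given realm
    $envelope = @"
<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope"
            xmlns:a="http://www.w3.org/2005/08/addressing"
            xmlns:u="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">
  <s:Header>
    <a:Action s:mustUnderstand="1">http://docs.oasis-open.org/ws-sx/ws-trust/200512/RST/Issue</a:Action>
    <a:MessageID>urn:uuid:$([guid]::NewGuid())</a:MessageID>
    <a:To s:mustUnderstand="1">$endpoint</a:To>
    <o:Security s:mustUnderstand="1" xmlns:o="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
      <u:Timestamp>
        <u:Created>$created</u:Created>
        <u:Expires>$expires</u:Expires>
      </u:Timestamp>
      <o:UsernameToken>
        <o:Username>$username</o:Username>
        <o:Password>$password</o:Password>
      </o:UsernameToken>
    </o:Security>
  </s:Header>
  <s:Body>
    <trust:RequestSecurityToken xmlns:trust="http://docs.oasis-open.org/ws-sx/ws-trust/200512">
      <wsp:AppliesTo xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy">
        <a:EndpointReference><a:Address>$AppliesTo</a:Address></a:EndpointReference>
      </wsp:AppliesTo>
      <trust:KeyType>http://docs.oasis-open.org/ws-sx/ws-trust/200512/Bearer</trust:KeyType>
      <trust:RequestType>http://docs.oasis-open.org/ws-sx/ws-trust/200512/Issue</trust:RequestType>
      <trust:TokenType>urn:ietf:params:oauth:token-type:jwt</trust:TokenType>
    </trust:RequestSecurityToken>
  </s:Body>
</s:Envelope>
"@

    $response = Invoke-WebRequest -Uri $endpoint -Method Post -Body $envelope -ContentType 'application/soap+xml; charset=utf-8' -UseBasicParsing

    # The JWT is returned base64-encoded in a BinarySecurityToken element (the prefix may differ in your ADFS version)
    if ($response.Content -match '<wsse:BinarySecurityToken[^>]*>([^<]+)</wsse:BinarySecurityToken>') {
        return [System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($Matches[1]))
    }
    throw 'No BinarySecurityToken found in the ADFS response.'
}

Call it with something like $token = Get-AdfsToken -AdfsAddress 'https://adfs.yourdomain.com' -Credential (Get-Credential) and pass the result as -Token to the MgmtSvc cmdlets.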

 

Script for importing existing VMs into Azure Pack

As you start working with Azure Pack, you probably realize that you have a lot of existing VMs that you would like to import into Azure Pack, so that you can handle them just like all the new ones.

All that’s needed for that is to set the correct Azure Pack user as the Owner and Self-Service User on the Virtual Machine. And of course, have the machine in the correct “Cloud”.

Here is a small script which will help you out by:

  1. Asking, in a Grid View, which Cloud you would like to import a machine from.
  2. Asking which user should be the new owner of this VM.
  3. Letting you pick which VM from the Cloud you would like to import.

As we have multiple clouds, and users can have multiple subscriptions, I chose to make the script use GridView to minimize the risk of human errors (typos).
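A hedged sketch of what that could look like; it assumes the VirtualMachineManager module, that you pick the tenant’s self-service user role from a grid as well, and that Set-SCVirtualMachine accepts -Owner and -UserRole in your SCVMM version:

Import-Module VirtualMachineManager

# 1. Pick the Cloud
$cloud = Get-SCCloud | Out-GridView -Title 'Select Cloud' -PassThru

# 2. Pick the user role (the tenant's subscription, usually named after the subscription ID)
$userRole = Get-SCUserRole | Out-GridView -Title 'Select User Role (tenant subscription)' -PassThru

# 3. Pick the VM in that Cloud to import
$vm = Get-SCVirtualMachine -Cloud $cloud | Out-GridView -Title 'Select VM to import' -PassThru

# The owner should be the Azure Pack user's sign-in name, e.g. user@truesec.com
$owner = Read-Host 'New owner (Azure Pack user name)'

Set-SCVirtualMachine -VM $vm -Owner $owner -UserRole $userRole | Out-Null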

 

Azure Pack: change Web Sites Default Domain DNS Suffix

To change the Default URL (DNS Suffix) for your Web Sites in Windows Azure Pack, follow these simple steps;

On the computer that is hosting the Web Sites Controller, run the following Windows PowerShell command:

Update (2014-07-14): It looks like the command above does not support -DnsSuffix anymore, but one of my readers has posted an alternative solution in the comments;

As an alternative you can use the following approach:

Check the change by using

And after that you’ll also have to do this:
On your SQL Server, open Management Studio.
In the Hosting database, in the admin.WebSystems table, change
PublishingDns, FtpDns, and Subdomain to your desired URLs.
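If you prefer to script that part too, a hedged example with Invoke-Sqlcmd could look like this (instance name, database name and the new values are assumptions; look at the current rows first and adjust the values to what you actually want):

# Look at the current values first
Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' -Database 'Hosting' -Query 'SELECT PublishingDns, FtpDns, Subdomain FROM admin.WebSystems'

# Then update them to the new suffix
$suffix = 'cloudsites.contoso.com'    # assumption: your new DNS suffix
Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' -Database 'Hosting' -Query @"
UPDATE admin.WebSystems
SET PublishingDns = 'publish.$suffix',
    FtpDns        = 'ftp.$suffix',
    Subdomain     = '$suffix'
"@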

Restart your Azure Pack servers to make the changes apply everywhere.