Powershell: Get Content Faster with ReadCount!

by Ryan 3. April 2013 13:41

Do you use Powershell?  Do you use Get-Content in Powershell to read files?  Do you sometimes work with large text files?

If you answered yes to any of the questions above, then read on - this post is for you!

I have a very simple tip that I used today in a script I was writing.  Thought I'd share.

Let's say you have a large text file, such as a packet log from a DNS server that you're debugging.  It might be 300 megabytes and millions of lines.  I was writing a script to parse the file and collect some statistics that I was after.

$LogFile = Get-Content $FileName
ForEach($Line In $LogFile)
{
    # Parse each line and collect statistics...
}

When I ran this script against a 52MB text file, the script executed in about 22 seconds.  When I ran the script on a 150MB text file, Powershell proceeded to consume over 3GB of RAM within a few seconds, the script never finished, and after bringing my laptop (Win7 x64, 4GB RAM, 4CPU, PS v3, .NET 4.5) to a crawl for about 5 minutes, Powershell just gave up and returned to the prompt without outputting anything.  I guess it was some sort of memory leak.  But come on... a 150MB file is not even that big...

So I started looking through the help for Get-Content, and it turns out there's an easy workaround:

$LogFile = Get-Content $FileName -ReadCount 0
ForEach($Line In $LogFile)
{
    # Parse each line and collect statistics...
}

The -ReadCount parameter specifies how many lines of content are sent through the pipeline at a time. The default is 1. A value of 0 sends all of the content through at one time.

Now when I run the script against the 52MB file, it completes in 2.8 seconds, and when I run it on the 150MB text file, it finishes in 10.2 seconds!
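If you want to see the difference on your own machine, you can time both approaches with Measure-Command. (The log file path here is hypothetical; point it at any large text file.)

```powershell
# Compare the default line-by-line Get-Content with -ReadCount 0.
# C:\dns.log is a hypothetical large text file.
$Slow = Measure-Command { $LogFile = Get-Content C:\dns.log }
$Fast = Measure-Command { $LogFile = Get-Content C:\dns.log -ReadCount 0 }
"Default     : $($Slow.TotalSeconds) seconds"
"ReadCount 0 : $($Fast.TotalSeconds) seconds"
```

Either way, $LogFile ends up holding the lines of the file, so the rest of your script doesn't have to change.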




by Ryan 31. March 2013 12:39

I was discussing with some fellow IT admins the topic of blocking certain websites so that employees or students couldn't access them from the work or school network.  This is a pretty common topic for IT in most workplaces.  However, I personally don't want to be involved in it.  I realize that at some places, like schools for instance, filtering of some websites may be a legal or policy requirement.  But at the workplace, if an employee wants to waste company time on espn.com, that is an issue for HR and management to take up with that employee.  And again in my opinion, it's not about how much time an employee spends on ESPN or Reddit either, but simply whether that employee delivers satisfactory results.  I don't want to handle a people problem with a technical solution.  I don't want to be the IT guy that derives secret pleasure from blocking everyone from looking up their fantasy football scores.  (Or whatever it is people do on espn.com.)  I could spend my entire career until I retire working on a web proxy, blocking each and every new porn site that pops up.  If there's one thing the internet has taught me, it's that there will always be an infinite number of new porn sites.

At the other extreme from blacklisting, someone then suggested whitelisting.  Specifically, implementing "DNS whitelisting" in their environment, restricting the internet sites users were allowed to access to just a handful.  Well, that is a terrible idea.  The only proper way of doing this, in my opinion, is to use a real web proxy, such as ISA or TMG or Squid.  But I could not help but imagine how I might implement such a system, and then how I might go about circumventing it from the perspective of a user.

OK, well for my first half-baked idea, I can imagine standing up a DNS server, disabling recursion/forwarders on that DNS server, and putting my "white list" of records on that DNS server.  Then, by way of firewall, block all port 53 access to any other IP except my special DNS server.  Congratulations, you just made your users miserable, and have done almost nothing to actually improve the security of your network or prevent people from accessing other sites.  Now the users just have to find another way of acquiring IP addresses for sites that aren't on your white list.

Well how do I get name resolution back if I can't use my DNS server?  I have an idea... DNS over HTTP!

The guys at StatDNS have already thought about this.  And what's awesome is that they've created a web API for resolving names to IPs over HTTP.  Here's what I did in 5 minutes of Powershell:

PS C:\> Function Get-ARecordOverHTTP([string]$Query) { $($($(Invoke-WebRequest http://api.statdns.com/$Query/a).Content | ConvertFrom-Json).Answer).rdata }

PS C:\> Get-ARecordOverHTTP google.com

PS C:\> Get-ARecordOverHTTP myotherpcisacloud.com

Simple as that. How cool is Powershell, seriously?  One line to create a function that accepts a name and returns a list of IPs by interacting with an internet web service.  Pretty awesome if you ask me.
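Since the record type is just the last segment of the URI, you could generalize the function to ask for other record types too.  (The assumption here, based on how the A-record endpoint works, is that StatDNS exposes other types like aaaa, mx, and txt the same way.)

```powershell
# Generalized version of the one-liner above: record type is a second parameter.
# Assumes StatDNS serves other record types at http://api.statdns.com/<name>/<type>.
Function Get-RecordOverHTTP([string]$Query, [string]$Type = "a")
{
    $Response = Invoke-WebRequest "http://api.statdns.com/$Query/$Type"
    ($Response.Content | ConvertFrom-Json).Answer.rdata
}

# Usage:
# Get-RecordOverHTTP google.com mx
```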

As long as you have port 80 open to StatDNS, you have internet name resolution.  Now, to wrap this into a .NET-based Windows service...

Why Does It Say <Unknown Contact> When Viewing Network Share Permissions?

by Ryan 31. March 2013 10:15

I only get to work with Active Directory trusts every so often. Multi-domain forests seem to be falling out of fashion, at least for those of us who've heard of federation. Regardless, here's an interesting issue I ran into the other day:

So I have this domain, contoso.com.  In contoso.com, there is a one-way forest trust established with fabrikam.com, such that contoso.com trusts fabrikam.com.  This way, users in fabrikam.com can access resources in contoso.com.

On pc1.contoso.com, I create a network file share. I add permissions for my own user account, contoso\ryan, to it.  Then, I add permissions for fabrikam\steve to access the file share also.  The operation was successful and the file share appears to be set correctly.  However, when I go back and view the permissions for the file share again, it takes a very long time, as if it were waiting on something to time out, and then eventually, this is what I see:

File Share Permissions

So why is fabrikam\steve showing up as <Unknown Contact> when viewed from the contoso.com domain?  What we have here is a SID translation failure.  But why?  First, a little background.  Here is what Microsoft says about users in your forest who are members of another forest:

"When a trust is established between a domain in a forest and a domain outside of that forest, security principals from the external domain can access resources in the internal domain. Active Directory creates a foreign security principal object in the internal domain to represent each security principal from the trusted external domain. These foreign security principals can become members of domain local groups in the internal domain. Directory objects for foreign security principals are created by Active Directory and should not be manually modified. You can view foreign security principal objects from Active Directory Users and Computers by enabling advanced features." [ Source ]

So if you go look in the ForeignSecurityPrincipals container in contoso.com, you'll see an object that represents the user account of fabrikam\steve, but his friendly name or samAccountName is not part of that record. It's just a SID.  When we pull up a permissions file dialog box like the one above, Windows attempts a SID to name translation... but it fails.  There's a little bit of technical documentation on how SID translation occurs:

"LSA on the computer that the call is sent to (using the LSA RPC interface) will resolve the SIDs it can map and send on the remaining unresolved SIDs to a domain controller in the primary domain. The domain controller will resolve additional SIDs to account names from the local database, including SIDs found in SidHistory on a global catalog.

If SIDs cannot be resolved there, the domain controller will send remaining SIDs to domain controllers in a trusted domain where the domain part of the SID matches the trust information." [ Source ]

There are many functions regarding SID lookups, and I don't know exactly which ones are used at each location, but the general concept is the same, and you can see how this procedure could take a while to time out. And when it fails, you see <Unknown Contact>.  Or maybe just the unresolved SID of the security principal.  It depends on which version of Windows you're using and exactly which dialog box you're looking at.

The reason it happens in our scenario is because of the one-way trust. Contoso.com cannot call upon fabrikam to translate SIDs from its forest, because fabrikam does not trust contoso.
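You can watch the same SID-to-name translation the ACL editor performs from Powershell, using the .NET SecurityIdentifier class.  (The SID below is the well-known SID for the local Administrators group, so this one always resolves; a SID from an untrusting forest would throw an IdentityNotMappedException instead.)

```powershell
# Translate a SID to an account name - the operation that fails in the scenario above.
# S-1-5-32-544 is the well-known SID for BUILTIN\Administrators.
$SID = New-Object System.Security.Principal.SecurityIdentifier("S-1-5-32-544")
$SID.Translate([System.Security.Principal.NTAccount])
```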

To fix it, we could allow anonymous SID translation in fabrikam... but for many that is an unacceptable security risk.  Or we could make the trust two-way.  Or, if you're unable to do either of those things, you could at least create a security group in contoso, add the individuals from fabrikam to that group, and just assign the group to the network share ACL.  The functionality would be the same but at least you wouldn't have to look at "<Unknown Contact>" every time you opened that dialog box.

Mystery solved.

For more information on this, see the ServerFault question that I answered here, as well as the much better Ask the Directory Services team blog post here.

Excuse Me Sir, You Got Your Rally In My Powershell

by Ryan 28. March 2013 18:04

If you work in the tech industry, especially in the area of software development, you might use or at least be familiar with Rally. It's a web-based project management tool that follows the principles of the Agile development lifecycle.  I've even seen Rally used to organize and track projects that were not actually software development projects.  I've seen this because I do not specifically work in the area of software development, yet I use Rally anyway.

I don't really care much for its web interface though. In my opinion, it often takes way more clicking around in the GUI than it should. Sometimes the GUI isn't intuitive and you end up making mistakes like deleting an entire User Story when you meant to just delete a single Task.

So I started thinking to myself.  "Self," I said. "I wonder if Rally has a REST API? If they did, I could interface with it in Powershell and automate a lot of simple tasks."

Well lo and behold they sure do.  Let's see how freaking simple this is with Powershell:

$Rally = New-WebServiceProxy https://rally1.rallydev.com/slm/webservice/1.41/meta/132645/rally.wsdl -Credential (Get-Credential) -Namespace Rally

And just like that, we're connected to Rally.  To verify, you could check the currently logged on user:


Now, let's pull some data from Rally. Let's retrieve, say, my 20 most recently-created Tasks in Rally:

PS C:\> $Query = $Rally.Query($null, "Task", "(Owner.Name = `"ryan@domain.com`")", "CreationDate desc", $True, 1, 20)
PS C:\> $Query.Results.Name

Write some code
Write some more code
Fix this defect
Fix that defect
Fix all the defects
Save the company from bankruptcy
Reconcile marriage
Catch the guy who keeps running the copier out of paper and not refilling it

Playing with REST services and web services is so delightfully easy with Powershell.

Examining Internet Explorer Tracking Protection Lists (with Powershell)

by Ryan 22. March 2013 16:12

I know that Firefox and Chrome are still the only browsers that most people will ever even consider using, but ever since I started using Windows 8, I've not yet really felt a compelling reason to stop using the built-in IE10. It's fast, it passes Acid tests, and has a nice "Developer Mode" built-in where you can change IE and rendering versions on the fly (hit F12 on your keyboard.) That, and Compatibility Mode, which I think is really under-rated in a corporate environment, where employees are forced to use antiquated corporate websites that were built for IE5.

About the only thing I miss from Chrome is my sweet, sweet AdBlock. Does IE have anything that can compare? Well, it has Tracking Protection Lists, which is a good start:

IE Tracking Protection Lists

IE Tracking Protection Lists

The Personalized List kinda' sucks because you only have two choices - either block everything that IE detects as a Javascript-type thing that pulls info from other domains and is frequently seen in similar pages, or, you have to wait for IE to detect the script 10 times or so, then you have to go back in and manually choose to block it. 

Personalized Tracking List

Of course Google is going to be the number one offender here, since that's how they make money: by shoving ads in your face around every corner and making you want to buy stuff.  Honestly, I sympathize with online advertisers up to a certain point, because I believe the internet would not have as much rich, free content as it does if internet advertisers were unable to make money by providing free services.  I mean, Google doesn't keep Youtube up just because they love paying those network bandwidth bills so much. But the ads can quickly get simply too obnoxious, and we users need a way to filter all that junk that's flashed before our eyes.  Especially the Google ones that break Internet Explorer's back button.  It's Google's code that decides to open 4 instances of some doubleclick.net resource in a way that causes you to have to hit the Back button in your browser 5 times to get back to where you were.  (Seen, for example, on legalinsurrection.com.)

The flip-side of the coin is that if you block too much, certain websites will stop working properly. So you need to find a happy medium. Cue Tracking Protection Lists.  Just like AdBlock, Tracking Lists are designed to block just the web content that's been judged more harmful or annoying than good.  That way we can keep using our Gmail, Youtube, StackExchange, etc., without interruption, while still cutting out a good chunk of the ads.

You can download new Tracking Protection Lists online, right from within the tracking lists dialog box. Just click the "Get A Tracking Protection List Online..." link. It will take you to a website.  Notice in the screenshot above, that I have already added the "EasyList" TPL, from the same people that do AdBlock.  Also notice that the Tracking Protection List is simply an HTTP URI to a *.tpl text file.  And you know what that means... that means we can play with it from Powershell!

Say you wanted to examine a TPL and see if it contained a certain domain name. Well first, let's download the TPL and store it to a variable:

PS C:\> $list = Invoke-WebRequest http://easylist-msie.adblockplus.org/easylist.tpl

Now the content of the tracking list will be in $list.Content. These are the web resources that the list will block.  You can go here to see the syntax of TPLs. Warning: There will be distasteful words in this list... as the list is designed to block distasteful content.

Alright, so what if we want to know whether this TPL will block content from the domain streamcloud.eu? First, let's break $list.Content up into lines by splitting it on newlines:

PS C:\> $list.Content.Count
PS C:\> $Content = $list.Content.Split("`r`n")
PS C:\> $Content.Count

After splitting the content element of the web request on newlines, we can see that the TPL contains 10,203 lines. I first thought to split on [String]::NewLine, but that did not yield correct results. (Dat character encoding!) Now, keeping in mind that lines that start with a # are comments, let's see if we can find entries that contain streamcloud.eu:

PS C:\> foreach($_ in $Content) { If(!$_.StartsWith('#') -and $_.Contains("streamcloud.eu")) { $_ } }
-d streamcloud.eu /deliver.php
+d streamcloud.eu

So from this output, it appears that we are allowing streamcloud.eu, but we are specifically blocking any document named deliver.php coming from streamcloud.eu.  This could have also been written as:

PS C:\> foreach($_ in $Content) { If(!$_.StartsWith('#') -and $_ -match "streamcloud.eu") { $_ } }

But a lot of times I just naturally prefer the C# parlance. The good thing about Powershell is that you're free to mix and match.
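For completeness, the same filter also works as a pipeline one-liner with Where-Object, if you'd rather stay in pure Powershell parlance:

```powershell
# Pipeline equivalent of the foreach loop above.
$Content | Where-Object { -not $_.StartsWith('#') -and $_ -match 'streamcloud\.eu' }
```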

You can certainly elaborate on the concept I've started on above, and I hope you will.  Until next time!

Encrypt-File and Decrypt-File using X.509 Certificates

by Ryan 11. March 2013 16:36

I like encrypting stuff.  Here are two Powershell functions I whipped up that can encrypt and decrypt a file using a valid key pair from an X.509 certificate. The encryption uses the public key, but only the corresponding private key can decrypt the data.


<#
.SYNOPSIS
This Powershell function encrypts a file using a given X.509 certificate public key.
.DESCRIPTION
This function accepts as inputs a file to encrypt and a certificate with which to encrypt it.
This function saves the encrypted file as *.encrypted. The file can only be decrypted with the private key of the certificate that was used to encrypt it.
You must use a certificate that can be used for encryption, and not something like a code signing certificate.
.PARAMETER FileToEncrypt
Must be a System.IO.FileInfo object. $(Get-ChildItem C:\file.txt) will work.
.PARAMETER Cert
Must be a System.Security.Cryptography.X509Certificates.X509Certificate2 object. $(Get-ChildItem Cert:\CurrentUser\My\9554F368FEA619A655A1D49408FC13C3E0D60E11) will work. The public key of the certificate is used for encryption.
.EXAMPLE
PS C:\> . .\Encrypt-File.ps1
PS C:\> Encrypt-File $File $Cert
.EXAMPLE
PS C:\> . .\Encrypt-File.ps1
PS C:\> Encrypt-File $(Get-ChildItem C:\foo.txt) $(Get-ChildItem Cert:\CurrentUser\My\THUMBPRINT)
.INPUTS
Encrypt-File <System.IO.FileInfo> <System.Security.Cryptography.X509Certificates.X509Certificate2>
.OUTPUTS
A file named $FileName.encrypted
.NOTES
Written by Ryan Ries - ryan@myotherpcisacloud.com
#>

Function Encrypt-File
{
	[CmdletBinding()]
	Param([Parameter(Mandatory=$True, Position=0)][System.IO.FileInfo]$FileToEncrypt,
	      [Parameter(Mandatory=$True, Position=1)][System.Security.Cryptography.X509Certificates.X509Certificate2]$Cert)
	Try { [System.Reflection.Assembly]::LoadWithPartialName("System.Security.Cryptography") | Out-Null }
	Catch { Write-Error "Could not load required assembly."; Return }
	$AesProvider                = New-Object System.Security.Cryptography.AesManaged
	$AesProvider.KeySize        = 256
	$AesProvider.BlockSize      = 128
	$AesProvider.Mode           = [System.Security.Cryptography.CipherMode]::CBC
	$KeyFormatter               = New-Object System.Security.Cryptography.RSAPKCS1KeyExchangeFormatter($Cert.PublicKey.Key)
	[Byte[]]$KeyEncrypted       = $KeyFormatter.CreateKeyExchange($AesProvider.Key, $AesProvider.GetType())
	[Byte[]]$LenKey             = $Null
	[Byte[]]$LenIV              = $Null
	[Int]$LKey                  = $KeyEncrypted.Length
	$LenKey                     = [System.BitConverter]::GetBytes($LKey)
	[Int]$LIV                   = $AesProvider.IV.Length
	$LenIV                      = [System.BitConverter]::GetBytes($LIV)
	Try { $FileStreamWriter = New-Object System.IO.FileStream("$($FileToEncrypt.FullName)`.encrypted", [System.IO.FileMode]::Create) }
	Catch { Write-Error "Unable to open output file for writing."; Return }
	# File layout: [4-byte key length][4-byte IV length][encrypted AES key][IV][ciphertext]
	$FileStreamWriter.Write($LenKey,         0, 4)
	$FileStreamWriter.Write($LenIV,          0, 4)
	$FileStreamWriter.Write($KeyEncrypted,   0, $LKey)
	$FileStreamWriter.Write($AesProvider.IV, 0, $LIV)
	$Transform                  = $AesProvider.CreateEncryptor()
	$CryptoStream               = New-Object System.Security.Cryptography.CryptoStream($FileStreamWriter, $Transform, [System.Security.Cryptography.CryptoStreamMode]::Write)
	[Int]$Count                 = 0
	[Int]$Offset                = 0
	[Int]$BlockSizeBytes        = $AesProvider.BlockSize / 8
	[Byte[]]$Data               = New-Object Byte[] $BlockSizeBytes
	[Int]$BytesRead             = 0
	Try { $FileStreamReader     = New-Object System.IO.FileStream("$($FileToEncrypt.FullName)", [System.IO.FileMode]::Open) }
	Catch { Write-Error "Unable to open input file for reading."; Return }
	Do
	{
		$Count   = $FileStreamReader.Read($Data, 0, $BlockSizeBytes)
		$Offset += $Count
		$CryptoStream.Write($Data, 0, $Count)
		$BytesRead += $BlockSizeBytes
	}
	While ($Count -gt 0)
	$CryptoStream.Close()
	$FileStreamReader.Close()
	$FileStreamWriter.Close()
}


<#
.SYNOPSIS
This Powershell function decrypts a file using a given X.509 certificate private key.
.DESCRIPTION
This function accepts as inputs a file to decrypt and a certificate with which to decrypt it.
The file can only be decrypted with the private key of the certificate that was used to encrypt it.
.PARAMETER FileToDecrypt
Must be a System.IO.FileInfo object. $(Get-ChildItem C:\file.txt.encrypted) will work.
.PARAMETER Cert
Must be a System.Security.Cryptography.X509Certificates.X509Certificate2 object. $(Get-ChildItem Cert:\CurrentUser\My\9554F368FEA619A655A1D49408FC13C3E0D60E11) will work. The public key of the certificate is used for encryption. The private key is used for decryption.
.EXAMPLE
PS C:\> . .\Decrypt-File.ps1
PS C:\> Decrypt-File $File $Cert
.EXAMPLE
PS C:\> . .\Decrypt-File.ps1
PS C:\> Decrypt-File $(Get-ChildItem C:\foo.txt.encrypted) $(Get-ChildItem Cert:\CurrentUser\My\THUMBPRINT)
.INPUTS
Decrypt-File <System.IO.FileInfo> <System.Security.Cryptography.X509Certificates.X509Certificate2>
.OUTPUTS
An unencrypted file.
.NOTES
Written by Ryan Ries - ryan@myotherpcisacloud.com
#>

Function Decrypt-File
{
	[CmdletBinding()]
	Param([Parameter(Mandatory=$True, Position=0)][System.IO.FileInfo]$FileToDecrypt,
	      [Parameter(Mandatory=$True, Position=1)][System.Security.Cryptography.X509Certificates.X509Certificate2]$Cert)
	Try { [System.Reflection.Assembly]::LoadWithPartialName("System.Security.Cryptography") | Out-Null }
	Catch { Write-Error "Could not load required assembly."; Return }
	$AesProvider                = New-Object System.Security.Cryptography.AesManaged
	$AesProvider.KeySize        = 256
	$AesProvider.BlockSize      = 128
	$AesProvider.Mode           = [System.Security.Cryptography.CipherMode]::CBC
	[Byte[]]$LenKey             = New-Object Byte[] 4
	[Byte[]]$LenIV              = New-Object Byte[] 4
	If($FileToDecrypt.Name.Split(".")[-1] -ne "encrypted")
	{
		Write-Error "The file to decrypt must be named *.encrypted."
		Return
	}
	If($Cert.HasPrivateKey -eq $False)
	{
		Write-Error "The supplied certificate does not contain a private key, or it could not be accessed."
		Return
	}
	Try { $FileStreamReader = New-Object System.IO.FileStream("$($FileToDecrypt.FullName)", [System.IO.FileMode]::Open) }
	Catch { Write-Error "Unable to open input file for reading."; Return }
	# File layout: [4-byte key length][4-byte IV length][encrypted AES key][IV][ciphertext]
	$FileStreamReader.Seek(0, [System.IO.SeekOrigin]::Begin)         | Out-Null
	$FileStreamReader.Read($LenKey, 0, 4)                            | Out-Null
	$FileStreamReader.Seek(4, [System.IO.SeekOrigin]::Begin)         | Out-Null
	$FileStreamReader.Read($LenIV,  0, 4)                            | Out-Null
	[Int]$LKey            = [System.BitConverter]::ToInt32($LenKey, 0)
	[Int]$LIV             = [System.BitConverter]::ToInt32($LenIV,  0)
	[Int]$StartC          = $LKey + $LIV + 8
	[Int]$LenC            = [Int]$FileStreamReader.Length - $StartC
	[Byte[]]$KeyEncrypted = New-Object Byte[] $LKey
	[Byte[]]$IV           = New-Object Byte[] $LIV
	$FileStreamReader.Seek(8, [System.IO.SeekOrigin]::Begin)         | Out-Null
	$FileStreamReader.Read($KeyEncrypted, 0, $LKey)                  | Out-Null
	$FileStreamReader.Seek(8 + $LKey, [System.IO.SeekOrigin]::Begin) | Out-Null
	$FileStreamReader.Read($IV, 0, $LIV)                             | Out-Null
	[Byte[]]$KeyDecrypted = $Cert.PrivateKey.Decrypt($KeyEncrypted, $False)
	$Transform = $AesProvider.CreateDecryptor($KeyDecrypted, $IV)
	Try { $FileStreamWriter = New-Object System.IO.FileStream("$($FileToDecrypt.Directory)\$($FileToDecrypt.Name.Replace(".encrypted",$Null))", [System.IO.FileMode]::Create) }
	Catch { Write-Error "Unable to open output file for writing.`n$($_.Exception.Message)"; Return }
	[Int]$Count  = 0
	[Int]$Offset = 0
	[Int]$BlockSizeBytes = $AesProvider.BlockSize / 8
	[Byte[]]$Data = New-Object Byte[] $BlockSizeBytes
	$CryptoStream = New-Object System.Security.Cryptography.CryptoStream($FileStreamWriter, $Transform, [System.Security.Cryptography.CryptoStreamMode]::Write)
	Do
	{
		$Count   = $FileStreamReader.Read($Data, 0, $BlockSizeBytes)
		$Offset += $Count
		$CryptoStream.Write($Data, 0, $Count)
	}
	While ($Count -gt 0)
	$CryptoStream.Close()
	$FileStreamReader.Close()
	$FileStreamWriter.Close()
}

Signing Powershell Scripts

by Ryan 8. March 2013 13:19

Hi guys,
Something you might know about me is that I think Powershell is just about the best thing to ever happen to Windows Server.  And even if you don't agree... well, you don't really have a choice because there are already tasks that must be done in Powershell and there is no GUI alternative, and that trend will only continue.
Powershell will become part of your life if you plan on continuing to work with Windows Server. PS is very powerful, and it makes many tasks much faster and more automatable.  But with that power, comes the side-effect that PS also makes many nefarious activities faster and easier as well.  An attacker has many more tools and methods (no pun intended) available to him or her if they have unrestricted access to Powershell on your server. You could restrict all Powershell usage on your servers, but that means losing an extremely valuable tool for administrators. There must be a compromise...

One effective way to mitigate this risk is to digitally sign your Powershell scripts with a certificate. By setting the execution policy of PS scripts on your servers to AllSigned, Powershell will not run scripts unless it can successfully validate the signature that is in the script.  That also means that the script cannot be modified after being signed, as the signature will no longer match the modified script.  This is much better than the RemoteSigned execution policy, which only mandates that scripts downloaded through an Attachment Execution Service-compliant application need to be signed, which is extremely easy to bypass. (e.g. just clear the alternate NTFS data stream.)
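For reference, flipping a machine over to the stricter policy is a one-liner (run it from an elevated prompt; it can also be pushed out domain-wide via Group Policy):

```powershell
# Require valid Authenticode signatures on all scripts, for everyone on this machine.
Set-ExecutionPolicy AllSigned -Scope LocalMachine
```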
Good news is that code signing, especially in Powershell, is actually pretty easy to do.
First, as an administrator of an Active Directory domain with an Enterprise Certificate Authority, you need to make a code signing certificate for your users to use.  On your CA, open the Certificate Templates MMC snap-in.  You can also get there in the CA snap-in by right-clicking on Certificate Templates and choosing Manage.

Cert Templates

Duplicate the Code Signing template, so that you don't modify the original template.  Configure all the properties of this template as desired, including allowing Authenticated Users (or whichever users you desire) the ability to enroll in your new certificate template.

Cert Templates

I very creatively named my template "Code Signing Template v1."  Now go back to your Certificate Authority snap-in, and choose New -> Certificate Template to Issue.  Choose your new code signing cert.

Cert Templates

Keep in mind that this information is being uploaded to Active Directory, so make sure AD replication has taken effect before wondering why the certificate isn't already available at your location as soon as you issue it.

Now, on your workstation, where you are logged in as a regular user, you want to enroll for one of those new code signing certs.  You could do this in the Certificates MMC snap-in:

Cert Enrollment

But, since we like to save time, let's go ahead and instead submit the certificate request via Powershell:
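On Powershell 3.0, the PKI module's Get-Certificate cmdlet can submit the enrollment request against your AD-integrated template.  The template name below is just the one from my example above, with the spaces removed:

```powershell
# Request a cert from the new template and drop it in the current user's store.
Get-Certificate -Template CodeSigningTemplatev1 -CertStoreLocation Cert:\CurrentUser\My
```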


You have to take the spaces out of the certificate template name.  You'll know the certificate request was successful if the status is Issued.  You'll also notice that the public key of your new code signing cert is now published in your Active Directory user account.  This is so everyone else in the domain can retrieve your public key and validate that you did, in fact, sign that exact Powershell script, and thus figure that it's safe to run.


Signing a Powershell script with your new code signing certificate is simple:

Powershell script sign

Just Set-AuthenticodeSignature $script $certificate, and it's signed.  The signature is actually a block of cipher-text that appears at the bottom of your newly signed script.
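Spelled out, and assuming the cert you just enrolled for is the only code signing cert in your personal store (the script path is hypothetical):

```powershell
# Grab the code signing cert from the current user's store and sign the script.
$Cert = (Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert)[0]
Set-AuthenticodeSignature -FilePath .\MyScript.ps1 -Certificate $Cert
```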


Script Signed

Your identity is also encoded in those comments, but it is not encrypted, so Powershell can tell who signed the script right off the bat.  Then, Powershell retrieves the public key of that identity (from AD, for example).  The script's hash was signed using your private key, which you do not share, and that signature can only be verified with your public key, which you do share.  So if Powershell is able to verify the signature with your public key, it can be reasonably assured that the script was signed by you.  If the script changes by even a single character, its hash changes and the signature no longer matches, so a modified script will not run unless it is re-signed by another authorized user, and we would be able to tell who that was.
Alright, that's enough out of me.  Hopefully I've been able to at least pique your interest in the potential security benefits of code signing!

(And sorry for all the cheesy pixellation.)

Neat Windows Tricks (Or Back When I Was Young and Foolish Pt. III)

by Ryan 5. March 2013 18:29

I work with Windows a lot. Almost every day of my life since Windows 3.1, I've been submersed in Windows for both work and play. Getting to know it inside and out. Learning new Windows applications. Being excited for new releases of Windows because I know it'll bring big changes to the operating system which will make me have to learn new things. Every now and then, I even start to think I'm pretty knowledgeable about Windows...

Which is why I'm always flabbergasted when someone nonchalantly shows me a relatively mundane Windows trick I never knew about, yet it's been under my nose the whole time!  And so, I bring you two Windows tricks that I learned about today. If you already knew about them, you can just think of me as an amateur and move on.  But if you didn't already know about them, they might just change the way you do your daily work!

C:\> SomeCommand | clip

 I never knew about this! On the command line, you can pipe the output of any program or command to your Windows clipboard and Ctrl+V it anywhere!


You can also do something like:

C:\> clip < textFile.txt

to quickly copy the contents of the file to your clipboard.

Alright, trick #2:

Shift + Right-Click to Run as Different User

This one was downright embarrassing for me to have not known. Maybe I did know it at one time, but then I used it so infrequently that I forgot about it... I don't know. Back in the XP/2003 days, you could right-click on an application and choose "Run as..." which would prompt you for the credentials of another user under which to run that process. But then from Vista onwards it disappeared. Sure, there's still "Run as administrator," but sometimes you need to run a process as someone else besides Administrator. Well, I always just got around it by launching a command prompt and doing something like runas /user:BobMarley@domain.com notepad.exe.

But I completely forgot that it's still there in the GUI.  All you have to do is Shift+Right-click:


Building a Windows Server That Boots With No Errors (Pt. 1 of NaN)

by Ryan 28. February 2013 17:33

One of my favorite pastimes (hey, don't judge me) is properly configuring my Windows Servers so that they complete a cold boot and user logon without a single error in the Application or System event logs.  Even if the errors that are logged don't seem to have any impact on the system, I still don't want to see them.  Maybe some people don't care that errors are being generated by Windows during bootup, as long as the server still "works fine," but I do.  This might sound silly to you at first, but take Windows 2008 R2 SP1 as an example.  Right off the shelf, freshly installed with no modifications or installed applications whatsoever, it is unable to boot and log on a user without logging an error in the event viewer.  Windows usually requires at least a little bit of jiggering out of the box before it boots with no errors in the event logs.  And that's before you even begin installing applications and changing configurations into a virtually infinite number of permutations that could cause errors during system boot.

And that brings me to #1:

WMI Error 10

Microsoft left this little bit of errant code buried deep within the bowels of their Service Pack 1 update for Windows 2008 R2 and Windows 7.  You'll see it once every time you boot the machine.  Luckily, it's a snap to fix.  Just save this VBScript as a *.vbs file and run it.  All it does is remove the vestigial event registration that causes the error:

strComputer = "."
Set objWMIService = GetObject("winmgmts:" _
& "{impersonationLevel=impersonate}!\\" _
& strComputer & "\root\subscription")

Set obj1 = objWMIService.ExecQuery("select * from __eventfilter where name='BVTFilter' and query='SELECT * FROM __InstanceModificationEvent WITHIN 60 WHERE TargetInstance ISA ""Win32_Processor"" AND TargetInstance.LoadPercentage > 99'")

' Delete the consumer bindings, the consumers, and finally the filter itself.
For Each obj1elem in obj1
  Set obj2set = obj1elem.Associators_("__FilterToConsumerBinding")
  Set obj3set = obj1elem.References_("__FilterToConsumerBinding")

  For Each obj2 in obj2set
    WScript.Echo "Deleting the object"
    WScript.Echo obj2.GetObjectText_
    obj2.Delete_
  Next

  For Each obj3 in obj3set
    WScript.Echo "Deleting the object"
    WScript.Echo obj3.GetObjectText_
    obj3.Delete_
  Next

  WScript.Echo "Deleting the object"
  WScript.Echo obj1elem.GetObjectText_
  obj1elem.Delete_
Next

Here's the MS KB.

Here's another easy one... #2:

*Service Control Manager Error 7030*

I've seen the SNMP service (which Microsoft is deprecating) trigger this one on a Windows 2008 R2 SP1 server, as well as numerous third-party services.  The bottom line is that services haven't been allowed to interact with the desktop since 2008/Vista.  It's not supported by Microsoft anymore, it's never a good idea, and if you are thinking about writing a Windows service that is meant to interact with the desktop of a logged-on user, then you should rethink it, because your idea is wrong and stupid.  Especially on a server.  I can think of few things that annoy me more than crusty old line-of-business applications that run on servers but weren't actually designed for servers (i.e., GUI interaction required, stupid stuff like requiring that a person be logged on interactively to launch the app and then stay logged on indefinitely, etc.).  Ugh.

The fix is to simply uncheck the "Allow service to interact with the desktop" checkbox in the properties of the service.  It is not supported anymore and is only still there for compatibility with legacy code.  If a service (which runs in Session 0) tries to interact with the user's desktop, the user will get a popup message like this:

*Session 0 Knocking!*

If you view the message, your own desktop will be suspended and you will be transferred to the twilight zone of Session 0's desktop until you choose to return.  The bottom line is that if you have a Windows service that does this on modern versions of Windows, that service is not compatible with Server 2008 and above, no matter what the vendor says.  As a developer, if you really need the ability to interact with a user's desktop (which you don't), you might consider using the Terminal Services API (wtsapi32.dll) to identify the active user sessions and start a process in one of those sessions.

If you want to see whether a Windows service is configured to interact with the desktop using Powershell:

$(Get-WmiObject -Query "select * from win32_service where Name = 'MyService'").DesktopInteract


Microsoft schools us on this issue here and here.
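If you want to sweep a whole server for offenders rather than checking one service at a time, something along these lines works. This is a sketch; 'MyService' is a placeholder, and note that sc.exe's type= own rewrites the service type without the interactive bit (the space after "type=" is mandatory):

```powershell
# List every service still marked to interact with the desktop.
Get-WmiObject -Query "SELECT * FROM Win32_Service WHERE DesktopInteract = TRUE" |
    Select-Object -ExpandProperty Name

# Clear the flag on one of them. 'MyService' is a placeholder name.
sc.exe config MyService type= own
```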

And finally, #3:

*DCOM Error 10016*

This is the most interesting error of the three, in my opinion.  It typically only happens after you start installing applications.  The event tells you that you should go have a look at the Component Services admin tool (an MMC snap-in), but frankly, a lot of admins don't know much about what the Component Services snap-in does or how it works.  It can be somewhat daunting:


Well, we know the error is telling us that a particular account (such as SYSTEM, S-1-5-18) doesn't have some sort of access to some particular doodad in here.  But how do we find the AppID GUID that it mentions?  We could be idiots and right-click on every single node in this entire snap-in one at a time... or we could be smart and take our search to the registry.  Look for HKEY_CLASSES_ROOT\AppID\{APPID-GUID}. That should tell you the name of the offending COM component.  All you have to do now is go back to the Component Services snap-in, find the component by that name, go to its security properties, and edit the security ACL of that component so that whatever account the event log was bitching about is given whatever access it wanted.  If you find that the security properties of the component are greyed out so that you can't edit them, that's probably because TrustedInstaller has them on lockdown.  Go back to the registry, find the corresponding reg key, take ownership/give yourself permissions to it as necessary, restart the service (or reboot the OS), and then you will be able to modify the security settings on that COM component.
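The registry lookup itself is a one-liner from PowerShell. A sketch (the GUID below is a made-up placeholder; substitute the AppID from your own event log entry):

```powershell
# Resolve the friendly name of the COM application behind a DCOM 10016 event.
# The GUID is a placeholder - use the AppID GUID from the event.
$appId = "{00000000-0000-0000-0000-000000000000}"
(Get-ItemProperty -Path "Registry::HKEY_CLASSES_ROOT\AppID\$appId")."(default)"
```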

I saw this myself just yesterday with the "SMS Agent" DCOM application.  The SMS (or SCCM) agent came preinstalled on the standard OS image that was being deployed to the machines I was working on.

So this has been the first in a series of me sharing one of my personal hobbies - making sure there are no application or system errors when Windows boots up.  If you have any boot or logon errors in the same spirit as those I've discussed here, feel free to drop me a line and I will feature your errors in a future post!

Azure Outage, The File Cabinet Blog, Etc.

by Ryan 23. February 2013 09:44

Got a mixed bag this weekend... I'm still busier than usual at work, so my thoughts have been more scattered lately. I'll just start typing and we'll see where it takes us.

First, Windows Azure. I just put this very blog on Azure not two days ago, and then yesterday they suffered a massive, embarrassing, world-wide secure storage outage. I say embarrassing because it was caused by an expired SSL certificate. That's right - a world-wide outage lasting hours that could have been prevented by someone taking 5 minutes to renew a certificate.

But let's get our facts straight here - Windows Azure didn't completely go down. It was specifically their secure storage services that went down, which I heard also affected provisioning of new resources, as well as a lot of unhappy customers who were running large storage and SQL operations over SSL that relied on that certificate. The outage didn't affect any HTTP or non-SSL traffic, so this blog just sat here relatively unscathed. Of course, I feel for those who put enterprise workloads up on Azure and did get hurt by the outage, but Azure is certainly not the first public cloud service to suffer a large-scale outage, and it won't be the last. But what makes this outage so particularly poignant is that it was so easily preventable with even the most basic of maintenance plans. Such as, I dunno, maybe a sticky note on someone's monitor reminding them to renew this immensely important SSL cert.
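Even a crude scheduled check would have caught it. A sketch along these lines reports how many days are left on a remote HTTPS certificate (the host name is a placeholder):

```powershell
# Connect to a host on 443, grab its SSL certificate, and report days until expiry.
$hostName = "www.example.com"  # placeholder - check your own endpoint
$tcp = New-Object System.Net.Sockets.TcpClient($hostName, 443)
$ssl = New-Object System.Net.Security.SslStream($tcp.GetStream())
$ssl.AuthenticateAsClient($hostName)
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($ssl.RemoteCertificate)
$daysLeft = ($cert.NotAfter - (Get-Date)).Days
Write-Host "$hostName certificate expires in $daysLeft days ($($cert.NotAfter))"
$ssl.Dispose()
$tcp.Close()
```

Wire that up to a scheduled task that emails you when $daysLeft drops below 30, and you've beaten the sticky-note plan.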

Here are a couple of screenshots, for the Schadenfreude:


Technet Forums


Azure Dashboard


Next thing on the list is pretty old news, but I never mentioned anything about it here. Ned Pyle, formerly of the Ask the Directory Services Team blog, has moved to Microsoft's home base and is now on the storage product team. He's still blogging, though, on the File Cabinet blog. I don't know him well, but we have exchanged a few emails and blog comments about AD stuff and such. He's up there with Russinovich as one of my favorite tech people on the internet, because his blogging is both entertaining and very educational. Lots of respect. Plus, check out the comments on his latest post here, where I correctly answer some AD arcana and get some validation from one of my heroes.

Now if I can just get the PFEs to make another one of those Microsoft Certified Master Active Directory posts, I'd be as happy as a civet eatin' coffee berries.

About Me

Name: Ryan Ries
Location: Texas, USA
Occupation: Systems Engineer 

I am a Windows engineer and Microsoft advocate, but I can run with pretty much any system that uses electricity.  I'm all about getting closer to the cutting edge of technology while using the right tool for the job.

This blog is about exploring IT and documenting the journey.

Blog Posts (or Vids) You Must Read (or See):

Pushing the Limits of Windows by Mark Russinovich
Mysteries of Windows Memory Management by Mark Russinovich
Accelerating Your IT Career by Ned Pyle
Post-Graduate AD Studies by Ned Pyle
MCM: Active Directory Series by PFE Platforms Team
Encodings And Character Sets by David C. Zentgraf
Active Directory Maximum Limits by Microsoft
How Kerberos Works in AD by Microsoft
How Active Directory Replication Topology Works by Microsoft
Hardcore Debugging by Andrew Richards
The NIST Definition of Cloud by NIST

MCITP: Enterprise Administrator


Profile for Ryan Ries at Server Fault, Q&A for system administrators


GitHub: github.com/ryanries


I do not discuss my employers on this blog and all opinions expressed are mine and do not reflect the opinions of my employers.