Active Directory List Object Mode

by Ryan 20. May 2013 12:00

This is something I've been wanting to blog about for a long time, but I have been putting it off because I knew it might turn into a long, time-consuming post. Well, it's time to bite the bullet and get started.

We were facing a bit of a problem in one of our managed hosting environments. We had a high-volume, multitenant Active Directory being used by dozens of different customers. There was a business requirement in this domain that customers not be able to read from one another's organizational units, for the sake of the customers' mutual privacy. Things seemed to be working well for a while, but one day, customer users logging on to many of the client computers began failing to process Group Policy at logon:

Event ID: 1101
Source: Userenv
User: NT Authority\System
Description: Windows cannot access the object OU=Customers, DC=contoso, DC=com in Active Directory. The access to the object may be denied. Group Policy processing aborted.

To start troubleshooting, I copied one of the affected user accounts and used it to log on to one of their machines, and I was able to reproduce the issue. When I tried to update Group Policy with gpupdate.exe, the computer configuration updated fine; only the user portion of the update failed, producing the event 1101.

The basic layout of the OU structure in the domain was this:

    
CONTOSO.COM
    |
    + Customers (OU)
          |
          + Customer1 (OU)
          |
          + Customer2 (OU)
          |
          + ...

Still using my customer-level user account, I noticed that I was able to browse the contents of my own Customer1 OU, but I was not able to browse the contents of any other OU. The permissions on these OUs had certainly been modified.

In fact, the Read permission for the Authenticated Users security group had been removed from the access control list on the Customers OU. That explained the event 1101s and the GPO processing failures. From Microsoft:

[GPO processing fails] when the Group Policy engine cannot read one of the OUs.

The Group Policy engine must be able to read all OUs from the level of the user object or the computer object to the level of the domain root object. Also, the Group Policy engine must be able to read the domain root object and the site object of the computer. This is because these objects may contain links to group policies. If the Group Policy engine cannot read one of these OUs, the events that are mentioned in the "Symptoms" section will be logged.

So in satisfying the business requirement that no customer be allowed to list the contents of another customer's OU, Group Policy processing had been broken. But simply giving Authenticated Users their Read permissions back on the Customers OU would let every customer browse all the other customers' OUs again.

We needed the best of both worlds.

This Microsoft article would lead you to believe that if a security principal just has the Read gpLink and Read gpOptions access control entries, then GPO processing should work fine.

But that's not enough. The four ACEs that were actually needed on the Customers OU were the following (a command-line sketch for granting them follows the list):

  • Read gpLink
  • Read gpOptions
  • Read cn
  • Read distinguishedName
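
Granting those four ACEs through the GUI works fine, but for reference, something like this dsacls sketch should do it from the command line (the OU path is this post's example, and the exact principal name syntax may need tweaking - treat it as a starting point, not gospel):

dsacls "OU=Customers,DC=contoso,DC=com" /G "Authenticated Users:RP;gPLink"
dsacls "OU=Customers,DC=contoso,DC=com" /G "Authenticated Users:RP;gPOptions"
dsacls "OU=Customers,DC=contoso,DC=com" /G "Authenticated Users:RP;cn"
dsacls "OU=Customers,DC=contoso,DC=com" /G "Authenticated Users:RP;distinguishedName"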

Now we're making progress, but we're still not out of the woods. Giving Authenticated Users the List Contents permission on the Customers OU would let them see the names of all the other customers' OUs, even though those OUs would show up as "Unknown" object types whose contents can't be listed. That's a messy solution in my opinion and doesn't fully satisfy the requirement. Customer1 shouldn't even be aware of Customer2's existence.

There's one last piece of the puzzle missing, and that brings me to List Object Mode.

List Object Mode is one strategy available to Active Directory administrators for hiding certain bits of data from certain users. It's turned off by default and has to be enabled manually. To enable it, set the value of the dsHeuristics attribute (found on the CN=Directory Service,CN=Windows NT,CN=Services object in the Configuration partition) to 001 using ADSI Edit, like so:

[Screenshot: editing the dsHeuristics attribute in ADSI Edit]
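
If you'd rather script it than click through ADSI Edit, here's a minimal sketch using the RSAT ActiveDirectory module (an assumption on my part - ADSI Edit works just as well). One caution: if dsHeuristics already has a value in your forest, blindly overwriting it with 001 could stomp on other heuristics, so check the current value first:

# Enable List Object mode: the third character of dsHeuristics must be 1.
Import-Module ActiveDirectory
$DS = "CN=Directory Service,CN=Windows NT,CN=Services,$((Get-ADRootDSE).configurationNamingContext)"
Get-ADObject -Identity $DS -Properties dsHeuristics   # check what's there now
Set-ADObject -Identity $DS -Replace @{ dsHeuristics = '001' }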

Now you will have a new access control entry available on objects in your forest: List Object. The ACE was actually there before, but Active Directory doesn't enforce it by default.

List Object Mode is a form of access-based enumeration (not to be confused with file system ABE), where items are not displayed to users who do not have the List Object permission on them. By default, when a user has the List Contents permission on an OU and queries it, he or she gets a list of all the child OUs in that parent OU, even without read access to those children. They show up in ADUC as "Unknown" object types and get that little blank page for an icon, which is the Microsoft universal symbol for "wth is this?"

With List Object Mode enabled as just described, Active Directory evaluates the permissions on each child object under the queried object before returning results to the user. Unless the user has the List Object permission on a child object, that object is omitted from the results. So now we have a customer user who can read just his or her own OU, while the other customer OUs are completely hidden from view.

And no more Group Policy failures due to access denied, either.
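
To give a concrete feel for the per-customer grant, here's a dsacls sketch (the group name is illustrative, not from the real environment; LO is the dsacls shorthand for the List Object right):

dsacls "OU=Customer1,OU=Customers,DC=contoso,DC=com" /G "CONTOSO\Customer1-Users:LO"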

So are there disadvantages to enabling and using List Object Mode in your domain? Yes, there are. It may be appropriate for your environment, but List Object Mode is not for everybody, and it's not a decision that should be made lightly:

  • Significantly increased access control checks on LDAP queries = busier domain controllers.
  • You may need to rethink your entire User and Computer organization strategy to accommodate how the new permissions work.
  • It's a less common configuration that fewer people are familiar with. Administrative complexity++. You need to fully document the change and make sure every administrator is aware of it.

So there you have it. Now go impress your friends with your knowledge of AD List Object Mode!

My Entry for the Advanced Event #3 of the 2013 Scripting Games

by Ryan 14. May 2013 09:09

Halfway done.  Here's my third entry for this year's Powershell games.  I used a workflow this time, mostly in an attempt to garner favor from the voters by using new features exclusive to PS3.  Even though the multithreading with jobs that I did in the last event is a neat idea, it really doesn't perform very well.  The workflow will likely perform better, though I don't know how well it will throttle thread creation if I hand it a list of 500 computers.
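
(As an aside: Windows PowerShell 4.0 later added a -ThrottleLimit parameter to ForEach -Parallel, which addresses exactly that worry. A sketch of what that would look like, assuming PS 4.0 - it wasn't available to me for this entry:)

WorkFlow BuildReportsThrottled
{
	Param([String[]]$Computers)
	# Run at most 8 of the parallel loop bodies concurrently (requires PowerShell 4.0+).
	ForEach -Parallel -ThrottleLimit 8 ($Computer In $Computers)
	{
		# ... per-computer work goes here ...
	}
}

Anyway, on to the actual entry: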

#Requires -Version 3
Function New-DiskSpaceReport
{
	<#
		.SYNOPSIS
			Gets hard drive information from one or more computers and saves it as HTML reports.
		.DESCRIPTION
			Gets hard drive information from one or more computers and saves it as HTML reports.
			The reports are saved to the specified directory with the name of the computer in
			the filename. The list of computers is processed in parallel for increased speed.
			Use the -Verbose switch if you want to see console output, which is very useful if you
			are having problems generating all the desired reports.
		.PARAMETER ComputerName
			One or more computer names from which to get information. This can be a
			comma-separated list, or a file of computer names one per line. The alias
			of this parameter is -Computer. The default value is the local computer.
		.PARAMETER Directory
			The directory to write the HTML files to. E.g., C:\Reports. The directory
			must exist. The default is the current working directory.
		.INPUTS
			[String[]]$ComputerName
			This is an array of strings representing the hostnames of the computers
			for which you want to retrieve information. This can also be supplied by
			(Get-Content file.txt). This can be piped into the cmdlet.
		.INPUTS
			[String]$Directory
			The directory to save the HTML reports to. The directory must exist.
		.OUTPUTS
			HTML files representing the information obtained from all
			the computers supplied to the cmdlet.
		.EXAMPLE
			New-DiskSpaceReport
			
			This will generate a report for the local computer and output the HTML file to
			the current working directory.			
		.EXAMPLE
			New-DiskSpaceReport -ComputerName server01,server02,server03 -Directory C:\Reports
			
			This will generate three HTML reports for the servers and save them in the C:\Reports
			directory.
		.EXAMPLE
			New-DiskSpaceReport -Computer (Get-Content .\computers.txt)
			
			This will generate HTML reports for all the computers in the computers.txt file and
			save the reports in the current working directory.
		.EXAMPLE
			,(Get-Content .\computers.txt) | New-DiskSpaceReport -Directory C:\Reports
			
			This will generate HTML reports for all the computers in the computers.txt file and
			save the reports in C:\Reports. Please note the leading comma in this example.
		.NOTES
			Scripting Games 2013 Advanced Event 3
	#>
	[CmdletBinding()]
	Param([Parameter(ValueFromPipeline=$True)]
			[Alias('Computer')]
			[String[]]$ComputerName = $Env:Computername,
		  [Parameter()]
			[ValidateScript({Test-Path $_ -PathType Container})]
			[String]$Directory = (Get-Location).Path)
	
	Write-Verbose -Message "Writing reports to $Directory..."
	
	WorkFlow BuildReports
	{
		Param([String[]]$Computers, [String]$Directory)
		ForEach -Parallel ($Computer In $Computers)
		{			
			InlineScript
			{				
				Write-Verbose -Message "Generating report for $Using:Computer..."
				$Header = @'
				<title>Disk Free Space Report</title>
				<style type="text/css">
					<!--
						TABLE { border-width: 1px; border-style: solid;  border-color: black; }
						TD    { border-width: 1px; border-style: dotted; border-color: black; }
					-->
				</style>
'@
				$Pre  = "<p><h2>Local Fixed Disk Report for $Using:Computer</h2></p>"
				$Post = "<hr><p style=`"font-size: 10px; font-style: italic;`">This report was generated on $(Get-Date)</p>"
				Try
				{					
					$LogicalDisks = Get-WMIObject -Query "SELECT * FROM Win32_LogicalDisk WHERE DriveType = 3" -ComputerName $Using:Computer -ErrorAction Stop | Select-Object -Property DeviceID,@{Label='SizeGB';Expression={"{0:N2}" -F ($_.Size/1GB)}},@{Label='FreeMB';Expression={"{0:N2}" -F ($_.FreeSpace/1MB)}},@{Label='PercentFree';Expression={"{0:N2}" -F (($_.Freespace/$_.Size)*100)}};
					$LogicalDisks | ConvertTo-HTML -Property DeviceID, SizeGB, FreeMB, PercentFree -Head $Header -PreContent $Pre -PostContent $Post | Out-File -FilePath (Join-Path -Path $Using:Directory -ChildPath "$($Using:Computer).html")
					Write-Verbose -Message "Report generated for $Using:Computer."
				}
				Catch
				{
					Write-Verbose -Message "Cannot build report for $Using:Computer. $($_.Exception.Message)"
				}
			}
		}
	}
	
	If($PSBoundParameters['Verbose'])
	{
		BuildReports -Computers $ComputerName -Directory $Directory -Verbose
	}
	Else
	{
		BuildReports -Computers $ComputerName -Directory $Directory
	}
}

My Entry for the Advanced Event #2 of the 2013 Scripting Games

by Ryan 10. May 2013 11:21

More Powershell! I'm somewhat proud of this script.

#Requires -Version 3
Function Get-ComputerInfo
{
	<#
		.SYNOPSIS
			Gets some basic system information about one or more remote Windows computers.
		.DESCRIPTION
			Gets some basic system information about one or more remote Windows computers.
			Specifically designed to be able to fetch information from any version of
			Windows computer from Windows 2000 up. This Cmdlet takes only one parameter,
			-ComputerName. ComputerName can be a single computer name or IP address, or it
			can be an array of computer names. You can also use a file of computer hostnames,
			one per line. This function will return the information gathered from all
			of the computers. Remember to use a leading comma when piping an array to
			this cmdlet. See the examples for more details. Powershell 3.0 is the minimum
			required on the machine that runs this cmdlet, though the target computers 
			do not need Powershell at all. Use Get-Help Get-ComputerInfo -Examples  to see
			usage examples. Example 8 is my favorite!
		.PARAMETER ComputerName
			One or more computer names from which to get information. This can be a
			comma-separated list, or a file of computer names one per line. The alias
			of this parameter is -Computer.
		.PARAMETER MaxThreads
			Default is 4. This is the maximum number of threads that are allowed to
			run simultaneously. This is useful because network operations can block
			for a long time, making threading desirable. However, when using a very 
			large list of computers, spawning a huge number of concurrent threads can
			be detrimental to the system, so thread creation should be throttled.
			The max is 32. The alias for this parameter is -Threads.
		.INPUTS
			[String[]]$ComputerName
			This is an array of strings representing the hostnames of the computers
			for which you want to retrieve information. This can also be supplied by
			(Get-Content file.txt). This can be piped into Get-ComputerInfo.
		.OUTPUTS
			A collection of objects representing the information obtained from all
			the computers supplied to the cmdlet.
		.EXAMPLE
			Get-ComputerInfo server1,server2,server3
		.EXAMPLE
			Get-ComputerInfo -ComputerName server1,server2,server3 | Format-Table
		.EXAMPLE
			Get-ComputerInfo -ComputerName (Get-Content .\computers.txt) -MaxThreads 8
		.EXAMPLE
			,(Get-Content .\computers.txt) | Get-ComputerInfo -Threads 12
			
			(Please note the leading comma in this example.)
		.EXAMPLE
			,("server1","server2","server3") | Get-ComputerInfo
			
			(Please note the leading comma in this example.)
		.EXAMPLE
			$Computers = @("server1","server2","server3")
			,$Computers | Get-ComputerInfo
		
			(Please note the leading comma in this example.)
		.EXAMPLE
			"server1" | Get-ComputerInfo
		.EXAMPLE
			Get-ComputerInfo -ComputerName ($(Get-ADComputer -Filter *).Name) | Out-GridView
		.NOTES
			Scripting Games 2013 Advanced Event 2
	#>
	[CmdletBinding()]
	Param([Parameter(Mandatory = $True, ValueFromPipeline=$True, HelpMessage = 'Computer names to scan, e.g. server01,server02,server03')]
			[Alias('Computer')]
			[String[]]$ComputerName,
		  [Parameter(Mandatory = $False)]
			[Alias('Threads')]
			[ValidateRange(1, 32)]
			[Int]$MaxThreads = 4)
	
	# This is the collection of objects that this function will eventually return.
	$ComputerInfoCollection = @()
	
	# By using the unique job name of "GetComputerInfo", we avoid interfering with any other
	# unrelated jobs that might be running by coincidence.
	$JobName = "GetComputerInfo"
	
	# Clear any old jobs with the same name before we begin. -EA Stop ensures that errors will be caught.
	Try
	{
		Get-Job -Name $JobName -ErrorAction Stop | Remove-Job -Force
	}
	Catch
	{
		# No jobs with the name $JobName were running. We don't care.
	}
	
	# This is the work to be performed by each thread in a Start-Job command.
	$Work = {
		$ComputerInfo = [PSCustomObject]@{ Name = $Args[0]; IPAddresses = $null; OSCaption = $null; MegaBytesRAM = $null; CPUSockets = $null; TotalCores = $null; }
		Try
		{			
			$ComputerInfo.IPAddresses = $([System.Net.Dns]::GetHostEntry($Args[0])).AddressList
		}
		Catch
		{
			# The hostname did not resolve to an IP address, so there is no reason to keep going.
			$ComputerInfo.IPAddresses = "Could not resolve name!"
			Return $ComputerInfo
		}
		Try
		{
			$ComputerInfo.OSCaption = $(Get-WMIObject Win32_OperatingSystem -ComputerName $Args[0] -ErrorAction Stop).Caption
		}
		Catch
		{
			$ComputerInfo.OSCaption = "$($_.Exception.Message)"
		}
		Try
		{
			$ComputerInfo.MegaBytesRAM = [Math]::Round($(Get-WMIObject Win32_ComputerSystem -ComputerName $Args[0] -ErrorAction Stop).TotalPhysicalMemory / 1MB, 0)
		}
		Catch
		{
			$ComputerInfo.MegaBytesRAM = "$($_.Exception.Message)"
		}
		Try
		{
			$CPUInfo = Get-WMIObject Win32_Processor -ComputerName $Args[0] -ErrorAction Stop
			
			# SocketDesignation does not exist on Server 2000:
			# $ComputerInfo.CPUSockets = $CPUInfo.SocketDesignation.Count
			# Also, Win 2000 does not care about Hyperthreading and does not distinguish
			# cores from sockets AFAIK, so TotalCores will be null if Win 2000. Not a big deal IMO.
			$ComputerInfo.CPUSockets = $CPUInfo.DeviceID.Count
			ForEach($CPU In $CPUInfo)
			{
				$Cores += $CPU.NumberOfCores
			}
			$ComputerInfo.TotalCores = $Cores
		}
		Catch
		{
			$ComputerInfo.CPUSockets = "$($_.Exception.Message)"
			$ComputerInfo.TotalCores = "$($_.Exception.Message)"
		}
		
		Return $ComputerInfo
	}
	
	ForEach($Computer In $ComputerName)
	{
		While($(Get-Job -State "Running" | Where-Object Name -EQ $JobName).Count -GE $MaxThreads)
		{
			# Max number of concurrent running threads reached - sleep until one is available.
			Start-Sleep -Milliseconds 500
		}
		Start-Job -Name $JobName -ScriptBlock $Work -ArgumentList $Computer | Out-Null
	}
	
	# Wait for all jobs to finish.
	# Get-Job -State "Running" -Name $JobName does not work for some reason, so let's do it in two steps.
	While(Get-Job -State "Running" | Where-Object Name -EQ $JobName)
	{
		Start-Sleep -Milliseconds 500
	}
	
	# Jobs are done, let's collect the results and store it in our collection.
	ForEach($Job In Get-Job -Name $JobName)
	{
		$ComputerInfoCollection += Receive-Job $Job
	}
	
	Return $ComputerInfoCollection
}

Probably the Craziest Powershell One-Liner I've Written To Date

by Ryan 7. May 2013 13:49

Someone at work asked me to identify duplicate computers in two separate AD forests, and remove the one that was no longer needed.  It's assumed as part of business policy that there should not be duplicate server hostnames anywhere in the company - even if they reside in different forests or domains.  But for some reason or another, a computer might get migrated from DomainA to DomainB, but the computer object stays behind in the old domain, etc.  So I decided to just collect all the computers from DomainA and DomainB (in ForestA and ForestB respectively), point out the computer accounts that had the same name in each domain, and list their PwdLastSet attribute next to their name.  If the machine had not updated its password in over 30 days in DomainA, while the machine password was up to date in DomainB, then it was reasonably safe to assume that the machine had been migrated out of DomainA and into DomainB, or vice versa.

I only had Powershell v2 on hand, so I didn't have the relative luxury of automatic foreach, etc.  After collecting the computer objects like $DomainAComputers = Get-ADComputer -Filter * -Properties *, check out this hideous monstrosity I came up with to compare them in a single line:

PS C:\Users\ryan> foreach($C In $(Compare-Object $($DomainAComputers|?{!($_.DistinguishedName.Contains("Disabled Accounts"))}|%{$_.Name}) $($DomainBComputers|?{!($_.DistinguishedName.Contains("Disabled Accounts"))}|%{$_.Name}) -IncludeEqual | ? { $_.SideIndicator -eq "==" })) { $o = $($DomainAComputers|?{$_.Name -eq $C.InputObject}); $n = $($DomainBComputers|?{$_.Name -eq $C.InputObject}); $o.DnsHostName + "`t" + $o.PasswordLastSet + "`t" + $n.DnsHostName + "`t" + $n.PasswordLastSet }

The output looks like this:

computer1.domainA.com    04/17/2013    computer1.domainB.com    01/21/2010
computer2.domainA.com    05/05/2013    computer2.domainB.com    10/11/2011
etc...

You can easily see now that the two computers in DomainA are active, while the computer objects of the same name in DomainB are stale, so I'll delete them.

Now don't get me wrong - this is not elegant or clever. It's thoroughly unreadable and ugly and I'd not brag about it except to say, "Haha, look how much s*#! I can cram on one single line of Powershell!"
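
If you wanted the same logic in a readable form, it would look something like this sketch (still Powershell v2-friendly; written after the fact and untested):

$NamesA = $DomainAComputers | Where-Object { !($_.DistinguishedName.Contains("Disabled Accounts")) } | ForEach-Object { $_.Name }
$NamesB = $DomainBComputers | Where-Object { !($_.DistinguishedName.Contains("Disabled Accounts")) } | ForEach-Object { $_.Name }
# Names that exist in both domains:
$Duplicates = Compare-Object $NamesA $NamesB -IncludeEqual | Where-Object { $_.SideIndicator -eq "==" }
ForEach($Dupe In $Duplicates)
{
	$A = $DomainAComputers | Where-Object { $_.Name -eq $Dupe.InputObject }
	$B = $DomainBComputers | Where-Object { $_.Name -eq $Dupe.InputObject }
	"{0}`t{1}`t{2}`t{3}" -F $A.DnsHostName, $A.PasswordLastSet, $B.DnsHostName, $B.PasswordLastSet
}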

A couple things that I thought were interesting:

  • Get-ADComputer gives you a free pseudo-attribute called PasswordLastSet, which is a nicely formatted DateTime. But it's not a "real" attribute of the object in Active Directory. Rather, it's the Powershell cmdlet's courtesy attribute: it automatically converts the real attribute - pwdLastSet - from Windows file time (100-nanosecond intervals since January 1, 1601, not epoch seconds) to a .NET DateTime object. Many of the Active Directory cmdlets work that way. (See the conversion sketch below this list.)
  • Compare-Object -ExcludeDifferent didn't seem to work, and in hindsight the reason is that -ExcludeDifferent only returns anything when it's paired with -IncludeEqual - on its own it filters out every result. So I just used -IncludeEqual and isolated the names that were equal.
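
For reference, the conversion that Get-ADComputer does for you is easy to reproduce yourself - a sketch, where $RawComputer is a hypothetical object carrying the raw attribute (e.g. from Get-ADComputer with -Properties pwdLastSet):

# Convert the raw pwdLastSet file time into a local DateTime.
[DateTime]::FromFileTime($RawComputer.pwdLastSet)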

My Entry for the Advanced Event #1 of the 2013 Scripting Games

by Ryan 30. April 2013 10:49

I've been pretty excited about the annual scripting games. This is only my second Games, but they have been a terrific Powershell learning experience for me. This year it's being run by Don Jones and his gang:

http://scriptinggames.org/

http://powershell.org/wp/

Their PHP-based website has already shown itself to be a little buggy, and I will eat road salt before I enable Java in my browser, so I won't be using their chat room. But you have to cut them some slack, as it's a brand-new site that has never been used before.

When people comment on the scripts you submit, it can be humbling, but it's also a good way to learn what people wanted out of your script but didn't get. There's no error handling to speak of in this script. I knew that was a risk I was taking, but the event description stated that I should "display all errors clearly," which is exactly what unhandled errors do. Still, I could have used error handling to make things a little more elegant. Also, I guess I've got to break down and start doing the -WhatIf and -Confirm junk; even though I don't exactly want to, it's going the extra mile.
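
For the record, wiring up -WhatIf and -Confirm isn't much code. A minimal sketch (not part of my actual entry):

Function Move-OldFilesSketch
{
	[CmdletBinding(SupportsShouldProcess = $True)]
	Param([String]$Path, [String]$Destination)
	# ShouldProcess makes the function honor -WhatIf and -Confirm automatically.
	If($PSCmdlet.ShouldProcess($Path, "Move to $Destination"))
	{
		Move-Item -Path $Path -Destination $Destination
	}
}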

Without further ado:

#Requires -Version 3
Function Move-OldFiles
{
	<#
		.SYNOPSIS
			Move files that are older than a specified number of days. (Default is 90 days.)
			Use the verbose switch if you want to see output, otherwise the Cmdlet shows only errors.
		.DESCRIPTION
			Move files that are older than a specified number of days (default 90) from the 
			source directory to a destination directory. Directory recursion is on by default,
			but can be disabled with the -NoRecurse switch. The subdirectory structure will be 
			preserved at the destination. By default, all files are moved, but a file pattern
			can be specified with the -Pattern parameter. By default, files that already exist at
			the destination and are readonly are not overwritten, unless the -Force switch is used.
			This cmdlet works with drive letters as well as UNC paths. By default, only errors are shown.
			Use the -Verbose switch if you want to see more output. This function requires Powershell 3.
		.PARAMETER SourceDirectory
			Specifies the source directory from which you want to move files. E.g. C:\Logs or C:\Logs\
			This must be a valid directory. The alias for this parameter is Source.			
		.PARAMETER DestinationDirectory
			Specifies the destination directory to which you want to move files. E.g. E:\Archives or
			E:\Logs\Old\ or \\SERVER02\Share\Logs. This must be a valid directory. The alias for
			this parameter is Destination.
		.PARAMETER OlderThan
			The number of days that a file's age must exceed before it will be moved. This is
			an optional parameter whose default is 90 days. This parameter must be a positive
			integer. The alias for this parameter is Age.
		.PARAMETER Pattern
			This is an optional filename filter. E.g., *.log or *.txt or Report*.html.
			The alias for this parameter is Filter.
		.PARAMETER NoRecurse
			This is a switch that indicates whether the cmdlet will process files in subdirectories
			underneath the specified source directory. By default, recursion is on. Optional.
		.PARAMETER Force
			This is a switch that indicates whether files that already exist at the destination
			and are readonly will be overwritten. By default they are not overwritten. Optional.
		.EXAMPLE
			PS C:\> Move-OldFiles -Source C:\Application\Log -Destination \\NASServer\Archives -OlderThan 90 -Filter *.log
		.EXAMPLE
			PS C:\> Move-OldFiles C:\Logs \\NASServer\Archives 90 *.log
		.EXAMPLE
			PS C:\> Move-OldFiles -SourceDirectory C:\Logs -DestinationDirectory \\NAS\Archives -Age 31 -Pattern *.log -Force
		.EXAMPLE
			PS C:\> Move-OldFiles C:\Logs \\NAS\Archives
	#>
	[CmdletBinding()]
	Param([Parameter(Position = 0, Mandatory = $True, HelpMessage = 'Source directory, e.g. C:\Logs')]
			[ValidateScript({Test-Path $_ -PathType Container})]
			[Alias('Source')]
			[String]$SourceDirectory,
	      [Parameter(Position = 1, Mandatory = $True, HelpMessage = 'Destination directory, e.g. \\NASServer\Archives')]
			[ValidateScript({Test-Path $_ -PathType Container})]
			[Alias('Destination')]
			[String]$DestinationDirectory,
		  [Parameter(Position = 2, Mandatory = $False, HelpMessage = 'The number of days old the file must be in order to be moved.')]
			[ValidateScript({$_ -GT 0})]
			[Alias('Age')]
			[Int]$OlderThan = 90,
		  [Parameter(Position = 3, Mandatory = $False, HelpMessage = 'The file pattern to match, e.g. *.log')]
			[Alias('Filter')]
			[String]$Pattern = "*",
		  [Parameter(Position = 4, Mandatory = $False, HelpMessage = 'Disable directory recursion, i.e. only copy the directory specified.')]
			[Switch]$NoRecurse = $False,
		  [Parameter(Position = 5, Mandatory = $False, HelpMessage = 'Specify to overwrite existing readonly files at the destination.')]
			[Switch]$Force = $False)
	
	$Start = Get-Date
    If(!($SourceDirectory.EndsWith("\")))
    {
	    $SourceDirectory = $SourceDirectory + "\"
    }
    If(!($DestinationDirectory.EndsWith("\")))
    {
        $DestinationDirectory = $DestinationDirectory + "\"
    }
	
	Write-Verbose "Source Directory:       $SourceDirectory"
	Write-Verbose "Destination Directory:  $DestinationDirectory"
	Write-Verbose "Move Files Older Than:  $OlderThan Days"
	Write-Verbose "Filename Filter:        $Pattern"
	Write-Verbose "Exclude Subdirectories: $NoRecurse"
	Write-Verbose "Overwrite if Readonly:  $Force"
	
	If($NoRecurse)
	{
		$SourceFiles = Get-ChildItem -Path $SourceDirectory -Filter $Pattern -File | Where-Object LastWriteTime -LT (Get-Date).AddDays($OlderThan * -1)
		Write-Verbose "$($SourceFiles.Count) files found in $SourceDirectory matching pattern $Pattern older than $OlderThan days."
	}
	Else
	{
		$SourceFiles = Get-ChildItem -Path $SourceDirectory -Filter $Pattern -File -Recurse | Where-Object LastWriteTime -LT (Get-Date).AddDays($OlderThan * -1)
		Write-Verbose "$($SourceFiles.Count) files found in $SourceDirectory matching pattern $Pattern older than $OlderThan days."
	}
	
	[Int]$FilesMoved = 0
	ForEach($File In $SourceFiles)
	{
		Write-Verbose "Moving $($File.FullName)"
		$DestinationFullName = $DestinationDirectory + $($File.FullName).Replace($SourceDirectory, $null)
		$DestinationFileDirectory = $DestinationFullName.Replace($DestinationFullName.Split('\')[-1], $null)
		If($PSBoundParameters['Verbose'])
		{
			Write-Progress -Activity "Move-OldFiles" `
						   -Status "Moving files..." `
						   -CurrentOperation "Transferring $($File.FullName)`..." `
						   -PercentComplete $([Math]::Round($FilesMoved / $SourceFiles.Count * 100, 0))
		}		
		If(!(Test-Path $DestinationFileDirectory -PathType Container))
		{
			Write-Verbose "Creating directory $DestinationFileDirectory"
			New-Item $DestinationFileDirectory -Type Directory | Out-Null
		}
		If($Force)
		{
			Move-Item -Path $File.FullName -Destination $DestinationFullName -Force | Out-Null
		}
		Else
		{
			Move-Item -Path $File.FullName -Destination $DestinationFullName | Out-Null
		}
		$FilesMoved++
	}
	$End = Get-Date
	Write-Verbose "$($SourceFiles.Count) files were moved in $([Math]::Round(((New-TimeSpan $Start $End).TotalSeconds), 1)) seconds."
}

Get-FQDNInfo.ps1

by Ryan 22. April 2013 09:13

Someone recently asked me if I could write a script for them.  He had a list of several hundred fully qualified domain names (essentially internet hostnames) in a file, and he had to get the IP address(es) of each FQDN and then some whois information about those IP addresses.  Running down a list of names and resolving them scriptomatically is a breeze of course, but the whois stuff sounded a little more tricky.  Luckily, ARIN has a sweet REST API ready to go, which made the whole script a snap.

I took the time to return all that data as objects so the output can be pipelined, and there is also an optional "save to CSV" parameter.  I think there are a couple more ways in which the script could be improved, but it works for now.  The output looks like this:

[Screenshot: sample output of Get-FQDNInfo.ps1]

And here's the whole script:

<#
.SYNOPSIS	
	Feed me a bunch of FQDNs, one per line, and I will give you as much info as
	I can about that IP address.
	
.DESCRIPTION
	This script takes an input file. The input file contains a list of FQDNs, one per line.
	With each FQDN, the script will attempt to resolve the name, and then find as much
	info as it can using ARIN REST services.
	
.PARAMETER InFile
	Specify a text file containing the FQDNs you want to scan.
	Each FQDN goes on a separate line. For example:
	
	host.foo.com
	barney.purpledinosaur.com
	et.phonehome.org

.PARAMETER OutFile
	Optional file to write the results to.

.INPUTS
	[System.String]$InFile - The name of the input file to read.
.INPUTS
	[System.String]$OutFile - Optional, the name of the output file to write.

.OUTPUTS
	[System.Object]$FQDNInfoCollection - Contains resolved FQDNInfo objects.

.OUTPUTS
	[System.IO.File]$OutFile - Optional, the output file to write.
	
.EXAMPLE
	PS C:\> .\Get-FQDNInfo.ps1 .\fqdns.txt outfile.txt

.EXAMPLE
	PS C:\> "fqdns.txt" | .\Get-FQDNInfo.ps1
	
.NOTES
	Name  : Get-FQDNInfo.ps1
	Author: Ryan Ries
	Email : ryanries09@gmail.com
	Date  : April 19, 2013

.LINK	
	http://www.myotherpcisacloud.com
#>

Param([Parameter(Mandatory=$True,ValueFromPipeline=$True)]
		[ValidateScript({Test-Path $_ -PathType Leaf})]
		[String]$InFile,
	  [Parameter(Mandatory=$False,ValueFromPipeline=$False)]
		[String]$OutFile)
$FQDNInfoCollection = @()
$EntriesProcessed = 0
Foreach($FQDN In Get-Content $InFile)
{
	$FQDNInfo = New-Object System.Object
	$FQDNInfo | Add-Member -Type NoteProperty -Name "FQDN"        -Value $FQDN
	$FQDNInfo | Add-Member -Type NoteProperty -Name "AddressList" -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "CSVSafeList" -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "NetRange"    -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "CIDR"        -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "NetName"     -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "NetType"     -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "RegDate"     -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "Updated"     -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "Comment"     -Value $null
	$FQDNInfo | Add-Member -Type NoteProperty -Name "SOA"         -Value $null
	
	Try	{ $FQDNInfo.AddressList = $([System.Net.Dns]::GetHostEntry($FQDN)).AddressList }
	Catch {	}
	If($FQDNInfo.AddressList -ne $null)
	{
		ForEach($A In $FQDNInfo.AddressList) { $FQDNInfo.CSVSafeList += "$($A)|" }
		$FQDNInfo.CSVSafeList = $FQDNInfo.CSVSafeList.TrimEnd('|')		
		Try
		{
			$ARINData = $(Invoke-WebRequest http://whois.arin.net/rest/ip/$($FQDNInfo.AddressList[0].ToString())`.txt).Content
			$ARINData = $ARINData.Split([Environment]::NewLine)
			Foreach($l In $ARINData)
			{
				If($l.StartsWith("NetRange:"))    { $FQDNInfo.NetRange = $l.SubString(16) }
				Elseif($l.StartsWith("CIDR:"))    { $FQDNInfo.CIDR     = $l.SubString(16) }
				Elseif($l.StartsWith("NetName:")) { $FQDNInfo.NetName  = $l.SubString(16) }
				Elseif($l.StartsWith("NetType:")) { $FQDNInfo.NetType  = $l.SubString(16) }
				Elseif($l.StartsWith("RegDate:")) { $FQDNInfo.RegDate  = $l.SubString(16) }
				Elseif($l.StartsWith("Updated:")) { $FQDNInfo.Updated  = $l.SubString(16) }
				Elseif($l.StartsWith("Comment:")) 
				{ 
					$FQDNInfo.Comment += $l.SubString(16)
					$FQDNInfo.Comment += " "
				}
			}
		}
		Catch { }
		& nslookup -q=soa $FQDN 2>&1> $Env:UserProfile`\temp.txt
	Foreach($Line In Get-Content $Env:UserProfile`\temp.txt)
	{
		If($Line.Contains("primary name server =")) { $FQDNInfo.SOA = $Line.Split('=')[1].Trim() }
	}
	}	
	$FQDNInfoCollection += $FQDNInfo
	$EntriesProcessed   += 1
	Write-Host $EntriesProcessed "FQDNs processed."
}

If($OutFile.Length -gt 0)
{
	$FQDNInfoCollection | Export-CSV $OutFile -NoTypeInformation	
}
return $FQDNInfoCollection
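
And since the script returns real objects, the output pipelines nicely. For example (the property names are the ones defined above; the filter itself is made up):

PS C:\> .\Get-FQDNInfo.ps1 .\fqdns.txt | Where-Object { $_.NetName -ne $null } | Format-Table FQDN, CSVSafeList, NetName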


Processor Shopping for SQL Server 2012

by Ryan 19. April 2013 12:35

[Image: AMD vs. Intel]

I almost never talk about SQL Server here, which is a shame, because I think SQL Server is amazing.  If you're planning on deploying SQL Server 2012 and you haven't picked out your hardware yet, I hope this post finds you in time and helps you decide which processor architecture to choose.  (I hope the graphic doesn't give it away...)  Also, keep in mind the date on which this was written - computer hardware changes rapidly.

You know you pretty much have two choices in CPUs: Intel or AMD.  There are several factors to weigh here: performance, hardware cost, and licensing cost.  So let's break those down and compare:

Performance: Keep in mind that we're designing a SQL Server here.  Different SQL Servers are under different types of workloads, but OLTP (online transaction processing) is one very common type.  The TPC (Transaction Processing Performance Council) introduced the TPC-E benchmark in 2007, which simulates an OLTP workload on a SQL Server.  What we end up with is a pretty solid method for benchmarking SQL Servers of varying hardware configurations running identical workloads.  If you visit the website, it's pretty hard not to notice that the top 10 highest-performing servers and the top 10 best price/performance entries all belong to Intel processors, without exception.  But just for fun, let's see the numbers:

System                     Processor           TPC-E    Sockets  Total Cores  Score/Core
HP Proliant DL380 G7       Intel Xeon X5690    1284.14  2        12           107.01
IBM System x360 M4         Intel Xeon E5-2690  1863.23  2        16           116.45
HP Proliant DL385 G7       AMD Opteron 6282SE  1232.84  2        32           38.53
HP Proliant DL585 G7       AMD Opteron 6176SE  1400.14  4        48           29.17
IBM System x3850 X5        Intel Xeon E7-4870  2862.61  4        40           71.57
NEC Express 5800/A1080a    Intel Xeon E7-8870  4614.22  8        80           57.68

The trends evident from that table: AMD packs more cores per socket, AMD cores tend to perform much worse per core than Intel cores on an OLTP workload, and very high core counts show diminishing returns regardless of the manufacturer.  So far, things are not looking good for AMD.  AMD can fit more cores on a die, but that simply does not make up for the gap in single-threaded performance.

Hardware Cost: Let's get right down to some hardware prices.  I'm going to price only the processors themselves, not entire servers, because there are so many customizable options and accessories to choose from when speccing out a whole server that it would take far longer than I wanted to spend on this post.

Processor           CDW.COM Price
Intel Xeon X5690    $1886.99
Intel Xeon E5-2690  $2332.99
AMD Opteron 6282SE  $1287.99
AMD Opteron 6176SE  $1505.99
Intel Xeon E7-4870  $5698.99
Intel Xeon E7-8870  $7618.99

AMD has a bit of a price advantage here, especially when you get to the high-end processors, but it's all for nothing once you take into account the third piece of the puzzle:

Licensing: To be frank, Microsoft SQL Server 2012 Enterprise Edition is very expensive.  SQL Server used to be licensed per socket; SQL 2012 is now licensed per physical core.  This means "logical" cores, such as those created by Intel's Hyperthreading, are essentially free as far as SQL 2012 licensing is concerned.  (There is the alternative Server + CAL licensing model, as seen with the Business Intelligence Edition, but that's kinda' out of the scope of this article.  Enterprise Edition is where it's at.)  Each physical socket in your SQL server requires a minimum of 4 core licenses, and beyond those 4 you license additional cores two at a time.

If you're thinking ahead, you can already tell this is bad news for AMD-based servers aspiring to run SQL 2012.  AMD processors have more cores, which equals higher SQL licensing costs, with lower performance per core to boot.  Microsoft realized this, and so they did AMD a favor by specifically giving most AMD processors a 25% discount on licensing costs.  But even with that discount, the numbers still speak for themselves, and AMD still comes out way behind:

Processor           Cores Per Socket  License Cost Per Socket  Total Sockets  Total License Cost Per Server
Intel Xeon X5690    6                 $41,244                  2              $82,488
AMD Opteron 6282SE  16                $82,488                  2              $164,976
Intel Xeon E5-2690  8                 $54,992                  2              $109,984
Intel Xeon E5-4650  8                 $54,992                  4              $219,968
Intel Xeon X7560    8                 $54,992                  4              $219,968
Intel Xeon E7-4870  10                $68,740                  4              $274,960
AMD Opteron 6180SE  12                $61,866                  4              $247,464
AMD Opteron 6282SE  16                $82,488                  4              $329,952
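
If you want to sanity-check those license numbers, the arithmetic is simple enough to sketch in Powershell (the $6,874 retail per-core price and the 25% AMD discount are inferred from the table above - verify them against your own agreement):

# License cost for one server: licensed cores (minimum 4 per socket) x per-core price.
$PerCorePrice   = 6874
$CoresPerSocket = 16
$Sockets        = 2
$AMDDiscount    = 0.75   # only for the AMD parts Microsoft discounted
$LicensedCores  = [Math]::Max($CoresPerSocket, 4) * $Sockets
$LicensedCores * $PerCorePrice * $AMDDiscount   # = 164,976 for the dual-socket Opteron 6282SE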

It just got really hard for me to recommend an AMD processor for use in a SQL Server 2012 server under almost any circumstances.  Take our Intel Xeon X5690 and our AMD Opteron 6282SE, which have pretty similar TPC-E benchmark scores... only the AMD costs $82,488 more to license!  And that's with AMD's 25% discount!  These are full retail prices of course, but the concept is the same regardless of your Enterprise Agreement.

So, my fellow IT pros... please do the math before you pull the trigger on that new server, and make sure your $2000 in hardware savings isn't steamrolled by $80,000 of extra licensing costs.

* Citation - these numbers are from the book Professional SQL Server 2012 Internals and Troubleshooting by Bolton, Langford, Berry, et al.

AD Recycle Bin and a Eulogy for the Infrastructure Master

by Ryan 13. April 2013 19:45

Ruminate with me a while, won't you?

Ah, the Infrastructure Master.  Probably the least-appreciated FSMO role of all.  In discussions such as technical job interviews, most people can list the five FSMOs for me... maybe even tell me which are per-forest and which are per-domain... but if you then start asking for specifics about what each one of them actually does, the interviewee usually gets a bit more wobbly.  And I think the Infrastructure Master in particular is probably the most difficult of all to grasp.  I know it was certainly the last one for me to really "get."

I won't spell out here exactly what the IM does - there's plenty of documentation out there if you're really interested. I would also direct you to this ServerFault post, wherein I give a real-world example of what the IM does and what might happen if the IM is on the wrong domain controller.

This brings me to the Active Directory Recycle Bin.  The AD Recycle Bin was introduced in Windows Server 2008 R2, and it was a long time coming.  Before it existed, restoring AD objects was a lot more arcane and cumbersome than it is with a good ole' Recycle Bin.  Considering the AD Recycle Bin is going on 5 years old now, and even though it's an optional feature, there's less and less of an excuse as time goes on for not having it enabled in your forest.

(You don't, do you?)
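
(If you don't, enabling it is one line with the AD module - a sketch, assuming your forest is at the 2008 R2 functional level and using a made-up forest name. Note that enabling it is irreversible:)

Enable-ADOptionalFeature 'Recycle Bin Feature' -Scope ForestOrConfigurationSet -Target 'contoso.com'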

So here's the interesting bit that you might not have known (sorry for wasting your time if you did): once you've enabled the AD Recycle Bin, your Infrastructure Master no longer has anything to do.  Nothing.  Not even in an environment where some domain controllers are not also global catalogs.

From Technet:

When the Recycle Bin optional feature is enabled, every DC is responsible for updating its cross-domain object references in the event that the referenced object is moved, renamed, or deleted. In this case, there are no tasks associated with the Infrastructure FSMO role, and it is not important which domain controller owns the Infrastructure Master role.

So as the AD Recycle Bin becomes more and more commonplace in Active Directory environments, it seems that the Infrastructure Master may slowly dwindle away until only the old guard even remembers what it was, and budding young IT pros will only have 4 FSMOs to remember.

ShareDiscreetlyWebServer v1.0.1.2

by Ryan 13. April 2013 16:47

Several improvements over the last release in the past few days:

  • Some code optimizations. Pages are rendering about an order of magnitude faster now.
  • Just about all of the HTML and Javascript has been exported to editable files, so an administrator can change the markup, color schemes, branding, etc., without needing to recompile the code.
  • The server can now send an S/MIME, digitally signed email to the person you want to send the URL to. Unfortunately some email clients (such as the Gmail web client) don't natively understand S/MIME, but Outlook handles it just fine. You can also get various plugins and programs to read S/MIME emails if your email client doesn't understand it. I'd rather lean toward more security-related bells and whistles than max compatibility for this project.

You can access the secret server at https://myotherpcisacloud.com.

ShareDiscreetlyWebServer v1.0.0.3

by Ryan 9. April 2013 13:17

I wrote a web service.  I call it "ShareDiscreetly".  Creative name, huh?

I wrote the server in C# .NET 4.0.  It runs as a Windows service.

ShareDiscreetlyWebServer serves a single purpose: to allow two people to share little bits of information - secrets - such as passwords, in a secure, discreet manner.  The secrets are protected both in transit and at rest with the FIPS-approved AES-256 algorithm, using key material protected by an X.509 certificate's asymmetric key pair.

Oh, and I made sure that it's thoroughly compatible with Powershell so that the server can be used in a scriptable/automatable way.

You can read a more thorough description of the server as you try it out here.

Please let me know if you find any bugs, exploits, or if you have any feature requests!

About Me

Name: Ryan Ries
Location: Texas, USA
Occupation: Systems Engineer 

I am a Windows engineer and Microsoft advocate, but I can run with pretty much any system that uses electricity.  I'm all about getting closer to the cutting edge of technology while using the right tool for the job.

This blog is about exploring IT and documenting the journey.


Blog Posts (or Vids) You Must Read (or See):

Pushing the Limits of Windows by Mark Russinovich
Mysteries of Windows Memory Management by Mark Russinovich
Accelerating Your IT Career by Ned Pyle
Post-Graduate AD Studies by Ned Pyle
MCM: Active Directory Series by PFE Platforms Team
Encodings And Character Sets by David C. Zentgraf
Active Directory Maximum Limits by Microsoft
How Kerberos Works in AD by Microsoft
How Active Directory Replication Topology Works by Microsoft
Hardcore Debugging by Andrew Richards
The NIST Definition of Cloud by NIST


MCITP: Enterprise Administrator

VCP5-DCV

Server Fault profile: Ryan Ries

LOPSA

GitHub: github.com/ryanries

I do not discuss my employers on this blog and all opinions expressed are mine and do not reflect the opinions of my employers.