Website Upgrade, Coding, and Dealing with NTFS ACLs on Server Core

by Ryan 9. January 2014 15:20

I apologize in advance - this blog post is going to be all over the place. I haven't posted in a while, mainly because I've been engrossed in a personal programming project, part of which includes a multithreaded web server I wrote over the weekend that I'm kind of proud of. (My older ShareDiscreetlyWebServer is single-threaded, because when I wrote it, I had not yet grasped the awesome power of the async and await keywords in C#. They're très sexy.) Right now the only verb the new server supports is GET (because it's all I need for now,) but it's about as fast as you could hope for from a server written in managed code.

Secondly, I just upgraded this site to Blogengine.NET v2.9.  The motivation behind it was that today, I got this email from a visitor to the site:

Hi,
I tried to leave you a comment but it didn't work.
Can you please go through the steps you took to migrate your blog to Azure, as I am interested in doing the same thing.
Did you set it up as an Azure Web Site or use an Azure VM and deploy it that way?
Are you using BlogEngine or some other blog publishing tool?

Wait, my comments weren't working? Damnit. I tried to post a comment myself and sure enough, commenting on this blog was busted. It was working fine but it just stopped working some time in the last few weeks. And so, I figured that if I was going to go to the trouble of debugging it, I'd just go ahead and upgrade Blogengine.NET while I was at it.

But first, to answer the guy's question above, my blog migration was simple. I used to host this blog out of my house on a Windows Server running in my home office. I signed up for a Server 2012 Azure virtual machine, RDP'ed to it, installed the IIS role, robocopy'd my entire C:\inetpub directory to the new VM, and that was that.
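If you want to replicate that, the whole migration boils down to a couple of commands. This is just a sketch, assuming default paths and that the new VM's administrative share is reachable from the old server (the "NEWVM" name here is made up):

# On the new Azure VM - install IIS:
Install-WindowsFeature Web-Server -IncludeManagementTools

# Then, from the old server, push the site content over:
robocopy C:\inetpub \\NEWVM\c$\inetpub /E /COPYALL /R:2 /W:5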

So far, version 2.9 is a little lackluster.  They updated most of the UI to the simplistic, sleek "modern" look that's all the rage these days, especially on tablets and phones.  But in the process, it appears they've regressed to the point where the editor is no longer compatible with Internet Explorer 11, 10, or 9. (Not that it worked perfectly before, either.)  It's annoying as hell. I'm writing this post right now in IE with compatibility mode turned on, and still half of the buttons don't work.  It's crippled compared to version 2.8, which I was on this morning.

It's ironic that the developers, who wrote a CMS entirely in .NET, in Visual Studio, couldn't be bothered to test it on any version of IE.  Guess I'll wait patiently for version 3.0.  Or maybe write my own CMS after I finish writing the web server to run it on.

But even after the upgrade, and after fixing all the little miscellaneous bugs that the upgrade introduced, my comment system was still busted. So I had to dig deeper. I logged on to the server and fired up Process Monitor while I attempted to post a comment:

w3wp.exe gets an Access Denied error right there, clear as day.  (Thanks again, ProcMon.)

If you open the properties of w3wp.exe, you'll notice that it runs in the security context of an application pool, e.g. "IIS APPPOOL\Default Web Site". So just give that security principal access to that App_Data directory.  Only one problem...

Server Core.

No right-clicking our way out of this one.  Of course we could have done this with cacls.exe or something, but you know I'm all about the Powershell.  So let's do it in PS.

$Acl = Get-Acl C:\inetpub\wwwroot\App_Data
$Ace = New-Object System.Security.AccessControl.FileSystemAccessRule("IIS APPPOOL\Default Web Site", "FullControl", "ContainerInherit, ObjectInherit", "None", "Allow")
$Acl.AddAccessRule($Ace)
Set-Acl C:\inetpub\wwwroot\App_Data $Acl
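And if you want to double-check your work afterward, the same cmdlets will read the ACL right back:

(Get-Acl C:\inetpub\wwwroot\App_Data).Access |
    Format-Table IdentityReference, FileSystemRights, AccessControlType -AutoSize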

Permissions, and commenting, are back to normal.

Site Upgrade

by Ryan 7. October 2013 20:47

Upgraded this site from Blogengine.NET 2.5 to 2.8 this evening. This post is basically a test just to see if anything is broken from the upgrade. Sorry for the inconvenience.

 

Testing quotation.

Edit: Ugh, looks like the code formatter's broken. :(

Edit: Put the old SyntaxHighlighter back in. It's not awesome, but it's better than nothing.

Edit: Well, one positive thing that came out of this is that I vastly improved Alex Gorbatchev's old SyntaxHighlighter for this blog. I just modernized the Powershell brush to include all the new cmdlets and aliases, since the brush had not been updated since around 2009. Just let me know if you want it.

Now Powered By Windows Azure

by Ryan 20. February 2013 12:19

I transferred this blog to Windows Azure today. Up until today, I've been hosting this blog from inside my home. While I can boast almost zero unplanned downtime even from my mostly consumer-grade hardware and residential internet connection, I felt it was time to hoist this blog up into a slightly more professional and resilient environment. That way I don't have to worry about backups and hardware failures on my own gear taking down this blog. And it gives me more flexibility in terms of being able to tear my home lab apart and rearrange it without having to affect this blog.

It was very easy to move the blog to the new VM on Azure, which is good, because I've been so busy at work lately that I've had time for little else... such as blogging. I had to sign up specifically for the "preview" of VM hosting from Azure, as it is apparently still in the preview phase; you get about a 50% discount until it reaches General Availability. Anyway, both the virtual machine and the portal have worked perfectly so far, and I would not be surprised if it were really close to GA. The portal looks nice and polished and works well. It comes with a nice, basic resource monitor so you can see your VM's CPU, memory, network usage, etc. over time from the web portal. The price is pretty low - definitely lower than some other providers. My external IP address won't change. Plus they can host Server 2012 VMs, which some other providers are still catching up to. I chose the absolute slowest, lowest-spec VM that they would give me because of the price, so the machine is a little slower than it was running on my own gear, but it's still enough for this measly blog. After the VM was done being imaged, I loaded IIS and SMTP on it, simply dumped the whole inetpub directory straight from my home machine into the VM, configured SMTP (so comment emails can be sent from this blog to my Gmail account, etc.,) and then turned the GUI off on the server with Powershell and logged out. Piece of cake.

Azure actually offers a lot of different hosting capabilities - not just virtual machines. In fact, I don't think the Azure folks even knew they would offer IaaS when they first set out. But I chose the VM option because I'm most comfortable managing the OS myself... learning how to set up Visual Studio to publish websites from my desktop straight into an Azure service is totally new to me, and I haven't even begun learning how to do that yet.

Blog Posts You Must Read

by Ryan 12. November 2012 08:43

The PFE Platforms team has published another blog post in their MCM: Active Directory series, which was such a fantastic post that it inspired me to create a "Blog Posts You Must Read" section over there on the sidebar. It will be for blog posts and/or blog post series that are so good that I find myself going back and reading them multiple times, or even using them as reference material. I think that's a lot more meaningful than just a gigantic, generic list of every website I know of.

More to come as I finish trawling through my bookmarks or stumble across new ones.

Log Parser 2.2 and Log Parser Studio

by Ryan 31. October 2012 19:56

At first I thought to title this post the same as the catchphrase of Log Parser: "The Whole World Is Your Database!"

But then I decided that was a bit too exciting for what I actually wanted to talk about.

So I just discovered Log Parser Studio a few days ago. LPS is a graphical frontend to Log Parser, quite similar to how SQL Management Studio is a GUI frontend for interacting with SQL Server.  I am, quite frankly, ashamed that I didn't already know about it. It's fantastic.

The thing is... Log Parser is a command-line utility that uses a very SQL-esque language to interact with logs. What kind of logs, you ask?  Any kind of logs! That's right... you can use it to query the Windows Security Event Log, or you can use it to query a folder full of IIS web server logs, or you can use it to query a log full of your own personal electric utility bills from last year!

However, Log Parser itself is a very complex, albeit powerful and flexible, command-line utility. Maybe you want something a little more user-friendly to get you started. That's exactly where Log Parser Studio, the GUI frontend, comes into play.

As a little demonstration, I installed Log Parser 2.2 on my workstation, then downloaded Log Parser Studio. I fired it up as a Windows application and pointed it at the remote IIS logs directory of this very web server. I then right-clicked on "IIS: Request per Hour" and chose "Run report now." As if I had just run a SQL query in SQL Management Studio, this window popped up:

 

*Log Parser Studio Query*

This data is probably every single HTTP GET request made per hour, rather than a count of hits from unique IP addresses, but the point is that you now have an amazing utility that will parse practically any amount of data you can think of, from any source of data you can think of. Go check it out and see how Log Parser is even capable of generating pie charts, bar charts, and all sorts of crazy things from this data!
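For the curious, the query behind a report like that is plain old Log Parser SQL, which you can also run straight from the command line. Here's a rough sketch against standard W3C-format IIS logs (adjust the log path to your own site; I haven't run this exact query, so treat it as a starting point) that counts unique client IPs per hour instead of raw hits:

LogParser.exe -i:W3C "SELECT QUANTIZE(TO_TIMESTAMP(date, time), 3600) AS Hour, COUNT(DISTINCT c-ip) AS UniqueClientIPs FROM C:\inetpub\logs\LogFiles\W3SVC1\*.log GROUP BY Hour ORDER BY Hour"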

Web Server Upgrade

by Ryan 21. October 2012 13:29

Nothing too exciting to share right now, except that I upgraded this web server this morning from 2008 R2 to Server 2012.  The upgrade went perfectly: all my settings and applications were migrated properly, and most importantly, the server was upgraded to IIS 8 and .NET 4.5 with no issues. I figured that if there were going to be any problems, they would be there, since this website relies heavily on .NET.

I was able to remove the GUI shell, but I had to leave the "Graphical Management Tools and Infrastructure" feature enabled, as the server warned me that certain important bits like IIS and SMTP might stop working without it. :P
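For reference, here's roughly what that looks like in Powershell on Server 2012 - removing the Server-Gui-Shell feature takes away the full shell while leaving "Graphical Management Tools and Infrastructure" (Server-Gui-Mgmt-Infra) in place:

# Remove the graphical shell, keep the management infrastructure, and reboot:
Uninstall-WindowsFeature Server-Gui-Shell -Restart

# Afterward, verify what's installed:
Get-WindowsFeature Server-Gui* | Format-Table Name, InstallState -AutoSize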

BlogEngine.NET, SimpleCaptcha, and Spam

by Ryan 22. January 2012 10:59

I use BlogEngine.NET for this blog. I've loved it so far. It suits me perfectly because I also love .NET and C#.

BlogEngine.NET comes with a few "extensions" out of the box, and one of those extensions is called SimpleCaptcha. You simply configure it with a question and an answer; visitors who supply the correct answer get to post comments. This wards off most of the spammers. But from what I'm seeing, whatever spammers use to automatically crawl the web, leaving little spam-filled coprolites in their wake, seems to be able to solve simple mathematical equations like 5+5, 3+7, and even (5+2)-1. I changed my captcha challenge to that latter equation and received a spam comment not five seconds later.

Maybe this will stop them...

So I figured the next best thing to do, without annoying and frustrating my visitors too much with those really bizarre graphical captchas that you can't even read half the time, was to change my SimpleCaptcha to something that was still simple, but required slightly more human-like thinking than what I suspect most spambots are capable of. Questions such as "what is the opposite of cold" or "a shape with four equal sides." These sorts of questions have brought my comment spam to a screeching halt. But there's one last problem: SimpleCaptcha is case-sensitive, and there's no immediately apparent way to turn it off. I don't want a visitor to type "Square" and not get their comment posted because they needed to have typed "square" instead.

So, to remedy this problem, simply access your web server and browse to wherever you have IIS/BlogEngine.NET installed. Then drill down to where SimpleCaptcha is. For me, it's C:\inetpub\wwwroot\App_Code\Extensions\SimpleCaptcha\. Open up the file SimpleCaptchaControl.cs in a text editor (or Visual Studio if you'd rather,) and find this method:

public void Validate(string simpleCaptchaChallenge)
{
   this.valid = this.skipSimpleCaptcha || this.simpleCaptchaAnswer.Equals(simpleCaptchaChallenge);
}

Simply change that one line to this:

public void Validate(string simpleCaptchaChallenge)
{
   this.valid = this.skipSimpleCaptcha || this.simpleCaptchaAnswer.Equals(simpleCaptchaChallenge,StringComparison.OrdinalIgnoreCase);
}

And you've just made your SimpleCaptcha case-insensitive. The change takes effect as soon as you save the file, since ASP.NET automatically recompiles anything under App_Code; no restarts of anything are required.

Auditing Active Directory Inactive Users with Powershell and Other Cool Stuff

by Ryan 21. January 2012 10:36

Hello again, fellow wanderers.

I was having a hell of a comment spam problem here for a couple days... hope I didn't accidentally delete any legitimate comments in the chaos. (Read this excellent comment left on my last DNS post.) Then I realized that I might ought to change the challenge question and response for my simple captcha from its default... I guess the spammers have the old "5+5=" question figured out. :P

A few years ago, I made my own simple captcha for another blog that was along the lines of "x + y = ?" using PHP, but x and y were randomly generated at each page load. It worked really well. The simple captcha that ships with BlogEngine.NET is static; being able to load a random question-and-answer pair from a pool of questions would be a definite enhancement.

Anyway, since we're still on the topic of auditing Active Directory, I've got another one for you: Auditing "inactive" user accounts.

I had a persnickety customer that wanted to be kept abreast of all AD user accounts that had not logged on in 25 days or more. As soon as one delves into this problem, one might realize that a command-line command such as dsquery user -inactive x will display users that are considered inactive for x number of weeks, but not days. I immediately suspected that there must be a reason for that lack of precision; whoever wrote the dsquery utility would not have purposely left out that measure of granularity without a good reason.

So what defines an "inactive" user? A user that has not logged on to his or her user account in a period of time. There is an AD attribute on each user called LastLogonTimeStamp. After a little research, I stumbled across this post, where it is explained that the LastLogonTimeStamp attribute is not terribly accurate - i.e., off by more than a week. Now that dsquery switch makes a lot more sense. I conjecture that the LastLogonTimeStamp attribute is inaccurate because Microsoft had to make a choice when designing Active Directory - either have that attribute updated every single time a user account is logged on to and thus amplify domain replication traffic and work for the DCs, or have it only updated periodically and save the replication load.

To further complicate matters, there is an Active Directory Powershell cmdlet called Search-ADAccount that reports a LastLogonDate attribute on the users it returns. As it turns out, LastLogonDate is not even a real attribute, but rather that particular Powershell cmdlet's way of translating LastLogonTimeStamp into a more human-readable form (a .NET DateTime object).
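You can see that translation for yourself with a quick one-liner (the account name here is just a made-up example):

Get-ADUser rries -Properties lastLogonTimestamp |
    Select-Object SamAccountName, @{n='LastLogon';e={[DateTime]::FromFileTime($_.lastLogonTimestamp)}}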

Next, there is another AD attribute - msDS-LogonTimeSyncInterval - that you can dial down to a minimum of 1 day, which causes the users' LastLogonTimeStamp attribute to be updated (and replicated) much more frequently and thus makes it more accurate. Of course, this comes at the expense of additional load on the DCs and additional replication traffic. That may be negligible in a small domain, but may have a significant impact in a large one.

*ADSI Edit*
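If you'd rather skip ADSI Edit, the AD Powershell module can make the same change. This is just a sketch; the attribute lives on the domain naming context head itself:

# Dial msDS-LogonTimeSyncInterval down to its 1-day minimum on the domain NC head:
$domainDN = (Get-ADDomain).DistinguishedName
Set-ADObject -Identity $domainDN -Replace @{'msDS-LogonTimeSyncInterval' = 1}

# Read it back to confirm:
Get-ADObject -Identity $domainDN -Properties 'msDS-LogonTimeSyncInterval'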

Lastly, your other options for being able to accurately track the last logon time of users as close to "real-time" as possible involve scanning the security logs or attributes on all of your domain controllers and doing some heavy parsing. This is where event forwarding and subscriptions would really shine. See my previous post for details. I don't know about you guys, but all that sounds like a nightmare to me. Being able to track inactive user accounts to within 1 day is just going to have to suffice for now.

So we made the decision to decrease msDS-LogonTimeSyncInterval, and I wrote this nifty Powershell script to give us the good stuff. Each major chunk of code is almost identical, with minor tweaks representing the different use cases for different parameter combinations. Reading the comments toward the top about the five parameters will give you a clear picture of how the script works:

# ADUserAccountAudit.ps1
# Written by Ryan Ries on Jan 19 2012
# Requires the AD Powershell Module which is on 2k8R2 DCs and systems with RSAT installed.
#
# Locates "inactive" AD user accounts. Note that LastLogonTimeStamp is not terribly accurate.
# Accounts that have never been logged into will show up as having a LastLogonTimeStamp of some time
# around 1600 AD - 81 years after the death of Leonardo da Vinci.
# This is because even though their LastLogonTimeStamp attribute is null, we cast it to a DateTime object
# regardless, which converts null inputs into a minimum date, apparently.
#
# For specific use with NetIQ AppManager, put this script on the agent machine at 
# C:\Program Files (x86)\NetIQ\AppManager\bin\Powershell (for 64 bit Windows. Just "Program Files" if 32 bit Windows.)

Param([string]$DN = "dc=corpdom,dc=local",         # LDAP distinguished name for domain
      [string]$domainName = "Corpdom",             # This can be whatever you want it to be
      [int]$inactiveDays = 25,                     # Users that have not logged on in this number of days will appear on this report
      [bool]$includeDisabledAccounts = $false,     # Setting this to true will include accounts that are already disabled in the report as well
      [bool]$includeNoLastLogonAccounts = $false)  # Setting this to true will include accounts that have never been logged into and thus have no LastLogonTimeStamp attribute.

# First, load the Active Directory module if it is not already loaded
$ADmodule = Get-Module | Where-Object { $_.Name -eq "activedirectory" } | Foreach { $_.Name }
if($ADmodule -ne "activedirectory")
{
   Import-Module ActiveDirectory
}

if($includeDisabledAccounts -eq $false)
{
   if($includeNoLastLogonAccounts -eq $false)
   {
      Write-Host "Enabled users that have not logged into $domainName in $inactiveDays days`r`nExcluding accounts that have never been logged into`r`nAccounts younger than $inactiveDays days not shown.`r`n-------------------------------------------------------"
      Search-ADAccount -UsersOnly -SearchBase "$DN" -AccountInactive -TimeSpan $inactiveDays`.00:00:00 | 
      Where-Object {$_.Enabled -eq $true -And $_.LastLogonDate -ne $null } |
      Get-ADUser -Properties Name, sAMAccountName, givenName, sn, lastLogonTimestamp, Enabled, WhenCreated |
      Where-Object {$_.WhenCreated -lt (Get-Date).AddDays(-$($inactiveDays)) } |
      Select sAMAccountName, givenName, sn, @{n="LastLogonTimeStamp";e={[DateTime]::FromFileTime($_.LastLogonTimestamp)}}, Enabled, WhenCreated |
      Sort-Object LastLogonTimeStamp |
      Format-Table   
   }
   else
   {
      Write-Host "Enabled users that have not logged into $domainName in $inactiveDays days`r`nIncluding accounts that have never been logged into`r`nAccounts younger than $inactiveDays days not shown.`r`n-------------------------------------------------------"
      Search-ADAccount -UsersOnly -SearchBase "$DN" -AccountInactive -TimeSpan $inactiveDays`.00:00:00 | 
      Where-Object {$_.Enabled -eq $true } |
      Get-ADUser -Properties Name, sAMAccountName, givenName, sn, lastLogonTimestamp, Enabled, WhenCreated |
      Where-Object {$_.WhenCreated -lt (Get-Date).AddDays(-$($inactiveDays)) } |
      Select sAMAccountName, givenName, sn, @{n="LastLogonTimeStamp";e={[DateTime]::FromFileTime($_.LastLogonTimestamp)}}, Enabled, WhenCreated |
      Sort-Object LastLogonTimeStamp |
      Format-Table 
   }
 
}
else
{
   if($includeNoLastLogonAccounts -eq $false)
   {
      Write-Host "All users that have not logged into $domainName in $inactiveDays days`r`nExcluding accounts that have never been logged into`r`nAccounts younger than $inactiveDays days not shown.`r`n------------------------------------------------------"   
      Search-ADAccount -UsersOnly -SearchBase "$DN" -AccountInactive -TimeSpan $inactiveDays`.00:00:00 |
      Where-Object { $_.LastLogonDate -ne $null } |
      Get-ADUser -Properties Name, sAMAccountName, givenName, sn, lastLogonTimestamp, Enabled, WhenCreated |
      Where-Object { $_.WhenCreated -lt (Get-Date).AddDays(-$($inactiveDays)) } |
      Select sAMAccountName, givenName, sn, @{n="LastLogonTimeStamp";e={[DateTime]::FromFileTime($_.lastlogontimestamp)}}, Enabled, WhenCreated |
      Sort-Object LastLogonTimeStamp |
      Format-Table   
   }
   else
   {
      Write-Host "All users that have not logged into $domainName in $inactiveDays days`r`nIncluding accounts that have never been logged into`r`nAccounts younger than $inactiveDays days not shown.`r`n------------------------------------------------------"   
      Search-ADAccount -UsersOnly -SearchBase "$DN" -AccountInactive -TimeSpan $inactiveDays`.00:00:00 |
      Get-ADUser -Properties Name, sAMAccountName, givenName, sn, lastLogonTimestamp, Enabled, WhenCreated |
      Where-Object {$_.WhenCreated -lt (Get-Date).AddDays(-$($inactiveDays)) } |
      Select sAMAccountName, givenName, sn, @{n="LastLogonTimeStamp";e={[DateTime]::FromFileTime($_.lastlogontimestamp)}}, Enabled, WhenCreated |
      Sort-Object LastLogonTimeStamp |
      Format-Table   
   }
}

So there you have it: a quick and dirty report to locate users that have been inactive for over x days. Accounts that were just created and not yet logged on to would have a LastLogonTimeStamp of null and would therefore show up in this report, so I threw the Where-Object {$_.WhenCreated -lt (Get-Date).AddDays(-$($inactiveDays)) } bit in there to exclude, in every case, the user accounts that are younger than the number of days required to consider an account "inactive." Furthermore, you might want to resist the urge to go a step further and programmatically disable inactive user accounts. Most organizations use service accounts and other special accounts that may not get logged into very often, and yet all hell would break loose if you disabled them. I'm considering a system that disables the accounts, but also reads in a list of accounts which are "immune" and would therefore be ignored by the program. For a future post, I guess.
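If I ever do build that, the skeleton would probably look something like the sketch below. The exemption file path is made up, and note the -WhatIf; I would not run this for real until the immune list was thoroughly vetted:

# Sketch only: disable inactive, enabled accounts except those on an exemption list.
$immune = Get-Content C:\Scripts\ImmuneAccounts.txt   # one sAMAccountName per line (hypothetical path)
Search-ADAccount -UsersOnly -AccountInactive -TimeSpan 25.00:00:00 |
    Where-Object { $_.Enabled -eq $true -and $immune -notcontains $_.SamAccountName } |
    Disable-ADAccount -WhatIf
# Note: this still has the brand-new-account problem described above; a real version should filter on WhenCreated too.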

Lastly, I want to thank Ned of the AskDS blog, without whom this post would not have been possible. (Now it sounds like a Grammy speech...) But seriously, I asked him about this stuff and he knew all the answers right away. Helped me out immeasurably on this.

A(nother) Fresh Start

by Ryan 19. November 2011 11:18

This is my 4,223rd "Hello World!"

My domain, including this website, went down for a while due to a domain overhaul/migration.  Such is the life of an enthusiast who's never happy with good enough.

So once I got everything re-situated, tech refreshed, new domain set up and configured, new web server built, etc., I decided I'd try something new with my web presence.

See... I've been building websites for about 15 years now.  Never anything great or fancy, but I always had something.  Do you remember those early-to-mid-nineties websites with the flashing starry night backgrounds, animated gifs of rotating skulls and dripping blood, while Van Halen's Jump MIDI played in the background?  Yeah... I admit I was a part of that problem.  I'm always creating a new-looking site; it lasts for a year or two, then I get bored and radically redesign it.  The thing is, I always used to build websites in Notepad. (Notepad++ once I wised up a little more.)  I got pretty good at it too.  I can bang out HTML, PHP, Javascript and CSS from scratch without much thought.  I do not aspire to ever be known as a web developer or designer, but a good technologist should know how to do a little bit of everything.

Anyway, eventually I got a little more evolved and decided to try Wordpress. Wordpress is pretty cool. It makes standing up a new website too fast and easy to ignore. It has a plethora of user-created themes and plugins and it's completely customizable. What's not to like?

But with this go-round I wanted to try something that is not only new to me, but is 100% Microsoft-integrated.  By "Microsoft-integrated" I mainly mean being able to use an MSSQL backend instead of MySQL and offering .NET integration.  Not that there's anything wrong with MySQL, and not that I can't install PHP on my IIS web server with the click of a button, it's just that this site's purpose, other than for my geeky catharsis, is for exploring and learning Microsoft technologies.  I'm very comfortable with HTML and PHP, so writing web pages in .NET is a little intimidating, but it's also exciting to think about the potential.  I already love .NET for Windows development.  So the choice was simple.

Wait, no, it's not simple.  There are a ton of Content Management Systems (blogging platforms) out there: Umbraco, Orchard, DotNetNuke, etc.  Which one do I choose?

After I built and patched up my new Windows 2008 R2 Web Server, the first thing I did was install the Microsoft Web Platform Installer.  It's seriously bad ass; if you run Microsoft Web Servers, you need to at least take a look.  It was from there that I started looking at blogging platforms.  Wordpress is there and is obviously very popular, and it tried to tempt me back into its comforting embrace of PHP and MySQL.  But I resisted the urge; I was determined to learn something new.  So I chose Umbraco.  I toyed with it for about a day until I couldn't take it any more and uninstalled it.  I mean, I appreciate that it's a good and powerful product, but for me it just seemed very complicated with a kludgy UI. So today I again resisted the urge to go back to Wordpress, and instead tried out BlogEngine.NET.  And so far I'm really liking it.  It's not too complex, but it still sports some really awesome features that .NET and ASP have to offer.

So bear with me as I continue customizing and fleshing out this site.  Partly because I wanted to see and learn something new to me, and partly because I'm hoping to uncover something new with it that will make my 4,223rd blog even cooler.

About Me

Name: Ryan Ries
Location: Texas, USA
Occupation: Systems Engineer 

I am a Windows engineer and Microsoft advocate, but I can run with pretty much any system that uses electricity.  I'm all about getting closer to the cutting edge of technology while using the right tool for the job.

This blog is about exploring IT and documenting the journey.


Blog Posts (or Vids) You Must Read (or See):

Pushing the Limits of Windows by Mark Russinovich
Mysteries of Windows Memory Management by Mark Russinovich
Accelerating Your IT Career by Ned Pyle
Post-Graduate AD Studies by Ned Pyle
MCM: Active Directory Series by PFE Platforms Team
Encodings And Character Sets by David C. Zentgraf
Active Directory Maximum Limits by Microsoft
How Kerberos Works in AD by Microsoft
How Active Directory Replication Topology Works by Microsoft
Hardcore Debugging by Andrew Richards
The NIST Definition of Cloud by NIST


MCITP: Enterprise Administrator

VCP5-DCV

Server Fault: Profile for Ryan Ries

LOPSA

GitHub: github.com/ryanries

 

I do not discuss my employers on this blog and all opinions expressed are mine and do not reflect the opinions of my employers.