Wednesday, April 16, 2014

Spiceworks Migration: an existing connection was forcibly closed by the remote host

I recently had to migrate my Spiceworks install from a Windows XP machine to a Windows 2012 server.  There was nothing wrong with the Windows XP machine, other than XP being end of life.  We simply needed faster hardware, as we were planning on using the built-in help desk system; prior to that we had used Spiceworks only as a network scanning and monitoring solution.

I had tested the Active Directory integration on the Windows XP machine and had it all working nicely.  Unfortunately, when I brought it online on the Windows 2012 server, Active Directory users were no longer able to log in.

After looking through a lot of posts I checked the AD scanning settings and noticed that I was getting the error "an existing connection was forcibly closed by the remote host" regardless of the user account I tried to log in as.  In fact, after some more experimenting, I noticed that none of the credentials I tried were even being saved to the system; perhaps it only saves them when there is no error.  However, I did get a different error if I changed the name of the domain controller, and in the domain controller's security log I could see successful authentication attempts.

I was quite confused.  I tried all sorts of suggestions and even verified that LDAP was working correctly using the ldp tool, as one post suggested.  I looked through the logs but could find no mention of the keyword "LdapAD" or any of the other keywords that other people had mentioned finding.

Finally I ran across a post where the user solved their problem by changing the LDAP port to 3269.  That got me thinking: I had hit the same issue when first setting mine up, and had set my port to 3269 at that time to resolve it.  I tried removing that setting, and suddenly everything worked.  The only explanation I can think of is that the port it prefers differs between the WinXP machine I came from and the Win2012 machine I am now running on.  Since the AD machine never changed, that does not make sense to me, but it works.
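For context, 389 is the standard LDAP port, 636 is LDAP over SSL, 3268 is the Global Catalog, and 3269 is the Global Catalog over SSL.  A quick way to sanity-check which of these a domain controller is even accepting connections on is a simple TCP probe; here is a sketch in Python (the hostname is a placeholder for your own DC):

```python
import socket

def port_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Common Active Directory LDAP ports:
#   389  LDAP             636  LDAP over SSL
#   3268 Global Catalog   3269 Global Catalog over SSL
for port in (389, 636, 3268, 3269):
    print(port, port_open("dc01.example.local", port))
```

A closed port here doesn't prove anything is broken, but an open one rules out firewalls and narrows the problem to the application layer.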

Thursday, February 6, 2014

Installing Sharepoint 2010

The Problem
It took me far longer than I care to admit to figure out how to successfully install SharePoint 2010. I spent hours searching Google, giving up, trying stuff on my own, and going back to Google.  My major symptom was a "successful" install, but the web administrative interface was simply a white screen: no errors, absolutely nothing.

I tried everything I could think of, including creating my own tested and working website and copying the SharePoint files into it.  But the best I could ever do was generate a 500 error (better than a blank screen, at least) as soon as the SharePoint web.config file was in the website folder.  This entire time I had focused almost exclusively on the website as the source of the issue; it wasn't until I tried to uninstall (again) that I started thinking down another path.  This time the uninstall failed with the error:

Microsoft SharePoint 2010 uninstall did not complete successfully.
One or more required OFFICE components failed to complete successfully.  For more information consult the setup file.
This started me thinking about its dependency on Office.  I had always known SharePoint was heavily integrated with Office, but I had always assumed it was an optional feature, akin to a plugin.

Finally I happened on Marek Suscak's blog and was able to commiserate, having gone through many of the same failed steps.  Although in my case most of the steps simply didn't apply, as the values were already set to whatever the recommended fix was: I had already installed a Complete, rather than Stand-alone, instance, and I had given it a domain account to run under.

The Solution
As a result of Marek's blog post I ended up with a copy of Office installed on the server.  SharePoint claims it does not require Office to run; however, the install must have fixed some other prerequisite that had either installed corrupted or that I had tampered with, because the SharePoint configuration wizard was now able to get past the first couple of screens.  However, it still hung on the IIS screen, claiming IIS was not installed, even though I had multiple sites running on the server.

A refreshingly fast search later I came upon a TechNet forum thread in which someone was having the same issue, and down at the very bottom, the very last suggestion was to install the legacy IIS 6 Management Compatibility features.  I tried it, and like magic my SharePoint admin site was suddenly working.

My Complaints
There are all sorts of things I could complain about in this install, but the worst part of it all was simply the total lack of error messages.  Even the log files didn't show anything that stood out.  Considering Microsoft's big push with .NET to give detailed error messages, it felt like I had been thrown back to the Windows 95 days.

However, I think all of it could have been avoided if the prerequisites installer were simply better.  Either it failed, or I failed, to install all the prerequisites; either way, it gave me a clean bill of health when it was done.  Obviously it missed a couple of very critical components.

Thursday, January 16, 2014

Windows update crashes ASP.NET web application using EnableEventValidation and EnableViewStateMac

I recently got woken up because a production web application I am responsible for was down.  Most of the reports indicated the users were not getting .NET errors; the site simply was not responding to their requests for data.

When I logged onto the site and navigated to the problem area, I was greeted with this error: "A potentially dangerous Request.Form value was detected from the client", along with the recommendation to disable EnableEventValidation if the error were incorrect.

In an effort to just get the site back up and running I flipped that page directive.  I'm not sure why the clients never saw the error, but as soon as I flipped it I started seeing the same problem they were reporting: no errors, just no response to requests for data.

Since my development server was functioning perfectly, I had to use Response.Write calls on the production server to start tracking down what was going on.  I was shocked to learn that the Page.IsPostBack property was always returning true.  I got the site functioning again by finding an alternative value I could use in its place.

With the problem under control efforts were redirected to figuring out what triggered my bad morning.

My co-worker discovered that Windows updates had been applied by our hosting company to our production server.  Through the good old-fashioned method of uninstalling until things started working again, he discovered that KB2894842 and KB2894843 were the culprits.  Apparently Microsoft had found a security flaw in pages that had EnableViewStateMac turned off, and the fix had resulted in some rather odd behavior on our site.

EnableViewStateMac was turned off for our site because we were using JavaScript to submit data from one page to another; it was necessary to avoid the postback MAC validation error "Validation of viewstate MAC failed".  A better option might have been to use strictly .NET controls to handle the postback, but it was an old site that originated in classic ASP and the JavaScript had always worked very well.

Later versions of .NET can post to other pages on the site using the PostBackUrl attribute built into some controls.  I was hoping to capture the JavaScript generated by such a control and use it to modify my stand-alone JavaScript so it could perform a successful post to the next page.

I was disappointed to discover that using the PostBackUrl attribute not only generated additional .NET JavaScript, but also created a __PREVIOUSPAGE hidden field with a hash value for the next page to validate.  However, I was pleasantly surprised to find that as long as any control on the page was rendered with that attribute, my original JavaScript was able to submit the page successfully with no errors.  In fact, my site was now able to function with EnableViewStateMac turned on, giving me back that additional piece of security.

The compromise I finally settled on was adding a button with a PostBackUrl to my master page and hiding it with CSS.
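That compromise looks roughly like the following markup in the master page (the control ID and target page here are illustrative, not from the original site):

```aspx
<%-- Hidden button whose PostBackUrl makes ASP.NET emit the
     __PREVIOUSPAGE field the target page needs to validate. --%>
<asp:Button ID="btnCrossPagePostShim" runat="server"
    PostBackUrl="~/NextPage.aspx"
    Text="" style="display:none;" />
```

Because the button is hidden with CSS rather than Visible="false", it still renders, so the framework still emits the hidden field and cross-page post script.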

I am still not sure why the Page.IsPostBack property stopped functioning correctly, and that is part of what prompted this post.  Hopefully if I, or anyone else, runs into a similar problem in the future, this will provide a place to start looking.

Tuesday, December 10, 2013

Microsoft Windows Event Viewer Subscriptions

Most IT administrators love client-server models that allow them to manage or report on things from a centralized location.  As part of my company's enhanced security policies, I was looking for a way to monitor and report on all user logon and logoff events on all the computers across our network.

Surprisingly this does not seem to be an overly popular thing to attempt as finding information on it was quite difficult.

I found an article that explained how to set up Log Subscriptions, a feature Microsoft has included in its operating systems beginning with Vista.  In summary, it had me:

1. open an elevated command prompt (run as administrator)
2. on the central aggregator machine
2a. run the command "winrm qc -q"
2b. run the command "wecutil qc /q"
2c. open up the event viewer and create a new subscription under "Subscriptions"
3. on each client machine being subscribed to
3a. run the command "winrm qc -q"

Note: There are patches available for Windows 2003 and Windows XP that must be installed before they are able to participate as the subscribing or subscribed-to machine.

The account used to set up the subscription on the aggregator machine must be added to the "Event Log Readers" group on each client machine.  Using a domain admin account avoids this requirement.

Some logs, the Security log in particular, require extra permissions to subscribe to.  Even though the machine is accessed with a domain admin account, the log is still read by the local built-in "Network Service" account, so that account must also be added to the "Event Log Readers" local group on every client machine.  I was made aware of this by a post on Microsoft's TechNet website.


WinRM really came into its own in Windows Vista and later.  However, it can be installed on Windows XP and 2003 as well by downloading the patch from Microsoft; see the associated KB article for details.

Once it is installed, note that the new "server" may be listening on port 80, which means you would need to set up a separate subscription on the aggregator machine for all such installs and change the default port it queries to port 80.

Also, there is no "Event Log Readers" group on the older OSs.  To allow the security logs to be read on these older machines, the registry needs to be modified on Windows 2003, or the service needs to be run as Local System on Windows XP.

Another note for older machines: there are newer logs that older machines do not support.  If you try to create a subscription for logs that the older machines can't handle, they will likely throw error 0x6 and not return any data at all.


Of course the log viewer is pretty limited in storage and is certainly designed with reporting speed in mind, which meant I needed to find some way to save the data elsewhere.  The logical choice for me was Microsoft SQL Server, since that is what my company uses.

After several days of pain and suffering learning PowerShell for the first time, I came up with the following script, which extracts the data I cared about most from the ForwardedEvents log, uploads it to SQL, and deletes the log.

$WriteTableName = "RawLogs"
$ColumnsForQuery = "Level, EntryDate, Source, EventID, TaskCategory, LogName, Computer, TargetName, Message"
$ParamNames = "@Level, @EntryDate, @Source, @EventID, @TaskCategory, @LogName, @Computer, @TargetName, @Message"
$WriteConnectionString = "server=servername;Trusted_Connection=Yes;Database=EventLogs; Connection Timeout=120"
$WriteConn = New-Object System.Data.SqlClient.SqlConnection
$WriteConn.ConnectionString = $WriteConnectionString
$WriteConn.Open() | Out-Null
[string]$SQLQuery = ("INSERT INTO {0} ({1}) VALUES ({2})" -f $WriteTableName, $ColumnsForQuery, $ParamNames)
#Destination table:
#CREATE TABLE [dbo].[RawLogs] (
#	[Level] [varchar] (100) NULL ,
#	[EntryDate] [varchar] (100) NULL ,
#	[Source] [varchar] (500) NULL ,
#	[EventID] [varchar] (100) NULL ,
#	[TaskCategory] [varchar] (500) NULL ,
#	[LogName] [varchar] (500) NULL ,
#	[Computer] [varchar] (500) NULL ,
#	[TargetName] [varchar] (500) NULL ,
#	[Message] [varchar] (max) NULL )

#time window: milliseconds * seconds * minutes * hours = 1000*60*60*12 = 12 hours
$query = '*[System[TimeCreated[timediff(@SystemTime) <= 43200000]]]'
[xml]$xmlevents = wevtutil qe ForwardedEvents /q:$query /e:Events
$DataImport = $xmlevents.Events.Event | select @{Name="Level"; Expression={$_.System.Level}},
@{Name="EntryDate"; Expression={$_.System.TimeCreated.SystemTime}},
@{Name="Source"; Expression={$_.System.Provider.Name}},
@{Name="EventID"; Expression={$_.System.EventID}},
@{Name="TaskCategory"; Expression={$_.RenderingInfo.Task}},
@{Name="LogName"; Expression={$_.RenderingInfo.Channel}},
@{Name="Computer"; Expression={$_.System.Computer}},
@{Name="TargetName"; Expression={ $_.EventData.InnerXml.substring($_.EventData.InnerXml.indexOf('>', $_.EventData.InnerXml.indexOf('TargetUserName'))+1,$_.EventData.InnerXml.indexOf('<', $_.EventData.InnerXml.indexOf('TargetUserName'))-($_.EventData.InnerXml.indexOf('>', $_.EventData.InnerXml.indexOf('TargetUserName'))+1)) }},
@{Name="Message"; Expression={$_.RenderingInfo.Message}}
wevtutil.exe cl ForwardedEvents # erase the event log
ForEach ($Obj in $DataImport)
{
    If ($Obj -eq $Null) { continue }
    $writeCmd = New-Object System.Data.SqlClient.SqlCommand
    $writeCmd.Connection = $WriteConn
    # Empty XML elements come through as XmlElement objects instead of strings;
    # treat those (and nulls) as database NULLs.
    ForEach ($Col in "Level","EntryDate","Source","EventID","TaskCategory","LogName","Computer","TargetName","Message")
    {
        $Value = $Obj.$Col
        If ($Value -ne $Null -and $Value.GetType().ToString() -ne "System.Xml.XmlElement") {
            $writeCmd.Parameters.AddWithValue("@$Col", $Value) | Out-Null
        } else {
            $writeCmd.Parameters.AddWithValue("@$Col", [DBNull]::Value) | Out-Null
        }
    }
    $writeCmd.CommandText = $SQLQuery
    $Null = $writeCmd.ExecuteNonQuery()
}
$WriteConn.Close()

This script will lose any events that come in between when the data is loaded into PowerShell's memory and the next line, where the log is truncated.  It also doesn't recover any data lost due to SQL upload errors.

It is also important to note that this script must be run with elevated permissions in Task Scheduler; otherwise it will fail to clear the event log on each run.
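The ugliest part of the script above is the TargetName expression, which walks indexOf offsets to pull the text of the TargetUserName element out of the raw event XML.  For anyone adapting this, the same extraction is much sturdier with a real XML parser; here is a sketch in Python against a trimmed sample event (the namespace and Data layout follow the XML that wevtutil emits):

```python
import xml.etree.ElementTree as ET

# Namespace used by Windows event XML.
NS = {"e": "http://schemas.microsoft.com/win/2004/08/events/event"}

def target_user_name(event_xml):
    """Return the TargetUserName value from one event's XML, or None."""
    root = ET.fromstring(event_xml)
    node = root.find(".//e:Data[@Name='TargetUserName']", NS)
    return node.text if node is not None else None

sample = """<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <EventData><Data Name="TargetUserName">jsmith</Data></EventData>
</Event>"""
print(target_user_name(sample))  # jsmith
```

The parser approach also degrades gracefully: events without a TargetUserName field come back as None instead of a substring exception.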


I did notice an odd problem on Windows 7 machines that were running VMware Player.  When I tried to enable WinRM I got the error:

WinRM firewall exception will not work since one of the network connection types on this machine is set to Public

Unfortunately, on a domain-connected machine this setting cannot be easily modified through the UI; fortunately, I found a PowerShell script that was able to do the trick for me:

$nlm = [Activator]::CreateInstance([Type]::GetTypeFromCLSID([Guid]"{DCB00C01-570F-4A9B-8D69-199FDBA5723B}"))
$connections = $nlm.GetNetworkConnections()
$connections | foreach {
    if ($_.GetNetwork().GetCategory() -eq 0) # 0 = Public
    {
        $_.GetNetwork().SetCategory(1)       # 1 = Private
    }
}
Tuesday, October 8, 2013

WebResource.axd The resource cannot be found 404

Web Resources are a very nice feature of .NET, giving the developer the capability to bundle a bunch of files into a single DLL for easy transport.  However, the mechanism is not very forgiving when it comes to coding errors.  There are a lot of sites out there that offer details on how to add a web resource; what is missed is just how precisely each piece must be written.  Unfortunately, when it breaks, the errors are simple 404s, which are not helpful in the least.

Step 1: Include the file in your project, right-click it, go to Properties, and set it as an Embedded Resource.

Step 2: add: [assembly: WebResource("DefaultNamespace.Folder1.Folder2.FileName.png", "image/png")]

This step has a couple of gotchas.  First, the contentType (the second parameter) must be correct; WebResource.axd will serve up the file with this content type, so a content type of text/html used with an image file will result in a screen full of ASCII.

Second, the first parameter is the name of the file after it has been compiled.  A reflector tool comes in very handy here for verification purposes; I use dotPeek.  When compiled, the file is named using the folder structure it is contained in, prefixed by the default namespace, with periods as separators.  The default namespace can be found by right-clicking on the project name and opening Properties.  Note: it is NOT the assembly name.  Failure to get this part correct will simply result in the file never being found; again, a 404 error.
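The naming rule itself is mechanical: default namespace, then each folder, then the filename, all joined with periods.  A small sketch of that rule (Python, with example names; note the real compiler also mangles characters that are invalid in identifiers, which this sketch ignores):

```python
def embedded_resource_name(default_namespace, relative_path):
    """Build the compiled resource name for an embedded file:
    default namespace + each folder + filename, joined with periods."""
    parts = relative_path.replace("\\", "/").split("/")
    return ".".join([default_namespace] + parts)

print(embedded_resource_name("DefaultNamespace", "Folder1/Folder2/FileName.png"))
# DefaultNamespace.Folder1.Folder2.FileName.png
```

Comparing the string this rule produces against what dotPeek shows inside the DLL is the quickest way to catch a mismatch.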

Step 3: Use something like this:

this.Page.ClientScript.GetWebResourceUrl(this.GetType(), "DefaultNamespace.Folder1.Folder2.FileName.png");

this.Page.ClientScript.RegisterClientScriptResource(this.GetType(), "DefaultNamespace.Folder1.Folder2.FileName.js");

Those lines allow you to get the URL for an image, or add a script reference at the beginning of the HTML output for the page.  Notice how the filename referenced is IDENTICAL to the filename used in the [assembly: WebResource] attribute above?  If the two do not match, and are not both correct, the result is a 404 error; a URL or script reference may still be generated, but it will not be the correct one.

The GetType() call is very important as well.  The type referenced must at the least be in the same project as the embedded resource; one person said it needed to be in the same folder as well, but I am fairly certain that is not the case.  To play it safe I just use this.GetType() in the local file or Resource.cs file that generates the WebResource URL.  I say playing it safe because I have been bitten before by renaming my Resource.cs file and having my typeof(PageName) reference a completely different, but still valid, object.  This resulted in no compile errors and a good-looking URL, but always 404 errors.


It can be very helpful to determine which /WebResource.axd URL in your HTML source is the one you are looking for.  To help with this, someone over at Telerik has created a nice page that allows you to decrypt the URL and see a friendlier name.

Something to look for in the friendly name is your assembly name; if it does not reference the assembly that houses the embedded resource, then you probably got the Type wrong when generating the URL.

Tuesday, October 1, 2013

Upgrading from Microsoft.Practices.EnterpriseLibrary v 3.0 to v 6.0

Obviously the versions I was jumping between are quite different; this application had not been upgraded in some time.  Taking one step at a time, I decided to start by swapping out the DLLs.  I ran into a minor issue when I only modified the references for one of the projects in my solution to the new version: somehow one of the other projects rolled the files back to the previous version at compile time.  Re-updating the files and modifying the references for all the projects at once solved that issue.

But then I was presented with an odd error:

The type or namespace name 'Practices' does not exist in the namespace 'Microsoft'

It was especially confusing since IntelliSense and object name highlighting were detecting the presence of the namespace correctly.  After some web searching I ran into a thread which implied that the target version of the framework I was building for needed to be newer or different than the one I was currently using.  I had yet to convert the project over, so I was still compiling for version 3.5 of the framework.  I changed that to version 4.5 and was then able to build successfully.


Unfortunately, that was not the end of my problems.  I was using asp:ScriptManager in my project ( any control that references ScriptManager, such as the replacement control from the AjaxControlToolkit, will also have this issue ), and as a result, when I tried running the project under the .NET 4.0 web engine, I got this error:

Could not load type 'System.Runtime.CompilerServices.ExtensionAttribute'....

I was unable to find a workaround for that error as long as I was using the ScriptManager control; if I removed that control, the web application worked perfectly.  Since my server only had Windows 2003 on it, upgrading IIS to .NET 4.5 was not an option.  I finally decided to roll back from EnterpriseLibrary v6.0 to v5.0.414 and build the project for .NET 4.0 instead of 4.5.

This worked well until I started getting an odd error:

Could not load file or assembly 'Microsoft.Practices.EnterpriseLibrary.Data, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference.

It turned out that one of my solutions was hanging onto an old version of one of my other projects' DLLs; even though the primary project referenced the sub project in my solution, apparently it did not refresh that sub project's DLL in its cache.  This was an issue since at one point I had the sub project referencing an older version of the EnterpriseLibrary DLL.  I removed and re-added the primary project's reference to the sub project, the cache refreshed itself, and all was well.
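When a stale reference like this can't be tracked down, another way out of manifest-mismatch errors is an assembly binding redirect in web.config.  This is only a sketch: the version range and new version here are assumptions based on the v5.0.414 rollback above, while the assembly name and public key token come from the error message itself.

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- Redirect any older reference to the version actually deployed. -->
        <assemblyIdentity name="Microsoft.Practices.EnterpriseLibrary.Data"
                          publicKeyToken="31bf3856ad364e35" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-5.0.414.0" newVersion="5.0.414.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```

A redirect papers over the mismatch at load time, though fixing the stale project reference, as described above, is the cleaner cure.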

Thursday, August 1, 2013

Organizing the Kindle Library with Kindle Collection Manager

While I was deployed to Afghanistan I did a lot of reading.  With limited technological resources that was the best method of passing the time.

My girlfriend had been wonderful enough to purchase me an Amazon Kindle prior to my leaving the country.  I had taken the time to download every free public domain PDF book I could find, which was quite a few.  However, while the Kindle is the best e-reader I have seen, its organization and search features are some of the worst I have seen.

To help overcome this issue, I found a small program called Kindle Collection Manager, which helps organize the books on the Kindle.  It is not a fancy, feature-laden program like Calibre; it is designed to be a super lightweight way of modifying the library file directly on the Kindle itself.

While the Kindle can hold close to 3000 books, its OS is incapable of running at a decent speed with so much of its memory consumed.  So while the Kindle may not be fillable to capacity with books, this program does allow me to group about 1000 books into manageable collections.  I believe that should be enough books to carry around at any one time; the rest of my digital library can remain organized on my computer, and I can swap out collections as desired.