Tuesday, December 10, 2013

Microsoft Windows Event Viewer Subscriptions

Most IT administrators love client-server models that let them manage and report on their systems from a centralized location.  As part of my company's enhanced security policies, I was looking for a way to monitor and report on all user logon and logoff events across every computer on our network.

Surprisingly, this does not seem to be a popular thing to attempt, as finding information on it was quite difficult.

At windowsecurity.com I found an article that explained how to set up Log Subscriptions, a feature which Microsoft has included in their operating systems beginning with Vista.  In summary it had me:

1. open an elevated command prompt (run as administrator)
2. on the central aggregator machine
2a. run the command "winrm qc -q"
2b. run the command "wecutil qc /q"
2c. open Event Viewer and create a new subscription under "Subscriptions"
3. on each client machine being subscribed to
3a. run the command "winrm qc -q"

Note: There are patches available for Windows Server 2003 and Windows XP that must be installed before those machines can participate, either as the collector or as a machine being subscribed to.

The account used to set up the subscription on the aggregator machine must be added to the "Event Log Readers" group on each client machine.  Using a domain admin account avoids this requirement.

Some logs, the Security log in particular, require extra permissions to subscribe to.  Even when the machine is accessed by a domain admin account, the log is still read by the built-in "Network Service" account, so that account must also be added to the "Event Log Readers" local group on every client machine.  I was made aware of this by a post on Microsoft's TechNet website.
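
Assuming the built-in names, both memberships can be added from an elevated prompt on each client machine with something like the following (the collector account is a placeholder):

net localgroup "Event Log Readers" "DOMAIN\CollectorAccount" /add
net localgroup "Event Log Readers" "NT AUTHORITY\NETWORK SERVICE" /add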

WINDOWS XP and 2003 NOTE:

WinRM really came into its own in Windows Vista and later.  However, it can be installed on Windows XP and 2003 as well by downloading the patch from Microsoft Here, and the KB for it is Here.

Once installed, it should be noted that the new "server" may be listening on port 80 (the default for WinRM 1.1, where WinRM 2.0 uses 5985), which means you would need to set up a separate subscription on the aggregator machine for all such installs and change the default port it queries to 80.

Also, there is no "Event Log Readers" group on the older OSs.  In order to allow the Security log to be read on these older machines, the registry needs to be modified on Windows 2003, or the service needs to be run as Local System on Windows XP, as detailed here.
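
If memory serves (the article linked above is the real authority here), the Windows 2003 change amounts to appending a read-access entry for Network Service to the SDDL string stored in the Security log's CustomSD registry value, along these lines:

HKLM\SYSTEM\CurrentControlSet\Services\Eventlog\Security
CustomSD (REG_SZ): append (A;;0x1;;;NS) to the existing value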

Another note for older machines: there are newer logs that they do not support.  If you try to create a subscription that includes logs the older machines can't handle, they will likely throw error 0x6 and not return any data at all.

STAGE 2

Of course, the log viewer is pretty limited in storage; it is clearly designed with reporting speed in mind rather than long-term retention.  That meant I needed to find some way to save the data elsewhere.  The logical choice for me was Microsoft SQL Server, since that is what my company uses.

After several days of pain and suffering learning PowerShell for the first time, I came up with the following script, which extracts the data I cared about most from the ForwardedEvents log, uploads it to SQL Server, and clears the log.

$WriteTableName = "RawLogs"
$ColumnsForQuery = "Level, EntryDate, Source, EventID, TaskCategory, LogName, Computer, TargetName, Message"
$ParamNames = "@Level, @EntryDate, @Source, @EventID, @TaskCategory, @LogName, @Computer, @TargetName, @Message"
$WriteConnectionString = "server=servername;Trusted_Connection=Yes;Database=EventLogs; Connection Timeout=120"
$WriteConn = New-Object System.Data.SqlClient.SqlConnection
$WriteConn.ConnectionString = $WriteConnectionString
$WriteConn.Open() | Out-Null
[string]$SQLQuery = ("INSERT INTO {0} ({1}) VALUES ({2})" -f
$WriteTableName,
$ColumnsForQuery,
$ParamNames
)
#CREATE TABLE [dbo].[RawLogs] (
# [Level] [varchar] (100) NULL ,
# [EntryDate] [varchar] (100) NULL ,
# [Source] [varchar] (500) NULL ,
# [EventID] [varchar] (100) NULL ,
# [TaskCategory] [varchar] (500) NULL ,
# [LogName] [varchar] (500) NULL ,
# [Computer] [varchar] (500) NULL ,
# [TargetName] [varchar] (500) NULL ,
# [Message] [varchar] (max) NULL
#) ON [PRIMARY]
#GO


# $yesterday = (Get-Date) - (New-TimeSpan -day 1)
#Get-WinEvent -logname "ForwardedEvents" | where {$_.timecreated -ge $yesterday} |
#Format-Table TimeCreated, ID, ProviderName, LevelDisplayName, Message -AutoSize -Wrap | out-file  C:\ForwardedEvents.txt
# $events = Get-WinEvent -logname "ForwardedEvents" -MaxEvents 5 | where {$_.timecreated -ge $yesterday}
#time calculation = miliseconds * seconds * minutes * hours = 1000*60*60*12 = 12 hours
$query = '*[System[TimeCreated[timediff(@SystemTime) <= 43200000]]]' #43200000]]]'
[xml]$xmlevents = wevtutil qe ForwardedEvents /q:$query /e:Events
#$xmlevents.Events.Event | %{ $_.System } | select Computer | export-csv 'C:\ForwardedEvents.txt' -NoTypeInformation
#$xmlevents.Events.Event | select @{Name="EventID"; Expression={$_.System.EventID}},@{Name="Computer"; Expression={$_.System.Computer}},@{Name="Message"; Expression={$_.RenderingInfo.Message}} | export-csv 'C:\ForwardedEvents.txt' -NoTypeInformation
#@{Name="TargetName"; Expression={ $_.EventData.InnerXml.substring($_.EventData.InnerXml.indexOf('TargetUserName'),$_.EventData.InnerXml.indexOf('TargetUserName')+20) }},
$DataImport = $xmlevents.Events.Event | select @{Name="Level"; Expression={$_.System.Level}},
@{Name="EntryDate"; Expression={$_.System.TimeCreated.SystemTime}},
@{Name="Source"; Expression={$_.System.Provider.Name}},
@{Name="EventID"; Expression={$_.System.EventID}},
@{Name="TaskCategory"; Expression={$_.RenderingInfo.Task}},
@{Name="LogName"; Expression={$_.RenderingInfo.Channel}},
@{Name="Computer"; Expression={$_.System.Computer}},
@{Name="TargetName"; Expression={ $_.EventData.InnerXml.substring($_.EventData.InnerXml.indexOf('>', $_.EventData.InnerXml.indexOf('TargetUserName'))+1,$_.EventData.InnerXml.indexOf('<', $_.EventData.InnerXml.indexOf('TargetUserName'))-($_.EventData.InnerXml.indexOf('>', $_.EventData.InnerXml.indexOf('TargetUserName'))+1)) }},
@{Name="Message"; Expression={$_.RenderingInfo.Message}}
wevtutil.exe cl ForwardedEvents # erase the event log
#$DataImport
#Exit
ForEach($Obj in $DataImport)
{
$writeCmd = new-object System.Data.SqlClient.SqlCommand
$writecmd.Connection = $WriteConn
If ($Obj -ne $Null)
{
        If ($Obj.Level -ne $Null -and $Obj.Level.GetType().ToString() -ne "System.Xml.XmlElement") { $writeCmd.Parameters.AddWithValue("@Level", $Obj.Level) | out-null }
else { $writeCmd.Parameters.AddWithValue("@Level", [DBNull]::Value)  | out-null }
If ($Obj.EntryDate -ne $Null -and $Obj.EntryDate.GetType().ToString() -ne "System.Xml.XmlElement") { $writeCmd.Parameters.AddWithValue("@EntryDate", $Obj.EntryDate) | out-null }
else { $writeCmd.Parameters.AddWithValue("@EntryDate", [DBNull]::Value)  | out-null }
If ($Obj.Source -ne $Null -and $Obj.Source.GetType().ToString() -ne "System.Xml.XmlElement") { $writeCmd.Parameters.AddWithValue("@Source", $Obj.Source) | out-null }
else { $writeCmd.Parameters.AddWithValue("@Source", [DBNull]::Value)  | out-null }
If ($Obj.EventID -ne $Null -and $Obj.EventID.GetType().ToString() -ne "System.Xml.XmlElement") { $writeCmd.Parameters.AddWithValue("@EventID", $Obj.EventID) | out-null }
else { $writeCmd.Parameters.AddWithValue("@EventID", [DBNull]::Value)  | out-null }
If ($Obj.TaskCategory -ne $Null -and $Obj.TaskCategory.GetType().ToString() -ne "System.Xml.XmlElement") { $writeCmd.Parameters.AddWithValue("@TaskCategory", $Obj.TaskCategory) | out-null }
else { $writeCmd.Parameters.AddWithValue("@TaskCategory", [DBNull]::Value)  | out-null }
If ($Obj.LogName -ne $Null -and $Obj.LogName.GetType().ToString() -ne "System.Xml.XmlElement") { $writeCmd.Parameters.AddWithValue("@LogName", $Obj.LogName) | out-null }
else { $writeCmd.Parameters.AddWithValue("@LogName", [DBNull]::Value)  | out-null }
If ($Obj.Computer -ne $Null -and $Obj.Computer.GetType().ToString() -ne "System.Xml.XmlElement") { $writeCmd.Parameters.AddWithValue("@Computer", $Obj.Computer) | out-null }
else { $writeCmd.Parameters.AddWithValue("@Computer", [DBNull]::Value)  | out-null }
If ($Obj.TargetName -ne $Null -and $Obj.TargetName.GetType().ToString() -ne "System.Xml.XmlElement") { $writeCmd.Parameters.AddWithValue("@TargetName", $Obj.TargetName) | out-null }
else { $writeCmd.Parameters.AddWithValue("@TargetName", [DBNull]::Value)  | out-null }
If ($Obj.Message -ne $Null -and $Obj.Message.GetType().ToString() -ne "System.Xml.XmlElement") { $writeCmd.Parameters.AddWithValue("@Message", $Obj.Message) | out-null }
else { $writeCmd.Parameters.AddWithValue("@Message", [DBNull]::Value)  | out-null }
$writecmd.CommandText = $SQLQuery
$Null = $writecmd.ExecuteNonQuery()
}
}
$WriteConn.close()


This script will lose any events that arrive between the moment the data is loaded into PowerShell's memory and the next line where the log is cleared.  It also doesn't recover any data lost due to SQL upload errors.

It is also important to note that this script must be run with elevated permissions in Task Scheduler, otherwise it will fail to clear the event log on each run.
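
For example, a task created along these lines will run elevated; the task name, script path, schedule, and account are illustrative, and whichever account you use also needs rights to clear the log and to reach SQL Server, since the script relies on a trusted connection:

schtasks /Create /TN "UploadForwardedEvents" /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\UploadForwardedEvents.ps1" /SC HOURLY /RU DOMAIN\ServiceAccount /RP * /RL HIGHEST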

VMWARE NOTE:

I did notice an odd problem on Windows 7 machines that were running VMware Player.  When I tried to enable WinRM I got the error:

WinRM firewall exception will not work since one of the network connection types on this machine is set to Public

Unfortunately, on a domain-connected machine this setting cannot be easily modified; fortunately I found a PowerShell script here that did the trick for me:


# Network List Manager COM object (referenced by its well-known CLSID).
$nlm = [Activator]::CreateInstance([Type]::GetTypeFromCLSID([Guid]"{DCB00C01-570F-4A9B-8D69-199FDBA5723B}"))
$connections = $nlm.GetNetworkConnections()
# Recategorize every Public (0) network connection as Private (1) so the
# WinRM firewall exception can be created.
$connections | ForEach-Object {
    if ($_.GetNetwork().GetCategory() -eq 0) {
        $_.GetNetwork().SetCategory(1)
    }
}

Tuesday, October 8, 2013

WebResource.axd The resource cannot be found 404

Web Resources are a very nice feature of .NET, giving the developer the ability to bundle a bunch of files into a single DLL for easy transport.  However, the mechanism is not very forgiving when it comes to coding errors.  There are a lot of sites out there that offer details on how to add a web resource; what they miss is just how precisely each piece must be written.  Unfortunately, when it breaks, the errors are simple 404s, which are not helpful in the least.

Step 1: Include the file in your project, right-click it, go to Properties, and set it as an Embedded Resource.

Step 2: add: [assembly: WebResource("DefaultNamespace.Folder1.Folder2.FileName.png", "image/png")]

This step has a couple of gotchas.  First, the content type (the second parameter) must be correct; WebResource.axd will serve up the file with this content type, so a content type of text/html used with an image file will result in a screen full of ASCII.

Second, the first parameter is the name of the file after it has been compiled.  A reflector tool comes in very handy here for verification purposes; I use dotPeek.  When compiled, the file is named using the folder structure it is contained in, prefixed by the default namespace, with periods as separators.  The default namespace can be found by right-clicking on the project name and opening Properties.  Note: it is NOT the assembly name.  Failure to perform this part correctly will simply result in the file never being found, again a 404 error.
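
If you'd rather not fire up a decompiler, another way to verify the compiled names is to have the assembly list them for you; a minimal sketch (the class is purely illustrative):

using System;
using System.Reflection;

public static class ResourceNameDump {
    public static void Main() {
        // Print every embedded resource name compiled into this assembly so the
        // exact string for the [assembly: WebResource(...)] attribute can be copied.
        foreach (string name in Assembly.GetExecutingAssembly().GetManifestResourceNames())
            Console.WriteLine(name);
    }
}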

Step 3: Use something like this:

this.Page.ClientScript.GetWebResourceUrl(this.GetType(), "DefaultNamespace.Folder1.Folder2.FileName.png");

this.Page.ClientScript.RegisterClientScriptResource(this.GetType(), "DefaultNamespace.Folder1.Folder2.FileName.js");

Those lines let you get the URL for an image, or add a script reference at the beginning of the HTML output for the page.  Notice how the filename referenced is IDENTICAL to the filename used in the [assembly: WebResource] attribute up above?  If the two do not match, the result is a 404 error; a URL or script reference may still be generated, but it will not be the correct one.
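
For reference, the generated reference takes the familiar WebResource.axd form, where d and t are opaque encrypted tokens:

<script src="/WebResource.axd?d=...&t=..." type="text/javascript"></script>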

Also, the GetType() call is very important.  The type referenced must at the very least be in the same project as the embedded resource; one person said it needed to be in the same folder as well, but I am fairly certain that is not the case.  To play it safe I just use this.GetType() in the local file or Resource.cs file that generates the WebResource URL.  I say playing it safe because I have been bitten before by renaming my Resource.cs file and having my typeof(PageName) reference a completely different, but still valid, object.  This resulted in no compile errors and a good-looking URL, but always 404 errors.

Troubleshooting:

It can be very helpful to determine which /WebResource.axd URL in your HTML source code is the one you are looking for.  To help with this, someone over at Telerik has created a nice page that lets you decrypt the URL and see a friendlier name.

Something to look for in the friendly name is your assembly name; if it does not reference the assembly that houses the embedded resource, then you probably screwed up the Type when generating the URL.

Tuesday, October 1, 2013

Upgrading from Microsoft.Practices.EnterpriseLibrary v 3.0 to v 6.0

Obviously the versions I was jumping between are quite different; this application had not been upgraded in some time.  Taking one step at a time, I decided to start by swapping out the DLLs.  I ran into a minor issue when I modified the references to the new version for only one of the projects in my solution: somehow one of the other projects rolled the files back to the previous version at compile time.  Re-updating the files and modifying the references for all the projects at once solved that issue.

But then I was presented with an odd error:

The type or namespace name 'Practices' does not exist in the namespace 'Microsoft'

It was especially confusing since IntelliSense and object-name highlighting were detecting the presence of the namespace correctly.  After some web searching I ran into this thread, which implied that the target version of the framework I was building for needed to be newer or different than the one I was currently using.  I had yet to convert the project over, so I was still compiling for version 3.5 of the framework.  I changed that to version 4.5 and was then able to build successfully.

------------

Unfortunately that was not the end of my problems.  I was using asp:ScriptManager in my project (any control that references ScriptManager, such as the replacement control from the AjaxControlToolkit, will also have this issue), and as a result, when I tried running the project under the .NET 4.0 web engine I got this error:

Could not load type 'System.Runtime.CompilerServices.ExtensionAttribute'....

I was unable to find a workaround for that error as long as I was using the ScriptManager control; if I removed the control, the web application worked perfectly.  Since my server only had Windows 2003 on it, upgrading IIS to .NET 4.5 was not an option.  I finally decided to roll back from Enterprise Library v6.0 to v5.0.414 and build the project for .NET 4.0 instead of 4.5.

This worked well until I started getting an odd error:

Could not load file or assembly 'Microsoft.Practices.EnterpriseLibrary.Data, Version=6.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference.

It turned out that one of my solutions was hanging onto an old version of another project's DLL: even though the primary project referenced the sub-project within the solution, it apparently did not refresh that sub-project's DLL in its cache.  This was an issue because at one point the sub-project had referenced version 6.0.0.0 of the Enterprise Library DLL.  I removed and re-added the primary project's reference to the sub-project, the cache refreshed itself, and all was well.

Thursday, August 1, 2013

Organizing the Kindle Library with Kindle Collection Manager

While I was deployed to Afghanistan I did a lot of reading.  With limited technological resources that was the best method of passing the time.

My girlfriend had been wonderful enough to purchase me an Amazon Kindle prior to my leaving the country.  I had taken the time to download every free public domain PDF book I could find, which was quite a few.  However, while the Kindle is the best e-reader I have seen, its organization and search features are some of the worst I have seen.

To help overcome this issue, I found a small program called Kindle Collection Manager, which helps organize the books on the Kindle.  It is not a fancy, many-featured program like Calibre; it is designed to be a super-lightweight method of modifying the library file directly on the Kindle itself.

While the Kindle can hold close to 3000 books, its OS is incapable of running at a decent speed with so much of its memory consumed.  So while the Kindle may be incapable of being filled to capacity with books, this program does allow me to group about 1000 books into manageable collections.  I believe that should be enough books to carry around at any one time.  The rest of my digital library can remain organized on my computer and I can swap out collections as desired.

Monday, July 8, 2013

Uploading Large files with HTML 5

The task was to fix a Silverlight-based web uploader.  The application was needed because so many companies block port 21 for FTP; we were looking for a web-based solution that allowed similar functionality through the firewall.

Since Silverlight was end-of-life, I started by upgrading the system to a free Flash-based plug-in.  It worked beautifully in IE, but Firefox and Chrome both had issues with files larger than 500MB.  Most of my research on the Internet claimed this was a limitation/bug in Flash; I thought it was odd that it worked correctly in IE, though.  I later discovered it had nothing to do with the browser type, but rather with random chance in the memory space .NET had available to allocate, which just happened to work out while I was testing in IE.

When I was unable to quickly find a solution to the Flash issue, I turned to HTML5.  Being a new technology, there were very few tools built for it.  I found two HTML5 uploaders, but neither of them seemed to work for me, probably because I did not fully understand the technology at the time.  So I decided to roll my own solution using HTML5, JavaScript, and chunking.

I got the entire solution working correctly, including uploading the file chunks into SQL Server and using varbinary(max) to combine them in the database.  As I started testing larger files, I noticed that the code combining the chunks ran exponentially slower the more chunks there were, presumably because each append rewrites the ever-growing blob.  I was expecting it to slow down at a linear rate, but with a 1GB file it took a couple of hours to combine 1024 1MB chunks, which was unacceptable to me.

Next I tried combining the chunks on the web server by shelling out to the command prompt's "type" command.  It worked perfectly for small files.  However, the number of chunks eventually surpassed the maximum command-line length that can be passed to a program, which brought that solution to a halt.

Then I tried combining the files in C#/.NET, which ended up being hands down the fastest method yet.  However, I then ran into the issue of the file size overloading the byte array used for uploading into SQL Server.  This discovery turned out to be one of the most important ones yet: the byte array erroring out was the root of my Flash issue as well.  The error #2038 that Flash had been returning was a generic error code that simply meant something had gone wrong on the server; Flash had no idea what the problem was.  For most people across the web this error seemed to stem from a permissions issue, but for me the byte array problem was the source.  Manually stepping through the .NET code when Flash sends the request seems to be the only reliable method of tracking down errors like this.
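
For reference, a minimal sketch of the streaming approach (the chunk naming convention and paths are illustrative, not my production code); the point is to append each chunk to one output stream instead of ever building a giant byte array:

using System.IO;
using System.Linq;

public static class ChunkCombiner {
    // Combines "upload.part0", "upload.part1", ... into one file while holding
    // only a small buffer in memory at any given time.
    public static void Combine(string chunkDirectory, string outputPath) {
        var chunks = Directory.GetFiles(chunkDirectory, "*.part*")
            .OrderBy(p => int.Parse(Path.GetExtension(p).Substring(5))); // 5 = ".part".Length

        using (var output = new FileStream(outputPath, FileMode.Create)) {
            foreach (string chunk in chunks) {
                using (var input = File.OpenRead(chunk))
                    input.CopyTo(output); // stream copy; no giant byte[] required
            }
        }
    }
}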

So I used OPENROWSET with the BULK option in SQL Server.  This worked perfectly when run manually, but when I tried to automate it I ran into the issue that my SQL service ran under a local account that did not exist on the web server and so could not authenticate to grab the file.  EXECUTE AS LOGIN did not work to impersonate an Active Directory account either, because the local SQL service did not have permission to query the AD server.

Having .NET copy the newly combined file over to the database server was the key to solving this final problem.  After the database upload completed, I had .NET clean up all the chunks and the merged file, giving me a finished product that just needed some GUI polishing.
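
Put together, the last two steps look roughly like this; the server name, share, local path, and table are all hypothetical, and a production version would need error handling around the copy and the insert:

using System.Data.SqlClient;
using System.IO;

public static class BulkUploader {
    public static void Upload(string combinedFile, string connectionString) {
        string fileName = Path.GetFileName(combinedFile);

        // 1. Copy the merged file to a share on the database server so the local
        //    SQL service account can read it without authenticating across machines.
        File.Copy(combinedFile, @"\\dbserver\Uploads\" + fileName, true);

        // 2. OPENROWSET(BULK ...) needs the path as SQL Server sees it locally, and
        //    the path must be a literal in the statement, so it is inlined here.
        string sql = @"INSERT INTO dbo.UploadedFiles (FileName, FileData)
                       SELECT @name, BulkColumn
                       FROM OPENROWSET(BULK 'D:\Uploads\" + fileName + @"', SINGLE_BLOB) AS f";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn)) {
            cmd.Parameters.AddWithValue("@name", fileName);
            conn.Open();
            cmd.ExecuteNonQuery();
        }

        // 3. Clean up the merged copies once the insert succeeds (chunk cleanup omitted).
        File.Delete(@"\\dbserver\Uploads\" + fileName);
        File.Delete(combinedFile);
    }
}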

In Summary:

  • Using HTML5 I uploaded all the file chunks to the server.
  • In .NET I sorted and combined the newly uploaded chunks.
  • I moved the combined file to a location my SQL server could access.
  • Using OPENROWSET with the BULK option I copied the file into the database.

Wednesday, May 8, 2013

Active Reports HTML viewer skinning

My company recently migrated from Microsoft's Reporting Services to Active Reports.  I wrote a post here about my complaints and assessment of the product.

One of the big problems with the HTML viewer is the HTML code it generates and its heavy reliance on absolute positioning.  It is some of the worst code I have seen a control generate; I suppose it is meant to provide flexibility, but none of that flexibility has been made available to us yet.

In order to make this control work for my company's spreadsheet-style reporting, I had to come up with a cross between the HtmlView and the RawHTML output.  The raw HTML usually worked better in page flow, but it didn't have paging, which is a pretty nice feature.  So I generated the following CSS snippet in an attempt to bring the HtmlView just a little closer to the RawHTML output.

.rptDiv, #controlTable, and #rptFilter are my own divs wrapping the viewer output and my custom parameters; they should not be necessary for the CSS to function correctly.

There are three main things this CSS changes: overflow, to remove the extra scrollbars that the viewer adds to the page; left and top positioning, to get rid of the dumb one-inch whitespace margin that the viewer creates; and z-index, because moving the main panel one inch up and left causes an invisible portion of the viewer control to be placed on top of any controls in the nearby area, so it needs to be stuffed behind them.

#viewer-layout-main-panel > div > div > div > div > span {
    border:solid 1px;
    border-color:white;
}
.rptDiv,
.viewer-layout-container,
.viewer-layout-container div {
    overflow:visible !important;
    z-index:2;
}
#controlTable,
.viewer-layout-main-panel {
    border:none !important;
}
.viewer-layout-main-panel > div > div {
    left:-1in;
    top:-1in;
    width:0px !important;
}
#rptFilter,
#viewer-layout-toolbar-panel {
    z-index:3 !important;
    position:relative;
}

Custom serialization

I recently ran into an issue where using Session variables to maintain state was not a good fit for my web application.  But I had far too many variables to be able to easily save and retrieve them all from a database each time I wanted one.

So I ended up creating a custom object to hold all of my variables, then wrote a serialization piece to dump the object to a string that could be stored as well as used to rebuild the object.

While this does seem to be a pretty common need across the net, I have seen very few complete examples of how to implement it.  So here is the first version of my parameters class:

Notice the ToString method as well as the constructor, where all of the cool stuff happens.  A quick note: I am using fields in this example; if you wanted to hide your fields behind property get and set statements, you could easily convert the reflection code from field references to property references.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Reflection;

namespace MyNameSpace {
  public class ContactListParams {
    public string sFilter = "";
    public string sText = "";
    public string trcCL_grdPI = "";
    public string trcCL_grdSE = "";
    public string trcCL_grdSD = "";
    public string trcCL_Disp = "";
    public string trcCL_Comp = "";
    public string trcCL_FName = "";
    public string trcCL_LName = "";
    public string trcCL_Addr3 = "";
    public string trcCL_MSt = "";
    public string trcCL_Zip = "";
    public string trcCL_Phone = "";
    public string trcCL_Industry = "";
    public string trcCL_CID = "";
    public string PCB_Status = "";
    public string progType = "";
    public string userCanReassign = "";
    public string iRecordCount = "";
    public string sPriority = "";
    public string trcCntLstSort_Des = "";

    public ContactListParams() { } // basic constructor

    /// <summary>
    /// Creates a new ContactListParams from the data returned through the ToString method.
    /// </summary>
    /// <param name="InitialData">The serialized string produced by ToString.</param>
    public ContactListParams(string InitialData) {
      // Records are separated by (char)3; a field name is separated from its value by (char)1.
      string[] fielddata = InitialData.Split((char)3);
      foreach (string field in fielddata) {
        if (!string.IsNullOrEmpty(field)) {
          string[] data = field.Split((char)1);
          FieldInfo fi = this.GetType().GetField(data[0]);
          if (fi != null) fi.SetValue(this, data[1]);
        }
      }
    }

    public override string ToString() {
      StringBuilder ret = new StringBuilder();
      FieldInfo[] fields = typeof(ContactListParams).GetFields();
      foreach (FieldInfo field in fields) {
        object value = field.GetValue(this);
        ret.Append(field.Name).Append((char)1).Append(value).Append((char)3);
      }
      return ret.ToString();
    }
  }
}
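
Usage is then a simple round trip; the resulting string is what actually gets stored and retrieved (the field values here are just examples):

ContactListParams p = new ContactListParams();
p.sFilter = "active";
p.trcCL_Comp = "Acme";

// Flatten the whole object to one storable string...
string saved = p.ToString();

// ...and rebuild an identical object from it later.
ContactListParams restored = new ContactListParams(saved);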

Wednesday, April 17, 2013

Active Reports "OpenedReport is undefined" or "ViewerViewModel is undefined"

Active Reports is one of the top reporting systems available for .NET; unfortunately, it has a couple of major shortcomings.

The first problem is the nearly complete lack of documentation.  It kind of reminds me of the early days of Microsoft Windows and trying to find documentation on any given API call.  Also, while friendly, their support staff seems to lack basic technical knowledge when anything out of the ordinary goes wrong.

The second problem is the system's focus on Windows development.  This focus is so heavy that their web offering has been all but neglected.  They do have a couple of nice viewers for the web, but some basic functionality, like exporting, is missing from them; the HTML viewer doesn't even have the ability to print.  All this functionality is supported by the system, but it has to be written and added by the developer using the product; a huge shortcoming in my opinion.  This lack of integrated detail is consistent throughout the entirety of their web offering.

With the shortcomings out of the way, on to the problem this post is about: the pesky "OpenedReport" error.  There are lots of posts all over the web about this error, but few people seem to really understand it, and Active Reports staff just point developers to the HttpHandlers setup page to solve the problem.

This error occurs when the page is unable to find or load the primary JavaScript libraries used by Active Reports.  The "ViewerViewModel" error has the same root cause; it just shows up when using the flash viewer rather than the PdfReader viewer.  Active Reports hosts all of its JavaScript in a DLL called "GrapeCity.ActiveReports.Web.v7" (or whatever version you are using).  The files are then accessed using the following calls to the web server:


 src="Command=ArResource;ScriptId=lib.jquery-1.7.2.min.js.ar7"
 src="Command=ArResource;ScriptId=lib.json2.js.ar7"
 src="Command=ArResource;ScriptId=resources-7.1.7470.0-en-US.js.ar7"
 src="Command=ArResource;ScriptId=RSU-7.1.7470.0.js.ar7"

Those calls can all be seen in the source code for the page.  Notice that all the file extensions end in .ar7, which is not a standard web extension.  This is where the HttpHandlers come in: they are supposed to recognize this extension and redirect it to the correct DLL to serve up the desired content.  However, the request must first make it to the ASP.NET engine in order for the web.config file to be able to handle it.  This means that IIS must either pass all extensions through to the ASP.NET engine, or have the .ar7 extension added to its list.

I am running IIS 6, so I went into the properties for my website, then the Home Directory tab and the Configuration button, and added a new extension of .ar7 pointing to aspnet_isapi.dll.  If you don't know the path of your DLL, look at any of the other extensions; almost all of them will point to this DLL.  If you wished, you could also point the extension directly at the ActiveReports.Web.v7 DLL, but I prefer .NET to handle all my requests and so pointed it to the generic handler to keep things consistent.

When adding this new extension mapping there are two checkboxes which are VERY important.  "Script engine" should be checked; this tells IIS that this is an executable file and to go ahead and run it.  "Verify that file exists" must be UN-checked; since all the files that Active Reports will be requesting are virtual files, IIS will never be able to find them and would reject all requests for them if this box is checked.

Adding that extension along with the web.config handlers should solve your problem.  That is my detailed analysis of this error.

Monday, April 15, 2013

Blank page in Active Reports using flash viewer

I started using Active Reports for the first time and wanted to try out the various viewers.  After spending hours trying to figure out why the flash viewer was showing nothing but a blank page, I started to become convinced that there was a problem with the plugin in my browser.

So I downloaded a free swf video off the net and hard-coded some HTML to display it on my page.  When that worked, I returned to troubleshooting Active Reports itself.  Viewing the source for my page, I found my hard-coded flash markup and compared it with the auto-generated flash markup from Active Reports.

Every help site I had been to said to put the Active Reports swf files (there are two of them) into the root of your application, which is where mine were.  Looking at the HTML output, however, I noticed that it was actually looking for them in the same folder that the page was being executed in, which was not the root of my site.  After moving the two swf files and the related Theme folder into the correct sub-directory, everything started to work.

Edit: I recently ran into a similar version of this problem.  I published a website after first deleting all the files in the destination directory.  When I ran the report using the HTML viewer I got the error "Failed to send request to ./ActiveReports.ReportService.asmx/RunReport - Internal Server Error".  One person said a handler issue caused this problem for him.  In my case it was the actual .asmx file that was missing, since it never actually got published.  As soon as I copied it into the destination directory, things started working again.

Thursday, March 7, 2013

Google Spreadsheets Data Aggregation

Google Spreadsheets has been getting better and better over the last couple of years, and they have added many features that I have waited for.  However, there is still one much-requested feature that they have not yet implemented.

The charts within Google Spreadsheets are not able to manipulate the data; they can simply report on it.  Many people, myself included, have wanted a chart that can aggregate data and report on the totals obtained.  For example, consider a call log spreadsheet where employees use a form to enter data about calls they have received.  Let's say the spreadsheet has the following columns:

Date | Name | Call Length | Notes

Now let's say that the employees each receive between 10 and 20 calls per day, of varying lengths.  We want a chart that will give us a pretty graph of how much time was spent on the phone by each employee each day.  But since all the data is spread out call by call, there is no chart that can currently provide this.

To solve this problem we must first create a new sheet with the following columns:

Date | Joe | Sue | Greg | John

Where the names are an exhaustive list of the employees receiving the calls.  It's not a pretty method of doing things, but it is the method I have found to work.  The Date column will have the formula

=Unique(MainSheet![DateCellStart]:MainSheet![DateCellEnd])

The cells below it will be auto-filled with the CONTINUE formula, and Google Spreadsheets will keep them updated with a unique list of dates from your original spreadsheet.  In each of the employee columns you will put the value:

=SUMIF(MainSheet![DateCellStart]:MainSheet![DateCellEnd] , [UniqueDateColumnAndCell], MainSheet![Call Length Cell])

This tells Google Spreadsheets to hunt through the main date column and find all the dates that match the unique date referenced.  It then sums up all the call length values that correspond to those dates.  Now we have a sheet with the data all summed up nicely, in a way that can be easily read by either a human or the pretty Google charts.  Simply apply a chart to the columns in the newly created sheet and you are done.
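
To make this concrete: if MainSheet holds dates in column A and call lengths in column C (the exact ranges below are illustrative), the Date column formula and the formula in Joe's column would look like this:

=UNIQUE(MainSheet!A2:A100)
=SUMIF(MainSheet!A2:A100, $A2, MainSheet!C2:C100)

Here $A2 refers to the unique date on the current row of the new sheet, and the SUMIF formula is filled down the length of Joe's column.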