.Net ramblings
# Monday, 30 January 2006
Fix: Unable to validate data, MachineKey.GetDecodedData
I kept getting these sporadic error messages on my web applications and could never figure out why:
Unable to validate data at
System.Web.Configuration.MachineKey.GetDecodedData(Byte[] buf, Byte[] modifier, Int32 start, Int32 length, Int32& dataLength) at
System.Web.UI.LosFormatter.Deserialize(String input)
Searching the net just reveals a troop of similarly frustrated users without solutions, but today I found the actual reason why, and the work-around.
This excellent post on Experts Exchange has the answer.  Apparently it happens when users leave a page open for a long time and then cause a postback.  By default the machine key is automatically generated, and if it is regenerated before the user posts back (for example after an application restart), the ViewState can no longer be validated.
The fix is to set a static machine key.
There is a KB article with some code to generate a key for you, plus instructions.
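For reference, a static key goes in web.config (or machine.config). The values below are placeholders only, not usable keys — generate real ones with the code from the KB article:

```xml
<system.web>
  <!-- placeholder values: substitute keys generated per the KB article -->
  <machineKey
      validationKey="PLACEHOLDER-HEX-STRING-GENERATED-PER-KB-ARTICLE"
      decryptionKey="PLACEHOLDER-HEX-STRING-GENERATED-PER-KB-ARTICLE"
      validation="SHA1" />
</system.web>
```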

Sorted.


Monday, 30 January 2006 22:53:44 (GMT Standard Time, UTC+00:00)  #    Comments [13]  .Net General | Asp.Net

ADOX + Excel: bogus worksheets
A web site I'm working on imports Excel documents and does some processing on them before importing into a database.  I use the code from this post to do the importing, and it works nicely.  I recently came across a problem where I was encountering duplicate records, and it took me ages to figure out why.  Apparently a 'named range' of cells in a worksheet is treated as a table by ADOX, so you get more than you bargained for when you iterate through the tables in the resulting DataSet.
I was able to work around the problem by discarding any tables whose names do NOT end in the dollar ($) character — real worksheets always do (e.g. "Sheet1$").
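In code, the filter is a one-line check on the table name. A minimal sketch (the helper name is mine, and `ds` stands in for whatever DataSet your import code produces):

```csharp
using System.Collections.Generic;
using System.Data;

// hypothetical helper: given the DataSet produced by the Excel import,
// return only the real worksheets. Worksheet names end in '$' (e.g. "Sheet1$");
// named ranges show up as extra tables without the trailing dollar.
static List<DataTable> GetRealWorksheets(DataSet ds)
{
    List<DataTable> worksheets = new List<DataTable>();
    foreach (DataTable table in ds.Tables)
    {
        if (table.TableName.EndsWith("$"))
            worksheets.Add(table);
    }
    return worksheets;
}
```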


Monday, 30 January 2006 15:30:12 (GMT Standard Time, UTC+00:00)  #    Comments [1]  .Net General | Asp.Net | Database

# Wednesday, 25 January 2006
Crystal Reports TextObject ignores newline, carriage return, \r\n
I have a text object in my Crystal Report, and I set its value programmatically using something like this:
(rpt.Section3.ReportObjects["txtDate"] as TextObject).Text = DateTime.Now.ToShortDateString();
This works fine until you put a string with line breaks in it, specifically \r\n or the carriage return character.  This is a bug in CR for .NET; you can read the official blurb here.  The work-around code they post is in VB, and it made no sense to me when I read it.
What they actually do is make you change the TextObject into a FormulaField (you have to view the 'Field Explorer' tab next to the toolbox and drag on a Formula Field).  Then you set 'CanGrow' to true, go back to your code, and do the most arcane work-around I've ever seen: you set the formula to your text string, but you must surround it in single quote characters and replace \r\n with the Crystal formula expression ' + chr(10) + '
When I first read this, I thought they had made a syntax error, but this is the way to do it, and it works.  The new format for setting the formula field is:
rpt.DataDefinition.FormulaFields[0].Text = "'" + YourString.Replace("\r\n", "' + chr(10) + '") + "'";
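One extra gotcha (my own addition, not from the official work-around): if your string can itself contain single quote characters, they will break the generated formula. A hypothetical helper that side-steps the issue by routing quotes through chr(39) as well — note the quote replacement must run first, because the chr() expressions it emits contain quotes themselves:

```csharp
// hypothetical helper: build a Crystal formula string literal from arbitrary text.
// Quotes are replaced before newlines, otherwise the quotes inside the
// chr(10) expression would themselves get mangled.
static string ToCrystalFormula(string s)
{
    return "'" + s.Replace("'", "' + chr(39) + '")
                  .Replace("\r\n", "' + chr(10) + '") + "'";
}
```

Usage would then be `rpt.DataDefinition.FormulaFields[0].Text = ToCrystalFormula(YourString);`.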


Wednesday, 25 January 2006 13:52:05 (GMT Standard Time, UTC+00:00)  #    Comments [16]  .Net General | .Net Windows Forms | Asp.Net

# Thursday, 19 January 2006
The best free zip / compression software: 7-zip
I gave up using WinZip a long time ago because it is way too 'naggy' in terms of making you register.
Windows 2000/XP has its own built-in 'Send To > Compressed folder' option when you right-click a file or folder, but the built-in zip support is pretty rubbishy: it takes forever to set or remove passwords on archives, and simply deleting a small file from an archive can take 20 seconds.

By far the best zip software I have used is called 7-Zip, and you can download it for free at http://www.7-zip.org
It is lightning fast.  The custom '7z' format does not appear to be anything great, especially because it's not compatible with other zip software, but set the program to use the standard 'zip' format by default and it works great.


Thursday, 19 January 2006 11:57:28 (GMT Standard Time, UTC+00:00)  #    Comments [0]  General

# Friday, 13 January 2006
Localization: Setting text direction for a web page
This is the first time I have been asked to add localization support to a content management system.  One of the supported languages is Arabic, which is written right-to-left.  To achieve this you could obviously change your CSS or markup with something like <p align="right">, but this is not the way to do it.

There is an attribute on the HTML tag that controls the direction of text (and controls) on the page, called 'dir', which takes the values 'ltr' or 'rtl', corresponding to "left-to-right" and "right-to-left" respectively.  The advantage of setting this attribute is that it instructs the browser to render all direction-related elements of the page accordingly.  For example, in RTL mode, when you type in a text box the cursor stays flush to the right and the characters spread across to the left as you type.  Similarly, drop-down menus are rendered differently, as are bullet points and the browser scrollbar.  These changes are shown in the screenshot below:



In this content management system, I set the direction based on the user's language setting, stored in a cookie.  To set the dir attribute programmatically, I suggest the following HTML tag in your Master/Template aspx page:
<html xmlns="http://www.w3.org/1999/xhtml" runat="server" id="htmlTag">
Then you can access it in the code behind, like so:
protected void Page_Load(object sender, EventArgs e)
{
    this.htmlTag.Attributes.Add("dir", Util.HtmlDir);
    ...
And then the method to determine the direction, based on the cookie.  Note that 'Language' in my code is an enumeration.
/// <summary>
/// Returns rtl or ltr, depending on the current language setting
/// </summary>
public static string HtmlDir
{
    get
    {
        Language lang;
        try
        {
            lang = (Language)Enum.Parse(typeof(Language),
                HttpContext.Current.Request.Cookies["Language"].Value);
        }
        catch
        {
            // cookie not present or invalid, default to English.
            lang = Language.English;
        }
        switch (lang)
        {
            case Language.Arabic:
                return "rtl"; // right to left
            default:
                return "ltr"; // left to right
        }
    }
}

public enum Language
{
    English,
    Arabic,
    ...
}

For more information on languages and their directions, and related browser issues, see this excellent article from the W3C.


Friday, 13 January 2006 16:28:16 (GMT Standard Time, UTC+00:00)  #    Comments [2]  Asp.Net

# Thursday, 12 January 2006
Fix: GridView DataFormatString not applied
For some bizarre reason, you have to set HtmlEncode="false" on a bound column in a GridView to get the DataFormatString to work.
I hope this helps somebody else staring at their GridView in disbelief, wondering why it doesn't work by default!
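For reference, the fix in markup looks something like this (the field and format values are made-up examples):

```aspx
<asp:GridView ID="GridView1" runat="server" AutoGenerateColumns="false">
  <Columns>
    <%-- without HtmlEncode="false", the format string below is silently ignored --%>
    <asp:BoundField DataField="OrderDate" HeaderText="Date"
        DataFormatString="{0:dd/MM/yyyy}" HtmlEncode="false" />
  </Columns>
</asp:GridView>
```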


Thursday, 12 January 2006 16:01:38 (GMT Standard Time, UTC+00:00)  #    Comments [64]  Asp.Net

# Monday, 09 January 2006
Xml Serialization: Properties left out without 'set' accessor
I ran into a frustrating problem today where some properties of my objects weren't getting serialized.  I noticed it was the ones without 'set' accessors.  I didn't want to add set accessors because the properties should be read-only, but apparently the XML serializer needs the set accessor to be able to de-serialize.
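To illustrate with a made-up class: XmlSerializer only writes out public properties that have both a get and a set accessor, so the get-only property below is silently skipped.

```csharp
// hypothetical example class: only Customer ends up in the XML
public class Order
{
    private int id = 42;
    private string customer = "Acme";

    // serialized: has both get and set accessors
    public string Customer
    {
        get { return customer; }
        set { customer = value; }
    }

    // NOT serialized by XmlSerializer: no set accessor
    public int Id
    {
        get { return id; }
    }
}
```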
there is a good discussion on it on this thread.
Luckily we can put the ReadOnlyAttribute on the property to prevent it being modified by a programmer at design time.
Or you can use the SOAP serializer instead of XML; I gather it serializes fields rather than properties, so it doesn't hit this limitation.


Monday, 09 January 2006 17:57:34 (GMT Standard Time, UTC+00:00)  #    Comments [0]  .Net General | .Net Windows Forms | Asp.Net

Great Application: Event Monitor (free)
Since I installed .NET 2.0 on my web server, I sometimes forget that you can't have .NET 1.1 and .NET 2.0 running in the same application pool, and IIS kindly stops all the web sites if I make this mistake when configuring a new website.
So I wanted to get an email about it if I forget, and went looking on d'internet for a decent little app to monitor events and email me if anything goes wrong.
The best one I found is called Event Monitor, and it is a great little program: nice enough UI and good filtering ability.

I set it up to monitor for events with ID 1062 (the one logged when I put .NET 1.1 and 2.0 in the same app pool by accident), and now I get an email straight away if it happens.  2 hours of tic-tac freshness in just 2 calories. Great huh?!


Monday, 09 January 2006 15:09:43 (GMT Standard Time, UTC+00:00)  #    Comments [0]  Windows Server

# Wednesday, 04 January 2006
PROBLEM: Firefox clipboard not working sometimes
Note: I still don't know a work-around for this problem.  The post below seems to help somewhat, but does not solve the problem completely.

Tom Keating wrote a very good description of Firefox's dodgy clipboard behaviour in this blog post.  More correctly, it is a problem with Microsoft Office locking the clipboard and preventing other applications from using it.  I found that by closing all Office applications, the Firefox copy/paste problem went away temporarily.  I did some more testing and remembered that after I installed Office 2003, I hid the annoying Office Clipboard and stopped it popping up every time I copied something.  I suspected that it was still running in the background, interfering with other applications like Firefox.  The way to turn it off completely is to open up Word, Edit > Office Clipboard, click the Options button at the bottom, and untick all the menu options.

This has minimised the problem a lot for me; it doesn't bother me much anymore.  However, Tom wrote back to say he still gets the problem.


Wednesday, 04 January 2006 17:45:32 (GMT Standard Time, UTC+00:00)  #    Comments [5]  General

# Thursday, 29 December 2005
Sending files in chunks with MTOM Web Services and .NET 2.0
Just posting my CodeProject article on HTTP chunking web services with MTOM here for reference.  By the way, it has been updated on CodeProject...

Screenshot of windows forms client, uploading a file

Introduction

In trying to keep up to speed with .NET 2.0, I decided to do a .NET 2.0 version of my CodeProject article "DIME Buffered Upload" which used the DIME standard to transfer binary data over web services. The DIME approach was reasonably efficient but the code is quite complex and I was keen to explore what .NET 2.0 has to offer. In this article, I use version 3.0 of the WSE (Web Service Enhancements) which is available for .NET 2.0 as an add-in, to provide a simpler and faster method of sending binary data in small chunks over HTTP web services.

Background

Just a recap on why you may need to send data in small chunks at all: if you have a large file and you want to send it across a web service, you must understand the way it all fits together between IIS, .NET, and the web service call. You send your file as an array of bytes, as a parameter to a web service call, which is all sent to the IIS web server as a single request. This is bad if the size of the file is beyond the configured MaxRequestLength of your application, or if the request causes an IIS timeout. It is also bad from the point of view of providing feedback of the file transfer to the user interface because you have no indication how the transfer is going, until it is either completed or failed. The solution outlined here is to send chunks of the file one by one, and append them to the file on the server.

There is an MD5 file hash done on the client and the server to verify that the file received is identical to the file sent.

Both upload and download code are included in this article.

Adventures with MTOM

MTOM stands for SOAP "Message Transmission Optimization Mechanism", and it is a W3C standard. To use it (and to run this application), you must download and install WSE 3.0, which includes MTOM support for the first time. If you look in the app.config and web.config files in the source code, you will see sections referring to the WSE 3 assembly, and a messaging clientMode or serverMode setting. These are necessary to run MTOM in the application.
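From memory, the server-side web.config section looks roughly like the fragment below — but treat the element and attribute names as approximate and verify against the config generated by the WSE 3.0 settings tool for your own project:

```xml
<microsoft.web.services3>
  <messaging>
    <!-- serverMode turns MTOM encoding on for the web service -->
    <mtom serverMode="always" />
  </messaging>
</microsoft.web.services3>
```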

The problem with DIME was that the binary content of the message was sent outside the SOAP envelope of the XML message. This meant that although your message was secure, the DIME attachment may not have been. MTOM fully complies with the other WS-* specifications (like WS-Security), so the entire message is secure.

It took me a while to realise that when MTOM is turned on for the client and the server, WSE automatically handles the binary encoding of the data in the web service message. With DIME and WSE 2.0, you had to configure your app for DIME and then use DimeAttachments in your code. This is no longer necessary: you just send your byte[] as a parameter or return value, and WSE makes sure that it is sent as binary, not padded by XML serialization as it would be in the absence of DIME or MTOM.

How it works

The web service has two main methods: AppendChunk, for uploading a file to the server, and DownloadChunk, for downloading from the server. These methods take parameters for the file name, the offset of the chunk, and the size of the buffer being sent/received.
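The server side of AppendChunk is not reproduced in this summary, but conceptually it just appends each buffer to the file. A simplified sketch, with error handling omitted and the upload folder path assumed; it also assumes chunks arrive in order, whereas the real article code carries the offset so it could verify that:

```csharp
using System.IO;
using System.Web.Services;

[WebMethod]
public void AppendChunk(string FileName, byte[] Buffer, long Offset, int BufferSize)
{
    // assumed location: an "Upload" folder under the web application root
    string FilePath = Path.Combine(Server.MapPath("~/Upload"), FileName);

    // FileMode.Append creates the file on the first chunk
    // and writes each subsequent chunk at the end
    using (FileStream fs = new FileStream(FilePath, FileMode.Append, FileAccess.Write))
    {
        fs.Write(Buffer, 0, BufferSize);
    }
}
```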

The Windows Forms client application can upload a file by sending all the chunks one after the other using AppendChunk, until the file has been completely sent. It can do an MD5 hash on the local file and compare it with the hash of the file on the server, to make sure the contents of the files are identical. The download code is very similar; the main difference is that the client must find out from the server how big the file is, so that it knows when to stop requesting chunks.

A simplified version of the upload code is shown below (from the WinForms client). Have a look in the code for Form1.cs to see the inline comments and explanation of the code. Essentially, a file stream is opened on the client for the duration of the transfer. Then the first chunk is read into the Buffer byte array. The while loop keeps running until the FileStream.Read() method returns 0, i.e. the end of the file has been reached. For each iteration, the buffer is sent directly to the web service as a byte[]. The 'SentBytes' variable is used to report progress to the form.

using (FileStream fs = new FileStream(LocalFilePath, FileMode.Open, FileAccess.Read))
{
    int BytesRead = fs.Read(Buffer, 0, ChunkSize);
    while (BytesRead > 0 && !worker.CancellationPending)
    {
        ws.AppendChunk(FileName, Buffer, SentBytes, BytesRead);
        SentBytes += BytesRead;
        BytesRead = fs.Read(Buffer, 0, ChunkSize);
    }
}

Example of the BackgroundWorker class in .NET 2.0

.NET 2.0 has a great new class called 'BackgroundWorker' to simplify running tasks asynchronously. Although this application sends the file in small chunks, even these small chunks would delay the WinForms application and make it appear to have crashed during the transfer, so the web service calls still need to be made asynchronously. The BackgroundWorker class works using an event model, where you have code sections to run for DoWork (when you start), ProgressChanged (to update your progress bar / status bar), and RunWorkerCompleted (success or failure). You can pass parameters to the DoWork method, which you could not do with the Thread class in .NET 1.1 (I know you could with delegates, but delegates aren't great for thread control). You can also access the return value of DoWork in the RunWorkerCompleted event handler. So for once, MS has thought of everything and made a very clean threading model. Exceptions are handled internally, and you can access them in the RunWorkerCompleted handler via the RunWorkerCompletedEventArgs.Error property.

The code shown below is an example of the ProgressChanged event handler:

private void workerUpload_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
    // update the progress bar and status bar text;
    // summary text is sent in the UserState parameter
    this.toolStripProgressBar1.Value = e.ProgressPercentage;
    this.statusText.Text = e.UserState.ToString();
}
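For completeness, the wiring for the other events looks something like the sketch below (the handler and field names are illustrative, not taken from the article's source):

```csharp
// hypothetical wiring, e.g. in the form constructor
workerUpload.WorkerReportsProgress = true;
workerUpload.WorkerSupportsCancellation = true;
workerUpload.DoWork += new DoWorkEventHandler(workerUpload_DoWork);
workerUpload.ProgressChanged += new ProgressChangedEventHandler(workerUpload_ProgressChanged);
workerUpload.RunWorkerCompleted += new RunWorkerCompletedEventHandler(workerUpload_RunWorkerCompleted);

// kick off the upload; the argument arrives in e.Argument inside DoWork
workerUpload.RunWorkerAsync(LocalFilePath);

private void workerUpload_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    // exceptions thrown inside DoWork surface here via e.Error
    if (e.Error != null)
        this.statusText.Text = "Upload failed: " + e.Error.Message;
    else if (!e.Cancelled)
        this.statusText.Text = e.Result.ToString(); // e.g. the hash comparison result
}
```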

I have used four BackgroundWorker objects in the application:

  • one to manage the upload process,
  • one to manage the download process,
  • another to calculate the local MD5 file hash in parallel while waiting for the server result,
  • and another to download the list of files in the Upload server folder to allow the user to select a file to download.

The reason I use a BackgroundWorker object for each task is because the code for each task is tied in to the events for that object.

A good example of Thread.Join()

When the upload or download is complete, the client asks for an MD5 hash of the file on the server, so it can compare it with the local file to make sure they are identical. I originally did these in sequence. But it can take a few seconds to calculate the result for a large file (anything over a few hundred MB), so the application was waiting five seconds for the server to calculate the hash, and then five more seconds for the client to calculate its own hash. This made no sense, so I decided to implement a multi-threaded approach to allow them to run in parallel. While the client is waiting on the server, it should be calculating its own file hash. This is done with the Thread class, and the use of the Join() method which blocks execution until the thread is complete.

The code below shows how this is accomplished:

// start calculating the local hash (stored in class variable)
this.hashThread = new Thread(new ThreadStart(this.CheckFileHash));
this.hashThread.Start();

// request the server hash
string ServerFileHash = ws.CheckFileHash(FileName);

// wait for the local hash to complete
this.hashThread.Join();

if (this.LocalFileHash == ServerFileHash)
    e.Result = "Hashes match exactly";
else
    e.Result = "Hashes do not match";

There is a good chance that the two operations will finish at approximately the same time, so very little waiting around will actually happen.

Performance compared with DIME

I found that MTOM was about 10% faster than DIME in my limited testing. This is probably to do with the need to package up each chunk into a DIME attachment, which is no longer necessary with MTOM. I was able to upload files of several gigabytes in size without problems.

Obviously, there is an overhead with all this business of reading file chunks and appending them, so the larger the chunk size, the more efficient your application will be. It should be customised based on the network and the expected size of files. For very small files, there is no harm in using small chunk sizes (e.g., 32 KB) because this gives accurate and regular feedback to the user interface. For very large files on a fast network, consider using 4,000 KB to make good use of the bandwidth and reduce the file I/O overhead. If you want to send chunks larger than 4 MB, you must increase the .NET 2.0 max request size limit in your web.config.
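The limit lives in the httpRuntime element of web.config; the values below (8 MB, expressed in KB, and a one-hour execution timeout) are examples, not recommendations:

```xml
<system.web>
  <!-- maxRequestLength is in KB: 8192 KB = 8 MB; executionTimeout is in seconds -->
  <httpRuntime maxRequestLength="8192" executionTimeout="3600" />
</system.web>
```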

Conclusions

Feel free to use this code and modify it as you please. Please post a comment for any bugs, suggestions, or improvements. Enjoy!


Thursday, 29 December 2005 20:34:15 (GMT Standard Time, UTC+00:00)  #    Comments [24]  .Net General | .Net Windows Forms | Asp.Net