.Net ramblings
# Friday, 19 August 2005
Tortilla Española: the real deal

Ever since i went to Malaga in Southern Spain 10 years ago, i have tried and failed to reproduce the authentic taste of the amazing "Tortilla Española", the Spanish Omelette.  I remember paying about a euro for a large tortilla that would be perfectly acceptable to eat for breakfast, lunch or (and!) dinner.
fortunately, i came across a recipe online today that i am posting here for future reference.  although i am fairly handy with omelettes in general, this was a real find, in particular the discovery that you fry the potatoes in lots of olive oil, which makes them go soft and gives the whole tortilla a lovely tender texture.
You can see it at its original location here; i'm only copying it here in case that url ever disappears or goes down.

Spanish Tortilla
Serves four as a main course; twelve as a tapa.

  • 1 and 3/4 cups vegetable oil for frying (or plain olive oil)
  • about 5 medium-sized potatoes, peeled
  • 2 tsp. coarse salt
  • 2 or 3 medium-sized onions, diced
  • 5 medium cloves garlic, very coarsely chopped
  • 6 large eggs
  • 1/8 tsp. freshly ground black pepper

In a 10 or 11 inch non-stick skillet (should be at least 2 inches deep), heat the oil on medium high. While the oil is heating, slice the potatoes thinly, about 1/8 inch. Transfer to a bowl and sprinkle on the 2 tsp. of salt, tossing to distribute it well.

When the oil is very hot (a potato slice will sizzle vigorously around the edges without browning), gently slip the potatoes into the oil with a skimmer or slotted spoon. Fry the potatoes, turning occasionally (trying not to break them) and adjusting the heat so they sizzle but don't crisp or brown. Set a sieve over a bowl or else line a plate with paper towels. When the potatoes are tender, after 10 to 12 min., transfer them with the skimmer to the sieve or lined plate.

Add the onions and garlic to the pan. Fry, stirring occasionally, until the onions are very soft and translucent but not browned (you might need to lower the heat), 7 to 9 min. Remove the pan from the heat and, using the skimmer, transfer the onions and garlic to the sieve or plate with the potatoes. Drain the oil from the skillet, reserving at least 1 Tbs. (strain the rest and reserve to use again, if you like) and wipe out the pan with a paper towel so it's clean. Scrape out any stuck-on bits, if necessary.

In a large bowl, beat the eggs, 1/4 tsp. salt, and the pepper with a fork until blended. Add the drained potatoes, onions, and garlic and mix gently to combine with the egg, trying not to break the potatoes (some will anyway).

Heat the skillet on medium high. Add the 1 Tbs. reserved oil. Let the pan and oil get very hot (important so the eggs don't stick), and then pour in the potato and egg mixture, spreading it evenly. Cook for 1 min. and then lower the heat to medium low, cooking until the eggs are completely set at the edges, halfway set in the center, and the tortilla easily slips around in the pan when you give it a shake, 8 to 10 min. You may need to nudge the tortilla loose with a knife or spatula. (I found i had to turn it down very low to keep it from burning)


Set a flat, rimless plate that's at least as wide as the skillet upside down over the pan. Lift the skillet off the burner and, with one hand against the plate and the other holding the skillet's handle, invert the skillet so the tortilla lands on the plate (it should fall right out). Set the pan back on the heat and slide the tortilla into it, using the skimmer to push any stray potatoes back in under the eggs as the tortilla slides off the plate. Once the tortilla is back in the pan, tuck the edges in and under itself (to neaten the sides). Cook until a skewer inserted into the center comes out clean, hot, and with no uncooked egg on it, another 5 to 6 min.

Transfer the tortilla to a serving platter and let cool at least 10 min. Serve warm, at room temperature, or slightly cool. Cut into wedges or small squares, sticking a toothpick in each square if serving as an appetizer.

If the idea of cold tortilla doesn't get you going, you should still try it; it might surprise you like it did me.  I didn't even like eggs when i got hooked on tortillas :)

Many thanks and all credits to Sarah Jay for sharing this great recipe.
By the way, it's incredibly filling because of all that oil, so eat about half as much as you'd think, then wait a while to see how you get on!  no wonder the Spaniards have so many siestas; eating tortilla all the time would knock anyone out.


Friday, 19 August 2005 16:34:36 (GMT Daylight Time, UTC+01:00)  #    Comments [2]  General

HowTo: Clear a stuck print job in Windows

MS Word crashed in the middle of a print job, and the document was stuck in the print queue, not obeying commands to delete or cancel it, even after reboots.
To fix this, i opened c:\windows\system32\spool\PRINTERS and deleted all the files there.  If the files are locked and won't delete, stop the Print Spooler service first and then delete the files.
This worked for me!
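If it ever turns into a regular chore, the same steps can be scripted.  Here's a rough C# console sketch of the idea, not a polished tool: it assumes the default spool folder and needs a reference to System.ServiceProcess.dll.

using System;
using System.IO;
using System.ServiceProcess;   // add a reference to System.ServiceProcess.dll

class ClearPrintQueue
{
    static void Main()
    {
        string spoolDir = Environment.ExpandEnvironmentVariables(@"%windir%\system32\spool\PRINTERS");

        using(ServiceController spooler = new ServiceController("Spooler"))
        {
            // stop the Print Spooler so the stuck spool files are no longer locked
            if(spooler.Status == ServiceControllerStatus.Running)
            {
                spooler.Stop();
                spooler.WaitForStatus(ServiceControllerStatus.Stopped, TimeSpan.FromSeconds(30));
            }

            // delete whatever is stranded in the spool directory
            foreach(string file in Directory.GetFiles(spoolDir))
                File.Delete(file);

            // start the spooler again so printing works afterwards
            spooler.Start();
            spooler.WaitForStatus(ServiceControllerStatus.Running, TimeSpan.FromSeconds(30));
        }
    }
}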

Friday, 19 August 2005 13:41:49 (GMT Daylight Time, UTC+01:00)  #    Comments [1]  General | Windows Server

# Tuesday, 09 August 2005
FIX: Server 2003 won't shutdown because of 'logged on users'
i encountered problems trying to shut down or reboot my windows server 2003 while logged in via remote desktop.  i initiate a shutdown from the Start menu, and it logs me off and quits the RD session, but the machine doesn't power off.  If i try to connect again with RD, it shows me a blank screen on the server for a few seconds, and then quits with no error message.

if i try running "shutdown -s", i click OK on the message that the computer is shutting down in 30 seconds, but it never does.  system log entry: "Application popup: System Shutdown : The system is shutting down. Please save all work in progress and log off. Any unsaved changes will be lost."

finally, if i push the power button it doesn't complete the shutdown either (i have 'shutdown' selected in power management for when the button is pressed).  i do see an event ID 26 in the system log at the time i pressed the button, with the message 'Application popup: Windows : Other people are logged on to this computer. Shutting down Windows might cause them to lose data. Do you want to continue shutting down?'.  this is a bit useless because the server has no mouse/keyboard/monitor, so i have no way to interact with that message except via remote desktop (which doesn't even show it on screen when i am logged in via RD), and i want to force a software shutdown without being asked questions i can't answer!

eventually i found a workaround: use the command line tool shutdown.exe with some more forceful arguments.  This saves my raid array from having to re-sync after a hard reset.

To Reboot:
shutdown.exe /r /t 10 /d p:1:1 /c "Maintenance, Planned"
To Shutdown:
shutdown.exe /s /t 10 /d p:1:1 /c "Maintenance, Planned"

Tuesday, 09 August 2005 09:31:10 (GMT Daylight Time, UTC+01:00)  #    Comments [2]  Windows Server

# Tuesday, 02 August 2005
How to properly use the javascript string replace function

i've seen a ton of different approaches people take to the javascript string replace function, including some homemade versions.

the javascript replace function uses regular expressions, so it doesn't work like the C# or VB Replace functions (i had hoped it would).

what is also unusual for c# programmers is that the pattern you pass into the function does not get wrapped in quotes.

example: to replace all single-quotes in a string variable (called s) with the ` character:

s = s.replace(/'/gi, '`');

if that looks like gobbledegook i'll explain. the first / character starts the pattern, and the ' character is what we want to replace. the second / character ends the pattern and lets us tack on options for the regex parser: g means global (replace every occurrence, not just the first, which is all a plain string pattern would do) and i means ignore case.
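For comparison, here's roughly the same thing in C#, just to show the difference in style (s is a string variable as above):

// C# string.Replace takes plain strings and already replaces every occurrence
s = s.Replace("'", "`");

// the regex version, closest in spirit to the javascript call above
s = System.Text.RegularExpressions.Regex.Replace(s, "'", "`");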

i'm really only posting this so i'll remember it myself!


Tuesday, 02 August 2005 13:43:45 (GMT Daylight Time, UTC+01:00)  #    Comments [1]  General

# Monday, 01 August 2005
Scrollable DIV css code, useful for big checkboxlists etc
DIV.scroll
{
	height: 9em;
	overflow: auto;
	border: 1px solid #666;
	background-color: #e8e8e8;
} 

I ran into a problem designing a web app that had a few checkboxlists: when the real data got imported, the checkboxlists took up most of the page with all the entries!  this is obviously no good, but the app still needed a checkboxlist rather than a single-select dropdownlist, so i found this css code that you can apply to a DIV wrapped around the checkboxlist.  you can set a width if you like, but i just let it fill whatever container the div is already in. 

works in firefox and IE6 in windows, didn't try on anything else. 


Monday, 01 August 2005 14:59:54 (GMT Daylight Time, UTC+01:00)  #    Comments [1]  Asp.Net

# Thursday, 30 June 2005
Crystal Reports for .Net, locked file when exporting to PDF with a crHtmlText field

i made a simple change to a crystal report in the VS designer, changing the text format for a text box to crHtmlText instead of crStandardText.  little did i realise this would break the report altogether and cause it to fail whenever i tried to export as a PDF.  i had forgotten i'd made the change at all and thought it was a permissions issue, but it wasn't: crystal reports is just crap.

my advice is don't use crHtmlText


Thursday, 30 June 2005 15:50:42 (GMT Daylight Time, UTC+01:00)  #    Comments [0]  Asp.Net

IIS6 basic authentication (challenge/response) with PHP not working

i spent ages trying to figure this out.  i set up a secure folder on a php site and turned off anonymous access on the folder in IIS.  i have it set to basic authentication, and gave a windows user account RX permissions on the folder, so a visitor gets prompted for a username/password when they try to access it.  but even when i enter the correct details, i am not allowed in: i get a 403.1 access denied (due to ACL, or something along those lines).

it turned out i hadn't given my new user read/execute permissions on the php-cgi.exe file in the PHP folder.  granting those solved it.  hope this helps someone out there who runs into the same problem.
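If you'd rather script that permission change than click through the security tab, here's a rough C# sketch.  Note the assumptions: it needs the .NET 2.0 System.Security.AccessControl classes, and the path and account name below are just examples to substitute with your own.

using System.IO;
using System.Security.AccessControl;

class GrantPhpCgiPermissions
{
    static void Main()
    {
        // example path and account - use your own PHP folder and the user the site runs as
        string phpCgi = @"C:\PHP\php-cgi.exe";
        string account = @"MYSERVER\webuser";

        // add a Read & Execute allow rule to the existing ACL on php-cgi.exe
        FileSecurity acl = File.GetAccessControl(phpCgi);
        acl.AddAccessRule(new FileSystemAccessRule(account,
            FileSystemRights.ReadAndExecute, AccessControlType.Allow));
        File.SetAccessControl(phpCgi, acl);
    }
}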


Thursday, 30 June 2005 15:48:38 (GMT Daylight Time, UTC+01:00)  #    Comments [1]  Windows Server

# Tuesday, 24 May 2005
A few useful functions for importing and exporting data in asp.net. Excel, PDF, Datasets, DataGrids, etc.

the last few web projects i've been working on have had a lot of import/export requirements, and i've put together a class library that contains the functionality outlined below.  A lot of it i collected from newsgroups and modified for my own purposes.

Import an Excel File

Imports all the worksheets in an excel file into a .Net dataset, with one datatable for each worksheet.  Uses the ADOX COM component (reference Microsoft ActiveX extensions 2.8 and adodb in VS) and OleDb.  i chose ADOX because you need to know the worksheet names in the excel file if you're going to query them with OleDb, and i don't know them in advance in my case.  So i use ADOX to iterate through the table names and create a new ADO.Net DataTable for each one.
I found a problem with excel documents that appeared to contain empty columns past the used range of cells: OleDb complained about "Too many fields defined", which i presume is because it treats all those empty excel columns beyond the used range as proper columns, when we're only interested in the used range of cells.  To work around this, i opened the excel file, copied the used range of cells into a new worksheet, deleted the old one, and it imported fine. 

Sample usage (with a Html File Control called 'fileToUpload' on the page):  

This sample code below uses the ImportExcel method in the class library code, to have the user browse to the excel file, and then upload it to the server and import it into a DataSet.

string fileName = ((System.Web.UI.HtmlControls.HtmlInputFile)this.fileToUpload).PostedFile.FileName;
if(fileName != "")
{
   try
   {
      // upload the excel file to a temp directory (needs write permissions)
      string uploadPath = Server.MapPath("/Temp/" + fileName.Remove(0, fileName.LastIndexOf("\\") + 1));
      ((System.Web.UI.HtmlControls.HtmlInputFile)this.fileToUpload).PostedFile.SaveAs(uploadPath);

      // load the excel file into a dataset for processing
      DataSet ds = ImportExport.ImportExcel(uploadPath);

      // ... work with the dataset here ...
   }
   catch(Exception ex)
   {
      // the upload or import failed - surface the error however suits your page
      Response.Write(Server.HtmlEncode(ex.Message));
   }
}
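From there each worksheet comes back as its own DataTable (named after the sheet), so you can just loop over them, for example:

// one DataTable per worksheet
foreach(DataTable dt in ds.Tables)
    Response.Write(dt.TableName + ": " + dt.Rows.Count + " rows<br/>");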

Export Crystal Report To PDF

Converts a crystal report object into a PDF document that opens in Adobe Acrobat Reader.
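A rough usage sketch (the report class name and file name here are made up; substitute your own report and data source):

// 'InvoiceReport' is a hypothetical report class compiled into the project
CrystalDecisions.CrystalReports.Engine.ReportClass rpt = new InvoiceReport();
rpt.SetDataSource(ds);   // whatever DataSet/DataTable the report expects
ImportExport.ExportCrystalReportToPDF(rpt, "invoice.pdf");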

Export a DataTable or DataView to Excel

This export method is similar to the common technique of binding a dataset to a datagrid/gridview and rendering the contents to produce a HTML table that Excel can understand. However, the datagrid approach is not reliable if the data contains html characters such as < or >: it produces invalid XML, which causes problems in Excel and OpenOffice. An alternative approach is to derive a GridView control that automatically sets HtmlEncode = true on all the BoundColumns, but that can produce very bloated output where non-ASCII characters are represented, and Excel will not decode the HtmlEncoded text.  I found the simplest approach is to parse the dataview and write out an XHTML table. This way the output is guaranteed to be valid XHTML and compatible with both Excel and OpenOffice (use the HtmlDocument filter when opening the file). 

In case you are worried about the performance of traversing the data like this, don't be, because it is sure to be less code than what happens inside the DataGrid class :)
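Typical usage, say from a button click handler, looks like this (dt is a placeholder for whatever DataTable you want to export, and the file names are up to you):

// send the table to the browser as an Excel-compatible XHTML file
ImportExport.DataTableToXhtmlTable(dt, "report.xls", true);

// or write it to a path on the server instead
ImportExport.DataTableToXhtmlTable(dt, @"C:\Exports\report.xls", false);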

Export a DataTable or DataView to CSV

this is similar to the above approach of parsing through the data and outputting the delimiters appropriately. there are regular expression based approaches, which i have tried before, but i found them unreliable when dealing with a complex character set, especially when trying to output in a format that both OpenOffice and Excel can open.  This approach i know i can trust, and it is lightning fast as well.
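Usage is the same shape as the Excel version (again, dt is just a placeholder name):

// tab separated text, streamed straight to the browser as a download
ImportExport.DataTableToCsv(dt, "report.txt", true);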

Comments?

if you have any questions on how to use it, or if you find bugs, or even better if you have some improvements... post a comment below. 
Enjoy.  Tim.

The Code

using System;
using System.Configuration;
using System.Data;
using System.Data.OleDb;
using System.IO;
using System.Text;
using System.Text.RegularExpressions;
using System.Web;
using System.Web.Caching;
using System.Web.Security;
using System.Web.SessionState;
using System.Web.UI;
using System.Web.UI.WebControls;

namespace Tim.Library.WebForms
{
    /// <summary>
    /// Provides functionality to import and export datasets, datagrids, excel files etc.
    /// </summary>
    public class ImportExport
    {
        public enum ExportFormat{Excel, CSV};

        /// <summary>
        /// Imports all the worksheets in an excel file into a dataset,
        /// with one datatable for each sheet.
        /// </summary>
        public static DataSet ImportExcel(string path)
        {
            ADOX.CatalogClass cat = new ADOX.CatalogClass();
            // create an ADODB connection to use with the catalog
            ADODB.ConnectionClass connAdox = new ADODB.ConnectionClass();
            string connectionString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\"" + path + "\";Extended Properties=Excel 8.0;";
            // open the excel ADOX connection to get the table names
            connAdox.Open(connectionString, "admin","",0);
            cat.ActiveConnection = connAdox;
            DataSet ds = new DataSet();
            // create an OleDb connection to get data into ADO.Net
            OleDbConnection connOleDb = new OleDbConnection (connectionString);
            connOleDb.Open();
            foreach(ADOX.Table t in cat.Tables)
            {
                try
                {
                    string name = t.Name.Trim('_');
                    if(ds.Tables.Contains(name))
                        continue;    // avoid duplicate worksheet names... strange behaviour where multiple tables were added when only one sheet existed
                    OleDbCommand cmdSelect = new OleDbCommand (@"SELECT * FROM [" + name + "]", connOleDb);
                    OleDbDataAdapter dba = new OleDbDataAdapter();
                    dba.SelectCommand = cmdSelect;
                    DataTable dt = new DataTable(name);
                    dba.Fill(dt);
                    ds.Tables.Add(dt);
                }
                catch(Exception)
                {
                    // rethrow without resetting the stack trace (plain "throw", not "throw ex")
                    throw;
                }
            }
            connOleDb.Close();                
            connAdox.Close();
            return ds;
        }

        /// <summary>
        /// Opens a PDF window containing the specified crystal report object
        /// The Crystal DLLs must be deployed with the web app for this to work.
        /// </summary>
        /// <param name="rpt">The report object</param>
        /// <param name="filename">Should include the .pdf extension</param>
        public static void ExportCrystalReportToPDF(CrystalDecisions.CrystalReports.Engine.ReportClass rpt, string filename)
        {
            MemoryStream stream = (MemoryStream)rpt.ExportToStream(CrystalDecisions.Shared.ExportFormatType.PortableDocFormat);
            byte[] bytes = new Byte[stream.Length];
            stream.Read(bytes, 0, (int)stream.Length);
            stream.Close();
            HttpResponse response = HttpContext.Current.Response;
            response.Clear();
            response.ClearContent();
            response.ClearHeaders();
            response.Buffer= true;
            response.ContentType = "application/pdf";
            response.AddHeader("Content-Disposition", "attachment;filename=\"" + filename + "\"");
            response.BinaryWrite(bytes);
            response.End();
        }


        /// <summary>
        /// This method is an overload for the DataView version of the same name.
        /// </summary>
        public static void DataTableToCsv(DataTable dt, string filename, bool WriteToResponse)
        {
            DataView dv = new DataView(dt);
            DataViewToCsv(dv, filename, WriteToResponse);
        }

        /// <summary>
        /// Parses a dataview into a CSV format. I always use tab separated columns, with \n separated rows.
        /// </summary>
        /// <param name="dv">The data source</param>
        /// <param name="filename">If WriteToResponse is true, this must be a file name, otherwise a full path+file name to save the file to</param>
        /// <param name="WriteToResponse">if true, Response.Writes the output to the client browser,
        /// otherwise writes the contents to the specified file path</param>
        public static void DataViewToCsv(DataView dv, string filename, bool WriteToResponse)
        {
            char ColDelim = '\t';
            char RowDelim = '\n';

            using(StringWriter sw = new StringWriter())
            {
                // output the header row
                foreach(DataColumn dc in dv.Table.Columns)
                    sw.Write(CsvEscape(dc.ColumnName) + ColDelim);
                sw.Write(RowDelim);

                foreach(DataRowView dr in dv)
                {
                    foreach(object o in dr.Row.ItemArray)
                        sw.Write(CsvEscape(o.ToString()) + ColDelim);
                    sw.Write(RowDelim);
                }

                if(WriteToResponse)
                {
                    HttpResponse response = HttpContext.Current.Response;
                    response.Clear();
                    response.Charset = System.Text.UTF8Encoding.UTF8.WebName;
                    response.ContentEncoding = System.Text.UTF8Encoding.UTF8;
                    response.AddHeader("Content-Disposition", String.Format("attachment; filename=\"{0}\";", filename));
                    response.ContentType = "text/txt";
                    response.Write(sw.ToString());
                    response.End();
                }
                else
                {
                    File.WriteAllText(filename, sw.ToString());
                }
            }
        }

        /// <summary>
        /// Strips out any row/col delimiters. This could be slightly destructive but not important in my case :)
        /// </summary>
        public static string CsvEscape(string s)
        {
            return Regex.Replace(s, "\r|\n|\t", "");
        }

        /// <summary>
        /// This method is an overload for the DataView version of the same name.
        /// </summary>
        public static void DataTableToXhtmlTable(DataTable dt, string filename, bool WriteToResponse)
        {
            DataView dv = new DataView(dt);
            DataViewToXhtmlTable(dv, filename, WriteToResponse);
        }

        /// <summary>
        /// This export method is similar to the common technique of binding a dataset to a datagrid/gridview
        /// and rendering the contents to produce a HTML table that Excel can understand. However the datagrid
        /// approach is not reliable if the data contains html characters, e.g. < or >, it produces invalid XML,
        /// which causes problems in Excel and OpenOffice.
        /// An alternative approach is to derive a GridView control that automatically sets HtmlEncode = true on
        /// all the BoundColumns, but this can produce very bloated output where non ASCII characters are represented
        /// and Excel will not decode the HtmlEncoded text.
        /// I found the simplest approach is to parse the dataview and write out an XHTML table. This way the
        /// output is guaranteed to be valid XHTML, and compatible with Excel and OpenOffice (use the HtmlDocument filter).
        /// </summary>
        /// <param name="dv">The data source</param>
        /// <param name="filename">If WriteToResponse is true, this must be a file name, otherwise a full path+file name to save the file to</param>
        /// <param name="WriteToResponse">if true, Response.Writes the output to the client browser,
        /// otherwise writes the contents to the specified file path</param>
        public static void DataViewToXhtmlTable(DataView dv, string filename, bool WriteToResponse)
        {
            using(StringWriter sw = new StringWriter())
            {
                sw.Write("<table border=\"1\">\n");

                // output the header row
                sw.Write("<tr>\n");
                foreach(DataColumn dc in dv.Table.Columns)
                    sw.Write("<th>{0}</th>\n", XmlEscape(dc.ColumnName));
                sw.Write("</tr>\n");

                foreach(DataRowView dr in dv)
                {
                    sw.Write("<tr>\n");
                    foreach(object o in dr.Row.ItemArray)
                        sw.Write("<td>{0}</td>\n", XmlEscape(o.ToString()));
                    sw.Write("</tr>\n");
                }
                sw.Write("</table>\n");

                if(WriteToResponse)
                {
                    HttpResponse response = HttpContext.Current.Response;
                    response.Clear();
                    response.Charset = System.Text.UTF8Encoding.UTF8.WebName;
                    response.ContentEncoding = System.Text.UTF8Encoding.UTF8;
                    response.AddHeader("Content-Disposition", String.Format("attachment; filename=\"{0}\";", filename));
                    response.ContentType = "application/vnd.ms-excel";
                    response.Write(sw.ToString());
                    response.End();
                }
                else
                {
                    File.WriteAllText(filename, sw.ToString());
                }
            }
        }

        /// <summary>
        /// Replace the &, < and > characters with their xml escaped equivalents
        /// </summary>
        public static string XmlEscape(string s)
        {
            // escape & first, otherwise the &lt; and &gt; entities added below would get double-escaped
            s = s.Replace("&", "&amp;");
            s = s.Replace("<", "&lt;");
            s = s.Replace(">", "&gt;");
            return s;
        }
    }
}

Tuesday, 24 May 2005 23:45:00 (GMT Daylight Time, UTC+01:00)  #    Comments [7]  Asp.Net

# Wednesday, 18 May 2005
HowTo: set up disk status monitoring (i.e. for a raid array) and send results by email

background

i have a raid 1 array on my server, and i was surprised to find no easy way to set up an email alert if one of the disks should fail.  luckily it has never happened, but apparently windows doesn't even pop up a task-bar alert when it does. 

what doesn't work

i tried Windows performance monitoring and alerts, and while it can tell you the average reads/writes per second, it can't tell you how many disks are online.  so i looked into WMI, which has an API for hard disks, but i ran into a problem with the API not returning the disk status information it is supposed to.

enter DiskPart.exe

then i discovered DiskPart.exe, a powerful disk management utility bundled with windows server 2003; i think you can get it for server 2000 too with an admin pack or something. 
you can run DiskPart.exe from the command prompt: type "Select Disk 0" and hit return, then type "Detail Disk" and hit return.  you should see a list of all the volumes on the disk, along with the disk status "Healthy".
In the case of a raid array, i found it more useful to run the following commands:

select disk 0
select volume 0
detail volume

this outputs all the disks that make up the volume (Disk 0 and 1, in my case), and their status. 
Fortunately, you can pipe the output of DiskPart to a text file, and you can also tell DiskPart to run a list of commands from a script, so the whole thing can be automated as a scheduled task in Windows.  I also wrote a simple c# console app to send the contents of the output file in an email.

The .cmd file i have scheduled contains the following lines:

@diskpart /s c:\Scripts\disk_part_commands.txt > c:\scripts\disk_status.txt
@SendMail server@whatever.ie tim@whatever.ie "Server Disk Status" c:\scripts\disk_status.txt

The SendMail program takes the following parameters: from address, to address, email subject, text file path.  it is hard coded to use the localhost mail server, which you can change in the source code (.Net C#) if you want.

Download the send mail console app: SendMail.exe (16 KB)

Download the send mail source code: SendMail.txt (1.26 KB)
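The real source is in the download above; for reference, a minimal sketch of the same idea (using the old System.Web.Mail classes and the localhost SMTP server) looks roughly like this:

using System.IO;
using System.Web.Mail;   // needs a reference to System.Web.dll

class SendMail
{
    // usage: SendMail.exe <from> <to> <subject> <textFilePath>
    static void Main(string[] args)
    {
        MailMessage msg = new MailMessage();
        msg.From = args[0];
        msg.To = args[1];
        msg.Subject = args[2];

        // the diskpart output file becomes the body of the email
        using(StreamReader sr = new StreamReader(args[3]))
            msg.Body = sr.ReadToEnd();

        SmtpMail.SmtpServer = "localhost";   // hard coded, same as the downloadable version
        SmtpMail.Send(msg);
    }
}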

Taking it further

if i was really serious about it, i would parse the text file and read in the status directly, and only send the email if the status is something other than 'Healthy', but i think it's nice to get an email once a week from the server anyway, reporting its disk status and letting you know it is still alive.  if somebody does write a little program to parse the output, please post it here as a comment, that would be great (there's a rough sketch of one way to approach it below).

i run the script every week, so in the worst case it could operate for 7 days with a dead drive, and the chance is very slim that the other drive will fail within this time.
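As a rough sketch of that idea, here's one way that avoids parsing the diskpart format at all: keep a copy of a known-good disk_status.txt and only send the email when the latest output differs from it.  The file names and the whole-file comparison are assumptions to adapt; if the free-space figures in the output change from week to week, you'd want to strip those lines out before comparing.

using System.Diagnostics;
using System.IO;

class DiskStatusCheck
{
    // usage: DiskStatusCheck.exe <latestOutput> <knownGoodBaseline>
    static int Main(string[] args)
    {
        string latest, baseline;
        using(StreamReader sr = new StreamReader(args[0]))
            latest = sr.ReadToEnd();
        using(StreamReader sr = new StreamReader(args[1]))
            baseline = sr.ReadToEnd();

        // identical output to the known-good run: nothing to report
        if(latest == baseline)
            return 0;

        // something changed (e.g. a disk no longer reports Healthy), so send the alert
        Process p = Process.Start("SendMail.exe",
            "server@whatever.ie tim@whatever.ie \"Server Disk Status CHANGED\" " + args[0]);
        p.WaitForExit();
        return 1;
    }
}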


Wednesday, 18 May 2005 18:39:12 (GMT Daylight Time, UTC+01:00)  #    Comments [2]  Windows Server

# Wednesday, 11 May 2005
Fix: Forms authentication redirects to a bogus default.aspx page, with RedirectFromLoginPage()

hi,
i've read a lot of posts on microsoft.public.dotnet.framework.aspnet.security about people who ran into problems using forms authentication and the RedirectFromLoginPage() method, which always redirects to a default.aspx.  this is a big problem if you use sub-folders that don't have a default.aspx page, as in my case.
i read some posts that suggested manually Response.Redirecting the user to the url in the querystring, but this doesn't work either, because Forms Auth puts default.aspx in that querystring even if the user wasn't on a page called default.aspx. 

i put together a simple solution to get the redirecting to work properly, and am posting it here for future reference:

  • The Login page (Login.aspx) must be set up to read the HTTP referrer (Request.UrlReferrer) and add it to the ViewState in the first Page_Load on that page.
  • In the btnLogin_Click event on Login.aspx, the SetAuthCookie() method should be called, and the user should be Response.Redirected to the referrer value stored in the viewstate.
  • So you ignore the querystring that Forms Authentication adds on to the Login page.

Here is sample code:


*****************
Login.aspx
*****************

private void Page_Load(object sender, System.EventArgs e)
{
    // remember where the user came from (UrlReferrer can be null if the browser sends no referrer)
    if(!IsPostBack && Request.UrlReferrer != null)
        ViewState["originalUrl"] = Request.UrlReferrer.AbsoluteUri;
}

private void btnLogin_Click(object sender, System.EventArgs e)
{
    string originalUrl = (string)ViewState["originalUrl"];
    if(originalUrl == null || originalUrl == "") // in case the viewstate is empty or corrupt, use default.aspx by 'default'
        originalUrl = "default.aspx";

    // do your password checking here
    // if it's all ok then... (username is whatever the user typed into your login form)
    FormsAuthentication.SetAuthCookie(username, false);
    Response.Redirect(originalUrl, true);
}

Wednesday, 11 May 2005 13:33:48 (GMT Daylight Time, UTC+01:00)  #    Comments [0]  Asp.Net