Thursday, October 27, 2005

Voices of reason in a sea of hype

I am so tired of the term "Web 2.0". Whenever I see that term bandied about, I get a mental flashback to an old Verizon commercial where a consultant pronounces the word paradigm as "Pair-Uh-Dig-Um". It's another example of "buzzword bingo". What does it mean? And I do mean that specifically.

You get the impression that there is a line drawn somewhere that separates "Web 1.0" from "Web 2.0". As near as I can tell, the definition du jour of "Web 2.0" is that it's interactive web services with dynamic content and social participation. That right there is vague enough to mean anything and nothing. Now I'm starting to see blog postings that describe "Web 2.0" as a metaphysical process. This isn't Aug. 16, 1987, people. Let's just look at the web for what it is, a means of communication, and leave the mumbo-jumbo stuff out of it.

Nicholas Carr has a great article, "The amorality of Web 2.0", that pretty much nails how I feel about this thing. Joel Spolsky (required reading for anyone in software development) has a very good rant, "Architecture Astronauts Are Back", with a follow-up containing the links to see what made him crazy (but crazy in a good way).

Wednesday, October 26, 2005

RE: Yet another command line parsing system

This looks like a logical way to handle command line parameters while still following the convention of using an app.config file.

....I used another arguments parser from Code Project, "C#/.NET Command Line Arguments Parser".
I like it because it works like the ASP.NET querystring parser - it handles the parsing (quoted strings, different delimiter styles) and exposes a string dictionary with the results.

I use a GetSetting accessor that reads the default from the app.config file, but allows overrides via the command line. I like this approach because settings are in their standard location (app.config), and any config setting can be overridden via the command line without an attribute change and a recompile.

private static int Main(string[] args)
{
    Processor processor = new Processor(args);
    return processor.Process();
}

private Arguments arguments;

public Processor(string[] args)
{
    this.arguments = new Arguments(args);
}

public int Process()
{
    // processing logic elided in the original post
    return 0;
}

private string GetSetting(string key)
{
    string setting = string.Empty;
    if (this.arguments[key] != null)
        setting = this.arguments[key];
    else
        setting = ConfigurationSettings.AppSettings.Get(key);
    if (setting == null)
        return string.Empty;
    return setting;
}

[via [JonGalloway.ToString()]]

RE: //TODONT: Use a Windows Service just to run a scheduled process

I made this mistake once. But I feel much better now, thank you.

A Windows Service is the wrong solution to scheduling one-off custom processes. The right solution for scheduling simple processes is the Windows Task Scheduler.

Read the whole posting here

[From [JonGalloway.ToString()]]

Fun with BCP

Here's a quick and dirty way to export a table to a text file with SQL Server. With some minor tweaking, it should also work for views and stored procedures that return result sets.

Here is a simple method of exporting all the data from a SQL Server table to a text file:

CREATE Procedure BCP_Text_File

@table varchar(100), 
@FileName varchar(100) 

As
If exists(Select * from information_Schema.tables where table_name=@table)
Begin
        Declare @str varchar(1000) 
        set @str='Exec Master..xp_Cmdshell ''bcp "Select * from '+db_name()+'..'+@table+'" queryout "'+@FileName+'" -c''' 
        Exec(@str)
End
Else
    Select 'The table '+@table+' does not exist in the database'

Execute this procedure by giving the table name and the file name:

EXEC BCP_Text_File 'Employee','C:\emp.txt'

Now all the data from the Employee table will be exported to the text file located at C:\emp.txt.

[from WebLogs @]

Tuesday, October 25, 2005

Whither DAAB?

If you were looking for a reason to dump DAAB, it looks like the good parts have already been folded into ADO.NET 2.0.
The ADO.NET data providers in .NET 2.0 provide factory and common ADO.NET classes that make it easy to keep your code independent from a particular ADO.NET data provider or database product.
[Simple Talk]

How much is your blog worth?

In my case, diddly squat....

A blast from the past

Back in the Amiga days, SteveX was a name every programmer knew from his VirusX, ScreenX, PointerX applications. These days, he's in the .NET world and has a cool link for string formatting in C#.

Quick lesson on how to strip text out using RegEx

There's just something about RegEx that makes my ears bleed. Fortunately other people get it. Here's an example of how to filter text out of an expression, courtesy of Jeff Atwood's Coding Horror....

For example, if the word fox was what I wanted to exclude, and the searched text was:

The quick brown fox jumped over the lazy dog.

... and I used a regular expression of [^"fox"] (which I know is incorrect; why it doesn't work I don't understand, and it would make life SO much easier if it did), then the returned search results would be:

The quick brown jumped over the lazy dog.

Regular expressions are great at matching. It's easy to formulate a regex using what you want to match. Stating a regex in terms of what you don't want to match is a bit harder.

One easy way to exclude text from a match is negative lookbehind:


But not all regex flavors support negative lookbehind. And those that do typically have severe restrictions on the lookbehind, e.g., it must be a simple fixed-length expression. To avoid incompatibility, we can restate our solution using negative lookahead:
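As an illustration (my own sketch, not necessarily the exact pattern from the original post), a negative-lookahead expression that matches every word except "fox" looks like this in JavaScript:

```javascript
const sentence = "The quick brown fox jumped over the lazy dog.";

// \b(?!fox\b)\w+\b matches a whole word only when the negative
// lookahead (?!fox\b) confirms the word starting here is not "fox".
const notFox = /\b(?!fox\b)\w+\b/g;

const kept = sentence.match(notFox);
console.log(kept.join(" ") + ".");
// "The quick brown jumped over the lazy dog."
```

The lookahead only vetoes the starting position; the \w+ then consumes the word normally, so "fox" is skipped while every other word is matched.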


You can test this regex in the cool online JavaScript Regex evaluator. Unfortunately, JavaScript doesn't support negative lookbehind, so if you want to test that one, I recommend RegexBuddy. It's not free, but it's the best regex tool out there by far, and it keeps getting better with every incremental release.

Monday, October 24, 2005

New device detected: Boeing 747

I would have loved to have seen that one pop up on the screen.

Back in 1994, Boeing considered equipping each seat with a serial modem.
Laptop users could hook up to the modem and dial out. (Dial-up was the primary means of connecting to the Internet back in those days.)

We chuckled at the thought of attaching the serial cable and getting a Plug-and-Play pop-up message:

New device detected: Boeing 747

[via The Old New Thing]

Deleting lots of data in batches

The fun part is in the "where..." bit; knowing how to restrict your query to just a portion of the data is the heavy lifting here.
I know where I'm going to be using this in some soon-to-be-written code. For that code, the data will be timestamped, so I can safely iterate by day and nuke all of the records for each day.

So we've all come across the need to delete 10 million records. However, we all know that this won't be quick, that it will result in a large log file, and that as we get nearer to deleting the 10 millionth row the process will be going very slowly.

Well, the standard way around this is to run the command in batches; this way our transaction is never very big. So you can write a while loop and check an iterator, but first you need to get into the loop, so you need to store the iteration count in a variable and have something like this:

set rowcount 10000
declare @rc int
set @rc = 0
while @rc < 1000 
begin
  --Do my update/delete etc 
  delete from mytable where ....
  set @rc = @rc + 1
end

Well, in SQL 2005, using SQLCMD mode and the new TOP clause in an update/delete, you can do the following:

--your update statement
delete top (10000) from mytable where ....
:go 1000

Which of these looks easier to you? I vote for number 2.

[via WebLogs @]

Regular Expression Tools

I'm still at the point where I can use stuff like ^\d{3}[-| ]?\d{2}[-| ]?\d{4}$ and not fully understand the pieces. So I'm always looking for decent RegEx tools. Eric Gunnerson had a good post about using RegEx to validate an SSN.
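One way to get past that point is to annotate the pattern piece by piece and throw test strings at it; here's a quick JavaScript sketch of the SSN expression above (the example SSNs are made up):

```javascript
// The pieces of ^\d{3}[-| ]?\d{2}[-| ]?\d{4}$ :
//   ^ and $    anchor the match to the whole string
//   \d{3}      three digits
//   [-| ]?     an optional separator: '-', '|', or a space
//              (the '|' is a literal inside a character class,
//               so this also accepts "078|05|1120")
//   \d{2}      two digits, then another optional separator
//   \d{4}      four digits
const ssn = /^\d{3}[-| ]?\d{2}[-| ]?\d{4}$/;

console.log(ssn.test("078-05-1120")); // true
console.log(ssn.test("078 05 1120")); // true
console.log(ssn.test("078051120"));   // true
console.log(ssn.test("078-05-112"));  // false: only three final digits
```

Note that allowing '|' as a separator is almost certainly an accident of the character class; a plain [- ]? was probably what the pattern's author intended.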

But first, a word about tools. It's a lot easier to use a tool to do this sort of thing than it is to write code to do it. So, I suggest one of the following

RE: Another Blast of Cold Water in the face - "The Build Master" by Vincent Maraia

Bill lists a couple of good references for doing the build the right way. I want these two books.

Eric Garulay, one of the uber cool folks over at Addison-Wesley, shot me a copy of The Build Master: Microsoft's Software Configuration Management Best Practices by Vincent Maraia (Build Master Web Site). Well, it looks like Kim is going to appropriate the book from me, so I need to hurry up and get it read; fortunately that's not a problem, because it's pretty amazing. Let me step back a second. A while ago, my buddies at sent me a book titled Expert .NET Delivery using NAnt and CruiseControl .NET, and it was a real eye opener. Until then, my build strategy was about as sophisticated as “Get Latest Version” from Visual SourceSafe, flipping the Release bit, and compiling.

For the rest of the posting, go to [Bill's House O Insomnia]

Saturday, October 22, 2005


I had tears coming out of my eyes when I played this.

[via Engadget by way of Metafilter]

Friday, October 21, 2005

Internet Health Report

If you think that the Internets are running slow, click this link. What you will see is a chart of the Internet backbone providers and the relative speeds at which they are talking to each other.

Thursday, October 20, 2005

RE: SqlDependency changes for RTM [Sushil Chordia]

I want to play with this feature. I have implemented something with similar functionality using UDP in an extended stored procedure, but this looks much simpler.

As mentioned in my previous blog, SqlDependency is a new feature in .NET Framework 2.0 which provides a mechanism to notify an app when a cache is invalidated. We got enough feedback from customers in Beta 2 with regard to ease of deployment (some issues here) and security that we decided to make some changes for the final release. These new changes are now available as part of the September CTP. Following is a quick example of how to get Notification working on the September CTP bits. (Things new in the September CTP are marked in RED.)

using System;
using System.Data;
using System.Data.SqlClient;

class QuikExe
{
  public static string connectionstring = "Get Connection String From The Config File";

  public void DoDependency()
  {
    using (SqlConnection conn = new SqlConnection(connectionstring))
    {
      conn.Open();
      Console.WriteLine("Connection Opened...");

      SqlCommand cmd = conn.CreateCommand();
      cmd.CommandText = "Select i from dbo.test";

      //Notification specific code
      SqlDependency dep = new SqlDependency(cmd);
      dep.OnChange += delegate(Object o, SqlNotificationEventArgs args)
      {
        Console.WriteLine("Event Recd");
        Console.WriteLine("Info:" + args.Info);
        Console.WriteLine("Source:" + args.Source);
        Console.WriteLine("Type:" + args.Type);
      };

      SqlDataReader r = cmd.ExecuteReader();
      //Read the data here and close the reader
      r.Close();
      Console.WriteLine("DataReader Read...");
    }
  }

  public static void Main()
  {
      //Start the listener infrastructure on the client
      SqlDependency.Start(connectionstring);

      QuikExe q = new QuikExe();
      q.DoDependency();
      Console.WriteLine("Wait for Notification Event...");
      Console.ReadLine();

      //Optional step to clean up dependency else it will fall back to automatic cleanup
      SqlDependency.Stop(connectionstring);
  }
}

read the rest here...

Just say no to CLR UDTs

Alex Papadimoulis is pretty adamant about not using CLR UDTs in SQL Server 2005.

No one has asked me that question just yet ["When Should I Use SQL-Server CLR User Defined Types (UDT)?"], but with the release of SQL Server 2005 just around the corner, I'm sure a handful of people will. Unlike regular User Defined Types, CLR UDTs are a new feature of SQL Server 2005 that allows one to create a .NET class and use it as a column datatype. As long as a few requirements are followed, one can create any class with any number of properties and methods and use that class as a CLR UDT.

Generally, when a new feature is introduced with a product, it can be a bit of a challenge to know when and how to use that feature. Fortunately, with SQL Server's CLR UDTs, knowing when to use them is pretty clear:


Let me repeat that. Never. You should never use SQL Server CLR User Defined Types. I'm pretty sure that this answer will just lead to more questions, so allow me to answer a few follow-up questions I'd anticipate.

The full article can be read here. Another reason to take a pass on CLR UDTs is that they tie your database to SQL Server 2005. If you are building an app that also has to run on SQL Server 2000, you just shot yourself in the foot.

Another reason to avoid CLR UDTs? Eliminating error messages like "File or assembly name udtname, Version=, Culture=neutral, PublicKeyToken=389619d4c1235f8a, or one of its dependencies, was not found."

Spam weasels

I finally got a spam comment. In fact, it was from another Blogger user. That's a violation of the Blogger TOS, so I forwarded his information to the Blogger people (Google). I gave Google about a week to take whatever action they wanted to take against that user, and then I removed the comment. It was odd the way the spam appeared: I posted a new entry, and within 2 minutes it got a comment. This is by no means a high traffic site; it's mainly here to keep track of things.

Nine reasons not to use serialization

There's a good article on The Code Project that explains why you shouldn't use serialization to store data. The root problem is that the information that gets serialized out is strongly typed. In other words, whatever wrote that data out had better be the same thing that reads it back in again. Should your code change its data structures, trying to read in serialized data from a previous version will break the code. That kinda defeats the purpose of using XML to store data. And not in a good way.

Thursday, October 13, 2005

RE: Oldest noodles unearthed in China

Paleolithic dorm food discovered:

The remains of the world's oldest noodles have been unearthed in China. The 50cm-long, yellow strands were found in a pot that had probably been buried during a catastrophic flood. Radiocarbon dating of the material taken from the Lajia archaeological site on the Yellow River indicates the food was about 4,000 years old. That date is about 1000 years older than what had been considered the oldest known instance of noodles.

[Via The BBC]

Wednesday, October 12, 2005

RE: October 12, 2005

As usual, Joel nails this one...

“Custom development is that murky world where a customer tells you what to build, and you say, ‘are you sure?’ and they say yes, and you make an absolutely beautiful spec, and say, ‘is this what you want?’ and they say yes, and you make them sign the spec in indelible ink, nay, blood, and they do, and then you build that thing they signed off on, promptly, precisely and exactly, and they see it and they are horrified and shocked, and you spend the rest of the week reading up on whether your E&O insurance is going to cover the legal fees for the lawsuit you've gotten yourself into or merely the settlement cost. Or, if you're really lucky, the customer will smile wanly and put your code in a drawer and never use it again and never call you back.”

Set Your Priorities

[Via Joel on Software]

RE: Why do login dialogs have a "User" field?

I like Jeff Atwood's blog, but I don't agree with his posting about removing the "User" field from the login dialog box. If you pull the user out of the user/password combination, you have to force unique passwords across your system. That's a huge hurdle. I know couples who share the same password for their individual email accounts. Technically, that's less secure than different passwords for each other's accounts, but it's easier to manage.

Another problem is that two people could have unique passwords that differ by only a single character. If you mistype your password when you log in and the result matches the password of another user, then you will log in as that user without any warning.
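For a rough sense of how likely a shared password is, the birthday problem applies. Here's a back-of-the-envelope sketch with made-up numbers (10,000 users, passwords of 8 random lowercase letters):

```javascript
// Probability that at least two of n users share a password, assuming
// each password is drawn uniformly at random from m possibilities.
function collisionProbability(n, m) {
  let pAllDistinct = 1;
  for (let i = 1; i < n; i++) {
    pAllDistinct *= 1 - i / m;
  }
  return 1 - pAllDistinct;
}

const m = Math.pow(26, 8); // about 2.1e11 possible passwords
console.log(collisionProbability(10000, m)); // ~0.00024
```

And that's the best case: real users don't pick passwords uniformly at random, so the true collision odds are far worse, and any collision (or a typo that lands on someone else's password) silently logs you into the wrong account.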

In The Humane Interface, the late Jef Raskin asks an intriguing question: why do login dialogs have a "User" field?

Shouldn't login dialogs look more like this?

Login dialog without user field

And you know what? He's right. Your password alone should be enough information for the computer to know who you are.

Click here for the rest of that article

[Via Coding Horror]

Friday, October 07, 2005

RE: Performance Analysis Tools from Microsoft (Free)

You can never have enough tracing tools.....

You may have noticed that Microsoft has opened up and is participating more and more in the community: blogs, newsgroups, events, etc. Another example is the company making its internal tools available.

Having just got my latest SQL Mag, I read with interest the article on tools being used in Microsoft PSS (Product Support Services). These tools are available for download from Microsoft (KB article 887057).

The one that most interests me is read80trace. This processes trace files, including rollover files, and provides summary information, aggregates, etc. The great thing is that it works out the sp calls from the text data so you don't have to do it yourself. It then stores the data in a normalised database so you can perform your own queries on it.

There is one gotcha, in that it needs EndTime in your trace files, which isn't in a trace by default. If you have trace files you want to process using this tool, you can load your data into a table, add a derived column of EndTime that is dateadd(ms, duration, starttime), load the trace back into Profiler, and save it to a file. Not great, but it does work.

These tools can also be used to reprocess the profiled data.

If you haven't looked at these tools, you should; the help is quite useful, as is the article above.

[Via SQLJunkies Blogs]

Wednesday, October 05, 2005

RE: If All Movies Ever Made Were Really About Software...

A Few Good Objects
Col. Jessup: You want destructors?
Kaffe: I think I'm entitled.
Col. Jessup: You want destructors?
Kaffe: I want deterministic finalization.
Col Jessup: You can't handle deterministic finalization!

Click here for Pulp Compilers, Full Metal Packet, and the rest.

[Via K. Scott Allen]