Thursday, April 2, 2009

Four Pillars of ASP.NET

Not that long ago, there was just ASP.NET. But ASP.NET has expanded to include different approaches to development. Recently, I was chatting with Microsoft’s Scott Hunter and Steve Walther over drinks at DevConnections Orlando, and Steve mentioned “the four pillars of ASP.NET.” The four pillars are the four ways you can build web sites with ASP.NET, both today and in the future:

1. ASP.NET Web Forms. Until recently, this was the only pillar of ASP.NET. Everything was done using the familiar rich server-side controls that many have grown to love (and some have learned to despise, which is the seed for the other three pillars).

So what’s wrong with Web Forms? Well, many developers believe that ASP.NET Web Forms are too much of a black box with a healthy dose of black magic. There’s a price you pay for the Web Form event model and lifecycle, and that price is control over the exchange of data between server and browser. Some say the Web Form model, with its incessant ViewState, is a pig. The Web Form model also makes it difficult to create testable applications and practice test-driven development (TDD). On the other hand, ASP.NET AJAX and ASP.NET MVC (pillars 2 and 3, respectively) come with a steep learning curve, and for many developers will take longer to develop the equivalent application. Pillar 4 (Dynamic Data), meanwhile, has limited applicability, IMO.

In my opinion, with the possible exception of Dynamic Data for the creation of simpler intranet applications, nothing beats ASP.NET Web Forms (especially when paired with the ASP.NET AJAX UpdatePanel control) for creating good, solid, responsive applications that, while neither the leanest and meanest nor very testable, are easy to grok and master for the novice through advanced developer. Web Forms are also the best match for the developer looking to make the move to ASP.NET from Microsoft desktop (Win Forms or WPF) development. And with each new version, Microsoft is making it easier to tame the pig (ViewState) and reduce its size: ASP.NET 2.0 introduced ControlState, and ASP.NET 4.0 changes the inheritance model of ViewState so it will be easier to turn ViewState off by default and only turn it on when needed.
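
To make that concrete, here is a sketch of the opt-in model expected in ASP.NET 4.0, based on the preview bits (the control names are purely illustrative):

<%@ Page Language="C#" ViewStateMode="Disabled" %>
<%-- With ViewState off at the page level, opt back in only where needed. --%>
<asp:Label ID="StatusLabel" runat="server" ViewStateMode="Enabled" />
<%-- Controls left at the default (Inherit) pick up the page setting and stay lean. --%>
<asp:GridView ID="ProductsGrid" runat="server" />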

2. ASP.NET AJAX. This pillar came to life with the ASP.NET AJAX Extensions for ASP.NET 2.0 / Visual Studio 2005. ASP.NET AJAX is now integrated into ASP.NET and Visual Studio 2008 and consists of both a set of server-side controls and the AJAX client-side libraries. With regard to its existence as a “pillar,” I would argue that Microsoft is really only talking about the client side of ASP.NET AJAX. The reason I say this is that the UpdatePanel and other server-side AJAX controls merely extend the Web Forms pillar, giving Web Form controls the ability to do asynchronous postbacks.
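
For example, wrapping existing controls in an UpdatePanel is all it takes to turn full-page postbacks into asynchronous ones. A minimal markup sketch (the grid and button are placeholders):

<asp:ScriptManager ID="ScriptManager1" runat="server" />
<asp:UpdatePanel ID="UpdatePanel1" runat="server">
    <ContentTemplate>
        <%-- Any postback raised inside here becomes an asynchronous callback. --%>
        <asp:GridView ID="ProductsGrid" runat="server" />
        <asp:Button ID="RefreshButton" runat="server" Text="Refresh" />
    </ContentTemplate>
</asp:UpdatePanel>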

There are many in the hard-core ASP.NET AJAX community who believe that a true AJAX application needs to be built from the ground up without the use of Web Forms and the server-side controls. Using this approach, ASP.NET merely becomes a way to emit HTML with embedded JavaScript (and references to the Microsoft ASP.NET AJAX and jQuery libraries) that calls back to ASP.NET (or WCF) web services. When the HTML and JavaScript hit the browser, that’s when the action begins.
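
On the server side, this style usually means marking a web service as script-callable so that ASP.NET AJAX generates a JavaScript proxy for it. A minimal sketch (the service name and method are hypothetical):

using System.Web.Services;
using System.Web.Script.Services;

// [ScriptService] tells ASP.NET AJAX to expose this service to JavaScript;
// client code can then call ProductService.GetProductName(id, callback)
// through the generated proxy.
[WebService(Namespace = "http://example.com/")]
[ScriptService]
public class ProductService : WebService
{
    [WebMethod]
    public string GetProductName(int id)
    {
        // Placeholder lookup; a real service would call into the data layer.
        return "Product #" + id;
    }
}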

The promise of this approach is a much snappier user interface and a much more scalable web site. On the downside, this requires programming in a loosely typed language with a weird inheritance model, spotty support for IntelliSense, and, while improved, lousy debugger support. Another downside, the lack of smart client-side controls, is likely to be remedied in the ASP.NET 4.0 timeframe. Microsoft is busy improving the client-side story, complete with client-side controls, for ASP.NET 4.0 (if you are curious, check out
http://asp.net/ajax and click on the “ASP.NET AJAX Preview on CodePlex” link for a peek at what’s coming). Regardless, I believe this pillar will always be for the subset of ASP.NET developers who don’t shun the client side.

3. ASP.NET MVC. This pillar is the newest to emerge from Microsoft. In fact, as of this writing, it’s only a week old, having been released at Mix09. Some ASP.NET curmudgeons would call this a throwback to the days of ASP “classic” spaghetti code, but for many others--especially the alt.net crowd and transplants from Ruby and Java--this represents the cat’s pajamas on the Microsoft web stack. (Of course, it’s amazing how quickly developers find problems in the latest programmer’s paradise--usually before its release--and I’m sure the MVC aficionados are already looking to the next release.)

The basic idea behind ASP.NET MVC is to separate out the three concerns of the MVC pattern: the model, the view, and the controller. The model represents the data model, the view is the user interface that presents the data and interacts with the user, and the controller is the command center that takes inputs from the view, pushes and pulls data into/from the model, and decides what to do next. By separating out these concerns (as purely as possible), you improve the ability to create unit tests for your applications and, at least on some level, improve application maintainability. If you are into test-driven development, then this is the pillar to hitch your horse to.
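
A minimal controller makes the separation concrete. This sketch targets ASP.NET MVC 1.0; the controller, data, and view names are illustrative:

using System.Collections.Generic;
using System.Web.Mvc;

public class ProductsController : Controller
{
    // The action pulls data from the model (hard-coded here; a real app
    // would use a repository) and hands it to a view. Because there is no
    // page lifecycle or ViewState, this method is trivial to unit test.
    public ActionResult Index()
    {
        List<string> products = new List<string> { "Widget", "Gadget" };
        return View(products); // renders Views/Products/Index.aspx
    }
}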

4. Dynamic Data. IMO, Dynamic Data is a misnomer. From its name, one would tend to think this is yet another data access technology from Microsoft. It is not. I would have preferred Microsoft to use the name Dynamic Application or Dynamic Application Builder. MSDN Help says, “ASP.NET Dynamic Data is a framework that lets you create data-driven ASP.NET Web applications easily.”

You start with Dynamic Data by creating a database and then using either LINQ to SQL or the Entity Framework to create a model over your data. Next, create a new project or web site in Visual Studio 2008 (with SP1) using one of the Dynamic Data Web Application templates, make a fairly simple change to the web site’s Global.asax (shown below), and Visual Studio builds a dynamic ASP.NET application over your data model. The resulting site supports the creation, retrieval, updating, and deletion (commonly referred to as the CRUD operations) of rows in the underlying database.
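
The Global.asax change boils down to registering your model and turning on scaffolding. A sketch based on the .NET 3.5 SP1 Dynamic Data template (NorthwindDataContext is a placeholder for your LINQ to SQL or Entity Framework context):

using System;
using System.Web.DynamicData;
using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        MetaModel model = new MetaModel();

        // The "fairly simple change": register the data context and
        // enable scaffolding for every table in the model.
        model.RegisterContext(typeof(NorthwindDataContext),
            new ContextConfiguration { ScaffoldAllTables = true });

        // Route requests like /Products/List.aspx to the scaffolded pages.
        routes.Add(new DynamicDataRoute("{table}/{action}.aspx")
        {
            Constraints = new RouteValueDictionary(
                new { action = "List|Details|Edit|Insert" }),
            Model = model
        });
    }

    void Application_Start(object sender, EventArgs e)
    {
        RegisterRoutes(RouteTable.Routes);
    }
}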

Dynamic Data uses what is termed dynamic scaffolding to construct your application. This means if you change your data model, the application will be instantly updated: there’s no code that needs to be re-generated.

Dynamic Data is customizable. Thus, if you don’t like how Dynamic Data presents a data type or a particular field, or you want to change how it performs validation of a field, you can change the templates behind these. You can also change the master page used, as well as a number of other details.
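
For instance, validation is typically customized with metadata attributes on a partial class rather than by editing the scaffolded pages. A sketch (the Product entity and its fields are placeholders):

using System.ComponentModel.DataAnnotations;

// The partial class attaches metadata to the generated entity
// without touching the generated code itself.
[MetadataType(typeof(ProductMetadata))]
public partial class Product { }

public class ProductMetadata
{
    // Dynamic Data enforces these rules in its field templates.
    [Required]
    [StringLength(50)]
    public object ProductName { get; set; }

    // Hide this column from the scaffolded pages entirely.
    [ScaffoldColumn(false)]
    public object InternalCost { get; set; }
}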

Kudos to Microsoft for Dynamic Data--even though I hate the name. Just realize that this pillar, unlike the other three, is only applicable to the subset of ASP.NET applications that fit the Dynamic Data approach: primarily data-centric, intranet-based applications. That said, I could see many places where I might use Dynamic Data, though I am still trying to work through the security implications of opening up the database for pretty much unrestricted reading and writing of tables.

(For those who would like to see an MVC version of Dynamic Data: I’d expect to see such a beast come down the pike at some point from Microsoft.)

Conclusion
So what does this mean to the developer? I have both good and bad news.

The good news is that you now have a choice in how you develop your ASP.NET applications. If you love the responsiveness and coolness of AJAX, or you need your applications to scale big-time, then you’ll love ASP.NET AJAX. If, OTOH, you are into unit tests or TDD, you will love ASP.NET MVC. Finally, if you are looking for an easier way to build your basic CRUD web application, you’ll want to take a look at Dynamic Data. Don’t need any of the above? There’s no need to despair--ASP.NET Web Forms are here for the long haul!

The bad news is that you now have a choice in how you develop your ASP.NET applications. (Wait a second, wasn’t that also the good news?) This means you have more things to learn. It also means that in trying to support all four pillars, Microsoft may be taking some of its focus off Web Forms. After all, there are only so many PMs and devs on the ASP.NET team at Microsoft. Furthermore, if you are a manager like me, worried about the maintainability of applications and about hiring developers, your job just got more difficult, because one person’s idea of ASP.NET development is not necessarily someone else’s. Still, I think the good news here outweighs the bad.

Microsoft used to present ASP.NET Web Forms vs. MVC as a choice between a car and a motorcycle. Both will get you to your job, but some (the majority of the population, I might add) prefer driving a car, while a sizable minority love their motorcycles, which give you better gas mileage and independence but don’t protect you in the rain. To stretch this analogy to its breaking point, let me suggest that ASP.NET AJAX is like riding a bicycle to work (lean and mean, best gas mileage, but it requires you to exercise for your commute and exposes you to the elements like the motorcycle), while Dynamic Data is like taking the bus to work (let Metro do the driving for you).

What about Silverlight?
Silverlight is really just the next generation of the one-click desktop application, hosted in the browser and running on a multi-platform .NET virtual machine. Not to take anything away from Silverlight, but it’s not ASP.NET.

Does this mean that ASP.NET Web Forms is going away?
Although some people “in the know” have stated in the past couple of years that either ASP.NET AJAX or ASP.NET MVC was the future of ASP.NET, this is not the official position of Microsoft. Think about it: since 98%--yes, this is a total educated guess--of ASP.NET development is still done using ASP.NET Web Forms, and this percentage is unlikely to change significantly over the next several years, Microsoft would be stupid to kill off Web Forms. It doesn’t make any economic sense, and since Microsoft is a for-profit entity, I think Web Forms will be a major thrust of ASP.NET for many years to come.

In my opinion, Microsoft has added the three new pillars for the following reasons:

· Microsoft is always trying to follow innovation and buzz in the industry. Three years ago, that buzz was AJAX. Today, MVC, separation of concerns, and TDD are all the rage, so Microsoft has countered with ASP.NET MVC.

· Microsoft is always trying to broaden the appeal of ASP.NET. Microsoft has been trying to woo the open source community for years (after initially discounting it). And in order to do this, it needs to embrace many of the ideals of this community, which tends to live on the bleeding edge and is into AJAX, testability, and TDD, amongst other things.

· Microsoft truly wants to improve the productivity of its customers. After all, if you improve the ability of your customers, in our case corporate and independent developers, to get their jobs done, you’ve gone a long way toward attracting and retaining customers. In Microsoft’s eyes (and those of many of its customers), this is a win-win situation.

I, for one, would like to thank Microsoft for the four pillars of ASP.NET. In producing the four pillars, Microsoft has given ASP.NET developers the choices they both want and need.

Import an Excel Sheet into a DataTable

using System;
using System.ComponentModel;
using System.Data;
using System.Data.OleDb;
using System.Text;


public class ExcelBase : Component, IDisposable
{
#region Constructors
public ExcelBase()
{
UseFinalizer = false;
}
public ExcelBase(string WorkBook)
: this()
{
this.WorkBook = WorkBook;
}
#endregion
#region Workbook/range settings
string workbook;
/// <summary>
/// The workbook (file) name to query
/// </summary>
[DefaultValue(null)]
public string WorkBook
{
get { return workbook; }
set
{
CloseConnection();
workbook = value;
determinedrange = null;
}
}
/// <summary>
/// The range to query. This can be any Excel range (e.g. "A1:B5") or
/// just a worksheet name.
/// If this value is null, the first sheet of the <see cref="WorkBook"/> is used.
/// </summary>
private string range;
[DefaultValue(null)]
public string Range
{
get { return range; }
set
{
range = value;
determinedrange = null;
}
}
private int worksheetindex = 0;
/// <summary>
/// The 0-based index of the worksheet to query.
/// If you want to set the name of the worksheet, use <see cref="Range"/> instead.
/// NB: if <see cref="Range"/> is set, this property is ignored.
/// </summary>
[DefaultValue(0)]
public int WorkSheetIndex
{
get { return worksheetindex; }
set
{
worksheetindex = value;
determinedrange = null;
}
}
#region Range formatting
/// <summary>
/// If a range was determined in a previous step, keep it buffered here
/// </summary>
string determinedrange;
/// <summary>
/// Gets the properly formatted sheet name.
/// If no worksheet was provided, read out sheet information and select everything
/// from the first sheet.
/// </summary>
public string GetRange()
{
if (determinedrange == null)
{
string range = Range;
if (range == null)
range = DetermineRange();
if (range.IndexOf(':') == -1 && !range.EndsWith("$"))
range += "$"; //sheetname has to be appended with a $
determinedrange = "[" + range + "]";
}
return determinedrange;
}

/// <summary>
/// See the <see cref="AutoDetermineRange"/> property for more info
/// </summary>
/// <returns>The range to query, e.g. a sheet name or an Excel range</returns>
string DetermineRange()
{
string sheet = GetSheetName(worksheetindex);
if (!autodeterminerange) return sheet;
return new RangeFinder(this, sheet).ToString();
}
#region RangeFinder
class RangeFinder
{
OleDbDataAdapter da;
DataTable dtSchema;
ExcelDataRange rng = new ExcelDataRange();
Import eb;
int cols;
/// <summary>
/// minimum number of columns that need to be filled
/// <seealso cref="minfilled"/>
/// </summary>
int min;
public RangeFinder(ExcelBase Owner, string sheet)
{
this.eb = new Import(Owner.WorkBook);
eb.Range = sheet;
eb.UseHeaders = false;
eb.InterMixedAsText = true;
//DataTable dt = eb.Query();
try
{
eb.OpenConnection();
//get the number of rows and columns
da = new OleDbDataAdapter("select * from [" + sheet + "]", eb.Connection);
dtSchema = new DataTable();
da.FillSchema(dtSchema, SchemaType.Source);
cols = dtSchema.Columns.Count;
int rows = (int)ExecuteScalar("select count(*) from [" + sheet + "]");
//fill the range object
rng.From.Row = rng.From.Column = 1;
rng.To.Row = rows;
rng.To.Column = cols;

min = (int)(cols * minfilled);
//now rng contains the complete square range of data containing cells
//try to narrow it by getting as much hits as possible
DecreaseRange();
}
finally
{
if (indexReader != null) indexReader.Close();
eb.CloseConnection();
}
}
object ExecuteScalar(string sql)
{
return new OleDbCommand(sql, da.SelectCommand.Connection).ExecuteScalar();
}

string indexquery;
string GetIndexQuery()
{
if (indexquery == null)
{
//build a query that counts the non-null cells per row
//(iterate the schema columns; FillSchema leaves the row collection empty)
StringBuilder sql = new StringBuilder("select 0");
foreach (DataColumn dc in dtSchema.Columns)
{
string colname = "[" + dc.ColumnName + "]";
sql.Append("+iif(").Append(colname).Append(" is null,0,1)");
}
sql.Append(" as ind from ");
indexquery = sql.ToString();
}
return indexquery;
}
OleDbDataReader indexReader;
int GetIndex()
{
if (!Forward)
{
//re-query when scanning backward from the bottom of the range
if (indexReader != null)
{
indexReader.Close();
indexReader = null;
}
da.SelectCommand.CommandText = string.Format("select * from {0}:{0}"
, rng.To.Row);
}
if (indexReader == null)
indexReader = da.SelectCommand.ExecuteReader();
int cnt = 0;
if (!indexReader.Read()) return -1;
for (int i = 0; i < indexReader.FieldCount; i++)
{
if (!indexReader.IsDBNull(i)) cnt++;
}
return cnt;
}
/// <summary>
/// minimum percentage of columns that needs to be filled for a row to count as a data row
/// </summary>
const double minfilled = .75;
/// <summary>
/// The number of subsequent (or preceding) rows that need to be filled the <see cref="minfilled"/> percentage
/// for the block to count as data rows
/// </summary>
const int CheckRows = 3;
/// <summary>
/// Decrease the range step by step.
/// The problem is that when obtaining everything, a lot more nulls are returned
/// than you would visibly see, which makes most algorithms for finding the
/// data block useless.
/// This is also why just obtaining the complete data table and removing the empty
/// rows will not suffice: the proper field data types will not have been set.
/// The best way I could figure without using interop was to move the start of the
/// range and see if the average number of filled values increases.
/// </summary>
void DecreaseRange()
{
for (; ; )
{
if (GetIndex() >= min)
{
int i = 0;
for (; i < CheckRows; i++)
{
AlterRange(1);
if (GetIndex() < min)
{
break;
}
}
if (i == CheckRows)
{
AlterRange(-i);
if (Forward)
Forward = false;
else
break;
}
}
if (rng.From.Row > rng.To.Row)
throw new Exception("Could not determine data range");
AlterRange(1);
}
}
bool Forward = true;
void AlterRange(int i)
{
if (Forward)
rng.From.Row += i;
else
rng.To.Row -= i;
}

public override string ToString()
{
return rng.ToString();
}
struct ExcelRange
{
public int Row, Column;
public ExcelRange(int Col, int Row)
{
this.Column = Col;
this.Row = Row;
}
public override string ToString()
{
//return string.Format("R{0}C{1}", Row, Column);
string res = Row.ToString();
int col = Column;
while (col > 0)
{
//convert the 1-based column number to Excel letters (1 -> A, 26 -> Z, 27 -> AA)
int cc = (col - 1) % 26;
res = (char)('A' + cc) + res;
col = (col - 1) / 26;
}
return res;
}
}
struct ExcelDataRange
{
public ExcelRange
From, To;
public override string ToString()
{
return GetRange(From, To);
}
static string GetRange(ExcelRange from, ExcelRange to)
{
return from.ToString() + ":" + to.ToString();
}
public string TopRow()
{
return GetRange(From, new ExcelRange(To.Column, From.Row));
}
public string BottomRow()
{
return GetRange(new ExcelRange(From.Column, To.Row), To);
}
}
}
#endregion
#endregion

/// <summary>
/// Checks if the <see cref="WorkBook"/> exists
/// </summary>
public bool WorkBookExists
{
get { return System.IO.File.Exists(WorkBook); }
}
/// <summary>
/// Checks if the workbook exists and throws an exception if it doesn't
/// <seealso cref="WorkBookExists"/>
/// </summary>
protected void CheckWorkbook()
{
if (!WorkBookExists) throw new System.IO.FileNotFoundException("Workbook not found", WorkBook);
}
#endregion
#region Connection
/// <summary>
/// Creates a NEW connection. If this method is called directly, this
/// class will not check if it is closed.
/// To get a handled connection, use the <see cref="Connection"/> property.
/// </summary>
/// <returns>A new OleDbConnection to the workbook</returns>
public OleDbConnection CreateConnection()
{
CheckWorkbook();
return new OleDbConnection(
string.Format("Provider=Microsoft.Jet.OLEDB.4.0;" +
"Data Source={0};Extended Properties='Excel 8.0;HDR={1};Imex={2}'",
WorkBook, useheaders ? "Yes" : "No",
imex ? "1" : "0")
);
}
private bool useheaders = true;
/// <summary>
/// Determines if the first row in the specified <see cref="Range"/> contains the headers
/// </summary>
[DefaultValue(true)]
public bool UseHeaders
{
get { return useheaders; }
set
{
if (useheaders != value)
{
CloseConnection();
useheaders = value;
}
}
}
private bool imex;
/// <summary>
/// If this value is <c>true</c>, 'intermixed' data columns are handled as text (otherwise Excel tries to make a calculated guess at what the data type should be)
/// </summary>
[DefaultValue(false)]
public bool InterMixedAsText
{
get { return imex; }
set
{
if (imex != value)
{
CloseConnection();
imex = value;
}
}
}
private bool autodeterminerange;
/// <summary>
/// Tries to obtain the range automatically by looking for a large chunk of data. Use this value if there's a lot of
/// static around the actual data.
/// Beware though: this takes some additional steps and can cause performance loss
/// when querying larger files.
/// Automatically determining the range is not foolproof. Be sure to check the results
/// on first-time use.
/// NB: if the <see cref="Range"/> is set, this property is ignored.
/// </summary>
[DefaultValue(false)]
public bool AutoDetermineRange
{
get { return autodeterminerange; }
set
{
if (autodeterminerange != value)
{
autodeterminerange = value;
determinedrange = null;
}
}
}
OleDbConnection conn;
/// <summary>
/// Gets a connection to the current <see cref="WorkBook"/>.
/// When called for the first time (or after changing the workbook)
/// a new connection is created.
/// To close the connection, the use of <see cref="CloseConnection"/> is preferred.
/// </summary>
public OleDbConnection Connection
{
get
{
if (conn == null)
{
conn = CreateConnection();
UseFinalizer = true;
}
return conn;
}
}
/// <summary>
/// Closes the connection (if open)
/// </summary>
public void CloseConnection()
{
if (conn != null && ConnectionIsOpen)
conn.Dispose();
conn = null;
UseFinalizer = false;
}
protected void CloseConnection(bool OnlyIfNoneOpen)
{
if (OnlyIfNoneOpen)
{
if (--opencount > 0 || wasopenbeforerememberstate) return;
}
CloseConnection();
}
/// <summary>
/// Opens the <see cref="Connection"/>
/// </summary>
public void OpenConnection()
{
OpenConnection(false);
}
int opencount;
bool wasopenbeforerememberstate;
protected void OpenConnection(bool RememberState)
{
if (RememberState && opencount++ == 0) wasopenbeforerememberstate = ConnectionIsOpen;
if (!ConnectionIsOpen)
Connection.Open();
}
public bool ConnectionIsOpen
{
get { return conn != null && conn.State != ConnectionState.Closed; }
}

#endregion
#region IDisposable Members
public void Dispose()
{
CloseConnection();
}
~ExcelBase()
{
Dispose();
}
private bool usefinalizer;
bool UseFinalizer
{
get { return usefinalizer; }
set
{
if (usefinalizer == value) return;
usefinalizer = value;
if (value)
GC.ReRegisterForFinalize(this);
else
GC.SuppressFinalize(this);
}
}
#endregion
#region Helper functions
/// <summary>
/// Queries the connection for the sheet names and returns them
/// </summary>
/// <returns>The names of the sheets in the workbook</returns>
public string[] GetSheetNames()
{
OpenConnection(true);
try
{
// Read out sheet information
DataTable dt = Connection.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
if (dt == null || dt.Rows.Count == 0)
{
throw new Exception("Could not get sheet names");
}

string[] res = new string[dt.Rows.Count];
for (int i = 0; i < res.Length; i++)
{
string name = dt.Rows[i]["TABLE_NAME"].ToString();

if (name[0] == '\'')
{
//numeric sheetnames get single quotes around them in the schema.
//remove them here
if (System.Text.RegularExpressions.Regex.IsMatch(
name, @"^'\d\w+\$'$"))
name = name.Substring(1, name.Length - 2);
}
res[i] = name;
}
return res;
}
finally
{
CloseConnection(true);
}
}
/// <summary>
/// Gets the name of the first sheet
/// (this is also the default range used when no <see cref="Range"/> is specified)
/// </summary>
/// <returns>The name of the first sheet</returns>
public string GetFirstSheet()
{
return GetSheetName(0);
}
public string GetSheetName(int index)
{
string[] sheets = GetSheetNames();
if (index < 0 || index >= sheets.Length)
throw new IndexOutOfRangeException("No worksheet exists at the specified index");
return sheets[index];
}
#endregion
}
public class Import : ExcelBase
{
#region Static query procedures
/// <summary>
/// Imports the first worksheet of the specified file
/// </summary>
/// <param name="File">The Excel file to query</param>
public static DataTable Query(string File)
{
return Query(File, null);
}
/// <summary>
/// Imports the specified sheet in the specified file
/// </summary>
/// <param name="File">The Excel file to query</param>
/// <param name="Range">The worksheet or Excel range to query</param>
/// <returns>A DataTable with the query results</returns>
public static DataTable Query(string File, string Range)
{
return new Import(File, Range).Query();
}
public static DataTable Select(string File, string Sql)
{
Import i = new Import(File);
i.SQL = Sql;
return i.Query();
}
#endregion

#region Constructors
public Import() { }
public Import(string WorkBook) : base(WorkBook) { }
public Import(string WorkBook, string Range)
: this(WorkBook)
{
this.Range = Range;
}
#endregion

#region SQL Query
private string fields = "*";
/// <summary>
/// The fields which should be returned (default: all fields with data, "*")
/// </summary>
[DefaultValue("*")]
public string Fields
{
get { return fields; }
set { fields = value; }
}
void ResetFields()
{
fields = "*";
}
private string where;
/// <summary>
/// An optional where clause. Works pretty much the same as 'normal' SQL. (Default=null)
/// </summary>
[DefaultValue(null)]
public string Where
{
get { return where; }
set { where = value; }
}
/// <summary>
/// The SQL to perform. If this value is filled, <see cref="Range"/> and <see cref="Where"/> are ignored
/// </summary>
public string SQL;
protected string GetSelectSQL()
{
if (SQL != null) return SQL;
// if no sql was provided, construct from worksheet and where
string sql = string.Format("select {0} from {1}", fields, GetRange());
if (where != null)
sql += " WHERE " + where;
return sql;
}
/// <summary>
/// Performs the query with the specified settings
/// </summary>
/// <returns>A DataTable with the query results</returns>
public DataTable Query()
{
return Query((DataTable)null);
}
/// <summary>
/// Same as <see cref="Query()"/>, but an existing DataTable is used and filled
/// (it will be your own responsibility to format the DataTable correctly)
/// </summary>
/// <param name="dt">The table to fill (a new one is created when null)</param>
/// <returns>The filled DataTable</returns>
public DataTable Query(DataTable dt)
{
CheckWorkbook();
try
{
OpenConnection(true);
if (dt == null)
dt = new DataTable();
new OleDbDataAdapter(GetSelectSQL(), Connection).Fill(dt);
return dt;
}
finally
{
CloseConnection(true);
}
}
/// <summary>
/// Fills the DataTable with the results of the query
/// (wrapper around <see cref="Query(DataTable)"/>)
/// </summary>
/// <param name="dt">The table to fill</param>
public void Fill(DataTable dt)
{
Query(dt);
}
#endregion

}
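
A quick usage sketch for the classes above (the file path, sheet name, and column are examples):

// Import the first sheet of a workbook in one call:
DataTable dt = Import.Query(@"C:\data\sales.xls");

// Query a specific sheet, or filter with SQL:
DataTable sheet2 = Import.Query(@"C:\data\sales.xls", "Sheet2");
DataTable filtered = Import.Select(@"C:\data\sales.xls",
    "select * from [Sheet2$] where [Amount] > 100");

// Or use an instance and let it hunt for the data block:
Import imp = new Import(@"C:\data\sales.xls");
imp.AutoDetermineRange = true;
DataTable detected = imp.Query();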

Export Data to an Excel Sheet

using System;
using System.Collections.Generic;
using System.Data;
using System.IO;

class ExportExcel
{


public static void Export(string sheetToCreate, DataTable dtToExport, string tableName)
{
List<DataRow> rows = new List<DataRow>();
foreach (DataRow row in dtToExport.Rows) rows.Add(row);
subExportToExcel(sheetToCreate, rows, dtToExport, tableName);
}
private static void subExportToExcel(string sheetToCreate, List selectedRows, DataTable origDataTable, string tableName)
{
char Space = ' ';
string dest = sheetToCreate;
int i = 0;
while (File.Exists(dest))
{
dest = Path.GetDirectoryName(sheetToCreate) + "\\" + Path.GetFileNameWithoutExtension(sheetToCreate) + i + Path.GetExtension(sheetToCreate);
i += 1;
}
sheetToCreate = dest;
if (tableName == null) tableName = string.Empty;
tableName = tableName.Trim().Replace(Space, '_');
if (tableName.Length == 0) tableName = origDataTable.TableName.Replace(Space, '_');
if (tableName.Length == 0) tableName = "NoTableName";
if (tableName.Length > 30) tableName = tableName.Substring(0, 30);
//Excel names are less than 31 chars
string queryCreateExcelTable = "CREATE TABLE [" + tableName + "] (";
Dictionary<string, string> colNames = new Dictionary<string, string>();
foreach (DataColumn dc in origDataTable.Columns)
{
//Cause the query to name each of the columns to be created.
string modifiedcolName = dc.ColumnName.Replace(Space, '_').Replace('.', '#');
string origColName = dc.ColumnName;
colNames.Add(modifiedcolName, origColName);
queryCreateExcelTable += "[" + modifiedcolName + "]" + " text,";
}
queryCreateExcelTable = queryCreateExcelTable.TrimEnd(',') + ")";
//adds the closing parentheses to the query string
if (selectedRows.Count > 65000 && sheetToCreate.ToLower().EndsWith(".xls"))
{
//use Excel 2007 for large sheets.
sheetToCreate = sheetToCreate.ToLower().Replace(".xls", string.Empty) + ".xlsx";
}
string strCn = string.Empty;
string ext = System.IO.Path.GetExtension(sheetToCreate).ToLower();
if (ext == ".xls") strCn = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + sheetToCreate + "; Extended Properties='Excel 8.0;HDR=YES'";
if (ext == ".xlsx") strCn = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + sheetToCreate + ";Extended Properties='Excel 12.0 Xml;HDR=YES' ";
if (ext == ".xlsb") strCn = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + sheetToCreate + ";Extended Properties='Excel 12.0;HDR=YES' ";
if (ext == ".xlsm") strCn = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + sheetToCreate + ";Extended Properties='Excel 12.0 Macro;HDR=YES' ";
System.Data.OleDb.OleDbConnection cn = new System.Data.OleDb.OleDbConnection(strCn);
System.Data.OleDb.OleDbCommand cmd = new System.Data.OleDb.OleDbCommand(queryCreateExcelTable, cn);
cn.Open();
cmd.ExecuteNonQuery();
System.Data.OleDb.OleDbDataAdapter da = new System.Data.OleDb.OleDbDataAdapter("SELECT * FROM [" + tableName + "]", cn);
System.Data.OleDb.OleDbCommandBuilder cb = new System.Data.OleDb.OleDbCommandBuilder(da);
//creates the INSERT INTO command
cb.QuotePrefix = "[";
cb.QuoteSuffix = "]";
cmd = cb.GetInsertCommand();
//gets a hold of the INSERT INTO command.
foreach (DataRow row in selectedRows)
{
foreach (System.Data.OleDb.OleDbParameter param in cmd.Parameters)
param.Value = row[colNames[param.SourceColumn]];
cmd.ExecuteNonQuery(); //INSERT INTO command.
}
cn.Close();
cn.Dispose();
da.Dispose();
GC.Collect();
GC.WaitForPendingFinalizers();
}



}
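
Usage is a single call (the path and table name are examples). Note that if the target file already exists, the code above picks a new file name rather than overwriting:

// Build a small table and write it out as a sheet/table named "Report".
DataTable myTable = new DataTable("Report");
myTable.Columns.Add("Name");
myTable.Columns.Add("Amount");
myTable.Rows.Add("Widget", "42");
ExportExcel.Export(@"C:\out\report.xls", myTable, "Report");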

Friday, March 27, 2009

SQL SERVER - 2008 - SCOPE_IDENTITY Bug with Multi Processor Parallel Plan and Solution

This article is very serious and I would like to explain it as simply as I can. SCOPE_IDENTITY(), which is commonly used in place of @@IDENTITY, has a bug when run with a parallel plan. You can read my explanation of @@IDENTITY vs SCOPE_IDENTITY() vs IDENT_CURRENT in an earlier article.

The bug is listed on the Connect site as SCOPE_IDENTITY() sometimes returns incorrect value. Additionally, the bug is also noted in Books Online, on the last line of the SCOPE_IDENTITY() documentation.

When a parallel plan is executed, SCOPE_IDENTITY or IDENTITY may produce inconsistent results. The bug will be fixed in future versions of SQL Server. For SQL Server 2008 or any earlier version, it should be worked around right away using the following workarounds.

Single Processor Execution of Query (preferred method is to use the OUTPUT clause mentioned below):
In short, if you have a multiprocessor machine and your SQL Server is set to use multiple processors to execute queries, all is well until it encounters SCOPE_IDENTITY. You can use the OPTION (MAXDOP 1) query hint to execute a query on a single processor. As using a single CPU can hurt performance, the better solution is to use the OUTPUT clause as mentioned below.

USE AdventureWorks
GO
SELECT *
FROM HumanResources.Department
OPTION (MAXDOP 1)
GO

The whole server can also be set to run parallel plans serially:

sp_configure 'show advanced options', 1;
GO
RECONFIGURE WITH OVERRIDE;
GO
sp_configure 'max degree of parallelism', 1;
GO
RECONFIGURE WITH OVERRIDE;
GO

Using OUTPUT clause: (Preferred Method)
Refer to my previous article for how to use the OUTPUT clause: SQL SERVER - 2005 - OUTPUT Clause Example and Explanation with INSERT, UPDATE, DELETE. I have included the same example along with this article for quick reference.

USE AdventureWorks;
GO
/* Creating the permanent table */
CREATE TABLE TestTable (ID INT IDENTITY(1,1), TEXTVal VARCHAR(100))
/* Creating temp table to store values from the OUTPUT clause */
DECLARE @TmpTable TABLE (ID INT, TEXTVal VARCHAR(100))
/* Insert values in the real table and use the OUTPUT clause to insert
values in the temp table. */
INSERT TestTable (TEXTVal)
OUTPUT Inserted.ID, Inserted.TEXTVal INTO @TmpTable
VALUES ('FirstVal')
INSERT TestTable (TEXTVal)
OUTPUT Inserted.ID, Inserted.TEXTVal INTO @TmpTable
VALUES ('SecondVal')
/* Check the values in the temp table and real table
The values in both the tables will be same */
SELECT * FROM @TmpTable
SELECT * FROM TestTable
/* Clean up */
DROP TABLE TestTable
GO

While executing the above code we get the following result. It is very clear from the result that the identity value which was inserted and the one retrieved using the OUTPUT clause are the same.

If you have any problem or question about using the OUTPUT clause or MAXDOP, please feel free to leave a comment here or send me email. I want to make sure that the community is aware of this issue and has the solution ready. Microsoft will take the necessary steps to fix the issue, but till then, as community members, it is our responsibility to help each other and make sure our applications are running smoothly. Spread the word about this article.

Reference : Pinal Dave


SQL SERVER - Insert Multiple Records Using One Insert Statement - Use of UNION ALL

Update: For SQL Server 2008 there is an even better method of row construction; please read about it here: SQL SERVER - 2008 - Insert Multiple Records Using One Insert Statement - Use of Row Constructor

This is a very interesting question I received from a new developer: how can I insert multiple values into a table using only one insert statement? When multiple records are to be inserted into a table, the following is the common way using T-SQL.

USE YourDB
GO
INSERT INTO MyTable (FirstCol, SecondCol)
VALUES ('First',1);
INSERT INTO MyTable (FirstCol, SecondCol)
VALUES ('Second',2);
INSERT INTO MyTable (FirstCol, SecondCol)
VALUES ('Third',3);
INSERT INTO MyTable (FirstCol, SecondCol)
VALUES ('Fourth',4);
INSERT INTO MyTable (FirstCol, SecondCol)
VALUES ('Fifth',5);
GO

The INSERT INTO clause is repeated multiple times. Many times DBAs copy and paste it to save time. There is another alternative to this, which I use frequently: the UNION ALL and INSERT INTO … SELECT… clauses. Regarding performance there is not much difference, and if there is a performance difference it does not matter, as I use this for one-time insert scripts. I enjoy writing it this way, as it keeps me focused on the task instead of copying and pasting. I explained the following script to the new developer. He was quite pleased.

USE YourDB
GO
INSERT INTO MyTable (FirstCol, SecondCol)
SELECT 'First' ,1
UNION ALL
SELECT 'Second' ,2
UNION ALL
SELECT 'Third' ,3
UNION ALL
SELECT 'Fourth' ,4
UNION ALL
SELECT 'Fifth' ,5
GO

The effective result is the same.

Reference : Pinal Dave