DataReader vs. DataAdapter
I'm totally stoked to see Bill Vaughn blogging because he's 1) Irish 2) named William 3) a bad a33. I saw Sahil and Rocky chimed in on this issue, and it's one that I feel like running my big trap about too. In the debate over which to use, there are basically two ends of the spectrum. On one side there's the “DataReaders have better performance and performance == better” camp. Back in my early days, I was in that camp. Then there's the other extreme: “DataAdapters are really easy to use and easy == better.” I'll admit I'm oversimplifying a little, but at the end of the day, the two ends of the spectrum are some reasonable derivation of the arguments stated above.
The longer I work in programming, the less inclined I am to ever use 'better' because, with a few exceptions, better is totally context dependent. The notable exception is the statement “C# is better than VB.NET and C# programmers are better than VB.NET programmers” [Just kidding]. But there's one REALLY BIG DIFFERENCE between the two methodologies. If you have 100,000 records in the database when you fire your xxxAdapter.Fill statement, you'll have 100,000 rows in your DataSet. This would prove, among other things, that you have rocks in your head and shouldn't be pulling over 100,000 records, but that's another discussion.
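To make that concrete, here's a minimal sketch of the Fill behavior. The table name and connection string are made up for illustration; the point is that every matching row lands in memory at once, and the row count is knowable immediately.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class FillDemo
{
    static void Main()
    {
        // Hypothetical connection string and table name.
        string connString = "Server=.;Database=Northwind;Integrated Security=true;";

        using (SqlConnection conn = new SqlConnection(connString))
        {
            SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Orders", conn);
            DataSet ds = new DataSet();

            // Fill opens the connection, pulls EVERY matching row into
            // memory, and closes the connection for you.
            adapter.Fill(ds, "Orders");

            // Unlike a reader, the count is known the moment Fill returns.
            Console.WriteLine(ds.Tables["Orders"].Rows.Count);
        }
    }
}
```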
The same isn't necessarily so with xxxDataReaders. If you had a really volatile database with tons of inserts or deletes happening at any given time, and had 100,000 records in it when you called ExecuteReader, you don't really know how many records you'll have at the end. Is this a big deal? It really depends on your scenario. If 90,000 of them were deleted immediately, it probably would be, but again, you can't really speak to every developer scenario, so in some cases it might be preferable, in others it may not.
Another pretty big difference is a corollary to this. With a DataReader, since you don't know how many records you have until you've read them all, you have to resort to workarounds if you want to give a UI cue as to how many records the query returned. You can run another SELECT COUNT(*) query, or you can loop through your reader and count as you go. The first method sucks b/c it produces 'unnecessary' overhead. I use quotes on that because it's pretty much necessary in some cases, if telling the user how many records came back is important. The other workaround, counting as you loop, sucks b/c the user may want to take a different course of action depending on how many rows were returned, and once you eat a record, it's gone.
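Both workarounds look roughly like this sketch (again, the table and connection are hypothetical). Note the forward-only nature of the reader: once Read() moves past a row, you can't go back to it.

```csharp
using (SqlConnection conn = new SqlConnection(connString))
{
    conn.Open();

    // Workaround 1: an extra round trip just to get the count up front.
    SqlCommand countCmd = new SqlCommand("SELECT COUNT(*) FROM Orders", conn);
    int total = (int)countCmd.ExecuteScalar();

    // Workaround 2: count as you go. Each row is consumed on Read(), so
    // you can't change course based on the total after the fact.
    SqlCommand cmd = new SqlCommand("SELECT OrderID FROM Orders", conn);
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        int seen = 0;
        while (reader.Read())
        {
            // process reader["OrderID"] here; it's gone on the next Read()
            seen++;
        }
    }
}
```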
And speaking to the greater simplicity of DataAdapters, I don't know if I buy that argument wholesale, although I would agree that in general they are easier to use. If I can't open a connection, something that's always a possibility, I may want to respond differently than if my query is bad, I don't have permission on a proc, or whatever else. Letting an adapter open and close the connection does mean that opening and closing will be handled correctly, but if you don't open and close your connections manually, you're going to have to program for exceptions a little differently. Sure, you can just catch xxxException and check the error message, but in some cases that's a pretty lame way to go about 'handling' not being able to open a database. For instance, you may have a warm standby server available that contains all of your lookup data: if your first connection fails to open, you trap the exception and run to the other database, or call a web service, or read an XML file, or whatever. Granted, 99.9% of the time you probably won't respond like this, so I'm probably splitting hairs, but I figured it was worth mentioning.
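Here's a sketch of that warm-standby idea. The server and database names are made up; the point is that opening the connection yourself makes a failed Open() distinguishable from a failed query, which the adapter's implicit open/close hides from you.

```csharp
// Open the connection manually so a connect failure is its own code path,
// separate from query or permission errors. Names are hypothetical.
SqlConnection conn = new SqlConnection(
    "Server=Primary;Database=Lookups;Integrated Security=true;");
try
{
    conn.Open();
}
catch (SqlException)
{
    // Couldn't reach the primary: fall back to the standby (or a web
    // service, or an XML file) instead of just surfacing the error.
    conn = new SqlConnection(
        "Server=Standby;Database=Lookups;Integrated Security=true;");
    conn.Open();
}

try
{
    SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM StateCodes", conn);
    DataSet ds = new DataSet();
    // Fill uses the already-open connection and leaves it open,
    // so we're responsible for closing it ourselves.
    adapter.Fill(ds);
}
finally
{
    conn.Close();
}
```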
Another pretty big difference between the two comes in the form of distributed scenarios. You can't serialize a DataReader, so you can't remote it. It's pretty easy to work around in most instances, and I guess properly service-oriented-architected apps just provide a facade and return the data, so it's not an issue there because your remote object can have a hard reference to your processing handler, but it's something worth considering.
On the other hand, it's VERY easy to whip up a Typed DataSet and fill it with an adapter. As such, a lot of functionality is already created for you. Sure, you can create a strongly typed collection and fill it with a DataReader, but it'll take a little work, a little more than it would to do the same with an adapter and a strongly typed DataSet. Moreover, there's a ton of built-in functionality with DataSets, like DataRelations, DataViews, DataTable.Select, expression columns, etc. Building all of this into a collection of your own is doable, but it's a pain in the a33. Actually, if you were going to use all of these features and you didn't use a DataSet, you must either be billing by the hour and enjoy bilking your clients, or you're nuts.
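A taste of that built-in machinery, assuming a DataSet already filled with hypothetical Customers and Orders tables: a relation, an in-memory Select, and an expression column, none of which you'd get for free from a hand-rolled collection.

```csharp
// Assumes customersAdapter and ordersAdapter were set up elsewhere
// against hypothetical Customers and Orders tables.
DataSet ds = new DataSet();
customersAdapter.Fill(ds, "Customers");
ordersAdapter.Fill(ds, "Orders");

// DataRelation: navigate parent/child rows without another query.
ds.Relations.Add("CustOrders",
    ds.Tables["Customers"].Columns["CustomerID"],
    ds.Tables["Orders"].Columns["CustomerID"]);

// DataTable.Select: filter and sort in memory, no database round trip.
DataRow[] bigOrders = ds.Tables["Orders"].Select("Freight > 100", "OrderDate DESC");

// Expression column: computed on the fly for every row.
ds.Tables["Orders"].Columns.Add("FreightWithTax", typeof(decimal), "Freight * 1.08");
```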
The main time I opt for a reader is when I'm running through a fairly large record set and just need the values I'm processing for a task that's handled onesies-twosies. For instance, I recently had a module that needed to write a bunch of data from the database to a text file. In this case, a DataReader was ideal because as soon as I wrote the data to the text stream, I didn't need it anymore. These record sets could get really big, so just walking through it, writing it to the stream, and being done with it made a lot of sense. And for cases where it fits, the DataReader just plain old kicks a33. Other times it's just a bunch of extra code that provides little benefit.
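That write-it-and-forget-it case looks something like this (file path, query, and column names are hypothetical). Only one row is ever held in memory, no matter how big the result set gets.

```csharp
using System.Data.SqlClient;
using System.IO;

// Stream each row straight to a text file; hypothetical path and query.
using (SqlConnection conn = new SqlConnection(connString))
using (StreamWriter writer = new StreamWriter(@"C:\export\orders.txt"))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand("SELECT OrderID, OrderDate FROM Orders", conn);
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            writer.WriteLine("{0}\t{1}", reader["OrderID"], reader["OrderDate"]);
            // The row has been written; we never need it again.
        }
    }
}
```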
All in all, I have to agree with Bill on this one [what else is new]: performance isn't everything, and maintenance and ease of development are every bit as important as performance in most cases.
So which one is 'better'? I'll have to give you the classic MBA response: “It depends.” [As an aside, back when I was in grad school, that's the #1 thing most profs emphasized: when asked a question by a client, the answer is always “It depends.” It makes for a fine answer in the coding world as well.]