A pretty common task I run across is counting the number of occurrences of a specific string, Primary Key ID, etc. A few examples: checking a valid username/password combination, or checking for an existing value in the database to prevent duplicate/redundant data. Typically, if no joins were involved, I would just do something like the following:
public bool DoesExist(string someValue) {
     using (SomeEntity eFactory = new SomeEntity()) {
          return eFactory.SomeTable.Where(a => a.Value == someValue).Count() > 0;
     }
}
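As a side note, Any() can express the same check more directly than Count() > 0; it stops at the first match, and Entity Framework generally translates it to an EXISTS query rather than a full COUNT. A minimal sketch of that variant, using the same hypothetical SomeEntity context:

public bool DoesExist(string someValue) {
     using (SomeEntity eFactory = new SomeEntity()) {
          // Any() short-circuits on the first matching row instead of
          // counting every match, which typically maps to EXISTS in SQL
          return eFactory.SomeTable.Any(a => a.Value == someValue);
     }
}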
Or use the parallel PLINQ version if there were a considerable number of rows, assuming the overhead involved in PLINQ would negate any performance advantage for smaller tables:
public bool DoesExist(string someValue) {
     using (SomeEntity eFactory = new SomeEntity()) {
          return eFactory.SomeTable.AsParallel().Where(a => a.Value == someValue).Count() > 0;
     }
}
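One caveat worth calling out with the PLINQ version: AsParallel() operates on IEnumerable&lt;T&gt; rather than IQueryable&lt;T&gt;, so everything after it runs client-side, meaning the rows are pulled from SQL Server before the parallel filtering happens. A sketch that makes this behavior explicit, again using the hypothetical SomeEntity context:

public bool DoesExist(string someValue) {
     using (SomeEntity eFactory = new SomeEntity()) {
          // AsEnumerable() forces LINQ-to-Objects, which is effectively
          // what AsParallel() does implicitly -- the rows are fetched
          // from the database and the filtering happens in memory
          return eFactory.SomeTable
                         .AsEnumerable()
                         .AsParallel()
                         .Where(a => a.Value == someValue)
                         .Count() > 0;
     }
}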
However, if there were multiple tables involved, I would create a Stored Procedure and return the count in a Complex Type like so:
public bool DoesExist(string someValue) {
     using (SomeEntity eFactory = new SomeEntity()) {
          return eFactory.SomeTableSP(someValue).FirstOrDefault().Value > 0;
     }
}
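One thing to watch in the snippet above: FirstOrDefault() returns null when the Stored Procedure returns no rows, and the count column on the generated Complex Type is typically nullable. A slightly more defensive sketch, assuming the same hypothetical SomeTableSP function import:

public bool DoesExist(string someValue) {
     using (SomeEntity eFactory = new SomeEntity()) {
          // Guard against the Stored Procedure returning no rows and
          // against a nullable count column on the Complex Type
          var result = eFactory.SomeTableSP(someValue).FirstOrDefault();
          return result != null && result.Value > 0;
     }
}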
Intrigued by what the real performance impact was across the board, and to figure out what made sense depending on the situation, I created a common scenario: a Users table like so:

[caption id="attachment_1377" align="aligncenter" width="267"] Users SQL Server Table Schema[/caption]

I populated this table with random data, from 100 to 4000 rows, and ran the above coding scenarios against it, averaging 3 separate runs to rule out any fluke scores. In addition, I tested searching for the same value 3 times and a random value 3 times to see if the row's position would affect performance (i.e. whether it was near the end of the table or closer to the beginning). I should note this was tested on my HP DV7 laptop with an A10-4600M (4x2.3GHz CPU) running Windows 8 x64 with 16GB of RAM and a SanDisk Extreme 240GB SSD.

[caption id="attachment_1378" align="aligncenter" width="300"] LINQ vs PLINQ vs Stored Procedure Count Performance Graph[/caption]

The most interesting aspect for me was the consistent performance of the Stored Procedure across the board, no matter how many rows there were. I imagine the results would be the same for 10,000, 20,000, etc. rows; I'll have to run those tests later. In addition, I imagine that as soon as table joins come into the picture, the difference between a Stored Procedure and a LINQ query would be even greater.

So bottom line: use a Stored Procedure for counts. The extra time to create a Stored Procedure and import it into Visual Studio (especially in Visual Studio 2012, where it automatically creates the Complex Type for you) is well worth it.
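For anyone looking to reproduce the numbers, something along these lines would handle the 3-run averaging described above (the harness shown is illustrative, not the exact code used; the doesExist delegate would be any of the three implementations from this post):

using System;
using System.Diagnostics;

class CountBenchmark {
     // Runs a DoesExist implementation several times and returns the
     // average elapsed milliseconds across those runs
     static double AverageMilliseconds(Func<string, bool> doesExist, string value, int runs = 3) {
          var stopwatch = Stopwatch.StartNew();
          for (int i = 0; i < runs; i++) {
               doesExist(value);
          }
          stopwatch.Stop();
          return stopwatch.Elapsed.TotalMilliseconds / runs;
     }
}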