
Trying out NDepend

By on Sep 2, 2016 in Development, Statistics, Tooling, Tools

What is this? I got the chance to try out NDepend but, with everything else going on, it took me quite some time until I finally got around to picking it up. NDepend is a tool, available as a command-line tool, a standalone application and a Visual Studio add-in, which lets you do static analysis and reporting on your .NET projects. It uses a LINQ-based paradigm to build rules around the code metadata it reads from the solution, making it incredibly versatile in terms of extending and customizing. In this blog post I'll take you through my initial 2-3 hour user journey in trying out the tool. Installation: After writing a few installers myself, both with the old-fashioned tools included in Visual Studio and lately with WiX, I can appreciate that the NDepend team didn't waste time trying to do this. NDepend simply comes as a zip file whose contents you drop into a folder, which is completely fine, and it's a...

Free coverage control in Visual Studio

By on Jul 14, 2014 in Development, Statistics, Testing, Tooling, Tools

Code coverage is a good tool to force you to keep maintaining and adding unit and integration tests to your solution. It also increases your confidence in making changes if you can see that the code you're changing is actually covered by tests both before and after your changes. Problem is, integrated code coverage in Visual Studio only comes with the Premium edition, and not all of us have an extra $3,500 to spend just to get some nice code statistics (admittedly that's not all you get, but it is the main feature of VS Premium I've used). So, are there any options? Why yes! I've recently started using OpenCover in conjunction with ReportGenerator. Both are really solid libraries for running tests, outputting coverage and then formatting it into a nice and clean HTML report. 1. Add the NuGet packages:

Install-Package OpenCover
Install-Package ReportGenerator

...
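The excerpt cuts off before the run and report steps, but for reference, a minimal sketch of how the two tools are typically chained looks something like the batch script below (both executables ship in the NuGet packages' tools folders). The test runner, assembly name, filter and output paths are my own placeholders, not the post's exact commands, so adjust them to your solution.

:: Sketch only - placeholder paths, filter and test runner, not the post's exact steps.
:: 1. Run the test assembly under OpenCover to produce an XML coverage file.
OpenCover.Console.exe -register:user ^
  -target:"nunit-console.exe" ^
  -targetargs:"MyProject.Tests\bin\Debug\MyProject.Tests.dll /noshadow" ^
  -filter:"+[MyProject*]*" ^
  -output:coverage.xml

:: 2. Turn the XML coverage file into a browsable HTML report.
ReportGenerator.exe -reports:coverage.xml -targetdir:CoverageReport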

SQL query statistics

By on Sep 18, 2013 in Database, Debugging, Performance, Statistics

This is no work of mine but I wanted to highlight it since it's such a good thing to utilise when optimizing your database and queries.

T-SQL:

SELECT SUBSTRING(qt.text, (qs.statement_start_offset/2)+1,
         ((CASE qs.statement_end_offset
             WHEN -1 THEN DATALENGTH(qt.text)
             ELSE qs.statement_end_offset
           END - qs.statement_start_offset)/2)+1),
       qs.execution_count,
       qs.total_logical_reads, qs.last_logical_reads,
       qs.min_logical_reads, qs.max_logical_reads,
       qs.total_elapsed_time, qs.last_elapsed_time,
       qs.min_elapsed_time, qs.max_elapsed_time,
       qs.last_execution_time,
       qs.creation_time,
       qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
WHERE qt.encrypted = 0
ORDER BY qs.total_logical_reads DESC

...
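As a small variation (a sketch of my own, not part of the original post), the same DMVs can be sorted by cost per execution instead of total reads, which helps tell a query that is expensive per call apart from one that is merely called very often:

-- Sketch, not from the original post: average cost per execution from the same DMVs.
-- execution_count is always at least 1 for plans in sys.dm_exec_query_stats.
SELECT TOP 20
       qs.execution_count,
       qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time,
       SUBSTRING(qt.text, (qs.statement_start_offset/2)+1,
         ((CASE qs.statement_end_offset
             WHEN -1 THEN DATALENGTH(qt.text)
             ELSE qs.statement_end_offset
           END - qs.statement_start_offset)/2)+1) AS statement_text
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
WHERE qt.encrypted = 0
ORDER BY avg_logical_reads DESC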

Rounding or grouping datetime in SQL

By on Sep 9, 2013 in Database, Statistics

When retrieving data for statistical usage in SQL it's often useful to group the data into some kind of time interval. The easiest and most common is probably simple grouping by date:

T-SQL:

SELECT cast(Timestamp as date), COUNT(*) FROM MyTable GROUP BY cast(Timestamp as date)

However, sometimes you will want to go even deeper in detail and group things based on hours or maybe even minutes. This can be achieved by rounding the timestamp. What we do is simply count the hours/minutes from zero-time and add that count back to a zero-time date. Beginning-Of-Time + (count of hours since Beginning-Of-Time) = your datetime rounded to hours:

T-SQL:

SELECT dateadd(HOUR, datediff(HOUR, 0, Timestamp), 0), COUNT(*) FROM MyTable GROUP BY dateadd(HOUR, datediff(HOUR, 0, Timestamp), 0)

...
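The same trick generalizes to arbitrary bucket sizes by flooring the interval count before adding it back. As a sketch of my own (not from the post), here are 15-minute buckets over the same MyTable:

-- Sketch: 15-minute buckets. datediff counts whole minutes since the zero date (1900-01-01),
-- integer division by 15 floors to the start of the bucket, dateadd converts back to a datetime.
SELECT dateadd(MINUTE, (datediff(MINUTE, 0, Timestamp) / 15) * 15, 0) AS bucket_start,
       COUNT(*) AS rows_in_bucket
FROM MyTable
GROUP BY dateadd(MINUTE, (datediff(MINUTE, 0, Timestamp) / 15) * 15, 0)
ORDER BY bucket_start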

Want to group by date? Cast to date!

By on May 30, 2013 in Database, Development, Statistics

When extracting timed statistics from SQL you usually want these stats grouped by some kind of time unit, and more often than not that unit is days. I used to do this unnecessarily complicated by converting the dates to varchars in formats I didn't even want, and maybe even using datepart(day, datetime) if the data stayed within a month. Then the good man Sören Helenelund told me about casting to date. Neat, and gives me exactly what I want:

T-SQL:

SELECT cast(getdate() as date)

It will output today's date in my localized format, 2013-05-30, and can be used to group things or whatever. Wonderful and easy to remember. Thanks...
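To show the detour and the shortcut side by side (a sketch of my own; the Orders table and OrderDate column are placeholder names, not from the post):

-- Sketch with placeholder names: the varchar detour versus the plain cast.
SELECT convert(varchar(10), getdate(), 120)   -- '2013-05-30' as a string
SELECT cast(getdate() as date)                -- 2013-05-30 as a real date

-- Used for grouping, e.g. daily counts:
SELECT cast(OrderDate as date) AS OrderDay, COUNT(*) AS OrderCount
FROM Orders
GROUP BY cast(OrderDate as date)
ORDER BY OrderDay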

Table size

By on Dec 7, 2012 in Database, Performance, Statistics

When using large databases it might be interesting to keep an eye on which of your tables actually consume all that disk. There's a lot of resources on this out there but I thought I'd repeat it anyway since it's such a neat little thing:

T-SQL:

EXEC sys.sp_spaceused 'thetable'

It will show you a neat grid view of your database table size. And if you like, you can use this to retrieve it for all your tables in one grid:

T-SQL:

-- Declare a temp in-memory-table
DECLARE @sizeTable as TABLE (name varchar(50), numrows int, reserved varchar(50), data varchar(50), index_size varchar(50), unused varchar(50))
-- Select out the results of sp_spaceused for each table
INSERT INTO @sizeTable (name, numrows, reserved, data, index_size, unused)
EXEC sys.sp_MSforeachtable 'EXEC sp_spaceused ''?'' '
-- Select from the temp-table
SELECT...
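The excerpt cuts off at the final SELECT. Assuming the script above has run in the same batch, one way to finish it (my own sketch, not necessarily the post's elided query) is to read the collected rows back sorted by reserved space, stripping the ' KB' suffix that sp_spaceused puts in the varchar columns:

-- Sketch: largest reservation first; assumes @sizeTable was filled in this batch
-- and that 'reserved' comes back in the form '1234 KB'.
SELECT name, numrows, reserved, data, index_size, unused
FROM @sizeTable
ORDER BY cast(replace(reserved, ' KB', '') as int) DESC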