# Thursday, February 26, 2009

As I blogged before, relog is quite useful. The syntax examples on the TechNet page, however, are pretty much useless if you want to go to SQL Server. So let's look at a few scenarios and the syntax to make them work; but before that, let's get the DSN and the counter data file.

Configure the Data Source Name to the SQL Server as a System DSN, based on the SQL Server driver (SQLSRV32.DLL)... the Native Client driver does NOT work. The name of the DSN in the syntax samples will be PerfDB.

Next is the file with performance data. Relog will detect the format from its internal structure (if it is a valid counter log file), so you do not have to specify whether your file is comma separated (.csv), tab separated (.tsv) or binary (.blg). Since binary is the most practical format for large amounts of data, the file for the syntax examples will be c:\my perflogs\p_log01.blg (and consecutive numbers for any next file).

One final comment before going to the scenarios: relog creates three tables in the SQL Server database targeted by the DSN (if they do not already exist); a query sketch showing how they relate follows the list. These tables are:

  • dbo.CounterData (holds the actual values of the counter)
  • dbo.CounterDetails (holds the machine, object, counter and, if applicable, the instance)
  • dbo.DisplayToID (holds information on the perfmon data that was loaded)
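
To get a feel for how these tables fit together, here is a minimal query sketch. I'm assuming the CounterID relation and column names as relog creates them; verify against your own tables before relying on it.

SELECT dtl.MachineName, dtl.ObjectName, dtl.CounterName, dtl.InstanceName,
       dat.CounterDateTime, dat.CounterValue
FROM dbo.CounterData AS dat -- the measured values
JOIN dbo.CounterDetails AS dtl -- the counter descriptions
  ON dtl.CounterID = dat.CounterID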

Scenario 1: Load all counters into the database

relog "c:\my perflogs\p_log01.blg" -f SQL -o SQL:PerfDB!1stRun

All clear, except for 1stRun (and the exclamation mark preceding it). The reason is that it is possible to write multiple logs to the same database. Each time log information is written to the database, a new DisplayToID record is created. The name after the exclamation mark is the DisplayString. If the same DisplayString is used again, the data is added under the same GUID.
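
If you want to check which runs are already in the database, you can query the DisplayToID table directly. A small sketch (column names as relog creates them; verify on your own database):

SELECT DisplayString, GUID, LogStartTime, LogStopTime, NumberOfRecords
FROM dbo.DisplayToID -- one record per DisplayString/GUID, i.e. per run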

Scenario 2: Load all counters between begin and end datetime into the database

relog "c:\my perflogs\p_log02.blg" -f SQL -o SQL:PerfDB!2ndRun -b 26-02-2009 10:00:00 -e 26-02-2009 10:30:00

TechNet says the format for relog datetimes should be M/d/yyyy hh:mm:ss, which is the internal format of the log files (M/d/yyyy hh:mm:ss.sss) minus the milliseconds. In reality, relog looks at the regional settings, including any customizations you made to them! The string in the sample is valid for the regional settings Dutch (Netherlands), so dd-mm-yyyy HH:mm:ss. The best way to find out which format relog expects is to run relog /?.

Together with the previous issue of the char(24) storage of a binary-converted datetime string, this regional-settings dependency makes for horrible handling of datetimes. For globalization support it would be great if relog were given an extra switch to indicate that datetime strings are in ISO 8601 or ODBC format, independent of the regional settings.

Scenario 3: Load a limited set of counters into the database

relog "c:\my perflogs\p_log03.blg" -q -o c:\counters.txt

Edit the c:\counters.txt file to only include the counters to be written to the database.

relog "c:\my perflogs\p_log03.blg" -cf c:\counters.txt -f SQL -o SQL:PerfDB!"select set of counters from the 3rd run"

It is possible to combine scenarios 2 and 3 to load a limited set of counters between two datetimes; see the sketch below. Also, if you want spaces in the DisplayString, you can use double quotes as shown in this example.
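
A combined call could look like this (an untested sketch; the -b and -e strings again assume the Dutch (Netherlands) regional settings from scenario 2):

relog "c:\my perflogs\p_log03.blg" -cf c:\counters.txt -f SQL -o SQL:PerfDB!"combined run" -b 26-02-2009 10:00:00 -e 26-02-2009 10:30:00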

Thursday, February 26, 2009 3:42:13 PM (W. Europe Standard Time, UTC+01:00)
# Saturday, January 31, 2009

A common question from students during or after attending a Microsoft training: "Where do we get the Virtual PC images we used during the course?"

The answer is, you don't... Microsoft provides these images for classroom use only, by Certified Partners for Learning Solutions and Microsoft Certified Trainers. Quite understandable, as these images contain a lot of software. However, the student's question is valid too, for practice and exam preparation. And you can get about the same experience you had in class, based on Microsoft's "Run IT on a Virtual Hard Disk" program, which allows you to download and use a fully installed evaluation version of a product. So here is how you can build your own VHD for the 2779 or 2780 courses.

  1. Your PC; I recommend you use a PC with at least 1.5 GB of RAM and Windows XP.
  2. Virtual PC; download and install Virtual PC 2007 (if you want, you can use Virtual Server 2005 R2 instead). For the download and more information see the Microsoft Virtual PC site.
  3. SQL Server VHD; download the four image files for the SQL Server 2005 Evaluation VHD and unpack the VHD. SQL Server 2005 is currently not listed on the Run IT on a Virtual Hard Disk site.
  4. SQL Server installation media; download the Evaluation Edition of SQL Server 2005 (180-day version; requires a Windows Live ID). Some labs/practices/demonstrations require multiple instances; these are installed on the 2779 and 2780 images, but not on the Evaluation VHD. So you may need to install the SQLINSTANCE2 and SQLINSTANCE3 instances yourself.
  5. SQL Service Pack; the VHD for SQL Server Evaluation has no Service Pack for SQL Server applied, whereas the MOC courses 2779 and 2780 are based on SQL Server 2005 SP1. Links to the SQL Server 2005 Service Packs:
    1. SQL Server 2005 Service Pack 1
    2. SQL Server 2005 Service Pack 2
    3. SQL Server 2005 Service Pack 3
  6. Create a virtual machine in Virtual PC based on the downloaded VHD. Set the amount of memory to at least 1024 MB and enable undo disks.
  7. Start and log in to the Virtual PC guest; you will need the administrator password Evaluation1. You will notice that the Windows Server 2003 operating system is not activated, therefore you only have a limited evaluation period.
  8. From the student CD that came with your courseware, run Allfiles.exe. This will extract all files required by the practices, labs and demonstrations. Note that the setup will be different from what you were used to during the course. The files in the course were on a separate VHD for each module, which was mounted as D:\. After unpacking Allfiles.exe, the whole course is in one folder tree (typically C:\Program Files\Microsoft Learning\27xx\). Note that you may have to compensate for paths and server names; so when you are in 2780 module 4, a path D:\democode\SalesCert.cer should be changed to C:\Program Files\Microsoft Learning\2780\Mod04\SalesCert.cer. Likewise, the name of the server is different, so MIAMI should be changed to WIN2K3R2EE.

Tips about downloading and file exchange between the Virtual PC host, the guest and the Internet:

  1. When you have an ISO file on your host, you can mount it as a CD/DVD in the guest. You can also instruct the Virtual PC guest to use the CD/DVD drive of the host.
  2. When you have normal files on your host, you can use the Virtual PC Shared Folders feature; this exposes a folder on the host as a network drive on the guest.
  3. You can use the Networking feature of Virtual PC with Shared networking (NAT) or your host's network adapter to allow access to the network and the Internet, so you can download files directly into your Virtual PC guest.

Main differences between the MOC and Eval VHDs:

| | MOC | Eval |
| --- | --- | --- |
| Server name | MIAMI | Win2k3R2EE |
| SQL Server Edition | Developer, SP1 | Enterprise, no SP |
| Instances | [default], SQLINSTANCE2, SQLINSTANCE3 | [default] |
| SQL Service Account | MIAMI\SQLServer | [LocalSystem] |
| Password | Pa$$w0rd | Evaluation1 |
| Course files | One VHD per module | All files in one folder tree; paths have to be checked/changed |
Saturday, January 31, 2009 6:53:43 PM (W. Europe Standard Time, UTC+01:00)
# Monday, November 24, 2008

For those of you who are not familiar with the tool Relog: it is part of Windows and allows you to reprocess System Monitor logs. This is quite useful, as logging counters to the binary format (.blg) is very space efficient. However, to analyze those counters, you may want to use SQL Server. Relog allows you to rewrite the log to SQL Server and even creates the tables for you, if they are not yet present (syntax samples). The unfortunate part is the CounterDateTime column of the CounterData table: despite the column name, the data type is char(24). Trying to convert the char(24) to a datetime throws this error:

Msg 241, Level 16, State 1, Line 2
Conversion failed when converting datetime from character string.

I was getting a bit annoyed by this error, as a conversion from a char to a datetime should succeed if the string is valid... and, say, 2008-09-30 12:10:15.322 looked valid. I even checked Tibor's ultimate datetime guide to see if I had missed something, because I was unsure of the impact of the last byte (a valid string yyyy-mm-dd HH:mi:ss.sss is actually 23 characters). Also, when converting a valid datetime string held in a variable, the conversion succeeds. Next I turned my attention to how the data was stored in the CounterDateTime column, by inserting a valid datetime string and retrieving one that was inserted by Relog (both returning the string (char(24)) and its binary representation (varbinary(24))).

| Source | Character representation | Binary representation |
| --- | --- | --- |
| Relog insert | 2008-09-30 12:10:15.322 | 0x323030382D30392D33302031323A31303A31352E33323200 |
| Manual insert | 2008-09-30 12:10:15.322 | 0x323030382D30392D33302031323A31303A31352E33323220 |

The difference is in the last byte of the binary representation: the datetime string written by relog isn't padded with spaces, as one would expect for the unused positions in a char. Instead it is zeroed, as one would expect for unused bytes in a binary string. To get a datetime representation of the CounterDateTime, a double conversion is needed:

CAST(SUBSTRING([CounterDateTime],1,23) AS datetime)

Now for the "nasty" in Relog's database schema:

  • Changing the CounterDateTime to datetime is a no-go. Relog does not want it.
  • Adding a computed column with the conversions above is a no-go. Relog does not have all columns bound.

What remains is creating a view on top of the CounterData table with this conversion included.
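
Such a view could look like the sketch below; vCounterData is just a name I made up, and I only carry a few columns here, so check your CounterData table for the full column list.

CREATE VIEW dbo.vCounterData
AS
SELECT GUID, -- identifies the run (joins to dbo.DisplayToID)
       CounterID, -- joins to dbo.CounterDetails
       RecordIndex,
       CAST(SUBSTRING(CounterDateTime, 1, 23) AS datetime) AS CounterDateTime,
       CounterValue
FROM dbo.CounterData
GO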

Another issue that may save you some time: the DSN you create to access SQL Server should use the "SQL Server" driver, not the "SQL Native Client".

Monday, November 24, 2008 8:48:37 PM (W. Europe Standard Time, UTC+01:00)
# Wednesday, October 8, 2008

A while back, I wrote about using sp_ procedures in master to create a toolbox, which works fine for me... until recently, when I hit a strange problem. I developed a procedure that would read the definition of a table and would create a change_log table and a trigger to populate that table. On my dev environment (Windows Vista x64, SQL Server 2005 Dev Ed x64 SP2) this worked like I expected. But when I tested the procedure on a different server, it didn't!?! Here is a small sample that captures the essence of the problem I witnessed:

use master
go
if object_id('sp_test','p') is not null
    drop procedure sp_test
go
create procedure sp_test
as
select db_name() + '.' + table_schema + '.' + table_name
from information_schema.tables
order by table_schema, table_name
go
exec sp_test
go
use adventureworks
go
exec sp_test
go
use master
go
drop procedure sp_test
go

Now on my dev machine, this once listed all tables in master and once all tables in AdventureWorks, as intended. But on other servers, it returned the tables from master on both executions of sp_test (the db_name function executed correctly in both cases: once master, once AdventureWorks). For some reason, when resolving object references, the stored procedure stuck to master. I was puzzled, but before crying BUG (out loud) I tried the newsgroups, and a prompt reply from SQL Server MVP Roy Harvey pointed me to the undocumented stored procedure sp_MS_marksystemobject. Simply execute:

exec sp_MS_marksystemobject sp_test

and you won't suffer from the inconsistent behavior I witnessed.
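
To verify the mark took effect, you can check the IsMSShipped property of the procedure; a quick sketch:

SELECT OBJECTPROPERTY(OBJECT_ID('sp_test'), 'IsMSShipped') AS is_system_object
-- returns 1 once sp_MS_marksystemobject has been run, 0 before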

Wednesday, October 8, 2008 5:51:28 PM (W. Europe Daylight Time, UTC+02:00)

Just checked the Prometric site and the status for my 70-432 (71-432) and 70-448 (71-448) changed from tested to passed ;-).

Wednesday, October 8, 2008 10:55:23 AM (W. Europe Daylight Time, UTC+02:00)
# Friday, August 8, 2008

Yesterday evening I got into a fight with the SSIS XML Source data flow source. It actually was the first time I used this data flow source. I had no expectations other than: point to your XML file and get the data. Wrong... I did point to an XML file, generated a schema and... no columns! I ended up with the error:

Validation error. Data Flow Task: XML Source [000]: A component does not have any inputs or outputs.

In cases like this, Google should be your friend... well, I read a lot about SSIS in general, some of it even about the XML Source, but nothing that provided me with answers or even helped me along. It was after reading a post by Oren Eini that I decided I was on the wrong troubleshooting track and a good night's sleep would be the best next step.

This morning I started with a clear vision and an open mind. No answers through Google, nothing useful on Connect, so I tried to reproduce my problem with another document. The document I created had a very simple structure:

<people>
  <person>
    <firstname>Jan</firstname>
    <lastname>Aerts</lastname>
  </person>
  <person>
    <firstname>Anne</firstname>
    <lastname>Mulders</lastname>
  </person>
</people>

It worked! But now I had to find out why the document above worked and the other one didn't. Again I read about SSIS in general and a little about the XML Source. In particular (SSIS in general), I stumbled upon a post by Jamie Thomson that sounded familiar, and one about the XML Source I wish I had come across earlier: Using XML Source by Matt Masson. I could already agree with his opening comment, especially the various degrees of success. While reading Matt's article I had this feeling my XML document might actually be too simple... it occurred to me that the XML Source was not just going to read XML, it was trying to represent the XML as one or more tables.

A very simple representation of my original document is:

  <person id="1">
    <firstname>Jan</firstname>
    <lastname>Aerts</lastname>
  </person>

which, in generic terms, is a single row rather than a table:

<row column1="value">
  <column2>value</column2>
  <column3>value</column3>
</row>

The simplest representations Matt used are:

<rootgoo>
  <goo>
    <subgoo>value</subgoo>
    <moregoo>1</moregoo>
  </goo>
  <goo>
    <subgoo>value</subgoo>
    <moregoo>2</moregoo>
  </goo>
</rootgoo>
<table>
  <row>
    <column1>value</column1>
    <column2>value</column2>
  </row>
  <row>
    <column1>value</column1>
    <column2>value</column2>
  </row>
</table>

AND

<root>
  <row CustomerID="1" TerritoryID="1" AccountNumber="AW00000001" />
  <row CustomerID="2" TerritoryID="1" AccountNumber="AW00000002" />
</root>
<table>
  <row column1="value" column2="value" column3="value" />
  <row column1="value" column2="value" column3="value" />
</table>

So my document could never be translated to a table... To get back to Oren's post: if only SSIS had told me so with a clear error, or even a dialog in the XML Source, that would have saved me a couple of hours!

Or better, since the XML Source tries to get data from the XML, it could make a best effort at wrapping something that looks like a single row into a table (and, taking it one step simpler, represent a single value as a table with just one row and one column). If you'd like to see some improvement here too, take a moment to vote on FeedbackID 361057.

On a version note: this happens with SQL Server 2005 (SP2) and SQL Server 2008 (pre-release).

Friday, August 8, 2008 12:40:45 PM (W. Europe Daylight Time, UTC+02:00)
# Wednesday, August 6, 2008

REDMOND, Wash. — Aug. 6, 2008 — Microsoft Corp. today announced the release to manufacturing of Microsoft SQL Server 2008

Wednesday, August 6, 2008 8:46:46 PM (W. Europe Daylight Time, UTC+02:00)
# Saturday, August 2, 2008

As most will know, sp_ does not stand for stored procedure; it stands for system stored procedure. But calling your procedure sp_something doesn't automatically make it a system procedure, it just hints to the server how to resolve the procedure.

When a procedure that starts with sp_ is called, the master database is first checked to see whether it is a real system stored procedure. Books Online shows this behavior by creating a procedure in AdventureWorks called dbo.sp_who. However, since sp_who is a real system stored procedure, the existence of AdventureWorks.dbo.sp_who is always ignored. If the procedure is not a real system stored procedure, the database you're connected to is checked for the existence of the stored procedure; if it is there, it gets executed. If it isn't, it is retrieved from master (or you receive an error if it isn't there either). You can verify this behavior with the following code.

USE AdventureWorks
GO
CREATE PROCEDURE sp_sayhello
AS
SELECT 'Hello from AdventureWorks, you are connected to ' + DB_NAME() + '.'
GO
USE master
GO
CREATE PROCEDURE sp_sayhello
AS
SELECT 'Hello from master, you are connected to ' + DB_NAME() + '.'
GO

Now, when executing sp_sayhello while connected to AdventureWorks, it will return:

Hello from AdventureWorks, you are connected to AdventureWorks.

With any other database, say msdb, you get the following result.

Hello from master, you are connected to msdb.

So there are two reasons why starting your stored procedure name with sp_ isn't smart:

  • Performance; each time the procedure is called, a (futile) lookup is done against the master database.
  • Future; if you have a stored procedure in your database called sp_dosomething and Microsoft implements a system stored procedure sp_dosomething in SQL Server, your application is broken.

There is, however, one scenario where creating stored procedures with sp_ is smart: when you create them in master as part of your own standardized way of working. Creating your own toolbox, so to speak. With SQL Server 2005 and 2008 there is an automatic separation: your sp_ procedures are created in the dbo schema by default, while the real system stored procedures reside in the sys schema (the actual system stored procedures are in the mssqlsystemresource database).
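
You can see that separation for yourself with a small sketch against the sys.procedures catalog view. Note that it only lists procedures created in master itself; the real system stored procedures don't show up here, as they reside in mssqlsystemresource (they are visible through sys.all_objects).

USE master
GO
SELECT SCHEMA_NAME(schema_id) AS [schema], name
FROM sys.procedures
WHERE name LIKE 'sp[_]%' -- brackets escape the underscore wildcard
ORDER BY [schema], name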

Your own sp_ procedures and schemas: DON'T!!! It does not work if the schema in master isn't dbo.

USE master
GO
CREATE SCHEMA toolbox
GO
CREATE PROCEDURE toolbox.sp_sayhello
AS
SELECT 'Hello from master. You are connected to ' + DB_NAME() + '.'
GO
USE AdventureWorks
GO
EXEC sp_sayhello -- Doesn't work
EXEC toolbox.sp_sayhello -- Doesn't work
EXEC master.toolbox.sp_sayhello -- Executes against master, not AdventureWorks.
GO
USE master
GO
DROP PROCEDURE toolbox.sp_sayhello
GO
DROP SCHEMA toolbox
GO

Your own sp_ procedures and non-privileged users: make sure the login has permissions to execute the procedure in master and that any needed permissions are held in the target database. To illustrate: a login, mapped to a user in AdventureWorks, will execute a stored procedure named sp_maketable. To make this work, public (and therefore any login, through guest, which is appropriate for master) receives execute permissions on the procedure, and create table and alter schema permissions are granted to the user in AdventureWorks. The table is created in the default schema of the user.

USE master
GO
CREATE PROCEDURE sp_maketable
AS
CREATE TABLE tblTest (col1 int)
GO
GRANT EXECUTE ON dbo.sp_maketable TO public -- Make sure permissions allow the user to execute.
GO
CREATE LOGIN np_user WITH PASSWORD = 'secret', DEFAULT_DATABASE = AdventureWorks
GO
USE AdventureWorks
GO
CREATE USER np_user FOR LOGIN np_user WITH DEFAULT_SCHEMA = Sales
GO
GRANT CREATE TABLE TO np_user -- Make sure the user has proper permissions in the database.
GO
GRANT ALTER ON SCHEMA::Sales TO np_user -- Make sure the user has proper permissions in the schema.
GO
EXECUTE AS LOGIN = 'np_user'
GO
SELECT SUSER_SNAME(), USER_NAME() -- Verify it is executing as the user.
GO
EXEC sp_maketable
GO
REVERT

Important stuff when writing your own sp_'s:

  • BACKUP DATABASE master just became even more important.
  • Double check on which of your own procedures you grant execute permissions.
  • Use a proper naming convention, like including your company name, to avoid naming collision with future Microsoft system stored procedures.
  • If a procedure exists with the same name in one of your databases and you are connected to that database, the local procedure gets executed, not the central one from master.
  • Document.
  • Mark your sp_ as a system object with sp_MS_marksystemobject.
Saturday, August 2, 2008 8:37:45 PM (W. Europe Daylight Time, UTC+02:00)
# Tuesday, June 17, 2008

After running through the prep guide (looking through a pair of SQL Server 2005 glasses), I identified a couple of topics worth a closer look. The topics are derived from the prep guide, my comments are added after each topic, and the bulleted lists refer to (mostly) BOL resources. This post is based on the prep guide for 70-432 with published date June 11, 2008.

Installing and Configuring SQL Server 2008 (10 percent)

Configure additional SQL Server components.
This objective may include but is not limited to: SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), replication. Not that I expect this to be really different from SQL Server 2005, but if your background is just DBA (MCTS/MCITP), it may be your first encounter with the BI components.

Maintaining SQL Server Instances (13 percent)

Implement the declarative management framework (DMF).
This objective may include but is not limited to: create a policy; verify a policy; schedule a policy compliance check; enforce a policy; create a condition.

Back up a SQL Server environment.
This objective may include but is not limited to: operating system-level concepts. I don't expect a lot of fireworks, but the operating system-level concepts made me curious.

  • Planning for Disaster Recovery. Actually, I'm still curious what is meant by operating system-level concepts; this link from BOL is my best shot at a document where some broader considerations are presented.

Managing SQL Server Security (15 percent)

Manage transparent data encryption.
This objective may include but is not limited to: impact of transparent data encryption on backups.

Maintaining a SQL Server Database (16 percent)

Back up databases.
This objective may include but is not limited to: full backups; differential backups; transaction log; compressed backups; file and filegroup backups; verifying backup. Only compressed backups qualify as new.

Performing Data Management Tasks (14 percent)

Implement data compression.
This objective may include but is not limited to: sparse columns; page/row.

Maintain indexes.
This objective may include but is not limited to: create spatial indexes; create partitioned indexes; clustered and non-clustered indexes; XML indexes; disable and enable indexes; filtered index on sparse columns; indexes with included columns; rebuilding/reorganizing indexes; online/offline. Spatial indexes and filtered indexes on sparse columns are of interest here (a sketch of the latter follows below), along with "is not limited to", which could mean indexes on hierarchyid columns.
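
Since the combination was new to me, a quick sketch of a filtered index on a sparse column (SQL Server 2008 syntax; table and column names are made up):

CREATE TABLE dbo.DemoSparse
(
    id int NOT NULL,
    optional_value int SPARSE NULL -- sparse: optimized storage for NULLs
)
GO
CREATE INDEX ix_demosparse_optional
ON dbo.DemoSparse (optional_value)
WHERE optional_value IS NOT NULL -- the filter excludes the (many) NULLs
GO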

Optimizing SQL Server Performance (10 percent)

Implement Resource Governor.

Use Performance Studio.

  • Data Collection Entry page, includes How-To
  • Again, Performance Studio is also an MS name game; what you're really looking for is Data Collection... and while trying to get that confirmed, I found this webcast by Bill Ramos (62 minutes).

The rest, well, it is all too familiar from SQL Server 2005. Sure, I'll look for some "What's new" resources, but I think the above pretty much covers what I need to familiarize myself with.

Tuesday, June 17, 2008 6:13:20 PM (W. Europe Daylight Time, UTC+02:00)
# Tuesday, June 10, 2008

Release Candidate 0 is available for download (and downloading), and the MCTS exam 70-432 went into beta testing (and I registered). Since the beta only runs from June 9th through June 30th, I had to go for the 27th, as it was the only gap in my schedule. Let's see if I can find the time to blog about my preparations...

Tuesday, June 10, 2008 8:02:47 PM (W. Europe Daylight Time, UTC+02:00)