Wednesday, 12 July 2017

Solution to SQL Agent Job "Error retriving WMI variable WMI(TextData)"

I've been trying to implement SQL Agent Alerts on disk growth for a day now.

I've followed a few different guides on WMI alerting, but every time the alert fires, the SQL job fails with the below error, and experimenting with the different ESCAPE methods for the WMI tokens was fruitless.

Unable to start execution of step 1 (reason: Error retriving WMI variable WMI(variable name here): 0x80041002).  The step failed.

SQL Junkie Share's blog had the answer. I needed to go to my SQL Agent properties, and under Alert System tick "Replace tokens for all job responses to alerts". Thanks Akhil!
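For context, here's a minimal sketch of the kind of alert I was setting up: a WMI alert on data file auto-growth that triggers a job, with the job step reading the event detail through a token. The alert name, job name, and event are illustrative, and the namespace assumes a default instance; the WMI(TextData) token only resolves once the option above is ticked.

-- illustrative: a WMI alert on data file auto-growth events that runs a job
exec msdb.dbo.sp_add_alert
 @name = N'Data file auto-grow alert'
 , @wmi_namespace = N'\\.\root\Microsoft\SqlServer\ServerEvents\MSSQLSERVER'
 , @wmi_query = N'select * from DATA_FILE_AUTO_GROW'
 , @job_name = N'Respond to auto-grow'

-- inside a T-SQL job step, the token must be escaped, for example:
-- print 'Auto-grow event: $(ESC_SQUOTE(WMI(TextData)))'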

Tuesday, 11 April 2017

T-SQL Tuesday #89 - The times they are a-changing: Don't fear automation; fear becoming redundant


“I suppose we really ought to be using Powershell for these checks,” one of our infrastructure technicians commented to another after patching a server a short time ago - they still manually remote in and confirm the servers are operating as expected. Yes! I thought, that’d be a great shift in thinking. Across the room, a senior co-worker commented to another, in a decided but contemptuous tone, “You know, studies have shown that automation leads to lower wages”. That comment took me by surprise.

I’ve always considered that doing my best for my employer is a key factor in being a good employee. Automation is an important tool for doing more, more reliably, with less: it lets you configure routine tasks to run automatically, without needing your assistance. It also means making yourself redundant from certain tasks, freeing you to accomplish others; if you’d rather not do that due to flimsy fears around the future of job demand in your industry, well, that’s not a character trait I admire.

Will automation have an effect on our jobs? Yes. Environments change, and requirements change. Should we fear that? No, I don’t think so. Not if we are able to be fluid. We need to be able to hold our current job description loosely, while still being dedicated to our work right now. It’s likely that sometime in the future, there won’t be a job that does what you currently do. You need to be able to adapt when the environment changes.

This past weekend I attended an SQL Saturday, where I sat through a day-long High Availability / Disaster Recovery pre-conference session. During the session I sat beside a man who is an expert in the database side of BI, and who was attending this session as a beginner on the subject of HA/DR. Next door, a man who was known, just a few years ago, as an expert in SQL Server Integration Services had re-branded, and was presenting as one of the world’s most well-known Power BI presenters. Both of those men are experts I look up to, and both of those men have been intentional about continuing to learn.

In the future there will likely be fewer Database Administrators, but those who are experts in their fields will still be in demand. Those who know how to support a range of technologies at a deep level will still be required, perhaps even more so than now. Those, however, who are content with one year’s worth of experience, repeated n times over, might not be.

An attribute I appreciate in the PASS community is the commitment members show to learning. We PASS members discipline ourselves to improve at what we do, by teaching others and being taught by others. We aren’t content with mediocrity in our jobs, but make the time to study, gaining a deeper level of understanding in our own fields, or humbling ourselves enough to be seen as beginners in others.

Am I worried about automation? A little; there’s a lot I feel I need to keep up with. Will automation reduce the demand for roles in the database administration field? Perhaps. Will my co-worker be out of a job? Not if he continues to learn and adapt to the changing environment.

What about you? As Buck Woody quipped, do you have 10 years’ experience, or one year of experience 10 times over?

This post is a contribution to T-SQL Tuesday #89 - The times they are a-changing. See more on T-SQL Tuesday here.

Monday, 20 February 2017

Adding an identity column or column with a default constraint to a table that already has data

Today I had a coworker ask me to review code that selected the data out of a table, truncated the table, added a new column with identity and a default constraint, then re-inserted the data into the table. His explanation was that he couldn't add an identity column or a column with a default constraint to a table that contained data.

I showed him that moving data out of the table isn't needed to add either a default constraint or an identity column.

Here's an example of how to do that:

Create a table containing existing data:

create table Customer (
Firstname nvarchar(200)
, Surname nvarchar(200)
)

insert into customer(
firstname
, surname
)
select 'Bill', 'Palmer'
union all select 'Sarah', 'Moulton'
union all select 'Wes', 'Davis'

select firstname
, surname
from customer

There's no problem with adding an identity column to an existing table, except perhaps the time it takes to run this statement - keep that in mind if you're dealing with a very large table.

alter table customer add id int identity(1,1)

select id
, firstname
, surname
from customer

That query will succeed, and return the results of a select statement that includes the new ID column, with values!

Adding a column with a default constraint to an existing table requires a little trick. For this example, let's assume we want to add a bit column to customer, indicating whether the customer is a VIP. 

If you run the usual command to add a new column with a default constraint, as below, you may be surprised.

alter table customer add isVIP bit default(0)

select id
, firstname
, surname
, isVIP
from customer


Adding the default constraint will succeed, but the returned result-set will have null values in the "isVIP" column for the existing rows. To populate existing rows with the default value, run this statement instead (if you already ran the statement above, drop the isVIP column and its default constraint first):

alter table customer add isVIP bit default(0) not null

select id
, firstname
, surname
, isVIP
from customer

The existing rows will be populated with your default value, and the query will return that value in your result-set.

If you want this column to allow NULL values, run the below statement to set the column to nullable, keeping the recently created default constraint.

alter table customer alter column isVIP bit null
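One related tip: if you name the default constraint when you add the column, you can find and drop it later without hunting down the system-generated name. A minimal sketch (the constraint name is my own, and it assumes the column doesn't already exist):

-- name the constraint explicitly when adding the column
alter table customer add isVIP bit constraint DF_Customer_isVIP default(0) not null

-- later, the constraint can be dropped by name:
-- alter table customer drop constraint DF_Customer_isVIP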

Monday, 13 February 2017

Scripting synonym changes after migrating servers

During a recent database migration, I needed to modify a bunch of synonyms that point through a linked server, so that they reference the new server name.

Here's a stored procedure I wrote to automate changing those server names.

An example call is

exec [dbo].[RemapSynonyms] @OldDatabasePath = '[OLDSERVERNAME].[DATABASE_NAME]', @NewDatabasePath = '[NEWSERVERNAME].[DATABASE_NAME]', @ViewOnly = 1

The stored procedure definition is below.

CREATE procedure [dbo].[RemapSynonyms]
 @OldDatabasePath nvarchar(500)
 , @NewDatabasePath nvarchar(500)
 , @ViewOnly bit = 1
as
begin
 if (right(@olddatabasepath, 1) <> ']' or right(@newdatabasepath, 1) <> ']' )
 begin
  print 'Database paths should end with "]"'
 end
 else
 begin
  set nocount on

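   -- work table: one row per matching synonym, plus the generated drop/create command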
   declare @synonymreplacements table (
   oldpath nvarchar(500)
   , newpath nvarchar(500)
   , synonym_name nvarchar(500)
   , schema_name nvarchar(500)
   , sqlcommand nvarchar(1000)
   , resultorder tinyint
  )
  insert into @synonymreplacements(
   oldpath
   , newpath
   , synonym_name
   , schema_name
  )
  select synonyms.base_object_name
   , replace(
    base_object_name
    , @olddatabasepath
    , @newdatabasepath
   )
   , synonyms.name
   , schemas.name
  from sys.synonyms
   inner join sys.schemas
    on synonyms.schema_id = schemas.schema_id
  where synonyms.base_object_name like replace(replace(@olddatabasepath, '[', '|['), ']', '|]') + '%'  escape '|'

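   -- remove any cursor left behind by a previous failed run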
   if exists (select * from sys.dm_exec_cursors(@@SPID) where name = 'c_synonyms')
  begin
   close c_synonyms
   deallocate c_synonyms
  end

   declare @current_old_path nvarchar(500)
  declare @current_new_path nvarchar(500)
  declare @current_synonym_name nvarchar(500)
  declare @current_schema_name nvarchar(500)

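   -- walk each matching synonym, building (and optionally executing) a drop/create command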
   declare c_synonyms cursor local fast_forward for
  select oldpath, newpath, synonym_name, schema_name
  from @synonymreplacements as synonymreplacements

   open c_synonyms
  fetch next from c_synonyms into @current_old_path, @current_new_path, @current_synonym_name, @current_schema_name

   while(@@FETCH_STATUS=0)
  begin
   if(@current_old_path <> @current_new_path)
   begin
    declare @query nvarchar(700) = 'drop synonym [' + @current_schema_name + '].[' + @current_synonym_name + ']; create synonym [' + @current_schema_name + '].[' + @current_synonym_name + '] for ' + @current_new_path + ';'

     update synonymreplacements
    set sqlcommand = @query
    from @synonymreplacements as synonymreplacements
    where synonymreplacements.schema_name = @current_schema_name
     and synonymreplacements.synonym_name = @current_synonym_name

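     -- only execute the change when not in view-only mode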
     if(@viewOnly = 0)
    begin
     exec (@query);
    end
   end

    fetch next from c_synonyms into @current_old_path, @current_new_path, @current_synonym_name, @current_schema_name
  end

   if((select count(*) from @synonymreplacements) = 0)
  begin
   insert into @synonymreplacements(schema_name, synonym_name, resultorder, oldpath)
   select 'No relevant synonyms were found', '', 1, @OldDatabasePath
  end

   if(@viewOnly = 1)
  begin
   insert into @synonymreplacements(schema_name, synonym_name, resultorder)
   select 'Running with viewOnly = 1', '', 2
  end

   select synonymreplacements.schema_name + '.' + synonymreplacements.synonym_name as [Synonym Name]
   , synonymreplacements.oldpath as [Old Path]
   , synonymreplacements.newpath as [New Path]
   , synonymreplacements.sqlcommand as [SQL Command]
  from @synonymreplacements as synonymreplacements
  order by resultorder desc
  set nocount off

  end
end

Identifying unencrypted SQL connections

I'm working on implementing encrypted connections within my team to protect against packet sniffing; however, a few developers are behind on making the change. The below query helped me identify those developers for follow-up, and gave me the information they needed to identify which application was making the unencrypted connections.

select sessions.login_name
, sessions.program_name
, sessions.host_name
, sessions.login_time
, sessions.last_request_start_time
, sessions.status
, sessions.cpu_time
from sys.dm_exec_connections as connections
  inner join sys.dm_exec_sessions as sessions
  on connections.session_id = sessions.session_id
where connections.encrypt_option = 'FALSE'
order by sessions.login_name


Resolving "Cannot create an instance of OLE DB provider "OraOLE.DB.Oracle" after changing the SQL Server Agent account.

After an SQL Server Agent service account change, scheduled jobs that call an Oracle linked server began failing with the error

Executed as user: <username>. Cannot create an instance of OLE DB provider "OraOLEDB.Oracle" for linked server <linked server name>. [SQLSTATE 42000] (Error 7302).  The step failed.

The stored procedures called by those jobs ran as expected when I executed them from my account.

No ACCESS DENIED issues came up in Process Monitor while attempting to make the connection.

After some websurfing, this MSDN blog suggested modifying the MSDAINITIALIZE DCOM security settings to give the agent account full permissions on Launch and Activation Permissions, Access Permissions, and Configuration Permissions.

After doing that, the error changed to

Executed as user: <username>. The OLE DB provider "OraOLEDB.Oracle" for linked server <linked server name> reported an error. Access denied. [SQLSTATE 42000] (Error 7399)  Cannot get the column information from OLE DB provider "OraOLEDB.Oracle" for linked server <linked server name>. [SQLSTATE 42000] (Error 7350).  The step failed.

Progress! Searching for that error message took me to this SQL Server Central post, where a user suggested ticking "Allow inprocess" for the Oracle linked server provider.


Success! The job now completed as expected.
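If you'd rather script that provider option than tick the box in SSMS, the same setting can reportedly be changed with sp_MSset_oledb_prop, the procedure SSMS calls behind the scenes. It's undocumented, so treat this as a sketch and test it first:

-- enable "Allow inprocess" for the OraOLEDB.Oracle provider
exec master.dbo.sp_MSset_oledb_prop N'OraOLEDB.Oracle', N'AllowInProcess', 1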


Tuesday, 17 February 2015

ReportServer.dbo.ExecutionLogStorage table only shows a couple of months back

In my workplace we use reports based on the ExecutionLogStorage table in the SSRS ReportServer database to track report usage and alert us to report execution failures. The below statement identifies how often different reports have been used, how long they take to run, and how often they fail:

select Catalog.path, ExecutionLogStorage.*
from ExecutionLogStorage
inner join Catalog
on ExecutionLogStorage.ReportID = Catalog.ItemID

Unfortunately, on a server that went online sometime last year, the report execution logs are very limited. After doing some research I found the answer here (Microsoft TechNet article).

If the query below returns anything other than -1, your report execution logs are being deleted once they are older than that number of days.

select *
from ConfigurationInfo
where name = 'ExecutionLogDaysKept'

If so, update the value to -1 to stop historical logs from being deleted.
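A minimal sketch of that update, run against the ReportServer database (the value is stored as a string):

update ConfigurationInfo
set Value = '-1'
where Name = 'ExecutionLogDaysKept'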

As for how to restore the deleted ExecutionLogStorage entries, you'll need to restore a copy of the database to a temporary location, and run insert statements to re-add those rows to the table.