Archive for the ‘DBA’ Category

I use this script all the time when setting up an SSIS package. (Unfortunately, I can’t remember where I found the original code. I’ve adapted it slightly, so if anyone recognises the original then let me know and I’ll link to it.)

The Problem

When setting up a data flow in SSIS the data transfer speed can be very slow because the default settings in the package have not been optimised.

The Solution

SSIS Properties

The query below lists each table in the database, along with its total row length and a calculated MaxBufferSize. I take the MaxBufferSize value and round it down to the nearest hundred, so 87235 becomes 87000, and use that as the DefaultBufferMaxRows value for the data flow. I also change DefaultBufferSize from 10485760 to 104857600 (the same number with a zero added to the end). Finally, I add values to BlobTempStoragePath and BufferTempStoragePath; normally I use C:\temp, but make sure the directory exists and you are probably better off choosing a location that isn't on the C drive.

SELECT s.[name] + '.' + t.[name] as TableName, SUM (max_length) as [row_length], 10485760/ SUM (max_length) as MaxBufferSize
FROM sys.tables as t
JOIN sys.columns as c
ON t.object_id=c.object_id
JOIN sys.schemas s
ON t.schema_id=s.schema_id
GROUP BY s.[name], t.[name];
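
If you want the rounding done for you, a slight variation on the query (a sketch using the same catalog views) rounds the value down to the nearest hundred, ready to paste into DefaultBufferMaxRows:

-- Integer division by 100, then multiplying back up, rounds down to the nearest hundred
SELECT s.[name] + '.' + t.[name] as TableName,
SUM(c.max_length) as [row_length],
(10485760 / SUM(c.max_length)) / 100 * 100 as DefaultBufferMaxRows
FROM sys.tables as t
JOIN sys.columns as c
ON t.object_id=c.object_id
JOIN sys.schemas s
ON t.schema_id=s.schema_id
GROUP BY s.[name], t.[name];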

These changes allow SSIS to load more rows into each buffer and so should speed up your loading. I tend to use an OLE DB connection for both Source and Destination.


It's been a long time since I've posted on here, mainly because I no longer work primarily as a DBA but more as a SQL Developer. As a DBA I was looking after thousands of instances, so I came across a lot of issues, but this one had me stuck for a few days. We have a developer who wanted to run 'R', and a 2016 instance that should have let him, but no dice.

The Problem

When running the following ‘R’ test script:

EXEC sp_execute_external_script
@language = N'R',
@script = N'OutputDataSet <- InputDataSet',
@input_data_1 = N'SELECT 1 AS hello'
WITH RESULT SETS (([hello world] int not null));
GO

I got the following error:
Msg 39021, Level 16, State 1, Line 1
Unable to launch runtime for 'R' script. Please check the configuration of the 'R' runtime.
Msg 39019, Level 16, State 1, Line 1
An external script error occurred: Unable to launch the runtime. ErrorCode 0x80070490: 1168(Element not found.).

The Solution

I was running SQL Server 2016 (13.0.4001.0) with no previous 'R' or CTP installations.
The solution was to unregister and re-register the R runtime using the RegisterRExt tool.
In my case the path to it was C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\R_SERVICES\library\RevoScaleR\rxLibs\x64\RegisterRExt

So, first I opened a command prompt with admin rights and ran:
"C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\R_SERVICES\library\RevoScaleR\rxLibs\x64\RegisterRExt" /uninstall

After that I ran:
"C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\R_SERVICES\library\RevoScaleR\rxLibs\x64\RegisterRExt" /install

Note: each time you run the uninstall or install command it will stop and start your SQL instance.

This was for a default instance. If you are using a named instance, I think you need to add /instance:InstanceName after the /install flag, something like the command below.
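
For example (this is my guess at the exact form rather than something I have run, and the MSSQL13.MSSQLSERVER part of the path will also differ for a named instance, so check it on your server):

"C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\R_SERVICES\library\RevoScaleR\rxLibs\x64\RegisterRExt" /install /instance:InstanceName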

 

Adding extra files to Tempdb

Posted: November 3, 2015 in DBA, Design, Files, Tempdb

This isn't actually a problem. When I build a new instance I like to have one tempdb data file per CPU, up to a maximum of eight. This script will create those files in the directory where the current tempdb data file is. Please note that it resizes the tempdb data and log files too, so you may want to change those values.

/* Script Starts */
DECLARE @cpuCount as int;
DECLARE @Files as int;

-- How many tempdb data files already exist
SELECT @Files = COUNT(*)
FROM tempdb.sys.database_files
WHERE type_desc = 'ROWS';

-- Number of physical CPUs (cores divided by the hyperthread ratio)
SELECT @cpuCount = cpu_count / hyperthread_ratio
FROM sys.dm_os_sys_info;
--Print @cpuCount
--Print @Files

-- Resize the existing tempdb data and log files (change these values to suit your server)
Alter Database Tempdb modify file (Name = tempdev, Size = 300MB, Filegrowth = 150MB);
Alter Database Tempdb modify file (Name = templog, Size = 50MB, Filegrowth = 50MB);

DECLARE @FileLocation as Varchar(750);

-- Directory of the current tempdb data file. You may need to check that the file
-- is actually called tempdb.mdf (select * from tempdb.sys.master_files)
SET @FileLocation = (SELECT SUBSTRING(physical_name, 1,
                            CHARINDEX(N'tempdb.mdf', LOWER(physical_name)) - 1)
                     FROM master.sys.master_files
                     WHERE database_id = 2 AND FILE_ID = 1);
--Print @FileLocation

-- Number of files to add, capping the total number of data files at eight
DECLARE @diff as int;
SET @diff = @cpuCount - @Files;
IF @diff > 7
    SET @diff = 8 - @Files;

DECLARE @x as TinyInt;
SET @x = @Files;
DECLARE @file as Varchar(10);
DECLARE @fileName as Varchar(250);
DECLARE @sql as Varchar(8000);

WHILE @diff > 0
BEGIN
    SET @file = 'tempdev' + CAST(@x as varchar(2));
    -- @FileLocation already ends with a backslash
    SET @fileName = @FileLocation + @file + '.ndf';
    SET @sql = 'ALTER DATABASE TempDb ';
    SET @sql = @sql + 'ADD FILE ';
    SET @sql = @sql + '( ';
    SET @sql = @sql + 'NAME = ' + @file + ', ';
    SET @sql = @sql + 'FILENAME = ''' + @fileName + ''', ';
    SET @sql = @sql + 'SIZE = 300MB, ';
    SET @sql = @sql + 'FILEGROWTH = 150MB';
    SET @sql = @sql + ');';
    EXEC (@sql);
    SET @diff = @diff - 1;
    SET @x = @x + 1;
END;
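
Once it has run, a quick query against the same catalog view shows the data files tempdb now has (size and growth are reported in 8KB pages):

SELECT name, physical_name, size, growth
FROM tempdb.sys.database_files
WHERE type_desc = 'ROWS';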

The Problem

The server was on a VM and the system drive was backed up. We had separate drives for SQL Data, SQL Logs and SQL Backups. Only the backups went to tape.
As part of the decommissioning of a server we lost all the drives on our production box.
They were replaced and the system disk restored. The SQL Backups were also restored. The data and log drives were empty.

The Solution

The restore of the system drive gave us the SQL binaries but we lacked the system databases. In our support database I had a record of where the data and log files were stored so I could recreate the directory structures. I also used the SQL Config manager to find out where the error logs went and created those directories too.
SQL wouldn't start, however, as there was no master db. To get it going I found another instance of SQL which was at the same product level. I copied the master mdf and ldf into the appropriate drives (I had to stop that SQL instance to get these files as they can't be copied while SQL is running). I was now able to get SQL to start for a second. It was now failing due to a lack of tempdb.
At first I was a bit confused as Tempdb is created when SQL server starts. Then I remembered that it was created from model. I went back to that other SQL install and took a copy of model mdf and ldf.
I was now able to Start SQL up and attempt to do a restore of the local master db.
At this point I encountered another issue. To restore master you need SQL to be running in single user mode. That’s fine but there was an app which was connecting to this box every half-second and it kept stealing that single session.
To stop the app I used IPSec (http://searchwindowsserver.techtarget.com/feature/Use-IPSec-to-manage-connections) to block the IP address the app was using – I got this from the SQL error logs. Thinking about it now, I might have been able to turn off TCP/IP and use shared memory instead.
With the app blocked I was able to restore master and then msdb. I didn't restore model as the copy I had borrowed was from an identically set-up SQL instance. You may want to restore model to be on the safe side.
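
For reference, the restores over that single-user connection were along these lines (the backup path below is illustrative, not the one we actually used):

-- Restoring master requires the instance to be running in single-user mode;
-- SQL Server shuts itself down as soon as the restore of master completes.
-- (Illustrative backup path, adjust to your backup directory.)
RESTORE DATABASE master FROM DISK = N'E:\SQLBackups\master.bak' WITH REPLACE;

-- msdb (and model, if you decide to restore it) can then be restored in the usual way
RESTORE DATABASE msdb FROM DISK = N'E:\SQLBackups\msdb.bak' WITH REPLACE;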
At this point I could start SQL up normally and from SSMS I did a restore of all the user databases.

Lessons

We are now creating an archive of Master and model mdf and ldf files. We can then use these without stopping another SQL instance.
You could also copy the master and model files into the backup directory whenever you do a new build, apply a service pack or install a hotfix (i.e. whenever the version number changes). You could then pull these from the backup directory.

The Problem

When I tried to deploy our SQL Maintenance plans to a server nothing happened.
I then checked the server and got the error: The affinity mask specified does not match the CPU mask on this system.

I ran sp_configure and saw that advanced options were not turned on. I wanted to turn them on so I could set the affinity mask properly. When I tried to do this I got the error again: The affinity mask specified does not match the CPU mask on this system. I ran sp_configure once more and saw that the configured value was set to 1 but the run value was set to 0. This happened even if I used RECONFIGURE WITH OVERRIDE.
I restarted the instance to see if that would help. It didn’t.
I tried to change the affinity mask through Properties on the Instance in SSMS. It was all greyed out so I was unable to change anything.

Solution

I won’t detail all the things I tried which didn’t work, but there were many.
I changed the server start-up parameters to include -f. This starts the server with minimal configuration, which also puts it in single-user mode.
I also stopped the SQL Server Agent service as I didn't want it taking my single connection.
There was an application which connects as soon as SQL starts. It was local, so I couldn't disable TCP/IP or use IPSec to stop it getting in. The solution was simpler: it was using a SQL login, so I disabled the account.
I then restarted SQL.
In SSMS I created a DAC connection to the box by going to File > New Query and then connecting as
ADMIN:Servername
I ran sp_configure and could see that affinity mask had a configured value of 2 but a run value of 0, due to the minimal configuration start-up. I ran sp_configure again and set it to 0. I did the same with the affinity I/O mask.
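
For anyone in the same position, the commands over the DAC session were along these lines:

-- Run over the DAC connection (ADMIN:ServerName) while the instance is started with -f
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'affinity mask', 0;
EXEC sp_configure 'affinity I/O mask', 0;
RECONFIGURE;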
I then removed the -f start-up parameter and restarted SQL. I re-enabled the SQL login I'd disabled and started up the SQL Agent.
All was now fine.

Why did it happen?

This was a VM with only one CPU. I assume the affinity settings were put in place on a physical box. The machine was then virtualised with one CPU and the problems started.

Problem

We have some SQL Server 2000 instances where Integrity Checks and Index Reorganisations were failing. The error stated that QUOTED_IDENTIFIER was set to OFF.

Solution

Open the job and add -S ServerName\InstanceName just before -PlanID.

Then at the end add -SupportComputedColumn.
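
After both edits the job step command, which calls xp_sqlmaint, ends up looking something like this (the plan GUID and the other switches are placeholders here; keep whatever your existing job already has):

EXECUTE master.dbo.xp_sqlmaint N'-S ServerName\InstanceName -PlanID <existing plan GUID> <existing switches> -SupportComputedColumn'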

Note

If the maintenance plan itself is changed then the job definition will change and you’ll need to re-fix them.

This solution comes from an MS article which you can read in full here:

http://support.microsoft.com/kb/902388

Problem

This happened on a SQL 2000 system. I wanted to change the autogrowth on a file from 10% to an MB setting. I did this through the GUI and got an error saying the system couldn't find the file I was trying to change.

I then tried to change it using ALTER Database. It still said there was no such file.

I right-clicked on the DB and checked the files again. Then I had a look in dbo.sysfiles and could see it. I went through Windows Explorer and took a look at the file. It was there, in the place where SQL was telling me it was, yet SQL still insisted it wasn't there.

Solution

I ran this script:

SELECT *
FROM master..sysaltfiles
WHERE dbid= DB_ID();

This time the file had a different logical name. It was actually a better name than the one sysfiles was claiming. So I ran a script to change the file's name to the one sysaltfiles had:

ALTER DATABASE DBName
MODIFY FILE (NAME = 'DBName_data', NEWNAME = 'DBName_Data');

Now I was able to get in and change the autogrow setting.
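
For completeness, the autogrowth change itself can also be done in T-SQL rather than through the GUI; the growth value below is just an example:

-- The 100MB growth value is only an example, pick something sensible for the file
ALTER DATABASE DBName
MODIFY FILE (NAME = 'DBName_Data', FILEGROWTH = 100MB);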