Monday, July 27, 2009

“Unable to load client print control” Error When Trying to Print from SQL Reporting Services

We’ve seen this error on and off for quite some time but for some reason we’ve had a batch of calls recently regarding it. 

The error occurs when a user attempts to print a report from SQL Server Reporting Services (SSRS) Report Manager using the printer button on the report viewer.  The issue is caused by an update to Windows Server (ActiveX Kill Bits security update) that breaks the printing function in SSRS.

You can read about the update at http://support.microsoft.com/kb/956391.

The easiest fix is to apply SQL Server Service Pack 3.

Friday, July 24, 2009

“A fatal error was encountered” When Applying Service Pack 3 to Dynamics GP 9.0 on Windows Server 2008

First things first.  I would like to give credit to one of our technical consultants, Henry Pollock, for getting to the bottom of this one.

During a recent upgrade, we received the following error when attempting to apply Service Pack 3 on a Dynamics GP 9.0 install on a new Windows Server 2008 box.

[Screenshot: "A fatal error was encountered" dialog from the service pack installer]

Note that the original install of GP 9.0 had gone fine.  A little digging and it turns out you need to disable UAC (User Account Control) on Windows Server 2008 prior to running the service pack install.

If I were a betting man, I’d guess we’d come across a similar issue on Vista workstations.

Monday, July 20, 2009

Calculating Aggregates in Dynamics GP Integration Manager

We recently came across a bit of a kink when developing an accounts payable integration using Integration Manager for a client.  The client was providing us with a single CSV file of expense report line items.  Each line item represented a distribution for an expense report that would contain one or more distributions.

No problems so far.  We created two sources from the same file, header and distributions.  The header source was grouped by Expense Report ID.  With that we were all set, except that it turned out the client was unable to supply us with the expense report total in the source file.  Since Integration Manager doesn’t have any facility to aggregate values when grouping, we were stuck: we had no way to set the Purchases amount in the AP document header.

We briefly considered preprocessing the source file to add the required column but decided to explore some of the scripting capabilities we hadn’t used in the past.  20 minutes and 5 lines of VBScript later we’d solved our problem.

The solution is to keep a running total of the distribution amounts as you’re integrating them and then assign that to the AP Purchases amount before saving each document.  To start with we need a global variable to store our running total and need to initialize it to $0.00.  To do this we add the following VBScript to the Before Integration event of the integration:

SetVariable "PurchaseAmount", CCur(0.00)

The SetVariable statement saves a value that can be accessed from any other script in the integration.  Note that we’ve explicitly converted it to a currency amount using CCur(0.00).  Global variables are Variants and will take on the type of their initial value.

Now that we’ve initialized our variable we need to keep the running total by capturing the distribution amounts as we process them.  We do this by setting the Debit Amount of the distribution to Use Script and entering the following script:

SetVariable "PurchaseAmount", _
    GetVariable("PurchaseAmount") + _
    SourceFields("Distributions.ExpenseAmount")

CurrentField = SourceFields("Distributions.ExpenseAmount")

The first line is adding the expense amount to the running total.  The GetVariable function is returning the value of the global variable we set using SetVariable.  The second line is setting the Debit Amount to the current expense amount.  Obviously this is a simple example where we’re assuming that the expense amounts are always positive.

Once we’ve got our total for the document we want to assign it to the document Purchases amount.  To do this we go back to our integration level events and assign the following script to the Before Document Commit event that takes place right before saving each AP document:

DestinationFields("Purchases") = _
    GetVariable("PurchaseAmount")
SetVariable "PurchaseAmount", CCur(0.00)

The first line sets the current document’s Purchases amount to the running total.  Since we’re not working within the destination mappings, we set the Purchases field using the DestinationFields object, which gives us access to all of the destination fields.  The second line reinitializes the running total to $0.00 for the next document.
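
Outside of Integration Manager’s event model, the pattern above is just grouped accumulation.  Here’s a minimal Python sketch of the same idea — the rows and field names are hypothetical stand-ins, not IM’s actual objects:

```python
# Illustrative sketch of the running-total pattern (hypothetical data).
from itertools import groupby
from operator import itemgetter

# Each row is one distribution line from the CSV, sorted by report ID.
rows = [
    {"ReportID": "EXP001", "ExpenseAmount": 25.00},
    {"ReportID": "EXP001", "ExpenseAmount": 75.50},
    {"ReportID": "EXP002", "ExpenseAmount": 10.00},
]

documents = []
for report_id, items in groupby(rows, key=itemgetter("ReportID")):
    purchase_amount = 0.0           # "Before Integration": initialize the running total
    for item in items:              # per-distribution script: accumulate each amount
        purchase_amount += item["ExpenseAmount"]
    # "Before Document Commit": assign the total to the header, then reset
    documents.append({"ReportID": report_id, "Purchases": purchase_amount})

print(documents)
```

Same shape as the VBScript: accumulate per line, assign and reset per document.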

So there you have it: the ability to calculate aggregates and running totals within Integration Manager.  Beyond the obvious AP and AR integrations, I’m sure there are numerous useful (and probably less useful) purposes for this.

Aggregate away!

Friday, July 17, 2009

Automating the Transfer of Dynamics GP Databases – Part 4

In Parts 1 through 3 we:

  • Created a script that backs up the DYNAMICS and all company databases (Part 1)
  • Created a script that restores them to another SQL server or instance (Part 2)
  • Created a script that transfers the SQL logins used by Dynamics GP (Part 3)

In Part 4 we’re going to tie it all together.  And to do that we’re going old school.  The batch file.  The following batch file will perform the entire process.

ECHO OFF

ECHO Deleting previous backup files from both source and destination folders
DEL "E:\SQLBackups\*.bak"
DEL "E:\SQLBackups2\*.bak"

ECHO Creating backup script
"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE" -S GP10DEVPLAYPEN -E -i CreateBackupScript.sql -b -o BackupScript.sql

ECHO Creating restore script
"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE" -S GP10DEVPLAYPEN -E -i CreateRestoreScript.sql -o RestoreScript.sql

ECHO Creating login transfer script - Using the -d parameter to ensure connecting to the master db
"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE" -S GP10DEVPLAYPEN -E -d master -i CreateLoginTransferScript.sql -o LoginTransferScript.sql

ECHO Backing up databases
"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE" -S GP10DEVPLAYPEN -E -i BackupScript.sql -b -o Backup.log

ECHO Copy back files to folder accessible to second SQL server or instance.
XCOPY "E:\SQLBackups\*.bak" "E:\SQLBackups2\"

ECHO Stop and start second SQL instance to ensure that all connections have been cleared out
net stop mssql$SQL2005_2
net start mssql$SQL2005_2

ECHO Restoring databases
"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE" -S GP10DEVPLAYPEN\SQL2005_2 -E -i RestoreScript.sql -b -o Restore.log

ECHO Transferring logins
"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE" -S GP10DEVPLAYPEN\SQL2005_2 -E -i LoginTransferScript.sql -b -o LoginTransfer.log

It’s a little difficult to read here (better to cut and paste into a text editor) so here’s what it’s doing:

  • Deleting all backup files in both the source and destination folders.  Don’t use folders that are being used by other backup files as they will be deleted as well.
  • Creating the backup, restore and login transfer scripts.  Note that all of the creation scripts are run against the source server.
  • Running the backup script against the source server.
  • Copying all of the backups to the destination server folder.  In a multi-server environment this would presumably be a share on the destination server.
  • Stopping and starting the destination SQL instance.  This is a quick and dirty way to kill any connections to the databases you’re restoring.
  • Restoring the databases to the destination server.
  • Running the logins script.
  • Note that this procedure will pick up new company databases and SQL logins used by GP but will not remove deleted ones.

A few things to take note of:

  • I’m performing this entire operation on a single server with two instances of SQL 2005.  However, I am performing all of the actions necessary to do this between two machines.
  • This was written and tested using SQL 2005.  Some of the constructs, particularly common table expressions and the use of SQLCMD.EXE in the batch file won’t work in SQL 2000 and would have to be modified.  My guess is that most of this would work in SQL 2008.
  • I’m assuming integrated logins.  In other words, the logged in user’s AD login or service account that’s running the batch file needs to be a member of the sysadmin role on both SQL server instances.
  • I came across an error in the restore when there was an additional but empty data file associated with a database.  It wasn’t included in the backup and so errored on the MOVE clause in the restore for that file.

The trickiest part of this was developing the restore script, as I needed to take into account all of the possible database file scenarios.  Certainly, having the same volume and folder structure on your destination environment would simplify that.

Are There Better Ways of Doing This?

That’s always the question, isn’t it?  There are a hundred ways to skin a cat.  Was the 8 hours I spent putting all this together worth it?  I certainly learned a few new things.

There are certainly other ways of doing it.  Other technologies that would be interesting to explore in this regard include:

  • SSIS
  • .NET or ASP.NET apps
  • PowerShell.  I haven’t had the chance to play with PowerShell but it’s becoming the default command line tool for most Microsoft products.

For now I’ll leave that exploration to others.

Thursday, July 16, 2009

Automating the Transfer of Dynamics GP Databases – Part 3

In Parts 1 and 2 we created scripts that back up all of the Dynamics GP databases and then restore them to another SQL instance.  Once that’s done we’ll need to refresh security.  To do that we need to transfer the SQL logins, except sa, used by Dynamics GP.

The following SQL script creates a login script that can be run against the second SQL instance to create the logins, including their original passwords and SIDs.

--  Ensure that when running this script using SQLCMD.EXE that you set the
--  database to master.
--
--  Setting NOCOUNT on prevents the row count message from appearing at the bottom.
--  Note that the SQL statements do not include column headers.  This makes
--  the entire output of the results window executable.

SET NOCOUNT ON

IF  EXISTS (
        SELECT *
            FROM sys.objects
            WHERE object_id = OBJECT_ID(N'[dbo].[fn_Hexadecimal]')
                AND type in (N'FN', N'IF', N'TF', N'FS', N'FT'))
    DROP FUNCTION [dbo].[fn_Hexadecimal]

GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:        Limey
-- Create date: 7/16/2009
-- Description:   
--   This function converts a varbinary to a
--   hexadecimal string value.
--   It is used to convert a password hash and
--   login SID to a string representation.
-- =============================================
CREATE FUNCTION dbo.fn_Hexadecimal
(
    -- Add the parameters for the function here
    @binvalue varbinary(256)
)
RETURNS varchar (514)
AS
BEGIN
    DECLARE @charvalue varchar (514)
    DECLARE @i int
    DECLARE @length int
    DECLARE @hexstring char(16)
    SELECT @charvalue = '0x'
    SELECT @i = 1
    SELECT @length = DATALENGTH (@binvalue)
    SELECT @hexstring = '0123456789ABCDEF'
    WHILE (@i <= @length)
    BEGIN
      DECLARE @tempint int
      DECLARE @firstint int
      DECLARE @secondint int
      SELECT @tempint = CONVERT(int, SUBSTRING(@binvalue,@i,1))
      SELECT @firstint = FLOOR(@tempint/16)
      SELECT @secondint = @tempint - (@firstint*16)
      SELECT @charvalue = @charvalue +
        SUBSTRING(@hexstring, @firstint+1, 1) +
        SUBSTRING(@hexstring, @secondint+1, 1)
      SELECT @i = @i + 1
    END

    -- Return the result of the function
    RETURN @charvalue
END
GO

--  This common table expression returns all enabled Dynamics GP
--  logins and their properties.
WITH DynamicsGPLogins
AS (
    SELECT
            SID = dbo.fn_Hexadecimal(ServerPrincipals.sid)
            ,UserName = ServerPrincipals.name
            ,DefaultDatabaseName = ServerPrincipals.default_database_name
            ,IsPolicyChecked =
                CASE
                    WHEN SQLLogins.is_policy_checked  = 1 THEN 'ON'
                    ELSE 'OFF'
                END
            ,IsExpirationChecked =
                CASE
                    WHEN SQLLogins.is_expiration_checked  = 1 THEN 'ON'
                    ELSE 'OFF'
                END
            ,PasswordHash = dbo.fn_Hexadecimal(CAST(LOGINPROPERTY(ServerPrincipals.name, 'PasswordHash') AS varbinary (256)))
        FROM sys.server_principals AS ServerPrincipals
            LEFT JOIN sys.syslogins AS Logins
                ON Logins.name = ServerPrincipals.name
            LEFT JOIN sys.sql_logins AS SQLLogins
                ON ServerPrincipals.name = SQLLogins.name
        WHERE ServerPrincipals.type IN ( 'S', 'G', 'U' )
            AND ISNULL(Logins.hasaccess, 0) = 1
            AND ISNULL(Logins.denylogin, 1) = 0
            AND ISNULL(ServerPrincipals.is_disabled, 1) = 0
            AND ServerPrincipals.name <> 'sa'
            AND ServerPrincipals.name IN (
                    SELECT
                            USERID
                        FROM DYNAMICS.dbo.SY01400)
)

--  This select statement uses the list of Dynamics GP
--  logins to create a create login script.
SELECT
'
IF EXISTS (
        SELECT *
        FROM sys.server_principals
        WHERE name = N''' + UserName + ''')
    DROP LOGIN ' + QUOTENAME(UserName) + '
GO

CREATE LOGIN ' + QUOTENAME(UserName) + '
    WITH PASSWORD = ' + PasswordHash + ' HASHED,
    SID = ' + SID + ',
    DEFAULT_DATABASE = ' + DefaultDatabaseName + ',
    CHECK_POLICY = ' + IsPolicyChecked + ',
    CHECK_EXPIRATION = ' + IsExpirationChecked + '
GO
'
    FROM DynamicsGPLogins

There are two basic sections to this query.  The first creates a scalar function in the master database that converts a varbinary to a varchar.  This is used to convert the SID and password hash to a text representation that can be put in the CREATE LOGIN statement.
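
The conversion itself is nothing exotic — a couple of lines of Python produce the same representation.  This is purely a sanity check for illustration, not part of the SQL script:

```python
def to_hex_literal(binvalue: bytes) -> str:
    """Convert a binary value to a '0x...' string, mirroring what fn_Hexadecimal does."""
    return "0x" + binvalue.hex().upper()

# A SID like the ones in the generated script round-trips through the same form.
print(to_hex_literal(bytes([0x42, 0xF4, 0x11, 0xB5])))  # 0x42F411B5
```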

The second section of the script actually creates the login script.  Its two parts are:

  • The common table expression that returns SQL logins associated with Dynamics GP (from the DYNAMICS SY01400 table) and the associated properties.  It filters out any logins that are disabled or have been denied access.
  • The final part is a select that uses this common table expression to actually create the CREATE LOGIN statements.

This is a simplified version of the SQL login transfer script found in Microsoft Knowledgebase Article 918992 and will produce the following SQL script:

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

IF EXISTS (
        SELECT *
        FROM sys.server_principals
        WHERE name = N'DYNSA')
    DROP LOGIN [DYNSA]
GO

CREATE LOGIN [DYNSA]
    WITH PASSWORD = 0x01008E7655067D77DE8FCC167C184C22B8B47BC3CA3CA2FFA86E HASHED,
    SID = 0x42F411B53282584D96B94C0B09491091,
    DEFAULT_DATABASE = master,
    CHECK_POLICY = OFF,
    CHECK_EXPIRATION = OFF
GO

IF EXISTS (
        SELECT *
        FROM sys.server_principals
        WHERE name = N'LESSONUSER1')
    DROP LOGIN [LESSONUSER1]
GO

CREATE LOGIN [LESSONUSER1]
    WITH PASSWORD = 0x0100F81A338610819AD7298F890F6DF93302BC329D14B0F23D9D HASHED,
    SID = 0xA33F190403906349BA4074FC68B7F883,
    DEFAULT_DATABASE = master,
    CHECK_POLICY = OFF,
    CHECK_EXPIRATION = OFF
GO

IF EXISTS (
        SELECT *
        FROM sys.server_principals
        WHERE name = N'LESSONUSER2')
    DROP LOGIN [LESSONUSER2]
GO

CREATE LOGIN [LESSONUSER2]
    WITH PASSWORD = 0x0100C093A6586BBC58FE0CB74886EC4F0AB442FC60766EF9805E HASHED,
    SID = 0xBF47CE1CB51C1A4CABC2A75980AAF78D,
    DEFAULT_DATABASE = master,
    CHECK_POLICY = OFF,
    CHECK_EXPIRATION = OFF
GO

Note that since this creates logins with the same SIDs, they’ll automatically be associated with database users.

In Part 4 we’ll look at putting this all together.

Automating the Transfer of Dynamics GP Databases – Part 2

In Part 1 we wrote a SQL script that creates a backup script for all of the Dynamics GP databases, to help us automate the transfer of all of Dynamics GP’s databases to another SQL server or instance.  In Part 2 we’ll do the same for restoring the databases.

First, let’s review the T-SQL for restoring a database.  In this case the DYNAMICS database.

RESTORE DATABASE DYNAMICS
    FROM DISK = 'E:\SQLBackups2\DYNAMICSTransfer.bak'
    WITH RECOVERY,
        MOVE 'GPSDYNAMICSDat.mdf' TO 'E:\SQLData2\GPSDYNAMICSDat.mdf',
        MOVE 'GPSDYNAMICSLog.ldf' TO 'E:\SQLData2\GPSDYNAMICSLog.ldf'

Not too bad, but there’s a bit of a kink here.  The MOVE clauses put the database files in the desired location on the destination instance.  If you’re putting the files in the exact same folder structure on the destination as the source, then we’re all set and can skip the MOVE clauses and build our RESTORE statements the same way as the backup script, as follows:

--  Setting NOCOUNT on prevents the row count message from appearing at the bottom.
--  Note that the SQL statements do not include column headers.  This makes
--  the entire output of the results window executable.

SET NOCOUNT ON

DECLARE @BackupFolder nvarchar(max)
SET @BackupFolder = 'E:\SQLBackups2\'

--  Create statement to restore the DYNAMICS database
SELECT
        '
        RESTORE DATABASE DYNAMICS
            FROM DISK = ''' + @BackupFolder + 'DYNAMICSTransfer.bak''
            WITH RECOVERY
        GO
        '

UNION ALL

--  Create statements to restore company databases.
SELECT
        REPLACE(
        '
        RESTORE DATABASE {COMPANYDB}
            FROM DISK = ''' + @BackupFolder + '{COMPANYDB}Transfer.bak''
            WITH RECOVERY
        GO
        ',
        '{COMPANYDB}', LTRIM(RTRIM(INTERID)))
    FROM DYNAMICS.dbo.SY01500


If, as in many cases, your folder structure is different on the machine you’re transferring to, you need to deal with the MOVE clause.  To do this we need to know how many files there are and their logical file names. 

If we were dealing strictly with Dynamics GP Utilities created databases with the default files, we could easily just hardcode the two MOVE clauses and insert the company ID in the appropriate places.  However, if you’ve added files to a database (such as spanning it over multiple volumes), or restored one company database over another (such as refreshing a training company from production), your logical file names will not be standard.

The database file information is most easily extracted from the master.sys.master_files system view.  Once we have the list of files we can build a statement that creates a restore script as follows:

--  Setting NOCOUNT on prevents the row count message from appearing at the bottom.
--  Note that the SQL statements do not include column headers.  This makes
--  the entire output of the results window executable.

SET NOCOUNT ON;

--Declare and set variables for the restore folders
DECLARE @BackupFolder nvarchar(max);
DECLARE @NewDataFolder nvarchar(max);
DECLARE @NewLogsFolder nvarchar(max);

SET @BackupFolder = 'E:\SQLBackups2\';
SET @NewDataFolder = 'E:\SQLData2\';
SET @NewLogsFolder = 'E:\SQLData2\';

--  This common table expression returns the list of Dynamics
--  databases, identifying the min and max file IDs.
WITH Databases
AS
(
    SELECT
            DatabaseID = Databases.database_id
            ,DatabaseName = Databases.name
            ,MinFileID = MIN(DatabaseFiles.file_id)
            ,MaxFileID = MAX(DatabaseFiles.file_id)
        FROM master.sys.databases AS Databases
            INNER JOIN master.sys.master_files AS DatabaseFiles
                ON Databases.database_id = DatabaseFiles.database_id
        WHERE Databases.name = 'DYNAMICS'
            OR Databases.name IN (
                SELECT
                        INTERID
                    FROM DYNAMICS.dbo.SY01500)
        GROUP BY
            Databases.database_id
            ,Databases.name
),

--  This common table expression returns the list of files associated
--  with the Dynamics databases, identifying the first and last files
--  in the list.
DatabaseFiles
AS
(
    SELECT
            DatabaseName = RTRIM(LTRIM(Databases.DatabaseName))
            ,LogicalFileName = DatabaseFiles.name
            ,PhysicalFileName =
                CASE
                    WHEN DatabaseFiles.file_id > 2 THEN
                        CASE
                            WHEN type_desc = 'ROWS' THEN 'GPS'
                                + RTRIM(LTRIM(Databases.DatabaseName))
                                + 'Dat_' + CAST(DatabaseFiles.file_id AS nvarchar(3)) + '.' + SUBSTRING(physical_name, (LEN(physical_name) - 2), 3)
                            ELSE 'GPS' + RTRIM(LTRIM(Databases.DatabaseName)) + 'Log_'
                                + CAST(DatabaseFiles.file_id AS nvarchar(3)) + '.' + SUBSTRING(physical_name, (LEN(physical_name) - 2), 3)
                        END
                    ELSE
                        CASE
                            WHEN type_desc = 'ROWS' THEN 'GPS'
                                + RTRIM(LTRIM(Databases.DatabaseName)) + 'Dat.' + SUBSTRING(physical_name, (LEN(physical_name) - 2), 3)
                            ELSE 'GPS'
                                + RTRIM(LTRIM(Databases.DatabaseName)) + 'Log.' + SUBSTRING(physical_name, (LEN(physical_name) - 2), 3)
                        END
                    END
            ,FileType = type_desc
            ,FileID = DatabaseFiles.file_id
            ,IsFirstFile =
                CASE
                    WHEN DatabaseFiles.file_id = Databases.MinFileID THEN 1
                    ELSE 0
                END
            ,IsLastFile =
                CASE
                    WHEN DatabaseFiles.file_id = Databases.MaxFileID THEN 1
                    ELSE 0
                END
        FROM Databases
            INNER JOIN master.sys.master_files AS DatabaseFiles
                ON Databases.DatabaseID = DatabaseFiles.database_id
)

--  The SELECT statement outputs the appropriate parts of the
--  RESTORE statement depending on whether this is the first, last
--  or a middle file.
SELECT
        CASE
            WHEN IsFirstFile = 1 THEN
'
RESTORE DATABASE ' + DatabaseName + '
    FROM DISK = ''' + @BackupFolder + DatabaseName + 'Transfer.bak''
    WITH RECOVERY,
        MOVE ''' + LogicalFileName + ''' TO ''' + @NewDataFolder + PhysicalFileName + ''',
'
            WHEN IsLastFile = 1 THEN
'        MOVE ''' + LogicalFileName + ''' TO ''' + @NewDataFolder + PhysicalFileName + '''
GO
'
            ELSE
'        MOVE ''' + LogicalFileName + ''' TO ''' + @NewDataFolder + PhysicalFileName + ''',
'
        END
    FROM DatabaseFiles
    ORDER BY
        DatabaseName,
        FileID

So this is a little involved and is best read by copying it into a query window, but essentially we’re doing the following:

  • The first common table expression, Databases, returns a row for each Dynamics database and includes the minimum and maximum file IDs.  This will allow us to know what the first and last files are for a particular database and thus know when to prepend the RESTORE clause at the beginning and the GO statement at the end.
  • The second common table expression, DatabaseFiles, returns the list of database files with their logical names and new physical locations.  Additionally, it indicates whether a particular file is the first or last one in the list.
  • The final SELECT statement actually builds the RESTORE statement using the DatabaseFiles common table expression.
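
The first/last-file bookkeeping the final SELECT performs can be sketched outside SQL as well.  Here’s a minimal Python equivalent, assuming a hypothetical list of (logical name, new physical name) pairs in file ID order — the real script derives these from master.sys.master_files:

```python
def build_restore(db, files,
                  backup_folder="E:\\SQLBackups2\\",
                  data_folder="E:\\SQLData2\\"):
    """Assemble a RESTORE ... WITH MOVE statement from (logical, physical) pairs."""
    lines = [
        f"RESTORE DATABASE {db}",
        f"    FROM DISK = '{backup_folder}{db}Transfer.bak'",
        "    WITH RECOVERY,",
    ]
    moves = [f"        MOVE '{logical}' TO '{data_folder}{physical}'"
             for logical, physical in files]
    # Comma after every MOVE except the last -- the same job the
    # IsFirstFile/IsLastFile flags do in the CTE version.
    lines.append(",\n".join(moves))
    lines.append("GO")
    return "\n".join(lines)

print(build_restore("DYNAMICS",
                    [("GPSDYNAMICSDat.mdf", "GPSDYNAMICSDat.mdf"),
                     ("GPSDYNAMICSLog.ldf", "GPSDYNAMICSLog.ldf")]))
```

A procedural language gets the first/middle/last distinction almost for free; in set-based T-SQL it has to be computed up front, which is why the CTEs carry the min and max file IDs along.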

Using my two company test environment, you get the following restore script:

----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

RESTORE DATABASE DYNAMICS
    FROM DISK = 'E:\SQLBackups2\DYNAMICSTransfer.bak'
    WITH RECOVERY,
        MOVE 'GPSDYNAMICSDat.mdf' TO 'E:\SQLData2\GPSDYNAMICSDat.mdf',
        MOVE 'GPSDYNAMICSLog.ldf' TO 'E:\SQLData2\GPSDYNAMICSLog.ldf'
GO

RESTORE DATABASE FOUR
    FROM DISK = 'E:\SQLBackups2\FOURTransfer.bak'
    WITH RECOVERY,
        MOVE 'GPSFOURDat.mdf' TO 'E:\SQLData2\GPSFOURDat.mdf',
        MOVE 'GPSFOURLog.ldf' TO 'E:\SQLData2\GPSFOURLog.ldf',
        MOVE 'FOURDat_2' TO 'E:\SQLData2\GPSFOURDat_3.ndf'
GO

RESTORE DATABASE TWO
    FROM DISK = 'E:\SQLBackups2\TWOTransfer.bak'
    WITH RECOVERY,
        MOVE 'GPSTWODat.mdf' TO 'E:\SQLData2\GPSTWODat.mdf',
        MOVE 'GPSTWOLog.ldf' TO 'E:\SQLData2\GPSTWOLog.ldf'
GO

Note that the FOUR company database has had its additional data file properly listed in the MOVE clauses.

If we now run this script against our second server (or server instance), we’ll restore all of the databases we backed up previously.

In Part 3 we’ll look at how to reattach logins to database users.

Wednesday, July 15, 2009

Automating the Transfer of Dynamics GP Databases – Part 1

Once upon a time I wrote a blog entry on how to write a script that writes a script that runs against every Dynamics GP company database.  Now let’s do something useful with it.

More and more of our clients are having to deal with the alphabet soup of regulatory compliance, such as SOX (Sarbanes-Oxley), PCI DSS, HIPAA, etc., or are simply deciding to implement good IT change control policies.  Often this means that before deploying that fancy new eConnect integration to production, we need to test it in a separate test environment.

No big deal.  Build out a new server, install SQL, GP, copy over the databases…How many companies did you say you had?  200…huh…Better cancel my dinner plans.

Best of all, you really need to keep that environment refreshed on a pretty regular basis for support purposes.  So how do you transfer 200 company databases (and don’t forget the lowly DYNAMICS database) and still manage to make it home for The Daily Show?

In this four part entry we’ll figure out how to:

  • Create a script that backs up the DYNAMICS and all company databases (Part 1)
  • Create a script that restores them to another SQL server or instance (Part 2)
  • Transfer the SQL logins used by Dynamics GP (Part 3)
  • Tie it all together with a batch file (Part 4)

Let’s start with backing up.  Time to dig up the dusty Books Online and find out how to back up a database in the query window.  No double-clicking here.  Here’s how we’d back up the DYNAMICS database:

BACKUP DATABASE DYNAMICS
    TO DISK = 'E:\SQLBackups\DYNAMICSTransfer.bak'
            WITH FORMAT,
        MEDIANAME = 'DYNAMICSBackup',
        NAME = 'Full Backup of DYNAMICS for Transfer'

So, let’s use our handy dandy code to blow that out so that we build a script that backs up the DYNAMICS database and then each of the company databases.

--  Setting NOCOUNT on prevents the row count message from appearing at the bottom.
--  Note that the SQL statements do not include column headers.  This makes
--  the entire output of the results window executable.

SET NOCOUNT ON

DECLARE @BackupFolder nvarchar(max)
SET @BackupFolder = 'E:\SQLBackups\'

--  Create statement to backup DYNAMICS database
SELECT
        '
        BACKUP DATABASE DYNAMICS
            TO DISK = ''' + @BackupFolder + 'DYNAMICSTransfer.bak''
                    WITH FORMAT,
                MEDIANAME = ''DYNAMICSBackup'',
                NAME = ''Full Backup of DYNAMICS for Transfer''

        GO
        '

UNION ALL

--  Create statements to backup company databases.
SELECT
        REPLACE(
        '
        BACKUP DATABASE {COMPANYDB}
            TO DISK = ''' + @BackupFolder + '{COMPANYDB}Transfer.bak''
                    WITH FORMAT,
                MEDIANAME = ''{COMPANYDB}Backup'',
                NAME = ''Full Backup of {COMPANYDB} for Transfer''

        GO
        ',
        '{COMPANYDB}', LTRIM(RTRIM(INTERID)))
    FROM DYNAMICS.dbo.SY01500

This should produce the following backup script that will backup all of the Dynamics databases:

----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

        BACKUP DATABASE DYNAMICS
            TO DISK = 'E:\SQLBackups\DYNAMICSTransfer.bak'
                    WITH FORMAT,
                MEDIANAME = 'DYNAMICSBackup',
                NAME = 'Full Backup of DYNAMICS for Transfer'

        GO

        BACKUP DATABASE FOUR
            TO DISK = 'E:\SQLBackups\FOURTransfer.bak'
                    WITH FORMAT,
                MEDIANAME = 'FOURBackup',
                NAME = 'Full Backup of FOUR for Transfer'

        GO

        BACKUP DATABASE TWO
            TO DISK = 'E:\SQLBackups\TWOTransfer.bak'
                    WITH FORMAT,
                MEDIANAME = 'TWOBackup',
                NAME = 'Full Backup of TWO for Transfer'

        GO

At this point we’re all set to backup the databases.  In Part 2 we’ll create a script that restores them.

Dynamics GP, SQL 2008, Windows 2008 and 64bit

Wow, it’s been over a month since my last post.  I guess between a week and a bit of unplugged (from work at least) vacation and the resulting crush of undone work afterward, I got out of the swing of things.

Due to the upcoming end of support for GP version 8, we’re in the midst of a lot of version 10 upgrades.  Clients still on version 8 are often on older hardware, OS and SQL versions, and so are often moving to new hardware and looking for OS and SQL Server guidance.  Should they go to SQL 2008?  What are the advantages of 64bit?

These questions have come from our clients and functional consultants (people who may not be able to write a recursive common table expression but actually know what deferred revenue is).  Rather than simply continue to answer queries from clients and consultants on a one-off basis, I sent the following in an email to our consultants and thought I might share it.

Why 64bit?

32bit Windows can only access up to 4GB of RAM.  In addition, 2GB is reserved for the OS, with only 2GB remaining for applications.  This means that at a base level, 32bit SQL Server can access at most 2GB of RAM.  There are some system gymnastics that will allow you to access up to 3GB on Windows Server Standard or 7GB on Windows Server Enterprise, but due to the required gymnastics these are not ideal.  In addition, because a 64bit OS processes data in 64bit rather than 32bit chunks, it can process more data faster.  No gymnastics required.

The 64bit version of Windows Server Standard(2003 or 2008) allows you to address up to 32GB of physical RAM.  The 64bit version of Windows Server Enterprise allows you to address up to 1TB of physical RAM.

So how does this help with SQL Server?

The more of a database that SQL Server can load into memory, the faster its performance will be.  If it can cache an entire database in memory, then the only time it has to access the disk is for writes.  All reads are done from memory and are thus considerably faster than reading from disk.  In an ideal world you would want enough free memory to load all of your commonly used databases.  So if you have four 3GB databases, you would ideally want 12GB of RAM, plus the RAM required for the OS and any other applications.  In addition to being able to address this amount of memory, a 64bit SQL Server will have improved performance due to the 64bit data processing itself.  In fact, there should be performance improvements even when running 64bit SQL on a server with only 4GB of RAM.
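
The sizing arithmetic is simple enough to sketch.  The database sizes are the hypothetical ones from the paragraph above, and the OS overhead figure is an assumption for illustration, not a recommendation:

```python
# Rough RAM sizing for caching databases in memory (illustrative figures only).
db_sizes_gb = [3, 3, 3, 3]   # four 3GB databases, per the example above
os_overhead_gb = 4           # OS and other applications (assumed figure)

required_gb = sum(db_sizes_gb) + os_overhead_gb
print(f"Target RAM: {required_gb}GB")  # well beyond what 32bit Windows can address
```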

How does 64bit SQL Server affect my Dynamics GP implementation?

64bit SQL Server itself should not present any issues with a Dynamics GP implementation.  GP will see 64bit or 32bit SQL as the same.  However, because 64bit SQL Server must be installed on a 64bit OS, there are issues with regard to installing the Dynamics GP client on the SQL Server.  Some components, most notably the FRx client, are not supported on 64bit OSs, even though they support connecting to a 64bit SQL Server.

So what about the Dynamics GP client and components?

In the past, we’ve generally installed a “server” install of the Dynamics GP client on the SQL Server.  This includes the Dyndata file share that is used by the workstation clients.  Unlike earlier versions of Dynamics GP, there is no longer a separate “server” install of the client.  All clients are equal and all “server” functions can be performed from any client workstation with little to no adverse effect on performance.

As a best practice, it is always advisable to avoid installing additional applications on the SQL Server for the following reasons:

  • To maximize the amount of resources (CPU, RAM and disk) available for SQL itself.
  • To reduce or eliminate the need to remote desktop directly into the SQL server for security reasons.  This also reduces the possibility of inadvertent reboots or system changes of a production SQL server.
  • To reduce the possibility of additional applications or services interfering with SQL or other services and causing instability issues on the SQL Server.
  • To avoid issues with clustering if the SQL server currently is, or is planned to be, clustered.
  • To avoid issues with installing the client on a 64bit OS.
  • To allow the SQL hardware to be upgraded or replaced without affecting the Dynamics GP install (except possibly the client ODBC connections).

So where should I install the Dynamics GP client?

It is recommended that a “server” or “administrative” copy of the Dynamics GP client be installed on a dedicated application server.  This application server should be a 32bit OS.  This is where the Dyndata folder will reside as well as additional components such as eConnect, Web Services, RDM etc.  In addition, it is recommended that a copy of the client’s current version of Microsoft Office (at least Excel) also be installed on this server for the purposes of troubleshooting GP/Office functionality.  Also, a copy of the SQL Server Client Tools (SQL Server Management Studio and SQL Server Business Intelligence Studio) should be installed on this server so that most, if not all, SQL administrative functions can be performed without remote desktopping into the SQL server.

As noted, this should ideally be a dedicated server so that the Dynamics GP install and the various components do not adversely affect, and are not adversely affected by, other applications installed on the server.  Since many of our clients have implemented virtual environments, and this is a perfect candidate for a virtual server (despite Microsoft’s statements to the contrary), it should not be an additional burden to create and support this environment.

When is it appropriate to recommend a single server environment?

If the client has a small number of users (10 or fewer), the total size of all active databases (Dynamics or other applications) is not expected to exceed 4GB over the life of the installation, and the client is not in a virtualized environment (so an application server would require additional hardware and OS licenses), then it would be appropriate to recommend a single server environment.  This environment would require a 32bit OS and 32bit SQL Server.
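As a rough illustration only, the single-server criteria above can be captured in a small decision helper.  The thresholds come straight from the text; the function name and structure are hypothetical:

```python
def single_server_is_appropriate(user_count, projected_total_db_gb, is_virtualized):
    """Sketch of the single-server test described above.

    A single 32bit server is appropriate only when the site is small
    (10 users or fewer), the active databases are expected to stay
    within 4GB total over the life of the installation, and the client
    is not virtualized (so a separate app server would mean buying
    additional hardware and OS licenses).
    """
    return user_count <= 10 and projected_total_db_gb <= 4 and not is_virtualized

print(single_server_is_appropriate(8, 3.5, False))   # True  -> single server OK
print(single_server_is_appropriate(25, 3.5, False))  # False -> use two servers
```

If any one of the three conditions fails, the two-server environment described earlier is the better recommendation.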

Is a two server environment the most I’ll need?

No.  If you have a particularly high transaction load on your SQL server, such as a high transaction Dynamics environment or other large, high transaction databases besides Dynamics GP, then it would be appropriate to split other components of SQL Server, such as Reporting Services and Analysis Services, onto their own dedicated servers.  Note that in a clustered SQL environment, Reporting Services and Analysis Services would be required to run on servers other than the SQL Server.  In that case, the Enterprise edition of SQL Server would be required.

What about SQL Server 2008?

Now that Microsoft has released Service Pack 1 for SQL Server 2008, more clients may be looking to deploy it to take advantage of some of its new features.  Most of the commonly used components of Dynamics GP, including SRS reports, are supported on SQL Server 2008 when using SP2 or SP3 of Dynamics GP or of the particular GP component.  FRx is currently not supported on SQL Server 2008; however, it has been installed and is working correctly on SQL Server 2008 in the Caturano Dynamics GP environment.  Microsoft is currently evaluating FRx’s compatibility with SQL Server 2008 and expects to support it in FRx 6.7 SP 11, due out in July 2009.

When might I recommend SQL Server 2008?

SQL Server 2008 has a number of new features that some clients may find compelling, including:

  • A (reported) 10-15% performance improvement over SQL Server 2005.
  • Transparent Data Encryption, which encrypts the entire database, log and backup files.  This feature might be of interest to clients that store credit card numbers, 1099 vendor SSNs, employee payroll information, etc. and may be subject to Payment Card Industry Data Security Standards (PCI DSS) or the Mass Privacy regulations going into effect 1/1/2010.
  • Built-in data auditing (including auditing of data viewing) and configuration auditing for clients that might face compliance issues (PCI, SOX, HIPAA, Mass Privacy).
  • A completely rewritten SQL Server Reporting Services with improved performance (particularly multi-user performance) and a new, more user-friendly Report Builder for report authoring.

What about Windows Server 2008?

Like SQL Server 2008, Windows Server 2008 has passed the initial service pack test (it’s currently on SP2), so many clients may be migrating to it or at least setting up new hardware/VMs with it.  As with SQL Server 2008, some components, most notably FRx, are not supported on Windows Server 2008.

Can you give me a quick reference on component support?

Sure.  The following components are supported on 32bit only.  Thus if you use any of these components, they must be installed on a 32bit OS, ideally an app server:

  • Rapid Implementation Tools (including Rapid Configuration and Rapid Migration)
  • Business Portal for Microsoft Dynamics GP 10.0, Internet Information Services server
  • Catalog Maintenance
  • Microsoft Dynamics Security Synchronization Utility
  • Business Portal Migration Utility
  • Analysis Cubes for SQL Server 2000 (client and server).  Note that this is for SQL Server 2000.  The Analysis Cubes for SQL Server 2005 are supported on 64bit.
  • Benefit Self Service Suite and Certification, Licensing, and Training Manager for Business Portal
  • Personal Data Keeper
  • Workflow
  • Workflow Software Development Kit
  • Workflow History Archiving Utility
  • Microsoft FRx Reporter 6.7
  • Microsoft FRx Forecaster 6.7
  • Microsoft FRx Web Port 6.7
  • Enterprise Reporting 7.5
  • SmartTags
  • eExpense
  • Field Service Anywhere
  • Microsoft Dynamics Solution Accelerator for Compliance Management

The following components are not supported on Windows Server 2008 (Assume SP3 for all supported components).  Thus if you use any of these components, the app server must be running Windows 2003:

  • Rapid Implementation Tools (including Rapid Configuration and Rapid Migration)
  • Analysis Cubes for SQL Server 2000 (client and server).  Note that this is for SQL Server 2000.  The Analysis Cubes for SQL Server 2005 are supported on 64bit.
  • Workflow.  Note that Workflow will be supported with the release of Microsoft Office SharePoint Server Service Pack 2. 
  • Microsoft FRx Reporter 6.7.  Note that it is expected to be supported on the release of SP 11.
  • Microsoft FRx Forecaster 6.7
  • Microsoft FRx Web Port 6.7
  • Enterprise Reporting 7.5
  • SmartTags
  • Field Service Anywhere
  • Microsoft Dynamics Solution Accelerator for Compliance Management

The following components are not supported on SQL Server 2008 (Assume SP3 for all supported components).  Thus if you use any of these components, you must use SQL Server 2005 (64bit or 32bit is fine):

  • Rapid Implementation Tools (including Rapid Configuration and Rapid Migration)
  • Analysis Cubes for SQL Server 2000 (client and server).  Note that this is for SQL Server 2000.  The Analysis Cubes for SQL Server 2005 are supported on 64bit.
  • Microsoft FRx Reporter 6.7.  Note that it is expected to be supported on the release of SP 11.
  • Microsoft FRx Forecaster 6.7
  • Microsoft FRx Forecaster 7.0
  • Microsoft FRx Web Port 6.7
  • Enterprise Reporting 7.5
  • SmartTags
  • Field Service Anywhere
  • Microsoft Dynamics Solution Accelerator for Compliance Management

Can you give me a quick summary of the environments?

Absolutely.  Here they are:

Recommended Standard Environment:

  • 2 Servers
  • Server 1 (SQL Server) – Windows Server 2003 or 2008 64bit, SQL Server 2005 (or 2008 based on compatibility) 64bit.
  • Server 2 (Application Server) – Windows Server 2003 32bit, Dynamics GP full client install (“admin” or “server” install), all required components (FRx, eConnect, Web Services), Microsoft Office (client’s current version), DynData file share, SQL Server Client Tools (SQL Server Management Studio, SQL Server Business Intelligence Studio).

Low Volume Environment:

  • 1 Server – Windows Server 2003 32bit, SQL Server 2005 32bit, Dynamics GP full client install (“admin” or “server” install), all required components (FRx, eConnect, Web Services), Microsoft Office (client’s current version), DynData file share.

High Transaction Volume Environment:

  • 3 Servers
  • Server 1 (SQL Server) – Windows Server 2003 or 2008 64bit, SQL Server 2005 (or 2008 based on compatibility) 64bit database engine only.  Do not install Reporting Services or Analysis Services.
  • Server 2 (Application Server) – Windows Server 2003 32bit, Dynamics GP full client install (“admin” or “server” install), all required components (FRx, eConnect, Web Services), Microsoft Office (client’s current version), DynData file share, SQL Server Client Tools (SQL Server Management Studio, SQL Server Business Intelligence Studio).
  • Server 3 (Reporting & Analysis Server) – Windows Server 2003 or 2008 64bit, SQL Server 2005 Reporting Services (or 2008 based on compatibility) 64bit (32bit if reporting from external 32bit ODBC sources such as MySQL or MS Access), SQL Server Analysis Services, SQL Server Integration Services.

High Transaction Volume/High BI Volume Environment:

  • 4 Servers
  • Server 1 (SQL Server) – Windows Server 2003 (or 2008 based on compatibility) 64bit, SQL Server 2005 (or 2008 based on compatibility) 64bit database engine only.  Do not install Reporting Services or Analysis Services.
  • Server 2 (Application Server) – Windows Server 2003 32bit, Dynamics GP full client install (“admin” or “server” install), all required components (FRx, eConnect, Web Services), Microsoft Office (client’s current version), DynData file share, SQL Server Client Tools (SQL Server Management Studio, SQL Server Business Intelligence Studio).
  • Server 3 (Reporting Server) – Windows Server 2003 or 2008 64bit, SQL Server 2005 Reporting Services (or 2008 based on compatibility) 64bit (32bit if reporting from external 32bit ODBC sources such as MySQL or MS Access).
  • Server 4 (Analysis Server) – Windows Server 2003 or 2008 64bit, SQL Server Analysis Services, SQL Server Integration Services.