Archive

Posts Tagged ‘SQL Server’

A SQL day in the Life…

February 27, 2012

Needed to take a short break from ‘Creating the Lab’ <<said with a big announcer voice with full-on reverb>> and take a detour (detox?) down SSIS-ville for a second.  When you get to wear many hats, the hits keep coming.  Wee-haaaaaaa! (expletive deleted…).  This little segue started out with a simple enough request: we need to add this ‘other’ data to the data warehouse.  Easy, right?

Maybe a picture would help?

image

The data warehouse and app live in the company space, and the ‘other’ data lives on a different network out in web-ville.

Due to security stuff, this problem was resolved with a third server… something like this…

image

But it’s not as clean as that…  Webbie writes a file, the intermediate server picks it up from the web and deposits it on a file share, and the receiver checks that directory for the file and imports it when it’s there.  That’s just too many moving parts and too many possible points of failure.  I didn’t want to follow the same path; my goal was to simplify it with a single script and a point-to-point VPN.

So my mental picture looks like: 

image

I think this looks simpler.  Now, to get the script to do what I need it to do…

First, says I, I’ll put this all in SSIS, schedule it as a SQL job, and voilà, I get something like…

image

I’m going to leave this diagram simple because this no workee… no matter what variation I tried.

For example: Connect to Web uses a superset of credentials with all the right permissions.  The connection is visible to all, yet the subsequent SSIS package never sees it.  Now, to be honest, I never actually tried the final configuration shown above, and that “may” have worked.  But in the process of troubleshooting, I discovered that PowerShell could fire off the connection and then the SSIS package without a problem.

So, pretty pictures aside, this is what I ended up with:

#instantiate the VPN connection using RASDIAL
rasdial <connection name> <connection credentials> /domain:<web domain>

#execute the SSIS package using dtexec ($File is the log file, defined further down)
dtexec /f "C:\scripts\<package file name>.dtsx" | Out-File $File -Append

#close VPN connection
rasdial <connection name> /disconnect

On its own this doesn’t seem to log much, but the pipe to Out-File sends all of the output to the file, giving you a very rich report:

Time Started: 02/27/2012 05:33:57
Microsoft (R) SQL Server Execute Package Utility
Version 10.50.1600.1 for 64-bit
Copyright (C) Microsoft Corporation 2010. All rights reserved.

Started: 5:34:06 AM
Progress: 2012-02-27 05:34:17.30
Source: Post to Final Tbls
Validating: 0% complete
End Progress
Progress: 2012-02-27 05:34:17.92
Source: Post to Final Tbls
Validating: 25% complete
End Progress
Progress: 2012-02-27 05:34:18.12
Source: Post to Final Tbls
Validating: 50% complete
End Progress
Progress: 2012-02-27 05:34:18.17
Source: Post to Final Tbls
Validating: 75% complete
End Progress
Progress: 2012-02-27 05:34:18.37
Source: Post to Final Tbls
Validating: 100% complete
End Progress
Progress: 2012-02-27 05:34:18.37
Source: Prep
Validating: 0% complete
End Progress
Progress: 2012-02-27 05:34:18.37
Source: Prep
Validating: 25% complete
End Progress
Progress: 2012-02-27 05:34:18.47
Source: Prep
Validating: 50% complete
End Progress
Progress: 2012-02-27 05:34:18.47
Source: Prep
Validating: 75% complete
End Progress
Progress: 2012-02-27 05:34:18.55…

Hmmm…  maybe a little more info in the log would be helpful

#instantiate vars and file name
$dt = get-date
$d = $dt.dayofweek
$File = "C:\scripts\ServiceLog_" + $d + ".txt"
#start log
"—————— Started: " + $dt + " ——————" | Out-File $File   # add -Append if overwriting is not desired

So far so good.

# begin Processing
# open VPN
"—————— Connection opened: " + (get-date).toshorttimestring() + " ————" | Out-File $File -Append

rasdial <connectionName> <credentials> /domain:<domain>

"—————— SSIS Started: "  + (get-date).toshorttimestring() + "  ————————–" | Out-File $File -Append
dtexec /f "C:\scripts\PullDataFromRWeb.dtsx" | Out-File $File -append
"—————— SSIS Completed: " + (get-date).toshorttimestring() + " —————-" | Out-File $File -append

Finally
# close vpn
rasdial <connectionName> /disconnect
"—————— Connection Terminated: " + (get-date).toshorttimestring() + "———" | Out-File $File -Append

Ah, much better.

Output?

image

That one leaves off the SSIS package output.  I’ll update this later once the job runs again.

Ok, now I set up the package to use a domain account and set up the schedule and got…

Error: 2012-02-28 10:42:04.10
   Code: 0xC0016016
   Source: 
   Description: Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available.
End Error
Error: 2012-0

Oh yeah, file-system SSIS and security… encryption is an issue.

Easy to fix…

image

Set a password…

image
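For a little background on that error: the default package protection level, EncryptSensitiveWithUserKey, ties the encrypted "DTS:Password" nodes to the Windows profile of whoever saved the package, so a scheduled job running under a different account can’t decrypt them.  Saving the package with EncryptSensitiveWithPassword (the dialog above) and handing dtexec the password via its /De switch gets around that.  A minimal sketch of the resulting call, placeholder names only (the same switch shows up in the final script below):

# /De supplies the package password so dtexec can decrypt the protected nodes
dtexec /De "<package password>" /f "C:\scripts\<packageName>.dtsx" | Out-File $File -Append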

Now, a little bit of error trapping and the final script looks something like:

#instantiate vars and file name

$dt = get-date
$d = $dt.dayofweek
$File ="C:\scripts\ServiceLog_" + $d + ".txt"
$state = ""
#start log

# begin create error handling and function defs

# open VPN Function

Function connectVPN
{
    "—————— Started: " + $dt + " ——————" | Out-File $File
    "—————— Connection opened: " + (get-date).toshorttimestring() + " ————" | Out-File $File -Append

    rasdial <connection name> <credentials> /domain:RMLSWeb | Out-File $File -Append
}

#execute the SSIS package
Function executeSSIS
{
    "—————— SSIS Started: " + (get-date).toshorttimestring() + " ————————–" | Out-File $File -Append

    dtexec /De <password for SSIS> /f "C:\scripts\<packageName>.dtsx" | Out-File $File -Append

    "—————— SSIS Completed: " + (get-date).toshorttimestring() + " —————-" | Out-File $File -Append
}

# close vpn
Function disconnectVPN
{
    rasdial "<connection>" /disconnect
    "—————— Connection Terminated: " + (get-date).toshorttimestring() + "———" | Out-File $File -Append
}

#cls
connectVPN
$state = test-connection "<server name – FQDN>"
#write-host $state.length

if ($state.length -gt 0)
{
    executeSSIS
    disconnectVPN
}
else
{
    "—————— Connection Failed: " + (get-date).toshorttimestring() + "———" | Out-File $File -Append
    rasdial "<connection>" /disconnect
    "—————— Connection Terminated: " + (get-date).toshorttimestring() + "———" | Out-File $File -Append
}
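One small aside on the connectivity check: Test-Connection returns one result object per ping, so testing the length works, but the -Quiet switch returns a plain $true/$false and makes the intent clearer.  A sketch, using the same placeholder server name:

# -Quiet returns $true only when the pings succeed; -Count 2 keeps the check quick
if (Test-Connection "<server name - FQDN>" -Count 2 -Quiet)
{
    executeSSIS
    disconnectVPN
}
# ...the else branch stays the same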

A little note about the above script: it creates seven log files, Monday through Sunday.  In my case I’m checking this information daily, so adding additional elements like sending notifications and such isn’t needed.  If it were running in a big server farm, I’d add a few more bells and whistles to provide constructive notice.  Since a ‘short history’ is sufficient, I changed the log action in the first function from:

"—————— Connection opened: " + (get-date).toshorttimestring() + " ————" | Out-File $File -Append

to

"—————— Connection opened: " + (get-date).toshorttimestring() + " ————" | Out-File $File

(Dropped the append)

This way each run starts a new file that stays small, and I always have seven days of history to go back to.

Thus my output directory looks like:

image

Deploying SharePoint … the beginning…

August 16, 2011

First and most important: this is based on a very small SharePoint implementation.  The service supports fewer than 100 users.  If there were more trust in cloud-based products, the web-based version of SharePoint would have been a likely choice.  Not so much for us, though.  Even though we are a small company, we’ve employed the Enterprise version of SharePoint, after running through the Foundation and then Standard versions.  With Microsoft, all the good stuff comes in the Enterprise version.  No a la carte here… Hello?  Microsoft?  A la carte model, please???

One of the first lessons learned was that SharePoint is more of a concept than a concrete object you can put your hands on.  Sure, it has that element, but… like that piece of dense chocolate cake that goes thunk… it’s heavier than you think.  SharePoint is a server product that can use a bundled SQL Express instance or an external SQL Server.  Depending on which model you opt for, be prepared to create a lot of AD entries if you go with anything more than a single-server model.

SQL Server is a critical element of SharePoint

When isn’t SQL Server a critical element?  While SharePoint does provide functionality to access external content, SQL Server is a central and critical piece of the SharePoint solution.  In our deployment, I set it up so that an external, dedicated SQL Server supported SharePoint first; secondary databases were added after the solution was in place and were evaluated for their impact.  The SQL Server is a bare-metal server.  While it has been documented that SQL Server will run virtualized, we did not want to take the chance and dedicated a physical server.  Hardware is getting cheaper by the day.

What version?  Since it is well known that the future of Microsoft products is on 64-bit servers, we went with the latest version of SQL Server on a 64-bit box (SQL 2008 R2 x64).

image

Servers and Web Roles…More fun

Start up a Windows Server (2008 R2 x64, please) and you’re confronted with ROLES.  Well, SharePoint needs a role or two to function.  Since it is an application, the Application Server role would be good.  Web?  I think SharePoint has “some” web parts to it, so IIS (the Web Server role) would probably be a good idea.  Go to this PowerShell and SharePoint link for PowerShell stuff.
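If you’d rather script the role setup than click through Server Manager, the ServerManager module that ships with 2008 R2 can add both roles from PowerShell.  A minimal sketch (run from an elevated prompt; the feature names are the standard ones for the Application Server and Web Server roles):

# make the role/feature cmdlets available (Windows Server 2008 R2)
Import-Module ServerManager

# add the Application Server and Web Server (IIS) roles that SharePoint expects
Add-WindowsFeature Application-Server, Web-Server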

In addition to the roles mentioned, some other stuff should be set up ahead of time:

NOTE: I’ve tried to pull in the latest links, but please double-check, as things do change and it’s always best to get stuff fresh.

In the small scenario, all of these components and roles are on one server.  The only separation used is that the SQL Server is on a different box.

Looking around my office and thinking about this project, I have four books on SharePoint 2010 that I’ve collected over the past year or so (see below).  Most of these were not even in print when I started this project.  It’s been a long and interesting road for sure.  Some are dog-eared.  Others are full of sticky notes and yellow highlighter.  Go figure…

SharePoint has a lot of Service Applications that can be enabled

image

Not all of the services should be activated just because.  Only activate what you need.  Just because you’ve activated it doesn’t mean you’re done either.

To get familiar with the PowerShell support, go to the Microsoft SharePoint 2010 Products folder on the server’s Start menu:

image

Open up the SharePoint 2010 Management Shell and type Get-Command.

You will see a long list…

image
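To narrow that list down to just the SharePoint cmdlets (the Management Shell loads the Microsoft.SharePoint.PowerShell snap-in for you), something like this should do it:

# list only the cmdlets registered by the SharePoint snap-in
Get-Command -PSSnapin Microsoft.SharePoint.PowerShell

# or just count them to get a feel for the surface area
(Get-Command -PSSnapin Microsoft.SharePoint.PowerShell).Count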

SharePoint Search:  As mentioned in other blogs, there are two SharePoint search facilities, SharePoint Search and FAST Search.  For FAST, you want a different server.  We’ve not implemented FAST; normal search will suffice for now.  When we need thumbnails and other fancy stuff, we’ll get another server for FAST.

One of the features that will need to be enabled on our server is email integration.  This requires some separate steps but is necessary for records and content management.
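A sketch of one of those separate steps, under the assumption that incoming email will be handled by the local Windows SMTP service (which is what SharePoint’s incoming email feature relies on); double-check the official docs for your setup:

# incoming email integration needs the SMTP Server feature on the SharePoint box
Import-Module ServerManager
Add-WindowsFeature SMTP-Server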

Again, since this is a limited server farm, the default of having the server administration role on the same server is not an issue.  When our farm grows HUGE, then it might make sense to put the role on its own dedicated server.

Development Needs: Say what you will, having your own personal SharePoint server to do development is a good thing.  Having it as a virtual system is even better.  To facilitate this, I leveraged VMware Workstation to build separate SharePoint environments to mimic the three systems we have in place: Foundation, Standard and Enterprise.  Why go through the trouble of setting up virtual environments for SharePoint?  In a word – SNAPSHOT.  SharePoint is a large environment that can go sideways on you in an instant.  Not all sideways events are easy to recover from, either.  By using a virtual environment, you can test stuff out without risking your sanity.  I follow the same process for almost any additional tweak to the system (see the sketch after this list for scripting the snapshot step):

1. Snapshot

2. Develop

3. Confirm

4. Snapshot
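If you want to script that snapshot step, VMware Workstation ships a command-line tool called vmrun that can be driven from PowerShell.  A sketch only; the install path, .vmx path and snapshot name below are placeholders:

# take a named snapshot of the SharePoint dev VM before making changes
& "C:\Program Files (x86)\VMware\VMware Workstation\vmrun.exe" -T ws snapshot "C:\VMs\SP2010-Dev\SP2010-Dev.vmx" "pre-change"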

If only the server itself had a similar facility!  I’d have fewer gray hairs…

Notice the VMware reference?  Well, I have to admit that I’m a bit confused by Microsoft’s virtual PC offerings.  Eventually, I’m sure I’ll find a concise HOWTO and start using them.  So far, mostly due to lack of time, the resources on MSDN about virtualization seem detailed but scattered.  <—opinion!!!

Recap:

Server acquired

Pre-requisites installed

At least one installation of SharePoint using VMware completed and a snapshot taken

Ready for the plunge, bringing up the server…taking down the server…bringing it back up.  Crud, it isn’t working as expected, and what is this about Kerberos anyway?

Books Used…

http://www.wrox.com/WileyCDA/WroxTitle/Beginning-SharePoint-2010-Development.productCd-0470584637.html

http://book.pdfchm.net/microsoft-sharepoint-2010-unleashed/9780672333255/ – the overall reference book

http://blah.winsmarts.com/2009-12-Microsoft_SharePoint_2010__Building_Solutions_for_SharePoint_2010.aspx <- a great reference for setting that all important Development Environment.

http://www.wrox.com/WileyCDA/WroxTitle/Professional-SharePoint-2010-Development.productCd-0470529423.html

Of course, various .NET and SQL Server references as well.

BIDS, SSIS and SQL Server

August 4, 2011

Abstract…

You have an SSIS package using server storage and you need to update it because somebody changed the database underneath you.  Cool, simple beans, easy to fix, right?  Really, it is pretty simple and quick.  Oh, there’s lots you can read about best practices, security and such, but this article isn’t going there.  You may see something like…

“decrypt protected XML node "PassWord" with error 0x8009000B "Key not valid for use in specified state.". “

if all the steps have not been done.

The following order of steps should work for you just fine…<insert boilerplate disclaimer…these steps worked just fine on my systems, your systems may vary, and the process is suggested as learning content only…OK, that should do it.  Remember, no guarantees>

Step one…say hello to my little friend, the package

A best-practices note: work from source control (TFS, of course), so open the last version of the project from source control.

Once the package is open, on your Control Flow tab you should see a little warning icon on your Data Flow task (assuming the issue comes from a schema change).  The icon looks like…

image

Love the Icon.  You’ll get to see it a lot here…

Double-click on the pump with the error and that will take you to the Data Flow tab.  On your data source, you should see the warning icon again… step two will address how to fix that.

Step two… sending my little friend away…

For the source, the process is simple if your setup is simple.  Hey, if you’ve already done some tricky stuff, your level is probably above this article.  Anyway, double-click on your source and you’ll get a warning that the underlying metadata has changed.  Go ahead and answer “yes” to the question and this will bring up the Source Editor.  In this case, the source is an OLE DB Source.

Here I would suggest clicking on Preview to verify you haven’t got any errors with your data pull (especially if the pull is done with a SQL statement).  Make any necessary changes and click Preview again.

Repeat this process for all the Control Flows that have the warning icon.

Now comes the fun part, adding the package back to the server.

Step three… Import?

You are working from the server and the package runs fine within BIDS (Business Intelligence Development Studio).  You go to the server (open up an SSIS instance) and the package still won’t run.  Huh!

On the one hand you think;

  • package on server
  • package run good
  • package on server still bad

While you may be running BIDS against a package that came out of the MSDB store, the rebuilt package has not been loaded back into the package store.  This is where “import” comes into play.

First, back in BIDS, make sure you “Build” your project.   Once this is completed, a file is created in the \bin directory under your project.  For example: <path>\..\SSIS File\PDA Project\PDA Project\PD\bin

Note: In your project, if you click on the package, you’ll see the “Full Path” in the Properties box.  This is where the \bin folder will be created when you “Build” the project.

Ok, project built and we’re done right?  Not yet.

The next step is to go back to SSMS (SQL Server Management Studio) and open up the SSIS server instance.  Here you will see…

image

MSDB is what you want to work with.  Right-clicking on MSDB brings up the following submenu:

image

Click on Import Package and you get the file explorer

image

So far, pretty straightforward.  The next step determines whether the package actually will or won’t run on the DB server.

Open up the Protection level by clicking on the … button right above the Help button.  This opens the following dialog:

image

I’m using server storage and roles for access control because this is running in the context of a SQL job with a predefined account that has appropriate access to all locations needed in the package.  For more information, go to: Package Protection Levels on MSDN.

Go ahead and click through the OKs.
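As an aside, if you’d rather script the import than click through the dialog, the dtutil utility that ships with SQL Server can copy a file-based package into the MSDB store.  A sketch with placeholder names, assuming a default instance:

# copy the built .dtsx from the project \bin folder into the MSDB package store
dtutil /FILE "<path to project>\bin\<packageName>.dtsx" /COPY SQL;<packageName> /DESTSERVER "<server>"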

Testing the result…

When you were in BIDS, you clicked the run button to make sure it ran.  There is not a nice icon like that in SSMS.  In SSMS, right-click on the package and then click on Run Package:

image

You will then be presented with the Execute Package Utility.  To run in your user context, leave “Use Windows Authentication” checked.  If you have a specific service account native to SQL Server, you can use that instead.  Click on Execute to make sure the package runs.

image
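That same check can be scripted, too: dtexec can run a package straight out of the MSDB store with the /SQL switch.  Again just a sketch, placeholder names only:

# run the server-stored package; /SQL takes the package path within MSDB
dtexec /SQL "<packageName>" /SERVER "<server>"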

Summary

That’s pretty much it.  To recap: this process opens an existing package that lives in the MSDB store, modifies and rebuilds it, and then re-imports the package back into SQL Server.  Tools used:

BIDS – A part of the Visual Studio IDE focused on Business Intelligence

SSMS – SQL Server Management Studio

Integration Services Server – Imports and manages packages

Import and Export facility to Import the new build into the server