Sunday, January 31, 2016

Prep notes for my upcoming talk on 'Partially Contained Databases'

Did a prep session this past Friday for my upcoming talk (Feb. 3) on 'Partially Contained Databases' to the Sacramento SQL Server Users Group (@SacSQLUG).  (Some #spoileralerts if anyone from the group is reading this before the Feb meeting.)  Had a few great, gracious current or former co-workers view the talk and demo over a remote screen-sharing session and provide feedback.

  1. The feedback from a group of people with differing levels of SQL Server experience was hugely valuable.  As a presenter, you are so close to the topic that you can present things as obvious, forgetting they actually took you quite a few leaps to discover yourself.
  2. The call to action, which I will add to the slides, is to 'try it out'.  Try using partial containment for a simple example like a reporting service account that should have access to only one database (see the sketch after this list).
  3. This topic is fast becoming a standard practice to employ for high availability/disaster recovery scenarios.  (At this time, I'll leave it to the viewer's great mental ability to make the connection) 
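To make the 'try it out' call to action concrete, here is a minimal sketch of that reporting-account scenario.  The database, user, and password names below are hypothetical placeholders, but the statements are the standard SQL Server 2012+ partial containment setup:

-- Enable contained database authentication at the instance level (requires sysadmin).
EXEC sp_configure 'contained database authentication', 1;
RECONFIGURE;
GO

-- Create a partially contained database (all names here are hypothetical).
CREATE DATABASE ReportingDemo CONTAINMENT = PARTIAL;
GO

USE ReportingDemo;
GO

-- A contained user with a password: no server-level login required.
CREATE USER ReportingSvc WITH PASSWORD = 'ChangeMe!Example1';
ALTER ROLE db_datareader ADD MEMBER ReportingSvc;
GO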

Some of the core feedback (which may only make sense once the final presentation is shared):
  • A hard time following the difference between the types of accounts
  • Too much time spent on the 'intro' - that is, the demonstration of backup/restore in an uncontained database to show the problem being solved
  • Appreciated the mention of the connection to certification exams
  • Prerequisites: some DBA knowledge was assumed
My thought process incorporating this feedback:
  • Simplified the login and user creation in the demo; namely, removed the example using Windows authentication.
  • Removed a time-consuming section on the various permutations of recovering logins where the password is known/not known and the original server is available/unavailable.  This content is saved for a future blog post.
  • Pre-requisites: well, I think it's OK to make an assumption or two, since the talk is aimed at the SQL Server user group.  And in any case, that ship has sailed for this week's talk.  I may review the abstract for future use and make sure it calls out that the viewer should be familiar with database backup/restore.
Hurts my heart a bit to give up the Windows auth and login recovery content, because both are good practice.  But the time they eat up distracts from the two key takeaways:

  1. With a partially contained database, authentication can live with the user database itself (as contained users with passwords) rather than in the master system database (that's a clue for the High Availability/Disaster Recovery connection); see the query after this list
  2. The db_owner and other database roles have additional power within a contained user database (db_owner, for example, can create users that can connect, with no sysadmin involved).
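A quick way to see the first takeaway in action (SQL Server 2012+): contained users appear in the database's own metadata with a DATABASE authentication type, so they travel with any backup and restore of that database.

-- Run inside the contained database: lists principals that authenticate
-- at the database itself rather than through a login in master.
SELECT name, authentication_type_desc
FROM sys.database_principals
WHERE authentication_type_desc = N'DATABASE';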
Fun stuff.  And I think if I just script out the sample database creation step, all the scripts could be published to let others run the demo themselves.

Friday, January 29, 2016

Musings on Data Warehouse projects - dealing with data sources of varying quality

Had a discussion this past week with some great colleagues re: the quality of the database design of a potential source system (or lack thereof, depending on your point of view).  I've racked my brain trying to find an article that offered some caution in dealing with this.  I'm almost sure it came from a book or article from the Kimball Group.  I will paraphrase it:

From time to time, a data source will appear to warrant improvement of some kind.  The data source provider may be all for it.  Couldn't that spreadsheet be turned into a proper system?  
Avoid such efforts.  Your team will be constantly distracted by work with a very different cadence from what data warehousing requires.

The closest I could find was a snippet in “The Microsoft Data Warehouse Toolkit”, from the chapter on Business Requirements Example: Adventure Works Cycles  (emphasis mine)

"….price lists and international support are important issues to his organization, but they are transaction system problems because they involve enhancing transaction systems or building new IT infrastructure. You can help with better reporting, but you shouldn't be dealing with connectivity and data capture issues if you can avoid it. "

I have taken this viewpoint into many battles over the scope of a data warehouse project…  I have not won all of those battles ;)  Nonetheless, it's a positive outcome to have this caution incorporated into your mindset when evaluating sources.  When possible, move the system development 'out' to the proper parties as quickly as possible.


Wednesday, October 28, 2015

Oracle's SQL Developer (11gR2) Date display issue

Oracle’s SQL Developer (as shipped with 11gR2) has a feature (arguably a bug) that, by default, causes only the date portion of datetime columns to show in the displayed query results.



Here’s the default

And here it is set to the ISO 8601-style date
YYYY-MM-DD HH24:MI:SS
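If you'd rather not change the preference (Tools > Preferences > Database > NLS > Date Format), the same display is available per session with the standard NLS setting; this is plain Oracle SQL, nothing SQL Developer-specific:

-- Session-scoped alternative to the preference change:
ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY-MM-DD HH24:MI:SS';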

Good luck!



Friday, June 19, 2015

Generate a Remote Desktop Connection Manager config file from SQL Server Central Management Server

If you are a SQL Server database administrator (DBA) who also needs to administer servers via Remote Desktop, this script will be of tremendous help to you at least once, and may become something you run regularly.

This script will query your SQL Server Central Management Server (not using one yet? check out the Easily Manage your SQL Server with CMS and PBM webcast) and generate the contents of a config file for Microsoft's Remote Desktop Connection Manager.  All free tools or built into your SQL Servers.

Get the script at:

Prerequisites:
  • Install Remote Desktop Connection Manager (RDCMan) version 2.7 from https://www.microsoft.com/en-us/download/details.aspx?id=44989
  • Have populated a SQL Server Central Management Server (CMS) with at least one group and server

To use:
  1. Run the script against the instance holding your CMS data.
  2. Click on the link to the XML, then Select All and Copy.
  3. Create a new text file with the extension *.rdg (for example MyCMSServers.rdg) and paste the XML content.
  4. In Remote Desktop Connection Manager, File->Open the *.rdg file.

Details:
All unique host names are placed under a group called '_All Servers'.  The 'Smart Lists' dynamically filter based on the groups in your SQL CMS.  This is done by placing a comment in the RDCMan node for each server.  
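For the curious, the heart of any script like this is a query against the CMS system views in msdb.  A minimal sketch (not the published script itself) of the raw material behind the RDCMan XML:

-- Every registered server and its CMS group, from the standard msdb views.
SELECT g.name AS group_name,
       s.server_name
FROM msdb.dbo.sysmanagement_shared_registered_servers AS s
JOIN msdb.dbo.sysmanagement_shared_server_groups AS g
    ON s.server_group_id = g.server_group_id
ORDER BY group_name, server_name;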

The script doesn't yet generate hierarchical groups in RDCMan.  Keep an eye out for edge cases that I haven't tested yet: special characters in the server or group names, hosts that are found in more than one group, etc.

Get the script at:

Hope you enjoy!

Wednesday, May 28, 2014

An approach to SQL Server Rough Tuning

Looking for a way to address SQL Server database performance in a production, virtualized environment?  There are many sources of expert advice from very smart people in the SQL Server world.  But often the most well-thought-out, well-intentioned advice is not easily or quickly implemented in a complex organization.  The reasons could be technical, political, or simply the availability of time and people.

This diagram is my current approach to "rough" tuning a SQL Server: the idea that a server administrator or database administrator (DBA) can turn various knobs and flip switches to assign scarce resources to a database server, while, as a practical matter, the inner workings of a given application and database may not be changed… or at least not changed quickly.


Example situations:
  • A third-party vendor application with a proprietary schema: the application might have updates available, but dependencies or license costs mean they take time.
  • An internal application that has received an influx of new activity, while the development team is fully committed to another high-priority project.
  • A legacy application whose original developers are long gone, with no known test environment to experiment in.


Method
I took some best practices, including selections from the guidance on the SQL Server "perfmon counters of interest" poster available from Dell/Quest, and added some of the basic steps available to server administrators and database administrators.
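As a starting point, a few of the headline counters from that poster can also be sampled from inside SQL Server itself, with no perfmon session required.  A minimal sketch using the standard DMV (counter names as published by SQL Server):

-- Spot-check a few headline counters; watch trends over time, not single samples.
SELECT [object_name], counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN (N'Page life expectancy',
                       N'Batch Requests/sec',
                       N'User Connections');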

Document available as a PDF

Monday, March 24, 2014

SQL Server Data Tools - An Installation Adventure

With SQL Server 2012, your old friend "Business Intelligence Development Studio" or "BIDS" has been replaced by a component called "SQL Server Data Tools".  SQL Server Data Tools, or SSDT, is the primary authoring environment for SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), and SQL Server Reporting Services (SSRS) projects.  In addition, a new way of working with SQL Server databases is available.  The new method treats a SQL Server database as a Visual Studio project, enabling the adoption of developer concepts like version control and deployment.

While SSDT is a very innovative product, the steps to arrive at a working installation of SSDT can be a bit of an adventure.  If you have installed SQL Server 2012 from the installation media and selected all features, or at least all client tools, your adventure begins here.

An entry for SQL Server Data Tools will appear in your Start menu, under the folder for "Microsoft SQL Server 2012"


Click on SQL Server Data Tools and you will soon see that the "Microsoft Visual Studio 2010 Shell" is starting up.

You will be prompted to select environment settings.  If you used SQL Server 200X's Business Intelligence Development Studio or plan to focus this machine on SQL Server Business Intelligence projects, select "Business Intelligence Settings".  Click 'Start Visual Studio'.

After a potentially brief pause and this loading screen...

… you are brought into the 'Microsoft Visual Studio 2010 Shell'.

Click on New Project…

You can actually create SSAS, SSIS, or SSRS projects at this point; you are good to go as far as those features are concerned.  Now check out the new option "SQL Server"…

Note the text "Provides instructions for installing Microsoft SQL Server Data Tools from the web".  Clicking this link presents this message...

Clicking "Install" opens this web page...

If you click on 'Download SQL Server Data Tools', as of the time of this writing, you're taken to a page about downloading Visual Studio 2013 trial editions.  I went this route on another machine with Visual Studio 2013 installed; so far, I haven't quite figured out how to get SSDT enabled there, so...

Click back…
And click SSDT Visual Studio 2012

On the next page, scroll about halfway down to step 2 and click 'Download SQL Server Data Tools for Visual Studio 2012'

For a single installation, go ahead and click 'Run'.  (To save the file for use on other workstations, click 'Save')

Then thoroughly read the License Terms, and if they are agreeable, click 'I agree…' and 'Install'.
The installer begins with the Microsoft .NET Framework 4.5; your environment may vary.
A restart is required if the .NET Framework 4.5 is installed.

After the restart, Setup will continue

After this, the SQL Server Data project will be available under 'SQL Server Data Tools' in the Start menu, right?  No; the SQL Server tools are found under 'Microsoft Visual Studio 2012'.

The 'SQL' menu, with 'Schema Compare' and 'Data Compare', is now available.

The SQL Server Data Tools look very compelling.  As I help my organization migrate to SQL Server 2012, the new features will replace some work that currently requires manual processes or intricate scripting.  Perhaps the install story will tighten up as SQL Server 2014 is released, or the Visual Studio 2013 version will become a bit more clear to me.  It is a bit of an adventure to install - nothing difficult, just unclear at some steps how far along you are toward a working installation.  Hopefully, this blog post helps you be more confident when you decide to start using SQL Server Data Tools.

Friday, March 21, 2014

IT Project Staffing for Emerging and Legacy applications


I've been catching up in the past week or so on the Oregon health exchange ("Cover Oregon") issues.  These apparently started popping off in November 2013.  There was a recent spate of articles on GovTech regarding the oversight.

The Cover Oregon Website Implementation Assessment by First Data contained an interesting nugget in their recommendations regarding IT Project Staffing:

IT Project Staffing - The exchange project was a large, complex IT project. Complex IT projects introduce an innate resource risk that can only be mitigated through careful staff planning. First Data recommends the State reconsider how IT projects are staffed in the State. The exchange project filled many of its staffing needs using temporary positions, which are difficult to fill due to their lack of employment security. Additionally, qualified staff hired into temporary positions are likely to continue to search for alternate permanent state positions. Consequently, the exchange project regularly struggled to sustain the anticipated project team size and skills. As a result, a large number of staff members were acquired through contracts. Where possible, introducing temporary positions or consultant positions to an organization to backfill or support the systems that will be replaced would naturally align staff attrition with the technology and application lifecycles. Reserving the permanent or long-term positions for the ‘go-forward’ technologies will also provide the state with the capability to develop stronger, more cohesive IT support teams. 

The opposite tack is common, historically, of course.  The emerging project is established with temporary positions or consultants, while existing, permanent staff remain with the legacy application.  This very commonly leads to a brain drain as soon as it becomes clear the emerging project will be the new normal.  Absent extraordinary efforts to retain those experienced staff - pay, working environment, chances for new projects - they will simply start searching for other employment.

First Data's recommendation is the reverse.  Place the temporary positions with the legacy system immediately, where the work will naturally tail off.  If the legacy system is needed longer, extending a temporary position is relatively easy, and the person in that position may be relieved not to have to start a job search soon.  Meanwhile, the experienced staff immediately start adding value to the emerging system, based on their familiarity with the organization and working environment.